Steps to become a Machine Learning Engineer :-

Step 1 :- Introduction to ML :-

Overview of Topics :- Machine learning problems, parameter vs. hyperparameter, overfitting, training, validation, testing, cross-validation, regularization

https://www.toptal.com/machine-learning/machine-learning-theory-an-introductory-primer
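The ideas above can be made concrete with a short sketch (this example assumes scikit-learn; any similar library works): hold out a test set, use cross-validation on the training portion, and note that the regularization strength C is a hyperparameter chosen by you, while the model coefficients are parameters learned from data.

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split, cross_val_score

# Synthetic classification dataset: 200 samples, 5 features.
X, y = make_classification(n_samples=200, n_features=5, random_state=0)

# Hold out a test set; it is touched only once, at the very end.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0)

# C (inverse regularization strength) is a hyperparameter;
# the fitted coefficients are parameters.
model = LogisticRegression(C=1.0)

# 5-fold cross-validation on the training set estimates generalization
# without touching the test set.
cv_scores = cross_val_score(model, X_train, y_train, cv=5)

model.fit(X_train, y_train)
test_acc = model.score(X_test, y_test)
```

Comparing the cross-validation mean to the test accuracy is a quick check for overfitting.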

Step 2 :- Classification Algorithms :- http://www.stats.ox.ac.uk/~flaxman/HT17_lecture13.pdf

Overview of Topics :- Definition of a decision tree, metrics of impurity, greedy algorithm to split a node, tree depth and pruning, ensemble of trees (random forest)
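A minimal sketch of these ideas, assuming scikit-learn: a single tree split greedily on Gini impurity with a depth limit to control overfitting, and a random forest as an ensemble of such trees.

```python
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)

# A single tree: Gini impurity drives each greedy split;
# max_depth is a simple form of pre-pruning.
tree = DecisionTreeClassifier(criterion="gini", max_depth=3,
                              random_state=0).fit(X, y)

# An ensemble of trees trained on bootstrap samples (random forest)
# usually generalizes better than any single deep tree.
forest = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
```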

Step 3 :- Bayesian Decision Theory :-

Overview of Topics :- Bayes rule: Prior, likelihood, posterior, evidence, Gaussian density, sufficient statistics, maximum likelihood derivation for mean and covariance

https://www.khanacademy.org/partner-content/wi-phi/wiphi-critical-thinking/wiphi-fundamentals/v/bayes-theorem

https://www.mathsisfun.com/data/bayes-theorem.html

https://www.math.uwaterloo.ca/~hwolkowi/matrixcookbook.pdf
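The maximum-likelihood derivation for the mean and covariance of a Gaussian can be verified numerically; a small NumPy sketch (the 1/N covariance below is the biased ML estimator, not the 1/(N-1) sample version):

```python
import numpy as np

rng = np.random.default_rng(0)
true_mean = np.array([1.0, -2.0])
true_cov = np.array([[2.0, 0.5],
                     [0.5, 1.0]])
X = rng.multivariate_normal(true_mean, true_cov, size=5000)

# ML estimates for a Gaussian: the sample mean and the
# (biased, divide-by-N) sample covariance.
mu_hat = X.mean(axis=0)
Sigma_hat = (X - mu_hat).T @ (X - mu_hat) / len(X)
```

The sample mean and covariance are sufficient statistics: they capture everything the data says about the Gaussian's parameters.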

Step 4 :- Linear models :-

Overview of Topics :-

Linear regression and its analytical solution, loss functions, gradient descent and the learning rate, logistic regression and its cost, SVM: hinge loss with an L2 penalty.

https://www.math.uwaterloo.ca/~hwolkowi/matrixcookbook.pdf
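Both solution routes for linear regression can be sketched in a few lines of NumPy: the analytical solution via the normal equations, and gradient descent on the squared loss with an explicit learning rate.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 2))
w_true = np.array([3.0, -1.0])
y = X @ w_true + 0.1 * rng.normal(size=100)

# Analytical solution (normal equations): solve (X^T X) w = X^T y.
w_closed = np.linalg.solve(X.T @ X, X.T @ y)

# Gradient descent on the mean squared error, learning rate 0.1.
w = np.zeros(2)
lr = 0.1
for _ in range(500):
    grad = X.T @ (X @ w - y) / len(y)  # gradient of the squared loss
    w -= lr * grad
```

Both estimates should agree closely; too large a learning rate would make the iterative version diverge instead.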

Step 5 :- Kernelization :-

Overview of Topics :-

Dual form of an SVM, kernels for a dual form, examples of kernels and their typical uses, SVR in primal form, SVR in dual form.
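Why kernels matter is easy to demonstrate (assuming scikit-learn, whose SVC solves the SVM in its dual form): on data that is not linearly separable, a linear SVM fails while an RBF kernel implicitly maps the points into a space where a separating hyperplane exists.

```python
from sklearn.datasets import make_circles
from sklearn.svm import SVC

# Two concentric circles: not linearly separable in the input space.
X, y = make_circles(n_samples=200, factor=0.3, noise=0.05, random_state=0)

# Linear kernel: no good separating hyperplane exists in 2-D.
linear_acc = SVC(kernel="linear").fit(X, y).score(X, y)

# RBF kernel: the dual formulation only needs inner products,
# so the implicit feature map never has to be computed explicitly.
rbf_acc = SVC(kernel="rbf").fit(X, y).score(X, y)
```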

Step 6 :- Feature Selection and Engineering :-

Overview of Topics :-

T-test, forward selection, features for images, features for audio, features for NLP, PCA, ZCA, K-PCA.

a) Supervised Feature Selection :- https://www.researchgate.net/publication/275228384_Supervised_feature_selection_A_tutorial

b) 10 Effective Feature Selection Techniques :- https://www.machinelearningplus.com/machine-learning/feature-selection/

c) Image Fundamentals :- http://openimaj.org/tutorial/pt02.html

d) Audio Fundamentals :- http://openimaj.org/tutorial/pt04.html

e) Music Feature Extraction in Python :- https://towardsdatascience.com/extract-features-of-music-75a3f9bc265d
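Two of the techniques above, sketched with scikit-learn: univariate selection by F-score (which reduces to a t-test in the two-class case) and PCA as an unsupervised projection onto the directions of highest variance.

```python
from sklearn.datasets import load_iris
from sklearn.decomposition import PCA
from sklearn.feature_selection import SelectKBest, f_classif

X, y = load_iris(return_X_y=True)  # 150 samples, 4 features

# Supervised filter: keep the 2 features with the highest ANOVA F-score
# (a per-feature significance test against the labels).
X_selected = SelectKBest(f_classif, k=2).fit_transform(X, y)

# Unsupervised: PCA keeps the 2 directions of highest variance,
# ignoring the labels entirely.
X_pca = PCA(n_components=2).fit_transform(X)
```

Note the contrast: SelectKBest keeps original features, while PCA builds new ones as linear combinations.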

Step 7 :- Dense and shallow neural networks :-

Overview of Topics :-

Logistic regression as a single sigmoid unit, a single hidden layer with sigmoid and ReLU activations, approximation of any function with a single hidden layer, overfitting, the advantage of multiple hidden layers, neural networks for regression, multi-output regression, multi-class classification with softmax, backpropagation.

a) Multilayer perceptron :- https://d2l.ai/chapter_multilayer-perceptrons/mlp.html

b) Visual proof for NN computing :- http://neuralnetworksanddeeplearning.com/chap4.html

c) Forward Propagation, Backward Propagation and Computational Graphs :- https://d2l.ai/chapter_multilayer-perceptrons/backprop.html

Step 8 :- Advanced Topics in Neural Networks :-

Overview of Topics :-

Weight initialization, momentum, weight decay, early stopping, mini-batch SGD, advanced optimizers such as RMSprop and Adam.

Training Neural Networks, Part 1
Training Neural Networks, Part 2
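The Adam update rule is short enough to write out by hand; the following NumPy sketch minimizes a toy 1-D loss, with the standard default values for the decay rates beta1 and beta2:

```python
import numpy as np

# Minimize f(w) = (w - 3)^2 with Adam.
w = 0.0
m, v = 0.0, 0.0                     # first and second moment estimates
lr, beta1, beta2, eps = 0.1, 0.9, 0.999, 1e-8

for t in range(1, 501):
    g = 2 * (w - 3)                 # gradient of the loss at w
    m = beta1 * m + (1 - beta1) * g          # momentum-like average
    v = beta2 * v + (1 - beta2) * g ** 2     # RMSprop-like average
    m_hat = m / (1 - beta1 ** t)    # bias correction for early steps
    v_hat = v / (1 - beta2 ** t)
    w -= lr * m_hat / (np.sqrt(v_hat) + eps)
```

Setting beta1 = 0 recovers an RMSprop-style update; dropping v entirely recovers SGD with momentum.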

Step 9:- Clustering

Overview of Topics :- K-means, DBSCAN, agglomerative clustering, scaling of dimensions, goodness of clustering.

Clustering (Andrew Ng, Stanford)
Clustering (Prof. Sudeshna, IIT Kharagpur)

https://en.wikipedia.org/wiki/Cluster_analysis
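A compact sketch with scikit-learn tying the topics together: scale the dimensions first, cluster with K-means, and score the result with the silhouette coefficient as a "goodness of clustering" measure.

```python
from sklearn.cluster import KMeans
from sklearn.datasets import make_blobs
from sklearn.metrics import silhouette_score
from sklearn.preprocessing import StandardScaler

# Three synthetic blobs; scaling first keeps any one dimension
# from dominating the Euclidean distances K-means relies on.
X, _ = make_blobs(n_samples=300, centers=3, random_state=0)
X = StandardScaler().fit_transform(X)

labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)

# Silhouette score in [-1, 1]: higher means tighter, better-separated
# clusters, without needing ground-truth labels.
score = silhouette_score(X, labels)
```

Unlike K-means, DBSCAN and agglomerative clustering do not require choosing the number of clusters up front, which is often the harder decision.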

Hurrah! You are now ready to work as a Machine Learning Engineer.
