NOTES on Neural networks
Neural networks arose because not every relationship can be approximated by linear/logistic regression; data may contain complex shapes that only complex, nonlinear functions can capture.
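A minimal sketch of this idea: XOR is the classic example of a pattern no linear/logistic model can represent exactly, while a tiny two-layer network with hand-set (hypothetical, not learned) weights computes it perfectly.

```python
import numpy as np

# XOR truth table: no single line separates the 1s from the 0s,
# so a linear/logistic model cannot represent it exactly.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([0, 1, 1, 0])

def step(z):
    # Threshold activation: 1 if z > 0, else 0.
    return (z > 0).astype(int)

def tiny_mlp(x):
    # Hidden layer: two threshold units with hand-set weights.
    h1 = step(x[:, 0] + x[:, 1] - 0.5)   # fires when at least one input is 1 (OR)
    h2 = step(x[:, 0] + x[:, 1] - 1.5)   # fires only when both inputs are 1 (AND)
    # Output: OR minus AND gives XOR.
    return step(h1 - h2 - 0.5)

print(tiny_mlp(X))  # [0 1 1 0]
```

The hidden layer bends the input space so the classes become separable, which is exactly what a linear model alone cannot do.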
Machine Learning interview questions: https://media.licdn.com/dms/document/C511FAQFqurRI1pxS3w/feedshare-document-pdf-analyzed/0?e=1572685200&v=beta&t=WFLbt2YqxMSRUhYfVXvNy5mKmoVAhb2bF1w007N4JbI
An introduction to Hidden Markov Models, from Stanford.
Intuition behind SVD
SVD breaks down a linear transformation (i.e. a matrix) into three fundamental parts: a rotation, an axis-aligned stretching/scaling, and finally another rotation. The first rotation is necessary to prepare the space for scaling. The second moves the now properly scaled space into its final orientation.
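The rotate–scale–rotate picture can be checked directly with NumPy: `np.linalg.svd` returns the two orthogonal (rotation/reflection) factors and the axis-aligned scaling between them. The matrix below is just an illustrative choice.

```python
import numpy as np

# Any real matrix A factors as A = U @ diag(S) @ Vt:
# Vt rotates first, diag(S) stretches along the axes, U rotates again.
A = np.array([[3.0, 1.0],
              [1.0, 3.0]])

U, S, Vt = np.linalg.svd(A)

# The middle factor is pure axis-aligned scaling (the singular values).
print(S)  # [4. 2.] for this matrix

# U and Vt are orthogonal: rotations, possibly combined with a reflection.
print(np.allclose(U @ U.T, np.eye(2)))      # True
print(np.allclose(Vt @ Vt.T, np.eye(2)))    # True

# Recomposing the three parts recovers the original transformation.
print(np.allclose(U @ np.diag(S) @ Vt, A))  # True
```

Because orthogonal matrices preserve lengths and angles, all of the "stretching" a matrix does is concentrated in the diagonal middle factor.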
Reinforcement Learning using TensorFlow.
Excellent Computer Vision lecture from Lawrence Livermore National Laboratory, by Matthew Raver.
Python Data Science Tutorial: Analyzing the 2019 Stack Overflow Developer Survey, by Corey Schafer.