- Brushed up the basics of Linear Algebra.
- Implemented Linear Regression with one variable using Gradient Descent, following Andrew Ng's Stanford ML course.
- Link of Work: Link
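As a sketch of the idea (not the course's actual code; variable names and toy data are my own), univariate batch gradient descent looks like this:

```python
import numpy as np

def gradient_descent(x, y, alpha=0.01, iters=1000):
    """Batch gradient descent for the hypothesis h(x) = theta0 + theta1 * x."""
    theta0, theta1 = 0.0, 0.0
    m = len(y)
    for _ in range(iters):
        h = theta0 + theta1 * x                    # predictions for all m examples
        theta0 -= alpha * np.sum(h - y) / m        # simultaneous update: h was
        theta1 -= alpha * np.sum((h - y) * x) / m  # computed with the old thetas
    return theta0, theta1

# toy data lying exactly on y = 2x + 1
x = np.array([0.0, 1.0, 2.0, 3.0])
y = 2 * x + 1
theta0, theta1 = gradient_descent(x, y, alpha=0.1, iters=2000)
```

On this toy data the parameters converge to roughly (1, 2), recovering the line.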
- Implemented Linear Regression with multiple variables using Gradient Descent and Normal equation.
- Link of Work: Link
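The normal equation solves for theta in closed form instead of iterating. A minimal sketch with made-up toy data (using `pinv` for numerical safety):

```python
import numpy as np

# design matrix with a leading column of ones for the intercept term
X = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0]])
y = np.array([1.0, 3.0, 5.0, 7.0])   # exactly y = 2x + 1

# normal equation: theta = (X^T X)^{-1} X^T y
theta = np.linalg.pinv(X.T @ X) @ X.T @ y
```

No learning rate and no iterations are needed, at the cost of inverting an n-by-n matrix, which gets expensive for many features.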
- Had trouble understanding vectorized implementations using NumPy arrays, so referred to various blogs and videos and rewrote a few functions in their fully vectorized form.
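An illustrative before/after of such a rewrite (function names and data are my own, not from the course code): the looped squared-error cost versus its vectorized equivalent.

```python
import numpy as np

def cost_loop(X, y, theta):
    """Squared-error cost computed one example at a time."""
    m = len(y)
    total = 0.0
    for i in range(m):
        h_i = X[i] @ theta           # prediction for example i
        total += (h_i - y[i]) ** 2
    return total / (2 * m)

def cost_vectorized(X, y, theta):
    """Same cost as one matrix expression, no Python loop."""
    m = len(y)
    err = X @ theta - y              # residuals for all m examples at once
    return (err @ err) / (2 * m)

X = np.array([[1.0, 2.0], [1.0, 3.0], [1.0, 4.0]])
y = np.array([5.0, 7.0, 9.0])
theta = np.array([0.5, 1.5])
```

The two agree to floating-point precision, but the vectorized form pushes the loop into optimized NumPy code.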
- Went through the theory of Logistic Regression for the two-class classification problem and learnt about:
  - Logit function and the Logistic Regression model
  - Hypothesis representation
  - Linear and non-linear decision boundaries
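A small sketch of the hypothesis (the theta and x values below are made up for illustration): the model outputs sigmoid(theta^T x), and the boundary theta^T x = 0 is linear in the features; non-linear boundaries come from adding polynomial features.

```python
import numpy as np

def sigmoid(z):
    """Logistic function, mapping any real z into (0, 1)."""
    return 1.0 / (1.0 + np.exp(-z))

# decision boundary theta^T x = 0  <=>  x1 + x2 = 3 (a straight line)
theta = np.array([-3.0, 1.0, 1.0])
x = np.array([1.0, 2.0, 2.0])        # [bias, x1, x2]
p = sigmoid(theta @ x)               # P(y = 1 | x; theta), here sigmoid(1)
```

Since x1 + x2 = 4 > 3, the point lies on the positive side of the boundary and p comes out above 0.5.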
- Continued the theory of Logistic Regression for the two-class classification problem and learnt about:
  - Logistic Regression cost function
  - Gradient descent for optimizing the cost function
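For reference, the cost function and the per-parameter gradient-descent update from the course are:

```latex
J(\theta) = -\frac{1}{m} \sum_{i=1}^{m} \Big[\, y^{(i)} \log h_\theta(x^{(i)})
          + \big(1 - y^{(i)}\big) \log\big(1 - h_\theta(x^{(i)})\big) \Big]

\theta_j := \theta_j - \alpha \, \frac{1}{m} \sum_{i=1}^{m}
            \big( h_\theta(x^{(i)}) - y^{(i)} \big) \, x_j^{(i)}
```

with \(h_\theta(x) = 1 / (1 + e^{-\theta^T x})\) and all \(\theta_j\) updated simultaneously. The update has the same form as in linear regression; only the hypothesis changes.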
- Learnt about advanced optimization algorithms (e.g. Conjugate Gradient, BFGS), which typically converge faster than gradient descent and do not require manually picking a learning rate.
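One way to try these in practice (a sketch under my own toy data, not the course's Octave `fminunc` code) is to hand the cost and gradient to SciPy's `minimize` with `method="BFGS"`:

```python
import numpy as np
from scipy.optimize import minimize

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def cost(theta, X, y):
    h = sigmoid(X @ theta)
    return -(y @ np.log(h) + (1 - y) @ np.log(1 - h)) / len(y)

def grad(theta, X, y):
    return X.T @ (sigmoid(X @ theta) - y) / len(y)

# toy 1-D data with overlapping classes (so the optimum stays finite)
X = np.array([[1.0, 0.0], [1.0, 1.0], [1.0, 2.0],
              [1.0, 3.0], [1.0, 4.0], [1.0, 5.0]])
y = np.array([0.0, 0.0, 1.0, 0.0, 1.0, 1.0])

# the optimizer chooses its own step sizes; no alpha to tune
res = minimize(cost, np.zeros(2), args=(X, y), jac=grad, method="BFGS")
theta = res.x
```

The fitted model predicts the positive class at large x1 and the negative class near x1 = 0, matching the trend in the labels.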
- Understood the vectorized implementation of Logistic Regression.
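A sketch of that fully vectorized training loop (names and toy data are my own): every theta_j is updated by a single matrix expression, with no loop over examples or parameters.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def fit_logistic(X, y, alpha=0.1, iters=5000):
    """Vectorized batch gradient descent for logistic regression."""
    theta = np.zeros(X.shape[1])
    m = len(y)
    for _ in range(iters):
        # gradient of the cost for all parameters in one expression
        theta -= alpha / m * X.T @ (sigmoid(X @ theta) - y)
    return theta

X = np.array([[1.0, 0.0], [1.0, 1.0], [1.0, 2.0],
              [1.0, 3.0], [1.0, 4.0], [1.0, 5.0]])
y = np.array([0.0, 0.0, 1.0, 0.0, 1.0, 1.0])
theta = fit_logistic(X, y)
```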
- Went through the theory of Multi-Class Logistic Regression (one-vs-all classification).
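A sketch of the one-vs-all idea with made-up data and my own helper names: train one binary classifier per class against the rest, then predict the class whose classifier is most confident.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def fit_binary(X, y, alpha=0.5, iters=3000):
    """Vectorized gradient descent for one binary (class-vs-rest) problem."""
    theta = np.zeros(X.shape[1])
    for _ in range(iters):
        theta -= alpha / len(y) * X.T @ (sigmoid(X @ theta) - y)
    return theta

def fit_one_vs_all(X, y, num_classes):
    # one theta vector per class, trained on that class's vs-rest labels
    return np.array([fit_binary(X, (y == c).astype(float))
                     for c in range(num_classes)])

def predict(Theta, X):
    # pick the class whose classifier outputs the highest probability
    return np.argmax(sigmoid(X @ Theta.T), axis=1)

# three well-separated 2-D clusters, one per class (bias column included)
X = np.array([[1.0, 0.0, 0.0], [1.0, 1.0, 1.0],
              [1.0, 5.0, 0.0], [1.0, 5.0, 1.0],
              [1.0, 0.0, 5.0], [1.0, 1.0, 5.0]])
y = np.array([0, 0, 1, 1, 2, 2])
Theta = fit_one_vs_all(X, y, 3)
preds = predict(Theta, X)
```

On these clearly separated clusters the one-vs-all classifier labels every training point correctly.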