Week 7 (July 8 – July 14) Progress
Started studying algorithms and mathematics for implementing the ClassificationNeuralNetwork class
This week, I started work on the ClassificationNeuralNetwork.m and fitcnet.m files. I also began studying the mathematics behind neural network classifiers. After reviewing the MATLAB documentation, I identified the following components for the ClassificationNeuralNetwork class:
- Weights initializer: Layer weights are initialized using either 'glorot' or 'he' (see the sketch after this list).
- Biases initializer: Biases are initialized using either 'zeros' or 'ones'.
- Loss function: Cross entropy.
- Solver used: L-BFGS (limited-memory Broyden-Fletcher-Goldfarb-Shanno, a quasi-Newton method).
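To make the initializer options concrete, here is a minimal Octave sketch of how 'glorot' and 'he' weight initialization and 'zeros'/'ones' bias initialization can be written. The function names (init_weights, init_biases) and their layout are my own illustration, not the actual class internals.

```octave
## Sketch of the two weight initializers and two bias initializers listed
## above. Weights are returned as fan_out-by-fan_in matrices.

function W = init_weights (fan_in, fan_out, method)
  switch (lower (method))
    case "glorot"   # uniform in [-r, r] with r = sqrt(6 / (fan_in + fan_out))
      r = sqrt (6 / (fan_in + fan_out));
      W = (2 * rand (fan_out, fan_in) - 1) * r;
    case "he"       # normal with standard deviation sqrt(2 / fan_in)
      W = randn (fan_out, fan_in) * sqrt (2 / fan_in);
    otherwise
      error ("init_weights: unknown method '%s'", method);
  endswitch
endfunction

function b = init_biases (fan_out, method)
  switch (lower (method))
    case "zeros"
      b = zeros (fan_out, 1);
    case "ones"
      b = ones (fan_out, 1);
    otherwise
      error ("init_biases: unknown method '%s'", method);
  endswitch
endfunction
```

For example, init_weights (10, 25, "glorot") would return a 25-by-10 matrix of small uniformly distributed values for a layer that maps 10 inputs to 25 hidden units.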
For now, I am focusing on getting forward propagation to work properly; a rough outline of that computation is sketched below. You can see my work here.
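This is the forward pass as I currently understand it, assuming ReLU hidden activations and a softmax output layer feeding the cross-entropy loss listed above. The function name and the cell-array layout for the weights and biases are placeholders of mine, not the final code in ClassificationNeuralNetwork.m.

```octave
## Forward propagation sketch for a fully connected classifier.
## X is observations-by-features, W and b are cell arrays of per-layer
## weights and biases, Y_onehot is observations-by-classes.

function [P, loss] = forward_pass (X, W, b, Y_onehot)
  A = X';                              # activations, features-by-observations
  L = numel (W);
  for l = 1:L-1
    Z = W{l} * A + b{l};               # affine transform for hidden layer l
    A = max (Z, 0);                    # ReLU activation
  endfor
  Z = W{L} * A + b{L};                 # output layer scores
  Z = Z - max (Z, [], 1);              # shift scores for numerical stability
  P = exp (Z) ./ sum (exp (Z), 1);     # softmax class probabilities
  ## Cross-entropy loss, averaged over observations
  loss = -mean (sum (Y_onehot' .* log (P + eps), 1));
endfunction
```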
Resources I found helpful include:
- "Neural Network from scratch in Python" by Sentdex (unfortunately, the playlist was never completed).
- Andrew Ng's lecture, where he mentions that understanding the detailed mathematics behind the L-BFGS optimization technique is not necessary; we just need to know how to use it (a small usage sketch follows this list).
- Andrew Ng's Machine Learning playlist (taught using Octave), starting from lecture 401.
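In that spirit, the plan is to treat the solver as a black box: hand it a function that returns the cost and its gradient, and let it iterate. Octave core does not ship an L-BFGS routine, so this toy sketch uses fminunc purely to show the calling pattern; the objective and the names are made up for illustration.

```octave
1;  # mark this file as a script so the function below can be followed by commands

function [f, g] = example_cost (theta)
  ## Toy objective: squared distance to a fixed target point.
  target = [3; -2];
  f = sum ((theta - target) .^ 2);
  if (nargout > 1)
    g = 2 * (theta - target);   # analytic gradient, used when "GradObj" is "on"
  endif
endfunction

opts = optimset ("GradObj", "on", "MaxIter", 100);
[theta_opt, fval] = fminunc (@example_cost, zeros (2, 1), opts);
## theta_opt should converge to [3; -2] with fval close to 0.
```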
I successfully passed my Midterm Evaluation as part of GSoC.
I submitted my first pull request, and it was successfully merged into the Statistics repository.
Our next meeting is scheduled for July 23rd at 16:30 UTC. Any help with the algorithmic side is greatly appreciated. See you in the next blog post. 😄