Week 8 (July 15 – July 21) Progress
Started work on backpropagation for the ClassificationNeuralNetwork class
After reviewing the MATLAB documentation, I successfully implemented the following components for the ClassificationNeuralNetwork class (a rough sketch of such helpers follows the list):
- Weights Initializer: Layer weights are initialized using either 'glorot' or 'he'. (Helper function)
- Biases Initializer: Biases are initialized using either 'zeros' or 'ones'. (Helper function)
- Loss Function: Cross entropy. (Helper function)
- One-Hot Vector Encoder: Converts class labels into one-hot vectors. (Helper function)
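For illustration only, here is a minimal Octave sketch of what such helpers could look like. The function names and signatures below are hypothetical and are not the actual class methods:

```
## Hypothetical helper sketches -- not the actual ClassificationNeuralNetwork code.

## Weight initializer: 'glorot' samples U(-a, a) with a = sqrt(6/(fanIn+fanOut)),
## 'he' samples N(0, 2/fanIn).
function W = initWeights (fanIn, fanOut, method)
  switch (lower (method))
    case "glorot"
      a = sqrt (6 / (fanIn + fanOut));
      W = 2 * a * rand (fanOut, fanIn) - a;
    case "he"
      W = sqrt (2 / fanIn) * randn (fanOut, fanIn);
  endswitch
endfunction

## Bias initializer: 'zeros' or 'ones'.
function b = initBiases (n, method)
  if (strcmpi (method, "zeros"))
    b = zeros (n, 1);
  else
    b = ones (n, 1);
  endif
endfunction

## Cross-entropy loss, averaged over observations
## (rows = observations, columns = classes); eps guards against log(0).
function L = crossEntropy (y_pred, y_act)
  L = -mean (sum (y_act .* log (y_pred + eps), 2));
endfunction

## One-hot encoder: numeric labels 1..K -> N-by-K indicator matrix.
function Y = oneHot (labels, K)
  N = numel (labels);
  Y = zeros (N, K);
  Y(sub2ind ([N, K], (1:N)', labels(:))) = 1;
endfunction
```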
I ran various tests to ensure everything is working correctly, such as:
- Cross Entropy Loss Function: I created matrices for y_pred and y_act and fed them both to an online implementation and to mine; the outputs matched. A small example of such a check follows this list.
- Forward Propagation: I used trained weights and biases from MATLAB for the fisheriris dataset, gave them to my predict function, and the outputs were the same.
The only remaining task is backpropagation, and I am facing some issues, such as non-conformant arguments (matrix dimension mismatches during multiplication). I will try to resolve this before our meeting. You can see my work here. Any help with the algorithm part is greatly appreciated.
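Since non-conformant arguments usually come down to shape bookkeeping, here is a minimal sketch of how every product stays conformant when the shapes line up. It assumes a single hidden layer with ReLU and a softmax/cross-entropy output, with made-up variable names and random data; it is not the actual class code:

```
## Assumed shapes: N observations, D features, H hidden units, K classes.
N = 8;  D = 4;  H = 5;  K = 3;
X  = randn (N, D);                  # input, N-by-D
I  = eye (K);
Y  = I(randi (K, N, 1), :);         # one-hot targets, N-by-K
W1 = randn (H, D);  b1 = zeros (H, 1);
W2 = randn (K, H);  b2 = zeros (K, 1);

## Forward pass
Z1 = X * W1' + b1';                 # N-by-H
A1 = max (Z1, 0);                   # ReLU, N-by-H
Z2 = A1 * W2' + b2';                # N-by-K
A2 = exp (Z2 - max (Z2, [], 2));
A2 = A2 ./ sum (A2, 2);             # softmax probabilities, N-by-K

## Backward pass: with softmax + cross entropy the output delta is (A2 - Y).
d2  = (A2 - Y) / N;                 # N-by-K
gW2 = d2' * A1;                     # K-by-H, same size as W2
gb2 = sum (d2, 1)';                 # K-by-1, same size as b2
d1  = (d2 * W2) .* (Z1 > 0);        # N-by-H, ReLU derivative applied
gW1 = d1' * X;                      # H-by-D, same size as W1
gb1 = sum (d1, 1)';                 # H-by-1, same size as b1
```

Each gradient ends up the same size as the parameter it updates, so comparing size(gW1) against size(W1) (and so on) is a quick way to find the product that goes non-conformant.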
Resources I found helpful include:
- "Neural Network from scratch in Python" by Sentdex (unfortunately, the playlist was never completed).
- Andrew Ng's lecture.
- Andrew Ng's Machine Learning from Octave playlist, starting from lecture 401.
Our next meeting is scheduled for July 23rd at 16:30 UTC. Also, the Olympics are going to start on the 26th, so I will try to complete all my work before then. See you in the next blog. 😄