
Week 9 (July 22 – July 28) Progress

Fixing Backpropagation for the ClassificationNeuralNetwork Class

In the previous meeting, we attempted to address several issues, such as non-conformant arguments (matrix multiplication problems). Andreas suggested posting on the Discourse forum to get insights from the community. I decided to hold off on that and instead watched several YouTube videos and read various articles to see if I could solve the problems myself. Fortunately, I was able to fix the matrix multiplication issue 😄. However, I'm now facing a new challenge: the optimization process isn't working as expected 😩. You can read more about the problem here. My work is available here. Any assistance with the algorithm part would be greatly appreciated.

Our next meeting is scheduled for August 6th at 16:30 UTC. I hope to resolve this issue before the meeting and aim to have the algorithm part completed by then, which will allow me to start implementing other methods, like resubpredict, crossval, etc. See...
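Since the optimization isn't behaving as expected, one standard way to isolate whether the backpropagation gradients themselves are at fault is a finite-difference gradient check. Here is a minimal sketch in Python (for illustration only; the project code is Octave, and the function names and the toy objective below are hypothetical, not part of the package):

```python
# Finite-difference gradient check: compare an analytic gradient against
# a central-difference estimate. If they disagree, backpropagation is buggy.
def numerical_grad(f, w, eps=1e-6):
    """Central-difference gradient of scalar function f at parameter list w."""
    grad = []
    for i in range(len(w)):
        w_plus = list(w); w_plus[i] += eps
        w_minus = list(w); w_minus[i] -= eps
        grad.append((f(w_plus) - f(w_minus)) / (2 * eps))
    return grad

# Toy objective: f(w) = w0^2 + 3*w1, whose analytic gradient is [2*w0, 3].
f = lambda w: w[0] ** 2 + 3 * w[1]
analytic = lambda w: [2 * w[0], 3.0]

w = [1.5, -2.0]
num = numerical_grad(f, w)
ana = analytic(w)
# Maximum absolute difference should be tiny when the analytic gradient is right.
diff = max(abs(n - a) for n, a in zip(num, ana))
```

In practice the same check is run with the network's loss as `f` and the flattened weights as `w`; a large `diff` pinpoints the layer whose gradient is wrong.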

Week 8 (July 15 – July 21) Progress

Started work on backpropagation for the ClassificationNeuralNetwork class

After reviewing the MATLAB documentation, I successfully implemented the following components for the ClassificationNeuralNetwork class:

- Weights initializer: Layer weights are initialized using either 'glorot' or 'he'. (Helper function)
- Biases initializer: Biases are initialized using either 'zeros' or 'ones'. (Helper function)
- Loss function: Cross-entropy. (Helper function)
- One-hot vector encoder. (Helper function)

I ran various tests to ensure everything is working correctly:

- Cross-entropy loss function: I created matrices for y_pred and y_act and fed them to an online implementation and to my own. The outputs matched.
- Forward propagation: I used trained weights and biases from MATLAB for the Fisheriris dataset, gave them to my predict function, and the outputs were the same.

The only remaining task is backpropagation, and I am facing some issues like non-co...
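To make the two helpers concrete, here is a rough sketch of one-hot encoding and mean cross-entropy in Python (the real helpers are Octave functions; the names, signatures, and 0-based labels below are illustrative assumptions, not the project's API):

```python
import math

def one_hot(labels, num_classes):
    """Encode integer class labels (0-based) as one-hot row vectors."""
    return [[1.0 if j == y else 0.0 for j in range(num_classes)]
            for y in labels]

def cross_entropy(y_pred, y_act, eps=1e-12):
    """Mean cross-entropy between predicted probabilities and one-hot targets.

    eps guards against log(0) when a predicted probability is exactly zero.
    """
    n = len(y_pred)
    total = 0.0
    for pred_row, act_row in zip(y_pred, y_act):
        total -= sum(a * math.log(p + eps)
                     for p, a in zip(pred_row, act_row))
    return total / n

# Three samples, three classes: true classes 0, 2, 1.
labels = [0, 2, 1]
y_act = one_hot(labels, 3)
y_pred = [[0.7, 0.2, 0.1],
          [0.1, 0.2, 0.7],
          [0.2, 0.6, 0.2]]
loss = cross_entropy(y_pred, y_act)
```

Comparing a hand-computed value like this against the implementation's output is exactly the kind of check described above.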

Week 7 (July 8 – July 14) Progress

Started studying algorithms and mathematics for implementing the ClassificationNeuralNetwork class

This week, I started work on the ClassificationNeuralNetwork.m and fitcnet.m files. I also began studying the mathematics behind neural network classifiers. After reviewing the MATLAB documentation, I identified the following components for the ClassificationNeuralNetwork class:

- Weights initializer: Layer weights are initialized using either 'glorot' or 'he'.
- Biases initializer: Biases are initialized using either 'zeros' or 'ones'.
- Loss function: Cross-entropy.
- Solver: L-BFGS (limited-memory quasi-Newton code for bound-constrained optimization).

Initially, I am focusing on getting forward propagation to work properly. You can see my work here.

Resources I found helpful include:

- "Neural Network from Scratch in Python" by Sentdex (unfortunately, the playlist was never completed).
- Andrew Ng's lecture, where he mentions t...
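Since forward propagation is the first milestone, here is a minimal sketch of a one-hidden-layer forward pass in Python (illustration only; the project targets Octave, and the layer sizes, activations, and names below are my assumptions, not the package's design):

```python
import math

def matvec(W, x, b):
    """Affine map W*x + b, with one row of W per output unit."""
    return [sum(w_ij * x_j for w_ij, x_j in zip(row, x)) + b_i
            for row, b_i in zip(W, b)]

def sigmoid(v):
    return [1.0 / (1.0 + math.exp(-z)) for z in v]

def softmax(v):
    m = max(v)                      # shift for numerical stability
    exps = [math.exp(z - m) for z in v]
    s = sum(exps)
    return [e / s for e in exps]

def forward(x, W1, b1, W2, b2):
    """Input -> hidden layer (sigmoid) -> output class probabilities (softmax)."""
    hidden = sigmoid(matvec(W1, x, b1))
    return softmax(matvec(W2, hidden, b2))

# Tiny example: 2 inputs, 2 hidden units, 3 classes.
W1 = [[0.5, -0.2], [0.3, 0.8]]
b1 = [0.0, 0.1]
W2 = [[1.0, -1.0], [0.2, 0.4], [-0.5, 0.6]]
b2 = [0.0, 0.0, 0.0]
probs = forward([1.0, 2.0], W1, b1, W2, b2)
```

The predicted class is simply the index of the largest entry of `probs`, which is what a `predict` method returns.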

Week 6 (July 1 – July 7) Progress

Added tests to classdef "ClassificationSVM" and "fitcsvm"; started studying algorithms and mathematics for implementing the ClassificationNeuralNetwork classdef

This week, I added some tests to the ClassificationSVM.m and fitcsvm.m files. I also started studying the mathematics behind neural network classifiers. After studying the MATLAB documentation, I identified the following components for the ClassificationNeuralNetwork class:

- Weights initializer: Layer weights are initialized by 'glorot' or 'he'.
- Biases initializer: Biases are initialized by 'zeros' or 'ones'.
- Loss function: Cross-entropy.
- Solver: L-BFGS (limited-memory quasi-Newton code for bound-constrained optimization).

After some research, I have outlined the workflow that I will be following:

1. Initialize weights and biases using the initializerParameter function.
2. Implement forward propagation.
3. Implement the predict function (using the initialized weights and biases)...
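The first workflow step, initialization, can be sketched as follows. This is Python for illustration only, assuming the usual formulas for these schemes (Glorot: uniform on ±sqrt(6/(fan_in+fan_out)); He: normal with std sqrt(2/fan_in)); the Octave helper's actual name and signature may differ:

```python
import math
import random

def init_weights(fan_in, fan_out, method='glorot', rng=random.Random(0)):
    """Return a fan_out x fan_in weight matrix initialized per `method`."""
    if method == 'glorot':
        # Uniform on [-limit, limit] with limit = sqrt(6 / (fan_in + fan_out)).
        limit = math.sqrt(6.0 / (fan_in + fan_out))
        return [[rng.uniform(-limit, limit) for _ in range(fan_in)]
                for _ in range(fan_out)]
    elif method == 'he':
        # Zero-mean normal with std = sqrt(2 / fan_in), suited to ReLU layers.
        std = math.sqrt(2.0 / fan_in)
        return [[rng.gauss(0.0, std) for _ in range(fan_in)]
                for _ in range(fan_out)]
    raise ValueError("method must be 'glorot' or 'he'")

def init_biases(n, method='zeros'):
    """Biases initialized to all zeros or all ones."""
    return [0.0] * n if method == 'zeros' else [1.0] * n

# One layer mapping 4 inputs to 8 hidden units.
W = init_weights(4, 8, 'glorot')
b = init_biases(8, 'zeros')
```

A fixed seed makes the initialization reproducible, which is handy when comparing forward-pass outputs against MATLAB.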