In this project, our goal is to implement a stochastic gradient descent algorithm for a neural network with one hidden layer.
See our implementation of the NNetOneSplit function in R here.
You can also see our implementation of the gradient descent algorithm in R here.
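For context, here is a minimal NumPy sketch of what a stochastic gradient descent algorithm for a one-hidden-layer network can look like: the weights of both layers are updated one observation at a time using the gradient of the squared error. This is only an illustration; the function name `sgd_one_hidden_layer` and its arguments are hypothetical and do not match the NNetOneSplit implementation linked above.

```python
# A minimal NumPy sketch of stochastic gradient descent for a network with one
# hidden layer (sigmoid hidden units, squared-error loss). This is only an
# illustration of the general idea, not the NNetOneSplit code linked above;
# the function and argument names here are hypothetical.
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def sgd_one_hidden_layer(X, y, n_hidden=10, max_epochs=100, step_size=0.1, seed=1):
    n_samples, n_features = X.shape
    rng = np.random.default_rng(seed)
    # Small random initial weights for both layers.
    V = rng.normal(scale=0.1, size=(n_features, n_hidden))  # input -> hidden
    w = rng.normal(scale=0.1, size=n_hidden)                # hidden -> output
    for epoch in range(max_epochs):
        # Visit the observations in a random order, one at a time.
        for i in rng.permutation(n_samples):
            x_i, y_i = X[i], y[i]
            a = sigmoid(x_i @ V)          # hidden layer activations
            y_hat = a @ w                 # predicted output (real-valued)
            err = y_hat - y_i             # derivative of 0.5 * (y_hat - y)^2
            grad_w = err * a              # gradient for output weights
            grad_V = np.outer(x_i, err * w * a * (1.0 - a))  # gradient for hidden weights
            w -= step_size * grad_w       # one stochastic gradient step per layer
            V -= step_size * grad_V
    return V, w
```

For example, calling `V, w = sgd_one_hidden_layer(X_train, y_train)` on a train split would return the learned weights of both layers.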
In case you cannot run the notebook, we have generated a web page version of the notebook with outputs.
You can view it here, or see its source code here.
Because the project is written as a Python notebook, you need Jupyter installed on your machine.
You can install it here.
You can use git clone to clone the project, or just click the green button to download a ZIP file.
Use any tools you like to unzip the project into the folder you want.
Remember to set Jupyter's default startup directory to the folder where you placed the .ipynb file. Open Jupyter Notebook in your browser and open
project4.ipynb
Click 'Run' to execute each cell.
Some cells may take a while, since they train the model and load data.
This is our fourth group project for the CS499 Deep Learning course in Spring 2020 at NAU.
You can find the requirements for this project here.
- Dr. T. D. Hocking - tdhock at SICCS
- Zhenyu Lei - lei37927
- Jianxuan Yao - JianxuanA
- Shuyue Qiao - SHUYUEQIAO
Any cloning or downloading of this project before the project due date constitutes an infringement of our intellectual property rights; after the due date, the project becomes open source. Zhenyu Lei, Jianxuan Yao, and Shuyue Qiao will report any such infringement to the NAU Academic Integrity Hearing Board.