A neural network implemented from scratch in Go to classify MNIST digits.
- Input Layer: 784 units.
- Hidden Dense Layer: 128 units (ReLU activation).
- Output Dense Layer: 10 units (Softmax activation).
- Loss Function: Cross-Entropy (a forward-pass sketch follows this list).
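
As a rough illustration of this architecture, here is a minimal forward-pass sketch in Go. The types and helpers (`Dense`, `relu`, `softmax`, `crossEntropy`) are illustrative assumptions, not necessarily the identifiers used in this repository.

```go
package main

import (
	"fmt"
	"math"
	"math/rand"
)

// Dense is a fully connected layer: out = W*x + b.
type Dense struct {
	W [][]float64 // W[i][j] connects input j to output unit i
	B []float64
}

// NewDense creates a layer with small random weights and zero biases.
func NewDense(in, out int) *Dense {
	w := make([][]float64, out)
	for i := range w {
		w[i] = make([]float64, in)
		for j := range w[i] {
			w[i][j] = rand.NormFloat64() * 0.01
		}
	}
	return &Dense{W: w, B: make([]float64, out)}
}

// Forward computes W*x + b for a single input vector.
func (d *Dense) Forward(x []float64) []float64 {
	out := make([]float64, len(d.W))
	for i, row := range d.W {
		sum := d.B[i]
		for j, w := range row {
			sum += w * x[j]
		}
		out[i] = sum
	}
	return out
}

// relu applies max(0, v) element-wise.
func relu(v []float64) []float64 {
	out := make([]float64, len(v))
	for i, x := range v {
		if x > 0 {
			out[i] = x
		}
	}
	return out
}

// softmax converts logits to probabilities, shifted by the max for numerical stability.
func softmax(v []float64) []float64 {
	maxV := v[0]
	for _, x := range v {
		if x > maxV {
			maxV = x
		}
	}
	out, sum := make([]float64, len(v)), 0.0
	for i, x := range v {
		out[i] = math.Exp(x - maxV)
		sum += out[i]
	}
	for i := range out {
		out[i] /= sum
	}
	return out
}

// crossEntropy is -log(p[label]) for a one-hot target.
func crossEntropy(probs []float64, label int) float64 {
	return -math.Log(probs[label] + 1e-12)
}

func main() {
	hidden, output := NewDense(784, 128), NewDense(128, 10)
	image := make([]float64, 784) // dummy all-zero "image"
	probs := softmax(output.Forward(relu(hidden.Forward(image))))
	fmt.Printf("loss for label 3: %.4f\n", crossEntropy(probs, 3))
}
```

Training additionally requires backpropagating the cross-entropy gradient through both layers, which is omitted from this sketch.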
Clone the repository and ensure Go is installed on your system; the code should be compatible with Go 1.23.4. Then run:
go run .
Alternatively, you can build an executable:
go build -o main . && ./main
Place the MNIST dataset files in a directory named `datasets`. Ensure the following files are included (a loader sketch follows the list):
- t10k-images-idx3-ubyte
- t10k-labels-idx1-ubyte
- train-images-idx3-ubyte
- train-labels-idx1-ubyte
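
These files use the binary IDX format: a big-endian header (a magic number followed by dimension sizes) and then raw unsigned bytes. The sketch below shows one way such files could be parsed in Go; the function names and the [0, 1] pixel scaling are illustrative assumptions, not necessarily how this project loads the data.

```go
package main

import (
	"encoding/binary"
	"fmt"
	"io"
	"os"
)

// readImages parses an IDX3 image file (magic 2051): four big-endian int32s
// (magic, count, rows, cols), then rows*cols unsigned bytes per image.
// Pixels are scaled to [0, 1] here for convenience.
func readImages(path string) ([][]float64, error) {
	f, err := os.Open(path)
	if err != nil {
		return nil, err
	}
	defer f.Close()

	var h struct{ Magic, Count, Rows, Cols int32 }
	if err := binary.Read(f, binary.BigEndian, &h); err != nil {
		return nil, err
	}
	if h.Magic != 2051 {
		return nil, fmt.Errorf("%s: unexpected magic number %d", path, h.Magic)
	}

	size := int(h.Rows * h.Cols)
	images := make([][]float64, h.Count)
	buf := make([]byte, size)
	for i := range images {
		if _, err := io.ReadFull(f, buf); err != nil {
			return nil, err
		}
		img := make([]float64, size)
		for j, b := range buf {
			img[j] = float64(b) / 255.0
		}
		images[i] = img
	}
	return images, nil
}

// readLabels parses an IDX1 label file (magic 2049): two big-endian int32s
// (magic, count), then one unsigned byte per label.
func readLabels(path string) ([]byte, error) {
	f, err := os.Open(path)
	if err != nil {
		return nil, err
	}
	defer f.Close()

	var h struct{ Magic, Count int32 }
	if err := binary.Read(f, binary.BigEndian, &h); err != nil {
		return nil, err
	}
	if h.Magic != 2049 {
		return nil, fmt.Errorf("%s: unexpected magic number %d", path, h.Magic)
	}
	labels := make([]byte, h.Count)
	if _, err := io.ReadFull(f, labels); err != nil {
		return nil, err
	}
	return labels, nil
}

func main() {
	images, err := readImages("datasets/train-images-idx3-ubyte")
	if err != nil {
		panic(err)
	}
	labels, err := readLabels("datasets/train-labels-idx1-ubyte")
	if err != nil {
		panic(err)
	}
	fmt.Println(len(images), "images,", len(labels), "labels")
}
```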
After training, the program shows the following metrics (a computation sketch follows the list):
- Training accuracy and loss per epoch.
- Validation accuracy and loss per epoch.
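
As a rough sketch of how such per-epoch numbers could be computed, the helper below averages cross-entropy loss and accuracy over a dataset; calling it on the training and validation splits after each epoch would produce the figures above. The `predict` callback and the data layout are assumptions for illustration, not the repository's actual evaluation code.

```go
package main

import (
	"fmt"
	"math"
)

// evaluate averages cross-entropy loss and accuracy over a dataset.
// predict must return softmax probabilities for the 10 digit classes.
func evaluate(predict func([]float64) []float64, images [][]float64, labels []byte) (loss, acc float64) {
	for i, img := range images {
		probs := predict(img)
		loss += -math.Log(probs[labels[i]] + 1e-12)

		// The predicted digit is the argmax of the probabilities.
		pred := 0
		for c, p := range probs {
			if p > probs[pred] {
				pred = c
			}
		}
		if pred == int(labels[i]) {
			acc++
		}
	}
	n := float64(len(images))
	return loss / n, acc / n
}

func main() {
	// Dummy predictor returning a uniform distribution, just to show the call shape.
	uniform := func(_ []float64) []float64 {
		p := make([]float64, 10)
		for i := range p {
			p[i] = 0.1
		}
		return p
	}
	images := [][]float64{make([]float64, 784)}
	labels := []byte{7}
	loss, acc := evaluate(uniform, images, labels)
	fmt.Printf("loss=%.4f acc=%.2f\n", loss, acc)
}
```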