
Neural Network from Scratch (NumPy)

πŸ”— Project Link

https://github.com/Bitu-Singh-Rathoud/neural-network-numpy

πŸ“Œ Overview

This project implements a neural network from scratch using NumPy without relying on high-level deep learning libraries. It demonstrates core concepts like forward propagation, backpropagation, and gradient descent for digit classification.
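The three core concepts named above can be sketched in a few dozen lines of NumPy. This is a minimal illustration, not the repository's actual code: it assumes a single hidden layer with ReLU activation, a softmax output with cross-entropy loss, and illustrative layer sizes and learning rate.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy batch: 4 samples with 784 features (MNIST-sized inputs), 10 classes.
X = rng.standard_normal((4, 784))
y = np.eye(10)[rng.integers(0, 10, size=4)]  # one-hot labels

# Small random initialization; sizes are illustrative, not from the repo.
W1 = rng.standard_normal((784, 64)) * 0.01
b1 = np.zeros(64)
W2 = rng.standard_normal((64, 10)) * 0.01
b2 = np.zeros(10)

def forward(X):
    # Forward propagation: linear -> ReLU -> linear -> softmax.
    z1 = X @ W1 + b1
    a1 = np.maximum(z1, 0)                       # ReLU
    z2 = a1 @ W2 + b2
    e = np.exp(z2 - z2.max(axis=1, keepdims=True))
    probs = e / e.sum(axis=1, keepdims=True)     # softmax
    return z1, a1, probs

def cross_entropy(probs, y):
    return -np.mean(np.sum(y * np.log(probs + 1e-12), axis=1))

lr = 0.1
losses = []
for step in range(50):
    z1, a1, probs = forward(X)
    losses.append(cross_entropy(probs, y))

    # Backpropagation: softmax + cross-entropy gives (probs - y) directly.
    dz2 = (probs - y) / len(X)
    dW2 = a1.T @ dz2
    db2 = dz2.sum(axis=0)
    dz1 = (dz2 @ W2.T) * (z1 > 0)                # ReLU gradient
    dW1 = X.T @ dz1
    db1 = dz1.sum(axis=0)

    # Gradient descent: step each parameter against its gradient.
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2
```

On this tiny batch the loss falls quickly, which is the same "learning through loss reduction" behavior the project reports at MNIST scale.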

πŸš€ Features

  • Fully connected neural network implementation
  • Forward propagation and backpropagation
  • Gradient descent optimization
  • Activation functions (ReLU, Sigmoid, Softmax)
  • Training on MNIST dataset
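The three activation functions listed above have standard NumPy implementations; a sketch of how they are commonly written (the exact definitions in the repository may differ):

```python
import numpy as np

def relu(z):
    # Zero out negative pre-activations, pass positives through.
    return np.maximum(z, 0)

def sigmoid(z):
    # Squash values into (0, 1).
    return 1.0 / (1.0 + np.exp(-z))

def softmax(z):
    # Subtract the row max before exponentiating for numerical stability;
    # this does not change the result because softmax is shift-invariant.
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)
```

ReLU is the usual choice for hidden layers, while softmax converts the final layer's scores into a probability distribution over the ten digit classes.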

πŸ›  Tech Stack

  • Python
  • NumPy
  • Machine Learning
  • Deep Learning Fundamentals

πŸ“Š Results

  • Classifies handwritten digits from the MNIST dataset
  • Demonstrates learning through loss reduction over training epochs
  • Visualizes prediction outputs and model performance
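Classification accuracy on MNIST is typically computed by taking the argmax of the network's output probabilities and comparing against the true labels; a sketch (the helper name is hypothetical, not from the repository):

```python
import numpy as np

def accuracy(probs, labels):
    # Predicted class is the index of the highest output probability per row.
    preds = probs.argmax(axis=1)
    return float((preds == labels).mean())

# Example: 4 samples, 2 classes, all predictions correct.
probs = np.array([[0.1, 0.9], [0.8, 0.2], [0.3, 0.7], [0.6, 0.4]])
labels = np.array([1, 0, 1, 0])
```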

▢️ How to Run

```shell
pip install -r requirements.txt
python main.py
```
