Neural Network From Scratch

This project implements a neural network from scratch in Python. The implementation covers model architecture, training with backpropagation, evaluation, and optimization, and is structured to be modular and easily extensible.

Table of Contents

  • Features
  • Implementation
  • Installation
  • Usage
  • License

Features

  • Customizable Model Architecture: Supports multiple layers with configurable activation functions and layer sizes.
  • Training and Evaluation: Implements training loops with backpropagation and optimizers.
  • Dataset: Loads and preprocesses the EMNIST dataset efficiently.
  • Performance Metrics: Computes accuracy, precision, recall, and F1 score for classification tasks (see the sketch after this list).
  • Hyperparameter Tuning: Allows modification of learning rates, batch sizes, and optimization techniques.
  • Report: Generates a detailed report with training and evaluation metrics.
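The metrics listed above can be computed directly from predicted and true labels. Below is a minimal, illustrative sketch of macro-averaged metrics using NumPy; the function and variable names are hypothetical and may differ from the notebook's actual code.

import numpy as np

# Illustrative metric computation (macro-averaged over classes);
# names here are assumptions, not the repository's actual API.
def classification_metrics(y_true, y_pred, num_classes):
    accuracy = np.mean(y_true == y_pred)
    precisions, recalls, f1s = [], [], []
    for c in range(num_classes):
        tp = np.sum((y_pred == c) & (y_true == c))
        fp = np.sum((y_pred == c) & (y_true != c))
        fn = np.sum((y_pred != c) & (y_true == c))
        p = tp / (tp + fp) if tp + fp else 0.0
        r = tp / (tp + fn) if tp + fn else 0.0
        precisions.append(p)
        recalls.append(r)
        f1s.append(2 * p * r / (p + r) if p + r else 0.0)
    return {"accuracy": accuracy,
            "precision": np.mean(precisions),
            "recall": np.mean(recalls),
            "f1": np.mean(f1s)}

# Example usage with dummy labels
y_true = np.array([0, 1, 2, 2, 1])
y_pred = np.array([0, 2, 2, 2, 1])
print(classification_metrics(y_true, y_pred, num_classes=3))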

Implementation

The core implementation includes the following components:

Model Definition

The model consists of the following layers (sketched in code after the list):

  • Input Layer
  • Hidden Dense Layers (with ReLU as activation function)
  • Dropout Layer (for regularization)
  • Output Layer (for classification or regression tasks)
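The snippet below sketches how such a layer stack might be defined from scratch with NumPy. The class names, the initialization scheme, and the 26-class output (matching the EMNIST letters split) are illustrative assumptions, not the repository's exact API; backward passes are omitted for brevity.

import numpy as np

# Hypothetical layer classes illustrating the architecture described above.
class Dense:
    """Fully connected layer: y = xW + b."""
    def __init__(self, in_features, out_features):
        # He initialization pairs well with ReLU activations
        self.W = np.random.randn(in_features, out_features) * np.sqrt(2.0 / in_features)
        self.b = np.zeros(out_features)

    def forward(self, x):
        self.x = x
        return x @ self.W + self.b

class ReLU:
    def forward(self, x):
        self.mask = x > 0
        return x * self.mask

class Dropout:
    """Inverted dropout: randomly zeroes activations during training."""
    def __init__(self, p=0.2):
        self.p = p

    def forward(self, x, training=True):
        if not training:
            return x
        self.mask = (np.random.rand(*x.shape) >= self.p) / (1.0 - self.p)
        return x * self.mask

# Example stack: 784 inputs (28x28 EMNIST images), one hidden layer, 26 output classes
model = [Dense(784, 128), ReLU(), Dropout(0.2), Dense(128, 26)]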

Training Process

The training process involves the following steps (a simplified loop is sketched after the list):

  1. Forward propagation
  2. Loss computation (cross-entropy for classification)
  3. Backpropagation
  4. Parameter updates using the Adam optimizer
  5. Performance tracking
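The following self-contained sketch walks through these five steps for a single linear layer on dummy data, including a hand-rolled Adam update. It is a simplified illustration of the ideas, not the notebook's actual training code.

import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((64, 784))          # dummy mini-batch of 64 flattened "images"
y = rng.integers(0, 26, size=64)            # dummy integer class labels

W = rng.standard_normal((784, 26)) * 0.01   # weights of a single linear layer
b = np.zeros(26)

# Adam optimizer state and hyperparameters
mW, vW = np.zeros_like(W), np.zeros_like(W)
mb, vb = np.zeros_like(b), np.zeros_like(b)
lr, beta1, beta2, eps = 1e-3, 0.9, 0.999, 1e-8

for t in range(1, 101):
    # 1. Forward propagation (softmax over logits)
    logits = X @ W + b
    logits -= logits.max(axis=1, keepdims=True)     # numerical stability
    probs = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)

    # 2. Cross-entropy loss
    loss = -np.log(probs[np.arange(len(y)), y]).mean()

    # 3. Backpropagation (gradient of softmax + cross-entropy)
    dlogits = probs.copy()
    dlogits[np.arange(len(y)), y] -= 1
    dlogits /= len(y)
    dW, db = X.T @ dlogits, dlogits.sum(axis=0)

    # 4. Adam parameter update
    for param, grad, m, v in ((W, dW, mW, vW), (b, db, mb, vb)):
        m[:] = beta1 * m + (1 - beta1) * grad
        v[:] = beta2 * v + (1 - beta2) * grad ** 2
        m_hat = m / (1 - beta1 ** t)
        v_hat = v / (1 - beta2 ** t)
        param -= lr * m_hat / (np.sqrt(v_hat) + eps)

    # 5. Performance tracking
    if t % 20 == 0:
        acc = (probs.argmax(axis=1) == y).mean()
        print(f"step {t:3d}  loss {loss:.4f}  acc {acc:.2f}")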

Visualization

The implementation includes visualization tools to analyze the following (an example plotting snippet follows the list):

  • Training and validation loss
  • Accuracy trends
  • Model performance metrics
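A typical way to produce such plots is shown below. The history dictionary and its keys are assumptions about how per-epoch metrics might be stored, not the notebook's actual data structure.

import matplotlib.pyplot as plt

# Hypothetical per-epoch metrics collected during training
history = {
    "train_loss": [2.1, 1.4, 1.0, 0.8],
    "val_loss":   [2.0, 1.5, 1.2, 1.1],
    "train_acc":  [0.35, 0.55, 0.68, 0.74],
    "val_acc":    [0.37, 0.52, 0.61, 0.65],
}

fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(10, 4))

# Training and validation loss
ax1.plot(history["train_loss"], label="train")
ax1.plot(history["val_loss"], label="validation")
ax1.set_xlabel("epoch"); ax1.set_ylabel("loss"); ax1.legend()

# Accuracy trends
ax2.plot(history["train_acc"], label="train")
ax2.plot(history["val_acc"], label="validation")
ax2.set_xlabel("epoch"); ax2.set_ylabel("accuracy"); ax2.legend()

plt.tight_layout()
plt.show()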

Installation

To run this project, first install the required dependencies:

pip install -r requirements.txt

Usage

Run the neural_net.ipynb notebook step by step to train and evaluate the neural network.

License

This project is licensed under the MIT License.
