A minimal feedforward neural network for multiclass classification, implemented in PyTorch and demonstrated on the Iris dataset.
This project implements a custom feedforward NN from scratch (using PyTorch tensors and autograd) to classify Iris flowers into three species: setosa, versicolor, and virginica. All details—architecture, data loading, training loop, and evaluation—are in the notebook.
- Custom architecture: configurable layer count and widths (no `nn.Linear` stack; uses `nn.Parameter` and a manual forward pass).
- Iris dataset: loaded via scikit-learn; 90% train / 10% test split.
- Training: SGD optimizer, cross-entropy loss, ReLU in hidden layers, softmax at the output (applied implicitly by `CrossEntropyLoss`).
- Metrics: training and test loss plus accuracy logged every 10 epochs.
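The "manual forward pass" idea above can be sketched as follows. This is an illustrative reconstruction, not the notebook's actual class: the name `FeedForwardSketch`, the default widths, and the initialization scheme are assumptions; only the overall approach (weight/bias pairs as `nn.Parameter`, ReLU between hidden layers, raw logits for `CrossEntropyLoss`) comes from the description.

```python
import torch

# Hypothetical sketch: each layer is a weight/bias pair stored as an
# nn.Parameter, and forward() chains matmuls with ReLU on hidden layers.
class FeedForwardSketch(torch.nn.Module):
    def __init__(self, widths=(4, 6, 8, 10, 3)):
        super().__init__()
        self.weights = torch.nn.ParameterList(
            torch.nn.Parameter(torch.randn(i, o) * 0.1)
            for i, o in zip(widths, widths[1:])
        )
        self.biases = torch.nn.ParameterList(
            torch.nn.Parameter(torch.zeros(o)) for o in widths[1:]
        )

    def forward(self, x):
        for i, (w, b) in enumerate(zip(self.weights, self.biases)):
            x = x @ w + b
            if i < len(self.weights) - 1:  # ReLU on hidden layers only
                x = torch.relu(x)
        return x  # raw logits; CrossEntropyLoss applies log-softmax internally


logits = FeedForwardSketch()(torch.randn(5, 4))
print(logits.shape)  # torch.Size([5, 3])
```

Returning raw logits and letting `CrossEntropyLoss` handle the softmax is the standard PyTorch pattern, which is why no explicit softmax appears in the forward pass.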
Requirements:
- Python 3.x
- PyTorch
- scikit-learn
- NumPy
- Matplotlib
Install with:

```
pip install torch scikit-learn numpy matplotlib
```
Clone the repository:

```
git clone https://github.com/AugusGuarna/Feedforward_NN.git
cd Feedforward_NN
```
Open and run the notebook:
- Locally: open `NN_multiclass_pred.ipynb` in Jupyter or VS Code.
- Online: use the "Open in Colab" badge at the top of the notebook.
No extra setup or data download is required; the notebook loads the Iris dataset automatically.
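For reference, the automatic data pipeline amounts to something like the sketch below: load Iris from scikit-learn, take the 90/10 train/test split, and wrap the tensors in DataLoaders. The batch size, `random_state`, and use of stratification here are illustrative assumptions, not values confirmed by the notebook.

```python
import torch
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from torch.utils.data import DataLoader, TensorDataset

# Load Iris and split 90% train / 10% test (150 samples -> 135 / 15).
X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.1, stratify=y, random_state=0
)

# Wrap the splits in TensorDatasets and DataLoaders for the training loop.
train_ds = TensorDataset(torch.tensor(X_train, dtype=torch.float32),
                         torch.tensor(y_train, dtype=torch.long))
test_ds = TensorDataset(torch.tensor(X_test, dtype=torch.float32),
                        torch.tensor(y_test, dtype=torch.long))
train_loader = DataLoader(train_ds, batch_size=16, shuffle=True)
test_loader = DataLoader(test_ds, batch_size=16)

print(len(train_ds), len(test_ds))  # 135 15
```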
```
Feedforward_NN/
├── README.md
└── NN_multiclass_pred.ipynb   # Full implementation and walkthrough
```
| Section | Description |
|---|---|
| Architecture | Input (4) → hidden (6, 8, 10) → output (3); ReLU and softmax. |
| Data | Iris loaded, encoded, split into train/test, wrapped in DataLoaders. |
| Model | FeedForwardNeuralNetwork class and parameter check. |
| Training | Loss, optimizer, 200-epoch loop with train/test loss and accuracy. |
| Results | Short discussion of performance and data size. |
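The Training row of the table can be sketched as a minimal loop: SGD plus `CrossEntropyLoss` over 200 epochs, logging every 10. The small `nn.Sequential` model, learning rate, and full-batch updates here are stand-in assumptions; the notebook uses its own custom class and hidden widths.

```python
import torch
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split

torch.manual_seed(0)  # reproducible illustrative run

# Load and split the data (same 90/10 split as the notebook describes).
X, y = load_iris(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.1, random_state=0)
X_tr, X_te = (torch.tensor(a, dtype=torch.float32) for a in (X_tr, X_te))
y_tr, y_te = (torch.tensor(a, dtype=torch.long) for a in (y_tr, y_te))

# Stand-in model; the notebook's FeedForwardNeuralNetwork class replaces this.
model = torch.nn.Sequential(
    torch.nn.Linear(4, 6), torch.nn.ReLU(),
    torch.nn.Linear(6, 3),
)
loss_fn = torch.nn.CrossEntropyLoss()
opt = torch.optim.SGD(model.parameters(), lr=0.05)

for epoch in range(200):
    opt.zero_grad()
    loss = loss_fn(model(X_tr), y_tr)
    loss.backward()
    opt.step()
    if (epoch + 1) % 10 == 0:  # log every 10 epochs
        with torch.no_grad():
            acc = (model(X_te).argmax(dim=1) == y_te).float().mean()
        print(f"epoch {epoch + 1}: train loss {loss.item():.3f}, "
              f"test acc {acc:.2f}")
```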
This project is open source. Use and adapt as needed.