
Mofazzal874/Neural-Network


Ongoing Project (Learning Phase)

Attention in Transformers - DL.AI

This folder is part of the larger Neural-Network project and focuses on implementing concepts related to attention mechanisms in transformers. Each file is a Jupyter notebook containing code with detailed explanations; a minimal sketch of the core attention computation follows the table below.


Repository Structure

| File Name | Description |
|---|---|
| Lesson_3-Self-Attention.ipynb | Implementation of the self-attention mechanism, explaining how attention weights are computed and applied to input sequences. |
| Lesson_6-Masked_Self-Attention.ipynb | Implementation of masked self-attention, used in tasks like autoregressive modeling where future tokens are masked during computation. |
| Lesson_9-Attention(Self_Masked_Encoder-Decoder_Multi-Head).ipynb | Comprehensive implementation covering self-attention, masked attention, encoder-decoder attention, and multi-head attention mechanisms. |
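As a reference point, here is a minimal sketch of scaled dot-product self-attention with an optional causal mask, assuming PyTorch; the function and weight names (`self_attention`, `W_q`, `W_k`, `W_v`) are illustrative rather than taken from the notebooks. Multi-head attention (Lesson 9) runs several such projections in parallel and concatenates the outputs.

```python
import torch
import torch.nn.functional as F

def self_attention(x, W_q, W_k, W_v, causal=False):
    """Scaled dot-product self-attention over a sequence x of shape (T, d_model)."""
    Q = x @ W_q  # queries, (T, d_k)
    K = x @ W_k  # keys,    (T, d_k)
    V = x @ W_v  # values,  (T, d_v)
    d_k = K.shape[-1]
    scores = Q @ K.T / d_k**0.5  # similarity of every token to every other, (T, T)
    if causal:
        # Masked self-attention: each position may only attend to itself and
        # earlier positions, as required for autoregressive modeling.
        mask = torch.tril(torch.ones_like(scores)).bool()
        scores = scores.masked_fill(~mask, float('-inf'))
    weights = F.softmax(scores, dim=-1)  # attention weights; each row sums to 1
    return weights @ V                   # weighted sum of values, (T, d_v)

# Toy usage: 4 tokens, model dimension 8.
torch.manual_seed(0)
T, d_model = 4, 8
x = torch.randn(T, d_model)
W_q, W_k, W_v = (torch.randn(d_model, d_model) for _ in range(3))
out = self_attention(x, W_q, W_k, W_v, causal=True)
print(out.shape)  # torch.Size([4, 8])
```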

MakeMore

This section focuses on implementing character-level language models of increasing complexity, starting from a simple bigram model and progressing to an MLP-based variant. A counting-based bigram sketch follows the table below.

Repository Structure

| File Name | Description |
|---|---|
| Bigrams(part-01).ipynb | Initial implementation of the bigram character-level language model. |
| Bigrams(Part-02).ipynb | Advanced concepts and improvements to the bigram model. |
| Bigrams(part-03- with MLP).ipynb | Implementation of the bigram model using a Multi-Layer Perceptron (MLP). |
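For orientation, below is a minimal counting-based bigram sketch in the spirit of these notebooks, assuming PyTorch; the tiny inline word list stands in for the real training data, and the variable names (`stoi`, `itos`, `N`, `P`) are illustrative. The MLP notebook replaces the count table with learned parameters trained by gradient descent.

```python
import torch

# Tiny corpus stands in for the real training data; '.' marks word boundaries.
words = ["emma", "olivia", "ava", "isabella", "sophia"]
chars = sorted(set("".join(words)))
stoi = {s: i + 1 for i, s in enumerate(chars)}
stoi['.'] = 0
itos = {i: s for s, i in stoi.items()}

# Count how often each character follows each other character.
N = torch.zeros((len(stoi), len(stoi)), dtype=torch.int32)
for w in words:
    for ch1, ch2 in zip('.' + w, w + '.'):
        N[stoi[ch1], stoi[ch2]] += 1

# Normalize rows into conditional probabilities P(next char | current char).
P = (N + 1).float()  # add-one smoothing avoids zero-probability bigrams
P /= P.sum(dim=1, keepdim=True)

# Sample a new "name" one character at a time until the end marker.
g = torch.Generator().manual_seed(42)
ix, out = 0, []
while True:
    ix = torch.multinomial(P[ix], num_samples=1, generator=g).item()
    if ix == 0:
        break
    out.append(itos[ix])
print("".join(out))
```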

Micrograd

This section contains implementations related to building a neural network library from scratch, including exercises and detailed explanations; a condensed sketch of the core autograd idea follows the table below.

Repository Structure

| File Name | Description |
|---|---|
| micrograd_from_scratch.ipynb | Implementation of a minimal neural network library from scratch. |
| micrograd_exercises.ipynb | Practice exercises for understanding neural network concepts. |
| Building out a neural Net library(Multi Layered Perceptrons).ipynb | Comprehensive implementation of Multi-Layer Perceptrons using the micrograd library. |
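As a condensed sketch of the central idea, the snippet below implements a scalar `Value` that records the operations producing it and backpropagates gradients via the chain rule. It covers only addition and multiplication, whereas the notebooks build out the full set of operations and an MLP on top.

```python
class Value:
    """A scalar that records its computation graph so gradients can be
    propagated backward through it."""
    def __init__(self, data, _children=()):
        self.data = data
        self.grad = 0.0
        self._backward = lambda: None
        self._prev = set(_children)

    def __add__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data + other.data, (self, other))
        def _backward():
            self.grad += out.grad   # d(a+b)/da = 1
            other.grad += out.grad  # d(a+b)/db = 1
        out._backward = _backward
        return out

    def __mul__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data * other.data, (self, other))
        def _backward():
            self.grad += other.data * out.grad  # d(a*b)/da = b
            other.grad += self.data * out.grad  # d(a*b)/db = a
        out._backward = _backward
        return out

    def backward(self):
        # Topologically order the graph, then apply the chain rule in reverse.
        topo, visited = [], set()
        def build(v):
            if v not in visited:
                visited.add(v)
                for child in v._prev:
                    build(child)
                topo.append(v)
        build(self)
        self.grad = 1.0
        for v in reversed(topo):
            v._backward()

# Toy usage: loss = a*b + a, so d(loss)/da = b + 1 and d(loss)/db = a.
a, b = Value(2.0), Value(3.0)
loss = a * b + a
loss.backward()
print(a.grad, b.grad)  # 4.0 2.0
```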

About

My learning on neural networks so far.
