
# deep learning

Hands-on deep learning practice in PyTorch, working through core concepts from shallow networks to modern architectures. Notebooks follow the progression in *Deep Learning Illustrated* (Krohn, Beyleveld, Bassens) with extensions into transformers and beyond.

## Stack

- Python, PyTorch, torchvision, matplotlib
- Dataset: MNIST (digit classification throughout the fundamentals)
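For reference, a minimal sketch of loading MNIST with torchvision; the exact transforms, paths, and batch sizes used in the notebooks may differ:

```python
import torch
from torchvision import datasets, transforms

# Convert to tensors and normalize with MNIST's commonly used mean/std
transform = transforms.Compose([
    transforms.ToTensor(),
    transforms.Normalize((0.1307,), (0.3081,)),
])

# Downloads the dataset to ./data on first run (path is illustrative)
train_set = datasets.MNIST(root="./data", train=True, download=True, transform=transform)
train_loader = torch.utils.data.DataLoader(train_set, batch_size=64, shuffle=True)

images, labels = next(iter(train_loader))
print(images.shape)  # torch.Size([64, 1, 28, 28])
```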

## Notebook Index

### Foundations (Deep Learning Illustrated)

- `1_shallow_net.ipynb`: shallow neural network (forward pass, weights, biases)
- `2_activation_functions.ipynb`: sigmoid, tanh, ReLU (comparison and intuition)
- `3_cost_functions.ipynb`: MSE and cross-entropy
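As a taste of what the foundations notebooks cover, here is a minimal sketch of a shallow network's forward pass in PyTorch; layer sizes are illustrative, not taken from the notebooks:

```python
import torch
import torch.nn as nn

# One hidden layer: 784 MNIST pixels -> 64 hidden units -> 10 digit classes
model = nn.Sequential(
    nn.Linear(784, 64),   # weights W and biases b of the hidden layer
    nn.Sigmoid(),         # activation a = sigmoid(Wx + b); swap in nn.Tanh() or nn.ReLU()
    nn.Linear(64, 10),    # output layer producing one score per digit
)

x = torch.rand(1, 784)          # a fake flattened 28x28 image
logits = model(x)               # forward propagation
probs = logits.softmax(dim=-1)  # softmax turns scores into probabilities
print(probs.shape)  # torch.Size([1, 10])
```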

**In progress:**

- Backpropagation & gradient descent
- Convolutional neural networks (CNNs)
- Recurrent neural networks (RNNs)
- LSTMs
- Transformers

**Concepts covered so far:**

- Parameters: weight *w*, bias *b*, activation *a*
- Artificial neurons: sigmoid, tanh, ReLU
- Layers: input, hidden, output; layer types: dense/fully connected, softmax
- Cost/loss functions: MSE, cross-entropy
- Forward propagation
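A quick sketch of the two cost functions listed above, assuming PyTorch's built-in losses (note that `nn.CrossEntropyLoss` expects raw logits, not softmax outputs):

```python
import torch
import torch.nn as nn

logits = torch.randn(4, 10)           # raw scores for a batch of 4 examples
targets = torch.tensor([3, 7, 0, 1])  # true digit labels

# Cross-entropy: applies log-softmax internally, so pass logits directly
ce = nn.CrossEntropyLoss()(logits, targets)

# MSE: compares continuous values, e.g. softmax probabilities vs. one-hot targets
one_hot = nn.functional.one_hot(targets, num_classes=10).float()
mse = nn.MSELoss()(logits.softmax(dim=-1), one_hot)

print(ce.item(), mse.item())
```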

