Hands-on deep learning practice in PyTorch, working through core concepts from shallow networks to modern architectures. The notebooks follow the progression in *Deep Learning Illustrated* (Krohn, Beyleveld, Bassens), with extensions into transformers and beyond.
- Stack: Python, PyTorch, torchvision, matplotlib
- Dataset: MNIST (digit classification throughout the fundamentals)
- `1_shallow_net.ipynb`: Shallow neural network (forward pass, weights, biases)
- `2_activation_functions.ipynb`: Sigmoid, tanh, ReLU (comparison and intuition)
- `3_cost_functions.ipynb`: MSE and cross-entropy

Minimal sketches of each topic follow below.
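As a flavor of the first notebook, here is a minimal sketch of a shallow network's forward pass in PyTorch. The layer sizes (784 inputs, 64 hidden units, 10 outputs) are illustrative assumptions for MNIST, not necessarily the notebook's exact configuration:

```python
import torch
import torch.nn as nn

# Shallow network: one dense (fully connected) hidden layer.
# 784 inputs = a flattened 28x28 MNIST image; 10 outputs = digit classes.
model = nn.Sequential(
    nn.Linear(784, 64),  # weights w and biases b live inside nn.Linear
    nn.Sigmoid(),        # hidden activation a = sigmoid(z)
    nn.Linear(64, 10),   # output layer: one raw score per digit
)

x = torch.randn(1, 784)  # stand-in for one flattened image
logits = model(x)        # forward pass: input -> hidden -> output
print(logits.shape)      # torch.Size([1, 10])
```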
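The activation-function comparison boils down to plotting the three curves over the same range of weighted inputs; a minimal sketch (the range [-5, 5] is my own choice):

```python
import torch
import matplotlib.pyplot as plt

z = torch.linspace(-5, 5, 200)  # weighted inputs to a neuron

# The three activations compared in the notebook.
for name, fn in [("sigmoid", torch.sigmoid),
                 ("tanh", torch.tanh),
                 ("ReLU", torch.relu)]:
    plt.plot(z, fn(z), label=name)

plt.xlabel("z")
plt.ylabel("a = f(z)")
plt.legend()
plt.show()
```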
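For the cost functions, PyTorch's built-in losses make the comparison concrete. In this sketch, cross-entropy is computed from raw logits and integer labels, while MSE is taken against one-hot targets; that pairing is a common convention, not the only one:

```python
import torch
import torch.nn.functional as F

logits = torch.randn(2, 10)     # raw scores: batch of 2, 10 classes
targets = torch.tensor([3, 7])  # true digit labels

# Cross-entropy takes raw logits and integer class labels directly.
ce = F.cross_entropy(logits, targets)

# MSE compares predicted probabilities against one-hot targets.
probs = F.softmax(logits, dim=1)
one_hot = F.one_hot(targets, num_classes=10).float()
mse = F.mse_loss(probs, one_hot)

print(f"cross-entropy: {ce.item():.4f}  MSE: {mse.item():.4f}")
```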
In progress:
- Backpropagation & gradient descent
- Convolutional Neural Networks (CNNs)
- Recurrent Neural Networks (RNNs)
- LSTMs
- Transformers
Concepts covered:
- Parameters: weight `w`, bias `b`, activation `a`
- Artificial neurons: sigmoid, tanh, ReLU
- Layers: input, hidden, output
- Layer types: dense/fully connected, softmax
- Cost/loss functions: MSE, cross-entropy
- Forward propagation (see the single-neuron sketch below)
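To tie the vocabulary together, here is forward propagation through a single artificial neuron, using the symbols above; the numbers are arbitrary:

```python
import torch

x = torch.tensor([0.5, -1.0])  # inputs from the previous layer
w = torch.tensor([0.8, 0.2])   # weights w (made-up values)
b = torch.tensor(0.1)          # bias b

z = w @ x + b         # weighted input: z = w . x + b
a = torch.sigmoid(z)  # activation: a = sigmoid(z)
print(f"z = {z.item():.3f}, a = {a.item():.3f}")
```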