This repository documents my structured learning journey in PyTorch and deep learning fundamentals, based on the following video course:
🔗 Course Link: https://b23.tv/unRvIzG

Topics covered:
- PyTorch fundamentals
- Backpropagation (BP) Neural Networks
- CNN-based image algorithms
- RNN-based text generation algorithms
📌 Day01:
- 🎯 Learning Objectives:
- Understand what a Tensor is
- Learn basic tensor creation methods
- Create linear and random tensors
- Create tensors filled with specific values
- Perform tensor type conversion
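The tensor basics above can be sketched in a short snippet; the shapes and values below are illustrative examples, not taken from the course:

```python
import torch

# Create a tensor from a Python list
t = torch.tensor([[1, 2], [3, 4]])

# Linear tensors: evenly spaced values
lin = torch.arange(0, 10, 2)        # 0, 2, 4, 6, 8
spaced = torch.linspace(0.0, 1.0, 5)  # 5 evenly spaced points from 0 to 1

# Random tensors (seeded for reproducibility)
torch.manual_seed(0)
r = torch.randn(2, 3)               # standard normal distribution
u = torch.rand(2, 3)                # uniform on [0, 1)

# Tensors filled with specific values
zeros = torch.zeros(2, 2)
ones = torch.ones(2, 2)
full = torch.full((2, 2), 7.0)      # every element is 7.0

# Type conversion
f = t.float()                       # int64 -> float32
i = f.type(torch.int64)             # float32 -> int64
print(t.dtype, f.dtype, i.dtype)
```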
📌 Day02:
- 🎯 Learning Objectives:
- Understand automatic differentiation in PyTorch
- Learn how gradients are computed and stored
- Implement manual gradient descent
- Update parameters using the weight update formula
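A minimal sketch of manual gradient descent with autograd; the toy objective, data, and learning rate are assumptions for illustration:

```python
import torch

# A scalar parameter with gradient tracking enabled
w = torch.tensor(2.0, requires_grad=True)

# Toy data: minimize (w * x - y)^2, whose minimum is at w = 3
x, y = torch.tensor(3.0), torch.tensor(9.0)

lr = 0.01  # learning rate (assumed value)
for _ in range(100):
    loss = (w * x - y) ** 2
    loss.backward()                 # gradient is accumulated into w.grad
    with torch.no_grad():
        w -= lr * w.grad            # weight update formula: w = w - lr * dL/dw
    w.grad.zero_()                  # clear the gradient before the next step

print(w.item())  # converges toward 3.0
```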
📌 Day03:
- 🎯 Learning Objectives:
- Understand activation functions in neural networks
- Learn how Sigmoid and Tanh map data values to specific ranges
- Plot activation functions and their derivatives
- Understand the role of autograd in gradient computation
- Learn how to detach tensors from computation graphs
- Simulate a simple linear regression training process
- Understand forward pass → loss calculation → backward update workflow
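The activation-function and detach objectives can be sketched as follows; plotting with Matplotlib is noted in a comment, and the input range is an assumed example:

```python
import torch

x = torch.linspace(-5.0, 5.0, 11, requires_grad=True)

# Sigmoid maps values into (0, 1); Tanh maps values into (-1, 1)
s = torch.sigmoid(x)
t = torch.tanh(x)

# detach() returns a tensor cut off from the computation graph,
# e.g. so values can be passed to matplotlib for plotting
s_plain = s.detach()
t_plain = t.detach()

# Derivative via autograd, compared with the analytic form
# d(sigmoid)/dx = sigmoid(x) * (1 - sigmoid(x))
s.sum().backward()
autograd_deriv = x.grad
analytic_deriv = s_plain * (1 - s_plain)
print(torch.allclose(autograd_deriv, analytic_deriv, atol=1e-6))
```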
📌 Day04:
- 🎯 Learning Objectives:
- Understand the role of loss functions in evaluating model performance.
- Learn regression losses: MAE, MSE, and Smooth L1.
- Understand Binary Cross-Entropy and Multi-class Cross-Entropy for classification.
- Learn how CrossEntropyLoss combines Softmax and loss computation.
- Build a simple neural network in PyTorch using nn.Module.
- Understand the forward pass workflow: weighted sum → activation → output.
- Learn common parameter initialization methods (Random, Xavier, Kaiming).
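The loss functions and `nn.Module` workflow above can be sketched in one snippet; the network sizes and toy values are assumed for illustration:

```python
import torch
import torch.nn as nn

# Regression losses on toy predictions and targets
pred = torch.tensor([2.5, 0.0, 2.0])
target = torch.tensor([3.0, -0.5, 2.0])
mae = nn.L1Loss()(pred, target)
mse = nn.MSELoss()(pred, target)
smooth = nn.SmoothL1Loss()(pred, target)

# CrossEntropyLoss applies log-softmax internally, so it takes raw logits
logits = torch.tensor([[2.0, 0.5, 0.1]])
label = torch.tensor([0])
ce = nn.CrossEntropyLoss()(logits, label)

# Minimal network: weighted sum -> activation -> output
class TinyNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc1 = nn.Linear(4, 8)
        self.fc2 = nn.Linear(8, 3)
        # Explicit parameter initialization (Xavier and Kaiming)
        nn.init.xavier_uniform_(self.fc1.weight)
        nn.init.kaiming_uniform_(self.fc2.weight, nonlinearity="relu")

    def forward(self, x):
        return self.fc2(torch.relu(self.fc1(x)))

out = TinyNet()(torch.randn(2, 4))
print(out.shape)  # torch.Size([2, 3])
```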
📌 Day05:
- 🎯 Learning Objectives:
- Understand how learning rate affects gradient updates and model convergence.
- Learn learning rate decay strategies: StepLR, MultiStepLR, and ExponentialLR.
- Understand optimization algorithms: Momentum, AdaGrad, RMSProp, and Adam.
- Learn Exponential Moving Average (EMA) and its role in smoothing gradients.
- Understand regularization techniques to reduce overfitting.
- Learn Dropout and how random neuron deactivation improves generalization.
- Understand Batch Normalization and how it stabilizes training.
- Apply these techniques in an ANN example for mobile phone price classification.
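A condensed sketch combining an optimizer, a StepLR schedule, Dropout, and BatchNorm; the layer sizes, learning rate, and dummy loss are assumptions rather than the course's phone-price example:

```python
import torch
import torch.nn as nn

# A small model using the regularization layers discussed above
model = nn.Sequential(
    nn.Linear(10, 32),
    nn.BatchNorm1d(32),   # normalizes activations per batch to stabilize training
    nn.ReLU(),
    nn.Dropout(p=0.5),    # randomly zeroes activations during training
    nn.Linear(32, 4),
)

optimizer = torch.optim.Adam(model.parameters(), lr=0.1)
# StepLR multiplies the learning rate by gamma every step_size epochs
scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=2, gamma=0.5)

for epoch in range(4):
    x = torch.randn(8, 10)
    loss = model(x).pow(2).mean()   # dummy loss, just to drive updates
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    scheduler.step()
    print(epoch, scheduler.get_last_lr())
```

At inference time, `model.eval()` disables Dropout and switches BatchNorm to its running statistics.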
📌 Day06:
- 🎯 Learning Objectives:
- Understand different image types (binary, grayscale, RGB) and how images are represented in arrays.
- Learn to create and visualize images using NumPy and Matplotlib.
- Understand the structure of Convolutional Neural Networks (CNN): convolution layer, pooling layer, and fully connected layer.
- Learn how convolution layers extract image features and generate feature maps.
- Understand pooling operations (max pooling and average pooling) for dimensionality reduction.
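The image representation and CNN building blocks can be sketched as follows; the image size and channel counts are illustrative assumptions:

```python
import numpy as np
import torch
import torch.nn as nn

# A grayscale image is a 2-D array; RGB adds a channel dimension
gray = np.zeros((28, 28), dtype=np.uint8)      # all-black grayscale image
rgb = np.zeros((28, 28, 3), dtype=np.uint8)    # all-black RGB image
# (matplotlib's plt.imshow(gray, cmap="gray") would visualize it)

# Convolution layer: extracts features and produces feature maps
conv = nn.Conv2d(in_channels=1, out_channels=8, kernel_size=3, padding=1)
# Max pooling: keeps the strongest response in each window, halving H and W
pool = nn.MaxPool2d(kernel_size=2)

x = torch.randn(1, 1, 28, 28)                  # (batch, channels, H, W)
fmap = conv(x)                                 # padding=1 keeps 28x28
down = pool(fmap)                              # -> 14x14
print(fmap.shape, down.shape)
```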