yulorrrrr/Pytorch_Self_Learning
🚀 PyTorch Learning & Practice

This repository documents my structured learning journey in PyTorch and deep learning fundamentals, based on the following video course:

🔗 Course Link: https://b23.tv/unRvIzG

The content covers:

  1. PyTorch fundamentals
  2. Backpropagation (BP) Neural Networks
  3. CNN-based image algorithms
  4. RNN-based text generation algorithms

🗓️ Daily Learning Breakdown:

  • 📌 Day01:

    • 🎯 Learning Objectives:
      1. Understand what a Tensor is
      2. Learn basic tensor creation methods
      3. Create linear and random tensors
      4. Create tensors filled with specific values
      5. Perform tensor type conversion
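The Day01 objectives can be sketched as follows (variable names are illustrative):

```python
import torch

# Create a tensor from Python data
t = torch.tensor([[1, 2], [3, 4]])

# Linear tensors: evenly spaced values
lin = torch.linspace(0, 1, steps=5)    # 5 points from 0.0 to 1.0
seq = torch.arange(0, 10, 2)           # 0, 2, 4, 6, 8

# Random tensors
uni = torch.rand(2, 3)                 # uniform on [0, 1)
nrm = torch.randn(2, 3)                # standard normal

# Tensors filled with specific values
zeros = torch.zeros(2, 2)
full = torch.full((2, 2), 7.0)

# Type conversion: int64 -> float32
f = t.float()
```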
  • 📌 Day02:

    • 🎯 Learning Objectives:
      1. Understand automatic differentiation in PyTorch
      2. Learn how gradients are computed and stored
      3. Implement manual gradient descent
      4. Update parameters using the weight update formula
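The Day02 objectives can be sketched with autograd and a manual weight update; the quadratic loss below is a toy example, not from the course:

```python
import torch

# A single trainable parameter; requires_grad=True makes autograd track it
w = torch.tensor(2.0, requires_grad=True)
lr = 0.1

for _ in range(50):
    loss = (w - 5.0) ** 2        # toy loss with its minimum at w = 5
    loss.backward()              # gradient is computed and stored in w.grad
    with torch.no_grad():        # the update itself must not enter the graph
        w -= lr * w.grad         # weight update formula: w = w - lr * dL/dw
    w.grad.zero_()               # gradients accumulate, so clear them each step
```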
  • 📌 Day03:

    • 🎯 Learning Objectives:
      1. Understand activation functions in neural networks
      2. Learn how Sigmoid and Tanh map data values to specific ranges
      3. Plot activation functions and their derivatives
      4. Understand the role of autograd in gradient computation
      5. Learn how to detach tensors from computation graphs
      6. Simulate a simple linear regression training process
      7. Understand forward pass → loss calculation → backward update workflow
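A short sketch of the activation-function and detach ideas from Day03 (the Matplotlib plotting step is omitted):

```python
import torch

x = torch.linspace(-5.0, 5.0, steps=101, requires_grad=True)

s = torch.sigmoid(x)   # maps values into (0, 1)
t = torch.tanh(x)      # maps values into (-1, 1)

# Autograd computes the derivative; for sigmoid it equals s * (1 - s),
# which peaks at 0.25 when x = 0
s.sum().backward()
grad_s = x.grad.clone()

# detach() returns a tensor cut out of the computation graph, e.g. so it
# can be converted to NumPy for plotting
s_np = s.detach().numpy()
```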
  • 📌 Day04:

    • 🎯 Learning Objectives:
      1. Understand the role of loss functions in evaluating model performance.
      2. Learn regression losses: MAE, MSE, and Smooth L1.
      3. Understand Binary Cross-Entropy and Multi-class Cross-Entropy for classification.
      4. Learn how CrossEntropyLoss combines Softmax and loss computation.
      5. Build a simple neural network in PyTorch using nn.Module.
      6. Understand the forward pass workflow: weighted sum → activation → output.
      7. Learn common parameter initialization methods (Random, Xavier, Kaiming).
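A minimal sketch of the Day04 losses and an `nn.Module` network (the layer sizes and class name are illustrative):

```python
import torch
import torch.nn as nn

# Regression losses on a toy prediction/target pair
pred = torch.tensor([2.5, 0.0])
target = torch.tensor([3.0, -0.5])
mae = nn.L1Loss()(pred, target)
mse = nn.MSELoss()(pred, target)
smooth = nn.SmoothL1Loss()(pred, target)

# CrossEntropyLoss applies log-softmax internally, so it takes raw logits
logits = torch.tensor([[2.0, 0.5, 0.1]])
ce = nn.CrossEntropyLoss()(logits, torch.tensor([0]))

# A small network: weighted sum -> activation -> output
class Net(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc1 = nn.Linear(4, 8)
        self.fc2 = nn.Linear(8, 3)
        nn.init.xavier_uniform_(self.fc1.weight)    # Xavier initialization
        nn.init.kaiming_uniform_(self.fc2.weight)   # Kaiming initialization

    def forward(self, x):
        return self.fc2(torch.relu(self.fc1(x)))

out = Net()(torch.randn(2, 4))
```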
  • 📌 Day05:

    • 🎯 Learning Objectives:
      1. Understand how learning rate affects gradient updates and model convergence.
      2. Learn learning rate decay strategies: StepLR, MultiStepLR, and ExponentialLR.
      3. Understand optimization algorithms: Momentum, AdaGrad, RMSProp, and Adam.
      4. Learn Exponential Moving Average (EMA) and its role in smoothing gradients.
      5. Understand regularization techniques to reduce overfitting.
      6. Learn Dropout and how random neuron deactivation improves generalization.
      7. Understand Batch Normalization and how it stabilizes training.
      8. Apply these techniques in an ANN example for mobile phone price classification.
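These Day05 pieces fit together in a small classifier along the lines of the phone-price example; the feature count, class count, and hyperparameters below are illustrative:

```python
import torch
import torch.nn as nn

model = nn.Sequential(
    nn.Linear(20, 64),
    nn.BatchNorm1d(64),   # normalizes activations to stabilize training
    nn.ReLU(),
    nn.Dropout(p=0.5),    # randomly deactivates neurons during training
    nn.Linear(64, 4),     # e.g. 4 price classes
)

optimizer = torch.optim.Adam(model.parameters(), lr=0.01)
# StepLR multiplies the learning rate by gamma every step_size epochs
scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=10, gamma=0.5)

x = torch.randn(8, 20)
y = torch.randint(0, 4, (8,))
for epoch in range(20):
    optimizer.zero_grad()
    loss = nn.CrossEntropyLoss()(model(x), y)
    loss.backward()
    optimizer.step()
    scheduler.step()

lr_now = optimizer.param_groups[0]["lr"]   # decayed twice: 0.01 * 0.5**2
```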
  • 📌 Day06:

    • 🎯 Learning Objectives:
      1. Understand different image types (binary, grayscale, RGB) and how images are represented in arrays.
      2. Learn to create and visualize images using NumPy and Matplotlib.
      3. Understand the structure of Convolutional Neural Networks (CNN): convolution layer, pooling layer, and fully connected layer.
      4. Learn how convolution layers extract image features and generate feature maps.
      5. Understand pooling operations (max pooling and average pooling) for dimensionality reduction.
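A sketch of how an image tensor flows through convolution and pooling layers (the channel counts and kernel sizes are arbitrary choices):

```python
import torch
import torch.nn as nn

# A grayscale image is an H x W array; PyTorch adds batch and channel
# dimensions, giving shape (batch, channels, height, width)
img = torch.rand(1, 1, 28, 28)

conv = nn.Conv2d(in_channels=1, out_channels=8, kernel_size=3, padding=1)
pool = nn.MaxPool2d(kernel_size=2)   # max pooling halves height and width

feat = conv(img)    # feature maps extracted by the convolution layer
down = pool(feat)   # pooled feature maps with reduced dimensionality
```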

About

These files are part of a 7-day learning series where I compile notes from multiple sources to study important PyTorch packages.
