This repository contains Jupyter notebooks that serve as an introduction to the basics of PyTorch. Each notebook introduces the concepts listed in its section below.
Notebook dl-intro
introduces basic concepts of deep learning using logistic regression.
- Introduction to supervised learning and classification. Logistic regression using CIFAR10.
- Using matplotlib.
- Loss functions, activation functions and gradient descent.
- PyTorch packages, tensors, and gradients.
- Building neural network models using PyTorch.
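The core ideas above can be sketched in a few lines: tensors with gradient tracking, and logistic regression expressed as a one-layer PyTorch model. The shapes below assume flattened 32x32x3 CIFAR10 images and 10 classes; the random input stands in for real data.

```python
import torch
import torch.nn as nn

# A tensor with gradient tracking: PyTorch records operations for autograd.
x = torch.tensor([1.0, 2.0, 3.0], requires_grad=True)
y = (x ** 2).sum()   # y = 1 + 4 + 9 = 14
y.backward()         # dy/dx = 2x
print(x.grad)        # tensor([2., 4., 6.])

# Logistic regression as a one-layer network: flatten, then a linear map.
# 3072 = 32 * 32 * 3, the size of a flattened CIFAR10 image; 10 classes out.
model = nn.Sequential(nn.Flatten(), nn.Linear(32 * 32 * 3, 10))
logits = model(torch.randn(4, 3, 32, 32))  # a batch of 4 fake images
print(logits.shape)  # torch.Size([4, 10])
```

Applying a sigmoid (or softmax) to the logits turns them into class probabilities, which is where the loss function comes in.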
Notebook feedforward
introduces basic layering architecture of neural networks using fully connected layers.
- PyTorch datasets, dataloaders and data transforms
- Data batches and stochastic gradient descent
- Different activation functions, softmax and cross entropy loss
- Using PyTorch on GPUs
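A minimal sketch tying these pieces together, using a synthetic stand-in dataset (100 random samples, 20 features, 3 classes) rather than a real one: a DataLoader serving mini-batches, a two-layer fully connected network, cross-entropy loss, and a device that falls back to CPU when no GPU is available.

```python
import torch
import torch.nn as nn
from torch.utils.data import TensorDataset, DataLoader

# Synthetic stand-in for a real dataset: 100 samples, 20 features, 3 classes.
data = TensorDataset(torch.randn(100, 20), torch.randint(0, 3, (100,)))
loader = DataLoader(data, batch_size=32, shuffle=True)  # mini-batches for SGD

# Use the GPU when one is available, otherwise the CPU.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# Two fully connected layers with a ReLU activation between them.
model = nn.Sequential(nn.Linear(20, 64), nn.ReLU(), nn.Linear(64, 3)).to(device)
loss_fn = nn.CrossEntropyLoss()  # applies log-softmax + NLL internally
opt = torch.optim.SGD(model.parameters(), lr=0.1)

# One pass over the data: stochastic gradient descent, batch by batch.
for xb, yb in loader:
    xb, yb = xb.to(device), yb.to(device)
    loss = loss_fn(model(xb), yb)
    opt.zero_grad()
    loss.backward()
    opt.step()
print(f"final batch loss: {loss.item():.3f}")
```

Note that `CrossEntropyLoss` expects raw logits; the softmax is applied inside the loss, so the model itself has no final activation.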
Notebook convolution
introduces convolutional layers
- Convolution operation, kernels, strides
- Convolutional layers in PyTorch
- Pooling and maxpooling in PyTorch
- Using the confusion matrix to analyze results
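Two of these ideas in miniature: a convolution followed by max pooling (watch how the shapes change), and a confusion matrix built by hand from hypothetical predictions for a 3-class problem.

```python
import torch
import torch.nn as nn

# One convolutional layer: 3 input channels, 8 filters of size 3x3;
# padding=1 preserves the 32x32 spatial size, then 2x2 max pooling halves it.
conv = nn.Conv2d(in_channels=3, out_channels=8, kernel_size=3, padding=1)
pool = nn.MaxPool2d(kernel_size=2)

x = torch.randn(1, 3, 32, 32)   # one fake 32x32 RGB image
out = pool(conv(x))
print(out.shape)                # torch.Size([1, 8, 16, 16])

# A confusion matrix by hand: rows are true classes, columns are predictions.
true = torch.tensor([0, 0, 1, 1, 2, 2])
pred = torch.tensor([0, 1, 1, 1, 2, 0])
cm = torch.zeros(3, 3, dtype=torch.long)
for t, p in zip(true, pred):
    cm[t, p] += 1
print(cm)  # correct predictions sit on the diagonal
```

Reading the matrix row by row shows which classes the model confuses with which, which is far more informative than a single accuracy number.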
Notebook detecting-emotions
introduces the use of Kaggle and Kaggle datasets
- How to use Kaggle and import Kaggle datasets from Colab or Kaggle itself
- Reducing overfitting using dropout
- Using PyTorch to load image datasets for classification
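Dropout in two lines: during training it randomly zeroes activations (and rescales the survivors) so the network cannot rely on any single unit; in eval mode it is a no-op. A tensor of ones makes the effect easy to see.

```python
import torch
import torch.nn as nn

# p=0.5: each activation is dropped with probability 0.5 during training,
# and the survivors are scaled by 1/(1-p) = 2 to keep the expected value.
drop = nn.Dropout(p=0.5)
x = torch.ones(1, 10)

drop.train()
print(drop(x))  # roughly half the entries zeroed, the rest scaled to 2.0

drop.eval()
print(drop(x))  # unchanged: dropout is disabled at evaluation time
```

This is why calling `model.train()` and `model.eval()` at the right moments matters: forgetting `eval()` leaves dropout active during testing.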
Notebook custom_dataset
introduces building custom datasets in PyTorch
- Using the pandas package
- Building a custom dataset using PyTorch
- Handling categorical data as input
- Building multimodal models
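A minimal custom `Dataset` over a pandas DataFrame might look like the sketch below (the `color`/`size`/`label` columns and the `TabularDataset` class are hypothetical, for illustration). The categorical column is converted to integer codes, the usual preparation for feeding it to an embedding layer.

```python
import pandas as pd
import torch
from torch.utils.data import Dataset

class TabularDataset(Dataset):
    """Wraps a DataFrame with one categorical and one numeric feature."""

    def __init__(self, df: pd.DataFrame):
        # Categorical strings become integer codes (blue=0, red=1, ...),
        # ready to index into an nn.Embedding layer.
        codes = df["color"].astype("category").cat.codes
        self.color = torch.tensor(codes.values, dtype=torch.long)
        self.size = torch.tensor(df["size"].values, dtype=torch.float32)
        self.label = torch.tensor(df["label"].values, dtype=torch.long)

    def __len__(self):
        return len(self.label)

    def __getitem__(self, idx):
        return (self.color[idx], self.size[idx]), self.label[idx]

df = pd.DataFrame({"color": ["red", "blue", "red"],
                   "size": [1.0, 2.5, 0.5],
                   "label": [1, 0, 1]})
ds = TabularDataset(df)
print(len(ds), ds[0])
```

Because `__getitem__` returns a tuple of inputs, the same pattern extends to multimodal models: each element of the tuple can feed a different branch of the network.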
Notebook fraud-detection-pytorch
introduces the autoencoder architecture in PyTorch and uses it to detect credit card fraud.
- Autoencoders
- Selected functionality from scikit-learn (sklearn)
- Some basic statistics
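A minimal autoencoder sketch illustrating the idea behind the notebook: compress the input to a small bottleneck, reconstruct it, and use the per-sample reconstruction error as an anomaly score. The 30-feature input and the random batch are assumptions for illustration, not the notebook's actual data.

```python
import torch
import torch.nn as nn

class AutoEncoder(nn.Module):
    """Compress n_features down to a small bottleneck and reconstruct them."""

    def __init__(self, n_features=30, n_hidden=4):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(n_features, n_hidden), nn.ReLU())
        self.decoder = nn.Linear(n_hidden, n_features)

    def forward(self, x):
        return self.decoder(self.encoder(x))

model = AutoEncoder()
x = torch.randn(8, 30)                   # a batch of 8 fake transactions
recon = model(x)
errors = ((x - recon) ** 2).mean(dim=1)  # per-sample reconstruction error
print(errors.shape)                      # torch.Size([8])
```

Trained only on normal transactions, the model reconstructs them well; fraudulent ones tend to produce high reconstruction error, so thresholding `errors` yields a simple detector.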