This repository contains my solutions and code for the first homework assignment of the DL course. The assignment covers fundamental concepts of deep learning, from theoretical foundations to practical implementation in PyTorch.
The theoretical portion of this homework covers mathematical concepts fundamental to deep learning. My handwritten answers are submitted in a single PDF file, as required by the course. The topics are:
- Matrix Differentiation: Derivations for various matrix expressions.
- Backpropagation: Step-by-step calculation of gradients for a custom neural network architecture.
- Optimization: Analysis of the Backtracking Line Search and Adam optimization algorithms.
- Regularization: Concepts related to Lipschitz continuity and the role of Weight Decay.
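As an illustration of the optimization topic above, here is a minimal sketch of Backtracking Line Search with the Armijo (sufficient decrease) condition. This is not the homework solution itself; the function names, default constants (`alpha=1.0`, `beta=0.5`, `c=1e-4`), and the quadratic example are my own illustrative choices.

```python
import numpy as np

def backtracking_line_search(f, grad_f, x, direction, alpha=1.0, beta=0.5, c=1e-4):
    """Shrink the step size alpha until the Armijo condition holds:
    f(x + alpha*d) <= f(x) + c * alpha * <grad f(x), d>."""
    fx = f(x)
    g = grad_f(x)
    while f(x + alpha * direction) > fx + c * alpha * np.dot(g, direction):
        alpha *= beta  # step too large: shrink geometrically
    return alpha

# Example: one step on f(x) = ||x||^2 along the negative gradient.
f = lambda x: float(np.dot(x, x))
grad_f = lambda x: 2.0 * x
x0 = np.array([3.0, -4.0])
d = -grad_f(x0)
step = backtracking_line_search(f, grad_f, x0, d)
x1 = x0 + step * d
```

Because the accepted step must satisfy the sufficient-decrease condition, `f(x1)` is guaranteed to be strictly below `f(x0)` whenever `d` is a descent direction.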
The practical portion of the homework involves hands-on implementation in PyTorch. The code is organized into separate files for each question:
- Basics: An introduction to PyTorch, where I implemented a simple classification task. The required files are `Basics.ipynb` and `pytorch_basic.py`.
- NN Scratch: Building a complete neural network from custom modules, without relying on PyTorch's built-in `nn` library. The provided `utils` folder is essential for running this part of the code.
- Optimization: A practical assignment on optimization algorithms that was submitted in person.
- Lazy Gradient: An assignment exploring memory challenges when working with neural networks.
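To give a flavor of the NN Scratch style, here is a minimal sketch of custom modules with hand-written forward and backward passes, using NumPy rather than PyTorch's `nn` library. This is not the actual homework code: the `Linear`/`ReLU` classes, the learning rate, and the XOR example are illustrative assumptions.

```python
import numpy as np

class Linear:
    """Fully connected layer with a manual backward pass and SGD update."""
    def __init__(self, in_dim, out_dim, rng):
        self.W = rng.standard_normal((in_dim, out_dim)) * 0.1
        self.b = np.zeros(out_dim)

    def forward(self, x):
        self.x = x                        # cache input for backprop
        return x @ self.W + self.b

    def backward(self, grad_out, lr=0.1):
        grad_in = grad_out @ self.W.T     # gradient w.r.t. this layer's input
        self.W -= lr * self.x.T @ grad_out
        self.b -= lr * grad_out.sum(axis=0)
        return grad_in

class ReLU:
    def forward(self, x):
        self.mask = x > 0                 # remember which units were active
        return x * self.mask

    def backward(self, grad_out, lr=None):
        return grad_out * self.mask

# Tiny usage example: fit XOR with a 2-4-1 network and squared loss.
rng = np.random.default_rng(0)
layers = [Linear(2, 4, rng), ReLU(), Linear(4, 1, rng)]
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0.0], [1.0], [1.0], [0.0]])
losses = []
for _ in range(2000):
    out = X
    for layer in layers:
        out = layer.forward(out)          # forward through the stack
    losses.append(float(((out - y) ** 2).mean()))
    grad = 2 * (out - y) / len(X)         # d(MSE)/d(out)
    for layer in reversed(layers):
        grad = layer.backward(grad)       # backprop and update in one pass
```

Each module caches what its backward pass needs during `forward`, which is the same bookkeeping PyTorch's autograd performs automatically.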