My coursework for Andrew Ng’s Deep Learning Specialization.
> **Note:** I've trimmed this repo down to just the key code for my own reference. If you need the full workspaces, you'll find plenty of other repos with them.
This repository contains:
- Neural networks from scratch (forward/backprop, vectorization)
- Optimization techniques (mini-batch gradient descent, momentum, RMSProp, Adam)
- Regularization methods (L2, dropout, early stopping)
- Initialization strategies (He, Xavier, random)
- Hyperparameter tuning and model debugging
- Deep convolutional networks (ResNets, transfer learning)
- Recurrent models (RNNs, LSTMs, GRUs)
- Sequence-to-sequence models with attention
- Word embeddings (word vectors)
- Transformer-based architectures for NLP
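To give a flavor of the from-scratch material, here is a minimal sketch (my own illustration, not code from the course notebooks) combining two of the bullets above: a vectorized two-layer network with He initialization, trained by plain gradient descent on a toy XOR dataset. The dataset, layer sizes, and hyperparameters are arbitrary choices for the demo.

```python
# Minimal two-layer network: vectorized forward/backprop + He init.
# Illustrative sketch only; all names and hyperparameters are my own.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Toy XOR dataset: X is (features, examples), Y is (1, examples).
X = np.array([[0, 0, 1, 1],
              [0, 1, 0, 1]], dtype=float)
Y = np.array([[0, 1, 1, 0]], dtype=float)

n_x, m = X.shape
n_h = 8  # hidden units

# He initialization for the ReLU hidden layer.
W1 = rng.standard_normal((n_h, n_x)) * np.sqrt(2.0 / n_x)
b1 = np.zeros((n_h, 1))
W2 = rng.standard_normal((1, n_h)) * np.sqrt(2.0 / n_h)
b2 = np.zeros((1, 1))

lr = 0.5
losses = []
for _ in range(5000):
    # Forward pass, vectorized over all m examples.
    Z1 = W1 @ X + b1
    A1 = np.maximum(0, Z1)           # ReLU
    Z2 = W2 @ A1 + b2
    A2 = sigmoid(Z2)

    # Binary cross-entropy loss.
    losses.append(-np.mean(Y * np.log(A2) + (1 - Y) * np.log(1 - A2)))

    # Backward pass.
    dZ2 = A2 - Y                     # (1, m)
    dW2 = (dZ2 @ A1.T) / m
    db2 = dZ2.mean(axis=1, keepdims=True)
    dZ1 = (W2.T @ dZ2) * (Z1 > 0)    # ReLU gradient mask
    dW1 = (dZ1 @ X.T) / m
    db1 = dZ1.mean(axis=1, keepdims=True)

    # Gradient descent update.
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2
```

Keeping `X` as a `(features, examples)` matrix is what makes the whole pass vectorizable: one matrix multiply per layer replaces a Python loop over examples.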