Deep Learning with Python
A collection of deep-learning code examples, tutorial-style Jupyter notebooks, and projects.
Many of the Jupyter notebooks are built on Google Colab and may use features specific to Google Colab (for example, uploading data or pulling data directly from a remote repo using standard Linux commands).
Here is the GitHub repo.
- Python 3.6+
- NumPy (`pip install numpy`)
- Pandas (`pip install pandas`)
- Matplotlib (`pip install matplotlib`)
- TensorFlow (`pip install tensorflow` or `pip install tensorflow-gpu`)
Of course, to use a local GPU correctly, you need to do a lot more work setting up the proper GPU driver and CUDA installation.
If you are using Ubuntu 18.04, here is a guide.
If you are on Windows 10, here is a guide.
It is also highly recommended to install the GPU version in a separate virtual environment, so as not to mess up the default system install.
- Keras (`pip install keras`)
NOTE: Most of the Jupyter notebooks in this repo are built on Google Colaboratory, using a Google GPU cluster and a virtual machine. Therefore, you may not need to install these packages on your local machine if you also want to use Google Colab. You can launch the notebooks directly in your Google Colab environment by clicking on the links provided in the notebooks (of course, that makes a copy of my notebook onto your Google Drive).
For more information about using Google Colab for your deep learning work, check their FAQ here.
I created a utility function file called `DL_utils.py` in the `utils` directory under `Notebooks`. We use functions from this module whenever possible in the Jupyter notebooks.
You can download the module (raw Python file) from here: DL-Utility-Module
Deep learning vs. linear model
- We show a nonlinear function approximation task performed by a linear model (polynomial regression of a given degree) and a simple neural net with 1/2 densely connected hidden layers, to illustrate the difference and the capacity of deep neural nets to take advantage of larger datasets (Here is the Notebook).
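As a minimal sketch of that comparison, here is a pure-NumPy version: a least-squares polynomial fit versus a small one-hidden-layer net trained by gradient descent. The target function, network size, and hyperparameters are illustrative assumptions, not the notebook's exact setup.

```python
import numpy as np

rng = np.random.default_rng(0)

# Noisy samples of a nonlinear target function (illustrative choice)
x = np.linspace(-1, 1, 200)
y = np.sin(3 * x) + 0.1 * rng.standard_normal(x.shape)

# Linear model: least-squares fit of a degree-3 polynomial
coeffs = np.polyfit(x, y, deg=3)
poly_mse = np.mean((np.polyval(coeffs, x) - y) ** 2)

# One-hidden-layer tanh net, trained with plain full-batch gradient descent
H = 20
W1 = rng.standard_normal((1, H)) * 0.5
b1 = np.zeros(H)
W2 = rng.standard_normal((H, 1)) * 0.5
b2 = np.zeros(1)
X, Y = x[:, None], y[:, None]
lr = 0.05
for _ in range(2000):
    hidden = np.tanh(X @ W1 + b1)
    pred = hidden @ W2 + b2
    err = pred - Y                         # gradient of 0.5*MSE w.r.t. pred
    grad_W2 = hidden.T @ err / len(X)
    grad_b2 = err.mean(axis=0)
    dh = (err @ W2.T) * (1 - hidden ** 2)  # backprop through tanh
    grad_W1 = X.T @ dh / len(X)
    grad_b1 = dh.mean(axis=0)
    W2 -= lr * grad_W2; b2 -= lr * grad_b2
    W1 -= lr * grad_W1; b1 -= lr * grad_b1

net_mse = np.mean((np.tanh(X @ W1 + b1) @ W2 + b2 - Y) ** 2)
```

With more data and more wiggle in the target, the net's advantage over a fixed-degree polynomial grows, which is the point the notebook makes.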
Simple Conv Net
- Fashion MNIST image classification using a densely connected network and 1/2/3-layer CNNs (Here is the Notebook).
`ImageDataGenerator` and other utilities
- Horse-or-human image classification using Keras' `ImageDataGenerator` and the Google Colaboratory platform. (Here is the Notebook)
- Classification on the flowers dataset and the famous Caltech-101 dataset using the `flow_from_directory()` method of `ImageDataGenerator`. Illustrates how to streamline CNN model building from a single store of image data using these utility methods. (Here is the Notebook)
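For reference, `flow_from_directory()` infers class labels from the subdirectory layout: one folder per class under the data root. A minimal sketch of that expected structure (the class names here are illustrative, not the actual dataset's):

```python
import os
import tempfile

# flow_from_directory() expects one subdirectory per class,
# each holding that class's images.
root = tempfile.mkdtemp()
for split in ("train", "validation"):
    for cls in ("daisy", "rose", "tulip"):   # hypothetical class names
        os.makedirs(os.path.join(root, split, cls))

# With TensorFlow/Keras installed, a generator would then be built
# roughly along these lines (commented out since it needs Keras):
# train_gen = ImageDataGenerator(rescale=1/255.).flow_from_directory(
#     os.path.join(root, "train"), target_size=(150, 150),
#     class_mode="categorical")

layout = sorted(os.listdir(os.path.join(root, "train")))
```

Keras discovers the classes automatically from the folder names, which is what makes the streamlined workflow in the notebook possible.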
- We illustrate how to show the activation maps of various layers in a deep CNN model with just a couple of lines of code, using the `keract` library. (Here is the Notebook)
Adding object-oriented programming style to deep learning workflow
- Adding simple object-oriented programming (OOP) principles to your deep learning workflow (Here is the Notebook).
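The idea can be sketched as a class that bundles the usual build/fit/evaluate steps into one object. A toy NumPy linear model stands in for a Keras model here so the snippet runs anywhere; the class and method names are illustrative, not the notebook's.

```python
import numpy as np

class Experiment:
    """Bundles the usual build/train/evaluate steps into one object.
    (A stand-in linear model trained by gradient descent is used here;
    in a real workflow the same structure would wrap a Keras model.)"""

    def __init__(self, lr=0.1, epochs=200):
        self.lr, self.epochs = lr, epochs
        self.w, self.b = 0.0, 0.0          # model parameters

    def fit(self, x, y):
        for _ in range(self.epochs):
            err = self.predict(x) - y
            self.w -= self.lr * np.mean(err * x)
            self.b -= self.lr * np.mean(err)
        return self                         # allow chaining, Keras-style

    def predict(self, x):
        return self.w * x + self.b

    def evaluate(self, x, y):
        return float(np.mean((self.predict(x) - y) ** 2))

rng = np.random.default_rng(1)
x = rng.uniform(-1, 1, 100)
y = 2.0 * x + 0.5 + 0.01 * rng.standard_normal(100)
exp = Experiment().fit(x, y)
mse = exp.evaluate(x, y)
```

The payoff is that hyperparameters, state, and the train/evaluate API live in one place instead of being scattered across notebook cells.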
Callbacks using ResNet
- ResNet on the CIFAR-10 dataset, showing how to use Keras callback classes such as `ReduceLROnPlateau`. You can also change a single parameter to generate ResNets of various depths. (Here is the Notebook)
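The rule that `ReduceLROnPlateau` applies can be sketched in plain Python. This is a simplified approximation of the callback's logic, not Keras' actual implementation:

```python
class ReduceLROnPlateauSketch:
    """Rough sketch of the rule Keras' ReduceLROnPlateau applies:
    if the monitored loss hasn't improved for `patience` epochs,
    multiply the learning rate by `factor` (never below `min_lr`)."""

    def __init__(self, lr=0.1, factor=0.5, patience=3, min_lr=1e-5):
        self.lr, self.factor = lr, factor
        self.patience, self.min_lr = patience, min_lr
        self.best = float("inf")
        self.wait = 0

    def on_epoch_end(self, loss):
        if loss < self.best:               # improvement: reset the counter
            self.best, self.wait = loss, 0
        else:                              # plateau: count, then cut the LR
            self.wait += 1
            if self.wait >= self.patience:
                self.lr = max(self.lr * self.factor, self.min_lr)
                self.wait = 0
        return self.lr

sched = ReduceLROnPlateauSketch(lr=0.1, factor=0.5, patience=2)
losses = [1.0, 0.8, 0.8, 0.8, 0.7, 0.7, 0.7]
lrs = [sched.on_epoch_end(l) for l in losses]
```

In Keras you simply pass `ReduceLROnPlateau(...)` in the `callbacks` list of `model.fit()` and it applies this rule for you at the end of each epoch.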
Time series prediction using a simple RNN
- Time series prediction using a simple RNN (a single RNN layer followed by a densely connected layer). We show that a complicated time-series signal is predicted correctly by a simple RNN even when trained with only 25% of the data. (Here is the Notebook)
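The windowing step that turns a 1-D signal into RNN-ready input tensors can be sketched as follows; the signal and window length here are illustrative, not the notebook's exact values:

```python
import numpy as np

def make_windows(series, timesteps):
    """Slice a 1-D series into overlapping (samples, timesteps, 1) inputs,
    with the value right after each window as the prediction target."""
    X, y = [], []
    for i in range(len(series) - timesteps):
        X.append(series[i:i + timesteps])
        y.append(series[i + timesteps])
    return np.array(X)[..., None], np.array(y)

t = np.linspace(0, 10, 500)
signal = np.sin(t) + 0.5 * np.sin(3 * t)   # a composite "complicated" signal
X, y = make_windows(signal, timesteps=20)  # X: (480, 20, 1), y: (480,)
```

A Keras `SimpleRNN` layer expects exactly this `(samples, timesteps, features)` shape, so `X` and `y` can be fed straight into `model.fit()`.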
Text generation using LSTM
- Automatic text generation (based on simple character vectors) using an LSTM network. Play with the character sequence length, LSTM architecture, and hyperparameters to generate synthetic text based on a particular author's style! (Here is the Notebook)
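The character-vectorization step such a model needs can be sketched like this (toy corpus and sequence length, not the notebook's values):

```python
import numpy as np

text = "deep learning generates text"      # stand-in corpus
chars = sorted(set(text))                  # character vocabulary
char_to_idx = {c: i for i, c in enumerate(chars)}

# Build (input sequence, next character) training pairs
seq_len = 10
X, y = [], []
for i in range(len(text) - seq_len):
    X.append([char_to_idx[c] for c in text[i:i + seq_len]])
    y.append(char_to_idx[text[i + seq_len]])

# One-hot encode for an LSTM: (samples, seq_len, vocab_size)
X_onehot = np.eye(len(chars))[np.array(X)]
```

Generation then works by repeatedly predicting the next character from the last `seq_len` characters and appending it; with a longer corpus, the same encoding mimics an author's style.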
Bi-directional LSTM for sentiment classification
- Bi-directional LSTM with embedding applied to the IMDB sentiment classification task. (Here is the Notebook)
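Before the embedding layer, the variable-length reviews must be padded to a fixed length. A simplified stand-in for Keras' `pad_sequences` with its default pre-padding/pre-truncating behavior (the sequences here are toy word indices):

```python
def pad_sequences_sketch(seqs, maxlen, value=0):
    """Pre-pad (or truncate from the front) integer sequences to a fixed
    length, mirroring Keras' pad_sequences defaults."""
    out = []
    for s in seqs:
        s = s[-maxlen:]                            # keep the last maxlen tokens
        out.append([value] * (maxlen - len(s)) + list(s))
    return out

reviews = [[5, 92, 7], [12, 4, 66, 3, 19, 8]]      # toy word-index sequences
padded = pad_sequences_sketch(reviews, maxlen=4)   # rectangular: 2 rows of 4
```

The resulting rectangular integer matrix is what the embedding layer consumes, with index 0 reserved for padding.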
Generative adversarial network (GAN)
- Simple demo of building a GAN model from scratch using a one-dimensional algebraic function (Here is the Notebook)
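The data-preparation side of such a demo can be sketched as below. The curve y = x² and the batch sizes are illustrative assumptions, and the adversarial training itself is only outlined in comments:

```python
import numpy as np

rng = np.random.default_rng(42)

def real_samples(n):
    """'Real' data: points on a simple 1-D algebraic curve, y = x**2."""
    x = rng.uniform(-1, 1, n)
    return np.stack([x, x ** 2], axis=1)

def fake_samples(n):
    """Stand-in for an untrained generator: random points in the same box."""
    return rng.uniform(-1, 1, (n, 2))

# One discriminator training batch: real points labeled 1, fakes labeled 0.
X = np.vstack([real_samples(64), fake_samples(64)])
y = np.concatenate([np.ones(64), np.zeros(64)])

# The full GAN loop then alternates:
#   1) train the discriminator on a batch like (X, y);
#   2) train the generator through the *frozen* discriminator so its
#      samples are scored as real, pulling them onto the curve.
```

After enough alternating rounds, the generator's 2-D samples cluster along the target curve, which is what the notebook visualizes.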