Deep Learning with Python (Website)
Collection of a variety of Deep Learning (DL) code examples, tutorial-style Jupyter notebooks, and projects.
Quite a few of the Jupyter notebooks are built on Google Colab and may employ functions exclusive to Google Colab (for example, uploading data or pulling data directly from a remote repo using standard Linux commands).
Here is the GitHub repo.
Authored and maintained by Dr. Tirthajyoti Sarkar (Website, LinkedIn profile)
Requirements
- Python 3.6+
- NumPy (`pip install numpy`)
- Pandas (`pip install pandas`)
- Matplotlib (`pip install matplotlib`)
- TensorFlow (`pip install tensorflow` or `pip install tensorflow-gpu`). Of course, to use a local GPU correctly, you need to do a lot more work setting up the proper GPU driver and CUDA installation. If you are using Ubuntu 18.04, here is a guide. If you are on Windows 10, here is a guide. It is also highly recommended to install the GPU version in a separate virtual environment, so as to not mess up the default system install.
- Keras (`pip install keras`)
NOTE: Most of the Jupyter notebooks in this repo are built on Google Colaboratory using a Google GPU cluster and a virtual machine. Therefore, you may not need to install these packages on your local machine if you plan to use Google Colab. You can directly launch the notebooks in your Google Colab environment by clicking on the links provided in the notebooks (of course, that makes a copy of my notebook onto your Google Drive).
For more information about using Google Colab for your deep learning work, check their FAQ here.
Utility modules
Utility module for example notebooks
I created a utility function file called `DL_utils.py` in the `utils` directory under `Notebooks`. We use functions from this module whenever possible in the Jupyter notebooks.
You can download the module (raw Python file) from here: DL-Utility-Module.
General-purpose regression module (for tabular datasets)
I also implemented a general-purpose trainer module (`NN_trainer.py`) for regression tasks with tabular datasets. The idea is that you can simply read a dataset (e.g. a CSV file), choose the input and target variables, build a densely connected neural net, train, and predict. The module gives you back a trained prediction function, which can be used for any further prediction, analytics, or optimization task.
Check out the module here and an example notebook here.
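To give a flavor of what such a trainer module does, here is a minimal, self-contained sketch of the same idea — read a tabular dataset, pick the input and target columns, build a densely connected net, and get back a trained prediction function. The function name `train_regressor` and its arguments are illustrative, not the actual `NN_trainer.py` API.

```python
import numpy as np
import pandas as pd
import tensorflow as tf

def train_regressor(df, input_cols, target_col, hidden_layers=(32, 32), epochs=10):
    """Build and train a densely connected regressor on a tabular DataFrame,
    returning a trained prediction function (illustrative sketch only)."""
    X = df[input_cols].values.astype("float32")
    y = df[target_col].values.astype("float32")
    model = tf.keras.Sequential(
        [tf.keras.layers.Dense(n, activation="relu") for n in hidden_layers]
        + [tf.keras.layers.Dense(1)]  # single linear output for regression
    )
    model.compile(optimizer="adam", loss="mse")
    model.fit(X, y, epochs=epochs, verbose=0)
    # Hand back a closure over the trained model
    return lambda X_new: model.predict(np.asarray(X_new, dtype="float32"), verbose=0).ravel()

# Synthetic tabular data standing in for a CSV file read with pd.read_csv
rng = np.random.default_rng(0)
df = pd.DataFrame({"x1": rng.normal(size=200), "x2": rng.normal(size=200)})
df["y"] = 2 * df["x1"] - 3 * df["x2"]

predict = train_regressor(df, ["x1", "x2"], "y", epochs=5)
preds = predict(df[["x1", "x2"]].values)
```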
Notebooks
Deep learning vs. linear model
- We show a nonlinear function approximation task performed by a linear model (polynomial-degree regression) and by a simple neural net with one or two densely connected hidden layers, to illustrate the difference and the capacity of deep neural nets to take advantage of larger datasets (Here is the Notebook).
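For intuition, the "linear model" half of that comparison can be sketched in a few lines of NumPy: a polynomial fit is still linear in its coefficients, even though the fitted curve is nonlinear. The target function, noise level, and degree below are arbitrary choices for illustration, not the notebook's exact setup.

```python
import numpy as np

rng = np.random.default_rng(42)
x = np.linspace(-1, 1, 200)
y = np.sin(4 * x) + 0.1 * rng.normal(size=x.size)  # noisy nonlinear target

# "Linear" model: least-squares fit, linear in the 8 polynomial coefficients
coeffs = np.polyfit(x, y, deg=7)
y_poly = np.polyval(coeffs, x)
poly_mse = float(np.mean((y - y_poly) ** 2))
```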
Demo of a generalpurpose regression module
- We implemented a general-purpose trainer module for regression tasks with tabular datasets. The idea is that you can simply read a dataset (e.g. a CSV file), choose the input and target variables, build a densely connected neural net, train, predict, and save the model for deployment. This is the demo notebook for that module (Here is the Notebook).
Simple Conv Net
- Fashion MNIST image classification using a densely connected network and 1/2/3-layer CNNs (Here is the Notebook).
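A minimal Keras sketch of the kind of small CNN used for 28×28 grayscale Fashion MNIST images (the exact layer sizes in the notebook may differ):

```python
import tensorflow as tf

# Two conv/pool blocks followed by a dense classifier over the 10 classes
model = tf.keras.Sequential([
    tf.keras.Input(shape=(28, 28, 1)),
    tf.keras.layers.Conv2D(32, (3, 3), activation="relu"),
    tf.keras.layers.MaxPooling2D((2, 2)),
    tf.keras.layers.Conv2D(64, (3, 3), activation="relu"),
    tf.keras.layers.MaxPooling2D((2, 2)),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
```

Training is then a single `model.fit(x_train, y_train, ...)` call on the Fashion MNIST arrays.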
Using Keras `ImageDataGenerator` and other utilities

- Horse-or-human image classification using the Keras `ImageDataGenerator` and the Google Colaboratory platform. (Here is the Notebook)
- Classification on the flowers dataset and the famous Caltech-101 dataset using the `fit_generator` and `flow_from_directory()` methods of the `ImageDataGenerator`. Illustrates how to streamline CNN model building from a single store of image data using these utility methods. (Here is the Notebook)
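The core `flow_from_directory()` pattern looks roughly like this — it expects one subdirectory per class. A throwaway image tree is generated on the fly here just to make the sketch self-contained; in practice you would point it at a real dataset directory, and the class names and sizes below are placeholders.

```python
import pathlib
import tempfile
import numpy as np
import matplotlib
matplotlib.use("Agg")
import matplotlib.pyplot as plt
from tensorflow.keras.preprocessing.image import ImageDataGenerator

# Build a tiny class-per-subdirectory image tree (the layout flow_from_directory expects)
root = pathlib.Path(tempfile.mkdtemp())
rng = np.random.default_rng(0)
for cls in ["daisy", "rose"]:
    (root / cls).mkdir()
    for i in range(4):
        plt.imsave(root / cls / f"{i}.png", rng.random((32, 32, 3)))

# Generator with rescaling and a simple augmentation, streaming straight from disk
datagen = ImageDataGenerator(rescale=1.0 / 255, horizontal_flip=True)
flow = datagen.flow_from_directory(str(root), target_size=(32, 32),
                                   batch_size=4, class_mode="categorical")
images, labels = next(flow)  # one batch of images and one-hot labels
```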
Transfer learning

- Simple illustration of transfer learning using the CIFAR-10 dataset (Here is the Notebook)
- Transfer learning with the famous Inception v3 model: building a classifier for pneumonia from chest X-ray images. (Here is the Notebook)
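A sketch of the transfer-learning recipe: take the Inception v3 convolutional base, freeze it, and train a small new head. `weights=None` is used here only to avoid the ImageNet weight download; in real use you would pass `weights="imagenet"`. The input size and head sizes are illustrative.

```python
import tensorflow as tf

# Inception v3 base without its classification head
base = tf.keras.applications.InceptionV3(weights=None, include_top=False,
                                         input_shape=(150, 150, 3))
base.trainable = False  # freeze the convolutional base

# New classifier head trained on top of the frozen features
model = tf.keras.Sequential([
    base,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),  # binary: pneumonia vs. normal
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
```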
Activation maps
- We illustrate how to show the activation maps of various layers in a deep CNN model with just a couple of lines of code, using the `Keract` library. (Here is the Notebook)
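Under the hood, an activation map is just the output of an intermediate layer, which Keract retrieves for you. The idea can be sketched in plain Keras by building a sub-model that outputs a chosen layer; the tiny untrained CNN below is a stand-in for a real trained model.

```python
import numpy as np
import tensorflow as tf

# A tiny CNN; in the notebook this would be a trained deep model
inp = tf.keras.layers.Input(shape=(28, 28, 1))
x = tf.keras.layers.Conv2D(8, (3, 3), activation="relu", name="conv1")(inp)
x = tf.keras.layers.Flatten()(x)
out = tf.keras.layers.Dense(10, activation="softmax")(x)
model = tf.keras.Model(inp, out)

# Sub-model whose output is an intermediate layer: its prediction IS the activation map
activation_model = tf.keras.Model(inputs=model.input,
                                  outputs=model.get_layer("conv1").output)
activations = activation_model.predict(
    np.random.rand(1, 28, 28, 1).astype("float32"), verbose=0)
```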
Adding object-oriented programming style to the deep learning workflow

- Adding simple object-oriented programming (OOP) principles to your deep learning workflow (Here is the Notebook).
Keras Callbacks using ResNet

- ResNet on the CIFAR-10 dataset, showing how to use Keras callback classes like `ModelCheckpoint`, `LearningRateScheduler`, and `ReduceLROnPlateau`. You can also change a single parameter to generate ResNets of various depths. (Here is the Notebook).
Simple RNN
- Time-series prediction using a simple RNN (a single RNN layer followed by a densely connected layer). We show that a complicated time-series signal is correctly predicted by a simple RNN even when trained with only 25% of the data. (Here is the Notebook)
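A minimal sketch of the windowing-plus-SimpleRNN setup: slice the series into fixed-length windows, each predicting the next value. The signal, window length, and layer sizes are illustrative; the notebook's choices may differ.

```python
import numpy as np
import tensorflow as tf

def make_windows(series, window):
    """Slice a 1-D series into (samples, window, 1) inputs and next-step targets."""
    X = np.array([series[i:i + window] for i in range(len(series) - window)])
    y = series[window:]
    return X[..., np.newaxis], y

t = np.linspace(0, 20, 500)
series = np.sin(t) + 0.5 * np.sin(3 * t)  # a composite ("complicated") signal
X, y = make_windows(series.astype("float32"), window=20)

# A single RNN layer followed by a dense output layer, as in the notebook
model = tf.keras.Sequential([
    tf.keras.Input(shape=(20, 1)),
    tf.keras.layers.SimpleRNN(32),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")
# model.fit(X[:120], y[:120], epochs=20)  # i.e. train on only ~25% of the windows
```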
Text generation using LSTM
- Automatic text generation (based on simple character vectors) using an LSTM network. Play with the character sequence length, LSTM architecture, and hyperparameters to generate synthetic text in a particular author's style! (Here is the Notebook).
Bidirectional LSTM for sentiment classification
- A bidirectional LSTM with embedding, applied to the IMDB sentiment classification task (Here is the Notebook)
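The model itself is only a few lines — an embedding feeding a `Bidirectional`-wrapped LSTM into a single sigmoid unit. The vocabulary size, sequence length, and layer widths below are typical IMDB choices, not necessarily the notebook's.

```python
import tensorflow as tf

vocab_size, maxlen = 20000, 200  # typical IMDB settings

model = tf.keras.Sequential([
    tf.keras.Input(shape=(maxlen,), dtype="int32"),
    tf.keras.layers.Embedding(vocab_size, 128),          # word-index -> dense vector
    tf.keras.layers.Bidirectional(tf.keras.layers.LSTM(64)),  # reads both directions
    tf.keras.layers.Dense(1, activation="sigmoid"),      # positive vs. negative review
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
```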
Generative adversarial network (GAN)
- Simple demo of building a GAN model from scratch using a one-dimensional algebraic function (Here is the Notebook)
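One GAN training step on a one-dimensional curve can be sketched with a `tf.GradientTape` loop (the notebook may use a different training setup; the curve y = x² − x and the network sizes here are arbitrary stand-ins):

```python
import tensorflow as tf

latent_dim = 8
# Generator: latent noise -> a candidate (x, y) point on the curve
generator = tf.keras.Sequential([
    tf.keras.Input(shape=(latent_dim,)),
    tf.keras.layers.Dense(16, activation="relu"),
    tf.keras.layers.Dense(2),
])
# Discriminator: (x, y) point -> real/fake logit
discriminator = tf.keras.Sequential([
    tf.keras.Input(shape=(2,)),
    tf.keras.layers.Dense(16, activation="relu"),
    tf.keras.layers.Dense(1),
])
g_opt = tf.keras.optimizers.Adam(1e-3)
d_opt = tf.keras.optimizers.Adam(1e-3)
bce = tf.keras.losses.BinaryCrossentropy(from_logits=True)

def train_step(batch_size=32):
    # Real samples from the target curve y = x**2 - x
    x = tf.random.uniform((batch_size, 1), -1.0, 1.0)
    real = tf.concat([x, x**2 - x], axis=1)
    noise = tf.random.normal((batch_size, latent_dim))
    with tf.GradientTape() as d_tape, tf.GradientTape() as g_tape:
        fake = generator(noise, training=True)
        d_real = discriminator(real, training=True)
        d_fake = discriminator(fake, training=True)
        # Discriminator: call real 1, fake 0; generator: wants fakes called real
        d_loss = bce(tf.ones_like(d_real), d_real) + bce(tf.zeros_like(d_fake), d_fake)
        g_loss = bce(tf.ones_like(d_fake), d_fake)
    d_opt.apply_gradients(zip(d_tape.gradient(d_loss, discriminator.trainable_variables),
                              discriminator.trainable_variables))
    g_opt.apply_gradients(zip(g_tape.gradient(g_loss, generator.trainable_variables),
                              generator.trainable_variables))
    return float(d_loss), float(g_loss)

d_loss, g_loss = train_step()  # repeat in a loop until the fakes look real
```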
Scikitlearn wrapper for Keras
- Keras Scikit-learn wrapper example with 10-fold cross-validation and exhaustive grid search (Here is the Notebook)
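The wrapper's job is simply to expose a Keras model through scikit-learn's estimator interface so tools like cross-validation and grid search can drive it. A minimal hand-rolled version of that idea (not the actual `KerasClassifier` class, and with illustrative data and layer sizes) looks like:

```python
import numpy as np
import tensorflow as tf
from sklearn.base import BaseEstimator, ClassifierMixin
from sklearn.model_selection import cross_val_score

class TinyKerasClassifier(BaseEstimator, ClassifierMixin):
    """Expose a small Keras net through scikit-learn's fit/predict API."""
    def __init__(self, hidden=8, epochs=5):
        self.hidden = hidden
        self.epochs = epochs

    def fit(self, X, y):
        self.model_ = tf.keras.Sequential([
            tf.keras.Input(shape=(X.shape[1],)),
            tf.keras.layers.Dense(self.hidden, activation="relu"),
            tf.keras.layers.Dense(1, activation="sigmoid"),
        ])
        self.model_.compile(optimizer="adam", loss="binary_crossentropy")
        self.model_.fit(X, y, epochs=self.epochs, verbose=0)
        return self

    def predict(self, X):
        return (self.model_.predict(X, verbose=0).ravel() > 0.5).astype(int)

# Synthetic binary classification data; scikit-learn drives the Keras training
rng = np.random.default_rng(0)
X = rng.normal(size=(120, 4)).astype("float32")
y = (X[:, 0] + X[:, 1] > 0).astype(int)
scores = cross_val_score(TinyKerasClassifier(epochs=10), X, y, cv=3)
```

The same estimator can be handed to `GridSearchCV` to sweep `hidden`, `epochs`, or any other constructor parameter.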