Neural Networks of L-Layers, N-Neurons with Dropout and L2 Regularization

In this notebook, we will walk through the design, training, and testing of neural networks with multiple hidden layers. We will train networks utilizing L2 regularization and dropout. These neural networks will be used for binary classification; with a single sigmoid output unit, this setup is essentially logistic regression with learned hidden features.
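To make the two regularization techniques concrete before we build the full networks, here is a minimal NumPy sketch of inverted dropout and the L2 cost penalty. The function names and the hyperparameters (`keep_prob`, `lam`) are illustrative choices, not code from this notebook.

```python
import numpy as np

rng = np.random.default_rng(0)

def dropout_forward(a, keep_prob, training=True):
    """Inverted dropout: zero each unit with probability 1 - keep_prob,
    then divide by keep_prob so the expected activation is unchanged."""
    if not training or keep_prob >= 1.0:
        return a, None
    mask = (rng.random(a.shape) < keep_prob) / keep_prob
    return a * mask, mask

def l2_penalty(weights, lam, m):
    """L2 regularization term added to the cost over m examples:
    (lam / (2m)) * sum of squared weight entries across all layers."""
    return (lam / (2 * m)) * sum(np.sum(W ** 2) for W in weights)
```

At test time the dropout mask is skipped entirely; because of the `1 / keep_prob` rescaling during training, no extra scaling is needed at inference.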

The binary classification will be performed on a simple 2-D dataset. This data will be randomly generated based on two interleaving half circles. The data points for one half circle are labeled 0 and the others are labeled 1.
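A dataset of this shape can be generated with scikit-learn's `make_moons`, which produces exactly two interleaving half circles with labels 0 and 1. The sample count and noise level below are illustrative values, not necessarily the ones used later in the notebook.

```python
import numpy as np
from sklearn.datasets import make_moons

# 200 points on two interleaving half circles; `noise` jitters each point
# with Gaussian noise, which makes the classes harder to separate.
X, y = make_moons(n_samples=200, noise=0.2, random_state=1)
# X has shape (200, 2): one 2-D coordinate per point.
# y contains the binary labels 0 and 1, one per half circle.
```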

The goal of this binary classification is to assign each data point to the half circle it belongs to. We will illustrate how well our networks do this by drawing the decision boundary, which will also reveal overfitting. We will then investigate the impact of regularization techniques on this overfitting.
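Drawing a decision boundary amounts to evaluating the trained classifier on a dense grid covering the data and coloring each grid cell by its predicted class. A small helper along these lines is sketched below; `predict` stands in for any trained model's prediction function and is an assumption, not an interface defined in this notebook.

```python
import numpy as np

def decision_boundary_grid(predict, X, steps=200):
    """Evaluate predict(points) -> labels on a grid covering the data.

    Returns the meshgrid coordinates and the predicted label for each
    grid point; the result can be drawn with e.g. plt.contourf to show
    the decision boundary.
    """
    x_min, x_max = X[:, 0].min() - 0.5, X[:, 0].max() + 0.5
    y_min, y_max = X[:, 1].min() - 0.5, X[:, 1].max() + 0.5
    xx, yy = np.meshgrid(np.linspace(x_min, x_max, steps),
                         np.linspace(y_min, y_max, steps))
    pts = np.c_[xx.ravel(), yy.ravel()]   # every grid point as a (2,) row
    zz = predict(pts).reshape(xx.shape)   # predicted class per grid point
    return xx, yy, zz
```

A wiggly boundary that threads between individual training points is the visual signature of overfitting that regularization should smooth out.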

Previous: Neural Networks with Minibatch Stochastic Gradient Descent and Momentum

Next: Neural Networks With Single Neuron Implementation