### Learning to code a 2-layer neural network from scratch


# Coding a 2-layer Neural Network for binary classification from scratch

In [1]:

```
import numpy as np
import matplotlib.pyplot as plt
import numpy.matlib
```

### Dataset creation: the data is centred on two clusters with centre points c1 and c2

In [2]:

```
c1 = [2,3]
c2 = [10,11]
no = 50
Class1 = np.matlib.repmat(c1, no,1) + np.random.randn(no,len(c1))
Class2 = np.matlib.repmat(c2, no,1) + np.random.randn(no,len(c2))
Data = np.append(Class1,Class2,axis = 0)
Trainlabel = np.append(np.zeros((no,1)),np.ones((no,1)),axis = 0)
```
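As a quick sanity check, the two classes stack into a `(100, 2)` data matrix with a `(100, 1)` label column. A minimal standalone sketch of the same construction, using `np.tile` (a modern equivalent of the legacy `np.matlib.repmat`) and a seeded generator for reproducibility:

```python
import numpy as np

rng = np.random.default_rng(0)  # fixed seed, for reproducibility only
c1, c2, no = [2, 3], [10, 11], 50

# np.tile(c, (no, 1)) repeats the centre point `no` times, like repmat
Class1 = np.tile(c1, (no, 1)) + rng.standard_normal((no, 2))
Class2 = np.tile(c2, (no, 1)) + rng.standard_normal((no, 2))
Data = np.append(Class1, Class2, axis=0)
Trainlabel = np.append(np.zeros((no, 1)), np.ones((no, 1)), axis=0)

print(Data.shape, Trainlabel.shape)  # (100, 2) (100, 1)
```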

### Plotting the data

In [3]:

```
import matplotlib.pyplot as plt
plt.plot(Class1[:,0],Class1[:,1],'ro')
plt.plot(Class2[:,0],Class2[:,1],'bo')
plt.ylabel('Data')
plt.show()
```

In [4]:

```
m = Data.shape[0]
X = Data.T
y = Trainlabel.T
```

#### n_i => number of nodes in the input layer (2 here, since each data point lies in R^2)

#### n_h => number of neurons in the hidden layer

#### n_l => number of neurons in the last (output) layer

### A 2-layer neural network has 1 hidden layer and 1 output layer. Since this is binary classification, we use only 1 neuron in the output layer.

In [17]:

```
n_i = 2
n_h = 2
n_l = 1
learningrate = 0.005
numiter = 5000
l = [n_i,n_h,n_l]
W1 = np.random.randn(n_h,n_i)*0.01
b1 = np.random.randn(n_h,1)*0.01
W2 = np.random.randn(n_l,n_h)*0.01
b2 = np.random.randn(n_l,1)
print(X.shape)
```

(2, 100)
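The parameter shapes follow directly from the layer sizes: `W1` maps the input layer to the hidden layer and `W2` maps the hidden layer to the output. A small standalone check of how those shapes propagate through one forward pass (random data stands in for the clusters here):

```python
import numpy as np

n_i, n_h, n_l, m = 2, 2, 1, 100
W1 = np.random.randn(n_h, n_i) * 0.01
b1 = np.zeros((n_h, 1))
W2 = np.random.randn(n_l, n_h) * 0.01
b2 = np.zeros((n_l, 1))
X = np.random.randn(n_i, m)   # placeholder input, shape (features, samples)

Z1 = np.dot(W1, X) + b1       # (n_h, m): one hidden pre-activation per sample
Z2 = np.dot(W2, np.maximum(Z1, 0)) + b2  # (n_l, m): one output per sample
print(Z1.shape, Z2.shape)     # (2, 100) (1, 100)
```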

### Activation Functions

In [6]:

```
def sigmoid(x, play):
    z = 1/(1 + np.exp(-x))
    if play == "forward":
        return z
    elif play == "backward":
        return z*(1 - z)
```

In [7]:

```
def relu(x, play):
    if play == "forward":
        return np.maximum(x, 0)
    elif play == "backward":
        # Return the derivative as a new array instead of overwriting x
        # in place; the input may be reused later in backpropagation.
        return (x > 0).astype(float)
```
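A quick check of both activations makes the forward/backward convention concrete: `sigmoid(0) = 0.5`, ReLU zeroes negative inputs, and the backward branch returns the 0/1 derivative without modifying its argument. A standalone sketch of the same two functions:

```python
import numpy as np

def sigmoid(x, play):
    z = 1/(1 + np.exp(-x))
    return z if play == "forward" else z*(1 - z)

def relu(x, play):
    if play == "forward":
        return np.maximum(x, 0)
    # Derivative of ReLU, returned as a fresh array (input left untouched)
    return (x > 0).astype(float)

x = np.array([-2.0, 0.0, 3.0])
print(sigmoid(0.0, "forward"))   # 0.5
print(relu(x, "forward"))        # [0. 0. 3.]
print(relu(x, "backward"))       # [0. 0. 1.]
print(x)                         # unchanged: [-2.  0.  3.]
```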

### Main part of the program: gradient descent minimizes the cost function with respect to the weights and biases, and we keep the weights and biases for which the cost is lowest. The result depends on the number of epochs, the number of hidden neurons, and the learning rate; these are called hyperparameters, and good values for them can be found by hyperparameter tuning.

In [18]:

```
for i in range(numiter):
    # Forward pass
    Z1 = np.dot(W1, X) + b1
    A1 = relu(Z1, play="forward")
    Z2 = np.dot(W2, A1) + b2
    A2 = sigmoid(Z2, play="forward")
    # Binary cross-entropy loss
    L = -(1.0/m) * np.sum(y*np.log(A2) + (1 - y)*np.log(1 - A2))
    # Backward pass (dW2/db2 are computed from A1 before dZ1)
    dZ2 = A2 - y
    dW2 = (1.0/m) * np.dot(dZ2, A1.T)
    db2 = (1.0/m) * np.sum(dZ2, axis=1, keepdims=True)
    dZ1 = np.dot(W2.T, dZ2) * relu(Z1, play="backward")
    dW1 = (1.0/m) * np.dot(dZ1, X.T)
    db1 = (1.0/m) * np.sum(dZ1, axis=1, keepdims=True)
    # Gradient descent update
    W1 = W1 - learningrate * dW1
    b1 = b1 - learningrate * db1
    W2 = W2 - learningrate * dW2
    b2 = b2 - learningrate * db2
    print("Loss for", i, "th iteration =>", L)
```
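The heading above mentions hyperparameter tuning. A minimal, self-contained sketch of one such sweep: the same network is trained at several learning rates on data of the same shape, and the final losses are compared. The `train` helper, the seed, the small positive hidden bias (to keep ReLU units initially active), and the probability clipping are all illustrative assumptions, not part of the notebook:

```python
import numpy as np

rng = np.random.default_rng(1)  # fixed seed, assumption for reproducibility

def train(X, y, n_h=4, lr=0.005, iters=2000):
    """Hypothetical helper: train the 2-layer net, return the loss history."""
    m = X.shape[1]
    W1 = rng.standard_normal((n_h, X.shape[0])) * 0.01
    b1 = np.full((n_h, 1), 0.1)          # small positive bias: ReLU starts active
    W2 = rng.standard_normal((1, n_h)) * 0.01
    b2 = np.zeros((1, 1))
    losses = []
    for _ in range(iters):
        Z1 = W1 @ X + b1
        A1 = np.maximum(Z1, 0)
        A2 = 1 / (1 + np.exp(-(W2 @ A1 + b2)))
        A2 = np.clip(A2, 1e-12, 1 - 1e-12)          # guard against log(0)
        losses.append(-(y*np.log(A2) + (1 - y)*np.log(1 - A2)).mean())
        dZ2 = A2 - y
        dW2 = (dZ2 @ A1.T) / m
        db2 = dZ2.mean(axis=1, keepdims=True)
        dZ1 = (W2.T @ dZ2) * (Z1 > 0)
        dW1 = (dZ1 @ X.T) / m
        db1 = dZ1.mean(axis=1, keepdims=True)
        W1 -= lr * dW1; b1 -= lr * db1
        W2 -= lr * dW2; b2 -= lr * db2
    return losses

# Two Gaussian clusters, shaped as in the notebook: X is (2, 100), y is (1, 100)
X = np.hstack([rng.standard_normal((2, 50)) + [[2], [3]],
               rng.standard_normal((2, 50)) + [[10], [11]]])
y = np.hstack([np.zeros((1, 50)), np.ones((1, 50))])

for lr in [0.001, 0.005, 0.05]:
    print("lr =", lr, "final loss =", train(X, y, lr=lr)[-1])
```

Larger learning rates converge faster on this easy problem, but too large a value can overshoot; in practice each candidate should be evaluated on held-out data, not the training set.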

### Prediction

In [19]:

```
##Predictions
def prediction(X, W1, b1, W2, b2):
    # Forward pass with the trained parameters
    Z1pred = np.dot(W1, X) + b1
    A1pred = relu(Z1pred, play="forward")
    Z2pred = np.dot(W2, A1pred) + b2
    A2pred = sigmoid(Z2pred, play="forward")
    # Threshold the output probability at 0.5
    preds = []
    for i in range(A2pred.shape[1]):
        if A2pred[0][i] > 0.5:
            preds.append(1)
        else:
            preds.append(0)
    return np.array(preds).reshape(1, len(preds))
```
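The thresholding loop can also be written in one vectorized line with a NumPy comparison. A standalone sketch, where a small hypothetical probability row stands in for the network's output `A2pred`:

```python
import numpy as np

# Hypothetical output-layer probabilities, shape (1, m) as in the notebook
A2pred = np.array([[0.1, 0.8, 0.4, 0.95]])

# Elementwise comparison gives booleans; astype(int) maps them to 0/1 labels
preds = (A2pred > 0.5).astype(int)
print(preds)  # [[0 1 0 1]]
```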

### Prediction for Train data

In [20]:

```
predictionTrain = prediction(X,W1,b1,W2,b2)
```

In [21]:

```
Truelabel = y
```

### Predicted labels and true labels

In [13]:

```
print("Prediction labels for train data:", predictionTrain)
print("True labels for train data:", Truelabel)
```
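Rather than eyeballing the two label rows, the agreement can be summarized as an accuracy score: the mean of the elementwise matches between predicted and true labels. A standalone illustration with small hypothetical label rows (shape `(1, m)`, as in the notebook):

```python
import numpy as np

# Hypothetical predicted and true label rows
predicted = np.array([[0, 0, 1, 1, 1, 0]])
true      = np.array([[0, 0, 1, 0, 1, 0]])

# Elementwise equality gives booleans; their mean is the fraction correct
accuracy = np.mean(predicted == true)
print(accuracy)  # 5 of 6 labels match
```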

## (We should really predict on unseen data that is not in our training set. Test-data creation and prediction are given below.)

### Creating Test data

In [22]:

```
Tc1 = [3.5,4]
Tc2 = [11.5,12]
TClass1 = np.matlib.repmat(Tc1, no,1) + np.random.randn(no,len(c1))
TClass2 = np.matlib.repmat(Tc2, no,1) + np.random.randn(no,len(c2))
TData = np.append(TClass1,TClass2,axis = 0)
Testlabel = np.append(np.zeros((no,1)),np.ones((no,1)),axis = 0)
X2 = TData.T
y2 = Testlabel.T
```

### Plotting the test data

In [23]:

```
import matplotlib.pyplot as plt
plt.plot(TClass1[:,0],TClass1[:,1],'ro')
plt.plot(TClass2[:,0],TClass2[:,1],'bo')
plt.ylabel('Data')
plt.show()
```

### Prediction for Test data

In [24]:

```
predictionTest = prediction(X2,W1,b1,W2,b2)
print("Prediction labels for test data:", predictionTest)
print("True labels for test data:", y2)
```
