
Neural Network from Scratch – Becoming Human: Artificial Intelligence Magazine

March 20, 2019
in Neural Networks

Credit: BecomingHuman

In the last article, I described the Neural Network and gave you a practical approach to training your own Neural Network using a framework (Keras). Today's article will be short: I will not be diving into the maths behind neural networks, but will instead show how we create our own Neural Network from scratch.

We will be using the MNIST dataset. Keras is used only to import the dataset; everything else will be written using NumPy.
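For concreteness, the preprocessing typically looks like this. This is my own sketch, with random arrays standing in for the data that `keras.datasets.mnist.load_data()` would return:

```python
import numpy as np

# Stand-ins for the arrays Keras would load: 100 fake 28x28 grayscale
# images with pixel values 0-255, and one digit label per image.
x_raw = np.random.randint(0, 256, size=(100, 28, 28))
labels = np.random.randint(0, 10, size=100)

x = x_raw.reshape(100, 784) / 255.0   # flatten and scale pixels to [0, 1]
y = np.eye(10)[labels]                # one-hot encode the digit labels

print(x.shape, y.shape)  # (100, 784) (100, 10)
```

The flattened 784-vectors and one-hot 10-vectors match the input and output dimensions used by the network below.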

The Backpropagation

The toughest part of the whole code is likely to be backpropagation: how it works, and the logic behind it.

Let me start with something very simple and easy to understand. Say you want to minimize some variable y with respect to a variable x. What do we do?

Yes, you got it right: we differentiate and apply the condition dy/dx = 0.
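As a toy illustration (my own example, not from the article): to minimize y = (x - 3)**2, setting dy/dx = 2*(x - 3) = 0 gives x = 3 directly, and stepping repeatedly against the derivative reaches the same point, which is exactly what training a network does:

```python
# Toy minimization of y = (x - 3)**2 by gradient descent.
# Analytically, dy/dx = 2*(x - 3) = 0 gives the minimum at x = 3.
def dy_dx(x):
    return 2 * (x - 3)

x = 0.0    # arbitrary starting point
lr = 0.1   # learning rate
for _ in range(100):
    x -= lr * dy_dx(x)   # step downhill against the derivative

print(round(x, 4))  # 3.0
```

The weight update in backpropagation is this same "subtract the learning rate times the derivative" step, applied to every weight at once.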

Now, this is similar to what happens in backpropagation. At the end of the feed-forward pass we have a loss function that needs to be minimized with respect to the weight matrix of each layer. So what we have to do is find dc/dw(n), …, dc/dw(1), multiply each by the learning rate, and subtract the result from the corresponding w after each epoch.

If it is that easy, why not first try it on your own for a single layer, and only then look at my code?

We will be covering a three-layer Neural Network and constructing it from scratch.

The FeedForward:

As I explained in my earlier post on Neural Networks, we have a linear function whose output is given non-linearity with the help of an activation function such as ReLU, Sigmoid, Softmax, tanh, and many more.

Our feedforward equation is given by:

y = wx + b, where y is the output, w holds the weights, and b is the bias; for now we neglect the bias values.

So for a three-layer neural network we have:

#Making of the feed-forward function
import numpy as np

def sig(s):
    return 1/(1 + np.exp(-s))

def sig_der(s):
    # derivative of the sigmoid, expressed in terms of its output s = sig(z)
    return s*(1 - s)

class NN:
    def __init__(self, x, y):
        self.x = x
        self.y = y
        self.n = 64            # no. of neurons in the middle layers
        self.input_dim = 784   # 28x28 MNIST images, flattened
        self.out_dim = 10      # one output per digit class

        self.w1 = np.random.randn(self.input_dim, self.n)
        self.w2 = np.random.randn(self.n, self.n)
        self.w3 = np.random.randn(self.n, self.out_dim)

    def feedforward(self):
        self.z1 = np.dot(self.x, self.w1)
        self.a1 = sig(self.z1)
        self.z2 = np.dot(self.a1, self.w2)
        self.a2 = sig(self.z2)
        self.z3 = np.dot(self.a2, self.w3)
        self.a3 = sig(self.z3)

So far we have built the plain feedforward network, which requires minimal thinking. Now, let's start on the hard part: THE BACKPROPAGATION.

Code the Hard BACKPROP

What a neural network basically does is pass the input through layers of initially random weights, predict a value, compare it with the actual image, and obtain the error. The task is then to minimize this error, and we do that using the basic chain rule of derivatives.

dc/dw3 = dc/da3 * da3/dz3 * dz3/dw3

Since we are doing a classification problem, we will be using cross-entropy for the loss.

def cross_entropy(real, pred):
    return (pred - real) / real.shape[0]  # divided by the number of samples

so dc/da3 * da3/dz3 = a3 - y
and dz3/dw3 = a2

dc/dw2 = dc/da3 * da3/dz3 * dz3/da2 * da2/dz2 * dz2/dw2
#The equation above follows from a simple chain rule
dc/dw1 = dc/da3 * da3/dz3 * dz3/da2 * da2/dz2 * dz2/da1 * da1/dz1 * dz1/dw1

I will not be writing the code for backpropagation, but I have provided enough information for you to write it yourself and confirm your results.

The only thing you have to keep track of is the size of each matrix involved; if that is handled carefully, your output will be correct.
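If you want to check your attempt, here is one possible sketch of the backward pass, transcribing the chain-rule derivatives above line by line. This is my own sketch, not the article's withheld code; the class and activations are repeated so the snippet runs on its own, with random data standing in for MNIST:

```python
import numpy as np

def sig(s):
    return 1 / (1 + np.exp(-s))

def sig_der(a):
    # sigmoid derivative in terms of the activation a = sig(z)
    return a * (1 - a)

class NN:
    def __init__(self, x, y):
        self.x, self.y = x, y
        self.n = 64
        self.w1 = np.random.randn(784, self.n)
        self.w2 = np.random.randn(self.n, self.n)
        self.w3 = np.random.randn(self.n, 10)

    def feedforward(self):
        self.a1 = sig(np.dot(self.x, self.w1))
        self.a2 = sig(np.dot(self.a1, self.w2))
        self.a3 = sig(np.dot(self.a2, self.w3))

    def backprop(self, lr=0.1):
        m = self.x.shape[0]
        d3 = (self.a3 - self.y) / m                      # dc/da3 * da3/dz3
        d2 = np.dot(d3, self.w3.T) * sig_der(self.a2)    # ... * dz3/da2 * da2/dz2
        d1 = np.dot(d2, self.w2.T) * sig_der(self.a1)    # ... * dz2/da1 * da1/dz1
        self.w3 -= lr * np.dot(self.a2.T, d3)            # dz3/dw3 = a2
        self.w2 -= lr * np.dot(self.a1.T, d2)            # dz2/dw2 = a1
        self.w1 -= lr * np.dot(self.x.T, d1)             # dz1/dw1 = x

# Sanity check on random data standing in for MNIST: the loss should fall.
np.random.seed(0)
x = np.random.rand(32, 784)
y = np.eye(10)[np.random.randint(0, 10, 32)]

def loss(a, y):
    # per-unit binary cross-entropy, whose z3-gradient is exactly a3 - y
    eps = 1e-12
    return -np.mean(np.sum(y * np.log(a + eps)
                           + (1 - y) * np.log(1 - a + eps), axis=1))

net = NN(x, y)
net.feedforward()
before = loss(net.a3, y)
for _ in range(100):
    net.feedforward()
    net.backprop(lr=0.1)
net.feedforward()
after = loss(net.a3, y)
print(after < before)  # expect True: gradient descent reduced the loss
```

Note how every matrix product here matches the chain-rule terms above; if the shapes line up at each line, the gradients are being routed correctly.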

Finally, if you want to explore neural networks further, you can try the same approach on the Dogs vs Cats dataset and see what accuracy you get. In the next article, I will be starting off with CNNs (Convolutional Neural Networks).

We will also write Convolutional Neural Networks from scratch as well as with Keras.

Follow my articles at https://medium.com/@dubeysarvesh5525

Don’t forget to give us your 👏 !

Credit: BecomingHuman By: SARVESH DUBEY
