
Understanding Backpropagation in Neural Networks

Let me start with a question: is a newborn baby able to think and recognize things on day one? The answer is no, because the baby has to undergo a training process every second that teaches him or her: this is your mother, this is your father, these are your brothers and sisters. Once this training is complete, the connections between the neurons become so strong that he or she easily starts recognizing family members.

But what happens if someone shows the baby a face that resembles one it already knows, say the mother's sister, who is not the mother but looks like her? The baby compares the new face with the stored images of the mother and figures out that this is not my mother, even though she looks just like her. This entire process of rethinking and correcting the earlier conclusion is what backpropagation does.

Neural Network Mathematics explained how neural networks can be trained using simple algorithms. Backpropagation is one good way to tell your connections that the current weight and bias values are not good enough and need to change to get better results.
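To make that concrete, here is a minimal sketch of a single backpropagation step for one sigmoid neuron: compute the output, measure the error, and nudge the weights and bias against the gradient. The input values, target, and learning rate below are illustrative assumptions, not values taken from this article.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

x = np.array([0.5, 0.8])   # input features (assumed for illustration)
y = 1.0                    # desired output
w = np.random.randn(2)     # weights, random starting values
b = np.random.randn()      # bias, random starting value
learning_rate = 0.1        # assumed step size

# Forward pass
z = np.dot(w, x) + b
a = sigmoid(z)

# Gradients of the squared-error loss L = 0.5 * (a - y)^2, via the chain rule
error = a - y                 # dL/da
delta = error * a * (1 - a)   # dL/dz, using sigmoid'(z) = a * (1 - a)
grad_w = delta * x            # dL/dw
grad_b = delta                # dL/db

# Backpropagation update: move weights and bias against the gradient
w -= learning_rate * grad_w
b -= learning_rate * grad_b
```

Repeating this step over many examples is what gradually strengthens the "right" connections, just like the baby's repeated exposure to familiar faces.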

Let’s imagine a three-layer neural network, as shown in the image below, with “w” as weights and “b” as biases. These start out as random numbers, or we can use the Gaussian method to populate them, as in the sketch that follows.
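Below is a minimal sketch of initializing weights and biases for a three-layer network with Gaussian (normally distributed) random numbers. The layer sizes (3, 4, 2) are assumptions for illustration, not taken from the article's figure.

```python
import numpy as np

# Assumed layer sizes: 3 inputs, 4 hidden neurons, 2 outputs
layer_sizes = [3, 4, 2]

# One weight matrix and one bias vector per connection between layers,
# all drawn from a standard Gaussian distribution
weights = [np.random.randn(n_out, n_in)
           for n_in, n_out in zip(layer_sizes[:-1], layer_sizes[1:])]
biases = [np.random.randn(n_out, 1)
          for n_out in layer_sizes[1:]]

for i, (w, b) in enumerate(zip(weights, biases), start=1):
    print(f"Layer {i}: w shape {w.shape}, b shape {b.shape}")
```

Starting from these random values, backpropagation repeatedly adjusts every w and b until the network's outputs match the desired results.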
