
In this blog, we will study the importance of weight initialization techniques in neural networks. We will also cover the problems caused by improper initialization of weights. This article assumes that the reader is already familiar with the concepts of neural networks, weights, biases, activation functions, and forward and backward propagation.

In regression techniques such as Linear and Logistic regression, we initialize the weights to zero or to some random value, but the same approach does not prove fruitful for neural networks. …
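To see why zero initialization fails for neural networks in a way it doesn't for plain regression, consider the symmetry problem: if every weight starts at zero, all hidden units compute the same output and receive the same gradient, so they can never learn different features. Below is a minimal NumPy sketch of this effect; the network shape, variable names, and loss are illustrative choices, not taken from the article.

```python
import numpy as np

# Tiny 1-hidden-layer network to illustrate the symmetry problem:
# with all weights initialized to zero, every hidden unit computes
# the same activation and receives the same gradient, so the units
# never differentiate from each other during training.

rng = np.random.default_rng(0)
X = rng.normal(size=(4, 3))        # 4 samples, 3 features (toy data)
y = np.array([[0.0], [1.0], [1.0], [0.0]])

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Zero initialization: both layers start as all zeros
W1 = np.zeros((3, 5))              # input -> 5 hidden units
W2 = np.zeros((5, 1))              # hidden -> output

# Forward pass
h = sigmoid(X @ W1)                # every hidden activation is identical (0.5)
out = sigmoid(h @ W2)

# Backward pass (mean squared error, for simplicity)
d_out = (out - y) * out * (1 - out)
dW2 = h.T @ d_out                  # identical rows: same update for every hidden unit
d_h = (d_out @ W2.T) * h * (1 - h) # zero, because W2 is zero
dW1 = X.T @ d_h                    # hidden-layer weights get no gradient at all

print(np.allclose(dW2, dW2[0]))    # True: all hidden units receive the same update
print(np.allclose(dW1, 0.0))       # True: the first layer cannot start learning
```

Breaking this symmetry is exactly what random (and, better, carefully scaled) initialization achieves: distinct starting weights give each hidden unit a distinct gradient from the very first step.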


