## In the previous post, I described **what a Perceptron is**. If you don’t yet know what a Perceptron is, don’t worry: you can learn about it by clicking the button below.

## Now we’ll learn **why we need a learning algorithm for the Perceptron**, and then walk through the algorithm itself.

**So, let’s get started.**

**Why Do We Need a Perceptron Learning Algorithm?**

**Did You Know?**

**The main difference between the McCulloch-Pitts Neuron and the Perceptron is the introduction of numerical weights ( w_{1}, w_{2}, w_{3}, …, w_{n} ) for the inputs, along with a mechanism for learning these weights.**

*What are these Weights?*

*Well, a weight is attached to each input. For example, if X1 is an input, the Perceptron actually works with the product X1·W1. These weights are learnt over time, and for that learning we need an algorithm by which the weights can be found.*
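As a minimal sketch of the idea above (the variable names and values here are illustrative, not from the post), each input is multiplied by its attached weight before the Perceptron sums them:

```python
# Sketch: how weights scale the inputs of a Perceptron.
# All names and numbers below are illustrative assumptions.

inputs = [1.0, 0.5, -2.0]    # x1, x2, x3
weights = [0.8, -0.4, 0.3]   # w1, w2, w3 attached to each input

# Each input contributes x_i * w_i; the Perceptron sums these products.
weighted_sum = sum(x * w for x, w in zip(inputs, weights))
print(weighted_sum)  # 0.8*1.0 + (-0.4)*0.5 + 0.3*(-2.0) = 0.0
```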

**So, now we are going to learn the Learning Algorithm of the Perceptron.**

**Let’s Assume,**

**Positive inputs ( P ) with label 1.**

**and Negative inputs ( N ) with label 0.**

**and Initialize, W = [ W_{0}, W_{1}, W_{2}, … , W_{n} ]**
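With P, N, and W set up as above, the standard perceptron learning rule can be sketched as follows. This is a hedged sketch, not the post’s exact listing: the function and variable names are my own, and each input x is written as [ 1, x1, …, xn ] so that w0 acts as the bias.

```python
# Sketch of the standard perceptron learning rule.
# Assumptions: P holds positive inputs (label 1), N holds negative
# inputs (label 0), and each input starts with a leading 1 for w0.

def dot(w, x):
    """Dot product of the weight vector and an input vector."""
    return sum(wi * xi for wi, xi in zip(w, x))

def perceptron_learn(P, N, n, max_epochs=1000):
    w = [0.0] * (n + 1)                      # W = [w0, w1, ..., wn]
    for _ in range(max_epochs):
        converged = True
        for x in P + N:
            if x in P and dot(w, x) < 0:     # positive input misclassified
                w = [wi + xi for wi, xi in zip(w, x)]
                converged = False
            elif x in N and dot(w, x) >= 0:  # negative input misclassified
                w = [wi - xi for wi, xi in zip(w, x)]
                converged = False
        if converged:                        # all inputs classified correctly
            break
    return w

# Tiny linearly separable example (OR-like data, leading 1 for the bias).
P = [[1, 0, 1], [1, 1, 0], [1, 1, 1]]   # label 1
N = [[1, 0, 0]]                          # label 0
w = perceptron_learn(P, N, n=2)
```

Note that the loop only stops early once a full pass makes no updates, which is exactly the convergence condition discussed next.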

**What do you mean by Convergence in this algorithm?**

### This algorithm converges when all the inputs are classified correctly on the training data set.

We can re-write the algorithm in a simple way – Let’s consider two Vectors → w and x.

**w = [ w_{0}, w_{1}, w_{2}, …, w_{n} ] and x = [ 1, x_{1}, x_{2}, …, x_{n} ]**

w and x are two vectors, so the Dot Product will be: w · x = w_{0}·1 + w_{1}x_{1} + w_{2}x_{2} + … + w_{n}x_{n}

**By this rule, the Perceptron will train, or learn, the weights. This is why it is named the _Perceptron Learning Algorithm_.**
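Concretely, since x carries a leading 1 that pairs with w_{0}, the dot product and the resulting classification can be computed like this (the numbers are made up for illustration):

```python
# Dot product w · x used by the Perceptron; example values are illustrative.
w = [-1.0, 1.0, 1.0]          # [w0, w1, w2]; w0 plays the role of the bias
x = [1.0, 0.0, 1.0]           # [1, x1, x2]; the leading 1 pairs with w0

dot = sum(wi * xi for wi, xi in zip(w, x))
label = 1 if dot >= 0 else 0  # fire (label 1) when w · x >= 0
print(dot, label)             # -1.0 + 0.0 + 1.0 = 0.0, so the label is 1
```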
