What is Perceptron Learning Algorithm in Neural Networks


In the previous post, I described what a Perceptron is. If you don't know yet, don't worry — you can learn about the Perceptron from the post linked below.

Read: What is Perceptron? → https://tec4tric.com/2018/10/what-is-perceptron.html

Now we'll look at why the Perceptron needs a learning algorithm, and then walk through the algorithm itself.

So, let’s get started.

[Image: Perceptron — https://tec4tric.com/wp-content/uploads/2018/10/perceptron.jpg]

Why Do We Need a Perceptron Learning Algorithm?


Did You Know?

The main difference between the McCulloch-Pitts neuron and the Perceptron is the introduction of numerical weights (w1, w2, w3, …, wn) for the inputs, together with a mechanism for learning these weights.


What are these Weights?

A weight is attached to each input. So where X1 is an input, what the Perceptron actually receives is X1*W1. These weights are adjusted over time as the Perceptron learns, and that adjustment requires an algorithm by which the weights can be learned.
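As a quick illustration of this weighting (the input and weight values below are made up for the example), each input is multiplied by its attached weight before the Perceptron sums them:

```python
# Each input is multiplied by its attached weight before being summed.
inputs  = [0.8, 0.2, 1.0]    # x1, x2, x3 -- hypothetical input values
weights = [0.4, -0.6, 0.9]   # w1, w2, w3 -- hypothetical learned weights

weighted = [x * w for x, w in zip(inputs, weights)]  # [x1*w1, x2*w2, x3*w3]
total = sum(weighted)        # the weighted sum the Perceptron thresholds, approx. 1.1
```

Learning means nudging the values in `weights` until this weighted sum lands on the correct side of the threshold for every training input.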

So now let's go through the Perceptron Learning Algorithm.


Let's assume:

Positive inputs ( P ) with label 1,

and negative inputs ( N ) with label 0.

Initialize W = [ W0, W1, W2, …, Wn ].

[Figure: The Perceptron Learning Algorithm — https://tec4tric.com/wp-content/uploads/2018/10/pl2.png]

What does Convergence mean in this algorithm?

The algorithm converges when every input in the training set is classified correctly. This is guaranteed to happen (in finitely many updates) only when the positive and negative inputs are linearly separable.
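The algorithm described above can be sketched in code. This is a sketch following the standard perceptron update convention — add a misclassified positive input to w, subtract a misclassified negative input — and the `train_perceptron` name and the AND-gate training data are illustrative, not from the original post:

```python
def train_perceptron(P, N, max_epochs=100):
    """P: positive inputs (label 1), N: negative inputs (label 0).
    Each input starts with a constant 1 so that w0 acts as the bias."""
    w = [0.0] * len(P[0])          # initialize W = [W0, W1, ..., Wn]
    for _ in range(max_epochs):
        converged = True
        for x in P:                                     # positives: want w.x >= 0
            if sum(wi * xi for wi, xi in zip(w, x)) < 0:
                w = [wi + xi for wi, xi in zip(w, x)]   # add misclassified positive
                converged = False
        for x in N:                                     # negatives: want w.x < 0
            if sum(wi * xi for wi, xi in zip(w, x)) >= 0:
                w = [wi - xi for wi, xi in zip(w, x)]   # subtract misclassified negative
                converged = False
        if converged:              # every training input classified correctly
            return w
    return w

# Hypothetical usage: learn the AND function (inputs prefixed with the bias input 1).
P = [[1, 1, 1]]                          # 1 AND 1 -> label 1
N = [[1, 0, 0], [1, 0, 1], [1, 1, 0]]    # everything else -> label 0
w = train_perceptron(P, N)
```

AND is linearly separable, so the loop reaches the convergence check and stops with a weight vector that classifies all four inputs correctly.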

We can rewrite the algorithm in a simpler way. Consider two vectors, w and x:

w = [ w0, w1, w2, …, wn ]        and        x = [ 1, x1, x2, …, xn ]

So the dot product will be:

w · x = w0·1 + w1·x1 + w2·x2 + … + wn·xn

[Figure: Dot-product form of the rule — https://tec4tric.com/wp-content/uploads/2018/10/pl3.png]
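With the bias folded into the vectors like this, the dot product is just the sum of element-wise products (the numeric values below are illustrative):

```python
w = [0.5, -1.0, 2.0]   # [w0, w1, w2] -- hypothetical weights; w0 is the bias
x = [1, 3.0, 1.5]      # [1, x1, x2] -- the leading 1 multiplies the bias w0

# w.x = w0*1 + w1*x1 + w2*x2 = 0.5 - 3.0 + 3.0 = 0.5
dot = sum(wi * xi for wi, xi in zip(w, x))
```

The sign of `dot` is what the Perceptron thresholds: non-negative means the input is classified as positive, negative means it is classified as negative.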

By this rule the Perceptron trains, i.e. learns its weights — which is why it is called the Perceptron Learning Algorithm.


Sayan De

Sayan De is pursuing his M.Tech in CSE. His areas of interest include Machine Learning, Deep Learning, Deep NLP, Computer Vision, Data Science, Linux, and a bit of Web Development.
