# Adaptive-Linear-Neuron

Adaline was published by Bernard Widrow and his doctoral student Ted Hoff and can be considered an improvement on the perceptron algorithm. It illustrates the key concept of defining and minimizing a continuous cost function, which lays the groundwork for understanding more advanced machine learning algorithms for classification, such as logistic regression and support vector machines, as well as regression models. The key difference between the Adaline rule (also known as the Widrow-Hoff rule) and Rosenblatt's perceptron is that the weights are updated based on a linear activation function rather than a unit step function. In Adaline, this linear activation function is simply the identity function of the net input, i.e. φ(z) = z, where z = wᵀx.
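
Because the sum-of-squared-errors cost computed from this continuous, linear-activation output is differentiable, the weights can be learned with batch gradient descent over the whole training set. Below is a minimal sketch of such a training loop in NumPy; the class and parameter names (`AdalineGD`, `eta`, `n_iter`) are illustrative assumptions and may not match the code in this repository.

```python
import numpy as np

class AdalineGD:
    """Adaline trained with full-batch gradient descent.

    Minimizes the sum-of-squared-errors cost
        J(w) = 1/2 * sum_i (y_i - w^T x_i)^2
    where the activation is the identity of the net input.
    """

    def __init__(self, eta=0.01, n_iter=50, random_state=1):
        self.eta = eta                  # learning rate
        self.n_iter = n_iter            # passes over the training set
        self.random_state = random_state

    def fit(self, X, y):
        rng = np.random.RandomState(self.random_state)
        # Small random initial weights; w_[0] plays the role of the bias unit.
        self.w_ = rng.normal(loc=0.0, scale=0.01, size=1 + X.shape[1])
        self.cost_ = []
        for _ in range(self.n_iter):
            output = self.net_input(X)   # identity (linear) activation
            errors = y - output
            # Widrow-Hoff update: step along the negative cost gradient,
            # computed from all samples at once.
            self.w_[1:] += self.eta * X.T.dot(errors)
            self.w_[0] += self.eta * errors.sum()
            self.cost_.append((errors ** 2).sum() / 2.0)
        return self

    def net_input(self, X):
        return np.dot(X, self.w_[1:]) + self.w_[0]

    def predict(self, X):
        # The continuous output is thresholded only at prediction time.
        return np.where(self.net_input(X) >= 0.0, 1, -1)
```

With a sufficiently small `eta`, the recorded `cost_` values should decrease across epochs, which gives a direct way to monitor convergence. The perceptron, by contrast, updates its weights from the thresholded output one sample at a time and has no such continuous cost to track.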
