In neural networks (NNs), backpropagation (BP) remains the preferred approach for training large networks. BP is a supervised learning method used to train multilayer perceptrons (MLPs). It is a gradient descent algorithm that minimizes the error between the network's output and the desired outcome: it efficiently computes the first derivatives of the error with respect to the weights and adjusts the weights in proportion to a parameter known as the learning rate.
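The sketch below illustrates this idea: a single-hidden-layer MLP with sigmoid activations trained by gradient descent, where the backward pass uses the first derivative of the activation at each layer. It is a minimal, illustrative example with toy data and assumed layer sizes and learning rate, not the repository's actual heart-attack dataset or network configuration.

```python
# Minimal backpropagation sketch for a 3-4-1 MLP with sigmoid activations
# and mean squared error. All sizes, the learning rate, and the data are
# illustrative assumptions.
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def sigmoid_prime(a):
    # First derivative of the sigmoid, written in terms of its output a.
    return a * (1.0 - a)

rng = np.random.default_rng(0)

# Toy data: 4 samples, 3 input features, 1 binary target.
X = rng.random((4, 3))
y = np.array([[0.0], [1.0], [1.0], [0.0]])

# Network parameters: 3 inputs -> 4 hidden units -> 1 output.
W1 = rng.normal(scale=0.5, size=(3, 4))
b1 = np.zeros((1, 4))
W2 = rng.normal(scale=0.5, size=(4, 1))
b2 = np.zeros((1, 1))

learning_rate = 0.5

for epoch in range(5000):
    # Forward pass: compute hidden and output activations.
    hidden = sigmoid(X @ W1 + b1)
    output = sigmoid(hidden @ W2 + b2)

    # Error between the network's output and the desired outcome.
    error = output - y

    # Backward pass: propagate the error through the first derivative
    # of the activation at each layer (chain rule).
    delta_out = error * sigmoid_prime(output)
    delta_hidden = (delta_out @ W2.T) * sigmoid_prime(hidden)

    # Gradient descent update: move the weights opposite the gradient,
    # scaled by the learning rate.
    W2 -= learning_rate * hidden.T @ delta_out
    b2 -= learning_rate * delta_out.sum(axis=0, keepdims=True)
    W1 -= learning_rate * X.T @ delta_hidden
    b1 -= learning_rate * delta_hidden.sum(axis=0, keepdims=True)

print("final mean squared error:", float(np.mean(error ** 2)))
```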
About
Heart attack prediction using first derivative