Welch Labs | Neural Networks Demystified [Part 4: Backpropagation] @WelchLabsVideo | Uploaded 9 years ago | Updated 2 hours ago
Backpropagation, as simple as possible, but no simpler. Perhaps the most misunderstood part of neural networks, backpropagation of errors is the key step that allows artificial neural networks (ANNs) to learn. In this video, I give the derivation and thought process behind backpropagation using high-school-level calculus.
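For reference, here is a minimal sketch of the backpropagation step for a one-hidden-layer sigmoid network with a squared-error cost, loosely following the setup used in this series. This is an illustrative example, not the code from the repo; the function and variable names (forward, cost_gradients, W1, W2) are chosen for this sketch.

```python
import numpy as np

# Sketch: backprop through a 2-3-1 sigmoid network with
# squared-error cost J = 0.5 * sum((y - yHat)^2).
# Names and shapes are illustrative, not the repo's code.

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def sigmoid_prime(z):
    s = sigmoid(z)
    return s * (1.0 - s)

def forward(X, W1, W2):
    z2 = X @ W1          # hidden-layer pre-activation
    a2 = sigmoid(z2)     # hidden-layer activation
    z3 = a2 @ W2         # output pre-activation
    yHat = sigmoid(z3)   # prediction
    return z2, a2, z3, yHat

def cost(X, y, W1, W2):
    _, _, _, yHat = forward(X, W1, W2)
    return 0.5 * np.sum((y - yHat) ** 2)

def cost_gradients(X, y, W1, W2):
    """Analytic gradients dJ/dW1, dJ/dW2 via backpropagation."""
    z2, a2, z3, yHat = forward(X, W1, W2)
    delta3 = -(y - yHat) * sigmoid_prime(z3)       # error at the output
    dJdW2 = a2.T @ delta3                          # gradient for second weight matrix
    delta2 = (delta3 @ W2.T) * sigmoid_prime(z2)   # error propagated to hidden layer
    dJdW1 = X.T @ delta2                           # gradient for first weight matrix
    return dJdW1, dJdW2
```

A gradient-descent step then subtracts a small multiple of these gradients from W1 and W2, which is where Parts 3 and 6 of the series pick up.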

Supporting Code and Equations:
github.com/stephencwelch/Neural-Networks-Demystified

In this series, we will build and train a complete Artificial Neural Network in Python. New videos every other Friday.

Part 1: Data + Architecture
Part 2: Forward Propagation
Part 3: Gradient Descent
Part 4: Backpropagation
Part 5: Numerical Gradient Checking
Part 6: Training
Part 7: Overfitting, Testing, and Regularization

@stephencwelch

