This is the core idea of neural networks. First, remember this sentence: the gradient points in the direction of fastest ascent, so the negative gradient is the direction of fastest descent — don't ask why for now, just remember it. Therefore, when a neural network adjusts its weights according to the error, it moves them in the direction of gradient descent. <br>Second, the essence of backpropagation is propagating the error information backwards. Only when the error is passed back can the earlier neurons know how to adjust their own weights and thresholds to optimize the network. Backpropagation distributes the error in proportion to the weights: a neuron connected through a large weight receives a large share of the error and gets a large adjustment, while one connected through a small weight receives a small share. After all, the connections with large weights contributed the most to the error, so they should bear the main responsibility.
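The two ideas above can be sketched in a tiny worked example. This is a hypothetical toy network (2 inputs, 2 hidden neurons, 1 output; all values made up for illustration), not any specific implementation from the text: the output error is sent backwards in proportion to each connecting weight, and every weight then takes a step against its gradient.

```python
from math import exp

def sigmoid(x):
    return 1.0 / (1.0 + exp(-x))

# Toy network (all numbers are illustrative assumptions):
x = [0.5, 0.9]                        # inputs
w_hidden = [[0.8, 0.2], [0.4, 0.6]]   # w_hidden[j][i]: input i -> hidden j
w_out = [0.3, 0.9]                    # hidden j -> output
target = 1.0
lr = 0.5                              # learning rate

# Forward pass.
h = [sigmoid(sum(w_hidden[j][i] * x[i] for i in range(2))) for j in range(2)]
y = sigmoid(sum(w_out[j] * h[j] for j in range(2)))

# Output-layer error term (squared-error loss with a sigmoid output).
delta_out = (y - target) * y * (1 - y)

# Backpropagation: each hidden neuron receives a share of the error
# proportional to the weight linking it to the output, so the neuron
# behind w_out[1] = 0.9 gets a larger share than the one behind 0.3.
delta_hidden = [delta_out * w_out[j] * h[j] * (1 - h[j]) for j in range(2)]

# Gradient descent: step each weight AGAINST its gradient.
w_out = [w_out[j] - lr * delta_out * h[j] for j in range(2)]
for j in range(2):
    for i in range(2):
        w_hidden[j][i] -= lr * delta_hidden[j] * x[i]
```

Running the forward pass again with the updated weights gives an output closer to the target, which is exactly what "adjusting along the direction of gradient descent" means here.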