The error back-propagation (BP) neural network is a multilayer feedforward network trained with the error back-propagation algorithm, and it is one of the most widely used neural network models. It can classify arbitrarily complex patterns and map multidimensional functions with high accuracy: it learns and stores a large number of input-output mapping relations without requiring the mathematical equations describing those mappings to be specified in advance, and it solves the XOR problem and other problems that a simple perceptron cannot. Structurally, as shown in Figure 1, a BP network consists of an input layer, one or more hidden layers, and an output layer. The input layer receives the training data, and its size is determined by the format of that data; the hidden layers hold the network's weight matrices, and there may be more than one of them, with the number of neurons in each layer being that layer's node count; the number of hidden layers is simply how many such layers the network contains. In essence, the BP algorithm takes the squared network error as its objective function and uses gradient descent to find the minimum of that objective. The network has strong nonlinear mapping ability and a flexible structure: the number of intermediate layers and the number of neurons in each layer can be set freely according to the specific problem, and the network's performance varies with its structure.
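The ideas above can be illustrated with a minimal sketch of a BP network trained by gradient descent on the XOR problem mentioned in the text. This is an illustrative example, not the exact network of Figure 1: the 2-4-1 architecture, sigmoid activations, learning rate, and epoch count are all assumptions chosen for the demonstration, and the objective is the squared error, as described above.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)

# XOR inputs and targets: the classic task a simple perceptron cannot solve.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# Weight matrices: input -> hidden (2x4) and hidden -> output (4x1).
# Hidden-layer width of 4 is an arbitrary choice for this sketch.
W1 = rng.normal(size=(2, 4))
b1 = np.zeros(4)
W2 = rng.normal(size=(4, 1))
b2 = np.zeros(1)

def forward(X):
    h = sigmoid(X @ W1 + b1)      # hidden-layer activations
    out = sigmoid(h @ W2 + b2)    # output-layer activations
    return h, out

def loss(out, y):
    # Squared-error objective E = 0.5 * sum((out - y)^2)
    return 0.5 * np.sum((out - y) ** 2)

_, out0 = forward(X)
loss_init = loss(out0, y)

lr = 0.5
for epoch in range(20000):
    # Forward pass
    h, out = forward(X)

    # Backward pass: propagate the error with the chain rule
    d_out = (out - y) * out * (1 - out)     # delta at output layer
    d_h = (d_out @ W2.T) * h * (1 - h)      # delta at hidden layer

    # Gradient-descent updates on the weight matrices and biases
    W2 -= lr * h.T @ d_out
    b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h
    b1 -= lr * d_h.sum(axis=0)

_, out_final = forward(X)
loss_final = loss(out_final, y)
print("initial loss:", loss_init, "final loss:", loss_final)
```

After training, thresholding the outputs at 0.5 typically recovers the XOR truth table, although convergence depends on the random initialization, as the objective has local minima.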