06/26/2009 – added a new indicator BPNN Predictor with Smoothing.mq4, in which prices are smoothed using EMA before predictions.
08/20/2009 – corrected the code calculating the neuron activation function to prevent an arithmetic exception; updated BPNN.cpp and BPNN.dll
08/21/2009 – added memory cleanup at the end of DLL execution; updated BPNN.cpp and BPNN.dll
Brief theory of Neural Networks:
A neural network is an adjustable model that maps inputs to outputs. It consists of several layers:
input layer, which consists of input data
hidden layer, which consists of processing nodes called neurons
output layer, which consists of one or several neurons, whose outputs are the network outputs.
All nodes of adjacent layers are interconnected. These connections are called synapses. Every synapse has an assigned scaling coefficient, by which the data propagated through the synapse is multiplied. These scaling coefficients are called weights (w[i][j][k]). In a Feed-Forward Neural Network (FFNN) the data is propagated from the inputs to the outputs. Here is an example of an FFNN with one input layer, one output layer, and two hidden layers:
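The propagation described above can be sketched in C++ as a loop over layers: each node of the next layer sums the outputs of the previous layer multiplied by the synapse weights, then applies an activation function. This is only an illustrative sketch, not the code from BPNN.cpp; the weight indexing convention and the choice of tanh as the activation function are assumptions for the example.

```cpp
#include <cmath>
#include <vector>

// One feed-forward pass through a fully connected network.
// weights[k][i][j] is assumed to be the weight of the synapse from
// node i of layer k to node j of layer k+1 (illustrative convention,
// not necessarily the one used in BPNN.cpp).
std::vector<double> feedForward(
    std::vector<double> activations,
    const std::vector<std::vector<std::vector<double>>>& weights)
{
    for (const auto& layer : weights) {
        // Weighted sum of the previous layer's outputs for each node.
        std::vector<double> next(layer[0].size(), 0.0);
        for (size_t i = 0; i < layer.size(); ++i)
            for (size_t j = 0; j < layer[i].size(); ++j)
                next[j] += activations[i] * layer[i][j];
        // Apply the activation function (tanh assumed here).
        for (double& v : next)
            v = std::tanh(v);
        activations = std::move(next);
    }
    return activations;
}
```

For example, a 2-input, 2-hidden-node, 1-output network is described by two weight matrices (input→hidden and hidden→output), and the function returns a vector with one element, the network output.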