Dear All,
I am trying to build an artificial neural network.
So far my network is divided into several layers, each comprising a certain number of neurons. Every neuron is connected to every neuron in the previous and following layers. Neurons receive an input from each of their connections, and each input is multiplied by a value, the "weight" of the connection between the two neurons. All the weighted inputs are summed, and the output of the neuron is the result of a sigmoid function applied to that sum. Overall, the network produces an output O. The "error" of the network is given by E = 1/2*(O - D)^2, where D is the desired output. At this point, having:
a) The individual output of all neurons stored within the neuron objects
b) The overall error of the network
How should I modify the weights of the connections between neurons?
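For reference, here is a minimal sketch of the forward pass and the error computation I described above. The class and function names (Neuron, Layer, sigmoid, network_error) are just illustrative, not my actual code.

    import math
    import random

    def sigmoid(x):
        # Squashing function applied to the summed, weighted inputs.
        return 1.0 / (1.0 + math.exp(-x))

    class Neuron:
        def __init__(self, num_inputs):
            # One weight per incoming connection (random values as placeholders).
            self.weights = [random.uniform(-1.0, 1.0) for _ in range(num_inputs)]
            self.output = 0.0  # stored inside the neuron object after each forward pass

        def forward(self, inputs):
            # Multiply each input by its connection weight, sum, then apply the sigmoid.
            total = sum(w * x for w, x in zip(self.weights, inputs))
            self.output = sigmoid(total)
            return self.output

    class Layer:
        def __init__(self, num_neurons, num_inputs):
            self.neurons = [Neuron(num_inputs) for _ in range(num_neurons)]

        def forward(self, inputs):
            # Every neuron in this layer sees the full output of the previous layer.
            return [n.forward(inputs) for n in self.neurons]

    def network_error(output, desired):
        # E = 1/2 * (O - D)^2 for a single output value.
        return 0.5 * (output - desired) ** 2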
Cheers :)
Dario