F. Y. M. Tech. Civil
Semester - II
[Figure: structure of a neuron, with labelled dendrites and axon]
Structure of Neuron
Components
Dendrites: branching fibers that extend from the cell body.
Soma or cell body: contains the nucleus and other structures that
support chemical processing and the production of neurotransmitters.
Axon: a single fiber that carries information away from the soma to
the synaptic sites of other neurons (dendrites and somas), muscles
or glands.
Axon hillock: the site of summation for incoming information.
At any moment, the collective influence of all neurons that
conduct impulses to a given neuron determines whether or not
an action potential will be initiated at the axon hillock and
propagated along the axon.
Components (contd.)
Myelin sheath: consists of fat-containing cells that insulate the
axon from electrical activity. This insulation increases the
rate of transmission of signals. A gap exists between each myelin
sheath cell along the axon. Since fat inhibits the propagation of
electricity, the signals jump from one gap to the next.
Nodes of Ranvier: the gaps (about 1 micron) between myelin
sheath cells along axons. Since fat serves as a good insulator, the
myelin sheaths speed the rate of transmission of an electrical
impulse along the axon.
Synapse: the point of connection between two neurons, or between a
neuron and a muscle or a gland. Electrochemical communication
between neurons takes place at these junctions.
Terminal buttons: small knobs at the end of an axon that release
chemicals called neurotransmitters.
Neuron
Processes inside biological neural networks are very complex
and are still not completely understood.
There are hundreds of different types of biological neurons in the
human brain, so it is almost impossible to create a mathematical
model that exactly matches a biological neural network.
However, for practical applications of artificial neural networks,
it is not necessary to use complex neuron models. The models
developed for artificial neurons therefore only resemble the
structure of biological neurons; they make no claim to replicate
their actual behaviour.
Neural Network
A neural network is composed of numerous mutually connected
neurons grouped in layers.
The complexity of the network is defined by the number of layers.
Besides the input (first) and the output (last) layer, a network can
have one or more hidden layers.
The purpose of the input layer is to accept data from the
surroundings. These data are processed in the hidden layers and
sent to the output layer. The final results of the network are
the outputs of the neurons in the last layer, and these form
the solution to the analysed problem.
The input data can have any form or type. The basic rule is that
each data item corresponds to exactly one input value. Depending on
the problem type, the network can have one or more outputs.
Weight Coefficients
Weight coefficients are the key elements of every neural network.
They express the relative importance of each neuron input and
determine an input's capability to stimulate the neuron.
Every input of a neuron has its own weight coefficient. By
multiplying those weight coefficients by the input signals and
summing the products, we calculate the total input signal to each neuron.
In the figure (neuron), the input data are marked as X1, X2 and
X3, and the corresponding weight coefficients are W1, W2 and W3.
The input impulses are W1X1, W2X2 and W3X3. The neuron
registers the summed input impulse, which is equal to the sum of
all input impulses: X = W1X1 + W2X2 + W3X3.
The received impulse is processed through an appropriate
transformation function (activation function), f(X), and the output
signal from the neuron will be: Y = f(X) = f(W1X1 + W2X2 +
W3X3).
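The weighted-sum computation above can be sketched in a few lines of Python; the input values, weights, and step activation below are illustrative choices, not values from the text.

```python
# Sketch of a single neuron: the weighted sum of the inputs is
# passed through an activation function f to produce the output Y.

def neuron_output(inputs, weights, f):
    """Compute Y = f(W1*X1 + W2*X2 + ... + Wn*Xn)."""
    x = sum(w * xi for w, xi in zip(weights, inputs))
    return f(x)

# Three inputs X1..X3 with weights W1..W3, as in the figure;
# a simple step function stands in for the activation.
inputs = [0.5, 1.0, -0.2]
weights = [0.4, 0.3, 0.9]
y = neuron_output(inputs, weights, f=lambda x: 1 if x > 0 else 0)
```

Any activation function can be passed in for `f`; the sigmoid discussed later is a common choice.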
Weight Coefficients
Weight coefficients are elements of the matrix W, which has n
rows and m columns. For example, the weight coefficient
Wnm is the mth output of the nth neuron (Fig. 2).
The connection between signal sources and neurons is
determined by the weight coefficients. A positive weight
coefficient corresponds to an excitatory synapse and a negative
coefficient to an inhibitory synapse. If Wij = 0, there
is no connection between the two neurons.
One very important characteristic of neural networks is
their ability to adjust the weights according to historical
data; this adjustment is actually the learning
process of the network.
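The weight matrix can be sketched as follows; the dimensions and entries are illustrative, not taken from Fig. 2.

```python
# Sketch: a weight matrix W connecting n source neurons to m targets.
# W[i][j] = 0 means source neuron i has no connection to target j;
# positive entries act as excitatory synapses, negative as inhibitory.

n, m = 2, 3
W = [
    [0.5, -0.3, 0.0],  # outputs of source neuron 1 (0.0: no link to target 3)
    [0.2,  0.0, 0.7],  # outputs of source neuron 2
]

def layer_impulses(x, W):
    """Multiply the input signals x (length n) by W to get the
    m summed impulses arriving at the target neurons."""
    return [sum(x[i] * W[i][j] for i in range(len(x)))
            for j in range(len(W[0]))]

z = layer_impulses([1.0, 2.0], W)
```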
Activation Function
The main purpose of the activation (transformation) function is to
determine whether the result of the summed impulse X =
W1X1 + W2X2 + .... + WnXn can generate an output.
This function is associated with the neurons of the hidden
layers and is usually a non-linear function.
Almost any non-linear function can be used as an activation
function, but common practice is to use a sigmoid function
(hyperbolic tangent or logistic) with the following form:
Yt = 1/(1 + e^(-X)), where Yt is the normalized value of
the result of the summation function. Normalization means that
the output value, after the transformation, will lie within
reasonable limits, between 0 and 1.
If there were no activation function and no transformation, the output
value might become too large, especially for complex networks with
several hidden layers.
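The logistic sigmoid above is straightforward to sketch and check:

```python
import math

# The logistic sigmoid activation: squashes any summed impulse X
# into the open interval (0, 1).

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# Large positive sums map close to 1, large negative sums close to 0,
# and a sum of zero maps exactly to 0.5.
```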
Basics of NN
Learning of ANN
The most commonly used learning scheme is the back-propagation
model.
The learning algorithm processes the patterns in two stages.
In the first stage, the input pattern generates a forward
flow of signals from the input layer to the output layer.
The error of each output neuron is then determined from
the difference between the computed values and the
observed (experimental) values.
The second stage involves readjustment of the weights and
biases in the hidden and output layers to reduce the
difference between the computed and desired outputs.
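The two stages can be sketched for the simplest possible case, a single sigmoid neuron with no hidden layer; a full back-propagation pass would additionally push the output errors back through the hidden layers. The learning rate and the training data in the usage below are illustrative.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def train_step(weights, bias, inputs, target, lr=0.5):
    """One learning step for a single sigmoid neuron."""
    # Stage 1: forward flow of signals to compute the output.
    x = sum(w * xi for w, xi in zip(weights, inputs)) + bias
    y = sigmoid(x)
    # Error between the computed and the observed (desired) value.
    error = target - y
    # Stage 2: readjust weights and bias to reduce the error
    # (gradient of the squared error through the sigmoid).
    delta = error * y * (1.0 - y)
    weights = [w + lr * delta * xi for w, xi in zip(weights, inputs)]
    bias = bias + lr * delta
    return weights, bias, error

# Repeated steps shrink the error on a fixed pattern.
w, b = [0.0], 0.0
for _ in range(200):
    w, b, err = train_step(w, b, [1.0], 1.0)
```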
A AND B
This network contains two inputs and one output. A neural network
that recognizes the AND logical operation is shown in Fig.
There are two inputs to the network shown in Fig. Each input
connection has a weight of one. The threshold is 1.5. Therefore, the
neuron will only fire if both inputs are true. If either input is
false, the sum of the two inputs will not exceed the threshold of 1.5.
Consider inputs of true and false. The true input will send a value of one
to the output neuron. This is below the threshold of 1.5, so the neuron
will not fire. Likewise, consider inputs of true and true. Each input
neuron will send a value of one. These two inputs are summed by the output
neuron, resulting in two. The value of two is greater than 1.5; therefore,
the neuron will fire.
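The AND network described above can be sketched directly from its description: two inputs, weights of one each, and a threshold of 1.5.

```python
# Sketch of the AND network: the neuron fires (outputs 1) only when
# the weighted sum of both inputs exceeds the threshold of 1.5.

def and_neuron(a, b, threshold=1.5):
    total = 1.0 * a + 1.0 * b   # each input connection has a weight of one
    return 1 if total > threshold else 0
```

With one input true, the sum is 1 and the neuron stays silent; with both true, the sum is 2 and the neuron fires.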
A OR B

A | B | A OR B
0 | 0 |   0
0 | 1 |   1
1 | 0 |   1
1 | 1 |   1
A XOR B

A | B | A XOR B
0 | 0 |    0
0 | 1 |    1
1 | 0 |    1
1 | 1 |    0
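A single threshold neuron with weights of one and a lower threshold of 0.5 realizes OR in the same way as the AND network above. XOR, however, cannot be computed by any single threshold neuron and requires a hidden layer. A sketch, where the specific weights and thresholds are illustrative choices rather than values from the text:

```python
# Threshold neurons for OR and XOR. OR needs one neuron; XOR is
# built from a small hidden layer as (A OR B) AND NOT (A AND B).

def threshold_neuron(inputs, weights, threshold):
    total = sum(w * x for w, x in zip(weights, inputs))
    return 1 if total > threshold else 0

def or_gate(a, b):
    # Either input alone (value 1) exceeds the 0.5 threshold.
    return threshold_neuron([a, b], [1.0, 1.0], 0.5)

def xor_gate(a, b):
    h_or = threshold_neuron([a, b], [1.0, 1.0], 0.5)    # A OR B
    h_and = threshold_neuron([a, b], [1.0, 1.0], 1.5)   # A AND B
    # Fire when OR is active but AND is not.
    return threshold_neuron([h_or, h_and], [1.0, -1.0], 0.5)
```

The need for a hidden layer here is why single-layer networks are limited to linearly separable problems.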