Code.No: 37378
R05 SET-1

JAWAHARLAL NEHRU TECHNOLOGICAL UNIVERSITY HYDERABAD


IV B.TECH – I SEM REGULAR EXAMINATIONS, JANUARY 2010
ARTIFICIAL NEURAL NETWORKS
(COMMON TO EIE, BME, ETM)
Time: 3 Hours                                                Max. Marks: 80
Answer any FIVE questions
All questions carry equal marks
---

1. Draw the schematic diagram of a biological neuron and explain the function of each
part. [16]

2.a) Use McCulloch-Pitts neurons to design logic networks to implement the following
functions (use two neurons):
i) o^(k+2) = x̄1^k x2^k x̄3^k
ii) o^(k+2) = x1^k x2^k
b) Compare Boltzmann learning with memory-based learning. [16]
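A minimal Python sketch of one possible two-unit design for question 2(a), assuming the barred inputs in (i) are complements, 0/1 inputs, weights of +1/-1 with a hard threshold, and one time-step of delay per unit (so a two-unit cascade produces the output at step k+2):

```python
# A minimal sketch, not a prescribed solution. Assumptions: inputs are 0/1,
# barred inputs in (i) are complements, excitatory/inhibitory weights are +1/-1,
# and each unit adds one time-step of delay, so two cascaded units give o(k+2).

def mp_unit(inputs, weights, threshold):
    """One McCulloch-Pitts neuron: hard threshold on the weighted input sum."""
    return 1 if sum(w * x for w, x in zip(weights, inputs)) >= threshold else 0

def network_i(x1, x2, x3):
    """o(k+2) = x1-bar . x2 . x3-bar, realised with two units."""
    h = mp_unit([x1, x2, x3], weights=[-1, 1, -1], threshold=1)  # fires only for (0, 1, 0); output at k+1
    return mp_unit([h], weights=[1], threshold=1)                # pass-through unit; output at k+2

def network_ii(x1, x2):
    """o(k+2) = x1 . x2, realised with two units."""
    h = mp_unit([x1, x2], weights=[1, 1], threshold=2)           # AND; output at k+1
    return mp_unit([h], weights=[1], threshold=1)                # delay stage; output at k+2

if __name__ == "__main__":
    for x in [(0, 1, 0), (1, 1, 0), (0, 0, 0)]:
        print(x, "->", network_i(*x))
    for x in [(1, 1), (1, 0)]:
        print(x, "->", network_ii(*x))
```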

3.a) What are the blocks in a recognition and classification system? Explain them.
b) Draw the architecture of a radial basis function network and explain it. [8+8]

4. Draw the Madaline architecture and explain the MRII algorithm. [16]

5. Compare the performance and architecture of the back-propagation algorithm with those
of full counterpropagation networks. [16]

6. Discuss the energy function and storage capacity of the discrete Hopfield network. [16]
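A small numerical sketch of the energy function in question 6, assuming the standard bipolar discrete Hopfield model with symmetric, zero-diagonal weights and asynchronous threshold updates (the weights below are random and purely illustrative):

```python
# A small sketch, assuming the standard discrete Hopfield model: bipolar states,
# symmetric weights, zero self-connections, asynchronous threshold updates.
import numpy as np

rng = np.random.default_rng(0)
n = 6
W = rng.integers(-1, 2, size=(n, n)).astype(float)
W = (W + W.T) / 2.0                      # make the weights symmetric
np.fill_diagonal(W, 0.0)                 # no self-connections

def energy(x):
    """E(x) = -1/2 x^t W x, the quantity the question asks about."""
    return -0.5 * x @ W @ x

x = rng.choice([-1.0, 1.0], size=n)      # random initial bipolar state
for step in range(20):                   # update one neuron at a time
    i = step % n
    x[i] = 1.0 if W[i] @ x >= 0 else -1.0
    print(f"step {step:2d}  E = {energy(x):6.2f}")   # the printed energies never increase
```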

7.a) What are the three states of an ART network?


b) Explain the training algorithm for the ART network. [8+8]

8. Discuss character recognition using bidirectional associative memory. [16]
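A toy sketch of the outer-product BAM recall loop behind question 8, using made-up bipolar "character" patterns (the specific patterns, sizes, and iteration count are illustrative choices only):

```python
# A toy sketch with made-up bipolar patterns; the outer-product construction
# and the back-and-forth recall loop are the standard BAM recipe.
import numpy as np

def sgn(v):
    return np.where(v >= 0, 1, -1)

A = np.array([[ 1, -1,  1, -1,  1, -1],      # two stored "character" patterns
              [-1,  1, -1,  1, -1,  1]])
B = np.array([[ 1, -1],                      # their associated class codes
              [-1,  1]])

W = A.T @ B                                  # outer-product weight matrix, sum over p of a(p)^t b(p)

a = np.array([ 1, -1,  1, -1,  1,  1])       # noisy version of the first character
b = sgn(a @ W)
for _ in range(5):                           # bidirectional recall until it settles
    a = sgn(b @ W.T)
    b = sgn(a @ W)
print(a, b)                                  # recalled (pattern, class) pair
```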

*****
Code.No: 37378
R05 SET-2

JAWAHARLAL NEHRU TECHNOLOGICAL UNIVERSITY HYDERABAD


IV B.TECH – I SEM REGULAR EXAMINATIONS, JANUARY 2010
ARTIFICIAL NEURAL NETWORKS
(COMMON TO EIE, BME, ETM)
Time: 3 Hours                                                Max. Marks: 80
Answer any FIVE questions
All questions carry equal marks
---

1.a) What are the different neural network architectures? Explain them.
b) Discuss the historical development of neural networks. [8+8]

2.a) Explain the Hebbian learning rule.


b) The vertices of a three-dimensional bipolar binary cube represent the eight states of a
recurrent neural network with three bipolar binary neurons. The equilibrium states are
O1 = [−1 −1 −1]^t and O2 = [1 1 1]^t. Sketch the desirable state transitions between the
vertices. [8+8]
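For part (b), a short enumeration aid, under the assumption that a "desirable" transition moves each cube vertex toward whichever equilibrium state is nearer in Hamming distance:

```python
# A short enumeration aid, assuming a desirable transition moves each vertex
# toward whichever equilibrium state is nearer in Hamming distance.
from itertools import product

O1 = (-1, -1, -1)
O2 = ( 1,  1,  1)

def hamming(a, b):
    return sum(x != y for x, y in zip(a, b))

for v in product((-1, 1), repeat=3):         # the eight cube vertices
    d1, d2 = hamming(v, O1), hamming(v, O2)
    target = O1 if d1 < d2 else O2           # every vertex is strictly closer to one of them
    print(v, "->", target, f"(distances {d1} vs {d2})")
```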

3.a) What is the effect of the steepness of the activation function on back-propagation
learning?
b) Write the back-propagation learning algorithm. [8+8]
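A compact one-hidden-layer back-propagation sketch related to question 3, with the sigmoid steepness written explicitly as lam; the layer sizes, data, and learning rate below are illustrative choices, not prescribed values. Note how lam multiplies every error signal, which is the steepness effect asked about in part (a).

```python
# A compact sketch; layer sizes, data and hyper-parameters are illustrative.
# The sigmoid steepness lam appears as a factor in f'(net) = lam * f * (1 - f),
# so a steeper activation scales every weight update.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(net, lam=1.0):
    return 1.0 / (1.0 + np.exp(-lam * net))

def train_step(x, d, V, W, eta=0.5, lam=1.0):
    """One pattern, one gradient step. V: input-to-hidden weights, W: hidden-to-output."""
    y = sigmoid(V @ x, lam)                        # hidden outputs
    o = sigmoid(W @ y, lam)                        # network outputs
    delta_o = (d - o) * lam * o * (1 - o)          # output error signal
    delta_y = (W.T @ delta_o) * lam * y * (1 - y)  # error signal propagated back to the hidden layer
    W += eta * np.outer(delta_o, y)
    V += eta * np.outer(delta_y, x)
    return float(np.sum((d - o) ** 2))

# Usage: a tiny XOR-style training run (the third input component is a bias of 1).
X = np.array([[0, 0, 1], [0, 1, 1], [1, 0, 1], [1, 1, 1]], dtype=float)
D = np.array([[0], [1], [1], [0]], dtype=float)
V = rng.normal(scale=0.5, size=(4, 3))             # four hidden units
W = rng.normal(scale=0.5, size=(1, 4))
for _ in range(5000):
    err = sum(train_step(x, d, V, W) for x, d in zip(X, D))
print("final squared error:", err)
```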

4. Draw the Madaline architecture and explain the MRI algorithm. [16]

5. Discuss the following learning laws. [16]


a) Winner-takes-all
b) Grossberg instar, and
c) Grossberg outstar
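A compact sketch of the three update rules in question 5, in the forms assumed here (winner-takes-all moves only the winning unit toward the input, the instar update is gated by the unit's own output, and the outstar update is gated by the source activity); the vectors and learning rate are illustrative:

```python
# A compact sketch; the exact notation varies between texts, so the forms
# below are assumptions, and the vectors and learning rate are illustrative.
import numpy as np

rng = np.random.default_rng(1)
eta = 0.1
x = np.array([0.2, 0.8, 0.5])                 # presented input
W = rng.random((4, 3))                        # one weight row per competing unit

# a) Winner-takes-all: only the unit with the largest net input is updated.
win = int(np.argmax(W @ x))
W[win] += eta * (x - W[win])

# b) Grossberg instar (fan-in): each unit's update is gated by its own output y_i,
#    delta_w_i = eta * y_i * (x - w_i).
y = W @ x
W += eta * y[:, None] * (x - W)

# c) Grossberg outstar (fan-out): the outgoing weights of one active source unit
#    move toward a desired output pattern d, delta_w = eta * source * (d - w).
d = np.array([1.0, 0.0, 0.0, 1.0])
source_activity = 1.0
w_out = np.zeros(4)
w_out += eta * source_activity * (d - w_out)

print(W.round(2), w_out.round(2))
```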

6.a) What are the assumptions to be satisfied for a network to form a Hopfield network?
b) What is the physical significance of the energy function used in a Hopfield neural network?
[16]

7.a) What is the effect of the vigilance parameter in ART networks?


b) Discuss how simultaneous linear equations are solved using neural networks.
[16]

8. Discuss how simultaneous linear equations are solved using neural networks. [16]
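One common framing for questions 7(b) and 8, sketched below with an illustrative 2x2 system: treat Ax = b as minimisation of the energy E(x) = ½ ||Ax − b||^2 and let a simple gradient-descent network relax to the solution.

```python
# A sketch of the energy-minimisation framing with an illustrative 2x2 system;
# the step size and iteration count are arbitrary choices.
import numpy as np

A = np.array([[3.0, 1.0],
              [1.0, 2.0]])
b = np.array([9.0, 8.0])

x = np.zeros(2)                          # network state
eta = 0.05
for _ in range(2000):
    x -= eta * A.T @ (A @ x - b)         # gradient of E(x) = 0.5 * ||Ax - b||^2
print(x)                                 # relaxes toward the solution of Ax = b
```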

*****
Code.No: 37378
R05 SET-3

JAWAHARLAL NEHRU TECHNOLOGICAL UNIVERSITY HYDERABAD


IV B.TECH – I SEM REGULAR EXAMINATIONS, JANUARY 2010
ARTIFICIAL NEURAL NETWORKS
(COMMON TO EIE, BME, ETM)
Time: 3 Hours                                                Max. Marks: 80
Answer any FIVE questions
All questions carry equal marks
---

1.a) Discuss the different learning methods in artificial neural networks.


b) What are the limitations of a computer when compared to the human brain? [8+8]

2.a) Write about the perceptron learning rule.


b) A single-neuron network using f(net) = sgn(net) has been trained using the pairs of Xi, di
shown below:
X1 = [1 −2 3 −1]^t, d1 = −1;   X2 = [0 −1 2 −1]^t, d2 = 1;   X3 = [−2 0 −3 −1]^t, d3 = −1
The final weight obtained using the perceptron rule is W4 = [3 2 6 1]^t. Knowing that a
correction has been performed in each step for c = 1, determine the weights W3, W2 and
W1 by backtracking the training. [16]
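A small back-tracking aid for question 2(b), assuming the convention that a correction takes W(k+1) = W(k) + c·d_k·X_k (some texts write the same correction as (c/2)(d − o)X, which reduces to this increment when a correction is made with c = 1):

```python
# A small back-tracking aid. Assumed update when a correction occurs:
# W(k+1) = W(k) + c * d_k * X_k, so back-tracking is W(k) = W(k+1) - c * d_k * X_k.
import numpy as np

c = 1.0
X = [np.array([ 1, -2,  3, -1], dtype=float),
     np.array([ 0, -1,  2, -1], dtype=float),
     np.array([-2,  0, -3, -1], dtype=float)]
d = [-1.0, 1.0, -1.0]

W = np.array([3, 2, 6, 1], dtype=float)      # W4, the final weight vector given above
for k in reversed(range(3)):                 # undo training steps 3, 2, 1 in turn
    W = W - c * d[k] * X[k]
    print(f"W{k + 1} =", W)
```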

3.a) Explain the algorithm for R-category discrete perceptron training.


b) What is the effect of cumulative weight adjustment in back-propagation learning? [16]

4. Explain the Adaline architecture. What are its applications? [16]

5. What are Kohonen's self-organizing maps? Write the architecture and training algorithm
for Kohonen SOMs. [16]
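A minimal one-dimensional SOM training sketch related to question 5, assuming the usual best-matching-unit search with a Gaussian neighbourhood and decaying learning rate and radius; the map size, training data, and decay schedules are illustrative choices:

```python
# A minimal 1-D SOM sketch; map size, data and decay schedules are illustrative.
import numpy as np

rng = np.random.default_rng(0)
n_units, dim, n_steps = 10, 2, 2000
W = rng.random((n_units, dim))                           # weight vector of each map unit

for t in range(n_steps):
    x = rng.random(dim)                                  # training sample (uniform 2-D data)
    bmu = int(np.argmin(np.linalg.norm(W - x, axis=1)))  # best-matching unit
    eta = 0.5 * (1 - t / n_steps)                        # decaying learning rate
    sigma = max(1e-3, 3.0 * (1 - t / n_steps))           # decaying neighbourhood radius
    dist = np.abs(np.arange(n_units) - bmu)              # lattice distance to the winner
    h = np.exp(-(dist ** 2) / (2 * sigma ** 2))          # Gaussian neighbourhood function
    W += eta * h[:, None] * (x - W)                      # pull the winner and its neighbours toward x

print(np.round(W, 2))                                    # neighbouring units end up with nearby weights
```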

6.a) Consider a discrete Hopfield net with


S(1) = (1 −1 1 −1), t(1) = (1 −1)
S(2) = (−1 −1 1 1), t(2) = (−1 1)
Find the weight matrix using the outer product rule.
b) Write about the capacity of the Hopfield neural network. [8+8]
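For part (a), a short sketch of the outer-product (Hebbian) construction W = sum over p of S(p)^t t(p), which is the form assumed here since S(p) and t(p) have different lengths:

```python
# A short sketch of the outer-product construction, W = sum_p S(p)^t t(p).
import numpy as np

S = np.array([[ 1, -1,  1, -1],
              [-1, -1,  1,  1]], dtype=float)   # stored patterns S(1), S(2)
T = np.array([[ 1, -1],
              [-1,  1]], dtype=float)           # associated targets t(1), t(2)

W = sum(np.outer(s, t) for s, t in zip(S, T))   # 4 x 2 weight matrix
print(W)
```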

7. State and explain the bidirectional associative memory (BAM) energy theorem. [16]

8. Explain how the traveling salesman problem can be solved using a Hopfield network.
[16]

*****
Code.No: 37378
R05 SET-4

JAWAHARLAL NEHRU TECHNOLOGICAL UNIVERSITY HYDERABAD


IV B.TECH – I SEM REGULAR EXAMINATIONS, JANUARY 2010
ARTIFICIAL NEURAL NETWORKS
(COMMON TO EIE, BME, ETM)
Time: 3 Hours                                                Max. Marks: 80
Answer any FIVE questions
All questions carry equal marks
---

1.a) Compare the biological neuron with the artificial neuron.


b) Write about the ramp-type activation function. [8+8]

2.a) A recurrent network with three bipolar binary neurons has been trained using the
correlation learning rule with a single bipolar binary input vector in a single training step
only. The training was implemented starting at W0 = 0, for c = 1. The resulting weight
matrix is
W = [  1  −1  −1
      −1   1   1
      −1   1   1 ]
Find the vectors X and d that have been used for training.
b) Discuss the delta learning rule.
[10+6]
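A quick consistency check for part (a): with the correlation rule, a single step from W0 = 0 with c = 1 gives W = d·X^t, so W must factor as an outer product of a bipolar d and a bipolar X. The sketch below verifies one candidate pair against the given matrix (its sign-flipped counterpart works equally well).

```python
# A consistency check: W = c * d * X^t after one correlation-rule step from W0 = 0.
import numpy as np

W_given = np.array([[ 1, -1, -1],
                    [-1,  1,  1],
                    [-1,  1,  1]], dtype=float)

X = np.array([ 1, -1, -1], dtype=float)      # candidate training input
d = np.array([ 1, -1, -1], dtype=float)      # candidate desired output

W_rebuilt = np.outer(d, X)                   # one step, c = 1, starting from W0 = 0
print(np.array_equal(W_rebuilt, W_given))    # True when the candidate pair reproduces W
```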

3.a) Write the single discrete perceptron training algorithm and explain it.
b) Define the following:
i) discriminant function
ii) dichotomizer. [8+8]

4. Compare the Adaline and Madaline algorithms and bring out their salient features. [16]

5. Draw the architecture of LVQ and explain its training algorithm. [16]

6.a) Compare the continuous Hopfield network with the discrete Hopfield network.


b) What are the limitations of Hopfield networks? [8+8]

7.a) Write about the recognition phase in ART networks.


b) What is meant by simulated annealing? [8+8]

8. With neat figures, explain the implementation of an A/D converter using a Hopfield
network. [16]

*****
