The senses of sight and hearing


Let's begin by asking ourselves: what is the purpose of perception? According to
Albert Bregman, our perceptual faculties evolved to allow us to build a useful representation
of reality and to provide the where, when and what of the events around us. To do that, we
rely on the data our brain receives from the different sense organs after they are physically or
chemically stimulated by outside factors. Among the five human senses (sight, hearing, smell,
touch and taste), vision and hearing are considered the most complex because of their
considerable representation within the cortical regions of our central nervous system.
Interestingly, despite their complexity, hearing and vision fail us more often and more
rapidly than do taste, smell or touch. In the following pages I will discuss these two
important senses and compare the differences and similarities between them.
The eye is the organ responsible for the sense of sight. Light enters the eye through the
pupil, a hole in the middle of the iris, and is focused, with help from a transparent layer (the
cornea) and a convergent lens (the crystalline lens), onto the photosensitive layer of the eye
called the retina. The image formed there is smaller and upside down, but the brain interprets
it so that we perceive the world upright. The retina is made of two types of light-sensitive
cells: the cones, which are sensitive to colour, and the rods, which are sensitive to the intensity
of light. Cones are responsible for our photopic vision (when the environment is so bright that
the rods stop functioning) and come in three different sorts, each specialized in sensing long,
medium or short wavelengths (usually known as red-, green- and blue-sensitive cells). Rod
cells, on the other hand, are all of one kind and respond very well in dim light, playing an
important role in our scotopic, or night-time, vision. All the visual information gathered by
the photosensitive cells leaves each eye through the optic nerve; the two optic nerves converge
at a point in the brain called the optic chiasm, where a crossing-over of the pathways takes
place. From there, the information travels to the visual cortex in the occipital lobes, and the
brain produces the sense of sight.
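
To make the optics concrete, here is a minimal Python sketch of the image formation described above, using the thin-lens approximation. The roughly 17 mm effective focal length of the relaxed eye and the example object are illustrative assumptions, not figures taken from this essay.

```python
# Minimal sketch: size of the inverted retinal image under the
# thin-lens approximation. The ~17 mm focal length and the example
# object below are assumed, illustrative values.

EYE_FOCAL_LENGTH_M = 0.017  # approximate focal length of the relaxed eye

def retinal_image_height(object_height_m: float, object_distance_m: float) -> float:
    """Height of the retinal image in metres.

    For a distant object the image forms near the focal plane, so the
    magnification is roughly focal_length / object_distance.
    """
    return object_height_m * EYE_FOCAL_LENGTH_M / object_distance_m

# A 1.8 m tall person seen from 10 m projects an image of about 3 mm,
# smaller and upside down, exactly as described above.
print(f"{retinal_image_height(1.8, 10.0) * 1000:.2f} mm")
```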
The human auditory system is as complex as the visual system. It is responsible for the
sense of hearing and is divided into two parts: the peripheral part and the central part. The
auditory periphery is the first to deal with sound waves and is not part of the nervous system.
As sound waves travel through the air, they first meet the outer ear, which is composed of the
pinna (the visible part of the ear) and the auditory canal. The pinna plays a significant role in
altering sounds, especially at high frequencies, and is very important to our ability to localize
the source of a sound. As the sound travels further, it causes the tympanic membrane, and
then the three ossicles (malleus, incus and stapes), to vibrate. The vibration is then transferred
through the oval window to the inner ear. The inner ear is represented by the cochlea, a
spiral-shaped cavity filled with liquid and containing the basilar membrane, which is displaced
by different amounts along its length depending on the frequency of the vibration. The
movement of the basilar membrane is transferred to the organ of Corti and from there, as
neural signals, to the auditory cortex in the temporal lobe, producing the sensation of hearing.
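
The frequency-to-place mapping just described is commonly modelled with Greenwood's function; the essay does not name a model, so the sketch below, using the standard human parameters, is an assumption added for illustration.

```python
import math

# Hedged sketch: Greenwood's frequency-to-place function for the human
# cochlea, an assumed model for the behaviour described above. With
# these standard constants, position x is the fraction of cochlear
# length measured from the apex (0) to the base (1).

A = 165.4    # Hz, scaling constant for the human cochlea
ALPHA = 2.1  # slope constant when x is a 0..1 proportion of length
K = 0.88     # integration constant

def place_of_max_displacement(frequency_hz: float) -> float:
    """Fraction of the basilar membrane's length (from the apex) at
    which a pure tone of the given frequency peaks."""
    return math.log10(frequency_hz / A + K) / ALPHA

for f in (125, 1000, 8000):
    print(f"{f:>5} Hz -> {place_of_max_displacement(f):.2f} of the way to the base")
```

Low frequencies peak near the apex of the spiral and high frequencies near the base, which is why different points along the membrane are displaced by different amounts.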
At first glance, the differences between hearing and vision far outnumber the
similarities. For example, the ear and the eye derive from different primitive germ layers and
handle different physical spectra. On the other hand, the similarities between them are
often striking. Both organs collect the random stimuli impinging on the organism and sort
those stimuli by giving them temporal and spatial organization, thus complementing our
perception of the world. Moreover, they do so with maximum efficiency, responding to stimuli
so weak that the limiting value is imposed by the nature of the physical stimulus rather than
by the capacity of the organ.
In terms of sensitivity, it is important to mention that neither organ produces zero
sensation when a stimulus of zero intensity is applied: both manufacture their own slight
level of activity. For the eye, an idioretinal grey is perceived even when no light hits the
retina. For the ear, a faint noise is perceived that arises from the movement of different
tissues in the body. This is easily demonstrated in an anechoic room, where noises from the
beating of the heart, sounds covering a wide frequency spectrum produced by mandibular
movements resonating in the ear canal, and even imbalances in the auditory system itself
account for this endaural sound.
In terms of the physical energy transduced, it may seem that the eye is far superior to
the ear, because radiant energy operates at a much lower power level than moving air, which
can be strong enough to be sensed by the skin. However, in the spectral regions where each
organ is most efficient, the eye and the ear are roughly similar. It has been shown that the
eardrum has to move a distance of only one-tenth the diameter of a hydrogen molecule,
requiring about 10⁻⁹ ergs of energy, before a sound is perceived, which is not far from vision,
which requires between 2.2 and 5.7 × 10⁻¹⁰ ergs. These values are approximately the
energy required by a mosquito to raise its wing. The interesting fact is that the sensitivity of
these organs is so high that the minimum energy necessary to stimulate vision is very close to
the magnitude of the ultimate radiation quantum (Hecht), and, for the ear, the energy produced
by the collision of air molecules in random Brownian motion is enough to trigger the
sensation of hearing.
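
As a quick check on these figures, the arithmetic below converts the quoted erg values to joules and estimates how many photons the visual threshold corresponds to. The 510 nm wavelength, roughly the peak of rod sensitivity, is an assumption added here.

```python
# Hedged arithmetic sketch for the thresholds quoted above. The erg
# figures come from the essay; the 510 nm rod-peak wavelength is an
# assumed value used to express the visual threshold in photons.

PLANCK = 6.626e-34     # J*s
LIGHT_SPEED = 2.998e8  # m/s
ERG_TO_JOULE = 1e-7

hearing_threshold_j = 1e-9 * ERG_TO_JOULE        # ~1e-16 J
vision_threshold_j = (2.2e-10 * ERG_TO_JOULE,    # lower bound
                      5.7e-10 * ERG_TO_JOULE)    # upper bound

photon_energy_j = PLANCK * LIGHT_SPEED / 510e-9  # one green photon

print(f"Hearing threshold: {hearing_threshold_j:.1e} J")
print(f"Vision threshold:  {vision_threshold_j[0] / photon_energy_j:.0f}"
      f"-{vision_threshold_j[1] / photon_energy_j:.0f} photons")
```

The result, on the order of fifty to a hundred and fifty photons at the cornea, is consistent with Hecht's classic estimate that only a handful of light quanta need to be absorbed by the rods.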


In terms of reaction time, a study by Thompson et al. has shown that the time needed
to detect a visual stimulus is longer than the time needed to detect an auditory one: visual
reaction time lies between 180 and 200 milliseconds, compared with 140-160 milliseconds
for sound. However, there are also studies, such as one by Verleger, showing that visual
reaction time is much faster than auditory reaction time during and after physical exercise.
Reaction times also differ between genders; research by Engel demonstrated that male
athletes react faster to both auditory and visual stimuli than female athletes do.
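
For illustration, a very crude console version of a simple reaction-time test might look like the sketch below. The foreperiod range and trial count are invented, and a real experiment would control stimulus onset and response capture far more precisely than a terminal can.

```python
import random
import time

# Crude sketch of a simple reaction-time trial: wait an unpredictable
# foreperiod, show a "go" prompt, and time the keypress. The interval
# bounds and number of trials are arbitrary illustrative choices.

def measure_once() -> float:
    time.sleep(random.uniform(1.0, 3.0))  # unpredictable foreperiod
    start = time.perf_counter()
    input("GO! Press Enter: ")
    return time.perf_counter() - start

times = [measure_once() for _ in range(3)]
# The studies above report ~180-200 ms for light and ~140-160 ms for sound.
print(f"Mean reaction time: {1000 * sum(times) / len(times):.0f} ms")
```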
In terms of memory, a study from the University of Iowa, published in PLOS ONE,
suggests that people's memory for auditory information is slightly worse than their visual and
tactile memory. The theory was tested on a group of 100 undergraduate students, who were
exposed to three types of stimuli: sounds, visual elements and things that could be touched.
After a period of time they were asked to recall those stimuli, and the results showed that the
students were worse at remembering the auditory cues than the visual ones. A short-term
memory experiment was also conducted, in which participants were asked to listen to pure
tones, look at different shades of a colour and feel different vibration patterns by gripping a
metal bar. Each set of tones, colours and vibration patterns was separated by a delay of
between one and 32 seconds. Although the students' memory declined as the delay grew, the
decline was much more abrupt for sounds than for visual stimuli. Moreover, in a second
experiment, the researchers tested the students' memory using stimuli they might encounter
on an everyday basis: audio recordings of barking dogs, silent videos of a basketball game
and common objects, such as coffee mugs, that could be touched. As expected, an hour later
the students were least apt to remember the sounds they had heard, while their memory for
the visual and tactile experiences was about the same. These experiments show that the way
we store auditory information may differ from the other senses, and that the way we learn
and memorize things could be dramatically improved by adding visual and tactile experience
to the learning process.
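
A toy model makes the reported pattern concrete. The sketch below assumes exponential forgetting with a faster decay constant for sounds; both the functional form and the constants are invented to mimic the described trend, not parameters from the published study.

```python
import math

# Toy forgetting curves, p(t) = exp(-lambda * t), over the study's
# 1-32 second delays. The decay constants are made up so that the
# auditory curve drops more steeply, as described above.

DECAY_PER_SECOND = {"auditory": 0.060, "visual": 0.020, "tactile": 0.022}

def recall_probability(modality: str, delay_s: float) -> float:
    return math.exp(-DECAY_PER_SECOND[modality] * delay_s)

for delay in (1, 4, 16, 32):
    row = ", ".join(f"{m}: {recall_probability(m, delay):.2f}"
                    for m in DECAY_PER_SECOND)
    print(f"delay {delay:>2} s -> {row}")
```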
Even if there seem to be considerable differences between the ear and the eye, a
UCLA psychology study from December 2011 shows that our senses of hearing and sight
work together far more closely than was previously known. Ladan Shams, author of the
study and UCLA associate professor of psychology, says that sight and hearing influence
and communicate with each other at a more basic level than scientists used to believe.
Hearing and vision are intertwined to such a degree that even when a sound is not relevant
to the task, it still greatly influences the way we see the world. This collaboration between
hearing and sight was compared to the way smell affects taste: when the information from
one sense is ambiguous or too weak, the other sense can step in and clarify the perception.
It has long been assumed, perhaps without evidence, that people who suffer vision
loss become better at discriminating between sounds, their hearing growing more acute as
the brain tries to compensate for the lost visual information. Recently, however, a group of
researchers examined the relationship between vision and hearing and worked out how the
brain rewires itself to adapt to a missing sense. In an article published in the journal Neuron,
neuroscience professor Hey-Kyoung Lee and biologist Patrick Kanold describe a series of
experiments on mice that uncovered how neural connections in the brain areas managing
hearing and vision work to support each sense. Their results showed that the brains of mice
kept for a week in a dark environment, to simulate blindness, underwent a rewiring that made
the animals better at sensing sound. As Lee puts it, "our result would say that not having
vision allows you to hear softer sounds and better discriminate pitch". The findings may
eventually be tested on humans, especially those who have received a cochlear implant, but
it is not yet known how many days a person would have to stay in the dark for their hearing
to benefit, nor whether anyone would be willing to do so. The same principle applies in the
other direction, when hearing is lost: another study, published in The Journal of
Neuroscience, shows that people born deaf use portions of the brain normally responsible
for hearing to process vision. These new findings in neuroplasticity research have
implications not only for the rehabilitation of the blind and the deaf, but also for how the
brain rewires itself after various traumas and brain injuries.
To sum up, the senses of hearing and sight are extremely important to our perception
of the world we live in. Even if they may seem totally different from one another, they work
in harmony and complement each other's blind spots to deliver a full range of perceptual
sensation, awareness and immersion in our surroundings. Moreover, in the delicate situation
where one of the senses is lost, our brain is able to adapt so that the remaining sense
compensates for the missing one, which reinforces my opinion that hearing and sight work
hand in hand and are equally important to our perception.


References
Bates, M. (2012, September 18). Super Powers for the Blind and Deaf. Retrieved December
1, 2014, from http://www.scientificamerican.com/article/superpowers-for-the-blind-and-deaf/

Than, K. (2014, February 5). Temporary Vision Loss Can Boost Hearing. Retrieved
December 1, 2014, from http://www.insidescience.org/content/temporary-vision-loss-can-boost-hearing/1551

Moore, B. (2013). Basic structure and function of the auditory system. In An introduction to
the psychology of hearing (6th ed.).

Sivian, L. J., & White, S. D. (1933). On minimum audible sound fields. The Journal of the
Acoustical Society of America, 4, 288-321.

Adams, R. J., Sheppard, P., Cheema, A., & Mercer, M. E. (2013, December 25). Vision vs.
hearing: Direct comparison of the human contrast sensitivity and audibility functions.
Meeting abstract presented at VSS 2013. Retrieved December 1, 2014.

Snowden, R., & Thompson, P. (2012). The first steps in seeing. In Basic vision: An
introduction to visual perception (Rev. Ed.). Oxford: Oxford University Press.

Thompson, P., Colebatch, J., Brown, P., Rothwell, J., Day, B., Obeso, J., & Marsden, C.
(1992). Voluntary Stimulus-sensitive Jerks and Jumps Mimicking Myoclonus or Pathological
Startle Syndromes. Movement Disorders, 7(3), 257-262.

Verleger, R. (1997). On the utility of P3 latency as an index of mental chronometry.
Psychophysiology, 34(2), 131-156.

Wolpert, S., & Menon, D. (2011, December 8). Sound and vision work hand in hand, UCLA
psychologists report. Retrieved December 1, 2014, from
http://newsroom.ucla.edu/releases/sound-and-vision-work-hand-in-220261


Andrei Porfireanu
Student number: 20136708
Aalborg University Esbjerg
Medialogy, Semester 3
MED3-3-E14
