Chapter 4
Step 1:
Sensation: Detecting external events with sense organs and turning those stimuli into
neural signals.
Step 2:
Perception: Attending, organizing, and interpreting stimuli that we sense.
The raw sensations are turned into information the brain can process through
Transduction.
Transduction: When specialized receptors transform the physical energy of the outside
world into neural impulses.
These neural impulses travel into the brain and influence the activity of different brain
structures, which gives rise to our internal representation of the world.
The transduction of light occurs when it reaches receptors at the back of the eye; light-
sensitive chemicals in the retina then convert this energy into nerve impulses that travel
to brain centres where colour and motion are perceived and objects are identified.
The transduction of sound takes place in a specialized structure in the ear called the
cochlea, where sound energy is converted into neural impulses that travel to the hearing
centres of the brain.
The Doctrine Of Specific Nerve Energies: The idea that the different senses are
separated in the brain (proposed in 1826 by Johannes Müller).
Studies have found that pathways from sensory organs to their brain structures aren’t
fully distinct until age 3. For example, speech activates both the visual and auditory
brain structures in babies. This shows that perception is a skill our brains learn.
Changes in our sensory and perceptual worlds elicit an orienting response, which allows
us to quickly shift our attention to new or altered stimuli.
Psychophysics: The field of study that explores how physical energy such as light and
sound and their intensity relate to the psychological experience.
A popular approach was to measure the minimum amount of a stimulus needed for
detection, and the degree to which a stimulus must change in strength for the change to
be perceptible to people.
Absolute Threshold: The minimum amount of stimulation required for a person to detect
the stimulus 50 percent of the time.
This effect was formalized into an equation by Ernst Weber (1795–1878) called Weber’s
Law.
Weber’s Law: The just noticeable difference between two stimuli is a constant
proportion of the magnitude of those stimuli.
Imagine you’re holding 50 g of candy in your hand. The just noticeable difference is 5 g
(i.e., you can tell the difference between 50 g and 55 g of candy). Your friend hands you
100 g of candy. Weber’s law would suggest that the just noticeable difference would be
10 g. If the just noticeable difference of 50 g is 5 g, and if 100 g is 50 g doubled, then
the just noticeable difference of 100 g should be 5 g doubled: 10 g.
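The candy arithmetic can be sketched in code. This is a minimal illustration that assumes a Weber fraction of 0.10 (the 5 g in 50 g from the example above); real Weber fractions differ across senses and individuals.

```python
# Illustrative sketch of Weber's Law: the just noticeable difference (JND)
# is a constant proportion (the Weber fraction) of the starting stimulus.
# The 0.10 fraction matches the candy example (5 g out of 50 g); it is an
# assumed value, not a universal constant.

def jnd(stimulus_intensity, weber_fraction=0.10):
    """Return the smallest detectable change for a given stimulus intensity."""
    return stimulus_intensity * weber_fraction

print(jnd(50))   # 5.0 g  (50 g of candy)
print(jnd(100))  # 10.0 g (double the stimulus, double the JND)
```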
An individual’s ability to detect a stimulus depends on both their sensory organs and
their current autonomic-nervous-system arousal (being nervous makes you pay more
attention to stimuli).
Priming was found to be unable to create new motivations but capable of enhancing
already existing ones.
The Figure-Ground Principle: A basic Gestalt principle is that objects or “figures” in our
environment tend to stand out against a background.
The text in front of you is a figure set against a background, but you may also consider
the individual letters you see to be figures against the background of the page.
The figure-ground principle applies to hearing as well. When you are holding a
conversation with someone in a crowded party, you are attending to the figure (the
voice of the individual) against the background noise (the ground).
Exactly which object is the figure and which is the ground at any given moment depends
on many factors, including what you are motivated to pay attention to.
Proximity and Similarity are two additional Gestalt principles that influence perception.
Proximity: The tendency to treat two or more objects that are in close proximity to each
other as a group (e.g., because of their proximity, people standing next to each other in a
photograph are assumed to be a group).
Similarity: The tendency to group together objects based on their visual similarity (e.g.,
this can be experienced by viewing groups of people in uniform, such as two different
teams on a soccer field).
Continuity, or “Good Continuation”: The perceptual rule that lines and other objects tend
to be continuous, rather than abruptly changing direction.
Phonetic Reversal: Where a word pronounced backwards sounds like another word
(e.g., dog and god).
People were found to be unable to distinguish the original content of speech when it is
played backwards.
However, once participants knew about the supposed “backwards message” in songs
they were able to perceive it. This is known as Top-Down Processing.
Divided Attention has been shown to lower performance in both laboratory and real-
world studies.
Selective Attention increases performance at the particular task but also lowers
perception of your surroundings.
To ensure that this sequence of events begins correctly, the eye needs specialized
structures that allow us to regulate how much light comes in, to respond to different
wavelengths of light, to maintain a focus on the most important objects in a scene, and
to turn physical energy into action potentials, the method by which information is
transmitted in the brain.
Long wavelengths are seen as red, short wavelengths as blue, and medium
wavelengths as green.
Amplitude corresponds to perceived intensity: low-amplitude waves are seen as dim
colours, whereas high-amplitude waves are seen as bright colours.
Light waves also vary in purity, that is, how many different wavelengths are present at
once. You see the colour corresponding to the dominant wavelength, and the more a
single wavelength dominates the mixture, the more saturated the colour appears.
Cornea: The clear layer that covers the front portion of the eye and also contributes to
the eye’s ability to focus.
Pupil: Regulates the amount of light that enters by changing its size; it dilates (expands)
to allow more light to enter and constricts (shrinks) to allow less light into the eye.
Iris: A round muscle that adjusts the size of the pupil; it also gives the eyes their
characteristic colour.
Lens: A clear structure that focuses light onto the back of the eye.
Accommodation: When the lens changes its shape to ensure that the light entering the
eye is refracted in such a way that it is focused when it reaches the back of the eye.
Transduction: When the light reaches the back of the eye, it will stimulate a layer of
specialized receptors that convert light into a message that the brain can then interpret.
Retina: Lines the inner surface of the back of the eye and consists of specialized
receptors that absorb light and send signals related to the properties of light to the brain.
Photoreceptors: Where light will be transformed into a neural signal that the brain can
understand.
Photoreceptors are located at the back of the eye so they are protected and have a
constant blood supply.
Glial Cells: Located at the front of the retina, these cells help gather and guide light to
targeted areas of the retina, optimizing our ability to see colour in the daytime.
Ganglion Cells: Cells in the front of the retina that gather information from the
photoreceptors.
This information will then alter the rate at which the ganglion cells fire. The activity of all
of the ganglion cells is then sent out of the eye through the optic nerve.
Because the optic nerve exits at the back of the eye, there is an area called the optic
disk that contains no photoreceptors. This creates a blind spot.
Invoke Perception: The visual areas of the brain are able to “fill in” the missing
information for us.
4.2.4 The Retina: From Light To Nerve Impulse
There are two types of photoreceptors: rods and cones.
Rods: Photoreceptors that occupy peripheral regions of the retina; they are highly
sensitive under low light levels. Rods are particularly responsive to black and grey.
Cones: Photoreceptors that are sensitive to the different wavelengths of light that we
perceive as colour. Cones tend to be clustered around the fovea, the central region of
the retina.
When rods and cones are stimulated by light their physical structure briefly changes
which then alters the activity of neurons in the different layers of the retina.
The final layer to receive this changed input consists of ganglion cells, which eventually
output to the optic nerve.
The ratio of ganglion cells to cones in the fovea is approximately one to one; in contrast,
there are roughly 10 rods for every ganglion cell. So, all of the input from a cone is
clearly transmitted to a ganglion cell, whereas the input from a rod must compete with
input from other rods.
Cones are clustered in the fovea (i.e., at the centre of our visual field) and have a one-
to-one ratio with ganglion cells, while rods are limited to the periphery of the retina and
have a ten-to-one ratio with ganglion cells.
These differences help explain why colourful stimuli are often perceived as sharp
images while shadowy grey images are perceived as being hazy or unclear.
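The acuity difference can be illustrated with a toy sketch of the convergence ratios above, assuming an idealized 1:1 cone wiring and a 10:1 rod pooling; the light/dark pattern is invented purely for illustration.

```python
# Sketch of why convergence reduces acuity: a ganglion cell that pools many
# rod inputs averages them, blurring fine detail, while a 1:1 cone-to-ganglion
# wiring passes each point through unchanged. Values are illustrative only.

pattern = [0, 1, 0, 1, 0, 1, 0, 1, 0, 1]  # fine light/dark detail on the retina

cone_output = pattern                       # 1:1 wiring preserves every point
rod_output = [sum(pattern) / len(pattern)]  # ~10 rods pooled into one ganglion cell

print(cone_output)  # detail preserved -> a sharp image
print(rod_output)   # detail averaged away -> a hazy, unclear image
```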
Dark Adaptation: The process by which the rods and cones become increasingly
sensitive to light under low levels of illumination.
During dark adaptation, the photoreceptors are slowly regenerated after having been
exposed to light. Cones regenerate more quickly than rods, often within about 10
minutes. However, after this time, the rods become more sensitive than the cones. We
do not see colour at night or in darkness because rods are more active than cones
under low light levels.
4.2.5 The Retina And The Perception Of Colours
Currently, two theories exist to explain how cells and photoreceptors in the eye can
produce colourful experiences.
One theory, the Trichromatic Theory, suggests that three different types of cones exist,
each of which is sensitive to a different range of wavelengths on the electromagnetic
spectrum. The theory was first proposed in the early 19th century by physicist Thomas
Young and later developed independently by Hermann von Helmholtz.
The relative responses of the three types of cones allow us to perceive the many
different colours that comprise the spectrum.
In the 19th century, Ewald Hering proposed the Opponent-Process Theory of colour
perception.
This type of perception is consistent with the activity patterns of retinal ganglion cells.
A cell that is stimulated by red is inhibited by green; when red is no longer perceived (as
when you suddenly look at a white wall), a “rebound” effect occurs. Suddenly, the
previously inhibited cells that fire during the perception of green are free to fire, whereas
the previously active cells related to red no longer do so. The same relationship occurs
for yellow and blue as well as for white and black.
Nearsightedness or Myopia: Occurs when the eyeball is slightly elongated, causing the
image that the cornea and lens focus on to fall short of the retina.
People who are nearsighted can see objects that are relatively close up but have
difficulty focusing on distant objects.
Farsightedness or Hyperopia: Occurs when the length of the eye is shorter than normal.
In this case, the image is focused behind the retina.
Farsighted people can see distant objects clearly but not those that are close by.
Laser Eye Surgery: Reshaping the cornea using a laser to focus incoming light on the
retina.
The first major destination is the optic chiasm, the point at which the optic nerves cross
at the midline of the brain.
For each optic nerve, about half of the nerve fibres travel to the same side of the brain
(ipsilateral), and half of them travel to the opposite side of the brain (contralateral).
The outside half of the retina (closest to your temples) sends its optic nerve projections
ipsilaterally. In contrast, the inside half of the retina (closest to your nose) sends its optic
nerve projections contralaterally.
The result of this distribution is that the left half of your visual field is initially processed
by the right hemisphere of your brain, whereas the right half of your visual field is initially
processed by the left hemisphere of your brain.
Fibres from the optic nerve first connect with the thalamus, the brain’s “sensory relay
station.”
The thalamus is made up of over 20 different nuclei with specialized functions. The
lateral geniculate nucleus is specialized for processing visual information. Fibres from
this nucleus send messages to the visual cortex, located in the occipital lobe, where the
complex processes of visual perception begin.
This division of our visual system, the ventral stream, performs a critical function: object recognition.
Groups of neurons in the temporal lobe gather shape and colour information from
different regions of the secondary visual cortex and combine it into a neural
representation of an object.
Studies of patients with damage to this stream of vision have revealed dramatic
impairments in object recognition. Other studies have noted that different
categories of objects - such as tools, animals, and instruments - are represented in
distinct areas of the anterior temporal lobes.
One group of stimuli - possibly the most evolutionarily important one in our visual world -
may have an entire region of the brain dedicated to its perception.
Perceptual Constancy: The ability to perceive objects as having constant shape, size,
and colour despite changes in perspective.
Shape Constancy: Perceiving an object’s shape as constant by judging the angle of the object relative to our position.
Size Constancy: Based on judgments of how close an object is relative to one’s position
as well as to the positions of other objects.
Brain imaging studies have corroborated the location of the “face area” of the brain in
the bottom right temporal lobe.
Using fMRI, researchers have consistently detected activity in this region, now known
as the fusiform face area (FFA). The FFA responds more strongly to the entire face than
to individual features; unlike other types of stimuli, faces are processed holistically
rather than as a nose, eyes, ears, chin, and so on.
However, the FFA shows a much smaller response when we perceive inverted (upside-
down) faces. In this case, people tend to perceive the individual components of the face
(e.g., eyes, mouth, and so on) rather than perceiving the faces as a holistic unit.
Linear Perspective: When all parallel lines (orthogonals) converge in a single vanishing
point on the horizon line.
Convergence: A type of binocular depth cue. Occurs when the eye muscles contract so
that both eyes focus on a single object.
Convergence typically occurs for objects that are relatively close to you.
One reason humans have such a fine-tuned ability to see in three dimensions is that
both of our eyes face forward. This arrangement means that we perceive objects from
slightly different angles, which in turn enhances depth perception.
Retinal Disparity (or Binocular Disparity): The difference in the relative position of an
object as seen by both eyes, which provides information to the brain about depth.
Most primates, including humans, have stereoscopic vision, which results from
overlapping visual fields. The brain can use the difference between the information
provided by the left and right eye to make judgments about the distance of the objects
being viewed.
Monocular Cues: Depth cues that we can perceive with only one eye.
Motion Parallax: A monocular cue that arises when you or your surroundings are in motion; nearby objects appear to move across your visual field faster than distant ones.
4.3 The Auditory and Vestibular Systems
4.3.1 Sound And The Structures Of The Ear
The function of the ear is to gather sound waves. The function of hearing is to extract
some sort of meaning from those sound waves.
4.3.2 Sound
Sound waves are changes in mechanical pressure transmitted through solids, liquids, or
gases. Sound waves have two important characteristics: frequency and amplitude.
Frequency: The number of cycles a sound wave completes per second, measured in
hertz (Hz); frequency is inversely related to wavelength.
High-frequency sounds, such as tires screeching on the road, have short wavelengths
and a high pitch. Low-frequency sounds, such as those produced by a bass guitar, have
long wavelengths and a low pitch.
High-amplitude sound waves are louder than low-amplitude waves. Both types of
information are gathered and analyzed by our ears.
Humans are able to detect sounds in the frequency range from 20 Hz to 20 000 Hz.
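The trade-off between frequency and wavelength can be made concrete with back-of-envelope arithmetic, using wavelength = speed / frequency. The speed of sound in air (~343 m/s) is an assumed round figure, not a value from the text.

```python
# Wavelength and frequency are inversely related: wavelength = speed / frequency.
# Applying this to the limits of human hearing (20 Hz to 20 000 Hz), with the
# approximate speed of sound in air at room temperature.

SPEED_OF_SOUND = 343.0  # m/s in air, approximate assumed value

def wavelength(frequency_hz):
    """Wavelength in metres for a sound of the given frequency in air."""
    return SPEED_OF_SOUND / frequency_hz

print(wavelength(20))      # ~17 m    (lowest audible pitch: long wavelength)
print(wavelength(20_000))  # ~1.7 cm  (highest audible pitch: short wavelength)
```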
The most noticeable part of your ear is the pinna, the outer region that helps channel
sound waves to the ear and allows you to determine the source or location of a sound.
Sound waves reaching the eardrum cause it to vibrate. Even very soft sounds, such as
a faint whisper, produce vibrations of the eardrum.
The middle ear consists of three tiny moveable bones called ossicles, known
individually as the malleus (hammer), incus (anvil), and stapes (stirrup).
The eardrum is attached to these bones, so any movement of the eardrum due to sound
vibrations results in the movement of the ossicles.
Cochlea: A fluid-filled structure in the inner ear that is coiled in a snail-like shape and
contains the structures that convert sound into neural impulses.
The pressing and pulling action of the ossicles moves the fluid within the cochlea; the
moving fluid causes parts of the basilar membrane to flex, displacing the tiny hair cells
that sit on it.
When hair cells move, they stimulate the cells that comprise the auditory nerves. These
auditory nerves send signals to the thalamus, the sensory relay station of the brain, and
then to the auditory cortex, located within the temporal lobes.
There are two ways that we localize sound. First, we take advantage of the slight time
difference between a sound hitting both ears to estimate the direction of the source.
Second, we localize sound by using differences in the intensity in which sound is heard
by both ears, a phenomenon known as a sound shadow.
Nuclei in the brainstem detect differences in the times when sound reaches the left
versus the right ear, as well as the intensity of the sound between one side and the
other, allowing us to identify where it is coming from.
The Volley Principle: Groups of neurons fire in alternating (hence the term volley)
fashion. A sound measuring 5000 Hz can be perceived because groups of neurons fire
in rapid succession.
Cells within different areas across the auditory cortex respond to specific frequencies.
For example, high musical notes are processed at one end of the auditory cortex, and
progressively lower notes are heard as you move to the opposite end.
As in the visual system, the primary auditory cortex is surrounded by brain regions that
provide additional sensory processing. This secondary auditory cortex helps us to
interpret complex sounds, including those found in speech and music. Interestingly, the
auditory cortices in the two hemispheres of the brain are not equally sensitive.
We are not born with a fully developed auditory cortex. Researchers have identified a
number of different changes in the brain’s responses to sounds during the course of
development:
- Brain imaging studies have shown that infants as young as three months of age
are able to detect simple changes in pitch
- Between four and six months of age, infants can detect silent gaps in a tone (an
ability that may help us learn languages)
- Infants develop the ability to localize sound at approximately eight months of age
- By 12 months of age, the auditory system starts to become specialized for the
culture in which the infant is living
- Infants who are 10 to 12 months of age do not recognize sound patterns that are
not meaningful in their native language or culture; children in this age group show
different patterns of brain activity when hearing culturally familiar and unfamiliar
sounds
This brain plasticity explains why many of us have difficulty hearing fine distinctions in
the sounds of languages we are exposed to later in life.
4.3.7 Sensation And The Vestibular System
Vestibular System: A sensory system in the ear that provides information about spatial
orientation of the head as well as head motion.
Vestibular Sacs: Structures that influence your ability to detect when your head is no
longer in an upright position.
This section of your vestibular system is made up of two parts: the utricle (“little pouch”)
and the saccule (“little sac”).
The bottom of both of these sacs is lined with cilia (small hair cells) embedded in a
gelatinous substance. When you tilt your head, the gelatin moves and causes the cilia
to bend. This bending of the cilia opens up ion channels, leading to an action potential.
Semicircular Canals: Three fluid-filled canals found in the inner ear that respond when
your head moves in different directions (up-down, left-right, forward-backward).
At the base of each of these canals is an enlarged area called the ampulla. The neural
activity within the ampulla is similar to that of the vestibular sacs - cilia (hair cells) are
embedded within a gelatinous mass. When you move your head in different directions,
the gelatin moves and causes the cilia to bend. This bending, again, makes an action
potential more likely to occur.
Although it may seem as though the vestibular system would fire only when we move
our heads in different directions, the vestibular sacs and semicircular canals actually
provide the brain with a continuous flow of information about the head’s position and
movement.
These two parts of the vestibular system send information along the vestibular ganglion,
a large nerve fibre, to nuclei in the brainstem. Vestibular nuclei can then influence
activity in a number of brain areas.
The vestibular nuclei also project to part of the insula, an area of the cortex that is
folded in the interior of the brain. This region helps us link together visual,
somatosensory, and vestibular information.
4.4 Touch And The Chemical Senses
4.4.1 The Sense Of Touch
Our sense of touch depends on the actions of several types of receptors located
just beneath the surface of the skin, as well as in the muscles, joints, and tendons.
These receptors send information to the somatosensory cortex in the parietal lobes of
the brain, the neural region associated with your sense of touch.
One simple method of testing sensitivity, or acuity, is to use the two-point threshold test.
Regions with high acuity, such as the fingertips, can detect the two separate, but closely
spaced, pressure points of the device, whereas less sensitive regions such as the lower
back will perceive the same stimuli as only one pressure point.
Body parts such as the fingertips, palms, and lips are highly sensitive to touch
compared to regions such as the calves and forearm. Research has shown that women
have a slightly more refined sense of touch than men, likely because their smaller
fingers pack their touch receptors more densely.
The sensitivity of different parts of the body also influences how much space in the
somatosensory cortex is dedicated to analyzing each body part’s sensations. Regions
of the body that send a lot of sensory input to the brain such as the lips have taken over
large portions of the somatosensory cortex while less sensitive regions like the thigh
use much less neural space.
Our skin, teeth, corneas, and internal organs contain nerve endings called nociceptors,
which are receptors that initiate pain messages that travel to the central nervous
system. Nociceptors come in varieties that respond to various types of stimuli - for
example, to sharp stimulation, such as a pin prick, or to extreme heat or cold.
Two types of nerve fibres transmit pain messages. Fast fibres register sharp, immediate
pain. Slow fibres register chronic, dull pain.
Although both slow and fast fibres eventually send input to the brain, these impulses
first must travel to cells in the spinal cord; the firing of neurons within the spinal cord will
influence how this pain is experienced.
According to Gate-Control Theory, cells in the spinal cord regulate how much pain
signalling reaches the brain. The spinal cord serves as a “neural gate” that pain
messages must pass through.
The spinal cord contains small nerve fibres that conduct pain messages and larger
nerve fibres that conduct other sensory signals such as those associated with rubbing,
pinching, and tickling sensations. Stimulation of the small pain fibres results in the
experience of pain, whereas the larger fibres inhibit pain signals so that other sensory
information can be sent to the brain. Thus, the large fibres close the gate that is opened
by the smaller fibres.
According to gate-control theory, if you stub your toe, rubbing the area around the toe
may alleviate some of the pain because the large fibres carrying the message about
touch inhibit the firing of smaller fibres carrying pain signals.
Our experience of pain obviously involves input from the spinal cord to the
somatosensory cortex; this input provides our brain with information about the location
of the aversive stimulation.
● Expectations and memory can both increase (or decrease) your feelings of pain.
● Attention can influence how painful a stimulus seems.
● Pain is also related to emotions; negative emotions increase the perception of
pain, while positive emotions can provide a buffer against it.
This interaction is why the same painful stimulus might rate as a 5/10 on a pain scale
one day and as a 7/10 another day - cognitive and emotional factors likely differed
between the two days.
4.4.3 Phantom Limb Pain
Phantom limb sensations are frequently experienced by amputees, who report pain and
other sensations coming from the absent limb.
After limb amputation, the area of the somatosensory cortex formerly associated with
that body part is no longer stimulated by the lost limb. Healthy nerve cells become
hypersensitive when they lose connections.
Taste is registered primarily on the tongue, where roughly 9000 taste buds reside. On
average, approximately 1000 taste buds are also found throughout the sides and roof of
the mouth.
Sensory neurons that transmit signals from the taste buds respond to different types of
stimuli, but most tend to respond best to a particular taste. Our experience of taste
reflects an overall pattern of activity across many neurons and generally comes from
stimulation of the entire tongue rather than just specific, localized regions.
The middle of the tongue has very few taste receptors, giving it a similar character to
the blind spot on the retina. We do not feel or sense the blind spot of the tongue
because the sensory information is filled in, just as we find with vision.
Taste receptors replenish themselves every 10 days throughout the lifespan, making
them the only type of sensory receptor to do so.
Receptors for taste are located in the visible, small bumps (papillae) that are distributed
over the surface of the tongue. The papillae are lined with taste buds.
The bundles of nerves that register taste at the taste buds send the signal through the
thalamus and on to higher-level regions of the brain, including the gustatory cortex; this
region is located in the back of the frontal lobes and extends inward to the insula (near
the top of the temporal lobe). Another region, the secondary gustatory cortex, processes
the pleasurable experiences associated with food.
4.4.5 The Olfactory System: Smell
The Olfactory System: Is involved in smell - the detection of airborne particles with
specialized receptors located in the nose.
Our sensation of smell begins with nasal airflow bringing in molecules that bind with
receptors at the top of the nasal cavity.
Within the nasal cavity is the olfactory epithelium, a thin layer of cells lined with sensory
receptors. These receptors have cilia, tiny hair-like projections containing specialized
proteins that bind with the airborne molecules that enter the nasal cavity.
Humans have roughly 1000 different types of odour receptors in their olfactory system
but can identify approximately 10 000 different smells.
These groups of cilia then transmit messages directly to neurons that converge on the
olfactory bulb on the bottom surface of the frontal lobes, which serves as the brain’s
central region for processing smells. (Unlike our other senses, olfaction does not involve
the thalamus.)
The olfactory bulb connects with several regions of the brain through the olfactory tract,
including the limbic system (emotion) as well as regions of the cortex where the
subjective experience of pleasure (or disgust) occurs.
4.4.7 Synesthesia
Neuroimaging studies have provided some insight into this condition. For instance, one
research group tested synesthetes who have specific colour perceptions that appear
whenever they read a number (e.g., every time they see “2”, it appears with a yellow
border). These researchers found activity in areas of the brain related to colour
perception in synesthetes, but not in non-synesthetes.
More recent studies suggest that the brains of people with synesthesia may contain
networks that link different sensory areas in ways not found in other people.