LECTURE--Psych Class Chapter 4--Sensation and Perception: Vision, Thresholds, and Visual Processing (Lecture Notes)
Perception and Sensation: Key Concepts
Sensation vs. perception
Sensation: bottom-up process of detecting environmental stimuli via sensory organs.
Perception: interpretation and meaningful experience of those stimuli by the brain.
Not a simple one-to-one mapping: a small change in a stimulus does not always produce a proportional change in perception.
Transduction
Sensory organs convert physical energy (light, sound, etc.) into neural signals (action potentials) that the brain can process.
Thresholds and sensitivity
Absolute threshold: the minimum stimulus intensity that can be detected 50% of the time.
Detection thresholds are not fixed; they vary with context, attention, motivation, and adaptation.
Sensitivity: the ease with which one can detect a stimulus; conceptually inversely related to threshold.
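The 50%-detection definition can be sketched as a psychometric function. A minimal sketch in Python; the logistic shape and the slope value are illustrative assumptions, not part of the lecture:

```python
import math

def detection_probability(intensity, threshold, slope=10.0):
    """Logistic psychometric function: probability of detecting a stimulus.

    By definition, detection probability is exactly 0.5 when the
    intensity equals the absolute threshold; the slope controls how
    quickly detection rises around that point (assumed value here).
    """
    return 1.0 / (1.0 + math.exp(-slope * (intensity - threshold)))

print(detection_probability(0.2, threshold=0.2))        # 0.5 at the absolute threshold
print(detection_probability(0.3, threshold=0.2) > 0.5)  # True: easier to detect
```

Lowering the threshold shifts the whole curve left, which is the inverse relationship between sensitivity and threshold noted above.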
Filtering and attention:
The brain acts as a data-reduction machine, filtering out irrelevant information (e.g., background noise) to focus on task-relevant input.
Attention modulates which information gets processed more deeply.
Dynamic nature of sensation and perception
The transition from sensation to perception is dynamic and context-dependent.
Individual differences and biases (e.g., your own name in a noisy room) influence what you detect.
Adaptation
Neurons adapt to constant stimulation, reducing responsiveness to unchanging stimuli.
Examples: ignoring a persistent fan noise, dark/light adaptation in the visual system, olfactory adaptation to a scent.
Adaptation helps the brain emphasize novel or important information over background noise.
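A common simplified model treats adaptation as an exponential decay of the neural response toward a low baseline under constant stimulation; the time constant and baseline values below are illustrative assumptions:

```python
import math

def adapted_response(initial_response, t, tau=2.0, baseline=0.1):
    """Neural response t seconds into a constant stimulus.

    The response starts at initial_response and decays exponentially
    toward a low baseline with time constant tau (both assumed values),
    mimicking how a persistent fan noise fades from awareness.
    """
    return baseline + (initial_response - baseline) * math.exp(-t / tau)

print(adapted_response(1.0, 0.0))   # 1.0 -> full response at stimulus onset
print(adapted_response(1.0, 10.0))  # near the 0.1 baseline after sustained exposure
```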
Examples and everyday relevance
In a loud room, a whisper is hard to detect unless its volume increases proportionally with background noise (Weber’s Law).
In a quiet room, soft speech can be detected with a small ΔI; in loud environments, larger ΔI is needed.
Perception is influenced by context, goals, motivation, and prior experience.
Weber’s Law and Just Noticeable Difference (JND)
Weber’s Law basics
For a change in a stimulus to remain equally detectable, the size of that change must scale with the overall stimulus intensity.
Mathematical form (Weber’s Law): ΔI / I = k, where:
ΔI is the just noticeable difference (JND),
I is the baseline stimulus intensity,
k is a constant (the Weber fraction).
Example interpretation from lecture
In a quiet room, background noise I ≈ 0.2 and a detectable signal increment ΔI ≈ 0.1; this yields k = ΔI / I = 0.1 / 0.2 = 0.5.
If background noise increases, the required ΔI to maintain the same detection level increases proportionally (you’d need a larger signal change to notice a difference).
Practically: a whisper in a very loud environment must be made much louder before any change is noticed, because the required increment grows in proportion to the background level.
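The scaling can be sketched with the lecture’s numbers (I = 0.2, ΔI = 0.1, so k = 0.5); real Weber fractions differ by sense and task:

```python
def jnd(intensity, k=0.5):
    """Just noticeable difference under Weber's Law: deltaI = k * I.

    k = 0.5 comes from the lecture's quiet-room example (0.1 / 0.2);
    actual Weber fractions vary across senses and conditions.
    """
    return k * intensity

print(jnd(0.2))  # 0.1 -> quiet room: a small increment is noticeable
print(jnd(2.0))  # 1.0 -> tenfold louder background needs a tenfold larger increment
```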
Implications
Thresholds are not fixed; they depend on the level of background information and competing stimuli.
The relationship between sensitivity and thresholds underpins how we experience attention, learning, and perception in noisy environments.
Sensory Adaptation, Attention, and Real-World Filtering
Attention as a selective filter
Attention determines what information the brain continues to process for perception and action.
Example: you can still hear environmental sounds (fan hum) but filter them unless they become relevant.
Adaptation across senses
Vision: dark adaptation when entering a dark theater; your eyes adjust to low light over several minutes (full rod adaptation can take 20–30 minutes) before detail emerges.
Olfaction: a scent fades from awareness within minutes due to adaptation; sensitivity returns if you leave and come back.
Perceptual importance and experience
The brain prioritizes information related to task goals and relevance, shaping what you perceive as important.
Perceptual experience is shaped by prior exposure and learned associations (background knowledge changes perception).
Synesthesia (cross-modal perception)
Definition: blending across sensory modalities (e.g., colors associated with sounds, tastes, or names).
Typical features: colors linked to numbers/letters, sounds with textures or tastes.
Neuroimaging evidence: activation across multiple sensory areas during synesthetic experiences, not just a single modality.
Implications for memory and associative learning: cross-modal links can create rich, durable associations between stimuli.
The Visual System: Anatomy and Transduction
Key structures and their roles
Cornea: the transparent outer layer that helps focus entering light.
Iris and pupil: regulate the amount of light entering the eye via pupil dilation/constriction.
Lens: adjusts focus (accommodation) to project a sharp image on the retina.
Retina: back surface where photoreceptors (rods and cones) transduce light into neural signals.
Optic nerve: transmits retinal signals to the brain; exits the eye at the blind spot.
Thalamus (Lateral Geniculate Nucleus, LGN): relay center to the visual cortex.
Primary visual cortex (V1): retinotopically organized processing of basic visual features.
Photoreceptors: rods and cones
Rods: high sensitivity to light; function well in dim conditions; more peripheral; contribute to motion and silhouette detection.
Cones: color and high visual acuity; dense in the fovea (central retina); responsible for sharp detail and color perception.
Distribution and function
Rods outnumber cones; rods dominate peripheral retina; cones concentrated in the fovea to support high-acuity vision.
Fovea: high cone density; central vision with rich color/detail.
Peripheral retina: more rods; crucial in low-light detection and coarse shape.
Retinal arrangement and convergence
Cones tend to have more direct, one-to-one connections (less convergence) → higher acuity.
Rods converge more onto intermediate neurons → greater sensitivity in low light but reduced acuity.
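The convergence trade-off can be sketched numerically; the pool size of 20 rods and the detection threshold are made-up illustrative values:

```python
def ganglion_output(photoreceptor_signals):
    """Signal reaching one ganglion cell: the sum of every photoreceptor
    that converges onto it."""
    return sum(photoreceptor_signals)

DETECTION_THRESHOLD = 0.5  # illustrative firing threshold (assumed value)

# Rod-like wiring: many rods (here 20) feed one ganglion cell.
rod_pool = ganglion_output([0.05] * 20)  # pooled signal ~1.0 crosses threshold,
                                         # but 20 locations share one cell (low acuity)
# Cone-like wiring: roughly one cone per ganglion cell.
cone_signal = ganglion_output([0.05])    # 0.05 stays below threshold in dim light,
                                         # but each location keeps its own cell (high acuity)
print(rod_pool > DETECTION_THRESHOLD, cone_signal > DETECTION_THRESHOLD)  # True False
```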
The blind spot
The optic nerve exits the retina at a location with no photoreceptors, creating a blind spot.
Brain fills in gaps using surrounding information and prior experience to maintain a continuous perceptual field.
From eye to brain: retinotopy and cortical magnification
Retinotopic organization: spatial mapping from retina to visual cortex preserves spatial relationships.
Cortical magnification: foveal inputs occupy disproportionately large areas of V1, reflecting high acuity processing for central vision.
Visual pathways for knowing and doing
Ventral stream (the “what” pathway): visual recognition and identification of objects; linked to memory and knowledge.
Dorsal stream (the “where/how” pathway): spatial awareness and guiding actions; supports navigation and interaction with objects in space.
Integration of dorsal and ventral streams enables both recognition and appropriate action in real-world tasks (e.g., finding Waldo requires knowing what Waldo is and where he is in the scene).
Color Vision: Wavelengths, Pigments, and Perception
Physical properties that drive visual perception
Wavelength: primarily determines color perception; different wavelengths correspond to different color experiences.
Amplitude/Intensity: determines perceived brightness.
Color perception arises from the brain’s interpretation of combinations of wavelengths and their intensities.
Visible spectrum and limits
Humans transduce a subset of the electromagnetic spectrum (visible spectrum); wavelengths outside this range are not perceived because our photoreceptors are not responsive to them.
Color as a perceptual construct
Color does not exist as a separate physical property in the external world; it is a brain-mediated interpretation of light information.
Color vision is subject to context, adaptation, and individual differences in perception.
Color vision deficiencies
Photoreceptor differences (e.g., missing or malfunctioning pigments) lead to color vision deficiencies.
Color vision deficiencies follow systematic patterns rather than random ones (e.g., for some individuals with red-green deficiency, certain greens appear brownish).
The photopigments and the cones
Three cone photopigments with peak sensitivities to short, medium, and long wavelengths (roughly the blue, green, and red regions of the spectrum).
Trichromatic theory: color perception arises from the relative activation of these three cone types.
Pixel-level color representation and RGB color model
At the retinal/early processing level, color information can be understood in terms of three primary channels (red, green, blue).
Additive color mixing: combining varying intensities of red, green, and blue produces the wide range of perceivable colors.
CRTs and modern displays illustrate RGB subpixels; color on screen is created by mixing these three primary colors.
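Additive mixing can be sketched directly as channel-wise summation of light intensities, clamped to a display maximum of 255:

```python
def add_lights(c1, c2):
    """Additively mix two lights given as (R, G, B) intensities 0-255.

    Additive mixing sums the intensities channel by channel; the clamp
    models a display's maximum subpixel brightness.
    """
    return tuple(min(255, a + b) for a, b in zip(c1, c2))

red, green, blue = (255, 0, 0), (0, 255, 0), (0, 0, 255)

print(add_lights(red, green))                    # (255, 255, 0) -> yellow
print(add_lights(add_lights(red, green), blue))  # (255, 255, 255) -> white
```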
The two major theories of color processing
Trichromatic theory (cone-level): explains how the eye encodes wavelength information via three cone types.
Opponent-process theory (perceptual-level): explains how the brain interprets color in terms of opposing pairs (e.g., red-green, blue-yellow) and accounts for afterimages and color adaptation effects.
Current understanding: both theories are correct, but apply at different levels (eye vs brain) and describe different aspects of color processing.
Afterimages and mutual inhibition
Afterimages (e.g., staring at a colored patch then viewing a neutral field) illustrate opponent-process interactions and color opponency.
Mutual inhibition: activity in one color pathway suppresses its opponent, shaping perceptual experience when stimuli are removed or adapted.
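A toy opponent transform makes the pairing concrete; the channel formulas below are a common textbook simplification, not a physiological model:

```python
def opponent_channels(r, g, b):
    """Map cone-like (R, G, B) activations onto simplified opponent channels.

    red_green opposes the R and G inputs, blue_yellow opposes B against
    the R+G average ("yellow"), and luminance sums all three. The exact
    formulas are illustrative assumptions.
    """
    red_green = r - g
    blue_yellow = b - (r + g) / 2
    luminance = r + g + b
    return red_green, blue_yellow, luminance

# Staring at a red patch fatigues the red pole of the red-green channel;
# removing the stimulus lets the rebound push perception toward green
# (the green afterimage).
rg, by, lum = opponent_channels(1.0, 0.0, 0.0)
print(rg)  # 1.0 -> strongly "red" before adaptation
```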
Color opponency and perception implications
Color perception is not a simple mapping from single wavelengths to color; it emerges from dynamic interactions among cone inputs and downstream neural circuits.
Visual Processing and Object Recognition
Visual object recognition as a hierarchical process
Early stages extract basic features (edges, orientation, color, motion).
Higher-order areas integrate features to form coherent object representations.
Retinotopic maps and cortical magnification in the cortex
Visual inputs stay organized by spatial location as they progress from retina to cortex.
The foveal region’s information is disproportionately represented in the visual cortex due to cortical magnification.
Visual agnosias: disruption of recognition with preserved perception or action
Visual agnosia: broad category for deficits in recognizing objects despite intact basic perception.
Visual object agnosia: difficulty recognizing objects by sight, but can still perceive other attributes and may draw or manipulate objects.
Prosopagnosia (face blindness): specific inability to recognize familiar faces, often with preserved perception of other objects and features; may still use non-facial cues (clothing, voice) for recognition.
Capgras syndrome: a disconnection between facial recognition and emotional response leading to beliefs that a familiar person is an impostor; arises from disrupted visual-emotional integration.
Kathroff syndrome (Capgras-related condition) discussed in lecture as a related phenomenon.
Spatial and memory integration
Memory and familiarity feed into ventral-stream processing for object recognition (vision for knowing).
Spatial awareness and action guidance rely on dorsal-stream processing (vision for action).
Interaction between streams is essential for goal-directed behavior and navigation in space.
Identity, Memory, and Cross-Modal Associations
Synaesthesia and cross-modal associations
Cross-modal perceptual experiences (e.g., seeing colors when hearing sounds, or associating a taste with a shape).
Brain imaging shows activation across multiple sensory systems during synesthetic experiences.
Potential implications for memory: rich cross-modal links can support more robust memory traces.
The memory-vision link in perception
Perceptual experiences are shaped by prior knowledge and experiences stored in memory.
Recognition and memory contribute to the interpretation of sensory input in real time.
Practical Implications and Real-World Relevance
Why these concepts matter
Understanding thresholds explains why we notice changes in our environment under different conditions (noise, light, attention, motivation).
Knowledge of adaptation helps in designing environments (e.g., classroom lighting, hearing aid settings, interface design).
Insight into color vision explains why color differences matter in design, labeling, and safety-critical contexts.
Awareness of agnosias and prosopagnosia fosters empathy and informs clinical assessment and rehabilitation strategies.
Ethical and philosophical notes
Color as a perceptual construct invites reflection on the nature of reality and perception; colors are not “out there” as intrinsic properties independent of observers.
Cross-modal experiences (synesthesia) illustrate the brain’s creative ways of encoding and linking information, challenging simplistic models of perception.
Summary of Key Concepts to Remember
Sensation vs. perception; transduction; data reduction by sensory organs; attention filters.
Absolute thresholds, 50% detection, and the concept of just noticeable differences (JND).
Weber’s Law: ΔI / I = k, illustrating that ΔI scales with baseline intensity to preserve detectability.
Adaptation: neural and perceptual adjustments to constant stimulation across senses (vision, olfaction).
Visual system anatomy: cornea, pupil/iris, lens, retina; rods vs. cones; cone density at the fovea; blind spot.
Visual pathways: retina → LGN (thalamus) → V1; dorsal (where/how) vs ventral (what) streams; retinotopy and cortical magnification.
Color vision: RGB trichromacy in the eye; opponent-process theory in perception; afterimages; color deficiencies.
Perception and action integration: vision-for-know/vision-for-action distinction and their neural bases.
Visual agnosias and Capgras syndrome: how specific brain disruptions affect recognition, identity, and emotional associations.
Synesthesia: cross-modal experiences; implications for memory and perception; brain imaging shows multi-sensory activation during synesthetic experiences.
Real-world takeaways
Expect perceptual differences across contexts; adjust environments to optimize attention and perception.
Color perception is a brain-based interpretation; consider perceptual variability in design and accessibility.
Recognize the separable but interacting pathways for recognizing objects and guiding actions; this informs both education and clinical assessment.