VR systems share 3 main features
Immersion, interaction & sense of presence
Immersion is how well the VR system isolates the user from reality and whether the user's senses create a believable virtual world.
the ability of VR to trick you into feeling somewhere else; linked to hardware capabilities & depends on objective stimulus richness
Head-mounted display (HMD)
large goggles that project 2 separate images to create a convincing illusion of 3D
VR input devices
controllers, gloves, motion trackers
cues used by the brain to create the perception of space
binocular cues, monocular cues, dynamic cues
monocular cues
allow us to see depth w/one eye; cues that do not need input from both eyes, e.g. shading, texture
occlusion
creates an illusion of depth by having objects overlap, e.g. a darker square partly covering a lighter one makes the lighter one appear to be behind it
linear perspective
a depth indication based on perspective distortion, e.g. objects further away appear smaller
Binocular vision
ability to use both eyes together to create a unified image; provides superior quality & depth perception; creates 3D vision
Binocular depth cues
require input from both eyes to create depth, e.g. disparity
Disparity
comparing the slightly different visual input of the 2 eyes creates depth
Binocular disparity
difference in images between the 2 eyes; objects further away give smaller disparities, meaning worse stereopsis
Dynamic cues
cues received through movement, e.g. motion parallax & accretion
motion parallax
a depth cue where nearby objects appear to move faster than (and in the opposite direction to) distant objects from a moving observer's perspective
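The inverse relationship behind motion parallax can be sketched numerically. This is a minimal illustration (a point at a given perpendicular distance, a laterally moving observer); the helper name is an assumption, not something from the notes.

```python
def angular_speed(observer_speed_ms, object_distance_m):
    """Approximate retinal angular speed (rad/s) of a stationary point
    at a given perpendicular distance from a laterally moving observer.
    Nearby objects sweep across the visual field faster: omega = v / d."""
    return observer_speed_ms / object_distance_m

# A walker at 1.5 m/s: a fence 2 m away sweeps past 10x faster than
# a tree 20 m away, which the brain reads as a depth difference.
near = angular_speed(1.5, 2.0)    # 0.75 rad/s
far = angular_speed(1.5, 20.0)    # 0.075 rad/s
```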
accretion
a moving surface can uncover hidden texture; this leads the brain to perceive the texture as part of a closer object, creating a sense of relative depth
stereopsis
the brain's transformation of disparity into depth perception; through the 2 retinas we get slightly different views of the world; surgeons need very good stereopsis
stereovision
uses convergence + retinal disparities
Crossed vs uncrossed disparities distinguish near vs far objects
VR uses parallax between left/right displays to fake depth
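The crossed/uncrossed distinction above can be made concrete with similar triangles through the display plane. The function below is a minimal sketch (eyes on a horizontal baseline, point on the midline); the 63 mm IPD is an assumed example value.

```python
def screen_parallax(ipd_m, screen_dist_m, point_depth_m):
    """Horizontal on-screen separation between the left- and right-eye
    projections of a point (similar triangles through the screen plane).
    Positive = uncrossed disparity (behind the screen), negative =
    crossed disparity (in front), zero at the screen plane itself."""
    return ipd_m * (point_depth_m - screen_dist_m) / point_depth_m

ipd, screen = 0.063, 2.0                    # 63 mm IPD, screen 2 m away
at_plane = screen_parallax(ipd, screen, 2.0)   # 0.0: no parallax
behind = screen_parallax(ipd, screen, 10.0)    # > 0: uncrossed, far
in_front = screen_parallax(ipd, screen, 1.0)   # < 0: crossed, near
```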
convergence
the closer an object is, the more your eyes cross/converge to keep it focused on both retinas
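The geometry of convergence can be sketched as follows: for a fixation point straight ahead, the angle between the two lines of sight follows from the half-IPD and the distance. The function name and the 63 mm IPD are illustrative assumptions.

```python
import math

def vergence_angle_deg(ipd_m, fixation_dist_m):
    """Angle between the two eyes' lines of sight when fixating a point
    straight ahead: theta = 2 * atan((IPD/2) / d). A smaller distance
    gives a larger angle, i.e. the eyes converge more."""
    return math.degrees(2 * math.atan((ipd_m / 2) / fixation_dist_m))

ipd = 0.063
near_angle = vergence_angle_deg(ipd, 0.3)  # ~12 degrees at 30 cm
far_angle = vergence_angle_deg(ipd, 2.0)   # ~1.8 degrees at 2 m
```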
horopter
the region around fixation where the separate images of the 2 eyes correspond; all points in reality that are mapped onto corresponding points on the retinas form the horopter
Anaglyphs
images composed of 2 slightly different perspectives of the same object superimposed on each other in contrasting colours; produce a 3D effect when viewed through correspondingly coloured filters, e.g. red/cyan 3D images
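The channel-mixing idea behind a red/cyan anaglyph can be shown per pixel: take the red channel from the left-eye view and green/blue from the right-eye view, so matching filters route each perspective to the correct eye. A minimal sketch, not a production image pipeline.

```python
def make_anaglyph(left_px, right_px):
    """Combine one (R, G, B) pixel from each eye's view into a red/cyan
    anaglyph pixel: red from the left eye, green & blue from the right.
    A red filter passes only the left view; a cyan filter only the right."""
    left_r, _, _ = left_px
    _, right_g, right_b = right_px
    return (left_r, right_g, right_b)

# Example pixel: a reddish left view combined with a greenish right view
combined = make_anaglyph((200, 10, 10), (10, 150, 20))  # (200, 150, 20)
```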
large disparity
a greater horizontal shift between the left & right images; objects appear more separated, enhancing the 3D effect/sense of depth
Small disparity
objects appear closer to each other, creating a subtle 3D effect; a slight perception of depth makes the scene feel less dramatic and more natural; ideal for scenes where you want a gentle or comfortable 3D experience without a strong "pop-out" effect
Panum’s fusional area
region around the horopter where disparate images can still be fused into a single percept
depth perception
created by the minimal differences between the L & R eye images
stereoblindness
inability to see in 3D using stereopsis; cannot perceive depth by combining the 2 images, which results in a blur; affects ~7% of people (2019)
Stereopsis is important because
precise depth perception (certain professions); perception & action (think sports & camouflage); 3D vision in other animals as evolutionary evidence; gaming, 3D movies and VR improve our stereovision
Why should we care about stereovision in VR research
the problem of creating 3D space; the problem of accommodative distance & vergence
Vergence
the coordinated movement of both eyes in opposite directions to sustain binocular vision ◦ the eyes' lines of sight need to meet to see in 3D ◦ adjustment of the viewing angles of both eyes to match the depth of objects ◦ can cause retinal disparity
Vergence-Accommodation Conflict (VAC)
mismatch between vergence & accommodation: the eyes accommodate to the fixed HMD screen distance (near) but retinal disparity drives them to converge at a different virtual depth (far)
causes visual discomfort & eye strain; fixed by bringing objects closer to the display plane
degree of freedom within VR
3 DoF & 6 DoF
3 DoF
can track rotational motion but not translational; tracks whether the user has turned their head left/right, tilted it up/down, or pivoted; less popular
6 DoF
a 360-degree environment that can be changed & moved within; tracks both head rotation and the person's movement through space
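The 3 DoF vs 6 DoF distinction maps naturally onto a pose data structure: rotation-only for 3 DoF, rotation plus translation for 6 DoF. The class names below are illustrative, not from any VR SDK.

```python
from dataclasses import dataclass

@dataclass
class Pose3DoF:
    """Rotation-only tracking (radians): head orientation only."""
    yaw: float = 0.0    # turned left/right
    pitch: float = 0.0  # tilted up/down
    roll: float = 0.0   # pivoted side to side

@dataclass
class Pose6DoF(Pose3DoF):
    """Adds translation (metres), so walking through the scene
    is tracked as well as looking around."""
    x: float = 0.0
    y: float = 0.0
    z: float = 0.0

# A user who turned their head and stepped 2 m forward
pose = Pose6DoF(yaw=1.0, z=2.0)
```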
why would a VR be tethered?
tethered to a computer that runs the processing needed for the immersive experience
First ever experience of virtual reality (19th century)
creation of 360-degree murals giving the illusion of being part of the depicted scene, e.g. a historical event
Extended reality (XR) spectrum
VR is on a spectrum from partially immersive to fully immersive
Digital (VR) <——Mixed reality——→ Real (Augmented reality)
Stereoscope (Wheatstone, 1838)
beginning of VR; first device showing depth from 2 images; stereoscopic photos & viewers led to the explosion of the 3D industry
Link Trainer (1929)
early use of technology to create an immersive environment for training; first flight simulator; electromechanical device that simulated turbulence; used extensively in WWII
View-Master (1939)
popular stereoscopic viewer used for virtual tourism
Pygmalion’s spectacles (Weinbaum, 1930s)
fictional goggles predicting modern VR; predicted experiencing a fictional world through all the senses
Sensorama (Heilig, 1950s)
early multisensory immersion via a cabinet designed to create a virtual world: sight, sound, smell, touch, motion; very big/inconvenient
Telesphere Mask (Heilig, 1960)
first head-mounted display; lacked motion tracking abilities
Headsight (Bryan & Comeau, 1961)
first motion-tracking HMD; designed for military purposes; allows users to remotely view dangerous situations; video screen for each eye w/a magnetic motion tracking system allowing the user to look around; critical step in the development of VR HMDs & motion tracking
Ultimate display (Sutherland, 1965)
described a virtual world that could be viewed through an HMD; visual, auditory & interactive; laid the theoretical groundwork for many aspects of VR; considered a foundational blueprint for VR
Sword of Damocles (Sutherland & Sproull, 1968)
first VR/AR HMD connected to a computer; required wires/framed rooms; head tracking via wires hanging from the ceiling
Flight simulators (1960-90s)
development of VR for aviation; head tracker, 180-degree configuration, CG graphics, real-time interactivity; used for NASA astronaut training; simulated driving a rover on Mars
VR in 1970s-80s
Krueger's AR ◦ Videoplace; MIT Movie Map ◦ precursor to Google Street View; research dominated by military, NASA & aerospace
VR 2010s-present
Oculus Rift prototypes; Facebook buys Oculus; consumer VR boom; standalone HMDs → Oculus Quest; ongoing advances
Current applications of VR
entertainment/gaming; cinema (limited); museums ◦ mobility; education ◦ safe exploration ◦ career training ◦ combat training
why is virtual reality (VR) often used in psychology research?
to provide controlled immersive & manipulable environments
Does using VR always mean that research findings will have higher ecological validity?
no; ecological validity depends on how the VR scenario is designed/used
Stress management & VR
enables soldiers to learn resilience & empowerment
VR & military use
safe training environment; multiple simulation scenarios; Caballero (2018): disaster risk management & emergency preparedness
Medicine & VR
distraction during painful procedures as pain relief; rehabilitation of stroke patients; phobia treatment
VR & treatment of PTSD
Rizzo (2015) created a VR exposure system ◦ simulates scenarios of wars and 9/11 ◦ used for safe exposure; can be used for military sexual trauma
Psychology & VR
management of emotions ◦ NatureTrek; addressing fear of public speaking; dealing with complicated situations ◦ Bodyswaps; treatment of a broad range of mental health problems
difference between AR & VR
AR supplements reality VR replaces reality
Augmented Reality (AR)
seamless perception of the real environment combined with virtual content in real time; allows the user to see the real world w/virtual objects superimposed within it
characteristics of AR
combines reality & virtuality; interactive in real time; virtual contents are registered in 3D
whole apparatus of the visual system
sensory cells via visual nerves to visual centres in the brain
how do humans perceive
through sensory impressions; the perceived image is created in brain regions
suspension of disbelief
ability to blank out the obvious contrast between a fictitious world & reality; can be exploited to successfully create visual virtual environments
VR/AR systems requires
a computer system that consists of essential components for ◦ collection of info about the user/user interactions (tracking) ◦ generation of stimuli for the user (e.g. images and sounds) ◦ simulation of the virtual world
Human information processing
key senses in VR
visual perception
eye ◦ rods/cones; fovea (sharpest vision); saccades ◦ rapid shifts to build a high-res scene
Haptic perception
tactile, kinesthetic, proprioception; sensed via mechanoreceptors, thermoreceptors & nociceptors
multisensory perception
auditory; haptic; vestibular; optical flow & vection; presence; immersion
kinaesthesia
sensations that occur when active muscle contractions are involved; enables us to feel movement in general & the direction of movement
auditory perception
low spatial resolution: sources can only be distinguished if they are several degrees apart; relies on head-shape filtering; good temporal resolution (2-3 ms)
proprioception
all sensations related to body position, both at rest & in motion; provides the body's position in space & the position of joints/head; the sense of position
Proprioception & kinaesthesia is stimulated by
haptic joysticks, exoskeletons & motion platforms
perception of movement
the human body has elementary motion detectors for the visual perception of movement; these detect the direction/speed of local movement
perceptual robustness
humans often correct distortions automatically (e.g., watching a cinema screen from the side)
vestibular system
hair cells in the inner ear detect fluid movement in the semicircular canals of the organ of equilibrium; senses acceleration & balance
The larger the pupil distance
the further back objects appear in depth
Double vision (diplopia)
when disparities fall outside Panum's fusional area ◦ the 2 images cannot be fused together, resulting in double images ◦ should be avoided at all costs; fixed by adjusting eye separation (enlarging the fusion area) or cyclopean scale
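One of the fixes above, keeping disparity inside the fusional area, amounts to a clamp on the rendered disparity. The 1-degree limit below is an illustrative assumption (Panum's area varies across the visual field), not a fixed constant from the notes.

```python
def clamp_disparity(disparity_deg, fusion_limit_deg=1.0):
    """Limit on-screen disparity to Panum's fusional area so the two
    images can still be fused; disparities beyond the limit would be
    perceived as double (diplopia). The 1-degree default is an
    illustrative assumption, not a physiological constant."""
    return max(-fusion_limit_deg, min(fusion_limit_deg, disparity_deg))

clamped_far = clamp_disparity(2.5)    # 1.0: uncrossed excess clamped
clamped_near = clamp_disparity(-3.0)  # -1.0: crossed excess clamped
unchanged = clamp_disparity(0.4)      # 0.4: already fusible
```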
Frame cancellation
an object near the display edge gives conflicting depth cues ◦ disparity = object is in front ◦ occlusion = object is behind; the illusion breaks; fixed by keeping negative-parallax objects away from the edges
Space perception discrepancies
users often underestimate distance in VR (by up to 50%) ◦ causes: limited depth cues, FOV, presence, cognitive effort; fixed by exaggerating depth cues (shadows, fog) or portals to real-world spaces
Cybersickness
nausea, dizziness, headaches. Causes: sensory conflict, postural instability, system latency, mismatched depth cues. Mitigation: low latency (<20 ms), gradual exposure, teleportation, blur during rotation.
What is XR (Rauschnabel)
an abbreviation for all new reality formats ◦ not just extended reality
AR & VR should be treated differently because they have fundamental differences
AR is a continuum ranging from assisted to mixed reality
VR is a continuum from atomistic to holistic VR
Difficulty measuring presence
presence is a subjective, internal feeling; usually measured via questionnaires ◦ participants rate how strongly they felt inside the VR world; behavioural observations can be used ◦ noting how participants move, react or interact in VR; physiological measures ◦ heart rate, skin conductance to detect signs of emotional & sensory engagement
design & technical features of the immersive system influence presence (Felton, 2021)
visual resolution; tracking accuracy; audio quality
why does context matter when creating presence (Felton, 2021)
a compelling narrative or meaningful task can heighten presence; presence can be broken by glitches or unrealistic visuals
importance of the characteristics of the user which influence presence (Felton, 2021)
their expectations, previous experiences w/technology & willingness to engage w/the environment all shape how present they feel
Presence is a 2 part experience (Weber et al, 2021)
attentional immersion ◦ how involved the participant is; perceived realism of the virtual reality ◦ how realistic it looks
what hardware offers high interaction & high immersion
full haptic suit; HMD prescription lenses; spatial sound headphones; trackers; treadmills (Roto VR)
what hardware provides low interaction & low immersion
head tracking; mobile device
what is needed to create an immersive/realistic VR system
a combination of infrared LEDs, motion-sensing cameras & screens ◦ allow a headset to gather relevant info & present it to the eye
Perfect immersion in VR
the same sensory info as the real world
How can you increase presence in VR
context: a more compelling story; more engaged = more real
what technical elements make an immersive experience in VR
tracking, display & rendering
tracking in VR
the process of measuring the user's body movements; the most important parameter of VR; eye tracking is the most important form
Rendering in VR
creating a realistic scene around the individual by instantiating the sights, sounds, touch and smells of the new location
Display in VR
the manner in which physical senses are replaced w/digital info; new sights are processed/rendered to be delivered to the user
Process of VR immersion
stereoscopic lenses
positioned between the screen & eye to distort the image, creating a 3D effect; the headset passes 2 images through the lenses (one for each eye)
Infrared cameras
within the headset; adjust light to the user's needs; allow the device to shift content as the head moves; some can track eye movement
VR headsets
high-resolution displays presenting the content of the VR world
uses Fresnel lenses
focus content for the user; smartphones can work as headsets by adding optical elements
latency
the time delay between a user's movement & the corresponding visual update; must be kept minimal to align virtual info with the real world
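Motion-to-photon latency is simply the sum of the pipeline stage times between a head movement and the updated image reaching the eye. The stage names below are illustrative assumptions; the ~20 ms budget echoes the cybersickness mitigation figure above.

```python
def motion_to_photon_ms(tracking_ms, render_ms, display_ms):
    """Total delay between a head movement and the updated photons
    reaching the eye: the sum of tracking, rendering and display
    stage times. Keeping the total under ~20 ms is the budget the
    cybersickness notes suggest."""
    return tracking_ms + render_ms + display_ms

# Example budget: 2 ms tracking + 11 ms render + 5 ms display = 18 ms,
# which stays inside the ~20 ms comfort target.
total = motion_to_photon_ms(2.0, 11.0, 5.0)
within_budget = total < 20.0
```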