VR
creates end-to-end mechanisms that replace our natural real-world environment w/an immersive simulation
uses a combination of hardware & software
immersion, presence & interaction
high interaction & high immersion
full haptic suit
HMD
low interaction & low immersion
head tracking
mobile device
but there is debate as to what counts as full immersion
creating an immersive & realistic VR system needs a combination of
infrared LEDs
motion sensors
cameras
screens
allows a headset to gather relevant info & present it to the human eye
HMD VR
Immersion
the ability of the VR system to trick you into feeling that you’re somewhere else
perfect immersion in VR = same sensorial info as the real world
Presence
how engaged you really are & how much you feel yourself inside the virtual world
if the story is compelling, users will be completely absorbed by it
indicates how much the user feels engaged w/VR experience
more of an illusion
what technical elements make an immersive experience in VR
tracking
rendering
display
tracking
process of measuring body movements of the user
of the 3 parameters that comprise a VR system, tracking is the most important
eye movement is the most important to track
Rendering
instantiating appropriate sights, sounds, touch and sometimes even smell for the new location
user may change their position rapidly → necessary to render this info very quickly & in near real time
Display
the manner in which the physical senses are replaced w/digital info
once the sights/sounds for new location are processed & rendered they need to be delivered to the user
Process of VR immersion
display/optics generate separate images for each eye
graphics processor generates rendering
controllers capture input commands
sensors/inputs infer position & movement
Stereoscopic lenses
positioned between screen & eye to distort the image, creating a 3D effect
headset passes 2 images through lens (one for each eye)
Infrared cameras
within headset
adjusts light to the user's needs
allows the device to shift content as head moves
some can track eye movement
VR headsets
high-resolution displays presenting the content of VR world
uses Fresnel lenses
focus content for the user
smartphones can work as headsets by adding optical elements
Latency
time delay between a user’s movement + corresponding visuals
Align virtual information with the real world
minimal latency
is necessary to ensure that what we see changes simultaneously when we move around, turn our heads, or look in a specific direction
acceptable latency is generally considered to be below 20 milliseconds
high latency
can break immersion & induce motion sickness
motion sickness can even occur in 2D
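The latency budget above can be sketched as a simple check (the 20 ms threshold is the comfort figure from these notes; the timestamps are hypothetical):

```python
# Sketch: motion-to-photon latency check. The 20 ms comfort threshold
# comes from these notes; timestamps below are made-up examples.

COMFORT_THRESHOLD_MS = 20.0

def motion_to_photon_ms(movement_ts_ms: float, display_ts_ms: float) -> float:
    """Delay between a head movement and the matching frame on screen."""
    return display_ts_ms - movement_ts_ms

def breaks_immersion(latency_ms: float) -> bool:
    """High latency risks breaking immersion & inducing motion sickness."""
    return latency_ms > COMFORT_THRESHOLD_MS

latency = motion_to_photon_ms(100.0, 112.0)  # 12 ms: within budget
```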
high-resolution VR headsets
better view quality & higher immersion
creates more processing load on the device
why is field of view important in VR headsets
average human sees 200-220° arc around head
in VR, FoV is limited; modern headsets support around 110°
Frame rate
determines how immersive the experience is + the likelihood of cybersickness
eyes can see 1,000 frames per second, but the brain only interprets 150 FPS
Frame rate & VR sickness
VR developers find anything less than a rate of 60 FPS causes nausea & headaches.
Refresh rate
number of frames that the display can output per second
High refresh rates (75Hz+) alleviate motion sickness & reduce flicker-induced fatigue
minimises blur & maintains immersion
require powerful processors → increased cost of the VR headset
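A refresh rate implies a per-frame time budget, which is why higher rates demand more powerful processors; a rough sketch:

```python
# Sketch: the time budget each frame gets at a given refresh rate.
# Higher refresh rates (75Hz+) shrink the budget, hence the extra
# processing power & cost the notes mention.
def frame_budget_ms(refresh_hz: float) -> float:
    """Milliseconds available to render one frame."""
    return 1000.0 / refresh_hz

# e.g. 60 Hz -> ~16.7 ms, 75 Hz -> ~13.3 ms, 90 Hz -> ~11.1 ms
```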
Spatial audio
increasingly important in any VR experience
Sound & where it comes from influence our perception of a 3D space
Graphics processing
is a resource-heavy task and the two main classes of VR devices, tethered and standalone, approach this differently
Tethered devices & graphic processing
offload graphic processing to a dedicated computer w/powerful GPU through USB 3.0 & HDMI
e.g. Oculus Rift S & HTC Vive
BUT tether cables can be unwieldy/limit the user’s movements
Standalone devices & graphic processing
use onboard mobile GPU for graphics processing
Oculus Go & Google Cardboard
mobile GPU is not as powerful as a PC-based GPU.
unit has to rely on inbuilt battery power which is also limited
Head, movement & position tracking
use a combination of sensors, gyroscopes, & AI to influence what we see as we move around.
Basic headsets initially used 3DoF systems
only allowed us to look left, right, and up and down
most advanced headsets (Varjo XR-4) use 6DoF to ensure we can look around in a 360-degree format,
like we would in real life
tracking sensors
go beyond the capabilities of controllers
communicates orientation & position of a tracked entity to the headset
can be within the room, controller, finger/hand trackers or headset itself
enables users to perform more natural-feeling actions to interact
types of tracking sensor
room-scale & external (e.g. Oculus Rift)
internal on the device (inside-out) (e.g. Oculus Quest).
Virtual reality controllers
hardware components that allow us to take action
allow us to submit info directly into software
support a combination of buttons, trackpads & joystick controls bundled into a hand-held unit that communicates wirelessly w/HMD
interacting within the VR world
To interact w/world, the device has to capture inputs provided by the user
certain devices support the user’s gaze, head-pose and simple hand movements as inputs.
Eye-facing infrared cameras, gyroscopes and accelerometers sense the inputs.
The input controller units are self-contained and support 3DoF or even 6DoF
challenges facing VR
VR sickness & discomfort from prolonged exposure
ethics/cybersecurity issues of metaverse
affordability/time-consuming development
best VR accessories for deeper immersion
Virtual Reality Cameras
Innovative VR Controllers
Accessories for Spatial Sound
VR Trackers, Sensors, and Base Stations
Haptic Gloves & Suits
Accessories for New Sense Stimulation
Chairs & Treadmills
Accessories for User Comfort
Custom Prescription Lenses
Virtual Reality Accessories for Battery Power
Chairs & treadmills
Companies like Roto VR are producing chairs attached to a motor system that allows users to move around more freely when seated while also connecting them with in-app experiences.
Innovative VR Controllers
Tap creates knuckle straps that allow users to interact w/virtual keyboards using natural finger movements
Mobile VR headsets
Mobile headsets are shells w/lenses
Lenses separate the screen into 2 images for your eyes
turns a smartphone into a VR device
Relatively inexpensive
Not tethered
Phones aren’t designed specifically for VR,
can’t offer best visual experiences
underpowered compared w/PC or game console-based VR
no positional tracking w/mobile VR → you can’t look around objects
e.g. Merge VR
Standalone VR headsets
An all-in-one VR headset, or standalone headset, puts everything in the headset needed to convince you that you’re in another world.
It is a single integrated piece of hardware → wireless
e.g. Meta Quest series
PC / Console connected VR headsets
provide a more immersive experience at a higher price point
Tethered
cables from the headset to an external piece of hardware to power the headset
e.g. Vive or PlayStation VR
3 main types of movement and positioning tailored for each play area size
Roomscale VR
Seated and/or Standing
Roomscale VR
set a boundary or play area to move freely around in the game
with these games, you can physically move around your space to interact w/environment
seated & standing VR
user stays roughly in the same place & uses the controller to move instead of physically moving through a space.
Motion Sickness/VR sickness
sense of nausea & unease many experience after being exposed to an immersive experience for an extended period
some people are more susceptible
women & people over 50 are generally more likely to feel unwell after using a VR headset
how can VR sickness be reduced
Improved spatial tracking
Sensors capable of tracking movement can significantly reduce symptoms
Enhanced user interfaces
Handheld controllers cause a disconnect
creates sensory conflict leading to disorientation & discomfort
Reduced Latency
the more time it takes to register in-app movements, the more confusing it is for the brain
companies are investing in new technologies to help minimize latency
Fast refresh frame rate
Psychological issues of VR
“uncanny valley” effect when interacting w/realistic avatars
anxiety & stress among user
VR used in training sessions might lead to an increased level of stress when exposed to worrying situations & simulations
Surgery simulations
the more time users spend in a virtual world, the more likely they are to feel disconnected from the real world
suggested that people w/mental health issues should avoid VR headsets
Software
to create content for VR, creators need to develop/deploy software ecosystems customised for the device
2 leading software ecosystems,
SteamVR and Oculus (PC or Mobile)
Google Daydream supports VR on smartphones
complex & involves a steep learning curve
browser-based WebXR has emerged as an alternative VR tool
types of software
Unreal & Unity engines provide SDKs (Software Development Kits)
integration for most platforms (e.g. PS VR)
Unity vs. Unreal Engines
2 of the most popular game engines available to create VR experiences
Unreal Engine has higher quality output, at the expense of ease of use, e.g. Batman: Arkham VR
glTF/GLB
a neutral, open source format.
Khronos Group created this format for 3D web, AR, VR, Games and 3D advertising
based on JSON → stores some data in external files like textures (JPEG or PNG), shaders (GLSL), or geometry & animation data (BIN)
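Since glTF is JSON-based, its structure is easy to inspect; a minimal, non-renderable stub (scene/node names are hypothetical; real assets also carry meshes, materials & textures):

```python
import json

# Sketch: glTF is JSON at its core; heavy data like geometry lives in
# external files (e.g. a .bin buffer), as the notes describe.
gltf_text = """
{
  "asset": {"version": "2.0"},
  "scene": 0,
  "scenes": [{"name": "Example", "nodes": [0]}],
  "nodes": [{"name": "Cube"}],
  "buffers": [{"uri": "geometry.bin", "byteLength": 1024}]
}
"""
gltf = json.loads(gltf_text)
version = gltf["asset"]["version"]    # format version string
external = gltf["buffers"][0]["uri"]  # external binary payload
```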
FBX
is a proprietary 3D file format
Originally developed by Kaydara
used in the film & video game industry
It supports geometry, appearance & animations (skeletal & morphs)
most popular for animation
used as an exchange format between different programs like Maya, 3DSMax, AutoCAD, Roman’s CAD, and others.
OBJ
It is a neutral 3D format when used as an ASCII variant.
when used as a binary variant, it is proprietary.
3D printing, graphics, & 3D scanning all use this file format
ability to store geometry, colour & texture information.
does not support animations, but is one of the most popular interchange formats for 3D graphics.
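The ASCII variant of OBJ is simple enough to parse by hand; a minimal sketch over a single hypothetical triangle (`v` = vertex, `f` = face with 1-based indices):

```python
# Sketch: parsing the ASCII OBJ variant. Only 'v' and 'f' records are
# handled here; real files also carry normals, UVs, materials, etc.
obj_text = """\
v 0.0 0.0 0.0
v 1.0 0.0 0.0
v 0.0 1.0 0.0
f 1 2 3
"""

vertices, faces = [], []
for line in obj_text.splitlines():
    parts = line.split()
    if not parts:
        continue
    if parts[0] == "v":    # geometry: one vertex position
        vertices.append(tuple(float(x) for x in parts[1:4]))
    elif parts[0] == "f":  # face: 1-indexed vertex references
        faces.append(tuple(int(i) for i in parts[1:]))
```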
data in VR
VR will provide coordinates
may include behavioural interactions
reading VVV
Metaverse
collective virtual shared space
an evolution of the internet where users can work, play, socialise in a fully immersive/interconnected environment
Mystakidis (2022): heavily reliant on XR technologies to create seamless experiences
aims to integrate physical & virtual worlds more completely
issues regarding metaverse
issues of privacy
data security
digital divide
cost
realisation of the metaverse
requires a comprehensive integration of
hardware
software
content
creates immersive experience
Hardware
central to creating immersive/interactive experiences
provide visual & auditory experiences
HMDs
crucial for visual immersion by tracking user’s head movements
input tools to enhance user’s interaction w/VW
handheld tools for tactile interaction
eyetracking
voice recognition
body trackers/treadmills
implementation of the metaverse involves several stages
design
model training
operation
evaluation
in the design phase of the metaverse what is considered
goals
constraints
user scenarios
model training phase of the metaverse involves
data analysis
user modelling
iterative learning to fine-tune the system
operation phase of the metaverse involves
considerations for system operation, simulations, and network environments
evaluation phase of the metaverse involves
assesses content fidelity, interaction authenticity, and implementation feasibility
Current landscape of VR
characterised by rapid advancements in hardware & software
companies are investing billions into developing infrastructure
why are there growing concerns about privacy, security & ethical use
XR devices collect vast amounts of sensitive data
including biometric information, which poses significant risks if not properly managed
Metaverse promises an interconnected virtual world
involving even greater amounts of data & more complex interactions between users and digital environments.
applications of the metaverse
simulation is most prominent
gaming
education
marketing
social research
provide immersive & interactive experiences that mimic real-world scenarios, offering valuable insights and training opportunities
challenges of metaverse
cyber motion sickness
caused by the disconnect between physical movement and visual feedback.
demand for real-time data processing & high-speed rendering in a 360-degree field of view poses technical challenges that require continuous innovation
minimising these issues is crucial for maintaining user engagement
interdisciplinary collaboration, technological advancements, and user experience innovation will be essential to overcoming these challenges
density of pixels within HMD
high pixel density is essential to avoid visible graininess & ensure sharp imagery
Experimental headsets → retinal projection technology
projects images directly onto the retina using micro-projectors & mirrors
Advocates suggest that this approach may reduce eye strain,
but it is currently unable to match the full immersive capacity of LCD & OLED systems
Field of view
average human sees the world around them in a 200 to 220-degree arc around the head
our vision from our left and right eyes overlaps at an angle, which allows us to see in 3D
optics & lenses
lenses are used to magnify/shape the image so that it fills a user’s field of view
horizontal field of view of around 90 to 100 degrees is necessary to convincingly simulate a natural environment
quality of lenses determines the clarity of the image
design is as important as the display itself
Head tracking
allows the headset to translate movements of the user’s head into changes in perspective within the virtual environment
achieved through combinations of accelerometers, gyroscopes, and external tracking cameras
advanced HMDs do directional & positional tracking
enabling users to lean/move slightly for enhanced realism
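One common way to combine the accelerometer & gyroscope data mentioned above is a complementary filter; a single-axis sketch (the ALPHA constant and the sample data are my own assumptions, not from the notes):

```python
# Sketch: complementary filter for head tracking. The gyroscope is fast
# but drifts; the accelerometer gives a noisy but absolute tilt angle.
ALPHA = 0.98  # assumed tuning constant: trust gyro 98%, accel 2%

def fuse_pitch(prev_pitch_deg, gyro_rate_dps, accel_pitch_deg, dt_s):
    """One filter step for a single (pitch) axis."""
    gyro_estimate = prev_pitch_deg + gyro_rate_dps * dt_s  # integrate gyro rate
    return ALPHA * gyro_estimate + (1 - ALPHA) * accel_pitch_deg

# Head held at a 10-degree tilt: gyro reads 0 deg/s, accelerometer reads 10.
pitch = 0.0
for _ in range(100):  # 1 s of samples at 100 Hz
    pitch = fuse_pitch(pitch, gyro_rate_dps=0.0, accel_pitch_deg=10.0, dt_s=0.01)
# pitch converges toward the accelerometer's absolute 10-degree reading
```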
eye tracking
monitors the direction of gaze allowing adjustment of depth of field
enable gaze-based interaction & enhance realism by allowing virtual characters to respond naturally to user attention.
motion tracking
fundamental to the VR experience
level of motion tracking depends on the application
a cockpit sim requires head & hand tracking
a full VR world requires full-body tracking
motion tracking must account for 6DoF
translational movement
along the x, y, and z axes
rotational movement
pitch, yaw, and roll
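The 6 degrees of freedom can be modelled as a simple pose record (field names and the eye-height value are my own, for illustration):

```python
from dataclasses import dataclass

# Sketch: a 6DoF pose = 3 translational + 3 rotational degrees of freedom.
@dataclass
class Pose6DoF:
    x: float = 0.0      # translation along x
    y: float = 0.0      # translation along y
    z: float = 0.0      # translation along z
    pitch: float = 0.0  # rotation about x (nod)
    yaw: float = 0.0    # rotation about y (turn)
    roll: float = 0.0   # rotation about z (tilt)

    def translate(self, dx: float, dy: float, dz: float) -> "Pose6DoF":
        """Return a new pose moved by (dx, dy, dz), rotation unchanged."""
        return Pose6DoF(self.x + dx, self.y + dy, self.z + dz,
                        self.pitch, self.yaw, self.roll)

head = Pose6DoF().translate(0.0, 1.7, 0.0)  # hypothetical standing eye height
```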
types of motion tracking
optical
non-optical
optical motion tracking
relies on cameras/imaging devices & light-emitting markers
may wear full body suits
infrared depth sensors
superior in capturing precise full-body motion
non-optical motion tracking
microelectromechanical sensors (MEMS)
accelerometers, gyroscopes & magnetometers
developed for automotive & mobile uses, now compact
allows controllers/HMD to capture motion data at low latency
indispensable for portable headsets
often combined with optical methods to increase accuracy and stability
VR gloves
treadmills
future for motion tracking
may be transformed by brain-computer interfaces
control robotic limbs through direct neural signals
targeted muscle reinnervation has demonstrated that rerouted nerve signals can control artificial limbs in a manner closely resembling natural movement
software in VR
requires specialised tools & processes that bring together 3D modelling, real time rendering & advanced user interaction mechanics
3D Modeling and Asset Creation
requires specialised 3D modelling software e.g. Blender
animate objects in virtual space
textures applied to 3d models need to be detailed and optimised to prevent lag in rendering
Real-Time Rendering
Determines how the virtual environment is visually presented to the user
must render scenes at high frame rates, typically 90 frames per second (fps) or higher, to maintain a smooth and immersive experience
requires powerful hardware & optimised software that can handle the intense computational load
scenes
collects all renderable + audible elements (geometry, lights, audio, cameras) and their spatial hierarchy
At runtime, it’s rendered from the user’s viewpoint to one (mono) or two+ (stereo/multi-projector) images
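Rendering stereo images means rendering the scene from two eye positions offset by the interpupillary distance; a sketch (0.063 m is an assumed average IPD, and the head position is hypothetical):

```python
# Sketch: stereo rendering draws the scene twice, once per eye, with
# the eyes displaced half the interpupillary distance (IPD) each way.
IPD_M = 0.063  # assumed average IPD in metres

def eye_positions(head_pos, right_axis):
    """Head position + unit 'right' vector -> (left eye, right eye)."""
    half = IPD_M / 2.0
    left = tuple(h - half * r for h, r in zip(head_pos, right_axis))
    right = tuple(h + half * r for h, r in zip(head_pos, right_axis))
    return left, right

# Head at 1.7 m height, facing -z, so 'right' is +x.
left_eye, right_eye = eye_positions((0.0, 1.7, 0.0), (1.0, 0.0, 0.0))
```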
creation of 3D objects
vertices
edges
faces
triangles
object is formed
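The vertex/edge/face/triangle relationship can be sketched with a tiny mesh: two triangles sharing an edge form a simple quad (vertex data is illustrative):

```python
# Sketch: a mesh stores vertices and triangular faces; edges can be
# derived from the faces. Two triangles sharing a diagonal form a quad.
vertices = [(0, 0, 0), (1, 0, 0), (1, 1, 0), (0, 1, 0)]
triangles = [(0, 1, 2), (0, 2, 3)]  # indices into `vertices`

def mesh_edges(faces):
    """Unique undirected edges of a triangle mesh."""
    edges = set()
    for a, b, c in faces:
        for u, v in ((a, b), (b, c), (c, a)):
            edges.add((min(u, v), max(u, v)))  # normalise direction
    return edges

edges = mesh_edges(triangles)  # 4 quad sides + 1 shared diagonal
```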