CS 135 Virtual Reality - VR World
Virtual Reality (VR) Overview
Definition of VR: A simulated experience that can be similar to or completely different from the real world.
Historical perspective on the evolution of VR technology.
Agenda for VR World
Sense & Sensor
Motion & 6 Degrees of Freedom (6DoF)
Hardware
Software
Perception & Sensation
VR Systems
Components of VR systems:
Configuration Control: Managing how VR adapts to user input.
Natural Stimulation: Mimicking real-world sensory inputs (sight, sound, touch).
Rendering: Creating the virtual environment.
Neural Pathways: Understanding how sensory information travels from sensory organs to the brain.
Senses and Sensors
Sensors are transducers that convert external energy into signals.
Comparison of human senses with mechanical sensors:
Vision: Light waves (colors) / Mechanical: Cameras
Hearing: Sound waves / Mechanical: Microphones
Touch: Pressure and vibrations / Mechanical: Touchpads
Balance: Gravity detected by the inner ear / Mechanical: Gyroscopes
Sensor Dynamics
Sensors operate within a configuration space defined by degrees of freedom (DoF):
Inner ear (vestibular system): 3 translational DoF (x, y, z), 3 rotational DoF (yaw, pitch, roll)
Eye: a similar 6-DoF configuration, plus extra DoF for lens focus and blink state.
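A 6-DoF configuration space can be sketched as a small data structure; `Pose6DoF` is a hypothetical name used here for illustration, assuming meters for translation and radians for rotation:

```python
from dataclasses import dataclass

@dataclass
class Pose6DoF:
    # Translational DoF (meters)
    x: float = 0.0
    y: float = 0.0
    z: float = 0.0
    # Rotational DoF (radians)
    yaw: float = 0.0
    pitch: float = 0.0
    roll: float = 0.0

# A standing user with the head turned slightly to the left
head = Pose6DoF(z=1.7, yaw=0.3)
```

Any rigid body in the VR world (head, controller, virtual object) can be described by such a pose; extra DoF like eye focus would be additional fields.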
Display Types
Audio Display
Rendering: Headphones / Stimulus: Pressure waves
Types of placement:
Room-fixed: Static relative to environment
Body-fixed: Moves with the user
Visual Display
Rendering: Monitors, head-mounted displays
Stimulus: Electromagnetic waves
Trade-offs: energy reaching the eye vs. perceived resolution, both of which depend on distance from the display.
Lens considerations: Wide fields-of-view (FOV) vs. distortion; typical FOV ~110 degrees.
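Wide-FOV lenses bend straight lines (radial distortion), so VR software typically pre-warps the rendered image to compensate. A minimal sketch of a one-coefficient radial model, with a hypothetical coefficient `k1`:

```python
def barrel_distort(x, y, k1=0.22):
    """Apply simple radial distortion to a point in normalized image
    coordinates: points farther from the centre are pushed outward."""
    r2 = x * x + y * y        # squared distance from the image centre
    scale = 1 + k1 * r2       # distortion grows with radius
    return x * scale, y * scale

# The centre is unaffected; off-centre points shift outward.
centre = barrel_distort(0.0, 0.0)
edge = barrel_distort(1.0, 0.0)
```

Real headsets use higher-order models and per-lens calibration, but the principle is the same: render, then inverse-warp so the lens undoes the warp.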
Tracking Mechanisms
Track user's movements to update the VR environment:
Cameras: Inside-out (headset-mounted cameras observe the environment; common in standalone AR/VR headsets) & Outside-in (external sensors observe the headset, e.g., HTC Vive base stations).
Inertial Measurement Unit (IMU): Estimating orientation, angular velocity, and linear acceleration.
Compass (magnetometer) for navigation and orientation.
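IMU-based orientation estimation commonly fuses the gyroscope (fast but drifting) with the accelerometer (noisy but drift-free, since it sees gravity). A minimal sketch of a complementary filter for one pitch angle; the function name and blend factor `alpha` are illustrative assumptions, not a specific headset's algorithm:

```python
import math

def complementary_filter(angle, gyro_rate, accel_x, accel_z, dt, alpha=0.98):
    """Fuse integrated gyro rate with accelerometer tilt for a pitch estimate."""
    gyro_angle = angle + gyro_rate * dt          # integrate angular velocity
    accel_angle = math.atan2(accel_x, accel_z)   # tilt inferred from gravity
    return alpha * gyro_angle + (1 - alpha) * accel_angle

# Stationary sensor with a wrong initial estimate: the accelerometer term
# slowly pulls the estimate back toward the true angle of 0.
angle = 0.1
for _ in range(100):
    angle = complementary_filter(angle, gyro_rate=0.0,
                                 accel_x=0.0, accel_z=9.81, dt=0.01)
```

The magnetometer plays the analogous drift-correcting role for yaw, which gravity alone cannot observe.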
Touch Display
Rendering: Haptic devices / Stimulus: cutaneous (skin senses) and kinesthetic (body-position) feedback.
VR Hardware Summary
Key hardware components:
Display Systems: Visual, audio, touch, etc.
Computational Hardware: CPU, GPU, specialized hardware.
Input Devices: Joysticks, keyboards, gloves.
Tracking Systems: IMU, cameras, LiDARs.
Power Supply: Battery for portable devices.
VR Software Framework
Diagram of VR software:
Inputs: Head tracker, game controller, keyboard/mouse.
Computation: VR world generator processes input to create the virtual environment.
Outputs: Visual renderer (display), aural renderer (audio), haptic renderer (touch).
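The input → computation → output pipeline above repeats once per display frame. A minimal sketch with stub components (all names here are illustrative, not a real engine's API):

```python
def vr_frame_loop(world_generator, renderers, tracker, n_frames=3):
    """One iteration per display frame: sense, simulate, then render."""
    frames = []
    for _ in range(n_frames):
        pose = tracker()                              # input: head pose
        scene = world_generator(pose)                 # computation: update world
        frames.append([r(scene) for r in renderers])  # output: each renderer
    return frames

# Stub tracker, world generator, and a single visual renderer
frames = vr_frame_loop(
    world_generator=lambda pose: {"pose": pose},
    renderers=[lambda scene: f"draw@{scene['pose']}"],
    tracker=lambda: (0.0, 1.7, 0.0),
)
```

In a real system the visual, aural, and haptic renderers would each be entries in `renderers`, possibly running at different rates.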
VR World Generators
Common VR world generators:
Game engines: Unity, Unreal Engine
Captured-world viewers: Google Street View
Various simulators for specific applications.
Physical vs. Virtual Worlds
Techniques to reconcile physical limitations and virtual experiences:
Geo-fencing for defined user areas.
Matching tracked user motion with its virtual representation to avoid motion sickness (motion-to-photon latency ideally under ~36 ms).
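The latency budget above can be checked against a pipeline's frame timing; a small sketch using the ~36 ms threshold from this section (the function name is illustrative):

```python
def within_latency_budget(motion_to_photon_ms, budget_ms=36.0):
    """Check motion-to-photon latency against the course's ~36 ms threshold."""
    return motion_to_photon_ms <= budget_ms

# At a 90 Hz display, one frame takes ~11.1 ms,
# so even a two-frame pipeline fits within the budget.
two_frame_latency = 2 * 1000 / 90
ok = within_latency_budget(two_frame_latency)
```

Budget accounting like this drives design choices such as pose prediction and late-stage reprojection.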
Human Perception in VR
Process: Sensory inputs are processed by the brain leading to perception.
Important considerations for VR designers:
Factors influencing perception (distance, resolution, FPS).
Issues like latency and nausea should be actively managed.
Depth Perception Techniques
Binocular Cues: Using differences in images from both eyes to judge distance.
Convergence: the inward rotation angle of the eyes when fixating on an object indicates its distance.
Shadow Stereopsis: using differences in the shadows seen by each eye to estimate depth.
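Binocular depth perception follows the standard stereo triangulation relation, depth = focal length × baseline / disparity: near objects project to very different positions in the two eyes (large disparity), far objects to nearly the same position. A sketch with hypothetical camera values and a human-like 6.4 cm baseline:

```python
def depth_from_disparity(focal_px, baseline_m, disparity_px):
    """Stereo triangulation: depth = focal length * baseline / disparity."""
    return focal_px * baseline_m / disparity_px

# Larger disparity => closer object, smaller disparity => farther object.
near = depth_from_disparity(focal_px=1000, baseline_m=0.064, disparity_px=64)
far = depth_from_disparity(focal_px=1000, baseline_m=0.064, disparity_px=8)
```

The same relation explains why stereo cues fade with distance: beyond a few meters, disparity shrinks toward the eye's resolution limit.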
Adaptation in Brain Processing
Over time, the brain adapts to new stimuli, changing perception.
Examples of adaptation:
New glasses leading to unusual perceptions.
Perceiving objects as stationary during head movement (the brain stabilizes gaze, much like gimbal control).
Summary of VR World Concepts
Focus areas: Senses, sensors, hardware, software, and perception.
Schedule
Weekly breakdown of lectures and lab activities related to VR concepts and projects, including timelines for exams and project proposals.