Computer Vision
Science and technology that enables machines to “see.” This can range from simple light detection, like my garage door opener’s safety system, to advanced systems that recognize objects and interpret events.
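The “simple light detection” end of that range barely needs any processing at all. As a rough illustration (the sensor function and threshold here are assumptions, not any real opener’s API), a safety-beam check can be a single threshold test:

```python
# Toy sketch of a garage-door safety beam: "vision" reduced to one light reading.
# read_light_sensor is a hypothetical callable returning a brightness in [0, 1].
def beam_is_blocked(read_light_sensor, threshold=0.2):
    """True if the light level drops below the threshold, i.e. something is in the way."""
    return read_light_sensor() < threshold

def safe_to_close(read_light_sensor):
    # Only allow the door to close if nothing interrupts the beam.
    return not beam_is_blocked(read_light_sensor)

print(safe_to_close(lambda: 0.9))  # True: beam unobstructed in this made-up reading
```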
Computer Vision: Two Branches
1st focuses on developing systems that extract information from images, advancing theory and algorithms
2nd is more applied, using cameras & sensors to control machines, such as robots or autonomous vehicles
Autonomous Vehicle: Case Study
These vehicles rely on sensors to perceive their surroundings in real time. The U.S. Department of Transportation’s National Highway Traffic Safety Administration (NHTSA) defines six levels of vehicle autonomy, from Level 0 (fully human driven) to Level 5 (completely autonomous under all conditions). Despite the progress, commercial vehicles remain at Level 3 or 4. The individual levels are described in the cards below, with a compact summary sketch after the Level 4 card.
Level 1 Vehicle Autonomy
Basic driver assistance, such as cruise control, automates one task, but the driver handles everything else
Level 2 Vehicle Autonomy
The vehicle can steer and control acceleration/deceleration, but the driver must remain alert and ready to take control
Level 3 Vehicle Autonomy
The vehicle can drive itself in controlled environments, such as highways, but the driver must take over in complex situations
Level 4 Vehicle Autonomy
The vehicle can operate autonomously in well-mapped areas like urban centers but still requires human intervention in extreme conditions, such as severe weather or unpaved roads
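A compact way to keep the six levels straight is a simple lookup table. The sketch below only paraphrases the cards above (the Level 0 and Level 5 wording comes from the case-study card); it is a study aid, not the official NHTSA definitions.

```python
# Paraphrased summary of the six autonomy levels described in these notes.
AUTONOMY_LEVELS = {
    0: "No automation: the human driver does everything",
    1: "Driver assistance: one task automated (e.g., cruise control)",
    2: "Partial automation: steering + acceleration/deceleration, driver stays alert",
    3: "Conditional automation: self-driving in controlled settings such as highways",
    4: "High automation: well-mapped areas only, human needed in extreme conditions",
    5: "Full automation: drives itself under all conditions",
}

def describe_level(level: int) -> str:
    """Return a one-line description for an autonomy level (0-5)."""
    return AUTONOMY_LEVELS.get(level, "Unknown level")

print(describe_level(3))
```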
Controlled Environment & Human Oversight
Controlled conditions are necessary because today’s computer vision systems struggle with the unpredictability and complexity of real-world scenarios. Many autonomous vehicles therefore rely on remote human operators who can take control when needed.
Capabilities Autonomous Vehicles Need
Vehicles need to predict and react to their surroundings as human drivers intuitively do. They must anticipate pedestrian movements, adjust to weather changes, and interpret road signals, all while coordinating with other vehicles and infrastructure.
Hawk Eye System
Uses multiple high-speed cameras to track the position of the tennis ball and determine whether a serve is in or out. It also has predictive ability: it uses trajectory data from the immediately preceding moments to estimate where the ball will go next
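The predictive part can be pictured as curve fitting: take the last few tracked positions and extrapolate a short step ahead. The sketch below is a toy illustration of that idea only; the sample numbers and the per-axis quadratic fit are my own assumptions, not Hawk-Eye’s actual model.

```python
import numpy as np

def predict_next_position(times, positions, t_next):
    """Fit a quadratic to recent (time, position) samples along one axis
    and extrapolate to t_next. A toy stand-in for trajectory prediction."""
    coeffs = np.polyfit(times, positions, deg=2)  # roughly ballistic motion per axis
    return np.polyval(coeffs, t_next)

# Example: recent x-coordinates (metres) of a ball at four timestamps (seconds).
t = np.array([0.00, 0.02, 0.04, 0.06])
x = np.array([0.0, 0.9, 1.7, 2.4])
print(predict_next_position(t, x, 0.08))  # estimated x at the next frame
```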
Hawk Eye System & Human Vision
Human perception often relies on prior experience to predict where an object will go. Hawk-Eye’s output is likewise an estimate, just like human perception, based on inferences from incomplete visual data.
PITCHf/x
Tracks a pitched ball’s speed, trajectory, and spin using multiple high-speed cameras positioned around the stadium
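Once the cameras yield a sequence of 3D ball positions, speed follows from basic kinematics. The sketch below shows only that step, with made-up positions; real PITCHf/x processing is far more involved.

```python
import numpy as np

def average_speed(p_start, p_end, dt):
    """Estimate average ball speed (m/s) from two tracked 3D positions
    separated by dt seconds. Illustrative values, not PITCHf/x data."""
    return np.linalg.norm(np.asarray(p_end) - np.asarray(p_start)) / dt

# Example: two triangulated positions 0.05 s apart (~40 m/s, i.e. ~90 mph).
print(average_speed([0.0, 0.0, 1.8], [2.0, 0.1, 1.7], 0.05))
```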
Connecting Human & Machine Vision
The human brain relies on regularities learned from experience, such as the assumptions that light comes from above or that objects in motion tend to stay in motion
Robotics & Computer Vision
Robots use vision to navigate and interact with their environment. For example, Flippy uses computer vision to monitor cook times and can also track nearby humans