Flashcards covering concepts from the Intelligent Agents lecture.
Agent
Something that acts in an environment, made up of a body and a controller.
Sensors
Part of an agent that converts stimuli into percepts.
Actuators
The parts of an agent, also called effectors, that convert commands into actions in the environment.
Embodied Agent
An agent that has a physical body.
Robot
An artificial purposive embodied agent.
Rational Agent
An entity that perceives its environment and acts according to some rules to achieve its goals.
Rationality
A rational agent chooses whichever action maximises the expected value of the performance measure given the percept sequence to date.
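A minimal sketch of this idea in Python, assuming hypothetical helper functions: `outcome_probability` yields (outcome, probability) pairs for an action given the percepts seen so far, and `performance_measure` scores an outcome. Neither name comes from the lecture; they only illustrate "maximise the expected value of the performance measure".

```python
def rational_action(percept_sequence, actions, outcome_probability, performance_measure):
    """Return the action that maximises expected performance given the percepts so far."""
    def expected_value(action):
        # Weight each possible outcome by its probability given the
        # percept sequence to date and the candidate action.
        return sum(prob * performance_measure(outcome)
                   for outcome, prob in outcome_probability(percept_sequence, action))
    return max(actions, key=expected_value)
```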
Performance Measure (Automated Taxi/Cab Example)
Safety, destination, profits, legality, comfort, etc.
Environment (Automated Taxi/Cab Example)
Streets, traffic, pedestrians, weather, etc.
Actuators (Automated Taxi/Cab Example)
Steering, accelerator, brake, horn, speaker/display, etc.
Sensors (Automated Taxi/Cab Example)
Video, accelerometers, gauges, engine sensors, keyboard, GPS, etc.
Performance Measure (Internet Shopping Agent Example)
Price, quality, appropriateness, efficiency.
Environment (Internet Shopping Agent Example)
Web sites, vendors, shippers.
Actuators (Internet Shopping Agent Example)
User interface controls, URL links, web forms.
Sensors (Internet Shopping Agent Example)
HTML pages (text, graphics, scripts).
Simplest Environment
An environment that is fully observable, deterministic, episodic, static, discrete and single-agent.
Most Real Situations
An environment that is partially observable, stochastic, sequential, dynamic, continuous and multi-agent.
Perfect Rationality
The agent reasons about the best action without taking its limited computational resources into account.
Bounded Rationality
The agent decides on the best action that it can find given its computational limitations.
Simple Reflex Agents
Select an action on the basis of the current percept only.
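A minimal sketch of a simple reflex agent, using made-up percept fields loosely inspired by the taxi example: the action depends only on the current percept, via condition-action rules.

```python
def simple_reflex_taxi(percept):
    """Condition-action rules applied to the current percept only."""
    if percept.get("car_in_front_is_braking"):
        return "initiate_braking"
    if percept.get("pedestrian_ahead"):
        return "initiate_braking"
    return "maintain_speed"

print(simple_reflex_taxi({"car_in_front_is_braking": True}))  # -> initiate_braking
```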
Reflex Agents with State (Model-Based Reflex Agent)
Maintain an internal state to handle partially observable environments.
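A sketch of the state-maintaining idea, assuming two hypothetical components: `update_state`, which folds the previous state, last action, and new percept into an updated world model, and `rules`, which maps that internal state to an action.

```python
class ModelBasedReflexAgent:
    def __init__(self, update_state, rules, initial_state=None):
        self.state = initial_state      # the agent's best guess about the world
        self.last_action = None
        self.update_state = update_state
        self.rules = rules              # maps an internal state to an action

    def act(self, percept):
        # Fold the new percept into the internal world model, then apply
        # the condition-action rules to the resulting state.
        self.state = self.update_state(self.state, self.last_action, percept)
        self.last_action = self.rules(self.state)
        return self.last_action
```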
Goal-Based Agents
The agent needs goal information to know which situations are desirable; the future is taken into account. This kind of agent is often investigated in search and planning research.
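A sketch of taking the future into account by searching, here with a plain breadth-first search. The problem-specific functions `successors(state)` (yielding (action, next_state) pairs) and `goal_test(state)` are assumptions for illustration, not part of the lecture.

```python
from collections import deque

def plan_to_goal(start_state, successors, goal_test):
    """Breadth-first search for an action sequence that reaches a goal state."""
    frontier = deque([(start_state, [])])
    visited = {start_state}
    while frontier:
        state, actions = frontier.popleft()
        if goal_test(state):
            return actions                      # the plan the agent would execute
        for action, next_state in successors(state):
            if next_state not in visited:
                visited.add(next_state)
                frontier.append((next_state, actions + [action]))
    return None                                 # no plan found
```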
Utility-Based Agents
Certain goals can be reached in different ways; some ways are better and have a higher utility. A utility function maps a sequence of states onto a real number.
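A small sketch of utility-based selection over single states, assuming a hypothetical one-step model `predict(state, action)`: each candidate outcome is scored by a utility function, and the agent prefers the action leading to the highest-utility state.

```python
def utility_based_choice(state, actions, predict, utility):
    """Pick the action whose predicted resulting state has the highest utility."""
    return max(actions, key=lambda action: utility(predict(state, action)))

# Toy illustration (made-up numbers): driving reaches the destination,
# which the utility function values more highly than staying en route.
utilities = {"at_destination": 10.0, "en_route": 2.0}
predict = lambda state, action: "at_destination" if action == "drive" else state
print(utility_based_choice("en_route", ["drive", "wait"], predict, utilities.get))  # -> drive
```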
Learning Element
Introduces improvements to the performance element.
Critic
Provides feedback on the agent's performance based on a fixed performance standard.
Performance Element
Selects actions based on percepts; corresponds to the agent programs described previously.
Problem Generator
Suggests actions that will lead to new and informative experiences.
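A sketch of how these four components could fit together in a learning agent. All interfaces here (the callables passed to the constructor and how they are combined) are illustrative assumptions, not code from the lecture.

```python
class LearningAgent:
    def __init__(self, performance_element, critic, learning_element, problem_generator):
        self.performance_element = performance_element  # chooses actions from percepts
        self.critic = critic                            # scores behaviour against a fixed standard
        self.learning_element = learning_element        # improves the performance element
        self.problem_generator = problem_generator      # proposes exploratory actions

    def step(self, percept):
        feedback = self.critic(percept)                            # judge current performance
        self.performance_element = self.learning_element(
            self.performance_element, feedback)                    # improve the agent program
        exploratory = self.problem_generator(percept)              # maybe try something new
        return exploratory or self.performance_element(percept)   # otherwise exploit what we know
```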