Vocabulary flashcards covering key AI concepts from the notes.
Artificial Intelligence
The study of making computers perform tasks that currently require human intelligence.
Agent
An entity that perceives its environment through sensors and acts on it through actuators, capable of autonomous action and goal pursuit.
Environment
The surroundings with which an agent interacts; the world perceived by the agent and manipulated through actions.
Percept
The agent’s perceptual inputs at a given instant.
Percept Sequence
The complete history of everything the agent has perceived to date.
Agent Function
A mapping from percept sequences to actions; it specifies which action the agent takes for every possible percept sequence.
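As a concrete (if impractical) illustration, the agent function can be implemented literally as a lookup table from percept sequences to actions. The sketch below is a minimal hypothetical Python rendering; the vacuum-world percepts and table entries are illustrative assumptions, not from the notes.

```python
# Minimal sketch: the agent function as an explicit lookup from
# percept sequences to actions. Table entries are placeholder
# assumptions for a two-square vacuum world.
percepts = []  # the percept sequence: everything perceived so far

table = {
    (("A", "Dirty"),): "Suck",
    (("A", "Clean"),): "Right",
    (("A", "Clean"), ("B", "Dirty")): "Suck",
}

def table_driven_agent(percept):
    """Append the new percept, then look up the action keyed by
    the complete percept sequence."""
    percepts.append(percept)
    return table.get(tuple(percepts), "NoOp")
```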
Rational Agent
An agent that, for each possible percept sequence, selects an action expected to maximize its performance measure, given the evidence provided by its percepts and whatever built-in knowledge it has.
Performance Measure
The criteria that determine how successful an agent is.
PEAS
Performance Measure, Environment, Actuators, and Sensors—the four components used to describe a task environment.
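As a worked example (a standard textbook illustration, not taken from the notes), a two-square vacuum-cleaner world might be described in PEAS terms as follows:

```python
# Hypothetical PEAS description of a vacuum-cleaner task environment.
peas = {
    "Performance measure": "amount of dirt cleaned, time taken, energy used",
    "Environment": "two squares (A and B), each possibly containing dirt",
    "Actuators": "wheels (move left/right), suction mechanism",
    "Sensors": "location sensor, dirt sensor",
}
```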
Task Environment
The problem setting in which an agent operates, specified by its PEAS description.
Fully Observable
An environment where the agent’s sensors provide complete information about the state at each point in time.
Partially Observable
An environment where noisy or incomplete sensor data leave some aspects of the state hidden from the agent.
Deterministic
An environment where the next state is completely determined by the current state and the agent’s action.
Stochastic
An environment where outcomes are probabilistic; the next state is not determined with certainty.
Episodic
A task environment where experiences are divided into atomic episodes; each episode is independent of previous ones.
Sequential
An environment where current decisions affect future decisions and outcomes.
Static
An environment that does not change while the agent deliberates.
Dynamic
An environment that can change while the agent is deliberating.
Discrete
An environment with discrete states, percepts, and actions (distinct, separate values).
Continuous
An environment with continuous states, percepts, and actions (no discrete steps).
Known
An environment whose dynamics are known to the agent: the outcomes (or outcome probabilities) of all actions are given.
Unknown
An environment whose dynamics are not known in advance, requiring the agent to learn.
Simple Reflex Agent
An agent that selects actions based only on the current percept, using condition–action rules; works best when the environment is fully observable.
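A minimal sketch of such condition–action rules, assuming the standard two-square vacuum world with percepts of the form (location, status):

```python
# Simple reflex agent: the action depends only on the current
# percept, via condition-action rules. The percept format is an
# assumption, not from the notes.
def simple_reflex_vacuum_agent(percept):
    location, status = percept      # e.g. ("A", "Dirty")
    if status == "Dirty":           # rule: dirty square -> clean it
        return "Suck"
    elif location == "A":           # rule: at A and clean -> move right
        return "Right"
    else:                           # rule: at B and clean -> move left
        return "Left"
```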
Model-Based Reflex Agent
An agent that maintains an internal state and a model of the world to handle partial observability and track unseen aspects.
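A minimal sketch, assuming the same vacuum world: the agent folds each percept into an internal state so it can act sensibly about squares it cannot currently see. The names and the (trivial) world model are illustrative assumptions:

```python
# Model-based reflex agent: maintains an internal belief about each
# square and updates it from the latest percept.
class ModelBasedReflexAgent:
    def __init__(self):
        # belief state: what the agent thinks each square contains
        self.state = {"A": "Unknown", "B": "Unknown"}

    def update_state(self, percept):
        location, status = percept
        self.state[location] = status   # fold the new percept into the model

    def __call__(self, percept):
        self.update_state(percept)
        location, status = percept
        if status == "Dirty":
            return "Suck"
        # use internal state: only travel toward a square not known clean
        other = "B" if location == "A" else "A"
        if self.state[other] != "Clean":
            return "Right" if location == "A" else "Left"
        return "NoOp"
```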
Goal-Based Agent
An agent that selects actions to achieve specific goals, enabling flexible behavior by adding goal information.
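A minimal sketch of goal-directed action selection: instead of applying fixed rules, the agent tests which available action leads to a state satisfying the goal. The goal_test and successors interfaces here are illustrative assumptions:

```python
# Goal-based agent: choose an action by checking whether its
# predicted result satisfies the goal.
def goal_based_agent(state, goal_test, successors):
    """Return an action whose predicted next state satisfies the
    goal, or None if no single action achieves it.

    successors(state) yields (action, next_state) pairs from the
    agent's model of the world.
    """
    for action, next_state in successors(state):
        if goal_test(next_state):
            return action
    return None
```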