Local Search
Keep track of a single current state and move only to neighboring states, trying to improve the current state.
Iterative improvement algorithms: the state space is a set of complete configurations.
Local Maximum
A state that is better than all its neighboring states, but there is another state somewhere in the state space that is better than it.
Global Maximum
The best possible state of a state space. It has the highest value of objective function.
Flat Local Maximum
A flat area of the landscape where all the neighboring states of the current state have the same objective value.
Shoulder
A plateau that has an uphill edge, so progress out of it is still possible.
Hill-climbing
Always moves to the best/most beneficial successor state, according to the objective function. If no neighboring state has a higher value, return the current state. If several successor states are tied for the best value, choose among them at random.
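A minimal hill-climbing sketch in Python (not part of the original cards): it maximizes a toy objective over integer states whose neighbors are x-1 and x+1, stops when no neighbor is better, and breaks ties at random.

```python
import random

def f(x):
    # Toy objective with its global maximum at x = 7.
    return -(x - 7) ** 2

def hill_climb(start):
    current = start
    while True:
        neighbors = [current - 1, current + 1]
        best_value = max(f(n) for n in neighbors)
        if best_value <= f(current):
            return current                      # no neighbor is better: stop
        # Break ties among equally good successors at random.
        best = [n for n in neighbors if f(n) == best_value]
        current = random.choice(best)

print(hill_climb(0))   # climbs 0 -> 1 -> ... -> 7
```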
Foothill Problem
The search process gets stuck at a local maximum.
Non-determinism
Random steps in random directions
Backtracking
Maintain a list of visited states. If the search reaches an undesirable state, backtrack to a previous configuration and explore a new path.
Ridge Problem
The search space may contain ridges of high-quality states which fall off steeply to low-quality states on either side.
Plateau Problem
On a plateau, local changes produce no change in the objective function, so the search has no way of knowing how to proceed.
Sideways move
A move to a state with the same quality as the current state. The number of sideways moves allowed is limited, to avoid wandering on a plateau forever.
Simple Hill-Climbing
Examines one neighboring state at a time; if that neighbor is better, it becomes the new current state.
Steepest Ascent Hill-Climbing
Considers all neighboring states and selects the one with the best objective value.
Stochastic Hill-Climbing
Selects at random from the uphill moves. The probability of selection varies with the steepness of the uphill move.
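A small sketch of one stochastic hill-climbing step, assuming the caller supplies the objective f and the list of neighbors; uphill moves are weighted by how much they improve the objective (their steepness).

```python
import random

def stochastic_step(current, neighbors, f):
    # Consider only uphill moves; the probability of picking each one is
    # proportional to the steepness (improvement) of that move.
    uphill = [(n, f(n) - f(current)) for n in neighbors if f(n) > f(current)]
    if not uphill:
        return current                          # no uphill move: stay put
    moves, gains = zip(*uphill)
    return random.choices(moves, weights=gains, k=1)[0]

# Example: from x = 3 on f(x) = -(x - 7)**2, only x = 4 is uphill.
print(stochastic_step(3, [2, 4], lambda x: -(x - 7) ** 2))
```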
First-Choice Hill-Climbing
Implements stochastic hill climbing by randomly generating successors until one better than the current state is found, then moving to it.
Random-restart Hill-climbing
Runs repeated hill-climbing searches from randomly generated initial states until a goal state is found.
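A toy random-restart sketch over a hand-made bumpy objective (the objective and the goal threshold are illustrative assumptions, not from the cards): plain hill climbing gets trapped on the small local maxima, while restarting from fresh random states eventually lands next to the global maximum.

```python
import random

def f(x):
    # Bumpy toy objective on 0..99: many small local maxima, global maximum at x = 50.
    return 10 * (x == 50) + (x % 7)

def hill_climb(x):
    while True:
        better = [n for n in (x - 1, x + 1) if 0 <= n < 100 and f(n) > f(x)]
        if not better:
            return x
        x = max(better, key=f)

def random_restart(goal_value, max_restarts=1000):
    best = None
    for _ in range(max_restarts):
        result = hill_climb(random.randrange(100))
        if best is None or f(result) > f(best):
            best = result
        if f(best) >= goal_value:
            break
    return best

print(random_restart(goal_value=10))   # almost surely prints 50
```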
Simulated Annealing
Escapes local optima by sometimes accepting moves that are worse than the current solution, with an acceptance probability that decreases as the search proceeds (the temperature cools).
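A hedged sketch of the simulated-annealing acceptance rule, assuming the caller provides a neighbor generator and an objective f to maximize; the geometric cooling schedule and all parameter values are illustrative choices.

```python
import math, random

def simulated_annealing(start, neighbor, f, t0=10.0, cooling=0.995, steps=5000):
    current = best = start
    t = t0
    for _ in range(steps):
        nxt = neighbor(current)
        delta = f(nxt) - f(current)
        # Always accept improvements; accept worse moves with probability
        # exp(delta / t), which shrinks as the temperature t decreases.
        if delta > 0 or (t > 1e-9 and random.random() < math.exp(delta / t)):
            current = nxt
            if f(current) > f(best):
                best = current
        t *= cooling
    return best

# Toy run on a bumpy objective whose global maximum is at x = 50.
bumpy = lambda x: 10 * (x == 50) + (x % 7)
step = lambda x: max(0, min(99, x + random.choice([-1, 1])))
print(simulated_annealing(start=0, neighbor=step, f=bumpy))
```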
Local Beam Search
Keeps track of k states instead of a single state, starting from k randomly generated states. At each step, the successors of all k states are generated and the k best are kept.
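A minimal local beam search sketch, assuming the caller supplies a random-state generator, a neighbor function, and the objective f; at each step the k best states among the current states and all their successors are kept.

```python
import random

def local_beam_search(k, random_state, neighbors, f, iterations=100):
    states = [random_state() for _ in range(k)]
    for _ in range(iterations):
        # Pool the current states with all of their successors, keep the k best.
        pool = {s for st in states for s in neighbors(st)} | set(states)
        states = sorted(pool, key=f, reverse=True)[:k]
    return max(states, key=f)

print(local_beam_search(
    k=3,
    random_state=lambda: random.randrange(100),
    neighbors=lambda x: [max(0, x - 1), min(99, x + 1)],
    f=lambda x: 10 * (x == 50) + (x % 7),
))
```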
Genetic Algorithms
A successor state (offspring) is generated by combining two parent states. A state is represented as a chromosome, a string of genes. The evaluation function is a fitness function; higher values are better.
Survival of the fittest: more fit individuals are selected to reproduce; less fit individuals are discarded.
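A compact generational GA loop, sketched on the classic One-Max toy problem (maximize the number of 1-genes); the specific operators (roulette-wheel selection, one-point crossover, bit-flip mutation) and the population size and rates are illustrative assumptions.

```python
import random

GENES, POP, GENERATIONS = 20, 30, 100

def fitness(chrom):
    # One-Max: fitness is simply the number of 1-genes.
    return sum(chrom)

def select(pop):
    # Roulette-wheel selection: probability proportional to fitness (+1 avoids all-zero weights).
    return random.choices(pop, weights=[fitness(c) + 1 for c in pop], k=1)[0]

def crossover(a, b):
    # One-point crossover: head of one parent, tail of the other.
    point = random.randrange(1, GENES)
    return a[:point] + b[point:]

def mutate(chrom, rate=0.05):
    # Bit-flip mutation with a small per-gene probability.
    return [1 - g if random.random() < rate else g for g in chrom]

population = [[random.randint(0, 1) for _ in range(GENES)] for _ in range(POP)]
for _ in range(GENERATIONS):
    population = [mutate(crossover(select(population), select(population)))
                  for _ in range(POP)]
print(max(fitness(c) for c in population))   # approaches 20 as the run converges
```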
Selection Strategies
Roulette Wheel - Every individual can become a parent with a probability proportional to its fitness
Stochastic Universal Sampling - Similar to Roulette Wheel, but the same individual cannot be chosen as both parents
Rank-based Selection - Every individual is ranked according to its fitness; higher-ranked individuals are preferred as parents
Tournament Selection - Select k individuals at random and select the best out of these to become a parent. Repeat for the next parent.
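Sketches of two of the selection strategies above (roulette wheel and tournament), assuming a list of chromosomes and a non-negative fitness function.

```python
import random

def roulette_wheel(population, fitness):
    # Each individual is chosen with probability proportional to its fitness
    # (assumes at least one individual has positive fitness).
    return random.choices(population, weights=[fitness(c) for c in population], k=1)[0]

def tournament(population, fitness, k=3):
    # Pick k individuals at random and keep the fittest one.
    return max(random.sample(population, k), key=fitness)

pop = [[random.randint(0, 1) for _ in range(8)] for _ in range(10)]
parent_a = roulette_wheel(pop, fitness=sum)
parent_b = tournament(pop, fitness=sum)
print(parent_a, parent_b)
```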
Cross Over Operator
Essentially mixing the genes of the parents to make the genes of the offspring.
One point crossover - a random crossover point is selected and the tails of the two parents are swapped to produce new offspring.
Multi-point crossover - multiple crossover points are selected
Uniform crossover - each gene is treated separately and is swapped between the parents (or not) at random.
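Sketches of one-point and uniform crossover on list-encoded chromosomes (the helper names are illustrative, not from the cards).

```python
import random

def one_point(a, b):
    # Swap the tails of the two parents after a random crossover point.
    point = random.randrange(1, len(a))
    return a[:point] + b[point:], b[:point] + a[point:]

def uniform(a, b, swap_prob=0.5):
    # Decide gene by gene whether to swap it between the parents.
    child1, child2 = [], []
    for x, y in zip(a, b):
        if random.random() < swap_prob:
            x, y = y, x
        child1.append(x)
        child2.append(y)
    return child1, child2

print(one_point([1, 1, 1, 1, 1, 1], [0, 0, 0, 0, 0, 0]))
```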
Mutation operator
Introduce genetic variability into the population.
Bit flip - select one or more random bits and flip them
Swap - randomly select two positions on the chromosome and switch them
Scramble - A subsection of genes is chosen and scrambled randomly
Inversion - A subsection of genes is chosen and reversed
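Sketches of the four mutation operators above on list-encoded chromosomes; positions and subsections are chosen uniformly at random, which is an illustrative choice.

```python
import random

def bit_flip(chrom, rate=0.1):
    # Flip each bit independently with a small probability.
    return [1 - g if random.random() < rate else g for g in chrom]

def swap(chrom):
    # Exchange the genes at two random positions.
    c = chrom[:]
    i, j = random.sample(range(len(c)), 2)
    c[i], c[j] = c[j], c[i]
    return c

def scramble(chrom):
    # Shuffle a randomly chosen subsection of the chromosome.
    c = chrom[:]
    i, j = sorted(random.sample(range(len(c) + 1), 2))
    section = c[i:j]
    random.shuffle(section)
    return c[:i] + section + c[j:]

def inversion(chrom):
    # Reverse a randomly chosen subsection of the chromosome.
    c = chrom[:]
    i, j = sorted(random.sample(range(len(c) + 1), 2))
    return c[:i] + c[i:j][::-1] + c[j:]

print(inversion([1, 2, 3, 4, 5, 6]))
```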