Module 2

Introduction to Meta-Heuristic and Evolutionary Algorithms

  • Brief review of methods for searching the decision space of optimization problems.

  • Components of meta-heuristic and evolutionary algorithms are introduced.

  • Relation to engineering optimization problems is highlighted.

  • Topics covered include:

    • Coding of algorithms.

    • Dealing with constraints.

    • Generation of initial solutions.

    • Iterative selection of solutions.

    • Performance evaluation of algorithms.

  • A general algorithm encompassing steps of all meta-heuristic and evolutionary algorithms is presented.

2.1 Searching the Decision Space for Optimal Solutions

  • The decision space includes all possible solutions for an optimization problem.

  • Objective: Find the best solution based on the objective function.

  • Procedures for finding optimum include:

    • Sampling Grid: Evaluates all possible solutions for discrete problems; impractical for large problems.

    • Random Sampling: Evaluates solutions chosen at random; the probability of finding the optimal solution within a given number of samples can be quantified.

    • Targeted Sampling: Utilizes knowledge from previous solutions to guide the search for new samples, prioritizing areas where optimal solutions are likely.

2.1.1 Sampling Grid

  • Purpose: Evaluates all solutions to identify the optimum.

  • Can be practical only for relatively small discrete problems.

  • For continuous problems, the decision space can be discretized into a grid so the optimum is approximated.

  • Reducing the grid spacing (a finer grid) increases accuracy but raises the computational burden.
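
A minimal sketch of grid sampling over a small two-variable discrete problem is given below; the objective function and grid bounds are illustrative assumptions, not taken from the text.

```python
import itertools

def objective(x1, x2):
    # Illustrative objective to be minimized (assumed, not from the text).
    return (x1 - 3) ** 2 + (x2 - 5) ** 2

# Discrete grid over each decision variable.
grid_x1 = range(0, 11)   # x1 in {0, 1, ..., 10}
grid_x2 = range(0, 11)   # x2 in {0, 1, ..., 10}

# Evaluate every grid point and keep the best one.
best_point, best_value = None, float("inf")
for x1, x2 in itertools.product(grid_x1, grid_x2):
    value = objective(x1, x2)
    if value < best_value:
        best_point, best_value = (x1, x2), value

print(best_point, best_value)   # (3, 5) 0
```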

2.1.2 Random Sampling

  • Randomly selects and evaluates solutions.

  • The probability of selecting an optimal solution is calculated using the hypergeometric distribution.

  • Like the sampling grid, it is inefficient because it does not learn from previous evaluations.
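
For comparison, a sketch of random sampling under the same assumed objective; the sample budget is arbitrary, and nothing is learned between draws.

```python
import random

def objective(x1, x2):
    # Same illustrative objective as before, to be minimized.
    return (x1 - 3) ** 2 + (x2 - 5) ** 2

random.seed(0)
budget = 50                       # number of random evaluations (arbitrary)
best_point, best_value = None, float("inf")
for _ in range(budget):
    # Each candidate is drawn uniformly at random, independent of past draws.
    x1, x2 = random.randint(0, 10), random.randint(0, 10)
    value = objective(x1, x2)
    if value < best_value:
        best_point, best_value = (x1, x2), value

print(best_point, best_value)
```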

2.1.3 Targeted Sampling

  • Builds upon the principles of learning from past evaluations to select future samples.

  • Provides a systematic approach to searching for optimal solutions and underlies meta-heuristic and evolutionary algorithms.
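
A minimal sketch of targeted sampling: new candidates are drawn in a neighborhood of the best solution found so far, so past evaluations guide the search. The neighborhood size and budget are illustrative assumptions.

```python
import random

def objective(x1, x2):
    # Same illustrative objective, to be minimized.
    return (x1 - 3) ** 2 + (x2 - 5) ** 2

random.seed(0)
current = (random.randint(0, 10), random.randint(0, 10))
current_value = objective(*current)

for _ in range(50):
    # Propose a new sample in a small neighborhood of the best point so far.
    candidate = tuple(min(10, max(0, x + random.choice((-1, 0, 1)))) for x in current)
    value = objective(*candidate)
    if value < current_value:      # keep only improving moves
        current, current_value = candidate, value

print(current, current_value)
```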

2.2 Definition of Terms of Meta-Heuristic and Evolutionary Algorithms

  • Algorithms involve a sequence of operations aimed at solving problems iteratively.

  • Essential components include:

    • Initial State: Starting point for variables, which can be predefined or randomly generated.

    • Iterations: Operations performed repeatedly to refine solutions.

    • Final State: Condition to stop the algorithm, satisfying specific termination criteria.

    • Initial Data: Information required for simulations and parameters for algorithm execution.

    • Decision Variables: Values computed by the algorithm representing potential solutions.

    • Fitness Function: The objective function transformed to include penalties for constraint violations.

    • Constraints: Conditions that define the feasible space for valid solutions.

2.2.1 to 2.2.10

  • Detailed definitions of each term and their importance in understanding meta-heuristic/evolutionary algorithms, including examples of decision variables and objective functions for various engineering problems.
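
As a concrete illustration of these terms, the sketch below bundles one candidate solution's decision variables, objective value, penalty, and fitness into a single object, with the algorithm's parameters kept as initial data. The structure and values are assumptions for illustration, not definitions from the text.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Solution:
    """One candidate solution as handled by a meta-heuristic algorithm (illustrative)."""
    decision_variables: List[float]   # values the algorithm searches over
    objective: float = float("inf")   # raw objective-function value
    penalty: float = 0.0              # accumulated constraint-violation penalty
    fitness: float = float("inf")     # penalized objective used for selection

# Algorithm parameters and termination settings belong to the initial data.
INITIAL_DATA = {"population_size": 20, "max_iterations": 100}

s = Solution(decision_variables=[2.5, 7.0])
print(s, INITIAL_DATA)
```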

2.3 Principles of Meta-Heuristic and Evolutionary Algorithms

  • The simulation model interacts with optimization algorithms to evaluate decision variables, state variables, objective functions, and constraints.

  • The iterative procedure uses past solutions to generate new candidate solutions that improve the objective function.

  • The optimization algorithm executes independently of the simulation model, which it treats as a black box for evaluating candidate solutions.
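
A sketch of this separation: the optimizer passes decision variables to the simulation and reads back state variables, the objective value, and constraint violations, without ever looking inside the model. The function names, state variable, and constraint are hypothetical.

```python
def simulation_model(decision_variables):
    """Hypothetical black-box simulation: maps decision variables to a state
    variable, an objective value, and a constraint violation."""
    x1, x2 = decision_variables
    state = x1 + 2.0 * x2                      # hypothetical state variable
    objective = (x1 - 3.0) ** 2 + (x2 - 5.0) ** 2
    violation = max(0.0, state - 20.0)         # hypothetical constraint: state <= 20
    return state, objective, violation

def best_candidate(candidates):
    """The optimizer only ranks the simulation's outputs; it never inspects the model."""
    best, best_score = None, float("inf")
    for c in candidates:
        _, obj, violation = simulation_model(c)
        score = obj + 100.0 * violation        # penalized objective
        if score < best_score:
            best, best_score = c, score
    return best

print(best_candidate([(0.0, 0.0), (3.0, 5.0), (10.0, 10.0)]))
```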

2.4 Classification of Meta-Heuristic and Evolutionary Algorithms

  • Algorithms can be classified based on:

    • Nature-Inspired vs. Non-Nature-Inspired Algorithms (e.g., Genetic Algorithms vs. Tabu Search).

    • Population-Based vs. Single-Point Search Algorithms (e.g., Genetic Algorithms vs. Local Search Methods).

    • Memory-Based vs. Memory-Less Algorithms (using historical search data to guide future searches).

2.5 Meta-Heuristic and Evolutionary Algorithms in Discrete or Continuous Domains

  • Solutions defined as arrays of decision variables.

  • Decision variables categorized as binary, discrete, or continuous depending on the problem context.
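
A short sketch of how a solution array might be generated for each domain type; the number of variables, discrete value set, and continuous bounds are arbitrary assumptions.

```python
import random

random.seed(0)
n = 5  # number of decision variables (arbitrary)

binary_solution = [random.randint(0, 1) for _ in range(n)]                # binary domain
discrete_solution = [random.choice([10, 20, 30, 40]) for _ in range(n)]   # discrete value set
continuous_solution = [random.uniform(-5.0, 5.0) for _ in range(n)]       # continuous bounds

print(binary_solution, discrete_solution, continuous_solution)
```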

2.6-2.7 Dealing with Constraints

  • Discusses methods for managing infeasible solutions, including:

    • Removal Method: Eliminates infeasible solutions from consideration.

    • Refinement Method: Modifies (repairs) infeasible solutions so that they become feasible.

    • Penalty Functions: Adds penalties to the objective function in case of constraint violations.
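
A sketch of the removal and refinement strategies for a simple bound constraint; the bounds and candidate values are illustrative.

```python
LOWER, UPPER = 0.0, 10.0      # assumed variable bounds

def is_feasible(x):
    return all(LOWER <= v <= UPPER for v in x)

def remove_infeasible(candidates):
    """Removal method: simply discard infeasible solutions."""
    return [x for x in candidates if is_feasible(x)]

def refine(x):
    """Refinement (repair) method: clip each variable back into its bounds."""
    return [min(UPPER, max(LOWER, v)) for v in x]

candidates = [[2.0, 12.0], [4.0, 5.0], [-1.0, 3.0]]
print(remove_infeasible(candidates))      # only the feasible candidate survives
print([refine(x) for x in candidates])    # every candidate repaired to feasibility
```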

2.8 Fitness Function

  • The fitness function is the penalized objective function: the objective value combined with penalties for constraint violations, and it is what the algorithm actually optimizes.
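
A minimal penalized fitness sketch for a minimization problem; the penalty coefficient is an arbitrary assumption.

```python
PENALTY_COEFFICIENT = 1000.0   # arbitrary; large enough to dominate the objective

def fitness(objective_value, constraint_violations):
    """Penalized objective for minimization: feasible solutions keep their
    objective value, infeasible ones are made proportionally worse."""
    total_violation = sum(max(0.0, g) for g in constraint_violations)
    return objective_value + PENALTY_COEFFICIENT * total_violation

print(fitness(4.0, [0.0, 0.0]))   # feasible: fitness equals the objective
print(fitness(4.0, [0.3, 0.0]))   # infeasible: objective plus a penalty
```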

2.9 Selection of Solutions in Each Iteration

  • Selection process determines which current solutions contribute to generating new solutions.

  • Selection methods can be random or deterministic; the selective pressure they apply affects the search's efficiency and effectiveness.
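
One common random selection rule is binary tournament selection, sketched below for a minimization problem; larger tournaments increase selective pressure. The population and fitness values are illustrative.

```python
import random

def tournament_select(population, fitnesses, k=2):
    """Pick k solutions at random and return the one with the best (lowest) fitness;
    a larger k applies higher selective pressure."""
    contenders = random.sample(range(len(population)), k)
    winner = min(contenders, key=lambda i: fitnesses[i])
    return population[winner]

random.seed(0)
population = [[1.0, 2.0], [3.0, 1.0], [0.5, 0.5], [4.0, 4.0]]
fitnesses = [5.0, 10.0, 0.5, 32.0]
parents = [tournament_select(population, fitnesses) for _ in range(4)]
print(parents)
```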

2.10 Generating New Solutions

  • New solutions generated from current solutions during each iteration are critical for the algorithm's advancement.
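
A sketch of one common way to generate new solutions from selected ones (arithmetic crossover plus a small random mutation); the operators and step size are assumptions, since different algorithms use different generation rules.

```python
import random

def crossover(parent_a, parent_b):
    """Arithmetic crossover: the child is a random blend of two parents."""
    w = random.random()
    return [w * a + (1.0 - w) * b for a, b in zip(parent_a, parent_b)]

def mutate(child, step=0.1):
    """Mutation: small random perturbation of each variable."""
    return [v + random.uniform(-step, step) for v in child]

random.seed(0)
child = mutate(crossover([1.0, 2.0], [3.0, 1.0]))
print(child)
```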

2.12-2.16 General Algorithm and Performance Evaluation

  • Presents a general algorithm's order of operations for meta-heuristic and evolutionary methods.

  • Evaluates the performance based on convergence towards a near-optimal solution, considering factors like number of evaluations and iterations.
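
Putting the pieces together, below is a sketch of the generic loop shared by these algorithms, counting objective-function evaluations as a simple performance measure. The population size, iteration limit, operators, and objective are all illustrative assumptions rather than the text's specific algorithm.

```python
import random

def objective(x):
    # Illustrative objective to be minimized.
    return sum((v - 3.0) ** 2 for v in x)

random.seed(0)
pop_size, n_vars, max_iters = 10, 2, 50
evaluations = 0

# Generate the initial population of solutions at random.
population = [[random.uniform(-10, 10) for _ in range(n_vars)] for _ in range(pop_size)]

for _ in range(max_iters):                    # repeat until the termination criterion
    fitnesses = [objective(x) for x in population]
    evaluations += len(population)
    # Select solutions (here: the better half of the population).
    ranked = [x for _, x in sorted(zip(fitnesses, population))]
    parents = ranked[: pop_size // 2]
    # Generate new solutions from the selected ones (blend plus small perturbation).
    children = []
    for _ in range(pop_size - len(parents)):
        a, b = random.sample(parents, 2)
        w = random.random()
        children.append([w * ai + (1 - w) * bi + random.uniform(-0.1, 0.1)
                         for ai, bi in zip(a, b)])
    population = parents + children

best = min(population, key=objective)
print(best, objective(best), evaluations)
```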

Conclusion

  • This chapter elucidates various approaches to meta-heuristic and evolutionary algorithms, emphasizing their use in optimization problems across engineering contexts.
