AP CSP


Algorithm Analysis

Algorithms are instructions computers follow to solve problems. Understanding how efficient these instructions are—how much time and memory they use—is important, especially when working with large amounts of data.

  • Efficiency Focus: If you're sorting a million numbers, you want to know how long it takes and how much memory it uses.

  • Binary vs. Sequential Search:

    • Sequential (Linear) Search: Check each item one by one until you find what you're looking for. It’s simple but can take a long time, especially for large lists.

    • Binary Search: Only works on sorted lists but is much faster. You keep splitting the list in half until you find the target. For example:

      • Searching 16 items takes at most about 4 guesses (logarithmic growth: each guess halves the range, 16 → 8 → 4 → 2 → 1).

      • Searching a million items takes only about 20 guesses!

    • Key Point: Binary search is like guessing a number in a game of "higher or lower." Sequential search is like checking every number one by one.
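The two searches above can be sketched in a few lines of Python (an illustrative sketch; the function names are chosen here, not standard):

```python
def linear_search(items, target):
    """Check each item one by one; works on any list."""
    for i, value in enumerate(items):
        if value == target:
            return i
    return -1  # not found

def binary_search(sorted_items, target):
    """Repeatedly halve the search range; requires a sorted list."""
    low, high = 0, len(sorted_items) - 1
    while low <= high:
        mid = (low + high) // 2
        if sorted_items[mid] == target:
            return mid
        elif sorted_items[mid] < target:
            low = mid + 1   # target is in the upper half
        else:
            high = mid - 1  # target is in the lower half
    return -1  # not found

data = list(range(16))          # 16 sorted items
print(linear_search(data, 13))  # may scan up to 16 items
print(binary_search(data, 13))  # at most ~4 halvings
```

Both calls find index 13, but binary search gets there by discarding half the remaining items on every guess.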


Sorting Algorithms

Sorting means organizing data, like putting numbers in order. There are different ways to do it:

  1. Selection Sort:

    • Find the smallest number and put it in the first position.

    • Repeat for the next smallest, and so on.

    • Example: Sort [9, 7, 4, 5] → Step 1: [4, 7, 9, 5] → Step 2: [4, 5, 9, 7] → Done: [4, 5, 7, 9].

  2. Insertion Sort:

    • Think of it like organizing cards in your hand.

    • Start with one card, add the next, and insert it into the right spot.

    • Keep going until all cards are sorted.

  3. Bubble Sort:

    • Compare two items at a time and swap them if they’re out of order.

    • Largest items "bubble up" to the end of the list with each pass, like air bubbles rising in water.

  4. Merge Sort:

    • A smarter way: Split the list into smaller chunks, sort each chunk, and merge them back together in order.

  5. Bucket and Radix Sort:

    • Group data into "buckets" based on a feature, like their first digit, and then organize each group.
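Two of the sorts above, selection sort and merge sort, can be sketched like this (a minimal illustration; it reproduces the [9, 7, 4, 5] example from above):

```python
def selection_sort(items):
    """Repeatedly move the smallest remaining item into place."""
    items = list(items)  # work on a copy
    for i in range(len(items)):
        # index of the smallest item from position i onward
        smallest = min(range(i, len(items)), key=lambda j: items[j])
        items[i], items[smallest] = items[smallest], items[i]
    return items

def merge_sort(items):
    """Split the list, sort each half, then merge the sorted halves."""
    if len(items) <= 1:
        return list(items)
    mid = len(items) // 2
    left, right = merge_sort(items[:mid]), merge_sort(items[mid:])
    merged = []
    while left and right:  # take the smaller front item each time
        merged.append(left.pop(0) if left[0] <= right[0] else right.pop(0))
    return merged + left + right  # append whatever is left over

print(selection_sort([9, 7, 4, 5]))  # [4, 5, 7, 9]
print(merge_sort([9, 7, 4, 5]))      # [4, 5, 7, 9]
```

Tracing `selection_sort([9, 7, 4, 5])` produces exactly the steps shown in the selection sort example: [4, 7, 9, 5], then [4, 5, 9, 7], then [4, 5, 7, 9].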


Limits of Algorithms

Some problems are really hard or even impossible to solve with algorithms.

  1. Undecidable Problems:

    • Some questions can’t be answered by any algorithm. For example, the Halting Problem asks: “Will this program ever stop?” No algorithm can answer that correctly for every possible program and input.

  2. Intractable Problems:

    • These can be solved in theory, but it would take so long that it’s impractical.

    • Example: The Traveling Salesman Problem (TSP) asks for the shortest route to visit a bunch of cities. For a few cities, it’s doable. For 100 cities, the number of possible routes is so huge (factorial growth) that even supercomputers struggle.
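The factorial blow-up is easy to see in code. The sketch below counts routes and brute-forces a tiny TSP instance (the four cities and distances are made up for illustration):

```python
import math
from itertools import permutations

# Route counts explode factorially with the number of cities.
for n in (5, 10, 100):
    print(f"{n} cities: about {math.factorial(n - 1):.2e} possible routes")

# Brute force is fine for a handful of cities (made-up distances):
dist = {
    ("A", "B"): 2, ("A", "C"): 9, ("A", "D"): 10,
    ("B", "C"): 6, ("B", "D"): 4, ("C", "D"): 8,
}
def d(x, y):
    return dist.get((x, y)) or dist[(y, x)]

# Try every ordering of the remaining cities, starting and ending at A.
cities = ["B", "C", "D"]
best = min(permutations(cities),
           key=lambda route: sum(d(a, b) for a, b in
                                 zip(("A",) + route, route + ("A",))))
print(best)
```

With 4 cities there are only 6 routes to try; with 100 cities there are 99! — far more than any computer could ever check.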


Parallel and Distributed Computing

When problems are too big for one computer to handle quickly, we can divide the work among multiple processors or computers.

  1. Parallel Computing:

    • Imagine sorting 100 cards by dividing them into 4 piles and having 4 people sort simultaneously.

    • This speeds things up because different parts are done at the same time.

  2. Distributed Computing:

    • Instead of one big computer, use many smaller ones connected over a network.

    • Great for solving massive problems or handling huge data, like weather forecasting or analyzing DNA.

  3. Challenges:

    • Some parts of a problem still need to be done in sequence (step-by-step), which limits how much faster parallel computing can be.
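That last limit is often captured by Amdahl's law: if a fraction p of the work can be parallelized across n processors, the overall speedup is 1 / ((1 − p) + p/n). A quick sketch:

```python
def speedup(p, n):
    """Amdahl's law: p = parallelizable fraction, n = processors."""
    return 1 / ((1 - p) + p / n)

# Even with 90% of the work parallelizable, speedup flattens out:
for n in (2, 4, 16, 1000):
    print(f"{n:>4} processors: {speedup(0.9, n):.2f}x faster")
# The serial 10% caps the speedup below 10x no matter how many
# processors are added.
```

This is why the sequential parts of a problem, not the number of processors, often set the real limit.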


Procedural Abstraction

This is about making code easier to understand and reuse by using procedures (mini-programs) with flexible inputs (parameters).

  • Old way: Use fixed commands, like "move forward 10 steps" repeated 4 times.

  • New way: Use a single command like move_forward(40). The number (40) is the parameter, making the code simpler and reusable.

  • Example: Write a single procedure draw_square(side_length) that can draw any size square. Just pass the side length (e.g., 50, 100) when you call it.
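The draw_square idea can be sketched in Python. Here the turtle/robot commands are stand-ins that just record what they would do, so the effect of the parameter is visible (all names are hypothetical):

```python
log = []  # record each command so we can see what the procedure does

def move_forward(steps):
    """Stand-in for a turtle/robot command (hypothetical)."""
    log.append(f"move forward {steps}")

def turn_right():
    log.append("turn right 90")

def draw_square(side_length):
    """One reusable procedure draws a square of ANY size."""
    for _ in range(4):
        move_forward(side_length)
        turn_right()

draw_square(50)   # small square
draw_square(100)  # big square -- same procedure, different parameter
print(log[:2])
```

One procedure with one parameter replaces two copies of nearly identical code; that is the payoff of procedural abstraction.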


Why Does This Matter?

  • Choosing the Right Tool: Some algorithms are faster and better for specific tasks. For example:

    • Use binary search if the data is sorted.

    • Use merge sort for large datasets, because its divide-and-conquer approach stays efficient as the data grows.

  • Understanding Limits: Knowing that some problems are unsolvable or slow helps set realistic expectations.

  • Parallel Power: Big problems can be solved faster by sharing the load across multiple computers.


Search Engines and the Web

  1. The Evolution of Search:

    • The web transitioned from a library-like structure (Web 1.0) with static pages to a dynamic marketplace of ideas (Web 2.0).

    • Early web organization depended on human-curated directories, but modern search engines provide instant access to vast information.

  2. Search Engines as Brokers:

    • Search engines like Google don't create content but connect users to it. This gives them immense influence over how people access and perceive information.

  3. The Process of Searching:

    • Search engines gather data, index it, and match user queries to relevant content.

    • The algorithms prioritize relevance and rankings based on complex factors, such as keyword usage, page popularity (like Google's PageRank), and user behavior.

  4. Challenges and Consequences:

    • Search results can be biased, either intentionally (like sponsored links) or unintentionally (algorithmic design).

    • Search engines shape public knowledge by determining what is seen or hidden, raising concerns about control and censorship.

  5. Impact on Knowledge and Society:

    • The vast unstructured nature of the web allows for exploration but also makes finding accurate information difficult.

    • Legal, ethical, and technical issues arise, such as the persistence of cached pages even after deletion and debates over digital privacy and intellectual property.
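The gather-index-match process from point 3 can be sketched with a toy inverted index (the pages and query are made up; a real engine also weighs links, popularity, and user behavior):

```python
# Toy "crawl": a few pages with their text.
pages = {
    "page1": "binary search on sorted lists",
    "page2": "merge sort splits lists in half",
    "page3": "search engines index the web",
}

# Index step: map each word to the set of pages containing it.
index = {}
for url, text in pages.items():
    for word in text.split():
        index.setdefault(word, set()).add(url)

# Match step: return pages that contain every word in the query.
def search(query):
    results = [index.get(w, set()) for w in query.split()]
    return sorted(set.intersection(*results)) if results else []

print(search("search lists"))  # ['page1']
```

Building the index ahead of time is what makes queries instant: the engine reads everything once, then answers each search by set lookups instead of rereading the whole web.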


Simplified Explanation:

  • Imagine the web as a giant messy library where books (websites) are added randomly every day. A librarian (search engine) helps you find what you need quickly by reading and organizing parts of every book in advance.

  • These librarians use rules to decide what’s important. However, they might miss some books or show only those from popular authors (ranking by links and popularity).

  • They also keep snapshots of books (cached pages), so even removed or outdated books can still be found, sometimes leading to controversies about privacy and copyright.

  • Finally, these librarians are powerful—they influence what knowledge you see and even how you think about the world, like a teacher choosing your reading list.