Seven Patterns of AI & Related Concepts (Lecture Notes)

When to Use AI & What It Is Not Suited For

  • Questions to decide if AI/cognitive tech is appropriate:
    • Do you have a problem with repetitive inputs and outputs? (high repeatability)
    • Do you require a solution that can be repeated exactly the same way each time?
    • Is there little variability in the process, data, inputs, or flows?
    • Do you need a problem solution with a high degree of accuracy?
    • Can you tolerate ambiguity, or do you require deterministic results with no false positives/negatives?
    • Do you understand the problem you’re trying to solve?
  • What AI & cognitive technologies are not suited for:
    • Repetitive, deterministic automation tasks that can be coded or recorded directly
    • Formulaic analytics best handled by traditional BI platforms
    • Systems requiring 100% accuracy where training would not guarantee perfection
    • Situations with insufficient data to train a model
    • Scenarios where hiring a person is easier, cheaper, or faster
    • "Doing AI" for its own sake, without understanding what it is or what it's for
  • Value and caution:
    • There is substantial value in AI & cognitive tech when used appropriately; avoid vague or misapplied deployments

The Seven Patterns of AI (Overview)

  • A categorization system to group AI applications into like areas
  • Patterns covered:
    • Recognition
    • Conversation & Human Interaction
    • Predictive Analytics & Decisions
    • Goal-Driven Systems (Reinforcement/Optimization)
    • Autonomous Systems
    • Patterns & Anomalies
    • Hyper-Personalization
  • For each pattern, identify data needs, suitable algorithms, and typical applications

Pattern 1: Recognition

  • Objective: Make sense of unstructured data (images, audio, video, handwriting, etc.)
  • Technologies:
    • Computer vision
    • Deep learning (e.g., convolutional neural networks)
    • Optical character recognition (OCR)
  • Applications / Examples:
    • Facial recognition (identifying people in images)
    • Object detection and classification in images or video
    • Handwriting/text recognition (beyond OCR; a minimal classification sketch follows this pattern's notes)
    • Gesture and motion analysis (pose estimation)
    • Sound and music recognition (instruments, songs, animal calls)
  • Key terms (concepts often used):
    • Object recognition
    • Classification
    • Segmentation
    • Pose estimation
    • Event detection
    • Scene reconstruction
    • Image indexing and search
    • Motion estimation
    • 3D modeling
    • Image restoration
  • Real-world uses:
    • Security and surveillance (facial recognition)
    • Retail analytics (gesture-based interactions)
    • Medical (pose estimation, image analysis, diagnostics)
    • Entertainment and content analysis
    • Document processing (image-based data extraction)
  • Important dataset reference:
    • ImageNet: a large image dataset used for computer vision research
  • ImageNet specifics:
    • Created in 2006 by Fei-Fei Li; contains over 14,000,000 labeled images
    • Organized according to WordNet hierarchy; used for benchmark challenges
    • ImageNet Large Scale Visual Recognition Challenge (ILSVRC) started in 2010
    • Concerns: data may contain labeling errors; some analyses report roughly 5% mislabeled data and embedded bias
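  • Illustrative sketch (added here as an assumption, not taken from the slides): at its core, recognition maps unstructured input to labels. The minimal example below trains a handwritten-digit classifier with scikit-learn; a production recognition system would more likely use a convolutional neural network trained on data such as ImageNet.

```python
# Minimal recognition sketch: classify 8x8 handwritten-digit images.
# Uses scikit-learn's bundled digits dataset; a real system would likely use a CNN.
from sklearn.datasets import load_digits
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

digits = load_digits()                      # ~1,800 labeled 8x8 grayscale images
X_train, X_test, y_train, y_test = train_test_split(
    digits.data, digits.target, test_size=0.2, random_state=0)

model = SVC(gamma=0.001)                    # small support-vector classifier
model.fit(X_train, y_train)

print("test accuracy:", accuracy_score(y_test, model.predict(X_test)))
```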

Pattern 2: Conversation & Human Interaction

  • Objective: Machines interact with humans using natural language and other human-centric channels
  • Technologies:
    • Natural Language Processing (NLP)
    • Natural Language Understanding (NLU)
    • Natural Language Generation (NLG)
    • Speech-to-Text (STT) / Automatic Speech Recognition (ASR)
    • Text-to-Speech (TTS)
  • Applications:
    • Chatbots (text or voice)
    • Voice assistants (e.g., Siri, Alexa)
    • Machine translation
    • Sentiment analysis (a minimal sketch follows this pattern's notes)
    • Content summarization
  • NLP/NLU/NLG concepts:
    • NLP: getting machines to understand and communicate in human language
    • NLU: understanding meaning, intent, entities in text/speech
    • NLG: generating human-like text/speech from data
    • STT/ASR: converting spoken language into text
    • TTS: converting text into spoken language
  • Content summarization & analysis:
    • Produce concise overviews of large texts or content
    • Use NLU to extract key information and provide semantic summaries
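  • Illustrative sketch (an assumption, not from the slides): a minimal sentiment classifier built from bag-of-words features; the four labeled sentences are invented purely for illustration.

```python
# Toy sentiment-analysis sketch: bag-of-words features + logistic regression.
# The four labeled sentences below are invented purely for illustration.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

texts = ["great product, works perfectly", "terrible support, very slow",
         "love the new interface", "broken on arrival, waste of money"]
labels = ["positive", "negative", "positive", "negative"]

clf = make_pipeline(TfidfVectorizer(), LogisticRegression())
clf.fit(texts, labels)

print(clf.predict(["the update is great"]))   # expected: ['positive']
```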

Pattern 3: Predictive Analytics & Decisions

  • Objective: Use past/current data to predict future outcomes and aid decision-making
  • Technologies:
    • Machine learning models for classification, regression, time-series analysis
  • Applications:
    • Forecasting (inventory, weather, finance; a minimal forecasting sketch follows this pattern's notes)
    • Risk/fraud detection (spotting anomalies in transactions)
    • Recommendation engines
    • Predictive maintenance (anticipating equipment failure)
    • Marketing optimization (targeting campaigns, predicting customer behavior)
  • Key concepts:
    • Decision support: presenting data and alternatives to inform decisions
    • Predictive analytics: forecasting trends/outcomes based on history
  • Practical guidance:
    • Use when past/current data can inform future outcomes
    • Best-suited for identifying trends and risks, and for making actionable recommendations
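  • Illustrative sketch (an assumption, not from the slides): a minimal forecast that fits a trend to past demand and projects it forward; the monthly figures are made up.

```python
# Minimal predictive-analytics sketch: fit a trend to past demand, forecast ahead.
# The monthly demand figures are made up for illustration.
import numpy as np
from sklearn.linear_model import LinearRegression

months = np.arange(1, 13).reshape(-1, 1)          # past 12 months
demand = np.array([110, 112, 118, 121, 125, 131,  # observed units sold per month
                   134, 140, 143, 150, 153, 158])

model = LinearRegression().fit(months, demand)
next_quarter = np.array([[13], [14], [15]])       # months 13-15
print("forecast:", model.predict(next_quarter).round(1))
```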

Pattern 4: Goal-Driven Systems (Reinforcement Learning/Optimization)

  • Objective: Find the optimal solution to a problem through trial and error or simulation
  • Technologies: Reinforcement learning, simulation, optimization, planning
  • Applications:
    • Scenario simulation
    • Game playing
    • Resource optimization (money, equipment, time, other resources)
    • Robo-advising and real-time bidding
  • Key idea: learn which actions lead to the best outcomes by iterating over many scenarios (a toy Q-learning sketch follows these notes)
  • Related notes:
    • DeepMind and AlphaGo/AlphaZero milestones highlight reinforcement learning capabilities
    • This pattern excels where exploring many possible outcomes yields the best strategy
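  • Illustrative sketch (an assumption, not from the slides): tabular Q-learning on a toy one-dimensional corridor, showing how a goal-driven system learns by trial and error rather than from labeled examples.

```python
# Toy goal-driven sketch: tabular Q-learning on a 1-D corridor.
# States 0..5; reaching state 5 gives reward +1 and ends the episode.
import random

N_STATES, ACTIONS = 6, (-1, +1)                 # step left / step right
alpha, gamma, epsilon = 0.5, 0.9, 0.2
Q = {(s, a): 0.0 for s in range(N_STATES) for a in ACTIONS}

def greedy(s):
    best = max(Q[(s, a)] for a in ACTIONS)      # best action, ties broken at random
    return random.choice([a for a in ACTIONS if Q[(s, a)] == best])

for _ in range(300):                            # trial-and-error episodes
    s = 0
    for _ in range(100):                        # cap episode length
        a = random.choice(ACTIONS) if random.random() < epsilon else greedy(s)
        s_next = min(max(s + a, 0), N_STATES - 1)
        reward = 1.0 if s_next == N_STATES - 1 else 0.0
        target = reward + gamma * max(Q[(s_next, b)] for b in ACTIONS)
        Q[(s, a)] += alpha * (target - Q[(s, a)])
        s = s_next
        if s == N_STATES - 1:
            break

# After training, the greedy policy should step right (+1) from every state.
print([greedy(s) for s in range(N_STATES - 1)])
```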

Pattern 5: Autonomous Systems

  • Objective: Minimize human involvement; systems perceive, predict, plan, and act independently
  • Examples:
    • Autonomous vehicles
    • Autonomous robots/software
    • Autonomous business processes (software bots making decisions)
  • Key distinctions:
    • Automation: repetitive, rule-based tasks; often human-programmed
    • Autonomy: systems that can operate with perception, prediction, planning and handle variability
  • Levels of autonomy (example: vehicles):
    • Level 0: No autonomy; human does everything
    • Level 1: One automated function (e.g., automatic braking)
    • Level 2: Two or more automated functions; human still in control
    • Level 3: Can handle dynamic driving, but human intervention may be needed
    • Level 4: Driverless in certain environments
    • Level 5: Fully autonomous; no human involvement
    • Note: the slides' "6 levels" and "Levels 0–5" describe the same scale (six levels in total)
  • Key terms:
    • Automation vs Autonomy
    • Robot / robotics / cobot (collaborative robot)
    • Cobots operate in close proximity to humans to assist rather than replace
  • Practical considerations:
    • Perceiving, predicting, and planning constitute the three core intelligence capabilities (a skeleton control loop is sketched after this list)
    • Assess whether ML is involved and whether the system can improve with experience
    • If these conditions aren’t met, the system may not be truly intelligent
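  • Illustrative sketch (an assumption, not from the slides): a skeleton of the perceive-predict-plan-act loop; the "sensor" readings, thresholds, and actions are hypothetical placeholders, not a real autonomy stack.

```python
# Skeleton of a perceive -> predict -> plan -> act loop for an autonomous agent.
# The "sensor" readings, thresholds, and actions are hypothetical placeholders.
import random

def perceive():
    # Stand-in for reading sensors (camera, lidar, telemetry, ...).
    return {"obstacle_distance_m": random.uniform(0.0, 20.0), "speed_mps": 10.0}

def predict(obs):
    # Naive prediction: how far away will the obstacle be in one second?
    return obs["obstacle_distance_m"] - obs["speed_mps"] * 1.0

def plan(predicted_gap_m):
    # Trivial rule-based planner; real systems search over many candidate plans.
    return "brake" if predicted_gap_m < 5.0 else "continue"

def act(action):
    print("action:", action)

for _ in range(3):                 # one iteration per control cycle
    observation = perceive()
    act(plan(predict(observation)))
```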

Pattern 6: Patterns & Anomalies

  • Objective: Detect patterns and outliers in large datasets to uncover insights
  • Technologies: Pattern recognition, anomaly detection, classification
  • Applications:
    • Fraud detection (identify unusual transactions)
    • Error detection/correction (spot and fix errors in data)
    • Intelligent monitoring (system health, cybersecurity)
    • HR/candidate screening and profiling
    • Content moderation (flag inappropriate content)
  • Key concepts:
    • Classification: grouping data into categories
    • Anomaly detection: identifying data points that deviate from the norm (a minimal sketch follows this pattern's notes)
  • Real-world uses:
    • Banking/finance for fraud detection
    • IT monitoring and cyber threat detection
    • HR for candidate screening
  • Illustrative example:
    • Walmart case where pattern analysis helped identify that strawberry Pop-Tarts purchase spikes occur before hurricanes
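  • Illustrative sketch (an assumption, not from the slides): flagging unusual transaction amounts with scikit-learn's IsolationForest; the data is synthetic and the assumed outlier rate (contamination) is a value an analyst would tune.

```python
# Minimal anomaly-detection sketch: flag unusual transaction amounts.
# The transactions are synthetic; contamination=0.01 is an assumed outlier rate.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)
normal = rng.normal(loc=50, scale=10, size=(500, 1))    # typical purchase amounts
fraud = np.array([[900.0], [1200.0]])                   # injected outliers
X = np.vstack([normal, fraud])

detector = IsolationForest(contamination=0.01, random_state=0).fit(X)
flags = detector.predict(X)                             # -1 = anomaly, +1 = normal
print("flagged amounts:", X[flags == -1].ravel())       # includes 900 and 1200
```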

Pattern 7: Hyper-Personalization

  • Objective: Treat each individual as an individual, with profiles that adapt over time
  • Technologies: Personalization, recommendation systems, adaptive learning algorithms
  • Applications:
    • Personalized content and experiences (news feeds, ads)
    • Personalized product recommendations (e-commerce)
    • Personalized medicine and treatment
    • Personalized finance (custom plans)
    • Personalized education (adaptive learning)
  • Key terms:
    • Personalization: tailoring offerings to user characteristics
    • Hyper-personalization: tailoring to each individual, beyond group buckets
    • Recommendation system: suggests products/content/actions based on user profile and behavior (a minimal sketch follows these notes)
  • Real-world uses:
    • Social media feeds, online shopping recommendations, targeted advertising
  • Notes:
    • Personalization drives better engagement, while recommendations help surface relevant options
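  • Illustrative sketch (an assumption, not from the slides): a minimal user-based collaborative-filtering recommender; the ratings matrix and item names are invented.

```python
# Minimal recommendation sketch: user-based collaborative filtering.
# The ratings matrix (users x items) is invented; 0 means "not rated yet".
import numpy as np

items = ["laptop", "headphones", "keyboard", "monitor"]
ratings = np.array([
    [5, 3, 0, 1],       # user 0
    [4, 0, 0, 1],       # user 1
    [1, 1, 0, 5],       # user 2
    [0, 0, 5, 4],       # user 3  <- recommend something new for this user
], dtype=float)

def cosine(u, v):
    return u @ v / (np.linalg.norm(u) * np.linalg.norm(v) + 1e-9)

target = 3
sims = np.array([cosine(ratings[target], ratings[u]) for u in range(len(ratings))])
sims[target] = 0.0                               # ignore self-similarity

# Predicted score per item = similarity-weighted average of other users' ratings.
scores = sims @ ratings / (sims.sum() + 1e-9)
unseen = [i for i in range(len(items)) if ratings[target, i] == 0]
print("recommend:", items[max(unseen, key=lambda i: scores[i])])
```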

RPA and Intelligent Process Automation (IPA)

  • Robotic Process Automation (RPA):
    • Purpose: automate repetitive software tasks at the user interface level (UI automation)
    • Types:
      • Attended Bots: assist humans in real time; collaborate with employees
      • Unattended Bots: run in the background; automate back-office processes
    • Technologies: low-code/no-code platforms, scripting, screen recording, rule-based automation
    • Applications: data entry, invoice processing, data transfer between systems, customer-support workflows, document processing
    • RPA can serve as an alternative to business process outsourcing (BPO) or to building API integrations; it reduces "swivel-chair" work (manually re-keying data between systems)
  • Intelligent Process Automation (IPA):
    • Adding AI to RPA to handle variability, unstructured data, and exceptions (a minimal sketch follows this section)
    • Levels of intelligent automation (increasing sophistication):
      • Level 0: Basic automation (rule-based, non-intelligent)
      • Level 1: Language/context awareness; rudimentary automation
      • Level 2: Intelligent process awareness
      • Level 3: Autonomous process optimization
    • Capabilities enabled by IPA:
      • Handling variability in inputs and steps
      • Perception, prediction, and planning
      • Unstructured data handling, process discovery, and dynamic process changes
      • Automatic process documentation and data correction
      • End-to-end orchestration of multiple bots for optimization
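  • Illustrative sketch (an assumption, not from the slides): contrasting a deterministic RPA-style rule with an IPA-style machine-learning fallback for documents the rules cannot handle; the routing categories, keywords, and training snippets are hypothetical.

```python
# Sketch contrasting an RPA-style rule with an IPA-style machine-learning fallback.
# Document categories, keywords, and training snippets are hypothetical.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

def route_by_rules(text):
    # "RPA" step: deterministic, rule-based routing.
    if "invoice" in text.lower():
        return "accounts_payable"
    if "purchase order" in text.lower():
        return "procurement"
    return None                                   # rules cannot handle this document

# "IPA" step: a small classifier handles the variable, unstructured cases.
train_texts = ["please pay the attached bill", "order confirmation for 20 chairs",
               "reminder: payment due next week", "request to buy new monitors"]
train_labels = ["accounts_payable", "procurement", "accounts_payable", "procurement"]
fallback = make_pipeline(TfidfVectorizer(), LogisticRegression())
fallback.fit(train_texts, train_labels)

def route(text):
    return route_by_rules(text) or fallback.predict([text])[0]

print(route("invoice #123 attached"))                 # handled by the rule
print(route("kindly settle the outstanding bill"))    # handled by the ML fallback
```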

Robotics, Automation, and the 4 D’s / 3K’s of Robotics

  • The 4 D’s / 3K’s:
    • Dull, Dangerous, Dirty, Demeaning
    • The 3K’s are the Japanese counterpart: kitanai (dirty), kiken (dangerous), kitsui (demanding)
  • Robotics vs Automation:
    • Robotics: design, construction, operation, and application of robots
    • Objective: augment human capabilities with or without AI
  • Cobots (Collaborative Robots):
    • Created in the 1990s to work alongside humans in shared spaces
    • Aim to enhance human performance rather than fully replace humans

Intelligent Automation: Levels & Distinctions

  • Automation vs Autonomy:
    • Automation: repetitive tasks, often programmed by humans
    • Autonomy: systems that perceive, predict, and plan with minimal human involvement
  • Levels of Automation (example: autonomous vehicles):
    • Level 0 through Level 5 (six levels in total)
  • Key takeaway: True intelligence involves perception, prediction, and planning, plus the ability to adapt and handle exceptions without human input

Autonomous Retail & Notable Examples

  • Autonomous Retail: removing humans from the loop in shopping experiences
  • Examples:
    • Amazon Go (cashier-less stores)
    • LoweBot (intelligent store assistant, 2016)
    • Walmart shelf-scanning bots (2017; faced challenges)
  • Open questions: will autonomous bot baristas or similar retail automation become widespread?

AlphaGo, AlphaZero, and the Power of Self-Play

  • Goal-driven systems demonstrate reinforcement learning for real-world games and optimization problems
  • AlphaGo (Go-playing AI by DeepMind): defeated human champion Lee Sedol in 2016
  • AlphaZero: learned to play games at superhuman levels through self-play in a short time; surpassed AlphaGo
  • Key takeaway: Self-play allows learning optimal strategies without human data bias
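  • Illustrative sketch (an assumption added here, not from the slides and not DeepMind's actual method): a single shared Q-table learns single-pile Nim purely by playing against itself, with no human game data.

```python
# Toy self-play sketch: one shared Q-table learns single-pile Nim with no human data.
# Rules: players alternate removing 1-3 stones; whoever takes the last stone wins.
import random

PILE, MOVES = 12, (1, 2, 3)
alpha, epsilon = 0.3, 0.2
Q = {(s, m): 0.0 for s in range(1, PILE + 1) for m in MOVES if m <= s}

def legal(s):
    return [m for m in MOVES if m <= s]

def greedy(s):
    best = max(Q[(s, m)] for m in legal(s))
    return random.choice([m for m in legal(s) if Q[(s, m)] == best])

for _ in range(20000):                           # self-play games
    s = PILE
    while s > 0:
        m = random.choice(legal(s)) if random.random() < epsilon else greedy(s)
        s_next = s - m
        if s_next == 0:
            target = 1.0                         # the player to move took the last stone
        else:
            # The opponent moves next, so the next position's value is negated.
            target = -max(Q[(s_next, m2)] for m2 in legal(s_next))
        Q[(s, m)] += alpha * (target - Q[(s, m)])
        s = s_next

# Optimal play leaves a multiple of 4 stones; e.g., with 6 stones the move is to take 2.
print({s: greedy(s) for s in range(1, PILE + 1)})
```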

Integrating Patterns: Example - Assistant-Enabled Commerce

  • Scenario: AI chatbot on a website helps customers navigate products and answer questions
  • Patterns used:
    • Hyper-Personalization: pull data about the customer for tailored replies and recommendations
    • Conversation: natural language interaction with users
    • Pattern discovery: scan large data stores to identify customer buying patterns
  • Takeaway: Real-world applications often blend multiple AI patterns to deliver a cohesive assistant experience
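  • Illustrative sketch (an assumption, not from the slides): a toy assistant that strings the three patterns together; the intents, customer profile, and co-purchase data are invented.

```python
# Toy sketch of an assistant that blends conversation, hyper-personalization,
# and pattern discovery. All intents, profiles, and co-purchase data are invented.

PROFILES = {"alice": {"interests": ["hiking", "camping"]}}
CO_PURCHASES = {                    # "customers who bought X often also buy Y"
    "tent": ["sleeping bag", "headlamp"],
    "boots": ["wool socks"],
}

def detect_intent(message):
    # Conversation pattern (greatly simplified): keyword-based intent detection.
    triggers = ("recommend", "suggest", "what should")
    return "recommendation" if any(t in message.lower() for t in triggers) else "smalltalk"

def recommend(user, last_purchase):
    # Hyper-personalization + patterns: combine the profile with co-purchase data.
    interests = ", ".join(PROFILES.get(user, {}).get("interests", []))
    related = ", ".join(CO_PURCHASES.get(last_purchase, []))
    return f"Based on your interest in {interests}, consider: {related}"

def handle(user, message, last_purchase):
    if detect_intent(message) == "recommendation":
        return recommend(user, last_purchase)
    return "Happy to help with anything else!"

print(handle("alice", "Can you recommend gear for my trip?", "tent"))
```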

Practical Considerations, Challenges, and Key Definitions

  • Automation vs Autonomy (basic definitions):
    • Automation: repeatable task execution; not necessarily intelligent
    • Autonomous systems: require perception, prediction, and planning; handle variability and exceptions
  • Core AI concepts:
    • Computer Vision: interpreting visual data
    • NLP/NLU/NLG: language-based capabilities
    • RPA: software bots automating UI-level tasks
    • IPA: RPA enhanced with AI for more flexible processing
  • Important terms:
    • Cobot: Collaborative robot that works with humans
    • IPA: Intelligent Process Automation
    • Attended vs Unattended bots: collaboration vs standalone automation
    • Low-code/No-code: platforms allowing rapid automation development by non-developers
  • Practical guidance notes:
    • Align AI projects with problems that require perception, prediction, and planning, and tolerate probabilistic outcomes
    • Be mindful of data quality and bias (e.g., ImageNet labeling biases)
    • Consider human-in-the-loop for tasks where humans outperform automation or where data is scarce
    • When in doubt, assess ROI and feasibility: some problems are better solved with rules-based programming or human labor

Quick Reference: Key Numerical and Factual Highlights

  • ImageNet dataset: over 14,000,000 labeled images
  • ImageNet/ILSVRC: started in 2010 as a benchmark for computer vision
  • Data bias concerns: some sources report roughly 5% mislabeled data in ImageNet
  • AI-assisted visual inspection accuracy: reported to be up to 90% greater than human inspection (roughly 1.9x human accuracy)
  • Autonomy levels for vehicles: Levels 0 through 5 (six levels total)
  • Levels of Intelligent Process Automation (IPA): range from Level 0 (basic automation) to Level 3 (autonomous process optimization and orchestration)
  • The 4 D’s of Robotics: Dull, Dangerous, Dirty, Demeaning
  • Cobots: collaborative robots designed to work alongside humans in shared spaces

Ethical, Philosophical, and Practical Implications

  • Not every problem benefits from AI; traditional programming, rules-based approaches, or human labor can be more reliable and cost-effective
  • Data quality and bias matter: training data (e.g., ImageNet) can contain mislabeled or biased samples that affect model performance and fairness
  • Autonomy introduces responsibility and safety considerations: failures in autonomous systems can have real-world consequences; robust testing and governance are essential
  • The hype vs. reality: automation and AI are powerful when used to complement humans (IA/IPA) rather than replace critical human skills entirely
  • Transparency and explainability: some AI applications (e.g., decision support, finance, healthcare) require interpretable models and auditable outputs

Summary: How to Approach AI Projects (Guiding Principles)

  • Before starting: assess whether inputs/outputs are stable, data is sufficient, and accuracy requirements permit probabilistic outcomes
  • Prefer cognitive solutions when problems involve perception, uncertainty, and complex decision-making
  • For deterministic, highly repetitive tasks with strong data, rule-based automation may be more appropriate
  • Build hybrids: combine patterns (e.g., Conversational + Predictive Analytics + Hyper-Personalization) to achieve end-to-end capabilities
  • Plan for evolution: start with RPA for repetitive tasks, then layer IPA to handle variability and unstructured data, and eventually explore goal-driven or autonomous components where appropriate