ACCT 331 WEEK 11
Course Information
Course Title: ACCT 331: Introduction to Applied Artificial Intelligence
Location: Schreiber Hall, Room 302
Schedule: Tuesday & Thursday, 1:00-2:15 PM
Course Code: 10304
Course Schedule and Important Dates
Exams:
Exam #2: Week 12, on 11/13/25
Final Exam: Week 14, on 12/12/25
AI Courses You Can Use
Google DeepMind: Build Your Own Small Language Model
Description: Learn fundamentals of language models.
Duration: 6 hours
Google DeepMind: Train a Small Language Model (Challenge Lab)
Description: Develop foundational tools and data preparation for training models.
Duration: 1 hour 30 minutes
Google DeepMind: Represent Your Language Data
Description: Learn how to prepare text data for models.
Duration: 4 hours
Google DeepMind: Design And Train Neural Networks
Description: Focus on the training process for models, spotting and mitigating issues.
Duration: 4 hours
Google DeepMind: Discover The Transformer Architecture
Description: Investigate the transformer architecture mechanisms.
Duration: 4 hours
Google DeepMind: AI Research Foundations | Google Skills
Current News in AI
JPMorgan Chase and AI Investment
Jamie Dimon, CEO of JPMorgan Chase, says the bank's $2 billion investment in AI has 'paid for itself.'
Since the initial investment in 2012, AI implementation has driven significant savings across business lines, with benefits roughly matching the $2 billion expense.
KPMG and AI Integration
KPMG will examine how employees use AI tools as part of annual reviews, tracking effectiveness and encouraging integration into work.
This change reflects how AI is reshaping consulting, with companies investing deeply in AI technologies.
Rising Tech Investments
Major tech companies like Meta, Oracle, Google, and Microsoft have increased annual capital expenditures significantly from 2016 to 2024, with a clear rising trend in AI investments.
User Growth of Platforms
ChatGPT's user base has been growing rapidly, with weekly active users projected to keep climbing over the next two years, indicating a surge in interest and utilization.
Financial Outlook for AI
OpenAI projects a challenging cash flow picture over the next several years, with investors facing anticipated losses.
The AI Revolution
The future of AI holds both challenges and substantial opportunities for economic growth.
Historical parallels: Industrial Revolution and Digital Revolution significantly uplifted living standards, despite fears of job losses due to automation.
The discourse emphasizes the historical pattern that such advancements create more jobs than they eliminate, urging cautious governmental policy-making that avoids obstructing progress.
Key Findings from AI Research
Everyday AI
Mainstream Adoption: 46% of business leaders now leverage Generative AI (GenAI) daily; 80% engage weekly.
Business applications span across internal roles from IT to Operations, increasingly applying AI for efficiency in daily tasks.
Proving Value
Approximately 72% of enterprises measure business-linked ROI metrics with rising confidence in future investments in AI technologies, especially in sectors like Financial Services.
The Human Capital Lever
A growing commitment from leadership in AI adoption is observed, yet skill gaps and training budgets represent ongoing challenges.
LLMs and Generative AI in Business
Key Applications
Data Analysis & Productivity: GenAI usage in areas including document summarization and data analytics showcases efficiency gains.
Revenue Growth: Improved personalized experiences can lift sales conversions by 15-30%.
Cost Reduction: Automating customer service channels can reduce operational costs by 70-85%.
Key Concepts
Language Models
Definition: Language models predict the next word in a sequence based on the words that precede it.
Training Data: Language models are trained with vast amounts of text to generate coherent outputs.
Types of Models:
Self-Supervised Learning (SSL): Training approach that builds labels from input data, enhancing learning efficiency without manual labeling.
Transformers: Model architecture that handles long-range dependencies in data concurrently rather than sequentially, improving both training efficiency and generation capability.
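The ideas above can be sketched in miniature. The following is a toy bigram next-word predictor (not any real model): the training labels come straight from the text itself, the word that follows each word, which is the self-supervised idea in its simplest form. The corpus here is made up for illustration.

```python
from collections import Counter, defaultdict

# Toy self-supervised "language model": each word's training label is
# simply the word that follows it in the text (no manual labeling needed).
corpus = "the model predicts the next word and the model learns from text".split()

# Count how often each word follows each other word.
counts = defaultdict(Counter)
for current, nxt in zip(corpus, corpus[1:]):
    counts[current][nxt] += 1

def predict_next(word):
    """Return the most frequent follower of `word` seen in training."""
    return counts[word].most_common(1)[0][0]

print(predict_next("the"))  # "model" follows "the" most often in this corpus
```

Real LLMs replace the counting step with a transformer trained by gradient descent, but the prediction objective is the same: the next token.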
Technical Depth: LLMs and Their Working Principles
Mechanisms
Tokenization
Process: Text inputs segmented into tokens enabling manageable computation.
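A minimal sketch of the idea, using a hypothetical word-level vocabulary (production tokenizers work at the subword level, but the mapping from text to integer ids is the same):

```python
# Hypothetical vocabulary for illustration; real tokenizers use learned
# subword vocabularies with tens of thousands of entries.
def tokenize(text, vocab):
    # Words not in the vocabulary map to a reserved <unk> id.
    return [vocab.get(word, vocab["<unk>"]) for word in text.lower().split()]

vocab = {"<unk>": 0, "language": 1, "models": 2, "predict": 3, "words": 4}
print(tokenize("Language models predict unknown words", vocab))
# -> [1, 2, 3, 0, 4]
```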
Embeddings
Definition: Vectors in high-dimensional space representing tokens, signifying semantic relationships among words.
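To make "semantic relationships as vectors" concrete, here is a sketch with made-up 3-dimensional embeddings (real models use hundreds or thousands of dimensions): related words point in similar directions, which cosine similarity captures.

```python
import math

# Made-up 3-d embeddings for illustration only.
embeddings = {
    "king":  [0.9, 0.8, 0.1],
    "queen": [0.8, 0.9, 0.2],
    "apple": [0.1, 0.2, 0.9],
}

def cosine(u, v):
    """Cosine similarity: 1.0 means same direction, 0.0 means unrelated."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

# Semantically related words score higher than unrelated ones.
print(cosine(embeddings["king"], embeddings["queen"]) >
      cosine(embeddings["king"], embeddings["apple"]))  # True
```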
Self-Attention
Functionality: Weighs the importance of words in relation to one another, enabling context-aware representations.
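A stripped-down sketch of scaled dot-product self-attention over toy 2-d vectors. For simplicity it uses each token vector as its own query, key, and value (real transformers apply learned projection matrices first), but it shows the core mechanism: every token's output is a weighted mix of all tokens.

```python
import math

def softmax(xs):
    """Convert raw scores into weights that sum to 1."""
    exps = [math.exp(x - max(xs)) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def self_attention(tokens):
    d = len(tokens[0])
    out = []
    for q in tokens:
        # Similarity of this token's query to every key, scaled by sqrt(d).
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d)
                  for k in tokens]
        weights = softmax(scores)
        # Output = weighted mix of all value vectors -> context-aware vector.
        out.append([sum(w * v[j] for w, v in zip(weights, tokens))
                    for j in range(d)])
    return out

tokens = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
result = self_attention(tokens)
print(len(result), len(result[0]))  # 3 tokens in, 3 context-aware vectors out
```

Because every token attends to every other token in one pass, long-range dependencies are handled concurrently rather than sequentially.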
Business Application and Evaluation Strategies
Key Evaluation Metrics: Perplexity, BLEU, ROUGE, F1, and others measure prediction accuracy and the quality of generated content.
Importance: Helps ensure that AI systems in production do not emit hallucinations or biased outputs.
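Perplexity, the first metric above, has a simple definition worth seeing once: it is the exponentiated average negative log-probability the model assigns to each actual next token, so lower is better. A minimal sketch with made-up probabilities:

```python
import math

def perplexity(token_probs):
    """Perplexity from the probabilities a model assigned to the true tokens."""
    avg_neg_log = -sum(math.log(p) for p in token_probs) / len(token_probs)
    return math.exp(avg_neg_log)

# A model assigning probability 0.25 to every true token has perplexity ~4:
# it is as "confused" as a uniform guess among 4 options.
print(perplexity([0.25, 0.25, 0.25]))  # approximately 4.0
```

The other metrics listed (BLEU, ROUGE, F1) instead compare generated text against reference text, which is why they are used for translation- and summarization-style tasks.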
Generative AI Assessment and Challenges
Evaluation Strategies
Absolute Scoring: Simple ratings on defined quality scales.
Comparative Ranking: Evaluators rank outputs from different AI systems to assess quality.
Error Analysis: Identification of specific weaknesses to guide improvements.
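Comparative ranking in its simplest form reduces to a win-rate tally. A sketch with hypothetical judgment data: evaluators pick the better of two systems' outputs on each prompt, and the win rates summarize overall quality.

```python
from collections import Counter

# Hypothetical pairwise judgments: which system's output the evaluator preferred.
judgments = ["A", "B", "A", "A", "B", "A"]

wins = Counter(judgments)
total = len(judgments)
win_rates = {system: count / total for system, count in wins.items()}
print(win_rates)  # system A preferred in 4 of 6 comparisons
```

Absolute scoring would instead average each system's ratings on a fixed scale; comparative ranking avoids calibrating that scale across evaluators, which is why it is common for generative outputs.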
Challenges in Evaluating Generative AI
Multiple Valid Outputs: Unlike traditional tasks, generation may produce many acceptable responses.
Subjective Quality Assessment: Creativity and engagement are difficult to quantify accurately.