Brief History of Computing, Internet, AI & Social Media
Early Calculating Devices
- Pascaline (1642–1644): mechanical calculator for addition/subtraction.
- Leibniz Step Reckoner (1673): extended Pascal's design; multiplication by repeated addition and shifting; Leibniz was also a strong advocate of binary arithmetic.
- Jacquard loom (1801): punched cards controlled fabric patterns; key precursor of punched-card data processing.
- Arithmometer (Thomas de Colmar): patented (1820) and produced (1851–1915); first commercially successful mechanical calculator (all four operations).
- Difference Engine (Charles Babbage, 1822): tabulated polynomial functions; decimal; hand-cranked.
- Analytical Engine (Charles Babbage, proposed 1837): first design for a mechanical general-purpose computer; Babbage is often called the father of modern computing.
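The Step Reckoner's "multiplication by repeated addition and shifting" can be sketched in a few lines; this is a modern model of the idea, not the machine's actual mechanism, and the function name is invented here:

```python
def shift_and_add(a: int, b: int) -> int:
    """Multiply a by b Step-Reckoner style: for each decimal digit of b,
    add the multiplicand that many times, then shift one decimal place."""
    total = 0
    shift = 0  # current decimal place of the "carriage"
    while b > 0:
        digit = b % 10          # lowest remaining digit of the multiplier
        for _ in range(digit):  # add the shifted multiplicand 'digit' times
            total += a * 10 ** shift
        b //= 10                # move to the next digit...
        shift += 1              # ...and shift the carriage one place left
    return total

print(shift_and_add(27, 43))  # 1161
```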
Ada Lovelace & Early Programmers
- Augusta Ada Lovelace (Countess of Lovelace, UK, 1842–1843): mathematician; recognized the machine could do more than numerical calculation; wrote an algorithm for the Analytical Engine; considered among the first computer programmers.
Punched Cards & Data Processing
- Herman Hollerith (1884): punched-card tabulating machine (used in the 1890 US census); began semi-automatic data processing; IBM lineage.
The Analytical Engine: Main Components
- Input: punched cards (Jacquard) for data and numbers; machine could punch numbers onto cards.
- Mill (Arithmetic Unit): could perform all four arithmetic operations, comparisons, and optional square roots.
- Store (Memory): designed to hold 1,000 numbers of 40 decimal digits each; input/output via cards; decimal fixed-point arithmetic.
- Program control: separate cards for operation and data; allowed dynamic change of behavior; automation of instruction sequences.
Architecture & Early Generations (Overview)
- Difference Engine: decimal, mechanical, not programmable; fixed polynomial differences.
- Analytical Engine: decimal, mechanical, program-controlled by punched cards; memory and mill; considered Turing-complete in concept.
- Later designs (e.g., Ludgate’s) extended the ideas; input via punched cards; program/data separation.
Pre-Electronic Computing & Key Machines
- Colossus Mark 1 (1943): electronic codebreaking machine built by Tommy Flowers; programmable via switches and plugs, though not stored-program; an early programmable digital device.
- Harvard Mark I (1944): decimal electro-mechanical; 24-channel punched paper tape input; no conditional branches.
- ENIAC (1945): decimal electronic; programmed via patch cables and switches; not stored-program initially.
- IAS/Stored-Program Concept (1945–1951): binary; program stored in memory; defined memory word length and instruction design.
- Zuse Z4 (completed 1945): binary floating point; relay logic with mechanical memory (Germany).
- Manchester Baby (1948): first electronic computer to run a stored program (UK).
- UNIVAC I (1951): early commercial stored-program computer (US).
The Stored Program Concept & von Neumann Architecture
- Stored program concept: program and data stored in the same memory for sequential fetch/decode/execute.
- IAS machine (stored-program, binary): 40-bit word; memory of 1,024 words; two 20-bit instructions per word; input and output via punched cards.
- Harvard vs von Neumann: Harvard separates program memory from data memory; von Neumann uses a single shared memory.
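The stored-program idea above (one shared memory, sequential fetch/decode/execute) can be sketched as a toy von Neumann machine; the opcodes and instruction format here are invented purely for illustration:

```python
# A toy von Neumann machine: instructions and data share one memory list,
# and the CPU runs a sequential fetch-decode-execute loop.
LOAD, ADD, STORE, HALT = 0, 1, 2, 3

memory = [
    (LOAD, 5),   # 0: acc <- mem[5]
    (ADD, 6),    # 1: acc <- acc + mem[6]
    (STORE, 7),  # 2: mem[7] <- acc
    (HALT, 0),   # 3: stop
    0,           # 4: (unused)
    20,          # 5: data
    22,          # 6: data
    0,           # 7: result goes here
]

pc, acc = 0, 0
while True:
    op, addr = memory[pc]    # fetch + decode
    pc += 1
    if op == LOAD:
        acc = memory[addr]
    elif op == ADD:
        acc += memory[addr]
    elif op == STORE:
        memory[addr] = acc
    elif op == HALT:
        break

print(memory[7])  # 42
```

A Harvard machine would keep the instruction tuples and the data values in two separate arrays instead of one.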
IAS vs Analytical Engine: Quick Equivalence (High-level)
- Input/Output: Jacquard-style punched cards (Analytical Engine) vs IBM punched cards (IAS).
- Processing: the Mill and Store (Analytical Engine) vs an arithmetic unit with accumulator (AC) and multiplier-quotient (MQ) registers (IAS).
- Number system: fixed-point decimal in the Analytical Engine vs fixed-point binary (two's complement) in the IAS; the stored-program concept was realized in the IAS.
- Memory/Program: Analytical Engine had mechanical memory; IAS used electronic memory with stored instructions.
The First Generations of Computers
- First Generation (Vacuum Tubes): ENIAC, UNIVAC, IBM 701/704/709, Manchester Mark 1; programming via wiring, switches, and early machine code.
- Second Generation (Transistors): IBM 7090/7094 family; early DEC PDP series; FORTRAN (1957) popularized; core memory becomes standard.
- Third Generation (Integrated Circuits): ICs enable smaller, faster machines; IBM System/360 family; DEC PDP-11 (1970) popular; UNIX influence grows.
- Fourth Generation (Microprocessors): 1971–1980; Intel 4004 → 8080 → 8086/8088; Altair 8800 (S-100 bus) spurs microcomputers; Apple I/II (1976-77); CP/M and BASIC become common.
- Fifth Generation (AI & Multimedia): PCs dominate; GUI popularized on the Mac; Windows rises; multimedia, networking, and on-device AI begin to shape the landscape.
Timeline: IBM PC, DEC, and Processors (Key Milestones)
- 1971: Intel 4004 (first microprocessor).
- 1974: Intel 8080; Sord SMP80/08 (8080-based microcomputer) enters market; Datapoint 2200 influence on 8-bit era.
- 1976: Apple I; Apple II (1977) popularize personal computing; CP/M ecosystem grows.
- 1978: Intel 8086/8088; IBM PC 5150 (1981) uses 8088; MS-DOS becomes dominant.
- 1980s: 80286, 80386, 80486; PC/XT, PS/2 era; rise of graphical OSs.
- 1990s–2000s: Pentium line; Windows 95/NT; Mac OS evolves; Internet integration expands.
The Stored-Program Concept & Architecture Details
- IAS machine specifics: 40-bit word, two 20-bit instructions per word; memory of 1,024 words; two's complement representation of negative numbers; input/output via punched cards.
- ENIAC vs Analytical Engine mapping: ENIAC was programmed by manual wiring; the Analytical Engine read its program from punched cards; neither stored the program in memory.
- PDP-1 to PDP-11: minicomputers that popularized interactive computing and UNIX development; DEC’s influence on OS design and programming languages.
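The IAS word layout above (a 40-bit word holding either one two's-complement number or two 20-bit instructions) can be made concrete with a little bit-twiddling; the 8-bit-opcode/12-bit-address split follows the usual description of the IAS format, and the helper names are invented here:

```python
# Pack/unpack two 20-bit IAS-style instructions (8-bit opcode + 12-bit
# address each) into one 40-bit word, and model 40-bit two's complement.

def pack_instructions(op_l, addr_l, op_r, addr_r):
    left = (op_l << 12) | addr_l
    right = (op_r << 12) | addr_r
    return (left << 20) | right      # left instruction in the high half

def unpack_instructions(word):
    left, right = word >> 20, word & 0xFFFFF
    return (left >> 12, left & 0xFFF), (right >> 12, right & 0xFFF)

def to_twos_complement(n, bits=40):
    return n & ((1 << bits) - 1)     # negative numbers wrap around

def from_twos_complement(w, bits=40):
    return w - (1 << bits) if w >> (bits - 1) else w

word = pack_instructions(0x01, 0x123, 0x02, 0x456)
print(unpack_instructions(word))                      # ((1, 291), (2, 1110))
print(from_twos_complement(to_twos_complement(-7)))   # -7
```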
Internet & World Wide Web Timeline
- ARPANET hardware & protocols:
- 1969–1972: ARPANET development; host-to-host protocols; network email (Tomlinson, 1971).
- 1973: Ethernet invented by Metcalfe at Xerox PARC; internetworking (TCP) design begins; IPv4 address classes (A/B/C) defined later (1981).
- 1980: TCP/IP adopted as defense standard; ARPANET transitions to TCP/IP in 1983.
- 1989–1990: WWW concept proposed by Berners-Lee; HTTP, HTML; WorldWideWeb browser/server launched.
- 1990–1991: Internet grows; first web servers and early browsers appear.
- 1993–1995: Mosaic, Netscape Navigator popularize the Web; commercialization accelerates.
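The HTTP protocol introduced with the Web is plain text: a request line plus headers, answered by a status line, headers, and a body. A minimal sketch, using a hand-built HTTP/1.0-style request and a canned response (no network access; purely illustrative):

```python
# Build a raw HTTP/1.0 GET request and parse a canned response.
request = (
    "GET /index.html HTTP/1.0\r\n"
    "Host: example.org\r\n"
    "\r\n"
)

canned_response = (
    "HTTP/1.0 200 OK\r\n"
    "Content-Type: text/html\r\n"
    "\r\n"
    "<html><body>Hello, Web</body></html>"
)

# Headers end at the first blank line; the rest is the body.
header_block, body = canned_response.split("\r\n\r\n", 1)
status_line, *header_lines = header_block.split("\r\n")
headers = dict(line.split(": ", 1) for line in header_lines)

print(status_line)              # HTTP/1.0 200 OK
print(headers["Content-Type"])  # text/html
print(body)
```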
Social Media & Chat History (Online Interaction)
- Early online communities and chat: USENET, BBS; ELIZA (1966) as an early chatbot.
- 1980s–1990s: IRC, AIM, ICQ, MSN Messenger introduce real-time chat and presence.
- 1990s–2000s: SMS text messaging spreads; first modern social networks begin (1997 SixDegrees; 2004 Facebook; 2005 YouTube; 2006 Twitter).
- 2010s–present: Instagram (2010); mobile-first social apps dominate; short video (TikTok, 2016–17) emerges.
- Trends: algorithmic feeds, creator economy, platform regulation and data privacy concerns.
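ELIZA, mentioned above, worked by keyword spotting and canned sentence transformations rather than any understanding. A tiny responder in that spirit (the specific rules here are made up for illustration):

```python
import re

# Each rule: a keyword pattern and a response template that reuses
# the matched fragment, ELIZA-style.
RULES = [
    (re.compile(r"\bi am (.+)", re.I), "Why do you say you are {}?"),
    (re.compile(r"\bi feel (.+)", re.I), "What makes you feel {}?"),
    (re.compile(r"\bmy (\w+)", re.I), "Tell me more about your {}."),
]

def respond(text: str) -> str:
    for pattern, template in RULES:
        m = pattern.search(text)
        if m:
            return template.format(m.group(1))
    return "Please go on."  # default when no keyword matches

print(respond("I am tired of studying"))  # Why do you say you are tired of studying?
print(respond("my computer crashed"))     # Tell me more about your computer.
```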
History of Artificial Intelligence (AI) Timeline
- 1950: Turing publishes Computing Machinery and Intelligence; introduces the Turing Test.
- 1956: Dartmouth Workshop coinage of AI term; early symbolic AI explores rules and knowledge.
- 1960s–1970s: Early expert systems and symbolic AI.
- 1980s: Rise of neural networks and backpropagation.
- 2000s: Growth of machine learning, big data, and deep learning.
- 2010s: Deep learning breakthroughs (ImageNet 2012, Go/AlphaGo 2016, NLP transformers 2017).
- 2020s: Generative AI boom (GPT, Gemini, Claude, etc.); ChatGPT (Nov 2022) popularizes large language models.
AI Projects & Organizations
- Anthropic: Claude models (Claude 1 → Claude 3).
- Mistral (France): open-weight models (Mixtral variants).
- Stability AI: Stable Diffusion (open-source image generation).
- Microsoft Copilot: AI integration in Office, Windows, GitHub.
- Apple Intelligence: on-device + cloud AI initiatives.
Quick Notes for Recall
- Key turning points: Jacquard punched cards → Babbage’s Analytical Engine → stored-program concept → ENIAC vs IAS → transistor era → ICs → microprocessors → PCs → Internet & Web → AI age.
- Architecture themes: stored-program (von Neumann) vs Harvard; input/output evolution from punched cards to keyboards, disks, displays.
- Generations emphasize scale and technology: vacuum tubes → transistors → ICs → microprocessors.
- The web transformed communication: ARPANET → TCP/IP → HTTP/HTML; World Wide Web catalyzing modern internet services.
- AI timeline highlights: from Turing test to deep learning to generative AI and large language models.