Notes: Chapter 1-7 Introduction to Computers and Python
Chapter 1: Introduction
- Course goals: cover three topics
- How a computer works, with a focus on binary numbers
- Introduction to the Python environment and lab setup
- Basic Python programming concepts and immediate hands-on practice
- Structure of the module: from hardware basics to Python usage and simple programming techniques
- Emphasis on the end goal: by the end of the course, students should be able to write a larger project; connection to software engineering concepts
- High-level view of computer operation as a sequence of steps processed by hardware
- The life cycle of a program:
- Program starts on the hard drive (non-volatile storage)
- Program loads into RAM (volatile memory) for execution
- CPU executes a sequence of steps, one at a time, in a predefined order
- Each step is an instruction coming from a programming language like Python, Java, Fortran, C, etc.
- Programmer’s job summarized:
- Write steps in the correct order (instructions)
- Build algorithms that express the intended computation
- Useful background knowledge for programming:
- Mathematics: functions, proofs, and how they relate to algorithms
- Science and data analysis: derivatives, integration, discrete math
- Basic machine understanding: how memory, CPU, and I/O interact
- Data representation and the need to move from symbols to binary (symbolic vs digital information)
- Role of compilers: translate symbolic code into binary that the CPU can execute
- Channeling theme: everything in a computer fundamentally relies on binary and low-level representations
Chapter 2: Unique Binary Numbers
- Core idea: symbolic information (letters, operators) is translated into binary for computer processing
- ASCII mapping (illustrative examples from the slides):
- The symbols 'a', 'b', 'c', …, and operations like '+', ' ', etc. each map to unique binary values
- Example mappings from the ASCII table shown in the lecture:
- 'A' (capital A) → 65 (decimal)
- 'a' (lowercase a) → 97 (decimal) [note: ASCII includes both uppercase and lowercase blocks; the slide highlights that lowercase letters occupy a different range than uppercase]
- '0' → 48, '1' → 49
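The ASCII mappings above can be checked directly in Python with the built-ins `ord()` (symbol → code point) and `chr()` (code point → symbol):

```python
# ord() maps a character to its numeric code; chr() is the inverse.
print(ord('A'))   # 65  (capital letters start at 65)
print(ord('a'))   # 97  (lowercase letters occupy a separate range)
print(ord('0'))   # 48  (digit characters start at 48)
print(chr(66))    # 'B' (the code right after 'A')
```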
- Binary storage and human-readable symbols:
- Binary numbers are stored as sequences of bits (0s and 1s)
- A string is a sequence of characters (letters, digits, symbols) concatenated into a unit
- A byte is 8 bits: 1 Byte = 8 bits
- Common storage units (base-10 interpretations used in the slides):
- 1 MB = 10^6 bytes
- 1 GB = 10^9 bytes
- 1 TB = 10^12 bytes
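A short sketch of these unit relationships in Python (the names `MB`, `GB`, `TB`, and the helper `to_bits` are illustrative, not from the slides):

```python
# Base-10 storage units, as used in the slides.
MB = 10**6   # bytes
GB = 10**9   # bytes
TB = 10**12  # bytes

def to_bits(n_bytes):
    """A byte is 8 bits, so multiply the byte count by 8."""
    return n_bytes * 8

print(to_bits(1 * GB))  # 8000000000 bits in one gigabyte
```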
- Bits and transistors:
- A bit is the fundamental binary element, implemented physically as a transistor (charged = 1, discharged = 0)
- A computer contains billions of transistors; example: one gigabyte = 8 × 10^9 transistors
- Moore’s Law (historical context provided in the slides):
- Gordon Moore (1965) suggested that the number of transistors on a chip would double roughly every year, a trend that held broadly through 2018 and was expected to continue toward 2025
- Practical implication: more memory (RAM), faster CPUs, larger storage over time
- RAM vs hard disk (volatile memory vs non-volatile storage):
- RAM (random access memory): fast, volatile memory used while a program runs
- Hard disk / solid-state disk: slower, non-volatile storage used to persist programs and data when not running
- Symbolic vs digital information recap:
- Humans work with symbols (letters, numbers, operators)
- Computers operate on binary data; symbols are translated to binary by compilers/interpreters
- Foundational definitions introduced:
- Symbol: human-readable representation (e.g., letters, digits) that is mapped to binary
- Bit: 0 or 1, the smallest unit of data
- Byte: 8 bits
- RAM: working memory during program execution
- CPU: central processing unit that executes instructions
- String: a sequence of symbols treated as a unit
- Binary and textual data in practice:
- Text, numbers, images, audio, and video are ultimately stored as binary data, with format-specific organization (e.g., file extensions) to indicate how to interpret the bitstream
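As a small illustration of text stored as binary, Python's `str.encode()` turns a string into bytes, and `format(b, '08b')` shows each byte's 8-bit pattern:

```python
# Encode a short string to ASCII bytes, then display each byte
# as an 8-bit binary pattern.
data = "Hi".encode("ascii")              # bytes 72 ('H') and 105 ('i')
bits = [format(b, "08b") for b in data]  # one 8-bit string per byte
print(bits)  # ['01001000', '01101001']
```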
Chapter 3: Binary Numbers
- Decimal (base-10) overview: digits 0–9, with place values 10^0, 10^1, 10^2, …
- Example: 2019 = 2 × 10^3 + 0 × 10^2 + 1 × 10^1 + 9 × 10^0
- Binary (base-2) overview: digits 0 and 1, with place values 2^0, 2^1, 2^2, …
- Example: decimal 183 is 10110111_2, since
- 183 = 2^7 + 2^5 + 2^4 + 2^2 + 2^1 + 2^0 = 128 + 32 + 16 + 4 + 2 + 1
- Example: decimal 26 is 11010_2, since
- 26 = 2^4 + 2^3 + 2^1 = 16 + 8 + 2
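Both worked examples can be verified with Python's built-ins: `bin()` converts decimal to a binary string, and `int(s, 2)` converts back.

```python
# Decimal -> binary string (with the '0b' prefix Python adds):
print(bin(183))            # '0b10110111'
print(bin(26))             # '0b11010'

# Binary string -> decimal (base 2 given explicitly):
print(int('10110111', 2))  # 183
print(int('11010', 2))     # 26
```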
- Notation and readability:
- To avoid ambiguity, binary numbers are often written with a base subscript: e.g., 1101101_2 vs 109_10
- Converting binary to decimal (power-series view):
- For a binary number with bits b_k b_{k-1} … b_1 b_0, the decimal value is b_k × 2^k + b_{k-1} × 2^{k-1} + … + b_1 × 2^1 + b_0 × 2^0
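The power-series view can be sketched as a short conversion function (the name `binary_to_decimal` is illustrative):

```python
def binary_to_decimal(bits: str) -> int:
    """Convert a binary string to decimal by summing b_i * 2**i
    over every bit position i (i = 0 at the rightmost bit)."""
    total = 0
    for i, bit in enumerate(reversed(bits)):  # i is the power of 2
        total += int(bit) * 2**i
    return total

print(binary_to_decimal("10110111"))  # 183
print(binary_to_decimal("11010"))     # 26
```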