Unit 10: Infinite Sequences and Series
Sequences: Definitions, Notation, and Limits
A sequence is an infinite, ordered list of numbers, usually generated by a rule. In calculus, you typically write a sequence as \{a_n\} (read “the sequence a_n”), where a_n is the nth term and n is always an integer (usually a positive integer).
Formally, a sequence is a function whose domain is the positive integers. The key idea is that sequences let you study what happens “in the long run” as n becomes very large.
A common example is
a_n=\frac{n-1}{n}
Beginning with n=1, the terms are 0,\frac{1}{2},\frac{2}{3},\dots
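As a quick numerical sketch (not part of the formal development), you can compute the first few terms of this rule and watch them climb toward 1:

```python
# Terms of the example sequence a_n = (n - 1) / n
def a(n):
    return (n - 1) / n

# First five terms, starting at n = 1, rounded for readability
print([round(a(n), 3) for n in range(1, 6)])  # [0.0, 0.5, 0.667, 0.75, 0.8]

# Far out in the sequence, terms are very close to 1
print(a(10**6))
```

The gap between a_n and 1 is exactly 1/n, so the terms get as close to 1 as you like for large enough n.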
Why sequences matter
Sequences are the foundation for infinite series. A series is built by adding up terms of a sequence. Before you can decide whether an infinite sum makes sense, you need language for whether the terms approach a value, approach zero, oscillate, or blow up.
Common sequence behaviors
A convergent sequence is one whose terms approach a finite real number as n grows. If that limit equals L, you write
\lim_{n\to\infty} a_n = L
A divergent sequence is one where the limit does not exist or is infinite (goes to \infty or -\infty).
An oscillating sequence is a common kind of divergence where the terms bounce between values and never settle, such as
a_n = (-1)^n
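A short sketch makes the oscillation concrete: the terms of (-1)^n alternate forever and never approach a single value, so the sequence diverges.

```python
# Terms of the oscillating sequence a_n = (-1)^n for n = 1, ..., 6
terms = [(-1) ** n for n in range(1, 7)]
print(terms)  # [-1, 1, -1, 1, -1, 1]
```

Note that the terms stay bounded; divergence here comes from failing to settle, not from blowing up.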
The formal (epsilon) definition of a sequence limit
A sequence has a limit L if for any \varepsilon>0 there is an associated positive integer N such that
|a_n-L|<\varepsilon \quad \text{for all } n>N
In words: no matter how small a tolerance \varepsilon you pick, every term beyond some index N stays within \varepsilon of L.
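For the earlier example a_n=\frac{n-1}{n}, the limit is L=1 and |a_n-1|=\frac{1}{n}, so any integer N>\frac{1}{\varepsilon} works. A minimal sketch of that bookkeeping (the helper name find_N is just illustrative):

```python
import math

def a(n):
    return (n - 1) / n

L = 1  # the claimed limit of a_n = (n - 1) / n

def find_N(eps):
    # |a_n - L| = 1/n, and 1/n < eps exactly when n > 1/eps,
    # so the smallest safe integer index is floor(1/eps) + 1.
    return math.floor(1 / eps) + 1

for eps in (0.1, 0.01, 0.001):
    N = find_N(eps)
    # Spot-check: the next 1000 terms from index N on all lie within eps of L
    assert all(abs(a(n) - L) < eps for n in range(N, N + 1000))
    print(f"eps = {eps}: N = {N}")
```

Shrinking \varepsilon forces N to grow (here N = 11, 101, 1001), which is exactly the trade-off the definition captures.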