Master convergence tests and understand when infinite sums make sense
An infinite series is a sum of infinitely many terms. But when does such a sum make sense? The series converges if the sequence of partial sums converges to a finite limit.
We'll explore various convergence tests that help determine whether a series converges, including the comparison test, ratio test, root test, and the important distinction between absolute and conditional convergence.
A series converges if its partial sums Sₙ = a₁ + a₂ + ... + aₙ approach a finite limit. Watch how different series behave -- some settle down to a value, others grow without bound.
Geometric series with r = 1/2, converges to 1
Sum: S = 1.000000
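The partial sums above can be computed directly. A minimal sketch (the helper `partial_sums` is illustrative, not from any library), using the geometric series with r = 1/2:

```python
# Compute the first n partial sums S_n of the series sum_{k>=1} r**k.
# For r = 1/2 these approach 1, matching the demo output above.
def partial_sums(r, n_terms):
    sums, s = [], 0.0
    for k in range(1, n_terms + 1):
        s += r ** k          # add the next term a_k = r**k
        sums.append(s)
    return sums

sums = partial_sums(0.5, 20)
print(f"S_5  = {sums[4]:.6f}")   # 0.968750
print(f"S_20 = {sums[19]:.6f}")  # 0.999999
```

Each Sₙ falls short of 1 by exactly (1/2)ⁿ, so the error halves with every term.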
Key insight: A series is really a sequence of partial sums in disguise. All the tools from sequence convergence apply: a series converges if and only if its partial sums form a Cauchy sequence.
The geometric series is the most important series in analysis. It converges if and only if |r| < 1, with a closed-form sum. Adjust the ratio to see the boundary between convergence and divergence.
Σ rⁿ = r + r² + r³ + ... = r / (1 - r) when |r| < 1
With r = 0.50, the sum converges to 1.0000
Key insight: The geometric series sum a/(1-r) for |r| < 1 is both a closed-form formula and a comparison benchmark. Many convergence proofs ultimately reduce to bounding a series by a geometric one.
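The closed form can be checked numerically against truncated sums. A small sketch (function name is illustrative) comparing a long partial sum with r/(1-r) for a few ratios with |r| < 1:

```python
# Compare a truncated geometric sum with the closed form r/(1-r).
def geometric_sum(r, n_terms=1000):
    return sum(r ** k for k in range(1, n_terms + 1))

for r in (0.5, 0.9, -0.5):
    numeric = geometric_sum(r)
    closed = r / (1 - r)
    print(f"r = {r:+.1f}: partial sum = {numeric:.6f}, r/(1-r) = {closed:.6f}")
```

Note how r = 0.9 needs far more terms to get close than r = 0.5 does: the error after n terms is of order |r|ⁿ, so convergence slows as |r| → 1.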
Different tests work best for different series. The ratio test excels with factorials, the root test with nth powers, while the divergence test quickly catches series that cannot possibly converge.
Geometric series with r = 1/2, converges to 1
lim_{n→∞} aₙ ≠ 0 ⟹ Σaₙ diverges
lim aₙ = 0, so the test is inconclusive (series may still diverge)
L = lim_{n→∞} |a_{n+1}/a_n|
L = 0.5000 < 1, so the series converges absolutely
Computed L ≈ 0.5000
L = lim_{n→∞} |a_n|^{1/n}
L = 0.5000 < 1, so the series converges absolutely
Computed L ≈ 0.5000
b_n ↓ 0 ⟹ Σ(-1)^n b_n converges
Series is not alternating, test does not apply
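The ratio and root test limits shown above can be estimated by evaluating the defining expressions at a large n. A sketch under that assumption (the estimator functions are illustrative, not from any library), applied to aₙ = (1/2)ⁿ:

```python
# Estimate the ratio-test and root-test limits for a_n = (1/2)**n.
# Both should give L = 0.5 < 1, so the series converges absolutely.
def ratio_test_estimate(a, n=50):
    """Approximate L = lim |a(n+1)/a(n)| by evaluating at a large n."""
    return abs(a(n + 1) / a(n))

def root_test_estimate(a, n=50):
    """Approximate L = lim |a(n)|**(1/n) by evaluating at a large n."""
    return abs(a(n)) ** (1.0 / n)

a = lambda n: 0.5 ** n
print(f"ratio test: L = {ratio_test_estimate(a):.4f}")  # 0.5000
print(f"root  test: L = {root_test_estimate(a):.4f}")   # 0.5000
```

For a geometric series the two estimates agree exactly; in general the root test is at least as strong as the ratio test, but the ratio test is easier to apply when aₙ contains factorials.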
Key insight: No single convergence test works for all series. The art of series analysis is choosing the right test -- ratio for factorials, root for exponentials, comparison for polynomial decay.
The p-series converges if and only if p > 1. The borderline case p = 1 (harmonic series) diverges despite its terms going to zero -- a crucial example showing that terms going to zero is necessary but not sufficient for convergence.
Converges if and only if p > 1
Current (p = 2.0):
Sum ≈ 1.6251
Exact: π²/6 ≈ 1.6449
Key insight: The harmonic series diverges even though its terms approach zero. This is the classic example showing that the divergence test (terms not going to zero) catches failures, but passing it guarantees nothing.
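The contrast between p = 2 and p = 1 is easy to see numerically. A sketch (the helper name is illustrative): partial sums for p = 2 close in on π²/6, while the harmonic partial sums track ln n + γ (with γ ≈ 0.5772, the Euler-Mascheroni constant) and never settle:

```python
import math

# Partial sums of the p-series: sum of 1/k**p for k = 1..n.
def p_series_partial(p, n):
    return sum(1.0 / k ** p for k in range(1, n + 1))

# p = 2 converges to pi^2/6 ~ 1.6449.
print(f"p = 2, n = 10^4: {p_series_partial(2, 10_000):.4f} (pi^2/6 = {math.pi**2 / 6:.4f})")

# p = 1 (harmonic) grows without bound, roughly like ln n + gamma.
for n in (10, 1000, 100_000):
    print(f"p = 1, n = {n:>6}: {p_series_partial(1, n):.4f} (ln n + 0.5772 = {math.log(n) + 0.5772:.4f})")
```

The harmonic sums grow forever, but only logarithmically: reaching 100 would require more terms than atoms in the observable universe, which is why the divergence is invisible to casual numerical experiment.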
A conditionally convergent series can be rearranged to sum to any value! This stunning result shows why absolute convergence is so important -- absolutely convergent series always sum to the same value regardless of ordering.
A conditionally convergent series can be rearranged to converge to any real number!
The alternating harmonic series Σ(-1)^(n+1)/n converges to ln(2) ≈ 0.693. But by rearranging its terms (adding positive terms until we exceed the target, then negative terms until we go below), we can make it converge to any value we want -- in this case, 1.50.
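The greedy procedure just described is short enough to sketch directly (the function `rearranged_sum` is an illustrative helper, not from any library):

```python
import math

# Greedy rearrangement of the alternating harmonic series: add positive
# terms (+1, +1/3, +1/5, ...) until the running sum exceeds the target,
# then negative terms (-1/2, -1/4, ...) until it drops below; repeat.
def rearranged_sum(target, n_terms=100_000):
    pos, neg = 1, 2          # next odd (positive) and even (negative) denominator
    s = 0.0
    for _ in range(n_terms):
        if s <= target:
            s += 1.0 / pos
            pos += 2
        else:
            s -= 1.0 / neg
            neg += 2
    return s

print(f"rearranged toward 1.5: {rearranged_sum(1.5):.4f}")  # close to 1.50
print(f"original order (ln 2): {math.log(2):.4f}")          # 0.6931
```

The same terms, in a different order, home in on 1.50 instead of ln 2. The key is conditional convergence: the positive terms alone and the negative terms alone each diverge, so the greedy strategy never runs out of ammunition in either direction, while the shrinking term sizes force the oscillations around the target to die out.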
Key insight: Absolute convergence means the series of absolute values converges. Such series behave like finite sums -- rearrangement-invariant and safe to manipulate. Conditional convergence is fragile by comparison.