Sequences & Limits

Explore convergence, divergence, and the precise epsilon-N definition of limits

The Language of Limits

Sequences are the foundation of analysis. A sequence converges to a limit L if its terms eventually get arbitrarily close to L and stay there. The precise epsilon-N definition makes this intuition rigorous.

This section will help you visualize what this definition really means and build intuition for working with limits.

The Epsilon-N Definition of Convergence

The heart of rigorous analysis: for a sequence to converge to L, we must be able to trap all sufficiently late terms within any tolerance epsilon of L. Drag the epsilon slider to see how the required N changes -- smaller tolerance demands going further into the sequence.

ε slider: 0.01 (tight) to 2.0 (loose)

Current sequence: aₙ = 1/n

Limit: L = 0


Understanding the ε-N Definition

  • The green band shows all values within ε of the limit L
  • The red dashed line marks N: all terms after this point must stay in the band
  • Try decreasing ε and watch how N must increase to compensate
  • A smaller ε means a stricter requirement, so we need to go further into the sequence

Key insight: Convergence means that for every epsilon > 0, no matter how small, there exists an N such that all terms beyond index N lie within epsilon of the limit. The universal quantifier on epsilon is what gives this definition its power.
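For aₙ = 1/n this quantifier game can be made fully concrete: given ε, we need 1/n < ε, so the smallest admissible threshold is N = ⌈1/ε⌉. A minimal Python sketch (the helper name required_N is an illustrative choice, not from the text):

```python
import math

def required_N(eps: float) -> int:
    """Smallest N such that |1/n - 0| < eps for every n > N.
    Since a_n = 1/n and L = 0, we need 1/n < eps, i.e. n > 1/eps."""
    return math.ceil(1 / eps)

for eps in (2.0, 0.5, 0.1, 0.01):
    N = required_N(eps)
    # every term past N lies inside the eps-band around the limit
    assert all(abs(1 / n - 0) < eps for n in range(N + 1, N + 1000))
    print(f"eps = {eps}: N = {N}")
```

Note how N grows as ε shrinks: halving the tolerance roughly doubles how far into the sequence you must go, exactly the behavior the slider demonstrates.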

Convergent Sequence Explorer

Watch sequences approach their limits in real-time. Observe how different sequences converge at different rates -- some approach quickly, others more gradually. The animation reveals the "settling down" behavior that defines convergence.

Current sequence: aₙ = 1/n (the harmonic sequence)

Limit: L = 0

Monotonicity: decreasing

Key insight: The rate of convergence varies widely. Geometric sequences like (1/2)^n converge exponentially, while 1/n converges much more slowly. Both satisfy the epsilon-N definition, but the required N differs dramatically.
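That difference in rates can be measured directly: fix a tolerance ε and count how far into each sequence you must go before the terms enter the ε-band. A small sketch (first_index_within is an illustrative helper; for monotonically converging sequences, entering the band once means staying there):

```python
def first_index_within(seq, limit, eps, max_n=10**6):
    """First index n with |seq(n) - limit| < eps.
    For monotone convergence this is the threshold N from the definition."""
    for n in range(1, max_n):
        if abs(seq(n) - limit) < eps:
            return n
    return None

eps = 1e-4
n_harmonic  = first_index_within(lambda n: 1 / n,    0, eps)
n_geometric = first_index_within(lambda n: 0.5 ** n, 0, eps)
print(n_harmonic, n_geometric)   # 10001 vs 14: same eps, vastly different N
```

Both sequences pass the ε-N test for every ε, but the geometric sequence needs only N = 14 where the harmonic one needs over ten thousand terms.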

Why Sequences Fail to Converge

Not all sequences converge. Explore the three main ways a sequence can diverge: growing without bound (unbounded), oscillating between values without settling, or behaving chaotically within bounds. Compare with convergent sequences to see the difference.

Unbounded (→ +∞)

Terms grow without bound as n increases. No matter how large a number you pick, the sequence eventually exceeds it.

This sequence: Diverges to +∞ (unbounded)

Why Divergence Matters

  • Unbounded sequences grow without limit; no ε-neighborhood can contain their tails
  • Oscillating sequences have multiple cluster points but no single limit
  • Chaotic sequences are bounded but don't settle toward any value
  • Being bounded is necessary but not sufficient for convergence

Key insight: A sequence can be bounded yet still diverge (like (-1)^n), showing that boundedness alone does not guarantee convergence. Monotonicity plus boundedness does -- that is the Monotone Convergence Theorem.
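One numeric way to see the difference is to measure how much a late stretch of the sequence still varies: for a convergent sequence this spread shrinks toward zero, while a bounded oscillating sequence like (-1)^n keeps a fixed spread no matter how far out you look. A small illustrative sketch (tail_spread is an assumed helper name):

```python
def tail_spread(seq, start, window=1000):
    """max - min over a window of late terms; convergence forces this to 0."""
    vals = [seq(n) for n in range(start, start + window)]
    return max(vals) - min(vals)

convergent  = lambda n: 1 / n         # converges to 0
oscillating = lambda n: (-1) ** n     # bounded, but clusters at -1 and +1

print(tail_spread(convergent, 10**6))    # tiny, and shrinking with start
print(tail_spread(oscillating, 10**6))   # always 2.0, no matter how late
```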

The Monotone Convergence Theorem

A powerful result: every bounded monotone sequence must converge. An increasing sequence bounded above is "squeezed" toward its supremum; a decreasing sequence bounded below approaches its infimum. This theorem is fundamental for proving convergence without knowing the limit in advance.

Monotone Convergence Theorem (MCT)

Every bounded monotone sequence converges to its supremum (if increasing) or infimum (if decreasing).

If (aₙ) is increasing and bounded above, then lim aₙ = sup{aₙ}

Current sequence: aₙ = 2 - 1/n converges to 2

Why Does This Work?

  • An increasing sequence bounded above must have a least upper bound (supremum)
  • The sequence approaches but never exceeds this supremum
  • By the completeness of real numbers, this supremum must be the limit
  • Similarly, a decreasing sequence bounded below converges to its infimum

Key insight: The Monotone Convergence Theorem relies crucially on the completeness of the real numbers. In the rationals, a bounded increasing sequence can "converge" to an irrational gap.
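Both halves of this picture are easy to check numerically. The sketch below first verifies that aₙ = 2 - 1/n is increasing and bounded above by its supremum 2, then builds a monotone bounded sequence of exact rationals whose limit, √2, sits in an irrational "gap" of Q (the Newton-style iteration is an illustrative choice, not part of the text):

```python
from fractions import Fraction

# a_n = 2 - 1/n: increasing and bounded above by sup = 2
terms = [2 - Fraction(1, n) for n in range(1, 1001)]
assert all(x < y for x, y in zip(terms, terms[1:]))   # monotone increasing
assert all(x < 2 for x in terms)                      # bounded above
print(float(2 - terms[-1]))                           # distance to sup: 1/1000

# exact rationals converging to the irrational sqrt(2) -- a "gap" in Q
x = Fraction(2)
for _ in range(6):
    x = x / 2 + 1 / x        # decreasing and bounded below, every x rational
print(float(x))              # approximately 1.41421356...
```

Within the rationals the second sequence is monotone and bounded yet has no rational limit; completeness of R is exactly what rules this out for real sequences.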

The Bolzano-Weierstrass Theorem

Every bounded sequence has a convergent subsequence. This demo visualizes the bisection proof: repeatedly halving the interval and selecting terms from the more populated half extracts a subsequence that must converge. This theorem connects boundedness to convergence in a deep way.


Bolzano-Weierstrass Theorem

Every bounded sequence has a convergent subsequence.

The Bisection Method:

  1. Start with interval [-1, 1] containing all terms
  2. Bisect: divide interval in half at midpoint
  3. Choose the half with more (or equal) terms
  4. Select one term from that half for our subsequence
  5. Repeat — the intervals shrink, forcing convergence

Key insight: Bolzano-Weierstrass is equivalent to the completeness of R. It says that bounded sequences, even chaotic ones, always contain hidden convergent patterns.
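The bisection argument above can be sketched in code. The example runs it on the bounded (and famously irregular) sequence sin(n); the function name bw_subsequence and the tie-breaking details are illustrative choices, not part of the theorem:

```python
import math

def bw_subsequence(terms, lo=-1.0, hi=1.0, steps=10):
    """Bisection sketch: halve [lo, hi], keep the half holding more
    (or equally many) terms, and pick one unused term from that half."""
    chosen, last_idx = [], -1
    for _ in range(steps):
        mid = (lo + hi) / 2
        left  = [i for i, t in enumerate(terms) if lo <= t <= mid]
        right = [i for i, t in enumerate(terms) if mid < t <= hi]
        pool = left if len(left) >= len(right) else right
        lo, hi = (lo, mid) if pool is left else (mid, hi)
        # indices must strictly increase so the picks form a subsequence
        idx = next(i for i in pool if i > last_idx)
        chosen.append(terms[idx])
        last_idx = idx
    return chosen

terms = [math.sin(n) for n in range(100_000)]
sub = bw_subsequence(terms)
print(sub[-2], sub[-1])   # late picks cluster in an ever-shrinking interval
```

After k halvings the containing interval has width 2/2^k, so consecutive late picks are forced arbitrarily close together: the extracted subsequence is Cauchy, hence convergent.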

Key Takeaways

  • Epsilon-N definition -- a sequence converges to L if for every epsilon > 0, all terms beyond some index N lie within epsilon of L
  • Divergence -- sequences can fail to converge by being unbounded, oscillating, or behaving chaotically
  • Monotone Convergence Theorem -- every bounded monotone sequence converges, a direct consequence of completeness
  • Bolzano-Weierstrass -- every bounded sequence has a convergent subsequence, connecting compactness to sequential convergence