Explore convergence, divergence, and the precise epsilon-N definition of limits
Sequences are the foundation of analysis. A sequence converges to a limit L if its terms eventually get arbitrarily close to L and stay there. The precise epsilon-N definition makes this intuition rigorous.
This section will help you visualize what this definition really means and build intuition for working with limits.
The heart of rigorous analysis: for a sequence to converge to L, we must be able to trap all sufficiently late terms within any tolerance epsilon of L. Drag the epsilon slider to see how the required N changes -- a smaller tolerance demands going further into the sequence.
Current sequence: aₙ = 1/n
Limit: L = 0.0000
Key insight: Convergence means that for every epsilon > 0, no matter how small, there exists an N such that all terms beyond index N lie within epsilon of the limit. The universal quantifier on epsilon is what gives this definition its power.
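The definition can be tested numerically. Here is a small sketch (not the demo's own code) that finds the smallest N for aₙ = 1/n at a given tolerance, assuming, as holds here, that the distance to the limit is decreasing:

```python
# Sketch: for a_n = 1/n with limit L = 0, find the smallest N such that
# every term beyond index N lies within eps of L. Since |a_n| decreases,
# checking the first term past N suffices.
def a(n):
    return 1.0 / n

L = 0.0

def required_N(eps):
    """Smallest N with |a_n - L| < eps for all n > N (valid: |a_n - L| decreases)."""
    N = 0
    while abs(a(N + 1) - L) >= eps:
        N += 1
    return N

for eps in (0.5, 0.1, 0.01):
    print(f"eps = {eps}: N = {required_N(eps)}")
```

Halving epsilon roughly doubles the required N for this sequence, which previews the "rate of convergence" discussion below.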
Watch sequences approach their limits in real-time. Observe how different sequences converge at different rates -- some approach quickly, others more gradually. The animation reveals the "settling down" behavior that defines convergence.
The harmonic sequence converges to 0
Limit: L = 0
Monotonicity: decreasing
Key insight: The rate of convergence varies widely. Geometric sequences like (1/2)^n converge exponentially fast, while 1/n converges much more slowly. Both satisfy the epsilon-N definition, but the required N differs dramatically.
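To make the rate difference concrete, this sketch (assuming, as holds for both sequences, that the distance to the limit 0 is decreasing) compares the smallest N side by side:

```python
# Sketch: compare how far into each sequence we must go to get within eps of
# the limit 0. smallest_N assumes |a_n| is decreasing, true for both examples.
def smallest_N(seq, eps):
    N = 0
    while abs(seq(N + 1)) >= eps:
        N += 1
    return N

harmonic = lambda n: 1.0 / n     # converges slowly: N grows like 1/eps
geometric = lambda n: 0.5 ** n   # converges fast: N grows like log2(1/eps)

for eps in (0.1, 0.01, 0.001):
    print(f"eps={eps}: 1/n needs N={smallest_N(harmonic, eps)}, "
          f"(1/2)^n needs N={smallest_N(geometric, eps)}")
```

For eps = 0.001 the harmonic sequence needs N = 1000 while the geometric sequence needs only N = 9: the same definition is satisfied, at wildly different cost.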
Not all sequences converge. Explore the three main ways a sequence can diverge: growing without bound (unbounded), oscillating between values without settling, or behaving chaotically within bounds. Compare with convergent sequences to see the difference.
Terms grow without bound as n increases. No matter how large a number you pick, the sequence eventually exceeds it.
This sequence: Diverges to +∞ (unbounded)
Key insight: A sequence can be bounded yet still diverge (like (-1)^n), showing that boundedness alone does not guarantee convergence. Monotonicity plus boundedness does -- that is the Monotone Convergence Theorem.
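The bounded-but-divergent case can be checked directly. This sketch (illustrative, not part of the demo) shows that no candidate limit L lets the tail of (-1)^n settle within eps = 1, over a finite horizon:

```python
# Sketch: a_n = (-1)^n is bounded (|a_n| <= 1) but divergent. Consecutive
# terms are always 2 apart, so for eps = 1 no candidate limit L and no N
# can trap the tail within eps of L.
def a(n):
    return (-1) ** n

def tail_within(L, eps, N, horizon=1000):
    """True if every term with N < n <= horizon lies within eps of L."""
    return all(abs(a(n) - L) < eps for n in range(N + 1, horizon + 1))

# Whichever candidate limit we try, eps = 1 fails for every N checked:
for L in (-1.0, 0.0, 1.0):
    print(L, any(tail_within(L, 1.0, N) for N in range(50)))
```

A finite horizon can only suggest divergence, not prove it; the real argument is that terms 2 apart cannot both lie in an interval of length 2·eps when eps ≤ 1.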
A powerful result: every bounded monotone sequence must converge. An increasing sequence bounded above is "squeezed" toward its supremum; a decreasing sequence bounded below approaches its infimum. This theorem is fundamental for proving convergence without knowing the limit in advance.
Every bounded monotone sequence converges to its supremum (if increasing) or infimum (if decreasing).
Current sequence: aₙ = 2 - 1/n converges to 2.00
Key insight: The Monotone Convergence Theorem relies crucially on the completeness of the real numbers. In the rationals, a bounded increasing sequence can approach an irrational "gap" -- the decimal truncations of √2, for example -- and so fail to converge.
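The demo's example can be verified numerically. This sketch checks the two hypotheses of the theorem for aₙ = 2 - 1/n and watches the terms climb toward the supremum:

```python
# Sketch: a_n = 2 - 1/n is strictly increasing and bounded above by 2, so the
# Monotone Convergence Theorem guarantees convergence to its supremum, 2 --
# a value the terms approach but never attain.
def a(n):
    return 2.0 - 1.0 / n

terms = [a(n) for n in range(1, 10001)]

assert all(x < y for x, y in zip(terms, terms[1:]))  # monotone increasing
assert all(x < 2.0 for x in terms)                   # bounded above by 2
print(terms[-1])  # within 1/10000 of the supremum
```

Notice the theorem is applied without ever "solving" for the limit: monotonicity plus boundedness alone forces convergence, and completeness supplies the supremum as the limiting value.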
Every bounded sequence has a convergent subsequence. This demo visualizes the bisection proof: repeatedly halving the interval and keeping a half that contains infinitely many terms (in the demo, the more populated half) extracts a subsequence that must converge. This theorem connects boundedness to convergence in a deep way.
Every bounded sequence has a convergent subsequence.
The Bisection Method:
Key insight: Bolzano-Weierstrass is equivalent to the completeness of R. It says that bounded sequences, even chaotic ones, always contain hidden convergent patterns.
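The bisection construction can be sketched in code. This is an illustration under a loud assumption: a large finite horizon stands in for "infinitely many terms", and the bounded, divergent example aₙ = (-1)ⁿ(1 + 1/n) is our own stand-in, not necessarily the demo's sequence:

```python
# Sketch of the Bolzano-Weierstrass bisection construction. We approximate
# "the half containing infinitely many terms" by counting terms over a finite
# horizon -- a heuristic for visualization, not the rigorous proof itself.
def a(n):  # bounded in [-2, 2], divergent: oscillates near -1 and +1
    return (-1) ** n * (1.0 + 1.0 / n)

def bisect_subsequence(lo=-2.0, hi=2.0, stages=12, horizon=100_000):
    terms = [a(n) for n in range(1, horizon + 1)]
    indices, last = [], 0
    for _ in range(stages):
        mid = (lo + hi) / 2.0
        left = sum(1 for x in terms if lo <= x <= mid)
        right = sum(1 for x in terms if mid < x <= hi)
        lo, hi = (lo, mid) if left >= right else (mid, hi)
        # pick a later term that lands in the chosen half
        for n in range(last + 1, horizon + 1):
            if lo <= terms[n - 1] <= hi:
                indices.append(n)
                last = n
                break
    return indices, (lo + hi) / 2.0

indices, estimate = bisect_subsequence()
print(indices[:5], estimate)  # indices strictly increase; estimate is near -1
```

Each halving traps the surviving terms in an interval of half the previous width, so the selected subsequence is forced to converge to the shrinking intervals' common point, here the cluster value -1 of the odd-indexed terms.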