In mathematical analysis, the alternating series test proves that an alternating series is convergent when its terms decrease monotonically in absolute value and approach zero in the limit. The test was devised by Gottfried Leibniz and is sometimes known as Leibniz's test, Leibniz's rule, or the Leibniz criterion.
Like any series, an alternating series is a convergent series if and only if the sequence of partial sums of the series converges to a limit. The alternating series test guarantees that an alternating series is convergent if the terms a_n converge to 0 monotonically, but this condition is not necessary for convergence.
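For reference, a compact statement of the test together with the standard partial-sum error bound (the symbols S and S_N are introduced here and do not appear in the text above):

\[
a_n \ge a_{n+1} \ge 0, \quad \lim_{n\to\infty} a_n = 0
\;\Longrightarrow\;
S = \sum_{n=1}^{\infty} (-1)^{n+1} a_n \text{ converges and } |S - S_N| \le a_{N+1},
\]

where S_N denotes the N-th partial sum. The error bound is the usual companion estimate to Leibniz's test; it is not asserted in the excerpt above.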
For the root test, set r equal to the limit superior of the n-th roots |a_n|^(1/n). If r < 1, the series converges absolutely; if r > 1, then the series diverges. If r = 1, the root test is inconclusive, and the series may converge or diverge. The root test is stronger than the ratio test: whenever the ratio test determines the convergence or divergence of an infinite series, the root test does too, but not conversely. [1]
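As an illustration of the last claim, here is a standard example (not taken from the excerpt) of a series on which the ratio test is inconclusive but the root test succeeds:

\[
a_n = \begin{cases} 2^{-n}, & n \text{ even}, \\ 3^{-n}, & n \text{ odd}, \end{cases}
\qquad
\liminf_{n\to\infty} \frac{a_{n+1}}{a_n} = 0, \quad
\limsup_{n\to\infty} \frac{a_{n+1}}{a_n} = \infty, \quad
\limsup_{n\to\infty} \sqrt[n]{a_n} = \tfrac{1}{2}.
\]

The ratio test gives no verdict because the ratios oscillate between 0 and ∞, while the root test yields r = 1/2 < 1 and hence absolute convergence.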
An example of a conditionally convergent series is the alternating harmonic series. Many standard tests for divergence and convergence, most notably including the ratio test and the root test, demonstrate absolute convergence. This is because a power series is absolutely convergent on the interior of its disk of convergence. [a]
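The claim about power series can be made precise via the Cauchy–Hadamard formula; the symbols c_n, a, z, and R below are introduced here for illustration and are not part of the excerpt:

\[
R = \frac{1}{\limsup_{n\to\infty} \sqrt[n]{|c_n|}}, \qquad
|z - a| < R \;\Longrightarrow\; \sum_{n=0}^{\infty} |c_n|\,|z - a|^n < \infty,
\]

so the power series ∑ c_n (z − a)^n converges absolutely at every point strictly inside its disk of convergence (with the usual conventions when the limit superior is 0 or ∞).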
In mathematics, the comparison test, sometimes called the direct comparison test to distinguish it from similar related tests (especially the limit comparison test), provides a way of deducing whether an infinite series or an improper integral converges or diverges by comparing the series or integral to one whose convergence properties are known.
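A compact statement of the series form of the test, with notation introduced here rather than taken from the excerpt:

\[
0 \le a_n \le b_n \text{ for all sufficiently large } n:
\qquad
\sum_n b_n \text{ converges} \;\Longrightarrow\; \sum_n a_n \text{ converges},
\qquad
\sum_n a_n \text{ diverges} \;\Longrightarrow\; \sum_n b_n \text{ diverges}.
\]

The integral version reads the same way with the sums replaced by improper integrals of nonnegative functions.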
The Riemann series theorem states that if a series converges conditionally, it is possible to rearrange the terms of the series in such a way that the series converges to any value, or even diverges. Agnew's theorem characterizes rearrangements that preserve convergence for all series.
A classic example is the alternating harmonic series given by 1 − 1/2 + 1/3 − 1/4 + ⋯ = ∑_{n=1}^{∞} (−1)^{n+1}/n, which converges to ln(2), but is not absolutely convergent (see Harmonic series). Bernhard Riemann proved that a conditionally convergent series may be rearranged to converge to any value at all, including ∞ or −∞; see Riemann series theorem.
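A classical concrete instance of the theorem (a standard textbook rearrangement, not quoted from the excerpt) takes one positive term of the alternating harmonic series followed by two negative ones:

\[
1 - \tfrac{1}{2} - \tfrac{1}{4} + \tfrac{1}{3} - \tfrac{1}{6} - \tfrac{1}{8} + \tfrac{1}{5} - \tfrac{1}{10} - \tfrac{1}{12} + \cdots
= \tfrac{1}{2} \ln 2,
\]

half the value of the original ordering, even though exactly the same terms are summed.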
Those methods work on oscillating divergent series, but they cannot produce a finite answer for a series that diverges to +∞. [6] Most of the more elementary definitions of the sum of a divergent series are stable and linear, and any method that is both stable and linear cannot sum 1 + 2 + 3 + ⋯ to a finite value (see § Heuristics below).
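A sketch of the standard heuristic behind that claim, assuming a summation method S that is stable, linear, and assigns 1 + 2 + 3 + ⋯ the finite value x:

\[
\begin{aligned}
x &= S(1 + 2 + 3 + \cdots) = S(0 + 1 + 2 + 3 + \cdots) && \text{(stability)} \\
x - x &= S\bigl((1 - 0) + (2 - 1) + (3 - 2) + \cdots\bigr) = S(1 + 1 + 1 + \cdots) = 0 && \text{(linearity)} \\
0 &= S(1 + 1 + 1 + \cdots) = S(0 + 1 + 1 + 1 + \cdots) && \text{(stability)} \\
0 - 0 &= S\bigl((1 - 0) + (1 - 1) + (1 - 1) + \cdots\bigr) = S(1 + 0 + 0 + \cdots) && \text{(linearity)}
\end{aligned}
\]

Yet any stable, linear method must assign 1 + 0 + 0 + ⋯ the value 1, so no such finite x can exist.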