The alternating series criterion serves to prove convergence of an alternating series, i.e. a series where the signs alternate between positive and negative, like $\sum_{k=1}^{\infty} (-1)^{k+1} a_k$ or $\sum_{k=1}^{\infty} (-1)^{k} a_k$ (with all $a_k$ being positive). Series of this kind can be convergent without being absolutely convergent. In those cases, criteria for absolute convergence fail, but the alternating series criterion may still succeed.
The alternating series criterion goes back to Gottfried Wilhelm Leibniz and was published in 1682.
Introductory example: Convergence of the alternating harmonic series
Series treated by the alternating series criterion often converge, but do not converge absolutely. Perhaps the most prominent example of such a convergent, but not absolutely convergent series is the alternating harmonic series $\sum_{k=1}^{\infty} \frac{(-1)^{k+1}}{k}$. Its convergence can be shown by verifying that the sequence of partial sums $S_n = \sum_{k=1}^{n} \frac{(-1)^{k+1}}{k}$ converges. For $n = 1, \ldots, 6$, those partial sums are
$S_1 = 1,\; S_2 = \tfrac{1}{2},\; S_3 = \tfrac{5}{6},\; S_4 = \tfrac{7}{12},\; S_5 = \tfrac{47}{60},\; S_6 = \tfrac{37}{60}.$
Those partial sums make jumps of ever smaller size. The partial sums with odd indices ($S_1, S_3, S_5, \ldots$) seem to be monotonically decreasing and those with even indices ($S_2, S_4, S_6, \ldots$) seem to be increasing. A simple calculation verifies this assertion: for all $k \in \mathbb{N}$ there is
$S_{2k+1} - S_{2k-1} = -\tfrac{1}{2k} + \tfrac{1}{2k+1} < 0,$
i.e. $S_{2k+1} < S_{2k-1}$ (monotonically decreasing). Analogously, $S_{2k+2} - S_{2k} = \tfrac{1}{2k+1} - \tfrac{1}{2k+2} > 0$, so $S_{2k+2} > S_{2k}$ (monotonically increasing).
If we could now show that $(S_{2k+1})_{k \in \mathbb{N}}$ is bounded from below and $(S_{2k})_{k \in \mathbb{N}}$ is bounded from above, then both sequences would converge by the monotonicity criterion. Luckily, this is exactly the case: all odd partial sums are bounded from below by any even partial sum, and all even partial sums are bounded from above by any odd partial sum. For all $k \in \mathbb{N}$ there is
$S_{2k+1} - S_{2k} = \tfrac{1}{2k+1} > 0,$
so $S_{2k+1} > S_{2k}$. Combining this with the monotonicity from above, we obtain $S_{2k+1} > S_{2k} \geq S_2$ and $S_{2k} < S_{2k+1} \leq S_1$. We therefore have the bounds $S_{2k+1} \geq S_2 = \tfrac{1}{2}$ and $S_{2k} \leq S_1 = 1$. Hence, $(S_{2k+1})_{k \in \mathbb{N}}$ is bounded from below by $\tfrac{1}{2}$ and $(S_{2k})_{k \in \mathbb{N}}$ is bounded from above by $1$.
The monotonicity criterion now implies that both subsequences of partial sums, $(S_{2k+1})_{k \in \mathbb{N}}$ and $(S_{2k})_{k \in \mathbb{N}}$, are convergent.
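The following short computation illustrates these monotonicity and boundedness statements numerically. It is a minimal sketch in Python, added here as an illustration and not part of the argument; it uses exact fractions so that no rounding errors occur.

```python
from fractions import Fraction

# Partial sums S_1, ..., S_N of the alternating harmonic series,
# computed with exact rational arithmetic (no rounding errors).
N = 200
partial_sums = []
total = Fraction(0)
for k in range(1, N + 1):
    total += Fraction((-1) ** (k + 1), k)
    partial_sums.append(total)

odd = partial_sums[0::2]   # S_1, S_3, S_5, ...
even = partial_sums[1::2]  # S_2, S_4, S_6, ...

# Odd partial sums decrease, even partial sums increase ...
assert all(odd[i + 1] < odd[i] for i in range(len(odd) - 1))
assert all(even[i + 1] > even[i] for i in range(len(even) - 1))
# ... and from S_2 on, every partial sum lies between S_2 = 1/2 and S_1 = 1.
assert all(Fraction(1, 2) <= s <= 1 for s in partial_sums[1:])

print([str(s) for s in odd[:3]], [str(s) for s in even[:3]])
# ['1', '5/6', '47/60'] ['1/2', '7/12', '37/60']
```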
In order to get convergence of the series, we need to show that the whole sequence of partial sums $(S_n)_{n \in \mathbb{N}}$ converges. This is certainly the case if both the odd and the even subsequence, $(S_{2k+1})_{k \in \mathbb{N}}$ and $(S_{2k})_{k \in \mathbb{N}}$, converge to the same limit.
How can this be shown? First, let us assign names to the limits: $S_{\text{odd}} := \lim_{k \to \infty} S_{2k+1}$ and $S_{\text{even}} := \lim_{k \to \infty} S_{2k}$. The statement we want to show can then be expressed mathematically as $S_{\text{odd}} = S_{\text{even}}$. We show this by subtracting both limits from each other, which is equivalent to taking the limit of the sequence of differences:
$S_{\text{odd}} - S_{\text{even}} = \lim_{k \to \infty} S_{2k+1} - \lim_{k \to \infty} S_{2k} = \lim_{k \to \infty} (S_{2k+1} - S_{2k}).$
Above, we showed $S_{2k+1} - S_{2k} = \tfrac{1}{2k+1}$, which is a null sequence: $\lim_{k \to \infty} \tfrac{1}{2k+1} = 0$. Hence, $S_{\text{odd}} - S_{\text{even}} = 0$, which means $S_{\text{odd}} = S_{\text{even}}$.
From this, we can conclude that the sequence of partial sums $(S_n)_{n \in \mathbb{N}}$ converges to $S := S_{\text{odd}} = S_{\text{even}}$. Mathematically, we need the partial sums to stay closer than any given $\varepsilon > 0$ to the limit value after surpassing some index $N$, i.e. $|S_n - S| < \varepsilon$ for all $n \geq N$. For a fixed $\varepsilon > 0$, both the odd and the even partial sum sequence have a suitable such index, which we name $N_1$ (odd) and $N_2$ (even):
$|S_{2k+1} - S| < \varepsilon \text{ for all } 2k+1 \geq N_1 \quad\text{and}\quad |S_{2k} - S| < \varepsilon \text{ for all } 2k \geq N_2.$
After reaching the greater of these two numbers, $N := \max\{N_1, N_2\}$, both sequences stay closer than $\varepsilon$ to the limit value, and hence $|S_n - S| < \varepsilon$ for all $n \geq N$. So $(S_n)_{n \in \mathbb{N}}$ converges to $S$, and the alternating harmonic series converges.
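As a small numerical illustration of this $\varepsilon$-$N$ argument (a sketch added here, not part of the proof), the following Python snippet checks some indices beyond $N = 1000$ for $\varepsilon = 10^{-3}$. For the comparison it uses the value $\ln 2$, which is quoted as the limit of the alternating harmonic series later in this article.

```python
import math

def partial_sum(n: int) -> float:
    """n-th partial sum of the alternating harmonic series."""
    return sum((-1) ** (k + 1) / k for k in range(1, n + 1))

limit = math.log(2)  # value of the limit, quoted later in this article
eps = 1e-3

# The error bound |S - S_n| <= 1/(n + 1) derived at the end of this article
# guarantees that N = 1000 is a suitable index for eps = 10^-3:
for n in (1000, 2000, 5000):
    print(n, abs(partial_sum(n) - limit) < eps)  # True in every case
```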
Generalizing the proof idea / alternating series test
Now we consider any alternating series. Can we use the same proof as for the alternating harmonic series to show that our general alternating series converges? The answer will depend on the properties of the general alternating series. We used the following properties of the alternating harmonic series:
- The sequence of coefficients $(a_k)_{k \in \mathbb{N}} = \left(\tfrac{1}{k}\right)_{k \in \mathbb{N}}$ without the alternating sign is monotonically decreasing. This gave us monotonicity and boundedness of the two partial sum subsequences $(S_{2k+1})_{k \in \mathbb{N}}$ and $(S_{2k})_{k \in \mathbb{N}}$, so we could show that they converge. Without the monotonicity, this may not be the case.
- Further, we used that $\left(\tfrac{1}{k}\right)_{k \in \mathbb{N}}$ is a null sequence. This was needed to show that both $(S_{2k+1})_{k \in \mathbb{N}}$ and $(S_{2k})_{k \in \mathbb{N}}$ have the same limit, so $(S_n)_{n \in \mathbb{N}}$ converges to that limit. If the coefficient sequence instead converged to a constant $c > 0$, then $(S_n)_{n \in \mathbb{N}}$ would eventually tend to "jump" up and down by an amount of about $c$, and the limits of $(S_{2k+1})_{k \in \mathbb{N}}$ and $(S_{2k})_{k \in \mathbb{N}}$ may differ by $c$, so they are not equal (see the numerical sketch after this list).
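The following minimal Python sketch (an added illustration, not part of the original argument) shows this jumping behaviour for the hypothetical coefficient choice $a_k = 1 + \tfrac{1}{k}$, which converges to $c = 1$ instead of $0$.

```python
# Hypothetical coefficients a_k = 1 + 1/k: they converge to c = 1, not to 0.
def a(k: int) -> float:
    return 1 + 1 / k

# Partial sums of the alternating series sum_k (-1)^(k+1) * a_k.
total, partial_sums = 0.0, []
for k in range(1, 21):
    total += (-1) ** (k + 1) * a(k)
    partial_sums.append(round(total, 3))

# The partial sums keep jumping by roughly c = 1: the odd-indexed sums settle
# about 1 above the even-indexed ones, so (S_n) cannot converge.
print(partial_sums)
```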
No further properties of the alternating harmonic series have been used for the proof. So we may use the above proof steps to show convergence of a general alternating series:
Theorem (Alternating series test)
Let $(a_k)_{k \in \mathbb{N}}$ be a non-negative, monotonically decreasing null sequence of real numbers, i.e. $a_k \geq a_{k+1} \geq 0$ for all $k \in \mathbb{N}$ and $\lim_{k \to \infty} a_k = 0$. Then, the alternating series
$\sum_{k=1}^{\infty} (-1)^{k+1} a_k$
converges.
The proof uses the same steps as the convergence proof for the alternating harmonic series above.
Proof (Alternating series test)
We need to show that the sequence of partial sums $S_n = \sum_{k=1}^{n} (-1)^{k+1} a_k$ converges.
Step 1: The odd subsequence $(S_{2k+1})_{k \in \mathbb{N}}$ is monotonically decreasing and the even subsequence $(S_{2k})_{k \in \mathbb{N}}$ is monotonically increasing, as for any $k \in \mathbb{N}$ there is
$S_{2k+1} - S_{2k-1} = -a_{2k} + a_{2k+1} \leq 0$
and analogously $S_{2k+2} - S_{2k} = a_{2k+1} - a_{2k+2} \geq 0$.
Step 2: $(S_{2k+1})_{k \in \mathbb{N}}$ is bounded from below and $(S_{2k})_{k \in \mathbb{N}}$ is bounded from above, since for $k \in \mathbb{N}$ there is
$S_{2k+1} - S_{2k} = a_{2k+1} \geq 0.$
So $S_{2k+1} \geq S_{2k} \geq S_2$ and $S_{2k} \leq S_{2k+1} \leq S_1$. The monotonicity criterion yields convergence of both subsequences of partial sums, $(S_{2k+1})_{k \in \mathbb{N}}$ and $(S_{2k})_{k \in \mathbb{N}}$.
Step 3: $(S_{2k+1})_{k \in \mathbb{N}}$ and $(S_{2k})_{k \in \mathbb{N}}$ converge to the same limit. Let $S_{\text{odd}} := \lim_{k \to \infty} S_{2k+1}$ and $S_{\text{even}} := \lim_{k \to \infty} S_{2k}$. In step 2, we proved convergence of both sequences, so we can use the sum rule for limits of sequences:
$S_{\text{odd}} - S_{\text{even}} = \lim_{k \to \infty} S_{2k+1} - \lim_{k \to \infty} S_{2k} = \lim_{k \to \infty} (S_{2k+1} - S_{2k}) = \lim_{k \to \infty} a_{2k+1}.$
On the other hand, $\lim_{k \to \infty} a_{2k+1} = 0$, since both $(a_k)_{k \in \mathbb{N}}$ and its subsequence $(a_{2k+1})_{k \in \mathbb{N}}$ are null sequences. So both limits are equal ($S_{\text{odd}} = S_{\text{even}}$).
Step 4: $(S_n)_{n \in \mathbb{N}}$ also converges to $S := S_{\text{odd}} = S_{\text{even}}$. Since $(S_{2k+1})_{k \in \mathbb{N}}$ and $(S_{2k})_{k \in \mathbb{N}}$ converge to $S$, both approach $S$ up to any $\varepsilon > 0$: there are indices $N_1$ and $N_2$ with
$|S_{2k+1} - S| < \varepsilon \text{ for all } 2k+1 \geq N_1 \quad\text{and}\quad |S_{2k} - S| < \varepsilon \text{ for all } 2k \geq N_2.$
We now take the greater number $N := \max\{N_1, N_2\}$ and obtain that also $(S_n)_{n \in \mathbb{N}}$ approaches $S$ up to $\varepsilon$:
$|S_n - S| < \varepsilon \text{ for all } n \geq N.$
So the series $\sum_{k=1}^{\infty} (-1)^{k+1} a_k$ converges, and we are done with the proof.
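As a small illustration of the theorem (a sketch added here, not part of the proof), the following Python function computes partial sums of $\sum_{k=1}^{\infty} (-1)^{k+1} a_k$ for an arbitrary coefficient function. For a monotone null sequence such as the example choice $a_k = \tfrac{1}{\sqrt{k}}$, the partial sums visibly settle down, even though the series does not converge absolutely.

```python
from typing import Callable, List

def alternating_partial_sums(a: Callable[[int], float], n: int) -> List[float]:
    """Partial sums S_1, ..., S_n of the series sum_k (-1)^(k+1) * a(k)."""
    sums, total = [], 0.0
    for k in range(1, n + 1):
        total += (-1) ** (k + 1) * a(k)
        sums.append(total)
    return sums

# a_k = 1/sqrt(k) is a monotonically decreasing null sequence, so the series
# converges by the theorem, although sum_k 1/sqrt(k) itself diverges.
sums = alternating_partial_sums(lambda k: k ** -0.5, 100_000)
print(sums[999], sums[9_999], sums[99_999])  # partial sums settling near one value
```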
Proof alternative
Alternatively, one may use the Cauchy criterion for proving the alternating series criterion.
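For orientation, here is a minimal sketch of the key estimate in such a proof (an added outline, not the full argument): for natural numbers $m > n$, grouping consecutive terms and using that $(a_k)_{k \in \mathbb{N}}$ is non-negative and decreasing gives
$\left| \sum_{k=n+1}^{m} (-1)^{k+1} a_k \right| = \left| a_{n+1} - a_{n+2} + a_{n+3} - \dots \pm a_m \right| \leq a_{n+1},$
since the pairs $a_{n+1} - a_{n+2}$, $a_{n+3} - a_{n+4}, \ldots$ are non-negative (so the expression inside the absolute value is at least $0$), while regrouping it as $a_{n+1} - (a_{n+2} - a_{n+3}) - \dots$ shows it is at most $a_{n+1}$. Because $(a_k)_{k \in \mathbb{N}}$ is a null sequence, the right-hand side becomes smaller than any $\varepsilon > 0$ for sufficiently large $n$, so the partial sums form a Cauchy sequence and the series converges.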
Notes on the alternating series test
- Of course, we can also change the series signs from $+,-,+,-,\ldots$ to $-,+,-,+,\ldots$ and get a valid convergence criterion for series like $\sum_{k=1}^{\infty} (-1)^{k} a_k$. The proof is the same, with the roles of $(S_{2k+1})_{k \in \mathbb{N}}$ and $(S_{2k})_{k \in \mathbb{N}}$ interchanged.
- One can also start from $k = 0$, i.e. consider series like $\sum_{k=0}^{\infty} (-1)^{k} a_k$ or $\sum_{k=0}^{\infty} (-1)^{k+1} a_k$. Any starting index $k_0 \in \mathbb{N}_0$ is OK. The proof is just the same, including an index shift.
- As above, the alternating series test only yields convergence, not absolute convergence. For instance, the alternating harmonic series $\sum_{k=1}^{\infty} \frac{(-1)^{k+1}}{k}$ converges by the alternating series test, but it does not converge absolutely.
- The alternating series test can never be used to imply divergence of a series. If a series fails to meet the criteria of the alternating series test, it can still converge. There is an example warning about this below.
- The proof of the alternating series test implies that the intervals $I_k = [S_{2k}, S_{2k+1}]$ with $S_{2k} = \sum_{j=1}^{2k} (-1)^{j+1} a_j$ and $S_{2k+1} = \sum_{j=1}^{2k+1} (-1)^{j+1} a_j$ form a sequence of nested intervals (see the numerical sketch after this list).
- One can also prove the more general Dirichlet test and then conclude the alternating series test as a special case. Further information will be given at the end of this article.
- We could also take $(a_k)_{k \in \mathbb{N}}$ to be a non-positive and monotonically increasing null sequence, i.e. one that approaches $0$ from below. The proof works the same way. In particular, this means that $\sum_{k=1}^{\infty} (-1)^{k+1} a_k$ converges whenever $(a_k)_{k \in \mathbb{N}}$ is just any monotone null sequence.
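The nested-interval remark above can be made visible with a short numerical sketch (an added illustration, using the alternating harmonic series as the concrete example):

```python
from fractions import Fraction

def S(n: int) -> Fraction:
    """n-th partial sum of the alternating harmonic series as an exact fraction."""
    return sum(Fraction((-1) ** (k + 1), k) for k in range(1, n + 1))

# The intervals I_k = [S_{2k}, S_{2k+1}] are nested: each contains the next one.
intervals = [(S(2 * k), S(2 * k + 1)) for k in range(1, 6)]
for (lo, hi), (lo_next, hi_next) in zip(intervals, intervals[1:]):
    assert lo <= lo_next <= hi_next <= hi

print([(float(lo), float(hi)) for lo, hi in intervals])
```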
What if a condition is not fulfilled?
It is important to check that both conditions for the alternating series test are fulfilled! There are alternating series which do not meet one of them. The following examples illustrate alternating series where $(a_k)_{k \in \mathbb{N}}$ is either not converging to $0$ (our example converges to $1$) or not monotone. Both examples fail to converge (although they are alternating). The third example is an alternating series which fails the alternating series test (as it is not monotone), but nevertheless converges. So the alternating series test does not identify all convergent alternating series.
Example (Example 2: Monotonicity is needed.)
Next, we consider the series $\sum_{k=1}^{\infty} (-1)^{k+1} a_k$ with
$a_k = \begin{cases} \frac{2}{k} & \text{for odd } k, \\ \frac{1}{k} & \text{for even } k. \end{cases}$
Will it converge? Let us try the alternating series test:
- $(a_k)_{k \in \mathbb{N}}$ is not monotonically decreasing. This can easily be seen by considering the first series elements: $a_1 = 2$, $a_2 = \tfrac{1}{2}$, $a_3 = \tfrac{2}{3}$, $a_4 = \tfrac{1}{4}$. Clearly, $a_2 < a_3$. More generally, for all $k \in \mathbb{N}$ there is $2 \cdot 2k > 2k+1$ (can be shown by induction) and hence $a_{2k} = \tfrac{1}{2k} < \tfrac{2}{2k+1} = a_{2k+1}$. So the sequence of elements is not monotonically decreasing, since otherwise we would have $a_{2k} \geq a_{2k+1}$. Only the subsequences of elements with even and with odd index are monotonically decreasing on their own.
- However, $(a_k)_{k \in \mathbb{N}}$ is a null sequence.
Hence, the alternating series test does not apply. In fact, we can show that the sequence of partial sums $(S_n)_{n \in \mathbb{N}}$ is unbounded, so the series diverges. We use the estimate
$\frac{2}{2j-1} - \frac{1}{2j} \geq \frac{2}{2j} - \frac{1}{2j} = \frac{1}{2j}.$
This implies
$S_{2n} = \sum_{j=1}^{n} \left( \frac{2}{2j-1} - \frac{1}{2j} \right) \geq \sum_{j=1}^{n} \frac{1}{2j} = \frac{1}{2} \sum_{j=1}^{n} \frac{1}{j}.$
Since the harmonic series diverges, also $S_{2n} \to \infty$ and the entire series will diverge. Loosely speaking, the reason for the divergence is that the series corresponding to the positive and the negative elements, $\sum_{j=1}^{\infty} \frac{2}{2j-1}$ and $\sum_{j=1}^{\infty} \frac{1}{2j}$, diverge to $\infty$ at different speeds. And their speed difference is so huge that the series of differences diverges, too.
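A minimal numerical sketch of this divergence (an added illustration, using the coefficients $a_k$ defined in this example):

```python
def a(k: int) -> float:
    # Coefficients of this example: 2/k for odd k, 1/k for even k.
    return 2 / k if k % 2 == 1 else 1 / k

def S(n: int) -> float:
    """n-th partial sum of sum_k (-1)^(k+1) * a_k."""
    return sum((-1) ** (k + 1) * a(k) for k in range(1, n + 1))

# The even-indexed partial sums grow without bound, roughly like (1/2) * ln(n):
print([round(S(2 * n), 2) for n in (10, 100, 1000, 10_000)])
```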
Conclusion: Error bounds for the limit
The alternating series test can show convergence, but it does not give us the limit. For instance, for the alternating harmonic series, there is
$\sum_{k=1}^{\infty} \frac{(-1)^{k+1}}{k} = \ln 2.$
But this limit cannot be computed by the alternating series test. However, we can approximate the limit by considering partial sums, and the alternating series test will provide us with a neat upper bound for the error of such an approximation.
We have seen above in this article that the sequence of partial sums with odd index, $(S_{2k+1})_{k \in \mathbb{N}}$, is monotonically decreasing and converges to the limit $S$. Further, $S = \inf\{S_{2k+1} \mid k \in \mathbb{N}\}$, where the infimum of a set is the greatest possible lower bound of its elements. Hence, $S \leq S_{2k+1}$ for all $k \in \mathbb{N}$, so we have upper bounds for the limit that get better and better. Conversely, $(S_{2k})_{k \in \mathbb{N}}$ is monotonically increasing with $S = \sup\{S_{2k} \mid k \in \mathbb{N}\}$. So $S_{2k}$ gives a lower bound for the limit for all $k \in \mathbb{N}$. That means, we have the estimate $S_{2k} \leq S$ and $S \leq S_{2k+1}$.
How good is the estimate? We subtract the two inequalities from each other and get
$0 \leq S_{2k+1} - S_{2k} = a_{2k+1}.$
Since $S$ lies between $S_{2k}$ and $S_{2k+1}$, both approximations differ from $S$ by at most this amount: $|S - S_{2k}| \leq a_{2k+1}$ and, using the next interval $[S_{2k+2}, S_{2k+1}]$, also $|S - S_{2k+1}| \leq a_{2k+2}$. In both cases, the error of the $n$-th partial sum is at most $a_{n+1}$. So the series elements serve as a precision indicator for the estimate of the limit by partial sums:
Theorem (Error estimate for approximating alternating series)
If an alternating series $\sum_{k=1}^{\infty} (-1)^{k+1} a_k$ converges to $S$ by the alternating series test, then the limit can be approximated by the partial sums $S_n = \sum_{k=1}^{n} (-1)^{k+1} a_k$ with maximum error
$|S - S_n| \leq a_{n+1}.$
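A quick numerical sanity check of this error bound for the alternating harmonic series (an added sketch; it compares the partial sums against the known limit $\ln 2$):

```python
import math

limit = math.log(2)  # limit of the alternating harmonic series

total, bound_holds = 0.0, True
for n in range(1, 10_001):
    total += (-1) ** (n + 1) / n
    # Error bound from the theorem: |S - S_n| <= a_{n+1} = 1/(n+1).
    bound_holds = bound_holds and abs(limit - total) <= 1 / (n + 1)

print(bound_holds)  # True: the bound holds for every checked n
```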
Generalizing the alternating series test to the Dirichlet test
The Dirichlet test serves for proving convergence of series of the form $\sum_{k=1}^{\infty} a_k b_k$. It extends the alternating series test to cases where $b_k$ is not necessarily $(-1)^{k+1}$. This is particularly useful if the sign does not change from element to element (like $+,-,+,-,\ldots$) but can have streaks without a change in between (like $+,+,-,+,+,-,\ldots$). The proof is based on Abel's partial summation, which is quite some work to do. We will not state it here.
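For orientation, the usual formulation of the Dirichlet test is: if $(a_k)_{k \in \mathbb{N}}$ is a monotonically decreasing null sequence of real numbers and the partial sums of $(b_k)_{k \in \mathbb{N}}$ are bounded, i.e. there is an $M > 0$ such that
$\left| \sum_{k=1}^{n} b_k \right| \leq M \quad \text{for all } n \in \mathbb{N},$
then the series $\sum_{k=1}^{\infty} a_k b_k$ converges.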
The conditions for $(a_k)_{k \in \mathbb{N}}$ are exactly the same as for the alternating series test. Actually, with $b_k = (-1)^{k+1}$, we just get the alternating series test as a special case.