The alternating series criterion serves to prove convergence of an alternating series, i.e. a series whose signs alternate between positive and negative, like $\sum_{k=1}^\infty (-1)^{k+1} a_k$ or $\sum_{k=1}^\infty (-1)^{k} a_k$ (with all $a_k$ being positive). Series of this kind can be convergent without being absolutely convergent. In those cases, criteria for absolute convergence will fail, but the alternating series criterion may still be successful.
Series treated by the alternating series criterion often converge, but do not converge absolutely. Perhaps the most prominent example of such a convergent, but not absolutely convergent series is the alternating harmonic series $\sum_{k=1}^\infty \frac{(-1)^{k+1}}{k}$. Its convergence can be shown by verifying that the sequence of partial sums $S_n = \sum_{k=1}^n \frac{(-1)^{k+1}}{k}$ converges. For $n = 1, \ldots, 6$, those partial sums are $S_1 = 1$, $S_2 = \tfrac12$, $S_3 = \tfrac56$, $S_4 = \tfrac{7}{12}$, $S_5 = \tfrac{47}{60}$, $S_6 = \tfrac{37}{60}$.
Those partial sums jump up and down by ever smaller amounts. The partial sums with odd indices ($S_1, S_3, S_5, \ldots$) seem to be monotonously decreasing, and those with even indices ($S_2, S_4, S_6, \ldots$) seem to be monotonously increasing. A simple calculation verifies this assertion: for all $k \in \mathbb{N}$ there is $S_{2k+3} - S_{2k+1} = \frac{1}{2k+3} - \frac{1}{2k+2} < 0$,
i.e. $S_{2k+3} \le S_{2k+1}$ (monotonously decreasing). Analogously, $S_{2k+4} - S_{2k+2} = \frac{1}{2k+3} - \frac{1}{2k+4} > 0$, so $S_{2k+4} \ge S_{2k+2}$ (monotonously increasing).
If we could now show that $(S_{2k+1})_{k\in\mathbb{N}}$ is bounded from below and $(S_{2k})_{k\in\mathbb{N}}$ is bounded from above, then both sequences would converge by the monotony criterion. Luckily, this is exactly the case: all odd partial sums are bounded from below by any even partial sum, and all even partial sums are bounded from above by any odd partial sum. For all $k \in \mathbb{N}$ there is $S_{2k+1} - S_{2k} = \frac{1}{2k+1} > 0$,
so $S_{2k+1} > S_{2k}$. Combining this with the monotony shown above, we obtain the bounds $S_{2k+1} > S_{2k} \ge S_2$ and $S_{2k} < S_{2k+1} \le S_1$. Hence, $(S_{2k+1})_{k\in\mathbb{N}}$ is bounded from below by $S_2 = \tfrac12$ and $(S_{2k})_{k\in\mathbb{N}}$ is bounded from above by $S_1 = 1$.
The monotony criterion now implies that both subsequences of partial sums, $(S_{2k+1})_{k\in\mathbb{N}}$ and $(S_{2k})_{k\in\mathbb{N}}$, are convergent.
In order to get convergence of the series, we need to show that the full sequence of partial sums $(S_n)_{n\in\mathbb{N}}$ converges. This is certainly the case if both the odd and the even subsequence, $(S_{2k+1})_{k\in\mathbb{N}}$ and $(S_{2k})_{k\in\mathbb{N}}$, converge to the same limit.
How can this be shown? First, let us assign names to the limits: $S_{\text{odd}} := \lim_{k\to\infty} S_{2k+1}$ and $S_{\text{even}} := \lim_{k\to\infty} S_{2k}$. The statement we want to show can then be expressed mathematically as $S_{\text{odd}} = S_{\text{even}}$. We show this by subtracting both limits from each other, which is equivalent to taking the limit of the sequence of differences: $S_{\text{odd}} - S_{\text{even}} = \lim_{k\to\infty} S_{2k+1} - \lim_{k\to\infty} S_{2k} = \lim_{k\to\infty} (S_{2k+1} - S_{2k})$.
Above, we showed $S_{2k+1} - S_{2k} = \frac{1}{2k+1}$, which is a null sequence: $\lim_{k\to\infty} (S_{2k+1} - S_{2k}) = \lim_{k\to\infty} \frac{1}{2k+1} = 0$.
Hence, $S_{\text{odd}} - S_{\text{even}} = 0$, which means $S_{\text{odd}} = S_{\text{even}}$.
From this, we can conclude that the sequence of partial sums $(S_n)_{n\in\mathbb{N}}$ converges to $S := S_{\text{odd}} = S_{\text{even}}$. Mathematically, we need $S_n$ to stay closer than any given $\varepsilon > 0$ to the limit value $S$ after surpassing some index $N$ chosen for the corresponding $\varepsilon$: $|S_n - S| < \varepsilon$ for all $n \ge N$.
For a fixed $\varepsilon > 0$, both the odd and the even partial sum sequence have such a suitable index, which we call $N_{\text{odd}}$ and $N_{\text{even}}$: $|S_{2k+1} - S| < \varepsilon$ for all $k \ge N_{\text{odd}}$, and $|S_{2k} - S| < \varepsilon$ for all $k \ge N_{\text{even}}$.
After the index surpasses the greater of these two thresholds, $N := \max\{2N_{\text{odd}}+1,\, 2N_{\text{even}}\}$, both subsequences stay closer than $\varepsilon$ to the limit value, and $|S_n - S| < \varepsilon$ holds for all $n \ge N$. So the sequence of partial sums converges to $S$, and the alternating harmonic series converges.
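This behaviour can also be checked numerically. The following minimal Python sketch (an illustration, not part of the argument) computes the first partial sums of the alternating harmonic series and compares them with the limit $\ln 2$ mentioned later in this article:

```python
import math

# Partial sums S_n of the alternating harmonic series 1 - 1/2 + 1/3 - 1/4 + ...
def partial_sums(n_max):
    sums, s = [], 0.0
    for k in range(1, n_max + 1):
        s += (-1) ** (k + 1) / k
        sums.append(s)
    return sums

S = partial_sums(10)
print(S)                  # the sums jump up and down with shrinking step size
print(S[0], S[2], S[4])   # S_1, S_3, S_5: the odd partial sums decrease
print(S[1], S[3], S[5])   # S_2, S_4, S_6: the even partial sums increase
print(math.log(2))        # both subsequences squeeze in on ln(2) ~ 0.6931
```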
Generalizing the proof idea / alternating series test
Now we consider an arbitrary alternating series $\sum_{k=1}^\infty (-1)^{k+1} a_k$. Can we use the same proof as for the alternating harmonic series to show that this general alternating series converges? The answer will depend on the properties of the general alternating series. We used the following properties of the alternating harmonic series:
The sequence $(a_k)_{k\in\mathbb{N}}$ of coefficients without the alternating sign is monotonously decreasing. This gave us monotony and boundedness of the two partial sum subsequences $(S_{2k+1})_{k\in\mathbb{N}}$ and $(S_{2k})_{k\in\mathbb{N}}$, so we could show that they converge. Without the monotony, this may not be the case.
Further, we used that $(a_k)_{k\in\mathbb{N}}$ is a null sequence. This was needed to show that both $(S_{2k+1})_{k\in\mathbb{N}}$ and $(S_{2k})_{k\in\mathbb{N}}$ have the same limit, so $(S_n)_{n\in\mathbb{N}}$ converges to that limit. If $(a_k)_{k\in\mathbb{N}}$ instead converged to a constant $c > 0$, then the partial sums would in the end tend to "jump" up and down by an amount of about $c$, and the limits of $(S_{2k+1})_{k\in\mathbb{N}}$ and $(S_{2k})_{k\in\mathbb{N}}$ may differ by $c$, so they are not equal.
No further properties of the alternating harmonic series have been used for the proof. So we may use the above proof steps to show convergence of a general alternating series:
Theorem (Alternating series test)
Let $(a_k)_{k\in\mathbb{N}}$ be a non-negative, monotonously decreasing null sequence of real numbers, i.e. $a_k \ge a_{k+1} \ge 0$ for all $k \in \mathbb{N}$ and $\lim_{k\to\infty} a_k = 0$. Then, the alternating series $\sum_{k=1}^\infty (-1)^{k+1} a_k$ converges.
The proof uses the same steps as the convergence proof for the alternating harmonic series above.
Proof (Alternating series test)
We need to show that the sequence of partial sums $(S_n)_{n\in\mathbb{N}}$ with $S_n = \sum_{k=1}^n (-1)^{k+1} a_k$ converges.
Step 1: The odd subsequence $(S_{2k+1})_{k\in\mathbb{N}}$ is monotonously decreasing and the even subsequence $(S_{2k})_{k\in\mathbb{N}}$ is monotonously increasing, as for any $k \in \mathbb{N}$ there is $S_{2k+3} - S_{2k+1} = a_{2k+3} - a_{2k+2} \le 0$,
and analogously $S_{2k+4} - S_{2k+2} = a_{2k+3} - a_{2k+4} \ge 0$.
Step 2: $(S_{2k+1})_{k\in\mathbb{N}}$ is bounded from below and $(S_{2k})_{k\in\mathbb{N}}$ is bounded from above, since for all $k \in \mathbb{N}$ there is $S_{2k+1} - S_{2k} = a_{2k+1} \ge 0$.
So $S_{2k+1} \ge S_{2k} \ge S_2$ and $S_{2k} \le S_{2k+1} \le S_1$.
The monotony criterion yields convergence of both partial sum subsequences $(S_{2k+1})_{k\in\mathbb{N}}$ and $(S_{2k})_{k\in\mathbb{N}}$.
Step 3: $(S_{2k+1})_{k\in\mathbb{N}}$ and $(S_{2k})_{k\in\mathbb{N}}$ converge to the same limit. Let $S_{\text{odd}} := \lim_{k\to\infty} S_{2k+1}$ and $S_{\text{even}} := \lim_{k\to\infty} S_{2k}$. In step 2, we proved convergence of both sequences, so we can use the sum rule for limits of sequences: $S_{\text{odd}} - S_{\text{even}} = \lim_{k\to\infty} S_{2k+1} - \lim_{k\to\infty} S_{2k} = \lim_{k\to\infty} (S_{2k+1} - S_{2k}) = \lim_{k\to\infty} a_{2k+1}$.
On the other hand, $\lim_{k\to\infty} a_{2k+1} = 0$,
since both $(a_k)_{k\in\mathbb{N}}$ and its subsequence $(a_{2k+1})_{k\in\mathbb{N}}$ are null sequences. So both limits are equal ($S_{\text{odd}} = S_{\text{even}}$).
Step 4: $(S_n)_{n\in\mathbb{N}}$ also converges to $S := S_{\text{odd}} = S_{\text{even}}$. Since $(S_{2k+1})_{k\in\mathbb{N}}$ and $(S_{2k})_{k\in\mathbb{N}}$ converge to $S$, both approach $S$ up to any $\varepsilon > 0$: there are $N_{\text{odd}}, N_{\text{even}} \in \mathbb{N}$ with $|S_{2k+1} - S| < \varepsilon$ for all $k \ge N_{\text{odd}}$ and $|S_{2k} - S| < \varepsilon$ for all $k \ge N_{\text{even}}$.
We now take $N := \max\{2N_{\text{odd}}+1,\, 2N_{\text{even}}\}$ and obtain that $(S_n)_{n\in\mathbb{N}}$ also approaches $S$ up to $\varepsilon$: $|S_n - S| < \varepsilon$ for all $n \ge N$.
So the series $\sum_{k=1}^\infty (-1)^{k+1} a_k$ converges to $S$, and we are done with the proof.
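The four proof steps can also be watched numerically for a concrete sequence satisfying the assumptions. The following Python sketch is only an illustration; the choice $a_k = \frac{1}{\sqrt{k}}$ is a placeholder and not taken from this article:

```python
def partial_sums(a, n_max):
    """Partial sums S_n of sum_{k>=1} (-1)^(k+1) * a(k)."""
    sums, s = [], 0.0
    for k in range(1, n_max + 1):
        s += (-1) ** (k + 1) * a(k)
        sums.append(s)
    return sums

a = lambda k: 1 / k ** 0.5      # placeholder: non-negative, decreasing, null sequence
S = partial_sums(a, 1000)

odd  = S[0::2]   # S_1, S_3, S_5, ...  (monotonously decreasing, step 1)
even = S[1::2]   # S_2, S_4, S_6, ...  (monotonously increasing, step 1)

print(all(x >= y for x, y in zip(odd, odd[1:])))    # True
print(all(x <= y for x, y in zip(even, even[1:])))  # True
print(even[-1], odd[-1])   # the even sums stay below the odd sums (step 2)
```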
Of course, we can also change the signs of the series from positive-negative-positive to negative-positive-negative and get a valid convergence criterion for series like $\sum_{k=1}^\infty (-1)^{k} a_k$. The proof is the same, with the roles of the odd and even partial sums interchanged.
One can also start from $k = 0$, i.e. consider series like $\sum_{k=0}^\infty (-1)^{k} a_k$ or $\sum_{k=0}^\infty (-1)^{k+1} a_k$. Any starting index is OK. The proof is just the same, including an index shift.
As noted above, the alternating series test only yields convergence, not absolute convergence. For instance, the alternating harmonic series $\sum_{k=1}^\infty \frac{(-1)^{k+1}}{k}$ converges by the alternating series test. However, it does not converge absolutely, since the harmonic series $\sum_{k=1}^\infty \frac{1}{k}$ diverges.
The alternating series test can never be used to prove divergence of a series. If a series fails to meet the criteria of the alternating series test, it can still converge. There is an example warning about this below (example 3).
The proof for the alternating series test implies that $(I_k)_{k\in\mathbb{N}}$ with $I_k := [S_{2k}, S_{2k+1}]$ and $S \in I_k$ is a sequence of nested intervals (see the numerical sketch after these remarks).
One can also prove the more general Dirichlet test and then conclude the alternating series test as a special case. Further information will be given at the end of this article.
We could also take $(a_k)_{k\in\mathbb{N}}$ to be a non-positive and monotonously increasing null sequence, i.e. one that approaches $0$ from below. The proof works the same way. In particular, that means $\sum_{k=1}^\infty (-1)^{k+1} a_k$ converges whenever $(a_k)_{k\in\mathbb{N}}$ is any monotone null sequence.
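The nested-intervals remark can be made concrete with a small Python sketch (again only an illustration): for the alternating harmonic series, the intervals $[S_{2k}, S_{2k+1}]$ are nested, contain the limit, and shrink.

```python
import math

# Intervals [S_{2k}, S_{2k+1}] for the alternating harmonic series: each contains
# the limit ln(2), each is contained in the previous one, and their lengths tend to 0.
def S(n):
    return sum((-1) ** (j + 1) / j for j in range(1, n + 1))

for k in range(1, 6):
    lower, upper = S(2 * k), S(2 * k + 1)
    print(lower, upper, upper - lower)   # the length equals 1/(2k+1)

print(math.log(2))   # lies in every one of these intervals
```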
How to get to the proof? (Alternating series test)
In order to apply the alternating series test, we need to show that the sequence $(a_k)_{k\in\mathbb{N}}$ of coefficients is a monotonously decreasing null sequence. That means we need two proof steps:
We prove: $(a_k)_{k\in\mathbb{N}}$ is monotonously decreasing, i.e. $a_{k+1} \le a_k$. This can be shown by proving that the ratio of two consecutive elements satisfies $\frac{a_{k+1}}{a_k} \le 1$, or that their difference satisfies $a_{k+1} - a_k \le 0$. Differences of square roots are hard to handle, so we investigate the ratio $\frac{a_{k+1}}{a_k}$.
We prove: $(a_k)_{k\in\mathbb{N}}$ is a null sequence, i.e. $\lim_{k\to\infty} a_k = 0$. We can prove this using limit theorems for sequences.
Proof (Alternating series test)
The sequence of coefficients is nonnegative ($a_k \ge 0$ for all $k \in \mathbb{N}$), and:
Step 1:
Now, the square root defines a strictly monotonously increasing function. Hence, the ratio is $\le 1$ whenever the expression under the root is $\le 1$. We investigate the behaviour of the expression under the root:
Hence, $\frac{a_{k+1}}{a_k} \le 1$, so $(a_k)_{k\in\mathbb{N}}$ is monotonously decreasing.
Step 2:
So $(a_k)_{k\in\mathbb{N}}$ is a null sequence.
Additional question: Does this series converge absolutely?
No, the series of absolute values diverges. For large $k$, the elements decay even more slowly than the elements $\tfrac{1}{k}$ of the harmonic series. More precisely,
It is important to check that both conditions of the alternating series test are fulfilled! There are alternating series which do not meet one of them. The following examples illustrate alternating series where $(a_k)_{k\in\mathbb{N}}$ either does not converge to 0 (in our example it converges to 1) or is not monotonous. Both examples fail to converge (although they are alternating). The third example is an alternating series which fails the alternating series test (as its coefficient sequence is not monotonous), but nevertheless converges. So the alternating series test does not identify all convergent alternating series.
Example (Example 1: Convergence to 0 is needed.)
We consider the series with . Does it fulfil the requirements for the alternating series test? Let us check:
The sequence $(a_k)_{k\in\mathbb{N}}$ is monotonously decreasing.
Exercise (Monotony)
Prove it.
Proof (Monotony)
We investigate the ratio of consecutive elements and find $\frac{a_{k+1}}{a_k} \le 1$, so $a_{k+1} \le a_k$. Hence, $(a_k)_{k\in\mathbb{N}}$ is monotonously decreasing.
However, $(a_k)_{k\in\mathbb{N}}$ converges to 1, so it is not a null sequence. We can easily see this by explicitly computing the limit $\lim_{k\to\infty} a_k = 1$.
So the alternating series test cannot be applied. And indeed, the series diverges, since for large $k$ it essentially behaves like $\sum_k (-1)^{k+1} \cdot 1$. More precisely, the divergence can be proven by the term test: the sequence of series elements $\left((-1)^{k+1} a_k\right)_{k\in\mathbb{N}}$ cannot be a null sequence, since the subsequence of elements with odd index converges to 1. So the series diverges by the term test.
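Since the concrete coefficients of example 1 are not reproduced here, the following Python sketch uses the hypothetical stand-in $a_k = \frac{k+1}{k}$, which is also monotonously decreasing with limit 1, to illustrate why such a series cannot converge:

```python
# Hypothetical stand-in for example 1 (the article's exact coefficients are not
# reproduced here): a_k = (k + 1) / k is monotonously decreasing with limit 1.
a = lambda k: (k + 1) / k

s, sums = 0.0, []
for k in range(1, 21):
    s += (-1) ** (k + 1) * a(k)
    sums.append(s)

print(sums[-4:])   # the partial sums keep jumping by roughly 1 and never settle down
print(a(10**6))    # the terms approach 1, not 0, so the term test shows divergence
```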
Example (Example 2: Monotony is needed.)
Next, we consider the series with . Will it converge? Let us try the alternating series test:
The sequence of coefficients is not monotonously decreasing. This can easily be seen by considering the first elements: . Clearly, . More generally, for all there is (as can be shown by induction), and hence . So the sequence of elements is not monotonously decreasing, since otherwise we would have . Only the subsequences of elements with even index and with odd index are each monotonously decreasing on their own.
However, $(a_k)_{k\in\mathbb{N}}$ is a null sequence.
Exercise (Null sequence)
Prove it.
Proof (Null sequence)
The subsequences of elements with only odd and with only even index are both null sequences: $\lim_{k\to\infty} a_{2k+1} = 0$ and $\lim_{k\to\infty} a_{2k} = 0$. So for any $\varepsilon > 0$ there is an $N_{\text{odd}} \in \mathbb{N}$ with $|a_{2k+1}| < \varepsilon$ for all $k \ge N_{\text{odd}}$,
and an $N_{\text{even}} \in \mathbb{N}$ with $|a_{2k}| < \varepsilon$ for all $k \ge N_{\text{even}}$.
(More precisely, suitable such numbers $N_{\text{odd}}$ and $N_{\text{even}}$ can be written down explicitly.)
Now, we set $N := \max\{N_{\text{odd}}, N_{\text{even}}\}$. So if the index surpasses $N$, it will also have surpassed $N_{\text{odd}}$ and $N_{\text{even}}$, so both subsequences stay closer to 0 than $\varepsilon$, which means that the entire sequence eventually stays closer to 0 than $\varepsilon$: $|a_k| < \varepsilon$ for all $k \ge 2N+1$.
Therefore, $(a_k)_{k\in\mathbb{N}}$ converges to $0$.
Hence, the alternating series test does not apply. In fact, we can show that the sequence of partial sums is unbounded, so the series diverges. We use the estimate
This implies
Since the harmonic series diverges, this estimate shows that the partial sums are unbounded, and the entire series diverges. Loosely speaking, the reason for the divergence is that the series formed by the positive elements and the series formed by the negative elements both diverge, but at different speeds. And their speed difference is so huge that the series of differences diverges, too.
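The concrete coefficients of example 2 are likewise not reproduced here, so the following Python sketch uses a hypothetical stand-in: a null sequence that is small at odd and comparatively large at even indices, so that the negative terms dominate and the partial sums drift off towards $-\infty$:

```python
# Hypothetical stand-in for example 2 (the article's exact coefficients are not
# reproduced here): a null sequence that is not monotonously decreasing.
def a(k):
    return 1 / k if k % 2 == 0 else 1 / k ** 2   # small on odd, large on even indices

s = 0.0
for k in range(1, 100001):
    s += (-1) ** (k + 1) * a(k)

print(s)   # roughly -4.5 after 100000 terms and still drifting towards -infinity:
           # the negative part behaves like half the harmonic series and dominates
```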
Example (Example 3: A converging alternating series, which fails the alternating series test.)
Let us consider the series with . Now,
is not monotonously decreasing, as for all there is .
So the alternating series test does not apply. However, the series converges by direct comparison:
Exercise
Prove that converges.
Solution
There is
for all
So the series converges by direct comparison, even though its coefficients do not form a monotonous sequence.
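For example 3 the original coefficients are also not shown here, so the sketch below takes the hypothetical choice $a_k = \frac{3 + (-1)^k}{k^2}$: a null sequence that keeps jumping between smaller and larger values (so it is not monotonously decreasing), while $\left|(-1)^{k+1} a_k\right| \le \frac{4}{k^2}$ still gives convergence by direct comparison.

```python
# Hypothetical stand-in for example 3: a_k = (3 + (-1)^k) / k^2 is a null sequence
# but jumps between "small" (odd k) and "large" (even k) values.
a = lambda k: (3 + (-1) ** k) / k ** 2

print(a(3), a(4))   # a_4 > a_3, so the sequence is not monotonously decreasing

# Nevertheless |(-1)^(k+1) a_k| <= 4 / k^2, and sum 4/k^2 converges,
# so the alternating series converges by direct comparison.
s = sum((-1) ** (k + 1) * a(k) for k in range(1, 100001))
print(s)            # the partial sums settle down (to about 0.82 for this choice)
```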
The alternating series test can show that a series converges, but it does not give us the limit. For instance, for the alternating harmonic series there is $\sum_{k=1}^\infty \frac{(-1)^{k+1}}{k} = \ln 2$. But this limit cannot be computed by the alternating series test. However, we can approximate the limit by partial sums, and the alternating series test provides us with a neat upper bound for the error of such an approximation.
We have seen above in this article that the sequence $(S_{2k+1})_{k\in\mathbb{N}}$ of partial sums with odd index is monotonously decreasing and converges to the limit $S$. Further, $S = \inf_{k\in\mathbb{N}} S_{2k+1}$, where the infimum of a set is the greatest possible lower bound of its elements. Hence, $S \le S_{2k+1}$ for all $k \in \mathbb{N}$, so the odd partial sums are upper bounds for the limit which get better and better. Conversely, $(S_{2k})_{k\in\mathbb{N}}$ is monotonously increasing with $S = \sup_{k\in\mathbb{N}} S_{2k}$. So $S_{2k}$ gives a lower bound for $S$, for all $k \in \mathbb{N}$. That means we have the estimates $S_{2k} \le S$ and $S \le S_{2k+1}$.
How good is the estimate? We subtract the two inequalities and get $0 \le S_{2k+1} - S \le S_{2k+1} - S_{2k} = a_{2k+1}$ and $0 \le S - S_{2k} \le S_{2k+1} - S_{2k} = a_{2k+1}$.
So, the series elements serve as a precision indicator for the estimate of the limit by partial sums:
Theorem (Error estimate for approximating alternating series)
If an alternating series $\sum_{k=1}^\infty (-1)^{k+1} a_k$ converges by the alternating series test to a limit $S$, then the limit can be approximated by the partial sums $S_n$ with maximum error $|S - S_n| \le a_{n+1}$.
Example (Error estimate for approximating alternating series)
Let us try to get some numerical values for the limit of the alternating harmonic series $\sum_{k=1}^\infty \frac{(-1)^{k+1}}{k}$:
We take $n = 8$ and get the estimate precision $a_9 = \frac{1}{9}$. Since $S_8 = \frac{533}{840}$ and $S_9 = S_8 + \frac{1}{9} = \frac{1879}{2520}$, we know that the limit must be in the interval $[S_8, S_9]$. We round these fractions to decimal numbers and obtain $0.634 \le \sum_{k=1}^\infty \frac{(-1)^{k+1}}{k} \le 0.746$. In fact, the limit is $\ln 2 \approx 0.693$. However, this is only a coarse approximation which already needed summing up 8 terms. There are better ways to compute the logarithm, e.g. Newton's method. The good news is that the above estimate works for any series which converges by the alternating series test.
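A quick numerical check of this error bound for the alternating harmonic series (a Python sketch for illustration) compares the actual error $|\ln 2 - S_n|$ with the bound $a_{n+1} = \frac{1}{n+1}$ for a few values of $n$:

```python
import math

# Partial sums of the alternating harmonic series and the error bound a_{n+1} = 1/(n+1)
def S(n):
    return sum((-1) ** (k + 1) / k for k in range(1, n + 1))

for n in (8, 100, 10000):
    error = abs(math.log(2) - S(n))
    bound = 1 / (n + 1)
    print(n, error, bound, error <= bound)   # the bound holds in every case
```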
Generalizing the alternating series test to the Dirichlet test
The Dirichlet test serves for proving convergence of series of the form $\sum_{k=1}^\infty a_k b_k$. It extends the alternating series test to cases where the factor $b_k$ is not just the alternating sign $(-1)^{k+1}$. This is particularly useful if the sign does not change from element to element (like $+,-,+,-,\ldots$) but can have streaks without a change in between (like $+,+,-,-,+,+,\ldots$). The proof is based on Abel's partial summation, which is quite some work, so we will not state it here.
Theorem (Dirichlet's test)
Let $(a_k)_{k\in\mathbb{N}}$ and $(b_k)_{k\in\mathbb{N}}$ be real sequences with
the partial sum sequence $\left(\sum_{k=1}^n b_k\right)_{n\in\mathbb{N}}$ being bounded,
$(a_k)_{k\in\mathbb{N}}$ being monotonously decreasing, and
$\lim_{k\to\infty} a_k = 0$.
Then, the series $\sum_{k=1}^\infty a_k b_k$ converges.
The conditions for $(a_k)_{k\in\mathbb{N}}$ are exactly the same as in the alternating series test. Actually, with $b_k = (-1)^{k+1}$, we just get the alternating series test as a special case:
Exercise
Prove that $b_k = (-1)^{k+1}$ fulfils the first condition for Dirichlet's test, i.e. that the partial sum sequence $\left(\sum_{k=1}^n (-1)^{k+1}\right)_{n\in\mathbb{N}}$ is bounded.
Solution
There is $\sum_{k=1}^n (-1)^{k+1} = \begin{cases} 1 & \text{if } n \text{ is odd}, \\ 0 & \text{if } n \text{ is even}. \end{cases}$
So the partial sum sequence $\left(\sum_{k=1}^n (-1)^{k+1}\right)_{n\in\mathbb{N}}$ alternates between 1 and 0 and is obviously bounded.
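To see Dirichlet's test at work beyond this special case, here is a hedged Python sketch (not from the article) with the "streaky" sign pattern $+1, +1, -1, -1, \ldots$ mentioned above and the placeholder choice $a_k = \frac{1}{k}$:

```python
# Numerical illustration of Dirichlet's test with a "streaky" sign pattern:
# b_k cycles through +1, +1, -1, -1, and a_k = 1/k is a monotone null sequence.
def b(k):
    return 1 if (k - 1) % 4 in (0, 1) else -1

def a(k):
    return 1 / k

# First condition: the partial sums of b_k stay bounded (they cycle through 1, 2, 1, 0).
B = [sum(b(j) for j in range(1, n + 1)) for n in range(1, 13)]
print(B)

# Second and third condition: a_k is monotonously decreasing with limit 0,
# so Dirichlet's test applies and the partial sums of sum a_k * b_k settle down.
print(sum(a(k) * b(k) for k in range(1, 10001)))
print(sum(a(k) * b(k) for k in range(1, 100001)))   # nearly the same value
```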