The Integral Test

If f is positive, continuous, and decreasing for x\geq 1 and a_{n} = f(n) , then \displaystyle\sum_{n=1}^\infty{a_n} and \displaystyle\int_{1}^{\infty}f(x) dx either both converge or both diverge.
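Before the proof, a standard illustration of how the test is applied (the specific functions here are examples chosen for illustration, not part of the statement): take f(x) = \frac{1}{x^{2}} , which is positive, continuous, and decreasing for x \geq 1 . Since \displaystyle\int_{1}^{\infty}\frac{1}{x^{2}} dx = \lim_{b\to\infty}\left(1 - \frac{1}{b}\right) = 1 converges, the series \displaystyle\sum_{n=1}^\infty \frac{1}{n^2} converges. By contrast, f(x) = \frac{1}{x} satisfies the same hypotheses, but \displaystyle\int_{1}^{\infty}\frac{1}{x} dx = \lim_{b\to\infty} \ln b diverges, so the harmonic series \displaystyle\sum_{n=1}^\infty \frac{1}{n} diverges.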

To prove this theorem, we first partition the interval [1, n] into n-1 unit subintervals. The total areas of the inscribed rectangles and the circumscribed rectangles are as follows:

\displaystyle\sum_{i=2}^{n}f(i) = f(2) + f(3) + ... + f(n) (inscribed area)

\displaystyle\sum_{i=1}^{n-1}f(i) = f(1) + f(2) + ... + f(n-1) (circumscribed area)
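To make these two sums concrete (an illustrative case, not part of the proof), take f(x) = \frac{1}{x} and n = 3 . The inscribed area is f(2) + f(3) = \frac{1}{2} + \frac{1}{3} \approx 0.83 and the circumscribed area is f(1) + f(2) = 1 + \frac{1}{2} = 1.5 , while the area under the graph, \displaystyle\int_{1}^{3}\frac{1}{x} dx = \ln 3 \approx 1.10 , falls between them.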

The exact area under the graph of f from x=1 to x=n lies between the inscribed and circumscribed areas, which implies \displaystyle\sum_{i=2}^{n}f(i) \leq \displaystyle\int_{1}^{n}f(x) dx \leq \displaystyle\sum_{i=1}^{n-1} f(i) . Using the nth partial sum, S_n = f(1) + f(2) + ... + f(n) , we can write this inequality as S_n - f(1) \leq \displaystyle\int_1^n f(x) dx \leq S_{n-1} .

Now assume that \displaystyle\int_1^\infty f(x) dx converges to L. It follows that for n \geq 1 , S_{n} - f(1) \leq L , so S_{n} \leq L + f(1) . Consequently, {S_n} is bounded; it is also monotonic (increasing), because f is positive. By the Bounded Monotonic Sequence Theorem, {S_n} converges, and so \sum a_n converges.

Conversely, if \displaystyle\int_1^\infty f(x) dx diverges, then \displaystyle\int_1^n f(x) dx increases without bound as n \to \infty . Since S_{n-1} \geq \displaystyle\int_1^n f(x) dx , the partial sums also increase without bound, and \sum a_n diverges.
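As a concrete instance of the bound S_n \leq L + f(1) obtained above (again an illustration, not part of the proof), take f(x) = \frac{1}{x^{2}} . Here \displaystyle\int_1^\infty \frac{1}{x^2} dx = 1 , so L = 1 and f(1) = 1 , and the inequality says every partial sum of \displaystyle\sum_{n=1}^\infty \frac{1}{n^2} is at most 2. The series therefore converges to a value no larger than 2 (its actual sum is \frac{\pi^2}{6} \approx 1.645 ).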