Usually I don't care too much about convergence, since a general overview of the argument is what I'm aiming at here; otherwise I'll just trust that things "behave well". But a few words concerning convergence won't hurt.

It's a well-known fact that the harmonic series (which we briefly touched on in the previous post) diverges. I think the best (though not the easiest) proof of this is to compare it with the corresponding integral:

$$\sum_{n=1}^{N} \frac{1}{n} \geq \int_1^{N+1} \frac{dx}{x} = \log(N+1) \to \infty.$$

(Let's pause for a moment to celebrate the first of the numerous appearances of our good friend the logarithm.) Though this does require some basic calculus, it's nice because *a)* it not only tells us that the harmonic series grows to infinity, but also *how fast* it does so (incredibly slowly, for the record), and *b)* the same argument works to prove that the series *converges* if you sum over $1/n^s$ with $s > 1$ instead. In other words, the harmonic series is *just about* divergent (which explains the slow rate of divergence).
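To see that slow logarithmic growth concretely, here's a quick numerical sketch (the helper name `harmonic` is mine): the difference between the partial sums and $\log N$ settles down to a constant (the Euler–Mascheroni constant, roughly $0.5772$).

```python
import math

def harmonic(N):
    """Partial sum 1/1 + 1/2 + ... + 1/N of the harmonic series."""
    return sum(1.0 / n for n in range(1, N + 1))

# The partial sums grow like log(N): the difference tends to a
# constant (the Euler-Mascheroni constant, about 0.5772).
for N in (10, 1000, 100000):
    print(N, harmonic(N) - math.log(N))
```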

So we have the series $\zeta(s) = \sum_{n=1}^{\infty} \frac{1}{n^s}$, which we know to be convergent for $s > 1$, and which we can thus regard as a function of $s$. The exact same argument works for complex $s$ with real part greater than $1$, since for $s = \sigma + it$ we have

$$\left| \frac{1}{n^s} \right| = \frac{1}{n^{\sigma}}.$$
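If you'd like to convince yourself of that modulus computation numerically, a few lines suffice (the particular values of $n$ and $s$ below are arbitrary choices of mine):

```python
# The modulus of n**(-s) depends only on the real part of s:
# |n^(-s)| = n^(-sigma) for s = sigma + i*t, since |n^(-i*t)| = 1.
n = 7
s = complex(1.5, 42.0)  # sigma = 1.5, t = 42
assert abs(abs(n ** (-s)) - n ** (-s.real)) < 1e-12
print(abs(n ** (-s)), n ** (-1.5))
```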
Don't worry if you didn't get that, it's just an aside. We've already seen two arguments for why the series defining $\zeta(s)$ should equal the Euler product $\prod_p (1 - p^{-s})^{-1}$, the product running over all primes $p$. Since the series is known to be convergent, the product had better be convergent as well, right? Well, to be perfectly honest, I would have had to establish my equalities for some finite case first, then take limits, prove convergence, and so on. I don't care much about technicalities, but let's still convince ourselves that the Euler product does converge.

So, how do you prove the convergence of a product? The best strategy in mathematics is always to reduce the problem in front of you to a different problem that you already understand well. In this case, the well-understood problem is the convergence of series, so a tool that can turn a product into a sum would be incredibly helpful. This tool is, of course, the logarithm. Let's say we want to know whether a certain product $\prod_n a_n$ converges, meaning the expression evaluates to a finite (nonzero) number. In that case we can take logarithms, and -- behold! -- we have a series:

$$\log \prod_n a_n = \sum_n \log a_n.$$

Note also that, much like a series $\sum_n a_n$ can only converge if $a_n$ tends to $0$, a product can only converge (to something nonzero) if its factors tend to $1$. This is why we usually write a product in the form $\prod_n (1 + a_n)$.
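Here's the product-to-sum trick in action on a finite product with factors tending to $1$, as just discussed (the variable names are mine, nothing standard):

```python
import math

# log turns a finite product into a sum: log(prod t) = sum(log t).
terms = [1 + 1.0 / n ** 2 for n in range(1, 50)]  # factors tending to 1

product = 1.0
for t in terms:
    product *= t

log_sum = sum(math.log(t) for t in terms)
assert abs(math.log(product) - log_sum) < 1e-12
print(product, math.exp(log_sum))
```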

Knowing this, in order to prove that $\prod_n (1 + a_n)$ converges, we need to prove that $\sum_n \log(1 + a_n)$ converges. I tried to spell this out myself, but failed miserably as I worked my way through the technicalities. Instead, we just notice that if $\sum_n |a_n|$ converges, then so does $\sum_n |\log(1 + a_n)|$, and hence, by the above remark, so does the product. In our case, $a_k = -p_k^{-s}$, so we need to examine $\sum_k |p_k^{-s}|$ (where $p_k$ is a common notation for the $k$-th prime).

Piece of cake after all that we know. We just note that the series $\sum_n |n^{-s}|$, which runs over all natural numbers instead of just the primes, is certainly at least as large as our sum, and hence

$$\sum_k |p_k^{-s}| \leq \sum_{n=1}^{\infty} |n^{-s}| = \sum_{n=1}^{\infty} \frac{1}{n^{\sigma}},$$

which is just the series from the harmonic-series argument again, known to be convergent for $\sigma > 1$. Wasn't so bad, huh?
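As a sanity check (not part of the proof), we can compare a truncated Euler product against a known value of the series: at $s = 2$ the series sums to $\pi^2/6$. The sieve helper below is mine:

```python
import math

def primes_up_to(limit):
    """Simple sieve of Eratosthenes; helper name is mine."""
    sieve = [True] * (limit + 1)
    primes = []
    for p in range(2, limit + 1):
        if sieve[p]:
            primes.append(p)
            for multiple in range(p * p, limit + 1, p):
                sieve[multiple] = False
    return primes

# Truncated Euler product over the primes up to 1000, at s = 2.
s = 2
product = 1.0
for p in primes_up_to(1000):
    product *= 1.0 / (1.0 - p ** (-s))

print(product, math.pi ** 2 / 6)  # the two should be close
```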

For those who are still with me, let's take a look at the assertion we just brushed over above. We want to prove that the convergence of $\sum_n |a_n|$ implies the convergence of $\sum_n |\log(1 + a_n)|$. For this, we need the incredibly useful expansion

$$\log(1 + x) = x - \frac{x^2}{2} + \frac{x^3}{3} - \frac{x^4}{4} + \cdots = \sum_{k=1}^{\infty} \frac{(-1)^{k+1}}{k} \, x^k,$$

which is valid for $|x| < 1$. Taking the modulus yields

$$|\log(1 + x)| \leq \sum_{k=1}^{\infty} \frac{|x|^k}{k} \leq \sum_{k=1}^{\infty} |x|^k = \frac{|x|}{1 - |x|},$$

where the geometric series made another guest appearance. Now, for $|x| \leq \frac{1}{2}$ this yields

$$|\log(1 + x)| \leq 2|x|.$$

We assumed that $\sum_n |a_n|$ converges, hence $a_n \to 0$, so certainly $|a_n| \leq \frac{1}{2}$ holds for all $n$ beyond some index $N$, and thus

$$\sum_{n \geq N} |\log(1 + a_n)| \leq 2 \sum_{n \geq N} |a_n| < \infty.$$

So the convergence of $\sum_n |a_n|$ immediately implies the (absolute) convergence of $\sum_n \log(1 + a_n)$.
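A quick numerical spot-check of the key bound $|\log(1+x)| \leq 2|x|$ on $|x| \leq \tfrac{1}{2}$ (just an illustration, of course, not a proof):

```python
import math

# Check |log(1 + x)| <= 2|x| on a grid covering [-1/2, 1/2].
for k in range(-50, 51):
    x = k / 100.0
    assert abs(math.log(1 + x)) <= 2 * abs(x) + 1e-12
print("bound holds on the grid")
```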

OK, enough of the technicalities, I promise we won't bother too much with these convergence examinations in the future.