# Roots of integers

An integer is either a perfect square or its square root is irrational. Put another way: when you compute the square root of an integer, either there are no figures to the right of the decimal point, or there are infinitely many figures to the right of the decimal point and they never repeat. There is no middle ground — you can't hope, for example, that the decimal expansion might stop or start repeating after a hundred or so terms.
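This dichotomy can be illustrated with a small sketch. The function name `sqrt_kind` is my own, and it relies on the theorem: since there is no middle ground, checking whether `n` is a perfect square (via `math.isqrt`) fully classifies its square root.

```python
import math

def sqrt_kind(n: int) -> str:
    """Classify the square root of a nonnegative integer n.

    By the theorem, the root is an integer exactly when n is a
    perfect square; otherwise it is irrational.
    """
    r = math.isqrt(n)  # floor of the exact square root
    return "integer" if r * r == n else "irrational"

print(sqrt_kind(49))  # integer (7 * 7 == 49)
print(sqrt_kind(50))  # irrational
```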

The proof of this theorem is surprisingly simple, not much harder than the familiar proof that the square root of 2 is irrational.

Suppose $\tfrac{a}{b}$ is a fraction in lowest terms, i.e. $a$ and $b$ are co-prime (their gcd is 1), and $\tfrac{a}{b}$ is a solution to $x^n = c$ where $n > 0$ is an integer and $c$ is an integer. Then:

$\left(\dfrac{a}{b}\right)^n = \dfrac{a^n}{b^n} = c$

and so, multiplying both sides by $b^{n-1}$:

$\dfrac{a^n}{b} = c b^{n-1}$

Now the right side of the equation above is an integer, so the left side must be an integer as well. But $b$ is relatively prime to $a$, and so $b$ is relatively prime to $a^n$. The only way $\tfrac{a^n}{b}$ could be an integer is for $b$ to equal 1 or -1. And so $\tfrac{a}{b}$ must be an integer.
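The key step — co-primality of $a$ and $b$ implies co-primality of $a^n$ and $b$, so $b \geq 2$ can never divide $a^n$ — can be spot-checked numerically. This is only an empirical sanity check over a small range, not part of the proof:

```python
from math import gcd

# Spot-check: if gcd(a, b) == 1 and b >= 2, then gcd(a**n, b) == 1,
# so b never divides a**n and a**n / b is never an integer.
for a in range(1, 30):
    for b in range(2, 30):
        if gcd(a, b) == 1:
            for n in range(1, 6):
                assert gcd(a**n, b) == 1
                assert a**n % b != 0
print("no counterexamples found")
```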

Another way to get the same result is to assume $\tfrac{a}{b}$ is an irreducible fraction and is not an integer (i.e. $b \neq 1$), and consider $(\tfrac{a}{b})^n$. Clearly $a^n$ and $b^n$ are co-prime and the denominator $b^n \neq 1$, so $\tfrac{a^n}{b^n}$ is not an integer.

So what we said about square roots extends to cube roots and in fact to all integer roots (for example, the fifth root of an integer is either an integer or an irrational number). In other words: no (non-integer) fraction, when raised to a power, can produce an integer.
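A consequence of the general result is that testing whether the $n$-th root of an integer is rational reduces to testing whether it is a whole number. Here is a minimal sketch; `integer_root` is a hypothetical helper name, and the off-by-one check around the floating-point estimate guards against rounding error for the sizes shown:

```python
def integer_root(c: int, n: int):
    """Return the integer n-th root of c if c is a perfect n-th power,
    otherwise None (in which case the real root is irrational)."""
    if c < 0 or n < 1:
        raise ValueError("expects c >= 0 and n >= 1")
    r = round(c ** (1 / n))  # floating-point estimate of the root
    # The estimate can be off by one, so check its neighbors exactly.
    for cand in (r - 1, r, r + 1):
        if cand >= 0 and cand ** n == c:
            return cand
    return None

print(integer_root(243, 5))  # 3, since 3**5 == 243
print(integer_root(244, 5))  # None: the fifth root of 244 is irrational
```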

(reblogged from John D. Cook’s blog)
