I've recently gotten into number theory, using Theory of Numbers by Andrew Adler as a starting point, and came across a theorem that states:
Suppose $a$ and $b$ are not both $0$, and let $d = (a, b)$. Then $d$ is the smallest positive integer that can be expressed as a linear combination of $a$ and $b$.
Is the converse true? That is, if I find the smallest positive integer that is a linear combination of two integers $a$ and $b$, must it be the greatest common divisor of $a$ and $b$? For example, if I found that $1 = xa + yb$ for some integers $x$ and $y$, must $1$ be the greatest common divisor, since it is the smallest positive integer possible?
Answer
It's an equivalence. That is, we have an if and only if.
For the "converse", recall that we have that by Bezout's identity, $(a,b)$ can be written as a linear combination of $a$ and $b$ (this involves the Euclidean algorithm). So if $d$ is the smallest number that has this property, we have $d\le (a,b)$. But of course $d\ge (a,b)$, since any number dividing $a$ and $b$ divides $d$ (by the fact that $d$ is a linear combination of $a$ and $b$). So $d=(a,b)$.