Assuming that \gcd(a,b)=1, prove that \gcd(a+b,a^2+b^2) = 1 or 2.
I tried this problem and ended up with
d\mid 2a^2,\quad d\mid 2b^2
where d = \gcd(a+b,a^2+b^2), but then I am stuck: from these two conclusions, how can I conclude that d=1 or 2?
Also, is there any other way of proving this result?
Answer
From what you have found, you can conclude easily.
If d divides two numbers, it also divides their greatest common divisor, so
d \mid \gcd(2a^2, 2b^2) = 2\gcd(a,b)^2 = 2,
since \gcd(a,b)=1.
So, d is a divisor of 2 and thus either 1 or 2.
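For completeness, the whole argument can be written out as follows, including a reconstruction of the step (not shown in the question) that yields d \mid 2a^2 and d \mid 2b^2; this is one possible route, under the stated assumption \gcd(a,b)=1:

```latex
Let $d = \gcd(a+b,\, a^2+b^2)$. Since $d \mid (a+b)$, we also have
\[
  d \mid (a+b)(a-b) = a^2 - b^2 .
\]
Adding and subtracting $a^2 + b^2$ (which $d$ divides) gives
\[
  d \mid (a^2+b^2) + (a^2-b^2) = 2a^2,
  \qquad
  d \mid (a^2+b^2) - (a^2-b^2) = 2b^2 .
\]
Hence $d \mid \gcd(2a^2,\, 2b^2) = 2\gcd(a,b)^2 = 2$, and so $d \in \{1, 2\}$.
```

Both values actually occur: a = 1, b = 2 gives \gcd(3, 5) = 1, while a = 1, b = 3 gives \gcd(4, 10) = 2.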