Friday, 21 March 2014

elementary number theory - Prove \gcd(a+b, a^2+b^2) is 1 or 2 if \gcd(a,b) = 1




Assuming that \gcd(a,b) = 1, prove that \gcd(a+b, a^2+b^2) = 1 or 2.





I tried this problem and ended up with
d \mid 2a^2, \quad d \mid 2b^2,
where d = \gcd(a+b, a^2+b^2), but then I got stuck: from these two facts, how can I conclude that d = 1 or d = 2?
Also, is there another way of proving this result?
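For completeness, here is one way those two divisibilities can be obtained (a sketch of a standard manipulation, not necessarily the route taken in the question): since d \mid (a+b) and d \mid (a^2+b^2),

2a^2 = (a+b)^2 + (a^2+b^2) - 2b(a+b), \qquad 2b^2 = (a+b)^2 + (a^2+b^2) - 2a(a+b),

and both right-hand sides are visibly multiples of d.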


Answer



From what you have found, you can conclude easily.



If d divides two numbers, it also divides their gcd, so




d \mid \gcd(2a^2, 2b^2) = 2\gcd(a,b)^2 = 2.



So, d is a divisor of 2 and thus either 1 or 2.
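As a quick numerical sanity check (an illustrative sketch, not part of the original answer), a few lines of Python confirm the claim on small coprime pairs and show that both values 1 and 2 actually occur:

from math import gcd

# Verify gcd(a+b, a^2+b^2) is 1 or 2 for every coprime pair in a small range.
values = set()
for a in range(1, 50):
    for b in range(1, 50):
        if gcd(a, b) == 1:
            d = gcd(a + b, a * a + b * b)
            assert d in (1, 2), (a, b, d)
            values.add(d)

print(values)  # {1, 2}: e.g. (a, b) = (1, 2) gives 1, while (1, 3) gives 2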





