Wednesday 25 December 2013

elementary number theory - If $a,b$ are coprime integers, prove that $\gcd(a^2 b^3, a+b) = 1$




I'm trying to solve this problem. I should be able to do it using simple divisibility properties, but I don't know how.




Let $a$ and $b$ be coprime integers. Prove that $\gcd(a^2b^3,a+b)=1$.




For instance, I thought that the gcd divides both $a^2b^3$ and $a+b$, so it must divide any integer combination of them. I've tried going this way, but it's not clear to me where it should lead. Any hint will be welcome. Thanks.


Answer



Suppose that $p$ is a prime such that $p \mid a^2b^3$. Then $p \mid a$ or $p \mid b$; say $p \mid a$. If also $p \mid (a+b)$, then $p \mid (a+b) - a = b$, which is impossible because $a,b$ are coprime. (The case $p \mid b$ is symmetric.) Hence no prime divides both $a^2b^3$ and $a+b$, so $\gcd(a^2b^3, a+b) = 1$.
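As a quick numerical sanity check (not a proof), the claim can be verified by brute force over small coprime pairs using Python's `math.gcd`:

```python
from math import gcd

# Check every coprime pair (a, b) with 1 <= a, b < 50:
# the claim is that gcd(a^2 * b^3, a + b) == 1.
for a in range(1, 50):
    for b in range(1, 50):
        if gcd(a, b) == 1:
            assert gcd(a**2 * b**3, a + b) == 1, (a, b)

print("all small coprime pairs satisfy the identity")
```

If any coprime pair violated the identity, the assertion would fail and report the offending `(a, b)`.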



