Thursday, 23 November 2017

elementary number theory - Prove $\gcd(k, l) = d \Rightarrow \gcd(2^k - 1, 2^l - 1) = 2^d - 1$

This is a problem for a graduate-level discrete math class that I'm hoping to take next year (as a senior undergrad). The problem is as stated in the title:

Given that $\gcd(k, l) = d$, prove that $\gcd(2^k - 1, 2^l - 1) = 2^d - 1$.
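A quick sanity check of the statement (my own example, not part of the original problem): with $k = 6$ and $l = 4$ we have $d = \gcd(6, 4) = 2$, and indeed
$$\gcd(2^6 - 1,\, 2^4 - 1) = \gcd(63, 15) = 3 = 2^2 - 1.$$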

The problem also says "hint: use Euclid's lemma," although I'm not sure which part of the problem it should be applied to. I've been thinking about it for a few days and I'm completely stumped.

I'm not even sure how to show that $2^d - 1$ divides either $2^k - 1$ or $2^l - 1$. From the given, we know that $\exists c: dc = k$. We need to show that $\exists c': (2^d - 1)c' = 2^k - 1$. Trying $c' = 2^{k-d}$ gives $(2^d - 1)c' = 2^k - 2^{k-d}$, but I can't figure out how to get rid of the extra terms that the $-1$ introduces in various places.
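For reference, the finite geometric series gives a standard factorization that seems relevant here (this identity is not used in the post itself; it is just a well-known fact): if $dc = k$, then
$$2^k - 1 = (2^d)^c - 1 = (2^d - 1)\left(2^{d(c-1)} + 2^{d(c-2)} + \cdots + 2^d + 1\right),$$
so $c' = 2^{d(c-1)} + 2^{d(c-2)} + \cdots + 2^d + 1$ works.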

From Euclid's lemma applied to the left-hand side, you know $\exists i: di = k - l$, and applying the same idea to the right-hand side, it suffices to show that $\gcd(2^k - 2^l, 2^l - 1) = 2^d - 1$. And by Bezout's identity, it's enough to show that $2^d - 1$ can be written as an integer linear combination of those two quantities.
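To spell that reduction out (assuming $k \ge l$; this chain is not in the post above, it just combines $\gcd(a, b) = \gcd(a - b, b)$ with the fact that $2^l$ is coprime to $2^l - 1$):
$$\gcd(2^k - 1,\, 2^l - 1) = \gcd(2^k - 2^l,\, 2^l - 1) = \gcd\bigl(2^l(2^{k-l} - 1),\, 2^l - 1\bigr) = \gcd(2^{k-l} - 1,\, 2^l - 1),$$
which is one step of the Euclidean algorithm carried out on the exponents.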

Can anyone give me a hint?
