Question: Prove that the following algorithm computes gcd(x, y), the greatest common divisor of x and y, and show its worst-case running time.

BINARYGCD(x, y):
    if x = y:
        return x
    else if x and y are both even:
        return 2 * BINARYGCD(x/2, y/2)
    else if x is even:
        return BINARYGCD(x/2, y)
    else if y is even:
        return BINARYGCD(x, y/2)
    else if x > y:
        return BINARYGCD((x - y)/2, y)
    else:
        return BINARYGCD(x, (y - x)/2)

I saw the posted answer. It traced the algorithm on x = 132 and y = 84. Following the logic, the innermost call only returns 3, yet the final result is 12, and the answer didn't explain why. I cannot see which line of the code turns that 3 into 12. Appreciate any guidance!
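The 12 comes from the "both even" branch: it returns 2 * BINARYGCD(x/2, y/2), so each time that branch fires, a factor of 2 is multiplied onto the result on the way back up the recursion. For (132, 84) that branch fires twice before the arguments stop being both even, and the remaining calls eventually bottom out at BINARYGCD(3, 3) = 3, giving 2 * 2 * 3 = 12. As a sketch (a direct Python transcription of the pseudocode, not the posted answer's code), you can run the trace yourself:

```python
def binary_gcd(x, y):
    if x == y:
        return x                              # base case: gcd(x, x) = x
    elif x % 2 == 0 and y % 2 == 0:
        return 2 * binary_gcd(x // 2, y // 2) # 2 divides both, so 2 divides the gcd:
                                              # factor it out and multiply back later
    elif x % 2 == 0:
        return binary_gcd(x // 2, y)          # 2 divides only x, so it cannot divide
                                              # the gcd; discard the factor of 2
    elif y % 2 == 0:
        return binary_gcd(x, y // 2)          # symmetric case for y
    elif x > y:
        return binary_gcd((x - y) // 2, y)    # both odd: gcd(x, y) = gcd(x - y, y),
                                              # and x - y is even while the gcd is odd,
                                              # so halving the difference is safe
    else:
        return binary_gcd(x, (y - x) // 2)    # symmetric case for y > x

# The trace for (132, 84):
# (132, 84) both even -> 2 * (66, 42)
# (66, 42)  both even -> 2 * (33, 21)
# (33, 21)  both odd  -> (6, 21) -> (3, 21) -> (3, 9) -> (3, 3) = 3
# Result: 2 * 2 * 3 = 12
print(binary_gcd(132, 84))  # 12
```

So there is no single "3 * 4" line: the base case returns 3, and the two pending "both even" frames each multiply by 2 as the recursion unwinds.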
