12

http://en.wikipedia.org/wiki/Binary_GCD_algorithm

This Wikipedia entry has a very dissatisfying implication: the Binary GCD algorithm was at one time as much as 60% more efficient than the standard Euclid Algorithm, but as late as 1998 Knuth concluded that there was only a 15% gain in efficiency on his contemporary computers.

Well, another 15 years have passed... how do these two algorithms stack up today, given advances in hardware?

Does the Binary GCD continue to outperform the Euclidean Algorithm in low-level languages but languish behind due to its complexity in higher-level languages like Java? Or is the difference moot in modern computing?
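For reference, here is a minimal sketch of the two algorithms in Java (the names and details are mine, not from either poster; non-negative `long` inputs are assumed):

```java
// Euclidean algorithm: repeated remainder, one division per step.
static long euclidGcd(long a, long b) {
    while (b != 0) {
        long r = a % b;
        a = b;
        b = r;
    }
    return a;
}

// Binary GCD: only shifts, subtractions and comparisons.
static long binaryGcd(long a, long b) {
    if (a == 0) return b;
    if (b == 0) return a;
    int shift = Long.numberOfTrailingZeros(a | b); // common power of two
    a >>>= Long.numberOfTrailingZeros(a);          // make a odd
    while (b != 0) {
        b >>>= Long.numberOfTrailingZeros(b);      // make b odd
        if (a > b) { long t = a; a = b; b = t; }   // keep a <= b
        b -= a;                                    // difference is even (or zero)
    }
    return a << shift;
}
```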

Why do I care, you might ask? I just so happen to have to process like 100 billion of these today :) Here's a toast to living in an era of computing (poor Euclid).

jkschneider
  • 23,906
  • 11
  • 68
  • 99
  • You can benchmark them yourself: in one loop (say, of size 1000) create a list of random numbers, then for each pair of numbers compute the binary GCD, and in another loop compute the Euclidean GCD; what's the problem? IMO, even on modern computers binary should still be faster, especially with bigger numbers. – Saeed Amiri Nov 19 '11 at 08:32
  • I could, and that would be fairly representative of a particular language on a particular processor on a particular OS. This is a common enough numerical operation that I was curious more generally what the preferred solution is in high-performance applications today. – jkschneider Nov 19 '11 at 08:43
  • If you have to do 100 billion today, any time spent debating the most efficient solution is going to cost you more lost time than simply implementing one or the other would've. – Nick Johnson Nov 21 '11 at 03:58

1 Answer

6

The answer is of course "it depends". It depends on hardware, compiler, specific implementation, whatever I forgot. On machines with slow division, binary GCD tends to outperform the Euclidean algorithm. I benchmarked it a couple of years ago on a Pentium 4 in C, Java and a few other languages; overall in that benchmark, binary GCD with a 256-element lookup table beat the Euclidean algorithm by a factor of between 1.6 and nearly 3. The Euclidean algorithm came closer when, instead of dividing immediately, a few rounds of subtraction were performed first. I don't remember the figures, but binary was still considerably faster.
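The answer doesn't show its benchmark code, so the following is only an assumption about what "a 256-element lookup table" means here: a per-byte table of trailing-zero counts, used instead of stripping zero bits one at a time. A sketch in Java:

```java
// Hypothetical sketch (not the answerer's actual code): count trailing zero
// bits a byte at a time via a 256-entry table.
static final int[] TZ = new int[256];
static {
    TZ[0] = 8;                                   // an all-zero byte contributes 8 zero bits
    for (int i = 1; i < 256; i++) {
        int v = i, n = 0;
        while ((v & 1) == 0) { v >>= 1; n++; }
        TZ[i] = n;
    }
}

// Trailing-zero count for a nonzero value, using the table byte by byte.
static int trailingZeros(long x) {
    int n = 0;
    while ((x & 0xFF) == 0) { x >>>= 8; n += 8; }
    return n + TZ[(int) (x & 0xFF)];
}
```

Dropping this in place of `Long.numberOfTrailingZeros` in the binary GCD sketch above gives the table-assisted variant; on current CPUs the intrinsic is likely at least as fast as the table, which is one more reason the relative results depend on the hardware.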

If the machine has fast division, things may be different, since the Euclidean algorithm needs fewer operations. If the difference in cost between division and subtraction/shifts is small enough, binary will be slower. Which one is better in your circumstances is something you'll have to find out by benchmarking yourself.
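As the answer (and the comments above) suggest, measuring on your own data is the only reliable way to decide. A minimal, self-contained sketch of such a benchmark follows; the class and method names are mine, and for serious numbers a proper harness such as JMH would be preferable to avoid JIT warm-up artifacts:

```java
import java.util.Random;

// Rough timing of both GCD variants on the same random pairs.
public class GcdBench {
    public static void main(String[] args) {
        final int N = 1_000_000;
        long[] xs = new long[N], ys = new long[N];
        Random rnd = new Random(42);
        for (int i = 0; i < N; i++) {
            xs[i] = rnd.nextLong() >>> 1;           // non-negative values
            ys[i] = rnd.nextLong() >>> 1;
        }

        long sink = 0;                              // prevents dead-code elimination
        long t0 = System.nanoTime();
        for (int i = 0; i < N; i++) sink += euclidGcd(xs[i], ys[i]);
        long t1 = System.nanoTime();
        for (int i = 0; i < N; i++) sink += binaryGcd(xs[i], ys[i]);
        long t2 = System.nanoTime();

        System.out.printf("euclid: %d ms, binary: %d ms (sink=%d)%n",
                (t1 - t0) / 1_000_000, (t2 - t1) / 1_000_000, sink);
    }

    // Same implementations as sketched in the question above.
    static long euclidGcd(long a, long b) {
        while (b != 0) { long r = a % b; a = b; b = r; }
        return a;
    }

    static long binaryGcd(long a, long b) {
        if (a == 0) return b;
        if (b == 0) return a;
        int shift = Long.numberOfTrailingZeros(a | b);
        a >>>= Long.numberOfTrailingZeros(a);
        while (b != 0) {
            b >>>= Long.numberOfTrailingZeros(b);
            if (a > b) { long t = a; a = b; b = t; }
            b -= a;
        }
        return a << shift;
    }
}
```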

Daniel Fischer
  • 174,737
  • 16
  • 293
  • 422