The accelerated integer GCD algorithm

  • Author: Kenneth Weber
  • Affiliation: Kent State Univ., Kent, OH
  • Venue: ACM Transactions on Mathematical Software (TOMS)
  • Year: 1995

Abstract

Since the greatest common divisor (GCD) of two integers is a basic arithmetic operation used in many mathematical software systems, new algorithms for its computation are of widespread interest. The accelerated integer GCD algorithm discussed here is based on a reduction step proposed by Sorenson (k-ary reduction), coupled with the dmod operation, which is similar to Norton's smod. Some practical limitations of Sorenson's reduction have been eliminated. Worst-case complexity remains O(n²) for n-bit inputs, but actual implementations run more than 5.5 times as fast as the binary GCD on inputs about 4096 bits long, on one computer architecture having a multiply instruction. Independent research by Jebelean points to the same conclusions.
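For readers unfamiliar with the reduction named in the abstract, the sketch below illustrates one k-ary reduction step in the spirit of Sorenson, alongside the binary GCD used as the performance baseline. It is a minimal Python illustration under stated assumptions, not Weber's implementation: the choice k = 256, the exhaustive search for the multipliers a and b, and the helper names are assumptions made here for clarity, and the sketch does not include the dmod-based refinement or the final correction pass that removes spurious factors.

```python
import math

def binary_gcd(u, v):
    # Stein's binary GCD -- the baseline the abstract's 5.5x figure is measured against.
    if u == 0:
        return v
    if v == 0:
        return u
    shift = ((u | v) & -(u | v)).bit_length() - 1   # common factors of 2
    u >>= (u & -u).bit_length() - 1                 # make u odd
    while v:
        v >>= (v & -v).bit_length() - 1             # make v odd
        if u > v:
            u, v = v, u
        v -= u                                      # difference is even
    return u << shift

def k_ary_reduction(u, v, k=256):
    # One k-ary reduction step in the spirit of Sorenson: find small a, b
    # (1 <= a <= sqrt(k), 0 < |b| <= sqrt(k)) with a*u + b*v == 0 (mod k),
    # then replace u by |a*u + b*v| / k, roughly sqrt(k) times smaller than
    # max(u, v).  The exhaustive search is for clarity only; the paper's
    # dmod-based reduction is far cheaper in practice.
    assert u % 2 == 1 and v % 2 == 1        # both odd, hence coprime to k = 2^s
    m = math.isqrt(k)
    for a in range(1, m + 1):
        for b in range(-m, m + 1):
            if b != 0 and (a * u + b * v) % k == 0:
                return abs(a * u + b * v) // k
    raise AssertionError("unreachable for k a power of two and odd u, v")

if __name__ == "__main__":
    u, v = 3 * (2**127 - 1), 3 * (2**89 - 1)    # odd inputs with a known common factor
    u_new = k_ary_reduction(u, v)
    # gcd(u, v) still divides the reduced pair, but spurious factors of a and b
    # may appear; the full algorithm removes them at the end.
    assert math.gcd(u_new, v) % math.gcd(u, v) == 0
    assert u_new < u
    assert binary_gcd(u, v) == math.gcd(u, v)
    print(f"reduced u from {u.bit_length()} bits to {u_new.bit_length()} bits")
```

The check at the bottom shows the key invariant exploited by k-ary algorithms: each reduction shrinks one operand by roughly half the bit length of k while keeping the true GCD as a divisor of the pair, at the cost of possible spurious factors that must be stripped afterward.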