A polynomial factorization challenge

  • Authors: Joachim von zur Gathen
  • Affiliations: -
  • Venue: ACM SIGSAM Bulletin
  • Year: 1992

Abstract

In the early 1970s, a major paradigm shift took place in algorithms research, away from experimental results and toward asymptotic analysis. Knuth popularized the "Big O" notation, and Hopcroft says in his 1986 ACM Turing Award address (shared with Robert Tarjan): "During the 1960s, research on algorithms had been very unsatisfying. A researcher would publish an algorithm in a journal along with execution times for a small set of sample problems, and then several years later, a second researcher would give an improved algorithm along with execution times for the same set of sample problems. The new algorithm would invariably be faster, since in the intervening years, both computer performance and programming languages had improved. The fact that the algorithms were run on different computers and programmed in different languages made me uncomfortable with the comparison. It was difficult to factor out both the effects of increased computer performance and the programming skills of the implementors---to discover the effects due to the new algorithm as opposed to its implementation. Furthermore, it was possible that the second researcher had inadvertently tuned his or her algorithm to the sample problems ... I set out to demonstrate that a theory of algorithm design based on worst-case asymptotic performance could be a valuable aid to the practitioner."