Impact of Buffer Size on the Efficiency of Deadlock Detection

  • Authors:
  • J. M. Martinez; P. Lopez; J. Duato

  • Venue:
  • HPCA '99 Proceedings of the 5th International Symposium on High Performance Computer Architecture
  • Year:
  • 1999

Abstract

Deadlock detection is one of the most important design issues in recovery strategies for routing in interconnection networks. In a previous paper, we presented an efficient deadlock detection mechanism. This mechanism requires that, when a message header blocks, all the channels reserved by that message be quickly notified. To achieve this goal, the detection mechanism uses the information provided by flow control.

Some recent commercial multiprocessors use deep buffers, since they may increase network throughput and allow efficient transmission over long wires. However, deep buffers may also increase the elapsed time between a header blocking at a router and the propagation of the corresponding flow control signals, thus negatively affecting the behavior of our deadlock detection mechanism. On the other hand, deeper buffers reduce deadlock frequency. As a consequence, buffer size has opposing effects on deadlock detection. In this paper, we analyze by simulation the influence of these effects on the efficiency of our deadlock detection mechanism, showing that overall performance improves with buffer size.
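The abstract hinges on the observation that flow-control backpressure propagates upstream one buffer at a time, so the time for a blocked header to be noticed by all the channels reserved by the message grows with buffer depth. The following minimal simulation is a hypothetical sketch of that effect only, not the authors' simulator or detection mechanism: a single wormhole message occupies a linear chain of reserved channels, the header blocks at the downstream end, and we count cycles until every reserved channel observes a full downstream buffer. All function names and parameters are illustrative assumptions.

```python
# Hypothetical sketch: how buffer depth delays the upstream propagation of
# flow-control backpressure after a message header blocks. Not the paper's
# simulator; names and parameters are illustrative.

def cycles_until_all_channels_stalled(num_channels: int, depth: int) -> int:
    """Cycles from header blocking at channel 0 until every reserved channel
    upstream has observed a full downstream buffer (i.e. been notified)."""
    buffers = [0] * num_channels        # flits queued at each reserved channel
    stalled = [False] * num_channels    # channel has seen backpressure
    cycles = 0
    while not all(stalled):
        cycles += 1
        stalled[0] = True               # header is blocked; channel 0 cannot drain
        for i in range(1, num_channels):
            if buffers[i] > 0 and buffers[i - 1] < depth:
                # Forward one flit per cycle while the downstream buffer has room.
                buffers[i] -= 1
                buffers[i - 1] += 1
            elif buffers[i - 1] >= depth:
                # Downstream buffer full: flow control stops this channel,
                # which is how it learns that the header is blocked.
                stalled[i] = True
        # The source keeps injecting body flits at the upstream end.
        if buffers[-1] < depth:
            buffers[-1] += 1
    return cycles

if __name__ == "__main__":
    for depth in (1, 2, 4, 8, 16):
        delay = cycles_until_all_channels_stalled(num_channels=8, depth=depth)
        print(f"buffer depth {depth:2d} flits -> notification delay {delay} cycles")
```

Running the sketch with increasing depth values shows the notification delay growing roughly linearly with buffer depth, which is the cost the paper weighs against the lower deadlock frequency that deeper buffers also provide.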