Link scheduling in wireless sensor networks: Distributed edge-coloring revisited

  • Authors:
  • Shashidhar Gandham; Milind Dawande; Ravi Prakash

  • Affiliations:
  • xG Technology, Inc., United States; School of Management, The University of Texas at Dallas, Richardson, TX, United States and Department of Computer Science, University of Texas at Dallas, Richardson, TX, United States; Department of Computer Science, University of Texas at Dallas, Richardson, TX, United States

  • Venue:
  • Journal of Parallel and Distributed Computing
  • Year:
  • 2008


Abstract

We consider the problem of link scheduling in a sensor network employing a TDMA MAC protocol. Our algorithm consists of two phases. The first phase involves edge-coloring: an assignment of a color to each edge in the network such that no two edges incident on the same node are assigned the same color. Our main result for the first phase is a distributed edge-coloring algorithm that needs at most (Δ+1) colors, where Δ is the maximum degree of the network. To our knowledge, this is the first distributed algorithm that can edge-color a graph using at most (Δ+1) colors. The second phase uses the edge-coloring solution for link scheduling. We map each color to a unique timeslot and attempt to assign a direction of transmission along each edge such that the hidden terminal problem is avoided; an important result we obtain is a characterization of network topologies for which such an assignment exists. Next, we consider topologies for which a feasible transmission assignment does not exist for all edges, and obtain such an assignment using additional timeslots. Finally, we show that reversing the direction of transmission along every edge leads to another feasible direction of transmission. Using both the transmission assignments, we obtain a TDMA MAC schedule which enables two-way communication between every pair of adjacent sensor nodes. For acyclic topologies, we prove that at most 2(Δ+1) timeslots are required. Results for general topologies are demonstrated using simulations; for sparse graphs, we show that the number of timeslots required is around 2(Δ+1). We show that the message and time complexity of our algorithm is O(nΔ² + n²m), where n is the number of nodes and m is the number of edges in the network. Through simulations, we demonstrate that the energy consumption of our solution increases linearly with Δ. We also propose extensions to account for non-ideal radio characteristics.
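To make the color-to-timeslot mapping concrete, here is a hedged sketch in Python. It uses a simple *centralized* greedy edge-coloring, which is not the paper's distributed (Δ+1)-coloring algorithm (greedy can need up to 2Δ−1 colors in the worst case); the example topology is assumed for illustration. The point it shows is the second-phase idea: each color becomes a timeslot, and reversing every edge direction in a second set of slots yields a schedule supporting two-way communication, echoing the 2(Δ+1) bound for acyclic topologies.

```python
# Illustrative sketch only: centralized greedy edge-coloring plus a
# color->timeslot mapping with direction reversal. Not the authors'
# distributed (Delta+1) algorithm.
from itertools import count

def greedy_edge_color(edges):
    """Give each edge the smallest color unused by edges sharing an endpoint."""
    color_of = {}
    used_at = {}  # node -> set of colors already on its incident edges
    for u, v in edges:
        taken = used_at.setdefault(u, set()) | used_at.setdefault(v, set())
        c = next(c for c in count() if c not in taken)
        color_of[(u, v)] = c
        used_at[u].add(c)
        used_at[v].add(c)
    return color_of

# A small 4-node example topology (hypothetical).
edges = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]
coloring = greedy_edge_color(edges)
num_colors = max(coloring.values()) + 1

# Phase 2 idea: color c -> timeslot c for one transmission direction,
# and timeslot c + num_colors for the reversed direction, so every
# adjacent pair gets a slot in each direction.
schedule = {}
for (u, v), c in coloring.items():
    schedule.setdefault(c, []).append((u, v))
    schedule.setdefault(c + num_colors, []).append((v, u))
```

Within any one timeslot, no two scheduled edges share a node, which is exactly the property the edge-coloring guarantees; avoiding hidden terminals additionally constrains the direction assignment, which this sketch does not model.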