Decentralized Estimation Using Learning Vector Quantization
DCC '09 Proceedings of the 2009 Data Compression Conference
A typical approach in supervised learning when data come from multiple sources is to send the original data from all sources to a central location and train a predictor there that estimates a target quantity. This can be inefficient and costly in applications with constrained communication channels, where power and/or bitlength are limited. Under such constraints, one potential solution is to encode the data at each source and decode it at the central location: each source summarizes its observation into a single codeword, and the central location estimates the target quantity from the received codewords. This problem is known as decentralized estimation. In this paper we propose a variant of the Learning Vector Quantization (LVQ) classification algorithm, the Distortion-Sensitive LVQ (DSLVQ), for encoder design in decentralized estimation. Unlike most related research, which assumes known distributions of the source observations, we assume that only a set of empirical samples is available. The DSLVQ approach is compared to the previously proposed Regression Tree and Deterministic Annealing (DA) approaches for encoder design in the same setting. While the Regression Tree is very fast to train, it is limited to encoder regions with axis-parallel splits. DA, on the other hand, is known to provide state-of-the-art performance, but its training complexity grows with the number of sources that have different data distributions, due to over-parametrization. Our experiments on several synthetic problems and one real-world remote-sensing problem show that DA has limited application potential, as it is highly impractical to train even in a four-source setting, whereas DSLVQ is as simple and fast to train as the Regression Tree. Moreover, DSLVQ matches DA's performance in experiments with a small number of sources, outperforms DA in experiments with a large number of sources, and consistently outperforms the Regression Tree.
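The setup described above can be illustrated with a minimal sketch. This is not the paper's DSLVQ algorithm: it uses plain nearest-prototype (k-means) encoders at each source and a lookup-table decoder fit on empirical samples, on a small synthetic problem of my own construction, purely to make the encode-one-codeword-per-source / decode-centrally pipeline concrete.

```python
# Illustrative sketch (NOT the paper's DSLVQ): each source quantizes its
# observation to a single codeword via nearest-prototype encoding, and a
# central decoder maps the received codeword pair to a target estimate.
# All data and parameters here are assumptions for the example.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: target y, noisy observations at two sources.
n = 2000
y = rng.uniform(0.0, 1.0, n)
x1 = y + 0.1 * rng.standard_normal(n)
x2 = y + 0.1 * rng.standard_normal(n)

def fit_codebook(x, k=4, iters=20):
    """1-D k-means: the prototypes define the encoder regions at one source."""
    protos = np.quantile(x, np.linspace(0.1, 0.9, k))
    for _ in range(iters):
        idx = np.argmin(np.abs(x[:, None] - protos[None, :]), axis=1)
        for j in range(k):
            if np.any(idx == j):
                protos[j] = x[idx == j].mean()
    return protos

def encode(x, protos):
    """Each source transmits only the index of its nearest prototype."""
    return np.argmin(np.abs(x[:, None] - protos[None, :]), axis=1)

k = 4
p1, p2 = fit_codebook(x1, k), fit_codebook(x2, k)
c1, c2 = encode(x1, p1), encode(x2, p2)

# Central decoder: for every pair of received codewords, estimate y by the
# mean target value of the training samples that map to that pair.
decoder = np.full((k, k), y.mean())
for i in range(k):
    for j in range(k):
        mask = (c1 == i) & (c2 == j)
        if mask.any():
            decoder[i, j] = y[mask].mean()

y_hat = decoder[c1, c2]
mse = np.mean((y - y_hat) ** 2)
print(f"decentralized-estimation MSE with {k}-level encoders: {mse:.4f}")
```

The point of the paper is precisely that encoders fit independently at each source, as done here, are suboptimal: DSLVQ instead adjusts the prototypes with the end-to-end estimation distortion in the learning rule.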