We propose Gaussian processes (GPs) as a novel nonlinear receiver for digital communication systems. The GP framework solves both classification (GPC) and regression (GPR) problems. The minimum mean squared error (MMSE) solution is the expectation of the transmitted symbol given the information at the receiver, which, for discrete inputs, is a nonlinear function of the received symbols. GPR can therefore be cast as a nonlinear MMSE estimator, capable of achieving optimal performance from the MMSE viewpoint. The design of digital communication receivers can also be viewed as a detection problem, for which GPC is especially suited because it assigns posterior probabilities to each transmitted symbol. We explore the suitability of GPs as nonlinear digital communication receivers. GPs are Bayesian machine learning tools that formulate a likelihood function for their hyperparameters, which can then be set optimally; they thereby outperform state-of-the-art nonlinear machine learning approaches that prespecify their hyperparameters or rely on cross-validation. We illustrate the advantages of GPs as digital communication receivers for linear and nonlinear channel models with short training sequences, and compare them to state-of-the-art nonlinear machine learning tools such as support vector machines.
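The idea of GPR as a nonlinear MMSE equalizer can be sketched as follows. This is a minimal illustration, not the authors' implementation: the channel impulse response, SNR, window width, and the use of scikit-learn's `GaussianProcessRegressor` are all assumptions made for the example. BPSK symbols are passed through a toy dispersive channel, and a GP regressor maps a sliding window of received samples back to the transmitted symbol; its kernel hyperparameters are set by maximizing the marginal likelihood inside `fit`, mirroring the abstract's point about optimal hyperparameter selection from a short training sequence.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(0)

# Toy dispersive channel (assumed impulse response) with additive noise.
h = np.array([1.0, 0.5])

def channel(sym, snr_db=15):
    x = np.convolve(sym, h, mode="full")[: len(sym)]
    noise_std = 10 ** (-snr_db / 20)
    return x + noise_std * rng.standard_normal(len(sym))

def windows(r, width=3):
    # Stack sliding windows of received samples as regressor inputs,
    # zero-padding so window i is aligned with transmitted symbol i.
    r = np.concatenate([np.zeros(width - 1), r])
    return np.array([r[i : i + width] for i in range(len(r) - width + 1)])

# Short training sequence, matching the setting described in the abstract.
s_train = rng.choice([-1.0, 1.0], size=100)
s_test = rng.choice([-1.0, 1.0], size=1000)
X_train, X_test = windows(channel(s_train)), windows(channel(s_test))

# GPR equalizer: kernel hyperparameters are tuned by maximizing the
# marginal likelihood of the training data (no cross-validation needed).
gp = GaussianProcessRegressor(kernel=RBF() + WhiteKernel(), normalize_y=True)
gp.fit(X_train, s_train)

# Hard decisions on the GP posterior mean approximate the MMSE detector.
ber = np.mean(np.sign(gp.predict(X_test)) != s_test)
print(f"test BER: {ber:.3f}")
```

A GPC-based receiver would follow the same structure with `GaussianProcessClassifier`, yielding posterior symbol probabilities instead of a posterior mean, which is the detection-oriented view the abstract highlights.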