Recently, the minimum error entropy (MEE) criterion, an information-theoretic alternative to the traditional mean square error criterion, has been used successfully in machine learning and signal processing. For system identification, however, the MEE criterion is no longer suitable when the training data are discrete-valued, since minimizing the error's discrete entropy does not constrain the error's dispersion. In this paper, to make the MEE criterion suitable for discrete-valued data, we introduce a new entropy definition for discrete random variables, the Δ-entropy, based on Riemann sums over finite-size partitions. A probability-weighted formula is established to calculate the average partition width. The new entropy retains several important properties of the differential entropy and reduces to the discrete entropy under certain conditions. Unlike the discrete entropy, the Δ-entropy is sensitive to the dynamic range of the data and can therefore serve as a superior optimality criterion in system identification problems. We also present a plug-in estimator of the Δ-entropy, analyze its asymptotic behavior, and explore its links to the kernel-based and m-spacing-based estimators of differential entropy. Finally, the Δ-entropy criterion is applied to system identification with coarsely quantized input-output data to search for the optimum parameter set. Monte Carlo simulations demonstrate the performance improvement achievable with the Δ-entropy criterion.
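The idea of a range-sensitive entropy for discrete data can be sketched as follows. The exact formula is not given in this abstract, so the sketch below makes an assumption: each support point is assigned a partition width equal to the average gap to its neighbours, and the entropy is the Riemann-sum analogue of the differential entropy, `H_Δ = -Σ_i p_i log(p_i / Δ_i)`. The function name `delta_entropy` and the neighbour-gap choice of `Δ_i` are illustrative, not the paper's definition.

```python
import math

def delta_entropy(values, probs):
    """Hedged sketch of a Δ-entropy for a discrete random variable.

    Assumption (not taken from the paper): the partition width Δ_i at
    support point x_i is the average gap to its neighbours, and
        H_Δ = -Σ_i p_i * log(p_i / Δ_i)
            = H_discrete + Σ_i p_i * log(Δ_i),
    i.e. the discrete entropy plus a probability-weighted log-width term,
    which makes the measure sensitive to the dynamic range of the data.
    """
    pts = sorted(zip(values, probs))
    xs = [x for x, _ in pts]
    ps = [p for _, p in pts]
    n = len(xs)
    widths = []
    for i in range(n):
        # At the boundaries, reuse the single available neighbour gap.
        left = xs[i] - xs[i - 1] if i > 0 else xs[i + 1] - xs[i]
        right = xs[i + 1] - xs[i] if i < n - 1 else xs[i] - xs[i - 1]
        widths.append(0.5 * (left + right))
    return -sum(p * math.log(p / w) for p, w in zip(ps, widths) if p > 0)
```

Under this assumption, a uniform error on {0, 1} and a uniform error on {0, 10} have the same discrete entropy (log 2), but the second has a Δ-entropy larger by log 10, so minimizing it does penalize the error's dispersion.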