Integration of Stochastic Models by Minimizing α-Divergence
Neural Computation
A nature inspired Ying-Yang approach for intelligent decision support in bank solvency analysis
Expert Systems with Applications: An International Journal
Stock Prediction Using FCMAC-BYY
ISNN '07: Proceedings of the 4th International Symposium on Neural Networks, Part II: Advances in Neural Networks
An online Bayesian Ying-Yang learning applied to fuzzy CMAC
Neurocomputing
Online FCMAC-BYY Model with Sliding Window
ISNN 2009: Proceedings of the 6th International Symposium on Neural Networks: Advances in Neural Networks, Part II
Bayesian Ying Yang system, best harmony learning, and Gaussian manifold based family
WCCI'08: Proceedings of the 2008 IEEE World Conference on Computational Intelligence: Research Frontiers
Fuzzy CMAC with incremental Bayesian Ying-Yang learning and dynamic rule construction
IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics
Machine learning problems from optimization perspective
Journal of Global Optimization
Evolutionary FCMAC-BYY applied to stream data analysis
SEAL'10: Proceedings of the 8th International Conference on Simulated Evolution and Learning
Automatic oriental medical diagnosis via BYY learning based discrete independent factor analysis
ICOSSSE'05: Proceedings of the 4th WSEAS/IASME International Conference on System Science and Simulation in Engineering
Image segmentation with BYY-RPCL framework
IScIDE'11: Proceedings of the Second Sino-Foreign-Interchange Conference on Intelligent Science and Intelligent Data Engineering
KCMAC-BYY: Kernel CMAC using Bayesian Ying-Yang learning
Neurocomputing
The nature of Bayesian Ying-Yang harmony learning is reexamined from an information-theoretic perspective. Not only is its ability to perform model selection and regularization explained with new insights, but its relations to, and differences from, minimum description length (MDL), the Bayesian approach, bits-back based MDL, the Akaike information criterion (AIC), maximum likelihood, information geometry, Helmholtz machines, and variational approximation are also discussed. Moreover, a generalized projection geometry is introduced for a further understanding of this new mechanism. Furthermore, new algorithms are developed for implementing Gaussian factor analysis (FA) and non-Gaussian factor analysis (NFA) such that appropriate factors are selected automatically during parameter learning.
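To make the idea of selecting factors during parameter learning concrete, the sketch below fits a standard Gaussian factor-analysis model by EM and drops factors whose loading columns shrink toward zero. This is only an illustrative stand-in: the abstract's algorithms are based on the BYY harmony objective, whereas this uses plain maximum-likelihood EM with a heuristic pruning threshold, and the function name `fa_em_prune` and parameter `prune_tol` are invented for this example.

```python
import numpy as np

def fa_em_prune(X, k_init=6, iters=200, prune_tol=1e-2, seed=0):
    """Fit Gaussian FA (x = A y + e, y ~ N(0, I), e ~ N(0, diag(psi)))
    by EM, pruning factors whose loading column norm falls below
    prune_tol. Illustrative only; not the BYY harmony algorithm."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    X = X - X.mean(axis=0)                        # centre the data
    A = rng.standard_normal((d, k_init)) * 0.1    # loading matrix (d x k)
    psi = np.var(X, axis=0)                       # diagonal noise variances
    for _ in range(iters):
        k = A.shape[1]
        iPsi = 1.0 / psi
        # E-step: posterior over factors, M = I + A^T Psi^{-1} A
        M = np.eye(k) + (A.T * iPsi) @ A
        Minv = np.linalg.inv(M)
        Ey = X @ (iPsi[:, None] * A) @ Minv       # posterior means (n x k)
        Eyy = n * Minv + Ey.T @ Ey                # sum of E[y y^T]
        # M-step: update loadings and noise variances
        A = (X.T @ Ey) @ np.linalg.inv(Eyy)
        psi = np.mean(X**2, axis=0) - np.sum(A * (X.T @ Ey), axis=1) / n
        psi = np.maximum(psi, 1e-6)
        # heuristic "automatic selection": drop near-zero loading columns
        keep = np.linalg.norm(A, axis=0) > prune_tol
        if keep.any():
            A = A[:, keep]
    return A, psi
```

On data generated from a low-dimensional factor model, the surviving number of columns in `A` gives a rough estimate of the factor count; in the BYY framework this selection instead falls out of maximizing the harmony measure.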