Binary Factor Analysis (BFA) uncovers independent binary information sources from observations and has wide applications. BFA learning hierarchically nests three levels of inverse problems: inference of the binary code for each observation, parameter estimation, and model selection. Under the Bayesian Ying-Yang (BYY) framework, the first level becomes an intractable Binary Quadratic Programming (BQP) problem, while model selection can be conducted automatically during parameter learning. We conduct extensive experiments and find that the performance order of four BQP methods is reversed when moving from pure BQP optimization to BYY automatic model selection, which implies that learning is not merely optimization. Moreover, the BFA learning algorithm is further developed with priors over the parameters to improve performance. Finally, based on BFA, we empirically compare BYY with Variational Bayes (VB) and the Bayesian Information Criterion (BIC).
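To make the first-level inverse problem concrete, the following is a minimal illustrative sketch (not the paper's algorithm). It assumes the standard BFA generative form x = Ay + e with a binary code y in {0,1}^m and Gaussian noise e; MAP inference of y then minimizes ||x - Ay||², which is a BQP with f(y) = yᵀQy - 2bᵀy for Q = AᵀA and b = Aᵀx. A greedy single-bit-flip local search, one of the simplest BQP heuristics, serves as the solver here; the function name `bqp_local_search` and all parameter choices are hypothetical.

```python
import numpy as np

def bqp_local_search(Q, b, y0):
    """Greedy bit-flip descent on f(y) = y^T Q y - 2 b^T y over y in {0,1}^m.

    Illustrative BQP heuristic for the BFA code-inference step; Q = A^T A
    and b = A^T x come from minimizing the reconstruction error ||x - A y||^2.
    """
    y = y0.copy()
    improved = True
    while improved:
        improved = False
        for i in range(len(y)):
            s = 1 - 2 * y[i]  # +1 if flipping 0->1, -1 if flipping 1->0
            # exact change in f(y) caused by flipping bit i
            delta = s * (2 * (Q[i] @ y) - 2 * b[i]) + Q[i, i]
            if delta < 0:
                y[i] = 1 - y[i]
                improved = True
    return y

# Toy instance: recover a binary code from a noisy linear observation.
rng = np.random.default_rng(0)
A = rng.normal(size=(10, 4))                    # hypothetical loading matrix
y_true = rng.integers(0, 2, size=4)
x = A @ y_true + 0.01 * rng.normal(size=10)
Q, b = A.T @ A, A.T @ x
y_hat = bqp_local_search(Q, b, np.zeros(4, dtype=int))
```

The search terminates at a 1-flip local optimum of the BQP objective; like the other local heuristics compared in the paper, it carries no global-optimality guarantee, which is precisely why the choice of BQP method matters for the higher learning levels.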