We propose a method for learning a general statistical inference engine, operating on discrete and mixed discrete/continuous feature spaces. Such a model allows inference on any of the discrete features, given values for the remaining features. Applications include medical diagnosis with multiple possible diseases, fault diagnosis, information retrieval, and imputation in databases. Bayesian networks (BNs) are versatile tools that possess this inference capability. However, BNs require explicit specification of conditional independencies, which may be difficult to assess given limited data. Alternatively, Cheeseman (1983) proposed finding the maximum entropy (ME) joint probability mass function (pmf) consistent with arbitrary lower-order probability constraints. This approach is in principle powerful and does not require explicit expression of conditional independence. Until now, however, its huge learning complexity has severely limited its use. Here we propose an approximate ME method that also encodes arbitrary low-order constraints while retaining quite tractable learning. Our method restricts the support of the joint pmf (during learning) to a subset of the feature space. Results on the University of California, Irvine (UCI) repository reveal performance gains over several BN approaches and over multilayer perceptrons.
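To make the core idea concrete, the sketch below illustrates one way such a restricted-support ME model could be fit and queried. It is not the authors' implementation: the choice of pairwise marginals as the low-order constraints, the use of iterative proportional fitting, the restriction of the support to feature vectors observed in training, and all function and variable names (`fit_restricted_me`, `infer`, etc.) are illustrative assumptions.

```python
# Minimal sketch (assumptions noted above): fit a maximum-entropy joint pmf
# consistent with pairwise marginal constraints, restricting its support to
# the feature vectors seen in the training data, then query it by conditioning.
import itertools
import numpy as np


def fit_restricted_me(data, n_iters=50):
    """data: (N, d) integer array of discrete feature vectors."""
    data = np.asarray(data)
    n, d = data.shape

    # Restricted support: only feature vectors observed in training (an assumed
    # support choice for this sketch), initialized to the uniform (max-entropy) pmf.
    support = np.unique(data, axis=0)
    probs = np.full(len(support), 1.0 / len(support))

    # Low-order constraints: empirical pairwise marginals P(x_i, x_j).
    targets = {}
    for i, j in itertools.combinations(range(d), 2):
        vals, counts = np.unique(data[:, [i, j]], axis=0, return_counts=True)
        targets[(i, j)] = {tuple(v): c / n for v, c in zip(vals, counts)}

    # Iterative proportional fitting: cycle through the constraints, rescaling
    # the restricted-support pmf so each pairwise marginal matches its target.
    for _ in range(n_iters):
        for (i, j), target in targets.items():
            for (vi, vj), p_target in target.items():
                mask = (support[:, i] == vi) & (support[:, j] == vj)
                p_model = probs[mask].sum()
                if p_model > 0:
                    probs[mask] *= p_target / p_model
            probs /= probs.sum()
    return support, probs


def infer(support, probs, evidence, query_idx):
    """Posterior over feature `query_idx` given {feature_index: value} evidence."""
    mask = np.ones(len(support), dtype=bool)
    for idx, val in evidence.items():
        mask &= support[:, idx] == val
    post = {}
    for row, p in zip(support[mask], probs[mask]):
        post[row[query_idx]] = post.get(row[query_idx], 0.0) + p
    z = sum(post.values())
    return {k: v / z for k, v in post.items()} if z > 0 else post
```

Because both learning and inference touch only the restricted support rather than the full (exponentially large) joint feature space, the cost scales with the number of distinct training configurations, which is what makes this style of approximate ME learning tractable; exact conditional-independence structure never needs to be specified.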