We consider the minimum entropy principle for learning data generated by a random source and observed with random noise.

In our setting, we have a sequence of observations of objects drawn uniformly at random from a population. Each object in the population belongs to one class, and an observation of an object determines that it belongs to one of a given set of classes. Given these observations, we are interested in assigning the most likely class to each object. This scenario is a very natural one that arises in many real-life situations.

We show that, under reasonable assumptions, finding the most likely assignment is equivalent to the following variant of the set cover problem. Given a universe U and a collection I = (S1, ..., St) of subsets of U, we wish to find an assignment f : U → I such that u ∈ f(u) for every u ∈ U, and such that the entropy of the distribution defined by the values |f⁻¹(Si)| is minimized.

We show that this problem is NP-hard and that the greedy algorithm for set cover approximates the optimal cover to within an additive constant error. This sheds new light on the behavior of the greedy set cover algorithm. We further enhance the greedy algorithm and show that the problem admits a polynomial-time approximation scheme (PTAS).

Finally, we demonstrate how this model and the greedy algorithm can be useful in real-life scenarios, and in particular in problems arising naturally in computational biology.
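The greedy strategy discussed above can be sketched as follows. This is a minimal illustration, not the authors' implementation: at each step it assigns all still-uncovered elements to the subset that covers the most of them, then reports the entropy (in bits) of the resulting partition sizes |f⁻¹(Si)|. The function name and its input conventions are assumptions made for the example.

```python
import math

def greedy_entropy_cover(universe, subsets):
    """Greedy sketch for minimum entropy set cover (illustrative only).

    universe: a set of elements U.
    subsets:  a list of sets S1, ..., St whose union covers U.
    Returns the entropy, in bits, of the part sizes produced by greedily
    assigning each uncovered element to the subset covering the most of them.
    """
    uncovered = set(universe)
    part_sizes = []
    while uncovered:
        # Pick the subset covering the largest number of uncovered elements.
        best = max(subsets, key=lambda s: len(s & uncovered))
        gained = best & uncovered
        part_sizes.append(len(gained))
        uncovered -= gained
    n = len(universe)
    # Entropy of the distribution defined by the part sizes |f^{-1}(Si)|.
    return -sum((k / n) * math.log2(k / n) for k in part_sizes)
```

For example, with U = {1, ..., 6} and subsets {1,2,3,4}, {4,5}, {5,6}, the greedy pass assigns four elements to the first subset and two to the third, giving entropy H(2/3, 1/3) ≈ 0.918 bits.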