Local search methods are usually considered slow. In our paper, we present a simulated annealing-based local search algorithm for the approximation of Boolean functions that has a proven time complexity bound and runs relatively fast on randomly generated functions. The functions are represented by disjunctive normal forms (DNFs). Given a set of m uniformly distributed positive and negative examples of length n generated by a target function F(x1, …, xn) whose DNF consists of conjunctions of at most ℓ literals, the algorithm computes, with high probability, a hypothesis H of length m · ℓ that classifies all training examples correctly. The algorithm is easy to implement, and it achieved a relatively high percentage of correct classifications on test examples that were not presented during the learning phase. For example, for a randomly generated F with n = 64 variables and a training set of m = 16384 examples, the error on the same number of test examples was about 19% on positive and 29% on negative examples, respectively. The proven complexity bound provides a basis for further studies of the average-case complexity.
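
The abstract describes the approach only at a high level. The following is a minimal, illustrative Python sketch of one way a simulated-annealing local search can assemble a consistent DNF hypothesis with at most m terms of (ideally) at most ℓ literals each: start from the full conjunction of each positive example and anneal literals away while penalizing terms that cover negative examples. The cost function, neighborhood moves, cooling schedule, and all names (anneal_term, learn_dnf, t0, alpha, and so on) are assumptions for illustration, not the authors' algorithm or the variant with the proven complexity bound.

    import math
    import random

    def covers(term, x):
        # A term is a frozenset of literals (i, v): it covers x iff x[i] == v for all of them.
        return all(x[i] == v for i, v in term)

    def consistent(term, negatives):
        # A term is consistent iff it covers no negative example.
        return not any(covers(term, x) for x in negatives)

    def cost(term, negatives, ell):
        # Covered negatives dominate the cost; excess length over ell is a secondary penalty.
        bad = sum(covers(term, x) for x in negatives)
        return bad * (len(negatives) + 1) + max(0, len(term) - ell)

    def anneal_term(seed, negatives, ell, steps=5000, t0=2.0, alpha=0.999):
        # Start from the full conjunction of the positive example `seed` (it covers
        # `seed` and nothing else) and anneal it down towards at most `ell` literals.
        # Schedule parameters are illustrative, not tuned.
        full = frozenset(enumerate(seed))
        term, t = full, t0
        for _ in range(steps):
            if random.random() < 0.7 and len(term) > 1:
                nbr = term - {random.choice(sorted(term))}   # drop one literal
            else:
                nbr = term | {random.choice(sorted(full))}   # restore a literal of the seed
            delta = cost(nbr, negatives, ell) - cost(term, negatives, ell)
            if delta <= 0 or random.random() < math.exp(-delta / t):
                term = nbr                                   # Metropolis acceptance
            t *= alpha                                       # geometric cooling
        # Fall back to the always-consistent full conjunction if annealing failed.
        return term if consistent(term, negatives) else full

    def learn_dnf(positives, negatives, ell):
        # One annealed term per positive example: at most m terms in the hypothesis.
        return [anneal_term(p, negatives, ell) for p in positives]

    def classify(hypothesis, x):
        # H(x) = 1 iff some term of the DNF covers x.
        return any(covers(term, x) for term in hypothesis)

In this sketch every term stays a sub-conjunction of its seed's full conjunction, so each positive example is covered by its own term, and the consistency check (with the fallback) keeps negatives uncovered; the hypothesis therefore has zero training error, and after successful annealing its size matches the m · ℓ bound stated in the abstract, with the fallback trading term length for guaranteed consistency.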