There has been much interest in the machine learning and algorithmic game theory communities in understanding and using submodular functions. Despite this substantial interest, little is known about their learnability from data. Motivated by applications such as pricing goods in economics, this paper considers PAC-style learning of submodular functions in a distributional setting. A problem instance consists of a distribution on {0,1}^n and a real-valued function on {0,1}^n that is non-negative, monotone, and submodular. We are given poly(n) samples from this distribution, along with the values of the function at those sample points. The task is to approximate the value of the function to within a multiplicative factor at subsequent sample points drawn from the same distribution, with sufficiently high probability. We develop the first theoretical analysis of this problem, proving a number of important and nearly tight results. For instance, if the underlying distribution is a product distribution, then we give a learning algorithm that achieves a constant-factor approximation (under some assumptions). For general distributions, however, we prove a surprising Ω(n^{1/3}) lower bound based on a new and interesting class of matroids, and we also show an O(n^{1/2}) upper bound. Our work combines central issues in optimization (submodular functions and matroids) with central topics in learning (distributional learning and PAC-style analyses) and with central concepts in pseudorandomness (lossless expander graphs). Our analysis involves a twist on the usual learning-theory models and uncovers some interesting structural and extremal properties of submodular functions, which we suspect will be useful in other contexts. In particular, to prove our general lower bound, we use lossless expanders to construct a new family of matroids whose rank values vary wildly across superpolynomially many sets; no such construction was previously known. This construction reveals unexpected extremal properties of submodular functions.
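To make the learning model concrete, here is a minimal Python sketch of the distributional setup described above: a hypothetical monotone submodular target (a simple coverage function), poly(n) labeled samples drawn from a product distribution, and a check of the multiplicative-approximation goal, i.e. that a hypothesis g satisfies f(x)/alpha <= g(x) <= alpha*f(x) on most fresh samples. All names here are illustrative, and the trivial constant hypothesis is a stand-in, not the paper's algorithm.

    import random

    n = 8
    random.seed(0)
    # Hypothetical ground-truth target: a coverage function, which is
    # non-negative, monotone, and submodular. Coordinate i selects a fixed
    # random subset of a small universe; f(x) = size of the union chosen by x.
    subsets = [set(random.sample(range(20), 5)) for _ in range(n)]

    def f(x):
        # x is a 0/1 tuple of length n
        covered = set()
        for i, bit in enumerate(x):
            if bit:
                covered |= subsets[i]
        return len(covered)

    def sample(p=0.5):
        # One draw from the product distribution on {0,1}^n
        return tuple(int(random.random() < p) for _ in range(n))

    # poly(n) labeled training samples (x, f(x))
    train = [(x, f(x)) for x in (sample() for _ in range(n * n))]

    # Stand-in hypothesis: predict the empirical mean of f on the training
    # set. This is NOT the paper's algorithm; it only illustrates the
    # multiplicative-approximation criterion being tested below.
    g = sum(v for _, v in train) / len(train)

    alpha = 3.0  # target multiplicative factor
    test = [sample() for _ in range(1000)]
    good = sum(1 for x in test if f(x) <= alpha * g and g <= alpha * f(x))
    print(f"within factor {alpha} on {good / len(test):.1%} of fresh samples")

Even this trivial hypothesis succeeds on almost all draws in this toy instance, because the coverage value concentrates around its mean under a product distribution; that concentration phenomenon is the kind of structure a constant-factor guarantee for product distributions can exploit, though the sketch is of course no substitute for the paper's analysis.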