On the Complexity of Learning Lexicographic Strategies
The Journal of Machine Learning Research
Democratic approximation of lexicographic preference models
Proceedings of the 25th international conference on Machine learning
An Efficient Upper Approximation for Conditional Preference
Proceedings of ECAI 2006: 17th European Conference on Artificial Intelligence, August 29 – September 1, 2006, Riva del Garda, Italy
The complexity of learning separable ceteris paribus preferences
IJCAI'09 Proceedings of the 21st international joint conference on Artificial Intelligence
Ceteris Paribus preference elicitation with predictive guarantees
IJCAI'09 Proceedings of the 21st international joint conference on Artificial Intelligence
Learning conditional preference networks with queries
IJCAI'09 Proceedings of the 21st international joint conference on Artificial Intelligence
Preferences in AI: An overview
Artificial Intelligence
Aggregating conditionally lexicographic preferences on multi-issue domains
CP'12 Proceedings of the 18th international conference on Principles and Practice of Constraint Programming
We consider the problem of learning a user's ordinal preferences on a multiattribute domain, assuming that her preferences are lexicographic. We introduce a general graphical representation, called LP-trees, which captures various natural classes of such preference relations, depending on whether the importance order between attributes and/or the local preferences on the domain of each attribute are conditional on the values of other attributes. For each class we determine the Vapnik-Chervonenkis dimension, the communication complexity of preference elicitation, and the complexity of identifying a model in the class consistent with a set of user-provided examples.
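To make the lexicographic model concrete, the following sketch shows the simplest class the abstract mentions: an unconditional importance order over attributes with an unconditional local preference on each attribute's domain. Two outcomes are compared on the most important attribute where they differ. All names and the data layout here are illustrative assumptions, not from the paper; conditional LP-trees generalize this by letting the importance order and local preferences depend on the values of ancestor attributes.

```python
def lex_compare(a, b, importance, local_pref):
    """Compare two outcomes lexicographically.

    Illustrative sketch (not the paper's formalism):
      importance: list of attribute names, most important first.
      local_pref: dict mapping each attribute to its values, best first.
    Returns -1 if outcome a is preferred, 1 if b is preferred, 0 if equal.
    """
    for attr in importance:
        if a[attr] != b[attr]:
            # The most important attribute on which the outcomes
            # differ decides the comparison.
            rank = local_pref[attr].index
            return -1 if rank(a[attr]) < rank(b[attr]) else 1
    return 0

# Example: binary attributes with importance X > Y > Z,
# and value 1 locally preferred to 0 on every attribute.
importance = ["X", "Y", "Z"]
local_pref = {attr: [1, 0] for attr in importance}
a = {"X": 1, "Y": 0, "Z": 0}
b = {"X": 0, "Y": 1, "Z": 1}
print(lex_compare(a, b, importance, local_pref))  # -1: a wins on X alone
```

Note that a is preferred even though b has the better value on both Y and Z: under a lexicographic model, no combination of less important attributes can outweigh the most important differing one.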