The Influence of Social Dependencies on Decision-Making: Initial Investigations with a New Game. In Proceedings of the Third International Joint Conference on Autonomous Agents and Multiagent Systems (AAMAS '04), Volume 2.
Reputation in the joint venture game. In Proceedings of the 6th International Joint Conference on Autonomous Agents and Multiagent Systems.
Modeling how humans reason about others with partial information. In Proceedings of the 7th International Joint Conference on Autonomous Agents and Multiagent Systems, Volume 1.
Simultaneously modeling humans' preferences and their beliefs about others' preferences. In Proceedings of the 7th International Joint Conference on Autonomous Agents and Multiagent Systems, Volume 1.
Learning social preferences in games. In Proceedings of the 19th National Conference on Artificial Intelligence (AAAI '04).
Modeling reciprocal behavior in human bilateral negotiation. In Proceedings of the 22nd National Conference on Artificial Intelligence (AAAI '07), Volume 1.
With increasing frequency, computer agents participate in collaborative and competitive multiagent domains in which humans reason strategically to make decisions. Deploying computer agents in such domains requires that they model human behavior well enough to interact successfully with people: the agents must be sensitive both to how people reason in strategic settings and to the social utilities people employ to inform that reasoning. To date, these design requirements have received relatively little attention. To further research in this area, we are developing the Colored Trails (CT) testbed [5], a configurable and extensible open-source system for use by the research community at large to investigate multiagent decision making.
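As background, the core mechanics of a Colored Trails game — players occupy squares on a board of colored squares, hold colored chips, and spend a chip matching a square's color to move onto it — can be sketched minimally. This is an illustrative sketch only; all class and function names here are hypothetical and do not reflect the actual CT testbed API.

```python
import random
from dataclasses import dataclass

COLORS = ["red", "green", "blue", "yellow"]

@dataclass
class Player:
    name: str
    position: tuple   # (row, col) on the board
    chips: dict       # color -> number of chips held

def make_board(rows, cols, rng):
    # Each square on the board carries a single color.
    return [[rng.choice(COLORS) for _ in range(cols)] for _ in range(rows)]

def can_move(board, player, dest):
    # Moving onto a square requires a chip of that square's color.
    r, c = dest
    return player.chips.get(board[r][c], 0) > 0

def move(board, player, dest):
    if not can_move(board, player, dest):
        raise ValueError("player lacks a chip of the destination color")
    player.chips[board[dest[0]][dest[1]]] -= 1  # spend the chip
    player.position = dest

# Tiny example: a 4x4 board and one player holding two chips of each color.
rng = random.Random(0)
board = make_board(4, 4, rng)
p = Player("alice", (0, 0), {color: 2 for color in COLORS})
move(board, p, (0, 1))  # succeeds: the player holds chips of every color
```

In the full game, chip allocations are typically scarce, so players must negotiate chip exchanges — which is what makes CT a useful testbed for studying strategic and social reasoning.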