In the Bayesian probabilistic approach to uncertain reasoning, one basic assumption is that a priori knowledge about the uncertain variable is modeled by a probability distribution. When new evidence representable by a constant (non-random) set becomes available, Bayesian conditioning is used to update the a priori knowledge. In the conventional Dempster-Shafer (D-S) theory of evidence, all bodies of evidence about the uncertain variable are imprecise and uncertain, and they are combined by the so-called Dempster's rule of combination into a single body of evidence, without regard to a priori knowledge. From our point of view, when identifying the true value of an uncertain variable, the Bayesian approach and evidence theory can cooperate in uncertain reasoning: first, all imprecise and uncertain bodies of evidence about the variable are fused into a combined body of evidence on the basis of the a priori knowledge; then the a posteriori probability distribution is obtained from the a priori distribution by conditioning on the combined evidence. In this paper we first address the knowledge-updating problem, in which a priori knowledge is represented by a probability distribution and new evidence is represented by a random set. We then review the conditional evidence theory, which solves the knowledge-combining problem on the basis of a priori probabilistic knowledge. Finally, we discuss the close relationship between the knowledge-updating and knowledge-combining procedures presented in this paper, and show that the a posteriori probability conditioned on the fused body of evidence satisfies the Bayesian parallel combination rule.
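The two ingredients contrasted above can be sketched in code: Dempster's rule of combination for fusing mass functions, and Bayesian conditioning of a prior on a crisp set. This is a minimal illustrative sketch, not the conditional evidence theory developed in the paper; the function names, the dict-of-frozensets representation, and the example frame are assumptions introduced here for illustration.

```python
from itertools import product

def dempster_combine(m1, m2):
    """Dempster's rule of combination for two mass functions.

    A mass function is a dict mapping frozenset focal elements to
    masses summing to 1. Products of masses whose focal elements have
    empty intersection form the conflict K; the rest is renormalized
    by 1 - K.
    """
    combined, conflict = {}, 0.0
    for (a, x), (b, y) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + x * y
        else:
            conflict += x * y
    if conflict >= 1.0:
        raise ValueError("total conflict: Dempster's rule is undefined")
    return {s: v / (1.0 - conflict) for s, v in combined.items()}

def bayes_condition(prior, event):
    """Bayesian conditioning of a prior distribution on a crisp event.

    `prior` maps outcomes to probabilities; `event` is the observed set.
    Probability mass outside the event is set to zero and the rest is
    renormalized.
    """
    norm = sum(p for w, p in prior.items() if w in event)
    return {w: (p / norm if w in event else 0.0) for w, p in prior.items()}

# Example on a three-element frame (hypothetical numbers):
theta = frozenset({"r", "g", "b"})
m1 = {frozenset({"r"}): 0.6, theta: 0.4}
m2 = {frozenset({"r", "g"}): 0.7, theta: 0.3}
fused = dempster_combine(m1, m2)

prior = {"r": 1 / 3, "g": 1 / 3, "b": 1 / 3}
posterior = bayes_condition(prior, {"r", "g"})
```

In this example the two mass functions are non-conflicting, so the fused masses simply sum the overlapping products; conditioning the uniform prior on the crisp event {r, g} zeroes out b and renormalizes the remaining mass.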