Previous research has indicated the importance of accurately classifying fluorescence in situ hybridisation (FISH) signals for the detection of genetic abnormalities. Based on well-discriminating features and a trainable neural network (NN) classifier, a previous system enabled highly accurate classification of valid signals and artefacts of two fluorophores. However, since several of the features employed by this system are considered independent, the naive Bayesian classifier (NBC) is suggested here as an alternative to the NN. The NBC independence assumption permits the decomposition of the high-dimensional likelihood of the model into a product of one-dimensional probability densities. Together with the Bayesian methodology, this assumption allows the NBC to predict a posteriori probabilities of class membership from estimated class-conditional densities in a closed and simple form. Since the probability densities are the only parameters of the NBC, the misclassification rate of the model is determined exclusively by the quality of density estimation. Densities are estimated by three methods: single Gaussian estimation (SGE; a parametric method), a Gaussian mixture model assuming spherical covariance matrices (GMM; a semi-parametric method) and kernel density estimation (KDE; a non-parametric method). For low-dimensional densities, the GMM generally outperforms the KDE, which tends to overfit the training set at the cost of reduced generalisation capability. For higher-dimensional densities, however, it is the GMM that loses accuracy, because adding dependent features to the set violates the assumption of spherical covariance matrices. Compared with these two methods, the SGE provides inferior, and the NN superior, performance. However, the NBC avoids the intensive training and optimisation required for the NN, which demand extensive resources and experimentation. Therefore, by supporting both classifiers, the system enables a trade-off between the performance of the NN and the simplicity of implementation of the NBC.
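The decomposition described above — the joint likelihood factored into a product of one-dimensional class-conditional densities, each estimated by a pluggable method — can be sketched as follows. This is a minimal illustration, not the authors' implementation; the class name `NaiveBayes` and the `density` parameter are assumptions, and only the SGE (single Gaussian per feature) and KDE variants are shown, using `scipy.stats` estimators in place of whatever estimators the original system used.

```python
import numpy as np
from scipy.stats import norm, gaussian_kde

class NaiveBayes:
    """Naive Bayesian classifier with pluggable one-dimensional
    class-conditional density estimators: 'sge' (single Gaussian
    estimation, parametric) or 'kde' (kernel density estimation,
    non-parametric). Hypothetical sketch of the technique in the text."""

    def __init__(self, density="sge"):
        self.density = density

    def fit(self, X, y):
        X, y = np.asarray(X, float), np.asarray(y)
        self.classes_ = np.unique(y)
        self.priors_, self.models_ = {}, {}
        for c in self.classes_:
            Xc = X[y == c]
            self.priors_[c] = len(Xc) / len(X)
            if self.density == "sge":
                # Parametric: fit one Gaussian per feature (mean, std).
                self.models_[c] = list(zip(Xc.mean(0), Xc.std(0) + 1e-9))
            else:
                # Non-parametric: one kernel density estimate per feature.
                self.models_[c] = [gaussian_kde(Xc[:, j])
                                   for j in range(X.shape[1])]
        return self

    def _log_likelihood(self, x, c):
        # Independence assumption: the high-dimensional likelihood
        # decomposes into a product of 1-D densities (a sum of logs).
        if self.density == "sge":
            return sum(norm.logpdf(xj, m, s)
                       for xj, (m, s) in zip(x, self.models_[c]))
        return sum(np.log(kde(xj)[0] + 1e-300)
                   for xj, kde in zip(x, self.models_[c]))

    def predict(self, X):
        # A posteriori class scores: log prior + log likelihood.
        scores = np.array([[np.log(self.priors_[c]) + self._log_likelihood(x, c)
                            for c in self.classes_] for x in np.asarray(X, float)])
        return self.classes_[scores.argmax(axis=1)]
```

Swapping the per-feature estimator is the only change needed to move between the parametric and non-parametric variants, which is what makes the NBC's misclassification rate depend exclusively on density-estimation quality.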