Learning Patterns in Multidimensional Space Using Interval Algebra

  • Authors:
  • A. Osmani

  • Affiliations:
  • -

  • Venue:
  • AIMSA '02 Proceedings of the 10th International Conference on Artificial Intelligence: Methodology, Systems, and Applications
  • Year:
  • 2002


Abstract

In this paper we propose a machine learning formalism based on generalized intervals. This formalism may be used to diagnose breakdown situations in telecommunication networks. The main task is to discover significant temporal patterns in the large databases generated by the monitoring system. In this kind of application, time duration is relevant to the alarm identification process. The shapes of the decision boundaries are usually axis-parallel with constraints. The representation of examples in our formalism is similar to the representation described in the Nested Generalized Exemplar theory [Sal91]. This theory produces excellent generalization with interpretable hypotheses [WD95] in domains where the decision boundaries are axis-parallel. Using Allen's qualitative relations between intervals, we first give an adapted organization of the set of relations, then define a generalization operator and give a table of qualitative generalization. Finally, we suggest two learning algorithms; the second uses a topological lattice over the relations to optimize the first.
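The abstract builds on Allen's 13 qualitative relations between intervals. As an illustrative sketch (not the paper's formalism), the following hypothetical function classifies the relation holding between two intervals given by their endpoints:

```python
def allen_relation(a, b):
    """Return the Allen relation between intervals a = (a1, a2) and
    b = (b1, b2), assuming a1 < a2 and b1 < b2.

    One of Allen's 13 relations always holds between two intervals:
    before/after, meets/met-by, overlaps/overlapped-by, starts/started-by,
    during/contains, finishes/finished-by, or equals.
    """
    a1, a2 = a
    b1, b2 = b
    if a2 < b1:
        return "before"
    if b2 < a1:
        return "after"
    if a2 == b1:
        return "meets"
    if b2 == a1:
        return "met-by"
    if a1 == b1 and a2 == b2:
        return "equals"
    if a1 == b1:
        return "starts" if a2 < b2 else "started-by"
    if a2 == b2:
        return "finishes" if a1 > b1 else "finished-by"
    if b1 < a1 and a2 < b2:
        return "during"
    if a1 < b1 and b2 < a2:
        return "contains"
    # Remaining cases: the intervals partially overlap.
    return "overlaps" if a1 < b1 else "overlapped-by"
```

For example, `allen_relation((0, 4), (2, 6))` yields `"overlaps"`, and `allen_relation((2, 3), (1, 5))` yields `"during"`. The paper's generalization operator and topological lattice are defined over this set of relations.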