A Nearest Hyperrectangle Learning Method
Machine Learning
Elements of machine learning
Approximating hyper-rectangles: learning and pseudo-random sets
STOC '97 Proceedings of the twenty-ninth annual ACM symposium on Theory of computing
Modeling and simulating breakdown situations in telecommunication networks
IEA/AIE '99 Proceedings of the 12th international conference on Industrial and engineering applications of artificial intelligence and expert systems: multiple approaches to intelligent systems
Maintaining knowledge about temporal intervals
Communications of the ACM
Discovery of Temporal Patterns. Learning Rules about the Qualitative Behaviour of Time Series
PKDD '01 Proceedings of the 5th European Conference on Principles of Data Mining and Knowledge Discovery
Version spaces: a candidate elimination approach to rule learning
IJCAI'77 Proceedings of the 5th international joint conference on Artificial intelligence - Volume 1
Hypercuboid-formation behaviour of two learning algorithms
IJCAI'87 Proceedings of the 10th international joint conference on Artificial intelligence - Volume 1
A new proof of tractability for ORD-Horn relations
AAAI'96 Proceedings of the thirteenth national conference on Artificial intelligence - Volume 1
In this paper we propose a machine learning formalism based on generalized intervals. This formalism may be used to diagnose breakdown situations in telecommunication networks, where the main task is to discover significant temporal patterns in the large databases generated by the monitoring system. In this kind of application, time duration is relevant to the alarm identification process, and the decision boundaries are usually axis-parallel with constraints. The representation of examples in our formalism is similar to that of the Nested Generalized Exemplar theory [Sal91]. This theory produces excellent generalization with interpretable hypotheses [WD95] in domains where the decision boundaries are axis-parallel. Using Allen's qualitative relations between intervals, we first give an adapted organization of the set of relations, then define a generalization operator and give a table of qualitative generalization. Finally, we propose two learning algorithms; the second uses a topological lattice over the relations to optimize the first.
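To make the building blocks concrete, here is a minimal sketch (not the paper's formalism) of the two ingredients the abstract names: computing Allen's thirteen qualitative relations between two closed intervals, and an NGE-style axis-parallel generalization that returns the smallest interval covering two examples. Function names and the tuple encoding of intervals are illustrative assumptions.

```python
# Illustrative sketch: Allen's 13 interval relations and a simple
# axis-parallel (NGE-style) generalization. Intervals are (start, end) tuples.

def allen_relation(a, b):
    """Return Allen's qualitative relation of interval a with respect to b."""
    a1, a2 = a
    b1, b2 = b
    if a2 < b1:
        return "before"
    if b2 < a1:
        return "after"
    if a2 == b1:
        return "meets"
    if b2 == a1:
        return "met-by"
    if a1 == b1 and a2 == b2:
        return "equals"
    if a1 == b1:
        return "starts" if a2 < b2 else "started-by"
    if a2 == b2:
        return "finishes" if a1 > b1 else "finished-by"
    if b1 < a1 and a2 < b2:
        return "during"
    if a1 < b1 and b2 < a2:
        return "contains"
    # Remaining cases: the intervals properly overlap.
    return "overlaps" if a1 < b1 else "overlapped-by"

def generalize(a, b):
    """Smallest interval covering both examples (axis-parallel generalization)."""
    return (min(a[0], b[0]), max(a[1], b[1]))
```

For instance, `allen_relation((1, 3), (2, 5))` yields `"overlaps"`, and `generalize((1, 3), (2, 5))` yields the covering interval `(1, 5)`; the paper's generalization operator works on qualitative relations rather than raw endpoints, so this is only a numeric analogue.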