Using string kernels, languages can be represented as hyperplanes in a high-dimensional feature space. We present a new family of grammatical inference algorithms based on this idea. We demonstrate that some mildly context-sensitive languages can be represented in this way, and that they can be learned efficiently using kernel PCA. We present experiments demonstrating the effectiveness of this approach on standard examples of context-sensitive languages, using small synthetic data sets.
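The core idea above can be sketched in a few lines: map each string to a feature vector, then use PCA to find the directions of (near-)zero variance in the training sample; those directions are the normals of hyperplanes containing every example, and membership reduces to checking whether a new string's feature vector lies on those hyperplanes. The feature map and helper names below are illustrative assumptions, not the paper's implementation: a toy Parikh-style map (letter counts) stands in for a full string kernel, shown on the context-free example a^n b^n, where every sample satisfies #a = #b.

```python
import numpy as np

def phi(s):
    # Hypothetical toy feature map: letter counts (a Parikh-style vector).
    # The paper's approach would use a richer string-kernel feature space.
    return np.array([s.count('a'), s.count('b')], dtype=float)

def learn_hyperplane(samples, tol=1e-9):
    # Stack feature vectors and center them.
    X = np.stack([phi(s) for s in samples])
    mean = X.mean(axis=0)
    # PCA via SVD of the centered data: right singular vectors with
    # ~zero singular value span the null directions, i.e. normals of
    # hyperplanes that contain every training sample.
    _, svals, Vt = np.linalg.svd(X - mean, full_matrices=True)
    svals = np.concatenate([svals, np.zeros(Vt.shape[0] - len(svals))])
    normals = Vt[svals < tol]
    offsets = normals @ mean
    return normals, offsets

def in_hyperplane(s, normals, offsets, tol=1e-9):
    # A string is accepted if its feature vector satisfies every
    # learned linear constraint.
    return bool(np.all(np.abs(normals @ phi(s) - offsets) < tol))

# Positive samples from a^n b^n: all satisfy #a - #b = 0.
train = ['ab', 'aabb', 'aaabbb', 'aaaabbbb']
normals, offsets = learn_hyperplane(train)
print(in_hyperplane('aaaaabbbbb', normals, offsets))  # True: #a == #b
print(in_hyperplane('aab', normals, offsets))         # False: #a != #b
```

With this two-dimensional feature map the learner can only recover the constraint #a = #b (so it also accepts strings like 'ba'); the point of the kernel approach is that richer subsequence features let such hyperplanes pin down more structure, including some mildly context-sensitive languages.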