Parsing-Experimente
Head-driven statistical models for natural language parsing
A non-projective dependency parser
ANLC '97 Proceedings of the fifth conference on Applied natural language processing
EACL '99 Proceedings of the Ninth Conference of the European Chapter of the Association for Computational Linguistics
Robust German noun chunking with a probabilistic context-free grammar
COLING '00 Proceedings of the 18th conference on Computational linguistics - Volume 2
Robust parsing with weighted constraints
Natural Language Engineering
A stochastic topological parser for German
COLING '02 Proceedings of the 19th international conference on Computational linguistics - Volume 1
Probabilistic parsing for German using sister-head dependencies
ACL '03 Proceedings of the 41st Annual Meeting of the Association for Computational Linguistics - Volume 1
Annotation strategies for probabilistic parsing in German
COLING '04 Proceedings of the 20th international conference on Computational Linguistics
A dependency-based method for evaluating broad-coverage parsers
IJCAI'95 Proceedings of the 14th international joint conference on Artificial intelligence - Volume 2
Acceptability prediction by means of grammaticality quantification
ACL-44 Proceedings of the 21st International Conference on Computational Linguistics and the 44th annual meeting of the Association for Computational Linguistics
Hybrid parsing: using probabilistic models as predictors for a symbolic parser
ACL-44 Proceedings of the 21st International Conference on Computational Linguistics and the 44th annual meeting of the Association for Computational Linguistics
The benefit of stochastic PP attachment to a rule-based parser
COLING-ACL '06 Proceedings of the COLING/ACL on Main conference poster sessions
Constraint-based Modeling and Ambiguity
International Journal of Artificial Intelligence in Education
We present a parser for German that achieves competitive accuracy on unrestricted input while maintaining 100% coverage. Well-formedness rules are written as declarative, defeasible constraints that integrate different sources of linguistic knowledge, yielding very high robustness against all kinds of extragrammatical constructions.
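The idea of defeasible (soft) well-formedness constraints can be sketched as follows: each constraint carries a violation penalty rather than acting as a hard filter, and the least-penalized candidate analysis is always returned, which is what guarantees 100% coverage. This is a minimal illustrative sketch, not the paper's implementation; all constraint names, weights, and candidate fields are assumptions.

```python
# Minimal sketch of weighted, defeasible well-formedness constraints.
# Illustrative only: names, weights, and candidate fields are assumed,
# not taken from the paper's actual constraint grammar.
from dataclasses import dataclass
from typing import Callable, Dict, List


@dataclass
class Constraint:
    name: str
    weight: float                        # penalty when violated (soft, not hard)
    holds: Callable[[Dict], bool]        # check against a candidate analysis


def score(candidate: Dict, constraints: List[Constraint]) -> float:
    """Sum penalties of all violated constraints; lower is better."""
    return sum(c.weight for c in constraints if not c.holds(candidate))


def best_parse(candidates: List[Dict], constraints: List[Constraint]) -> Dict:
    # Every input receives *some* analysis: even when all constraints are
    # violated, the least-penalized candidate is still returned, so
    # extragrammatical input never causes a coverage failure.
    return min(candidates, key=lambda cand: score(cand, constraints))


# Two hypothetical constraints integrating different knowledge sources.
constraints = [
    Constraint("subject-verb agreement", 2.0, lambda c: c.get("agreement", True)),
    Constraint("projectivity preferred", 0.5, lambda c: c.get("projective", True)),
]

# Two hypothetical candidate analyses for one sentence.
candidates = [
    {"id": 1, "agreement": True, "projective": False},   # penalty 0.5
    {"id": 2, "agreement": False, "projective": True},   # penalty 2.0
]

print(best_parse(candidates, constraints)["id"])  # -> 1
```

Because the constraints only penalize rather than reject, an analysis that violates a weak preference (here, projectivity) can still win over one that violates a strong grammatical rule, which is how robustness against extragrammatical input falls out of the ranking.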