CoNLL 2017 Shared Task: Multilingual Parsing from Raw Text to Universal Dependencies

Sanguinetti, Manuela; Simi, Maria; Stella, Antonio
2017-01-01

Abstract

The Conference on Computational Natural Language Learning (CoNLL) features a shared task, in which participants train and test their learning systems on the same data sets. In 2017, one of two tasks was devoted to learning dependency parsers for a large number of languages, in a real-world setting without any gold-standard annotation on input. All test sets followed a unified annotation scheme, namely that of Universal Dependencies. In this paper, we define the task and evaluation methodology, describe data preparation, report and analyze the main results, and provide a brief categorization of the different approaches of the participating systems.
Year: 2017
Venue: Vancouver, Canada, 3-4 August 2017
Published in: Proceedings of the CoNLL 2017 Shared Task: Multilingual Parsing from Raw Text to Universal Dependencies
Publisher: Association for Computational Linguistics
Pages: 1-19
ISBN: 978-1-945626-70-8
URL: http://www.aclweb.org/anthology/K/K17/K17-3001.pdf
Keywords: Universal Dependencies, parsing, dependency syntax, evaluation
Authors: Zeman, Daniel; Popel, Martin; Straka, Milan; Hajic, Jan; Nivre, Joakim; Ginter, Filip; Luotolahti, Juhani; Pyysalo, Sampo; Petrov, Slav; Potthast, Mar...
Files in this record:
K17-3001.pdf — open access, publisher's PDF (Adobe PDF), 252.03 kB

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/2318/1652589
Citations
  • Scopus: 261