Title Tools Impact on the Quality of Annotations for Chat Untangling
Authors Jhonny Cerezo, Alexandre Bergel, Felipe Bravo-Marquez
Publication date 2021
Abstract The quality of annotated data directly influences the success of supervised NLP models. However, creating annotated datasets is often time-consuming and expensive. Although the annotation tool plays an important role, we know little about how it influences annotation quality. We compare the quality of annotations for the task of chat untangling made by non-expert annotators using two different tools. The first is SLATE, an existing command-line-based tool, and the second is Parlay, a new tool we developed that integrates mouse interaction and visual links. Our experimental results indicate that, while both tools perform similarly in terms of annotation quality, Parlay offers a significantly better user experience.
Pages 215-220
Conference name Conference of the Association for Computational Linguistics: Student Research Workshop
Publisher Association for Computational Linguistics