Abstract
Søgaard (2020) obtained results suggesting that the fraction of trees in the test data that are isomorphic to trees in the training set accounts for a non-trivial amount of the variation in parser performance. As with other statistical analyses in NLP, the results were based on evaluating linear regressions. However, the study had methodological issues and used a small sample size, leading to unreliable results. We present a replication study in which we also bin sentences by length and find that only a small subset of sentences vary in performance with respect to graph isomorphism. Further, the correlation observed between parser performance and graph isomorphism in the wild disappears when controlling for covariants. However, in a controlled experiment, where covariants are kept fixed, we do observe a strong correlation. We suggest that conclusions drawn from statistical analyses like this need to be tempered and that controlled experiments can complement them by more readily teasing factors apart.
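The abstract refers to two analysis steps: measuring how many test trees are isomorphic to some training tree, and fitting linear regressions of parser performance on that fraction while controlling for covariants. Below is a minimal, hypothetical sketch of that kind of analysis, assuming dependency trees are represented as networkx graphs and using synthetic data; the helper names (random_tree, isomorphic_fraction) and the toy numbers are illustrative and are not the authors' actual code or results.

```python
import networkx as nx
import numpy as np
import statsmodels.api as sm

def random_tree(n, rng):
    """Build a random tree by attaching each new node to a random earlier node (toy stand-in for a dependency tree)."""
    g = nx.Graph()
    g.add_node(0)
    for i in range(1, n):
        g.add_edge(i, int(rng.integers(0, i)))
    return g

def isomorphic_fraction(test_trees, train_trees):
    """Fraction of test trees isomorphic (as unlabelled graphs) to at least one training tree."""
    hits = sum(1 for t in test_trees if any(nx.is_isomorphic(t, s) for s in train_trees))
    return hits / len(test_trees)

rng = np.random.default_rng(0)

# Toy "training" and "test" splits of dependency trees.
train_trees = [random_tree(int(n), rng) for n in rng.integers(3, 9, 50)]
test_trees = [random_tree(int(n), rng) for n in rng.integers(3, 9, 20)]
print("isomorphic fraction:", isomorphic_fraction(test_trees, train_trees))

# Synthetic per-treebank data: regress parser performance (e.g. LAS) on the
# isomorphism fraction while controlling for a covariate such as mean sentence length.
iso_frac = rng.uniform(0.0, 1.0, 30)
mean_len = rng.uniform(10.0, 30.0, 30)
las = 70.0 + 5.0 * iso_frac - 0.5 * mean_len + rng.normal(0.0, 1.0, 30)
X = sm.add_constant(np.column_stack([iso_frac, mean_len]))
print(sm.OLS(las, X).fit().summary())
```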
Original language | English |
---|---|
Title | Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 2: Short Papers) |
Publisher | Association for Computational Linguistics |
Publication date | 2021 |
Pages | 1090-1098 |
ISBN (electronic) | 9781954085527 |
Status | Published - 2021 |
Event | Joint Conference of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing, ACL-IJCNLP 2021 - Virtual, Online. Duration: 1 Aug 2021 → 6 Aug 2021 |
Conference
Conference | Joint Conference of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing, ACL-IJCNLP 2021 |
---|---|
City | Virtual, Online |
Period | 01/08/2021 → 06/08/2021 |
Sponsors | Amazon Science, Apple, Bloomberg Engineering, et al., Facebook AI, Google Research |
Bibliographic note
Publisher copyright: © 2021 Association for Computational Linguistics.