Abstract
Standard methods for multi-label text classification largely rely on encoder-only pre-trained language models, whereas encoder-decoder models have proven more effective in other classification tasks. In this study, we compare four methods for multi-label classification: two based on an encoder only and two based on an encoder-decoder. We carry out experiments on four datasets (two in the legal domain and two in the biomedical domain, each with two levels of label granularity) and always start from the same pre-trained model, T5. Our results show that encoder-decoder methods outperform encoder-only methods, with a growing advantage on more complex datasets and on labeling schemes of finer granularity. In particular, using encoder-decoder models in a non-autoregressive fashion yields the best performance overall, so we further study this approach through ablations to better understand its strengths.
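The non-autoregressive use of an encoder-decoder can be pictured as a single decoder pass that scores every label at once, instead of generating label strings token by token. The sketch below is a minimal illustration of that idea in PyTorch with Hugging Face `transformers`; it is not the paper's implementation, and the class name, the choice of decoder inputs, and the linear head are illustrative assumptions.

```python
import torch
import torch.nn as nn
from transformers import T5Model


class NonAutoregressiveT5Classifier(nn.Module):
    """Hypothetical sketch: score all labels in one decoder pass.

    The decoder is fed one fixed input position per label (here simply
    the decoder start token repeated), and each position's hidden state
    is projected to a binary logit for the corresponding label.
    """

    def __init__(self, model_name: str = "t5-base", num_labels: int = 100):
        super().__init__()
        self.t5 = T5Model.from_pretrained(model_name)
        self.num_labels = num_labels
        # One decoder position per label, all filled with T5's decoder
        # start token (the pad token in T5's vocabulary).
        self.register_buffer(
            "decoder_input_ids",
            torch.full(
                (1, num_labels),
                self.t5.config.decoder_start_token_id,
                dtype=torch.long,
            ),
        )
        # Per-position binary classifier over decoder hidden states.
        self.head = nn.Linear(self.t5.config.d_model, 1)

    def forward(self, input_ids, attention_mask):
        batch = input_ids.size(0)
        out = self.t5(
            input_ids=input_ids,
            attention_mask=attention_mask,
            decoder_input_ids=self.decoder_input_ids.expand(batch, -1),
        )
        # (batch, num_labels, d_model) -> (batch, num_labels) logits;
        # at inference, apply a sigmoid and threshold per label.
        return self.head(out.last_hidden_state).squeeze(-1)
```

Under these assumptions, such a model would be trained with a per-label binary cross-entropy loss (e.g. `nn.BCEWithLogitsLoss`) over the logits, which is what makes the single decoder pass sufficient: labels are predicted independently rather than as an autoregressive sequence.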
| Original language | English |
|---|---|
| Title | Findings of the Association for Computational Linguistics, ACL 2023 |
| Number of pages | 1 |
| Publisher | Association for Computational Linguistics (ACL) |
| Publication date | 2023 |
| Pages | 5828-5843 |
| ISBN (electronic) | 9781959429623 |
| DOI | |
| Status | Published - 2023 |
| Event | 61st Annual Meeting of the Association for Computational Linguistics, ACL 2023 - Toronto, Canada. Duration: 9 Jul 2023 → 14 Jul 2023 |
Conference
| Conference | 61st Annual Meeting of the Association for Computational Linguistics, ACL 2023 |
|---|---|
| Country/Territory | Canada |
| City | Toronto |
| Period | 09/07/2023 → 14/07/2023 |
| Sponsor | Bloomberg Engineering, et al., Google, Liveperson, Meta, Microsoft |
| Name | Proceedings of the Annual Meeting of the Association for Computational Linguistics |
|---|---|
| ISSN | 0736-587X |
Bibliographic note
Funding Information: We thank our colleagues at the CoAStaL NLP Lab and the anonymous reviewers for their feedback. This work was fully funded by the Innovation Fund Denmark (IFD).
Publisher Copyright:
© 2023 Association for Computational Linguistics.