Abstract
In this paper, we propose a novel, lightweight relation extraction approach based on structural-block-driven convolutional neural learning. Specifically, we use dependency analysis to detect the essential sequential tokens associated with the entities, referred to as a structural block, and encode only the blocks, at both the block-wise and inter-block-wise levels, using multi-scale Convolutional Neural Networks (CNNs). This serves to (1) eliminate the noise from irrelevant parts of a sentence, and (2) enrich the relevant block representation with both block-wise and inter-block-wise semantic information. Our method has the advantage of being independent of long sentence context, since we encode only the sequential tokens within a block boundary. Experiments on two datasets, SemEval2010 and KBP37, demonstrate the significant advantages of our method. In particular, we achieve new state-of-the-art performance on the KBP37 dataset, and performance comparable to the state-of-the-art on the SemEval2010 dataset.
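To make the encoding step concrete, the sketch below illustrates one plausible way to encode a structural block (the sub-sequence of tokens tied to the entities by dependency analysis) with multi-scale 1-D convolutions. This is not the authors' implementation; all names, dimensions, and kernel sizes are illustrative assumptions.

```python
# Minimal sketch (not the paper's implementation) of multi-scale CNN encoding
# over a structural block. Hyperparameters here are assumed for illustration.
import torch
import torch.nn as nn


class MultiScaleBlockEncoder(nn.Module):
    def __init__(self, embed_dim=100, num_filters=64, kernel_sizes=(2, 3, 4)):
        super().__init__()
        # One Conv1d per kernel size captures n-gram features at several scales.
        self.convs = nn.ModuleList(
            nn.Conv1d(embed_dim, num_filters, k, padding=k // 2)
            for k in kernel_sizes
        )

    def forward(self, block_embeddings):
        # block_embeddings: (batch, block_len, embed_dim) -- only the tokens
        # inside the block boundary, so long sentence context is ignored.
        x = block_embeddings.transpose(1, 2)           # (batch, embed_dim, block_len)
        pooled = [conv(x).max(dim=2).values for conv in self.convs]
        return torch.cat(pooled, dim=1)                # block-wise representation


# Usage: each entity-related block can be encoded separately, and the resulting
# vectors concatenated into an inter-block-wise feature for relation classification.
encoder = MultiScaleBlockEncoder()
block = torch.randn(1, 7, 100)                         # a 7-token structural block
print(encoder(block).shape)                            # torch.Size([1, 192])
```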
| Original language | English |
|---|---|
| Article number | 105913 |
| Journal | Applied Soft Computing Journal |
| Volume | 86 |
| Number of pages | 9 |
| ISSN | 1568-4946 |
| DOI | |
| Status | Published - 2020 |