Abstract
We tackle the challenge of fusing features from different modalities for multimodal sentiment analysis. Existing approaches, mostly based on neural networks, model multimodal interactions in an implicit and hard-to-interpret manner. We address this limitation by drawing inspiration from quantum theory, which provides principled methods for modeling complicated interactions and correlations. In our quantum-inspired framework, word interactions within a single modality and interactions across modalities are formulated with superposition and entanglement, respectively, at different stages. A complex-valued neural network implementation of the framework achieves results comparable to state-of-the-art systems on two benchmark video sentiment analysis datasets. At the same time, the model directly produces unimodal and bimodal sentiment scores, which help interpret the entangled final decision.
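To make the "superposition" step concrete, the following is a minimal illustrative sketch (not the authors' implementation) of a common quantum-inspired text representation: words are complex-valued unit vectors, a sentence within one modality is their normalized weighted superposition, and a density matrix summarizes the resulting state for downstream measurement. All function names and the dimensionality are assumptions for illustration.

```python
import numpy as np

def normalize(v):
    """Scale a complex vector to unit L2 norm."""
    return v / np.linalg.norm(v)

def word_state(amplitude, phase):
    """Build a complex-valued word state |w> = amplitude * e^{i*phase}, normalized."""
    return normalize(amplitude * np.exp(1j * phase))

def superpose(states, weights):
    """Intra-modality interaction: weighted superposition of word states."""
    psi = sum(w * s for w, s in zip(weights, states))
    return normalize(psi)

def density_matrix(state):
    """rho = |psi><psi| for a pure state; Hermitian with unit trace."""
    return np.outer(state, state.conj())

# Toy example: three words in a 4-dimensional complex space.
rng = np.random.default_rng(0)
dim, n_words = 4, 3
states = [word_state(rng.random(dim), 2 * np.pi * rng.random(dim))
          for _ in range(n_words)]
psi = superpose(states, [0.5, 0.3, 0.2])
rho = density_matrix(psi)
```

The density matrix `rho` is Hermitian with trace 1, so real-valued sentiment features can later be read out as expectation values of measurement operators.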
| Original language | English |
| --- | --- |
| Journal | Information Fusion |
| Volume | 65 |
| Pages (from-to) | 58-71 |
| ISSN | 1566-2535 |
| DOIs | |
| Publication status | Published - 2021 |
Bibliographical note
Publisher Copyright: © 2020 Elsevier B.V.
Keywords
- Machine learning
- Multimodal sentiment analysis
- Quantum theory