Abstract
The recently introduced locally orderless tensor network (LoTeNet) for supervised image classification uses matrix product state (MPS) operations on grids of transformed image patches. The resulting patch representations are combined back into the image space and aggregated hierarchically using multiple MPS blocks per layer to obtain the final decision rules. In this work, we propose a non-patch-based modification to LoTeNet that performs one MPS operation per layer, instead of several patch-level operations. The spatial information in the input images to the MPS blocks at each layer is squeezed into the feature dimension, as in LoTeNet, to maximise the spatial correlation retained between pixels when images are flattened into 1D vectors. The proposed multi-layered tensor network (MLTN) is capable of learning linear decision boundaries in high dimensional spaces in a multi-layered setting, which reduces the computation cost compared to LoTeNet without any degradation in performance.
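The following is a minimal sketch of the two ingredients the abstract describes: squeezing spatial neighbourhoods into the feature dimension before flattening, and a single MPS contraction per layer. It is not the authors' implementation (the official code is at https://github.com/raghavian/mltn); the helper names `squeeze_spatial` and `SimpleMPS`, the bond dimension `D`, the kernel size `k`, and the use of the raw squeezed channels as local features are illustrative assumptions.

```python
import torch


def squeeze_spatial(x, k=2):
    """Fold k x k spatial neighbourhoods into the channel dimension
    (space-to-depth), so that flattening to 1D keeps nearby pixels together."""
    b, c, h, w = x.shape
    x = x.reshape(b, c, h // k, k, w // k, k)
    x = x.permute(0, 1, 3, 5, 2, 4).reshape(b, c * k * k, h // k, w // k)
    return x


class SimpleMPS(torch.nn.Module):
    """One MPS block: contracts a length-N sequence of d-dimensional local
    features through a chain of cores with bond dimension D, producing
    out_dim outputs (linear in the joint tensor-product feature space)."""

    def __init__(self, n_sites, d, D=4, out_dim=4):
        super().__init__()
        self.cores = torch.nn.Parameter(0.1 * torch.randn(n_sites, D, d, D))
        self.left = torch.nn.Parameter(torch.randn(D))
        self.right = torch.nn.Parameter(torch.randn(D, out_dim))

    def forward(self, feats):                 # feats: (batch, n_sites, d)
        msg = self.left.expand(feats.shape[0], -1)            # (batch, D)
        for i in range(feats.shape[1]):
            # Contract the local feature with core i, then pass the bond message along the chain.
            core = torch.einsum('bd,adc->bac', feats[:, i], self.cores[i])
            msg = torch.einsum('ba,bac->bc', msg, core)        # (batch, D)
        return msg @ self.right                                # (batch, out_dim)


# Toy usage: one MLTN-style layer on 16x16 single-channel images.
x = torch.rand(8, 1, 16, 16)
x = squeeze_spatial(x, k=4)                  # (8, 16, 4, 4)
feats = x.flatten(2).permute(0, 2, 1)        # (8, 16 sites, 16 features each)
out = SimpleMPS(n_sites=16, d=16, D=4, out_dim=10)(feats)
print(out.shape)                             # torch.Size([8, 10])
```

In a multi-layered setting, the output of one such block would be reshaped back to a spatial grid and fed to the next squeeze-and-contract layer; the version above only illustrates a single layer.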
Original language | English |
---|---|
Publication date | 2020 |
Number of pages | 6 |
Status | Published - 2020 |
Event | 1st Workshop on Quantum Tensor Networks in Machine Learning: In conjunction with 34th NeurIPS, 2020 - Online. Duration: 11 Dec 2020 → … |
Conference
Conference | 1st Workshop on Quantum Tensor Networks in Machine Learning |
---|---|
City | Online |
Period | 11/12/2020 → … |
Bibliographic note
Accepted to the First Workshop on Quantum Tensor Networks in Machine Learning, in conjunction with the 34th NeurIPS, 2020. Source code at https://github.com/raghavian/mltn
Keywords
- cs.CV
- cs.LG
- stat.ML