Establishing haptic texture attribute space and predicting haptic attributes from image features using 1D-CNN

Waseem Hassan, Joolekha Bibi Joolee, Seokhee Jeon*

*Corresponding author for this work

Research output: Contribution to journal › Journal article › Research › peer-review

3 Citations (Scopus)

Abstract

This study establishes a haptic attribute space in which texture surfaces are located according to their haptic attributes. The aim of the haptic attribute space is to provide a standardized model for representing and identifying haptic textures, analogous to the RGB model for colors. To this end, a four-dimensional haptic attribute space is established through a psychophysical experiment in which human participants rated 100 real-life texture surfaces on their haptic attributes. The four dimensions of the space are rough-smooth, flat-bumpy, sticky-slippery, and hard-soft. Generalization and scalability of the haptic attribute space are achieved by training a 1D-CNN model to predict the attributes of haptic textures. The 1D-CNN is trained on the attribute ratings from the psychophysical experiment together with image features extracted from photographs of the real textures; this predictive capability is what makes the attribute space scalable to new surfaces. The prediction accuracy of the proposed 1D-CNN model is compared against other machine learning and deep learning algorithms, and the results show that the proposed method outperforms the other models on the MAE and RMSE metrics.
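As a rough illustration of the prediction pipeline described in the abstract, the sketch below shows a minimal 1D-CNN regressor in PyTorch that maps a per-texture image-feature vector to the four attribute ratings. This is not the authors' implementation: the feature length (FEATURE_LEN), the layer sizes, and the MSE training loss are illustrative assumptions.

import torch
import torch.nn as nn

FEATURE_LEN = 256  # assumed length of the per-texture image-feature vector

class HapticAttr1DCNN(nn.Module):
    """Minimal sketch: 1D-CNN regressing four haptic attributes
    (rough-smooth, flat-bumpy, sticky-slippery, hard-soft)."""

    def __init__(self, feature_len: int = FEATURE_LEN, n_attrs: int = 4):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv1d(1, 32, kernel_size=5, padding=2),  # features as a 1-channel sequence
            nn.ReLU(),
            nn.MaxPool1d(2),
            nn.Conv1d(32, 64, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),                     # collapse to one value per channel
        )
        self.head = nn.Linear(64, n_attrs)               # one output per haptic attribute

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, feature_len) -> (batch, 1, feature_len) for Conv1d
        z = self.conv(x.unsqueeze(1)).squeeze(-1)        # (batch, 64)
        return self.head(z)                              # (batch, 4)

model = HapticAttr1DCNN()
features = torch.randn(8, FEATURE_LEN)                   # dummy batch of image-feature vectors
pred = model(features)                                   # (8, 4) predicted attribute ratings
loss = nn.functional.mse_loss(pred, torch.randn(8, 4))   # assumed regression loss

Evaluation under this setup would compute MAE and RMSE between the predicted and human-rated attributes on held-out textures, mirroring the model comparison reported in the paper.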

Original language: English
Article number: 11684
Journal: Scientific Reports
Volume: 13
Issue number: 1
ISSN: 2045-2322
DOIs
Publication status: Published - Dec 2023
Externally published: Yes

Bibliographical note

Funding Information:
This research was supported in part by the IITP under the Ministry of Science and ICT, Korea, through the ITRC program (IITP-2023-RS-2022-00156354) and in part by the Preventive Safety Service Technology Development Program funded by the Korean Ministry of the Interior and Safety under Grant 2019-MOIS34-001.

Publisher Copyright:
© 2023, The Author(s).
