Achieving joint perception of an object from multisensory resources: Visually impaired person’s tactile explorations in the context of instructor’s verbal descriptions.

Brian Lystgaard Due, Louise Lüchow, Savi Camilla Drachmann Jakobsen

Research output: Contribution to conference › Conference abstract for conference › Research › peer-review

Abstract

When a visually impaired person (VIP) acquires new technological aids, ICT consultants are often involved in the process of familiarizing the VIP with the new device. The VIP's basic questions are: what is this device, how does it work, what is its material form, what are its functionalities, etc. The activity of familiarizing oneself with the device typically involves instructional sequences with verbal descriptions and embodied explorations. In this paper, we explore the atypical interaction between VIPs and ICT consultants in the context of two new AI technologies: a Google Home speaker and a pair of Envision smart glasses. The paper is based on EMCA and video recordings (Heath et al., 2010; Mondada, 2019) of instructional sequences in VIPs' home environments. The paper shows how participants creatively co-construct an observable understanding of the object's material and functional features based on the participants' use of different sensory resources. Contrary to conceptions of instructional actions as embedded within a sender-receiver model (Shannon & Weaver, 1949), we show how participants monitor each other in minute detail and shift between different sequential organizations: either the instructor produces verbal descriptions of a specific feature and the VIP produces tactile explorations of the technology as a response, or the other way around: the VIP does tactile explorations, and a verbal description of the material feature and its function immediately follows. Although the sequences are different, they are alike with regard to the organization of the base adjacency pair: there is a conditional relevance (Schegloff, 1968) between the ICT consultant's verbal descriptions and the VIP's tactile explorations. Across different situations, types of technologies, and sequential organizations, we thus show a more profound social order in which the participants together build the relevant information about the object in and through distributing perception based on the specifics of particular sensory resources (cf. Due, 2021): the instructor's visual orientation and verbal descriptions, and the VIP's tactile explorations and verbal accounts. We use the examples to discuss how the intertwined nature of the sensory resources and the creative building on each other's distributed perception is vital for accomplishing the activity and thus for establishing the possibility of social inclusion in mundane Activities of Daily Living (ADLs).


Due, B. L. (2021). Distributed Perception: Co-Operation between Sense-Able, Actionable, and Accountable Semiotic Agents. Symbolic Interaction, 44(1), 134–162. https://doi.org/10.1002/symb.538
Heath, C., Hindmarsh, J., & Luff, P. (2010). Video in Qualitative Research. SAGE Publications Ltd.
Mondada, L. (2019). Contemporary issues in conversation analysis: Embodiment and materiality, multimodality and multisensoriality in social interaction. Journal of Pragmatics, 145, 47–62. https://doi.org/10.1016/j.pragma.2019.01.016
Schegloff, E. A. (1968). Sequencing in Conversational Openings. American Anthropologist, 70(6), 1075–1095.
Shannon, C. E., & Weaver, W. (1949). The Mathematical Theory of Communication. University of Illinois Press.
Original language: English
Publication date: 2022
Publication status: Published - 2022
Event: Atypical Interaction Conference 2022 - Newcastle University, Newcastle, United Kingdom
Duration: 27 Jun 2022 – 29 Jun 2022
https://conferences.ncl.ac.uk/aic2022/

Conference

Conference: Atypical Interaction Conference 2022
Location: Newcastle University
Country/Territory: United Kingdom
City: Newcastle
Period: 27/06/2022 – 29/06/2022
Internet address: https://conferences.ncl.ac.uk/aic2022/
