Tactile communication systems such as Tadoma and braille demonstrate that language can be conveyed effectively through the skin. In addition to these systems, which were developed for individuals with hearing and/or visual impairments, there have been a number of attempts over the years to develop tactile vocabularies and displays for general use. Two major challenges in creating these systems have been determining the most effective unit of communication (i.e., character, phoneme, word, concept, tacton) and identifying optimal strategies for training people in their use. Over the past decade, with the growth of haptic technologies and devices, there has been a resurgence of interest in developing tactile communication systems that are easy to learn and retain. This interest is driven in part by advances in wearable technologies and the need to offload the overworked visual and auditory systems. Fundamental questions remain to be answered in developing these tactile communication systems, and these will be the focus of the proposed Cross-cutting Challenge session.
The CCC session will cover the spectrum of issues related to using the skin as a medium of communication. Topics addressed by the speakers will include: how language should be encoded on the skin; whether such encoding should be the same for speech and text; how the location of the display influences the parameters available for communication; whether multisensory cutaneous displays enhance learning and lead to better outcomes; and the challenges in creating refreshable braille displays and other display technologies for individuals who are visually impaired.
Speakers from both within and outside the haptics community will provide an overview of previous work as well as present talks focusing on ongoing research on this topic. Different approaches to conveying language through the skin will be covered, including systems designed to convey textual representations of written English and those based on speech communication.
The interactive sessions will include presentations by researchers working on haptic devices for conveying gestural languages, novel approaches to tactile speech communication, and multisensory interactions between hearing and touch.