Many haptic devices and applications have been developed over the past few years, making it timely to consider where haptic interaction is useful and where it is not. For example, what types of data should be presented through touch, and in what domains? Where does haptics add to the user experience, and where does it detract from it? We would very much welcome submissions from the Haptics-L community. Details of the meeting are below.
Workshop on Haptic and Audio Interaction Design
31st August to 1st September 2006
University of Glasgow
www.dcs.gla.ac.uk/~mcgookdk/multivis/workshop.html
Second call for papers
Submission Deadline FRIDAY 31st MARCH 2006
Overview
Technologies to enable multimodal interaction have matured to a level where we can now think about design issues rather than just technology development or proofs of concept. Haptic (tactile and force-feedback), auditory (speech and non-speech) and multimodal interfaces provide novel methods for users to interact with computer systems. Such interactions can provide benefits for all users, with particular advantages for visually impaired people and for mobile device users who might have limited access to standard visual displays. However, how can we design effective haptic, audio and multimodal interfaces? In what new application areas can we apply these techniques? Are there design methods that are useful? Which evaluation techniques are particularly successful?
Little work has investigated how the haptic and auditory modalities can be efficiently and effectively combined. Is some information better communicated through one modality than another? How can we link haptic and auditory displays so that changes in one modality are reflected in the other? Additionally, how should we interact with these new displays and interfaces? Is a direct manipulation interaction style still appropriate? Such a style is efficient with a force-feedback device, but may not be appropriate for all types of display. How should we interact with a tactile display, or manipulate a sonified graph?
Whilst systems using audio or haptic interaction have been shown to be useful, neither modality has the bandwidth of vision, which causes many problems in situations where visual displays are not available. A sensible approach, therefore, is to combine haptic and auditory interaction to increase communication bandwidth and create multimodal interfaces. The question is how to do this in a way that combines the advantages of both senses, rather than overloading the user.
This workshop is part of the EPSRC funded MultiVis project (www.multivis.org), which is investigating non-visual visualisation. The aim of the workshop will be to concentrate on interaction in haptic, audio and multimodal displays, bringing together interested researchers to try to answer some of these questions and to move the field forward.
Themes
Contributions are welcomed in (but not limited to) the following areas:
Novel haptic, audio and multimodal interfaces and interactions
Evaluating multimodal interactions
Design principles for multimodal systems
Multimodal visualisations
Cross modal interactions
Auditory and haptic displays for visually impaired people
Multimodal gaming and entertainment
Collaborative multimodal systems
Novel systems and interactions using other modalities (e.g. taste, smell)
Papers of up to 10 pages in Springer LNCS format should be submitted as PDF to mcgookdk@dcs.gla.ac.uk by Friday 31st March 2006.
Submissions will be rigorously peer reviewed, with notification of acceptance or rejection by 19th May 2006. Papers should be formatted in the Springer Verlag style, as we hope to publish the proceedings as part of the LNCS series. See the website for full details.
Contact
David McGookin and Stephen Brewster
Department of Computing Science
University of Glasgow
17 Lilybank Gardens
Glasgow
G12 8QQ
tel: +44 (0)141 330 8430
fax: +44 (0)141 330 4913
email: {mcgookdk, stephen}@dcs.gla.ac.uk
www.dcs.gla.ac.uk/~mcgookdk/multivis/workshop.htm