Prof Stephen Brewster – University of Glasgow
Multimodal mobile interaction – making the most of our users’ capabilities
Mobile user interfaces are heavily based on small screens and keyboards. These can be hard to operate when on the move, which limits the applications and services we can use. This talk will look at the possibility of moving away from these kinds of interactions to ones more suited to mobile devices and their dynamic contexts of use, where users need to be able to look where they are going, carry shopping bags and hold on to children at the same time as using their phones. Multimodal (gestural, audio and haptic) interactions provide us with new ways to use our devices that can be eyes- and hands-free, and allow users to interact in a ‘head up’ way. These new interactions will facilitate new services, applications and devices that fit better into our daily lives and allow us to do a whole host of new things.
I will discuss some of the work we are doing on input using gestures made with the fingers, wrist and head, along with work on output using 3D sound and haptic displays, in mobile applications such as text entry and navigation. I will also discuss some of the issues of social acceptability of these new interfaces; we have to be careful that the new ways we want people to use devices are socially appropriate and don’t make users feel embarrassed or awkward.
Stephen Brewster is Professor of Human-Computer Interaction (HCI) in the School of Computing Science at the University of Glasgow. He leads the Multimodal Interaction Group, which has a world-leading reputation in designing novel user interfaces, particularly for mobile and touchscreen devices. His focus is on multimodal interaction, using multiple sensory modalities (particularly hearing, touch and gesture) to create richer interactions between human and computer.
School of Computing Science & Digital Media, Robert Gordon University, St Andrew Street, Aberdeen. Lecture Room C48, 14:10–15:10.