Human-computer interaction


Research staff working in human-computer interaction

Rebecca Fiebrink  /  Marco Gillies  /  Mick Grierson  /  William Latham  /  Frederic Fol Leymarie  /  James Ohene-Djan  /  Atau Tanaka  /  Robert Zimmer


Better than Life

A collaboration with interactive theatre company Coney to create an interactive performance that is streamed live online, and through which online audiences can interact with the physical show. This project is funded by the Nesta R&D fund for the arts. Marco Gillies. Better Than Life website


Enabling Audiovisual User Interfaces

The project investigates how human-computer interactions can be audiovisualised in order to improve user experience and usability. The AVUI links interaction, sound and image, building upon the concept of the Graphical User Interface by adding interconnected sound and image. Atau Tanaka. Enabling AVUIs website


Sound, Image and Brain

This project takes research in brain-computer interfaces, audio-visualisation, participation and gaming, and develops it in partnership with industry and public organisations. AHRC-funded project with Mick Grierson, Roll7 and Sound and Music. Sound, Image and Brain website


Design Patterns for Inclusive Collaboration

This project examines how people (particularly those with sensory impairments) combine and map information from one sense to another, to inform the design of technology to improve social and workplace inclusion. Atau Tanaka. Design Patterns for Inclusive Collaboration website



RAPID-MIX

A technology transfer consortium developing innovative interface products for music, gaming, and e-Health applications. EU-funded project led by Atau Tanaka, Mick Grierson and Rebecca Fiebrink. RAPID-MIX website


Machine Learning as Design Tool

By enabling people to instantiate designs of new musical instruments, game controllers, character animations, and other systems from data, rather than by writing code, we make design more efficient and accessible to more people, as well as enabling people to discover new design possibilities. Several research threads in the department link into this theme of making machine learning a usable, creative tool. Rebecca Fiebrink, Marco Gillies, Atau Tanaka and Mick Grierson.
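The idea of designing from data rather than code can be sketched in a few lines. The following is an illustrative assumption (not the department's actual software), in the spirit of interactive machine learning tools for mapping by demonstration: a designer records example pairs of sensor input and synthesis parameters, and a simple k-nearest-neighbour model generalises the mapping.

```python
# A minimal "mapping by demonstration" sketch. All names here
# (DemonstrationMapper, record, map) are hypothetical, chosen for
# illustration; they do not refer to any real tool's API.
from math import dist


class DemonstrationMapper:
    def __init__(self, k=2):
        self.k = k
        self.examples = []  # list of (input_vector, output_vector) pairs

    def record(self, sensor_input, synth_params):
        """Store one demonstration pair supplied by the designer."""
        self.examples.append((list(sensor_input), list(synth_params)))

    def map(self, sensor_input):
        """Average the outputs of the k nearest demonstrations."""
        nearest = sorted(self.examples,
                         key=lambda ex: dist(ex[0], sensor_input))[:self.k]
        n_outputs = len(nearest[0][1])
        return [sum(out[i] for _, out in nearest) / len(nearest)
                for i in range(n_outputs)]


# Usage: two demonstrations map a 2-D accelerometer pose to
# (pitch, volume); new poses interpolate between the examples.
mapper = DemonstrationMapper(k=2)
mapper.record([0.0, 0.0], [220.0, 0.2])   # tilt left  -> low, quiet
mapper.record([1.0, 1.0], [880.0, 0.9])   # tilt right -> high, loud
print(mapper.map([0.5, 0.5]))             # interpolates both examples
```

The point of the sketch is the workflow, not the model: the instrument's behaviour comes entirely from recorded examples, so a non-programmer can redesign it by recording different ones.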


Human Interactive

International conference focussed on human-machine interaction. William Latham, Frederic Fol Leymarie. Human Interactive website


Augmenting Physical Spaces with Digital Information

This project explores the use of location-based services, innovative geo-tagging and geo-links to augment physical locations with digital information. James Ohene-Djan, working with the Escola Politécnica, University of São Paulo, Brazil, and WinkBall.


Optimising mobile device interaction

Mobile interaction design and techniques with an emphasis on enhancing one-handed interaction using either back-of-device gestures or adapted interfaces. Kate Devlin and Karsten Seipp. Optimising mobile device interaction website


The Digital Fauvel

A research project exploring human-computer interaction in the digital humanities, The Digital Fauvel is a new digital platform for the 14th-century multimedia manuscript Roman de Fauvel. Formerly funded by the David A. Gardner ’69 Magic Project in the Humanities (USA). Rebecca Fiebrink, with Yale University and Princeton University. Digital Fauvel website

  • Zayaruznaya, A., and R. Fiebrink. “Reimagining the facsimile: Project report on The Digital Fauvel.” Early Music 42(4): 599-604, November 2014.

Centre for Doctoral Training in Intelligent Games and Game Intelligence

A four year EPSRC-funded PhD programme training the next generation of researchers in digital games research, in collaboration with York and Essex Universities and over 50 games companies. Atau Tanaka, Jeremy Gow, William Latham, Simon Colton. IGGI website



SoundLab

SoundLab aims to find simple and effective ways to help people with learning disabilities to express themselves musically and collaborate with other people, using both readily available musical technologies and cutting-edge research in interface design and machine learning. NESTA/ACE/AHRC-funded project led by Mick Grierson and Simon Katan, working with Heart n Soul. Make your Soundlab website


Latency-Free Real-time Internet Collaboration

This project employs pattern recognition and overlay networks to predict human actions before they occur and send them efficiently over the Internet to a remote collaborator. This allows latency-free collaboration for time-sensitive applications such as music performance, even over very long distances. Rebecca Fiebrink, previously supported by the Project X Fund, Princeton.

  • VIDEO: MalLo: A Predictive Percussion Instrument For Internet Performance website
  • Jin, Z., R. Oda, A. Finkelstein, and R. Fiebrink. “MalLo: A Distributed, Synchronized Instrument for Internet Music Performance.” Proceedings of the 15th International Conference on New Interfaces for Musical Expression (NIME), 2015.
  • Oda, R., A. Finkelstein, and R. Fiebrink. “Towards note-level prediction for networked music performance.” Proceedings of New Interfaces for Musical Expression (NIME), Daejeon, South Korea, May 27–30, 2013.
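The prediction idea described above can be illustrated with a toy example. The sketch below is an assumption for illustration only, not the MalLo implementation: it linearly extrapolates a falling mallet's trajectory to predict the moment of impact, so that the note event can be sent one network-latency early and still arrive on time.

```python
# Hypothetical sketch of strike-time prediction for networked
# percussion. Function names and the linear model are illustrative
# assumptions, not the published system's code.

def predict_strike_time(samples):
    """Linearly extrapolate (time, height) samples of a descending
    mallet to the moment height reaches zero (the drum surface)."""
    (t0, h0), (t1, h1) = samples[-2], samples[-1]
    velocity = (h1 - h0) / (t1 - t0)   # negative while descending
    if velocity >= 0:
        return None                    # not moving toward the drum
    return t1 - h1 / velocity          # solve h1 + v * (t - t1) = 0


def should_send(now, strike_time, latency):
    """Send the event once the remaining time to impact is no greater
    than the one-way network latency, so it can sound on time remotely."""
    return strike_time is not None and (strike_time - now) <= latency


# Mallet at 10 cm, falling at 50 cm/s: impact predicted at t = 0.3 s.
t_hit = predict_strike_time([(0.0, 0.15), (0.1, 0.10)])
print(round(t_hit, 3))                                        # 0.3
print(should_send(now=0.1, strike_time=t_hit, latency=0.25))  # True
```

With 0.2 s remaining until impact and 0.25 s of one-way latency, the event is sent immediately; the receiver schedules it for the predicted strike time rather than playing it on arrival.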

Digital cultural heritage

Data capture, such as laser scanning of archaeological sites and artefacts, and visualisations and simulations of past environments. Kate Devlin and Frederic Fol Leymarie.