Goldsmiths - University of London
Gesture-sound interaction in digital media


30 Mar 2011, 2:00pm - 3:00pm

LG01, Professor Stuart Hall Building

Event overview

Department: Computing
Contact: m.gillies@gold.ac.uk

This event will be delivered by Frédéric Bevilacqua, Head of the Real-Time Musical Interactions Team, IRCAM - Centre Pompidou, STMS-CNRS UPMC, Paris, France.

Frédéric will present an overview of the research and applications of the Real-Time Musical Interactions Team at IRCAM (Paris).

Synopsis:

Over the last seven years we have developed various methods and tools for computer-based gesture analysis, with the general goal of using body movements to interact with sonic and/or visual environments. This research has largely been influenced by sustained collaborations with musicians/composers and dancers/choreographers.

We will present some of these works, focusing on gesture research and interfaces. In particular, we will present musical interfaces and various experiments we have carried out in music pedagogy. We will also present dance performances and interactive installations we have collaborated on.

In music, we studied the physical gestures of musicians, such as the bow movements of violin players. This allowed us to formalize key concepts of continuous gesture control, gesture vocabulary and co-articulation (analogous to speech production). This fundamental research led us to design augmented instruments that incorporate these concepts. In parallel, we are designing new interfaces and paradigms for controlling sonic environments, individually or collectively, and we are developing tools to re-perform sound and music with such interfaces. In particular, we developed a "gesture follower" system that allows gestures to be recognized and synchronized with sound materials.
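The core idea behind a "gesture follower" can be illustrated with a toy example: align an incoming movement stream against a pre-recorded template gesture and report how far along the template the performer currently is, so that sound playback can be synchronized to that position. The sketch below is a minimal dynamic-time-warping-style follower written purely for illustration; it is not IRCAM's implementation, and all names in it are hypothetical.

```python
# Toy "gesture follower" sketch: estimate a performer's progress through a
# pre-recorded template gesture from a live stream of position frames.
# NOT the IRCAM system; a minimal illustrative alignment for this talk's idea.

import math


def follow(template, stream):
    """For each incoming frame, return the estimated position (0.0 to 1.0)
    within the template, using an online accumulated-cost alignment."""
    n = len(template)
    INF = float("inf")
    # cost[j] = best accumulated cost of matching the stream so far,
    # with the alignment currently at template frame j
    cost = [INF] * n
    cost[0] = 0.0
    positions = []
    for frame in stream:
        new = [INF] * n
        for j in range(n):
            d = math.dist(frame, template[j])  # Euclidean frame distance
            # Monotonic alignment: either stay on frame j or advance from j-1.
            best_prev = cost[j]
            if j > 0 and cost[j - 1] < best_prev:
                best_prev = cost[j - 1]
            if best_prev < INF:
                new[j] = best_prev + d
        cost = new
        j_best = min(range(n), key=lambda j: cost[j])
        positions.append(j_best / (n - 1))  # normalized progress in [0, 1]
    return positions
```

In a real interactive system, the reported progress value would drive the playhead of a sound file or synthesis parameters, so that the sound "follows" the gesture rather than playing back at a fixed rate.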

In dance, we will present performances and installations in which we used the same technology as for music. While designed with different goals and aesthetics, two of them use a similar interaction principle: the visitor is invited to dance by "imitating" dance material displayed on a large screen. This brings us back to open questions raised by musical interfaces: how can we learn gestures and the interaction with digital media, and how does this affect our perception of gesture and sound?


Accessibility

If you are attending an event and need the College to help with any mobility requirements you may have, please contact the event organiser in advance to ensure we can accommodate your needs.

© Goldsmiths, University of London