Multimodal Interaction

Multimodal Interaction - The Project

A virtual piano that can be played by gestures with a Kinect.

Project progress

Posted by Jimmy 28 May, 2015 17:58:56
After deciding on the subject of our project and planning all the tasks we have to do, we presented these elements to the other groups. Our presentation is attached to this post.

As a reminder, our project is a music keyboard controlled by gestures and voice. The user sees a keyboard on the screen and can interact with it by making gestures (pushing keys), which generate sounds. Voice commands can also be used to change the instrument and the rhythm. Our output modalities are sound and visuals, so that the user gets feedback for his actions.
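To make the gesture-to-key interaction concrete, here is a minimal sketch of how a tracked hand's screen position could be mapped to a piano key. The class name, screen width, and key count are our own illustrative assumptions, not the project's final code.

```java
// Hypothetical mapping from a hand's on-screen x-coordinate to a key index.
// Screen width and number of keys are assumptions for illustration.
public class KeyboardMapper {
    private final int screenWidth;
    private final int numKeys;

    public KeyboardMapper(int screenWidth, int numKeys) {
        this.screenWidth = screenWidth;
        this.numKeys = numKeys;
    }

    // Returns the index of the key under the hand, clamped to the keyboard.
    public int keyAt(float handX) {
        int key = (int) (handX / screenWidth * numKeys);
        return Math.max(0, Math.min(numKeys - 1, key));
    }

    public static void main(String[] args) {
        KeyboardMapper mapper = new KeyboardMapper(640, 8); // one octave of white keys
        System.out.println(mapper.keyAt(0f));    // leftmost key -> 0
        System.out.println(mapper.keyAt(320f));  // middle of the screen -> 4
        System.out.println(mapper.keyAt(639f));  // rightmost key -> 7
    }
}
```

Each key index would then trigger the corresponding note sound, which also doubles as the audio feedback mentioned above.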

In addition to this presentation, we started exploring the technical points needed to develop this system, beginning with Processing and the Kinect sensor. We are now able to track a specific part of a user's body, such as the hand, shoulder, or head, and we can detect some gestures, like the push gesture that interests us for our project. Moreover, we found some useful sounds.
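The push gesture we rely on can be detected from the hand's depth coordinate: a push is a fast movement toward the sensor, i.e. the z value dropping quickly between frames. The sketch below is a simple illustration under our own assumptions (class name, millimeter units, threshold value), not the tracking library's actual gesture API.

```java
// Hypothetical push-gesture detector: flags a "push" when the tracked hand's
// depth (z, in mm) decreases toward the sensor faster than a threshold
// between two consecutive frames. The threshold value is an assumption.
public class PushDetector {
    private final float thresholdMmPerFrame;
    private Float lastZ = null; // depth from the previous frame, if any

    public PushDetector(float thresholdMmPerFrame) {
        this.thresholdMmPerFrame = thresholdMmPerFrame;
    }

    // Feed the hand's z every frame; returns true when a push is detected.
    public boolean update(float handZ) {
        boolean push = lastZ != null && (lastZ - handZ) > thresholdMmPerFrame;
        lastZ = handZ;
        return push;
    }

    public static void main(String[] args) {
        PushDetector detector = new PushDetector(40f); // 40 mm per frame
        System.out.println(detector.update(900f)); // first frame -> false
        System.out.println(detector.update(895f)); // slow drift -> false
        System.out.println(detector.update(830f)); // fast move toward sensor -> true
    }
}
```

In the real application this check would run once per Kinect frame on the tracked hand joint, and a detected push on a key would trigger the note.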

For the next week, we will continue developing and learning to use the Kinect's features. We will investigate how to implement speech recognition, which is an interesting part of our project. Finally, we will look for more sounds in order to offer users several possibilities.
