Multimodal Interaction

Multimodal Interaction - The Project

A virtual piano that can be played by gesture with a Kinect.

CW: 23

Posted by Maria, 04 Jun 2015, 14:34:07

This week each member of the group prepared a research or programming task for the next meeting.

Task 1: program a well-designed and scalable piano interface.

Task 2: research speech recognition with the Kinect and the Processing framework.

We found out that it won't be easy to merge both systems. So instead of wasting our precious time on solutions that might not exist, we decided to change one of the input modalities: instead of voice, we want to control our piano with the mouse / keyboard.

Task 3: get visual feedback when the hand is above a certain area.

Together we merged these solutions into our program and implemented sound as well, so every time the hand hovers above a certain key we hear the matching musical tone. A rough idea of how this can look is sketched below.
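The following is only a minimal Processing sketch of the hover-and-tone idea, not our actual program: it uses the mouse position as a stand-in for the tracked hand (with the Kinect, the projected hand coordinates from a tracker such as SimpleOpenNI would be used instead) and assumes the Minim library for the tones; the key layout and pitches are placeholders.

```java
// Hover a "hand" over piano keys to get a highlight and hear the matching tone.
// The mouse stands in for the Kinect hand position; sound comes from Minim.
import ddf.minim.*;

Minim minim;
AudioOutput out;

String[] pitches = { "C4", "D4", "E4", "F4", "G4", "A4", "B4" };
int keyWidth;
int lastKey = -1;   // key currently under the hand, -1 = none

void setup() {
  size(700, 300);
  keyWidth = width / pitches.length;
  minim = new Minim(this);
  out = minim.getLineOut();
}

void draw() {
  background(255);

  // Hand position: replace with the Kinect hand coordinates when available.
  int handX = mouseX;
  int handY = mouseY;

  int hovered = (handY > 0 && handY < height) ? handX / keyWidth : -1;
  if (hovered >= pitches.length) hovered = -1;

  // Draw the keys; the hovered key is highlighted (visual feedback).
  for (int i = 0; i < pitches.length; i++) {
    fill(i == hovered ? color(180, 220, 255) : color(255));
    stroke(0);
    rect(i * keyWidth, 0, keyWidth, height);
  }

  // Play the matching tone once when the hand enters a new key (auditory feedback).
  if (hovered != -1 && hovered != lastKey) {
    out.playNote(0, 0.4, pitches[hovered]);
  }
  lastKey = hovered;

  // Draw the hand cursor.
  fill(255, 0, 0);
  noStroke();
  ellipse(handX, handY, 16, 16);
}
```

Triggering the note only when the hand enters a new key (instead of on every frame) keeps the tone from retriggering constantly while hovering.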

Programming as a group turned out to be a very effective way to work on the code, so after last Wednesday's meeting we decided to continue working this way.

In the next meeting we want to complete our basic interface (see the pitch presentation), so that anyone can play our virtual piano with visual and auditory feedback.



