We presented the first version of COMO – Collective Movement – at the International Conference on Movement and Computing, Goldsmiths, University of London, 2017.
COMO is a collection of prototype Web Apps for movement-based sound interaction, targeting collective interaction using mobile phones.
The PiPo module API for writing your own processing and analysis objects is now available and documented here: http://recherche.ircam.fr/equipes/temps-reel/mubu/pipo
It includes an example Xcode project to build a simple PiPo .mxo external for Max that also works within MuBu.
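To give an idea of what a module looks like, here is a minimal sketch of a pass-through gain module in the spirit of the examples shipped with the SDK. The class name and the hard-coded gain value are made up for illustration, and the exact method signatures should be checked against the current pipo.h and the documentation linked above.

```cpp
// Sketch of a minimal PiPo module (a simple gain), modelled on the SDK examples.
// Method names and signatures follow our reading of pipo.h and may need to be
// adjusted to match the current SDK headers.
#include "PiPo.h"

class PiPoMyGain : public PiPo   // hypothetical module name
{
public:
  double factor = 0.5;           // fixed gain for this sketch; a real module
                                 // would expose this as a PiPo attribute

  PiPoMyGain(Parent *parent, PiPo *receiver = NULL)
  : PiPo(parent, receiver)
  { }

  // Describe the output stream; here it is identical to the input stream,
  // so the attributes are simply propagated to the next module in the chain.
  int streamAttributes(bool hasTimeTags, double rate, double offset,
                       unsigned int width, unsigned int height,
                       const char **labels, bool hasVarSize,
                       double domain, unsigned int maxFrames)
  {
    return propagateStreamAttributes(hasTimeTags, rate, offset, width, height,
                                     labels, hasVarSize, domain, maxFrames);
  }

  // Process incoming frames: scale every value and pass the result on.
  // Depending on the host, it may be safer to copy into an internal buffer
  // instead of writing into the input array in place.
  int frames(double time, double weight, PiPoValue *values,
             unsigned int size, unsigned int num)
  {
    for (unsigned int i = 0; i < size * num; i++)
      values[i] *= factor;
    return propagateFrames(time, weight, values, size, num);
  }
};
```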
MaD allows for simple and intuitive design of continuous sonic gestural interaction. The motion-sound mapping is automatically learned by the system when movement and sound examples are jointly recorded. In particular, our applications focus on using vocal sounds, recorded while performing the action, as the primary material for interaction design. The system integrates specific probabilistic models with hybrid sound synthesis models. Importantly, the system is independent of the type of motion/gesture sensing device and can directly accommodate different sensors such as cameras, contact microphones, and inertial measurement units. Applications range from performing arts and gaming to medical uses such as auditory-aided rehabilitation.
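To make the record-then-map workflow concrete, below is a deliberately simplified, self-contained sketch: jointly recorded motion/sound frames are stored during demonstration, and at performance time an incoming motion frame is mapped to the sound parameters of its closest recorded neighbour. This nearest-neighbour stand-in only illustrates the idea; MaD itself relies on probabilistic models combined with hybrid sound synthesis, and all names and values in the sketch are hypothetical.

```cpp
// Conceptual sketch of mapping-by-demonstration (not MaD's actual implementation).
#include <vector>
#include <limits>
#include <cstdio>

// One jointly recorded example: a motion feature vector paired with the
// sound-synthesis parameters captured at the same instant.
struct Example {
  std::vector<float> motion;   // e.g. accelerometer features
  std::vector<float> sound;    // e.g. vocal or synthesis descriptors
};

class DemonstrationMapper {
public:
  // "Training": store jointly recorded motion/sound frames.
  void record(const std::vector<float>& motion, const std::vector<float>& sound) {
    examples_.push_back({motion, sound});
  }

  // Performance: map an incoming motion frame to sound parameters by
  // returning the sound frame of the closest recorded motion frame.
  std::vector<float> map(const std::vector<float>& motion) const {
    float best = std::numeric_limits<float>::max();
    const Example* bestEx = nullptr;
    for (const auto& ex : examples_) {
      float d = 0.f;
      for (size_t i = 0; i < motion.size() && i < ex.motion.size(); ++i) {
        float diff = motion[i] - ex.motion[i];
        d += diff * diff;
      }
      if (d < best) { best = d; bestEx = &ex; }
    }
    return bestEx ? bestEx->sound : std::vector<float>{};
  }

private:
  std::vector<Example> examples_;
};

int main() {
  DemonstrationMapper mapper;
  // Record two toy motion/sound pairs (stand-ins for real sensor and audio features).
  mapper.record({0.1f, 0.2f}, {440.f, 0.5f});
  mapper.record({0.9f, 0.8f}, {880.f, 0.9f});
  // Query with a new motion frame.
  std::vector<float> sound = mapper.map({0.85f, 0.75f});
  std::printf("mapped sound parameters: %g %g\n", sound[0], sound[1]);
  return 0;
}
```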
The piece Five Out of Six by Christopher Trapani for small ensemble and electronics won the ICMC 2014 Best Piece Award (Americas).