
Entangled

 

A Multi-Modal, Multi-User Interactive Instrument in Virtual 3D Space Using the Smartphone for Gesture Control

Presented at NIME 2021

Entangled is a multi-modal instrument in virtual 3D space that combines sound, graphics, and a smartphone-based gestural interface for multiple users.

 

Players on the same network can use their smartphones as controllers by entering a specific URL into their phone's browser.

 

After joining the network, players move their smartphones, and the accelerometer data is used to apply gravitational force to a swarm of particles in the virtual space.
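
The exact physics integration is not detailed on this page; the standalone C++ sketch below (with hypothetical Vec3 and Particle types) only illustrates the basic idea of treating a phone's accelerometer reading as a gravity vector that accelerates every particle in the swarm.

```cpp
#include <vector>

// Minimal stand-in types; the actual project uses AlloLib's math/graphics classes.
struct Vec3 {
  float x{}, y{}, z{};
};

struct Particle {
  Vec3 pos, vel;
};

// Treat the latest accelerometer reading from a player's phone as a
// gravitational acceleration applied uniformly to the whole swarm.
void applyGravity(std::vector<Particle>& swarm, const Vec3& accel, float dt) {
  for (auto& p : swarm) {
    p.vel.x += accel.x * dt;
    p.vel.y += accel.y * dt;
    p.vel.z += accel.z * dt;
    p.pos.x += p.vel.x * dt;
    p.pos.y += p.vel.y * dt;
    p.pos.z += p.vel.z * dt;
  }
}
```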

 

In parallel, machine-learning-based gesture pattern recognition extends the range of available gestural commands.
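
The recognition model itself is not specified here, so the sketch below stands in with a minimal nearest-centroid classifier over fixed-length feature vectors extracted from accelerometer windows; the Features layout and labels are assumptions for illustration only.

```cpp
#include <array>
#include <cstddef>
#include <limits>
#include <vector>

// One feature vector per gesture window (e.g. statistics over a short
// buffer of accelerometer samples). The feature set and model used in
// the real system are not reproduced here.
using Features = std::array<float, 8>;

struct GestureClass {
  int label;          // e.g. 0 = shake, 1 = circle, ... (hypothetical)
  Features centroid;  // mean feature vector learned offline
};

int classifyGesture(const Features& f, const std::vector<GestureClass>& model) {
  int best = -1;
  float bestDist = std::numeric_limits<float>::max();
  for (const auto& c : model) {
    float d = 0.f;
    for (std::size_t i = 0; i < f.size(); ++i) {
      float diff = f[i] - c.centroid[i];
      d += diff * diff;
    }
    if (d < bestDist) {
      bestDist = d;
      best = c.label;
    }
  }
  return best;  // label of the closest centroid
}
```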

 

Through this interface, players gain intuitive control over gravitation in the virtual reality (VR) space.

 

Gravitation becomes the shared medium of the system, linking physics, graphics, and sonification into a multimodal compositional language with cross-modal correspondence.
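
The actual cross-modal mapping is part of the composition and is not reproduced here; as a rough illustration, the sketch below (reusing the Vec3 and Particle types from the gravity sketch above) derives two hypothetical synthesis parameters from the swarm's kinetic energy.

```cpp
#include <cmath>
#include <vector>

// Hypothetical mapping from swarm motion to synthesis parameters;
// it only illustrates one physical quantity driving the sound.
struct SoundParams {
  float frequency;  // Hz
  float amplitude;  // 0..1
};

SoundParams sonifySwarm(const std::vector<Particle>& swarm) {
  float energy = 0.f;
  for (const auto& p : swarm) {
    energy += p.vel.x * p.vel.x + p.vel.y * p.vel.y + p.vel.z * p.vel.z;
  }
  float mean = swarm.empty() ? 0.f : energy / swarm.size();
  SoundParams s;
  s.frequency = 110.f + 880.f * std::tanh(mean);  // faster swarm -> higher pitch
  s.amplitude = std::tanh(mean);                  // and louder output
  return s;
}
```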

 

Entangled is built on AlloLib, a cross-platform suite of C++ components for building interactive multimedia tools and applications.
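
A rough structural sketch of such an application is shown below, assuming AlloLib's standard al::App callbacks (onCreate, onAnimate, onDraw, onSound); the bodies are placeholders rather than Entangled's actual source.

```cpp
#include "al/app/al_App.hpp"

using namespace al;

// Structural sketch only: AlloLib calls these hooks for setup,
// simulation, rendering, and audio respectively.
struct EntangledLikeApp : App {
  Mesh swarmMesh;

  void onCreate() override {
    swarmMesh.primitive(Mesh::POINTS);  // one point per particle
  }

  void onAnimate(double dt) override {
    // step the particle physics here, using the current gravity vector
  }

  void onDraw(Graphics& g) override {
    g.clear(0);
    g.draw(swarmMesh);  // render the swarm
  }

  void onSound(AudioIOData& io) override {
    while (io()) {
      io.out(0) = 0.f;  // write samples driven by the swarm state
    }
  }
};

int main() {
  EntangledLikeApp app;
  app.start();
}
```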

    • Developed a general-purpose, gesture-based smartphone 3D interface (C++).
    • Developed a musical instrument using the smartphone's gesture-based interface (C++).
    • Developing a multimodal 3D interface with pattern recognition using signal processing and machine learning.

The smartphone interface transmits button, accelerometer, and gyroscope data over OSC.
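
AlloLib apps can receive such OSC packets through the App's onMessage() callback; the fragment below extends the app sketch above, with the address patterns and argument layout shown here being assumptions for illustration, not the instrument's actual protocol.

```cpp
// Member function added to EntangledLikeApp from the sketch above.
// AlloLib delivers incoming OSC packets to onMessage().
void onMessage(osc::Message& m) override {
  if (m.addressPattern() == "/accel") {          // hypothetical address
    float x, y, z;
    m >> x >> y >> z;
    // update the gravity vector for this player's particles
  } else if (m.addressPattern() == "/gyro") {    // hypothetical address
    float rx, ry, rz;
    m >> rx >> ry >> rz;
    // feed rotation rates into the gesture recognizer
  } else if (m.addressPattern() == "/button") {  // hypothetical address
    float id, state;
    m >> id >> state;
    // toggle modes or trigger events
  }
}
```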