This demonstration from Behringer shows a mixed reality interface for real-time music manipulation. It uses Microsoft's HoloLens to display information in an augmented reality environment, while hand movements and positions are tracked by a Leap Motion and the data is fed to the DeepMind 12 synthesizer. The demonstration still seems a bit clunky here and there, and the actual music manipulation appears partially edited, but the implications of the technology do stand to reason and tease my curiosity.

I think music production and live performance are great fields for applying mixed reality interfaces: since the workflow is inherently haptic and sensory, adding a new layer of interaction can benefit the creative process. And because it all still happens in a digital realm, you have the possibility to hack, customize, and modify the inputs and outputs of that workflow. An example of that is the online course on Kadenze, where Rebecca Fiebrink of Goldsmiths, University of London shows ways of mapping sensors and inputs with machine learning to build custom instruments, e.g. for live performances or experimental music production.
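To make the tracking-to-synthesizer pipeline a bit more concrete: a common way to feed hand-tracking data to a hardware synth like the DeepMind 12 is to translate it into MIDI control change values (0–127). The demo doesn't reveal how Behringer actually wired this up, so the following is just a minimal sketch of the mapping step; the tracking range in millimeters is an assumed, hypothetical calibration.

```python
def hand_to_cc(y_mm, y_min=100.0, y_max=600.0):
    """Map a tracked hand height (mm above the sensor) to a 0-127 MIDI CC value.

    y_min/y_max are hypothetical calibration bounds for the sensor's
    useful tracking range; values outside it are clamped.
    """
    t = (y_mm - y_min) / (y_max - y_min)   # normalize to 0..1
    t = max(0.0, min(1.0, t))              # clamp out-of-range readings
    return round(t * 127)                  # quantize to the 7-bit MIDI range

# Raising the hand from the bottom to the top of the range sweeps
# the full CC range, e.g. a filter cutoff on the synth:
print(hand_to_cc(100.0))  # bottom of range -> 0
print(hand_to_cc(350.0))  # midpoint       -> 64
print(hand_to_cc(600.0))  # top of range   -> 127
```

In a real setup the resulting value would be sent each frame as a `control_change` message over a MIDI output (for example with a library such as `mido`), with the CC number chosen to match whichever synth parameter you want the gesture to drive.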