NSynth Super

NSynth Super is an experimental musical instrument built around Google’s NSynth machine learning algorithm. It lets you morph between four different sound sources at a time – for instance 20% harp, 40% guitar, 30% marimba and 10% dog barking – and the algorithm uses a neural network to fuse those sounds into a single new sound you can play on a keyboard. You can also modify the ratios of the mix live using the touch screen.
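
As a rough illustration of the idea (not the actual NSynth Super code), the mix can be thought of as being driven by the touch position on the screen: a normalised (x, y) coordinate maps to four corner weights via bilinear interpolation, and the weights always sum to 1. The function and parameter names below are purely hypothetical.

```cpp
#include <array>
#include <cstdio>

// Hypothetical sketch: derive the four corner-instrument weights from a
// normalised touch position (x, y in [0, 1]) using bilinear interpolation.
// The real NSynth Super interface may compute its mix differently.
std::array<float, 4> mixWeights(float x, float y) {
    return {
        (1.0f - x) * (1.0f - y),  // instrument in the bottom-left corner
        x          * (1.0f - y),  // bottom-right
        (1.0f - x) * y,           // top-left
        x          * y            // top-right
    };
}

int main() {
    // A touch near a corner gives that corner's instrument most of the mix;
    // a touch in the centre gives an even 25% blend of all four.
    auto w = mixWeights(0.2f, 0.3f);
    std::printf("%.2f %.2f %.2f %.2f\n", w[0], w[1], w[2], w[3]);
    return 0;
}
```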

I worked with Google Creative Lab to design and build the interface for the device’s touchscreen, prototyping and iterating on the design collaboratively. I also helped with parts of the audio playback engine.

More info on the project can be found here:
https://nsynthsuper.withgoogle.com

In May 2018, Google asked me to come back and help build a version of the software that could be projected on a big screen just before the Google CEO’s annual keynote at Google I/O in Mountain View. They wanted to explode the interface into 3D to visualise the neural networks running under the hood, and fly the camera around them whilst following a giant orb which represented the performer’s finger.
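
As a hedged sketch of how that kind of follow-camera behaviour can work (assumptions only, not the code used for the keynote visuals), the camera can simply ease a fraction of the way toward the orb’s position each frame:

```cpp
#include <cstdio>

// Illustrative sketch only: a camera trailing a moving "orb" target by
// easing toward it each frame. The names and easing factor are assumptions.
struct Vec3 { float x, y, z; };

Vec3 lerp(const Vec3& a, const Vec3& b, float t) {
    return { a.x + (b.x - a.x) * t,
             a.y + (b.y - a.y) * t,
             a.z + (b.z - a.z) * t };
}

int main() {
    Vec3 camera = {0.0f, 0.0f, -10.0f};
    Vec3 orb    = {5.0f, 2.0f,   3.0f};  // position driven by the performer's finger

    // Each frame, move the camera 5% of the way toward the orb, which gives
    // the smooth "fly around and follow" feel without snapping to the target.
    for (int frame = 0; frame < 60; ++frame) {
        camera = lerp(camera, orb, 0.05f);
    }
    std::printf("camera: %.2f %.2f %.2f\n", camera.x, camera.y, camera.z);
    return 0;
}
```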

So of course I said yes. Unfortunately I’ve been unable to track down a decent video of the event.