Six Music Creators Inspired By Video Games
From Nintendo’s Game Boy to Microsoft’s Kinect, we meet six electronic music creators whose music-making has been influenced by the world of video games.
—
February 21st, 2019
We think it’s huge how, little by little, video games are gaining ground within the music industry. Today, video game music has left the screen, and the consoles, controllers and game engines themselves have become tools for making music.
Martin Delaney
Lisa McKendrick (aka Nnja Riot)
Game Boy performer
“I’m using the Nintendo DSi and the 3DS, which will both run the Korg software that I’m using. I realised that game devices were a very portable way of making music while I was travelling. It’s both the interface and the sounds that I like – the sounds are reminiscent of video games and have a glitchy, lo-fi quality, and the interface is easy to use while killing time on my bus journey. I also use a variety of synths, inductors, LDR oscillators that I’ve made myself, vocals, and beats that I’ve programmed. I’m also using the Fort Processor, which I co-designed with Tim Drage.”
Laurie French
Game Boy performer
“I discovered the Chiptune community, and that people were making music on the Game Boy – I chose to work with Nanoloop. The Game Boy needs a little extra to get you going. Pop the cartridge in, stick some headphones on, and as long as you have hands and perseverance, you can start composing! The limitations are also the Game Boy’s strength; I am forced to make creative choices and sacrifices in order to get the sounds I want from it. Playing live is really fun: when you amplify the Game Boy, it really comes to life and can sound awesome!”
Stormfield
Kinect user, AV artist
“The Kinect spits out loads of tiny beams of infra-red light, and uses the incoming reflected beams to build a 3D image of what’s in front of it, and for motion capture. They released the SDK a while back, and people began hacking into it, extending its capabilities outside gaming: stuff like Synapse, TUIO, Z Vector and ways in via Quartz Composer, Max and Processing. Because it’s mass produced by Microsoft, you have in your hands relatively cheap and very capable hardware. I’ve used it on AV shows with Scald and Mick Harris as a live feed, to capture the performer’s outline while letting the room stay dark.”
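Tools such as Synapse typically relay the Kinect’s tracked joint positions as OSC messages that Max, Processing or anything else on the network can pick up. Purely as a rough illustration (not Stormfield’s setup), the Python sketch below listens for that kind of joint data with the python-osc package; the port number and address patterns are placeholder assumptions and will differ between tools.

```python
# Minimal OSC listener for Kinect joint data (e.g. as relayed by a tool like Synapse).
# The port number and address patterns are assumptions -- check your tool's settings.
from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import BlockingOSCUDPServer

def on_joint(address, x, y, z):
    # Map the joint's vertical position (y) to a MIDI-style control value, 0-127.
    # The incoming coordinate range is device/tool dependent; this assumes roughly -1..1.
    value = max(0, min(127, int((y + 1.0) * 63.5)))
    print(f"{address}: x={x:.2f} y={y:.2f} z={z:.2f} -> CC {value}")

dispatcher = Dispatcher()
dispatcher.map("/righthand", on_joint)   # hypothetical address pattern
dispatcher.map("/lefthand", on_joint)    # hypothetical address pattern

server = BlockingOSCUDPServer(("127.0.0.1", 12345), dispatcher)
print("Listening for joint data on port 12345 ...")
server.serve_forever()
```

From here, the handler could just as easily forward the values to a synth or a DAW as MIDI or further OSC messages, which is essentially how such a live feed ends up driving visuals or sound.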
Mark Towers
LEGO Mindstorms builder
“I always look for fun ways to interact with the computer, and LEGO Mindstorms offers a really creative approach to building custom instruments/controllers that integrate with the software. I really like the idea of physical/mechanical systems working seamlessly with Live 9/10 and Max For Live. The toy-like nature of LEGO clearly appeals to a wide range of people. Feedback suggests that people find the devices intuitive and fun, but more importantly, they’re inspired to tweak and build their own instruments.
“Many people have grown up playing with LEGO and understand the simple rules required to conceptualise and build, or just build and see where it goes. Combine that with Max For Live and you have a pretty powerful creative platform to develop your ideas. I’m particularly fond of the Mindstorms Infrared Sensor, it’s super responsive and a pair of them make a great theremin!”
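Towers builds his controllers in Max For Live, but the theremin idea translates to other environments too. Purely as a sketch under different assumptions (not his patch), the Python example below reads an EV3 infrared sensor’s proximity value via the ev3dev2 library and turns it into MIDI notes with mido; the note range, scaling and timing are arbitrary illustrative choices.

```python
# Theremin-style sketch: EV3 infrared proximity -> MIDI notes.
# Assumes an EV3 brick running ev3dev with the ev3dev2 and mido packages installed;
# the MIDI output port and note mapping are illustrative assumptions.
import time
import mido
from ev3dev2.sensor.lego import InfraredSensor

ir = InfraredSensor()            # EV3 infrared sensor, read in proximity mode
out = mido.open_output()         # default MIDI output port

last_note = None
while True:
    distance = ir.proximity                    # 0 (close) .. 100 (far)
    note = 48 + int((100 - distance) * 0.24)   # map hand distance to roughly C3..C5
    if note != last_note:
        if last_note is not None:
            out.send(mido.Message('note_off', note=last_note))
        out.send(mido.Message('note_on', note=note, velocity=90))
        last_note = note
    time.sleep(0.05)                           # ~20 updates per second
```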
Jim Simons
Developer, AliveInVR
“Seeing the amazing potential of VR for innovative music creation and having worked in music technology for companies such as Yamaha and Focusrite, I really wanted to build my own music products based upon VR. This app started as a prototype, and turned out to be so much fun that I took it further and released it.
“There are a lot of things here that other control surfaces can’t do: 3D control of parameters; the ability to arrange your triggers anywhere in the virtual room (for example, arranging the steps of a step sequencer in a spiral around yourself); and binaural spatial audio to mix spatially – wherever you put a clip or loop in space, you can hear the sound coming from that location using HRTF (Head Related Transfer Function – it’s a 3D audio thing).
“For collaborative sessions, every user has an avatar that tracks the hands and head/body, and anyone in the shared session can interact in near real-time. None of this stuff would be possible without the Unreal Engine, which is the game engine used to develop it. The open nature of the Unreal Engine means that adding all the custom MIDI stuff, audio routing and loopback ASIO is possible.”
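Simons mentions HRTF-based spatial audio: wherever a clip sits in the virtual room, the sound appears to come from that point. The first step in that kind of rendering is working out where each source lies relative to the listener’s head. The sketch below computes that geometry (azimuth, elevation and distance), which an HRTF renderer would then use to select its filters; it is a conceptual illustration only, not AliveInVR’s implementation.

```python
# Listener-relative geometry for HRTF-style spatialisation (conceptual sketch).
# An HRTF renderer would use azimuth/elevation to choose its filters and
# distance to set attenuation; this only computes those values.
import math

def source_direction(listener_pos, listener_yaw_deg, source_pos):
    """Return (azimuth_deg, elevation_deg, distance) of a source relative to a
    listener at listener_pos facing listener_yaw_deg (0 = along the +Y axis)."""
    dx = source_pos[0] - listener_pos[0]
    dy = source_pos[1] - listener_pos[1]
    dz = source_pos[2] - listener_pos[2]
    distance = math.sqrt(dx * dx + dy * dy + dz * dz)
    # Horizontal angle to the source, minus where the head is facing.
    azimuth = math.degrees(math.atan2(dx, dy)) - listener_yaw_deg
    azimuth = (azimuth + 180) % 360 - 180          # wrap to -180..180
    elevation = math.degrees(math.asin(dz / distance)) if distance else 0.0
    return azimuth, elevation, distance

# A clip placed ahead and to the right of the listener, slightly above head height:
print(source_direction((0, 0, 0), 0.0, (1.0, 1.0, 0.2)))
```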
Daniel James Ross
Myo performer
“I became interested in the Myo after seeing Atau Tanaka use them in his live performance and in his piece Suspensions (for piano and Myos), and I bought a second-hand one. I like to use Balandino Di Donato’s Myo Mapper software; it sends data from the Myo, via OSC [Open Sound Control], to anywhere you want – it also has a number of scaling, smoothing and averaging functions that are very useful. I like being able to put my body into a performance, adding some humanity to my computer music; I can make it respond to my every gesture and muscle twitch. I can program it so that if I work hard, it notices! Now that they’re discontinued, I hope I can find another second-hand one, or that an open-source alternative becomes available.”
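Those scaling, smoothing and averaging functions are what make a raw sensor stream musically usable. As a rough idea of what that processing involves (and not Myo Mapper’s actual code), the sketch below applies an exponential moving average and a clamped rescale to a stream of values, such as EMG readings arriving over OSC; the input range and example numbers are made up for illustration.

```python
# Smoothing and scaling of a raw sensor stream, in the spirit of the kind of
# utilities Myo Mapper provides (exponential moving average plus rescaling).
class Smoother:
    def __init__(self, alpha=0.2):
        self.alpha = alpha        # 0..1, smaller = smoother but slower to respond
        self.value = None

    def update(self, sample):
        # Exponential moving average: ease the stored value towards each new sample.
        if self.value is None:
            self.value = float(sample)
        else:
            self.value += self.alpha * (sample - self.value)
        return self.value

def rescale(value, in_min, in_max, out_min=0.0, out_max=1.0):
    """Linearly map value from one range to another, clamped to the output range."""
    span = in_max - in_min
    t = 0.0 if span == 0 else (value - in_min) / span
    return out_min + max(0.0, min(1.0, t)) * (out_max - out_min)

# Example: smooth a noisy EMG-like stream and map it to 0..1 for a synth parameter.
smoother = Smoother(alpha=0.3)
for raw in [12, 80, 75, 90, 20, 15]:          # made-up raw values
    print(round(rescale(smoother.update(raw), 0, 128), 3))
```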
Read the original article as published on musictech.net.