The digital age has opened many new doors for music and music production. Nowadays, just about anyone can produce a track with nothing more than a laptop and a pair of headphones. This revolution began with the advent of the digital audio workstation (DAW for short): a software program modelled after traditional recording equipment. Examples include GarageBand, Logic Pro, Ableton Live, and Pro Tools. These programs allow for limitless creativity in music production and sound design, and DAW technology has advanced to the point where the software can interface with external devices that act as “controllers,” or data inputs. This opens some exciting new doors for innovation.
One such area of innovation is brainwave-controlled music. Imagine the day when we can sit down at a computer and create music simply by thinking it! It would certainly remove a lot of barriers for people, especially those who do not know how to play an instrument, or who cannot play one because of physical limitations. The concept of brainwave-controlled music is now closer than ever thanks to consumer-grade, portable brainwave headsets such as Muse, Emotiv, and NeuroSky. These headsets can be integrated with a digital audio workstation, and their brainwave data can be used to control different parameters in the music production software, such as volume, reverb, and delay.
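To make the idea concrete, here is a minimal sketch of what that mapping might look like. The source does not describe the actual integration, so this is purely illustrative: it assumes the headset yields a normalized band-power reading between 0 and 1, and converts it to the 0–127 range that MIDI Control Change messages use, which a DAW could then assign to a parameter like reverb amount. The function name and value ranges are assumptions, not part of any headset's real API.

```python
def band_power_to_cc(power: float, lo: float = 0.0, hi: float = 1.0) -> int:
    """Map a normalized EEG band-power reading onto a MIDI CC value (0-127).

    `power` is a hypothetical headset reading; `lo`/`hi` define the input
    range it is expected to fall within. Out-of-range readings are clamped
    so the DAW never receives an invalid controller value.
    """
    clamped = max(lo, min(hi, power))
    return round((clamped - lo) / (hi - lo) * 127)
```

In a real setup, values like these would be sent to the DAW over MIDI or OSC and mapped to a parameter there; this sketch only covers the scaling step.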
Angie C, an artist and musician from Canada, plans to explore the concept of brainwave-controlled music through her artistry and pair it with wearable-technology fashion (think light-up dresses and outfits with integrated technology). Her first proof-of-concept project will be showcased at this year’s MakeFashion Wearable Technology Gala in Calgary, Alberta, Canada. Her project, named Opus Minerva, will combine brainwave-controlled music with a performance outfit equipped with LED lights that change colour and pulsate in response to the music: essentially an outfit that acts as a visual equalizer. The brainwave-controlled music component will be driven by the Muse brainwave headset and Ableton Live. The idea is that brainwave data from the Muse headset will trigger particular instrumental “loops” in the music based on the wearer’s brainwave state. So, for example, if a person’s brain is more active, the music may change to include more drums, or if their brain is more relaxed, some of the percussion may be eliminated from the music to provide a calmer musical effect. The changes in the music will, in turn, be reflected visually on Angie’s performance outfit.
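The "more drums when active, fewer when relaxed" logic described above could be sketched as a simple threshold switch. Since the project's actual implementation is not detailed in the source, the class below is a hypothetical illustration: it assumes a single normalized "activity" score and uses two thresholds (hysteresis) so the drum loop does not rapidly toggle on and off when the signal hovers near a single cutoff. All names and threshold values here are assumptions for the sake of the example.

```python
class LoopSwitcher:
    """Decide whether the drum loop should play, based on a hypothetical
    normalized brain-activity score in [0, 1].

    Two thresholds provide hysteresis: drums turn on above `on_thresh`
    and only turn off again below `off_thresh`, preventing flicker.
    """

    def __init__(self, on_thresh: float = 0.6, off_thresh: float = 0.4):
        self.on_thresh = on_thresh
        self.off_thresh = off_thresh
        self.drums_on = False

    def update(self, activity: float) -> bool:
        """Feed in the latest activity reading; return the loop state."""
        if not self.drums_on and activity >= self.on_thresh:
            self.drums_on = True          # listener became active: add drums
        elif self.drums_on and activity <= self.off_thresh:
            self.drums_on = False         # listener relaxed: drop percussion
        return self.drums_on
```

In a live rig, the returned state would trigger or mute a clip in the DAW, and the same signal could drive the outfit's LEDs so light and sound stay in sync.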
If you would like to follow along and learn more about this project, please like and follow Angie C on social media: