The first Music Tech Fest took place at the end of last week and over part of the weekend at Ravensbourne in London.
The event gathered artists, performers and technologists who gave presentations and made new things on the spot to address the issues facing music as we plough on through the digital era.
One of the problems being addressed at the event was that of music discovery and how to handle the sheer volume of audio material now available on the Internet. Matthew Davies works for Inesc Porto in Portugal, researching sound and music computing. He presented RAMA, a relational artist map for visualisation and music discovery.
RAMA harnesses data from Last.fm to link together different musical artists, displaying the connections as a visualisation that, in turn, takes users to videos on YouTube. Maps of varying complexity can be created and then used to build playlists.
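RAMA's own code wasn't shown at the event, but as a rough, hypothetical sketch, the kind of Last.fm lookup an artist map like this builds on could look something like the following (the API key is a placeholder and the `requests` library is assumed to be available):

```python
import requests

API_URL = "http://ws.audioscrobbler.com/2.0/"
API_KEY = "YOUR_LASTFM_API_KEY"  # placeholder: requires a free Last.fm API account


def similar_artists(artist, limit=10):
    """Fetch artists Last.fm considers similar to the given one."""
    params = {
        "method": "artist.getsimilar",
        "artist": artist,
        "api_key": API_KEY,
        "format": "json",
        "limit": limit,
    }
    data = requests.get(API_URL, params=params).json()
    # Each entry carries a name and a 0-1 'match' score, which an artist
    # map could use as edge weights between connected nodes.
    return [(a["name"], float(a["match"]))
            for a in data["similarartists"]["artist"]]


if __name__ == "__main__":
    for name, match in similar_artists("Coldcut"):
        print(f"{match:.2f}  {name}")
```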
“There’s a wealth of information and there’s too much music, basically I just want to listen to the things I like and there are many, many ways of trying to recommend this content,” said Davies. “Sometimes I think that music recommendation can be like horoscopes, you can always find a way to link these pieces of music together and if it works for you, that’s fine.”
“There are big players that dominate at the moment, which is good on the whole for artists after the music industry crash in sales,” he continued. “It would be nice to think that with a steady growth in user base some applications will push through and maybe RAMA is one of them.”
What does sound look like?
The event provided plenty of ways for participants to be creative and work with experts. Not only were they asked to express themselves but also to consider what might be behind the design and user experience for the next music app.
Peter Kirn, a journalist, composer and media artist who runs the Create Digital Music site, presented a workshop exploring the relationship between visuals and music, with drawing as a metaphor.
He said, “When you create an application for an iPad or a new electronic music invention you have a blank slate with which to begin and you can go anywhere. So literally start with a blank piece of paper and sketch.”
From the session, it appears there were some shared interpretations of and feelings about graphics and music, and Kirn says that people were able to interpret shapes as sound in similar ways.
This could mean that designers should take into account how people react to and ‘read’ music visually when thinking about the design for their next killer app.
At the moment there appear to be two main schools of thought, as Kirn explained: “There seem to be applications that are built for space aliens, that have entirely new, inventive interfaces, and there are applications that are designed to be as familiar as possible. So for a guitarist, things look like a guitar. As people seem to enjoy both and may collect apps of both types, there is plenty of opportunity to explore those very different avenues.”
It was Peter Kirn’s writings that influenced the upcoming release of an app by record label Ninja Tune. Matt Black of Coldcut and Ninja Tune was at the event with a sneak preview of the prototype that will be available in June and should enable users to remix parts of the label’s back catalogue. There was also a jamming session where hackers at the event who had created instruments or applications played with Black as he mixed live for the audience.
Hack all of the sounds
Along with the demonstrations and talks there was a hack event where participants were asked to open their minds to computing and music to create almost anything they could think of. Ariel Elkin, of the London Music Hackspace, was running the hack camp and although he appeared to have had very little sleep, he said that events like this provided great satisfaction in enabling technologists to get projects going that might not otherwise see the light of day.
All of it was whipped up in the space of a couple of days: lightsabers were hacked to play back new sounds; circuit benders had an old CASIO keyboard open and were coaxing strange new noises from it with wires and clips; PhD students wired up a ping pong table that plays music depending on how good your game is; new apps generated music from the bpm of your movements; and a site was built for guitarists to share chords more socially.
Avi Ashkenazi, an interaction designer, was blowing minds with his presentation at the festival. He created an installation for the event which encouraged people to record vocal sounds that then became part of a musical track and were, in turn, translated into a grid of images projected onto the ceiling of the venue. He also showed a way for hip hop artists to include the crowd or a remote audience in live performances by looping recorded sounds and playing them back as part of a gig.
In future this could mean that even if you can’t travel to see your favourite artists, you can still be a part of the performance.
The creators of the Listening Machine, which we took a look at last week, were in attendance. Not only does the site translate tweets into sound, it looks set to evolve from its initial incarnation with the music of cellist Peter Gregson and the Britten Sinfonia to also allow users to include field recordings to enrich the music it can express.
Though many of the activities at Music Tech Fest seemed a little esoteric, there were plenty of products for attendees to explore. The strange new experiments that took place may have appeared wild, but these are the raw processes that may turn up in the next runaway success in music applications.
To catch the talks and presentations from the event, check back with the Music Tech Fest site, where video should be posted in the coming weeks.