This is my Bachelor's degree project in design, from the Bergen National Academy of the Arts.
When writing this, I'm still reeling after my presentation and the great feedback from my examiners. They thought it was a "really, really good project", and they even used the word "wise".
The project was also selected by the dean to be shown at the annual school year commencement ceremony.
Now, enough tooting my own horn; this is what it's about:

The problem I wished to attack was the lack of visual communication in most of today's electronic live acts. The artist usually stands on stage with a laptop, bobbing their head, and that's it.

In acoustic music, even in the worst-case scenario, there is at least some element of performance, since the instruments don't play themselves. In electronic music, unfortunately, they do.

So how can we bring the performance back into electronic music?

Short answer: I expanded the physical space into a virtual one, and brought back the physicality of the performance through the magic of technology.

Long answer: Watch this video presentation.
If you just want the gist, this short visualization video should give you the idea:
The following video is in many ways the crescendo of the whole project: a working example of one song being played with this system.
Throughout the concert, the visual expressions can change dramatically. But this is how one song can look. The music is one of my own tracks, "Three tails".

Logotype sketches for the sub-systems Virtual artists and The grid.
Visualization of the workflow in both audio and video
Early inspiration: Tron
The project involved a lot of greenscreening. Hilarious times.
A quick reel of some of the process
Various visual examples of implementation
I'll be working on different implementation methods in the months ahead, to make this happen on more stages. If you want the inside scoop, have a look at my website, get on the mailing list, or follow me on Twitter!