MusicEyes

Released: 2019
Client: MusicEyes
Agency: North Kingdom
Tags: three.js, webgl, vuejs

See what you hear

MusicEyes is a visual method for learning and understanding the inner workings of music and compositions. Students create visualizations by controlling shapes, colors, the camera and effects directly in the browser.

A huge amount of creativity and learning has gone into this product, and it includes loads of shaders and effects. Techniques learned and used along the way include optimizing rendering by reducing draw calls and batching notes together to keep playback performant, multi-track sync and mixing, educational games, live performances with external physical controllers, communication with the editor, and much more.
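
The batching idea can be sketched roughly like this (a minimal sketch, not the actual MusicEyes code): all notes of a track share one geometry and material and are drawn as a single instanced mesh in three.js, so the whole track costs one draw call. The note data and layout below are made up for illustration.

```js
import * as THREE from 'three';

// Illustrative note data; in MusicEyes this comes from the score.
const notes = [
  { time: 0.0, duration: 0.5, pitch: 60 },
  { time: 0.5, duration: 0.5, pitch: 64 },
  { time: 1.0, duration: 1.0, pitch: 67 },
];

// One shared geometry + material; all notes render as a single instanced mesh.
const geometry = new THREE.BoxGeometry(1, 1, 1);
const material = new THREE.MeshBasicMaterial({ color: 0xffffff });
const noteMesh = new THREE.InstancedMesh(geometry, material, notes.length);

const matrix = new THREE.Matrix4();
notes.forEach((note, i) => {
  // One possible layout: time on x, pitch on y, duration as x-scale.
  matrix.makeScale(note.duration, 1, 1);
  matrix.setPosition(note.time, note.pitch * 0.1, 0);
  noteMesh.setMatrixAt(i, matrix);
});
noteMesh.instanceMatrix.needsUpdate = true;

const scene = new THREE.Scene();
scene.add(noteMesh); // the whole track is one draw call
```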

I created the WebGL player in three.js and came up with all the note shapes, effects and scripting to make it all come together. A true passion project that I have had the opportunity to work on for many years.

Rendering styles

There is a wide range of expressions to choose from: 3D geometry, flat-coloured 2D shapes, and hand-drawn (or rather shader-drawn) looks like paint, pencil and aquarelle. Every style has different rendering options depending on where a note is in time: before, during and after being played. This is used to build anticipation for incoming notes and to hide or reduce focus on less important ones.
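
As a simplified sketch of the before/during/after idea (the real styles expose far more options, and the fade values here are purely illustrative):

```js
// Classify a note relative to the playhead; each state can drive
// its own opacity, color or scale in the chosen rendering style.
function noteState(note, playheadTime) {
  if (playheadTime < note.time) return 'before';                 // approaching
  if (playheadTime < note.time + note.duration) return 'during'; // sounding
  return 'after';                                                // already played
}

// Example: highlight sounding notes, let approaching notes grow in,
// and reduce focus on notes that are done.
function noteOpacity(note, playheadTime) {
  const state = noteState(note, playheadTime);
  if (state === 'during') return 1.0;
  if (state === 'before') {
    const distance = note.time - playheadTime;
    return Math.max(0.15, 1.0 - distance * 0.25); // grows as the note approaches
  }
  return 0.3;
}
```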

Tracks and notes are laid out by pitch and duration by default, but the layout can be changed with offsets or different paths like arcs, curves or spirals.
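
One way such a path mapping could look, as a hedged sketch (the layout functions and constants are my own illustration, not taken from the product):

```js
import * as THREE from 'three';

// Default layout: time on x, pitch on y.
function linearLayout(note) {
  return new THREE.Vector3(note.time, note.pitch * 0.1, 0);
}

// Alternative layout: place notes along a spiral, with pitch nudging the radius.
function spiralLayout(note, turnsPerSecond = 0.25) {
  const angle = note.time * turnsPerSecond * Math.PI * 2;
  const radius = 5 + note.pitch * 0.05;
  return new THREE.Vector3(Math.cos(angle) * radius, Math.sin(angle) * radius, 0);
}
```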

The camera can also be fully controlled, either with presets or custom controls.
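A camera preset can be thought of as a stored position and look-at target that the camera eases toward; this is only a sketch of the concept, not the actual implementation.

```js
import * as THREE from 'three';

// A camera "preset" as a position + look-at target pair (illustrative values).
const presets = {
  overview: { position: new THREE.Vector3(0, 10, 30), target: new THREE.Vector3(0, 0, 0) },
  closeup:  { position: new THREE.Vector3(0, 2, 6),   target: new THREE.Vector3(0, 1, 0) },
};

const camera = new THREE.PerspectiveCamera(50, 16 / 9, 0.1, 1000);
const currentTarget = new THREE.Vector3();

// Ease the camera toward the active preset a little each frame.
function updateCamera(presetName, smoothing = 0.05) {
  const preset = presets[presetName];
  camera.position.lerp(preset.position, smoothing);
  currentTarget.lerp(preset.target, smoothing);
  camera.lookAt(currentTarget);
}
```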

So much could be mentioned here, but I'll let you check out some real animations instead.

Ludwig van Beethoven - Symphony No. 5 - I. Allegro con brio

Put your best headphones on and immerse yourself in this composition. Seeing future notes approach adds a layer of understanding that you can't get from listening alone.

It's rendered in realtime, and each note is perfectly synced to the music. Try scrubbing the song by dragging anywhere on the screen, or use the timeline navigation.
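
The sync and scrubbing can be illustrated with a small sketch: the visuals read the playhead from the Web Audio clock rather than from animation frames, so they never drift, and a drag simply moves that playhead. The constants and wiring here are assumptions for illustration only.

```js
// Read the playhead from the Web Audio clock so the visuals never drift.
const audioCtx = new AudioContext();
let startOffset = 0;   // song position (seconds) when playback last started
let startedAt = 0;     // audioCtx.currentTime at that moment
let playing = false;

function currentSongTime() {
  return playing ? startOffset + (audioCtx.currentTime - startedAt) : startOffset;
}

// Scrubbing: a drag moves the playhead (and the audio seeks with it).
function onDrag(deltaPixels, secondsPerPixel = 0.02) {
  startOffset = Math.max(0, currentSongTime() + deltaPixels * secondsPerPixel);
  startedAt = audioCtx.currentTime;
  // ...seek or reschedule the audio source(s) to startOffset here
}

function renderFrame() {
  const t = currentSongTime();
  // ...update note states, camera and effects from t, then render the scene
  requestAnimationFrame(renderFrame);
}
requestAnimationFrame(renderFrame);
```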

Live performances

The player is also used in live performances. A question I often get is whether the animation could be synced automatically to the orchestra, for example with realtime audio analysis. But that is very hard to do when the music isn't played to a fixed beat or a metronome click. In an orchestra the tempo can change at any time, even stop or shift speed in the middle of a beat; the conductor has total control over the tempo, and it's a living, dynamic medium. For the visuals to stay perfectly synced, the only viable alternative is to play them like an instrument, which is done with a USB controller called the "crank" or a Kinect controller, as in the clip below.
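
Conceptually, playing the visuals like an instrument means no clock drives the playhead at all: each controller event nudges the score forward. The handler below is a hypothetical sketch; the actual controller wiring isn't shown here.

```js
// In live mode no clock drives the playhead: every controller event
// advances the score a little, so the operator can follow the conductor.
let playheadBeats = 0;

// Hypothetical handler for one "crank" tick; the real input could come
// from MIDI, the Gamepad API or a Kinect gesture.
function onCrankTick(beatsPerTick = 0.125) {
  playheadBeats += beatsPerTick;
}

function renderFrame() {
  // ...render the score at playheadBeats; tempo is implicit in how
  // fast the performer turns the crank
  requestAnimationFrame(renderFrame);
}
requestAnimationFrame(renderFrame);
```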

There are more impressive music visualisations made with advanced video tools, but it's hard to match the experience of perfectly synced visuals that show every note played by a full symphonic orchestra. It really deepens the connection to what you hear, like graphical subtitles for something complex.

[Clip: live performance played with a Kinect controller]

The Editor

Watch this tutorial to see how users create their animations.

Games

A couple of educational games were developed to teach the music in a playful way: a SingStar-like sing-along game that detects your pitch, and a match-the-beat game played along with MIDI or the keyboard. In workshops, users can also create customized quizzes that highlight, solo and control playback of notes to ask specific questions about a score.
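
To give a rough idea of how pitch detection for a sing-along game could work, here is a naive autocorrelation sketch on top of the Web Audio API. The real detector and game logic are more involved; nothing below is taken from the actual code.

```js
// Microphone input routed into an analyser node.
const audioCtx = new AudioContext();
const analyser = audioCtx.createAnalyser();
analyser.fftSize = 2048;

navigator.mediaDevices.getUserMedia({ audio: true }).then((stream) => {
  audioCtx.createMediaStreamSource(stream).connect(analyser);
});

const buffer = new Float32Array(analyser.fftSize);

// Naive autocorrelation: pick the lag where the signal best matches itself.
function estimatePitch() {
  analyser.getFloatTimeDomainData(buffer);
  let bestLag = -1;
  let bestCorrelation = 0;
  for (let lag = 40; lag < 1000; lag++) {      // roughly 44 Hz to 1100 Hz at 44.1 kHz
    let correlation = 0;
    for (let i = 0; i + lag < buffer.length; i++) {
      correlation += buffer[i] * buffer[i + lag];
    }
    if (correlation > bestCorrelation) {
      bestCorrelation = correlation;
      bestLag = lag;
    }
  }
  return bestLag > 0 ? audioCtx.sampleRate / bestLag : null; // frequency in Hz
}

// The game would then compare estimatePitch() against the expected note.
```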