BLOG

The Road to RTMV

2019 - Present

Introduction

This blog will be a representation of the progress I am making in my quest to make Real-Time Music Videos (RTMV).

My interests have lately shifted towards using my vfx skill set for out-of-the-box ideas. Years back my interests were grounded in becoming a compositor. After learning more about the 3D side of vfx I became very invested in learning how to make 3D assets from start to finish. Lookdev and texturing became my thing. This year, things have slowly leaned more towards Real-Time rendering and the possibilities of that technology. I want to tell stories through images, but if I kept going in the traditional direction of working at a vfx studio, those possibilities would narrow considerably. Real-Time technologies allow me to essentially become a one-man band, with the added benefit of directing my own stories and ideas. Good to note: I do not think a one-man band is better than an entire team, definitely not (in most cases)! My point is that I can make my own cinema, without the hassle of real-world limitations.

After experimenting a lot with basically traditional filmmaking in a virtual world, a friend showed me a music video made entirely in Unreal Engine 4. It wasn't necessarily super pretty, but it delivered the message. I thought that was really cool and my imagination started ramping up. I love music, and combining cinema with music seems like the perfect combination to me. That is what started this adventure. I am now mostly invested in experimenting with ways to make visually interesting imagery that reacts to music, and I have been doing so for the past 2-3 months. This blog will mostly cover all of the little experiments and tests I am doing while researching visual styles and ideas. Have a good read!

P.S.

This kind of started after an expedition to Germany with school. The image below is a key element to this development.

May

Unfortunately the earliest tests of mapping audio to light intensity have been lost (at least for now; I can't find the videos). I had already managed to build an audio sampler that reads any audio file you feed it. The output of the sampler could then drive anything that has a number input. In this case I decided to let it drive the intensity and position of lights. Whenever the music got more intense, the light brightness got more intense. The light also had the ability to "Bop", which means it moves up and down to the music.
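The core idea can be sketched in a few lines. This is a minimal, engine-agnostic illustration in Python, not the actual implementation: `sample_amplitude` stands in for whatever per-frame value (0.0–1.0) the audio sampler produces, and the base values are made-up defaults.

```python
def drive_light(sample_amplitude, base_intensity=500.0, base_y=3.0, bop_range=0.5):
    """Map one per-frame audio sample to a light's intensity and a vertical
    'bop' offset. Louder sample -> brighter light, higher position."""
    intensity = base_intensity * (1.0 + sample_amplitude)
    y_position = base_y + bop_range * sample_amplitude
    return intensity, y_position

# Silence leaves the light at its base values; a full-scale sample
# doubles the brightness and lifts the light by the full bop range.
intensity, y = drive_light(0.8)
```

In the engine, the same mapping runs once per frame, which is what makes the light appear to pulse and bop along with the track.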

After the early tests with lights I implemented it in a more grounded environment. The environment was built by a small team and me. The camera is moving randomly and the audio is mapped to the intensity of the sunlight. 

What sparked this adventure even more were some tests I did with a good friend. Here you can see us recording camera and motion capture live. In this particular test we decided to record a very simple music video. I was dancing to some music wearing the mocap suit (underneath my clothes) and my friend was handling the camera. For some reason it got us very excited to make a real-time music video once we had the time.

June

After the first tests I was inspired to build a system that did a sort of light show. The idea was to have a sequence of lights turn on and off at the beat of a song. 

So I started at the beginning. The system that I thought of, and that still holds today, is that each light in the sequence gets a number (lightNr). When the counter reaches that number, the corresponding light turns on and the previous one turns off. First, I had to get the counting to work. You can see that in the video to the right.

After that, it was time to map it to actual lights. This system does not change the intensity at all, it simply disables the entire light object.

There was one problem though: the system just kept on counting upwards. If I only have 5 lights, then it has to go back to the first light after the last one. This was solved by telling the counter to reset to 0 after the highest lightNr had been passed.

In the examples above, the system counted each time I pressed the "A" key. This time I told it to react to any output of the audio sampler. The sampler updated every frame, so the input was very fast. But it worked, so that was good.

Getting there! Time for a small bunker party.

Now it was time to assemble a scene with the new system. Based on the image of the garage entry (as seen at the start of this blog) I built a simple environment to test the system in. I also implemented a minimum input number so the system doesn't count up every frame, but only does so when the minimum input has been reached. Now it truly reacts to the audio.
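The minimum-input gate is a simple threshold test. Here is an illustrative Python sketch (names and values are my own, not the actual setup): only samples at or above the threshold advance the counter, so quiet frames are ignored and the sequence steps on the beats.

```python
MIN_INPUT = 0.6  # tune per song: higher means only loud hits advance the lights

def should_advance(sample_amplitude, min_input=MIN_INPUT):
    """Gate the counter: only advance when the sampler's output
    reaches the minimum input level."""
    return sample_amplitude >= min_input

# Filtering a stream of per-frame samples down to the frames that count:
frames = [0.1, 0.7, 0.2, 0.9, 0.3]
beats = [a for a in frames if should_advance(a)]
```

Tuning that one threshold per song is what turns "counts up every frame" into "truly reacts to the audio".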

From there on I just kept building the garage and adding parameters to the system, mainly a control for the number of lights that are on at any given time. The cool thing about doing this in Real-Time is that every time you run it, you can adjust the parameters live. So each run can have different light colors, a different count sensitivity, etc.
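The "number of lights on at once" parameter can be thought of as a sliding window trailing the counter. A small illustrative sketch (assumed names, not the in-engine implementation):

```python
def active_lights(counter, lights_on_count, num_lights):
    """Return the set of lightNrs that are lit: the current light plus the
    previous (lights_on_count - 1) lights, wrapping around the sequence."""
    return {(counter - i) % num_lights for i in range(lights_on_count)}

# With 5 lights, the counter at 1 and a window of 3,
# the window wraps around the end of the sequence.
lit = active_lights(1, 3, 5)
```

Because this is just one more number input, it can be tweaked live on every run, same as the colors and the count sensitivity.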

Although the system works, it only does so for that specific task; it is not very flexible. I have yet to build it more procedurally, so that the lightNrs get assigned automatically and the sequence pattern can vary. It is far from done. However, you can already get some cool results out of it in this state!

At the end of June I discovered "NONOTAK Studio", a duo that makes incredible light art installations. Their work inspired me to build something akin to it, in combination with my light system. This mainly meant building 3D light models, which makes the scene a little more realistic, since you can see the lights and what's holding them even when they're turned off. The new volumetric fog in Unity's HDRP helped set the tone.

On top of that, the emissive geometry that each light is connected to now automatically changes to the color and intensity of its corresponding light parent.

October

I recently found out about a piece of software called "TouchDesigner". I think this is the answer to what I want to achieve. TouchDesigner is a real-time interactive visuals system used for many different things, including festival light shows. It can communicate with things like DMX interfaces, making it perfect for directing lights. I have yet to dive deep into the software, but it looks very promising. More to come!
