I've known @pauser the promoter for some years now. We've recently been doing some AR together, and I assisted briefly in fixing textures and getting the piece below completed. A common theme I'm seeing is that converting a 3D file to an AR-ready format like .usdz or .reality can be challenging. Some textures won't translate to AR formats properly, and the other, most common issue I've seen is file size: ultimately the file has to be streamed to a mobile device over data, so ideally it needs to be very small.
When working with Dave Stitch, our starting point is usually defining the project's aims before we look at some of our longer-term ideas and ambitions. There were three locations: I'm based in the Midlands, he is down south, and the festival is in a city called Olomouc in the Czech Republic. Our first step was to spend some time exploring Olomouc in Street View / VR.
As the public move around this virtual space, whether in VR or AR, they effectively become part of the composition and gain some control over the output through movement alone.
The one object that stood out was the rather gothic-looking 'Sloup Nejsvětější Trojice' - the Column of the Holy Trinity, as Google translated it. This large structure, built as a monument to celebrate the end of the plague, strikes a dark and fractal-like pose in one of the town squares.
I captured this image and converted it into a spectrogram using a variety of software to see which result was the most effective. Spectrograms show the frequency content of sound as an image, with colour representing amplitude and frequency plotted by position on the graph.
So I converted the image of the Sloup Nejsvětější Trojice into a sound.
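To give a flavour of what's happening under the hood, here is a minimal Python sketch of one common way to turn an image into sound: treat each row of pixels as the amplitude envelope of a sine partial and sum them up. This is not the actual software I used, and the function name, frequency range and the random stand-in image are all illustrative.

```python
import numpy as np

def image_to_sound(image, sr=22050, duration=4.0, f_min=200.0, f_max=8000.0):
    """Render a greyscale image (2-D array, rows = frequency, cols = time)
    as audio by additive synthesis: each row drives one sine partial whose
    amplitude over time follows that row's pixel brightness."""
    n_rows, n_cols = image.shape
    n_samples = int(sr * duration)
    t = np.arange(n_samples) / sr
    # Map image rows (top = high frequency) onto a log-spaced frequency axis.
    freqs = np.geomspace(f_max, f_min, n_rows)
    # Stretch each row's brightness envelope to cover the full sample length.
    col_idx = np.minimum(np.arange(n_samples) * n_cols // n_samples, n_cols - 1)
    audio = np.zeros(n_samples)
    for row, f in zip(image, freqs):
        envelope = row[col_idx]
        audio += envelope * np.sin(2 * np.pi * f * t)
    audio /= np.max(np.abs(audio))  # normalise to [-1, 1]
    return audio

# Hypothetical stand-in for the photograph: a small random greyscale array.
img = np.random.default_rng(0).random((64, 128))
sound = image_to_sound(img)
```

Played back through a spectrogram viewer, audio generated this way paints the original image back onto the time-frequency plane, which is why the technique translates architecture into such eerie, inharmonic textures.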
The holy trinity of funk drumming!! Whether I came up with this as an excuse to use my freshly made custom drum machine - the TR-33N - I cannot say, but the new drum machine beckoned.
Based on our previous experiment with 360 stems for Cubing the Sphere, I was aware that the synchronization of each audio stem could not be guaranteed, so I had to use the drum machine to compose arrhythmic drumbeats akin to some far-out jazz. The listener would become the space between the notes.
I recorded three stems: kick, snare and hats. To keep an arrhythmic timing (hard on a drum machine), I set a 'Cronograf' clock timer to the stop position and then triggered the beats manually; this produced loops at 124 bpm that sat outside a fixed grid.
Due to the modular nature of the TR-33N, I was able to trigger an LFO at the same time that would affect the control voltage of various elements of the sound. The results were satisfyingly un-drum like.
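The idea of an LFO sweeping a control voltage can be sketched in a few lines of Python. This is only an emulation of the concept, not the TR-33N's circuitry; the function name, rate and depth values, and the noise-burst "drum hit" are all hypothetical.

```python
import numpy as np

def apply_lfo(signal, sr=44100, lfo_rate=0.5, depth=0.8):
    """Emulate a slow LFO modulating a control voltage: scale the signal's
    amplitude with a low-frequency sine, the way a CV input sweeps a VCA."""
    t = np.arange(len(signal)) / sr
    # The LFO oscillates in [1 - depth, 1]; depth=1 would fully close the 'VCA'.
    lfo = 1.0 - depth * (0.5 + 0.5 * np.sin(2 * np.pi * lfo_rate * t))
    return signal * lfo

# Hypothetical drum hit: a decaying burst of noise.
sr = 44100
hit = np.random.default_rng(1).standard_normal(sr) * np.exp(-np.linspace(0, 8, sr))
modulated = apply_lfo(hit, sr=sr)
```

On real modular hardware the same LFO can be patched to pitch, filter cutoff or decay instead of amplitude, which is what pushes the results away from recognisable drum sounds.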
It occurred to me that the space in which we are operating is almost outside of time; the artwork has to be summoned to be witnessed. I then thought about what it would be like to show our artworks to the people of Olomouc at around the time that the Sloup Nejsvětější Trojice was built.
With that in mind, Dave set about writing the music of the future to be experienced by the people of the past. What would Olomouc's medieval forefathers have thought of modern tech and Extended Reality?
I want the visuals to really punch through this time and for the public to be able to walk into the composition and feel like they are immersed in the artwork. As with all our projects, I allow the audio to inform and drive the animations - this time I wanted the public to be able to get inside these animations - and see the sounds.
The theme of impressing the medieval forefathers of Olomouc was important - I wanted the composition to have a futuristic feel, yet retain some traditional ideals. With it being a street art festival, I also wanted to find shapes that complemented graffiti writing.
I found painting in VR liberating as it's so hands-on - I was forced to physically move to the music in unexpected and unconventional ways. One of the downsides of being a digital artist is that I can feel tied to a desk at times - this process was closer to dance or even yoga than anything I've ever done creatively in the digital space.
I wanted each 3D sculpture to be completely synced with the audio. This posed an issue: the composition isn't short, so I couldn't export a 3D object with a long, perfectly synced animation from Blender. The solution was unexpected: I animated the object within Reality Composer. This allowed me not only to keep the file size low, but also to retain tighter control over matching the audio to the visuals.
Building on the learnings from our last project - where Dave had learnt that he needed to compose asynchronous beats - I knew that I needed to output the visuals as 360-degree equirectangular video for full immersion.
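An equirectangular frame simply unwraps the viewing sphere: longitude maps across the full width of the image and latitude down its height, which is why a 360 video player can look in any direction. As a rough sketch of that mapping (frame dimensions and axis conventions here are illustrative, not those of any particular renderer):

```python
import numpy as np

def direction_to_equirect(x, y, z, width=4096, height=2048):
    """Map a 3-D viewing direction to pixel coordinates in an
    equirectangular frame: longitude spans the width, latitude the height."""
    lon = np.arctan2(x, z)                              # -pi .. pi
    lat = np.arcsin(y / np.sqrt(x*x + y*y + z*z))       # -pi/2 .. pi/2
    u = (lon / (2 * np.pi) + 0.5) * width
    v = (0.5 - lat / np.pi) * height
    return u, v

# Looking straight ahead (+z) lands at the centre of the frame.
u, v = direction_to_equirect(0.0, 0.0, 1.0)
```

Running the inverse of this mapping per output pixel is how a renderer bakes a 3D scene into the flat 360 video that headsets then re-project around the viewer.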