I've known @pauser the promoter for some years now. We've recently been doing some AR together, and I assisted briefly in fixing textures and getting the piece below completed. It's a common theme I'm seeing that converting a 3D file to an AR-ready format like .usdz or .reality can be challenging. Some textures won't translate to AR formats properly, and the other, most common issue I've seen is file size: ultimately the file has to be streamed to a mobile device over data, so ideally it needs to be very small.
Pauser is currently working on a platform to buy and sell AR pieces via the blockchain, and has always been ahead of the curve when it comes to exploring emergent tech.
We share many of the same interests in new tech, so it seemed logical that he would feature XR artworks in his street art festival. One of my lessons from lockdown was that culture needs to be resilient. If we're going to find ourselves locked away in our homes again for the best part of half a year, we must find ways to transmit culture that aren't reliant on traditional forms, i.e. galleries and physical spaces.

When working with Dave Stitch, our starting point is usually defining the project's aims before we look at some of our longer-term ideas and ambitions. There were three locations: I'm based in the Midlands, he is down south, and the festival is in a city called Olomouc in the Czech Republic. Our starting point was to spend some time exploring Olomouc in Street View / VR.
We have now completed a number of extended reality (XR) projects together, including Cubing the Sphere at VRHAM (a Hamburg festival) earlier in the year, which brought some big lessons. The main concept we've been exploring for the last year has been composing audiovisual work primarily to be experienced in extended reality.

Cubing the Sphere,  VRHAM 2021

Exploring how sound could be warped around a 3D space, we've been inspired in part by Bernhard Leitner’s Soundcube. It is incredible to think that, with modern tools like Spoke / Hubs, his work can now be built in VR in a very short space of time.
We have chosen Spoke / Hubs as our tool to put together these VR concepts. It's a reasonably simple platform, yet we've found it offers all the spatial sound options we're looking for. The visuals are a bit limited, but that forces a focus on the audio and pushes me to be creative with the visuals.
With DENISOVA, we wanted to develop the concept further and build on the three audiovisual stems we used for VRHAM by adding one or two more. We haven't yet managed to record compositions in 8D, so we're looking at small, incremental improvements.
The concept is the same as before: the audiovisual stems are positioned relative to a central anchor point and spatial audio is configured. Within Hubs we set the audio drop-off higher whilst nudging up the volume on all audiovisual stems. This allows the public to appreciate the complete composition (with all audiovisual stems) from a central position, or to explore each audiovisual stem individually and focus on each composition.
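Hubs' spatial audio is built on the Web Audio distance models, so the trade-off above comes down to inverse-distance attenuation: a higher roll-off makes a stem fade faster with distance, and a higher source volume compensates at the centre. A minimal sketch of that model (the parameter names here are illustrative, not Hubs' actual settings):

```python
def inverse_distance_gain(distance, ref_distance=1.0, rolloff=2.0):
    """Web Audio style inverse-distance attenuation.
    A higher rolloff makes the sound drop off faster as the
    listener moves away from the source."""
    d = max(distance, ref_distance)  # no boost inside the reference radius
    return ref_distance / (ref_distance + rolloff * (d - ref_distance))

# With a high drop-off and nudged-up volume, a stem dominates close up
# while all stems blend evenly at the central anchor point.
for d in (1, 2, 4, 8):
    print(d, round(inverse_distance_gain(d), 3))
```

Walking towards one stem therefore mutes the others quickly, which is what lets a visitor "solo" a composition by movement alone.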

As the public move around this virtual space, whether in VR or AR, they effectively become part of the composition and gain some control over the output through movement alone.
This was an interesting process, more refined than our Cubing the Sphere piece but composed in a similar way, with arrhythmic and chaotic structures designed to occupy individual space while interacting harmonically with the other stems. The following is an account from Dave Stitch on how he went about composing:
The Sloup Nejsvětější Trojice spectrogram
The Sloup Nejsvětější Trojice statue in Olomouc
After spending some time scrolling through the digital streets, I started to do some light research on Olomouc. Apparently it is known for its ornate fountains and monuments.

The one object that stood out was the rather gothic-looking 'Sloup Nejsvětější Trojice' - the Column of the Holy Trinity, as Google translated it. This large structure, built as a monument to celebrate the end of the plague, strikes a dark, fractal-like pose in one of the town squares.

I captured this image and converted it into a spectrogram using a variety of software to see which result was the most effective. Spectrograms show the frequency content of sound as an image, with colour representing amplitude and frequency plotted by position on the graph.
So I converted the image of the Sloup Nejsvětější Trojice into a sound.
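Dave doesn't name the exact software he used for the inversion, but the general technique - reading a greyscale image as a magnitude spectrogram and resynthesising audio with random phase - can be sketched in a few lines of numpy (every detail here is illustrative, not his actual process):

```python
import numpy as np

def image_to_sound(img, hop=256):
    """Treat a greyscale image as a magnitude spectrogram and resynthesise
    audio: each column becomes one spectral frame (rows = frequency bins,
    brightness = amplitude), inverted with an IRFFT and overlap-added."""
    n_bins, n_frames = img.shape
    frame_len = 2 * (n_bins - 1)          # real-FFT frame length
    audio = np.zeros(hop * (n_frames - 1) + frame_len)
    window = np.hanning(frame_len)
    for t in range(n_frames):
        # The image carries magnitudes only, so phase is randomised.
        phase = np.exp(2j * np.pi * np.random.rand(n_bins))
        frame = np.fft.irfft(img[:, t] * phase, n=frame_len)
        audio[t * hop : t * hop + frame_len] += frame * window
    peak = np.max(np.abs(audio))
    return audio / peak if peak > 0 else audio

# A stand-in for the monument photo: any 2D greyscale array works.
img = np.random.rand(129, 40)
samples = image_to_sound(img)
```

The monument's vertical, fractal silhouette ends up as broadband spectral shapes, which is why the result sounds more like texture than melody.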

Testing the new TR33N drum machine out on DENISOVA

Based on the Holy Trinity aspect of the monument, I pondered what I could consider my own (less) holy trinity. I came up with:


The holy trinity of funk drumming!!  Whether I came up with this as an excuse to use my freshly made custom drum machine - the TR-33N - I cannot say, but the new drum machine beckoned.

Based on our previous experiment with 360 stems for Cubing the Sphere, I was aware that the synchronisation of the audio stems could not be guaranteed, so I had to use the drum machine to compose arrhythmic drumbeats akin to some far-out jazz. The listener would become the space between the notes.

I recorded three stems: kick, snare and hats. To keep an arrhythmic timing (hard on a drum machine), I set a 'Cronograf' clock timer to the stop position and then triggered the beats manually; this produced loops at 124 bpm that sat outside the grid.
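The manual triggering can't be reproduced exactly, but the idea of hits that share a 124 bpm pulse while never landing on the grid can be sketched like this (the jitter amount, hit counts and seeds are all made up for illustration):

```python
import random

def offgrid_beats(bpm=124, n_hits=16, jitter=0.35, seed=7):
    """Manually-triggered hits drifting around a steady pulse: each onset
    is the grid position plus a random human offset, so the three stems
    share a tempo but never sit on the grid together."""
    random.seed(seed)
    step = 60.0 / bpm  # seconds per beat at the given tempo
    onsets = [i * step + random.uniform(-jitter, jitter) * step
              for i in range(n_hits)]
    return sorted(onsets)

# One off-grid loop per stem, like the kick / snare / hats recordings.
kick = offgrid_beats(seed=1)
snare = offgrid_beats(seed=2)
hats = offgrid_beats(seed=3)
```

Because each stem drifts independently, the combined pattern never repeats the same relationship twice, which suits playback where the stems can't be kept in sync anyway.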

Due to the modular nature of the TR-33N, I was able to trigger an LFO at the same time that would affect the control voltage of various elements of the sound. The results were satisfyingly un-drum like.
After these recordings were made, I began to think about the nonlinear nature of augmented reality, and how it could be possible to create work now that could be seen in the near future, even though it doesn't exist as matter. Applying this to the sculpted monument of Sloup Nejsvětější Trojice, I then thought about the differences between our digital 3D work and traditional sculptures.

It occurred to me that the space in which we are operating is almost outside of time: the artwork has to be summoned to be witnessed. I then thought about what it would be like to show our artworks to the people of Olomouc at around the time the Sloup Nejsvětější Trojice was built.
With that in mind, Dave set about writing the music of the future to be experienced by the people of the past.  What would Olomouc's medieval forefathers have thought of modern tech and Extended Reality? 

I wanted the visuals to really punch through this time, and for the public to be able to walk into the composition and feel immersed in the artwork. As with all our projects, I let the audio inform and drive the animations; this time I wanted the public to be able to get inside these animations and see the sounds.

The theme of impressing the medieval forefathers of Olomouc was important: I wanted the composition to have a futuristic feel, yet retain some traditional ideals. With it being a street art festival, I also wanted to find shapes that complemented graffiti writing.

Early outputs from Gravity Sketch

I also spent some time visiting the Holy Trinity Column in Olomouc virtually, and it occurred to me that I could use VR within the design process. For this project, the starting point became VR. I have struggled with Tilt Brush since Google discontinued it, primarily due to the lack of import / export features. Using Gravity Sketch, however (a 3D design platform usually reserved for product designers), I was able to create spontaneous results that fitted the aesthetic we were looking for.

Using Gravity Sketch to paint in 3D - on the Oculus Quest 2

Painting in VR was incredibly liberating, and I soon developed a defined process: each audio stem is loaded and set on loop; I enter VR and Gravity Sketch; I paint in 3D to the stem for as long as I hear a sound. When the sound goes quiet, I change colour. So for most of the stems there are a few seconds of quick painting, then a colour change, then painting again, and so on. It's fast, spontaneous, intuitive and very colourful.

For most stems there were usually only 4-5 seconds of audio to paint to before I had to change colour.

The outputs from Gravity Sketch were 3D sculptures - some examples can be found below - and in some cases they were very unexpected. Whilst some looked skeletal and artificial, others had close synergies with nature and sometimes looked floral.

I found painting in VR so liberating because it's so hands-on - I was forced to physically move to the music in unexpected and unconventional ways. One of the downsides of being a digital artist is that I can feel tied to a desk at times; this process was closer to dance or even yoga than anything I've ever done creatively in the digital space.
From this design process I output six 3D sculptures, one for each audio stem. These were stripped back and converted into an AR format (.usdz). For the augmented reality component of the project, I did have to make significant compromises on quality: the sculptures had to be decimated (the polygon count reduced), and the material textures in most cases were not compatible and had to be simplified. Ultimately these need to work on a mobile device on the fly, over 4G, and the other limiting factor was that the Scavengar AR app imposed a 20 MB file size limit.
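The numbers below are guesses (the per-triangle cost and texture reserve aren't from the project), but the budgeting arithmetic behind decimating a sculpture down to a 20 MB cap looks roughly like this:

```python
def max_triangles(budget_mb=20.0, texture_mb=4.0, bytes_per_tri=50):
    """Rough budget for a streamed .usdz: after reserving space for the
    (simplified) textures, how many triangles can the mesh keep?
    bytes_per_tri is a rule-of-thumb covering positions, normals,
    UVs and indices - not an exact USD figure."""
    mesh_bytes = (budget_mb - texture_mb) * 1024 * 1024
    return int(mesh_bytes // bytes_per_tri)

def decimation_ratio(source_tris, budget_mb=20.0):
    """Target ratio to feed a decimation tool so the mesh fits the budget."""
    target = max_triangles(budget_mb)
    return min(1.0, target / source_tris)

# e.g. a dense 2M-triangle VR-painting export against a 20 MB app cap
ratio = decimation_ratio(2_000_000)
```

Even with generous assumptions, a raw VR painting has to lose most of its polygons, which is why the simplification was such a visible compromise.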

Augmented reality - audio-reactive 3D sculptures hauntingly disappear and reappear in the viewer's reality


I wanted each 3D sculpture to be completely synced with the audio. This posed an issue: the composition isn't short, so I couldn't export a 3D object with a long, perfectly synced animation from Blender. The solution was unexpected: I animated the object within Reality Composer. This not only kept the file size low, but also offered tighter control over matching the audio to the visuals.
Revisiting our original vision of impressing the medieval forefathers of Olomouc, and exploring the theme of summoning and witnessing, I decided that I wanted each sculpture to appear and disappear in sync with the audio. The tie-in to Olomouc's gothic heritage was now set with these haunting virtual sculptures. When "summoning" the AR through an iOS device and overlaying the sculptures on our reality, they became akin to ghostly apparitions, installed in different locations throughout the city of Olomouc using the Scavengar app.

Testing the AR on iOS

The next step was the VR component, which was exciting, as I knew we could achieve more than with AR. By making the same 3D sculptures react to the audio, I was again able to create separate narratives for each audiovisual stem.
I ran the 3D objects that I sculpted in VR back into After Effects and let the audio manipulate them, so the harder sounds look like they sound, and the same goes for floaty, softer sounds. In this way, the process has become a constant dialogue between audio and visuals, with both components informing each other at every step of the creative process.
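Letting audio drive a layer in After Effects typically starts by reducing the waveform to a per-frame amplitude (as its Convert Audio to Keyframes feature does), which can then drive scale or distortion. The equivalent reduction in numpy, with an arbitrary sample rate and frame rate, is a sketch like this:

```python
import numpy as np

def amplitude_envelope(samples, sr=44100, fps=30):
    """Per-video-frame RMS of the audio: louder audio produces bigger
    values, which can then drive scale, distortion or emission on the
    sculpture so hard sounds look hard and soft sounds look soft."""
    hop = sr // fps                        # audio samples per video frame
    n_frames = len(samples) // hop
    frames = samples[: n_frames * hop].reshape(n_frames, hop)
    return np.sqrt(np.mean(frames ** 2, axis=1))

# A hard, kick-like burst: the envelope starts loud and decays away.
t = np.linspace(0, 1, 44100)
kick = np.sin(2 * np.pi * 60 * t) * np.exp(-8 * t)
env = amplitude_envelope(kick)
```

Mapping `env` onto an object's scale is what makes a percussive stem visibly "hit" while a pad merely breathes.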

Converting DENISOVA (Echo Bass) to an equirectangular 360 degree clip, for VR.

The VR component became known as DENISOVA VOID, named after the Denisova square in Olomouc: presenting a dystopian glimpse of Olomouc at an unknown point in the future. In the same way that Dave had folded the audio to create the sounds of a digital water fountain, I used a simple particle system in Spoke to emulate the water fountain in Denisova square.

Drawing on the learnings from our last project - just as Dave had learnt that he needed to compose asynchronous beats - I knew that I needed to output the visuals as 360 degree equirectangular video for full immersion.
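Equirectangular video gives full immersion because a single frame wraps the entire sphere around the viewer: horizontal position is longitude, vertical position is latitude. A minimal sketch of that mapping from a view direction to a pixel (using a -z = forward convention, which is an assumption, not a claim about any particular tool):

```python
import math

def direction_to_equirect(x, y, z, width, height):
    """Map a 3D view direction onto an equirectangular frame: longitude
    spans the full width, latitude the full height, so wherever the
    viewer looks there is a pixel for that direction."""
    lon = math.atan2(x, -z)                              # -pi .. pi
    lat = math.asin(y / math.sqrt(x*x + y*y + z*z))      # -pi/2 .. pi/2
    u = (lon / math.pi + 1.0) * 0.5 * width
    v = (0.5 - lat / math.pi) * height
    return u, v

# Looking straight ahead lands at the centre of a 4096x2048 frame.
u, v = direction_to_equirect(0.0, 0.0, -1.0, 4096, 2048)
```

The heavy stretching near the poles is why most of the visual detail is best kept around the horizon band of the frame.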
Testing out DENISOVA VOID at David C Hughes’ VR night at Granby St in Leicester, as part of Leicester Design Season.

DENISOVA being user tested at the Interact Digital Arts VR night

Find out more /  try the AR / VR out on the main project page:
