VR meetup – Creative Storytelling through VR

We have been talking about VR and similar technologies and what they mean for the average consumer for quite some time now. As a medium, VR has the potential to transform media consumption as we know it. It already has an obvious use case in gaming, and a few innovative tinkerers and studios have started realising its potential through 360° VR films as well.


What does it mean for traditional advertising, though? Could an enhanced immersive experience be used to increase engagement with brands through interactive storytelling? Market research suggests 360-degree video engages people far more than traditionally filmed content: audiences watch panoramic videos five times longer and are seven times more likely to share them. Research also points to more positive ad impressions, something Microsoft is exploring with its “Ad Pano” format. Interactive panoramic ads have been found to increase product purchase intent by 65% over standard ads. (Source)

To explore such questions and deliberate on the potential of VR for digital agencies, we will be attending a meetup on Monday 27th April organised by London-based Happy Finish, a global creative production studio specialising in high-end retouching, CGI, animation, motion grading and VFX for global brands, advertising agencies and photographers. We will be giving a talk and taking part in a panel examining the advertising and commercial opportunities that VR promises to create.

A few tickets are still available and can be purchased here.


Björk – Stonemilker VR


This weekend will see the preview launch of Stonemilker, the haunting first track of Björk’s worldwide hit album Vulnicura, in 360° virtual reality and powered by 3Dception’s real-time 3D audio rendering. The video will be previewed at MoMA PS1 as well as Rough Trade stores in the US and UK. This comes just weeks after our announcement of 3Dception’s use in Songlines, currently on show at MoMA, New York.

In Björk’s words:

i am also incredibly proud to offer a premier of stonemilker . with 360 3D sound mix for virtual reality headset.

this came about as a spontaneous fruit of mine and andrew huang’s collaboration . we had already done black lake , the “family” moving album cover and the black lake “book cover” trailer and then found us in iceland one day with nothing to do and a 360 camera lying about . we discussed its potential for intimacy and andrew then suggested we take it to the beach where the song was written . it immediately rang true for me as that location has a beautiful 360 panoramic view which matches the cyclical fugue like movement in the song . if the song has a shape it is sort of like a circle that just goes on forever .
i had recorded the strings with a clip on mike on each instrument . we have made a different mix where we have fanned this in an intimate circle around the listener .
so as you watch this in the virtual reality headset it will be as if you are on that beach and with the 30 players sitting in a circle tightly around you.

To present such a unique experience, we designed a completely re-imagined workflow that blends the worlds of linear sound design and interactive audio.  Björk’s vision of making the user a part of her world in Iceland, where the film was shot, compelled us to push the envelope of what is possible on mobile in VR. In collaboration with talented teams across continents, we worked towards building a new audiophile VR player which marries the world of real-time binaural audio with high definition visuals. All of this, just to bring you an experience you’ve never seen or heard before.

The video was directed by Andrew Thomas Huang, audio remixed for VR by Chris Pike and the mobile VR delivery platform was integrated and pieced together by Third Space Agency (3SA) and digital production specialists Rewind. Visitors to MoMA and Rough Trade will be shown the Stonemilker film on the new cross platform Freefly virtual reality headset developed by Proteus VR Labs.

Björk


The Museum of Modern Art presents a retrospective of the multifaceted work of composer, musician, and singer Björk. The exhibition draws from more than 20 years of the artist’s daring and innovative projects and her eight full-length albums to chronicle her career through sound, film, visuals, instruments, objects, and costumes.

On the third floor, Songlines presents an interactive, location-based audio experience through Björk’s albums, with a biographical narrative that is both personal and poetic.

We’re honoured and proud to be involved in Songlines, with 3Dception powering the location-based augmented audio experience. We’ve worked with a fantastic team across the world to ensure that the whole experience runs exceptionally well and sounds amazing off an iPod Touch. Yes, an iPod Touch. The exhibition runs from March 8–June 7, 2015. Don’t miss it if you are in New York! Get in touch for press or other enquiries.

This:

Only the 3Dception engine could provide us with the audiophile quality and ease of integration our creative and technical approach for the Björk Songlines project demanded.

The Two Big Ears team and their audio programming expertise were key in our ability to provide visitors to Björk’s show the revolutionary head tracking psychoacoustic spatialised audio experience the artist had visualised for presenting her pure audio work to MoMA audiences.

– Andrew Melchior, Third Space Agency,  Producer of Björk Songlines @ MoMA

V1.0 And A Preview

3Dception Unity is out of beta! We’ve got lots of announcements to make, starting with this post (noticed the new website?) and more scheduled over the coming weeks.

We launched the alpha version of 3Dception Unity back in April 2014, initially supporting desktops, and have since expanded support to mobile, Wwise, Fabric and native SDKs. We were fortunate to grow a sizeable user base very quickly. Every project, bug report, piece of feedback, email and tweet has helped us make 3Dception a better product. Thank you — each and every one of you.

3Dception Unity v1.0 is out today, with 3Dception Wwise v1.0 following shortly. Here’s a summary of what’s new, with a peek at some of the stuff you should see in upcoming updates:

Ambisonics

With many of our users developing 360/VR movies and video, we decided to include ambisonic B-format decoding within 3Dception. Just as you would set up a 3Dception Source, you can create a 3Dception Ambi Array, drop in your ambisonic recordings, and you should have a fully binaural soundfield responding to the orientation of the listener, with all the normal playback functionality that is available within Unity. We’ve seen and heard some really great results so far.
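
To make the idea concrete, here is a minimal Python sketch (not 3Dception’s code; the function name and conventions are purely illustrative) of the core step behind a head-tracked ambisonic soundfield: counter-rotating a first-order B-format signal against the listener’s yaw before it is decoded to binaural. Sign conventions vary with channel ordering and coordinate system, so treat the directions as an assumption.

```python
import numpy as np

def rotate_bformat_yaw(w, x, y, z, yaw_rad):
    """Counter-rotate a first-order B-format block (W, X, Y, Z) about the
    vertical axis so the soundfield stays world-locked as the listener
    turns by yaw_rad (positive = head turned left, X = front, Y = left)."""
    c, s = np.cos(yaw_rad), np.sin(yaw_rad)
    # W (omni) and Z (height) are unchanged by a pure yaw rotation;
    # X and Y rotate together like a 2D vector.
    x_rot = c * x + s * y
    y_rot = -s * x + c * y
    return w, x_rot, y_rot, z
```

A real renderer would follow this with a binaural decode, for example by feeding the rotated channels to virtual loudspeakers convolved with HRTFs; that stage is omitted here.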

Optimisations

v1.0 includes many internal CPU and memory optimisations, with a 6x decrease in the memory footprint of a single source compared to the previous version. This makes it possible to run 3Dception even on older mobile devices — including the iPod Touch and Android devices from a few years ago.

Geometry Based Spatialised Reflections

Very soon 3Dception will include support for auto-generated spatialised reflections derived from scene geometry. For more complex environments you no longer need to draw complicated portals or room zones. We’ve just started previewing this with a select few developers and should be releasing it publicly very soon. We’ve been working on this tech for a while and can’t wait to show it off! As usual, it comes with the same efficiency, so you can achieve some really cool results even on mobile devices 🙂
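
As a rough illustration of how geometry-driven early reflections are often computed (the image-source method; we are not describing 3Dception’s actual implementation, and the names and values below are assumptions), here is a sketch that mirrors a source across one reflecting surface and derives the delay and gain of that reflected path:

```python
import numpy as np

SPEED_OF_SOUND = 343.0  # m/s at roughly 20 °C

def first_order_reflection(source, listener, plane_point, plane_normal,
                           absorption=0.3):
    """Mirror a source across one infinite reflecting plane (defined by a
    point on it and its unit normal) and return the image-source position,
    the arrival delay and a simple gain for that reflected path."""
    n = plane_normal / np.linalg.norm(plane_normal)
    d = np.dot(source - plane_point, n)     # signed distance source -> plane
    image = source - 2.0 * d * n            # mirrored ("image") source
    path_len = np.linalg.norm(image - listener)
    delay_s = path_len / SPEED_OF_SOUND
    gain = (1.0 - absorption) / max(path_len, 1.0)  # 1/r spreading + surface loss
    return image, delay_s, gain

# Hypothetical scene: a wall 5 m in front of the listener, source 2 m away.
src = np.array([2.0, 0.0, 1.5])
lst = np.array([0.0, 0.0, 1.5])
image, delay, gain = first_order_reflection(
    src, lst,
    plane_point=np.array([5.0, 0.0, 0.0]),
    plane_normal=np.array([-1.0, 0.0, 0.0]))
print(f"image at {image}, arrives {delay * 1000:.1f} ms after emission, gain {gain:.2f}")
```

Each image source found this way can then be spatialised binaurally from its mirrored position, which is what makes the reflections themselves directional rather than a generic reverb wash.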

Fast Path-Finding Occlusion/Obstruction

Also coming soon is a very fast path-finding occlusion and obstruction system, not just for direct sound but for reflected sound too! When combined with Doppler, binaural spatialisation and the geometry-based system, the soundscape achieves a whole new level of realism. Everything feels a lot more natural. It also means that you can spend more time on creative decisions rather than setting up complicated systems. More work for our engine, less work for you.
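
For intuition, here is a toy sketch of the general idea behind path-finding occlusion, not 3Dception’s algorithm: when the direct line between source and listener is blocked, estimate the shortest detour through open space (here a hand-built graph solved with Dijkstra) and use the extra path length to drive attenuation and delay. The scene, node names and distances are all hypothetical.

```python
import heapq
import math

def shortest_path_length(graph, start, goal):
    """Dijkstra over a weighted graph {node: [(neighbour, distance), ...]}."""
    best = {start: 0.0}
    queue = [(0.0, start)]
    while queue:
        cost, node = heapq.heappop(queue)
        if node == goal:
            return cost
        if cost > best.get(node, math.inf):
            continue
        for neighbour, edge in graph[node]:
            new_cost = cost + edge
            if new_cost < best.get(neighbour, math.inf):
                best[neighbour] = new_cost
                heapq.heappush(queue, (new_cost, neighbour))
    return math.inf

# Hypothetical scene: a wall blocks the direct line, so sound has to travel
# around via a doorway node. Edge weights are straight-line distances in metres.
graph = {
    "source":   [("doorway", 4.0)],
    "doorway":  [("source", 4.0), ("listener", 3.0)],
    "listener": [("doorway", 3.0)],
}
direct_m = 5.0                                                # straight through the wall
detour_m = shortest_path_length(graph, "source", "listener")  # 7.0 m around it

# The longer detour translates into extra distance attenuation and delay;
# a real system would typically also low-pass the signal to mimic diffraction.
level_change_db = 20.0 * math.log10(direct_m / detour_m)
print(f"detour {detour_m:.1f} m, level change {level_change_db:.1f} dB")
```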

Environment Modelling

With this release we’ve begun to expose some of the internal physics parameters within 3Dception. For example, you can set the speed of sound to completely unrealistic values to get some really cool Doppler and early reflection effects. We will be expanding this system to include air absorption, temperature and humidity modelling, making it easier to model highly realistic or absolutely bizarre worlds.
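
To see why an unrealistic speed of sound makes the Doppler effect so dramatic, here is the textbook relation for a source moving toward a stationary listener; the numbers are illustrative and this is not a claim about 3Dception’s internal model:

```python
def doppler_shift(frequency_hz, source_speed_ms, speed_of_sound_ms):
    """Perceived frequency of a source moving straight toward a stationary
    listener: the classic relation f' = f * c / (c - v)."""
    return frequency_hz * speed_of_sound_ms / (speed_of_sound_ms - source_speed_ms)

# A 440 Hz source approaching at 20 m/s:
print(doppler_shift(440.0, 20.0, 343.0))  # ~467 Hz with a realistic speed of sound
print(doppler_shift(440.0, 20.0, 60.0))   # ~660 Hz if the speed of sound drops to 60 m/s
```

Lowering the speed of sound makes even slow-moving objects sweep through large pitch shifts, which is exactly the kind of exaggerated effect described above.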

User Accounts

We’re upgrading our release and delivery pipeline. If you are a 3Dception user, you should receive login details to our new account management system over the next few days. This should make it easier to keep track of downloads, licenses and other resources.

Effect Of Distance Parameters

George Vlad is a sound design intern at Two Big Ears. This is the continuation of a series of experiments with sound design parameters and audio spatialization. Read the first post on ‘Pitch And Perception’ here.

Continuing my journey through VR audio enhanced by 3Dception, I investigate the effects of two other parameters: minimum distance and rolloff factor. As expected, these parameters, together with pitch, work in tandem and therefore must be considered together when shaping what the audience or the player perceives.

Minimum distance

According to the 3Dception manual, minimum distance is the distance beyond which attenuation starts to take effect. At first glance this should mainly affect the perceived distance between the listener and the source object. As you can hear in video #1 below, this assumption is correct.

However, in addition to changing the perceived distance, there is an interesting side effect, especially when the source is visible to the player. At values lower than the default of 1, the object appears to be smaller than it really is. As expected, at values larger than 1, the perceived size of the object increases. The rolloff curve seems to affect not only our perception of distance but also the perceived size of the object.

Rolloff factor

As per the 3Dception manual, the rolloff factor controls the exponential attenuation model: values greater than 1 result in a steeper curve, while values smaller than 1 result in a gentler one. Continue…
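
The post doesn’t give 3Dception’s exact formula, but a common game-audio attenuation model with the same two controls looks like the sketch below; it reproduces the behaviour described above (no attenuation inside the minimum distance, and a rolloff factor scaling the steepness of the curve). Parameter names and values are illustrative.

```python
def inverse_distance_gain(distance, min_distance=1.0, rolloff=1.0):
    """Inverse-distance attenuation in the style of common game-audio engines:
    unity gain inside min_distance, then a 1/r-shaped curve whose steepness
    is scaled by the rolloff factor."""
    if distance <= min_distance:
        return 1.0
    return min_distance / (min_distance + rolloff * (distance - min_distance))

for rolloff in (0.5, 1.0, 2.0):
    gains = [inverse_distance_gain(d, min_distance=1.0, rolloff=rolloff)
             for d in (1, 2, 4, 8, 16)]
    print(rolloff, [round(g, 2) for g in gains])
# rolloff < 1 gives a gentler curve, rolloff > 1 a steeper one; raising
# min_distance pushes the whole curve outward, so the source reads as closer
# (and, as noted above, larger).
```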

Pitch And Perception

George Vlad is a sound design intern at Two Big Ears. Starting with this post he will be documenting his experiments with sound design parameters and audio spatialization on this blog.

Following up on the previous blog post, I have been investigating the role of pitch in the perception of movement and spatialization. The first thing I need to mention is that this is by no means an exhaustive process. I would rather say that I’m scratching the surface and providing food for thought for whoever finds this as fascinating as I do.

I designed a set of 5 similar-sounding files resembling an internal combustion engine hum. The frequency content in these files lies mostly below 1 kHz, as you can see in the picture. The Wwise engine is programmed to pick one of the 5 files at random, creating an indefinite loop. I first played the Intro scene with the original sound palette so as to examine its perception with regard to azimuth, elevation, movement and spatialization. My conclusions about this particular file should be taken with a grain of salt when applied to other files. Pitch is only one of many parameters that can alter and skew the perception of movement, elevation, spatialization and so on.

A few words about the Wwise and 3Dception settings I used: the default Doppler option in 3Dception was disabled to avoid pitch changes, 3Dception’s room modelling was enabled with the default settings, and the attenuation mode was set to 3Dception’s default as well. You will notice an audible gap in the video at certain points. This seems to be caused by the Wwise engine needing to catch up with the fact that the files become shorter once they are pitched up. I then proceeded to raise the pitch of the sound palette in 2-semitone increments. This helped me observe the changes in perception and enabled me to write the following notes.
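
A quick back-of-the-envelope check supports that explanation, assuming the pitch change here works as a playback-rate change (which the shortened files suggest); the 4-second loop length is hypothetical:

```python
def pitched_duration(duration_s, semitones_up):
    """Length of a clip after a playback-rate pitch shift: raising the pitch
    by n semitones speeds playback up by 2**(n/12), shortening the clip by
    the same factor."""
    return duration_s / (2.0 ** (semitones_up / 12.0))

# A (hypothetical) 4-second engine-hum loop at each step of the experiment:
for n in (0, 2, 4, 6, 8):
    print(f"+{n} st -> {pitched_duration(4.0, n):.2f} s")
# +2 st -> 3.56 s, +8 st -> 2.52 s: each loop ends earlier than an unpitched
# one would, which would explain the audible gaps between files.
```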

Azimuth

Although the unpitched sound is diffuse, there’s no doubt that it initially comes from the left side. Once the Robot object reaches the camera and goes off screen again, it is similarly easy to pinpoint it as coming from the right side. It is safe to say that increasing the pitch does not greatly affect azimuth perception.

Continue…

Design Explorations

My name is George Vlad and I’m a sound designer, composer and all-round audio guy. Over the past 5 years I’ve been involved in many aspects of video game audio production, from recording to editing and design, from writing music to engineering and restoration. The only aspect of game audio that I’m still learning the ropes of is implementation through middleware.

Parameters

A month ago I embarked on a sound design internship with Two Big Ears with the purpose of widening my implementation horizons and getting to know 3D audio and VR. Coming from a mostly creative environment, I found the learning curve pretty steep, but luckily my previous experience with Wwise through SoVGA helped me get up to speed with implementation in Wwise and Unity. Over the coming weeks I will be exploring the effect of various sound design parameters and their relationship with the perception of 3D audio, and documenting my findings on this blog. I aim to explore a parameter every week.

Pitch

One of the first questions that comes to mind when exploring virtual reality and binaural audio is how pitch relates to the perception of movement and spatialisation. We perceive sound sources as moving partly through subtle frequency shifts: a source that is approaching will sound ever so slightly higher in pitch, while a source that is moving away from us will produce a sound that is lower in pitch — the Doppler effect. The perception of elevation, on the other hand, is determined by the spectral content of a given sound. As a general rule, sounds rich in low frequencies will be perceived as coming from lower down than sounds rich in high frequencies.

Over the next week I plan to investigate how much of a role pitch plays in the perception of movement, while also testing its effects on the quality of spatialisation. For all my experiments I will be using the 3Dception demo Unity project as a testing ground. I will be attaching 3 different sound palettes to the main character, a neat little flying robot. Each sound palette will differ from the others in both frequency content and general timbre. This will allow me to observe any changes in perception over the same in-game event by switching from one palette to another. I will record footage of my experiments so that it’s easier to compare the different settings.

3Dception Unity 0.6 — What’s New?

We’ve just pushed out v0.6.0b of 3Dception Unity as we get closer to exiting beta. This update includes over 30 new features, fixes and improvements. Here’s a summary of what’s new:

More Sources!

The active source limit for non-commercial projects has been increased to 10! Yay!

Pricing

The Basic option has been dropped to £9/month from £14/month. You can also optionally pay upfront for a year of free upgrades, if monthly payments aren’t your thing.

Room Model With Variable Surfaces

Room models now include separate reflection properties for each surface in the room (walls, ceiling, floor), making it possible to design dynamic environments with a lot more character. The UI gizmos have also been improved, so the walls change in opacity depending on their reflection property. At a glance you can tell what your room sounds like. Cool! Here’s a GIF, just because:


Continue…

3Dception for Fabric


Our aim with 3Dception is to make great-sounding algorithms available across different devices, systems and workflows. This doesn’t just mean good-quality, efficient algorithms, but also supporting the workflows that make good design possible. Fabric for Unity is a great middleware tool if you are developing games and apps in Unity. We’re happy to announce that we’ve partnered with Tazman-Audio to make 3Dception compatible with Fabric. Integration scripts for Fabric will be available for free in the upcoming update of 3Dception — which includes a whole bunch of improvements, optimisations and great new features.

3Dception is currently available for Unity (natively and via Fabric) and Wwise across OS X, Windows, Linux, iOS and Android. We’ve got a long list of announcements to make in the coming months and we can’t wait to share them!

Full press release below! Continue…

Interview on A Sound Effect


I was recently interviewed by A Sound Effect blog about where audio for VR stands and where it is headed. You can read the full interview here.

I’m sure the technology for audio in VR is going to progress far — we are hard at work, and so are all the other players. But we are very keen to see and help progress the sound design language for VR. There’s so much to explore. If you’re designing audio for VR, we’d like to hear your thoughts! Get in touch!