The Museum of Modern Art presents a retrospective of the multifaceted work of composer, musician, and singer Björk. The exhibition draws from more than 20 years of the artist’s daring and innovative projects and her eight full-length albums to chronicle her career through sound, film, visuals, instruments, objects, and costumes.

On the third floor, Songlines presents an interactive, location-based audio experience through Björk’s albums, with a biographical narrative that is both personal and poetic.

We’re honoured and proud to be involved in Songlines, with 3Dception powering the location-based augmented audio experience. We’ve worked with a fantastic team across the world to ensure that the whole experience runs exceptionally well and sounds amazing off an iPod Touch. Yes, an iPod Touch. The exhibition runs from March 8–June 7, 2015. Don’t miss it if you are in New York! Get in touch for press or other enquiries.


Only the 3Dception engine could provide us with the audiophile quality and ease of integration our creative and technical approach for the Björk Songlines project demanded.

The Two Big Ears team and their audio programming expertise were key in our ability to provide visitors to Björk’s show the revolutionary head tracking psychoacoustic spatialised audio experience the artist had visualised for presenting her pure audio work to MoMA audiences.

– Andrew Melchior, Third Space Agency, Producer of Björk Songlines @ MoMA

3Dception Unity 0.6 — What’s New?

We’ve just pushed out v0.6.0b of 3Dception Unity as we get closer to exiting beta. This update includes over 30 new features, fixes and improvements. Here’s a summary of what’s new:

More Sources!

The active source limit for non-commercial projects has been increased to 10! Yay!


Lower Pricing

The Basic option has dropped from £14/month to £9/month. You can also optionally pay upfront for a year of free upgrades, if monthly payments aren’t your thing.

Room Model With Variable Surfaces

Room models now include separate reflection properties per surface in the room (walls, ceiling, floor), making it possible to design dynamic environments with a lot more character. The UI gizmos have also been improved, so that the walls change in opacity depending on their reflection properties. At a glance you can tell what your room sounds like. Cool!



3Dception for Fabric


Our aim with 3Dception is to make great-sounding algorithms available across different devices, systems and workflows. This doesn’t just mean high-quality, efficient algorithms, but also supporting the workflows that make good design possible. Fabric for Unity is a great middleware tool if you are developing games and apps for Unity. We’re happy to announce that we’ve partnered with Tazman-Audio to make 3Dception compatible with Fabric. Integration scripts for Fabric will be available for free in the upcoming update of 3Dception, which includes a whole bunch of improvements, optimisations and great new features.

3Dception is currently available for Unity (natively and via Fabric) and Wwise across OSX, Windows, Linux, iOS and Android. We’ve got a long list of announcements to make in the coming months and we can’t wait to share them!


Interview on A Sound Effect


I was recently interviewed by the A Sound Effect blog about where audio for VR stands and where it is headed. You can read the full interview here.

I’m sure the technology for audio in VR is going to progress far — we are hard at work and so are all the other players. But, we are very keen to see and help progress the sound design language for VR. There’s so much to explore. If you’re designing audio for VR, we’d like to hear your thoughts! Get in touch!

3Dception Wwise


The past few months have flown by. With a monthly release cycle we’ve been squashing bugs and making major improvements, both in terms of quality and performance. Thanks for your encouragement and support!

v0.5 of 3Dception Unity was released two weeks ago and included support for iOS and Android. It now works effortlessly on Windows, OSX, Linux, iOS and Android. We aren’t stopping here, there’s a truckload of improvements and features that are currently being implemented and we are working hard to bring 3Dception to all major development platforms.

Our next stop: Wwise. 3Dception for Wwise is currently in closed beta and works as a mixer plugin in Wwise 2014. We are excited to bring our efficient 3D audio and room modelling algorithms to an amazing authoring tool that is widely used across the industry. While we work towards a public release, you are welcome to try it out and put it through its paces. Sign up below and we’ll send you builds very soon.

Support for Wwise also means support for Unreal Engine 4. 3Dception Wwise includes full support for Unreal Engine 4 and you’ll receive our helper scripts and documentation to get you started.

More news and announcements coming soon!

Get 3Dception Wwise Beta:

3Dception: So Far

We’ve been overwhelmed by the response we’ve received since the announcement of 3Dception two weeks ago. Thank you. Both the download and usage counts have exceeded our expectations by a wide margin. We’ve had loads of questions ranging from pricing (which we just released) to technology. We’ll talk more about our technology in the coming weeks and months through this blog.


3Dception has been quite a journey for us. We began the majority of the work about two years ago by developing a binaural solution for Android (yes, a difficult platform for audio). This led us to develop proprietary algorithms and methods that have allowed us to scale very quickly, while maintaining quality. Managing to run loads of binaural sources on an Android phone that is a few years old, while keeping battery usage in check, was a great achievement for us.

Why a desktop-first release then? We’ve got 3Dception working well on all our target platforms (including some we haven’t announced), but because of the current demand for binaural audio on desktop operating systems (i.e., Oculus and VR) we chose to target those first. If you’ve shipped software previously you know of the long bridge that must be crossed when moving from a ‘working product’ to a ‘shippable product’. That said, the mobile versions of 3Dception will be out very soon.


Alpha — Beta

3Dception has been under constant scrutiny and testing since day one of development. We’ve had academics, researchers, visually impaired groups, audio professionals, designers and other non-audio people lend us their ears. We developed about eight different games and augmented experiences over the span of two years to test our algorithms. It was extremely useful and humbling to constantly put our work ‘out there’. None of these games and apps were made public (for obvious reasons), but they were the easiest way to engage people with our technology and find areas for improvement.


Since the release two weeks ago, we’ve been working hard on finalising the pricing plans while squashing bugs and making improvements under the hood. The beta version will be released in two weeks and available for commercial projects.


Unity and libpd


We are big fans of Pure Data at Two Big Ears. It makes it quick and easy to prototype ideas, or even treat our prototypes as ‘shippable’ products. Our most recent project (which you should hear about in the next week or so) involved a lot of work in Unity. There have been many attempts to get Pure Data (libpd) and Unity working together. Patrick Sébastien had done amazing work in getting libpd to work with Unity on Windows (libpd-4-unity). In my spare time I took it upon myself to extend support to OSX, Android and iOS.

Success? Yes (almost).

You will find some differences in the way it is all set up, if you have used libpd-4-unity in the past. The project structure needed streamlining to maintain cross-platform compatibility. Thankfully there is a whole library of helper APIs in Unity to make this possible. Let me drill into the specifics:


libpd works *within* Unity. Unlike Kalimba (which is a great alternative for iOS/Android), it runs as a native plugin inside Unity, on Unity’s audio thread. This does mean that libpd works only with Unity Pro (although there are workarounds to this that I haven’t tested or tried).

The ‘heart’ of the libpd setup in Unity is a single script: LibPdFilterRead.cs. It includes basic methods for initialising libpd and opening and closing patches. It automatically queries the sample rate and buffer size and sets things up depending on the platform/system.
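As a rough sketch of what that setup amounts to — the class, field, and method names below are illustrative assumptions, not the actual contents of LibPdFilterRead.cs — the flow is: query Unity’s audio settings, initialise libpd to match, then open the patch:

```csharp
using UnityEngine;
using LibPDBinding; // the C# bindings shipped in the LibPD folder

// Hypothetical sketch of the kind of setup LibPdFilterRead.cs performs.
public class PdPatchLoader : MonoBehaviour
{
    public string patchName = "test.pd"; // illustrative patch name
    private int patchHandle;

    void Start()
    {
        // Match libpd to Unity's sample rate and buffer size
        int bufferSize, numBuffers;
        AudioSettings.GetDSPBufferSize(out bufferSize, out numBuffers);
        LibPD.OpenAudio(2, 2, AudioSettings.outputSampleRate);
        LibPD.ComputeAudio(true);

        // Patches live under StreamingAssets/PdAssets
        string path = Application.streamingAssetsPath + "/PdAssets/";
        patchHandle = LibPD.OpenPatch(path + patchName);
    }

    void OnApplicationQuit()
    {
        LibPD.ClosePatch(patchHandle);
        LibPD.Release();
    }
}
```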

Asset Hierarchy:

All Pd assets (patches, audio files, text files, externals) must be placed in Assets > StreamingAssets > PdAssets. libpd will not be able to find any of the resources in a build if the assets are placed in another folder.



It is recommended that you add LibPdFilterRead.cs to the camera or to an empty game object in the scene. Avoid having multiple active instances of LibPdFilterRead.cs in a scene (this could be enforced programmatically, but hasn’t been done). The scene 01_LibPd_Basic.unity included in the project demonstrates this.

LibPdFilterRead.cs uses the OnAudioFilterRead callback in Unity. You can think of it as a plugin insert on the DSP chain. This means that you can also send audio data into Pd from Unity (audio playback within Unity or microphone input). The scene 02_LibPd_ADC.unity included in the project demonstrates this.
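The mechanics of that DSP-chain insert can be sketched like this (a minimal example, not the actual LibPdFilterRead.cs; it assumes libpd has already been initialised as above):

```csharp
using UnityEngine;
using LibPDBinding;

// Hedged sketch: handing Unity's interleaved audio buffer to libpd
// inside the OnAudioFilterRead callback.
public class PdAudioFilter : MonoBehaviour
{
    void OnAudioFilterRead(float[] data, int channels)
    {
        // One libpd "tick" is 64 samples per channel
        int ticks = data.Length / (channels * 64);

        // Process in place: Unity's buffer acts as both Pd's input (adc~)
        // and output (dac~), so audio already in the chain reaches Pd
        LibPD.Process(ticks, data, data);
    }
}
```

Because the same buffer is passed as input and output, anything Unity has already rendered at that point in the chain (an AudioSource, microphone input) is visible to the patch’s adc~ objects.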

The project includes two other scripts, GUIToggleScript.cs and GUITextScript.cs, which demonstrate sending messages to and receiving messages from Pd.
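In the same spirit as those two scripts, message passing looks roughly like this — a minimal sketch, assuming hypothetical receiver names (`volume`, `toUnity`) rather than anything in the actual project:

```csharp
using UnityEngine;
using LibPDBinding;

// Illustrative sketch: send a float to [r volume] in the patch, and
// listen for floats the patch sends to [s toUnity].
public class PdMessaging : MonoBehaviour
{
    void OnEnable()
    {
        LibPD.Subscribe("toUnity");   // start listening on this symbol
        LibPD.Float += OnPdFloat;     // hook the float-received event
    }

    void OnDisable()
    {
        LibPD.Float -= OnPdFloat;
        LibPD.Unsubscribe("toUnity");
    }

    public void SetVolume(float v)
    {
        LibPD.SendFloat("volume", v); // lands in [r volume] inside the patch
    }

    void OnPdFloat(string receiver, float value)
    {
        Debug.Log(receiver + ": " + value);
    }
}
```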

Creating your own project:

To use libpd-4-unity in your own project, copy the ‘Plugins’ folder and the ‘LibPD’ folder (this includes the C# bindings), and create a ‘PdAssets’ folder under StreamingAssets (create StreamingAssets too if you don’t have it already). Add LibPdFilterRead.cs to a game object, specify the patch name and you should be good to go!



OSX:

There don’t seem to be any major problems on OSX — I’ve tried building multiple projects and they all work fine. Externals work fine too! The full libpd API hasn’t been tested, but this will happen at some point in the near future.

The Unity Editor (previewing your game within Unity) at times fails to open a patch (SIGILL for those interested). I haven’t had the chance to track down this issue and it can be temporarily fixed by restarting Unity. This doesn’t happen in a build though.

This said, libpd-4-unity hasn’t been extensively tested, so if you come across any problems do let us know!


Android:

Oh the joys of Android development!

The good news: it runs fine. No issues if you have a patch with no dependencies (audio files, text files, externals). The problem is that any data within an Android APK isn’t easily accessible by Unity/libpd. The workaround is to run a few lines of code and copy the data from the APK to the SD card (this is what a lot of apps do). Currently it only copies the Pd patch you’d like to open in Unity and nothing else. Ideally, it should be copying across the whole PdAssets folder to the SD card (or could borrow a trick or two from how libpd-android is setup). This is high on my priority list and will be fixed as soon as I get the time. If you have ideas for workarounds let me know!
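The copy-out-of-the-APK workaround can be sketched as follows — class and field names are illustrative, and a real implementation would also skip the copy if the file already exists:

```csharp
using System.Collections;
using System.IO;
using UnityEngine;

// Hedged sketch of the APK workaround: on Android, StreamingAssets sits
// inside the compressed APK, so the patch is read out with WWW and
// written somewhere libpd can open it as a plain file.
public class PdAssetCopier : MonoBehaviour
{
    public string patchName = "test.pd"; // illustrative patch name

    IEnumerator Start()
    {
        string src = Application.streamingAssetsPath + "/PdAssets/" + patchName;
        string dst = Application.persistentDataPath + "/" + patchName;

        WWW loader = new WWW(src); // understands the jar:file:// URL inside the APK
        yield return loader;
        File.WriteAllBytes(dst, loader.bytes);
        // libpd can now open the patch at 'dst'
    }
}
```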


iOS:

This is pending and is on the to-do list. In time!


Windows:

I haven’t tested the current build on Windows yet, but Patrick’s previous work seemed to have run fine. Feel free to let us know. I’ll test it as soon as I have my Windows development pipeline working again.


Documentation:

The documentation/readme/Wiki needs work. I will add information on building the plugins for OSX and Android soon.

To conclude:

There is work pending, but the project in its current form should allow you to get started on creating cool procedural audio/music within your game! If you have ideas for further development, please fork and create a pull request.

Thanks again to Patrick Sébastien and everyone else involved in the libpd/Pd project for all their hard work.

Go get it here.

Give us a shout if this helped or made life tougher for you. Follow us on Twitter or on Facebook.