Sounds in Space 2017 Pictures

Well, we had a lovely time at Sounds in Space again this year.  Thanks to all who contributed to the day (guests, presenters, poster peeps and generous sponsors!).  I’ll be uploading and sharing presentations/videos from the day soon, but in the meantime, here are some pictures (some of the 360 pics are better viewed via the link below).

Google Photos Album of Sounds in Space 2017

Sounds in Space 2017 Live Streams

Tomorrow we’re holding our annual Sounds in Space Research Symposium.  This year we’ve decided to stream the entire event on both YouTube and Facebook using binaural audio (well, Ambisonics to binaural, as it happens).  The event runs from 9.30am (GMT+1) until around 5.00pm.  If you’re interested in watching, here are the links:

YouTube Link
Facebook Link

It’s looking like it’ll be a great day, so do tune in 🙂

Installing Python and FFMPEG on a Mac using HomeBrew

I’ve been asked a few times about the best way to install FFMPEG on a Mac with a decent set of libraries included.  Here’s the best way I’ve found (it’s also the most compatible way of installing Python).

NOTE: If you already use MacPorts as your package manager, don’t use Homebrew as well… things will go funny.  If you don’t know what MacPorts is, then you’re unlikely to be using it, so the commands below will work fine 😉
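
The full steps are behind the cut, but the short version looks something like this (a sketch, not the post’s exact commands: the Homebrew install one-liner changes from time to time, so check https://brew.sh for the current version, and the optional FFMPEG build flags vary between formula versions):

    # Install Homebrew itself (check https://brew.sh for the current one-liner)
    /usr/bin/ruby -e "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/master/install)"

    # Homebrew's Python, rather than the system one
    brew install python

    # See which optional libraries your ffmpeg formula supports, then install
    brew options ffmpeg
    brew install ffmpeg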

Continue reading “Installing Python and FFMPEG on a Mac using HomeBrew”

Sounds in Space – Audio for Virtual Reality Animations

I’ve had a few people ask me to share the animations from the Surround Audio for VR presentation I delivered at Sounds in Space this year.  I’ve made a video of the PowerPoint (30 seconds per slide) so everything can be viewed in context (note there’s no audio, though!).  If you weren’t at the event: it goes through both the graphics and the audio processing needed to create VR content, and shows the limitations with respect to the interaural level differences (ILD) and interaural time differences (ITD) reproduced by the Ambisonics to binaural process at varying orders.  8th order Ambisonics does a great job of reproducing both the ILD and ITD up to 4kHz.

YouTube Binaural Reaper Project

So, here’s an example (but empty) Reaper project that contains the YouTube binaural filters I measured.  You’ll need to add your preferred Ambisonics plug-ins, and I’m assuming FuMa channel ordering etc. (they’ll be remapped by a plug-in).

There’s also a bundle of JS effects in the folder that you’ll need to install (instructions at http://reaperblog.net/2015/06/quick-tip-how-to-install-js-plugins/), which allow for:

  • Ambisonic Format Remapping (FuMa -> ambiX)
  • Ambisonic Field Rotation
  • Multi-channel Meter
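
For reference, the first of those (the FuMa to ambiX remap) boils down, at first order, to a channel reorder plus a gain on W: ambiX uses ACN ordering (W, Y, Z, X) with SN3D normalisation, while FuMa carries W at -3dB.  A minimal Python sketch of the idea (the function name is mine, not the JS plug-in’s):

    import numpy as np

    def fuma_to_ambix_first_order(b_format):
        # b_format: (4, n_samples) array in FuMa channel order (W, X, Y, Z)
        w, x, y, z = b_format
        # ambiX is ACN-ordered (W, Y, Z, X); FuMa W sits 3dB below SN3D
        return np.stack([w * np.sqrt(2.0), y, z, x])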

YouTube have now released the official filters they use (but in individual speaker format… not the most efficient way of doing it!), so it’ll be interesting to compare!

As described in a previous post, the ReaVerb plug-in filters W, X, Y and Z with a pair of HRTFs each; the filtered signals are then simply summed to create the Left and Right feeds.

YouTube Binaural Project Template

YouTubeBinProject

YouTube 360 VR Ambisonics Teardown!

UPDATE (4th May 2016): I’ve added a video using the measured filters.  This will be useful for auditioning the mixes before uploading them to YouTube.

So, I’ve been experimenting with YouTube’s Ambisonic-to-binaural VR videos.  They work, sound spacious, and head tracking also functions (although there seems to be some lag compared to the video, at least on my Sony Z3), but I thought I’d have a dig around and test how they’re implementing it, to see what compromises they’ve made for mobile devices (as the localisation could be sharper…).

Cut to the chase: YouTube are using short, anechoic head-related transfer functions (HRTFs) that also assume the head is symmetrical.  Doing this means you can boil down the Ambisonics to binaural algorithm to just four short finite impulse response (FIR) filters that need convolving in real-time with the B-Format channels (W, X, Y & Z in Furse-Malham/SoundField notation; I know YouTube uses ambiX, but I’m sticking with this for now!).  These optimisations are likely needed to make the algorithm work on a wider range of mobile phones.
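
To make the symmetry trick concrete, here’s a minimal sketch of that four-filter decode (my own illustration of the technique, not YouTube’s actual code).  The left and right ears share the W, X and Z contributions; only the Y (left-right) contribution flips sign:

    from scipy.signal import fftconvolve

    def four_filter_binaural(w, x, y, z, h_w, h_x, h_y, h_z):
        # Convolve each B-format channel with its left-ear FIR filter
        # (all four filters assumed to be the same length)
        mid = fftconvolve(w, h_w) + fftconvolve(x, h_x) + fftconvolve(z, h_z)
        side = fftconvolve(y, h_y)
        # With a symmetrical head, the right ear is the same decode
        # with the Y contribution negated
        return mid + side, mid - side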

Continue reading “YouTube 360 VR Ambisonics Teardown!”

YouTube, Ambisonics and VR

Introduction

So, last week Google enabled head (phone!) tracked positional audio on 360 degree videos.  Ambisonics is now one of the de facto standards for VR audio.  This is a big moment!  I’ve been playing a little with some of the command line tools needed to get this to work, and with using Google PhotoSphere pics as the video since, currently, I don’t have access to a proper 360 degree camera.  You’ll end up with something like this:

So first, the details. Continue reading “YouTube, Ambisonics and VR”
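
The step-by-step details are behind the cut, but the basic recipe is: loop the still PhotoSphere image as the video track, attach a four-channel ambiX B-format audio track, then mark the file as spherical with spatial audio using Google’s Spatial Media Metadata Injector before uploading.  With FFMPEG, that first step looks something like this (filenames hypothetical; check YouTube’s spatial audio help pages for the current codec/container requirements):

    # Loop a still image for the length of the 4-channel audio file
    ffmpeg -loop 1 -framerate 1 -i photosphere.jpg -i bformat_ambix.wav \
           -c:v libx264 -pix_fmt yuv420p -c:a aac -ac 4 -shortest video.mp4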

Sounds in Space 2014 – Video, Pics and Feedback

Sounds in Space happened on 30th June this year and was an excellent day (the programme and details of the day can be found here).  There are always things which could be done better, and hopefully we’ve got all of these noted, ready for next year (fingers crossed).  If you weren’t able to make the event, the details below may give you a glimpse of what you missed and help you decide whether you’d like to come next time!

The 27 speaker 3D surround sound setup was the best we’ve ever had, made possible with the help of recent graduates from Sound, Light and Live Event Technology and Richard and Mark, technicians in Electronics and Sound.  Alex Wardle (graduate from Music Tech and Production) also created a video of the event which can be viewed at:

http://youtu.be/iK9rnn743hc 

Simon Lewis, a member of the Creative Technologies Research Group, took a few pics on the day, which you can find at the bottom of this post. Continue reading “Sounds in Space 2014 – Video, Pics and Feedback”

10 Minute Research Presentation

I presented a 10-minute guide to my research into Ambisonics, and its impact on the undergraduate curriculum, postgraduate validations and use in industry, at the Faculty of Arts, Design & Technology Research Day last Friday (21st May).  The presentation is shown below: