Sing! Ultimate A Capella
Once again getting to work with Ben Foley and Alex King at Buckloop, this was a project to create a ton of on-stage visuals for Sky TV’s talent contest, “Sing – Ultimate A Capella”, broadcast over 6 weeks in September/October 2017.
Briefs were straightforward enough, usually a couple of reference images for each tune, from which we then made some abstract designs and animations. Ben had built an E3D pre-viz rig for the stage into which we slotted our designs to get an idea of how it would look once finished, which made things a little easier to handle when it came to getting things right for the media server.
The basic format of the show was simple – all the acts sang a cappella (no backing tracks or band), so to get this far, they were all excellent singers to start off with.
Show 01 started off with 6 groups, and one was eliminated in each show. Obviously we wouldn’t know until the day who would get through, or how far they would go, so we had to make background visuals for every group in every round, with 2 songs each per show, knowing that over half would never be used.
So a 5-man crew assembled, a mograph mecha Voltron of awesomeness, to blast this into shape amidst the ever-changing whims of studio execs who changed the show title 3 times in 3 weeks, short one-image briefs (“Make like this!” or “It’s inspired by this album cover!”), a leaking studio roof (during a hot July in Bristol, there was a storm and a minor flood… FFS), and a blazingly short turnaround (just under 4 weeks for pretty much everything), before decamping to London for final changes and finessing on-site during rehearsals. And all this while I was packing up our belongings to move from Bristol to Dublin after 13 years…
That Element 3D mock-up of the stage let us slide our animations in to give the client, and ourselves, an idea of how each design would look on set, as the pixel map we were using needed a lot of fiddling to get right. Most of the time this didn’t matter when using more abstract blobs and morphing shapes, but a handful of designs needed to be locked in perfectly for them to look the part.
So, AE Shader then. Goddamn amazing. I’d seen it deployed on the battlefield in ads and explainers for the National Lottery, but actually using the thing was like lifting the lid on Pandora’s box. All of a sudden, I was able to look at footage and photos in a completely different way in order to drive animation.
The basic workflow is simple enough –
– Make an animation of something changing state over time, such as a light box turning on, or numbers counting up.
– Make the animation and the source footage the same duration using time remapping.
– Deploy AE Shader. Answer its various prompts and it time remaps your changing numbers or light box based on the luminance/R/G/B/A of your source footage.
– Fuck. Yes.
– All sorts of cool shit happens.
Here you can see how the source footage influences the shape of the cloned light boxes. You can swap the footage for a gradient, a stroke layer in conjunction with the Vegas effect, fractal noise – pretty much anything going from black to white and back again.
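The core trick is easy to see outside AE too – each clone samples the driver layer at its own position and plays its shared animation at a time proportional to that sample. A rough sketch of the idea in plain Python (not ExtendScript – the names and values here are illustrative assumptions, not AE Shader’s actual API):

```python
def remap_time(luminance, anim_duration):
    """Map a 0..1 luminance sample to a time within the clone's animation."""
    clamped = max(0.0, min(1.0, luminance))
    return clamped * anim_duration

# A row of five light-box clones driven by a left-to-right gradient (0 -> 1).
# The shared animation is 2 seconds long: the light box goes from off to lit.
anim_duration = 2.0
luminance_map = [0.0, 0.25, 0.5, 0.75, 1.0]

clone_times = [remap_time(lum, anim_duration) for lum in luminance_map]
# Each clone is now frozen at a different point in the same animation,
# so the gradient appears to "sweep" the light boxes on from left to right.
print(clone_times)  # [0.0, 0.5, 1.0, 1.5, 2.0]
```

Swap the gradient values for luminance sampled from moving footage, re-evaluate every frame, and you get the footage-driven morphing seen above.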