Real-Time Technologies are Taking Animated Characters into Unexpected Places


By Christopher O'Reilly - ECD Nexus & Nexus Interactive Arts

With brands increasingly interested in developing more dynamic, ongoing conversations with their audiences, it has become critical for the production industry to build new technologies into its storytelling process. Real-time seems to be the mot du jour in marketing circles, and unexpectedly it's animation, in many ways the least real-time of mediums, that is coming up with some of the answers. Any animation professional can tell you it's a career that demands patience, with projects often requiring long lead times and multiple collaborators to pull off their magic. Increasingly, however, combining a traditional animation crew and pipeline with coders and artists from the worlds of gaming and interactive is changing the nature of the creative output, turning it into more dynamic experiences.

Using talent or celebrities to front your campaign remains an attractive option for advertisers, but it can slow down the conversation if you're dependent on their availability. Wieden & Kennedy showed how well this can work for Old Spice with their Twitter-responsive ads, but it would be tricky to maintain that level of responsiveness across a whole year, not least for the demands it would put on Mr. Isaiah Mustafa. Animated characters or design-based worlds pose no such obstacles, and new technologies in real-time rendering are making this increasingly a reality.

Recently Nexus collaborated with Burberry's digital team to create physical puppets in the windows of the Parisian store Printemps that passers-by could interact with in real time through their mobiles… no download required. The system also gave the audience access to privileged cameras with which they could snap the experience and share it seamlessly to Instagram, all through their handsets. This is real-time interaction with an animated character at another level.

It's also interesting to look at some of the work being created in virtual reality. While many of these experiences are 'baked' (unchanging stories that you can merely look around), others are utilising game-engine technologies to make their stories adapt intelligently to the viewer's experience. A recent collaboration with the Google ATAP team on their Spotlight Stories showed us just how powerful the processor in your mobile truly is, delivering real-time rendered VR experiences with no lag at 60 frames per second through a handset. Our collaboration focused on how the story and its characters could be triggered by the handset's understanding of where you are looking, adapting the story accordingly. It's not hard to see how other data, be that social media, the weather that day, or your most recent run, could become part of dynamic film content. Real-time rendering may turn out to be the most significant technological advance in VR.

So where is this all going? With bots poised to overtake apps as our go-to way of interacting with a brand's digital persona, the world of real-time visualisation is going to become increasingly pertinent. The recent rogue outbursts of Microsoft's Tay showed that there is still a way to go, and it will undoubtedly be a challenge for brands to make people like an AI character. But if anyone understands how to make people feel emotional engagement with a bunch of pixels, it's the animation industry.