Futures of Storytelling from Sketches to Screens

3rd–5th December 2019


The ViZARTS SUMMIT 2019 focused on rapidly evolving real-time technologies and how they can be incorporated into tomorrow’s film, TV and animation pipelines.

From motion capture and real-time engines to AI and version control, we covered it all in our 7 alluring keynote talks, 17 informative hypertalks, 7 comprehensive workshops and 12 inspiring demos.

The three-day summit was kick-started by Kirsten Skytte and Thomas Borch Nielsen who, in their inspiring keynote talk, highlighted the importance of real-time technologies as they reflected on the lessons learned from the Danish animated children’s film “Captain Bimse”, which they directed and produced. The duo set out to make the first feature film in Denmark produced in the real-time game engine Unity.

While game engines have been around for some years and hold massive potential for the film industry, the duo highlighted that they had never – in Denmark – been used to create an entire feature film, and that being first had been the hardest part.

Despite the use of the latest technologies and a team of skilled professionals, the production ran six months longer than expected and thereby also went 1 million over budget.

While being a first mover can be tricky, since you have to learn everything as you go, it can sometimes spark a positive chain reaction. We saw, for example, how Christian Faber from Rig21 and Dave Hunt from Unity produced the case study “Lego Lunar Landing” – a real-time virtual production for LEGO exploring the possibilities of real-time technology in animation.

This production utilized the latest virtual production technologies, allowing camera operators, directors and CG artists to work at the same time.

At ViZARTS our focus is, and will always be, to create a space for filmmakers and techies – experienced or noobs – where they can meet and discuss what possibilities are out there, what the needs are and how we can collaboratively fulfill them.

We had a brief “post-card” speech from Allan Lückow who, as he put it, just wanted to “say hello from Avatar”.
Although Lückow’s speech was very short, you could feel that he had the audience’s full attention, as we could all imagine how the technologies we had discussed and tried out at the ViZARTS events could be scaled up to larger productions, making them far more efficient and creative than existing workflows.


These technologies afford an entirely new way of producing, one in which you are more reliant on technology than ever before.

And as the wise Murphy’s Law states: “Anything that can go wrong will go wrong.” So you always have to be well prepared.

To avoid disasters when working with real-time technologies, we brought in Johannes Wilke, coordinator of the software team at Glassbox Technologies.

Wilke came prepared with his tech as he took the stage and blew the audience’s minds with the possibilities of the DragonFly and Beehive technology.
Both software systems optimize the workflows of the real-time production pipeline, allowing a team to work collaboratively inside the same scene, controlling the camera as you would in a live-action shoot, while built-in version control ensures your changes are always saved.


The important thing for us was to create a Summit that introduced filmmakers and techies to the future of storytelling. The first two days covered the magic that lies in creating stunning scenes in game engines and shooting them with virtual cameras – but a story is not complete without its characters.

To show us how to breathe life into virtual characters, we invited Mathias Søndergaard from Rokoko, which provides users with a low-budget motion capture solution.


We also warmly welcomed Flora Jullien from Dynamixyz, who showcased their latest facial capture rig which, through the use of cameras and infrared lighting, allows you to capture the facial expressions of your actors and transfer them to virtual characters, either live or in post. That technology has been used to bring the Hulk to life in movies and for all the facial animation in the game “Red Dead Redemption”.

To further inspire our journey through the future of storytelling, we had Ian Lowe visit from Dolby Vision, as well as Nick Thye, who gave us the latest do’s and don’ts from a commercial perspective.


As ViZARTS founder Henrik Schønau Fog states: “This is just the beginning of real-time technologies. Not everybody knows they exist, but once they do – they will revolutionize what we know as filmmaking.”


So with this, we want to wish the entire ViZARTS community a belated Happy New Year and say a big thank you for making the ViZARTS Summit 2019 happen.


We would thus like to thank the keynote speakers and hypertalkers, the demo crews and Medialogy mock-up teams, the moderator Alex Lehman, the volunteers, Amunet Studio, Nordisk Film Fund, Samsung, the sponsors, the steering committee, the collaborating educational programmes and organizations, Aalborg University, and Medialogy

– and not least all the fantastic attendees.