Filming The Future: How AI Directed A Sci-Fi Short
‘When we teach machines to write, machines don’t replace us any more than a piano replaces a pianist. They become extensions of us.’
Remember Sunspring, the film written by the AI ‘Benjamin’ back in 2016? Well, he’s back with another piece of, erm… art?… in the form of the new B-movie-inspired sci-fi piece Zone Out.
Sunspring was the product of director Oscar Sharp and AI researcher Ross Goodwin entering a 48-hour film challenge at the Sci-Fi London film festival. Benjamin wrote it after being fed hundreds of science-fiction scripts; running on an Nvidia Jetson, it produced a screenplay in time for the competition. This time, though, they wanted Benjamin to take a more rounded role in the filmmaking process, acting in and directing the film as well as writing it.
Benjamin was given the following prompts, set by Sci-Fi London:
Title: Zone Out
Dialogue: ‘They wanted to call it Adam – seriously’
Prop and Action: A character holds a lens and turns it to reflect a bright light
Optional Science Idea: A genetically tailored virus targets only pregnant women
Overcoming the issue that, well, Benjamin is a programme and not an actor, the team fed in videos of the actors who starred in Sunspring (Thomas Middleditch of Silicon Valley, Elisabeth Gray and Humphrey Ker) along with old films in the public domain – Night of the Living Dead, The Brain That Wouldn’t Die and The Last Man on Earth – good old B movies.
Using these snippets, the system – Benjamin – applied face-swapping technology and voice generators to orchestrate performances from the prerecorded clips of the actors’ faces and voices, composing the new film by combining them with scenes from the old films.
In his TED Talk, Oscar Sharp explains that he used to roll dice to create storylines, but struggled to adapt the approach to snippets of dialogue. This is where Goodwin, who was fascinated by NLP, came in, and their budding friendship began. Goodwin used a Markov chain algorithm to predict which word would come next in a body of text, then applied this to film scripts. He also built Word Camera, an AI tool whose algorithm prints poetry and dialogue based on what it sees in the picture you take. You can have a go yourself at https://word.camera/!
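The Markov chain idea is simple enough to sketch in a few lines of Python. This is a toy illustration of the general technique, not Goodwin’s actual code: each run of words maps to the words observed to follow it in the training text, and generation repeatedly samples from those followers.

```python
import random
from collections import defaultdict

def build_chain(text, order=1):
    """Map each run of `order` words to the words seen to follow it."""
    words = text.split()
    chain = defaultdict(list)
    for i in range(len(words) - order):
        chain[tuple(words[i:i + order])].append(words[i + order])
    return chain

def generate(chain, length=10, seed=None):
    """Walk the chain, sampling each next word from the observed followers."""
    rng = random.Random(seed)
    order = len(next(iter(chain)))
    out = list(rng.choice(list(chain)))  # random starting state
    for _ in range(length):
        followers = chain.get(tuple(out[-order:]))
        if not followers:
            break  # dead end: this state never appeared mid-text
        out.append(rng.choice(followers))
    return " ".join(out)

# A toy corpus; Goodwin trained on whole film scripts.
corpus = "he looked at the stars and the stars looked back at him"
print(generate(build_chain(corpus), length=8, seed=42))
```

With a real corpus of scripts, higher `order` values produce more coherent (but less surprising) text, which is exactly the trade-off that pushed Goodwin towards neural approaches.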
Using the Cornell Movie Dialogs Corpus and an LSTM neural network, Benjamin learnt to replicate the formatting of a screenplay. Trained to write letter by letter, the system first picked up words, then sentences and paragraphs.
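At the heart of that letter-by-letter approach is an LSTM cell consuming one character at a time while carrying a memory state forward. The sketch below (a NumPy illustration with untrained random weights, assumed for demonstration – not Benjamin’s implementation) shows a single LSTM step and a string being fed through it one one-hot-encoded character at a time; in a real model the final hidden state would feed a softmax predicting the next character.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h, c, W, U, b):
    """One LSTM step over a single input vector x (e.g. a one-hot character).

    W, U and b stack the parameters for all four gates:
    input, forget and output gates plus the candidate cell update.
    """
    n = h.shape[0]
    z = W @ x + U @ h + b
    i = sigmoid(z[:n])           # input gate: how much new info to write
    f = sigmoid(z[n:2 * n])      # forget gate: how much old state to keep
    o = sigmoid(z[2 * n:3 * n])  # output gate: how much state to expose
    g = np.tanh(z[3 * n:])       # candidate cell update
    c = f * c + i * g
    h = o * np.tanh(c)
    return h, c

# One-hot encoding over a toy alphabet.
alphabet = sorted(set("they wanted to call it adam"))
def one_hot(ch):
    v = np.zeros(len(alphabet))
    v[alphabet.index(ch)] = 1.0
    return v

n, d = 8, len(alphabet)          # hidden size, input size
rng = np.random.default_rng(0)
W = rng.standard_normal((4 * n, d)) * 0.1
U = rng.standard_normal((4 * n, n)) * 0.1
b = np.zeros(4 * n)

h, c = np.zeros(n), np.zeros(n)
for ch in "adam":                # feed the string one character at a time
    h, c = lstm_step(one_hot(ch), h, c, W, U, b)
```

Because the cell state `c` persists across steps, the network can learn structure spanning many characters – which is how a character-level model ends up reproducing screenplay conventions like scene headings and character cues.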
The musical score was also AI-generated, and its upbeat tempo feels disjointed against the surrealist subject matter. A darker score might better create the sense of unease the rest of the film conveys. On the other hand, its connotations of positivity and hope do form an unsettling juxtaposition with the film’s dark subject and imagery.
Regardless of what you make of the film by the standards we’d hold a human director to, this technology has arrived in this form at a telling moment. We’ve recently seen an influx of deepfakes, with celebrities imposed into rather unfavourable footage, and Jordan Peele’s PSA, in which a convincing video of Barack Obama warns against exactly this kind of fake news.
As with all technology, it’s about how we use it. Do we want to end up, as Peele says, in some dystopian future, or do we want to follow in the steps of Sharp and Goodwin and use this technology as an extension of our own creativity?
Watch Benjamin’s back catalogue
- Sunspring (2016) https://www.youtube.com/watch?v=LY7x2Ihqjmc&t=6s
- It’s No Game (2017) https://www.youtube.com/watch?v=5qPgG98_CQ8
- Zone Out (2018) https://www.youtube.com/watch?v=vUgUeFu2Dcw