About Good Gate Media
Good Gate Media is a video game development company based in Cardiff. It specialises in making interactive story games, where the player is able to make decisions that change the narrative.
We make interactive movies that we turn into games
The scenes are shot using the same process as we would use for a linear film, then they are incorporated into the game engine to create full motion video (FMV) games. These games are multi-platform – Xbox, PlayStation, Nintendo Switch, iOS, Steam, etc. – and give the player the ability to choose their narrative path.
We wanted to try using Unreal Engine for virtual filmmaking, instead of traditional VFX
We had a piece of IP that we wanted to develop into an interactive movie, an adaptation of Ian Livingstone’s book, Deathtrap Dungeon. It seemed like a great opportunity to explore a new way of filming, so we applied for Clwstwr funding. We received £107,000 from Clwstwr, with the aim of producing a demo.
We test things through proof of concept or shorts, so that was what we aimed to do
You can read reviews or case studies of other people's experiences of doing something, but what really matters is how the process works for you and your company. Going into our project, we treated it like a normal film insofar as there was a script, but we incorporated gaming elements from the beginning. Ultimately, we created what is, in essence, a prototype.
Unreal Engine saves Hollywood filmmakers time and money
Our CEO, John Giwa-Amu, had seen how bigger-budget filmmakers were applying Unreal Engine to filmmaking, and he saw its potential. Independent filmmakers don't work with Hollywood budgets, and rarely have the opportunity to work within epic fantasy worlds - but we felt that if the tech could save money for the big studios, it could do the same for us. It was an exciting prospect.
Using Unreal Engine flipped our filmmaking process on its head
Usually, we film against a green or blue screen and rotoscope everything afterwards. When you use Unreal Engine, you create the 3D virtual world beforehand in pre-production, then you film in that virtual world. You have a normal camera that captures the actors in real time and you have a virtual camera that captures the movement around the set in real time. Then, you put the two things together in post-production.
Creatively, it allowed us to do a lot more with a lot less
Theoretically, you could cut out masses of post-production time by using Unreal Engine. You track in real time, you light in real time, and you can tweak the VFX right then and there, on set. We had some fantastic VFX artists and VP specialists working with us, and watching what they were implementing was like nothing I'd seen before on a film set. Our director of photography, who heads up the camera team, put on a VR headset and roamed around inside the VFX virtual world in preparation.
On set, because we were capturing shots in real time, the actors could watch themselves immediately in playback (even though it wasn't fully rendered), or they could watch other actors on the monitor when they weren't in the scene themselves. Ordinarily, VFX filming can be very cold - just a blue backdrop and a couple of props. Seeing themselves in the CGI landscape was a whole different level of reality for the actors. With LED screens - the next level up - the actors would be able to stand in front of the actual image and see the whole CGI landscape from where they stand on set.
Unreal could save money in the long run, but there are costs involved
Unreal Engine limits the costs of the art department, of locations, of all the supplementary suppliers and the materials that are required to shoot in an actual dungeon, or on the side of an actual mountain. However, there are non-traditional, bespoke systems that you've got to put in place in order to be able to use the software. The project required a specialist virtual production team and a camera-tracking system, the Mo-Sys StarTracker.
Unfortunately, we didn't get the chance to use the huge LED screens that companies with a higher budget use because we couldn't afford them. When we make the feature, which we’re aiming to start prep on in the coming year, that’s what we’ll fully explore.
Our main issue was that we stretched the project to its limits, as independent filmmakers often have to
We were awarded a substantial chunk of funding, which we're extremely grateful for, but we wanted to create a prototype product alongside the R&D. Rather than only filming a single scene, we scripted a whole sequence of scenes both in CGI and on location in Cardiff Castle. This was because we wanted to test the tech and, crucially, tell a story. That's really the only way we could tell if Unreal Engine could work as a tool for indie filmmakers.
Skills gaps in the industry impacted what we could achieve
We simply don't have enough boots-on-the-ground VFX supervisors coming up in the UK. On an independent film budget, it was unrealistic to hire someone who’s in such high demand for the whole of a project. We needed a VFX supervisor for about three months, not three days. It's probably one of the most important jobs in the film and High-End TV sector, and for some reason we aren't seeing the numbers of personnel we need to maintain forward momentum in the independent space.
Also, though Good Gate's production team and the specialists we brought in throughout pre-production bridged the gap between the CGI and real-life personnel, the gulf in understanding between traditional shooting methods and this new cutting-edge tech was too great. We couldn't educate a whole crew of people in three filming days; we needed a stronger knowledge base in the first instance. The intensity of the shoot led to delays in communication, which led to a much longer post-production process than we'd envisaged.
If we'd done one scene with one actor and just a couple of shots, then we could've made the budget work. Unfortunately, that would never be feasible in practice, because you simply can't spend £100K a shot; you need to be creating content at a much higher rate.
That said, on the full feature project that this test has helped generate interest in, the prep will be months rather than weeks. We'll also likely shoot for three months, so the tech can be applied much more economically. Short-form work is always a tricky arena in which to perfect brand-new systems of working.
As a funded R&D project, this has proved invaluable
We hope our findings can be drawn on as part of future developments in virtual filmmaking. We now have a much better understanding of the sort of education and infrastructure we need to invest in to make the technology work for us.