This month we invite you to meet Alexis Haggar, Visual Effects Supervisor at the award-winning independent visual effects studio Lexhag VFX. Alexis details his journey into visual effects, his work in Virtual Production, the tools he uses, including Mistika Boutique, and his advice for planning a Virtual Production.

Los Angeles Post Production Group: After film school, you started working at a games company doing feature and advertising work. What made you decide to start Lexhag, now an award-winning independent visual effects studio in London and Norfolk, and what sets you apart from other VFX companies?
Alexis Haggar: Straight out of film school, having graduated as a director, I tried to get directing work but quickly decided that it was going to take a long time to build a directorial career, and I still wanted to keep learning.
The job with the games company came through a film school friend who had gone there to start the company’s movie arm. He asked if I wanted to help him make a series of six commercials to air in prime time on the Sci-Fi Channel. This was an opportunity I couldn’t miss, so I joined the company to produce the ads. Straight after that, the same company offered us the chance to make a feature, which we did, inviting all of our film school colleagues to make it with us.
Once those projects were finished, I moved into the special effects world, where I spent time blowing things up and making weird and extraordinary rigs. At that point (perhaps 2004), the SFX industry was worried that digital would take over, and many of its jobs were already being replaced with CGI. In high school (secondary school for us Brits) I had completed my Art A-level with CGI, so I was well versed, if self-taught, and felt I could transition from SFX to VFX.
I wanted to keep both sides alive, and there wasn’t a company with departments to do both. So I decided to start my own company that could service both SFX and VFX or, at the very least, design effects with both disciplines in mind.
Lexhag was born, and to this day we’re still recruiting people who are practical and creative as well as technology-savvy; there’s never been a better time to combine all of those skills.
LAPPG: You recently announced that Lexhag is now offering Virtual Production services, but it seems like this is not something totally new for your company. Can you tell us how you started doing this work and what types of services you now offer, including how you support other production workflows?
AH: Virtual Production, as the industry currently knows it, is rear-screen projection on steroids. We now have a whole load of tech that can make the old “dumb” methods “smart”.
Because we’ve come from a place where we’ve had experience with props, miniatures, camera tricks, projecting effects on-set and the whole post-production process (and we’re filmmakers at heart), it seemed natural for us to get involved.
We started by designing and building our own VP setup, specifically for vehicles, using funding we won from an innovation grant. We created two POCs with our partners: they supplied the film-production resource, and we supplied the imaging/VFX resource.

LAPPG: What is the most important technical advice you can offer someone about doing Virtual Production?
AH: Plan your shots. Do not expect to walk into a volume and have it all work. It might seem obvious to say, but I get the feeling that people are using it to get out of sticky situations. It’s another tool for creating shots, and it can be used for lots of reasons, but it still requires the same amount of planning and effort one puts into a “normal” shot.

LAPPG: Virtual Production is certainly rising in popularity, but have you seen situations where Virtual Production was not the right choice for a production? How did that play out?
AH: Personally, I haven’t been in that situation. I think that’s come from the supervisor in me; I like to create “10-minute tests” to see if the theory will work or has a chance of working. It’s basically a soft prototype.
For our POCs we did more than 10-minute tests; we pretty much tech-vised all of the sequences. It was too expensive to rock up and test on the stage, so I worked everything out with a mixture of CAD, Blender and Resolve. Perhaps my SFX training came into play, because I also made sure we had different ways to supply the material. Traditionally, if we built an SFX rig, it would go through hours of testing and then stop working precisely when the camera record button was pressed. Amazing how that works; perhaps something to do with quantum physics.
LAPPG: What are your go-to tools that you use most often in your work?
AH: For planning and simulation, I’ve been using a combination of Fusion360, Blender and Resolve. Together, they can create fairly accurate pre-vis animations with tech-vis levels of accuracy.
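As a rough illustration of the kind of tech-vis arithmetic that planning involves, here is a minimal pinhole-camera sketch, not from Lexhag’s pipeline, of how much LED wall a given lens frames at a given distance; the lens and sensor values are illustrative:

```python
import math

def wall_coverage_m(focal_mm: float, sensor_w_mm: float,
                    distance_m: float) -> float:
    """Pinhole estimate of how wide a strip of LED wall a lens sees.

    Horizontal FOV = 2 * atan(sensor_width / (2 * focal_length));
    width covered at distance d is 2 * d * tan(FOV / 2), which
    reduces to d * sensor_width / focal_length.
    """
    h_fov = 2.0 * math.atan(sensor_w_mm / (2.0 * focal_mm))
    return 2.0 * distance_m * math.tan(h_fov / 2.0)

# Illustrative values: a 35 mm lens on a ~24.9 mm-wide Super 35 sensor,
# 4 m from the wall, frames roughly 2.85 m of wall.
print(round(wall_coverage_m(35.0, 24.9, 4.0), 2))  # 2.85
```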
On the delivery side, we were able to introduce Mistika into our pipeline, which can handle many of the giant deliverables needed for the bigger stages in London.
LAPPG: Do you use Mistika VR as well as Mistika Boutique and what are the benefits of using Mistika Technology for Virtual Production?
AH: Roughly four years ago, we started using Mistika VR when we created a hybrid VP workflow. By that, I mean we have low-res LED panels throwing light onto the subject while the background the camera sees directly is green screen.
We captured our plates using very high-resolution 360° cameras and used MVR to deliver the 360° material, which was used in the composite.
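For readers curious how a stitched 360° plate feeds a composite, the core lookup is mapping a view direction to a pixel in the lat-long image. Here is a minimal sketch with assumed axis and orientation conventions, not Mistika’s actual implementation:

```python
import math

def direction_to_latlong_px(x: float, y: float, z: float,
                            width: int, height: int):
    """Map a unit view direction to pixel coordinates in an
    equirectangular (lat-long) 360° plate.

    Conventions assumed here: z is up, the plate covers a full
    360° x 180°, and longitude 0 sits at the plate's centre column.
    Real packages differ, so treat these as illustrative choices.
    """
    lon = math.atan2(y, x)                    # -pi..pi around the rig
    lat = math.asin(max(-1.0, min(1.0, z)))   # -pi/2 (down)..pi/2 (up)
    u = (0.5 + lon / (2.0 * math.pi)) * width
    v = (0.5 - lat / math.pi) * height        # row 0 = straight up
    return u % width, min(max(v, 0.0), height - 1)

# e.g. looking straight ahead (x=1) lands in the centre of the plate
print(direction_to_latlong_px(1.0, 0.0, 0.0, 8192, 4096))  # (4096.0, 2048.0)
```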
These days we have several Mistika Boutique seats, because it offers many advanced features over Mistika VR and integrates well with our existing VFX pipeline.
LAPPG: Is color handled any differently in Virtual Production than in more traditional production?
AH: Colour is handled the same as what we’re used to in the post-production world. The London stages we’ve delivered to are growing their colour pipelines, and we’re starting to see the introduction of ACES.
LAPPG: How does Mistika Technology contribute to the VP workflow? Which aspects do you find the most relevant, and which obstacles does it help you overcome?
AH: Mistika Tech is brilliant at the heavy lifting of delivering hours of array material. It’s still very much a finishing suite, but it has a toolset I’ve not seen in other systems.
It works well with the other key tools and complements the VFX/online workflow.
For example, on certain shots we don’t need to leave the Mistika environment; it’s capable of conforming, FX finishing and delivering, and it does so with some speed. I’m used to a VFX toolset but have experience in the picture-post world; I see Mistika as a product that connects very well to both.
LAPPG: I know that with various NDAs you are limited in what you can say, but can you tell us a little about the project you are currently designing for a new show that will feature a lot of VP and traditional SFX integration, and what challenges you are facing?
AH: We are working on a project with a strong real-time environment aspect, rather than shooting 2D plates. It’s too early to say what the challenges will be, but I’m expecting things like how well the physical world joins with the digital world and how we blend the two. We want to use lots of atmos and particle effects; it will be interesting to see how they react to the LED.
LAPPG: Can you tell us about the stitching work and camera setups you’ve been doing and using recently?
AH: Our most recent stitching work has been for the Brownian Motion nine-camera arrays.
These have been mainly for in-vehicle plates. The array consists of eight Arri Alexa Minis covering the circumference and one RED with a fisheye covering the sky. The stitching project delivers one 16k x 2k lat-long image for cylindrical projection and one 2k square plate that covers the sky and gets mapped to the ceiling of the VP stage.
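A note on the geometry of that delivery: an 8:1 lat-long strip covering the full circumference spans roughly 45° vertically (a full 360° x 180° equirectangular plate is 2:1). Below is a minimal sketch of mapping a strip pixel back to a view direction; the 45° band and the axis conventions are assumptions, not figures from the interview:

```python
import math

STRIP_W, STRIP_H = 16384, 2048     # the 16k x 2k lat-long strip
V_FOV = math.radians(45.0)         # assumed from the 8:1 aspect ratio

def strip_px_to_direction(u: float, v: float):
    """Map a pixel in the cylindrical lat-long strip to a unit view
    direction from the centre of the rig (z up, row 0 at the top of
    the band). Axis conventions are illustrative assumptions."""
    lon = (u / STRIP_W) * 2.0 * math.pi          # full 360° circumference
    lat = V_FOV / 2.0 - (v / STRIP_H) * V_FOV    # +22.5° down to -22.5°
    return (math.cos(lat) * math.cos(lon),
            math.cos(lat) * math.sin(lon),
            math.sin(lat))

# e.g. the centre of the strip looks out horizontally, half-way round
print(strip_px_to_direction(STRIP_W / 2, STRIP_H / 2))  # ≈ (-1.0, 0.0, 0.0)
```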
LAPPG: Can you tell us about the UE levels you’ve designed for use in a real-time volume?
AH: It’s too early to talk about the current UE levels; all I can say is that we’ve been making five zones within one world. These will play back in real time and will benefit from a live-linked camera.
For our POC, we created three levels: a neon city for some driving tests, a Nevada desert range with a pre-rendered smoke sim, and a modern US city block.
It’s worth noting that these scenes were made to get the team into building UE levels and to see what’s involved, from design to creation to optimization and playback. We also had ideas about what we wanted to do with the foreground, which meant we didn’t have to spend tremendous amounts of time on the UE scenes. We knew we’d be able to introduce foreground elements to blend the UE levels into the shots.
LAPPG: Where do you see Virtual Production going in the future? Will there be something beyond VP?
AH: I think the future is bright for VP.
The market will level out and find its place. The industry will gain experience and learn what kind of tech it needs to complete which shots. Different volume configurations will become tools for particular jobs.
The elephant in the room has been content. The volumes don’t do anything without material. This has been a learning curve for everyone. We’ve tried content from both 2D (photographic) and 3D real-time sources for our tests and projects.
From a 3D real-time point of view, it’s still quite time-consuming and resource-hungry to create large real-time levels. More training is being pushed, but I don’t think there are enough resources out there to build complex levels, or at least not enough people who want to build assets for film and TV.
Our studio has seen a lot of 2D content being used, and Mistika has been part of our workflow to deliver it. I think there’s more to come from this world; perhaps we’ll see different camera arrays capturing volumetric material that can be post-processed for use in a volume. I remember the Stereo 3D boom; maybe there’s room for similar capture techniques to creep back into this market (there’ll be plenty of people shouting at this blog at the mention of Stereo).
Essentially, what we want in 2D is depth, and that’s what Stereo gave us. I was first introduced to Mistika during that boom, and it was an amazing tool that could deal with huge data and multiple cameras. Maybe the new Stereo is Volumetric; I’d love to see Mistika lead the charge.