Meet Editor Sarah Katz

We are excited to introduce you to Sarah Katz, editor at Frame.io, an Adobe company. Sarah details some of her early work experience, including eight internships(!), and offers practical advice for people trying to break into this business. She also peels back the curtain and shares the workflow she uses when working on projects for and with Frame.io, the market driver for cloud-based video review and collaboration. Enjoy meeting Sarah!

LAPPG: It looks like you had a bunch of internships starting out. What led you to choose the path leading to post-production?

Sarah Katz: I actually had eight internships in college and four of them were at Viacom, now called Paramount Global. The first, at Viacom Catalyst, where they did all their creative services, really influenced me because I learned Final Cut and Photoshop there. They taught me everything I knew and took a big chance on me. Even though I was a Media Studies major at Queens College and took editing classes, it was the people who worked hard with me at Viacom who gave me my real education.

I was very conscious of the fact that I needed an actual skill set on my resume. My parents had hardwired into my sisters and me the importance of financial independence and security, so I felt I needed a skill set that people were going to latch onto. As I started editing, I realized that I was pretty solid at it and that skill set would be the one that stood out. At the time, I also worked heavily with Photoshop and After Effects and considered the design route, but with the way my brain works, I liked the movement of it all. I landed my first job two weeks before graduation on (Viacom’s) TV Land marketing team. I actually got it through an internal internship fair, which I ended up running the booth for every year after that. It’s a good Viacom success story—from intern to eventually producer/editor.

But back to my first job at TV Land: I was assigned to be the PA on the final season of Hot in Cleveland and I would take initiative in my spare time and just start editing little things. Like, I knew that Mario Lopez was going to be in an episode, and I knew he was in an episode of Golden Girls. I thought, let me see if I can make a hybrid promo and see how that works. And it did. And they aired it that week as their promo!

For me, it was more about getting a solid foot in the door. I just wanted to do something creative and I didn’t want to limit my options. I know a lot of people go into it with the headspace of “I want to be a director, so I’m going to be a director.” But that wasn’t my head space at all.

LAPPG: So now that you’ve been an editor for a while, what is your favorite part of the editing process?

SK: My favorite part of editing is that first pass because that’s really your baby. Everything after that is when I say goodbye to it, because rarely does a spot get approved on the first pass. And sure, there are a lot of people’s notes, and I welcome those because it’s how you grow!

Sarah standing at workstation
Sarah in her element at a standing edit workstation.

LAPPG: As you’ve worked your way up, and you went from New York to LA, it seems like short form is where you landed. So what skills or traits do you think working in short form requires and what do you like about short form?

SK: Well first, I don’t think you need to be in LA, per se. The reason I moved was because I got a job at Hulu and they moved me out here. It was one of those things where it’s like, if I don’t take that job, I’m always going to wonder what if… so I took the job. I’m still kind of bi-coastal—my whole family is in NY so I’m back and forth all the time.

When you’re thinking about short form—let’s say you’re telling a case-study type of story—you need to think about “What is the goal of this piece and how can you say it in the simplest but most impactful way?” So as you’re screening the footage, you’re listening for those lines and moments.

Whether it’s a three-minute piece or a :30 piece, I can review it over and over and over again, soaking it in the way the viewer would to see if what I’m doing is working. They’re little digestible pieces of content and I can get into my viewers’ headspace a bit.

LAPPG: What advice do you offer for people trying to get into the business?

SK: You need to start somewhere and get your foot in the door, which means you have to have a wide range of skills and be open to different entry-level opportunities. Don’t be too narrowed in at the beginning, because there’s so much time to pivot. So, that’s my first piece of advice. And my second piece is don’t have a five-year goal. People always ask me in interviews what my five-year goal is and there isn’t one, because this industry is so consistently changing that I don’t want to disappoint myself. I’d rather ride the wave and see where it takes me, with the consistent underlying goal of growing my skill set and progressing forward.

For example, landing at Frame.io, making content to market products that I fully endorse and would have loved to use at Viacom and Hulu, is something I’m so grateful for, but never would have imagined was part of my road map five years ago. It’s kind of a meta experience because we’re editing for ourselves. We are our target audience. We’re those industry folks who would use Frame.io for production and post at our jobs even before we worked at Frame.io. We all have previous experience from our careers that directly impacts how we build the technology for our peers.

LAPPG: So having a job as an editor with an industry-leading company like Frame.io sounds like a dream job. Can you tell us about the work you do there? And can you give us an example of a workflow for a project you found challenging, but are particularly proud of?

SK: It really is a dream job because we’re using our own product every day to create productions that we use to market to customers like us. People who don’t use it and see my editor workflow at Frame.io are like, “Oh my God, we don’t work that fast. We’ve never done it that fast.” To me, I just think this is the way everyone’s doing it, but I’m wrong. Working for Frame.io makes me an early adopter of the future of editor workflows and I’m honored to be a part of it.

This all really hit me when we were planning our NAB shoot, because we had three weeks from the day we wrapped production for this piece to be finalized. I was on set with Michael [Cioni] for a three-day shoot and he was directing. By using our Camera to Cloud workflow, I was editing on set all three days, which meant post was happening simultaneously with the shoot. I had my assistant editor there, and we were set up with LucidLink so that we could both be in Premiere Productions at the same time to organize dailies, etc. The shoot was Wednesday to Friday. By the time we wrapped on Friday we were 75% done with the edit.

Sarah is able to start editing on set.

By Monday, we technically had a first pass to share, but we didn’t have time on [Frame.io co-founder] Emery Wells’ calendar yet. When we started the shoot, we’d calculated that the head start Camera to Cloud gave us would mean a cut to share by Wednesday, which meant that we beat our own predictions! We ended up showing Emery on Tuesday, and then from Tuesday to the following Tuesday, I’d say I had like 25 passes back and forth to get picture lock.

We had a lot of music choices that everyone had different opinions about, and at the same time we had animation and VFX that needed to be placed in, and then color correction and mix. So it was a super hefty project.

We used Frame.io to organize the versions for all the different departments, so, like #Animation001, #Animation002, etc. The same for VFX and the rest, so we could send the versions out to all the different disciplines, including color. They’d see the 001, 002, 003 and all the spots in the piece that they needed, which meant that we were able to split post into two parallel lanes while I kept my focus on the cut.
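The department-plus-number naming scheme Sarah describes lends itself to simple automation. As an illustrative sketch (the labels and parsing here are assumptions for this example, not a Frame.io feature), a few lines of Python can pick out the latest version per department from a batch of review labels:

```python
import re
from collections import defaultdict

# Hypothetical version labels in the style described above,
# e.g. "#Animation001", "#VFX002". The regex is an assumption.
VERSION_RE = re.compile(r"#(?P<dept>[A-Za-z]+)(?P<num>\d{3})")

def latest_per_department(labels):
    """Group version labels by department and keep the highest number."""
    latest = defaultdict(int)
    for label in labels:
        m = VERSION_RE.fullmatch(label)
        if not m:
            continue  # skip anything that doesn't follow the convention
        dept = m.group("dept")
        latest[dept] = max(latest[dept], int(m.group("num")))
    return {d: f"#{d}{n:03d}" for d, n in latest.items()}

versions = ["#Animation001", "#Animation002", "#VFX001", "#Color003"]
print(latest_per_department(versions))
# → {'Animation': '#Animation002', 'VFX': '#VFX001', 'Color': '#Color003'}
```

A consistent convention like this is what makes it possible to fan versions out to animation, VFX, and color in parallel without anyone grabbing a stale cut.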

We delivered in three weeks, and it was awesome and incredible because Camera to Cloud gave us that time advantage. In the past, I was always waiting around for a drive, which meant that I couldn’t work during the production and that was wasted time. I feel weird because it sounds like I’ve drunk the Kool-Aid, but it really does save a ton of time!

What was also amazing was that because I was cutting while they were shooting, I could work with Michael to answer any questions we might have while we were on set. We were doing a lot to link up different shots and make it look continuous, so having that reference was incredibly valuable. Did we shoot it fast enough to match the shot before? Did we need another angle or an insert? We would be able to go back and forth and figure that out while we were still there. And then, when they were setting up for a new shot, Michael could come over to me and check out my progress.

LAPPG: Speaking of Michael Cioni, Sr. Director of Global Innovation at Adobe, he seems to be this incredible blend of creative and technical and quite an industry leader. What has been your experience working with him?

SK: Yeah. He’s a visionary through and through. You watch things about visionaries, you see them on TV or in documentaries, but I’m actually working with one, which is pretty wild. Not that he would ever call himself one and that’s not how he sees himself, but you have to understand that this man knows every single element of production, every single element of post, and he can give you notes on every aspect.

Director Michael Cioni and Editor Sarah Katz take a moment onset.

And it’s not just, oh, he’s dabbling in it. He’s an actual expert in everything that he does, so there are so many ways I can learn from him. At that same NAB we did the video for, I got a chance to present about Camera to Cloud. It was my first time presenting, and I really tried to emulate some of the ways that Michael speaks. He’s a role model for me in so many ways and I feel so lucky.

Michael Cioni and Sarah Katz working on set.

I’m always trying to keep up with his speed but sometimes I have to remember that he’s a superhuman in every possible way, and you just can’t always keep up.

Sarah Katz presenting at NAB 2022.

LAPPG: How has your work changed since using Frame.io?

SK: Well, first, there is literally no point in using email anymore. I mean, I use it to get invites for meetings. Slack is how I communicate about little things, and Frame.io is how I communicate everything that applies to the actual work. And if I’m sending a Frame.io link, it’s over Slack, because we’re sending someone to the project. They would know where to go to get back to that spot. Frame.io gives you organized, localized notes, but it also lets you draw right on the specific video frame. I’m not a mind reader, so if you want the punch-in a little to the left, you can draw the placement for me and I can see exactly what you want. Or if you want a portion of a spot sped up, let’s say, you can make a range-based comment so I know from exactly what part to what part you want sped up. These notes make it so much clearer and easier for me, and it saves a few rounds back and forth.
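Under the hood, a range-based comment is just an in point and an out point on the video. As a minimal sketch (the note structure below is an assumption for illustration, not Frame.io’s actual data model), converting such a frame range into editor-friendly, non-drop-frame timecode at 24 fps looks like this:

```python
def frames_to_timecode(frames, fps=24):
    """Convert a frame count to HH:MM:SS:FF (non-drop-frame)."""
    ff = frames % fps
    total_seconds = frames // fps
    ss = total_seconds % 60
    mm = (total_seconds // 60) % 60
    hh = total_seconds // 3600
    return f"{hh:02d}:{mm:02d}:{ss:02d}:{ff:02d}"

# A hypothetical range-based note: speed up from frame 120 to frame 264.
note = {"text": "speed this section up", "in_frame": 120, "out_frame": 264}
print(frames_to_timecode(note["in_frame"]), "->",
      frames_to_timecode(note["out_frame"]))
# → 00:00:05:00 -> 00:00:11:00
```

Because the note carries exact in/out points rather than a vague “around the middle,” the editor can jump straight to the affected range in the timeline.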

LAPPG: And where are most of these pieces you’re editing, airing?

SK: Obviously YouTube is a big one, and our social outlets. A lot of them also go onto the website. So the “What is Frame.io?” video that we just completed a few weeks before this NAB piece is the first thing you see {on the website}—it’s like, “Oh, you want to know how this works—watch this video.”

Sales uses them a lot. If we’re trying to sell to an agency and we can show how an agency uses Frame.io, it’s a very useful sales tool. Honestly, we get requests for video content from lots of parts of the business, from Sales to creating ads for Marketing, to testimonials, to instructional videos and presentations.

LAPPG: How do you see Camera to Cloud contributing to the future of filmmaking?

SK: It is the future of filmmaking. I mean, a friend of mine who doesn’t have a huge budget is shooting a movie, and I got him a Teradek Cube to give him access to Frame.io. And now they can have the editor making sure that they’re capturing exactly what they need while they’re shooting, which is tremendous. They get to save money on the editor. They get to show the people who invested in the movie what’s happening quicker. This technology changes more than just the speed at which you work—it changes the way you work. You can reshoot something immediately if you need to.

I think never losing the footage is another benefit. When you’re shooting and your footage is going immediately into the cloud, you never have to worry about losing a drive. And then there’s the ability for news footage to be immediately available. For example, when Steph Curry shot his record-breaking three-pointer, those were Camera to Cloud proxies that got put up on Twitter in 15 minutes. And the only reason it took that long was that they wanted to use the locker room footage afterwards, and there wasn’t a mic directed at his mouth, so they needed to put captions on and make sure they were accurate. That’s what took the 15 minutes. But the proxies were there immediately.

I think about what it’s going to do for news, what it’s going to do for sports, what it’s going to do for archival purposes around the world. And now that Filmic Pro is integrated with Camera to Cloud, I think about things like social media. It’s so fast and so easy. An editor or an AE can just edit it and post it, especially since you can get 4K video proxy files and full-bandwidth audio to the cloud immediately.

I think Camera to Cloud for filmmaking will be tremendous, as well, because why would you want to have to ship hard drives? What about if you’re shooting in Australia and you want to use an editor who’s in Europe? It’s no longer about location, it’s about getting the best talent available. In my opinion, that’s what Camera to Cloud is going to enable for features.

LAPPG: In what ways are both sides of the playing field – production and post – affected by using Camera to Cloud?

SK: Basically, Camera to Cloud is about giving the people who need access to the content an easy way to get it. For production, if there’s a shoot going on, you don’t have to crowd around video village. The producer could be running around but can watch the shoot on their phone and check which shots have been recorded. Or the producers or clients don’t even have to be near the set, and they can still see what’s going on. As the editor, I’m able to download the footage and start cutting. As the director, Michael can use [the integration with] ZoeLog to keep markers on the takes he likes and tag me. And I can get a notification from Frame.io on my watch or phone.

And then beyond the production, it’s the way for all your teammates or clients to find everything in one centralized location. That’s what’s really amazing.

Sarah onset with the production team.

LAPPG: So on a more personal level, how do you maintain work-life balance, especially working from home? And do you have hobbies or any special things that you do for self-care?

SK: Yeah, it’s important, especially when working from home. I do have a separate room for my editing station, which I know is lucky. Not everyone can do that, but having that separate room, that’s my office space so mentally I can close that door. My living room’s for relaxing. My bedroom’s for sleeping. That’s super helpful.

Living in LA, the weather is beautiful, and I live right by Runyon, so I run and have it down to a tight half-hour loop that I can do in between meetings. Exercise is super helpful. I also have a standing desk, so I’m not just sitting the whole day editing.

I love music, so anytime I can go out and just hear some live music or go to a good concert, that’s always a good break from it all. Whether it’s a jazz bar or a classic rock show (I’m a big classic rock person), I like it all. I also love TV. I started in TV, so while movies are amazing and I do appreciate a good movie, there’s something about the storytelling of TV.

Work is a huge part of my life and I’m not upset about that. It brings me joy. At the place I am in my personal life, I have the space to give more to it. That might not always be the case, but it is right now.

Meet Colorist Ryan McNeal

We would like to introduce you to Ryan McNeal, an independent colorist providing color services to projects large and small across a wide range of mediums: TV, film, music videos, and web content. In this post you’ll discover how Ryan went from hired hand to owning his own studio, the techniques he uses to bring out the emotional intention of the images he works with, and how he uses Blackmagic gear and Resolve to stream, edit, and color his own projects as well as his clients’.

Los Angeles Post Production Group: Can you share how being an oil painter inspires your work as a colorist and what tools you use to help tell a story and influence viewers’ emotions?

Ryan McNeal: Although film, TV, and streaming are all relatively modern mediums to work in, the art of storytelling through visual media started at the dawn of humanity.  We are very visual creatures and visual storytelling is a pretty universal language.  For me, the great painters in classical art history discovered and pioneered many techniques that filmmakers employ every day: composition, a controlled color palette, dramatic lighting, emotive subjects, etc.  When I color, I am motivated by the art of the shot and approach it like a painting.  What are the hues that will best tell this story?  How much of the subject’s world should we see?  How isolated should the subject be?  What is the base emotion the audience should feel?  Is this a harsh world? Or a soft one?

Broadly speaking, I use a set of techniques to execute the emotional intention of the images.

            1. Creative correction – control of the color palette to evoke the right emotion and genre. This may or may not include film emulation, depending on the project.
            2. Creative shaping – vignettes, pinches, gradients of luma and/or hue to build up or break down the focus of the image.  Where are we looking?
            3. Texture – Images can be soft or sharp, in many senses of the word.  I selectively add and reduce contrast in areas of the image as well as sharpness.  Additionally, bloom and halation can be used to soften highlights. 
            4. Color density – how heavy the colors feel; this has more to do with luminance than saturation.  Film, for instance, is generally more dense in the shadows than the highlights, whereas video is linear and equal throughout the tonal range.
            5. Grain – in most narrative work, even a small amount of grain is helpful in battling aggressive compression algorithms.  A noticeable degree of film grain can be pleasing to the eye, depending on the aesthetics of the film.
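Point 4 can be made concrete with a toy curve. A gamma greater than 1 darkens shadows proportionally more than highlights, which is one crude way to mimic film’s heavier shadow density; the curve shape and gamma value below are illustrative assumptions, not a real film emulation:

```python
def linear_response(x):
    """Video-style response: density is equal throughout the tonal range."""
    return x

def film_density(x, gamma=1.4):
    """Toy film-like density curve (illustrative only).

    A gamma > 1 pulls shadows down proportionally more than highlights,
    so dark tones feel 'heavier' while bright tones barely move.
    Input and output are normalized to the 0.0-1.0 range.
    """
    return x ** gamma

# Compare how much each tonal region is darkened.
for tone in (0.1, 0.5, 0.9):
    video, film = linear_response(tone), film_density(tone)
    print(f"tone {tone:.1f}: video {video:.3f}, film-ish {film:.3f}")
```

Running it, the 0.1 shadow tone loses most of its value while the 0.9 highlight barely moves, which is the “dense shadows, open highlights” feel Ryan describes, as opposed to video’s equal treatment of the whole range.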

LAPPG: Your work as a color assistant at Company 3 eventually led you to become a freelance colorist and start your own independent color studio, RKM Studios, with your wife, Becky. Can you tell us about the process of moving from being an assistant at a big company to going off on your own?

RM: Going freelance after working for a larger company was scary.  I thought for sure I had enough clients and was ready.  The reality was that I quit right when my handful of clients didn’t have any work.  I put a small office on a credit card and started hustling.  I would wake up early, hit Mandy and Craigslist each day, and apply to everything, regardless of pay.  I saw myself as being in a mode of needing to build out my network, and I would worry about the income later.  I got good at nabbing gigs on Mandy, having learned that you can stand out by specializing and not being a jack-of-all-trades.  I paint, I draw, I do photography, I write, I direct music videos, and I color.  But for Mandy jobs, I was just a colorist and nothing else.  Consistently I was told that was the reason clients picked me: I applied early, and I wasn’t trying to be everything.

My wife was working full time and I was getting enough freelance work to keep the bills current.  I would often work 16 hours and had a poor sense of boundaries with clients.  I soon got busy enough to get the office space off my credit card, and I asked my wife, Becky, to join me and help start a legitimate business.

That was about 7-8 years ago, and since then we’ve grown into a six-person team, having recently brought on another colorist, Michael Schatz.  As a small team, we are nimble and able to provide a boutique experience.  We’ve carved out a niche by providing high-quality work and treating every project with careful and intentional collaboration.  Clients love that we get involved and care about achieving the best image for their films.

RKM Studios

LAPPG: What services does RKM Studios currently offer and how much of the work you do is now remote collaboration?

RM: We provide creative color grading services for film, TV, streaming, and social media.  We work in SDR and HDR and have two color suites set up for accurate viewing environments.  We also offer online finishing services for long-form projects.

We’ve seen continued interest in remote sessions beyond COVID reasons.  DPs are rarely paid to attend color sessions and often have to choose between working on paying gigs or being in the color session and missing out on work.  We’ve had a lot of DPs, especially longtime clients, thrilled to be able to jump on the remote stream from set so that they can still be a part of the discussions without giving up work.  Of course it’s always best if we can all be in the same room, but with the way things are evolving, that is more and more a privilege rather than a requirement.

We’ve also been doing a lot of hybrid sessions, where the director or another creative is present at our studio, and we’ll have one or more creatives on our remote stream at the same time.  For our stream we use the Blackmagic Web Presenter 4K units with calibration LUTs to ensure high-quality real-time video anywhere in the world. 

Color Suite at RKM Studios

LAPPG: What advice do you have for colorists just starting out and what skills should someone cultivate to be able to do this work successfully?

RM: You can only learn how to be an artist through experience.  For the gear and the software, the internet can teach you just about anything, but to develop your critical eye and create art, you have to cultivate and enrich the artistic side of yourself.  Study classical art, study art history, study color theory and how it can be applied psychologically.  Photography is an excellent adjacent hobby that can teach you all about cameras and capturing images, as well as retouching.  Personally, I do analog film photography as a hobby and develop my own photos for this reason.  Get inspired by art that isn’t film.  Go to museums, look at graphic design, go to gallery openings and find out what other people think of art.  Pay attention to the psychological reaction to art from those around you.  Most of the up-and-coming colorists I see on LiftGammaGain (a colorist forum) are very technically minded and are overly engaged with the tools and the gear.  In the beginning, you’re hired because you’re the guy at the right price with the gear, but as you develop, you’re hired for your taste and speed.

On a similar note, post-production is not a “you build it, and they come” type of business.  You are hired on the equation: (Reputation + Reliability) * YourNetwork = nColorGigs.  Don’t waste money buying all the right gear before you’ve proven that you can make it work for you.  A decent laptop and a Resolve Mini panel are enough to get started on student and low-budget stuff.  You have to create a brand and cultivate meaningful relationships with filmmakers so that you grow and maintain your network.  If you are unpleasant to work with, you will struggle in freelance.  I was an introvert and it took me a long time to come out of my shell and be more approachable.  I learned my lesson, and I offer it in kind.

Have humility.  Ego will destroy everything it touches.  It’s easy when you are good at something to wield that as a weapon against those you work with.  That makes you difficult and it will inhibit you from learning.  You will work with many people that know less about color, image design, art—that’s why they are hiring you.  Make sure you always approach disagreements with grace and a problem solving attitude.

Ryan McNeal working with DaVinci Resolve at RKM Studios

LAPPG: RKM Studios does really impressive work from music videos for Alicia Keys, Panic! at the Disco, and the Jonas Brothers to commercials for Nike, Acura, Hasbro, and Red Bull. What types of projects get you most excited to work on and are most projects collaborative or do directors generally come in with a particular vision which you deliver?

RM: Thank you for your kind words! I am most excited about projects that are creative; it’s fun to work with colors that aren’t typical and try new things.  But more than anything, I want the relationship to be positive.  The works you mentioned above: those filmmakers are some of the nicest, most patient collaborators we work with, and I value that experience so highly.  You either have the privilege of spending 8 hours together making art, or you are trapped in a dark room with unpleasant people for 8 hours making mud, and I’d rather it be the first.

Every project is a different experience. Sometimes the filmmakers have a very specific vision, and the area I can play in is narrow.  Sometimes the filmmakers do not have any specific vision for color and are looking for a creative to collaborate and dream up a look with them.  I find I have to adapt to whatever’s needed, based on the experience level of the client.   

Examples of Ryan's Work

LAPPG: When first starting out you generally don’t get to do higher profile projects like these. Can you talk about how the work has evolved?
RM: In the beginning I was sifting through work on Mandy and Craigslist.  No project was beneath me since I needed to build my skills and my network.  It was about forging relationships.  There are three clients I still work with today that discovered me on those platforms.  All of them had super low-budget projects, but I was willing and eager to put in the time and make the connection.  Nearly 8 years later, those clients bring us some of the biggest work we’ve gotten.  It’s important to invest in people, because your network brings you the work, and everyone is trying to grow.  You never know who’s going to be the next big thing.
Over time, I got a couple creative projects that got me noticed by a little bit higher caliber clients.  And then that work got me noticed by even better clients, and so on and so forth.  The power of referral is huge.  All our work comes from referral.  We also  curate our social media presence and website to promote the work that we want more of.  These days, that is long-form narrative.
It’s cool to work on high-profile projects, but I’ve learned not to be overly engaged just because there is a celebrity attached.  Oftentimes, the presence of a celebrity ends up making the job all the more difficult from a communication and efficiency standpoint.  If the celebrity has a culture of fear around them, it leads to creatives worrying about approval and second-guessing their work.  I think it takes a seasoned producer to handle the approval process and expectations; otherwise things quickly go off the rails.
My post producer and partner, Becky, is excellent at that sort of communication, and it is so much easier to work with difficult personalities when they know the limits of engagement.

LAPPG: When you own an independent studio, in many ways you can always be on the clock, as you need to do whatever it takes to get your clients’ projects out the door on time. How do you deal with ever-tightening schedules and delivery dates?
RM: It takes a team to deal with the turnaround expectations in our industry.  When I was freelance on my own, a 16-hour day was prescribed by the work.  Now, with a small team, we can efficiently break up the work and move even quicker.
It starts with the prep workflow.  We prep and color all jobs expecting last minute edit changes, overall color notes, and pickups.  By creating a pipeline to handle the chaos, the chaos isn’t so overwhelming.
I am always exploring new tools and tech to push the envelope on speed and efficiency.  Every second we cut out of the process adds up when we are responsible for dozens of projects per month.
Recently we worked on a feature doc where the deadline was tight and we needed to color at the same time the online was happening.  The company doing the online was able to work independently in Resolve while we started color separately; they just sent us their .drp file when it was ready, and we updated mid-color with no issues.  It was a smooth process that cut out a lot of time and allowed us to hit the client’s deadline!  Without that solution, we would have needed to wait a full two weeks before starting color.

LAPPG: You and your wife, Becky, have been working together for a long time. How do you two balance the work, and what are some of the things you’ve learned to make this professional partnership run smoothly while maintaining your relationship, since the lines can be blurry between work and home in this type of situation?

RM: Becky and I work really well as a team.  Open and clear communication of expectations is key.  We always talk things out.  It is important to be extra support for each other.  As business owners, we find ourselves worn out, overwhelmed, and stressed.  We each try and take on some of that weight when it is becoming disproportionate.  We also try and set boundaries.  Sometimes ineffectively.   During the work-from-home period of 2020-2021, it was really hard for us living with our work.  I congratulate anyone who can do that, we cannot.  Moving back into an office space was important for the growth of our company, but also for our mental health and being able to physically separate work and home life.

LAPPG: You also work as a director and enjoy taking a bold and cinematic approach to your projects. Your most recent short film, Desert Rose, premiered at acclaimed film festivals including the Academy-qualifying HollyShorts Film Festival in Los Angeles. Congrats on that! What gear did you use to help you tell the story you wanted?
RM: Thank you!  Desert Rose was a very fulfilling personal project and I am very excited to get my next narrative endeavor off the ground.

For Desert Rose, we shot RED Helium and used Kowa anamorphic lenses.  I love the decidedly vintage feel of that lens set; it was perfect for our western.  I also direct music videos, and usually we do a paper edit in DaVinci Resolve.  I like to put in titles that describe what is happening and edit that to the music to feel out the timing.  Then we edit in Resolve and color in Resolve.  We’ve been doing that for years now, and it’s been great to see Resolve mature fully into a more-than-capable NLE.  I do my own VFX on my directorial projects, using a combination of Blender, After Effects, and Resolve to build, composite, and color each VFX shot.

Images from Desert Rose

Meet Effects Supervisor Alexis Haggar

This month we invite you to meet Alexis Haggar, Visual Effects Supervisor at award-winning independent visual effects studio, Lexhag VFX. Alexis details his journey into effects, his work in Virtual Production, the tools he uses, including Mistika Boutique, and advice for planning a Virtual Production.

Los Angeles Post Production Group: After film school you started working at a game company doing feature and advertising work. What made you decide to start Lexhag, now an award-winning independent visual effects studio in London and Norfolk, and what sets you apart from other VFX companies?

Alexis Haggar: Straight out of film school, where I graduated as a director, I tried to get directing work but quickly decided that it was going to take a long time to build a directorial career, and I still wanted to learn.

The job with the games company came through a film school friend who had gone there to start their movie arm. He asked if I wanted to help him make a series of six commercials to go out in prime time on the Sci-Fi Channel. This was an opportunity I couldn't miss, so I joined the company to produce the ads. Straight after that, we were offered the chance to make a feature with the same company, which we did, inviting all of our film school colleagues to make it with us.

Once those projects were finished, I moved into the Special Effects world, where I spent time blowing things up and making weird and extraordinary rigs. At this point (perhaps 2004), the SFX industry was worried that digital would take over, and many of its jobs were already being replaced with CGI. In high school (secondary school for us Brits) I had completed my Art A-level with CGI, so I was well versed, largely self-taught, and felt I could transition from SFX to VFX.

I wanted to keep both sides alive, and there wasn't a company with departments for both. So I decided to start my own company that could service both SFX and VFX or, at the very least, design effects with both disciplines in mind.

Lexhag was born, and to this day we're still recruiting people who are practical, creative, and technology-savvy; there's never been a better time to combine all those skills.

LAPPG: You recently announced that Lexhag is now offering Virtual Production services, but it seems like this is not something totally new for your company. Can you tell us how you started doing this work and what types of services you now offer, including how you support other production workflows?

AH: Virtual Production, as the industry knows it (or the current trend), is rear-screen projection on steroids. We now have a whole load of tech that can make the old "dumb" methods "smart".

Because we’ve come from a place where we’ve had experience with props, miniatures, camera tricks, projecting effects on-set and the whole post-production process (and we’re filmmakers at heart), it seemed natural for us to get involved.

We started by designing and building our own VP setup specifically for vehicles, using funding we won from an innovation grant. We created two POCs (proofs of concept) working with our partners; they supplied the film-production resource, and we supplied our image/VFX resource.

LAPPG: What is the most important technical advice you can offer someone about doing Virtual Production?

AH: Plan your shots. Do not expect to be able to walk into a volume and have it all just work. It might seem obvious to say, but I get the feeling that people are using it to get themselves out of sticky situations. It's another tool for creating shots; it can be used for lots of reasons, and it still requires the same amount of planning and effort one puts into a "normal" shot.

LAPPG: Virtual Production is certainly rising in popularity, but have you seen situations where Virtual Production was not the right choice for a production? How did that play out?

AH: Personally, I haven’t been in that situation. I think that’s come from the supervisor in me; I like to create “10-minute tests” to see if the theory will work or has a chance of working. It’s basically a soft prototype.

For our POCs we did more than 10-minute tests; we pretty much tech-vised all of the sequences. It was too expensive to rock up and test on the stage, so I worked everything out with a mixture of CAD, Blender and Resolve. Perhaps my SFX training came into play, because I also made sure we had different ways to supply the material. Traditionally, if we built an SFX rig, it would go through hours of testing and stop working precisely when the camera record button was pressed. Amazing how that works; perhaps something to do with quantum physics.

LAPPG: What are your go-to tools that you use most often in your work?

AH: For planning and simulation, I've been using a combination of Fusion360, Blender and Resolve. In combination, they can be used to create fairly accurate pre-vis animations informed by tech-vis.

On the delivery side, we were able to introduce Mistika into our pipeline, which can handle many of the giant deliverables needed for the bigger stages in London.

LAPPG: Do you use Mistika VR as well as Mistika Boutique and what are the benefits of using Mistika Technology for Virtual Production?

AH: Roughly four years ago, we started using Mistika VR when we created a hybrid VP workflow. By that, I mean we had low-res LED panels throwing light onto the subject while the direct background the camera saw was green screen.

We captured our plates using very high-resolution 360° cameras and used MVR to deliver the 360° material used in the composite.

These days we have several Mistika Boutique seats, because it offers many more advanced features than Mistika VR and integrates well with our existing VFX pipeline.

LAPPG: Is color handled any differently in Virtual Production than in more traditional production?

AH: Color is the same as what we’re used to in the post-production world. The London stages we’ve delivered to are growing their colour pipelines, and we’re starting to see the introduction of ACES.

LAPPG: How does Mistika Technology contribute to VP workflows? Which aspects do you find the most relevant and which obstacles does it help you overcome?

AH: Mistika Tech is brilliant at the heavy lifting of delivering hours of array material. It’s still very much a finishing suite but has a toolset I’ve not seen in other systems.

It works very well with the other key tools and complements the VFX/Online workflow very well.

For example, in certain shots we don't need to leave the Mistika environment; it's capable of conforming, FX finishing and delivering, and does it with some speed. I'm used to a VFX toolset but have experience in the picture post world; I see Mistika as a product that connects to both very well.

LAPPG: I know that with various NDAs you are limited in what you can say, but can you tell us a little about the project you are currently designing for a new show that will feature a lot of VP and traditional SFX integration, and what challenges you are facing?

AH: We are working on a project with a strong real-time environment aspect rather than shooting 2D plates. It’s too early to say what the challenges there will be, but I’m expecting things like how well the physical world joins with the digital world and how we blend the two. We want to use lots of atmos and particle effects; it will be interesting to see how they react to the LED.

LAPPG: Can you tell us about the stitching work and camera set up you’ve been doing and using recently?

AH: Our most recent stitching work has been for the Brownian Motion nine-camera arrays.

These have been mainly for in-vehicle plates. The array consists of eight ARRI Alexa Minis covering the circumference and one RED with a fisheye covering the sky. The stitching project delivers one 16K x 2K lat-long image for cylindrical projection and one 2K square plate that covers the sky and gets mapped to the ceiling of the VP stage.
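As a rough illustration of the geometry behind that lat-long plate (this is my own sketch, not Lexhag's pipeline code; the function name and axis conventions are assumptions), a pixel in a cylindrical lat-long image can be mapped back to a viewing direction, which is what lets the stage project it correctly around the vehicle:

```python
import math

def latlong_to_direction(u, v, width, height, v_fov_deg):
    """Map a pixel (u, v) in a cylindrical lat-long plate to a unit
    direction vector (x, y, z). The horizontal axis wraps a full 360
    degrees; the vertical axis covers v_fov_deg degrees centred on the
    horizon. A 16K x 2K plate spanning 360 degrees across implies a
    45-degree vertical band (360 * 2048 / 16384)."""
    lon = (u / width) * 2.0 * math.pi - math.pi          # -pi .. +pi
    lat = (0.5 - v / height) * math.radians(v_fov_deg)   # +fov/2 .. -fov/2
    return (math.cos(lat) * math.sin(lon),   # x: right
            math.sin(lat),                   # y: up
            math.cos(lat) * math.cos(lon))   # z: forward
```

The centre pixel of the plate lands on the forward horizon, and the left/right edges meet directly behind the camera array.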

LAPPG: Can you tell us about your own UE levels for use in a real-time volume that you’ve designed?

AH: It’s too early to talk about the current project of UE levels; all I can say is we’ve been making five zones within one world. These will play back in real-time and will benefit from a live linked camera.

For our POC, we created three levels: a neon city for some driving tests, a Nevada desert range with a pre-rendered smoke sim, and a modern US city block.

It’s worth noting that these scenes were made to get the team into building UE levels and to see what’s involved, from design to creation to optimization and playback. We also had ideas about what we wanted to do with the foreground, which meant we didn’t have to spend tremendous amounts of time on the UE scenes. We knew that we’d be able to introduce foreground elements to blend the nature of the UE levels into the shots.

LAPPG: Where do you see Virtual Production going in the future? Will there be something beyond VP?

AH: I think the future is bright for VP.

The market will level and will find its place. The industry will gain experience and know what kind of tech it needs to complete what shots. Different configurations of volumes will become tools for jobs.

The elephant in the room has been content. The volumes don't do anything without material. This has been a learning curve for all. We've tried content from both 2D (photographic) and 3D real-time sources for our tests and projects.

From a 3D real-time POV, it's still quite time-consuming and resource-hungry to create large real-time levels. More training is being pushed, but I don't think there are enough resources out there to build complex levels, or at least enough people who want to build assets for film and TV.

Our studio has seen a lot of 2D content being used, and Mistika has been a part of our workflow to deliver it. I think there's more to come from this world; perhaps we'll see different camera arrays capturing volumetric material that can be post-processed for use in a volume. I remember the Stereo 3D boom; maybe there's room for similar capture techniques to creep back into this market (there'll be plenty of people shouting at this blog at the mention of Stereo).

Essentially, what we want in 2D is depth, and that's what Stereo gave us. I was first introduced to Mistika during that boom, and it was an amazing tool that could deal with huge data and multiple cameras. Maybe the new Stereo is Volumetric; I'd love to see Mistika lead the charge.


Meet Benjamin Voelker

It's not every day that we get to speak with a physicist and discuss his work on products that are high-end tools for our industry. We were lucky enough to have this opportunity recently when we spoke with Benjamin Voelker, Optical Designer/Simulations at Carl Zeiss AG.


Los Angeles Post Production Group: Please tell us about the type of work you do for ZEISS and how you got into this field?

Benjamin Voelker: Before joining ZEISS, I worked in various fields, including nanotechnology, materials science and mechanical engineering. As a trained physicist, my professional focus was always on numerical modeling, i.e., creating numerical models that describe complex physical systems, simplify them, and make them usable for optimization problems. Apart from work, I developed a growing interest in photography over the last 20 years. What started as a hobby quickly grew into a passion during a two-year research stay at UCSB, when I got serious about landscape photography and astrophotography.

Traveling around the world to find dark night-skies and places with great wilderness is what makes me happy. So, when I joined the ZEISS Consumer Optics Business Group in 2013, this was a unique opportunity for me to bring together work and hobby. Today I’m a senior expert in the design of optical coatings and the prediction of ghost and flare in all kinds of optical systems. Working together in a team with optical designers and mechanical designers, we develop new ideas and new optics to provide cinematographers with the tools they need.

Self-portrait: astrophotography.

LAPPG: Can you tell us about the idea behind the ZEISS Supreme Prime Radiance Lenses, which were released in 2019, as well as how they were designed and created?

BV: The idea behind the ZEISS Supreme Prime Radiance Lenses is pretty unusual. Normally, modern optical systems are designed and optimized so that the image is clean and flawless, with as little "unwanted" light on the sensor as possible. Unwanted light can originate from light being scattered from mechanical surfaces of the inner contour of a lens, or being reflected multiple times on optical surfaces before it unintentionally reaches the sensor plane, causing contrast loss and ghosting artifacts in the image. To tackle this basic problem in optics, engineers at ZEISS developed the T* optical coating as far back as 1935; it literally makes glass invisible. This technology has been optimized ever since and enables us to offer optics with great neutral color rendition, the highest possible contrast, and a minimum of ghosting artifacts.

However, when my colleagues and I talked to cinematographers, we felt that some of them long for a different, less clean and perfect look. Some go for vintage lenses, with the problem that these lenses are rare and difficult or nearly impossible to service if they fail. Others use optics with some completely uncoated lens elements; this introduces very strong white ghosting artifacts, which are completely uncontrollable, destroy the image contrast, and at the same time reduce the available signal light that forms the image.

The idea behind the ZEISS Supreme Prime Radiance Lenses was to create an entire modern lens family that offers a consistent characteristic look while overcoming the difficulties and limitations just mentioned. The lens family, covering focal lengths from the super-wide-angle 18mm to the 135mm telephoto, offers pleasing and controllable lens flare that can be used as a visual storytelling element.

The first major challenge was to introduce consistent lens flare over the entire lens family. Lens flare is light being reflected multiple times on lens surfaces before reaching the sensor, so if you change the properties of a single lens surface (e.g., by applying a different optical coating), the effect on lens flare is huge. By trial and error it would have been impossible to get lens flare consistent over the whole product family. Instead, we used virtual prototyping: for hundreds of combinations, the lens flare was computed in simulation models until we found the perfect combination. This simulation was the most demanding I had done so far; on a single state-of-the-art CPU it would have been running for almost half a million hours, which is more than 50 years!
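The virtual-prototyping idea (score every candidate combination in simulation rather than building physical prototypes) can be sketched with a toy search. The surfaces, flare numbers, and scoring below are purely made up for illustration and vastly simpler than ZEISS's actual ray-traced simulations:

```python
import itertools

# Toy flare-energy contributions for each coating choice on three lens
# surfaces. All numbers are hypothetical; a real simulation traces
# millions of rays per coating combination.
coating_options = {
    "surface_1": [0.02, 0.08, 0.15],
    "surface_2": [0.01, 0.05, 0.12],
    "surface_3": [0.03, 0.07, 0.10],
}

def best_combination(target_flare):
    """Exhaustively score every coating combination and keep the one
    whose total flare energy is closest to the target level."""
    best, best_err = None, float("inf")
    for combo in itertools.product(*coating_options.values()):
        err = abs(sum(combo) - target_flare)
        if err < best_err:
            best, best_err = combo, err
    return best, best_err
```

Even this toy version shows why the real search was so expensive: the number of combinations grows multiplicatively with every surface, and each real evaluation is a full optical simulation rather than a sum.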

The second major challenge was to develop a new kind of optical coating especially for the ZEISS Supreme Prime Radiance Lenses, the so-called T* blue coating.

Computer simulation model shows how rays are reflected multiple times before they reach the camera sensor as lens flare.

LAPPG: Can you explain what the T* blue coating is and how it is used?

BV: It is a new kind of anti-reflective coating design that introduces bluish lens flare of a carefully chosen intensity level while keeping the overall image contrast intact. It ensures that the maximum apertures of the ZEISS Supreme Prime and Radiance lenses are on the same level, at T1.5. As a benefit, it introduces a slightly warmer color rendering compared to the original ZEISS Supreme Prime Lenses, and great care has been taken to avoid a green or magenta color tint in the image. The T* blue coating makes the ZEISS Supreme Prime Radiance Lenses a versatile tool: if you don't want flares to appear in a certain scene, you just need to flag the light, and you still get that nice warmer color tone.

Lens flare depends on the geometric shape and the applied coating on every optical surface.

LAPPG: It seems that 4 more lenses have been recently released in this collection. Can you tell us about those and why they were added?

BV: After introducing the initial set of ZEISS Supreme Prime Radiance lenses in 2019, we received very positive feedback from our customers. In the meantime, four more lenses had been added to the ZEISS Supreme Prime family: the T1.5/18mm, T1.5/40mm, T1.5/65mm and T1.5/135mm. Immediately after we launched the first wave of ZEISS Supreme Prime Radiance lenses, we started to develop Radiance versions of these four focal lengths. They perfectly round off the ZEISS Supreme Prime Radiance lens family, which now fully matches the ZEISS Supreme Prime lens family. As before, the greatest care has been taken to reach a consistent flare look throughout the whole family. A number of productions using ZEISS Supreme Prime Radiance Lenses are already available; you can find a list of trailers here. I'm amazed at what cinematographers are getting out of these lenses, and how subtly they can use the lens flare to create a unique look. To me, there's nothing more rewarding than watching such a production in the evening with my family and knowing that I played a part in making that look possible.

Testing a prototype in the lab and verifying the results of the computer simulation.

LAPPG: Does the eXtended Data Technology exist in the ZEISS Prime Radiance lenses and if so, can you explain a bit about how that technology works?

BV: Yes, ZEISS Supreme Prime Radiance Lenses possess Extended Data Capability just like ZEISS Supreme Prime Lenses. In short, they possess all the features of Supreme Primes but with the addition of controlled lens flares.

Extended Data, or "XD" for short, is a unique technology based on Cooke /i metadata with the added benefit of live shading/vignette data and distortion data, which VFX teams use to determine the vignette and distortion of a lens before work begins. Each lens is individually profiled at the factory, and the data is saved on the lens itself, where it can be viewed and accessed when connected to an Extended Data-compatible product, e.g., Sony Venice (PL mount), RED DSMC2 cameras (PL mount), DCS Film, Factory Optic SynchroLink. To learn more about ZEISS Extended Data, follow this dedicated link to the ZEISS Extended Data page, where you will find a wealth of information, white papers, tutorials, guide downloads, etc.
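To illustrate how VFX can use per-lens shading data, here is a toy model, not the actual XD data format: if a lens reports its radial brightness falloff, a compositor can invert that falloff per pixel. The single coefficient `k` below is a hypothetical stand-in for the profiled shading data:

```python
def devignette(pixel_value, r, k):
    """Undo a radial vignette. k is a hypothetical per-lens falloff
    coefficient standing in for profiled shading metadata; the lens is
    modelled as darkening the image by a factor (1 - k * r**2), where
    r is the normalised distance from the image centre (0 = centre,
    1 = corner)."""
    falloff = max(1.0 - k * r * r, 1e-6)  # clamp to avoid division by zero
    return pixel_value / falloff
```

The value of factory profiling is exactly this: the correction can be applied automatically, before any creative work begins, instead of being estimated by eye from a grey-card pass.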

LAPPG: What are you most excited for in the world of high-end cinema lenses? Is there anything on the horizon that you can tell us about?

BV: It has been really interesting to see the adoption of Full Frame and Full Frame Plus cameras and lenses over the last few years in the industry. It's rewarding to see the great reception for our Supreme Primes and Radiance Lenses, of course! Also, the evolution and use of advanced metadata for VFX has come a long way, and the fact that we are communicating with all the key stakeholders in every part of the chain shows that everyone involved is interested in contributing to the betterment of the products and current workflows. As for anything on the horizon: we are a manufacturer, and we are always creating something new and interesting!

To learn more about ZEISS Supreme Prime Radiance Lenses please visit:

Virtual Sets with Adrian Gonzalez

As we've been seeing lots of changes and advances in production and workflows lately, we wanted to take some time to catch up with Adrian Gonzalez, Mistika Boutique Product Manager, to discuss some of the ins and outs of using and working with Virtual Sets. Here's some helpful information about where things currently stand with this new technology, the role Mistika Boutique plays in the complete creation of this content, and what capabilities are on the horizon.

Los Angeles Post Production Group: Can you tell us about yourself and your background. How did you start working with SGO and on Virtual Sets?

Adrian Gonzalez: I joined SGO eight years ago, and since then I have worked as a demo artist, a training mentor and, for the past two years, as Mistika Boutique Product Manager. My profile is halfway between the creative and the technical, so I regularly collaborate with the development team at SGO on defining new Mistika Technology features and products and on graphical user interface design updates.

Our technology is one of the pioneering solutions in the immersive post-production sector, and Mistika Boutique (and Ultima) is the only finishing system that enables the complete creation of immersive content, from initial optical-flow stitching through color grading and VFX all the way to the final deliverables, so applying it to Virtual Sets is a natural progression.

LAPPG: Can you explain to us the benefits of using Virtual Sets?

AG: Besides the fact that Virtual Sets tear down the limits between what can be seen through the camera on a live-action set and what has to be imagined and added digitally many months later, virtual production has been making the content creation process more sustainable and economical, eliminating the need for location shoots.

In addition, it removes many of the issues associated with green screens, like green hue contamination on reflective surfaces and, of course, the time needed to pull a good key from a complex green screen. With Virtual Sets, the integration between the virtual footage and the real part of the shot is much tighter and more organic.

LAPPG: What type of productions have you seen currently that are making the most use of this technology?

AG: Virtual Production is more or less just at the beginning of its journey, and because of the high investment required, only bigger studios and production companies can afford it at the moment. Basically, all types of production are suitable for Virtual Production; however, the most impactful use on set is definitely for VFX-heavy content. One of the most well-known projects to use Virtual Sets is Disney's The Mandalorian, over half of which was filmed indoors on a virtual set.

LAPPG: Are there any situations where a typical green screen would be a better choice than Virtual Sets or should we be trying to use this newer technology as much as possible?

AG: LED backgrounds are able to provide more realistic environments for the actors and other functions that green and blue screens may lack. Shooting in Virtual Sets is smoother and provides better lighting and colors and even reflections on metallic surfaces, for example.

Green screen can be a better choice in shots where the backgrounds are extremely complex, for example CG content with camera movement, physics simulation, etc. In those shots, the background will be recreated in post-production rather than during the shoot, because it requires time and the involvement of VFX and 3D packages in a complex pipeline. In those scenarios, recreating that kind of background before shooting, and before knowing exactly how the shot will look, can be problematic.

LAPPG: What are the biggest challenges that occur when you are planning to use virtual sets?

AG: As mentioned before, this is just the beginning of Virtual Production, so there is a lack of knowledge and experience among industry professionals. Another challenge is a technical one: to get the best possible quality, it is typical to work through most of the pipeline with high-resolution files in formats like EXR or high-quality ProRes, so you need a powerful technical ecosystem capable of managing these kinds of files. A final challenge is that it requires much more preparation, because those backgrounds need to be finished in advance of the final shoot.

LAPPG: How does Mistika Technology help create solutions and solve problems?

AG: Mistika Technology enables the complete creation of virtual backgrounds, from the initial optical-flow stitching to color grading, VFX, and final deliverables. At this moment, Mistika is the industry-standard solution for working with 180°/360° shots, thanks to its unrivaled VR capabilities. At the same time, we have optimized the system to work with huge resolution formats and heavy media, so performance is a key factor here as well. If you combine the fact that you can create the virtual backgrounds from scratch, in any 360° format and with the best performance, what you have in the end is the best tool for this new technique.

LAPPG: Are there any best practices or things we should be aware of when we are setting out to build virtual sets?

AG: Resolution and color workflow, probably. Resolution is key: you need to know exactly what kind of images to deliver to the virtual set, and which kind of camera or 360° rig to use.

At the same time, knowing the display color space is important in order to build a proper color pipeline. Those images are going to be adjusted and graded like any other image, so we need to design a good workflow to manage those backgrounds from both the format and the color-science points of view.

LAPPG: It seems that Mistika Boutique has all the functions of Mistika VR plus more tools for compositing and grading along with an amazing timeline that allows users to edit with any kind of clip including 360. So, in what cases would we use Mistika VR instead of Mistika Boutique or is it best to use them together?

AG: Exactly. Mistika VR is actually just a small piece of Mistika Boutique. Mistika VR handles just the first part of the immersive content creation: optical-flow stitching and stabilisation. Once this is done, you can simply take the project data to Mistika Boutique, without the need to render, to complete the color and VFX. However, you could also do the entire project (including optical-flow stitching) in Mistika Boutique.

Mistika VR is especially useful when a big team is involved in creating those virtual backgrounds. Most of the work will be the stitching, and for that it is more efficient to have several Mistika VR licenses (which are obviously cheaper) and leave Mistika Boutique only for the finishing part. Communication between Mistika products is seamless: all you need to do is save the project in Mistika VR and open it in Mistika Boutique, and you'll have access to all the adjustments made by the VR user. From there, still without rendering, you can grade your shot, do any compositing, and finally deliver. This allows the production of these backgrounds to be parallelized and saves a huge amount of time.

LAPPG: It appears Mistika VR and Mistika Boutique form a system covering all the needs of building virtual sets. Are there any capabilities you are looking forward to for these programs that are not in place yet?

AG: We want to improve the management of VR images with complex aspect ratios. Normally VR shots work at a 2:1 aspect ratio, since that is the aspect ratio of an unwrapped sphere, but with Virtual Sets you sometimes need different settings that currently require an extra delivery step, such as cropping a region of interest. In the future we will remove that extra step so these scenarios can be managed more quickly and smoothly. And of course, because this is a new technique that is growing every day, we will face new challenges, so we will keep improving our tools in response to those challenges and to the feedback of our users, as we always do.
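The extra cropping step mentioned above is straightforward to reason about, because in an equirectangular 2:1 frame, longitude and latitude map linearly to pixels. As a small sketch (a hypothetical helper of mine, not Mistika's implementation), an angular region of interest translates directly into a pixel crop rectangle:

```python
def crop_window(width, height, center_lon_deg, center_lat_deg,
                h_fov_deg, v_fov_deg):
    """Pixel crop rectangle (x, y, w, h) covering an angular region of
    interest in a 2:1 equirectangular frame. Longitude and latitude map
    linearly to pixels in this projection, so the maths reduces to a
    degrees-to-pixels scale."""
    px_per_deg_x = width / 360.0
    px_per_deg_y = height / 180.0
    cx = (center_lon_deg + 180.0) * px_per_deg_x   # centre x in pixels
    cy = (90.0 - center_lat_deg) * px_per_deg_y    # centre y in pixels
    w = h_fov_deg * px_per_deg_x
    h = v_fov_deg * px_per_deg_y
    return (round(cx - w / 2), round(cy - h / 2), round(w), round(h))
```

For example, a 90° x 45° window centred on the forward horizon of a 4096 x 2048 frame comes out as a 1024 x 512 crop in the middle of the image.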

For a video on how to create Virtual Sets with Mistika Technology click here.



Job Opportunities

Check out our jobs board and find your next opportunity!