We would like to introduce you to Ryan McNeal, an independent colorist providing color services to projects large and small across a wide range of mediums: TV, film, music videos, and web content. In this post you’ll discover how Ryan went from hired hand to owning his own studio, the techniques he uses to bring out the emotional intention of the images he works with, and how he uses Blackmagic gear and Resolve to stream, edit, and color his own projects as well as his clients’.
Los Angeles Post Production Group: Can you share how being an oil painter inspires your work as a colorist and what tools you use to help tell a story and influence viewers’ emotions?
Ryan McNeal: Although film, TV, and streaming are all relatively modern mediums to work in, the art of storytelling through visual media started at the dawn of humanity. We are very visual creatures, and visual storytelling is a pretty universal language. For me, the great painters in classical art history discovered and pioneered many techniques that filmmakers employ every day: composition, a controlled color palette, dramatic lighting, emotive subjects, etc. When I color, I am motivated by the art of the shot and approach it like a painting. What are the hues that will best tell this story? How much of the subject’s world should we see? How isolated should the subject be? What is the base emotion the audience should feel? Is this a harsh world? Or a soft one?
Broadly speaking, I use a set of techniques to execute the emotional intention of the images.
1. Creative correction – control of the color palette to evoke the right emotion and genre. This may or may not include film emulation, depending on the project.
2. Creative shaping – vignettes, pinches, and gradients of luma and/or hue to build up or break down the focus of the image. Where are we looking?
3. Texture – Images can be soft or sharp, in many senses of the word. I selectively add and reduce contrast in areas of the image as well as sharpness. Additionally, bloom and halation can be used to soften highlights.
4. Color Density – How heavy the colors feel; this has more to do with luminance than with saturation. Film, for instance, is generally more dense in the shadows than in the highlights, whereas video is linear and equal throughout the tonal range.
5. Grain – In most narrative work, even a small amount of grain helps battle aggressive compression algorithms. A noticeable degree of film grain can be pleasing to the eye, depending on the aesthetics of the film.
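To make the "color density" and "grain" ideas concrete, here is a minimal NumPy sketch of luminance-weighted density and grain. This is an illustration of the concepts only, not Ryan's actual node tree; the weighting curves and parameter values are invented for demonstration.

```python
import numpy as np

def apply_density_and_grain(img, density=0.15, grain_amount=0.03, seed=0):
    """Toy illustration of two techniques from the list above:
    'color density' (darkening weighted toward the shadows, film-like)
    and luminance-dependent grain. img is a float RGB array in [0, 1].
    The curves here are invented for illustration, not a film-stock model."""
    rng = np.random.default_rng(seed)
    # Rec. 709 luma as a rough measure of tonal position.
    luma = img @ np.array([0.2126, 0.7152, 0.0722])
    # Density: pull shadows down proportionally more than highlights,
    # mimicking film's heavier shadow density versus linear video.
    shadow_weight = (1.0 - luma)[..., None]
    dense = img * (1.0 - density * shadow_weight)
    # Grain: most visible in the midtones, so weight the noise with a
    # simple parabola that peaks at mid-gray and vanishes at the extremes.
    midtone_weight = (4.0 * luma * (1.0 - luma))[..., None]
    noise = rng.normal(0.0, grain_amount, img.shape)
    return np.clip(dense + noise * midtone_weight, 0.0, 1.0)

frame = np.full((4, 4, 3), 0.5)  # flat mid-gray test frame
out = apply_density_and_grain(frame)
```

With `grain_amount=0.0` the density step alone darkens a 0.1 gray by a larger fraction than a 0.9 gray, which is the film-like behavior described above.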
LAPPG: Your work as a color assistant at Company 3 eventually led you to becoming a freelance colorist and starting your own independent color studio, RKM Studios, with your wife, Becky. Can you tell us about the process of moving from being an assistant at a big company to going off on your own?
RM: Going freelance after working for a larger company was scary. I thought for sure I had enough clients and was ready. The reality was that I quit right when my handful of clients didn’t have any work. I put a small office on a credit card and started hustling. I would wake up early, hit Mandy and Craigslist each day, and apply to everything, regardless of pay. I saw myself as being in a mode of needing to build out my network; I would worry about the income later. I got good at nabbing gigs on Mandy, having learned that you can stand out by specializing and not being a jack-of-all-trades. I paint, I draw, I do photography, I write, I direct music videos, and I color. But for Mandy jobs, I was just a colorist and nothing else. Consistently I was told that was the reason clients picked me: I applied early, and I wasn’t trying to be everything.
My wife was working full time and I was getting enough freelance work to keep the bills current. I would often work 16 hours and had a poor sense of boundaries with clients. I soon got busy enough to get the office space off my credit card, and I asked my wife, Becky, to join me and help start a legitimate business.
That was about 7–8 years ago, and since then we’ve grown into a six-person team, having recently brought on another colorist, Michael Schatz. As a small team, we are nimble and able to provide a boutique experience. We’ve carved out a niche by providing high-quality work and treating every project with careful and intentional collaboration. Clients love that we get involved and care about achieving the best image for their films.
LAPPG: What services does RKM Studios currently offer and how much of the work you do is now remote collaboration?
RM: We provide creative color grading services for film, TV, streaming, and social media. We work in SDR and HDR and have two color suites set up for accurate viewing environments. We also offer online finishing services for long-form projects.
We’ve seen continued interest in remote sessions beyond COVID reasons. DPs are rarely paid to attend color sessions and often have to choose between working on paying gigs or sitting in the color session and missing out on work. We’ve had a lot of DPs, especially longtime clients, thrilled to be able to jump on the remote stream from set so that they can still be a part of the discussions without giving up work. Of course it’s always best if we can all be in the same room, but with the way things are evolving, that is more and more a privilege rather than a requirement.
We’ve also been doing a lot of hybrid sessions, where the director or another creative is present at our studio, and we’ll have one or more creatives on our remote stream at the same time. For our stream we use the Blackmagic Web Presenter 4K units with calibration LUTs to ensure high-quality real-time video anywhere in the world.
LAPPG: What advice do you have for colorists just starting out and what skills should someone cultivate to be able to do this work successfully?
RM: You can only learn how to be an artist through experience. For the gear and the software, the internet can teach you just about anything, but to develop your critical eye and create art, you have to cultivate and enrich the artistic side of yourself. Study classical art, study art history, study color theory and how it can be applied psychologically. Photography is an excellent adjacent hobby that can teach you all about cameras and capturing images, as well as retouching. Personally, I do analog film photography as a hobby and develop my own photos for this reason. Get inspired by art that isn’t film. Go to museums, look at graphic design, go to gallery openings and find out what other people think of art. Pay attention to the psychological reaction to art from those around you. Most of the up-and-coming colorists I see on LiftGammaGain (a colorist forum) are very technically minded and are overly engaged with the tools and the gear. In the beginning, you’re hired because you’re the guy at the right price with the gear, but as you develop, you’re hired for your taste and speed.
On a similar note, post production is not a “you build it, and they come” type of business. You are hired on the equation: (Reputation + Reliability) * YourNetwork = nColorGigs. Don’t waste money buying all the right gear before you’ve proven that you can make it work for you. A decent laptop and a Resolve Mini panel are enough to get started on student and low-budget work. You have to create a brand and cultivate meaningful relationships with filmmakers so that you grow and maintain your network. If you are unpleasant to work with, you will struggle in freelance. I was an introvert and it took me a long time to come out of my shell and become more approachable. I learned my lesson, and I offer it in kind.
Have humility. Ego will destroy everything it touches. It’s easy when you are good at something to wield that as a weapon against those you work with. That makes you difficult and it will inhibit you from learning. You will work with many people that know less about color, image design, art—that’s why they are hiring you. Make sure you always approach disagreements with grace and a problem solving attitude.
LAPPG: RKM Studios does really impressive work from music videos for Alicia Keys, Panic! at the Disco, and the Jonas Brothers to commercials for Nike, Acura, Hasbro, and Red Bull. What types of projects get you most excited to work on and are most projects collaborative or do directors generally come in with a particular vision which you deliver?
RM: Thank you for your kind words! I am most excited about projects that are creative; it’s fun to work with colors that aren’t typical and to try new things. But more than anything, I want the relationship to be positive. The filmmakers behind the works you mentioned are some of the nicest, most patient collaborators we work with, and I value that experience highly. You either have the privilege of spending 8 hours together making art, or you are trapped in a dark room with unpleasant people for 8 hours making mud, and I’d rather it be the first.
Every project is a different experience. Sometimes the filmmakers have a very specific vision, and the area I can play in is narrow. Sometimes the filmmakers do not have any specific vision for color and are looking for a creative to collaborate and dream up a look with them. I find I have to adapt to whatever’s needed, based on the experience level of the client.
LAPPG: When first starting out you generally don’t get to do higher profile projects like these. Can you talk about how the work has evolved?
RM: In the beginning I was sifting through work on Mandy and Craigslist. No project was beneath me since I needed to build my skills and my network. It was about forging relationships. There are three clients I still work with today that discovered me on those platforms. All of them had super low-budget projects, but I was willing and eager to put in the time and make the connection. Nearly 8 years later, those clients bring us some of the biggest work we’ve gotten. It’s important to invest in people, because your network brings you the work, and everyone is trying to grow. You never know who’s going to be the next big thing.
Over time, I got a couple creative projects that got me noticed by a little bit higher caliber clients. And then that work got me noticed by even better clients, and so on and so forth. The power of referral is huge. All our work comes from referral. We also curate our social media presence and website to promote the work that we want more of. These days, that is long-form narrative.
It’s cool to work on high-profile projects, but I’ve learned not to be overly engaged just because there is a celebrity attached. Often, the presence of a celebrity ends up making the job all the more difficult from a communication and efficiency standpoint. If the celebrity has a culture of fear around them, it leads to creatives worrying about approval and second-guessing their work. I think it takes a seasoned producer to handle the approval process and expectations; otherwise things quickly go off the rails.
My post producer and partner Becky is excellent at that sort of communication and it is so much easier to work with difficult personalities when they know the limits of engagement.
LAPPG: When you own an independent studio in many ways you can be always on the clock as you need to do whatever it takes to get your client’s project out the door on time. How do you deal with ever tightening schedules and delivery dates?
RM: It takes a team to deal with the turnaround expectations in our industry. When I was freelance on my own, a 16hr day was prescribed by the work. Now with a small team we can efficiently break up the work and move even quicker.
It starts with the prep workflow. We prep and color all jobs expecting last minute edit changes, overall color notes, and pickups. By creating a pipeline to handle the chaos, the chaos isn’t so overwhelming.
I am always exploring new tools and tech to push the envelope on speed and efficiency. Every second we cut out of the process adds up when we are responsible for dozens of projects per month.
Recently we worked on a feature doc where the deadline was tight and we needed to color at the same time the online was happening. The company doing the online worked independently, building the online in Resolve while we started color separately. They sent us their .drp file when it was ready, and we updated mid-color with no issues. It was a smooth process that cut out a lot of time and allowed us to hit the client’s deadline! Without that solution we would have needed to wait a full two weeks before starting color.
LAPPG: You and your wife, Becky have been working together for a long time. How do you two balance the work and what are some of the things you’ve learned to make this professional partnership run smoothly while maintaining your relationship since the lines can be blurry between work and home in this type of situation?
RM: Becky and I work really well as a team. Open and clear communication of expectations is key. We always talk things out. It is important to be extra support for each other. As business owners, we find ourselves worn out, overwhelmed, and stressed. We each try and take on some of that weight when it is becoming disproportionate. We also try and set boundaries. Sometimes ineffectively. During the work-from-home period of 2020-2021, it was really hard for us living with our work. I congratulate anyone who can do that, we cannot. Moving back into an office space was important for the growth of our company, but also for our mental health and being able to physically separate work and home life.
LAPPG: You also work as a director and enjoy taking a bold and cinematic approach to your projects. Your most recent short film, Desert Rose premiered at acclaimed film festivals including the Academy Qualifying HollyShorts Film Festival in Los Angeles. Congrats on that! What gear did you use to help you tell the story you wanted?
RM: Thank you! Desert Rose was a very fulfilling personal project and I am very excited to get my next narrative endeavor off the ground.
For Desert Rose, we shot Red Helium and used Kowa anamorphic lenses. I love the decidedly vintage feel of that set; it was perfect for our western. I also direct music videos, and usually we do a paper edit in DaVinci Resolve. I like to put in titles that describe what is happening and edit that to the music to feel out the timing. Then we edit and color in Resolve. We’ve been doing that for years now, and it’s been great to see Resolve mature into a more-than-capable NLE. I do my own VFX on my directorial projects, using a combination of Blender, After Effects, and Resolve to build, composite, and color each VFX shot.
This month we invite you to meet Alexis Haggar, Visual Effects Supervisor at award-winning independent visual effects studio, Lexhag VFX. Alexis details his journey into effects, his work in Virtual Production, the tools he uses, including Mistika Boutique, and advice for planning a Virtual Production.
Los Angeles Post Production Group: After film school you started working at a game company doing feature and advertising work. What made you decide to start Lexhag, now an award-winning independent visual effects studio in London and Norfolk, and what sets you apart from other VFX companies?
Alexis Haggar: Straight out of film school, where I graduated as a director, I tried to get directing work but quickly decided that it was going to take a long time to build a directorial career, and I still wanted to learn.
The job with the games company came through a film school friend who had gone there to start their movie arm. He asked if I wanted to help him make a series of six commercials to go out in prime time on the Sci-Fi Channel. This was an opportunity I couldn’t miss, so I joined the company to produce the ads. Straight after that, we were offered the chance to make a feature with the same company, which we did, inviting all of our film school colleagues to make it with us.
Once those projects were finished, I moved into the special effects world, where I spent time blowing things up and making weird and extraordinary rigs. At this point (perhaps 2004), the SFX industry was worried that digital would take over, and many of its jobs were already being replaced with CGI. In high school (secondary school for us Brits) I had completed my Art A-level with CGI, so I was well versed/self-taught and felt I could transition from SFX to VFX.
I wanted to keep both sides alive, and there wasn’t a company with departments to do both. So I decided to start my own company that could service both SFX and VFX or, at the very least, design effects with both disciplines in mind.
Lexhag was born, and to this day, we’re still recruiting people that are practical/creative and are technology savvy; there’s never been a better time to combine all the skills.
LAPPG: You recently announced that Lexhag is now offering Virtual Production services, but it seems like this is not something totally new for your company. Can you tell us how you started doing this work and what types of services you now offer including how you support other production workflows?
AH: Virtual Production, as the industry knows it (or the current trend), is rear screen projection on steroids. We now have a whole load of tech that can make the old “dumb” methods “smart”.
Because we’ve come from a place where we’ve had experience with props, miniatures, camera tricks, projecting effects on-set and the whole post-production process (and we’re filmmakers at heart), it seemed natural for us to get involved.
We started by designing and building our own VP setup, specifically for vehicles, using funding we won from an innovation grant. We created two POCs working with our partners; they supplied the film production resources, and we supplied our image/VFX resources.
LAPPG: What is the most important technical advice you can offer someone about doing Virtual Production?
AH: Plan your shots. Do not expect to be able to walk into a volume and have it all work. It might seem obvious to say, but I get the feeling that people are using it to get out of sticky situations. It’s another tool for creating shots; it can be used for lots of reasons, and it still requires the same amount of planning and effort one puts into a “normal” shot.
LAPPG: Virtual Production is certainly rising in popularity, but have you seen situations where Virtual Production was not the right choice for a production? How did that play out?
AH: Personally, I haven’t been in that situation. I think that’s come from the supervisor in me; I like to create “10-minute tests” to see if the theory will work or has a chance of working. It’s basically a soft prototype.
For our POCs we did more than 10-minute tests; we pretty much tech-vised all of the sequences. It was too expensive to rock up and test on the stage, so I worked everything out with a mixture of CAD, Blender, and Resolve. Perhaps my SFX training came into play, because I also made sure we had different ways to supply the material. Traditionally, if we built an SFX rig, it would go through hours of testing and then stop working precisely when the camera record button was pressed. Amazing how that works; perhaps something to do with quantum physics.
LAPPG: What are your go-to tools that you use most often in your work?
AH: For planning and simulation, I’ve been using a combination of Fusion360, Blender and Resolve. In combination, they can be used well to create fairly accurate pre-vis animations with Tech-vis ideals.
On the delivery side, we were able to introduce Mistika into our pipeline, which can handle many of the giant deliverables needed for the bigger stages in London.
LAPPG: Do you use Mistika VR as well as Mistika Boutique and what are the benefits of using Mistika Technology for Virtual Production?
AH: Roughly four years ago, we started using Mistika VR when we created a hybrid VP workflow. By that, I mean we have low-res LED panels throwing light onto the subject while the direct BG the camera sees is Green Screen.
We captured our plates using very high resolution 360° cameras and used MVR to deliver the 360° material which was used in the composite.
These days we have several Mistika Boutique seats because it offers many more advanced features over VR and integrates well with our existing VFX pipeline.
LAPPG: Is color handled any differently in Virtual Production than in more traditional production?
AH: Color is the same as what we’re used to in the post-production world. The London stages we’ve delivered to are growing their colour pipelines, and we’re starting to see the introduction of ACES.
LAPPG: How does Mistika Technology contribute to VPX workflow? Which aspects do you find the most relevant and which obstacles does it help you overcome?
AH: Mistika Tech is brilliant at the heavy lifting of delivering hours of array material. It’s still very much a finishing suite but has a toolset I’ve not seen in other systems.
It works very well with the other key tools and complements the VFX/Online workflow very well.
For example, on certain shots we don’t need to leave the Mistika environment; it’s capable of conforming, FX finishing, and delivering, and it does so with some speed. I’m used to a VFX toolset but have experience in the picture post world; I see Mistika as a product that connects to both very well.
LAPPG: I know with various NDA’s you are limited in what you can say but can you tell us a little about the project you are currently designing for a new show that will feature a lot of VP and traditional SFX integration and what challenges you are facing?
AH: We are working on a project with a strong real-time environment aspect rather than shooting 2D plates. It’s too early to say what the challenges there will be, but I’m expecting things like how well the physical world joins with the digital world and how we blend the two. We want to use lots of atmos and particle effects; it will be interesting to see how they react to the LED.
LAPPG: Can you tell us about the stitching work and camera set up you’ve been doing and using recently?
AH: Our most recent stitching work has been for the Brownian Motion nine-camera arrays.
These have been mainly for in-vehicle plates. The array consists of eight Arri Alexa Minis covering the circumference and one Red with a fisheye covering the sky. The stitching project delivers one 16k x 2k LatLong image for cylindrical projection and one 2k square plate that covers the sky and gets mapped to the ceiling of the VP stage.
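The geometry of that deliverable can be sketched with a little arithmetic. This is a rough illustration only: the exact pixel dimensions behind "16k x 2k" are an assumption here (16384 x 2048), and real stitches include overlap between neighboring cameras.

```python
# Assumed deliverable resolution for the cylindrical LatLong plate.
LATLONG_W, LATLONG_H = 16384, 2048
RING_CAMERAS = 8  # Alexa Minis covering the circumference

def azimuth_to_column(azimuth_deg):
    """Map a heading in degrees to a pixel column of the cylindrical
    LatLong plate: the full 360-degree circumference spans the width."""
    return int((azimuth_deg % 360.0) / 360.0 * LATLONG_W)

# Before stitch overlap, each ring camera owns an equal angular slice
# of the circumference and a corresponding band of columns.
slice_deg = 360.0 / RING_CAMERAS           # 45 degrees per camera
px_per_camera = LATLONG_W // RING_CAMERAS  # 2048 columns per camera
```

So a subject directly behind the vehicle (180 degrees) lands in the middle of the plate, and each of the eight ring cameras contributes roughly a 2k-wide band, which is why a 16k-wide stitch is such a heavy deliverable.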
LAPPG: Can you tell us about your own UE levels for use in a real-time volume that you’ve designed?
AH: It’s too early to talk about the current project of UE levels; all I can say is we’ve been making five zones within one world. These will play back in real-time and will benefit from a live linked camera.
For our POC, we created three levels, one neon city for some driving tests, a Nevada desert range with a pre-rendered smoke sim and a modern US city block.
It’s worth noting that these scenes were made to get the team into building UE levels and to see what’s involved, from design to creation to optimization and playback. We also had ideas about what we wanted to do with the foreground, which meant we didn’t have to spend tremendous amounts of time on the UE scenes. We knew that we’d be able to introduce foreground elements to blend the nature of the UE levels into the shots.
LAPPG: Where do you see Virtual Production going in the future? Will there be something beyond VP?
AH: I think the future is bright for VP.
The market will level and will find its place. The industry will gain experience and know what kind of tech it needs to complete what shots. Different configurations of volumes will become tools for jobs.
The elephant in the room has been content. The volumes don’t do anything without material. This has been a learning curve for all. We’ve tried content from both 2D (photographic) and 3D real-time, for our tests and projects.
From a 3D real-time point of view, it’s still quite time-consuming and resource-hungry to create large real-time levels. More training is being pushed, but I don’t think there are enough resources out there to build complex levels, or at least enough people who want to build assets for film and TV.
Our studio has seen a lot of 2D content being used, and Mistika has been part of our workflow to deliver it. I think there’s more to come from this world; perhaps we’ll see different camera arrays capturing volumetric material that can be post-processed for use in a volume. I remember the Stereo 3D boom; maybe there’s room for similar capture techniques to creep back into this market (there’ll be plenty of people shouting at this blog at the mention of Stereo).
Essentially, what we want in 2D is depth, and that’s what Stereo gave us. I was first introduced to Mistika during the boom, and it was an amazing tool that could deal with huge data and multiple cameras. Maybe the new Stereo is Volumetric; I’d love to see Mistika lead the charge.
It’s not every day that we get to speak with a physicist about his work on products that are high-end tools for our industry. We were lucky enough to have this opportunity recently when we spoke with Benjamin Voelker, Optical Designer/Simulations at Carl Zeiss AG.
Los Angeles Post Production Group: Please tell us about the type of work you do for ZEISS and how you got into this field?
Benjamin Voelker: Before joining ZEISS, I worked in various fields, including nanotechnology, materials science, and mechanical engineering. As a trained physicist, my professional focus was always on numerical modeling, i.e., creating numerical models that describe complex physical systems, simplify them, and apply them to optimization problems. Apart from work, I developed a growing interest in photography over the last 20 years. What started as a hobby quickly grew into a passion during a two-year research stay at UCSB, when I became serious about landscape photography and astrophotography.
Traveling around the world to find dark night-skies and places with great wilderness is what makes me happy. So, when I joined the ZEISS Consumer Optics Business Group in 2013, this was a unique opportunity for me to bring together work and hobby. Today I’m a senior expert in the design of optical coatings and the prediction of ghost and flare in all kinds of optical systems. Working together in a team with optical designers and mechanical designers, we develop new ideas and new optics to provide cinematographers with the tools they need.
LAPPG: Can you tell us about the idea behind the ZEISS Supreme Prime Radiance Lenses, which were released in 2019 as well how they were designed and created?
BV: The idea behind the ZEISS Supreme Prime Radiance Lenses is pretty unusual. Normally, modern optical systems are designed and optimized so that the image is clean and flawless, with as little “unwanted” light on the sensor as possible. Unwanted light can originate from light scattered off the mechanical surfaces of a lens’s inner contour, or from light reflected multiple times on optical surfaces before it unintentionally reaches the sensor plane, causing contrast loss and ghosting artifacts in the image. To tackle this basic problem in optics, as far back as 1935 engineers at ZEISS developed the T* optical coating, which literally makes glass invisible. This technology has been optimized ever since and enables us to offer optics with great neutral color rendition, the highest possible contrast, and a minimum of ghosting artifacts.
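Textbook thin-film theory gives a feel for why coatings matter so much here. The sketch below computes the normal-incidence reflectance of a bare air-glass interface versus a single quarter-wave layer; the actual T* and T* blue designs are proprietary multilayer stacks, so this is only the standard single-layer illustration, with generic index values (n = 1.52 glass, n = 1.38 MgF2) chosen for the example.

```python
def reflectance_uncoated(n0, ns):
    """Fresnel reflectance of a bare interface between media of
    refractive index n0 (air) and ns (glass), at normal incidence."""
    r = (n0 - ns) / (n0 + ns)
    return r * r

def reflectance_quarter_wave(n0, n1, ns):
    """Normal-incidence reflectance with a single quarter-wave coating
    of index n1 between air (n0) and glass (ns), from thin-film theory."""
    r = (n0 * ns - n1 ** 2) / (n0 * ns + n1 ** 2)
    return r * r

bare = reflectance_uncoated(1.0, 1.52)            # ~4.3% per surface
coated = reflectance_quarter_wave(1.0, 1.38, 1.52)  # ~1.3% with MgF2
```

Roughly 4% of the light bounces off every uncoated surface, and a modern lens has a dozen or more elements, which is why uncoated elements produce the strong, uncontrolled white ghosting described below, and why a tuned coating can instead dial the residual reflection to a chosen intensity and hue.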
However, when my colleagues and I talked to cinematographers, we felt that some of them long for a different, less clean and perfect look. Some go for vintage lenses, with the problem that these lenses are rare and difficult or nearly impossible to service if they fail. Others use optics with some completely uncoated lens elements; this introduces very strong white ghosting artifacts, which are completely uncontrollable, destroy the image contrast, and at the same time reduce the available signal light that forms the image.
The idea behind the ZEISS Supreme Prime Radiance Lenses was to create an entire modern lens family that offers a consistent characteristic look while overcoming the difficulties and limitations just mentioned. The lens family, covering focal lengths from the super-wide-angle 18mm to the 135mm telephoto, offers pleasing and controllable lens flare that can be used as a visual storytelling element.
The first major challenge was to introduce consistent lens flare over the entire lens family. Lens flare is light being reflected multiple times on lens surfaces before reaching the sensor, so if you change the properties of a single lens surface (e.g., by applying a different optical coating), the effect on lens flare is huge. By trial and error it would have been impossible to get lens flare consistent over the whole product family. Instead, we used virtual prototyping: for hundreds of combinations, the lens flare was computed in simulation models until we found the perfect combination. This simulation was the most demanding I had done so far; on a single state-of-the-art CPU it would have run for almost half a million hours, that’s more than 50 years!
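The quoted compute budget checks out with quick arithmetic, and it also shows why such a job only becomes tractable when parallelized. The 256-core cluster below is an invented illustration, not ZEISS's actual setup.

```python
# Sanity check on the quoted figure: ~500,000 single-CPU hours.
cpu_hours = 500_000
hours_per_year = 24 * 365            # 8,760 hours in a year
years_single_cpu = cpu_hours / hours_per_year   # about 57 years on one CPU

# Spread across a hypothetical 256-core cluster, the same workload
# shrinks to a few months of wall-clock time.
cores = 256
wall_clock_days = cpu_hours / cores / 24
```

At roughly 57 single-CPU years, the "more than 50 years" in the interview is, if anything, an understatement.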
The second major challenge was to develop a new kind of optical coating especially for the ZEISS Supreme Prime Radiance Lenses, the so-called T* blue coating.
LAPPG: Can you explain what the T* blue coating is and how it is used?
BV: It is a new kind of anti-reflective coating design that introduces bluish lens flare of a carefully chosen intensity level while keeping the resulting overall image contrast intact. It ensures that the maximum apertures of the ZEISS Supreme Prime and the Radiance lenses are on the same level, at T1.5. As a benefit, it introduces a slightly warmer color rendering tone compared to the original ZEISS Supreme Prime Lenses, and great care has been taken to avoid a green or magenta color tint on the image. The T* blue coating makes the ZEISS Supreme Prime Radiance Lenses a versatile tool: if you don’t want flares to appear in a certain scene, you just need to flag the light, and you will still have that nice warmer color tone.
LAPPG: It seems that 4 more lenses have been recently released in this collection. Can you tell us about those and why they were added?
BV: After introducing the initial set of ZEISS Supreme Prime Radiance lenses in 2019, we received very positive feedback from our customers. In the meantime, four more lenses had been added to the ZEISS Supreme Prime family: the T1.5/18mm, T1.5/40mm, T1.5/65mm and T1.5/135mm. Immediately after we launched the first wave of ZEISS Supreme Prime Radiance lenses, we started to develop Radiance versions of these four focal lengths. They perfectly round off the ZEISS Supreme Prime Radiance lens family, which now fully matches the ZEISS Supreme Prime lens family. As before, the greatest care was taken to reach a consistent flare look throughout the whole ZEISS Supreme Prime Radiance family. A number of productions using ZEISS Supreme Prime Radiance Lenses are already available; you can find a list of trailers here. I’m amazed at what cinematographers are getting out of the ZEISS Supreme Prime Radiance lenses, and how subtly they can use the lens flare to create a unique look. To me, there’s nothing more rewarding than watching such a production in the evening with my family and knowing that I had a part in making this look possible.
LAPPG: Does the eXtended Data Technology exist in the ZEISS Prime Radiance lenses and if so, can you explain a bit about how that technology works?
BV: Yes, ZEISS Supreme Prime Radiance Lenses possess Extended Data Capability just like ZEISS Supreme Prime Lenses. In short, they possess all the features of Supreme Primes but with the addition of controlled lens flares.
Extended Data, or “XD” for short, is a unique technology based on Cooke /i metadata with the added benefit of live shading/vignette data and distortion data, which VFX teams can use to determine a lens’s vignette and distortion before work begins. Each lens is individually profiled at the factory, and the data is saved on the lens itself, where it can be viewed and accessed when connected to an Extended Data compatible product, e.g. the Sony Venice (PL mount), RED DSMC2 cameras (PL mount), DCS Film, or the Factory Optic SynchroLink. To learn more about ZEISS Extended Data, follow the dedicated link to the ZEISS Extended Data page, where you will find a wealth of information: white papers, tutorials, guide downloads, etc.
LAPPG: What are you most excited for in the world of high-end cinema lenses? Is there anything on the horizon that you can tell us about?
BV: It has been really interesting to see the adoption of Full Frame and Full Frame Plus cameras and lenses in the industry over the last few years. It’s rewarding to see the great reception we’ve had for our Supreme Primes and Radiance lenses, of course! The evolution and use of advanced metadata for VFX has also come a long way, and the fact that we are communicating with all the key stakeholders in every part of the chain shows that everyone involved is interested in contributing to the betterment of the products and current workflows. As for anything on the horizon, we are a manufacturer, and we are always creating something new and interesting!
As we’ve been seeing lots of changes and advances in production and workflows lately, we wanted to take some time to catch up with Adrian Gonzalez, Mistika Boutique Product Manager, to discuss some of the ins and outs of working with Virtual Sets. Here’s some helpful information about where things currently stand with this new technology, the role Mistika Boutique plays in the complete creation of this content, and the capabilities on the horizon.
Los Angeles Post Production Group: Can you tell us about yourself and your background? How did you start working with SGO and on Virtual Sets?
Adrian Gonzalez: I joined SGO eight years ago, and since then I have worked as a demo artist, a training mentor, and for the past two years as Mistika Boutique Product Manager. My profile sits halfway between the creative and the technical, so I regularly collaborate with the development team at SGO, defining new Mistika Technology features and products as well as graphical user interface design updates.
Our technology is one of the pioneering solutions in immersive post-production, and Mistika Boutique (and Ultima) is the only finishing system that enables the complete creation of immersive content, from initial optical flow stitching through color grading and VFX all the way to final deliverables, so applying it to Virtual Sets is a natural progression.
LAPPG: Can you explain to us the benefits of using Virtual Sets?
AG: Besides tearing down the limits between what can be seen through the camera on a live-action set and what must be imagined and added digitally many months later, virtual production has made the content creation process more sustainable and economical by eliminating the need for location shoots.
In addition, it removes many of the issues associated with green screens, like green hue contamination on reflective surfaces and, of course, the time needed to make a good selection on a complex green screen. With Virtual Sets, the integration between the virtual footage and the real part of the shot is much more seamless and organic.
LAPPG: What type of productions have you seen currently that are making the most use of this technology?
AG: Virtual Production is more or less at the beginning of its journey, and because of the high investment required, only bigger studios and production companies can afford it at the moment. Basically, all types of production are suitable for Virtual Production, but the most impactful for on-set development is definitely VFX-heavy content. One of the most well-known projects shot on a virtual set is Disney’s The Mandalorian, over half of which was filmed indoors on a virtual set.
LAPPG: Are there any situations where a typical green screen would be a better choice than Virtual Sets or should we be trying to use this newer technology as much as possible?
AG: LED backgrounds are able to provide more realistic environments for the actors and other functions that green and blue screens may lack. Shooting in Virtual Sets is smoother and provides better lighting and colors and even reflections on metallic surfaces, for example.
Green screen can be a better choice for shots where the backgrounds are extremely complex, for example CG content with camera movement, physics simulation, etc. In those shots, the background will be recreated in post production rather than during the shoot, because it requires time and the involvement of VFX and 3D packages in a complex pipeline. In those scenarios, recreating that kind of background before shooting, and before knowing exactly how the shot will look, can be problematic.
LAPPG: What are the biggest challenges that occur when you are planning to use virtual sets?
AG: As mentioned before, this is just the beginning of Virtual Production, so there is a lack of knowledge and experience among industry professionals. Another challenge is technical: to get the best possible quality, it is typical to work through most of the pipeline with high-resolution files in formats like EXR or high-quality ProRes, so you need a powerful technical ecosystem capable of managing those files. Finally, it requires much more preparation, because the backgrounds need to be finished in advance, before the final shoot.
LAPPG: How does Mistika Technology help create solutions and solve problems?
AG: Mistika Technology enables the complete creation of virtual backgrounds, from initial optical flow stitching to color grading, VFX, and final deliverables. At this moment, Mistika is the industry-standard solution for working with 180º/360º shots, thanks to its unrivaled VR capabilities. At the same time, we have optimized the system to work with very high-resolution formats and heavy media, so performance is a key factor here as well. If you combine the ability to create virtual backgrounds from scratch, in any 360º format, with the best performance, what you have in the end is the best tool for this new technique.
LAPPG: Are there any best practices or things we should be aware of when we are setting out to build virtual sets?
AG: Resolution and color workflow, probably. Resolution is key: we need to know exactly what kind of images to deliver to the virtual set, and which kind of camera or 360 rig to use.
At the same time, knowing the display color space is important in order to build a proper color pipeline. Those images are going to be adjusted and graded like any other image, so we need to design a good workflow to manage those backgrounds from the format and the color science point of view.
LAPPG: It seems that Mistika Boutique has all the functions of Mistika VR plus more tools for compositing and grading along with an amazing timeline that allows users to edit with any kind of clip including 360. So, in what cases would we use Mistika VR instead of Mistika Boutique or is it best to use them together?
AG: Exactly – Mistika VR is actually just a small piece of Mistika Boutique. Mistika VR handles just the first part of immersive content creation: optical flow stitching and stabilisation. Once this is done, you can simply take the project data to Mistika Boutique, without the need to render, to complete the color and VFX. However, you could also do the entire project (including optical flow stitching) in Mistika Boutique.
Mistika VR is especially useful when a big team is involved in creating those virtual backgrounds. Most of the work will be the stitching, and for that it is more efficient to have several Mistika VR licenses (which are obviously cheaper) and reserve Mistika Boutique for the finishing. Communication between Mistika products is seamless: all you need to do is save the project in Mistika VR and open it in Mistika Boutique, and you’ll have access to all the adjustments made by the VR user. From there, still without rendering, you can grade the shot, build any composite, and finally deliver it. This parallelizes the production of these backgrounds and saves a huge amount of time.
LAPPG: It appears Mistika VR and Mistika Boutique form a system covering all the needs for building virtual sets. Are there any capabilities you are looking forward to these programs having in the future that are not currently in place?
AG: We want to improve the management of VR images with complex aspect ratios. Normally VR shots work in a 2:1 aspect ratio, since that is the aspect ratio of an unwrapped sphere, but Virtual Sets sometimes need different settings that currently require an extra delivery step, such as cropping out an area of interest. In the future we will remove that extra step so those scenarios can be handled more quickly and smoothly. And of course, because this is a new technique that is growing every day, we will face new challenges, and we will improve our tools in response to those challenges and the feedback of our users, as we always do.
We had the chance this month to sit down with Simon Hayes, a digital imaging technician, to discuss the role of a DIT, as well as to share some of his recent work on the important feature film Trees of Peace, about four women who find unity, hope, and strength through one of the world’s darkest tragedies, the 1994 genocide against the Tutsi in Rwanda. Join us as we learn about the various solutions used to deal with the very large files that were generated.
Los Angeles Post Production Group: Most of us know what a DIT does but for those who don’t, can you share with us a bit of a job description?
Simon Hayes: Digital imaging technician, or DIT, has become a bit of a catch-all phrase. There are a few other job titles that often get confused with that of a DIT, but those roles have fewer responsibilities. Loader/utility is the most basic role: they handle the media the camera uses and load the cards into and out of the camera. Next is the data wrangler or media manager. Not only are they capable of handling camera media, but they are also responsible for backing up the media to multiple storage devices and recycling the media for reuse. A data wrangler/media manager will also sometimes transcode footage for dailies and/or editorial. They will apply a basic look or LUT to the footage but won’t do any color grading or corrections to the transcoded footage.
Finally, the DIT role is the most involved. Besides establishing the workflow and directing the work of the loader and data wrangler, DITs serve several additional roles that are important to the production process. The first is working with the DP or cinematographer to preserve their visual intent. The second is working with post production teams to ensure that the image and cinematic intent of the footage carries through, as well as making sure that transcoded footage will reconform with the source footage for the final color process.
A large part of DIT work involves color management. It starts with the camera and its sensor and continues through things like calibrating on-set monitors, building looks for scenes, and creating graded dailies and transcodes for editorial. The role of a DIT is twofold: to ensure the capture and preservation of the best image quality in the original data, and to serve as a bridge between production and the post production process while maintaining the cinematographer’s vision.
LAPPG: What attracted you to this type of work and how did you get started?
SH: For me, the attraction is the balance between being highly technical and visually creative. It’s using both left brain and right brain to overcome challenges.
I started working in the entertainment industry in the mid-’90s on the grip and electrical side of production, before the switchover to digital. In the early 2000s, I transitioned out of entertainment and into a computer data center, and from there I finished my college education studying film, video, and photography. Upon graduating, I moved to Los Angeles and tried to establish myself as a cinematographer. Within a few years, however, I realized I had the necessary skill set to be a digital imaging technician.
LAPPG: How have your role and tasks changed during the pandemic, and if they haven’t, what did you learn or discover during the pandemic that helps your work?
SH: My role hasn’t really changed much. When I’m working on-set, I spend the majority of my time isolated inside a tent.
I would say one thing that has happened more recently is the use of things like Zoom and streaming picture from remote sets to client/agency. It is not without its own challenges.
LAPPG: How did you get involved with the film, Trees of Peace and can you tell us a little bit about the film, your involvement and the timeline for production and post?
SH: The cinematographer, Michael Rizzi, who I had worked with on other projects, asked me if I’d be interested in working with the Sony Venice camera on a feature film. It was our first time working with this camera system, as it had just been released. Sony generously provided us a camera to do several camera tests. This allowed us to develop a look, with help from The Lodge at FotoKem, and test the post production workflow all the way through to the color process before principal photography started.
Trees of Peace is the story of four different women hiding together for survival during the 1994 Rwandan genocide. Most of the story is told from a 5×5 foot box with the four actresses.
Principal photography started in the middle of October 2019 and concluded in mid-November. The original goal was to finish post by March 2020 for a festival submission deadline. The onset of COVID-19 pushed that timeline back, which allowed more time to be spent refining the cut of the film and working with the composer.
LAPPG: What were the biggest challenges when shooting this film from a DIT standpoint? What specific solutions did OWC provide?
SH: The biggest challenge had to do with the data generated by the camera. We were shooting 6K 2:3 but framing for 1.85 4K. Because of the higher resolution and larger frame size, the camera generated very large video files, which required three different solutions.
The first was the ThunderBay 6 RAID enclosure for storing the data. We used three of these to back up all the data in triplicate: a master and two backups. They were configured as RAID 5, which allows for some redundancy: if a hard drive were to fail in an enclosure, we would not have any data loss. Also, having six hard drives in an enclosure ensured there was enough speed and bandwidth for the media to be offloaded quickly, which was helpful when shooting high frame rates for things like the dream sequence.
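The RAID 5 trade-off described above can be sketched with some quick arithmetic: the array gives up one drive’s worth of capacity to parity in exchange for surviving a single drive failure. The drive counts and capacities below are hypothetical examples, not the production’s actual configuration.

```python
def raid5_usable_tb(num_drives: int, drive_tb: float) -> float:
    """Usable capacity of a RAID 5 array: one drive's worth of space goes to parity."""
    if num_drives < 3:
        raise ValueError("RAID 5 requires at least 3 drives")
    return (num_drives - 1) * drive_tb

# A hypothetical 6-bay enclosure populated with 4 TB drives:
# usable space is (6 - 1) * 4 = 20 TB, and the array tolerates one drive failure.
print(raid5_usable_tb(6, 4.0))  # -> 20.0
```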
Next was a 4TB ThunderBay mini with solid state drives. We used this as a temporary drive to load the day’s footage and create the Avid transcodes that editorial needed. This drive is capable of moving very large amounts of data very quickly; using it allowed DaVinci Resolve to create HD transcodes from the original 6K footage at more than twice real-time speed.
Lastly were the OWC Envoy Pro EX drives. These solid state drives were used to shuttle transcoded footage and audio files to post daily. Because these drives have very high read and write speeds, it took minutes to move the footage, which let the AE start working instead of waiting for data to be copied onto his system. With traditional hard drives, that could have taken as much as an hour. An hour might not seem like much in the post world, but multiply that by 20 days and the time becomes quite sizable. The OWC Envoy drives are also super rugged, so I had no worries if one was dropped accidentally.
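The minutes-versus-an-hour difference Simon describes lines up with some back-of-the-envelope math. The payload size and throughput figures below are illustrative assumptions (a rough spinning-disk speed versus a rough NVMe-class SSD speed), not measurements from this production.

```python
def transfer_minutes(payload_gb: float, throughput_mb_s: float) -> float:
    """Estimated time to copy a payload at a sustained throughput (1 GB = 1000 MB)."""
    return payload_gb * 1000 / throughput_mb_s / 60

daily_shuttle_gb = 500           # hypothetical day of transcodes + audio
hdd_mb_s, ssd_mb_s = 150, 900    # assumed sustained speeds: HDD vs fast SSD

print(f"HDD: {transfer_minutes(daily_shuttle_gb, hdd_mb_s):.0f} min")  # roughly an hour
print(f"SSD: {transfer_minutes(daily_shuttle_gb, ssd_mb_s):.0f} min")  # single-digit minutes
```

Over a 20-day shoot, saving ~45 minutes per day adds up to roughly 15 hours of editorial waiting time, which matches the "quite sizable" savings mentioned above.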
LAPPG: What factors played a role in deciding to use OWC products?
SH: I have been a customer of OWC since 1996. Besides delivering great products at good prices, their customer service has been outstanding. If I had a problem with something I was able to get a replacement quickly, sometimes the next day. Knowing that they stand behind their products and take care of me as a customer gave me added peace of mind.
LAPPG: How did OWC’s products fit into your workflow for the film?
SH: The ThunderBay 6 RAID drives were great due to their large capacity and bandwidth. They allowed all the original camera footage, audio, BTS footage, and photos to be consolidated on one drive, which simplified keeping track of assets in post. The OWC Envoy drives allowed footage and audio to be moved quickly to the various groups working on them.
LAPPG: What would you like to see in the future to help your workflow be more efficient?
SH: Quantum storage, but we may be a lifetime or two away from that being a reality.
I’d also like to see a continuing collaboration between OWC and the industry. OWC has invested time and resources working with DITs to come up with solutions in the ever changing landscape that is digital production. A great example of this is the new Flex 8.
LAPPG: Work/life balance has been a big talking point lately. How do you find that balance for yourself while working on a film?
SH: I think this has been and always will be a challenge. The long hours and difficult schedules make finding time for family and friends a real challenge. As Mark Twain said, “Find a job you enjoy doing, and you will never have to work a day in your life.” I really enjoy my work, but the drawback is not being able to celebrate friends’ and family members’ birthdays and other special events.