A professional organization for filmmakers to learn, share and network since 2008.

December 2024 Photos

We wanted to end the year on a high note, so we thought, who better to have than Adobe’s Karl Soulé to give us a look into Adobe & AI! He hit the ground running, explaining how Adobe has been at the forefront of bringing AI to post-production and that many of us have probably been using AI already. He drew a distinction between Assistive AI (Sensei) and Generative AI (Firefly). Karl explained that Adobe makes creative software for creative people like us, that the company is not trying to replace people, and that it is working to make sure this technology is commercially safe. On the Assistive AI side, Karl showed Remix, which is built into Premiere Pro, and shared how this Sensei technology can save you time.

Then Karl pivoted to the Firefly website, where the Generative AI technology lives before it gets integrated into the various applications. What sets Firefly apart is that Adobe does not, and never has, trained it on user or customer content; it is trained only on public-domain imagery and content that has been specifically licensed for that purpose. Adobe also claims no ownership of Firefly-generated content. A number of years ago, Adobe founded the Content Authenticity Initiative in hopes of building trust around where images come from by way of content credentialing. He then showed the Firefly for Video website, which is currently in a closed beta, with Adobe adding people as more server capacity opens up. In addition to typing a prompt, you can choose among different camera motions and styles when generating video, and you can build from a reference image. Karl showed an example using a hand-drawn sketch done by a colleague, from which Firefly for Video was able to generate a claymation look, a cut-paper look, and a traditional animation look.

Next Karl showed Generative Extend, a feature in the Premiere Pro public beta that people can play with now. It solves a real-world problem editors face when footage is not long enough: it extends a clip beyond its original end point, and it can also give you up to 10 seconds of room tone. Karl then uploaded a clip he had created with Generative Extend to the contentcredentials.org website to show the content credentials attached to it, revealing when and how the clip was created. He also previewed a new masking function in Premiere Pro, borrowed from Photoshop’s Select Subject feature, that is not yet in the public beta.

Free Membership Signup

Blog


Meet Sean Fine

It was a true honor getting to connect with Oscar®, Emmy®, and Peabody® award-winning filmmaker,

Meet Mike Mezeul II

Whether it’s the explosive eruption of a volcano or the swirling chaos of a tornado,

Meet Marcus LeVere

We want to introduce you to Marcus LeVere, a Vancouver-based VFX supervisor who used his

Job Opportunities

Check out our jobs board and find your next opportunity!

Become a Partner