From the Guild
THE GENESIS CODE
The Genesis Code post crew, in front of the Flat River by the riverboat Robert E. Lee. From left, first assistant editor Zach Fine, post-production supervisor Darryl Johnson, editor Edward R. Abroms and second assistant editor Charles Breiner. Photo by Zach Fine.
The Genesis Code: Post Creationism Goes Small
by Michael Kunkes
In The Genesis Code, a new indie feature currently in post-production on location in Lowell, MI, a college hockey player and a female journalism student struggle to find common ground between their spiritual faith and their scientific studies, while a life hangs in the balance.
A scene from The Genesis Code. Photo courtesy Edward R. Abroms
The philosophical questions posed by the $5.2 million movie, directed by C. Thomas Howell with a cast that features Logan Bartholomew, Kelsey Sanders, Louise Fletcher and Ernest Borgnine, may never be answered satisfactorily, but the post-production workflow could serve as a primer for filmmakers on a budget looking to do end-to-end production and post in as small and self-contained a location as possible.
The Genesis Code, which is shooting for a Thanksgiving release, is being edited by Edward R. Abroms, who knows a bit about cutting on a budget, with over ten years of indie feature credits that include Checking Out (2005), Uninvited Guest (1999) and Pride (2007). Abroms has also edited multiple episodes of Law & Order, CSI: Miami, Day Break and Eureka. He’s the son of retired Guild film editor Edward M. Abroms, A.C.E., who won two Emmys, in 1970 and 1972, and earned a 1983 Oscar nomination for Best Film Editing for Blue Thunder.
Edward R. Abroms in his editing room in Lowell, MI; 12 x 9-foot 2K screen in background. Photo by Andrew Terzes.
A very large part of the reason The Genesis Code is shooting and posting on location in Lowell, MI (pop. 4,000), is Michigan’s tax incentive program, which is rebating 42 percent of the production’s costs. And, says Abroms, as long as he continues to work in the state, the program also picks up 32 percent of his fee through completion of the first cut. That made it imperative to keep the production and post workflow in Michigan for as long as possible.
With the decision already made to shoot the film on the RED camera and to edit it on the just-released Final Cut Studio 3 with Final Cut Pro 7, Abroms and assistant editors Charles Breiner and Zach Fine (a one-time QA engineer on earlier versions of the software), along with post-production supervisor Darryl Johnson, decided to make use of the software’s recently introduced log-and-transfer function for RED footage. When importing RED R3D files via log and transfer, editors can transcode to any of the ProRes codecs on import, or simply import the R3D files natively. Abroms decided to import directly into ProRes HQ at 2K resolution.
Abroms reports that even though ProRes HQ files are 10-15 percent larger than the RED files, it was decided to go with this method in order to maintain a 2K workflow from shoot through post. “A lot of Final Cut editors like the proxy QuickTime files that the RED camera generates, because they can be rendered dynamically in different resolutions, but they can be a pain to work with, and the performance isn’t what we’d like,” he explains. “Occasionally, we’d use them to double-check some footage on a drive, but for the most part, the ProRes HQ codec gives you much better performance inside Final Cut Pro and lets you bring in multiple streams and do real-time effects, which is not possible with the highest-resolution proxy files.”
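The storage trade-off Abroms describes is easy to rough out. The sketch below is back-of-the-envelope only; the data rates are assumed round numbers for illustration (the article does not state the production's actual bitrates or shooting ratio), chosen so the ProRes HQ copies come out roughly 15 percent larger, at the top of the range he cites.

```python
# Rough storage planning for a keep-the-originals-plus-ProRes workflow.
# Both bitrates below are ASSUMED illustrative figures, not measured
# numbers from The Genesis Code.

REDCODE_MBPS = 288          # assumed REDCODE data rate, megabits/second
PRORES_HQ_2K_MBPS = 330     # assumed ProRes HQ data rate at 2K/24p, megabits/second

def terabytes_for(hours, mbps):
    """Raw storage for `hours` of footage at `mbps` megabits per second."""
    seconds = hours * 3600
    bits = mbps * 1_000_000 * seconds
    return bits / 8 / 1_000_000_000_000  # bits -> bytes -> terabytes

shoot_hours = 40  # hypothetical total dailies
red_tb = terabytes_for(shoot_hours, REDCODE_MBPS)
prores_tb = terabytes_for(shoot_hours, PRORES_HQ_2K_MBPS)
print(f"RED originals: {red_tb:.2f} TB, ProRes HQ copies: {prores_tb:.2f} TB "
      f"({(prores_tb / red_tb - 1) * 100:.0f}% larger)")
```

Because the production retains both the R3D originals (for VFX and the DI) and the ProRes HQ transcodes (for editorial), the drives have to carry the sum of the two, not just the larger.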
Abroms added, “FCP’s log-and-transfer uses whatever metadata is on the RED files, including the colorspace settings chosen by the cameraman, so that the look they wanted when they shot the footage is baked into the ProRes HQ files. There were some mistakes, and we fixed those by simply re-importing those clips with a different setting. RED footage transferred in this manner does not need a ton of work to look like a real movie; it looks more like a one-light transfer that is balanced enough for an editor to work with.”
Final Cut Pro's log and transfer window. Photo courtesy Apple Computer, Inc.
Editorial received audio as Broadcast WAV files on DVD-ROM discs, and used Final Cut Pro's Merge Clips function to sync sound. According to Abroms, though, this could be a tricky process. “If everything is done correctly on the set, the timecode in the camera is the same as what’s on the soundtrack, but as I discovered, even using smart slates, there was a drift of up to two frames,” he says. “I ended up having to do it the old-fashioned way by checking the board clap, but the feature does get you close.”
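The drift check Abroms describes comes down to timecode arithmetic: convert each timecode to an absolute frame count and subtract. A minimal sketch, assuming non-drop-frame timecode and a 24 fps editorial rate (the article does not state the project's frame rate):

```python
# Sketch of a camera-vs-sound timecode drift check.
# FPS is an assumption; the production's actual rate isn't stated.

FPS = 24  # assumed editorial frame rate, non-drop-frame

def tc_to_frames(tc, fps=FPS):
    """Convert a non-drop SMPTE timecode 'HH:MM:SS:FF' to a frame count."""
    hh, mm, ss, ff = (int(part) for part in tc.split(":"))
    return ((hh * 60 + mm) * 60 + ss) * fps + ff

def drift_in_frames(camera_tc, audio_tc, fps=FPS):
    """Positive result means the audio timecode is ahead of the camera's."""
    return tc_to_frames(audio_tc, fps) - tc_to_frames(camera_tc, fps)

# A sound file starting two frames later than the picture shows the
# kind of two-frame drift mentioned above:
print(drift_in_frames("01:02:03:10", "01:02:03:12"))  # -> 2
```

A nonzero result is the cue to fall back on the slate clap, exactly as Abroms did.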
The Michigan incentive program also made it imperative to use as much of the production sound as possible. “Sometimes the sound mixer would do a mix of what he thought would be a good guide track, and that’s what went to the RED, but a lot of times we’d end up with mic hits because the radio mics were hidden under the actors’ clothing,” adds Fine. “But because we didn’t ingest the camera audio from the RED and instead merged the video with what was often a six- or eight-track WAV file, we had the latitude to pull the sound from the boom instead and not use the radio mics. There is going to be very little ADR on this movie, and what there is will have to be done back in Los Angeles. It was very important to make sure that what they got on the set was very clean.”
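The flexibility Fine describes comes from keeping the tracks discrete: a multitrack WAV interleaves one sample per channel per frame, so pulling just the boom means copying one slot out of each interleaved frame. A minimal sketch using Python's standard `wave` module, assuming plain PCM data (Broadcast WAV adds extra chunks, which `wave` skips) and a hypothetical channel index for the boom:

```python
import wave

def extract_channel(src, dst, channel):
    """Copy one channel of an interleaved PCM WAV to a mono WAV.

    `channel` is zero-based. Which track carries the boom is an
    assumption an assistant would confirm against the sound report.
    """
    with wave.open(src, "rb") as reader:
        n_ch = reader.getnchannels()
        width = reader.getsampwidth()
        rate = reader.getframerate()
        raw = reader.readframes(reader.getnframes())
    frame_size = n_ch * width       # bytes per interleaved frame
    offset = channel * width        # byte offset of our channel in each frame
    mono = b"".join(
        raw[i + offset : i + offset + width]
        for i in range(0, len(raw), frame_size)
    )
    with wave.open(dst, "wb") as writer:
        writer.setnchannels(1)
        writer.setsampwidth(width)
        writer.setframerate(rate)
        writer.writeframes(mono)
```

In practice the merge happens inside Final Cut Pro; this just illustrates why an eight-track location recording leaves editorial options that a pre-mixed camera guide track does not.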
Abroms’ cutting room also served as a dailies screening room (and visual effects department), greatly aided by the presence of an IGI 12 x 9-foot screen with a 2K Panasonic 3-LCD 1080p projection system. “We are projecting our Apple ProRes HQ dailies directly out of the computer onto this screen after log and transfer,” Abroms says. “It’s been ten years or more since I was able to sit down every night with the DP and director on location and watch dailies. It was very refreshing, since I am not used to receiving the director’s input on a daily basis. I didn’t physically edit to that screen during the day, but I was able to look up from my two 24-inch cinema monitors and see my full scene across the room at nearly our final delivery resolution; it was like having a DI suite right in the editing room. IGI also came by during production and demonstrated their impressive new 4K resolution projection system combined with the RED Rocket PCI Express card for real-time playback of the native RED media. I would love to use this system on a future project.”
Abroms and his crew are also using several remote review tools, one of which is the new iChat Theatre, which allows participants in different locations to view a Final Cut Pro project in real time. Says Abroms, “My producer works in Peoria, IL, and I am able to use iChat Theatre to allow him to see anything on my canvas or viewer anywhere he is; I can also switch the iChat view between clips and sequences while we talk, or turn on a timecode overlay to help identify specific frames.”
Another tool the production is currently making use of is syncVUE, a remote-collaboration tool recently acquired by FUZE. “Using syncVUE, we post H.264 QuickTimes to the producer over yousendit.com; he downloads the reel, then through syncVUE and using Skype, we can both sit there and simultaneously run the same edit. If I move the edit, it will update on his side; if he moves it, it will update on my side. We both have interactive controls on the edit, and he can actually go through a cut at night when we’re not at work, and make comments that can be emailed to me and imported directly into the timeline as metadata markers.”
Fine adds that one of the great new features in Final Cut Pro 7 is the ability to use different color-coded clip and sequence markers to indicate groupings and kinds of scenes or sequences. “For example, we use purple to mark and number each line of dialogue, green markers to indicate gags, and so on,” he explains. Fine also likes FCP’s redesigned time remapping feature, which allows clips to be played at variable speeds within a sequence. “In previous versions the feature was not very intuitive, but Final Cut Pro 7 has completely changed its user interface, and they’ve made doing a speed ramp a very easy task. In addition, Final Cut Studio’s Motion software can render the time-remapped clips using optical-flow interpolation. This method tracks the movement of pixels in a clip and generates completely new discrete frames. It can take a long time to render, but we have sent a couple of clips directly from FCP's timeline to Motion in order to use this feature, and the result looked as though it had been ramped in-camera.”
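The time remapping Fine praises boils down to a mapping from each output frame to a source position, accumulated from the playback speed at that point in the ramp; fractional positions are exactly where optical-flow interpolation earns its keep, since no real source frame exists there. The keyframe representation below is an assumed simplification for illustration, not FCP's internal format:

```python
# Sketch of the arithmetic behind a speed ramp: each output frame maps
# to a source position found by accumulating the varying playback speed.
# The per-frame speed list is an assumed, simplified representation.

def ramp_positions(speeds, n_out):
    """Return the source frame position for each of `n_out` output frames.

    `speeds[i]` is the playback speed while rendering output frame i
    (1.0 = normal, 0.5 = half speed, 2.0 = double speed).
    """
    pos, positions = 0.0, []
    for i in range(n_out):
        positions.append(pos)
        pos += speeds[i]
    return positions

# Ramping from normal speed down to half speed over five output frames:
speeds = [1.0, 1.0, 0.75, 0.5, 0.5]
print(ramp_positions(speeds, len(speeds)))  # -> [0.0, 1.0, 2.0, 2.75, 3.25]
```

Positions like 2.75 fall between real frames; a nearest-frame renderer duplicates frames there, while optical flow synthesizes a genuinely new frame, which is why Fine's renders "looked as though [they] had been ramped in-camera."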
Abroms adds that having the VFX department in the same room was an added bonus. “In the old days, we’d have to send out the effects to the lab for scanning and it would be three days before they would even get the shots,” he relates. “Because we also retained the raw RED files, they can have the shots an hour after they request it. We’re entirely self-contained and can be anywhere in the country with this workflow setup and don’t have to deal with a lab.”
When the cut is delivered, the DI will be done in Los Angeles on Apple’s Color 1.5, also a part of Final Cut Studio 3. “Ed is cutting in 2K, which will be the actual resolution of the delivery,” Fine explains. “Rather than send the DI house an EDL or XML files, we’re sending them the entire cut as a Final Cut Pro project file, along with the RED file backups that they will be using for the actual DI, rather than the ProRes HQ material. It’s a truly unique way of doing a DI.”
Michael Kunkes is a freelance editor and writer specializing in animation, production and post-production. He can be reached at firstname.lastname@example.org.