Straggling video from yesterday’s post finally live … stereographer Sean White on the mission to rig a 3D camera system portable enough to shoot the prep and implosion of a condemned sports stadium in Salvador, Brazil for the explosive demolition series Blowdown:
The great lens showdown is over: Meuser Optiks it is. After an intense push to choose lenses for our A cam system so we can shoot our first 3D documentary, these German lenses – 3.4 mm, designed for a 1/3-inch CCD sensor, and HD capable – were the ones that made the cut.
They work with our Iconix 1/3-inch sensors, and the interaxial distance can be set close enough to allow us to film 1.5–2 metres away from our subject.
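For context, a common stereographer's rule of thumb – the "1/30 rule" – says the interaxial distance should be no more than about 1/30 of the distance to the nearest subject. A quick sketch (the numbers are illustrative, not our actual rig settings):

```python
# "1/30 rule" of thumb: interaxial distance should be at most roughly
# 1/30 of the distance to the nearest subject. Illustrative numbers
# only -- not our measured rig settings.

def max_interaxial_mm(nearest_subject_m, ratio=1 / 30):
    """Maximum comfortable interaxial distance, in millimetres."""
    return nearest_subject_m * 1000 * ratio

for near_m in (1.5, 2.0):
    print(f"nearest subject {near_m} m -> interaxial <= "
          f"{max_interaxial_mm(near_m):.0f} mm")
```

At 1.5 metres that works out to roughly 50 mm, which is why being able to set the lenses close together matters.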
Another video to follow ASAP on why the system should work to shoot the prep and implosion of a condemned sports stadium in Salvador, Brazil for the explosive demolition series Blowdown – YouTube’s fighting me.
As I’ve mentioned, the 3D files that Cineform creates only have two audio tracks.
To capture ambient noise as well as a conversation between two subjects for the explosive demolition show Blowdown, we have to capture at least three channels (a boom mic and two lavs), sometimes four (camera mic).
Our editor, Brian Mann, has been in conversation with Cineform developers to see if we could find a way to edit with more than two channels.
They’ve been very prompt in replying and helpful.
But unfortunately it looks like there’s no way to edit more than two channels of audio using the current version.
There’s no particular reason why the program’s this way – it’s just a design factor that isn’t optimal for our specific post production needs.
As far as we’re concerned it’s the best high-end game in town, and otherwise it’s working great.
Cineform’s lead Mac engineer plans to add multi-channel audio support to the feature list for their upcoming release, First Light.
In the interim, we’ll have to figure out how to adjust our workflow.
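One interim option (a sketch of one possible workaround, not a description of our actual pipeline) would be to premix the four location channels down to the two tracks Cineform can carry – say, boom plus camera mic on one, the two lavs on the other – while archiving the untouched recordings for the final mix:

```python
# Hypothetical premix: fold four location channels into the two tracks
# a Cineform 3D file can carry, keeping the original recordings for
# the final mix. Samples are floats in [-1.0, 1.0]; the channel names
# and the simple 50/50 mix are illustrative.

def premix_to_two(boom, cam, lav_a, lav_b):
    """Return (track1, track2): boom + camera mic, and both lavs."""
    track1 = [0.5 * (b + c) for b, c in zip(boom, cam)]
    track2 = [0.5 * (x + y) for x, y in zip(lav_a, lav_b)]
    return track1, track2

t1, t2 = premix_to_two([0.2, 0.4], [0.0, 0.2], [0.6, 0.0], [0.2, 0.4])
```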
Our HD 3D monitor has arrived!
Moving to polarized is a relief – no colour loss, no red and cyan ghosting … and no headaches.
Purchasing the monitor ($6,600 later) was obviously a must – you can’t produce a high quality 3D show, like the first 3D documentary we’re going to shoot, editing in anaglyph.
We managed to fit it into an edit suite and set it up. Brian, Jakub and I check out some VFX footage on our newest 3D toy:
When it comes time to cut this Blowdown, Brian will use the monitor to see what he’s editing in Final Cut Pro (which, as I’ve mentioned, can’t edit in 3D without third party program help – we’re trying Cineform Neo3D out).
This view is key – cutting 3D shots means there’s a lot more to consider – parallax, convergence, wide and close, how much positive depth/negative depth exists in each shot.
If you cut shots with huge discrepancies in depth it’s really uncomfortable to watch, so you can’t just chop shots together – even with a flashy transition.
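One way to think about that check – a toy sketch, with each shot's subject parallax expressed as a percentage of screen width and an assumed comfort threshold:

```python
# Toy depth-continuity check at a cut. Parallax is the subject's screen
# parallax as a percentage of screen width (negative = in front of the
# screen). The 1% comfort limit is an assumed figure for illustration.

def jarring_cut(out_parallax_pct, in_parallax_pct, limit_pct=1.0):
    """True if the depth jump across the cut exceeds the comfort limit."""
    return abs(out_parallax_pct - in_parallax_pct) > limit_pct

jarring_cut(-0.5, 1.2)  # True: subject leaps from in front of the screen to behind it
jarring_cut(0.3, 0.6)   # False: a small, comfortable shift
```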
The only rub with the monitor is that it doesn’t do dual stream, which means the footage is technically at half resolution (i.e. neither eye gets full res).
So even though we’re editing in dual stream (to deliver the highest quality possible), we can’t view it that way on the JVC screen.
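That half-res limitation follows from how passive polarized screens work: if (as we assume for this sketch) the panel interleaves the two eyes on alternating rows, each eye only ever sees every other line:

```python
# Why a passive polarized display halves per-eye resolution: the panel
# shows the left and right images on alternating rows, so each eye sees
# only half the lines. Toy 4-row "frames" for illustration.

def row_interleave(left, right):
    """Even rows from the left-eye frame, odd rows from the right."""
    return [left[i] if i % 2 == 0 else right[i] for i in range(len(left))]

frame = row_interleave(["L0", "L1", "L2", "L3"], ["R0", "R1", "R2", "R3"])
# frame == ["L0", "R1", "L2", "R3"] -- each eye keeps 2 of its 4 rows
```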
We looked into dual stream monitors – Panasonic’s due to release a 25-inch unit in the fall – but it’s prohibitively expensive (approximately $10,000).
And, more importantly, it’s too small for us to view our footage in a size that’s representative of our final product (how many 25-inch televisions have you seen lately?) – a shortfall that could lead to convergence that isn’t optimal for our audience.
Oh, and the JVC came with two free pairs of polarized glasses.
Looks like we’ll do just fine:
Editor Brian Mann works with the JVC HD 3D LCD monitor
Good news on the post front: we’ve upgraded from Adobe CS4 to CS5.
And the stereo scripts our compositor will use to create VFX for our first 3D documentary work with the newer version.
We’ve been gearing up to get CS5 running since we moved from Leopard to Snow Leopard to take advantage of the 64-bit architecture and improve workflow.
But our VFX artist, Jakub Kuczynski, was concerned that the stereo 3D scripts he found online – which have given him a much more efficient stereoscopic pipeline in After Effects – wouldn’t transfer over smoothly to CS5.
He contacted the scripts’ developer, Christoph Keller, to ask if they’d be compatible, but even Keller didn’t know.
Now we do. And it’s very good news – work that would take Jakub a day to do manually takes him about an hour, thanks to the scripts.
As for the CS5/Leopard upgrades, we haven’t noticed a marked increase in speed, but even a little more juice over the long run means more efficient post production overall.
We’ve fed footage from our 3D green screen shoot into post, where we’ve promptly encountered our first editing glitch. We’re running a trial version of Cineform Neo3D software to see if it will work for editing our first 3D documentary.
The reason we’re trying this program is that it allows for dual stream, which means each eye is at full res, so we can do convergence, colour correction and other adjustments in real time rather than having to render every change.
Cineform also works with Final Cut Pro, the editing software we normally use to cut 2D HD – as far as we know, there’s no way to edit 3D in Final Cut without a third-party program.
Great for picture, but there’s a problem with audio.
The 3D files that Cineform creates will only have two audio tracks.
To capture ambient noise as well as a conversation between two subjects, we have to capture at least three channels (a boom mic and two lavs), sometimes four (camera mic).
And since Blowdown – the explosive demolition series we’ll be filming – is event-based, there’s no opportunity for ADR, and you can’t recreate most of the ambient sound in post.
The issue isn’t technical – Alister Chapman reports using the same cameras successfully, and we were able to genlock the EX3 to the EX1 by connecting the EX1’s Y channel of the component output to the EX3’s genlock in connector, just as he did.
It’s logistical … the cameras are just too big and cumbersome for this particular beam splitter rig.
We’ve modified the rig so they fit better, but getting them aligned vertically is rough – the mics protrude and we’re still seeing the edge of the box and/or the bottom of the mirror when we use our Sony EX 5.8 mm lens (which has a 56-degree horizontal angle of view).
Wide shots are a must for Blowdown, the explosive demolition documentary we’ll be filming, so we need a system that will effectively capture this kind of footage – i.e. we need to hit the sweet spot on the mirror, keep the cameras vertically aligned and not see the rig when we use wide-angle lenses.
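The wide-angle constraint comes straight from the lens geometry: horizontal angle of view is 2·atan(sensor width / 2f). A quick check (the 6.4 mm sensor width is an assumed figure for a 1/2-inch-type 16:9 chip):

```python
import math

# Horizontal angle of view: aov = 2 * atan(sensor_width / (2 * focal)).
# The 6.4 mm sensor width is an assumed value for a 1/2-inch-type 16:9
# sensor; the EX 5.8 mm lens is quoted at about 56 degrees, which this
# approximation lands close to.

def horizontal_aov_deg(focal_mm, sensor_width_mm=6.4):
    return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_mm)))

aov = horizontal_aov_deg(5.8)  # roughly 58 degrees
```

Doubling the focal length doesn't halve the angle, but it does shrink it sharply – which is why the shortest lens available drives how much of the rig ends up in frame.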
The alternative is enlarging the image in post to eliminate the part(s) of the shot that contain the rig, but that will degrade the quality, so I want to try and avoid this (especially since we’ll be blowing the footage up to a certain degree already to facilitate convergence).
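The cost of that approach is easy to quantify: cropping n pixels off the frame means scaling the rest back up by width / (width - n) to refill it. A sketch with illustrative pixel counts:

```python
# Enlargement needed after cropping out the rig edge (or after shifting
# the eyes for convergence): scale = width / (width - cropped_pixels).
# The pixel counts here are illustrative assumptions.

def blowup_factor(frame_width_px, cropped_px):
    """Scale factor needed to refill the frame after cropping."""
    return frame_width_px / (frame_width_px - cropped_px)

factor = blowup_factor(1920, 96)  # cropping 5% forces a ~1.05x blow-up
```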
So, now we’re working with Canon Canada directly to get loaners of the XF305, which has just been released.
Green screen action!
Writer Nicole Tomlinson stands in for a 3D green screen test
We’ve taken our B cam setup – Sony EX1/EX3 cameras on the Film Factory 3D BS Indie Rig – for a full test run.
The mission: to see if the system can capture green screen footage for our first 3D documentary the way we want it to.
Stereographer Sean White mans the 3D beam splitter rig
We need these shots to create several of our in-house visual effects, a style we prefer to classic documentary CGI because it allows us to explain extremely technical concepts in a photo-real atmosphere.
This means our transitions in and out of our footage are much more seamless … viewers can stay more immersed in the environment and focused on the story.
Here’s an example of our Blowdown VFX style:
Controlled Demolition Inc. President Mark Loizeaux outlines his demolition plan
The green screen footage we’ll need to create effects like this comes with an entirely different set of issues than the field shots we’ll be tackling.
This environment is the most “studio” our event-based filming gets – the interviews aren’t scripted, but the lighting is set, the frame is stationary, and there’s opportunity for multiple takes.
But what we capture has to work in our compositor’s virtual environment or it’s completely useless.
Jakub Kuczynski, Parallax Film’s VFX artist, details these challenges:
We’ve thrown the footage over to post – we’ll see if it flies.