Shooting a 3D Documentary: Sony EX3s on Custom Designed Rig Get The Job Done

Production of Battle Castle is fast underway. It’s a documentary series that brings the world’s greatest medieval strongholds to life, and we’ve kicked it off shooting 3D in Kent, England, on the grounds of the magnificent Dover Castle.

Packing wisdom gathered from taking Blowdown 3D from production through post, we’ve optimized our beam-splitter rig for this new terrain to avoid the issues (and limitations) we had to work with during our first journey into the third dimension.

The result: one self-contained system that can capture almost everything we need.

Here’s the breakdown:

We’ve chosen to mount two Sony EX3s over the Canon 7Ds to avoid the genlock issues we were experiencing with the Canons.

The EX3s are great, gold-standard cameras that can output a clean signal straight to our Nano3D drives.

We’ve also reconfigured the sliders for more interaxial play and added custom mounts so we can hang the cameras vertically without ripping out the hot-shoe mount.

We used Redrock Micro components along with some custom parts to fine-tune the hand-held splitter.

Altogether it weighs 45 lbs, meaning a strong DoP can hold it for four to five minutes before taking a break.

Well worth the effort when it means you have freedom.

Limitation worth noting – the EX3s can’t capture vista shots where the subject is far away. We fill this gap with a pair of Canon 5Ds on a side-by-side rig.

So what it comes down to is we now have a system with perfect sync, beautiful capture, flexibility, and portability.

What more could you ask for?

Ian Herring, President

@ianherring

Shooting a 3D Documentary: Arming our B-cam system for Blowdown

In the previous post I described the evolution of our mini beam splitter rig, engineered by the Parallax crew for portability and 3D close-ups.

Before the filming of our first documentary, Blowdown, we went back-and-forth on what cameras to mount on this custom-designed rig to complete our B cam system. It was an epic battle that ended with Canon 7Ds as victor … for this round at least.

Here’s why:

When we shot the demolition of the Fonte Nova Stadium, our Iconix A-cam system, rigged side-by-side on our very own hand-held design, took some beautiful shots.

For our B-cam system, it came down to Sony EX3s or Canon 7Ds. The big problem: the Sony EX3s proved too heavy and cumbersome for our purposes. This is an event-based documentary in a demolition zone – the last thing we need is to haul excess weight around.

So the 7Ds were the cameras we went with – but we knew this decision came with a couple of drawbacks:

1. The 7Ds lack genlock, making it difficult to synchronize capture between the two cameras – which means we’re going to have a long-GOP compression issue.

Translation: fast motion close to the camera will produce retinal rivalry.

2. We can’t use video feeds coming out of the cameras with our Transvideo 3D monitor.

Which means we won’t be able to overlap images and check alignment during the shoot.

3. There’s no uncompressed signal coming out that we can tap into and record to the Nano3D drives – a problem in 2D as well.

Despite these limitations, we still captured great stereo images with properly set interaxials.

In the end, our confidence in our Canon 7D mini beam splitter system paid off, and we have a visually unprecedented documentary to show for it.

But as we move further into the third dimension, we’re upping our game …

Ian Herring, President

@ianherring

Shooting a 3D Documentary: How to customize your 3D gear to get the best shot

Industriousness, adaptability, and innovation are vital when it comes to the world of filmmaking – especially when you’re shooting one of the first event-based 3D documentaries ever produced for an international audience.

Before we filmed Blowdown 3D, our stereographer, Sean White, faced a huge challenge: engineer a 3D rig that could capture the variety of shots we needed and stand up to run-and-gun filmmaking on an industrial demolition site.

We decided on a beam-splitter rig because a side-by-side rig wouldn’t have allowed us to shoot the close-ups we wanted.

But there was a problem: the Film Factory Indie beam splitter rig we had purchased would have been a beast to lug around a condemned sports stadium in Salvador, Brazil.

Necessity is the mother of invention after all … if we were going to make this journey into the third dimension work, it was clear that we’d have to come up with our own rigging system.

First step: tear open our Film Factory Indie beam splitter, get to know its insides, and build it stronger.

It was a process of experimentation, ordering parts, making adjustments and modifications.

Customize, customize, customize.

Finally, we created a design that worked and hired a machinist to solder the pieces together.

The result: an aluminum box with a window for a horizontal camera and an underslung design shooting up at a mirror.

We call it the mini beam splitter rig.

Besides being close-up capable, this custom design also causes fewer problems with reflection and helps protect the mirror and camera lenses from the rain, dust and grit sometimes encountered in the field.

Above all else is its portability, which is paramount when you’re filming an event-based documentary.

All this at a fraction of what manufacturers are asking for this kind of optimized technology.

Ian Herring, President

@ianherring

Editing a 3D Documentary: Landing a Colorist

After some investigation, our team and I have finally found a studio to take on the epic task of colour grading our first 3D documentary.

The search was a tough one.

The main challenge: though many studios in the Vancouver area possess the colour-correcting software to edit 3D – DaVinci Resolve, Lustre, Quantel Pablo – they’re still waiting on the monitoring systems to edit in stereo.

For some studios, monitoring systems were actually in transit, boxed to be at their doorstep in less than a month … exciting evidence of the growing hunger for 3D content.

Our colorist will have their work cut out for them.

The rigours of event-based documentary filmmaking, versus intentionally lit film environments, mean the colorist will have to deal with variable lighting that can change from one shot to the next.

Tackling discrepancies between left- and right-eye footage will be another challenge.

The way the cameras capture, minuscule differences in manufactured parts, and the way light hits those parts all create differences between left and right.

The differences are slight, but if left uncorrected could produce a big problem.

Matching stereo pairs and making content broadcast legal will take the work of an expert, but it’s worth the payoff: the absolute best of what 3D can offer before it leaves our hands.

Ian Herring, President

@ianherring

 

Editing a 3D Documentary: Scouting for Colorists

As we enter the polishing stages on our first 3D documentary, the hunt is on to find a studio that can take on the task of colour correcting months of painstaking work.

It’s essential to any post-production process, but especially crucial to 3D because we’re creating an effect that only reads if stereo pairs match perfectly.

When filming, aberrations in stereo images occur because the mirrors/prisms contained within the beam splitter can change the nature of the light from one moment to the next.

Any slight difference in temperature or exposure of image pairs will confuse the brain, create discomfort, and botch the whole operation. Period.

So my team and I need to be choosy when it comes to colorists. The challenge now is finding a studio that has a system capable of stereo viewing and correction – slim pickings due to 3D’s relative newness.

And of course, one that possesses the expertise to produce the high-quality 3D imaging we are looking for at a price we can work with.

More to come …

Ian Herring, President

@ianherring

Compositing a 3D documentary: How to maximize stereoscopic effects

Now that our first 3D documentary is almost completed, compositor Jakub Kuczynski has time to give the lowdown on some of the challenges he faced editing Blowdown.

Challenge 1: Double-rendering

Because 3D is filmed in stereo pairs, every shot must be rendered twice – once for each eye. For a compositor with tight deadlines, this process can be a painful one.

Challenge 2: Editing in anaglyph

3D compositing in anaglyph can be deceiving because it crushes depth perception.

The tendency is to compensate for this by creating more depth, but sometimes you can overshoot the mark – a discrepancy that becomes obvious when you view the footage on polarized monitors.

An extra step is then needed to make sure stereo pairs are aligning perfectly which, of course, means more time and inevitably more stress.

Challenge 3: Finding the happy balance between formats

Until that fine day when everyone is experiencing our documentaries in full stereoscopic glory, it’s important to make sure shots work just as well in 2D as they would in 3D. That in itself is an art.

Here’s an example of one of Jakub’s most technically challenging shots – a 3D within 3D effect composited for Blowdown’s episode on the implosion of the Fonte Nova Stadium in Brazil:

And here’s how he brought it all to life:

But, of course, mastering challenges like these comes with the reward of creating 3D VFX everyone can get a kick out of.

Ian Herring, President

@ianherring

3D documentary filmmaking review: Fuji W3 3D camera gets E for effort, but you can’t change the world 12 seconds at a time

Our stereographer Sean White and I had a chance to test out the Fuji FinePix REAL W3 3D camera while I was at the Victoria Film Festival showing some of our 3D documentary material last week.

In brief

First, props to Fuji for putting this little camera on the market. Almost any venture into the third dimension is a good move in our books, and the price point will surely entice more consumers to delve into this exciting new realm.

But from a technical standpoint, much like the Panasonic AG-3DA1 we tested a few weeks back, the Fuji W3 3D camera just doesn’t compare to the quality we can get from using a two-camera system on either a side-by-side or beam-splitter rig … yet.

Like the Panasonic, I suspect this will improve with time.

But there’s another huge dealbreaker: the Fuji W3 3D buffer only allows you to shoot 12 seconds of video at the highest quality settings – completely inadequate for any professional video applications.

In today’s market I expect a camera that shoots not only high-quality stills but video as well. So, in the end, we took the camera back. Even if Fuji increases the image quality, we won’t consider buying another one until the video recording capacity also increases … drastically.

The bottom line – props to Panasonic and Fuji for blazing some trails … but in the high-end 3D filmmaking world, two is still better than one.

Ian Herring, President

@ianherring

The full review

Here’s a more thorough rundown of the camera’s functions, including still photos, courtesy of Sean:

It’s pretty cool that there’s even something like this available for less than $500. Having 3D stills and video in a point-and-shoot, the ability to adjust the convergence, see it instantly on an autostereoscopic display AND fit the whole thing in your pocket is impressive indeed.

However, the quality of the images doesn’t match up to the quality of today’s best 2D compact cameras. Still, I believe this is a major leap forward in terms of making 3D accessible to the masses. The stills quality is definitely acceptable for personal use and the web. The autostereoscopic screen is actually quite sharp and effective once you find the sweet spot. The technology is designed so the separate left and right images are seen at the same time without any glasses, but only from a certain viewing distance and angle.

In this case, the best view is directly in line with the screen, about 30–50 cm away. What’s really great about this camera is how it makes 3D believers out of folks not accustomed to watching 3D. At a recent film festival, we shot some pics of guests and were able to explain the basics of 3D photography with instant results in the users’ hands. Lots of “wows” and “do that again”. Very cool.

On a technical note, the 75 mm interaxial distance is fine for most average shots, but be careful not to get any closer than about 2–3 m to your nearest subject or the background will “explode” (a.k.a. massive divergence). There’s an adapter that uses mirrors to effectively reduce the interaxial to about 25 mm, but we haven’t tried it yet – check it out.
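That 2–3 m rule of thumb falls out of simple small-angle arithmetic. As a rough sketch (my own approximation, not anything from Fuji’s documentation): with near-parallel lenses, the angular parallax between a near subject and a distant background is roughly the interaxial distance divided by the subject distance, and a common comfort guideline keeps that figure around one degree or less.

```python
import math

def background_parallax_deg(interaxial_m: float, subject_dist_m: float) -> float:
    """Approximate angular parallax (degrees) between a subject at
    subject_dist_m and a background at infinity: about b / d radians."""
    return math.degrees(interaxial_m / subject_dist_m)

# The W3's interaxial is fixed at 75 mm (0.075 m).
for d in (1.0, 2.0, 3.0, 5.0):
    print(f"subject at {d} m -> ~{background_parallax_deg(0.075, d):.1f} deg")
```

At 2–3 m the background parallax already sits in the 1.4–2.1 degree range, which is why backing off (or shrinking the interaxial with that mirror adapter) tames the divergence.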

On the downside, the video only shoots 720p, not 1080p, which is fine for the web, but unfortunately the video quality again is not as good as other 2D compact cameras out there. The big heartbreaker was that the buffer in the camera only allows a maximum of 12 seconds of video when recording at the highest settings. Lame.

Also, there’s no software support for Macintosh, which is what we’re all using, so I haven’t had a chance to perform any post-production functions. The camera does connect fine to a 3D monitor; however, it won’t display the saved converged files, so you’re not really seeing the proper 3D shots.

I applaud Fuji for packing this all together, especially with the autostereoscopic screen, but the video quality, video buffer and interface tools need to be improved big time.

Sean White

Watching a 3D documentary: how to tell the difference between good 3D and bad 3D

It’s easy to tell when a 3D viewing experience goes sideways – sore eyes, headaches and a general feeling of awkwardness are unmistakable indicators.

Obviously we want to avoid these issues with our 3D content. And to fully understand how to do it right, you’ve got to understand how it’s being done wrong.

So what makes bad 3D so bad?

Digital Cinema Report has posted an in-depth article, “How to Critique 3D”, on identifying the good, the bad and the ugly:

How to critique stereo capture 3D

- Your eyes should focus easily and naturally when viewing in stereo. If you are getting headaches or your eyes cannot focus, improper alignment is the cause. (DCR tip: Take off your glasses and try to spot an area of high contrast. You may see that a bright spot is a little higher for one eye than the other.)

- When items on screen are glowing or have an unnatural sheen, it is due to exposure differences between the two cameras. It could be unmatched exposure or reflection issues with the beam splitter rig.

- Keep an eye open for bright objects. Glints, lens flares and spotlights create more technical issues that have to be considered during stereo capture. Glints off metal objects can be messy and appear to be a different shape in each eye. Lens flares will “invert” and pull away from the viewer, which can be visually confusing. Spotlights can create star patterns that rotate differently in each eye, making them uncomfortable to view.

- Try to determine if there is too much depth onscreen. Some say it is perfectly acceptable for backgrounds to be out of focus. Others maintain that if the viewer cannot easily focus on distant objects, there is too much divergence. (DCR tip: Look for distant objects like mountains or spotlights; if the doubled image of the mountain appears separated by many inches or feet, the background is probably too far away.)

Watching anaglyphic 3D

When we posted a 3D clip from the Panasonic 3DA1 camera on YouTube, it was so everyone could judge the quality for themselves … ideally by viewing the footage in stereoscopic 3D.

But if you can only view the material in anaglyphic 3D and 2D, all is not lost – use the anaglyphic 3D mode to analyze depth and the 2D mode to analyze footage quality.

Last but not least

If one lens of your 3D glasses looks smudged, it’s likely that the focus, zoom or shutters were not properly synced during filming.

Or maybe it’s just popcorn butter.

Ian Herring, President

@ianherring

From CineForm to Blu-ray: how to prep and ship a 3D documentary for broadcaster review

The time has finally come for a full version of Blowdown 3D to be launched out into the world.

We are shipping the fine cut of our first 3D documentary to broadcasters this week.

And this time they’ll have the chance to actually see it in 3D.

It’s an obvious step – if you’re paying for 3D content, chances are you’ll want to check it out at some point before you sign off on the show.

But with this emerging technology, even obvious steps are rarely easy.

So how did we get a cut out for the broadcasters to experience a full color 3DHD explosive demolition from the comfort of their plush office chairs?

First of all, our editor Brian Mann had to make sure all the shots in the cut were 3D legal and correct any colour discrepancies between the two eyes.

To create a comfortable 3D viewing experience, we also monitored the LR convergence throughout the edit.

This made for more work – dealing with issues that either didn’t exist before (converging shots) or that would have been tackled further along in the editing process (for example colour correction after picture lock).

But anyone who’s in the 3D business knows more work is the name of the game.

To send this 3D version of the show to our broadcasters, we opted for two different delivery formats: a side-by side 3DHD version of the show on Blu-ray disc and a digital anaglyph SD version.

To create the Blu-ray disc, we exported the CineForm 3D file from Final Cut Pro and burned it to disc via Adobe Encore, which, unlike FCP’s export functions, allowed us to create more professional, customized menus. Then we shipped it via courier.

Now all you need to view this format is any Blu-ray player (it doesn’t have to be 3D) and a 3D-enabled monitor.

The anaglyph version was exported as a .mov file straight from FCP and then uploaded to our server. It’s a seriously inferior experience, but it will still give those unfortunate enough not to have a 3DTV handy a way to see the show in some version of 3D.
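Our anaglyph master came straight out of FCP, but the underlying idea is simple enough to sketch. A red/cyan anaglyph takes the red channel from the left-eye image and the green and blue channels from the right, so each tinted lens filters out the other eye’s picture. A minimal illustration on hypothetical frame data (not our actual pipeline):

```python
def red_cyan_anaglyph(left, right):
    """Merge left/right frames (rows of (R, G, B) tuples) into a red/cyan
    anaglyph: red channel from the left eye, green and blue from the right."""
    return [
        [(l[0], r[1], r[2]) for l, r in zip(lrow, rrow)]
        for lrow, rrow in zip(left, right)
    ]

# Tiny synthetic frames: left eye pure red, right eye pure cyan.
left = [[(255, 0, 0)] * 2] * 2
right = [[(0, 255, 255)] * 2] * 2
print(red_cyan_anaglyph(left, right)[0][0])  # prints (255, 255, 255)
```

The channel mixing is also why anaglyph colour is unavoidably compromised: anything strongly red or cyan in the scene ends up visible to only one eye.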

Though you might want to get on that, dear readers. I’d hate to see you live in the flatlands of 2D when the 3D world is just within your reach.

As for the broadcasters … maybe someone should warn them, because they’re in for quite a ride.

Ian Herring, President

@ianherring

Watching a 3D documentary: Tips On Creating the Best 3D Television Experience

As 3D televisions (and eventually our first 3D documentary) make their way into living rooms near you, it’s time to lay down some helpful tips I found online to ensure you are getting the most out of your in-house 3D experience.

Ambience

In order to create the optimal 3D experience, you first have to create the environment for it to happen.

Dim the lights and cover the windows – black them out if possible. Essentially, create yourself a tiny black hole.

By reducing ambient light you will eliminate ghosting and double vision that often botch the 3D experience.

When in doubt, darkness is always best.

Viewing Position

Until 3D televisions become 360 degrees of polarized celluloid, we’ll have to make do with flat screens that are better viewed head-on rather than from an angle.

Best then to keep the family or group of friends small so everyone gets a piece of the 3D pie.

Television Settings

Because 3D glasses are tinted, they’ll dim the movie if you watch with the default settings made for 2D. So make sure to customize your settings and increase the screen brightness.

3D is not normal television viewing—don’t forget it.

HD Screen 

This goes without saying, but I’m going to say it anyway: 3D is a viewing format that belongs on an ample-sized, high-def screen.

The bigger the screen the richer your experience of the imagery—think GIANT pop-up book vs. OK pop-up card.

The only other thing I’ll demand from here on in when it comes to 3D screens is that they be dust- and fingerprint-free.

Ian Herring, President

@ianherring