Making a 3D documentary: stunning visuals, solid story

It’s the final countdown. Our stereographer, Sean White, is hustling to nail down all the gear to shoot our first 3D documentary.

The crew will be packing up on Sunday, and flying to Salvador, Brazil on Monday to start filming the explosive demolition series Blowdown – I plan to get lots of pics and video of the gear before they leave and post them here next week.

I’ll be joining them the week after, and will stay until after the implosion of the condemned Fonte Nova Stadium.

And while Sean and the others are busy thinking about how to capture great visuals in 3D, I’m going to be thinking about story.

Even though all this 3D stuff looks incredible, I still believe that if the story isn’t good, it’s not worth watching – so I need to make sure we’re investing in it first.

This means capturing key material among Controlled Demolition Inc.’s crew – obstacles they encounter, conversations they have, and emotional moments that resonate.

On the other hand, I don’t want to be afraid to capture 3D footage that will captivate our audience visually …

Shooting a 3D documentary: thinking 3D in the field

As our first 3D documentary shoot approaches, I’m contemplating the intricacies of shooting in the third dimension. Our stereographer, Sean White, is familiarizing the rest of the crew with the new things they’ll have to take into consideration when shooting the prep and implosion of the Fonte Nova Stadium in Salvador, Brazil.

Bottom line: the jump into 3D will change how we frame and shoot the explosive demolition series Blowdown.

In other words, it will change pretty much everything.

And to make it work, the crew’s going to have to learn how to “see” in 3D.

They’ll have to think about where things are going to fall into positive and negative space.

They’ll have to identify visuals that are going to look superb in 3D – and, just as importantly, recognize the shots that won’t make the cut.

They’ll have to understand the strengths and limitations of each rig – our beam splitter, mini beam splitter, and the plethora of side by sides – so they know which one works best for which shot.

They need to recognize when we can’t get too close to a subject.

They must realize that they can’t frame something in the extreme foreground and pan to reveal a subject in the background – a trademark move to help create depth in a 2D image can mean too much volume in 3D.

They also need to make sure there are no objects floating around in the foreground (e.g. wires, the edge of a wall, rebar sticking up, the edge of a concrete slab) – and understand what details could be distracting.

Not to mention they’ll be working on a dusty demolition site – for example, excavators pulling up dirt, swinging into the shot as the “claw” grabs something.

They’ll not only have to think about how this will play in 3D, they’ll have to think about what happens if one lens gets dusted out, say, by the excavator’s load.

If they miss any one of the parameters on either eye, this shot – or any other shot – will be useless.
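For what it’s worth, here’s the kind of back-of-envelope check that helps with one of those parameters: how much screen parallax a close foreground object produces. This is only a rough sketch using the standard parallel-rig approximation, and every number in it (interaxial, focal length, sensor width, distances) is made up for illustration rather than taken from our actual settings.

```python
# Rough parallax sanity check for a parallel-camera stereo rig.
# All numbers here are illustrative, not our rig's real settings.

def parallax_percent(interaxial_mm, focal_mm, sensor_width_mm,
                     convergence_m, subject_m):
    """Horizontal parallax of a subject as a percentage of frame width.

    Parallel-rig approximation:
        parallax_on_sensor = focal * interaxial * (1/convergence - 1/subject)
    Negative values mean the subject sits in front of the screen plane.
    """
    p_sensor_mm = focal_mm * interaxial_mm * (
        1.0 / (convergence_m * 1000) - 1.0 / (subject_m * 1000))
    return 100.0 * p_sensor_mm / sensor_width_mm

# Hypothetical shot: 60 mm interaxial, 35 mm lens, 36 mm-wide sensor,
# converged at 6 m, with a chunk of rebar only 1.5 m from the lens.
p = parallax_percent(interaxial_mm=60, focal_mm=35, sensor_width_mm=36,
                     convergence_m=6.0, subject_m=1.5)
print(f"foreground parallax: {p:+.1f}% of frame width")

# A common rule of thumb keeps out-of-screen parallax within ~2% of width.
if abs(p) > 2.0:
    print("too close: back off, narrow the interaxial, or re-converge")
```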

Here’s to cleaning the cameras – times two.  And all the adventure that comes with it.

Lucky we have a great crew. Ready to roll.

Shooting a 3D documentary: why we use still photography to capture time lapses

A bit more re. how we plan to produce time lapses for our first 3D documentary.

We’ve used still cameras to capture time lapses for previous episodes of Blowdown, the explosive demolition show we’re gearing up to film – for transitions, establishing shots, and work that’s progressing.

Here’s a raw example of one from an episode in Season One – the implosion of four cooling towers at the Sellafield nuclear facility in England:

We use this technique because it gives us photos at a higher resolution than HD – pristine JPEG images of up to 21.1 MP.

Obviously much better quality than frame grabs off of video.

It also means our primary video cameras/crew can be used to film action – in this case demo work on the Fonte Nova Stadium in Salvador, Brazil – while the still cameras (in this case Canon 7Ds) sit unmanned on a side-by-side rig, automatically collecting shots.

In the 3D realm, the super high resolution will allow us to converge and do digital zooms in post within the time lapse without losing any quality.
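To put rough numbers on that headroom, here’s my own arithmetic, assuming a 3:2 still at the quoted 21.1 MP (roughly 5616 x 3744 pixels), which is my assumption rather than an official spec:

```python
# Back-of-envelope reframing headroom for a stills-based 3D time lapse.
# Pixel dimensions are an assumption (3:2 frame at ~21.1 MP); adjust to
# whatever the cameras actually deliver.

STILL_W, STILL_H = 5616, 3744   # assumed still-frame size
HD_W, HD_H = 1920, 1080         # delivery resolution

# How far we can punch in before the crop drops below full HD.
zoom_headroom = min(STILL_W / HD_W, STILL_H / HD_H)
print(f"max lossless digital zoom: about {zoom_headroom:.1f}x")

# Pixels left over for horizontal convergence slides at a 1:1 HD crop.
slide_px = STILL_W - HD_W
print(f"room to slide the eyes left/right: {slide_px} px "
      f"(~{100 * slide_px / STILL_W:.0f}% of the still's width)")
```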

The mini beam splitter rig: portability for 3D documentary filmmakers

The mini beam splitter rig concept we came up with to film elements of our first 3D documentary has come to fruition.

We decided to try and build one because the thought of moving and setting up our Film Factory 3D Indie BS Rig for one or two close-ups is just too painful. Now we can use this smaller unit to shoot those components for the explosive demolition series Blowdown, and save schlepping the full-sized rig around the Fonte Nova Stadium in Salvador, Brazil for when we’re shooting extensive B-roll in one location.

The goal was to custom-design a rig that would house two Canon 7Ds and that was small enough and light enough to be operated by one person, handheld.

To achieve this, the mini beam splitter rig:

1) Has customized aluminum rails that aren’t as big and fat as the ones on the Film Factory Indie rig.

2) Is bolted and tweaked specifically for the dimensions of the Canon 7Ds.

3) Has a smaller box.

4) Is designed so the second camera is underslung – easier to handle because it’s not as top-heavy.

5) Allows us to get camera lenses closer to the mirror.

It was whipped up in a couple of days, and it’s not pretty – but it is portable and robust. We’ll see if it works.

Showing a 3D documentary demo: polarized home entertainment gear

Great news: we’ve nailed down all the equipment I need to show our 3D demo to broadcasters next week.

We’ve cut together test footage/VFX in preparation for the first 3D episode of Blowdown – the upcoming implosion of the Fonte Nova Stadium in Salvador, Brazil – to share with broadcasters.

But I couldn’t find a 46-inch JVC GD-463D10 to show it on – the company’s out of stock and backlogged.

So JVC’s providing us with a demo model – one of only four in the U.S. – for the screening.

A rep from CineLineWest, a local supplier, contacted the company on our behalf, and they’ve arranged for one of the two demo monitors on the East Coast to be shipped for the meeting – many thanks.

I’m not sure exactly why there’s such a shortage, but we’ve heard part of the issue is a high failure rate with the glass in the screens during the manufacturing process. It’s a new technology and, as with any other new product, it takes time to refine the assembly line.

Our editor, Brian Mann, is making DVDs of the demo and an HQ digital version for my MacBook Pro so I’ll have the option of playing it either way (redundancy … yes).

We ordered 10 pairs of “Buddy Holly” circular polarization glasses from 3DStereo.

Thanks to the rep there as well, who offered to drop them off at FedEx on a Saturday.

And to top it all off, two pairs of “James” specs from MicroVision Optical 3D – “the most current secret agent look.” Rocking.

Now our broadcasters will be able to experience the third dimension of home entertainment – a taste of things to come.

When they see this stuff, I think they’re going to feel the way I do – like this is how it was always meant to be.

One giant step closer to the real thing.

I’m not going to rest easy until the screening is finished … I can’t wait.

Shooting a 3D documentary: how we’re syncing Canon 7Ds (VIDEO)

Yesterday’s post has created some confusion between syncing Canon 7Ds for 3D time lapse photography and syncing them for 3D videography.

I’ve edited it to make clear that yesterday’s post is about stills, and am now posting re. video.

To sync Canon 7Ds while shooting video footage for our first 3D documentary, we plan to use the same infrared box systems we’ve rigged to sync our Canon Vixias, and slate our shots so we can sync in post.

I’ve outlined this technique in a previous post for the Vixias. For the 7Ds we:

1) Set both cameras to self-timer/infrared mode, which allows us to use an infrared remote (the Canon RC-5, in this case).

2) Position our box, which is designed to receive any infrared signal and transmit it through a split cable to the two cameras’ infrared sensors.

3) Tape the infrared heads at the ends of the split cable to the infrared sensors on the two Canons, and then use the remote to start them in sync.

And yes, because of the internal clock circuitry disparities we don’t get a 100 per cent accurate start and stop, as the cameras may not start recording on the same frame.

Our stereographer, Sean White, has found our 7Ds can be out of sync by one or two frames at 30 FPS.

To get around this, we’re doing a physical slate at the start of each shot for our editor – this way he can nudge shots by a frame or two and sync from where the slate is.

Sean’s tested this system extensively.

He’s found that once you sync clips at the start they stay in sync for up to 12 minutes straight – much more time than we need to get the types of shots we’re going to capture with the 7Ds.
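For anyone building a similar two-camera setup, the slate fix really is just arithmetic: note the frame where the slate closes in each eye, and slip the clip that started rolling first by the difference. Here’s a minimal sketch of that math (the file names and slate frame numbers are hypothetical, and our editor does the equivalent nudge inside the edit suite rather than in a script):

```python
# Slate-based sync math for a pair of 7D clips shot at 30 fps.
# File names and slate frame numbers below are hypothetical.

FPS = 30

# Frame index where the slate closes in each eye's clip.
slate_frames = {"left_eye.mov": 152, "right_eye.mov": 154}

# The clip whose slate lands on a later frame started rolling earlier,
# so it gets slipped back by the difference.
(late_clip, late_frame), (early_clip, early_frame) = sorted(
    slate_frames.items(), key=lambda item: item[1], reverse=True)

offset_frames = late_frame - early_frame
print(f"slip {late_clip} back by {offset_frames} frame(s) "
      f"({offset_frames / FPS * 1000:.0f} ms) to line up with {early_clip}")
```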

 

Test footage:

 

[Video]

 

Video: Compositing for a 3D documentary – VFX elements that work in stereo

The 3D VFX pyramid fly-in shot that our compositor’s been working on to test techniques for our first 3D documentary is good to go.

Again, you’ll need the good old red-cyan goggles for this:

Our compositor, Jakub Kuczynski, explains elements in the shot:

And compares working on 3D VFX versus 2D VFX:
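Side note for anyone who wants to experiment at home: once you have a left eye and a right eye rendered out, the red-cyan preview itself is simple to build – red channel from the left eye, green and blue from the right. A minimal sketch (hypothetical file names, and definitely not Jakub’s compositing pipeline):

```python
# Build a simple red-cyan anaglyph from a left/right eye pair.
# File names are hypothetical; the real compositing happens in the VFX package.

from PIL import Image

left = Image.open("pyramid_left.png").convert("RGB")
right = Image.open("pyramid_right.png").convert("RGB")

r, _, _ = left.split()    # red channel from the left eye
_, g, b = right.split()   # green + blue (cyan) from the right eye

anaglyph = Image.merge("RGB", (r, g, b))
anaglyph.save("pyramid_anaglyph.png")
```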

This shot will be part of the 3D demo I plan to show broadcasters next week.

Now all I need to do is find a 46-inch JVC monitor to show it on …

Shooting a 3D documentary: how we’re syncing Canon 7Ds (STILLS)

I’ve received a few questions re. how we’re syncing the Canon 7Ds we’re using to capture elements of our first 3D documentary … so I’m blogging about it to share with everyone.

We’ve used DSLRs to get high res stills at set intervals for time lapses for years, but, of course, never in 3D.

For this 3D episode of Blowdown, the explosive demolition series we produce, the crew will use 7Ds for these time lapses – and also for establishing shots of the condemned sports stadium in Salvador, Brazil and, of course, the implosion itself.

Here’s how we’ve brought this system into the third dimension:

1) Splice cable so there are two heads on one intervalometer.

2) Attach heads to timer remote ports on two Canon 7Ds on side-by-side rig.

3) Sync using the one handheld intervalometer.

Voila!
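Since a couple of people have also asked about planning the interval itself, the math is simple: divide the real-world duration of the event by the number of frames you want on screen. The figures below are hypothetical examples, not our actual shoot plan.

```python
# Back-of-envelope intervalometer math for a 3D time lapse.
# The event duration and screen time below are made-up example figures.

event_hours = 6        # how long the demo work actually takes
screen_seconds = 15    # how long the time lapse should run in the cut
playback_fps = 30

frames_needed = screen_seconds * playback_fps             # stills per eye
interval_seconds = event_hours * 3600 / frames_needed      # gap between exposures

print(f"{frames_needed} frames per camera, one exposure every "
      f"{interval_seconds:.0f} seconds on the shared intervalometer")
```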

I’ll provide more details re. our time lapse strategy later: wanted to get this bit up ASAP.

Hope it helps.

 

Watching a 3D documentary: JVC 3D HD LCD monitors clearly a hot commodity

Supply and demand disparity …

We’re trying to get a 3D monitor so I can show test footage/VFX for our first 3D documentary to broadcasters next week.

I’d like the same monitor that we purchased for our in-house purposes – the 46-inch JVC GD-463D10 – but guess what … so far there are none to be had.

The Toronto-based supplier that we purchased it from is sold out and backlogged.

They’re trying to get us a monitor directly from JVC – or at least find someone we can contact to push our order – but so far neither we nor they have been able to get through to anyone who can speak conclusively on behalf of the company.

The alternative is pursuing a Hyundai monitor – but we know the JVC works for us, and I’d much rather go with the tried and tested when showing broadcasters a demo of the techniques we’ve developed.

We’re hoping to hear something back tomorrow a.m. (PDT).

A bit nerve-wracking.

But the silver lining’s undeniable … 3D monitors are getting snapped up faster than the assembly line can churn ‘em out.

Clearly this bodes well for entertainment in stereo.