Shooting a 3D documentary: why we use still photography to capture time lapses

A bit more re. how we plan to produce time lapses for our first 3D documentary.

We’ve used still cameras to capture time lapses for previous episodes of Blowdown, the explosive demolition show we’re gearing up to film – for transitions, establishing shots, and shots of work in progress.

Here’s a raw example of one from an episode in Season One – the implosion of four cooling towers at the Sellafield nuclear facility in England:

We use this technique because it gives us photos at a higher resolution than HD – pristine JPEG images of up to 21.1 MP.

Obviously much better quality than frame grabs from video.

It also means our primary video cameras/crew can be used to film action – in this case demo work on the Fonte Nova Stadium in Salvador, Brazil – while the still cameras (in this case Canon 7Ds) sit unmanned on a side-by-side rig, automatically collecting shots.

In the 3D realm, the super high resolution will allow us to converge and do digital zooms in post within the time lapse without losing any quality.
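
For anyone wondering how much room that actually buys us, here’s a rough back-of-the-envelope sketch – the 5616 x 3744 frame size is an assumption for a 21.1 MP still, and our 7Ds actually shoot slightly smaller 18 MP frames:

```python
# Rough zoom-headroom estimate for using high-res stills in an HD timeline.
# The 5616 x 3744 frame size is an assumed value for a 21.1 MP still; the
# 7D's 18 MP frames are a little smaller (about 5184 x 3456).

STILL_W, STILL_H = 5616, 3744   # assumed still dimensions (pixels)
HD_W, HD_H = 1920, 1080         # 1080p delivery frame

# Maximum push-in before the still would have to be upscaled:
max_zoom = min(STILL_W / HD_W, STILL_H / HD_H)
print(f"Max lossless digital zoom: ~{max_zoom:.1f}x")   # ~2.9x with these numbers

# Horizontal slack left for reframing or convergence tweaks at a 1:1 crop:
print(f"Horizontal slack at 1:1 crop: {STILL_W - HD_W} px")
```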

Shooting a 3D documentary: how we’re syncing Canon 7Ds (VIDEO)

Yesterday’s post has created some confusion between syncing Canon 7Ds for 3D time lapse photography and syncing them for 3D videography.

I’ve edited it to make clear yesterday’s post is about stills, and am now posting re. video.

To sync Canon 7Ds while shooting video footage for our first 3D documentary, we plan to use the same infrared box systems we’ve rigged to sync our Canon Vixias, and slate our shots so we can sync in post.

I’ve outlined this technique in a previous post for the Vixias. For the 7Ds we:

1) Set both cameras to self-timer/infrared mode, which allows us to use an infrared remote (the Canon RC-5, in this case).

2) Position our box, which is designed to receive any infrared signal and retransmit it through a split cable ending in two infrared heads.

3) Tape the infrared heads at the ends of the split cable over the infrared sensors on the two Canons, then use the remote to start them in sync.

And yes, because of disparities in the internal clock circuitry we don’t get a 100 per cent accurate start and stop – the cameras may not begin recording on the same frame.

Our stereographer, Sean White, has found our 7Ds can be out of sync by one or two frames at 30 FPS.

To get around this we’re doing a physical slate at the start of each shot for our editor – this way he can slip the shots by a frame or two and sync from where the slate is.
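
If you’re curious what that looks like in numbers, here’s a minimal sketch of the slate-offset arithmetic – the frame indices are made-up examples, not values from our actual footage:

```python
# Minimal sketch of slate-based sync at 30 fps: find the frame where the
# slate clap lands in each eye's clip, then trim the head of the clip that
# started earlier so the clap sits on the same frame index in both.
# Frame numbers are made-up examples.

FPS = 30

left_slate_frame = 114    # frame of the clap in the left-eye clip (example)
right_slate_frame = 112   # frame of the clap in the right-eye clip (example)

# The clap lands later in the clip whose camera started earlier.
offset = left_slate_frame - right_slate_frame
print(f"Offset: {abs(offset)} frame(s) (~{abs(offset) / FPS * 1000:.0f} ms)")

if offset > 0:
    print(f"Trim {offset} frame(s) from the head of the LEFT clip")
elif offset < 0:
    print(f"Trim {-offset} frame(s) from the head of the RIGHT clip")
else:
    print("Clips are already frame-aligned")
```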

Sean’s tested this system extensively.

He’s found that once you sync clips at the start they stay in sync for up to 12 minutes straight – much more time than we need to get the types of shots we’re going to capture with the 7Ds.
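
To put that 12-minute figure in perspective, here’s a rough calculation of how closely the two cameras’ clocks would have to agree to stay within half a frame over a take that long – the half-frame tolerance is my own working assumption, not a spec:

```python
# Back-of-the-envelope: how closely must the two cameras' clocks agree for
# a synced pair to stay within half a frame over a 12-minute take?
# The half-frame tolerance is a working assumption, not a spec.

FPS = 30
take_s = 12 * 60               # 12-minute take, in seconds
tolerance_s = 0.5 / FPS        # half a frame at 30 fps (~16.7 ms)

max_relative_drift = tolerance_s / take_s
print(f"Clock rates must match to within ~{max_relative_drift * 1e6:.0f} ppm")
# Works out to roughly 23 parts per million over 12 minutes.
```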

 

Test footage:

 


 

Video: Compositing for a 3D documentary – VFX elements that work in stereo

The 3D VFX pyramid fly-in shot that our compositor’s been working on to test techniques for our first 3D documentary is good to go.

Again, you’ll need the good old red-cyan goggles for this:

Our compositor, Jakub Kuczynski, explains elements in the shot:

And compares working on 3D VFX versus 2D VFX:

This shot will be part of the 3D demo I plan to show broadcasters next week.

Now all I need to do is find a 46-inch JVC monitor to show it on …

Shooting a 3D documentary: how we’re syncing Canon 7Ds (STILLS)

I’ve received a few questions re. how we’re syncing Canon 7Ds we’re using to capture elements of our first 3D documentary … so I’m blogging about it to share with everyone.

We’ve used DSLRs to get high res stills at set intervals for time lapses for years, but, of course, never in 3D.

For this 3D episode of Blowdown, the explosive demolition series we produce, the crew will use 7Ds for these time lapses – and also for establishing shots of the condemned sports stadium in Salvador, Brazil and, of course, the implosion itself.

Here’s how we’ve brought this system into the third dimension:

1) Splice the cable so there are two heads running off one intervalometer.

2) Attach heads to timer remote ports on two Canon 7Ds on side-by-side rig.

3) Sync using the one handheld intervalometer.

Voila!
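
For anyone planning something similar, the interval arithmetic behind these shots is simple – here’s a quick sketch with example numbers (not our actual settings, which will depend on the shot):

```python
# Quick time-lapse planning arithmetic (example numbers, not our real settings).
# Both 7Ds fire off the same intervalometer, so the maths is identical for
# the left and right eye.

interval_s = 10        # one stereo pair every 10 seconds (example)
shoot_hours = 8        # length of the real-world event being covered (example)
playback_fps = 30      # frame rate of the finished piece

pairs = shoot_hours * 3600 / interval_s
screen_seconds = pairs / playback_fps

print(f"{pairs:.0f} stereo pairs -> about {screen_seconds:.0f} s of finished time lapse")
# 8 hours at a 10-second interval = 2,880 pairs, or roughly 96 seconds on screen.
```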

I’ll provide more details re. our time lapse strategy later: wanted to get this bit up ASAP.

Hope it helps.

 

VFX compositing for a 3D documentary: compiling a demo

As we prep to shoot our first 3D documentary, our compositor, Jakub Kuczynski, is getting ready to produce 3D VFX for the show.

We use these photo-real visuals in the explosive demolition series Blowdown – we’ll be filming its next episode soon – mostly for:

1) Locators – help orient the audience and show where the crew is working at a particular point.

2) Disaster scenarios – show the audience what will happen if different aspects of the demolition plan go wrong.

3) Dynamic structure shots – fly-overs, fly-throughs, fly-ins … these allow the audience to experience the strength, grandeur and engineering prowess of the structures that Controlled Demolition Inc. implodes.

Here are two examples from previous shows, Super Stadium and Spyship:

Locator VFX: RCA Dome. Indianapolis, Indiana

Disaster scenario VFX: Hoyt S. Vandenberg. Key West, Florida

As part of this process, Jakub’s generating sample 3D VFX shots so we can assess them and so I can take some on my trip to meet with broadcasters the week after next.

We’ll combine the 3D VFX clips with some of the 3D test footage we’ve shot so I can give the broadcasters a solid visual idea of what we’ve developed so far.

One of Jakub’s shots – a fly-in to pyramids.

Why I’m including this in the 3D demo:

Look forward to the final.

Shooting a 3D documentary: demolition POV – ContourHD versus GoPro HD

3D POV, here we come. We’ve started looking into cameras to shoot POV footage for our first 3D documentary.

This content is a must – there’s nothing like being able to put a camera where no human dares go and capture the scene from that angle.

For example, what’s it like to ride on an excavator boom while it’s ripping bleachers apart?

An excavator demolishing bleachers, the RCA Dome

This is a screen grab from the demolition of the RCA Dome in Indianapolis, Indiana, the subject of a previous Blowdown show – the explosive demolition series we’ll be filming in 3D.

Now imagine part of the grapple in negative space.

But it’s going to take a bit to get there.

The cameras we use have to be:

1) Lightweight enough to be mounted onto a Magic Arm, or duct taped/fastened to something on the machine.

2) Placed at an interaxial distance small enough to film an object within 1 ½ metres of the lens (see the sketch after this list).

3) Workable with a customized side-by-side rig (that we’re apparently going to build … of course).

4) Capable of turning on in sync.
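
On point 2: a common stereography rule of thumb – not a hard spec, and not necessarily how Sean will set the rig – is to keep the interaxial to roughly 1/30 of the distance to the nearest subject. A quick sketch:

```python
# Rough interaxial estimate using the common "1/30 rule" of thumb:
# keep the lens separation to about 1/30 of the distance to the nearest
# subject. A guideline only, not how our rig necessarily gets set.

nearest_subject_m = 1.5                 # closest object to the lenses (metres)
interaxial_m = nearest_subject_m / 30   # rule-of-thumb lens separation

print(f"Suggested interaxial: ~{interaxial_m * 100:.0f} cm "
      f"({interaxial_m * 1000:.0f} mm)")
# ~5 cm (50 mm) for a subject 1.5 m away.
```

At 1 ½ metres that works out to about 5 cm between lens centres, which is exactly why small, less boxy camera bodies matter so much for close-range POV work.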

Our stereographer, Sean White, has narrowed it down to two contenders: the ContourHD 1080p or the GoPro HD.

Suspect the Contours will reign because they’re less boxy, but the GoPro’s a bit better quality – so I’d like to find a way to use it if possible.

We’ll see how the “dare to compare” goes.

Shooting a 3D documentary: anaglyphic test footage – 2 Canon 7Ds on a side-by-side rig

As I’ve mentioned, we’re testing several camera systems out to decide what to use to film our first 3D documentary. We’ve seen promising footage out of our A cam system – an Iconix sensor system with Meuser Optik lenses on a side-by-side rig.

Now our stereographer, Sean White, has brought in B roll he shot with our C cam system: two Canon 7D DSLRs on a side-by-side rig.

Happy to say it’s not looking bad either.

Need a pair of the old school anaglyphic glasses for this one:

There’s only one problem: some of the backgrounds are diverging past our broadcasters’ specs.

So now we’ll have to experiment with different interaxials until we fix the issue.
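
For readers new to the problem: background objects carry positive parallax, and once that parallax is wider on screen than the viewer’s own eye separation, the eyes have to diverge – which is what broadcaster specs guard against. Here’s a quick check with example numbers (the percentages below are placeholders, not our broadcasters’ actual limits or our measured values):

```python
# Quick background-divergence check (example numbers, not our real data).
# Background objects carry positive parallax; once that parallax on screen
# exceeds the viewer's eye separation, the eyes must diverge. Broadcaster
# specs are usually tighter; the 1% cap below is an assumed placeholder.

screen_width_m = 1.02        # a 46-inch 16:9 display is roughly 1.02 m wide
interocular_m = 0.065        # typical adult eye separation

measured_parallax_pct = 1.2  # background parallax as % of frame width (example)
spec_limit_pct = 1.0         # assumed broadcaster cap, % of frame width

parallax_m = screen_width_m * measured_parallax_pct / 100
print(f"Background parallax on screen: {parallax_m * 1000:.0f} mm")

if measured_parallax_pct > spec_limit_pct:
    print("Over spec - reduce the interaxial and retest")
if parallax_m > interocular_m:
    print("Wider than eye separation - viewers would have to diverge their eyes")
```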

In the field – the demo of a condemned sports stadium in Salvador, Brazil for the explosive demolition series Blowdown – this system will be used to capture establishing shots of the structure and, of course, its implosion.

Here’s an earlier video of Sean, walking through the setup.

We’re also honing how to capture our time lapses with this system – details to come.