Compositing a 3D documentary: How to maximize stereoscopic effects

Now that our first 3D documentary is almost completed, compositor Jakub Kuczynski has time to give the lowdown on some of the challenges he faced editing Blowdown.

Challenge 1: Double-rendering

Because 3D is filmed in stereo pairs, every shot has to be rendered twice – once for each eye. For a compositor on tight deadlines, this process can be a painful one.

Challenge 2: Editing in anaglyph

3D compositing in anaglyph can be deceiving because it crushes depth perception.

The tendency is to compensate for this by creating more depth, but sometimes you can overshoot the mark – a discrepancy that becomes obvious when you view the footage on polarized monitors.

An extra step is then needed to make sure the stereo pairs align perfectly, which, of course, means more time and inevitably more stress.
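
For readers who like the nuts and bolts, here's a rough idea of what those checks involve – compositing a red/cyan anaglyph from a stereo pair and flagging vertical misalignment. This is a minimal sketch in Python (assuming OpenCV and NumPy, with hypothetical frame files), not Jakub's actual compositing setup:

```python
# Minimal sketch: red/cyan anaglyph composite plus a rough vertical-alignment check.
# Assumes OpenCV (cv2) and NumPy; "left.png"/"right.png" are hypothetical frame grabs.
import cv2
import numpy as np

left = cv2.imread("left.png")    # left-eye frame (BGR)
right = cv2.imread("right.png")  # right-eye frame (BGR)

# Red/cyan anaglyph: take the red channel from the left eye,
# green and blue from the right eye.
anaglyph = right.copy()
anaglyph[:, :, 2] = left[:, :, 2]  # OpenCV stores channels as B, G, R
cv2.imwrite("anaglyph_preview.png", anaglyph)

# Rough alignment check: phase-correlate the two luma images. Horizontal offset
# is expected (that's the stereo disparity); a noticeable vertical offset means
# the pair is misaligned and needs fixing before depth can be judged.
gray_l = cv2.cvtColor(left, cv2.COLOR_BGR2GRAY).astype(np.float32)
gray_r = cv2.cvtColor(right, cv2.COLOR_BGR2GRAY).astype(np.float32)
(shift_x, shift_y), _ = cv2.phaseCorrelate(gray_l, gray_r)
print(f"Estimated offset between eyes: x={shift_x:.2f}px, y={shift_y:.2f}px")
if abs(shift_y) > 1.0:
    print("Warning: vertical misalignment – realign the pair before judging depth.")
```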

Challenge 3: Finding the happy balance between formats

Until that fine day when everyone is experiencing our documentaries in full stereoscopic glory, it’s important to make sure shots work just as well in 2D as they do in 3D. That in itself is an art.

Here’s an example of one of Jakub’s most technically challenging shots – a 3D within 3D effect composited for Blowdown’s episode on the implosion of the Fonte Nova Stadium in Brazil:

  And here’s how he brought it all to life:

But, of course, mastering challenges like these comes with the rewards of creating 3D VFX everyone can get a kick out of.

Ian Herring, President

@ianherring

3D documentary filmmaking review: Fuji W3 3D camera gets E for effort, but you can’t change the world 12 seconds at a time

Our stereographer Sean White and I had a chance to test out the Fuji FinePix REAL W3 3D camera while I was at the Victoria Film Festival showing some of our 3D documentary material last week.

In brief

First, props to Fuji for putting this little camera on the market. Almost any venture into the third dimension is a good move in our books, and the price point will surely entice more consumers to delve into this exciting new realm.

But from a technical standpoint, much like the Panasonic AG-3DA1 we tested a few weeks back, the Fuji W3 3D camera just doesn’t compare to the quality we can get from using a two-camera system on either a side-by-side or beam-splitter rig … yet.

Like the Panasonic, I suspect this will improve with time.

But there’s another huge dealbreaker: the Fuji W3 3D buffer only allows you to shoot 12 seconds of video at the highest quality settings – completely inadequate for any professional video applications.

In today’s market I expect a camera that shoots not only high-quality stills but high-quality video as well. So, in the end, we took the camera back. Even if Fuji improves the image quality, we won’t even consider buying another one until the video recording capacity also increases … drastically.

The bottom line – props to Panasonic and Fuji for blazing some trails … but in the high-end 3D filmmaking world, two is still better than one.

Ian Herring, President

 @ianherring

The full review

Here’s a more thorough rundown of the camera’s functions, including still photos, courtesy of Sean:

It’s pretty cool that there’s even something like this available for less than $500. Having 3D stills and video in a point-and-shoot, the ability to adjust the convergence, see it instantly on an autostereoscopic display AND fit the whole thing in your pocket is impressive indeed.

However, the quality of the images doesn’t match up to the quality of today’s best 2D compact cameras. Still, I believe this is a major leap forward in terms of making 3D accessible to the masses. The stills quality is definitely acceptable for personal use and the web. The autostereoscopic screen is actually quite sharp and effective once you find the sweet spot. The technology is designed so that the separate left and right views reach the correct eyes at the same time without any glasses, but only from a certain viewing distance and angle.

In this case, the best view is directly in line with the screen, about 30-50cm away. What’s really great about this camera is how it makes 3D believers out of folks who aren’t used to watching much 3D. At a recent film festival, we shot some pics of guests and were able to explain the basics of 3D photography with instant results in the users’ hands. Lots of “wows” and “do that again”. Very cool.

On a technical note, the 75mm interaxial distance is fine for most average shots, but be careful not to get any closer than about 2-3m to your nearest subject or the background will “explode” (a.k.a. massive divergence). There’s an adapter that uses mirrors to effectively reduce the interaxial to about 25mm, but we haven’t tried it yet – check it out.
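
To put a rough number on that rule of thumb, here’s a back-of-the-envelope sketch. It assumes a simple parallel-camera setup converged on the nearest subject by image shift, and the focal length, sensor width and screen widths are illustrative guesses rather than measured W3 specs – but it shows why the background starts to diverge once the nearest subject gets too close:

```python
# Back-of-the-envelope divergence check for a fixed 75 mm interaxial.
# Parallel-camera model, converged on the nearest subject by shifting the images.
# Focal length, sensor width and screen widths are illustrative assumptions,
# not measured Fuji W3 specs.
INTERAXIAL_MM = 75.0      # lens spacing on the camera
EYE_SEPARATION_MM = 65.0  # typical adult interocular distance
FOCAL_MM = 6.3            # assumed wide-angle focal length
SENSOR_WIDTH_MM = 6.2     # assumed sensor width

def background_parallax_mm(nearest_subject_mm, screen_width_mm):
    """Screen parallax of a distant background after converging on the nearest
    subject. Anything beyond eye separation forces the eyes to diverge - the
    background 'explodes'."""
    sensor_parallax = FOCAL_MM * INTERAXIAL_MM / nearest_subject_mm
    return sensor_parallax / SENSOR_WIDTH_MM * screen_width_mm

for screen_mm in (1000.0, 2000.0):        # roughly a 46" TV vs. a small projection screen
    for subject_m in (1.0, 2.0, 3.0):
        p = background_parallax_mm(subject_m * 1000.0, screen_mm)
        verdict = "diverges" if p > EYE_SEPARATION_MM else "ok"
        print(f"screen {screen_mm / 1000:.0f} m wide, nearest subject {subject_m:.0f} m: "
              f"background parallax {p:.0f} mm -> {verdict}")
```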

On the downside, the camera only shoots 720p video, not 1080p, which is fine for the web, but unfortunately the video quality again is not as good as that of other 2D compact cameras out there. The big heartbreaker was that the buffer in the camera only allows a maximum of 12 seconds of video when recording at the highest settings. Lame.

Also, there’s no software support for Macintosh, which is what we’re all using, so I haven’t had a chance to try any post-production functions. The camera does connect fine to a 3D monitor; however, it won’t display the saved converged files, so you’re not really seeing the proper 3D shots.

I applaud Fuji for packing this all together, especially with the autostereoscopic screen, but the video quality, video buffer and interface tools need to be improved big time.

Sean White

Watching a 3D documentary: how to tell the difference between good 3D and bad 3D

It’s easy to tell when a 3D viewing experience goes sideways – sore eyes, headaches and a general feeling of awkwardness are unmistakable indicators.

Obviously we want to avoid these issues with our 3D content. And to fully understand how to do it right, you’ve got to understand how it’s being done wrong.

So what makes bad 3D so bad?

Digital Cinema Report has posted an in-depth article, “How to Critique 3D”, on how to identify the good, the bad and the ugly:

How to critique stereo capture 3D

-Your eyes should focus easily and naturally when viewing in stereo. If you are getting headaches or your eyes cannot focus, improper alignment is the cause (DCR tip: Take off your glasses and try to spot an area of high contrast. You may see that a bright spot is a little higher for one eye than the other.)

-When items on screen are glowing or have an unnatural sheen it is due to exposure differences between the two cameras. It could be unmatched exposure or reflection issues with the beam splitter rig.

-Keep an eye open for bright objects. Glints, lens flares and spotlights create more technical issues that have to be considered during stereo capture. Glints off of metal objects can be messy and appear to be a different shape in each eye. Lens flares will “invert” and pull away from the viewer, which can be visually confusing. Spotlights can create star patterns that rotate differently in each eye making it uncomfortable to view.

-Try to determine if there is too much depth onscreen. Some say it is perfectly acceptable for backgrounds to be out of focus. Others maintain that if the viewer cannot easily focus on distant objects there is too much divergence. (DCR tip: Look for distant objects like mountains or spotlights; if the doubled image of the mountain appears separated by many inches or feet, the background is probably too far away.)

Watching anaglyphic 3D

When we attempted to post a 3D clip from the Panasonic 3DA1 camera on YouTube, the goal was to let everyone judge the quality for themselves … ideally by viewing the footage in stereoscopic 3D.

But if you can only view the material in anaglyphic 3D and 2D, all is not lost – use the anaglyphic 3D mode to analyze depth and the 2D mode to analyze footage quality.

Last but not least

If one of your 3D glasses’ lenses looks smudged, it’s likely that the focus, zoom or shutters were not properly synced during filming.

Or maybe it’s just popcorn butter.

Ian Herring, President

@ianherring

From CineForm to Blu-ray: how to prep and ship a 3D documentary for broadcaster review

The time has finally come for a full version of Blowdown 3D to be launched out into the world.

We are shipping the fine cut of our first 3D documentary to broadcasters this week.

And this time they’ll have the chance to actually see it in 3D.

It’s an obvious step – if you’re paying for 3D content chances are you’ll want to check it out at some point before you sign off on the show.

But with this emerging technology, even obvious steps are rarely easy.

So how did we get a cut out for the broadcasters to experience a full color 3DHD explosive demolition from the comfort of their plush office chairs?

First of all, our editor Brian Mann had to make sure all the shots in the cut were 3D legal and correct any colour discrepancies between the two eyes.

To create a comfortable 3D viewing experience, we also monitored the LR convergence throughout the edit.

This made for more work – dealing with issues that either didn’t exist before (converging shots) or that would have been tackled further along in the editing process (for example colour correction after picture lock).
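
For a sense of what checking colour discrepancies between the two eyes actually involves, here’s a minimal sketch that flags left/right frames whose overall colour balance has drifted apart. It’s an illustration only – Python with OpenCV and NumPy and hypothetical frame grabs, not Brian’s Final Cut workflow:

```python
# Minimal sketch: flag colour drift between the two eyes of a stereo pair.
# Assumes OpenCV and NumPy; "left.png"/"right.png" are hypothetical frame grabs.
import cv2
import numpy as np

left = cv2.imread("left.png").astype(np.float32)
right = cv2.imread("right.png").astype(np.float32)

# Compare the mean level of each colour channel (OpenCV order: blue, green, red).
# A large per-channel difference usually means one eye needs a colour-balance
# pass before the pair will fuse comfortably.
diff = left.mean(axis=(0, 1)) - right.mean(axis=(0, 1))
for name, d in zip(("blue", "green", "red"), diff):
    flag = "  <-- check this channel" if abs(d) > 3.0 else ""   # arbitrary threshold
    print(f"{name:>5}: left minus right = {d:+.1f} (8-bit levels){flag}")
```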

But anyone who’s in the 3D business knows more work is the name of the game.

To send this 3D version of the show to our broadcasters, we opted for two different delivery formats: a side-by-side 3DHD version of the show on Blu-ray disc and a digital anaglyph SD version.

To create the Blu-ray disc, we exported the CineForm 3D file from Final Cut Pro, burned it onto the disc via Adobe Encore – which, unlike FCP’s export functions, allowed us to create more professional, customized menus – and shipped it via courier.

Now all you need to view this version is any Blu-ray player (it doesn’t have to be 3D) and a 3D-enabled monitor.
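
For anyone unfamiliar with the format, “side-by-side” simply means both eyes share a single video frame, each squeezed to half width, and the 3D display stretches them back out. Here’s a minimal sketch of the idea in Python with OpenCV and hypothetical file names – our actual export went through CineForm and Final Cut Pro, not a script:

```python
# Minimal sketch of side-by-side frame packing: each eye is squeezed to half
# width and the two halves share one HD frame. A 3D-enabled display stretches
# them back out and routes one half to each eye.
# Assumes OpenCV; "left.png"/"right.png" are hypothetical full-HD frame grabs.
import cv2

FRAME_W, FRAME_H = 1920, 1080

left = cv2.imread("left.png")
right = cv2.imread("right.png")

half_l = cv2.resize(left, (FRAME_W // 2, FRAME_H))   # squeeze the left eye to half width
half_r = cv2.resize(right, (FRAME_W // 2, FRAME_H))  # same for the right eye
sbs = cv2.hconcat([half_l, half_r])                  # left eye on the left half

cv2.imwrite("side_by_side_frame.png", sbs)
```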

The anaglyph version was exported as a .mov file straight from FCP and then uploaded to our server. It’s a seriously inferior experience, but it will still give those unfortunate enough not to have a 3DTV handy a way to see it in some version of 3D.

Though you might want to get on that, dear readers. I’d hate to see you live in the flatlands of 2D when the 3D world is just within your reach.

As for the broadcasters … maybe someone should warn them – because they’re in for quite a ride.

Ian Herring, President

@ianherring

Watching a 3D documentary: Tips On Creating the Best 3D Television Experience

As 3D televisions (and eventually our first 3D documentary) make their way into living rooms near you, it’s time to lay down some helpful tips I found online to ensure you are getting the most out of your in-house 3D experience.

Ambience

In order to create the optimal 3D experience, you first have to create the environment for it to happen.

Dim the lights, cover the windows – black them out if possible. Essentially create yourself a tiny black hole.

By reducing ambient light you will eliminate ghosting and double vision that often botch the 3D experience.

When in doubt, darkness is always best.

Viewing Position

Until 3D televisions become 360 degrees of polarized celluloid, we’ll have to make do with flat screens that are better viewed head-on rather than from an angle.

Best then to keep the family or group of friends small so everyone gets a piece of the 3D pie.

Television Settings

Because 3D glasses are tinted, they’ll dim the movie at the default settings made for 2D viewing. So make sure to customize your settings and increase screen brightness.

3D is not normal television viewing—don’t forget it.

HD Screen 

This goes without saying, but I’m going to say it anyway: 3D is a viewing format that belongs on an ample-sized, high-def screen.

The bigger the screen the richer your experience of the imagery—think GIANT pop-up book vs. OK pop-up card.

The only other thing I’ll demand from here on in when it comes to 3D screens is that they be dust- and fingerprint-free.

Ian Herring, President

@ianherring

Watching a 3D documentary: Blowdown 3D hits the Victoria Film Festival

 3D road trip!

Friday night I packed up and headed over to Vancouver Island for the Victoria Film Festival.

The mission: roll out a 15-minute show and tell about 3D cable TV to the film types.

The gear: 500 pairs of polarized glasses (think big, right?) and our 46-inch JVC HD 3D LCD monitor.

About 70 people showed up to crowd around and check out the latest and greatest in home entertainment.

What I shared with the crowd

How they felt about the experience

The final word: it was great to show some of our 3D documentary material off.

Reactions were mixed, but whether the VFF-goers praised it or panned it, it was clear that my little road trip stirred up a nice dose of stereoscopic buzz.

It’s also clear that our 3D monitor travels exceptionally well strapped to a bike rack … bonus.

On to the next adventure.

Ian Herring, President

@ianherring

YouTube 3D? Failed attempts to view clips on a polarized monitor raise doubts about functionality, access to content

Super frustrating news: after some serious testing, it seems that YouTube 3D-enabled videos do not decode properly for playback on our 3D monitor when sourced by a MacBook Pro or PS3.

For the majority of people this may seem like an inconsequential technological glitch.

But for me, it’s a huge setback in our mission to get top-notch 3D content out into the universe.

Where we’re at

After we test-shot the Panasonic 3DA1 camera a few weeks back, I uploaded 2D versions of some of the footage we captured to YouTube.

But after seeing the same footage in stereoscopic 3D I just wasn’t satisfied – I wanted to give people with 3D-enabled devices the chance to not only experience the stuff we shot, but to judge its quality for themselves.

So I had my editing team upload a side-by-side version of interior shoot selects to YouTube:

It seemed like a simple plan – put the videos on and set them to 3D.

We managed to get anaglyph playback working fine, with the correct aspect ratio and eye orientation. We then pushed forward to see if modern stereoscopic 3D methods worked.

They didn’t. When we tried to play the clips using half-width side-by-side 3D, they were a no-go on our JVC GD-463D10U monitor. We tried using both a MacBook Pro with DVI to HDMI and a PS3 with HDMI to HDMI.

We also tried to play three other 3D-enabled videos on YouTube that were uploaded by other people and came up against the same issue. I’ve embedded these at the end of the post if you’ve got 3D-enabled gear and would like to give them a go.

What’s going wrong?

From what my team can tell, the problem seems to be that YouTube does not map the pixels properly for TV playback.

A huge caveat – and my call out to the 3D-enabled – this is not to say the YouTube 3D function definitely doesn’t work. It just doesn’t seem to work on the equipment that we have access to – a Mac computer, a PS3, and a passive filter 3DTV.

More equipment than 99.999999 per cent of the world has … but still. Our tests were not exhaustive.

The next move

This setback has put me in a difficult position. YouTube has the potential to offer a free 3D online playback solution that’s more comprehensive than anything else. And I really want to get our stuff on there so people can check it out – especially since more and more consumers are buying 3D-enabled viewing devices.

But YouTube just isn’t working for us. So where do I go from here?

We could host future 3D content on our own web server or perhaps on another web video community like Vimeo, but doing so would seriously cripple our reach. Missing out on YouTube is clearly a doozy when it comes to exposure … it’s one of the top-searched sites in the world.

And there’s another downside: neither our web server nor Vimeo offers the ability to toggle between different 3D delivery formats the way YouTube (in theory) can. This means we would have to render out and upload many different versions of the same video – more work for my team.

We can do further YouTube tests and see if we can work around the issues we’ve encountered so far. But this means potentially re-encoding and uploading new videos. It will probably take a great deal of time to invent workarounds, render new files and upload the new tests – especially since my edit crew also has to meet the demands of film projects currently in production.

Also, further testing at this point feels like a gamble. As far as we can tell, Google offers very limited support for the feature, so our only option is to try, and try some more – either to find something undocumented that works, or to discover that it really doesn’t work after all … at least not yet.

The other option is to abandon YouTube. That means giving up any hope of benefiting from having our 3D content on the mega-site – at least until the 3D feature becomes more mainstream and (hopefully) functional.

Or finally, the wild card option: you, dear reader, have successfully watched this type of footage on YouTube, have the magic solution to this seriously irritating problem, and can’t wait to share it with us …

Ian Herring, President

@ianherring

 

YouTube 3D-enabled videos

 

 

 

Video: Editing a 3D documentary – how CineForm improves VFX post-production flow

CineForm NeoHD and Neo3D have been staples in our editing suites since we started our first 3D documentary project.

There were some hiccups when we first got the software, but we worked out the kinks and it’s been pretty solid ever since.

Now our compositor has added the program to his arsenal – and it’s paying off once again.

Before incorporating CineForm, our VFX team gave our editor ProRes videos. These then had to be transcoded into CineForm files and muxed – two extra steps for our editor on every single video.

Now, he can read, write and export CineForm 3D files in Adobe After Effects, and deliver them – already muxed – directly to the editor.

Our compositor, Jakub Kuczynski, on the CineForm workflow:

On how it helps the 3D editing process – muxed files, compatibility with Final Cut Pro, and high visual fidelity compression:

And on how it benefits the 3D compositor:

The gist: there just isn’t anything else on the market that’s priced right and gets the job done like CineForm – in edit and in VFX.

And in this 3D business, any step taken to reduce time and improve workflow is a step in the right direction.

Ian Herring, President

@ianherring

Watching a 3D documentary: 3DTV study shows that seeing is believing

I’ve come across an interesting experiment that captured consumer reaction to the 3DTV experience.

The Nielsen Company invited consumers in Las Vegas, Nevada to watch a 30-minute 3D reel featuring sports, nature, comedy, a music concert, movies and video games and then weigh in on the content.

No surprise – the majority said it was better than 2D. But what’s really interesting is why. They didn’t only like what they saw … they liked how the 3DTV experience made them feel.

Here’s what they found:

-6 out of 10 participants agreed the 3D content was better than their current 2DTV viewing

-48% found 3DTV more engaging

-57% found 3DTV made them ‘feel like they were part of the action’

-48% felt ‘closer to the characters’

These reactions speak to the immersive nature of 3D television. The journey into the third dimension is not only visual – it’s emotional.

For example, our editor Brian Mann and I were checking out a 3D stock footage demo of a waterfall a week or so ago. As I sat in my chair and watched the water flow over the rocks, it was like I was sitting in the middle of a forest right next to it, about to toss a rock into the stream.

I’ve seen a lot of waterfall B roll in my life … but I’ve never felt like that.

More and more people are demanding this superior experience from their home entertainment.

Here’s what the Las Vegas participants wanted to see more of:

Nielsen study: participants were exposed to 30 minutes of 3D content in Las Vegas, Nevada.

Another Nielsen survey asked 27,000 online consumers across 53 countries whether they currently owned a 3DTV or would purchase one in the next 12 months.

The good news for 3DTV:

-15% ‘probably will purchase’

-9% ‘definitely will purchase’

-4% already own a 3DTV

But there’s a ways to go:

-21% are still undecided

-19% ‘probably won’t purchase’

-33% ‘will definitely not purchase’

Many consumers are still skeptical about 3DTV. It’s a fact. And it’s a fact that we shouldn’t ignore.

But look at how far we’ve come in just a year.

Last January it would have been next to impossible to purchase a 3D TV for your home. Fast forward to the 2010 holiday season, when 3D home entertainment systems popped up in every major electronics store.

And availability isn’t the only thing that’s improving: the technology out there is getting better and cheaper.

What the future holds…

As 3DTV technology evolves and accessibility continues to grow, so does the opportunity for great storytelling in a whole new dimension.

And the more people who discover how meaningful the experience is, the more momentum 3DTV will gain.

Ian Herring, President

@ianherring

Editing a 3D documentary: where to find high-quality stereoscopic stock footage

Yet another sign that 3D entertainment is gaining momentum – an expanding library of high-quality stereoscopic stock footage is available online.

This is great news for 3D production houses like ours.

We can only include a certain amount of 2D footage in the 3D shows we’re delivering – sometimes as little as 2-3 per cent per show, depending on the broadcaster.

For our visual effects, this isn’t an issue – we can convert 2D VFX shots into 3D. However, stock footage is a completely different matter.

Where to find it:

Our editor, Brian Mann, recently came across Artbeats, an online source for royalty-free stock footage in high quality stereoscopic 3D. This type of footage could allow us to fill visual gaps and transition between scenes while keeping the show as 3D as possible.

We tested Artbeats’ free download of a waterfall in S3D. It looked fantastic and lived up to their promise of high-quality 3D stock footage.

Still image of Artbeats waterfall download. View with Red/Cyan glasses for full 3D effect.

3D stock options:

Artbeats’ clips are available in S3D HD and S3D 4K formats. Predominantly, their footage has been shot on RED ONE or RED MS using a stereo rig. There is not an extensive range of categories… yet. Mostly they feature aerials, animals and nature.

On the upside, new content is added monthly and will soon include pyrotechnics, new city scenes, establishing shots, winter scenes and additional aerial collections shot on RED Epic cameras using a beam-splitter rig.

Metadata provided by Artbeats:

– positive parallax percentage (the customer has the option to position and crop to set convergence)

– interocular separation measurements

– maximum screen display size (anywhere from 42” televisions to 42’ movie screens – see the quick check after this list)

– frame rate (24p, 25p and most in 30p)

– clip length (5-60+ seconds)
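
The parallax percentage and the maximum screen size in that list are two sides of the same coin: a clip that’s comfortable on a living-room TV can force the eyes to diverge on a cinema screen. Here’s a quick sanity check using an illustrative parallax figure, not an actual Artbeats number:

```python
# Quick check of how a clip's positive-parallax percentage limits screen size.
# A parallax quoted as a percentage of image width becomes a physical offset on
# screen; once it exceeds the ~65 mm between our eyes, the eyes must diverge.
# The 1.5% figure is an illustrative assumption, not taken from Artbeats metadata.
EYE_SEPARATION_MM = 65.0
POSITIVE_PARALLAX_PCT = 1.5   # hypothetical value from a clip's metadata

for label, screen_width_mm in (('42" television', 930.0), ("42' cinema screen", 12800.0)):
    parallax_mm = POSITIVE_PARALLAX_PCT / 100.0 * screen_width_mm
    verdict = "diverges" if parallax_mm > EYE_SEPARATION_MM else "comfortable"
    print(f"{label}: background parallax {parallax_mm:.0f} mm -> {verdict}")
```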

What it will cost you:

Prices range from $449-$799 USD for left/right and side-by-side formats. Some rights-managed clips have a higher sticker price, so be sure to check the fine print. They also sell the RAW (.R3D) file of a RED clip for an extra $100.

Artbeats is right on the pulse of 3D accessibility. We haven’t purchased anything yet, but we will certainly keep them in mind as we move forward.

Ian Herring, President

@ianherring