-Your eyes should focus easily and naturally when viewing in stereo. If you are getting headaches or your eyes cannot focus, improper alignment is the likely cause. (DCR tip: Take off your glasses and try to spot an area of high contrast. You may see that a bright spot is a little higher for one eye than the other.)
-When items on screen are glowing or have an unnatural sheen, it is usually due to exposure differences between the two cameras. It could be unmatched exposure settings or reflection issues with the beam splitter rig.
-Keep an eye open for bright objects. Glints, lens flares and spotlights create more technical issues that have to be considered during stereo capture. Glints off metal objects can be messy and appear to be a different shape in each eye. Lens flares will “invert” and pull away from the viewer, which can be visually confusing. Spotlights can create star patterns that rotate differently in each eye, making them uncomfortable to view.
-Try to determine if there is too much depth onscreen. Some say it is perfectly acceptable for backgrounds to be out of focus. Others maintain that if the viewer cannot easily focus on distant objects, there is too much divergence. (DCR tip: Look for distant objects like mountains or spotlights; if the doubled image of the mountain appears separated by many inches or feet, the background is probably too far away.)
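The divergence check in that last tip can be roughed out with simple arithmetic: positive (background) parallax on screen should stay at or below the human interocular distance, roughly 65 mm, or the viewer’s eyes are forced to diverge. Here’s a minimal sketch of the idea; the function names and numbers are ours for illustration, not part of any production tool:

```python
# Rough divergence check: convert measured pixel parallax into
# physical on-screen separation, then compare it to the human
# interocular distance (~65 mm). Positive parallax beyond that
# forces the eyes to diverge, which viewers cannot do comfortably.

INTEROCULAR_MM = 65.0  # approximate adult eye separation

def screen_parallax_mm(parallax_px, screen_width_px, screen_width_mm):
    """Physical separation of the doubled image on a given screen."""
    return parallax_px * (screen_width_mm / screen_width_px)

def diverges(parallax_px, screen_width_px, screen_width_mm):
    """True if background parallax exceeds the interocular distance."""
    sep = screen_parallax_mm(parallax_px, screen_width_px, screen_width_mm)
    return sep > INTEROCULAR_MM

# A 10 px background offset is fine on a 1920 px wide, 1 m monitor...
print(diverges(10, 1920, 1000))   # ~5 mm  -> False
# ...but 20 px on a 10 m cinema screen is ~104 mm -> divergence.
print(diverges(20, 1920, 10000))  # True
```

The same pixel offset that is comfortable on a monitor can be painful on a theatre screen, which is why depth budgets depend on the intended display size.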
And this time they’ll have the chance to actually see it in 3D.
It’s an obvious step – if you’re paying for 3D content chances are you’ll want to check it out at some point before you sign off on the show.
But with this emerging technology, even obvious steps are rarely easy.
So how did we get a cut out to the broadcasters so they could experience a full colour 3DHD explosive demolition from the comfort of their plush office chairs?
First of all, our editor Brian Mann had to make sure all the shots in the cut were 3D legal and correct any colour discrepancies between the two eyes.
To create a comfortable 3D viewing experience, we also monitored the LR convergence throughout the edit.
This made for more work – dealing with issues that either didn’t exist before (converging shots) or that would have been tackled further along in the editing process (for example colour correction after picture lock).
To send this 3D version of the show to our broadcasters, we opted for two different delivery formats: a side-by-side 3DHD version of the show on Blu-ray disc and a digital anaglyph SD version.
To create the Blu-ray disc, we exported the CineForm 3D file from Final Cut Pro, burned it to disc via Adobe Encore (which, unlike FCP’s export functions, allowed us to create more professional, customized menus) and shipped it via courier.
Now all you need to view this format is any Blu-ray player (it doesn’t have to be 3D) and a 3D-enabled monitor.
The anaglyph version was exported as a .mov file straight from FCP and then uploaded onto our server. It’s a seriously inferior experience, but will still give those who are unfortunate enough to not have a 3DTV handy a way to see it in some version of 3D.
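For readers curious what an anaglyph encode actually does under the hood: the classic red/cyan method keeps the red channel from the left eye and the green and blue channels from the right eye, so the coloured glasses route each eye its own image. A toy sketch on (R, G, B) pixel tuples – this is the general technique, not the actual FCP export path, which handles the mix internally:

```python
# Toy red/cyan anaglyph mix: one (R, G, B) pixel per eye in.
# Red comes from the left eye, green and blue from the right,
# so red/cyan glasses deliver each eye its own image.

def anaglyph_pixel(left, right):
    lr, _, _ = left     # keep red from the left eye
    _, rg, rb = right   # keep green + blue from the right eye
    return (lr, rg, rb)

def anaglyph_frame(left_frame, right_frame):
    """Apply the mix pixel-by-pixel to two same-sized frames (lists of rows)."""
    return [
        [anaglyph_pixel(l, r) for l, r in zip(lrow, rrow)]
        for lrow, rrow in zip(left_frame, right_frame)
    ]

left = [[(200, 10, 10), (200, 10, 10)]]
right = [[(10, 150, 150), (10, 150, 150)]]
print(anaglyph_frame(left, right))
# [[(200, 150, 150), (200, 150, 150)]]
```

Because each eye only gets part of the colour spectrum, colour fidelity suffers badly – which is exactly why the anaglyph deliverable is the “seriously inferior experience” of the two.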
Though you might want to get on that, dear readers. I’d hate to see you live in the flatlands of 2D when the 3D world is just within your reach.
As for the broadcasters … maybe someone should warn them, because they’re in for quite a ride.
But after seeing the same footage in stereoscopic 3D I just wasn’t satisfied – I wanted to give people with 3D-enabled devices the chance to not only experience the stuff we shot, but to judge its quality for themselves.
So I had my editing team upload a side-by-side version of interior shoot selects to YouTube:
It seemed like a simple plan – put the videos on YouTube and set them to 3D.
We managed to get anaglyph playback working fine, with the correct aspect ratio and eye orientation. We then pushed forward to see if modern stereoscopic 3D methods worked.
They didn’t. When we tried to play the clips using half-width side-by-side 3D, they were a no-go on our JVC GD-463D10U monitor. We tried using both a MacBook Pro with DVI to HDMI and a PS3 with HDMI to HDMI.
We also tried to play three other 3D-enabled videos on YouTube that were uploaded by other people and came up against the same issue. I’ve embedded these at the end of the post if you’ve got 3D-enabled gear and would like to give them a go.
What’s going wrong?
From what my team can tell, the problem seems to be that YouTube does not map the pixels properly for TV playback.
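Our best guess at the mechanics – and this is an assumption on our part, not a confirmed diagnosis: a half-width side-by-side frame packs a squeezed left eye into the left half and a squeezed right eye into the right half, and the TV unpacks by splitting each row at the exact midpoint. That only works if the player delivers the frame pixel-for-pixel; any rescaling, letterboxing or shifting pushes columns across the midpoint and contaminates both eyes. A toy sketch of the packing and of how a one-pixel shift breaks it:

```python
# Half-width side-by-side packing: squeeze each eye to half width
# (crudely, by dropping every other column) and place the halves in
# one frame. A 3DTV unpacks by splitting each row at the midpoint,
# so playback must preserve a 1:1 pixel mapping.

def squeeze_half(frame):
    """Keep every other column -- a crude 2:1 horizontal squeeze."""
    return [row[::2] for row in frame]

def pack_half_sbs(left, right):
    lh, rh = squeeze_half(left), squeeze_half(right)
    return [lrow + rrow for lrow, rrow in zip(lh, rh)]

def unpack_half_sbs(frame):
    """What the TV does: split each row at the exact midpoint."""
    mid = len(frame[0]) // 2
    return ([row[:mid] for row in frame], [row[mid:] for row in frame])

left = [["L"] * 8]
right = [["R"] * 8]
packed = pack_half_sbs(left, right)
print(packed)  # [['L', 'L', 'L', 'L', 'R', 'R', 'R', 'R']]

# Simulate a player shifting the frame by one pixel (bad mapping):
shifted = [["X"] + row[:-1] for row in packed]
l_eye, r_eye = unpack_half_sbs(shifted)
print("L" in r_eye[0])  # True -> left-eye pixels leak into the right eye
```

If this model is right, it would explain why the anaglyph path (which tolerates scaling) worked while half-width side-by-side did not.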
A huge caveat – and my call out to the 3D-enabled – this is not to say the YouTube 3D function definitely doesn’t work. It just doesn’t seem to work on the equipment that we have access to – a Mac computer, a PS3, and a passive filter 3DTV.
More equipment than 99.999999 per cent of the world has … but still. Our tests were not exhaustive.
The next move
This setback has put me in a difficult position. YouTube has the potential to offer a free 3D online playback solution that’s more comprehensive than anything else. And I really want to get our stuff on there so people can check it out – especially since more and more consumers are buying 3D-enabled viewing devices.
But YouTube just isn’t working for us. So where do I go from here?
We could host future 3D content on our own web server or perhaps on another web video community like Vimeo, but doing so would seriously cripple our reach. Missing out on YouTube is clearly a doozy when it comes to exposure … it’s one of the top-searched sites in the world.
And there’s another downside: neither our web server nor Vimeo offers the ability to toggle between different 3D delivery formats the way YouTube (in theory) could. This means we would have to render out and upload many different versions of the same video – more work for my team.
We can do further YouTube tests, and see if we can work around the issues we’ve encountered so far. But this means potentially re-encoding and uploading new videos. It will probably require a great deal of time to invent workarounds, render new files, and upload the new tests – especially since my edit crew also has to meet the demands of film projects that are currently in production.
Also, further testing at this point feels like it could be a gamble. As far as we can tell, Google has very limited support for the feature, so our only option is to try, and try some more – either to discover something undocumented that works, or to find out that it really doesn’t work after all, at least not yet.
The other option is to abandon YouTube. That means giving up the benefits of having our 3D content on the mega site – at least until the 3D feature becomes more mainstream and (hopefully) functional.
Or finally, the wild card option: you, dear reader, have successfully watched this type of footage on YouTube, have the magic solution to this seriously irritating problem, and can’t wait to share it with us …
There were some hiccups when we first got the software, but we worked out the kinks and it’s been pretty solid ever since.
Now our compositor has added the program to his arsenal – and it’s paying off once again.
Before incorporating CineForm, our VFX team handed our editor ProRes videos. These then had to be transcoded into CineForm files and muxed – two extra steps for our editor, for each video, every single time.
Now, he can read, write and export CineForm 3D files in Adobe After Effects, and deliver them – already muxed – directly to the editor.
I’ve received a few questions re. how we’re syncing the Canon 7Ds we’re using to capture elements of our first 3D documentary … so I’m blogging about it to share with everyone.
We’ve used DSLRs to get high res stills at set intervals for time lapses for years, but, of course, never in 3D.
For this 3D episode of Blowdown, the explosive demolition series we produce, the crew will use 7Ds for these time lapses – and also for establishing shots of the condemned sports stadium in Salvador, Brazil and, of course, the implosion itself.
Here’s how we’ve brought this system into the third dimension:
1) Splice the cable so there are two heads on one intervalometer.
2) Attach the heads to the timer remote ports of two Canon 7Ds on a side-by-side rig.
3) Sync using the single handheld intervalometer.
I’ll provide more details re. our time lapse strategy later; I wanted to get this bit up ASAP.