I’m looking forward to seeing Tim Burton’s Alice In Wonderland. Not for the obvious reasons, such as the coolness that a 3D Stereoscopic Wonderland promises or even the sense of awe spurred by Burton’s aesthetic craziness.

I’m really into what’s going on behind the curtain.

Looking around a little, I dug up some very cool green screen technology that, while not cutting-edge new, is still pretty neat: a portable, suitcase-sized pre-visualization system that accurately merges live-action foreground elements with computer-generated virtual background elements in real time. It’s been getting a workout on several high-profile television series (‘V’, the recently exhumed ‘Knight Rider’) and, of course, ‘Alice In Wonderland’. Inventor and MIT alum Eliot Mack, the guy responsible for bringing this thing into the world, has developed a system that allows directors and cinematographers to view accurate renditions of how a final shot will look in high definition, as opposed to actors in front of a wall of green screen or a rough pre-comp assembly. He calls it PREVIZION, and it’s got a huge amount of promise, albeit a rather hefty price tag to boot (for now). Here’s the skinny, below.

To fully appreciate the system’s benefit on set, consider the typical visual effects process (in layman’s terms, for brevity). A shot is recorded in front of green screen, digitized, and uploaded to the VFX pipeline. VFX artists key out the green screen background, then identify camera tracking and calibration points. Depending on the complexity, this can take days for a single shot. Next, VFX artists drop in background elements (skies, previously recorded live action footage, etc.) and composite these elements together into a unified shot, taking anywhere from hours to weeks. Previzion handles the tracking half of that work as the shot is filmed: utilizing camera-mounted sensors, it keeps tabs on what the camera is doing, where it’s positioned, which direction it’s pointing, and what it’s focused on. Accurate tracking data is thus generated on set, ready for introduction into the VFX pipeline. Meanwhile, the merged image is viewed on a high-definition monitor, enabling the director to adjust actors’ positions and tweak the lighting to further enhance realism in the shot (making the live action on set look like it belongs in the virtual world of the background).
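If you’re curious what the keying-and-compositing step actually boils down to, here’s a toy sketch in NumPy. This is my own made-up illustration (a crude green-dominance key with an arbitrary threshold), nowhere near a production keyer like the ones the VFX artists use:

```python
import numpy as np

def chroma_key_composite(fg, bg, threshold=40):
    """Composite a green-screen foreground over a background.

    fg, bg: HxWx3 uint8 RGB arrays of the same shape.
    A pixel counts as 'green screen' when its green channel exceeds
    both red and blue by more than `threshold` (a made-up heuristic).
    """
    fg16 = fg.astype(np.int16)  # signed math so subtraction can't wrap
    green_dominance = fg16[..., 1] - np.maximum(fg16[..., 0], fg16[..., 2])
    mask = green_dominance > threshold           # True where the screen shows
    return np.where(mask[..., None], bg, fg)     # screen pixels -> background

# Tiny synthetic frame: two green-screen pixels, two foreground pixels.
fg = np.array([[[0, 255, 0], [200, 50, 50]],
               [[10, 240, 5], [30, 30, 30]]], dtype=np.uint8)
bg = np.full((2, 2, 3), 128, dtype=np.uint8)     # flat grey background

result = chroma_key_composite(fg, bg)
# The green pixels become grey; the actor pixels survive untouched.
```

Real keyers spend their effort on exactly what this sketch ignores: soft edges, hair, spill, and motion blur.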

When the camera pans, tilts, or zooms in or out, the background imagery adjusts to match the camera view. This goes for focus shift as well. Seeing is truly believing. Have a look at the system in action HERE.
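To get a feel for what “the background adjusts to match the camera view” means, here’s a hypothetical pinhole-camera sketch (my own toy math, not Previzion’s actual solver): given the pan, tilt, and focal length the sensors report, you can work out where any point of the virtual set should land in the frame.

```python
import numpy as np

def project_point(point_world, pan_deg, tilt_deg, focal_px,
                  width=1920, height=1080):
    """Project a 3D point of the virtual set into the camera image.

    Pinhole-camera toy model: the camera sits at the origin, `pan_deg`
    rotates it about the vertical (y) axis, `tilt_deg` about the
    horizontal (x) axis, and `focal_px` is the focal length in pixels
    (zooming in = larger focal length). All names are my own.
    """
    pan, tilt = np.radians([pan_deg, tilt_deg])
    # Rotation taking world coordinates into camera coordinates.
    ry = np.array([[np.cos(pan), 0, -np.sin(pan)],
                   [0, 1, 0],
                   [np.sin(pan), 0, np.cos(pan)]])
    rx = np.array([[1, 0, 0],
                   [0, np.cos(tilt), np.sin(tilt)],
                   [0, -np.sin(tilt), np.cos(tilt)]])
    x, y, z = rx @ ry @ np.asarray(point_world, dtype=float)
    if z <= 0:
        return None                        # point is behind the camera
    u = width / 2 + focal_px * x / z       # pixel column
    v = height / 2 - focal_px * y / z      # pixel row
    return u, v

# A virtual-set point straight ahead lands at the image centre...
u, v = project_point([0, 0, 10], pan_deg=0, tilt_deg=0, focal_px=1000)
# ...and panning the camera slides it across the frame, which is
# exactly the adjustment the background imagery has to make.
u2, v2 = project_point([0, 0, 10], pan_deg=5, tilt_deg=0, focal_px=1000)
```

Run that per frame for every background pixel (plus lens distortion and a depth-of-field blur for the focus shift) and you have the rough shape of a real-time pre-vis renderer.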
It’ll be interesting to see how long this technology takes to work its way into the lower budget brackets, providing a cost-effective alternative for indie filmmakers, much like stereoscopic technology has over the last few years. Personally, I can’t wait; I’m working out the wrinkles in my green screen this weekend.

Previzion is sold through Lightcraft Technology.
