Members of the MIT community have a history of transforming visual effects. Herbert Kalmus (1903) and MIT Physics Professor Daniel Comstock codeveloped Technicolor (yes, it was named after MIT), and Bill Warner ’80 created the Avid digital editing system. Now Eliot Mack SM ’96 hopes to add his name to the list. His portable Previzion system matches live-action foregrounds and computer-generated backgrounds accurately enough that directors can look past the green screen and see how the final shot will look.

Previzion being used on set

To add backgrounds in post-production, technicians have to know the exact camera positions and lens optical parameters used during filming. Currently, visual tracking involves 3,000-pound cranes fitted with rotary measuring devices that are not fully accurate. Mack's invention mounts Intersense optical-inertial and Airtrack inertial sensors on the camera itself; they precisely record the necessary data and eliminate the need to reconstruct tracking information by hand.
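
To make this concrete, here is a minimal sketch, in Python, of the kind of per-frame record such on-camera sensors could produce. The field names and units are illustrative assumptions, not Previzion's actual data format, which is not public.

```python
from dataclasses import dataclass

@dataclass
class FrameTrackingRecord:
    """One frame's worth of camera tracking data from on-camera sensors.

    Hypothetical schema for illustration; Previzion's real format is
    not public.
    """
    frame: int                                     # frame number within the take
    position_m: tuple[float, float, float]         # camera position in meters
    pan_tilt_roll_deg: tuple[float, float, float]  # orientation in degrees
    focal_length_mm: float                         # zoom setting
    focus_distance_m: float                        # focus ring setting
    f_stop: float                                  # iris (aperture) setting

# A renderer can replay this stream to place its virtual camera exactly
# where the physical camera was on each frame, with matching lens optics.
sample = FrameTrackingRecord(
    frame=1,
    position_m=(0.0, 1.6, -3.2),
    pan_tilt_roll_deg=(12.0, -4.5, 0.0),
    focal_length_mm=35.0,
    focus_distance_m=2.5,
    f_stop=2.8,
)
```

Because every field is measured directly on set, none of it has to be reverse-engineered from the footage afterward.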

While green-screen technology is not new (meteorologists swear by it), creating live photorealistic images with it is. Current technology struggles with motion tracking, image resolution, focusing and defocusing background shots, and capturing the lens calibration data that post-production work depends on. Mack has refined his system to generate camera tracking data automatically and to key so cleanly that not a single strand of hair is lost against the backdrop. “Essentially, we’re recreating the world on the fly,” he says. So far, it’s been used on the television show V, the upcoming Tim Burton movie Alice in Wonderland, and the Knight Rider made-for-TV movie.
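
One of those problems, matching the defocus of the background to the live lens, is easy to illustrate. The sketch below (Python with NumPy/SciPy) blurs a CG background in proportion to its distance from the focus plane; the scaling heuristic is invented for illustration and is not Previzion's depth-of-field model.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def defocus_background(background: np.ndarray,
                       focus_distance_m: float,
                       background_distance_m: float,
                       strength: float = 2.0) -> np.ndarray:
    """Blur an RGB background image (H x W x 3) according to how far
    it sits from the lens's focus plane. A crude stand-in for real
    depth of field; the linear scaling is a made-up heuristic.
    """
    # The further the background is from the focus plane, the softer
    # it should look.
    mismatch_m = abs(background_distance_m - focus_distance_m)
    sigma = strength * mismatch_m
    # Blur spatially only; a sigma of 0 on the last axis leaves the
    # color channels unmixed.
    return gaussian_filter(background, sigma=(sigma, sigma, 0))
```

Without the focus-distance data captured on set, even this simple correction would require an artist to eyeball the blur shot by shot.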

How does it work?
Consider the typical FX process. A show is recorded against a green screen, then digitized and loaded into a computer. An artist keys out the green, then another team manually identifies all the camera tracking and calibration points, which can take days for just one shot (a special-effects-laden movie would contain hundreds or thousands of such shots). Next, a team fills in the background, and yet another artist composites all the images together; the whole process can take weeks or months. With simpler backgrounds (like those usually found in TV), Mack’s invention can generate final-quality output in real time that just needs to be edited, scored, and distributed. For more complex scenes, post-production time can still be cut by weeks by fixing any lighting issues on set and reusing the tracking data Previzion generates (see photo caption).
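
The keying and compositing steps at the heart of that pipeline can be sketched in a few lines of Python with NumPy. This hard-threshold keyer is far cruder than a production system (no soft edges, no green-spill suppression, no hair-fine detail) and is meant only to show the order of operations, not Mack's algorithm.

```python
import numpy as np

def chroma_key_composite(foreground: np.ndarray,
                         background: np.ndarray,
                         threshold: float = 0.15) -> np.ndarray:
    """Replace green-screen pixels in a live-action plate with a CG
    background. Both images are float RGB arrays in [0, 1] with the
    same shape (H x W x 3).
    """
    r, g, b = foreground[..., 0], foreground[..., 1], foreground[..., 2]
    # Treat a pixel as green screen when green clearly dominates both
    # red and blue.
    greenness = g - np.maximum(r, b)
    matte = (greenness > threshold)[..., None]  # True where CG shows through
    return np.where(matte, background, foreground)
```

In a real-time system this runs on every frame, with the CG background rendered from the tracking data described above, so the director sees the composite live instead of weeks later.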

Innovation is nothing new to Mack, who helped create the Roomba and developed Previzion with some of his former iRobot colleagues. They engineered a prototype, showed it at trade shows, and used the feedback to systematically improve the technology. “Previzion is a wild enough departure that people won’t believe it until they see it,” he says. “Many people don’t know it’s possible yet.” That is why he spends much of his time visiting sound stages to demo his product: technical teams have to feel confident with the technology before they will schedule it into future productions. But Mack thinks that in about five years his hard work will have paid off, to the point where it will be rare to film against a green screen without a real-time image behind it.

Could a product like Previzion, though, mean fewer jobs for LA professionals? Mack doesn’t think so; if anything, it makes it less likely that work will be outsourced, as it often is now. And the business could actually grow. “I think the special effects and motion picture industries could be bigger than they are now,” he says. He believes less sophisticated entertainment, such as webisodes and soap operas, could be transformed so that higher-quality output becomes standard. “If you give very powerful tools to very creative people, they can do some pretty amazing things,” he says.