Streak Camera captures a trillion frames per second

June 03, 2014

Cambridge - For photographers, the future is approaching with lightning speed. Using optical equipment, MIT researchers have created an imaging system that can capture moving particles of light. The "camera" can acquire visual data at a rate of one trillion exposures per second. That's fast enough to produce a slow-motion video of a burst of light traveling the length of a one-liter bottle, bouncing off the cap and reflecting back to the bottle's bottom.

"There's nothing in the universe that looks fast to this camera," says Media Lab postdoctoral Andreas Velten, one of the system's developers. It relies on a streak camera, an instrument for measuring the variation in a pulse of light's intensity with time. The aperture of the camera that transforms the temporal profile of a light pulse into a spatial profile on a detector is a narrow slit. Particles of light known as photons enter the camera through the slit and pass through an electric field that deflects them in a direction perpendicular to the slit. Because the electric field is changing very rapidly, it deflects late-arriving photons more than it does early-arriving ones.

Therefore, the image produced by the camera is two-dimensional, but only one of the dimensions - the one corresponding to the direction of the slit - is spatial. The other dimension, corresponding to the degree of deflection, is time. The image thus represents the time of arrival of photons passing through a one-dimensional slice of space. Originally, the camera was intended for use in experiments where light passes through or is emitted by a chemical sample. Since chemists are interested in the wavelengths of light that a sample absorbs, or in how the intensity of the emitted light changes over time, the fact that the camera registers only one spatial dimension is irrelevant.
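The mapping described above - arrival time converted into deflection along one detector axis, with the slit providing the other, spatial axis - can be sketched numerically. The following is a toy model only; the sweep rate, detector size, and photon distribution are illustrative assumptions, not parameters of the MIT system.

```python
import numpy as np

def streak_image(x_positions, arrival_times, sweep_rate=1.0,
                 n_x=64, n_t=64, t_max=1.0, x_max=1.0):
    """Toy streak camera: histogram photons into a 2-D detector image.

    Columns = position along the slit (spatial dimension).
    Rows = deflection, proportional to arrival time (temporal dimension).
    """
    # A linearly ramping field deflects late-arriving photons more.
    deflection = sweep_rate * np.asarray(arrival_times)
    image, _, _ = np.histogram2d(
        deflection, x_positions,
        bins=[n_t, n_x],
        range=[[0.0, sweep_rate * t_max], [0.0, x_max]],
    )
    return image

# Simulate a pulse that arrives later at larger x, as if a wavefront
# were sweeping across the slit.
rng = np.random.default_rng(0)
x = rng.uniform(0.0, 1.0, 10_000)
t = np.clip(0.8 * x + rng.normal(0.0, 0.02, x.size), 0.0, 1.0)

img = streak_image(x, t)
print(img.shape)  # (64, 64): one temporal axis, one spatial axis
```

In this sketch the sweeping wavefront shows up as a diagonal streak on the detector, which is exactly why the instrument is called a streak camera: time becomes a visible direction in the image.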

After accumulating hundreds of thousands of data sets, the researchers developed algorithms that can stitch that raw data into a set of sequential two-dimensional images. A video posted on YouTube explains the experiment in more detail. Because the ultrafast-imaging system requires multiple passes to produce its videos, it can't record events that aren't exactly repeatable.
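The stitching step above can be sketched as follows. This is not the researchers' actual reconstruction algorithm - it assumes, for illustration, that each repetition of the light pulse yields one streak image covering a single row of the scene, so stacking rows rebuilds a full 2-D frame for every time bin.

```python
import numpy as np

def stitch_frames(slices):
    """Stack repeated 1-D streak measurements into a 2-D video.

    slices: list of (row_index, streak) pairs, where streak is an
    (n_time, n_x) array recorded for that row of the scene.
    Returns an (n_time, n_rows, n_x) array: one 2-D frame per time bin.
    """
    n_rows = max(row for row, _ in slices) + 1
    n_time, n_x = slices[0][1].shape
    video = np.zeros((n_time, n_rows, n_x))
    for row, streak in slices:
        # Each repetition contributes one horizontal slice of every frame.
        video[:, row, :] = streak
    return video

# A perfectly repeatable event measured at 8 scene rows, 16 time bins each.
slices = [(r, np.full((16, 32), float(r))) for r in range(8)]
video = stitch_frames(slices)
print(video.shape)  # (16, 8, 32): 16 frames, each 8 x 32 pixels
```

The sketch also makes the article's limitation concrete: every row comes from a separate repetition of the event, so the reconstruction is only valid if the event unfolds identically each time.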

Any practical applications will probably involve cases where the way in which light bounces around is itself a source of useful information. Those cases may include analyses of the physical structure of both manufactured materials and biological tissues - "like ultrasound with light," says Media Lab Associate Professor Ramesh Raskar.

He also sees a potential application in the development of better camera flashes. "An ultimate dream is how do you create studio-like lighting from a compact flash? How can I take a portable camera that has a tiny flash and create the illusion that I have all these umbrellas, and spotlights, and so on?" Raskar says. "With our ultrafast imaging, we can actually analyze how the photons are traveling through the world. And then we can recreate a new photo by creating the illusion that the photons started somewhere else."

"As photons bounce around in the scene or inside objects, they lose coherence. Only an incoherent detection method like ours can see those photons," Velten adds. And those photons could let researchers "learn more about the material properties of the objects, about what is under their surface and about the layout of the scene. Because we can see those photons, we could use them to look inside objects - for example, for medical imaging, or to identify materials."