Tuesday, December 13, 2011
Imaging System Captures Light In Motion


MIT researchers have created a new imaging system that can acquire visual data at a rate of one trillion exposures per second - fast enough to produce a slow-motion video of a burst of light traveling the length of a one-liter bottle.

The system relies on a recent technology called a streak camera. The aperture of the streak camera is a narrow slit. Particles of light - photons - enter the camera through the slit and pass through an electric field that deflects them in a direction perpendicular to the slit. Because the electric field is changing very rapidly, it deflects late-arriving photons more than it does early-arriving ones.
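That time-to-deflection mapping can be illustrated with a toy model. All numbers and the binning scheme below are illustrative assumptions, not the camera's actual sweep electronics:

```python
# Toy model of a streak camera: photons enter through a horizontal slit,
# and a rapidly ramping electric field deflects each one vertically by an
# amount proportional to its arrival time. Numbers are illustrative only.
import numpy as np

def streak_image(photons, n_x=64, n_t=64, t_max=1e-9):
    """Bin photons, given as (x position in [0,1), arrival time in seconds),
    into a 2-D image: columns encode position along the slit (spatial),
    rows encode deflection, i.e. arrival time (temporal)."""
    img = np.zeros((n_t, n_x), dtype=int)
    for x, t in photons:
        row = int(min(t / t_max * n_t, n_t - 1))  # deflection ~ arrival time
        col = int(x * n_x)
        img[row, col] += 1
    return img

# A late-arriving photon lands farther down the sensor than an early one,
# even though both entered at the same point on the slit.
early = streak_image([(0.5, 0.1e-9)])
late = streak_image([(0.5, 0.9e-9)])
print(np.argmax(early.sum(axis=1)), np.argmax(late.sum(axis=1)))  # 6 57
```

The key point the sketch captures is that one axis of the recorded image is not space at all: two photons entering at the same slit position but at different times end up in different rows.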

The image produced by the camera is thus two-dimensional, but only one of the dimensions - the one corresponding to the direction of the slit - is spatial. The other dimension, corresponding to the degree of deflection, is time. The image thus represents the time of arrival of photons passing through a one-dimensional slice of space.

The camera was intended for use in experiments where light passes through or is emitted by a chemical sample. Since chemists are chiefly interested in the wavelengths of light that a sample absorbs, or in how the intensity of the emitted light changes over time, the fact that the camera registers only one spatial dimension is irrelevant.

But it's a serious drawback in a video camera. To produce their super-slow-mo videos, Media Lab postdoc Andreas Velten, Media Lab Associate Professor Ramesh Raskar and Moungi Bawendi, the Lester Wolfe Professor of Chemistry, must perform the same experiment - such as passing a light pulse through a bottle - over and over, continually repositioning the streak camera to gradually build up a two-dimensional image. Synchronizing the camera and the laser that generates the pulse, so that the timing of every exposure is the same, requires a battery of sophisticated optical equipment and exquisite mechanical control. It takes only a nanosecond - a billionth of a second - for light to scatter through a bottle, but it takes about an hour to collect all the data necessary for the final video. For that reason, Raskar calls the new system "the world's slowest fastest camera."

After an hour, the researchers accumulate hundreds of thousands of data sets, each of which plots the one-dimensional positions of photons against their times of arrival. Raskar and Velten developed algorithms that can stitch that raw data into a set of sequential two-dimensional images.
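The researchers' actual algorithms are not described here, but the core stitching idea can be sketched under a simplifying assumption: one perfectly synchronized streak image per slit position, so that frame *k* of the video is just the *k*-th time-row of every scan line stacked along the repositioning axis:

```python
# Hypothetical sketch of stitching streak-camera scan lines into video
# frames. Assumes one (n_t, n_x) streak image per slit position and
# perfect synchronization across exposures.
import numpy as np

def stitch_frames(streak_images):
    """streak_images: list of 2-D arrays, one per slit position, each of
    shape (n_t, n_x) with time on the rows. Returns shape (n_t, n_y, n_x):
    a video whose frame k shows the whole 2-D scene at time-bin k."""
    return np.stack(streak_images, axis=1)

# 3 slit positions, 4 time bins, 5 pixels along the slit. Each scan line
# is filled with its own y index so the frames are easy to inspect.
scans = [np.full((4, 5), y) for y in range(3)]
video = stitch_frames(scans)
print(video.shape)  # (4, 3, 5)
```

In the real system each "scan line" is itself averaged over many repeated laser pulses, which is why collecting the data takes an hour even though the event it depicts lasts a nanosecond.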



The streak camera and the laser that generates the light pulses - both cutting-edge devices with a cumulative price tag of $250,000 - were provided by Bawendi, a pioneer in research on quantum dots: tiny, light-emitting clusters of semiconductor particles that have potential applications in quantum computing, video-display technology, biological imaging, solar cells and a host of other areas.

The trillion-frame-per-second imaging system, which the researchers have presented both at the Optical Society's Computational Optical Sensing and Imaging conference and at SIGGRAPH, is a spinoff of another Camera Culture project, a camera that can see around corners. That camera works by bouncing light off a reflective surface - say, the wall opposite a doorway - and measuring the time it takes different photons to return. But while both systems use ultrashort bursts of laser light and streak cameras, the arrangement of their other optical components and their reconstruction algorithms are tailored to their disparate tasks.
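The time-of-flight arithmetic behind that corner-peering camera is simple to sketch, even though the real reconstruction solves a much harder inverse problem. The distances below are hypothetical, not from the MIT experiment:

```python
# Round-trip delay for a laser pulse that bounces off a wall, reaches a
# hidden object, and returns along the same path. Distances illustrative.
C = 299_792_458.0  # speed of light in vacuum, m/s

def round_trip_time(d_wall, d_hidden):
    """Delay for the path camera -> wall -> hidden object -> wall -> camera,
    with d_wall the camera-to-wall distance and d_hidden the wall-to-object
    distance, both in meters."""
    return 2 * (d_wall + d_hidden) / C

# A hidden object 0.5 m beyond a wall 2 m away delays the echo by ~16.7 ns,
# comfortably within the streak camera's sub-nanosecond time resolution.
t = round_trip_time(2.0, 0.5)
print(f"{t * 1e9:.1f} ns")  # 16.7 ns
```

Because photon arrival times differ by nanoseconds or less depending on the hidden geometry, only a detector with the streak camera's time resolution can tell those paths apart.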

Because the ultrafast-imaging system requires multiple passes to produce its videos, it can't record events that aren't exactly repeatable. Any practical applications will probably involve cases where the way in which light scatters - or bounces around as it strikes different surfaces - is itself a source of useful information. Those cases may, however, include analyses of the physical structure of both manufactured materials and biological tissues - "like ultrasound with light," as Raskar puts it.

As a longtime camera researcher, Raskar also sees a potential application in the development of better camera flashes. "An ultimate dream is, how do you create studio-like lighting from a compact flash? How can I take a portable camera that has a tiny flash and create the illusion that I have all these umbrellas, and spotlights, and so on?" asks Raskar, the NEC Career Development Associate Professor of Media Arts and Sciences. "With our ultrafast imaging, we can actually analyze how the photons are traveling through the world. And then we can recreate a new photo by creating the illusion that the photons started somewhere else."

Velten adds, "As photons bounce around in the scene or inside objects, they lose coherence. Only an incoherent detection method like ours can see those photons." And those photons, Velten says, could let researchers "learn more about the material properties of the objects, about what is under their surface and about the layout of the scene. Because we can see those photons, we could use them to look inside objects - for example, for medical imaging, or to identify materials."

CDRINFO.COM 1998-2015 - All rights reserved