Computational photography still lies a few years in the future, but some forward-thinking individuals are already tinkering with technologies that demonstrate its possibilities.
Researchers at Stanford University have developed a chip – an imaging sensor – that records not only an image but also how far each part of the scene is from the camera. The implication is that 3-dimensional images are possible with this sensor.
The futuristic sensor uses an array of microscopic lenses, each of which projects what it sees onto its own 16×16-pixel sub-array. Every pixel records not only RGB information but also how far away that part of the scene is, in effect giving the image depth data.
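To picture how a frame from such a sensor might be organised, here is a minimal sketch in Python – my own illustration, not Stanford's implementation. The image is tiled into 16×16-pixel sub-arrays, one per microlens, and every pixel carries an RGB triple plus a distance sample; the array shapes and function names are assumptions made up for the example.

```python
import numpy as np

SUB = 16  # pixels per sub-array side, as described in the article

def split_into_subarrays(rgb, depth):
    """Group a full frame into per-microlens tiles of SUB x SUB pixels.

    rgb   : (H, W, 3) array of colour values
    depth : (H, W) array of per-pixel distances from the camera
    Returns an array of shape (H//SUB, W//SUB, SUB, SUB, 4) where the last
    axis holds R, G, B and depth for every pixel in every tile.
    """
    h, w, _ = rgb.shape
    frame = np.dstack([rgb, depth[..., None]])          # (H, W, 4)
    tiles = frame.reshape(h // SUB, SUB, w // SUB, SUB, 4)
    return tiles.transpose(0, 2, 1, 3, 4)               # (tiles_y, tiles_x, SUB, SUB, 4)

# Example with synthetic data: a 480x640 frame becomes a 30x40 grid of tiles.
rgb = np.random.rand(480, 640, 3)
depth = np.random.uniform(0.5, 10.0, size=(480, 640))   # made-up distances in metres
tiles = split_into_subarrays(rgb, depth)
print(tiles.shape)  # (30, 40, 16, 16, 4)
```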
The end result can be printed as a normal photograph, projected as a 3D image far better than what is currently possible with holograms, or perhaps fed to some futuristic 3D printer to sculpt a solid bust of a person, complete with all the colors of the photograph. The possibilities are endless.
In image editing, it offers a novel way of enhancing photos by selectively shifting focus or the viewing angle after the shot has been taken, resulting in better pictures.
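As a rough illustration of what post-capture refocusing with such depth data could look like, the sketch below keeps pixels near a chosen focal distance sharp and blurs the rest. This simple two-layer blend is an assumed stand-in, not the researchers' actual algorithm, and the parameter names are my own.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def refocus(rgb, depth, focal_distance, depth_of_field=0.5, max_blur=5.0):
    """Synthetically refocus an image using its per-pixel depth map.

    rgb            : (H, W, 3) image
    depth          : (H, W) per-pixel distance from the camera
    focal_distance : distance (same units as depth) to keep in focus
    """
    # Blur the whole image once, then blend sharp and blurred versions per
    # pixel according to how far each pixel lies from the focal plane.
    blurred = np.stack(
        [gaussian_filter(rgb[..., c], sigma=max_blur) for c in range(3)], axis=-1
    )
    defocus = np.clip(np.abs(depth - focal_distance) / depth_of_field, 0.0, 1.0)
    return rgb * (1.0 - defocus[..., None]) + blurred * defocus[..., None]

# Usage: bring the subject at roughly 2 m into focus after the picture was taken.
# refocused = refocus(rgb, depth, focal_distance=2.0)
```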
Adobe Systems earlier showed off a prototype of the technology (video here at AudioBlog.fr), but the Stanford boffins are way ahead in this game by actually developing the chip. (Whatever happened to the plenoptic camera work of Ren Ng at Stanford and John Wang at MIT?)
[Via: News.com]