martinezphoto
Lytro Illum - everyone here should see this.
http://www.theverge.com/2014/4/22/5625264/lytro-changed-photography-meet-the-new-illum-camera
"Light-field photography has been discussed since the 1990s, beginning largely with three Stanford professors, Marc Levoy, Mark Horowitz, and Pat Hanrahan. (The term "light field" was first coined in 1936, and Gabriel Lippmann created something like a light-field camera in 1908, though he didn’t have a name for it.) Instead of measuring color and intensity of light as it hits a sensor in a camera, light-field cameras pass that light through a series of lenses (hundreds of thousands in Lytro’s case), which allows the camera to record the direction each ray of light is moving. Understanding light’s direction makes it possible to measure how far away the source of that light is. So where a traditional camera captures a 2D version of a scene, a light-field shot knows where everything in that scene actually is. A processor turns that data into a 3D model like any you’d see in a video game or special effect, and Lytro displays it as a photograph. It’s a little bit like the small bots in Prometheus, spatially mapping an entire room in order to display it back later. Or think of it as a rudimentary holodeck, projecting a simulated scene that changes as you move through and interact with it.
Lytro didn’t invent the science, just found a way to turn the required technology — which was once made up of 100 DSLRs in a rack at Stanford — into a product you can hold in your hands.
As I sit on a couch in the middle of Lytro’s office, alternately taking photos and seeing them displayed in 3D on a large TV, it becomes clear. This is the future. Not the Illum, necessarily, though it’s one of the more exciting cameras I’ve seen in a while. Maybe not even Lytro, though it’s built a huge lead in its nascent industry. But light-field photography — the notion that the future is about turning the complex physical parts of a camera into software and algorithms, that capturing beautiful photos is little more than a data-crunching problem — seems almost obvious. Why capture one photo, from one angle, with one perspective, when we could capture everything? When I can explore a photo, zooming and panning and focusing and shifting, why would I ever want to just look at it?"
This could be a revolutionary step in photography.
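To get an intuition for the refocusing the article describes, here is a toy "shift-and-add" sketch: a light-field camera effectively records many sub-aperture views of the scene, and shifting each view in proportion to its position in the aperture before averaging refocuses the image on a chosen depth plane. This is only an illustration of the principle, not Lytro's actual pipeline; every function name and number below is invented for the example.

```python
# Toy light-field refocusing by shift-and-add (illustration only, not
# Lytro's real processing). Two synthetic point sources at different
# depths show parallax across sub-aperture views; averaging the views
# with depth-dependent shifts brings one or the other into focus.
import numpy as np

def make_views(grid=5, size=33, disp_near=2, disp_far=0):
    """Synthesize a grid x grid set of sub-aperture views.

    The 'near' point shifts disp_near pixels per unit of aperture
    offset (strong parallax); the 'far' point barely moves.
    """
    views = {}
    c = size // 2
    for u in range(-(grid // 2), grid // 2 + 1):
        for v in range(-(grid // 2), grid // 2 + 1):
            img = np.zeros((size, size))
            img[c + v * disp_far, c + u * disp_far] += 1.0          # far point
            img[c + 8 + v * disp_near, c + 8 + u * disp_near] += 1.0  # near point
            views[(u, v)] = img
    return views

def refocus(views, alpha):
    """Shift each view by alpha * its (integer) aperture offset, then average."""
    acc = np.zeros_like(next(iter(views.values())))
    for (u, v), img in views.items():
        acc += np.roll(img, shift=(-v * alpha, -u * alpha), axis=(0, 1))
    return acc / len(views)

views = make_views()
far_focus = refocus(views, alpha=0)   # focal plane at the far point
near_focus = refocus(views, alpha=2)  # focal plane at the near point
```

With `alpha=0` the far point stacks sharply while the near point smears into a blur spot; with `alpha=2` the roles reverse. That one capture can be refocused after the fact, purely in software, is the core of the "data-crunching problem" the article describes.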