Ajay C
When a photosite is hit by a ray perpendicularly, the diodes separate the light into G-R-G-B (Green, Red, Green, Blue) components. When the photosite is hit at an angle, the diodes don't function as they should but leak wavelengths they should have filtered away, and therefore we get the color cast towards the edges. What Sony engineers must have done (my hat off to them) is to design photosites that can take a hit at an angle without light leaking to neighbouring photosites.
Luka,
The photodiodes are monochromatic. The color filter array (RGBG) filters the light, and photons with the corresponding wavelengths reach the pixels beneath. The above images show what is called color crosstalk: once the visible spectrum is filtered through the CFA, that light should ideally go to that pixel, but some of it goes to the neighboring pixels instead.
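To make that concrete, here is a toy sketch in Python of crosstalk as a simple mixing matrix. The leakage fraction is a made-up number, and real leakage is angle-dependent and asymmetric, but it shows how a pure color picks up contamination from its neighbors:

```python
import numpy as np

def apply_crosstalk(rgb, leak=0.05):
    """Mix an ideal (R, G, B) triple as if `leak` of each channel's
    light landed on the two neighboring photosites instead.
    `leak` is purely illustrative, not measured sensor data."""
    mix = np.array([
        [1 - 2 * leak, leak,         leak        ],  # light meant for R
        [leak,         1 - 2 * leak, leak        ],  # light meant for G
        [leak,         leak,         1 - 2 * leak],  # light meant for B
    ])
    return mix @ np.asarray(rgb, dtype=float)

# A saturated red patch picks up green and blue; the effect worsens
# at the steeper ray angles found toward the frame edges.
print(apply_crosstalk([1.0, 0.0, 0.0]))        # [0.9, 0.05, 0.05]
print(apply_crosstalk([1.0, 0.0, 0.0], 0.15))  # [0.7, 0.15, 0.15]
```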
What Sony must have done is shift the CFA on top of the pixels (even the sensor in the Fujifilm X100 does this - they mention it in one of their marketing materials). The idea is that instead of the CFA lying dead center on each pixel, the array is shifted progressively, with the pixels at the edges getting the most shift, so that most of the light hitting at extreme angles converges into the pixels those rays should go into - see the sketch below. There are also other techniques, like building "light guides" with metal stacks on top of the pixel to limit color crosstalk, but the idea is the same - to prevent contamination.
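Here is a rough Python sketch of the geometry behind that progressive shift. This is not Sony's actual design; the exit-pupil distance and filter-stack height are hypothetical numbers chosen only to show why the required shift is zero at the center and largest at the edges:

```python
import math

def cfa_shift_um(x_mm, exit_pupil_mm=40.0, stack_height_um=3.0):
    """Lateral CFA/microlens shift (microns) for a photosite x_mm
    off-axis, assuming chief rays diverge from a single exit pupil
    on the optical axis and the filter sits stack_height_um above
    the photodiode. Both parameters are illustrative assumptions."""
    chief_ray_angle = math.atan2(x_mm, exit_pupil_mm)   # grows off-axis
    return stack_height_um * math.tan(chief_ray_angle)  # shift to compensate

# Center pixel needs no shift; edge pixels need the most.
for x in (0.0, 3.0, 6.0, 12.0):  # mm from sensor center
    print(f"{x:5.1f} mm off-axis -> shift ~ {cfa_shift_um(x):.2f} um")
```

The shorter the exit-pupil distance (as with wide-angle rangefinder lenses on the M9), the steeper the chief-ray angle and the larger the shift needed, which is why edge color cast shows up most with such lenses.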
Actually, looking at what you posted, the NEX-C3 might be a better sensor in terms of color crosstalk than the M9 sensor (assuming the NEX-C3 is not doing the correction in software).