

  Previous versions of Ajay C's message #9794449 « Sony NEX-C3 first impressions »


Ajay C
Re: Sony NEX-C3 first impressions

denoir wrote:

I know the photodiodes in the sensor are monochromatic, but I assumed there was some form of semiconductor device doing the color filtering in the CFA. When I think about it, however, it wouldn't make sense, as you are not interested in getting an electrical response to a wavelength at that point - you are just doing band-pass filtering. So the CFA consists of tiny monochromators - i.e. the filtering process is purely optical?

Correct, it is purely optical. All the CFA does is filter the light down to one primary color (depending on the color of the CFA element) - band-pass plus some attenuation. Loosely speaking, it is like sticking a red / blue / green filter in front of your lens.
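To illustrate the point, here is a minimal sketch of a CFA as a purely optical per-site band-pass: each monochrome photodiode sees only one primary, with some attenuation, and no electrical filtering is involved. The GRBG layout and the 0.9 attenuation figure are assumptions for illustration, not measured sensor data.

```python
import numpy as np

def bayer_mosaic(rgb, attenuation=0.9):
    """Apply a GRBG Bayer pattern ([G R] / [B G] quads) to an HxWx3
    image, returning the single-channel raw signal each monochrome
    photodiode sees after its purely optical band-pass filter."""
    h, w, _ = rgb.shape
    raw = np.zeros((h, w))
    raw[0::2, 0::2] = rgb[0::2, 0::2, 1]  # green sites, even rows
    raw[0::2, 1::2] = rgb[0::2, 1::2, 0]  # red sites
    raw[1::2, 0::2] = rgb[1::2, 0::2, 2]  # blue sites
    raw[1::2, 1::2] = rgb[1::2, 1::2, 1]  # green sites, odd rows
    return attenuation * raw

# Flat white light at normal incidence: every photosite responds
# identically, so demosaic + white balance yields neutral white.
raw = bayer_mosaic(np.ones((4, 4, 3)))
```

Note that with uniform white input every site in `raw` carries the same value, which is exactly why normal-incidence light produces no color cast.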

I don't think then that the CFA is the cause of the color shift. As you can see in the images above there is a clear asymmetry to it. You've got a red shift in one part of the image and a cyan shift in another. I could accept that you'd get a single color cast towards the edges if one of the filters (red for instance) was more sensitive to light coming at an oblique angle, but I can't see how on earth you'd get a global asymmetry to it.

No, the CFA by itself will not cause a color shift, and different hues across the frame are common. There is math to it, but I'll try to explain in simple terms.

Take a pixel quad (2x2 - [Green, Red] in one row and [Blue, Green] in the next) and imagine white light rays coming in perfectly normal to the sensor. The CFA filters the light, the corresponding pixels get excited and produce an electrical response - i.e. all channels respond the same way, and when the pixel data is demosaiced you get a perfect white response (assuming white balancing was done).

Now imagine light rays coming in at an oblique angle, with the micro-lenses dead center on top of the pixels. All of a sudden, some of the light that should be gathered by a pixel is (a) not going into that pixel (attenuated), and (b) partly reflected around into neighboring pixels. If you read (a) and (b) carefully, the pixel's own response has changed, and the neighboring pixels are responding differently too. Depending on the angle (and the side) at which the light rays hit the pixel, different intensities of light can leak into different neighboring pixels and elicit different responses. When this neighboring pixel data is used for further processing and the same color gains (white balancing) are applied, the result is not pure white but colored, and the hue depends on which pixel (or combination of pixels) had the strongest response to those rays.

The key to the asymmetrical color shifts is the angle of the light rays itself. Hence the idea behind shifted micro-lenses is to reduce this "mischanneling" of the light at the edges of the pixel array.
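The quad argument above can be sketched numerically. This is a toy model, not real sensor physics: one GRBG quad under flat white light, where oblique rays attenuate each site and leak a fraction of its light into the neighbor on the side the rays lean toward. The leak fraction and the trivial demosaic are assumptions for illustration.

```python
import numpy as np

def quad_rgb(tilt=0.0):
    """Demosaiced (R, G, B) from one GRBG quad [G R / B G] under flat
    white light. tilt > 0: rays lean right, and each site loses a
    fraction |tilt| of its light, which leaks into its right-hand
    neighbor; tilt < 0 leans left. Toy numbers, purely illustrative."""
    quad = np.ones((2, 2))               # ideal normal-incidence signal
    a = abs(tilt)
    collected = (1 - a) * quad           # light lost to mischanneling
    leaked = a * quad
    if tilt > 0:                         # spill into right-hand column
        collected[:, 1] += leaked[:, 0]
    elif tilt < 0:                       # spill into left-hand column
        collected[:, 0] += leaked[:, 1]
    r = collected[0, 1]
    g = (collected[0, 0] + collected[1, 1]) / 2
    b = collected[1, 0]
    return float(r), float(g), float(b)  # fixed WB gains of 1 assumed

print(quad_rgb(0.0))    # neutral white at normal incidence
print(quad_rgb(+0.2))   # R up, B down: red shift on this side
print(quad_rgb(-0.2))   # R down, B up: cyan shift on the other side
```

With the same fixed white-balance gains applied everywhere, rays leaning one way push the quad red and rays leaning the other way push it cyan - which is exactly the kind of global asymmetry seen in the sample images.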

Yes, it works rather effectively. When the CFA is shifted, it is not completely optimized for extreme ray angles. It just mitigates color crosstalk for high ray angles. The formula used for shifting can be optimized for a "certain" range of ray angles.

Sorry, my bad - I meant the micro-lenses, not the CFA. The CFA is not shifted, as the pixel centers remain the same throughout the array.
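The shifting formula can be sketched geometrically: toward the edge of the array the chief ray arrives at a steeper angle, so each lenslet is offset toward the optical center by roughly the optical stack height times the tangent of the chief ray angle. Every number below (linear CRA profile, 20° maximum, 3 µm stack, APS-C half-width) is an illustrative guess, not any manufacturer's design data.

```python
import math

def microlens_shift_um(x_mm, half_width_mm=11.8, max_cra_deg=20.0,
                       stack_height_um=3.0):
    """Offset (in microns, toward the optical center) for a lenslet at
    image height x_mm. Assumes the chief ray angle grows linearly from
    0 at the center to max_cra_deg at the edge of an assumed APS-C
    half-width of 11.8 mm."""
    cra_deg = max_cra_deg * (x_mm / half_width_mm)
    return stack_height_um * math.tan(math.radians(cra_deg))

for x_mm in (0.0, 5.9, 11.8):
    print(f"x = {x_mm:4.1f} mm -> shift ~ {microlens_shift_um(x_mm):.2f} um")
```

Because the offsets are tuned for one assumed CRA profile, a lens whose ray angles fall outside that "certain" range (a short symmetric rangefinder lens, say) will still mischannel light at the edges.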

I really don't see how crosstalk could account for the color shift - not any type, actually. A CFA-induced one would affect things symmetrically. I don't see how electrical crosstalk in the semiconductor could produce the effect either (i.e. flow through the space between the well and the substrate). It has however been very long since I knew the semiconductor stuff adequately, so I may have missed something completely.

It has nothing to do with electrical crosstalk on the IC.

I have however yet to come across an explanation for the asymmetry in the color shift.

Read above.

Perhaps, but those tricks apparently did not work for the M9 or the older NEX cameras, or color shift would not have been a problem in the first place...

I am not surprised. Micro-lens shifting is relatively new for APS-C sized (and bigger-pixel) sensors. Smaller pixels (like cell phone camera sensors) have a much worse problem with color crosstalk. The new NEX sensor could be using shifted micro-lenses while the older one did not, or perhaps they just used a newer micro-lens architecture with the older pixel technology. Anyone's guess - only DPR / DXOmark could figure it out unequivocally.

Jul 28, 2011 at 10:27 AM
