# CMOS Sensors - Blue Channel - Boosting



## Caps (Feb 17, 2013)

hello all,

I ran across an interesting video about the value of using glass filters (either screw-in or square) in the digital age. The claim was that certain effects cannot be duplicated in post-processing, or shouldn't be attempted because of the IQ degradation. In short, there is still value in old-school glass filters (brought to you by Tiffen, B+W, etc.). "Drink milk - brought to you by the dairy council"

Anyway, in the video they talked very briefly about blue filters (the 80 series) and recommended them for boosting the "weak and noisy" blue channel. The presenter said that weak blue channels are inherent in all silicon-based chips and that an 80C filter would help with this problem.
I have no reason to doubt a man giving a lecture in front of Schneider, B+W, Tiffen and high-level people in the industry.

How does the glass filter aid the chip? Does custom white balancing shift something to cause the blue channel to become noisy? How would you deploy this in the field?

Of course the 80-series filters are designed to white-balance yellowish tungsten light, but I just don't know how he means for them to be used.


----------



## BrianV (Feb 17, 2013)

When was the video made, and what generation of detector were they using?

Detector chemistry was altered years ago to introduce materials such as indium tin oxide to boost blue response. The quantum efficiency of the blue channel is now quite good. Sensors in the early 1990s were a different story. Looking at the spectral response of the CMOS sensor used in the full-frame Kodak DCS 14n over 10 years ago, the response curve is about the same as film's.

White balance built into the camera is more effective than using color correction filters.

Color filters are useful for the new Leica M Monochrom camera. Both color-correction filters and selective color filters are useful because the camera's sensor does not record color information. Color filters are used on the Monochrom camera the same as with film.

If interested- comparison shots here:

http://www.dpreview.com/galleries/876628771/albums/m9-and-monochrom-comparisons

Red filter on the Monochrom versus Leica M9 "red-filter" simulated in Silver-Efex 2.


----------



## BrianV (Feb 17, 2013)

Just a follow-up: if the lighting conditions are way out there (say, heavy tungsten lighting that needs to be cut with an 80A or stronger), the camera's auto white balance would cut the red channel roughly in half to correct the color after the image was taken. If you used an actual filter, for the same exposure, the red channel would be cut roughly in half as the image is being taken. If pixels in the red channel had saturated, or blown out, you would have been better off using the color correction filter to preserve the highlights.

If the lighting was heavy in the blue or green channel, a color-correction filter would likewise help in situations where the highlights were being blown out.
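The saturation argument above can be put in toy numbers. Everything below is invented for illustration (a 12-bit clip point, three made-up red-light levels); it only shows why halving a channel optically before capture beats halving it digitally after capture:

```python
import numpy as np

# Toy model of channel saturation: a 12-bit sensor channel clips at full well.
full_well = 4095.0
scene_red = np.array([3000.0, 5000.0, 8000.0])  # invented "true" red light levels

captured = np.minimum(scene_red, full_well)                 # no filter: two values clip
fixed_in_post = captured * 0.5                              # WB halves red after clipping
filtered_capture = np.minimum(scene_red * 0.5, full_well)   # CC filter halves red before capture

print("halved in post:   ", fixed_in_post)     # the two bright values are now identical
print("filtered capture: ", filtered_capture)  # highlight separation survives
```

Once the channel clips, no amount of post-processing recovers the difference between the two bright values; the filter preserves it.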


----------



## Caps (Feb 17, 2013)

Got it. Thanks for the down-low, BrianV. I'm glad not to complicate my operation by adding another layer of hassle.

I got the info from the link below, at the 1:50:53 mark.

AbelCine EXPO: Filters for Digital Cinema


----------



## Helen B (Feb 17, 2013)

Bear in mind that the standards expected of professional motion pictures are often significantly higher than those required for general still photography, and the efforts to achieve those standards are often prodigious.

If you want to investigate the response of your camera in different types of light, try using RawDigger on an image of a grey scale, or just a black-and-white or any neutral card, at different exposures. Notice where the peaks occur in the four channels (RGGB). This exercise should give you a good feel for the relative amounts of change necessary to achieve equal R, G and B values.
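The exercise can be mimicked without a real raw file. The sketch below simulates an RGGB Bayer mosaic of a neutral card under warm light (the per-channel levels are made-up illustrative numbers, not measured sensor data) and reports each channel's mean and its offset in stops from green:

```python
import numpy as np

rng = np.random.default_rng(0)
h, w = 64, 64                                     # toy sensor size in photosites
level = {"R": 3200.0, "G": 4000.0, "B": 1600.0}   # invented mean raw counts per channel

raw = np.empty((h, w))
raw[0::2, 0::2] = level["R"]                      # RGGB layout: R at even rows/cols
raw[0::2, 1::2] = level["G"]
raw[1::2, 0::2] = level["G"]
raw[1::2, 1::2] = level["B"]
raw += rng.normal(0, np.sqrt(raw))                # shot noise scales as sqrt(signal)

means = {
    "R": raw[0::2, 0::2].mean(),
    "G": (raw[0::2, 1::2].mean() + raw[1::2, 0::2].mean()) / 2,
    "B": raw[1::2, 1::2].mean(),
}
for ch in "RGB":
    stops = np.log2(means[ch] / means["G"])
    print(f"{ch}: mean={means[ch]:.0f}  ({stops:+.2f} stops vs G)")
```

With these toy levels the blue channel comes out about 1.3 stops below green, which is the kind of gap the real RawDigger exercise would reveal.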

The overwhelming evidence is, of course, that it is perfectly possible to take great pictures without a light balancing filter in front of the sensor (either in front of or behind the lens) and that one shouldn't worry about it, especially if you expose for the highlights. If you start to run into problems, then it may be a useful piece of information. Or not.


----------



## Helen B (Feb 17, 2013)

This is an example of the raw histogram of an image of a neutral white in 2850 K incandescent lighting, shot with a Nikon D3. You can see that the green channels have the highest values, closely followed by the red channel. The blue channel is the equivalent of over a stop down from the green. If you really wanted to balance these you would need more of a true light-blue filter than an 80 in this case, because an 80 would have too much negative influence on the red.


----------



## Caps (Feb 17, 2013)

Helen B...

Thanks for going the extra mile! It appears you've got the answer: blue filters are not needed.

I'm glad I asked the question though....thanks again.


----------



## Garbz (Feb 18, 2013)

He may be right about the poor response and high noise in a typical blue channel. But let me show you why this doesn't matter.

Below we have the same image three times. The top one is the original file. The second is the original, scaled down to the size you see, with a 3 px Gaussian blur and then 25% uniform noise applied to the blue channel only. The bottom image has exactly the same treatment, this time applied to the green channel only. Notice how the blue one doesn't look too bad compared to the original, yet the green one is completely screwed? Our eye's sensitivity makes the blue channel almost redundant; we just need it to fill in a bit of colour information. This is also exploited by a lot of lossy compression algorithms, which start by completely butchering the blue channel.
Original: [image]

Blue butchered: [image]

Green butchered: [image]

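Garbz's demonstration can be approximated numerically. The sketch below adds the same noise field first to the blue channel and then to the green channel of a random stand-in image, and compares how much each corrupts perceived brightness using the Rec.709 luma weights (the 0.25 noise level is an arbitrary choice):

```python
import numpy as np

rng = np.random.default_rng(1)
img = rng.uniform(0, 1, size=(64, 64, 3))     # stand-in "original" image
noise = rng.normal(0, 0.25, size=(64, 64))    # same noise field for both tests

def luma(rgb):
    # Rec.709 luminance: the eye weights green roughly 10x more than blue.
    return rgb @ np.array([0.2126, 0.7152, 0.0722])

def degrade(rgb, channel):
    out = rgb.copy()
    out[..., channel] = np.clip(out[..., channel] + noise, 0, 1)
    return out

err_blue = np.abs(luma(degrade(img, 2)) - luma(img)).mean()
err_green = np.abs(luma(degrade(img, 1)) - luma(img)).mean()
print(f"mean luma error, blue degraded:  {err_blue:.4f}")
print(f"mean luma error, green degraded: {err_green:.4f}")
```

The identical degradation produces roughly ten times the luminance error when applied to green, which is the effect visible in the three images above.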

----------



## BrianV (Feb 18, 2013)

KAC-12040 - Truesense CMOS sensor data sheet

Looking at data sheets for modern sensors, the quantum efficiency (QE) of the blue channel is on par with green and red. From the new CMOS sensor from Truesense: "40%, 45%, 43% (470, 540, 620 nm)".

In the first generation of sensors it was very low, and the blue channel was noisy. If you need details at this level, try to get the data sheet for the sensor and check the spectral response curves.


----------



## Helen B (Feb 18, 2013)

Doesn't RawDigger tell you exactly what you want to know about the sensor you are using?


----------



## BrianV (Feb 18, 2013)

Does it give numbers for full-well charge, dark current, spectral response, etc.?

Much of this information is in the manufacturer's data sheets. For sensors used in the scientific/technical market, they are available. Kodak and now Truesense made them available for their products. Some others do as well, but Sony and the manufacturers that produce sensors for commercial cameras keep much of it proprietary. Some manufacturers encrypted some of the fields in their raw files. 

Leica published the spectral response of the M Monochrom, and I have it for the M9 and M8.

To add: the original question concerned excessive noise in the blue channel of silicon-based detectors. Noise is a function of dark current (the "background" current generated in the absence of light), and noise being more noticeable in one channel than in the other two means the quantum efficiency for that channel is low. This was true of the blue channel in first-generation detectors. Kodak improved sensor chemistry to extend blue response around 1995, probably ahead of other manufacturers. The sensor used in the Nikon D100 seemed to have poor blue response, remembering forum discussions of it and the Epson R-D1. If the QE is relatively even across the channels, as it is in the data sheets I've seen lately, there is no reason to expect one channel to be noisier than the others under normal lighting conditions. Use a darkroom red light for illumination, though, and the blue and green channels will be noisy.
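The QE-noise link above can be sketched with a simple shot-noise model. The QE figures are illustrative only, loosely echoing the modern Truesense numbers quoted earlier plus a guess at an early-1990s blue channel; the photon and dark-current counts are invented:

```python
import math

photons = 10_000   # photons reaching each photosite (toy, equal light per channel)
dark_e = 25        # dark-current electrons collected during the exposure (toy)

def snr(qe):
    signal = photons * qe                  # photoelectrons collected
    noise = math.sqrt(signal + dark_e)     # shot noise plus dark-current shot noise
    return signal / noise

for label, qe in [("modern blue", 0.40), ("modern green", 0.45),
                  ("early-1990s blue", 0.04)]:
    print(f"{label}: QE={qe:.2f}  SNR~{snr(qe):.0f}")
```

With even QE the blue channel's SNR sits right next to green's; drop the QE by 10x and the channel becomes visibly noisy, matching the first-generation behaviour described above.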


----------



## BrianV (Feb 19, 2013)

I've never used Rawdigger- just looked it up. It gives a good analysis of the data stored in the raw file, but this is not the same information as given in sensor data sheets. Data sheets give the performance specifications of the detector, down to electrical and spectral characteristics.

What RawDigger does now, I used to do in FORTRAN many years ago. I dug out some software written in the '80s to look at the DNG files from the M9 and M Monochrom. Hard to believe that processing files from the Monochrom is a problem for any vendor.


----------



## Helen B (Feb 19, 2013)

The raw file tells you all you need to know in this case; the sensor data alone is neither sufficient nor necessary.

Although the improved quantum efficiency for blue wavelengths helps, it is not the whole answer to this issue (an issue that we all agree is usually unimportant, of course).

What you seem to be forgetting, and what the RawDigger histograms I posted show, is that there are plenty of common light sources that result in a lower blue sensel response in practice, and hence a lower signal-to-noise ratio.


----------



## BrianV (Feb 19, 2013)

As I said, if you use a red light for illumination you will not get much response in the blue or green channels; they will be noisy. Most lighting sources are not so far off from the norm; color correction filters that cut 50% of one portion of the spectrum relative to another are usually all that is required.

If your histogram was of a sunlit scene, I would guess that the blue response of the sensor is much less efficient than green and red. This is where knowing the spectral response comes in handy. Basically, it would let you optimize selection of a color correction filter before taking the shot to minimize noise. You might choose a color filter that is stronger or weaker, more like what was done for film balanced for a different lighting sources. In the case of the early silicon detectors, it was more like orthochromatic film. 

If I were going to use color-correction filters to get a better signal-to-noise ratio, it would make life a lot easier to know the spectral response of the sensor, the transmission of the color correction filter, and the spectrum of the lighting source in order to get a correct white balance. The raw histogram data is a convolution of the three, along with the colors in the image.
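The "convolution of the three" can be written out as a toy calculation: the expected raw level in each channel is roughly the sum over wavelength of illuminant x filter x QE. All the curves below are crude invented Gaussians and ramps, not real filter or sensor data; they only show the structure of the arithmetic:

```python
import numpy as np

wl = np.arange(400.0, 701.0, 5.0)   # wavelength grid, nm
dw = 5.0                            # grid spacing for the sum

def bump(center, width):
    return np.exp(-0.5 * ((wl - center) / width) ** 2)

illuminant = 0.3 + 0.7 * (wl - 400.0) / 300.0   # tungsten-ish: rises toward red
cc_filter = 0.4 + 0.6 * bump(450.0, 60.0)       # bluish CC filter: passes blue, cuts red

qe = {"R": 0.43 * bump(620.0, 40.0),            # peak QEs echo the Truesense quote
      "G": 0.45 * bump(540.0, 40.0),
      "B": 0.40 * bump(470.0, 40.0)}

results = {}
for ch in "RGB":
    bare = float(np.sum(illuminant * qe[ch]) * dw)
    filtered = float(np.sum(illuminant * cc_filter * qe[ch]) * dw)
    results[ch] = (bare, filtered)
    print(f"{ch}: no filter {bare:6.1f}   with filter {filtered:6.1f}")
```

Run with real curves instead of these toys, the same arithmetic is what would let you pick a correction filter before the shot, as described above.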


----------



## amolitor (Feb 19, 2013)

Helen, I don't fully understand your charts.

The light source you're using is quite light on blue spectral content, isn't it? So with a perfect sensor you'd expect the blue to be quite low. Is that what you're saying with the charts?

I'm having a little trouble distinguishing issues about the actual spectrum (light falling on the subject) from issues about sensor efficiency. Your charts seem to be about both, since they're describing what was measured by the sensor?


----------



## Helen B (Feb 19, 2013)

The histogram is of a neutral target in 2850 K incandescent, as I mentioned in the description of it. It tells you at a glance that a CC30B would pretty much balance out the three channels in this and similar situations (should you wish to do so) - it's as simple as that, you don't _need_ any other information. That is the point I'm trying to make. Where do you disagree?


----------



## Caps (Feb 19, 2013)

BrianV said:


> histogram data is a convolution of the three along with the colors in the image.


This discussion is getting a little beyond my beginner's knowledge... but...

On the DxO Labs site, they have reviews of the Nikon/Canon camera sensors. Does the information they give clear up the points you are trying to make?


----------



## Helen B (Feb 19, 2013)

amolitor said:


> Helen, I don't fully understand your charts.
> 
> The light source you're using is quite light on blue spectral content, isn't it? So with a perfect sensor you'd expect the blue to be quite low. Is that what you're saying with the charts?
> 
> I'm having a little trouble distinguishing issues about the actual spectrum (light falling on the subject) from issues about sensor efficiency. Your charts seem to be about both, since they're describing what was measured by the sensor?



That is the whole point, and the direct simplicity of the method: the histograms take both the sensor response and the spectral content of the light source into account. They also take the ADC's behaviour into account, of course.

It reminds me of the difference between Kodak and Ilford spectral sensitivity curves for their films. Kodak use a radiometric method (i.e. based on radiant energy rather than visible-light response) and Ilford use a simple wedge spectrogram made in tungsten light. Ilford's method shows you roughly what to expect in practice, while Kodak's needs further interpretation but is more versatile. As an aside: unless you know the difference in method, you would misinterpret the difference in spectral response, of course.


----------



## amolitor (Feb 19, 2013)

I think the, uh, light is dawning. Maybe?

Can RawDigger predict what will happen with other light sources, or is it about telling you what's going on with THIS light source with great utility and precision?


----------



## amolitor (Feb 19, 2013)

Helen, is the RawDigger chart, plus head-scratching and thinking, and then selecting the right filter essentially what "setting a custom white balance" does?

(albeit, by post processing the raw data on the way OUT of the sensor, rather than balancing the light going IN to the sensor, the latter process having some advantages, um, some of the time)


----------



## BrianV (Feb 20, 2013)

The original post stated that silicon-based sensors will always be noisy in the blue channel. That was true many years ago: response in the blue channel was down 10x at 400 nm relative to a modern sensor, and the statement about noise in the blue channel was true. That tendency of the blue channel to be noisier than the other bands has been eliminated. Then the discussion moved on to using color correction filters to advantage when the lighting itself is biased towards one part of the spectrum. Using a color balance filter can alleviate this, but it is not worthwhile just to undo the white balance algorithm of the camera. The histograms show the blue response at about 40% of the green and red, and the minimum value of blue was well above the noise floor. A color correction filter is not required to defeat noise in the blue channel with this lighting source.

So if you use a calibrated lighting source, on a calibrated test target, without an external filter, you can basically back out the spectral response of the sensor. I prefer looking at data sheets to get the information.


----------



## amolitor (Feb 20, 2013)

If you have adequate light, isn't a CC filter always going to be better? Doing white balance in post will always mean amplifying the weaker channels, which will amplify any system noise.

This is all rather academic, since doing it in post is clearly good enough. I certainly don't intend to start correcting my light.
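The question can be put in toy numbers. Under a simple shot-noise-plus-read-noise model (all figures below invented), a digital white-balance gain scales signal and noise together, so it cannot raise a channel's SNR by itself; the CC filter only wins when it lets more photons into the weak channel, e.g. by exposing longer without blowing the strong channels:

```python
import math

read_noise = 5.0   # read noise, electrons rms (invented)

def snr(electrons, gain=1.0):
    # gain multiplies signal and noise alike, so it cancels out of the ratio
    return (gain * electrons) / (gain * math.sqrt(electrons + read_noise**2))

snr_gained = snr(1000, gain=4.0)   # weak blue channel, 4x digital WB gain in post
snr_filtered = snr(4000)           # filter cuts R/G, so exposure can be raised 4x
print(f"SNR with 4x digital gain:      {snr_gained:.1f}")
print(f"SNR with filter + 4x exposure: {snr_filtered:.1f}")
```

So the filter's advantage comes from collecting more photons, not from the filtering itself, which matches the earlier point about preserving headroom in the strong channels.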


----------



## BrianV (Feb 20, 2013)

If you want "correct" colors with a minimum of eyeballing, it would be nice to have a reference point. As shown, you can always back out the data required.

I guess a "gain flattening filter" to even out the light across all of the color channels would be the complement of the spectral response of the sensor combined with the light used to illuminate the scene: even response across the channels to minimize noise. That matters more when balancing channels on a WDM (wavelength-division multiplexing) fiber network than in photography.

Kodak used to sell a lot of film to the scientific/technical community, and it was used to produce radiometrically calibrated images. Truesense still targets the scientific/technical market.


----------



## Helen B (Feb 20, 2013)

amolitor said:


> I think the, uh, light is dawning. Maybe?
> 
> Can RawDigger predict what will happen with other light sources, or is it about telling you what's going on with THIS light source with great utility and precision?



Yes, but it's easy enough to look at the histogram for a neutral target in two or three different light sources to get a more general idea of what filters would be required.



amolitor said:


> Helen, is the RawDigger chart, plus  head-scratching and thinking, and then selecting the right filter  essentially what "setting a custom white balance" does?
> 
> (albeit, by post processing the raw data on the way OUT of the sensor,  rather than balancing the light going IN to the sensor, the latter  process having some advantages, um, some of the time)



The use of physical filters to balance the raw values would be generally coarser than white balancing in post, not least because of limitations in the availability of filters, but fine tuning isn't necessary. It's a little like modern colour negative film. In the good old days we had to try to get a rough correction when using colour negative film, close enough for final correction during printing. As colour negative film became more and more capable of handling a wider subject brightness range the need for correction in camera became less and less important, if the exposure was adequate. The highest quality results are still obtained by getting an approximate balance in camera, but in many cases there may be very little difference between that and just fixing it all in post.



BrianV said:


> The original post stated that silicon based  sensors will always be noisy in the Blue Channel. That was true many  years ago. Response in the blue channel was down 10x at 400nm with  regard to a modern sensor, and the statement regarding noise in the blue  channel was true. The tendency of the Blue channel to be noisier than  the other bands has been eliminated. Then the discussion got off to  using color correction filters as an advantage when lighting itself is  biased towards one part of the spectrum.



The inspiration for the original post appears to have come from a technical discussion on high-end modern cameras rather than cameras from many years ago. I think that considering the light source - sensor interaction is a more complete way of understanding this situation than sensor spectral response alone, but that is just my opinion.



> Using a color balance filter  can alleviate this, but it is not worthwhile to undo the white balance  algorithm of the camera.



I think that if you are happy using the white balance algorithm of the camera then you are certainly not going to be bothered about this sort of refinement. 



> The histograms show the Blue response is about  40% the green and red, and the minimum value of blue was way above the  noise floor. *Use of a color correction filter is not required to defeat  noise in the blue channel with this lighting source.*



I'm not sure that you can make that deduction (in bold) from the evidence in the histogram. The histogram shows that the subject has a brightness range of merely a third of a stop for the majority of values, and there is only a stop between the extreme values - the minimum value of blue is only one stop down from the maximum value of blue, so one would hope that it is well above the noise floor. Normal subjects can have a much greater brightness range, and it is the combination of both brightness range and naturally lower blue signal that would determine whether or not the low blue values are above or below the noise/acceptable resolution floor. The acceptable blue noise criterion is going to depend on the use - there will be some post processing circumstances in which clean blue matters more than it does in general.


----------

