When shooting with long exposures (several to tens of seconds) I noticed that in some cases the colors in the resulting photos change hue. This was on a Canon EOS 1100D, but I've also seen the same problem with some other cameras, like the Canon EOS 40D and 60Da. It even happens when only the aperture is changed, without any change to shutter speed.

Below is an example pair of shots from the 1100D: the first with a 20 s exposure, the second with 30 s. All other settings are identical (aperture f/3.5, ISO 100, focal length 18 mm). The photos were taken at almost the same time, deep in the night (21:49:02 and 21:50:05 MSK on 25.02.2019, at 60°N, 30°E). The images were produced from the RAW files (*.CR2): the pixel data were converted to linear sRGB using the cam2rgb matrix provided by LibRaw, and then gamma-corrected by raising the result to the power 1/2.2.
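The conversion described above can be sketched as follows. This is a minimal illustration: the cam2rgb matrix below is made up for the example, not the actual 1100D matrix (the real one comes from LibRaw's color data for the camera), and the gamma step uses the simple 1/2.2 power rather than the piecewise sRGB curve.

```python
import numpy as np

# Hypothetical cam2rgb matrix; the real one is provided by LibRaw
# for the specific camera model.
cam2rgb = np.array([[ 1.6, -0.5, -0.1],
                    [-0.2,  1.4, -0.2],
                    [ 0.0, -0.6,  1.6]])

def raw_to_srgb(raw_rgb):
    """raw_rgb: Nx3 array of demosaiced, black-level-subtracted
    camera RGB values scaled to [0, 1]."""
    linear = np.clip(raw_rgb @ cam2rgb.T, 0.0, 1.0)  # camera space -> linear sRGB
    return linear ** (1.0 / 2.2)                     # simple gamma correction

pixels = np.array([[0.2, 0.3, 0.5]])
print(raw_to_srgb(pixels))
```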

20 second exposure

30 second exposure

If you open them in two tabs and switch between them quickly, you'll notice that, aside from brightness, the hue also changes: the first is more bluish, while the second is more purplish.

To make it easier to see that the whole sky changes hue, not only the areas near the light sources, here's the same pair of photos with colors normalized so that R+G+B = const:

20 second exposure, normalized colors

30 second exposure, normalized colors
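The normalization above can be sketched like this (assuming it is done per pixel in linear space; `target` is an arbitrary constant brightness):

```python
import numpy as np

def normalize_chromaticity(img, target=0.5):
    """Scale each pixel so that R+G+B == 3*target, discarding
    brightness while keeping chromaticity (r = R/(R+G+B), etc.)."""
    s = img.sum(axis=-1, keepdims=True).astype(float)
    s[s == 0] = 1.0          # leave pure-black pixels untouched
    return img * (3 * target / s)

# Two pixels with the same chromaticity but 2x different brightness:
px = np.array([[0.1, 0.2, 0.7],
               [0.2, 0.4, 1.4]])
out = normalize_chromaticity(px)
print(out)   # both rows map to the same color
```

After this normalization any remaining difference between two frames is purely a chromaticity difference, which is what the comparison above is meant to show.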

What is the reason for this? Is this a known problem with long exposures? How can I compensate for it to get some continuity of hue for time-lapse animations where scene brightness changes dramatically over time (e.g. twilight), thus requiring different exposures? (By continuity I mean a continuous change of chromaticity regardless of exposure.)

Ruslan

5 Answers

How can I compensate for this to get some color continuity for time lapse animations where scene brightness dramatically changes in time (e.g. twilight) thus requiring different exposures?

When linear raw values are converted to gamma-corrected values, not all of them are multiplied by the same amount. A curve is applied to mimic the way human vision responds to varying levels of brightness. It's basically logarithmic, but there are "shoulders" at the brightest and dimmest values that are flattened out in the opposite direction from the main logarithmic curve.

When you shift the raw values recorded by the sensor higher by exposing longer, after raw conversion not everything in the scene will be brightened equally compared to the darker image, because your values fall at different points on the S-shaped curve applied during gamma correction. The midtones will all be boosted by more or less the same amount, but there are very few midtones in your example photos: almost all of the values are bunched up at one end or the other. The darker values will be brightened comparatively more in the longer exposure than the midtones, and the midtones more than the brightest values that approach, but do not reach, full saturation (values already blown out can't get any brighter than they already are in the darker image).
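The effect can be illustrated numerically with a hypothetical S-shaped tone curve (a smoothstep, not any camera's actual curve): scaling the linear input by a constant exposure factor does not preserve the ratios between output values, unlike a pure power-law gamma, which does.

```python
def s_curve(x):
    # Hypothetical S-shaped tone curve (smoothstep), mapping 0..1 -> 0..1.
    return 3 * x**2 - 2 * x**3

shadow, midtone = 0.05, 0.4
for gain in (1.0, 1.5):          # 1.5x gain ~ going from 20 s to 30 s
    a, b = s_curve(shadow * gain), s_curve(midtone * gain)
    print(f"gain {gain}: shadow -> {a:.4f}, midtone -> {b:.4f}, ratio {b/a:.1f}")
```

The midtone-to-shadow ratio after the curve differs between the two gains, so values sitting on different parts of the curve get boosted by different factors.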


Answer before the OP clarified that these two photos were not taken during twilight, as implied in the question:

Short answer: you can't, when mixing two light sources that differ that much in the type of light they output, and one stays constant in brightness and color while the other dims and shifts in color.

Brightness isn't the only thing in the scene changing during twilight. The color of the light from the sky shifts rapidly during twilight as well. If you correct to maintain the same color sky, your artificial light sources will shift. If you process to maintain the color of your artificial light sources, then you'll have to accept that the color of the sky is changing.

In the case of your two example images, you also increased exposure as the sky darkened. This changed the ratio of the influence of light from the sky to light from your artificial sources, because the sky was not as bright when the second image was captured, while the artificial sources were just as bright and were exposed for 50% longer.

Even if you used identical white balance settings to develop both images, one would expect the color to shift: due to the longer exposure, more of the light in the second image comes from the artificial sources and less of it from the sky, which is itself changing color as well.

Michael C
  • I think you misunderstood the setting of the two photos. They were done deep in the night, where the sky wouldn't be as bright if there were no light pollution. And both photos were taken in the same minute, so the sky didn't change at all. The twilight was just an example of what type of changes of brightness to expect. The main problem is that when I do the timelapse with one exposure, I get continuous hue change, and when I switch to another exposure, the hue abruptly changes and then continuously evolves in time again. – Ruslan Mar 06 '19 at 05:43
  • @Ruslan If the two photos were not exposed at twilight, why did you mention it? That's kind of misleading. – Michael C Mar 06 '19 at 21:19
  • Well, I just wanted to justify switching exposures. – Ruslan Mar 06 '19 at 21:29
  • @Ruslan That would only justify switching exposures when the conditions were actually at twilight. Answer has been updated. – Michael C Mar 06 '19 at 21:32
  • Not sure what you mean by the S-shaped curve. Isn't transformation from linear sRGB to actual sRGB basically exponentiation by a constant (approximately, with γ≈1/2.2)? This function doesn't look like S – more like curved Г. – Ruslan Mar 06 '19 at 21:42
  • @Ruslan Not typically. But even if it is applied in a purely exponential way, different values from the original raw values will not be multiplied by the same amount. Because it is logarithmic, brighter values get multiplied by a larger factor than dimmer values do. Increasing the exposure will not result in all values being raised by the same amount if the same gamma correction curve is used. Of course the brightest values are already blown and can't get any brighter than they already are. Thus your crooked Γ just moves to the left at the top.. – Michael C Mar 06 '19 at 21:59
  • I think I still don't understand what type of conversion of RAW values to sRGB you have in mind. I'm thinking of the sequence: 1) Linear transformation RAW→linear_sRGB (via cam2rgb matrix provided by LibRaw), 2) gamma correction to get to final sRGB (by the sRGB specification with that piece-wise definition with γ≈1/2.4, or approximately with γ≈1/2.2). What exactly process do you mean instead? – Ruslan Mar 06 '19 at 22:06
  • @Ruslan Gamma correction is an exponential, not linear, operation. – Michael C Mar 07 '19 at 20:02
  • If one graphs the equation y = x² one gets a parabola, not a straight line. You seem to think such an exercise would yield a straight line. – Michael C Mar 07 '19 at 20:04
  • If we compare two sections of y = x² they will be different. The section between x = 0 and x = 10 will not be shaped the same as the section between x = 5 and x = 15. You're expecting the curves to match in shape with one just higher up on the y axis. They don't. – Michael C Mar 07 '19 at 20:06
  • Erm... I may have expressed myself badly, but I don't confuse exponential with a linear function. Gamma correction is listed as the second stage of conversion from RAW to sRGB in my comment. Only the first one is a linear transformation. Second is exponentiation, and it always, not only "typically", has a graph different from S shape. – Ruslan Mar 07 '19 at 20:49
  • Actually, now I see that you seem to be talking about brightness in your answer, which is kinda irrelevant to the OP. The OP is about chromaticity (xy or rg), which shouldn't change due to exposure, since chromaticity is defined only by ratios between XYZ components. – Ruslan Mar 07 '19 at 20:53
  • @Ruslan Forget the S-shape of typical light curves applied in raw conversion. My previous three comments do not assume an S-shape at all. A pure gamma correction is still logarithmic. After conversion, the relationship between values starting with 0-10 before conversion will not be the same as starting with 5-15 before conversion. The slope of the exponential function alone (without any S-curves applied) will be different for different sections along the x-axis. – Michael C Mar 07 '19 at 20:53
  • Chromaticity will change when various light sources in the scene at different brightness levels have different hues. When you raise the exposure level, the input level of those various light sources are placed at different points along the x-axis than they were with lower exposure. Thus the relationship between them (in terms of ratios of the brightness of one to the others) will change, which affects the overall hue of the image. – Michael C Mar 07 '19 at 20:56
  • To put it another way: When one is talking about mixed light sources, the relationship of the relative brightnesses of each light source will have an effect upon the ultimate hue of the image. – Michael C Mar 07 '19 at 20:59
  • But change of exposure changes photon count from all of these light sources by the same multiple (up to shot noise), doesn't it? Why would then one light source get a larger boost in brightness than the others? The scene doesn't change — only the camera settings do. – Ruslan Mar 07 '19 at 21:02
  • NO. The logarithmic gamma correction means lower values are not amplified as much as higher values are (or vice versa depending on whether the gamma correction value is less than '1' or more than '1'). That's what "logarithmic" or "exponential" means. For y=x², consider the following sequence (0,0), (1,1), (2,4), (3,9), (4,16), (5,25). Now compare that to (3,9), (4,16), (5,25), (6, 36), (7,49). If you plot both of those series, the shape of the curve is different for each set. It's more than just one set is higher along the y-axis than the other. – Michael C Mar 07 '19 at 21:08
  • The proportional relationship between y coordinates is different for x = |0-5| than for x = |3-7| – Michael C Mar 07 '19 at 21:09
  • Why are we talking about gamma correction so much? Chromaticity is calculated before gamma correction, in linear space, and this calculation cancels all the information about brightness. See the definition of e.g. r chromaticity coordinate: r=R/(R+G+B) where R, G and B are linear RGB-space values. Any multiplier is canceled by the division. – Ruslan Mar 07 '19 at 21:14
  • Chromaticity of your images should be the same before gamma correction. But chroma is also affected by gamma correction when different hues have different brightnesses before gamma correction. Your eyes do not see the individual hues of each light source in the resulting photo. They see the sum total of all of the hues combined. When you raise the same components by differing ratios, it will alter the combined hue. – Michael C Mar 07 '19 at 21:20
  • Do I understand you correctly, that you mean that chromaticity appears to differ due to the raising to the power of 1/2.2 of the pixel values? If yes, do you take into account the fact that the eyes will still get the linear brightnesses we worked with before gamma correction, because the monitor will do the reverse operation, i.e. raise the values to the power of 2.2 before passing them to DAC? – Ruslan Mar 07 '19 at 21:33
  • Gamma correction in raw processing is not the same exact thing as gamma correction between a graphics adapter and a monitor. They are related concepts, but the same concept is applied differently at different steps in the pipeline between a raw image file and an image displayed on a monitor (or printed by a printer, which also use their own form of gamma correction). It is unfortunate that we use the same term for both. – Michael C Mar 07 '19 at 21:50
  • Could you point me to some resource to read about the "other" gamma correction — the one unrelated to monitor? – Ruslan Mar 07 '19 at 21:52
  • When you apply gamma correction to raw image data, the transformation is reflected in the rasterized RGB values for the resulting image file (TIFF, JPEG, PNG, etc.). When your GPU translates those RGB numbers and sends that raster image to a monitor, it applies additional gamma correction (required by CRTs) that is not reflected in the RGB values before it translates them. When an LCD monitor "reverses" gamma correction, it is only the gamma correction applied by the GPU that is reversed, not the gamma correction done in raw image processing. That does not get "reversed" by the monitor. – Michael C Mar 07 '19 at 21:56
  • Well, at least by default on all the GPUs I used this additional gamma is 1.0. And I did measure light output from the monitor as I change RGB 8-bit values: the monitor shows its gamma 2.2 quite well with respect to these exact values, with no additional gamma measurable. – Ruslan Mar 07 '19 at 22:00

It turns out I was unknowingly doing "as shot" white balance correction while processing the raw files. Namely, I was using the cam_mul coefficients, which change from image to image (in particular, they represent the white balance of the embedded JPEG), and the particular camera I used for these photos (I have two of the same model) had its WB mode set to Auto, which I forgot to check when shooting.

The behavior of RawTherapee added to my confusion: when I chose the "Neutral" processing profile, I assumed it would reset to consistent, shot-independent settings. But it turns out the White Balance Method setting remains "Camera", leading to the same discrepancy I got with my manual LibRaw-based conversion.

I've now tried replacing my usage of cam_mul with pre_mul (daylight WB), which appears to be constant regardless of the scene (at least it's identical for the two problematic photos in the OP), and the hues now match. Similarly, choosing the same white balance method in RawTherapee for both shots gives matching hues too.
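The difference can be illustrated with LibRaw-style per-channel WB multipliers. The values below are made up for the example, not the 1100D's actual cam_mul/pre_mul: per-shot cam_mul values yield different chromaticity for the same scene radiance, while a fixed pre_mul keeps it constant across shots.

```python
import numpy as np

def chromaticity(rgb):
    # r, g, b coordinates: each channel divided by R+G+B.
    return rgb / rgb.sum()

scene = np.array([0.30, 0.45, 0.25])        # raw camera RGB of a sky patch

# Hypothetical WB multipliers (normalized so G = 1, as LibRaw reports them).
cam_mul_shot1 = np.array([2.1, 1.0, 1.5])   # auto WB chosen for shot 1
cam_mul_shot2 = np.array([1.8, 1.0, 1.7])   # auto WB chosen for shot 2
pre_mul       = np.array([2.0, 1.0, 1.4])   # fixed daylight WB

print(chromaticity(scene * cam_mul_shot1))  # differs between the two shots
print(chromaticity(scene * cam_mul_shot2))
print(chromaticity(scene * pre_mul))        # same multipliers every shot -> consistent hue
```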

Ruslan

I've seen cold-start issues: go out into the cold night with a cold camera, heat the CMOS with a long exposure, and the first cold-CMOS image has different response parameters than the next exposure, which starts with a warm CMOS.

But who gets long exposure settings right on the first try, LOL? So we're usually not in a position to notice.

Roger Krueger
  • This actually seems realistic for the pair of photos I show in the OP. The first one was made right after I got outdoors into the cold. But I've also seen a similar problem when I did timelapses indoors, where ambient temperature was almost constant. But yes, I've tried several times to reproduce problem, and it's quite elusive, so indeed might be related to temperature. – Ruslan Mar 11 '19 at 05:10

For a reason similar to why the lights in the question Why did this image turn out darker? change their appearance: the changed exposure time altered the ratio of the (near-zero) ambient light from the dark sky to the warm haze of the artificial lights.

In the second picture the artificial light is much more pronounced and, due to the haze, affects the sky as well. That is why the effect is almost unnoticeable at the top of the photo, far away from the skyline.

ths
  • In the post you link to ratio of luminance due to flash vs luminance due to the scene changed between the shots. In my question there's no camera-controlled light source in this pair of photos, only exposure changed. And there's no blacked-out parts of the image to make chromaticity undefined, so this shouldn't be a problem. The effect is noticeable in the whole photo (aside from overexposed parts), if you increase its brightness. – Ruslan Mar 06 '19 at 10:04
  • I've added a version of the photos with colors normalized to constant brightness so that you could see that the whole sky is affected almost equally. – Ruslan Mar 06 '19 at 10:14
  • the answer is still the same, you're seeing haze from the artificial lights vs. the black sky. – ths Mar 06 '19 at 10:36
  • What do you mean? There's no black sky at all! The whole question is about hue of the haze. – Ruslan Mar 06 '19 at 10:37

As you know, this camera sports a CMOS imaging chip. The heart of this chip is millions of tiny photosites, and the design features many sophisticated functions. Each photosite contains a converter that reads out the amount of charge induced by photon hits and converts that charge to a voltage. Additionally, each site contains an amplifier to boost the signal to a usable level.

In a perfect world, each photosite would function exactly like its fellow sites. In reality, each reacts to light slightly differently, and each amplifier operates with slightly different efficiency. Heat enters into the mix: the longer the exposure, the more the chip heats up, and this increases the disorder.

Engineers design cameras with the goal, "make a faithful image". This has never been fully achieved, be it with a digital or a film camera.

P.S. --- In the photofinishing industry we often found that automated color printers would induce a slightly different hue when reprinting a negative previously printed with an identical setup. The hue differences were most often traced to microscopic differences in the placement of the negative in the negative gate. In other words, the sensors that report color balance send different data induced by just a slight repositioning. Perhaps not the same thing, but perhaps your hue variance is influenced by changes in positioning and by subject fluctuations, like neon lights flickering, etc.

Alan Marcus