The matter of overexposure on a sensor is a tricky one. To make it easier to picture, I prepared a graphic that shows what happens inside the silicon itself. Of course, real sensors are more complicated, with microlenses, color filter stacks, and much more. But if we simplify, each little square under the mosaic is a sensor well. Each well is a collector of photons. A useful way to imagine each well is as a vessel that fills with liquid: photons pour in like drops of water. Each vessel has a capacity. Once it is full, it cannot tell us how much more light was really there.
Now, in practice, cameras avoid giving us the “messy part” of that story. The analog-to-digital converters only digitize a range where the signal is reliably linear. So in our data there is no slow bending away from linearity, just a hard stop. The raw values we receive are like the liquid gauges shown in the graphic.
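To make that hard stop concrete, here is a minimal sketch of one well feeding an ADC. The full-well capacity and bit depth are illustrative assumptions, not values taken from any particular camera.

```python
# A toy model of one sensor well, assuming (for illustration only) a full-well
# capacity of 50,000 electrons and a 14-bit converter. Real sensors differ.
FULL_WELL = 50_000        # electrons the well can hold before it saturates
ADC_MAX   = 2**14 - 1     # highest raw value the converter will report

def raw_value(photons_collected):
    """Map light hitting one well to the raw number we get in the file."""
    electrons = min(photons_collected, FULL_WELL)   # the vessel overflows here
    # Linear scale into ADC counts, then the hard stop at the digital ceiling.
    return min(round(electrons / FULL_WELL * ADC_MAX), ADC_MAX)

for photons in (10_000, 40_000, 50_000, 80_000, 200_000):
    print(photons, "->", raw_value(photons))
# Everything at or above the full-well point collapses to the same maximum raw
# value: the gauge reads "full" and cannot say how much light was really there.
```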
ColorPerfect works in a very pure way: each pixel is treated on its own, one by one, without reference to its neighbors or the geometry of the scene. So if we look at one example pixel, we may find that red and blue are measured correctly, but green has spilled over. The well is full. In histogram terms, those pixels are the spike at the right edge.
Here comes the problem: when green is maxed out, we no longer know how much more should have been there. Was it just one extra drop? Was it double or triple? We cannot know. What we do know is that something is missing. And the absence of green eventually appears as its complementary — magenta.
This does not show itself all at once. With SmartClip off, the defect grows gradually: a little missing green may only dull the color, while more missing green drifts into magenta. The information is already unreliable, but the eye only notices once the error becomes large.
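To put rough numbers on that, here is a tiny sketch with invented pixel values, normalized so that 1.0 is the clipping point. Whatever the true green really was, the recorded triple comes out the same, so the data alone cannot say whether the result will merely look dull or turn plainly magenta.

```python
# A sketch of how the green information is lost. The "true" values are made up
# for illustration, not taken from a real file; 1.0 is the clipping point.
CLIP = 1.0

def record(r, g, b):
    """What the sensor actually hands us: each channel capped at the clip."""
    return (min(r, CLIP), min(g, CLIP), min(b, CLIP))

# Red and blue are measured correctly; only green exceeds the well capacity.
for true_green in (1.05, 1.2, 1.5, 2.0):
    true     = (0.90, true_green, 0.95)
    recorded = record(*true)
    missing  = true_green - recorded[1]
    print(f"true G {true_green:.2f} -> recorded {recorded}, "
          f"missing green {missing:.2f}")
# The recorded pixel is identical in every case, so a mild loss and a severe
# one look the same in the data; only the unknowable missing amount decides
# how far the rendered color drifts toward magenta.
```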
When Dave and I first designed PerfectRAW, this problem demanded an answer. That is when SmartClip was born. It is not a magic cure, but a principled choice: do not invent color where the signal is broken. Instead, take brightness from the data, and allow the user to guide the missing color. That is what SmartClip lets you do. With a click, you tell ColorPerfect: “This should have been blue,” or “This should have been yellow.” The program then uses that direction to restore harmony.
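As a rough illustration of that principle, here is a sketch in which a clipped pixel keeps the brightness measured by the sensor while taking its hue from the color the user clicked. The function, the sample values, and the brightness measure are all assumptions made for the sketch; this is not ColorPerfect's actual algorithm.

```python
# Principle only: keep the brightness the data still gives us, take the hue
# from the user's click ("this should have been blue"). Not the real SmartClip
# code; the sample values are invented.
CLIP = 1.0

def repair(pixel, reference):
    """Rebuild a clipped pixel from the hue of a user-clicked reference color."""
    if max(pixel) < CLIP:
        return pixel                          # nothing clipped, trust the data
    brightness = sum(pixel)                   # crude brightness from the data
    hue = [c / sum(reference) for c in reference]   # channel proportions only
    return tuple(brightness * h for h in hue)

# A pixel on the blue wall where one channel hit the ceiling, and the color
# the user clicked on an undamaged part of the same wall:
damaged   = (0.55, 1.00, 0.90)
blue_wall = (0.30, 0.45, 0.85)
print(repair(damaged, blue_wall))
# The repaired pixel keeps roughly the recorded brightness but takes its hue
# from the clicked wall color rather than from the untrustworthy clipped data.
```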
It is possible that one day we may find new solutions here. But for now, SmartClip remains as it is, and it has proven reliable.
Returning to Alexis’s image: why do we see artifacts on the blue wall, or along the edges of the wing-shaped lamps? It is exactly this: one channel has clipped, and evidently only just barely, hence the scattered “artifacts”; the program refuses to trust the color there. With SmartClip active, a proper click on the blue wall and another on the yellow screen will set things right. But for the lamps themselves, which are meant to be pure white light, the correct treatment may require some touch-up in Photoshop, at least where they overlap the yellow screen.
The easy answer is not to overexpose in the first place, but that can be hard.
