I've received a large number of emails on the subject of linear light over the course of running this blog.
I intend to break the following down into a series of blog posts if the feedback is high enough. I doubt it will be, but alas, for those who have emailed, this first installment is dedicated to you. Whether you care or not is a question only you can answer.
I'm willing to bet that the audience that reads this blog is either already acutely aware of the issue, or cares but hasn't yet explored it to their satisfaction. If you fall into the latter group, I can only hope this post serves you well and leads you further along the path of exploration.
The following is an entirely worthless piece written from an entirely delusional and dimwitted vantage. Errors, omissions, glaring idiocy, and other like details are entirely possible and expected.
Linear light? Why do I care?
If you have been compositing images, you may have come across the following. The example is a simple fully saturated magenta, slightly blurred and composited over the top of a neutral grey tone:
I'll try my best to not make it too painful...
Aliens and Audio
Imagine, if you will, an alien race. Perhaps they are willowy blue things with tribal patterns, or something with an oblong head, razor teeth, and acid blood. It doesn't matter.
Now imagine that these aliens live on a planet where the range of sound goes from zero to one via our SupahDupaAudacity Meter. If we were to graph the alien landscape audio range to the sound level, it might look like:
[Image: Alien Audio Reality]
For whatever reason, be it the quiet flapping of their food's wings or the need to listen to high fidelity rock music, the aliens have evolved in such a way that a particular range of the audible spectrum is far more sensitive to them. Away from that range, their hearing falls off. Heck, they can't even hear levels around 0.3 and below. Above, say, 0.7, the audio vibrations all sound the same.
Their 0.4 to 0.5 range, however, is superb! They can hear within that range like an alien dog. In fact, they have evolved in such a way that 90% of their capacity to hear audio tones is dedicated to that thin little range of noise.
If you could visualize this, it might look something like a curve, with the perception of audio falling off at around 0.3, being extremely solid in between 0.4 and 0.5, and gradually tapering off toward 0.7. Perhaps a graph of the situation would look something akin to:
The Alien Reality
Out in the physical alien world, the audio behaves according to the laws of their alien physics. Let's say 0.1 unit of noise plus 0.1 unit of noise equals 0.2 units of noise. This is sort of the simple math behind their reality.
However, to the aliens, this simple little bit of reality is entirely bent. They hear the world through warped ears. That teeny little bump in physical reality is actually nearly their entire quality range of hearing!
Back to Our World
The scenario I described above is loosely what is happening in our world regarding perception. On the one hand, our physical world behaves very uniformly and reliably. One unit of light plus one unit of light equals two units of light. All is well and good.
Within our perception system, however, just like the aliens, we see things through a bent and warped set of sensors. Our eyes are in fact nonlinear receptors. Certain ranges of light are entirely visible to many of us, and some other ranges are utterly invisible.
Physical reality, however, maintains no such bias.
What the Heck Does This Have to Do With Overlaying Images?
Well, we are getting there...
First, we have to accept that our eyes see things in a curved and warped fashion. Second, we must also accept that our physical reality can be modeled much more easily without reference to human perception. Where our eyes see things in a bent and warped fashion, reality itself can be modeled to behave in a predictable and mathematically uniform manner.
Modelling Reality in a Computer
In our computers, we can represent our abstract reality using any number of models. One such model might be RGB, where we mix the three channel colors to arrive at a gamut of colors. We can represent this gamut of colors using units of red, green, and blue.
This model, however, as a result of historical displays and our bent view of the world, does not accurately depict reality. Why not?
Well, in our above example, the aliens have a sensitive range of hearing between 0.4 and 0.5. If the aliens were designing an audio application, it might only offer the ranges they can hear - between 0.3 and 0.7. The net result is that the application would stretch that entire 0.3 to 0.7 range across its interface, devoting most of it to the 0.4 to 0.5 sweet spot. After all, that's where the aliens' audiophile zone is, and they probably weren't too concerned with things they couldn't hear.
In our real world dilemma regarding perception, the same is true. The digital imaging applications you have come to love have somewhat offered up a warped view of the world that appears perfectly fine. In fact however, in a large chunk of applications, those red, green, and blue channels in our color model are warped and bent just like our alien audio application.
That, however, doesn't stop a computer from needing to do plain math with our data. See, our computers and the real physical world share an interesting similarity: their math models behave consistently. One plus one in the computer's registers and memory will equal two, every time. Equally, one unit of red plus one unit of red always equals two units of red. No shocker here.
The impact of that math, however, is radically different depending on the context. For our alien planet, 0.1 units of audio plus 0.1 units of audio is a mere 0.2 units of audio. As our aliens discovered above though, those mere 0.2 units of audio nearly cover their entire most sensitive audio range!
Herein lies the crux of the problem. We have two different models, and they don't play nicely together.
Over and Alpha
To finally encapsulate this puzzle, it is probably wise to look at a simple bit of math. The fundamental math for an alpha over operation using straight alpha on a fully opaque background is:
Output.RGB = (Over.RGB * Over.Alpha) + (Background * (1 - Over.Alpha))
If the math scares you, don't let it. It's very simple. In fact, we are only going to focus on the first portion of the formula: Over.RGB * Over.Alpha, or more simply, a number multiplied by another number.
While the underlying values of the numbers might be stored in a unique manner, we can perceive the Over.RGB value as a series of red, green, and blue values. The Over.Alpha could be viewed as a single value that ranges from 0 to 1.
In the case of the RGB values, we have a series of three values that are roughly akin to the alien audio application example - bent. Within our alien audio application, the values are warped according to the needs of the application and the dynamic range of the alien ears. With our RGB values, we have historical display hardware and the nature of human perception to account for the warping.
With alpha we have a linear system. Ten percent plus ten percent always equals a uniform twenty percent.
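To make the formula concrete, here is a minimal, illustrative sketch of the straight alpha over operation in Python. The names mirror the formula above rather than any real imaging library's API:

```python
# A minimal sketch of the straight-alpha "over" operation on a fully opaque
# background. Values are floats in the 0 to 1 range.

def over(over_rgb, over_alpha, background_rgb):
    """Composite a straight-alpha foreground over an opaque background."""
    return tuple(
        fg * over_alpha + bg * (1.0 - over_alpha)
        for fg, bg in zip(over_rgb, background_rgb)
    )

# Full magenta at 50% alpha over a neutral grey:
print(over((1.0, 0.0, 1.0), 0.5, (0.5, 0.5, 0.5)))  # (0.75, 0.25, 0.75)
```

Note that the multiplication happens directly on whatever numbers the RGB channels hold - the formula itself has no idea whether those numbers are bent or linear.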
Mixing the two systems is where things get potentially ugly. Consider the alien audio application again. If the aliens have recorded an awesome musical track that flutters around 0.5 in their audio application's range, the real-world value might sit somewhere about the middle of their bent audio perception range - between the upper limit of 0.7 and the lower limit of 0.3. Let's say it is around 0.5.
So what happens if the aliens want to decrease the audio by about 50% in their audio system?
The math is simple: the 0.5 in the middle of their audio range, multiplied by 50%, is 0.25! Simple! The final musical track has an average value of 0.25 as opposed to the original 0.5.
Except it isn't quite what the aliens were hoping for.
Sadly, this is entirely below the aliens' audible floor of 0.3. This is far quieter than they wanted; after all, they were after a result that was around 50% less loud to their ears, not 50% less in physical reality. Thanks to the bent audio application and simple math, however, they got something radically different.
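To put toy numbers on the mishap: the aliens' exact hearing curve is never specified, so the 2.2 power curve below is purely a stand-in assumption, but any bent curve shows the same shape of failure. Halving a value in the warped application space does not halve the physical result:

```python
# Toy model of the alien audio mishap. The 2.2 exponent is an assumed
# stand-in for the aliens' (unspecified) hearing curve.

GAMMA = 2.2

def encode(physical):
    # Physical sound level -> warped value stored by the audio application.
    return physical ** (1.0 / GAMMA)

def decode(app_value):
    # Warped application value -> physical sound level.
    return app_value ** GAMMA

physical = 0.5
app = encode(physical)     # ~0.73 as stored in the warped application
app_halved = app * 0.5     # the "50% quieter" adjustment, done in app space
print(decode(app_halved))  # ~0.11 physical -- far less than half of 0.5
```

Halving in the warped space lopped off almost 80% of the physical level, which is exactly the kind of surprise the aliens ran into.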
Enter Linear Light
So what to do? Clearly the bent and warped perception baked into their application isn't quite working out the way they want. If they keep the adjustments minor, they might not even realize there is a problem at all, despite the fact that the math is totally incorrect. The net result is likely a bunch of confused alien audio artists.
Well, what if the aliens could somehow "correct" their work so that it operates in the full zero to one range present on their planet? What if they could bend their audio application's model back to reality and do all of the math there?
They would need to invert their audio samples back to their planet's physical model, do the work, and then, just before listening, bend the result back to fit the ranges their ears work best in.
This is precisely how a linear light workflow works.
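In imaging terms, that loop looks like the sketch below. The transfer functions are the standard piecewise sRGB curves; the 50/50 mix is just an illustrative operation:

```python
# Decode -> do the math in linear light -> encode for display.
# These are the standard piecewise sRGB transfer functions.

def srgb_to_linear(c):
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def linear_to_srgb(c):
    return 12.92 * c if c <= 0.0031308 else 1.055 * c ** (1.0 / 2.4) - 0.055

# A 50/50 mix of display black and display white, done in linear light:
mixed = (srgb_to_linear(0.0) + srgb_to_linear(1.0)) / 2.0
print(linear_to_srgb(mixed))  # ~0.735, not the 0.5 a naive sRGB average gives
```

The roughly 0.735 answer is the sRGB value that actually emits half the light of white - quite different from the naive 0.5.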
A Long Path to the Reason
If you have managed to get this far, you may be able to figure out what has happened with the simple image that started it all.
The math being performed on our magenta overlay is fundamentally broken. Our magenta value and the grey background live in warped sRGB land, while the alpha lives in linear land. When we smash the two together via a multiplication, the result is very much broken when it comes to reality and how light behaves.
There has been a large movement in rendering and imaging applications to fix this. How? Well, on a simple level, all that happens is that images that come in with sRGB values are "inversely bent" out of our historical bent space and flattened into a linear space. If you need to visualize this, it would be as though we started with the "Alien Ear" image above and flattened it back to the "Alien Audio Reality" model.
This conversion to a more useful model makes the math hold up wonderfully. Then, just prior to the image being displayed, it is bent back to the warped version our eyes expect.
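As a sketch, here is the magenta-over-grey mix done both ways - naively on the sRGB values, and through a linearize, composite, encode round trip. The helper curves are the standard sRGB transfer functions; the rest is just the over formula from earlier:

```python
# Comparing a naive sRGB-space composite against a linear-light composite.

def srgb_to_linear(c):
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def linear_to_srgb(c):
    return 12.92 * c if c <= 0.0031308 else 1.055 * c ** (1.0 / 2.4) - 0.055

def over_naive(fg, alpha, bg):
    # Math done directly on the warped sRGB values.
    return tuple(f * alpha + b * (1.0 - alpha) for f, b in zip(fg, bg))

def over_linear(fg, alpha, bg):
    # Decode to linear light, composite, then encode back for display.
    fg_l = [srgb_to_linear(c) for c in fg]
    bg_l = [srgb_to_linear(c) for c in bg]
    out = [f * alpha + b * (1.0 - alpha) for f, b in zip(fg_l, bg_l)]
    return tuple(linear_to_srgb(c) for c in out)

magenta, grey = (1.0, 0.0, 1.0), (0.5, 0.5, 0.5)
print(over_naive(magenta, 0.5, grey))   # green channel sinks to 0.25
print(over_linear(magenta, 0.5, grey))  # green ends up around 0.36
```

The naive blend drags the green channel down much further than linear light would, which is the dark, dirty fringe you see around the blurred magenta edge.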
And that brings us back full circle to arrive at Exhibit B way back at the start of this diatribe.
In the interest of brevity and audience, this post is obviously an overly simplified view of the issue, in the name of getting the core foundation across. There are similar "bent and warped" complexities beyond the simple relative values of colours, including colour coordinates and other strange warping. That leads down the path of colour management.
Hopefully though, the rather silly introduction here has sparked your interest into those topics.
Alpha operates on a linear model. The RGB values in the bulk of images are bent and warped. Mixing the two results in bent math that is fundamentally broken.
If this was useful, let me know and perhaps I'll try and flesh out some other whacky things...
- Charles Poynton's Famous Gamma FAQ.
- A guide to linear light in Nuke.
- Charles Poynton's Famous Color FAQ.
- Stu Maschwitz, one of the first evangelists of linear light.
 This is further complicated by the fact that some people may have entirely different vision needs and/or differences. Some may be able to see much further than an average person; some may have a very particular sensitivity to colors, or only see within certain ranges.
 In fact, there are legitimate reasons to have an application that covers the ranges you cannot see or hear, but that is beyond the scope of this post.
 The over operation for a premultiplied alpha image is Over.RGB + (Background * (1 - Over.Alpha)). This subject is likely beyond the discussion here.