I want to start out by apologising for the lack of pretty graphics in here, which is a little odd for a post about visual quality. The reason is simple though – I’m currently typing this on my laptop, watching the Windows 7 system recovery progress bar loop on the screen of my main PC (uh… yeah, it’s behaving a little oddly. But that’s another topic entirely).

Ok, now that you know you’re in for a lot of monotonous text, let’s get on with it!

Gamma correct rendering may sound like a simple enough concept at first, but doing it correctly can be very challenging – especially once you throw hardware variations into the mix. Possibly worse is that it’s something you must keep in mind throughout development, and educate your teammates about. Alternatively, you can ignore the issue completely and live with the consequences, but it will come back to bite you down the road. Repeatedly.

First some definitions:

• <Gamma/sRGB/Linear> space
The mapping from raw data values to the represented values for the data you’re working with (could be bytes, floats, or an RGB triple). Linear is easy, as it’s an exact match. Gamma and sRGB, on the other hand, define a curve that devotes more raw data values to the lower range of the represented values than to the higher range. For the purpose of this post we’ll treat gamma and sRGB as the same thing (though strictly they aren’t: sRGB refers to one very specific curve, while gamma can be any curve).
• To gamma a texture/value
To convert from linear space to gamma space. This will result in raw linear values (the same as their represented values) of 0.2 going to raw gamma values of ~0.5 (but still representing 0.2). The exact values depend on the curve of the gamma space that you convert to of course.
• To degamma a texture/value
The opposite of the above; to convert from gamma space to linear space by applying the inverse of the gamma curve. So, with the above example you would get back the original linear raw value of 0.2 from the gamma value ~0.5.
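To make the definitions above concrete, here’s a small Python sketch of the standard sRGB transfer functions (the constants are the ones from the sRGB specification; the function names are my own):

```python
def srgb_encode(linear):
    """Gamma: convert a linear-space value in [0, 1] to sRGB space."""
    if linear <= 0.0031308:
        return 12.92 * linear
    return 1.055 * linear ** (1.0 / 2.4) - 0.055

def srgb_decode(srgb):
    """Degamma: convert an sRGB-space value in [0, 1] back to linear space."""
    if srgb <= 0.04045:
        return srgb / 12.92
    return ((srgb + 0.055) / 1.055) ** 2.4

# A raw linear value of 0.2 lands at roughly 0.48 in sRGB space --
# close to the ~0.5 figure quoted above.
print(srgb_encode(0.2))               # ~0.4845
print(srgb_decode(srgb_encode(0.2)))  # ~0.2 (round trip)
```

Note how the bottom fifth of the linear range gets almost half of the sRGB range – that’s the “more raw values for the lower ranges” behaviour in action.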

So what does it really take to be gamma correct? There are 3 primary areas of concern: the pipeline, the shader, and the render target.

The Pipeline and Tools

• Some of the source data that you’re given will be in gamma space, and other source data will be in linear space. The pipeline has to know what it has, and what it should do with it. Easier said than done, as this requires metadata to be present – either artist-set, or set automatically based on usage. Having artists specify how everything should be interpreted is obviously the easiest choice when faced with having to shoehorn gamma correct behaviour into an existing pipeline without the backend architecture to support it, but it carries the consequence that user errors will be abundant.
• If you’re going to perform any processing on textures in the pipeline (resizing, mipmap generation, blending the edges of cubemap faces, etc), the operations must be done in linear space. The gotcha is that you must operate with as much precision as possible throughout this process to avoid quantization issues from the conversion to linear space and back again. This usually means converting to a floating point texture immediately, and only converting back as the very last step. And yes, you’ll probably have trouble achieving this with some of the external libraries you use. So go and modify them too (and diverge from what’s in SVN, making taking updates that much harder). Fun stuff.
• The final conversion to gamma space after you’ve done your processing may also bite you due to hardware variations. If you’re lucky you’ll only be targeting platforms that have proper support for sRGB – but many are not so lucky, and will be in for a world of fiddly pain thanks to hardware-specific gamma curve quirks. At least that behaviour is documented now, which has only happened in the last couple of years. Extremely fun stuff.
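To show why “process in linear space, quantize last” matters, here’s a hypothetical Python sketch of a single 2×1 box-filter step (the building block of mipmap generation), done both the wrong way (averaging the raw gamma-space bytes) and the right way (degamma to floats, filter in linear space, regamma and round back to 8 bits only at the end):

```python
def srgb_decode(v):
    """Degamma an 8-bit sRGB value to a linear float in [0, 1]."""
    x = v / 255.0
    return x / 12.92 if x <= 0.04045 else ((x + 0.055) / 1.055) ** 2.4

def srgb_encode(x):
    """Gamma a linear float back to an 8-bit sRGB value (quantize last)."""
    s = 12.92 * x if x <= 0.0031308 else 1.055 * x ** (1.0 / 2.4) - 0.055
    return int(s * 255.0 + 0.5)

black, white = 0, 255

# Wrong: averaging raw gamma-space bytes as if they were linear.
wrong = (black + white) // 2  # 127 -- far too dark

# Right: degamma to float, filter in linear space, gamma at the very end.
right = srgb_encode((srgb_decode(black) + srgb_decode(white)) / 2.0)  # 188

print(wrong, right)
```

The true halfway point between black and white lands around sRGB 188, not 127 – filter a checkerboard in gamma space and every mip level darkens a little more.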

The Shader

• Everything you do in a shader should be in linear space. Simple.
• There are states you can set on the various platforms to automatically convert textures from gamma space to linear space when they are sampled, but these states live in different places on different platforms.

The Render Target

• The joys of hardware variations will strike you severely here, and throw a spanner (wrench for those of North American heritage) in the works.
• Frame buffers are usually stored in gamma space, while the pixel shader outputs linear space values. Blending the output of the pixel shader with the frame buffer can therefore happen in linear space (correct) or in gamma space (incorrect), depending on the platform: DX9 and the PS3 do it incorrectly, while DX10+ and the X360 do it correctly.
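Here’s a simplified Python illustration of that blending difference, treating both the blend source and the frame buffer contents as sRGB bytes (hypothetical helpers standing in for what the hardware does per pixel):

```python
def srgb_decode(v):
    """8-bit sRGB value -> linear float in [0, 1]."""
    x = v / 255.0
    return x / 12.92 if x <= 0.04045 else ((x + 0.055) / 1.055) ** 2.4

def srgb_encode(x):
    """Linear float in [0, 1] -> 8-bit sRGB value."""
    s = 12.92 * x if x <= 0.0031308 else 1.055 * x ** (1.0 / 2.4) - 0.055
    return int(s * 255.0 + 0.5)

def blend_gamma_space(src, dst, alpha):
    """DX9/PS3-style: lerp the raw gamma-space bytes (incorrect)."""
    return int(src * alpha + dst * (1.0 - alpha) + 0.5)

def blend_linear_space(src, dst, alpha):
    """DX10+/X360-style: degamma, lerp in linear space, regamma (correct)."""
    lin = srgb_decode(src) * alpha + srgb_decode(dst) * (1.0 - alpha)
    return srgb_encode(lin)

# 50% alpha white blended over a black frame buffer:
print(blend_gamma_space(255, 0, 0.5))   # 128 -- too dark
print(blend_linear_space(255, 0, 0.5))  # 188 -- correct
```

That gap between 128 and 188 is why half-transparent particles and UI elements look noticeably murky on platforms that blend in gamma space.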

Here’s a spoiler: not everything should be gamma corrected at every step. But what does that mean? Why is that? Well, that’s exactly what the next post is about!

Continued at Intuition for Gamma Correct Rendering at TechnoFumbles…