Working with colour spaces in plugin code (this is hard lol)



Paint.NET layers use sRGB colour with straight alpha, because sRGB looks good to human eyes and is the standard format for saving images.

 

Colour calculations (like lerping, blending, compositing, measuring colour space distance) are often more accurate in linear colour with premultiplied alpha.

 

So we want to take the image, convert it to linear colour, then convert it to premultiplied alpha, do our calculations, effects, blending, etc., and then convert it back in the reverse order to get the final image.

 

I have some questions though:

 

When we pass Environment.SourceImage into a shader or make a CommandList, do they get converted automatically? Or do we have to do that ourselves?

 

Should I convert colours to linear colour or premultiplied alpha before I assign them to the brush that I use to paint in the CommandList?


The SourceImage is automatically premultiplied and is in the sRGB "companded" color space. So the components of each pixel have a range of [0, 1] which maps directly to ye ol' ColorBgra component values of [0, 255]. To summarize, each color component is divided by 255.0f, and then the color is premultiplied. There is no further adjustment to gamma to linearize the values.
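
As a quick worked example of that mapping (plain arithmetic, values are illustrative):

    float component = 128 / 255.0f;          // ~0.502 -- the [0, 255] value divided by 255
    float alpha     = 128 / 255.0f;          // ~0.502
    float premultiplied = component * alpha; // ~0.252 -- the value as it appears in SourceImage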

 

If you'd rather have the SourceImage come in straight alpha format, you can override OnInitializeRenderInfo() and set renderInfo's InputAlphaMode and OutputAlphaMode properties. If you set both of those to Straight, the PDN effect system will then not premultiply the input, and then not un-premultiply your output. This can be desirable for performance reasons if you're already doing everything in straight alpha space, or if you have some need to preserve color values for transparent pixels (otherwise any pixel with A=0 will also have RGB=0, because that's how premultiplied alpha works).
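
A minimal sketch of that override; the renderInfo parameter type name and the alpha mode enum name are assumptions, only the property names and the Straight value come from the description above:

    // Sketch only: IGpuImageEffectRenderInfo and GpuEffectAlphaMode are assumed names.
    protected override void OnInitializeRenderInfo(IGpuImageEffectRenderInfo renderInfo)
    {
        // Ask the effect system not to premultiply the input or un-premultiply the output.
        renderInfo.InputAlphaMode = GpuEffectAlphaMode.Straight;
        renderInfo.OutputAlphaMode = GpuEffectAlphaMode.Straight;
    }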

 

You'll need to do sRGB <--> Linear conversions yourself using SrgbToLinearEffect and LinearToSrgbEffect. So you'd take SourceImage and plug it into an SrgbToLinearEffect, then you'd do all your work in linear space, and then for the final output effect you'd plug it into a LinearToSrgbEffect and return that from OnCreateOutput(). These effects expect their input to be premultiplied; you can set the AlphaMode property to switch them into straight alpha mode (so the input would be straight alpha, as would the output).
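
In other words, something along these lines (a sketch; the OnCreateOutput() signature and the Properties.Input.Set() pattern are assumptions about how the PDN Direct2D wrappers are typically used):

    protected override IDeviceImage OnCreateOutput(IDeviceContext deviceContext)
    {
        // sRGB (premultiplied) --> linear (premultiplied)
        SrgbToLinearEffect toLinear = new SrgbToLinearEffect(deviceContext);
        toLinear.Properties.Input.Set(Environment.SourceImage);

        // ... build the rest of your effect graph here, working in linear space,
        //     using toLinear as its input ...

        // linear (premultiplied) --> sRGB (premultiplied), returned as the final output
        LinearToSrgbEffect toSrgb = new LinearToSrgbEffect(deviceContext);
        toSrgb.Properties.Input.Set(toLinear); // plug in the last effect of your graph here
        return toSrgb;
    }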

 

19 hours ago, Robot Graffiti said:

When we pass Environment.SourceImage into a shader or make a CommandList, do they get converted automatically? Or do we have to do that ourselves?

 

Other than what is specified by renderInfo.Input/OutputAlphaMode, conversions never happen automatically. As stated above, SourceImage is already in premultiplied format unless you override OnInitializeRenderInfo() and set renderInfo.InputAlphaMode. If you set that property to Straight then you will need to premultiply SourceImage yourself before plugging it into e.g. an image brush. Direct2D expects everything to be in premultiplied alpha space. However, if you're creating a solid color brush, the ColorRgba128Float is expected to be in straight alpha space. This is generally true for ColorRgba128Floats when they're "on their own". I've made sure that anywhere you pass in ColorRgba128Float as a method parameter, it's straight alpha. If you see ColorPrgba128Float, it needs to be premultiplied, and you can use ToPremultiplied() on the ColorRgba128Float to do that.
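
For example (a sketch; the ColorRgba128Float constructor argument order is an assumption, ToPremultiplied() is the method mentioned above):

    ColorRgba128Float straight = new ColorRgba128Float(1.0f, 0.5f, 0.25f, 0.5f); // R, G, B, A (order assumed)
    ColorPrgba128Float premultiplied = straight.ToPremultiplied();               // use wherever ColorPrgba128Float is required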

 

Direct2D's effect system is a little different, however. The effect rendering system itself knows nothing about premultiplied vs straight, or sRGB vs linear. From its perspective each input image is just a 2D array of Float32 values, which is piped into a pixel shader, which then produces the output (another 2D array of Float32 values). It does no additional processing on top of what the pixel shader does. I take advantage of this for the distortion effects which are based on sample maps -- these are pixel shaders, wrapped in an effect, whose inputs and outputs are (X,Y,A,*) values instead of (R,G,B,A). (the * in X,Y,A,* just means it's ignored) These X,Y,A values are used by another pixel shader which samples the input at the given X,Y coordinates, and then it multiplies the color by the A value. If you want to try this out yourself, check out SampleMapRenderer.

 

However, each effect will have expectations about the alpha format of its inputs and also for its output. So for each effect individually, unless it documents otherwise, you must assume that it operates in premultiplied alpha space. Some effects have an AlphaMode property to configure this -- and often you'll see a "2" version of the effect that I have written which adds an AlphaMode property and does the conversion for you (e.g. FloodEffect2, GammaTransferEffect2, GrayscaleEffect2).
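
For instance, with one of the "2" effects (a sketch; the property access pattern and the enum name are assumptions, and 'straightAlphaImage' is hypothetical):

    GrayscaleEffect2 grayscale = new GrayscaleEffect2(deviceContext);
    grayscale.Properties.Input.Set(straightAlphaImage);                   // hypothetical straight-alpha input
    grayscale.Properties.AlphaMode.SetValue(GpuEffectAlphaMode.Straight); // straight alpha in and out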

 

For your own pixel shaders, whether they consume premultiplied or straight alpha inputs is up to you. If you need to work in straight alpha space, which is common for pixel shaders, then you must un-premultiply any premultiplied input images using UnPremultiplyEffect. And then if the output is in straight alpha, and is to be plugged into something that expects premultiplied, you must premultiply it using PremultiplyEffect.
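
So a straight-alpha shader plugged into a premultiplied graph ends up sandwiched like this (a sketch; MyStraightAlphaShaderEffect is a hypothetical wrapper around your own pixel shader):

    UnPremultiplyEffect unpremultiply = new UnPremultiplyEffect(deviceContext);
    unpremultiply.Properties.Input.Set(Environment.SourceImage);  // premultiplied in, straight alpha out

    MyStraightAlphaShaderEffect myShader = new MyStraightAlphaShaderEffect(deviceContext); // hypothetical
    myShader.Properties.Input.Set(unpremultiply);                 // straight alpha in and out

    PremultiplyEffect premultiply = new PremultiplyEffect(deviceContext);
    premultiply.Properties.Input.Set(myShader);                   // straight alpha in, premultiplied out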

 

Alpha management is kind of a PITA -- @BoltBait can attest to this -- and it's easy to get it wrong. I really wish D2D had the capability to attach some sort of metadata to effect inputs and outputs so that I could automatically insert (Un)PremultiplyEffects into the effect graph, but this is not currently implemented.


I've also made sure that each of Direct2D's built-in effects has a clickable link to the Direct2D documentation in its IntelliSense:

 

 

[screenshot: IntelliSense tooltip for GrayscaleEffect showing a clickable link to the Direct2D documentation]

(note that this IntelliSense is on the type, not the constructor -- so you'd rest the mouse over the GrayscaleEffect declaration on the left, not the GrayscaleEffect(deviceContext) constructor on the right)

 

So, PaintDotNet.Direct2D1.Effects.GrayscaleEffect, a wrapper for D2D's built-in Grayscale Effect, has a link to https://learn.microsoft.com/en-us/windows/win32/direct2d/grayscale-effect


Thanks, I think I got it.

  • sRGB to linear conversion isn't automatic, I have to do that myself.
  • Straight alpha is automatically converted to premultiplied by default for both SourceImage and solid colour brushes (but the alpha mode is configurable for SourceImage).
  • SrgbToLinearEffect can be used on an input that is already premultiplied (its alpha mode is configurable).
  • I have to be careful to give shaders inputs in the colour space they're written and/or configured to expect (some of them are configurable).

I had a shader which I had written on the assumption that shaders were using straight alpha - but I now know they aren't (and in some cases, shouldn't) - so I have to either change the shader or use UnPremultiplyEffect and PremultiplyEffect around it.


Sounds like you got it

 

36 minutes ago, Robot Graffiti said:

I had a shader which I had written on the assumption that shaders were using straight alpha - but I now know they aren't (and in some cases, shouldn't) - so I have to either change the shader or use UnPremultiplyEffect and PremultiplyEffect around it.

 

Exactly.

 

Premultiplying is easy -- just new float4(color.RGB * color.A, color.A). Un-premultiplying is more tedious because you have to special-case for when A=0 so you don't end up with RGB=+infinity. Best to have your shader use straight-alpha input, and use UnPremultiplyEffect on its input if needed. I also put a comment for myself above the shader to document its alpha format input/output, e.g. // Straight alpha input and output, or // Straight alpha input, premultiplied alpha output
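
In shader code that looks roughly like this (a sketch in the same float4 style as above; the helper function names are just for illustration):

    float4 Premultiply(float4 color)
    {
        return new float4(color.RGB * color.A, color.A);
    }

    float4 UnPremultiply(float4 color)
    {
        // Special-case A == 0 so we don't divide by zero.
        return color.A == 0.0f
            ? new float4(0.0f, 0.0f, 0.0f, 0.0f)
            : new float4(color.RGB / color.A, color.A);
    }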


Also, I mentioned that the D2D effect system doesn't do any processing on the input/output buffers: it also won't automatically clamp values within an effect graph. So if your shader does something like color.R - 0.75f (subtracts a constant value), you could end up sending negative values to the next effect in the graph. If you see problems with this, you can use Hlsl.Clamp() in your shader to clamp your values (oftentimes just the return value). You can also use ClampEffect (remember to set its AlphaMode property; as usual, its default value is Premultiplied).
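
For example, clamping a shader's return value (Hlsl.Clamp is mentioned above; the surrounding code is illustrative):

    float4 shifted = new float4(color.R - 0.75f, color.G, color.B, color.A);       // may go negative
    return Hlsl.Clamp(shifted, new float4(0, 0, 0, 0), new float4(1, 1, 1, 1));    // keep it in [0, 1]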

 

However, values are clamped at the very end of the PDN effect rendering process while converting back to BGRA32 (which obviously can't store negative values anyway).



In the next update for PDN, I'm making some changes with regard to color spaces.

 

In PDN 5.0 through 5.0.3, GpuEffects are expected to output sRGB pixel data. You can work in linear space as I described earlier by using e.g. SrgbToLinearEffect on your inputs and LinearToSrgbEffect on your output.

 

In 5.0.4, the default will be to work in linear space (existing plugins will not be affected though). You can then specify that you'd rather work in sRGB space in your OnInitializeRenderInfo() method by setting the ColorContext property to GpuEffectColorContext.Srgb. You will get an obsoletion warning when you go to recompile your effect -- instead of new GpuImageEffectOptions() { IsConfigurable = true } you'll need to do GpuImageEffectOptions.Create() with { IsConfigurable = true }. The error message will tell you exactly this.
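
Put together, that looks something like this (a sketch; the renderInfo parameter type name is an assumption, the rest uses the names given above):

    protected override void OnInitializeRenderInfo(IGpuImageEffectRenderInfo renderInfo)
    {
        // Opt back into sRGB instead of the new linear default:
        renderInfo.ColorContext = GpuEffectColorContext.Srgb;
    }

    // And wherever you build the effect's options, the non-obsolete form is:
    // GpuImageEffectOptions.Create() with { IsConfigurable = true }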

 

In either case, Paint.NET will then automatically supply you with images in the appropriate color space via both Environment and CreateImageFromBitmap(). It will also automatically convert your rendering output to sRGB.

 

Working in linear gamma space really is a lot better. I've updated several other places in PDN to use linear and the results are fantastic. https://developer.nvidia.com/gpugems/gpugems3/part-iv-image-effects/chapter-24-importance-being-linear

 

And when the conversion is done on the GPU there really doesn't seem to be a performance hit, even on the lowest-end hardware I could find (e.g. Intel UHD, or even ARM64). 


I'm also working on a solution for predefined colors, e.g. Colors.Blue. These are currently provided as sRGB and do not get automatically linearized when cast from ColorBgra32 to ColorRgba128Float.

