Description
The BiLinear and CatmullRom interpolators assume that the source image's pixel values are linear in luminance. But the vast majority of images are in the sRGB color space, whose transfer curve is highly non-linear. As a result, scaled images can look quite different from the source image.
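To make the difference concrete, here is a back-of-the-envelope example (not taken from the test program; the constants are the standard sRGB transfer function): naively averaging a black and a white pixel in encoded sRGB gives 128, which displays at only about 22% luminance, whereas averaging in linear light and re-encoding gives roughly 188, the true 50% gray.

```go
package main

import (
	"fmt"
	"math"
)

// srgbToLinear and linearToSRGB are the standard sRGB transfer functions
// for a single channel value in [0, 1].
func srgbToLinear(c float64) float64 {
	if c <= 0.04045 {
		return c / 12.92
	}
	return math.Pow((c+0.055)/1.055, 2.4)
}

func linearToSRGB(c float64) float64 {
	if c <= 0.0031308 {
		return c * 12.92
	}
	return 1.055*math.Pow(c, 1/2.4) - 0.055
}

func main() {
	// Average a black pixel (0) and a white pixel (255).
	naive := (0.0 + 255.0) / 2 // averaging the encoded values: 127.5

	// Gamma-correct average: decode to linear light, average, re-encode.
	lin := (srgbToLinear(0.0) + srgbToLinear(1.0)) / 2 // 0.5 in linear light
	correct := linearToSRGB(lin) * 255                 // ≈ 188

	fmt.Printf("naive: %.0f, gamma-correct: %.0f\n", naive, correct)
	// sRGB 128 decodes to ~0.22 linear luminance, so the naive average is
	// less than half as bright as the ~0.5 it should be.
}
```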
For example, on a reasonably calibrated monitor (and at 100% scaling in the browser), the left and right columns of the following image have the same average luminance:
But scaling this down by a factor of 2 using x/image/draw yields the following:
Here, the two columns appear very different.
For an in-depth (possibly too in-depth) discussion of this problem and many test images, including the one I used above, see http://www.4p8.com/eric.brasseur/gamma.html. My test program is here: https://gist.github.com/aclements/599107a2e3f187f8a2c0.
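For reference, nothing exotic is needed to reproduce the effect; a plain Scale call on an sRGB-encoded image is enough. This is a sketch rather than my exact test program (the file names are illustrative, and BiLinear behaves the same way as CatmullRom here):

```go
package main

import (
	"image"
	"image/png"
	"log"
	"os"

	"golang.org/x/image/draw"
)

func main() {
	// Decode the sRGB-encoded test image (file name is illustrative).
	f, err := os.Open("testimage.png")
	if err != nil {
		log.Fatal(err)
	}
	defer f.Close()
	src, err := png.Decode(f)
	if err != nil {
		log.Fatal(err)
	}

	// Naively scale down by a factor of 2. The interpolator operates
	// directly on the encoded sRGB values.
	b := src.Bounds()
	dst := image.NewRGBA(image.Rect(0, 0, b.Dx()/2, b.Dy()/2))
	draw.CatmullRom.Scale(dst, dst.Bounds(), src, b, draw.Src, nil)

	out, err := os.Create("scaled.png")
	if err != nil {
		log.Fatal(err)
	}
	defer out.Close()
	if err := png.Encode(out, dst); err != nil {
		log.Fatal(err)
	}
}
```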
A lot of software (including browsers) gets this wrong, and that's really unfortunate. It would be fantastic if Go software using x/image got this right out of the box.
Probably the best solution to this would be to thread color space information throughout the image library. At the other extreme, given the general recommendation to assume sRGB in the absence of other information (since virtually every image created in the past two decades is sRGB), it may make sense to simply assume sRGB when interpolating. We could also do the latter first and then later add more complete color space support, with the default being sRGB. Another option is to add this information to the x/image/draw.Options structure, though I fear that may interfere with later adding proper color space support to image.
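For concreteness, here is roughly what the "assume sRGB when interpolating" option amounts to, written as a caller-side workaround that can be used today. This is only a sketch: it ignores alpha, assumes opaque 8-bit sRGB input, and the names srgbToLinear, linearToSRGB, and scaleSRGB are my own helpers, not part of any existing API.

```go
package srgbscale

import (
	"image"
	"image/color"
	"math"

	"golang.org/x/image/draw"
)

// sRGB <-> linear transfer functions for a single channel value in [0, 1].
func srgbToLinear(c float64) float64 {
	if c <= 0.04045 {
		return c / 12.92
	}
	return math.Pow((c+0.055)/1.055, 2.4)
}

func linearToSRGB(c float64) float64 {
	if c <= 0.0031308 {
		return c * 12.92
	}
	return 1.055*math.Pow(c, 1/2.4) - 0.055
}

// scaleSRGB scales src, assumed to be sRGB-encoded and fully opaque, to
// w x h pixels, interpolating in linear light.
func scaleSRGB(src image.Image, w, h int, q draw.Interpolator) *image.RGBA {
	b := src.Bounds()

	// 1. Decode sRGB to linear light, keeping 16 bits per channel so the
	//    dark end of the curve isn't crushed by quantization.
	lin := image.NewRGBA64(b)
	for y := b.Min.Y; y < b.Max.Y; y++ {
		for x := b.Min.X; x < b.Max.X; x++ {
			r, g, bl, _ := src.At(x, y).RGBA() // 16-bit encoded values
			lin.SetRGBA64(x, y, color.RGBA64{
				R: uint16(srgbToLinear(float64(r)/0xffff)*0xffff + 0.5),
				G: uint16(srgbToLinear(float64(g)/0xffff)*0xffff + 0.5),
				B: uint16(srgbToLinear(float64(bl)/0xffff)*0xffff + 0.5),
				A: 0xffff,
			})
		}
	}

	// 2. Interpolate in linear light.
	linDst := image.NewRGBA64(image.Rect(0, 0, w, h))
	q.Scale(linDst, linDst.Bounds(), lin, b, draw.Src, nil)

	// 3. Re-encode the result as 8-bit sRGB.
	dst := image.NewRGBA(linDst.Bounds())
	for y := 0; y < h; y++ {
		for x := 0; x < w; x++ {
			c := linDst.RGBA64At(x, y)
			dst.SetRGBA(x, y, color.RGBA{
				R: uint8(linearToSRGB(float64(c.R)/0xffff)*255 + 0.5),
				G: uint8(linearToSRGB(float64(c.G)/0xffff)*255 + 0.5),
				B: uint8(linearToSRGB(float64(c.B)/0xffff)*255 + 0.5),
				A: 0xff,
			})
		}
	}
	return dst
}
```

Calling something like scaleSRGB(src, b.Dx()/2, b.Dy()/2, draw.CatmullRom) on the test image above should produce columns with matching average brightness, at the cost of two extra full-image conversions, which is part of why having the library handle this natively would be nicer.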
/cc @nigeltao. (We discussed this a few weeks ago in person, but I figured I should open an issue so it doesn't get lost.)