Red and blue values switched when saving BitmapData as a PNG

Hi.

I am trying to save a BitmapData to a PNG on the Windows platform using the following code:

public function SaveAsPngTemp():Void
{
	// 500x500 opaque red (ARGB 0xFFFF0000) test image
	var l_Image:BitmapData = new BitmapData( 500, 500, true, 0x00000000 );
	l_Image.fillRect( new Rectangle( 0, 0, 500, 500 ), 0xFFFF0000 );

	// Encode the full bitmap as PNG (needs openfl.display.PNGEncoderOptions)
	var imageData = l_Image.encode( l_Image.rect, new PNGEncoderOptions() );

	// Write in binary mode (needs sys.io.File and sys.io.FileOutput)
	var fo:FileOutput = sys.io.File.write( "test.png", true );
	try {
		fo.writeBytes( imageData, 0, imageData.length );
	} catch (e:Dynamic) {
		trace( "Error writing file test.png: " + e );
	}
	fo.close();
}

The image test.png is saved, but with the color 0xFFFF0000 I get a blue image (and with 0xFF0000FF I get a red one). I have the same problem with loaded PNG images that I re-save as PNG: the red and blue channels are swapped.
The problem seems quite simple, but I have been unable to find where it comes from.

I am using OpenFL 3.2.2 and the latest version of Lime.

I ran into the same problem when I upload images into Stage3D textures… and found out that OpenFL converts RGBA pixels to BGRA and premultiplies them on native targets, but the Stage3D wrapper uploads pixels as they are. The conversion may be missing in some other places too.

I don’t think you can restore the original unpremultiplied image once it has been premultiplied. Doesn’t this cause problems?
I also want to avoid unnecessary RGBA/BGRA conversions as much as possible.

http://help.adobe.com/en_US/FlashPlatform/reference/actionscript/3/flash/display/BitmapData.html#getPixel32().


I guess this is for the cairo-based software renderer, but…

  1. Cairo only accepts premultiplied BGRA images
  2. But it’s impossible to internally store pixels in premultiplied format, as RGB values may get lost

So, to pass the image to cairo, you must first create a temporary copy to make it premultiplied. On the other hand:

  1. WebGL textures only support uploading from RGBA images, not BGRA
  2. BitmapData.getPixels() returns pixels in unpremultiplied RGBA format

BGRA images also work well with OpenGL on native targets, but storing pixels in a different order only on native targets is confusing when I need direct access to the pixels.
If creating temporary images for cairo is unavoidable, I’d rather store RGBA and swap channels when cairo needs them.
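For illustration, the temporary-image step described above can be sketched language-agnostically. This Python sketch (the function name and exact rounding are mine, not from Lime or OpenFL) converts an unpremultiplied RGBA buffer into the premultiplied BGRA layout cairo expects, leaving the canonical RGBA buffer untouched:

```python
def rgba_to_premultiplied_bgra(src: bytes) -> bytearray:
    """Build the temporary buffer cairo wants: premultiplied, BGRA order.

    Input is a flat unpremultiplied RGBA buffer (4 bytes per pixel).
    The source buffer is not modified, so the canonical copy stays
    unpremultiplied and no color information is lost.
    """
    out = bytearray(len(src))
    for i in range(0, len(src), 4):
        r, g, b, a = src[i], src[i + 1], src[i + 2], src[i + 3]
        # Premultiply each channel by alpha/255, rounding to nearest,
        # and write the channels back in BGRA order.
        out[i]     = (b * a + 127) // 255  # B
        out[i + 1] = (g * a + 127) // 255  # G
        out[i + 2] = (r * a + 127) // 255  # R
        out[i + 3] = a                     # A
    return out

# 50%-alpha pure red: RGBA (255, 0, 0, 128) -> BGRA (0, 0, 128, 128)
print(list(rgba_to_premultiplied_bgra(bytes([255, 0, 0, 128]))))
```

The real implementation would do this on the C++ side, but the per-pixel logic is the same.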

Thanks for your interesting answer. I don’t know whether my problem is caused by this, but it certainly looks like it.
However, if I replace BGRA32 with ARGB32 in BitmapData.hx, nothing changes.
Anyway, I don’t feel confident enough to mess with that kind of native code and design choices. For now I will just swap the channels, which is much easier.
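For reference, the channel-swap workaround amounts to exchanging bytes 0 and 2 of every 4-byte pixel. Here is a small Python sketch (illustrative only, not OpenFL API) of that operation on a flat pixel buffer:

```python
def swap_red_blue(pixels: bytearray) -> bytearray:
    """Swap the first and third channel of each 4-byte pixel in place.

    This converts BGRA <-> RGBA (alpha stays where it is), which is
    exactly the workaround: apply it once before encoding to undo a
    mismatched channel order.
    """
    for i in range(0, len(pixels), 4):
        pixels[i], pixels[i + 2] = pixels[i + 2], pixels[i]
    return pixels

# One opaque red pixel stored as RGBA bytes...
buf = bytearray([0xFF, 0x00, 0x00, 0xFF])
# ...a BGRA consumer would read it as blue; swapping fixes the mismatch.
swap_red_blue(buf)
print(list(buf))  # [0, 0, 255, 255]
```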

Thanks for the heads up, I forgot to check the format for PNG/JPG/BMP encoding.

I’ve done a pass on pixel format support in Lime; it now supports pixel operations on ARGB, BGRA, or RGBA image buffers. OpenFL now uses premultiplied BGRA buffers, which improves performance considerably and makes it more consistent with Flash (which uses premultiplied BitmapData).

This just needs to be double-checked for encoding.

Well, I was failing to understand what Flash was doing…
It completely loses the RGB values when you load an image with fully transparent pixels. Only for getPixel/setPixel does it internally convert premultiplied values back to unmultiplied, to avoid data loss on those operations.

Then uploadFromBitmapData of flash.display3D.Texture takes premultiplied data while uploadFromByteArray takes unmultiplied data? That’s really confusing.

Not sure about BGRA/RGBA, but Flash might be storing it in BGRA and swizzling for getPixels.

Flash definitely uses premultiplied alpha; it might use BGRA internally (flipping to ARGB for the user), but I’m not sure about that. You can tell it’s premultiplied because writing 0x80FFFFFF reads back as roughly 0x80F7F7F7, I believe. Other tests have confirmed differences in behavior to me, caused by the fact that we were not premultiplying the image buffer while Flash Player was.

Now that we’re using a new format, we’ll have to double-check the Stage3D implementation to be sure that it’s multiplying/unmultiplying as appropriate.

Only premultiplied values seem to be stored in the current implementation, which differs from what the doc says, but that’s another issue.

If you use setPixel, the alpha is set to 0xFF, which is a multiplier of 1. Multiplying the color value by an alpha multiplier of 1 gives the same color value – no data lost. If you unmultiply a color value with an alpha of 0xFF, you’re dividing by 1, still no data loss. It’s only when you use a fractional alpha value that you lose color information, down to an alpha of zero where everything is turned to zero. This is surely one reason why Flash Player enforces a color of 0x00000000 when the alpha is zero :slight_smile:
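That arithmetic can be checked directly. The Python sketch below uses the common round(c * a / 255) convention; Flash’s exact rounding may differ (hence 0x80F7F7F7 above rather than these exact values), but the loss pattern is the same:

```python
def premultiply(c: int, a: int) -> int:
    # Store the channel scaled by alpha/255, rounding to nearest.
    return (c * a + 127) // 255

def unmultiply(c: int, a: int) -> int:
    # Recover the channel; at alpha 0 all color information is gone.
    if a == 0:
        return 0
    return min(255, (c * 255 + a // 2) // a)

# Alpha 0xFF is a multiplier of 1: the round trip is exact.
assert unmultiply(premultiply(0x37, 0xFF), 0xFF) == 0x37

# Fractional alpha: 3 and 4 both quantize to the same stored byte...
assert premultiply(3, 0x80) == premultiply(4, 0x80) == 2
# ...so the round trip cannot give both back: 3 comes back as 4.
assert unmultiply(premultiply(3, 0x80), 0x80) == 4
```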

Alright, I just committed a patch that converts back to unmultiplied RGBA image data before performing an encode (similar to how it was before the change). Ideally, we’ll handle formats on the C++ side for faster encoding, but for now this gives us accurate results.

@singmajesty does that patch have any impact on display3D textures? I’ve just been taking a look at the current OpenFL 3.2.2 (Lime 2.5.2), and when I target HTML5 I get the correct colours in my render; however, for Mac and Neko (not using legacy) I get the wrong colours.

So both are using the current ‘Next’ implementation yet render differently; that’s the first problem.

I’m trying to find the most efficient way of going from a BitmapData to a type suitable for GL.texImage2D uploading (e.g. a UInt8Array, without any array cloning if possible).

Cheers

Greg

I switched from RGBA to BGRA_EXT on native.
I also enabled UNPACK_PREMULTIPLY_ALPHA_WEBGL so that I can convert unmultiplied images to premultiplied ones without using JavaScript.

Note that UNPACK_FLIP_Y_WEBGL and UNPACK_PREMULTIPLY_ALPHA_WEBGL only work on WebGL, not on native, as their name suggests.

BGRA_EXT is not in OpenGL standard but most devices support it. Mali-400 MP doesn’t seem to support BGRA though.
https://www.opengl.org/discussion_boards/showthread.php/185197-Why-OpenGLES-2-spec-doesn-t-support-BGRA-texture-format

But it still seems impossible to upload textures without flipping, as the coordinate systems of Stage3D and OpenGL don’t match.
If you don’t y-flip the textures, render textures come out upside down. To solve this without sacrificing rendering performance, I changed the code to y-flip images at upload time and flip them again in the pixel shader.
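The flip itself is just a row-order reversal of the pixel buffer (OpenGL’s texture origin is bottom-left while Stage3D assumes top-left). A minimal Python sketch of that step, illustrative only and not the actual Lime code:

```python
def flip_y(pixels: bytes, width: int, height: int, bpp: int = 4) -> bytearray:
    """Reverse the row order of a row-major pixel buffer.

    Because OpenGL and Stage3D disagree on where row 0 is, a vertical
    flip (here at upload time, or its equivalent in the shader) has to
    happen somewhere in the texture upload path.
    """
    stride = width * bpp
    out = bytearray(len(pixels))
    for row in range(height):
        src = row * stride
        dst = (height - 1 - row) * stride
        out[dst:dst + stride] = pixels[src:src + stride]
    return out

# A 1x3 column of pixels A, B, C (1 byte per pixel) becomes C, B, A.
print(list(flip_y(bytes([1, 2, 3]), width=1, height=3, bpp=1)))  # [3, 2, 1]
```

Doing the second flip in the pixel shader, as described above, avoids paying for a buffer copy twice.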

@vroad Thanks for the info. I’ll take a look at plugging it into my Context3D. It seems some of your Context3D changes are similar (but different :wink:) to some I’ve got. Out of curiosity, how far is your display3D lib from a PR?

I’m tempted to fork your repo and plug in some other bits - I would need to test things out to make sure all is working, although with ‘next’ atm things are only partially working for me (hence the reason I may try your branch).

Thanks again.

Greg

You may know that recent versions of OpenFL switched to premultiplied BGRA for BitmapData; I think HTML5 is premultiplied RGBA right now. I haven’t been able to test this with Stage3D, so improvements are still welcome :smile: