I'm trying to set the pixel value of a bitmap, but when alpha is involved I get strange results.
var color:Int = (128 << 24) | (64 << 16) | (196 << 8) | 32; // ARGB: a=128, r=64, g=196, b=32
var bitmapData:BitmapData = new BitmapData(1, 1, true, 0x00000000); // 1x1, transparent, with alpha
bitmapData.setPixel32(0, 0, color);
var pixel:Int = bitmapData.getPixel32(0, 0);
var a:Int = pixel >> 24 & 0xFF;
var r:Int = pixel >> 16 & 0xFF;
var g:Int = pixel >> 8 & 0xFF;
var b:Int = pixel & 0xFF;
trace(a, r, g, b);
I expect the output to be:
128,64,196,32
but instead I get varying results, such as:
0,30,39,224
0,224,39,224
0,29,167,224
I suspect this has to do with premultiplied alpha, but I've tried modifying color in several different ways and have yet to get pixel to match the original color value.
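For context, here is a sketch of the rounding math I believe is involved (my own illustration of premultiplied alpha, not the actual BitmapData internals): each channel is scaled by alpha on write and divided back out on read, and the intermediate rounding can't always be undone.

```haxe
class PremultSketch {
    static function main() {
        var a = 128, r = 64, g = 196, b = 32;
        // Premultiply on store: channel * alpha / 255, rounded to an integer.
        var pr = Math.round(r * a / 255);
        var pg = Math.round(g * a / 255);
        var pb = Math.round(b * a / 255);
        // Unmultiply on read: channel * 255 / alpha, rounded again.
        var ur = Math.round(pr * 255 / a);
        var ug = Math.round(pg * 255 / a);
        var ub = Math.round(pb * 255 / a);
        // The rounding in the first step is permanent: e.g. g = 196
        // premultiplies to 98 and unmultiplies back to 195, not 196.
        trace(ur, ug, ub);
    }
}
```

If this is what's happening, small channel drift (196 → 195) is expected and unavoidable, though it wouldn't by itself explain the alpha coming back as 0.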
If I trace color directly, 128,64,196,32 is printed as expected.
Any help would be much appreciated.