BitmapData::setPixel32 with alpha issue

I'm trying to set the pixel value of a bitmap, but when alpha is involved I get strange results.

var color:Int = (128 << 24) | (64 << 16) | (196 << 8) | 32; // argb

var bitmapData:BitmapData = new BitmapData(1, 1, true, 0x00000000);
bitmapData.setPixel32(0, 0, color);

var pixel:Int = bitmapData.getPixel32(0, 0);

var a : Int = pixel >> 24 & 0xFF;
var r : Int = pixel >> 16 & 0xFF;
var g : Int = pixel >> 8  & 0xFF;
var b : Int = pixel       & 0xFF;

trace(a, r, g, b);

I expect to see in the output:

128,64,196,32

but instead I get various results, such as:

0,30,39,224 0,224,39,224 0,29,167,224

I think this has to do with premultiplied alpha, but I've tried modifying color several different ways and have yet to get pixel to match the original color value.
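For what it's worth, premultiplied alpha alone wouldn't explain results like those above: a renderer that premultiplies stores each channel scaled by alpha on write and divides it back out on read, which can shift a channel by a point or so but can't zero out the alpha byte. A rough sketch of that round trip (my own illustration, not OpenFL's actual code path):

```haxe
class PremultSketch {
    static function main() {
        var a = 128;
        var g = 196;
        // storing: scale the channel by alpha
        var stored = Math.round(g * a / 255);         // 98
        // reading: divide the alpha back out
        var recovered = Math.round(stored * 255 / a); // 195, off by one
        trace(a, stored, recovered);
    }
}
```

So premultiplication could explain a 195 where 196 went in, but not an alpha of 0.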

If I extract and trace the components of color directly (skipping the round trip through the bitmap), 128,64,196,32 is traced as expected.

Any help will be much appreciated…

Aren’t those Ints signed? If so, try unsigned shift >>>. In your example you set 128 twice…

trace((128 << 24) | (64 << 16) | (196 << 8) | 32);  // -2143239136 (minus!)

trace(128 << 24);  // -2147483648 (minus again!)
trace((128 << 24) >> 24);  // -128 (minus again!)
trace((128 << 24) >>> 24);  // 128 (Yup.)
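Applied to the extraction in the question, that would look like the sketch below. Strictly speaking the & 0xFF mask already discards the sign-extended bits on a true 32-bit target, but >>> makes the intent explicit and stays safe even without the mask:

```haxe
var a : Int = pixel >>> 24 & 0xFF; // unsigned shift: alpha stays in 0..255
var r : Int = pixel >>> 16 & 0xFF;
var g : Int = pixel >>> 8  & 0xFF;
var b : Int = pixel        & 0xFF;
```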

If you continue to have problems (and are using Neko), please try another target as well; there might be an issue with Neko's 31-bit integers there.
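A quick way to check whether the target is mangling the value before the bitmap is even involved, using StringTools from the Haxe standard library:

```haxe
var color:Int = (128 << 24) | (64 << 16) | (196 << 8) | 32;
// On a target with true 32-bit Ints this prints 8040C420;
// anything else points at the target's integer representation.
trace(StringTools.hex(color, 8));
```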

Thanks for the suggestion @madneon, but it seems as though @singmajesty is right…

I built it for Windows instead of Neko and it worked just fine.

Running slightly older versions of lime/openfl currently (2.0.4 / 2.2.1 respectively) and the latest Neko version (2.0.0).