Is it possible to get a sprite into a BitmapData, AS3 style? Something like this:
var scenery = new Sprite();
/* (scenery gets populated with several other sprites) */
var bd:BitmapData = new BitmapData(cast(scenery.width), cast(scenery.height));
bd.draw(scenery);
addChild(new Bitmap(bd));
I get the same result exporting for Flash (which was rather unexpected). I use the cast to turn those Floats into Ints. This is what haxelib returns; I don't know if that's of any relevance:
@player I just cast because it works and is noticeably faster, at least in my very informal tests. Is there any issue I should be aware of? (I'm no big-time dev.)
@ibilon Incorrect how? I'm using it for everything (well, positioning on stage, bitmap sizing, screen sizes… simple 2D stuff mostly) and it seems to be spot on; it feels like a low-cost Math.floor.
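One caveat on the Math.floor comparison: on a static target the unsafe cast truncates toward zero, like Std.int, while Math.floor rounds toward negative infinity, so the two only agree for non-negative values. A quick sketch of the difference (behavior as I'd expect it on a static target like C++; untested here):

```haxe
class Main {
    static function main() {
        var f:Float = -1.7;
        var i:Int = cast f;        // unsafe cast: truncates on static targets
        var j:Int = Std.int(f);    // truncates toward zero: -1
        var k:Int = Math.floor(f); // rounds toward negative infinity: -2
        trace([i, j, k]);
    }
}
```

So for positive coordinates and sizes the "cheap Math.floor" intuition holds, but it breaks for negatives.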
Well… now I feel like an idiot. I was fiddling with the transparency settings and the stacking order; everything was working fine from the beginning. The bitmap was just hidden somewhere at the bottom of the display list.
var f:Float = 0.3;
var i:Int = cast f;     // unsafe cast
var j:Int = Std.int(f); // explicit conversion
trace([i, j]);
Flash: [0, 0]
C++: [0, 0]
I knew Flash had some sort of automatic conversion, but I was surprised to find this works on C++. Turns out that's because the C++ code HXCPP generates casts the values explicitly:
Float f = ((Float)0.1);
int i = ((int) f);
int j = ::Std_obj::_int(f);
So both i and j are converted correctly; the only difference is that j goes through an extra function call.
Then I tested on Neko. Output: [0.3, 0]. Oops! The unsafe cast is a no-op there, so i is still a Float.
Well, maybe you don't need Neko. How about HTML5? [0.30000000000000004, 0].
Looks like this becomes a problem on dynamically typed targets, where the unsafe cast compiles to nothing and the value is never actually converted. If that's true, I'd expect it to fail on PHP and Python as well.
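For what it's worth, the portable fix is simply to use Std.int: it performs a real conversion on every target, while the bare cast only happens to work where the runtime has a native int type. A minimal sketch:

```haxe
class Main {
    static function main() {
        var f:Float = 0.3;
        // Std.int does an actual conversion on every target,
        // including dynamic ones where `cast` compiles to nothing.
        var i:Int = Std.int(f);
        trace(i); // 0 on all targets
    }
}
```

Applied to the original question, that would be `new BitmapData(Std.int(scenery.width), Std.int(scenery.height))`.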