I’ve just noticed that TILE_ALPHA doesn’t work on the Flash target.
_renderFlags = Tilesheet.TILE_RECT | Tilesheet.TILE_ALPHA;
_tilesheet.drawTiles(myshape.graphics, [0,0,0,0,100,100,0.5], true, _renderFlags, 7);
This works on the CPP target (Mac), showing an alpha of 0.5, but on Flash the tiles render fully opaque (alpha 1.0).
Maybe it’s a bug; should I be setting something else in the render flags?
For now, since the alpha is the same for all the tiles, I draw at alpha 1.0 and set the alpha of the Shape (the surface I am drawing to) to 0.5, and that works.
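For the record, a minimal sketch of that workaround, reusing the `_tilesheet` and `myshape` names from the snippet above (the trailing 0.5 is dropped from the tile data because TILE_ALPHA is no longer set):

```haxe
// Workaround: draw the tiles fully opaque, then apply the shared
// alpha to the Shape that hosts the drawing.
var _renderFlags = Tilesheet.TILE_RECT; // no TILE_ALPHA
myshape.graphics.clear();
// [x, y, rectX, rectY, rectW, rectH] -- no per-tile alpha value
_tilesheet.drawTiles(myshape.graphics, [0, 0, 0, 0, 100, 100], true, _renderFlags);
myshape.alpha = 0.5; // one alpha applied to everything drawn into the Shape
```

This only works because every tile shares the same alpha; per-tile values would still need TILE_ALPHA.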
As far as I know, the Tilesheet class uses the drawTriangles method on Flash, which doesn’t support per-tile alpha. I wonder if there is a way around it. copyPixels takes an ‘alphaBitmapData’ parameter, but then we would lose matrix operations : (
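To illustrate the trade-off: copyPixels can blend in per-pixel alpha, but it has no matrix argument, so you only get translation (no rotation or scaling). A hedged sketch, where `tiles` and `alphaBmp` are hypothetical BitmapData objects:

```haxe
import flash.display.BitmapData;
import flash.geom.Point;
import flash.geom.Rectangle;

// Destination surface (transparent, so merged alpha is visible).
var dest = new BitmapData(256, 256, true, 0x00000000);
dest.copyPixels(tiles,                         // source pixels
                new Rectangle(0, 0, 100, 100), // source tile rect
                new Point(50, 50),             // destination: translation only
                alphaBmp,                      // per-pixel alpha source
                new Point(0, 0),               // where to sample alpha from
                true);                         // merge with existing alpha
```

So copyPixels recovers alpha control at the cost of exactly the matrix operations mentioned above.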
Ok, now I understand. I was assuming the Tilesheet class came from Flash, when it is actually something “new”.
That explains it. Good to know the limitations.
For now it’s not a big deal, as I said before, since I don’t really need per-tile alpha settings. It could be interesting in the future if I need it, but the solution at that point depends very much on what I want to achieve.
Thanks for the explanation anyway; at least now I understand why.
You could try the tilelayer library, which handles the differences between platforms for you.
Interesting stuff! Is the performance the same as (or maybe better than) Tilesheet’s?
Is there any comparison test?
Should be about the same on native targets, given that it’s built on top of Tilesheet. Joshua was talking about optimizing Tilesheet’s performance in Flash, but I don’t know if he did yet. (If he did, those optimizations won’t apply to tilelayer.)
That is brilliant. Thank you very much for the advice.
I think I will stick with Tilesheet for this project but definitely move to tilelayer for the next one.