I’m wondering what’s the recommended maximum size of a single BitmapData that we can load to GPU with a Tilesheet?
I did a little research and found that 2048x2048 should be supported by all current devices, and that 4096x4096 works on newer devices too, but I couldn’t get that size working on either of my platforms; only 1024x1024 went well.
Is there a setting I’m missing? How do I load a large bitmap into the tilesheet?
Sorry for bumping, but is there any way to find out the max BitmapData size from Haxe code? I thought GL.MAX_TEXTURE_SIZE might work, but it returns 3379, so I’m not sure it works correctly.
That’s what I thought too. Does that work on desktop?
I’m asking because when I create a BitmapData of that size (which is 16384 x 16384 for me) on the Windows target, I get wrong results:
I wanted to create the BitmapData so I could draw to it and use drawTiles to draw from it later, but when I do this it only seems to draw the initial background of the BitmapData, not the images drawn onto it. As soon as I use a smaller size (e.g. 4096 x 4096), it works.
Maybe this has to do with graphics memory? I have a GTX 660, if that is important.
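For what it’s worth, here is a minimal sketch of clamping a requested size to the driver-reported limit before allocating the BitmapData. It assumes a GL context is already available (e.g. after ADDED_TO_STAGE); safeAtlasSize and the conservative 2048 fallback are my own choices for illustration, not anything built into OpenFL:

import openfl.gl.GL;

class TextureLimit
{
	// Clamp a requested atlas/BitmapData dimension to what the GPU reports.
	// Must be called after the GL context exists, or the query may fail.
	public static function safeAtlasSize(requested:Int):Int
	{
		var max:Null<Int> = GL.getParameter(GL.MAX_TEXTURE_SIZE);

		if (max == null || max <= 0)
		{
			// The query failed or returned garbage; fall back to a size
			// that virtually all current hardware supports.
			max = 2048;
		}

		return requested > max ? max : requested;
	}
}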
GL.getParameter(GL.MAX_TEXTURE_SIZE); stopped working on OpenFL 2.2.6. It returns 0 when I’m targeting Mac or Android. It still works fine on Neko, though. Going back to 2.2.3 fixed the problem.
Strange, I can’t think of any reason why this would not work. When do you check for it? Is there a chance you’re checking before the GL context is ready?
import openfl.display.Sprite;
import openfl.events.Event;
import openfl.gl.GL;

class Main extends Sprite
{
	public function new()
	{
		super();
		addEventListener(Event.ADDED_TO_STAGE, init);
	}

	private function init(e:Event):Void
	{
		removeEventListener(Event.ADDED_TO_STAGE, init);

		#if (cpp || neko)
		trace(GL.getParameter(GL.MAX_TEXTURE_SIZE));
		#end
	}
}
This doesn’t work on OpenFL 2.2.6/2.2.7, but works fine on 2.2.3. I have also tried tracing the max texture size on enter frame, but it always returns null on Windows and 0 on Android, so waiting a little longer doesn’t help.
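Until it’s fixed, a stopgap is to validate the value before trusting it, since on the affected versions the call can come back as null (Windows/C++) or 0 (Android). This is just a defensive sketch; the 2048 fallback is my own conservative guess, not an OpenFL default:

#if (cpp || neko)
// Guard against the broken query: only accept a positive numeric value.
var reported:Null<Int> = GL.getParameter(GL.MAX_TEXTURE_SIZE);
var maxTextureSize:Int = (reported != null && reported > 0) ? reported : 2048;
trace("Using max texture size: " + maxTextureSize);
#end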
Fixed. This worked for me on Neko, but on the C++ target you’re correct – it was returning null instead of a numeric value. Interesting. I posted an issue for this on the HXCPP repository, but in the meantime I’ve made changes that resolve the issue, and it’s probably a better setup for GL.getParameter now, anyway.