After an installation problem on my workstation, I was able to get everything working under another user account on the same machine.
Now I’m trying to get a feel for the Haxe environment. I usually create applications for the desktop, sometimes Starling-based, often in 4K resolution.
First, I put two Starling sprites with big textures on the stage and set up a few gesture events. Really great: even at 8024x8024 pixels there was no drop in the frame rate. No comparison with AIR.
But already with the 3rd texture at this size the fun stopped: the Windows app crashed. Is there more information on how big the textures can be in total? Is there a way to handle ATF internally? When using tools like vvvv or openFrameworks I could define the compression after the asset is loaded, which is really handy for CMS-based apps.
Then I loaded several images as bitmaps onto the stage in the “classic” fl.display environment. As soon as I use several large pictures, 4500x7500 pixels, the Windows application crashes. Where is the problem?
import openfl.display.Bitmap;
import openfl.utils.Assets;

var bitmapData = Assets.getBitmapData("assets/textures/1x/i1.jpg");
var bitmap = new Bitmap(bitmapData);
bitmap.x = 0;
Where can I find more information about the internals?
Hmm, it would be interesting to know what the cause of the crash is.
There could be some internals on our side (such as PNG/JPEG decompression) that are not optimized for very large images, so we may run into an issue of using more memory than we need to.
You’re going to want to see whether the application appears to be running out of system memory, or whether we’re actually exceeding how much the GPU can hold. We should have code that polls the GPU’s maximum texture size (and constrains texture sizes accordingly), but if multiple textures are at the maximum size, you can still hit limits by using several at once.
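For reference, here is a minimal sketch of how one might poll that limit from Haxe, assuming Lime’s GL bindings are available and a GL render context is already active (the exact entry point is an assumption, not a confirmed OpenFL recipe):

```haxe
import lime.graphics.opengl.GL;

// Query the largest texture dimension the current GPU/driver supports.
// This assumes a GL context already exists, e.g. run it inside an
// OpenFL application's first ENTER_FRAME handler.
var maxSize:Int = GL.getParameter(GL.MAX_TEXTURE_SIZE);
trace('GPU reports a maximum texture size of ${maxSize}x${maxSize}');
```

A texture larger than this in either dimension will fail to upload on that hardware, even if plenty of system memory is free.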
Can you try `openfl test html5` and see if the web browser allows sizes this large?
I want to check whether the Adobe AIR graphics limitation also exists in the Haxe port… perhaps I can build some Windows apps in Haxe for faster development and without a runtime license, with the benefit of giving the client a preview on OpenGL.
I have done sliced backgrounds a lot in AIR/Starling, but you definitely can’t do this 3x4K app with animated elements in AIR/Starling; the AIR architecture is totally limited in graphics memory (I think about 512 MB). That’s what I want to test: where the limit is.
And let me know what you find; there probably are some areas in the code that aren’t designed for extra-large textures and could be optimized further. All the source is open, so we have room to improve any areas where the platform/hardware allows it.
Limits such as “graphics memory” are meant to be very literal: "this is how much you can present at one time to the hardware."
But you can easily use exactly the same trick that is commonly used with [road …] maps. (In fact, you can use the same tile structure and therefore the same software tools to prepare the data.) Slice the total bitmap into square tiles. Then, as the player zooms around the bitmap, constantly determine which tiles are currently visible (and those which are “nearby” and thus likely to be needed soon). Only those tiles need to be in memory. Graphics buffers containing anything else should immediately be discarded or repurposed.
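The culling step above can be sketched like this (all names here, such as `TileCulling`, `TILE_SIZE`, and `visibleTiles`, are illustrative and not part of any OpenFL API):

```haxe
typedef TileIndex = { col:Int, row:Int };

class TileCulling {
    static inline var TILE_SIZE = 1024;

    // Returns the indices of all tiles that intersect the viewport,
    // expanded by `margin` tiles on each side so that nearby tiles
    // can be prefetched before they scroll into view.
    public static function visibleTiles(viewX:Float, viewY:Float,
            viewW:Float, viewH:Float, cols:Int, rows:Int,
            margin:Int = 1):Array<TileIndex> {
        var c0 = Std.int(viewX / TILE_SIZE) - margin;
        var r0 = Std.int(viewY / TILE_SIZE) - margin;
        var c1 = Std.int((viewX + viewW) / TILE_SIZE) + margin;
        var r1 = Std.int((viewY + viewH) / TILE_SIZE) + margin;
        var result:Array<TileIndex> = [];
        for (row in Std.int(Math.max(r0, 0))...Std.int(Math.min(r1 + 1, rows)))
            for (col in Std.int(Math.max(c0, 0))...Std.int(Math.min(c1 + 1, cols)))
                result.push({ col: col, row: row });
        return result;
    }
}
```

On each frame (or whenever the camera moves), upload textures only for the returned tiles and release GPU buffers for everything else; total GPU memory then stays bounded by the viewport size rather than the full bitmap size.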