Neko memory error: "too many heap sections"

Hi, I am testing a few OpenFL applications in Neko, and when I create a moderate number of objects (~2 million), Neko fails with the error “too many heap sections”. I assume this means that I am trying to allocate too much memory, or maybe that there are too many discrete sections in memory? Does anyone know how to overcome this issue?

Which desktop OS are you using?

I am using Windows 7, 64 bit.

I do not see the error in Neko itself, so perhaps this is a limitation of libgc, which Neko relies upon. I’m not sure if there is a way around this. Why do you have 2 million objects? :wink:

Just curious, do you think you might be generating a large number of objects from, say, a for loop?

For example, this allocates many new objects (and they may not be cleaned up quickly, depending on the platform/memory management system):

for (i in 0...1000000) {
    var x = i * 10;
}

versus

var x;
for (i in 0...1000000) {
    x = i * 10;
}

Yes, I am generating a large number of objects in a loop and adding them to an array. Is there a better way to do this? For example, is there a way to allocate from a contiguous chunk of memory? I have two million objects for the purpose of testing extreme cases of a physics engine that I am working on.
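One way to approximate “a contiguous chunk of memory” in Haxe is to flatten the per-object fields into a single preallocated `haxe.ds.Vector<Float>` instead of instantiating millions of small objects. This is only a sketch, and the class and field names here are invented:

```haxe
import haxe.ds.Vector;

// Sketch: store particle fields in one flat, preallocated buffer
// instead of millions of small GC-tracked heap objects.
class ParticleBuffer {
    static inline var STRIDE = 4; // floats per particle: x, y, vx, vy (invented layout)

    var data:Vector<Float>;
    public var count(default, null):Int;

    public function new(count:Int) {
        this.count = count;
        data = new Vector(count * STRIDE); // one contiguous allocation
    }

    // Accessors index into the flat buffer; no per-particle allocation.
    public inline function getX(i:Int):Float {
        return data[i * STRIDE];
    }

    public inline function setX(i:Int, v:Float):Void {
        data[i * STRIDE] = v;
    }
}
```

Two million particles then cost one allocation of 2,000,000 × 4 doubles rather than two million separate objects, and the same accessor pattern extends to the remaining fields.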

It might be a heap vs. stack issue. Is it possible to move the “var” declarations out of the for loop, as in my example above? It helps in C++; I’m not sure how it would interact with Neko, but it’s worth a try.

Yeah, that would be possible. I will give it a try and let you know if it works.

Frankly, I think that you are making things m-u-c-h too hard on “your ever-faithful computer!”   :open_mouth:  

Surely there is no earthly reason why you must actually instantiate “two million(!) objects” to do anything-at-all.

Even if you are dealing with some scenario where “by definition, up to 2 million objects could exist,” such a definition (ahh, “by definition” :grimacing:) merely describes “the mathematical domain of the function,” not the actual number of objects that are required to exist right now!

What you need to invent, right now, is some kind of container class which, given some object-ID in “this range of up to 2 million,” will instantiate a new object instance on demand, and that will also limit the total size of the pool, automatically discarding any objects that have not actually been referenced in a sufficient amount of time.

Be nice to the poor computer!   Do not require it to create 2,000,000 objects, if 1,999,993 of them will never actually be used!   (“Hey, what did the poor thing ever do to you?”)
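The size-capped, build-on-demand container described above might look something like this in Haxe. This is only a sketch with invented names, using a crude “drop the least recently used entry” eviction policy:

```haxe
// Sketch of a lazy, size-capped cache: objects are built on demand from an
// ID and evicted once the pool is full. All names here are invented.
class LazyPool<T> {
    var cache:Map<Int, T> = new Map();
    var order:Array<Int> = [];   // oldest ID first; crude LRU bookkeeping
    var maxSize:Int;
    var build:Int -> T;          // factory that creates the object for an ID

    public function new(maxSize:Int, build:Int -> T) {
        this.maxSize = maxSize;
        this.build = build;
    }

    public function get(id:Int):T {
        if (cache.exists(id)) {
            order.remove(id);    // O(n) scan; fine for a sketch
        } else {
            if (order.length >= maxSize) {
                cache.remove(order.shift()); // evict least recently used
            }
            cache.set(id, build(id));
        }
        order.push(id);          // mark as most recently used
        return cache.get(id);
    }
}
```

Usage would be something like `var pool = new LazyPool(10000, function(id) return new Node(id));` — only the working set is ever live, not all 2 million instances.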

I can think of dozens of reasons off the top of my head, especially when instances require a large amount of precomputation.

Hmm, maybe if your objects are larger than 2 KB you are hitting the 4 GB RAM limit of the x86 architecture?
Neko, AFAIK, is a 32-bit library.

No, total memory usage is about 700 MB (which is completely ridiculous, as each object consists of only about 10 floating-point fields). When I increase the size of the objects (to about 100 floating-point fields), memory usage increases slightly, but Neko still throws the error at around 2 million objects. This is annoying, as I can easily think of traversable graph structures in games (for large-scale pathing, for example) with more than 2 million points.
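For scale, a back-of-envelope check (my assumption: 8-byte doubles, and the numbers quoted above):

```
2,000,000 objects × 10 float fields × 8 bytes ≈ 160 MB of raw field data
700 MB observed ÷ 2,000,000 objects           ≈ 350 bytes per object
350 − 80                                      ≈ 270 bytes/object of boxing,
                                                pointer, and GC-header overhead
```

So most of the 700 MB would be per-object overhead rather than the field data itself.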

Could you check whether this will work?

        var arr = [];
        for (i in 0...128000000) {
            arr.push(i);
        }

        while (true) {
            trace(arr[127999999]);
        }

On the Neko target I got 2.5 GB of RAM occupied, but the app is working.
On the cpp target I got 700 MB of RAM occupied, and the app is also working.

With

        var arr = [];
        var arr2 = [];
        for (i in 0...128000000) {
            arr.push(i);
            arr2.push(i);
        }

        while (true) {
            trace(arr[127999999]);
            trace(arr2[127999999]);
        }

on the cpp target I got 1.3 GB of RAM and the app is working. I am running Linux.

It doesn’t seem to work. I can only get up to about 1.2 GB before Neko crashes. C++ is crashing at around 500 MB. It looks more and more like it is just a problem with my FlashDevelop install/compilers. I probably have some obscure compiler option turned on that limits total memory usage in a weird way :stuck_out_tongue:

Next time I work on this, I’ll just try reinstalling Haxe, Neko, and FlashDevelop and see if that works.

Maybe you could try building from the command line?

Yeah, I’ll just stick with my original comment on this one:   “you have no choice but to delay, until the last possible moment, the actual allocation of anything.”   Furthermore, you must also advance, to the earliest possible moment, the disposal of anything.

“Advance calculations,” of course, “are easy, and virtually free-of-charge.”   Even the slowest machine usually has plenty of time on its hands with which to perform background calculations, so long as any resource allocations indicated by such calculations are deferred to the last practicable moment…