Invalid buffer length

I'm finding that my game crashes quite frequently (Windows and Mac CPP targets) at the moment, and I receive the error “Invalid buffer length”. If I run in debug I get the stack trace below. Anyone know what might be happening here? Really appreciate the help, thanks :slight_smile:

Called from ApplicationMain::main ApplicationMain.hx line 122
Called from ApplicationMain::create ApplicationMain.hx line 55
Called from lime/app/Application.hx line 158
Called from lime._backend.native.NativeApplication::exec lime/_backend/native/NativeApplication.hx line 102
Called from extern::cffi C:/Users/Jay/hxcpp/3,2,81/src/hx/Lib.cpp line 131
Called from lime._backend.native.NativeApplication::handleRenderEvent lime/_backend/native/NativeApplication.hx line 59
Called from lime/app/Application.hx line 519
Called from openfl.display.Stage::render openfl/display/Stage.hx line 898
Called from openfl._internal.renderer.opengl.GLRenderer::render openfl/_internal/renderer/opengl/GLRenderer.hx line 274
Called from openfl._internal.renderer.opengl.GLRenderer::renderDisplayObject openfl/_internal/renderer/opengl/GLRenderer.hx line 291
Called from openfl.display.DisplayObjectContainer::__renderGL openfl/display/DisplayObjectContainer.hx line 879
Called from openfl.display.DisplayObjectContainer::__renderGL openfl/display/DisplayObjectContainer.hx line 879
Called from openfl.display.DisplayObjectContainer::__renderGL openfl/display/DisplayObjectContainer.hx line 879
Called from openfl.display.DisplayObjectContainer::__renderGL openfl/display/DisplayObjectContainer.hx line 875
Called from openfl.display.DisplayObject::__renderGL openfl/display/DisplayObject.hx line 1134
Called from openfl._internal.renderer.opengl.utils.GraphicsRenderer::render openfl/_internal/renderer/opengl/utils/GraphicsRenderer.hx line 820
Called from openfl._internal.renderer.opengl.utils.GraphicsRenderer::updateGraphics openfl/_internal/renderer/opengl/utils/GraphicsRenderer.hx line 1016
Called from openfl._internal.renderer.opengl.utils.GLStack::upload openfl/_internal/renderer/opengl/utils/GraphicsRenderer.hx line 1250
Called from openfl._internal.renderer.opengl.utils.GLBucket::upload openfl/_internal/renderer/opengl/utils/GraphicsRenderer.hx line 1471
Called from openfl._internal.renderer.opengl.utils.GLBucketData::upload openfl/_internal/renderer/opengl/utils/GraphicsRenderer.hx line 1543
Called from lime.utils.Float32Array::subarray lime/utils/Float32Array.hx line 150
Called from lime.utils.Float32Array::new lime/utils/Float32Array.hx line 95
Called from lime.utils.ArrayBufferView::new lime/utils/ArrayBufferView.hx line 69

Are you just using drawTriangles()? Try using legacy (set it before adding openfl in your XML) to see if it's a bug in your code. It looks like a buffer is created somewhere and the check (byteLength + byteOffset > buffer.length) is triggering. It could be a bug in the new release.

Thanks juakob,

I can confirm that the error is coming from just after a drawTriangles() call.

I wasn’t able to get the game to run with legacy or hybrid modes. But when I commented out the drawTriangles() call the game no longer crashed.

I should mention that the game runs fine, until I finish or quit out of a level and then attempt to reload the same level. As soon as the new level begins it crashes.

It looks like the drawTriangles() call is being made successfully; it's only when the graphics class comes to execute the __commands that the crash occurs. I wonder whether things are not being properly reset? What would a buffer be in this case?

I’d like to see the typed array implementation replaced

This appears to occur in the constructor of a typed array, using code something like this:

var uint8Array = new UInt8Array (byteArray, 100, 100);

It probably doesn't look exactly like that, but the error is saying that 100 + 100 is greater than byteArray.length, if this were the code.

This could be coming from commands internally in the renderer; perhaps something, like you said, is not being fully reset.

If you are rendering to an empty object, what happens if you switch to a new object instead of re-using the old one? Same thing?

Bit of an odd one. If I create a new canvas (Sprite) to render to, then I can now reset when I have only one spine object (using drawTriangles), but if I have more than one I get the crash whenever I reset.

I wonder if it's an issue inside the renderer, or if there are just some additional safeties that they put into the Flash drawTriangles that we don't have (so it really is bad data).

Does your code run on Flash or HTML5?

It's CPP for Windows and also for Mac (though drawTriangles() results are invisible on Mac builds)

I think it is some kind of internal rendering thing (from the error message) but it is related to executing the drawTriangles.

I may be wrong, but I believe the things that need to be rendered are pushed to an array called __commands in the graphics object of the Sprite class. These must then be executed later, and I think it is at that point that the error occurs.
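If it helps, the pattern I mean looks roughly like this (illustrative TypeScript with made-up names, not OpenFL's actual internals):

```typescript
// Deferred-command pattern: draw calls only record commands; a later
// render pass replays them, which is where bad data finally blows up.
type DrawCommand =
  | { kind: "drawTriangles"; vertices: number[]; indices: number[] }
  | { kind: "clear" };

class GraphicsLike {
  private commands: DrawCommand[] = [];

  // Recording always succeeds, even if the data is bad...
  drawTriangles(vertices: number[], indices: number[]): void {
    this.commands.push({ kind: "drawTriangles", vertices, indices });
  }

  clear(): void {
    this.commands = [];
  }

  // ...the crash only surfaces here, when the renderer executes the
  // command list and builds its GPU-side buffers.
  execute(): number {
    let trianglesDrawn = 0;
    for (const cmd of this.commands) {
      if (cmd.kind === "drawTriangles") {
        if (cmd.indices.length % 3 !== 0) {
          throw new RangeError("Invalid buffer length");
        }
        trianglesDrawn += cmd.indices.length / 3;
      }
    }
    return trianglesDrawn;
  }
}
```

That would explain why the drawTriangles() call itself returns fine and the crash appears a frame later, inside the render event.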

Ah, re-reading I see what you mean. It's not really possible to build the game in Flash or HTML5, as there are far too many CPP-specific things we are doing in there. I've given it a go, but it's just not really working. Plus the game is in widescreen HD, so I think it might make Flash explode!

Any further thoughts or help on this?

The buffer relates to an array being created by the renderer as the graphics are updated. The GL renderer has been updated in the last four days, so it is likely that the bug you experienced has since been fixed. Although, if there is anything to go by, this line may give us a clue (there is a massive commented-out part, although it may not directly relate to the rendering of triangles).

From what I can see going on in the background, OpenFL creates an index buffer to upload if no length is provided, i.e. when you don't specify the indices, as seen in the drawTriangles() signature:

drawTriangles (vertices:Vector<Float>, ?indices:Vector<Int>, ?uvtData:Vector<Float>, ?culling:TriangleCulling)

I’m no expert on OpenGL, so I may be wrong, but try applying the second parameter and see what happens.

Another thing you can check is whether the length of your indices array is always divisible by 3: list.length % 3 == 0
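A quick pre-flight check along those lines might look like this (TypeScript sketch; the function name is made up):

```typescript
// Validate drawTriangles-style input before handing it to the renderer:
// vertices come in (x, y) pairs, every triangle needs exactly 3 indices,
// and every index must point at a real vertex.
function validateTriangleData(vertices: number[], indices: number[]): boolean {
  if (vertices.length % 2 !== 0) return false; // incomplete (x, y) pair
  if (indices.length % 3 !== 0) return false;  // partial triangle
  const vertexCount = vertices.length / 2;
  return indices.every((i) => Number.isInteger(i) && i >= 0 && i < vertexCount);
}

// One triangle over three vertices: valid.
console.log(validateTriangleData([0, 0, 1, 0, 0, 1], [0, 1, 2])); // true

// Index 2 points past the two vertices provided: invalid.
console.log(validateTriangleData([0, 0, 1, 0], [0, 1, 2])); // false
```

Running something like this in a trace before each drawTriangles() call would rule out bad data on the caller's side.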

@brutalexcess - thanks so much for looking in to this. I switched to the nightly-updated development builds which presumably have the updated GL renderer, but unfortunately this had no effect on the error.

I also do provide indices, and have used a trace statement to confirm that these are definitely being provided, which does make it very strange that an index buffer is being created, as that line clearly points out…

@juakob thank you also. I can confirm that the length is indeed always divisible by 3.

My current fix is below for ArrayBufferView.hx. Extremely ugly and hacky - but at the moment continued development is impossible. I’ll update this thread if I’m ever able to fix it:

if (byteLength + byteOffset > buffer.length) {
	//throw "Invalid buffer length";
	while (byteLength + byteOffset > buffer.length)
		byteLength--;
}

Surely that would eventually start reducing the length of your triangle data? An alternative would be to draw them manually, avoiding any calls that would otherwise write to the buffer.

Also, do you clear the graphics at any point when you reset levels? It is unlikely, but it is possible that you are duplicating instead of resetting, and that is causing strange things to occur.

Also, are you using beginFill() and endFill() calls? It is also possible that drawing things like triangles (to distort bitmaps) without first calling endFill() may result in unexpected feedback.

I looked into the index buffer a bit more, and it's clearer to me now that the buffer is designed to render graphics. If you add graphics to a buffer that has not been created yet, you end up with the “length is greater than” error, because the buffer is too small to write to, or is null and can't be drawn to.

In C++, most drawing routines generally start by allocating memory (creating a buffer), then shifting bits into it (drawing to the buffer) and finally releasing it (finalising the buffer) so that it can be presented.

Make sure that you call begin…() to “initiate” the buffer, and end() to “finalise” the buffer. What also seems likely is that if you are drawing any more than what looks to be just 32 bits, you can only draw a maximum of 32 bits per render. It is not essential to call end() on the graphics, but it may be wise in this case, and it is also good practice.
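That allocate/draw/finalise lifecycle might be sketched like this (hypothetical names, TypeScript for illustration; OpenFL's real beginFill()/endFill() take colour and alpha arguments):

```typescript
// A begin/draw/end session: drawing is only legal between begin and end,
// mirroring the allocate -> write -> finalise lifecycle described above.
class FillSession {
  private buffer: number[] | null = null;

  beginFill(): void {
    this.buffer = []; // "allocate" the buffer
  }

  draw(x: number, y: number): void {
    if (this.buffer === null) {
      throw new Error("draw() called before beginFill()");
    }
    this.buffer.push(x, y); // write into the buffer
  }

  // Finalise: hand the data off and release the buffer.
  endFill(): number {
    if (this.buffer === null) {
      throw new Error("endFill() called before beginFill()");
    }
    const points = this.buffer.length / 2;
    this.buffer = null;
    return points;
  }
}
```

Forgetting the end() step would leave the buffer in the "open" state across frames, which is the kind of stale state that could plausibly surface only on a level reset.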

I learned a lot of this mostly from C++ tutorials ;p