Antialiasing on cpp targets

Hi,

Sorry if this question has been asked before, I couldn’t find any solutions after some searching.

I can’t figure out how to enable antialiasing on cpp targets. Setting the “antialiasing” attribute in the project.xml file doesn’t seem to do anything.

<window width="800" height="600" background="0x000000" orientation="landscape" vsync="true" antialiasing="4" if="cpp" />

This is the result using the openfl graphics API to draw a circle and a line:

I know I can do it by using BitmapData.draw and enabling smoothing, but as I understand it this triggers CPU rendering right? Can antialiasing not be done on the GPU?
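For reference, here's a minimal sketch of that workaround (the shape, sizes, and colors are just placeholders): rasterize the vector Sprite into a BitmapData with the smoothing argument of draw() set to true, then display it via a Bitmap with smoothing enabled so it also stays smooth when scaled.

```haxe
import openfl.display.Bitmap;
import openfl.display.BitmapData;
import openfl.display.Sprite;

class SmoothCircle {

    public static function create():Bitmap {

        // Draw the vector shape as usual
        var shape = new Sprite();
        shape.graphics.beginFill(0xFF0000);
        shape.graphics.drawCircle(50, 50, 50);
        shape.graphics.endFill();

        // Rasterize it; the sixth argument enables smoothing
        // (this is the CPU-side render being discussed)
        var data = new BitmapData(100, 100, true, 0x00000000);
        data.draw(shape, null, null, null, null, true);

        // Enable bitmap smoothing so the result stays smooth when scaled
        var bitmap = new Bitmap(data);
        bitmap.smoothing = true;
        return bitmap;
    }
}
```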

I’d really appreciate any help.

Thanks

Edit: I get the same result on beta 3 and in -dlegacy mode

You could set cacheAsBitmap = true on the Sprite you are drawing.


According to this (older) post:
http://www.openfl.org/archive/community/general-discussion/openfl-gpu-and-hardware-accelleration-performance-what-should-i-be-using-doing-ensure-bitmaps-are-hardware-accelerated/

it should use hardware rendering.

I can't confirm with certainty whether this holds for cpp targets; I'm relatively new to Haxe myself.

Maybe one of the makers can give more insight on this…

cacheAsBitmap = true will force SOFTWARE rendering, and has nothing to do with smoothing.

For GPU anti-aliasing, one method is to use Tilesheet.drawTiles() with smoothing set to true. If you are using bitmap graphics, you can also set bitmap.smoothing to true.

I cannot guarantee that BitmapData.draw() on the OpenFL 3.0 beta uses the GPU (the old version used the CPU), but drawTiles() always uses the GPU on native targets.
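A minimal sketch of the drawTiles() approach, assuming the legacy Tilesheet API and a 64×64 tile sitting at the top-left of some source BitmapData (the tile size and position are placeholders):

```haxe
import openfl.display.BitmapData;
import openfl.display.Sprite;
import openfl.display.Tilesheet;
import openfl.geom.Rectangle;

class TileExample extends Sprite {

    public function new(sheetData:BitmapData) {
        super();

        // Register a tile region on the sheet
        var tilesheet = new Tilesheet(sheetData);
        var tileId = tilesheet.addTileRect(new Rectangle(0, 0, 64, 64));

        // tileData is a flat array of [x, y, tileID, ...] entries;
        // the third argument enables smoothing, and on native targets
        // drawTiles renders on the GPU
        tilesheet.drawTiles(graphics, [100.0, 100.0, tileId], true);
    }
}
```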

But since you’ve brought it up, I do hope the anti-aliasing setting in project.xml works in newer versions. Starling’s Stage3D native anti-aliasing works fine, but somehow the one here does not. It would be really nice if this feature could be fixed in newer versions :smiley: thx @singmajesty

Oops, it seems I had something like a “left-right disorientation” when reading the thread I posted above…

So cacheAsBitmap = true is basically quality=good / performance=bad.

Sorry for adding a bit of nonsense to this question.

Thanks for the responses. It seems that for now, if I’m using the graphics API to make the shapes, I’ll have to use CPU antialiasing. That should be fine though; the game I’m making shouldn’t be too CPU intensive.

Thanks

Yeah, it’s kind of an old thread.

By the way, I’m not sure cacheAsBitmap = true automatically enables bitmap smoothing; the quality of the display object may not change at all.

Most of the time this property is used mainly for caching (as the name implies): when you have, say, a sprite containing a lot of complex vector graphics, you can enable cacheAsBitmap to store all of it as a bitmap, which renders faster (as long as the sprite’s rotation, scale, etc. are not changed). But that was on the Flash platform, where all vector art was rendered on the CPU. Since we are talking about cpp targets, where vector graphics are already rendered on the GPU, I don’t see many common reasons to use cacheAsBitmap.
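To illustrate the caching use case described above, a small sketch (the shape is a placeholder):

```haxe
import openfl.display.Sprite;

class CachedArt {

    public static function create():Sprite {

        // Imagine this sprite holds many complex vector shapes
        var vectorArt = new Sprite();
        vectorArt.graphics.beginFill(0x3366FF);
        vectorArt.graphics.drawCircle(40, 40, 40);
        vectorArt.graphics.endFill();

        // Cache the vector content as a bitmap; it is only
        // re-rasterized when the sprite's contents change
        vectorArt.cacheAsBitmap = true;
        return vectorArt;
    }
}
```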

The newer renderer will use hardware when cacheAsBitmap is available. The older renderer used software, which improved the smoothing of graphics shapes.

Can you use pre-rendered bitmaps for your shapes? Those will be smooth and perform faster.

I’m generating the graphics at runtime, and they’ll be scaled/distorted a lot, so vectors will make the whole process much easier. Fortunately (after running a couple of tests) it looks like performance won’t really be an issue even with CPU rendering (using BitmapData.draw), so I’ll make do with that for now. It’s good to hear that the new cacheAsBitmap will use hardware :slight_smile: I’ll have to try it out soon.