Et tu, Enum?

Hidden Memory Allocations

I’ve worked hard to keep memory allocations out of the main loop of the game I’m working on. All the game entities, the components, the event and messaging systems, all those data structures get recycled. The pooling system in LibGDX has been a big help, but there’s still a lot of bookkeeping logic required to make it all work correctly.

So today I finally got a chance to see how well I’d done by running the code on an actual device with DDMS watching allocations. I started tracking allocations, played through, and this is what I got:

[Screenshot: DDMS allocation tracker]

The first three lines are the allocation tracker starting up, and then there are thousands of identical 56-byte allocations. That’s it. That’s the only allocation during gameplay until the game is over. So what’s happening?

The rendering code iterates through different render levels so that the back-to-front draw order is correct. I put the set of possible render levels in an enum called RenderLevel. Then at draw time:
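The draw-time code looked roughly like this (a minimal sketch; the enum constants and method names here are my guesses, not the game’s actual code):

```java
// Hypothetical render levels, drawn back to front.
enum RenderLevel { BACKGROUND, TERRAIN, ENTITIES, FOREGROUND }

class Renderer {
    void draw() {
        // This runs once per frame -- and this call to values()
        // is the line that turned out to allocate.
        for (RenderLevel level : RenderLevel.values()) {
            drawLevel(level);
        }
    }

    void drawLevel(RenderLevel level) {
        // ... render all entities at this level ...
    }
}
```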

That call to RenderLevel.values() is returning a new array each frame. I knew that for(:) loops can cause hidden iterator allocations when working over collections (see LibGDX’s collections for a solution), but I would have guessed that enums would be more efficient. It’s low-level code and enums can’t change, so surely values() would just allocate one array, right?

Obviously I was wrong. Is there a reason values() doesn’t return the same array each time it’s called? Presumably it’s defensive: Java arrays are mutable, so handing every caller the same shared array would let any one of them corrupt it for everyone else.

For the solution, I just did what I wish values() would do: cache the array in a field on the enum. Then I can iterate over that without any heap allocations.
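A sketch of the fix (the enum and field names here are my own):

```java
// Cache the array that values() would otherwise rebuild on every call.
enum RenderLevel {
    BACKGROUND, TERRAIN, ENTITIES, FOREGROUND;

    // values() returns a fresh defensive copy each time, so grab
    // one copy at class-load time and reuse it forever.
    public static final RenderLevel[] VALUES = values();
}

class Renderer {
    void draw() {
        // Iterating a plain array with for(:) compiles to an index
        // loop -- no iterator object, no per-frame heap allocation.
        for (RenderLevel level : RenderLevel.VALUES) {
            drawLevel(level);
        }
    }

    void drawLevel(RenderLevel level) { /* render entities at this level */ }
}
```

The one caveat is that the cached array is shared, so nothing should ever write to it.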

Problem solved!
