In my 2D platformer game, any object that can interact with the world around it is called an Interactable and is expected to be added to a level. So far, I've been adding each Interactable into an ArrayList which I constantly loop through to make each Interactable run.

So far, this approach hasn't really faltered, because I'm typically not adding and removing objects constantly. However, I'm considering switching to a LinkedList, seeing as I plan to have Interactables added and removed at a fast pace. Would using a LinkedList be beneficial in this scenario?

LinkedList is rarely useful; it has a lot of overhead even in the cases where it is supposed to be faster. For a very, very large list where objects are constantly removed at random positions, hundreds of times per frame or so, LinkedList might be faster. The problem with LinkedList is that it creates an Entry object, wrapped around each element added, to store the next and previous entries. This allocation (and later garbage collection) makes adding and removing stuff a lot slower than with ArrayList in most use cases. ArrayList's only weakness is removing objects from the beginning or middle of the list, since all following objects have to be shifted down to fill the hole; the longer the list, the slower that becomes.

Even when you have lots of stuff to remove, ArrayList can be a good choice. Lists are often used to keep track of game objects. So you say: "But wait! The objects will always be added at the end of the list, but as they die they will be removed! I have thousands of objects, so I should use a LinkedList to avoid shifting thousands of objects on every remove()!" Nope! Most likely the best solution is to use TWO ArrayLists and ping-pong your objects between them. That way you avoid all shifting while still getting fast (and random) access to all objects.
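A sketch of the two-ArrayList ping-pong technique (GameObject and its alive flag are hypothetical stand-ins, not from the thread):

```java
import java.util.ArrayList;

// Sketch of the two-ArrayList "ping-pong" technique. GameObject and its
// alive flag are hypothetical stand-ins for whatever the game tracks.
public class PingPong {
    static class GameObject {
        boolean alive = true;
        void update() { /* game logic; may set alive = false */ }
    }

    ArrayList<GameObject> current = new ArrayList<>();
    ArrayList<GameObject> next = new ArrayList<>();

    void updateAll() {
        for (GameObject o : current) {
            o.update();
            if (o.alive) {
                next.add(o); // survivors are copied; dead ones simply dropped
            }
        }
        // Swap the two lists instead of allocating a new one each frame.
        ArrayList<GameObject> tmp = current;
        current = next;
        next = tmp;
        next.clear(); // keeps its capacity, so no reallocation next frame
    }
}
```

No element is ever shifted: live objects are appended to the spare list in order, and clearing the old list touches no survivors.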

Also, doesn't LinkedList leave null spaces in the list if you remove an object? Or was that Hashtable...

Nope, LinkedList does not leave null spaces. Only arrays do that (obviously). Hashtables/HashMaps KIND OF do that, but that's simply because the key no longer maps to anything, so it returns null instead. Not really the same thing, in my opinion.

Though it has to be said: for the number of things typically found in an ArrayList, removing things from the head of the list is often so fast you'd never ever notice it. It depends on how many things are likely to be in it...

Consider a real-life example of, hmm... particles. So you've got 5,000 particles in a frame (a high figure, but not unlikely), stored in an ArrayList. Each entry in the ArrayList is a 4-byte reference, so that's 20KB of data, fitting neatly in the L1 data cache on many if not all desktops.

Now, imagine the rather unlikely event that every single particle dies in one frame, and you discover this as you iterate through the ArrayList (using an integer index, not an iterator, just for absolute efficiency's sake). You have to shuffle 4,999 particles down 4 bytes. It's all in the L1 cache, so memory access is effectively free to the CPU; you spend roughly 4,999 cycles moving your particles, assuming the loop that copies the data is about as simple and efficient as it can be. Then you do it again for the next particle: 4,998 cycles. Repeat, and the total is a triangular number, approximately (5000 x 5000) / 2, or about 12,500,000 clock cycles in the end, most likely without ever touching the L2 or L3 caches, let alone system RAM, until the end of the operation.

That's 12.5 million cycles out of the 33 million or so you've typically got in a single video frame (a 2GHz core at 60fps). Oh dear, you just spent over a third of a frame ditching a mere 5,000 particles. Judder. Imagine if you had, somehow, 10,000 particles: that's 50 million cycles, more than an entire frame wasted doing something utterly trivial, and it would present itself as a horrible jarring delay in your buttery smooth animation.

But what if, weirdly but possibly, the worst case occurred even when going backwards: every other particle dies in one frame? Well, then you'd still be shifting survivors to compact the list even if you were scanning backwards. Scanning forwards would be exactly the same speed too.

So you'd be crafty and use the third and final technique: keep a second ArrayList and copy the surviving particles into it each frame. This performs consistently no matter how many live or dead particles you have in any particular frame, or the pattern of their expiration.

Anyway, for the OP: basically you need to do just that. Each frame, scan your list of Interactables and copy each live one into a second ArrayList, then point your game at that ArrayList. Flip between the two ArrayLists, alternating each frame, rather than creating (and expanding!) a new ArrayList every frame, or that'll be slow.

The solutions proposed by theagentd and princec work when every object in the list needs to be checked for state. But DrewLol did not state that that is how he decides what gets removed. He might have an object report that it has died and now needs to be removed. In that scenario, removing from the list involves an O(n) search. In that case a HashSet or HashMap might actually be better, because it has O(logn) search time.

You have a system set up so that an object will notify some controller object that it has finished being useful and should be removed from the list. Something like a bullet that's just discovered that it impacted against a wall. It'll tell the physics controller "I impacted, remove me from the list of collision objects please!"

In that case, depending on how you wrote your code, the physics controller would have to spend some time searching through its list of objects and remove it.

Since the controller loops through the list of objects to call their update() method, couldn't it simply check whether the object it's updating has outlived its usefulness and discard it right there, without looping through the list an extra time per discarded object like theagentd's code does?
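The check-and-discard-during-update idea can be sketched like this (Updatable, isDead() and Bullet are hypothetical names), using a swap with the last element so nothing is ever shifted:

```java
import java.util.ArrayList;

// Discard objects during the update loop itself. When one dies, its slot is
// overwritten with the last element and the list shrinks by one: an O(1)
// "swap remove" that never shifts the rest of the list (order is not kept).
public class Controller {
    interface Updatable {
        void update();
        boolean isDead();
    }

    final ArrayList<Updatable> objects = new ArrayList<>();

    void updateAll() {
        for (int i = 0; i < objects.size(); ) {
            Updatable o = objects.get(i);
            o.update();
            if (o.isDead()) {
                int last = objects.size() - 1;
                objects.set(i, objects.get(last));
                objects.remove(last); // removing the tail never shifts anything
                // don't advance i: the swapped-in object still needs its update
            } else {
                i++;
            }
        }
    }

    // Simple concrete object for demonstration purposes.
    static class Bullet implements Updatable {
        boolean dead;
        public void update() { /* move, collide, maybe set dead = true */ }
        public boolean isDead() { return dead; }
    }
}
```

The trade-off is that the list's order changes as objects die, which is usually irrelevant for game objects.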

Moreover, if the object itself needs to notify the controller, you might discard the object mid-update or cause concurrency issues: a situation where the controller is no longer in control, and the objects it's supposed to control push it around.
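One common way to keep the controller in control is to queue removal requests and apply them only after the update loop, so the list is never modified mid-iteration. A sketch under assumed names (PhysicsController, Body, requestRemoval are all hypothetical, not from the thread):

```java
import java.util.ArrayList;

// Objects request removal via a callback mid-update, but the controller only
// applies the removals after the loop, so the list is never modified while
// it is being iterated.
public class PhysicsController {
    interface Body {
        void update(PhysicsController controller);
    }

    final ArrayList<Body> bodies = new ArrayList<>();
    private final ArrayList<Body> pendingRemovals = new ArrayList<>();

    // Called by a body mid-update ("I impacted, remove me please!").
    public void requestRemoval(Body b) {
        pendingRemovals.add(b);
    }

    public void updateAll() {
        for (Body b : bodies) {
            b.update(this); // may safely call requestRemoval()
        }
        bodies.removeAll(pendingRemovals); // applied once, after the loop
        pendingRemovals.clear();
    }
}
```

Note that removeAll() still searches and shifts, so for large lists this combines well with the survivor-copy technique described earlier.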

I can't really conceptualize the scenario you're describing, I could be wrong.

If I have a game that is moving at 30 - 40 frames a second I am not going to be looping through lists of objects needlessly at each of those frames. For the most part nothing is happening of interest during most of those loops. You need to save your CPU for animation, math, collision detection, event handling. A callback to a controller when an event occurs is more efficient.


I was wondering why no one else had steered DrewLol away from using a List at all. Out of all the Java Collections, a HashSet seems to be the best option for his scenario. (By the way, hash tables have O(1) add, remove, and query times. Not log n.) In my projects I use a Bag (though slightly different than kappa's, and more like a special-purpose set than a general-purpose bag or multiset.)

I often work directly with an array. In order to remove "dead" objects I either implement a normal remove(GameObject o) method, or (more frequently) implement something like removeDeadObjects() or inline code to do the same once per update, or use the Iterator's remove method, since removal can be done in O(1) time with one swap.
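kappa's actual Bag isn't shown in the thread; a minimal sketch of the general idea (array-backed, order not preserved, remove by swapping in the last element) might look like:

```java
// Minimal array-backed "Bag": add is amortized O(1) and remove(int) is O(1),
// because it swaps the last element into the hole instead of shifting.
// This is a sketch of the general idea, not kappa's actual implementation.
public class Bag<E> {
    private Object[] data = new Object[16];
    private int size;

    public void add(E e) {
        if (size == data.length) { // grow by doubling when full
            Object[] bigger = new Object[data.length * 2];
            System.arraycopy(data, 0, bigger, 0, size);
            data = bigger;
        }
        data[size++] = e;
    }

    @SuppressWarnings("unchecked")
    public E get(int i) {
        return (E) data[i];
    }

    @SuppressWarnings("unchecked")
    public E remove(int i) {
        E removed = (E) data[i];
        data[i] = data[--size]; // move the last element into the hole
        data[size] = null;      // drop the stale reference for the GC
        return removed;
    }

    public int size() {
        return size;
    }
}
```

As with any swap-remove, iteration order is not stable, which is the price paid for never shifting elements.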

I don't think so. By "kind" do you mean class? Sets can't store duplicates. Bags and Lists can, but storing multiple references to the same instance sounds like a source of bugs.

Car c1 = new Car(), c2 = new Car();
Building b = new Building();
collection.add(c1);
collection.add(c2); // Fine so far
collection.add(b);
collection.add(c1); // Does this get stored once or twice? Is it updated once or twice per update to c1?


Ah, I did a test, and you're right! Very interesting, because my understanding of Sets was that they stored unique entries, and I guess individually allocated Objects of any kind are considered unique to the HashSet. Maybe this is common knowledge, but I just took a data structures course, and since that's what I was taught, I figured it would hold true for this scenario. :p Cool!
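The test might have looked something like this (Particle is a hypothetical class with no equals()/hashCode() overrides, so the HashSet falls back on Object's identity-based versions):

```java
import java.util.HashSet;

public class IdentityTest {
    // No equals()/hashCode() overrides, so HashSet falls back on Object's
    // identity-based versions: every separately allocated instance is "unique".
    static class Particle {
        int x, y;
        Particle(int x, int y) { this.x = x; this.y = y; }
    }

    public static void main(String[] args) {
        HashSet<Particle> set = new HashSet<>();
        Particle a = new Particle(1, 2);
        Particle b = new Particle(1, 2); // identical attributes, distinct object
        set.add(a);
        set.add(b);
        System.out.println(set.size()); // prints 2: both are stored
        set.add(a);                     // the same instance again...
        System.out.println(set.size()); // still 2: true duplicates are rejected
    }
}
```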

On a side note, Best Username, I've decided I may wish to start using Bags in my own games! ;] I know credit goes to kappa for it, but thank you for introducing me! I'm looking forward to doing some test cases to see how much faster it works in comparison to ArrayLists.

Thanks! I guess I should jot down that one in my programming notebook for future purposes... my game engine could probably use a bit of that, now that I think about it

EDIT: Wait, so I was technically right about this?

Well, actually, I suppose that without implementing equals() the HashSet would work just fine for the OP's purposes; it just depends on whether that kind of functionality is added to the class. If it were me and I wanted the supposed performance benefits, I guess I wouldn't add it to classes I knew would be added to the set. But now that Riven has basically ripped my code a new one and taught me a lesson, I think I'd much rather implement things that way!

I don't quite get how performance has anything to do with it. Implementing .equals(...) and .hashCode() is critical for HashMap/HashSet to work.

Sorry Riven! I meant the performance of the HashSet, not the classes to be placed within the HashSet! :p

It's just that he was saying he could get certain performance benefits from using the HashSet for these purposes. But it seems to me that if objects passed into the set compare equal, this wouldn't be useful in a situation like the one the OP described, where he essentially has a stage of Interactables. My point was: what if there are two Interactables with the same attributes? They wouldn't both be preserved in the set if this were the case, making the data structure worthless. Thus, to get the performance benefits he was talking about, he would have to omit those two methods you wrote into my code, which made the set recognize two objects with identical attributes as duplicates. Otherwise it would retain what makes it a set; and without the equals() and hashCode() methods it seems to do little more than act as a list for those half-complete objects.

You could argue that java.lang.Object was poorly designed; tying the root class of a type hierarchy to some hash-based collection seems rather awkward. I mean, you don't see Object defining compareTo(...) or compare(..., ...) to tie in directly with TreeMap or PriorityQueue...

Anyway, enough off-topic-ness. Your 'Clown' class threw me off, it's commonly accepted that two clowns with the same name are indistinguishable, hence the implementation of .equals(...) .
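For illustration, a reconstruction of what that Clown class might have looked like (a sketch; the actual code from the thread isn't shown): with equals() and hashCode() defined over the name, two separately allocated clowns with the same name collapse to a single set entry.

```java
import java.util.HashSet;

public class Clown {
    final String name;

    Clown(String name) {
        this.name = name;
    }

    // Two clowns with the same name are indistinguishable:
    @Override
    public boolean equals(Object o) {
        return o instanceof Clown && ((Clown) o).name.equals(name);
    }

    // equals() and hashCode() must agree, or HashSet lookups misbehave.
    @Override
    public int hashCode() {
        return name.hashCode();
    }

    public static void main(String[] args) {
        HashSet<Clown> set = new HashSet<>();
        set.add(new Clown("Bobo"));
        set.add(new Clown("Bobo")); // equal to the first, so it's rejected
        System.out.println(set.size()); // prints 1
    }
}
```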

