I want to add a central data manager to my project, so that different classes can share data. It will also act as a map to avoid doubling up on resources.

I figure that there are 3 ways I can do this:

1. create the manager as a global (EVIL)

2. pass the manager to each class as it is created (UGLY)

3. create the manager as a singleton

Every post I see says to avoid singletons, but is this a situation that breaks the rule, or is there a better solution?

Also, the graphic data above is just one type of data; realistically there will be many data types. To avoid having a whole mess of singletons, I would create a central data manager that has a factory class for each type:

Oh yeah, I'm doing this in C++, but the concepts should apply to most languages.

I think the real question is: what does centralized "data management" actually entail? What does it mean for a resource to be managed? Try to have resources know how to manage themselves, e.g. using factory methods and smart pointers, instead of centralized resource managers.
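A minimal sketch of what "resources that manage themselves" can look like, using factory methods and smart pointers. The `Texture` class, its file paths, and its caching scheme are my invention for illustration, not from the thread: a factory method hands out `shared_ptr`s, and an internal `weak_ptr` cache avoids doubling up on loads.

```cpp
#include <cassert>
#include <map>
#include <memory>
#include <string>

// Hypothetical Texture resource that knows how to manage itself:
// Load() is a factory method, and a weak_ptr cache ensures the same
// file is never loaded twice while anyone still holds it.
class Texture {
public:
    static std::shared_ptr<Texture> Load(const std::string& path) {
        static std::map<std::string, std::weak_ptr<Texture>> cache;
        if (auto existing = cache[path].lock())
            return existing;  // already loaded: share it
        auto fresh = std::shared_ptr<Texture>(new Texture(path));
        cache[path] = fresh;
        return fresh;
    }
    const std::string& Path() const { return path_; }
private:
    explicit Texture(std::string path) : path_(std::move(path)) {}
    std::string path_;
};
```

Note there is no central manager here: ownership is shared among the users via `shared_ptr`, and the resource is destroyed when the last user releases it.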

Ok, so a more generic version of the question: what is the best way to share data between different classes that are completely unrelated?

Factory methods and smart pointers. Sometimes you cannot get away from a manager entirely, so managers-as-factories (like a general-purpose caching resource loader) do need to exist.

A singleton is something altogether different. It says "There can only be one of these. Ever. So let it be written." That is nearly always (but not universally) a bad thing.

Don't use singletons.

The preferred way to do things is that when an object needs data you will either pass the data as a parameter when you need it, or set it using a mutator/accessor pair (get/set method) at some time during the object's lifetime.

In practice you can get very nearly all the data to objects using this method. A programmer ought to be able to go for many months without touching anything other than direct parameters to their game objects.

Almost everything should fit this usage pattern.
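As a toy illustration of the two options described above (the `Projectile` class and its members are invented for this example):

```cpp
#include <cassert>

// The two plumbing options on a hypothetical object:
class Projectile {
public:
    // option 1: pass the data as a parameter at the moment it's needed
    float RangeAt(float gravity) const { return speed_ * speed_ / gravity; }

    // option 2: a mutator/accessor pair, set at some point in the
    // object's lifetime
    void  SetSpeed(float s) { speed_ = s; }
    float Speed() const { return speed_; }

private:
    float speed_ = 0.0f;
};
```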

Sadly, large programs have some objects that need certain bits to be accessible on a read-only basis to other parts of the system. It is not ideal, but in the real world there are occasionally good reasons to create a very minimal, tiny, global object that points to the canonical versions of certain factory methods. These may include links to things like the game clock, a handle to the main simulator, or a handle to the resource manager. These should be used sparingly. On the rare occasion that these must be used, keep the Law of Demeter in mind.

I have a "CSingleObjects" class that has those global objects as static members. Their constructors are private, but CSingleObjects is declared as a friend to them, so only it can instantiate them. In another header I have defines like
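The defines themselves didn't survive in this copy, but the class arrangement being described might look roughly like this; `CGameClock` and its members are my guess at an example member, not from the original post.

```cpp
#include <cassert>

class CSingleObjects;  // forward declaration for the friend declaration

// A class whose constructor is private, with CSingleObjects as a
// friend, so only CSingleObjects can instantiate it.
class CGameClock {
    friend class CSingleObjects;
public:
    unsigned Ticks() const { return ticks_; }
    void Advance() { ++ticks_; }
private:
    CGameClock() : ticks_(0) {}  // private: nobody else can construct one
    unsigned ticks_;
};

// The holder class: the global objects live here as static members.
class CSingleObjects {
public:
    static CGameClock clock;
};

// The definition has CSingleObjects's access rights, so the private
// constructor is reachable here.
CGameClock CSingleObjects::clock;
```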

Personally, I believe there is a widespread misconception that everything must be wrapped in a class when programming in C++. I try to stick to the "keep it simple" philosophy: keep things as simple as possible, but no simpler. Singletons are a completely over-engineered version of a global variable. I avoid the use of singletons by creating a file-scope variable that is accessed via a function. It is a clean and simple solution. As for your dilemma, I recommend this approach for getting a reference to the manager (although I prefer to pass references as parameters to the methods that need them whenever possible).
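A minimal sketch of that file-scope-variable-plus-accessor approach, with an invented `DataManager` for illustration. In practice only `GetDataManager()` would be declared in a header; the variable itself is invisible outside its translation unit.

```cpp
#include <cassert>

// Illustrative contents only -- the real manager would hold resources.
struct DataManager {
    int texturesLoaded = 0;
};

namespace {
DataManager g_manager;  // file-scope, internal linkage: no class ceremony
}

// The only exported way to reach the shared instance.
DataManager& GetDataManager() {
    return g_manager;
}
```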

Number 2 isn't ugly; it makes the dependencies between your classes explicit and obvious.
If when constructing a ModelLoader, it requires a DataCache as an argument, then it's clear that this dependency exists, and it's easy to follow the layering of the different parts of your architecture.
On the other hand, if the ModelLoader internally acquires a DataCache via a global/singleton, then this is a magical, hidden dependency. This obfuscates the architecture, makes initialization ordering non-obvious, and makes the code much more brittle (harder to maintain) in the long run, due to all the hidden interactions.

Just playing devil's advocate here, but couldn't you make the argument that that's a good thing? I.e. I need to load a model, so I create a ModelLoader; I don't actually care how it internally loads models. And if someone one day decides to change how ModelLoader loads models, I don't want my code to suddenly break.

I think the point is more that they're both rather ugly, but the ugliness of passing dependencies is a major improvement over the problems globals or singletons present in this situation. It's a balance of "more code to write and text to visually process" against "making it clear what a class does without requiring you to look inside it."

Personally, I can kind of see where it comes from: I still tend to use "manager" classes in my code a lot, just because I find them convenient and easier to follow in many cases, but I pass them around as dependencies. Having tried the singleton approach, I can honestly say that it removes clarity about what an object needs access to in order to function. When you have to pass it some kind of render manager, you know it is going to draw something; when you pass it an audio manager, you know it needs to play sound. That kind of thing.

Using globals creates dependencies that you don't even know are there: you may remove some object that is a dependency without even knowing it is one, and suddenly your happy, encapsulated little class is breaking and you have to poke around its innards to find out why.

Especially for my game states: since every game state might create another game state that needs some other of those "managers", I basically have to pass and store them into each one of them. Are there any good strategies to handle this sort of (growing) complexity? I'm talking generally about situations where I have to pass many references around that are not directly related, but could theoretically be grouped...

Number 2 isn't ugly; it makes the dependencies between your classes explicit and obvious. If when constructing a ModelLoader, it requires a DataCache as an argument, then it's clear that this dependency exists, and it's easy to follow the layering of the different parts of your architecture. On the other hand, if the ModelLoader internally acquires a DataCache via a global/singleton, then this is a magical, hidden dependency. This obfuscates the architecture, makes initialization ordering non-obvious, and makes the code much more brittle (harder to maintain) in the long run, due to all the hidden interactions.

Just playing devil's advocate here, but couldn't you make the argument that that's a good thing? I.e. I need to load a model, so I create a ModelLoader; I don't actually care how it internally loads models. And if someone one day decides to change how ModelLoader loads models, I don't want my code to suddenly break.

Yes, but no

There's 3 designs below; what you're describing could be implemented as #1 or #2. I was discouraging #1 and recommending #3. If there is a 1:1 relationship between the loader and the cache, then #2 is ok. If multiple loaders can share the cache, then you should use #3.

#3 also follows inversion of control, a principle that generally makes code more flexible, testable and maintainable in my experience -- making components that are explicitly plugged together by the user allows them to be used in many more ways than components that automatically create/link their own inter-object dependencies.

#1 is the ugliest and most fragile IMO, so it should almost never be used. It doesn't do what you're advocating -- removing any care of the internals from the user of the class -- it only seems like it does on a superficial level. In this case, the user still needs to ensure that the global caching system has been initialized before creating their ModelLoader1, and must ensure that all their ModelLoader1s have been destroyed before the global cache is destroyed.
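The code listing for the three designs didn't survive in this copy; the following is a reconstruction of what the post describes, with class contents and member names guessed for illustration.

```cpp
#include <cassert>

struct DataCache { int hits = 0; };

DataCache g_cache;  // used by design #1 only

// #1: hidden global dependency -- discouraged.
class ModelLoader1 {
public:
    DataCache& Cache() { return g_cache; }  // silently reaches for the global
};

// #2: loader owns its cache -- fine when the relationship is 1:1.
class ModelLoader2 {
public:
    DataCache& Cache() { return cache_; }
private:
    DataCache cache_;
};

// #3: cache is injected -- explicit dependency, shareable among loaders.
class ModelLoader3 {
public:
    explicit ModelLoader3(DataCache& cache) : cache_(cache) {}
    DataCache& Cache() { return cache_; }
private:
    DataCache& cache_;
};
```

With #3, a ModelLoader3 cannot even be constructed until a DataCache exists, which makes the initialization-ordering point above concrete: the compiler enforces it.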

As for your code breaking if the design of the loader is changed... with #2 and #3, you'll get compile-time errors if this occurs. With #1, your code might still compile but then crash, because the hidden global dependencies might not be correctly configured for the new design. The fact that your code breaks at compile time when design incompatibilities are introduced is, IMO, a very good thing.

Especially for my game states: since every game state might create another game state that needs some other of those "managers", I basically have to pass and store them into each one of them. Are there any good strategies to handle this sort of (growing) complexity? I'm talking generally about situations where I have to pass many references around that are not directly related, but could theoretically be grouped...

The 'context pattern' is often used to address this. If all of your game states require that long list of arguments, then you can put them into a context structure:
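The structure that followed the colon was lost in this copy; here is a sketch of the idea, with the manager types and member names invented for illustration.

```cpp
#include <cassert>

// Stand-ins for the real managers (illustrative only).
struct Renderer {};
struct AudioManager {};
struct InputManager {};
struct ResourceCache {};

// One struct bundles the long argument list...
struct GameStateContext {
    Renderer*      renderer;
    AudioManager*  audio;
    InputManager*  input;
    ResourceCache* resources;
};

// ...so every game state takes a single argument instead of 8-9.
class GameState {
public:
    explicit GameState(const GameStateContext& ctx) : ctx_(ctx) {}
    const GameStateContext& Context() const { return ctx_; }
private:
    GameStateContext ctx_;  // a cheap copy: just a handful of pointers
};
```

Adding a new manager later then means touching the context struct and its construction site, not every game-state constructor signature.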

Ok, so a more generic version of the question: what is the best way to share data between different classes that are completely unrelated?

If the classes share data, they are actually related.

Generally, the proper way to share data is to figure out which object should be the "owner" (maybe a third class, perhaps created just for this purpose) and arrange for the dependent classes or methods to have a reference to the appropriate object, possibly with the help of other classes.

For example: a wargame or the like, with many little Units which move independently and need information about other Units in the current game state when they choose an action, can have a sophisticated Map data structure which owns all Unit instances, offers queries to get sets of Units by location, type, status etc., and gets passed to the "ai update" method of each Unit instance.
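That wargame example might be sketched like so; all the class and method names here are illustrative, not from the thread.

```cpp
#include <cassert>
#include <memory>
#include <vector>

class Map;

class Unit {
public:
    Unit(int x, int y) : x_(x), y_(y) {}
    int X() const { return x_; }
    int Y() const { return y_; }
    void AiUpdate(const Map& map);  // the owner is passed in, not global
private:
    int x_, y_;
};

// The Map owns every Unit and offers queries over them.
class Map {
public:
    Unit& Spawn(int x, int y) {
        units_.push_back(std::make_unique<Unit>(x, y));
        return *units_.back();
    }
    std::vector<Unit*> UnitsAt(int x, int y) const {
        std::vector<Unit*> found;
        for (const auto& u : units_)
            if (u->X() == x && u->Y() == y) found.push_back(u.get());
        return found;
    }
    void UpdateAll() {
        for (const auto& u : units_) u->AiUpdate(*this);
    }
private:
    std::vector<std::unique_ptr<Unit>> units_;  // Map owns all Units
};

void Unit::AiUpdate(const Map& map) {
    // a real unit would call map.UnitsAt(...) etc. to choose an action
    (void)map;
}
```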

Especially for my game states: since every game state might create another game state that needs some other of those "managers", I basically have to pass and store them into each one of them. Are there any good strategies to handle this sort of (growing) complexity? I'm talking generally about situations where I have to pass many references around that are not directly related, but could theoretically be grouped...

What Hodgman said is a good strategy if you have that many references. Personally, I try to look at the rough "tree" structure of my dependency passing and try to "bundle" references in higher constructs. E.g. if I have some general manager classes like rendering, I declare them as member variables of my top-level engine class that maintains the game loop, then pass them down to the game states in a big bunch.

The game states are kind of the "higher level" constructs, so if a smaller thing like an NPC needs access to only a few of the managers (maybe rendering, audio or something), then it has an easy location to grab them from. I try to make use of owning objects to hold references, i.e. a map can hold a lot of references, and then when units are created it may only pass a few references to them. Usually as you move farther down the tree you'll find yourself passing fewer dependencies, so the context structure might not be as needed.

It seems kind of messy at first, but I try to keep the thought process behind it at "if something that belongs to a larger class needs a dependency, then surely that parent class would want the references to give to its 'children.'" So passing a bunch of things like rendering and sound to a game state seems more sensible in that context.

The 'context pattern' is often used to address this. If all of your game states require that long list of arguments, then you can put them into a context structure:

Ah yeah, that seems to be just what I'm looking for, sweet. So simple, and it still makes the code much simpler. At least for the top-level game states, I definitely need to pass all those around. Would it be bad if I passed those context structs on to objects that don't need all of those classes, for example only the entity and message, but not the system manager? I don't see any unnecessary dependency created through that, and since I'm using forward declarations, compile times shouldn't suffer from it too much either. But I'd gladly hear a second opinion; what do you say?

What Hodgman said is a good strategy if you have that many references. Personally, I try to look at the rough "tree" structure of my dependency passing and try to "bundle" references in higher constructs. E.g. if I have some general manager classes like rendering, I declare them as member variables of my top-level engine class that maintains the game loop, then pass them down to the game states in a big bunch.

That sounds quite familiar; I also store all those managers and resource modules in the top-level engine class. But still, passing directly is getting quite obnoxious if I have to pass 8-9 different references around, which is hardly avoidable, since everything bundles in my game state classes. I'm using the MVC pattern to handle things from there on, but the game state initializes those other classes, so it needs to have everything that any entity system, high-level graphics handler etc. might need.

The game states are kind of the "higher level" constructs, so if a smaller thing like an NPC needs access to only a few of the managers (maybe rendering, audio or something), then it has an easy location to grab them from. I try to make use of owning objects to hold references, i.e. a map can hold a lot of references, and then when units are created it may only pass a few references to them. Usually as you move farther down the tree you'll find yourself passing fewer dependencies, so the context structure might not be as needed.

Yeah, that's for sure: the further down I go, the fewer things I have to pass around. Still, seeing things as in the "context structure", I notice a lot of classes that might be grouped together because they really are passed on together a whole lot. If a class needs only one of them, I can still pass only the one it needs, though.

It seems kind of messy at first, but I try to keep the thought process behind it at "if something that belongs to a larger class needs a dependency, then surely that parent class would want the references to give to its 'children.'" So passing a bunch of things like rendering and sound to a game state seems more sensible in that context.

Didn't even have audio, physics, AI etc. implemented yet and already I'm passing a train wreck of classes to the game states :/ You are right that context structures aren't totally needed, but still, thinking about extending the game/engine and probably starting a new game from the same code base after I've finished this, I feel like giving context structures a try. I can revert it easily anyway, if for some reason it becomes unnecessary or adds some sort of complexity that outweighs the hundred references I'd have to pass anyway^^

1. The end user of your game does not give a damn about how your engine classes are structured. Really.

2. If you are the only coder on the project, do not over-engineer for the sake of some higher principle / book / pattern / <insert random bullsh*t reason>. Refactor, so that you, as a coder, have to spend minimum (preferably zero) time on that class you wrote 18 months ago.

2. If you are the only coder on the project, do not over-engineer for the sake of some higher principle / book / pattern / <insert random bullsh*t reason>. Refactor, so that you, as a coder, have to spend minimum (preferably zero) time on that class you wrote 18 months ago.

There's a reason that good programmers advocate the use of 'best practices': worse practices cause actual, real-life problems. KISS and YAGNI matter, of course, but "just slap it together" cannot be universally employed to produce quality software. Refactoring is important, but cannot be your only route to good design.

There needs to be a balance, and doing some simple things like not using singletons in your design will make the code significantly better and save you time - regardless of how many developers are on the project.

There's a reason that good programmers advocate the use of 'best practices': worse practices cause actual, real-life problems. KISS and YAGNI matter, of course, but "just slap it together" cannot be universally employed to produce quality software. Refactoring is important, but cannot be your only route to good design.

There needs to be a balance, and doing some simple things like not using singletons in your design will make the code significantly better and save you time - regardless of how many developers are on the project.

Second that! Up until a couple of months ago, I was used to writing code that fulfilled one purpose: to work. The outcome was obvious: most of the time the code didn't work, and it most often ended at the point where it was too complicated even for myself to work with, even without a break of a few months. The only means of refactoring would have been to rewrite everything from scratch. Since I started actually caring about "good design", and am not even afraid of a little over-engineering, things are working out way, way better for me. It's really better to simplify overly sophisticated code if it turns out too complicated than to improve horribly bad code because it turned out too inflexible and restricting to work with.

2. If you are the only coder on the project, do not over-engineer for the sake of some higher principle / book / pattern / <insert random bullsh*t reason>. Refactor, so that you, as a coder, have to spend minimum (preferably zero) time on that class you wrote 18 months ago.

There's a reason that good programmers advocate the use of 'best practices': worse practices cause actual, real-life problems. KISS and YAGNI matter, of course, but "just slap it together" cannot be universally employed to produce quality software. Refactoring is important, but cannot be your only route to good design.

There needs to be a balance, and doing some simple things like not using singletons in your design will make the code significantly better and save you time - regardless of how many developers are on the project.

Yes, best practices become best practices for a reason. I don't think that's the point VladR was trying to make though. I think his point was that you should use the best practices that are appropriate for your code and that are appropriate for solving the problem you're trying to solve. If you're going through a design patterns book and thinking "OH GOD I'm not using the Abstract Functional Delegator Aggregate Factory pattern anywhere in my code, I must find a place to use it" you're doing it wrong.

Yes, best practices become best practices for a reason. I don't think that's the point VladR was trying to make though. I think his point was that you should use the best practices that are appropriate for your code and that are appropriate for solving the problem you're trying to solve. If you're going through a design patterns book and thinking "OH GOD I'm not using the Abstract Functional Delegator Aggregate Factory pattern anywhere in my code, I must find a place to use it" you're doing it wrong.

I would agree with this pretty much.

To me the trick is to balance trying simple improvements and new ideas against getting your finished product done. Especially while learning (e.g. making a bunch of games or something to build skills), it's important to set yourself a goal you think you can manage, and think up a quick way to get the parts together. But once you start working on it and see areas where you can bundle code together or use a design pattern to help you, it's worth trying it and reading up on the subject.

Personally I would say I almost always learn the most by trying something and failing, then refactoring it until it works. That gives you insight on both ways to tackle a problem and the pros and cons of the solutions you try.

+1 for refactoring. It is a little disheartening to know you are writing code you will probably refactor soon, but it's impossible to get it right on the first attempt. It gets your brain working, though, and that's important; I do it all the time in my hobby projects. It is also mentally pleasing to clean up old code. There's a good book called "The Pragmatic Programmer" (http://www.amazon.com/Pragmatic-Programmer-Journeyman-Master/dp/020161622X) that I highly recommend.

It's a little bit harder to convince people at work, especially non-programmers, of the value of spending some time going back to refactor old code instead of adding new features, though...