Video games are ultimately a visual/auditory experience -- you are kidding yourself if you think graphics don't matter. Just like in film, game developers strive to create an immersive experience. Take a look at Desperate Gods, which utilizes advanced graphics to bring your standard board game genre to a new level of immersion.

I do not want to stir up the Java2D discussion again, and I think this point can well be discussed independently. I have been through several such discussions in the past and don't really want to rehash them all. But I want to present a few points which might be interesting:

From books through comics to movies, there is an increasing level of graphical and auditory presentation. But that level is not directly linked to the immersion the audience experiences.

Books are text only, have no sound, and include only a few static images, if any. Still, books can be a very immersive experience. I think this is a good example that immersion does not depend on the realism or even flashiness of the presentation. Immersion happens in the head of the audience.

Roguelike games use a very abstract, even non-symbolic representation of their game worlds, but many players experience them as very immersive. Moreover, players say that the freedom this representation gives their imagination is a good thing, because they can imagine the game the way they like it best.

Comics take away some of that freedom. The pictures limit the imagination a bit, but a lot of freedom remains, since the pictures are static and much of the action happens between the panels, unshown and purely in the mind of the reader.

Movies take away another freedom: imagining the action. Some movies show the action in great detail, some don't, but the tendency is there.

Finally, all of these can be immersive experiences. Going back to the initial claim that graphics and auditory presentation matter for immersion, I want to express my doubts, particularly when considering the books example. I believe that immersion happens in the head of the audience and does not directly depend on the graphical or auditory level, or even the realism, of the presentation.

Immersion is pretty subjective, and it isn't as simple as "better graphics are more immersive". In fact, many modern games focus so much on the graphics that it actually distracts from the gameplay - for example, cutting to an eye-bleedingly HD cutscene in the middle of the action.

Some of the most immersive games have pretty simple graphics - Super Hexagon comes to mind, but I'm sure we can each think of a few dozen examples if we try. Part of that immersion is imagining the world for yourself, but another part of it is not having eye-catching distractions that take away from the actual gameplay.

Another thing that comes to mind is board games - games like D&D and Settlers of Catan, for example, are extremely immersive. Contrary to what the OP said, they do not require flashy graphics to be immersive.

I think that the original poster of that quote is showing a pretty narrow viewpoint and a lack of imagination by claiming that good graphics are a prerequisite for immersion. I guess if your only exposure to video games is Call of Duty, that's an understandable misconception, but the basic truth is that games are a much more complicated art form than how well one can render a bullet hole.

The first and foremost point is to not break immersion by having an inconsistent art style.

So whatever the chosen art style is (pixel, abstract vector, comic, high-poly Crytek-style models), it should have a consistent style and quality across the whole game (in-game, screens and GUI). It should follow a noticeable stylistic guideline, and not look wildly assembled from different sources and quality levels.

So elements should not look obviously out of place.

For example, even a simple art style (let's take that roguelike) can look consistent and fitting. But don't drop in a high-res rendering of an orc next to an EGA pixel-style lootbox (else it looks like a Monty Python collage).

It's good if one artist, or an art team with a common style, works on the game - even if it's just the developer doing the art.

So define a style first. Whatever tech is used to render it is secondary and depends on the art style. That's why Java2D can work well for low-res and pixely games.
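As a small illustration of that point (my own sketch, not from the thread - the class and method names are hypothetical): with Java2D you can keep a pixel style consistent by rendering everything at a low resolution and scaling the whole frame up with nearest-neighbour interpolation, so every element shares the same chunky pixels instead of mixing sharp and blurred art.

```java
import java.awt.Graphics2D;
import java.awt.RenderingHints;
import java.awt.image.BufferedImage;

public class PixelScale {
    // Scales a low-res frame up by an integer factor, keeping hard pixel
    // edges so the whole screen shares the same chunky style.
    static BufferedImage scaleUp(BufferedImage lowRes, int factor) {
        BufferedImage out = new BufferedImage(
                lowRes.getWidth() * factor, lowRes.getHeight() * factor,
                BufferedImage.TYPE_INT_ARGB);
        Graphics2D g = out.createGraphics();
        // Nearest-neighbour keeps pixels sharp; the default bilinear
        // filtering would blur them and break the consistent pixel look.
        g.setRenderingHint(RenderingHints.KEY_INTERPOLATION,
                RenderingHints.VALUE_INTERPOLATION_NEAREST_NEIGHBOR);
        g.drawImage(lowRes, 0, 0, out.getWidth(), out.getHeight(), null);
        g.dispose();
        return out;
    }

    public static void main(String[] args) {
        BufferedImage frame = new BufferedImage(4, 4, BufferedImage.TYPE_INT_ARGB);
        frame.setRGB(0, 0, 0xFFFF0000); // one red pixel in the low-res frame
        BufferedImage big = scaleUp(frame, 8);
        // The red pixel now covers a hard-edged 8x8 block in the scaled frame.
        System.out.println(Integer.toHexString(big.getRGB(3, 3)));
    }
}
```

The same idea works for a whole game: draw the scene into one small off-screen BufferedImage, then blit it to the window in a single scaled drawImage call.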

For me, 90% of the selling point of a game is the story it tells. I don't care about graphics; sure, it's all fine to have something pretty to look at, but if the story is shit, then the rest of the game is shit, no matter how fancy the graphics are.

The last 10% is spread out between the game's interface (how easy it is to navigate menus, move units/characters, etc.), sound, and graphics - where graphics covers about 1% of it for me.

Of course, in games with no story to tell (games I usually don't play), graphics become one of the only selling points, so if you just want to make games with no story, then I suggest you brush up on your artistic skills.

But that's just my 2 cents.

EDIT: Oh, and yes, keep the art style the same throughout the game, as Damocles said.

In addition to consistent art style, consistent sound is also very important and can play a major role in increasing immersion.

Also, any technical hiccup can easily break the immersion: a laggy or stuttering frame rate, long loading screens (especially between levels), glitchy AI behaviour - basically anything that distracts the player from the main game experience, becomes a hurdle, or actively reminds them that they are using a computer program.
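On the stuttering-frame-rate point specifically: a standard technique (not from this thread, just a common fix) is to decouple game logic from rendering with a fixed-timestep loop, so uneven frame times never make the simulation jerk. A minimal sketch, with frame times simulated rather than measured from a real clock:

```java
public class FixedStepLoop {
    // Fixed-timestep update loop: game logic always advances in equal
    // 10 ms steps, no matter how uneven the rendered frame times are.
    static final long STEP_MS = 10;   // 100 logic updates per second
    static long accumulatorMs = 0;
    static int updates = 0;

    // Called once per rendered frame with the time that frame took.
    static void frame(long frameTimeMs) {
        accumulatorMs += frameTimeMs;
        while (accumulatorMs >= STEP_MS) {
            updates++;                // advance game state by one fixed step
            accumulatorMs -= STEP_MS; // keep the leftover for the next frame
        }
        // render() would go here, interpolating between the last two states
    }

    public static void main(String[] args) {
        // Simulate one second of wildly uneven frame times (in ms)...
        long[] frames = {16, 50, 2, 400, 32, 500};
        for (long dt : frames) frame(dt);
        // ...yet the logic advanced at a steady rate: 1000 ms / 10 ms steps.
        System.out.println("updates: " + updates);
    }
}
```

In a real game the frame times would come from System.nanoTime() deltas, but the accumulator logic is the same.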

Personally, I also find fullscreen mode much more immersive than windowed mode.

I also don't think books are a proper comparison; films are a bit closer to the mark. With a book, you are the one fabricating the landscapes, characters, visual effects, action scenes, etc. You become immersed through your own imagination. With a film or game, the imagery and sound are laid out in front of you, and there is far less room for imagination. A landscape is as it appears on the screen.

A game doesn't need fancy graphics to be immersive, but if the graphics or art style doesn't stand up to the player's expectations, then the immersion is broken and the effect is lost.

For example: Among the Dead. Creepy as hell, looks totally immersive. Now say the same game was made in the 90s on the N64, albeit with poorer resolution, lighting, animations, sounds etc. It would have been just as scary in the 90s, but if today's audience were to play it, it would feel "flat" because it doesn't meet their audio/visual expectations.

The same has happened in film. Visual effects have advanced, and so too has the audience's expectation. I recently tried watching The Mummy, which was great and terrifying upon its release, but now the VFX look cheesy and laughable.

I think the problem with your logic is that it's not an either/or thing. Can great graphics add to immersion? Sure. But can "bad" graphics lead to an equal amount of immersion? Absolutely. Can "bad" graphics take away from immersion? Sometimes. But so can "good" graphics by being a distraction.

Sticking with your movie example, think of the latest and greatest CGI-fest. Can it be immersive? Sure. But many filmmakers sacrifice the story in exchange for eye candy, which actually decreases immersion. Plenty of old black and white films are more immersive than stuff being put out today, and plenty of movies or shows with terrible special effects (the Twilight Zone, or the original Star Trek) are more immersive than the latest and greatest CGI.

In the end, it's completely subjective. Maybe you need CGI to immerse yourself in a story, but plenty of others do not. So you can't really say that good graphics are a prerequisite for immersion. Maybe they are for you, but plenty of other people become immersed through using their imaginations instead of being a passive viewer.

O for a Muse of fire, that would ascend
The brightest heaven of invention,
A kingdom for a stage, princes to act
And monarchs to behold the swelling scene!
Then should the warlike Harry, like himself,
Assume the port of Mars; and at his heels,
Leash'd in like hounds, should famine, sword and fire
Crouch for employment. But pardon, and gentles all,
The flat unraised spirits that have dared
On this unworthy scaffold to bring forth
So great an object: can this cockpit hold
The vasty fields of France? or may we cram
Within this wooden O the very casques
That did affright the air at Agincourt?
O, pardon! since a crooked figure may
Attest in little place a million;
And let us, ciphers to this great accompt,
On your imaginary forces work.
Suppose within the girdle of these walls
Are now confined two mighty monarchies,
Whose high upreared and abutting fronts
The perilous narrow ocean parts asunder:
Piece out our imperfections with your thoughts;
Into a thousand parts divide one man,
And make imaginary puissance;
Think when we talk of horses, that you see them
Printing their proud hoofs i' the receiving earth;
For 'tis your thoughts that now must deck our kings,
Carry them here and there; jumping o'er times,
Turning the accomplishment of many years
Into an hour-glass: for the which supply,
Admit me Chorus to this history;
Who prologue-like your humble patience pray,
Gently to hear, kindly to judge, our play.

Shakespeare may have apologized for not transporting you to France to watch the Battle of Agincourt firsthand, but he put as much craft as he could into the staging, even with the limited technology of the time.

A game doesn't need fancy graphics to be immersive, but if the graphics or art style doesn't stand up to the player's expectations, then the immersion is broken and the effect is lost.

I once heard a person advise applying this even to AI. He said that it was one thing for a dog or something to have a poor pathfinding system, but if you had a more humanoid character, it was very important that you get it right.

Sticking with your movie example, think of the latest and greatest CGI-fest. Can it be immersive? Sure. But many filmmakers sacrifice the story in exchange for eye candy, which actually decreases immersion. Plenty of old black and white films are more immersive than stuff being put out today, and plenty of movies or shows with terrible special effects (the Twilight Zone, or the original Star Trek) are more immersive than the latest and greatest CGI.

How many old black and white films have you actually watched?

I'm a film major, and we've studied a lot of black & white, silent films, foreign, etc. They are no doubt "good films," but many are painful to sit through and difficult to become immersed in. It's clear that the rest of the class feels the same. And not because the films don't include CGI or explosions, but because they don't meet today's visual/audio expectations.

Quote

In the end, it's completely subjective. Maybe you need CGI to immerse yourself in a story, but plenty of others do not. So you can't really say that good graphics are a prerequisite for immersion. Maybe they are for you, but plenty of other people become immersed through using their imaginations instead of being a passive viewer.

I have no doubt that many people can still play old games like Pac-Man or Pong and become immersed, and that a lot of immersion has to do with the individual. I, as an artist, am obviously somebody who places more emphasis on visuals and audio.

But generally speaking, audiences of the 21st century have higher expectations for sound and graphics. For example: anti-aliasing is something that nobody noticed in the 90s, but now it's a big deal.

Imagine if Limbo's polygons were aliased. It might not "make or break" the game, but it certainly would have the potential to distract you and break the level of immersion.
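To make the anti-aliasing point concrete (a sketch of mine, not from the thread): in Java2D, anti-aliasing is toggled per Graphics2D with a rendering hint, and the difference is measurable in the pixels it produces - an aliased black line on white uses only two colors, while the anti-aliased version adds intermediate grays along the edge.

```java
import java.awt.Color;
import java.awt.Graphics2D;
import java.awt.RenderingHints;
import java.awt.image.BufferedImage;
import java.util.HashSet;
import java.util.Set;

public class AliasDemo {
    // Draws the same diagonal line with and without anti-aliasing and
    // counts the distinct pixel colors produced.
    static int distinctColors(boolean antialias) {
        BufferedImage img = new BufferedImage(64, 64, BufferedImage.TYPE_INT_RGB);
        Graphics2D g = img.createGraphics();
        g.setColor(Color.WHITE);
        g.fillRect(0, 0, 64, 64);
        g.setRenderingHint(RenderingHints.KEY_ANTIALIASING,
                antialias ? RenderingHints.VALUE_ANTIALIAS_ON
                          : RenderingHints.VALUE_ANTIALIAS_OFF);
        g.setColor(Color.BLACK);
        g.drawLine(0, 0, 63, 40); // a shallow diagonal shows jaggies clearly
        g.dispose();
        Set<Integer> colors = new HashSet<>();
        for (int y = 0; y < 64; y++)
            for (int x = 0; x < 64; x++)
                colors.add(img.getRGB(x, y));
        return colors.size();
    }

    public static void main(String[] args) {
        // Aliased: only black and white. Anti-aliased: gray edge pixels too.
        System.out.println("aliased colors: " + distinctColors(false));
        System.out.println("antialiased > 2: " + (distinctColors(true) > 2));
    }
}
```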

I don't have an exact number for you (and comparing our film street cred over an internet forum is a little too nerdy, even for me), but more than a few. I'm a huge fan of old sci-fi, which might be a good example- compare the original War of the Worlds or The Day the Earth Stood Still to their modern, CGI-filled counterparts. Which is more immersive? I would argue that the black and white movies, with all their terrible special effects, are much more immersive than the modern films with great special effects. But more to the point, I would argue that it's completely subjective- what I find more immersive, you might not, but that also means that you can't really say that better graphics == more immersion.

I have no doubt that many people can still play old games like Pac-Man or Pong and become immersed, and that a lot of immersion has to do with the individual. I, as an artist, am obviously somebody who places more emphasis on visuals and audio.

Then as an artist, you should know that all of this is subjective. Some people, like you, might prefer Call of Duty realism in their games. Others are more immersed by simple graphics and a focus on gameplay, or a unique albeit unrealistic or uncomplicated visual style. Others might want something completely different. My only point is that you can't say that one equals the other.

But generally speaking, audiences of the 21st century have higher expectations for sound and graphics. For example: anti-aliasing is something that nobody noticed in the 90s, but now it's a big deal.

Imagine if Limbo's polygons were aliased. It might not "make or break" the game, but it certainly would have the potential to distract you and break the level of immersion.

I disagree. Again, all of this is subjective, so you can't really say with any accuracy that everybody prefers a certain thing. Sure, there is mass-market appeal to games like Call of Duty with a focus on great graphics, but there is also a huge market for the other side of the spectrum. Plenty of other people find games like Fez or Retro City Rampage or Super Hexagon more immersive than the latest and greatest AAA title.

In the end, my only point is that all of this is subjective, so you can't make a blanket statement like the one quoted in the OP. Plenty of people prefer simpler graphics and find that more immersive, and as per the original discussion, Java 2D is a fine option for that.

Interesting that you consider Fez a "non-graphical" game. It's one of the more visually impressive indie games I've seen in recent years, and its art style is not only beautiful and consistent, but also extremely original. It uses a lot of very technical graphical effects that are at the heart of this whole discussion. All in all, it is a very graphically impressive game (definitely not possible in Java2D, but let's not open that can of worms).

Fez falls into a category of "retro games re-envisioned with modern-day hardware-accelerated graphics." Same with Revenge of the Titans, Titan Attacks and many others.

Sorta makes your "graphics don't matter" argument moot.

Anyways, sure, it's all subjective, and you'll always find exceptions. I'm sure you can find people who would rather play the original Snake than some modern equivalent.

In the vast majority of cases, though, you should not expect today's generation to be happy with aliased geometry, heavily compressed textures, incorrect depth/perspective, poor framerates, or other graphical issues that you could have gotten away with in the 80s and 90s. In other words: graphics are important, and today's audiences have higher graphical expectations.

Interesting that you consider Fez a "non-graphical" game. It's one of the more visually impressive indie games I've seen in recent years, and its art style is not only beautiful and consistent, but also extremely original. It uses a lot of very technical graphical effects that are at the heart of this whole discussion. All in all, it is a very graphically impressive game (definitely not possible in Java2D, but let's not open that can of worms).

Fez falls into a category of "retro games re-envisioned with modern-day hardware-accelerated graphics." Same with Revenge of the Titans, Titan Attacks and many others.

When did I say Fez was non-graphical? All I said was that it wasn't like Call of Duty. My point is that it's a spectrum: you've got hyper-realism on one end, and bad/no graphics at the other end. Saying that your whole point is "graphics matter" is a bit like saying you need a video game system to play video games - um, duh, but which system is best (or more on-point, where in the spectrum of graphics is best for immersion) is completely subjective and therefore not really something you can argue like you seem to be.

When did I say they don't matter? I said it's subjective, which means of course they matter- to some people. Other people prefer games with "worse" graphics, or different/weird graphics, or something else. Your point seems to be arguing for a particular spot on the graphical spectrum as "the best" and therefore the only graphical goal worth pursuing, which isn't correct.

Anyways, sure, it's all subjective, and you'll always find exceptions. I'm sure you can find people who would rather play the original Snake than some modern equivalent.

Or the original Mario or Sonic, or Zelda, or whatever. Those games clearly still have an appeal, and they can certainly be done in Java2D. They don't have cutting-edge graphics or sound, but people still play them all the time - actually, Shining Force 2 for Sega Genesis is paused on my TV right now. Indie gaming has capitalized on that interest in 2D/retro games in a huge way, so saying that it's only worthwhile to make a game with the latest and greatest graphics is clearly wrong, not to mention that it misses out on one of the great things about Java: it's relatively easy for a newbie to get a basic 2D game up and running. I think discouraging people from trying that is a bad idea.

In the vast majority of cases, though, you should not expect today's generation to be happy with aliased geometry, heavily compressed textures, incorrect depth/perspective, poor framerates, or other graphical issues that you could have gotten away with in the 80s and 90s. In other words: graphics are important, and today's audiences have higher graphical expectations.

Sure, buggy graphics (lagging, flickering, etc) almost always detract from immersion, but that's a different issue. Saying you shouldn't have buggy graphics is like saying that your game shouldn't crash every 5 minutes. That's true, but that's a pretty meaningless argument due to its obviousness. A post helping newbies avoid common problems would be useful, but if all you mean by "graphics are important!" is that buggy graphics are bad, I'm not really sure how useful that is.

Graphics might be important in that regard, but it's misleading to follow that statement up with examples of the latest and greatest graphics. "Good" graphics might be important, but there are a million different definitions of what "good" graphics for immersion are. For you, that might be bleeding-edge 3D graphics. Many others have no interest in that- take a look at the "casual gaming" surge for a million examples.

To be fair, I'm only going from the out-of-context quote in the OP, so maybe "buggy graphics are bad" was indeed your entire argument, in which case I don't disagree but I think that's a different thing than saying "good graphics are important", which is what I'm arguing against. You can have "good" graphics that look like colored squares or just a few lines, same way you can have "bad" graphics that are the most realistic thing you've ever seen. I guess it depends on what you mean by "good" graphics. I'm assuming you're talking about Call of Duty good, not "non-buggy" good, which is where I disagree with you. But I'm not really sure which one you mean at this point.

I've never mentioned Call of Duty, nor do I play those games. The term "graphics" covers many areas, among them: art style, visual consistency, texture resolution, anti-aliasing, and overall performance (i.e. frame rate). When I talk about the value of "good graphics" I am not talking about the importance of "replicating Call of Duty." I am also not saying "Every single person in the world expects high visual quality," as that would be a sweeping generalization.

I'm not even sure what you're arguing anymore, although you seem intent on steering the conversation back to Java2D.

The fact is: The majority of games that most developers will aspire to create demand a higher degree of performance and reliability than Java2D provides. Hell, even something like Retro City Rampage would probably not perform well with Java2D. Nor would it be distributable to WiiWare, XBOX Live Arcade, and other niche marketplaces that specifically target retro gamers.

Aside from performance, there are dozens of other reasons why Java2D is a poor choice of library to start off with. If you want to keep defending Java2D as a game development platform you should post it in the other thread.

Immersion also exists in other settings, such as movies and books. It's not the platform that matters, it's how the user sinks into the story and tunes out everything else. It thus has nothing to do with graphical presentation.

Hence, graphics are not the same as immersion. Games with simple graphics can be more immersive than games with realistic graphics; it's what captivates the user's senses that matters.

Of course it helps if the game environment is graphically realistic. But if you're setting out to create an immersive game by focusing on graphics, you're doing it wrong.

Even though I've seen improvement in the Java2D libraries over time, from 1.4 (when I started) to now, it will not be able to outperform OpenGL, which has a pure C implementation. That has nothing to do with immersion, or with what games people should or should not be designing. In my opinion, JavaScript and HTML have a lot more potential when combined with the OpenGL pipeline than libGDX does, because you don't have to jump through as many hoops to get iOS support. Will that change in the future? Probably. But it is just presumptuous to promote one technology as better for game development than all others.

[/offtopic]

As for immersion, I agree with this thread 100%. Graphics =/= immersion.

im·merse (-mûrs)
tr.v. im·mersed, im·mers·ing, im·mers·es
1. To cover completely in a liquid; submerge.
2. To baptize by submerging in water.
3. To engage wholly or deeply; absorb: scholars who immerse themselves in their subjects.

Graphics are a part of immersion, they aren't the deciding factor. Everything has to flow and be transparent in order for immersion to take place. It is a condition that requires your full focus and attention. Even something as simple as a noise going on outside can distract from it.

Games become easier to immerse yourself into the more senses they ignite at the same time. That is why theaters are very effective, they make sure they fully get you relaxed and ready to be immersed with sight, sound, and visual spectacle. It is also why many people believe theater experiences are a lot more effective than at home. (Hell, even I'd be interested if they made a public theater style gaming lounge for video games.) We just like being away from reality.

So, what does this have to do with gaming?

It simply means that you can create an immersive experience regardless of the platform.

I don't know how many of you have heard of Passage, but this game remains one of the deepest experiences I've had with a game. I wasn't expecting much, as 8-bit graphics aren't very impressive to look at in today's age. But the game still managed to keep me drawn in all the way to the end.

Graphics are a selling point - they are like the movie poster for video games. Games with great graphics usually sell strongly early on and drop off over time. (Sounds like most games today, actually.) There is a completely different selling point that many game designers miss, and that is word-of-mouth. Word takes a while to build steam, but it keeps sales going far into the future.

Sad part is, graphics are a one-night stand, while word-of-mouth is a long-lasting relationship. People still need the same basics as they did when we were using ASCII graphics to create games.

1) Decent visuals: so you can tell things apart in the world you are in.
2) Transparent controls: menu systems and control options that are not clunky and don't get in the way of gameplay.
3) Good logic: gameplay that allows players to grow, gives a good challenge, and has good replay value.

Games that have these qualities are usually able to make money for longer periods of time. It is the success of these older games that allows names like Pac-Man, Sonic, Mario, and others to last into today's demographic.

I recently played Borderlands (1) and FarCry 2, which I really enjoyed. Borderlands for its game setting and constant parodies - it was fitting in THIS game. And FarCry 3 for its open-world setting and being relatively realistic (at least the first 50% of the game was very good; the second half / island seemed very rushed together).

Anyhow. So I got RAGE as a bargain, thinking it would be like a mix of both (wasteland setting / open world).

But in the first two minutes of the game I stepped outside, and there was some dude with a buggy asking me to hop in. (Else you get shot dead instantly for no good reason from somewhere, as a level boundary.) Until then the game had quite a good post-apocalyptic mood.

But then: there was a HALO bobblehead on the dashboard. A total immersion breaker! Maybe the devs thought it was funny or something, but it totally pulled me out of the immersion, putting such a direct pop reference in front of the player. In Borderlands it would not seem out of place, as that game parodies itself constantly, but RAGE pretends to have a more "serious" setting. There was no realistic explanation for this item being there; it was an obvious joke element. So gone was the feeling of arriving in some "realistic" future, replaced by something that screams: "Hey, I'm a game, and the developer wanted to put something funny in, ha ha."

-> Don't put in out-of-place easter eggs just for a funny moment. This can break immersion a lot more than you think. A slapstick joke in a drama does not seem funny, just distracting.

Immersion happens when the player can experience the world and explain its setting in its own terms. Introducing something obviously unfitting is like a tumor in the design. Don't slap the "I'm just a game" into the player's face.

In the vast majority of cases, though, you should not expect today's generation to be happy with aliased geometry, heavily compressed textures, incorrect depth/perspective, poor framerates, or other graphical issues that you could have gotten away with in the 80s and 90s. In other words: graphics are important, and today's audiences have higher graphical expectations.

Sure, buggy graphics (lagging, flickering, etc) almost always detract from immersion, but that's a different issue. Saying you shouldn't have buggy graphics is like saying that your game shouldn't crash every 5 minutes. That's true, but that's a pretty meaningless argument due to its obviousness. A post helping newbies avoid common problems would be useful, but if all you mean by "graphics are important!" is that buggy graphics are bad, I'm not really sure how useful that is.

Those are all performance arguments, and you respond by talking about bugged graphics. That's just a red herring.

Those are all performance arguments, and you respond by talking about bugged graphics. That's just a red herring.

I originally responded to the quote in the OP about using "advanced graphics" to bring a "new level" of immersion. That seems incorrect to me, which I think has been sufficiently demonstrated in this thread.

Then the person quoted switched the meaning of "good graphics" to be non-buggy graphics, which is different from "advanced graphics". I gave him the benefit of the doubt and tried responding to both points, but I think the actual conversation got muddled in the process.

The central point is this: good graphics can help immersion, but what "good" means is completely different for every game. Some games will be best coupled with "advanced graphics", but plenty of other games will be Java2D-style graphics. Either one can be immersive, so arguing that "advanced graphics" bring a "new level" of immersion seems incorrect. If, however, you simply mean that the graphics should be "good" in that they fit the mood of the game and aren't buggy, then okay.

I don't want to bicker or get off topic more than we already have, but the conclusion of the above paragraph is that Java2D is a perfectly fine option for plenty of games. Plus it's what most newbies are taught starting out, and it works out of the box, and there are a ton of tutorials for it, which are huge bonuses. So I think telling people not to use it because it won't give you "advanced graphics" is misleading at best, and outright harmful to developers that might find great joy in Java2D at worst. Is libGDX more powerful? Is openGL faster? Absolutely. But Java2D is fine for most games that novice or even intermediate programmers want to make, plus it's easier to learn (or at least teach). Java2D can definitely be immersive, just like libGDX games can be, movies can be, Call of Duty can be, and board games can be.

I didn't want to get involved in the nonsense Java2D thread because it's just going to consist of "Java is too slow for games!" trolls that we've seen too many times before. I love JGO because it's usually above the bickering-over-the-internet frustration that most forums have, so I'll just say this: good graphics != advanced graphics, what graphics are "good" or immersive is entirely subjective, so saying that advanced graphics take games to a new level of immersion is just plain incorrect. That's my only point.

I never "switched my meaning" of graphics -- my point has always been the same since the first reply in this thread:

Quote

I agree with kappa - hiccups and windowed mode break immersion.

...

A game doesn't need fancy graphics to be immersive, but if the graphics or art style doesn't stand up to the player's expectations, then the immersion is broken and the effect is lost.

For example: Among the Dead. Creepy as hell, looks totally immersive. Now say the same game was made in the 90s on the N64, albeit with poorer resolution, lighting, animations, sounds etc. It would have been just as scary in the 90s, but if today's audience were to play it, it would feel "flat" because it doesn't meet their audio/visual expectations.

The same has happened in film. Visual effects have advanced, and so too has the audience's expectation. I recently tried watching The Mummy, which was great and terrifying upon its release, but now the VFX look cheesy and laughable.

Heavy texture compression, poor depth (like in Duke Nukem 3D; not a bug, but a rendering trick), aliasing (e.g. in geometry or shadows), and 30 FPS are not bugs or glitches, and they were all commonplace if you were alive in the 1990s. Today, though, most users have higher graphical expectations than they used to, which is why you can't get away with these graphical issues without the risk of "losing immersion."

Quote

I don't want to bicker or get off topic more than we already have, but the conclusion of the above paragraph is that Java2D is a perfectly fine option for plenty of games.

I never "switched my meaning" of graphics -- my point has always been the same since the first reply in this thread:

What I'm saying is that the statement I was originally referring to, the one quoted in the OP, was apparently arguing that "advanced graphics" lead to the "next level" of immersion. That's the part I disagree with, since plenty of games with simple graphics are more immersive than games with the latest and greatest.

You then started talking about compressed textures and poor framerates, which I think is a different point than saying advanced graphics == immersion.

The source of my confusion is that the statement I was responding to was this:

A game doesn't need fancy graphics to be immersive, but if the graphics or art style doesn't stand up to the player's expectations, then the immersion is broken and the effect is lost.

...which I think is a bit of a contradiction. All I was saying was that "advanced graphics" are not a prerequisite for immersion, which you apparently agree with, despite the quote in the OP, which is the point of this thread. I guess by "advanced graphics" you just mean graphics that fit well with the game, but as with the Pretentious Ludum Dare game, that can be as simple as squares and lines.

Probably a good idea. I can agree that things like libGDX can be extremely helpful to people who know what they're doing (I'm currently learning it myself), but Java2D is perfectly fine for many games and is more accessible to newbies, which is why it's a great option by itself.

Computer graphics have nothing to do with immersion. Immersion can take place even while playing textual games, by reading a book, listening to someone telling a story over a campfire or listening to a radio theater show (War of the Worlds).

The setting does not matter; it's just a setting. It's as if we were debating whether it mattered for radio theater immersion whether the show was in stereo, mono, or 5.1 quality. Sure, more quality is better and may help, but immersion is based on a wholly different thing.

Other factors are more important, like whether the user is allowed to focus on his immersion. Reading a book while there's a lot of noise around you can disturb you, just like having people around wanting to talk to you; other factors may help, like being wrapped in a warm blanket, drinking hot cocoa.

I read your reply, appel, and I agree with you. You don't need advanced graphics for immersion. The simplest graphics, or even no graphics at all, can be just as immersive as the latest CGI or realistic graphics. It's completely subjective.

Computer graphics have nothing to do with immersion. Immersion can take place even while playing textual games, by reading a book, listening to someone telling a story over a campfire or listening to a radio theater show (War of the Worlds).

This.

Many people find Dwarf Fortress extremely immersive, even though it has arguably got the worst graphics in the history of gaming. Immersion is a complex psychological phenomenon, which defies simple explanations. I think that is precisely why making a really good game is a real challenge, and one that is not a matter of following some simple theory or guideline. It's so complex that it's more like art than science.

Graphics definitely play some role in immersion, but it's way more complicated than more realistic graphics = more immersion. I even read some time ago that making graphics too realistic can actually break immersion. That has to do with suspension of disbelief. It's easier for most people to "suspend their disbelief" when characters are obviously fantasy (think World of Warcraft), not trying to fool you into thinking it's a real person. Characters that are very close to realistic, but not quite, sometimes seem to make people extra critical.
