Sure there is... There are better more elegant languages that let you think and express your program's logic at a higher level. I used to be a C++ guru, btw.

There is HUGE buzz behind Scala, and legitimate reasons for it. Companies like Twitter and Tumblr did major migrations to Scala for very good reasons.

This attitude of "I will cling to whatever legacy tool has maintained popularity" is lame. I understand newbies need somewhere to start. But if you are an independent hobbyist, you should be attempting to blaze new trails, not cling to the legacy ways of the past.

I missed this. You're taking my comment out of context. There's zero reason for big game studios to change from C++. The tech reasons have been covered ad nauseam in many threads, so there's no reason to repeat them. So instead I'll offer two hypotheses:

1) The big players in the multi-billion dollar game industry have only succeeded in hiring idiots that don't know anything about modern compiler & language design.
2) Maybe they do have a clue.

That may be true. "Big game studios" have to make all kinds of miserable sacrifices and compromises that I am sympathetic to but not very interested in.

On this forum, I'm expecting to chat about programming technology with others who are excited about that.

Believe me, I'd rather not talk about C++ at all, but people always want to bring it up. It actually hurts me to defend it in any manner. BUT I think it's very useful for us to be at least partially sane when having a technical discussion.

@Cas: why javascript? We could come up with all kinds of technical reasons, but they'd all be BS. The real reason has got to be because Netscape made a successful browser. Follow the bellwether.

@Roquen - I know why javascript, historically... I'm just wondering why competing clientside tech failed to flourish. VBScript died on the vine, Google are attempting only now to do something called Dart but of course it won't work anywhere outside of Chrome. Browsers would probably have been far better off if they used some sort of bytecode that any language could have been compiled down to instead of the current status quo. Maybe LLVM is the best bet for a rosy and inclusive future.

I'm just wondering why competing clientside tech failed to flourish. VBScript died on the vine, Google are attempting only now to do something called Dart but of course it won't work anywhere outside of Chrome. Browsers would probably have been far better off if they used some sort of bytecode that any language could have been compiled down to instead of the current status quo.

Dart is compiled into JavaScript which runs on all major browsers. No one is deploying raw Dart source. The native Chrome support is a development feature so you can dev/debug in native Dart and compile/deploy with compiled JavaScript.

JavaScript is being used as a byte code. People have been compiling CoffeeScript, ClojureScript, Dart, Java, C#, and now Microsoft TypeScript to JavaScript.

You seem to be implying that JavaScript established its dominance through the idiosyncrasies of the evolution of the web, browsers, and computing devices, and not because it won some pure intellectual best-language competition... This is beyond obvious. Most technical standards are like that. It's hard to rally all the competing players together to push new replacement standards. Look at x86: many have pointed out flaws and many have designed much better ISAs, but still people have found ways to make new chips with cutting-edge designs that satisfy legacy x86 compatibility baggage. Everyone can point out flaws with JavaScript, but many are finding ways to provide cutting-edge dev tools and runtimes around the legacy compatibility issues.

@Cas: LLVM bytecodes wouldn't be a desirable solution as it's too low level IMHO. For instance, type information is already baked in at that point. All bytecode definitions suffer from problems like this. This limits the types of languages and features which can be supported. I favor an abstract syntax tree representation: smaller in size and a higher-level abstraction model. Once the VM is done with any verification and high-level transforms, it's simple to flatten into something like LLVM, which then handles all the hard-core low-level optimizations and spews out native code.
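To make the AST idea concrete, here is a minimal sketch (all names hypothetical) in Java: a tiny expression tree plus a post-order walk that flattens it into stack-machine-style instructions, the kind of lowering a VM could do before handing off to an LLVM-like backend.

```java
// Hypothetical sketch: a tiny expression AST and a post-order "flatten"
// pass that emits stack-machine instructions, standing in for the kind of
// lowering a VM could do before handing off to something like LLVM.
import java.util.ArrayList;
import java.util.List;

public class AstSketch {
    interface Node {}
    static final class Lit implements Node { final int value; Lit(int v) { value = v; } }
    static final class Add implements Node {
        final Node left, right;
        Add(Node l, Node r) { left = l; right = r; }
    }

    // Lower the tree to flat instructions: "PUSH n" and "ADD".
    public static List<String> flatten(Node n) {
        List<String> out = new ArrayList<>();
        emit(n, out);
        return out;
    }
    private static void emit(Node n, List<String> out) {
        if (n instanceof Lit) {
            out.add("PUSH " + ((Lit) n).value);
        } else if (n instanceof Add) {
            emit(((Add) n).left, out);   // post-order: operands first,
            emit(((Add) n).right, out);  // then the operator
            out.add("ADD");
        }
    }

    public static void main(String[] args) {
        // (1 + 2) + 3
        Node tree = new Add(new Add(new Lit(1), new Lit(2)), new Lit(3));
        System.out.println(flatten(tree));
        // prints [PUSH 1, PUSH 2, ADD, PUSH 3, ADD]
    }
}
```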

I don't see what the problem of typed bytecodes is in terms of supporting higher level languages. If you have no types in the bytecodes, then what you really have is a single type. If the bytecode supports an "object" type and other types, then you can easily emulate a unityped VM within that typed one. At runtime, type information is an optimization you needn't concern yourself over.
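A sketch of that emulation, assuming nothing beyond plain Java: the unityped language's values all map to the host's Object type, and operations dispatch on runtime tags.

```java
// Sketch: emulating a "unityped" language on a typed substrate. Every
// dynamic value is just an Object; the runtime inspects tags and dispatches.
public class UniTyped {
    // A dynamic "add" behaving like a unityped language's + operator:
    // numbers add, anything else concatenates, decided at runtime.
    public static Object dynAdd(Object a, Object b) {
        if (a instanceof Integer && b instanceof Integer) {
            return (Integer) a + (Integer) b;
        }
        return String.valueOf(a) + String.valueOf(b);
    }

    public static void main(String[] args) {
        System.out.println(dynAdd(1, 2));        // prints 3
        System.out.println(dynAdd("foo", 42));   // prints foo42
    }
}
```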

An AST is higher-level, but it has no semantics of its own. Let's say my language supports delimited continuations, and I spit out an AST for it. Let's say the underlying execution engine I hand this AST to doesn't have any notion of them. Now how do I implement them efficiently? An AST would certainly make a more sensible compiler target than a crazy hall of funhouse mirrors like javascript source, but it doesn't give you any extra semantics for free (except maybe macros).

All macros suck, they render your code unreadable and make it impossible for others to modify without major headaches.

If your only experience with macros is of the C variety, then I suppose that's a pretty natural conclusion. Otherwise, you might want to educate yourself a little on the sort of things that structural transformation of code can actually do before rendering sweeping judgements on readability and maintainability.

Heck, even cpp macros ... do you find code using NULL harder to read than code with ((void *)0) instead?

Readability. Java is not a functional language; there are no cool syntax tricks that you can use to shorten your method by three lines at the expense of readability. Also, Java coding standards are strictly enforced: if you use method_name instead of methodName, an angry mob of pitchfork-wielding Java devs will hunt you down and destroy your soul.

I wouldn't say that there are no syntax-tricks. What about the enhanced for-loop, or the shorthand-if?
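For reference, both of those in one small illustrative snippet:

```java
public class SyntaxTricks {
    public static int sum(int[] xs) {
        int total = 0;
        for (int x : xs) {        // enhanced for-loop: no index bookkeeping
            total += x;
        }
        return total;
    }

    public static String parity(int n) {
        return (n % 2 == 0) ? "even" : "odd";  // shorthand-if (ternary operator)
    }

    public static void main(String[] args) {
        System.out.println(sum(new int[] {1, 2, 3}));  // prints 6
        System.out.println(parity(7));                 // prints odd
    }
}
```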


Compared to Python, Java's "syntax sugar" is laughable.

Python has a lot of useful things like list comprehensions, the "with" statement, multi-line strings, operators for joining/multiplying lists and objects (and ways to override these), a crapload of common methods like str.splitlines() or CSV parsing/loading, keyword arguments for methods, etc. Once you're familiar with Python or any other functional language, going back to writing in Java feels extremely "old school" and verbose. You find yourself writing double or triple the amount of code, and none of it can be written 'functionally'.
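To make the verbosity gap concrete, here is roughly what Python's `[x * x for x in xs if x % 2 == 0]` becomes in pre-lambda Java (the method name is illustrative):

```java
import java.util.ArrayList;
import java.util.List;

public class Verbose {
    // Java spelling of Python's: [x * x for x in xs if x % 2 == 0]
    public static List<Integer> evenSquares(List<Integer> xs) {
        List<Integer> result = new ArrayList<>();
        for (int x : xs) {
            if (x % 2 == 0) {
                result.add(x * x);
            }
        }
        return result;
    }

    public static void main(String[] args) {
        System.out.println(evenSquares(List.of(1, 2, 3, 4)));  // prints [4, 16]
    }
}
```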

I don't get this new fad of calling all dynamically typed languages functional.

It's because these languages support basic functional concepts like, say, first-class functions. It's hardly the "dynamic" aspect, which actually diminishes it as functional with respect to modern FP techniques that involve sophisticated type systems such as those found in Haskell and Scala. You don't need such a type system to be a functional language -- Scheme and Erlang get by without it -- so at best the type system is orthogonal.
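A concrete illustration of the first-class-functions point: a functional-leaning language passes behavior around as a plain value, while pre-JDK8 Java can only emulate that with a one-method interface and an anonymous class (the names below are made up for the example).

```java
public class FirstClass {
    // A one-method interface standing in for a function value,
    // the pre-JDK8 way of passing behavior around.
    interface IntFn { int apply(int x); }

    // A higher-order method: takes a "function" and applies it twice.
    public static int twice(IntFn f, int x) {
        return f.apply(f.apply(x));
    }

    public static void main(String[] args) {
        IntFn addThree = new IntFn() {           // anonymous class as a closure
            public int apply(int x) { return x + 3; }
        };
        System.out.println(twice(addThree, 10)); // prints 16
    }
}
```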

Python also imported list comprehensions from Erlang, and those are very much a functional construct. Haskell uncovered the building blocks behind list comprehensions (spoiler: it's monads), and C# and Scala followed suit, so now you can write your own "whatever-comprehensions" by just implementing a couple simple functions. Python unfortunately doesn't base its listcomps on the functional principles, but it doesn't mean that the hardwired listcomps/genexps it has are any less functional themselves.

Python's definitely not a functional language by any stretch, but it does at least support it.

... I think we all know that the internet was not supposed to be used as an interface for multi-media in the first place. It was only made for text-based systems, and people decided that they wanted to use it for more powerful media. Obviously when the Internet was first made, it wasn't devised for movies, mass media, or gaming.

HTML and JavaScript are just plain outdated. That is the fact. What all companies, especially Google, are trying to do is keep JavaScript and HTML relevant so all the loads of old web pages will still be readable. JavaScript is definitely not suited for dealing with the amount of media we are trying to achieve nowadays.

If it weren't for Google going to great lengths to keep JavaScript relevant, the Internet would have probably collapsed under the device overload. The applications are getting more and more powerful every day. Something is going to break, and I think JavaScript is running on a thin wire.

If you ask me, the Internet needs a language overhaul. Right now, due to the inclusion of tablets and phones, the programming world has become very messy and unfocused. Even Java's whole notion of "write once, run anywhere" is essentially ruined. It is like companies don't know what technology to get behind. Ha, I don't have any clue what technology to get behind. Too bad time is money...

What makes you think the Internet is the Web, or that even the Web strictly pertains to HTML? So it wasn't "designed" for movies. The phone system wasn't designed to carry data either, nor was the cable TV system. Well-designed systems adapt and evolve without needing to be destroyed first.

Have you read some of the first RFCs? I think you'll find that graphical apps were being considered right from the start.

As Tim Berners-Lee famously said: "The Web will always be a little bit broken, because we are too."

All macros suck, they render your code unreadable and make it impossible for others to modify without major headaches.

If your only experience with macros is of the C variety, then I suppose that's a pretty natural conclusion. Otherwise, you might want to educate yourself a little on the sort of things that structural transformation of code can actually do before rendering sweeping judgements on readability and maintainability.

Heck, even cpp macros ... do you find code using NULL harder to read than code with ((void *)0) instead?

Try C, C++, and ASM; others too, of course, but none of them used macros. I wrote ASM from the 1980s to about 1995. I find the syntax of most languages to be blech, but you can define a constant NULL instead of a macro; global constants are possible, and programmatically sane, in C++.

... I think we all know that the internet was not supposed to be used as an interface for multi-media in the first place. It was only made for text-based systems, and people decided that they wanted to use it for more powerful media. Obviously when the Internet was first made, it wasn't devised for movies, mass media, or gaming.

HTML and JavaScript are just plain outdated. That is the fact. What all companies, especially Google, are trying to do is keep JavaScript and HTML relevant so all the loads of old web pages will still be readable. JavaScript is definitely not suited for dealing with the amount of media we are trying to achieve nowadays.

If it weren't for Google going to great lengths to keep JavaScript relevant, the Internet would have probably collapsed under the device overload. The applications are getting more and more powerful every day. Something is going to break, and I think JavaScript is running on a thin wire.

If you ask me, the Internet needs a language overhaul. Right now, due to the inclusion of tablets and phones, the programming world has become very messy and unfocused. Even Java's whole notion of "write once, run anywhere" is essentially ruined. It is like companies don't know what technology to get behind. Ha, I don't have any clue what technology to get behind. Too bad time is money...

The "internet" gets a "language" overhaul about once a year. The latest trend is HTML5, replacing Flash, thankfully. Tablets, iPads, and all other touch devices need an overhaul, if anything, so they can interface with other events instead of just "click here."

Okay then, I'm not talking about anything like those. Try Scheme, Lisp, Dylan, and Scala macros, or Python's EasyExtend, and that's much more like what I'm referring to when I say macros. The C Preprocessor gave macros a bad name.


Macros are still obsolete. It's actually one of the reasons I prefer Java, the others being that everything except primitives is a reference, and the garbage collection system.

What makes you think the Internet is the Web, or that even the Web strictly pertains to HTML? So it wasn't "designed" for movies. The phone system wasn't designed to carry data either, nor was the cable TV system. Well-designed systems adapt and evolve without needing to be destroyed first.

I agree with your point. Let me clarify my points a bit.

The Internet is not the Web, but the Web does make up a chunk of the Internet. Tablet and phone devices are more-or-less structured to handle the Web better than any other feature of the Internet. HTML5+JS is pretty much the de facto standard for all Web interactions (if you want it to work everywhere). Personally, I find JavaScript very clunky to work with.

The fact that we have to convert everything to HTML5+JS to make programs and web pages work everywhere seems like a step backwards. Isn't the ability to program in many different languages the key to better programs? Unless you are arguing that these systems (HTML and JS) are well-designed, I can't see the point you are getting at.

If it weren't for Google going to great lengths to keep JavaScript relevant, the Internet would have probably collapsed under the device overload. The applications are getting more and more powerful every day. Something is going to break, and I think JavaScript is running on a thin wire.

I'm not sure I follow. When developing an app, you typically look at its performance, and verify it runs at reasonable speeds on your target devices. If not, you cut back on calculations, one way or another.

Web apps will never dictate at what speed (JavaScript) applications will run, just as the usual OS processes don't dictate what speed CPUs should run at. The environment dictates the boundaries, and developers may or may not try to do as much as possible within those boundaries.

Webapps will never get too powerful for browsers, just like programs never get too powerful for a CPU, simply because they will be rejected by the users, and the developer either goes out of business or solves the performance problem. If javascript performance is tripled next month, you'll see webapps pop up demanding three times as much performance as before. If javascript performance won't increase from now on, web apps will cope. Nothing is going to break, there is no thin wire.

Last but not least, I don't quite get why you credit Google with a major role behind JavaScript in browsers. Google was the very last to enter the browser environment with Chrome. We should be most thankful to Firefox and, to a lesser extent, Safari.


What the best language is, is better left for an academic discussion. Reality rarely favors the "best" anyway, because "best" is a very subjective term. Ask an engineer and a businessman what the "best" programming language is: the engineer will answer "the best designed language"; the businessman will answer "the one that creates the most value for me/my customers as cost-effectively as possible". Sadly, there's little correlation between a well designed language and productivity.

So far, all the con points here about JavaScript are exactly that, "it's shit": very subjective thinking from an engineering standpoint.

Windows 95 crashed every day back in the day, it didn't stop it from becoming the most popular OS in the world, because no matter how often it crashed it was always many times more productive than the alternative.

As for which language will be used a lot (which does not mean "used exclusively") in the future: the trend favors JavaScript (among others), but not so much Java.

It's already falling out of fashion to have JVMs installed on the desktop side. Applets are disabled by browsers by default, and it's difficult to find devices that support Java. On the server end, Java still has a fighting chance, but I don't think it will be for long; new languages that scale better are already eroding Java's base there. That's the trend for Java, and you don't need to be a JavaScript lover to see that.

Just recently I found out how horrible Java is at doing multiple asynchronous calls (deferreds) in the backend. I had to fix this code someone had written: he wanted to do multiple asynchronous calls, and he ended up creating one thread for every call he made, or roughly 100 threads, PER CLIENT REQUEST. Of course this was rectified with a thread pool and executors, but even with that it was nightmarish, because we had to somehow manage the executor service for that particular controller, and that added to the complexity and counter-productiveness. This was no junior programmer who made this mistake, and he writes Java all day long. Why seasoned Java programmers stumble on this is self-explanatory: the language doesn't promote doing complex things in an instinctive, easy, productive manner. Given how "well" Java is designed, the language constructs just don't scale for the modern reality of mass-core servers and asynchronicity. Compare it to JavaScript and how easy it is to use jQuery's $.when().
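For what it's worth, a minimal sketch of the thread-pool fix described above, assuming the backend calls can be modeled as Callables (all names here are illustrative): one bounded, shared ExecutorService fans the work out, and the Futures are joined, instead of spawning a thread per call.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.Callable;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class FanOut {
    // One bounded pool shared by all requests, instead of ~100 threads
    // spawned per client request.
    static final ExecutorService POOL = Executors.newFixedThreadPool(8);

    public static List<Integer> fetchAll(List<Callable<Integer>> calls) throws Exception {
        List<Future<Integer>> futures = new ArrayList<>();
        for (Callable<Integer> call : calls) {
            futures.add(POOL.submit(call));     // fan out onto the pool
        }
        List<Integer> results = new ArrayList<>();
        for (Future<Integer> f : futures) {
            results.add(f.get());               // join, preserving call order
        }
        return results;
    }

    public static void main(String[] args) throws Exception {
        List<Callable<Integer>> calls = new ArrayList<>();
        for (int i = 1; i <= 3; i++) {
            final int n = i;
            calls.add(new Callable<Integer>() {  // stand-in for a backend call
                public Integer call() { return n * 10; }
            });
        }
        System.out.println(fetchAll(calls));     // prints [10, 20, 30]
        POOL.shutdown();
    }
}
```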

XML is falling out of favor too, and JSON seems to be the programmers' choice nowadays. Ever tried reading JSON data with Java? Well, it's a mess because Java is strongly typed. XML is no better in Java. JavaScript and JSON is a happy marriage.
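The typed-language friction in miniature, using only the standard library (real projects would use a JSON library such as Jackson or Gson; the names below are illustrative): JSON's dynamic shape lands in Java as Map<String, Object>, and every field access needs a cast.

```java
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class JsonFriction {
    // What {"name": "bob", "scores": [1, 2]} tends to look like once a
    // parser hands it to strongly typed Java: a bag of Objects.
    public static Map<String, Object> sampleDoc() {
        Map<String, Object> doc = new HashMap<>();
        doc.put("name", "bob");
        doc.put("scores", List.of(1, 2));
        return doc;
    }

    public static String readName(Map<String, Object> doc) {
        // Every field access needs a cast; this is the mismatch complained about.
        return (String) doc.get("name");
    }

    @SuppressWarnings("unchecked")
    public static int firstScore(Map<String, Object> doc) {
        List<Integer> scores = (List<Integer>) doc.get("scores");
        return scores.get(0);
    }

    public static void main(String[] args) {
        Map<String, Object> doc = sampleDoc();
        System.out.println(readName(doc) + " " + firstScore(doc)); // prints bob 1
    }
}
```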

But there's so much happening on the front-end, with all the devices, and in the back-end, that if you truly believe Java is the way to go (full steam ahead) then you're in for a surprise when, in 5-10 years, junior programmers will not work on your projects because they're in Java.

>Google are attempting only now to do something called Dart but of course it won't work anywhere outside of Chrome.

You can cross-compile it to (modern-ish) JavaScript. Right now you get about 75% of the speed of hand-written JavaScript, but I'm certain that they will reach at least 90% in the not too distant future.

You also get source-maps, which means it's still somewhat debuggable in this state.

Furthermore, there is tree-shaking (=dead code removal), which allows you to use large kitchen-sink libraries without having to pay for it. This is something which always bothered me with JS: every feature of your library is just dead weight in most cases.

Well, there is also some overhead for every language feature you use. So, there is quite a bit of initial overhead, but you still break even somewhere at the 50-100 kB mark.

>Browsers would probably have been far better off if they used some sort of bytecode [...] Maybe LLVM

@appel All those arguments can be used against C too, where the situation is much, much worse. Still, it's alive and kicking, just not serverside. Today, Java can hardly be beaten for large serverside projects, while C can hardly be beaten for... well any desktop project.

Once a language establishes itself in a certain area, a massive ecosystem will emerge. This will give the language / platform momentum, which certainly won't break down in the next 5-10 years.


I'm sorry, but if your senior programmer was spawning a thread per request, it was either a configuration error from making a threadpool unbounded, or he just doesn't know concurrent programming. Concurrency has many great subtle oddments to it, but it's not exactly subtle when spawning a new thread is your first and last approach. Nobody is an expert in every domain, so his seniority may mean he was stuck in his ways rather than having had long exposure to modern concurrent idioms.

java.util.concurrent is a decent start, but it's only going to become actually pleasant to use in pure java when JDK8 lambdas land in the language. Even then, j.u.c.Future is pretty anemic as compared to the akka/scala-2.10 version or com.twitter.util.Future, so there's still some limitations there, but any of them still beat the weak opaque tasks in JS any day.
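As a taste of what that looks like once lambdas land, here is a sketch using JDK8's CompletableFuture (which shipped after this thread was written): the async pipeline reads top to bottom, with no hand-managed threads or callback plumbing.

```java
import java.util.concurrent.CompletableFuture;

public class Composable {
    // JDK8-style future composition: two async computations are combined
    // and transformed declaratively, running on the common pool.
    public static int pipeline() {
        CompletableFuture<Integer> a = CompletableFuture.supplyAsync(() -> 20);
        CompletableFuture<Integer> b = CompletableFuture.supplyAsync(() -> 22);
        return a.thenCombine(b, Integer::sum)   // join two async results: 42
                .thenApply(n -> n * 2)          // keep transforming: 84
                .join();                        // block only at the very end
    }

    public static void main(String[] args) {
        System.out.println(pipeline()); // prints 84
    }
}
```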

WRT macros: forget C/C++ & ASM. In the simple case you write a type once, like, say, a 3D vector. Then you've covered 3D vectors over singles, doubles, and any extended precision. Useful, but small potatoes. Next, some type that contains more than one subtype. Now it's combinatorial. Now consider (like I mentioned in the latest operator overloading discussion) dual numbers. One usage is automatic differentiation. We're now talking about removing up to something like 10 engineering-years of coding from some projects. Macros rule.
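The dual-number point in miniature: a hand-rolled Java sketch of forward-mode automatic differentiation, exactly the kind of type a macro system could stamp out across singles, doubles, and extended precisions for free.

```java
public class Dual {
    // Dual number a + b*eps, with eps^2 = 0; carrying b through arithmetic
    // computes exact derivatives alongside values (forward-mode AD).
    public final double re, eps;
    public Dual(double re, double eps) { this.re = re; this.eps = eps; }

    public Dual add(Dual o) { return new Dual(re + o.re, eps + o.eps); }
    public Dual mul(Dual o) { return new Dual(re * o.re, re * o.eps + eps * o.re); }

    // Example function: f(x) = x*x + x, so f'(x) = 2x + 1.
    public static Dual f(Dual x) { return x.mul(x).add(x); }

    public static void main(String[] args) {
        Dual x = new Dual(3.0, 1.0);             // seed derivative of 1 for x
        Dual y = f(x);
        System.out.println(y.re + " " + y.eps);  // prints 12.0 7.0
    }
}
```

The value 12.0 is f(3) and 7.0 is f'(3), obtained with no symbolic differentiation and no finite-difference error.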

@appel All those arguments can be used against C too, where the situation is much, much worse. Still, it's alive and kicking, just not serverside. Today, Java can hardly be beaten for large serverside projects, while C can hardly be beaten for... well any desktop project.

Once a language establishes itself in a certain area, a massive ecosystem will emerge. This will give the language / platform momentum, which certainly won't break down in the next 5-10 years.

Actually, that depends on many factors. Java serverside is not highly efficient for, say, a simple chat server app. But for desktop it's the best write-once, run-anywhere VM for full-sized apps, like games. It got a bad reputation due to a rash of bad coders writing a bunch of junk applets that were either malware or so poorly written they should never have been uploaded in the first place, like most of the Java IRC chat clients still in use today, for example.

A huge problem is maintenance of already-written apps: either they were not versioned correctly or they just didn't keep up with the latest versions of Java. A lot of really awesome desktop programs, some proprietary, are written in Java, or written to support Java. It's now more of a desktop app VM than an applet VM, and it seems Oracle is following that trend, focusing more on the desktop VM environment.

java.util.concurrent is a decent start, but it's only going to become actually pleasant to use in pure java when JDK8 lambdas land in the language. Even then, j.u.c.Future is pretty anemic as compared to the akka/scala-2.10 version or com.twitter.util.Future, so there's still some limitations there, but any of them still beat the weak opaque tasks in JS any day.

AFAIK, none of java.util.concurrent is very dependent on lambda expressions. Can you elaborate on where that would really benefit?

From my limited understanding, Akka is more on the level of JMS, as a high-level, cross-computer message-queueing system. java.util.concurrent is much lower level and works with threads/pools/tasks.

java-gaming.org is not responsible for the content posted by its members, including references to external websites and other references that may or may not have a relation with our primarily gaming and game production oriented community. Inquiries and complaints can be sent via email to the info-account of the company managing the website of java-gaming.org.