Well, I've been looking into the whole javascript (and webGL) thing. I'm impressed by how much faster it runs these days and some of the game demos I found out on the web (especially the HTML5 stuff). I'm curious though as to how one could "secure" it for use with "online" functions (like posting high scores, or multiplayer). Seems like anyone could just take a look at the code, save it, edit it, run it again and do whatever they want with any server feedback code.

Is there a way to prevent such easy hacking?

Using JavaScript doesn't change anything there. Everyone can just use a sniffer and check what's going on. If you use a minifier (you should for the bandwidth savings alone) the code is about as readable as decompiled obfuscated Java.

I've always quite liked it as a language, although like oNyx mentions it's something I've messed around with without taking the time to properly 'learn' it. My issues have always been with the implementation - the speed was always terrible and browser quirks rendered it fragile and even more "write once test everywhere" than Java.

However this seems to be rapidly changing - particularly Chrome's V8 seems to have given most of the other browsers a good kick up the rear end and now javascript is reaching acceptable speeds. Browsers still seem to be very quirky but there's a good handful of libraries that seem to do a good job of hiding these, and the canvas tag should improve things still further. So in a couple of years javascript-the-programming-experience (as opposed to javascript-the-language) could well be a much more pleasant experience (or even now, depending on how much you need to worry about compatibility and older browsers).

The browser needs to execute the code. The browser needs to decrypt the code. The browser needs the key. The client knows the key. The client can decrypt the code.

Oh, and clientside security will never work.

Agreed.

But if you add a username/password into the mix you can use the password as the key for encryption. When submitting high scores, the username can be sent both as plaintext and encrypted. The server can verify the user is who they claim to be by looking up their password using the plaintext username and then decrypting the message to recover the username a second time (only someone with the password could have encrypted it). The server can then challenge the user with an encrypted nonce to prevent replay attacks. Note that the password itself is never sent.

There are also alternatives: the user sends their username and receives back a key to use for future encryption, which is itself encrypted by the server. Nonces are again included to prevent replay attacks. This is the approach used in Kerberos.

Seems like anyone could just take a look at the code, save it, edit it, run it again and do whatever they want with any server feedback code.

Is there a way to prevent such easy hacking?

Multiplayer games are usually run on the server, with the client being a dumb front-end. In theory it doesn't matter if the client is decrypted and hacked apart, because it's just a dumb client. The server just ensures all the inputs it receives are valid.
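
The "dumb client, authoritative server" idea can be sketched in a few lines. The names and the movement limit here are invented purely for illustration: the point is that the server accepts only bounded inputs and runs the simulation itself, never client-supplied state.

```javascript
// Server-side sketch of an authoritative game loop step.
// The client sends only inputs; the server clamps them to what is
// physically possible, so a hacked client gains nothing.
const MAX_SPEED = 5; // max world units a player may move per tick (invented)

function clamp(v, lo, hi) {
  return Math.max(lo, Math.min(hi, v));
}

// Apply one tick of client input, rejecting impossible moves.
function applyInput(player, input) {
  return {
    x: player.x + clamp(input.dx, -MAX_SPEED, MAX_SPEED),
    y: player.y + clamp(input.dy, -MAX_SPEED, MAX_SPEED),
  };
}
```

A client claiming a teleport-sized jump still only moves MAX_SPEED units, no matter what its edited code sends.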

Which applies to any crossplatform development. Write once, run anywhere is more or less a lie. Some kinds of apps (e.g. LWJGL-based games) need little to no tweaking and generally just work as intended, but GUI applications are a very different matter. You really need to test on all platforms and make changes for each one, as each OS has slightly different concepts (Mac OS X being the most different). If you don't do this you'll deliver a poor application on every OS (except maybe the one you're developing on).

Having good crossplatform libraries can help. For example, I want to experiment with a higher-level abstraction for PureSwing that would help in developing crossplatform GUI apps with (quite) different GUI structures based on the standards of each OS. But while that would help, "test everywhere" still applies.

Write once, run anywhere is not a lie as such, just oversold. It was oversold in creating the perception that a compliant CDE-L&F app would need no changes to be a Windows-L&F app versus a Mac-L&F app.

If you are not doing L&F GUIs it's pretty good. I don't need different TCP bindings etc. In terms of a basic GUI it's also far easier. But if the fine details of *platform specific* L&F are important, then I think Swing is just as bad and just as good as any other widget library I have had the displeasure to use.

Personally I hate how pedantic people are with L&F. It's like "pixel perfect" web design. Lame.

I have no special talents. I am only passionately curious. --Albert Einstein

Personally I hate how pedantic people are with L&F. It's like "pixel perfect" web design. Lame.

Please note this is not an L&F issue this time, but a more high-level thing: e.g. Mac OS X has quite different concepts for its GUI, like a completely different layout and content of dialogs, or how some tasks are done by the user. In fact Mac OS X was a big eye opener for me when it comes to writing crossplatform applications. Windows and Linux applications seem to differ in just a few details, so there it's mostly just a matter of L&F.

Which applies to any crossplatform development. Write once, run anywhere is more or less a lie.

Oh I agree, you've always got to test on the platforms you release on, but there's a difference between fixing bugs because you used an API incorrectly (which is what I find most of my java cross-platform issues are) and the flat out browser incompatibility and mess that javascript has.

..Mac OS X has quite different concepts for GUI, like having completely different layout and content of dialogs or how some tasks are done by the user..

I would still call this L&F. However, I think we probably both agree that other crossplatform widget toolkits don't address this problem any better than Java does.

The pedantic bit was more of a Linux/Windows user thing I noticed at my old work. Quite a lot of people get really upset if the font is not the one they think it should be, for example. In a word processor, fine... but in high-end banking software... just change it!

Just for the record, I have not needed to support the Mac much beyond a working GUI. We have mostly been able to expect users to work with the different "L&F".


But if you add username/password into the mix you can use the password as the key for encryption. [...] Note that the password is never sent.

In no way does this prevent me from sending in a score of ten billion.

Despite all my criticisms, and all the issues I've had writing cross-platform Java, it is still easily the best platform I've ever used for cross-platform development. 99% of the time it really does 'just work' across all platforms.

My biggest complaint is that the fake native look & feel is not the default. Sun seem to have this bizarre idea that pushing their own Java look & feel is a good thing. They have made excellent UIs, but they are not on par with Windows or MacOS! People also tend to have a very negative stereotype of Java apps. IMHO foreign look and feels of the same quality almost always feel worse, simply because they are different. The brushed metal UI for Safari is a good example: it looks great on a Mac (it fits), and terrible under Windows (it really stands out). The various alternate MS look and feels (i.e. Office and Windows Live Messenger) get away with this because they are essentially the same with improvements, so they still fit.

It's easier for you to be tracked and banned from doing so a second time.

With your encryption, you are only making sure nobody is tampering with the connection between the browser/user agent and the server. When the user agent itself is malicious, it is not tampering with the connection; it simply sends invalid data. In terms of data integrity/correctness, this yields no advantage over sending plaintext.
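
So the server also has to judge whether the data itself is believable, independent of who signed it. A minimal sketch; the rate limit is a made-up number that a real game would tune:

```javascript
// Signing/encryption proves WHO sent a score, not that the score is sane.
// The server must also check the value against what is achievable in the
// game. MAX_POINTS_PER_SECOND is an invented example limit.
const MAX_POINTS_PER_SECOND = 100;

function isPlausibleScore(score, playSeconds) {
  return Number.isInteger(score)
      && score >= 0
      && score <= playSeconds * MAX_POINTS_PER_SECOND;
}
```

A ten-billion-point submission after a ten-second session fails this check regardless of how correctly it was encrypted.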


My suggestion to use encryption was in response to oNyx saying that anyone can just use a packet sniffer to listen to the packets being sent across a network. Encryption aims to solve this issue, that's why I suggested it.

Google's decision to implement O3D on top of WebGL is yet another sign that the client side is converging on the browser and "html 5" technologies. This could be bad news for Java, but then again, maybe not. First of all, it would reinvigorate OpenGL as the most widely used graphics API at the expense of DirectX, which would mean better OpenGL drivers. JOGL and LWJGL developers would clearly benefit.

Second, and this is a wild hypothesis (or wishful thinking, perhaps), the rise of "html 5" and fast JavaScript could make Java the only real alternative for web-deployed software. I believe Flash (and Silverlight, for that matter) doesn't have much to offer on top of html5, but Java has some advantages. It would be much more performant than JavaScript, in the near future at least; it has a multitude of libraries and power features; and it provides a much nicer development experience, especially with JavaFX Script, which is a joy to use compared to browser technologies (or Flex). True, Java applets and WebStart apps won't get any more common, but they could become the de-facto standard whenever some true heavy lifting is required client-side. All this, however, depends on Oracle delivering some much-awaited features for Java/JavaFX soon, especially improving the user experience vis-a-vis loading time and installation.

I don't like the direction the Web is going. JavaScript is basically the assembly language of the Web. Why are we using a high-level, browser-embedded (possibly interpreted) language for (about) everything? All those applications growing on the Web are Html and JavaScript. I find this approach a waste of resources. We could have MS Word (or OpenOffice) text processors compiled from C++ to optimised machine code, or runtime-compiled and optimised Java, but no, we are going with Html and JavaScript.

I have researched Web (Html+JavaScript) games a bit. In display complexity I would say they currently equal desktop games of around 1993 (before OpenGL arrived). With Html5, embedded video, and 3D, Web (JavaScript) games would reach the display complexity of early OpenGL games (Quake II). So basically Web games are 15 years behind.

I can't understand why it is so hard for all the parties to standardise on a common platform which is more effective. I mean, why is the Web (Html+JavaScript) the only commonly accepted platform?

Besides the cost of driving the 3D display from JavaScript, JavaScript programs running 3D graphics are more complex too. In this regard the mobile platforms are not suitable for running JavaScript: one article showed that the iPhone has very bad JavaScript performance. So you may be able to play video (and make Web games with prerecorded videos, like those from around 1995), but you won't be able to do 3D in JavaScript on mobiles.

I think the future is in video streaming. There are already two such services. It is possible to rent a game (from the game "cloud") and have the display & audio streamed through the net to you, with controls sent back to the server. Mobile net will soon reach speeds of 50-100 Mbps, and video streaming will become more viable.

Right now Google, Amazon and others are providing services for running applications in the "cloud". I think we will have "game clouds" too. That means: make a game which plays on a desktop computer, with full DirectX 11/OpenGL 3 capabilities, publish it to a "game cloud", and serve it to anyone with a web browser, TV device, or mobile device. Running games on the client's computer will become irrelevant; the client computer will become just a cheap "terminal". I can imagine new game APIs being published specifically for making games run from "game clouds". Possibly game engines (like the Unreal engine) will have such "game cloud server" functionality built into them.

Running games 'in the cloud' has many advantages for the clients, and massive disadvantages for the developer/hoster: scalability is horrible and expensive. Maybe, maybe you have 10 players concurrently playing on 1 dedicated server, and that's that.

And then there is lag. Network lag (twice), frame rendering, video compression, video decoder, for every controller-input event. No FPS in the cloud!


Well, cloud FPSs are (almost) here: http://www.onlive.com/. Apparently on fast enough connections latency is acceptable. I agree hosting them should be very expensive. Not so much the CPU work, since servers have more cost-effective CPUs, but I don't see how you can get around dedicating a GPU per (active) player. Maybe those new NVIDIA Tesla farms can help: http://www.nvidia.com/object/preconfigured_clusters.html

There is marketing and there is reality

No matter how cost-effective those server CPUs are, it's far less costly not to run the games on the servers in the first place. I think this topic has been beaten to death several times now, and the FPS genre is simply not well suited for 'remote execution', even at LAN-like speeds/latencies.


I don't like the direction the Web goes. JavaScript is basically the assembly language of Web. Why are we using a high-level browser-embedded (possibly interpreted) language for (about) everything?[...]

[...]I have researched Web (Html+JavaScript) games a bit. In display complexity i would say they currently equal to desktop games of around 1993 (before OpenGL arrived). With Html5, embedded video, and 3D, Web (JavaScript) games would reach display complexity of the early OpenGL games (Quake II). So basically Web games are 15 years behind.[...]

With WebGL you can make games with PS1-like computations and PS3-like shaders on top. That's plenty of power, if you ask me.

Regular Canvas is similar to Flash. We'll see some hardware acceleration in the future though.

Also, both (i.e. Canvas and WebGL) will get faster once some bottlenecks are removed. The compositing currently takes a big chunk of CPU cycles for no good reason. This will get fixed.

Android handsets will be fine, too. They support OpenGL ES 2.0, and the V8 JS engine (the one Chrome uses) works there as well. Oddly enough, the JS engine is just like the most powerful one on the desktop, whereas the Java VM is pretty crippled compared to the ones you know from desktops (they intend to speed it up a bit, though).

It's sorta unfair, but as things are, JavaScript is in pretty good shape on those weak devices.

It's sorta unfair, but as things are, JavaScript is in pretty good shape on those weak devices.

I was referring to the additional complexity needed in 3D games. Think physics, etc. Whatever performance JavaScript has, it was not made for such things, and running them in JavaScript is problematic even on the desktop. In the article I read, iPhone JavaScript is 10 times slower than on a netbook. The 3D display may work properly (it's hardware accelerated, no question), but with CPU performance 10 times slower than a low-end desktop, it will not handle the additional computations a 3D game needs. Android may be better, but I doubt even it has enough processing power.

Sure, running games in the cloud is expensive. That is why I think there will be special versions of game engines made for running in the cloud: something like a single game instance synthesizing the display for multiple clients, sharing resources (textures, models), and rendering to the framebuffer and compressing the stream too. One such instance could possibly act as a multiplayer server as well (reducing that part of the lag to 0). I think games specifically written to run in a "game cloud" will require a lot less resources per player served than a stock desktop game run on the server. That is why I think special game engine versions and new "cloud game" APIs and SDKs will matter.

I personally think it's only a matter of time until JavaScript gets some competitors in the browser. I personally would love to see something along the lines of C--, a language designed to be an intermediate language. Google are also working on Native Client to allow users to safely run native apps in their browser. Something like this will probably end up being the best solution.

I just can't see "the cloud" working for games. We still have massive asymmetries in network versus local CPU/bandwidth ratios, and there are no indications that's ever going to change. In other words, networks are still the bottleneck.

And that's leaving out costs. As Riven says, it does not scale. Everyone thinks bandwidth is cheap. It's not: for a decent-uptime server (a real Service Level Agreement, not some "promise") with serious bandwidth, it's cheaper to master and post DVDs than to sell online. I have run the numbers. Peering agreements only help some.

Then on top of all that is what people call HD for live streaming. It's not even VGA half the time: pixelated and blocky. A BluRay disc has a peak bandwidth of about 40 Mbit, and that's for 1080p; really it's optimized for 720p/1080i. How many FPS gamers do you know that run at that low a resolution? (I do, but many are well above that now.) So rendering at high res and then compressing it *on the fly*, with very poor quality and high latency, down to that level of crap... it's not going to take off.

Sure, it may work for FarmVille... but at >720p60 I can always give my clients a better game for less cost by giving them a full client-side app.

Also the "the cloud" crap was called "net computing" about 10-15 years ago. We were all going to have dumb terminals within the decade. It didn't sell at all.


Also the "the cloud" crap was called "net computing" about 10-15 years ago. We were all going to have dumb terminals within the decade. It didn't sell at all.

Net speed is just one thing that got better, but the big difference is that we have a lot more and better mobile devices now. We already have the "dumb terminals": they're called Android, iPhone, iPad and netbooks. Mobile net speed will get up to 50-100 Mbps, and that will be more than enough to stream video. Also consider that mobile devices have smaller displays, so a small (rendering, streaming video) resolution is enough. In the time of "Net computing" there were no applications on the net; now we have Google and a myriad of other Web-based applications. There are even programming IDEs on the Web now.

I'm not disputing that a desktop (or game console) game beats streaming video. It's just that PC sales are declining and mobile device sales are exploding. Mobile devices are too weak to run proper 3D games, but are (thanks to hardware acceleration) fast enough to show streaming video.

Also the "the cloud" crap was called "net computing" about 10-15 years ago. We were all going to have dumb terminals within the decade. It didn't sell at all.

This. "The cloud" has degenerated into a marketing euphemism. If big corporations said "store your private data on our servers" people would run a mile. Call it "the cloud" and suddenly it's hip and trendy.

Today Steam distributes 20 petabytes of data a month and is able to make a profit. Bandwidth costs have been going down over the last decade, and that's expected to continue. Typically with optic fibre it's the interface which is the expensive part; that's one reason why it's not being pushed for use in homes. However, again, that is changing. Light Peak is an example of an optic fibre interface becoming cheap enough for use in home devices (even though it's still more expensive than USB). There are houses that already have optic fibre broadband connections, and I see no reason why this won't be commonplace in the future.
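
For a sense of scale, 20 petabytes a month works out to a sustained stream of roughly 62 Gbit/s, a quick back-of-envelope check:

```javascript
// What 20 PB/month means as an average sustained bitrate.
const bytesPerMonth = 20e15;          // 20 petabytes
const secondsPerMonth = 30 * 86400;   // ~30 days
const avgGbitPerSec = bytesPerMonth * 8 / secondsPerMonth / 1e9;
// avgGbitPerSec is roughly 62 (and peaks will be well above the average)
```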

Even if OnLive (and similar) services don't work in practice due to latency, with increased bandwidth you can have the user download a game on the fly (or at least just the bits they are using) in realtime.

This. "The cloud" has degenerated into a marketing euphemism. If big corporations said "store your private data on our servers" people would run a mile. Call it "the cloud" and suddenly it's hip and trendy.

It was. But if you look around now, you will see a different picture. Companies are saving tons of cash by running their services on Google's or Amazon's cloud instead of their own data-centers. It's not "just hype": there is a huge amount of programming articles about writing apps to run on clouds, application frameworks are getting cloud-compatible, and there are user discussions with both good and bad reports. It shows that the cloud is not an empty phrase any more. It is different from what the "Net computing" initiative was. "Net computing" focused on the client side, and there were neither client machines nor services usable from the Web; it was a sales slogan. But now "the cloud" is a working business for both Google and Amazon.

Only if you can change the laws of physics... With a given S/N ratio there is only so much data you can fit per Hz. Either that, or you are the only person on the block with a mobile device. The only way you could get that to work is with cell sites the size of an orange.

I have 8 Mbit at home, completely unlimited, for 25 EUR. Many people up north get fiber at that price (but don't ask how much the beer costs!). Home bandwidth is quite high and the last mile is good enough in many places. But that's not where the bandwidth shortage is.

Quake 3 still has many active servers... Let's assume we have a server with 16 people connected. You are not going to get away with YouTube quality here, but let's assume there is some serious codec magic going on and you can get 1 frame of latency at 10 Mbit (I seriously doubt 720p60 at 40 Mbit would be good enough). Let's assume 50% occupancy, and we have 12*16*3600*10 Mbit => 864 GByte *per day* *per server* (only 16 players!), or 26 TByte per month for *one* server. Compare Quake Live, which uses about 5 MByte per hour for 16 people on a server (i.e. 60 MByte for 12 hours). Even if the encoding could get the bit rate down to 1 Mbit, it's still massive compared to a normal system. So Quake Live will be cheaper... and we haven't even started talking about the hardware required to render and encode this data, the cooling, or the rent. And let's not forget that a pipe 1000x larger than your competitor's is going to cost more too.
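
The arithmetic above checks out; restating it as a quick sanity check with the same assumed figures (16 players, 12 hours/day occupancy, 10 Mbit/s per stream):

```javascript
// Re-running the back-of-envelope numbers from the paragraph above.
const players = 16;
const hoursPerDay = 12;      // 50% occupancy
const mbitPerPlayer = 10;    // assumed per-stream bitrate

const mbitPerDay = players * hoursPerDay * 3600 * mbitPerPlayer;
const gbytePerDay = mbitPerDay / 8 / 1000;     // = 864 GByte/day/server
const tbytePerMonth = gbytePerDay * 30 / 1000; // = ~26 TByte/month/server
```

Against Quake Live's quoted ~60 MByte for the same 12 hours, the streamed version moves over four orders of magnitude more data.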

Sure it can be done... but it will never be cheaper.

And mobile devices are not too slow for 3D. Older 3D works fine. Their 3D capabilities will increase faster than their network capabilities.


java-gaming.org is not responsible for the content posted by its members, including references to external websites and other references that may or may not have a relation with our primarily gaming and game production oriented community. Inquiries and complaints can be sent via email to the info-account of the company managing the website of java-gaming.org.