
twofishy writes "Something I've noticed amongst financial service companies in London is a growing use of Java in preference to C/C++ for exchange systems, High Frequency Trading and other low-latency work. InfoQ has published a written panel discussion with Peter Lawrey, Martin Thompson, Todd L. Montgomery and Andy Piper. From the article: 'Often the faster an algorithm can be put into the market, the more advantage it has. Many algorithms have a shelf life and quicker time to market is key in taking advantage of that. With the community around Java and the options available, it can definitely be a competitive advantage, as opposed to C or C++ where the options may not be as broad for the use case. Sometimes, though, pure low latency can rule out other concerns. I think currently, the difference in performance between Java and C++ is so close that it's not a black and white decision based solely on speed. Improvements in GC techniques, JIT optimizations, and managed runtimes have made traditional Java weaknesses with respect to performance into some very compelling strengths that are not easy to ignore.'"

Programs written in some languages run more quickly than those written in other languages. Some languages allow you to write programs more quickly than other languages. Therefore, the choice of language is always a tradeoff. There is no perfect programming language, and no, not all programming languages are alike as you seem to believe. If I want to write a program quickly, I write it in Python even though I know the program will run quite slowly.

They're referring to the lag you get when launching a Java app. There are ways around those issues, but not ways that are native to the language or the platform. Until the hot spots are found and compiled, a Java app is pretty slow. I'm not sure this should be an issue with HFT, though. The algorithm should be persistent in memory for the trading day, and the hot spots should be found and compiled within a few minutes. Unless they are repeatedly relaunching it, there shouldn't be a problem.

Some people also work on multi-million-line projects which compile in a minute or less. It's a trade-off. Most compile-time issues are related to obscene (and generally unnecessary) numbers of header dependencies. A 1000-line file takes an almost immeasurably small time to compile, but a 2-line file can take a minute if it includes the wrong headers. The ISO C++ standard library and Boost are pure evil in this regard.

A well-formed program can compile extremely rapidly. A poorly formed program can often compile extremely quickly with enough CPUs working in parallel across a fast enough network. However, a decent program will take time for the initial compile but almost no time for each subsequent compile, when only a file here or there has changed.

Also, to make things politically incorrect on Slashdot: good tools like Visual C++ 2012 can take even poor code and compile it quickly if the project files are designed well. Do your coding there and compile it later on GCC.

Manual memory management introduces a whole class of bugs that Java doesn't have to deal with. A good C/C++ programmer shouldn't have *that* much difficulty, but it does add to debugging costs and detracts from mental focus on the algorithmic aspects of the problem.

Garbage collection creates a different class of problems, namely that the performance characteristics of your program become non-deterministic. This is a Big Deal for certain classes of applications such as video games and in particular HFT. Would you like to be the person explaining why the GC caused your program to stall at an inopportune time for 50ms and lost somebody a few million bucks?

Any JVM that would run this code will be tunable, including the ability to tune the GC so it becomes more deterministic. The fact that your desktop app runs in 'use all memory then GC' does not mean that is the only way the JVM can work.
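Tuning starts with measuring what the collectors are actually costing you. The JVM exposes that through the standard java.lang.management API — a minimal observation sketch (the class name `GcStats` is just illustrative):

```java
import java.lang.management.GarbageCollectorMXBean;
import java.lang.management.ManagementFactory;

// Sketch: list which collectors this JVM was configured with and how much
// wall-clock time they have consumed so far. The collector names reported
// here change with tuning flags, which is one way to confirm your flags took.
public class GcStats {
    public static long totalGcMillis() {
        long total = 0;
        for (GarbageCollectorMXBean gc : ManagementFactory.getGarbageCollectorMXBeans()) {
            total += Math.max(0, gc.getCollectionTime()); // -1 means "unsupported"
        }
        return total;
    }

    public static void main(String[] args) {
        for (GarbageCollectorMXBean gc : ManagementFactory.getGarbageCollectorMXBeans()) {
            System.out.println(gc.getName() + ": " + gc.getCollectionCount() + " collections");
        }
    }
}
```

Watching these numbers while adjusting heap and collector flags is how you find out whether the JVM is really running in 'use all memory then GC' mode or something more deterministic.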

That's true of garbage collection with a bad implementation. But garbage collectors do not need to cause non-determinism; they can be used even in real-time systems. The problem is that people re-invent the wheel and don't learn from all the past research that sped GC up. Or, more likely, the system was originally designed as a simple quick-and-dirty scripting language, but its scope has grown so that the hastily written GC is unsuitable for the new requirements.

To be fair, good GC is hard, and it is extremely difficult to do portably. Some good GC techniques need hooks into the operating system (e.g., you want to mark pages as clean or dirty). Doing this portably sometimes means you see implementations that sit GC on top of malloc, so that GC is merely a way to eliminate manual allocating and freeing, and you end up with performance headaches from memory fragmentation.

Yes, it's true: if you use malloc and free directly with no GC, you can still end up with non-deterministic allocation performance! Especially with the naive malloc I see used a lot on small systems, where a linked list of available regions means a linear search to find a region big enough. And you end up with fragmentation on any system that does not have relocatable memory (i.e., most traditional non-interpreted systems), which is why many embedded systems prefer memory pools to avoid those problems.

"Wouldn't a C++ programmer generate an applicable program effectively as quickly as a Java programmer?"

No, some languages simply are slower to develop with and debug. The problem is also made worse or better by the frameworks and IDEs available. As an example, you're going to get your work done far quicker writing an application that manipulates dates and times using C# and Visual Studio than using Java and Eclipse, because until Java 8 Java's date/time functionality is shit and Eclipse is a dog-slow IDE. With Java 8 and, say, NetBeans or JDeveloper, though, things will be pretty similar.

At the end of the day, with C/C++ you have to deal with memory management, and that's just one additional piece of work that you don't have to be so concerned about in Java. As a benefit of that, though, C/C++ give you more scope for optimisation and more control over memory management. It's about trade-offs and figuring out what matters.

But it's possible to be great at both C++ and Java without descending into petty arguments about which is better, and to know when to use each for a specific task. That's the sort of great programmer these institutions will be looking for, and it's really what they're talking about: both have their place. In some cases, getting a trading application to market a day earlier than the competition, even at a slight latency trade-off, may be enough to net your company a few million dollars of advantage. In other cases, the latency improvements of a highly optimised C++ application may instead be the key to scooping up those extra millions.

At the end of the day with C/C++ you have to deal with memory management and that's just one additional piece of work that you don't have to be so concerned with with Java.

You're funny.

I think we have about a dozen direct calls to new and delete in a few hundred thousand lines of code in our server. The vast majority of memory allocation in C++ is hidden in libraries like the STL, which we presume have been debugged.

In Java, on the other hand, you have to be very careful with memory management or you'll either end up pausing for long periods for garbage collection or crashing with out of memory errors. Instead of worrying about whether people remember to call free after allocating something, you end up putting in caches so you can reuse objects rather than reallocate them. We've used some Java libraries designed for high-performance financial uses and they try very hard not to allocate any objects that they'll later need to clean up.
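The object-reuse pattern described above can be sketched with a minimal pool (the `ObjectPool` class here is illustrative, not taken from any particular library):

```java
import java.util.ArrayDeque;
import java.util.function.Supplier;

// Minimal object pool: reuse instances instead of allocating fresh ones,
// so the steady-state hot path creates no garbage for the GC to collect.
public class ObjectPool<T> {
    private final ArrayDeque<T> free = new ArrayDeque<>();
    private final Supplier<T> factory;

    public ObjectPool(Supplier<T> factory, int preallocate) {
        this.factory = factory;
        for (int i = 0; i < preallocate; i++) {
            free.push(factory.get());   // allocate up front, before the market opens
        }
    }

    public T acquire() {
        T obj = free.poll();            // reuse a pooled instance if one is available
        return obj != null ? obj : factory.get();
    }

    public void release(T obj) {
        free.push(obj);                 // return to the pool instead of dropping for GC
    }
}
```

Acquiring and releasing the same instances repeatedly keeps allocation out of the hot path, at the cost of exactly the manual lifecycle discipline the parent describes.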

Programmers believing that 'you don't have to worry about memory management in Java' is the reason so many Java programs are a slug-like mass of memory bloat.

But that's the point, isn't it? Sometimes those hits from the garbage collector simply don't matter, and if they don't matter because performance isn't the greatest concern, then you don't have to bother yourself with memory management, whereas in C++ you have to bother with it whether performance is a concern or not. In other words, C++ always has an inherently slower time to market because of this underlying trait, and that's the point of TFA.

What hinders you from using Joda-Time if you think Java's date and time handling is insufficient? Any non-trivial application has dozens or more dependencies; one more or less really is not important. I find the strength of Java is the enterprise-grade open source libraries: one for every possible scenario.

You are obviously a free thinker shill. Go back to your free thinking and let us get back to the regularly scheduled bashing of everything related to corporate, government, or anything else a 13 year old boy doesn't fully understand yet.

Use RAII consistently, and use containers (from the STL or otherwise) which have assert()s on bounds checking.
Bonus points for a tiny unit test (which can therefore run at the end of every compilation).
You'll be amazed at how stable, maintainable, easy to debug and performant your code will be.
Do the hardcore pointer handling only where the profiler tells you it matters, and there's no way Java even gets close in performance.

Java benchmarks always avoid the thing that makes Java slower: mandatory use of the heap and all the indirection of reference semantics for user types. In other words, as soon as you start using many objects, Java's performance goes into the crapper, especially using the standard EE libraries of the major vendors.

The chest beating about Java vs C is kinda sad. Look, I've spent the past 20 years hating Java with the fire of a thousand suns, but having been kinda forced to use it lately, I've realised it's actually not a bad language; in fact it's quite neat and well thought out. (But God help me, somehow its date handling is even more broken than JavaScript's.)

The problem is all the verbose cruft that goes with it: the giant, overly complicated frameworks that require configuring 50 different XML files fed through a labyrinthine, undocumented build process, all so you can write terse and insane pattern-madness code... or you could just fire up JDBC and write a bloody SQL query instead.

I think Java could shine if people just threw out about 15 years of insane and overly complicated frameworks, took some tips from the Python and Perl people, and replaced them with some simple but effective libraries that do one thing and do it well.

Yes, Java at its core is a simple language, with well understood concepts and features that people have been using for decades. But it's saddled with a huge amount of framework. To be a good Java developer these days doesn't mean knowing how to use Java the language, it means knowing all about the infrastructure and how to quickly tie together pre-existing functions to do what you want.

Wow, this is new. A rational, reasonable argument from a self-described Java hater... not just another "Java is slow because it's interpreted" or "Java is insecure" bullcrap post.

I've seen legions of developers grow frustrated with the gigantic frameworks and libraries that have grown up around Java, and react by abandoning Java and building something insanely simple in another language.

I wish just one of those guys would have instead tried abandoning the bloated frameworks and built something insanely simple in Java itself.


Java does NOT perform anywhere close to as efficiently as C/C++. You might be able to get message transmissions to take the same time, but the Java environment will undoubtedly take more system resources. The same happens with any throw-the-kitchen-sink-of-libraries-at-it interpreted language: .NET, Java, Ruby. In my experience Perl runs faster than those three, but managers have been led to believe that Perl has a slower time to market, so it's slipping from the mainstream. The closer you get to a stripped-down, just-what-you-need, compiled language, the faster the code will execute and the fewer system resources it will take.

The big issue with language decisions these days is that they tend to be driven by perceived market value. People are the most expensive cost to most businesses these days. So the marketing battle between languages focuses less on performance and more on how experienced and expensive your developers need to be. What I see being missed with this marketing is that by lowering the people quality and marginalizing your language and code quality, you are setting yourself up for maintenance, improvement, and performance costs down the road.

Never trust the assertions of people who say "undoubtedly", "obviously", "just plain common sense", etc.

A lot of "undoubtable" things are wrong. Especially when they're based on how things "should" be. Doubt. Measure.

For simple applications that don't deal with a lot of dynamic memory, Java is almost as fast as C/C++. Unfortunately, trading systems are not such - I know because I worked in that domain for some time, in the 1980's and again in the 2000's. Java's biggest issue is garbage collection - it is decidedly non-deterministic. I.e., it can have SERIOUS impacts on performance, and at times that cannot be predetermined. In my experience (30+ years), if you need consistency and speed, then you do not select Java for your environment - and I do a LOT of Java development. I just don't use it when I need the software to have a small footprint, be fast, and stay out of the way of other system processes.

In my experience (30+ years), if you need consistency and speed, then you do not select Java for your environment

You have 30+ years of Java experience? Wow. BTW, if you do have a lot of Java development under your belt, how does Zing compare to other solutions in your application domain? Still inconsistent and slow? (I mean, the GC implementations started converging to what can be described as "good performance" only in the last few years, so unless you focus solely on the last few years, of course your experience is going to be horrible on the average.)

This struck me as well. In the original article, they put a bound of 100 milliseconds as counting as "low latency" in their book, but Wikipedia's article on algorithmic stock trading puts it in the microseconds instead:

"Low-latency traders depend on ultra-low latency networks. They profit by providing information, such as competing bids and offers, to their algorithms microseconds faster than their competitors."

Uh, since around 1.3 the JIT optimization for Java has led to blindingly fast code, containing optimizations which are not even available to C++. Dynamic compilation allows branch prediction to be more accurate; unlike malloc, the GC knows where to look for free memory and returns it from the last bit of memory you just requested; and if you know where pointers are pointing at compile time, you can put them in registers. C++ and other statically compiled languages don't have this information, so those values are stored in cache, but the JIT can acquire this information at run time and store them in a register. It's the difference between a register-to-register test and a read from memory.

There are tons of other things like this. I don't have them committed to memory, and compiler technology is not my thing, but if you look around you'll see that GC and JIT are theoretical advantages in terms of speed, and those advantages are being realized. The JIT can even figure out what chip it's running on and optimize the code for that chip.

The "Java is slow" meme is left over from 1995, before there was even HotSpot.

Not bashing any other language here. C# could also avail itself of these advantages.

Java continued to carry the stigma of being slow, and rightfully so, because it was horrendously slow for desktop use until Project Mustang (Java 6). Java 6 marked a point where Java not only no longer sucked for desktop application use, but actually became quite good at it.

Printing went from completely, unusably slow to blazingly fast. GUI components went from being painfully slow and difficult to program to some of the fastest and most flexible GUI components I've used on any platform.

All that is nice theory, but in practice C still runs faster, for various reasons. The problem you have is that you're only looking at facts that support your position. That is the problem with the world.

Consider that every single array access in Java is checked, for starters.
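The checked-access point is easy to demonstrate: out-of-range indexing in Java raises an exception rather than reading arbitrary memory. A tiny sketch (the `readOrDefault` helper is hypothetical):

```java
// Every array access in Java is bounds-checked at run time. The JIT can
// eliminate checks it proves redundant, but an out-of-range access always
// throws rather than reading whatever happens to sit past the array.
public class Bounds {
    static int readOrDefault(int[] a, int i, int dflt) {
        try {
            return a[i];                       // checked access
        } catch (ArrayIndexOutOfBoundsException e) {
            return dflt;                       // C would have read garbage (or crashed) here
        }
    }
}
```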

Switching to Java should yield similar results to C++. What matters is whether the programmer understands the memory architecture of the run-time environment well enough to not have issues. Generally, you'll find that even the best programmers in either language will overlook things like garbage collectors and memory fragmentation issues. It's a time-to-market thing. When working with large dynamic data sets, it doesn't matter if you're using Java or C++, the developer needs to be able to adapt their code to perform well on the system.

Code written without considering the processing time of memory management will probably work much better in C++ than in Java. That said, for huge data sets Java could perform better, since the memory itself is location-independent and it is highly probable that you'd gain a great deal of performance from being able to defragment memory. Consider, however, that the garbage collector and the defragmenter will run at unpredictable times, which can cause multi-millisecond hiccoughs during processing.

I recommend if you take this route, you hire a compiler geek to work on staff optimizing the memory operations.

To me this is not a case of Java being fast enough being good enough. Keep in mind these are people who are building their own microwave towers from one exchange to another to shave off microseconds. But also keep in mind that this is the era of big data. So you now have a situation where you have to process insanely large amounts of data as near to real time as possible in order to make trades "Now!"

You lost all credibility as soon as you used Eclipse as an example. I use Eclipse almost every day. It does exhibit some of the behaviors you say. BUT, and this is a very large BUT, Eclipse is a desktop application, running on whatever you happen to be running, with absolutely NO tuning of the JVM. Applications like HFT are NOT running on some random desktop, they are running in servers with sufficient resources and, more importantly, proper tuning of the JVM to meet their needs.

There is no reason Java can not run just as fast and predictably as C/C++, given people who know what they are doing.

Yes, yes there is. Well, more than one actually. Here we go again:

(1) Unbounded stop-the-world pauses in the JVM causing thread stalls (GC being the canonical example).
(2) Other stop-the-worlds in the JVM due to code profiling causing a newly compiled version to be swapped in (maybe they've switched to atomics here, I'm not sure).
(3) Java's internal profiling and housekeeping absorbs CPU cycles.
(4) Everything is allocated on the heap. If you have an array of objects, it's an array of pointers to things in the heap.

I'm not a Java user, so I've never directly tuned for things like GC, nor do I interact with it directly. Warm up is a different story.

I interact with quite a few exchanges (over all kinds of protocols). Most are, unsurprisingly, written in Java. Almost all of them perform terribly at the beginning of the week. The issue is a standard one: the JVM hasn't JITted important code paths, and it won't until several thousand requests come in. For a standard throughput-oriented program, this doesn't matter -- the total time wasted running interpreted code is small. For a low-latency network service, it's a different story entirely: all of this wasted time happens exactly when it matters.

The standard fix seems to be to write apps to exercise their own critical paths at startup. This is *hard* when dealing with front-end code on the edge of the system you control. Even when it's easy, it's still something you have to do in Java that is entirely irrelevant in compiled languages.
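When it is easy, the warm-up fix is little more than a loop that drives the hot path before going live — a sketch with hypothetical names (`handleOrder` stands in for the real critical path, and the iteration count is illustrative):

```java
// Warm-up sketch: drive the critical path enough times that HotSpot's JIT
// compiles it before real traffic arrives, instead of during Monday's open.
public class Warmup {
    static long handleOrder(long price, long qty) {
        // stand-in for the real critical path (parse, risk-check, route)
        return price * qty;
    }

    static long warmUp(int iterations) {
        long sink = 0;
        for (int i = 0; i < iterations; i++) {
            sink += handleOrder(i, i + 1);  // keep a live result so the call isn't dead code
        }
        return sink;
    }

    public static void main(String[] args) {
        warmUp(20_000);                      // run before connecting to the exchange
        System.out.println("warm");
    }
}
```

The hard part, as noted above, is making the warm-up traffic representative enough that the JIT compiles the same paths real traffic will take.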

If JVMs some day allow you to export a profile of what code is hot and re-import it at startup, this problem may be solved. Until then, low-latency programmers should weigh the faster development time of Java with the time spent trying to solve annoying problems like warm-up.

I built a real-time interface to some hardware in Smalltalk 20 years ago! (Yikes!) Since Java and Smalltalk have the same GC theory (at least, simplistically), some of my experiences should translate. I also spent five years running high-performance, low(ish)-latency stuff in Java. The simple rules of low-latency Java (or Smalltalk) are:
- Create less garbage. Don't tell me that the GC is an issue when you are creating huge amounts of garbage. If your application is running in a tight loop (small amount of code
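The "create less garbage" rule usually means reusing preallocated buffers instead of allocating per event. A sketch (the `Codec` class and its method are hypothetical; non-negative prices only):

```java
// "Create less garbage": reuse one preallocated buffer per event instead of
// allocating a new byte[] for every message, so the steady state allocates nothing.
public class Codec {
    private final byte[] scratch = new byte[256];  // allocated once, up front

    // Encodes a non-negative price as ASCII digits into the shared scratch
    // buffer; returns the number of bytes written.
    public int encodePrice(long price) {
        int n = 0;
        do {
            scratch[n++] = (byte) ('0' + (price % 10));
            price /= 10;
        } while (price > 0);
        // reverse in place so digits come out most-significant first
        for (int i = 0, j = n - 1; i < j; i++, j--) {
            byte t = scratch[i]; scratch[i] = scratch[j]; scratch[j] = t;
        }
        return n;
    }

    public byte[] buffer() { return scratch; }
}
```

The trade-off is the usual one: the buffer is not thread-safe and must be consumed before the next call, which is exactly the kind of discipline GC normally spares you.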

It's a good thing that these days computers have plenty of memory. It's not quite yet true on smaller devices such as smartphones, but it will be soon. With each year, memory use is becoming less of a problem and how to take advantage of parallel processing is becoming more of a problem.

Traders are typically working with monster servers outfitted with over a hundred gigabytes of RAM, not tiny desktop workstations that need to swap just to move the mouse. I won't say that memory usage is no object, but there is almost no reason not to throw extra memory at a process if it wants it.

Your trading engine runs in Java and leaks four gigabytes an hour? No problem. Just give it 64G of heap and do half an hour of garbage collection after the market closes. Is that not enough? Okay, give it more. Don't have that much available? Get more. Can't afford it? Now you're just pulling my leg. Buying extra memory is cheaper than debugging a live system where any slip-up could cost you thousands of dollars in missed trades or penalties.

The only reason Java is slower than C is that C can have unchecked arrays, or low-level access to CPU registers or vector SIMD.

Given the same code written in both C and Java, with range bounds checking added to the C (to make it as safe as Java), the speed will be the same. Or quite possibly Java would be quicker, once the JVM starts stripping away the checks after it realises there is no possibility of bounds being breached.

Java is slow because it is Garbage-Collected, not because it runs in a Virtual Machine.

Memory usage is more important than the virtual machine for performance for anything more complex than calculating Fibonacci numbers, as it affects hard drive swapping and cache misses. That's what is making Java programs **feel** slow. The hard disk is the bottleneck, not the CPU.

RAM is cheap, and you have control over the max heap size on your java invocation, so I don't know why you think disk swapping is a problem. Not only that, but garbage collection can actually speed you up in some cases: if you can defer memory management from a time when it would slow you down to a time when you're sitting on free time anyway (waiting for I/O, etc.), you are actually sometimes better off.

This. I've actually personally witnessed a case where a Java test application outperformed a (roughly) equivalent C++ application by a startling amount. The two applications were more of a benchmark program than a real app, since they only tested the timing characteristics of allocating many thousands of objects on the heap, discarding some of them, allocating more, then discarding more, and so on. The C++ version used the default new and delete operators, while the Java version just used the normal new keyword and left the discarding of old objects to the GC. The Java version outperformed the C++ version by more than a factor of 10.

Altering the C++ version so that it used custom new and delete operators for the objects, pooling unused objects instead of always returning them to the heap as the default delete operator did, the C++ version sped up considerably, outperforming the Java version only slightly over a period of about one billion allocations.

In consideration of this experience, I think that one is left with answering the question for themselves of whether spending the extra time to specifically optimize a C++ program is truly worth the marginally improved performance. Sometimes it might be... but sometimes it won't.

In consideration of this experience, I think that one is left with answering the question for themselves of whether spending the extra time to specifically optimize a C++ program is truly worth the marginally improved performance.

Probably not in general, if the performance gain is marginal, though you've missed one very large aspect of C++: idiomatically, small temporary objects are allocated on the stack, not the heap, and that's completely free.

Cheap RAM means you can run bigger and faster C++ apps too, you know, not just Java. C/C++ will beat Java in almost every test that counts, because skilled C/C++ developers can write code that handles memory better than most GCs.

Yes, Java can be compiled. It can also be run interpreted. Either way, it runs in a virtual machine, which in and of itself adds another layer to the stack and slows the software down; however small the difference may be, it will be there.

The only reason Java is slower than C is that C can have unchecked arrays, or low-level access to CPU registers or vector SIMD.

Given the same code written in both C and Java, with range bounds checking added to the C (to make it as safe as Java), the speed will be the same. Or quite possibly Java would be quicker, once the JVM starts stripping away the checks after it realises there is no possibility of bounds being breached.

And yet in embedded environments the advice when using Java is to write the software in such a way that the GC is never invoked, because it causes major performance penalties at unexpected times.

The only reason Java is slower than C is that C can have unchecked arrays, or low-level access to CPU registers or vector SIMD.

That and C and C++ have faster allocation and deallocation of temporaries. And they have no runtime specification of what's going on, so the optimizer is allowed to do more.

Specifically, C/C++ (one of the few times such a phrase is meaningful) require the programmer to specify stack allocation or heap allocation. Stack-allocated objects are completely free: at the point of function call and return, the compiler simply uses a different size for the stack-pointer increment/decrement.

Java has an excellent heap allocator (very fast) and can do good escape analysis, but very fast is always slower than free, and escape analysis is never as good as by-hand specification.

In terms of the optimizer, Java specifies all sorts of things about how stack traces from exceptions must be accessible, and so on. C/C++ don't specify anything about those. As a result, the optimizer is free to remove more stuff in C/C++ than in Java. If you compile with -O3, it's not uncommon to find the debugger thinks you're in a completely different place than seems to make sense.

Or, quite possibly Java would be quicker once the JVM starts stripping away the checks

The thing is, java trades some safety and tighter specification for some performance loss. C and C++ allow the programmer to specify more. While the JVM is very good at removing a lot of stuff, it's always fighting an uphill battle to remove stuff that simply isn't there in C and C++.

I'm not bashing Java. It's an OK language (not my favourite, not my least favourite), and there are many others, including much more fun ones, for the VM. It does decent stuff and I often use Java programs (Minecraft and ImageJ are particularly high on the list).

Many JVM implementations can make use of SIMD at run time because they know what they're running on. Additionally, in many algorithms, the pointer indirection cost in C is higher than the bounds checking cost in Java (because pointers make life hard on optimizing compilers, and HotSpot doesn't have that disadvantage). Cache coherency also plays a big role. See the link in my reply to parent comment.

Those extra cycles to make the code safer will cost an HFT firm tens of thousands of dollars. C is a better choice for these people *because* it allows them to cut corners. I mean, who cares if a bug in your program crashes the exchange when the exchange will just let you reverse the trades?

More important than the actual runtime environment is the fact that in any networked application that processes lots of data, _latency_ is the bottleneck, not the raw performance of a well-implemented algorithm. The latency between servers, between RAM and the CPU, and even between L3 and L1 (hell, even from L1 to the registers) will have a larger impact on the overall performance than the actual language used. Round trips to and from memory (either local or remote) are what kills performance for most applications.

Round trips occur every time you call a function/method and the stack has to be saved.

Is this always true? Can't the compiler unroll that, or the CPU store the stack info somewhere fast?

I personally find it weird that many systems still push parameters onto the stack and then call the function. That's mixing code and data. Very unhygienic. Security problems and potentially poorer performance.

Wrong. This hasn't been true for more than 15 years, since about 1996. It's a JIT-compiled language, which means it has a slow startup time, but once the VM is warm it can actually outperform C on numerical code. Really [scribblethink.org].

And you'll notice even on your linked counterexample there are 5/11 examples where Java is within the margin of error on CPU time. Take the time to go back and read the link I posted. "Java vs C" goes back and forth depending on the algorithm in question and the cache characteristics of the target platform.

And you'll notice even on your linked counterexample there are 5/11 examples where Java is within the margin of error on CPU time.

Yeah and in the rest C++ is quite definitively faster. In other words as I claimed, C++ is either the fastest or within a small percentage, across the board. There are precious few examples where C or C++ is substantially outperformed by another language (and in almost all of those precious few examples, it's Fortran) and plenty of examples against just about any other languages where it is a clear winner.

Take the time to go back and read the link I posted. "Java vs C" goes back and forth depending on the algorithm in question and the cache characteristics of the target platform.

I did, though the one I posted has much more recent results and no dead links. All the benchmarks are similar: in some cases C++ is slightly outperformed by Java, in other cases C++ substantially outperforms Java.

Mostly it doesn't matter.

But when people routinely benchmark against Java rather than C++, then I'll accept Java as the faster language.

A run-time interpreted language that, in real-life JVM implementations, is a run-time profiled, optimised and compiled language. That can produce better-optimised code than any compile-time optimiser could, because it's optimising against the actual workload instead of guessing what the workload might be and compromising across many possibilities.
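The warm-up behaviour is easy to see in a toy micro-benchmark (a sketch; the class, method name and iteration counts are mine, and the exact timings depend entirely on the JVM and hardware):

```java
public class WarmupDemo {
    // A small hot method; after enough calls HotSpot will JIT-compile it.
    static long sumOfSquares(int n) {
        long s = 0;
        for (int i = 0; i < n; i++) s += (long) i * i;
        return s;
    }

    public static void main(String[] args) {
        // Early rounds run interpreted (slow); later rounds run the
        // profiled, compiled version (fast).
        for (int round = 0; round < 5; round++) {
            long t0 = System.nanoTime();
            long r = sumOfSquares(1_000_000);
            long t1 = System.nanoTime();
            System.out.printf("round %d: %d us (result %d)%n",
                              round, (t1 - t0) / 1_000, r);
        }
    }
}
```

Run it and the per-round times typically drop by an order of magnitude after the first round or two, which is exactly the "warm VM" effect being discussed.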

Java is a shitty language with many nasty aspects, you managed to pick the one thing it can get very right. Not much of a programmer then.

FYI, the author of that quote is the chief engineer of a C system on which enterprise low-latency trading systems are built. So when he says it, I would tend to give it more than a passing thought.

Those items quoted have indeed recently been shown to bring performance close to similarly developed C++ systems. Though you are right that the layer of indirection will always mean overhead, if you are not working at a low enough level that hard real time is your goal, that indirection is what allows your code to be run anywhere the JVM runs.

Java is far, far safer than C++. C++ does not enforce type safety at all. For example, in Java you cannot possibly have a buffer overrun or access freed memory as you can in C++. I think most of the security notices are about C and C++ programs, not Java programs. I think you're referring to the Java runtime, which is written in, you guessed it, C.
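A minimal sketch of the difference (class and variable names are mine): the same out-of-bounds write that is undefined behaviour in C/C++ is a defined, catchable exception in Java:

```java
public class BoundsDemo {
    // Returns true if the runtime caught the out-of-bounds write.
    static boolean outOfBoundsIsCaught() {
        int[] buf = new int[4];
        try {
            buf[8] = 42; // in C/C++ this could silently corrupt memory (UB)
            return false;
        } catch (ArrayIndexOutOfBoundsException e) {
            return true; // the JVM bounds-checks every array access
        }
    }

    public static void main(String[] args) {
        System.out.println("caught: " + outOfBoundsIsCaught());
    }
}
```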

For applets in a web-browser. Running a Java server is a different beast with different concerns.

The issue wasn't simply related to "applets for a web-browser". The reasons the Java RTE/VMs were pulled was due to how much it affected nearly every enterprise level use of Java in the data center - affecting entire classes of Java programs instead of having program specific attack vectors.

The article you just linked to is describing... wait for it... flaws in the security sandbox, which is used for browser plugins. A standard JRE app is not sandboxed, unless you're using a custom ClassLoader for plugins or weird stuff like that. Can you give an example of a datacenter Java application where this would be relevant, and how it would be exploited? Why a typical enterprise server app would be running user-supplied .class files is beyond me.

Java is far, far safer than C++. C++ does not enforce type safety at all. For example, in Java you cannot possibly have a buffer overrun or access freed memory as you can in C++. I think most of the security notices are about C and C++ programs, not Java programs. I think you're referring to the Java runtime, which is written in, you guessed it, C.

Yet it is Java that has had its run-time environments pulled for security concerns.

In perspective, the RTEs have been pulled because they had flaws that enabled exploit code to be run. In C, the RTE lets ANY code be run, including exploits, so what's really happening is that Java is falling back to something closer to C levels of runtime security.

The reason for the concern about Java exploitability is that while most sane people have long since given up on download-and-run C code (ActiveX), Java applets (while comparatively rare) have not had exploit concerns until fairly recently. Because until recently, Java's sandbox was considered trustworthy. C/C++ doesn't have a sandbox.

There are various ways of sandboxing native code. On the heavyweight end you have something like a hardware assist, i.e. a hardware VM. Then there are things that go all the way down: software VMs, jails, sub-kernels, dynamic binary translation (like bochs, valgrind), down to basic kernel-based features like using rlimit and running as a non-privileged user, or starting the process in an SELinux/AppArmor context with no access to anything.

Java is far, far safer than C++. C++ does not enforce type safety at all. For example, in Java you cannot possibly have a buffer overrun or access freed memory as you can in C++. I think most of the security notices are about C and C++ programs, not Java programs. I think you're referring to the Java runtime, which is written in, you guessed it, C.

Yet it is Java that has had its run-time environments pulled for security concerns.

In perspective, the RTEs have been pulled because they had flaws that enabled exploit code to be run. In C, the RTE lets ANY code be run, including exploits, so what's really happening is that Java is falling back to something closer to C levels of runtime security.

The Java Run-Times were pulled because they were allowing exploits that were not necessarily related to the program being run.
Comparatively, C/C++ programs are generally only susceptible to the flaws in the actual programs, and flaws in their support libraries are only exposed as much as the program allows it to be.

The reason for the concern about Java exploitability is that while most sane people have long since given up on download-and-run C code (ActiveX), Java applets (while comparatively rare) have not had exploit concerns until fairly recently. Because until recently, Java's sandbox was considered trustworthy. C/C++ doesn't have a sandbox.

C/C++ doesn't have a native, built-in sandboxing mechanism. But they can most certainly be sandboxed whether via hardware or software mechanisms.

Well, let's think about the equivalent of pulling the Java run-time environment for C/C++. That's right, you'd have to pull your operating system. Do you see how that's not even close to a reasonable comparison?

True, you'd have to pull the OS for C/C++ when the OS is providing the runtime dynamically. C/C++ programs (and any native-code program) can link the RTE statically, in which case updating the version in the OS has zero impact on the actual program; it still uses the old version. That said, the C/C++ RTE does not in itself introduce security flaws into the programs that use it, which is why the Java RTE was pulled and the C/C++ one was not.

Not only that, but you're not even talking about related types of safety. The Java runtime keeps getting in trouble for poorly handling malicious third party code. If you are writing a java program yourself, it is immensely safer for the reasons GP listed, and you probably aren't in the business of attacking yourself with malware.

Yes, Java keeps getting in trouble for allowing additional code to interface with a program and extend the program.

Java is the only programming language I have run across in common usage that incorporates a deprecation mechanism.

When code becomes obsolete, you can tag it as deprecated. It will produce warnings when compiled, and IDEs and javadocs will highlight it as deprecated, but it continues to be usable. That means you can delete the code at your leisure, instead of being forced to confront (and fix or bypass) broken code while you're doing a completely unrelated emergency repair.
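Concretely, the mechanism is the @Deprecated annotation plus the @deprecated javadoc tag (the class and method names here are made up for illustration):

```java
public class LegacyApi {
    /**
     * @deprecated superseded by {@link #newMethod()}; will be removed later.
     */
    @Deprecated
    public static int oldMethod() {
        // Still works: deprecation only warns, it does not break callers.
        return newMethod();
    }

    public static int newMethod() {
        return 42;
    }

    public static void main(String[] args) {
        // Compiling a caller of oldMethod() produces a deprecation warning,
        // but the call itself runs normally.
        System.out.println(oldMethod());
    }
}
```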

Gigabytes of RAM use isn't inherent to Java as such; it's what you get if you pull in gigabytes of libs.

Anyhow, maybe they should look for guys who used to write J2ME programs, because you can write Java so that you don't trash the GC stupidly, and get away with pretty nice things in 300 kbytes of heap.
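The J2ME-style discipline the parent is describing (not trashing the GC) mostly means reusing objects on the hot path instead of allocating one per event. A sketch, with all names mine:

```java
// Reuse one mutable scratch object per feed handler instead of allocating
// a new event object per tick, so the hot path generates no garbage.
final class Tick {
    long price; // fixed-point, e.g. price * 10_000
    long qty;

    void set(long price, long qty) {
        this.price = price;
        this.qty = qty;
    }
}

public class NoAllocLoop {
    private final Tick scratch = new Tick(); // allocated once, reused forever

    // Sums price*qty over a feed without a single allocation in the loop.
    long notional(long[] prices, long[] qtys) {
        long total = 0;
        for (int i = 0; i < prices.length; i++) {
            scratch.set(prices[i], qtys[i]);
            total += scratch.price * scratch.qty;
        }
        return total;
    }

    public static void main(String[] args) {
        NoAllocLoop loop = new NoAllocLoop();
        System.out.println(loop.notional(new long[]{2, 3}, new long[]{4, 5}));
    }
}
```

With no per-event allocation, the young generation stays empty on the hot path and the GC has nothing to do, which is exactly why this style survives in tiny heaps.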