What's the difference? Are high-level languages more powerful than low-level languages? Are they faster? This came up as I was working through a C tutorial that described C as middle-level, and Java and C++ as high-level.

This is an age-old question, and an eternal debate. It boils down to this:

Low-level languages:
- Very close to the architecture
- Let you do whatever you want
- Theoretically, you can get better performance if you know what you are doing

High-level languages:
- Work on a conceptual level
- Abstract away the physical architecture
- Prevent you from doing stupid things
- Add some overhead

Low-level languages tend to be double-edged swords. On one hand, you can get blazing performance. On the other, any complicated program becomes very hard to maintain, and it becomes so easy to make an error that the advantages are often lost. Any programmer who claims to write error-free code in a low-level language is either lying or delusional.

Generally, you want a mix of both: high-level code for most of the game, coupled with low-level code in critical areas (rendering is often the main candidate).

C++ is a bit of an odd beast (which is part of why it's so popular): it's both a high-level and a low-level language. It gives you very tight control over the architecture while at the same time providing high-level concepts to play with.

Nobody is going to agree on the definitions of these terms; their meanings are not set in stone.

In general parlance, C and C++ at least are "low-level" languages, Python and Ruby are "high-level" languages, and ObjC is an odd beast where a low-level language is embedded in a high-level language.

As with all one-dimensional scales, it's completely inadequate for any real purpose, since "close to the hardware / produces very efficient code" and "easy to program with / concise" are independent properties.

C++ is both harder to program with and further from the hardware than C; where does that put it?

Yes, it's definitely hard to be accurate and consistent about what exactly counts as a high-level language versus a low-level language. Here is how I classify them:

All low-level languages are compiled to run directly on the processor as binary, so I would classify C, C++, and ObjC as low-level. Assembly is technically the definition of low-level, but I don't even consider using it anymore, so nowadays I'd be more comfortable calling it hardware-level.

All scripting languages are high-level: Lua, AppleScript, Ruby, Python, and such. Every scripting language must run within another program, called a virtual machine (itself usually written in a low-level language) or translator.

Even though Java and C# are designed to be syntactically and functionally similar to low-level languages, the fact that they run through a virtual machine classifies them as high-level in my view.

So basically, any language that runs directly as binary on the CPU without the aid of a virtual machine after compilation is low-level. Everything else is high-level.

If you wanted to create a mid-level category, I think it would consist of C++, ObjC, Java, and C# -- all four of which are extremely powerful, industrial-strength stuff, but not as strictly bare-to-the-metal as C. So I guess I'm saying C would be the only low-level language in popular use today, which is why I don't think there's really such a thing as a mid-level language -- you're either running through a virtual machine (or translator) or you're not.

I don't care about the distinction between "runs on physical hardware" and "runs on virtual hardware" (and it can always be blurred; e.g. Quake 3 runs its gameplay code as C on a virtual machine).

For me the definition is *more* about the ease of programming. Languages which are type-safe (either statically or dynamically), languages which are garbage-collected, languages with syntax macros, languages with closures, etc. will rank higher on my scale than languages without those features.

C and C++ are missing everything, so they're low-level; ObjC has dynamic dispatch and now garbage collection, so it's somewhere in the middle; Java is dynamically type-safe, garbage-collected, and nearly has closures, so it's further up. Lisp with CLOS has the lot, I think, putting it very high on the scale.

For me, the "ease" of programming is not the main factor. It's really about the level of abstraction present: the further away from the architecture, the higher the level.

The difference between this and "ease" of use appears when you look at languages that take a drastically different approach. Take PROLOG: it is by no means an easy language to program in. It is, however, very high-level in the sense that it deals with complicated concepts. PROLOG does not even care whether your computer executes operations sequentially.

C and C++ are low-level because notions like bits and sequences of operations are ingrained into them, in such a way that they are tailored to reflect the underlying hardware executing them.

OneSadCookie Wrote:(and it can always be blurred, eg. Quake 3 uses C running on a virtual machine for its gameplay code)

Yeah, but that isn't blurry to me at all, since by my classification scheme it's clearly a high-level language: it runs through a virtual machine, and it doesn't matter that the syntax is C. I wrote my own scripting language that also uses C syntax, but I know darn well it ain't low-level, since I abstracted it away from the hardware. Maybe because I'm the one who wrote its virtual machine, in a low-level language, I just can't see the scripting language itself as low-level? How could I possibly call my scripting language low-level just because its syntax is C? I dunno, that's too much of a stretch for me, which is why I draw the line there.

But as you pointed out, that doesn't make it accurate by someone else's measurement. My method makes it a one-dimensional description, which is perhaps terribly naive, but at least it is simple.

So Java is just as powerful as C? I find it hard to imagine liking Java more than C, because in my limited experience Java is used for online games, and all the Java (freeware) apps I've tried run much slower than programs written in C/C++/Obj-C. Is this true or is it just me?

Hairball183 Wrote:So Java is just as powerful as C? I find it hard to imagine liking Java more than C, because in my limited experience Java is used for online games, and all the Java (freeware) apps I've tried run much slower than programs written in C/C++/Obj-C. Is this true or is it just me?

Well, to be fair, anybody can make a slow program in C too. You have to take the developer into account, among other things (like the virtual machine and support libraries, or the kind of environment it's used in -- a web browser versus a high-efficiency game engine), but yeah, Java is generally slower than C, mostly because it runs inside a virtual machine. Java and C are different languages, each with its good features and bad features. For instance, Java is indeed one of the most powerful languages out there, but that doesn't necessarily make it as powerful as C when it comes to speed, nor does C's speed make it more powerful than Java's ease of programming.

The best all-around advice is to use the right tool for the job. I know of a defense contractor that writes high-end military 3D simulation software in Java because it works well with their very large development team. I don't use Java because it doesn't suit my tastes, but that doesn't mean it's a bad language. If I had to choose between C and C# (which is similar to Java and C++), though, I would choose C# for as much as I could, because I can get a lot more programming done in a shorter period, even though C is faster. Engineering is all about making trade-offs. Oftentimes with programming languages you trade off some speed for more language features that help you finish the project faster. That's the main reason so many of us on the Mac prefer Obj-C over plain C: it trades a very small speed loss for messaging overhead in favor of, well, messaging (among several other things), which makes life real easy compared to C in lots of situations.

AnotherJake Wrote:Yeah, but that isn't blurry to me at all, since by my classification scheme it's clearly a high-level language: it runs through a virtual machine, and it doesn't matter that the syntax is C. I wrote my own scripting language that also uses C syntax, but I know darn well it ain't low-level, since I abstracted it away from the hardware. Maybe because I'm the one who wrote its virtual machine, in a low-level language, I just can't see the scripting language itself as low-level? How could I possibly call my scripting language low-level just because its syntax is C? I dunno, that's too much of a stretch for me, which is why I draw the line there.

But as you pointed out, that doesn't make it accurate by someone else's measurement. My method makes it a one-dimensional description, which is perhaps terribly naive, but at least it is simple.

Hm. Wouldn't that mean any Windows app is low-level when executed on an x86 CPU, but high-level as soon as it runs in Virtual PC on my PowerPC Mac?

Cochrane Wrote:Hm. Wouldn't that mean any Windows app is low-level when executed on an x86 CPU, but high-level as soon as it runs in Virtual PC on my PowerPC Mac?

That isn't at all the context I was thinking in terms of... A virtual machine applied after the fact is just a neat twist on the concept, but it doesn't change the nature of the language the applications were written in. The low-level apps in Windows (running within something like Virtual PC on a PowerPC) were written in a low-level language designed for that purpose and compiled down to binary, not bytecode -- intended to run directly on, and optimized for, the processor, not a virtual machine. So no, they're still low-level.