EDIT: This question at first seems to be bashing Java, and I guess at this point it is a bit. However, the bigger point I am trying to make is: why is any one single language chosen as the end-all-be-all solution to all problems? Java happens to be the one in use, so it's the one I have to beat on here, but I'm not intentionally ripping Java a new one :)

I don't like Java in most academic settings. I'm not saying the language itself is bad -- it has several extremely desirable aspects, most importantly the ability to run without recompilation on almost any platform. There's nothing wrong with using the language for Your Next App ^TM. (Not something I would personally do, but that's more because I have less experience with it than because its design is poor.)

I think it is a waste that high-level CS courses are taught using Java as a language. Too many of my co-students cannot program worth a damn, because they don't know how to work in a non-garbage-collected world. They don't fundamentally understand the machines they are programming for. Someone who can work outside of a garbage-collected world can work inside of one, but not vice versa. GC is a tool, not a crutch -- but the way it is used to teach computer science students is as a crutch.

Computer science should not teach an entire suite of courses tailored to a single language. Students leave with the idea that all good design is idiomatic Java design, and that Object Oriented Design is the ONE TRUE WAY THAT IS THE ONLY WAY THINGS CAN BE DONE. Other languages -- at least one of them not garbage collected -- should be used in teaching, in order to give graduates a better understanding of the machines.

It is an embarrassment that somebody with a PhD in CS from a respected institution cannot program their way out of a paper bag.

What's worse is that when I talk to those CS professors who actually do understand how things operate, they share these feelings: that we're doing a disservice to our students by doing everything in Java. (Note that the above would be the same if I replaced Java with any other language; using a single language in general is the problem, not Java itself.)

All told, I feel I can no longer respect any kind of degree at all when those around me can't program their way out of FizzBuzz problems.


9

Don't worry. Give Oracle half a chance and they will remedy the situation:) Their lawsuits against Android and Apache are just the beginning of driving developers away. I teach at an institution that has taken the above as a cue to start teaching Python, in anticipation of the future market.
–
SamGoody, Nov 2 '10 at 10:55

3

What academic settings are you referring to? I don't offhand know anywhere where CSci students are expected to learn only one language. (Last I was in school, you could have gotten by with C and Scheme, I think, if you picked your major and classes carefully, but you'd be using at least two dissimilar languages.)
–
David Thornley, Nov 2 '10 at 13:57

16

Can you explain how your L1 cache works? Can you design an ALU? Can you explain Maxwell's equations? What about the thermodynamic properties of your motherboard? Could you build a USB drive? Do you REALLY know how your computer works? Or did you pick a level of abstraction appropriate for your interests and go from there?
–
MetricSystem, Feb 7 '11 at 19:52

5

Because deciding that you need to understand memory management to "fundamentally understand the machines they are programming for" is a completely arbitrary level to stop at. If it's not necessary to understand the problems you're interested in, your time is often better spent studying what you are interested in (since nobody has enough time to learn everything). And any argument you present as to why Java programmers need to learn memory management could be extended infinitely until everybody needs to know atomic physics to "understand" anything.
–
MetricSystem, Feb 7 '11 at 22:32

7

Furthermore if anyone has a straw man argument here it's you, given that I've never heard of a single CS program which only presents Java as a programming language.
–
MetricSystem, Feb 7 '11 at 23:38

16 Answers
16

This isn't a Java problem, it's a teaching problem. Not knowing how to program is not the language's fault; it's the student's fault. The same goes for all your issues: GC, basic machine knowledge, how things work "under the hood," so to speak.

Your major gripe about garbage collection throws me slightly. Unless you're doing C or C++, garbage collection is typically very good and not an issue. Would you rather they all learn assembly? Strict higher-level languages are very useful for teaching. Java gives you the flexibility of libraries, packages, and other niceties when you need them, without the confusing language "sugar" present in most other higher-level languages (PHP, Ruby, Python, Perl).

I'm not saying that the higher level languages should not be taught. But I am saying that something lower level (on the order of C or Fortran or some other systems programming language) should be taught as well. (Going lower than C is less useful because below that level anything you learn there is going to be specific to a single machine architecture). I'm not saying that all CS grads should be great low level programmers. But I am saying that if confronted with a low level problem, they should not sputter and die, as all the students around me seem to do.
–
Billy ONeal, Nov 2 '10 at 3:00

3

@Billy: Going lower than your initial learning level is hard. I find myself looking at C++ / C code and having a bit of a fun time wrapping my head around it. Don't expect miracles, but don't expect angels either. Programmers need to have drive; lacking that, they will all "sputter and die" quite quickly no matter what level the problem is.
–
Josh K, Nov 2 '10 at 3:04

5

@Josh: Fair enough, but IMHO formal exposure to at least one other programming environment should be a necessity to be given a CS degree. The degree doesn't say "I understand Java", it says, "I understand CS". The two are not the same, and you cannot achieve a full understanding of CS using Java (or for that matter, any language) alone. (BTW, +1)
–
Billy ONeal, Nov 2 '10 at 3:13

2

@JoshK: An inability to debug at low levels, and a greatly reduced ability to properly reason about the implications of the code they write.
–
Mason Wheeler, Jul 13 '12 at 2:38

1

I know programmers who only learnt Java and are scared by C++ or C because they have to use pointers. I do not know if someone who cannot program with pointers can be considered a real programmer: after all, all modern computers and runtimes are still based on a basic architecture involving registers, main memory, stack, heap. So I prefer to program in Java rather than in C++ (it is definitely easier to get things done if you do not have to bother about memory management) but I agree that teaching should definitely not be restricted to Java.
–
Giorgio, Jul 13 '12 at 8:40

Are universities really doing this? Or is this just a perception people get when they get a bad job interview candidate?

I got my CS degree 4 years ago, and while Java was used in the intro classes, it was not particularly emphasized after that. Off the top of my head, the core courses required you to learn Java, C, C++, SML (a functional programming language), assembly, MATLAB and CAST (a circuit description language). This doesn't take into account all the math and theoretical CS courses, or the half dozen elective CS courses that would have exposed you to a whole bunch more. From talking with friends at a number of different universities, it didn't sound like what they were doing was much different.

If there are universities solely teaching Java, then that is indeed a tragedy. Not because Java is bad, but because it is just one tool of many that should be available to a programmer. A developer who only knows Java is like a carpenter who only knows how to use a hammer. However, I have trouble believing that's really the case, at least for respected universities. I suspect this is just the perception from seeing bad candidates and blaming the school for their incompetence, rather than the candidates' own lack of drive/motivation/curiosity/professionalism/etc.

Agreed. I just graduated a few years ago, and while most of my classes were taught using Java, it was hard to graduate without taking a class that was taught using C or C++, and we also had a programming language course that emphasized the use of Haskell.
–
mipadi, Nov 2 '10 at 15:01

I'm really surprised this is the situation at many schools. When I got my CS degree, I had a series of language courses that between them included all the popular languages of the day (this was almost 40 years ago): assembler (Univac 1108 and MIX), FORTRAN IV, SNOBOL4, COBOL, Lisp, and Algol 68. I can't imagine having just one language taught. Or how I could appreciate what compilers had to do without having written some assembler programs.

We did use Algol and FORTRAN for most of the data and algorithms courses. I would have liked to spend more time with Lisp -- instead we had just an introduction, and now I am trying to re-learn some of the concepts. We used SNOBOL4 for writing parsers -- and its pattern matching with backtracking built a good foundation for later work with the regex libraries included with many modern languages.

This introduction to half a dozen languages sparked an interest in me that has continued ever since.

That's what the industry asked for. There was a shortage of Java developers, so schools started creating them. Companies got developers who could start work as soon as they walked through the door.

Anyway, the language doesn't matter. They could teach a non-GC language and still produce bad programmers. The language schools use is an implementation detail; schools should teach concepts. If they don't, then the school is the problem, not the language it teaches with. Just because someone doesn't know what a pointer is doesn't mean they can't be an effective programmer.

You don't need to know what a pointer is. But you need to be able to pick up the concept relatively quickly. Not being able to understand pointers, or concepts similar to pointers, indicates a lack of understanding of computer science as a whole. I'm not saying all CS grads need to be awesome low-level programmers, but I am saying they should at least have a conceptual understanding of what's going on when they receive a 4 year university degree.
–
Billy ONeal, Nov 2 '10 at 3:05

@Javier: How is CS not about programming? Sure there's that branch of CS, "Theoretical Computer Science", which is not about programming, but for the most part it is about programming.
–
Billy ONeal, Nov 2 '10 at 15:20

1

@Billy ONeal: The Wikipedia definition is a little long, but the part that gets closest to programming is "the study of practical techniques for their implementation". That is, it's not about the implementation (programming) but about studying the implementation techniques (algorithms, structures, languages, etc.).
–
Javier, Nov 2 '10 at 17:28

1

@Javier: I think "the study of programming" and "the action of programming" are one and the same. After all, code is read much more often than it is written. But that's beside the point -- when you receive a CS degree, the thing you're going to end up doing is programming.
–
Billy ONeal, Nov 2 '10 at 18:06

Laziness and profit motive. Java gets most people 80% of the way, with 20% of the effort. However, this often produces "monkey see monkey do" voodoo coders who don't have deep understanding of what is happening, and are unable to dig deeper than high-level tools.

Teaching C/C++ properly is difficult. Even when done well, quite a few won't fully understand low-level concepts like pointers (there are studies on this). Moreover, it does not seem immediately relevant in the job market.

If you are teaching a CS course focused on data structures or algorithms, getting most of the unnecessary stuff out of the way is a good thing. You don't want students to deal with low-level details when learning high-level concepts.

On the other hand, low-level material should also be taught, if only so we get more well-rounded programmers.

As always, it is a question of what you want to teach and how much time you have to do it in.

Doing things in Java means that you don't have to deal with stray pointers and free/malloc. That means quite fewer hard-to-find bugs.

This doesn't mean that you shouldn't learn about the underlying machine - which you must to be good at the trade - but that perhaps you should choose the right tool for the job. High-level algorithms are perhaps not the best topic for the assembler course.

Is it too much to ask that they be taught something about how computers really work? It makes no sense to me that any CS course outline would not include something about operating systems and computer architecture. I'm not saying you should be able to design a computer on the back of an envelope, but a general overview puts all languages into perspective. I mean, Java runs in a virtual machine!

In all honesty, I do not think you should lose respect for all degrees, and especially not CS degrees, just because the only language taught is Java.

Any decent comp. sci student understands that his CS degree is nothing more than a piece of paper and that real knowledge will come from personal projects and personal study. I would like to think that the good students know they need to know C and learn how to get things done in Linux / Unix and learn how to NOT have their hand held by modern languages.

I don't see a problem with using Java. The biggest problem (as you mentioned) is when universities teach imperative programming as the best and only way to solve all your problems. This is very far from the truth. For example, instead of using three imperative languages (or, even worse, the same language) for three different courses, they should use three different programming paradigms: say, Java (imperative), Prolog (declarative), and Haskell (functional).

As for why this happens, I think it's a combination of (1) politics - chairmen are not open minded or have personal benefits/sponsors, and (2) laziness - professors were once students and they teach whatever they know or like.

Calvin College in Grand Rapids, MI. Once they switched from Java to Python for introductory programming classes, the professor, in disbelief, mentioned to a group of us near the end of the semester: "A lot of them can actually program; it's incredible." His amazement was in contrast to his previous experiences with Java. This doesn't make Java a bad language, just one that is hard to pick up from scratch. I would argue the same about C, or C++, or any language that requires intimate knowledge of memory allocation schemes or types.

A few years ago I read somewhere that, in the UK, the Java decision was made to entice more students into programming. Apparently it was easier on the brain because of the lack of nasty pointers and memory management etc... I'd guess that this is the reason why Python is being quickly adopted as the new 'learning tool.' I'd be of the opinion that your typical uni course is not going to make you into a programmer. All it is meant to do is arm you with a certain amount of knowledge and prime you for programming. The day that one can call oneself a programmer generally comes a fair bit of time after uni and when that time comes, pointers, garbage collection and programming paradigms other than OOP, should not be a problem.

I got a degree in computer systems engineering, so I did study assembly, microprocessors, VHDL, what the logic gates are and how to create a flip-flop out of them, I studied electrical circuits, operational amplifiers, low-pass filters, amplitude modulation, built a circuit in a lab that would sound like a telephone keypad, messed with a logic analyzer, took some physics classes. On the Linux side - I recompiled a kernel before, wrote a tiny device driver, and a homework file system.

But then I realized that I like more high-level stuff a lot more. I prefer Python over C. I like C# and I love the garbage collector. I like math and algorithms and data structures, but I am not a particular fan of pointers - I can mess with them; I just do not want to.

So, you think that the university should teach more C, less Java. Someone else thinks that they should teach everything in Scheme. Spolsky will take both.

Yes, it is true that a university degree is merely a piece of paper, but "do not let schooling get in the way of your education". There is always Wikipedia, Stack Overflow, Coursera.org, etc. Looks like the traditional universities will become less important.

If I could do undergrad all over, then I would have gone to a different school, I would not have studied any of the electrical engineering stuff, but rather Math, CS + minor in Physics and French. Scratch French; I should have studied abroad in Switzerland and learned 5 natural languages while at it. I would also go back and find a person who convinced me that I absolutely must take some intro to economics classes in college in order to survive it in the real world and smack him on the head with my collection of Milton Friedman's books (the only few books on the subject of economics that I read which do not suck). I also would not pay money for the book Freakonomics 1 or a book about Perl language.

Having hindsight I would have made 100 billion dollars on the stock market by now, and I would have invested it in cancer research, I would have hired scientists to rid the world of plastic waste; I would have donated enough cash to Ron Paul campaign so that the war in Middle East would be over 3 years ago.

Recognize now that you will never get your time back, so try to spend it wisely.

Java uses C-style syntax. Many programmers are familiar with it, so it reduces the learning curve.

Money was spent to market Java as cool and modern. But compared to Lisp, Java is living in the stone age.

The JVM. The JVM is the one cool thing about Java that is legit, not just marketing hype. Although the JVM should be praised as a platform for running programs, it does not by itself merit the existence of the Java language: you can, theoretically and practically, compile many different languages for the JVM.

Your question supports the findings of another person who wrote a paper saying Java is cool but not heavyweight enough for scientific computing. I think he should just use BigDecimal with all its fine precision, NaNs etc. and get on with it.

I want to live only in an automatically garbage collected world. It's healthier, cleaner, neater, more aesthetic, prevents money wasted on health expenses, and is more desirable. I don't have to tag something as unwanted: simply by dropping the reference, setting it to null, or letting it go out of scope, the item gets automatically tagged for garbage collection.
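To make the "automatically tagged" idea concrete, here is a minimal sketch (class and variable names are illustrative, not from the original post; note that System.gc() is only a hint -- the JVM decides when to actually collect):

```java
public class GcSketch {
    public static void main(String[] args) {
        byte[] buffer = new byte[1024 * 1024]; // reachable through 'buffer'

        buffer = null; // drop the only reference: the array is now unreachable

        // No free() or delete required; the collector reclaims the memory
        // at a point of its own choosing. System.gc() merely suggests it.
        System.gc();

        System.out.println("no manual memory management needed");
    }
}
```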

OK, now I am talking about Java only, but it would be most helpful if this worked in the real garbage scenario too. Java is write once, run anywhere, which is good enough for me for it to be used as a language in schools. No memory leaks, and being liked by happy programmers who don't want to spend time manually tagging every object they use and stop using, is good logic in Java's favor.

s1 and s2 will be pointing to the same String object -- comparison by reference. s1 == s2 and s1.equals(s3) are both true, but s1 == s3 is false because s3 is a new reference object. This is Java's arithmetic of references, and I understand all of this, so sorry, I do not wish to be labeled as someone who does not understand the underlying complexities of pointer arithmetic from C or C++.
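The comparison described above can be reproduced with String literals versus an explicit new String (the actual values are illustrative, since the original snippet was not shown):

```java
public class StringRefs {
    public static void main(String[] args) {
        String s1 = "hello";             // literal: interned in the string pool
        String s2 = "hello";             // same literal: same pooled object
        String s3 = new String("hello"); // 'new' always creates a distinct object

        System.out.println(s1 == s2);      // true:  same reference
        System.out.println(s1.equals(s3)); // true:  same character content
        System.out.println(s1 == s3);      // false: different references
    }
}
```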

References are not pointers. You can't use them arithmetically, and you can't use them to reinterpret a piece of memory they point to. And no matter how much you want to live in a garbage-collected world, the fact of the matter is that the world is not garbage collected. I'm not saying Java is a bad language, or that people shouldn't use Java. If you like it, use it; there are things about it that are great. I just think it's inappropriate as a beginner language because it hides things that are essential for new programmers to understand.
–
Billy ONeal, Jul 13 '12 at 16:51