8 comments:

I was a dork in middle/high school and taught myself to program. I was super excited to go to college to study software engineering. Started off in the accelerated CS classes because of AP exams. The first programming we did, if I recall correctly, was drawing up UML diagrams in Rational Rose. Then we'd use Rose to generate Java code from the UML diagram, then fill it out.

I lasted two quarters there, it almost completely destroyed my love of programming. Took a couple years to recover.

Looking back at it with my "real" programmer mentality, I'm amazed that such a well-respected CS program could get it so damn wrong. Maybe it is what BigCo needs though. Can't have a bunch of bright-eyed kids excited about programming thrown into the depressing world of enterprise coding. Much better to suck out their souls before they realize what's possible.

This is a pretty ignorant post. Paul Graham was talking about "nasty little projects", not college; in fact, in most good CS programs, you learn to write a compiler (which Graham described as a worthwhile project). And you have "heard a rumor" just floating out there and don't even offer us a link?

There are a lot of bad CS programs out there. Those are probably the ones making people write FizzBuzz programs on the midterm. We usually call those places "junior colleges." Good CS programs teach you about data structures and graph theory. Good CS programs make you write a shell and a small operating system from scratch. Good CS programs make you implement compilers, thread libraries, web servers, and journaling filesystems. Good CS programs teach you how to analyze the running time of your algorithms and determine whether a problem you're working on is NP-complete. Good CS programs teach you linear algebra.

The odds are very low that you will get the same quality of experience across all those topics in four years on your own.

The problem is that many college grads think their education is enough, so they never actually do any work improving their skills. But the solution isn't to skip college, it's to do the work! And true, many people would be better off not going to college. But the people who are interested in learning about computing for computing's sake should go. Your skill set will be much richer if you enroll in a good CS program.

I honestly don't understand why people even think about FizzBuzz for more than 5 seconds. It's as if everyone started saying "hey, let's talk about implementations of 'Hello World!'"


That's what Giles is saying with his Fizzbuzzters post and this one. FizzBuzz is such a trivial "problem" that if colleges are putting it on midterms, it demonstrates a fundamental flaw in the administration's thinking.
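For anyone who hasn't run into it, here's why the commenters call it trivial: the entire "problem" fits in a dozen lines. A minimal sketch in Python (the original interview question is language-agnostic):

```python
# FizzBuzz: for 1..100, print "Fizz" for multiples of 3,
# "Buzz" for multiples of 5, "FizzBuzz" for multiples of both,
# and the number itself otherwise.
def fizzbuzz(n: int) -> str:
    if n % 15 == 0:      # divisible by both 3 and 5
        return "FizzBuzz"
    if n % 3 == 0:
        return "Fizz"
    if n % 5 == 0:
        return "Buzz"
    return str(n)

if __name__ == "__main__":
    for i in range(1, 101):
        print(fizzbuzz(i))
```

That a midterm would test this, rather than the data structures and algorithms material the course supposedly covers, is exactly the point being made above.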

If anything, it strengthens the theory that colleges focus on training rather than education. Look how quickly they evolve - "our graduates won't get hired if they can't do FizzBuzz? Let's add it to the curriculum then!"

Come on. Holding up the University of Chicago as an example of college in general is like holding up Pablo Neruda as an example of writing in general. That's not what the thing is. That's what the thing occasionally becomes, when it's done better than most people will ever discover is humanly possible. And I didn't provide any backup to this rumor because the rumor was whispered in confidence. If you're interested in this topic, there's a possibility you may see the rumor verified elsewhere soon enough. And if you don't, well, that's why it's called a rumor. And for the love of God, if you can't see the connection between college and nasty little projects, remind yourself that this is a post about putting FizzBuzz on a mid-term exam!

Anyway, enough defensiveness. I agree with you about the solution, at least.

And btw, Pat -- I think you've hit the nail on the head regarding how this happened. They read the FizzBuzz blogs like so many other people, and if anything missed the point even more completely.

I studied something other than computer science in college because I already ran the networks and Unix servers for the computer science department, and I thought I'd better learn something I was unlikely to learn on my own. I had been working at the university since I was in high school, and many of my friends were CS majors. Occasionally I would look at their classwork and write the little programs they were assigned, just to convince myself I wasn't missing too much.

I could have learned more about databases and queueing theory and other such things earlier, for sure, and even now there are many topics which I have not broached in a professional capacity. I studied math, physics and philosophy before dropping out, because I wanted to learn something "impractical." I always knew I could and would teach myself whatever was necessary for work.

I dropped out because at the age of 22, I worked half-time for Motorola and earned more than many of my professors -- I remembered something about "decreasing marginal returns" from one of the few economics classes I did not skip. I always thought I could learn more computer science as and when needed. It's not like pure math or physics, which you will almost never need and are unlikely to learn on your own.

My advice is, if you love software and computers, study something else to enrich your life and expand your mind. You will live and breathe computers regardless. Don't waste tuition and your precious youth on CS if you know you are going to be doing computer stuff anyway.

From what I've been hearing it sounds like the CS curriculum has changed a lot at some places from when I took it. I checked my alma mater's program and it was similar enough to what I took that I don't think its curriculum has degraded, though from what I've heard from a friend who took a graduate program there, the faculty isn't so hot.

I don't regret taking CS. It exposed me to a lot of things I don't think I would've gotten otherwise. In some cases it helped me work more intelligently on projects. The program I took was not strong on practical skills; I learned those on the job, as almost every other CS grad does.

Up to the point I entered college (this was almost 20 years ago), the most powerful computers I had worked on were a 286 PC and a Mac with a Motorola 68000 CPU. I had programmed in old-style BASIC (with line numbers) for about 7 years, and I had done a little Pascal, all on 8-bit computers. That was it. In college I got exposed to mainframes and Unix. I learned about programming in assembly language (and found it wasn't as hard as I thought), C, Icon, Prolog, SML (a close relative of OCaml), and Smalltalk (I don't include Lisp in this list, because while they did teach it, my teacher was terrible). Most projects were done in either Pascal or C, but I got exposed to the other "research languages" as well.

Compilers was a graduate-level course back then, but I took it anyway, because I was so interested in the topic. Building my own language compiler was a pinnacle experience.

My college experience is the reason I even knew Smalltalk existed, and why I've returned to it (via Squeak) in recent months.

I have been retraining myself lately. The way I was trained in CS was all about "this is how the computer works," not so much "here's how to express an idea logically" and have that expression execute on a computer (i.e., using an evaluator). We were taught formal logic (graph and set theory), but very little on getting that logic to execute. It was all about translating the formal logic into lower-level code, and then executing that.

I second Stephen's post. The work I did at university is much more difficult than what I work on as a programmer these days. I draw from that experience whenever I get something challenging to code.