Posted
by
by samzenpus on Wednesday June 13, 2012 @07:30PM
from the oldest-school dept.

Esther Schindler writes "Young whippersnappers might imagine that Computer Science degrees — and the term "computer science" — have been around forever. But they were invented, after all, and early programmers couldn't earn a college degree in something that hadn't been created yet. In The Evolution of the Computer Science Degree, Karen Heyman traces the history of the term and the degree, and challenges you on a geek trivia question: Which U.S. college offered the first CS degree? (It's not an obvious answer.)"

About six months ago, I was overexerting myself removing 'MyCleanPC' from a customer's computer. Apparently, the client in question was unaware that it was a piece of malware, written by Russian programmers whose only experience with computer programming involved a copy of Visual Basic 3.0 and MS BOB, and was responsible for Windows crashing all the time.

After removing 'MyCleanPC,' my client's computer ran 1000 times faster than before, and their credit card numbers were no longer mysteriously getting stolen.

Every person I know who has a Computer Engineering degree makes less money than I do. I also work with people who have nothing more than tech school diplomas who make more than I do and frankly can run circles around me.

When you graduate you will realize your degree is not what is important to be successful in the workforce. It is all about hard work, connections, raw talent, and a bit of good luck sprinkled in.

Many moons ago, I was a junior in college and chasing down CS as my bachelor's degree. One day, I decided I'd had enough arguing with machines. Now, as a firefighter, I love coming to work, and make more than most of my friends who continued on to CS degrees.

Today?

I'm doing the IT / programming / database / GIS work for my fire department... still arguing with machines, but now it's enhanced by arguing with bureaucrats.

Bullshit you aren't. If you're earning a PhD you're near the top of the list of capable people who earned bachelor's degrees. Some of the capable people will go off and get real jobs that pay 70 or 80k a year after graduation (which is now all of my former students from a course that finished at the end of 2011 who left academia), but you cannot get into a PhD programme without being well above average. Different fields have differing skill levels and outlooks, but you can't get a PhD in any of the sciences unless you have well above average reasoning and maths skills. You have to be passionate about being dispassionate, and you have to be able to look at evidence and analyze it properly. Those are extremely rare skills, even amongst people with undergraduate degrees in science or engineering.

In physics, to get a graduate degree you have to be in the top 70% of graduates from a bachelor's, more or less, but passing physics at all at the undergraduate level is quite hard. You're not all that much more special than people in, say, medicine or engineering, but when you're in academia and everyone you see over 30 is called "doctor," you forget that only about 10% of the US population has a graduate degree, let alone a PhD.

Engineering and comp sci are a bit different. They're harder to get into to start with, but it's easier to get into grad school once you pass, because most of your compatriots like money more than they like being able to investigate some novel, as-yet-unsolved problem that may remain unsolvable. Why is physics easy to get into but proportionally so hard to finish? Because as part of the regular science faculty they don't really care: if you can get into 'science' in general you can enroll in any of the physics classes, and not enough people are interested in physics for it to be a huge problem. I'm in Canada, and in my graduating year there were, I think, about 170 BSc grads in the whole country, and about 2000 in the US. But there were also about 1800 or 1900 PhDs in Canada and the US. In comp sci today (a number of years later) we have about the same number of PhDs as physics, up over 2000-ish, but something like 50k undergrads in comp sci in Canada and the US combined.

I'll grant you that getting a PhD puts you 'only' in the top 10% or so of the population, and within that much of the distinction is more interest than specific skill set. But you can't get a PhD without being really good in your area, and really good in general. You can get a BSc and be mediocre, and that's as much about luck and opportunity as anything else. But once you have a room full of computer nerds, universities can pick and choose who they take for PhD degrees. I know where I am they have about 300 qualified applicants a year for about 40 spots in grad school (and it costs about 100 bucks to apply, so you don't just fling applications about wildly, but that doesn't mean only 40 of those 300 will go to grad school at all).

I grant that there's a lot to be said for when you're born and luck, especially in being financially successful in life, but academia in North America and Europe is very much merit based. It may be luck and opportunity that determine which field you go into, and whether you end up a professor of computer science making 130k a year or Bill Gates making 130k an hour, but in both cases you can be in the top 1% of the population if you manage your money and don't do anything catastrophically stupid professionally.

TL;DR: I call bullshit. Luck and temporal factors will get you a bachelor's, and contribute to which field you're in and how much money you make. But even to get accepted to a PhD programme you have to be in the top quarter or so of graduates from comp sci or engineering.

Depends: how many hours did you work, how much experience do you have, how many hours did you have to work to get there, how much vacation do you get, what's your pension like, what's your job stress like, where do you have to live, etc. I know lots of professors who pick up their kids at 3pm every day and take 2 months at home in the summer (they still have to work some of that, but they are at home at least) taking care of the kids. You get to meet a constant stream of interesting people in academia, etc. If you go off into industry with a PhD you can easily start at 100k a year at 26 or 27 years old, and have all the vacation time, pension plan, etc.

PhDs aren't about the money; you are guaranteed enough to be reasonably successful in life, but how much effort you want to put into it is up to you.

Oh, and where would you be without a bunch of PhD nerds inventing the languages you programmed in, the IDEs you used (or the command-line compilers), the OS schedulers, etc.? Being able to program well is a skill, but computer scientists aren't programmers. You could have made your $236k being a welder for all it matters; lots of scientists need to know how to weld, lots need to know how to program, but they don't do it well.

You could well be in the top small fraction of the population intelligence-wise. Which means it's unfortunate you didn't go to school, because you'd be making 350k a year, not 236k. One of my buddies is about 50 years old, making about 450k a year working part time. The joys of being able to teach people how to program.

Like I said, luck and opportunity can get you into a BSc and it can get you money, but it won't get you a PhD. Maybe if you'd paid attention in school you'd be better at reading comprehension than programming, though from the sounds of things this plan worked out better for you.

No, a diploma or GED probably wouldn't make you better at what you do. Education is there to identify and work with talent, and in the 1980s we weren't particularly identifying talent in programming, as far as I know.

Which again doesn't mean you aren't successful. Hell, Bill Gates is the most successful dropout in history. Luck and opportunity will get you a long way at making money and being successful in life, no doubt. But a merit-based system only lets people in who have merit. It could certainly miss some people.

Okay, but we don't actually have a merit system anywhere in America, unless the primary indicator of merit is considered to be money. It is a near certainty that we have missed out on brilliant physicists simply because they were born into poverty.

To show aptitude? Because it's not, as that guy was talking about, the 1980s, when there wasn't a wide availability of degrees. Today if you go for a fresh starter job you're competing with people who have degrees and demonstrable skills.

Sure, if you can get an entry level job you can work for minimum wage for a few years until you pick up the skills, and be at constant risk of being replaced by someone who has a degree and doesn't make mistakes you don't even know you're making.

> and be at constant risk of being replaced by someone who has a degree and doesn't make mistakes you don't even know you're making.

This. A thousand times this. I work with guys who can program and are hard workers but never went to school. It kills me how poor the quality of their code can be. They write it just well enough to get the job done, but in a year when someone has to maintain it, I promise it will take at least ten times longer than necessary. That isn't to say I'm not glad they are here helping out, but if they just practiced some proven design patterns I wouldn't be looking for another job as frantically.
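To make the "proven design patterns" point concrete, here's a minimal, invented Python sketch (the shipping-cost example and all names are hypothetical, not from anyone's actual codebase) contrasting the "just enough to get the job done" style with a simple Strategy-style dispatch table:

```python
def ship_cost_v1(method, weight):
    # The quick-and-dirty version: every new shipping method means
    # editing this ever-growing if/elif chain, and it's easy to break
    # an existing branch while doing so.
    if method == "ground":
        return 1.0 * weight
    elif method == "air":
        return 2.5 * weight
    elif method == "overnight":
        return 2.5 * weight + 10.0
    raise ValueError("unknown method: " + method)

# Strategy pattern: each pricing rule is its own small function, and
# the dispatch table makes adding a method a one-line, low-risk change.
_PRICING = {
    "ground": lambda w: 1.0 * w,
    "air": lambda w: 2.5 * w,
    "overnight": lambda w: 2.5 * w + 10.0,
}

def ship_cost_v2(method, weight):
    try:
        return _PRICING[method](weight)
    except KeyError:
        raise ValueError("unknown method: " + method)
```

Both versions behave identically today; the difference shows up a year later when the maintainer has to add a fourth method without rereading the whole function.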

Right, and all of those proficiency tests people take, our guys can pass. If you went to a shitty school, or there are only shitty schools near where you live, that's your problem. I'll grant you that the US makes it especially hard to know which schools are legit and which aren't, and we're oversimplifying by saying a 'degree' rather than specific courses. I haven't taken any of our courses on bioinformatics, so I'd be pretty much doomed if asked about it.

Because hiring is a taxing job, and the vast majority of people doing it will take every shortcut available. If you apply for a non-starting IT position and have no degree, your resume goes straight to /dev/null the majority of the time. You'd think the experience listed on the document would handily override the lack, and you'd probably be right, but they won't be able to take that into account because they don't even get that far in reading it.

Even 'starting' IT jobs typically require degrees these days. I've seen entry-level helpdesk jobs requiring 5+ years experience with a bachelor's (specifically in BIS/MIS/CS) and a list of skills as long as your arm. Most likely this is a tactic not to hire an American, but I see them often enough that it's normal in the current market.

Also, lots of places have policies that require a bachelor's degree to work for them. You could have 10 or even 20 years of experience and they still insist you have the degree before they'll hire you.

I don't know about PhDs, but I made $236k, no high school diploma, no GED, no degree.

That's excellent, you have done very well for yourself.

PhDs aren't generally about the money. Having one, especially in a useful area, can lead to very comfortable, well-paid jobs, but that's not what they are for.

A PhD is an educational degree. One learns a lot about a specific field, but more than that, one learns how to do original research. One of the main (and often ignored) goals of a PhD is to learn how to do research effectively: how to direct it, how to choose appropriate paths, how to discover new things about the world, how to be reasonably sure that you're right about them, and how to communicate those discoveries to others.

So how impressive can a PhD be? Does that mean I beat out that 10%?

Depends on the PhD, depends on the person. Not all PhDs are equal. People in the know (i.e. in the same area as the PhD holder in question) generally won't accept the existence of a PhD at face value; they will go more on the contents of the PhD, the research group, and the advisor as indicators of how good it is. If you're not in the right area, it's very hard to find out that information.

The important thing is not to get a chip on your shoulder (comments like "how impressive can that be" indicate that maybe you have).

If you're making 236k, then you're well above the top 10%. But just because you are higher up than people who have more qualifications doesn't mean you should discount those qualifications or assume that they are de facto worthless.

It's not luck. We pick out the ones with the best averages, and the best letters of recommendation, and all that relatively quantitative stuff. Letters of recommendation have a blurb of fairy bullshit about how 'Sir_sri is very talented and would make a barf barf barf'; the important parts are the questions that let us judge what the raw numerical stuff actually means: what quartile of your graduates is this person in, would they qualify for graduate school where you are, on a scale of 1 to X how would you

For reasons of space, I limited the question to American universities, but computer historian and former IEEE Computer Society president Michael R. Williams points out that many universities worldwide were offering CS degrees by this period. He received his own PhD in CS from the University of Glasgow in 1968. He believes Glasgow’s program dates as far back as 1957, since he was an invited speaker at its 40th anniversary in 1997.

And when TFA says the answer isn't obvious, it kind of is. Cambridge was the home of the first computer lab, staffed with people like Maurice Wilkes and Roger Needham. That's exactly why I'd expect it to be the answer.

When I read computing at Cambridge, they'd just extended the duration of the course from 1 year to 2 - and even that was (as I recall) based on two hours of lectures per day and a couple of hardware labs per week. You had to enter as an undergraduate on the assumption you were going to read something else, and read CS as an afterthought.

I think this was very useful, as it ensured students had a background in something else (like maths or engineering), but also gave them some spare time to attend lectures in other subjects.

When I (an American) spent a term at the University of Aberdeen back in '86, I was amused that they insisted on expanding "CompSci" as "Computing Science", further evidence that the US and UK are divided by a common language.

"At an academic level, it's a very different background," says Bobby Schnabel, Dean of the School of Informatics at Indiana University and chair of the ACM's Education Policy Committee. "The calculus and differential equations that underlie engineering are not what underlies computer science. It's really discrete mathematics."

That was true a few decades ago. Today, though, all that discrete math isn't as useful. Today, you need calculus and Bayesian statistics for machine learning. You need differential equations and computational geometry for game development and robotics. Number theory, mathematical logic, graph theory, and automata theory just aren't that important any more. Most of what's needed from those fields is now embodied in well-known algorithms.
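The Bayesian-statistics point can be made concrete in a few lines. This is an invented toy example (the spam-filter framing and all the probabilities are made up for illustration), showing the Bayes'-rule arithmetic that sits underneath naive Bayes classifiers and much of machine learning:

```python
def posterior(prior, lik_given_h, lik_given_not_h):
    """P(H|E) via Bayes' rule for a binary hypothesis H.

    prior          : P(H), e.g. the base rate of spam
    lik_given_h    : P(E|H), e.g. P(word "free" appears | spam)
    lik_given_not_h: P(E|not H), e.g. P(word "free" appears | ham)
    """
    # Total probability of the evidence, summed over both hypotheses.
    evidence = lik_given_h * prior + lik_given_not_h * (1 - prior)
    return lik_given_h * prior / evidence

# Made-up numbers: P(spam) = 0.2, P("free"|spam) = 0.6, P("free"|ham) = 0.05.
p_spam_given_free = posterior(0.2, 0.6, 0.05)  # = 0.12 / 0.16 = 0.75
```

So seeing the word lifts a 20% prior to a 75% posterior; chaining this update over many words is essentially what a naive Bayes spam filter does.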

I got all the classic discrete math training, but over the years, I've had to use far more number-crunching math.

Basic automata theory is essential to software engineering - understanding capabilities of various computation models (what all can you do with a regex?), writing parsers and compilers, etc. Understanding basic graph theory (shortest paths, minimum spanning trees, bipartite graphs, maximum flows, coloring) is very important all across the field, from optimization to game development - sure it's well-known algorithms, but they are well-known only if you study and grok them. In the end, these really are the foundations of computer science and algorithmic thinking, while calculus etc. get useful when you get involved with real-world applications or simulations (or machine learning).
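The "what all can you do with a regex?" question from the comment above has a crisp automata-theoretic answer that a few lines of Python can illustrate (the `balanced` helper is my own sketch, not anyone's production code): regular expressions recognize regular languages, but balanced parentheses form a context-free language, so a finite automaton with no counter can't recognize them and you need at least a counter or stack.

```python
import re

# A regular language is easy: strings of a's followed by b's.
assert re.fullmatch(r"a+b+", "aaabb") is not None
assert re.fullmatch(r"a+b+", "abab") is None

def balanced(s):
    """Check balanced parentheses with an explicit counter - the
    minimal pushdown-style machinery a classic regex lacks."""
    depth = 0
    for ch in s:
        if ch == "(":
            depth += 1
        elif ch == ")":
            depth -= 1
            if depth < 0:      # closed more than we opened
                return False
    return depth == 0          # everything opened was closed
```

Knowing which side of that line a problem falls on is exactly the kind of thing a parsing or compilers course drills in, and it's why "just use a regex" fails for nested structures like HTML or arithmetic expressions.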

I'd agree that number theory is not that useful outside of crypto and anything regarding mathematical logic feels extremely old-fashioned in current AI research.

Sure, but there's still a lot of research on automata theory and graph theory going on; it just depends on what field you land in and what problem most needs solving where you are.

If you're making compilers for a living it's a very different job than if you're making user interface APIs. I have some friends who work for the same company, in the same building, on the same floor, where one does the one and one does the other, and the skill sets required are completely different.

Depends on what area of computer science you are in. For every field you point out that uses calculus, I can point you to two more active areas of research that focus on discrete math. Personally, I am in cryptography (which no one can argue is "solved"), where modern research still relies on new developments in the areas you downplay, i.e. number theory and graph theory (check out the new biclique attack on AES for an example).

When I applied to schools in 1980, I noted that Stanford did not have an undergraduate computer science degree which seemed a bit ironic considering that so many CS advances came from Stanford.

The thing is, what "computer science" meant was not a very well defined thing. It could be computational theory at some schools, or it could be an engineering program at others, or a mathematical elective at others.

Our "CS" undergrads had to slide in under the fairly broad "Engineering and Applied Science" umbrella or else stick out the more stringent requirements (EE151, AMa95, etc) of a straight EE degree with a focus on "Computing". There were CS courses and professors, but no degree plan.

CS is a new and brilliant approach, based on mathematics, to an idea which never existed in any person's head until von Neumann & Turing, and even Lady Ada before them. It's a new science which literally changes every day.

I wrote my first, very simple computer program around 1966 in a class in numerical analysis when I was an undergraduate math major. I was going to a small liberal arts college, less than 2000 students. The college computer was a PDP 8. You bought decks of cards, punched them up with your program and submitted them to a clerk in the Admin Building and hoped the thing would run. In the mid-1970s, after a hitch in the Navy, I went back to school at a somewhat larger place on the GI Bill. We timeshared on

Computer science now has "routes", "track", or "emphasis". C.S. with emphasis on Web, or Security, or Artificial Intelligence, or Crypto, or Machine Learning, or Software Engineering, or General/Mathematics, or Foundational/Theoretical. So I can tell an employer, "Yea, I am a computer scientist. But only the kind that works with web tech. I don't know enough about Embedded systems to get your water pumps working in sync, sorry!" I've even seen a "Developer" track offered. Hmm.

Depends. Supposedly, at the university I attended, Software Engineers took one course from every track, while Computer Scientists took 3 from each of two tracks (for a total of six). In my case, I took Operating Systems (threading + the Linux kernel) and Machine Vision / Graphics (designing GUIs + implementing mathematical algorithms for drawing images... in Java *shudders*). I already had experience with OpenGL, so not having a class in it was not a major issue, although doing image work in Java nearly made me break down.

> I believe it was Jerry Falwell's Liberty Christian University. But the computer was an abacus, and it could only count up to 6,000.

As subscribers to the theory of intelligent design, Liberty doesn't believe the modern computer "evolved" from the abacus - certain features of the abacus and of a modern computer are best explained by an intelligent cause, not an undirected process such as natural selection.

Northeastern created the nation’s first college devoted to computer science in 1982, and today’s College of Computer and Information Science remains a national leader in education and research innovation...

I graduated with the first five-year class of NUCCS in 1988. Five years because co-op experience was mandatory and built into the curriculum. Freshmen and sophomores used Pascal; by Middler (3rd year) you had to take the 1-credit "lab" course in C.

Much like typing and shorthand. The original word "computer" referred to mostly female clerks, who tallied long calculations by hand or adding machine in the backrooms of laboratories and insurance companies. Many of these same women migrated to the early electronic computers in the 1940s, programming them by setting dials, rewiring, and punch cards. I believe the feminine clerical side of the business gave computer programming a low status in the early decades.

When I went back to Philly Community College[1][2] in 1978, I got into a track for an Associate's Degree... in Data Processing.[3] There were a *lot* of folks taking DP. That was, of course, before the escalation of titles (that's a sanitary engineer, not a janitor). I also have an ex whose library science degree title included information systems.

As an undergrad at Ga. Tech back in 1969-1973, they had a GRADUATE program in C.S., but no undergrad program. I had a roommate who was working on his master's in C.S., but I could not major in that. Also at that time, there was no minor program available (for anything, not just C.S.). So I majored in physics and took a lot of computer courses when I could. Good old BASIC, ALGOL, and Fortran for the most part. I even recall an assembly-level simulation language called "Dummiestron" (or Dummystron?).

Georgia Tech had an undergrad ICS (information and computer science) program that started in 1972. I started in 1973 with the second class of undergrads. IIRC, when I was looking at undergrad Computer Science programs at the time there were only two. Stanford and GA Tech.

Exactly what I was about to point out. I have a PhD in EE (microelectronics) and a bachelor's in comp sci, and the two things could not be further removed from each other.

The PhD in EE was all about things like the physical properties of materials (especially silicon), chemistry, properties of plasmas in a vacuum, etc. The comp sci degree was more about coding, algorithms, APIs, multitasking, and other operating systems concepts.

Both things are useful to me, and gave me completely different skill sets.

Comp sci grew very much out of different departments: some places (like Waterloo) it's an extension of maths, some places it's physics, some places it's engineering. But you're right: as a discipline comp sci is concerned much more with what is theoretically computable and how complex that is, how you logically envision a problem, and how you organize and represent information. Computer engineering is much more about the problem of building all of the components and how they get soldered together.

Though I grant there are computer scientists who do research only on what is computable on real hardware, and engineers (and physicists) who think about hardware that could be used to solve problems not normally regarded as computable, or not computable in a particular time. Part of doing research is that you solve a problem, and what discipline it happens to belong to is secondary.

EE is more towards the hardware side while Comp Sci is more on the software side

It's more than that. Engineering disciplines are different from the science disciplines. Engineering majors are required to take courses on the engineering process that are merely optional for a student of computer science.

Years ago I found myself with a BSEE and no job. At the time it seemed like every job interview ended with, "Thanks for your time but we're looking for someone with more programming experience." It didn't take me long to realize that I needed to go back to school.

Computer Science was created by the foremost minds of yesteryear, with hideous amounts of resources, to solve problems that the human mind would find either tedious, impossible, or both.

To reiterate my point, it was created because the Physicists, Mathematicians, and Electrical Engineers of half a century ago had hit a brick wall, and needed something new to help themselves over it. I believe Einstein, Feynman, and friends were among those people, using computers to perform yield calculations (and borrowing