Posted
by
samzenpus on Wednesday September 03, 2014 @03:08PM
from the most-bang-for-your-buck dept.

jjp9999 writes: A college degree may not be the best route when it comes to jobs in coding. Jobs for computer science majors flow aplenty, yet employers (and job-seekers) often learn quickly that the college grads don't have the skills. "This is because the courses taught in virtually all computer science curriculums focus on theory, and they only dabble in teaching practical programming skills," says Cody Scholberg on Epoch Times. This ties into a curious fact in the world of programmers: nearly half of the software developers in the United States do not have a college degree. Many never even graduated from high school. Instead, many aspiring programmers are turning to open source learning materials, or to the new programming bootcamps popping up around the United States. While theory does have its place, the situation raises the question of whether colleges are teaching the right skills people need to join the workforce, and what their place is amid the rise of open source learning.

I wouldn't say learning to code outweighs a college degree. But having the mentality that would lead one to want to learn to code... that's a sure bet. Of course, that mentality might lead one to attend college, but my contention is that college is less advantageous for many (certainly it turned out to be a time sink for me).

I wouldn't say that learning to code necessarily outweighs a degree. But I do think university courses are too heavily focused on theory, and not enough on practical application.

They complement each other. The big problem here (having gone through both, most but not all of the college being quite a while ago) is that a computer programmer back in the day HAD to know theory well, because programming was hard work! Input/output was so slow that you had to get it right the first time. Often you would present your code to somebody at a window to run on the mainframe, and if you were lucky you got a printout (!!!) the next day. If you got it wrong, a whole day was down the tubes.

Memory and storage were always in short supply, and CPU time was expensive. So everything had to be optimized. Sometimes for speed, sometimes for size, sometimes a compromise of both. Theory was everywhere and you had to use it.

That heavy-on-theory, short-on-practice model is the one university CS was built upon, out of necessity. And they've kind of stuck with it, because universities are slow to change such things.

But I would also say that it is not a waste of time. As a practical programmer, theory will get you far. Look! De Morgan's Theorem just let me reduce those 5 lines of code to 2. You may not need to know linear algebra to work on sets of numbers, but if you do, hey, check it out. Now our program is half the size and our memory usage is down by 2 orders of magnitude.
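To illustrate with a hypothetical example (function and parameter names invented), De Morgan's laws let `not A or not B` collapse into `not (A and B)`, which is exactly the kind of line-count reduction being described:

```python
# Before: two separate negated checks (the "5 lines" version).
def should_retry_verbose(connected, authenticated):
    if not connected:
        return True
    if not authenticated:
        return True
    return False

# After: De Morgan's law, (not A) or (not B) == not (A and B).
def should_retry(connected, authenticated):
    return not (connected and authenticated)

# Both versions agree for every input combination.
for a in (True, False):
    for b in (True, False):
        assert should_retry(a, b) == should_retry_verbose(a, b)
```

The same rewrite applies in any language with boolean operators; the payoff grows when the condition appears in many places.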

So I don't think either one replaces the other. They complement each other. But I do think universities could concentrate, at least for their BS programs, a bit more on practical programming and just a bit less on theory.

My university took the attitude that computer science was an engineering discipline. You need to understand the theory, because the theory helps you classify and interpret that problem you're dealing with. But as an engineer, you're also on the hook for the process of design, and the actual design itself.

Certainly they do not want to just teach you a programming language, because that's like teaching a mechanical engineer the tools and settings for a single CAD program. Or an architect just how to draw blueprints.

But they still called it a computer science degree because I guess the world assumes people with a "software engineering" degree don't understand theory? I don't know why they made that choice.

The premise in the summary is wrong. Employers have not learned that actual skill outweighs the fact that someone survived college.

The fact is that such a degree in no way indicates that obtaining it involved actually learning what was presented for longer than it takes to pass the relevant examinations.

On the other hand, if the programmer presents a series of complex projects they have completed, this does positively indicate they have both the knowledge (what the degree should attest to, but really doesn't rise to the challenge) and the ability to employ that knowledge (which the degree does not assure anyone of, at all.) Those completed projects should also serve to demonstrate that the required portions of theory have both been absorbed and implemented, presuming the project works well and as intended.

Employers and HR departments are rarely focused on actual performance, except in the very smallest of companies. Most use a combination of bean-counting, related age-discrimination, and the supposedly valuable rubber stamp of a degree to winnow out programming job applicants. After all, if said employee screws it up, that's the employee's fault. Not the HR person.

This, in fact, is why most corporate software goes out the door with so many problems, and it is also why those problems typically remain unfixed for very long periods of time.

It sure would be of great benefit to end users and companies if actual skill *did* outweigh a degree. But that's most definitely not happening. It's wishful thinking, that's all. And if you're an older programmer, even your sheepskin won't help you -- you cost too much, your health is significantly more uncertain, they don't like your familial obligations, they don't like your failure to integrate into "youth culture" as in no particular fascination with social media... or even your preference for a shirt and tie. Welcome to the machine. You put your hand in the gears right here. Unless you've enough of an entrepreneurial bent that you can go it on your own. In which case, I salute you and welcome you to the fairly low-population ranks of the escapees.

The value of "learning to program" is roughly comparable to the 1st year of CS classes at a reputable University. It is certainly not a replacement for the entire degree. Also, the degree is no replacement for practical experience.

There really aren't any shortcuts. There's a certain amount of time and effort you need to spend getting really good at something. Even Mozart couldn't escape from it.

This reminds me of one of my cousins who thought you could get into computing by taking shortcuts like a weekend bootcamp. It was ultimately motivated by the usual underlying contempt that people have in general for anyone else's profession. (It's not just a computing thing.)

The value of "learning to program" is roughly comparable to the 1st year of CS classes at a reputable University. It is certainly not a replacement for the entire degree.

Yes. IMHO this is what most often gets overlooked when people debate university CS/SE as a mostly-theoretical discipline as distinct from practical experience in industry.

You can study practical skills in using a certain language or library or tool, and you can become somewhat productive. But without sufficient theoretical understanding, you're just doing cookie cutter coding, and you will always have a relatively low glass ceiling on how much you can achieve.

Put more bluntly, practical skills are what you pick up to get from incompetent newbie to vaguely useful programmer in the first year or two on the job, but improving your theoretical understanding is what gets you from there to seriously useful senior developer a few years after that when you're no longer just writing simple GUI logic in C# or trivial ORM code for a Ruby on Rails web site back end.

Also, the degree is no replacement for practical experience.

Indeed, but someone with good theoretical understanding will pick up any given tool based on that theory fairly quickly.

Now, at no point in this post did I imply that getting a degree is either necessary or sufficient to achieve a good understanding of the theory. As far as I'm concerned, you absolutely can get there with time, effort and an open mind.

However, I think even autodidacts will find the process significantly easier if they've developed rigorous mathematical thinking and the ability to read and digest technical writing first one way or another. Also, for better or worse, the reality is that having that degree certificate will probably get you better jobs early in your career, which in turn will give you better experience and better colleagues to learn from at work.

In any case, just reading lots of casually written tutorial blog posts by people who've been playing with a tool for six months longer than you have certainly won't get you to that level of understanding alone. It's very easy to spend a lot of time doing that in a field like software development, feel like you've learned a lot and can be super-productive, and never even know how much you're missing if you've never found the right course of study or mentor or on-line learning resource to open your eyes. That, IMHO, is the biggest risk for people who haven't studied formal CS/SE one way or another, and sadly you can always find plenty of examples in the on-line forums for whatever the latest shiny technology is (currently I'd say it's front-end web development).

I taught myself to code at age 8 and dropped out of college, with only a 101 comp sci class. All of my early comp sci education was self-taught. I recently got thrown onto a team at work with a guy who was a couple of years out of college with a masters in CS. He is pretty sharp, but he knows little beyond elementary algorithms, has no experience with assembly or SQL, hasn't even heard of Turing, and has never read Knuth.
Give me a passionate, self-motivated coder any day. They will teach themselves whatever they need to know to solve an interesting problem.

The fact is that such a degree in no way indicates that obtaining it involved actually learning what was presented for longer than it takes to pass the relevant examinations.

I think you misunderstand. They are using the degree as a method of verifying that you know, or are capable of learning.
The degree is not proof that you know everything that was taught, BUT that you were at some point capable of learning everything that was taught well enough to pass the tests.

If you could learn it well enough to pass the exam once, then you are more capable than the vast majority of the population.

Which shows you are more suitable than the average person: a mentally capable employee, able to learn and work in the field.

In my case I studied computer engineering. I did not take a programming class per se. I took classes where you were expected to learn the programming language, but they were more about the algorithms than the language. For example, I learned C in my first upper-division algorithms class, but most of our time was not spent on the programming language.

My degree also gave me access to a lot of things that would not be easily doable outside of the university, such as hands-on exposure to a lot of hardware concepts.

this is an interesting discussion...I always like reading what actual coders have to say

i was in middle school in the late 80s-early 90s and first used a computer running DOS...i understood coding conceptually & we did a few command line things, also formulas in spreadsheets...i learned coding mostly from doing database management...then got an MS in information and communication science which included getting our CCNA

in academia i used SPSS which has (IIRC) a python-like (maybe Fortran?) scripting lang

This is an age old question not necessarily particular to Software Engineering... Are credentials or experience more important?

I would say experience is what you need to do the job, while credentials are often what you need to get the job in the first place and advance your career beyond your current role. I think that holds true for the majority of jobs, but there are plenty of examples and counter-examples of people having success without experience and/or without relevant degree credentials. Career wise I would suggest maximizing the financial return on all your strengths in the near term and either address your weaknesses as best you can or just go around them. Medium to long term always be looking to fill in the gaps in your experience or education that might be relevant to the types of jobs you may want/need in the future.

The problem with being self-taught, having only experience, is that the experience tends to be only in what was necessary or what was interesting. Very few aspiring developers have the ambition and discipline to study and learn all the topics that one will be "forced" into studying in a formal degree program. For most aspiring developers the formal degree program will give them a broader set of knowledge and tools to build upon. The purely self-taught who have such a broad set of knowledge are exceptionally rare.

Let me rephrase that question: "does knowing how to do a job outweigh knowing abstract theory about that job?" I think the answer there is pretty obvious: *of course* coders who actually know what they are doing are more valuable to an employer than some kid with a CS degree and no idea how to actually do a programmer's job.

I'm assuming the vast majority of programming jobs require the ability to code, and no further domain specific knowledge. This is just based on my reading of many, many programming job listings over the years.

I'm sure there are jobs that require CS knowledge, just as I'm sure there are (programming-related) jobs that require Biology knowledge or Architecture knowledge or whatever. But all of those are niches: a very small subset of all programming jobs require those specific areas of knowledge. ALL programming jobs require coding, though.

I'm assuming the vast majority of programming jobs require the ability to code, and no further domain specific knowledge. This is just based on my reading of many, many programming job listings over the years.

I'm sure there are jobs that require CS knowledge, just as I'm sure there are (programming-related) jobs that require Biology knowledge or Architecture knowledge or whatever. But all of those are niches: a very small subset of all programming jobs require those specific areas of knowledge. ALL programming jobs require coding though, and even among the ones that require domain-specific knowledge, I'd imagine the bulk involve a lot more coding than anything else.

You don't need "domain specific knowledge" to code, but I think most such programmers are subpar. Code is like writing; you only need to know English (or your native language) to write, but if that's all you know then you're not going to be a particularly useful writer. Code implements algorithms, algorithms solve problems, and knowledge of the problem space is often not just valuable but the difference between uninteresting scribbles and a best-selling novel.

The amount of data and the complexity of the calculations involved demanded either a machine with a large number of compute cycles or some nifty CS theory-style rejiggering on the back end. In the end, the whole thing will run nicely on a modern, fairly average laptop as opposed to requiring the processing power of a huge server (or cluster).

I can't speak to your case specifically, but there's a trend in our industry (because of virtualization, AWS, etc.) to do the exact opposite of what you did. The basic equation of why goes like this:

X = the cost of a programmer for a month (or however long it took), including not just their salary but also their medical, 401k, the fractional cost of their manager's time, etc.
Y = the cost of the cheapest AWS machine you could get away with for the next 5 years
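Plugging in some purely illustrative numbers (all of them made up for the sketch, not real quotes for salaries or AWS pricing) shows how that comparison usually comes out:

```python
# Hypothetical figures for the X-vs-Y comparison described above.
programmer_month = 15_000    # salary + benefits + overhead for one month, USD
aws_instance_monthly = 100   # a modest on-demand instance, USD/month
years = 5

# Y: total hardware cost over the 5-year horizon.
hardware_cost = aws_instance_monthly * 12 * years

# If a month of optimization work costs more than 5 years of the bigger
# machine, the economics favor just renting the machine.
print(hardware_cost, hardware_cost < programmer_month)  # → 6000 True
```

With these (invented) numbers the machine wins easily, which is the trend the parent describes; the calculus flips when the workload has to scale to thousands of machines.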

The best-run company I ever worked with took what I thought was a sensible approach to optimization:

We were working on a complicated production system with hundreds of individual components and intense uptime requirements. The vast majority of the programmers (about 1000) were to focus on writing "robust" code that worked in an "obvious", easy-to-maintain way. The Performance Engineering team would look at system metrics (everything was instrumented) to find the actual performance bottlenecks. Then they would send in a crack team of commando programmers to do trippy, non-obvious optimizations on very small pieces of code.

The idea was, in a complicated system it's very unlikely that your specific piece of code is going to be the limiting factor in overall system performance. So it's better to have less performance-optimal but more robust code in most places; and to use fast but brittle code only where absolutely necessary.

FWIW, the company in question is outlandishly profitable, and their software is widely considered the best in their industry.

Ever hear of the phrase "you don't know what you don't know"? What I have seen are people who are self-taught and pretty good, but they have a very limited domain of knowledge, specifically limited to the things they studied. Usually they have huge gaps in algorithms, discrete mathematics, and other things that are very valuable knowledge that can be applied practically to their coding, directly or indirectly. The computer science or computer engineering degree purposely teaches a wide breadth of topics to close exactly those gaps.

You may have a BS in Comp Sci, but I'll tell you one thing: I'd really hate having to read your Implementation Docs or code comments if they look anything like the post you just made.

Your post also brings into question exactly how good of a programmer you really are as well. You see, English, much like programming, has a structure and a syntax. While you may have syntax, there is no structure. You may not have to compete for a job with someone who doesn't have a BS in CS, but you will most certainly have your cover letter compared to another person with a BS in CS who actually puts structure into his correspondence.

As someone without a BS in anything, I've actually found the opposite.

Yes, people who are self-taught often have gaps in our knowledge, but we tend to be *much* faster at filling those gaps. Also, the fact that we acquired all the knowledge we did without a college degree indicates that we are motivated to fill those gaps ourselves.

It is very likely that there are things we have not been exposed to, even if we match your 15 years' experience as a software engineer. However, upon exposure, I am willing to bet that we will beat you soundly at rapid acquisition and assimilation of knowledge - especially since, if you've been in the field for 15 years, your degree is over 15 years old. Which means that plenty of things which are new to me will be new to you, too.

You're absolutely right that you'll never have to compete for a job with someone that does not have a bachelor's degree. I, on the other hand, have to compete with people like you for the right to do my damn job all the time, because you're absolutely convinced that four years in a university beat four years actually in the field working on real-world problems, while voraciously consuming papers and books, and while corresponding with experts in the field - because unlike you, my tools were not handed to me by a university; I had to build them myself.

None of which translates well to a bureaucracy-approved stamp I can stick on my resume, so you're right - good on you. You'll get fast-tracked to management, where you'll continue to pretend like you know what you're doing more than I do, where you continue to ignore my explanations of why your harebrained ideas won't work, and where you'll continue to get me fired when they fail in exactly the way I warned you they would. You've certainly got it all figured out.

Yes, people who are self-taught often have gaps in our knowledge, but we tend to be *much* faster at filling those gaps. Also, the fact that we acquired all the knowledge we did without a college degree indicates that we are motivated to fill those gaps ourselves.

Meh. Your generalizations are just as bad as someone arguing the opposite. I agree that someone who is self-taught AND motivated can be amazing -- ultimately, that's sort of what college used to be about, i.e., taking you from the high school "spoon feed you knowledge" mentality to the self-learning place where you can teach yourself what you don't yet know.

Good college grads learn to teach themselves, sometimes as a result of university training. Other people pick up the self-teaching and motivation some other way.

A lot of other people have gone over what's wrong with your argument, so I'll try not to rehash that too much.

I'll admit that my 13 years as a professional programmer (after my degree) are years that I would say are more fundamental to my general programming skill than my CS degree, but I learned a lot of things in University that are hard to come by elsewhere. I learned a lot of things that aren't about computers, and that's been really helpful.

Many people are self-taught, not just in "coding" but also in computer science. It's not like you can't read a few books to get the underpinnings that will actually matter on the job.

OTOH, I've interviewed quite a few people with degrees but only very shallow coding skills (no real understanding of pointers or debugging), and who still didn't have strong fundamentals in computer science. I seriously wonder what some schools teach for four years.

> In fairness, anyone who actually learns to program on their own can muddle through with a couple of printf's and a stopwatch

That approach is so crude and unsophisticated that just about anyone would know well enough not to admit to actually doing something that primitive.
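For what it's worth, the "crude" approach in question is just a timer wrapped around a suspect function. A minimal Python sketch (function name and workload invented for illustration):

```python
import time

# Some function suspected of being slow (made up for the example).
def slow_sum(n):
    total = 0
    for i in range(n):
        total += i * i
    return total

start = time.perf_counter()            # the "stopwatch"
result = slow_sum(1_000_000)
elapsed = time.perf_counter() - start
print(f"slow_sum took {elapsed:.3f}s") # the "printf"
```

Primitive, yes, but for a single hot path it often points at the same culprit a real profiler would.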

> Honestly, even if I'm doing it a on a million items, O(N^2) vs O(N) isn't a big deal - hardware is cheaper than my time.

No. Not really. Plus that difference you are glossing over there can mean the difference between the problem being solvable with currently available hardware (or not). There's only so much hardware you can throw at a problem before you exhaust that approach.

Actually. The more I look at your statement, the more the mind simply BOGGLES.
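To make the boggling concrete, here is a small Python sketch (sizes chosen arbitrarily) comparing a linear scan per lookup, which makes the whole loop quadratic, against a hashed lookup:

```python
import time

items = list(range(100_000))
probes = list(range(0, 100_000, 100))   # 1,000 lookups

start = time.perf_counter()
hits_list = sum(1 for x in probes if x in items)    # O(N) scan per probe
t_list = time.perf_counter() - start

item_set = set(items)
start = time.perf_counter()
hits_set = sum(1 for x in probes if x in item_set)  # O(1) hash probe
t_set = time.perf_counter() - start

print(f"list: {t_list:.3f}s  set: {t_set:.4f}s  same answer: {hits_list == hits_set}")
</n>```

At this size the difference is seconds versus microseconds; scale the data up and no amount of hardware keeps the quadratic version viable.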

There are too many things that an employer is looking for from a degree that have nothing to do with coding: the ability to follow through on a royally painful task, being well-rounded enough to communicate clearly, and plenty of other things.

Do colleges actually teach useful skills? I got the very basics out of my college and the rest I learned on an internship and on the job. I do think colleges could be improved but I'm not smart enough to say how.

In India many colleges and universities offer Master/Bachelor of Computer Applications degrees. It started out teaching simple Word, Excel, dBaseIII, FoxPro. Nowadays they have added PeopleSoft and Oracle too. Some colleges add things like Ansys, ProE, ProSteel, Fluent, Ansoft HFSS etc.

These are the graduates who end up in the USA via the H1-B process, most of the time in HR, IT, and banking projects. The quality of the graduates varies significantly. But they all make decent salaries in the USA, comparable to high-quality engineering grads from US schools. I have seen these programmers of questionable abilities easily pulling 100K to 140K a year.

If you want to learn how to build usable software, that is a different skillset.

I've got a Comp Sci degree, and I've been a professional software developer for the past 19 years. While some things I learned in my degree program have come in handy, I learned to code professionally AFTER I entered the workforce, and primarily from working with other people's code and being mentored by those that had done the job.

There should be a professional "Software Engineering" (or call it something else if the Engineers get upset about the term) program for those that want to actually build code.

If you want to learn how to build usable software, that is a different skillset.

Precisely. Getting a computer science degree in order to become a programmer is like getting a mechanical engineering degree before becoming a mechanic. Yeah, it's kind of vaguely field related and will help give you some background about why things are done a certain way, but it's not at all the same skill set.

So much of the code that I've seen is poor because the people writing it have not learned the fundamentals of requirements determination and problem solving. Then you need to understand how to choose appropriate algorithms and tool sets to apply. Then you learn what you should have known in the first pass, and you start again on a better solution. While colleges attempt to produce people having those skills, they often do not provide enough practice. For all four years of my degree program, only one course gave that kind of sustained practice.

One of our Java development teams was discussing a new project, and one developer stated she didn't know where the code runs. I overheard this and went over to join the discussion. This is one of our best teams, and no one knew the answer. So I explained how Java Server Pages compile, and where each piece physically runs in a multi-tier architecture. When these people learned to code, they learned through an IDE: hit compile, deploy the code, and it goes off into wonderland to run. They never learned how it actually executes.

For example:

1. You'll follow and (perhaps later on) write and refine software specifications. You need to learn different ways to do this.

2. You'll need to select appropriate algorithms for the task at hand, and evaluate performance for new code -- which you wrote against a trivially small amount of data -- against production data volumes.

3. You'll need to understand pros and cons of different software development approaches, particularly waterfall and the broad category of "agile". Why would you pick one over the other?

4. You'll need, at least on occasion, to understand one or more software modeling systems, and perhaps to create models that represent what you're suggesting.

5. You may very well need advanced mathematics for your job. Just a couple of months ago, I had to write some vector-handling code, in PL/SQL of all things.
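Point 5 rings true. As an illustrative sketch (in Python rather than the PL/SQL in question, and with the particular operations invented for the example), this is the kind of basic vector math such a task involves:

```python
import math

def dot(u, v):
    """Dot product of two equal-length vectors."""
    return sum(a * b for a, b in zip(u, v))

def angle_between(u, v):
    """Angle in radians between two vectors, via the dot-product identity
    cos(theta) = (u . v) / (|u| |v|), with clamping for float error."""
    cos_theta = dot(u, v) / (math.sqrt(dot(u, u)) * math.sqrt(dot(v, v)))
    return math.acos(max(-1.0, min(1.0, cos_theta)))

print(round(angle_between([1, 0], [0, 1]), 4))  # → 1.5708 (a right angle)
```

None of this is hard once you've seen the linear algebra, which is exactly the point: the curriculum buys you the recognition.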

Sure...you could learn all this on your own. But a good compsci curriculum will provide you with at least an introduction to all of these, with some kind of attestation of basic familiarity.

If you want to be "just a coder," go right ahead. However, you'll never be all that competitive with those possessing the larger body of skills needed to be a solid technical professional. Of course, real experience is very helpful in landing the first job. That's what student jobs, interning, and cooperative education are for. I'd never have landed my first job without some of the skills I learned over four terms of co-op.

Incidentally, point #2 is key. For some reason, even compsci grads like to think their algorithm analysis courses were useless. But having watched an MIT EECS grad write a web application that was, as I recall, O(n^2), just because he didn't want to use a database, I can only say that the course is essential. Dr Susan Mengel, you were one tough cookie, but, boy, did you teach me that stuff well.

Let the really smart kids (IQ > 130) take traditional CS. They can deal with the theory and advanced math.

The kinda bright kids (IQ 100 - 120ish) can take Computer Programming where they learn to code in the real world. Teach some business-y stuff too while they're at it. Only math required would be high school algebra and geometry. Maybe some trig but that's it.

I have been an employed programmer for about 8 years now; I dropped out of school to get paid instead of paying. Every single person I have had to work with who had a CS degree has had two traits in common. First, they love to remind you they have the degree. Second, they barely contribute anything to production except great ideas about how not to do things.

As a non-degree'd person, I have done contract work for 3 separate universities so far. You would think they would have an infinite supply of proud cheap labour to tap before giving me a call.

Because I'm a software developer in the United States with a Master of Science in Computer Science. All of my coworkers have at least a bachelor's degree in one field or another. And my undergrad very much started with a sink-or-swim approach.

If you are hiring someone to develop code and you must pick one or the other, pick the person who knows how to code. If you can find someone with a degree in CS, math, physics, accounting, philosophy, a natural language, law, or anything else who also knows how to code then hire that person.

A degree in the actual subject matter plus knowing how to program is an especially big bonus. Sometimes the subject matter really is CS. Sometimes it's accounting, medicine, physics, geology, or something else.

If the metric of comparison is employment, you need to be able to produce output rather than cite theory. In fact, I know of no developer, ever, who was hired on the strength of his awareness of theory with no programming ability. There is a chance you could get something like that in emerging fields like machine learning or data analysis, but you'd still have to have some ability to implement your theories or processes. Of course, you'd also have to be an acknowledged expert in the field, and that's not most people.

The real question is not 'whether colleges are teaching the right skills people need to join the workforce'. First you need to address this assumption that the point of college is merely to qualify you for a job or increase your future earnings potential. Because it isn't at all clear to me that it would be a problem if college wasn't preparing people to enter industry - the only thing that a university degree specifically ought to prepare you for is further schooling and life in academia.

A working programmer and a computer scientist are two different things, but the computer scientist should be able to write a basic program [kegel.com]:

A surprisingly large fraction of applicants, even those with masters' degrees and PhDs in computer science, fail during interviews when asked to carry out basic programming tasks.

For programmers, this is a basic test [codinghorror.com], but when a computer scientist can't do something this fundamental, it calls their higher-level qualifications into question; and even if it doesn't, it makes you worry that their architecture or design won't consider real-world issues and implementability.
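The kind of basic task that famously filters candidates in such interviews is the FizzBuzz exercise (which, if memory serves, is what the codinghorror link discusses). A minimal Python version:

```python
def fizzbuzz(n):
    """Classic screening exercise: numbers 1..n, with multiples of 3 as
    'Fizz', multiples of 5 as 'Buzz', and multiples of both as 'FizzBuzz'."""
    out = []
    for i in range(1, n + 1):
        if i % 15 == 0:
            out.append("FizzBuzz")
        elif i % 3 == 0:
            out.append("Fizz")
        elif i % 5 == 0:
            out.append("Buzz")
        else:
            out.append(str(i))
    return out

print(fizzbuzz(15)[-1])  # → FizzBuzz
```

Trivial on paper, yet the point of the linked articles is that a surprising share of credentialed applicants stumble on exactly this.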

Nope. University/College is a scam. Just watch youtube videos and read slashdot and you'll be golden.

Disclaimer: I have both an undergraduate and a graduate degree in computer science, and while I have an amazing job and make more money than I thought possible, I don't think my education had *any* impact whatsoever on my success.

Disclaimer2: This question appears almost weekly and is clearly a circle-jerk...

Title aside, the ability to code is a workplace requirement, and if you are looking at traveling/working internationally, you aren't going to get very far without a degree.

Some of the "college drop out" success stories are no longer just coders. They are now C-Level executives, different rules apply. If you don't have a degree then in general you won't be eligible to get Visas to work in other countries.

Independent of how good you are, without a degree you are restricted to your local geography (country).

Absolutely, to be a top notch software developer, learning to code trumps a CS degree. After all, you can get a degree but who knows how much attention you really paid in class. Further, learning to code goes beyond what is taught in a CS degree.

However, this isn't an issue of theory vs. practice. It's like asking: would you rather have a surgeon who learned surgery the traditional route, with a 4-year degree heavy in science (specifically biology classes) and then med school, etc., or a self-taught surgeon?

... and I didn't have to read the article to know that. Real-world experience is better than "untested outdated group think", I mean "progressive college theory", any day. I know many people who make 6 figures programming who decide "I'll go to college now, to learn even more about programming" and after a semester or two they all say the same thing: "I corrected my professor, he said you cannot do X or Y in Java, I tried to explain how we do it all the time by doing Q and R, but he just looked at me and said 'if

Computer Science is largely very specific applied math and theory. It includes algorithms, algorithm efficiency, a bunch of math, data structures from a theoretical design standpoint, and computer architecture. It tends to be very academic.

University programs vary widely on what the programs focus on, but generally Comp Sci is about the math and theory, and programming is something you do on the side to get the assignments done to illustrate the theory you are learning. With Computer Engineering and Soft

There will always be some skill listed in the requirements that you don't have. When was the last time you saw:
Wanted: someone fresh out of school with 0 years experience in the technologies we are currently using.
Though you might see:

Wanted unpaid/paid intern!

Usually what you will see is something like:

Wanted: highly motivated individual with at least 5 years' experience in C#; MCPD certification is mandatory. Experience with Agile and SCRUM is highly preferred.
You won't get that anywhere except by taking

If you're just going for a code-grunt role where you just need to implement X, then a degree is largely worthless versus general coding ability. By contrast, if you're getting a lot of "I have problem X, can you solve it?" situations, theory becomes a lot more important and the coding a lot less. However, I suspect the bulk of programming positions are more the former than the latter.

By Sturgeon's Law, most colleges that offer CS degrees are diploma mills. That isn't to say they're all scams, and you could certainly learn how to program well despite your shitty education, because after all, you really learn how to program on your own in any case, but simply because they teach things like Java and SQL, instead of things like actual fucking CS, you're not going to learn how to solve interesting software problems. You're going to learn how to be cogs in a corporate hierarchy and do what th

Schools are in the business of producing degrees and protecting their reputation. They aren't in the "teaching" business. If I had known more about computer science when I was in high school, I'd have applied to very different schools. Unfortunately, I needed a few years in my career to really learn how to judge a school.

In the short-term if you take the time to become a master coder on your own, you can probably get a decent job as a coder and avoid a few years "wasted" in school.

However, if you later grow tired of coding, wish to move into management, get RSI and wish to do other IT, etc., then the degree will have notable benefits. HR departments feel more comfortable with degrees when it comes to hiring semi-generalists.

Look at it from HR's perspective: if 19 candidates with degrees and 1 without apply, and HR selects or

The difference between someone with a Computer Science degree and one who's learned practical coding is the difference between a residential-home architect and a construction-oriented master carpenter. The first can design your home and tell you why it's designed that way. The second can actually build it, tell you what goes into the construction and why, and when certain design elements are going to muck up the physical realities. In the end, you're going to need both skillsets unless you restrict yourself

It isn't true that universities don't focus on practical coding. In school I had to code many things, and they had to work. Many students cheat: they have other people do their assignments, or they do the assignments with a study group, or they use Google to find the solutions. They get out of school and cannot program simple things. This is why internships are so important. Companies need to try people out before they commit to them; they need to work with them for a few weeks to see if they are fakers.

College degrees are not the same as those from vocational schools, which is what it sounds like at least some in industry prefer. Nobody graduating with a college degree has a very high skill set for a particular task - not even accounting majors. All start on the lower rungs but are expected to advance rapidly because they have the necessary broad-based knowledge to learn and use new skills effectively. Simply put, they are not one-trick ponies. Did somebody say ponies?

If getting a degree is part of your career plan, it is probably worth it. If you expect to "learn a trade" or "see where you land" or "get an education" you are in for a huge sea of debt and disappointment.

If you are going to college to learn to be a computer programmer, that might be possible, but it will be NP-hard, and I have never seen it done successfully.

There is this kind of false dichotomy that ignores the benefit of a degree because everyone confuses a college education with "learning."

We've been arguing this for more than 20 years. Not much has changed, and it's not a new question. Code Slinger vs. Book Knowledge. College of Hard Knocks vs College of Ivy. I'm a greybeard now, and while I won't pretend to answer the whole question, I will provide some perspective...

I was a code slinger type - right out of high school with some programming knowledge, some commercial success (with the C64), and a whole lotta balls. I did some college, but it wasn't for me at the time. It didn't connect with what I wanted to do, which was code. I joined a contracting house, and they sent me all over the country. I learned more in 10 years doing that than any college would ever teach. Databases, integration, GUIs, network programming, multithreaded programming, and real-world problems, both programming and political. C, C++, Cobol, Fortran, BASIC, assembly (various), and eventually Java.

In the late 90's, I went back to school. Why? Not to learn programming - I was already at the top of my game. I went back to learn all the other stuff, and to do other things. I took psych courses, math courses, art classes, electronics, music, law, languages (Living: French, Dead: Nahuatl)... I did it on my terms (Harvard Extension, no time limits.) I will graduate next year.

Do colleges teach some basics? Sure - Data algorithms and Graphics programming were very useful. Are they realistic? Not really - sometimes horribly so. Massively Parallel Programming was a mess of math decomposition problems I dropped quickly. Did I need them to enter a career of commercial programming? Nope.

I would say college education is not a prediction of coding ability. Having a college degree when you are entering the field can be useful, but having a CS degree IMHO is not any more useful than a general BA or BS. If you go to college, go to get a general education, learn how to think critically, expose yourself to some interesting things - but it is NOT a training program for coders. Technical schools are a whole 'nother thing, and I would avoid them like crazy. My experience is that they do train you, but the training is narrow and short-sighted. In the end, it would be throw-away time, and the student would have very little gained.

College? Sure - go do it. You will be a better person, and you will have some great social experiences. But if you want to code, you need to put the time in yourself and learn the skills. College won't teach you that.

How much of that "nearly half" without a college degree is from the era when computer science degrees didn't exist? I went to a Top 10 engineering school, and we didn't have "Computer Science" or "Computer Systems" until my junior year - the closest was either a Math or EE specialization. There were already people leaving college early directly for industry, because the market was hot in both industrial and military applications. Alternatively, how many are from military training? I have worked with two people, both at the engineering level, both alumni of Air Force technical training in computer-based systems.

I concur with the many posters pointing out that a degree isn't just about coding; I keep having to fix things by people who can code, but never learned to DESIGN.

Actually, in terms of self-awareness, development, and personal growth, experiencing university life can have a tremendous impact beyond the classroom. On average, I'd say it at least doubles your social skills (for some, it's an order-of-magnitude improvement) and improves your quality of life. My personal opinion is that for many young people (and perhaps those not so young) considering this question, this can be a far greater and more important benefit to your quality of life.

Having a degree can also make it easier to get a chance to be considered during a tight job market, and improve chances at negotiating a better salary / contract.

Getting a degree without learning to code will certainly make you the incompetent bane of your co-workers' existence, no matter how short that career may be.

While being aware of the financial realities (and potential opportunities for assistance) of the cost of university, the strongest case is obviously: do both.

Others have pointed out the obvious complementary nature of knowledge (theory) combined with experience (practice). If you don't know what to work towards, you can waste a lot of time and effort doing things the hard way or rediscovering bubble sort and merge sort. And if you don't know what can and can't be done, or how to do it, you end up a hard-working monkey with a very limited playbook. You may find the constant change in technology a burden rather than an enjoyment (I mean after the first 5 years), because in my experience those who understand the fundamentals - those abstract or theoretical bits - can adapt to change more readily and often with dramatically less effort.
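(For what it's worth, "rediscovering bubble sort" vs. knowing merge sort is exactly the O(n^2) vs. O(n log n) gap the theory courses teach. A textbook top-down merge sort sketch in Python:)

```python
def merge_sort(xs):
    """Classic top-down merge sort: split in half, sort each half
    recursively, then merge the two sorted halves. O(n log n)."""
    if len(xs) <= 1:
        return xs
    mid = len(xs) // 2
    left = merge_sort(xs[:mid])
    right = merge_sort(xs[mid:])
    # Merge step: repeatedly take the smaller head element.
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    return merged + left[i:] + right[j:]

print(merge_sort([5, 2, 9, 1, 5, 6]))
```

A self-taught coder can certainly pick this up on their own; the point is that knowing it exists beats reinventing the quadratic version under deadline.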

Most famous university dropouts (in sciences and IT) both made it through admissions, obviously, and, more importantly, left before they could finish their degree: that is to say, they were most likely in their 3rd or 4th year, not flunking out in their first semester (though rough-to-horrible first-year grades aren't particularly uncommon, even for many who later become professors themselves). In a fair number of cases, including some William guy from Redmond, they completed their degree later in life.

In the end; it is what you make of it, just like everything else in life.

Absolutely. For any reasonable definition of 'coder' that approximates the definition of 'employed IT professional', that statement is false.

Clearly we live in different universes.

In yours, it seems that "employed" means "employed by a large corporation whose organizational chart can only be displayed on several bedsheets stitched together", while in mine it means "hired or being engaged to perform work for pay".

My universe is full of people who are employed as IT professionals without having science or engineering degrees. Most of them landed their current positions on the strength of previous work in the field, and are just as capable as their degree-bearing and ring-wearing coworkers. As long as you can do your job, nobody cares what kind of expensive picture frames you hang on your wall.

I have relatives who retired in their 50s after a career in the early days of programming (the 60s-70s era). Almost nobody had a degree back then.

Today? Things changed. If you are giving advice to young people on how to get a coding job, saying "don't bother with a degree" would set them up for a life of hard knocks. Would they still make it? Perhaps, but why make things harder on yourself?

Where I work, only one person was hired without a college degree, and the only way we were able to have him work for one of our major customers was that he was going to school while working. It may be slightly different working in the avionics industry, though (a lot of things are, in many painful ways).

Most coders don't actually program. They just write some lines of code that connect libraries together. They wouldn't know where to even begin if asked to write the libraries themselves, or write the networking protocols, or the operating system, or the compilers, or the GUI frameworks, or the browser, or even a simple scripting language.

I see some of these people in interviews, and they say "I do middleware". I ask them how the product they work on actually works and they honestly don't have any idea.

>>A college degree may not be the best route when it comes to jobs in coding.

If you plan to be employed in the technology field, then you have to have a degree in computer science, engineering, math, or physics. Without a degree you will find it nearly impossible to get past HR gatekeepers. Nobody actually cares where the degree is from, just that you have one.

Sure, you can beat the odds and be The Exception, but life is hard enough already that it is unwise to invite additional difficulties.

Maybe you missed this part of the heading (not even TFA): "Nearly half of the software developers in the United States do not have a college degree."

That isn't just saying not a "computer science, engineering, math, or physics" degree, it's saying any college degree at all. So, presumably a lot more have college degrees with other majors.

So how exactly is almost half plus every programmer with a non-STEM degree "The Exception"? It seems to me the STEM majors are the exception.

This is simply not the case today, especially as applied to a 20-something trying to get a job. If you are still skeptical, I invite you to go talk to HR and ask them what it would take to get an entry-level job without a degree.

Since I'm directly involved in the hiring for my company, I can tell you for a fact that we are desperate for qualified candidates, and their college status is like item #25 on the list of things we care about. Given the incredibly competitive job market we have, the idea that we (or any Silicon Valley company) would turn down an otherwise-qualified applicant simply because they lack a diploma is laughable.

Now, that being said, we have multiple PhDs on staff, so it's not like we're anti-education. I'm just

I wouldn't turn down a qualified applicant either, but the simple fact that HR departments exist makes this an unrealistic goal for any large organization. The resume of such a hypothetical person will never land on our desks. The moment you grow past being a 20-ish-person shop and have to have procedures, policies, equal opportunity, and all that other annoying but often necessary bureaucracy, you lose your "just hire the gal/guy" ability.

Well, to be fair, my current company is a "20-ish-person shop" :-) But that being said, my previous company was 100+ people, had an HR department, and was just as in need of qualified programmers (and just as willing to hire candidates without degrees).

I think you've had experiences with one specific type of company, but you shouldn't over-generalize your experience to assume the whole industry is identical.

Also, keep in mind that for many Silicon Valley companies these days, HR isn't the gateway, the on-sit

If you are still skeptical, I invite you to go talk to HR and ask them what it would take to get an entry-level job without a degree.

Not all companies have HR gatekeepers. HR is there to filter on job requirements. If the job requirements say "or equivalent experience", that's your ticket. If there's no HR department (the case with many smaller companies), then that barrier is gone.

Basically, I'm calling bullshit here. I've known many people, including myself, with very successful careers in IT

Without a degree you will find nearly impossible to get past HR gatekeepers.

Depends where you are. In my experience, what you say is very true on the East Coast. However in California it's not true at all, and I think out here not having a CS degree might even be a slight advantage.

In terms of hiring, I have yet to see a college or trade school that does an adequate job. Fundamentally, I'm hiring people to develop web apps on the MS MVC stack. That requires a bit of theory, architecture, security, and hands-on coding skills. If you can't actually code, you're worthless. I give all applicants a CS101-level coding test. Anyone worth hiring will be done in under 5 minutes. From there, it turns into an interview about your theoretical knowledge and patterns. Anyone without a basic