
BeardedChimp writes "I, like many others here, have learned to program by myself. Starting at a young age and learning through fiddling I have taught myself C++, Java, python, PHP, etc., but what I want to know is what I haven't learned that is important when taught in a traditional computer science curriculum. I have a degree in physics, so I'm not averse to math. What books, websites, or resources would you recommend to fill in the gaps?"

Being self-taught myself, I think the biggest downside is missing some of the strategies and standards that are taught in the mainstream curriculum (i.e., how to properly use the object-oriented model, etc.). Especially when I first started out, my code might get the job done, but it wasn't the cleanest or best approach. Luckily, I think it will come to you in time if you focus on improving your code.

I'd say you need to learn enough mathematics to get an appreciation for what goes into the discovery of an O(n log n) algorithm vs. its [naive] O(n^2) counterpart.
The problem is that you need to know just an incredible amount of graduate-level mathematics to even get a sense of what the Fourier Transform is all about before you can have any sense of awe at how a Fast Fourier Transform would be an improvement over a [naive] "slow" Fourier Transform.

I'd say you need to learn enough mathematics to get an appreciation for what goes into the discovery of an O(n log n) algorithm vs. its [naive] O(n^2) counterpart.

This is a gap I've seen with self-taught programmers: they didn't take algorithms classes where you analyze algorithms for efficiency and write complicated algorithms for (mostly) academic problems. Even amongst university-taught programmers, most people see this class as a waste of time. I've taken it twice (undergrad and graduate level) and find it helps me in my job.

You could teach people all they need to know about big O and common algorithms in an afternoon.

Sorry, but I gotta call B.S. on that one.

You need YEARS of mathematical training to grok this stuff.

Have you ever tried teaching college level programming to recent American high school graduates?

I have had young adults [some already with bachelor's degrees who were coming back to school to brush up] who couldn't reliably compute anything in Base-16 [hexadecimal].

They need the better part of a decade's worth of intensive mathematics training to get to the point that they could really grok the difference between what goes into a "slow" O(n^2) algorithm and its "fast" O(n log n) counterpart.

And let's face it, lots of people doing basic HTML or VBA [Visual Basic for Applications] probably don't have sufficiently high IQs to make that transition.

And even if they do have sufficiently high IQs, then summoning the self-discipline [not to mention just the spare time] to tackle this stuff is going to require a really formidable application of the will.

Which is not to say that it can't be done, but the odds are definitely stacked against them.

Wow. With teachers like you out there, I'm surprised anyone ever learns anything. You denigrate an entire generation of students, as well as programmers of two languages, then justify it by boasting that it took you YEARS to learn something most of us picked up in a semester or two?

I'm not saying someone couldn't learn from someone like you, but the odds are definitely stacked against them.

Intensive math training = knowing that n^2 grows faster than n log n? You don't even need to understand the math for that (Wolfram Alpha or a calculator will tell you). Figuring out whether a program/function completes in O(n), O(n^2), or O(c^n) time is something anyone with a basic understanding of loops and junior-high math can do. O(log n) and O(n log n) are slightly harder, but that's only because most people don't deal with logs in everyday life.
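To make that concrete, a few lines of Python (illustrative numbers only) show how quickly the gap between an n^2 and an n log n step count widens as n grows:

```python
import math

# Hypothetical step counts for an O(n^2) algorithm vs an O(n log n) one,
# just to show how fast the gap widens as n grows.
for n in (10, 1_000, 1_000_000):
    quadratic = n ** 2
    linearithmic = n * math.log2(n)
    print(f"n={n:>9}: n^2={quadratic:.0f}  n*log2(n)={linearithmic:.0f}  "
          f"ratio={quadratic / linearithmic:.0f}x")
```

At a million elements the quadratic algorithm does roughly fifty thousand times more work, which is the whole argument in one number.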

And one more thing: what's the big deal about teaching people hexadecimal? What's the purpose? I can do it, but I've never once thought of a reason I'd want to. Isn't the whole point of the compiler that it does that stuff for you?
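For what it's worth, hexadecimal does still earn its keep in a few places. A quick Python sketch (values invented for illustration) of the usual suspects, bit masks and packed fields, where each hex digit maps to exactly four bits:

```python
# Classic Unix-style permission bits: readable at a glance in hex.
READ, WRITE, EXEC = 0x4, 0x2, 0x1

perms = READ | WRITE
print(hex(perms))          # 0x6
print(bool(perms & EXEC))  # False: exec bit not set

# An RGB color packed into one int as 0xRRGGBB.
color = 0xFF8800
red = (color >> 16) & 0xFF
print(red)                 # 255
```

None of this requires hex, but try reading the same masks in decimal and the point makes itself.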

If you are doing basic bean counting software, you don't really need that much math. Calculus is overkill.

Heck, in some instances, you need to know how to interpret legal documents more than you need advanced math. I remember a case where I had to tweak an asset ledger system in order to help it meet IRS standards for depreciation after a change in ownership.

At the same time, if you are doing engineering or scientific programming, it can be extremely useful to know Advanced Calculus because that is what t

That kind of thing isn't necessary in most programming. Niche knowledge ftw.

I disagree. Maybe you don't need to understand FFTs in your line of work (I don't), but analyzing algorithms for efficiency is absolutely a real-world skill.

If you don't at least understand these concepts, then you don't understand why one algorithm is crap in the real world and another algorithm is preferred. If you don't understand why one algorithm is crap and another algorithm is good -- even though both provide the correct results -- then you have no business writing code professionally. This

Two things overlooked are computational complexity and standard algorithms. True, those are seldom used directly in the real world, but understanding them provides lots of insight when designing bigger systems, even when you're not actually using complex algorithms or calculating the performance of every row of code. Most of the botched implementations of pagination/lazy loading of data come from lacking this knowledge. Most performance problems come from "stuff A works" and if I need to do a lot of A operation

Amen to that. One of the best things I pulled from my undergrad Comp Sci work was an appreciation for how different parts of the system interact, and how things work on the metal. Many self-taught programmers (my brother is especially at fault for this!) just care that the code works, as others have said, but when it doesn't, they don't necessarily have the tools to figure out why. As a very basic example, knowing the ramifications of only one process/thread being able to use a network adapter at a time

Testing and thinking like a QA person [amazon.com] -- there are great resources out there on how to write tests that really have a chance to find and exercise bugs; it requires knowing a bit about the most common programming errors.

Anything not specifically related to churning out "working code":
* proper use of source control
* how to debug effectively, and debugging without leaning on the debugger as a crutch, e.g. the "binary search" strategy
* choosing the right tools for performance analysis and improvement
* how to r [amazon.com]

Self-taught programmers might not know design patterns by name, but they will likely stumble upon the more common ones on their own. When they finally learn about design patterns, they will understand the topic better because they "invented" some of the design patterns themselves. That's how it was for me at least. One day I was explaining something to another programmer, and after my long explanation he just looked at me and said "Oh, so you're using the visitor pattern." I tilted my head, went online, and learned a new name for something I had been using for years.
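For anyone who hasn't met it by name, here is a minimal Python sketch of the visitor pattern described above (class names invented for illustration): the data classes know nothing about the operation, and each operation lives in its own visitor class.

```python
# Shapes only know how to accept a visitor; operations live elsewhere.
class Circle:
    def __init__(self, r):
        self.r = r
    def accept(self, visitor):
        return visitor.visit_circle(self)

class Square:
    def __init__(self, side):
        self.side = side
    def accept(self, visitor):
        return visitor.visit_square(self)

# One operation = one visitor; adding a PerimeterVisitor later
# would not touch the shape classes at all.
class AreaVisitor:
    def visit_circle(self, c):
        return 3.14159 * c.r ** 2
    def visit_square(self, s):
        return s.side ** 2

shapes = [Circle(1), Square(2)]
areas = [shape.accept(AreaVisitor()) for shape in shapes]
print(areas)  # [3.14159, 4]
```

The payoff is exactly the one the poster stumbled into: new operations can be added without modifying the object hierarchy.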

My initial programming experience predated the Design Pattern concept. At the same time, I knew about pattern language, the concept behind design patterns, because I had heard about the architectural version by Christopher Alexander.

Think of a pattern language as being a conceptual construct that does a specific job, be it an entry way for a house, a menu for an application or the splash page for a web site. Certain activities occur in that 'space' and if you know what those activities are, based on past

>Thinking about it, I would say that good programmer and analysts were already looking for design patterns long before Design Patterns became formalized.

I agree. A lot of what appears in the Design Patterns book struck me as "obvious - that's how I've always done it". But again coming from the Ada community, the language and the culture encouraged the kind of bigger-than-just-a-subroutine approach that is also codified in Design Patterns.

In my experience, though (and I got out of Corporate I.T. about 3 years ago), self-taught programmers don't learn formal techniques; instead, they cobble stuff together until it "works". Then, since they built it from scratch, they get all defensive when you attempt to talk about more efficient techniques, etc.

Maybe I've just had bad experiences, but it was my experience that the folks who shouted the loudest had the code that sucked the hardest. (Now, on the other hand, another poster had it rig

This has been my experience as well. Though I don't have a CS degree, I had enough formal programming work as an Engineering major to understand how to do it mostly right. When I was lead Internet developer for a computer sales company I ran into a large number of developers who excelled at finding things online that could do part of their overall goal and then massaging the outputs to be able to string them together. Yes, the solutions worked but looking at the code reminded me of a car built by collecti

While working on a small project, I was figuring out the problems and solutions to "databasing" in text files. (avoiding collisions, indexing, etc.) And, yes, when I finally took a database class, those topics were much easier. Then I felt rather pleased with myself for figuring out the issues correctly.

Design patterns are useful, but if you can't arrive at the need for them on your own, and assume that every piece of code written must be fit into the patterns you know of, you overlook some fast and powerful solutions. Granted, there's always a risk of vastly increasing clutter in systems that don't make proper use of design patterns. I agree with the sibling post in that there is value in not knowing them first, but learning them after you've recognized walls that are difficult to tackle.

Although in practice most advanced data structures and algorithms are not used day to day, it is useful to study them and implement them yourself at least once. Dijkstra's algorithm, Prim's, Kruskal's, maximum flow, and other basic graph algorithms are a good example.
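As a concrete example, a short Python sketch of Dijkstra's algorithm using a binary heap (the graph and node names here are made up):

```python
import heapq

def dijkstra(graph, start):
    """Shortest distances from `start` in a graph given as
    {node: [(neighbor, weight), ...]} with non-negative weights."""
    dist = {start: 0}
    heap = [(0, start)]
    while heap:
        d, node = heapq.heappop(heap)
        if d > dist.get(node, float("inf")):
            continue  # stale heap entry, already found a shorter path
        for neighbor, weight in graph.get(node, []):
            nd = d + weight
            if nd < dist.get(neighbor, float("inf")):
                dist[neighbor] = nd
                heapq.heappush(heap, (nd, neighbor))
    return dist

graph = {"a": [("b", 1), ("c", 4)], "b": [("c", 2)], "c": []}
print(dijkstra(graph, "a"))  # {'a': 0, 'b': 1, 'c': 3}
```

Implementing it once makes the O((V+E) log V) cost of the heap version, versus the naive O(V^2) scan, stick in a way no textbook table does.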

Yes, I think if you are self-taught, you are already great at the practical matters (probably including stuff like design patterns), but you might get in trouble when trying to devise effective algorithms and data structures for a particular problem that might not be difficult _technically_, but computationally quite so.

It's important to have a basic concept of time/space complexity (the O-notation) and understand at least a couple of basic algorithms (my picks from the whole spectrum would be QuickSort, dept

Seconded. As a self taught programmer, classic algorithms are always my weakest link when I do an interview. Unfortunately, it's also the most glaring, since they usually teach that kind of stuff early on in COMP SCI.
I had an interview with Amazon a while back and in preparation bought the book "Programming Interviews Exposed". It's a pretty good resource for getting your hands dirty with all the algorithms that a general programmer should know.

Experience helps, but if you don't have a foundation of theory you are going to start wrong in the first place.

The biggest issue is that the person doing the model knows there is actually a wrong way and that they want to seek the correct way of doing things. Yes there may be many correct solutions (each with their own trade-offs), but there are solutions that are just outright wrong. In my opinion not doing the wrong thing takes education and interest. Experience helps you choose the correct solution wit

Maybe CS degrees have changed since most of the people I work with got their degrees, but I don't think they got much in the way of data modeling theory.

It wasn't part of the core classes where I attended (though it should be). There were 3rd- and 4th-year courses that focused on modeling, such as object-oriented design and development, and systems analysis and design.

Depending on what a student decided to focus on it was possible to completely avoid any courses that involved modeling. I worked as business/system analyst while in school which got me interested in design and also the business side of software.

Others have slightly different styles, conventions, and ways of solving problems. Working on something like a large open source project could teach you about working on a team, where one person can't "own" a whole part of a program. (And cleaning up others' code will greatly help you learn about documenting and formatting your own code.) One good team assignment we got: each person worked on one part of a program. Away from computers, on a whiteboard or whatever, we had to decide the inputs, outputs, etc. between parts, then code separately. The grade on that assignment was based on how well the program behaved when the teacher, in front of the class, compiled the separate parts and ran them combined for the first time.

I'd have to totally disagree on that. What you're talking about, is something (almost) any programmer is going to learn, simply in the course of having a job. It doesn't even have to be a programming job.

Design Patterns: common "template" solutions to regularly encountered problems and variations on those problems. Be careful when learning these that you don't fall victim to "when you have a hammer, everything is a nail". Also learn the anti-patterns; Wikipedia has a good list of them.

Algorithms & Data Structures: analysis of average running time (big O) is most important, but understanding worst-case runtime matters too. Also: designing algorithms vs. knowing when to leverage an existing one.

the C++ standard library provides a great many of these: it has a high-efficiency sort (from <algorithm>), and it has good collection data structures (vectors, linked lists, maps, etc.)

Object-Oriented Analysis and Design: knowing when to make something an object, when and how to use inheritance and polymorphism, and when not to make something an object. Plain old data objects. Separation of responsibility: UI is not logic, logic is not UI.

To be fair, I got BS degrees in Computer and Software Engineering, but if we consider only the required courses from both curricula, I gained:
- NO education on design patterns
- A very limited understanding of how to apply big O analysis to an algorithm (although I do understand what it represents)
- Almost NO attention to "proper" OO analysis and design. Any knowledge of this I have is from personal experience during school and professionally
- only a cursory look at threading.
- A good understan

Ouch, yeah, I think that says something about the CS curriculum at that school. The list the grandparent posted is pretty much the set of core classes I had to go through in college. Really, the only optional thing I think everyone should have to take as well is a senior design project course sequence like most engineers do.

My school's core curriculum is actually how I crafted the list. Prior to high school I was a self-taught programmer; in high school I was blessed to have a computer science class taught by a former lead developer from Apple (Newton platform).

I looked at my university curriculum and deduced what I would never have properly learned as a self-taught programmer.

Before my high school teacher educated me in the right ways to do things I was the worst type of BFI self-taught you'd ever seen: like the vast majority o

I have BS in Information and Computer Science and I must say that my experience was remarkably similar to the parent's. One problem, IMHO, is that the major is called "computer science" when it should really be called "computational science" instead. People assume that the degree is all about computers and their use but this is a misnomer. It is more correct to say that "computer science" is a specialized study of mathematics as it relates to computation and its complexity with some practical high level com

There's way too much emphasis on Software Engineering in these comments. Multithreading? Self-taught programmers are the only people who ever have the time to study IA-32 and associated errata to the point where they're able to write a reliable threading library. Object-oriented design? Computer architecture? Design patterns? All of this is easily within the reach of a self-taught programmer.

Knowing when to make something an object... when to not make something an object.

This is something even experts struggle with. As a long-time skeptic of OOP[1], I've tried to find a consensus about the "proper" way to architect an app using OOP so that I have something consistent to dissect and publicly analyze besides "toy" examples, but found the OOP author/proponents' answers either vary widely per practitioner or are too open-ended.

They pretty much say, "I know good OO when I see it, but I cannot write

Is this a concept with both data and functions associated with it? If yes: good candidate for an object.
Is this a linear task (A->B conversions, etc.)? If yes: poor candidate for an object (though the data it is working on might be a good candidate).
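That rule of thumb in a small hypothetical Python sketch (all names invented for illustration):

```python
# Data + behavior that belong together: a reasonable object.
class BankAccount:
    def __init__(self, balance=0):
        self.balance = balance
    def deposit(self, amount):
        self.balance += amount

# A linear A->B conversion: a plain function does the job;
# wrapping it in a CelsiusConverter class would add nothing.
def celsius_to_fahrenheit(c):
    return c * 9 / 5 + 32

acct = BankAccount()
acct.deposit(50)
print(acct.balance)                # 50
print(celsius_to_fahrenheit(100))  # 212.0
```

The test is simply whether state and operations genuinely travel together; if not, a function is the honest answer.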

I was learning and coding on my own steam for about 15+ years. Then I joined the ACM [acm.org] (two years now) and my eyes opened. I am now about 1/3 through a B.Sc in CS (part time) and I'm also following a CPD program at another university. I have also joined the IEEE [ieee.org] as I required access to more material for my studies. What I realized was that I should have done it from the start. So my advice is simply this: start to follow some part-time programs and get the theory as well. In the last two-odd years I have learned a lot about subjects like modelling, quality assurance, frameworks, and architectures which I otherwise would not have known. I also found that the quality of my code has greatly improved, since I now work in a much more structured way.

Experience helps, but the real killer deal is experience backed by a CS/Eng. degree.

If you know your algorithms and data structures, and have a firm grasp of the architecture of modern computer systems, you'll be way ahead of a depressingly large proportion of people with degrees in CS that come past me in interviews.

The most informative and entertaining book I can recommend on algorithms is Bentley's "Programming Pearls".

As a recent CS grad (dec 2008, but that was "school 2 years", "break work in field 2 years", "school 2 years") I can attest to the lack of skill of some of the people who only retain information for the duration of the class they're in. What was even more disturbing was in my graduating class (only 8 of us) the two of us with the most experience: academic, open source, professional full time work in the field were the LAST to get jobs.

As a recent CS grad (dec 2008, but that was "school 2 years", "break work in field 2 years", "school 2 years") I can attest to the lack of skill of some of the people who only retain information for the duration of the class they're in.

That is a great argument for cooperative studies. At the school I attended engineering students were forced to do "full coop" (4 terms of actually working in something related to their field). That wasn't a requirement for CS (despite being in the same faculty). I ended up doing a partial coop (2+ semesters) because I saw the value in it. I found that course work was significantly easier when I had some real world experience I could apply coursework to. Ideas stuck because I could immediately think of real

What gaps do schooled programmers have that self-taught programmers don't? While a self-taught programmer might go about getting the job done differently, I can almost always count on him to get it done. Programmers coming out of school often still have a horrible work ethic, especially when compared to their self-taught peers. Granted, I have very limited experience, so I wouldn't cast that judgment over all, but I would be curious to hear what others think.

SCC and software engineering walk hand in hand: everything from how useful unit tests can be after merging code, to how to design code so that large teams can work on it without stepping on each other's feet.

I'd suggest:
a. Debugging techniques (but then I strongly prefer design/language approaches that minimize debugging in the first place)
b. Programming-in-the-large, including (i) program structure; (ii) maintenance/documentation considerations (that's true for programmers who have worked on large, well-run projects)
c. MAYBE multiple programming languages - as I wrote in another posting on this thread, I will not

All these things were taught at the university I went to. In my (admittedly limited) experience, the people who don't get taught all these things [and other things mentioned elsewhere in this thread as lacking from university] are those who went to community college. The community colleges here have computer programming programs that teach absolute garbage.

Then again: I'm in Iowa, and I went to Iowa State University (where the automatic digital computer was invented)

I'd say that formally taught programmers may not have much experience with maintenance programming, especially with legacy systems that have been running for years. They are used to 'blank sheet' programming assignments that allow them to control the entire project from start to finish. They don't have to deal with code that has been modified dozens of times over the years, often without much documentation.

I got my professional start in programming doing maintenance programming under the supervision of a

Self-taught folks tend to have less in their toolkits -- when all you've learned is a hammer, everything looks like a nail. Then, when trying to do something different, they try to take their hammer and "invent" a screwdriver out of it.

Kids right out of school aren't ready to take practical problems and successfully solve them -- that work ethic tends to take a year or two to get down (and rightly so, since CS programs aren't I.T.-drone-cookie-cutter factories), but at least they know enough about the la

Not understanding the business reasons behind the decisions being made. The PHB's don't care what language you use or algorithm or any of that (usually). What they do care about is that it's on time, has the features they want, and gives them a high return on investment.

I've sat in meetings where a programmer will go on about why a certain feature can't/shouldn't be done for a variety of technical reasons and gets nowhere. And right after that another programmer will mention that the feature can't be do

Generalizing... Self-taught usually means that the person has a technical interest in programming. You're likely to find more of them to be competent as they've spent the time scouring the manuals to figure out how the thing works. From there you have to find out what else they have... do they have good business fundamentals to understand how to apply programming to what you need for your business, or good math fundamentals if your need is complex scientific programming?

This is what I was thinking myself. The self-taught programmer has already demonstrated enough passion in the field to know there is genuine interest in their work. I cannot stress how highly I value that drive as I feel anything is possible so long as the drive is there.

With people coming out of school, you are going to have to sift through quite a bit to find those few who have true passion for their work. Moreover, the grads tend to have this attitude out of school that they know what they are doing -

Agreed. Both sides can learn from each other. Knowing theory tells you what's possible, and street knowledge tells you what's practical. Theory cannot fully substitute for street knowledge and vice versa. (By the time you master both, you get carpal tunnel and move into management. ;-)

I highly recommend Lewis' Elements of the Theory of Computation. Garey's Computers and Intractability seems to get quoted a lot as well. I'm not sure how important this stuff is in everyday computing, but if you want to learn computability, these two cover everything.

"Computers and Intractability" is more like a list of problem reductions among NP-complete problems, plus references.
If you want to write "Problem XYZ is NP-complete," it is a good book to cite, but it's not really a book that teaches you anything...

I have a CS degree from a major university. I have to disagree with most of the comments I've seen so far. Things like design patterns, proper object modeling, even advanced data structures and algorithms can be picked up on your own with a bit of effort as you need them, and experience building real production used software is the key to hone those skills.

IMHO there are two things that I got from school. How to properly analyze code (in terms of processing time, memory usage,...) so that I could acc

"Can be picked up on your own" and "were picked up on your own" are rarely the same thing. I was a self taught programmer until high school, where there was CS class (And later AP CS) taught by a former Apple Newton platform lead developer. As a self taught, I was fairly typical from everything I read.

He took that raw talent, and enthusiasm, and turned them into real talent.

I've been doing this for a few years and the one gap I'm seeing more and more of doesn't actually have anything to do with programming techniques, "design patterns" or anything else that's hugely technical. All of these things are pretty well-known and accepted by everyone, and you can always be sure that there'll be someone around pushing one or another of them as the be-all and end-all of Programming.

The one gap you might have as a self-taught programmer is in fact in the _history_ of computer science.

I've found that self-taught programmers can actually be quite productive. However, I've noticed (in general) the following deficiencies, which I think are both rooted in the fact that the need to memorize seemingly arbitrary facts about a system is inversely proportional to the depth of understanding of that system (see graph [typepad.com]):

-Design Patterns (noted earlier by others): There is a tendency of self-taught programmers to follow a design pattern more doggedly than others. This can be tied back to the fact that for the self-taught a particular design pattern represents what programming is to them. They memorize a series of facts that support the design pattern they use rather than understand the nature of a design pattern itself. They tend to have steeper learning curves when presented with new structures and design patterns because using a new design pattern requires the abandonment of the facts they've memorized and starting anew with memorizing a new set of facts.

-Adaptability: Self-taught programmers tend to reach a certain level of comfort with a technology (i.e., languages/libraries/etc.) and attach themselves to it. The thought of using a different language, library, or system is daunting (or even aggressively resisted) since, again, changing requires memorizing a new set of facts about the technologies (see graph).

Much of what you should learn formally from a CS degree is WHAT a programming language is or WHAT a design pattern is, not merely HOW to program or HOW to use a particular design pattern.

That said, there's nothing stopping a self-taught individual from learning these things on their own. It's just that when you're teaching yourself a trade you, naturally, immediately (and sometimes exclusively) focus on things that allow you to compete on a particular level or with a particular technology. Learning design patterns or what programming is in the abstract doesn't seem to have an immediate payoff (clients aren't going to ask you about those things). But they are skills which allow you to be competitive across technologies or design patterns which is especially important in the rapidly changing world of computers.

So it's not as much about the gaps in your knowledge as a gap in the way you think about things: the way you make simple models and use them to reason about basic properties of any system you are proposing to build. You can get the general outline logically correct before you try to make the details perfect.

Having said that, the subjects that opened my mind were:

Functional languages - they make you think differently. You realise that there is another way, and maybe you realise when and why you'd choose it.

Basically you write a normal program and you say which data structures you wish to be persistent (e.g. the object representing your address book in an email program). That's really all you have to do. The next time you run your program that dat

As a self-taught PHP and C# Developer, the biggest trouble has already been outlined as limited exposure to new concepts. The bigger question, however, is how to gain exposure.

#1 - User Groups
I personally don't attend user groups because I have 2 jobs, and 2 kids, however, the Ruby community has shown again and again that it works, not just for the new stuff, but for the old stuff. They just overhauled Rails and as long as the community keeps talking, they'll do it again and again to perfection.

I'm about 3/4 self-taught, I had some CS courses in college (before there was even a CS minor in the school) and a couple of grad school courses (which I can't say I got all that much from.)

But here's my list, based on what I've experienced over the last 30 years:

analysis of algorithms, "Big O" and similar things. If you've read a basic data structures book, you -might have- seen this stuff. But it's really important theory to understand. I'd rate this as the #1 gap; people who don'

It was fun even 15 years ago to see Pascal programs full of goto loops, from people who learnt plain BASIC by themselves and then "upgraded" to a more serious language. Their very thinking about the flow of a program, independent of the language used, was wrong because that "programmer" never understood some basic programming concepts. Programming has evolved with time, and the chances of doing it all wrong because of missing or not fully understood key concepts are bigger now.

cynicism, hopelessness, futility, frustration, despair, indignation, indentured servitude, the gut wrenching emptiness that hits when a project you've poured your heart and soul into gets canned right when it's almost ready to release just because the ceo read some article about how everything should be done in some new sexy framework...

- study of algorithms (big-O notation with case studies on sorting algorithms); This one completely changed the way I view program efficiency

- formal languages / compiler theory (grammars and parsing have never been the same for me since). This is something you will look at whenever you write any low-level parsing/validation: XML, functional/expression editors, and even program-parameter parsing in some cases.
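To illustrate the parsing point, here is a toy recursive-descent parser in Python for a tiny arithmetic grammar (the grammar is invented here for the example; it is not from any standard):

```python
# Grammar:
#   expr   := term (('+' | '-') term)*
#   term   := factor (('*' | '/') factor)*
#   factor := NUMBER | '(' expr ')'
import re

def tokenize(s):
    return re.findall(r"\d+|[()+\-*/]", s)

def parse(tokens):
    pos = 0
    def peek():
        return tokens[pos] if pos < len(tokens) else None
    def take():
        nonlocal pos
        tok = tokens[pos]
        pos += 1
        return tok
    def factor():
        if peek() == "(":
            take()           # consume '('
            value = expr()
            take()           # consume ')'
            return value
        return int(take())
    def term():
        value = factor()
        while peek() in ("*", "/"):
            op = take()
            value = value * factor() if op == "*" else value / factor()
        return value
    def expr():
        value = term()
        while peek() in ("+", "-"):
            op = take()
            value = value + term() if op == "+" else value - term()
        return value
    return expr()

print(parse(tokenize("2 + 3 * (4 - 1)")))  # 11
```

Note how operator precedence falls out of the grammar's structure (term binds tighter than expr) with no precedence table at all; that insight is the real takeaway from a compilers course.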

You almost certainly already have some grasp of Complexity Theory since it governs why e.g. mergesort is faster than bubblesort. I personally found it a somewhat dull topic but it is probably worth delving into a bit for "self improvement" purposes.

Functional programming is worth playing around with. US universities tend to focus on Lisp, I think. ML and Haskell are often used in the UK and have a very interesting type system (proponents say that it's about the most advanced one out there) that it's also worth being aware of. Haskell is also a lazy language, which is interesting although you're unlikely to encounter it anywhere else! Some of my ML programming course dealt with how to build lazy data structures without explicit language support, which was potentially a useful technique.
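The "lazy data structures without explicit language support" idea translates naturally to Python as well: generators give you on-demand evaluation without any special type system. A small sketch (illustrative only):

```python
import itertools

def naturals():
    n = 0
    while True:      # conceptually infinite, like a Haskell list
        yield n
        n += 1

def lazy_map(f, xs):
    for x in xs:
        yield f(x)   # f is applied only when a consumer asks

# Nothing is computed until islice pulls the first five values.
squares = lazy_map(lambda x: x * x, naturals())
print(list(itertools.islice(squares, 5)))  # [0, 1, 4, 9, 16]
```

The same pattern (suspend computation behind a thunk, force it on demand) is what the ML coursework builds by hand with functions.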

Others have mentioned design patterns. I guess it's worth looking at those since even though you might instinctively know some, it's easier in an interview if you can *name* them so they know you know what you're talking about.
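As an example, the Strategy pattern is one many self-taught programmers use instinctively without knowing its name. A minimal sketch, with hypothetical names of my own invention:

```python
# Strategy pattern: swap an algorithm at runtime behind one interface.

def flat_rate(order_total):
    return 5.0

def free_over_50(order_total):
    return 0.0 if order_total >= 50 else 5.0

class Checkout:
    def __init__(self, shipping_strategy):
        # The strategy is just a callable the class delegates to.
        self.shipping_strategy = shipping_strategy

    def total(self, order_total):
        return order_total + self.shipping_strategy(order_total)

standard = Checkout(flat_rate)
promo = Checkout(free_over_50)
```

If you've ever passed a comparison function into a sort routine, you've already used this; knowing the name just lets you say so in an interview.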

Don't overlook the knowledge to be gained by working with someone who's already been there. Try to find someone who does (or did) the same kind of programming that you do and open a dialog with them. Swapping war stories with someone who has 10 or 15 years more experience can help you decide what kind of things you still need to learn, and what direction you want your career to take.

One of the biggest ones I've seen is that self-taught programmers tend to not think about algorithm efficiency. Learn how to determine the big-O of your functions and learn how to code more scalable algorithms.
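A small illustration of the scalability win that big-O thinking buys you: the same hypothetical duplicate check written two ways.

```python
# The same "find duplicates" task at two different complexities.

def has_duplicates_quadratic(items):
    # O(n^2): for each element, scans the rest of the list.
    for i in range(len(items)):
        for j in range(i + 1, len(items)):
            if items[i] == items[j]:
                return True
    return False

def has_duplicates_linear(items):
    # O(n): one pass with a set of already-seen values.
    seen = set()
    for item in items:
        if item in seen:
            return True
        seen.add(item)
    return False
```

Both are "correct," and on a ten-element list you'll never notice the difference; on a million-element list the first one is the bug report you get six months after shipping.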

IMHO it's not going to be a specific gap; the gap is going to be quantitative. Formally taught programmers will have fewer of them, because they were "forced" to work on stuff that doesn't interest them or otherwise wouldn't ever come up in their projects (or doesn't appear, to them, to come up in their projects). You can be self-taught and have a shitload of professional experience, and yet that experience can be very narrow, whereas a CS graduate is going to have most of the bases covered.

In my experience the biggest problems are caused by self-taught programmers who lack the humility to realize that computer science and computing form a large field, and that they are not domain experts in all of it. Seriously, I'm saying this not as an insult, but as a plea for self-taught programmers to take their blinders off and admit to themselves that there is a lot of knowledge about computer science and computing (or IT), which has a rich if relatively short history. The ones who get past that chip-on-their-shoulder attitude tend to do fine.

By getting a college degree, you are ensuring a literacy level of at least what would have been a seventh-grade education 30 years ago. With just a high school diploma, one's literacy level would only be at about a third grade level 30 years ago.

Computer programming is of little value without the ability to communicate.

I have to hire college graduates to change diapers at the school I run -- to ensure that when they do speak to the children, they do so with correct grammar.

Database design - If you don't know the formal procedure for database normalization, you shouldn't be designing a schema -- not because you'll necessarily use that procedure, but because learning it teaches you the factors to take into consideration.
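A minimal sketch of what normalization buys you, using Python's built-in sqlite3 (the schema and names here are hypothetical): facts about a customer live in exactly one place, and a foreign key keeps orders honest.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")  # SQLite enforces FKs per connection

# The unnormalized temptation is one wide table repeating the customer's
# city on every order row. Normalized: customer facts stored once,
# orders reference them by key.
conn.execute("""CREATE TABLE customers (
    id   INTEGER PRIMARY KEY,
    name TEXT NOT NULL,
    city TEXT NOT NULL)""")
conn.execute("""CREATE TABLE orders (
    id          INTEGER PRIMARY KEY,
    customer_id INTEGER NOT NULL REFERENCES customers(id),
    amount      REAL NOT NULL)""")

conn.execute("INSERT INTO customers VALUES (1, 'Ada', 'London')")
conn.execute("INSERT INTO orders VALUES (1, 1, 9.99)")

# The foreign key now rejects orders for customers that don't exist:
try:
    conn.execute("INSERT INTO orders VALUES (2, 42, 1.00)")
except sqlite3.IntegrityError:
    pass  # constraint violation, as intended
```

Update Ada's city once and every order "sees" it; in the wide-table version you'd be hunting down stale copies, which is exactly the anomaly normalization is about.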

System architecture - I think the one most useful class I ever took was the one that stopped the CPU from being a black box. I had an old-school professor, and he went deep -- we started off with digital logic and Karnaugh maps.
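In that spirit, here is a toy sketch of the bottom-up approach: a one-bit full adder built entirely from NAND, simulated in Python (my own illustration, not anyone's actual coursework):

```python
# Everything below is derived from a single primitive, NAND -- the kind
# of exercise that turns the CPU from a black box into gates.

def nand(a, b):
    return 0 if (a and b) else 1

def not_(a):    return nand(a, a)
def and_(a, b): return not_(nand(a, b))
def or_(a, b):  return nand(not_(a), not_(b))
def xor(a, b):  return and_(or_(a, b), nand(a, b))

def full_adder(a, b, carry_in):
    # One bit of a ripple-carry adder: returns (sum, carry_out).
    s = xor(xor(a, b), carry_in)
    carry_out = or_(and_(a, b), and_(carry_in, xor(a, b)))
    return s, carry_out
```

Chain 32 of these and you have the integer adder in your ALU; from there, pipelines and caches stop being hand-waving.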

I am about half self-taught and half college-taught. I am currently looking for jobs, and despite having 10 years of programming experience, knowing more than a handful of languages, and understanding OO and algorithms, I don't meet the minimum requirements for most jobs I'm looking at. Why? I don't know SQL well. It's one thing if you're looking to work at a major, kickass place like Google or Microsoft, but a lot of the smaller shops are looking for people who can write client-server code in Java and SQL.

In my experience, at least half of the programmers with degrees don't grok databases at all, and should never be given access to the database. It's not just SQL, either. Sure, they can muddle their way through a select, maybe a group-by, they might even attempt a few joins -- but they have absolutely no clue how to design a database properly, what NULL is for, what foreign keys are for, what cascade rules are for, what triggers are for, what views are for, what LEFT vs. RIGHT vs. INNER joins do.
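For anyone in that position, the LEFT-vs-INNER distinction fits in a few lines of Python's sqlite3 (the tables here are hypothetical):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE authors (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE posts (id INTEGER PRIMARY KEY,
                        author_id INTEGER REFERENCES authors(id),
                        title TEXT);
    INSERT INTO authors VALUES (1, 'Ada'), (2, 'Grace');
    INSERT INTO posts VALUES (1, 1, 'Hello');
""")

# INNER JOIN: only authors who actually have a post -- Grace disappears.
inner = conn.execute("""SELECT a.name, p.title FROM authors a
                        JOIN posts p ON p.author_id = a.id
                        ORDER BY a.id""").fetchall()

# LEFT JOIN: every author, with NULL (Python None) where nothing matches.
left = conn.execute("""SELECT a.name, p.title FROM authors a
                       LEFT JOIN posts p ON p.author_id = a.id
                       ORDER BY a.id""").fetchall()
# inner -> [('Ada', 'Hello')]
# left  -> [('Ada', 'Hello'), ('Grace', None)]
```

The rows that NULL out in the left join are precisely the ones the inner join silently drops; picking the wrong one is how "authors with no posts" quietly vanish from a report.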

Well that's exactly my point. I am a good programmer, even great when I'm in my element. But you have (almost creepily) described exactly my knowledge about databases and exactly my gaps. Everything I know about SQL I learned from hacking around with web programming and Android using SQLite on my own time.

But in the real world, how much software DOESN'T use a database in some form as its backend, if you look at it as a percentage of all the software being written? How many jobs are out there that require it?

I wish I could point you to a good book on the subject. Date & Darwen are too theoretical; Celko is too tricksy. You're not an idiot, you're just ignorant, so you don't need an idiot's guide; you want to come fully up to speed, not just get your feet wet, so you don't need an "in 24 hours" book; you want to know how to use it, not every last option available, so you don't want a "complete reference." Looking at the list of books on Amazon, none stand out as useful. Maybe I'll just have to write it.

1. Learn how to use source code repositories (including branching, merging, and resolving conflicts), and how organizations do releases.
2. Learn how to do code reviews, and how to adapt to local practices and style guides.
3. Learn how to be flexible.
4. Be humble.

Self-taught programmers frequently develop and religiously stick to their own favorite practices, code formatting, programming languages, IDEs, etc. When you're part of a team, though, your code needs to be readable, understandable, and maintainable by everyone else.

Both self-taught programmers and school-taught programmers have gaps. I feel like I got pretty good exposure to both scenarios, since I was self-taught but did go to college, where I had some of those gaps filled in (and also took plenty of courses where I learned precious little).

One thing that you're going to be seriously lacking in knowledge is code management.

You've got to use something like CVS or Subversion to keep track of your code (both tools are free). If you aren't using a version control system, then you don't have code; you have a bunch of text files that happen to compile. You can check differences, make rollbacks, and use a lot of great tools to track what's been going on in your codebase. You can try something new and then either integrate it or dump it. I used TortoiseCVS as the front-end.

"Happy families are all alike; every unhappy family is unhappy in its own way." The same goes for self- (or partially) taught developers: each will have a different gap. It depends on the developer's interests, project history, etc. A self-taught game developer may have a solid grasp of 3D and analytic geometry, while a self-taught web developer may have a solid grasp of database theory. Presumably, a developer who went through academic training will know at least something about both (and many other issues), depending on the program.

I'm learning C++ using Stroustrup's 2009 "Programming -- Principles and Practice Using C++" textbook. This book is huge, but well-written for both first-semester college students and self-taught programmers.

I opted for C++ instead of C because I wanted a multiparadigm language to help me learn different approaches to coding. The procedural paradigm is strong and useful for learning how computers work at a low level, but C++ provides other useful paradigms as well, which might come in handy when you jump to a different language.

Learn one language well, and other languages will be easier to pick up, yes. But calling C++ "multiparadigm" is like calling Java "Object-Oriented". It makes you sound smart, and it even makes sense when you know a little about the subject. Then you learn what the term actually means.

I was introduced to all that at university, and while I don't doubt that it's granted me a better insight, I can't say for sure that it's ever had an effect on the code I've been paid to write. Most of these questions I read about what makes a "good" developer amount to so much navel-gazing. There are of course a few basic things that every developer must be able to do, but beyond that it really depends on the person's goals, the job, etc.