I'm 16. I started programming about a year ago when I was about to start high-school. I'm going for a career in programming, and I'm doing my best to learn as much as I can. When I first started, I learned the basics of C++ from a book and I started to learn things by myself from there on. Nowadays I'm much more experienced than I was a year ago. I knew I had to study by myself because high-school won't (likely) teach me anything valuable about programming, and I want to be prepared.

The question here is: how important is it to study programming by oneself?



The languages I use on a daily basis didn't exist when I was going to college. So self-teaching is very important for learning new technology if you plan on being a developer for more than a few years.
– Jon Strayer, May 7 '12 at 15:03


You noticed that now you are much more experienced than you were a year ago--in fact I bet a year ago you didn't know how much you could learn! I find this happens every 2-5 years, I look back and go "Wow, I had that all wrong and didn't even know it was possible to do it better". Self-teaching is critical, all the time, if you want to be even moderately good. Furthermore I wouldn't want to work with someone who couldn't learn that way.
– Bill K, May 7 '12 at 18:47


When I was your age, I wouldn't have been able to not take up programming by myself even if I had tried. It's way too much fun. And yes, it turned out to be extremely important and helpful for my career, but that was just a nice by-product. Do it for the fun of it, and you won't have to think about whether you'll need the experience or not.
– Ben Lee, May 7 '12 at 22:11

17 Answers

It's critical. I don't think I've ever known a good programmer who wasn't self-taught at some level. As a hiring manager at a large company, I can say that a candidate who describes personal projects and a desire to learn will trump one with an impressive degree every time. (Though it's best to have both.)

Here's the thing about college: Computer Science courses teach theory, not technology. They will teach you the difference between a hash table and a B-tree, and the basics of how an operating system works. They will generally not teach you computer languages, operating systems or other technologies beyond a shallow level.
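To make that hash-table-versus-tree distinction concrete (my illustration, not part of the original answer): a hash table gives fast unordered lookup, while a tree-like sorted structure keeps keys ordered, which is what makes range queries cheap. A minimal Python sketch, using a sorted list as a stand-in for a B-tree:

```python
import bisect

# Hash table (Python's dict): O(1) average-case lookup, no key ordering.
ages = {"carol": 31, "alice": 25, "bob": 40}
print(ages["bob"])  # 40

# A sorted list as a stand-in for a B-tree: O(log n) lookup,
# but the keys stay ordered, so range queries are cheap.
keys = sorted(ages)                  # ['alice', 'bob', 'carol']
lo = bisect.bisect_left(keys, "b")   # index of first key >= 'b'
hi = bisect.bisect_left(keys, "c")   # index of first key >= 'c'
print(keys[lo:hi])  # keys in ['b', 'c'): ['bob']
```

The theory course teaches you why you'd pick one over the other; which library class implements each is the part you end up learning on your own.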

I remember back in the mists of time when I took my first data structures class and we got a thin manual for this new language called "C++" that the department had decided to start teaching. We had two weeks to pick it up well enough to write code. That was a good lesson in and of itself. That's the way your career will go.

Your school will likely not teach you what you need to get a good job. Schools often trail what's hot in the industry by many years. Then you'll get a job. Whatever company you go to will almost certainly not spend any particular effort to train you. The bad companies are too cheap, and frankly the good companies will only hire people smart enough to pick it up as they go.

I graduated college in 1987. I went to work as a C programmer with expertise in DOS, NetBIOS and "Terminate-and-Stay-Resident" programs. In the years since, I have had little if any actual training. Look at the job ads... not much call for those skills! The only reason I can be employed today is because I've spent the intervening years constantly learning. To succeed as an engineer, you have to have the habit of learning. Hell, I'd go beyond that: you have to have the love of learning. You need to be the sort of person who messes around with WebGL or Android or iOS because it looks fun. If you are that sort of person, and maintain the habit of learning, you'll go far in the industry.

This. Theory is very important too, and it's harder to learn on your own. I do disagree about company training, though: our field is very much in demand these days and companies are having trouble finding good talent (at least in my area).
– user606723, May 7 '12 at 5:17


Theory is important, but it's easier to learn it yourself than it used to be, since videos of many university classes are available online. I dare say one might learn more from watching the lectures of a great professor online than from being physically present at the lectures of a mediocre one.
– Jeanne Pindar, May 7 '12 at 13:52


But it's true that you should always work on projects yourself; university just offers many starting points. I'd say that if you're not curious, if you don't want to try things out, you shouldn't want to work as a programmer full time.
– moose, May 7 '12 at 19:02

Self-teaching is very important. You cannot rely on a formal education to teach you everything you need to know about your field. However, that being said, a formal education is also very important if you want to enter that career field well-prepared and well-equipped.

I am on my way to college and have spent the past four years teaching myself software development; as a result I now work for a large, well-known company maintaining enterprise applications. It doesn't take a ton of talent, but it does take a ton of work and motivation. I think literature and practice are your best bet when it comes to learning. It is also important to pick a specific field: you can carry languages and logic across all fields, but you can only truly become "great" with sufficient practice and understanding in one of them.

There's formal and self-education, but you're leaving out having an on-the-job mentor, which can be the best way to stretch your skills and learn how software is really made.
– JeffO, May 6 '12 at 18:07


"However, that being said, a formal education is also very important if you want to enter that career field prepared to take an active stance in the work force." ... This sentence is very vague. An "active stance" in the "work force"? Really? What does that even mean?
– blesh, May 7 '12 at 0:02


I think that on-the-job experience is the most important, but a large part of that is self-taught. You can't expect everyone at your office to teach you everything, and the very best take their experience from every place they can get it.
– user606723, May 7 '12 at 5:19

Learning on your own is very important. Having the discipline to research and gather the necessary knowledge to accomplish a task will put you far ahead of many others that rely on formal training to accomplish the same task. This goes for any industry, not just the software industry.

Don't get me wrong, getting some formal training or education is helpful, but your own motivation to better your skills will help you grow into a better software developer. There is always something to be learned: new platforms or programming languages to experiment with, development methodologies to implement, tools and algorithms to use, the list goes on. Not everything will be introduced to you through formal training and so it is up to you to learn about other topics and ideas you might be interested in that will help you throughout your career in programming.

In programming, self-teaching is what you will be doing every day. You will have to teach yourself a lot of things, not just the computer languages and tools that keep changing. You will have to learn code other people wrote, and you will have to fix it too, with minimal instruction and supervision. In some organizations it is rare to get any real training more than once a year (if ever!). Make sure you can do (and enjoy) this; otherwise, consider a different career while you are still young.

Self-study is very important because you won't always have opportunities for formal training. When you start looking for a project, internship or job, find ones that have solid senior developers who can really teach you something. Being in an environment that does things right can be the express route to quality code.

I can tell you that there have been several places I've worked where they wouldn't even consider someone who didn't have their own projects outside of work. It exhibits love for programming beyond just showing up to a job and collecting a check. I'm going to go out on a limb here and say this: All programmers who don't love to program suck at their job. Even worse, they have nothing to add to any team they could join.

I'd take an inexperienced junior developer that loves what he does so much he plays with code in his free time over three mid-level developers that are just going through the motions: That junior developer will be great some day, and the others will never be any better than they are.

If you're not learning new things, you're just sitting around forgetting what you know.

Compared to almost any other field I can think of, programming is basically all about self-education. Formal computer science/software engineering education is useful, but it's not really where you learn to code.

What you should really do is just start working on projects: build something that you want to exist. And then join an open source project to work on something that you want to be even better. The learning comes for free then...

Get a formal education as well - 90% of what you get taught in a formal education won't be used in the workplace. But the 10% that is used will be some strange and obscure thing that you thought at the time would never be of any use at all.

Without the formal education, you suffer from the problem best described as "you don't know what you don't know". The formal education gives you a wide wide wide coverage of lots of things in the full knowledge that nobody will ever go out and use it all. Because what you will use is something you won't know until you need it, it's all about being prepared so you know where to go looking.

The self-learning / curiosity driven thing is what makes you a far more knowledgeable and well-rounded person. Not to mention more valuable to an employer.

Side note: I've spent my entire career, since starting programming aged about 15 (sheesh, over 25 years... nearer 30 years) finding that apart from my university education, most training courses (you know... learn BLAH in 3 full time days) are pretty useless. These are generally superficial, and as far as the much touted "keeping your skills up" mantra goes, not very helpful for one who needs to get into deep technical detail. Buying books, using the internet, delving into the maths / physics / architecture / whatever are what have made me useful and valuable. At one stage I was one of about 10 people in the world with specific expertise on a particular subject - all that knowledge was gained by self-teaching and on-the-job learning.

The only time to stop learning is when you are 6 feet under in a pine box.

It takes at least 10 years of practice before one can become great in any given area. So it is important to start early.

The fact that you started at an early stage of your life doing something you seem to like already puts you way ahead of the pack. So unless you're having second thoughts about the area you'll be working, don't stop... don't ever stop!

I consider self-learning one of the best skills of any given developer, with a college degree the second. A good college degree is important because it adds so much quality to your knowledge that you would probably not be able to gather by yourself. There are certainly exceptions to this rule, but that's all they are: exceptions.

Also, the more experience you build up early, the more you'll enjoy college and the better you'll absorb the new concepts presented to you. At first it will all seem easy and pointless, but very quickly you will feel challenged and will be eager to learn more.

After you leave college, don't ever stop learning; it will be one of your best features. I recommend reading 'Talent is Overrated'. You can take a look at a quick article about that book.

After your studies: as other answers have already stated, self-teaching is what you will do every day as a developer. You cannot know everything, and companies know this. What you MUST know is how to improve yourself. The best developers have the ability to learn new languages and technologies quickly, by themselves.

During your studies: the distance between what college courses teach you and the reality of the job is huge. Specifically, I am talking about maintenance and maintainability. An essential part of a developer's job consists of maintaining legacy code (bug fixes, improvements, adding features, etc.). You can read "Is the creation of brand new software generally a major part of most programming jobs?" for further details.
Since maintaining legacy code is essential, you will also need to write maintainable code yourself, and this is rarely taught in formal lessons (see "How to improve the training of students regarding maintainability?"). If you don't practice self-teaching and read a lot during your studies, you are unlikely to acquire the skills that will make you a better developer than average.
Don't try to learn lots of languages and technologies during your studies; learn good practices and clean coding instead. For example, you'd be better off reading Robert C. Martin's "Clean Code" than Herbert Schildt's "Java: The Complete Reference".

There's a story about the annual woodcutting championship held in British Columbia. The finalists were a Canadian and a Norwegian.

Their task was as follows: each was assigned a section of the forest. The winner would be whoever felled the most trees between eight in the morning and four in the afternoon.

At eight o'clock the whistle blew and the two woodcutters took their positions. They cut tree after tree, until the Canadian heard the Norwegian stop. Realizing this was his chance, the Canadian redoubled his efforts.

At nine o'clock the Canadian heard the Norwegian go back to work. Again they worked almost in unison, until at ten to ten the Canadian heard the Norwegian stop once more. And again the Canadian pressed on, eager to exploit his opponent's weakness.

At ten o'clock the Norwegian returned to work and didn't pause until ten minutes to eleven. With a growing sense of jubilation, the Canadian kept working at the same rhythm, already smelling victory.

It went on like this all day. Every hour the Norwegian stopped for ten minutes while the Canadian kept working. When the end of the competition was announced at four o'clock in the afternoon, the Canadian was quite sure the prize was in his pocket.

You can imagine how surprised he was to learn that he had lost.

"How did that happen?" he asked the Norwegian. "Every hour I heard you stop for ten minutes. How on earth did you manage to cut more wood than I did? It's impossible."

"It's really very simple," said the Norwegian. "Every hour I stopped for ten minutes. And while you kept chopping wood, I was sharpening my axe."

Self-teaching is very important, since you internalize the concepts in your own way, and that really helps. Choosing a language to self-teach with is also important: languages with clear documentation matter a lot, and what you don't want to be is so ambitious that you try to learn too many popular languages too fast. Since you say you started with C++, I would recommend Java; it tends to work well for self-taught programmers.

I probably learned 95+% of what I know about programming by trying stuff on my own and learning how it works. School can help with teaching good programming style, optimizing code for speed, and so on, but you will never become a "good" programmer just by reading a textbook. A great way to build programming skills is to find everyday problems that you could use a computer to solve, and try to write some code to get it done. Getting stuck is part of learning. I started out wanting to be a web dev, so I would make dummy websites fairly often (not hosting them or anything, of course) to test out new things I wanted to try. It worked out pretty well for me!

I started programming two years ago. My school could not teach the languages well, and I had to go online and do lots of research on my own. I am a slower learner, so it took me two years to write my first program, while my schoolmates are still struggling or have given up on programming.

This shows that it is better for a programmer to be self-taught than to depend on school. Schools will always hold information back, because they think we are not ready for it.

As always in mathematics and computer science, there are two points of view:
1. Necessary conditions
2. Sufficient conditions

It's necessary to (self-)learn throughout your life. There is no other way to become and remain a good programmer.

The point above is not a sufficient condition. You must also have a strong background in mathematics: finish high school and then graduate from a good university in the field of computer science. This is (maybe) the sufficient condition. Your brain must be taught to think algorithmically, and this CANNOT be achieved by self-learning.

Your brain can be taught how to think algorithmically without the university. I would even suggest that there are better ways of learning this than going to a university (an apprenticeship, for example, would be more effective for many people). On the other hand, there are many people who just can't think algorithmically regardless of how many computer science courses they take. University certainly is a huge help to most programmers, but it's not required either.
– Phil, May 7 '12 at 13:47