I'm 16. I started programming about a year ago, when I was about to start high school. I'm going for a career in programming, and I'm doing my best to learn as much as I can. When I first started, I learned the basics of C++ from a book, and from there on I learned things by myself. Nowadays I'm much more experienced than I was a year ago. I knew I had to study by myself because high school (likely) won't teach me anything valuable about programming, and I want to be prepared.

The question here is: how important is it to study programming by oneself?

Answer: Self-Improve or Lose (88 Votes)

It's critical. I don't think I've ever known a good programmer who wasn't self-taught at some level. As a hiring manager at a large company, I can say that a candidate who describes personal projects and a desire to learn will trump one with an impressive degree every time. (Though it's best to have both.)

Here's the thing about college: Computer Science courses teach theory, not technology. They will teach you the difference between a hash table and a B-tree, and the basics of how an operating system works. They will generally not teach you computer languages, operating systems or other technologies beyond a shallow level.

I remember back in the mists of time when I took my first data structures class and we got a thin manual for this new language called "C++" that they'd decided to start teaching. We had two weeks to pick it up well enough to write code. That was a good lesson in and of itself. That's the way your career will go.

Your school will likely not teach you what you need to get a good job. Schools often trail what's hot in the industry by many years. Then you'll get a job. Whatever company you go to will almost certainly not spend any particular effort to train you. The bad companies are too cheap, and frankly the good companies will only hire people smart enough to pick it up as they go.

I graduated college in 1987. I went to work as a C programmer with expertise in DOS, NetBIOS and "Terminate-and-Stay-Resident" programs. In the years since, I have had little if any actual training. Look at the job ads... not much call for those skills! The only reason I can be employed today is because I've spent the intervening years constantly learning. To succeed as an engineer, you have to have the habit of learning. Hell, I'd go beyond that: you have to have the love of learning. You need to be the sort of person who messes around with WebGL or Android or iOS because it looks fun. If you are that sort of person, and maintain the habit of learning, you'll go far in the industry.

Answer: There is not a Class for Everything (11 Votes)

Learning on your own is very important. Having the discipline to research and gather the necessary knowledge to accomplish a task will put you far ahead of many others that rely on formal training to accomplish the same task. This goes for any industry, not just the software industry.

Don't get me wrong, getting some formal training or education is helpful, but your own motivation to better your skills will help you grow into a better software developer. There is always something to be learned: new platforms or programming languages to experiment with, development methodologies to implement, tools and algorithms to use, the list goes on. Not everything will be introduced to you through formal training and so it is up to you to learn about other topics and ideas you might be interested in that will help you throughout your career in programming.

Answer: Go Alone or Go Home (5 Votes)

In programming, self-teaching is what you will be doing every day. You will have to teach yourself a lot of things, not just computer languages and tools that keep on changing. You will have to learn code other people wrote, and you will have to fix it too, with minimal instruction and supervision. In some organizations it is rare to get any real training more than once a year (if ever!). Make sure you can do (and enjoy) this; otherwise, consider a different career while you are still young.

Answer: Learn or Forget (4 Votes)

I can tell you that there have been several places I've worked where they wouldn't even consider someone who didn't have their own projects outside of work. It exhibits love for programming beyond just showing up to a job and collecting a check. I'm going to go out on a limb here and say this: All programmers who don't love to program suck at their job. Even worse, they have nothing to add to any team they could join.

I'd take an inexperienced junior developer that loves what he does so much he plays with code in his free time over three mid-level developers that are just going through the motions: That junior developer will be great some day, and the others will never be any better than they are.

If you're not learning new things, you're just sitting around forgetting what you know.

Have an opinion about the importance of self-learning in preparation for professional programming? Disagree with the opinions expressed above? Bring your expertise to Stack Exchange, a network of 80+ sites where you can trade expert knowledge on topics like web apps, cycling, scientific skepticism, and (almost) everything in between.

34 Reader Comments

I taught night classes in programming methodology, databases, and Visual Basic at the local university for four years as a second income. I've always loved programming and have had to teach myself damn near everything, because in my experience nobody else will. That being said, and since these classes were taught in Texas, I modified one of the most well-known phrases in Texas to help my students understand the importance of self-education.

I modified the phrase "No pass, no play!" to "No play, no pass!"

Essentially, in high school, many students learn because they're taught. In college, usually, students learn to teach themselves. In the world of programming, if you aren't "playing" at something unknown, then you're not likely to "pass" in your career. Most every company I ever worked for, some fourteen and climbing, never paid for any training. Not a dime. Sixteen years between training sessions (one in 1985 and one in 2001), then three years (2004), then three years later I paid for my own GIS training in 2007. Since then, nothing; all self-taught in the evenings at home.

People mostly learn by doing. I would focus more on spending time programming than on worrying about learning different techniques. It is probably a good sign if that kind of activity is one that you enjoy doing. If you are motivated by career prospects, you should understand that coding skills alone will only get you a relatively low-grade career in software. Higher-grade careers are centered around knowledge of some commercially important problem domain that is modeled in software, or around managing people.

Learning different things is an interesting activity and one I enjoy. But people are generally more successful if they focus on excelling at a limited number of activities. In any case, you will have to learn whatever techniques are in use where you wind up working. Starting out your career by working a number of different jobs can help you both learn a variety of techniques and build up a network of people who have some familiarity with you and what you can contribute to a software project.

Learn, learn, learn. Keep learning. And don't get stuck in a rut about certain technologies. I'll pass over candidates any day who don't show that they are interested in what is new; not so much because it might be used, but because it shows an interested mind.

However, you gotta learn the basics. I see lots of people coming out of college with degrees where all they did was Java, HTML, JavaScript, CSS, and building apps in various UI or drag-and-drop tools. I'll pass these people up too, unless they can prove to me that they have learned the basics and fundamentals of programming. Be able to solve problems without relying on technology x, y or z.

I do have to agree and disagree with Khepry. Yes, college (not just in CS) teaches you to learn and teach yourself. But it also does (or should, or at least did) give you a baseline of knowledge that most people won't bother to learn themselves.

FYI: when looking at a resume, unless you have been a consultant, I'm not really impressed by less than 2 years spent at any one job. Unless of course I'm hiring for a consulting position, that is a red flag to me when hiring: this applicant probably will not be devoted to what they are doing and is just using the job as a stepping stone to yet another job.

The "wrong" question, I think, but perhaps you are very young. The answer lies in why you want to have that programming career. If you find programming fun, it fulfills your curiosity and provides you with exciting challenges, great. Then you do it on your own anyway in your free time, i.e., you teach yourself new tricks all the time (so why the question?). If you want to become a programmer because it is cool, your peers do it, etc, then your mileage will vary. People really good at something tend to love doing that something, so they do it in their free time too, not just as work. They are simply driven to solve the next problem and constantly learn on the way.

So follow what motivates you, and push yourself; invent projects for yourself to learn, wherever curiosity takes you. Courses certainly help, so make use of them. If a course is easy/boring, pick something else or make it challenging for yourself. Many careers outside computer science/engineering involve a fair dose of programming, so if you like programming, you have plenty of options.

In short, I do not know a single good programmer who did not have and does not currently have side projects. It is impossible not to teach yourself (the only way for that would be to lose interest entirely).

I knew quite a few people in my CS department who didn't know anything about computers before going in, and ended up having extremely successful careers, working at places like Amazon and Google. This, however, does not contradict the advice that yes, you should self-learn.

The successful students who started from nothing did not just sit there and absorb lectures. Once they figured out programming was for them, they started doing extra credit, visiting office hours, TAing, helping out with research, and taking advantage of all the opportunities that a self-taught high school student wouldn't have access to.

It's much easier to be "self-taught" when you have a good mentor, and in a good university program, you have access to that. Even so, you need to have the internal motivation to take advantage of those opportunities, and a good indicator of that motivation is being self-taught in high school. If you have already decided you want to be a programmer, you should start now, because those who start early, and *remain* motivated through university and their early careers, often achieve great things.

I can't really add much to what everyone else has already said (I'm a 23 year old CS student myself), but even my professors at university have told us this - from a course, you'll learn only the bare minimum about programming, the theory and other formalised stuff that's needed for professional work. The majority of your learning comes from applying your knowledge to your own projects and by hacking about in your spare time.

I would say paramount. Good programming is an art, and that isn't learned just by reading or listening to somebody. You must do it yourself.

There are individuals who think that programming is not an art. They just want to make profits and they don't care about anything else. Fortunately, they are punished with bad code that leads to missed deadlines and, eventually, project failures.

Think about Leonardo da Vinci, who was a great engineer and a great artist. At least try to have a team that wants to be something more than mere skilled labor. If programmers are not proud of their work, don't expect anything other than mediocrity.

Like any craft, the way you learn to be a working programmer is by doing it until you get good at it. You have to actually write software to learn how to write software. I would recommend learning about Robert Fripp's stages of guitar craft (enthusiast, happy gigster, master, etc) and apply them to computer programming. Whether it's guitar, painting, programming, etc, it's all the same basic concept of learning a craft.

I studied computer science at a pretty good college, and had programming courses in high school as well. But I did a LOT in my own time for fun. Half of what I talked about in my interviews, when I got my co-ops and finally my job after graduation, had to do with the stuff I did in my own time. I learned so much on my own in high school that by the time I got to college, the only new information really had to do with data structures and learning new algorithms. The basic conceptual stuff, like program flow, error checking, modular design, optimization, debugging, and abstraction, I had already teased out through my own projects.

Because I played with a lot of different things on my own, I amassed a pretty lengthy resume of skills and experience even right out of graduation. But even then, I was still teaching myself new things. The two programming languages I now work in most of the time (SystemVerilog and Perl) were both initially self-taught. The former was completely self-taught. The latter I taught myself to a good extent at a co-op job, before smoothing out my knowledge in a college course (Programming Language Concepts, which covered formal languages, lexers, BNF, and so forth, and introduced us to 4 different languages of our choice).

I'm still expanding on that knowledge. I've downloaded or bookmarked the language specs and library indexes for the things I'm working on, and I'm always trying to figure out a better way of doing things.

In the programming world, you can't ever remain static. It's constantly moving, and you either have the drive to move along with it, or you get left behind.

The question here is: how important is it to study programming by oneself?

I'd say there is no other way to do it and hence it is the key skill you will need.

From the view of programming, all the stuff you will learn in formal education are tools and raw materials. You will learn some abstract theoretical things and how to describe problems mathematically; you will learn patterns to structure software, like MVC or Producer-Consumer; you will be introduced to programming paradigms. All of that is just a repository of rather abstract techniques and approaches. You will likely not learn even a single language to a degree that is sufficient for actual work, but you will learn basic and advanced concepts, ideas and structures found in various languages. You're mainly told how stuff works and less how to use it.
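To make one of those patterns concrete, here is a minimal sketch of the Producer-Consumer idea in C++11. The five-item integer workload and the function names are invented for illustration; the pattern itself is just this shape in any language: one side pushes work into a shared buffer, the other side waits for it.

    #include <condition_variable>
    #include <iostream>
    #include <mutex>
    #include <queue>
    #include <thread>

    std::queue<int> buffer;      // shared work queue
    std::mutex m;                // guards the queue
    std::condition_variable cv;  // signals "new work" or "done"
    bool done = false;

    void producer() {
        for (int i = 0; i < 5; ++i) {
            { std::lock_guard<std::mutex> lock(m); buffer.push(i); }
            cv.notify_one();     // wake the consumer
        }
        { std::lock_guard<std::mutex> lock(m); done = true; }
        cv.notify_one();
    }

    void consumer() {
        while (true) {
            std::unique_lock<std::mutex> lock(m);
            cv.wait(lock, [] { return !buffer.empty() || done; });
            while (!buffer.empty()) {
                std::cout << "consumed " << buffer.front() << '\n';
                buffer.pop();
            }
            if (done) return;
        }
    }

    int main() {
        std::thread p(producer), c(consumer);
        p.join();
        c.join();
    }

The condition variable is the essential piece: it lets the consumer sleep until there is actually work, instead of spinning on an empty queue.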

After college, a common thing to find out is that reality is way more complex than the examples and assignments you worked on previously. The great thing about writing software is the immense degree of freedom you have. For almost anything you have half a dozen basic approaches to tackle a problem, with different focuses, upsides and downsides, without any of them being outright right or wrong. For every further step there are choices again; the programming language is just one of those choices. This is why I also tend to compare writing software to an art discipline, in large part.

The thing with programming in particular is that the tools, materials and fashions change more rapidly than in most other disciplines. You will have to keep up with those changes, and in practice there will likely be only very little formal training once you're out of college. So you HAVE to teach yourself the stuff, and that's basically what you should learn in college.

Another thing that sometimes seems overlooked to me is the ability to gather domain knowledge quickly. When you write software, you usually do it for a certain purpose, and it is way easier and nets better results if you get that purpose and can view what you do from the perspective of those the software is made for. The chances of getting a formal introduction to a foreign knowledge domain are, in my experience, practically zero. You will likely have to do that yourself, which means teaching yourself new things.

So yeah, you're headed for an education that will mainly teach you abstract concepts, whose usage patterns you need to familiarize yourself with on your own, and then a job that's a continuous learning experience. Teaching yourself programming, and all kinds of other things, is the key skill.

I am 23 and have been doing this seriously for about a year, and I always fret that I didn't start earlier. I know I've seen plenty of things on SO about people starting at age 50, and it's never too old, etc. It's just so damned cool to start at 15 and get recognized for it. I tried C++ at 15 and the compiler wouldn't let me get past "Hello World". I don't remember the specifics, but thinking back, that couldn't have been TOTALLY my fault. But it kept me away from programming for years, though I now do it for fun. I've learned other things in that time, but I wonder if I've set myself up for a much harder track toward professional programming. Maybe then, someone will see how damned hard I'm working at it now.

on the subject of self-teaching, if one was a complete beginner to programming (zero experience), where should they start? I've always wanted to learn but I never really found a clear answer on where to start.

I would give all the credit to learning on my own, rather than to any formal training. I started programming when I was about 12 years old and wanted to create a better Yahoo!/AltaVista. I ended up not creating a good enough search engine, written in Perl on Angelfire, because I was horrible at it, but it jump-started me past anyone in a formal class at the time, because the web was so new.

My degree was not in computer science, but I was fascinated by PCs and programming way back around the time personal computers became widely available. I learned to program for fun, self-taught, and only later did it become for profit (I was almost shocked when I found out people would be willing to pay me to do the stuff I did for kicks). I used to buy books (and compilers and whatever was needed) for new languages just to try them out and compare them with the languages I already knew. What I've found over the years is that the basic concepts and structures are pretty universal, so if you learn the concept of an "if-then-else" structure, it will serve you across a wide variety of programming languages and you'll just need to fine-tune the exact way it's written in a particular language.
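As an illustration of that universality, here is the if-then-else structure written out in C++ (the temperature threshold is an invented example). Swap the keywords and braces for another language's syntax and the shape is unchanged:

    #include <iostream>

    int main() {
        int temperature = 30;              // invented input value
        if (temperature > 25) {            // the condition
            std::cout << "Warm day\n";     // the "then" branch
        } else {
            std::cout << "Cool day\n";     // the "else" branch
        }
    }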

Eventually I did go back to school for CS classes and many were informative and useful. However, most of it was too slow paced, for the very reasons cited by commenters above: a lot of students are going through the motions to get a grade or advance to the next class but they're not passionate about what they're supposedly learning. The few who are passionate have probably already raced through the textbook, tried most of the coding samples, and wish the teacher would bump things up a few levels so they could try more interesting problems and projects. If that describes you, you will probably do very well in this field. It should be fun and interesting, even during the times when you gnash your teeth and bang your head on the desk because you can't figure out why your code isn't doing what you're sure it should be doing. The fact that you bang your head instead of shrugging your shoulders goes to show that solving the problem matters to you, it's not just a minor annoyance in a dull job.

The one CS class I would trade for all of the others, and which I recommend if you can find something similar in your future college, was called Programming Logic. We did zero programming, just wrote a bit of pseudo-code occasionally. Instead, we created flowcharts: funny-looking maps of programs and projects that use symbols and lines to show how you get from the beginning of your program to the very end, accounting for all of the actions, decisions, and choices along the way. If you map it out wrong, you find yourself in a continuous loop or dead-end and there's no way to reach the final goal. If you do it right, you can trace every action and decision point (if-then-else, for example) and there's always a clear path to your end point. It's invaluable to learn how to structure your programming and solve problems that way, to map it out in your mind even if you don't use written flowcharts later in real life.
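To sketch what that mapping looks like in code (a made-up number-guessing example, not something from that class): each decision diamond becomes an if/else, each loop-back arrow becomes a while, and a correctly drawn chart guarantees a path to the end point.

    #include <iostream>

    int main() {
        int secret = 7;                      // start of the flowchart
        int guess = 0;
        while (guess != secret) {            // loop-back arrow
            std::cout << "Guess the number: ";
            std::cin >> guess;
            if (guess < secret) {            // decision diamond
                std::cout << "Too low\n";
            } else if (guess > secret) {
                std::cout << "Too high\n";
            }
        }
        std::cout << "Correct!\n";           // the end point is always reachable
    }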

With a passion for programming and problem-solving and an understanding of the logic, you can learn most everything on your own and do well. Good luck.

I teach at University (robotics, machine learning and AI). It is such a pleasure to teach students who learn because they're interested! I agree with all the posts saying that self-learning is important once you've graduated.

But, that isn't all that is important. When people self-teach they tend to concentrate on their strengths, getting better at what they already know. It is important to round out your knowledge. I started programming when I was about 12. By the end of high school I could already program in assembly, BASIC and Pascal. I found first year Uni (= Australian shorthand for University) a bit boring - we were doing stuff I already knew. It wasn't until we started studying the CS theory more in later years that I realised I was in the right place. You need to learn that theory, and most people don't teach it to themselves because it isn't immediately obvious that it is relevant for any given problem.

There's one other point that is important... the staff at the Uni have been thinking hard about the future of university education given the availability of online video lectures from the likes of Coursera and Udacity. Some were suggesting that the future of Uni was the Oxbridge model, where lectures are outsourced and you have local tutoring to polish up the edges. That is a good model, but it was pointed out that so-called 'open universities' have been pursuing this model for years without affecting the big research Universities. My current best guess at the reason for this is that the social aspects of University are really important. I'm not talking about the night-life. Simply put, having a lot of good students learning together seems to work better than people studying individually, or in groups without strong peers. Groups of strong peers explain stuff to each other. They motivate each other. They challenge and push each other. They also form a network that really helps later in your career. *This* is the reason you want to go to a good Uni in your country.

This essentially encapsulates my question earlier in-thread: I started programming at age 22. I'm taking an AI class online right now and love it, am taking a data structures class and see the usefulness and enjoy some aspects. I like programming, but I wonder if I might ever catch up with your level of knowledge. How long does it take to become competent? I made a sudoku solver that mostly works, but had some bugs with harder puzzles. I'd like to say I'd find myself fairly competent if I could make a decent pacman agent without needing to spend half a year on the project. I'd like to hear your thoughts.

I guess I was lucky, because I started out as a photographer and realized early on that I wasn't a very good camera photographer. But put me in a darkroom and I could do magic. I learned everything about the physics, chemistry and math involved in the processes of the darkroom. Then came programmable enlargers and developers, and I studied and became an expert on the hardware programming. I eventually was hired as an audio-visual developer for a major company and learned analog-to-digital programming. I learned to program thirty to forty projectors for large AV presentations.

About this time the dinosaurs became extinct, the company downsized, and I found myself on the street looking for a job. Because I had taught myself basic analog-to-digital programming, I was able to get a job as a technologist for another large corporation, and in the course of my duties I was offered the opportunity to be part of the software development staff, but I had to learn FORTRAN on my own. I found people to teach me the basics and created a niche writing graphics software for mathematical models. I found all my photographic experience applied to my work, and soon realized that programming is an art and a science. As one poster stated, I learned my craft. I found that languages were secondary to writing good software, and that it was the art that made good software great software. I taught myself every language that came along to apply the art. OK, some didn't apply the art, like COBOL, but I learned them anyway.

About halfway through my career an engineer invited me to work with him on a hardware project, one thing led to another, and I had my first patent. Goodbye software, hello hardware engineering. I spent the last five years of my career working in a college system in Florida, helping students understand the world of system development. Thirty-five years later I am now retired, but am still active in the world of Computer Science.

So I had a great career, made good money, and never got a degree. I attended a lot of college, and if I had studied any one discipline I could probably have a couple of degrees, but I realized very early on that college computer courses are years behind actual work environments. So I went to college to learn things I was interested in, like math, history, biology, anthropology, art and more. And from every course I learned things that I could use in my work that had nothing to do with programming. The point? Teach yourself as much as you can, get help when you need it, and never stop learning about life. Programming is the art of creating something real from an abstract idea, and life is just full of abstract ideas.

[...] I tried C++ at 15 and the compiler wouldn't let me get past "Hello World". I don't remember the specifics, but thinking back, that couldn't have been TOTALLY my fault. [...]

It's a bad craftsman that blames his tools. C++ has been around in an evolving form since '86 (I think), and it was mostly what it is now when you were 15. Combined with all the things successfully created using it, your statement is an invalid excuse. Even in its earliest forms, it wouldn't have been valid to place any blame on the language: it's been used successfully in most every domain, from the embedded realm, to kernel-space and user-space operating system code, to writing many compilers, and the list goes on. By your reasoning you might as well consider C fatally flawed: with some minor things not being identical, C++ is a not-quite-strict superset of C, and can be used that way as desired.

I'll answer this two ways: if you *really* want to be a programmer (software engineer), self-starting and self-teaching are critically important. However, you will need some training in methodologies and the general tools of the art in order to be competitive. We hire a number of consultants every year to do our programming for us, and what we look for, besides demonstrable competency in the language du jour, is background in projects at least somewhat similar to our own (the domain can be different, but working on similar problems) OR a very wide range of projects that shows the applicant is capable of hitting the ground running.

However, we generally don't hire programmers full time; programming tools become quickly dated, and we'd rather hire someone who is up-to-date with the current development paradigm than invest in trying to second guess where that paradigm is heading and maintaining our own programming team. Rather we hire people with specific domain or approach expertise (say, machine learning, or artificial intelligence) with deep knowledge of the field who will tell the programmers what to do. They pick the algorithms, understand the application, and 'solve the problem.' Software engineering is a support role - software is the tool, not the end in itself. Unless you plan to work at a company that builds software tools, that will probably be true for most companies. In that case, computer science is much more important - how to design, construct algorithms, read and understand papers in your field, etc. That's not to discourage being able to program, that goes without saying, but a great chef isn't about the knife work - that's only a stepping stone.

So when you think about the vector of your career, you need to also think about if you plan to stick to programming or you want to be the technology go-to guy. In the latter case, preparing for advanced degrees (MS if not PhD) is really more important than your particular skill at programming. (I've known a number of CS PhDs who couldn't program to save their lives, but have won numerous best paper awards, have distinguished careers in both academia and industry...)

The most critical skill is the ability to think. Typical programming can help with part of that (proceduralization). Also look at other kinds of programming (e.g., logic programming with PROLOG, functional programming, etc.) to be better rounded.

on the subject of self-teaching, if one was a complete beginner to programming (zero experience), where should they start? [...]

Find an itch to scratch. In life, there's always some hassle, some irritation where you wish you had better tools to organize information. Next time you run across one, use it as your idea to try to implement.

Like any craft, the way you learn to be a working programmer is by doing it until you get good at it. [...] I would recommend learning about Robert Fripp's stages of guitar craft (enthusiast, happy gigster, master, etc) and apply them to computer programming.

I know that I'm going off topic here, but thanks for posting this. As a programmer who plays guitar for a hobby I found this an interesting comparison.

One of the Achilles' heels of self-taught programmers is their tendency to blow off concepts they don't understand easily. When you are in a class, you are forced to learn what the teacher tells you to learn.

For example, at first, I found the concept of pointers to pointers in C confusing, so I just skipped over it. I was doing things in an inefficient way until I dug in and learned the concept.
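For anyone who hit the same wall, here is a minimal sketch of the idea (written in C++, though the concept is identical in C; the function name retarget is invented). A pointer-to-pointer lets a function change which object the caller's pointer refers to, not just the value it points at:

    #include <iostream>

    // Rewrites the caller's pointer itself, which is only possible
    // because we were handed the pointer's own address.
    void retarget(int** pp, int* replacement) {
        *pp = replacement;
    }

    int main() {
        int a = 1, b = 2;
        int* p = &a;
        retarget(&p, &b);          // pass the address of the pointer
        std::cout << *p << '\n';   // prints 2: p now points at b
    }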

Thus, my advice to my fellow self-taught programmers is: if you get stuck on some concept, don't blow it off. Grab onto it like a pitbull and dig into it until you get it.

There are C and C++, which are founded on layers of previous procedural languages. Then there are Java, C# and a whole family of object-oriented languages. And the distinction is getting blurred. Why mention that? Because you'll have to jump around from language to language in order to get contracts and as fashions change. By all means use C++ as your starting block, but as you develop you will need to pick up all sorts of knowledge elsewhere.

Consider those who learn foreign languages. Some acquire the rudiments of one language: that's what high school gives you. People who go to university acquire a deep knowledge of one language, perhaps two. But it's vital to spend significant time in the country, otherwise you don't catch nuances, and you certainly don't get slang or local accents. The good news is that once you have learned a few foreign languages, it becomes much easier to learn more: there is lots of crossover, and you already have the personal skills. The same applies in IT.

One thing no one seems to have addressed, or maybe I've missed it: there are vocational qualifications like the MCSE, and there is a whole industry built around delivering and testing for such courses. These courses teach technology through example, but (IMO) won't in themselves give you the experience needed to get and hold a job. Loads of people believe they need such qualifications on their CVs, though.

I'd like to ask the forum what they think of that, because it's relevant to how our student, Theplan, might decide to develop himself.

PS: no surprise Theplan won't get a full programming education at high school. Firstly, few schools will have staff with the appropriate skills; secondly, they tend to concentrate their efforts on what school leavers need to know in general: that's Office skills. And that leads to a whole different discussion thread ...

[...] FYI: when looking at a resume, unless you have been a consultant, I'm not really impressed by less than 2 years spent at any one job. That is a red flag to me when hiring [...]

OH! So you're one of the "Remember kids, you can't become a programmer unless you ace grad- and doctorate-level math" guys. Silly me to think that you can't even BEGIN to learn how to program without knowing how to do residues on a volume function in your sleep. I know I'm wrong on the math and the logic, but I found a troll and just had to troll back.

Seriously man, learn to have a balanced outlook. Look at all of these:

* work experience
* contest attendance/prizes
* college experience
* personal/unofficial projects (involvement in OSS communities)
* college grades

Not necessarily in that order, but there's no written rule that "Highest Math GPA = Best Programmer".

Alternatively you should just split the stack of applications in two and throw half in the trashcan: "I don't want unlucky people working for me." is just as good as your way of doing it.

Please let's not start a war. There is nothing wrong with learning, whether from a University Prof, a training instructor, a book, or some friends. And (IMO, which is probably not worth much) a passing knowledge of Donald Knuth is likely to help anyone wanting to solve problems. As would some (Mathematical) knowledge of recursive vs. iterative solutions, the knee-jerk example being how to calculate factorials.
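For the record, that knee-jerk example looks like this in C++ (a minimal sketch; both functions overflow for large n, which is itself a useful lesson):

    #include <cstdint>
    #include <iostream>

    // Recursive: mirrors the definition n! = n * (n-1)!, with 0! = 1.
    std::uint64_t factorial_recursive(unsigned n) {
        return n == 0 ? 1 : n * factorial_recursive(n - 1);
    }

    // Iterative: same result, constant stack space.
    std::uint64_t factorial_iterative(unsigned n) {
        std::uint64_t result = 1;
        for (unsigned i = 2; i <= n; ++i) result *= i;
        return result;
    }

    int main() {
        std::cout << factorial_recursive(10) << '\n';  // 3628800
        std::cout << factorial_iterative(10) << '\n';  // 3628800
    }

The recursive version reads like the math; the iterative one trades that elegance for predictable memory use. Knowing when each is appropriate is exactly the kind of judgment a little theory buys you.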

Learning by trial and error can be very instructive but can be time-consuming. Going on courses might shortcut that. Maybe.

You can't separate skills in programming from skills in data structures and algorithms. And that means stepping away from the compiler and learning problem solving methodology.

You could also say that UI design and testing skills are techniques you might not pick up immediately if self-taught.

I am 23 and have been doing this seriously for about a year, and I always fret that I didn't start earlier. [...] I wonder if I've set myself up for a much harder track toward professional programming.

Honestly, I don't think you'll have a problem. I'm from a non-traditional background as well. I got my first "real" programming job at 24 (I'm 28 now) with no degree or formal programming experience. That was the hard part - convincing someone to take me on. It's been all gravy from then on in my experience. Like everyone says, as long as you exhibit a passion and are constantly pushing yourself to learn, you'll go far.

Seriously man, learn to have a balanced outlook [...] there's no written rule that "Highest Math GPA = Best Programmer".

Thank you for this. A balanced approach is a great outlook to have. Not only am I terrible at math in general, but I also don't have a degree. I am, however, an excellent software engineer, as any one of the companies I've worked for, including one of the nation's largest financial institutions, will tell you. It gets under my skin when people overlook you just because you don't have a degree. After a certain amount of professional experience, it becomes mostly irrelevant anyway.

I'm a high school teacher who accidentally learned to program in Python last year. HTML should definitely be listed as a gateway drug...

Anyhow, do any of you guys think that if there had been teachers in your high school who knew how to program it would have been of any help? Would it have mattered? I'm just wondering because sometimes I look around at my colleagues and I feel very, very alone. One could theorize that maybe high school would be more relevant to more kids if more teachers could program....maybe.

Anyhow, do any of you guys think that if there had been teachers in your high school who knew how to program it would have been of any help? Would it have mattered?

Pardon the language, but FUCK YES!!!

Where I studied, we even had a non-mandatory afternoon group for Informatics. One of the common problems was that students could not even save their work properly during the exam, because they were computer illiterate. That was the level of the class, and hence I found it dull and uninspiring. With the benefit of hindsight, I can now safely say that the teacher had basically no idea about programming beyond what we did, which was Hello World, some hardcoded math calculations and, at the end, a very, very limited command-line calculator.

If there had been someone around who was able to hold a proper class, I would have chosen Informatics right away; I did not, and I regret it.