Friday, February 16, 2018

Learning to program is getting harder

I have written several books that use Python to explain topics like Bayesian Statistics and Digital Signal Processing. Along with the books, I provide code that readers can download from GitHub. In order to work with this code, readers have to know some Python, but that's not enough. They also need a computer with Python and its supporting libraries, they have to know how to download code from GitHub, and then they have to know how to run the code they downloaded.

And that's where a lot of readers get into trouble.

Some of them send me email. They often express frustration, because they are trying to learn Python, or Bayesian Statistics, or Digital Signal Processing. They are not interested in installing software, cloning repositories, or setting the Python search path!
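To be concrete, the search-path chore is tiny once you know it exists; here is a minimal sketch (the repository path is hypothetical):

```python
import sys

# Show the directories Python searches when it imports a module.
print(sys.path)

# A common fix when downloaded code won't import: append the
# directory that holds the code. The path below is hypothetical.
sys.path.append("/home/reader/ThinkDSP/code")
```

None of this is difficult, but nothing in ordinary computer use prepares a reader to guess it.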

I am very sympathetic to these reactions. And in one sense, their frustration is completely justified: it should not be as hard as it is to download a program and run it.

But sometimes their frustration is misdirected. Sometimes they blame Python, and sometimes they blame me. And that's not entirely fair.

Let me explain what I think the problems are, and then I'll suggest some solutions (or maybe just workarounds).

The fundamental problem is that the barrier between using a computer and programming a computer is getting higher.

When I got a Commodore 64 (in 1982, I think) this barrier was non-existent. When you turned on the computer, it loaded and ran a software development environment (SDE). In order to do anything, you had to type at least one line of code, even if all it did was load and run another program (like Archon).

Since then, three changes have made it incrementally harder for users to become programmers:

1) Computer retailers stopped installing development environments by default. As a result, anyone learning to program has to start by installing an SDE -- and that's a bigger barrier than you might expect. Many users have never installed anything, don't know how to, or might not be allowed to. Installing software is easier now than it used to be, but it is still error prone and can be frustrating. If someone just wants to learn to program, they shouldn't have to learn system administration first.

2) User interfaces shifted from command-line interfaces (CLIs) to graphical user interfaces (GUIs). GUIs are generally easier to use, but they hide information from users about what's really happening. When users really don't need to know, hiding information can be a good thing. The problem is that GUIs hide a lot of information programmers need to know. So when a user decides to become a programmer, they are suddenly confronted with all the information that's been hidden from them. If someone just wants to learn to program, they shouldn't have to learn operating system concepts first.

3) Cloud computing has taken information hiding to a whole new level. People using web applications often have only a vague idea of where their data is stored and what applications they can use to access it. Many users, especially on mobile devices, don't distinguish between operating systems, applications, web browsers, and web applications. When they upload and download data, they are often confused about where it is coming from and where it is going. When they install something, they are often confused about what is being installed where.

For someone who grew up with a Commodore 64, learning to program was hard enough. For someone growing up with a cloud-connected mobile device, it is much harder.

Well, what can we do about that? Here are a few options (which I have given clever names):

1) Back to the future: One option is to create computers, like my Commodore 64, that break down the barrier between using and programming a computer. Part of the motivation for the Raspberry Pi, according to Eben Upton, is to re-create the kind of environment that turns users into programmers.

2) Face the pain: Another option is to teach students how to set up and use a software development environment before they start programming (or at the same time).

3) Delay the pain: A third option is to use cloud resources to let students start programming right away, and postpone creating their own environments.

In one of my classes, we face the pain; students learn to use the UNIX command line interface at the same time they are learning C. But the students in that class already know how to program, and they have live instructors to help out.

For beginners, and especially for people working on their own, I recommend delaying the pain. Here are some of the tools I have used:

1) Interactive textbooks with embedded, runnable code, like the interactive version of "How To Think...";

2) Entire development environments that run in a browser, like PythonAnywhere;

3) Virtual machines that contain complete development environments, which users can download and run (provided that they have, or can install, the software that runs the virtual machine); and

4) Services like Binder that run development environments on remote servers, allowing users to connect using browsers.

On various projects of mine, I have used all of these tools. In addition to the interactive version of "How To Think...", there is also this interactive version of Think Java, adapted and hosted by Trinket.

I have used virtual machines for some of my classes in the past, but recently I have used more online services, like this notebook from Think DSP, hosted by O'Reilly Media. And the repositories for all of my books are set up to run under Binder.

These options help people get started, but they have limitations. Sooner or later, students will want or need to install a development environment on their own computers. But if we separate learning to program from learning to install software, their chances of success are higher.

UPDATE: Nick Coghlan suggests a fourth option, which I might call Embrace the Future: Maybe beginners can start with cloud-based development environments, and stay there.

UPDATE: Thank you for all the great comments! My general policy is that I will publish a comment if it is on topic, coherent, and civil. I might not publish a comment if it seems too much like an ad for a product or service. If you submitted a comment and I did not publish it, please consider submitting a revision. I really appreciate the wide range of opinion in the comments so far.

45 comments:

Great post Allen! My first computer was a VIC-20 (I later got a Commodore 64), where you also start with a BASIC prompt. I too have been thinking about the barriers to starting to code.

Last fall, I helped in my son's grade 8 class to teach them Python programming. There we wanted the students to be able to start coding with minimal effort, and minimal magic. They already had MacBook Airs, which had Python 2.7 already installed, so there was nothing to install. Then we concentrated on a few core programming constructs (no classes, for example), and no libraries. This was an attempt to recreate the environment I had when I learnt to program on the VIC-20.

I think it worked really well. The students were able to write fairly advanced programs (a few pages of code) with that set-up. If you want to do more advanced programming, you'll need to install more, but for a start it was really good. I've written more about it here: https://henrikwarne.com/2017/12/17/programming-for-grade-8/

Wow - memories. I had the VIC-20 as well, although I actually had an old Texas Instruments before that. Remember the cassette deck? I still have some old cassettes spitting out squawks of bytes :-) I also had a $300 dot matrix printer that took 2 days to print out a page.

That tweet about dinosaurs tho. I would kinda call those people "sane" instead. Forcing developers to use the slow, unreliable cloud-shit for anything productive is sooo far-fetched at this point, I'd argue to fire anyone who does that professionally without an incredibly sound reason.

This is great. I used a similar tactic with JSFiddle when teaching 5th graders procedural art in JavaScript, so that they would get exposure to all three web langs without having to muck around in a text editor or file system (they are really wiggly; it's hard enough to get them to think about the syntax).

I personally only overcame this barrier as a programmer without a CS degree by moving to a city with other devs on modern stacks and going to meetups and having them help me through CORS errors in chrome console, which were baffling and not in any of the tutorials, etc. I wish things like https://frontendmasters.com/books/front-end-handbook/2017/ which are more holistic had been around when I started.

P.S., we met at EdFoo where I embarrassingly told you during a conversation about this wonderful book called Think Bayes that had this wonderful educational paradigm... oops.

With the advent of free online courses like khan academy, resources like stackoverflow and (obviously) the world wide web and powerful search engines, not to mention error-highlighting, auto-completing, easy-to-navigate IDEs, it is massively easier, cheaper and more approachable these days than back when everyone was figuring out how to write GOTO statements on C64s and ZX Spectrums.

If you are incapable of mastering the skills required to download and double-click an installer for something like Eclipse, you're hardly going to develop a keen interest in programming, are you?

Sorry, but although it is well written, this whole article is total bunk.

I click on "Software" to install software. Software sits there with no indication of what it is doing; eventually a pile of icons appears that I have no interest in. Click in the search box, and type: Ecliple (typo), El (typo), Eclipse, because some dude on the internet said "something like Eclipse."

Eclipse Integrated Development Environment version 3.8.1-8, and reading down the oddly sorted comments most say OLD VERSION. A Jan 14, 2018 comment says "Good, but out of date -- This is sooo old, I can easily get the new version from the website."

Close Software, open Chromium, search for Eclipse, find Eclipse (software) on Wikipedia, click the official website at the bottom of the article: https://www.eclipse.org/

Click the large orange DOWNLOAD button, eclipse.org/downloads click the large orange DOWNLOAD 64 BIT button, click the large DOWNLOAD button to Download from somewhere, now it downloads, open the download.

Now I am in Archive Manager, I see a folder named eclipse-installer and an "Extract" button, click "Extract", choose a folder to extract it to (or just dump it in Downloads), Extraction completed successfully -- Close (default), Quit, Show the Files. I click "Show the Files"

I am in my Downloads folder in a File Manager, double click "eclipse-installer", and now there are many folders and icons, there is a gear box named "eclipse-inst", maybe I'll read the readme first, the readme is a folder, double click readme folder, double click readme_eclipse.html A huge outline of links appear with the title Eclipse Project Release Notes, Release 4.7.0 (Oxygen) Last revised June 28, 2017. Ctrl+F is usually our friend, Find "install" 2. General - Startup > 1. Installation/Configuration issues that can cause Eclipse to fail start, that doesn't even make any sense grammatically. Close the readme, go up a directory in the File Manager, and double click the gear box.

Eclipse Installer

A Java Runtime Environment (JRE) or Java Development Kit (JDK) must be available in order to run Eclipse Installer. No Java virtual machine was found after searching the following locations: /home/user/Downloads/eclipse-installer/jre/bin/java, java in your current PATH. Close

You make some very good points, but I liked the article, so I'd like to present a different side to that coin.

I started with computers before I even had web access. Back then, computers were built almost solely to program, to the point where you couldn't do much without 'programming'. When I had a problem, I had to read dense, horribly written manuals. When that failed, I had to ask one of two people I knew who programmed for help. And, if they couldn't help me, I needed to either abandon the project, brute force my way through the error, or try to do it in a different (hopefully better documented) way. It was very frustrating, but I used to get high from solving problems.

Today, programming has done a complete 180. If you have good Google skills, you can solve just about any problem in moments. There are tons of resources out there. It should be so easy. But, there are a couple of problems. The first is that computers are now designed to hide programming from their users. And the second (which the article didn't mention) is that it's too easy to solve a problem by copy/pasting some code from Stack Overflow. You've all heard of test driven development, but now we have an era of Stack Overflow Driven Development where you copy, paste and pray.

So, on one hand it's harder to get started today, but it's easier to find answers. Because it's easier to find answers, I question whether new developers really understand how to troubleshoot. Back in my day, it was easy to get started. Or, more accurately, it was almost impossible not to start. But, it was very hard to get answers.

If I had to choose one, I think I'd pick the old days. Though, it's also possible/likely that I'm just 40 and nostalgic for my youth!! :)

It's not even harder to get started, unless by "getting started" you literally mean the number of steps required to be able to start typing a program in, which, unless you have serious ADD, hardly counts. For me "getting started" means how far you will have got after your first session attempting to write some code - maybe after an hour or two.

Codecademy, etc. have interactive zero-install web-based lessons. Where were those on my ZX Spectrum back in 1984? Sinclair User magazine (although hardly interactive) and if you got stuck you were on your own.

Eclipse has an easy-to-use, live-edit-capable, point and click debugger? Where was that on your C64?

Honestly, you lot really are sounding like old farts who can't see the extent of the rose-tint affecting your spectacles for all the nostalgia clouding your vision. :-)

In common with most things to do with science and technology, we have it better than we ever have before. Unsurprisingly.

You are right that it is easier to learn to code/program now than ever before, but that’s not what Allen’s post is about. It’s about “the barrier between using a computer and programming a computer is getting higher.” I don’t have a CS degree but managed to land a help desk position 2 years ago based on my history in customer service and additional work experience. I previously was in manual labor and then sales, with a BA in Fine Arts. This article summed up my struggle well. Besides on-the-job experience, I studied up a lot to get the core 3 CompTIA certs, and now devote personal time to Linux OS and servers / Ruby on Rails / SQL reporting. I am now finally starting to get through that high barrier. Yes, now that I am past that barrier from user to programmer I can finally understand and utilize all the amazing resources out there. Nonetheless I first had to “face the pain” to get here and still have so much to learn :). ** When I started I literally didn’t know how to launch an application unless it was on the task bar or had a desktop shortcut.

Honestly, these days, when I need to teach a complete newbie about programming, I just tuck some JavaScript into a script tag in an index.html file. Their entire dev environment is Notepad and a web browser - two things they already have and are familiar with using.

Once they start getting the hang of basic programming concepts in JS, then I'll often pop them over to some other language and a dev environment they actually have to install.

I definitely agree with the sentiment about the bootstrapping problem being a lot harder when a machine doesn't boot into a commandline.

Most people are not interested in programming. Forcing EVERYONE to boot to a command line when only 1% of users actually want or need one is not going to make your operating system popular. Look at Linux: the reason we still haven't seen the "year of Linux" is that it's still too complicated for the average user who only wants to get on Facebook and watch cat videos. I do agree somewhat with the sentiment of the article, but for different reasons. Back in the day, learning to code meant learning one language. Today, to do anything meaningful on a computer you need to know several languages (JS, jQuery, CSS, HTML, SQL, XML, JSON, Angular, Bootstrap, etc.). OK, some of those are not programming languages, but you still have to have a solid grasp of them to get things done. The learning curve to get to a "programmer" level was steep before and required a lot of dedication, but today I would say that it's literally a cliff.

I don't think anyone is suggesting that people who want to watch cat videos should boot to the command line, or even that anyone should *boot* to the command line. The suggestion is that, in order to learn programming, it either a) helps or b) is essential to be able to *use* the command line.

Of my fairly wide range of acquaintance, I don't know anyone who a) can write simple code with some fluency and b) can't use the command line.

Just for example, the Software Carpentry training organization starts every course with a morning on how to use the command line.

Speaking as a programmer, researcher and teacher, I would say to my students who want to do meaningful analysis on their computer that zero knowledge of {JS, jQuery, CSS, XML, Angular, Bootstrap} would be fine, and a passing familiarity with {HTML, JSON} would likely come their way at some point.

It's true that there is a wall of flak that one has to penetrate to start coding these days. I teach people to code starting with OCaml. The first assignment is fearsomely long and tedious configuration work. And god help you if you have a Windows machine. But I don't think this is the reason it's harder to learn coding. It's harder to learn coding because 1. too many high schools are teaching students to "code" using the foolishness promulgated in Java and 2. there are many, many solutions or near-solutions, often wrong or half-baked, on the web. New coders have to have the stamina to resist looking at 50 different solutions to a related problem.

The problem with Cloud9 is that you have to learn both how to do sysadmin junk, AND how the process for doing that sysadmin stuff on Cloud9's system deviates from the instructions you'd find on a site telling you how to do the sysadmin on a normal machine. And every cloud provider differs in its own way. If you don't have a clear, concrete picture of how a system works beforehand, you will be utterly lost at worst, and hopelessly brittle at best.

I completely agree with everything said. I'm of the same vintage as many of you, I learned to program on an Apple ][ and a TI99/4A. I have an analyst who is very good with SQL and visualization tools and has been trying to do some scripting with PHP. He's getting better but you can see what a struggle it is for him to understand the things we took for granted 35 years ago.

He has a deep understanding of the data layer, but I think because of that he's often at a loss when he has to do anything procedural. He gets caught in the trap of "It's just there, why can't it just do this for me," and he's trying to force the flow to work like he thinks in SQL. Back in the day you had to understand the procedural stuff to get to the data; now, in many ways, it's reversed.

First of all, thank you for your books. I worked through two of them and skimmed the rest. It was a pleasure to read and fun to work with.

I've been thinking about your post for a while, and I was wondering: can one separate "Learning to Program" from "Understanding Computers"? My point is that, for me, the two are inseparable. To do good programming, I have to -- at least on a superficial level -- understand how stuff works, beginning with the hardware and ending at high-level programming languages. Similar to understanding the OSI 7-layer model if you want to program for The Internets.

Despite your wish for it, you cannot do everything in one book. It's impossible. The easiest jump-start I see is using tools like Anaconda or https://colab.research.google.com or Jupyter or the like, which come ready-made. But even then, to grok this takes time.

Maybe it needs a separate book? What about: "How to think like a computer." and then reference this book in the intro of all the other books. ;-)

The best, easiest way for an absolute beginner to learn to program is using the BBC micro:bit. It's a tiny board: you plug the USB cable into a computer and download the IDE. You program in a subset of Python (MicroPython). The IDE has a button called Flash; when you click on it, your program is sent to the micro:bit and executed.

The language and board are very limited, but it works, works very well. Great for young kids.

Many years ago, there was a development tool by the name of Turbo Pascal. It ran on MS-DOS, the IDE was integrated with the compiler, etc. I spent an entire day learning the whole system, including the Pascal language. Apple had a tool called HyperCard, very easy to use programming tool.

There are a lot of tools available for programming the Raspberry Pi, which is great, but a lot of folks now only use either Android or iOS devices. It's a shame that no one seems interested in creating programming tools that are simple to use and learn, but powerful enough to create useful applications on mobile devices.

Here too, thanks for your books. I tried to use them at some point with my two sons.

I noticed that now that they are in college, just like the interns we had, they are using Anaconda and Jupyter Notebook. For larger projects, PyCharm.

I did explain the advantages of Emacs to both my sons and some of our interns, but to no avail. And now, in my latest project, I just followed them and used PyCharm, Python, and (for me) KLayout. Pure magic ...

So I would start with Anaconda (in a supervised install), and then they can run idle, Jupyter Notebook, Spyder, etc.

For example, Visual Studio is available and free for Windows and MacOS. C# is an excellent language and is similar to Python in syntax. .Net is a wide and robust framework and comes with Windows and if you install VS for Mac, you get it as well.

That does leave out Linux to a degree - but I'm finding it hard to believe that Linux users are the ones you're talking about - and there's Mono for Linux, which also gives you C# and .Net.

Your post reminded me of an idea I had for a web-based "DSP explorer" tool previously.

I worked in audio software for a few years, developing audio drivers for Windows and also firmware for devices. It's actually what inspired me to start programming in the first place. I find DSP programming really interesting, as the results can be "heard" instantly. However, it took a while before I got to the point where I could actually manipulate the audio data to do interesting stuff. Similar to the sentiment in your blog post - I didn't really want to know about pointers in C or how malloc works - I just wanted to try DSP on audio samples!

So I thought it would be great to have a super simple tool for beginners to use, to accelerate that process. As you know, any DSP processing happens in an audio callback function. That's the heart of the whole thing, so why not have a web-based tool that plays an audio track and gives users access to the output callback, so they can modify audio samples (using Javascript for example) and hear the results instantly. You could even hook up some simple sliders to allow them to modify parameters in realtime.
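A rough sketch of that callback idea (in Python rather than JavaScript, with invented names, and with no real audio hardware):

```python
import numpy as np

def make_gain_callback(gain):
    """Return a toy audio callback that scales each buffer of samples.

    In a real audio pipeline the driver would call this once per
    buffer; here it is a plain function so the idea can be tried
    without any audio hardware. The "slider" is just the gain value.
    """
    def callback(samples):
        # samples: one buffer of audio, as floats in [-1, 1];
        # scale and clip so the output stays in the legal range.
        return np.clip(samples * gain, -1.0, 1.0)
    return callback

# Simulate the driver delivering one buffer at half volume.
cb = make_gain_callback(0.5)
buffer = np.array([0.0, 0.5, 1.0, -1.0])
print(cb(buffer))  # each sample halved: 0.0, 0.25, 0.5, -0.5
```

Swap the body of `callback` for a filter or an FFT and you can hear the math immediately, which is exactly the appeal.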

I think this would be really useful for learners who just want to get a feel for how they can use math to transform audio, with no need to get into OS concepts beforehand. There may already be tools like this out there, although I'm talking about something really easy to use (and much friendlier than Matlab ;-) )

This couldn't be further from the truth, in my opinion. I started programming 25 years ago, and it took grit and real resolve to go through a manufacturer's manual to figure out how to code. Today you've got a gazillion ways to learn how to code. Just to prove my point: the SDK and environment installation on my dad's 286 took hours and an entire sub-manual to get ready. That's before you had the C compiler ready on that IBM2. Did I mention that the development environment was spread over 5 diskettes? Today it is a walk in the park compared to the old-school days.

Amazing article. It is, however, missing development containers. Docker has fundamentally solved this problem. Most problems related to setting up a development environment are around downloading packages, setting up paths/default interpreters, etc.; there is no need to run another kernel in a VM. Containers solve all the problems above, as they have their own root FS without being as heavyweight as virtual machines.

I started with a CoCo in 1981 with BASIC and a cassette "drive". I learned QuickBASIC, then VB, all to support contract database programming. I later got a couple of degrees in computer science. The most important thing to me would be the ability to have fun right away, with possibilities developing in the newbie's head. I've seen amazing reactions with R or even Excel VBA with instruction on "How do I get it to print my name over and over?" I miss those days sometimes.

I would rephrase it slightly. It's harder now to learn to program something that you'd actually want to use and feel proud of. On a Mac at least, Python is installed by default, so you can still do: open a terminal, type "python", type "print 'hello world'", and you've programmed something. But I'm not sure why anyone would want to do that, except just to say they programmed something.

What do people actually want to do? Play a game, write a blog, maybe crunch some numbers for homework? There's tons of existing software out there, often for free, that's way better than anything you'd be able to write as a beginner, or even as an expert without putting tons of time and effort in. (Why is this blog on Blogger instead of a website built from scratch? Probably because that would be a ton of effort for no real gain.) And once you try to go beyond the basics, everything is linked... Every time you look up a programming problem you'll find a reference to some other library or concept that you need to learn, or else have a big black box of "I don't know, it just works" in your code.

Way too complicated for no real reason. Especially since enforcing strict OOP is not even all that useful. There is no reason why main can't just be a top-level function or why System.out.println can't just be println.

Interesting viewpoint, but I'm not sure I agree. In response to your three initial points;

1. No computer manufacturer I can remember ever installed an SDE. The C64 only worked that way because it lacked a proper operating system (like CP/M). An SDE might have been included with some later operating systems, but most assumed the user was not a programmer. The other problem these days is the extreme fragmentation of programming tools. Back then, you had far fewer language options and generally a language didn't fall out of favor in a couple of years. If you included BASIC, that'd be good enough for most people -- not so with web development, there are a trillion languages/frameworks and most of them only work well for very specific applications. And honestly, if someone can't figure out how to install a development tool of some kind -- I honestly doubt they're going to have much luck developing anything. You have to learn to walk...

2. This transition was deliberate -- it was part of making computers easier to use, and while they may obscure the internals, I do believe that GUIs do introduce great efficiencies when implemented properly. Just because you don't start in the "dark place" (as early Windows users called a command prompt) doesn't mean you can't get there. You can develop a ton of things on pretty much every OS from a command line (shell scripts, Powershell, batch files, VB script, etc).

3. As you pointed out later on, in many ways cloud computing is a boon for budding programmers. You don't have to install anything. On a great many web pages, you can just right-click and view the source for most of what is going on -- which is something I still do today on misbehaving pages. There are many sites that allow you to write code right on a web page and when your code doesn't work, they'll even explain to you what you did wrong. Never mind the developer console in Chrome. And mobile devices are strictly for consumption. You can't develop on a phone (at least, not in any way that approaches normal programming and/or isn't painful).

Thank you for writing this, it's a very helpful reflection on a very difficult problem.

As I was saying to a friend today, I find it very hard to convey my own sense of disconnection when I was being taught (at the time) to run canned programs on a mainframe, and the great difference it made when I realized I could install and run software on my own machine, for the same tasks. It is something to do with ownership, and with the feeling that you have agency, in using your tools. I believe that this effect is more important in the long term, than the pain of persuading people to get the tools. Programming does take a certain amount of courage, to explore and to fail and to feel foolish, at least some of the time.

I was convinced by Rich Hickey's exposition of the difference between "easy" and "simple", and the importance of choosing simple instead of easy:

I think the main issue with IDEs for learners, local or cloud, is feature creep.

Learners need a one-click install IDE that has very few menu options and does very little. It is easy to work out how to use it because there are very few options to choose from.

This is great! But then someone adds another feature; wouldn't it be great if it could also do this too. And that. And another thing. All the developers agree the IDE is getting "better", but it is no use as a learner IDE any more. Before long it has joined the big heap of IDEs out there that put learners off before they get started.

More years ago than I care to admit to, I learnt to write computer programs using a BBC Micro, BBC Basic and the built-in 6502 assembler. The programming environment was almost the same as the command line, and the programs were mostly interpreted, so feedback was pretty instant. So far, so straightforward. To make anything interesting happen I had to learn to operate the computer using the user manual provided. This took some time as I built a model in my head of what was going on and built some context for what worked and what didn't. Then I had to learn the syntax and grammar of BBC Basic from the user manual provided. Another learning curve with a great deal of trial and error. For even more interesting things to happen I had to learn (i) what the "operating system" really was and (ii) how to manipulate the complex interface between BBC Basic and the operating system.

Building expertise does not happen in an intellectual vacuum, so during this process I was building all manner of little projects, from games to low-level graphics functions to mathematical approximations. Since then I have built my career in software engineering on the foundations of what started out as a hobby, by dint of hard work, challenging work projects and continuous self-learning. At no time in this process has it ever been easy to quickly assemble the tools, expertise and domain knowledge to do anything other than the most trivial work. To suggest an easy solution does/should exist to investigate every complex problem without tears seems, IMHO, to call for the end of progress, innovation and effort. Programming as a tool to investigate difficult and novel problems is not only hard (and the skill hard-won) but always has been, and until someone writes the code for Douglas Adams' Deep Thought, probably always will be.

These days I have a pile of domain knowledge across wireless comms, instrumentation, lasers, digital signal processing et seq., and I usually write embedded software in C and assembler on microprocessors, microcontrollers and DSP processors. Just lately I have gotten interested in massively parallel computing using GPUs and APIs like OpenGL and CUDA, and I would like to ask: why is it so hard to get all the tools, documentation and expertise into one place before I can get even the simplest shader program to work...?

If you are reading this and starting out in software engineering, then I hope you have the very best of luck going forward.

I started programming 40 years ago, in my early twenties, writing machine code (i.e. no assembler!) for the Z80 on my NASCOM. There was no one to ask for help, but I had lots of time and I got stuff to work. I still feel the old excitement if I take a peek in the Z80 instruction set manual...

Today, everyone wants stuff yesterday. I've learned 6 new languages/frameworks in the last year, and Stack Exchange is a godsend for finding examples to speed up my programming. Not as exciting, because one no longer feels as though one is doing something relatively unusual for the first time.

So, I guess it's swings and roundabouts. You have to get into using all the tools & help to be productive but it does add to the complexity. However, if it were too easy, everybody would be a programmer, right?

Interesting points, wrong conclusions. When I was in high school there weren't any computer programming classes. Now my high school has a whole department. My nephew is taking mobile programming classes. C'mon, if I had that in high school I could've done so much more.

No, there's more available, but there's also much more complexity. There are so many languages and stacks that it seems worse, but only if you want to know all the different areas. The number of tutorials and sites that help you have let me jump into a new job with new technologies and be productive within months, not years of schooling.

Also, according to this logic, I should take my parents and stick them on a Unix system with no GUI so they "learn how to code". Silly.

> Also, according to this logic, I should take my parents and stick them on a Unix system with no GUI so they "learn how to code". Silly.

Yes - that is the key and interesting point in this article. It seems so obvious that it's a bad idea, it's just - silly. It's obvious that setting things up in the cloud will make it easier to learn. It is obvious that Docker will solve our learners' problems with installation. It's obvious that they should not have to think about the architecture of the computer on which they run their programs.

And yet, it may well be wrong. Obvious things are often wrong.

Eben Upton made the Raspberry Pi because he noticed that students applying to study computer science at Cambridge had learned much less than their counterparts 10 years before. The new generation lived in a world of ubiquitous computing, and they had learned very little about computing; the previous generation had battled with the BBC Micro, delving deep into its hardware.

Look at the official Raspberry Pi guide "our first recommendation for adults and older kids interested in getting started with the Raspberry Pi". Depending on the edition, chapter 1 is "Meet the Raspberry Pi". Chapter 2 is "Linux System Administration".

Thanks for writing this article. Like you, my first computer (a Commodore VIC-20) had the built-in programming environment. Like you, I just used tools as they became available, or was instructed to use them. In other words, I'm just used to the current way of doing things. In many respects, I thought of this as a golden age of programming because of all of the tools and information available. The large number of non-college-educated programmers entering the professional programming ranks seems to support my original idea.

Your article has made me realize that, for someone coming into this cold, there really is a lot to learn and be familiar with in order to do some simple things. Thank you for making me consider this.

Concerning Nick Coghlan's point, I agree and disagree. Businesses (especially small businesses) will embrace cloud computing given the challenges and costs of setting up and running server hardware; using the cloud makes so much sense in comparison. However, for someone learning to program, locally available tools are a good first step. If you can't do basic programming locally, how are you going to do it in the cloud?

I actually believe there is an agenda to keep programming elitist, not necessarily by programmers themselves but by those at the top of the pyramid. Computers should be about making life easier, and doing all the hard, tedious work for you. For example, it used to be that I could create a textbox and give the command Str1$ = Textbox1 + "pie"

And the environment would assume, ohh, what's in the textbox must be a string.

Alternatively, I could give the command Num1 = Textbox1 * 2.4

And the environment would assume oh, now he's putting numbers in there.

Now, very often, these things must be declared beforehand.
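The implicit-typing idea this comment describes survives in dynamically typed languages today, though modern ones tend to ask for an explicit conversion rather than guessing from context. A rough sketch in Python (the variable name `textbox1` mirrors the hypothetical textbox above):

```python
# Python infers each variable's type from the value assigned at runtime,
# much as the old BASICs did -- no declarations required.
textbox1 = "3"              # contents of a hypothetical text box (a string)

s = textbox1 + "pie"        # string context: concatenation gives "3pie"
n = float(textbox1) * 2.4   # numeric context: convert explicitly, then multiply

print(s, n)
```

Unlike the old Visual Basic behavior described above, Python refuses to multiply the raw string (`"3" * 2.4` raises a TypeError), so some of the strictness the comment complains about is still there, just enforced at runtime rather than via declarations.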

Another example is machine language or assembler. By default, a very fiddly thing to do, because you're down to the level of putting ones and zeros into registers and accumulators, shifting and adding binary, etc. That's a necessarily complicated thing to do.

Then you work in a high-level language like C++, and I'm left thinking: this is just as complicated and unforgiving as working with a low-level language like assembler.

I don't think it need be like this for the vast majority of high-level programming situations. Old versions of Visual Basic used to be easy in this way, up until Version 5 came along. Languages and environments that make it easy tend to be ignored; two in particular I can think of are REBOL and MIT App Inventor.

High-level languages should be about making life easier for the programmer, and I think in general they don't. OK, in the example above a compiler needs to know whether what it is adding is a string or a number; this is determined by the use of '$', as in early versions of BASIC.

It ought to be possible for your average shopkeeper to write his own code to carry out stock control, or whatever else he needs to run his business. Of course, it would be better if software to do this were freely available, so he does not have to reinvent the wheel. And there is, of course, but it would be nice if programming were not so intimidating for your average shopkeeper. There are occasions where complex, deep coding is necessary (scientific projects, etc.), but for the vast majority of applications, much looser programming environments should be used, e.g. Scratch-like applications such as MIT's App Inventor, Visual Basic, or REBOL.

When I was learning to code, I had to find hard-to-find books on the topic, which were often quite expensive in and of themselves. My parents barely knew what a computer was or what one was good for.

_"What would anybody ever need a computer in their house for?"_

That was the viewpoint of _most adults_ back in the day.

The Internet didn't exist, but BBSs and 300-baud modems did. I was one of the lucky first to own a 1200-baud modem! My hunger for knowledge led me to acquire the code that ran those BBSs and run one myself. I wanted to play games, especially games that engaged other real people (computer AI was quite dumb back then...).

Those learning to program then didn't grow up with innate computer skills. So new programmers of the day learned about computers as much as they learned about programming them. Not only that, they _built_ the tools they then used to program with. New editors. New languages. New frameworks.

But the world of programming was still opaque. You had to go to college to really crack it. Why? Because the Internet didn't yet exist. Open Source didn't yet exist. I spent hours upon hours on Gopher and FTP chasing source code to learn from. My focus was on the very core things that are now taken for granted: how to do doubly linked lists, queues, stacks, sorting algos, memory management. I wrote my own testing suites and my own low-level debugger, and hacked on vi, the editor, just to have better tools to build and test with. I even learned Assembler and often traced the assembly generated by my program's compiler to find "my bug" vs. a compiler bug.

Who hears about compiler bugs these days?

Yup, I'm sorry, but programming today is not harder than when I was learning. Now, the programs you write today... different story. It may be getting harder and harder to come up with great, outstanding ideas: what they used to call "the next killer app" that defines a whole industry and movement, like 1-2-3 and Excel. Like Word. Like Amazon. Like Google or Facebook. It's harder to stand out. It's harder to hit a home run.

Then again, it's not. Today, you can build something, build it relatively quickly, publish it to the App Store, and list it on websites all around like Product Hunt, TechCrunch, and so on. You can crowdsource funding and quickly get the mindshare of millions of people with virtually NO budget.

If you don't know where to start, you know where to start getting answers. And those answers are, by and large, free to all who care to seek them out. Programming is more accessible to people of all walks of life than ever before. Even my young daughter of six has "programming environments" that let her put together programs without knowing what a semicolon is. There are even frameworks now that let you build entire applications that can be deployed to cloud and mobile alike without writing one single line of code.

I'm sorry, but learning to program today is easier than ever before on every single front I can think of except one: The spark of an idea that will solve something not yet solved.

LOL. I started on a VIC-20. I would totally swap with anyone learning today. Try convincing your parents to spend hundreds of dollars, in 1980s cash, on a C compiler or a machine language cartridge. And hundreds more on books. $40 a month for another phone line so you could log into a BBS to message a few other programmers without tying up the family's only phone line. Then randomly lose your programs ("magic" peeks and pokes and tedious, endless DATA statements), which were stored on budget audio cassettes you spent your meager allowance on at Kmart.

The good old days were not a golden age. Your glasses are stained rosy by age.

To your points:

1. Every computer sold today has several development environments preinstalled, orders of magnitude easier and more powerful than a C64 prompt.
2. Yeah, "SYNTAX ERROR" was surely more helpful :) You must have had some OTHER Commodore OS. I had lock-ups and mysterious errors. It is always better now.
3. Cloud? If it is confusing, skip it. I love Bluemix, but there is a world of things you can program without ever knowing about the cloud.

I think there will always be a certain friction between programmer interfaces and common user interfaces. Even programmers these days do not see everything. I am from the 90s and barely wrote a few lines on an 8086 microprocessor, so I did not get to learn from the excruciating debugging phases, while the previous generation made a living on them.

Troubleshooting can be frustrating. "How do I solve a problem I don't know how to solve? How do I find the answer to a question I didn't know to ask?" But today it basically comes down to learning how to use Google and Stack Exchange effectively, plus a healthy dose of curiosity. Teach those skills first, or very early. But yeah, that first step in programming can be either motivating or demotivating depending on how it goes. JavaScript in the browser is a pretty safe place to start if you want to achieve a quick feeling of success and the ability to go where curiosity takes the student in a tight feedback loop.