I am currently finishing my MSc in computer science. I am interested in programming languages, especially in type systems. I got interested in research in this field and next semester I will start a PhD on the subject.

Now here is the real question: how can I explain what I (want to) do to people with no previous knowledge in either computer science or related fields?

The title comes from the fact that I am not even able to explain what I do to my parents, friends, and so on. Sure, I can say "the whole point is to help software developers write better software", but I do not think that is really useful: they are not aware of what "programming" is; they have no clue what it means. It feels like I am explaining that I am an auto mechanic to someone from the Middle Ages: they simply do not know what I am talking about, let alone how to improve it.

Does anyone have good real-world analogies? Enlightening examples that cause "a-ha" moments? Should I actually show a short, simple snippet of code to a 60+ year-old with no computer science (nor academic) experience? If so, which language should I use? Has anyone here faced similar issues?

$\begingroup$Comments are not for extended discussion; this conversation has been moved to chat.$\endgroup$
– D.W.♦May 3 '16 at 5:22

$\begingroup$This question got a downvote and one or more close flags, if I correctly understand the panel. Please consider leaving a comment so that I can improve it. Though I've been lurking here for a while, this is the first time I actively participate to the site and maybe I am not familiar with some policies.$\endgroup$
– effeffeMay 3 '16 at 12:21

$\begingroup$I simply can't understand why this question is more up-voted than a question related to a specific problem, which is more useful. This question is ridiculous and was ridiculously up-voted. Incredible!$\endgroup$
– nbroMay 5 '16 at 17:15

$\begingroup$@nbro I don't get what's ridiculous about the question, and I am not sure how you decided that a specific question is "more useful" than another one.$\endgroup$
– effeffeMay 6 '16 at 12:30

14 Answers

If you have a few minutes, most people know how to add and multiply two three-digit numbers on paper. Ask them to do that (or to admit that they could, if they had to), and ask them to acknowledge that they do this task methodically: if this number is greater than 9, then add a carry, and so forth. The description they just gave of what to do is an example of an algorithm.

This is how I teach people the word algorithm, and in my experience it has been the best example. Then you can explain that one may imagine there are more complex tasks that computers must do, and that there is therefore a need for an unambiguous language to feed computers these algorithms. Programming languages have proliferated because people express their thoughts differently, and you're researching ways to design these languages so that it is harder to make mistakes.
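That schoolbook carrying procedure really is a program; a minimal sketch, under the simplifying assumption that numbers are given as lists of digits with the least-significant digit first:

```python
def add_by_hand(a, b):
    """Schoolbook addition: digits least-significant first, carry as taught."""
    result, carry = [], 0
    for i in range(max(len(a), len(b))):
        da = a[i] if i < len(a) else 0
        db = b[i] if i < len(b) else 0
        total = da + db + carry
        if total > 9:          # "if this number is greater than 9, add a carry"
            carry, total = 1, total - 10
        else:
            carry = 0
        result.append(total)
    if carry:
        result.append(carry)
    return result

# 456 + 789 = 1245, written with the least-significant digit first
assert add_by_hand([6, 5, 4], [9, 8, 7]) == [5, 4, 2, 1]
```

Each line corresponds to one of the steps a person would describe aloud, which is exactly the point of the analogy.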

This is a very recognizable situation. Most people have no concept that the computers they use run programs, that those programs are human-written source code, that a computer can 'read' source code, or that computation, which they associate with arithmetic, is essentially all computers do (along with moving data around and networking, perhaps).

My research is in quantum computing, so when people ask me what I do, I don't attempt to explain that. Instead, I try to explain that quantum physics exists (they've usually heard of Schrödinger's cat, and things that are in two places at once), and that because of this strange physics, faster computation might be possible.

My goal is to leave the person feeling a little more knowledgeable than they did going in, and excited about a world they didn't know existed but have now been introduced to. I find that much more valuable than explaining my particular research questions.

$\begingroup$Sorting a deck of cards is also an easy way to introduce the notion of algorithms.$\endgroup$
– MorwennMay 1 '16 at 19:45

$\begingroup$@Morwenn That's true! There are loads of algorithms that we execute in daily life! Dealing cards is algorithmic, traffic has many algorithmic aspects, although they are event-based and not imperative, cooking is algorithmic when you do it on auto-pilot. The reason I like addition is that everybody learned the same algorithm in elementary school, whereas to sort a list of numbers, people's strategies vary and they're not methodical: they try to find patterns of near, adjacent numbers, and not everybody knows the order a deck of cards is supposed to have anyway (is it hearts before spades?)$\endgroup$
– Lieuwe VinkhuijzenMay 1 '16 at 20:06

$\begingroup$Personally when asked about quantum computing I tend to avoid physics completely but try to give a general idea (that a quantum computer does not work better or faster but can simply do calculations in a way that's out of scope for normal computers). If they ask what that way is they tend to be happy with a simplified view, basically that of SIMD processing (with linear resources enough to access exponential size of parallel input data), which most of the algorithms after all are.$\endgroup$
– The VeeMay 1 '16 at 20:06

$\begingroup$You can go beyond just cards; I recently found myself using a variant of Quicksort to sort a stack of papers I had to organize without even thinking about it because there were too many for a basic insertion sort.$\endgroup$
– JABMay 1 '16 at 20:14

$\begingroup$@JAB That's amazing! But you probably already know Quicksort in the back of your mind. The advantage of explaining addition vs. sorting is that everybody has the same addition algorithm, but no layperson has any methodical sorting algorithm. On the other hand, that could be an advantage! You get to explain different algorithms. This is the route I take when I talk to somebody mathematically minded outside CS, and it gets across the concept of different algorithms for the same task with different time-bounds, and why lower bounds are difficult.$\endgroup$
– Lieuwe VinkhuijzenMay 1 '16 at 20:23

Programmers can tell computers what to do. To do that, they need to use a programming language. That is a language that is understood by both computers and humans. For example, if you edit a Word document and press a key, the computer will show the letter you pressed. That's because a programmer wrote a program saying: If the user presses "A", put an "A" into the document. If the user presses "B", put a "B" into the document, and so on. The computer just follows the rules in the program that the programmer wrote.
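The "rules the programmer wrote" can be shown literally; a toy sketch (the function and the list-of-characters document are invented for illustration, not how any real editor works):

```python
def handle_key(document, key):
    # The rule the programmer wrote: if the user presses "A",
    # put an "A" into the document; if "B", put a "B"; and so on.
    if key.isprintable():
        document.append(key)
    return document

doc = []
for key in ["H", "i", "!"]:
    handle_key(doc, key)
assert "".join(doc) == "Hi!"
```

The computer is doing nothing more than following these rules, one key press at a time.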

Now sometimes the programmers write a stupid rule by mistake. The computer will try to follow the stupid rule anyway, but if you follow a bad rule, something bad will happen. For example, sometimes, when you're editing a Word document, suddenly everything freezes and the computer doesn't react anymore. That may be because some programmer at Microsoft wrote a less-than-perfect program.

My work is about inventing methods to check programs for such and other mistakes, using other programs and some mathematics. The basic idea is to figure out how to predict what will happen when a program is executed, without actually executing it.
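One toy version of "predict what will happen without executing it" is to scan a program's source for a division whose right-hand side is the literal zero. Real analyzers are vastly more sophisticated; this sketch only illustrates the idea of checking a program with another program:

```python
import ast

def finds_division_by_zero(source):
    """Flag divisions by the literal 0 without ever running the program."""
    tree = ast.parse(source)
    for node in ast.walk(tree):
        if (isinstance(node, ast.BinOp) and isinstance(node.op, ast.Div)
                and isinstance(node.right, ast.Constant) and node.right.value == 0):
            return True
    return False

assert finds_division_by_zero("x = 1 / 0")
assert not finds_division_by_zero("x = 1 / 2")
```

Note that the bad program is never executed: the checker only reads it, which is precisely the "prediction" being described.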

Of course, since it's research, I only work on a little aspect of this, not everything at once, but that's the big picture of what we're trying to achieve.

I'm using an explanation in a similar style for my field (domain specific languages), and I can report that it often helped me overcome the "oh you're a computer scientist I could never do that let me go away and stop talking to you" issue. The key appears to be to get the first couple of sentences out until one reaches an example the other person can relate to, like Word documents in this case. Bonus points for special-casing the example to the other person, such as Excel for accountants or Powerpoint for bosses or computer games for gamers or web browsers or whatever.

Note that you don't have to stay at this superficial level. If you (and the other person!) want to, you can delve into the details of what exactly you're doing from there. For example, "In my current project, I'm trying to mathematically prove that what I invented last year actually works. That means I have to define very carefully what a program really means, and what my invention predicts a program means, and then I can show that the prediction is actually right".

Most people understand recipes. If you follow the instructions, you'll get a decent meal. Sometimes, though, the instructions can be difficult to follow. For example, when you're making perogies, you'll find instructions like this, taken word for word from Grandma's Polish Perogies:

To cook perogies: Bring a large pot of lightly salted water to a boil. Drop perogies in one at a time. They are done when they float to the top. Do not boil too long, or they will be soggy! Remove with a slotted spoon.

Yikes. Let's review Grandma's kind advice to us. Drop them in one at a time... so they all spend different lengths of time in the pot. Got it. They are done when they float to the top. Well, how many is 'they'? Do I catch each one as it comes up? Do I wait for 80% to float to the top and then get them all? This is crazy inaccurate. Do not boil too long, or they will be soggy! How do I measure when they're about to get soggy so they aren't in too long? If five pop up to the top at the same time, will I have time to get them all?

And trust me, I've ruined a lot of perogies in my day. This is a serious problem that any perogie cook has run into before. But in spite of these problems, which come out even upon the most basic analysis, people are still able to make perogies using this exact same method. But is there something we can do to make more batches of perogies succeed and fewer fail?

Wouldn't it be grand if someone specialized in making recipes more reliable? Someone who said "we can pipeline the perogies so they pop up and out of the water!" or "we can add a special dye to the perogie, safe to eat of course, that gives them different shades and we know to get the darkest ones out first, since they're most at risk of becoming soggy". We want an expert who can take this recipe and fix the potential problems with it. The lives of perogie cooks around the world will be easier, and fewer soggy pasta-potato lumps will be fed to a dog.

Programmers follow recipes all the time too. But sometimes, the equipment they use, the instructions they use, and the interpretations they use don't come together to make the lavish meal they wanted. Thankfully, there are people who dedicate their careers to making the life of a programmer more productive. In your case, you've specialized in one of the tools a programmer uses, the language, to try to make them better.

Programming languages are used by people to provide instructions to a computer. Everything that a computer does is done through some computer code written in a programming language by a programmer.

So if, for instance, we want the TV channel to change when we press a button, then we would need to write some code in a programming language to do this. The same goes for everything that happens with a laptop, a smartphone, and anything else whose operation involves a computer.

It might seem that there could be a single programming language that programmers use to write all programs. But this just isn't the case. Different programming languages exist for all different types of applications.

Some are used to build websites, others to build desktop applications (like Microsoft Word).

One of the reasons there are many different programming languages is that some are better suited to some tasks than others. Another reason is that some languages are designed to be used with different computers. So you can't always take code written for a smartphone and run it on a laptop. And while some programmers can program in many languages, and many are capable of learning new languages quickly, any programmer is going to know some languages better than others, and some not at all.

So programming languages are one of the key technologies that make computers work. Well designed programming languages can increase the productivity of programmers tremendously. They can also improve security and reduce programming errors, bugs, and defects.

And so basically, this is what I study:
How do different parts of programming languages affect programmer productivity? What parts of a particular language make it better for writing code for a web site? Why are some languages more popular than others?

While it may seem that these questions are esoteric and far removed from the average person's daily life, the opposite is true. The study of programming languages is vital to any product or service that uses a computer. And today that means just about everything ;)

Programming languages are used to provide instructions to computers. Human languages are used to communicate ideas to other people and to help form our own thoughts. The Sapir-Whorf hypothesis says that the language that you use influences your thought. (The degree to which the Sapir-Whorf hypothesis is true is debated, but we can just accept it to be true when discussing programming languages. Otherwise, you might as well give up on your PhD. 😜)

There is a wide variety of human languages, some with features that we consider to be exotic. For example:

Some languages require you to indicate evidentiality with every statement: whether the statement is due to your own experience, your own inference, hearsay, speculation, etc.

Some languages form huge words by agglutination; each word may be highly modified to encode a lot of information about its grammatical role. Other languages hardly ever modify the words at all, and rely on word order or particles to express that information.

Languages differ in vocabulary size. Some languages have words that can only be translated with a circumlocution (e.g. 엄친아). Some languages have brilliant expressions that are so good that other languages borrow them (e.g. Schadenfreude).

Some languages have no concept of left/right; you must express everything in terms of north/south/east/west.

Languages vary in their phonotactics. For example, the syllable rate varies, with Japanese tending to use many simple syllables, while Chinese is slower, but encodes information in tones.

Languages vary in information density. If you look at a Chinese-English-French translation, you'll see that the Chinese version is very compact on paper, and the French will occupy the most space.

Some languages, like English, are promiscuous: anyone can freely borrow a word from another language and Englishify it. Other languages, like French, have a standardizing body that resists change; Icelandic, similarly, is fiercely conservative.

Is there a best language, objectively speaking? The answer might depend on what you are trying to do.

If you're trying to communicate secretly over the radio, Navajo would be a good bet. (Mention your favorite "write-only" language.)

If you're trying to write a warning to future generations, and the inscription must be understandable a few thousand years from now, you might want to use Chinese (due to the sheer number of speakers and the stability of its writing system) or maybe English (which has changed dramatically over the last millennium, but is very widely known). (C and JavaScript will probably live "forever".)

Perhaps neutrality is important, in which case you might choose Esperanto. (Java is designed to be portable; C is less so.)

Maybe you need to express a thought with complete precision, and no natural language will suffice. You have to resort to Ithkuil!

It may be very difficult to translate between some language pairs (e.g. Arabic-English) because of vast cultural differences and cultural connotations with certain words. Similarly, some ideas are not easily expressed in certain computer languages because the concept just doesn't exist (e.g. tail-recursiveness).

In the end, everything on a computer gets translated to machine language, but picking the right language for the job can greatly affect the productivity, reliability, performance, and agility of your software. We choose straitjacket languages like Ada or Java for "serious" projects, and Ruby or Perl for maximum whipuptitude. For querying a database, SQL is the usual language; writing your own C code would be idiotic.

Based on these analogies, I'm sure you can expound on ideas like functional programming, type safety, type inference, etc. for a few hours.

$\begingroup$One difficulty with this is that some people have never really reflected about their own natural language. So the language examples may be almost as hard for some people to grasp, as the notion of a programming language. If you know the person is (at least) bilingual, the prospects are probably better....$\endgroup$
– TextGeekMay 4 '16 at 16:08

$\begingroup$@TextGeek Even within English, there are dialects. People argue about language rules like double negatives and ending sentences with a preposition. Language evolves all the time, sometimes in controversial ways. The popularity of texting has led to new abbreviations. In American English, you can turn any noun into a scandal by adding a "-gate" suffix. I don't think you have to be bilingual to appreciate language innovation and diversity.$\endgroup$
– 200_successMay 4 '16 at 16:45

Computer languages relate somewhat roughly to human languages: both use standard, common, shared words. Consider that there are thousands of human languages, some defunct and others active, and that their vocabulary and usage continually evolve over time. Some people create new, useful words to express concepts that were not previously expressible. Another overlapping aspect of computer and human language is grammar. Some languages, e.g. English, have very complex grammar; consider all the different tenses and the complex rules that govern them, whereas other languages do not have the same tenses. Another correlation is with parts-of-speech categories, e.g. noun, verb, adverb, adjective, etc.; these function somewhat like types in computer languages. Imagine creating new languages with parts-of-speech categories that have not previously been considered, or with new combinations of parts of speech drawn from different languages. So note that linguistics has a connection to computer science, especially in Chomskian language theory.

Software and hardware are often compared to engines and machines, and are even named after them. There is an old software analogy that modifying a complex 24/7 production system is like changing the engines of a plane while it is in flight. Note also that jet engines are highly complex and involve massive, precise specifications of interchangeable parts; there is some analogy here to computer languages that control hardware. Imagine all the complexity of the documents that describe exactly how to build a jet engine: all the parts, how they are constructed, how they fit together, how they are assembled, etc. They are created and compiled by engineers using a precise format, structure, and set of conventional rules; software is somewhat analogous to this.

As for machines, anyone without scientific or mathematical education, even a child, can understand the basic concept and many aspects of a Turing machine, a wondrous creation. One suspects Turing was inspired by typewriters and/or teletype machines. One could describe the state table, show a sample state table that computes multiplication, and let them watch a YouTube animation of a Turing machine carrying out a basic computation such as a multiplication. Tell them that the state table can compute, or literally decide, a language, and that the inputs are literally called words: Turing machines accept words in languages. Then tell them that there are two state tables that both compute multiplication, but one is much faster or more efficient than the other, and that your research involves finding better state-table concepts. Figuring out how to build state tables effectively involves programming languages that summarize the contents of large state tables: a compiler converts code written in a programming language into a large state table.
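A working state table is small enough to show in full. A multiplication table would be long, so this sketch runs a far simpler one, the unary successor function; the tape encoding, the "_" blank symbol, and the table format are illustrative assumptions:

```python
def run_turing_machine(table, tape, state="start"):
    """Follow a state table of the form (state, symbol) -> (write, move, next_state)."""
    cells, pos = dict(enumerate(tape)), 0
    while state != "halt":
        symbol = cells.get(pos, "_")          # "_" stands for a blank cell
        write, move, state = table[(state, symbol)]
        cells[pos] = write
        pos += 1 if move == "R" else -1
    return "".join(cells[i] for i in sorted(cells)).strip("_")

# Successor in unary: walk right past the 1s, write one more 1, then halt.
successor = {
    ("start", "1"): ("1", "R", "start"),
    ("start", "_"): ("1", "R", "halt"),
}
assert run_turing_machine(successor, "111") == "1111"   # 3 + 1 = 4
```

Every step the machine takes is a single lookup in the table, which is what makes the model so easy to animate and to explain.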

Sorting algorithms are a great entry-level metaphor for computer science. One can look at the list of instructions, i.e. the algorithm, for each sort; one can learn what a bubble sort is versus an insertion sort, using a deck of cards. Now focus on the different sets of instructions, and consider that they have to be written in a very precise language, as precise as mathematical statements, with a strict syntax and with basic common structures: conditional logic, loops, variables, etc. Explain that the art of this precise specification is the study of computer languages, and that different languages may yield the same algorithms, yet there are subtle stylistic variations among these languages that are studied in great depth.
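The two card-sorting strategies mentioned above can be written in the same precise language, which makes the "different sets of instructions for the same task" point concrete; a sketch:

```python
def bubble_sort(cards):
    """Repeatedly swap adjacent out-of-order cards until none remain."""
    cards = list(cards)
    swapped = True
    while swapped:
        swapped = False
        for i in range(len(cards) - 1):
            if cards[i] > cards[i + 1]:
                cards[i], cards[i + 1] = cards[i + 1], cards[i]
                swapped = True
    return cards

def insertion_sort(cards):
    """Take each card in turn and slide it into place among the sorted ones."""
    sorted_cards = []
    for card in cards:
        i = 0
        while i < len(sorted_cards) and sorted_cards[i] <= card:
            i += 1
        sorted_cards.insert(i, card)
    return sorted_cards

hand = [7, 2, 9, 4, 2]
assert bubble_sort(hand) == insertion_sort(hand) == [2, 2, 4, 7, 9]
```

Both produce the same sorted hand, yet the instructions differ, which is exactly the contrast the metaphor is after.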

$\begingroup$"I think it will work assuming they already know a bit about computers" that's the whole point: if you carefully read my answer, you will see that the main problem is explaining what I do to those who have no clue of how computers actually work.$\endgroup$
– effeffeMay 2 '16 at 16:16

$\begingroup$@effeffe: but they're aware that computers exist? And they likely have some idea that there are things called "programs", "applications", or "apps", even if they've never knowingly used a computer themselves. So for the sake of simplicity, programming is (among other things) creating an app by writing down instructions that the computer can follow. Naturally it would take a lot of time and instruction for them to appreciate any detail of programming or of how it can be improved, but if they know that computers and programs get better over time they know "a bit" about computers, at least.$\endgroup$
– Steve JessopMay 3 '16 at 11:45

If you don't want to use comparisons, although I think the "algorithm" one that Lieuwe brought up is very nice to convey the idea, you could say that you want to reduce misunderstandings between humans and computers. After all, you're dealing with languages, and that's something very basic to humans, I guess. So why not pretend the computer is just another mind you want to talk to?

Humans created computers, so we know how the computer's mind works. But the computer usually does not "know" how our minds work. (Or what our actual intention was when we write an intricate and finely crafted bug that lives happily in the brambles of our code for many cycles ;) ) Thus, it is up to us to refine the language we use to communicate with them, and reduce possibilities for misunderstandings. And that's what you do. Analogies like spellchecking or grammar are, of course, very welcome in this context.

My work is somewhat like exploring new approaches to musical notation. (example) Although the predominant notation systems are quite sophisticated, it is valuable to explore alternatives that reduce time/effort/errors for the composer, for the performer, or allow things that are clunky or not possible to express in the predominant system, or even just promote different types of thinking and thus lead to novel compositions. (This makes it easy to understand that one of the challenges is in convincing people to adopt the new stuff.)

I am helping to design better tools for people in my industry. Just like people in the manufacturing industry are aided by innovations like cordless drills, laser cutters, and 3D printers (none of which existed 60 years ago), people in the software development industry are aided by more powerful, more precise, more robust, or easier-to-use programming languages and features thereof.

Well, most likely, you have some reason for studying languages - use that reason. E.g.:

I'm trying to make computers and their applications cheaper, easier to use and safer.

If that's something that piques their attention, feel free to go a bit deeper, but don't forget about inferential distance - most people have huge trouble explaining something that's too far from the student's existing experience and knowledge.

Languages are way down the computer ladder. The people you're trying to explain to likely don't know what an algorithm is, much less that there are different ways of representing said algorithm that are more or less useful. You can explain anything if you go gently enough, but be prepared to go through many layers if you want to explain something as "deep" as "programming language RnD". It helps if the people you're explaining to already handle math well, but that's about as rare as computer knowledge :)

Go easy on analogies. They sound like they're helping, but in my experience they usually add to the confusion, even when you think they're a great help. Programming languages are kind of like lawyer speak... but that's not very helpful for people who don't already understand both programming and lawyering, really. If you really need some extra help, examples work well enough, but you do need to tailor them to the person you're explaining to (and to what exactly you're researching).

You know how you can't divide by zero? I'm trying to make sure that computers never even try to divide by zero, so they don't crash when someone makes a mistake.
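That pitch is close to what a type system actually does; a toy sketch of the idea, moving the zero check to one place so that the division itself can never fail. (This Python sketch is only an approximation: a static type system enforces the guarantee at compile time, with no run-time check at all.)

```python
class NonZero:
    """A number checked to be non-zero at construction, so division can't fail later."""
    def __init__(self, value):
        if value == 0:
            raise ValueError("refusing to construct a zero divisor")
        self.value = value

def safe_divide(numerator, divisor):
    # divisor is a NonZero, so this line can never divide by zero.
    return numerator / divisor.value

assert safe_divide(10, NonZero(2)) == 5.0
```

The mistake is caught where the bad value is created, not deep inside the program where the crash would otherwise happen.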

I find the best analogies are tailored to the person you are talking to. Are they a painter? Discuss how what you're doing is the equivalent of exploring the theory of how to make better brushes, only this field is only 60 years old instead of 600! Equestrians? Compare it to the development of task specific horse-shoes over the years.

If they really want to understand better, my favorite way to explain it is to talk about filling out tax forms, and the instructions on each line. I find most people over 20 have some experience with IRS Form 1040, and those boxes you fill out correspond pretty nicely to the assignment of variables. Then I point out that software does the equivalent of about 2 billion of these a minute (a made-up number, but it gets the point across). If you're studying computer languages, it's easy for people to grasp why better instructions would actually matter =)

If they still seem interested, this is about the point where I start trying to explain flow control. Usually, by the time I finish discussing flow control over Form 1040, they start getting all excited and start asking if I've sold any of my ideas to the government yet!

I often talk about embedded systems, so sometimes I'll merge the Form 1040 analogy with a robot cooking a loaf of bread (or 500). Usually it works well, though for some reason going from IRS tax forms to making something that leaves a good taste in your mouth is a really hard stretch for some people.

Your parents have some kind of computer. Probably not one, but many. They may have a desktop computer or a laptop, or a phone, or an iPad. If not, then they have a washing machine or a video recorder or a DVD player or something else with a computer inside.

If they don't, you say: "Sorry, but you managed to have a life completely without computers. So I can't explain to you what I'm doing. But the world is changing. You just need to trust me that I know what I'm doing. "

If they have any kind of computer, you say: "This is a computer, and it doesn't just work by magic. It works because some clever man or woman wrote a program that tells your computer how to work. And the job of these clever men and women is really hard, and what I'm doing is helping them to make the job easier".

A while back I decided that a good analogy for programming which could be easily understood for people with little or no computer experience, would be writing knitting recipes.

A good knitting recipe covers several sizes in the same list of instructions, which gives you loops and if-statements. It is unreadable to those who do not knit, and if it contains errors you end up with misaligned patterns or an extra sleeve. It is then up to you, as the recipe writer, to find out where the recipe went wrong based on the incorrect sweaters, and to fix it.
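The "several sizes in one set of instructions" idea maps directly onto parameters, loops, and if-statements; a toy sketch (the stitch counts and rows are invented for illustration, not a real pattern):

```python
def sweater_instructions(size):
    """One recipe, several sizes: pick counts by size, then loop over rows."""
    cast_on = {"S": 80, "M": 90, "L": 100}[size]   # invented stitch counts
    steps = [f"Cast on {cast_on} stitches."]
    for row in range(1, 4):                        # a short ribbing section
        if row % 2 == 1:                           # the recipe's if-statement
            steps.append(f"Row {row}: knit 2, purl 2 to end.")
        else:
            steps.append(f"Row {row}: purl 2, knit 2 to end.")
    return steps

assert sweater_instructions("M")[0] == "Cast on 90 stitches."
```

A mistake in the size table or the row condition produces a perfectly-followed but wrong sweater, which is precisely the debugging analogy above.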

You can then use the example of "I am looking at how recipes are different for knitting, crocheting, Nålebinding and similar" and learning to X (where you need to find an analogy for what you actually do).

$\begingroup$@DavidRicherby This is why I suggested to use the example of OP looking at how recipes look for different types of handywork. Apparently these are very different too.$\endgroup$
– Thorbjørn Ravn AndersenMay 3 '16 at 5:27

It's Magic!

When people with no technical background whatsoever ask what it means to write software/programs/do software engineering, I just tell them what it really is: magic. Magicians incant spells in an esoteric language to perform certain tasks, they wave their wands, and something magical happens. I incant certain spells in an esoteric language, I wiggle my mouse, and something magical happens (as far as they are concerned).

If they aren't convinced, I ask them to turn on their smartphone and tell me how anything on it really works. They usually say: "I dunno, it just does. I push buttons and stuff happens." Then I tell them: "Yes, exactly, but I know what is really going on, and it is basically the same as Harry Potter waving his wand and saying 'Hocus pocus.'" For all intents and purposes, to the layperson, I think this is a perfectly legitimate explanation.

Almost everyone knows who Harry Potter is, what he does, and what makes him special (at least that he is a magician and can do powerful things with magic). You can simply explain that sometimes magicians need to write their spells in a new language to make their magic even more powerful, which is not really that far from the truth. You can just tell them that you are studying the process of creating more powerful spellbooks to aid a generation of even more powerful wizards who can create even greater magic on their favorite consumer devices. Everyone appreciates that.

Epic Fail

If that fails, I fall back to the recipe example, because the non-uniformity of solutions at least teaches people that there is more than one way to skin a cat (or bake a cake), and this is part of what makes programming so tricky. It also helps put the person in the position of being the CPU and sometimes following instructions they might not understand (most people probably don't know the difference between baking soda and baking powder, and why a recipe would have one or the other).

Dead Ends

I don't like using math as an example, because to do math, you have to understand what the steps are for. You can't do very much math by following instructions blindly (well, you can, but that's not how humans learn it, usually). I don't like sorting as an example because it is too technical for the layperson to appreciate. If I am talking to someone who can appreciate the sorting example, then they probably already have some idea what programming is about, and likely have tried it themselves.

$\begingroup$I don't see what this has to do with the question. Specifically, the question is asking how to explain type theory to laypeople, and type theory is math.$\endgroup$
– David RicherbyMay 3 '16 at 2:14

$\begingroup$Perhaps sadly, perhaps not, but programming in general is a closed book to 99+% of the world's population. I've spent a fair amount of time over the years trying to get non-programmers to understand programming, without success. Explaining the intricacies and delicacies of various type systems is akin to explaining subatomic particle physics to those same people - their eyes will glaze over and they'll probably be polite but they won't get it. And that's OK - they don't NEED to understand it, and in all likelihood they don't CARE that they don't understand it. It's enough that WE do. :-)$\endgroup$
– Bob JarvisMay 5 '16 at 15:27
