When starting a project for a company that's not primarily a programming company, one common expectation is that there will be a finished product at the end, free of all bugs, that does everything needed right away. However, that's rarely the case.

What are some ways to manage expectations and explain to non-programmers how software development differs from other types of product development?

Answer: Start with Bugs (15 Votes)

If you own a computer, you've encountered a bug. Bugs are a good place to start.

"What's the most annoying way an application has ever failed you? Multiply that by ten, and you'll know what our users will experience if we don't devote enough resources to testing and maintenance."

And don't underestimate the value of establishing a good working relationship with non-programmers. If you can establish that your judgment may be trusted, they'll take you seriously when you sound the alarm that X is going to fail spectacularly if you don't do Y pronto, even if they don't completely understand your reasoning.

Answer: The Factory Metaphor (1 Vote)

Think of software as a machine or assembly line that exists inside the computer. Raw materials and components are fed into the machine, which follows a set of procedures to produce some final product. The procedures are set up to perform a specific operation on some raw material or component to a specific set of parameters (e.g., time, temperature, distance) in a particular order. If the details of an operation are incorrect, the machine's sensors aren't correctly calibrated, or some raw material or component isn't within expected quality standards, the operation's outcome changes and the product doesn't turn out as expected.
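To make the metaphor concrete, here's a minimal sketch (an invented example, not from the answer) of one "machine step" with a single calibration parameter: the same procedure produces a broken product as soon as the parameter drifts out of spec or the input material is out of tolerance.

    # Hypothetical "machine step" from the factory metaphor: one operation,
    # controlled by a calibration parameter and fed one piece of raw material.
    def cure_part(raw_length_mm: float, oven_temp_c: float) -> str:
        """Cure a part; the (made-up) spec assumes the oven runs at 180 +/- 5 C."""
        if not (175 <= oven_temp_c <= 185):
            return "defective: cured out of spec"    # miscalibrated sensor or setting
        if not (99 <= raw_length_mm <= 101):
            return "defective: bad raw material"     # input outside quality standards
        return "good part"

    print(cure_part(100.0, 180.0))   # good part
    print(cure_part(100.0, 200.0))   # same procedure, wrong parameter -> broken product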

That's a broken product and a factory that needs to be fixed.

Another Metaphor (5 Votes)

The best metaphor for anything is itself. Anything different will lose some accuracy.

As such, picking the best metaphor depends on what it is, specifically, you want to capture about programming. Since there will be a lot of answers given here about coding metaphors, I'll answer with the classic metaphor for the development process as a whole:

Building Construction

The most common aspect of this metaphor is that a physical architect is somewhat analogous to a software architect. Here are a few other parallels:

Changes are cheaper the earlier you make them. That is, you can move one line on paper now, or ten tons of cement later.

A building without a proper plan will tend to collapse.

The builders attempt to implement what the client wants. If the client doesn't accurately describe what they want the building to look like (or there is some other failure in communication), it'll be costly to change.

There are certain laws of physics that cannot be bent. Just as a 300-foot-wide second story cannot be built on a 100-foot-wide first story, feature X cannot be built without a robust subsystem Y.

Of course, like any metaphor, the construction metaphor has its limitations. Some flaws:

Buildings are 1-time use; you build it somewhere, and there it stays. You can't copy it a million times for a million different users with a million different needs at zero incremental cost.

Buildings are considerably more immutable than software.

There's no clear analogy to building material cost. A line of code costs nothing—only the time it takes to produce it costs money.

The incremental architecture that is (depending on who you ask) possible with software is not possible with construction, where you design it once, then build it.

So, like any analogy, it depends what you're trying to explain. Be wary of over-relying on any one metaphor, or your customer will start wondering what the property taxes on his new payroll system will be.

Think you know how to get non-programmers to understand the development process? Disagree with the opinions expressed above? Downvote or upvote an answer, or leave your own answer at the original post at Stack Exchange, a network of 80+ sites where you can trade expert knowledge on topics like web apps, cycling, scientific skepticism, and (almost) everything in between.

The best way I have found to explain software to non-programmers is that software is a mystery like the Holy Trinity: it's mathematics, engineering, and semantics, and yet it's none of these three things.

Software is mathematics because it must be proven correct beyond doubt. It's engineering because unlike mathematics it has to run in real time against multiple limitations and constraints. Finally, it's semantics because there can't be software without a language and all languages are based on semantics.

People's abilities develop differently. Some people become great speakers, while others who are equally intelligent are unable to put two words together without sounding stupid. Some people are able to abstract naturally and understand mathematics almost instinctively, while some people can't get their feet one inch off the ground.

Very much like a musician who’s been playing for 20 years hears a wrong note in a piece of music without thinking, a developer with 20 years of experience ought to be able to see a simple piece of code and catch a bug in it.

When I interview potential developers I take their age and years in the business into account and talk about these things. In 10 minutes it becomes pretty clear whether they were born to be a developer or took a wrong turn and ended up in the wrong profession.

I was having a similar discussion with my wife last night - does she even know what I do in the basement every day (or night, I have flexible hours)?

I work remote, and for the last 2 years on a huge custom development project for a municipality.

Working from home, I get frequent visits (for better or worse). Sometimes she walks in and I'm super focused - like balancing 10 needles on 10 heads while trying to add another. Or I'm surfing the web / YouTube / staring at the screen. Or I'm not even there at all, and I'm walking about (not, mind you, to my own detriment).

How do I explain, in a believable way, that all 3 are actually working - and working damn hard?

I've used the writer analogy, but she doesn't buy into it.

My project is a novel (or a biography, haven't decided, since it's not fiction). Some days I make no progress - have a frustrating problem that I just can't get past - a blocking bug - which ends in frustration. Some days, I write the code equivalent of a short story, and my bleary-eyed self emerges and exclaims, "YOW! WHAT A DAY!" Other days, like yesterday, I refactor a bunch of brute-force code into an elegant Haiku, and am just pleased with myself - a day when things just come together. Other nights I'll (re)-clock in at 10 and go until 4, resuming at 8.

She can't understand how I can do this day after day, every day, staring at the screen. And, though it's not the career trajectory I set out on 7 years ago, I have found quite a bit of satisfaction in it lately (gaining mastery of any field tends to create a positive feedback loop, for me). Billable hours vs. real hours don't compute, either. Neither does the idea that what appears to be doing nothing can really be burning some serious cycles (foreground or background processing, mind you) solving a problem.

Metaphors can be a bit patronising. Try explaining the process: what the components are and what the workflow is. Avoid technology-specific terms where possible. In fact, explaining the process in a non-technical way can also help your own thought processes - sort of like pseudocode. If the customer understands the pipeline you're developing, the chances are they'll have a better understanding of the complexity and a greater tolerance when bugs appear.

The best explanation I've seen is the first answer on this thread: http://www.quora.com/Engineering-Manage ... tor-of-2-3 Though it's more an in-depth description of "the devil is in the details," using the coastline-of-England metaphor, than a description of the whole development process, it helps explain the all-too-common missed deadlines in our biz.

@gheeren: that answer is totally great. It really gives a feel for the development process. Gotta bookmark it. You can't explain to a lay person what the precise problems are in development, partly because you've got to be a developer to understand them, partly because it's hard for us to pinpoint them ourselves. However, the "plan a walk, how-hard-can-it-be" metaphor nails it. You start with a clear goal, a map, reasonable estimates, and then arrive months later because the map is not the terrain...

The answers from Stack Exchange, on the other hand, all miss a couple of vital points and have been written in haste. And the author of the one that falls back on "it's mathematically impossible" is such a nerd that he (does anyone think it's a woman?) should never be allowed to communicate with non-devs.

I've got a similar problem. In this case I started working for a person who is computer illiterate. They are very smart (near genius) in their own field of expertise, but when it comes to computers they don't spend any time to learn even the most basic things. I am the sole computer technical person in the shop. Additionally, we've known each other peripherally for about 20 years prior to this arrangement.

To set the stage: the boss had a process that they had written down over the years and needed somebody to put it into a computer program. About 4 months later I emerged from the completed project and it ran flawlessly. We put it into multiple clients' hands. About 4 months later a bug was reported. After fixing the bug, I pointed out to the boss that, according to the written process, it wasn't a bug; it was coded as specified. At that point, we both chalked it up to a lesson learned and at most a miscommunication. No harm done.

Further projects came into the shop that did not have any written process. At this point, I should have taken it upon myself to write up something regarding the new projects, but I didn't. Project creep happened, and testing/QA took a backseat to the hurried schedule. The completed project was released about 3 months later, but without a full regression test. 1.5 months later, a catastrophic bug appeared. Said bug was fixed and the project has been working error-free to this day (about 4 months later).

The problem is that when the catastrophic bug happened, the boss emphatically stated that we (the company) have to guarantee that the projects we complete are 100% error free. I told them that, at best, we can spend a significant amount of time testing the most egregious scenarios and making sure the correct things happen, but we can't account for every unknown bug. I am having difficulty explaining to them the importance of testing and why it is impossible for any appreciably complicated program to be error free. I've brought up examples of software projects that infamously went wrong (Mars Polar Lander, Space Shuttle redundancy failures, various articles in our field about project design failures), but it doesn't appear to make any dent in their position.

I understand from a business perspective that it is great to say 100% error free, but the reality is all about managing expectations and working with the customers as problems arise and fixing issues in a timely manner. From our side, I need to be able to tie in the importance of testing into each project. I am nearly there I think with that one. The boss is still adamant about the 100% error free guarantee and whenever we talk about it, I tend to get defensive because what I see is a mathematical uncertainty and the boss appears to see a failure to execute properly.

Is there a way to allow them to see that software development is never a 100% mathematical engineering effort and that some bugs, no matter how carefully we work to avoid them, will inevitably be exposed? I'll read over that link about the trip planning and terrain as the variable. Maybe that will be just the thing I need to break through this communication barrier.

There's no clear analogy to building material cost. A line of code costs nothing—only the time it takes to produce it costs money.

Not quite true. There is a marginal cost to every line of code. The computer costs money. The office and its furniture cost money. The power to run the computer costs money. Granted, these are pretty minor when amortized over a single line of code, but to say it "costs nothing" is simply untrue.

Marginal cost ≠ fixed cost. The office, the furniture, and the computers that run in it are considered fixed costs. Excepting time and the implied opportunity cost, the poster is correct in asserting that a line of code has no intrinsic cost. It's as close to nothing as we can get. You are right that nothing costs nothing, but so many things cost so close to nothing that they can be treated that way in practice. A brick costs orders of magnitude more than a line of code saved on a disk somewhere. Put another way, a line of code costs about as much as a few skin flakes.

I understand from a business perspective that it is great to say 100% error free...

Strange, I would have thought the opposite, since most business decisions are about managing risk, which you can't do if you've deluded yourself into believing that something is error free.

Most small businesses, though, don't have nearly the resources to thoroughly study and understand their processes (as you've experienced) or to quantify risk and just "do" things. They're either lucky enough to learn from the often painful and expensive lessons, or are unlucky and simply go out of business.

Pardon me while I pour a bucket of cold water on this whole discussion. I think that we still need to work on explaining the development processes to programmers.

Say there's a scale of understanding, from no knowledge that a thing exists, to vague awareness, to illiteracy, to basic comprehension, to functional mastery, to hacking the guts of it, to fully grokking something: where are most programmers when it comes to development processes?

Personally, I know what the jargon means and how the concepts fit together, so I'm beyond basic comprehension. And while I've done many of the things involved, I'm not at a point where I can do them reliably, so I don't think I'm at the stage of functional mastery.

To a large extent, I don't expect there's any way to learn this stuff better than by doing it, simply because I've read books and articles on it and often see a lot of absolutely certain blanket statements and not a lot of systematic thought. And in the development community, there is way too much sentiment along the lines of "we need to do this thing because it's the thing everyone does!"

And I certainly don't see much evidence that being competent technically means you are qualified to talk about development processes, even though it is something of a prerequisite. I think there are some very smart programmers who are awful developers, that is, they know how to construct brilliant algorithms or they can cleverly hack something together, but what comes out simply isn't a product or it doesn't meet requirements. A lot of physicists and chemists, for instance, seem to be like that.

1) Have them write down instructions for going to the store and getting groceries
2) Point out the major flaws and have them rewrite it
3) Point out the new bugs they just introduced
4) Either they give up and scoff at you, or they keep participating and refining a few times
5) Point out what an immensely tangled mess their final instructions are

Computers do exactly what you say and absolutely nothing else. Anything you did not anticipate can be perceived by the user as a bug. Reality has no limit on the possible inputs and environments that your program will be exposed to. Every line of code you write to address one bug can introduce new bugs.

When the definition of "bug" is "something that doesn't work the way I feel it should", there is simply no way to write bug-free software.
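A toy illustration of "anything you did not anticipate" (an invented example, not from the post): the code below does exactly what it was told, and the user still sees bugs.

    # Invented example: a function that "works" for the inputs the author anticipated.
    def average(values):
        return sum(values) / len(values)

    print(average([3, 4, 5]))    # 4.0 -- exactly what was asked for

    # Reality supplies inputs nobody anticipated:
    # average([])            -> ZeroDivisionError
    # average(["3", "4"])    -> TypeError
    # To the user, both are simply "bugs", even though the code did precisely what it said.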

Software is mathematics because it must be proven correct beyond doubt.

Really?

I still like the way it worked in the old days. I'd get an idea, build a user interface to collect data, add the logic to process the data, and then keep tweaking it until it did what it was supposed to do. Built a system that generated shop orders for a factory and the only proof required was that they looked like the old shop orders.

Layer upon layer upon layer of busy-work has been added to make it more complicated than it has to be. Too many lackluster programmers made it important to slow down the process, I guess. Kinda sad...

Funny to see you guys use the building design and construction process as a metaphor. I'm a structural engineer, and I find myself using metaphors to explain that process to other people. As for changes being cheaper the earlier you make them... it's so very true, but I can't imagine someone not getting it regarding programs, and then suddenly understanding it in the context of buildings.

As a non-programmer, it'd help us if you could break down programs into pieces for us. Ideally into large components with sub-components. Something we can sort of visualize, like a flowchart. As Huperniketes says, make it sound like a machine. You can go into detail about what parts of your program do, just not how they do it.

...Is there a way to allow them to see that software development is never a 100% mathematical engineering effort and that some bugs, no matter how carefully we work to avoid them, will inevitably be exposed? I'll read over that link about the trip planning and terrain as the variable. Maybe that will be just the thing I need to break through this communication barrier.

Get him to hire someone to do QA, and have him be involved in developing the use cases for the QA person to test. I think once he starts doing that, he will realize that some of the crazy edge cases are unlikely enough that it isn't worth paying someone to test for them. The added benefit is that you are not as responsible for post-release bugs, since the QA person had to certify the use cases, which were specified by the boss.

Marginal cost ≠ fixed cost. The office, the furniture, and the computers that run in it are considered fixed costs. Excepting time and the implied opportunity cost, the poster is correct in asserting that a line of code has no intrinsic cost. It's as close to nothing as we can get. You are right that nothing costs nothing, but so many things cost so close to nothing that they can be treated that way in practice. A brick costs orders of magnitude more than a line of code saved on a disk somewhere. Put another way, a line of code costs about as much as a few skin flakes.

There is at least the opportunity cost... the developer could be doing something else with their time.

When I was doing consulting and had a bill rate, I used that to help the client better understand the impact of their changes. If they want an extra button on the screen, and I know that it's going to take X hours of UI work and Y hours of database work at bill rate Z, I can tell them that the button will cost $3,000. Putting an actual price on changes helps the client realize that changes aren't free, and they usually agree to forgo them or are at least not surprised when the project isn't done within the original estimate.
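For what it's worth, the arithmetic behind that kind of estimate is trivial; here's a minimal sketch (the hours and bill rate are made-up numbers, not from the post):

    # Hypothetical change-request estimate: X hours of UI work, Y hours of
    # database work, billed at rate Z. All numbers below are invented.
    ui_hours = 8        # X: estimated UI work for the extra button
    db_hours = 12       # Y: estimated database work behind it
    bill_rate = 150     # Z: dollars per hour

    cost = (ui_hours + db_hours) * bill_rate
    print(f"Estimated cost of the extra button: ${cost:,}")   # -> $3,000

The point isn't the precision of the number; it's that attaching any dollar figure to a "small" change reframes it as a purchasing decision rather than a favor.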

Software is mathematics because it must be proven correct beyond doubt.

Really?

I still like the way it worked in the old days. I'd get an idea, build a user interface to collect data, add the logic to process the data, and then keep tweaking it until it did what it was supposed to do. Built a system that generated shop orders for a factory and the only proof required was that they looked like the old shop orders.

Layer upon layer upon layer of busy-work has been added to make it more complicated than it has to be. Too many lackluster programmers made it important to slow down the process, I guess. Kinda sad...

I feel like a lot of this was an attempt to manage risk through project management. Hacking away at something until it is done with no clear idea of the amount of time it will take drives project managers (PMs) crazy.

So instead they try to break the process up into parts so they can make up a bunch of estimates and feel like they have everything under control. In some cases it is necessary, when the programmer is a different person, has a different skill level, or is at a different location and needs detailed specs to do the coding. It has been frustrating to me in the past when I was the only one responsible for application design, requirements, specifications, development, and testing. The additional overhead of creating hundreds of pages of documentation that I knew nobody was ever going to look at, simply for the sake of following a process, always seemed like an enormous waste of effort.

I think that the PMs don’t want to acknowledge that development is an art and see it as something more akin to factory assembly work. Having to do all the design before writing a line of code impairs my creative process. It’s difficult to see every possibility until you are in the thick of it.

PM-managed development also worries about 'gold plating', which is making something better than what was specified by the requester. This is seen as a waste of resources and is discouraged. I always pride myself on being able to see the vision of what the customer wants even if they can't articulate it. Having to build only what they specifically ask for, and knowing that with a little extra effort you could build something that would blow them away, can be pretty spirit-crushing.

Agile development has grown up to try to address many of these issues. With Agile you don't create all of your specs before you touch a line of code... it's all an iterative process. And the client is involved, so you can work with them to create a product which more closely matches their ideal vision (even if they don't know what that is at the start).

Marginal cost ≠ fixed cost. The office, the furniture, and the computers that run in it are considered fixed costs. Excepting time and the implied opportunity cost, the poster is correct in asserting that a line of code has no intrinsic cost. It's as close to nothing as we can get. You are right that nothing costs nothing, but so many things cost so close to nothing that they can be treated that way in practice. A brick costs orders of magnitude more than a line of code saved on a disk somewhere. Put another way, a line of code costs about as much as a few skin flakes.

I own and run a business. Marginal versus fixed cost is a bookkeeping fiction. There is a cost, regardless of how it is dealt with.

I own and run a business. Marginal versus fixed cost is a bookkeeping fiction. There is a cost, regardless of how it is dealt with.

I wish more people understood this; when you hear people say that the marginal cost is zero, or close to zero, what they typically seem to be saying is that the cost is zero, which is far from the truth.

I.e., the fact that the marginal cost of producing a DVD is 5 cents is basically meaningless when you have to recoup the costs of the billion-dollar fabrication facility and the millions of dollars spent to obtain the rights to the movie on the DVD.

Marginal cost ≠ fixed cost. The office, the furniture, and the computers that run in it are considered fixed costs. Excepting time and the implied opportunity cost, the poster is correct in asserting that a line of code has no intrinsic cost. It's as close to nothing as we can get. You are right that nothing costs nothing, but so many things cost so close to nothing that they can be treated that way in practice. A brick costs orders of magnitude more than a line of code saved on a disk somewhere. Put another way, a line of code costs about as much as a few skin flakes.

I own and run a business. Marginal versus fixed cost is a bookkeeping fiction. There is a cost, regardless of how it is dealt with.

So do I, and no it isn't, particularly if you're loss-making, since it determines whether you're better off winding up the business or not.

I was having a similar discussion with my wife last night - does she even know what I do in the basement every day (or night, I have flexible hours)?

I work remote, and for the last 2 years on a huge custom development project for a municipality.

Working from home, I get frequent visits (for better or worse). Sometimes she walks in and I'm super focused - like balancing 10 needles on 10 heads while trying to add another. Or I'm surfing the web / YouTube / staring at the screen. Or I'm not even there at all, and I'm walking about (not, mind you, to my own detriment).

How do I explain, in a believable way, that all 3 are actually working - and working damn hard?

I've used the writer analogy, but she doesn't buy into it.

My project is a novel (or a biography, haven't decided, since it's not fiction). Some days I make no progress - have a frustrating problem that I just can't get past - a blocking bug - which ends in frustration. Some days, I write the code equivalent of a short story, and my bleary-eyed self emerges and exclaims, "YOW! WHAT A DAY!" Other days, like yesterday, I refactor a bunch of brute-force code into an elegant Haiku, and am just pleased with myself - a day when things just come together. Other nights I'll (re)-clock in at 10 and go until 4, resuming at 8.

She can't understand how I can do this day after day, every day, staring at the screen. And, though it's not the career trajectory I set out on 7 years ago, I have found quite a bit of satisfaction in it lately (gaining mastery of any field tends to create a positive feedback loop, for me). Billable hours vs. real hours don't compute, either. Neither does the idea that what appears to be doing nothing can really be burning some serious cycles (foreground or background processing, mind you) solving a problem.

Any ideas how to better communicate this?

Maybe she questions what you do because she is not satisfied with the money. Sounds like you are trying to express a vocation instead of a pay check.

I was having a similar discussion with my wife last night - does she even know what I do in the basement every day (or night, I have flexible hours)?

I work remote, and for the last 2 years on a huge custom development project for a municipality.

Working from home, I get frequent visits (for better or worse). Sometimes she walks in and I'm super focused - like balancing 10 needles on 10 heads while trying to add another. Or I'm surfing the web / YouTube / staring at the screen. Or I'm not even there at all, and I'm walking about (not, mind you, to my own detriment).

How do I explain, in a believable way, that all 3 are actually working - and working damn hard?

I've used the writer analogy, but she doesn't buy into it.

My project is a novel (or a biography, haven't decided, since it's not fiction). Some days I make no progress - have a frustrating problem that I just can't get past - a blocking bug - which ends in frustration. Some days, I write the code equivalent of a short story, and my bleary-eyed self emerges and exclaims, "YOW! WHAT A DAY!" Other days, like yesterday, I refactor a bunch of brute-force code into an elegant Haiku, and am just pleased with myself - a day when things just come together. Other nights I'll (re)-clock in at 10 and go until 4, resuming at 8.

She can't understand how I can do this day after day, every day, staring at the screen. And, though it's not the career trajectory I set out on 7 years ago, I have found quite a bit of satisfaction in it lately (gaining mastery of any field tends to create a positive feedback loop, for me). Billable hours vs. real hours don't compute, either. Neither does the idea that what appears to be doing nothing can really be burning some serious cycles (foreground or background processing, mind you) solving a problem.

Any ideas how to better communicate this?

Maybe she questions what you do because she is not satisfied with the money. Sounds like you are trying to express a vocation instead of a pay check.

I don’t agree that it is a money issue. Application developers are generally well paid.

I have experienced the same misunderstandings as plympton. To my fiancée it looks like I am just messing around on the computer.

Most of her computer use involves social applications, Facebook, and media consumption. She generally views her own computer use as non-productive, more akin to wasting time than anything productive. Rationally, she knows that I get paid to do this work and that I am being productive, but in her gut she still associates computer use with wasting time. This is reinforced when she occasionally stops in and sees me surfing the web or watching something on YouTube. Like plympton, these activities are a part of my productive process. When I run into roadblocks while coding, doing something else for a while can help to refocus my thoughts.

I don’t think that any explanation is going to overcome this conflict. The conflict is between what she knows on a rational level and what she knows at a gut / emotional level. The only real solution would be to somehow change her perception of her own computer use.

If my clients had any interest in programming, they'd have learned how and not hired me. Instead, they can get what they pay for, which is a great end-user experience, while I get to have all the fun in the sandbox.

I was doing reporting and lightweight DBA duties at a job. I automated some tasks via MS Office's VBA, and suddenly the non-techie types thought I was a programmer. The director of the dept asked me to sit in on a meeting. Turns out he had talked everyone up about how he had a programmer in his dept (me), and everyone was looking at me as they discussed building out an intricate, interactive web interface to track project logistics across the nation and across various departments. I instantly knew I was in over my head, and I also knew the kind of stuff he wanted usually involved a SWAT team of developers, IT network admins, etc., etc. to coordinate.

After the meeting, I had to explain to him that his expectations were way off target. He was getting upset, b/c he thought I was just saying "no" to the task even though I could do it... I was a "programmer," after all... I should be able to do anything.

I basically explained it to him with the following analogy... Imagine you have someone who's really good at HR, works in an HR department with many folks, and can speak English very fluently. OK, now transplant them to Japan, tell them they have to get an entire accounting department up and running for a big company, and that they will be the only person in that department. He looked at me with "I don't get it" eyes. Dude, first you'd have to learn a totally different language just to be able to talk to other folks. Next, it's a totally different skill set you're asking that person to use, which they're not trained in. Third, you're asking one person to do the job that normally lots of folks are hired to do.

Apparently that made sense to him. So, they decided to consult a real programming contractor company about the assignment. He had a reality check when, after ironing everything out, he finally realized it was going to be a million-dollar project that would involve tons of departments, build time, testing, coordination, etc., etc.

Sadly, only he and the other high-level execs met with the programming company to tell them "what we want," instead of letting actual end-users sit in on meetings, or someone like me sit in to help with usability, narrow down what they needed, and tell them something wasn't possible before they even had the programmers chasing their tails on it. So, we end up spending tons of money, the specs keep getting rewritten over and over b/c they're not clearly explaining what they want and they keep changing their minds, and the end product is not something the end-users want to use... b/c it doesn't address the issue it was supposed to address, which was to keep projects on schedule, keep departments accountable for their workflow and time, and keep track of inventory as it was used/stored/shipped/etc.

There's no clear analogy to building material cost. A line of code costs nothing—only the time it takes to produce it costs money.

Not quite true. There is a marginal cost to every line of code. The computer costs money. The office and its furniture cost money. The power to run the computer costs money. Granted, these are pretty minor when amortized over a single line of code, but to say it "costs nothing" is simply untrue.

I usually go with the idea of flow charts. Most people in business have at least a basic understanding of a flow chart.

Start with a relatively simple process like a barista making your choice of coffee beverage (another thing business people tend to relate to).

It starts simple: select menu option (input), brew coffee, add ingredients, mix appropriately, charge the customer, hand over the beverage. Simple, right? OK, now that's what you as the project manager have communicated your customer wants. Now, let's look at step one: get the menu selection from the customer. Well, this includes displaying said menu; where do we get that? A database, right. Someone has to be able to edit that too, so we need some place to store data, some way of displaying it to customers, some interface for editing/changing the menu, and a way in the end to tie that back to the cash register later...

OK, let's start with the menu itself: what's on it, how is it broken down? Are there optional ingredients/charges we need to consider? How should it be organized? We're not even at "take the order" yet; we're still talking about menu design in such a way that the pieces can be broken out. It's not as simple as size and type; there's complexity here with extra shots, toppings, additives, flavors... Any and all of that can change separately from the grande base cup, and what's free today might cost tomorrow and vice versa. All this has to be flexible and clear, and easy enough to visually put on the cash register screen (in a UI we have to separately design) so they can push buttons that in the end make a drink that someone else can read from a ticket and replicate in a cup without error.
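Just to give a taste of what that decomposition might start to look like, here's a minimal sketch of only the menu piece (the names and prices are invented for illustration, not part of the original post):

    from dataclasses import dataclass, field

    # Hypothetical data model for just the "menu" piece of the coffee-shop example.
    # Base drinks and add-ons are priced separately, because an add-on's price
    # (or whether it's free at all) can change independently of the base drink.

    @dataclass
    class AddOn:
        name: str
        price: float              # what's free today might cost tomorrow

    @dataclass
    class Drink:
        name: str
        size: str
        base_price: float
        add_ons: list[AddOn] = field(default_factory=list)

        def total(self) -> float:
            return self.base_price + sum(a.price for a in self.add_ons)

    # Made-up example order: a grande latte with an extra shot and a flavor syrup.
    order = Drink("latte", "grande", 3.50, [AddOn("extra shot", 0.75), AddOn("vanilla syrup", 0.50)])
    print(f"{order.size} {order.name}: ${order.total():.2f}")   # -> grande latte: $4.75

And that still says nothing about storing the menu, editing it, or wiring it into the register UI - each of those is its own pile of decisions.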

Make coffee: OK, a human is trained. How do you train a PC? Oh, you give it a recipe? Well, no, you also need to teach it how to set up the coffee machine. Where is the coffee, how are the ingredients marked, how does one measure things, how does one clean parts, where are the cups, WHAT is a CUP? A computer doesn't know ANYTHING; we have to teach it every tiny little step. What is a cup? Styrofoam, in a certain shape. What is Styrofoam? What are the dimensions of the shape? A computer needs to know all this data in order to operate an android body and make a cup. No, we don't really need to program a PC to that level for a POS cash register, but let's assume we were designing a 3D environment of a coffee shop inside a video game; then yeah, we have to teach each and every motion to a computer. How to move an arm, what IS an arm, what are its limits, do buttons light up when you press them, what happens when a virtual button is pressed, what sounds are created, where do the logos and objects get stored - all of that is complex, and each step has multiple parts and each part has multiple steps.

Programming a coffee shop in 3D could have tens of thousands of objects, motions, and instructions to design; you gave me 5 flow chart objects, and I extrapolated 10,000 from them. That's making a cup of coffee. Now, design a customer sales system that handles retail inventory management, logistics, labor costs, etc. That's what a programmer does. We take a simple thing and break it down over and over again into more and simpler parts, until the simplest machine in the world can be told, using only 4 commands, how to replicate the entire process. Those 4 commands: add, move, multiply, negate.