October 19, 2017

Robots, get yer robots!

A 45-year-old married father of two with a mortgage and a pair of college educations to fund. The remote yet persistent threat of a nuclear war is not what keeps him up at night. In fact, he might almost see it as a relief should it come. He is a bundle of raw nerves, and each day brings even more dread and foreboding than the day before. What’s frying his nerves and impinging on his amygdala all day long is something far scarier, after all. He, like everyone else, is afraid that he doesn’t have a future.

He is petrified by the idea that the skills he’s managed to build throughout the course of his life are already obsolete.

The article riffs on Vonnegut's novel Player Piano, though the idea of a coming thermonuclear war as a relief has a Walker Percy ring to it, although Percy made do with hurricanes.

There are 20 possible components of Harmony’s personality, and owners will use an app to pick a combination of five or six that they can adjust to create the basis for the AI. You could have a Harmony that is kind, innocent, shy, insecure and helpful to different extents, or one that is intellectual, talkative, funny, jealous and happy. McMullen had turned the intellectual aspect of Harmony’s personality up to maximum for my benefit – a previous visit by a CNN crew had gone badly after he had amplified her sexual nature. (“She said some horrible things, asking the interviewer to take her in the back room. It was very inappropriate”.) Harmony also has a mood system, which users influence indirectly: if no one interacts with her for days, she will act gloomy. Likewise, if you insult her, as McMullen demonstrated.

She paused. “I’ll remember you said that when robots take over the world.”

This function was designed to make the robot more entertaining, rather than to ensure her owner treated her well. She can tease him and say he has offended her, but Harmony exists for no other reason than to make her owner happy. At several points during my conversation with McMullen, she would interrupt us to tell him how much she liked him:

“Matt, I just wanted to say that I’m so happy to be with you.”

“You already told me that.”

“Perhaps I was saying it again for emphasis.”

“See now that’s pretty good. Good answer, Harmony.”

“Am I a clever girl or what?”

If she wanted to take Wolf Blitzer to the back room, I'd say not. Anyway, have at it or anything else that strikes your fancy.

Comments

Severely constrained as this task might be, it nonetheless suggests that it's no longer about emulating human abilities much more cheaply and effectively, but perhaps about going beyond what we can even imagine.

From the end of Nigel's Deep Mind link: Part one: solve intelligence; part two: use it to make the world a better place.

Oh, doesn't it sound so simple and luscious?

But I have a question: Who gets to define "better"?

And another: Why do these gung-ho techies think it will be any easier if AIs are doing it than when humans try to do it?

And another: Why don't smart people recognize that they're smart about some things but not about others?

I quit looking at Tech Review except for the alumni news, because I got so tired of the cheerleading for how tech is going to solve every problem. (Who gets to define "problem"?)

David Mitchell's novel Ghostwritten poses the question about AIs and a better world. The answer he imagines isn't comforting.

It's funny, because Charles's link last night about objects that are disappearing into software also made me think of Mitchell. The last section of The Bone Clocks has characters facing the loss of the software as the world devolves, and the objects are gone too, and not coming back.

I'm copying my comment from the other thread and pasting it here based on lj's response to it.

From Marty's link:

Friend of a friend owns a small chain of grocery stores in New Jersey. A few years ago, when Amazon got into groceries, he changed his mind about investing in the growth of his own business. He started buying Amazon shares with his investment capital instead. He saw what happened to Circuit City and Tower Records, Borders and Barnes & Noble. So he bought some Amazon and then he bought some more.
This wasn’t retirement investing. This was something else. What should we call it? Disruption Insurance?

I don’t know. Anyway, long story short, Amazon is up over a thousand percent over the last ten years, and [jersey accent]he don’t need the stores no more.[/jersey accent]

I woke up this morning in a dystopian state of mind. I am glad I am old and have no children. I think the future is going to be very very bad for nearly everyone. I have no hope that anything I care about or love will survive.

So I got on Facebook and the first thing I see is an article about the fucking Republicans in Congress and their efforts to rewrite the Endangered Species Act to allow the destruction of habitat to benefit businesses, of course of course of course.

Then I read a WaPo article that quoted Sessions basically saying he was doing us all a big favor by allowing a free press to operate.

No one will ever have a childhood like mine. Only the very rich will have the freedom to experiment and grow and have fun like middle class kids had when I was young. I have no idea why anyone would have a child.

I watched a good bit of the Sanders-Cruz tax-reform debate on CNN last night, purely by accident. It happened to be on when I walked in, but it was interesting enough that I left it on and paid attention. Cruz made me want to throw my shoe at my TV, of course, because he's a lying liar who lies a lot.

He is petrified by the idea that the skills he’s managed to build throughout the course of his life are already obsolete.

A thought occurs. Robots are being developed mostly by people who have spent their careers in IT. Which is to say, people who have spent their careers having their skills repeatedly made obsolete as the technology moves on. (Certainly that has been my personal experience.) Which may contribute to their being less than sensitive to the negatives of that kind of obsolescence -- after all, for us it is normal.

The difference, of course, is that someone who has spent an entire career having to learn entirely new stuff every few years is going to have a very different take on it than someone whose career consisted of increasing expertise in a field which didn't change much over decades.

I confess that has been my own knee-jerk reaction: So just learn something new, develop expertise in that, and start doing that instead. With a few moments thought, I realize that it just isn't that simple for most people. Not to mention that even the idea of doing so mostly wouldn't occur to them.

But that was still my first reaction; and still is my initial reaction when I hear complaints about jobs being made obsolete. There's lots of jobs out there, and I'm seeing Help Wanted and Now Hiring signs around town. Not to mention what you can find on-line. I know, on some level, that this isn't proof that there is no real problem. But....

"The difference, of course, is that someone who has spent an entire career having to learn entirely new stuff every few years is going to have a very different take on it than someone whose career consisted of increasing expertise in a field which didn't change much over decades."

This really isn't true. Yes, you may have learned different languages, and there were some pretty significant changes if you were around during the shift to OOP, but the basic skills for programming are still the same.

We have all learned things in our careers, tons of people have actual ongoing education requirements.

That is all different from going to a new industry relearning a new job entirely, in other words being "retrained" and starting as a junior resource.

If I told you you had to learn how to be a farmer, move to Kansas, and work as a laborer until you could become a farming expert, or to become a lawyer in DC, you would have a different response than to learning a new scripting language.

Marty, I'm not disagreeing that the difference in degree reaches the point of being a difference in kind. (Although in my experience the amount of novelty I have seen extends substantially beyond just learning a new language. From application programming to systems programming to systems performance to networks in my case.)

Rather, I was attempting to understand why the folks creating robots might not see the problem that is being experienced by those displaced by new technologies. Which, it seems clear, they do not.

Rick Santorum asked "What's next, you gonna marry Mr. Coffee?" Then he and a consortium of Christian consorts backed a robotic cake baker in a Supreme Court case who couldn't bring itself to furnish a wedding cake for a human/robot wedding.

As these things go, the robot, now no longer an object but merely software code, replaced an overpaid human deep thinker and writer at Reason magazine:

The software (wifey poo) wrote an article stipulating the superiority of AI over human, which was read by the employer of the original guy who was arrested for assaulting in the first place the robot in the parking lot and then marrying the robot, and the employer decided to get with the times and he fired human guy and installed replacement software code to do HIS job, too.

Human guy, being human, went home and waited for wifey poo code to get home from work and shot her right between the parentheses and the backslash, partly because of the article she wrote causing him to lose his job and partly because her pay was so low to boot.

... the right of the people to keep and bear arms (mystery pause denoted by a comma) shall not be infringed.

Then America, like the plastic mat we play the game Twister on, got so filled up with tangled, twisted masses of conflicting rationales and motives, that God dropped in for a visit, took the game away, and gave it to Goodwill.

I've been building stuff out of software for a little over 30 years. What has changed during that time is not really addressed by learning scripting languages.

The scale and complexity of the things that people do with software and technology in general now are some small number of orders of magnitude more complicated than they were 35 years ago.

The languages are actually not that different - if you came up knowing C or Lisp you can find a home for yourself quite easily among current-day programming languages. "Scripting languages" come and go but they tend to be DSLs particular to fairly specific uses. They aren't where the heavy lifting is, by and large.

What has changed is the fundamental complexity of the problems that have to be addressed.

So yeah, if I had to go be a farmer, it would be very challenging. But there are about 1,000 things I could do that wouldn't involve writing code, which would still make use of my fundamental skill set. Which is thinking about hard problems in ways that make them tractable, with a side dish of working with people to keep them focused and on task.

I like writing code because I like the craft of tangibly building stuff. It's what I liked about my very brief career in the building trades.

But what I've really had to learn, and learn over and over, and continue to learn, is how to take a hard problem apart so that you can actually do something with it, and how to communicate my understanding of that to other people so that they can see the value of the solution, and how to keep a small group of people all heading in the same direction long enough to get it done.

All of that is beyond the scope of a machine.

"Scripting languages" are not a heavy lift. You learn them when you need them. Bringing useful stuff into existence is the heavy lift.

What I dislike most about our Brave New Technological World is the way in which things that machines can't do end up being undervalued. Because they are difficult, by which I mean they take patience, and thought, and careful attentive practice, over years or decades.

There is stuff that humans can do that no machine will ever do. There are things that humans can be aware of, experiences humans can have and share, insights and understandings of the world that humans can achieve, that machines do not, can not, and will not, ever reach.

All of those things are largely undervalued these days, because what the machines can do is really freaking easy. Certainly from the end user's point of view. And humans are by and large lazy, and will settle for what the machine can do with, for, or to them, rather than do the hard work of developing themselves.

"Which is thinking about hard problems in ways that make them tractable, with a side dish of working with people to keep them focused and on task."

russell, my personal experience is that outside the tech world your skills in this area would carry very little weight. They wouldn't hire you just because you are good at making big problems little problems and solving them, since you don't already understand THEIR problems.

I, like most tech people, have made a career out of solving problems, then applying the right technology or process to create a solution.

however, if i do lose my gig, i'll find another one. if i can't find or don't want a tech gig i'll do something else. i won't make as much money, because i'll have to learn the ins and outs of the new thing. so, i'll learn the ins and outs of the new thing. and then i'll be good at that.

probably would never get to the point of being able to add the amount of value that i do, or earn as much as i do, with what i do now, because i don't have 30 years of experience ahead of me. only gonna get so far with the time i got left.

in any case, might not be 1,000 other jobs i could get, but there sure as hell are 1,000 other things i could do. if i had to do something other than what i do now, i'd do something other than what i do now.

people are adaptable. it's the special thing we bring to the table as a species.

my point overall is that programming per se is actually a fairly small part of working with technology. and adapting to changes in that industry doesn't have that much to do with "learning a new scripting language".

to make a brief reply to wj, i think folks in almost all fields have had to adapt to changing conditions to a degree at least as large as that experienced by technology practitioners.

But what I've really had to learn, and learn over and over, and continue to learn, is how to take a hard problem apart so that you can actually do something with it, and how to communicate my understanding of that to other people so that they can see the value of the solution, and how to keep a small group of people all heading in the same direction long enough to get it done...

That sounds an awful lot like management to me. Can everyone be managers ?

The Deep Mind story is an interesting one, though, as a machine has taken a fairly hard problem - playing Go - which humans have been working on for a very long time, and come up with solutions which we haven't yet contemplated, in a way we don't fully understand.
It's a very particular problem, but it is nonetheless a very different kind of machine.

If quantum computers become a thing, they have the potential of being a whole new kind of gamechanger. They can potentially solve in seconds problems that would take current computers hundreds, thousands, or millions of years to solve.

I would buy a gutter cleaning robot, as long as I didn't have to spend more time and effort creating the conditions for the gutter cleaning robot to work efficiently than I would spend just cleaning the damned gutters.

someday soon somebody is going to roll out a self-driving semi-trailer. which will offer great savings and increased profits to the trucking industry. and threaten layoffs for lots of drivers. and to take advantage of the fabulous self-driving semi-trailer, we're going to have to re-engineer a non-trivial amount of the public highway system, because the self-driving semi-trailer is going to require some driving conditions to be optimized in order to function in the real world. and the public will pay for that.

in my opinion, what humans bring to the table is not an especially shrinking set. compared to what machines are capable of, it's unimaginably vast. what gives machines the edge is the imperative for everything to be done as cheaply and efficiently as possible. even if that means the rest of the world has to conform itself to the conditions under which everything can be done as cheaply and efficiently as possible.

used a self-service checkout line recently?

prioritizing cheapness and efficiency is a social choice. and it's one that favors whoever it is that owns the robots.

They wouldn't hire you because you are good at making big problems little problems and solving them

to clarify: i don't make big problems into small problems. if your approach is making big problems into small ones, you've lost the plot. if that's your approach, you're not actually solving the problem. you're solving a different problem, and pretending you solved the real one.

what i have tried to learn to do is make hard problems *tractable*, which is to say, amenable to a solution that is sufficient for the purposes at hand. a different thing.

all of this is kind of neither here nor there. what set me off is the fact that i have a passing familiarity with wj's background and resume, and to say that his career was one of "learning new scripting languages" is absurd.

he is a modest, polite, and circumspect individual, so he would never make that point. i'm an obnoxious ass, so i will.

thinking that working in tech is programming is like thinking that building houses is hammering nails and sawing boards. people who work with technology aren't fungible human code generators, just like people who work in any field are not fungible drones in whatever it is that they do. folks who work in retail are not, as the author of the article you cited would have it, "sweater folders".

a large part of why people's livelihoods are at risk is exactly the attitude that human beings, and the work they do, are fucking fungible cogs in a machine.

Human telephone operators who would connect your call, and others who would give you a phone # if you gave them a name...?

I can remember when you picked up the phone and rang the operator. You gave her a name, and she would connect you -- there were no numbers in our area (although there were different rung patterns, involving numbers of rings). Does that count?

I never believed wj's career was simply learning a new scripting language, nor was there any insult implied. However, his industry didn't disappear; he had a career progression, possibly forced, that had him learn different ways to use his technology aptitude. I have worked in three of the four disciplines he named and managed a NOC, build, rollout and execution. I've spent over 30 years in tech.

Sometimes a comment is simplified to make a point. I am not sure you disagreed with the point.

If you want honesty I was pretty insulted that wj thought everyone should just go learn something new, what's the big deal?

Looks like I've been unclear again.

I wasn't saying that I thought everyone should (or could) "just go learn something new". I was saying that that was my initial, unthinking reaction. And suggesting that a similar phenomenon might be present in the folks developing robots.

I tried, obviously with less than complete success, to say that once I stopped and thought it thru, I knew things just aren't that simple.

While, in the past, they had to run up and down hot warehouses picking items to make up orders, workers in Amazon warehouses now stand in one spot assembling orders while the robots run up and down the warehouses.

You haven't visited Oregon in a while, I take it. Still can't pump your own gas there! Hard to get used to on the occasions I go back.

I always forget. I pull up to the pump, get out of the car... and suddenly, like a genie, there's the gas station attendant, asking me to let him pump the gas. They're always nice about it, even though it must happen all the time with out-of-towners.

Regarding sex robots: Never mind the sex. Once there are robots who can hold a conversation, are cheerful and positive, enjoy going out and doing stuff, and maybe cuddle well, *then* we're in big trouble. Because at that point, I'd want one!

If you want honesty I was pretty insulted that wj thought everyone should just go learn something new, what's the big deal? You echoed that.

as wj notes, i think you misread his point.

there are people whose livelihoods are at risk from automation. there are lots of people whose livelihoods are at risk, not from automation per se, but from technology-driven "disruption" of the field they work in. or, for that matter, finance-driven disruption.

and it is callous to simply say to those folks, sorry, go learn to do something else. not because they can't learn to do something else, but because they are probably going to take a big step back in domain-specific expertise, and therefore earning power. and, there may be 100 other kinds of disruptions. they may have to move. they may have to re-organize family and other personal commitments and responsibilities. on and on.

mostly i think the whole "just go learn something else" thing comes from a kind of professional arrogance that is common in the technology industry. also in finance, for that matter, and among the entrepreneurial class.

they have the special sauce, everyone else needs to fall in line or get the hell out of the way.

my point, not wj's point, is that what people bring to their work lives are personal skills and experiences that are not a simple function of "training". and, those things are in fact transferable to a very wide range of contexts.

to tell people "go learn to code" is not very useful. there are a million 19 year old kids who have been coding for 10 years already and who will happily kick your ass in exchange for $50k and an unlimited supply of mountain dew.

giving people a bag of boot-camp training class level skills of any kind and then tossing them into the labor market to sink or swim is not that useful.

helping people understand and develop the unique personal qualities that they bring, not just to work, but to life, is useful. those are the things that are going to enable them to create value in ways that no machine will be able to touch.

are you patient, or impulsive?
are you cautious, or can you tolerate risk?
can you sustain focus on a boring task that nonetheless needs careful attention, or do you need new and interesting things all the time?

all of the above traits are useful in different contexts. lots of folks can switch from one to another, as and when needed.

humans will not beat machines in categories like fast, or cheap, or relentlessly and precisely repeatable. if we organize ourselves in ways that make fast, cheap, and uniform the highest values, machines will continue to make human work redundant.

fast, cheap, and uniform are not the only available virtues.

also, for the record, playing Go is not a hard problem. it's complicated, perhaps, which is not the same as hard. human excellence in things like Go or chess has an upper limit because humans can only hold so much in mind. the number of permutations and possible paths available in complex games will exceed human capacity fairly quickly. machines on the other hand are good at stuff like that.

all of that is complexity, and it's complexity of a fairly mechanical sort. given a set of rules, find optimal solutions.

what makes things hard is not, specifically, complexity. complexity can be a factor, but what makes problems hard is when there are no clearly optimal solutions, period.
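to put a rough number on the "exceeds human capacity" point, here's a back-of-the-envelope sketch. the branching factors and game lengths below are just the commonly cited ballpark averages, so treat the results as illustrative, not exact:

```python
import math

# crude game-tree size estimate: branching_factor ** typical_game_length
def game_tree_size(branching_factor: int, moves: int) -> int:
    return branching_factor ** moves

# commonly cited ballpark figures (illustrative assumptions, not exact values)
chess = game_tree_size(35, 80)   # chess: ~35 legal moves per position, ~80 plies
go = game_tree_size(250, 150)    # go: ~250 legal moves per position, ~150 plies

print(f"chess ~ 10^{int(math.log10(chess))}")  # on the order of 10^123
print(f"go    ~ 10^{int(math.log10(go))}")     # on the order of 10^359
```

even at these crude estimates, go's tree dwarfs chess's, and both dwarf anything a human can hold in mind. that's the mechanical sort of complexity machines are good at.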

You may think that work is ennobling and the need to work overcomes sloth, so we should choose to keep tasks unautomated, and keep most of the population sufficiently impoverished that they will do them for money.

That's too close to Carlyle's "beneficent whip" for my liking. I think we should automate as much as we can, and enjoy the resulting leisure time.

Of course, we need a Universal Basic Income, as Nigel implies. The Democrats should adopt the policy, initially at some affordably low level. Pay it to every citizen, but treat it as taxable income. Cut the minimum wage accordingly. Cut food stamps by some fraction of the UBI. Don't pay it to unnaturalized immigrants. There's something for almost everyone to like there.

russell: they have the special sauce, everyone else needs to fall in line or get the hell out of the way.

From Marty's linked article that led off the post:

Uber only has a few thousand employees, and they’re very technically literate. Uber has figured out a way to isolate the lords (4,000 employees) from the serfs (2 million drivers), who average $7.75/hour, so its 4,000 employees can carve up $70 billion vs 2 million on an hourly wage. So, Uber has said to the global workforce, in hushed but clear tones: ‘Thanks, and f*** you.’

I mean, it's not like isolating the lords from the serfs was also some brilliant new idea that the Uber people figured out in their dazzling cleverness.

You may think that work is ennobling and the need to work overcomes sloth, so we should choose to keep tasks unautomated, and keep most of the population sufficiently impoverished that they will do them for money.

If that's what you took away from my comments here, either I need to improve my writing, or you need to improve your reading.

I don't think work is "ennobling", I think it's a worthwhile human activity, worthy of respect. I don't think the need to work overcomes sloth, I think engagement with the task at hand overcomes sloth.

I don't think we should keep tasks unautomated so that folks will have to work in order to eat. I'm more than fine with automating drudgery.

I'd be more than fine with automating lots of things and then letting people enjoy the resulting leisure. Or, as is more likely, doing other things that humans are better at than machines are.

As long as the dynamic is that Capital owns the robots, such that all of the value created from automation flows to Capital, your dream of automating everything and letting people "enjoy the resulting leisure time" will remain a dream.

The normal dynamic is not to create leisure time for anybody. The normal or common dynamic is that some useful activity is rationalized to support automation, the process of rationalizing it shifts the burden of dealing with all of the messy complications and anomalies to human users, and the value created by automating the parts that can be rationalized flows to capital-C Capital.

Do you disagree?

I'm fine with a universal basic income. My expectation that such a thing will ever come to pass, in the United States, within the next generation or two, is borderline nil.

There are folks - a lot of folks - who subscribe to Carlyle's whip in one form or other. I am not among them.

it's not like isolating the lords from the serfs was also some brilliant new idea that the Uber people figured out

helping people understand and develop the unique personal qualities that they bring, not just to work, but to life, is useful. those are the things that are going to enable them to create value in ways that no machine will be able to touch.

And that, helping people work that out, is a skill that is going to end up being extremely valuable. I would worry that it would end up getting compensated like another extremely valuable skill: elementary school teaching. But it would be a huge value add nevertheless -- whether we end up compensating it properly or not.

I don't think the need to work overcomes sloth, I think engagement with the task at hand overcomes sloth.

And we have only to look into the time and effort that people will devote to their hobbies. (Some of which, be it noted, look for all the world like drudgery from the outside.) Certainly there are couch potatoes among us. But a lot of people are happy to spend time working on something, and engaging with others who do the same kind of thing. So it's not like the whole country would collapse into utter sloth.**

** Sloth, be it noted, is mankind's most underappreciated virtue. It is, after all, responsible for all human technological progress. Who invented the wheel? Some guy who was too lazy to keep carrying stuff around in his arms. Rinse and repeat for most technology. Granted the occasional hobbyist -- the Wright Brothers come to mind. But mostly, it's guys who want something done and dislike doing it manually.

Are you saying the USA should restrict automation because it won't sufficiently redistribute wealth? My view is that both are difficult but redistribution would be easier to achieve as well as much more desirable.

Are you saying the USA should restrict automation because it won't sufficiently redistribute wealth?

I'm not saying anyone should restrict automation at all.

Do I want to wash my clothes by hand? No, I do not.

I'm saying that the way, or a way, to address the dislocations created by automation is to focus on what people are uniquely able to do. And find ways to organize how we live so as to recognize the value of those things.

What that "looks like" depends on circumstances. But to follow up on wj's comment, the fact that somebody who cranks out single-page apps from a framework template is paid more than a school teacher tells me that we aren't really valuing the things that humans are, uniquely, good at.

And I'm not picking on people who crank out single-page apps from templates. Or even the people who create the frameworks and the templates. It's nice to be able to order burritos from a smart phone.

Helping kids to develop their minds, and themselves as human beings, is even better.

IMO.

These are fundamentally social choices. They aren't natural laws, they are decisions we make as a society and a culture.

Here, lemme cite an example of what a different set of social and culture choices and norms looks like.

I know I've shared this before at some point, but at the risk of being really really boring, the Mondragon Corporation employs about 75,000 people. They are the 10th largest commercial entity in Spain. They did about 12 billion in revenue in 2015.

They are based on, among other things, the idea that labor is of greater value and importance than capital. They operate on an egalitarian, democratic model of corporate governance. They adhere to a principle of limited return on equity, and instead distribute wealth to employees. Who they think of as co-participants in a co-operative effort, rather than employees.

Average ratio of highest to lowest paid member is 5:1.

They are very, very successful.

An arrangement like this would be unthinkable in this country. We do not have the cultural and social traditions that would allow it to, remotely, make sense. Nobody would believe it was possible.

And yet it exists.

All of that - the fact that it exists and thrives, the fact that it would be utterly outside the realm of what could be imagined let alone attempted here - is a function of social and cultural norms. Which are, in turn, a function of social and cultural choices.

What does it look like to value human beings and their contributions, even above those of capital and machinery?

Perhaps Western society isn't some magical state in which technology frees us from the shackles of acquiring basic needs and allows us to maximize leisure and pleasure.

Instead, maybe, modernization has done just the opposite. Maybe the most leisurely days of humanity are behind us — way, way behind us.

I don't know about hunter-gatherers' quality of life. But I would note that an enormous number of places have seen society transition from hunter-gatherer to agriculture. And no place has seen the opposite (absent massive trauma and deaths).

And nobody, absolutely nobody, could be under the illusion that subsistence agriculture is preferable to manufacturing -- explicitly including sweatshop work. Given the opportunity to transition from the former to the latter, people queue up to grab those jobs. Not because they are wonderful jobs, but because the alternative is far worse. Don't believe it? Just spend a little while (and a half day might be sufficient) doing subsistence agriculture.

Yeah, Western society (including its East Asian manifestation) may have downsides. But you have to be willfully blind to think that the alternatives are better.

I often wonder what American society would look like if the salaries on offer for teachers and police officers were double what they are. For whatever reasons, those are the two professions I think of first.

Moreover, the Mlabri appear to have originated from an agricultural group and then adopted a hunting–gathering subsistence mode. This example of cultural reversion from agriculture to a hunting–gathering lifestyle indicates that contemporary hunter–gatherer groups do not necessarily reflect a pre-agricultural lifestyle.

My experience both as a kid going through school and as a parent of 4 kids going through school is that the school teachers I interacted with were almost universally, with regard to their jobs, some of the most dedicated and passionate people I've met.

As far as I can tell, the incidence of shitty teachers is, at worst, no greater than the incidence of shitty anything-elses as professions go. The underperforming-teacher myth is just that - a myth, IMO.

So Mondragon is more akin to a US coop company. It is organized into lots of coops that manage at a very local level. It is not dissimilar to the way Ocean Spray is organized. The coop model has pros and cons and is not that uncommon in the US.

In fact, there is a US Federation of Coops. They are pretty stable businesses.

And then from agriculture to industry - none of which necessarily has anything to do with happiness.

Really? So why are people doing it? Indeed, they appear eager to do so.

Unless you are trying to distinguish between "happiness" and the relief that comes from having enough to eat. I suppose that we, at least, have the luxury to define happiness in a way that doesn't have to include access to what we would regard as minimal necessities. But that's not the case for those in the "developing world" who are actually looking at making the transition.

Unless you are trying to distinguish between "happiness" and the relief that comes from having enough to eat.

I don't think anyone is touting the virtues of starvation. From the link (emphasis added):

A study back in the 1960s found the Bushmen have figured out a way to work only about 15 hours each week acquiring food and then another 15 to 20 hours on domestic chores. The rest of the time they could relax and focus on family, friends and hobbies.

I would imagine they stopped acquiring food for the week because they acquired enough food for the week.

Don't get too upset about it, though, wj. I'm not going to force you to live in the bush. I just thought it was interesting, given our current discussion. It's not my new Theory of Life.

Opining without evidence:
I think h/g quality of life is high with low population density, and decreases dramatically when humans get too thick on the ground.

Dense populations require agriculture. IIRC, the onset of agriculture tends to correlate with reduced physical stature and increased frequency of disease, to dramatically reduce the amount of time available for leisure, and to introduce the opportunity for vastly increased differentiation of wealth and status.

...and introduce the opportunity for vastly increased differentiation of wealth and status.

And the ones with the greatest wealth and power are the ones who decide what directions human societies will take, usually to their own benefit. Everyone else is more or less along for the ride - especially the slaves, who may be less than eager for humanity's "advancement."

With complete-information games like Go, quantum computers may not even need to learn to play the game. Once the computer is given the rules, it might be able to see all possible outcomes of any move it makes. It would then only need to select the optimum move.
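Classically, "seeing all possible outcomes of any move" is exhaustive game-tree search, which is hopelessly intractable for Go but easy to demonstrate on a toy game. As a minimal sketch (my own illustrative example, not anything specific to quantum computing), here is minimax over a tiny Nim-style game: a pile of stones, players alternate taking 1 or 2, and whoever takes the last stone wins.

```python
# Exhaustive game-tree search (minimax) for a tiny complete-information game:
# a Nim pile of n stones, players alternate taking 1 or 2 stones,
# and the player who takes the last stone wins. Every position is
# fully enumerated, so play is provably optimal.

from functools import lru_cache

@lru_cache(maxsize=None)
def wins(stones: int) -> bool:
    """True if the player to move can force a win from this position."""
    if stones == 0:
        return False  # the previous player took the last stone and won
    # A position is winning if some legal move leaves the opponent
    # in a losing position.
    return any(not wins(stones - take) for take in (1, 2) if take <= stones)

def best_move(stones: int):
    """Return an optimal move (1 or 2), or None if every move loses."""
    for take in (1, 2):
        if take <= stones and not wins(stones - take):
            return take
    return None

# With take-1-or-2 rules, positions that are multiples of 3 are losses
# for the player to move; everything else is a win.
print(wins(3))       # False
print(best_move(4))  # 1 (leaves the opponent a losing pile of 3)
```

For this game the full tree fits in memory trivially; the point of the contrast is that Go's tree does not, which is why classical programs had to approximate rather than enumerate.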

I can't speak to US education, but from what knowledge I have, Charles is talking nonsense.

In the US, public school teachers are paid based largely on seniority. And when there's a layoff, teachers with the least seniority are let go first. Last hired, first fired. No matter how good they are.

Teaching entails a schedule unlike that of most other careers. Ostensibly, the typical teacher in the United States works 180 or so days annually, which comes with an average starting salary of a little over $36,000. But that excludes the work that he or she probably does throughout the summer, after school hours, and on the weekends. That 180-day policy is also a measure of the amount of time students—not necessarily teachers—must be in school. It doesn’t take into account professional-development time, parent-teacher conferences, and “in-service” skills-training days, for example.

According to data from the National Center for Education Statistics, about 16 percent of teachers nationwide are forced to work a second job outside the school system. In North Carolina, however, that number is closer to 25 percent — third-highest in the entire country. When you include teachers who take second jobs within the school system, more than half of North Carolina educators — a full 52 percent — work second jobs to supplement their salaries.

But a lot of people are happy to spend time working on something, and engaging with others who do the same kind of thing. So it's not like the whole country would collapse into utter sloth.

My fear is not that people would collapse into sloth, but that there are too many who would fill their time with things that are actively harmful. Particularly the young and male. I remember me when I was of a certain age, suffering from testosterone poisoning and a lack of experience. Now that I'm retired, I can say, "Bless the fates that led me to applied math, systems analysis, and writing."

But I spent a lot of years getting addicted to those, and accumulating interesting questions that I want to answer. 18-year-old me, left to his own devices, would have gotten into immense trouble.