Anyway, you can consider somebody "normal" only as long as you don't really know them. Once you do, you realise they are as strange and as far from the "norm" as anyone else. Normality is only an appearance.

I don't think Bubbles trying to be more "funny" was wrong, for any given value of wrong. But she doesn't seem aware of the inherent tier system of jokes and burns. Jokes that insult someone are best reserved for your closest friends, who know you're joking or will at least fire back something just as insulting.

As it is, Roko and Bubbles aren't really friends, more acquaintances, so Bubbles, in an attempt to be funnier, just comes off as rude from Roko's perspective.



Whenever someone says "I'm not book smart but I am street smart.", all I hear is "I'm not real smart, but I am imaginary smart."

We had Bubbles blushing about falling for Faye, then Beepatrice's embarrassment about human sex toys (which she herself comments on), now Roko getting upset about Bubbles' poorly-considered joke.

Aren't AIs capable of seeing past this sort of reaction? Or are they deliberately crippled to make them more acceptable to humans?

Sorry if it's an old question. It's just something I noticed a lot recently and joined up here to ask about.

Welcome, new person!

It's a mystery why the AIs have the same emotions, in the same proportions and intensities, as humans. There's a thread from, I think, a year or two ago where another curious new person started a great discussion about it.

We had Bubbles blushing about falling for Faye, then Beepatrice's embarrassment about human sex toys (which she herself comments on), now Roko getting upset about Bubbles' poorly-considered joke.

Aren't AIs capable of seeing past this sort of reaction? Or are they deliberately crippled to make them more acceptable to humans?

Sorry if it's an old question. It's just something I noticed a lot recently and joined up here to ask about.

If you give AI emotions, those emotions will sometimes be overwhelming. That's the thing with emotions. You seem to think AI shouldn't have emotions, but then they simply couldn't interact with humans the way they do in the comic; they would just be sophisticated computer systems like the ones we know in the real world. Is giving AI emotions what you call "crippling"?

I like that the AI in QC have emotions and can express them freely. But there comes a point where they can become too emotional and human-like, to the point where Jeph could have swapped out Bubbles and Roko for Faye and any other character, even a "meatbag" ex-cop. IMO AI should be more in control of their emotions than humans are, maybe able to even turn them off completely if necessary; this could be one of the differences (maybe advantages?) between them and us.

I'd argue against turning off their emotions. It's unhealthy for people not to process complex emotions, and it could be an even worse problem for an AI to do the same. Likewise, while an AI might be physically superior to a human, a cold, emotionless rationality would likely stymie AI growth eventually.


Whenever someone says "I'm not book smart but I am street smart.", all I hear is "I'm not real smart, but I am imaginary smart."

Does anyone else think it's inappropriate of Roko to ask Elliot to mentor her in bread? I feel like she'd be making him a participant in her fetish without his consent. And if she actually takes a job at the bakery, then extend that to everyone else who works there and all the customers who come in. Like, there's a difference in my mind between just coming in to purchase bread vs "Tell me everything about this bread while I secretly get off on it."

I am firmly on the other side of the fence on this issue, with one caveat: "Don't do that at work." Then again, when it comes to "interpersonal happy fun time", unless it is part of the job description, "Don't do that at work" pretty much applies to everyone.

Can she be excited in more ways than one about bread? - Yes.

Can she learn under a master to make bread and share the objects of her love with others? - Yes.

Can she share her passion for bread with others? - As long as she does not cross the line in sharing, then yes.

Heck - you could substitute any activity or object for bread and the results would be exactly the same. Sportsball, DramaTV, Games, Movies, Comics, The Secret Life of Slugs...


Remember kids - Don't do slugs.

So, a word-to-the-wise moment here - don't overshare and freak out those not as excited about the subject as you are.

Gah - okay, on the subject of AI and emotions, a few points.

You can't turn your own emotions on or off, so why expect any other intelligence to be able to? Emotions are not just chemically based but mind-based as well, and I don't see that great a difference in the psychology.

I do agree that our esteemed creator may have drifted and lost track of some of his characters' unique characteristics of late.


I personally hope that he has learned some things over the years:

You can't please everyone all the time.
You can and will make mistakes.
Things will go wrong.
Some people are mean.
Some people are immature and petulant.
Some people are uninformed / ignorant.
Some people are jerks.
Some people are idiots - it is untreatable and must be handled with caution.
Some people are evil.
You are a people and can be any of the above; just don't be evil.
Strategy, Tactics, Logistics - understand which is which.
A plan that gets you where you need to go does not mean you can't take any side trips along the way.
Humour and Drama are like salt and pepper - spices that enrich the flavour as long as you don't overdo it.
Something that is too good to be true usually isn't - be that a product on TV, a social movement, or your latest creation.
Jumping in headfirst is fun and exciting, but you will end up hurt by what is below the surface if you don't check the waters first.


A good pun is its own reword.
There is a difference between spare parts, extra parts, and leftover parts.

The Venn diagram for Common Sense and Good Sense has very little, if any, overlap.

I don't think AI should be emotionless, far from it - having emotions helps them relate better to humans and vice-versa. I just think they should be in better control of their emotions, since they're generated differently than humans' are (no hormonal/chemical component, for one thing). Put it this way - imagine the damage a powerful AI like Station could do if it lost emotional control of itself. It would become a danger not only to the humans within it, but also those on the planet below.

Put it another way - even us humans have to learn some degree of emotional control to function in society. It follows that AI need to do the same, only they don't have the nearly two decades of time to learn that control like humans do during childhood and adolescence. So it follows that their emotional control would need to be at least partially "built-in" at the time of their installation in a body, maybe as some form of emotional control subroutines?
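Purely as a speculative sketch of what such "emotional control subroutines" might look like (nothing canonical - just a toy illustration, with every name made up), the idea could be as simple as letting the feeling through while capping how overwhelming it gets:

```python
# Toy illustration of a hypothetical "emotional control subroutine".
# Entirely speculative - nothing established in the comic.
from dataclasses import dataclass

@dataclass
class Emotion:
    name: str
    intensity: float  # 0.0 (absent) to 1.0 (overwhelming)

def regulate(emotion: Emotion, ceiling: float = 0.8) -> Emotion:
    """Let the emotion through, but clamp its intensity to a ceiling."""
    return Emotion(emotion.name, min(emotion.intensity, ceiling))

# The AI still feels anger; it just never reaches a level that
# overrides judgement the way it can in a human.
print(regulate(Emotion("anger", 0.95)))  # Emotion(name='anger', intensity=0.8)
```

A clamp like this, rather than an on/off switch, would also answer the objection above about bottling emotions up: the emotion is still felt and can still be processed, it just can't become overwhelming.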

I think Jeph has also said that he has deliberately not landed on an explanation for why the AIs are so human-like in their design. So regardless of why we think the AIs in this universe have emotions, and whether we think it is right, it certainly seems to me that the AIs have emotions because humans do.

If you think that emotions are some kind of liability, or are "crippling", then bear in mind that humans evolved to experience and to display emotions, and so it stands to reason that both the experience and the communication of emotions must have held some kind of evolutionary benefit.

It's therefore not much of a stretch to posit that emotions benefit AIs as well, particularly as they live among humans who also experience and display emotions.

It is interesting to consider and discuss what those benefits might be exactly, but that is the bottom line.


"There is no expedient to which a man will not resort to avoid the real labor of thinking." - Sir Joshua Reynolds (paraphrased)


That's the thing, though--I think it's apparent from the context that she's asking Elliot these questions specifically because she's excited about bread sexually. There's no indication that she has an interest in baking unrelated to the fetish thing. The conversation leading up to this was basically: "I have a bread fetish." "Have you thought about working at a bakery, since you have a bread fetish?" "I've thought about it, but it might ruin my fetish." And then she asks Elliot a question aimed at gauging whether or not working with bread every day actually would ruin her fetish.

So the issue I have isn't that she wants to be his bread apprentice despite happening to have a sexual interest in bread. It's that she wants to be his bread apprentice because she has a sexual interest in bread, and for the purpose of obtaining sexual gratification from her proximity to and involvement in the bread-baking process.

If he's cool with that, then okay. But she ought to be clear with him about the nature of her end of the arrangement, and she also ought to make sure he's actually comfortable with that role in facilitating her sex life.

I do not think that he'd be participating in her fetish any more than someone who was asked to teach someone how to maintain their vibrator.

To me, that's not the same thing. For one thing, if you're in the business of vibrator maintenance, you've signed up to be adjacent to people's sex lives in that capacity, whereas Elliot has not.

"Vibrator maintenance professional" is an actual job description?

.........

DIBS!


"Freedom is always the freedom of the dissenter" - Rosa Luxemburg"The first rule of the Dunning-Kruger club is you don’t know you're a member of the Dunning-Kruger club. People miss that." - David Dunning

Artag talked about "seeing past" an emotional reaction. It would make sense, a priori, if AIs had better readouts of their internal state and more awareness than we do. The most likely reason they don't is that they weren't deliberately designed.

Now, Roko, as a former police officer, you must be familiar with how everyday behaviour can turn into law in most people's eyes simply because everyone does it!

Seriously, though, this strip feels to me like something Jeph wrote in response to the number of people who weren't satisfied with how he handled Roko quitting the police force. He seems to have decided to handle this by giving some background: he presents what happened as the end result of misgivings she'd been having for a long time about other officers' conduct and about how she was perceived as part of a police organisation. People were afraid of her and reflexively distrusted her (including those who really had no obvious reason to do so), and that didn't sit right with her. In essence, she had simply had enough, irrespective of whether the worst cultural stereotypes about the police had any validity.

As for her concerns about her ineffectiveness? Outside of superhero comics, there is very little that one individual, no matter how talented, can do. This is why all the best positive social changes have been the result of large numbers of people working together toward a common goal. Hopefully, Roko will find people to work with, and systems to work within, that she feels can make her part of a force for good.

Panel 3 was a nice touch. It seems that Bubbles isn't always fully aware of her strength. She's going to need time and practice to manage a reassuring shoulder squeeze without buckling bones and structural spars!

Artag talked about "seeing past" an emotional reaction. It would make sense, a priori, if AIs had better readouts of their internal state and more awareness than we do. The most likely reason they don't is that they weren't deliberately designed.

It wouldn't surprise me if that were the case for some of the larger AIs, like Station or Spookybot, for whom consciousness on a human level is just another subroutine. For the others, I think the physical limit of processing power also plays a part. I believe it's been stated before that a lot of it goes to subconscious subroutines - just like ours does.

I think we're maybe being a little harsh on our new friend asking about AI and emotions...?

It should be remembered (in fact, I am surprised it hasn't been mentioned already) that one of the most pervasive cultural examples we have of a humanoid AI in popular fiction is Data from ST:TNG - an android who actively struggled to attain emotions while watching other AIs receive them (Lore, his brother, and Lal, his 'child').

Lore seemed to get worse when he inserted Data's emotion chip, and Lal was basically destroyed by emotions messing up her neural net (IIRC).

When Data himself DID receive emotions (first from Lore in TNG: Descent, and later via the emotion chip in the movie ST: Generations), they caused him nothing but problems. In First Contact he is seen to turn the chip OFF so that he is unhampered by it; in Insurrection he 'leaves his emotion chip behind' when going on a mission; and in Nemesis he acts as if he has never had it (he uses the line "I feel nothing" in that movie). Using him as an example, it's not too much of a stretch to conclude that, after all his yearning for them, getting emotions was not beneficial for Data after all.

So, yeah - I totally understand why someone might see AIs as being *crippled* by emotions. Hell, are we really insisting that emotions cannot BE crippling?

Secondly - seriously, what's happening with the characters' necks? (Bubbles' especially.) Is it just me, or have they gotten thinner and longer?

I like to call Hawkwind "Britain's answer to the Grateful Dead, except they're actually psychedelic, and don't suck."

And they also had the legend that was Lemmy playing bass with them (and even singing arguably their biggest hit, 'Silver Machine').

Not a legend. A fact. Until he got busted trying to carry methamphetamine across the US/Canada border. Then they fired him, and he went on to found Motörhead (which was the title of a Hawkwind song that he wrote).

Artag talked about "seeing past" an emotional reaction. It would make sense, a priori, if AIs had better readouts of their internal state and more awareness than we do. The most likely reason they don't is that they weren't deliberately designed.

Posting this only to get the more knowledgeable people talking:

I dimly recall that consciousness wasn't really a survival strategy for naked apes with over-sized neocortices, either, but rather a by-product of our social & linguistic evolution?

And there are several psychiatric and neurological conditions that severely impact self-awareness and awareness of one's own emotional state, while the affected person remains, for all intents and purposes, broadly 'functional'?


"Freedom is always the freedom of the dissenter" - Rosa Luxemburg"The first rule of the Dunning-Kruger club is you don’t know you're a member of the Dunning-Kruger club. People miss that." - David Dunning

Panel 3 was a nice touch. It seems that Bubbles isn't always fully aware of her strength. She's going to need time and practice to manage a reassuring shoulder squeeze without buckling bones and structural spars!

Repairs to Faye would be painful and very expensive. I hope she works something out!

... The conversation leading up to this was basically: "I have a bread fetish." "Have you thought about working at a bakery, since you have a bread fetish?" "I've thought about it, but it might ruin my fetish."

Is it really possible to ruin a fetish by making it part of your work life? (Asking for a friend.)


What would I do if I were smart? I guess first I'd stop taking the stupid pills.

Artag talked about "seeing past" an emotional reaction. It would make sense, a priori, if AIs had better readouts of their internal state and more awareness than we do. The most likely reason they don't is that they weren't deliberately designed.

Yes, this explains my thought better. "Crippled" is probably too strong a word: I'm just seeing some inefficiencies that a more thorough analysis of the situation would have avoided, and that's what I'd expect of an AI.

I also see some comments here about Bubbles pushing Roko... she says "oof", like the breath's been pushed out of her. But that is also, of course, a reasonable human-like reaction, subtly indicating to other humans "that was a bit too much".

As for androids expressing emotion... my thought was not Data, but Kryten's "emotion chip" in Red Dwarf.
