A situation specifically created with this "conversation" in mind. It doesn't have to occur naturally, nor does it need to have a point. It can happen; thus "never" is a flawed statement.

"Never" by itself isn't a statement, and you have not proven your point. First, you needed to put the word conversation in quotes for your claim to work, but that undermines it. Part of the poster's premise was that the conversation – a dialogue between two programs/people – can take place. It can't, because to arrive at the "conversation" as presented it would need to be scripted, and a scripted exchange is not a debate or conversation anymore; it is a set piece, a fiction, unreal. Which was my point.

Quote

Alternatively, there is nothing stopping the programmers from creating alternate programs with alternate startup assumptions separately and testing them separately. No interaction with the running program is required to create the modifications. There is nothing stopping a programmer from creating artificial memories or modifying existing ones.

Agreed. But this doesn't matter with regard to the point I was making. See below about special design.

Quote

Where in the OP did it state - or even imply - that the conversation occurred naturally?

Quote

This fictional debate takes place in an unspecified A.I computer system between two sentient computer programs (Alpha and 0,1)

A debate is an exchange between two people (or, in this supposed case, AI programs) in which ideas, points, and arguments are traded. If it does not occur naturally, it is not a debate; it is a set piece, a fiction, unreal. Which was my point.

Quote

Or you can just not give the programs the ability to read that information. <ad hominem removed> The only reason you would ever need to create a "specially designed independent computer system" is if you had gone out of your way to give them the ability to read and comprehend the locations where you put the programmer signatures. And if, for whatever reason, you do open up their ability to read files at random, then don't give the program admin privileges and don't chmod 777 everything.
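For what it's worth, the locked-down setup described in that quote is a one-liner at the shell. A minimal sketch (the file path and "signature" content are made up for illustration; `stat -c` is the GNU/Linux form):

```shell
# Hypothetical setup: the "programmer signature" lives in a file the
# running programs are never granted permission to read.

# Create the signature file (contents are placeholder text).
echo "signed: programmer" > /tmp/signature.txt

# Owner read/write only; group and others get nothing.
chmod 600 /tmp/signature.txt

# What NOT to do -- this grants every user full read/write/execute:
# chmod 777 /tmp/signature.txt

# Confirm the restrictive octal mode took effect.
stat -c '%a' /tmp/signature.txt   # prints 600 on GNU/Linux
```

Any process running as a different, unprivileged user would then get "Permission denied" trying to open the file, with no special computer system required.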

Part of the premise underlying the original post is that the programs are a lot like people, which would include the ability to examine their environment. Limiting or eliminating that ability creates an alternative, specially designed computer system – exactly what I said would be needed. You have unintentionally supported my contention. The same goes for altering the programs to perceive time on a human scale, or just about any other special aspect you care to name.

Quote

His analogy fails utterly because of the conversation he decided to write on the behalf of the programs. It was terrible.

I thought that was painfully obvious to everyone but the original poster. But he replied to the joking about his OP as if the jokes were serious points to be considered. To me, the logical thing was to rip his underlying premise to shreds; with the premise gone, any discussion of the content on his part fails.

Quote

You are making a number of unnecessary assumptions and attempting to use those against the premise you were given. It seems like a poor attempt to dodge by claiming a what-if scenario can't happen. That's like playing D&D and questioning the GM when he tells you a dragon landed in front of the party because you don't believe in dragons.

The "assumptions" were present in the premise. And a correct analogy here would be like playing D&D and questioning the GM when he has a group of Klingons beam down in front of you.

Quote

Also; for someone who took a shot at me about programming, you clearly have a lot to learn yourself.

Yes, I do. I haven't done any programming in decades. (I don't count writing macros as programming.) I first learned Fortran IV[1] and then went on to other languages. My programming ended about the time C (or C++ – I've forgotten which) was gaining popularity.

I apologize for the ad hominem attacks – they are too easy to engage in, as you know. Like me, you left a lot unsaid, which made it possible to interpret your posts in a negative light; you knew your comments were accurate because you knew exactly what you meant, but the actual wording left it possible to view you as not knowing enough about programming. I do the same in my posts.

I seem to have found a hot button of yours and will endeavor not to use a certain word again unless I want you jumping all over my posts.

"Never" by itself isn't a statement, and you have not proven your point. First, you needed to put the word conversation in quotes for your claim to work, but that undermines it. Part of the poster's premise was that the conversation – a dialogue between two programs/people – can take place. It can't, because to arrive at the "conversation" as presented it would need to be scripted, and a scripted exchange is not a debate or conversation anymore; it is a set piece, a fiction, unreal. Which was my point.

Quote

Quote

Where in the OP did it state - or even imply - that the conversation occurred naturally?

Quote

This fictional debate takes place in an unspecified A.I computer system between two sentient computer programs (Alpha and 0,1)

A debate is an exchange between two people (or, in this supposed case, AI programs) in which ideas, points, and arguments are traded. If it does not occur naturally, it is not a debate; it is a set piece, a fiction, unreal. Which was my point.

Quote

The "assumptions" were present in the premise.

All we are told in the opening is that there are two sentient programs with the names "Alpha" and "0 1". We are not given their location. We are not given how long they have been operational. We are not given how many there are in the computer system.

They could be in a multi-server complex that spans 12 decks on the U.S.S. Enterprise E. They could have been running for one thousand years. They could have a hundred thousand other programs with them[1][2]. We've already gone over the fact that multiple AIs can be created without any knowledge of a programmer. You've also admitted that enough experience (gained over an unspecified period of time) would create differing responses to the same situation despite "identical twin" starting conditions.

But they will be like human identical twins; it will take a lot of experience before they significantly diverge.

And no one made these two have this conversation (although, given Alpha's opening line, one might think it was coerced – but terrible writing aside...); it just happened to occur exactly one thousand years (or however long) after they were activated.

Again, none of this was specified because as a "what-if" scenario it shouldn't need to be.

Part of the premise underlying the original post is that the programs are a lot like people, which would include the ability to examine their environment. Limiting or eliminating that ability creates an alternative, specially designed computer system – exactly what I said would be needed. You have unintentionally supported my contention. The same goes for altering the programs to perceive time on a human scale, or just about any other special aspect you care to name.

Can humans examine everything about their environment? Is anyone here able to see the ultraviolet spectrum with their built-in capabilities? There could be a programmer's signature there! We need tools that we've created in order to examine certain aspects of our environment, and there is no reason for this exercise to assume the programs have developed their own equivalent technologies. Limiting their ability to examine their environment is not an obstacle. The same goes for the perception of time – we have no real basis to contend that an AI should experience time at any specific rate, if at all. The fact that they are AI (Artificial Intelligence) means that anything we put them in, and any decision we make about how they experience their environment, will be "specially designed" by some metric, since there is no base assumption or prior example one can look to for their capabilities.

I am of the opinion that this argument is simply invalid. It takes away from the AIs' ability to have a conversation no more than blindness and imprisonment would take that ability away from two imprisoned blind men[3].

I thought that was painfully obvious to everyone but the original poster. But he replied to the joking about his OP as if the jokes were serious points to be considered. To me, the logical thing was to rip his underlying premise to shreds; with the premise gone, any discussion of the content on his part fails.

You're right, it was painfully obvious to everyone but the original poster. But are we talking to each other about it, or are we responding to him? If he thinks it's serious, then (in my opinion) the best ways to address it are either to laugh at it and hope he has a sense of humor about his own failing (he doesn't), or to address the content and show him why it fails. The goal in doing it that way is to change his thought process: make him recalculate the results – essentially, we correct the first thing on his list, invalidating the rest – and he has to defend or accept that before we can move on.

Revise his thinking and he'll either run away[4] or understand a little bit more.

Furthermore - it's a "What-if" scenario with a non-self-contradictory set-up. I don't really see the value in going after the premise since that doesn't actually advance the conversation.

And a correct analogy here would be like playing D&D and questioning the GM when he has a group of Klingons beam down in front of you.

I see nothing wrong with this.

But in all seriousness, I still see nothing wrong with this – it's a furthering of D&D's inherent what-if scenario generation. How do the players react to the change? Do the Klingons win, or do the players capture themselves a nice new Vor'cha starship? Sure, the GM is going to get some funny looks, but there is still nothing wrong here. Also, thanks for the idea. *Evil grin*

I seem to have found a hot button of yours and will endeavor not to use a certain word again unless I want you jumping all over my posts.

The word to be careful with is "You". I don't mind someone talking about programming and missing something - this is true for any topic. My problem is when someone takes a shot at someone else (especially if that someone else is me) about a lack of knowledge without direct provocation[5].

There would be a lot fewer gun-related deaths if everyone shot second, basically.

Logged

"You play make-believe every day of your life, and yet you have no concept of 'imagination'."
I do not have "faith" in science. I have expectations of science. "Faith" in something is an unfounded assertion, whereas reasonable expectations require a precedent.

So, going with the whole point of the thread's allegory, you'd agree that YHWH (or Jesus or whoever) has to first introduce himself - at the very least, convincing someone that he exists - before that person can freely choose to either pursue or avoid a relationship. Agreed?

Wonderful. Then you agree that those to whom your god has not introduced himself, who are not yet convinced of his existence, do not yet have the freedom to choose whether or not to pursue a relationship with him.

Today, right at this moment Jackie Chan is introducing Himself to you through me.

HELLO THERE!

Now you have all the freedom in the universe to choose [to pursue a relationship with Jackie Chan]. What do you say?


Would the result be any different if you had said yes? Can you really choose based on some random person on an internet forum offering you a relationship with some other unrelated entity?

-

Never mind that you didn't exactly fulfill the "convincing" part of the requirements.


Today, right at this moment my God is introducing Himself to you through me.

I AM THE CREATOR OF ALL THINGS, AND I SENT MY SON TO SAVE YOUR SIN WHOSE EVER BELIEVES IN HIM SHALL BE SAVED.

Now you have all the freedom in the universe to choose. What do you say?

I say, "I've heard this before." And I *did* choose god. In fact, I seriously contemplated the priesthood as a late teen. But god failed to keep up his end of the bargain; he wasn't strong enough to keep the insidious evil called "education" from my paltry brain. I learned stuff – mainly history, and how religions of all types are more of a blight on the world than a help – and thought for myself, and very gradually fell away from Roman Catholicism, then Christianity, then theism altogether.

So, here's one example of someone who did, in fact, choose to have a relationship with god. But he wasn't present enough to maintain it.