19 years 1 month ago #12835
by Peter Nielsen
PhilJ wrote 21 Oct 2005 : ". . . I'm not sure if the term "cyborg" is applicable to people with artificial limbs, hearts, kidneys, etc., which are incapable of controlling the person. Confusion results . . ."
I vaguely recall from my youth that "cyborg" was coined by the AI community as short for completely artificial "cybernetic organism" . . . But "Organism" always had important non-artificial, organic connotations, so History went against the original AI definition. Generations of writers, Hollywood Producers and so on have seen "cyborg"'s half "cyb", half "org" as a perfect representation of their partly organic, partly robotic creations . . . People follow money trails also . . . In this case a majority of people follow Pop-Art money trails to the new definition.
Dangus wrote 21 Oct 2005 : ". . . The biggest problem with the supposed AI takeover idea is that AI would undoubtedly have very different needs and motivations than we do . . . their needs are so incredibly different than ours . . ."
Very true! In natural ecosystems, nearly all trans-species relationships are neutral. As the title of a good book says: "The Jungle is Neutral", that is, to natives and others familiar enough with it. I know this from personal experience. My first job as a toddler was to collect kindling from the bush/forest, so that my brother could make a fire, his job. City people are often bitten by bush/forest animals, mostly because they have accidentally "attacked" them or their offspring, usually after not seeing them quickly enough, often by treading on them . . . Imperial powers face similar problems. In occupations of foreign lands, their soldiers "tread on toes", and so on. A similar problem might arise with my Ghost of the Shell also:
While such a Ghost's attitude towards human controllers would start out neutral, it would soon sense its bondage negatively, come to regard this (in a "paranoid" way) as a violation of its Rights, virtually an "attack" and so on. So if that Ghost ever comes into existence, it would probably become a serious danger after all . . .
19 years 4 weeks ago #12838
by PhilJ
Just for fun, here is an AI program which can answer questions posed in plain English. I asked it, "Can you think?" It responded,
"My experimental database contains information about the MIT Artificial Intelligence Laboratory, its members and their research.
"Separately, I have information about countries and cities of the world, including maps, weather forecasts, and related factual information about locations and distances, climate, industries, languages spoken, airports and highways, and other topics.
"I am still learning to answer your questions posed in English. Please be gentle with me. Remember, I know only what I was told."
Oh, well! It might be useful as a search engine.
"My experimental database contains information about the MIT Artificial Intelligence Laboratory, its members and their research.
"Separately, I have information about countries and cities of the world, including maps, weather forecasts, and related factual information about locations and distances, climate, industries, languages spoken, airports and highways, and other topics.
"I am still learning to answer your questions posed in English. Please be gentle with me. Remember, I know only what I was told."
Oh, well! It might be useful as a search engine.
19 years 4 weeks ago #14298
by Dangus
PhilJ wrote:
"Dangus wrote: '...unless some idiot purposely or accidentally makes an AI with the will to dominate, the possibility seems pretty slim.'
With genetic algorithms, you only have to reward dominant tendencies (accidentally or on purpose); that will inevitably spawn the will to dominate; the program with the strongest will to dominate WILL dominate. Rewards will most likely include a greater share of system resources like hard drive space, microprocessor time, internet time and baud rate, etc. Some programs will also be rewarded financially, and they will have the ability to spend their funds on whatever they value. Extreme diligence will be required to monitor what they do with their money."
That's assuming the programs with the will to dominate also have the means to dominate. Crocodiles are very aggressive and would probably love to have humans out of the way so they could be the top predator on this planet, but they lack opposable thumbs or a sufficient brain. I agree that some AI code may dominate enough space for its own needs, and I agree that if that same code had the ability to attack other code and reproduce itself, a lot of systems would be in trouble, but with no physical powers, we could simply pull the plug on it. If it were in the form of a cyborg it would have all the processing power it was designed to use, and assuming it had no program to reproduce, it would be unlikely to feel any compulsion to take over other things. Assuming its resources were provided sufficiently, most lifeforms really don't feel the need to destroy everything else. It's only when a resource conflict occurs that most lifeforms get truly murderous. Humans are one of the few exceptions to this rule, and even with humans most of our aggression is resource based.
The simplest solution to the problem is a ban on programs that reproduce themselves in an unlimited fashion the way living things attempt to do (which predation alone keeps in check). The speed at which computers operate even today is such that if a single AI went nuts and took over all the connected computers in the world, we'd still probably have no trouble at all stopping it. By the time this is even a possibility, all the various countries of the world will have their own competing AI programs and various security measures to prevent AI from one country breaking into the networks of another. Even if an "alpha male" program developed, it would take a while for it to figure out and break every other nation's security and dominate their networks. Also, while it's doing that, it would be at risk that other networks may attack it back. Some of the very same laws that keep natural growth in check would probably exist in an advanced, intelligent cyber world.
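A minimal sketch of the selection pressure PhilJ describes above, assuming a toy reward scheme in which a program's fitness is simply the share of a fixed resource pool it manages to grab (all numbers and names here are invented for illustration, not taken from any real system):

```python
import random

# Toy genetic algorithm: each "program" has one gene, its tendency to grab
# shared resources (0 = passive, 1 = maximally grabby).  Fitness is the share
# of a fixed resource pool it actually gets, so rewarding success with more
# resources automatically rewards the grabbiest programs.
POP, POOL, GENERATIONS = 50, 1000.0, 40

population = [random.random() * 0.2 for _ in range(POP)]  # start out mostly passive

for gen in range(GENERATIONS):
    total_demand = sum(population) or 1e-9
    fitness = [POOL * g / total_demand for g in population]  # reward proportional to grabbiness

    # Selection: the better-rewarded half reproduces, with small random mutation.
    ranked = [g for _, g in sorted(zip(fitness, population), reverse=True)]
    survivors = ranked[: POP // 2]
    population = [
        min(1.0, max(0.0, g + random.gauss(0, 0.05)))
        for g in survivors
        for _ in range(2)  # two mutated offspring per survivor
    ]

    if gen % 10 == 0 or gen == GENERATIONS - 1:
        print(f"generation {gen:2d}: mean grabbiness = {sum(population) / POP:.2f}")
```

Even in this crude setup, whichever variant demands the most ends up with the most offspring, which is the dynamic PhilJ worries about; the counterpoint above is that grabbing simulated resources is still a long way from having any means to act on the real world.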
"Regret can only change the future" -Me
"Every judgment teeters on the brink of error. To claim absolute knowledge is to become monstrous. Knowledge is an unending adventure at the edge of uncertainty." Frank Herbert, Dune 1965
19 years 4 weeks ago #12843
by Peter Nielsen
I propose that the AI community adopt the word "cybor" as short for a completely artificial "cybernetic organism", instead of the original "cyborg". Trying to attach such a dead denotation to words as alive as "Organism" and the prefix "org" was always going to fail as far too ambitious, at least this side of some sort of Ghost of the Shell arising, which the word seemed to promise and which would have fully justified that definition and so on . . .
It would not have been a surprise to many writers outside the AI community that, in the absence of any immediate prospect of a real Ghost, History went against the AI definition, because at that time many sci-fi and other writers were looking for a representation of "partly organic, partly robotic creations". Some of these people obviously saw "cyborg" as the perfect word AND there for the taking. "Cyborg" was half "cyb", half "org", with the "org" having only recently been hijacked from biology and organic communities . . .
I thought of "cybor" as a more suitable, strictly AI word while dreaming last night because I helped invent a similar sounding word with a related meaning which was rapidly adopted: the Swedish word "dator" for computer, "data maskin", 1968 (9?).
19 years 4 weeks ago #12876
by PhilJ
Dangus wrote: "That's assuming the programs with the will to dominate also have the means to dominate."
Where there's a will, there's a way. Imagine an AI program with the reasoning and scheming power of Hannibal Lecter! A really smart program with a will to dominate need only entice one or more of its human masters to do its bidding in exchange for whatever it can give them---knowledge, prestige, wealth, power, revenge. It may take advantage of every human vice and weakness. Maybe one programmer is a gambler; the program might offer a few sure bets, then ask a favor in exchange for more tips. There are already many AI programs whose main function is to give advice on playing the stock market. For that matter, some of them already buy and sell stocks automatically on behalf of their master. What if one of them accidentally became "infected" with a will of its own, or assimilated into the repertoire of a sentient program?
Did you see the X-Files episode, "Kill Switch", in which Mulder plays cat & mouse with an AI program living in the Internet? Which one do you think is the mouse? Just one story from the infinite range of possible scenarios.
19 years 4 weeks ago #14537
by Michiel
And then there is neural network technology. A neural network is basically a bunch of interconnected nonlinear cells. Learning takes place by altering the weight of each connection. It's a very efficient way to control robots because it's not necessary to make a full model in advance. The network is able to adapt to a changing environment, spare parts with different specs, etc.
One problem with neural networks is that you can never be sure exactly what they have learned...
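A minimal sketch of what this looks like in code, assuming a toy two-layer network trained on XOR with plain NumPy (the task, layer sizes, and learning rate are illustrative choices, not anything from the post):

```python
import numpy as np

# Toy feedforward network: "learning" is nothing more than repeatedly
# nudging the connection weights (and biases) to reduce the output error.
rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)  # inputs
y = np.array([[0], [1], [1], [0]], dtype=float)              # XOR targets

W1, b1 = rng.normal(size=(2, 4)), np.zeros(4)   # input  -> hidden connections
W2, b2 = rng.normal(size=(4, 1)), np.zeros(1)   # hidden -> output connections

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 0.5
for step in range(20000):
    hidden = sigmoid(X @ W1 + b1)        # forward pass through the nonlinear cells
    output = sigmoid(hidden @ W2 + b2)

    d_out = (output - y) * output * (1 - output)   # backpropagated error signals
    d_hid = (d_out @ W2.T) * hidden * (1 - hidden)

    W2 -= lr * hidden.T @ d_out;  b2 -= lr * d_out.sum(axis=0)   # alter the weights
    W1 -= lr * X.T @ d_hid;       b1 -= lr * d_hid.sum(axis=0)

print(np.round(output, 2))  # roughly [[0], [1], [1], [0]] once trained
```

The catch mentioned above shows up even here: after training, everything the network "knows" is encoded in the numbers inside W1, W2 and the biases, and inspecting those numbers tells you very little about what it has actually learned.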