Once personality is mimicked well enough, some laws will treat machines as self-aware when they aren’t in the least.
People will form real attachments to robot pets if the pets can cross the uncanny valley and avoid defecating in your apartment.
This whole area is a giant can of worms. https://www.sciencefriday.com/segments/science-diction-the-origin-of-the-word-robot/
Robot is drawn from an old Church Slavonic word, robota, for “servitude,” “forced labor” or “drudgery.” The word, which also has cognates in German, Russian, Polish and Czech, was a product of the central European system of serfdom by which a tenant’s rent was paid for in forced labor or service.
Making an intelligent machine that can pass the Turing Test and then treating it immorally is itself immoral. This is something Star Trek: TNG got right.
In a better (not even perfect) world, we’d do well to do the same.
You did not define treating it immorally.
Pretending a horse is a child doesn’t make it one. Olivia Newton-John figured this out after her first child.
The Turing Test is junk science (which takes nothing away from Turing’s brilliance otherwise). By some measures, ELIZA variants have already passed it, but they are not intelligences.
I’ll give you a very simple example….
A pedophile buys an android designed to look and behave exactly like an 11-year-old girl. One evening he goes after it with garden tools. Through an open window you can hear the screams from the ’bot, pleas for help. An accurate enough rendition might include red hydraulic fluid spraying around. You are a neighbor and witness it. You call the police. They show up and tell him to keep it quiet next time. He can’t be arrested for vandalizing his own personal property. Now you try to explain to your own 11-year-old daughter why that’s different, and why as a society we should tolerate it. All sorts of depravities are only a few dollars away, or available for lease: no cash down and only $120/mo for 24 months.
Judgement will not be on the machines. It will be on us.
You can be arrested for traumatizing children, however it’s achieved. There are catch-all laws for that.
And this is different from the neighbor’s kid seeing you watch a horror movie on Showtime, how?
You don’t see a difference between ‘raping’ an android in front of a child vs. doing it as a movie?
How do you judge intelligence? How can you tell a machine has a soul? The argument of a difference “in kind” is a slippery slope when the physical world evidences only differences “in degree.” Does a mind have inherent, inalienable rights? Or only if it is made of organic biological matter? To flip the argument on its head: are we free to abuse the autistic because they would fail the Turing Test? Why? It’s really more of a question for us than for the machines, isn’t it? For the truth will be self-evident to them.
We start by realizing no algorithm can do anything but mimic intelligence, no matter how complicated or what tests it passes.
Once you say intelligence is mechanical, morality and ethics go out the window. You no longer need the devil to make you do it.
Everything is then just a result, neither good nor evil.
I’ll consider extending human rights to a machine when it demonstrates an ability to recognize and respect those rights in others.
…
Some members of the species Homo sapiens don’t pass that bar.
A deeply profound position. I could endorse that.
OK, then I think we are in agreement. That’s essentially getting to the same point.