Disturbing implications

You know, if you take this to its logical (if perhaps absurd) conclusion, the use of unintelligent software packages to guide cruise missiles may someday be regarded as morally equivalent to teaching kids with Down's syndrome to become tokkotai pilots. Come to think of it, I seem to recall a story in Heavy Metal years ago about this.

(Instapundit)

Comments

( 4 comments — Leave a comment )
nornagest
Mar. 7th, 2007 11:37 pm (UTC)
This seems like jumping the gun to me. We don't even have a satisfactory definition of "intelligence" yet, and as far as we can tell the machine intelligence that we have created thinks in a very different way than we do.

The inclusion of "futurists" and a sci-fi writer on the committee is a big red flag, anyway.
wombat_socho
Mar. 8th, 2007 12:40 am (UTC)
The inclusion of "futurists" and a sci-fi writer on the committee is a big red flag, anyway.
Eh, maybe not. If the rest of the panelists are engineers and cybernetics types, they may not be familiar with the ethics surrounding the question, which is a really tricky one once you get into it. If you can "back up" your consciousness before going on a suicide mission, is it really a suicide mission? What if you can duplicate your consciousness into a company of troops? (Both points avoided in Glasshouse, BTW.) Are self-aware robots going to have the same opinions on the matter? Somebody ought to start thinking about those questions before they come up.
nornagest
Mar. 8th, 2007 02:29 am (UTC)
The ethics question was given some time when I studied AI. I suspect the only reason it isn't given more is that it doesn't need to be, given our current technology; otherwise classes in cyberethics would be required for programmers, just as a bioethics class was required for my friends in neuroscience.

I suspect the question of duplicating consciousness already has an answer, anyway -- you're effectively creating a new consciousness, as the instances will be measurably different from each other from the moment you make the copy. Granted, the cost/benefit analysis of terminating one copy isn't the same as killing a discrete being would be (at least from an information-theory perspective), but that's a topic for our collective morality to hash out if and when it becomes possible.

The philosophical implications of combining consciousness seem much more interesting to me, since there isn't a biological analogue to the process.
wombat_socho
Mar. 8th, 2007 02:45 pm (UTC)
I suspect the question of duplicating consciousness already has an answer, anyway -- you're effectively creating a new consciousness, as the instances will be measurably different from each other from the moment you make the copy.

Is that actually the case? I know that in SF there are several answers to that question. Algis Budrys' classic "Rogue Moon" agrees with you (even though the duplicates have all the memories of the originals) but more recent works (the Star Trek series, the novels of Charles Stross and John Scalzi) don't.