
Re: [News] Linux and OSS Will Not Kill People

"Oliver Wong" <owong@xxxxxxxxxxxxxx> wrote in message news:E1oEg.7799$365.7007@xxxxxxxxxxx
"Roy Schestowitz" <newsgroups@xxxxxxxxxxxxxxx> wrote in message news:6697700.qI8P1PXjBG@xxxxxxxxxxxxxxxxxx

,----[ Quote ]
| The lead developers Tiziano Mengotti and Rene Tegel have been reading
| their Isaac Asimov and have decided to give their code the equivalent of
| the three laws of robotics. They say that "the program and its
| derivative work will neither be modified nor executed to harm any human
| being nor through inaction permit any human being to be harmed."
`----
[...]

Even the term "harm" itself is ill defined. If a human is aging at the rate of one second per second, does that constitute harm? If a human ages for 90, 100, or 150 years, surely they will die from that aging. So how is a robot to stop a human from aging?[*]


[...]

*: Obviously, the only way to prevent humans from aging is to ensure that no humans will ever exist. At this point, there are about 6 billion humans in existence. If you kill all of them, you can guarantee that no further humans will exist. So by killing all humans that currently exist, you've caused harm to 6 billion humans, but you've prevented potential harm to an infinite number of humans: all humans that will ever exist.

And so obviously, if you accept the clause of that modified GPL, there are two strategies I can think of to ensure that the above-mentioned program and its derivatives will neither be modified nor executed to harm any human being, nor through inaction permit any human being to be harmed:


(1) Destroy all copies of the software (including its original source code, whether that source code exists on hard drives in computers or in the brains of programmers). Therefore, the software can neither be modified nor executed to cause harm to anyone.

(2) Destroy all humans (without using the software or its derivatives). Therefore, there won't be any humans around for the software to harm when modified or executed.

As far as I've reasoned it out, you MUST perform one of the above two actions if you accept the license agreement. Any other action allows for the possibility of the software somehow harming a human, which means you're allowing a human to be harmed by the software via inaction.

Even (1) is iffy, because even if you destroy all copies of the software and its source code, along with everyone who knew anything about the design of the software, there's still a small chance that, by some fluke, someone will independently write the exact same program, down to the bit level. Yes, it's highly improbable, but it's still possible, and if you don't do anything to take care of this possibility, you're allowing it to happen via inaction. So the safest bet is (2): kill all humans.

- Oliver

