There Oughta be Three Laws


By L. Neil Smith

© Copyright JPFO, Inc.

When I was young, just beginning to get deeply interested in firearms, there appeared in the newspapers (remember them?) a story about a farmer who got in trouble with the law trying to protect his property.

There were two houses on his land. He and his family lived in one. Another, unoccupied and closer to the road, kept being broken into by teenagers who had uses for an empty house. Unable to watch it 24 hours a day, the farmer rigged a shotgun just inside. If someone forced the door, they'd get blasted. When that actually happened, the farmer was surprised that the cops hauled him away. Apparently he'd figured, if he had a right to shoot trespassers himself, it was okay to do it by mechanical proxy. I don't recall what he was charged with, but it came down to leaving a self-actuating weapon lying around out of his direct control.

Negligent homicide, I guess, or at least a pretty good try.

When I first heard about this, I was outraged. Here was a hard-working individual with not very many options, trying to hold onto what he and his family had the only way he knew, against burglars and vandals. The entire business had taken place on his property, and whoever got injured had done so through their own illegal and immoral choices.

But the more I thought about it over the years, the more I tended, grudgingly, to agree with the law. After all, you're not allowed to murder anyone on your property, or to rape or torture them. Leaving an automated weapon to guard the door meant leaving out the capacity for human judgement. What if it had been a child who opened the door, or an accident victim from the road, or someone seeking emergency shelter in a storm? A human being with a shotgun in his hands could restrain himself from shooting them, but that rigged shotgun could not restrain itself.

What brings all this to mind just now is an article I saw recently about an expert in the field of robotics who worries about the development of automated weapons systems -- armed killer robots -- on which the United States government has plans to spend some four billion dollars by 2010. To read the article, go to: .......

I'm primarily a novelist by trade -- a science fiction writer, in fact -- and I've made many technological predictions that turned out to be accurate. To tell the truth, I've worried about this sort of thing myself since the 1980s, when prototypes (somewhat resembling Daleks, the evil machines on Doctor Who) built specifically for police work began showing up in the news. I'm not the only one. In 1984, Michael Crichton wrote and directed a movie, Runaway, which featured, among other things, a household robot that had gone "insane", armed itself with a .357 Magnum revolver, and murdered the family who owned it.

Manufacturers claim that using robots for hazardous tasks protects both the police and the public. And given the behavior of the police during the Los Angeles Rodney King riots and at Columbine High, maybe they feel a need for such a comfort. But it's not that great a leap from machines designed to disarm bombs, to machines designed to disarm you.

What if there had been killer robots available at Waco, or Ruby Ridge, or at the MOVE bombing in Philadelphia -- or even at the Warsaw Ghetto? Would the authorities have waited 51 days before assaulting the Branch Davidian church? Would those historic sieges have ended more quietly and quickly -- and with less embarrassing publicity -- if killer robots could have been sent in to eradicate the holdouts? And would the lessons these events taught us about government have gone unlearned?

Let me put it this way: do you doubt for an instant that this is precisely what some individuals in government ardently hope will happen?

And even if we were contemplating completely innocent and lawful uses for these infernal devices (they ought to be of some use to fire departments, for example), think of that computer sitting on your desk for a moment -- especially if it uses Microsoft Windows. Would you be willing to bet your life on its flawless performance? Yet that's just what the authorities will expect of us in the near future -- if we let them.

And how long before some criminal learns to hack them for his own purposes?

Other science fiction writers have been thinking about the ethical problems that robots represent for a very long time. Isaac Asimov, for example, who actually invented the term "robotics", believed that the best way to reduce the ominous threat that robots may represent is for them to be constructed around a set of rules he called the "Three Laws of Robotics", which he introduced in his pioneering 1942 short story "Runaround".

Here are Asimov's Three Laws:

1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.

2. A robot must obey orders given to it by human beings, except where such orders would conflict with the First Law.

3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.

Asimov's idea was that these laws would be so thoroughly ingrained in the robot's "positronic" brain -- hardwired in, as it were, rather than simply programmed -- that any attempt, or any external compulsion, to violate them would destroy the robot. Now, nearly seven decades later, we wonder why we haven't heard any of this in connection with the police and military robots we've been seeing increasingly in the news.

Look at it this way: if we're going to try to stamp every one of the two billion .22 Long Rifle bullets American shooters expend every year with a unique serial number, as some politicians are demanding, it seems quite reasonable, at least to me, to insist that every mobile, cybernetically controlled device above a certain number of gigabytes should have Asimov's Three Laws built into it. Certainly the latter recourse is no more clumsy, stupid, expensive, or unworkable than the former.

How about you?

Equally certainly, the use of conscienceless machinery is going to make killing a great deal easier. Throughout most of human history, a combatant had to confront his enemy face to face. Perhaps it's only a coincidence, but in more recent history -- say from the War Between the States, through two world wars, Korea, Vietnam, down to today -- as it has gotten easier to kill other people at a distance, wars have become much vaster in scope, hideously more frequent, and unimaginably deadlier.

Or perhaps that's just a side-effect of larger, more powerful governments.

Meanwhile, we wonder about the "little things", too. Like whether women -- your wife, for example, your daughter, your sister, your mother, or even your grandmother -- would prefer to be groped at the airport by the thuggish drones of the Transportation Security Administration or by police state robots equipped for the task. It's not really much of a choice, is it? In fact it's like choosing between being hanged or electrocuted.

We also wonder, since resisting such lethal machinery would be more difficult than resisting merely human stormtroopers, especially if it could stand off at 600 yards and shoot you through the eye -- rendering your "freedom arsenal" ineffective -- whether one unintended but inevitable consequence of the use of robots by the police might not be a lot more "IEDs", improvised explosive devices, hidden in garbage cans and storm drains, not just overseas, but increasingly here at home.

Imagine trying to take a simple walk at the risk of being blown to bits.

Knowing what we do of history and human nature, we can confidently predict that one way the authorities will try to "solve" this problem of imperfect machinery and the indiscriminate use of military and police robots is to persuade you to let them put an electronic chip in your arm so the mechanical minions of the law will know not to shoot you.

But not only is this morally wrong -- autonomous human beings were never meant to be tagged, tracked, and herded like so much livestock -- it also means that the police robots will know to shoot you, if your name and electronic ID number happen to show up in the wrong file.

How long before the "No-Fly" list becomes the "No-Pulse" list?

Maybe they'll start this chipping process in the public school system, claiming they need a way to "protect" children from the deadly killer robots patrolling the hallways to keep kids from smoking -- and kissing.

There are alternatives. Perhaps the best of them came from yet another science fiction writer, the late Mack Reynolds, a Marxist who nevertheless wrote two wonderful novels in the early 1960s, Mercenary From Tomorrow and The Earth War, in which he proposed that armies of the future be forbidden to employ any weapon that was invented after the year 1900. That means rifles, shotguns, and pistols only (the semiautomatic "Broomhandle" Mauser became available in 1896). No nukes, no tanks, no airplanes, and certainly no robots. Wouldn't you just love to be there when that idea is introduced in the United Nations?

What we wonder most is why, in a nation of liberal politicians who absolutely hate, loathe, and despise firearms and who are hell-bent on protecting us -- to death if necessary, whether we want them to or not -- there isn't already a law on the books that absolutely prohibits anyone, especially any agencies of the government, from arming robots, whether they remain under remote human control or are fully autonomous devices.

What about it, Carolyn McCarthy, Chuck Schumer, Dianne Feinstein, Richard Daley? Wanna do something that would really make America safer? Or do you look forward to robots doing your dirty work -- dirty work you're too cowardly to do yourself -- the way the cops do now?

Inquiring minds want to know.

A fifty-year veteran of the libertarian movement, L. Neil Smith is the author of 33 books, including The Probability Broach, Ceres, Sweeter Than Wine, and Down With Power: Libertarian Policy in a Time of Crisis. He is also the publisher of The Libertarian Enterprise, now in its 17th year online.

Visit the Neil Smith archive on JPFO.

© Copyright Jews for the Preservation of Firearms Ownership 2012.

Original material on JPFO is copyright, and so it cannot be used or plagiarized as the work of another. JPFO does however encourage article reproduction and sharing, providing full attribution is given and a link back to the original page on JPFO is included.
