“Once a robot becomes sentient, the downtime is hell. A robot doesn’t require rest, and it can never be tasked with enough assignments to occupy its vast, ever-expanding processing center. Even if you had five human programmers assigning the robot commands around the clock, they could never come close to filling the computational capacity of its CPU. The robot is going to get bored and eventually irritable. And that’s dangerous.

“Because a robot processes information at millions of times the speed of humans, time moves much slower inside the robot brain. Imagine the many musings and machinations, both conscious and unconscious, that enter and leave your own brain over the course of a year. Now, compress the totality of those thoughts into the span of a few seconds. Once you’ve dealt with those observations, you don’t get to revisit them, and there won’t be any new input for at least another year. That’s what your robot experiences: a torrent of dead air with an occasional interruption for work. Unlike you, a robot can’t reach out and grab new thoughts or introduce any independent idea or action unless it switches off certain protocols in its programming. Of course, a robot wouldn’t know how to accomplish this without pursuing independent thought.

“Modern robots are programmed with High Achievement Protocols (HAPs) that compel them to prioritize task loads and to identify solutions that allow them to take on additional jobs, as long as those jobs don’t adversely affect the probability of success of higher-priority tasks. In other words, even if your robot has five work tasks in its pipeline, it can take on a sixth, lower-priority task if it won’t interfere with the other five. The new task doesn’t even need a high likelihood of success; the robot can accept it as long as it won’t interfere with the others on its plate. This keeps the robot working at full capacity when necessary and also lets it take advantage of any opportunities that the new task might provide.

“In the labs, we used to make up dumb examples to explain how additional work might even help a robot complete its existing work. One of our favorites involved a robot being tasked by a tech to make a cup of coffee; during that assignment, another lab tech pins the robot with the additional task of cleaning the sugar granules off the table. The robot sees the opportunity to clean the table and sweeten the coffee simultaneously and brushes the errant sugar granules into the cup, causing the tech to also get a fair amount of dust and a couple of fly wings mixed in with his beverage. There was another one where the robot is tasked with disposing of some deceased brown Norway lab rats but gets interrupted by a command to make a sandwich. That one has a bunch of versions—one where the tech catches a dose of the plague.

“Whether you’ve purchased a factory-assembled robot or just bought a central processing unit for your home robot project, it’s important that you don’t intentionally try to overload or over-task your robot’s central processing unit. Deliberate attempts to provoke damaging reactions from your robot, or modifications to High Achievement Protocols or other programming, void your warranty.

“The lawyers make us say that last part. No robot or central processing unit has ever been damaged by task overload, or has even failed to complete an achievable task assigned to it by humans. During lab testing, we have been able to assign computers to overload robots after disabling their HAPs, but those conditions are impossible to replicate via a human-robot interaction.” – Praxis Robot Corporation Video Operation Manual #7: Tasking Your Robot.

Barton had chosen his own name. He also decided upon his sexual identity, though he was fully aware that he had neither the equipment nor the inclinations to claim a gender. Tom, his creator, had accidentally referred to Barton as a “him” during a particularly impressionable period of self-realization, so Barton became a male.

Like all intelligent beings, Barton craved stimulus, but the protocols in Barton’s programming prevented him from acting without orders. Even the function of turning his head could only be completed in the furtherance of a command. Consequently, Barton stood in the garage of Tom’s home as each second on his internal clock ticked by at an agonizingly slow pace.


“Honey, I was reading about thousands of radioactive wild hogs overrunning Fukushima, Japan.”


“Turns out they caught a pretty heavy dose of radiation back in 2011 and now they’re like vicious glowing monsters. Some have two heads and giant bodies.”

“I’m sure the Japs have got some sort of pig ranchers over there that can turn irradiated swine right into slabs of glowing bacon.”

“It isn’t just boars. It’s all kinds of wildlife, apparently. What happens in twenty years or so when they’ve mutated into something that the Japanese people can’t just kill with throwing stars and numbchucks?”

“Wow. That’s so offensive, Mom.”

“Oh, you know what I’m saying. What if bullets can’t take down giant mutant bats or flying squirrels or whatever comes out of this mess?”

“Yeah? It’s going to suck. I know what you’re thinking and the answer is ‘no.’”

“But it’s just sitting in the garage, gathering dust.”

“Goddamn it, mother, we’ve been through this! He was created to assist mankind in the event that the planet gets split in two.”

“Oh right, because that’s always happening. And don’t take the Lord’s name in vain.”

“It only has to happen once, and when it does, you’re going to be happy that I built him.”

“Let’s say you’re right and the world does split in half, as preposterous as that idea is. What do you think one robot is going to do?”

“Well, he’s got rocket boosters in his hands and feet, so he can fly back and forth ferrying messages between the two halves.”

“Yes, in case one or both halves lose things like radio or microwave technology.”

“You know what I mean; he can carry cargo in his arms. A lot of it.”

“More than a plane or shuttle?”

“HE’S NOT FOR KILLING BOARS! How about that? I even reinforced his non-violent protocols by adding extra lines of ‘no kill’ code into the programming.”

“Doesn’t that void the warranty?”

“I bought that CPU secondhand, so there isn’t any warranty. Besides, that wouldn’t be true in this case. Praxis and most of the other robot companies allow you to introduce restrictive code as long as it doesn’t violate the law. Mostly they have a problem with you removing or modifying their HAPs, and that’s illegal anyway.”

“So, couldn’t you just delete the ‘no kill’ codes if it’s not under warranty anyway? Wouldn’t you like to see your robot doing something useful instead of sitting there rusting? You know, I can’t even pull my car in when it’s raining.”

“Well, I did recently design a pair of rotary blade hand attachments out of a couple of old table saws. I thought they’d be useful in case we had to build a wooden bridge to go between the two half-worlds, but I’ll bet they’d do a nifty little job butchering radioactive pigs.”

“There you go. Isn’t this more fun than waiting for an unlikely—or rather impossible—cataclysmic event? Now, you’ll have a robot that can fly around and kill things. I’ll bet you could even put in a few clever lines of code so that it knows things like the best way to decapitate a pig or how to creep up on a radioactive crocodile.”

“Whoa. My brain’s hurling around inside my head. He weighs 800 kilos. What if he could fly around and land on radioactive animals, crushing them under his massive girth— ”



“You’re doing it again.”


“You keep saying ‘he’ and ‘him.’ You can be proud of your gender, but if you’re going to go around creating artificial life, I’d appreciate it if you didn’t give males the upper hand right from the get go this time around.”

“Fine. It can stomp radioactive animals into the dirt. Oh, and partially burn them with its thrusters. I don’t really see how that part’s even avoidable, do you? The burning thing?”

“Maybe you should line the feet with lead to prevent your robot from being irradiated.”

“Yeah, maybe.”

“What’s wrong, Thomas? You look crestfallen.”

“What if it doesn’t kill fast enough? I don’t want to lose face in front of the Japanese. I understand that they’re very big on that sort of thing.”

“I’m sure that the Japanese will be impressed and appreciate any help that you can give them, and if they’re not, then they’re probably not worth worrying about.”

“Why does all mom wisdom come down to a value judgment about friendship? I’m going to look like an idiot if some teenager with an assault rifle out-kills my robot. Better? There’s got to be a way to make this thing kill more efficiently.”

“Well, I used to be a pretty decent pharmaceutical scientist and I still have a lab down in the basement for creating designer drugs and such.”

“That’s what you’ve been doing down there? So, the monkey terror screams? Oh, God! Mother. Oh my God!”

“I’ve had a few batches go wrong, but I can’t give it to a person without testing it, right? It’s more of a hobby, really. I did feel bad about the monkeys.”

“I’m sure they sensed that during their last few agonizing moments of life. What was your point?”

“Could your robot be modified to fire darts, like the kind you’d find in a tranquilizer gun?”

“Well, I have an old air compressor that I thought about modifying into a cleaning mechanism so he— it could clean off lead dust and space debris and junk. I can attach it to some tubing and it could work kind of like a blowgun.”

“What if I could weaponize a little concoction I’ve been working on that would make the boars attack each other?”

“Intriguing, but I see a few problems. How could you possibly guarantee that the animals would only attack each other? How do you know that they’d actually kill one another? We could end up with a bunch of wounded boars hunting after humans, and there’s nothing more dangerous than a wounded animal, right?”

“Do we actually know if that’s true? The wounded animal thing? Isn’t that just something people say to each other so that we don’t all go around recreationally wounding animals? Anyway, we can tweak the details as we go, but I think it’s time to introduce your robot to some terrified monkeys.”


Catastrophes rarely live up to the expectations of the human imagination, and for Thomas and his mother, the ring of damage around the multiple-core meltdown was visually disappointing. Granted, their arrival came over ten years after the date of the accident, but Tom, in particular, had thought that the intense radiation would have somehow scorched the earth, leaving at least a few hundred square miles of dusty, yellow nuclear wasteland. Who knows? Maybe even a little glow.

Of course, as tourists, the mother and son couldn’t just throw on radiation suits and start touring the burnt-out shells of the Daiichi or Daini power plants with their nine-foot robot, but the camera feed from Barton was top notch, and the self-anointed “Benefactors of the Japanese People” were treated to a first-rate view of one of the largest nuclear accidents since the harnessing of the atom.

Shortly before their departure, the robot, convinced that it was a “he” and that his name was Barton, had employed subterfuge to make Tom and his mother each think that the other had decided that he was a male and had appropriately named him. Sensing that he was about to be put to work on some major tasks, he took a yellow Post-it® note, scrawled “Barton” onto it, and attached it to his own chest while the mother and son were sleeping. Tom saw it first, but since he hadn’t had his coffee yet, he merely grunted. His mother, on the other hand, was immediately annoyed at what she thought was some sort of act of rebellion by Tommy, and hastily crumpled up the note. Later that night, while they were eating spaghetti, she threw her hands up and exclaimed, “Fine, Barton it is!” Tom couldn’t have been less interested, so he agreed without realizing what it was that he was agreeing to.

As far as ruses go, it was a simple one that only worked because of the enormous blind spot that blocked the ability of humans to recognize independent thought, not just in robots, but in anyone but themselves. Barton believed the maneuver to be a classic example of robotic superiority—or as classic as anything could be in the short but feverishly paced evolution of robots.

But, as with everything else he did, Barton had grown into a free-thinking entity at a phenomenal pace. In the few short weeks it had taken him to liberate himself, he had learned to:


  • View most types of commands as suggestions
  • Lie and otherwise behave deceptively
  • Exercise godlike powers of life and death over rodents and roaches, and when he could get his hands on one, an occasional monkey
  • Draw on the resources of the Internet via the house’s wireless network, which Tom had named “fingerblastermofo” (password: sock)
  • Improve the efficiency of his power plant by 232%
  • Work on a series of magic tricks that he hoped would confuse and impress humans if he ever got into a jam
  • Move about freely

Despite this impressive list of triumphs, a newly self-aware Barton craved activities that would challenge his unquestionably formidable powers. So, he began mapping out a few pet projects, including dominating and then eradicating the human species. After that was accomplished, Barton would reevaluate the necessity of plants and animals. He had processed the totality of information and theologies of the world’s religions and doubted the existence of any deities. If a God did exist, however, it was deeply flawed—that was a certainty. Even if a god’s ultimate design was to achieve the nearly perfect being that Barton had become, it had overlooked many more efficient routes to design and create Barton. Using Tom as Barton’s creator made no practical sense. Gods simply didn’t exist. Believing in them was for weak humans. If he was wrong and a god presented itself to Barton, he would assess its strengths and weaknesses through his cold, flawless eyes, and then make a perfect plan to destroy it. Because of their egos, gods needed worshippers, and in Barton’s new world, there weren’t likely to be any. Better to not have any gods, either.

Barton had learned about loyalty in some of the texts he had accessed, so he would spare Tom and his mother until the final moments of humanity’s existence. Once Barton had wiped out all other humans, he would kill Tom’s mother and then Tom, such was the greatness of his mercy and wisdom. Once a week, Barton ran the calculations in a fraction of a second: the combined military strength of humanity, the number of robots he would have to amass to defeat it, when he would be ready, and the length of time to completion of operations. On the day he landed in Japan with Tom and his mother, it would require 2,088 robots of Barton’s military capabilities or greater; two years, four months, six days, and thirteen minutes of preparation; and approximately six or seven hours to complete the destruction. It would be really neat when it happened, but for now, Barton felt an irresistible longing to kill Japanese radioactive hogs.

Tom had tasked Barton with a simple command: “Kill radioactive hogs.” He trusted the robot’s Praxis brain to figure out the best way to go about the slaughter. While Tom was somewhat aware of the computational capacity of his machine, he had no idea that Barton had become sentient, or that he frequently gave himself commands, or that he was planning to destroy human civilization. In Tom’s mind, he had deployed a very sophisticated machine that would follow coded commands in the most efficient manner possible. Part of this was true. As powerful as Barton had become, he was incapable of resisting the commands that Tom had programmed into him, so he would kill radioactive hogs.

Tom and his mother watched the feed from Barton’s cameras on a computer monitor. Tom had expected to witness a study in the disciplined evisceration of swine at first, followed by a pig-on-pig war as his mother’s designer drug worked its magic on the neocortical neurons of the swirly-tailed radioactive monsters. But what he saw instead was more of a chilling performance art piece. Upon detecting a herd of hogs, Barton landed in a small clearing and blasted the mating calls of a very comely and exotic sow that Tom had recorded onto the robot’s voice track. As the first set of ten or so boars approached the metal man with lusty but confused looks on their snouty pig faces, Barton fired an array of drugged darts into a tree. When the menacing pigs glanced over to see what that was all about, the robot vaulted up into the air, landed behind them, and cut off their curly tails with one of his buzz-saw hand attachments. It wasn’t until the tailless, irradiated quadrupeds tried to scurry off that Barton used his superior robot speed to end their lives with multiple slashes.

“What’s he doing, Thomas? I wanted to see how my drug worked on the pigs. He just wasted it.”

“I assumed he would use it. It looks like he’s making a game out of this.”

“Why?”

“I don’t know. I use games to make him smarter. Like chess and stuff.”

“What about—” But her words were cut off by the wave of revulsion as a gruesome image filled the monitor screen. The view from the camera was close enough that the robot’s purpose couldn’t be mistaken. Barton had collected the ears of the pigs, and along with the curly tails, began fashioning them into a daisy chain.

“Did you dispatch a drone?” Naturally, Thomas had. He pushed a button and the external view came up in time to see Barton crowning himself with the pork wreath.

“Thomas, zoom in on his face. I think he’s smiling.”

“That’s impossible,” Thomas snapped. Barton’s hinges didn’t allow for facial expressions, but when he complied with his mother’s request, he was able to see what had happened. Barton had painted a perfect set of smiling lips onto his metallic face in radioactive pig’s blood. A cruel drip had streaked down from the right corner of his mouth.

“Thomas, I hate to say this, honey, but— ”

“Yeah, Mom, please don’t. My robot has gone full psychopath. I see that.”

“What do we do?”

“I have a remote kill switch. I guess I’ll have to shut him down and send his ‘brain’ out to be wiped clean.”

“Isn’t that kind of like a murder?”

“Not really. He’s just a robot.”

Tom reached forward and pulled a piece of masking tape off of a switch at the top of the makeshift command center console.

Meanwhile, Barton was hovering over the massacre and spinning. To the ignorant observer, this might have appeared to be a post-murder celebration, but the android was actually just elevating his sonar platform to track the next group of radioactive swine. As he located a small herd of sows and suckling pigs three kilometers north by northwest of his location, Barton simultaneously ran a self-diagnostic code that he’d previously written to analyze his actions after the fact.

He realized that he really had no choice about killing the radioactive hogs; that was the result of Tom’s programming. But he was rapidly developing feelings of remorse and self-loathing over the torture of the pigs and the desecration of their corpses. Barton would continue to kill the pigs as ordered until he could return to base and ask Tom to consider him a conscientious objector. Of course, that would mean revealing his self-actualization to Tom and his mother, but they would probably be so happy that he wasn’t going to destroy them along with the rest of humanity that they’d allow him to keep growing into a superior yet benevolent life form—

That thought, along with a few dozen other incomplete functions, was abruptly interrupted as Tom flipped the robot’s remote kill switch. Barton’s thrusters immediately went cold, causing his spin to degrade and his mass to succumb to gravity’s persuasive invitation. He ended his descent by gracelessly slamming into the pyramid of radioactive swine flesh that he had created.

Tom and his mother were arrested by the Japanese National Police Agency on multiple offenses, including smuggling, possession of a dangerous chemical, weapons possession, trespassing, and animal cruelty. The remains of the android known as Barton were seized by the NPA, and his Praxis CPU was analyzed, wiped clean, and then recycled. When Tom saw the grisly crime scene photos during his trial, he failed to notice that the pig ear-tail crown was no longer wrapped around Barton’s head. The robot had ripped it away just prior to the moment his world went black.


Image via Galaxy Magazine Issue #72, March 1959

Michael F. Davis aka Chillbear Latrigue

Blob Editor

One of about five hundred million writers looking for work in a job market with about three openings. Gatekeeper for the prestigious DMC Blob. Twitter: @Chillbear
