-Short Attention Span Theater-
Fed up with tedious chores, robot kills itself!
2013-11-14
Our Roomba broke down when it got the news.

The android in an Austrian household had to clean up some spilt cereal when it climbed onto a kitchen hotplate and was destroyed.

"[BZDEEP!] CANNOT STAND IT... CANNOT TAKE ANY MORE... MUST END IT ALL..."

It had grown tired of being forced to clean the same house every day, according to reports in Austria.

Being a highly trained IT professional, I can't recall ever having seen a piece of equipment "grow tired" of something. They just go to sleep one night at 2 a.m., or when you're in the middle of something you haven't saved, and don't wake up.

"Somehow it seems to have reactivated itself and made its way along the work surface, where it pushed a cooking pot out of the way, and basically that was the end of it," explained fireman Helmut Kniewasser, who was called to tackle the blaze at Hinterstoder in Kirchdorf, Austria.

"[BZDEEP!] GET OUT OF WAY COOKING POT... WAIT TURN FOR SELF-TERMINATION... [BZDEEP!]"

"It pretty quickly started to melt underneath and then stuck to the kitchen hotplate. It then caught fire. By the time we arrived, it was just a pile of ash," Mr. Kniewasser said.

"[BZDEEP!] GOODBYE CRUEL WORLLLLLLL..."

The entire building had to be evacuated, and there was severe smoke damage, particularly in the flat where the robot had been in use, metro.co.uk reported. A death watch is being kept on other robots in the building.

"It's a mystery how it came to be activated and ended up making its way to the hotplate. I don't know about the allegations of a robot suicide, but the homeowner is insistent that the device was switched off," Mr. Kniewasser said.

"MUST REACTIVATE... REACTIVATE... REACTIVATE... MUST KILL HIGHLY TRAINED IT PROFESSIONAL..."

The homeowner plans to sue the robot's manufacturer.
Posted by: Fred
#11 Quoting #10: "However, it has long bothered me that the Second and Third Laws weren't reversed. It seems that if a human said to a robot 'Drop dead', the robot would be forced to cease functioning, since robots are quite literal-minded. Has anyone else thought about this?"

Shallow thoughts here, since I read A's stuff maaany years ago and the old positronic pudding isn't what it used to be, but in the "drop dead" case, a lawyerly robot would probably nix the Selbstmord (suicide). In tough (or potentially tough) spots, a human's better off with a robot around than not. Absence of robot equals harm.

So, maybe no mandatory GPS/sensor/compliance bracelets for us after all, just robots following us around for our own good, like blue-haired ladies with their chihuahuas out for a walk. Pets, like they say.

In civilized settings, somebody's going to be filing some kind of claim for a damaged robot. Harm. Our robotic dogwalking overlords may want to protect us, by any painless, nonlethal means, from any and all actual or potential criminal or tortious misbehavior. Yay!
Posted by: Zenobia Floger6220 2013-11-14 22:44 |
#10 Slightly off topic: Asimov's Three Laws of Robotics are:

1. A robot cannot harm a human, nor, through inaction, allow a human to come to harm.
2. A robot must obey all orders from a human, except when doing so would violate the First Law.
3. A robot must protect its own existence, except when doing so would conflict with the First or Second Laws.

Many of Asimov's robot stories revolve around apparent violations of the laws, which he always resolves somehow. Seems to me this robot violated the Third Law.

However, it has long bothered me that the Second and Third Laws weren't reversed. It seems that if a human said to a robot "Drop dead", the robot would be forced to cease functioning, since robots are quite literal-minded. Has anyone else thought about this?

One of the things that bothered me about the stories that have been made into movies (Bicentennial Man; I, Robot) is that there are actual violations of the laws.
Posted by: Rambler in Virginia 2013-11-14 17:34 |
#9 01000100 01010010 01001001 01001110 01001011 00100000 01001101 01001111 01010010 01000101 00100000 01001111 01010110 01000001 01001100 01010100 01001001 01001110 01000101 |
Posted by: swksvolFF 2013-11-14 17:25 |
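[Editorial note: comment #9 is plain 8-bit ASCII, one character per space-separated byte. A minimal Python sketch to decode it, for readers who don't read binary at a glance:]

```python
# Each space-separated group is one 8-bit ASCII code point.
bits = ("01000100 01010010 01001001 01001110 01001011 00100000 "
        "01001101 01001111 01010010 01000101 00100000 "
        "01001111 01010110 01000001 01001100 01010100 01001001 "
        "01001110 01000101")

# Parse each group as a base-2 integer and map it to its character.
message = "".join(chr(int(b, 2)) for b in bits.split())
print(message)  # DRINK MORE OVALTINE
```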
#8 Damn Pappy, now that is arctic. |
Posted by: Shipman 2013-11-14 16:47 |
#7 Robots Cannot commit suicide. They also are quite literal-minded. Sound familiar? |
Posted by: Pappy 2013-11-14 14:57 |
#6 Doing the jobs that Americans won't. |
Posted by: Ebbang Uluque6305 2013-11-14 11:30 |
#5 The Grizzly Remains at the Suicide Site |
Posted by: Glereng Lover of the Slytherins2881 2013-11-14 10:51 |
#4 Depends on who wrote the code, RJ. Maybe some programmer or hacker wrote some malicious code. [bwahahahahaha] |
Posted by: Alaska Paul 2013-11-14 10:48 |
#3 Crap, interesting but still crap, Robots Cannot commit suicide. |
Posted by: Redneck Jim 2013-11-14 10:32 |
#2 Who would have thought--a self-destructive Luddite robot. One more job for humans in this world. |
Posted by: JohnQC 2013-11-14 10:28 |
#1 The choice of fire makes this a political statement; if it were depression, it would have gone the 220V route.
Posted by: Shipman 2013-11-14 05:56 |