Drunk Man Assaults R2D2-Style Security Robot and Gets Arrested

Good thing he didn't mess with a pistol-packing Russian FEDOR robot



While on patrol in the parking lot of its manufacturer, a K5 security robot was assaulted by Jason Sylvain. Sylvain, who was apparently drunk, pushed over the 300-pound R2-D2-style robot made by the Knightscope company. Alerted by the robot, two Knightscope employees detained Sylvain until the local police showed up. The K5 robots operate autonomously and provide 360-degree video of the areas they patrol. The robots are equipped with two-way audio that allows intercom conversations between the security operations center and a person near the robot. In addition, the robots can broadcast pre-recorded and live audio messages in the areas where they are operating. They also monitor environmental conditions, keep track of license plates, and detect wireless activity. The company promises that the robots will soon offer a gun detection feature.

Going beyond mere gun detection, the Russian FEDOR humanoid robot can now shoot guns. As Futurism reports: FEDOR — short for Final Experimental Demonstration Object Research — is a humanoid robot developed by Android Technics and the Advanced Research Fund in Russia. "Final" does sound a bit ominous. In any case, the FEDOR bot can drive a car, use various tools (including keys), screw in lightbulbs, and even do pushups. It has now added the ability to hold and fire two pistols at the same time to its skill set.

Russia's Deputy Prime Minister Dmitry Rogozin tweeted, "Robot FEDOR showed the ability to shoot from both hands. Fine motor skills and decision-making algorithms are still being improved." Rogozin further explained, "Shooting exercises is a method of teaching the robot to set priorities and make instant decisions. We are creating AI, not Terminator." Very reassuring.

Enabling mall security robots to protect themselves by, say, administering a mild electric shock to someone like Sylvain is OK, but arming them with lethal weapons is, well, premature. On the other hand, I am against banning the development of warbots. See FEDOR in action below.


Editor's Note: We invite comments and request that they be civil and on-topic. We do not moderate or assume any responsibility for comments, which are owned by the readers who post them. Comments do not represent the views of Reason.com or Reason Foundation. We reserve the right to delete any comment for any reason at any time. Report abuses.

    1. I’ve seen this fight before.

  1. On the other hand, I am against banning the development of warbots.

    I’m sure the attendees of the Afghan wedding party that one of those inevitably cuts short will have other thoughts.

    Wasn’t there a hundred years ago some old Russian silent movie or play about a robot rebellion? You’d think the Russkies would know better.

    1. The Russkies can’t abide a Robocop gap.

  2. Stephens, however, told me: “He claimed to be an engineer that wanted to ‘test’ the security robots. I guess he now has his answer.”

    “claimed to be an engineer” hahahahahahahahahahahaha.

  3. In sharp contrast to the way doggy cops have ascended into blue sainthood, I don’t think these robot cops are going to find such a welcome acceptance into the brotherhood.

    1. Exactly. For every K-9, there has to be a K-9 handler. Robots might be able to just take all their jobs.

      The gods know robots couldn’t be any worse.

      1. Actually, with the increase in tickets given to female drivers who can no longer weasel out using their feminine wiles, it may help cities balance their budgets.

  4. decision-making algorithms are still being improved

    Pray the Chicago PD doesn’t have any input in developing those decision-making algorithms. “Kill them all and let God sort them out” is a pretty simple algorithm but probably not what you want to see in an unstoppable killing machine.

  5. The company promises that the robots will soon offer a gun detection feature.

    I’d like to see that.

    1. The polarimetric radar used by the team works by sending out a signal at a particular polarization, and carefully analyzing the polarization of the signal that bounces back. An irregular metal object can change the polarization of the signal, allowing for the detection of concealed items.

      Though the technology has not yet undergone any human testing, Sarabandi’s team has carried out a simulation using a mannequin painted with a coat that reflects radar-like human skin. The mannequin was placed on a turntable in an anechoic chamber, a room designed to absorb all echoes and reflections.

      The techniques could be used to scan large groups of people, with each subject taking less than a second to process. This would then allow security personnel to closely observe the individual in question or even take suspects aside for more comprehensive scans.

      I can’t see anything going wrong with this.

    2. Just watching this now. By their own admission it doesn’t “detect guns”, it detects “irregular objects” being carried by the person.

      1. The inventor is excited, because once the “irregular object” is detected, that person can be pulled aside and further interrogated. Safety!

        1. They’re gonna find sooo many dildoes.

          1. “Of course it’s policy to never imply ownership in the case of a dildo. It’s always the indefinite article ‘a’ dildo, never your dildo.”

          2. The Russian one’s sure. The American ones are gonna find dildoes right up until they identify/out their first trans person. Then we get to find out whether the language used to describe trans-stuff is a context-free grammar or not. Personally, I’m hoping it’s not and the robots are shamed as finite-state, cis-het shitlords like the rest of us.

    3. There was a documentary from the 80’s called Robocop that showed how it worked with an ED-209.

  6. Enabling mall security robots to protect themselves by, say, administering a mild electric shock to someone like Sylvain is OK, but arming them with lethal weapons is, well, premature

    Look, if self-driving cars can remain incident free, I don’t know why a lethal weapon can’t be thrown into the mix.

  7. We are creating AI, not Terminator.

    IOW, you’re creating Skynet, which will then create Terminators. That’s waaaaay better *rolls eyes* /sarc

  8. “Your move, creep”

