Real-world edge case as self-driving delivery bot has run-in with LA law
Video of a self-driving delivery robot entering a crime scene in Los Angeles has gone viral, prompting reasonable questions and hyperbolic headlines.
The unusual event, on 13 September, was captured in a 1m46s video by Twitter user “Film The Police LA”, receiving over 3k retweets and 21k likes:
He also posted it to YouTube:
“I wanna see this so badly,” says someone at the start. Near the end someone muses: “That’s gonna be the easiest way to bomb people, with a robot”.
NBC News ran the story under the headline “Skynet Fights Back: Food Delivery Robot Drives Through LA Crime Scene”, a reference to The Terminator films.
The incident itself – a suspected shooting – thankfully turned out to be a false alarm.
Was it self-driving?
However, the apportionment of blame is complicated by human intervention: a bystander lifted the police tape to let the robot proceed, and the company later claimed that a human operator was responsible.
On 17 September, Serve Robotics took to Twitter to clarify: “This week a Serve robot failed to reroute around a police barrier because of human error. While robots are capable of operating autonomously in most circumstances, they’re assigned to human supervisors to ensure their safe operation, for instance when navigating a blockage. We respect the important work of law enforcement and are taking steps to ensure our operating procedures are followed in the future.”
As with the Cruise robotaxi drive-off back in April – “Ain’t nobody in it!” the officer says – in America, autonomous vehicles are having real-world run-ins with the law.
It’s only a matter of time before similar incidents happen here in the UK.