In this file photo taken on September 13, 2016, a pilot model of an Uber self-driving car travels in Pittsburgh, Pennsylvania. Uber said on March 19, 2018, that it was cooperating with police following a deadly accident involving one of the ride-share company's self-driving cars in Arizona. (AFP/Angelo Merendino)
Ride-sharing giant Uber said Monday it is suspending use of self-driving cars after one of the vehicles struck and killed a pedestrian in the US state of Arizona.
The Uber vehicle was in autonomous mode, with an operator behind the wheel, when it hit a woman walking in the street in the city of Tempe late Sunday, according to the San Francisco-based company.
The victim was hospitalized and later died from her injuries.
"Our hearts go out to the victim's family," an Uber spokesperson told AFP. "We are fully cooperating with local authorities in their investigation of this incident."
Uber said it had temporarily halted its use of self-driving cars for testing or customer rides in Tempe, Pittsburgh, Toronto, and San Francisco.
Tempe is one of just two cities -- along with Pittsburgh -- where the ride-sharing firm has been using autonomous vehicles as part of its regular passenger service.
The vehicle operator in the driver's seat was the only person in the car when the accident occurred, Uber said. The car was in police hands on Monday.
Sunday's accident was the first fatal self-driving car crash involving a pedestrian.
The first deadly self-driving car accident was reported in mid-2016, and involved a Tesla.
The Tesla Model S, cruising on "Autopilot," failed to detect a crossing tractor-trailer against a bright sky, killing the driver -- who it later emerged had kept his hands off the wheel for extended periods of time despite automated warnings not to do so.
Investigators at the US National Transportation Safety Board determined the probable cause of the Tesla crash was the combination of "a truck driver's failure to yield the right of way and a car driver's inattention due to overreliance on vehicle automation."
Autonomous-vehicle technology has been touted as having the potential to save fuel, ease congestion, and save thousands of lives by avoiding accidents caused by human error.
As with the fatal Tesla crash, however, the deadly Uber accident is likely to stoke concerns that the industry is moving too fast.
Google-owned Waymo this month began using its self-driving trucks to haul cargo bound for the internet giant's data centers in Georgia, while rival Uber announced the use of self-driving semi trucks as part of an on-demand trucking service in Arizona.
In September, US Transportation Secretary Elaine Chao released new guidelines that permit more testing of self-driving cars.
But America's non-profit Consumer Watchdog has warned that roads are being turned "into private laboratories for robot cars with no regard for our safety."
The group on Monday called for a nationwide moratorium on testing self-driving cars on public roads while investigators figure out what went wrong in the Uber accident.
"Arizona has been the wild west of robot car testing with virtually no regulations in place," Watchdog technology project director John Simpson said in a statement.
"When there's no sheriff in town, people get killed."
Car vision tests?
US states set their own rules for roads, and a handful have passed laws allowing self-driving vehicles.
California and Arizona have been particularly encouraging, hoping that companies developing autonomous technology in those states will create local jobs and facilities devoted to a promising new industry.
Duke University robotics professor Missy Cummings is among those advocating slowing the introduction of autonomous vehicles to reduce risk and get proper regulations in place.
While machines are better at staying vigilant and reacting to routine situations, human drivers have proven superior at handling unusual or unexpected situations, according to the professor.
Cummings reasoned that if people must pass vision exams to get driver's licenses, self-driving cars should have to pass equivalent tests.
She noted a case in which stickers placed on a stop sign fooled autonomous-car sensors into misreading it as a speed-limit sign.
"If we are still learning at this rate, and still uncovering major problems, it begs the question of why we are trying to put this technology into widespread use," Cummings told AFP.
"I am a big fan of the technology, but it is very unproven and experimental."