Waymo, the self-driving taxi company backed by Alphabet Inc., has issued a voluntary recall after multiple instances in which its robotaxis illegally passed stopped school buses in Texas. The incidents, which were captured on video, show the vehicles driving past stopped buses in Austin without yielding.
No injuries were reported in these incidents, but the schools involved have asked Waymo to pause operations nearby during pick-up and drop-off periods, citing safety concerns. The company has since issued the recall, stating that all affected vehicles received a software update by November 17.
However, tech analyst Ahmed Banafa is not convinced that the issue has been fully resolved. "The problem with autonomous cars is trust," he said. "Should I take this car and consider its history? Now it's up to the software to ensure safety." Banafa also questioned whether Waymo is transparent enough about its testing processes, suggesting that third-party oversight could help build trust.
Waymo maintains that its robotaxis are safe and that all affected vehicles have received the required software update. Banafa remains skeptical, however: while the technology has made significant progress, he says, there is still room for improvement.
The National Highway Traffic Safety Administration (NHTSA) has asked Waymo to answer a series of questions about the Texas incidents. Meanwhile, passengers who have ridden in Waymo's robotaxis report mixed reactions: some expressed confidence in the technology's safety record, while others said it still has a way to go.
As the NHTSA continues its investigation, Waymo has committed to continuing its operations on public roads. The company maintains that its autonomous vehicles are safer than human drivers when it comes to injury crashes. While the recall may offer some reassurance, experts will continue to scrutinize the safety record of self-driving taxis to ensure they meet the highest standards of trust and accountability.