Uber’s Autonomous Vehicle Crash Shakes Industry

Last week’s accident between an autonomous test vehicle and a pedestrian has shaken the automotive industry. Uber, the ride-sharing company, has all but halted further testing of its fleet of autonomous vehicles, which are manufactured by Volvo, until further notice. The crash involved a Volvo XC90 autonomous vehicle, occupied by a human backup driver, and a pedestrian. While full details of the accident have not been released, preliminary analysis of the available evidence suggests that the pedestrian entered the path of the XC90 without leaving sufficient time for the vehicle’s driving systems to avoid the collision. The backup driver likewise did not have enough time to react and take over from the driverless system before the vehicle struck the pedestrian. This is thought to be the first fatal accident involving an autonomous vehicle since testing began, including tests undertaken by other companies such as Google.

Shortly after Uber suspended its autonomous vehicle testing, Toyota announced that it would also suspend all autonomous vehicle testing until further notice. In a statement, Toyota said the fatality has provoked an emotional response among its backup test drivers and has shaken confidence that autonomous vehicles can effectively prevent accidents in everyday scenarios. A similar response has reverberated throughout the automotive industry, with most companies concerned about the backlash from the accident and the possibility that the perception of autonomous vehicles as completely safe is now gone.

Autonomous vehicle testing relies on sensors placed around the vehicle that “see” the surrounding environment in an attempt to perform the task of driving as well as a human, or better. That sensor system must correctly identify the surroundings in an emergency and respond appropriately by steering the vehicle away from the hazard. It may be difficult for the public to regain trust in such systems now that they have been shown to be fallible.

-taken from www.sae.org
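The article describes that sense-and-respond chain only at a high level. Purely as an illustration of the general idea, and not of Uber’s or Volvo’s actual software, the sketch below shows how a simple time-to-collision check might decide whether an emergency stop is needed; every name, type, and threshold in it is a hypothetical assumption.

```python
# Illustrative sketch only: a minimal, hypothetical emergency-braking check of
# the kind a sensor-driven driving system might perform. Names and thresholds
# are invented for illustration and do not reflect any production system.

from dataclasses import dataclass


@dataclass
class Detection:
    """A single obstacle reported by the vehicle's sensors (hypothetical)."""
    distance_m: float         # distance ahead of the vehicle, in meters
    closing_speed_mps: float  # how fast the gap is shrinking, in meters/second


def time_to_collision(d: Detection) -> float:
    """Seconds until impact if neither party changes course."""
    if d.closing_speed_mps <= 0:
        return float("inf")  # gap is not shrinking; no collision expected
    return d.distance_m / d.closing_speed_mps


def should_emergency_brake(detections: list[Detection],
                           reaction_budget_s: float = 1.5) -> bool:
    """Brake if any detected obstacle would be reached before the system
    (or a human backup driver) could plausibly react."""
    return any(time_to_collision(d) < reaction_budget_s for d in detections)


if __name__ == "__main__":
    # A pedestrian detected 10 m ahead while closing at 12 m/s leaves well
    # under a second to react, so this sketch decides to brake immediately.
    sensed = [Detection(distance_m=10.0, closing_speed_mps=12.0)]
    print(should_emergency_brake(sensed))  # True
```

The point of the sketch is simply that the whole chain depends on the detection arriving early enough: if the obstacle is first perceived inside the reaction budget, as the preliminary analysis suggests happened here, no downstream logic can avoid the collision.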
