Tech geeks are asking questions about the first ever Google car crash over the weekend, which occurred when the company's self-driving car rear-ended another vehicle near Google's Mountain View, California headquarters.
Built on an array of lasers and imaging systems, Google's self-driving car has logged 160,000 miles since hitting the streets, and one of the biggest questions surrounding it has been responsibility.
Much like with this incident, everyone wants to know: who, ultimately, is responsible for a self-driving car crash?
In a statement released to the press, Google explained that all of its self-driving vehicles carry a human driver, but that ultimately the software is driving the car.
However, Jalopnik reports that the company has also said that the weekend's car crash was the result of human error, as the driver had taken control of the vehicle prior to the incident.
While this points to driver responsibility for the accident, PC Magazine points out that no one knows if the human stepped in at the last minute to avoid an accident the software was going to cause.
If this is the case, should the driver be legally responsible for the Google car crash?
And what about the software itself being a distraction? Jalopnik rightly points out that a driver in a self-driving car is, by design, not paying full attention to the road.
Though Google can't be expected to take responsibility for every crash that results from its software, can we say the same for drivers, who could be said to have assumed that responsibility?
These are all pertinent questions that need answers, and hopefully as a result of the Google car crash, legislators will start taking them seriously.