Donna | Should Self-Driving Cars Kill to Save its Passengers? @DonnaIRL | Uploaded December 2016 | Updated October 2024, 6 days ago.
With extensive research into self-driving cars, humans driving their own vehicles may become obsolete within the next decade or two. This shift brings many benefits, such as reduced traffic and fewer accidents caused by human error: no more drivers under the influence or behind the wheel while exhausted. With companies like Google and Tesla already releasing semi-autonomous vehicles, this future is very near.
Despite the enormous good autonomous vehicles will do for society, accidents will never disappear entirely. In fact, there has been at least one case where a semi-autonomous vehicle mistook a white vehicle for the sky; the self-driving car kept going and its driver was killed. Manufacturers can improve their systems so cars recognize obstructions better and protect their passengers, but what if saving the passengers requires the car to take another life?
Self-driving cars will eventually have to face moral dilemmas like this. Which lives should these cars prioritize? MIT has created a test in which humans decide just that. In today's video, my friend Josh and I take that test and discuss which lives autonomous vehicles should save, and why.
Take the test Here: http://moralmachine.mit.edu/
Subscribe to Josh: youtube.com/channel/UCV2y78o25KonQxXXWCBMwsA/featured
Last Video: youtube.com/watch?v=B9uvvXKQNVo
Subscribe: youtube.com/subscription_center?add_user=designingdonna
Twitter: twitter.com/designingdonna
Instagram: instagram.com/psychirl