Imagine this scenario: You’re driving down the road when suddenly, out of nowhere, a lone pedestrian jumps into your path. The road is narrow and walled in by a barrier on either side, so your options are limited. You can either keep driving straight ahead and hit the pedestrian, or you can save the pedestrian by swerving into one of the barriers. What do you choose?
Now imagine a scenario in which it’s five pedestrians in the road instead of one. Now 10. Does your answer change?
In the real world, rare though such scenarios might be, the driver would likely be forced to make a split-second decision about how to behave -- a choice that he or she probably wouldn’t even have time to fully think through.
But coming up with responses to these situations becomes a little more pressing when we consider the possibility that self-driving vehicles might one day be making these decisions for us -- which means engineers must figure out the best and most ethical ways to program them for such situations ahead of time.
To aid in that process, new research -- published Thursday in the journal Science -- has tackled the difficult question of how the public feels driverless cars should behave when faced with such an ethical quandary, and how a vehicle’s programming might affect people’s willingness to ride in one at all.
By analyzing a series of surveys presenting a number of difficult situations, similar to the ones above, the researchers found that most people believe utilitarian vehicles -- cars programmed to save the greatest number of people, even if it means sacrificing the passengers -- are the most ethical.
Whether they’d personally purchase such a vehicle, however, is a different matter.
The rise of driverless cars
Autonomous vehicles -- cars programmed to drive their passengers around without human control -- were once an idea straight out of science fiction. Now, they’re slowly becoming a reality.
Google is already testing prototypes of its self-driving cars on public streets in a handful of U.S. cities, and other companies, such as Tesla, are working on their own versions of autonomous technology. Within a few years to a few decades, it’s possible that such vehicles could go into mass production and become a fixture on the roads.
And many experts believe this would be a good thing.
In the U.S. alone, tens of thousands of people are killed each year in car accidents, according to the Department of Transportation, and several million are injured. A majority of these accidents are chalked up to human error, which scientists suggest could be avoided with driverless vehicles programmed to follow the road rules. Some research has suggested that up to 90 percent of traffic accidents could be avoided with the use of driverless cars.
But the fact that self-driving vehicles must be programmed ahead of time to respond to any given situation presents a kind of quandary for the industry. There are certain ethical questions -- such as who the vehicle should sacrifice in a situation where someone inevitably must be injured -- that must be answered before these cars hit the road.
“The people who are actually designing these cars -- they recognize this as a significant problem,” said Jason Millar, chief ethics analyst at the Open Roboethics initiative (ORi) and a postdoctoral research fellow at the University of Ottawa Faculty of Law. Millar was not involved with the new study, but has conducted similar research in the past.
A tragedy of the commons
The new study relied on a series of surveys that presented a number of difficult ethical scenarios.
In one, the car must either swerve off the road and kill its passenger or save its passenger and kill 10 pedestrians. In another, the car must either kill one pedestrian or two pedestrians, although the passenger remains unharmed in either case. And in another, the car must either kill 20 pedestrians in the road, or swerve and kill both the passenger and the passenger’s child. The study included six of these types of scenarios in all.
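To make the distinction concrete, here is a minimal, purely illustrative sketch -- not taken from the study -- of how a utilitarian rule and a passenger-protective rule would rank the same set of maneuvers. The Option fields, the casualty numbers and the two scoring functions are assumptions chosen only to show the shape of the trade-off.

    from dataclasses import dataclass

    @dataclass
    class Option:
        """One possible maneuver and its projected casualties."""
        name: str
        pedestrian_deaths: int
        passenger_deaths: int

    def utilitarian_choice(options):
        # Minimize total deaths, no matter whose they are.
        return min(options, key=lambda o: o.pedestrian_deaths + o.passenger_deaths)

    def passenger_protective_choice(options):
        # Protect the passengers first; only then minimize pedestrian deaths.
        return min(options, key=lambda o: (o.passenger_deaths, o.pedestrian_deaths))

    scenario = [
        Option("stay on course", pedestrian_deaths=10, passenger_deaths=0),
        Option("swerve into barrier", pedestrian_deaths=0, passenger_deaths=1),
    ]

    print(utilitarian_choice(scenario).name)           # swerve into barrier
    print(passenger_protective_choice(scenario).name)  # stay on course

Which of these two rules ships in the car is precisely the choice the survey asked people to weigh in on.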
“What we found was a large majority of people strongly feel that the car should sacrifice the passenger for the greater good,” said Jean-François Bonnefon of the Toulouse School of Economics, the new study’s lead author, during a Wednesday teleconference.
Overall, the respondents tended to feel that sacrificing few for the sake of many -- even in cases when the passenger must be sacrificed -- was most moral.
For instance, in the situation where either the lone passenger or 10 pedestrians must be sacrificed, 76 percent of participants thought it would be most moral to sacrifice the passenger. Throughout the various scenarios, moral approval of this system tended to be higher the more people were saved.
Adding family members and children to the mix did complicate the results a bit.
Participants’ belief in the morality of sacrificing the passengers -- even in order to save a greater number of pedestrians -- decreased when other people were in the car. However, even in these situations, more than half the participants still felt that this was the most moral choice.
The participants’ feelings shifted, however, when it came to who should regulate these decisions and whether they would purchase such a car themselves.
Participants were generally wary of the idea of the government mandating utilitarian vehicles -- vehicles programmed to save the greatest number of people, even at the expense of the passengers. They were much less likely to say they would purchase a self-driving car under that kind of regulation, and more likely to say they would buy a vehicle programmed to protect its passengers at all costs.
The researchers likened these responses to the “tragedy of the commons” -- an economic theory suggesting that, when shared resources are at stake, individuals will act in their own self-interest instead of taking the common good into account, thereby depleting the resource and causing harm to everyone.
“Even if you started off as one of the noble people who are willing to buy a self-sacrificing car, once you realize that most people are buying self-protecting ones, then you are going to reconsider when you’re putting yourself at risk to shoulder the burden of the collective when no one else will,” said Iyad Rahwan of the Massachusetts Institute of Technology, another co-author.
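One way to see the incentive structure Rahwan describes is with a toy payoff model; the numbers below are invented purely to illustrate the dilemma and do not come from the study.

    # Each person bears risk as the passenger of their own car and as a
    # pedestrian exposed to everyone else's cars. All figures are made up.
    RISK_TO_OWN_PASSENGER = {"utilitarian": 0.010, "self_protective": 0.002}
    RISK_TO_EACH_PEDESTRIAN = {"utilitarian": 0.001, "self_protective": 0.004}

    def personal_risk(my_choice, others_choices):
        passenger_risk = RISK_TO_OWN_PASSENGER[my_choice]
        pedestrian_risk = sum(RISK_TO_EACH_PEDESTRIAN[c] for c in others_choices)
        return passenger_risk + pedestrian_risk

    nine_noble_neighbors = ["utilitarian"] * 9

    # If everyone buys a utilitarian car, my total risk is low...
    print(round(personal_risk("utilitarian", nine_noble_neighbors), 3))        # 0.019
    # ...but I am personally even safer if I alone defect to a self-protective car...
    print(round(personal_risk("self_protective", nine_noble_neighbors), 3))    # 0.011
    # ...and once everyone reasons that way, everyone ends up worse off.
    print(round(personal_risk("self_protective", ["self_protective"] * 9), 3)) # 0.038

With these made-up numbers, the individually rational choice (a self-protective car) leaves every driver facing roughly twice the risk of the cooperative outcome -- the signature of a tragedy of the commons.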
According to Millar, however, even this seemingly wishy-washy behavior is not surprising.
“This is stuff we know from centuries of work in ethics and also decades of work in moral psychology,” he said. “It doesn’t really tell us anything we didn’t already know or couldn’t easily predict about people’s attitudes toward driverless cars.”
In fact, the study is not the first to investigate such questions about driverless cars. Millar and his colleagues at the Open Roboethics initiative have conducted surveys posing similar scenarios (and garnering similar results from participants).
Driving into the future
Even if the study does not reveal any particularly new insights into the human psyche, it does raise some interesting questions about the future of the technology and how it should be regulated.
There are several potential options. Manufacturers could be in charge of deciding how their own cars should be programmed; the government could create a set of regulations that would apply to all vehicles; or consumers could be permitted to create the settings in their own vehicles.
Allowing manufacturers to program the cars at will presents a kind of ethical quandary itself, Millar noted.
“It’s just not the normal scope of engineering expertise,” he said. “They’re not ethicists. There are all these interesting features that come with those types of problems, those crash problems, that don’t submit easily to engineering expertise where you’re trying to crunch numbers.”
He added that design constraints would likely also make it very difficult to let users tweak the programming in their own vehicles -- meaning that government regulation will likely be the most logical choice in the future.
That said, the researchers noted in the paper that government regulation -- particularly if it goes the utilitarian route -- may hinder the public’s willingness to adopt the technology at all. And that would also be a problem, given that driverless cars are expected to save so many lives (all other ethical questions aside) simply by eliminating human error from the driving experience.
However, some experts aren’t so sure that this will become an issue.
“What people say they'll do in a survey and what they'd actually do may be two different things,” said Patrick Lin, director of the Ethics + Emerging Sciences Group at California Polytechnic State University, by email. Lin has also done extensive work on issues related to the ethics of driverless cars.
“Airbags, for instance, hurt and kill some drivers today,” he continued. “This is to say that, under some circumstances, your airbag may kill you. But I don't know anyone who refuses to drive a car because of that possibility. The same may apply to automated cars that may make decisions for the greater good, but which harm their passengers.”
In any case, the researchers acknowledged that there are a host of other complexities to consider when making these kinds of programming decisions, beyond the scope of the situations presented in their paper.
If pedestrians are behaving illegally, for instance, should that affect the vehicle’s behavior? If there’s a greater likelihood of one person surviving an accident than another, even though both would be put at risk by a crash, should that be taken into account? Who -- if anyone -- should be held liable for any harm that comes from the behavior of a driverless vehicle?
So there are many questions left to be answered, and they’ll likely only become more complicated as the technology advances and gets closer to commercial applications.
“Driverless cars have the potential to revolutionize transport and eliminate the majority of deaths on the road,” Rahwan said. “But as we work on making the technology safer, we need to recognize the psychological and social challenges they pose, too.”