Should a self-driving car kill the baby or the grandma? Depends on where you’re from.

The idea was to create a game-like platform that would crowdsource people’s decisions on how self-driving cars should prioritize lives in different variations of the “trolley problem.” In the process, the data generated would provide insight into the collective ethical priorities of different cultures. The researchers never predicted the experiment’s viral reception.

The Moral Machine took that idea to test nine different comparisons shown to polarize people: should a self-driving car prioritize humans over pets, passengers over pedestrians, more lives over fewer, women over men, young over old, fit over sickly, higher social status over lower, law-abiders over law-benders? And finally, should the car swerve (take action) or stay on course (inaction)?

Rather than pose one-to-one comparisons, however, the experiment presented participants with various combinations, such as whether a self-driving car should continue straight ahead to kill three elderly pedestrians or swerve into a barricade to kill three youthful passengers. The researchers found that countries’ preferences differ widely, but they also correlate highly with culture and economics.

The results showed that participants from individualistic cultures, like the UK and US, placed a stronger emphasis on sparing more lives given all the other choices, perhaps, in the authors’ view, because of the greater emphasis on the value of each individual. Countries within close proximity to one another also showed closer moral preferences, with three dominant clusters in the West, East, and South.

The study has interesting implications for countries currently testing self-driving cars, since these preferences could play a role in shaping the design and regulation of such vehicles.

“We used the trolley problem because it’s a very good way to collect this data, but we hope the discussion of ethics doesn’t stay within that theme,” said one of the researchers.
