Wednesday, 16 November 2016

Article Summary


CLIVE THOMPSON
Leave The Driving to Us
Machines can make decisions. That doesn’t mean they’re right.
            So you can’t wait for a self-driving car to take away the drudgery of driving? Me neither! But consider this scenario, recently posed by neuroscientist Gary Marcus: your car is on a narrow bridge when a school bus veers into your lane. Should your self-driving car plunge off the bridge, sacrificing your life to save those of the children? Obviously, you won’t make the call. You’ve ceded that decision to the car’s algorithms. You’d better hope you agree with its choice.
                This is a dramatic dilemma, to be sure, but it’s not a completely unusual one. The truth is, our tools increasingly guide and shape our behavior, or even make decisions on our behalf. A small but growing chorus of writers and scholars thinks we’re going too far. By taking human decision-making out of the equation, we’re slowly stripping away deliberation: the moments where we reflect on the morality of our actions.
                Not all of these situations are so life-and-death. Some are quite prosaic, like the welter of new gadgets that try to “nudge” us into better behavior. In his new book To Save Everything, Click Here, Evgeny Morozov casts a skeptical eye on this stuff. He tells me about a recent example he’s seen: a “smart fork” that monitors how much you’re eating and warns you if you’re overdoing it.
                Fun and useful, you might argue. But for Morozov, tools like the fork reduce your incentive to think about how you’re eating, and about the deeper political question of why today’s food ecosystem is so fattening. “Instead of regulating the food industry to make food healthier,” Morozov says, “we’re giving people smart forks.”
                Or as Evan Selinger, a philosopher at the Rochester Institute of Technology, puts it, tools that make hard things easy can make us less likely to tolerate things that are hard. Outsourcing our self-control to “digital willpower” has consequences: use Siri constantly to get instant information and you can erode your ability to be patient in the face of incomplete answers, a crucial civic virtue.
                Things get even dicier when society at large outsources its biggest moral decisions to technology. For example, some police departments have begun using PredPol, a system that mines crime data to predict future criminal activity, guiding police to areas they might otherwise overlook. It appears to work, cutting some crime by up to 27 percent. It lets chronically underfunded departments do more with less.

                But as Morozov points out, the algorithms could wind up amplifying flaws in existing law enforcement. For example, sexual violence is historically underreported, so it can’t as easily be predicted. Remove the deliberation over what police focus on and you can wind up deforming policing.
