
Kayla Fadden

Summary of All Watched Over by Machines of Loving Grace

As more jobs that were once performed by humans, or not performed at all, are now accomplished by machines and their algorithms, more ethical concerns arise as well. One of the oldest ethical issues, originally posed by Philippa Foot, concerns the doctrine of double effect, which asks whether it is acceptable to commit a harmful act for the sake of a good result; do the ends justify the means? If an algorithm were to evaluate this classic ethical problem and decide on a best course of action, it would not make the morally acceptable choice, because algorithms are essentially thoughtless. They model certain decision flows, but once they run, no further thought occurs. Where humans can make the distinction that killing one to save five is not an ethically sound decision, algorithms have no moral center; they have no sense of right or wrong, and they cannot take responsibility for the consequences of their actions. As machines take on more sophisticated roles in society, how can the creators of such algorithms instill a sense of thoughtfulness in something that is entirely thoughtless?
Algorithms may not have the ability to think, but someone must still be held responsible for the actions of these machines. Devices are being built to perform jobs that make human life easier, yet there is a lack of awareness of and consideration for the failure modes, the edge cases, and the worst-case scenarios. Creating an algorithm to handle a problem's best-case scenario is simple, but anticipating every potential ethical conundrum a machine may encounter, and programming a solution for each, is nearly impossible. Just because ethical humans create these machines does not mean the machines should be so innately trusted to make morally just choices. There have been cases in which a bot purchased cigarettes, counterfeit branded clothing, master keys, and drugs, most of them illegal purchases. Even Google, a widely known company, has been sued for defamation because its page-ranking algorithm associated Albert Yeung's name with criminal gangs. Though no one at Google said anything that could be construed as hurtful to Yeung's reputation, the company is responsible for the actions of its algorithms. As machines become more advanced and their algorithms serve more essential roles in everyday life, the creators' responsibility to program a sense of morality increases drastically.
Humans have been imagining a world of machines since long before the creation of the computer or the Internet. Richard Brautigan wrote a poem in 1967 describing a "cybernetic ecology" in which humans have machines to watch over them and work for them. Brautigan's poem accurately predicted life today, where jobs are constantly surrendered to cheaper, faster, and more efficient machines; but some situations require more than the unmerciful, black-and-white decision making that machines are programmed to perform. In the field of computing, it is easy to make choices that lack the compassion and emotion humans possess, because algorithms make decisions strictly on the basis of rules and laws. Software professionals must keep a sense of compassion and morality while programming as we slowly, inevitably, and irreversibly surrender to these machines of our own creation.
Dr. P Drexel
9/6/16

