
The Turing Test

Computing Machinery and Intelligence
Alan Turing
Some Theories of Mind

Dualism
Substance Dualism: mind and body are different substances.
Mind is unextended and not subject to physical laws.
Interactionism: mind and body interact
Occasionalism/Parallelism: mind and body don't interact
Property/Event Dualism
Epiphenomenalism: physical events cause mental events but mental events don't cause anything
Property Dualism: (some) mental states are irreducibly non-physical attributes of physical substances
Some Theories of Mind

Physicalism: mental states are identical to physical states, in particular brain states, or, minimally, supervene upon physical states.
("Analytical" or "Logical") Behaviorism: talk about mental states should be analyzed as talk about behavior and behavioral dispositions
The Identity Theory (Type-Physicalism): mental states are identical to (so nothing more than) brain states
Functionalism: mental states are to be characterized in terms of their causal relations to sensory inputs, behavioral outputs and other mental states, that is, in terms of their functional role.
Dualism(s)

Pro
Qualia
Irreducibility of psychology
The Zombie Argument
The Cartesian Essentialist Argument
Con
Causal closure of the physical
Simplicity
Descartes
2 Arguments for Dualism

Essentialist Argument
It is conceivable that one's mind might exist without one's body
Whatever is conceivable is logically possible
Therefore, it is possible that one's mind might exist without one's body

Empirical Argument
The complexity and flexibility of human behavior, including linguistic behavior, couldn't be achieved by mere mechanism, so we need to assume some non-physical substance as an explanation for such behavior.
The Zombie Argument

A (philosophical) zombie is a being which is a perfect duplicate of a normal human being, including brain and neural activity, but which is not conscious.
The Zombie Argument for property dualism:
Zombies are conceivable (David Chalmers singing the "Zombie Blues")
Whatever is conceivable is logically possible
Therefore (some) mental states/properties/events are not identical to any brain states/properties/events
Note: this argument doesn't purport to establish substance dualism or, as Descartes wished to show, that minds/persons could exist in a disembodied state.
Problems with Cartesian Dualism

"We do not need that hypothesis": complex behavior can be explained without recourse to irreducibly non-physical states.
Contra Descartes, purely physical mechanisms can exhibit the kind of complex, flexible behavior, including learning (or "learning"), characteristic of humans.
All physical events have sufficient causes that are themselves physical events.
Physicalism is an aggressor hypothesis: we explain more and more without recourse to non-physical events/states.
Agency explanations are eliminated in favor of mechanistic explanations, including explanations for agency itself.
Claims to the effect that non-physical events cause physical events introduce an even bigger mystery: what is the mechanism?
Epiphenomenalism

Motivation for Epiphenomenalism
All physical events have sufficient causes that are themselves physical events.
But some mental events (qualitative states, the what-it-is-like experience) seem to be irreducibly nonphysical: it seems implausible to identify them with brain events.
Problem: intuitively, some mental states cause behavior.
E.g. pain causes people to wince.
Moreover, part of what we mean by "pain" seems to involve an association* with characteristic behavior.
*We'll leave "association" intentionally vague.
(Philosophical) Behaviorism

Motivation
We want to hold that there are no irreducibly non-physical causes of physical events.
But we also need to accommodate the fact that what we mean by terms designating mental states involves an association with characteristic behavior.
Problems
Intuitively, there's more to some mental states: the problem of qualia.
Intuitively, there can be less to mental states: it's conceivable that one may be in a given state without even being disposed to characteristic behavior, or that one may be disposed to uncharacteristic behavior.
Dispositions aren't causes, so while behaviorism associates mental states with behavior, the states still don't cause behavior.
The Identity Theory

Motivation
We want to hold that there are no irreducibly non-physical causes of physical events.
But we also want to understand mental states as "inner states" that are causally responsible for behavior.
Problems
Qualia again: intuitively there is more to consciousness than brain states.
Species chauvinism: if we identify a type of mental state, e.g. pain, with a type of brain state that is responsible for pain in humans, e.g. the firing of C-fibers, what do we do about non-humans who don't have the same kind of brain states but who, we believe, can nevertheless have the same kind of mental states?
What a theory of mind should do

Make sense of consciousness: "The Hard Problem"
Avoid commitment to irreducibly non-physical states, events or substances
Explain the causal role of mental states as
Effects of physical events
Causes of behavior
Causes of other mental events
Allow for multiple realizability in order to avoid species chauvinism
We want to be able to ascribe the same kinds of mental states we have to humans who may be wired differently, to other animals and, possibly, to beings that don't have brains at all, e.g. Martians, computers.
Functionalism

What makes something a mental state of a particular type does not depend on its internal constitution, but rather on the way it functions, or the role it plays, in the system of which it is a part.
Note: "function" here is related also to "function" in the mathematical sense.
Topic Neutrality: mental state concepts don't specify their intrinsic character, whether physical or non-physical; that's a matter for empirical investigation.
So Functionalism is in principle compatible with both physicalism and dualism.
Multiple Realizability: a single mental kind (property, state, event) can be "realized" by many distinct physical kinds.
The same type of mental state could, in principle, be "realized" by different physical (or non-physical) states.
There is disagreement about how "liberal" we should be in this regard.
An Example: Pain

We're interested in analyzing our ordinary concept of pain.
We understand it in terms of its causal role:
As being typically produced by certain stimuli, e.g. bodily injury
As tending to produce certain behavior, e.g. wincing
As producing further mental states, e.g. resolving to avoid those stimuli in the future
We recognize that different kinds of physical (or non-physical) mechanisms may play that role.
Compare to other functional concepts like "can opener."
We leave empirical questions to empirical investigation.
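The causal-role analysis of pain can be sketched as a toy transition function. Everything here (the state names, the stimuli, the function itself) is an illustrative assumption, not a serious psychological model; the point is only that the characterization mentions inputs, outputs, and other states, and never mentions the underlying hardware.

```python
# Toy sketch of pain's functional role: the state is individuated by what
# typically causes it, what behavior it tends to produce, and which further
# mental states it produces. All names here are illustrative assumptions.

def step(state, stimulus):
    """Map (current mental state, sensory input) to (new state, behavior)."""
    if stimulus == "bodily injury":
        return "pain", "wince"                    # typical cause -> typical behavior
    if state == "pain":
        return "resolve to avoid stimulus", None  # pain causes further mental states
    return state, None

# Any system with this same input/state/output profile (neurons, silicon,
# Martian hydraulics) counts, on this view, as realizing the same state.
state, behavior = step("neutral", "bodily injury")
```

On the functionalist reading, nothing in the table above commits us to any particular realizer, which is exactly the "topic neutrality" claimed earlier.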
The Big Questions About Functionalism

Consciousness: some mental states appear to have intrinsic, introspectable features, and those features seem to be essential.
Inverted Qualia (see Block, "Inverted Earth")
Zombies
The Knowledge Argument (see Jackson, "What Mary Didn't Know")
Understanding: controversial whether understanding can be reduced to the ability to mediate input and output by manipulating symbols (see Turing, "Computing Machinery and Intelligence" vs. Searle on the Chinese Room).
The Turing Test

Functionalism: mental states are to be characterized in terms of their causal relations to sensory inputs, behavioral outputs and other mental states, that is, in terms of their functional role.
A Turing Machine can do this!
So if Functionalism is true, a machine should in principle be able to do anything a person can do.
Can a machine do whatever a person can do?
And can it meet…
The Cartesian Challenge

If there were machines which bore a resemblance to our bodies and imitated our actions as closely as possible for all practical purposes, we should still have two very certain means of recognizing that they were not real men. The first is that they could never use words, or put together signs, as we do in order to declare our thoughts to others. For we can certainly conceive of a machine so constructed that it utters words, and even utters words that correspond to bodily actions causing a change in its organs. But it is not conceivable that such a machine should produce different arrangements of words so as to give an appropriately meaningful answer to whatever is said in its presence, as the dullest of men can do. Secondly, even though some machines might do some things as well as we do them, or perhaps even better, they would inevitably fail in others, which would reveal that they are acting not from understanding, but only from the disposition of their organs. For whereas reason is a universal instrument, which can be used in all kinds of situations, these organs need some particular disposition for each particular action; hence it is for all practical purposes impossible for a machine to have enough different organs to make it act in all the contingencies of life in the way in which our reason makes us act. (Descartes, Discourse on Method)
What can people do that computers can't do?

Telling Humans and Computers Apart Automatically
A CAPTCHA is a program that protects websites against bots by generating and grading tests that humans can pass but current computer programs cannot. For example, humans can read distorted text like the one shown below, but current computer programs can't.
The term CAPTCHA (for Completely Automated Public Turing test to tell Computers and Humans Apart) was coined in 2000 by Luis von Ahn, Manuel Blum, Nicholas Hopper and John Langford of Carnegie Mellon University.
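The generate-and-grade structure of a CAPTCHA fits in a few lines. A real CAPTCHA renders a distorted image; in this sketch the "distortion" is just noise characters wrapped around the answer, purely for illustration.

```python
# Minimal generate-and-grade sketch of a CAPTCHA. A real system renders a
# distorted image; the noise characters below merely stand in for that step.
import random
import string

def make_captcha(length=6):
    """Return (challenge shown to the user, answer kept by the server)."""
    answer = ''.join(random.choice(string.ascii_uppercase) for _ in range(length))
    noise = lambda: random.choice("~^*/|\\")
    challenge = noise().join(answer) + noise()   # e.g. "Q~W^E*R/T|Y\\"
    return challenge, answer

def grade(response, answer):
    """Pass if the user read the text correctly (case-insensitive)."""
    return response.strip().upper() == answer
```

The server keeps `answer` secret and shows only `challenge`; grading is a pure string comparison, which is what makes the test "completely automated."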
Empirical and Conceptual Questions

The Turing Test: Can a machine* meet the Cartesian challenge?
Use language in the way that humans do, rather than merely uttering sounds?
Exhibit the complexity and flexibility of behavior in a wide range of areas, as humans do?
What, if anything, of philosophic interest would it show if a machine could pass the Turing Test?
Is passing the test necessary for intelligence?
Is passing the test sufficient?
* What is a "machine"? Aren't our brains themselves machines?
Some Chatbots

Eliza
Alice
Suzette
Jack the Ripper
POMO generator
Poetry generator
Chatbot Collection
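Eliza-style chatbots work by shallow pattern matching, with no model of meaning at all. The rules below are invented for illustration and are far simpler than Weizenbaum's actual script, but the mechanism is the same: match a template, reflect the user's own words back, and fall through to a stock phrase.

```python
# An ELIZA-style responder: regex rules that echo the user's words back.
# The rule set is an illustrative stand-in for Weizenbaum's original script.
import re

RULES = [
    (r".*\bI am (.*)", "How long have you been {0}?"),
    (r".*\bI feel (.*)", "Why do you feel {0}?"),
    (r".*\bmy (.*)", "Tell me more about your {0}."),
    (r".*", "Please go on."),  # stock fallback when nothing else matches
]

def respond(utterance):
    for pattern, template in RULES:
        match = re.match(pattern, utterance, re.IGNORECASE)
        if match:
            return template.format(*match.groups())
```

That a few such rules can sustain a superficially plausible conversation is precisely why Eliza is a useful foil for the Turing Test: the I/O looks conversational while the inner workings are trivial.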
WFF

The Babbage Engine

ENIAC
Build your own Turing Machine!

A Turing machine is a theoretical computing machine invented by Alan Turing (1937) to serve as an idealized model for mathematical calculation. A Turing machine consists of a line of cells known as a "tape" that can be moved back and forth; an active element known as the "head" that possesses a property known as "state" and that can change the property known as "color" of the active cell underneath it; and a set of instructions for how the head should modify the active cell and move the tape (Wolfram 2002, pp. 78–81). At each step, the machine may modify the color of the active cell, change the state of the head, and then move the tape one unit to the left or right.
[read more in Wolfram MathWorld]
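The description above maps directly onto a few lines of code: a tape (here a dictionary of cells, blank by default), a head with a state, and a rule table from (state, color) pairs to (new color, new state, move) triples. The bit-flipping machine at the end is an invented example, not one of Turing's.

```python
# A minimal Turing machine per the description above: a tape, a head with a
# state, and a rule table (state, color) -> (write, next_state, move).
from collections import defaultdict

def run(rules, state, halt, tape, max_steps=10_000):
    cells = defaultdict(lambda: "_", enumerate(tape))  # "_" is the blank color
    head = 0
    for _ in range(max_steps):
        if state == halt:
            break
        write, state, move = rules[(state, cells[head])]
        cells[head] = write
        head += move                      # +1 moves right, -1 moves left
    return "".join(cells[i] for i in sorted(cells)).strip("_")

# Example machine (invented for illustration): flip every bit, halt on blank.
FLIP = {
    ("scan", "0"): ("1", "scan", +1),
    ("scan", "1"): ("0", "scan", +1),
    ("scan", "_"): ("_", "halt", 0),
}
```

For instance, `run(FLIP, "scan", "halt", "0110")` sweeps the head rightward, rewriting each cell, until it reads a blank and enters the halt state.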
A Turing Machine is an Abstract Machine

An abstract machine is a model of a computer system (considered either as hardware or software) constructed to allow a detailed and precise analysis of how the computer system works. Such a model usually consists of input, output, and operations that can be performed (the operation set), and so can be thought of as a processor. An abstract machine implemented in software is termed a virtual machine, and one implemented in hardware is called simply a "machine." (Wolfram MathWorld)

Turing Machine here: try it!
Another Turing Machine
A concrete Turing Machine

Different hardware, same abstract machine
Mental states are like computational states of computers.
The same computational (or mental) state can be realized by different hardware (or brainware!).
[Each of the different machines pictured: "We're in the same computational state!"]
The Imitation Game

Turing proposes a "game" in which we have a person, a machine, and an interrogator, separated from the other person and the machine.
The object of the game is for the interrogator to determine which of the other two is the person, and which is the machine.
"I believe that in about fifty years' time," Turing wrote in 1950, "it will be possible to programme computers…to make them play the imitation game so well that an average interrogator will not have more than 70 per cent chance of making the right identification after five minutes of questioning…I believe that at the end of the century the use of words and general educated opinion will have altered so much that one will be able to speak of machines thinking without expecting to be contradicted."
So far this hasn't happened, but there is a contest on:
The Empirical Question: Can a machine pass?

The Loebner Prize: In 1990 Hugh Loebner agreed with The Cambridge Center for Behavioral Studies to underwrite a contest designed to implement the Turing Test. Dr. Loebner pledged a Grand Prize of $100,000 and a Gold Medal (solid, not gold-plated!) for the first computer whose responses were indistinguishable from a human's.
The Conceptual (Philosophical) Question

"If the meaning of the words 'machine' and 'think' are to be found by examining how they are commonly used it is difficult to escape the conclusion that the meaning and the answer to the question, 'Can machines think?' is to be sought in a statistical survey such as a Gallup poll. But this is absurd. Instead of attempting such a definition I shall replace the question by another, which is closely related to it and is expressed in relatively unambiguous words."
How is the question of whether a machine could pass the Turing Test related to the question of whether a machine can think?
What would it show if a machine could pass the Turing Test?
Is being able to pass the Turing Test a necessary condition on intelligence?
Is being able to pass the Turing Test a sufficient condition on intelligence?
Behaviorism?

"The new problem has the advantage of drawing a fairly sharp line between the physical and the intellectual capacities of a man. No engineer or chemist claims to be able to produce a material which is indistinguishable from the human skin…but even supposing this invention available we should feel there was little point in trying to make a 'thinking machine' more human by dressing it up in such artificial flesh."
What matters for "intelligence," or whatever Turing is testing for?
Does "the right stuff" (brain-stuff, "spiritual substance," or whatever) matter?
Does the right internal structure or pattern of inner workings matter? If so, at what level of abstraction?
Does the right history, social role or interaction with the environment, beyond interrogation and response in the Turing Test, matter?
Objections Turing Considers

1. The Theological Objection
2. The "Heads in the Sand" Objection
3. The Mathematical Objection
4. The Argument from Consciousness
5. Arguments from Various Disabilities
6. Lady Lovelace's Objection
7. Argument from Continuity in the Nervous System
8. Argument from the Informality of Behavior
9. Argument from Extrasensory Perception
The Theological Objection

"Thinking is a function of man's immortal soul. God has given an immortal soul to every man and woman, but not to any other animal or to machines. Hence no animal or machine can think."
Turing's response: God could give a machine a soul if he wanted to.
Some questions:
Zombies: on this account it would be a contingent fact that intelligent computers (or humans) had souls; soulless zombies could perfectly simulate ensouled humans or machines.
Are souls, if there are such things, what matters for consciousness (vide Locke)?
The "Heads in the Sand" Objection

"The consequences of machines thinking would be too dreadful. Let us hope and believe that they cannot do so."
Turing notes that there's no real argument here.
Nevertheless, the prospect of intelligent machines raises a number of ethical questions…
The Mathematical Objection

"Gödel's theorem…shows that in any sufficiently powerful logical system statements can be formulated which can neither be proved nor disproved within the system."
Consequently there will be some questions that a machine (being essentially an automated formal system) cannot answer.
Turing notes, however, that there are questions that humans can't answer either, and it could be that, beyond this, we're bound by the same constraint that restricts the capacity of machines.
The Argument from Consciousness

"No mechanism could feel (and not merely artificially signal, an easy contrivance) pleasure at its successes, grief when its valves fuse, be warmed by flattery, be made miserable by its mistakes, be charmed by sex, be angry or depressed when it cannot get what it wants."
A machine that passed the Turing Test would, ipso facto, be able to give appropriate responses to questions about poetry, emotions, etc.
If we require more than the Turing Test as evidence of consciousness, then we have no good reason to believe that other humans are conscious.
But we do have good reason to believe that other humans are conscious.
Therefore the Turing Test would be evidence of consciousness in a machine if that machine could pass the test.
Arguments from Various Disabilities

"These arguments take the form, 'I grant you that you can make machines do all the things you have mentioned but you will never be able to make one to…be kind, be resourceful, be beautiful, be friendly, have initiative, have a sense of humor, tell right from wrong, make mistakes, fall in love, enjoy strawberries and cream, make someone fall in love with it, learn from experience, use words properly, be the subject of its own thought, have as much diversity of behavior as a man, do something really new.'"
It seems likely that we can construct machines that will be able to do a great many of these things, including learning and making mistakes.
But we should also ask whether various items on the list are requirements for intelligence or whether we're building in a species-chauvinistic requirement that would exclude intelligent beings that aren't like us humans.
Lady Lovelace's Objection

"The Analytical Engine has no pretensions to originate anything. It can do whatever we know how to order it to perform."
But computers can surprise us, and
People aren't all that original anyway.
Final Objections

Argument from Continuity of the Nervous System
Response: a digital machine can imitate an analogue machine.
Argument from the Informality of Behaviour
Response: no reason to think human behavior is any less rule-governed.
Argument from Extrasensory Perception
Taking ESP seriously, we could find ways to rule it out by putting competitors in a "telepathy-proof" room. Surely, even if ESP were a reality it wouldn't be any more of a requirement for intelligence than the ability to appreciate strawberries and cream.
Learning
In fact computers can, at least, "learn" and, unless we've established independently that they aren't intelligent, there is no reason to deny that this constitutes genuine learning.
Imitation and Replication

When is imitating X replication (i.e. another instance of X) rather than mere simulation?
When does the "right stuff" matter?
Margarine is only simulated butter, but
Walking with an artificial leg is real walking.
When do the right extrinsic features, e.g. the right history, matter?
Counterfeit money and art forgeries are fakes, but
A copy of a file or application is the real thing.
Are inputs/outputs all that matter?

Consider, for example, Ned Block's Blockhead: a creature that looks just like a human being, but that is controlled by a "game-of-life look-up tree," i.e. by a tree that contains a programmed response for every discriminable input at each stage in the creature's life. If we agree that Blockhead is logically possible, and if we agree that Blockhead is not intelligent (does not have a mind, does not think), then Blockhead is a counterexample to the claim that the Turing Test provides a logically sufficient condition for the ascription of intelligence. After all, Blockhead could be programmed with a look-up tree that produces responses identical with the ones that you would give over the entire course of your life (given the same inputs).
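Block's look-up tree is easy to sketch: a table keyed by the entire sequence of inputs so far, with a canned reply for each. The entries below are placeholders; a real Blockhead would need a branch for every discriminable input history of a whole lifetime, which is what makes it a thought experiment rather than an engineering proposal.

```python
# Blockhead as a look-up structure: the reply depends only on the exact
# input history. The entries are placeholders for an unimaginably vast tree.

LOOKUP_TREE = {
    ("How are you?",): "Fine, thanks. And you?",
    ("How are you?", "What is 2 + 2?"): "Four, of course.",
    # ...one entry for every possible input sequence over an entire life.
}

def blockhead(history):
    """Pure retrieval, with no computation over meanings: Block's point is
    that matching a human's outputs this way involves nothing like thought."""
    return LOOKUP_TREE.get(tuple(history), "Hmm, let me think about that.")
```

If the table were complete, `blockhead` would be input/output-indistinguishable from you, which is exactly why it pressures the claim that passing the test suffices for intelligence.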
Objections to the Turing Test as What Matters

Intentionality (The Chinese Room: Searle, "Minds, Brains and Programs")
You can't crank semantics out of syntax: mere symbol-manipulation, however adept, doesn't create meaning or understanding.
Consciousness (The Inverted Spectrum: Block, "Inverted Earth")
Neither behaviorism nor functionalism can capture the felt, intrinsic character of phenomenal mental states, e.g. "what it is like" to see red.
Semantic Externalism (Swampman: Davidson, "Knowing One's Own Mind")
What one's words mean, if they mean anything, is determined not merely by some internal state, but also by the causal history of the speaker and the role he plays within his environment.
Intentionality Objection

What does "CKApqrr" mean? According to the syntactic rules of the first game, "Shake-a-WFF," it's a WFF, but when I construct and manipulate WFFs I don't know what I'm doing.
Consciousness: the Inverted Qualia Objection

"[T]he inverted spectrum argument is this: when you and I have experiences that have the intentional content looking red, your qualitative content is the same as the qualitative content that I have when my experience has the intentional content of looking green."
We use color words in the same way, make the same inferences, and respond in the same way to the same stimuli, but it seems to be conceivable that our experiences are different in their intrinsic, qualitative character: "what it is like" for you to see red is different from "what it is like" for me. The Turing Test can't capture the "what it is like" feature of experience.
Semantic Externalism

Consciousness: The Zombie Problem

It seems conceivable that a being with NO qualia could pass the Turing Test. Do qualia matter? If so, for what?
