In recent years, as computer technology has grown more advanced, many believe that the gap between a human's ability to "think" and a computer's is beginning to close. Supporters of this view often cite the famous "Turing Test" as grounds for believing that computers will one day have minds similar to our own. In this widely cited test, designed by Alan Turing to probe whether computers can "think", a human interrogator plays the "Imitation Game": within a specified amount of time, the interrogator tries to distinguish a human's replies to questions from a computer's. Turing argued that if the interrogator is unable to tell the computer from the human, then the computer has achieved the ability to "think".
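Turing's setup can be sketched as a simple text-only protocol. The following is a minimal illustration, not Turing's own formulation; the respondent and interrogator functions are hypothetical stand-ins:

```python
import random

def imitation_game(interrogator, respondent_a, respondent_b, questions):
    """Minimal sketch of the Imitation Game: one respondent is a human,
    the other a machine; the interrogator sees only text replies behind
    neutral labels and must guess which label hides the machine."""
    labels = {"X": respondent_a, "Y": respondent_b}
    transcript = []
    for q in questions:
        # Both respondents answer each question over the same text channel.
        transcript.append({label: labels[label](q) for label in labels})
    # The interrogator guesses "X" or "Y" from the transcript alone.
    return interrogator(transcript)

# Hypothetical stand-ins for illustration only.
human = lambda q: "I'd say: " + q.lower()
machine = lambda q: "I'd say: " + q.lower()  # replies indistinguishable from the human's
guess = lambda transcript: random.choice(["X", "Y"])

result = imitation_game(guess, human, machine, ["What is love?"])
```

On Turing's criterion, if the interrogator's guesses are no better than chance, the machine has passed.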
A question commonly raised when analyzing the methodology of the Turing Test is whether a computer's "thinking" can involve both syntax and semantics, and whether both are required in order to "think" at all. A high-profile argument that a computer could pass the Turing Test while having no understanding of the questions posed to it is John Searle's "Chinese Room" thought experiment.…