Alan Turing’s Contribution to Mathematics
The first half of the twentieth century was marked by the two most destructive wars mankind has ever known. Warfare was not confined to the front lines: many scientists were drawn into the war effort on both sides. The life and work of Alan Turing is perhaps one of the clearest examples of how wartime research came to shape civilian life, everyday activities, and entire industries. All in all, Alan Turing made an unprecedented, real-world contribution to many fields of human knowledge, among them mathematics, logic, philosophy, cryptanalysis, linguistics, and computer and cognitive science.
Alan Mathison Turing was born in London on June 23, 1912, to the family of a civil servant. He received a proper education at one of the best private schools in England and enrolled at Cambridge University in 1931, with mathematics as his major. After graduating from Cambridge in 1934, Turing was elected a fellow of King’s College the following year. His engagement with the work of the American mathematical logician Alonzo Church began in 1936, with the publication of Turing’s landmark paper.
At this point, it is essential to point out that even though the questions Alan Turing raised pertain to different spheres of knowledge, the connections between his lines of research are largely self-evident. Turing’s life’s work began with the publication of one of his most fundamental papers, “On Computable Numbers, with an Application to the Entscheidungsproblem [Decision Problem],” in the second half of the 1930s. Turing and Church employed different methods in their respective works, yet both scholars arrived at the same conclusion; as it turned out, Turing’s approach had the more lasting impact on the field of computing. Later in 1936, Turing entered Princeton University to pursue a PhD in mathematical logic under Church’s supervision, and he graduated from Princeton in 1938.
The so-called Entscheidungsproblem asks whether there exists an effective decision method that can determine, for any mathematical statement, whether that statement is provable. In 1936, Turing and Church independently found that the Entscheidungsproblem has no solution: each proved, in his own way, that no consistent formal system of arithmetic admits an effective decision method. Here it is crucial to note that arithmetic systems are more expressive than purely logical ones; building on that result, the two logician-mathematicians showed that logical systems “have no effective decision method” either. Today, the very idea that mathematics in its entirety could be reduced to a methodology for carrying out computational procedures seems utopian. In the 1930s, however, the Church–Turing result revolutionized the way people had been thinking about the exact sciences for centuries.

The period of Turing’s collaboration with Church was also marked by Turing’s invention of the Turing machine. The Turing machine, above all else, gives insight into the limits of human computation. One of its guiding principles is that all human-computable functions coincide with what Church called the lambda-definable functions.
A Turing machine can be in one of a finite number of conditions (q1, q2, …, qR), also referred to as m-configurations. The machine has a tape divided into sections called squares, each of which may bear a symbol. The square currently under consideration is called the ‘scanned square,’ and its symbol the ‘scanned symbol’; the scanned symbol is the only one the machine is, so to speak, directly “aware of.” By altering its m-configuration, the machine can, in effect, remember some of the symbols it has previously scanned. “The possible behaviour of the machine at any moment is determined by the m-configuration qn and the scanned symbol S(r)”. The pair of qn and S(r), called the configuration, thus fully determines the machine’s behavior.
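The description above can be made concrete with a short simulation. The rule format and the example machine below are illustrative assumptions, not Turing’s original notation: each rule maps a configuration, that is, the pair of m-configuration and scanned symbol, to the symbol to write, the head movement, and the next m-configuration.

```python
# A minimal Turing machine sketch (an illustrative assumption, not
# Turing's original formalism). Each rule maps a configuration
# (m-configuration, scanned symbol) to (symbol to write, head move,
# next m-configuration), mirroring the description above.

def run_turing_machine(rules, tape, state="q1", halt="halt", max_steps=1000):
    """Simulate a one-tape Turing machine on a string of symbols."""
    tape = list(tape)
    head = 0
    for _ in range(max_steps):
        if state == halt:
            break
        scanned = tape[head]                      # the scanned symbol S(r)
        write, move, state = rules[(state, scanned)]
        tape[head] = write                        # overwrite the scanned square
        head += 1 if move == "R" else -1 if move == "L" else 0
        if head == len(tape):                     # extend the tape on demand
            tape.append("_")
    return "".join(tape)

# Hypothetical example machine: invert every bit, halt at the blank '_'.
invert = {
    ("q1", "0"): ("1", "R", "q1"),
    ("q1", "1"): ("0", "R", "q1"),
    ("q1", "_"): ("_", "N", "halt"),
}

print(run_turing_machine(invert, "1011_"))  # -> 0100_
```

Note how the machine’s next action depends only on the current state and the scanned symbol, exactly as the quoted passage requires.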
The lambda-definable functions are functions of positive integers whose values can be calculated by a process of repeated substitution. All in all, to a greater or lesser extent, Turing’s subsequent research remained connected to the Entscheidungsproblem, since the scholar continued to work with the notions surrounding it throughout his career.
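What lambda-definability and “repeated substitution” look like in practice can be sketched with Church numerals, here imitated with ordinary Python closures (an illustrative assumption; Turing and Church, of course, worked in the pure lambda calculus, not in a programming language):

```python
# Church numerals: the numeral n is the function that applies f to x
# exactly n times, so computation proceeds by repeated substitution.

zero = lambda f: lambda x: x
succ = lambda n: lambda f: lambda x: f(n(f)(x))   # n+1 applications of f
add  = lambda m: lambda n: lambda f: lambda x: m(f)(n(f)(x))

def to_int(n):
    """Decode a Church numeral by counting applications of f."""
    return n(lambda k: k + 1)(0)

two   = succ(succ(zero))
three = succ(two)
print(to_int(add(two)(three)))  # -> 5
```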
In 1938 Alan Turing returned to Great Britain to continue his fellowship at King’s College, and in the summer of that year he joined the Government Code and Cypher School. When the Second World War broke out in the autumn of 1939, the scholar moved to the organization’s headquarters at Bletchley Park, Buckinghamshire. At Bletchley, Turing worked as part of the team whose mission was to break the code of Enigma, the machine the German high command used to encrypt and transmit top-secret messages. Biographers, historians, researchers, and cryptanalysts are inclined to think that Turing’s code-breaking work saved millions of lives and substantially accelerated the end of the war. To appreciate this claim, one has to consider the following figures. Researchers estimate that by early 1942 the team of cryptanalysts at Bletchley Park was able to intercept and decode about 39,000 messages per month. The figures later grew dramatically: some 84,000 messages were deciphered every month, roughly two messages every minute, day and night.
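The cited rate can be checked with a line of arithmetic (assuming, for simplicity, a 30-day month):

```python
# Sanity check of the figures above: 84,000 messages per month against
# the number of minutes in an (assumed) 30-day month.
messages_per_month = 84_000
minutes_per_month = 30 * 24 * 60              # 43,200 minutes
rate = messages_per_month / minutes_per_month
print(round(rate, 2))                         # -> 1.94, i.e. roughly 2 per minute
```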
As the war came to an end in 1945, the National Physical Laboratory in London hired Turing to work on electronic computer design. Turing thus became the inventor of the Automatic Computing Engine, nowadays described as “the first complete specification of an electronic stored-program all-purpose digital computer”. Turing also made an important contribution to making precise the intuitive notion of ‘effective calculability’. Effective calculability is one of the guiding principles of the functioning of the Turing machine, and it can be viewed as the driving idea that occupied Turing from the publication of his thesis on the Entscheidungsproblem onward. At the same time, some scholars today question parts of Turing’s contribution to computability theory for real functions and real numbers. Scholars found that the lambda-definable functions coincide exactly with the Turing-computable functions. At the time the two notions were introduced, the scientific world was not yet in a position to appreciate what an equivalence of this kind represented. It later proved to be “the foundation of computability theory for functions of an integer variable”, a paradigm-altering premise and a groundbreaking discovery in itself, since it shows that “the different definitions studied in recursion theory are actually equivalent”. The Church–Turing thesis, in essence, extrapolates this equivalence to all integer functions: an integer function is considered calculable whenever any reasonable meaning can be ascribed to the claim that it is.
Assuming that the thesis is correct, the Turing machine can be characterized as a natural and intuitive model of computation.
There is no small number of scholars who suggest that computable real functions resist a single, definitive definition. Turing did give a definition of computable real functions, but that definition is no longer the one in use. On the one hand, this might seem to cast doubt on the lasting validity of some of Turing’s major works. On the other hand, many scholars support the following observation: “Many approaches to computable analysis which are based on Turing machines (such as the ‘Type-2 Theory of Effectivity’ and the different paradigms introduced by Markov, Banach and Mazur, Pour-El and Richards, and others) provide non-equivalent (although strictly related) definitions”. To counter the criticism of Turing’s approach to defining computable real functions, the following points have to be taken into account. In the physical world, calculations are performed mainly by individuals or by mechanical devices, so the operations performed in a computational process depend on several factors: individual attitudes, the chosen calculation methods, and the parameters of the selected computing machine. Real numbers, moreover, are generally thought of as ‘infinite’ objects. Hence some standardization of computing activity is needed before the very notion of computability can be defined, which is precisely what the Turing machine was designed for in the first place.
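One common modern way of standardizing what a computable real number is can be sketched as follows. The representation chosen here, a program that produces arbitrarily good rational approximations, is an illustrative assumption in the spirit of Type-2 effectivity, not Turing’s original definition.

```python
# A computable real modeled as a program that, given n, returns a
# rational within 2**-n of the true value (an illustrative convention).
from fractions import Fraction

def sqrt2(n):
    """Approximate sqrt(2) to within 2**-n by bisection on [1, 2]."""
    lo, hi = Fraction(1), Fraction(2)
    while hi - lo > Fraction(1, 2**n):
        mid = (lo + hi) / 2
        if mid * mid < 2:
            lo = mid        # sqrt(2) lies in the upper half
        else:
            hi = mid        # sqrt(2) lies in the lower half
    return lo

# The 'infinite' object sqrt(2) is accessed only through finite queries.
print(abs(float(sqrt2(10)) - 2**0.5) < 2**-10)  # -> True
```

The point of the sketch is that the infinite object is never handled directly: every query is answered by a finite computation, which is exactly the kind of standardization the paragraph above describes.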
The Central Limit Theorem is one of the cornerstones of both statistics and mathematical probability, the two fields being inseparably connected. Biographers and scholars maintain that Alan Turing developed a keen interest in the statistical aspects of any mathematical problem he considered. Turing worked out an early form of sequential analysis, which played an important role in his research; he anticipated the empirical Bayes method; and he used logarithms of Bayes factors as part of his own decryption methodology. These three achievements stand among Turing’s major contributions to statistics.
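The use of Bayes-factor logarithms can be sketched as a simple sequential procedure. The threshold and the stream of likelihood ratios below are hypothetical, chosen only to illustrate how evidence accumulates additively on a logarithmic scale; this is a sketch of the general idea, not of Turing’s actual decryption method.

```python
# Sequential accumulation of log10 Bayes factors: evidence from
# independent observations adds up, and a decision is reached once the
# running total crosses a (hypothetical) threshold.
import math

def sequential_log_bayes(likelihood_ratios, threshold=2.0):
    """Add log10 Bayes factors until the evidence threshold is crossed."""
    total = 0.0
    for i, lr in enumerate(likelihood_ratios, start=1):
        total += math.log10(lr)        # multiplying factors = adding logs
        if abs(total) >= threshold:
            return i, total            # decide after i observations
    return None, total                 # evidence still inconclusive

# Hypothetical stream of likelihood ratios favouring one hypothesis.
n, score = sequential_log_bayes([3.0, 4.0, 2.5, 5.0])
print(n, round(score, 2))  # -> 4 2.18
```

The advantage of the sequential form is that the procedure stops as soon as the accumulated evidence suffices, rather than after a fixed sample size.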
Alan Turing was one of those who had a profound, working understanding of numbers. Binary numbers are most commonly known as an essential element of communication, and, even more importantly, many scholars assert that virtually all information can be converted into binary code. Assuming the foregoing statement is correct, one cannot help but admit that its full implications are still difficult to grasp. Binary code suggests that, at the level of representation, there is no distinction between the things living beings observe at every moment of their lives: artistic images, moving pictures, and musical sounds are all, ultimately, sequences of the same two symbols. This is one of the premises to which the designers of artificial intelligence adhere.
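The claim that information reduces to binary code is easy to demonstrate for text; the choice of UTF-8 as the encoding is an assumption of this sketch.

```python
# Any string reduces to a sequence of bits once an encoding is fixed.

def to_binary(text):
    """Encode a string as the binary form of its UTF-8 bytes."""
    return " ".join(f"{byte:08b}" for byte in text.encode("utf-8"))

print(to_binary("AI"))  # -> 01000001 01001001
```

The same reduction applies, with different encodings, to images and sound, which is precisely the sense in which they become indistinguishable at the level of the code itself.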
Alan Turing is considered a father of modern cognitive science and artificial intelligence studies. Turing believed that the human brain is, in large part, a digital computing machine: he hypothesized that at birth the brain cortex is unorganized, and that knowledge and skills are acquired and developed in the process of training. Thus, “Turing proposed what subsequently became known as the Turing test as a criterion for whether an artificial computer is thinking”. Turing himself regarded the bare question of whether a computer can think as too imprecise to answer directly, which is precisely why he replaced it with a practical test; his own research supported the view that machines can, in the relevant sense, think. Artificial intelligence has since become a subject of heated discussion, and nowadays also a matter of ethics. As far as Turing is concerned, the scientist believed that artificial intelligence and machine mathematics are inseparably connected. Many scholars today draw a clear demarcation line between human and machine mathematics, and understanding the differences between the two has definite potential for the further development of science and technology. The Turing machine, in a way, can be viewed as one of mankind’s first attempts at creating artificial intelligence: it imitates the human activity of computing, with the principle of substitution as one of the key principles of its functioning.
Without any doubt, Alan Turing can be counted among the great minds of the twentieth century. The scientist paved the way for the processes nowadays known under the general term of the mechanization of mathematics, and he made a string of real-world contributions to the development of science and technology, using all his skills and knowledge to serve people. As a scientist, Turing believed in progress; he also held a firm belief that civilization, coupled with humanity and with an understanding of the need to preserve knowledge and advance science and technology, is the key to prosperity and a better future. Turing’s approach to the Decision Problem significantly changed the way people perceive quantitative information, and the Turing machine gave impetus to artificial intelligence studies and to the mechanization of mathematics in general. Turing’s contribution to mathematics was also reflected in logic, philosophy, mathematical biology, and cognitive studies.