
The Alan Turing Legacy

International Symposium | Madrid, October 23–24, 2012
Fundación Ramón Areces / Real Academia de Ciencias Exactas, Físicas y Naturales




  • Manuel de León

Real Academia de Ciencias Exactas, Físicas y Naturales
Consejo Superior de Investigaciones Científicas (CSIC)

  • David Ríos Insúa

Real Academia de Ciencias Exactas, Físicas y Naturales

  • Jesús María Sanz-Serna

Real Academia de Ciencias Exactas, Físicas y Naturales
Universidad de Valladolid 





Miguel Ángel Alario y Franco
President of the Real Academia de Ciencias Exactas, Físicas y Naturales

Manuel de León
David Ríos Insúa
Jesús M. Sanz-Serna

Symposium Coordinators

Federico Mayor Zaragoza
President of the Scientific Council
Fundación Ramón Areces

José Manuel Sánchez Ron

Real Academia Española
Real Academia de Ciencias Exactas, Físicas y Naturales
Universidad Autónoma de Madrid

Title: Alan Turing: “The Person of the 20th Century”
Abstract: In December 1999, “Time” magazine chose Albert Einstein as “The Person of the Century”. This was an entirely reasonable choice but, as I will argue in this talk, there are good reasons for Turing to have received this honor, or at least to have been considered for it. One such reason is his purely scientific work, which stems from the greatest mathematical tradition. That work had a direct influence on the development of mathematics itself, and was ultimately also instrumental in shaping a new (technological) world, both in the calculation and handling of information and in giving rise to a new manner of establishing social relations. An example of this more “applied”, or mundane, side is Turing’s contribution to the deciphering of secret codes during the Second World War, which in a somewhat metaphorical sense may be regarded as a new tool for undermining personal privacy, that essential civil right whose denial by British society ultimately ruined his life.

Ramón López de Mántaras

Recipient of the 2011 AAAI Robert S. Engelmore Award
CSIC Research Professor

Title: The AI Journey: The road traveled and the (long) road ahead
Abstract: In this talk I will first briefly review what Artificial Intelligence (AI) is about, distinguishing between strong and weak AI. Then I will summarize some of the many impressive results we have achieved in weak AI along the road traveled so far, including some specific results in robotics and music obtained at the IIIA-CSIC. Next I will describe some of the future challenges to be faced along the (long) road we still have ahead of us, with an emphasis on the so-called “integrated systems” incorporating perception, learning, reasoning, communication and action, and on why the Turing test is not suitable for such systems. Finally, I will comment on the importance of interdisciplinary research for building these integrated systems, which are a stepping stone towards strong AI, and give some examples of results in interdisciplinary research which are relevant for building humanoid robots with artificial skin, artificial muscles and artificial cartilage.


Nigel P. Smart

University of Bristol, United Kingdom

Title: Provable Security
Abstract: Turing's test asks whether a person at a remote location can determine whether they are communicating with a real person or a computer. A similar idea is used in cryptography: an adversary must determine whether it is interacting with a real system or a simulation. If it cannot tell the difference, the system is called “secure”. In this talk I will provide an introduction to this modern approach to cryptography.
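The real-or-simulation game described in the abstract can be sketched concretely (an illustrative toy example, not material from the talk; all names here are invented): an adversary picks two messages, a challenger flips a hidden coin and encrypts one of them under a one-time pad with a fresh key, and the adversary must guess which message was encrypted. Because the ciphertext is uniformly random either way, no strategy beats coin flipping.

```python
import secrets

def otp_encrypt(key: bytes, msg: bytes) -> bytes:
    # One-time pad: XOR the message with an equally long random key.
    return bytes(k ^ m for k, m in zip(key, msg))

def ind_game(adversary, n_bytes=16):
    """One round of the indistinguishability game: the challenger flips
    a hidden coin b, encrypts one of two adversary-chosen messages, and
    the adversary must guess b from the ciphertext alone."""
    m0, m1 = adversary.choose_messages(n_bytes)
    b = secrets.randbits(1)
    key = secrets.token_bytes(n_bytes)      # fresh key every round
    c = otp_encrypt(key, (m0, m1)[b])
    return adversary.guess(c) == b

class Guesser:
    # One concrete strategy; against a one-time pad the ciphertext
    # carries no information about b, so no strategy can do better.
    def choose_messages(self, n):
        return bytes(n), bytes([0xFF]) * n
    def guess(self, c):
        return 1 if sum(c) > 127 * len(c) else 0

rate = sum(ind_game(Guesser()) for _ in range(4000)) / 4000
print(rate)   # ≈ 0.5: the adversary cannot tell which message was used
```

A system is then called secure when every efficient adversary's winning rate stays negligibly close to 1/2.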

Froilán Martínez Dopico

Universidad Carlos III de Madrid

Title: Alan Turing and the origins of modern Gaussian elimination
Abstract: The solution of a system of linear equations is by far the most important problem in Applied Mathematics. It is important in itself and because it is an intermediate step in many other relevant problems. Gaussian elimination is nowadays the standard method for solving this problem numerically on a computer, and it was the first numerical algorithm to be subjected to a rounding error analysis. In 1948, Alan Turing published a remarkable paper on this topic: “Rounding-off errors in matrix processes” (Quart. J. Mech. Appl. Math. 1, 287–308). In this paper, Turing formulated Gaussian elimination as the matrix LU factorization and introduced the “condition number of a matrix”, both fundamental notions of modern Numerical Analysis. In addition, Turing presented an error analysis of Gaussian elimination that improved previous analyses and deeply influenced the definitive analysis developed by James Wilkinson in 1961. Alan Turing’s work on Gaussian elimination appeared in a fascinating period for modern Numerical Analysis. Other giants of Mathematics, such as John von Neumann, Herman Goldstine, and Harold Hotelling, were also working in the mid-1940s on Gaussian elimination. The goal of these researchers was to find an efficient and reliable method to solve systems of linear equations on the recently invented “automatic computers”. At that time, it was not at all clear whether Gaussian elimination was the right choice. The purpose of this talk is to review, at an introductory level, Alan Turing’s contribution to the analysis of Gaussian elimination, its historical context, and its influence on modern Numerical Analysis.
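The LU formulation the abstract refers to can be sketched in a few lines of Python (an illustrative sketch with invented names, not Turing's own presentation): Gaussian elimination with partial pivoting, recorded as a factorization P A = L U, where the multipliers of the elimination become the entries of L.

```python
def lu_decompose(A):
    """Gaussian elimination written as an LU factorization with
    partial pivoting: returns P, L, U with P A = L U."""
    n = len(A)
    U = [row[:] for row in A]
    L = [[float(i == j) for j in range(n)] for i in range(n)]
    P = [[float(i == j) for j in range(n)] for i in range(n)]
    for k in range(n):
        # Pivot: bring the largest entry in column k to the diagonal.
        p = max(range(k, n), key=lambda i: abs(U[i][k]))
        U[k], U[p] = U[p], U[k]
        P[k], P[p] = P[p], P[k]
        L[k][:k], L[p][:k] = L[p][:k], L[k][:k]
        for i in range(k + 1, n):
            m = U[i][k] / U[k][k]       # elimination multiplier, stored in L
            L[i][k] = m
            for j in range(k, n):
                U[i][j] -= m * U[k][j]
    return P, L, U

def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

A = [[2.0, 1.0, 1.0], [4.0, 3.0, 3.0], [8.0, 7.0, 9.0]]
P, L, U = lu_decompose(A)
err = max(abs(x - y) for r1, r2 in zip(matmul(P, A), matmul(L, U))
          for x, y in zip(r1, r2))
print(err)   # essentially zero: P A = L U holds
```

Turing's rounding-error question is precisely about how the floating-point version of these subtractions perturbs L and U, which is where the condition number of A enters.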




Leslie Valiant

Nevanlinna Prize 1986, Knuth Prize 1997, Turing Award 2010
Harvard University, USA

Title: Biological Evolution as a Form of Learning
Abstract: Living organisms function according to protein circuits. Darwin's theory of evolution suggests that these circuits have evolved through variation guided by natural selection. However, the question of which circuits can so evolve in realistic population sizes and within realistic numbers of generations has remained essentially unaddressed. We suggest that computational learning theory offers the framework for investigating this question of how circuits can come into being adaptively from experience, without a designer. We formulate evolution as a form of learning from examples. The targets of the learning process are the functions of highest fitness. The examples are the experiences. The learning process is constrained so that the feedback from the experiences is Darwinian. We formulate a notion of evolvability that distinguishes function classes that are evolvable with polynomially bounded resources from those that are not. The dilemma is that if the function class, say for the expression levels of proteins in terms of each other, is too restrictive, then it will not support biology, while if it is too expressive then no evolution algorithm will exist to navigate it. We shall review current work in this area.

Miguel Ángel Martín Delgado

Universidad Complutense de Madrid

Title: Alan Turing and the Limits of Computation
Abstract: Computability studies which problems can be computed and which cannot. This goes to the very limits of what is knowable. Turing developed the theory of computability by finding mechanical procedures to compute things, and also devised appropriate definitions of what algorithms are. This is also how mathematics must be understood at a fundamental level, in terms of axioms; eventually, it leads to the very notion of mathematical creativity. Turing showed what things we can compute in a very precise and universal way, but he also proved that there are things that we cannot compute. After explaining the limits of computability, including examples, the natural question is: can we go beyond? This depends on what is called the Turing barrier. Some alternatives to Turing computation are discussed, including quantum computation. Turing's work is given some historical perspective with respect to some of his precursors, his contemporaries, and the mathematicians who developed his ideas further.
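As a concrete illustration of the “mechanical procedures” the abstract refers to, here is a minimal single-tape Turing machine simulator (a toy sketch; the transition table and state names are invented for this example). The machine below increments a binary number: it scans to the right end of the input, then propagates a carry leftwards.

```python
def run_tm(delta, tape, state, halt, blank='_', max_steps=10_000):
    """Run a one-tape Turing machine: delta maps (state, symbol) to
    (new state, symbol to write, head move -1/0/+1)."""
    cells = dict(enumerate(tape))       # sparse tape, blank elsewhere
    head = 0
    for _ in range(max_steps):
        if state == halt:
            break
        sym = cells.get(head, blank)
        state, cells[head], move = delta[(state, sym)]
        head += move
    return ''.join(cells[i] for i in sorted(cells)).strip(blank)

# Binary increment: scan right to the first blank, then carry left,
# flipping 1 -> 0 until a 0 (or a blank) absorbs the carry as a 1.
inc = {
    ('scan', '0'): ('scan', '0', +1),
    ('scan', '1'): ('scan', '1', +1),
    ('scan', '_'): ('carry', '_', -1),
    ('carry', '1'): ('carry', '0', -1),
    ('carry', '0'): ('halt', '1', 0),
    ('carry', '_'): ('halt', '1', 0),
}

print(run_tm(inc, '1011', 'scan', 'halt'))   # → 1100 (11 + 1 = 12)
```

The limits the talk discusses arise one level up: a universal machine can run any such table, yet no table can decide, for every machine and input, whether this loop ever reaches the halting state.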


Miguel Ángel Herrero

Universidad Complutense de Madrid

Title: Mathematics and biology: a difficult relation
Abstract: In 1952 Alan Turing authored an article on the chemical basis of morphogenesis (The chemical basis of morphogenesis. Phil. Trans. Roy. Soc. London B 237, 37–72), his only published work related to Biology. His opening words tersely describe his approach: “It is suggested that a system of chemical substances, called morphogens, reacting together and diffusing through a tissue, is adequate to account for the main phenomena of morphogenesis. Such a system, although it may originally be quite homogeneous, may later develop a pattern or structure due to an instability of the homogeneous equilibrium, which is triggered off by random disturbances.” Turing was not the first to use mathematical methods to deal with biological problems. Moreover, his approach went unnoticed, or was plainly rejected, by many biologists at the time, and rapidly fell into oblivion. His proposals were independently rediscovered about twenty years later, only to meet with a hostile reception among leading developmental biologists. An argument often used in the resulting exchanges is that Life is too complex to be reduced to a mathematical formulation. However, such misgivings have not prevented Ecology and Mathematics from mutually benefiting from their respective discoveries during the 20th century. In this lecture, I will briefly describe some striking aspects of the exciting, if difficult, relation between Mathematics and Biology throughout history. Particular attention will be paid to developments that have taken place during the last half century, and which can be related, at least in part, to Turing's work.
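The instability Turing describes can be reproduced numerically. The sketch below is illustrative only (it uses the standard Schnakenberg kinetics, a common textbook stand-in, rather than Turing's original equations, and the parameter values are invented for the demonstration): two morphogens react and diffuse on a ring of cells, the inhibitor diffusing much faster than the activator, and a pattern grows out of tiny random noise around the homogeneous equilibrium.

```python
import random

# A 1-D two-morphogen reaction-diffusion system (Schnakenberg kinetics):
#   u_t = a - u + u^2 v + Du * u_xx
#   v_t = b - u^2 v + Dv * v_xx
a, b = 0.1, 0.9
Du, Dv = 1.0, 40.0            # Dv >> Du: the classic Turing condition
n, dx, dt, steps = 100, 1.0, 0.005, 12000

random.seed(1)
u0, v0 = a + b, b / (a + b) ** 2        # homogeneous steady state
u = [u0 + 0.01 * (random.random() - 0.5) for _ in range(n)]
v = [v0 + 0.01 * (random.random() - 0.5) for _ in range(n)]

def lap(w, i):                          # periodic 1-D Laplacian
    return (w[i - 1] - 2 * w[i] + w[(i + 1) % n]) / dx ** 2

for _ in range(steps):                  # explicit Euler time stepping
    un = [u[i] + dt * (a - u[i] + u[i] ** 2 * v[i] + Du * lap(u, i))
          for i in range(n)]
    vn = [v[i] + dt * (b - u[i] ** 2 * v[i] + Dv * lap(v, i))
          for i in range(n)]
    u, v = un, vn

amplitude = max(u) - min(u)
print(amplitude)   # far larger than the 0.01 seeding noise: a pattern formed
```

Without diffusion the steady state is stable; it is precisely the unequal diffusion rates that destabilize it, which is the counter-intuitive core of Turing's 1952 argument.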



Josep Díaz
Premio Nacional de Informática 2011
Universitat Politècnica de Catalunya

Title: Alan Turing: the link from Leibniz to Gödel
Abstract: The talk is split into two parts. In the first part, we survey the quest for a universal procedure to solve problems. In the second, starting from Gödel's letter to von Neumann, we survey the meaning of infeasibility through the truncated Entscheidungsproblem, and give examples of problems from different fields where it seems difficult to obtain efficient procedures.



Gerald E. Sacks

Harvard University, USA

Title: Two Lemmas Of Infinitary Logic
Abstract: Applications of Gödel's condensation method to infinitary logic.

Ignacio Luengo

Universidad Complutense de Madrid

Title: Turing and Enigma
Abstract: Turing’s work was vital for deciphering the messages sent by the German Navy’s Enigma machine in the Second World War. His decisive contribution, together with the rest of the work carried out at Bletchley Park, was kept secret by the British Government until the mid-1970s. We reconsider Turing’s work on Enigma from a historical, cryptographic point of view.


Pedro Bernal Gutiérrez

Lieutenant General, Spanish Air Force
Former Director of the Centro Superior de Estudios de la Defensa Nacional (CESEDEN)

Title: Turing and Cryptography in the Second World War
Abstract: Communications acquired a crucial importance during the Second World War. In order to protect its flow of information from the Allied powers, Germany developed highly complex encryption equipment and procedures, such as the ENIGMA machine. The Allies themselves, and Great Britain in particular, took up an undercover cryptological battle that was to prove decisive for the outcome of the war. Turing played a key role in this struggle with his contribution of original ideas in the field of mathematical logic and computation.

David Pearce

Universidad Politécnica de Madrid

Title: Logic, Computation, Knowledge Representation and Problem Solving
Abstract: Alan Turing laid the foundations of a long-standing partnership between logic and the science of computation. Since the 1970s this partnership has expanded, deepened and diversified in many new directions. Logic became an inspiration for designing programming languages, a tool for specification and semantics, and an instrument for verifying the correctness of software. It has also provided languages for reconstructing human knowledge and for modeling and automating many human reasoning tasks. In this talk I discuss some of the current work in artificial intelligence that involves the use of logic for programming, problem solving, knowledge reconstruction and the analysis of artificial societies.




Bruce Wilcox

Loebner Prize 2010 and 2011
Telltale Games, USA

Title: Making it real: Loebner-winning chatbot design
Abstract: For the last three years, chatbots built by my wife and me have participated in the Loebner Prize in Artificial Intelligence, coming in 1st twice and 2nd once, with a different persona each year. One of them even fooled a human judge. A world-class chatbot should tell the story of its life, have a consistent personality, and respond emotionally. It takes a lot of script. And it takes a powerful engine, designed to support natural language processing in a variety of ways, that makes it relatively easy to author all that script. This talk discusses ChatScript, the open-source Natural Language scripting language and engine running our bots. And it looks at how we construct chatbot personalities and what we have learned over time.

Luis M. Laíta

Emeritus Full Professor of the Universidad Politécnica de Madrid (UPM); Emeritus Professor of the
Universidades Cardenal Cisneros and CEU, Madrid
Real Academia de Ciencias Exactas, Físicas y Naturales

Title: George Boole, a precursor of today’s Computer Algebra-based Demonstrations
Abstract: In the monograph “The Genesis of Boole’s Logic, its History and a Computer Exploration” (L. M. Laita, Memories of the Royal Academy of Sciences of Madrid, Mathematics, Volume XXXIII, 2005), I studied the historical, philosophical and scientific influences that shaped the work of the great English mathematician and logician George Boole. The question is: why, in this short talk, are Alan Turing (1912–1954) and George Boole (1815–1864) linked? In outline, the answer is that their works can both be considered intimately related to what is today known as “Symbolic Computation”. The computation model provided by Turing machines, as well as the more modern computation models that proved equivalent in crucial ways to universal Turing machines, can be considered Symbolic Computation, as they are based on strict programs and a strict concept of the word “algorithm”. The link with George Boole rests on our claim that his algebraic logic demonstrations were based on a methodology called “the method of Separation of Symbols”, first used in France by, among others, Lagrange, Laplace and Arbogast, later used in Great Britain (by, for instance, the Scottish mathematician Duncan F. Gregory), and subsequently developed fully and in depth by Boole. More importantly, as we will try to show in our exposition, Boole’s use of the method of separation of symbols can be extended to what is now known as “Symbolic Computation”; in particular, but not exclusively, to symbolic logic proofs based on Buchberger’s Gröbner bases. As we will illustrate, this can be applied to what are known in Artificial Intelligence as “expert systems”. As far as we know, neither Boole nor Turing ever referred to “Artificial Intelligence”, but we believe that some of their thoughts were directly related to this topic.


Joan Bagaria i Pigrau

ICREA Research Professor, Universitat de Barcelona

Title: On Turing's legacy in mathematical logic and the foundations of mathematics
Abstract: While Turing is best known for his work on computer science and cryptography, his impact on the general theory of computable functions (recursion theory) and the foundations of mathematics is of equal importance. In my talk I will focus on some of the ideas and problems arising from his work in these areas, such as the analysis of the structure of Turing degrees and the development of ordinal logics, which have been of continuous interest up to the present day and which still present formidable mathematical challenges.

Closing of the symposium