Interviewer: Dr. Ragupathy Venkatachalam

Production: Dr. Ricardo Leizaola

Interview date: 16 February 2020

Place: Goldsmiths, University of London

The excerpts in this video are highlights of a discussion that ran for over an hour. The full transcript of the discussion and a brief biographical statement are available in the tabs below.

The topics discussed in the video include:


- The nature of economic dynamics
- Growth theory and economic dynamics
- Algorithmic stopping rules and equilibrium concepts
- Algorithms and intuition

What is your view of economic dynamics and what are the important questions in this area?

Economic dynamics – I must say that I was influenced greatly by the dynamical way of looking at things by Richard Goodwin. He emphasised non-linearity, but also looked at linear dynamics. First Richard Goodwin and then Bjorn Thalberg in Lund also emphasised dynamics. … Economic dynamics was associated for me with non-optimum cycle models and this was what Goodwin emphasised all the time. … Economic dynamics was a vehicle through which I was able to see the non-maximum allocation processes. That was important for me because dynamics was a tradition I came with from my engineering studies in mechanical engineering, particularly Hayashi’s lectures on oscillation theory and aerodynamics lectures on vibration theory. They were essentially dynamic things. I specialised in hydrodynamics, quantum dynamics, electrical dynamics, all kinds of dynamics, and it was a heartening experience to see that it was important in economics as well, but I was puzzled as well that these dynamical questions did not relate to the dynamics of engineering systems apart from electrical oscillation theory. … Anyway, economic dynamics meant for me a way to bring non-optimum dynamical questions into it, but it is only recently I have been able to analyse this.

**You seem to argue that there are varieties of mathematics in your writings. What are these varieties and do economic insights depend crucially on the kind of mathematics that we choose?**

Well, first of all I can answer by saying I don’t think there is a right kind of mathematics. I don’t think orthodox economists, conventional economists, know anything of the varieties of mathematics. … Economists do not differentiate enough between finite arguments and infinite arguments. Accuracy depends on whether you are talking about the infinite or the finite. Approximation depends on whether you are talking about the finite or the infinite. Then it comes to the definition of infinity. … That is why I think that one should know one branch of mathematics well, while being aware of the many other branches of mathematics that are possible. I will just mention smooth infinitesimal analysis, non-standard analysis, Russian constructivism, intuitionist constructivism. You can read ordinary mathematics in any number of books, but for these things you can still read the classics and understand. That means Bishop, Brouwer, Turing and so on.

**Could you explain its scope and the motivations behind computable economics in a way that relates to understanding and investigating economic problems?**

Computable mathematics – intuition is important, imagination is important and the intuitionism of Brouwer is important. Approximations are important, algorithms are important. … My computable economics was going towards problem solving, but in a classic computability sense it was about showing the impossibility of solving the problems of standard economics, instead of showing them as paradoxes and anomalies, as Allais and Ellsberg did in the beginning. For example, Savage’s

Problem solving is important because that is what we have in economics and social science: problems. We have to solve them. We find methods to solve them and these methods, if they are tied to the straitjacket of rationality, equilibrium and optimisation, we can only solve a few of the problems. We have to solve the problems anyway, so we have to expand the methods and expand the concepts of standard economics. … You develop the mathematics to solve the problems. You don’t take the mathematics as given and then tailor the problem to suit the mathematics, to suit the solution of the mathematical principle. You tailor the mathematics. … Standard economics emasculates problems by the mathematics that they know or mathematics that is already developed. The danger is they will apply it, like the computational economists do, like the agent-based economists do. They will take constructive mathematics as it is and apply it to computational economics and to interpret what they call agent-based economics, whereas mathematicians expand this all the time due to real problems. … I think problems are intrinsic to economics and social science in general. Agents individually and collectively try to solve problems. They try to mathematise it, but they develop the mathematics. Sometimes we learn about it, sometimes we don’t learn about it.

Classical behavioural economics is the kind that Simon, Nelson and Winter, and Day and that kind of people emphasise. Modern behavioural economics is the kind that comes out of the acceptance of subjective expected utility theory. That means you accept subjective probability theory of the Savage variety and the resulting subjective expected utility. None of this is accepted by the classical behavioural economists. They accept algorithmic probability theory, which is wedded to von Mises’s definition of patterns for randomness first and probability afterwards. Randomness is patternless; patternless is randomness. Patternless is defined algorithmically. These people are completely algorithmic in their definitions of behavioural entities and their decision processes for solving problems. The modern behavioural economists who accept subjective expected utility, and therefore subjective probability, do not accept an algorithmic definition of probability theory or utility theory, so they have to face the Allais and Ellsberg paradoxes, which they solved logically – in a classical logical way. This tradition names the paradoxes as anomalies and goes on through experimental economics and modern behavioural economics of the Thaler variety. They accept all the tenets of neoclassical economics as far as utility maximisation and equilibrium search are concerned. None of this is accepted by classical behavioural economists. That is the distinction.

I prefer not to have any equilibrium concept. The classical behavioural economists did not work with any equilibrium concept. They defined non-equilibrium, but I don’t work with any equilibrium at all and computability theory has no equilibrium at all. An algorithm either stops or doesn’t stop. It may not stop. Many algorithms don’t stop at all... I don’t want to use the word equilibrium or non-equilibrium or a-equilibrium, without equilibrium. Economics is algorithmic. Algorithms have no equilibrium concepts. They have stopping rules. They are made from approximations of intuitions. That is enough for me.

History is very important, but not in the sense that Joan Robinson talks about history versus equilibrium. She talks about history in an ahistorical sense: history is given. History is not given. We reinterpret history all the time, so we reinterpret intuitions all the time because we reinterpret history all the time. History is important and the study of history is important. Knowing history is important. To the extent that you know history and are a master of history, you can talk about intuitions approximated to algorithms. Algorithms are history dependent.

**In your work you put Keynes, Sraffa, Simon, Brouwer, Goodwin and Turing together. What do you see as a single connecting thread?**

You talk about ASSRU (Algorithmic Social Science Research Unit) really because in that we put together Simon, Turing, Brouwer, Goodwin, Keynes and Sraffa.[2] This is about computability, bounded rationality, satisficing, intuitionism and then the irrelevance of equilibrium in Sraffa’s work and Goodwin’s cycle theory, and Keynes’s overall view of the importance of morality in daily affairs, if you like.

**You spoke about varieties of mathematics and you touch upon many different traditions and varieties of economics.**

I think there are only two types of economics, but there are many types of mathematics. Finite, but many types… only two types of economics: classical and neoclassical. The neoclassical model requires full employment, equilibrium, rationality and all the paraphernalia of this. Classical methods do not require any of these concepts. There can be unemployment, there can be no equilibrium and so on. … When I teach, I teach students first of all to have an open mind; secondly, to master one kind of mathematics and one kind of economics, but with an open mind, to be sceptical about what they master. Scepticism is the mother of pluralism in my opinion. I don’t teach pluralistic economics, I don’t advocate pluralism, but I advocate scepticism, which is one step ahead or above pluralism. To be sceptical of economics and mathematics is a healthy attitude. To be sceptical in general is a healthy attitude. It is because some of these people were sceptical of conventional modes of thought that they were able to extend, generalise and use other types of thought and ways of reasoning and so on. This is true of people like Brouwer and this is true of economics people like Keynes. Keynes was a master of scepticism. There used to be a saying, ‘If you have four views, three of them are held by Keynes,’ but Keynes also is supposed to have said, ‘When facts change, I change.’ That means scepticism. It is not that facts change, it is just to be sceptical of whatever you do. To emphasise the answer to your question on pluralism: I teach people to be sceptical, but to learn one mathematics and one economics thoroughly with a sceptical, open mind to its generalising. That is what made Sraffa great, that is what made Brouwer great, that is what has made people great. The moment Hilbert changed and became non-sceptical, he became dogmatic. Turing was always sceptical. He was always innocent and he was always sceptical.

**Notes:**

[1] Savage, Leonard J. (1954),

[2] Algorithmic Social Sciences Research Unit, University of Trento, Italy.

RV: Ragupathy Venkatachalam (interviewer)

KV: First of all, thank you very much for having me. To be in this interview series is an honour. I don’t pretend to be in the same class as the others you have interviewed in this category, but I am grateful to you and to Goldsmiths for having had me. Economic dynamics – I must say that I was influenced greatly by the dynamical way of looking at things by Richard Goodwin. He emphasised non-linearity, but also looked at linear dynamics. First Richard Goodwin and then Bjorn Thalberg in Lund also emphasised dynamics. Both of them also emphasised the Phillips machine and cybernetics. In a way they emphasised analogue computing traditions in economics, mathematics and applied mathematics. Goodwin was an applied mathematician in one way and Thalberg in another way, but I was also influenced by Paul Samuelson’s Foundations of Economic Analysis, part 2.

KV: I have come to realise that growth theory is mostly steady-state growth theory, which is no different from static theory. It is either steady state or stationary state. Harrod, who emphasised growth theory, knew very little dynamics. His dynamics was classical dynamics, mechanical dynamics that was of the sort that was in Ramsey’s father’s book

KV: I remember Stigler saying about Samuelson’s Foundations of Economic Analysis that it was a book about what he knew about difference and differential equations.

KV: You asked three questions there. One is about reasonings and inferences in mathematics, whether it helps economics; one whether economics requires mathematics to make it accurate because mathematics is supposed to be accurate; and the third one is about the varieties of mathematics, whether economists have been wedded to the wrong kind of mathematics. You imply – you don’t ask – is there a right kind of mathematics? Well, first of all I can answer by saying I don’t think there is a right kind of mathematics. I don’t think orthodox economists, conventional economists, know anything of the varieties of mathematics. A simple example is Paul Samuelson, who doesn’t even know the classical mathematics that he claims to be a master of completely, but he is a sympathetic, generous person and he never mentions constructive mathematics although he does mention non-standard analysis. Still he is wedded to classical mathematics. At the other extreme, there is a graduate textbook which is used in Columbia, for example, in mathematical economics, which belittles constructive mathematics, belittles Brouwer and Heyting and their rejection of the law of the excluded middle. Now there are all kinds of mathematics that do not rely on the law of the excluded middle,

KV: I must say first that Robin Gandy, who was Alan Turing’s only PhD student, reviewed the first volume of Bourbaki’s book, in which he wrote that it would not be constructivism that would be put in the dust heap of history, as Bourbaki claimed, but Bourbakianism; that was his prediction. Today, Bourbakianism is no longer

KV: Yes. Bourbaki attempted to prove existence and uniqueness, and all of mathematics was developed with existence-uniqueness principles, but on the basis of the axiomatic development of algebra that van der Waerden and then later on MacLane and Birkhoff made. The existence-uniqueness principle, without a method of how to get to – for example, in economics – the equilibrium proved to exist, was not part of even neoclassical economics. Even Fisher, Walras and Pareto tried to develop methods to show how to get to an equilibrium that they could prove to exist or to be unique. Fisher used hydraulic principles, Walras used stock market principles, Pareto used analogue computing principles and so on, but they didn’t know that they were doing this. They didn’t know the mathematical underpinnings of what they were doing so they believed… These people who followed them took existence-uniqueness as the principal development, partly because of von Neumann’s influence in game theory and growth theory, but von Neumann himself changed later on in life when he understood that logic only mirrors a part of human thinking. What I mean by existence-uniqueness is the lack of a method to prove a way of getting to the existence, or the uniqueness, of the thing proved to exist.

KV: First of all, you ask why there is a wedge between existence-uniqueness and the method to arrive at whatever is proved to exist and is unique. My answer is based on two things. This is a fault of economic method, which is mimicking uncritically a mathematics that doesn’t pay any attention to meaning. I am only paraphrasing what Bishop said in his introduction – Bishop’s book of 1967 on Foundations of Constructive Analysis.

RV: How are they different?

KV: Recursive economics is the economics of the neoclassicals. There is a famous textbook by Sargent and Ljungqvist

RV: You have answered. I wanted to touch upon something that you just said about problem-solvers and problem-solving, which is an important element of your computable economics. The “agents as problem-solvers” view or metaphor seems to be in contrast with what we learn in typical graduate school textbooks, where agents are optimisers, maximisers or signal processors. Why is problem-solving a better metaphor in your view and what implications does this have on limits to rationality?

KV: You ask about problem-solving and what importance it has in economics. I would say social science in general. Let me first point out that Lucas in his Phillips lectures also talks about problem-solving, but they are quite different aspects of problem solving. Problem solving is important because that is what we have in economics and social science: problems. We have to solve them. We find methods to solve them and these methods, if they are tied to the straitjacket of rationality, equilibrium and optimisation, we can only solve a few of the problems. We have to solve the problems anyway so we have to expand the methods and expand the concepts of standard economics. That is what Simon does. I want to emphasise that irrational numbers and rational numbers and real numbers are based on the integers, quotients of integers. I will go one step further. Quotients of integers made me think of quotients of vectors. We don’t talk about quotients of vectors. Then you talk about Hamilton’s quaternions. I think there is a role for quaternions as well. Hamilton was inspired by problems that he met that he wanted to solve. This comes to grips with a question you asked earlier. You develop the mathematics to solve the problems. You don’t take the mathematics as given and then tailor the problem to suit the mathematics, to suit the solution of the mathematical principle. You tailor the mathematics. Hamilton tailored quotients of vectors. For example, complex numbers were debated for a long time before they were admitted into mathematics, but it helped in circuit theory, for example. Circuit theory was a practical theory. Practical problem solving requires mathematics to be expanded and Hamilton did so with quaternions. We haven’t used quaternions. We don’t use quotients of vectors. Take activity analysis models: it doesn’t use quotients of vectors, just takes vector analysis – at most, tensor analysis, which is a generalisation of vector analysis. 
I think problems are important. Problems are the guiding light of economics. Standard economics emasculates problems by the mathematics that they know or mathematics that is already developed. The danger is they will apply it, like the computational economists do, like the agent-based economists do. They will take constructive mathematics as it is and apply it to computational economics and to interpret what they call agent-based economics, whereas mathematicians expand this all the time due to real problems. For example, Per Martin-Löf’s type theory. Type theory can be linked to Russell’s definition of type theory to eliminate Gödelian and Berry-type paradoxes in mathematics from arising in set theory and set theory as a foundation for mathematics, but type theory doesn’t have to depend on any of these things. It comes from problems – Sraffa-type problems, any number of economic problems. I think problems are intrinsic to economics and social science in general. Agents individually and collectively try to solve problems. They try to mathematise it, but they develop the mathematics. Sometimes we learn about it, sometimes we don’t learn about it.
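Hamilton’s quaternions, mentioned above as the outcome of his search for ‘quotients of vectors’, can be sketched in a few lines. This is purely an editorial illustration (the class and names below are ours, not anything from the interview); the point is the non-commutative product, which is what made quaternions controversial before they were admitted into mathematics.

```python
# A minimal sketch of Hamilton's quaternions, illustrating the
# non-commutative multiplication discussed above. Purely illustrative.

class Quaternion:
    def __init__(self, w, x, y, z):
        self.w, self.x, self.y, self.z = w, x, y, z

    def __mul__(self, o):
        # Hamilton's product rules: i^2 = j^2 = k^2 = ijk = -1
        return Quaternion(
            self.w*o.w - self.x*o.x - self.y*o.y - self.z*o.z,
            self.w*o.x + self.x*o.w + self.y*o.z - self.z*o.y,
            self.w*o.y - self.x*o.z + self.y*o.w + self.z*o.x,
            self.w*o.z + self.x*o.y - self.y*o.x + self.z*o.w,
        )

    def __repr__(self):
        return f"({self.w}, {self.x}i, {self.y}j, {self.z}k)"

i = Quaternion(0, 1, 0, 0)
j = Quaternion(0, 0, 1, 0)

print(i * j)  # i*j = k:  (0, 0i, 0j, 1k)
print(j * i)  # j*i = -k: (0, 0i, 0j, -1k)
```

Note that `i * j` and `j * i` differ in sign: tailoring the mathematics to the problem here meant giving up commutativity.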

KV: You define computability theory in terms of algorithms and intuitions, but I want to emphasise that it is also approximations. That is important. Approximations marry intuitions to algorithms. If you have intuitions that cannot be algorithmised, it means you have not found the right approximation yet to intuitions. Real life is about approximating intuitions so that you can write algorithms. It is true that there are intuitions that cannot be algorithmised, but
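The remark that ‘approximations marry intuitions to algorithms’ can be given a toy illustration (ours, with hypothetical function names, not the interviewee’s). A constructive real such as √2 is treated not as a finished object but as an algorithm that, for any requested tolerance, produces an approximation and then stops.

```python
# Toy sketch: a "constructive" square root. The number is identified with
# an algorithm plus an explicit stopping rule: iterate until the
# approximation is within the requested tolerance, then stop.

def sqrt_approx(a: float, tol: float) -> float:
    """Newton's iteration for sqrt(a), halted by a stopping rule."""
    x = a if a > 1 else 1.0    # any positive starting guess works here
    while abs(x * x - a) > tol:
        x = 0.5 * (x + a / x)  # Newton step for f(x) = x^2 - a
    return x

# Tightening the stopping rule yields a better approximation;
# there is never a final, "completed" value.
print(sqrt_approx(2.0, 1e-3))
print(sqrt_approx(2.0, 1e-12))
```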

KV: It is so, but remember that you ask about excess demand functions and their failure or non-failure in abstract terms, and their importance in general equilibrium theory. Excess demand functions are very important in general equilibrium theory, but none of the general equilibrium economic people acknowledge that excess demand is wedded to intuition. It is intuitionism that gives rise to the Church-Turing thesis. It is giving rise to intuitive development of effective calculability. The excess demand function is not effectively calculable in an intuitionistic sense. It may be calculable for an agent who has a different type of intuition. This intuition may be about the general tribes in Ceylon, for example, or in Brazil. Their intuition may be different. We have to learn their intuition to say that excess demand function is universally failing general equilibrium economics, but general equilibrium economics uses the excess demand function of classical mathematics, which has no place for intuition. I use the intuitive definition of effective calculability to show that in constructive or computable mathematics, excess demand functions are non-existent. It has to do with intuition. They banish intuition from general equilibrium theory. You have to admit intuition, and whose intuition is important.

KV: First, I will tell you that you can look at Simon to look at decidable results and Simon’s development of the Turing machine concept of algorithms. You can look at Boole and Jevons to talk about exclusive ‘or’ or inclusive ‘or’ and later development by McCulloch and Pitts of universality, universal computation in these systems of reasoning as used by Conway with surreal numbers and so on. There are many different number systems and ordinary economics uses only one type of number system. Now examples of uncomputability are the rationality postulates of standard economics. The general equilibrium of standard economics is non-constructive and uncomputable for many reasons – not only for computable reasons, but also for constructive reasons in an intuitionistic sense. Constructive reasons because the Bolzano-Weierstrass theorem is intuitionistically unacceptable and the Bolzano-Weierstrass theorem is used in every algorithm of general equilibrium theory proving the existence of equilibrium. Even the Sperner simplex, even the simplex of Scarf, uses the Bolzano-Weierstrass theorem. There is a mathematics without using the Bolzano-Weierstrass theorem. Just for clarification, the Bolzano-Weierstrass theorem is like the ‘game’ of 20 questions. You go right or left. So you can toss a coin and go right, but you don’t know whether you will reach the goal because the goal might be reached if you take the left turn. You assume that the goal is at the end. This is where Ahab comes in. My goals are irrational; so these goals are defined after the fact. That is why the Scarf proof is neither computable nor constructive. Scarf is wrong to say that Brouwer is confined to infinite processes that must have constructive decisions with them. Those are examples like rationality and equilibria from standard economics. For the example of computable solutions, you can look at Simon. He gives umpteen examples of computable solutions, even of the proof in Whitehead and Russell of the first volume of
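The ‘20 questions’ picture of the Bolzano-Weierstrass style of argument can be sketched as interval halving. The example below is an editorial illustration with a hypothetical target function: when the sign of f at each midpoint is decidable, each ‘question’ has an effective answer and the process is constructive; the intuitionistic objection concerns proofs in which no such rule for choosing a half-interval exists.

```python
# Sketch of the interval-halving ("20 questions") process behind
# Bolzano-Weierstrass / bisection style arguments. Each step asks one
# yes/no question and keeps one half of the interval.

def bisect(f, lo, hi, steps=50):
    assert f(lo) * f(hi) < 0, "a sign change makes each choice effective"
    for _ in range(steps):        # one yes/no "question" per step
        mid = (lo + hi) / 2
        if f(lo) * f(mid) <= 0:   # the answer says which half to keep
            hi = mid
        else:
            lo = mid
    return (lo + hi) / 2

# Hypothetical target: the root of x^2 - 2 on [0, 2], i.e. sqrt(2).
print(bisect(lambda x: x * x - 2, 0.0, 2.0))
```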

KV: It is very simple. Classical behavioural economics is the kind that Simon, Nelson and Winter, and Day and that kind of people emphasise. Modern behavioural economics is the kind that comes out of the acceptance of subjective expected utility theory. That means you accept subjective probability theory of the Savage variety and the resulting subjective expected utility. None of this is accepted by the classical behavioural economists. They accept algorithmic probability theory, which is wedded to von Mises’s definition of patterns for randomness first and probability afterwards. Randomness is patternless; patternless is randomness. Patternless is defined algorithmically. These people are completely algorithmic in their definitions of behavioural entities and their decision processes for solving problems. The modern behavioural economists who accept subjective expected utility, and therefore subjective probability, do not accept an algorithmic definition of probability theory or utility theory, so they have to face the Allais and Ellsberg paradoxes, which they solved logically – in a classical logical way. This tradition names the paradoxes as anomalies and goes on through experimental economics and modern behavioural economics of the Thaler variety. They accept all the tenets of neoclassical economics as far as utility maximisation and equilibrium search are concerned. None of this is accepted by classical behavioural economists. That is the distinction.
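The statement that ‘patternless is defined algorithmically’ is usually made precise through Kolmogorov complexity, which is itself uncomputable. As a rough, purely illustrative stand-in (our example, not the interviewee’s), one can use a general-purpose compressor: a patterned sequence compresses well below its own length, a patternless one does not.

```python
import os
import zlib

# Illustration only: Kolmogorov complexity, which makes "patternless"
# precise, is uncomputable, so a general-purpose compressor serves as a
# crude, computable proxy for the presence or absence of a pattern.

def compression_ratio(data: bytes) -> float:
    """Compressed length over original length: low means 'patterned'."""
    return len(zlib.compress(data)) / len(data)

patterned = b"01" * 500          # an obvious repeating pattern
random_like = os.urandom(1000)   # effectively patternless bytes

print(compression_ratio(patterned))    # well below 1: a pattern was found
print(compression_ratio(random_like))  # about 1: nothing to exploit
```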

KV: I want to say that Sraffa in his book of 1960 doesn’t talk about equilibrium at all. If you talk about equilibrium, you talk about resource allocation, efficient resource allocation and so on. Equity is left aside. I prefer not to have any equilibrium concept. The classical behavioural economists did not work with any equilibrium concept. They defined non-equilibrium, but I don’t work with any equilibrium at all and computability theory has no equilibrium at all. An algorithm either stops or doesn’t stop. It may not stop. Many algorithms don’t stop at all. There is a stopping rule. Some don’t stop, some circulate, some keep on going, so Emil Post defined recursively enumerable sets different from recursive sets on the basis of the stopping rules of algorithms. I don’t want to use the word equilibrium or non-equilibrium or a-equilibrium, without equilibrium. Economics is algorithmic. Algorithms have no equilibrium concepts. They have stopping rules. They are made from approximations of intuitions. That is enough for me.

RV: That is very interesting and this brings me to a related question, which is about methodology as well. It is on the micro foundations that seem to be quite accepted and fashionable in macroeconomics today. What are your thoughts on this? You talk about phenomenological macroeconomics and this seems to be in sharp contrast to that.

KV: This is entirely my personal view – almost entirely. I think the future of micro foundations for macroeconomics depends on microeconomics. Neoclassicals and new classicals – hardcore neoclassicals and new classicals – want micro foundations. New classicals in fact think there is only microeconomics to do. This has to do with the fallacy of composition. I think in the future only macroeconomics will survive and that is because of the fallacy of composition and because microeconomics will be dissolved by the prevalence of algorithmic economics. All micro entities, all individual entities, would work with algorithms.
Algorithms are intuitive, they can be approximated, but algorithmic, and therefore there will be no microeconomics to give the foundations for macroeconomics. Macroeconomics is another thing altogether. It has no micro foundations at all. It has its own foundations and the micro foundations have to do with the fallacy of composition and its failure in providing foundations for macro, but macroeconomics has its own foundations, has its own entities, and it gives problems as well – not only the agent’s problems or the so-called micro problems, but macroeconomic problems, national economic problems. Therefore, we must develop macroeconomics as much as we can, which is entirely my view. For example, in the case of cycle theory, using for example the Poincaré-Bendixson theorem, one proves the existence of cycles, but now the Poincaré-Bendixson theorem is algorithmised. Mathematics in the age of the Turing machine, as Hales calls it, gives a new life to macroeconomics. Macroeconomics is dynamic intrinsically because algorithms are dynamic. Macroeconomics has no static counterpart at all in the kind of macroeconomics I do and the kind of macroeconomics that is relevant even in Paul Samuelson’s part 2. Stability concepts will be developed as the dynamic concepts of macroeconomics are given an algorithmic content. Are these algorithms stable? Are these algorithms therefore stoppable or not stoppable? What do we do if we stop them in the middle? We approximate when we stop them in the middle of a process. All engineers, all imaginative engineers, welcomed infinite processes because they could be stopped.

KV: I must say that I have never found the words fundamental uncertainty in Keynes’s writing. It is Minsky who made a big deal of this fundamental uncertainty. I don’t even believe there is such a thing as fundamental uncertainty. Uncertainty has to do with randomness. Randomness is number theoretic, patternless, algorithmised, so I don’t think this fundamental uncertainty has anything to do with algorithmic undecidability or algorithmic computability or algorithmic uncomputability. Fundamental uncertainty is in the same category as subjective expected utility-based probability. Neither Ramsey nor de Finetti would subscribe to fundamental uncertainty as Minsky defines it and as some of these post-Keynesians define it.

KV: I think it matters because intuition is important. Intuition is history dependent. Whoever talks about intuition talks about intuition in a historical sense. This also Brouwer does. His concept of intuition Heyting, Bishop, Per Martin-Löf and so on have developed, because historically it is evolving – but not in a Darwinian or Mendelian sense – an evolving intuition. History is very important, but not in the sense that Joan Robinson talks about history versus equilibrium. She talks about history in an ahistorical sense: history is given. History is not given. We reinterpret history all the time, so we reinterpret intuitions all the time because we reinterpret history all the time. History is important and the study of history is important. Knowing history is important. To the extent that you know history and are a master of history, you can talk about intuitions approximated to algorithms. Algorithms are history dependent.

KV: You talk about ASSRU (Algorithmic Social Science Research Unit) really because in that we put together Simon, Turing, Brouwer, Goodwin, Keynes and Sraffa.

KV: Two things from childhood come to mind. One is my father telling me that he went into the kitchen of our house in the country and he found a man seated there eating. He said, ‘How could I tell him to leave the kitchen? He was hungry, he was eating.’ That made an impact on me on poverty. Then the second thing was when a man came accused of murder and he was going to be charged. He asked my father for help in appearing for him, but he didn’t have money to pay. My father chased him away, telling him to go and ask his MP, who he had worked for against my father. Then after a few minutes he asked me to bring this man back. So I realised poverty was important. Poverty is what drove him to come to my father and ask. Poverty has since then been very important for me, for poverty alleviation. This took form in Japan, where I studied, through the mathematics teacher who taught me the meaning of proof in mathematics and who was interested in poverty as well. It was the same when I came to economics. I was driven by the considerations for poverty by Gunnar Myrdal’s book on Asian Drama

KV: My answer is this: you mentioned that I spoke about variety of mathematics and variety of economics. I think there are only two types of economics, but there are many types of mathematics. Finite, but many types. They grow. Some kinds of mathematics die, but only two types of economics: classical and neoclassical. The neoclassical model requires full employment, equilibrium, rationality and all the paraphernalia of this. Classical methods do not require any of these concepts. There can be unemployment, there can be no equilibrium and so on. Classical economics was subverted by neoclassical economics in the name of continuity and in the name of generalising. They did not generalise. They mathematised neoclassical economics and for the mathematisation they chose one type of mathematics and closed their eyes to all other types of mathematics. When I teach, I teach students first of all to have an open mind; secondly, to master one kind of mathematics and one kind of economics, but with an open mind to be sceptical about what they master. Scepticism is the mother of pluralism in my opinion. I don’t teach pluralistic economics, I don’t advocate pluralism, but I advocate scepticism, which is one step ahead or above pluralism. To be sceptical of economics and mathematics is a healthy attitude. To be sceptical in general is a healthy attitude. It is because some of these people were sceptical of conventional modes of thought that they were able to extend, generalise and use other types of thought and way of reasoning and so on. This is true of the people like Brouwer and this is true of economics people like Keynes. Keynes was a master of scepticism. There used to be a saying, ‘If you have four views, three of them are held by Keynes,’ but Keynes also is supposed to have said, ‘When facts change, I change.’ That means scepticism. It is not that facts change, it is just to be sceptical of whatever you do. 
To emphasise the answer to your question on pluralism: I teach people to be sceptical, but to learn one mathematics and one economics thoroughly, with a sceptical, open mind towards generalising it. That is what made Sraffa great, that is what made Brouwer great, that is what has made great people great. The moment Hilbert changed and became non-sceptical, he became dogmatic. Turing was always sceptical. He was always innocent and he was always sceptical.

KV: You give the example that in physics one doesn’t teach the history of physics; in my opinion, one doesn’t teach it as much as one should. Let me first say that the history of thought is important because I view history, everything, as a tree. The tree is something in which you can walk backwards and forwards. History of thought is to walk backwards in the tree and to find nodes where you could have taken a certain path that you didn’t take. It is like the Bolzano-Weierstrass theorem, but you can go back in time. Therefore, when I teach economics and ask students to master one kind of economics, I also try to emphasise the history of thought from a tree perspective: going back and finding alternative paths that could have been taken, but were not taken. Why they were not taken is a question that the student must ask, but the teacher’s role is to point out that these paths were not taken and to give his or her view of why. My own opinion, based on mathematics, is that they are not taken because in this history of thought that they go back to, they go back to classical mathematics. But in the case of physics, I am not sure I agree with you that they don’t teach the history of thought. In the case of physics, proof is not important; it is workability that is important. But when one talks about important concepts in physics, like the Feynman diagrams or the Dirac delta function, then one goes back and tries to find out in what sense the Dirac delta function is a function of the fact that Dirac was an electrical engineer and was first taught electrical oscillation theory, and that he developed the idea of the Dirac delta function from electrical oscillation theory. For the Feynman diagrams, you have to go back to proofs and the importance of proofs. You can’t axiomatise the Feynman diagrams and the Feynman principle. What is the role of axioms? What is the role of proof? Feynman went back to try to see how Newton, for example, proved.
Newton used geometric methods to prove. Chandrasekhar, from astronomy, went back to Newton and Galileo to try to understand their proof techniques. What proof techniques do we use now to make these things workable? Feynman first made the diagrams so that they worked in quantum electrodynamics, for solving certain problems, mind you. Then he wondered why he was not able to axiomatise this, and he went back to history. Is it because I looked for workability without looking for proofs? How did Newton prove this? In fact, Newton used non-standard analysis in his infinitesimal calculus. Anyway, that is beside the point. I think the history of thought is important because you go back all the time, and you have to teach the history of thought as the path in a tree that is traversable up and down, sideways, in all directions.

KV: I don’t think I am equipped to give them advice on this. I can only outline the path that I took and where I have ended up. I can now tell you that I have ended up with more scepticism, more questions than I began with. That is where you have to end up, I think: with more questions and much more scepticism about methods and proofs and axioms and epistemology in general.

(End of recording)

**Notes:**

[1] Samuelson, P. (1947), Foundations of Economic Analysis, Cambridge, MA: Harvard University Press.

[2] Frisch, R. (1933). Propagation Problems and Impulse Problems in Dynamic Economics, in

[3] Slutzky, E. (1937). The summation of random causes as the source of cyclic processes.

[4] Zambelli, S. (2007). A Rocking Horse That Never Rocked: Frisch's “Propagation Problems and Impulse Problems”. History of Political Economy, 39(1), 145-166.

[5] Ramsey, A.S. (1929),

[6] Samuelson, P. (1947), Foundations of Economic Analysis, Cambridge, MA: Harvard University Press.

[7] Bishop, E. (1967) Foundations of Constructive Analysis, New York: Academic Press.

[8] Ljungqvist, L. and Sargent, T. (2018), Recursive Macroeconomic Theory, Fourth edition, Cambridge, MA: MIT Press.

[9] Russell, B. (2000), Autobiography, 2nd edition, London: Routledge.

[10] Russell, B., & Whitehead, A. (1973).

[11] Sraffa, P. (1960).

[12] Hodges, A. (1983), Alan Turing: The Enigma, London: Burnett Books/Hutchinson.

[13] Tarski, A. (1941),

[14] Savage, Leonard J. (1954),

[15] Simon, H. A., & Newell, A. (1958). Heuristic problem solving: The next advance in operations research.

[16] Algorithmic Social Sciences Research Unit, University of Trento, Italy.

[17] Myrdal, G. (1968),

[18] Keynes, J.M. (1936)

[19] Sraffa, P. (1960).

[20] Brouwer, L. E. J. (1981), Cambridge Lectures on Intuitionism, Cambridge: Cambridge University Press.

[21] Simon, H. (1991), Models of My Life, Cambridge, MA: MIT Press.

[22] Newell, A. and Simon, H. A. (1972), Human Problem Solving, Englewood Cliffs, NJ: Prentice-Hall, Inc.

Kumaraswamy Vela Velupillai was born in 1947 in Colombo, Sri Lanka. He obtained his undergraduate degree in Engineering from Kyoto University, Japan. He has a master's degree in Economics from the University of Lund, Sweden and a PhD in Economics from the University of Cambridge.

Professor Velupillai is an Emeritus Professor, formerly the *Professore di Chiara Fama* in the Department of Economics at the University of Trento, Italy, and Distinguished Professor of Economics at the New School for Social Research, USA. He is also a Senior Visiting Professor at the Madras School of Economics, India. He has held professorships and visiting positions at various universities in the UK, Europe, Asia and the USA.

He is the founder of Computable Economics, which attempts to mathematise economic theory using the algorithmic methods of recursion theory and constructive mathematics. He has also contributed to various other research areas such as macroeconomic theory, classical behavioural economics, mathematical economics, history, methodology and philosophy of economics.

Professor Velupillai has authored and/or edited 12 books and over 120 scholarly articles. His recent books include *Keynesian, Sraffian, Computable and Dynamic Economics* (Palgrave Macmillan, 2021), *Models of Simon* (Routledge, 2018) and *Computable Foundations for Economics* (Routledge, 2010). A Festschrift in Velupillai's honour, *Computable, Constructive and Behavioural Economic Dynamics* (Routledge, 2012), was edited by Stefano Zambelli.

A co-production of Goldsmiths Economics and ISRF.