Formal Philosophy

Logic at Columbia University

Synthese S.I. on Decision Theory and the Future of Artificial Intelligence

by Yang Liu

Guest Editors:
Stephan Hartmann (LMU Munich)
Yang Liu (University of Cambridge)
Huw Price (University of Cambridge)

Description:
There is increasing interest in the challenges of ensuring that the long-term development of artificial intelligence (AI) is safe and beneficial. Moreover, despite their different perspectives, there is much common ground between mathematical and philosophical decision theory, on the one hand, and AI, on the other. The aim of this special issue is to explore links between decision theory and AI, broadly construed, and to encourage joint research at their intersection.

We welcome submissions of individual papers covering topics in philosophy, artificial intelligence, and cognitive science that involve decision making, including, but not limited to:

  • causality
  • decision making with bounded resources
  • foundations of probability theory
  • philosophy of machine learning
  • philosophical and mathematical decision/game theory

Submissions:
Contributions must be original and not under review elsewhere. Although there is no prescribed word or page limit for submissions to Synthese, as a rule of thumb, papers tend to run between 15 and 30 printed pages (in the journal’s printed format). Submissions should also include a separate title page containing the contact details of the author(s), an abstract (150-250 words), and a list of 4-6 keywords. All papers will be subject to the journal’s standard double-blind peer review.

Manuscripts should be submitted online through Editorial Manager: https://www.editorialmanager.com/synt. Please choose the appropriate article type for your submission by selecting “S.I. : DecTheory&FutOfAI” from the relevant drop-down menu.

Deadline:
The deadline for submissions is February 15, 2018.
For further information about the special issue, please visit the website: http://www.decision-ai.org/cfp/

Vasudevan: Entropy and Insufficient Reason

by Robby

UNIVERSITY SEMINAR ON LOGIC, PROBABILITY, AND GAMES
Entropy and Insufficient Reason
Anubav Vasudevan (University of Chicago)
4:10 pm, Friday, November 10th, 2017
Faculty House, Columbia University

Abstract. One well-known objection to the principle of maximum entropy is the so-called Judy Benjamin problem, first introduced by van Fraassen (1981). The problem turns on the apparently puzzling fact that, on the basis of information relating an event’s conditional probability, the maximum entropy distribution will almost always assign to the conditioning event a probability strictly less than that assigned to it by the uniform distribution. In this paper, I present an analysis of the Judy Benjamin problem that can help to make sense of this seemingly odd feature of maximum entropy inference. My analysis is based on the claim that, in applying the principle of maximum entropy, Judy Benjamin is not acting out of a concern to maximize uncertainty in the face of new evidence, but is rather exercising a certain brand of epistemic charity towards her informant. This charity takes the form of an assumption on the part of Judy Benjamin that her informant’s evidential report leaves out no relevant information. I will explain how this single assumption suffices to rationalize Judy Benjamin’s behavior. I will then explain how such a re-conceptualization of the motives underlying Judy Benjamin’s appeal to the principle of maximum entropy can further our understanding of the relationship between this principle and the principle of insufficient reason. I will conclude with a discussion of the foundational significance for probability theory of ergodic theorems (e.g., de Finetti’s theorem) describing the asymptotic behavior of measure-preserving transformation groups. In particular, I will explain how these results, which serve as the basis of maximum entropy inference, can provide a unified conceptual framework in which to justify both a priori and a posteriori probabilistic reasoning.
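The formal pattern at issue can be seen in a minimal worked instance. The three-cell setup and the 3:1 odds below are illustrative assumptions in the spirit of van Fraassen’s original example, not details taken from the talk. Let $p_1 = P(R \cap HQ)$, $p_2 = P(R \cap Sec)$, $p_3 = P(B)$, so the uniform distribution gives $P(R) = p_1 + p_2 = 2/3$. The informant’s report $P(HQ \mid R) = 3/4$ amounts to the constraint $p_1 = 3p_2$. Maximum entropy then solves

\[
\max_{p}\; H(p) = -\sum_{i=1}^{3} p_i \log p_i
\quad\text{subject to}\quad p_1 = 3p_2, \;\; p_1 + p_2 + p_3 = 1.
\]

Writing $p_2 = t$, $p_1 = 3t$, $p_3 = 1 - 4t$, the first-order condition $H'(t) = 0$ reduces to $(1-4t)^4 = 27t^4$, whence

\[
P(R) = 4t = \frac{4}{4 + 27^{1/4}} \approx 0.637 < \tfrac{2}{3},
\]

so the conditioning event $R$ ends up with strictly less probability than the uniform distribution gives it, which is the seemingly odd feature the abstract addresses.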

Button: Internal categoricity and internal realism in the philosophy of mathematics

by Robby

UNIVERSITY SEMINAR ON LOGIC, PROBABILITY, AND GAMES
Internal categoricity and internal realism in the philosophy of mathematics

Tim Button (University of Cambridge)
4:10 pm, Wednesday, April 19th, 2017
Faculty House, Columbia University

Abstract. Many philosophers think that mathematics is about ‘structure’. Many philosophers would also explicate this notion of ‘structure’ via model theory. But the Compactness and Löwenheim–Skolem theorems lead to some famously hard questions for this view. They threaten to leave us unable to talk about any particular ‘structure’.

In this talk, I outline how we might explicate ‘structure’ without appealing to model theory, and indeed without invoking any kind of semantic ascent. The approach involves making use of internal categoricity. I will outline the idea of internal categoricity, state some results, and use these results to make sense of Putnam’s beautiful but cryptic claim: “Models are not lost noumenal waifs looking for someone to name them; they are constructions within our theory itself, and they have names from birth.”
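To fix ideas, the flagship internal categoricity result has roughly the following shape (a sketch following the literature, e.g. Parsons and Button–Walsh, rather than a formulation taken from the talk). Working deductively in second-order logic with comprehension, one can prove

\[
\forall N_1\, \forall N_2\, \bigl[\, \mathrm{PA}(N_1) \wedge \mathrm{PA}(N_2) \;\rightarrow\; \exists R\; \mathrm{Iso}(R, N_1, N_2) \,\bigr],
\]

where $\mathrm{PA}(N)$ says that $N$ (with its designated zero and successor) satisfies the second-order Peano axioms, and $\mathrm{Iso}(R, N_1, N_2)$ says that $R$ is an isomorphism between the two structures. The point is that this is a theorem of the object theory itself, proved without any appeal to model theory or semantic ascent, which is what licenses talk of “the” natural number structure from within.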

Columbia Festival of Formal Philosophy

by Yang Liu

A series of logic-related talks will take place at Columbia University over the next few weeks. Please follow the link for each talk series below for more information.

SUPPES LECTURES
by Kenny Easwaran (Texas A&M University)

Graduate Workshop
Measuring Beliefs
3:00 pm – 5:00 pm, Friday, March 31, 2017
716 Philosophy Hall, Columbia University

Departmental Lecture
An Opinionated Introduction to the Foundations of Bayesianism
4:10 pm – 6:00 pm, Tuesday, April 4, 2017
716 Philosophy Hall, Columbia University
Reception to follow in 720 Philosophy Hall

Public Lecture
Unity in Diversity: “The City as a Collective Agent”
4:10 pm – 6:00 pm, Thursday, April 6, 2017
603 Hamilton Hall, Columbia University

UNIVERSITY SEMINAR ON LOGIC, PROBABILITY, AND GAMES
Gödel’s Disjunction
Peter Koellner (Harvard University)
5:00 pm, Friday, April 7th, 2017
716 Philosophy Hall, Columbia University
Dinner to follow at Faculty House

WORKSHOP ON PROBABILITY AND LEARNING
Saturday, April 8th, 2017
716 Philosophy Hall, Columbia University

10:00 am – 11:30 am
Typical!
Gordon Belot (University of Michigan)

11:45 am – 1:15 pm
Schnorr Randomness and Lévy’s Martingale Convergence Theorem
Simon Huttegger (UC Irvine)

2:45 pm – 4:15 pm
Probing With Severity: Beyond Bayesian Probabilism and Frequentist Performance
Deborah Mayo (Virginia Tech)

4:30 pm – 6:00 pm
Radically Elementary Imprecise Probability Based on Extensive Measurement
Teddy Seidenfeld (Carnegie Mellon University)
Reception to follow

Koellner: Gödel’s Disjunction

by Robby

UNIVERSITY SEMINAR ON LOGIC, PROBABILITY, AND GAMES
Gödel’s Disjunction
Peter Koellner (Harvard University)
5:00 pm, Friday, April 7th, 2017
716 Philosophy Hall, Columbia University

Abstract. Gödel’s disjunction asserts that either “the mind cannot be mechanized” or “there are absolutely undecidable statements.” Arguments for and against each disjunct are examined in the context of precise frameworks governing the notions of absolute provability and truth. The focus is on Penrose’s new argument, which interestingly involves type-free truth. In order to reconstruct Penrose’s argument, a system, DKT, is devised for absolute provability and type-free truth. It turns out that in this setting there are actually two versions of the disjunction and its disjuncts. The first, fully general versions end up being (provably) indeterminate. The second, restricted versions end up being (provably) determinate, and so, in this case, there is at least an initial prospect of success. However, it will be seen that although the disjunction itself is provable, neither disjunct is provable or refutable in the framework.
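For orientation, the disjunction is often given a schematic formalization along the following lines (the notation here is generic, not Koellner’s system DKT). Writing $K\varphi$ for “$\varphi$ is absolutely provable” and $W_e$ for the $e$-th recursively enumerable set,

\[
\neg\,\exists e\; \bigl(\{\varphi : K\varphi\} = W_e\bigr)
\;\;\vee\;\;
\exists \varphi\; \bigl(\mathrm{True}(\varphi) \wedge \neg K\varphi \wedge \neg K\neg\varphi\bigr),
\]

that is: either the set of absolutely provable statements is not recursively enumerable (no machine generates exactly what the mind can prove), or some true statement is absolutely undecidable. Gödel’s incompleteness theorems yield the disjunction itself while leaving both disjuncts open, which is the situation the restricted versions in DKT reproduce.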

Workshop on Probability and Learning

by Rush Stewart

COLUMBIA WORKSHOP ON PROBABILITY AND LEARNING
Saturday, April 8th, 2017
716 Philosophy Hall, Columbia University

10:00 am – 11:30 am
Gordon Belot (University of Michigan)
Typical!
Abstract. This talk is divided into three short stories. The overarching themes are: (i) that the notion of typicality is protean; (ii) that Bayesian technology is both more and less rigid than is sometimes thought.
Slides
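One way to see the protean character of typicality (an illustrative gloss, not drawn from the talk itself): the two standard explications need not agree. Call a property $\Phi$ measure-typical when $\mu(\{x : \neg\Phi(x)\}) = 0$, and category-typical when $\{x : \Phi(x)\}$ is comeager:

\[
\text{measure-typical:}\;\; \mu(\{x : \neg\Phi(x)\}) = 0
\qquad\text{vs.}\qquad
\text{category-typical:}\;\; \{x : \Phi(x)\} \text{ comeager}.
\]

By a classical result, the real line can be partitioned into a Lebesgue-null set and a meager set, so a property can be typical in the one sense while being as atypical as possible in the other.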

11:45 am – 1:15 pm
Simon Huttegger (UC Irvine)
Schnorr Randomness and Lévy’s Martingale Convergence Theorem
Abstract. Much recent work in algorithmic randomness concerns characterizations of randomness in terms of the almost-everywhere behavior of suitably effectivized versions of functions from analysis or probability. In this talk, we take a look at Lévy’s Martingale Convergence Theorem from this perspective. Lévy’s theorem is of fundamental importance to Bayesian epistemology. We note that much of Pathak, Rojas, and Simpson’s work on Schnorr randomness and the Lebesgue Differentiation Theorem in the Euclidean context carries over to Lévy’s Martingale Convergence Theorem in the Cantor space context. We discuss the methodological choices one faces in choosing the appropriate mode of effectivization and the potential bearing of these results on Schnorr’s critique of Martin-Löf. We also discuss the consequences of our result for the Bayesian model of learning.
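For reference, the classical result in question is Lévy’s upward martingale convergence theorem. In its standard form (stated here for a general filtered probability space, with no effectivization): for integrable $f$ and a filtration $(\mathcal{F}_n)$ with $\mathcal{F}_\infty = \sigma\bigl(\bigcup_n \mathcal{F}_n\bigr)$,

\[
\mathbb{E}[f \mid \mathcal{F}_n] \;\longrightarrow\; \mathbb{E}[f \mid \mathcal{F}_\infty]
\quad \text{almost surely and in } L^1.
\]

On the Bayesian reading, $\mathbb{E}[f \mid \mathcal{F}_n]$ is the agent’s estimate of $f$ after $n$ observations, so the theorem promises almost-sure convergence of opinion; the effectivization project asks on exactly which individual data streams (for instance, the Schnorr random ones) that convergence is guaranteed to hold.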

1:15 pm – 2:45 pm
Lunch

2:45 pm – 4:15 pm
Deborah Mayo (Virginia Tech)
Probing With Severity: Beyond Bayesian Probabilism and Frequentist Performance
Abstract. Getting beyond today’s most pressing controversies revolving around statistical methods and irreproducible findings requires scrutinizing underlying statistical philosophies. Two main philosophies about the roles of probability in statistical inference are probabilism and performance (in the long run). The first assumes that we need a method of assigning probabilities to hypotheses; the second assumes that the main function of statistical method is to control long-run performance. I offer a third goal: controlling and evaluating the probativeness of methods. A statistical inference, in this conception, takes the form of inferring hypotheses to the extent that they have been well or severely tested. A report of poorly tested claims must also be part of an adequate inference. I show how the “severe testing” philosophy clarifies and avoids familiar criticisms and abuses of significance tests and cognate methods (e.g., confidence intervals). Severity may be threatened in three main ways: fallacies of rejection and non-rejection, unwarranted links between statistical and substantive claims, and violations of model assumptions. I illustrate with some controversies surrounding the use of significance tests in the discovery of the Higgs particle in high-energy physics.
Slides
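As a rough gloss on the severity idea (this follows Mayo and Spanos’s published formulation; the notation is generic, not taken from the talk): a claim $C$ passes a severe test with data $x_0$ just in case (i) $x_0$ accords with $C$, and (ii) with high probability the test would have produced a result according less well with $C$ if $C$ were false. For a one-sided Normal test of $\mu \le \mu_0$ against $\mu > \mu_0$ with test statistic $d(X)$, the severity attached to the claim $\mu > \mu_1$ given outcome $x_0$ is

\[
\mathrm{SEV}(\mu > \mu_1) \;=\; \Pr\bigl(d(X) \le d(x_0)\,;\, \mu = \mu_1\bigr),
\]

so that, unlike a p-value or a long-run error rate, severity is evaluated relative to the particular inference and the particular data in hand.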

4:30 pm – 6:00 pm
Teddy Seidenfeld (Carnegie Mellon University)
Radically Elementary Imprecise Probability Based on Extensive Measurement
Abstract. This presentation begins with motivation for “precise” non-standard probability. Using two old challenges — involving (i) symmetry of probabilistic relevance and (ii) respect for weak dominance — I contrast the following three approaches to conditional probability given a (non-empty) “null” event and their three associated decision theories.
Approach #1 – Full Conditional Probability Distributions (Dubins, 1975) conjoined with Expected Utility.
Approach #2 – Lexicographic Probability conjoined with Lexicographic Expected Value (e.g., Blume et al., 1991).
Approach #3 – Non-standard Probability and Expected Utility based on Non-Archimedean Extensive Measurement (Narens, 1974).
The second part of the presentation discusses progress we’ve made using Approach #3 within a context of Imprecise Probability.
Slides
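For context on Approach #1: a Dubins-style full conditional probability on an algebra $\mathcal{A}$ of subsets of $\Omega$ is a map $P(\cdot \mid \cdot)$, defined for every $A \in \mathcal{A}$ and every non-empty $B \in \mathcal{A}$, satisfying (this is the standard axiomatization, not notation from the talk):

\[
\begin{aligned}
&\text{(i)}\quad P(\cdot \mid B) \text{ is a finitely additive probability with } P(B \mid B) = 1;\\
&\text{(ii)}\quad P(A \cap B \mid C) \;=\; P(A \mid B \cap C)\, P(B \mid C) \quad \text{whenever } B \cap C \neq \emptyset.
\end{aligned}
\]

Because $P(A \mid B)$ is a primitive rather than the ratio $P(A \cap B)/P(B)$, it remains well defined when $B$ is a “null” event, i.e. when $P(B \mid \Omega) = 0$, which is precisely the case the three approaches handle differently.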

Reception to follow