Stephan Hartmann (Ludwig-Maximilians-Universität München)

4:10 pm, February 13, 2015

Faculty House, Columbia University

*Abstract.* The following are abstracts of two papers on which this talk is based.

The Problem of Old Evidence has troubled Bayesians ever since Clark Glymour first presented it in 1980. Several solutions have been proposed, but all of them have drawbacks and none is considered definitive. In this article, I propose a new solution which combines several old ideas with a new one. It circumvents the crucial omniscience problem in an elegant way and leads to a considerable confirmation of the hypothesis in question.

Modeling how to learn an indicative conditional has been a major challenge for formal epistemologists. One proposal to meet this challenge is to construct the posterior probability distribution by minimizing the Kullback-Leibler divergence between the posterior and the prior probability distribution, taking the learned information, expressed as a conditional probability statement, into account as a constraint. This proposal has been criticized in the literature based on several clever examples. In this article, we revisit four of these examples and show that one obtains intuitively correct results for the posterior probability distribution if the underlying probabilistic models reflect the causal structure of the scenarios in question.
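To get a concrete feel for the proposal, here is a minimal numerical sketch; the four-atom uniform prior and the learned value P(B|A) = 0.9 are invented for illustration, not taken from the paper. Because the constraint q(B|A) = c, i.e. (1−c)·q(A∧B) = c·q(A∧¬B), is linear in the posterior, Lagrange multipliers yield the KL-minimizer in closed form: the A-atoms receive an exponential tilt while the ¬A-atoms keep their prior ratios.

```python
# Atoms ordered (A&B, A&~B, ~A&B, ~A&~B); prior and learned value are
# hypothetical numbers chosen purely for illustration.
p = [0.25, 0.25, 0.25, 0.25]
c = 0.9  # learned constraint: P(B | A) = 0.9

# The linear constraint (1-c)*q[0] = c*q[1] gives, via Lagrange multipliers,
# a closed-form KL-minimizer: tilt the two A-atoms by t, leave the ~A-atoms
# proportional to the prior, then renormalize.
t = c * p[1] / ((1 - c) * p[0])
q = [p[0] * t ** (1 - c), p[1] * t ** (-c), p[2], p[3]]
Z = sum(q)
q = [qi / Z for qi in q]

print(round(q[0] / (q[0] + q[1]), 6))  # 0.9 -- the learned conditional holds
```

Note that the posterior preserves the prior ratio between the ¬A-atoms, which is exactly the Jeffrey-like behavior the minimization delivers in this simple case.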

Filed under: Seminar Tagged: conditionals, evidence, probability


Gregory Wheeler (Ludwig Maximilian University of Munich)

4:10 pm, October 31, 2014

Room 2, Faculty House, Columbia University

*Abstract.* Accuracy-first epistemology aims to supply non-pragmatic justifications for a variety of epistemic norms. The contemporary basis for accuracy-first epistemology is Jim Joyce’s program to reinterpret de Finetti’s scoring-rule arguments in terms of a “purely epistemic” notion of “gradational accuracy.” On Joyce’s account, scoring rules are taken to measure the accuracy of an agent’s belief state with respect to the true state of the world, where accuracy is conceived to be a pure epistemic good. Joyce’s non-pragmatic vindication of probabilism, then, is an argument to the effect that a measure of gradational accuracy satisfies conditions that are close enough to those necessary to run a de Finetti style coherence argument. A number of philosophers, including Hannes Leitgeb and Richard Pettigrew, have embraced Joyce’s program whole hog. Leitgeb and Pettigrew, for instance, have argued that Joyce’s program is too lax, and they have proposed conditions that narrow down the class of admissible gradational accuracy functions, while Pettigrew and his collaborators have sought to extend the list of epistemic norms receiving an accuracy-first treatment, a program that he calls Epistemic Decision Theory.
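The flavor of the scoring-rule argument can be illustrated with the Brier score (the numbers below are ours, not Joyce's): a probabilistically incoherent credence pair for A and ¬A is strictly less accurate, in every possible world, than a suitably chosen coherent pair.

```python
def brier(credences, truth_values):
    """Brier score: squared distance from the truth (lower = more accurate)."""
    return sum((c - v) ** 2 for c, v in zip(credences, truth_values))

incoherent = (0.6, 0.6)  # credences in A and in not-A, summing to 1.2
coherent = (0.5, 0.5)    # a coherent alternative (the nearest coherent pair)

# State-wise dominance: the coherent credences score strictly better
# whether A is true (truth vector (1, 0)) or false (truth vector (0, 1)).
for world in [(1, 0), (0, 1)]:
    assert brier(coherent, world) < brier(incoherent, world)
print("coherent credences dominate")
```

This is the sense in which, on Joyce's account, incoherence is an epistemic (not merely pragmatic) defect: the incoherent agent is needlessly inaccurate come what may.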

In this talk I report on joint work with Conor Mayo-Wilson that challenges the core doctrine of Epistemic Decision Theory, namely the proposal to supply a purely non-pragmatic justification for anything resembling the von Neumann and Morgenstern axioms for a numerical epistemic utility function. Indeed, we argue that none of the axioms necessary for Epistemic Decision Theory has a satisfactory non-pragmatic justification, and we point to reasons to suspect that not all the axioms could be given one. Our argument, if sound, has consequences for recent discussions of “pragmatic encroachment”, too. For if pragmatic encroachment is a debate about whether there is a pragmatic component to the justification condition of knowledge, our arguments may be viewed as addressing the true belief condition of (fallibilist) accounts of knowledge.

Filed under: Seminar

CUNY Graduate Center, Rm. 9207

October 14 and 15, 2014

Preliminary list of speakers:

Deirdre Wilson (UCL)

Laurence Horn (Yale)

Kent Bach (SFSU)

Robyn Carston (UCL)

Ariel Rubinstein (NYU and Tel Aviv)

CUNY:

Michael Devitt

Stephen Neale

Rohit Parikh

Students:

Marilynn Johnson (CUNY)

Ignacio Ojea (Columbia)

Todd Stambaugh (CUNY)

Cagil Tasdemir (CUNY)

Program here.

Filed under: Events Tagged: games, pragmatics

R. Ramanujam (Institute of Mathematical Sciences, India)

4:00 – 6:00 PM, Friday, June 2, 2014

Room 4421, CUNY GC

*Abstract.* We consider large games, in which the number of players is so large that outcomes are determined not by strategy profiles, but by distributions. In the model we study, a society player monitors choice distributions and intervenes periodically, leading to game changes. Rationality of individual players and that of the society player are mutually interdependent in such games. We discuss stability issues, and mention applications to infrastructure problems.

Filed under: Events Tagged: games, knowledge

Eric Pacuit (University of Maryland)

4:15 – 6:15 PM, Friday, May 9, 2014

Room 3309, CUNY GC

*Abstract.* It has long been noted that a voter can sometimes achieve a preferred election outcome by misrepresenting his or her actual preferences. In fact, the classic Gibbard-Satterthwaite Theorem shows that under very mild conditions, every voting method that is not a dictatorship is susceptible to manipulation by a single voter. One standard response to this important theorem is to note that a voter must possess information about the other voters’ preferences in order to decide to vote strategically. This seems to limit the “applicability” of the theorem. In this talk, I will survey some recent literature that aims at making this observation precise. This includes models of voting under uncertainty (about other voters’ preferences) and models that take into account how voters may respond to poll information.
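As a toy illustration of the theorem's premise (our own example, not one from the talk): under the Borda count with alphabetical tie-breaking, a voter who knows the other ballots can gain by "burying" the sincere winner below a candidate who cannot win.

```python
def borda_winner(ballots):
    """Borda count: a candidate ranked i-th on a ballot of n candidates
    gets n-1-i points; ties are broken alphabetically."""
    scores = {}
    for ballot in ballots:
        n = len(ballot)
        for rank, cand in enumerate(ballot):
            scores[cand] = scores.get(cand, 0) + (n - 1 - rank)
    best = max(scores.values())
    return min(c for c, s in scores.items() if s == best)

# Sincere ballots; voter 1's true preference is A > B > C.
sincere = [("A", "B", "C"), ("B", "A", "C"), ("C", "B", "A")]
print(borda_winner(sincere))  # B

# Voter 1 buries B (reports A > C > B) while the others vote sincerely.
strategic = [("A", "C", "B"), ("B", "A", "C"), ("C", "B", "A")]
print(borda_winner(strategic))  # A -- an outcome voter 1 strictly prefers
```

The manipulation only works because voter 1 knows the other two ballots, which is precisely the informational assumption the surveyed literature scrutinizes.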

Filed under: Events Tagged: games, knowledge, voting

Hannes Leitgeb (Ludwig Maximilian University of Munich)

4:15 pm, May 2nd, 2014

716 Philosophy Hall, Columbia University

*Abstract.* I am going to make precise, and assess, the following thesis on (all-or-nothing) belief and degrees of belief: it is rational to believe a proposition just in case it is rational to have a stably high degree of belief in it. I will start with some historical remarks, which are going to motivate calling this postulate the “Humean thesis on belief”. Once the thesis has been formulated in formal terms, it is possible to derive conclusions from it. I will highlight three of its consequences in particular: doxastic logic; an instance of what is sometimes called the Lockean thesis on belief; and a simple qualitative decision theory.
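The stability notion at issue can be made computational. On (what we take to be) Leitgeb's definition, a proposition X is P-stable just in case P(X | Y) > 1/2 for every positive-probability proposition Y consistent with X; the three-world credence function below is our own toy example, checked by brute force.

```python
from itertools import combinations

def p_stable(X, worlds, P):
    """Check stability: P(X | Y) > 1/2 for every positive-probability
    proposition Y that shares at least one world with X."""
    for r in range(1, len(worlds) + 1):
        for Y in combinations(worlds, r):
            if not set(Y) & set(X):
                continue  # Y is inconsistent with X; no constraint
            pY = sum(P[w] for w in Y)
            if pY == 0:
                continue
            pXY = sum(P[w] for w in Y if w in X)
            if pXY / pY <= 0.5:
                return False
    return True

P = {"w1": 0.7, "w2": 0.2, "w3": 0.1}  # hypothetical credences over 3 worlds
worlds = list(P)
print(p_stable(["w1"], worlds, P))  # True: {w1} stays probable under conditioning
print(p_stable(["w2"], worlds, P))  # False: conditioning on {w1, w2} sinks it
```

On the Humean thesis, only the stable propositions (here, those entailed by {w1}) are candidates for rational all-or-nothing belief.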

Filed under: Seminar Tagged: belief, probability

Andy Egan (Rutgers University)

4:10-6:00 PM, April 3rd, 2014

716 Philosophy Hall, Columbia University

*Reception will follow*

Filed under: Events Tagged: causal decision theory, decision, games

Arif Ahmed (University of Cambridge)

4:15 PM, April 4th, 2014

716 Philosophy Hall, Columbia University

*Abstract.* Most philosophers today prefer ‘Causal Decision Theory’ to Bayesian or other non-Causal Decision Theories. What explains this is the fact that in certain Newcomb-like cases, only Causal theories recommend an option on which you would have done better, whatever the state of the world had been. But if so, there are cases of sequential choice in which the same difficulty arises for Causal Decision Theory. Worse: under further light assumptions the Causal Theory faces a money pump in these cases. It may be illuminating to consider rational sequential choice as an intrapersonal game between one’s stages, and if time permits I will do this. In that light the difficulty for Causal Decision Theory appears to be that it allows, but its non-causal rivals do not allow, for Nash equilibria in such games that are Pareto inefficient.
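The dominance point can be made explicit with the standard Newcomb payoffs (the usual $1,000,000 / $1,000 figures, not numbers specific to this talk): relative to each fixed state of the prediction, two-boxing pays more, even though, with a reliable predictor, one-boxers end up richer in expectation.

```python
# Payoffs: acts are one-box / two-box; states are whether the predictor
# filled the opaque box (predicted one-boxing) or left it empty.
payoff = {
    ("one-box", "filled"): 1_000_000,
    ("one-box", "empty"): 0,
    ("two-box", "filled"): 1_001_000,
    ("two-box", "empty"): 1_000,
}

# State-wise dominance: whatever the state, two-boxing does strictly better.
for state in ("filled", "empty"):
    assert payoff[("two-box", state)] > payoff[("one-box", state)]

# Yet with a 99%-accurate predictor, the evidential expectations diverge:
acc = 0.99
ev_one = acc * payoff[("one-box", "filled")] + (1 - acc) * payoff[("one-box", "empty")]
ev_two = (1 - acc) * payoff[("two-box", "filled")] + acc * payoff[("two-box", "empty")]
print(ev_one, ev_two)  # 990000.0 11000.0
```

It is this tension, dominance reasoning pulling one way and expected payoff the other, that the sequential-choice and money-pump arguments in the talk turn against Causal Decision Theory itself.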

Filed under: Seminar Tagged: causal decision theory, decision, games