CUNY SEMINAR IN LOGIC AND GAMES
Learning Conditionals
Soroush Rafiee Rad (Tilburg University)
Thursday, July 18, 4:15 PM
Room 4421, CUNY GC
Abstract. Modeling how an agent should learn an indicative conditional has been a major challenge for formal epistemologists. One proposal for meeting this challenge is to require that the posterior probability distribution minimize the Kullback-Leibler divergence to the prior probability distribution, taking the learned information, expressed as a conditional probability statement, as a constraint. This proposal has been criticized in the literature on the basis of several clever examples. In this paper, we revisit four of these examples and show that one obtains intuitively correct results for the posterior probability distribution if the underlying probabilistic models reflect the causal structure of the scenarios in question.
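
To make the proposal concrete, here is a minimal sketch (not the speaker's code) of the update rule the abstract describes: minimizing the Kullback-Leibler divergence to a prior, subject to a learned conditional-probability constraint. The four-world setup, the prior values, and the constraint Q(B|A) = 0.9 are illustrative assumptions, not taken from the talk.

    # Sketch: KL-minimization update after learning "if A then B",
    # encoded as the constraint Q(B | A) = 0.9. Prior values are hypothetical.
    import numpy as np
    from scipy.optimize import minimize

    # Four possible worlds, in the order (A,B), (A,~B), (~A,B), (~A,~B)
    prior = np.array([0.3, 0.3, 0.2, 0.2])  # hypothetical prior distribution
    q = 0.9                                  # learned constraint: Q(B | A) = q

    def kl(Q):
        """Kullback-Leibler divergence D(Q || prior)."""
        Q = np.clip(Q, 1e-12, 1.0)  # avoid log(0)
        return float(np.sum(Q * np.log(Q / prior)))

    constraints = [
        {"type": "eq", "fun": lambda Q: Q.sum() - 1.0},             # normalization
        {"type": "eq", "fun": lambda Q: Q[0] - q * (Q[0] + Q[1])},  # Q(B|A) = q
    ]

    res = minimize(kl, prior, bounds=[(0, 1)] * 4,
                   constraints=constraints, method="SLSQP")
    print(res.x)  # posterior closest (in KL divergence) to the prior

The debate the talk addresses is whether posteriors produced this way match intuition; its claim is that they do, provided the probabilistic model over the worlds encodes the causal structure of the scenario.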