MCMP – Mathematical Philosophy (Archive 2011/12)


Mathematical Philosophy - the application of logical and mathematical methods in philosophy - is about to experience a tremendous boom in various areas of philosophy. At the new Munich Center for Mathematical Philosophy, which is funded mostly by the German Alexander von Humboldt Foundation, philoso…

MCMP Team

  • Latest episode: Apr 22, 2019
  • New episodes: infrequent
  • Average duration: 50m
  • Episodes: 250



Latest episodes from MCMP – Mathematical Philosophy (Archive 2011/12)

Modality and Categories

Apr 22, 2019 · 62:00


Steve Awodey (CMU/MCMP) gives a talk at the MCMP Workshop on Modality titled "Modality and Categories".

Adaptive Logics: Introduction, Applications, Computational Aspects and Recent Developments

Apr 22, 2019 · 78:41


Peter Verdée (Ghent) gives a talk at the MCMP Colloquium (8 Feb, 2012) titled "Adaptive Logics: Introduction, Applications, Computational Aspects and Recent Developments". Abstract: Peter Verdée (peter.verdee@ugent.be), Centre for Logic and Philosophy of Science, Ghent University, Belgium. In this talk I give a thorough introduction to adaptive logics (cf. [1, 2, 3]). Adaptive logics were first devised by Diderik Batens and are now the main research area of the logicians in the Centre for Logic and Philosophy of Science in Ghent. First I explain the main purpose of adaptive logics: formalizing defeasible reasoning in a unified way, aiming at a normative account of fallible rationality. I give an informal characterization of what we mean by the notion 'defeasible reasoning' and explain why it is useful and interesting to formalize this type of reasoning by means of logics. Then I present the technical machinery of the so-called standard format of adaptive logics. The standard format is a general way to define adaptive logics from three basic variables. Most existing adaptive logics can be defined within this format. It immediately provides the logics with a dynamic proof theory, a selection semantics and a number of important meta-theoretic properties. I proceed by giving some popular concrete examples of adaptive logics in standard form. I quickly introduce inconsistency-adaptive logics, adaptive logics for induction and adaptive logics for reasoning with plausible knowledge/beliefs. Next I present some computational results on adaptive logics. Adaptive consequence relations are in general rather complex (I proved that there are recursive premise sets such that their adaptive consequence sets are Π¹₁-complex – cf. [4]). However, I argue that this does not harm the naturalistic aims of adaptive logics, given a specific view on the relation between actual reasoning and adaptive logics. Finally, two interesting recent developments are presented: (1) Lexicographic adaptive logics. They fall outside of the scope of the standard format, but have similar properties and are able to handle prioritized information. (2) Adaptive set theories. Such theories start from the unrestricted comprehension axiom scheme but are strong enough to serve as a foundation for an interesting part of classical mathematics, by treating the paradoxes in a novel, defeasible way.

Belief Dynamics under Iterated Revision: Cycles, Fixed Points and Truth-tracking

Apr 20, 2019 · 79:47


Sonja Smets (University of Groningen) gives a talk at the MCMP Colloquium titled "Belief Dynamics under Iterated Revision: Cycles, Fixed Points and Truth-tracking". Abstract: We investigate the long-term behavior of processes of learning by iterated belief-revision with new truthful information. In the case of higher-order doxastic sentences, the iterated revision can even be induced by repeated learning of the same sentence (which conveys new truths at each stage by referring to the agent's own current beliefs at that stage). For a number of belief-revision methods (conditioning, lexicographic revision and minimal revision), we investigate the conditions in which iterated belief revision with truthful information stabilizes: while the process of model-changing by iterated conditioning always leads eventually to a fixed point (and hence all doxastic attitudes, including conditional beliefs, strong beliefs, and any form of "knowledge", eventually stabilize), this is not the case for other belief-revision methods. We show that infinite revision cycles exist (even when the initial model is finite and even in the case of repeated revision with one single true sentence), but we also give syntactic and semantic conditions ensuring that beliefs stabilize in the limit. Finally, we look at the issue of convergence to truth, giving both sufficient conditions ensuring that revision stabilizes on true beliefs, and (stronger) conditions ensuring that the process stabilizes on "full truth" (i.e. beliefs that are both true and complete). This talk is based on joint work with A. Baltag.
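
For readers unfamiliar with the three revision policies named in the abstract, the sketch below shows how they act on a finite plausibility order under the standard textbook definitions (a total preorder represented as layers of equally plausible worlds, most plausible first). This is illustrative background only, not the speakers' own formal framework, and all names in it are ours.

# Sketch of three belief-revision policies on a finite plausibility order,
# following the standard definitions (illustrative only).
# A model is a list of "layers": sets of equally plausible worlds,
# most plausible layer first. P is the set of worlds where the new
# information holds (assumed to overlap the model).

def conditioning(layers, P):
    """Keep only P-worlds; relative plausibility is unchanged."""
    return [layer & P for layer in layers if layer & P]

def lexicographic(layers, P):
    """All P-worlds become strictly more plausible than all non-P-worlds;
    the old order is kept within each group."""
    return ([layer & P for layer in layers if layer & P] +
            [layer - P for layer in layers if layer - P])

def minimal(layers, P):
    """Only the most plausible P-worlds are promoted to the top;
    everything else keeps its old position."""
    best = next(layer & P for layer in layers if layer & P)
    rest = [layer - best for layer in layers if layer - best]
    return [best] + rest

# Example: w1 initially most plausible; revise with P = {w2, w3}.
layers = [{"w1"}, {"w2"}, {"w3"}]
P = {"w2", "w3"}
print(conditioning(layers, P))   # [{'w2'}, {'w3'}]
print(lexicographic(layers, P))  # [{'w2'}, {'w3'}, {'w1'}]
print(minimal(layers, P))        # [{'w2'}, {'w1'}, {'w3'}]

Iterating such operators on a fixed model is what the abstract's questions about cycles and fixed points concern.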

Tracking the Truth Requires a Non-wellfounded Prior!

Apr 20, 2019 · 92:01


Alexandru Baltag (ILLC Amsterdam) gives a talk at the MCMP Colloquium titled "Tracking the Truth Requires a Non-wellfounded Prior! A Study in the Learning Power (and Limits) of Bayesian (and Qualitative) Update". Abstract: The talk is about tracking "full truth" in the limit by iterated belief updates. Unlike Sonja's talk (which focused on finite models), we now allow the initial model (and thus the initial set of epistemic possibilities) to be infinite. We compare the truth-tracking power of various belief-revision methods, including probabilistic conditioning (also known as Bayesian update) and some of its qualitative, "plausibilistic" analogues (conditioning, lexicographic revision, minimal revision). We focus in particular on the question of whether any of these methods is "universal" (i.e. as good at tracking the truth as any other learning method). We show that this is not the case, as long as we keep the standard probabilistic (or belief-revision) setting. On the positive side, we show that if we consider appropriate generalizations of conditioning in a non-standard, non-wellfounded setting, then universality is achieved for some (though not all) of these learning methods. In the qualitative case, this means that we need to allow the prior plausibility relation to be a non-wellfounded (though total) preorder. In the probabilistic case, this means moving to a generalized conditional probability setting, in which the family of "cores" (or "strong beliefs") may be non-wellfounded (when ordered by inclusion or logical entailment). As a consequence, neither the family of classical probability spaces, nor that of lexicographic probability spaces, nor even the family of all countably additive (conditional) probability spaces, is rich enough to make Bayesian conditioning "universal", from a Learning Theoretic point of view! This talk is based on joint work with Nina Gierasimczuk and Sonja Smets.

Possible Worlds, The Lewis Principle, and the Myth of a Large Ontology

Apr 20, 2019 · 54:28


Ed Zalta (Stanford) gives a talk at the MCMP Workshop on Modality titled "Possible Worlds, The Lewis Principle, and the Myth of a Large Ontology".

Accuracy, Chance, and the Principal Principle

Apr 20, 2019 · 69:53


Richard Pettigrew (University of Bristol) gives a talk at the MCMP Colloquium titled "Accuracy, Chance, and the Principal Principle".

Theory and Concept in Tarski's Philosophy of Language

Apr 20, 2019 · 60:29


Douglas Patterson (Universität Leipzig) gives a talk at the MCMP Colloquium titled "Theory and Concept in Tarski's Philosophy of Language". Abstract: In this talk I will set out some of the background of Tarski's famous work on truth and semantics by looking at important views of his teachers Tadeusz Kotarbinski and Stanislaw Lesniewski in the philosophy of language and the "methodology of deductive sciences". With the understanding of the assumed philosophy of language and logic of the important articles set out in this manner, I will look at a number of issues familiar from the literature. I will sort out Tarski's conception of "material adequacy", discuss the relationship between a Tarskian definition of truth and a conceptual analysis of a more familiar sort, and consider the consequences of the views presented for the question of whether Tarski was a deflationist or a correspondence theorist.

The 'fitting problem' for logical semantic systems

Apr 20, 2019 · 69:02


Catarina Dutilh Novaes (ILLC/Amsterdam) gives a talk at the MCMP Colloquium titled "The 'fitting problem' for logical semantic systems". Abstract: When applying logical tools to study a given extra-theoretical, informal phenomenon, it is now customary to design a deductive system, and a semantic system based on a class of mathematical structures. The assumption seems to be that they would each capture specific aspects of the target phenomenon. Kreisel has famously offered an argument on how, if there is a proof of completeness for the deductive system with respect to the semantic system, the target phenomenon becomes "squeezed" between the extension of the two, thus ensuring the extensional adequacy of the technical apparatuses with respect to the target phenomenon: the so-called squeezing argument. However, besides a proof of completeness, for the squeezing argument to go through, two premises must obtain (for a fact e occurring within the range of the target phenomenon): (1) If e is the case according to the deductive system, then e is the case according to the target phenomenon. (2) If e is the case according to the target phenomenon, then e is the case according to the semantic system. In other words, the semantic system would provide the necessary conditions for e to be the case according to the target phenomenon, while the deductive system would provide the relevant sufficient conditions. But clearly, both (1) and (2) rely crucially on the intuitive adequacy of the deductive and the semantic systems for the target phenomenon. In my talk, I focus on the (in)plausibility of instances of (2), and argue that the adequacy of a semantic system for a given target phenomenon must not be taken for granted. In particular, I discuss the results presented in (Andrade-Lotero & Dutilh Novaes forthcoming) on multiple semantic systems for Aristotelian syllogistic, which are all sound and complete with respect to a reasonable deductive system for syllogistic (Corcoran's system D), but which are not extensionally equivalent; indeed, as soon as the language is enriched, they start disagreeing with each other as to which syllogistic arguments (in the enriched language) are valid. A plurality of apparently adequate semantic systems for a given target phenomenon brings to the fore what I describe as the "fitting problem" for logical semantic systems: what is to guarantee that these technical apparatuses adequately capture significant aspects of the target phenomenon? If the different candidates have strikingly different properties (as is the case here), then they cannot all be adequate semantic systems for the target phenomenon. More generally, the analysis illustrates the need for criteria of adequacy for semantic systems based on mathematical structures. Moreover, taking Aristotelian syllogistic as a case study illustrates the fruitfulness but also the complexity of employing logical tools in historical analyses.
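
In schematic form (our notation, not the speaker's): writing D(e), V(e) and M(e) for "e holds according to the deductive system, the target phenomenon, and the semantic system" respectively, premises (1) and (2) together with completeness yield Kreisel's squeeze:

\[
\underbrace{D(e) \Rightarrow V(e)}_{\text{premise (1)}}, \qquad
\underbrace{V(e) \Rightarrow M(e)}_{\text{premise (2)}}, \qquad
\underbrace{M(e) \Rightarrow D(e)}_{\text{completeness}},
\]

so the three notions coincide in extension and the informal notion V is "squeezed" between D and M.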

The conservativity of truth and the disentanglement of syntax and semantics

Apr 20, 2019 · 54:30


Volker Halbach (Oxford) gives a talk at the MCMP Colloquium titled "The conservativity of truth and the disentanglement of syntax and semantics".

Cognitive motivations for treating formalisms as calculi

Apr 20, 2019 · 75:26


Catarina Dutilh Novaes (ILLC/Amsterdam) gives a talk at the MCMP Colloquium titled "Cognitive motivations for treating formalisms as calculi". Abstract: In The Logical Syntax of Language, Carnap famously recommended that logical languages be treated as mere calculi, and that their symbols be viewed as meaningless; reasoning with the system is to be guided solely on the basis of its rules of transformation. Carnap's main motivation for this recommendation seems to be related to a concern with precision and exactness. In my talk, I argue that Carnap was right in insisting on the benefits of treating logical formalisms as calculi, but he was wrong in thinking that enhanced precision is the main advantage of this approach. Instead, I argue that a deeper impact of treating formalisms as calculi is of a cognitive nature: by adopting this stance, the reasoner is able to counter some of her "default" reasoning tendencies, which (although advantageous in most practical situations) may hinder the discovery of novel facts in scientific contexts. One of these cognitive tendencies is the constant search for confirmation for the beliefs one already holds, as extensively documented and studied in the psychology of reasoning literature, and often referred to as confirmation bias/belief bias. Treating formalisms as meaningless and relying on their well-defined rules of formation and transformation allows the reasoner to counter her own belief bias for two main reasons: it 'switches off' semantic activation, which is thought to be a largely automatic cognitive process, and it externalizes reasoning processes; they now take place largely through the manipulation of the notation. I argue moreover that the manipulation of the notation engages predominantly sensorimotor processes rather than being carried out internally: the agent is literally 'thinking on the paper'. The analysis relies heavily on empirical data from psychology and cognitive sciences, and is largely inspired by recent literature on extended cognition (in particular Clark, Menary and Sutton). If I am right, formal languages treated as calculi and viewed as external cognitive artifacts offer a crucial cognitive boost to human agents, in particular in that they seem to produce a beneficial de-biasing effect.

Do 'Looks' Reports Reflect the Contents of Perception?

Apr 20, 2019 · 47:41


Berit Brogaard (University of Missouri, St. Louis) gives a talk at the MCMP Colloquium titled "Do 'Looks' Reports Reflect the Contents of Perception?".

On the Emergence of Descriptive Norms

Apr 20, 2019 · 48:56


Stephan Hartmann (Tilburg) gives a talk at the MCMP Workshop on Computational Metaphysics titled "On the Emergence of Descriptive Norms".

Conclusive Reasons, Transmission, and Epistemic Closure

Apr 20, 2019 · 38:13


Charles B. Cross (University of Georgia) gives a talk at the MCMP Colloquium titled "Conclusive Reasons, Transmission, and Epistemic Closure".

Hume on Space and Geometry

Apr 20, 2019 · 48:22


Graciela di Pierris (Stanford) gives a talk at the MCMP Colloquium titled "Hume on Space and Geometry". Abstract: Hume's discussion of space, time, and mathematics in Part II of Book I of the Treatise has appeared to many commentators as one of the weakest parts of his work. I argue, on the contrary, that Hume's views on space and geometry are deeply connected with his radically empiricist reliance on phenomenologically given sensory images. He insightfully shows that, working within this epistemological model, we cannot attain complete certainty about the continuum but only at most about discrete quantity. Therefore, geometry, in contrast to arithmetic, cannot be a fully exact science. Nevertheless, Hume does have an illuminating account of Euclid's geometry as an axiomatic demonstrative science, ultimately based on the phenomenological apprehension of the "easiest and least deceitful" sensory images of geometrical figures. Hume's discussion, in my view, demonstrates the severe limitations of a purely empiricist interpretation of the role of such figures (diagrams) in geometry.

Toward Leibniz's Goal of a Computational Metaphysics

Apr 20, 2019 · 48:20


Ed Zalta (CSLI Stanford) gives a talk at the MCMP Workshop on Computational Metaphysics titled "Toward Leibniz's Goal of a Computational Metaphysics".

Russellian Descriptions & Gibbardian Indicatives (Two Case Studies Involving Automated Reasoning)

Apr 20, 2019 · 45:43


Branden Fitelson (Rutgers University) gives a talk at the MCMP Workshop on Computational Metaphysics titled "Russellian Descriptions & Gibbardian Indicatives (Two Case Studies Involving Automated Reasoning)". Abstract: The first part of this talk (which is joint work with Paul Oppenheimer) will be about the perils of representing claims involving Russellian definite descriptions in an "automated reasoning friendly" way. I will explain how to eliminate Russellian descriptions, so as to yield logically equivalent (and automated reasoning friendly) statements. This is a special case of a more general problem -- which is representing philosophical theories/explications in a way that automated reasoning tools can understand. The second part of the talk shows how automated reasoning tools can be useful in clarifying the structure (and requisite presuppositions) of well-known philosophical "theorems". Here, the example comes from the philosophy of language, and it involves a certain "triviality result" or "collapse theorem" for the indicative conditional that was first discussed by Gibbard. I show how one can use automated reasoning tools to provide a precise, formal rendition of Gibbard's "theorem". This turns out to be rather revealing about what is (and is not) essential to Gibbard's argument.
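
For context, the textbook Russellian elimination of a definite description, which the first part of the talk builds on (the exact automated-reasoning-friendly form used in the talk may differ), renders "the F is G" as:

\[
G(\iota x\, F(x)) \;\rightsquigarrow\; \exists x \bigl( F(x) \wedge \forall y\,(F(y) \rightarrow y = x) \wedge G(x) \bigr).
\]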

Logic and the Brain

Apr 20, 2019 · 96:48


Hannes Leitgeb (MCMP/LMU) gives a lecture at the Carl-Friedrich-von-Siemens-Stiftung titled "Logic and the Brain". Introductory words by Enno Aufderheide (Secretary General, Humboldt Foundation).

Accuracy & Coherence

Apr 20, 2019 · 49:27


Branden Fitelson (Rutgers University) gives a talk at the MCMP Workshop on Bayesian Methods in Philosophy titled "Accuracy & Coherence". Abstract: In this talk, I will explore a new way of thinking about the relationship between accuracy norms and coherence norms in epistemology (generally). In the first part of the talk, I will apply the basic ideas to qualitative judgments (belief and disbelief). This will lead to an interesting coherence norm for qualitative judgments (but one which is weaker than classical deductive consistency). In the second part of the talk, I will explain how the approach can be applied to comparative confidence judgments. Again, this will lead to coherence norms that are weaker than classical (comparative probabilistic) coherence norms. Along the way, I will explain how evidential norms can come into conflict with even the weaker coherence norms suggested by our approach.

The Lockean Thesis Revisited

Apr 20, 2019 · 64:55


Hannes Leitgeb (MCMP/LMU) gives a talk at the MCMP Workshop on Bayesian Methods in Philosophy titled "The Lockean Thesis Revisited".

Proof-theoretic semantics and the format of deductive reasoning & Prawitz's completeness conjecture (A sketch of some ideas)

Apr 20, 2019 · 67:14


Peter Schroeder-Heister (Tübingen) gives a two-part talk at the MCMP Colloquium in Mathematical Philosophy. First part: "Proof-theoretic semantics and the format of deductive reasoning"; second part: "Prawitz's completeness conjecture (A sketch of some ideas)".

Applying coherence based probability logic to philosophical problems

Apr 20, 2019 · 52:39


Niki Pfeifer (MCMP/LMU) gives a talk at the MCMP Workshop on Bayesian Methods in Philosophy titled "Applying coherence based probability logic to philosophical problems".

Conditionals and Suppositions

Apr 20, 2019 · 70:46


Richard Bradley (LSE) gives a talk at the MCMP Colloquium titled "Conditionals and Suppositions". Abstract: Adams' Thesis - the claim that the probabilities of indicative conditionals equal the conditional probabilities of their consequents given their antecedents - has proven impossible to accommodate within orthodox possible worlds semantics. This paper considers the approaches to the problem taken by Jeffrey and Stalnaker (1994) and by McGee (1989), but rejects them on the grounds that they imply a false principle, namely that the probability of a conditional is independent of any proposition inconsistent with its antecedent. Instead it is proposed that the semantic contents of conditionals be treated as sets of vectors of worlds, not worlds, where each co-ordinate of a vector specifies the world that is or would be true under some supposition. It is shown that this treatment implies the truth of Adams' Thesis whenever the mode of supposition is evidential.
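
A standard formal rendering of Adams' Thesis as stated in the abstract (notation ours): for an indicative conditional "if A then B", written A → B,

\[
P(A \rightarrow B) \;=\; P(B \mid A) \;=\; \frac{P(A \wedge B)}{P(A)}, \qquad \text{provided } P(A) > 0.
\]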

Diachronic Dutch Book Arguments for Forgetful Agents

Apr 20, 2019 · 64:51


Alistair M. C. Isaac (University of Michigan) gives a talk at the MCMP Colloquium titled "Diachronic Dutch Book Arguments for Forgetful Agents". Abstract: I present a general strategy for applying diachronic Dutch book arguments to bounded agents, with particular focus on forgetful agents. Dutch book arguments were introduced by subjectivists about probability to test the consistency of synchronic epistemic norms. Diachronic Dutch book arguments (DDBs) apply this technique to test the consistency of diachronic epistemic norms, norms about how beliefs change in time. Examples like forgetfulness have led some to doubt the relevance of DDBs for evaluating diachronic norms. I argue that there is no problem in applying DDBs to formally specified decision problems involving forgetfulness. The real worry here is whether these formal problems capture the relevant details of real world decision-making situations. I suggest some general criteria for making this assessment and defend the formalization of decision problems involving bounded agents, and their investigation via DDBs, as essential tools for evaluating epistemic norms.

The Contradiction in Will Test: A Reconstruction

Apr 20, 2019 · 62:43


Matthew Braham (University of Bayreuth) gives a talk at the MCMP Colloquium titled "The Contradiction in Will Test: A Reconstruction".

Formal epistemological explication (news for the Bayesian agenda)

Apr 20, 2019 · 39:08


Vincenzo Crupi (MCMP/LMU) gives a talk at the MCMP Workshop on Bayesian Methods in Philosophy titled "Formal epistemological explication (news for the Bayesian agenda)".

Carnap on extremal axioms and categoricity

Apr 20, 2019 · 59:08


Georg Schiemer (MCMP/LMU) gives a talk at the MCMP Workshop on Carnap titled "Carnap on extremal axioms and categoricity". Abstract: The talk will investigate Carnap's early contributions to formal semantics in his work on general axiomatics between 1928 and 1936. In particular, we give a historically sensitive discussion of Carnap's theory of extremal axioms from the late 1920s onwards. The main focus is set on the unpublished documents of the projected second part of Untersuchungen zur allgemeinen Axiomatik (RC 081-01-01 to 081-01-33). We present a formal reconstruction of the semantic notions 'formal model', 'model structure', and 'submodel' first formulated there. The main interpretive issue addressed in the talk concerns Carnap's understanding of the relationship between the "completeness of the models" of an axiomatic theory and other metatheoretic notions investigated by him at the time, most notably that of semantic completeness and categoricity.

Frequencies, Chances and Undefinable Sets

Apr 20, 2019 · 61:36


Jan-Willem Romeijn (University of Groningen) gives a talk in the talk series "MCMP & Statistics Department" titled "Frequencies, Chances and Undefinable Sets". Abstract: In this talk I aim to clarify the concept of chance. The talk consists of two parts, concerning the epistemology and metaphysics of chance respectively. In the first part I consider statistical hypotheses and their role in inference. I maintain that statistical hypotheses are best explicated along frequentist lines, following the theory of von Mises. I will argue that the well-known problems for frequentism do not apply in the inferential context. In the second part of the talk I ask what relation obtains between these frequentist hypotheses and the world. I will show that we can avoid the problem of the reference class, as well as the closely related conflict between determinism and chance, by means of a formal antireductionist argument: events can be assigned meaningful and nontrivial chances if they correspond to undefinable sets of events in the reducing theory.

Knowledge about Probability in the Monty Hall Problem

Apr 20, 2019 · 45:27


Charles B. Cross (University of Georgia) gives a talk at the MCMP Workshop on Bayesian Methods in Philosophy titled "Knowledge about Probability in the Monty Hall Problem".

Mathematical Science, Naturalism, and Normativity

Apr 20, 2019 · 55:20


Michael Friedman (Stanford) gives a talk at the MCMP Colloquium titled "Mathematical Science, Naturalism, and Normativity". Abstract: I address concerns in contemporary philosophy about the place of mathematics and moral (and other) norms in a naturalistic world picture. I think that these worries are largely misplaced, and I address them with an historical narrative from Plato to Kant, beginning from the fact that Plato's original "platonism" (in the theory of forms) tried to give a kind of unified account of both mathematics and moral norms. I contend that this was not mysterious or "spooky" but a perfectly reasonable and intelligible response to the state of mathematical science of the time – especially concerned with the relationship between ideal mathematical structures and the physical world. I then explore how this last relationship was fundamentally transformed in the early modern period, beginning with Galileo, and continuing from Descartes, through Leibniz and Newton, and finally to Kant – where a radically new kind of relationship between mathematics (especially applied mathematics) and moral normativity (still in the spirit of Plato) then emerged.

Core Logic

Apr 20, 2019 · 38:12


Neil Tennant (Ohio State University) gives a talk at the MCMP Colloquium titled "Core Logic".

Possibilities without possible worlds/histories

Apr 20, 2019 · 46:17


Tomasz Placek (Jagiellonian University, Kraków) gives a talk at the MCMP Colloquium titled "Possibilities without possible worlds/histories". Abstract: Possible worlds have turned out to be a particularly useful tool of modal metaphysics, although their globality makes them philosophically suspect. Hence, it would be desirable to arrive at some local modal notions that could be used instead of possible worlds. In this talk I will focus on what is known as historical (or real) modalities, an example of which is tomorrow's sea-battle. The modalities involved in this example are local since they refer to relatively small chunks of our world: a gathering of inimical fleets on a nearby bay has two alternative possible future continuations: one with a sea-battle and the other with no sea-battle. The objective of this talk is to sketch a theory of such modalities that is framed in terms of possible continuations rather than possible worlds or possible histories. The proposal will be tested as a semantic theory for a language with historical modalities, tenses, and indexicals.

Group Presentation, Munich Center for Mathematical Philosophy (LMU)

Apr 20, 2019 · 88:43


Members of the MCMP (Julien Murzi, Johannes Stern, Martin Fischer, Ole Hjortland, Marta Sznajder, Norbert Gratzl, Johannes Korbmacher) present their current research.

Self-reference

Apr 20, 2019 · 53:25


Volker Halbach (Oxford) gives a talk at the Workshop on Mathematical Philosophy titled "Self-reference". Abstract: What does it mean for a sentence to say about itself that it is P? Here P can stand for any unary sentential function such as 'is provable', 'is not provable', 'is true', or 'is a sentence'. I will study this question in a metamathematical setting. After reviewing some early attempts to tackle the question and their impact on problems in metamathematics such as Henkin's problem, I will put forward a new proposal and test its adequacy with some examples.
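
The standard metamathematical device behind such self-ascriptions is the diagonal lemma (background only, not the talk's proposal): for every formula P(x) with one free variable there is a sentence φ such that

\[
\mathsf{PA} \vdash \varphi \leftrightarrow P(\ulcorner \varphi \urcorner),
\]

where ⌜φ⌝ is the numeral of the Gödel code of φ. Whether such a φ genuinely says about itself that it is P is precisely the question at issue in the talk.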

Group Presentation, Munich Center for Mathematical Philosophy (LMU)

Apr 20, 2019 · 75:11


Members of the MCMP (Roland Poellinger, Florian Steinberger, Thomas Meier, Vincenzo Crupi and Olivier Roy) present their current research.

Three contrasts between two senses of coherence

Apr 20, 2019 · 58:44


Teddy Seidenfeld (CMU) gives a talk in the talk series "MCMP & Statistics Department" titled "Three contrasts between two senses of coherence" (joint work with M. J. Schervish and J. B. Kadane, Statistics, CMU). Abstract: B. de Finetti defended two senses of coherence in providing foundations for his theory of subjective probabilities. Coherence 1 requires that when a decision maker announces fair prices for random variables these are immune to a uniform sure-loss - no Book is possible using finitely many fair contracts! Coherence 2 requires that when a decision maker's forecasts for a finite set of random variables are evaluated by Brier Score - squared error loss - there is no rival set of forecasts that dominates with a uniformly better score for sure. De Finetti established that these two concepts are equivalent: fair prices are coherent 1 if and only if they constitute a coherent 2 set of forecasts if and only if they are the expected values for the variables under some common (finitely additive) personal probability. I report three additional contrasts between these two senses of coherence. One contrast (relating to finitely additive probabilities) favors coherence 2. One contrast (relating to decisions with moral hazard) favors coherence 1. The third contrast relates to the challenge of state-dependent utilities.
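
For reference (our notation): if an agent announces forecasts p_1, ..., p_n for random variables X_1, ..., X_n (for instance, event indicators), the Brier score mentioned above is the squared-error loss

\[
\mathrm{Brier}(p_1,\dots,p_n) \;=\; \sum_{i=1}^{n} (X_i - p_i)^2,
\]

and coherence 2 requires that no rival forecast vector obtains a strictly smaller score in every possible state of the world.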

From Analysis to Explication

Apr 20, 2019 · 47:09


André Carus (Chicago/Cambridge) gives a talk at the MCMP Workshop on Carnap titled "From Analysis to Explication". Abstract: Analytic philosophy was named for the "analysis" of propositions begun by Russell and Moore in the first years of the twentieth century, epitomized by the theory of descriptions. This style of analysis has been joined by many others since then. But they all share certain common defects, which are overcome by "explication," a replacement for all forms of analysis developed by Carnap in his later years. However, it was suggested by Quine, Dreben, and their students that Carnap's form of explication depends on metaphysical assumptions Quine dispensed with. It is argued in this paper that this suggestion is based on misunderstandings, and that explication is preferable to analysis, especially since it offers a more plausible picture of philosophy.

IPAD – Information Processing and the Analysis of Democracy

Apr 20, 2019 · 60:08


Vincent Hendricks (Copenhagen/Columbia) gives a talk at the Workshop on Mathematical Philosophy titled "IPAD – Information Processing and the Analysis of Democracy". Abstract: Only one species has configured a democracy and decided to live according to deliberative democratic guidelines. The configuration and the decision are particular to man. A deliberative democracy is characterized by group deliberation, decision and action. Central to this epistemic composite is information, as information processing is an essential part of rational deliberation, decision and action, which in turn amount to the rational interaction among members of a group or a democracy. Thus, a robust deliberative democracy is the quintessential example of rational agent interaction. This intimate connection fuels a new research paradigm in interdisciplinary philosophy: IPAD – Information Processing and the Analysis of Democracy.

Voting, Deliberation and Truth

Apr 20, 2019 · 48:28


Stephan Hartmann (Tilburg) gives a talk at the Workshop on Mathematical Philosophy titled "Voting, Deliberation and Truth". Abstract: There are various ways to reach a group decision. One way is to simply vote and decide what the majority votes for. This procedure receives some epistemological support from the Condorcet Jury Theorem. Alternatively, the group members may prefer to deliberate and will eventually reach a decision that everybody endorses -- a consensus. While the latter procedure has the advantage that it makes everybody happy (as everybody endorses the consensus), it has the disadvantage that it is difficult to implement, especially for larger groups. What is more, a deliberation is easy to bias, as those group members who make others change their minds may not necessarily be the best truth-trackers. But even if no such biases are present, the consensus may be far away from the truth. And so we ask: When is deliberation a better method to track the truth than simple majority voting? To address this question, we propose a Bayesian model of rational non-strategic deliberation and compare it to the straightforward voting procedure. The talk is based on joint work with Soroush Rafiee Rad.
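
For readers unfamiliar with the Condorcet Jury Theorem invoked above, here is a minimal illustration under its textbook assumptions (independent voters of equal competence, odd group size); it is not the Bayesian deliberation model proposed in the talk, and all names in it are ours.

from math import comb

def majority_correct(n: int, p: float) -> float:
    """Probability that a simple majority of n voters is correct, assuming
    each voter is independently correct with probability p (textbook
    Condorcet Jury Theorem setting; n is taken to be odd, so no ties)."""
    return sum(comb(n, k) * p ** k * (1 - p) ** (n - k)
               for k in range(n // 2 + 1, n + 1))

# With individual competence p > 1/2, majority reliability grows with group size.
for n in (1, 11, 101):
    print(n, round(majority_correct(n, 0.6), 4))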

An "Evidentialist" Worry About Joyce's Argument for Probabilism.

Apr 20, 2019 · 41:22


Branden Fitelson (Rutgers) gives a talk at the Workshop on Mathematical Philosophy titled "An 'Evidentialist' Worry About Joyce's Argument for Probabilism". Abstract: In this talk, I will raise a potential problem for Joyce's argument for probabilism (and sufficiently similar "accuracy-dominance"-based arguments for probabilism). The problem involves a potential conflict between "accuracy-dominance" (coherence) norms and certain "evidential" norms for credences. An interesting analogy with the case of full belief is also drawn (which connects up with a larger project on the relationship between accuracy, coherence, and evidential norms for various sorts of judgments). This is joint work with Kenny Easwaran.

On an occasionally heard objection to Carnap's conception of logical truth

Apr 20, 2019 · 31:36


Steve Awodey (CMU/MCMP) gives a talk at the MCMP Workshop on Carnap titled "On an occasionally heard objection to Carnap's conception of logical truth".

Carnap's Logico-Mathematical Neutrality between Realism and Instrumentalism

Apr 20, 2019 · 49:52


Michael Friedman (Stanford) gives a talk at the MCMP Workshop on Carnap titled "Carnap's Logico-Mathematical Neutrality between Realism and Instrumentalism". Abstract: I discuss the evolution of Carnap's treatment of theoretical terms from the late 1930s to his mature work on the Ramsey sentence formulation of scientific theories in the late 1950s and 1960s. I concentrate on Carnap's use of this device to remain completely neutral between realism and instrumentalism. A central point of discussion is his commitment to a purely logico-mathematical interpretation of the quantified existential variables in the Ramsey sentence. Far from being a desperate or ad hoc maneuver, I argue that this is essential to Carnap's point of view and, in particular, to the way in which he understands the characteristically abstract representations of modern mathematical physics throughout his intellectual career. In the end, Carnap recommends nothing more nor less than that we eschew fruitless "ontological" disputes in favor of cooperating with contemporary mathematical physicists in attempting (axiomatically) to clarify the mathematical and conceptual foundations of their discipline.

Tolerance & Voluntarism

Apr 20, 2019 · 34:18


Paul Dicken (Cambridge) gives a talk at the MCMP Workshop on Carnap titled "Tolerance & Voluntarism". Abstract: Carnap's dissolution of the scientific realism debate rests upon two central claims: the first regarding the appropriate logical reconstruction of a scientific theory; the second, a background conception of the nature of ontological dispute. Recent work has focused on the first of these claims; in this talk I discuss the second, and relate it to similar moves made by van Fraassen in his own articulation of empiricism.

Validity Curry

Apr 20, 2019 · 33:02


Julien Murzi (MCMP/LMU) gives a talk at the Bristol-Munich Workshop titled "Validity Curry".

Logic in Games

Apr 20, 2019 · 95:01


Johan van Benthem gives a talk at the MCMP Colloquium titled "Logic in Games". Abstract: We discuss logic *of* games as a foundation for rational interaction, suggesting a 'theory of play' extending standard game theory. We also discuss logic *as* games, the other direction of this contact. We conclude by looking at some natural entanglements between the two perspectives.

Inexhaustibility and Reflection

Apr 20, 2019 · 29:09


Marianna Antonutti Marfori (Bristol) gives a talk at the Bristol-Munich Workshop titled "Inexhaustibility and Reflection".

Logics as Scientific Theories

Apr 20, 2019 · 63:34


Timothy Williamson (Oxford) gives a talk at the MCMP Colloquium titled "Logics as Scientific Theories". Abstract: Logic has far more in common with other branches of science than is usually recognized. One major aim of science is to develop theories that are true, highly general, and maximally informative subject to those constraints. When the generality requirement is made precise in some natural ways, related to Tarski’s account of logical consequence, the resultant theories meet central requirements for logical systems. An appropriate methodology for choosing between different candidate theories has many similarities to the methodology for theory choice in other branches of science. This involves no reduction of logic to psychology, linguistics, or specifically natural science. The talk will be illustrated with examples from modal logic.

Explorations in Bayesian confirmation and models of information search

Apr 20, 2019 · 42:31


Vincenzo Crupi (MCMP/LMU) gives a talk at the Bristol-Munich Workshop titled "Explorations in Bayesian confirmation and models of information search".

Is logical knowledge dispositional?

Apr 20, 2019 · 25:41


Florian Steinberger (MCMP/LMU) gives a talk at the Bristol-Munich Workshop titled "Is logical knowledge dispositional?".

Logic as Modelling

Apr 20, 2019 · 33:51


Neil Coleman (Bristol) gives a talk at the Bristol-Munich Workshop titled "Logic as Modelling".

Logical Dynamics of Intelligent Interaction

Apr 20, 2019 · 82:57


Johan van Benthem gives a talk at the MCMP Colloquium titled "Logical Dynamics of Intelligent Interaction". Abstract: We motivate the logical dynamics of information-driven agency, and then discuss it from three perspectives: as a natural extension of the traditional scope of logic, as a foundation for the interdisciplinary study of agency, and as a source of new mathematical issues of pure interest.

On a Proposed Extension of Infinitary Logic

Apr 20, 2019 · 71:06


Timothy Williamson (Oxford) gives a talk at the MCMP Colloquium titled "On a Proposed Extension of Infinitary Logic". Abstract: In discussing ‘translation’ schemes between possibilist discourse about merely possible objects and actualist discourse that abjures such objects, Kit Fine proposed interpreting quantifiers over pluralities or sets of possibilia using infinite sequences of modal operators and actualist quantifiers. After explaining the philosophical background, the talk will concern the more technical problem of interpreting such infinite sequences of operators. Various proposals will be assessed. The only ones that work depend on postulating possibilia in the meta-language.
