MCMP


Mathematical Philosophy - the application of logical and mathematical methods in philosophy - is about to experience a tremendous boom in various areas of philosophy. At the new Munich Center for Mathematical Philosophy, which is funded mostly by the German Alexander von Humboldt Foundation, philoso…

MCMP Team

  • Mar 17, 2018 LATEST EPISODE
  • infrequent NEW EPISODES
  • 41m AVG DURATION
  • 66 EPISODES



Latest episodes from MCMP

Mathematical Empiricism. A Methodological Proposal

Mar 17, 2018 · 83:19


Hannes Leitgeb (LMU/MCMP) gives a talk at the Workshop on Five Years MCMP: Quo Vadis, Mathematical Philosophy? (2-4 June, 2016) titled "Mathematical Empiricism. A Methodological Proposal". Abstract: I will propose a way of doing (mathematical) philosophy which I am calling 'mathematical empiricism'. It is the proposal to rationally reconstruct language, thought, ends, decision-making, communication, social interaction, norms, ideals, and so on, in conceptual frameworks. The core of each such framework will be a space of "possibilities"; however, these "possibilities" will consist of nothing other than mathematical structures labeled by empirical entities. Mathematical empiricism suggests carrying out (many) rational reconstructions in such mathematical-empirical conceptual frameworks. When the goal is to rationally reconstruct a part of empirical science itself (which is but one philosophical goal amongst many others), it will be reconstructed as "taking place" within such frameworks, whereas the frameworks themselves may be used to rationally reconstruct some of the presuppositions of that part of empirical science. While logic and parts of philosophy of science study such frameworks from an external point of view, with a focus on their formal properties, metaphysics will be embraced as studying such frameworks from within, with a focus on what the world looks like if viewed through a framework. When mathematical empiricists carry out their investigations in these and in other areas of philosophy, no entities will be postulated over and above those of mathematics and the empirical sciences, and no sources of epistemic justification will be invoked beyond those of mathematics, the empirical sciences, and personal and social experience (if consistent with the sciences). And yet mathematical empiricism, with its aim of rational reconstruction, will not be reducible to mathematics or empirical science.
When a fragment of science is reconstructed in a framework, the epistemic authority of science will be acknowledged within the boundaries of the framework, while as philosophers we are free to choose the framework for reconstruction and to discuss our choices on the metalevel, all of which goes beyond the part of empirical science that is reconstructed in the framework. There is a great plurality of mathematical-empirical frameworks to choose from; even when ultimately each of them needs to answer to mathematical-empirical truth, this will underdetermine how successfully they will serve rational reconstruction. In particular, certain metaphysical questions will be taken to be settled only by our decisions for or against conceptual frameworks, and these decisions may be practically expedient for one purpose and less so for another. The overall hope will be to take what was good and right about the distinctively Carnapian version of logical empiricism, and to extend and transform it into a more tolerant, less constrained, and conceptually enriched logical-mathematical empiricism 2.0.

Notations and Diagrams in Algebra

Mar 17, 2018 · 19:12


Silvia de Toffoli (Stanford University) gives a talk at the Workshop on Five Years MCMP: Quo Vadis, Mathematical Philosophy? (2-4 June, 2016) titled "Notations and Diagrams in Algebra". Abstract: The aim of this talk is to investigate the roles of Commutative Diagrams (CDs) in a specific mathematical domain, and to unveil the reasons underlying their effectiveness as a mathematical notation; this will be done through a case study. It will be shown that, unlike other mathematical diagrams, CDs do not depict spatial relations, but represent mathematical structures. CDs will be interpreted as a hybrid notation that goes beyond the traditional bipartition of mathematical representations into graphic and linguistic. It will be argued that one of the reasons why CDs form a good notation is that they are highly ‘mathematically tractable’: experts can obtain valid results by ‘calculating’ with CDs. These calculations take the form of a ‘diagram chase’. In order to draw inferences, experts move algebraic elements around the diagrams. These diagrams are dynamic in nature, and it is thanks to this dynamicity that CDs can externalize the relevant reasoning and allow experts to draw conclusions directly by manipulating them. Lastly, it will be shown that CDs play essential roles in the context of proof as well as in other phases of the mathematical enterprise, such as discovery and conjecture.

Ethics and Morality in the Vienna Circle

Mar 17, 2018 · 53:33


Anne Siegetsleitner (Innsbruck) gives a talk at the Workshop on Five Years MCMP: Quo Vadis, Mathematical Philosophy? (2-4 June, 2016) titled "Ethics and Morality in the Vienna Circle". Abstract: In my talk I will present key aspects of a long-overdue revision of the prevailing view on the role and conception of ethics and morality in the Vienna Circle. This view is rejected as being too partial and undifferentiated. Not all members supported the standard view of logical empiricist ethics, which is held to be characterized by the acceptance of descriptive empirical research, the rejection of normative and substantial ethics, as well as an extreme non-cognitivism. Some members applied formal methods, some did not. However, most members shared an enlightened and humanistic version of morality and ethics. I will show why these findings are still relevant today, not least for mathematical philosophers.

Degrees of Truth Explained Away

Mar 17, 2018 · 17:35


Rossella Marrano (Scuola Normale Superiore Pisa) gives a talk at the Workshop on Five Years MCMP: Quo Vadis, Mathematical Philosophy? (2-4 June, 2016) titled "Degrees of Truth Explained Away". Abstract: The notion of degrees of truth arising in infinite-valued logics has been the object of long-standing criticisms. In this paper I focus on the alleged intrinsic philosophical implausibility of degrees of truth, namely on objections concerning their very nature and their role, rather than on objections questioning the adequacy of degrees of truth as a model for vagueness. I suggest that interpretative problems encountered by the notion are due to a problem of formalisation. On the one hand, indeed, degrees of truth are artificial, to the extent that they are not present in the phenomenon they are meant to model, i.e. graded truth. On the other hand, however, they cannot be considered as artefacts of the standard model, contra what is sometimes argued in the literature. I thus propose an alternative formalisation for graded truth based on comparative judgements with respect to the truth. This model provides a philosophical underpinning for degrees of truth of structuralist flavour: they are possible numerical measures of a comparative notion of truth. As such, degrees of truth can be considered artefacts of the model, thus avoiding the aforementioned objections.

What Are No-Go Theorems Good for?

Mar 17, 2018 · 20:11


Radin Dardashti (LMU/MCMP) gives a talk at the Workshop on Five Years MCMP: Quo Vadis, Mathematical Philosophy? (2-4 June, 2016) titled "What Are No-Go Theorems Good for?". Abstract: No-go theorems in physics have often been construed as impossibility results with respect to some goal. These results usually have had two effects on the field: either the no-go result effectively stopped the research programme, or one or more of the assumptions involved in the derivation were questioned. In this talk I address some general features of no-go theorems and ask how no-go results should be interpreted. The way they should be interpreted differs significantly from how they have been interpreted in the history of physics. More specifically, I will argue that no-go theorems should not be understood as implying the impossibility of a desired result, and therefore do not play the methodological role they purportedly do; rather, they should be understood as a rigorous way to outline the methodological pathways in obtaining the desired result.

Mathematical Philosophy and Leitgeb’s Carnapian Big Tent: Past, Present, Future

Mar 17, 2018 · 35:17


André W. Carus (LMU) gives a talk at the Workshop on Five Years MCMP: Quo Vadis, Mathematical Philosophy? (2-4 June, 2016) titled "Mathematical Philosophy and Leitgeb’s Carnapian Big Tent: Past, Present, Future". Abstract: Hannes Leitgeb’s conception of mathematical philosophy, reflected in the success of the MCMP, is characterized by a pluralism — a Big Tent program — that shows remarkable continuity with the Vienna Circle, as now understood. But logical empiricism was notoriously opposed to metaphysics, which Leitgeb and other recent scientifically-oriented philosophers, such as Ladyman and Ross, embrace to varying degrees. So what, if anything, do these new, post-Vienna scientific philosophies exclude? Ladyman and Ross explicitly exclude much of recent analytic metaphysics, decrying it — very much in the logical empiricist spirit of critical Enlightenment — as vernacular “domestication” of counter-intuitive science. But it turns out, in the light of recent research on Carnap’s later thought, that Leitgeb’s Big Tent conception, though it excludes less than Ladyman and Ross, adheres more closely to Carnap’s Enlightenment ideal.

Valuing Questions

Mar 17, 2018 · 18:45


Liam Kofi Bright (CMU Pittsburgh) gives a talk at the Workshop on Five Years MCMP: Quo Vadis, Mathematical Philosophy? (2-4 June, 2016) titled "Valuing Questions". Abstract: If all scientists seek the truth, will they agree on how this search should be carried out? Social epistemologists have alleged that were scientists to be truth seekers they would display an unwelcome homogeneity in their choice of what projects to pursue. However, philosophers of science have argued that the injunction to seek the truth is incapable of providing any guidance to scientific project selection. Drawing on theories of the semantics of questions to construct a model of project selection, I argue that the injunction to seek the truth can guide choice through a philosophically well motivated decision theory, but may indeed discourage division of cognitive labour. I end by discussing methods of maintaining heterogeneity among a community of inquirers, even veritistic ones, in light of my results.

Relating Theories of Intensional Semantics: Established Methods and Surprising Results

Mar 17, 2018 · 19:11


Kristina Liefke (LMU/MCMP) gives a talk at the Workshop on Five Years MCMP: Quo Vadis, Mathematical Philosophy? (2-4 June, 2016) titled "Relating Theories of Intensional Semantics: Established Methods and Surprising Results". Abstract: Formal semantics comprises a plethora of ‘intensional’ theories which model propositional attitudes through the use of different ontological primitives (e.g. possible/impossible worlds, partial situations, unanalyzable propositions). The ontological relations between these theories are, today, still largely unexplored. In particular, it remains unclear whether the basic objects of some of these theories can be reduced to objects from other theories (s.t. phenomena which are modeled by one theory can also be modeled by the other theories), or whether some of these theories can even be reduced to ontologically ‘poor’ theories (e.g. extensional semantics) which do not contain intensional objects like possible worlds. This talk surveys my recent work on ontological reduction relations between the above theories. This work has shown that – more than preserving the modeling success of the reduced theory – some reductions even improve upon the theory’s modeling adequacy or widen the theory’s modeling scope. The talk illustrates this observation with two examples: (i) the relation between Montague-/possible world-style intensional semantics and extensional semantics, and (ii) the relation between intensional semantics and situation-based single-type semantics. The relations between these theories are established through the use of associates from higher-order recursion theory (cf. (i)) and of type-coercion from programming language theory (cf. (ii)). Part of this work is joint work with Markus Werning (RUB Bochum) and Sam Sanders (LMU Munich/MCMP).

Inductive Reasoning with Conceptual Spaces: A Proposal for Analogy

Mar 17, 2018 · 22:19


Marta Sznajder (University of Groningen/MCMP) gives a talk at the Workshop on Five Years MCMP: Quo Vadis, Mathematical Philosophy? (2-4 June, 2016) titled "Inductive Reasoning with Conceptual Spaces: A Proposal for Analogy". Abstract: In his late work on inductive logic Carnap introduced the conceptual level of representations – i.e. conceptual spaces – into his system. Traditional inductive logic (e.g. Carnap 1950) is a study of inductive reasoning that belongs to the symbolic level of cognitive representation (in the three-level view of representations presented by Gärdenfors (2000)). In the standard, symbolic approach the confirmation functions are functions applied to propositions defined with respect to a particular formal language. In my project I investigate an alternative approach that is a step towards modelling inductive reasoning directly on conceptual spaces: considering probability densities (or distributions) over the set of points in a conceptual space rather than traditional credences over propositions. I will present one way in which analogical effects can enter inductive reasoning, using the tools of Bayesian statistics and building up from Carnap’s idea that analogical dependencies between predicates can be read off conceptual spaces via the distances that encode similarity relations between predicates. I consider a quasi-hierarchical Bayesian model in which the different hypotheses considered by the agent are probability distributions over a one-dimensional conceptual space, representing possible distributions of the particular qualities among a studied population.

Five Years MCMP: Looking Back

Mar 17, 2018 · 23:21


Roland Poellinger (LMU/MCMP) gives a talk at the Workshop on Five Years MCMP: Quo Vadis, Mathematical Philosophy? (2-4 June, 2016) titled "Five Years MCMP: Looking Back". Abstract: In this presentation I will speak about the MCMP's outreach and line up some of the center's achievements in the last five years. I will put special emphasis on our media output since many of our activities are mirrored in our media-related efforts such as our video channels on iTunes U, our Coursera online courses, and our publication database on the MCMP's web portal.

On Some Puzzling Features of Existential Discourse

Mar 17, 2018 · 68:04


Dolf Rami (Göttingen) gives a talk at the MCMP Colloquium (21 January, 2016) titled "On Some Puzzling Features of Existential Discourse". Abstract: Existence is a very puzzling notion that has bewitched philosophers since the beginning of Western philosophy. In this talk, I will compare the three most popular general views on existence and I will point out their main advantages and weaknesses. These are (a) the often so-called second-level view of existence, (b) the Meinongian view of existence and (c) the Parmenidean view of existence. I will try to show that the best overall view of existence is a version of the Parmenidean view of existence that makes use of negative free logic.

On the Role of Supplementation Principles in Mereology

Mar 17, 2018 · 63:39


Aaron Cotnoir (St. Andrews) gives a talk at the MCMP Colloquium (4 February, 2016) titled "On the Role of Supplementation Principles in Mereology". Abstract: Mereology is the formal theory of parts and wholes. Despite the frequent claim that a certain class of `supplementation' principles are analytically true of the concept of parthood, students of mereology often find such principles tricky to understand. This is made more complicated by the supposed relation between supplementation and mereological extensionality. In this paper, I outline the algebraic role of supplementation and argue that, contrary to received opinion, extensionalists and non-extensionalists alike should accept them.

Causation & Time Reversal

Mar 17, 2018 · 58:08


Matt Farr (Queensland) gives a talk at the MCMP Colloquium (20 January, 2016) titled "Causation & Time Reversal". Abstract: What would it be for a process to happen ‘backwards’ in time? Would such a process involve different causal relations? On a standard interpretation of time reversal, time reversal symmetric theories radically underdetermine causal relations between events. This has led many to imply that time reversal symmetry motivates eliminativism about causation. This paper assesses the compatibility of time reversal symmetry with causation by asking whether causal relations ought to invert under the action of time reversal or remain invariant. I show that in neither case is there an incompatibility between time reversal symmetry and causation and hence time reversal symmetric theories pose no special problem for causality. I argue for a ‘non-causal’ interpretation of time reversal, whereby time reversal does not invert causal relations, and assess the consequences of this interpretation for the epistemology and metaphysics of causation.

On the Relationship Between Intrinsic and Extrinsic Justifications

Mar 17, 2018 · 56:35


Neil Barton (Birkbeck) gives a talk at the MCMP Colloquium (14 January, 2016) titled "On the Relationship Between Intrinsic and Extrinsic Justifications". Abstract: Recent discussions of the justification of new axioms for set theory have often focussed on a distinction between two different kinds of justification. Intrinsic justifications argue that putative axioms are implied by an underlying mathematical conception, whereas extrinsic justifications concern the consequences of said principle. In this paper, we argue that intrinsic and extrinsic justification as it has been explained in the literature is unsatisfactory. In its stead we propose a new account of intrinsic and extrinsic justification, one which develops a harmony between the two notions and avoids the problems we see for extant accounts.

Turbulence, Universality and Emergence

Mar 17, 2018 · 44:21


Margaret Morrison (Toronto) gives a talk at the MCMP Colloquium (3 February, 2016) titled "Turbulence, Universality and Emergence". Abstract: Turbulent flows are paradigm cases of complex systems where multi-scale modelling is required. The fundamental problems in the field are strong fluctuations and couplings – problems that are also present in condensed matter physics (CMP) and field theory. Like the latter two areas of physics, renormalization group methods have been used to treat some of the theoretical difficulties with turbulent flows. However, unlike CMP, where universality and emergence are, in some sense, reasonably understood, the situation is less than straightforward in cases of turbulence. I examine some of these issues, in particular the relation between multi-scale modelling and emergence, in an attempt to clarify how or even whether a notion of emergence might be applicable in the context of turbulent flows.

How (not) to make everyone better off

Mar 17, 2018 · 48:58


Anna Mahtani (LSE) gives a talk at the MCMP Colloquium (16 December, 2015) titled "How (not) to make everyone better off". Abstract: The concept of ‘pareto superiority’ plays a central role in welfare economics. Pareto superiority is sometimes taken as a relation between outcomes, and sometimes as a relation between actions – even where the outcome of the actions is uncertain. Whether one action is classed as (ex ante) pareto superior to another depends on the prospects under the actions for each person concerned. I argue that a person's prospects (in this context) can depend on how that person is designated. Without any constraints on acceptable designators, then, the concept of pareto superiority is incomplete and gives inconsistent results. I consider various ways of completing the concept, and draw out the implications for debates in welfare economics.

Non-Classical Knowledge

Mar 17, 2018 · 59:42


Ethan Jerzak (Berkeley) gives a talk at the MCMP Colloquium (17 December, 2015) titled "Non-Classical Knowledge".

The Quantified Argument Calculus

Mar 17, 2018 · 47:26


Hanoch Ben-Yami (CEU) gives a talk at the MCMP Colloquium (16 December, 2015) titled "The Quantified Argument Calculus". Abstract: I present the principles of a logic I have developed, in which quantified arguments occur in the argument position of predicates. That is, while the natural language sentence ‘Alice is polite’ is formalised P(a), the sentence ‘Some students are polite’ is formalised P(∃S). In this and several other respects, this logic is closer to Natural Language than is any version of Frege’s Predicate Calculus. I proceed to discuss further features of this logic, the Quantified Argument Calculus (Quarc). For instance, the Quarc incorporates both sentential negation and predication negation. The use of converse relation terms and of anaphors vis-à-vis variables is also discussed. I then concisely introduce the proof system and semantics, and describe the system’s power and its metalogical properties. I conclude by extending the Quarc to modal logic and discussing its treatment of the Barcan formulas.
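The contrast described in the abstract can be rendered schematically (a rough sketch in my own notation, not necessarily Ben-Yami's exact symbolism):

```latex
\[
\underbrace{P(a)}_{\text{`Alice is polite'}}
\qquad
\underbrace{P(\exists S)}_{\substack{\text{Quarc:}\\ \text{`Some students are polite'}}}
\qquad
\underbrace{\exists x\,\bigl(S(x)\wedge P(x)\bigr)}_{\substack{\text{Fregean rendering of}\\ \text{the same sentence}}}
\]
```

In Quarc the quantified argument ∃S occupies the argument position of the predicate P, mirroring the surface form of the natural-language sentence rather than paraphrasing it with bound variables.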

Anaphora and Presuppositions in Dependent Type Semantics

Mar 17, 2018 · 88:05


Daisuke Bekki (Ochanomizu University) gives a talk at the MCMP Colloquium (2 December, 2015) titled "Anaphora and Presuppositions in Dependent Type Semantics". Abstract: Dependent type semantics (DTS) is a framework of proof-theoretic discourse semantics based on dependent type theory, following the line of Sundholm and Ranta. DTS attains compositionality as required to serve as a semantic component of modern formal grammars including variations of categorial grammars, which is achieved by adopting a mechanism for underspecified terms. In DTS, the calculation of presupposition projection reduces to type checking, and the calculation of anaphora resolution and presupposition binding both reduce to proof search in dependent type theory, inheriting the paradigm of anaphora resolution as proof construction. I will demonstrate how DTS gives a unified solution to benchmarks for presupposition and anaphora, including presupposition projection and filtering, temporal and bridging anaphora.

Ensemble Realism. A New Approach to Statistical Mechanical Probability

Mar 17, 2018 · 52:54


Nick Tosh (NUI Galway) gives a talk at the MCMP Colloquium (2 December, 2015) titled "Ensemble Realism. A New Approach to Statistical Mechanical Probability". Abstract: “What we know about a body can generally be described most accurately and most simply by saying that it is one taken at random from a great number (ensemble) of bodies which are completely described.” So wrote Willard Gibbs in 1902, but with his fingers crossed, for he regarded ensembles as convenient fictions. A century later, they are still convenient, and we still have no settled account of the literal meaning of statistical mechanical probability assignments. My aim is to show how talk of ensembles might be taken seriously.

Pan-Perspectival Realism

Mar 17, 2018 · 40:33


Paul Teller (UC Davis) gives a talk at the MCMP Colloquium (10 December, 2015) titled "Pan-Perspectival Realism". Abstract: Conventional scientific realism is just the doctrine that our theoretical terms refer. Conventional antirealism denies, for various reasons, theoretical reference and takes theory to give us only information about the world of the perceptual, where reference, it would appear, is secure. But reference fails every bit as much for the perceptual as for the theoretical, and for the same reason: the world is too complicated for us to succeed in attaching specific referents to our terms. That would appear to leave us with a kind of latter-day representational idealism: all we have are representations. I argue that our representations tell us about an independent world without securing reference by showing that the world is very like the way it is represented in a range of different, often complementary modeling schemes. Though never exact, these representations are of something extra-representational because they present the world modally as going beyond what is represented explicitly.

Anti-Exceptionalism About Logic

Mar 17, 2018 · 50:34


Ole Hjortland (Bergen) gives a talk at the MCMP Colloquium (3 December, 2015) titled "Anti-Exceptionalism About Logic". Abstract: Logic isn’t special. Its theories are continuous with science; its method continuous with scientific method. Logic isn’t a priori, nor are its truths analytic truths. Logical theories are revisable, and if they are revised, they are revised on the same grounds as scientific theories. These are the tenets of anti-exceptionalism about logic. The position is most famously defended by Quine, but has more recent advocates in Maddy (2002), Priest (2006a; 2014), Russell (2014; 2015), and Williamson (2013b; 2015). Anti-exceptionalism would not be an attractive position, however, if it were only a rejection of exceptionalism, and in particular a rejection of apriorism. A number of questions remain: What is a logical theory according to the anti-exceptionalist, what is a logical theory a theory of, and what constitutes evidence for such a theory? We argue against Williamson’s deflationary account of logical theories, and we show how a non-deflationary account undercuts his argument for classical logic. Instead we offer an alternative account of logical theories, on which logical pluralism is a plausible consequence of anti-exceptionalism.

The Unreasonable Effectiveness of Nonstandard Analysis

Mar 17, 2018 · 54:42


Sam Sanders (MCMP/LMU) gives a talk at the MCMP Colloquium (26 November, 2015) titled "The Unreasonable Effectiveness of Nonstandard Analysis". Abstract: There is a persistent belief, propagated by such luminaries as Errett Bishop and Alain Connes, that infinitesimals (in the sense of Robinson’s Nonstandard Analysis (NSA)) are somehow fundamentally non-constructive and that NSA is devoid of numerical meaning, as Bishop was wont to say. In this talk, we disprove the Bishop-Connes claim regarding NSA. To this end, we show that theorems of NSA are equivalent to their associated “highly constructive” theorems from numerical analysis (not involving NSA). We shall focus on potential applications of these results in philosophy and computer science, especially concerning vagueness and ontology.

Gini vs. Shannon: The Case for Quadratic Entropy in Formal Philosophy of Science

Mar 17, 2018 · 45:22


Vincenzo Crupi (Turin) gives a talk at the MCMP Colloquium (26 November, 2015) titled "Gini vs. Shannon: The Case for Quadratic Entropy in Formal Philosophy of Science". Abstract: A probabilistic representation of the notion of uncertainty is an important tool in formal philosophy of science and epistemology: it yields theoretical and mathematical connections with the informativeness of a statement, gradational accuracy, evidential support, how a probability distribution diverges from another, the expected informational utility of an experiment, and more besides. Whenever a choice is made for a measure of uncertainty in these contexts, Shannon entropy is standard, but it is by no means the only option available. In fact, I will question the motivation for this predominance, and suggest that it may be due to historical accident more than fully compelling theoretical reasons. The attractive features of Shannon entropy as a representation of epistemic uncertainty may have been oversold or just taken for granted too quickly, while the comparative appeal of at least one competing approach (quadratic entropy) has been largely neglected.
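For a discrete probability distribution both uncertainty measures at issue here are standard textbook quantities; the sketch below (function names are mine, not from the talk) puts Shannon entropy side by side with quadratic (Gini) entropy:

```python
import math

def shannon_entropy(p):
    """Shannon entropy H(p) = -sum_i p_i * log2(p_i), in bits."""
    return -sum(x * math.log2(x) for x in p if x > 0)

def quadratic_entropy(p):
    """Quadratic (Gini) entropy Q(p) = 1 - sum_i p_i**2."""
    return 1.0 - sum(x * x for x in p)

# Both measures agree qualitatively: maximal on the uniform
# distribution, lower on a peaked (less uncertain) one.
uniform = [0.25, 0.25, 0.25, 0.25]
peaked = [0.85, 0.05, 0.05, 0.05]

print(shannon_entropy(uniform))    # 2.0 bits
print(quadratic_entropy(uniform))  # 0.75
print(shannon_entropy(peaked) < shannon_entropy(uniform))      # True
print(quadratic_entropy(peaked) < quadratic_entropy(uniform))  # True
```

The two measures rank these distributions the same way, which is why the choice between them turns on finer-grained theoretical properties rather than on such simple cases.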

Quantum Causal Models, Faithfulness and Retrocausality

Mar 17, 2018 · 45:27


Peter Evans (Queensland) gives a talk at the MCMP Colloquium (25 November, 2015) titled "Quantum Causal Models, Faithfulness and Retrocausality". Abstract: Wood and Spekkens (2015) argue that any causal model explaining the EPRB correlations and satisfying no-signalling must also violate the assumption that the model faithfully reproduces the statistical dependences and independences---a so-called “fine-tuning” of the causal parameters; this includes, in particular, retrocausal explanations of the EPRB correlations. I consider this analysis with a view to enumerating the possible responses an advocate of retrocausal explanations might propose. I focus on the response of Näger (2015), who argues that the central ideas of causal explanations can be saved if one accepts the possibility of a stable fine-tuning of the causal parameters. I argue that, in light of this view, a violation of faithfulness does not necessarily rule out retrocausal explanations of the EPRB correlations.

Positive Reflection Calculi

Mar 17, 2018 · 71:44


Lev Beklemishev (Russian Academy of Sciences Moscow) gives a talk at the MCMP Colloquium (12 November, 2015) titled "Positive Reflection Calculi". Abstract: We deal with the fragment of propositional modal logic consisting of implications of formulas built up from variables and the constant `true' by conjunction and diamonds only. We call such fragments strictly positive. Interest in strictly positive modal logics emerged independently around 2010 in two different disciplines: the work on description logic by Zakharyaschev, Kurucz, et al., and the work on proof-theoretic applications of provability logic by myself, Dashkov, et al. The advantages of considering such fragments are twofold. On the one hand, strictly positive fragments of modal logics are usually (and not surprisingly) much simpler than the original logics. Typically, strictly positive fragments of standard modal logics are polytime decidable. On the other hand, the strictly positive language, being weaker than the standard modal language, allows for many more meaningful interpretations. In this talk we review basic results on strictly positive logics, their syntax and semantics. Furthermore, we develop the framework of reflection calculus, that is, a logic in which the diamonds are interpreted as reflection schemata in arithmetic, possibly of unrestricted logical complexity. This framework allows for a natural treatment of extensions of arithmetic by Tarskian truth predicates and the corresponding reflection principles.
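The strictly positive language described here admits a compact grammar (a sketch reconstructed from the abstract's description; the notation is mine):

```latex
\[
A \;::=\; p \;\mid\; \top \;\mid\; A \wedge A \;\mid\; \Diamond A,
\qquad
\text{with theorems of the form } A \to B .
\]
```

Negation, disjunction, and boxes are absent from this fragment, which is part of why, as the abstract notes, such fragments are typically far simpler (often polytime decidable) than the full modal logics they come from.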

Counterfactual Belief and Actuality

Mar 17, 2018 · 54:59


Jan Heylen (KU Leuven) gives a talk at the MCMP Colloquium (19 November, 2015) titled "Counterfactual Belief and Actuality". Abstract: The central question of this article is how to combine counterfactual theories of knowledge with the notion of actuality. It is argued that the straightforward combination of these two elements leads to problems, viz. the problem of collapsing knowledge and the problem of missing knowledge. In other words, there is overgeneration of knowledge and there is undergeneration of knowledge. The combination of these problems cannot be solved by appealing to methods by which beliefs are formed. An alternative solution is put forward. The key is to rethink the closeness relation that is at the heart of counterfactual theories of knowledge.

Where are the Women in Medieval Logic?

Play Episode Listen Later Mar 17, 2018 41:31


Sara Uckelman (Durham University) gives a talk at the MCMP Colloquium (18 November, 2015) titled "Where are the Women in Medieval Logic?". Abstract: Recent research into medieval logic has shown that the field is full of material of interest to contemporary logicians, from dynamic analyses of the relationship between proof and knowledge, to novel solutions to the Liar paradox (and others), to logics for deceit and lying, to reasoning about uncertainty and ignorance, and much, much more. Recent research has also shown that many contemporary academic disciplines, mathematics and philosophy included, suffer from a problematic gender imbalance, with women disproportionately underrepresented both in academic positions and in teaching syllabi. Recognition of this fact has come with a push to revisit the history of these fields to resurrect and rehabilitate the contributions of women. This talk stems from the intersection of these two developments and seeks to answer the question "Where are the women in medieval logic?" We currently don't know of any female author of logical works in the Middle Ages. There are (at least) two possible explanations: (i) we haven't discovered any yet; (ii) there weren't any. We consider the relative merits of both explanations, discussing the historical and cultural/contextual factors that prevented women from participating in mainstream logical research. Next we turn to the question of 'what is logic?', showing how the answer to this question determines the answer to the original one: for if we expand 'logic' to include not only the production of theoretical, instructional treatises but also the application of Aristotelian dialectic in non-academic contexts, we find a number of women worthy of further investigation when developing a more complete history of logic.

Truthlikeness, Accuracy and Epistemic Value

Play Episode Listen Later Mar 17, 2018 61:29


Graham Oddie (Boulder) gives a talk at the MCMP Colloquium (29 October, 2015) titled "Truthlikeness, Accuracy and Epistemic Value".

Doxastic Responsibility and the Basing Relation

Play Episode Listen Later Mar 17, 2018 55:56


Anne Meylan (Basel) gives a talk at the MCMP Colloquium (11 November, 2015) titled "Doxastic Responsibility and the Basing Relation". Abstract: People are responsible for their beliefs and not only for their actions. However, they are apparently not able to control their beliefs as they are able to control their actions. This is what I call “the problem of doxastic responsibility”. The aim of this presentation is to describe a difficulty for a particular solution to this problem. This solution, which I dub “the solution of reasons-responsiveness”, has been extensively defended recently. The plan is as follows. In the first and second part, I present the problem of doxastic responsibility and the solution of reasons-responsiveness respectively. Crucially, this solution is largely inspired by an account of our moral responsibility for actions that takes reasons-responsiveness to be necessary and sufficient for responsibility for actions. In the third and final part, I level a potential objection of my own against the solution of reasons-responsiveness. Briefly, the solution of reasons-responsiveness does not capture the difference between cases of based and cases of un-based beliefs.

Admissibility Decisions, Permissible Previsions

Play Episode Listen Later Mar 17, 2018 49:53


Arthur Pedersen (Max Planck Institute/MCMP) gives a talk at the MCMP Colloquium (5 November, 2015) titled "Admissibility Decisions, Permissible Previsions". Abstract: In this talk I shall consider the problem of designing a theory of judgment and decision making that adequately addresses outstanding challenges to the normative adequacy of strict “Bayesian” theories inspired by the pioneering developments of Ramsey, de Finetti, and Savage. The questions I shall ask are basic ones: how are personal opinions expressed in situations of uncertainty, especially those where information is conflicting or scarce; what bearing do they have upon decisions taken to promote personal objectives; and what norms—in particular, what decision-making criteria—set appropriate standards against which to evaluate interrelated states of attitudinal judgments such as these? I shall propose a theory of judgment and decision making that advances constitutional reform along two dimensions: (a) it repeals gratuitous requirements the strict Bayesian canon imposes on states of attitudinal judgment (e.g., complete comparability of personal opinions), and (b) it enacts mandatory requirements the strict Bayesian canon fails to impose on such states (e.g., admissibility—rejection of weakly dominated courses of action). Unlike alternative proposals, the theory I advance (i) specifies the function of an agent’s judgments in inquiry and decision making, (ii) does not capitulate to the mathematical convenience of weakened Archimedean restrictions (or other regularity conditions, e.g., measurability, topological, or subsidiary cardinality restrictions), and (iii) subsumes existing proposals (admitting numerical representations in terms of formal power series fields in a single infinitesimal).
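The admissibility requirement mentioned in point (b) of the abstract can be sketched concretely (the payoff numbers and act names below are made up for illustration): an act is weakly dominated if some other act does at least as well in every state and strictly better in at least one, and admissibility rejects such acts:

```python
# Admissibility as rejection of weakly dominated acts. Each act is a
# tuple of payoffs, one per state of the world; values are illustrative.
def weakly_dominates(a, b):
    """Does payoff profile a weakly dominate profile b?"""
    return all(x >= y for x, y in zip(a, b)) and any(x > y for x, y in zip(a, b))

def admissible(acts):
    """Keep only acts not weakly dominated by any other act."""
    return {name: p for name, p in acts.items()
            if not any(weakly_dominates(q, p)
                       for other, q in acts.items() if other != name)}

acts = {"f": (1, 1, 0),   # weakly dominated by g
        "g": (1, 1, 1),
        "h": (2, 0, 0)}   # undominated: strictly best in state 1
print(sorted(admissible(acts)))  # ['g', 'h']
```

Note that a strict expected-utility maximizer with a probability assigning zero weight to the third state would be indifferent between f and g; admissibility still rules f out, which is one way the proposed requirements go beyond the strict Bayesian canon.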

Gustav Shpet on the Function of Understanding History

Play Episode Listen Later Mar 17, 2018 47:43


Elena Tatievskaya (Augsburg) gives a talk at the MCMP Colloquium (4 November, 2015) titled "Gustav Shpet on the Function of Understanding History". Abstract: I discuss the question whether the method of history can be characterized with the help of the concept of understanding, without reference to empathy. I consider as an argument for such a possibility Shpet’s (1879-1937) “logic of history”. Its formulation is based upon the assumption that the subject matter of history defines the peculiarities of its methods. According to Shpet, history deals with “organizations” which function as “messages”. The meaning of such a message is an end-means relation to be defined. Therefore understanding, which is an instrument of historical cognition and fulfills the task of concept formation, is directed at objects which have a semiotic nature. Historical concepts differ from the concepts of the natural sciences and are formulated in the form of descriptions used to explain an individual historical change by means of a theory.

First Steps towards Non-Classical Logic of Informal Provability

Play Episode Listen Later Mar 17, 2018 34:31


Rafal Urbaniak (Ghent) gives a talk at the MCMP Colloquium (29 October, 2015) titled "First Steps towards Non-Classical Logic of Informal Provability". Abstract: Mathematicians prove theorems in a semi-formal setting, providing what we'll call informal proofs. There are various philosophical reasons not to reduce informal provability to formal provability within some appropriate axiomatic theory (see Marfori 2010, Leitgeb 2009). But the main worry is that we have a strong intuition that whatever is informally provable is true. So we seem committed to all instances of the so-called reflection schema: P(A) -> A (where P is the informal provability predicate). Yet a sufficiently strong arithmetical theory T, resulting from adding to PA (or any sufficiently strong arithmetic) all its instances for provability in T, will be inconsistent. The main idea behind most of the current approaches (Shapiro 1985, Horsten 1994, Horsten 1996) is to extend the language with a new informal provability predicate or operator, and include all instances of the reflection schema for it. Contradiction is avoided at the price of dropping one of the derivability conditions. Thus, various options regarding trade-offs between various principles, all of which seem convincing, are studied. In order to overcome some of the resulting difficulties and arbitrariness, we investigate a strategy which changes the underlying logic and treats informal provability as a partial notion, just as Kripke's theory of truth (Kripke 1975) treats truth. Alas, no three-valued logic can do for informal provability the job that K3 does for truth. The main reason is that the value of a complex formula in those logics is always a function of the values of its components. This fails to capture the fact that, for instance, some informally provable disjunctions of mathematical claims have informally provable disjuncts, while some others don't.
We develop a non-functional many-valued logic which avoids this problem and better captures our intuitions about informal provability. We describe the semantics of our logic and some of its properties. We argue that it does a better job when it comes to reasoning with informal provability predicate.
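The truth-functionality worry can be made concrete with a small sketch (mine, not the speaker's): in strong Kleene logic K3 the value of a disjunction is fixed entirely by the values of its disjuncts, so two intuitively different "undetermined" disjunctions cannot be told apart:

```python
# Strong Kleene (K3) disjunction, with values 1 (true), 0 (false) and
# None (undetermined). Because the connective is truth-functional, any
# two disjunctions whose disjuncts have the same values must get the
# same value - a single "undetermined" value cannot separate a
# disjunction with a provable disjunct from one without.
def k3_or(a, b):
    if a == 1 or b == 1:
        return 1
    if a == 0 and b == 0:
        return 0
    return None  # at least one disjunct is undetermined, none is true

# Two intuitively different cases feed in the same values,
# hence receive the same output:
case1 = k3_or(None, None)  # e.g. a disjunction with no provable disjunct
case2 = k3_or(None, None)  # e.g. a disjunction we can informally prove
print(case1, case2)  # None None - K3 cannot tell them apart
```

A non-functional semantics, as proposed in the talk, can assign these two disjunctions different values even though their components match.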

Epistemic Logic, Game Theory and Behavior

Play Episode Listen Later Mar 17, 2018 39:40


Rohit Parikh (CUNY) gives a talk at the MCMP Colloquium (29 October, 2015) titled "Epistemic Logic, Game Theory and Behavior". Abstract: Enormous developments have taken place in epistemic reasoning since the foundational work of Hintikka and Lewis. But the entry of computer scientists in this area has made for much more rapid and more technical developments. We will cover the following topics: (a) Kripke structures and Aumann structures; (b) Formalism and completeness; (c) Communication and change in knowledge thereby; (d) The dependence of action on states of knowledge; (e) Manipulation of actions via manipulation of knowledge; (f) Deducing states of belief from the observation of actions. We will use examples from literature, animal behavior and the behavior of children.
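Topic (a) of the abstract can be illustrated with a minimal, hypothetical Kripke structure (world names, agents and the valuation are made up): an agent knows p at a world iff p holds at every world the agent cannot distinguish from it.

```python
# A toy Kripke structure for epistemic logic: worlds, an accessibility
# (indistinguishability) relation per agent, and a valuation for the
# atomic proposition p. All names are illustrative.
worlds = {"w1", "w2"}
access = {"alice": {"w1": {"w1", "w2"}, "w2": {"w1", "w2"}},
          "bob":   {"w1": {"w1"},       "w2": {"w2"}}}
val = {"p": {"w1"}}  # p is true only at w1

def holds_p(world):
    return world in val["p"]

def knows_p(agent, world):
    """K_agent p holds iff p is true at every accessible world."""
    return all(holds_p(v) for v in access[agent][world])

print(knows_p("bob", "w1"))    # True:  bob can rule out w2
print(knows_p("alice", "w1"))  # False: alice considers w2 possible, where p fails
```

Communication, topic (c), then amounts to shrinking an agent's accessibility sets: if alice learns which world she is in, her relation collapses to bob's and she comes to know p at w1.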

Panel III: Discussion on "Has Physics changed? - and should it?"

Play Episode Listen Later Mar 13, 2018 96:38


Discussion on "Has Physics changed? - and should it?" at the Workshop on "Why trust a Theory?" (7-9 December, 2015).

Panel II: Discussion on "How far do we get with Empirical Data?"

Play Episode Listen Later Mar 13, 2018 85:02


Discussion on "How far do we get with Empirical Data?" at the Workshop on "Why trust a Theory?" (7-9 December, 2015).

Panel I: Discussion on "Why Trust a Theory?"

Play Episode Listen Later Mar 13, 2018 88:47


Discussion on "Why Trust a Theory?" at the Workshop on "Why trust a Theory?" (7-9 December, 2015).

String Theory to the Rescue

Play Episode Listen Later Mar 13, 2018 61:45


David Gross (UC Santa Barbara) presented Joseph Polchinski's (UC Santa Barbara) talk at the Workshop on "Why trust a Theory?" (7-9 December, 2015) titled "String Theory to the Rescue". Abstract: The search for a theory of quantum gravity faces two great challenges: the incredibly small scale of the Planck length and time, and the possibility that the observed constants of nature are in part the result of random processes. A priori, one might have expected these obstacles to be insuperable. However, clues from observed physics, and the discovery of string theory, raise the hope that the unification of quantum mechanics and general relativity is within reach.

What is a Theory?

Play Episode Listen Later Mar 13, 2018 53:37


David Gross (UC Santa Barbara) gives a talk at the Workshop on "Why trust a Theory?" (7-9 December, 2015) titled "What is a Theory?".

The Limits of Cosmology, Post-Planck

Play Episode Listen Later Mar 13, 2018 30:27


Joseph Silk (Johns Hopkins Univ. Baltimore/Univ. Pierre et Marie Curie, Paris) gives a talk at the Workshop on "Why trust a Theory?" (7-9 December, 2015) titled "The Limits of Cosmology, Post-Planck". Abstract: I will discuss how one might follow up on the Planck satellite which has given us a remarkable confirmation to high precision of what has become known as the “standard model of cosmology.” This model is purely phenomenological and establishes a robust framework around which a number of fundamental issues remain unresolved. In order to make further progress, what is our optimal choice of future strategy?

Fundamental Theories and Epistemic Shifts: Can History of Science serve as a Guide?

Play Episode Listen Later Mar 13, 2018 32:32


Helge Kragh (Copenhagen) gives a talk at the Workshop on "Why trust a Theory?" (7-9 December, 2015) titled "Fundamental Theories and Epistemic Shifts: Can History of Science serve as a Guide?". Abstract: Epistemic standards and methodologies of science inevitably reflect the successes and failures of the past. In this sense, they are in part of a historical nature. Moreover, the commonly accepted methodological criteria have to some extent changed over time. Faced with the problem of theories that cannot be tested empirically, perhaps not even in principle, it may be useful to look back in time to situations of a somewhat similar kind. Roughly speaking, previous suggestions of non-empirical testing have not fared well through the long history of science. Ambitious and fundamental theories of this kind have generally been failures, some of them grander than others. So, is there any reason to believe that they will not remain so in the future? Can we infer from history that empirical testability is a sine qua non for what we know as science? Not quite, for it is far from obvious that older scientific theories can be meaningfully compared to modern string theory or multiverse physics. History of science is at best an ambiguous guide to present and future problems, yet it does provide reasons for scepticism with regard to current suggestions of drastic epistemic shifts which essentially amount to a new “definition” of science.

Aspects of Quantum Gravity

Play Episode Listen Later Mar 13, 2018 41:13


Dieter Lüst (LMU) gives a talk at the Workshop on "Why trust a Theory?" (7-9 December, 2015) titled "Aspects of Quantum Gravity".

What can we learn from Analogue Experiments?

Play Episode Listen Later Mar 13, 2018 33:33


Karim Thebault (MCMP/LMU) gives a talk at the Workshop on "Why trust a Theory?" (7-9 December, 2015) titled "What can we learn from Analogue Experiments?". Abstract: In 1981 Unruh proposed that fluid mechanical experiments could be used to probe key aspects of the quantum phenomenology of black holes. In particular, he claimed that an analogue to Hawking radiation could be created within a fluid mechanical 'dumb hole'. Since then an entire sub-field of 'analogue gravity' has been created. In 2014 Steinhauer reported the experimental observation of Hawking radiation within a Bose-Einstein condensate dumb hole. What can we learn from such analogue experiments? In particular, can they provide confirmation of novel phenomena such as black hole Hawking radiation?

Considering the Role of Information Theory in Fundamental Physics

Play Episode Listen Later Mar 13, 2018 30:51


Chris Wüthrich (Geneva) gives a talk at the Workshop on "Why trust a Theory?" (7-9 December, 2015) titled "Considering the Role of Information Theory in Fundamental Physics". Abstract: Information theory presupposes the notion of an epistemic agent, such as a scientist or an idealized human. Despite that, information theory is increasingly invoked by physicists concerned with fundamental physics, physics at very high energies, or generally with the physics of situations in which even idealized epistemic agents cannot exist. In this talk, I shall try to determine the extent to which the application of information theory in those contexts is legitimate. I will illustrate my considerations using the case of black hole thermodynamics and Bekenstein's celebrated argument for his formula for the entropy of black holes. This example is particularly pertinent to the current workshop because it is widely accepted as 'empirical data' in notoriously data-deprived quantum gravity, even though the laws of black hole thermodynamics have so far evaded direct empirical confirmation.

Scientific Methodology: A View from Early String Theory

Play Episode Listen Later Mar 13, 2018 29:37


Elena Castellani (Florence) gives a talk at the Workshop on "Why trust a Theory?" (7-9 December, 2015) titled "Scientific Methodology: A View from Early String Theory". Abstract: Looking at the developments of quantum field theory and string theory since their very beginnings, it does not seem that the methodology of fundamental physics has changed. The same strategies are applied in theory building and assessment. The methodology leading to the string idea and its successive developments is the same one we find in many fundamental developments in theoretical physics, developments that have been crowned with successful empirical confirmation (sometimes after a number of years), from the history of the positron to the Higgs particle.

Lost in Math

Play Episode Listen Later Mar 13, 2018 29:17


Sabine Hossenfelder (NORDITA, Stockholm) gives a talk at the Workshop on "Why trust a Theory?" (7-9 December, 2015) titled "Lost in Math". Abstract: I will speak about the role of social and cognitive biases in hypotheses pre-selection, and reflect on the rationale behind the concepts of naturalness, simplicity and beauty.

Limits in testing the Multiverse

Play Episode Listen Later Mar 13, 2018 42:00


George Ellis (Cape Town) gives a talk at the Workshop on "Why trust a Theory?" (7-9 December, 2015) titled "Limits in testing the Multiverse". Abstract: Our ability to test cosmological models is severely constrained by visual horizons on the one hand, and physical horizons (limits on testing physical theories) on the other. Various arguments have been given to get round these limitations. I will argue that these amount to philosophical choices, which may or may not correspond to physical reality, and hence the resulting claims do not amount to established scientific results. This holds in particular for a variety of claims of the physical existence of infinities of galaxies, universes, or beings like ourselves in a multiverse. We need a strong philosophical stance to distinguish which of these claims should indeed be regarded as proven science, and which should not.

Theory in Fundamental Physics: The View from the Outside

Play Episode Listen Later Mar 13, 2018 34:34


Massimo Pigliucci (City College of New York) gives a talk at the Workshop on "Why trust a Theory?" (7-9 December, 2015) titled "Theory in Fundamental Physics: The View from the Outside". Abstract: Trouble, as explicitly hinted at in the title of a recent book by Lee Smolin, has been brewing for a while within the fundamental physics community. Ideas such as string theory and the multiverse have been both vehemently defended as sound science and widely criticized for being “not even wrong,” in the title of another book, by Peter Woit. Recently, George Ellis and Joe Silk have written a prominent op-ed piece in Nature, inviting their colleagues to defend the very integrity of physics. To which cosmologist Sean Carroll has responded that physics doesn’t need "the falsifiability police,” referring to the famous (and often misunderstood or badly applied) concept introduced by Karl Popper to demarcate science from pseudoscience. The debate isn’t just “for the heart and soul” of physics, it has spilled onto social media, newspapers and public radio. What is at stake is the public credibility of physics in particular and of science more generally — especially in an era of widespread science denial (of evolution and anthropogenic climate change) and rampant pseudoscience (antivax movement). Since philosophers of science have been invoked by both sides, it is time to take a look at the “physics wars” from a detached philosophical perspective, in my case informed also by my former career as an evolutionary biologist, a field that has peculiar similarities with what is going on in fundamental physics, both in terms of strong internal disputes and of perception by a significant portion of the general public.

Secret Quantum Lives of Black Holes and Dark Energy

Play Episode Listen Later Mar 13, 2018 36:23


Georgi Dvali (LMU) gives a talk at the Workshop on "Why trust a Theory?" (7-9 December, 2015) titled "Secret Quantum Lives of Black Holes and Dark Energy".

Non-empirical Confirmation

Play Episode Listen Later Mar 13, 2018 33:41


Richard Dawid (MCMP/LMU) gives a talk at the Workshop on "Why trust a Theory?" (7-9 December, 2015) titled "Non-empirical Confirmation". Abstract: The talk will analyse reasons for the high degree of trust many physicists have developed in empirically unconfirmed theories. An extension of the concept of theory confirmation (to be called “non-empirical confirmation”) will be suggested that allows for confirmation by observations that are not predicted by the theory in question. The last part of the talk will address a number of worries that have been raised with respect to this approach.

Physics without Experiments?

Play Episode Listen Later Mar 13, 2018 27:18


Radin Dardashti (MCMP/LMU) gives a talk at the Workshop on "Why trust a Theory?" (7-9 December, 2015) titled "Physics without Experiments?". Abstract: Most of the fundamental theories in modern physics are relevant at energy scales or length scales where empirical access is currently hard to obtain. Accounts of theory assessment within the philosophy of science literature are, however, usually concerned with the relation between theories and the empirical data they predict. So these “traditional” approaches seem not to allow for theory assessment in these cases. Recently, Richard Dawid has developed an account of non-empirical theory assessment, which uses evidence not entailed by the theory to assess that theory. In this talk I present a problem-oriented perspective on these issues, which allows us to better assess the possibilities and limitations of non-empirical theory assessment.
