MCMP – Philosophy of Science


Mathematical Philosophy - the application of logical and mathematical methods in philosophy - is about to experience a tremendous boom in various areas of philosophy. At the new Munich Center for Mathematical Philosophy, which is funded mostly by the German Alexander von Humboldt Foundation, philoso…

MCMP Team

  • Latest episode: Apr 18, 2019
  • New episodes: infrequent
  • Average duration: 46m
  • Episodes: 86



Latest episodes from MCMP – Philosophy of Science

How Almost Everything in Space-time Theory Is Illuminated by Simple Particle Physics: The Neglected Case of Massive Scalar Gravity

Apr 18, 2019 · 59:13


J. Brian Pitts (Cambridge) gives a talk at the MCMP Colloquium (6 February, 2013) titled "How Almost Everything in Space-time Theory Is Illuminated by Simple Particle Physics: The Neglected Case of Massive Scalar Gravity". Abstract: Both particle physics from the 1920s-30s and the 1890s Seeliger-Neumann modification of Newtonian gravity suggest considering a “mass term,” an additional algebraic term in the gravitational potential. The “graviton mass” gives gravity a finite range. The smooth massless limit implies underdetermination. In 1914 Nordström generalized Newtonian gravity to fit Special Relativity. Why not do to Nordström what Seeliger and Neumann did to Newton? Einstein started in setting up a (faulty!) analogy for his cosmological constant Λ. Scalar gravities, though not empirically viable since the 1919 bending of light observations, provide a useful test bed for tensor theories like General Relativity. Massive scalar gravity, though not completed in a timely way, sheds philosophical light on most issues in contemporary and 20th century space-time theory. A mass term shrinks the symmetry group to that of Special Relativity and violates Einstein's principles (general covariance, general relativity, equivalence and Mach) in empirically small but conceptually large ways. Geometry is a poor guide to massive scalar gravities in comparison to detailed study of the field equation or Lagrangian. Matter sees a conformally flat metric because gravity distorts volumes while leaving the speed of light alone, but gravity sees the whole flat metric due to the mass term. Largely with Poincaré (pace Eddington), one can contemplate a “true” flat geometry differing from what material rods and clocks disclose. But questions about “true” geometry need no answer and tend to block inquiry. Presumptively one should expect analogous results for the tensor (massive spin 2) case modifying Einstein’s equations. A case to the contrary was made only in 1970-72: an apparently fatal dilemma involving either instability or empirical falsification appeared. But dark energy measurements since 1999 cast some doubt on General Relativity (massless spin 2) at long distances. Recent calculations (2000s, some from 2010) show that instability can be avoided and that empirical falsification likely can be as well, making massive spin 2 gravity a serious rival for GR. Particle physics can let philosophers proportion belief to evidence over time, rather than suffering from unconceived alternatives.
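For readers unfamiliar with what a "mass term" does, here is a standard illustration (mine, not taken from the talk): the Seeliger-Neumann move replaces the Poisson equation of Newtonian gravity by a screened version whose point-mass potential has finite range 1/m and recovers the Newtonian potential smoothly as the mass goes to zero.

```latex
\nabla^{2}\phi = 4\pi G\rho
\;\;\longrightarrow\;\;
\nabla^{2}\phi - m^{2}\phi = 4\pi G\rho ,
\qquad
\phi(r) = -\,\frac{G M\, e^{-m r}}{r} \;\to\; -\,\frac{G M}{r} \quad (m \to 0).
```

The same pattern, an algebraic m²-term added to the field equation, is what the massive scalar and massive spin-2 gravities discussed in the abstract implement in the relativistic setting.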

Evaluating Risky Prospects: The Distribution View

Apr 18, 2019 · 63:22


Luc Bovens (LSE) gives a talk at the 6th Munich-Sydney-Tilburg Conference on "Models and Decisions" (10-12 April, 2013) titled "Evaluating Risky Prospects: The Distribution View". Abstract: Policy analysts need to rank policies with risky outcomes. Such policies can be thought of as prospects. A prospect is a matrix of utilities. On the rows we list the people who are affected by the policy. In the columns we list alternative states of the world and specify a probability distribution over the states. I provide a taxonomy of various ex ante and ex post distributional concerns that enter into such policy evaluations and construct a general method that reflects these concerns, integrates the ex ante and ex post calculus, and generates orderings over policies. I show that Parfit’s Priority View is a special case of the Distribution View.
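As a concrete, deliberately toy rendering of the prospect-as-matrix idea described above (not of Bovens's actual Distribution View), the sketch below builds a people-by-states utility matrix with a probability distribution over states and reads off the ex ante expected utilities and the ex post utility profiles.

```python
# A minimal sketch, assuming a toy two-person, two-state prospect.
import numpy as np

utilities = np.array([      # rows: persons, columns: states of the world
    [10.0, 0.0],            # person 1
    [0.0, 10.0],            # person 2
])
probs = np.array([0.5, 0.5])   # probability distribution over the states

# Ex ante perspective: each person's expected utility before the state is known.
ex_ante = utilities @ probs                      # -> [5., 5.]

# Ex post perspective: the distribution over realized utility profiles.
ex_post_profiles = [(p, utilities[:, s]) for s, p in enumerate(probs)]

print("ex ante expected utilities:", ex_ante)
for p, profile in ex_post_profiles:
    print(f"with probability {p}: realized utilities {profile}")
```

Ex ante the two persons are treated equally, while every ex post outcome is maximally unequal; the tension between these two perspectives is what a combined distributional evaluation has to adjudicate.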

On the Conception of Fundamentality of Time-Asymmetries in Physics

Apr 18, 2019 · 50:13


Daniel Wohlfarth (Bonn) gives a talk at the MCMP Colloquium (30 January, 2013) titled "On the Conception of Fundamentality of Time-Asymmetries in Physics". Abstract: The goal of my talk is to argue for two connected proposals. Firstly, I shall show that a new conceptual understanding of the term ‘fundamentality’ - in the context of time-asymmetries - is applicable to cosmology and in fact shows that classical and semi-classical cosmology should be understood as time-asymmetric theories. Secondly, I will show that the proposed conceptual understanding of ‘fundamentality’, applied to cosmological models with a hyperbolically curved spacetime structure, provides a new understanding of the origin of the (quantum) thermodynamic time-asymmetry. In the proposed understanding a ‘quantum version’ of the second law can be formulated. This version is explicitly time-asymmetric (decreasing entropy with decreasing time coordinates and vice versa). Moreover, the physical effectiveness of the time-asymmetry will be based on the crucial Einstein equations and additional calculations in QFT. Therefore, the physical effectiveness of the time-asymmetry will be independent of an ontic interpretation of ‘entropy’ itself. The whole account is located in the setting of semi-classical quantum cosmology (without an attempt to quantize gravity) and depends on the definability of any cosmic time coordinates.

Simplicity and Measurability in Science

Apr 18, 2019 · 42:10


Luigi Scorzato (Roskilde) gives a talk at the MCMP Colloquium (16 January, 2013) titled "Simplicity and Measurability in Science". Abstract: Simple assumptions represent a decisive reason to prefer one theory to another in everyday scientific praxis. But this praxis has little philosophical justification, since there exist many notions of simplicity, and those that can be defined precisely strongly depend on the language in which the theory is formulated. Moreover, according to a common general argument, the simplicity of a theory is always trivial in a suitably chosen language. However, this "trivialization argument" is always either applied to toy models of scientific theories or applied with little regard for the empirical content of the theory. In this paper I show that the trivialization argument fails when one considers realistic theories and requires their empirical content to be preserved. In fact, the concepts that enable a very simple formulation are not, in general, necessarily measurable. Moreover, the inspection of a theory describing a chaotic billiard shows that precisely those concepts that naturally make the theory extremely simple are provably not measurable. This suggests that, whenever a theory possesses sufficiently complex consequences, the constraint of measurability prevents overly simple formulations in any language. In this paper I propose a way to introduce the constraint of measurability into the formulation of a scientific theory in such a way that the notion of simplicity acquires a general and sufficiently precise meaning. I argue that this explains why scientists often regard their assessments of simplicity as largely unambiguous.

Descriptivism about Theoretical Concepts Implies Ramsification or (Poincarean) Conventionalism

Apr 18, 2019 · 47:36


Holger Andreas (MCMP/LMU) gives a talk at the conference on "The Analysis of Theoretical Terms" (3-5 April, 2013) titled "Descriptivism about Theoretical Concepts Implies Ramsification or (Poincarean) Conventionalism".

Theoretical Terms and Induction

Apr 18, 2019 · 58:15


Hannes Leitgeb (LMU/MCMP) gives a talk at the conference on "The Analysis of Theoretical Terms" (3-5 April, 2013) titled "Theoretical Terms and Induction".

Causality and Theoretical Terms in Physics

Apr 18, 2019 · 50:31


C. Ulises Moulines (LMU) gives a talk at the conference on "The Analysis of Theoretical Terms" (3-5 April, 2013) titled "Causality and Theoretical Terms in Physics".

Theoretical Terms, Ideal Objects and Zalta's Abstract Objects Theory

Apr 18, 2019 · 33:59


Xavier de Donato (USC) gives a talk at the conference on "The Analysis of Theoretical Terms" (3-5 April, 2013) titled "Theoretical Terms, Ideal Objects and Zalta's Abstract Objects Theory".

Causal-descriptivism Revisited

Apr 18, 2019 · 54:45


Stathis Psillos (Athens) gives a talk at the conference on "The Analysis of Theoretical Terms" (3-5 April, 2013) titled "Causal-descriptivism Revisited".

Avoiding Reification

Apr 18, 2019 · 29:27


Michele Ginammi (Pisa) gives a talk at the conference on "The Analysis of Theoretical Terms" (3-5 April, 2013) titled "Avoiding Reification".

Leibniz Equivalence

Apr 18, 2019 · 53:45


Jeffrey Ketland (Oxford) gives a talk at the conference on "The Analysis of Theoretical Terms" (3-5 April, 2013) titled "Leibniz Equivalence".

Implicitly defining mathematical terms

Apr 18, 2019 · 35:15


Demetra Christopoulou (Patras) gives a talk at the conference on "The Analysis of Theoretical Terms" (3-5 April, 2013) titled "Implicitly defining mathematical terms".

Definition, elimination and introduction of theoretical terms

Apr 18, 2019 · 31:03


Gauvain Leconte (Paris) gives a talk at the conference on "The Analysis of Theoretical Terms" (3-5 April, 2013) titled "Definition, elimination and introduction of theoretical terms".

Theoretical Terms, Ramsey Sentences and Structural Realism

Apr 18, 2019 · 49:26


John Worrall (LSE) gives a talk at the conference on "The Analysis of Theoretical Terms" (3-5 April, 2013) titled "Theoretical Terms, Ramsey Sentences and Structural Realism".

Typicality in Statistical Physics and Dynamical Systems Theory

Apr 18, 2019 · 38:04


Charlotte Werndl (LSE) gives a talk at the conference on "The Analysis of Theoretical Terms" (3-5 April, 2013) titled "Typicality in Statistical Physics and Dynamical Systems Theory".

The epsilon-reconstruction of theories and scientific structuralism

Apr 18, 2019 · 30:28


Georg Schiemer (LMU/MCMP) gives a talk at the conference on "The Analysis of Theoretical Terms" (3-5 April, 2013) titled "The epsilon-reconstruction of theories and scientific structuralism".

The Criteria for the Empirical Significance of Terms

Apr 18, 2019 · 35:24


Sebastian Lutz (LMU/MCMP) gives a talk at the conference on "The Analysis of Theoretical Terms" (3-5 April, 2013) titled "The Criteria for the Empirical Significance of Terms".

Cooperation and (structural) Rationality

Apr 18, 2019 · 51:36


Julian Nida-Rümelin (LMU) gives a talk at the 6th Munich-Sydney-Tilburg Conference on "Models and Decisions" (10-12 April, 2013) titled "Cooperation and (structural) Rationality". Abstract: Cooperation remains a challenge for the theory of rationality: rational agents should not cooperate in one-shot prisoner's dilemmas. But they do, it seems. There is a reason why mainstream rational choice theory is at odds with cooperative agency: rational action is thought to be consequentialist, but this is wrong. If we give up consequentialism and adopt a structural account of rationality, the problem resolves, as will be shown. In the second part of my lecture I shall show that structural rationality can be combined with Bayesianism, contrary to what one may expect. And finally I shall discuss some philosophical implications of structural rationality.
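For concreteness, here is the textbook one-shot prisoner's dilemma (standard illustrative payoffs, not taken from the talk) that generates the puzzle: defection strictly dominates cooperation, so consequentialist rational choice theory predicts no cooperation.

```python
# A minimal sketch of the standard one-shot prisoner's dilemma with
# conventional textbook payoffs, checking that "defect" strictly dominates.
payoffs = {  # (my move, other's move) -> my payoff
    ("C", "C"): 3, ("C", "D"): 0,
    ("D", "C"): 5, ("D", "D"): 1,
}
for other in ("C", "D"):
    best = max(("C", "D"), key=lambda mine: payoffs[(mine, other)])
    print(f"if the other player plays {other}, my best reply is {best}")
# prints D in both cases, even though mutual cooperation (3,3) is better for
# both players than mutual defection (1,1)
```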

Idealization, Prediction, Difference-Making

Apr 18, 2019 · 41:58


Michael Strevens (NYU) gives a talk at the 6th Munich-Sydney-Tilburg Conference on "Models and Decisions" (10-12 April, 2013) titled "Idealization, Prediction, Difference-Making". Abstract: Every model leaves out or distorts some factors that are causally connected to its target phenomena – the phenomena that it seeks to predict or explain. If we want to make predictions, and we want to base decisions on those predictions, what is it safe to omit or to simplify, and what ought a causal model to capture fully and correctly? A schematic answer: the factors that matter are those that make a difference to the target phenomena. There are several ways to understand the notion of difference-making. Which are the most useful to the forecaster, to the decision-maker? This paper advances a view.

Rationality and the Bayesian Paradigm

Apr 18, 2019 · 46:59


Itzhak Gilboa (HEC) gives a talk at the 6th Munich-Sydney-Tilburg Conference on "Models and Decisions" (10-12 April, 2013) titled "Rationality and the Bayesian Paradigm". Abstract: It is claimed that rationality does not imply Bayesianism. We first define what is meant by the two terms, so that the statement is not tautologically false. Two notions of rationality are discussed, and related to two main approaches to statistical inference. It is followed by a brief survey of the arguments against the definition of rationality by Savage's axioms, as well as some alternative approaches to decision making.

From Shannon's Axiomatic Approach to a New Sense of Biological Information

Apr 18, 2019 · 52:56


Omri Tal (CPNSS/LSE) gives a talk at the MCMP Colloquium (17 April, 2013) titled "From Shannon's Axiomatic Approach to a New Sense of Biological Information". Abstract: Shannon famously remarked that a single concept of information could not satisfactorily account for the numerous possible applications of the general field of communication theory. Recent interest in assessing the ‘population signal’ from genetic samples has mainly focused on empirical results. I employ some basic principles from Shannon’s work on information theory (Shannon 1948) to develop a measure of information for extracting ‘population structure’ from genetic data. This sense of information is somewhat less abstract than entropy or Kolmogorov Complexity and is utility-oriented. Specifically, given a collection of genotypes sampled from known multiple populations I would like to quantify the potential for correct classification of genotypes of unknown origin. Motivated by Shannon's axiomatic approach in deriving a unique information measure for communication, I first identify a set of intuitively justifiable criteria that any such quantitative information measure should satisfy. I will show that standard information-theoretic concepts such as mutual information or relative entropy cannot satisfactorily account for this sense of information, necessitating a decision-theoretic approach.
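A toy calculation of my own (not from the talk) illustrating why classification-oriented and information-theoretic measures of "population signal" can come apart: for a single biallelic locus and two equally likely source populations, mutual information and Bayes-optimal assignment accuracy are simply different quantities.

```python
# A minimal sketch under assumed allele frequencies (0.7 vs 0.3); the
# populations, locus, and numbers are hypothetical.
import numpy as np

p = np.array([0.7, 0.3])        # frequency of allele "A" in populations 1 and 2
prior = np.array([0.5, 0.5])    # prior over populations

# joint[pop, allele]: allele 0 = "A", allele 1 = "a"
joint = np.stack([prior * p, prior * (1 - p)], axis=1)

def mutual_information(joint):
    px = joint.sum(axis=1, keepdims=True)
    py = joint.sum(axis=0, keepdims=True)
    nz = joint > 0
    return float((joint[nz] * np.log2(joint[nz] / (px @ py)[nz])).sum())

bayes_accuracy = joint.max(axis=0).sum()   # pick the likelier population per allele

print("mutual information (bits):", round(mutual_information(joint), 4))
print("Bayes classification accuracy:", round(float(bayes_accuracy), 4))
```

Here the locus carries only about 0.12 bits of mutual information yet already allows 70% correct assignment, which is one way to see why a utility-oriented, classification-based sense of information is not just entropy in disguise.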

A Model-Based Epistemology of Measurement

Apr 18, 2019 · 49:33


Eran Tal (Bielefeld) gives a talk at the MCMP Colloquium (12 June, 2013) titled "A Model-Based Epistemology of Measurement". Abstract: The epistemology of measurement is an interdisciplinary area of research concerned with the conditions under which measurement and standardization methods produce knowledge, the nature, scope, and limits of this knowledge, and the sources of its reliability. A primary goal of such studies is to better understand the ways in which theoretical and statistical assumptions about a measurement process influence the content and quality of its outcomes. Such assumptions often involve idealizations, that is, intentional distortions of aspects of the measuring instrument, measured object and environment that appear to threaten the accuracy and objectivity of measurement. Here I argue that the opposite is the case: idealization is a necessary precondition for obtaining accurate and objective measurement outcomes. A measurement outcome, I submit, is a value range assigned to a parameter in a model in a way that allows the model to coherently predict the final states (‘indications’) of a process. Idealizations are necessary for identifying the measured parameter with a particular object, for distinguishing genuine effects from errors and for comparing measurement outcomes to each other. These claims are exemplified with a study of the contemporary evaluation and comparison of atomic clocks across national metrological laboratories. Building on these insights, I conclude by highlighting the promise held by model-based approaches for further research in the epistemology of measurement.

Separating Truth from Its Idealization

Apr 18, 2019 · 47:56


Paul Teller (UC Davis) gives a talk at the MCMP Colloquium (18 April, 2013) titled "Separating Truth from Its Idealization". Abstract: Science never succeeds in providing representations that are both perfectly precise and completely accurate. Instead science constructs models that are always in some ways inexact – imprecise, not perfectly accurate, or both. If this goes for the results of science, how much more should we expect it to hold for human knowledge generally! I explore this expectation for the project of modeling what it is for a statement to be true. The familiar model of characterizing truth in terms of predicating a precisely delimited property of a precisely delimited referent succeeds famously in characterizing semantic structure, but falters with questions about application to the world because we rarely, if ever, succeed in perfectly determinately picking out properties and referents. I sketch an alternative model-building approach that takes advantage of the ubiquitous occurrence of imprecision: For an imprecise statement to be true is for its precise “semantic alter-ego”, though false, to function as a truth.

Making sense of multiple climate models' projections

Apr 18, 2019 · 38:39


Claudia Tebaldi (Climate Central & NCAR) gives a talk at the 6th Munich-Sydney-Tilburg Conference on "Models and Decisions" (10-12 April, 2013) titled "Making sense of multiple climate models' projections". Abstract: In the last decade or so the climate change research community has adopted multi-model ensemble projections as the standard paradigm for the characterization of future climate changes. Why multiple models, and how we reconcile and synthesize -- or fail to -- their different projections, even under the same scenarios of future greenhouse gas emissions, will be the themes of my talk. Multi-model ensembles are fundamental to exploring an important source of uncertainty, that of model structural assumptions. Different models have different strengths and weaknesses, and how we use observations to diagnose those strengths and weaknesses, and then how we translate model performance into a measure of model reliability, are currently open research questions. The inter-dependencies among models, the existence of common errors and biases, are also a challenge to the interpretation of statistics from multi-model output. All this constitutes an interesting research field in the abstract, whose most current directions I will try to sketch in my talk, but is also critical to understand in the course of utilizing model output for practical purposes, to inform policy and decision making for adaptation and mitigation.

On the Justification of Deduction and Induction

Apr 18, 2019 · 69:55


Franz Huber (Toronto) gives a talk at the MCMP Colloquium (7 May, 2014) titled "On the Justification of Deduction and Induction". Abstract: In this talk I will first present my preferred variant of Hume (1739; 1748)'s argument for the thesis that we cannot justify the principle of induction. Then I will criticize the responses the resulting problem of induction has received by Carnap (1963; 1968) and by Goodman (1954), as well as briefly praise Reichenbach (1938; 1940)'s approach. Some of these authors compare induction to deduction. Haack (1976) compares deduction to induction. I will critically discuss her argument for the thesis that it is impossible to justify the principles of deduction next. In concluding I will defend the thesis that we can justify induction by deduction, and deduction by induction. Along the way I will show how we can understand deduction and induction as normative theories, and I will argue that there are only hypothetical, but no categorical imperatives.

An Analogical Inductive Logic for Partially Exchangeable Families of Attributes

Apr 18, 2019 · 74:15


Simon Huttegger (UC Irvine) gives a talk at the MCMP Colloquium (22 May, 2014) titled "An Analogical Inductive Logic for Partially Exchangeable Families of Attributes". Abstract: Since Carnap started his epic program of developing an inductive logic, there have been various attempts to include analogical reasoning into systems of inductive logic. I will present a new system based on de Finetti's concept of partial exchangeability. Together with a set of plausible axioms, partial exchangeability allows one to derive a family of inductive learning rules with enumerative analogical effects.
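For background (standard material, not Huttegger's new system): Carnap's λ-continuum gives the exchangeable predictive rule below for a family of k attributes. Analogical effects require predictions to be sensitive not only to the count of the observed attribute itself but also to counts of similar attributes, which this rule ignores.

```latex
P\bigl(X_{n+1} = j \,\big|\, n_1, \dots, n_k\bigr) \;=\; \frac{n_j + \lambda/k}{\,n + \lambda\,},
\qquad n = n_1 + \dots + n_k,\quad \lambda > 0 .
```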

On Bell's local causality in local classical and quantum theory

Apr 18, 2019 · 58:19


Gábor Hofer-Szabó (Budapest) gives a talk at the MCMP Colloquium (9 April, 2014) titled "On Bell's local causality in local classical and quantum theory". Abstract: This paper aims to give a clear-cut definition of Bell's notion of local causality. Having provided a framework, called local physical theory, which integrates probabilistic and spatiotemporal concepts, we formulate the notion of local causality and relate it to other locality and causality concepts. Then we compare Bell's local causality with Reichenbach's Common Cause Principle and relate both to the Bell inequalities. We find a nice parallelism: both local causality and the Common Cause Principle are more general notions than captured by the Bell inequalities. Namely, the Bell inequalities can be derived neither from local causality nor from a common cause unless the local physical theory is classical or the common cause is commuting, respectively.
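For reference, the Bell inequalities mentioned here are standardly given in the CHSH form (quoted for concreteness; the paper may use a different formulation): for measurement settings a, a' and b, b' on the two wings and outcome correlations E(·,·),

```latex
\bigl|\, E(a,b) + E(a,b') + E(a',b) - E(a',b') \,\bigr| \;\le\; 2 .
```

Quantum mechanics predicts violations of this bound up to 2√2.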

The Completion of Logical Empiricism: Hempel's Pragmatic Turn

Apr 18, 2019 · 36:45


Gereon Wolters (Konstanz) gives a talk at the MCMP Colloquium (22 October, 2014) titled "The Completion of Logical Empiricism: Hempel's Pragmatic Turn". Abstract: For most of his life Carl Gustav Hempel (1905-1997) subscribed to the Carnapian variant of logical empiricism. According to Rudolf Carnap (1891-1970), philosophy of science is "rational reconstruction" (syntactically and/or semantically) of basic methodological concepts like probability, explanation, confirmation, and so on. Practically unnoticed by the philosophical community, Hempel later gave up this approach and developed an "explanatory-normative methodology" (E-N methodology). Decisive for his change was the work of Thomas S. Kuhn (1922-1996), whom Hempel had first met at Stanford in 1963/64. Hempel interpreted this pragmatic turn as a return to the Neurathian (Otto Neurath (1882-1945)) variant of logical empiricism.

Computational Models as Generic Mechanisms

Apr 18, 2019 · 48:58


Catherine Stinson (Ryerson University) gives a talk at the MCMP Colloquium (21 May, 2014) titled "Computational Models as Generic Mechanisms". Abstract: The role of computational models in science is a bit of a puzzle. They seem to be very unlike experiments in terms of their access to empirical facts about their target systems, yet scientists make liberal use of computational models to experiment and make discoveries. I connect this problem to one concerning mechanistic explanation. There a puzzle arises as to how schematic or abstract mechanisms can be explanatory, which they often seem to be, if one is committed to thinking of explanation as intimately connected to causation. Abstractions aren’t the sorts of things that have causal powers. A solution to both problems is to think of computational models not as abstractions, but as bare instantiations of abstract types, which I’ll call generics. Generics are the sorts of things that have causal powers. Computational models can then be considered experiments on generics, which gives them access to empirical facts about those generics. I argue that many common types of experiment can be better understood as experiments on generics, and suggest a shift in how we think of the inferences made in interpreting and applying experimental results.

Persistence of the lifeworld? On the relation of lifeworld and science

Apr 18, 2019 · 33:51


Gregor Schiemann (Wuppertal) gives a talk at the MCMP Colloquium (28 May, 2014) titled "Persistence of the lifeworld? On the relation of lifeworld and science". Abstract: In contrast to the concept of science, the concept of the lifeworld describes a kind of experience characterised by familiar social relations, actions performed as a matter of course, and a lack of professionalism. Divergent relations between science and the lifeworld are possible, as I demonstrate in the first part by considering exemplary processes of scientification of the lifeworld and their opposing tendencies. I take the contradictory evaluations of these processes to express a cultural change in which the existence of the lifeworld as a non-scientific form of experience is at stake. To evaluate the situation, I develop in the second part a concept of the lifeworld that reveals an attitude towards the world, and ways of performing actions, both of which are historically contingent and whose abolition can already be imagined today. In the concluding third part I show that we are, however, still rather far removed from a conceivable end of the lifeworld.

Agent-based simulations in empirical sociological research

Apr 18, 2019 · 47:21


Isabelle Drouet (Paris-Sorbonne) gives a talk at the MCMP Colloquium (4 June, 2014) titled "Agent-based simulations in empirical sociological research". Abstract: Agent-based models and simulations are more and more widely used in the empirical sciences. In sociology, they have been put at the core of a research project: analytical sociology, as theorized and practiced in, e.g., Hedström’s Dissecting the Social (2005). Analytical sociologists conceive of ABMs as tools for causal analysis. More precisely, they see ABSs as the one method enabling the social sciences to produce genuine explanations of macro empirical phenomena by micro (or possibly meso) ones, and the purported explanations clearly are causal ones. My talk aims at clarifying in which sense exactly and under which conditions agent-based models and simulations as they are used in analytical sociology can indeed causally explain, or contribute to causally explain, social facts.

On the Distinction between Internal and External Symmetries

Apr 18, 2019 · 43:53


Radin Dardashti (MCMP/LMU) gives a talk at the MCMP Colloquium (2 July, 2014) titled "On the Distinction between Internal and External Symmetries". Abstract: There is no doubt that symmetries play an important role in fundamental physics, but there is no agreement among physicists on what this role exactly is. So it is not surprising that it has caught the interest of philosophers in recent years leading to a lively discussion on the epistemological and ontological significance of symmetries. Especially in this context it becomes relevant whether common distinctions made between different kinds of symmetries are purely conventional or have a deeper mathematical and/or physical justification. It is the aim of this talk to discuss the distinction between internal and external (or spacetime) symmetries and its possible justification. First, I will discuss attempts at combining internal and external symmetries, which lead to several no-go theorems. A naive interpretation of these results leads to the conclusion that the distinction is physically/mathematically justified. Second, the strong dependence of the no-go results on physical and mathematical assumptions is discussed and it is shown how the distinction becomes blurred once mathematical assumptions are weakened. This is exactly what happens in supersymmetric extensions of the standard model of particle physics. So, in the final part, I will argue that under a certain (philosophical) assumption the question about the status of the distinction can be made into an experimental question.

Model Tuning and Predictivism

Apr 18, 2019 · 40:57


Mathias Frisch (Maryland) gives a talk at the MCMP Colloquium (26 June, 2014) titled "Model Tuning and Predictivism". Abstract: Many climate scientists maintain that evidence used in tuning or calibrating a climate model cannot be used to evaluate the model. By contrast, the philosophers Katie Steele and Charlotte Werndl have argued, appealing to Bayesian confirmation theory, that tuning is simply an instance of hypothesis testing. In this paper I argue against both views and for a weak predictivism: there are cases, model-tuning among them, in which predictive successes are more highly confirmatory than accommodation. I propose a Bayesian formulation of the predictivist thesis.

Rational Routines

Apr 18, 2019 · 37:07


Martin Peterson (Eindhoven) gives a talk at the MCMP Colloquium (18 June, 2014) titled "Rational Routines". Abstract: Recent research in evolutionary economics suggests that firms and other organizations are governed by routines. What distinguishes successful firms and organizations from less successful ones is that the former are better at developing, using and modifying routines that fit with the circumstances faced by the organization. Individual agents also rely on routines: many people do not actively choose what to eat for breakfast, or how to travel to work, or how to organize their daily activities in the office. In this talk I explore the hypothesis that routines, rather than preferences over uncertain prospects, should be used as the fundamental building block in theories that aim to analyze normative aspects of real-life decision making. I focus on a single, structural property of routines: I show that as long as routines are weakly monotonic (in a sense defined in the talk) the decision maker is rationally permitted to apply all routines available in a given time period, in any order, and there is no requirement to apply any routine more than once. Furthermore, there is no other way in which the same set of routines could be applied that would produce an operational state that is strictly better. I finally compare my results with some quite different claims about routines made by Krister Segerberg in the 1980's.

Propensities, Chance Distributions, and Experimental Statistics

Apr 18, 2019 · 55:48


Mauricio Suarez (London, Madrid) gives a talk at the MCMP Colloquium (12 November, 2014) titled "Propensities, Chance Distributions, and Experimental Statistics". Abstract: Probabilistic or statistical modelling may be described as the attempt to characterise (finite) experimental data in terms of models formally involving probabilities. I argue that a coherent understanding of much of the practice of probabilistic modelling calls for a distinction between three notions that are often conflated in the philosophy of probability literature. A probability model is often implicitly or explicitly embedded in a theoretical framework that provides explanatory – not merely descriptive – strategies and heuristics. Such frameworks often appeal to genuine properties of objects, systems or configurations, with putatively some explanatory function. The literature provides examples of formally precise rules for introducing such properties at the individual or token level in the description of statistically relevant populations (Dawid 2007, and forthcoming). Thus, I claim, it becomes useful to distinguish probabilistic dispositions (or single-case propensities), chance distributions (or probabilities), and experimental statistics (or frequencies). I illustrate the distinction with some elementary examples of games of chance, and go on to claim that it is readily applicable to more complex probabilistic phenomena, notably quantum phenomena.
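A toy simulation (mine, not Suárez's) of the last two items in the three-way distinction: a fixed chance distribution for a loaded die versus the experimental statistics it generates at different sample sizes. The single-case propensity would be the die's underlying disposition, which the simulation represents only indirectly through the chances it samples from.

```python
# A minimal sketch with a hypothetical loaded die.
import numpy as np

rng = np.random.default_rng(0)
chances = np.array([0.1, 0.1, 0.1, 0.1, 0.1, 0.5])   # the chance distribution

for n in (10, 100, 10_000):
    rolls = rng.choice(6, size=n, p=chances)
    freqs = np.bincount(rolls, minlength=6) / n       # experimental statistics
    print(n, np.round(freqs, 3))
```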

Fifteen Dimensions of Evaluating Theories of Causation. A Case Study of the Structural Model and the Ranking Theoretic Approach to Causation

Jul 9, 2015 · 57:44


Wolfgang Spohn (Konstanz) gives a talk at the Workshop on Causal and Probabilistic Reasoning (18-20 June, 2015) titled "Fifteen Dimensions of Evaluating Theories of Causation. A Case Study of the Structural Model and the Ranking Theoretic Approach to Causation". Abstract: The point of the talk is not to defend any exciting thesis. It is rather to remind you of all the dimensions theories of causation must take account of. It explains 15 such dimensions, not just in the abstract, but as exemplified by the structural model and the ranking theoretic approach to causation, which, surprisingly, differ on all 15 dimensions. Of course, the subcutaneous message is that the ranking theoretic approach might be preferable. However, the main moral is this: keep all these dimensions in mind, and don't think that any one of them is settled! Even when working on a specific issue, you are never on secure ground.

Indigenous and Scientific Knowledge. A Model of Knowledge Integration and its Limitations.

Jul 8, 2015 · 43:19


David Ludwig (VU University Amsterdam) gives a talk at the MCMP Colloquium (17 June, 2015) titled "Indigenous and Scientific Knowledge. A Model of Knowledge Integration and its Limitations". Abstract: Philosophical debates about indigenous knowledge often focus on the issue of relativism: given a diversity of local knowledge systems, how can certain types of (e.g. scientific or metaphysical) knowledge claim to transcend their historical and cultural contexts? In contrast with philosophical worries about differences between knowledge systems, research in ethnobiology and conservation biology is often motivated by the practical need to integrate indigenous and modern scientific knowledge in the co-management of local environments. Instead of focusing on alleged incommensurability, real-life conservation efforts often require the incorporation of knowledge from different sources. Based on ethnobiological case studies, I propose a model of knowledge integration that reflects shared reference to property clusters and their inferential potentials. Furthermore, the proposed model not only explains the integration of indigenous and modern scientific knowledge but also predicts limitations of knowledge integration. I argue that the proposed model therefore not only helps to understand current practices of ethnobiology but also provides a nuanced picture of the ontological and epistemological relations between different knowledge systems.

Navigating the Twilight of Uncertainty: Decisions from Experience

Jul 7, 2015 · 55:02


Ralph Hertwig (Max Planck Institute for Human Development) gives a talk at the Workshop on Causal and Probabilistic Reasoning (18-20 June, 2015) titled "Navigating the Twilight of Uncertainty: Decisions from Experience". Abstract: In many of our decisions we cannot consult explicit statistics telling us about the relative risks involved in our actions. In lieu of explicit statistics, we can search either externally or internally for information, thus making decisions from experience (as opposed to decisions from descriptions). Recently, researchers have begun to investigate choice in settings in which people learn about options by experiential sampling over time. Converging findings show that when people make decisions based on experience, choices differ systematically from description-based choice. Furthermore, this research on decisions from experience has turned to new theories of decision making under uncertainty (ambiguity), “rediscovered” the importance of learning, and suggested important implications for risk and precautionary behavior. I will review these issues.

Context, Conversation, and Fragmentation

Jul 7, 2015 · 47:34


Dirk Kindermann (Graz) gives a talk at the MCMP Colloquium (25 June, 2015) titled "Context, Conversation, and Fragmentation". Abstract: What is a conversational context? One influential account (Lewis, Stalnaker, Roberts) says that it is a shared body of information: the information conveyed and/or presupposed by all interlocutors. Conversation, on this account, proceeds by variously influencing, and being influenced by, this body of information. In this talk, I argue that standard idealising assumptions, according to which this body of information is consistent and closed under entailment, put the account at risk of being inapplicable to ordinary speakers, i.e. rational agents with limited cognitive resources. I argue that to mitigate the problem, we should think of context as a fragmented body of information. I explain what fragmentation amounts to and develop a simple model of a fragmented common ground. I close by presenting some advantages of a fragmentation strategy in explaining some otherwise puzzling conversational phenomena.

On the Role of the Light Postulate in Relativity

Jun 30, 2015 · 57:51


R. A. Rynasiewicz (Johns Hopkins University) gives a talk at the MCMP Colloquium (10 June, 2015) titled "On the Role of the Light Postulate in Relativity". Abstract: As presented by Einstein in 1905, the theory of special relativity follows from two postulates: first, what he called the principle of relativity, and second, an empirical fact about the relation of the propagation of light relative to its source that has come to be called the light postulate. In 1910 Waldemar von Ignatowsky claimed to be able to derive the Lorentz transformations, and hence special relativity, without the light postulate using only the principle of relativity and assumptions that Einstein seems to have implicitly made, such as linearity and the isotropy and homogeneity of space. In his authoritative Relativitätstheorie of 1921, Pauli dismissed Ignatowsky’s result without explanation as void of physical significance. More recently, respected physicists and foundationalists, such as David Mermin (1984), have defended Ignatowsky and claimed that special relativity presupposes nothing about electromagnetism. In the first part of this talk, I discuss just what the light postulate asserts (both in special and in general relativity). In the second, I hope to shed light on the debate, if not definitively settle it. (To say on which side would spoil the suspense.) I will also discuss related attempts to dismiss the conventionality of simultaneity.
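A schematic statement of the von Ignatowsky-type result at issue (standard textbook form, not necessarily the formulation used in the talk): the relativity principle together with linearity, isotropy and homogeneity yields a one-parameter family of transformations with an undetermined universal constant K,

```latex
x' = \frac{x - v t}{\sqrt{1 - K v^{2}}}, \qquad t' = \frac{t - K v x}{\sqrt{1 - K v^{2}}},
```

with K = 0 giving the Galilean transformations and K = 1/c² the Lorentz transformations; on this line of argument the light postulate serves only to fix the value of K.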

Explaining Macroscopic Systems from Microscopic Principles

Jun 30, 2015 · 42:53


Peter Pickl (LMU) gives a talk at the MCMP Colloquium (10 June, 2015) titled "Explaining Macroscopic Systems from Microscopic Principles". Abstract: The revolutionary idea of the late 19th century that the physics of gases can be explained by the dynamics of small, point-like particles had a great influence on physics as well as mathematics and philosophy. This idea has changed our understanding of the physics of macroscopic systems significantly as well as the way we see our universe as a whole. The question of how the connection between the microscopic and the macroscopic world can be explained also arises in other fields, for example the life sciences. Answering this question might have a similar impact on the research in these fields. In the talk I will present recent techniques and results of our research group in deriving macroscopic evolution equations from microscopic principles for certain classical, quantum mechanical and biological systems.

Convergence of Iterated Belief Updates

Jun 29, 2015 · 55:00


Berna Kilinç (Boğaziçi University) gives a talk at the MCMP Colloquium (3 June, 2015) titled "Convergence of Iterated Belief Updates". Abstract: One desideratum on belief upgrade operations is that their iteration is truth-tropic, either on finite or infinite streams of reliable information. Under special circumstances repeated Bayesian updating satisfies this desideratum as shown for instance by the Gaifman and Snir theorem. There are a few analogous results in recent research within dynamic epistemic logic: Baltag et al establish the decidability of propositions for some but not all upgrade operations on finite epistemic spaces. In this talk further convergence results will be established for qualitative stable belief.
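A finite toy simulation in the spirit of (though of course much weaker than) the convergence results mentioned above: repeated Bayesian conditioning on reliable i.i.d. data concentrates the posterior on the true hypothesis.

```python
# A minimal sketch with a hypothetical three-hypothesis coin-bias problem.
import numpy as np

rng = np.random.default_rng(1)
hypotheses = np.array([0.3, 0.5, 0.7])     # possible biases of the coin
true_bias = 0.7
posterior = np.array([1/3, 1/3, 1/3])

for n in range(1, 501):
    toss = rng.random() < true_bias                 # True = heads
    likelihood = hypotheses if toss else 1 - hypotheses
    posterior = posterior * likelihood
    posterior /= posterior.sum()                    # Bayesian update
    if n in (10, 100, 500):
        print(n, np.round(posterior, 4))
```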

The Causal Nature of Modeling in Data-Intensive Science

Jun 29, 2015 · 61:47


Wolfgang Pietsch (MCTS/TU Munich) gives a talk at the MCMP Colloquium (3 June, 2015) titled "The Causal Nature of Modeling in Data-Intensive Science". Abstract: I argue for the causal character of modeling in data-intensive science, contrary to widespread claims that big data is only concerned with the search for correlations. After introducing and discussing the concept of data-intensive science, several algorithms are examined with respect to their ability to identify causal relationships. To this purpose, a difference-making account of causation is proposed that broadly stands in the tradition of David Lewis’s counterfactual approach, but better fits the type of evidence used in data-intensive science. The account is inspired by causal inferences of the Mill's-methods type. I situate data-intensive modeling within a broader framework of a Duhemian or Cartwrightian scientific epistemology, drawing an analogy to exploratory experimentation.
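A minimal sketch of a difference-making check in the style of Mill's method of difference (my own toy version, not Pietsch's actual account): when two recorded instances agree on all candidate factors but one and differ in the effect, that factor is flagged as a difference-maker relative to the data.

```python
# A minimal sketch over hypothetical boolean instance data.
from itertools import combinations

# each instance: (candidate factors -> bool, effect observed?)
data = [
    ({"A": True,  "B": True,  "C": False}, True),
    ({"A": False, "B": True,  "C": False}, False),
    ({"A": True,  "B": False, "C": False}, True),
]

def difference_makers(data):
    found = set()
    for (f1, e1), (f2, e2) in combinations(data, 2):
        diff = [k for k in f1 if f1[k] != f2[k]]
        if len(diff) == 1 and e1 != e2:   # instances differ in exactly one factor
            found.add(diff[0])
    return found

print(difference_makers(data))   # {'A'}: only A co-varies with the effect here
```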

Against Grue Mysteries

Jun 29, 2015 · 45:43


Alexandra Zinke (Konstanz) gives a talk at the MCMP Colloquium (28 May, 2015) titled "Against Grue Mysteries". Abstract: In a recent paper, Freitag (2015) reduces Goodman’s new riddle of induction to the problem of doxastic dependence. We are not justified in projecting grue because our grue-evidence is doxastically dependent on defeated evidence. I try to implement this solution into an inductive extension of AGM belief revision theory. It turns out that the grue-example is nothing but an inductive version of well-known examples by Hansson (1992), which he uses to argue for base revision: If revision takes place on belief-bases, rather than on logically closed belief sets, we can easily account for the doxastic dependence relations between our beliefs. To handle the grue-case, I introduce the notion of an inductively closed belief-base. If we update on this inductively closed belief-base, the grue problem dissolves.

On Einstein's Reality Criterion

Jun 29, 2015 · 42:09


Gábor Hofer-Szabó (Hungarian Academy of Sciences) gives a talk at the MCMP Colloquium (28 May, 2015) titled "On Einstein's Reality Criterion". Abstract: In the talk we characterize the different interpretations of QM in an operationalist-frequentist framework and show what entities the different interpretations posit. We define completeness and correctness of an interpretation in terms of how this posited ontology relates to the "real world ontology" posited by principles independent of the interpretations. We argue that the Reality Criterion is just such a principle. We also argue that the EPR argument, making use of the Reality Criterion, is devised to show that certain interpretations of QM are incomplete, whereas Einstein's later arguments, making no use of the Reality Criterion, are devised to show that the Copenhagen interpretation is simply wrong. Next, investigating the nature of prediction, an essential part of the Reality Criterion, we formulate two hypotheses: (i) the Reality Criterion is a special case of Reichenbach's Common Cause Principle; (ii) it is a special case of Bell's Local Causality Principle.

Predicting Outcomes in Five Person Spatial Games: An Aspiration Model Approach

May 28, 2015 · 80:25


Bernard Grofman (Irvine) gives a talk at the MCMP Colloquium (13 May, 2015) titled "Predicting Outcomes in Five Person Spatial Games: An Aspiration Model Approach". Abstract: There are many situations where voters must choose a single alternative and where both the voters and the alternatives can be characterized as points in a one-, two- or higher-dimensional policy space. In committees and legislatures, choice among these alternatives will often be made via a decision agenda in which alternatives are eliminated until a choice is made, sometimes requiring a final vote against the status quo. A common form for such an agenda is what Black (1958) called standard amendment procedure, a "king of the hill" procedure in which an initial alternative is paired against another alternative, the winner of that pairwise contest goes on to face the next alternative, and the process continues until either the set of feasible alternatives is exhausted or there is a successful motion for cloture. Beginning with a seminal experiment on five-person voting games conducted by Plott and Fiorina (1978), there have been a number of experiments on committee voting games with a potentially infinite set of alternatives embedded in a two-dimensional policy space. In games where there is a core, i.e., an alternative which, for an odd number of voters, can defeat each and every other alternative in paired comparison, outcomes at or near the core are chosen, but there is also considerable clustering of outcomes even in games without a core. A major concern of the literature has been to develop models to explain the pattern of that clustering in non-core situations. Here, after reviewing the present state of the art, we offer a new family of models based on the Siegel-Simon aspiration approach, in which voters satisfice by choosing "acceptable" alternatives, and the set of outcomes that are considered acceptable by each voter changes as the game continues.
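A rough numerical illustration of the core concept (mine, not the authors' aspiration model): with five voters with Euclidean preferences arranged symmetrically in a 2D policy space, the central ideal point is a majority-rule core point, and the brute-force check below finds no randomly sampled challenger that defeats it.

```python
# A minimal sketch with hypothetical voter ideal points (four corners of a
# square plus its center); the center satisfies Plott's symmetry conditions.
import numpy as np

rng = np.random.default_rng(2)
voters = np.array([[0, 0], [2, 0], [0, 2], [2, 2], [1, 1]], dtype=float)
candidate = np.array([1.0, 1.0])      # the central voter's ideal point

def beats(x, y, voters):
    """True if x is strictly preferred to y by a majority of voters."""
    dx = np.linalg.norm(voters - x, axis=1)
    dy = np.linalg.norm(voters - y, axis=1)
    return (dx < dy).sum() > len(voters) / 2

challengers = rng.uniform(-1, 3, size=(10_000, 2))
defeated = any(beats(c, candidate, voters) for c in challengers)
print("candidate defeated by some sampled challenger:", defeated)   # False
```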

Modeling Cognitive Representations with Evolutionary Game Theory

May 11, 2015 · 35:37


Marc Artiga (MCMP) gives a talk at the MCMP Colloquium (7 May, 2015) titled "Modeling Cognitive Representations with Evolutionary Game Theory". Abstract: Cognitive science has been developed on the idea that cognitive systems are representational. Recently, however, some people have challenged this idea. The goal of this talk is to provide some mathematical tools for resolving this question. More precisely, I will defend two claims. First, I will argue that Evolutionary Game Theory can help us establish which states are representations. Secondly, I will defend that this strategy can be used to show that perceptual states (among others) are indeed representational.
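A minimal sketch of the kind of model at issue (a standard 2-state Lewis signaling game under two-population replicator dynamics, not Artiga's own model): senders map states to signals, receivers map signals to acts, and the dynamics typically carries the populations toward a signaling system in which the signals track the states.

```python
# A minimal sketch of a 2-state / 2-signal / 2-act Lewis signaling game with
# discrete-time two-population replicator dynamics.
import itertools
import numpy as np

states, signals, acts = range(2), range(2), range(2)
sender_strats = list(itertools.product(signals, repeat=2))   # strat[state] -> signal
receiver_strats = list(itertools.product(acts, repeat=2))    # strat[signal] -> act

# payoff[i, j]: success rate of sender strategy i against receiver strategy j
payoff = np.array([[np.mean([r[s[w]] == w for w in states])
                    for r in receiver_strats] for s in sender_strats])

rng = np.random.default_rng(3)
x = rng.dirichlet(np.ones(4))    # sender population shares
y = rng.dirichlet(np.ones(4))    # receiver population shares

for _ in range(2000):            # discrete-time replicator updates
    fx, fy = payoff @ y, payoff.T @ x
    x = x * fx / (x @ fx)
    y = y * fy / (y @ fy)

print("expected communicative success:", round(float(x @ payoff @ y), 3))
```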

Structures, Mechanisms and Dynamics in Theoretical Neuroscience

May 11, 2015 · 50:08


Holger Lyre (Magdeburg) gives a talk at the MCMP Colloquium (6 May, 2015) titled "Structures, Mechanisms and Dynamics in Theoretical Neuroscience". Abstract: Proponents of mechanistic explanations have recently proclaimed that all explanations in the neurosciences appeal to mechanisms – including computational and dynamical explanations. The purpose of the talk is to critically assess these statements. I shall defend an understanding of both dynamical and computational explanations according to which they focus on the explanatorily relevant spatiotemporal-cum-causal structures in the target domain. This has impact on at least three important issues: reductionism, multi-realizability, and explanatory relevance. A variety of examples from the theoretical neurosciences shall be used to show that very often the explanatory relevance, burden, and advantage in view of law-like generalizability lies in picking out the relevant structure rather than characterizing mechanisms in all details.

Philosophy of Statistical Mechanics from an Emergentist Viewpoint

May 11, 2015 · 63:33


David Wallace (Balliol College) gives a talk at the MCMP Colloquium (15 April, 2015) titled "Philosophy of Statistical Mechanics from an Emergentist Viewpoint". Abstract: I sketch a view of the philosophy of statistical mechanics as (a) concerned primarily with the interrelations between different dynamical systems describing more or less coarse-grained degrees of freedom of a system, and only secondarily with thermodynamic notions like equilibrium and entropy, and (b) informed by developments in contemporary mainstream physics. I develop, as concrete examples, (i) the projection-based approach to kinetic equations developed in the 1970s by Balescu, Prigogine, Zwanzig et al, and (ii) the relevance of quantum mechanics to nominally “classical” systems like the ideal gas.

The Mathematical Route to Causal Understanding

May 11, 2015 · 47:35


Michael Strevens (NYU) gives a talk at the MCMP Colloquium (30 April, 2015) titled "The Mathematical Route to Causal Understanding". Abstract: Causal explanation is a matter of isolating the elements of the causal web that make a difference to the explanandum event or regularity (so I and others have argued). Causal understanding is a matter of grasping a causal explanation (so says what I have elsewhere called the "simple theory" of understanding). It follows that causal understanding is a matter of grasping the facts about difference-making, and in particular grasping the reasons why some properties of the web are difference-makers and some are not. Mathematical reasoning frequently plays a role in our coming to grasp these reasons, and in some causal explanations, deep mathematical theorems may do almost all the work. In these cases - such as the explanation why a person cannot complete a traverse of the bridges of Königsberg without crossing at least one bridge twice - our understanding seems to hinge more on our appreciation of mathematical than of physical facts. We have the sense that mathematics gives us physical understanding. But this is quite compatible with the explanation in question being causal in exactly the same sense as more unremarkable causal explanations.
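The mathematical fact behind the Königsberg example can be checked in a few lines (my illustration, not Strevens's): an Eulerian trail, a walk crossing every bridge exactly once, exists in a connected multigraph only if at most two vertices have odd degree, and in Königsberg all four land masses have odd degree.

```python
# A minimal check of the Eulerian-trail condition for the Königsberg bridges.
from collections import Counter

# the seven bridges between the four land masses A, B, C, D
bridges = [("A", "B"), ("A", "B"), ("A", "C"), ("A", "C"),
           ("A", "D"), ("B", "D"), ("C", "D")]

degree = Counter()
for u, v in bridges:
    degree[u] += 1
    degree[v] += 1

odd = [v for v, d in degree.items() if d % 2 == 1]
print("degrees:", dict(degree))              # A: 5, B: 3, C: 3, D: 3 -- all odd
print("Eulerian trail possible:", len(odd) in (0, 2))   # False
```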

Occam's Razor in Algorithmic Information Theory

Feb 19, 2015 · 37:01


Tom Sterkenburg (Amsterdam/Groningen) gives a talk at the MCMP Colloquium (15 January, 2015) titled "Occam's Razor in Algorithmic Information Theory". Abstract: Algorithmic information theory, also known as Kolmogorov complexity, is sometimes believed to offer us a general and objective measure of simplicity. The first variant of this simplicity measure to appear in the literature was in fact part of a theory of prediction: the central achievement of its originator, R.J. Solomonoff, was the definition of an idealized method of prediction that is taken to implement Occam's razor in giving greater probability to simpler hypotheses about the future. Moreover, in many writings on the subject an argument of the following sort takes shape. From (1) the definition of the Solomonoff predictor which has a precise preference for simplicity, and (2) a formal proof that this predictor will generally lead us to the truth, it follows that (Occam's razor) a preference for simplicity will generally lead us to the truth. Thus, sensationally, this is an argument to justify Occam's razor. In this talk, I show why the argument fails. The key to its dissolution is a representation theorem that links Kolmogorov complexity to Bayesian prediction.
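For orientation, here is the rough shape of the two ingredients the argument combines (hedged textbook formulations, not quoted from the talk): the Solomonoff predictor's prior weights hypotheses by the lengths of the programs generating them, and a convergence theorem says its predictions approach those of any computable data-generating measure.

```latex
M(x) \;=\; \sum_{p\,:\,U(p)\ \text{begins with}\ x} 2^{-\ell(p)},
\qquad
\sum_{n} \mathbb{E}_{\mu}\!\Bigl[\bigl(M(1 \mid x_{<n}) - \mu(1 \mid x_{<n})\bigr)^{2}\Bigr] \;\le\; c \, K(\mu),
```

where U is a universal monotone machine, ℓ(p) is the length of program p, μ is any computable measure, and c is a small constant (on the order of ln 2). The factor 2^{-ℓ(p)} is where the preference for simplicity enters; the talk's claim is that, despite the convergence theorem, this does not add up to a justification of Occam's razor.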
