Bergen Philosophy of Science Workshop
This year's seminar takes place Thursday 5 June, 14.45-18.00, and Friday 6 June, 09.45-13.30. The seminar will be held in English.
Program
Thursday 5 June
14.45 Coffee. Welcome from the Chair, Prof. Reidar Lie
15.00 - 16.00
R. Batterman (Pittsburgh): Minimal Model Explanations
16.00 - 17.00
M. Sprevak (Edinburgh): Fictionalism about Neural Representations
17.00 - 18.00
D. Dieks (Utrecht): Emergence, Reduction, Underdetermination and Explanation in Recent Quantum Gravity Research
Friday 6 June
09.45 Coffee
10.00 - 11.00
M. Suárez (London / Madrid): Propensities, Chances, and Experimental Statistics
11.00 - 12.00
M. Morreau (Tromsø): Mr. Fit, Mr. Simplicity and Mr. Scope: from Social Choice to Theory Choice in Science
12.00 - 12.30 Lunch on site
12.30 - 13.30
A. Barberousse (Lille / Paris): Bayesian Methods in Climate Modeling
13.30 Farewell
______________________
Abstracts
Mark Sprevak, Univ. of Edinburgh
http://www.ppls.ed.ac.uk/philosophy/people/mark-sprevak
Fictionalism about Neural Representations
This paper explores Fictionalism about talk of neural representations in cognitive science. This type of Fictionalism promises to (i) avoid the hard problem of naturalizing representations, without (ii) incurring the high costs of eliminating useful representation talk. I articulate this form of Fictionalism and show that, despite its apparent advantages, it faces two serious objections: (1) Fictionalism about talk of neural representations ultimately does not avoid the problem of naturalizing representations; (2) fictional representations cannot play the explanatory role required by cognitive science.
Michael Morreau, Univ. of Tromsø
http://en.uit.no/ansatte/organisasjon/ansatte/person?p_document_id=329116&p_dimension_id=88151
Mr. Fit, Mr. Simplicity and Mr. Scope: from Social Choice to Theory Choice in Science
Different scientists, in Thomas Kuhn's equitable view of the matter, though committed to the same choice criteria and working with the same empirical data, may reach different conclusions about which of several rival theories is best. There is, Kuhn claimed, no 'neutral' algorithm for theory choice. Kenneth Arrow's celebrated 'impossibility' theorem tells us that there is no good way of deriving, from voters' preferences or utilities, a collective ordering of alternative social states from better to worse. It has recently been argued that, putting Kuhn's criteria of accuracy, simplicity, scope, etc. in the place of voters, an analogue of Arrow's theorem sets Kuhn's "no algorithm" thesis on a firm foundation. The consequence is supposedly that there is no good way of deriving, from evaluations by Kuhn's choice criteria, an ordering of rival theories by their overall merit. We will see (1) that Arrow's theorem can tell us little or nothing about Kuhn's thesis, because the analogue in theory choice of its crucial domain assumption is, in central cases, obviously false; (2) that while some variant of the theorem might in principle constrain rational theory choice, it probably does not do so very much in practice; and (3) that other, positive findings from the field of social choice can be used to rationalize certain simple procedures that some scientists follow, such as preferring theories that are better in more ways than not. We will consider the case of statistical model selection.
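For reference, and not part of the abstract, here is a standard textbook statement of Arrow's result (formulations vary slightly across presentations). With at least three alternatives and finitely many voters, no social welfare function

\[
f : (\succeq_1, \ldots, \succeq_n) \mapsto \succeq
\]

from profiles of individual preference orderings to a collective ordering can jointly satisfy: unrestricted domain ($f$ is defined on every logically possible profile); weak Pareto (if $x \succ_i y$ for every voter $i$, then $x \succ y$); independence of irrelevant alternatives (the collective ranking of $x$ and $y$ depends only on the individual rankings of $x$ and $y$); and non-dictatorship. The 'crucial domain assumption' in point (1) is plausibly unrestricted domain: for the theory-choice analogue to go through, the criteria's rankings of rival theories would have to be able to form any logically possible profile.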
Robert Batterman, Univ. of Pittsburgh
http://www.philosophy.pitt.edu/person/robert-batterman
Minimal Model Explanations
This paper discusses minimal model explanations, which we argue are distinct from the various causal, mechanical, difference-making, etc. strategies prominent in the philosophical literature. We contend that what accounts for the explanatory power of these models is not that they have certain features in common with real systems. Rather, the models are explanatory because there is a story about why a whole class of systems will display the same large-scale behavior: the details that distinguish the systems are irrelevant to that behavior. This story explains patterns across extremely diverse systems and shows how minimal models can be used to understand real systems.
Anouk Barberousse, Univ. of Lille / IHPST Paris, Sorbonne
http://www-ihpst.univ-paris1.fr/4,anouk_barberousse.html
Bayesian Methods in Climate Modeling
The use of Bayesian methods in climate modeling has been developing rapidly since the late 1990s. Whereas most statisticians and physicists employ classical statistics, some have begun to develop alternative methods. The aim of the paper is to analyze the arguments given by Bayesian statisticians and physicists in favor of Bayesian methods. I shall show that these arguments come under two different headings: specialized and case-by-case when they are directed at other climate scientists, versus general, epistemological, and pragmatic when they are directed toward decision-makers. I shall claim that this duality reflects climate science's current scientific and epistemological situation.
Mauricio Suárez, London University / Complutense University of Madrid
http://sas.academia.edu/MauricioSu%C3%A1rez
Propensities, Chances, and Experimental Statistics
Probabilistic modelling may be most generally described as the attempt to characterise (finite) experimental data in terms of models formally involving probabilities. I argue that a coherent understanding of much of the practice of probabilistic modelling calls for a distinction between three notions that are often conflated in the philosophy of probability literature. A probability model is often implicitly or explicitly embedded in a theoretical framework that provides explanatory - not merely descriptive - strategies and heuristics. Such frameworks often appeal to genuine properties of objects, systems or configurations with a putatively explanatory function. The literature provides examples of formally precise rules for introducing such properties at the individual or token level in the description of statistically relevant populations (Dawid 2007, and forthcoming). Thus, I claim, it becomes useful to distinguish probabilistic dispositions (or single-case propensities), chance distributions (or probabilities), and experimental statistics (or frequencies). The ascription of particular propensities, as Charles Peirce noted long ago (see also Suárez 2013), is to be justified (or criticized) by abductive means, in terms of their explanatory qualities. I illustrate the distinction with some elementary examples of games of chance, and go on to claim that it is readily applicable to more complex probabilistic phenomena, notably quantum phenomena.
Dennis Dieks, Utrecht University
http://www.projects.science.uu.nl/igg/dieks/
Emergence, Reduction, Underdetermination and Explanation in Recent Quantum Gravity Research
In this talk we will consider, from the perspective of the philosophy of science, a number of recent and novel ideas in physics coming from quantum gravity research. These ideas revolve around the "Holographic Principle", which says that some n-dimensional theories without gravitation are equivalent to (n+1)-dimensional theories in which there is gravitation: for example, a gravitation-less theory on a two-dimensional surface may be equivalent to a gravitational theory in three dimensions. It is often stated in the pertinent physics literature that holographic scenarios depict gravitation and space as "emergent", and make it possible to understand them as arising from something deeper. After explaining the general idea of holography and a specific recent holographic proposal, we will attempt to analyze the philosophical status of these claims.