Tuesday, March 31, 2009

The role of mathematics in the study of evolution

The population biologist James Crow, in an article in the latest issue of the Journal of Biology, reminds us of Ernst Mayr's challenge (in 1959) to explain the relevance of mathematical models to evolutionary studies. In Mayr's words, the challenge is "what, precisely, has been the contribution of this mathematical school to the evolutionary theory, if I may ask such a provocative question?". Crow's 2009 response is pretty typical of the responses made by mathematical evolutionists (including Haldane) back in the early 1960s: a laundry list of evolutionary problems solved by mathematics. While this sort of response has value, it isn't general enough for philosophers of science. Further, some of the problems solved were mathematical to begin with. Mayr is clearly asking what role mathematics plays in the biological sciences, not in the mathematical sciences.

How should philosophers respond? Chris Pincock, for one, is working on a book that explores this issue. I hope this post prompts Chris to reply.

For what it is worth, I have a couple of half-baked ideas to initiate a list of how mathematical models contribute to evolutionary biology in particular and perhaps science in general. What I am really hoping is that some of you will point me to the relevant philosophical literature. Or, even better, I hope some of you add to the list.

Invoking A. Garfinkel (1982), I think one of the crucial contributions mathematical models make to the sciences is that they allow us to consider what could have been otherwise (which, on some theories of causation, helps us understand causal relations between events). Garfinkel's example is from population ecology, in particular the Lotka-Volterra equations, which track the dynamics of population levels between predators and prey. I can't represent the equations in this webpost but you all know the basic idea: the influences on the population numbers of predators and prey are modeled in terms of the frequency with which the predators encounter and eat their prey. From a higher frequency of encounters we can predict that the population of prey organisms will go down. From observation alone we might confirm that a particular rabbit was eaten by a particular fox at a particular time. But what observation doesn't tell us, and what the L-V equations do, is what would have happened had the particular rabbit not been eaten by the particular fox. If the population of foxes is high enough (and the rabbit population is low enough) then it is relatively likely the rabbit would have been eaten anyhow (by a different fox). If the population of foxes is low enough (and the rabbit population is high enough) then the chance of the rabbit getting eaten is relatively lower.
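For readers who want to see the dynamics rather than just hear about them, here is a minimal numerical sketch of the Lotka-Volterra model. All parameter values are purely illustrative choices of my own, not fitted to any real predator-prey system:

```python
# Lotka-Volterra predator-prey dynamics, integrated with simple Euler steps.
# Parameter values below are illustrative only.

def lotka_volterra(prey, pred, steps=1000, dt=0.01,
                   a=1.0,    # prey growth rate
                   b=0.1,    # predation rate (encounter frequency)
                   c=0.075,  # predator reproduction per prey eaten
                   d=1.5):   # predator death rate
    """Return the trajectory of (prey, predator) population levels."""
    history = [(prey, pred)]
    for _ in range(steps):
        dprey = (a * prey - b * prey * pred) * dt
        dpred = (c * prey * pred - d * pred) * dt
        prey, pred = prey + dprey, pred + dpred
        history.append((prey, pred))
    return history

trajectory = lotka_volterra(prey=10.0, pred=5.0)
print(trajectory[0], trajectory[-1])
```

The coupling term `b * prey * pred` is where the counterfactual reasoning lives: it encodes how often foxes and rabbits meet, so rerunning the model with a higher fox population shows the rabbit's chances dropping even if that particular fox never appears.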

This one is a bit more vague. This time I'm channeling Poisson, Quetelet, Laplace, and Gauss (and Strevens has an excellent book on the topic--Bigger than Chaos). One of the remarkable discoveries in the empirical sciences is the existence of large-scale regularities that emerge from a "chaos" of individual variation. Extinction and adaptive speciation are two examples from evolutionary biology; predator/prey relations are an example from ecology. From physics we have the gas laws. From demography there are sex ratio skews (toward boys, found in England in the 18th and 19th centuries), and crime data in Paris that showed consistent crime rates in the 1820s despite the variety of ways crimes are committed (among a host of other demographic phenomena). In economics Adam Smith hypothesized that well-ordered economies emerge from the variety of ways that individuals strive (unfettered) for their own gain. Statistical data helped us see these patterns, but probability theory (the law of large numbers, the central limit theorem) allowed us to see how these patterns could possibly emerge without the interference of external forces. Placed in historical context, this application of mathematics (in the form of probability theory) was crucial in distinguishing the naturalistic sciences from theology. Before Gauss, Poisson, and Laplace, people thought that, for example, the sex ratio skew toward boys was part of God's plan to make sure there are enough men for women to marry (after all, many bachelors die in war). So, mathematics plays a role in explaining how large-scale regularities emerge without reference to special external forces.
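The law-of-large-numbers point can be made vivid with a toy simulation. Each birth below is an unpredictable individual event, yet a stable aggregate skew appears with enough births; the probability 0.514 is just an illustrative stand-in for the historical skew toward boys:

```python
# Toy illustration of large-scale regularity emerging from individual
# variation: single births are chancy, large samples are stable.
# The value 0.514 is illustrative, not a measured demographic figure.
import random

random.seed(0)  # fixed seed so the demonstration is repeatable

def boy_fraction(n_births, p_boy=0.514):
    """Fraction of boys in a sample of n_births independent births."""
    return sum(random.random() < p_boy for _ in range(n_births)) / n_births

small_samples = [boy_fraction(20) for _ in range(5)]  # parish-sized: noisy
large_sample = boy_fraction(1_000_000)                # nation-sized: stable

print(small_samples)          # fluctuates wildly from sample to sample
print(round(large_sample, 3)) # hugs 0.514, as the law of large numbers predicts
```

No external force nudges the large sample toward its stable value; the regularity is just what independent chancy events look like in aggregate, which is exactly the point Quetelet-style statistics made against providential explanations.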

Incidentally, I think Darwin's version of natural selection relies a bit on both a rather primitive form of probability theory (in the form of what every gambler knows--that even slightly weighted dice give the player an advantage or disadvantage) and an external force, or a force external to the mere lives, deaths, and reproductive activities of individuals. Crucial for Darwin's theory of natural selection is that a struggle for existence "inevitably follows" the Malthusian crush of population growth against resource restrictions. The struggle due to population growth is the natural selector and a condition that is external to individual life histories. But modern versions of natural selection have downplayed the role of population growth. Evolution by natural selection has no need for an external force. Adaptive speciation (as shown by more sophisticated probabilistic models) can emerge from individuals who vary in their reproductive qualities.
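The "slightly weighted dice" point can itself be sketched probabilistically. Below is a toy Wright-Fisher-style simulation of my own (all numbers illustrative): a variant with only a modest reproductive edge tends to take over the population, with no selecting force beyond individual births and deaths:

```python
# Toy Wright-Fisher-style sketch: a variant with a slight reproductive
# edge (the "slightly weighted die") often spreads to fixation, with no
# external selecting force. All parameter values are illustrative.
import random

random.seed(1)  # fixed seed for repeatability

def fixation_fraction(trials=200, pop=100, advantage=1.05):
    """Fraction of trials in which the variant takes over the population."""
    fixed = 0
    for _ in range(trials):
        variants = 10  # start as a small minority
        while 0 < variants < pop:
            # each slot in the next generation samples a parent
            # in proportion to fitness
            w = variants * advantage
            p = w / (w + (pop - variants))
            variants = sum(random.random() < p for _ in range(pop))
        fixed += (variants == pop)
    return fixed / trials

print(fixation_fraction())                # slight edge: fixes fairly often
print(fixation_fraction(advantage=1.0))  # no edge: fixes only rarely, by drift
```

Nothing in the loop references resource limits or a Malthusian crush; the "selection" is nothing over and above the chancy reproductive careers of individuals, which is the modern point against Darwin's external force.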

Well, this has gone on long enough....

Monday, March 30, 2009

SVT: Original and Continuing Motivation

At the recent MS3 meeting, I gave a brief presentation on Pat Suppes's contributions to thinking about models. One point is relevant to Contessa's question about the original arguments in favor of the SVT over the statement view. At least two of the three founders of the SVT, Suppes and Beth (the third was Arthur Burks), were much concerned with the foundations of physics, Suppes with classical mechanics and Beth with quantum theory. They found attempting reconstructions in first (or even second) order logic to be impossibly cumbersome. The physics gets lost in the logic. To be convinced of this, one need only look at Richard Montague's 1962 first order reconstruction of classical mechanics. [Deterministic Theories. In Formal Philosophy: Selected Papers of Richard Montague, ed. R. H. Thomason, 303-59. New Haven: Yale University Press, 1974.] As I remember, one can barely make out F = ma in something like Axiom 24. Set theory and state spaces are far more perspicuous than first order formulae. van Fraassen, who was inspired by Beth, had a similar motivation. The general idea of getting the philosophy of science closer to the science has been for me, and I think many others, a major attraction of the SVT, even though the primary interest has been understanding the actual practice of science rather than the foundations of theories.

Friday, March 27, 2009

The assignment of probabilities

I often want to feel that science and philosophy have been enriched by the introduction and deployment of various kinds of probability theories. I am also often attracted to the further thought that Bayes' theorem should be a normative ideal in helping me establish how one should think about, say, evidence or risk.
Yet reflecting on this moving review, http://ndpr.nd.edu/review.cfm?id=15666, of Swinburne's recent book (Was Jesus God?) or, say, Earman's earlier well-known use of Bayes' theorem in Hume's Abject Failure (not to mention the assignment of probabilities in risk models used by the financial industry or the manipulation of, say, Bayes' theorem in research applications by scientists) has made me wonder if the following isn't true: if the competent use of a probability theory can lead one to assign a positive value to the probability that God has anthropomorphic qualities, this should be seen as a reductio ad absurdum of the theory. I know there are many good objections to this melancholic thought (not to mention that I may be unable to construct a valid argument for it). But at what point does one admit that the problem is with the class of tools rather than the uses made of them? Or is the National Rifle Association's claim that "Guns do not kill people...people kill people" really convincing?

Thursday, March 26, 2009

Bas van Fraassen's Scientific Representation: Paradoxes of Perspective

As many of you will already know, Bas van Fraassen's new book Scientific Representation: Paradoxes of Perspective (OUP 2008) has been out for a few months and I highly recommend it to all of you who haven't read it already.

If you are interested and you would like to have some idea of what the book is about, it looks like a few it'sonlyatheorists have been busy reviewing it lately. So, here is a review of it by Ron Giere, here is one by Steven French, and here is one that I've written for NDPR. (If I missed anyone, please let me know!)

Also, Bas has kindly invited me to be one of the contributors to a book symposium on it that will appear some time next year in Analysis, and I'm considering developing some of the points I raise in my review in that piece. So, if you have any comments about my review, please let me know (either by commenting on this post or by e-mail).

Friday, March 20, 2009

A cool quote from Richard Hooker (1593), and some belated evidence for a Kuhnian speculation on forms/laws

Yesterday, I presented a paper on Newton in a Newton seminar at Duke/UNC co-hosted by Andrew Janiak and Alan Nelson. Among the folks in the audience were David Miller and Marc Lange (whom I mistook for a student), so discussion was very stimulating. The student commentator, Matt Priselac (UNC), offered superb, subtle criticism of my arguments. The main point of my paper (sorry for the self-promotion, but I'll get to the point shortly) was that Newton's use of "emanation" in his famous unpublished piece DeGravitatione (lovingly known as DeGrav) should be understood as a form of formal causation (thus splitting the difference between folks like Ted McGuire, Ed Slowik, and Dana Jalobeanu, who treat it as a Platonizing efficient cause, and folks like Howard Stein and Andrew Janiak, who treat it as a form of conceptual necessity); depending on how one reads the emanation thesis in DeGrav, one's understanding of Newton's metaphysics of space and its connection to theology shifts. [Let's leave aside how revealing DeGrav really is of Newton's mature views.] But I use my exegetical claims as a jumping-off point to make claims about Newton on measurement. Anyway, my main bit of evidence is a striking quote from Bacon's New Organon (2.I-II; emphasis in original):

"On a given body, to generate and superinduce a new nature or new natures is the work and aim of human power. Of a given nature to discover the form, or true specific difference, or nature-engendering nature, or source of emanation (for these are the terms which come nearest to a description of the thing), is the work and aim of human knowledge. Subordinate to these primary works are two others that are secondary and of inferior mark: to the former, the transformation of concrete bodies, so far as this is possible; to the latter, the discovery, in every case of generation and motion, of the latent process carried on from the manifest efficient and the manifest material to the form which is engendered; and in like manner the discovery of the latent configuration of bodies at rest and not in motion.... For though in nature nothing really exists besides individual bodies, performing pure individual acts according to a fixed law, yet in philosophy this very law, and the investigation, discovery, and explanation of it, is the foundation as well of knowledge as of operation. And it is this law with its clauses that I mean when I speak of forms, a name which I the rather adopt because it has grown into use and become familiar."

For the purposes of my paper the quote is useful because i) it links the source of emanation with the discovery of form and a nature-engendering nature (thus giving me contextual support to claim that when Newton uses emanation he can be thinking of (Baconian) formal causation); ii) the context makes clear that Bacon is willing to reformulate a notion of form separate from final causes (which he thought useless); iii) Bacon introduces talk of law as an epistemic doctrine about forms (which are ontologically composed of matter); iv) Bacon's rescue of a notion of form anticipates important aspects of Boyle's use. Along the way, I pointed out that this passage can help us start to put some flesh on Kuhn's old speculative story about how talk of forms was transformed into talk of laws during the Scientific Revolution. (Newton's switch away from emanation to (in Cartesian language) discussion of laws of motion in the Principia can be seen as the culmination of this.)

Now after the seminar Marc Lange called my attention to a lovely quote from Richard Hooker (1593--note the date!):
Whereas therefore things natural which are not in the number of voluntary agents... do so necessarily observe their certain laws, that as long as they keep those forms which give them their being, they cannot possibly be apt or inclinable to do otherwise than they do; seeing the kinds of their operations are both constantly and exactly framed according to the several ends for which they serve, they themselves in the meanwhile, though doing that which is fit, yet knowing neither what they do, nor why: it followeth that all which they do in this sort proceedeth originally from some such agent, as knoweth, appointeth, holdeth up, and even actually frameth the same. (Richard Hooker, Of the Lawes of Ecclesiastical Politie, Book I.iii.4.)

As Lange remarks: "Shades of Hempel and Oppenheim from 350 years later!" Lange discusses the Hooker passage briefly in chapter 1 of his forthcoming book Laws and Lawmakers (OUP). On my reading of the context, Hooker is claiming that natural things are law-like in virtue of God's unknowable general providence. (Cf. Descartes in the Meditations!) The form is a manifestation (if I may use that term) of God's providence. What's especially interesting is that Hooker's nature is very much knowable because it is part of maker's knowledge; yet Hooker transforms that traditional (medieval) doctrine because he does not seem to be interested in discovering 'local' final causes. So, not unlike Spinoza (who must have been familiar with the Bacon passage quoted above—I have to check if he read Hooker while preparing his TTP), Hooker has an account of what one might call 'blind' forms ('blind' because divorced from final causes) that are responsible for the law-following order we find in nature. It is, of course, especially interesting that Hooker connects forms to modality in the way he does. I wonder if Hooker is an actualist (like Spinoza), for whom the possible is constrained by the actual, or if he has a more 'Leibnizian' conception of modality (the actual world is just one of a universe full of possible ones).
Damn, I may have to read Hooker soon. Given the evidence from Hooker and Bacon, I wouldn’t be surprised if the building blocks of the modern conception of (scientific) law are to be found deep in the Renaissance.
Either way, it should be clear that Descartes’ project is in many ways nowhere near as revolutionary as he often makes it seem.

I wanted to mention some fascinating material from Newton on measurement, but this has gone too long as is.