Tuesday, December 27, 2011

Call for Papers: Consciousness and Moral Cognition

The editors at Review of Philosophy and Psychology invite submissions for a special issue on consciousness attribution in moral cognition. Guest authors include Kurt Gray (Maryland); Edouard Machery (Pittsburgh) and Justin Sytsma (East Tennessee State); and Anthony I. Jack (Case Western Reserve) and Philip Robbins (Missouri).

Submissions are due March 31, 2012.

The full CFP, including relevant dates and submission details, is available here.

Abbreviated CFP: When people regard other entities as objects of ethical concern whose interests must be taken into account in moral deliberations, does the attribution of consciousness to these entities play an essential role in the process? In recent years, philosophers and psychologists have begun to sketch limited answers to this general question. However, much progress remains to be made. We invite contributions to a special issue of The Review of Philosophy and Psychology on the role of consciousness attribution in moral cognition from researchers working in fields including developmental, evolutionary, perceptual, and social psychology, cognitive neuroscience, and philosophy.

Friday, December 23, 2011

Against Increasing the Power of Grant Agencies in Philosophy

Clark Glymour has an opinion piece urging philosophers to reach out beyond their disciplinary circles and encouraging the pursuit of big-dollar grants. Adam Briggle and Robert Frodeman say much the same thing. (Glymour emphasizes philosophy of science; Briggle & Frodeman, applied ethics.) I agree that philosophers as a group should reach out more than they do. But I think the increasing emphasis on grant-getting in academia is a disease to be fought, not a trend to be encouraged.

Academic research scientists spend a lot of time applying for grant money. This is time that they are not spending doing scientific research. I've often heard that applying for an NSF grant takes about as much time as writing a journal article. Now, most scientists need money to do their research and there should be mechanisms to fund worthy projects, so maybe for them passionate summers of grant application are a worthy investment. But do philosophers need to be doing that? I doubt philosophy is best served by encouraging philosophers to spend more time thinking up ways to request money.

Furthermore, for both scientists and philosophers I think a better model would be a hybrid in which it is possible to apply for grants but in which, also, productive researchers could be awarded research money without having to apply for it. Look, V.S. Ramachandran is going to do something interesting with his research money no matter what, right? Philip Kitcher too. Let them spend their time doing what they do best and monitor the funds after the fact. Let us all have a certain small amount of money to attend (and sometimes organize) conferences, without our having to manufacture elaborate bureaucratic pleas in advance. The same total funding could go out, with much less time wasted, if grant writing were only for exceptional cases and exceptional expenses.

A very different type of reason to resist the increasing academic focus on grant-getting is this: Grant-driven bureaucracy decreases the power of researchers to set their own research agenda and increases the power of the grant agencies to set the agenda. Maybe that's part of what Glymour and Briggle & Frodeman want, since they seem to distrust philosophers' ability to choose worthy topics of research for themselves. But philosophy in particular has often been advanced by people working outside the mainstream, on projects that might not have been seen as valuable by the well-established old-school researchers and administrators who tend to serve on grant committees. In ancient Greece, the sophists were the ones getting grants, while Socrates was fightin' the powa.

If you want to apply for grants, terrific! I have no problem with that. Get some good money to do your good work. Organize an interesting conference; fly across the world to thumb through the archives; get some time away from teaching to write your book. Absolutely! But let's not try to push the discipline as a whole more into the grant-getting game than it already is.

Thursday, December 15, 2011

Frege's Puzzle and In-Between Cases of Believing

There's a huge literature in philosophy of language on what's called "Frege's puzzle" about belief reports. Almost all the participants in this literature seem to take for granted something that I reject: that sentences ascribing beliefs must be determinately true or false, at least once those sentences are disambiguated or contextualized in the right way.

Frege's puzzle is this. Lois Lane believes, it seems, that Superman is strong. And Clark Kent is, of course, Superman. So it seems to follow that Lois Lane believes that Clark Kent is strong. But Lois would deny that Clark Kent is strong, and it seems wrong to say that she believes it. So what's going on? There are several standard options, but all lead to trouble of one sort or another. (If you don't like Superman, try Twain/Clemens or Unabomber/Kaczynski.)

On a dispositional approach to belief of the sort I favor, to believe some proposition P -- the proposition, say, that that guy (variously known as "Superman" or "Clark Kent") is strong -- is to be disposed to act and react, both outwardly and inwardly, as though P were true. (On my version of dispositionalism, this means being disposed to act and react in ways that ordinary people would regard as characteristic of belief that P.) Lois has some such dispositions: For example, she's disposed to say "Superman is strong". But she notably lacks others: She's not disposed to say "Clark Kent is strong". She's disposed to ask Superman/Clark Kent to lift her up in the air when he's in costume but not when he's in street clothes.

Personality traits also involve clusters of dispositions, so consider them as an analogy. If someone is disposed to be courageous in some circumstances and not courageous in other circumstances, it might be neither quite right to say that she is courageous nor quite right to say that she isn't. "Courageous" is a vague predicate, and we might have an in-between case, in which neither simple ascription nor simple denial is entirely appropriate (though there may also be contexts in which simple ascription or denial works well enough -- e.g., battlefields vs. faculty meetings if she has battlefield courage but not interpersonal courage). Compare also "Amir is tall", said of a man who is 5'11". Lois's belief about Superman/Clark Kent might similarly be an in-between case in the application of a vague predicate.

You'll probably object that Lois simply and fully believes that Superman is strong, and it's not an in-between case at all. I have two replies. First, that way of putting it -- in terms of Superman rather than Clark Kent -- highlights certain aspects of Lois's dispositional profile over others, thus creating a conversational context that tends to favor believes-strong ascription (like a battlefield context might favor ascription of courage to a person who has battlefield courage but not other sorts of courage). Second, consider a version of the case in which the belief ascriber doesn't have the name "Clark Kent" available, but only the name "Superman". The ascriber and his friend are looking through a window at Superman/Clark Kent in street clothes. The ascriber's friend, who doesn't know that Lois is deceived, asks, "Does Lois believe that Superman is strong?" What should the ascriber reply? He should say, "Well, um, it's a complicated case!" I see no point in insisting that underneath that hedge there needs to be a determinate metaphysical or psychological or (disambiguated [update Dec. 16: e.g., "de re / de dicto"]) linguistic fact that yes-she-really-does (or no-she-really-doesn't), any more than there always has to be a determinate fact about whether someone is tall simpliciter or courageous simpliciter.

Now this is a heck of a mess in philosophy of language, and I haven't thought through all the implications. I'm inclined to think that excessive realism about the identity of propositions is part of the problem too. I don't claim that this is a full or non-problematic solution to Frege's puzzle. But it seems to me that this general type of approach should be more visible among the options than it is.

[HT: Lewis Powell on Kripke's Puzzle.]

Saturday, December 10, 2011

Descartes, Moore, Whatevs!

On Nature's website:
“Descartes said that if there's something you can be certain of in this world, it's that your hand is your hand,” says Ehrsson.
Um, whoops! Descartes said that what he couldn't doubt was his own thinking. It was G.E. Moore who famously said it would be absurd to suggest that he didn't know that "here is a hand".

Descartes, G.E. Moore, whatever! It's only philosophy, after all -- not something worth bothering to get right in the flagship journal of the natural sciences.

(If I sound prickly, maybe it's because I'm currently on hold with AT&T, about to talk to my eleventh representative in two months about being double billed for internet service.)

Update, Dec. 15: The author of the piece has now corrected the error. It turns out that philosophy is worth getting right after all!

Thursday, December 08, 2011

Creativity and Dishonesty

A recent paper by Francesca Gino and Dan Ariely suggests that relatively creative people are more likely to be dishonest than are relatively less creative people, because they are better at concocting rationalizations for potential dishonesty. I can't say I'm entirely won over by Gino & Ariely's methodology, which measures dishonesty by seeing whether people will give wrong answers in psychology laboratory studies when they are paid to give those wrong answers. (If the psychologist says, "Roll a die; I'm not going to check the outcome, but I'll pay you $1 if you say it's a 1 and $6 if you say it's a 6", how exactly should the participant react to what's going on here?) I'd rather see more naturalistic observations of behavior in real-life situations, or at least better cover stories. Nor do I think Gino & Ariely do a terrific job of establishing that the ability to creatively rationalize is the real mediator of the apparent difference in honesty.

Nonetheless, the conclusion is interesting, the mechanism plausible, and the results at least suggestive. And their picture fits nicely with my favorite hypothesis about the apparent fact that professional ethicists behave no better, morally, than do socially similar non-ethicists. Philosophical moral reflection, I'm inclined to think, rather than being inert, is bivalent: On the one hand, it highlights the moral dimension of things and can help you appreciate moral truths; but on the other hand, people who are skilled at it will also be skilled at finding superficially plausible rationalizations of attractive misconduct, which might then allow them to feel freer to engage in that misconduct (e.g., stealing a library book). Professional ethicists develop their creativity in precisely an area in which being creative brings substantial moral hazards.

Tuesday, December 06, 2011

The Baby Boom Philosophy Bust

In 2010, I compiled a list of the top 200 most-cited contemporary authors in the Stanford Encyclopedia of Philosophy. (By "contemporary" I mean born in 1900 or later.) One striking feature of this list is the underrepresentation of baby boomers, especially near the top.

Let's compare the representation of people born 1931-1945 (the fifteen years before the baby boom) with those born in 1946-1960 (the bulk of the baby boom), among the top 25.

Among the pre-baby boomers, we find:
David Lewis (#1)
Saul Kripke (#6)
Thomas Nagel (tied #7)
Jerry Fodor (#9)
Daniel Dennett (tied #10)
Frank Jackson (tied #10)
Robert Nozick (tied #13)
John Searle (tied #13)
Gilbert Harman (#16)
Ronald Dworkin (#18)
Joseph Raz (tied #19)
Bas van Fraassen (tied #19)
Fred Dretske (tied #22)
Peter van Inwagen (tied #22)
Alvin Goldman (tied #24).
Among the baby boomers we find:
Martha Nussbaum (tied #19)
Philip Kitcher (tied #24).
These numbers seem to suggest that the Depression-era and World War II babies have had a much larger impact than the baby boomers on mainstream Anglophone philosophy.

You might have thought the reverse would be the case. Aren't there more baby boomers? Haven't baby boomers been culturally dominant in other areas of society? So what's going on here?

One possibility is that the boomers haven't yet had time to achieve maximum influence on the field. Someone born in 1940 has had ten more years to write and to influence peers and students than has someone born in 1950. Although I think there is something to this thought, especially for the younger boomers, I suspect it's not the primary explanation. A boomer born in 1950 would be sixty years old by 2010. The large majority of philosophers who have a big impact on the field achieve a substantial proportion of that impact well before the age of sixty. Certainly that's true of the top philosophers on the list above -- Lewis, Kripke, Nagel, and Fodor. Their most influential work was done from the 1960s to the early 1990s. If it were simply a matter of catching up from a later start, the boomers have had plenty of time to generate the same kind of influence. In fact, contemporary Anglophone philosophers seem to have their average peak influence from about age 55 to 70, declining thereafter. On average, the baby boomers should be enjoying peak citation rates right now, and the Depression babies should be starting to wane.

Here's an alternative diagnosis: College enrollment grew explosively in the 1960s and then flattened out. The pre-baby boomers were hired in large numbers in the 1960s to teach the baby boomers. The pre-baby boomers rose quickly to prominence in the 1960s and 1970s and set the agenda for philosophy during that period. Through the 1980s and into the 1990s, the pre-baby boomers remained dominant. During the 1980s, when the baby boomers should have been exploding onto the philosophical scene, they instead struggled to find faculty positions, journal space, and professional attention in a field still dominated by the Depression-era and World War II babies.

This started to change, I think, with the retirement of the Depression babies and the hiring boom of Gen-Xers in the late 1990s and early 2000s. It remains to be seen if history will repeat itself.