Archive for November 2008

Friday xkcd break: Believe it Suckers!

November 28, 2008

Props again, xkcd.  Though in fact I’ve always found Stairway… to be kind of embarrassing; still, we’ve always got the Boss to be thankful for.

Anti Program Notes: NPR’s Day to Day succumbs to psychic woo.

November 26, 2008

This is properly (or at least popularly) PZ Myers territory, but I could not believe my ears this afternoon listening to this story on NPR’s Day to Day broadcast for Wed., November 26.  Four minutes and change of credulous woo on Roxanne Uselman’s burgeoning business as a New York-based psychic offering readings to businessmen and stock jobbers.

Uselman, to her credit, has dropped her prices to accommodate more of the economically challenged — 125 bucks an hour vs. 155 before the downturn.  She’ll even accommodate you by phone if that’s your desire.

At least she is candid about her method:  “What will be for one person, their reading, another person will have another type of reading.  It just depends on what the person, what I channel.  There’s really no logic to it.” (Italics added, of course.)

I have no grief to give to the self-styled psychic here. She’s entitled to believe what she believes, and whatever she believes, to separate the gullible and the desperate from their cash.  Consenting adults and all that.

But NPR?  They should know better.  I’m sure their defense is that the story was really about the way the financial crisis is flowing into truly unexpected corners of the economy, and that’s fine.  But that message could have been sent with a one-sentence statement of the fact that psychics are reporting an upturn in money-related questions, used as a lead or tag to a story about a practice that didn’t depend on magical thinking.  Four minutes plus of credulity as Uselman prattled on — even to the point of making a pitch for more business (I’ll do telephones!  If you’re in NY, come on by!) — stretches that excuse to the breaking point.

This was an embarrassment.  I’m emailing my distress as a listener.  If I were a science reporter at NPR, I’d put in a phone call to the D2D producers right now.  Just sayin’.

Image:  Richard Bergh, “Hypnotic Seance” 1887

Andrew Sullivan and Eric Posner are Dangerous Fools: Numbers and Iraq redux edition.

November 26, 2008

Andrew Sullivan is innumerate.

This is, of course, the blog-equivalent of the dog-bites-man story, except that this time his ignorance of matters quantitative does not merely encompass the manipulation of numerical objects, but their rhetoric, the use and abuse of selected quantities to minimize the perception of human suffering.

The occasion for this arrant blindness comes from a blog entry on the University of Chicago Law School faculty blog by Eric Posner, in which Posner argues that the Iraq invasion was a humanitarian and human rights success.

The argument for human rights advances is based on a number of criteria — freedom of the press, democratic behavior and so on — and I’m not going to quarrel there.

But the claim that the American-led invasion has reduced the violence, murder and injury suffered by the people of Iraq relative to that imposed by Saddam Hussein’s regime is marked by such sleight of hand as to be both (a) deceiving and (b) strongly suggestive of bad faith.

Andrew framed Posner’s claim thusly:

In short: if we never invaded, Iraqi civilian deaths due to sanctions may well have been greater than the wartime deaths.

Andrew’s culpability here is simply that he used his bully pulpit — by some measures the most bulliest in the blogosphere — to promote an argument that turns on a critical weaseling of the data to preserve that very point.  Posner’s commenters on the original post do a very good job of dissecting the numerous, elementary errors in his use of mortality statistics; it’s the very simple-mindedness of Posner’s gaming of the numbers that makes me see this as pure propaganda, rather than mere stupidity.

But those critics focus mostly on errors of method: Posner’s habit of picking convenient baselines, his comparing of incomparables and so on.  I just want to bring up one more fault, one that I believe even a completely numerically challenged Andrew Sullivan should have been able to pick up.

That’s this one:

Let’s suppose that the sanctions regime had continued for 10 years, from 2003 to 2013, and further that security flattens out—it doesn’t get worse, but it doesn’t get better. Under these assumptions, 400,000 Iraqi children would have died if the war had not occurred and the sanctions regime continued. Now, almost 100,000 Iraqis died during the war, and so one of the war’s benefits is that it saves the lives of 300,000 Iraqis (over 10 years).

There is a great deal that is wrong with this passage.  The assumption of a continued sanctions regime for a decade is highly questionable, given that one of the stated pretexts for war in the beginning was that the sanctions path was unlikely to hold indefinitely — in part for exactly the same humanitarian concerns that Posner professes here.

But while that error is real, and of a piece with much else in the post that most charitably can be deemed sloppy thinking (again — check out the comments), the glaring lie-by-citation comes in the 100,000 number.

Posner is right:  there is a reputable project — Iraq Body Count — that states that as of this writing between 89,369 and 97,568 civilian deaths by violence have been documented since the war began.  Problem number one is that IBC itself acknowledges that its belt-and-suspenders approach to documenting a death, necessary to preserve its credibility as the arbiter of the floor, or minimum number of deaths caused by the war, produces a substantial undercount.  In 2006, an IBC presentation stated that the total deaths could be as much as double their published number.

That same presentation then took up the then-controversial Lancet/Johns Hopkins study that suggested that between 300,000 and 900,000 civilian deaths had occurred by 2006 as a result of the war, charging that a number of methodological flaws marred the results. Those arguments are beside the point, as the underlying claim of the study is that it measures excess deaths rather than deaths by violence.

The distinction is crucial, as Posner’s claim, echoed by Sullivan, is that the number of Iraqi deaths due to the war is less than those from all causes due to the direct or indirect consequences of Saddam Hussein’s continued rule and the continuation of sanctions.  If you want to compare violent deaths — those the IBC counts — with violence imposed by Saddam’s regime, that’s an apples to apples pairing. If you want to count all the suffering of children lacking food or medicine due to the sanctions regime and Saddam’s manipulation of the UN Oil for Food fiasco, then the proper comparison is to all the suffering induced by the social disruption, the lack of services, the failure of governance that flowed in the wake of the invasion — those the Lancet study and others sought to estimate.

Those numbers are huge.  They range from over 300,000 (as of 2007) to over a million.  Most of the estimates run well above Posner’s highly suspect extrapolation of 400,000 deaths. Both totals are grotesque, of course.  It is better to preside over the slaughter of 400,000 than a million only in the most curdled of calculations of moral responsibility. Iraq before and after 2003 offers ample scope for pondering how the international approach to that country and its governance for decades has failed its people.
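For what it’s worth, the apples-to-apples point needs nothing beyond the figures already quoted above.  Here’s a minimal sketch in Python — all the numbers are the post’s own (themselves estimates), not new data:

```python
# Figures as quoted in this post; all are estimates, not new data.
ibc_documented = (89_369, 97_568)   # IBC: documented violent civilian deaths to date
ibc_possible_ceiling = 2 * ibc_documented[1]  # IBC's own caveat: true toll could be ~double
lancet_excess = (300_000, 900_000)  # Lancet/Johns Hopkins: *excess* deaths by 2006
posner_counterfactual = 400_000     # Posner's extrapolated sanctions toll, 2003-2013

# Posner pairs a violent-death count against an all-cause counterfactual.
# The like-for-like pairing is excess deaths vs. his 400,000 figure:
exceeds = lancet_excess[1] > posner_counterfactual
print(ibc_possible_ceiling, exceeds)
```

Even IBC’s own doubled ceiling sits below 200,000, while the excess-death range that actually matches Posner’s all-cause counterfactual runs well past his 400,000.  That mismatch of categories is the whole trick.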

But it is simply wrong — and dangerous, and morally bankrupt — to defend the invasion of Iraq on the grounds that it saved lives.  No reasonable assessment of the data on hand supports that claim, and its making serves to grease the skids for the next, ever hopeful essay in defense of American exceptionalism and the uses of violence for good.

I don’t know much about Posner.  He has the fact of a famous father behind him, but this does not mean that he is merely a self-made son in the manner of such luminaries as Kristol, W. or Goldberg, J.  At the same time, the real accomplishments on his resume raise the question of why he would publish such a clearly false claim about the death toll.  I don’t know the answer.

As for Andrew.  It’s odd.  He’s someone who, I think, is a sentimental naif a lot of the time.  He is obviously smart, obviously enormously prolific in his reading and his writing, and he has fought the good fight over these last several years on a bunch of issues. He certainly has noted the increasing weight of evidence that the Iraq war was a fiasco, and a bloody one at that.  At the same time he does seem to freeze every time he faces a claim that has numbers in it.  This number, the total of Iraqi dead, is hardly a hidden datum at this point; he should have remembered the controversies and responses to a number of claims.  And yet he gave the props of his influential blog to Posner’s nonsense.

Again, I don’t know why Sullivan refused to think for a moment about Posner’s claim before posting.  It may be a residual reflex to find some way to defend his initial support for the war: kind of a “hey, it bankrupted this country; devastated that one; brought America into moral jeopardy (see torture, inter alia) and diminished our soft and hard power throughout the world, but at least it saved some kids” thought.  Except it didn’t, and there is still no excuse for the moral and strategic error committed in 2003 and compounded since.

Image:  Francisco de Goya, Los Desastres de la Guerra, plate 30, during and after 1810.

The First Amendment?

November 25, 2008

Bad Gayz Eated It.

John Cole explains.

Quote for the Day: Jacob Burckhardt hearts him some science writing dept.

November 25, 2008

Jacob Burckhardt is hardly a household name anymore, outside certain rather specialized houses, but it would be not too great an exaggeration to say that he “discovered” the Italian Renaissance, establishing the notion of a distinct period, a time and place in which fundamental changes took place that rang out the end of the ideas and culture of the Middle Ages, laying the foundations for habits of mind and the concrete history of that time we think of as modern.

Perhaps most significantly, he was an early historian, perhaps the seminal one, who focused on the history of art in particular and “culture” more generally as essential approaches to “history,” full stop.  That attitude led him to think about the history of science in the Renaissance as something other than simply a chain of discoveries that form individual sequences within particular disciplines…which in turn led him to just about the earliest praise of science writing I’ve been able to find.

If it is a little back-handed, Burckhardt’s compliment still captures what I think of as a critical truth:  science is not self-contained;  it is an expression of culture, and its survival as a living human enterprise depends on culture at large remaining aware of its claim on the public’s understanding and emotion alike.  In his landmark work, The Civilization of the Renaissance in Italy, Burckhardt writes:

Even the simple dilettante of a science — if in the present case we should assign to Aeneas Sylvius so low a rank — can diffuse just that sort of general interest in the subject which prepares for new pioneers the indispensable groundwork of a favourable predisposition in the public mind.  True discoverers in any science know well what they owe to such mediation.

(Part Four.  The translation above comes from the Penguin Classics edition of 1990.)

I’d argue that science writing and science writers have more ambition than simply acting as cultural diffusers.  The ones I admire most think of themselves as writers whose subject is science, and not simply science writers; that is, they (and I, in the privacy of my own thoughts) are trying to use language to the limits of its capacity for expression, to move readers and not simply to inform them.  That said, Burckhardt is right:  a civilized culture, a civilized time manages to communicate some version of its most sophisticated thinking to every interested citizen.

Image:  Studies of Embryos by Leonardo da Vinci (Pen over red chalk 1510-1513).

Stimulate This: Build the Grid First

November 24, 2008

As everyone in range of YouTube now knows, President-elect Obama* is committed to spending what it takes to revive the American economy.  A very welcome development, after months of spending what it takes to transfer risk from the rich to the rest of us.  But still, such ambition raises a question:  stimulate what?

My basic approach to this question is the obvious one:  pouring money into an economy works to stimulate activity, but it works best if you spend the money on things that have the capacity to evoke more economic activity in their turn.

That is — while there is a New Deal-style urgency to provide relief to those suffering the worst in this economic downturn (i.e., the jobless and the foreclosed), there is a ceiling to the broader economic impact that such relief can provide.  Take a look at this excellent post by Eric over at The Edge of the American West, in which a fisking of John Maynard Keynes’ letter to Roosevelt in 1938 underscores what was understood then (and still holds true) about the limits of relief as anti-recession policy.  (See also this for an analysis that extends into the role of WW II spending on recovery.)

So if you really want to promote long term economic growth from within a depression/recession, you have to buy some tickets in the game.**  Or, to put it more formally, you have to use the power of government spending to build capital that will in turn prove to be economically useful over a much longer time-frame than the immediate quarter or even year in which the Treasury prints the necessary cash (debt) to round out all those zeroes being talked about in Washington right now.

From where I sit (staring out over the MIT campus), that means spending on projects rich in science and technology — or at least ones that foster the uses of what science and technology produce: ideas and physical things that contribute to human well-being.

So what I’d like to do here is to begin a discussion, if possible, of what we should do with the stimulus process that could be informed by what science and engineering approaches suggest are the best long term investments in the country’s economy.

A couple of suggestions to get us going, then:

For one, there is just a broad based investment in the American research establishment.  It makes little sense to try and pick winners in the next great idea competition; the trick is to fund as many of the best people as you can find and let them come up with ideas that enhance human well-being (and thus produce a lot of economic activity in their wake).

That’s the thinking behind this post (and this follow-up) in which I made a pitch for a major investment in human capital:  paying for the education and early research careers of a much larger pool of young scientists and engineers than we now support.

It’s a good idea in just about any economic climate, and would have some stimulus effect — but in all honesty it falls between the relief and stimulus poles of any future plan.  The need to support young scientists is becoming acute as universities both public and private confront the joys of endowments and state/federal budgets that are under the pressures we all know.  Also, though we will see economic and cultural benefits from the discoveries to be thus enabled, the time frame is a little loose.

For a more concrete idea, try this:  early action on one thing the Obama team has already said it wants:  a new “smart” power grid.

The new grid is a prime example of the sort of stimulus I think we need because, first, it will pay for itself over a reasonable amortization period, given the potential improvement over current losses in the power distribution system.

But more than that, the new grid is crucial because it enables much else that we want to do for economic, environmental, and national security reasons.  We need a dramatically enhanced power transmission system to handle the particular demands on the transport of electricity from the proposed increase in renewable generating capacity in the wind/solar belts of the largely underpopulated middle and southwestern desert portions of the country.

Those places are a long way from most of the major population centers that will use the power thus generated, which means we need as efficient a grid as possible.  But the issue is more pressing than that.  An industry study [link to PDF] suggests that wind/solar power, being less controllable and more irregular than conventional plants, puts unusual demands on a grid.  The one we have now won’t hack it, and getting one in place that can will be a significant design and construction challenge.  See this NYTimes piece for a first cut at the reasons why.

All of which means that funding now for a new grid meets two goals:  immediate classical Keynesian stimulus, with jobs created right here, right now, and long-term capital investment of the sort that only government can undertake. Think of this as a 21st-century analogue to the construction of the interstate highway system, without many of the ecological side effects.  A win-win, in other words.

(FWIW, as a more direct heir to the road building of both the thirties and the fifties, I’d love to see an investment in high-speed passenger rail that would eliminate the need for air travel on any journey of less than 300 or so miles around the major hubs — the same basic arguments apply, but because the benefits are felt most immediately in regions rather than nationwide, it’s a harder sell.)

So over to you, dear readers:  what else should our better part of trillion bucks of new government capital spending buy?

*I still love writing that.

**The reference is to this old joke.

***Faraday is here both for his contributions (enormous) to the creation of the electric economy and for his yet-to-be-topped line on the reason to support scientific research.  Asked by Prime Minister William Gladstone what use electricity was, he replied, “Why, sir, there is every possibility that you will soon be able to tax it!”

Image:  Alexander Blaikley, “Michael Faraday*** delivering a Christmas Lecture at the Royal Institution,” c. 1856.

Breaking News: Copernicus Unearthed

November 21, 2008

Forgive the headline; I could not resist.

Via the Nature group’s blog The Great Beyond comes notice that remains have been found of the man who can be seen as having fired the first shot of the scientific revolution (and to have put human beings in their place).  The blog reports:

A skull from Frombork cathedral in Poland has been identified as that of revolutionary astronomer Copernicus.

Marie Allen, of Uppsala University, says DNA from the skull is a match for DNA from hairs found in books owned by Copernicus, whose book De revolutionibus orbium coelestium started the movement to viewing the sun – rather than the Earth – as the centre of the solar system.

“The two strands of hair found in the book have the same genome sequence as the tooth from the skull and a bone from Frombork,” she says (AFP).

See this article from The Guardian for more details.

I love this story, not least for the connection of books to a kind of immortality:  we make and leave parts of ourselves in every book we read.

This is a big, big deal for anyone who likes to think about how the way we think now took form.  Tim F. of Balloon Juice sent notice of this story to me, and for him, it is the connection of Copernicus to Galileo that has the most resonance; Galileo’s defense of a sun-centered cosmos in the face of official Catholic rejection of Copernicus’s idea marks for many the birth of the modern sensibility, the assertion of the authority of experience over revelation.

I think that’s right — or at least, that seeing in and around Galileo one of the major steps towards the modern idea of science is certainly on target.  But Copernicus himself holds my attention here.  It is almost impossible to overstate how significant his combination of insight and rigor was in creating a Copernican “party” amongst the learned of Europe.

It was that impact that gave both license and direction to the ongoing and expanding European inquiry into nature, an effort that over the next 150 years became a scientific transformation so total that there was no going back.

There is one best place to trace how that which I am misleadingly calling a party took form. It comes courtesy of the near-legendary Harvard historian of science Owen Gingerich, who has carried on a decades-long love affair with Copernicus and his book, De revolutionibus orbium coelestium (On the Revolutions of the Heavenly Spheres).

I owe Owen thanks for help he gave me early in the process of writing my Newton book– and for a happy afternoon in his unbelievably book-crammed office, looking over facsimile editions of Copernicus to puzzle out the meaning of a diagram or two. (That is– he was puzzling them out and I was holding his coat.)

But I owe him a greater debt of gratitude for his The Book Nobody Read, his tale, part memoir, part brilliant intellectual history, of tracking down every extant example of the first and second editions of De revolutionibus… and analyzing who wrote what marginal notes in each copy.  In doing so, he reconstructs the path Copernicus’s ideas took through the learned of Europe.  It’s a great read, a great glimpse of what it means to have a revolution in ideas at the level of individual thinking, feeling human beings exhilarated by a new thought.

(My own encounter with Owen’s book led me to grab the opportunity that came when I visited the Newton scholar Scott Mandelbrote at Peterhouse, Cambridge.  Scott is, or was at the time, the man in charge of Peterhouse’s library, which owns a first edition of De revolutionibus. At lunch the day we met, the topic of Copernicus came up, and he very kindly took me into the library and pulled that treasure off the shelf for me to pick up and turn the pages.

It may be an odd passion, but I can’t describe how thrilling it was to pick up an almost five hundred year old book — such a little thing — that set off so many fireworks.  It is, by the way, a beautiful book just from the point of view of the printer’s art.  In particular, the woodcut drawings are truly elegant:  they possess a sharp, precise line that still has the quality of an individual craftsman’s gesture; there are sweeps to the curves, and slight deepening or widening of the stroke that gives emphasis to the diagrams.  They literally don’t make ’em like that anymore.)

Images:  Theodor de Bry, copperplate portrait of Nicholas Copernicus, 1598.

Nicholas Copernicus, diagram of the heliocentric system from De revolutionibus orbium coelestium, 1543.

Friday Inanity: Stupid Cat Pictures Dept.

November 21, 2008

What Beer Does To Writers

November 20, 2008

Courtesy of Ta-Nehisi, we have reference to Burkhard Bilger’s piece on the revival of American craft brewing and the movement’s recent excursions into extreme beer — the 10-percent-or-more alcohol monsters that leave you crying in your cups if you aren’t careful.*  Ta-Nehisi loved it for the quality of Bilger’s writing, quoting the lede as an example of the (writing, not brewing) craft being practiced at the highest level.

It is fine writing — jump either of the links to check it out — but it reminded me of a much earlier piece of equally fine writing by William Least Heat Moon, published in the Atlantic more than twenty years ago, back when the phrase “good American beer” was still an oxymoron to most.

Here’s Moon’s last paragraph, in which he and his companion, “the Venerable,” make the mistake of going to the well one last time:

South of Sacramento, near Interstate 5, we stopped in a bar overhung with ferns, stained glass, old-time signs. We went in looking not for the perfect bar but only for a working phone. We knew that men who discuss the bubbles in a head of beer, who read patterns in the Irish lace – those men do not come into bars like this. Yet we had a small hope that some bottle of an untried oddity might be tucked away. The offerings, of course, were Hobson’s choice. Maybe the wish to put a touchstone to these last days of golden glasses urged us, I don’t know, but we ordered our Hobson’s, our industrial. The Venerable lifted his glass, drank, and set it down. He turned to me blankly and said, “Did I miss my mouth?”

I’ve used that last line I can’t think how many times in the years between then and now to describe all manner of experiences that somehow flat out failed.  It’s perfect.  Read the whole piece — it’s good on its own terms, and it captures a surprisingly distant recent moment in our past.  We live in a different country now — and Moon has been testifying to the change for a long time.

*Bilger writes about, among much else, Dogfish Head’s 120 Minute IPA, which has an alcohol content well into the double digits.  I have yet to encounter it, but I fear it.  I tried the 90 Minute version, with its 9% ABV, on the recommendation of another blogospheric beer lover, Tim from Balloon Juice, and even that seemed a bit off balance to me.  But unless it’s just too tame to be borne, I can commend the 60 Minute IPA, 6% ABV.  Just bitter enough — very nice.  (Though as a Bay Area boy born and bred, and blessed during high school with a bar just north of the UC Berkeley campus that (a) had Anchor Steam on tap and (b) was not exactly meticulous about checking IDs, I give that very fine old beer pride of place.)

Image:  David Teniers II, “Tavern Scene,” 1658.

Quote for the Day: Steven Pinker/Albert Einstein edition

November 20, 2008

It could be just me, but I ain’t so sure about this:

Some people raise an eyebrow at linguists’ practice of treating their own sentence judgments as objective empirical data.  The danger is that a linguist’s pet theory could unconsciously warp his or her judgments.  It’s a legitimate worry, but in practice linguistic judgments can go a long way.  One of the perquisites of research on basic cognitive processes is that you always have easy access to a specimen of the species you study, namely, yourself.  When I was a student in a perception lab I asked my advisor when we would stop generating tones to listen to and start doing the research. He corrected me:  listening to the tones was research, as far as he was concerned, since he was confident that if a sequence sounded a certain way to him, it would sound that way to every other normal member of the species.  As a sanity check (and to satisfy journal referees) we would eventually pay students to listen to the sounds and press buttons according to what they heard, but the results always ratified what we could hear with our own ears.  I’ve followed the same strategy in psycholinguistics, and in dozens of studies I’ve found that the average ratings from volunteers have always lined up with the original subjective judgments of the linguists.  (Steven Pinker, The Stuff of Thought, 2007, p. 34)

I know (or I think I do) what Pinker is trying to say here.  You can’t even begin to formulate an idea without having some idea of what you’re looking at or for.  Professional experience and a depth of knowledge of other work in the field do count.  One’s own perceptions are real, and can (must) guide experimental design and interpretation.

But at the same time, I fear Pinker’s diminishment of the possibility of observer bias, of the fact that people have commitments both conscious and unconscious to a given idea or expected outcome.

That such expectations can deeply affect your ability to understand what your measurements are actually saying is a matter of historical fact — and this kind of observer bias can strike even the brightest of investigators, even in fields seemingly safely removed from the subjectivity and noise that accompany any attempt to penetrate human mental life.  Peter Galison has dissected the famous (among a certain crowd) case of Albert Einstein’s misplaced confidence in the interpretation of his collaboration with W. de Haas on an experiment to explore properties of what became known as the Einstein-de Haas effect.

The experiments the two conducted did advance the understanding of the magnetic behavior of electrons, though a proper interpretation of what was going on had to wait (in a familiar trope for early 20th century physics) for quantum mechanical intervention.  But the point here is that Einstein had made a theoretical calculation to determine the expected value of the ratio of the magnetic moment to the angular momentum of electrons travelling in their closed orbits around atomic nuclei.  In his calculation, he derived a value of one.

Then he and de Haas performed the measurement, using a delicate and complicated experimental setup. Sure enough, they were able to extract data that produced a value of 1.02 for the quantity in question.   Einstein was aware that this looked almost too sweet — he wrote that the “good agreement may be due to chance” — but the coincidence of expectation and result was too much for him to ignore.

Unfortunately, subsequent experiments, and then the theoretical description in quantum mechanical terms showed the correct value to be two.
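The quantity at issue is what physicists call the gyromagnetic ratio.  A short sketch shows why the classical orbital picture handed Einstein a value of one, while electron spin, which actually dominates ferromagnetism, carries a g-factor of about two.  The specific speed and radius below are illustrative placeholders; the punchline is that they cancel out:

```python
# Classical orbital picture: an electron of charge e and mass m on a circular
# orbit of radius r at speed v acts as a current loop.
#   magnetic moment:  mu = I * A = (e*v / (2*pi*r)) * (pi * r**2) = e*v*r / 2
#   angular momentum:  L = m * v * r
# so mu / L = e / (2*m), i.e. a g-factor of exactly 1 -- Einstein's expectation.
e = 1.602e-19   # electron charge, C
m = 9.109e-31   # electron mass, kg
v = 2.2e6       # illustrative orbital speed, m/s (roughly Bohr-model scale)
r = 5.3e-11     # illustrative orbital radius, m (roughly Bohr-model scale)

mu = e * v * r / 2
L = m * v * r
g_classical = (mu / L) / (e / (2 * m))
print(round(g_classical, 12))  # 1.0, independent of the choice of v and r
```

Quantum mechanics assigns electron spin a g-factor of about 2, which is why the correct Einstein-de Haas value turned out to be two rather than the expected-and-duly-measured one.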

The moral?  Pace Pinker, while judgments by practitioners immersed in their fields do and should go a long way, past (and future) performance is no guarantee that observer bias ain’t about to bite you in the ass right now.   (Say I, ex cathedra — that is, someone whose last lab experience involved hideous acts performed on a frog — see E. M. Fogarty, “Anatomy of a Frog,” Journal of Irreproducible Results, 1963, 11, 65.)

That said — I’m well stuck into The Stuff of Thought and am enjoying it greatly.  I just got stuck for a moment on what might be the scientist-popularizer’s equivalent of an episode of irrational exuberance.