Archive for January 2008

Lt. Whiteside — sad update

January 31, 2008

This is an update to an update, and a sad one too. The Washington Post today reported that Lt. Elizabeth Whiteside attempted to kill herself for a second time. The report, which came as the lede to a larger story on the record level of attempted and completed suicides among active-duty soldiers in 2007, goes on to say that she did so the day the Army finally announced that it was dropping the charges against her for incidents during her first suicide attempt, during which she pointed her gun at fellow soldiers.

On this second attempt, Whiteside tried to overdose on antidepressants and other drugs. She left a note that read, in part, “I’m very disappointed with the Army. … Hopefully this will help other soldiers.”

Whiteside is now in stable condition, and is due for a discharge from the Army that will preserve her access to mental health benefits. But a central implication of her story is that many others don’t make it.

There is an invaluable review of different strands of that larger story at Mike Dunford’s The Questionable Authority blog.

More than just covering the numbers, Dunford looks at possible causes for the wave of suicides, and gives a brief, depressing introduction to the structural problems within the Army that obstruct attempts to improve the mental health care system available to American service men and women.

Phil Carter’s Intel Dump picks up on the story too, leading with the New England Journal of Medicine report on the link between Traumatic Brain Injury (a blow to, or into, the brain) and Post-Traumatic Stress Disorder. There is a vigorous comment thread there, mostly populated by current or retired service members, that gets at some of the problems in the military’s handling of soldiers and marines with mental injuries suffered in Iraq and Afghanistan.

Given all that, I’ll just make the point I’ve tried to argue in several other posts: that when we talk about problems in the public engagement with science, it seems to me that it isn’t a deficit in specific knowledge that matters. Rather, it’s the habit of mind with which scientists approach empirical questions.

Consider this, from the Army’s first response to Lt. Whiteside’s case:

“Military psychiatrists at Walter Reed who examined Whiteside after she recovered from her self-inflicted gunshot wound diagnosed her with a severe mental disorder, possibly triggered by the stresses of a war zone. But Whiteside’s superiors considered her mental illness “an excuse” for criminal conduct, according to documents obtained by The Washington Post.

At the hearing, Wolfe, who had already warned Whiteside’s lawyer of the risk of using a “psychobabble” defense, pressed a senior psychiatrist at Walter Reed to justify his diagnosis.”

If your default on neuroscience is that it is “psychobabble” then, Houston, we have a problem.

In particular, neuroscience understands – this is a true banality – that mind is a phenomenon of brain; dozens, hundreds of lines of evidence show that beneath disordered behavior are physiological derangements of the brain. If you are in charge of veterans who have undergone all kinds of stress, you don’t need to know what the latest fMRI study shows – but you do need to know that much.

Image: Francisco de Goya, “The Charge of the Mamelukes,” 1814. The reproduction is part of a collection of reproductions compiled by The Yorck Project. The compilation copyright is held by Zenodot Verlagsgesellschaft mbH and licensed under the GNU Free Documentation License.

The Latest Must-Have from the Wizards of Cupertino

January 31, 2008

Serious stuff later — but I opened my morning e-mail to find a note from a former student linking to this:

Hey — if Porsche can brand a hard disk, this seems like the next logical step into consumer hell.

(h/t Michelle)

Signs of the apocalypse: lust for science edition

January 30, 2008

Just learned about the site (yeah, sure — ed.) in the context of Super Bowl coverage and mancrush-ee Tom Brady (ranked tenth).

Where’s the science in this?

Right here, bubba:

Coming in as the number seven most crushed man in history — our own Charles Darwin. That’s ahead of Brady, of course, though well off the pace of number one (JC, which should surprise no one).

My man Albert comes in at number thirty nine, behind Galileo and Pasteur in the scientists’ heat.

Much stupid fun here — with much that is hard to explain. I mean, “Weird Al” Yankovic? At twenty-six? Behind Homer Simpson?!

Update: As of Monday, February 4, Charles Darwin has slipped a notch to number eight, pushed down by Tom Brady’s leap from ten to three — though after last night’s debacle, I wouldn’t be surprised to see the Sage of Down House rise up again.

And now for something completely different…

January 30, 2008

Someone tell PZ Myers to book travel to Nantes:

(See the second photo down for the reference).

Do look — amazing images.

A Day That Lives in Infamy: Remember January 30.

January 30, 2008

(This post winds up on a science-ish blog because of my long history with Albert Einstein, in the course of which I did the work that enables me to write what follows.)

As this New York Times piece reminded me, seventy-five years ago this was a truly bad day. Just past noon (about six hours ago, Berlin time) on January 30, 1933, Adolf Hitler took an oath administered by President Paul von Hindenburg and assumed the office of Chancellor of Germany.

That this would be a disaster was obvious to some. General Erich Ludendorff knew both the players in that disastrous moment. He had been, with Hindenburg, the leader of the de facto military junta that ran Germany in the last years of the so-called “Great” War, and he had conspired with Hitler in the Beer Hall Putsch of 1923. After Hitler became chancellor, Ludendorff wrote to the President in despair: “I solemnly predict that this accursed man will cast our Reich into the abyss… Future generations will damn you in your grave for what you have done.”

Albert Einstein also understood what Hitler’s rise meant, much earlier than most. He and Winston Churchill, then in the political wilderness, commiserated in the summer of 1933, and that September, Einstein’s frustration with the world’s myopia burst out in a newspaper interview: “I cannot understand the passive response of the whole civilized world to this modern barbarism,” he said. “Does not the world see that Hitler is aiming at war?” (From Abraham Pais, Einstein Lived Here.)

Einstein, of course, was right, which doesn’t surprise me — I hear he was a pretty smart guy.

But what I want to emphasize here is one lesson I learned in the writing of that tome that seems to me to have resonance in other circumstances, even ours now, perhaps.

That is: Hitler’s ascension to the chancellorship was a disaster—but not an inevitable one.

He certainly did his part to reach that pinnacle, but there were literally dozens of points at which he could have been stopped – even up to the last months and weeks. The outcome turned on many factors of course, but certainly among them were the inaction of those who might have defended the German republic throughout its troubled birth and early years; and then, at the end, the disastrous folly of those who were trying to destroy it for their own ends – and hoped to turn Hitler to their purposes.

From which I conclude:

It doesn’t only require active, purposeful malice to incinerate a civil society (h/t Balloon Juice). Aloof disdain and especially pure self-interested stupidity act as accelerants to the bonfire. (I had a couple of links there – but I don’t want to Godwinize this post, so fill in the blanks as you will).

Remember January 30.

(If you want a little more on the background to the tragedy of errors that propelled Hitler to power, go to the jump for an excerpt from my book that talks a little bit about the disastrous choices made by a range of German political actors in the early thirties that created the opening Hitler took. There is a lot more to the story, in versions written by many others, of course – but this gives a bitter taste of the events in question.)

Image: Brandenburg Gate Quadriga (sculpture by Johann Gottfried Schadow) at night. Used under a Creative Commons Attribution-ShareAlike 2.5 license.

Update: tweaked a little for readability (horrible word).


Internet vocabulary question and answer

January 29, 2008

In the comments over at Balloon Juice’s Florida Primary Open Thread I saw the word “pwned” again, for the umpteenth time — in this case, as in:

“Jonah Goldberg got punked…

Please. We should be more high minded and serious about our political discourse. Jonah got … PWNED!”

That did it — I finally broke down and admitted to myself that I’ve never known how to say the damn word — seemingly so important for contemporary ’net communication.

So, violating the terms of my Y chromosome, I looked it up here.

The best thing I learned is that there are just two words — count ’em! — in English that use “w” as a vowel. Both are loan words from Welsh, no language for those who fear consonants. They are cwm, rhymes with “tomb,” meaning a cirque — a feature formed at the head of a glacier; and crwth, a Welsh lyre, rhymes with “tooth.”

As the Wikipedia entry solemnly points out, that suggests that the correct pronunciation of pwn is “poon,” as in “Jonah just got ’pooned.”

Maybe you all knew that, but I am greatly relieved to have that settled.

Even better — there’s a sort of onomatopoeia going on here. That pronunciation just sounds right for the meaning… and it has a nice echo of the use of the word for all sorts of misbehavior going on in Neal Stephenson’s ur-cyberspace text, Snow Crash.

Image: Walfgang Zwischen, “A New England Whaler,” 1856. U.S. Library of Congress. Source: Wikimedia Commons.

Why the Public Disses Science: It’s all Jim Watson’s fault

January 29, 2008

I’m late getting my thoughts in order on the recent excellent Science Blogging Conference in North Carolina (and, as you can see below, I got distracted by the latest public health outrage from the administration), but here is one reaction to a persistent theme of the meeting: the question of why science ranks so low with American society at large. At times, in fact, the conversations in the halls and during the presentations pulled a kind of inverse Sally Field: “They hate us. Right now they hate us.” (Click the link to see why the paraphrase does not match your memory.)

The chat came to a head in the reactions to the presentation by Jennifer Jacquet of Shifting Baselines. She argued that science cannot compete with what really interests mass-media owners and audiences: celebrity “news.”

Her example: Britney Spears high-stepping in spiked boots and not much else drowns out every worthy National Academy report ever released – not to mention any public health story that doesn’t involve induced priapism, painlessly shed pounds, or eternal youth.

This post by Abel Pharmboy on Terra Sigillata summarizes the state of play of responses to Jacquet’s talk. See also Jennifer Ouellette’s optimistic take on the issue, while James Hrynyshyn takes a more dour view.

But what’s been missed so far in the conversation (IMHO, of course) is a look at how science lost the hold on both attention and trust it is perceived to have once had.

There is a rich historical vein to be mined here – at the conference we talked about things like the approach-avoidance fear and need for science evoked by the fact of atomic weapons. The Vietnam War had the same effect on a lot of people’s view of science that World War I had on Albert Einstein. He wrote to a friend that “Our whole, highly praised technological progress and civilization in general, can be likened to an ax in the hand of a pathological criminal.”

But I think that you can identify one moment, one event, when the public’s view of what science was really like shifted to a much less reverent place. That moment came — in the English speaking world, at least – in 1968, when James “Lucky Jim” Watson published The Double Helix.

Forty years on, it’s almost impossible to imagine what a radical, profoundly disruptive picture of science and the scientist that book painted for its best-seller audience. Don’t take my word for it. Listen to Peter Medawar:

“Considered as literature, The Double Helix will be classified under Memoirs, Scientific. No other book known to me can be so described. [Emphasis added] … Many of the things Watson says about the people in his story will offend them, but his own artless candour excuses him, for he betrays in himself faults graver than those he professed to discern in others… Watson’s childlike vision makes them seem like the creatures of a Wonderland, all at a strange contentious noisy tea-party which made room for him because for people like him, at this particular kind of party, there is always room.” (From Medawar’s review of The Double Helix, New York Review of Books, March 1968.)

Less kind observers took a more aggrieved view of Watson’s accomplishment. Quoted in a Harvard Magazine article, molecular biologist Robert Sinsheimer said the book described a life in science as a “clawing climb up a slippery slope, impeded by the authority of fools, to be made with cadged data…, with malice toward most and charity toward none.”

Watson has, of course, gone on to exceed himself, becoming an object lesson that a Nobel prize is not election to any scientific papacy: a great discovery in one field does not confer infallibility in all utterance.

But whatever he may have said since, what Watson wrote four decades ago changed the rules of the game for the public perception of scientists.

To be fair: The Double Helix has inspired a lot of people to go into science. Watson made it sound exciting, dramatic, fun — and as Medawar said when he described Jim as Lucky, “in addition to being extremely clever, he had something important to be clever about.” Watson brought that news to a broad public in an engrossing and accessible form.

But think of the book from this angle: imagine an alternate version of The Double Helix written by Tom Wolfe. That’s what Watson managed to do to his own profession. He laid open for bemused scrutiny the character of the scientist, just as Wolfe exposed — from his particular, wicked point of view — the foibles of hippies, architects, or the rich.

You can read, laugh, and condescend – judge – all at the same time. That’s Watson’s approach to his fellow scientists and himself; and you aren’t going to look quite the same way at a group of people who have just endured such a joyfully delivered literary wedgie.

It can’t all be Watson’s fault, of course. It isn’t really his fault at all. He just wrote a funny, gripping, somewhat mean, very young man’s book. The blame for the forty years since lies proximately with those who set out to deceive, all those dishonest, self-interested, manipulative knaves and fools who have exploited every uncertainty, every unanswered question to muddy the waters on everything from global warming, to the possibility of missile defense, to evolution, and so on, and on, and on.

But if nothing else, The Double Helix marked the moment in our culture’s history when the character of the scientist truly became fair game. And once that’s up for grabs, all kinds of mischief become easier to accomplish.

Update: Just to bring up something from the comments. I realize that there is a better comparison than Tom Wolfe for what Watson’s book meant to the public view of science. The Double Helix, I think, has something of the same significance for public attitudes about science as Jim Bouton’s Ball Four had for baseball. That was huge at the time — and it has proved impossible ever to go all the way back to the pre-Bouton fantasy of the game. Same with Watson.

Images: Henri de Toulouse-Lautrec, “Jane Avril,” 1893, and “Queen of Joy,” 1892. Source: Wikimedia Commons.
