Posted tagged ‘History’

Racism Kills…And Kills…And Kills

June 15, 2016

Anger?  Heartbreak? Disbelief? Berserker rage?  Which should come first in response to this?

For forty years, the Tuskegee Study of Untreated Syphilis in the Negro Male passively monitored hundreds of adult black males with syphilis despite the availability of effective treatment. The study’s methods have become synonymous with exploitation and mistreatment by the medical community. We find that the historical disclosure of the study in 1972 is correlated with increases in medical mistrust and mortality and decreases in both outpatient and inpatient physician interactions for older black men. Our estimates imply life expectancy at age 45 for black men fell by up to 1.4 years in response to the disclosure, accounting for approximately 35% of the 1980 life expectancy gap between black and white men. (h/t Jesse Singal at The Science of Us)

[Image: J. S. Sargent, Graveyard in the Tyrol]

That’s the abstract of a paper by Stanford Medical School’s Marcella Alsan and the University of Tennessee economist Marianne Wanamaker. It’s currently in the working paper stage at the National Bureau of Economic Research (which is, despite its name, not a government research institution).

As Singal writes over at New York Magazine, that means both that this is not quite the final draft of this paper (or at least, that it hasn’t yet gone through the whole journal process) — and that there is a host of nuance and specific contingencies surrounding the Tuskegee story.  But the central point remains:  specific acts of racial cruelty harm not just those bearing the immediate brunt; they can — and did here — do lasting and lethal damage to so many more.

Alsan and Wanamaker conclude:

Our findings underscore the importance of trust for economic relationships involving imperfect information. Typically the literature on trust has focused on trade settings (Greif, 1989); however, much of medical care depends on health providers and patients resolving information asymmetries. Trust, therefore, is a key component of this interaction…

Indeed.

And if we needed any more reasons to take this election seriously (we don’t) think on this:  Donald Trump’s candidacy is based on racism, on the denial of a share in American polity and society to those who look the wrong way.  There’s a breach of trust there, deep and dangerous — and in so many ways, deadly as hell.

John Singer Sargent, Graveyard in the Tyrol, between 1914 and 1915

May Day!

May 1, 2016

Happy International Workers Day!

That’s one incarnation of a classic — and here’s another, with a lovely story to frame it.

So, to channel my inner President Obama talking to Senator Sanders last night, “this is the time and place” in which I wish all my comrades a happy, peaceful, easeful international labor day.  We may all Tikkun Olam again tomorrow.

A Shuttered Past

November 21, 2015

I think we need some antidote to the depths of derp we’ve seen (and on this blog picked over with all the horror that follows a good look at last night’s supper this morning) coming from the Syrians Are Coming brigade of bed-wetters.

So, instead, let’s take a look at someone who used their media smarts for good — and, in doing so, helped forge the chain that led to the fact (glory be) that we have the president we do right now, serving as a bulwark against the stupid that would have toppled a lesser person.

That would be this man:

[Image: Frederick Douglass, c. 1860s]

That’s Frederick Douglass, of course, in a shot taken in the 1860s.

Here he is as a younger man:

[Image: Frederick Douglass, unidentified artist]

And in old age:

[Image: Frederick Douglass, collodion, c. 1865–80]

Those are three of the 160 surviving photographs taken of Douglass — a total that currently stands as the most confirmed separate portraits of any American in the 19th century.*  Scholars John Stauffer, Zoe Trodd and Celeste-Marie Bernier have a new book out, Picturing Frederick Douglass.  In it they use a sequence of images to drive a new biography of Douglass, and in doing so allow us to see technological change as it was lived — and used — by a brilliant observer of his own life and times.  As the authors write in the introduction, Douglass loved photography, and saw it as an exceptionally potent tool for making the world a different and better place. Douglass loved the fact that

What was the special and exclusive of the rich and great is now the privilege of all. The humblest servant girl may now possess a picture of herself such as the wealth of kings could not purchase 50 years ago.

In that context Stauffer, Trodd and Bernier make the case that Douglass saw photography as a tool to alter social reality:

Poets, prophets and reformers are all picture-makers–and this ability is the secret of their power and of their achievements. They see what ought to be by the reflection of what is, and endeavor to remove the contradiction.

Such reasoning (and more besides) led Douglass to the photographer’s studio over and over again, actively seeking out the camera as a tool that could help him create the reality of African-American humanity, presence, significance.

Photography allowed him to be seen.  In that determined, asserted presence, you have (it seems to me) an early herald of the circumstances in which Barack Obama could become president.  Alas, in the face of the racist and vicious forces with which Douglass had to contend, we can be similarly reminded that in our times the sight of a black man commanding our gaze drives too many among us into spasms of demented, terribly dangerous rage.

But put that aside for a second, and look at some fabulous images of an extraordinary — and extraordinary-looking — man.  (A few more examples.)

And if you feel the need for some open thread, well take that too.

*The runners-up are cool too:  In the research for this book, the authors found George Armstrong Custer, that avatar of puffed-up vanity, taking second place with 155 portraits.  Red Cloud came next at 128, followed by Whitman and Lincoln at 127 and 126, the poet and his captain connected again.  It seems likely, according to these writers, that when further work is done, Ulysses S. Grant may trump them all, but that doesn’t change the point of what Douglass set out to do.

Images:

1.  c. 1860s

2.  c. 1850, daguerreotype

3. before 1880, Brady-Handy collection.

Appomattox Day

April 9, 2015

On which that genteel butcher Bobby Lee surrendered the treasonous Army of Northern Virginia to General Ulysses S. Grant.

[Image: montage of Grant and Lee]

Grant’s terms have generally been regarded as generous, to the point that the military leaders of the rebellion were spared the threat of criminal trials for their actions in defiance of properly constituted Federal authority.  Looking forward, not back, is no new trope in American politics.

In any event, my view of the Civil War echoes Sherman’s:  “secession was treason, was war…”

Apologists for the Lost Cause trump up the usual counters — I’ve just been batting this around with a wistful mythologist on Twitter.  The battle wasn’t for slavery, but states’ rights; the men under arms weren’t traitors — they were just soldiers fighting other men’s battles, or for a misguided but sort-of reasonable goal; and so on.

It is vital, I believe, to push back on that nonsense.  That’s not how those at the time saw it, not when it got down to the nub.  Lee’s Army of Northern Virginia was a slaveholders’ army, and the documents with which the makers of the Confederacy declared their cause made the reason for secession clear.  States’ rights were the means to the real end:  the permanent power to hold other human beings as property:

For twenty-five years this agitation has been steadily increasing, until it has now secured to its aid the power of the common Government. Observing the forms of the Constitution, a sectional party has found within that Article establishing the Executive Department, the means of subverting the Constitution itself. A geographical line has been drawn across the Union, and all the States north of that line have united in the election of a man to the high office of President of the United States, whose opinions and purposes are hostile to slavery. He is to be entrusted with the administration of the common Government, because he has declared that that “Government cannot endure permanently half slave, half free,” and that the public mind must rest in the belief that slavery is in the course of ultimate extinction.

This sectional combination for the submersion of the Constitution, has been aided in some of the States by elevating to citizenship, persons who, by the supreme law of the land, are incapable of becoming citizens; and their votes have been used to inaugurate a new policy, hostile to the South, and destructive of its beliefs and safety.

That is: the only reason secession occurred is that the South finally lost an election.

It is in that context that April 9 is a great day.  Certainly, too much was left undone.  Too much remains undone.  But at Appomattox, the traitor Robert E. Lee’s surrender enshrined at least the possibility that Federal authority could remedy grievous wrong.  And that’s cause for remembrance of a great hope kindled, and of the too-long wait, still ongoing, for its fulfillment —  what Abraham Lincoln saw as the course forward from the moment of surrender:

With malice toward none, with charity for all, with firmness in the right as God gives us to see the right, let us strive on to finish the work we are in, to bind up the nation’s wounds, to care for him who shall have borne the battle and for his widow and his orphan, to do all which may achieve and cherish a just and lasting peace among ourselves and with all nations.

Happy Appomattox Day, everyone.

Image: Montage of these two photographs:

32. Lt. Gen. Ulysses S. Grant standing by a tree in front of a tent, Cold Harbor, Va., June 1864. 111-B-36.

145. Lee, Gen. Robert E.; full-length, standing, April 1865. Photographed by Mathew B. Brady. 111-B-1564.

The Uses of the Past: Science/Science Writing Talk

January 17, 2012

I’ve always found that the best way to tackle a complicated story – in science or anything else, for that matter – is to think historically.  But even if I’m right in seeing a historical approach as an essential tool for writers, that’s not obviously true, however well (or not) it may work for me.  Science news is or ought to be new; science itself, some argue, is devoted to the task of relentlessly replacing older, less complete, sometimes simply wrong results with present-tense, more comprehensive, and right (or right-er) findings.

Thinking about this, I put together a panel on the Uses of the Past that was held at last year’s World Conference of Science Journalists in Doha, Qatar.  The panelists – Deborah Blum, Jo Marchant, Reto Schneider and Holly Tucker – led a discussion that was lively and very supportive of the history-is-useful position (not to mention valuable in itself).  But the conversation was far from complete.

So we’re going to do it again, this time at Science Online 2012. (You can follow all the fun by tracking what will be in a few days a tsunami on Twitter, tagged as #scio12).  This is an “unconference,” which means that I and my co-moderator, Eric Michael Johnson, will each present what amounts to a prompt – really a goad – for the audience/participants to run away with.  As Eric and I have discussed this session, one thing has stood out:  where I’ve thought of the term “uses of the past” as a challenge to writers about science for the public, an opening into approaches that will make their work better, Eric has been thinking about the importance of historical thinking to the practice of science itself – what working scientists could gain from deeper engagement not just with the anecdotes of history, but with a historian’s habits of mind.  So just to get everyone’s juices flowing, Eric and I thought we’d try to exchange some views.  Think of this as a bloggy approach to that old form, the epistolary novel, in which we try to think about the ways in which engagement with the past may matter across fields right on the leading edge of the here and now.

So.  Here goes…

____________________________________________________________

Dear Eric,

I have to confess; I’ve never needed convincing about history; I’m a historian’s son, and all my writing, just about, has had a grounding in the search for where ideas and events come from.

But all the same, it’s simply a fact that the professional scientific literature from which so many stories for the public derive seems, on first glance, to be as present-tense as it is possible to be.  As I write this, I’m looking at the table of contents of <a href="http://www.sciencemag.org/content/335/6064.toc">my latest (January 6) digital issue of <em>Science</em></a>. In the “Reports” section – where current findings are deployed – there is nothing but the now and the near future under discussion.  Just to pull up a few pieces at whim:  we can learn of the fabrication of wires on the nano-scale that obey Ohm’s law (an accomplishment its makers claim will support advances in both classical and quantum computing to come).  We can read of a new measurement of the ratio of isotopes of tungsten (performed by some of my MIT colleagues in concert with researchers at the University of Colorado) that suggests (at least as a preliminary conclusion) that the terranes that make up the earth’s continents have remained resistant to destruction over most of the earth’s history. And then there is a report from researchers into that living genetics/evolution textbook, <em>C. elegans</em>, that adds yet one more telling detail within a broader understanding of the intertwined behavior of genetic and environmental processes.

All of these – and all the rest of what you can find in this issue of that journal, and so many others – tell you today’s news.  Each could form the subject of a perfectly fine popular story.  Yet none of them does – or necessarily would, as a popular story – engage the history that lies behind the results.

That is: you could tell a story of a small step taken towards the goal of building a useful quantum computer without diving into either the nineteenth century’s investigation into the properties of electrical phenomena or the twentieth century’s discovery of the critical role of scale on the nature of physical law.  You can talk about the stability of continents without recognizing the significance of that research in the context of the discovery of the intensely dynamic behavior of the earth’s surface.  You certainly may write about mutation rates and stress without diving into that old fracas, the nature-nurture argument that goes back to Darwin’s day and before.  This is just as true for the researcher as the writer, of course.  Either may choose to ignore the past without impairing their ability to perform the immediate task at hand:  the next measurement, the next story.

You could, that is, but, at least In My Humble Opinion, you shouldn’t.  From the point of view of this science writer, history of science isn’t a luxury or an easy source of ledes; rather, it is essential both for the making of a better (competent) science writer and for the production of science writing that communicates the fullest, most useful, and most persuasive account of our subject to the broad audiences we seek to engage.

In briefest form, I argue (and teach my students) that diving into the history of the science one covers trains the writer’s nose, her or his ability to discern when a result actually implies a story (two quite different things). It refines a crucial writer’s tool, the reporter’s bullshit detector. At the same time, explicitly embedding historical understanding in the finished text of even the most present-and-future-focused story is, I think, more or less invaluable if one’s goal is not simply to inform, but to enlist one’s readers in the gerunds of science:  doing it, thinking in the forms of scientific inquiry, gaining a sense of the emotional pleasures of the trade.  I’ll talk more about both of these claims when my turn comes around…but at this point, I think I should stop and let you get a word in edgewise.  Here’s a question for you:  while I can see the uses of the past for writers seeking to extract from science stories that compel a public audience – do working scientists need to care that much about their own archives?  What does someone pounding on <em>C. elegans</em> stress responses, say, really need to know about the antecedents of that work?

Best,

Tom

____________________________________________________________

Dear Tom,

The British novelist, and friend of Aldous Huxley, L.P. Hartley began his 1953 novel <em>The Go-Between</em> with a line that, I suspect, many working scientists can relate to, “The past is a foreign country: they do things differently there.” The process of science, much like the process of art, is to dredge through what has been achieved in the past in order to generate something altogether new. That is perhaps the only thing that the two fields of creative endeavor have in common; the past must be understood only so that you can be released from it. However, much like you, I’ve never needed convincing about history either. While I agree that the past can be a foreign country at times, I’ve always enjoyed traveling.

I came to history through my work in science, but I found that understanding the historical context for why scientists in the past came to the conclusions they did helped inform the questions I was asking. I’ve always believed that the scientific method was the best way of eliminating our own personal biases when seeking answers about the natural world, but that unexamined assumptions can still slip through the scientific filter. By examining how these flawed assumptions made it through, I hoped to help my own work. Perhaps the best way to explain what I mean is to briefly discuss how an early brush with history encouraged me into the research direction I ultimately pursued in graduate school. The book was <em><a href="http://www.amazon.com/Natures-Body-Londa-Schiebinger/dp/080708901X">Nature’s Body</a></em> by the Stanford historian of science Londa Schiebinger, which I found in a used bookstore during my senior year as an undergraduate in anthropology and biology. In one chapter of her book she discussed the early history of primate research and how the prevailing assumptions about gender influenced the hypotheses and, as a result, the conclusions about those species most similar to ourselves. One of the earliest descriptions of great apes in the West, after <a href="http://www.timeshighereducation.co.uk/story.asp?storycode=415874">Andrew Battell’s exaggerated stories about “ape monsters,”</a> was by the Dutch physician Nicolaes Tulp, probably the most widely recognized figure in the history of science that almost no one has ever heard of.

In 1632 Tulp commissioned the artist Rembrandt to paint his anatomy lesson, which ended up being one of the Dutch master’s most famous works (if anyone today recognizes Tulp’s name, it’s most likely from the title of this painting). Nearly a decade after he posed for this portrait, Tulp published his Observationes Medicae (Medical Observations), in which he described the anatomy of a female ape he’d received on a ship bound from Angola. He was immediately struck by the similarities with humans, and the drawing he published, identified as Homo sylvestris, demonstrated a striking example of cultural bias. Drawn to look the way Tulp assumed this female would have appeared while alive, the figure embodied his own culture’s gender stereotypes. The female sat with her hands in her lap, framing what appeared to be a pregnant belly, and her head glanced downwards in a distinctly demure pose.

By itself this depiction wouldn’t have been particularly revealing; it was just one individual allowing his own social biases to influence his science. What was remarkable, however, is the way Schiebinger showed how Tulp’s depiction would appear time and time again in the subsequent centuries when describing female primates, not just in appearance but also in behavior. More than two hundred years later, when Darwin described the differences between males and females in his theory of sexual selection, the same unmistakable gender bias influenced his thinking. I had never taken a women’s studies course in my life, but this insight was an enormous wake-up call for me. I realized there had been a common set of assumptions that endured for centuries — what the historian Arthur Lovejoy called “the spirit of the age” — and had gone unexamined until relatively recently, when a new generation of primatologists–such as Jane Goodall, Sarah Blaffer Hrdy, and Frans de Waal–began studying the female half of the equation that had been largely ignored as an important area of study. Knowing this history pushed me to ask different questions and focus on a topic that I discovered hadn’t been addressed before: why female bonobos had such high levels of cooperation despite the fact that they had a low coefficient of genetic relatedness (violating the central premise of <a href="http://scienceblogs.com/primatediaries/2010/05/punishing_cheaters.php">Hamilton’s theory of kin selection</a>). Different scientific topics have their own entrenched assumptions that otherwise critical researchers may not have considered; that is, until they see the broad patterns that a historical analysis can reveal.

Cheers,

Eric

____________________________________________________________

Dear Eric,

I love your story, partly because the original painting is so extraordinary and it’s good to have any excuse to revisit it.  But I value it more for your argument that engaging with the thought and thinking (not quite the same thing) of scientists past fosters insight into present problems.  That goes just as much for science writers – that is to say, those seeking to communicate to a broad public both knowledge derived from science and the approaches, the habits of thought that generate those results.

Rembrandt’s painting itself gives some hints along this line.  There’s a marvelous and strange discussion of the work in another novel written in English, W. G. Sebald’s <em><a href="http://www.amazon.com/Rings-Saturn-W-G-Sebald/dp/0811214133/ref=sr_1_1?s=books&ie=UTF8&qid=1326733737&sr=1-1">The Rings of Saturn</a></em>.  There, Sebald points to the fact that none of the anatomists are actually looking at the corpse under the knife. Tulp himself stares out into the middle distance, whilst other members of his guild peer instead at an anatomical atlas open at the foot of the table. As Sebald studies one of the often-discussed details of the painting, he argues that what appears to be simply an error in the depiction of the <a href="http://www.ncbi.nlm.nih.gov/pubmed/17225789">dissection of the left</a> hand reveals an artist seeking to see past the formal abstraction of the lesson, drawing attention instead to the actual body on the table, the physical reality of a single dead man.

Not wishing to push too hard on that (unproven, unprovable) interpretation, Sebald still points out something that rewards the attention of science writers.  Rembrandt depicts both facts — the body, the tendons of the exposed hand – and ideas, at a crucial moment of change in the way natural philosophers sought verifiable knowledge.

We see, amidst the reverence for the book, the authority of prior learning, an event actually occurring on the canvas:  the effort to extract understanding from the direct testimony of nature. Amidst all else that can be read there, Rembrandt’s painting reminds the viewer of the time – not really all that long ago – when a fundamental idea was being framed with its first answer:  yes, it is possible to understand biological forms as machines, and to investigate their workings directly.

So, to take the long road home to the question of why bother with history when covering the news of today and tomorrow, here are two thoughts (of the three with which I will hope to provoke our fellow unconferees on Thursday).  First: as you argue for scientists, understanding of the past can lead writers to stories they may not have known were there.

To give an example, I’ll have to leave anatomy behind (about whose history I sadly know very little). I recently had an occasion to look back at <a href="http://books.google.com/books?id=KniUvcxFtOwC&pg=PA281&lpg=PA281&dq=michelson+sixth+decimal+place+ryerson+physical+laboratory&source=bl&ots=0oDZa8vpy3&sig=6_BQaDfvsUE-G_nLWBmNF8l4boM&hl=en&sa=X&ei=91oUT_3mAeXq0gHvuI22Aw&ved=0CE8Q6AEwBg#v=onepage&q=michelson%20sixth%20decimal%20place%20ryerson%20physical%20laboratory&f=false">A. A. Michelson’s infamous remark</a> from 1894, when he asserted that physics was done except for that which could be discovered in the sixth decimal place of measurements.

There is a lot wrong in that claim, but if you look more closely at what he said, you can find something less obvious in Michelson’s claim – and that can lead to insight into what goes into the making of all kinds of very modern physics, from (possibly true) observations of faster than light neutrinos to the ways in which cosmologists are extracting knowledge from high-precision measurements of the cosmic microwave background (and much else besides, of course).

So there’s a story-engine chugging away inside history, which is there to be harnessed by any writer – facts, material, from which to craft story.  There’s also a story-telling tool, a method that derives directly from historical understanding.  A core task for science writing is the transformation of technically complicated material into a narrative available to broad audiences – which must be done without doing violence to the underlying ideas.  If the writer remembers that every modern problem has a long past, then she or he can prospect through that history for the points in that sequence where the problems and results are intelligible to any audience.  For just one last, very quick example:  general relativity is a hard concept to explain, but frame the issue that it helped to resolve in the context of what Newton’s (seemingly) simpler account of gravity couldn’t handle – that spooky action at a distance that permits the gravitational attraction of the sun to shape the earth’s orbit – and you’re in with a chance.

Best,

Tom

____________________________________________________________

Dear Tom,

I think you touched on something very important with regard to the idea that science writing is a transformation that takes the technical language of science (primarily mathematics and statistics–that is, if it’s done correctly) and interprets it into the communication of everyday experience. Science writing is a process of translation. The history of science as a discipline is precisely the same thing, though historians typically engage in a different level of linguistic analysis by looking at language meaning and the way that science provides insight into the process of historical change. But it seems that there is no better way to think about how the history of science can be useful to science journalists than to consider what we do as essentially a process of translation. Art is involved in any translation work and there is never a one-to-one correspondence between the original and what it eventually becomes. We must be true to our source material but also evoke the same overall meaning. To put this more simply: why are the findings being reported important to scientists in a given field and how can that same importance be conveyed to a readership with a very different set of experiences? It seems to me that there are two primary ways of doing this: engaging with the history of <em>why</em> this question matters or tapping into contemporary <em>attitudes</em> that evoke connections with the findings reported (where the latter approach <a href="http://scienceblogs.com/primatediaries/2009/10/grand_evolutionary_dramas_abou.php">goes wrong</a> happens to be one of my <a href="http://blogs.scientificamerican.com/primate-diaries/2011/09/02/male-chauvinist-chimps/">favorite</a> topics of critique, one that is <a href="http://www.huffingtonpost.com/eric-michael-johnson/intelligent-design-creati_b_636200.html">unfortunately</a> an extremely rich resource to draw from).

However, there is one other reason why the history of science is important for science journalists that we haven’t quite touched on yet. A journalist who knows their history is better protected from false claims and the distraction of denialism. The scientific press release is a unique cultural invention and all too often seeks to manipulate journalists into framing a given story so as to exaggerate that study’s actual impact. The historically minded journalist is less likely to get bamboozled. In a similar way, the <em>he said-she said</em> model of reporting is a persistent and irritating rash for almost every professional journalist I’ve interacted with. But the temptation to scratch is always present, even though the false equivalency reported is rarely satisfying over the long term. The history of science can be the journalistic topical ointment. Those who know the background of anti-vaccine paranoia, or who recognize the wedge strategy of creationist rhetoric, can satisfy their need to report on a story that captures the public’s attention while also providing useful information to place that issue within its proper context. History matters.

Your friend,

Eric


Eric Michael Johnson
Department of History
University of British Columbia
http://www.history.ubc.ca/people/eric-michael-johnson
http://blogs.scientificamerican.com/primate-diaries/

Images:  Johannes Vermeer, Lady Writing a Letter, betw. 1665 and 1666.

Hans Holbein the Younger, The Ambassadors, 1533.

Nicolaes Tulp, “Homo sylvestris,” Observationes Medicae, Book III, 56th Observation, 1641

Rembrandt van Rijn, The Anatomy Lesson of Dr. Nicolaes Tulp, 1632

A bit more blogrolling, and Newton and the Counterfeiter’s latest notice

July 14, 2009

More self-aggrandizement, and a pointer.  PhiloBiblos, aka Jeremy Dibbell, has just posted a very nice brief review of Newton and the Counterfeiter (Amazon | Powells | Barnes and Noble | Indiebound) at his book-loving blog, now to be found on the blogroll at left.

Key quote:

It’s the kind of story that would make a good novel, but which written by the right person works even better as history.

Touring through Dibbell’s other posts, I found many delights, including this one which pointed me here, which then led to Thomas Jefferson’s reading list…which forced me to add  J. L. Bell’s Boston 1775 to my blogroll.  Bell writes on the roots of the American Revolution in my current meatspace domicile, aka the Hub of the Universe, Athens of America, Somerville’s neighbor….Boston.

Beware of PhiloBiblos, by the way.  Too many juicy links…which provides me an excuse for a second hit of xkcd in a single day:

You have been warned.

Getting Ready for 200/150: “How Many Removes From Charles?” Edition

December 16, 2008

As everyone with a pulse and an interest in science knows, 2009 is the big Darwin year — the 200th anniversary of his birth (February 12) and the 150th of the publication of The Origin.  I will in a week or so have some news about what Inverse Square — or a derivative thereof — is doing to join the chorus on that one; I think I’ve got something shaping up that the community will enjoy.

In the meantime, and as I get stuck into my prep for that project, just a quick thought as I peered at the Darwin/Wedgwood family tree Janet Browne helpfully included at the front of the Voyaging volume of her Darwin magnum opus.  There I found Darwin’s latest-surviving child, Leonard.  Leonard Darwin was born in 1850, before the Crimean War, the Sepoy Mutiny — the Indian Rebellion of 1857 — and the American Civil War; and he saw the end of World War II, India’s independence and the effective end of the British Empire, all before his death in 1948.*  And, not to overlook the most important factoid, young Leonard would have been a curious eight-year-old just as his father was in the midst of his most intense labors distilling the work of decades into the book that became The Origin of Species.

That skein of history would be remarkable enough just for one man's memory, but what struck me was the thought that my Uncle David, born and raised in England, with an army background (and subsequent career of his own) that could have led him to Major Darwin (Royal Engineers), might indeed have exchanged a conversational commonplace or two with the son of the man whose birth and work we celebrate soon.

All of which is to point out the obvious — and perhaps one tangential thought not quite so banal. The distance between anyone reading this and Charles Darwin is not that great.  It is entirely imaginable to have had a conversation with someone you know or knew who could have heard the stories of life at Down House from someone who watched and listened as Charles Darwin assembled his argument.  The middle of Queen Victoria's reign, and the very center of a revolution in ideas, seem very far away when we toss around anniversary numbers like a bicentennial, or one hundred and fifty years since this or that.  They are not, at least by the measure of human memory.  She danced with a man who danced with a woman who danced with the Prince of Wales; we are that close to Charles and his pigeons and all the rest.

Nothing new there — just a reminder of the numbers.  But the thought that crossed my mind as I wondered if my uncle did in fact ever meet Leonard (as above: I had not known to ask, of course, until the chance-met glance at the bottom of the family tree) was that Richard Dawkins may have missed the point of his own reflection that he too would have been a believer before Darwin.

If you follow my sense of the slenderness of the gap that separates us in the passage of generations, of the transfer of ideas and culture that pass from grandparent to grandchild at so near a remove from Charles Darwin in his study in 1859, then the broken chain of belief that separates Dawkins from Victoria's (or Emma Darwin's) Anglican God is very short indeed.

And that thought made me wonder whether the heat and urgency I read in Dawkins' atheism is a little misplaced.  Without wandering too far into this thicket, it does seem to me worth remembering that it has been a very short time in the history of human society, and a still shorter time if the person-to-person touch of memory matters, since Darwin's thought struck its blow to conventional faith.

It takes some time for big ideas to sink in.  (For a biblical example, as long as we are on the subject, God through Moses affirmed the equality of women, at least as far as inheritance and rights of property go, in the Book of Numbers, which dates back some 3,400 years.  That thought took a while to penetrate, did it not?)

There is no doubt in my mind that Darwin’s rigorous materialism takes some getting used to; that part of the point of 2009 is to confront not just Darwin’s thinking, but the success of the research program that his work (and that of many others, of course) set in motion.  I’m confident, that is, not angry — and I remember that we have not been inside this world view for any length of time at all.  One hundred and fifty years?  The lives my uncle’s life has touched — mine and those before me — stretch back before then.  Easily.

*Here, from the Wikipedia entry on Leonard Darwin linked above, is John Maynard Keynes's take on Charles's son, who proves to have just a hint of the wasp about him in his turn:

Keynes explained the decision to publish the niece’s “very personal account”: “Leonard Darwin’s life covered so vast an epoch of change in men’s ideas, his own attitudes towards the problems of his age were so characteristic of the best and noblest intelligences of his time, and he grew up in the environment of a family of so immortal a renown …” (p. 439) Darwin expressed his feelings about Keynes in a letter to Fisher (Correspondence p. 141), “I neither like him nor trust him … But he’s very clever …”

Image: Auguste Renoir:  “La danse à la campagne,” 1883.

Quote for the Day: Jacob Burckhardt hearts him some science writing dept.

November 25, 2008

Jacob Burckhardt is hardly a household name anymore, outside certain rather specialized houses, but it would be not too great an exaggeration to say that he "discovered" the Italian Renaissance, establishing the notion of a distinct period, a time and place in which fundamental changes took place that rang out the end of the ideas and culture of the middle ages, laying the foundations for habits of mind and the concrete history of that time we think of as modern.

Perhaps most significantly, he was an early historian, perhaps the seminal one, who focused on the history of art in particular and "culture" more generally as essential approaches to "history," full stop.  That attitude led him to think about the history of science in the Renaissance as something other than simply a chain of discoveries that form individual sequences within particular disciplines…which in turn led him to just about the earliest praise of science writing I've been able to find.

If it is a little back-handed, Burckhardt's compliment still captures what I think of as a critical truth:  science is not self-contained;  it is an expression of culture, and its survival as a living human enterprise depends on culture at large remaining aware of its claim on the public's understanding and emotion alike.  In his landmark work, The Civilization of the Renaissance in Italy, Burckhardt writes:

Even the simple dilettante of a science — if in the present case we should assign to Aeneas Sylvius so low a rank — can diffuse just that sort of general interest in the subject which prepares for new pioneers the indispensable groundwork of a favourable predisposition in the public mind.  True discoverers in any science know well what they owe to such mediation.

(Part Four.  The translation above comes from the Penguin Classics edition of 1990.)

I’d argue that science writing and science writers have more ambition than simply acting as cultural diffusers.  The ones I admire most think of themselves as writers whose subject is science, and not simply science writers; that is, they (and I, in the privacy of my own thoughts) are trying to use language to the limits of its capacity for expression, to move readers and not simply to inform them.  That said, Burckhardt is right:  a civilized culture, a civilized time manages to communicate some version of its most sophisticated thinking to every interested citizen.

Image:  Studies of Embryos by Leonardo da Vinci (Pen over red chalk 1510-1513).

Veterans Day — née Armistice Day — poem and remembrance

November 11, 2008

Update: Check out Lovable Liberal’s remembrance too.

Michael D. over at Balloon Juice has dredged up the inevitable In Flanders Fields as a token of memory on this sad day.

I have to confess I hate John McCrae’s poem because of the third verse, with its appropriation of the dead to keep the torch burning that consumed so many young men in a truly pointless and brutally misled war.  It’s home-front poetry, for all that it was written by a man who fought and died in the conflict — by which I mean that it plays on the familiar tropes of glory and honor deemed suitable for the consumption of those gentlemen and ladies then a-bed, safely removed from the horror and squalor of the trenches.*

In the comment thread, one reader offers up Wilfred Owen’s equally famous Dulce et Decorum Est as an antidote — and it certainly does offer the honest soldier’s counter-argument:

My friend, you would not tell with such high zest
To children ardent for some desperate glory,
The old Lie: Dulce et decorum est
Pro patria mori.

For my part, two thoughts:  first, to McCrae himself.  The poem was born of direct experience as fully immersed in the bloody, in-the-moment pointlessness of the war as anything Owen wrote.  Read the story of how the poem came to be here.  The third verse that so offends me?…I have no doubt that it was truly felt, the more so because the poem was written in the spring of 1915 — the first full campaign season in the trenches — and before the grinding fact of the four-year meatgrinder could fully crush its schoolboy bravado.  In any event, he was there, he saw what he saw and felt what he felt, and he gets to express that emotion any way he damn pleases.

It’s the use of the poem by those who have not earned that authority in the same way that gets me, especially now, in the wake of five years of war when my friends on the other side of keyboard wars have so often called for sacrifices as long as others make them.  Maybe I’m the one fighting old battles here, in the new world after November 4, 2008, but I don’t think so.

(Note that I haven’t even begun to write about the collective criminal folly that permitted the trenches to consume so many men for so long.  For a lucid professional’s take on that question, the best place to start is the classic:  B.H. Liddell Hart’s seminal work Strategy.  My own take on it can be found, interspersed with other stuff, in chapters 3-12 of this book.**)

Second thought:  here is one more poem just to make sure that I drive home the point about the cost of stupid decisions in war.

This is another by Wilfred Owen, much less well known, perhaps less well made than Dulce…. but in its own way yet more wrenching:

S. I. W.

“I will to the King,
And offer him consolation in his trouble,
For that man there has set his teeth to die,
And being one that hates obedience,
Discipline, and orderliness of life,
I cannot mourn him.”
W. B. Yeats.

Patting goodbye, doubtless they told the lad
He’d always show the Hun a brave man’s face;
Father would sooner him dead than in disgrace, —
Was proud to see him going, aye, and glad.
Perhaps his Mother whimpered how she’d fret
Until he got a nice, safe wound to nurse.
Sisters would wish girls too could shoot, charge, curse, . . .
Brothers — would send his favourite cigarette,
Each week, month after month, they wrote the same,
Thinking him sheltered in some Y.M. Hut,
Where once an hour a bullet missed its aim
And misses teased the hunger of his brain.
His eyes grew old with wincing, and his hand
Reckless with ague. Courage leaked, as sand
From the best sandbags after years of rain.
But never leave, wound, fever, trench-foot, shock,
Untrapped the wretch. And death seemed still withheld
For torture of lying machinally shelled,
At the pleasure of this world’s Powers who’d run amok.

He’d seen men shoot their hands, on night patrol,
Their people never knew. Yet they were vile.
“Death sooner than dishonour, that’s the style!”
So Father said.

One dawn, our wire patrol
Carried him. This time, Death had not missed.
We could do nothing, but wipe his bleeding cough.
Could it be accident? — Rifles go off . . .
Not sniped? No. (Later they found the English ball.)

It was the reasoned crisis of his soul.
Against the fires that would not burn him whole
But kept him for death’s perjury and scoff
And life’s half-promising, and both their riling.

With him they buried the muzzle his teeth had kissed,
And truthfully wrote the Mother “Tim died smiling.”

*There is no shortage of great prose accounts of the disasters of the Western Front.  The first I read were by two of the War Poets — Robert Graves, in Goodbye To All That, and Siegfried Sassoon in his trilogy collected under the title George Sherston’s Memoirs, now out of print.  The central work of the trilogy, Memoirs of an Infantry Officer, can still be found.

**Here’s a passage from my attempt to capture the relentless pointlessness of the so-called Great War at the level of the battlefield.  The incident described took place 90 years ago to the day.

There was one incident that captured the essence of war on the western front, the distillation of its arbitrary violence.  At two minutes to eleven in the vicinity of Mons a Canadian private named George Price was hit by a sniper’s bullet.  He died instantly.  The man who killed him remains unknown.  That man made a choice.  He was a marksman, a skilled soldier.  He had just moments remaining in which it was legal for him to kill.  There was no need to fire, no purpose, and some risk at least to himself and any comrades near him.  If he waited until eleven, and then put his gun down, the only consequence would be that a young stranger would go home.   Instead, the shot rang out.  Two minutes ticked past.  The war ended.  George Price lay dead.

Image:  Red Poppies at the Menin Gate, Ypres, Belgium.  Photograph taken on March 11, 2006.

Against Ta-Nehisi Coates…

October 24, 2008

…or rather, against his defense of white racism. The post is a meditation on why women are, in his perception, so harsh on Sarah Palin; his epiphany came when he tried to imagine a black equivalent to the Palin candidacy — and he couldn’t:

A brother in that position not only would not be considered for 2012, he would be impeached when he returned to governorship for embarrassing the state, and then have his ghetto card revoked for embarrassing the local Negrocracy.

For this, the writer is grateful, which makes perfect sense.  It’s better by far to have a strong sense of standards than some unthinking identity commitment.

That’s the implication of the Yiddish phrase, “A shande fur de goyim” — a shame before the non-Jews. Nothing could be worse than to be such a shande; it’s why Jews, or at least those I hang with, wince with every Jack Abramoff or, to channel a different era, why Abbie Hoffman’s use of the phrase to describe Judge Julius Hoffman during the Chicago Eight trial was such a potent barb.

More deeply, we have a lot of history that tells us it is better on every level, from the moral to the practical, to be not merely no worse than the majority societies in which most Jews live, but to be closer to blamelessness, to bring no scandal to our names and homes. So, thus far, I’m with Ta-Nehisi.  But then he goes on to name whom he would credit for the existence of such internal correctives:

White racists have taken a lot of heat on this blog. But the truth of the matter is that they may be the single biggest promoters of black excellence in this country’s history. There is a reason Tony Dungy was the first winning coach in Tampa Bay’s history–he had to be.

Again, from where I sit looking over the ethnic/race/identity sorrows of history, I know that there is a partial truth here.   I’m enough older than Ta-Nehisi to have Jackie Robinson’s story as the archetype of the pressure on the standard-bearer.  There is no doubt that Dungy did a very hard thing — much harder than most watching him grasped, I think — but Robinson was literally in a league of his own on the need to combine superlative performance with extraordinary internal strength and self-control.  (For the record, I’m not so old that I ever saw Robinson play; but his was the story we read in grade school.)

The same dynamic played out time and again in public and in private Jewish lives — including the importance of public heroes finding some way to express both a particular and a universal greatness; think of Sandy Koufax refusing to pitch on Yom Kippur and you have a hint of the balancing act involved.

But where I think Ta-Nehisi goes wrong is in giving racists themselves credit for the excellence of a Dungy or anyone else.  I don’t doubt that there is a forged-in-fire power to the notion of proving oneself despite the efforts of those with evil intention to thwart you. But Ta-Nehisi goes astray (IMHO) when he writes this:

… A little bit of bigotry would have prevented all of this [the Palin debacle]. So to all the Ferraros out there I have one request–more racism please. It improves our stock. It makes black people, a better people.

No, it does not.  I don’t think you could or should credit racism for what Dungy can claim as his own achievement, nor that of Einstein, perhaps — or more on point for a science-and-public life blog — the life Percy Julian lived.

Percy Julian is not as well known as he should be.  Get introduced to him here, or watch the excellent two-hour biography that NOVA broadcast a year or so ago.  Ruben Santiago-Hudson, who plays Julian, is worth the price of admission on his own, and, to brag a little, my wife, Katha Seidman, won her second Emmy for her design of the show.

The short form:  Julian was one of the pioneering synthetic chemists of the interwar period and the years just after WW II.  If you have ever used a cortisone cream, other corticosteroid medicines, or birth control pills, you owe Dr. Julian a debt of thanks.

He had a great career; he was honored (belatedly); he got rich — all good.  He also was bedeviled by racist constraints from childhood through to the time he was getting his own company off the ground, and in particular institutional and individual bigotry kept him from the first career he intended to pursue, that of an academic chemist, pursuing whatever research seemed to him most promising.

That he made an enormous contribution to his field as an industrial chemist is a tribute to just the kind of determined excellence Ta-Nehisi celebrates in Dungy.  But the price paid, the cost in opportunities not just lost, but actively barred has to be accounted for too.

I’ll stipulate that Ta-Nehisi knows this very well indeed. For my part, I’m lucky that my ethnic identifier, in this country at least, is farther removed than his from our own versions of the ghetto and Jim Crow.  It was my great-grandfather who made it out of the old country, and his stories have not survived the passing of the last of his own children.

I am not completely tone-deaf to irony and sarcasm either, nor the echoes of that supremely useful phrase “the soft bigotry of low expectations” as applied both to Governor Palin and such sometime-symbolic figures as the athlete formerly known as Pacman and Mike Tyson.

But I still think that Ta-Nehisi is undercounting the persistent tax that bigotry imposes on its targets.  You could call it the Julian tax, the daily toll exacted in the pursuit of excellence constrained within limits not of your own choosing.

I’ll stop here — but for a truly beautiful meditation that touches on this theme (and much else) look to Bill T. Jones’ memoir The Last Night on Earth.

Image:  Ben Shahn “Sign on a Restaurant, Lancaster Ohio” 1938.  Library of Congress [http://www.loc.gov/rr/print/list/085_disc.html].  Source:  Wikimedia Commons.