Archive for December 2008

Some Nice Physics Blogging: Sean Carroll/Simple Questions dept.

December 31, 2008

Good New Year’s Eve fare (in that it bears, at least sort of, on why we should expect next year to be more or less the same as this one, only slightly less usefully energetic…) from Sean Carroll over at Cosmic Variance’s still more or less new and quite spiffy home at Discover.com.

Sean is writing about Boltzmann’s argument that the apparent order of the universe as observed — with a low-entropy past and a unidirectional arrow of time pointing toward future states of higher entropy — cannot be simply a statistical fluctuation within a larger construct spending most of its time in thermal equilibrium.

Read the post — Sean goes into the history of the argument, and introduces some of what makes this a profound observation cum insight.

What made me smile on reading it, though, was not simply the content of this particular deep bit of thinking that comes from a delightfully (and deceptively) simple pair of questions — how is it that the universe displays observable order everywhere we look; why does time flow in just one direction — but that there are a lot of such questions.  From one point of view, that is, physics is a very simple field.*

To give one more example, one that entranced me back in the late 80s when I was just beginning a decade-and-a-half-long dance with Albert Einstein, consider Olbers’ paradox.  In essence, the question asked here is “why is the sky dark at night?”

Shakespeare nailed that one, of course, in As You Like It, act III scene 2, when Corin the shepherd informs Touchstone the clown that he knows “that a great cause of the night is lack of the Sun.”

But the problem gets a little more complex when you make the assumption that we inhabit an infinite, static universe.  In such a place, every line of sight from Earth would at some distance end on a star, whose light, given the infinitude of time, would ultimately reach us.  If so, the entire night sky should glow with starlight, and the fact that it does not suggests some problem in the conception.
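(For those who want the arithmetic, here is the standard back-of-the-envelope version of the trouble — a textbook gloss, not anything Olbers himself wrote in this form.  Take a static, infinite universe sprinkled uniformly with stars of number density $n$ and luminosity $L$.  A thin shell of radius $r$ and thickness $dr$ centered on Earth contains $4\pi r^2 n\,dr$ stars, each delivering a flux $L/4\pi r^2$, so the shell as a whole contributes

$$dF = 4\pi r^2 n\,dr \cdot \frac{L}{4\pi r^2} = nL\,dr.$$

The factors of $r^2$ cancel:  every shell, near or far, adds the same amount of light, and the total over infinitely many shells, $F = \int_0^\infty nL\,dr$, diverges.  Intervening dust offers no escape, either; given infinite time it would heat up until it glowed as brightly as the starlight it absorbs.)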

The paradox was proposed by a German astronomer, Heinrich Wilhelm Olbers, and was published as such in 1826, though it was anticipated by Kepler, among others.  The solution to the apparent paradox encompassed this observation by Edgar Allan Poe — yes, the Raven guy — in his prose poem Eureka, published in 1848:

Were the succession of stars endless, then the background of the sky would present us an uniform luminosity, like that displayed by the Galaxy — since there could be absolutely no point, in all that background, at which would not exist a star. The only mode, therefore, in which, under such a state of affairs, we could comprehend the voids which our telescopes find in innumerable directions, would be by supposing the distance of the invisible background so immense that no ray from it has yet been able to reach us at all.

But while this served as an explanation for a dark sky at a given time, it does not preclude a star-lit sky in the future.  That loophole could only be closed with 20th century physics, especially the cosmological extensions of General Relativity that suggest a finite age for the universe (at least our subassembly within a presumptive multiverse) and the expansion predicted within the Big Bang concept, and observed by Messrs. Hubble, Humason and a cast of thousands since (and not just by the Hubble wars vets either).

Why is the sky dark at night?

The answer to the question in detail is enormously complicated technically, and draws on ideas and technology that require years of specialized training to master.  But both the question itself and its qualitative answer can be understood by a child of eight.  (That is a factual claim, not a rhetorical flourish.  My son is eight, and I tried this on him.)  This is what I call fun.

Image:  Vincent Van Gogh, Starry Night Over the Rhone, 1888.

If Schroedinger had a parrot…

December 30, 2008

Would quantum mechanics have been irrevocably altered?…

(Thanks to Eric Roston for the heads up.)


Bloggers v. Journalists round three: the agony of victory.

December 30, 2008

This is becoming the gift that keeps on giving (think cod, consumed after perhaps just a little too long on the slab…), but whilst taking a break from the transition from an almost-collapsing old Dobbin of a PowerBook to a gleaming, steaming new MacBook Pro, I chanced to read Bora’s latest on the eternal war between the digerati and the formerly fittest ink-stained denizens of the Jurassic.

The inciting incident was an unfortunate piece by an op-ed writer from the Newark paper, published in the opinion section of The Wall St. Journal.  Bora is right; it’s about as uninformed and fact-less a piece of bloviation as I’ve seen in a long time.

But I have to say that Bora’s cry of rage seems to me a bit off.  You’ve won, dude, and the unfortunate Mr. Mulshine’s piece should properly be read as an acknowledgement of that fact.  And in focusing so much time and fury on this one pathetic plaint I think Bora has in fact missed two key issues.

The first is that Mulshine himself, and his venue as well, are not journalistic enterprises.  He’s an editorial writer, a species of scrivener usually kept physically separated from the newsroom lest their ponderousness induce such sagging in the floor as to tilt the whole newsgathering operation.  He may have been a reporter once, though from his self-description it seems likely he was a bad one.

Even more, Bora, consider the source!  We’re talking The Wall Street Journal here.  The Op-Ed pages!  Everyone in journalism knows what that means.  This is the fiction section of the paper, the place where the prematurely aged (you got that one right) gather to spin fables that comfort in the face of an obdurately resistant reality.  The Journal’s opinion pages bear the same relationship to journalism as the Discovery Institute does to evolutionary biology.  Getting shocked at stupidity published there has the same utility as getting furious at this specimen.

Anyway, who cares what a minor figure in a not exactly on-point segment of the business (remember — editorialists are not reporters) thinks about this stuff?  It is clear that the future of mass communication in general, and news gathering and dissemination in particular, is going to be digitally mediated and to derive from a range of sources, from full-time pros to various sorts of part-timers/users/amateurs/citizens, whatever.  It’s done.  A cri de coeur like that of poor Mr. Mulshine is keening at the loss, not an actual argument to be addressed.

Where I do think Bora gets it a bit off, or perhaps just does not express himself precisely, is in his disdain for professional knowledge and experience in journalism as a craft; he seems to think (a) that beat reporters are not, or do not become, expert; (b) that reporters see themselves primarily as gatekeepers; and (c) that editors are pretty uniformly the enemy of discourse.

All of these can be true, and I’ve seen instances in my own career.  But, without going into chapter and verse, it’s easy to denigrate the knowledge and craft of folks doing work outside one’s own field.  Reporters and editors do provide certain critical functions that are already being translated into new models of so-called citizen journalism.

That is:  two things that organized media (think John Gotti?…) do, or ought to do, are aggregation and q.c.; the third function, and the one that Mulshine, ineptly, is trying to highlight, is that of devoting full-time resources with presumptively undivided attention to critical subjects, regions, or stories — that is a big issue for the blogosphere, but there are already plenty of models to show how this can be done.  Just ask Josh Marshall.

Those first two functions are editorial, rather than reportorial. The construction of a model of editing that enables rather than constrains citizen journalism is one of the actively pursued challenges taken on by a lot of those ink-stained wretches who take the new media seriously.

One of the best I know is my old college roommate, Michael Skoler, trained up as a radio reporter on the science desk of NPR, now head of the Center for Innovation in Journalism at American Public Radio.  Michael developed Minnesota Public Radio’s model of public insight journalism, which forms a collaboration between citizen journalists — thousands of Minnesotans — and MPR’s professional staff.

Another old friend, Ellen Hume, trained up at the Wall St. Journal — on the real side of the paper, not the opinion sandbox — worked as a true establishment MSM type as a Washington correspondent and TV gabber, and now runs MIT’s Center for Future Civic Media, which pursues more technology-centered research on the creation and use of new tools to develop and disseminate information, for both traditional news-gathering purposes and for other civic/political goals.

The point:  both Michael and Ellen, and many others, come at this with a very deep background in journalism; they’ve spent decades each trying to find, construct and share stories that in some cases cost them dearly to get.  This knowledge is useful; easy disdain for that hard-won experience makes for good blog posts.  It does not correspond to the reality, which is that the good journalists out there see in the new tools and new sources of information (that’s all of us here in the blogosphere) enormous advantages they could have used back in the day.

All of this is prelude to the argument I want to take some time to craft, which is to push back — not all the way, but partly — on the notion that the blogosphere in and of itself is sufficient to take on the role traditional journalism has (at least in myth) played in the past.  The reason efforts like those undertaken in Minnesota and across the way from my office matter is that, in a finite day, the ubiquitous and self-correcting nature of what might be called the informal journalism of the internet exists synoptically — but people don’t.  They — I, we — have finite time to perform the editorial work of chasing down contending versions of reality until some resolution sets in.  We have only so much time to put together the range of stories we might find interesting or important in each day.

Someone will take care of all that, whether it be some part of the civic journalism movement, or mutating mass media.  If we don’t create and use the tools that make the totality of our efforts accessible, then it seems to me likely that people like Rupert Murdoch et al. — who aren’t dumb, no matter what other qualities may attach themselves to them — will create the filters, packaging, production values and aggregation work that will capture much more of a share of audience than they should.  More on that soon, though I think I am due (and you too) a little break from this topic.

Image:  Allosaurus on lunch break.

Science Bloggers vs. Science Writers Round 2: It’s Just A Theory dept.

December 28, 2008

This may be a true blogospheric case of a day late and a dollar short, but I’d like to pursue a few more threads drawn from the kerfuffle Bora set off with this post.  I want to get into the meat of what Bora wrote — especially in two areas — but before I get there I wanted to take a swipe at what I see as a dangerously mistaken notion put forward by one of Bora’s sciblings in the initial wave of responses to his post.

This is what Ed Yong had to say in an otherwise smart piece that offered some good advice to scientists confronting the media:

…the majority of journalists are not seekers of the truth; they serve at the all-important altar of “The Story” and the ultimate goal of The Story is to keep the reader/listener/viewer entranced with it from opening word to final syllable. It’s entertainment…

There is a critical error in this passage, and it is one that I have seen repeated again and again by scientists (and others, to be sure) whom the complexity of their own work blinds to the technical hurdles faced in other fields.  (That this happens within science before it even makes its way out to confound science/rest-of-the-world interactions is the point made by Greg Laden in the post on which I blogged below.)

The error lies in the claim that the goal of communicating in story form is “entertainment.”  This has essentially the same effect for a writer as saying that evolution is “only a theory” has for anyone who knows anything about the science of life.

Entertainment occurs in the presence of a well-made story, certainly, but that pleasure derives from success at the primary goal of story:  engagement.  Ed has it right in the sentence before:  the function of story form is to enable the author to hold a reader’s attention to the end of what he or she is trying to say — and to do so in a way that will enable that audience to understand and remember whatever it was the writer was trying to communicate.

Not to go all evolutionary psychology on y’all (though that is a tale-telling discipline as ever was), but story structure is something that human beings seem to rely on to frame meaning and to construct memory.  I’m not versed enough in the neuroscience to pull up chapter and verse here (but see, e.g., some of Jonah Lehrer’s writing for a variety of approaches to connecting brain function and human culture), but it is clear anthropologically that stories are the ways human beings have organized their knowledge for a very long time.

None of this is remotely new, nor, I suspect, any surprise to scientists reading this; science is, pace Ed (and perhaps Bora?), a culture deeply steeped in storytelling, from the informal level of conjecture in the lab or seminar up to and including (some but not all of) its most formal communication.  A couple of examples:

Albert Einstein’s first relativity paper, “On the Electrodynamics of Moving Bodies,” begins with a little story: consider, Albert writes, this little mystery.  According to what we now say we know, if a magnet moves through a conductor, an electric field is formed that produces a current in the wire.  If, on the other hand, the conductor moves while the magnet remains at rest, no field is formed, “while in the conductor an electromotive force will arise, to which in itself there does not correspond any energy, but which … gives rise to electrical currents.” (Collected Papers of Albert Einstein, vol. 2, English translation, p. 140.)

This little anecdote is a story in itself, of the “Let me tell you something strange” variety, almost a tall-tale.  It is also what the screenwriting types call the inciting incident — a mystery or a problem to be solved through a series of narrative incidents, the sequence of mathematical derivations that Einstein pursues to reach his extraordinary narrative (and physical) conclusions.

Charles Darwin explicitly framed his great work, The Origin of Species, as an argument and nothing like a novel, but it is an essay permeated with stories, from the descent of fancy pigeons from the rock dove, to the narrative of sedimentation that underpins the assertion that the geological record is imperfect, to the hypothetical narrative here, just a few pages before the fabled “tangled bank” scene:

When we no longer look at an organic being as a savage looks at a ship, as at something wholly beyond his comprehension; when we regard every production of nature as one which has had a history; when we contemplate every complex structure and instinct as the summing up of many contrivances, each useful to the possessor, nearly in the same way as when we look at any great mechanical invention as the summing up of the labour, the experience, the reason, and even the blunders of numerous workmen; when we thus view each organic being, how far more interesting, I speak from experience, will the study of natural history become!

Last (I promise):  I’ll be publishing this June my account of Isaac Newton’s work at the Royal Mint, chasing counterfeiters and helping to create the modern financial world (thanks, Isaac).  Along the way, I read Principia, and while no one has ever accused Newton of a ripping prose style, I found that when you read that book as a book, and not as a series of demonstrations, Book Three, “The System of the World,” has a narrative structure that is integral to the argument Newton was trying to make:  that his new mechanics extended through the entire universe, to its infinite, and to human senses inaccessible, extent.  He did so by the way he organized that last section — which takes on the recognizable form of an epic journey.

But all that, of course, was then.  What about now?  Modern scientific communication is a highly formalized and artificial genre, of course.  No one reading this has to be told that.  But story still creeps in, as it must, given the way people tell themselves stories about what they do as the ideas frozen into papers take shape.  The issue is not the data but, as in the dispute with which Bora led off his original post, the interpretation of whatever has been measured or observed.  For interpretation, read story — as in, what story does my experiment tell me?  As in, who has the better story here?  Darwin again:

Authors of the highest eminence seem to be fully satisfied with the view that each species has been independently created. To my mind it accords better with what we know of the laws impressed on matter by the Creator, that the production and extinction of the past and present inhabitants of the world should have been due to secondary causes, like those determining the birth and death of the individual.

Is Darwin an entertainer?  Well, yes he is, if you have a certain cast of mind.  But is that pleasure, the thrill at a well turned thought, leading you just where the writer wants you to go, his primary ambition and accomplishment?

One last thought.  The real conflict between science writers and their scientist-sources does not seem to me to be the question of accuracy.  It really lies in the fact that writers and their subjects disagree on who owns the story being told.   For scientists acting as sources, it ain’t theirs.

That gets to the meat of what Bora was arguing, that the emergence of the blogosphere and of the communications technology behind it in principle will eliminate much or all of the need for intermediaries like science writers.  I think he’s wrong, mostly, and I’ll take that up in another piece.  But in the meantime, I’ll leave you with an anecdote that illustrates the underlying tension.  A few years ago I wrote a piece that became a cover story for Discover on some of the issues raised in the race to construct the next generation of extremely large optical telescopes.

In that piece I focused on one instrument, the Giant Magellan Telescope, for two reasons.  The first was that the GMT group had decided to start building their hardware long before they had the full sum in hand to construct the entire observatory — and I could use the casting of the first of seven mirror segments as my path into the subject.  The second was that my story was not simply about building big ‘scopes, but focused instead on the questions raised around the choices of dozens of people working on that and other similar projects to commit enormous chunks of their careers to such an uncertain goal.  Think LHC, think the James Webb Space Telescope, think, even, of the human genome project at its inception, think of the GMT’s competitor project, the Thirty Meter Telescope or TMT.

And that’s where the problem emerged.  I conveyed that theme through the stories of a half dozen different people working on the GMT, and those mini-stories occupied most of my account.  I did interview both Richard Ellis, of Caltech, and Jerry Nelson of the University of California, Santa Cruz, the two leaders of the TMT project.  I told them both up front that the emphasis of my article would lie with their rivals, but that I wanted to make sure to include enough of what they had to say so that my readers would know that there was more than one project striving after the same goal.

And that’s what I wrote.  And Richard and Jerry were both upset, and for good reason from their perspective, and they told me so very clearly.

That good reason:  there was and remains a public and private competition between the two projects for funding, in which a sense of inevitability, of unchecked progress, was very important.  An article that featured one project much more than the other was an advantage to one side (and the TMT people were already aggrieved after Dennis Overbye published his New York Times account of the GMT mirror casting without any significant mention of the TMT folks).

But I wrote back that, in effect, they had no cause for complaint — for two reasons.  One was that I had a story I was trying to tell, and it was about one aspect of life in science, not simply about one machine or another; I may or may not have succeeded in telling that story, but that I didn’t tell a different story hardly seemed (or seems) to me an adequate critique.

The other was that I am interested, and have been for a long time, in the interplay between instruments and discovery, especially in astronomy, and I planned to keep on covering what is a truly remarkable story of transformational technology unwrapping the universe.  I had every intention of doing a TMT story as soon as (a) enough time had passed to allow the market for big-telescope stories by Tom Levenson to recover, and (b) there was some kind of hook on which to hang the piece.

I haven’t written that piece yet.  Partly, I’ve been busy; new job, new book, kids, life, twelve inches of snow to shovel last weekend, all of the above.  And partly, this brutal fact:  there are many more good stories to tell in the world than I or anyone has time to write.  If I or any other writer gets a rocket from some source about the wreckage they’ve made of some story, I don’t say I’ll never write about that person’s work again.  But I’m human enough to hesitate to pick up the phone to call such an aggrieved soul.  Other things come along, other pebbles on the shore catch the light and grab my magpie’s interest.

The moral:  if you are upset that the story you would have told was not told, and you do not choose to write it yourself, then think about how you might want to convey your disappointments and your hopes to the offending story-teller.  And as for me, this new year’s resolution:  I’ll give Jerry a call this January and find out how goes the TMT.  (Better than the GMT, I think, given that Gordon Moore’s foundation decided to give the California-based project a ton of bucks that the GMT consortium has yet to match.)

Images:  Pierre-Auguste Renoir, “Portrait of Jean and Geneviève Caillebotte,” 1895. Source:  The Yorck Project: 10.000 Meisterwerke der Malerei. DVD-ROM, 2002. ISBN 3936122202. Distributed by DIRECTMEDIA Publishing GmbH.

Jan Matejko, “The Astronomer Copernicus in Conversation with God,” 1872.

And Spare A Thought For…

December 25, 2008

The Beagle 2 Lander — lost, presumed wrecked, on this day in 2003.

Not only is it appropriate to remember this one among many failed space missions on the eve of the Darwin year, but it serves as a more general reminder of how hard it is to do science.

If the stuff we want to know (is there/was there life on Mars?; what underlies the remarkable order we observe in the universe?; what explains the odd fact that the object typing these letters is aware of itself typing these letters?; and so on) were easy, then everyone would do it and/or we would know all there is to be known.

Ain’t happened yet; doesn’t seem likely that it will.  The little Beagle, silent this last half a decade, gives one minor insight into why.  So raise a glass to it, and to those who thought the gamble worth the risk of sending it off in the first place.

Happy Newton day all, again.

Image: Chasma Boreale, a feature of Mars’ north polar ice cap.  NASA Mars as Art gallery.

Happy Newton Day.

December 25, 2008

What Olivia said.

I got nothing much to add — turkey overdose yesterday; prime rib on the schedule for a couple of hours from now.  No reasoned comment can survive that much animal protein in less than 24 hours.

So I’ll just leave you with this. (h/t Amanda M.)

Ramen.

Science Bloggers v. Science Journalists: first thoughts

December 24, 2008

Courtesy of Bora, or perhaps courtesy of what was apparently a less than edifying round of bloggingheadstv*, another round of “who needs science journalists” has sprung up in and around the Science Blogs community.

There is a lot of material here, an assault on a very broad front (or, perhaps to mix military metaphors a bit, enough here to supply assaults on all five Normandy beaches).  So I’m going to try the experiment — for this blog — of responding in bits rather than try to match Bora’s own heroic effort to encompass the full range of what evoked his rare ire.

In any event, I’m going to start on the periphery of this wrangle, with a look at Greg Laden’s journalistic ethics exercise.  In this post Laden makes a preliminary plea to his fellow scientists to note the difference in scientific cultures, in the practice, habits and rhetoric of different disciplines within the meta-discipline called science.  He writes,

You’ll notice that I also treat papers in anthropology, in particular human evolution, human paleoanthropology, and archaeology. But what you have not seen much of is archaeology outside of the human origins area. This is not because I don’t find this interesting, or am not trained in that area. Rather, this is because it is hard to do. I have a paper by a colleague, Tom Huffman, sitting on my desktop right now waiting for my attention, to turn it into a BPRR post, but it is proving to be quite difficult because of the nature of the research Tom reports and the way it is written up.

This is not because Tom did a bad job. No, he did a great job. It is rather because this area of archaeology … as is the case with a number of areas of science … is in fact NOT treated in the literature as Coturnix has suggested for science in general. It is not the case that these research results are written up in the standard scientific form or that they even follow a coherent form within the subfields.

This distinctiveness is true in a lot of other ways as well. … Elsewhere (but I’m not going to bother to run this down and cite it) I’ve seen similarly dumb statements about science that can only be seen as based on the belief that all science is done in a lab, as though there were no field scientists.

Word.  What Greg says is exactly right, and is in my experience routinely missed by many working scientists.

But while Greg tries to educate his colleagues to the significance of the fact that a physicist thinks, acts and speaks very differently from an archaeologist, he ignores the fact that those who write about science for the public manifest a similar range of species.  Instead, he asserts a blanket claim:  it is unethical for journalists to check material selectively, with only one source for a story; whereas, faced with an analogous ambiguity, a scientist is ethically obligated to pursue the issue to some kind of a resolution.

This is really a Hollywood notion of the practice of journalism.  It is notionally true if you are a daily journalist writing for a newspaper, wire service, or perhaps the science news division of a radio or television broadcast.  (Please note that these are all endangered populations.)  It is often honored in the breach, being a rule designed to prevent the targets of journalistic investigation from having too many chances to shape a story to their ends.  I have known plenty of science beat people who have checked specific questions back with one source or another.

It is not true for magazine writers — fact checkers exist in part to check the accuracy of exactly this kind of thing if the writer misses the error, and in any event longer-form journalism demands a built-in checking process.  One may draw certain lines; many people do not send text to sources, though I have occasionally e-mailed particularly vexing passages (an explanation of an adaptive optics idea to Roger Angel, for one more or less recent example).  But going back for second-round questions on a selective basis?  Of course.  Happens all the time, and there is nothing unethical about it.  (Think for a moment:  does The New Yorker care, e.g., whether or not both Michael Oppenheimer and Stephen Schneider got to clear up a misapprehension about the significance of grid sizes in climate modeling over history?  No it does not, as long as the finished story gets the idea and its importance correct.)

It is not true at all for science documentarians.  When I have gotten stuck into the final stages of writing a NOVA script, you can be sure that I checked every claim of fact, and any interpretation of which I was not sure, with one or usually more sources, some in the program, some not — and this is a common practice.

And so on.  Bloggers tend not to check in the same way, for at least one obvious reason:  the real-time and comment features provide post-hoc correctives.  The significance of this should now be obvious:  time frames dictate the meticulousness and methods of ensuring accuracy.  The shorter the deadline and the more ephemeral the form, the less rigorous the checking process.

That said, you don’t want to get stuff wrong in any form.  I remember every error and still wake up at night over them — especially one particularly careless, stupid, and annoyingly trivial error committed…wait for it…24 years ago.  (I filled a dry lake with water in print.  Oy.)

And this brings me to the last thought of this stage of my response to Bora’s provocation.  The folks within the Science Blogs community — commenters more than the names at the top of the columns, I think — see science journalists and science writers either as the enemy, active, intention-filled obstacles to communicating the beauties and truths of science to the public; or as mere roadblocks, too incompetent to know their own uselessness.  Well, fifty percent of both researchers and writers are below average, to be sure, but, pace the most often misquoted, science writers are the least of the problem.

Instead, it is vanishing venues first and foremost; with that loss comes the evaporation of specialized beat staff (science reporters may evoke occasional heartburn, but try the city-hall hack, doing his or her best to get up to speed on a story some PR person at a university has just announced is the next best thing to a cure for piles, and feel not disdain but sympathy).  And, picking up on that parenthetical, it’s the consequence of the fact that science writ large is a multi-multi-billion-dollar enterprise, and there are many more roadblocks in the way of a platonically perfect story than the competence of the reporter:  the interests of the parties to a story may (often) conflict, and accuracy becomes a much more elusive ideal when that is so.

That’s a separate issue from the core of Bora’s and Laden’s concerns in these posts, so I’ll save what thoughts I could have on that score for a different post.  But if I were to try to sum up this post whilst standing on one leg, it would be that the varieties of writers are as distinctive as those of scientists, and it is important to understand the specific constraints of the culture of the particular writer with whom one deals.  At a minimum:  know your reporter’s deadline.  That will tell you the degree to which you need to check with the reporter on the first call to see if they’ve understood what you are trying to tell them.

*I qualify my description of the offending BHTV episode because I never, ever watch BHTV.  I find it the worst of all worlds:  the production values of home movies; the story structure of a freshman bull session; and the relentless linearity of broadcast delivered in a medium that lives and dies by random access.  Life is too short.

Image:  a page of Galileo’s notes from January 1610 that would form some of the material for one of the first pieces of popular science writing, Sidereus Nuncius.

Annals of Dumb: Playboy Edition/misplaced facticity edition

December 17, 2008

Via Huffington Post I discover that Playboy‘s Mexican edition has committed the predictable folly of placing on its cover a strategically partially dressed young woman to whom has been attached the caption “Te Adoramus María.”

To no sentient being’s surprise this has aroused ire amongst the faithful, the more so because the edition came out the day before the celebration of the Day of the Virgin of Guadalupe.  The predictable round of apology and disclaimers has begun, and this will pass as another minor skirmish in the eternal war between desire and faith…or whatever pomposity commentators will come up with to mark the occasion.

But what got me was not the cover, nor the very nice young lady depicted, nor her garment, meant, I think, to evoke a kind of demure prayer shawl but looking like nothing so much as the tablecloth you pull out when the children are going to be sitting elsewhere.  No, it was this line:

Playboy magazine apologized for a controversial cover featuring a scantily-clad woman resembling the Virgin Mary, Reuters reported. (Italics added)

Resembling?  Really?

I could go all serious here, and storm at the feckless fact-averse who cannot seem to face the notion that we don’t know who Mary was (not to mention the uncheckable sourcing backing up the claim of a particular sexual status), and hence have no clue about the appearance of one amongst all the young mothers-to-be in the Roman province of Galilee about two thousand years ago.

It may be conventional to depict Mary as a young, often dark-haired beauty, and the woman on the Playboy cover matches that broad description — but then so do lots of people who do not greatly resemble each other.  (Think, e.g., of Halle Berry and Sarah Silverman, just to take two folks off the top of my head.  And thanks for The Great Schlep, Sarah, as long as we’re here.)

But rather than go into some long discourse on this as an illustration of one of the ways in which claims of established fact by the faithful take forms unintelligible to scientific rationalists — and vice versa — I figured out how I could boil the whole argument down to the old Catskills punch line.  Looking at the Playboy cover, all I could think was,

“Funny.  She doesn’t look Jewish.”

Image:  El Greco, “The Assumption of the Virgin,” 1577

Getting Ready for 200/150: “How Many Removes From Charles?” Edition

December 16, 2008

As everyone with a pulse and an interest in science knows, 2009 is the big Darwin year — the 200th anniversary of his birth (February 12) and the 150th of the publication of The Origin.  I will in a week or so have some news about what Inverse Square — or a derivative thereof — is doing to join the chorus on that one; I think I’ve got something shaping up that the community will enjoy.

In the meantime, and as I get stuck into my prep for that project, just a quick thought as I peered at the Darwin/Wedgwood family tree Janet Browne helpfully included at the front of the Voyaging volume of her Darwin magnum opus.  There I found Darwin’s longest-surviving child, Leonard.  Leonard Darwin was born in 1850, before the Crimean War, the Sepoy Mutiny — the Indian Rebellion of 1857 — and the American Civil War; and he lived to see the Second World War before his death in 1943.*  And, not to overlook the most important factoid, young Leonard would have been a curious eight-year-old just as his father was in the midst of his most intense labors distilling the work of decades into the book that became The Origin of Species.

That skein of history would be remarkable enough just for one man’s memory, but what struck me was the thought that my Uncle David, born and raised in England, with an army background (and subsequent career of his own) that could have led him to Major Darwin (Royal Engineers), might indeed have exchanged a conversational commonplace or two with the son of the man whose birth and work we celebrate soon.

All of which is to point out the obvious — and perhaps one tangential thought not quite so banal.  The distance between anyone reading this and Charles Darwin is not that great.  It is entirely imaginable to have had a conversation with someone you know or knew who could have heard the stories of life at Down House from someone who watched and listened as Charles Darwin assembled his argument.  The middle of Queen Victoria’s reign, and the very center of a revolution in ideas, seem very far away when we toss around anniversary numbers like a bicentennial, or one hundred and fifty years since this or that.  They are not, at least by the measure of human memory.  She danced with a man who danced with a woman who danced with the Prince of Wales; we are that close to Charles and his pigeons and all the rest.

Nothing new there — just a reminder of the numbers.  But the thought that crossed my mind as I wondered if my uncle did in fact ever meet Leonard (as above — I had not known to ask, of course, until the chance-met glance at the bottom of the family tree) was that Richard Dawkins may have missed the point of his own reflection that he too would have been a believer before Darwin.

If you follow my sense of the slenderness of the gap that separates us, in the passage of generations, in the transfer of ideas and culture from grandparent to grandchild, at so near a remove from Charles Darwin in his study in 1859, then the broken chain of belief that separates Dawkins from Victoria’s (or Emma Darwin’s) Anglican God is very short indeed.

And that thought made me wonder if the heat and urgency I read in Dawkins’ atheism are a little misplaced.  Without wandering too far into this thicket, it does seem to me worth remembering that it has been a very short time in the history of human society, and a still shorter time if the person-to-person touch of memory matters, since Darwin’s thought struck its blow to conventional faith.

It takes some time for big ideas to sink in.  (For a biblical example, as long as we are on the subject:  God, through Moses, affirmed the equality of women, at least as far as inheritance and rights of property go, in the Book of Numbers, which dates back 3,400 years or so.  That thought took a while to penetrate, did it not?)

There is no doubt in my mind that Darwin’s rigorous materialism takes some getting used to; that part of the point of 2009 is to confront not just Darwin’s thinking, but the success of the research program that his work (and that of many others, of course) set in motion.  I’m confident, that is, not angry — and I remember that we have not been inside this world view for any length of time at all.  One hundred and fifty years?  The lives my uncle’s life has touched — mine and those before me — stretch back before then.  Easily.

*Here, from the Wikipedia entry on Leonard Darwin linked above, is John Maynard Keynes’ take on Charles’s son, who proves to have just a hint of the wasp about him in his turn:

Keynes explained the decision to publish the niece’s “very personal account”: “Leonard Darwin’s life covered so vast an epoch of change in men’s ideas, his own attitudes towards the problems of his age were so characteristic of the best and noblest intelligences of his time, and he grew up in the environment of a family of so immortal a renown …” (p. 439) Darwin expressed his feelings about Keynes in a letter to Fisher (Correspondence p. 141), “I neither like him nor trust him … But he’s very clever …”

Image: Auguste Renoir:  “La danse à la campagne,” 1883.

What I Think About the Hour Before the Dentist Goes Medieval on my Gumline:

December 15, 2008

Deep scaling (don’t ask.  Warning:  don’t click on that link — really gross image on the other side) to come at the disgustingly appropriate appointment time of 2:30 (my eight-year-old liked that one).  So I digress, in any way I can.  I’m sure there are important matters of science and society to consider, but I’m thinking blood and molars, so it’ll all have to wait.

To that end, consider this piece of delightful Boston v. Los Angeles trash talk — from the LA Times, no less:

The Celtics and Lakers can even be defined by their color analysts. Tom Heinsohn is a hulking beast who looks like a hitman. Mychal Thompson is from the Bahamas and wears sandals and puka shells.  Heinsohn is Tony Soprano, Thompson is Bob Marley.

If the Celtics had Laker Girls, they would be Janet Reno and Madeline Albright.

The Lakers are fast and fun and athletic and entertaining and pretty as can be.  I love the Lakers!  I say, to heck with the tacos!  Headbands for everyone!   Or free passes to Lamar’s favorite day spa.

Let them score 120, give up 110 and we can all go home happy, without the angst.

As Paul McCartney was just telling me, Let It Be.

Think of it like this: The Celtics are Rottweilers, the Lakers French Poodles. The Celtics bite your arm off. The Lakers win Best in Show.

If the Celtics were a video game, they’d be Grand Theft Auto.  The Lakers are Tickle Me Elmo.

In movie terms, the Lakers are Paul Newman, the Celtics Charles Bronson. Newman made better movies, aesthetically he was the one you wanted to watch and paid to see.

Except of course that it was Bronson who kills everyone in the end.

— Ted Green