Archive for the ‘Chemistry’ category

Happy Mole Day

October 23, 2015

Yo! Avogadro fans!

Today is our day!

I’m a little late getting this post up, as Mole Day ends at 6:02 p.m.

Remember:  that equal numbers of molecules of any gas occupy equal volumes at the same temperature and pressure is not just a good idea.


…for which insight Amedeo Avogadro received the honor of having his name attached to the number of molecules that make up one mole of a substance, a number set by convention as the number of carbon-12 atoms that add up to 12 grams of that isotope. That number:  6.02*10^23.  Hence, Mole Day, running from 6:02 a.m. to 6:02 p.m. on 10/23.
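For the quantitatively inclined, the mole arithmetic is simple enough to sketch in a few lines of Python (my own toy example, not part of the original celebration; the function name is made up):

```python
# Avogadro's number: particles (atoms or molecules) per mole.
AVOGADRO = 6.022e23

def particles_in_sample(mass_grams, molar_mass_g_per_mol):
    """Convert a mass of a pure substance into a particle count."""
    moles = mass_grams / molar_mass_g_per_mol
    return moles * AVOGADRO

# By definition, 12 grams of carbon-12 (molar mass 12 g/mol) is one mole:
print(particles_in_sample(12.0, 12.0))  # prints 6.022e+23

# A glass of water (180 g, molar mass ~18 g/mol): ten moles of molecules.
print(particles_in_sample(180.0, 18.0))
```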

In Avogadro’s honor, a tune:

Geek out, my friends.

For A Good Time On The Intertubes: Deborah Blum, Poison, Murder, Chemical Ignorance Edition

January 15, 2014

Hey, everyone.

It’s that season again — third Wednesday of the month (what, already?). At 6 p.m. ET, I’ll be talking on that old Intertube Radio Machine with science writer extraordinaire Deborah Blum.  Live and later here, and/or in Second Life at San Francisco’s Exploratorium in-world theater, should you be minded to join our virtually live studio audience.

Deborah is probably known to you as the author of The Poisoner’s Handbook, a really elegant book on the birth of forensic chemistry in the Prohibition-era investigations of New York City’s nascent chemical crime investigative laboratory.  It’s just a fabulous read — noir true crime with a solid steel core of great science running through every misdeed.


The PBS series The American Experience just broadcast an adaptation of the book, by the way, which can be viewed here.

There’s a lot more to Deborah’s career than simply this most recent success.  She won a Pulitzer Prize as a reporter for The Sacramento Bee for reporting on ethical issues in primate research, work contained and extended in her first book, The Monkey Wars.  She’s published five previous books in total, all great — my favorite is Love At Goon Park, but there’s not a dud in the bunch. Far from it.  Her day job now is teaching science and investigative journalism at the University of Wisconsin-Madison. Her students are lucky ducks (or badgers).

We’ll be talking about the new stuff:  poison, the emergence of systematic chemistry as a tool, the issues we face of our ignorance of so much of the chemical universe — the West Virginia spill will be our proof text there — and more.  We’ll also continue the extended conversation I’m having with several colleagues about the constraints and worse affecting the work of women in science writing.  Deborah has been a leader in organizing public thinking and discussion on these matters, so that’ll be on tap as well.

I should add what you may have guessed: Deborah is a good friend as well as a professional colleague.  So I’ve got the experience to assure you she’s a great conversationalist.  It will be an interesting hour.  Come on down!

Image:  Jacques-Louis David, The Death of Socrates, 1787.

On the Origin of Science Writing: Joseph Priestley/Isaac Newton edition

January 6, 2009

Sunday over at Daily Kos, Devilstower had a very nice review of Steven Johnson’s new biography of Joseph Priestley, The Invention of Air. The review did its job: it made me go out and get the book.

That said, Devilstower made one claim that I think dramatically overstates Priestley’s accomplishments — while diminishing the real history of the democratizing spread of scientific thinking that significantly preceded Priestley, and continues today.  Devilstower wrote:

He was the author of what was probably the first popular book on science. Not the first science book, there had been many of those, but the first popular science book. What’s the difference? As an example, Newton’s Philosophiæ Naturalis Principia Mathematica had been out for several decades before Priestley began writing, but as the title suggests, Newton chose to write his book in Latin, not English, and crafted a text both intentionally dense and painfully difficult. Despite the revolutionary wealth of information in Newton’s book, only the most advanced scholars could pry lose its secrets. Priestley broke with this scholarly tradition and wrote his works in English, being careful to explain enough background to address a general audience. Rather than go through the elaborate mental gymnastics that Newton presented, Priestley laid his work bare on the page.

The problem with this is that it is just not true.  By 1768, when Priestley published a popularized version of his more technical History and Present State of Electricity, the notion of writing about nature in the vernacular had been on the rise for the better part of two centuries at least.

The breakthrough to writing in the living language spoken by more than the scholarly and clerical elite actually came first in Italy during the Renaissance.  Think Petrarch, and his sonnets; Machiavelli with Il Principe — The Prince; Dante, of course, with the Inferno, Purgatorio, and Paradiso — and for our purposes here, that somewhat later figure Galileo Galilei, who risked Papal wrath for the double sin of countenancing the Copernican view of the heavens and saying so in Italian.  That this was understood to be a popular work, i.e. one for lay audiences, can be seen by clicking that link — it leads to the 1661 translation, “Inglished from the original Italian copy by Thomas Salusbury.”

There you have a crucial work of popular science available to Italian readers in 1632, and to Priestley’s great-grandparents a generation later.

In that gap, you can find a bit of the broader historical processes creating not just the knowledge required to write about science, but the possibility of doing so for a broad audience.

For example:  you can’t have a popular literature without the means of getting words on paper cheaply enough to attract the punters.  That took time:  at the beginning of the seventeenth century, England lacked any mills making white paper suitable for printing.  (There was some production of brown paper for wrapping and packaging.)  By mid-century, as Isaac Newton was just heading off to grammar school, there were still just two mills in the entire country making the high-quality stuff (thanks to James Gleick’s fine brief life Isaac Newton for that fact and the heads-up to the larger story).  Paper for printing was almost exclusively imported, and was very expensive — 24 sheets of the good stuff cost a day’s wages for a laborer.

This is one of the reasons books were expensive, print runs small, and great care was taken not to get lumbered with copies of works that lacked a market.  That is, part of the answer to the mystery of why Shakespeare’s First Folio came so late — in 1623, seven years after its author’s death — and in what seems like a small print run (though at the time around 1,000 copies was a very considerable edition) is that those involved needed to be sure that the market for a large and hence costly book was actually there.

But by the latter half of the 1600s, and especially towards the end of the century, paper was much more generally available, printing technology had spread, and a popular and even a kind of tabloid media could take form.  In that context you find, if not the first, then certainly some of the earliest popular science writing in English, with the most notable early example coming from Newton’s antagonist, Robert Hooke, and his landmark Micrographia, published in 1665.

As much as anything Joseph Priestley ever wrote, Hooke’s extraordinary work offers a social as well as an intellectual portrait of the time.  He wrote it for a general audience; he wrote it to make money; he wrote it out of origins as a poor member of the educated class, made good by his own efforts, his skill, and his capacity to attract the patronage especially of Robert Boyle.

There is social mobility there, technological change, and a new culture forming in the emergence of a public both interested enough and with enough surplus cash to purchase a pricey, lavishly illustrated work about nature.

And then there is the case of Newton himself, calumniated by Devilstower for “elaborate mental gymnastics” of which Priestley is supposed to have been free.  But there is this problem:  Newton was not trying to write a popular text in the Principia, just as Priestley was not in his larger work on electricity.  Newton was presenting new results to a community of colleagues in the form in which its claims could be understood, assessed and tested — that is, in the emerging language of mathematics.

That was the point:  this was a new approach to science, and Newton laid out his work to make both his results and his methods available to the audience that could make use of them.  That he was attempting to persuade his fellow natural philosophers, and not the public at large, should no more be held against him than should Einstein’s framing of general relativity in mathematical language that was, for physicists at the time, exceptionally sophisticated.

Of course, Einstein went on to write the popular text Relativity, still in print, still a valuable introduction to both the special and general theories.  It is perhaps less well known that Newton took pains to make available to a broader public the qualitative content of his ideas.

For example, in the famous letters he wrote to the divine Richard Bentley to help him prepare the first series of Boyle lectures, Newton sought to explain his theories in order (as he wrote; I’m paraphrasing from memory here, so don’t sue me) to enhance faith in and wonder at the works of the divine.  He took pains to explain to Bentley how, in his view, his theory of gravity did just that, reaching through his correspondent to the public audience that would encounter the lectures.

Most important, in the context of Devilstower’s view of Priestley’s role in creating a place for the broad mass of men and women in the scientific enterprise, in contrast to Newton’s supposed scholarly elitism:  Newton himself wrote more than the Principia, and his other great book on scientific topics, The Opticks, was published in 1704 — in English.

I can’t say that the book is a glorious ornament to English prose style, but it’s readable, it’s there, and it contains critical ideas not just about experiments that Newton had performed in his youth, but (in later editions) about how science itself works, how the interplay of experiment and reasoning produces results.

Last, beyond Newton himself, Newtonianism was a big enterprise throughout Europe both during and after Newton’s lifetime.  Mordechai Feingold’s catalogue The Newtonian Moment captures the explosion of popularizations of Newton’s work that followed with striking speed after the admittedly dense Principia.

All this is not to say that Priestley was not important.  Of course he was, both as a scientist and as a democratizing writer about science for a broad audience.  I’m really arguing just two points here.

The first, more minor, is just a defense of the value of historical knowledge.  The past is a much more fine-grained place than one in which Priestley, born in 1733, confronts Isaac Newton, born in 1642, across an empty field.  The notion that science was the province of the Latinate elite until it was suddenly set free is false — and does real violence to a much more important understanding, my second point: the public understanding of science has been a significant battleground for centuries.

We are fighting it still, of course, and to the extent that we are making progress, it is as the heirs of a tradition formed not just by one or another hero, but by centuries of effort by the famous and the much more modest alike.

Images:  “Dr. Phlogiston” — anti Priestley cartoon c. 1780-90.

William Blake, “Isaac Newton,” 1795.

Joseph Wright, “An Experiment on a Bird in an Air Pump,” 1768. Note that this painting, depicting a scientific demonstration as a recognized form of respectable entertainment for an unequivocally non-expert audience, dates from the same year as Priestley’s foray into science popularization.  Image source:  The Yorck Project: 10.000 Meisterwerke der Malerei. DVD-ROM, 2002. ISBN 3936122202. Distributed by DIRECTMEDIA Publishing GmbH.

Cold Weather Beer Thoughts

December 11, 2008

Following up an exceptionally episodic series on the craft, science and literature of beer (“ah, that was a mug pint of proper 1420, that was…”) I offer the fruits of my finally googling a question that had nagged at me for a while:  what the hell is the difference between stout and porter anyway?

That led me to this delightful thread, and to this magisterial answer from the Campaign for Real Ale:

Porter was a London style that turned the brewing industry upside down early in the 18th century. It was a dark brown beer – 19th-century versions became jet black – that was originally a blend of brown ale, pale ale and ‘stale’ or well-matured ale. It acquired the name Porter as a result of its popularity among London’s street-market workers. At the time, a generic term for the strongest or stoutest beer in a brewery was stout.

The strongest versions of Porter were known as Stout Porter, reduced over the years to simply Stout. Such vast quantities of Porter and Stout flooded into Ireland from London and Bristol that a Dublin brewer named Arthur Guinness decided to fashion his own interpretation of the style. The beers were strong – 6% for Porter, 7% or 8% for Stout. Guinness in Dublin blended some unmalted roasted barley and in so doing produced a style known as Dry Irish Stout. Restrictions on making roasted malts in Britain during World War One led to the demise of Porter and Stout and left the market to the Irish. In recent years, smaller craft brewers in Britain have rekindled an interest in the style, though in keeping with modern drinking habits, strengths have been reduced. Look for profound dark and roasted malt character with raisin and sultana fruit, espresso or cappuccino coffee, liquorice and molasses, all underscored by hefty hop bitterness. Porters are complex in flavour, range from 4% to 6.5% and are typically black or dark brown; the darkness comes from the use of dark malts unlike stouts which use roasted malted barley. Stouts can be dry or sweet and range from 4% to 8% ABV.

In the discussion linked above, I particularly liked the description of the recent trend in American brewing to produce monstrously strong beers as “kamikaze beers.”  Precisely so.

And yes, in case you were wondering, I’m staring out a plate glass window onto grey pavement, spitting rain, and temperatures in the thirties.  Perfect weather (if not yet quite the perfect time) for some black beer.

Program Notes: Technology Review/Former Student Props edition

October 26, 2008

A little suggested reading, combined with some love for recent graduates of the MIT Graduate Program in Science Writing — the little corner of the Institute which it is now my honor to direct.

First up, the cover story in the current Technology Review, “Sun + Water = Fuel” by Kevin Bullis, who completed the grad program in 2005.  It tells the story of a discovery by MIT chemist Daniel Nocera, who has found a catalyst that may (note the conditional) make it possible to separate oxygen out of water at a cost that would make that energy source competitive with or better than fossil fuels.
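For context (my own gloss, not anything from Kevin’s article): “separating oxygen out of water” is the oxygen-evolving half of splitting water, historically the expensive, catalyst-hungry step. In standard notation:

```latex
% Overall water splitting (the hydrogen is the fuel):
2\,\mathrm{H_2O} \longrightarrow 2\,\mathrm{H_2} + \mathrm{O_2}
% Oxygen-evolving half-reaction at the anode, the hard step:
2\,\mathrm{H_2O} \longrightarrow \mathrm{O_2} + 4\,\mathrm{H^+} + 4\,e^-
```

The hydrogen evolved at the other electrode is what you would actually burn or feed to a fuel cell.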

I had thought to blog this finding when the press release hit my inbox, but now I don’t have to.  Kevin has done an excellent bit of reporting, explains what’s going on clearly, and writes it up with, I think, the correct balance of optimism and the always needed skepticism in the face of technological predictions.  (See the comment thread on this article for an illustration of the line Kevin tried to walk.)   He’s a writer to watch — graceful and stylish, with a true love of tech.

Then there’s this story, “The Flaw at the Heart of the Internet.”  Erica Naone is another one of our stars.  She graduated from our program in 2007.  This story is chilling in its account of the near miss in which Dan Kaminsky identified a significant vulnerability in the way the web matches more or less plain-language names, the DNS monikers like “”, with the numerical addresses by which the internet itself identifies the locations thus named.  That flaw would allow attackers to hijack DNS information and replace the intended material with content of the marauder’s own.
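Erica’s piece turns on how much the net trusts those name-to-number mappings. A toy Python sketch (entirely my own illustration, with fictional documentation addresses, not anything from the article) of why a poisoned lookup table matters:

```python
# A toy lookup table standing in for a DNS resolver's cache.
# The addresses are fictional documentation values (RFC 5737 range).
dns_cache = {"example.com": ""}

def resolve(name, cache):
    """Return the cached address for a name, or None on a miss.
    A real resolver would fall back to querying other servers on a miss."""
    return cache.get(name)

# An ordinary lookup sends traffic where it should go:
print(resolve("example.com", dns_cache))  # prints

# But if an attacker can forge a response that overwrites the cache entry,
# every later lookup silently points at the attacker's machine instead:
dns_cache["example.com"] = ""
print(resolve("example.com", dns_cache))  # prints
```

The Kaminsky flaw was a weakness in how real resolvers decided which answers to accept into that cache; the consequences of a bad entry are exactly as mechanical as the sketch suggests.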

While Black Hat 2008 awarded Kaminsky its Pwnie Award for “Most Overhyped Bug,” Erica’s piece gives you a very good argument why (a) you should have been, at least retrospectively, very, very afraid; and (b) more generally, you should remember the eternal truth most vividly expressed in Neal Stephenson’s Snow Crash: that the internet is not a benign playground.  There be dragons out there.*

On that note — a third article in the current Tech Review that is a true must-read comes from my old friend and long-time MIT guy Simson Garfinkel. (If there were anyone with beaver blood running in his veins, it’s Simson, a holder of four (or more; I can’t keep up) Institute degrees who is, as far as I can tell, perfectly adapted to MIT’s unique intellectual island ecosystem.)

In the piece, “Wikipedia and the Meaning of Truth,” Simson has written what seems to me to be a very important article, one that emphasizes Wikipedia’s appeal to authority as its ultimate standard of what merits inclusion in what is rapidly becoming the default web-based repository of recognized knowledge.  A must-read, IMHO.

(And I have one anecdote about the pitfalls of the imputation of authority of printed sources.  I wrote an article not that long ago for a national publication not to be named here.  The fact checker called me up to confirm some detail.  I said, basically, that it had come out of my own research.  She demanded a published source.  I asked if my own book would do.  She said yes.  Sic.)

*The other pleasure of Erica’s article for me was that I finally got a semi-definitive (at least Wikipedia-worthy) pronunciation for the web-slang term “pwn” — which apparently rhymes with “own.”  I had previously suggested, at least partly tongue-in-cheek, that it might derive from the Welsh use of “w” as a vowel: the Welsh “cwm,” pronounced “koom,” exists as a loanword in English (and has also been transcribed as Comb or Coombe).  On that basis, earlier this year I offered the suggestion/question of whether pwn should be pronounced “poon,” evoking Neal Stephenson’s use of the word in Snow Crash to describe what his character Y.T. does when she uses her magnetic harpoon to attach to the vehicles that pull her along on her Kourier rounds.  Sadly, inventive as that may have been, it appears that my attempt at etymology is not just wrong, but terribly, terribly so.

Image:  J.M.W. Turner, “Sunrise With Sea Monsters,” 1845.

Against Ta-Nehisi Coates…

October 24, 2008

…or rather, against his defense of white racism. The post is a meditation on why women are, in his perception, so harsh on Sarah Palin; his epiphany came when he tried to imagine a black equivalent to the Palin candidacy — and he couldn’t:

A brother in that position not only would not be considered for 2012, he would be impeached when he returned to governorship for embarrassing the state, and then have his ghetto card revoked for embarrassing the local Negrocracy.

For this, the writer is grateful, which makes perfect sense.  It’s better by far to have a strong sense of standards than some unthinking identity commitment.

That’s the implication of the Yiddish phrase, “A shande fur de goyim” — a shame before the non-Jews. Nothing could be worse than to be such a shande; it’s why Jews, or at least those I hang with, wince with every Jack Abramoff or, to channel a different era, why Abbie Hoffman’s use of the phrase to describe Judge Julius Hoffman during the Chicago Eight trial was such a potent barb.

More deeply, we have a lot of history that tells us it is better on every level, from the moral to the practical, to be not merely no worse than the majority societies in which most Jews live, but to be closer to blameless, to bring no scandal to our names and homes. So, thus far, I’m with Ta-Nehisi.  But then he goes on to write about whom he would credit for the existence of such internal correctives:

White racists have taken a lot of heat on this blog. But the truth of the matter is that they may be the single biggest promoters of black excellence in this country’s history. There is a reason Tony Dungy was the first winning coach in Tampa Bay’s history–he had to be.

Again, from where I sit looking over the ethnic/race/identity sorrows of history, I know that there is a partial truth here.   I’m enough older than Ta-Nehisi to have Jackie Robinson’s story as the archetype of the pressure on the standard-bearer.  There is no doubt that Dungy did a very hard thing — much harder than most watching him grasped, I think — but Robinson was literally in a league of his own on the need to combine superlative performance with extraordinary internal strength and self-control.  (For the record, I’m not so old that I ever saw Robinson play; but his was the story we read in grade school.)

The same dynamic played out time and again in public and in private Jewish lives — including the importance of public heroes finding some way to express both a particular and a universal greatness; think of Sandy Koufax refusing to pitch on Yom Kippur and you have a hint of the balancing act involved.

But where I think Ta-Nehisi goes wrong is in giving racists themselves credit for the excellence of a Dungy or anyone else.  I don’t doubt that there is a forged-in-fire power to the notion of proving oneself despite the efforts of those with evil intention to thwart you. But Ta-Nehisi goes astray (IMHO) when he writes this:

… A little bit of bigotry would have prevented all of this [the Palin debacle]. So to all the Ferraros out there I have one request–more racism please. It improves our stock. It makes black people, a better people.

No, it does not.  I don’t think you could or should credit racism for what Dungy can claim as his own achievement, nor that of Einstein, perhaps — or more on point for a science-and-public life blog — the life Percy Julian lived.

Percy Julian is not as well known as he should be.  Get introduced to him out here, and/or watch the excellent two-hour biography that NOVA broadcast a year or so ago.  Ruben Santiago-Hudson, who plays Julian, is worth the price of admission on his own, and, to brag a little, my wife, Katha Seidman, won her second Emmy for her design of the show.

The short form:  Julian was one of the pioneering synthetic chemists of the period between the wars and just after WW II.  If you have ever used a cortisone cream, other corticosteroid medicines, or birth control pills, you owe Dr. Julian a debt of thanks.

He had a great career; he was honored (belatedly); he got rich — all good.  He also was bedeviled by racist constraints from childhood through to the time he was getting his own company off the ground, and in particular institutional and individual bigotry kept him from the first career he intended to pursue, that of an academic chemist, pursuing whatever research that seemed to him most promising.

That he made an enormous contribution to his field as an industrial chemist is a tribute to just the kind of determined excellence Ta-Nehisi celebrates in Dungy.  But the price paid, the cost in opportunities not just lost, but actively barred has to be accounted for too.

I’ll stipulate that Ta-Nehisi knows this very well indeed. For my part, I’m lucky that my ethnic identifier, in this country at least, is farther removed than his from our own versions of the ghetto and Jim Crow.  It was my great-grandfather who made it out of the old country, and his stories have not survived the passing of the last of his own children.

I am not completely tone-deaf to irony and sarcasm either, nor the echoes of that supremely useful phrase “the soft bigotry of low expectations” as applied both to Governor Palin and such sometime-symbolic figures as the athlete formerly known as Pacman and Mike Tyson.

But I still think that Ta-Nehisi is undercounting the persistent tax that bigotry imposes on its targets.  You could call it the Julian tax, the daily toll exacted in the pursuit of excellence constrained within limits not of your own choosing.

I’ll stop here — but for a truly beautiful meditation that touches on this theme (and much else) look to Bill T. Jones’ memoir The Last Night on Earth.

Image:  Ben Shahn “Sign on a Restaurant, Lancaster Ohio” 1938.  Library of Congress [].  Source:  Wikimedia Commons.

Friday Afternoon Science We Can Believe In

October 17, 2008

I’m weary of the election.

I can’t take much more of the BS and hate (and violence) emanating from the other side.  (See, e.g. this, and this, and this, for recent examples.)

I hear (and obey!) Senator Obama’s warning call to keep plugging, not to get complacent.  I’ll be telephoning/canvassing this weekend, and next, and the four days up to and including the first Tuesday in November.  I will do my damn best not to leave any effort on the table.

But I’m damn tired — the GOP noise machine has had that much impact.  As far as this blog is concerned, acorn denotes that object out of which mighty oaks may grow, and no sleazeball, scumbag robocalling, lying mailer-dispatching, my-campaign-is-positive honorless hypocrite is going to convince me otherwise.

I want to get some science back in this blog, something where people are engaged with the (secular) better angels of human nature, and asking deep and interesting questions of the material world in which we live.

That is: I want to think about work like this:

Outer space smells like hot metal, fried steak and the welding of a motorbike, scientists suggest. A chemist is recreating the smell to help Nasa to train its astronauts.

The Times (of London) Online reports that NASA has called on chemist Steve Pearce, whose day job has him investigating fragrances, to come up with a compound with the right pong to prepare those heading for the International Space Station for the unique experience of sniffing in space.

Ah, Friday…but in fact the brief item does suggest that there is a genuinely interesting question behind the immediate application:

“We have already produced the smell of fried steak, but hot metal is proving more difficult,” he [Pearce] said. “We think it’s a high-energy vibration in the molecule.”

(h/t Scout Finch (my nominee for best screen name in the political blogosphere) at Daily Kos)

Image:  Jorge Barrios, “Un hombre soldando al arco una reja.” 2007.  Source:  Wikimedia Commons.