Archive for the ‘Engineering’ category

The Way David Macaulay Works

August 27, 2010

Something of a Friday brain dump seems to be going on chez Inverse Square. I’m trying to help out a little on what would for me be another David Macaulay-hosted project.

(Info on the first one in which I participated — the Peabody Award-winning Building Big — is here.)

That led me back to the video of the talk David gave at MIT a couple of years ago, when I had the fun of hosting him for a visit of a few days. (I’m the guy introducing David whilst forgetting to introduce myself or my program — an academic rookie’s mistake.)

But David makes no such errors. The talk, titled “The Way David Macaulay Works,” is a wonderful, fully illustrated tour through his career and the process through which he investigates the built and the natural world. Have fun.


Why Obama is right and McCain wrong on Energy: MIT edition

August 1, 2008

Continuing the energy theme just a little longer….

This may be a bit of home-institution boosting, and I haven’t done any due diligence on this press release, but still, this is promising news out of Daniel Nocera’s lab at MIT.  It is also a perfect example of why Obama’s emphasis on alternatives to oil and coal is the better governing philosophy for US energy policy, and McCain’s oil-now, oil-forever approach is not.

Nocera and his postdoc, Matthew Kanan, have taken a long look at the process of photosynthesis that enables plants to extract usable energy from sunlight.  They’ve come up with a two-step process that can split ordinary, neutral-pH water into hydrogen and oxygen, supplying the feedstocks for fuel cells that could generate electricity to power cars, homes, or whatever.  The key to the idea is the use of solar-generated electricity to power the electrolysis taking place in the Nocera lab’s device.  There is more detail in the press release, and Nocera’s general description of this line of research is here.
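For scale, the thermodynamics of water splitting puts a floor under what any such device can do. The sketch below uses standard textbook values, not numbers from the press release, and the 70% electrolyzer efficiency is purely an illustrative assumption:

```python
# Back-of-envelope energetics for splitting water: 2 H2O -> 2 H2 + O2.
# DELTA_G is the standard Gibbs free energy per mole of water split --
# the thermodynamic minimum electrical input. Values are textbook
# figures, not taken from the Nocera lab's results.
DELTA_G = 237.1      # kJ per mol of H2O (and hence per mol of H2 produced)
KJ_PER_KWH = 3600.0  # one kilowatt-hour in kilojoules

def kg_h2_per_kwh(efficiency=0.7):
    """Hydrogen produced per kWh of electricity at a given electrolyzer efficiency."""
    moles_h2 = (KJ_PER_KWH * efficiency) / DELTA_G  # one mol H2 per mol H2O split
    return moles_h2 * 2.016e-3                      # molar mass of H2, kg/mol

# One kWh at an assumed 70% efficiency yields roughly 21 grams of hydrogen.
print(round(kg_h2_per_kwh() * 1000, 1))
```

So a solar panel’s daily output translates into tens of grams of fuel-cell feedstock per kilowatt-hour — modest, which is part of why the engineering scale-up matters so much.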

There is, as always, the caveat:  this is a research finding, not an industrial process.  It will take time and significant engineering creativity to turn this advance into a major source of energy and a partial replacement for carbon-based fuels — if it ever gets there.

But this is the necessary initial step.  You don’t get alternative energy unless you do the research.  You can’t do the research if you can’t get funding.  It is difficult — though, as this finding shows, not impossible — to pay for this work when you have an uninterested or actively hostile, petroleum-addicted President and administration.  A President Obama would fund it — candidate Obama made that very clear as recently as yesterday, whatever the national press thought the important news of the day was.  A President McCain, delivering on candidate McCain’s promise to develop all available domestic sources of oil…not so much.

Here’s the MIT press release making the point for me:

The success of the Nocera lab shows the impact of a mixture of funding sources – governments, philanthropy, and industry. This project was funded by the National Science Foundation and by the Chesonis Family Foundation, which gave MIT $10 million this spring to launch the Solar Revolution Project, with a goal to make the large scale deployment of solar energy within 10 years.

You don’t get what you don’t pay for.

And as a lagniappe, this bit of barely informed editorializing:  the reason McCain’s approach is wrongheaded is not just that it encourages the use of polluting sources of energy instead of pursuing clean or cleaner sources; it’s not that there is some mystical reward to using a renewable source as opposed to a notionally available, notionally cheap(ish) nonrenewable one — this isn’t a tree-hugger argument.  No, it’s wrong because it increases the likelihood that the transition we will someday have to make to a non-oil-based economy will come harder, more expensively, and more destructively than it needs to, or would under a more science-friendly approach.  The real energy question is when, and how much, you want to pay the piper.

That is:  McCain hasn’t noticed, though he has surely been told, that oil is something of a mug’s game, coming under pressure from both the supply and demand sides.  Between peak oil and the rise of major developing nations, economies that remain tied to oil are buying into not just increasingly high prices for their energy, but also — I would bet, on nothing more than a hunch — an increasing risk of oil shocks: major disruptions in supply and/or price over the coming decades.

That, as much as the absolute cost of energy as a share of any economic activity, is what ought to scare people (if my hunch is correct).  Major uncertainty is a very expensive quality; when the probability collapses into a particular damaging event, the impact on real people’s real lives is profound.  Why on earth should we place ourselves more in the path of such an oncoming train than we have to?

And one last note:  as I’ve given Marc Ambinder some eminently deserved grief (hey — if he can assert his judgment as fact, so can I) for his blithering yesterday about why he isn’t talking about energy, he has a solid post about Obama’s economic message today that contains a bit of content reporting and a bit of process analysis.  Nothing fancy, just an example of a beat reporter writing a clear and useful little story from within his defined territory.  Credit where credit is due.

Updates on the 100 mpg car front

July 28, 2008

Way, way back when there was a Republican fight for the nomination, Mike Huckabee made a little splash by calling for a one-billion-dollar prize to encourage the production of a generally available car capable of 100 mpg.

I ridiculed Mike here and here, mostly because (a) the billion bucks was such a wildly disproportionate reward, given that the X Prize on offer for the same basic goal seemed to think ten million would do the trick, and (b), at least as important, at least one production vehicle on the verge of release, the Tesla Roadster, could already lay claim to the milestone. (Latest news: as of a couple of weeks ago, 12 production vehicles had been completed, with the assembly line cranking away at a blistering four vehicles a week.)

But what I want to highlight here is the power of four-buck-a-gallon gas to concentrate the mass-market manufacturer’s mind.

Most immediately, it looks like the GM Volt is real as of 2010 — though at a higher price than originally proposed, 40K instead of around 30K. It will have an MPG equivalent of 150 mpg running on its electric motor, a figure that will drop whenever the range-extending gasoline engine gets called into use. GM also has a Saturn Vue plug-in SUV project in the works, and Toyota is working on its own plug-in response, with a current, very-short-range claim of 99.9 mpg.
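Those “MPG equivalent” figures come from converting electricity consumed into a gasoline-energy equivalent. Here is a rough sketch of the arithmetic, using the convention that a gallon of gasoline holds about 33.7 kWh; the 225 Wh/mile consumption figure is an illustrative assumption, not a measured number for the Volt:

```python
# MPG-equivalent for an electric drivetrain. The 33.7 kWh-per-gallon
# energy-content convention is standard; the consumption number fed in
# below is illustrative, not a manufacturer spec.
KWH_PER_GALLON = 33.7

def mpge(wh_per_mile):
    """Miles per gallon-equivalent, given consumption in watt-hours per mile."""
    return KWH_PER_GALLON * 1000 / wh_per_mile

# A car drawing about 225 Wh per mile lands near the 150 mpg figure quoted above.
print(round(mpge(225)))
```

The point of the exercise: the electric-mode numbers are driven entirely by watt-hours per mile, which is why the claimed figure drops the moment a gasoline range extender kicks in.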

But what caught my eye today was this report from the Green Car Congress, showcasing the British Motor Show’s latest offerings of cool to cute electric, energy efficient cars.

The headliner? The four-seconds-to-60, ten-minutes-to-recharge Lightning GT. 300 large, I’m afraid, so this is another pure fantasy. But taken all in all, and never forgetting the 350 mpg personal transportation available in the form of this electric scooter, it looks like the use of market mechanisms to control greenhouse gas emissions is, pace the McCain campaign’s whispered walk-back on the issue, working just as the Econ 1 textbooks tell you it should.

Image:  Lightning GT, Lightning Car Company photo.

Worst use of technology nominee: Food and Beverage Division

June 18, 2008

Caution: bad-tempered vent to come.  Coors is the target, along with its advertising goons.  Avoid if such old-fogeydom annoys you.


I love science.  I really love technology.  I’m a toy and gadget freak.  I think it is amazingly cool that a bit of engineering mojo produces stuff like this.

But I have become truly sick of this.  Leave aside the raw contempt the associated ad campaign has for both stupid wives and boorish husbands…just stop to think about all the engineering talent that Coors brought to bear on  the design problem involved in making “The new vented wide mouth directs airflow into the can to enhance the swigging experience for can drinkers.”

Enhance the swigging experience?

Excuse me.  Just say it.  Time to chug.

Pity the poor team, up against the launch deadline, doing their 18/7s, working out the perfect size and shape and airflow and the rest, and then suddenly looking up to realize that their accumulated decades of person-years of study and experience had just been devoted to the task of speeding frat boys (superannuated, if the ad series is to be believed) toward their desired level of alcoholic coma.

All those problem sets and robot labs for this?

Just for the record:  it’s not beer that’s the problem (though it remains an open question how much violence one does to the language by calling Coors “beer”); what bugs me is the sheer mindlessness of the product differentiation game being played here.  Does anyone out there really care about the hole in the top of their beer cans?  If you want to gulp it down faster…just put it in a plastic cup or ten.  Otherwise, just shut up.

Image:  Henri de Toulouse-Lautrec, “Monsieur Boileau,” 1893.  Image:  Wikimedia Commons.

Off-Label use of a DKos Post

June 12, 2008

Check out this — not so much for the snark about McCain, but for the delightful gallery of (period) appropriate tech.

Actually, while I enjoy a good, well-prepared, someone-with-not-enough-to-do, professional-grade snark as well as the next blogger, I fear that the author, DHinMI, is being a bit unfair here. [Of course s/he’s being unfair. It’s a blog, bozo! It’s a sarcastic bit of fun for the morning! Get a life. –ed.]

No, no — not unfair to McCain; he’s fair game, and if he didn’t want to be twitted for his age, he should have won in 2000.

No, what impressed me about this gallery is the degree to which the technology and experience at the end of the 19th century was so much more like our own than it was like that of the generation of the founders a century before.

Look at the photos on offer: long-distance communications; mass transport; medicine (not really represented in this gallery), which, for all the easy humor, at least had the germ theory and a grasp of infection and of the need for sterile conditions in hospital operating suites; new energy sources; organized, professional, government-run emergency services; mass visual media; and, perhaps above all, electricity with which to make so much of the rest go.

Compare this, as I once heard the great physicist and teacher Philip Morrison do, to the situation in 1800. Whale oil as the primary source of light with which the reading and writing public could extend their work into the night. Slow transport, entirely powered by one’s own body, one’s horse, or by wind or water. Debridement and then amputation as the primary therapy for infected wounds. Communication beyond line of sight/hearing proceeding at the same rate as the transport of other goods: slow, slow, slow. And so on.

Morrison, in the lecture I heard, went into detail about the operations of a major wheat growing operation in the upper midwest in 1900. The web existed — or rather a web, a network; telegraph communications enabled the farm’s owners to follow grain prices around the world on a daily basis. Rail transport meant that the threshed wheat from that farm could enter that global market in a timely way.

Chicago, the nearest major city, was home to 1.6 million people. All those people consumed with a vengeance: in the landmark Marshall Field complex on and around State St. in the first decade of the twentieth century, the famous department store employed 12,000 people, doing 25 million dollars in retail business and twice that, 50 million, in wholesale business around the world. The technology needed to permit such enormous agglomerations of people advanced too — Chicago’s indoor plumbing required continuous tending, culminating in the opening of the city’s new, model sewage system in 1900, centered on a canal that could carry 600,000 cubic feet of water per minute.

All of which means that Morrison’s wheat farmer, some miles out of town, was, all of a century ago, completely innocent of HTML and the joys of a 3G iPhone — but was nonetheless enmeshed in a global system of information exchange and commerce, mass produced consumer goods and entertainment (even recorded music, via the mass market business in player pianos that boomed with new technology in the 1890s and 1900s).

To put it another way: I can imagine myself adapting pretty readily to life in my current home of Boston in 1900. 1800? Not so easy, I think.

So, channeling a little bit of that remarkably clear thinker, the late and missed Professor Morrison: it’s always tempting to think that what’s happening right now is so new, so wonderful, that it is without precedent in human experience. But there has been a whole lot of such experience over time, and sometimes, at least, the newness of technology is in the ease of what it enables, and not in its pure, raw novelty. That is: a question one should ask of the past is not just “how far?” but “how near?”

(Not that any of this, of course, makes me want a president more comfortable with a Hollerith calculating machine than the device on which I compose this.)

Image: Camille Pissarro, “Place du Havre, Paris,” 1893. Location: Art Institute of Chicago. Source: Wikimedia Commons.

Just in Case You Were Wondering…

June 11, 2008

…When a lab created black hole might next form and end life as we know it….

(Joke, folks, in case you weren’t sure.)

…Follow this countdown to the activation of the Large Hadron Collider. (h/t Peter Steinberg via Planet Musings.)

Given that, by pretty much any standard I can think of, the LHC is the most complex machine ever built, this seems like a milestone worth noting.

One thing that does strike me in this last month before lift-off (or perhaps better, dive-in) is the seeming reversal of roles: how often breakthrough science turns on top-flight engineering.

That is: a ton of science turns on instrumentation. A leap in the power of key instruments produces not just better data, but qualitatively new information. Think of how much of modern astronomy — and really, modern cosmology — turns on the twin transformations in the size of the light buckets of modern telescopes and the enormous increase in the resolution and throughput of spectrographs. Everything from exoplanets to the fundamental questions raised by the observation of dark energy emerges directly from the engineering advances that produced the observational astronomy renaissance of the last two decades. (Many of these advances, to be sure, were led by scientist-engineers, among whom Jerry Nelson may be taken as the type specimen.)

High energy physics is in the same boat, perhaps more so: when and as observation of the universe fails to supply sufficient data (see above), only large machines focused on very small spatial interactions can do the job. It’s a cliché to call accelerators the telescopes of the microcosmos, but the analogy ain’t bad. It is precise in this way: each significant increase in the power of the two types of instrument yields new science. The making of the tool precedes the discoveries that we then, rightly, celebrate.

Which is my point: engineers take their lumps for, in the phrase I remember from a now-mislaid Seth Lloyd interview, trading in science so well established that even engineers can understand it. See xkcd’s take for the succinct version of the basic trope.*

Well, for the last ten years or so, it has been the engineers’ ascendancy. In a few weeks and over the years to come, physicists will again dominate the life and meaning of the LHC. Consider this a tip of the hat to the extraordinary creative skill that will permit the glamorous side of high energy physics to strut the catwalk once more.

*There is also J. Robert Oppenheimer’s “compliment” to the chemist George Kistiakowsky, whose leadership of the implosion group was essential to the completion of the Manhattan Project’s plutonium bomb. In an interview late in his life, conducted by Carl Sagan and ultimately edited and broadcast on NOVA, Kistiakowsky said that Oppenheimer told him that, as a chemist, he was a very good third-rate physicist.

More on the fate of science under Bush (and McCain?…)

May 9, 2008

See this comment from Kevin on the Daily Kos thread responding to the McCain/science post below.

Kevin wrote:

Thoughts from a Cancer Biology graduate student

I’m new to the site, but I just thought i’d throw my two cents in here. I’m finishing up my PhD in Molecular Cancer Biology at Duke University and I hope to give you some insight as to how bad things are getting in the scientific community. When i first entered graduate school in 2002, nearly 25 percent of all new grants were being funded by the NIH. Now, slightly more than 10 percent are. This has led to limited job opportunities for graduating students, a smaller group of professors holding a larger piece of the NIH pie (fewer new ideas and perspectives on complex and longstanding problems), and will surely have long lasting consequences on the ability to recruit new brilliant minds as the job market continues to decline.

I urge all to speak to your congressmen, and speak up about a problem many will talk about and few will actually do anything for. You can also find out more information at the American Association for the Advancement of Science website

Technology is at the heart of almost all new invention. At a time when we need great thinkers to solve problems inherent in the U.S. and clearly the rest of the world (i.e. global warming, petroleum dependency, health sciences research and yes, even our countries defense capabilities) the Bush administration has taken away funding and slowed the progress that we’ve been moving towards in all these areas. Unless steps are taken soon, our ability to solve these problems will be greatly compromised in order to pay for a war we dont need, and tax cuts we cant afford.

Pay close attention to the key number in Kevin’s post: there has been a nearly 60% drop in the rate of grants funded by the NIH over the education of one graduate student. Similar cutbacks are occurring at other major science and engineering funding agencies.
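For anyone checking the arithmetic behind that “nearly 60%” figure: a success rate falling from roughly 25 percent of applications funded to roughly 10 percent is a 60 percent relative decline, even though the absolute drop is only 15 percentage points.

```python
# Relative vs. absolute decline in a funding success rate.
# Inputs are the approximate percentages from Kevin's comment.

def relative_decline(before_pct, after_pct):
    """Fractional decline of a rate, relative to its starting value."""
    return (before_pct - after_pct) / before_pct

# From ~25% funded to ~10% funded: a 60% relative drop.
print(round(relative_decline(25, 10) * 100))
```

The distinction matters: “down 15 points” sounds survivable; “three in five previously fundable grants now go unfunded” is the version a lab actually experiences.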

Everything Kevin says about the consequences of such a decline is true: fewer grad students; fewer jobs for newly graduated researchers (not to be confused with graduated beakers); shrinking incentives for technically or mathematically skilled undergraduates to consider science or engineering basic research as a career, and so on.

The larger consequences follow on with shocking speed. It takes a long time — decades — to build up a research infrastructure. Labs, space, machines — but above all people who have ideas and time and room enough to pursue ideas that don’t work out (most of them) and the few that do. (Take a look at this NOVA program about Judah Folkman for the virtues of persistence and the absolute necessity of an ongoing flow of grad student and post doc money to produce important results.)

As Kevin argues, it takes much less time — years, maybe a decade — to unravel the technical capacity to do research. To take an example from the engineering side of things: as late as 1973, with the launch of Skylab, the United States possessed the ability to lift large payloads into orbit, and to carry manned missions as far as the moon, all using one of the true monuments of 20th-century technology, the Saturn V rocket. That was the moon rocket’s last flight. Within a few years, though much of the infrastructure of the moon missions remained, the core manufacturing capacity to build more such rockets was lost.

The consequence: Skylab was designed to remain safely in orbit until 1981, two years past the scheduled debut of the Space Shuttle, which would be deployed to dock with America’s space station (yup, we had one thirty-five years ago) and move the facility to a higher orbit.

Then Skylab’s parking orbit deteriorated early, in 1979. The shuttles, behind schedule, were unavailable. The last Saturn Vs had already long since been mothballed and placed, in some cases, on museum display. The production line had been shut down for almost a decade. A decade after landing men on the moon, the US had exactly no space vehicles capable of carrying humans to near earth orbit.

And now, even though the shuttle does exist, we lack anything approaching the heavy-lift capacity the US space program possessed forty years ago. Hence the very costly, unlikely-to-finish-anytime-soon Ares rocket development project, now scheduled for first flight in 2015, forty-three years after the last American walked on the moon.

That is: to put it in the words of that noted analyst of science policy, Joni Mitchell,

Don’t it always seem to go
That you don’t know what you’ve got
’Til it’s gone

To return to the core theme of this post, this blog, and Kevin’s comment: John McCain’s priorities for federal spending put science funding in deep danger. If we continue to gut funding for basic science research and education, we face the loss not just of specific projects left undone, but of the capacity to do the cutting edge science and technological investigation that is the foundation of our prosperity and our national security.

Usually I illustrate this blog with fine art. But this clip from a seminal work in American motion picture history seems more appropriate somehow.