If you watch a pot, will it actually boil?

I’ve always thought I knew what “a watched pot never boils” means.

Like many others, I presumed that a pot only seems to boil more slowly when you pay attention to it, and that the best way to make a pot boil “faster” is to distract yourself with another task or thought.

But, while making lamb stew tonight, it occurred to me that I might be wrong. After all, you can make a pot of water boil faster by putting the lid on it, to conserve heat. Maybe it’s because I’m presently enjoying Wolf Hall, but I realized there was a time in the not-too-distant past when cooking required wood, and that wood might be scarce or expensive, at least for some people.

If you’re conserving wood or coal, I thought, your cooking fire might stay relatively cool by remaining small compared to the pot you were cooking in. Like a stove set on low, such a fire might bring the pot’s contents just barely to a boil with the lid on. And taking the lid off, to check on it, might lower the temperature below boiling.

Thus, I realized, if you constantly watched this pot, leaving it lidless, it would not boil at all!

Does anyone know more about the origin of the idiom, and can confirm or deny this understanding?

In Praise of Good Obituaries

If not for his obituary in the New York Times, I might never have heard of Lionel Casson, a historian focused on ancient maritime history. I’ve now placed one of his books on hold at the library, and hope to read it shortly.

The obituary was pretty near perfect: a description of Dr. Casson’s accomplishments and what made them unique and interesting, including an evocative distillation of his central subject. It ended with a brief but witty anecdote.

In short, this obituary celebrates the man and his work, and generates additional interest in what might be an obscure subject of which most people, myself included, have never heard.

Five Reasons Farhad Manjoo is Doomed

All right, maybe he’s not doomed, but once again I’m less than impressed with Slate’s technology columnist, Farhad Manjoo, who says today that Google’s Chrome OS is doomed.

His article starts out by admitting that Google’s Chrome OS (which I’ll abbreviate GCOS, even if that name has been used before) is intended, at least initially, for Netbooks.

Nevertheless, his first point (“Linux is hard to love”) complains that it’s hard to install software on Linux and that Linux doesn’t have much hardware peripheral support.

Of course, partnering with hardware manufacturers should address immediate peripheral support; the only other hardware most people want on a netbook amounts to 802.11 support, a CDMA or GSM wireless card from one provider or another, and maybe an external mouse, if decadence is the order of the day.

His second point (“We aren’t ready to run everything on the Web”) is exactly why GCOS is being first targeted at Netbooks. They’re called Netbooks for a reason: most folks aren’t running Microsoft Office on these little boxes.

It’s true that we aren’t ready to run everything on the Web just yet. But Google isn’t shipping GCOS on everything just yet, either.

His third and fourth reasons (“Microsoft is a formidable opponent” and “Google fails often”) are true. But they apply equally to every product those vendors ship. If Manjoo were writing about the Zune, would he say that Apple is doomed? Unlikely—even if Microsoft shipped a great product. Does Manjoo think that Google Mail is doomed now that it’s left beta? Of course not.

Really, points three and four are just puffery to stretch the whole article out to five points. Readers like lists. Well, editors do, anyway. And readers certainly click on “top ten” and “five reasons” headlines. But I’m not sure that readers are any more satisfied after having clicked through, given how often the analysis in these lists is poor.

Manjoo’s final point claims that “The Chrome OS makes no business sense,” because Google is giving it away for free. His central claim in this section is that it’s “a wasteful customer acquisition expense”: it’s better for Google to spend more money improving its advertising engine than to branch out into new areas.

Correctly, he notes that the primary point of Chrome OS, from a business standpoint, “is to screw with Microsoft.” I think that’s defensive: every dollar that Microsoft spends chasing the netbook market and protecting its desktop franchise is a dollar it’s not spending on Bing. Which I haven’t tried, but I hear is pretty good. So by attacking Microsoft, Google is protecting its own core franchise. (I doubt that Google’s Web search improvements are hampered by pouring resources into GCOS.)

Within his “no business sense” claim, Manjoo also suggests that GCOS as customer acquisition is unnecessarily expensive because Gmail and Google Docs already run on Windows. The cost of customer acquisition may be high, but the rate at which customers defect back to Microsoft Office would be infinitely lower on an operating system that doesn’t run Office.

Really, though, whether Manjoo is right or wrong is beside the point. In a prior life, I wrote a column about computer security. My job, distilled to its most vaporous essence, was to be controversial, and attract readers; whether I was right or wrong was beside the point, as far as the bottom line was concerned. And make no mistake, attracting readers and clicks was the real objective. The content that did so was secondary.

Thus blogging about Manjoo’s article helps accomplish his real goal: being controversial enough to attract readers. I’m torn between a cynical acceptance of that answer—which would push me to stop reading the damn articles, since they’re worthless from an analytical perspective—and a utopian wish that higher-quality analysis would attract more readers, and be more valuable than “top five reasons X is doomed” journalism.

But how many of you clicked on this article because of its title?

Two Writing Milestones

Within a single week, I’ve passed two milestones with regard to my writing.

First, I’ve gone into positive territory on my royalties for Think Unix. Yes, after nearly eight and a half years I’ve earned back my advance, and am now owed approximately three dollars and seventy-five cents by the publisher.

I’m exceedingly pleased that people continue to read and purchase this book, and that, except for the two chapters on Unix GUIs, the book has remained useful. I wanted to write an “evergreen,” and I feel like I succeeded. Not that I couldn’t improve the book, or that there aren’t things I wish I’d done better, but I think I did pretty well.

Second, I’m pleased to announce that a short story of mine is being published. I’ve waited until the magazine was printed and ready to go, as I’ve had things fall through in the past – but you can buy issue one of The Ne’er-Do-Well Magazine, which contains my short story “Lodestar.”

If you’ve read previous versions of this story, I’d encourage you to buy the magazine and re-read it, as it’s been substantially revised. Sheila, the editor, is exceedingly perceptive, and her input did the story a lot of good. I’m looking forward to my copy arriving, and reading the rest of the pieces too.

As a teaser, an unrelated short-short, “I still get pictures from him sometimes,” is on the magazine’s site, along with short-shorts from other contributors.

Absolutism is the New Relativism

In an eye-opening article in Slate, Fred Kaplan writes that Condoleezza Rice “invokes her academic credentials to evade responsibility for decisions that she’s made or for policies that she’s helped devise.” More specifically, Rice argues that as a student of history, she has learned that far-future consequences are unforeseeable, that the now-seemingly-negative may turn out to be positive, and vice versa — and that, because we can’t predict how her decisions will be judged in thirty years, a hundred years, or a thousand years, we must not judge them today, either.

This may strike many people as both eminently true and eminently indefensible; after all, we still have to make decisions, and build on those decisions, even if we can’t know what our great-grandchildren’s great-grandchildren will think, even if our goal is to make choices that will enable those distant descendants to exist and to thrive.

To me, the interesting aspect of her comment is that her eschatological objectivity (in The End, we can and will know) brings Rice and her fellow conservative academics to the same place as the radical subjectivity of their left-wing postmodern academic opponents in the culture wars, a position for which the postmodernists were soundly spanked by the good upstanding believers in absolute, objective reality.

Of course, the news isn’t that people engaged in politics (even academic politics) pillory their opponents for things that both sides do for opposite reasons. The news (if such an aphorism can be news) is that the poet was right: extremes meet.

There oughtta be a word…

… it’s not notorious, exactly, nor is it infamous. It is a work of art or a cultural phenomenon that is wrongheaded and odious, but either incredibly influential or a perfect evocation of the zeitgeist. Examples might include, depending upon your political and/or artistic persuasion, Bill O’Reilly, Michael Moore, Robocop, Magnolia, Lydia Lunch, Paris Hilton, Duran Duran, or KISS.

On Jargon

I was reading somewhere about how men talk about money as sport, and that this is why they use terms like “Price/Earnings Ratio,” which the article termed “jargon.” The article went on to suggest that it was the use of such jargon that accounted for women’s lack of interest in discussions about money.

Maybe people (both men and women!) use terms like “Price/Earnings Ratio” because no other phrase captures the meaning. In fact, both “price” and “earnings” in that term have complex definitions that refer to other compound concepts.
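
To make that concrete, here’s a minimal sketch, with invented numbers, of what that one term compresses: the “price” is a market price per share, and the “earnings” are themselves a ratio, net income spread across shares outstanding.

```python
# A toy illustration (invented numbers) of what "price/earnings ratio" packs
# into a single phrase. Each input is itself a compound concept.

net_income = 4_000_000.0        # annual profit, in dollars (hypothetical)
shares_outstanding = 2_000_000  # number of shares issued (hypothetical)
price_per_share = 30.0          # current market price of one share (hypothetical)

# "Earnings" here means earnings per share: profit divided across the shares.
earnings_per_share = net_income / shares_outstanding

# The P/E ratio: dollars paid for each dollar of annual earnings.
pe_ratio = price_per_share / earnings_per_share

print(f"EPS: {earnings_per_share:.2f}, P/E: {pe_ratio:.1f}")
# With these numbers: EPS = 2.00, P/E = 15.0
```

Spell all of that out every time, and the conversation never gets past definitions; the term exists so it doesn’t have to.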

In other words, a good reason to speak in “jargon” is that it offers shorter terms with more precise shades of meaning than “regular” language allows. The idea that experts have amassed a vocabulary for the sole purpose of excluding others speaks either to insecurity on the part of those who hold this idea, or to a belief that there can be no more precise thoughts or meanings in that entire field of endeavor than they already understand. Put another way, they must believe that there’s no content to the field.

One couldn’t talk about mood using only the words “happy” and “sad.” We need a larger vocabulary to accommodate important concepts: “frustrated” means something different from “irritated,” “amused” isn’t the same as “schadenfreude,” and so on. We don’t consider this jargon because we assume normal people understand these shades of meaning. For someone less attuned to others’ moods (e.g., someone with Asperger’s), these words could be construed as jargon.

Found in Translation

A new employee, who is Japanese and, I believe, working in Japan, sent out an e-mail to the whole company. Its subject: “Whispered Self Introduction.”

The notion of a whispered self-introduction seemed both apropos and beautiful; I don’t know if it’s a literal translation of something Japanese, or just a fortuitous use of English.