BOOK REVIEW: Barbara Ehrenreich is old enough to die, but still has plenty to say

In Gulliver’s Travels, the titular character visits many strange, foreign lands in the service of satirist Jonathan Swift’s desire to poke fun at the flaws of the culture of his time and to talk regularly about human excretions.

Adaptations, especially those aimed at children, tend to only reproduce the book’s evocative imagery of being a giant among the tiny Lilliputians and being doll-sized among the towering Brobdingnagians. They leave out the scenes of defecating enough to fill a miniature church or being forced to watch colossal serving girls urinate. Laputa, the floating island of scientific wonder, sometimes appears in other contexts such as Japanese animated film, but without the associations of trying to turn digested food back into something edible. The humanoid primitives, Yahoos, survived into common parlance better than the rational, equine Houyhnhnms, but without the association of feces-flinging.

One brief, excreta-free section from Gulliver’s Travels is almost never reproduced at all: Gulliver meeting the immortal struldbrugs. These are a special breed of human able to live essentially forever, but without eternal youth. Their teeth fall out, their eyesight and hearing fail, their memories dull, they aren’t allowed autonomy or property ownership, and eventually they can’t communicate even with each other because their dialects grow mutually unintelligible. The people of the land of Luggnagg are thankful for death because they’re constantly reminded of what the real alternative is.

I’m not the first to point out that modern medicine has made something like this cursed immortality a reality for us, but for most people, an appreciation of the inevitability, and even the relief, of death continues to lag behind.

Barbara Ehrenreich is definitely not counted among such people, and her latest book, Natural Causes, is a short, solid piece of prose about what it means to suffer from age, what it means to accept the reality of death, and the sorts of things a person ought to consider when weighing both.



As Billy said, ‘Brevity is … wit.’

The other day, Ralph Fiennes, the famous British film actor who also loves stage acting, said he does not so much love the current direction of language.

“We’re in a world of truncated sentences, soundbites and Twitter,” Fiennes said, being quoted for a soundbite. “(Language) is being eroded — it’s changing. Our expressiveness and our ease with some words is being diluted so that the sentence with more than one clause is a problem for us, and the word of more than two syllables is a problem for us.”

And he’s worried about the relevancy of Shakespeare going forward, now that, he says, young drama students are having more trouble with the Bard than those of a few generations ago would have (one wonders whether Fiennes really remembers how well those young students did generations ago). He’s worried about how you perform plays full of multisyllabic words when the direction of language, read, spoken and understood, is more Hemingway than Faulkner.

I used to be with Mr. Fiennes on this, and in college I really worried that this kind of written language would be the equivalent of Newspeak from George Orwell’s 1984. For example, the text “I love you” is soul-baring, while “luv u ;)” is common, casual, and expresses nothing. It’s not even “doubleplusungood,” as a character from Orwell’s novel would be expected to say; it’s “++ungud.”

It’s not that you’d have to censor people anymore; they wouldn’t be able to articulate anything meaningful, let alone seditious. (That was my thinking.)

But that’s a very college sort of thought to have. And you see writers who have just graduated, even journalists supposedly trained to be concise, wanting to write with the biggest word that comes to mind, maybe even because it’s the first. School trains you to prove that you’re intelligent and educated more than that you’re actually a good writer or know what you’re talking about. “First thought, best thought” — but only if your first thought is actually good at communicating.

Wasn’t it Shakespeare who said, “Never make use of a sesquipedalian word when a diminutive one will suffice”?

Or was he the one who said, “Brevity is the soul of wit”?

There’s certainly nothing wrong with great-big words, and they are often good to know, especially knowing when not to use them. Twitter, in common usage, may be a gift for people to concisely say stupid and empty things. But that’s what most people say anyway. The longshoreman philosopher of the mid-20th century, Eric Hoffer, said there wasn’t an idea that couldn’t be expressed in 200 words.

“But the writer must know precisely what he wants to say,” Hoffer cautioned. “If you have nothing to say and want badly to say it, then all the words in all the dictionaries will not suffice.”

You may need more than one tweet of 140 characters to get the full thing across, but you’re also going to make every letter count. You’re going to spill over the limit and go back and look at what you’ve written. Have I expressed this in the most effective way possible? Why am I wasting space on adjectives when I could use a more inherently evocative word (“walked without hurry” vs. “sauntered”)? If someone reads only this message, how can I make it memorable and impactful on its own?

Writing has always been easy; so too chatting and tweeting. But good writing has always been heavy labor; it’s just that the form has changed.

The future belongs to the aphorist. And I’m OK with that.

Which of these magical objects would you take?

The other day I saw a poll, one of those ridiculous hypotheticals that I think was designed to test a person’s personality, although what exactly was being tested and what the answers meant are still beyond me.

There were four magical objects offered, each with an impossible ability but also a limitation. They were a notebook, a car, a pen and a wallet, and these were their descriptions:


We say we value honesty, in ourselves and in other people, but hard as telling the truth may be, it’s still much easier than hearing it. “We lie loudest when we lie to ourselves,” but intellectual dishonesty is as much a problem of perception as deception.

Humans are rational beings, or so I’m told. Occasionally, we gather evidence to reach an unbiased conclusion, check the facts to arrive at a reasoned answer. More often our reasoned conclusion is the result of irrational prejudices, or at least of subjective opinions framing and coloring what information we receive and how we sort it.

In the 19th century, Quakers and Virginians were reading the same Bible, so far as I know. Yet somehow they managed to come to completely opposite conclusions about the place of slavery in Christianity. To the Virginians, God had created blacks as mentally inferior; therefore their natural place was under the control of white masters. A few decades later, the curse of Ham had been replaced by the science of Darwin, now “objectively” proving that the Negro was naturally, biologically inferior. The reasoning changed, but the conclusion remained the same.

Today people look at test scores and poverty rates and alternately prove that African-Americans are biologically/culturally inferior to whites or victims of a structurally racist society. “Just look at the evidence!” both sides say. “It’s plain to see.”

That’s a poor example, at least today. It’s not an issue with a 50-50 split anymore, but as the authors of The Bell Curve or James Watson show, very intelligent people can gather a great deal of evidence and see in it whatever they already want to see.

A better, or fairer, example is the Second Amendment in the Bill of Rights, commonly called “the right to bear arms.” On this subject, you will find Clyde Jr. of Arkansas and Antonin Scalia holding roughly the same views. They support those views with entirely different levels of complexity, but in the end, they support the same thing. Ruth Bader Ginsburg is no less qualified a legal scholar than Scalia, and she has probably read all the same books, histories and decisions he has, but her conclusion is more in line with a pot-smoking hippie’s.

It bothers me very much to read editorials, most of them written by intelligent people, claiming that the Second Amendment clearly says this or that when the only thing clear about it is that when you read it for yourself, “the right of the people to keep and bear arms shall not be infringed” comes after a very clear qualifier: “A well-regulated militia being necessary.”

That muddles things. Historical context muddles things. Just to what extent “arms” was supposed to encompass then and now muddles things.

I’m phrasing things in this way because I’m actually sympathetic to the gun rights cause, and that makes me more sensitive to and critical of how they go about things.

I would support a city’s right to restrict gun rights based on my personal libertarian principles of local control and heterogeneous laws. The free market of ideas, competing sociological laboratories, etc., etc. Of course, the Supreme Court ruling came in a case about the District of Columbia; setting that aside, whether gun ownership is a right is not a question that can be answered by reading the Constitution, and certainly not by the chicken-bone soothsayers running around today.

The only way to answer the question honestly is to surrender all claims of superseding authority and make the most convincing argument you can at a fair level of discourse.

That is, start with the question of whether gun ownership is a natural right. There’s no document, political, religious or otherwise, to answer this question, just your own reason and beliefs, and it’s either a yes or a no. For me, it’s pretty obviously “no,” because guns have only existed for the past 500 years or so, and one would think intrinsic rights would be as old as our species. But gun ownership may be a derivative right of something else, namely the right to self-protection and defense. Just as, in the days of Og bonking Unk on the head with a club, Unk was entitled to a club or rock of his own, so too a world of guns entitles us to guns for the same purpose. This seems sensible enough to me.

The question, as a libertarian, is always where your rights stop and another’s begin. Where is the border between your right to protect yourself and another’s right not to be threatened? That’s an important question, and consulting the Constitution does nothing to answer it. A nuclear weapon can’t be acceptable in private ownership just because the founders hadn’t the foresight to prohibit one (and I’ve heard some arch-libertarians sincerely make that argument).

I heard Scalia tell a story from his high school days: a classmate complained about reading Shakespeare, and the teacher said, “Sir, when you read Shakespeare, he isn’t on trial; you are.” In the same way, the traditions and laws aren’t on trial by contemporary measures; we are.

Well, if we are on trial, the judges are absent. The Founders with a capital “F” aren’t here, and in a sense they never were. Jefferson’s vision of the nation is no more valid than Hamilton’s or Adams’ or Washington’s. When we use them to parrot our own opinions and to substitute for persuasive argument, we may be doing our best to tell the truth, but ultimately we’re lying.

The best way to be an honest person may just be to admit when you’re telling a falsehood. The best way to be intellectually honest, then, may be to admit your biases and work around them as best as you are able, lying as you go, but not compounding the lie with claims of impartiality.

We’re only impartial to things we care nothing for, and rarely does anyone comment at any length about things they care nothing for.