BOOK REVIEW: Barbara Ehrenreich is old enough to die, but still has plenty to say

In Gulliver’s Travels, the titular character visits many strange, foreign lands in the service of satirist Jonathan Swift’s desire to poke fun at the flaws of the culture of his time and to talk regularly about human excretions.

Adaptations, especially those aimed at children, tend to only reproduce the book’s evocative imagery of being a giant among the tiny Lilliputians and being doll-sized among the towering Brobdingnagians. They leave out the scenes of defecating enough to fill a miniature church or being forced to watch colossal serving girls urinate. Laputa, the floating island of scientific wonder, sometimes appears in other contexts such as Japanese animated film, but without the associations of trying to turn digested food back into something edible. The humanoid primitives, Yahoos, survived into common parlance better than the rational, equine Houyhnhnms, but without the association of feces-flinging.

One brief, excreta-free section of Gulliver’s Travels is hardly ever reproduced at all: Gulliver’s meeting with the immortal struldbrugs. These are a special breed of human who live essentially forever, but without eternal youth. Their teeth fall out, their eyesight and hearing fail, their memories dull, they aren’t allowed autonomy or property ownership, and eventually they can’t communicate even with each other because their dialects drift too far apart. The people of the land of Luggnagg are thankful for death because they’re constantly reminded of what the real alternative is.

I’m not the first to point out how modern medicine has made this sort of cursed immortality a reality for us, but for most of us, appreciation for the inevitability, and even the relief, of death continues to lag behind.

Barbara Ehrenreich is definitely not counted among such people, and her latest book, Natural Causes, is a short, solid piece of prose about what it means to suffer from age, what it means to accept the reality of death, and the sorts of things a person ought to consider when weighing both.


As Billy said, ‘Brevity is … wit.’

The other day, Ralph Fiennes, the famous British film actor who also loves stage acting, said he does not much love the current direction of language.

“We’re in a world of truncated sentences, soundbites and Twitter,” Fiennes said, being quoted for a soundbite. “(Language) is being eroded — it’s changing. Our expressiveness and our ease with some words is being diluted so that the sentence with more than one clause is a problem for us, and the word of more than two syllables is a problem for us.”

And he’s worried about the relevance of Shakespeare going forward, now that he sees young drama students having more trouble with the Bard than those of a few generations ago supposedly did (one wonders if Fiennes really remembers how well those young students did generations ago). He’s worried about how you perform plays full of words with multiple syllables when the direction of the language, as it is read, spoken and understood, is more Hemingway than Faulkner.

I used to be with Mr. Fiennes on this, and in college I was really worried that that kind of written language would be the equivalent of Newspeak from George Orwell’s 1984. For example, the text “I love you” is soul-baring, while “luv u ;)” is common, casual, and expresses nothing. It’s not even “double-plus ungood,” as a character from Orwell’s novel would be expected to say; it’s “++ungud.”

It’s not that you’d have to censor people anymore; they wouldn’t be able to articulate anything meaningful, let alone seditious. (That was my thinking.)

But that’s a very college sort of thought to have. And you see writers who have just graduated, even journalists supposedly trained to be concise, wanting to write with the biggest word that comes to mind, maybe even because it’s the first. School trains you to prove that you’re intelligent and educated more than it trains you to be a good writer or to know what you’re talking about. “First thought, best thought,” but only if your first thought is actually good at communicating.

Wasn’t it Shakespeare who said, “Never make use of a sesquipedalian word when a diminutive one will suffice”?

Or was he the one who said, “Brevity is the soul of wit”?

There’s certainly nothing wrong with great big words, and they are often good to know, especially so you know when not to use them. Twitter, in common usage, may be a gift for people who want to concisely say stupid and empty things. But that’s what most people say anyway. The longshoreman philosopher of the mid-20th century, Eric Hoffer, said there wasn’t an idea that couldn’t be expressed in 200 words.

“But the writer must know precisely what he wants to say,” Hoffer cautioned. “If you have nothing to say and want badly to say it, then all the words in all the dictionaries will not suffice.”

You may need more than one tweet of 140 characters to get the full thing across, but you’re also going to make every letter count. You’re going to spill over the limit, go back, and look at what you’ve written. Have I expressed this in the most effective way possible? Why am I wasting space on adjectives when I could use a more inherently evocative word (“walked without hurry” vs. “sauntered”)? If someone reads only this message, how can I make it memorable and impactful on its own?

Writing has always been easy; so too chatting and tweeting. But good writing is always heavy labor; it’s just that the form has changed now.

The future belongs to the aphorist. And I’m OK with that.

Which of these magical objects would you take?

The other day I saw a poll, one of those ridiculous hypotheticals designed, I think, to test a person’s personality, although what exactly was being tested and what the answers meant are still beyond me.

There were four magical objects offered, each with an impossible ability and a limitation. They were a notebook, a car, a pen and a wallet, and these were their descriptions:

Honesty

We say we value honesty, in ourselves and other people, but hard as it may be, it’s much easier to tell the truth than to hear it. “We lie loudest when we lie to ourselves,” but intellectual dishonesty is as much a problem of perception as deception.

Humans are rational beings, or so I’m told. Occasionally, we gather evidence to come to an unbiased conclusion, check the facts to come to a reasoned answer. More often our reasoned conclusion is the result of irrational prejudices, or at least subjective opinions framing and coloring what information we receive and how we sort it.

In the 19th century, Quakers and Virginians were reading the same Bible, so far as I know. Yet somehow they managed to come to completely opposite conclusions about the place of slavery in Christianity. To the Virginians, God had created blacks as mentally inferior; therefore their natural place was under the control of white masters. A few decades later, the curse of Ham had been replaced by the science of Darwin, now proving objectively that the Negro was naturally, biologically inferior. The reasoning changed, but the conclusion remained the same.

Today people look at test scores and poverty rates and alternately prove that African-Americans are biologically/culturally inferior to whites or victims of a structurally racist society. “Just look at the evidence!” both sides say. “It’s plain to see.”

That’s a poor example, at least today. It’s not an issue with a 50-50 split anymore, but as with the authors of The Bell Curve or James Watson, very intelligent people can gather a great deal of evidence and see in it something they already want to see.

A better, or at least fairer, example is the Second Amendment in the Bill of Rights, commonly called “the right to bear arms.” On this subject, you will find Clyde Jr. of Arkansas and Antonin Scalia holding roughly the same views. They support those views with entirely different levels of complexity, but in the end, they support the same thing. Ruth Bader Ginsburg is no less qualified a legal scholar than Scalia, and has probably read all the same books, histories, and decisions he has, but her conclusion is more in line with a pot-smoking hippie’s.

It bothers me very much to read editorials, most of them written by intelligent people, claiming that the Second Amendment clearly says this or that when the only thing clear about it is that when you read it for yourself, “the right of the people to keep and bear arms shall not be infringed” comes after a very clear qualifier: “A well-regulated militia being necessary.”

That muddles things. Historical context muddles things. Just what “arms” was supposed to encompass, then and now, muddles things.

I’m phrasing things this way because I’m actually sympathetic to the gun rights cause, and that makes me more sensitive to, and critical of, how its advocates go about things.

I would support a city’s right to restrict gun rights, based on my personal libertarian principles of local control and heterogeneous laws. The free market of ideas, competing sociological laboratories, etc., etc. Of course, the Supreme Court ruling came on the District of Columbia, so setting that aside: whether gun ownership is a right is not a question that can be answered by reading the Constitution, and certainly not by the chicken-bone soothsayers running around today.

The only way to answer the question honestly is to surrender all claims of superseding authority and make the most convincing argument you can at a fair level of discourse.

That is, start with the question of whether gun ownership is a natural right. There’s no document, political, religious, or otherwise, to answer this question, just your own reason and beliefs, and it’s either a yes or a no. For me, it’s pretty obviously “no” because guns have only existed for the past 500 years or so, and one would think intrinsic rights would be as old as our species. But gun ownership may be a derivative right of something else, namely the right to self-protection and defense. Just as, in the days of Og bonking Unk on the head with a club, Unk was entitled to a club or rock of his own, so too a world of guns entitles us to guns for the same purpose. This seems sensible enough to me.

For a libertarian, the question is always where your rights stop and another’s begin. Where is the border between your right to protect yourself and another’s right not to be threatened? That’s an important question, and consulting the Constitution does nothing to answer it. A nuclear weapon can’t be acceptable in private ownership just because the founders hadn’t the foresight to prohibit one (and I’ve heard some arch-libertarians sincerely make that argument).

I heard Scalia use as an example that when he was in high school, a classmate complained about reading Shakespeare. The teacher said, “Sir, when you read Shakespeare, he isn’t on trial; you are.” In the same way, the traditions and laws aren’t on trial by contemporary measures; we are.

Well, if we are on trial, the judges are absent. The Founders with a capital “F” aren’t here, and they never really were. Jefferson’s vision of the nation is no more valid than Hamilton’s or Adams’ or Washington’s. When we use them to parrot our own opinions as a substitute for persuasive argument, we may be doing our best to tell the truth, but ultimately we’re lying.

The best way to be an honest person may just be to admit when you’re telling a falsehood. The best way to be intellectually honest, then, may be to admit your biases and work around them as best as you are able, lying as you go, but not compounding the lie with claims of impartiality.

We’re only impartial to things we care nothing for, and rarely does anyone comment at any length about things they care nothing for.

Bible curriculum

I was already out of high school when the Bible curriculum became an issue. Actually, I was a student of David Newman’s at Odessa College when the controversy first took shape, so I was in the front row if not in the ring. While I enjoyed him as a professor, and think the quality of his instruction is an incredible bargain for a community junior college, we disagreed on the Bible curriculum, or at least on how to respond to it.

See, we agreed then, and I assume still do, that the Bible is the single most important work of literature to the Occident. Essentially nothing of significance composed during the past two thousand years can be fully appreciated, or in some cases understood at all, without a firm grounding in Biblical theology, history, and parable. We differed slightly in that he wanted a general Western literature class that would include works like Plato’s Allegory of the Cave, Oedipus Rex, and the Aeneid, while I was satisfied with a course that studied the Bible alone, so long as it was indeed study and not proselytizing (something I wasn’t convinced the original course was).

So our main disagreement was in how to express our disagreement. If he reads this he can correct me, but (in addition to protecting local non-Christians who might be discriminated against), I understand his motivation to be that if someone didn’t stand up and fight it here, it could become common and have a negative effect on our public schools.

Meanwhile, I was and am of the opinion that ignoring things often does help them go away. I don’t base that on wishful thinking, but on my own observations of West Texas. We aren’t actually religious or devout; we just like appearing to be. It is, after all, considerably easier to convince a Christian to wear a cross as a necklace than to give his shirt to a mugger, much easier to bless food or a sneeze than to bless those who curse him. And as years of Sunday School and Big Church made abundantly clear, Christians want to show up only as much as we feel obligated to, and definitely do not want to read the Bible (for ourselves).

It is my opinion that a “Survey of William Shakespeare” class would be a fantastic class for a public high school. But no more than a dozen already highly interested students would be willing to take it. Everyone else would be looking for blow-off classes, because that’s what electives are in high school. Maybe religious devotion would make more kids sign up for a King James Bible course than for Shakespeare, but not that many more.

Making it a controversial issue and a standard for the overtly religious to rally behind only drove up interest: ‘Married… with Children’ syndrome. And even with this, at last count there were still only 38 students enrolled in the class in the whole district. Thirty-eight out of what, 3,000?

So maybe I’m being hypocritical for talking about it now without any newsworthy reason, but I really do think it’s a great idea and something high school kids can handle intellectually, although probably not religiously. What I mean is that when you stop confining yourself to Adam and Eve, Joseph, Samson, David and Goliath, and Jesus’ miracles, you get into some pretty tough stuff. Interesting stuff, too: just from Judges, no one can say the stories of Ehud, Jephthah, or the Levite and his concubine aren’t interesting (and magnificent works of literature), but they start to bring up theological questions that aren’t covered in Sunday School, and in terms of content, it’s the sort of thing that gets books banned from schools by some of the same people pushing for this course.

Then you get into looking at the whole process of redaction and canonization, and most Christians do not want to know about or objectively examine that. It’s gradual, it’s messy, and it’s considerably less simple than, “God wrote it; here it is.” When you get away from Sunday School answers and start being honest, the Bible can be troubling. Some people even lose faith over it.

But the benefits are worth it. When you look at the Bible objectively, understand it historically, and measure it artistically, it quickly becomes clear that whatever God’s involvement in the composition, the end result is divine. Intellectual dishonesty makes the Bible boring. But Bible study, there’s no end to that or its enjoyment.

You gain things from comparing the Epic of Gilgamesh to Noah’s ark. You gain things from looking at the Old Testament as the Tanakh, complete in itself, and not as the prequel to Jesus. You gain things from looking at the development of the character of God in the Bible, and of people’s understanding of Him: from the “Let us” creator, to someone who walks in Eden and fears the sunrise, to the tribal deity that bests the Egyptians and Canaanites, to the still, small voice that follows the people of Judah to Babylon and judges (and forgives) Nineveh.

How nice this would be, to give Jews and Christians an insight into how the other views the verses, to give the otherwise religious and non-religious an understanding of the basis of those theologies, and to give all a better appreciation of a book spanning dozens of books, hundreds of characters, thousands of years, and God knows how many writers.

It would be nice if it were done in such a way; it is possible for it to be done in such a way; but it likely never will be done in such a way. And I will continue to be apathetic and snide about the class, hoping churches will offer Bible studies plentiful enough and of such quality that the need for them in schools will cease to exist.