What Kathy Hochul should have known

Earlier this month, New York Governor Kathy Hochul was a guest speaker at the Milken Institute Global Conference. In the context of a larger point about the inner city’s digital divide, Hochul told her audience:

“We have young black kids growing up in the Bronx who don’t even know what the word ‘computer’ is.”

Hochul, a white limousine Democrat, was predictably skewered from both the left and the right. The left took her to task over having placed National Guard troops in New York City’s chaotic, crime-ridden subways this spring. Voices on the right, long accustomed to being on the other side of “gotcha” moments like this, sensed blood in the water and pounced. “See?” they shouted gleefully. “Democrats are the real racists!”

I don’t believe that Hochul’s intention was to be “racist”, at least not in the conventional understanding of that term. But her public gaffe highlights a fact that Hochul should have known: White politicians simply have nothing to gain by referring directly to race.

White politicians can talk about crime, education, or upward mobility. They can talk about inner cities and societal divides. They can even make oblique references to “diversity”, a term so broad and vague that it actually means…nothing. But when white politicians refer directly to race, they are effectively donning bullseyes and “Kick me!” signs.

White Republican politicians mostly understand this. Half the country is always looking for an opportunity to call them racist, anyway. But white Democrats believe that they—by virtue of being white Democrats—are special.

In this, as in so many things fatuous, President Joe Biden can be counted upon to lead the way. During the runup to the 2020 election, then-candidate Biden told Charlamagne tha God that the black podcaster “ain’t black” if he failed to cast his vote for Biden that November, or even regarded the matter as an open question. Condescension takes many forms, including the assumption that one’s listener would prefer the word “ain’t” over the correct English contraction.


A Boy Scout by any other name?

The Boy Scouts of America will soon undergo a name change. Starting in 2025, the organization will be known as Scouting America.

This is reportedly being done for reasons of—you guessed it—inclusivity. But the name change also recognizes a fait accompli. About 20% of the Boy Scouts are now girls, according to recent membership statistics. (The BSA began admitting girls in 2019, through an auxiliary program.)

I was a Boy Scout more than 40 years ago, from 1979 to 1980. My scouting career was short-lived. This was not because of any unpleasant experiences with the BSA or my local troop. Rather, I simply didn’t like camping. My idea of camping involves staying in, at minimum, a three-star hotel. Cooking Spam over an open fire, sleeping on the ground, and taking one’s calls of nature in the middle of nature…not for me.

When I was a Boy Scout, not only did the organization not allow girls; it also had a blanket ban on gay membership. I was 11 years old in 1979, and I am almost certain that no one questioned me about my sexual orientation when I presented myself at the troop for membership.

Whatever the rulebook said, I’m sure there were gay Boy Scouts in 1979. But people weren’t preoccupied with sexual identity like they are now. No one bothered to interrogate the boys about such things, just as no one sought to trumpet them. Those were more commonsense, live-and-let-live times.

Yours truly as a Boy Scout, 1979 or 1980

But female Boy Scouts were unknown in 1979.

I’m trying to imagine what scouting outings would have been like for me as an adolescent boy on the cusp of puberty had girls been part of my troop. I was just starting to notice girls as I passed from age 11 to age 12. The oldest scouts were always 18. If a third or a half of those older scouts had been high school girls? Who knows? I might have stuck with the Boy Scouts a few years longer.

But seriously…this is not drag queen story hour. I grumble as much as anyone about PC nonsense. But I also worry about becoming reflexively resistant to any form of change that happens to coincide with political correctness.

During my growing-up years (the 1970s and 1980s), I watched any number of single-sex clubs, schools, and organizations open their doors to both genders—of which, everyone agreed in those days, there were only two.

The Catholic high school I attended started as an all-girls secondary boarding school, St. Joseph Academy, in 1915. The school went coed in 1951 and was renamed Archbishop McNicholas High School.

A sop to Truman-era political correctness? Not likely. Wokeness as we know it today hadn’t even been prototyped yet. The far more likely explanation is that the Roman Catholic Archdiocese of Cincinnati wanted the new school to attract the broadest possible range of students. Catholic schooling, after all, has always been a minority concern.

Similarly, the inclusion of girls in the BSA has likely been driven by practical necessity, as much as anything else. The BSA’s membership peaked at 5 million in 1972, just as the final cohort of Baby Boomers was moving through its ranks. By 2020, membership had fallen to 2 million. Membership fell even further during the pandemic. At present, BSA membership is at an all-time low, even with the inclusion of girls.

We are not in 1972 America anymore. The average suburban child is uninterested in the outdoors. He or she would rather spend the day staring blankly into the screen of a smartphone.

The Boy Scouts, in short, needs members. And if some of those members happen to be girls, well…why not?


Xenophobia, translated by the Japanese media

The Japanese media are well aware of President Biden’s recent characterization of their country as “xenophobic”.

Damn those Japanese, and their unwillingness to accept thousands of undocumented immigrants. Why can’t they shape up, and fling open their borders?

Ergo, the Japanese are “xenophobic”.

What do we mean by ‘xenophobia’?

Xenophobia is a Western concept and an exclusively Western preoccupation. The rest of the world takes for granted a certain degree of what our nattering class regularly denounces as xenophobia.

  1. Most of the world accepts as a given that people have a stronger attachment to their family and neighbors than to people living an ocean away.
  2. Most of the world believes that stark differences in basic customs and values lead naturally to conflict.
  3. Only the West—and only the recent manifestation of the West—seeks diversity for diversity’s sake.
  4. You can be offended by that last statement if you choose, but it’s a fact. Ask a Chinese person, a Ugandan, a Pole, or a Saudi to expound on the importance of diversity (or multiculturalism). They’ll laugh in your face.
Translating xenophobia into Japanese

The Japanese media therefore had to improvise the translation. They rendered Biden’s assessment with a phrase that means, “The Japanese dislike foreigners.”

That isn’t exactly the same thing as “xenophobia”. But philosophical concepts are often reliant on cultural context, and typically don’t translate well.

There is no concise English translation for the Japanese concepts of 本音 and 建前. They have to be explained. And so it is with “xenophobia” in Japanese.


Brian Cox on the Bible

Brian Cox is a 77-year-old British actor I’d never heard of, though I have seen some of the movies that Wikipedia tells me he’s appeared in. Cox has had character roles in scores of films since the early 1970s.

For some reason, Cox thought it was necessary to tell an interviewer recently that the Bible is “one of the worst books ever”, and that “only stupid people believe it.”

Okay, I’ll bite.

I hate to turn this into yet another “Okay, Boomer” moment. But Brian Cox is apparently living in 1966. In its April 8, 1966 issue, Time magazine famously asked the question, “Is God Dead?”

Cox would have been twenty at the time. I, for my part, would not be born for another two years.

My point here being: it is no longer edgy for a self-styled Western intellectual to declare his disdain for Christianity. In fact, a declaration of atheism has become rather trite, or at the very least ho-hum. What would be edgy nowadays is for one of our Western cultural elites to declare that she is a Christian believer.

Western Europe’s march toward secularism and atheism did not begin in the benighted 1960s, although the 1960s accelerated it. Western Europe’s slide toward nonbelief began in the nihilism that followed the devastation of the First World War.

The American author Ernest Hemingway (1899 – 1961) was a European by interest and temperament. Hemingway always felt most at home in Europe, and it was in Europe that he got his start as a writer.

Read Ernest Hemingway’s post-World War I European novel, The Sun Also Rises. It is a story of soulless, uninspired people making their way through a bleak, amoral universe.

The Sun Also Rises, 1926 first edition

Hemingway was an atheist—or at least an agnostic. In his 1929 novel A Farewell to Arms, he wrote, “All thinking men are atheists.”

Hemingway died by his own hand at the age of 61. He killed himself with a shotgun.

Western Europe, too, is committing a sort of slow suicide. Having forsaken both faith and tradition, it has become a culture that can no longer be troubled to reproduce itself. Practically every European country is aging, shrinking, withering, a shadow of its former (and mostly Christian) self.

The only positive population growth in Europe nowadays comes from Muslim immigration. Europe’s Muslim immigrants haven’t gotten the memo on atheism yet. More on this in a moment.

As Europeans have grown increasingly secular over the past one hundred years, Islam has become the fastest-growing religion in Europe. Europe’s old cathedrals draw the tourists. But it is the mosques that draw the transplanted faithful.

If Brian Cox wanted to say something edgy about faith and atheism, then, he took the coward’s way out. Why pick on a religion that is dying out in Europe, anyway? Why not target the one that some of Europe’s residents still actually believe in?

But Brian Cox, being a coward (or possibly just an out-of-touch codger) did not say that the Quran is a bad book that only imbeciles follow. He picked an easy target, a religion whose remaining followers will only shake their heads at him, rather than murder him for blasphemy.

On second thought, maybe Brian Cox just wanted to say something dismissive about religion—something that would have been edgy in 1966—while making sure that he celebrates as many remaining birthdays as possible. An atheist, after all, has nothing to be hopeful about in the hereafter.


A turning point for America’s universities?

You’ve probably noticed that many of America’s college campuses have turned into dysfunctional protest camps of late. With the weather turning warmer, the protests that began at Columbia University have spread to campuses throughout the country.

This comes at a time when five things are happening:

  1. Overall university enrollment is declining
  2. Employers are putting less emphasis on expensive college degrees
  3. The Ivy League universities, in particular, are losing their cachet
  4. Student debt has become a divisive political issue
  5. University presidents are hauling down CEO-level salaries

Public opinion is turning against our universities, especially those in the so-called Ivy League. This is a big shift from a generation ago.

“But wait!” some of you will shout. “Those college students are protesting what’s happening in Gaza! Certainly that gives them the right to skip class and take over public spaces! What’s wrong with you? You must be out of touch!”

Let me ask you this: Do I have a right to stage a noisy, disruptive demonstration at my local McDonald’s or Applebee’s in protest of world hunger? Do I have a right to harass you, as you leave Wendy’s with your 20-piece Chicken Nugget Combo?

What? I don’t? That’s ridiculous, you say?

But wait…isn’t world hunger an important issue?

I am not making light of the ongoing tragedy in the Middle East. The Israeli-Palestinian conflict goes back a century. There are legitimate grievances on both sides. There are, moreover, Americans of sincere goodwill on both sides of this debate.

That’s one thing. Suburban white kids cosplaying as Hamas militants on the campus of Columbia University is another. If the students want to help the Palestinians, let them spend this summer serving meals and washing laundry in a refugee camp. I’d have them scrub a few toilets, too.

It isn’t as if this is the first time campus chaos has erupted in recent years. And most of the students’ “issues” are not all that weighty.

Not so long ago, Yale students were protesting…Halloween costumes. Cultural appropriation, or pronouns, or some such nonsense.

Students at Princeton University pressured university officials to change the name of the Woodrow Wilson School of Public and International Affairs. Why? They didn’t like something that Woodrow Wilson had said or written more than a century ago. So they screeched and bellyached and shouted until university officials toed their line.

One cannot blame the students entirely. The universities were already becoming left-wing monocultures when I was a college undergrad in the late 1980s.

Even that was not the very beginning. The problem began in the 1960s, twenty years before I stepped on campus.

What we are seeing now on university campuses is the result of a half-century of rot from the inside out. The present condition did not develop overnight, and it won’t be fixed overnight.

But at least the public is now starting to pay attention to the problem: the sorry state of our overpriced and increasingly unproductive universities.

That, ironically, is the one positive development likely to result from the latest wave of student protest tomfoolery.


She said “Do svidaniya” to the USA

I’m a lifelong language learner, and I’ve long had an interest in Russia. Russian is one of the languages I study.

And no—before you ask—the present situation doesn’t change that. I grew up during the Cold War. Ambivalent feelings toward Moscow have always been a part of my psyche. That doesn’t make Russia and its ancient civilization any less interesting as a field of study.

I follow a handful of Russia-based YouTubers. Among these is Sasha (Alexandra) of the YouTube channel Sasha Meets Russia.

In the video below, Sasha explains why she has decided to say “do svidaniya” (goodbye) to the USA, and stake her future in Russia.

As an American, Sasha is uniquely prepared to emigrate to Russia. Though she spent most of her life in the United States, she is of mixed Russian-American heritage. She speaks fluent Russian, along with native English and fluent French.

Nevertheless, her dissatisfaction with what the USA has become in recent decades will resonate with many Americans who don’t speak Russian. She refers, wistfully, to what America was in the 1950s and 1960s. One need not go back that far. I would settle for what America was in the 1980s or 1990s.

Especially notable is Sasha’s account of confronting “woke culture” as a teenager and public school student in the People’s Republic of Massachusetts about ten years ago. I’m grateful that I missed all that nonsense, as a student of an earlier era.

Why am I bringing her to your attention? Partly because of her youth. There has never been a shortage of 50- and 60-year-olds who are convinced that the society around them is going to hell. But this is the assessment of an educated, physically attractive Gen Z woman who would have plenty of prospects wherever she went.

Yet she doesn’t choose to stay here. She chooses the country that our mainstream media and political elites constantly denounce as evil.


P.S.: And Sasha, I should note, is not alone.

Molly Ringwald’s diverse anxieties

Molly Ringwald was born the same year I was, and her movies were part of the teen culture in which I came of age.

I don’t think I would have ever described myself as a Molly Ringwald “fan”, exactly, but neither was I a detractor. Like most Gen Xers, I saw her movies, in the same spirit that I watched MTV and listened to bands like Journey and Bon Jovi. Pop culture was more monolithic in those pre-Internet times, and you kind of took what they gave you.

I enjoyed Ringwald’s performance in The Breakfast Club (1985), a movie that almost all people my age have seen at least once.

Ringwald was a gifted teen actress. She was also a gifted twentysomething adult actress in the 1994 television miniseries, The Stand. She starred as Frannie Goldsmith, the heroine of Stephen King’s beloved apocalyptic horror novel.

So I have no quarrel with Molly Ringwald the thespian. I have been far less impressed with Molly Ringwald the public person. In recent years, she’s become a fashionably left-leaning celebrity gadfly, mouthing all the familiar slogans when goaded by journalists and interviewers.

Most particularly, Ringwald seems to feel a compulsive need to apologize for the John Hughes teen movies that made her famous. Bashing the creations of the late Hughes (1950 – 2009) has become a peculiar obsession of hers.

Case-in-point: during a recent interview, Ringwald declared:

“Those [John Hughes] movies are very white and they don’t really represent what it is to be a teenager in a school in America today.”

“Very white”? Did she really just say that?

I’ll overlook the obvious non sequitur here: John Hughes was a Baby Boomer who made movies about teenage life in the mid-1980s. Although he technically wrote about Gen Xers, he was probably thinking about Baby Boomers most of the time. At any rate, Hughes never aspired to depict teen life in the mid-2020s. The mid-2020s were still forty years in the future, and many of those yet unborn teens’ parents hadn’t even met yet.

But that isn’t what Ringwald is really getting at. She is implying that because Hughes’s movies did not feature racially diverse casts, there was somehow something retrograde, or even racist about them.

Pop culture in the 1980s actually was quite diverse. Yes, it was the decade of Molly Ringwald, Bruce Springsteen, and Madonna. But it was also the decade of Michael Jackson, Prince, Whitney Houston, and Billy Ocean. We all watched The Cosby Show on television. Eddie Murphy was on everyone’s list of favorite comedians, both in stand-up and in film.

(In many ways, the music scene was far more diverse in the 1980s than it is now. Black artists got proportionately more mainstream attention, whereas nowadays everything in the pop music space is maniacally focused on blonde, vapid Taylor Swift.)

But what about those movies of John Hughes? It isn’t technically inaccurate to say that they were “white”, if we must call them that. There is no evidence that John Hughes was specifically opposed to racial or cultural diversity, but racial and cultural diversity clearly weren’t his focus.

And…so what? Diversity, in the best sense of that word, doesn’t mean—or shouldn’t mean—that every film, TV series, novel, and toothpaste commercial is suspect if it doesn’t contain a box-checked, racially diverse cast of characters. If everything is box-checked to death, then that becomes the norm, and nothing is truly diverse. Diversity, when carried to ideological extremes, can become monochromatic, predictable, and boring.

I don’t remember ever watching The Cosby Show, and saying, “Where the heck are all the Asian Americans and Native American characters? Where are the Jewish and Muslim characters?”

Real life itself, moreover, is not always diverse. During the 1980s, I attended a suburban high school not unlike the one depicted in The Breakfast Club. There were a handful of Filipino students, and a few kids with partial Japanese heritage. Other than that, my high school was as white as Wonder Bread. I make no apology for this. The degree of racial and ethnic diversity in one’s environment has always depended on where one lives.

Molly Ringwald is old enough—and smart enough, I suspect—to realize her own folly. Her hand-wringing about her 40-year-old movies being “white” seems to be her way of keeping herself “relevant” in a chaotic twenty-first-century culture that is neurotically obsessed with identity politics.

I would have a lot more respect for Ringwald if she would simply own her past performances (which were quite good, on the whole) rather than pandering to the diverse but intolerant present. Not even Claire Standish was such an abject conformist.


Molly Ringwald as Claire Standish in The Breakfast Club (1985)

Alex Garland’s ‘Civil War’

This is an election year. Given the two candidates and the mood of the country, the 2024 election will almost certainly entail controversy. Whoever wins, millions of Americans will be angry and disappointed by the result. There will be accusations of cheating, or voter suppression, or something.

British filmmaker Alex Garland has therefore chosen an auspicious year for the release of Civil War, a movie about a hypothetical Civil War II in the United States.

But perhaps he has made a movie that is just a little too timely. More on that shortly.

Civil War is “deliberately vague” about the exact causes and instigators of its hypothetical conflict. The movie posits four different factions, each composed of various states.

This is where things get hinky. Garland doesn’t follow the Red-Blue formula that most of us would expect. For example, the movie portrays Texas and California in an alliance. We can all agree that this is something that would never happen in real life.

This unrealistic scenario is, I suspect, deliberate, too. Garland did not want to make a movie that blatantly picks sides in the American culture wars. Making the alliances unrealistic would be one way to do that.

Reviews and…buzz?

Reviews of Civil War are mixed. I’m not the first person to observe that the political alliances depicted in the film don’t mirror our current political divisions.

Some reviewers seem to have taken issue with that. Johnny Oleksinski of the New York Post put it this way:

“Civil War’s shtick is that it’s not specifically political. For instance, as the US devolves into enemy groups of secessionist states, Texas and California have banded together to form the Western Forces. That such an alliance could ever occur is about as likely as a Sweetgreen/Kentucky Fried Chicken combo restaurant.”

Oleksinski called Civil War “a torturous, overrated movie without a point”. We may conclude that he didn’t like it.

But what “point” was Oleksinski looking for, exactly? Alex Garland faced an obvious marketing dilemma here. If he had made a movie about the Evil Libs, he would have alienated half his potential audience. If he had made a movie about the Evil MAGAs, he would have alienated half his potential audience.

There is really no way to please everyone with a movie like this. Except by remaining vague. And then you irritate people because you didn’t take a stand.

I haven’t heard a lot of buzz about this movie in my own social circle, nor in my personal Facebook feed. Civil War is not exactly a movie that most people will want to see with their kids. Nor is it likely to become a date night favorite.

Civil War’s topic, and the clips I have seen of it, make the movie seem too similar to the news stories we have seen in recent years: the BLM riots of the summer and fall of 2020, the J6 riot of January 6, 2021, and the current war between two former Soviet republics, Russia and Ukraine.

How many people want to pay good money to see a movie about something like that at the cinema?

Good question. I suspect that Civil War will find a wider audience once it moves to streaming/cable.

Could another Civil War really happen?

Alex Garland is not alone in his speculations about a Civil War II. Frankly, I have my doubts.

The First Civil War (1861 – 1865) was actually about something. Southerners were fighting to preserve their entire economic system. White Northerners were fighting to preserve the Union.

(Contrary to what many people believe, the Union did not initially wage the Civil War with the goal of ending slavery. The sainted Lincoln, moreover, would have let the Confederate states keep their slaves, if only they had not seceded.)

Blacks had the biggest stake of all, with their freedom on the line.

Whichever side you were on, there was something worthwhile to fight about.

But what about now? Are we really going to go to war over transgender bathrooms and idiotic pronoun rules? Over the self-evident question of what a woman is? Over abortion? Over the annual Pride Month spectacles? Over whether or not President Biden will force Americans to buy uneconomical and unwanted electric vehicles?

The issues that divide us now, as divisive and tiresome as they are, seem trivial by comparison.

A civil war, over all that nonsense? Hopefully, the country has not become that stupid. But you never know.


The school where a trip to the principal’s office is a punishment, indeed

I was a kid in the 1970s and 1980s. Life was not perfect then, to be sure, and the perils for children were many. This was the era in which “stranger danger” really took root. I remember several Halloweens during the 1970s when there were rampant stories of razor blades and drugs being placed in trick-or-treat candy.

These were likely just urban legends, but such was the mood then, among parents: the safe, reliable world of postwar suburban America had been swept away with the 1960s, the decade that destroyed America’s innocence, probably forever.

One question my parents never had to consider, though, was how much exposure I should have to drag queens, and adults twerking in public to express “pride” in their alternative sexualities. Nor was any adult authority figure in my midst nutty enough to encourage me to “question my birth-assigned gender identity”.

Gay celebrations have their place. They did in the 1970s and 1980s, too. Pride parades were already a thing, even in Midwestern cities like Cincinnati, where I grew up. No one was locked in a proverbial closet who didn’t want to be there.

But in that era, most of the adults in charge were actually…adults in charge. They recognized that what is appropriate for adults is not always appropriate (let alone necessary) for children.

Nevertheless, there is a sector of our society that seems determined to immerse children in as much aberrant sexual content as they can. Apparently, this is how the “woke” crowd shows how open-minded they are.

A public school district in Oklahoma has just taken this trend a step further: They’ve named a drag queen and former “Miss Gay Oklahoma” as the principal of an elementary school in the Sooner State.

This same individual (a biological man) was previously arrested for child porn, in a case that was later dismissed. At least we can assume that he doesn’t dislike children.

Needless to say, a backlash has ensued.

This may have been an administrative mistake. That’s what I suspect. Since the dawn of the “those who can’t do, teach” mindset more than a generation ago, the human capital in our public schools has declined precipitously. Could they miss something like this? Sure they could. Remember whom you’re dealing with here.

I suspect that this particular situation will be worked out. Within days, we’ll learn that this individual has been sacked, and someone of at least slightly less dubious background put in his place.

This will, however, be yet one more chink in what remains of public confidence in our public schools. Yet more ammunition for advocates of homeschooling.


Much ado about M&Ms

There are a lot of things worth getting upset about and debating. Anthropomorphized spokescandies are not among them, I would submit. But in the midst of the culture wars, we are arguing about talking candy, too.

Some time back, Mars, the owner of the M&M brand, decided that its iconic spokescandies weren’t “inclusive” enough. The folks in the Mars marketing department responded by making the female spokescandies generally less svelte and less feminine.

(This raises questions about what “inclusivity” actually means. What about inclusivity for slender and conventionally attractive women, after all?)

That might have been the end of it. Then Tucker Carlson of Fox News got involved. In a blistering commentary, Carlson denounced the new, frumpier spokescandies:

“M&M’s will not be satisfied until every last cartoon character is deeply unappealing and totally androgynous, until the moment you wouldn’t want to have a drink with any one of them. That’s the goal. When you’re totally turned off, we’ve achieved equity. They’ve won.”

Would I want to “have a drink” with any talking candy? Hmm…let me get back to you on that. Tucker Carlson has a point, to be sure; but perhaps there are better points to be made out there.

It’s foolish for a candy company to agonize over whether or not an anthropomorphized chocolate candy is “inclusive”. This is the kind of nonsense that only Ivy League MBAs worry about. The rest of us just want our M&Ms.

Tucker Carlson, though, overreacted to a situation that could have been dismissed with a “whatever” and an eye-roll. Not that he was going to do that, of course. Tucker Carlson’s business model requires that he constantly present his viewers with fresh sources of outrage.

And then Mars provided the final overreaction. This past week, the company breathlessly announced that the spokescandies would be retired because of all the “controversy”. Henceforth, Maya Rudolph will represent M&Ms in ads and TV commercials.

I did enjoy Maya Rudolph’s performance in The Good Place, so I don’t mind the change. But was there ever really that much controversy, beyond the aforementioned Tucker Carlson commentary? I don’t recall any calls for conservatives to boycott M&Ms, or anything like that.

Even CNN, no fan of conservatives or Tucker Carlson, was skeptical. Perhaps this final overreaction was a Mars publicity stunt, calculated to stir up attention for those little pieces of hard-coated chocolate.


‘Dark Places’, and the heavy metal controversies of the 1980s

I’m a fan of Gillian Flynn’s novels, and I enjoyed the film adaptation of Gone Girl (2014). So I thought: why not give Dark Places (2015) a try? Although I had read the 2009 novel, enough years had passed that much of the plot had seeped out of my mind. (That happens more and more often, the older I get.)

First, the acting. The two female leads in this movie (Charlize Theron, Chloë Grace Moretz) were perfect choices. Charlize Theron has proven herself willing to downplay her physical beauty for the sake of a dramatically challenging antihero role. (See her performance as Aileen Wuornos in Monster (2003).) And the lead role of Libby Day, the tragic but unlikable protagonist of Dark Places, forced her to make the most of these skills.

Chloë Grace Moretz, meanwhile, played the teenage femme fatale, Diondra Wertzner, in the backstory scenes (which comprise a significant portion of the movie). Moretz provided just the right blend of sex appeal and darkness that this character required, more or less what I imagined while reading the novel.

I’ve been following Moretz’s career since her breakout role as a child vampire in Let Me In (2010). Now in her twenties, Moretz seems almost typecast as a dark/horror movie actress; but she always manages to pull off the perfect creepy female character. (Note: Be sure to watch Let Me In if you haven’t seen it yet.)

Dark Places kept me glued to the screen. As I was watching the film, the plot of the book came back to me. Dark Places remained faithful to its literary source material, but in a way that moved the plot along more smoothly than the novel did. (This might be one of those rare cases in which the movie is actually a little better than the novel, which—despite being good—drags in places.)

As alluded to above, Dark Places is primarily set in the twenty-first century, with a significant portion concerning flashback events of 1985, when the adult characters were children or teenagers.

I was 17 in 1985, and I remember that era well. Much of this part of the story revolves around rumors of teenage “devil worship”, and the influence of “satanic” heavy metal: Dio, Iron Maiden, Black Sabbath, Ozzy Osbourne. This is an old controversy that I hadn’t thought about much in decades. Dark Places brought some of those long-ago debates back to me.

I listened to plenty of heavy metal back in the 1980s. (I still do.) The heavy metal of Ronnie James Dio, Black Sabbath, Ozzy Osbourne and Iron Maiden does not encourage satanism, any more than films like The Exorcist encourage satanism. But like The Exorcist, some ‘80s heavy metal does dwell excessively on dark themes. And here lies the source of the confusion.

I never had the urge to draw a pentagram on my bedroom wall or sacrifice goats while listening to Blizzard of Ozz or Piece of Mind. Nor did I detect any dark exhortations in the lyrics, whether overt or subliminal.

Since the 1980s, Ozzy Osbourne has become a reality TV star. Iron Maiden’s lead singer, Bruce Dickinson, has emerged as a polymath who writes books and flies commercial airliners when not on tour.

Ozzy strikes me as one of the most gentle people you might ever meet. Dickinson, meanwhile, is a conservative (in the British context of that political label) and a eurosceptic. Neither man fits the profile of the devil-worshipping maniac.

I will admit, though, that some ‘80s metal became a bit wearying to listen to on a regular basis. I eventually moved on to more light-hearted, commercial rock like Def Leppard. I still listen to a lot more Def Leppard than Ozzy Osbourne or Iron Maiden. But I digress.

The 1980s fear-mongering over heavy metal turned out to be just that: fear-mongering. Although I’m sure there were isolated real-life horror stories, I didn’t know a single kid in the 1980s who was into satanism. The teenage satanists of the 1980s existed almost entirely within the fevered imaginations of a few evangelical preachers and their followers.

Back to Dark Places. The problem (with both the book and the movie) is that it is a fundamentally depressing story, without any characters that the reader/viewer can wholeheartedly root for. While there is a reasonable conclusion, there is nothing approaching a happy ending, or even a satisfying ending. That is a central flaw that no acting or directing talent can rectify.

This doesn’t mean that the movie isn’t worth watching. It is. But make sure you schedule a feel-good comedy film shortly thereafter. You’ll need it. And don’t watch Dark Places if you’re already feeling gloomy or depressed.


No more law clerks from Yale?

Judge James C. Ho, a Trump appointee who serves on the 5th U.S. Circuit Court of Appeals in New Orleans, has announced that he will no longer consider students from Yale Law School for clerkships. (Ho, moreover, is not alone, as the above article indicates.)

Ho’s reason? He states that Yale Law School not only “tolerates” cancel culture, but “actively practices it.”

That may be true. Yale University, along with the rest of the Ivy League, is only slightly less ideological than the East German institution that used to train members of that country’s secret police, or Stasi.

Stupidity emanating from Yale students is so common now that poking fun at them has become something of a turkey shoot. In 2015, protests famously erupted on the Yale campus over Halloween costumes and cultural appropriation. But that’s only the beginning.

While I understand Ho’s sentiment, his gesture will likely become but one more salvo in the ideological boycott wars. For one thing: it is easy to imagine left-leaning judges (who outnumber conservative ones in most states) responding in kind.

We might argue that Judge Ho is taking the wrong approach entirely. Nothing so hobbles the intellectual capacities of a young person as graduating from any school in the Ivy League in the third decade of the twenty-first century.

Yale Law School students need nothing so much as some sane, balancing influences. Who better to provide that than a judge who disagrees with most of their professors?


Participation trophies and organic chemistry

Maitland Jones Jr., an award-winning professor at NYU, was fired after a group of his students signed a petition alleging that his organic chemistry course was “too hard”.

I should begin with the usual disclaimer: I don’t know Maitland Jones, or the students who signed the petition. I never took his organic chemistry course. But that doesn’t mean I’m completely unfamiliar with the broader questions here.

In the academic year of 1987 to 1988, I took three semesters of organic chemistry at the University of Cincinnati. The reader might reasonably ask why I did this to myself.

During the previous summer, I had taken an intensive Biology 101 course comprising three parts: botany, zoology, and genetics.

I got A’s in all three sections of Biology 101. Botany and zoology were easy for me because I have always been good at memorizing large amounts of information that have no logical connection to one another. (I’m good at foreign languages, for much the same reason.) I struggled a bit with the genetics portion of Biology 101, which requires more math-like problem-solving skills. But I still managed to pull off an A.

I was 19 years old at the time. With the typical logic of a 19-year-old, I concluded that I should go to medical school. I changed my undergrad major to premed, and began taking the math and science courses that made up that academic track.

That’s how I crossed paths with organic chemistry. Organic chemistry was nothing like the Biology 101 course I had taken over the summer session. Biology 101 was aimed at more or less the entire student body. (I initially took it to satisfy my general studies science course requirement.) Organic chemistry was aimed at future heart surgeons and chemical engineers. Organic chemistry was the most difficult academic course I have ever taken, or attempted to take.

Organic chemistry is difficult because it requires the ability to memorize lots of information, as well as the ability to apply that information in the solution of complex problems. Organic chemistry is, in short, the ideal weed-out course for future heart surgeons and chemical engineers.

How did I do in organic chemistry? Not very well. I managed two gentlemanly Cs, and I dropped out during the third semester.

My dropping out came as no surprise to my professor. Nor was I alone. Plenty of other students dropped out, too.

Early in the course, I remember the professor saying, “Not everyone is cut out to be a doctor or a chemist. Organic chemistry is a course that lets you know if you’re capable of being a doctor or a chemist.”

That was 1987, long before the participation trophy, and back when a snowflake was nothing but a meteorological phenomenon. My experience with organic chemistry was harrowing, so far as “harrowing” can be used to describe the life of a college student. But in those days, disappointments, setbacks, and the occasional outright failure were considered ordinary aspects of the growing-up experience. My organic chemistry professor did not care about my feelings or my self-esteem. He only cared if I could master the intricacies of stereochemistry, alkenes, and resonance.

The good news is that I was able to quickly identify a career that I would probably not be good at. Even more importantly, you, the reader, will never look up from an operating table to see me standing over you with a scalpel.

If we have now reached the point where students can vote their professor out of a job because a course is too hard, then we’ve crossed yet another Rubicon of surrender to the cult of feel-good political correctness.

A decade ago, many of us laughed at the concept of the participation trophy. But at the same time, many of us said: “What’s the big deal?”

The big deal is that small gestures, small surrenders, have larger downstream consequences. A participation trophy is “no big deal” on an elementary school soccer field. At medical school, participation trophies can endanger lives, by enabling the less competent to attain degrees and certifications which they would never have acquired in saner times.

Are you planning on getting heart surgery down the road? You might want to get it now, before the present generation of premeds and medical students becomes the next generation of doctors.


Kristen Clarke, Harvard, and “race science”

Kristen Clarke, Biden’s nominee to head the DOJ Civil Rights Division, penned a 1994 letter to the Harvard Crimson, stating that African Americans have “superior physical and mental abilities”.  At the time, Clarke was an undergraduate at Harvard, and the president of the university’s Black Students Association.

Clarke based her letter on…race science.

Here are some excerpts from the letter:

“One: Dr Richard King reveals that the core of the human brain is the ‘locus coeruleus,’ which is a structure that is Black, because it contains large amounts of neuro-melanin, which is essential for its operation.

“Two: Black infants sit, crawl and walk sooner than whites [sic]. Three: Carol Barnes notes that human mental processes are controlled by melanin — that same chemical which gives Blacks their superior physical and mental abilities.

“Four: Some scientists have revealed that most whites [sic] are unable to produce melanin because their pineal glands are often calcified or non-functioning. Pineal calcification rates with Africans are five to 15 percent [sic], Asians 15 to 25 percent [sic] and Europeans 60 to 80 percent [sic]. This is the chemical basis for the cultural differences between blacks and whites [sic].

“Five: Melanin endows Blacks with greater mental, physical and spiritual abilities — something which cannot be measured based on Eurocentric standards.”


Obviously, this is complete hooey, dressed up in the sort of pseudo-scientific language that passes for erudition at places like Harvard.

Obviously, if a white nominee to any senior federal government post had made similar claims about whites, based on “race science”, the mainstream media would be shrieking and Twitter would be exploding.

Nevertheless, I’m of two minds on this one.

Clarke’s age is not available online, but her Wikipedia entry states that she graduated from Harvard in 1997. Working backward from that date, she would have been about 19 years old when she wrote the above words.


Most people don’t reach full adulthood until they are about halfway through their twenties. (This is why I would be in favor of raising the voting age, rather than lowering it, but that’s another discussion.)

This doesn’t mean you should get a blank check for everything you do when you’re young, of course. But there is a case to be made that all of us say and think things during our formative years that will make us cringe when we look back on them from a more mature perspective.

This is certainly true for me. I was 19 years old in 1987. I am not the same person now that I was then—both for better and for worse.

Secondly, let’s acknowledge environmental factors. Being a student at Harvard is likely to temporarily handicap any young person’s judgment and intellectual maturity. Even in 1994, Harvard University was a hotbed of pointy-headed progressivism and insular identity politics.

Clarke was also involved in the Black Students Association. There was a Black Students Association at the University of Cincinnati when I was an undergrad there during the late 1980s. Members of UC’s BSA were known to write whacko letters like the one above. Most of them, though, were nice enough people when you actually talked to them in person. They just got a little carried away when sniffing their own farts in the little office that the university had allocated for BSA use.

What I’m saying is: I’m willing to take into account that 1994 was a long time ago. A single letter from a 19-year-old, quoting pseudo-academic race claptrap, shouldn’t be a permanent blight on the record of a 47-year-old. And I would say the same if Kristen Clarke were white, and had taken a very different spin on “race science”.

We all need to stop being so touchy about racial issues, and so preoccupied with them. That goes for whites as well as blacks, and vice versa.

I’m willing to give Clarke a fair hearing, then. But I’m skeptical. Her 1994 Harvard letter isn’t an automatic disqualifier; but it’s a question that needs to be answered.

I’m also skeptical of Biden. Biden may be a feeble old man; he may be a crook. He is not particularly “woke” at a personal level. In fact, some of his former positions on busing and crime suggest that he’s anything but “woke” on matters of race.

Yet Biden is now head of a Democratic Party that is obsessed with race. This means that Biden may try to overcompensate, by filling his government with race radicals. This recent selection supports that concern.

Given the time that has elapsed between the present and 1994, given Kristen Clarke’s age at the time, I want to hear what she has to say in 2021 before I outright condemn her as a hater or a looney. But this recent personnel selection doesn’t make me optimistic about the ideological tilt of the incoming Biden administration.