The 2024-25 school year has barely started, and it’s happened again: another school shooting. Four people were killed today, and nine injured, when a 14-year-old boy went on a shooting rampage in a Georgia high school.
This incident will inevitably reignite the gun control debate, especially during an election year. That’s okay. I’m not a Second Amendment absolutist, and this is not a pro-gun post.
But I can’t help remembering my own adolescent years. I was a high school freshman in September 1982.
Then, as now, the experience of being a teenage boy had its ups and downs. My high school experience was a positive one, as high school experiences go. But I had my share of slings and arrows, disappointments and setbacks.
I was sometimes angry. Teenage boys frequently are. I had aggressive impulses. Teenage boys frequently do.
I had easy access to guns. My grandfather was a World War II combat veteran who kept a small arsenal in his suburban home. And at the age of 14, I knew how to handle firearms.
But not in my wildest dreams would I have considered taking a gun to school and shooting my classmates.
I was not exceptional in this regard. Practically no young men of my generation did such things.
American gun ownership, whatever one thinks of it, is nothing new. Nor is the American high school experience. But the mass school shooting, as we know it today, is a relatively new phenomenon. Such things simply did not happen in the 1980s.
We must therefore ask ourselves why our young people have changed so drastically in two generations, especially our young males.
Adolescent boys of my generation grew up playing dodgeball. Some of us (myself included) were subjected to corporal punishment. (I was spanked both at home and at school.)
We regularly engaged in fisticuffs. Few of us actually told “fag” jokes (I always considered such jokes to be puerile); but we merely shrugged when we heard them. We were never encouraged to question our gender identity. We engaged in locker room banter that would cause no end of finger-wagging from today’s nabobs of political correctness.
We lived by the conventional wisdom, “Boys will be boys.”
But we did not carry out mass shootings. We did not shoot anyone, or harm anyone.
Chivalry—the protection of those weaker than you—was an ideal that most of us admired and aspired to.
There might be a lesson there. At any rate, it might provide a starting point for asking: what the heck has gone so terribly wrong, with the way we raise young men?
While the sentiments are by no means universal, a large number of British citizens are opposed to their government’s open-door immigration policies.
In Europe, most immigrants come from the Muslim Middle East. Many are young, male, and unattached, and do not readily assimilate into secular, liberal European culture.
There have been numerous incidents and outrages, some lasting for decades. Between 1997 and 2013, gangs of predominantly Muslim Pakistani men in Rotherham, England, sexually abused at least 1,400 British girls. While British citizens complained, their government seemed mostly concerned with combating the resultant backlash against immigration.
And this is but one example. Political correctness is a cult in America. In the United Kingdom, it’s the established faith of officialdom.
But this year, enough was finally enough. In 2024, British citizens rebelled at the polls—and in the streets, just as citizens have in France, Germany, and elsewhere in Europe. (Throughout Europe, there is a clear divide between the will of the people and the will of those who occupy the halls of government.)
We Americans tend to regard Freedom of Speech as absolute, the depredations of left-leaning tech firms and the Biden administration notwithstanding.
It is not necessarily so in Europe. Ordinary people in countries like Germany and the UK are generally freer to speak their minds than those in countries like North Korea and Saudi Arabia. But Western Europeans are subject to limitations that most Americans—including some who plan to vote for Harris in November—would find unacceptable.
The authorities in the UK, in particular, can be rather heavy-handed. British activist Tommy Robinson recently had to flee the UK after he screened his own documentary in defiance of a British court order.
This past week, Lucy Connolly, the wife of a Tory councilor, was taken into custody for posting anti-immigration messages on X.
And they wonder why we had the American Revolution back in 1776. Speaking of America, London’s Metropolitan Police chief, Commissioner Sir Mark Rowley, has threatened to extradite Americans who say the wrong things about British laws and the situation on the far side of the Atlantic pond.
“We will throw the full force of the law at people. And whether you’re in this country committing crimes on the streets or committing crimes from further afield online, we will come after you,” Rowley told reporters in August.
“Try it, redcoat!” some X users retorted. To most Americans, the idea of ‘Sir’ Mark Rowley throwing them in a British jail over a social media post sounds far-fetched. And perhaps it is. But don’t forget, we have an election coming up in a few months. Democratic vice presidential candidate Tim Walz has already said that he would like to see more restrictions on free speech—at least for those who disagree with him.
Britain appears headed for some rough times. Tommy Robinson and Lucy Connolly are not grumpy, septuagenarian Baby Boomers; both were born in the 1980s. They are Millennials: members of the generation believed to be blindly left-leaning in all matters.
The current UK regime, on the other hand, is the creation of left-leaning Baby Boomers, the generation whose fathers fought World War II. They grew up in the prosperity of the postwar years, and fashionably disdained British values.
But that generation has turned Britain into a decaying, crime-ridden cesspool. The younger generation is rising in response. But is it too late?
I’m part of the original Star Wars generation. I was nine years old in the summer of 1977, when I sat in the cinema with my dad, and watched those now famous words scroll across the big screen: “A long time ago in a galaxy far, far away….”
Why just with my dad? My mom was invited. I specifically remember that. I also remember that my mom, then barely in her thirties, had no interest in seeing Star Wars. That might be worth noting.
Like many kids in 1977, I became an instant Star Wars fan. I pestered my parents for the comics, the action figures, the Burger Chef posters. All of it.
I watched The Empire Strikes Back in 1980 and Return of the Jedi in 1983. These, too, I saw with my dad. (I do not believe that my mom, in her entire life, ever watched a Star Wars film. And she watched a lot of movies.)
I liked The Empire Strikes Back the best. I was the perfect age for it in 1980 (12). I was also deeply immersed in Star Wars fandom.
I was already on the verge of outgrowing Star Wars, though. By the time Return of the Jedi came out in 1983, I was well into high school, and focused on other matters.
But still, I really enjoyed all three installments of the original 1977-1983 trilogy.
When The Phantom Menace came out in 1999, I was skeptical. So much time had passed since the end of the original trilogy. I feared a lack of continuity.
And sure enough, I was right. The Phantom Menace was a shadow of the original Star Wars trilogy.
Since then, I’ve watched most of the subsequent Star Wars movies, including Rogue One (2016) and the more controversial Sequel Trilogy (2015-2019).
I neither loved nor hated these movies. They struck me as…okay.
But what about the politics?
I noticed, in the 2010s, that there was a conscious effort to create new Star Wars characters who were female, nonwhite, and (more recently) gay. How could I not notice, given how much everyone was talking about it?
Now for a brief word to younger readers. Millennials and Zoomers, I’ve observed, often have the conceit that they “discovered” diversity, and that all popular culture prior to the 21st century was a sea of white males grunting at each other.
Far from it. One of the stars of the original Star Wars trilogy was Billy Dee Williams, a black actor, in the role of Lando Calrissian. And what would Star Wars have been without Princess Leia?
The original Battlestar Galactica (1978-79) had numerous characters of color (Colonel Tigh, Boomer). There were “traditional” female characters like Cassiopeia, but also women warriors, like Athena and Sheba.
What is new is a neurotic obsession with diversity box-ticking and virtue-signaling. Filmmakers of yesteryear understood that diversity, like religion, is best practiced and not preached.
It has now been almost half a century since that first Star Wars movie came out in 1977. Star Wars, now only vaguely recognizable as something connected to the original trilogy, has become another battleground in our incessant culture wars.
Disney, an aggressively ideological corporate entity, has owned Lucasfilm since 2012. The Star Wars fanbase has always skewed male, and that fanbase seems to hate each successive Disney-produced Star Wars project with a new level of intensity.
The latest Disney-sponsored Star Wars production, The Acolyte, was created by a lesbian filmmaker and featured lesbian space witches. The main character was a young woman of color. Instead of simple representation, The Acolyte leaned toward overrepresentation. There was nary a white male to be seen in the cast.
The vast majority of Star Wars fans panned The Acolyte, and the $180 million series was canceled after a single season.
This led to the now familiar, now predictable online arguments. Conservatives gleefully tweeted, “Go woke, go broke!” Meanwhile, mainstream media scolds wagged their fingers at all the sexism, racism, and homophobia, their usual obsessions.
Kathleen Kennedy, the Disney-installed president of Lucasfilm, made the brilliant observation that Star Wars fandom is “male-dominated”. Ya think?
I could have told her that in 1977, 1980, and 1983, when my mom had no interest in watching any of the movies of the original Star Wars trilogy.
I could have told Kennedy that on my childhood playground in 1978. While the boys were staging imaginary lightsaber battles and fantasizing about being Luke Skywalker, the girls were skipping rope and talking about how dreamy Shaun Cassidy and Scott Baio were. (Yes, Scott Baio, now a conservative Republican, was a teeny-bopper heartthrob in the late 1970s.)
I’m sure there are middle-aged male fans of Taylor Swift. I would also bet that somewhere in America, there is a 12-year-old girl who is obsessed with Arnold Schwarzenegger’s action films from the 1980s.
But such individuals are outliers. Generally speaking, people’s tastes in pop culture follow broad demographic trends.
I’m a 56-year-old male, and I wouldn’t attend a Taylor Swift concert if you gave me free tickets and a backstage pass. Most Gen Xers (myself included) didn’t “get” The Big Chill, a 1983 movie about thirtysomething Baby Boomers. I’d be willing to bet that the typical 17-year-old of 2024 wouldn’t connect with the teen movies I watched in the 1980s.
And as for all that stuff the kids are doing on TikTok nowadays? I don’t get most of that, either.
This all boils down to a very simple—and formerly uncontroversial—concept: target marketing based on demographics and psychographics.
People’s consumer tastes quite often differ by age, income level, place of residence, and yes—gender!
Guess how many men went to watch last summer’s Barbie movie? Not many. According to the market research firm Ipsos, the Barbie cinema audience was 75 percent female.
Should the producers of Barbie 2 (if there is such an abomination) focus on what male viewers want, or on what female viewers want? If they’re smart, they’ll focus on what female viewers want, and what female viewers liked about the first Barbie film: lots of female empowerment messaging with plenty of unsubtle digs at various male stereotypes.
Back to Star Wars. Star Wars fans are often caricatured as racist and sexist. But they’re the same folks who loved Princess Leia, Lando Calrissian, and all the aforementioned diverse characters in the original (1978) Battlestar Galactica. Star Wars fans aren’t looking for an all-white, all-male space opera. Not by a long shot. But lesbian space witches might be a bridge too far.
Just before the release of The Acolyte, Kathleen Kennedy publicly opined that “storytelling does need to be representative of all people”.
Storytelling? Sure. But does every story need to fully represent every person? Does Barbie need to represent me, specifically? What about the next movie targeted at Gen Z twentysomethings? (Again, I’m in my fifties.)
True diversity, by contrast, means that not every work of art speaks to everyone, or is about everyone. If a particular work of art doesn’t speak to you or about you, you can watch, read, or listen to something else.
The failure of The Acolyte notwithstanding, there does seem to be a market for “woke” science fiction films and series. Most Star Wars fans hated The Acolyte, but not all of them did.
The solution? Simple. Create a new space opera franchise, in which straight white male characters and tropes are openly downplayed in favor of racial, gender, and sexual identity-based diversity. In other words, have as many lesbian space witches as you want. Knock yourself out.
Just don’t call your movie Star Wars. Call it something else. Make something completely new. Then no one will complain.
Will I pay good money to see the first film in your new space opera franchise about transgender starship pilots hunting lesbian space witches of color?
Show me the movie trailer first, and then ask me.
But you’ll have a lot better odds of getting me to see that, than the inevitable Barbie sequel, or (NO! NO! NO!) anything related to Taylor Swift.
I have been doing my best to ignore the J.D. Vance “cat lady” nonsense. It would seem to me that with two wars, a border crisis, and economic uncertainty, we all have bigger fish to fry.
Nevertheless, some of the related memes have made their way into my personal Facebook feed. There is a certain kind of person who seems to think that the opinion of Jennifer Aniston or Taylor Swift automatically lends weight to a particular viewpoint. I beg to differ.
I would ordinarily remain on the sidelines during a debate like this. But perhaps I have a dog in this fight. I am, alas, a childless cat lady of sorts myself. I’m 56 with no children, and—at this point—the odds of me procreating are minimal.
I’m not going to apologize for not having kids. At the same time, though, I must acknowledge: people getting married and having kids is what makes the world go around.
That has nothing to do with religion or so-called “family values”. It is, rather, economics, at the most basic level. The economies of both Japan and South Korea are hitting the wall right now because their people have lost the will to reproduce. Many European countries face the same predicament.
Likewise, I don’t fundamentally have a problem with the idea that as a childless person, I have a different perspective from someone who has children and (at my age) even grandchildren. This is no different from someone pointing out that as a straight white male, I don’t have the same perspective as someone who is black, female, or LGBTQ.
So what’s the big deal? I understand, of course, that this is an election year, and political partisanship is involved. But beneath that, there is an apparent desire on the part of some voters to constantly have their self-esteem stroked, to constantly be told that their choices were the best ones that could possibly have been made.
Like I said—no apologies for the choices I’ve made. But if J.D. Vance (or anyone else) fails to high-five me for being a genetic dead end at 56, I’ll understand. They can high-five me for something else.
Now…back to those two wars, the border crisis, and economic uncertainty.
I’m a fan of Gillian Flynn’s novels, and I enjoyed the film adaptation of Gone Girl (2014). So I thought: why not give Dark Places (2015) a try? Although I had read the 2009 novel, enough years had passed that much of the plot had seeped out of my mind. (That happens more and more often, the older I get.)
First, the acting. The two female leads in this movie (Charlize Theron, Chloë Grace Moretz) were perfect choices. Charlize Theron has proven herself willing to downplay her physical beauty for the sake of a dramatically challenging antihero role. (See her performance as Aileen Wuornos in Monster (2003).) And the lead role of Libby Day, the tragic but unlikable protagonist of Dark Places, required her to make the most of those skills.
Chloë Grace Moretz, meanwhile, played the teenage femme fatale, Diondra Wertzner, in the backstory scenes (which comprise a significant portion of the movie). Moretz provided just the right blend of sex appeal and darkness that this character required, more or less what I imagined while reading the novel.
I’ve been following Moretz’s career since her breakout role as a child vampire in Let Me In (2010). Now in her twenties, Moretz seems almost typecast as a dark/horror movie actress; but she always manages to pull off the perfect creepy female character. (Note: Be sure to watch Let Me In if you haven’t seen it yet.)
Dark Places kept me glued to the screen. As I was watching the film, the plot of the book came back to me. Dark Places remained faithful to its literary source material, but in a way that moved the plot along more smoothly than the novel did. (This might be one of those rare cases in which the movie is actually a little better than the novel, which—despite being good—drags in places.)
As alluded to above, Dark Places is primarily set in the twenty-first century, with a significant portion concerning flashback events of 1985, when the adult characters were children or teenagers.
I was 17 in 1985, and I remember that era well. Much of this part of the story revolves around rumors of teenage “devil worship”, and the influence of “satanic” heavy metal: Dio, Iron Maiden, Black Sabbath, Ozzy Osbourne. This is an old controversy that I hadn’t thought about much in decades. Dark Places brought some of those long-ago debates back to me.
I listened to plenty of heavy metal back in the 1980s. (I still do.) The heavy metal of Ronnie James Dio, Black Sabbath, Ozzy Osbourne, and Iron Maiden does not encourage satanism, any more than films like The Exorcist do. But like The Exorcist, some ’80s heavy metal does dwell excessively on dark themes. And herein lies the source of the confusion.
I never had the urge to draw a pentagram on my bedroom wall or sacrifice goats while listening to Blizzard of Ozz or Piece of Mind. Nor did I detect any dark exhortations in the lyrics, whether overt or subliminal.
Since the 1980s, Ozzy Osbourne has become a reality TV star. Iron Maiden’s lead singer, Bruce Dickinson, has emerged as a polymath who writes books and flies commercial airliners when not on tour.
Ozzy strikes me as one of the most gentle people you might ever meet. Dickinson, meanwhile, is a conservative (in the British context of that political label) and a eurosceptic. Neither man fits the profile of the devil-worshipping maniac.
I will admit, though, that some ’80s metal became a bit wearisome to listen to on a regular basis. I eventually moved on to more light-hearted, commercial rock like Def Leppard. I still listen to a lot more Def Leppard than Ozzy Osbourne or Iron Maiden. But I digress.
The 1980s fear-mongering over heavy metal turned out to be just that: fear-mongering. Although I’m sure there were isolated real-life horror stories, I didn’t know a single kid in the 1980s who was into satanism. The teenage satanists of the 1980s existed almost entirely within the fevered imaginations of a few evangelical preachers and their followers.
Back to Dark Places. The problem (with both the book and the movie) is that it is a fundamentally depressing story, without any characters that the reader/viewer can wholeheartedly root for. While there is a reasonable conclusion, there is nothing approaching a happy ending, or even a satisfying ending. That is a central flaw that no acting or directing talent can rectify.
This doesn’t mean that the movie isn’t worth watching. It is. But make sure you schedule a feel-good comedy film shortly thereafter. You’ll need it. And don’t watch Dark Places if you’re already feeling gloomy or depressed.
Maitland Jones Jr., an award-winning professor at NYU, was fired after a group of his students signed a petition alleging that his organic chemistry course was “too hard”.
I should begin with the usual disclaimer: I don’t know Maitland Jones, or the students who signed the petition. I never took his organic chemistry course. But that doesn’t mean I’m completely unfamiliar with the broader questions here.
In the 1987-88 academic year, I took three semesters of organic chemistry at the University of Cincinnati. The reader might reasonably ask why I did this to myself.
During the previous summer, I had taken an intensive Biology 101 course comprising three parts: botany, zoology, and genetics.
I got A’s in all three sections of Biology 101. Botany and zoology were easy for me because I have always been good at memorizing large amounts of information that lacks any logical connection. (I’m good at foreign languages, for much the same reason.) I struggled a bit with the genetics portion of Biology 101, which requires more math-like problem-solving skills. But I still managed to pull off an A.
I was 19 years old at the time. With the typical logic of a 19-year-old, I concluded that I should go to medical school. I changed my undergrad major to premed, and began taking the math and science courses that comprised that academic track.
That’s how I crossed paths with organic chemistry. Organic chemistry was nothing like the Biology 101 course I had taken over the summer session. Biology 101 was aimed at more or less the entire student body. (I initially took it to satisfy my general studies science course requirement.) Organic chemistry was aimed at future heart surgeons and chemical engineers. Organic chemistry was the most difficult academic course I have ever taken, or attempted to take.
Organic chemistry is difficult because it requires the ability to memorize lots of information, as well as the ability to apply that information in the solution of complex problems. Organic chemistry is, in short, the ideal weed-out course for future heart surgeons and chemical engineers.
How did I do in organic chemistry? Not very well. I managed two gentlemanly Cs, and I dropped out in the third semester.
My dropping out was no surprise to my professor. Nor was I alone; plenty of other students dropped out, too.
Early in the course, I remember the professor saying, “Not everyone is cut out to be a doctor or a chemist. Organic chemistry is a course that lets you know if you’re capable of being a doctor or a chemist.”
That was 1987, long before the participation trophy, and back when a snowflake was nothing but a meteorological phenomenon. My experience with organic chemistry was harrowing, so far as “harrowing” can be used to describe the life of a college student. But in those days, disappointments, setbacks, and the occasional outright failure were considered to be ordinary aspects of the growing up experience. My organic chemistry professor did not care about my feelings or my self-esteem. He only cared if I could master the intricacies of stereochemistry, alkenes, and resonance.
The good news is that I was able to quickly identify a career that I would probably not be good at. Even more importantly, you, the reader, will never look up from an operating table to see me standing over you with a scalpel.
If we have now reached the point where students can vote their professor out of a job because a course is too hard, then we’ve passed yet another Rubicon of surrender to the cult of feel-good political correctness.
A decade ago, many of us laughed at the concept of the participation trophy. But at the same time, many of us said: “What’s the big deal?”
The big deal is that small gestures, small surrenders, have larger downstream consequences. A participation trophy is “no big deal” on an elementary school soccer field. At medical school, participation trophies can endanger lives, by enabling the less competent to attain degrees and certifications which they would never have acquired in saner times.
Are you planning on getting heart surgery down the road? You might want to get it now, before the present generation of premeds and medical students becomes the next generation of doctors.
Kristen Clarke, Biden’s nominee to head the DOJ Civil Rights Division, penned a 1994 letter to the Harvard Crimson, stating that African Americans have “superior physical and mental abilities”. At the time, Clarke was an undergraduate at Harvard, and the president of the university’s Black Students Association.
Clarke based her letter on…race science.
Here are some excerpts from the letter:
“One: Dr Richard King reveals that the core of the human brain is the ‘locus coeruleus,’ which is a structure that is Black, because it contains large amounts of neuro-melanin, which is essential for its operation.
“Two: Black infants sit, crawl and walk sooner than whites [sic]. Three: Carol Barnes notes that human mental processes are controlled by melanin — that same chemical which gives Blacks their superior physical and mental abilities.
“Four: Some scientists have revealed that most whites [sic] are unable to produce melanin because their pineal glands are often calcified or non-functioning. Pineal calcification rates with Africans are five to 15 percent [sic], Asians 15 to 25 percent [sic] and Europeans 60 to 80 percent [sic]. This is the chemical basis for the cultural differences between blacks and whites [sic].
“Five: Melanin endows Blacks with greater mental, physical and spiritual abilities — something which cannot be measured based on Eurocentric standards.”
Obviously, this is complete hooey, dressed up in the sort of pseudo-scientific language that passes for erudition at places like Harvard.
Obviously, the mainstream media would be shrieking, Twitter would be exploding, if a white nominee to any senior federal government post had made similar claims about whites, based on “race science”.
Nevertheless, I’m of two minds on this one.
Clarke’s age is not available online, but her Wikipedia entry states that she graduated from Harvard in 1997. Backing into the numbers, this would mean that she was about 19 years old when she wrote the above words.
First, most people don’t reach full adulthood until they are about halfway through their twenties. (This is why I would be in favor of raising the voting age, rather than lowering it, but that’s another discussion.)
This doesn’t mean you should get a blank check for everything you do when you’re young, of course. But there is a case to be made that all of us say and think things during our formative years that will make us cringe when we look back on them from a more mature perspective.
This is certainly true for me. I was 19 years old in 1987. I am not the same person now that I was then—both for better and for worse.
Second, let’s acknowledge environmental factors. Being a student at Harvard is likely to temporarily handicap any young person’s judgment and intellectual maturity. Even in 1994, Harvard University was a hotbed of pointy-headed progressivism and insular identity politics.
Clarke was also involved in the Black Students Association. There was a Black Students Association at the University of Cincinnati when I was an undergrad there during the late 1980s. Members of UC’s BSA were known to write whacko letters like the one above. Most of them, though, were nice enough people when you actually talked to them in person. They just got a little carried away when sniffing their own farts in the little office that the university had allocated for BSA use.
What I’m saying is: I’m willing to take into account that 1994 was a long time ago. A single letter from a 19-year-old, quoting pseudo-academic race claptrap, shouldn’t be a permanent blight on the record of a 47-year-old. And I would say the same if Kristen Clarke were white, and had taken a very different spin on “race science”.
We all need to stop being so touchy about racial issues, and so preoccupied with them. That goes for whites as well as blacks, and vice versa.
I’m willing to give Clarke a fair hearing, then. But I’m skeptical. Her 1994 Harvard letter isn’t an automatic disqualifier; but it’s a question that needs to be answered.
I’m also skeptical of Biden. Biden may be a feeble old man; he may be a crook. He is not particularly “woke” at a personal level. In fact, some of his former positions on busing and crime suggest that he’s anything but “woke” on matters of race.
Yet Biden is now head of a Democratic Party that is obsessed with race. This means that Biden may try to overcompensate, by filling his government with race radicals. This recent selection supports that concern.
Given how much time has passed since 1994, and given Kristen Clarke’s age at the time, I want to hear what she has to say in 2021 before I condemn her outright as a hater or a loony. But this recent personnel selection doesn’t make me optimistic about the ideological tilt of the incoming Biden administration.