Billie Eilish, Van Halen, and the inevitability of musical obsolescence

In lighter news, Wolfgang Van Halen, the son of Van Halen lead guitarist Eddie Van Halen, has publicly defended teen singer Billie Eilish.

Why? Eilish didn’t know anything about his dad’s famed rock band—even that it existed.

When she recently appeared on the Jimmy Kimmel Live show, the host asked her to name a few members of Van Halen. Eilish’s response was “Who? No, who is that?” Kimmel (jokingly) threatened to cry, and the older sector of the Twitterverse went wild with gibes about Eilish’s lack of knowledge about great rock bands of the 1980s.

I’m certainly okay with chiding youngsters who can’t identify the major combatants of World War II, or the three branches of the US government. Regarding their knowledge of Reagan-era popular music, though, I’m inclined to cut them some slack.

Knowledge about music (like other elements of pop culture) tends to be generational. Van Halen’s heyday was in the 1980s—along with Def Leppard, Rush, Iron Maiden, and other bands from that era that I love. That era ended a decade before Eilish was even born, though.

I think it’s great that many of these bands are still touring, and even making new music. One of my high school friends recently attended a Who concert with her husband. She noted on Facebook that she was younger than the Who—and she’s fifty-one. If you’d asked me in 1982, when I was a freshman in high school, I would have told you that the Who was a fading relic of yesteryear. Well, I now technically qualify for membership in the AARP, and the Who is still going strong—for a group of septuagenarian rockers, that is. (I wonder if they’re still bedding groupies?)

That said, I also recognize that the old bands that are still making music are largely playing to older audiences. I’m sure there are a handful of teenage Def Leppard fans—just like there were kids who obsessed over the long-defunct Doors in the 1980s. Most of us, though, tend to focus on music that was new and popular during our teens and twenties. (Speaking of the Who: Although the Who is mostly a 1960s/1970s rock band, they put out two commercially successful albums in the early 1980s: Face Dances (1981) and It’s Hard (1982). This explains their popularity with people of my generation.)

Likewise, I don’t know much about the music that young people are listening to nowadays. I do know who Taylor Swift is—because her face seems plastered to the Internet. But I hadn’t heard of Billie Eilish, to cite one example. To be honest, I haven’t really become a fan of any new music since the mid-1990s, when I was in my mid-twenties.

It’s perfectly okay for musical tastes to change. That’s the whole concept behind “popular culture”. If musical tastes never changed, we’d all still be listening to Frank Sinatra and Elvis. It is also inevitable that musical acts that were universally known to one generation will become distant trivia to the next.

But I’m an old guy, and the kids can have the new stuff—including this Billie Eilish. I’m not going to stop listening to Rush, Def Leppard, and yes…Van Halen.

Should you major in the liberal arts?

One day in August of 1986, I sat down before the desk of a Northern Kentucky University (NKU) guidance counselor to discuss my class schedule for the upcoming fall semester. I was an incoming freshman. 

The guidance counselor began by asking me what I intended to major in during my four years at NKU. I told her either history or English literature.

She told me, in so many words, that I was an idiot.

“So what you’re saying,” she said, “is that you want to pay to read books. You can read books on your own time, for free.”

Needless to say, I was taken aback. This was several decades before it became fashionable to refer to callow young adults as “snowflakes”. Nevertheless, I was used to receiving almost unconditional encouragement from the adults in my midst. The ink on my high school diploma was still drying, after all. 

I explained to her that English and history had been my favorite subjects in high school. She told me that that didn’t matter. What mattered was, how was I going to prepare myself to earn a living by reading Shakespeare, and learning about the Treaty of Ghent? 

What are the liberal arts?

I will freely admit that I love the liberal arts. 

What are the liberal arts, exactly? In short, they are subjects of fundamental, as opposed to applied, inquiry. 

Imagine a group of students at one of the great universities of Europe during the Renaissance, say in…the year 1504. How many of them do you think were studying HVAC repair or managerial accounting? No, most of them were studying subjects like philosophy, mathematics, astronomy, and literature. 

The liberal arts, then, aren’t necessarily subjects for the numerically challenged, like literature and history. They are actually divided into four categories: social sciences, natural sciences, mathematics, and humanities. Some of these require a substantial amount of number-crunching. 

The problem with the liberal arts

The problem with the liberal arts, of course, is that it’s hard to make a living as a philosopher or a historian. In most cases, a career in the liberal arts means a career in academia. That can be an okay career, but there are only so many universities and so many open slots. 

And you’ll need an advanced degree. As one of my college biology professors once told my Biology 101 class: “A Bachelor of Science in Biology qualifies you to sell shoes.”

Oh, and that was in 1987, when there were still mall shoe stores to give you a job selling shoes. Today, of course, people buy shoes from Amazon—where they buy seemingly everything else. 

What should you major in, then?

Probably something that you see people doing around you, for actual money: nursing, engineering, accounting, and yes—HVAC repair. (The guy who works on my furnace and air conditioner is typically scheduled four weeks in advance). 

This is a bitter pill to swallow for those of us who love books. It’s not that there aren’t books in the applied fields, but they are different kinds of books, and you’ll require a different kind of curiosity to get enthusiastic about them. 

But trust me—it can be done. I finally ended up majoring in economics—yes, another field in the liberal arts! Later on, though, I took some accounting courses at the graduate level. Accounting can be interesting. Really, it can be.

Why should you be so coldly realistic when choosing a major? The answer is simple: money. With rising tuition costs, a college education is an investment. And every investment requires a plan for ROI (return on investment).
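To make the ROI point concrete, here is a minimal back-of-the-envelope sketch in Python. Every figure in it (tuition, salaries) is a hypothetical placeholder rather than data from this post; the idea is simply that you can run the numbers for your own situation before you choose a major.

```python
# Back-of-the-envelope ROI sketch for a college degree.
# All figures are hypothetical placeholders -- substitute your own numbers.

def payback_years(total_tuition: float,
                  salary_with_degree: float,
                  salary_without_degree: float) -> float:
    """Years of extra earnings needed to recoup the tuition investment."""
    annual_premium = salary_with_degree - salary_without_degree
    if annual_premium <= 0:
        return float("inf")  # the degree never pays for itself
    return total_tuition / annual_premium

# Hypothetical example: $60,000 in total tuition, a $55,000 starting salary
# with the degree versus $35,000 without it.
print(f"{payback_years(60_000, 55_000, 35_000):.1f} years to break even")  # ~3.0
```

The specific numbers don’t matter; what matters is doing this kind of arithmetic before the tuition bills arrive, not after.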

If this doesn’t seem immediately obvious to you (it didn’t seem obvious to me in 1986), don’t be too hard on yourself. Our system of higher education is still stuck in the early twentieth century in many ways. 

Before World War II, a university education was largely seen as a “finishing school” for members of the upper classes. Most of them didn’t have to be too practical. They already had plenty of career options. Many went to work in well-established family businesses. 

In the decades after World War II, the significance and the purpose of the university degree gradually changed. On one hand, by the 1960s, people from the middle and working classes were now attending university. That was a good thing. On the other hand, though, the 4-year university degree lost its scarcity value. And since people in the middle and working classes needed to earn money, they had to be practical when choosing a major.

These changes were already well underway in 1986, when I entered college. But I’m not going to lie to you: When I graduated in 1990, it was still possible to get a corporate, professional job if you had some kind of a 4-year degree—even in history or astronomy. 

Yes, the engineering majors were most in demand, but few English lit graduates truly went without. Somebody would hire you. The only downside was, you wouldn’t be using your major. I graduated with a B.A. in Economics, but no employer ever asked me to draw a supply-and-demand curve, or estimate the marginal utility of some activity. (My first job out of college was a low-level sales job for a machine tool distributor.) 

Nowadays, though, it isn’t uncommon to find twentysomethings with BAs in English working as baristas and waiters. I’m not gloating here, or saying that’s good or fair. I am saying that it’s harder than ever to get a decent-paying job with a liberal arts degree.

What about the liberal arts, then?

Here’s another confession: I still love the liberal arts, and I still think they’re vitally important. 

There are some things that every person should know, and most of these come under the rubric of liberal arts. Every person should know how to set up a basic algebraic equation. Every person should know the differences between capitalism and Marxism. Every person should know the basics of the histories of the major world cultures, and the major world religions. 

Oh, and the Treaty of Ghent: Signed on Christmas Eve, 1814, the Treaty of Ghent ended the War of 1812 between the United States of America and Great Britain. If you’re reading these words in one of those two countries, you should probably know the Treaty of Ghent, too.

To revisit the advice that the NKU guidance counselor gave me over thirty years ago: You can learn about the liberal arts without spending thousands of dollars in university tuition. 

Want to read Shakespeare? His works are in the public domain. If you have an Internet connection, you can read them for free. You don’t have to spend $300 per credit hour to read Shakespeare. 

Likewise, most of the liberal arts have been covered in thousands and thousands of books, written by some very bright people. You can get books for free at your public library. Or you can spend a little money and get them on Amazon. (But you won’t spend even a fraction of what you’d spend to acquire a 4-year degree in one of these subjects.)

The hard truth

I wish the truth were otherwise. I love the idea of study for its own sake. But given the realities of tuition fees and the job market, you probably shouldn’t pay to take college courses in Shakespeare or European history. 

You’d be much better off reading about these subjects on your own time. Go to college for engineering or nursing. 

Or maybe HVAC repair. Remember that HVAC repairman who’s booked four weeks in advance. I doubt that there are many philosophers with four-week waiting lists, and not many poets or playwrights, either. 

The 100-year-old German grenade

Sometimes truth really is stranger than fiction.

Imagine the following scenario: During WWI, factory workers in Germany assemble a Granatenwerfer grenade for use against troops of the Allied Powers. 

A century later, that same grenade, still unused and undetonated, turns up in a river in Grand Rapids, Michigan–one century, two continents, and an ocean away.

Oh, and a man pulls it from the river with a magnet!

If a novelist included such a plot device in a story, readers would call it contrived. But this is exactly what happened in real life, as a recent news story detailed.

Exactly how did a 100-year-old German grenade from WWI end up in a river in Michigan?

I don’t think that anyone will ever know the full answer to that. But it must be a good story. 

Dysfunctional families and Thanksgiving

Yesterday I called one of my former work colleagues to wish him a Happy Thanksgiving in advance. I asked him what he planned to do for the holiday. He’s married with children, after all, and folks who are married with children are supposed to have festive family holidays.

Not so much, in his case. None of his relatives live in the same area of the country that he does. His wife’s parents are dead. She has three sisters, but none of them are speaking to his wife, for various reasons.

What about his children? I asked.

They’re away at college, he told me, and doing things with their friends.

Oh.

Another married friend of mine is spending Thanksgiving not with either set of parents (all four of whom are still alive), but with friends in his adopted city of Pittsburgh. To the best of his knowledge, his young adult children will be present, though (according to him) they’ll spend most of the dinner exchanging text messages with friends.

Was Thanksgiving always like this? Not according to my memory…Or maybe I just imagined that.

All of us, I think, have an image of the blissfully happy Thanksgiving get-together, straight out of a Norman Rockwell painting. That is an alluring ideal, and (let’s be clear here) an admirable one. No one is going to convince me that idyllic family life isn’t a worthwhile goal.

But my experience and observation lead me to believe that it’s an ideal that fewer than half of us manage to attain anymore—if we ever did.

My family life was relatively happy, as such things go. But there were relatives with drinking problems (I’m part Irish Catholic, after all), and heated disagreements over politics—decades before the Trump era. I remember riding home from Thanksgiving in the back seat of my dad’s station wagon, both of my parents angry about something that an aunt or a cousin had said during the extended family meal.

If this holiday finds you in an ideal (or close to ideal) family setting, count your blessings. If that isn’t the case, try not to despair too much.

You’re not alone, after all.

How songs connect us to memory

I was never a huge fan of Richard Marx. (I never actively disliked his music, either. I just wasn’t a raving fan.) 

But boy, the summer of 1987 was his moment. That summer, I spent a lot of time in my car, listening to FM radio, and one Richard Marx song in particular was on the radio endlessly.

When I hear it now, I’m instantly transported back to that time and place. That hot, fun summer. I was nineteen years old.

Songs often anchor us to particular moments in our past–sometimes even songs that we didn’t necessarily love at the time, but nevertheless heard a lot.

The Internet, Jonathan Franzen, and distractions

About a year ago, literary novelist Jonathan Franzen shared his “10 rules for novelists”. Number 8 was:

“It’s doubtful that anyone with an Internet connection at his workplace is writing good fiction.”


I’m not sure I would be this absolutist about the matter. But as someone old enough to have reached adulthood before the Internet was “a thing”, I can appreciate just how distracting cyberspace can be.

It was bad enough in the beginning. But then came social media (I’ll spare you my usual rant), and those damned smartphones. 

As for Jonathan Franzen: The guy gets a bad rap, and I’m not sure why. Yes, he is quirky and eccentric. Yes, he is fashionably progressive and eye-rollingly politically correct in his politics. But no more so than many other people in the arts.

I’ve read two of his novels: The Corrections (2001) and Freedom (2010). I thought both books were pretty good. 

Fall of the Berlin Wall + 30 years

The Berlin Wall fell thirty years ago today, on November 9, 1989. (Several more years would pass before it would be systematically demolished.)

I won’t recount the entire history of the Wall’s fall here. (You can find that in various places throughout the Internet.) But I will provide a personal perspective.

I was twenty-one years old in November 1989, and a college student. Like most people at the time, I viewed the fall of the Wall with intense optimism.

And there was a lot to be optimistic about in late 1989: The USSR still existed, but a progressive-minded reformer, Mikhail Gorbachev, was at the helm. And he was allowing the Berlin Wall, that symbol of Cold War Soviet tyranny, to come down.

US domestic politics were relatively calm. Not everyone loved George H.W. Bush, of course. But few saw his administration as seriously divisive. This was an era when you could simply ignore US domestic politics, if you wanted to. There wasn’t a lot of drama.

There were problems in the Muslim Middle East. (Aren’t there always?) But the August 1990 invasion of Kuwait was still just a gleam in Saddam Hussein’s eye. No one in the West had yet heard of Osama Bin Laden.

We believed, at that time, that the world was on the verge of a new era of free markets, international harmony, and peace.

Some scholar–Francis Fukuyama, I believe it was–described this moment as “the end of history”, meaning: the end of traditional historical conflicts.

But it didn’t work out that way, did it? Russia did not develop into another Sweden (as many predicted at the time), but became a paranoid, bellicose, neo-czarist state, in some ways worse than the USSR. The Muslim Middle East continued its long descent into fratricidal chaos. China became more aggressive.

And the West–well, let’s just say that both North America and Western Europe looked much better in 1989 than they do today.

Proof that things don’t always work out as you expect. The evidence can deceive you. Sometimes the future is better than you anticipate, but sometimes it’s far worse, too.

Catherine the Great: the HBO miniseries

I’ve been watching the Catherine the Great television miniseries on HBO.  A few observations, in no particular order:

-This is based on a period of history that many American viewers won’t know much about. You will get far more out of this miniseries if you have a familiarity with Russian history. No–you don’t need a PhD. But if you don’t know who Catherine the Great actually was, and her historical significance, you might want to skip this one. The HBO miniseries is primarily entertainment, but it assumes a certain degree of background knowledge.

-Not every minute is exciting. There are plenty of gunfights, riots, and decapitations, as befits any historical depiction of czarist Russia. But there are also many explorations of Catherine the Great’s relationships with men–both political and sexual. (In her case, the two often overlapped.)

-Helen Mirren is amazing. She is 74 years old, and she handles those ballroom dancing scenes like a woman half her age. Quite impressive. I hope I do so well when I’m in my seventies.

Halloween 2019: report

The weather didn’t cooperate for Halloween in my part of the world (Cincinnati).

After a mostly dry, balmy October, a cold front brought a downpour to the area last night through this morning.

Then around noon today, the rain stopped and the temperatures started falling.

I mean falling…

By the time the trick-or-treat hour of 6 p.m. arrived, it was below forty degrees, and the wind was gusting. Far from ideal trick-or-treat weather.

There were trick-or-treaters, but far fewer than the usual number. As a result, I now have a stockpile of Reese Cups.

I don’t eat candy; but I’m sure I’ll find someone willing to take the surplus off my hands. Reese Cups are delicious…as I seem to remember.

Camrys are fragile things

This is what happens when a Toyota Camry strikes a pole in a parking lot at low speed.

The concrete-reinforced pole sustained no damage, I should note.

No, I wasn’t driving the vehicle. A relative of mine was driving it.

This is just a friendly reminder to be careful when you’re behind the wheel. Driving is serious business, and it doesn’t take much to do damage to things–or people–when cars are involved.

Black Tuesday

On October 29, 1929–90 years ago today–the world changed.

This was the Crash. Investors on the NYSE lost $14 billion ($206 billion in 2019 dollars). Over the next four days, total losses would balloon to $30 billion. You do the rest of the math…

Black Tuesday ended the Roaring Twenties–the Jazz Age of F. Scott Fitzgerald–and brought on the Great Depression.

I don’t remember Black Tuesday or the Great Depression, of course. Those who can remember them are now a dwindling number.

But I do remember some of those who lived through it.  I heard about the Great Depression secondhand.

My grandparents often talked about life during the Great Depression years.  As my grandfather explained it, “You didn’t really consider yourself poor, because everyone around you was poor. Your cousins were poor. Your neighbors were poor.”

My grandmother maintained what the family jokingly referred to as “Depression mindset” through the end of her days. She was very frugal, and very much a hoarder…You know–the kind of person who reuses every glass jar, and buys tea bags in bulk because they’re cheaper that way.

Depression mindset is completely alien to those of us who were born in a time and place of greater abundance.

Reuse a glass jar? Heck, we think our iPhones are “old” after we’ve been using them for two years.

Whether that’s a good thing or a bad thing…well, I’ll leave that one up to the reader.

My new iPhone 11

I had been using an iPhone 6 Plus that I’d purchased in May 2015. I believe in getting my money’s worth, but hey, at a certain point, you’ve got to upgrade. 

My iPhone 6 had deteriorated to the point where it would no longer shut down, the silent toggle switch no longer worked, and various apps (including the text app) closed at random while I was using them.

Time to upgrade!

I bought a new iPhone 11 last week, and so far I’m very happy with it. 

I’m particularly impressed with the video camera. It may even inspire me to give YouTube another try.

As I wrote yesterday, the Apple Store has become a lousy customer experience. But Apple still makes great products.

Rereading ‘Salem’s Lot’ after 35 years

The original hardcover, published in 1975

I recently decided to reread Stephen King’s vampire novel, ‘Salem’s Lot. This seemed reasonable enough, as I had first read the book in 1984. (After thirty-five years, just about any novel or film will seem fresh again.)

I have a lot of nostalgia associated with this novel, as I tend to have a lot of nostalgia associated with a lot of things. This was the book that birthed my adult interest in reading and writing.

In February of 1984, I was a sophomore in high school. During my free period, I worked behind the counter of the school library. That’s right: I was a librarian.

But I wasn’t a big reader. Not at that time, at least. I had been a very avid reader during my childhood years, devouring series like John Dennis Fitzgerald’s The Great Brain, and Alfred Hitchcock and the Three Investigators.

Once I hit puberty, though, I developed other interests: football and rock music, specifically.

I did play high school football for a while—if you can dignify what I did with that description. (I was a third-string right tackle, or something like that.) And I messed around with a few garage bands. I can still play the basic chords on a guitar. (But I was always much more interested in lyrics than in music.)

One day, when things were slow in the school library, I picked up a dog-eared paperback copy of ‘Salem’s Lot on a whim, and started reading it.

I was immediately hooked. I checked the book out, and read the entire thing in less than a week.

After that, I read the rest of Stephen King’s oeuvre, as it existed in 1984. Stephen King fans tend to divide themselves between those who prefer his newer style (long, rambling books like Duma Key and 11/22/63) and those who prefer the tightly plotted, shorter novels of his earlier years. Put me solidly in the latter camp. The Stephen King books I most love (The Stand, Pet Sematary, Christine, Carrie, The Dead Zone, Cujo, and ’Salem’s Lot) were already available in 1984. (’Salem’s Lot, in fact, had already been out for nearly a decade in 1984, and had already been adapted into a made-for-TV movie, starring David Soul as Ben Mears.)

 

There is much about ‘Salem’s Lot to love. Let’s start with the way Stephen King pulls you into the small-town New England setting. I have spent most of my life in Ohio, and I’ve never been within a hundred miles of Maine. But when I read ‘Salem’s Lot, I had a deep, palpable feeling of small-town Maine life in the mid-1970s, when the story takes place.

The horror element of the story builds slowly, and is an organic part of the setting. The horror is embedded in the history of the town, and Ben Mears’s terrifying childhood experience in the Marsten House. When the supernatural phenomena begin to occur, they are believable precisely because Stephen King has already made you believe in this world of ‘Salem’s Lot, a small town in rural Maine.

It starts with the very prosaic, quite mundane details, as seen through the eyes of Ben Mears. The novel opens as Mears, still haunted by the death of his wife, drives into the town where he had spent a few happy summers of his childhood:

…and he could see Schoolyard Hill through the slash in the trees where the Central Maine Power pylons ran on a northwest to southeast line. The Griffen farm was still there, although the barn had been enlarged. He wondered if they still bottled and sold their own milk. The logo had been a smiling cow under the name brand: “Sunshine Milk from Griffen Farms!” He smiled. He had splashed a lot of that milk on his cornflakes at Aunt Cindy’s house.

That, you see, is how a master horror writer like Stephen King suspends your disbelief. He begins by investing you in the characters and the settings. Then he introduces the paranormal—the scary stuff.

 

The vampires in ‘Salem’s Lot are old-school vampires. They are spiritually foul, evil creatures who pose a threat to your immortal soul. The best horror fiction involves the threat of death—either spiritual death or physical death. ‘Salem’s Lot involves both.

I will confess a love of the old-school vampires, done in the Bram Stoker mode. I moderately enjoyed Richard Matheson’s I Am Legend, but it was a lightweight vampire novel compared to ’Salem’s Lot. A virus-created vampire is not a proper vampire. A proper vampire must be a supernatural, reanimated being. It must recoil from crucifixes, and be burned by holy water. A vampire is not a scientific accident, or a misunderstood antihero (more on that abomination shortly).

 

Stephen King maintains a pretty tight pace throughout ‘Salem’s Lot. Like I said, I read it the first time in less than a week; and I read it the second time at a similarly brisk pace.

Nevertheless, the book was originally published in 1975. Since then, much has changed. The reading public has become accustomed to 200+ channels on cable television, James Patterson-style minimalist thrillers, and…of course, the Internet, cell phones, and all the distractions of digital life. Attention spans are much shorter than they were in 1975, or even 1984.

I would like to declare that I haven’t been personally influenced by any of this, but I know better. As much as I admire Stephen King’s “world-building” in ‘Salem’s Lot, there are a few passages in which he spends too many words going in-depth on the foibles and petty hypocrisies of small-town life.

Also, I was fifteen when I read the book for the first time. I was fifty when I reread it. In the intervening years, I have read many novels, and consumed countless television dramas, movies, etc. Perhaps my standards are more exacting than they were in 1984.

 

There is a feeling of pathos that the reader gets from ‘Salem’s Lot, and I believe that this is one of the book’s under-appreciated aspects. Much of the best horror fiction does leave us slightly sad and reflective. After reading a good horror novel, you should be like the wedding guest in The Rime of the Ancient Mariner: “a sadder and a wiser man” (or woman).

Ben Mears comes to ‘Salem’s Lot in order to recover from an existential tragedy, the death of his wife, Miranda, in an accident. What he encounters there, however, is yet another tragedy—this one even more profound and disturbing.

On a personal level, he briefly finds love again, in his budding relationship with Susan Norton. But that (spoiler alert) is not to last. His loss of Susan, moreover, will be closely tied to the vampire outbreak, culminating in a scene that is reminiscent of a scene in Bram Stoker’s Dracula.

 

I love ‘Salem’s Lot, as this post probably makes clear. My own personal attachment to the book aside, I sincerely believe that it is a great novel, and probably the best novel of the vampire genre yet written.

I despise what Stephenie Meyer and her many imitators have done to the vampire genre. The vampire should be dark and terrifying. Twilight—and the many Twilight knock-offs—have transformed the vampire into a teenage girl’s romantic fantasy. (Search for “vampire novel” on Amazon, and most of the results will be YA romance novels. Gag me.)

But we still have ‘Salem’s Lot. If you like the idea of a real vampire novel, then you should definitely read this one, if you haven’t done so already.

‘The Breakfast Club’: its strengths, and yes…its flaws

This was one of the big teen movies of my youth. I saw it when it came out in the mid-1980s. I recently watched it again as a middle-aged (51) adult.

 The basic idea of The Breakfast Club is immediately relatable: Five very different teens (a nerd, a jock, a princess, a basket case, a criminal) are thrown together in the enclosed space of their high school’s library. They are then forced to interact over the course of a day-long detention period on a Saturday. This is a small drama, but also a much larger one: The setup for the movie provides a concentrated and contained view of all teenage interactions.

Why we like The Breakfast Club

I liked The Breakfast Club, for all the usual reasons that millions of people have liked the movie since it first hit cinemas in February 1985. Everyone who has ever been a teenager can relate to feeling awkward and misunderstood; and The Breakfast Club has teenage angst in spades. The cast of characters is diverse enough that each of us can see parts of himself in at least one of these kids. 

The Breakfast Club is free of the gratuitous nudity that was somewhat common in the teensploitation films of the era. There is no Breakfast Club equivalent to Phoebe Cates’s topless walk beside the swimming pool in Fast Times at Ridgemont High. (There is a brief glimpse of what is supposed to be Molly Ringwald’s panties. But since Ringwald was a minor at the time, an adult actress filled in as a double for this shot.)

Nor are any of the actors especially good-looking or flashy. They all look like normal people. No one paid to see this movie for its star power or sex appeal. The Breakfast Club succeeded on the basis of its script, and solid acting and production values. 

What I didn’t see in 1985

I enjoyed the movie the second time around, too. I have to admit, though, that teenage self-absorption can seem a little frustrating when viewed through adult eyes.

I’m the same age as Anthony Michael Hall and Molly Ringwald; we were all born in 1968. The other actors in the film are all within ten years of my age. Nevertheless, this time I was watching their teenage drama unfold as an older person–not a peer. Teenage drama is, by its very nature, trivial (and yes, a little annoying) when viewed from an adult perspective. 

The movie also makes all adults look corrupt, stupid, or craven–as opposed to the hapless, victimized, but essentially idealistic teens. Every young character in The Breakfast Club blames his or her parents for their problems, and these assertions are never really challenged.

We get only a few shots of the parents, when the kids are being dropped off for their day of detention. The parents are all portrayed as simplistic naggers. 

The teens’ adult nemesis throughout the movie, Assistant Principal Vernon, is a caricature, a teenager’s skewed perception of the evil adult authority figure. The school janitor, meanwhile, is no working-class hero, but a sly operator who blackmails Vernon for $50.

A movie written for its audience

One of the reasons you liked this movie if you were a teenager in 1985 is that it flattered you–without challenging your myopic, teenage perspective on the world. If you weren’t happy, it was probably because of something your parents did, not anything that you did–or failed to do. 

That may have been a marketing decision. Who knows?  The Breakfast Club goes out of its way to flatter its target audience–the suburban teenager of the mid-1980s. I suppose I didn’t see that when I was a member of that demographic. I see it now, though. 

My top three recurring dreams

I have a tendency to dream almost every night. I don’t want to get all woo-woo on you (or well, maybe I do), but I regard dreams as a gateway into…something. And by that I mean more than just my subconscious. 

I certainly have dreams that I would rather not share. But within the range of what is shareable and family-friendly, these are the recurring dreams that I tend to have the most often:

1. The exam dream: I show up to a class on exam day. Oops! I forgot that I was taking the class. I haven’t been there all semester! But today is exam day! I’m screwed!

2. The chased by criminals dream: At least once per month, I dream that I’m being chased by a criminal gang. Sometimes they’re common hoodlums, sometimes they’re mafiosos. But they’re always in pursuit of me, for some undefined reason. 

3. I’m driving on an expressway and I can’t get off. This might be attributable to the fact that driving has never been one of my favorite activities.   I also get this dream multiple times per month. I’ll be behind the wheel of a car, on a narrow expressway. I’ve missed my exit, and there’s no way for me to get off the highway. 

The exam dream, in particular, seems to be common. (There are a number of articles about it on the Internet.) The other two…I’m not sure about.

“I Know George Washington”: about the story

A college student takes a summer job in a very unusual company in rural Virginia. 

What’s unusual about the company? Everyone insists that George Washington—the George Washington—is the founder and owner of the firm. 

Moreover, the great man himself will make an appearance before the end of the summer.

That’s the setup for the story, “I Know George Washington”.  

This is one of those stories that came to me in a dream (as so many of them do). 

Or, I should say, the basic idea came to me in a dream—not the complete story. 

After getting the initial seed of the idea, I spent some time fitting it into a narrative. The result is not quite a horror story, but something that might be called psychological suspense, in the spirit of the old Alfred Hitchcock movies like Rear Window and Vertigo. (I am a big fan of Hitchcock, and his technique of playing with the protagonist’s—and the audience’s—hold on reality.)

I’m presently posting snippets of “I Know George Washington” on Edward Trimnell Books.

The great Ohio heat wave

Heat waves past and present…

It’s been unseasonably warm in Ohio for the past week. Today the mercury hit 93 degrees for the high.

I’m familiar with the term “Indian summer”, but this is ridiculous. We’re having dog days of August weather, with Halloween just a few weeks away.

This isn’t the first miserable Indian summer in the Cincinnati area. I distinctly recall the hellish September/October of 1985. I was in my senior year of high school, and running cross country. There were a few races in which I was sure I was going to collapse from heat stroke.

At Homecoming 1985, on October 12th, I remember sweating in my suit and tie at the big dance.

Why am I telling you this (other than my natural tendency toward nostalgia)? This late-season heat is a bit unusual. But if we had October heat like this 34 years ago, then the world probably isn’t ending (though you never know, of course).

According to the forecast, things are supposed to cool off here tomorrow evening. And not a day too soon.

Why most writers should stay away from Reddit

I will openly confess that social media has never really been my “thing”. And I think that most writers have an uneasy relationship with it, at best.

Most writers get onto social media and immediately want to promote their books.

“Hey! Buy my book!”

“Did you know I have a new book out?”

“Have you seen my new book? Here’s a link to it at Amazon, for your convenience!”

And so on…

Did you see what I wrote? Did you?

I’m not quite that tone-deaf. I have rarely attempted the outright sales pitch on social media. I will admit, however, a tendency to use social media exclusively for linking to this blog.

“Hey, read this post I wrote yesterday. You’ve got to read it. World-changing stuff, I’m telling you!”

This is why I rarely use Twitter. Twitter is a place where people bitch about politics, and discuss material written on external links…by other people. And then they bitch about politics some more. And post some more external links. “Did you see what so-and-so said/wrote/did? Here’s a link.”

I’m not interested in doing that. I always want to post links to my material.

This makes me a bad Twitter user.

Reddit is not for me

But if I’m a bad Twitter user, I would be even worse on Reddit. I wouldn’t even think about getting onto Reddit, in fact. According to the Reddit terms of service:

You should not just start submitting your links – it will be unwelcome and may be removed as spam, or your account will be banned as spam.

You should submit from a variety of sources (a general rule of thumb is that 10% or less of your posting and conversation should link to your own content), talk to people in the comments (and not just on your own links), and generally be a good member of the community.

And furthermore:

It’s perfectly fine to be a redditor with a website, it’s not okay to be a website with a Reddit account.

But the thing is, I would be a website with a Reddit account. I know that. This is why I stay the heck off Reddit.

My ratio would be the exact opposite of what Reddit prescribes. About 90% of my links would be to my own content.

On social media, it’s all about links…and brief, snarky comments

Think about it from my perspective: Why would I want to post only “10% or less” of my own content, when I write content all day? When I have so much of it to post.

You egotistical bastard, you might counter. What, do you think you’re smarter than everyone else on the Internet? Or a better writer, maybe?

My answer to that is: I’m smarter than some, not as smart as others. The same goes for being a better writer.

But there is another way to look at this. I remember the pre-social media days, when “webrings” were the thing. A common complaint back then focused on websites that consisted only of links—with no original content. Often you would go from website to website, finding nothing but lists of links.

That was considered bad netiquette back then. But Reddit and Twitter are all about linking to content you haven’t created. A complete flip-flop of the old Internet ethos.

This doesn’t mean that Reddit and Twitter are bad, mind you. I also understand the deeper reasons for the draconian “ten percent rule” at Reddit. The platform’s members don’t want to be overwhelmed with “buy my x!” posts, which would be the inevitable result otherwise.

But this is also why I mostly stay off Twitter and Reddit, and other social media platforms that are all about linking to external sources.

 
And why wouldn’t I link to my own stuff?

The bulk of my time is spent creating my own content. That leaves me relatively little time to gather and curate content written by others.

And yes, there is an unabashedly selfish side to this, as well: After I’ve spent a few hours working on an essay or a short story, will my first impulse be to link to something a stranger wrote? Or an article from USA Today?

Hell, no. My first impulse will be to link to what I wrote. That’s only natural.

Curator or creator: know which one you are

But there is also an unselfish side to this. The Internet needs people to curate content, but it also needs people to produce content. If no one produces, then eventually there is nothing to curate.

The key is to know which one you are—a content curator or a content creator.

If you’re primarily a content curator, Twitter and Reddit are for you.

If you’re primarily a content creator, then you should probably stay off Twitter and Reddit. Your time would be better spent working on your own books and blog posts.

Give the curators something to find. They’ll find your stuff…eventually.


Amazon releases a new Kindle

This one has some interesting new features, too:

Amazon has recently released a new version of its cheapest Kindle yet and it’s gotten slimmer compared to previous versions.

For only £69.99 in the United Kingdom or about $89.99, Kindle now has a better screen and front light as well as higher contrast and better touch screen, which were previously only available to more expensive Kindle versions.

This was also the first Kindle under £100 or $130 with a built-in adjustable front light, according to Eric Saarnio, head of the Amazon devices in Europe.

The article also reports that demand for e-reader devices has been down since 2015.

I don’t think this is because people have suddenly stopped e-reading. They are still reading ebooks. But now they’re reading them on their phones.

You may have noticed that smartphones seem to have hypnotic powers, transfixing people for long periods when they should be driving, stepping forward in line at the bank, or generally paying attention to what is going on around them.

Podcasts, audiobooks gaining on Facebook

Here’s some good news: According to a recent study, podcast and audiobook consumption are up; Facebook usage is down.

Other highlights include:

More than half the US population now reports having used YouTube specifically for music in the last week. This number is now 70% among 12-34-year-olds.

The study shows an estimated 15 million fewer users of Facebook than in the 2017 report. The declines are heavily concentrated among younger people.

That sounds about right. Aside from music, YouTube has mostly been reduced to adolescent humor and political rants (both of which have their place, mind you, but not in unlimited doses). YouTube is a great place to watch the latest Def Leppard video. (Hey, I’m from the ’80s.)

As for Facebook: I use it to keep in touch with old high school friends. Beyond that, I can skip it. (And my younger cousins, all of whom were born since 2000, have zero interest in Facebook.)

On the other hand, I love podcasts, love audiobooks. I still prefer reading. But you can listen to podcasts and audiobooks when you’re on the go.

Ebook sales just 7.9% of revenue for Hachette

Hachette, one of the “big five” publishers, reported that ebooks accounted for 7.9% of its global revenue in 2018:

Hachette reported that sales of digital audio rose 30% across its publishing operations and accounted for 2.7% of total revenue, up from 2.0% a year ago. Ebook sales fell in the United States and United Kingdom, but still represented 7.9% of revenue.

One would imagine that the other publishers experienced similar numbers.

Granted, 7.9% is not nothing, but it falls short of expectations…and previous hype. A few years ago, all the pundits were predicting the end of paper, and the triumph of the ebook…So far that hasn’t happened.

I see similar results in my own books. Since I released the paperback edition earlier this year, 12 Hours of Halloween has been selling almost as many copies in paperback as it does in Kindle.