One of the many things I miss about the 1980s is all the great music: Def Leppard, AC/DC, and others.
But some of the female acts were incredible, too. The Bangles were among my personal favorites.
Here’s a gem I found on YouTube: the group performing on American Bandstand in May of 1986. Ronald Reagan was in the White House, I was about to graduate from high school, and Taylor Swift wasn’t even in diapers yet.
Susanna Hoffs has recently written a novel, too, a romantic comedy entitled This Bird Has Flown. And though I’m ordinarily allergic to romance novels, I’m willing to give this one a plug. (Check out the book here, on Amazon, if you’re interested.)
Every time I visit Japan, I lose weight–whether I want to or not.
As this video describes, getting your fill in Japan is something of a challenge. This is especially true if you have an American-style appetite, like yours truly.
At 5’10” and 160 lbs, I’m not exactly obese. Nevertheless, when I go to Japan, I find that I never get enough to eat.
And I like Japanese food–sushi, sashimi, you name it. I love it. I simply want more of it than one finds at most Japanese restaurants.
If you ever visit Japan, my recommendation is that you find a Chinese restaurant and eat there. Chinese restaurants in Japan specialize in cheap, plentiful offerings, just like they do here in the USA. I had some of the best 麻婆豆腐 (mapo doufu) of my life in Nagoya.
Not all home maintenance and repair can be done yourself, of course. But many common repairs and maintenance tasks can be handled by a reasonably attentive amateur with the right tools.
For example: the replacement of a capacitor. A local AC contracting company quoted me a price of $384 for this repair. The necessary replacement part costs $19.95 on Amazon. You’ll also need a multimeter, which can be purchased on Amazon for a very moderate price.
In case you haven’t noticed, I’m a writer, not a handyman or an engineer. Bookish types like me typically don’t like to get our hands dirty. Well, I discovered a few years ago what a self-defeating attitude that is.
I solicited a quote for a necessary home repair, and found that the company in question wanted about 10x the reasonable price for the job. Two other contractors in the area wanted 2 to 3 times the reasonable price.
I define reasonable here as: parts + labor + (reasonable) overhead + (reasonable) profit.
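To make that concrete, here’s a quick back-of-the-envelope sketch in Python. Only the $19.95 part cost and the $384 quote are figures from my actual experience; the labor, overhead, and profit numbers are assumptions made up for illustration:

```python
# Hypothetical breakdown of a "reasonable" quote for a capacitor swap.
# Only the $19.95 part cost and the $384 quote are real figures;
# labor, overhead, and profit below are illustrative assumptions.

def reasonable_quote(parts, labor, overhead, profit):
    """Reasonable price = parts + labor + overhead + profit."""
    return parts + labor + overhead + profit

quote = reasonable_quote(
    parts=19.95,     # actual Amazon price for the capacitor
    labor=60.00,     # assume ~30 minutes at a $120/hr shop rate
    overhead=15.00,  # assumed share of truck, insurance, office
    profit=20.00,    # assumed margin on a small job
)

print(f"Reasonable total: ${quote:.2f}")   # about $115
print(f"Actual quote: $384.00 ({384 / quote:.1f}x reasonable)")
```

Swap in your own local labor rates; the point is simply that a quote broken out this way makes the markup visible.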
That said, there are limits to what you can do for yourself. The installation of a new AC system is beyond the scope of any homeowner; and environmental laws require that a licensed professional complete the work.
But you should educate yourself about what is actually involved in common home repairs. And never accept a quote from a vendor that isn’t broken out as described above. Most contractors who serve homeowners don’t want to do this, because they don’t want you to know how much they’re overcharging you.
Sometimes the past provides us with a lens for better understanding the present.
In the spring of 2023, Anheuser-Busch launched an online marketing campaign that featured Dylan Mulvaney, a transgender social media influencer, as a spokesperson for Bud Light.
This resulted in a backlash and a boycott, with real financial consequences for the company.
But the backlash was predictable. The Bud Light/Dylan Mulvaney campaign did not take place in a vacuum, after all.
In recent years, many corporations have placed biological men in spaces allocated for women. Sports Illustrated has selected multiple transgender (biological male) models for its annual swimsuit issue.
Even Playboy has gotten into the act, thrusting female-presenting, biological male models before its heterosexual male readership.
The idea here seems to be that if you show heterosexual men enough transgender women, eventually they’ll start seeing them as indistinguishable from biological women.
This follows the twenty-first-century pattern of blunt-force culture warfare, something I’ll return to shortly.
But let’s get back to the Bud Light debacle.
It seemed to me that all sides were losers here. Anheuser-Busch was certainly a loser. Bud Light sales tanked, as Bud Light drinkers turned to other beers. The company’s stock value declined, too.
Alissa Heinerscheid, the Anheuser-Busch marketing vice president who had championed the Dylan Mulvaney campaign, was forced to take a “leave of absence”. That’s code in the corporate world for “fired”.
Did LGBTQ people benefit from this? Not really. Anheuser-Busch had just made them cannon fodder in its efforts to promote one of its products.
But our topic here is the 1980s, so let’s look at some gender-bending controversies from that decade.
Or rather, gender-bending non-controversies. As it turns out, the 1980s were sometimes gay, and sometimes gender-fluid, too. But in ways that weren’t as deliberately confrontational as what you see in the present.
In 1982, I was 14 years old. My parents had just sprung for a basic cable television package, and it included MTV, then a brand new channel.
MTV played nonstop pop and rock videos. I was an immediate fan.
MTV introduced me to lots of new musical acts. I would subsequently buy the albums of some of them, which was exactly what the corporate minds behind MTV had intended.
On the American popular music scene, the early 1980s was the era of the Second British Invasion. Everyone who was big in youth music during that time seemed to speak with a British accent. MTV greatly facilitated this influx of British pop and rock acts.
One of these was a group called Culture Club. The lead singer of Culture Club, Boy George, appeared to be female. Boy George wore makeup and baggy feminine attire. He wore his hair long and in braids, in a distinctly feminine style.
Boy George’s mannerisms were feminine, too. He didn’t sing in a high-pitched falsetto; but his singing voice was high enough to pass for that of a woman.
Some time elapsed before I even realized that Boy George was not a woman. Sure, I sensed that there was something about the female-presenting singer that was atypical. But I was initially fooled.
Boy George is an extreme example, but he wasn’t the only popular musician in the 1980s to tinker with notions of gender norms. There was a whole subgenre of rock music called “glam rock”, in which male musicians took on deliberately androgynous appearances. This started with David Bowie in the 1970s. By the 1980s, groups like Motley Crue and Ratt were wearing makeup and quasi-feminine hairstyles. Women got into the act, too. Annie Lennox of the Eurythmics wore short hair and masculine business suits.
Culture Club, featuring the gender-fluid Boy George, was enormously popular in the early 1980s. Its breakout song, “Do You Really Want to Hurt Me”, released in the fall of 1982, reached the number two position on the US charts.
The term “transgender” wasn’t common in the early 1980s, but that’s what Boy George was. He was a man who presented as a woman in public. At the very least, he was a drag queen.
Suddenly, the gender-fluid Boy George was in front of millions of impressionable young people, every time they turned on MTV. And practically all adolescents and teens watched MTV in the 1980s.
If we cast the 1980s in terms of the present, the next logical question might be: when did the backlash start? Where were the alarmed parents, taking to microphones in town hall meetings throughout the country? Where were the calls to boycott cable companies that included MTV in their basic packages?
We didn’t have the Internet in the 1980s, of course. But we did have CNN, network television, newspapers, and radio. There were certainly political and social movements that went “viral” during this period, like the Nuclear Freeze campaign, or the Save the Whales Movement.
But here’s the thing: there was no backlash against all of this gender fluidity on MTV and throughout popular music. Many adults were aware of Boy George. He was too large a cultural phenomenon to escape their notice.
In 1984, People magazine ran a cover article about Boy George, with the words, “kids are getting his message”. Yet the adult authority figures of 1984 were notably unconcerned. Boy George did not become a flashpoint in a 1980s version of the culture war.
Was this because the 1980s were more liberal? Hardly. Keep in mind that the 1980s are remembered for their conservatism, and not without reason. Politically, this was an era that belonged to Ronald Reagan in the United States, and Margaret Thatcher in the UK. In many of our institutions, members of the World War II generation still occupied positions of leadership.
A group that called itself the Moral Majority was very active, too. This organization, led by evangelical preacher Jerry Falwell, took very public positions on matters of sex and propriety. For example, the Moral Majority constantly campaigned against convenience stores that sold magazines like Playboy and Penthouse. Hotel chains that offered pay-per-view pornographic movies in guest rooms were put on boycott lists.
And yet, the Moral Majority and its various imitators didn’t care about Boy George, the gender-fluid, female-presenting lead singer of Culture Club.
I don’t ever recall hearing an adult fulminate against this man who dressed up in female attire to sing for young people. Not even once.
But the tolerance of Boy George’s gender fluidity went beyond the adults. What about that most maligned of demographics: young, heterosexual males? You may have been told that the 1980s was an era of “toxic masculinity”. Not in regard to Culture Club, at least, it wasn’t. As a teenage boy in 1982 or 1983, your decision to listen to Culture Club—or to ignore them—wasn’t seen as a statement on your masculinity.
Boy George held back nothing, and still, no one on the right cared. In 1983, he was asked in a television interview whether he preferred men or women. He replied, “Oh, both.” In a 1985 interview with Barbara Walters, Boy George elaborated further, stating that he had had both male and female lovers.
And still, there was no call to shield American children from this bisexual, gender-bending singer who made absolutely no bones about who he was and what he was about.
But why were conservatives so blithely tolerant of Boy George and all this public gender fluidity, in the most conservative era in recent history?
Because other things were different in the 1980s, too. Tolerance went both ways. No one on the right insisted on making Boy George and gender fluidity points of confrontation, because no one on the left did, either. Nor did the people who ran our schools, media outlets, and Fortune 500 companies.
I was in high school during the first half of the 1980s. My teachers were certainly aware of Boy George. Yet none of them suggested in class that maybe some of us should change our gender identification because this famous singer appeared to be doing so. Teachers in American grade schools weren’t making such arguments, either.
Nor would a marketing executive at Anheuser-Busch in 1983 have been foolish enough to troll the company’s core demographic by making Boy George a spokesperson for Bud Light. Bud Light commercials of the 1980s were designed to appeal to the beer’s mostly male, mostly blue-collar customer base. Bud Light ads of that era featured the humorous canine character Spuds MacKenzie, and real women in swimsuits. Some Bud Light marketing campaigns even made use of both in the same ads.
Nor did Boy George—or any of his fans—demand that we pretend Boy George was an actual woman, just because he presented as female in public. In fact, Boy George—who is still around—has publicly taken issue with the contemporary pronoun police.
Back in the 1980s, Boy George wanted to do his thing; and his thing was flamboyant, gender-bending, and bisexual. He didn’t demand that you change your ideas of gender and sexuality in order to accommodate his ideas or his choices.
The main strategy in the culture wars of the twenty-first century seems to be not persuasion, but staking out positions that are practically guaranteed to be inflammatory, then daring the other side to knock a chip off one’s shoulder.
As someone old enough to remember the 1980s, I can report that in regard to most matters, people were a lot more laid back and tolerant then. There was an acceptance of diversity, but it was also understood that diversity went both ways. Boy George represented one kind of diversity. So did the predominantly heterosexual, rough-edged culture of the typical Bud Light drinker: what Alissa Heinerscheid, the now-fired marketing VP at Anheuser-Busch, dismissively called “fratty” culture.
This sense of moderation on all sides was why Boy George never appeared in a Bud Light ad in the 1980s, and why he never incurred the public disdain of Bud Light drinkers. Even as many Bud Light drinkers happily sang along with “Karma Chameleon” when that song came on the radio, as it so often did.
A latchkey kid is a child or early teenager who is “home alone” for a few hours each day after school, usually because the parents (or a single parent) are at work, and therefore unavailable.
The latchkey kid phenomenon is closely associated with the 1980s, and the generation of Americans born between 1965 and 1970-something. (The so-called “Generation X”.) That said, this was not the experience of everyone who was a school-age kid at some point between 1980 and 1989. Similarly, the generation that grew up in the 1980s was by no means the first—or last—cohort of young people who spent time alone after school.
All those disclaimers aside, we can speak meaningfully of the observable phenomenon, even if it is less than universal, and not strictly confined to the 1980s. The latchkey kid was a definite 1980s trend, owing to some unique circumstances.
Working moms, aka “career women”
In the third decade of the twenty-first century, the term “career woman” sounds quaint. Some might even find it sexist. Of course women have careers, you might say. And you’d be right, if we’re talking about the 2020s.
But four decades ago, things were different…and changing. Millions of upper- and middle-class women were entering the professional, white-collar workforce for the first time.
The concept of women doing paid labor wasn’t entirely new. Working-class women had long performed paid labor outside the home to one degree or another, usually out of simple necessity. And don’t forget Rosie the Riveter, who filled the vacuum in the male workforce during World War II.
There was also a long tradition of women working in specialized professional careers, especially teaching. (Almost all of my elementary school teachers were women.)
What was new in the 1980s was the mass entry of women into private-sector careers traditionally reserved for men. This is why you heard so much about the “career woman” in the 1980s. This really was a new dimension of female employment, and at an unprecedented scale.
These were also the women who were the young and early middle-age mothers of that era. Their children typically got home from school around 3 p.m., several hours before the end of operations in the typical white-collar workplace.
The result was millions of latchkey kids.
Economic uncertainty
At the same time, a struggling economy had led to high levels of unemployment in the early 1980s. The economy improved as the decade progressed, but unemployment in the United States peaked at 10.8 percent in 1982!
This economic dislocation was dramatized in the 1983 movie, Mr. Mom, starring Teri Garr and Michael Keaton. In the movie, an out-of-work automotive engineer becomes a stay-at-home dad. His wife, meanwhile, becomes the family breadwinner, accepting a high-profile position in the advertising industry. Hijinks ensue, as Dad the Engineer attempts to cope with grocery shopping, housecleaning, childcare, and other traditionally “female” tasks.
Mr. Mom is a comedy; and as would be expected of any movie made 40 years ago, it is largely dated now. Nevertheless, the film serves as a time capsule of the economic anxieties—and realities—of the early 1980s.
As manufacturing took a hit, some traditionally masculine careers (what could be more manly than automotive engineering?) went into decline. Jobs involving computers, marketing, and other forms of white-collar “knowledge work”, were beginning to rise in importance. Many of these careers appealed to women.
Small families and broken homes
The 1980s latchkey experience was also affected by recent demographic changes. The 1980s was, compared to the decades before and since, a decade of small families. By the time the first GenXers were born in 1965, the postwar Baby Boom was petering out. The World War II generation was done with reproduction and childrearing, and that burden increasingly fell onto the shoulders of the Baby Boomers themselves. Most of the Baby Boomers opted for smaller families.
Once married, the Baby Boomers divorced in record numbers, causing divorce rates to peak in 1980. The so-called “broken home” was another reason for the latchkey kid phenomenon. Divorce compelled many mothers to enter the workforce.
My latchkey kid experience
But what if both of your parents were happily married, and gainfully employed? That was my situation.
I was twelve years old in 1980. That same year, my mother took a job as a contract administrator at a local defense contracting firm. The company made the fuses that went into Cold War-era weapons like the TOW anti-tank missile and the Hellfire missile system.
I spent several hours alone each afternoon, between the time when school let out and the time my parents arrived home from work (usually between 5:30 and 6:00 p.m.).
Learning to entertain yourself without technology
What to do during that time? Homework? Surely you jest. There were a few other kids in my neighborhood, and I got along with them. But we could only hang out so much before we grew tired of each other.
How about TV? Afternoon television in the early 1980s was a wasteland for adolescents and teens. Cable television was just taking off, and nothing worth watching aired until the evening, when adult audiences were tuned in.
Most of the afternoon programming on the non-cable networks consisted of either cartoons or soap operas, neither of which was of much interest to a twelve-year-old boy.
Nor did technology offer much in the way of engaging entertainment. The Internet and cell phones were still decades away. Video games were in their infancy. (Think “Pong”.) If someone had uttered the word “iPad” to me in 1980, I would probably have assumed it had something to do with personal hygiene.
That left latchkey kids largely responsible for entertaining themselves. This was especially true on rainy days, and during the winter months, when it was distinctly unpleasant to hang around outside.
Most of us learned to entertain ourselves in various ways. I became an avid reader, and began dabbling with writing my own articles and stories.
I also immersed myself in various hobbies: coin collecting, stamp collecting, and angling. In the summer of 1978, my grandfather had introduced me to bass fishing. I acquired back issues of Field & Stream and Fishing Facts, and read them all cover-to-cover. By 1981, I knew more about developments in fishing than my grandfather did.
Ironically, this was just about the time that I dropped fishing for other, “cooler” pursuits. But I can still speak knowledgeably about the differences between spinning, spincast, and baitcast reels. I could give you a solid introduction to bass fishing in Midwestern and Upper-South lakes and rivers. (The basics of fishing, I’ve since discovered, really haven’t changed that much since I went fishing with my grandfather.)
The net effects of the latchkey kid phenomenon
All this time alone gave me the ability to ignore the crowd, zero in on an objective, and take a deep dive. I have resisted the modern obsession with cellphones and social media, the compulsive need to be constantly connected, and in constant communication.
Spending so much time alone, at such an early age, taught me to keep my own counsel. I don’t need that much approval from others. I really don’t care what most people think, either about me, or about what I’m doing.
This self-containment has obviously given me the advantage of independence. I have many, many faults; but being a joiner, a follower, or a bandwagon-rider is not among them. I never met a rule, group standard, or authority that I couldn’t challenge.
But all that independence and self-containment has a downside, too. The crowd can be a sinister mob, but that isn’t always the case. A thriving society cannot subsist solely on the uncoordinated activities of lone wolves. There are situations in which being a team player, marching in line, and having a willingness to be led by others are necessary and desirable.
I recognize this principle abstractly, but I don’t feel it in my bones. I’ve never been able to switch off my independence instinct for anything beyond a short-term, provisional basis.
My corporate career was notably lackluster, partly because I bristled at being ordered around. Nor did I naturally aspire to becoming the boss—the core motivator of most underlings. I have never wanted to take orders…or give them.
Oh, and I’m over the age of fifty and single. That might suggest that I have some commitment issues.
A final note on the latchkey kid. The term itself seems to be a retroactive one. Although my research tells me that it existed in the 1970s and 1980s, I don’t recall ever hearing it in those years, either in the media or in daily conversation. Nor do I ever recall anyone describing himself as “a latchkey kid”. The definition and analysis of the phenomenon have mostly come later. Back then, being a latchkey kid was simply what a lot of us did.
When I was a kid in the mid-1970s, my dad used to sing this song from the radio. The refrain went:
“Sundown, you’d better take care
If I find you’ve been creepin’ round my back stair.”
This was Gordon Lightfoot’s hit song, “Sundown”, of course. In the year the song climbed the charts, 1974, I was but six years old. I therefore didn’t grasp its meaning. But the song still brings back memories of that time.
And now that I’m old enough to understand “Sundown”, I find it an unusual take on the familiar romantic love triangle: that of the cuckolded male.
Fast-forward to 1986. My high school English teacher, wanting to demonstrate how stories could be told in poems and song lyrics, played “The Wreck of the Edmund Fitzgerald” for us on one of the AV department’s record players. Yet another of Gordon Lightfoot’s songs.
I immediately connected with this song, even though I was unaware of the historical reference behind it. My teacher told our class about the November 1975 shipwreck of the Edmund Fitzgerald in Lake Superior. That gave the song even more weight. It was a work of imagination and art…but also something real.
“The Wreck of the Edmund Fitzgerald” was released in 1976, to commemorate the shipwreck of the previous year. It remains one of my favorite songs from a musical era that I was too young to appreciate as it was taking place.
Last November marked the 47th anniversary of the wreck of the Edmund Fitzgerald. This got me thinking about the song, and about Gordon Lightfoot. According to Google, Lightfoot was still touring in his eighties.
But all tours, and all lives, must come to an end. Gordon Lightfoot passed away on May 1, of natural causes.
While Lightfoot and his music were a little before my time, I always appreciated his work. There are few songs quite as haunting and memorable as “The Wreck of the Edmund Fitzgerald”. And whenever I hear “Sundown”, I always hear my dad singing along with the radio in the mid-1970s.
A brilliant musician, and an artistic life well-lived. Gordon Lightfoot, 84, RIP.
Paramount+ is airing a reboot of the 1987 psychological thriller Fatal Attraction. The reimagined version is not another movie, but a series.
The 1980s was an era when a new movie could still fill cinemas, whereas the 2020s is the era of streaming. Series, moreover, are more profitable, and they are what 2020s audiences seem to prefer. I therefore understand the decision to go with a series instead of a new film.
What I don’t understand is the logic behind making this a “reboot” at all. The announced connection to the past would seem to serve no purpose here; and I’m a guy with a notable attachment to the past.
The original 1987 film, Fatal Attraction, starred Michael Douglas as a married, white-collar family man who has an affair with a single woman (Glenn Close). The affair leads to a romantic obsession. Turning the usual dynamic of the obsessive male on its head, Fatal Attraction made the female character the one who couldn’t take “no” for an answer.
This is an interesting twist, and one with endless possibilities. It is also one that audiences might be receptive to in 2023, as an analog to the #MeToo paroxysms of 2018 and the Trump era.
“Issues” movies can work when they contradict the usual narrative in this way. In 1994, Michael Crichton’s novel, Disclosure, flipped the expected gender roles with a tale about a female boss who sexually harasses her male subordinate. This was made into a movie the same year, starring Michael Douglas (yes, again) and Demi Moore.
Whereas Fatal Attraction was genuinely creepy, Disclosure strained credulity at times. But Disclosure came at a time when we were having a national conversation about sexual harassment in the workplace. The Clarence Thomas–Anita Hill hearings had taken place in the then-recent fall of 1991. Sexual harassment avoidance training, implicitly targeted at male sexual aggression, was suddenly a part of new-employee orientation programs at every organization with a sizable workforce.
The dynamics of unwanted sexual and romantic attention are, in short, a timeless theme. With both Fatal Attraction and Disclosure now decades behind us, it makes sense to revisit the idea with what the Wall Street Journal calls “a modern spin”. Much has changed since 1987, after all. What doesn’t make so much sense is to create an explicit link to the 1987 original.
Twenty-first-century reboots of twentieth-century films and television shows almost never go well. This happens partly because contemporary filmmakers are either unwilling or unable to be faithful to the source material.
The rebooted MacGyver (2016) was a mess that has since been cancelled. The rebooted Magnum P.I. (2018) is not a bad show; but it shares little more than a name and a Hawaiian setting with the 1980s original.
In 2016, Hollywood decided to reimagine the 1984 comedic paranormal film Ghostbusters. In keeping with twenty-first-century sensibilities, Hollywood changed the genders of the original characters, presenting an all-female cast. This provoked a wave of (admittedly overblown and ridiculous) online outrage, and charges of “wokeness”. Middle-age fans complained that filmmaker Paul Feig had ruined their childhoods.
But here’s the thing: Paul Feig could have made a fun paranormal film about ghost hunters without calling it Ghostbusters. Feig might even have created a whole new franchise for a new century. He might even have done that with an all-female cast, if such was his desire. Instead he chose to ride the wave of a 32-year-old movie, and the 2016 reboot of Ghostbusters performed poorly in the marketplace. Seven years later, the film is now mostly remembered for the controversy that surrounded it.
The new Fatal Attraction series is clearly set in the present time. There is no attempt at a nostalgia factor here. (Which would be weird, anyway, given the nature of the source material.) And after 36 years, audiences weren’t exactly on the edge of their seats waiting for a sequel to the 1987 movie.
Why, then, the insistence on creating a connection to a cinematic relic from the Reagan era? This will almost surely disappoint older viewers who do want nostalgia. Younger audiences, meanwhile, won’t get the reference.
I’m a nostalgic GenXer with an enduring attachment to the pop culture of the 1980s. I’m always ready, moreover, for yet another Def Leppard album, even though I’ve been buying them since 1983.
But I don’t need to see endless remakes of movies and TV shows that I watched as a kid, teenager, and young adult. Where stories are concerned, I want Hollywood to give me something new…even if a particular tale revisits a timeless theme, like romantic obsession gone awry.
Former politician and talk show host Jerry Springer has died.
Most people know Springer for his gonzo talk show work on national television. Decades before that, he was a well-known figure in Cincinnati politics and local broadcasting.
Springer spoke at my Cincinnati-area high school in 1985. At that time, the biggest skeleton in Springer’s closet was a 1974 scandal in which Springer, then a Cincinnati City Council member, paid a sex worker with a personal check. Springer resigned from city council in a certain degree of disgrace.
Several of my male classmates couldn’t resist calling out, “Where’s the check?” while Springer was speaking at our school in 1985. Springer, a good sport, laughed off their taunts and moved on.
Jerry Springer was never one to be impeded by other people’s opinions of him. I recognized that in 1985.
After the Jerry Springer talk show debuted in 1991, I tuned in a few times. In all honesty, the show was never for me. But I didn’t watch much network television of any kind during the early 1990s. I was too busy, and my life too disjointed.
I’ll always remember the local, Cincinnati version of Jerry Springer, anyway. The speaker at my high school who wasn’t about to be deterred by an embarrassing incident from his past, or others’ ungracious insistence on calling attention to it.
Perhaps there is a lesson for all of us here. One can go far, despite being hampered by very human flaws and a less than perfect track record. The trick is to shrug off the crowd’s disdain, and keep moving forward.
Although the connection between this specific day and the historical crucifixion of Jesus is arbitrary, today is the day when Christians around the world celebrate the Resurrection of Jesus. Easter—not Christmas—is the most important holiday in the Christian calendar. (See? I learned something from twelve years of Catholic school.)
Many Christians struggle to connect Biblical themes with modern realities. If you’re among that crowd, you’re not alone. Although raised in the Catholic faith, I struggle not just with faith itself, but with how to best practice it in the context of those aforementioned modern realities. I’m not going to hold up myself as an example of Christian virtue. I miss the mark almost every day.
Not everyone is Christian, of course. But not everyone is Jewish, Muslim, or Buddhist, either. We live in a world in which “none of the above” is the fastest-growing religious affiliation.
At the same time, though, atheism, as expressed by Richard Dawkins and Sam Harris roughly a decade ago, has failed to supplant conventional religion. Atheism is “no longer cool”. Nor does atheism provide a hopeful vision for fragile human beings who must ultimately confront death: their own, and the deaths of the people they love.
The West, at least, is in the grip of a spiritual identity crisis.
But back to Easter. If you can think of Easter in its conventional, religious context, I encourage you to do that today.
But even if you can’t, today might be a good day to reflect on the concepts of spiritual renewal and resurrection in a more general sense. Maybe it would also be a good idea to consider the question: What do I believe? You don’t owe me an answer, but you do owe yourself one.
In those days before a zillion cable channels (let alone the Internet), there was TV Guide.
Launched in 1953, these little weekly magazines would be familiar to anyone from the Baby Boom generation or Generation X. (Some of the older Millennials may have dim early childhood memories of TV Guide, too.)
Each issue of TV Guide contained a listing of the week’s programming, of course. There were also articles in the front of the magazine that were sometimes worth reading. (If you were interested in television and Hollywood happenings, that was.)
The covers, moreover, were often minor works of art. Like this one from 1986, which depicts the cast of Cheers, one of the most popular shows of the 1980s.
TV Guide was always on my mother’s shopping list. It was on everyone’s shopping list. Why? Because without this publication, you would have a hard time knowing what programs were on, on which channels, and at what times.
The magazine was cheaply priced. (The 60¢ May 10, 1986 issue shown above would equate to only about $1.65 in today’s dollars.) But TV Guide was nevertheless essential.
With a shelf life of only one week, these weren’t magazines that anyone saved for posterity. Sometimes, though, one of them would end up beneath a sofa or behind a recliner, only to turn up months later.
Needless to say, no one prints, purchases, or needs TV Guide anymore. Not in this era of cable, Hulu, Netflix and YouTube.
Yes, another casualty of our digital age of hyper-abundance. TV Guide’s original mission has become not just obsolete, but impossible, even if someone wanted to revive it.
It would not be incorrect to say that TV Guide is a relic of pre-Internet times; but that description would be insufficiently precise. TV Guide is a relic of a time when the scope of a single week’s available programming was small enough to be completely curated, listed, and described in one publication. Those days are gone; and—barring some cataclysmic change that restarts everything from scratch—they are gone forever.
Pepsi has raised the prices of its soft drinks by more than 15% in recent months. A 12-pack of any of the company’s chemical-infused, acidic canned liquids now runs around seven dollars in the Cincinnati area. Coca-Cola products are priced at a similarly extortionate level.
We’ve been trained to crave sodas for at least three generations. My grandfather was a fan of Coca-Cola. He was one of those World War II servicemen to whom Coca-Cola aggressively marketed its products. He was never without his supply.
World War II-era Coca-Cola ads
My grandfather was congenitally opposed to any form of diet cola, though. He drank only the original formula, with real sugar. But then, a Coca-Cola in his day was a rare treat, something to consume after hard hours of labor. In that context, the sugar boost was a feature, not a bug.
Subsequent generations started drinking sodas to fulfill their basic hydration needs, and that led to a demand for diet colas. One of the first of these was Coca-Cola’s Tab. Marketed mostly to women, Tab was the forerunner to Diet Coke.
1982 Tab ad
My mother drank Tab. Back in the 1970s and early 1980s, she always had a carton of Tab on the floor of our kitchen pantry. Tab had a heavy saccharine taste, but it was—in my opinion, at least—vastly superior to Diet Coke, which Coca-Cola debuted in 1982.
Today I scratched another town off my Indiana bucket list: Madison, located in the southernmost portion of the Hoosier State, along the Ohio River in Jefferson County.
Madison is less than two hours from the east side of Cincinnati, so the drive was not arduous. I went with my dad, who is a native Hoosier from southern Indiana. He had many anecdotes about how much the area had changed since the 1960s. Since I was not born until 1968 myself, I will have to take his word for it.
A view from Madison into Kentucky
The charm of Madison, though, is that much of the town’s original 19th century architecture has been preserved. Throughout Madison’s central historic district, you’ll find baroque Victorian mansions and narrow brick row houses that will make you think you’ve just dropped back into the 1800s.
Bare minimum Monday is the latest thing on the Internet—especially TikTok, that wellspring of youthful oversharing.
Bare minimum Monday means what it sounds like: doing the bare minimum at work (especially office jobs) on Mondays.
Slacking on the job at certain times of the week is nothing new, of course. And it isn’t limited to Gen Z white-collar workers. During the 1970s and 1980s, the prevailing wisdom was that you didn’t want to purchase a UAW-made automobile that rolled off the assembly line on Monday or Friday.
"Bare minimum Monday" is the latest workplace trend to sweep TikTok. But what exactly is it?
But Generation Z seems to be putting its own spin on the concept, to the cheerleading of the mainstream media. CNN gushes that younger workers are using “’bare minimum Monday’ as a form of self-care”.
So goldbricking has now become yet another version of seeking safe spaces and avoiding microaggressions. Just what the younger generation needed: yet another reason for older folks (who still do most of the hiring) to perceive them as effete, fragile, and incompetent.
Of course, there has never been a shortage of 40- and 50-somethings who believe that the younger generation is leading the world straight to perdition. I’m from the original “slacker” generation: Generation X. When I joined the so-called “adult world” as a newly minted college graduate in 1991, I endured the subtle jabs of older colleagues and bosses who quipped that “young people nowadays just don’t know how to put in a full day’s work”. And that was more than 30 years ago.
I am at that age when many people have lost their mothers. This is a painful blow at any stage of life. I feel blessed, though, to have had a kind and loving mother. Not everyone is so fortunate.
But still, we miss our moms. So does Paul McCartney, whose mother died of cancer in 1956, when the future Beatle (who turned eighty last year) was only fourteen years old.
McCartney wrote the lyrics to the 1970 song “Let It Be”. Listen closely, and you’ll find that they are somewhat ambiguous.
“When I find myself in times of trouble
Mother Mary comes to me
Speaking words of wisdom
Let it be
And in my hour of darkness
She is standing right in front of me
Speaking words of wisdom
Let it be”
Having been raised Roman Catholic, I had always assumed that the song’s “Mother Mary” was a reference to the Mary of the New Testament. Mary has a prominent role in Catholic worship and theology, after all. Paul McCartney, as it turns out, was also baptized in the Catholic faith.
“Let It Be”, though, is really a song about Mary McCartney, Paul McCartney’s late mother. He wrote the song after having an intense and emotionally reassuring dream about her in 1968. In the dream, McCartney felt his mother’s presence.
Was McCartney’s mother really with him, in some sense? Or was that his subconscious at work? Questions like that are above this writer’s pay grade. I’ll leave the answer up to the reader.
Likewise, McCartney has told interviewers that listeners who prefer to interpret “Let It Be” in a religious sense are free to do so.
That’s what happened 37 years ago today (February 16, 1986), when Mr. Mom was shown on network television for the first time.
The movie was originally released in cinemas in 1983. This was a time when a.) many Baby Boomer women were becoming working moms, and b.) male employment had been battered by the recession of the early 1980s. Both themes are present in the movie.