Boy George: the non-controversy of the 1980s

Sometimes the past provides us with a lens for better understanding the present.

In the spring of 2023, Anheuser-Busch launched an online marketing campaign that featured Dylan Mulvaney, a transgender social media influencer, as a spokesperson for Bud Light.

This resulted in a backlash and a boycott, with real financial consequences for the company.

But the backlash was predictable. The Bud Light/Dylan Mulvaney campaign did not take place in a vacuum, after all.

In recent years, many corporations have placed biological men in spaces allocated for women. Sports Illustrated has selected multiple transgender (biological male) models for its annual swimsuit issue.

Even Playboy has gotten into the act, thrusting female-presenting, biological male models before its heterosexual male readership.

The idea here seems to be that if you show heterosexual men enough transgender women, eventually they’ll start seeing them as indistinguishable from biological women.

This follows the twenty-first-century pattern of blunt-force culture warfare, something I’ll return to shortly.

But let’s get back to the Bud Light debacle.

It seemed to me that all sides were losers here. Anheuser-Busch was certainly a loser. Bud Light sales tanked, as Bud Light drinkers turned to other beers. The company’s stock value declined, too.

Alissa Heinerscheid, the Anheuser-Busch marketing vice president who had championed the Dylan Mulvaney campaign, was forced to take a “leave of absence”. That’s code in the corporate world for “fired”.

Did LGBTQ people benefit from this? Not really. Anheuser-Busch had just made them cannon fodder in its efforts to promote one of its products.

But our topic here is the 1980s, so I’m going to discuss some gender-bending controversies from the 1980s.

Or rather, gender-bending non-controversies. As it turns out, the 1980s were sometimes gay, and sometimes gender-fluid, too. But in ways that weren’t as deliberately confrontational as what you see in the present.

In 1982, I was 14 years old. My parents had just sprung for a basic cable television package, and it included MTV, then a brand new channel.

MTV played nonstop pop and rock videos. I was an immediate fan.

MTV introduced me to lots of new musical acts. I would subsequently buy the albums of some of them, which was exactly what the corporate minds behind MTV had intended.

On the American popular music scene, the early 1980s was the era of the Second British Invasion. Everyone who was big in youth music during that time seemed to speak with a British accent. MTV greatly facilitated this influx of British pop and rock acts.

One of these was a group called Culture Club. The lead singer of Culture Club, Boy George, appeared to be female. Boy George wore makeup and baggy feminine attire. He wore his hair long and in braids, in a distinctly feminine style.

Boy George’s mannerisms were feminine, too. He didn’t sing in a high-pitched falsetto, but his singing voice was high enough to pass for that of a woman.

Some time elapsed before I even realized that Boy George was not a woman. Sure, I sensed that there was something about the female-presenting singer that was atypical. But I was initially fooled.

Boy George is an extreme example, but he wasn’t the only popular musician in the 1980s to tinker with notions of gender norms. There was a whole subgenre of rock music called “glam rock”, in which male musicians took on deliberately androgynous appearances. This started with David Bowie in the 1970s. By the 1980s, groups like Motley Crue and Ratt were wearing makeup and quasi-feminine hairstyles. Women got into the act, too. Annie Lennox of the Eurythmics wore short hair and masculine business suits.

Culture Club, featuring the gender-fluid Boy George, was enormously popular in the early 1980s. The group’s breakout song, “Do You Really Want to Hurt Me,” was released in the fall of 1982 and went on to reach the number two position on the US charts.

The term “transgender” wasn’t common in the early 1980s, but that’s what Boy George was. He was a man who presented as a woman in public. At the very least, he was a drag queen.

Suddenly, the gender-fluid Boy George was in front of millions of impressionable young people, every time they turned on MTV. And practically all adolescents and teens watched MTV in the 1980s.

If we cast the 1980s in terms of the present, the next logical question might be: when did the backlash start? Where were the alarmed parents, taking to microphones in town hall meetings throughout the country? Where were the calls to boycott cable companies that included MTV in their basic packages?

We didn’t have the Internet in the 1980s, of course. But we did have CNN, network television, newspapers, and radio. There were certainly political and social movements that went “viral” during this period, like the Nuclear Freeze campaign, or the Save the Whales Movement.

But here’s the thing: there was no backlash against all of this gender fluidity on MTV and throughout popular music. Many adults were aware of Boy George. He was too large a cultural phenomenon to escape their notice.

In 1984, People magazine ran a cover article about Boy George, with the words, “kids are getting his message”. Yet the adult authority figures of 1984 were notably unconcerned. Boy George did not become a flashpoint in a 1980s version of the culture war.

Was this because the 1980s were more liberal? Hardly. Keep in mind that the 1980s are remembered for their conservatism, and not without reason. Politically, this was an era that belonged to Ronald Reagan in the United States, and Margaret Thatcher in the UK. In many of our institutions, members of the World War II generation still occupied positions of leadership.

A group that called itself the Moral Majority was very active, too. This organization, led by evangelical preacher Jerry Falwell, took very public positions on matters of sex and propriety. For example, the Moral Majority constantly campaigned against convenience stores that sold magazines like Playboy and Penthouse. Hotel chains that offered pay-per-view pornographic movies in guest rooms were put on boycott lists.

And yet, the Moral Majority and its various imitators didn’t care about Boy George, the gender-fluid, female-presenting lead singer of Culture Club.

I don’t ever recall hearing an adult fulminate against this man who dressed up in female attire to sing for young people. Not even once.

But the tolerance of Boy George’s gender fluidity went beyond the adults. What about that most maligned of demographics: young, heterosexual males? You may have been told that the 1980s was an era of “toxic masculinity”. Not in regard to Culture Club, at least, it wasn’t. As a teenage boy in 1982 or 1983, your decision to listen to Culture Club—or to ignore them—wasn’t seen as a statement on your masculinity. 

Boy George held back nothing, and still, no one on the right cared. In 1983, he was asked in a television interview whether he preferred men or women. He replied, “Oh, both.” In a 1985 interview with Barbara Walters, Boy George elaborated further, stating that he had had both male and female lovers.

And still, there was no call to shield American children from this bisexual, gender-bending singer who made absolutely no bones about who he was and what he was about.

But why were conservatives so blithely tolerant of Boy George and all this public gender fluidity, in the most conservative era in recent history?

Because other things were different in the 1980s, too. Tolerance went both ways. No one on the right insisted on making Boy George and gender fluidity points of confrontation, because no one on the left did, either. Nor did the people who ran our schools, media outlets, and Fortune 500 companies.

I was in high school during the first half of the 1980s. My teachers were certainly aware of Boy George. Yet none of them suggested in class that maybe some of us should change our gender identification because this famous singer appeared to be doing so. Teachers in American grade schools weren’t making such arguments, either.

Nor would a marketing executive at Anheuser-Busch in 1983 have been foolish enough to troll the company’s core demographic by making Boy George a spokesperson for Bud Light. Bud Light commercials of the 1980s were designed to appeal to the beer’s mostly male, mostly blue-collar customer base. Bud Light ads of that era featured the humorous canine character Spuds MacKenzie, and real women in swimsuits. Some Bud Light marketing campaigns even made use of both in the same ads.

Nor did Boy George—or any of his fans—demand that we pretend Boy George was an actual woman, just because he presented as female in public. In fact, Boy George—who is still around—has publicly taken issue with the contemporary pronoun police.

Back in the 1980s, Boy George wanted to do his thing; and his thing was flamboyant, gender-bending, and bisexual. He didn’t demand that you change your ideas of gender and sexuality in order to accommodate his ideas or his choices.

The main strategy in the culture wars of the twenty-first century seems to be not persuasion, but staking out positions that are practically guaranteed to be inflammatory, then daring the other side to knock a chip off one’s shoulder.

As someone old enough to remember the 1980s, I can report that in regard to most matters, people were a lot more laid back and tolerant then. There was an acceptance of diversity, but it was also understood that diversity went both ways. Boy George represented one kind of diversity. As did the predominantly heterosexual, rough-edged culture of the typical Bud Light drinker. What Alissa Heinerscheid, the now-fired marketing VP at Anheuser-Busch, dismissively called “fratty” culture.

This sense of moderation on all sides was why Boy George never appeared in a Bud Light ad in the 1980s, and why he never incurred the public disdain of Bud Light drinkers. Even as many Bud Light drinkers happily sang along with “Karma Chameleon” when that song came on the radio, as it so often did.

Mother’s Day in crazy times

I grew up in the 1970s and 1980s. It would be easy for me to say that those were simpler times; and they were.

I’m also an unrepentant nostalgic and traditionalist, so it should therefore surprise no one that I have a lot of respect for mothers and Mother’s Day.

But not everyone agrees.

Just last week, officials at Toronto’s Kew Beach Junior Public School removed an innocuous Mother’s Day message posted to a marquee outside the school. Why? Someone complained that the message wasn’t sufficiently “inclusive”.

The message was this:

“Life does not come with a manual, it comes with a mom.”

Now, there might be a point here (though not the point the complainer was trying to make). I understand that not all children have mothers. Mothers die in childbirth. Mothers are absent because of addiction, divorce, and other unfortunate circumstances.

That wasn’t the issue here, though. The source of ire up in Canada was that the sign “didn’t take into account kids from different kinds of families like LGBTQ families”.

Two dads, in other words. Or a polyamorous parenting ensemble consisting of multiple adults of various gender identities and orientations.

It will be only a matter of time (mark my words on this one) before some school district or especially woke city council declares Mother’s Day itself to be insufficiently inclusive, and seeks to cancel it. Look at what just happened at Kew Beach Junior Public School. Someone complained, and the people in charge caved, rather than bear an accusation of thoughtcrime.

There’s a lesson here: When we listen to crazy people, we get crazy times. When we cringe before ideological extremists, their values become our reality.

Mother’s Day is Mother’s Day. It doesn’t include everyone. It doesn’t include me, for example, because I’m a biological male and I’m not a parent.

And you know what? I’m perfectly fine with that.

Happy Mother’s Day, 2023!

Ebanie Bridges, OnlyFans, and the sorry state of manhood

I don’t understand the things that people hype nowadays. Or the things they spend good money on. Maybe I’m out of touch. Or maybe the rest of the world has gone crazy.

This tale begins with Ebanie Bridges, a 36-year-old professional boxer.

Now, before you ask, I have nothing against female boxers, or female athletes in general. Not that I’m a spectator of them, mind you. But then, I don’t watch men’s sports, either. (Hint: if you’re watching more athletics than you’re participating in, you’re probably in danger of becoming a couch potato.)

The aforementioned Ebanie Bridges was recently paid £250,000 just to start an OnlyFans account. That’s a lot of money, sure. But given the number of men plunking down cash on that autoporning site in recent years, why not?

For a mere $12 per month, the desperate, sex-starved male can now view shots of Ms. Bridges in lingerie, overflowing with tattoos and cleavage. Hoo-hah. Grab your willies, guys, your computer mouses, and your credit cards.

But that’s not all. It gets even worse. According to an article in The Sun, Bridges regularly receives “odd requests from ‘paypigs’” who ask her for “gnarly things, such as her dirty socks and bathwater.”

The sad part: I have no doubt that men really are requesting such items, and paying good money for them.

I’ve read those reports of testosterone declining. The average 20-year-old man is much less manly than his grandfather was at 40, or even 50. But have millions of red-blooded men now been reduced to…OnlyFans paypigs? Apparently so.

For most of my life, I didn’t consider myself an “alpha male” in the traditional sense of that term. But such yardsticks are relative. So many men have now lowered the bar to such a degree that even I have reached alpha male status by default.

Those pathetic shells of men who comprise the subscriber base of OnlyFans…they who plunk down their cash not for sex, even, but for onanistic pleasures on a computer screen.

Oh, and dirty socks. Those, too.

The coming AI fiction glut?

Of all the things overhyped on the Internet at present, so-called AI (artificial intelligence) ranks near the top of the list. (Right after whatever Taylor Swift and the Duke and Duchess of Sussex happen to be doing at the moment.)

Perhaps you’re a techno-utopian and you’re already annoyed with me for being a wet blanket. This technology is freaking amazing, you say. Before you send me an email accusing me of Luddism: I’m not against all AI, as a blanket policy. But much of the AI marketed at consumers is onanistic, yet another solution in search of a problem. AI is not a monolith. There are a few gems, but a lot of coal, too. 

For example: the AI in Photoshop that allows me to create a layer mask is clearly worthwhile and incredibly useful. Photoshop is a program of pure genius, one that enables the artistically inept—like yours truly—to make composites and image collages with only a journeyman’s grasp of the program. (More complicated artistic tasks, like original illustration, require the hand and eye of a professional, of course.)

On the other hand, my last washing machine had an “AI” sensor that was supposed to detect overloads. The sensor malfunctioned, and basically defined every load as an overload (even a load consisting of three medium-sized bathroom towels).

I had to scrap the entire washing machine. When I purchased my next one, I specifically selected a machine without any AI capabilities. It functions perfectly. And since I’ve been using washing machines since the early 1980s, I’m fairly certain that I have the common sense not to overload one, sans AI assistance.

Here’s the point. Sometimes AI is useful, and sometimes it’s like the stuffed birds that briefly appeared on women’s hats in the 19th century, before everyone came to their senses. Washing machines do not need AI. Nor does the phone answering system at your cable company. A menu telling you to press “1” for technical support, “2” for billing, and “3” for sales was always more than adequate. And the number-driven phone menu is early 1990s technology. No one needs an AI voice that sounds sort of like a person, but can’t really do anything extra for you, aside from raising your blood pressure.

In recent months, there has been an endless stream of online hype articles about programs like Sudowrite, ChatGPT, etc. These programs produce walls of text that kinda sorta maybe appear to be stories for a paragraph or two. 

The result has been predictable: a vast tsunami of AI-generated fiction, flooding online magazines and Amazon’s self-publishing platform. Clarkesworld reportedly had to temporarily suspend submissions to deal with the glut. 

The AI fiction glut seems to be most acute at the level of short stories and children’s books, which are usually no more than a few thousand words. If you’re going to try to write a book using AI in the first place, after all, why stretch your attention span any more than is absolutely necessary?

Many of these books and stories seem to arise from bets. A recent Reuters story describes a book written by a man “who bet his wife he could make a book from conception to publication in less than one day.” The result was a 27-page “bedtime story about a pink dolphin that teaches children how to be honest”. Make of that what you will.

This trend is also driven by social media, especially on TikTok and YouTube. Since the advent of Amazon-based indie publishing, there has been no shortage of hustlers and scam artists who are eager to tell the unwary how they can “strike it rich!” with low-content books, and even plagiarized books. Should we be surprised that these same video charlatans have now picked up the baton of AI-written books?

The title of this post is a misnomer, of course. The coming AI fiction glut is not “coming”, it is already here. 

We might have foreseen this. Long before AI, overnight fortunes were made by peddling get-rich-quick schemes and “lose weight without diet or exercise” promises. 

Never mind that such ruses predictably disappoint in the long run. The lure of the quick and the easy has an enduring appeal. 

Make Daylight Saving Time permanent, or ban it altogether?

This past Sunday we all moved our clocks back to Standard Time, thereby ending Daylight Saving Time. 

Spring forward, fall back. You know the drill.

And what a drill it’s become. Daylight Saving Time is yet another practice that has gone completely off the rails in my lifetime.

When I was a kid, Daylight Saving Time ran from late April through late October. For example: In 1981, Daylight Saving Time began on Sunday, April 26, and ran through Sunday, October 25.

In 2022, by contrast, Daylight Saving Time ran from March 13 through November 6. This has been the trend for years now: to make Daylight Saving Time extend for as many weeks as possible.

The fetish for Daylight Saving Time has become so intense that a new proposed law, the Sunshine Protection Act of 2021, would make Daylight Saving Time permanent. The bill has bipartisan support. Our two major political parties have finally found something they can agree on, and—big surprise—it isn’t anything that is particularly useful.

Participation trophies and organic chemistry

Maitland Jones Jr., an award-winning professor at NYU, was fired after a group of his students signed a petition alleging that his organic chemistry course was “too hard”. 

I should begin with the usual disclaimer: I don’t know Maitland Jones, or the students who signed the petition. I never took his organic chemistry course. But that doesn’t mean I’m completely unfamiliar with the broader questions here.

In the academic year of 1987 to 1988, I took three semesters of organic chemistry at the University of Cincinnati. The reader might reasonably ask why I did this to myself. 

During the previous summer, I had taken an intensive Biology 101 course comprising three parts: botany, zoology, and genetics.

I got A’s in all three sections of Biology 101. Botany and zoology were easy for me because I have always been good at memorizing large amounts of information that has no logical connections. (I’m good at foreign languages, for much the same reason.) I struggled a bit with the genetics portion of Biology 101, which requires more math-like problem-solving skills. But I still managed to pull off an A. 

I was 19 years old at the time. With the typical logic of a 19-year-old, I concluded that I should go to medical school. I changed my undergrad major to premed, and began taking the math and science courses that comprised that academic track. 

That’s how I crossed paths with organic chemistry. Organic chemistry was nothing like the Biology 101 course I had taken over the summer session. Biology 101 was aimed at more or less the entire student body. (I initially took it to satisfy my general studies science course requirement.) Organic chemistry was aimed at future heart surgeons and chemical engineers. Organic chemistry was the most difficult academic course I have ever taken, or attempted to take.

Organic chemistry is difficult because it requires the ability to memorize lots of information, as well as the ability to apply that information in the solution of complex problems. Organic chemistry is, in short, the ideal weed-out course for future heart surgeons and chemical engineers. 

How did I do in organic chemistry? Not very well. I managed two gentlemanly Cs, and I dropped out the third semester. 

My dropping out would have been no surprise to my professor. Nor was I alone. Plenty of other students dropped out, too.

Early in the course, I remember the professor saying, “Not everyone is cut out to be a doctor or a chemist. Organic chemistry is a course that lets you know if you’re capable of being a doctor or a chemist.”

That was 1987, long before the participation trophy, and back when a snowflake was nothing but a meteorological phenomenon. My experience with organic chemistry was harrowing, so far as “harrowing” can be used to describe the life of a college student. But in those days, disappointments, setbacks, and the occasional outright failure were considered to be ordinary aspects of the growing up experience. My organic chemistry professor did not care about my feelings or my self-esteem. He only cared if I could master the intricacies of stereochemistry, alkenes, and resonance.

The good news is that I was able to quickly identify a career that I would probably not be good at. Even more importantly, you, the reader, will never look up from an operating table, to see me standing over you with a scalpel.

If we have now reached the point where students can vote their professor out of a job because a course is too hard, then we’ve passed yet another Rubicon of surrender to the cult of feel-good political correctness. 

A decade ago, many of us laughed at the concept of the participation trophy. But at the same time, many of us said: “What’s the big deal?”

The big deal is that small gestures, small surrenders, have larger downstream consequences. A participation trophy is “no big deal” on an elementary school soccer field. At medical school, participation trophies can endanger lives, by enabling the less competent to attain degrees and certifications which they would never have acquired in saner times. 

Are you planning on getting heart surgery down the road? You might want to get it now, before the present generation of premeds and medical students becomes the next generation of doctors. 

Should you major in the liberal arts?

One day in August of 1986, I sat down before the desk of a Northern Kentucky University (NKU) guidance counselor to discuss my class schedule for the upcoming fall semester. I was an incoming freshman. 

The guidance counselor began by asking me what I intended to major in during my four years at NKU. I told her either history or English literature.

She told me, in so many words, that I was an idiot.

“So what you’re saying,” she said, “is that you want to pay to read books. You can read books on your own time, for free.”

Needless to say, I was taken aback. This was several decades before it became fashionable to refer to callow young adults as “snowflakes”. Nevertheless, I was used to receiving almost unconditional encouragement from the adults in my midst. The ink on my high school diploma was still drying, after all. 

I explained to her that English and history had been my favorite subjects in high school. She told me that that didn’t matter. What mattered was how I was going to prepare myself to earn a living by reading Shakespeare and learning about the Treaty of Ghent.

What are the liberal arts?

I will freely admit that I love the liberal arts. 

What are the liberal arts, exactly? In short, they are subjects of fundamental, as opposed to applied, inquiry. 

Imagine a group of students at one of the great universities of Europe during the Renaissance, say in…the year 1504. How many of them do you think were studying HVAC repair or managerial accounting? No, most of them were studying subjects like philosophy, mathematics, astronomy, and literature. 

The liberal arts, then, don’t necessarily mean subjects for the numerically challenged, like literature and history. The liberal arts are actually divided into four categories: social sciences, natural sciences, mathematics, and humanities. Some of these require a substantial amount of number-crunching.

The problem with the liberal arts

The problem with the liberal arts, of course, is that it’s hard to make a living as a philosopher or a historian. In most cases, a career in the liberal arts means a career in academia. That can be an okay career, but there are only so many universities and so many open slots. 

And you’ll need an advanced degree. As one of my college biology professors once told my Biology 101 class: “A Bachelor of Science in Biology qualifies you to sell shoes.”

Oh, and that was in 1987, when there were still mall shoe stores to give you a job selling shoes. Today, of course, people buy shoes from Amazon—where they buy seemingly everything else. 

What should you major in, then?

Probably something that you see people doing around you, for actual money: nursing, engineering, accounting, and yes—HVAC repair. (The guy who works on my furnace and air conditioner is typically scheduled four weeks in advance.)

This is a bitter pill to swallow for those of us who love books. It’s not that there aren’t books in the applied fields, but they are different kinds of books, and you’ll require a different kind of curiosity to get enthusiastic about them. 

But trust me—it can be done. I finally ended up majoring in economics—yes, another field in the liberal arts! Later on, though, I took some accounting courses at the graduate level. Accounting can be interesting. Really, it can be.

Why should you be so coldly realistic when choosing a major? The answer is simple: money. With rising tuition costs, a college education is an investment. And every investment requires a plan for ROI (return on investment).

If this doesn’t seem immediately obvious to you (it didn’t seem obvious to me in 1986), don’t be too hard on yourself. Our system of higher education is still stuck in the early twentieth century in many ways. 

Before World War II, a university education was largely seen as a “finishing school” for members of the upper classes. Most of them didn’t have to be too practical. They already had plenty of career options. Many went to work in well-established family businesses. 

In the decades after World War II, the significance and the purpose of the university degree gradually changed. On one hand, by the 1960s, people from the middle and working classes were now attending university. That was a good thing. On the other hand, though, the 4-year university degree lost its scarcity value. And since people in the middle and working classes needed to earn money, they had to be practical when choosing a major.

These changes were already well underway in 1986, when I entered college. But I’m not going to lie to you: When I graduated in 1990, it was still possible to get a corporate, professional job if you had some kind of a 4-year degree—even in history or astronomy. 

Yes, the engineering majors were most in demand, but few English lit graduates truly went without. Somebody would hire you. The only downside was, you wouldn’t be using your major. I graduated with a B.A. in Economics, but no employer ever asked me to draw a supply-and-demand curve, or estimate the marginal utility of some activity. (My first job out of college was a low-level sales job for a machine tool distributor.) 

Nowadays, though, it isn’t uncommon to find twentysomethings with BAs in English working as baristas and waiters. I’m not gloating here, or saying that’s good or fair. I am saying that it’s harder than ever to get a decent-paying job with a liberal arts degree.

What about the liberal arts, then?

Here’s another confession: I still love the liberal arts, and I still think they’re vitally important. 

There are some things that every person should know, and most of these come under the rubric of liberal arts. Every person should know how to set up a basic algebraic equation. Every person should know the differences between capitalism and Marxism. Every person should know the basics of the histories of the major world cultures, and the major world religions. 

Oh, and the Treaty of Ghent: Signed on Christmas Eve, 1814, the Treaty of Ghent ended the War of 1812 between the United States of America and Great Britain. If you’re reading these words in one of those two countries, you should probably know about the Treaty of Ghent, too.

To revisit the advice that the NKU guidance counselor gave me over thirty years ago: You can learn about the liberal arts without spending thousands of dollars in university tuition. 

Want to read Shakespeare? His works are in the public domain. If you have an Internet connection, you can read them for free. You don’t have to spend $300 per credit hour to read Shakespeare. 

Likewise, most of the liberal arts have been covered in thousands and thousands of books, written by some very bright people. You can get books for free at your public library. Or you can spend a little money and get them on Amazon. (But you won’t spend even a fraction of what you’d spend to acquire a 4-year degree in one of these subjects.)

The hard truth

I wish the truth were otherwise. I love the idea of study for its own sake. But given the realities of tuition fees and the job market, you probably shouldn’t pay to take college courses in Shakespeare or European history. 

You’d be much better off reading about these subjects on your own time. Go to college for engineering or nursing. 

Or maybe HVAC repair. Remember that HVAC repairman who’s booked four weeks in advance. I doubt that there are many philosophers with four-week waiting lists, and not many poets or playwrights, either.

The Apple Store business model is broken

Here’s what’s wrong…and how Apple can fix it.

This past week I took my 73-year-old father to the Apple Store in the Cincinnati area with the intent of purchasing at least one (and probably two) items. My dad was in the market for a new iPhone and a new laptop.

We arrived twenty minutes before the store opened. A young Apple Store associate entered our information into a tablet while we waited for the doors to open. (Like the government in Logan’s Run, Apple Stores seem to eliminate everyone on staff over the age of thirty. I have never been waited on there by anyone much beyond that age.)

Great! I thought. This is going to be fast! Whiz-bang efficiency!

But I was wrong. It wasn’t fast. 

To make a long story short, we spent 90 minutes waiting around the store. We stood. We paced. We looked at the few items that you can view without the help of a sales associate. (And there aren’t many of those.)

And then, finally, we gave up. We left without buying anything. At the time of our departure, we were told that we would be waited on in…about twenty minutes.

That was probably an optimistic assessment. I think it would have been more like an hour: There were around two dozen other customers waiting around for service, just like us. 

I saw several of them walk out in frustration, too.

Apple: great products, sucky retailing

I am a ten-year member of the Cult of Mac. 

I personally haven’t used anything but Apple products since 2010, when a final malware infection of my Dell PC, loaded with Windows XP, convinced me that enough was enough.  

So I bought an iMac. The rest, as they say, is history. Since then, I’ve owned two iMacs, two MacBooks, four iPods, and three iPhones. 

I’ve become an evangelist for Apple products. I’ve converted not only both my parents, but at least two or three of my friends. 

Apple products really are something special. But boy, those Apple Stores sure do suck.

And I’m not the only one who feels this way.

Widespread complaints

A May 2019 article in the LA Times is entitled, “How the Apple Store has fallen from grace”. Focusing on an Apple Store in Columbus, Ohio, the article could have been written about my recent visit to the Apple Store in Cincinnati: 

Web Smith’s recent experience at his local Apple store in the suburbs of Columbus, Ohio, has been an exercise in frustration.

There was the time he visited the Easton Town Center location to buy a laptop for his 11-year-old daughter and spent almost 20 minutes getting an employee to accept his credit card. In January, Smith was buying a monitor and kept asking store workers to check him out, but they couldn’t because they were Apple “Geniuses” handling tech support and not sales.

“It took me forever to get someone to sell me the product,” said Smith, who runs 2PM Inc., an e-commerce research and consulting firm. “It’s become harder to buy something, even when the place isn’t busy. Buying a product there used to be a revered thing. Now you don’t want to bother with the inconvenience.”

There are many similar stories in the media of late, as well as customer complaints on social media. 

Cult of Mac members still largely love their iMacs, MacBooks, iPhones, iPods, and Apple Watches. But they increasingly dread the next trip to the Apple Store.

So what went wrong? And what needs to be done? 

An obsolete concept of the pre-iPhone era

The first Apple Stores debuted in May 2001—going on twenty years ago. Back then, they showcased only the computers, which had a minuscule market share at the time, compared to PCs made by Dell and Gateway. 

iPods were added in October 2001, but these, too, were specialty products when they debuted. For geeks only. 

The real tipping point was the introduction of the iPhone in 2007, and the subsequent ubiquity of smartphones. 

In 2001, a relatively small percentage of the population owned a Mac desktop or laptop. In 2019, 40% of us own iPhones. The iPhone is a mass-market product. But it’s still being retailed as if it were a specialty item.

And when you visit an Apple Store in 2019, you’ll find that 70% of the traffic to these upscale boutiques is iPhone-related. Many are there for routine password resets. 

This is traffic that was never imagined or accounted for in 2001, when the Apple Store concept was launched.

Zen over function

Apple Stores don’t look like ordinary electronics retail stores. Steve Jobs was a devotee of Eastern Zen practices, and the Apple Store resembles a Japanese bonsai garden. There is an emphasis on minimalism, and lots of blank space.

The downside of that is that you can’t do much to serve yourself, as you could in a Best Buy or a Walmart. 

You basically walk into the store, and an employee puts you into an electronic queue. Then you wait around. 

But you have a very clean, zen setting in which to wait. 

Uncomfortable stores

Speaking of those long waits…

Apple Stores do look nice. But they are not comfortable places to spend an hour waiting for a salesperson. Which is almost inevitable. 

There are few stools, and it’s clear that the stools were selected for their sleekness, not their comfort.

There aren’t any plush bean bags or sofas to sit on. Heavens no! That would detract from the zen.

Inefficient use of staff

Too many Apple Store employees are exclusively dedicated to crowd control—to herding you into virtual line. 

This is because you can’t serve yourself in an Apple Store. Go into a Best Buy, and there are clearly defined areas for looking at computers, at cell phones, at peripherals. There’s a line for service in every Best Buy. A line for returns. 

Normal retail, in other words. 

There are no clearly defined areas within the Apple Store. Customers are all milling about, most of them doing nothing but waiting to be waited on.

Many of these customers are frustrated and growing impatient. They want to know how much longer they’ll have to wait. This means that at any given moment, at least a quarter of the Apple Store employees you see on the floor are directing this vast cattle drive. 

They aren’t selling any products, and they aren’t helping any customers. They’re just managing the virtual line.

That amounts to a big waste of the Apple Store’s manpower—and of the customers’ time.

Decline of staff quality

Apple stores were once staffed by highly knowledgeable sales personnel. That was in the days when the stores only carried computers, and hiring was very selective.

Those days are gone. Now that it’s all about selling a gazillion iPhones, Apple Store employees are no longer specialists. Despite the pretentious name “Genius Bar”, geniuses are in short supply on the sales floor nowadays. You’re going to be served by run-of-the-mill retail sales staff. And their expertise, helpfulness, and attitudes vary greatly.

Not enough stores

There are about a dozen AT&T stores within a twenty-minute drive of my house in suburban Cincinnati.

Guess how many Apple Stores there are…

One. In the Cincinnati area, we are served by a single Apple Store at the Kenwood Towne Centre.

And for those readers in Los Angeles and New York, who maybe think that Cincinnati is a one-horse cow town: There are 2.1 million people in the Greater Cincinnati area. It’s the 29th largest metropolitan area in the United States. 

And we have one Apple Store.

There are only eight Apple Stores in all of Ohio, a state with a total population of 11 million. That means one Apple Store for every 1,375,000 Ohioans.

But it could be worse: There are only three Apple Stores in the entire state of Wisconsin. Kentucky has only one Apple Store.

And even New York State has only twenty-two Apple Stores. AT&T has more retail locations than that just in Cincinnati.

No wonder the stores are packed. I made my aforementioned trip to the Kenwood Towne Centre Apple Store with my dad on a Friday. Granted, Friday is typically a busier retail day than Tuesday, Wednesday, or Thursday. But this was during the middle of October—not exactly a peak shopping season. The back-to-school rush was already over, and the Christmas shopping blitz wouldn’t begin for another six weeks.

And at 9:40 in the morning—twenty minutes before opening time—there was already a crowd outside the Apple Store.

The Apple Store needs to be refocused on function rather than branding

As an Apple employee quoted by the LA Times notes, Apple Stores are “mostly an exercise in branding and no longer do a good job serving mission shoppers”.

The “mission shopper” is the shopper who goes into the store with a specific purchase in mind (versus someone who is still torn between a Mac and a PC, or an iPhone and an Android). 

These are customers who could largely serve themselves. If only that were possible. But due to the philosophy of the Apple Store, there is minimal “clutter” at these boutique shops. In other words, these are retail shops with minimal merchandise on display. 

Apple Stores need to become more like Best Buys: There should be clearly defined areas for looking at each category of merchandise, and clearly defined areas to wait for technical support. 

As I mentioned above, most of the traffic in the Apple Store seems to involve iPhone support. The iPhone customers definitely need their own area of the store. 

This probably means abandoning the whole boutique concept. At present, Apple Stores are small but mostly empty spaces in high-rent locations. That is, again, all very zen and cool-looking. But it doesn’t happen to be a great way to purchase a new MacBook, or to get your iPhone unlocked when you’ve forgotten the passcode.

A broken model in terminal need of repair

The Apple Store might have been a workable retail model in the pre-iPhone era, when Mac devotees really were an exclusive tribe. The Apple geeks of 2001, with their tattoos and soul patches, may have appreciated the gleaming but empty Apple Stores.

But the Apple customer base has changed and expanded since 2001. When you factor in iPhones, Apple is now a mass-market brand. (And Apple now owns 13% of the home computer market.)

Having become a mass-market brand, Apple needs to adopt the more efficient retail practices of one.

That means dropping the boutique pretentiousness that makes Apple Stores great places to photograph, but horrible places to buy stuff. The hoi polloi of 2019 are not the rarefied Apple geeks of 2001.

We don’t want or need a zen experience. We just want to get quickly in and out of the Apple Store with minimal delays, like we can at every other retail shop.