Bryan Adams should not have apologized

Singer Bryan Adams apologizes for social media post blaming ‘bat eating’ people for coronavirus

The music of Canadian rocker Bryan Adams was part of the soundtrack of my 1980s youth. His early single “Cuts Like a Knife” was a hit on MTV during my freshman year of high school. I still enjoy listening to his music from time to time.

Bryan Adams, like many of us, is frustrated by the coronavirus pandemic. COVID-19, which originated in China, has now killed 292,000 people around the world, including 83,000 in the United States.

Republicans, Democrats, and decades-old sexual allegations

Being an Evil Republican™, I know that I am supposed to be giddy about the Tara Reade allegations. In 1993, when Biden was in his fifties and Reade was in her twenties, the senator allegedly shoved his hand up Reade’s skirt, and committed other acts that fall short of rape, but certainly not short of sexual assault.

Here we have the smoking gun, right? The nail in the political coffin of Creepy Joe!

Or do we?

Oliver Stone’s moonbattery

Oliver Stone: ‘Accurate to Say the U.S. Is Weaponizing’ Coronavirus

I’ll always love the movie Platoon (1986), which remains one of my top 10 favorite films of all time. Platoon is a great movie not so much for its portrayal of the Vietnam War, but for what it says about the individual’s struggle to maintain decency in a world where good and evil coexist. (For more on this topic, read my earlier essay about the movie and its meaning.)

Oliver Stone is—or was—a cinematic genius. But as a political pundit, he is immersed in the blame-America-first-last-and-always mindset of the 1960s.

End the shutdown

The shutdown is unsustainable. We need to set a date to end it.

The coronavirus pandemic caught everyone in the United States off-guard in late February and early March. In January, we barely heard about COVID-19. Then in mid-March, a national lockdown/shutdown was imposed. Owing to the American system of federalism, the lockdown was mostly implemented at the state and local levels; but it was national in scope.

Four weeks later, over half of the US economy has simply shut down. Unemployment claims are surging. Washington has already authorized a $2 trillion stimulus bill, which will only add to our record national debt of $23 trillion.

The demons of pandemic

Some Roman Catholic priests have remarked on a surge in unholy spiritual activity in the wake of the global coronavirus pandemic, as Catholic speaker Kevin Wells describes in the article below:

Priests reveal how coronavirus crisis has unleashed ‘intense demonic activity’

I know, I can practically hear some of you cackle with mocking glee. “Bible-thumping Trumpster!” you’re saying, as you tune in to MSNBC. Others are hunting around on their hard drives for Darwin and New Atheist memes to send me.

And I’m rolling my eyes right back at you—just so you know.

Should Trump have acted sooner against COVID-19?

Over the past week, a new narrative has emerged in the mainstream media: President Trump has finally, belatedly gotten serious about combatting the coronavirus, or COVID-19. A few brave correspondents at CNN.com—heretofore the mainstream media headquarters of the Resistance—have penned editorials of cautious praise.

Journalists aren’t the only ones. Minnesota Representative Ilhan Omar, one of Trump’s most implacable archenemies in Congress, openly praised the president’s ‘incredible’ response to the pandemic. New York Governor Andrew Cuomo has had nice things to say about President Trump in recent days. In the face of this unprecedented national crisis, the lion and the lamb are lying down together—albeit at a safe social distance of six feet.

But such newfound respect for the president is by no means unalloyed. The other side of the narrative is that Trump should have known better; he should have acted sooner.

There is evidence, after all, that the intelligence community warned the president about the true dangers of coronavirus back in February—even January. And while President Trump did place early restrictions on travel to and from China, the full mobilization of the American homeland didn’t really get underway until around the Ides of March, give or take a few days.

This brings up an obvious question, the one posed by the title of this piece: Should the president have acted sooner?

Let’s not beat around the bush about the answer: Of course the president should have acted sooner. Most of the rest of us should have acted sooner, too. Speaking of the Ides of March: On Sunday, March 15, I exercised at my fitness center in suburban Cincinnati. I was still half-convinced that I was going to be able to continue working out in a public gym, just like I always have.

But I was wrong. The very next day, Ohio Governor Mike DeWine shut down all bars, restaurants, movie theaters…and health clubs.

I failed to take COVID-19 seriously at first for the same reason that President Trump probably failed to take it seriously. We’ve seen this movie multiple times before, and it has always ended fine for Americans.

No, I’m not talking about the 2011 pandemic film, Contagion. I’m referring to events in the real world. How many times since the beginning of this century have we seen a new flu arise out of some distant corner of the world, only to dissipate before it reaches American shores?

There have been global outbreaks of H1N1, avian flu, and SARS. None of them seriously impacted daily life in America.

We all make future predictions based on past events. Why should it have been any different this time?

The experts warned President Trump about COVID-19 in January and February of this year. That seems almost indisputable now. But similar warnings, though more generalized, were out there during the presidencies of Barack Obama and George W. Bush. They did nothing either, so long as they didn’t absolutely have to.

Yes, Bush and Obama were both warned. Over the past fifteen years, I have heard and read multiple warnings from epidemiologists. They repeatedly said that the emergence of a truly global, society-altering pandemic was a question of when, not if.

If I knew that, as a private citizen, then Presidents Bush and Obama also knew. We should have been stockpiling protective masks, ventilators, and hand sanitizer, in the same way that we stockpile petroleum. Imagine how much more prepared we’d be now if we’d started stockpiling in 2006 or 2012.

The coronavirus wasn’t the only existential threat that we might have seen coming. What about sentient human threats, like stateless Islamic terrorism?

At the beginning of this century, the cataclysmic black swan event was 9/11. As most readers will know, Osama bin Laden and al-Qaeda were behind that.

The dangers of Osama bin Laden were known to President Bill Clinton. President Clinton had at least one clear chance to take him out with a missile strike. Clinton didn’t act decisively, though, for fear of the political consequences.

And what of Clinton’s successor, George W. Bush? Bush believed that he was going to be a domestic policy president. Shortly after taking office, Bush deprioritized the work of the CIA’s “sisterhood”—a group of mostly female analysts who were then closing in on the Saudi terrorist.

Less than a year into Bush’s first term, 9/11 occurred. How’s that for lack of foresight?

President Reagan, the hero of my Republican youth, played a pivotal role in bankrupting the Soviet Union with an expensive arms race that a Marxist economy simply couldn’t win. During the 1980s, American aid to the Afghan mujahideen helped turn the Soviet invasion of Afghanistan into the USSR’s Vietnam. That effort not only drove the Soviets out of Afghanistan, it also contributed to the collapse of the USSR itself.

What Reagan didn’t foresee, however, was that a decade later, Afghanistan would become the home base of the Taliban. And one of the Arab mujahideen—that same Osama bin Laden—would eventually stop killing commies and start killing everyone else, most of all Americans.

Oh, and President Reagan also didn’t foresee that after the fall of the USSR, Russia was going to turn into something that is arguably worse. Vladimir Putin’s Russia is preferable to Stalin’s USSR; but Mikhail Gorbachev’s USSR might have been preferable to this new incarnation of czarist Russia.

Reagan’s predecessor, Jimmy Carter, also failed to act when he really needed to. Carter should have recognized by 1977 that the Pahlavi regime in Iran was tottering. When the Shah of Iran visited the White House in November of that year, tear gas marred the state visit, as Iranian students studying in the US clashed with riot police. CIA analysts and State Department officials based in Iran (which was then a US ally) warned Carter that something bad was coming over there.

But Carter ignored the warnings. Or at least he didn’t act decisively on them. Fifty-two American hostages spent more than a year in captivity in Iran. And for forty years now, Iran has been not a US ally, but our most persistent and troublesome foe.

I grew up Catholic during the 1970s. In those days, the administration of John F. Kennedy, America’s sainted Roman Catholic commander in chief, was still very much a part of recent memory. Portraits of the fallen president hung in at least one of my primary school homerooms. We memorized passages of Kennedy’s 1961 inaugural address like we memorized passages of the Catholic catechism. (I can still recite entire paragraphs of it from memory.)

Nevertheless, I can also see where Kennedy failed to heed warnings from his advisors, from history, and from common sense. Kennedy’s Bay of Pigs invasion (1961) was a disaster from the planning stage. Castro’s forces outnumbered the American-backed anti-communist guerrillas by at least 10-to-1.

Kennedy should have known that the Bay of Pigs wasn’t going to be a success. Members of the “deep state”, moreover, advised him not to proceed. But Kennedy went with his gut, and greenlighted the debacle.

The following year, Kennedy narrowly pulled us out of the Cuban Missile Crisis. But every historian will acknowledge that we could have just as easily been incinerated.

Why didn’t Kennedy foresee that the Soviets would put nuclear missiles in Cuba? After all, we had already put nuclear missiles on their doorstep, in Turkey. What the Soviets did was a logical escalation.

What was JFK thinking?

When presidents fail to heed the warnings of advisors and circumstances, the result is often a raft of conspiracy theories. There are Americans who believe that FDR deliberately sacrificed over 2,400 American lives on December 7, 1941, so that the isolationist American public would finally consent to join the war against the Axis powers.

By 1941, after all, FDR had ample evidence that a Japanese attack on Pearl Harbor was imminent. Relations between the United States and the Empire of Japan were already near the breaking point. For years, a final exam question at the Imperial Japanese Naval Academy was, “How would you carry out an attack on Pearl Harbor?”

The Japanese had also tipped their hand with their prior actions. Thirty-seven years before Pearl Harbor, Japan carried out a similar surprise attack on a different enemy. The Russo-Japanese War began in February 1904, when Japanese forces suddenly and without provocation bombarded the Russian naval base at Port Arthur, on the Chinese mainland.

Japan made an official declaration of war three hours later.

Did President Roosevelt knowingly immolate 2,403 Americans on the altar of geopolitics on December 7, 1941? If you believe that, then you essentially believe that FDR was a homicidal sociopath. I don’t believe that.

It’s possible, sure. But the far more likely explanation is that FDR, like so many US presidents before and after him, lacked perfect insight into which dangers required an immediate response, and which could simply be monitored. For no president can respond with urgency to every potential danger.

Hindsight, moreover, is always 20/20. This is as true in our private lives as it is in the fates of nations. ICUs throughout the country are filled with terminal patients whose lifestyle diseases were entirely—or almost entirely—avoidable.

They were informed, ad nauseam, about the dangers of smoking. Their physician warned them to lose weight, to get more exercise. Watch that blood sugar, they were told. Your blood pressure is too high.

They had years to turn their situations around, to avoid disaster. And yet they still wound up in those ICU beds.

Why? They probably weren’t suicidal. But something else was always more urgent—more pressing. Who has time to worry about a heart attack that might strike you ten years in the future, when there is so much that demands your attention right now?

And so it goes with presidents. When you’re President of the United States, you’re constantly bombarded with warnings about short-term and long-term dangers to America. The Chinese are expanding their blue-water navy, with the aim of threatening the American heartland with nukes. Iranian and North Korean hackers are trying to take down our electrical grid. There’s also a new disease in Wuhan, China; you really ought to take a look at that.

On occasion, presidents overreact to a threat. (President Bush’s 2003 invasion of Iraq was a recent, textbook example of such an overreaction.) But most of the time, their mistake is to not recognize a potential threat until it becomes an actual, existential threat.

We can certainly make the case that President Trump fell into that trap in January and February of this year. He should have acted sooner and more decisively in a critical moment. He didn’t. But the same can be said of FDR, Clinton, Carter, Bush, and others.

President Trump is a polarizing figure. This statement doesn’t, in itself, mean that he’s objectively good or bad. It means what you already know: You can’t say his name in a group of people without eliciting strong reactions.

Americans tend to either love him or hate him. If you’re on the left, President Trump is horrible, evil—worse than Hitler, even. Worse than anyone or anything imaginable. Orange Satan.

If you’re on the right, meanwhile, President Trump is the nearly mythical figure of his political rallies (which won’t be resuming anytime soon, thanks to coronavirus). He’s The Art of the Deal, the charismatic host of The Apprentice. He’s the man who is going to Make America Great Again.

Perhaps Trump fits neither of those partisan hyperboles. Perhaps he’s simply yet another American president whose crystal ball was imperfect at a critical moment. And now, as a result, both the president and America find themselves behind the eight ball.

Notice how much less you hear about the upcoming November election these days. Oh yeah, that. We’ll certainly get around to it…provided we can all make it to the polls without having to don hazmat suits.

At the moment, most of us would be happy to simply see an America that is free of coronavirus. Let’s hope that President Trump, and our more conscientious leaders in both parties, get us there soon. There will be plenty of time to play Monday morning quarterback after the present crisis ends.

COVID-19 and the inevitable facts of life

Let me present the following to you…This is a true story…

A previously unknown, deadly virus arises out of a part of the globe that most Americans regard as remote and exotic.

The virus spreads throughout the world…including to the West.

The disease is particularly deadly among certain demographics, and in particular countries.

A Republican US president is accused of dragging his feet in the fight against the disease.

Dr. Anthony Fauci is deeply involved in the fight against the disease.

Tom Hanks becomes a symbol of the disease.

There are dire predictions that the disease will decimate the United States.

The media reports on the disease nonstop.

A number of famous people get the disease, giving the media even more material.

The disease takes thousands of lives. There are genuinely tragic stories…many of them.

Because of the disease, Americans have to change some practices that they never gave much thought to before. New precautions are required.

By and large, though, life goes on, even if it is never quite the same.


Another time, another pandemic

I’m not talking about COVID-19, the novel coronavirus. I’m talking about AIDS and HIV.

I remember how, in the early 1980s, we first began hearing sporadic stories about a “gay cancer”. Then a few years later, we learned that Americans had a new, fatal, viral disease to contend with: AIDS, which was caused by the human immunodeficiency virus (HIV).

The disease spread especially fast among gay men and intravenous drug users. It decimated parts of Africa, in the so-called “AIDS belt”.

A long list of famous people contracted HIV. Some died from AIDS, or AIDS-related complications: Rock Hudson, Liberace, Freddie Mercury, Arthur Ashe. Also Robert Reed—whom Americans of a certain age will remember as the father on The Brady Bunch.

Other famous Americans—Magic Johnson, Greg Louganis, and Charlie Sheen among them—continue to live with HIV.

There were also tragic cases of ordinary HIV-infected Americans, whose situations, for one reason or another, became widely known: Ryan White (1971-1990) was an Indiana teen who acquired HIV from a blood transfusion. Kimberly Bergalis (1968-1991) was a young Florida woman who was infected by her dentist. Both White and Bergalis died from AIDS.

Ronald Reagan—not the current occupant of the White House—was the Republican president whom many believed to be slow to accept the reality of AIDS and HIV.

Dr. Anthony Fauci was prominently involved in the initial fight against AIDS and HIV, just as he is now front and center in the fight against COVID-19.

Tom Hanks became a symbol of the human side of COVID-19, when he and his wife were infected in Australia. History repeats itself here, too: Tom Hanks became a symbol of the human side of AIDS, when he starred in the 1993 movie Philadelphia. This was a film about a gay attorney who contracts AIDS, and then sues his law firm for wrongful termination. Groundbreaking for its time, Philadelphia was one of the first mainstream films to frankly portray the topics of AIDS and homosexuality.

The end of civilization?

Anyone who was alive in the late 1980s and old enough to be aware of the news (I was) will recall the impact of AIDS on the American psyche. By 1985, most Americans believed that in wealthy, Western countries like the United States, deadly communicable diseases were a problem of the distant past. AIDS was therefore a rude awakening.

There were dire predictions about the inevitable course of AIDS in the United States. Look at what happened in Uganda, after all.

I recall a book commercial that aired frequently in 1989. It featured a skull on the TV screen, and the tagline: AIDS: The End of Civilization. (This was also the title of the book, by William Campbell Douglass.)

I was in my early twenties in 1989. I remember hearing at least a few people my age (both men and women) state that they were going to forgo any romantic involvements, for fear of getting AIDS. One young man, I recall, said that he was going to “wait until they straighten this all out.” He never specified who “they” were. Nor did he suggest a timeline for “them” to straighten it all out.

How we adjusted to AIDS

As most readers will know, there is still no definitive cure for AIDS/HIV. Nor is there a vaccine.

Many aspects of American life changed forever as a result of AIDS. I’m not just talking about sex. Our clinical fear of blood and bodily fluids largely didn’t exist before AIDS and HIV.

Many of the changes implemented in response to the AIDS epidemic (including some from the top down) were reasonable and justified. The city of San Francisco closed the infamous bathhouses, which were sources of mass infection for gay men. All donated blood is now tested for HIV, and has been since the 1980s. Health professionals all wear gloves today, whenever they interact with patients, because of the AIDS epidemic (even if AIDS is not a specific fear in each case). I recall going to the doctor prior to AIDS: The doctor usually didn’t bother to put on gloves if he/she was merely examining you.

And, of course, there were changes to sex. During the 1970s (I was still a kid then, but I’ve heard the stories), the widespread availability of the pill combined with looser social mores to create a wild west approach to sex. The 1980s wasn’t the 1950s; but AIDS brought about a much-needed reexamination of the sexual revolution.

But AIDS didn’t bring about the end of civilization, as that 1989 book predicted. You still hear about AIDS on the news, but nothing like you did in the late 1980s, when the disease was new, and on the mind of every journalist.

Many late-1980s reactions to AIDS seem like overreactions today, in retrospect. Consider those young people who were so afraid of AIDS that they temporarily became social and sexual hermits.

Few young people today would even think about forgoing the normal process of meeting someone special, dating, getting married and having children because they might get AIDS in the process. (And most young people of the late 1980s snapped out of that, too, after the initial panic subsided.)

That’s a good thing, because otherwise, there would today be no people under the age of about thirty-five. An overreaction to AIDS might have brought civilization to a standstill, even if the disease itself didn’t wipe us out.

Yet another time, and yet another strange, deadly, viral disease

This brings us to what you’re really concerned about right now: not AIDS, but the novel coronavirus, or COVID-19.

AIDS and COVID-19 are very different, of course. HIV, once contracted, is far deadlier than the coronavirus, but it is far less contagious. You can’t get HIV from casual contact.

But AIDS and COVID-19 are also similar in the public reaction each provoked during its initial heyday.

Many of the early, dire predictions about AIDS turned out to be wrong.

Is it blasphemous to suggest that the same might be true about COVID-19, too?

These are immoderate times. (Just look at the political environment in the United States.) And the reaction to COVID-19 has been correspondingly extreme. What has occurred in recent days in the United States is, in many ways, an outright panic: the hoarding, the incessant media coverage, the constant chatter and rumormongering on Twitter and Facebook.

But most of all: We have witnessed something unprecedented in modern times—and possibly even in pre-modern times—the complete shutdown of American life. (This didn’t occur in response to AIDS.)

Over the past week, our state, local, and national governments have ordered everything to come to a complete standstill.

We’re hearing talk not just of a recession—but of a depression.

Unemployment as high as 20%.

Everyone confined to their homes for weeks, months, or even longer.

Risk, mortality, and the human condition

The shutdown, of course, is intended to eliminate risks. Postmodern Americans are uncomfortable with risks. And we’ve become increasingly uncomfortable with risks in recent decades.

When I was a kid (in the 1970s), Americans still regularly drove without seat belts. My entire childhood, I rode a bike without a helmet. No one even thought about it. Today, you will almost never see a child in an American suburb riding a bicycle without a helmet. And if you try to drive without a seatbelt, your car dings and beeps until you put it on.

Eliminating risks is not a bad thing, of course. I’m all for seat belts and bicycle helmets.

But here’s another fact of life that most Americans don’t want to accept: There is no way to eliminate all the risks of any activity. Risks—of death, disease, accident—are realities that we have to accept, in order to live at all, with or without COVID-19.

Since 1948, at least 30,000 people have died each year in automobile accidents in the United States. Thousands more suffer life-changing injuries. (Is there a single person among us who hasn’t lost a friend or loved one to the modern addiction to cars?)

So why don’t we simply ban them? Declare an automotive quarantine? After all, automobiles aren’t strictly necessary. Human civilization existed for thousands of years without motor vehicles. And we already know, based on seven decades of data, that they will kill 30,000 people in the US alone.

Every single year.

The inevitability of death

In early 2015, I faced a common, though extremely unpleasant, reality: the death of a parent.

The physicians told me, in no uncertain terms, that my mother was going to die within the next 72 hours from heart failure.

They turned out to be right.

I still miss my mother every day, but I also realize that nobody’s mother lives forever.

Come to think of it, neither does anyone else. Death—from accidents, from communicable diseases, from old age—is an inextricable part of the human condition.

We don’t like to think about that, though. Death has become the taboo subject that sex once was.

Any celebrity who is sufficiently famous has probably talked about their sex life for eager journalists. And it’s all on the Internet. We now know, for example, that Will Smith and his wife of many years, Jada Pinkett Smith, make use of sex toys in their marriage. We know that actor Ethan Hawke is a proponent of “open marriage” arrangements.

And of course, we know about the sexual fantasies of Tiger Woods, which—according to one of his former girlfriends—are “not normal”.

That kind of talk barely warrants a raised eyebrow nowadays, in all but the most staid settings. But try raising the subject of death (something we will all face, at some point), and gauge the reactions. State that you believe in an afterlife—or that you don’t. Watch the people around you squirm.

The reality of death, the ubiquity of death, doesn’t jibe with a society that believes every problem or unfulfilled need will be fixed by the next gadget or scientific discovery.

Some Americans even believe in imminent transhumanism—the notion that we will soon be dramatically enhanced from our vulnerable, mortal states by fusions of our bodies with technology. A new utopia of almost unlimited longevity, brought to you by the gods of science.

But it’s been almost forty years since the first cases of AIDS were identified, and we still don’t have a cure for AIDS…or cancer. How about we cure the basic diseases before we think about becoming immortal cyborgs?

Death is real for all of us, our two hundred-plus cable channels and iPhones notwithstanding. Older Americans are going to die today. American adults in their prime, mothers and fathers, are going to die today, too. So are some children.

Should we bring all social and economic activity to a halt in order to avoid COVID-19 deaths? Well, let me ask you a similar question: Why have we continued to drive for the past 70-odd years, with an annual traffic death toll above 30,000?

Pandemics and tradeoffs

But all that talk about death isn’t purely metaphysical. It also relates to the economic concept of tradeoffs. Everything that you do, or don’t do, has an opportunity cost.

To get right to the point here: Twenty-first-century Americans aren’t prepared for post-apocalyptic levels of isolation. You’ve probably seen The Walking Dead. If The Walking Dead were real, the show’s protagonists would have died from starvation long before they were eaten by zombies.

In a world without functioning factories and grocery stores, the zombies would know what to eat. The living people wouldn’t. We twenty-first-century Americans don’t grow our own food. We don’t get water from our own wells, or power from horses in our barns, either.

The facts of modern life don’t support a shutdown

We rely on all kinds of modern institutions, both public and private:

Yes, grocery stores and pharmacies…

But also the factories that make the goods that fill the shelves of grocery stores and pharmacies.

Also, the automobile companies that make the vehicles that transport the employees of those companies to work.

And the insurance companies that cover those employees’ vehicles, and…

The banks that enable those employees to get paid, and…

You see what I mean?

We can’t simply shut all of that down, indefinitely, in order to avoid the COVID-19 virus. If we mitigate the COVID-19 virus but cause mass starvation in the process, then we haven’t really gained much, have we?

Trust the experts?

Yes, we should trust the experts within their fields of expertise. This is no time for Wikipedia virology, or Reddit epidemiologists.

But we should also remember something else: Blindly trusting the experts—including medical experts—can occasionally have dire consequences.

Before COVID-19 made the news, the opioid crisis was our major public health concern.

Opioids have their place in pain management (for example, palliative care for terminal patients). In recent decades, however, doctors and pharmaceutical companies began dispensing these dangerous, highly addictive drugs for the treatment of routine pain.

Trust the experts! That was what you said when your doctor prescribed four weeks of OxyContin for your knee pain. A year later, you’re an opioid addict trying to buy heroin in the bad part of town.

Experts like Anthony Fauci should be telling us what the risks of COVID-19 are. They should not necessarily be deciding what risks we as Americans decide to take. Nor should they decide what we are willing to give up, over the long haul, as tradeoffs.

Trust the government?

Well, to a point.

The government is in panic mode at present. It wants businesses to shutter their operations, and citizens to stay in their homes.

That isn’t sustainable.

It might be better if the government focused on finding ways to keep the economy going while taking preventative measures.

But the “shut it all down” administrative zeitgeist has acquired a life and momentum of its own.

And it isn’t sustainable.

So what should we do?

We’re still early in all of this. It was just last week that various governments ordered everything shut down. Like most of you, I’m abiding by the quarantine procedures. That’s my plan for the immediate future. I’m not suggesting that the time has come for any sort of civil disobedience or protests. For now, our focus should be on working together to contain the spread of COVID-19. Let the doctors, nurses, and public health officials do their jobs.

But what’s the long-term game here? No one in government seems to know.

We’ve been told that the next fifteen days are crucial.

But what if the results of this 15-day quarantine are inconclusive? What if the result falls somewhere in between complete success and complete disaster? What are the criteria for going back to normal (or inching our way back to normal)?

What are we going to do over the long haul?

According to the latest information, a reliable, commercially available COVID-19 vaccine is still a year away.

We can’t shut everything down for a year. Not unless each of us moves to a plot of farmland, and learns to grow our own food.

Conclusions: risks we (may) need to live with

We can’t freeze all economic activity indefinitely. We just can’t.

This means that if the results of this initial quarantine period are inconclusive (as will likely be the case), then we may need to learn to live with COVID-19.

This sounds radical, I know. But you already live with myriad other risks.

You drive to the grocery store (provided the coronavirus hoarders have left something on the shelves), even though you might die in an automobile accident on the way. You might even kill someone else in an automobile accident, for that matter.

We don’t expect young people to take lifetime vows of chastity, even though some of them will acquire STDs, be murdered by abusive spouses, and die in childbirth. (Yes, women still die in childbirth, even in the United States.)

And here’s another point to ponder: At the time of this writing, the death toll from the seasonal flu has already reached 22,000. One hundred forty-four of the dead were children.

I know: COVID-19 isn’t the flu, any more than COVID-19 is AIDS, or automobile accidents.

But we have learned to live with the risks of AIDS, automobile accidents, and the flu. And—yes—we’ve learned to live with the sad but inevitable fact that some of us will die from those things.

Not all of us are going to make it.

That’s sad. But that’s also life on earth.

Sooner or later, we’ll need to restart the world. That world may be a different one, perhaps a more perilous one, but it has to start moving again.

And soon.


Tayla Harris, online anonymity, and trolls

Tayla Harris is a twenty-two-year-old Australian rules footballer. Being an American, I’m not familiar with Australian rules football, but it seems to be a combination of what Americans call football and what we call soccer.

A photo of Harris recently appeared online, which showed her executing an aerial kick, one leg extended.

[Photo: Australian footballer Tayla Harris]

Almost immediately, Harris became the target of predictable sexual and sexist remarks. I say “predictable” because of where the remarks occurred: on social media, and in the comments sections of news websites.

In the real world, most of the time, each of us must be accountable for what we say. You are legally free to say anything to anyone, short of threatening violence. But then you must deal with the consequences: a nasty verbal response, social ostracism, perhaps a physical confrontation.

Not so on social media and the comments sections of websites. A long time ago (no one is really sure why) the Internet developed a fetish for anonymity.

I have never understood this. I came of age in the last decades of the newspaper age. In my teens and twenties, I occasionally wrote letters-to-the-editor. I recall that at least one such letter was published in my local city paper, the Cincinnati Enquirer. While I was a student at the University of Cincinnati, I also placed a few in The News Record, the school’s student newspaper.

For each letter, I was required to sign my name. My name was published along with my opinions. The Cincinnati Enquirer, back in those days, also published the address of the person who wrote the letter.

This is why I roll my eyes when an anonymous and controversial YouTube personality whines about being “doxxed”—which simply means that he or she has to play by the rules of the pre-Internet world. You want to make blatantly racist remarks? You want to claim that Trump voters should be attacked and beaten up at polling stations? Fine. You have your freedom of speech. But there is no constitutional right to anonymity.

But from the beginning, the insular culture of the Internet (it really was a small, self-contained place in the beginning) embraced the conceit that it was “different” or “special”. Enthusiasts of the Internet sought special rules and exemptions, whether that meant that online retailers shouldn’t have to collect sales tax (because Internet), or that music piracy should be tolerated on P2P networks (because Internet), or that everyone should be anonymous (because Internet).

If you live in Iran, Saudi Arabia, or North Korea and you want to express yourself anonymously online, you have my understanding. There is really no reason, though, why anyone living in one of the Western democracies shouldn’t have to be identified with everything they say online.

I understand one common objection: employers can see online comments. Yes, they can. But all those people who signed their names to letters-to-the-editor back in the 1980s and 1990s presumably had jobs. That’s an excuse, not a reason.

And what is the alternative? The alternative is what we currently have. Social media and comments sections have long been cesspools of obscenity and sputtering billingsgate that few people would ever care to read.

This is a shame, because letters-to-the-editor, back when people had to sign them, often added a lot to the newspaper, especially at the local level.

Back to Tayla Harris and her online trolls. Being a male with a male imagination, I don’t find it hard to imagine the sorts of comments that the aforementioned photo attracted—from legions of sexually obsessed fifteen-year-old boys.

And that is, quite often, what you’re dealing with when you encounter a “troll” on the Internet—all those “eggs” on Twitter (a default avatar which the company discontinued in 2017, precisely because it was so synonymous with anonymous trolls). Teenage boys typically have more emotion than common sense, and they have lots of time on their hands.


You could spend hours on Twitter arguing politics with an anonymous fifteen-year-old boy in Belgrade, Serbia. Who wants to spend their life that way? This is one of the many reasons why I avoid Twitter like the plague. Almost everyone who is not a celebrity on Twitter is anonymous. So what’s the point? Who cares?

In reaction to sexist and obscene remarks about the Tayla Harris photo, The Herald Sun recently shut down its comments section. My question is: What took them so long? Websites have been phasing out online comments since at least 2014. “Don’t read the comments” has been a common saying for at least a decade. And when was the last time anyone believed social media to be a valuable source of the common man and woman’s insight and opinion?

It doesn’t have to be this way. But both comments sections and social media will likely remain useless—or worse than useless—until people have to sign their names to what they say. Free speech—always free speech—but with personal accountability.

Once again, this is the way we did it in the pre-Internet days; and back then, commentary from the masses often was worth reading. But today on the Internet, not so much.


The Owl Shot and fragile, fragile youth

There’s a new trend in bars that cater to college students on dates in South Florida. The protocol is laid out on posters, which are displayed prominently throughout the participating drinking and dining establishments:

If a young woman feels “unsafe” or “uncomfortable” on a date, she can order a special drink called an “Owl Shot”. Then, depending on how she responds when the bartender brings her drink, one of the following measures will be taken:

‘Neat’: Bar staff will escort you to your car

‘On the rocks’: Bar staff will call a ride for you

‘With lime’: Bar staff will call the police


This lockdown procedure might be useful in that rare instance in which a young woman finds herself on a date with the twenty-first-century equivalent of Ted Bundy. But if that’s truly the case, then the “Owl Shot” will probably be inadequate. (The posters are right there for men to see, too, after all.)

This is something I’ve written about here before, of course: the wussification of American youth. I don’t necessarily blame the youngsters, mind you: They’ve been bred and raised to be Eloi, food for the Morlocks who lurk just beyond the reach of hovering parents and educators, or—in this case—officious bartenders in South Florida.

Granted, extraordinary situations do exist. Once in a while, an evil or deranged person starts a fire, or commits mass murder, or turns threatening on his date. But hyper-vigilant, preemptive measures like the “Owl Shot” send a message to young adults: Be afraid. Be very afraid. Always.

And then we wonder why so many members of Generation Z are suffering from chronic anxiety. They have been raised to be terrified of the world, almost since day one. Excepting some very extreme circumstances, an unpleasant date is a situation that two college-age adults should be able to navigate on their own, without secret intervention codes passed to the wait staff. 

Nor is this a men’s rights thing. Let’s face it, some guys are dicks. But the Owl Shot, like our current obsession with #MeToo, and real or imagined sexual harassment, sends a very mixed message about feminism. On one hand, we’re told that women should lead men into combat, and should lead our nation. On the other hand, we’re told that women are chronic, hapless victims who can’t make it through a garden-variety date-from-hell without calling for help from the nearest bartender.

A college-aged Gen X woman, circa 1990, always knew how to deal with the boorish date: She would tell the guy to get lost, throw a drink in his face, or—if the circumstances were sufficiently extreme—knee him in the balls.

That last one, it stops a guy in his tracks every time, a lot more effectively than an Owl Shot. Sometimes the old school approach really is the best approach.


Ilhan Omar’s 6-point list

Minnesota’s Representative Ilhan Omar is up for reelection this year. Later this spring she will also release a memoir, This Is What America Looks Like: My Journey from Refugee to Congresswoman. Representative Omar’s book, per the Amazon description, will be “an intimate and rousing memoir by progressive trailblazer Ilhan Omar—the first African refugee, the first Somali-American, and one of the first Muslim women, elected to Congress.”

We wait with bated breath for the Congresswoman’s book. In the meantime, though, she—like her archnemesis Donald Trump—continues to be a prolific user of Twitter.

In one of her recent tweets, Representative Omar listed out the various labels by which she identifies herself:

I am,

Hijabi
Muslim
Black
Foreign born
Refugee
Somali

You may have noticed that Ilhan Omar didn’t include “American” on the list. Plenty of other people noticed, including Dalia al-Aqidi, the Iraqi American woman who hopes to unseat Omar in November. Al-Aqidi (who is also Muslim and a former refugee) responded with the simple statement, “I am an American.”

Ilhan Omar knew exactly what she was doing, and what the reaction would be. At the end of her list, she added another note about herself:

Easily triggering conservatives, Right wing bloggers, anti Muslim bigots, tinfoil conspiracy theorists, birthers, pay me 💵 to bash Muslims fraudsters, pro-occupation groups and every single xenophobe since 2016.

Representative Omar’s bugbears are based on paranoid fantasies, more than actual people. (“Tinfoil conspiracy theorists” seems a little unspecific, at the very least.) We might encapsulate the above paragraph with the phrase, “everyone who disagrees with Ilhan Omar.” 

So, I suppose the Congresswoman would toss me into one of those buckets, if I gave her the opportunity. I’m one of the millions of Americans whom she hopes to trigger.

I’m not sure that I was “triggered” by Ilhan Omar’s tweet. Ilhan Omar has been the source of so much flimflam since she took office in January 2019. The Congresswoman lost her ability to seriously ruffle me sometime during the summer of that same year. Nowadays, I greet the latest Ilhan Omar pronouncement with a weary sigh, perhaps an eye roll, and the thought, “there she goes again.”

But since the congresswoman has gotten my attention, well…okay, I’ll bite. I’ll pose the obvious question here: Is it too much to ask that an elected member of the US House of Representatives also identify as an American, along with all of those other things? 

I’m not demanding, mind you, that Ilhan Omar wrap herself in the flag and sing “The Star-Spangled Banner” a cappella. But since she took the time to identify herself as “Somali”, “American” might have at least been number seven on the list…maybe?

I know what some of you are saying: “We always knew that you were a closet Somali-phobe! What do you have against decent Somali patriots?”

No, I’m not a Somali-phobe. There is nothing wrong with swearing allegiance to the nation of Somalia, either. Somalia needs more patriotic Somalis, you might even argue. But is the US House of Representatives really the most appropriate gig for a self-declared Somali patriot? They have elected offices in Somalia that such a person can run for, or so I hear. 

But Islam is an issue here, too—at Ilhan Omar’s dogged insistence. Like every other aspect of her identity, Ilhan Omar wears her religion not as a set of spiritual beliefs, but as a chip upon her shoulder. She has always been quick to label everyone who disagrees with her an “anti-Muslim bigot.”

Suppose that an elected Roman Catholic congressman from Ohio were to publicly identify himself as a Catholic, a member of the Knights of Columbus, and a former altar boy, without also mentioning that he’s an American. Imagine the evangelical equivalent. 

Now imagine that that same Christian congressman closed with a final paragraph that threw down the gauntlet to everyone who didn’t share his views. Would that draw some criticism from the secular left (especially those who forever fret about “the religious [Christian] right”)? Oh, you bet it would.

Two of Omar’s self-labels refer to Islam. Should “hijabi” (a woman who wears a hijab) be more central than “American” for a woman who writes laws for the rest of us? (Especially considering that she goes to the trouble of listing “Somali”.)

Let’s be clear about this. The implication here is not that Muslims can’t be loyal Americans, or that Muslim Americans shouldn’t serve in Congress. The implication is that Ilhan Omar embraces a worldview in which Islam (along with African ethnicity, being foreign born, and former refugee status) pits one in opposition to the United States and its interests. 

This is the aspect of Ilhan Omar that drives so many people on the right around the bend, and—I suppose—endears her to the revolutionary left. It’s not that she’s Muslim or from Somalia. It’s that she wants to use those aspects of her identity as a cudgel. 

Ilhan Omar wants you to know that she despises the United States, that she considers its customs, people, and traditions beneath her. She therefore sees her 2018 election to the US House of Representatives as a kind of foreign insurgency. She isn’t there to work with the majority of Americans. She’s there to work against them, to compel them to submit to a penance for every real or imagined crime or peccadillo that has ever been committed in the name of the United States. And if you dare object, you’re an “anti-Muslim bigot” or a “tinfoil conspiracy theorist”. 

Whatever else she is, Ilhan Omar is no fool. She chooses her words with purpose. She might have said that she’s a proud American who also happens to be a Muslim. She might have expressed her gratitude as a refugee, toward the country that took her in, and gave her the opportunity to become a member of its government. 

But she didn’t. Ilhan Omar’s stated mission is to “trigger” at least half of all Americans. That’s not me, or anyone else, putting words in her mouth. That’s what she says. Read her tweets. Her words are clear. 

This tells us much about the congresswoman from Minnesota, and how she really feels about her adopted country. 

The door through which you entered

Kshama Sawant, and the rise of the revolutionary immigrant

1.

I’ve met many first-generation immigrants who are among the most patriotic of Americans. There is something to the argument that Americanism and religious faith have one vital thing in common: there is no zealot like a true convert. 

Two of my friends hail from the former South Vietnam. Both of them (barely old enough to remember Vietnam’s descent into communist tyranny) are staunch conservatives. I’m concerned about the avowed and open socialist designs of Bernie Sanders. My two Vietnamese friends both believed that Barack Obama was a communist. Even I won’t go quite that far.

On September 11, 2001, I was working in the corporate world. As chance would have it, my boss was a (non-practicing) Muslim from Sri Lanka. On September 12, 2001, he pulled our group into a meeting room and said, “I find what happened yesterday to be especially disturbing. Remember: You are Americans by birth. I am an American by choice.”

I still get a little chill when I remember those words, spoken on the day after the worst terrorist attack on American soil. And by an American raised in a majority Muslim community abroad, to boot. 

These experiences (among others) have led me to believe that Americanism is not a product of blood or birth, but of philosophy and commitment. On the right at present, there is a conflict between the civic nationalists (mainstream conservatives) and ethnic nationalists (the so-called “alt right”). Put me firmly in the camp of the civic nationalists.  

2.

But then there’s Kshama Sawant. 

Kshama Sawant was born in India and spent her formative years there. After getting a degree in computer science at the University of Mumbai, she moved to the United States.

Shortly after she arrived, Sawant decided that America was a corrupt land of economic inequality. She might have returned home; and no one could have faulted her for this. 

To turn this around: I would like to visit India someday. From what I have heard, though, about the treatment of women (among other human rights abuses) in the world’s largest democracy, I don’t think that Indian citizenship would be for me. 

But Sawant didn’t return home. Instead, she became a U.S. citizen, and embarked on a full-time mission of lecturing Americans about their faults. Sawant became active in Socialist Alternative, a hard-left Trotskyist group that openly seeks to replace free-market democracy with a neo-Marxist dictatorship. 

Such views are very much in vogue in the leftwing enclave of Seattle. Sawant got involved in local politics there, and became a member of Seattle’s City Council, where she uses every pretext to denounce America’s free-market economic system.

And of course, Sawant has been drawn to Bernie Sanders. 

Speaking at a Bernie Sanders rally on February 19, Seattle Councilwoman Kshama Sawant erupted, “We need a powerful socialist movement to end all capitalist oppression and exploitation!” 

Kshama Sawant does not give speeches. She gives harangues. She rants. She screams. She gloats about her plans to overthrow the American system. 

3.

Now, what’s wrong with that? you might say. 

On one level, nothing at all. No one is disputing Kshama Sawant’s right to speak her mind. Nevertheless, there is a mismatch between Sawant’s professed political beliefs and her observed behavior. 

If America is so bad, why doesn’t she vote with her feet? No one, after all, forced her at gunpoint to come here. And while India might not be paradise on earth, it is a basically stable country. She could go back, if she chose to. 

4.

In recent years, we’ve seen a gaggle of prominent first-generation immigrants who have deliberately come here, and yet spend most of their time publicly criticizing America.

These include not only Kshama Sawant, but Ilhan Omar, a first-generation Somali American who is presently a member of the US House of Representatives. According to Omar, America is not only economically unjust, it’s xenophobic and racist. And yet, the American system is sufficiently liberal to permit a first-generation American who doesn’t like America to take part in making laws for the rest of us. 

Umair Haque, a British-Pakistani author, spends a lot of time in America. But he seems to spend most of his hours writing denunciations of the United States in online journals like Medium. Haque, who would presumably have knowledge of the United Kingdom and Pakistan, has remarkably little to say about the flaws in either of those countries. But he has lots of criticism for America. 

5.

There is, I repeat, nothing wrong with disliking America, its system, or its values. 

I feel that way about plenty of places. From what I know of Islam, the Muslim world wouldn’t be the place for me to live. But I’m not about to move to Kuwait or Iran, and tell them that they should be more secular, or more Christian, or more Westernized.

Either the Iranians or the Kuwaitis would predictably ask me what the heck I was doing in their countries, if I so despised their way of life. And you know what? They’d have a valid point.

6.

Let’s return to Kshama Sawant. Most Americans have no interest in overthrowing our current system with a Trotskyite/Marxist system of government. Bernie Sanders is the current frontrunner in the Democratic Party, but his watered-down, somewhat sanitized version of socialism has only captured a minority of Democratic voters. Most Americans want to keep our basic freedoms, including our economic freedoms.

Kshama Sawant, then, has chosen to play a villain’s role right out of central casting—or the most nativist, alt-right corners of the Internet. Sawant is a first-generation immigrant. She’s also an aspiring revolutionary, who hates America as it currently is.

Not every first-generation American has to be a flag-waving patriot like my friends from South Vietnam, or my former boss from Sri Lanka. Patriotism, moreover, can coexist with a sober recognition of America’s warts and imperfections.

But if your only objective as an immigrant is to overthrow the American system, then don’t complain when someone reminds you that the door through which you entered swings the other way, too. 

Bernie Sanders and weaponized democracy

At the time of this writing, Bernie Sanders has just won the New Hampshire Democratic primary. Pundits on both the left and the right now take seriously his chances of winning the Democratic nomination for President of the United States.

So do I….and so should you. Like it or not, Bernie Sanders is now a major force in American politics.

Bernie Sanders’s far-left agenda won’t play as well in the suburbs as Barack Obama’s relative centrism did in 2008 and 2012. But these are strange political times we’re living in, and stranger things might happen.

While President Trump has a large and enthusiastic base of support, the president is a personally abrasive individual who tends to polarize rather than unite Americans. Given his actual performance in office, President Trump should be far more popular than he currently is.

The media has consistently smeared the president. Congressional Democrats have spent far more time concocting various impeachment schemes than they have actually working with President Trump.

But Trump also lacks the consensus-building instincts of a politician. This has been both a strength and a weakness for the president. A Trump-Sanders contest would not necessarily be a cakewalk for the current incumbent. If the pro-Trump movement has taken on cultlike characteristics, well, so has the anti-Trump movement. There are plenty of Americans who might be willing to vote for anyone not named Trump this coming Election Day, the consequences be damned. (It’s been estimated that a Sanders victory could cause the stock market to plunge by as much as 40 percent.)

Bernie Sanders, though, is by no means the unanimous choice of Democratic-leaning voters. Even where his support is strongest, he polls at well under 30 percent. (He won a little more than a quarter of the vote in the New Hampshire primary.)

But the 2020 Democratic field is historically weak. The Democrats have no mainstream candidate on par with Bill Clinton in 1992 or Barack Obama in 2008—someone who can unite a plurality of both centrist and progressive voters.

Bernie Sanders, on the other hand, has hordes of rabid followers who eat, sleep, and breathe for “the Bern”.

Political chaos and the opportunistic radical

This wouldn’t be the first time in history that a small, dedicated, and opportunistic band of radicals has seized the momentum amid political chaos. And no one can doubt that the current state of the United States is one of chaos.

In 1917, the minority Bolsheviks infamously hijacked the Russian revolution from dithering and divided moderates. Adolf Hitler came to power in the 1930s not because a plurality of Germans wanted another world war, death camps, and national destruction—but because Hitler’s followers managed to take advantage of the divided chaos of the Weimar Republic.

Bernie Sanders and his supporters hope to take advantage of the present chaos in American politics, and bring about a socialist revolution that will shift America to the far left. They may very well succeed in doing that. But it is equally possible that they will provoke a backlash that leaves America less democratic, and shifts American politics to the right.

Socialism under Bernie Sanders

Bernie Sanders has unapologetically embraced the socialist label, as have many of his followers. Sanders hasn’t really defined exactly what that means, though he’s given us some clues.

In 1976, Sanders advocated for state ownership of banks, utilities, and major industries. Under Sanders’s 2020 presidential platform, private companies would be expected to “communize” from within, by awarding shares to employees according to a government mandate. This means that no sector of the economy would remain truly free if Bernie had his way.

To be sure, Bernie wouldn’t entirely get his way. Much of what Bernie proposes would be challenged on constitutional grounds. Sanders’s ultimate success (or lack thereof) in transforming America would also depend on how successfully he transforms the Democratic Party, and how successfully Democrats manage to win and maintain seats in Congress.

Socialism and the Democratic Party

The Democratic Party is not, in its traditions, the party of socialism.

Mainstream Democrats like Bill Clinton have always envisioned more government involvement in the economy than was favored by Republicans. But their focus was on the free market and individual responsibility. Bill Clinton famously said that “welfare was never meant to be a way of life”.

That isn’t the position of Bernie Sanders, needless to say. Bernie wants us all to be beholden to the state, from cradle to grave.

The rise of far-left Democrats like Alexandria Ocasio-Cortez, Ilhan Omar, and Rashida Tlaib suggests that there is an appetite in the Democratic Party for a more radically leftist platform.

This will be aided by another plank of the new Democratic agenda: open borders.

Open borders and economic redistribution

In recent years, the Democratic Party has observably become the party of open borders, just as it has become the party of economic redistribution.

These dual agendas are closely linked. New immigrants, who often come here with few resources of their own, are more amenable to policies of redistribution. And once they become citizens, they can vote.

A constant surge of new immigrants therefore ensures an eventual electoral majority for the Democratic Party. This will happen even faster if they succeed in their aim of abolishing the Electoral College. (Even “moderate” Democrats, like Pete Buttigieg, now favor the abolition of the Electoral College.)

The Democratic Party is therefore locked into a specific formula: They must support economic redistribution in order to monopolize the votes of new immigrants; and they must support open borders in order to expand their base.

Weaponized democracy 

What is gradually emerging in the Democratic Party, then, is a project of weaponized democracy, in which the ballot box becomes a tool of expropriation. This impulse will be completely unfettered and out in the open under a President Sanders.

As Margaret Thatcher once said, “The trouble with socialism is that eventually you run out of other people’s money.” This has happened in every country where socialism and state ownership have been implemented.

The Bernie Sanders crowd—if they come to power—will eventually run out of other people’s money, too. But that may take some time. And with open borders, the socialist faction within the Democratic Party can simply overwhelm Republicans, independents, and moderate Democrats with a steady flow of new, left-leaning voters.

This will turn America inexorably to the left. Right?

Well, not necessarily.

Possible backlashes

American democracy is not a law of physics. It is not even a religion. We should not assume that the current form of American democracy will survive unchanged, if the vote is abused in the manner described above.

If democratic institutions are allowed to become nothing more than the lever by which one group expropriates another through majority rule, it follows that many Americans (those who don’t want to live under socialism) will lose their commitment to those institutions. This will happen even faster, once it becomes clear how the Democratic Party is using open borders to build a long-term, far-left majority.

What form will the backlash take? It might take the form of a change in voter eligibility requirements.

And no, this doesn’t mean taking the vote away from racial minorities or women. Nor would it even mean raising the voting age back to twenty-one. (I say back to, because the voting age was twenty-one less than fifty years ago.)

It might mean something as straightforward as this: Suppose, for example, that you weren’t allowed to vote until you’d paid in a certain amount of income tax. Not a millionaire level, mind you, but enough to ensure that you had “skin in the game”—or something to lose—in the event that a Bernie Sanders decided to expropriate and redistribute everything. This change alone would erase the electoral impact of the Bernie Bros and Babes overnight.

I’m not advocating for this, mind you. Any reduction in voting rights would be incredibly divisive. But we shouldn’t assume that it couldn’t happen, if democracy is weaponized as a means of mass expropriation. Americans are a contentious, self-assertive people. They will not stand idly by forever while a radical cabal expropriates them for its own political aims.

Finally, a Sanders presidency could lead to secession movements in conservative states like Utah, Texas, Idaho, Tennessee, etc. Citizens in Salt Lake City and Knoxville might simply decide that if America is now socialist, they no longer want to be part of America.

The coming irrelevancy of the Democratic Party?

But there is another, less dramatic possibility to consider. If the Democratic Party allows itself to be taken over by a radical faction of socialists, the party as a whole may simply become unpalatable to a solid majority of Americans.

This would transform the Democratic Party into a fringe party, incapable of winning elections beyond the local level. Among the MAGA crowd, “Trump 4EVA” shirts have become popular in advance of the 2020 election. Bernie Sanders and his socialist brethren may help make that a reality.

Socialists should run as socialists, not as Democrats

Bernie Sanders is not the first socialist to run for president. Nor is he the first socialist with a substantial following to aspire to the highest office in the land.

Eugene Debs, the Bernie Sanders of the 20th century

Eugene Debs (1855-1926) was the Bernie Sanders of the early 20th century. Debs ran for president five times. He peaked in the election of 1912, when he won 6 percent of the popular vote.

Debs, however, ran honestly, as the candidate for the Socialist Party of America. He did not pretend to be a Democrat. Nor was the Democratic Party of that era about to allow his radical supporters to hijack their organization.

In fact, this would not have been possible under the rules by which the two parties selected candidates a century ago. The open primary system of recent decades, however, allows small groups of radical ideologues to commandeer either of the major parties.

The Democratic Party, which already leans left, is the natural target of radical leftists. Bernie Sanders has become a major candidate in the Democratic Party not because a majority of Democrats want him. (He still polls at under 30 percent among Democrats.) Sanders has risen because he has been able to mobilize large numbers of 18-to-21-year-olds on the promise of free university tuition.

Sanders has been aided, of course, by the general dearth of historical knowledge among younger voters, who have never been taught about the Marxist experiments of the twentieth century. He has been aided by the leftward tilt of the teaching profession since the 1990s. It is no wonder that young Sanders supporters affectionately regard this would-be dictator as their “socialist grandpa”. Thanks to the failings of their elders, these youngest voters lack the intellectual tools to see through Bernie Sanders’s false promises.

* * *

Most of the so-called “Bernie Bros” and “Bernie Babes” had no significant history of involvement with the Democratic Party before casting their votes for Bernie Sanders. Most had never voted or paid substantial income taxes. But because Bernie Sanders was able to mobilize them, he may succeed in his goal of imposing socialism on the rest of us.

Or perhaps not. Bernie Sanders and his supporters hope to shift American politics permanently to the left. But the backlash they provoke may, in the end, push American politics permanently to the right.

Should you major in the liberal arts?

One day in August of 1986, I sat down before the desk of a Northern Kentucky University (NKU) guidance counselor to discuss my class schedule for the upcoming fall semester. I was an incoming freshman. 

The guidance counselor began by asking me what I intended to major in during my four years at NKU. I told her either history or English literature.

She told me, in so many words, that I was an idiot.

“So what you’re saying,” she said, “is that you want to pay to read books. You can read books on your own time, for free.”

Needless to say, I was taken aback. This was several decades before it became fashionable to refer to callow young adults as “snowflakes”. Nevertheless, I was used to receiving almost unconditional encouragement from the adults in my midst. The ink on my high school diploma was still drying, after all. 

I explained to her that English and history had been my favorite subjects in high school. She told me that didn’t matter. What mattered was how I planned to earn a living by reading Shakespeare and learning about the Treaty of Ghent.

What are the liberal arts?

I will freely admit that I love the liberal arts. 

What are the liberal arts, exactly? In short, they are subjects of fundamental, as opposed to applied, inquiry. 

Imagine a group of students at one of the great universities of Europe during the Renaissance, say in…the year 1504. How many of them do you think were studying HVAC repair or managerial accounting? No, most of them were studying subjects like philosophy, mathematics, astronomy, and literature. 

The liberal arts, then, aren’t just the subjects for the numerically challenged, like literature and history. The liberal arts are actually divided into four categories: social sciences, natural sciences, mathematics, and humanities. Some of these require a substantial amount of number-crunching.

The problem with the liberal arts

The problem with the liberal arts, of course, is that it’s hard to make a living as a philosopher or a historian. In most cases, a career in the liberal arts means a career in academia. That can be an okay career, but there are only so many universities and so many open slots. 

And you’ll need an advanced degree. As one of my college biology professors once told my Biology 101 class: “A Bachelor of Science in Biology qualifies you to sell shoes.”

Oh, and that was in 1987, when there were still mall shoe stores to give you a job selling shoes. Today, of course, people buy shoes from Amazon—where they buy seemingly everything else. 

What should you major in, then?

Probably something that you see people doing around you, for actual money: nursing, engineering, accounting, and yes—HVAC repair. (The guy who works on my furnace and air conditioner is typically scheduled four weeks in advance.)

This is a bitter pill to swallow for those of us who love books. It’s not that there aren’t books in the applied fields, but they are different kinds of books, and you’ll require a different kind of curiosity to get enthusiastic about them. 

But trust me—it can be done. I finally ended up majoring in economics—yes, another field in the liberal arts! Later on, though, I took some accounting courses at the graduate level. Accounting can be interesting. Really, it can be.

Why should you be so coldly realistic when choosing a major? The answer is simple: money. With rising tuition costs, a college education is an investment. And every investment requires a plan for ROI (return on investment).

If this doesn’t seem immediately obvious to you (it didn’t seem obvious to me in 1986), don’t be too hard on yourself. Our system of higher education is still stuck in the early twentieth century in many ways. 

Before World War II, a university education was largely seen as a “finishing school” for members of the upper classes. Most of them didn’t have to be too practical. They already had plenty of career options. Many went to work in well-established family businesses. 

In the decades after World War II, the significance and the purpose of the university degree gradually changed. On one hand, by the 1960s, people from the middle and working classes were now attending university. That was a good thing. On the other hand, though, the 4-year university degree lost its scarcity value. And since people in the middle and working classes needed to earn money, they had to be practical when choosing a major.

These changes were already well underway in 1986, when I entered college. But I’m not going to lie to you: When I graduated in 1990, it was still possible to get a corporate, professional job if you had some kind of a 4-year degree—even in history or astronomy. 

Yes, the engineering majors were most in demand, but few English lit graduates truly went without. Somebody would hire you. The only downside was, you wouldn’t be using your major. I graduated with a B.A. in Economics, but no employer ever asked me to draw a supply-and-demand curve, or estimate the marginal utility of some activity. (My first job out of college was a low-level sales job for a machine tool distributor.) 

Nowadays, though, it isn’t uncommon to find twentysomethings with BAs in English working as baristas and waiters. I’m not gloating here, or saying that’s good or fair. I am saying that it’s harder than ever to get a decent-paying job with a liberal arts degree.

What about the liberal arts, then?

Here’s another confession: I still love the liberal arts, and I still think they’re vitally important. 

There are some things that every person should know, and most of these come under the rubric of liberal arts. Every person should know how to set up a basic algebraic equation. Every person should know the differences between capitalism and Marxism. Every person should know the basics of the histories of the major world cultures, and the major world religions. 

Oh, and the Treaty of Ghent: Signed on Christmas Eve, 1814, the Treaty of Ghent ended the War of 1812 between the United States of America and Great Britain. If you’re reading these words in one of those two countries, you should probably know about the Treaty of Ghent, too.

To revisit the advice that the NKU guidance counselor gave me over thirty years ago: You can learn about the liberal arts without spending thousands of dollars in university tuition. 

Want to read Shakespeare? His works are in the public domain. If you have an Internet connection, you can read them for free. You don’t have to spend $300 per credit hour to read Shakespeare. 

Likewise, most of the liberal arts have been covered in thousands and thousands of books, written by some very bright people. You can get books for free at your public library. Or you can spend a little money and get them on Amazon. (But you won’t spend even a fraction of what you’d spend to acquire a 4-year degree in one of these subjects.)

The hard truth

I wish the truth were otherwise. I love the idea of study for its own sake. But given the realities of tuition fees and the job market, you probably shouldn’t pay to take college courses in Shakespeare or European history. 

You’d be much better off reading about these subjects on your own time. Go to college for engineering or nursing. 

Or maybe HVAC repair. Remember that HVAC repairman who’s booked four weeks in advance. I doubt that there are many philosophers with four-week waiting lists, and not many poets or playwrights, either.

Technology and the voluntary loss of privacy

This is bigger than Katie Hill…

A certain politician from California has been in the hot seat of late because of embarrassing revelations of a highly personal nature. 

Katie Hill, a freshman representative from California, has recently seen her private life aired on the Internet, from The Daily Mail to Twitter… 

And what a colorful private life it is, apparently. Say what you will about Representative Hill and her politics, but she isn’t boring and she isn’t a prude. 

This naturally raises a lot of questions: Should a politician’s sex life be an issue, so long as they aren’t breaking any laws or violating anyone’s rights? Can a politician who leads an unconventional sex life govern effectively?

Politics tends to attract horndogs of both sexes, irrespective of ideology: Consider the examples of Bill Clinton, JFK, and Donald Trump.

Further back in history, consider Catherine the Great and King David. 

That isn’t the angle I want to consider, though. 

I grew up in the 1980s. Back then, unless you were a famous person, most of what you said and did simply wasn’t documented.

Photographs existed, obviously. But individual photos had to be developed, usually at a Fotomat. And since they also had to be printed out on paper, there was a cost associated with them. 

“Instant cameras”, with self-developing film, enjoyed a period of popularity in the 1970s and 1980s. But the film was expensive, and the photo quality wasn’t very good. 

Because of those cost and convenience factors, people tended to take photos only when it was an “event”: a birthday celebration, a school play, a family portrait, etc. I won’t go so far as to say that having your photo taken was a big deal in the 1980s, but yes…it was kind of a big deal. It didn’t happen every day, for the average person.

As a result, most of what you said and did died in the moment. There wasn’t this minute-by-minute record of your life that we have now. 

Those technologically primitive times had their benefits. Suppose that you said something dumb, or you did something that pushed a few boundaries. Unless it was really over the top, it was quickly forgotten. 

Which is, I would suggest, the way it should be.

Katie Hill certainly didn’t want her private photos published on the Internet. Her reasonable expectations of privacy were violated. Let’s be unequivocal about that. 

But the vast majority of the photos which came to light were clearly posed. This strongly implies that she consented to them being taken. 

This, in itself, represents a major lapse in judgment. Why, pray tell, would anyone consent to a naked photo of oneself, smoking from a bong, with an iron cross tattoo plainly visible near one’s pubic region?

We’ve bought into the notion that every moment of our lives needs to be Instagrammed, Facebooked, and selfied. Perhaps this is mass vanity, or perhaps this has just become a habit. Either way, it’s what we’re all doing. 

And this isn’t just the Millennials and the GenZers. I have friends in their forties and fifties who seemingly can’t go out to dinner without taking a half-dozen photos of themselves and uploading them to Facebook. 

Look at us, and what a happy couple we are, having a fancy meal out on the town!

More of our lives needs to remain undocumented. And our private lives, especially, need to remain private.

How do you define “private”? Here’s a rule of thumb: Don’t consent to any photo of yourself that you wouldn’t want posted on the homepage of The Daily Mail. Because as Katie Hill now knows, that may very well happen. 

The Apple Store business model is broken

Here’s what’s wrong…and how Apple can fix it.

This past week I took my 73-year-old father to the Apple Store in the Cincinnati area with the intent of purchasing at least one (and probably two) items. My dad was in the market for a new iPhone and a new laptop.

We arrived twenty minutes before the store opened, and a young Apple Store associate entered our information into a tablet while we waited. (Like the government in Logan’s Run, Apple Stores seem to eliminate every member of their staff over the age of thirty. I have never been waited on there by anyone much beyond that age.)

Great! I thought. This is going to be fast! Whiz-bang efficiency!

But I was wrong. It wasn’t fast. 

To make a long story short, we spent 90 minutes waiting around the store. We stood. We paced. We looked at the few items that you can view without the help of a sales associate. (And there aren’t many of those.)

And then, finally, we gave up. We left without buying anything. At the time of our departure, we were told that we would be waited on in…about twenty minutes.

That was probably an optimistic assessment. I think it would have been more like an hour: There were around two dozen other customers waiting around for service, just like us. 

I saw several of them walk out in frustration, too.

Apple: great products, sucky retailing

I am a ten-year member of the Cult of Mac. 

I personally haven’t used anything but Apple products since 2010, when a final malware infection of my Dell PC, loaded with Windows XP, convinced me that enough was enough.  

So I bought an iMac. The rest, as they say, is history. Since then, I’ve owned two iMacs, two MacBooks, four iPods, and three iPhones. 

I’ve become an evangelist for Apple products. I’ve converted not only both my parents, but at least two or three of my friends. 

Apple products really are something special. But boy, those Apple Stores sure do suck.

And I’m not the only one who feels this way.

Widespread complaints

A May 2019 article in the LA Times is titled “How the Apple Store has fallen from grace”. Focusing on an Apple Store in Columbus, Ohio, the article could have been written about my recent visit to the Apple Store in Cincinnati:

Web Smith’s recent experience at his local Apple store in the suburbs of Columbus, Ohio, has been an exercise in frustration.

There was the time he visited the Easton Town Center location to buy a laptop for his 11-year-old daughter and spent almost 20 minutes getting an employee to accept his credit card. In January, Smith was buying a monitor and kept asking store workers to check him out, but they couldn’t because they were Apple “Geniuses” handling tech support and not sales.

“It took me forever to get someone to sell me the product,” said Smith, who runs 2PM Inc., an e-commerce research and consulting firm. “It’s become harder to buy something, even when the place isn’t busy. Buying a product there used to be a revered thing. Now you don’t want to bother with the inconvenience.”

There are many similar stories in the media of late, as well as customer complaints on social media. 

Cult of Mac members still largely love their iMacs, MacBooks, iPhones, iPods, and Apple Watches. But they increasingly dread the next trip to the Apple Store.

So what went wrong? And what needs to be done? 

An obsolete concept of the pre-iPhone era

The first Apple Stores debuted in May 2001—going on twenty years ago. Back then, they showcased only the computers, which had a minuscule market share at the time, compared to PCs made by Dell and Gateway. 

iPods were added in October 2001, but these, too, were specialty products when they debuted. For geeks only. 

The real tipping point was the introduction of the iPhone in 2007, and the subsequent ubiquity of smartphones. 

In 2001, a relatively small percentage of the population owned an iMac or a MacBook. In 2019, 40% of us own iPhones. The iPhone is a mass-market product. But it’s still being retailed as if it were a specialty item.

And when you visit an Apple Store in 2019, you’ll find that 70% of the traffic to these upscale boutiques is iPhone-related. Many are there for routine password resets. 

This is traffic that was never imagined or accounted for in 2001, when the Apple Store concept was launched.

Zen over function

Apple Stores don’t look like ordinary electronics retail stores. Steve Jobs was a devotee of Eastern Zen practices, and the Apple Store resembles a Japanese bonsai garden. There is an emphasis on minimalism, and lots of blank space.

The downside of that is that you can’t do much to serve yourself, as you could in a Best Buy or a Walmart. 

You basically walk into the store, and an employee puts you into an electronic queue. Then you wait around. 

But you have a very clean, zen setting in which to wait. 

Uncomfortable stores

Speaking of those long waits…

Apple Stores do look nice. But they are not comfortable places to spend an hour waiting for a salesperson. Which is almost inevitable. 

There are few stools, and it’s clear that the stools were selected for their sleekness, not their comfort.

There aren’t any plush bean bags or sofas to sit on. Heavens no! That would detract from the zen.

Inefficient use of staff

Too many Apple Store employees are exclusively dedicated to crowd control—to herding you into virtual line. 

This is because you can’t serve yourself in an Apple Store. Go into a Best Buy, and there are clearly defined areas for looking at computers, at cell phones, at peripherals. There’s a line for service in every Best Buy. A line for returns. 

Normal retail, in other words. 

There are no clearly defined areas within the Apple Store. Customers are all milling about, most of them doing nothing but waiting to be attended to.

Many of these customers are frustrated and growing impatient. They want to know how much longer they’ll have to wait. This means that at any given moment, at least a quarter of the Apple Store employees you see on the floor are directing this vast cattle drive. 

They aren’t selling any products, and they aren’t helping any customers. They’re just managing the virtual line.

That amounts to a big waste of the Apple Store’s manpower—and of the customers’ time.

Decline of staff quality

Apple Stores were once staffed by highly knowledgeable sales personnel. That was in the days when the stores only carried computers, and hiring was very selective.

Those days are gone. Now that it’s all about selling a gazillion iPhones, Apple Store employees are no longer specialists. Despite the pretentious name “Genius Bar”, geniuses are in short supply on the sales floor nowadays. You’re going to be served by run-of-the-mill retail sales staff. And their expertise, helpfulness, and attitudes vary greatly.

Not enough stores

There are about a dozen AT&T stores within a twenty-minute drive of my house in suburban Cincinnati.

Guess how many Apple Stores there are…

One. In the Cincinnati area, we are served by a single Apple Store at the Kenwood Towne Centre.

And for those readers in Los Angeles and New York, who maybe think that Cincinnati is a one-horse cow town: There are 2.1 million people in the Greater Cincinnati area. It’s the 29th largest metropolitan area in the United States. 

And we have one Apple Store.

There are only eight Apple Stores in all of Ohio, a state with a total population of 11 million. That works out to one Apple Store for every 1,375,000 Ohioans.

But it could be worse: There are only three Apple Stores in the entire state of Wisconsin. Kentucky has only one Apple Store.

Even New York State has only twenty-two Apple Stores. AT&T has more retail locations than that just in Cincinnati.

No wonder the stores are packed. I made my aforementioned trip to the Kenwood Towne Centre Apple Store with my dad on a Friday. Granted, Friday is typically a busier retail day than Tuesday, Wednesday, or Thursday. But this was during the middle of October—not exactly a peak shopping season. The back-to-school rush was already over, and the Christmas shopping blitz wouldn’t begin for another six weeks.

And at 9:40 in the morning—twenty minutes before opening time—there was already a crowd outside the Apple Store.

The Apple Store needs to be refocused on function rather than branding

As an Apple employee quoted by the LA Times notes, Apple Stores are “mostly an exercise in branding and no longer do a good job serving mission shoppers”.

The “mission shopper” is the shopper who goes into the store with a specific purchase in mind (versus someone who is still torn between a Mac and a PC, or an iPhone and an Android). 

These are customers who could largely serve themselves. If only that were possible. But due to the philosophy of the Apple Store, there is minimal “clutter” at these boutique shops. In other words, these are retail shops with minimal merchandise on display. 

Apple Stores need to become more like Best Buys: There should be clearly defined areas for looking at each category of merchandise, and clearly defined areas to wait for technical support. 

As I mentioned above, most of the traffic in the Apple Store seems to involve iPhone support. The iPhone customers definitely need their own area of the store. 

This probably means abandoning the whole boutique concept. At present, Apple Stores are small but mostly empty spaces in high-rent locations. That is, again, all very zen and cool-looking. But it doesn’t happen to be a great way to purchase a new MacBook, or to get your iPhone unlocked when you’ve forgotten the passcode.

A broken model in terminal need of repair

The Apple Store might have been a workable retail model in the pre-iPhone era, when Mac devotees really were an exclusive tribe. The Apple geeks of 2001, with their tattoos and soul patches, may have appreciated the gleaming but empty Apple Stores.

But the Apple customer base has changed and expanded since 2001. When you factor in iPhones, Apple is now a mass-market brand. (And Apple now owns 13% of the home computer market.)

Having become a mass-market brand, Apple needs to adopt the more efficient practices of a mass-market brand.

That means dropping the boutique pretentiousness that makes Apple Stores great places to photograph, but horrible places to buy stuff. The hoi polloi of 2019 are not the rarefied Apple geeks of 2001.

We don’t want or need a zen experience. We just want to get quickly in and out of the Apple Store with minimal delays, like we can at every other retail shop.

Ric Ocasek 1944-2019

Ric Ocasek, the lead singer of The Cars, has died. 

Anyone who remembers the early 1980s remembers Ric Ocasek. He was a mainstay on FM radio in those days, the voice behind “Shake It Up” and “Since You’re Gone”.

When MTV came along in 1981, Ocasek’s visage became well known, too. Those sunglasses.

Ocasek was an unlikely rock star: a too-thin, gawky fellow with a pointy nose and a prominent Adam’s apple. He was nevertheless older than my father, and he was married to a supermodel who was about my age.

He also proved that you don’t need movie-star good looks to be a successful rock star. The Cars were enormously popular throughout the Reagan decade, with Ric Ocasek as their frontman.

The Cars were never my favorite band. But their music was always there, and it was always pleasant. A feel-good sound from a now vanished, better era.

At the time of this writing, the details of Ocasek’s death are unclear. He was seventy-five years old, and an ex-rock star, so…well, we just don’t know.

Ric Ocasek, 75, R.I.P.