Are beauty standards universal across time?

The answer to this one is complicated.

Some biases regarding attractiveness certainly are universal, in that they applied in Elizabethan England just as they apply today (and probably will five hundred years hence).

Men who are tall and broad-shouldered have a natural advantage with women. Always have…probably always will.

Men have always preferred younger women, and women who have a certain hip-waist-bust ratio.

Speaking of age: In no society that I am aware of have the elderly ever been regarded as the sexual ideal.

(Hey, I just turned fifty; so I’m not any more enthusiastic about it than you are. But it is what it is.)

The age thing probably makes sense, from an evolutionary perspective. Sexual attraction is ultimately about procreation. Older women can’t get pregnant; and a man’s ability to produce healthy offspring declines with age. If young people were naturally drawn to sexual relations with old folks, the human race would have died out eons ago.

(Twentysomething female readers who don’t wish to remain pawns of their evolutionary impulses are of course welcome to email me.)



Once you get beyond these basics, though, there is some real variation.

I watch a lot of old movies, and I’m often surprised by my reaction—or lack thereof—to female sirens of the early twentieth century.

Almost all of them, no matter how young they were when they appeared in a particular film, make me think of my grandmothers. And that’s a real libido-killer.

Consider that famous pinup of Betty Grable. You’ve seen it: the one that features Grable standing in a swimsuit with her back to the camera. She is looking mischievously over her shoulder.

Betty Grable, 1943

It has been said that no World War II GI was without one of these. (I know that my grandfather, a World War II veteran, had a copy.)

Betty Grable was twenty-seven when she posed for that iconic shot. Put me in a time machine and take me back to 1943, and Betty Grable would consider me an old man at my present age of fifty.

And yet, the pinup photo of Grable (which so inspired men of my grandfather’s generation) does absolutely nothing for me. Even at twenty-seven, Grable strikes me as matronly.

I don’t really see much in the way of wow! feminine attractiveness until you get to the Baby Boomer generation.

This makes sense. In the early 1980s, when I was an adolescent boy discovering the existence of the opposite sex, many Baby Boomer women were still youngish, and therefore objects of fascination from afar.

(There is one group of males who are consistently attracted to older women, by the way: twelve- to fourteen-year-old boys!)



I have never been prone to celebrity obsessions. By this, I don’t mean to claim that I have never been interested in women who are out of my league—but they have tended to be women in my immediate surroundings, versus women on television. When I tilt at windmills, I like the windmills to be nearby.


I do, however, recall a brief adolescent infatuation with Olivia Newton-John, one of the costars of Grease (1978). Since Olivia was a Hollywood celebrity and twenty years my senior, I recognized, even at that age, that these were foolish thoughts. But since celebrities can provide a frame of reference for discussions like this, I’ll note that I can also see some real make-my-heart-flutter beauty in Nancy Sinatra, circa 1968, and Michelle Phillips, from anywhere around that time.

Nancy Sinatra, 1968
Michelle Phillips, 1974


My ability to see attractiveness in the young versions of Baby Boomer women (versus women of the World War II generation) makes a certain amount of sense from a cultural perspective, too. My world very much overlapped with the world of the Baby Boomers. When I was an adolescent, Baby Boomers defined youth culture.

Virtually all of the celebrities of my youth were Baby Boomers. So were the female sex symbols: Farrah Fawcett, Jaclyn Smith, Bo Derek.

(I recall seeing my first copy of Playboy at the tender age of eleven, in 1979. The centerfold model of that issue would have been born in the 1950s—making her a Baby Boomer.)

This 1976 poster of Farrah Fawcett was literally everywhere during my adolescent years.



Now I’m going through beauty-standard culture shock from the opposite perspective. To me, there is no aesthetic tragedy to equal the young woman who turns her body into a canvas of gaudy tattoos and ridiculous piercings.

Yes, I said “gaudy” and I said “ridiculous”. Almost all men in my age group feel the same way. For that matter, I strongly suspect that many Millennial men share this opinion, but are hesitant to openly express it.

Slightly below tattoos and piercings on the “what was she thinking?” scale are breast implants. Whenever I see a young woman with breast implants, I think of the strippers at the Bada Bing, the club in The Sopranos.

Where the female body is concerned, I have an unapologetic preference for the “natural” look.

Kat Von D, via Pinterest: Why?????


I’ve also noticed that young Millennial women tend to vary more widely in weight and physical fitness than their predecessors. When I was a young man, it was somewhat rare to see a young woman who was either super-fit or noticeably overweight. (It was the same with males, I should note.)

Generation Y, however, has both more couch potatoes and more gym rats. The result is that most Millennial women tend to strike me as either jaw-dropping, centerfold-attractive…or not very appealing at all.



I will acknowledge a certain chauvinism here in addressing only the female side of the coin. But this is a personal essay—not a broad-ranging academic paper. And so the perspective is personal.

I’m sure that what women find attractive has changed, too, since the early twentieth century. Think about this: In the mid-1980s, most male sex symbols wore mullets (though nobody called them that back then).

I’ll leave that piece for a heterosexual woman to write. As a heterosexual man, I’ve always paid a lot more attention to the distaff side of things. Please keep that in mind as you compose your hate mail.

Beauty standards will continue to change, I’m sure—even as some factors remain constant.

Looking back at my own high school yearbook, I’m struck by how much standards of feminine beauty have changed in a mere thirty-five years…

But I’m going to keep those particular observations to myself. At least a few of my former high school classmates have been known to frequent this blog.

I can handle hate mail from anonymous folks on the Internet…but not from people I’ve known for thirty-five years.

Anthrax and anxiety

The other day I watched a documentary about some highly disturbing revelations to come out of the former Soviet Union.

In 1969 the United States, under then-President Richard Nixon, unilaterally abandoned all testing and development of germ warfare weapons. Nixon claimed he did this because he found these weapons simply too horrible to fathom. Pessimists claimed that Nixon was simply trying to cast himself as a peacemaker. (This was during the Vietnam War era, remember.)

What were Nixon’s true motivations? I’ll let you be the judge of that.

The Soviets, being the Soviets, took the most cynical, zero-sum interpretation possible. They signed the Biological Weapons Convention, along with many other countries, in 1972. In secret, however, they ramped up their germ warfare program to previously unimagined levels.

(Supposedly, the Soviets believed that Richard Nixon was lying about the U.S. unilaterally abandoning its germ warfare program. What country would do such a thing? Certainly the USSR never would. But in this case, at least, Nixon had been telling the truth.)

The Soviets built a huge anthrax weapons development facility near Yekaterinburg, then known as Sverdlovsk. (This was where Czar Nicholas II and his family were massacred by drunken Marxist revolutionaries in the aftermath of the Bolshevik Revolution.) The Soviet government told the world—and its own people—that the site made nothing but conventional military tanks.

This being the USSR, there was an inevitable foul-up. Some anthrax, in aerosol form, was accidentally released into the air in 1979. Several hundred people died as a result. The Soviet government told the world (and its own people) that the deaths had occurred from ordinary food poisoning.

After this disaster, the Soviets built another biological weapons facility in a remote area of Kazakhstan. This facility created enough weaponized anthrax to wipe out all human life on the entire planet.

Then the Soviet Union dissolved in 1991.

The government of the now independent Kazakhstan really wanted nothing to do with the USSR’s massive germ warfare facility—or its stockpiles. Throughout the 1990s, various efforts were made to clean up the site, and dispose of the stockpiles in a safe manner.

Much of the anthrax was placed into sealed containers and dumped into the Aral Sea, which lies between Kazakhstan and Uzbekistan. Although this sounds inherently dubious, they figured it would be safe there.

But there was a problem.

The Aral Sea has been slowly drying up for decades. Why? More Soviet ingenuity at work. During the Khrushchev era, Moscow had the brilliant idea of diverting the two major rivers that feed the Aral Sea (the Amu Darya and the Syr Darya) in an effort to irrigate the surrounding desert.

So the once submerged containers are gradually being exposed.

Most of these containers are still difficult to get to. But there are certainly terrorist groups with a lot of motivation, and there’s all that anthrax sitting out there, in those containers.

Oh, and it gets better: There are still scores of unemployed germ warfare experts from the former Soviet Union. Many of them are only in their fifties. And many of them are selling Chinese-made trinkets in Astana in order to make ends meet.


This is a lot to worry about, when you think about it: While the development of a nuclear weapon would likely be an overwhelming task for a stateless terrorist group, biological weapons require far fewer resources.

It is no exaggeration to say that the former Soviet Union’s biological weapons program could still wipe out all human life on earth. The USSR cast a long shadow, and nothing in that shadow was good.

On age and humility

I turned fifty this past year.

Age makes most everyone more conservative, more set in their ways.

I am no exception in this regard.

Age has also dimmed much of my youthful optimism.

On the positive side, however, age has definitely made me more humble.

When I was twenty-three (the most arrogant, self-important age, for most people), I had lots of ideas about myself and what I could do.

But I had not yet been tested. Not in any meaningful way.

At the age of fifty, I’ve definitely been tested by the big things: the deaths of loved ones, illness, disappointments, and the disillusionment of my youthful notions about “how the world is.”

And having been tested, I must report that I’ve sometimes found myself coming up short.

I’ve discovered that I am not that wonderful and impeccably principled person that I believed myself to be at the age of twenty-three.

Each of us is endlessly virtuous, until those virtues are put to the test.

At the age of twenty-three, I saw only black and white. At the age of fifty, I see many shades of gray…most of all in myself.

Reading ‘Revolutionary Ghosts’ over at my YouTube channel

I’ve said before that I’m a text and audio person…not a video person.

(Hey, I’m a middle-aged bald guy…I wasn’t born for video.)

Nevertheless, I can’t abandon YouTube entirely. I’m simply taking my channel in a more focused direction.

Rather than being a dedicated “YouTuber,” per se, I’ll mostly be posting audio files of book readings there. That’s the plan at this point, anyway.

Check out ‘Revolutionary Ghosts’ on Amazon.

A cap on student borrowing

The Trump administration is considering new limits to student loans, with the aim of encouraging “responsible borrowing”.

We need to decrease third-party payments in the higher education market, as I discussed in a blog post yesterday. Third-party payments from the federal government have inflated the cost of higher education, placing an unreasonable burden on students, parents, and taxpayers.

The only real beneficiaries of the current system are the higher education elites.


When celebrities lose their relevance

I remember the good old days when Jim Carrey was relevant.

1994 (25 years ago) was huge for Carrey. That was the year of Ace Ventura: Pet Detective, The Mask, and Dumb and Dumber.

Carrey peaked a few years later with The Cable Guy and a role in Batman Forever.

Carrey had a schtick, and it ran its course a generation ago.

But what does a once-popular celebrity do when his best days are behind him?

In Carrey’s case, he whiles away the time making third-grade-level art in an attempt to bolster bizarre conspiracy theories:

Actor Jim Carrey, who often uses his visual artwork for advocacy, posted a new image on Twitter Sunday that expresses outrage at President Donald Trump in the wake of the mosque killings in Christchurch, New Zealand.

Carrey’s new artwork shows Trump’s head, with an angry, soulless expression and a Nazi swastika on the forehead, as a blazing asteroid streaking toward Earth.




Now, however you might feel about Donald Trump…This is kind of…dumb, I believe the word is.

Sigh…I miss the 1990s, too, Jim.

Revolutionary Ghosts, Chapter 16

My bedroom was a small, cramped affair, very typical of secondary bedrooms in postwar tract homes. There was barely enough room for a bed, a desk, a dresser, and a chest of drawers. The one selling point of the bedroom was the window over the bed. It afforded me a view of the big maple tree in the front yard, when I felt like looking at it.

I lay down on my bed and opened Spooky American Tales. I briefly considered reading about the Nevada silver mine or the Confederate cemetery in Georgia.

Instead I flipped back to page 84, to Harry Bailey’s article about the Headless Horseman.

After the opening paragraphs, Harry Bailey explained the historical background behind the legend of the Headless Horseman. While most everyone knew that the Headless Horseman was associated with the American Revolution, not everyone knew the particulars:

“Is the Headless Horseman a mere tale—a figment of fevered imaginations? Or is there some truth in the legend? Did the ghastly Horseman truly exist?

“And more to the point of our present concerns: Does the Horseman exist even now?

“I’ll leave those final judgments to you, my friends. 

“What is known for certain is that on October 28, 1776, around three thousand troops of the Continental Army met British and Hessian elements near White Plains, New York, on the field of battle. 

“This engagement is known in historical record as the Battle of White Plains. The Continentals were outnumbered nearly two to one. George Washington’s boys retreated, but not before they had inflicted an equal number of casualties on their British and Hessian enemies…”

By this point in my educational career, I had taken several American history courses. I knew who the Hessians were.

The Hessians were often referred to as mercenaries, and there was an element of truth in that. But they weren’t mercenaries, exactly, in the modern usage of that word.

In the 1700s, the country now known as Germany was still the Holy Roman Empire. It consisted of many small, semiautonomous states. In these pre-democratic times, the German states were ruled by princes.

Many of these states had standing professional armies, elite by the standards of the day. The German princes would sometimes lease out their armies to other European powers in order to replenish their royal coffers.

When the American Revolution began, the British government resorted to leased German troops to supplement the overburdened British military presence in North America. Most of the German troops who fought in the American Revolutionary War on the British side came from two German states: Hesse-Kassel and Hesse-Hanau. The Americans would remember them all as Hessians.

The Hessians had a reputation for brutality. It was said that no Continental soldier wanted to be taken prisoner by the German troops. The Continentals loathed and feared the Hessians even more than the British redcoats.

I supposed that Harry Bailey would have known more about the Hessians than I did, from my basic public school history courses. But Harry Bailey wasn’t writing an article for a history magazine. The readers of Spooky American Tales would be more interested in the ghostly details:

“That much, my dear readers, is indisputable historical record. Journey to the town of White Plains, New York, today, and you will find monuments that commemorate the battle.

“But here is where history takes a decidedly macabre turn, and where believers part ranks with the skeptics. For according to the old legends, one of the enemy dead at the Battle of White Plains would become that hideous ghoul—the Headless Horseman. 

“A lone Hessian artillery officer was struck, in the thick of battle, by a Continental cannonball. Horrific as it may be to imagine, that American cannonball struck the unlucky Hessian square in the head, thereby decapitating him. 

“What an affront, from the perspective of a proud German military man! To have one’s life taken and one’s body mutilated in such a way!

“So great was the rage of the dead Hessian, that he would not rest in his grave! He rose from his eternal sleep to take revenge on the young American republic after the conclusion of the American Revolution.

“This is the gist of Washington Irving’s 1820 short story, ‘The Legend of Sleepy Hollow’. The tale is set in the rural New York village of Sleepy Hollow, around the year 1790. 

“But we have reason to believe that ‘The Legend of Sleepy Hollow’ was not the last chapter in the story of the Headless Horseman. For according to some eyewitness accounts, that fiendish ghoul has returned again from the depths of hell. 

“Read on, my friends, for the details!”

Lying there on my bed reading, I rolled my eyes at Harry Bailey’s florid prose. He was really laying it on thick. But then, I supposed, that was what the readers of a magazine called Spooky American Tales would require.

Then I noticed that the hairs on my arms were standing on end.

My gooseflesh hadn’t been caused by the article in Spooky American Tales—at least, I didn’t think so. I hadn’t yet bought into the notion that the legend of the Headless Horseman might be anything more than an old folktale.

Nor was the temperature in my bedroom excessively cold. Three years ago, my parents had invested in a central air conditioning system for the house. They used the air conditioning, but sparingly. It sometimes seemed as if they were afraid that they might break the air conditioning unit if they kept the temperature in the house below 75°F. With the door closed, it was downright stuffy in my bedroom.

I had an unwanted awareness of that bedroom door, and what might be on the other side of it.

The shape I had seen in the hallway.

Then I told myself that I was being foolish.

It was a bright, sunny June day. The walls were thin, and the door of my bedroom was thin. I could hear the muffled murmurs of the television in the living room.

It wasn’t as if I was alone in some haunted house from Gothic literature. I was lying atop my own bed, in my own bedroom, in the house where I’d grown up. My parents—both of them—were only a few yards away.

There is nothing out there in the hall, I affirmed.

With that affirmation in mind, I continued reading.


Chapter 17

Table of contents

Donna Brazile to Fox News

This is a surprise.

Brazile has issued a statement on the Fox News website, in anticipation of her new gig:

In order for us to best decide as a people how to better protect and preserve our way of life, we need to first be able to hear what others are saying without the filter of bias and contempt. Not until we once again become practiced at treating those of differing views with civility and respect can we begin to join together to solve the myriad of problems our country must overcome.

I’ve watched Donna Brazile for many years. She is something of a race-baiter, and (to her credit, I suppose) she doesn’t hide her intentions.

Her statement, when you read the whole thing, is quite conciliatory. I hope she is treated well and has a good experience on Fox.

Although Donna Brazile is certainly a big name in the public square, Fox has always maintained a small contingent of left-wing commentators who can serve as foils for its conservative anchor talents.

I’m sure Brazile is being handsomely remunerated for this new role of hers.

Should college be free?

During the 2016 Democratic presidential primaries, Bernie Sanders famously ran on the promise of free higher education, making him the hero of millions of college students. Free college will doubtless become an issue in 2020 as well, with Democratic Party candidates making big promises.

This raises the question: Should college be free?

The answer is complicated. When public resources are completely free, economists have found that a situation called “the tragedy of the commons” arises. Free public resources are almost always misused and overused.

Moreover, there is nothing wrong with the basic notion that each of us has some responsibility to fund our own future development—and that of our children.

Students and parents should, therefore, have some skin in the game. So to state my conclusion from the outset, no, college should not be completely free.

But this does not mean that the present situation is fair, acceptable, or sustainable over the long term.

I graduated from college many years ago, but the children of many of my friends and relatives are presently going through the college application process. I’ve seen the sticker shock, and I don’t dismiss it. Suffice it to say that parents and students are being overcharged at a level that is borderline criminal.

But is the solution to socialize a corrupt and overpriced system? This is what Bernie Sanders and other progressive candidates of the Democratic Party essentially propose.

And on that one, I can’t agree.



Let me give you an old-timer’s perspective: I began my last full year of college in the fall of 1989. I attended the University of Cincinnati. My annual tuition, as a full-time student, was $3,000.

Yes, I realize that 1989 was a long time ago. There was no Facebook, we typed our term papers on electric typewriters, and dinosaurs roamed the wooded areas at the fringes of college campuses. No one expects anything to cost exactly what it did thirty years ago.

So I checked the inflation-adjusted amount on one of the many consumer-price index (CPI) calculators available on the Internet. When you bring three thousand 1989 dollars into the present, the figure you get is $6,075.

That means that a year of tuition for a full-time student at the University of Cincinnati should cost about $6,100 today, or roughly $24,400 for a four-year degree.

But full-time, in-state tuition at the University of Cincinnati does not cost $6,100 today. In-state tuition at the University of Cincinnati costs about $12,000 today. So students are paying about twice what I paid in 1989.
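The inflation adjustment above can be sketched in a few lines. The CPI figures in this sketch are illustrative assumptions (approximate CPI-U annual averages for 1989 and the late 2010s), not values pulled from an official calculator:

```python
def adjust_for_inflation(amount, cpi_then, cpi_now):
    """Convert a dollar amount from one CPI level to another."""
    return amount * cpi_now / cpi_then

# Assumed CPI-U annual averages: ~124.0 for 1989, ~251.1 for the late 2010s.
tuition_1989 = 3000
tuition_adjusted = adjust_for_inflation(tuition_1989, 124.0, 251.1)

print(f"${tuition_adjusted:,.0f} per year")        # ≈ $6,075
print(f"${tuition_adjusted * 4:,.0f} per degree")  # ≈ $24,300
```

Plug in today’s actual in-state tuition of roughly $12,000 and the ratio works out to about double the inflation-adjusted figure, which is the point of the comparison.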

This makes absolutely no sense. In 1989, of course, we didn’t have the Internet. Everything was done manually, in-person, with lots of labor costs.

And I mean everything. I remember standing in line for hours outside the registrar’s office at UC in the hopes of signing up for a particular class. This involved not only thousands of students, but dozens of clerical workers, who manually entered changes to our schedules.

Our initial schedules, at the beginning of each quarter, were mailed in to the registrar’s office. This meant that someone had to open each envelope, and keypunch it into the school’s mainframe computer.

Today, of course, all of that is done online, largely without the need for any human intervention—or labor costs.

Almost all college students seem to be taking at least some online classes nowadays. This, too, was impossible in 1989. Every minute of instruction involved a real live professor, standing in front of a group of real live students, inside a physical space that had to be heated during the winter months, and air conditioned during the warm months. The lion’s share of those costs can be avoided when a course is conducted online.



College, then, should be much cheaper for today’s students than it was for the students of my day.

Let me retract what I previously said: college tuition in 2019 should cost exactly what it cost in 1989, which is to say about three thousand dollars per year at the typical state university.

The question we should all be asking is not “Who should pay the presently inflated tuition costs?” but rather “Why are universities so inefficient that they can’t offer a product whose price reflects the efficiencies gained through the Internet and digital technologies?”

College in the Internet Age should be dirt cheap, little more than a nuisance cost for most families. Instead it’s become a major investment—or in many cases, a major money pit.


Part of the problem is administrative bloat. Universities all have much larger administrative staffs today than they did in 1989. And I’m not talking about hourly clerical workers. I’m talking about professional, non-teaching administrators who haul down six-figure salaries.

Some of these salaries rival corporate CEO-level compensation. Let’s look at some concrete examples. During the 2016-17 school year, James Ramsey, then the president of the University of Louisville, made $4.3 million. The University of Cincinnati’s president, Neville Pinto, has an annual salary of $660,000. The president of Northern Kentucky University has a base salary of $440,000.

Speaking of NKU, in 2012 it was announced that the ex-president of that institution, Jim Votruba, would be paid $287,675 to teach only two classes. With additional benefits, his total compensation—for teaching two classes—totaled $371,000.

These are some pretty extravagant compensation packages, when you consider the scope of the student debt crisis. This also represents a transfer of wealth—from students, their parents, and taxpayers—to a small group of educational elites.



Universities also have a tendency to go on wasteful construction sprees. When I was a student at the University of Cincinnati, we used to joke that “UC” stood for “under construction,” because the university was always tearing down existing buildings and constructing new ones.

I recently drove by the UC campus. I noted that at least two of the buildings constructed at great expense during my student days had since been torn down and replaced.

Now, granted, I am aware of my age, and aware that thirty years have passed. Nevertheless, there are buildings in Europe that have literally been inhabited for centuries. An academic building constructed with taxpayer and tuition dollars in 1988 or 1989 should still be perfectly serviceable.



Why are university administrations so wasteful? Mostly, because certain factors have shielded them from market forces.

Universities have, first of all, a monopoly on credentials. Some people do attend four years of college for the experience, and the actual education…such as it is nowadays. Most, however, are there for the sheepskin. Or rather, the doors that the sheepskin can open. If you want to land a professional job, you have to hold a diploma from an accredited, four-year university.

In this regard, things haven’t changed since my younger days. But the situation has gotten even worse. Colleges are now aggressively peddling worthless degrees in subjects like intersectional gender studies. Degree inflation, and the belief that college confers magical powers of competence on otherwise wet-behind-the-ears eighteen-year-olds, has sent even more young people through the doors of these institutions over the past thirty years. The result is that it is now common to meet a twentysomething waitress or hourly Home Depot employee who has a four-year degree.



What can be done about this? About twenty years ago, there was a movement toward private sector, profit-based educational institutions that might break the university monopoly on credentials. You’ve heard of at least some of these, like the University of Phoenix, and Capella University.

That didn’t work out so well. Why not? Not because these institutions were doing anything wrong, necessarily, but because corporate employers, mired in their allegiance to the monopolistic university system, refused to recognize degrees from for-profit institutions. The net result was that thousands of students invested money in degrees from for-profit colleges that turned out to be worthless.

Perhaps for-profit educational institutions aren’t the answer. Nevertheless, we should be exploring ways whereby a person can earn credentials without the involvement of a traditional university.

One possibility would be a series of standardized tests. These have long been employed in fields like IT certification. The basic idea is that if you pass the test, you have the credentials. This obviously wouldn’t work for all fields. (No one wants to be treated by a self-taught brain surgeon.) But it should be possible for degrees in fields like business administration. And since so many universities conduct their courses online nowadays anyway, self-study wouldn’t really be that much different from what you now do at an accredited, four-year university. Oh…except you wouldn’t have to take on tens of thousands of dollars’ worth of debt to study on your own for a standardized test.



But perhaps the most corrupting factor in the university system is the presence of third-party payments. I’m talking, of course, about government-backed student loans.

Third-party payments always increase costs. (This is why medical care is so overpriced in the United States, too.) University administrations realize that they don’t have to provide a product at a cost that the average 18- to 23-year-old can afford. The reason is that they know college students will ultimately be relying on government- (i.e., taxpayer-) backed loans. This means that the universities have access to an almost unlimited pipeline of ill-gotten money. Why should they worry about reducing their costs, or taking advantage of efficiencies brought about by the Internet?

Ironically, the corrupting influence of these third-party payments is exactly what the Democratic Party wants to increase in the system. If college were to become completely free (completely government-funded), university administrations would become even more inefficient and wasteful than they presently are.



A far better approach would be to ask ourselves: How can we incentivize more economically efficient behavior on the part of universities, so that tuition might be affordable for the average college student?

The answer is the exact opposite of what Bernie Sanders has proposed. But no one ever said that Bernie Sanders has a grasp of economics. We should, on the contrary, phase out all government money in the university system. Require universities to make do on only what they take in from their customers: students and their parents. If they can’t provide an affordable product, they’ll lose their customer base. That reality has a way of incentivizing efficient behavior.

This is exactly what automobile manufacturers, and all other private-sector businesses, do. They structure their operations so that they can provide products and services at prices that their customers can afford.

But we have succumbed to the myth that educational instruction is so sacred and complicated that it can’t possibly be subjected to market forces. We believe this, even though what universities do is considerably less complicated than what Toyota does. This is a myth that has been eagerly perpetuated by the higher education aristocracy. They have no desire to be subjected to market forces the way everyone else is.

I know: Many of you will consider this a radical proposal. Admittedly, it might not be doable overnight. But we could take baby steps in this direction.

We might start by changing the focus of the higher education debate. Rather than squabbling about who is going to be taxed to support the inefficient and corrupt practices of university administrations, we all need to ask ourselves: Why is their product so overpriced to begin with?

This question, and our insistence on asking it, will be the first step toward making college affordable for students and their parents… as it should have been, all along.