Old Geezers Out to Lunch

The Geezers Emeritus through history: The Mathematician™, Dr. Golf™, The Professor™, and Mercurious™

Wednesday, March 27, 2013

Spring in Minneapolis

Dateline Minneapolis. March 27, 2013


Her Holiness, the lawn nun, emerges from winter hibernation. 


Monday, March 25, 2013

For One Day Only...

For one day only, public policy makers seem to be behaving somewhat rationally.

First, on yesterday's "This Week" ABC news program, we heard GOP strategist Karl Rove say that he can imagine the next Republican presidential candidate supporting gay marriage.  Now, Karl Rove (former adviser to George W. Bush) is such an evil little troglodyte that your immediate thought must be to wonder what his secret agenda is, exactly. You can imagine him saying audacious things simply to ignite a firestorm of conservative indignation, for example.  But for the sake of graciousness, let's just imagine that this represents a bit of intelligence making its presence felt within the conservative right wing.

Then, and most encouraging for me, was a local state case in which Administrative Law Judge Barbara Nielsen ruled in favor of the Minnesota Department of Education, upholding the Department's curriculum guidelines for social studies in public schools.

A group called Education Liberty Watch (along with a group of Republican legislators) had filed suit, arguing that the Department of Education had ignored the concept of "American exceptionalism," removed the role of God-given rights, and called the US a 'democracy' rather than a 'republic' at the core of its social studies curriculum.  Judge Nielsen did not find these arguments to have merit.

(By the way, Education Liberty Watch is the Minnesota equivalent of those groups in places like Texas and Mississippi who have successfully forced schools into teaching evolution as merely an unproven theory offering an alternative to the REAL truth of 'intelligent design'. Minnesota, too, has some folks out there on the fringe.)

At the heart of the argument for our Minnesota group is their belief that Americans somehow enjoy special blessings from God Above, causing Him to favor our form of government above all others. Secondly, they find something sacrilegious in the curriculum saying that our rights are bestowed by citizen agreement, not by God. The reason for the group's last dispute—that calling our form of government a democracy rather than a republic is sinister—is less clear. Perhaps it simply had to do with panicky fear of anything that represents change of any kind.

A Minnesota judge, at last, recognized insanity when it showed itself openly, and ruled in favor of logic. The Minnesota Department of Education, she said, had indeed weighed all points of view and opinions, and had selected its curriculum in a way that was logical and reasonable. It was a nice way of saying "go away, nut-cases."

It's a very safe bet to assume our policy makers will continue to make me fume in the near future. So all the more reason to celebrate small blessings when you can.

Thursday, March 21, 2013

Diagnosis du Jour


A news article today had this as a headline:

"1 in 50 Americans Has Autism."

...to which I say: 

No, they don't. 

Autism is a popular disorder today, like some others that are still in full vogue at the moment, and others that have had their prime and are now on the wane. A few years ago, we were convinced that pretty much every kid in America had some form of ADHD. There was a time when "Personality Disorder" was the buzz term, and "Bipolar" now seems to be nominating itself as the next big thing. Depression has developed such staying power that it has acquired an entirely different linguistic usage. You no longer say "I'm depressed;" now it's "I have depression," and we're told that virtually all of us have some level of depression, all the time. Routinely waking a little grumpy before the morning coffee now means you have "rapid cycle depression."  A moodiness that sets in during long periods of dreary weather is "Seasonal Affective Disorder (SAD)." 

Don't get me wrong. I'm fully aware that autism is a very real thing. And I surely don't doubt that it comes in different levels of intensity, and that Asperger's disorder is a convenient and helpful way to distinguish a milder but still debilitating form along the autism "spectrum." I have known a couple of unfortunate families who have had kids fitting the classic, sobering definition of autism: utterly non-verbal, non-social kids who really have no normal future in store for them.  And I have also known some families with kids with clear social development disorders that are distinctly disabling and which deserve to be studied and treated. 

And of course ADHD, personality disorder, bipolar disorder, and clinical depression are also real and problematic conditions worth attention. Giving labels to certain easy-to-recognize syndromes and behavior patterns is convenient and useful as a tool for recognizing and dealing with them. The same is true of many, many such labels. My argument is simply with the overuse of these and other diagnostic terms. As descriptors these phrases are fine, but the trouble is that they’ve become pseudo-medicalized, and symptomatic diagnosis has led to an explosion of chemical treatments, many of which have been studied only briefly.

As a firmly established Geezer with close to six decades on the planet, much of that time involved with teachers and others in education, I have known hundreds of kids, and the number that genuinely fall into what is now called the autism spectrum really can be no more than a large handful, maybe a dozen at most. It simply is not 1 in 50. The same can be said for most of the other diagnoses du jour. They are quite justified in some instances, but just too popular for their own good. 

Yup. We actually had candy cigarettes in 1960.
And we're still alive. I preferred non-filtered Camels. 
It's all indicative, I think, of the insistence we have on making sure we, and everybody we know, are part of some diagnostic class.  For every person who has some real, genuine digestive disease caused by a wheat gluten allergy, for example, there are millions of people who claim it because it makes them feel special or lets them justify the attention of a medical specialist.  When Geezers were kids, we ate nearly 110 pounds per year of real, natural sugar and didn't seem to come apart at the seams.  Today, though, “enlightened” parents are convinced that their fragile kids will explode if a teaspoon of good old-fashioned cane sugar enters their system. They are "fructose sensitive." 

The list of over-diagnosed conditions is almost endless. What is "inflammation" and why is it the modern plague? Why is pretty much everybody today susceptible to a life-threatening allergy?  The other day, I heard somebody who sprouted a small rash after hiking through poison ivy down in North Carolina confide, with a prideful whisper, that he had been diagnosed by an allergy specialist as suffering from "contact dermatitis."  Look it up. This means you itch after touching something. 

A fellow publishing professional involved in popular health books told me that the next big epidemic will be the discovery that more than half of us suffer from some sort of blood fungus. And shingles is a virtual time-bomb waiting to explode in society: if you don't already have a full-blown case, you almost certainly are in the early festering stages of it; you simply don't know it. 

This could have been me at age 6. I was called "feisty," not "sociopathic." 
This trend is especially fierce among parents of younger children these days, who seem quite desperate to place a diagnostic label on their kids. I've sat at tables during social events and heard every parent of a school-age kid proudly give the diagnosis of their child. Among the favorites: a plain old-fashioned rebellious teenager has “Oppositional Defiant Disorder.” Not a good old-fashioned normal kid among them.

(One of my Geezer friends notes that this type of parental obsession and sensitivity has increased in direct proportion to parents’ desire to appear engaged, when in fact they are really NOT as engaged as generations before. It makes us feel like we’re doing our jobs if we have some professional ascribe a medical label to our kids.)

This has, of course, been brewing since back in the day when Mrs. Mercurious and I were raising the Brats. What follows is a paraphrase of a conversation we had with one of my daughter's teachers when She-Brat was about seven years old.  My memory for the precise words is inaccurate, but the gist of the conversation is something like this. 

Teacher:  "Mr. and Mrs. Mercurious, thanks for coming in today for She-Brat's conference. She is.....an unusual child…A handful. "

Mrs. M (sighing):  "Don't we know it."

My kinda kid. 
Teacher:  "I must tell you, her teaching team has discussed her at length, and we see all the classic signs...."

Mrs. M (slightly worried):  "Signs...?"

Teacher (carefully):  "Yes.  Constant curiosity about just about everything....Questioning authority....catching you in every contradiction.....Unending questions…...the insistence to know where your facts come from.....always asking 'what if'…....choosing to read rather than listen to her teachers...."

Me (skeptical) :  "Yes, what are you suggesting?"

Teacher (with authority):  "Well, Mr. and Mrs. Mercurious, we've concluded that all the classic symptoms are there. She-Brat is quite clearly a Spirited Child."

The fact that this was a formal noun, designated with capital letters, was clear from the teacher's tone.

Me (incredulous):  "Say what?"

The teacher proceeded, with a completely straight face, to describe "Spirited Child" as a well studied syndrome, with clear symptoms, a course of recommended behavioral modification and, if we chose, the possibility of chemical treatment aimed at making She-Brat more like "normal" kids. There was a whole body of published work on the subject matter. "Spirited Child" appeared to be an actual diagnosis, and, unbelievably, a condition thought to require correction. 

Me (venomously) :  "Fuck off. We're outa here."

Mrs M (with equal poison):  "Me too. What he said."

My conclusion here is from the Geezer playbook:  While there are certainly kids who have special needs that need to be recognized and addressed, there's also good merit to recognizing that diversity is normal, expected, and desirable; and that we don't need to find a label for each and every human condition. And treat yourself with the same respect owed the kids. It's okay to have an itchy nose and watery eyes in the spring without defining yourself as "elm-pollen-sensitive." And rather than claiming lactose intolerance, you might just say that “milk makes me fart so I don’t drink it.”

Diagnose your kids, and yourself, only when your happiness and quality of life genuinely depend on it.

(By the way, She-Brat grew into a fine young woman who now works in education herself. When she comes across a spirited child, she gives them an extra cookie.)



Tuesday, March 19, 2013

Citizens of 4F, March 19, 2013

The morning passengers on the 4F bus into downtown Minneapolis are an unusually morose group today. As they board, their jaws are tightly clenched, their eyes are downcast, and upon taking their seats they quickly bury themselves in some form of escapism. Minneapolitans are a bookish lot by nature, but this morning it seems that nearly all of them lose themselves in novels, while the remaining handful retreat into the world of iPod music. There is almost no conversation at all this morning, and when one discussion does arise between two young adults, I can see surrounding passengers glance at the talkative pair in thinly disguised disgust. Heavily invested in their melancholy, the passengers are deeply resentful over being distracted from it.

The reason for the low mood may be that, on the eve of calendar spring, none of us are dressed in the light jackets and sweaters, or even the rain gear, that one might expect for this time of year. Instead, the garb consists of heavy woolen overcoats and puffy down parkas, thick insulated mittens and gloves, heavy boots, sturdy scarves, stocking hats and ear muffs. We are so bundled up that the bus feels like a shipping crate packed with foam peanuts.

With spring arriving 24 hours from now, it is 5 degrees F. this morning, with a stiff northern wind that makes it feel, we're told, like minus 14 degrees.

We have had enough. Minnesotans are relatively stoic, even arrogant about thriving through these brutal winters, but we manage this because of the promise of a beautiful if short spring, a sultry summer that lets us store up heat for the winter, and autumns that are truly stunning. When winter begins to extend past its normal 4 1/2- or 5-month allowance, we start to get testy. Weather that would make us shrug in January feels damn near intolerable in late March. Nature has played an especially cruel joke this year, because just a year ago St. Patrick's Day saw 80-degree temperatures and green lawns with gardens already heavy with blooming spring bulbs.

This year, the crocuses and daffodils are still hidden by 14 inches of ice and snow cover.  If it goes on much longer, we are likely to become suicidal alcoholics, like the Finns.

Politically and scientifically, I subscribe to the belief in the dangers of man-made carbon dioxide increases causing havoc to the planet's climate and ecosystems. But springs like this make me utter a simple curse:

"Global warming, my ass."

Sunday, March 17, 2013

Intimations of Mortality

My father is failing and the fact really can't be ignored any longer. It's not like he's just now had a sudden stroke or heart attack to force confrontation with this reality, but the march over the past 8 years or so has been unmistakable. Until my father was 70 or 72, he looked, acted, and felt a full 10 or 15 years below his calendar age. His hair was quite dark in that Walter Matthau way; he golfed at least 9 holes, and more often 18, each and every day, rejecting motorized carts to walk the links. He tended an expansive hobby farm that took a full five hours weekly just to mow the lawns. He traveled the globe on exotic vacations that would have tired a man half his age.

But then age began to catch up in a big way. Over the past 8 years he has had a hip replaced; been diagnosed with myasthenia gravis, a condition that badly affects his balance and facial muscles and reduces his strength to that of a small child; seen a lifelong asthma and allergy problem grow into full-blown COPD that now requires full-time oxygen; and in the last year or two, he's had periods of mental fuzziness that have caused doctors to ponder the likelihood of early-stage Alzheimer's.

Early last week, he was admitted to the hospital with pneumonia, which is considerably more serious with his age and underlying conditions than it would be for you or me. And there has been a sudden increase in the periods of apparent dementia, suggesting that it is not blood oxygen levels, but truly Alzheimer's, that may be at work here. And so I traveled out late last week to the expansive prairie landscape of western Minnesota to learn if he's going to pull through this round of difficulties, and, if so, whether he can remain at home with his wife for a while longer or if a residential care facility (the nice word for "nursing home") is now a reasonable step in the near future.

I was about 14 when my own grandfather, dad's father, died up in a nursing home in the far northwestern corner of Minnesota. Shortly before he passed, I drove up alone with my dad to visit grandpa. It was a sobering experience, since grandpa no longer really recognized us and kept talking about needing to plow the 60-acre field back by the minnow lake. (Grandpa had not been on that family farm for many years.) At my age, it was a startling glimpse of a mortality you don't even contemplate as a young teenager. It was especially hard for my dad to watch his own father, as the spectre of mental decline has been the fate dad has been most afraid of. He said no more than three sentences on the long drive back home that day many years ago, lost in thought. This seems to be a genetic fear, as it is precisely this possible future that most unnerves me.

When I arrived last week to see dad, I found him so weak in the hospital that lifting a cup of water to drink was an exhausting strain. The physical therapist said this was a substantial improvement over the day before, though, and said with steady improvement this week he might still be able to return to his house to live—this time at least. In our first conversation, he rattled on for some time about a helicopter crash in the area, which had, he said, dropped debris on his house, damaging the roof of the garage. Of course, there was no such incident, and gradually I recognized that he had merged the sound of a helicopter from Sioux Falls airlifting another patient from the hospital with some story about a plane crash that was being broadcast on CNN. Over the next few days, the mental confusion lifted a little and dad was a little more like himself. But the physical weakness remains, and I found myself at one point running his electric shaver over his face for him—this, for the guy who did not take a sick day in 30 years of teaching school.

These are of course normal life transitions, and each of us will have them with different variations. Your own death is one, of course, but witnessing the decline of loved ones is another big one. When my mother died 17 years ago there came a point during the trauma of it all when it became crystal clear to me that death was precisely the condition under which life itself is made possible. After that I was able to deal with it all fairly philosophically, even spiritually. But even with an intellectual acceptance, the sense of my father now winding down leaves me with a sense of time running out for all of us. There is a little bit of existential terror in all this for me over now approaching the top of the genetic ladder with no further upward step, but it's relieved when I look at the pictures dad keeps on his study shelf: most of them are of his grandkids, my own son and daughter.

My dad hasn't been a perfect man, but he has been a very good one. He's a fairly passive man by nature, and on occasion this has frozen him into failure of action. Sometimes he smiles when a bit of snarling is what's necessary and apropos.  But he's been a devoted family man for his whole life, and a fellow whose reputation in every community he's lived in is as a guy unfailingly willing to help others. I remember many times when he did chores for the neighboring farmer when he was injured or sick; a handicapped elderly neighbor could, and did, always call my dad when in difficulty. Dad provided fresh vegetables for our whole neighborhood out the expansive garden he obsessively tended. Many of his former high-school students who now have their own grandchildren still recall with affection his dedication to teaching them algebra and calculus 40 years ago.  In his new community where he moved 16 years ago,  he's developed the same reputation for helping others willingly and automatically. Of the many local people I talked to over the last few days, every one of them commented on his good heart. Until a few years ago, he was tutoring local farm-kids in their math lessons.

At the end of the film "Saving Private Ryan," the principal character, shown many years later visiting the cemetery above the beaches of Normandy, turns to his wife after recalling the life depicted in the story we have just witnessed and asks, "Tell me, have I been a good man?"

Should my dad ever ask me this, I will say "Yes dad, you surely have been a good man."

—Mercurious—


Friday, March 15, 2013

Refreshing the Soul


—this commentary offered by the Professor—

For Christians, the season of Lent is proceeding, and we have just observed what in the Episcopal tradition is referred to as “Refreshment Sunday”:  on the fourth Sunday of Lent, one is given dispensation to relax one’s Lenten discipline a bit.  In other words, it’s essentially permission to change things up a bit, to note the passage of time, and to convince oneself that Lent will soon be over.  It occurred to me that “refreshment” is what many of us desperately need on a regular basis—especially through a workplace environment that has become increasingly electronic, impersonal and time-conscious.   Alas, refreshment seems to be what we’re NOT getting.

Far more common than we like to admit. 
An article in the morning paper recently confirmed something that I had long suspected from talking with various people about the nature of our workplace environment.  It described a situation in which “most” employees working in an office environment either “regularly” or “frequently” eat lunch at their desks.  It went on to describe a subcomponent of workers (the study estimated 10%) who “occasionally” eat all three “meals” of the day at their desks. 

As the saying goes: this is wrong in so many ways.  It’s not refreshing, and it isn’t healthy.

The issues of work hours in the United States and the general level of pressure in office environments might be a good topic for another discussion, but what strikes me about the above information is that it gives further evidence that something is wrong with the way we conceive of and execute feeding ourselves (in the good old days of this geezer, we occasionally called it dining).  There is something amiss in our eating habits, and all you have to do is take a walk with a mental measuring tape and observe the prevailing girth of those around us (hell, us as well!) to verify this; but to begin to unravel some of these interrelated challenges, we need to step back in time a bit.

There was a time, deep in our agricultural past (and to some extent in our manual labour/manufacturing past), in which three full meals a day were essential to replenish the sheer number of calories we burned off through our daily tasks.  Back close to my permanent home in Pennsylvania, the Amish still live such a lifestyle (or should we say “work such a ‘workstyle’?”).  They are on what is sometimes called the “Amish Diet”: take enough steps a day (in traditional communities they average about 18,000 steps per day) and you can pack away as many meals as you’d like.  In typical American culture, however, we average less than 8,000 steps per day, and office workers as a subgroup average significantly less than that.  The “Amish Diet” of three full meals a day is—technically speaking—a gastro-intestinal mismatch with how most of us live and work.

Still, it’s hard to imagine a day with, say, only one meal in it, isn’t it?  While it would with little doubt provide us with enough calories to replace those we burn staring at a computer screen and answering e-mails, it would seem to be a component of a pretty grim existence.  Put more directly: we might not need additional calories, but we can’t be expected to grind away for 8, 9 or 10 hours with no breaks in the day, with no contour or shape to our work day.  For that, ideally, is what a meal should provide in our day of limited physical activity: a break (most importantly mental); a time to shift one’s attention; a time to interact with colleagues as people rather than co-workers; a time to take a breath, to think about the many reasons you do what you do in life, and then get back to work.  It’s the same reason we have a Sabbath in the week or holidays in the year or take time out to note the changing of the seasons: we want and need our lives to have some shape.

Hemlock for the body AND the soul. 
And this is precisely why eating at our desks is so bad.  In a similar way, it’s why that peculiar American phenomenon of the “drive-through window” is so reprehensible.  In both of these activities, we get EXACTLY the opposite of what we genuinely need.  We get plenty of calories (especially in the drive-through, but also in the vending machine products that often form the basis of our desk-centered “lunch”), but we get no real tonal, spatial, interpersonal or psychological break. 

The result: in an hour or two, we find ourselves heading for the coffee room again, to perhaps score some leftover birthday cake or grab a quick packet of crisps.  We don’t do this because we are technically hungry: someone who hits the drive-through while on the road gets nearly enough calories in just a burger and soda to power them for an entire day.  We are hungry in a more poetic sense: hungry for the mental break we didn’t give ourselves earlier; hungry for a shaping ritual that our day unconsciously needs.  Yea, I say unto you: the road to hell (and to a wider circumference) is strewn not with sinners, but with fast food drive-throughs!
A better approach to refreshment, don't you think? Lovely. 
I got to thinking about all of this after observing how a cup of tea here in England forms the basis for just what we need in this arena.  The Brits love their tea and tend to feel quite strongly about its restorative (or they might also claim curative) powers.   There’s little you can claim is wrong with you for which  the English won’t first suggest: “you just need a good cup of tea.”   Mind you, they don’t mean that you should dash out to the corner convenience store,  get a Styrofoam cup of tea and gulp it down as you drive down the road or return to your office.  The how and when of tea being prepared, served and consumed is a relatively well defined social ritual which, in turn, provides the social and psychological benefits we crave when we in the workplace decide it is time for “lunch.”  

The English will consume coffee in a purely functional way, grabbing a quick cup to help get the day started, but tea is much more—it is a way by which one’s progression through the day is marked.  When it is tea time, work stops; often people will gather together (since “elevenses” and four o’clock tea time is fairly standard); the tea will be prepared in the proper way (or as close to that ideal as practicable, but always carefully and consciously); and the tea is consumed sitting.  One other ritualized element is essential: after the first sip of tea, invariably, the description “lovely” is uttered serially by all present, with others nodding and hmmmming in affirmation.  This need not take long; sometimes this ritualized break in the day takes less than ten minutes.  But the important thing has been achieved: a true break in the day has occurred; a mental reset has been facilitated, and a moment of appreciation for the simple and familiar things which make up most of our modest lives has occurred.  We have been refreshed. The fact that tea is a virtually no-cal beverage is an additional bonus (let’s not count the milk and biscuits.)

Surely we Americans are innovative enough to develop an effective approach to refreshment; we can’t expect to eat our way into that feeling without some accompanying thought.  For myself, I have vowed to avoid: eating while doing anything else; eating while walking or standing; and eating without taking a moment to pray or offer thanks in another manner.  The key is in ritualizing our behaviour sufficiently for it to become an activity that is both habitual and significant.  A way to refresh ourselves that would be meaningful, social, relaxing, predictable, cheap and low calorie?  Wouldn’t that be “lovely”?


Tuesday, March 12, 2013

Democracy Revisited


—political wisdom from our Professor—

Democracy is born in Egypt
Democracy is a good thing, right?  This is what we’ve always been taught in our history and civics classes through school.  It is what we tell cultures around the world as they seek alternatives to despotism or single-party rule.  It is invoked in political speeches right along with motherhood, apple pie and baseball.  It must be a good thing, right?  The only trouble is, we find ourselves following the news around the world, and what we see sometimes gives us cause for pause: the “Arab Spring” brought elections to Egypt—and an increasingly radicalized Muslim Brotherhood government.  Civil unrest has followed as the Egyptian military weighs the benefit of intervening, and the rioting can be ignited by something as innocuous as a football match.  The “freedom fighters” of Libya were victorious in their battle with the backers of Col. Khadaffi, but our ambassador is now dead and Libya’s resources lie in disarray.  We introduced democracy into Afghanistan, but the Karzai government seems hopelessly corrupt, not to mention its feckless behaviour as a supposed “ally” in fighting the Taliban.  Democracy seems to have its challenges around the world.

This is democracy?  Really?
Fear not, the reader must be thinking: just turn your eyes to the inventors of democracy, the Greeks….ah, we’d better not look there.  All right then, the French….well, after four tries at democracy I guess they managed to do something.  Well, there are the British…but they still have the queen (God save her) and a hopelessly fractured coalition government…

Yes, but what about the world’s most prominent and insistent advocate of democracy, the USA?  How is this democracy thing working out for us?  I would think few observers would dispute that we are caught up in a time in which it is easy to question whether our form of national government is fully up to the task.   The “fiscal cliff” melodrama merely illustrates a dysfunctional Congress that has consistently failed to deal effectively with a number of issues of grave national importance—most particularly the budget deficit—but also issues of energy, nuclear waste disposal, immigration, and growing economic stratification.  Who among us is willing to point to our current Congress and say to the people of the world: “See? You should have one of these!”

What gives?  Has democracy’s “five minutes of fame” (in historical perspective) run out?  Are its instability and deliberate inefficiency incapable of dealing with the rapidly changing, ideologically-charged times in which we live?  Given the examples presented above, one is nudged toward doubt.

In responding to the question of democracy, though, one has to subdivide the problems, for the challenges democracy presents in the Middle East, for instance, are quite different from those presented to us in the US.  There is a very intriguing theory that for democracy to work, one must have a significant and stable middle class.  In other words, for “majority rule” to work, the majority of the population needs to feel they personally have more to lose through social instability than to gain.  If one has a significant amount to lose, your “vote” is practically and pragmatically driven toward competence, stability and honesty in government (or so goes the theory).  If you have essentially little or nothing to lose (in a material sense), your vote reflects those things that hard-pressed/hopeless people of all ages have tended to gravitate toward: religious fervour or opportunistic corruption (or, many times, both in the same package).   Israel is the only country in the Middle East with a prominent middle class; it is also the only one with a functional democracy (you might argue it is a potentially DANGEROUS democracy, but it is a functional one, for better or for worse).

But the US is nothing if not middle class, isn’t it?  I mean, even people making over 200,000 dollars insist earnestly (and perhaps accurately) that they are the epitome of middle-class values.  So why is our democracy in such a seemingly sick phase (let’s hope it’s a phase and not a terminal illness)?

The problem is that we have forgotten something very important:  America was indeed founded as a democracy, but it was founded very deliberately as a REPRESENTATIVE democracy and not a DIRECT democracy.  The fact that most of us have forgotten this is made manifestly clear every four years when the “wacky” or “inexplicable” nature of the Electoral College is discussed.  There is nothing mysterious, strange or outmoded about the Electoral College; it only seems that way to those who have forgotten our roots as a representative democracy (which means: most of us). 

The idea embodied by the Electoral College is that we elect individuals whose judgement we trust to then take action on behalf of our nation in the manner they consider most positive.  They represent our general interests, but—at least initially—had no obligation to reflect our specific desires or directives.  Technically, we still vote for electors, who in turn vote for a president and vice-president, although a rigid custom has evolved in which electors declare their presidential allegiances ahead of time and invariably cast their votes for that candidate regardless of the circumstances.  The idea of representative democracy is not trivial or insignificant, for America—at its founding and now—is composed of large numbers of sincere, hardworking individuals, many of whom are too busy or otherwise not inclined to inform themselves regarding the complexities of running the most powerful nation on earth.  As long as they/we acknowledge this and focus on electing intelligent people of excellent character to REPRESENT us, this works out fine.  But when we start to think that watching sensationalized and heavily edited cable television qualifies us to understand the intricacies (and appropriate costs) of government, and we expect our elected representatives to DIRECTLY reflect our specific, individual and semi-informed perspectives, we get into hot water.

If we all have to vote on every issue, is there
any time left to drink fine Scotch?
If you look at all of our fifty states and determine which of them is least functional, California would probably be near the top of most lists.  And it’s no coincidence that California was the first state to overtly apply direct democracy through its ballot initiative process.  Perhaps the most famous of these ballot initiatives was “Proposition 13,” proposed by one Howard Jarvis, which set a permanent, constitutionally mandated limit on property taxes.  This fashion of ad-hoc direct democracy seems fine for a while (especially when opening your revised tax bill), but there is no way that such initiatives can respond to changing circumstances (just one example: the Americans with Disabilities Act imposed huge mandated spending on states) or to fluctuating economic conditions.  Ross Perot, when a candidate for president, was quite fond of declaring things to be “just that simple,” which sounded fine—except that it wasn’t and isn’t usually true.

This is how you must spend your free time
if you don't let representative leaders do their jobs
California is a mess, and the entire US is following close behind.  Wisconsin recently conducted a shrill and wasteful “recall” election of its governor—justified on the basis of being “responsive” to the people.  Our electronically connected world has given all of us a megaphone; we use it, and we expect our representatives to hear us.  Fine so far.  But when our “representatives” start to engage in silly stunts like website “voting” on specific bills, recall elections to even political scores, and simplistic one-issue “pledges” cooked up by Grover Norquist, we are in trouble.  Our representatives in Congress (if they still deserve that description) increasingly behave as if we were founded as a direct democracy, voting in whichever fashion they perceive the “American Public” to lean.  The health of the nation is an afterthought; the health of their political careers is the forethought. 

But we can’t blame them, really.  The unfortunate fact is that party activists are increasingly strident and ideological in their impulses, and increasingly selective about the information upon which they determine their positions.  And they vote accordingly in party primaries, expecting their representatives to directly reflect their ideological stance at all times.  The only way to hinder the harmful effects of these zealots, in a democracy, is to outnumber them.  We simply must find a way to broaden our major parties’ active base in such a way that a more moderate and informed perspective prevails: a perspective that acknowledges that in a representative democracy, we must look for candidates of broad and flexible mind, capable of representing our general interests through rapidly changing circumstances. 
Right now, playing to the zealots who dominate nominating primaries, candidates demonstrate just the opposite of the traits needed by representative democracy: they issue very specific solutions (that will never happen), make very specific promises, and sign ridiculous and rigid “pledges.” They essentially strip themselves of any discretionary authority before they get elected.  They seem either ignorant of how representative democracy works, or too afraid to argue for the prerogatives necessary to be true representatives. 

What would democracy look like if we reawakened the representative roots of our democracy, if we found a way to overtly empower our elected representatives to exercise discretion and judgement in their voting, whether in Congress or in things like the Electoral College?  I have a theory: an Electoral College composed of generally respected, broadly informed citizens, elected by their individual states on the basis of their broad judgement, might never have elected George W. Bush.  And who knows, maybe they would have elected John McCain president, knowing that his chief disqualification (the presence of Sarah Palin on the ticket) could have been easily avoided by an Electoral College willing to ignore modern protocol and split the ticket to elect Joe Biden vice-president. (Maybe we could have had George H.W. Bush and Lloyd Bentsen?)  It’s called exercising informed discretion…couldn’t we use a bit more of that in our government?

Let’s bring back representative democracy…democracy seems to be in need of it.