Saturday, January 31, 2009

Idio

The only sport I keep up with is tennis, and I never miss Jon Wertheim's commentaries, especially the weekly mailbag. I've even sent a few quirky bits over the years, including this idea about Taylor Dent winning the White House and appointing Don Budge to the Supreme Court, this limerick about the Bryan brothers, and a few other limericks for a contest.

Well, his latest challenge was to predict what the Russian player Marat Safin will do when he retires from tennis sometime this year, and I'm mildly tickled that he published my answer and called it the first winner.

This kind of thing seemed a lot more important back when I lived in Nacogdoches, and we had to make our own fun, because, God knows, there wasn't any naturally occurring fun to be found. Now it's retreated into proper perspective, as a dorky and guilty pleasure.

Mega

This term I'm teaching five classes: Public Speaking, Intercultural Communication, Rhetoric, Nonverbal, and Interpersonal. It's a good, meat-and-potatoes lineup of core classes in the Communication field. The first four are daytime classes, with mostly traditional undergrads and the occasional nontraditional student, whom the locals call an OWL, for Older and Wiser Learner. Cute, but I can't seem to get used to it.

That fifth class, Interpersonal, is nothing but OWLs, because it's an entirely online class. It's being offered through the Professional Studies Program, the PSP, which is designed for older adults with full-time jobs and family responsibilities, who want to finish their bachelor's degree. The classes have much longer meetings, are scheduled at night, and are delivered in eight-week terms instead of fifteen-week semesters. For me, teaching at night is a peek through the gates of hell, so I teach online instead. And my experience with this online class is overhauling one of my major age-linked schemata.

What I now think is, there is no such thing as childish behavior: there is only overwhelmed behavior.

For all these years, I've mentally sorted my students into two categories: children and adults. The children are the ones that put everything off, don't follow instructions, don't stay caught up on the reading, and put more effort into complaining and arguing over class policies than they do into completing the work. And just as I had a knack for keeping two-year-olds in line with a mix of affection and firmness, I've had a lot of success teaching students who displayed those behaviors: I catch them being good, I spot opportunities to be playful and to affirm them, and then whenever they're irresponsible, I act decisively and firmly and make the consequences too serious to take in stride. Just lopping off ten points is a stray raindrop: a zero, and the prospect of additional zeros, is more like a bone-rattling thunderclap.

But throughout my career, whenever I've had a student old enough to have held a full-time job past the entry level, or to have married and raised kids past infancy, I've almost never had to resort to that latter repertoire of teaching tools, because it seemed as though they got it. They were adults. They worked ahead and kept up. Sometimes I'd have younger students who showed those same adult habits, and I decided either they had exceptional parents, or were exceptionally mature. But year after year, I made my snap judgments and sized up students either as children or adults.

This online class is challenging that.

Nearly all the students are displaying some of the behaviors I listed above as "childish." But at the same time, from their writing, and from some of their other behaviors, I can see that they aren't children at all: they do grasp, as a core truth, that they have responsibilities, that consequences aren't a game whose object is outplaying the teacher. Many of them pile up evidence of being sober, settled, functional adults, and yet they still do this incredibly scattershot job of following directions, keeping up with deadlines, being thorough with their proofreading, and other competencies I've always thought of as the hallmarks of adulthood.

This morning, the real difference popped into my head. One second before it did, I was mystified. One second later, it was so obvious that I kicked myself for not seeing it before. The dividing line isn't maturity. The dividing line is between being challenged and being overwhelmed.

Most of my daytime OWLs are devoting their daylight hours to getting their degree. Some of them work, but the work is scheduled around college. They tell me about deals they've hammered out with their spouses to reallot household and child care duties. College has its footprint in their schedule, and even though they find it challenging, they at least have a chronemic architecture that they can adjust this way and that to try to improve the situation.

The PSP students, I firmly believe, have an invisible college career underway, which looks to me like a recipe for disaster.

My class is online. They work on it only when they log into the computer. They don't ever have to set foot on campus. The other classes are night classes. It's not clear to onlookers that those classes are any different from a PTA meeting or a night out with friends. The whole enterprise doesn't have a footprint in their schedule. I'm not sure anyone's counseled them that it needs to have one, that they need to carve out X number of hours in the day and say "This is school-time, and I am unavailable." As a matter of fact, I think our admissions counselors tell them that this isn't necessary, that they can handle finishing college without letting anything else slip. And I suspect the counselors frame it as, "It's a challenge, but you can rise to the challenge."

But I think that's an important difference: when you plan for it and give it space in your week, it is a challenge. When you simply stuff it into any available cracks, piecemeal, then it goes from challenging to overwhelming.

The puzzle is, I'm not sure what to do about it. I have limited opportunities to advise these students. They have assigned advisors through the PSP program. I can give them tips for scheduling their studying, their classwork, but the few times I've ventured into that territory, it seems as though they don't take me very seriously. I encounter an attitude of, "That sounds nifty, but you have no idea what it's like to work forty hours, and then shoehorn in your kids' need for attention." True, but they have no idea what it's like to successfully complete a degree. Not only did I complete three, but I've been a mentor to dozens upon dozens of students who've made it happen, many of whom did have jobs and kids.

Well, any time I'm at a loss for what to do, there's one surefire place to start: pray about it. Pray for them. And while I do, occasionally, I need to do so a lot more, and a lot more often.

Friday, January 30, 2009

Deca

It never occurred to me I'd reach a thousand this quickly, but now, looking back, it seems very predictable.

Obviously I don't mean blog posts. Every morning, I walk to work, two miles and some change. The route takes me through Alton Baker Park, past an off-the-leash dog run where I get to smile at dogs of every breed and age frolicking together, and over the Willamette River on an arcing footbridge. The trail has three or four short uphill stretches that make me lean into my stride, and it's quiet and peaceful enough that I get a good deal of thinking done. Once I get fresh air, (relative) peace and quiet, and a little extra blood to my brain, I tend to brim over with ideas, which is a tremendously exciting way to start and end the day.

So I take that walk just about every morning, and take it home again just about every evening, and by "every" I mean every. I come in to my office almost every Saturday, some Sundays, and even when I'm on vacation. Coming in to the office doesn't mean I spend the entire day behind my desk; instead, I frequently go wander downtown Eugene, doing this or that. My office is nicely located to serve as my base of operations for those expeditions. So, now that I've been here a full seventeen months plus about two weeks, I've done the walk to and from my office on just about five hundred days. If I haven't done it a thousand times yet, then I'm no more than a week or two away.

And next year I teach my one hundredth public speaking class. Some people rub their eyes and say that can't be right, but for a stretch of several years, back when I was at SFA, I taught nothing but public speaking: five sections a long term, two in the summer, and two extra as an adjunct at Angelina College. Keep up that pace for a few years -- sixteen sections a year -- and the double-digits just melt away, and it's no wonder that my public speaking odometer will roll its third digit during my fortieth year.
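
(For the spreadsheet-minded, the back-of-the-envelope arithmetic behind both tallies fits in a few lines of Python. Every figure here is a rough estimate lifted from the paragraphs above, not a record from a logbook:)

    # Rough arithmetic for both odometers; every number is an estimate.
    calendar_days = round(17.5 * 30.4)   # seventeen months plus two weeks: ~532 days
    walking_days = 500                   # "just about five hundred" after missed days
    print(walking_days * 2)              # ~1000 walks, counting there and back

    sections_per_year = 16               # the SFA-era pace, adjunct sections included
    print(100 / sections_per_year)       # 6.25 -- years at that pace to reach a hundred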

Forty. Wow. I turn forty this year. The summer of Woodstock, of the first human footstep on the moon, is two full generations behind us.

In a couple of my classes, we've been talking about communicative codes, and how they are a product of culture. This morning I was struck by the disproportionate layers of meaning wrapped around tens and multiples of tens. True, it's how many fingers we've got, unless our chromosomes did some wacky gyrating, and it's how many commandments God gave Moses while He was still clearing His throat, lexically speaking.

Multiples of ten are nice, round numbers, chiefly because they end in zero. And my brain could ricochet off and chase the concept of "zero," but I've read other people's take on it, and honestly, I've got nothing to add. But the niceness of round numbers tickles me, since it's precisely the round shapes that don't fit ordinary geometric operations. Yesterday, in my nonverbal class, I explained the difference between analogic and digital signals: first, I showed the visible light spectrum, the rainbow, stretching from red to violet, followed by a huge box of hundreds of crayons. The unbroken spectrum is analogic, and the crayons are digital. Then I showed them an animation of a circle with an inscribed polygon that gained more and more sides several times a second, and I explained how Archimedes approximated pi by that method. From those beginnings, we talked about the relationship between analogic and digital, about portability and convenience and imprecision and infinite vs. finite adjustment. Multiples of ten are good for ballparking, for stepping over the quagmirish details of a complicated number and just getting the nearest landmark in its neighborhood. How old am I? After July, I'll be in my forties. Nice, round number that lets people leap to conclusions about where I am emotionally, developmentally, philosophically.
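
(For fellow dorks who want to replay the Archimedes demonstration, here's my own toy sketch of the polygon-doubling trick in Python, not the animation from class. It starts from an inscribed hexagon and uses the half-angle formula to double the side count, so no value of pi is smuggled in anywhere:)

    import math

    # Start from a regular hexagon inscribed in a unit circle: six sides,
    # each of length exactly 1, so the first estimate is 6 * 1 / 2 = 3.
    n, s = 6, 1.0
    for _ in range(12):
        print(f"{n:6d} sides: pi is at least {n * s / 2:.10f}")
        # Half-angle recurrence: side of the 2n-gon from the side of the n-gon.
        s = math.sqrt(2.0 - math.sqrt(4.0 - s * s))
        n *= 2

Each doubling squeezes the polygon's perimeter up against the circle's circumference from below: the digital creeping up on the analogic.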

And then I'll turn fifty, and that'll be ten times more ... well, I'm not sure of that. But it'll clearly be ten more.

Thursday, January 29, 2009

Amo

The school paper put out the word for people to send in short Valentine's Day messages for their honeyloves. They offered to publish them under pseudonyms. This was, of course, too good an offer to resist, especially since I'm technically the newspaper's faculty overlord. Anyway, when I should've been working, this is what I whipped up:

From: Humpty Dumpling
To: Lovezilla

Honey bucket, our love is a picture drawn in graceful strokes and vivid colors that started out as a fingerpainting by a toddler with alien-hand syndrome, but was swiped off Grandma’s Frigidaire® by a wizened old stevedore with an extra nostril and a secret love affair with three wigmakers, all sisters, so he could code into those hypnotic, Pollockesque swirls a hidden message detailing where the booty from the bank heist might be unearthed some moonlit night, as the nutria frolic under the whispering elms for joy of fertility and outsized rat exuberance.

And that black and melon polka dotted blob to the left of center, shaped a little like the squirrel parts your dog threw up under the ottoman? That’s a special kiss from me for each of your sweet, sweet eyebrows. Treasure it always.

Aqua/Tact

John Updike died, night before last. I spent part of Wednesday evening listening to a radio broadcast of an interview with Updike from about twelve years ago. He stayed on one theme for a bit: the idea that his writing was a message from his present self to his past self. The impossibility of that set me off thinking about noncongruence.

A point I make in many of my classes is that perfect fidelity in understanding another person's reality is necessarily and forever beyond our reach. We are all locked up in our own skulls. We have absolutely zero way of knowing what the world feels like, seems like, even looks like, to any other human being. We can approach another person's perspective, we can grasp parts of it, we can rough it out, but we can never perfect our understanding of it. It's like calculus: we draw nearer and nearer, but we never reach precision.
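
(If it helps to see the calculus analogy written down, here's a toy limit of my own choosing:

    \lim_{n \to \infty}\left(1 - \frac{1}{n}\right) = 1, \qquad \text{yet}\quad 1 - \frac{1}{n} < 1 \ \text{for every finite } n.

Each step closes most of the remaining gap, and no step ever closes it entirely. That's the shape I have in mind for understanding another person.)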

That got me to thinking about seeing underwater. We see shapes and colors, but the shapes are wavy and the colors filtered. Of course, that's a pretty bad analogy for a lot of reasons: swimming goggles can minimize the effect by pushing the medium away from our eyes just a few centimeters, whereas there's no way to push away the chasm of difference that separates us. And fish do just fine with underwater sight, because their eyes are built for it. For that matter, air also distorts our vision, although not as much. It scatters color as well, or the sky wouldn't be blue. So that connection really wound up more a dartboard than a springboard.

I did get a little taken with the nonreciprocity between nonidentity and age, though, because of the point from Updike that got me started. I am not the same as you, and that difference is unbridgeable. I am not the same as my younger or older self, and in one sense that difference is also unbridgeable, but in another sense, it might not be. It's nonsensical to talk about sending messages to the past; can't be done. But you can send a message to the future: in some ways, what I'm doing as I type this is exactly that. I've already gone back and read some of these older blog entries and been reminded of attitudes, opinions, powerful feelings that I held at the time, and I've watched the curve as they faded. But is that the same? Can I understand perfectly my mental state from before? Or am I making up an edited reconstruction of those moments, tainted by my changed perspective, and then fooling myself into exaggerating its precision? These previous writings, feelings, thoughts, all happened to me: can I recapture them? Heraclitus had an easy answer; I can't step into the same river twice. And an axiom of Communication is that each and every communicative encounter is unrepeatable. But is that true of intrapersonal communication? What about when a person returns to a childhood haunt, and the sights, sounds and smells bring back the feelings and the extremely vivid memories? Is that a reconstruction, or is that congruence with a former self?

That got me to thinking about God's freedom from limitation, and it occurred to me that when we say He's omnipresent, that means not just everywhere, but, as we already knew, everywhen. He's in every age, every second, every event. And what that tells me is that God is free from dimension, which limits us. Einstein had a lot to say about spacetime, about how distance is distance is distance, whether measured in separating space or in separating time. That's also true of identities, I think: from here to there, from me to an other, from now to long ago, or even one second ago.
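
(For the mathematically curious: the formula behind that idea is the spacetime interval. In one common sign convention it reads

    s^2 = \Delta x^2 + \Delta y^2 + \Delta z^2 - c^2\,\Delta t^2

and the point, for my purposes, is that a single quantity measures separation whether the gap sits in the space terms, the time term, or both.)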

So then, as I was walking to work, I was giving myself a dork's morning warmup by running through the labels for sense data from each of the five senses. Most people know the first two: visual, pertaining to data from your eyes, and auditory, describing data from your ears. The other three make good trivia: olfactory, which is data from your sense of smell; gustatory, which comes from your sense of taste; and tactile, from touch. The interesting thing, though, is that while we refer to messages of scent as "olfactic," we change the term slightly and call messages through touch "haptic." Why the change, I wondered? And then I saw how many different words "tactile" supplies with a root: "tact," which you could define as having the right touch in an interpersonal situation that was touchy, and "tactics," referring to taking a strategy out of the planning room and onto the battlefield, where you actually touch (violently) your adversary. (Or where you deploy any strategy. It's obviously not limited to waging warfare. And, full disclosure: the dictionary traces "tactics" to a Greek root for arranging, so that second connection is more poetic than etymological.)

Maybe that's the way to understand the limit: we can't be perfectly congruent, we can't achieve perfect overlap, but we can touch. We can come in contact. The contact can be a full-stretch fingertips-only touch, or it can be an embrace of unconditional acceptance, or a grappling backed by strength in a bid to impose mastery. Maybe the touch metaphor is an entry point to understanding distance, and the floor beneath its challenge: we can never close the gap completely, but we can always reach across it.

This'll take more thought.

Monday, January 26, 2009

Psycho

Anything taken to extremes, no matter how good, turns bad. Anything. Write it down. Memorize it. Live by it. Water is a necessity. Life can't exist without it. But there's such a thing as too much water: it's called "drowning."

Less money for government and more for taxpayers can be good news in some situations. I'll even go a step further and say, many situations. But I've spent the last two days writing angry letters to the editor over the Republican Party's downright irrational obsession with cutting taxes more and more and more.

The Republican leadership is always the first to remind the country that we're at war. But until this decade we'd never, in our entire history, cut taxes during wartime. And we certainly never did it again and again and again and again and again during wartime. That was Bush's brilliant idea, and on the list of his brilliant ideas, it ranks near the bottom. Remember soldiers with no body armor? Vehicles that couldn't withstand an IED? Have you read, heard about, or even known veterans who were denied care? When the coffers are empty, the government can't wage war, and soldiers pay with their lives, and children pay by growing up without parents, and veterans pay in pain and hunger. That cause and effect chain is as simple and inexorable as they come, and the consequences are gruesome.

And have you noticed over the past few years that the supermarket has turned into a death trap? That no sooner are spinach and tomatoes cleared of E. coli risk than the peanut butter is laced with salmonella? Noticed that this didn't seem to happen nearly as often years ago? That's because back then, we actually paid for enough food inspectors to give us a reasonable expectation of clean food! More and more, people from other countries are having to take the precautions, when they visit the United States, that Americans took for years when we visited developing countries. And it's not that we don't have the level of affluence necessary to police our food; it's just that we've got this wildly moronic idea that the more we slash taxes and hollow out government programs, the more virtuous we are. Dying of salmonella is a virtue I can cheerfully pass up.

And been on the highways lately? I wouldn't recommend sipping coffee while driving unless you want it in your lap. And before you cross an overpass, be sure your will is up to date. Can't maintain the infrastructure if we've cut taxes so near the bone that cement and iron are out of reach. Cardboard just won't cut it.

Understand me: I spent my teenage years and early twenties as a die-hard conservative. I know conservative beliefs, conservative reasoning, conservative arguments, and being a bloody fool isn't anywhere in my understanding of what constitutes a conservative. It's been a pretty unmistakable hallmark of this generation's crop of conservative leaders, but there are much better conservative ideas out there. Mindlessly baying "cut the taxes! Cut the taxes!" is not a platform, is not public service, is not anything other than an excuse not to think, not to lead, Grover Norquist be damned.

The thing about the economic stimulus package is that every minute of delay robs it of effectiveness. Another thing about it is that it already contains tax cuts! For the Republican leadership to dig in their heels and demand more, given the state of government resources, given the past eight years of ramshackle, paper-thin services under Republican budgeting, is intolerable. From 2006 to the present, they've been taking loss after loss because people are tired of having the government steered straight into the shoals, and it looks as though they need another loss, and possibly another, and another, and as much as I hate to say it, they may simply be incapable of learning any better.

I don't want that to be true. I want wise, balanced, considered government that bargains integratively and legislates creatively and finds a way to transcend difference and incorporate both sides' good ideas into the outcome. Instead, we seem to be stuck with an opposition with its needle stuck in exactly one groove, and a groove that anyone who isn't blind and brain-damaged can see is a straight road to disaster.

Get us out of this, someone. We need an opposition party that can muster some brainpower. And we need it now!

Saturday, January 24, 2009

Non

In 1994, I first joined what was then called the Speech Communication Association. I loved belonging to a nationwide professional organization that I affectionately dubbed "Ska." But in 1997, they changed the name to National Communication Association, and for a while I grieved the loss of coolness. Then, one day, the light bulb flickered on.

Someday, I want to run for second vice president of NCA. My platform will have one plank, and my vision for the convention will be straightforward. My platform will be, we should stop calling it NCA and instead call it "Nicka," and my convention theme will be, "Nicka, please!"

Proto

George Gerbner was a Communication professor at the University of Pennsylvania, and later at Temple University. He's known in my field for Cultivation Theory, which says that the contained reality inside a television engraves itself on its viewers' beliefs. People who watch a lot of television wind up with very skewed ideas about how dangerous the world is, how often people are victims of violent attack, what percentage of the population is female and/or nonwhite, and other distortions. Gerbner's team backed these claims up with a good deal of evidence, whereas what I'm writing today is pure speculation. Zero evidence backs it up, but it's a study just begging to be done. Won't be by me, but if it ever does get done, I'll hang on every word of the reported results.

TV shows have in common with movies and books the trait that they're made up of characters and plot, and characters and plot are coherent. They have a logic to them. If the dramatis personae on a show do something that's "out of character," then that becomes a complaint, a sore spot, a moment that a typical viewer might not enjoy. The exceptions, of course, are out of character acts that advance the plot, by dropping clues as to what will happen next, so even when the viewer can spot deviance, it's still of a sort that will be recombined into the larger logic of the story, just as any dissonant phrase in popular music is very likely to be resolved into the theme. And when people consume a steady diet, evening after evening, afternoon after afternoon, weekend after weekend, year after year, of coherent characters and structured plots, it really seems to me that they get the wrong idea about how life works. At the very least, they have a track of expectations hidden somewhere in the confused tangle of the mind that becomes a stumbling block to reasonable deliberation.

This is my perennial grumble, but I think I'm on to something here: my students fall into a very weird form of denial when it comes to getting their work done. They dig themselves into deeper and deeper holes, joking all the time about how awful their procrastination is, and then put themselves through unbelievable torment to try to recoup. And one of the most striking things is how utterly dumbfounded they are on those occasions when the catch-up effort fails. When I grade work and it doesn't pass, or when someone submits an assignment after the deadline and I won't accept it, I don't often see anger or anguish or other "ang" words as the first reaction. Many times I see them as a delayed, second reaction, but the first reaction is almost always puzzlement, incomprehension, utter unpreparedness for the state of affairs. It's not that they see what's happened and they're upset; it's as though they never considered this possibility in the first place, as though water were dry and gravity repelled instead of attracting.

And I'm starting to think it has something to do with overexposure to TV plots.

In TV plots, there are complications, and those build enjoyable tension and curiosity, and then there's a resolution. There's always a resolution. It might not happen this week: it might be strung out over an entire season, but good writing includes a tying up of loose ends. Something swoops in and writes an easily understood ending to the story. And if Gerbner's right that what we see about danger, and about the distribution of demographic groups among the population, primes us to expect the same patterns in real life, then it wouldn't surprise me at all if people expect their problems and challenges to follow the same trajectory: to descend, as though attracted by a strange teleological gravity, toward a solution all by themselves, even in the absence of anyone's deliberation or planning.

And, of course, life isn't like that.

The other thing is, people aren't characters. People do not have a logic that holds them together. People have, at all times, the potential to behave "out of character," and the problem is not with the people, but with the phrase "out of character." That's an attempt to rationalize our incomplete and sloppy pigeonholing of people, our forceful insistence that our perceptions are not only accurate but normative. You should behave the way I expect in all matters, large and small. That, of course, is both impossible and silly, since you can't ever fully understand my expectations and I can't ever fully grasp your motivations, but we do follow that cycle of error over and over again. We especially do it in interpreting people's nonverbals, which is something I'm attuned to right now since I'm teaching the class, but we repeat it in just about all areas.

I have one Communication major, one of my very favorites, who drops by and has lengthy talks with me, and much of them consist of variations on one theme: "I'm not like other people. I'm very complex." The second half of that theme is true: she is quite complex. But where she goes astray is with the first half: assuming that other people are not. We have a label for this: the illusion of asymmetric insight. That's the assumption that other people are easily understood, but that none of them truly and completely understands us. She puts it neatly, but she's not the only one who falls prey to it. I'm aware of it as an idea, a phrase that I can invoke to diagnose a particular tangle of thinking, but I stumble over it all the time.

The thing is, life does have an Author and a plot, but our silliness is in trying to compare the plot of life to a plot authored by a human. I know I'm a broken record, but His ways are not our ways and His thoughts are not our thoughts. My Sunday School class is currently in Genesis, and we're coming back every week to the difference between Cain's descendants and Seth's descendants: Cain's descendants were movers and shakers who made names for themselves by their accomplishments, and Seth's descendants called upon the name of the Lord and waited patiently for the seed of the woman who would overturn the serpent's victory. So it's become a slogan for us: are you making a name for yourself, or are you calling on the name of the Lord? It's got to be one or the other, because doing both isn't an option. And that idea pops up again right here. If I know that I'm an infinitely complex character, participating in an infinitely complex plot, and the Architect of its logic has no need to round the plot off and shape it into an easily chewed and digested bite of narrative that I can fully grasp, then I have far less reason to be complacent, far less reason to trust in my own perceptions and my own judgment, and far more reason to stretch myself and at the same time fall back on my dependence on God.

Put in fewer words, if I can't figure out the story of my life, then my only other option is to walk by faith.

Friday, January 23, 2009

Nom

You've got to feel sorry for people, and inhabitants of places, whose perfectly good names suffer from associations they never asked for.

Oświęcim is a decent-sized city, between a quarter and a third the size of Eugene. I know that Poland has a long history of anti-Semitism and mistreatment of Jewish people, but despite that, it just seems as though there must've been a day when people who lived in Oświęcim could claim their hometown without bracing themselves. We, of course, know Oświęcim as Auschwitz. That's not just the name of the death camp: it's the name of the city nearby. It must be a tough thing to write on a stamped envelope.

Then there's Alzheimer. Nice enough name. There aren't any Alzheimers in Eugene, but Charles and Judith Alzheimer live in Klamath Falls. But Alois Alzheimer had to go and describe this new degenerative and terminal brain disease back in 1906, and now the name has passed into the language as a heartbreaking, nightmarish, slow death sentence. And the more the epidemic takes root, the more that's got to be a constant source of cringing. Speaking of which, Dennis and Theresa Dahmer live in Portland. I imagine that's not uncomplicated either.

And the associations aren't always negative: sometimes they just plain don't fit. A few years back, I knew a young man who debated for Samford University in Birmingham, Alabama. He was soft-spoken, white, and within the normal range of charisma and social skill for a debater: not hopeless, but slightly on the geek side. And his name was Michael Jordan. And I'm sure that was a perfectly nice name to have when his parents gave it to him, but by the time I met him, he'd been through years of meeting new people and having to exhale hard and be patient while they got their jokes out of their system.

It's like a lightning strike or an earthquake: the kind of thing that robs you or your home of a good name just seems so wanton. You can beef up your health against illness and lock out burglars with better security, but how do you protect your name from other people's deeds?

Tuesday, January 20, 2009

Post

Dear Future,

Today, Barack Obama will be inaugurated. That'll happen about two and a half hours from when I'm beginning this. I'm setting down my impressions and feelings, and trying to record some of how I felt during the campaign, just because events look so different once they're behind us, and this, of everything that's happened in my life, feels the most like a historic turning point. (No, I don't count September 11: step outside and ask passersby how many can tell you anything at all about the Haymarket massacre.)

From early in the campaign, I was confident Obama was going to win. A colleague of mine grew up in Mississippi, and then spent many years of her adult life in Tennessee, so she'd seen over and over again how violently bigotry can erupt and turn people toward irrationality. Many times she came to me and said, "You still think he'll win?" I quoted back to her from To Kill a Mockingbird: "It's not time to worry yet. I'll let you know when it's time to worry." That story ended with what looked like a defeat, but showed just a twinkle of progress, and sounded a faint prediction of future success. I hope this story doesn't end that way.

It definitely could, though. Expectations are so high. That worries me. The higher they are, the more completely they shatter when they drop. This won't be a storybook presidency. The Obama team has their clumsy days. We've seen several already. He's filling pretty small shoes, so he'll look good just by his distance from the baseline, but given everything he's up against, I doubt it's enough to sustain the insanely inflated hopes of his most zealous fans.

Most of my students would have absolutely zero idea why the election of John F. Kennedy was such a victory over bigotry. If they know anything about him, they know he was young, considered good-looking, had a pretty wife, was shot, and it's cool to speculate about the possibility that his assassins successfully covered up their crime and got away clean. ("They" didn't. Oswald did it, acting alone. Zero doubt.) But it would startle most of them to learn that for much of its history, the Ku Klux Klan has had three chief targets: blacks, Jews, and Catholics. It would upset them to read front page editorials from major newspapers from the teens and twenties of the last century, saying Catholics couldn't be trusted, saying the flow of Catholic immigrants should be choked off in favor of more desirable races. They would reel at some of the arguments deployed against Al Smith in his run for the presidency, and at the delicate negotiation John F. Kennedy had to carry out to become the nation's first Catholic president.

And only. Almost two full generations later, there hasn't been a second.

Great Britain has had its first female prime minister, as have Israel and Germany. First and only. There hasn't been a second. Breakthroughs are not normalcy. Countries can be one-hit wonders just as easily as actors can win an Oscar and then vanish. What we need is a distinguished career, a string of victories. After that happens, I'll be more ready to say we've reached an era of post-racial politics.

[Photo: Barack Obama rally, September 7, 2007]

I took the above picture on September 7, 2007. A while back, the elder George Bush mistook September 7 for Pearl Harbor Day. Now it's looking as though what was a decent-sized rally, in a decent-sized room with a few thousand people, was a Pearl Harbor Day from the Doyle's-eye view of history: a sneak attack and a sudden reversal. It's a slightly shaky analogy, given that Pearl Harbor roused a strength that extinguished the attacker, whereas Obama followed the momentum of his uprising all the way to victory at the ballot box. But strength is still arrayed against him, and its recent setbacks aren't terminal; it hasn't given up or fallen asleep.

The mistake, whether from Obama's backers or his opponents, would be to claim today that race problems in this country are at an end. But as I've told students a zillion times, problems aren't licenses to panic. Problems aren't the green light to hunker down and prepare for battle. Problems are openings for solutions, and solutions can turn out to be opportunities to grow together in trust and loyalty. One of my favorite new colleagues, our math professor, assigns problems every day, and the students don't panic or prepare for battle (ideally): they simply work the problems, identify the solutions, and move on with their newfound knowledge to tackle bigger and better problems.

My colleague from Mississippi and Tennessee is downright feisty about reminding people that this isn't an ending. At least once I've heard her say, "Everyone wants to make Obama out as Moses, like this is our nation's arrival in the Promised Land!" But when Moses appeared and was elevated to leadership, that wasn't the end of the journey. All the hardest parts still lay ahead. That's where I think we are today, and I hope we don't test God's patience nearly as much as the Israelites did.

When the United States gets its second African-American president, or possibly the third or fourth, and there have been a few female presidents, Latino presidents, and no one any longer pays much attention to the candidate's race or gender, then I'll concede that we've left the problem behind. The first time a toddler manages to get it in the potty, that's not the end of potty training: it may be an encouraging step forward, but until there are dry nights and accident-free days, the transition is still underway. I don't expect the end of this will happen in my lifetime, but to be honest, I'm not sure I expected this encouraging step forward to happen in my lifetime.

And as I've been writing, I've tried to settle on an example of a criterion we applied to our earliest presidents, but that we've left behind and no longer apply. For a moment I thought I had one: family! In generations past, if you weren't from one of the powerful families, you had no hope of making it into the top circles of influence. It was the whole "first families of Virginia" phenomenon, and it's downright striking how many of our presidents have been distant cousins. I wanted to say we'd left that behind, but then I remembered exactly who's exiting office today. There went that argument.

But so far, so good: Obama's showing signs that he's going to muddle through with better-than-average effectiveness. He's unashamed to listen, even to his opponents. He's been steering away from excess and toward pragmatism. I think we're going to need a gigantic dose of patience, and I don't think right now we're primed to be patient -- patience and fever-pitch excitement coexist pretty uncomfortably, as any small child demonstrates on Christmas Eve -- but I'm not terribly worried about the future. So to you, in the future, I can send a report of guarded, counter-inflationary equanimity. I don't believe the hype, but neither do I believe the gloom and doom. And I don't believe the lies either, and I'm encouraged that a critical mass of the electorate didn't, or today we'd be inaugurating someone else.

Monday, January 19, 2009

Inter

In the field of communication, we make a big deal out of cognitive complexity. Cognitive complexity is the ability to understand the thoughts and opinions of other people. It's not the same thing as empathy, although the concepts are similar: it's understanding another person's reasoning, not their emotions. (And that's an oversimplification of empathy. But I'm going to plough ruthlessly on.)

I've been reading a book the past couple of weeks that's stretched my cognitive complexity in a delightful way: Why The Jews Rejected Jesus, by David Klinghoffer. He makes a pretty thick, complex argument about the relationship between Christians and Jews, but on the way to his conclusion, he stops repeatedly to develop the support, scriptural and otherwise, for the Jewish position that Jesus was not the Messiah. And I have to say, the guy's pretty good. In isolated cases, he's downright compelling. In others, his blind spots are on full display. He complains about how Christians veer back and forth between precise readings of Old Testament prophecy and loose, metaphoric understandings, but he never seems to notice the same variation in his own references. His, of course, are part of the oral Torah, and thus are all perfectly sound interpretations. Of course. One of the big thrusts of his claim is that Christians misunderstand the Old Testament because they study it only after they've encountered the New Testament, and view it through that prism. He admits that the same in reverse is true for Jews, but since they don't regard the New Testament as meaningful in the first place, it's no great failing to be unable to understand it.

One of the places where he's downright compelling is where he goes back through some of the citations in the Gospels of Old Testament prophecy, especially from Matthew, and points out how sloppy they are. They're the worst kind of prooftexting, the kind that we would never tolerate in a Sunday School or Bible study. They pluck out two or three words, a random detail, and completely ignore the thrust of the passage. And sonofagun if he isn't absolutely correct. When I landed on that realization, it sent some ripples through my world. I haven't had any serious questions about my faith in a very long time, so the fact that he scored a hit put me in a frame of mind that I thought I'd left way, way behind.

My settling down had a lot to do with a realization I arrived at over the summer, after I listened to a recording of Alan Jacobs' biography of C. S. Lewis, The Narnian. Jacobs drew a lot of it from Lewis' own writings, and one passage in particular focused on a span of a few months or years when Lewis made himself a regular guest at the Socratic Club, an Oxford debating society. On those occasions, he'd let the other club members lay out their arguments against God's existence, or against Christianity, and he would then swing into action and demolish them. But Jacobs reports that Lewis stopped this pretty abruptly, and wrote in his journal that every time he was able to prove something about God, he felt his faith weaken.

I've thought for years, and the idea is not original with me, that God doesn't lock the door. There is room inside every scrap of proof for the determined nonbeliever to wiggle free. There is comfort and ease for the person who wants to total up all of human existence and say "Just the product of random chance." It is part of the remarkable genius of God's creation that His signature is all over it, but someone who wants to find no God in any of it can put that world together out of their perceptions, with His permission.

God's position on evidence and proof is difficult to pin down. The Bible is full of good, sound reasoning, but also contains intermittent reminders that reasoning isn't going to get us everywhere we want to go. God's thoughts aren't our thoughts. Both Christ and the apostle Paul made the point that God put much truth beyond the reach of our reasoning abilities, and that things are arranged to permit us to reason our way in completely the wrong direction. In the same way a toddler doesn't have to construct syllogisms to prove that Mom and Dad will still provide food, clothing and love tomorrow, same as today, a child of God doesn't have to prove what they live by. Still, there's a proffer of proof, a teasing of proof, a taste of proof, in the case built in each of the four Gospels. And Christ almost seems to play "get away - closer" with the entire question, sometimes supplying proof, sometimes changing the subject, sometimes teaching that a desire for proof is a symptom of the problem.

I think it's probably a good sign that I continue to make like Jacob and wrestle with the question. I never regard the matter as settled, because settled matters can be ignored, but an ongoing wrestling match is a magnet for attention. Not only that, but it's surely the trajectory of all branches of learning, from science to the humanities to the most obscure branches of trivia, that the most fundamental, bedrock teachings show cracks and imperfections as we learn more and build more on top of them. Those cracks just spotlight the limitations of our intelligence, the flaws in our perceptual apparatus and reasoning skills, not that truth itself has changed or become obsolete. So why should it trouble me that the proofs offered at Christ's arrival show the same slippage? And it's especially telling that these slippages point me back to relationships, to the positioning of the critic against the text. David Klinghoffer is a devotee of the Torah, so his starting assumptions will aim him in a direction from which the New Testament is going to look hostile and threatening. To him, that's a stumbling block: to me, it tells me more about him, and therefore how to love him better, and it also tells me more about the text itself. So the simple matter of interpreting the text by reading its passages against one another isn't the entirety of the enterprise.

These are wandering thoughts, and I don't think there's any real likelihood that they'll cohere and quiet down anytime soon. But it's very enjoyable, especially for an argumentation dork like myself, to probe around in the crags and jags inside of my framework of reasoning and notice that something too simply called out as a flaw is actually a lesson. And I know I'm not done learning those lessons, and that I'm not anywhere near exhausting them. Probably not in this lifetime, actually.

Thursday, January 8, 2009

Sub

This is nothing but self-absorbed rambling. Don't say you weren't warned.

For the past ten or fifteen years, I've been interested in only three luxuries: books, coffee, and sushi. As for everything else, I was perfectly happy living on the cheap. I didn't, and don't, care about where I live or what it looks like: I have a one-bedroom apartment with cheap furniture. Back in Texas, it was all garage sale furniture, and even though what I have now was actually store-bought brand new, it's still pretty sparse and on the lower end of the price range. I don't care about what I wear: I just stocked up on dress shirts for work, buying eight of them: two were Old Navy clearance, and the other six were Wal-Mart clearance. I don't have a big, spiffy TV set for the simple reason that I don't have, need, or want a TV set at all. Same for a cell phone. I drive my car just one day a week, so it's a rusting clunker with about the power you'd expect from a Hot Wheels. And the only music player I have is a tiny iPod shuffle that was a Christmas present from my brother's family, and I go weeks at a time without touching it.

But now I've got a new one. And I can't quite say I no longer care about what I wear. I've discovered a taste for really nice undershirts.

Now, don't get all squirmy. True, an undershirt is technically part of my unmentionables, but it's not like it's in any way private. We're not talking lingerie, here. I'm quite comfortable answering the door in an undershirt. (And pants, of course.) When I was in college, I went through a roughly year-long phase when I wore nothing but plain white T-shirts everywhere I went. Of course, back then I wore the cheapest ones I could find. And from that day until very recently, I bought undershirts the same place I bought almost all my other clothing: Wal-Mart. Cheap, functional, does the job.

But somewhere along the line, I got some Stafford undershirts from J.C. Penney's, and they're a little thicker and a little sturdier. Bit by bit, I noticed that when I put laundry away, I'd pull those out and stack them on top, so they'd come up first in the queue. And then last summer, I somehow picked up a few Croft & Barrow undershirts, and all through the fall semester, they caught my attention. I'd put one on in the morning, and just have the little flicker of thought, "This is nice." And then, after Christmas, when everything was on sale, I did something I've never done before: I actually went into Macy's, which I consider the dark and evil temple of profligacy (yes, I know there are worse places, but I don't go to those neighborhoods) and spent money. I bought a couple of different brands of, and I can't believe I'm writing this, upscale undershirts. And ooh, they're nice. Like butter. Like velvet. Like sushi.

Can't say why I'm suddenly attracted to this new and utterly random luxury. But if this is as far as my mid-life crisis goes -- no motorcycle, no trophy bride, no skydiving, just the odd foundation garment -- then I'm probably in good shape. And in the coming days or weeks, I expect I'll come up with some entirely contrived insight into the evolutionary changes in my worldview that explain this, but for today, I think it's just a new quirk to join all the other old quirks.

Self-absorbed rambling I promised, and self-absorbed rambling I delivered.

Tuesday, January 6, 2009

Vita

99% of what's relevant to any visit to a doctor's office takes place outside the visit. The doctor cannot see to it that the patient gets enough sleep, eats healthy, exercises, manages stress, avoids unhealthy habits and substances, or even takes the medication correctly. All of that is up to the patient. If the patient drops the ball on all the day-to-day smart, healthy, maintenance choices, and then blames the doctor for not coming up with a magic "healthy pill," then that's just the patient's foolishness and refusal to take any responsibility.

At my first class meetings, I plan to talk through all that with my students and see if they agree. Then I'll remind them what my title is, and I'll leave it at that.

Thursday, January 1, 2009

Con

So there's a joke I first heard Gallagher tell. (I'm frankly ashamed to start this with a Gallagher joke, but that is, in fact, the person I first heard tell it.) "We all know the opposite of pro is con, so is the opposite of progress, Congress?" Along those lines, if you ever want to get on debaters' nerves, ask them, for any given debate, "Were you pro or con?" It happens all the time, and it's unaccountably irritating. The sides are called affirmative and negative, not pro and con.

In both cases, there's a widespread misunderstanding in play. "Pro" and "con" are not opposites; they're complementary. The prefix "con" doesn't mean "against," but actually means "with." Dictionaries list it as a variant of "com," which comes from the Latin "cum," meaning with. If you graduate cum laude, you graduate with honors. And in Spanish and Italian, both Latin-derived languages, "con" is the word for "with." Now, sometimes "con" is a shortening of "contra," which does translate as "against," but even that is a corruption. Its original meaning was "in comparison with," which is far broader than "against." You might visualize standing up next to another person so a third party can see which of you is taller. You aren't opposites, and one of you being taller doesn't make the other shorter, but you do stand contra, in comparison, so an observer can make the measurement.

Why am I going on about this? Because of the jagged edge our culture hallucinates in every outbreak of argument. I'm gearing up to teach rhetoric and argument this spring, and I'm going to be talking with the students about how our learned aversion to argument, our avoidance of open controversy, is in a lot of ways an American trait, and furthermore, an American sickness. It's a view that categorizes all argument as hostility, enmity, the demolition of affinity. And it's tragically wrong.

If we remind ourselves that "con" means "with," then the pros and cons that go with any proposal look and feel entirely different. Here are the arguments in favor of the idea, the steps toward it (which is the literal meaning of "pro"), and here are the difficulties that come with it. It's the idea of "with any major decision, you have to take the bad with the good." You can refine, address, minimize, but you must not make the perfect the enemy of the good. And, further, you can't decide at all until you've paid attention to the con. Attempts to do so leave the job unfinished, because the con goes with the pro. They aren't adversaries; they're symbiotes.

We are a culture more vulnerable to groupthink than any I can imagine, because we work so hard to extinguish disagreement. If we can't reach an understanding that "con" isn't opposition, isn't something to be pushed out, but rather that "con" is with, that it's a welcome and important part of decisionmaking that needs to be worked through, then we're due for more exploding space shuttles and more blood-drenched post-invasion quagmires.

The notion that every message has a content dimension and a relational dimension is one of the ABCs of communication scholarship: very basic, very fundamental. But it keeps popping into my thinking lately in ways that fascinate me, although I doubt they're terribly original. But this is another one: argument creates a relationship. I know that one's not original with me: Habermas and Ehninger got here first. But it's both true and important. Where there is conflict, there is engagement. (Cue Bridezilla joke.) After the complete breakdown of a positive relationship, if it is replaced by implacable hostility, the likelihood is high that there will be escalating conflict: perhaps physical, perhaps financial or legal or social or aesthetic or any one of an endless range of possible modes of conflict. But that conjunction is nowhere near inevitable when turned around the other way: an outbreak of argument, of disagreement, is not proof that the relationship has blinked out of existence. But too many people spring into action because of precisely that assumption.

One of the worst things that happened to me in 2008 is that I lost a good friend over a political argument. The friend has very conservative views, and sent me an email one day, asking a few questions about a letter to the editor I'd submitted to the local paper. In the email, the friend talked about "liberals like you" and said "all you liberals think," and variations on that theme. I wrote back saying I wasn't like all liberals, and I wasn't going to enter the conversation if that was one of its necessary premises. If my friend wanted to discuss with me what I thought, I was willing to do so, but I wouldn't be a synecdoche for a strawfigure. The friend wrote back, no exaggeration, within minutes, and informed me that our friendship was over. Boom. We exchanged a few more emails that day, but the friend stuck to the position that the friendship had ended.

I don't accept that, and I keep thinking, foolishly perhaps, that with the passage of time and the cooling of temper, there's hope for reconciliation. But it's that willingness to close the book on a relationship over changed terms in a simple disagreement, and to do so in a matter of seconds, that saddens and troubles me. It's not unique to that person, either: it's the overt manifestation of an attitude toward conflict, argument, controversy, disagreement, that is peculiar to our culture. And if I have a hope for the new academic term, for the new administration, for the new year, it's that we can reach a turning point and change that understanding.

It felt as though I was winding down with the end of that paragraph, but a thought struck me, so I'll tack it on. My mother, who lived through the Great Depression, is always horrified at how quickly people my age throw away leftovers, get rid of worn clothing, aging appliances, etc. I think our threshold for discarding the imperfect keeps moving lower and lower, that smaller and smaller imperfections qualify as sufficient reason to jettison, and maybe that's part of what's at work here. I'm not too confident of that explanation, because the friend I mentioned in the paragraphs above is only a few years younger than my mother, but the parallel between those two attitudes is pretty striking. Maybe learning to accept the imperfect and keep it with us is both a rule for thinking and for owning. And maybe there's not a difference.