Two, three weeks ago I went for coffee with a student and three of my colleagues. The student had some nagging questions, and the colleagues each approached those doubts with very coherent, erudite explanations and recommended readings. I was out of my depth, so I did a lot of nodding. But on the walk back, I made one contribution to the rap. Last week, the student mentioned in an email that what I said was helpful, so I decided to stick it here in case it helps anyone else. It's a good description, in just a handful of moves, of how I understand faith.
Do your parents love you? Have they included you, protected you, encouraged you, lavished affection on you, because they love you? If the answer isn't yes, then it's time for a nimble detour on this flow chart to address some serious, chronic trauma. But in most cases, the answer is yes.
At my most cynical, I could make the case that your parents had done all those things from purely selfish motives. They did them to fit in, to win acceptance from peers, to appear normal, to satisfy their own parents or spouse, and the list of plausible motives could stretch as long as necessary to make the point. Give me enough time to weave my argument and I could have you believing that they never loved you for one moment, from your birth to now.
Neither explanation would finally dismiss the other. I could raise a lot of doubt in your mind, but I couldn't construct a proof that necessarily excluded the possibility that your parents loved you. Similarly, every heartwarming story you could tell, all put together, could not stop me from my competing narrative that they behaved as they did to please themselves, to use you as a prop in their self-presentation.
You're left in a very Heisenbergian suspension. You either live in a world in which you have loving parents, or in a world in which your parents are cold, calculating, and very convincing liars. You have no real way of settling which world is real.
Pivotal question: in which of the worlds is your life better? Then live in that one. Live as though that's the correct explanation. If, in doing so, you find out that your life is, in fact, better, then you chose wisely. You might not know which explanation is really the correct one, but you were never offered the option of perfect, unquestionable certainty in the first place, so you lose nothing.
Transference of this analogy from your family to your faith is left as an exercise for the student.
Elvis took amphetamines to fuel the runaway train of his life. Then, he'd take barbiturates to fall asleep, followed by amphetamines to burn out the barbiturates, followed by barbiturates to dampen the amphetamines. Lather, rinse, repeat. For a while, he was able to work that cycle and look good, but sooner or later the strain etched visible changes on him, and not for the better. And in the end, he died young and left an ugly corpse.
In Texas, business goes buzzing along, enjoying this huge infusion of wealth from the various petrochemical industries. Absurdly high gas prices mean eye-popping earnings, which then sweep through all the contiguous businesses like a flood finding its level in rivers, streams and creeks. And Texans congratulate themselves about how much better their choices have been than those of their compatriots, how recession-proof the state is. But my memory goes back to 1985, when an oil glut dropped gas prices to astonishing lows, and suddenly no enterprise in Texas, from the government on down, had the resources to accomplish anything. And it's perfectly plain that the future of energy in this state, nation, world, is not just more of the same. Very wealthy, well-connected people can put all their power behind calls for more domestic exploration and drilling, but there comes a point where the amphetamines no longer do their job, and my strong suspicion is that the day is closer than most Texans, and all Texan leaders, want to admit to themselves.
The state of Oregon sucked a fiscal teat longer than made any sense, and has been trying to work through withdrawal and come out clean. The progress is inching and agonizing, but postponing it would only have made the withdrawal more severe. Texas is riding a binge, and storing up a lot of pain for itself when everything topples.
Part of me thinks this is just a symptom, and the deeply rooted illness comes from a desperate craving for what's uncomplicated. If I were to try to distill Texan-ness down to one idea, that would be it. Some of the iceberg-tips that grow out of that nature are pretty appealing: a bracing assertiveness and a child-like faith. But it's also very Texan to play ostrich, to ignore bad news and hope it goes away, to shoot the messengers and double down on a dumb idea. People who know their Texas history should recognize those tendencies in a million and one turning points that have gone wrong.
I do have a love for the state where I was born, but it's an exasperated love, the love we give a backwards child who sets off one disaster after another, who marches proudly and stubbornly into an endless parade of preventable messes. It's a love almost untouched by admiration or emulation, a lot of combined smiles and eye rolls. And a lot of worrying, shrugging, and fatalism.
The mistake we make again and again is declaring an end to history. We think, My entire life has built to this moment, to my current understanding, and there is no future. I will never again be surprised, pleasantly or not. I will never again learn.
One of my lazy pleasures is re-reading books I first tackled when I was a child. Naturally, I now catch things that I didn't notice, or wasn't ready to grasp, until now. But it's almost brain-wrenching to think that I might come back in my sixties or seventies and spot bits that whooshed right over my head back when I was a clueless forty-two-year-old. Nevertheless, I'm sure I will, and doubtless more so with the Bible than with any other book.
This spring, I've found myself in more discussions with people who think that our consensus, authorized, "safe" account of what the Bible teaches cannot, must not, will never change in the slightest, than I ever would've seen coming. In the course of those discussions I've pointed out that for centuries, both the curse of Ham, and Ezra's command that the Israelites divorce their foreign wives, were cited as proof that God had instituted white supremacy. That was taught in seminaries and preached from respectable pulpits by giants of the faith. Then, when the time came, when our slowly accumulating understanding of the world flowed into the proper shape, God moved, and a tipping point was reached. Today we understand that the Bible never taught white supremacy or nonwhite inferiority. Its text didn't change in the slightest, but our understanding of it improved.
Another example, and one I hadn't considered until I read about it earlier this week: for endless stretches of time, it was unquestioned truth inside Christian teaching that the Jews as a people were rejected by God because of their collective guilt from Christ's unjust execution. People clung to this teaching despite Christ's pretty unambiguous words on the subject, and some still do to this day. For the most part, though, we've understood our error and moved on from it.
This is on my mind today because two recent graduates, of whom I'm inordinately fond, have recently been struggling a lot with their faith. One is struggling publicly, and the other quite privately. What I want so desperately to convey to them is that history hasn't ended, learning hasn't run its course, and it's not time to close the book on their faith. The Church has had to return to an unchanged Book and accept that we had outgrown our flawed understandings, just as surely as I revisit books I loved in my childhood years and measure my own growth against ink on a page that was the same before, during and after my encounter with it. In my teens and twenties, I had a long fallow period when I never cracked my Bible, and had a lot of cynical things to say about its teachings and reliability. After that ran its course, and my understanding had germinated and gestated to the precise degree of readiness, my faith got its second wind, and doctrines I had found naïve and childish reasserted themselves as powerful and moving truths. They hadn't changed, but I had continued to grow.
I'm fairly confident that these kinds of discoveries lie ahead for these students, so I'm not too worried, so long as they don't develop an ego-attachment to their incomplete understandings. I'm pretty sure they're inquisitive and curious enough that that's a small risk. It's something that figures prominently in my prayers.
So back in summer of 2010, it was jump-rope rhymes. This morning, my philosophy professor colleague and former pastor was working on the eulogy for someone he barely knows, and it struck me funny that you could write quick little two-line, grossly inappropriate eulogies. Here are my first attempts, and you are invited to share your own in the comments:
Don't get too big a whiff
Your loved one's now a stiff!
Don't hope she sits up
Your loved one's gone tits-up.
Her to-do list? Time to chuck it;
She's hauled off and kicked the bucket!
The coroner's report confirms
Your family member's feeding worms.
Don't get tear-stains on your shirt
She won't notice through the dirt.
Want your dearest? Nope! Can't have her
How to put this? Look -- cadaver!
Don't protest my word belies her;
What's she care? She's fertilizer!
If you'll stop crying one smidge sooner, I'll
Speed things up and end this funeral.
Two things I've heard people say they admire about politics are, to be blunt, idiotic. The first is direct language, and the second is ideological consistency.
Officeholders who want to make a positive, constructive difference are going to speak diplomatically. They have to thread the needle between too many different, opposing groups. Politician-speak is not incredibly pleasant, but neither is it avoidable. Someone who seeks office and promises to tell the simple, direct truth in all cases is like someone who promises to run a nursery in which there will never, ever be even the slightest odor of poop. The promise is, on a moment's reflection, dumb, and what they're promising to eliminate is a necessary and manageable part of the enterprise.
The second idiotic expectation is ideological consistency. Any office-seeker who promises to be a "consistent conservative voice," or a "consistent progressive voice," is like a mathematician who promises that the solution to every math problem will be an even number. Math solutions are sometimes odd numbers, sometimes irrational numbers, sometimes zero, and the proper next move for government sometimes appears progressive, conservative, libertarian, or any one of a dozen other political flavors.
There's a study waiting to happen about the turn against cognitive complexity in American political culture in 2012. Already I've seen reams of political commentary lamenting the persuasive force of the claim that refusing to compromise, or even listen, is somehow a form of strength. In fact, that's a glaring, crippling weakness, and even more tragic when it's self-inflicted. And the puzzle, for which we desperately need a solution, is why so many people embrace it; what particular fear or narrative or lingering trauma so twists their decision-making that they're receptive to it.
So I dreamed that NCU actually did change its mascot, but not to the Cute Puppies. Instead, we became the NCU Wet Willies. The reasoning was that since it rains all the time in Eugene, we're soaking wet, and we're trying to do God's will, so we're the Wet Willies.
There was no suit with a big foam head. Instead, all the athletes in all the sports started giving Wet Willies: the cross country kids gave them to other runners, the basketball and soccer players to the other team, etc. Now that I'm awake, I'm not quite sure how that would work in volleyball, and in softball it would take split-second timing, but whatever. I'm also not quite sure why referees weren't calling fouls or tossing our players out, but give me a break: it was a dream.
So then, all the other teams started wearing earmuffs every time they played us, and as a result they couldn't hear one another or their coaches. Plus, for some reason I remember that the cross country runners got really sweaty ears, and that was bad for running. Anyway, all our teams won the NAIA championship in all their sports, and everybody was really excited.
And then I woke up, and instead of snowing, it was raining. Perfect.
So last night I started reading The Crescent Through the Eyes of the Cross, by Nabeel Jabbour, and as of now I'm about halfway through it; that's how hard it is to put down. One thing he said wasn't new to me, but I'd never thought about it in this context, and another thing stopped me dead in my tracks for just a few minutes.
- I'm familiar with the fact that reasoning in syllogisms (B because A, C because B, D because C, so if you believe A, you must believe D) is a very western thinking pattern but not universal, so one common failing in our efforts to preach the gospel is that we package it in a way that doesn't make sense to hearers from other cultures. Far more effective throughout most of the Middle East is narrative reasoning that makes its point indirectly, but unmistakably. Strongest proof of the premise: Jesus didn't reason in syllogisms, but taught in parables. Paul, on the other hand, was all about the syllogism, and his education had a huge root in classical Greco-Roman thinking.
- The thing that hadn't occurred to me is a question Jabbour says many Muslims ask of Christians: "Why do the Christian nations favor Israel over the Muslim world when Islam is so much closer theologically to Christianity than Judaism? Jews deny that Jesus was the Messiah, and the Talmud even says Jesus is in Hell. Islam accepts that Issa was born of a virgin, did many miracles, and is in Heaven with Allah. Why can't Christians recognize their brothers?"
Put those two together, and here's my answer to both:
There was a family made up of father, mother, and several children, and the mother's father lived in their household. He had not aged gracefully, and was known for his sharp tongue. He denounced the father's work, the mother's decisions, and the children's lessons and games with loud, hurtful language. When guests came to visit, they marveled at the hostility the grandfather showed, and praised the family for taking care of him, even while he made his presence so very unpleasant.
One day, a visitor came from a neighboring town to complete a brief business errand. His parents had been childhood friends of the father and mother, but tragically, were no longer alive. Upon his arrival, everyone was struck by his resemblance to the children. He could easily have been mistaken for their brother! He spent the evening with them, telling stories and enjoying games, and everyone agreed that his nature fit with theirs perfectly. At the end of the evening, he said, "Why don't I simply become part of your family and live here? You can tell everyone that I'm another of your children who was away at school but returned home to live in the house because of my great love for all of you."
The mother smiled. "It is a blessing to us that you've visited, and we treasure your friendship, but we have no room here. Our family fills all the rooms in the house."
"But if you were to move your father out, I could take his room," the young man said. "He undercuts everything you do and say, while I am much closer to you in appearance, belief and attitude. I would fit here perfectly. Why not accept me in his place?"
Very, very gently, hoping not to hurt his feelings, the father replied, "Even if my father-in-law's words do not please us, my family would not exist without him. The relationship we have with him is genuine. It is living proof of our family's history. The relationship you propose is based on deceit. You are very near to us, but to tell the world we shared a blood tie would be a lie. We hope always to enjoy your friendship, but friendship with you is not a sufficient reason for us to deny what is."
It's a clumsy first attempt, but at least it doesn't fall into the error of framing the reasoning in a way that will only breed confusion, not understanding.
And that leads me to wonder whether anyone has paid serious attention to contrastive apologetics. Contrastive rhetoric is a fairly young field, having begun in the 1960s with the work of Robert Kaplan, and that makes me curious whether anyone has tackled the work of reframing the reasoning that clarifies difficult questions in Christian thought into argument patterns that work in different cultures. That might be a research project for this summer.
Walking is superior to bicycling.
- My feet will never go flat before, during or after my walk.
- Jesus never rode a bicycle on water.
- "There is nothing like walking to get the feel of a country. A fine landscape is like a piece of music; it must be taken at the right tempo. Even a bicycle goes too fast." -- Paul Scott Mowrer
- My feet don't need a lock, rack or cage, and Eugene is not the foot theft capital of the nation.
- There is no need for, and therefore no such thing as, a walking helmet.
- I only have to beware of distracted and/or psychotic drivers about 5% of the time, when I'm crossing a street. And even then I mostly have stoplights and crosswalks on my side.
- The LORD has not required of us that we do justice, love mercy, and go for a humble bike ride with Him.
- There are no catchy eighties songs with accompanying cheesy dances about biking like an Egyptian.
- I'm fine with walking a mile in someone else's shoes, but I'll pass on riding a mile in someone else's bike shorts.
- Falls being inevitable, would you rather skin your knee or rack yourself on a solid metal bar?
- Cool, thick, velvety green grass is meant to be felt between toes, not gouged out by tires.
- Biking across the stage for your diploma, or down the aisle to your groom, will get you talked about. Doubly so if you pop a wheelie.
- "Walking takes longer than any other known form of locomotion except crawling. Thus it stretches time and prolongs life. Life is already too short to waste on speed." -- Edward Abbey
- Expensive bicycles are a status symbol, but a foot is a foot is a foot.
- God didn't pluck Enoch off his bicycle straight into Heaven.
- "There is this to be said for walking: it's the one mode of human locomotion by which a man proceeds on his own two feet, upright, erect, as a man should be, not squatting on his rear haunches like a frog." -- Edward Abbey
- If God had gone biking through Eden in the cool of the day, He would've roared up on Adam and Eve before they could hide in the trees, and pastors everywhere would be denied a prime sermon illustration.
- Making a bicycle consumes finite resources and energy, generates pollutants, and is repetitive drudgery; making feet is part of makin' babies, which is all-natural and fun.
- "Restore human legs as a means of travel. Pedestrians rely on food for fuel and need no special parking facilities." -- Lewis Mumford
Just like last summer, I've spent the past three months checking a lot of movies out of the Eugene Public Library. Here, without further elaboration, is how much I enjoyed each movie I watched from start to finish between May 1 and today.
★★★★
None.
★★★☆
A History of Violence
Eagle vs Shark
Eternal Sunshine of the Spotless Mind
Fireproof
Happy Feet
Little Miss Sunshine
Primer
The Pursuit of Happyness
There Will Be Blood
Waking Life
World's Greatest Dad
★★☆☆
Amazing Grace
Capote
Dark City
Something the Lord Made
The Accused
The Chronicles of Narnia: Prince Caspian
The Cider House Rules
The Curious Case of Benjamin Button
The History Boys
The Hours
The Ring
Patton
Walk the Line
★☆☆☆
Bulworth
The Chronicles of Narnia: The Voyage of the Dawn Treader
Know1ng
Martian Child
☆☆☆☆
None.
So this morning I heard, on NPR, a man from Tucson argue that it was a good idea to force college campuses to allow people to carry handguns. According to him, only law-abiding citizens obey the current rule against it, and he needs to pack his own protection against outlaws. He spoke approvingly of mutually assured destruction, saying it had done a fine job of keeping the world safe from nuclear annihilation for almost seventy years. And that got me to thinking, y'know what? We should also abolish traffic laws.
Seriously: we should paint over all the stripes, take down all the signs, eliminate all the speed limits, and, most of all, repeal the DUI laws. Because, y'know, only the law abiding respect them anyway. It's a war zone on them roads, what with drunk and crazy drivers thirsty for the blood of decent people. The only thing they understand is force! I should be free to run them off the road, knock them from their cars, run over them, reverse, run over them again, back and forth and back and forth until they're roadkill.
Now, I'm a little too tenderhearted for such work, so I might need a pint or two of courage, and that's where repealing the DUI laws comes in. If I'm just as much of a loose cannon behind the wheel, just as much of an unpredictable source of instant death as anybody else, then everybody will know to keep their distance from me, and I'm a lot more likely to get where I'm going without interference from other drivers. Oh, I suppose there's danger I might get in a one-car accident, but where's the fun in bothering to think about that when I'd rather get all worked up over the bogeymen of other cars, all driven by evildoers who have to be kept in check?
I mean, it's clearly my right to drive my car on sidewalks, through hospitals, up the escalator at the outlet mall, isn't it? The right to do anything you want in your car is part of what makes America great! Don't tell me anyone's un-American enough to think that there's a right way and a wrong way to drive a car. We don't cotton to that kind of traitor talk around here. Matter of fact, I think that's one of them Muslin Sorry laws, isn't it? Not here, thank you so much. We fought them over there to prevent them coming over here and actually stopping at all the stop signs just so they can slip in a quick prayer toward Mecca. Them big flowing robes just cover up the fact that they actually wear their seat belts. I'll tell you, Jesus would've weaved in and out of traffic and run over kindergarteners in a crosswalk if He had sinners to smite and demons to cast out. Would've carried a handgun, too; Judas could kiss a barrel of cold hard steel for his trouble.
Bring 'em on!
Students who take more than one class from me get accustomed to hearing fresh riffs on a running analogy. Here, I'm going to set down the extended dance mix as a pre-writing exercise before I submit it to the National Communication Association convention in the GIFTS (Great Ideas For Teaching Speech) division.
Communication is like the weather.
- The weather is a brain-mangling array of inputs, all mixed together in a system so complex and chaotic that we can't master it. Weather forecasting is not an exact science, and people are (for the most part) comfortable with that. But it's also not meaningless speculation, on a par with horoscopes: there are some observable signs that are powerful predictors of certain kinds of weather. Furthermore, weather follows cycles, with certain weather events being more likely at certain times of day or year. Communication is similarly impossible to map precisely, but is subject to forecasts of varying reliability, and those probable events also tend to wax and wane cyclically.
- If communication is like the weather, then culture is like the climate. The climate yields the raw materials for weather, along with a landscape that channels or obstructs the development of weather systems, but the weather also renews the climate: a wet climate will generate rainy weather, and the rainy weather re-moistens the wet climate. Furthermore, if I move a few feet in any direction, it's unlikely the climate will change much, but as I travel dozens, hundreds, thousands of miles, I'm likely to see large variations in climate. However, that curve isn't smooth: at particular spots far removed from my point of origin, I might find that original climate substantially reproduced. Similarly, culture supplies the raw materials and the parameters for communication, but communication renews or changes the culture. If I move a few feet, I'm not terribly likely to find that the culture has changed (although I might stumble into a different co-culture, much like stepping from sunlight into the shade), but a longer journey increases the likelihood I'll find cultural difference. Still, there are places very far apart that are pockets of substantially the same culture.
- Technologically mediated communication (the internet, cell phones) is air conditioning. We create a pocket of weather carved out from the surrounding weather for our comfort. Similarly, we use technologically mediated communication for very self-serving self-presentation, and to overcome physical barriers (distance, an expectation of non-contact) that would otherwise interfere with our communication choices.
- Verbal communication is air, and nonverbal communication is water. These are the newest riffs on this analogy -- in fact, I just thought them up this morning. Deprived of either one, we don't live long, but either can harm us if they're polluted. Air is influential (barometric pressure, wind), but water provides many of the most important clues about imminent events -- think clouds -- and is the easiest to feel and the only one that can be seen. Still, even water that can only be observed indirectly can impact comfort and structural integrity: humidity can make us sweat and can ruin documents and artifacts. Finally, water manifests in many distinct states: vapor, liquid, snow, ice, sleet, dew. Correspondingly, we can't be mentally healthy for long if deprived of communication, but toxic communication can injure us. A lot of us think of words as the substance of communication, but nonverbals provide many of the clues that predict the development and outcome of a communicative encounter. Nonverbals tend to engage more of the senses; only blind people ordinarily employ touch in reading, and it's not possible to smell or taste a word. Chronemic messages are only indirectly observable, but make a big difference in human comfort and relational stability. And, yes, nonverbals come in many forms, from voice qualities to touch to posture to the rest of a very long list.
I've got about six weeks to get my submissions written for this year's National Communication Association convention. Two of my papers will be quick and dirty, but one is a sustained scholarly effort. What's below is my attempt to sketch what I think the final product will look like, to give myself some guidance. If you have a thought, do feel free to share it.
Premise number one: Christians exist for the purpose of drawing near to God. We can only do so, we can only bridge the alienation brought about by our sin, because Christ took the punishment and reconciled us to God. Once we accept this, we are in right relationship with God, God's children, and from there we walk daily with Him, growing nearer to Him as the Holy Spirit works to conform us to the image of His son.
The important bit: the Christian life is relational.
Premise number two: this relational essence makes the relationship a higher priority than message content in the things we say to, about, and in service of, God. Paul Watzlawick wrote in Pragmatics of Human Communication that every message has a content dimension and a relational dimension. If a wife asks her husband to lift something heavy for her, and he, watching TV, says "I'll come do it at the next commercial," he may think she's just made a simple request and he's agreed to do it within a reasonable time, which is what the content conveys, but she may fume that he treats her as less important than the television, which is a relational message. Transferring that concept to this discussion, much of what we do, including Bible study, including worship, including prayer, including fellowship, including serving people in need, involves producing and consuming utterances, each of which has a content and relational dimension, but if premise number one is correct, then the relational dimension is always dominant over the content dimension.
The important bit: what we say is never as important as the way our sayings position us relative to God.
Premise number three: our relationship with God is primarily instantiated in a single dialectical tension, not the several that turn up in relationships between humans. Leslie Baxter's work argues that people experience the desire to be together and apart, to be open with one another and maintain privacy, to work up a repertoire of traditions and be spontaneous, and that the life of a relationship is the endless collaborative balancing of those tensions. But all three are meaningless in the relationship between human and God: we're never apart from God, we have no privacy from Him, and we cannot surprise Him. Instead, I tentatively assert that our dialectical tension in relating to God is wisdom vs. innocence. God calls on us to trust Him with a childlike faith, but also allows us to argue with Him, even occasionally letting us win the argument.
The important bit: our relational positioning with God drives us to find the right mix of trust and critical acuity.
Premise number four: Christian argumentation has to date been dominated by an apologetic tilt, which has much in common with multi-vitamins. Taking One-A-Day® can be a good idea if someone's diet actually lacks an important nutrient, but anyone who eats a balanced diet doesn't need such supplements. It's been said that Americans, who lead all other nations in consumption of vitamin pills, simply have the world's most expensive urine. Worse, in some cases high doses of vitamins can be toxic. The fit of this analogy comes from the largely unacknowledged dangers of apologetic argumentation; where someone's faith is crumbling because they can't get over a reasoned objection to Christianity, then apologetic work is a vitamin, correcting a deficiency. But where people pursue such arguments for their own sake, they risk damaging their faith. C. S. Lewis, widely regarded as the contemporary champion of apologetics, repeatedly warned people not to attempt to build up their faith by winning debates, insisting that his own apologetic work had weakened his faith, and the only correction was to experience God's presence directly. Again, the relationship was far more important than the content.
The important bit: Trying to win arguments that prove God's existence or other Christian teachings can address specific obstacles to faith, but is equally likely to weaken it if deployed unnecessarily.
Premise number five: The proper role for Christian argumentation can be understood along the lines of work done by Doug Ehninger in the late nineteen sixties: argument as mutual correction, as a way of granting personhood to another, making oneself vulnerable to another and thereby building a bond. God shows us by joining in argument with us that He is not distant, detached, uninvolved, and as we argue with Him, we are forced to accept correction where we are wrong. Similarly, the arguments we have between ourselves should be opportunities to build fellowship, to grant one another the dignity of making our reasons explicit and being open to persuasion by the other, to surrendering our positions when they are successfully refuted. In all these instances, the relationship is far more important than the content. Rabbinic scholars fell into the trap of adding layer upon layer of content over the Torah, drowning it in commentary and judgments, at the price of a dynamic and engaged relationship with God and one another, and if we pull back from unnecessary apologetic argument and instead use argument as exploration of difference and a procedure for building trust, then we arrive at a more robust and sturdy bond.
The important bit: Argument as procedure has the potential to strengthen relationships, and the Christian life is relational in its essence. ■
I know I'm using argument in incommensurable ways, between us and God and between person and person, but that's one of the things I'll get sorted out. This is just a start, and I've got six weeks to develop it.
I don't know if anything like this happened when I was ten years old. My memory doesn't reach back that far. But when I was twenty, I hit a fork in the road, followed by a comparable fork at thirty and forty, and now I honestly wonder what's going on.
At twenty, I reached the culmination of seven years of non-stop obsession, which built to a climax that didn't leave much more to do. I'd had my first competitive debate at thirteen, and knew immediately that I'd found what I wanted to be good at. The problem was, I'm really not cut out to be a competitive debater. I can think like a debater, and I'm reasonably good with words and on my feet, but I don't have the cut-throat instinct. My competitive streak is about the size of an eyelash. Still, I poured time and effort into debate, and slowly, slowly grew into my potential, which was never much to begin with. In April of 1989, I was in the room as my teammates won the national championship for intercollegiate debate, making fairly heavy use of arguments I'd researched. We celebrated madly that night. This was it, what I'd always wanted, dreamed about, and I had it.
About ninety days later, give or take, I turned twenty. About ninety days after that, give or take, I quit debate for the first time. I came back for a full season, quit again, came back for one tournament, and retired permanently.
That didn't mean I was soured on debate, though: I'd just made a decision to become a college debate coach. I loved the activity; I just figured all the struggling I'd done, the snail's pace of my improvement, the dozens of places I got stuck, would make me a fantastic teacher of debate. And, honestly, I was better as a coach than competitor. Working with some incredibly gifted colleagues, I was part of a coaching staff that took the University of Georgia program from an underperforming team with loads of potential to a performance, in my last year, that they've still never matched, and that no public school in the history of intercollegiate debate has ever exceeded: second and third place at the NDT (National Debate Tournament) in a single year. Kansas matched it back in 1976, and Emory would later surpass it in 2000 with first and third in a single year, but it's still an achievement I'm proud of my part in. That was my launch into intercollegiate debate coaching: I went on the job market that year, was a fly-in finalist for four different jobs, and was snapped up by Arizona State.
After two years at Arizona State, I very suddenly reached saturation, rapid-onset burnout, and decided I had to walk away from debate altogether. I left Phoenix for a job in Nacogdoches, Texas; it was one hundred percent teaching. It's not as though I wanted to be a teacher, but that was the only work experience I had outside debate coaching that could potentially pay my bills.
During the summer between my last year at Arizona State and my first at SFA, I turned thirty. At twenty, I peaked in direct involvement with debate, and almost immediately lost my love for it. As thirty approached, I peaked in my indirect involvement with debate, and it happened again.
So then, at SFA, I began to learn to teach, which was even more painful and difficult than learning to debate had been. Praise God, the job had me teach a single class, public speaking, over and over and over again; at one point, I had seven sections, which meant I'd teach each lesson seven times, usually in the same week. I can't imagine a more perfect setup for learning to teach, and it paid off. By my third or fourth year, students had begun telling me that I was their favorite teacher, and it slowly dawned on me that teaching was actually a very enjoyable way to spend my days. In my eighth year of full-time teaching, I started on a three-year winning streak, and if you're good with math, you can see the pattern cropping up again: in 2007, SFA awarded me the Teaching Excellence Award. Within weeks, I'd accepted a job at Northwest Christian College, and at the end of my first year there, the graduating seniors voted me Professor of the Year for 2008. The following year, I won the 2009 President's Award for Teaching Excellence and Campus Leadership. And about sixty days later, give or take, I turned forty.
So does this mean my love for teaching is about to take a fall? I have seen a few signs of that. The fall term has been tough in each of the past two years. The little spells of mild depression that I fight off from time to time are coming a little more quickly, and are going from mild to moderate. My snap diagnosis is that the dislocation from Texas to Oregon, far away from family and everything familiar, is catching up with me. That might mean I'm going to wither on the vine here, or it might just mean that I have another adjustment to make, and have to be patient and give it time. It's the kind of thing I can't judge while I'm in the middle of it; when I emerge from it, I should have more of a read on what's going on.
And I am very attuned to the potential irrationality of thinking this way. I might just be seeing animal shapes in the clouds. There's nothing magical about periods of ten years, and what I'm describing as though it were a reliable pattern could be nothing but coincidence. It is entirely plausible that my love for teaching could deepen and settle on a reasonably smooth curve, accounting for the occasional dip, for the rest of my days. And there's a very real danger that if I pay too much attention to this alleged "pattern" of round numbers, then framing effects might take over and I might bring it about when it wouldn't have happened otherwise. I might sabotage a career that I love dearly, give it up to corrosion and self-doubt, when it didn't have to be that way. So I'm on guard against that. But the pattern is striking enough that it would be foolish to ignore it entirely.
And I always hope I'll get to the end of these things and either the act of writing will have given me clarity, or that I'll at least have a good zinger to reward anyone who's had the patience, or the lack of anything better to do, to slog through this. Neither seems at hand in this case. So, allakazaam, blog post is ended.
When at home, I fart
Unapologetic'ly.
Pt. Pt. Pt. Pt. Pt.
Time for a new chapter.
Today I got my course evaluations from the term that just ended, and they made plain to me how urgent it is that I charge into battle for the substance and authenticity of what happens in my classroom. I've been a young teacher, trying to figure things out. I’ve been a comfortable teacher with good, developed instincts. I’ve been a popular teacher on a Christian campus with small classes, enjoying positive, light-hearted, friendly relations with my students. None of those teachers are gone; they’re all sedimentary strata in my foundation. But now it’s time for something different, and I plan to pursue it with all the stubborn militancy I can muster.
I do not believe in memorization and will no longer encourage or reward it.
I do not believe in note-taking for note-taking’s sake, and will no longer encourage or reward it.
I do not believe in playing school, and will no longer encourage or reward it.
The overwhelming majority of students here at NCU, and on other campuses across the country, are stubbornly wedged into a set of habits and assumptions that are channeling their time, energy, and potential straight down the drain. I have coexisted with those habits and assumptions for too long. No longer.
To begin with, I, my colleagues, and my students have to fully grasp that learning is worship.
My students are very committed to the idea that worship has to be authentic, that it cannot consist of going through the motions, but somehow they don’t take that idea with them into the classroom. There’s a widely shared separation of NCU life into the sacred and the profane. The sacred is the ministry work, like staging chapel celebrations, doing community service, or leading small group Bible studies. The profane includes things like jury duty, visits to the doctor, and getting an education. Activities in the second category can be ministry opportunities, as it certainly would be possible to witness to someone in the jury pool, but they aren’t anything anyone would seek out for spiritual development: they’re to be tolerated, not wholeheartedly tackled and experienced.
I’m not convinced that in every case our students choose, consciously, to put getting an education into that second category, but the choice is unmistakable based on their behavior. They pour all their ability and energy into ministry work, but laugh to one another about how often they write papers the night before they’re due, or pull all-night cram sessions just before a test. And, naturally, they rarely take a glance at the graded papers, and take it for granted that material learned for a test is to be forgotten the second the test is over. The notion that the papers might be documents of their intellectual development that need periodic revisits, or that they might retain and make use of the material covered on a test, is entirely foreign.
It’s crazy. They don’t study the Bible that way, but every academic subject gets that arm’s length, dismissive treatment. And what’s crazy about it is that this isn’t a monastery or a convent; it’s a university, and they made the deliberate choice to enroll. The primary purpose of this institution is to offer programs of study that culminate in academic degrees. They came here for the purpose of earning such a degree. Now, I do understand that to a certain extent, people at this stage in life struggle with self-discipline; it’s too tempting to go straight to the enjoyable activities, the socializing, the work that yields instant reward. That’s true on Christian and secular campuses alike. But I’ve also seen impressive, substantive, polished work whenever they make the connection between their efforts and direct service to God. It’s not easy to play a musical instrument, but I’ve heard performances that gave me chills. It’s not effortless to plan a worship event, but I’ve seen worship events that went off like clockwork, with truly thoughtful, thought-provoking elements incorporated seamlessly. The problem is that they don’t see the connection between schoolwork and serving God.
That’s a shame. Christ’s followers certainly did.
Yes, He healed. Yes, He worked miracles. But what He did most of all was teach. He didn’t have a lot of use for people who followed Him around only to see the signs and wonders. “Take my yoke upon you and learn from me,” He said. And He didn’t just teach them how to interpret scripture, how to pray, how to do things that felt sacred: He taught them what to do with their money, how to handle conflict, how to manage contracts. Paul, His apostle to the Gentiles, would go on to castigate believers who stopped working at their jobs so they could idly await His return, saying “The one who is unwilling to work shall not eat.”
Oprah Winfrey, explaining why she builds schools in Africa and not in the United States, said “If you ask the kids what they want or need, they will say an iPod or some sneakers. In South Africa, they don't ask for money or toys. They ask for uniforms so they can go to school.” We’ve had plenty of visitors to campus who talk about the level of need they’ve seen outside the United States, and our students overflow with compassion for children who are hungry, who are victims of abuse. But I wonder if a single one of them appreciates how appalled those same children would be to see them squander their opportunity to learn? When they work to feed the hungry, I know it moves them to think about how blessed they are to have enough food; for abused children, to think about how blessed they were to grow up safe, protected, among loving family members. But they work to exhaustion in order to provide for children who are hungry for education, for learning, for a chance to take possession of their own lives, and they never see the slap in the face they give those children by making a mockery of their own access to exactly what the children crave.
I genuinely don’t get the reasoning that leads students to enroll at a Christian university, identify as fellow Christians who are giving up their entire lives to service, but then do a marginal, half-hearted job on the meat of that affiliation, the completion of coursework to earn a degree. Why not cut out the middleman and go straight to work at a church? The answer is, because most healthy churches won’t hire them unless they have a college degree, and, in many cases, seminary training to boot. What can we infer from that? Could it be that their elders, their role models, see value in the discipline of undertaking complete preparation, a wall-to-wall education, before embarking on a life of service?
And if schoolwork is profane, then why a Christian college? Daily toothbrushing is a good idea, but I doubt many of our students go out of their way to insist on a Christian toothbrush. Dental hygiene is one of the necessary, unavoidable tasks that are preparatory to active participation in the Kingdom of Heaven for another day, but I can’t think of a Christian way to brush one’s teeth that is distinct from an atheist’s approach. If schoolwork goes in the same category as toothbrushing, then why a Christian college? It seems beyond obvious to me that exploring the order in God’s creation is, itself, a form of worship, and pushing back ignorance and choosing to learn critical thinking skills is an offering to God. So why do so many students bring such a meager, poor, depleted offering?
I’m not just talking about sloppiness, by the way. Plenty of type-A, very hard-working students approach schoolwork in a spirit that is very self-centered and entitled. Just this semester, I’ve had several of my more successful students insist that I should design my classes around memorization and taking notes off PowerPoint, two activities that have only the most remote relationship to learning, and a much closer relationship to going through the motions. Several highly capable students dropped my Introduction to Mass Communication class after they tried to memorize everything covered on the first test, but met with disaster. One in particular told me that her learning style involved memorization, and that if I didn’t re-design the class to reward memorization, then I was a bad teacher. I replied that it was far more important to me that students understand the course content, and that things memorized for tests tended to be forgotten almost immediately. I’m sure I’m correct about that, but I made zero headway in getting any agreement from her. In other classes, students complained that I’m no longer using PowerPoint, because they don’t know how to take notes without it. There’s ample research supporting the notion that PowerPoint deadens understanding and atrophies listening skills, and I explained as much every time a student asked me to go back to it. But they’ve got their comforting routines of writing down the bullet points, and when I disrupted those routines with the radical notion that they should pay attention and engage the material, they turned sullen and put the blame on me for their struggles.
Finally, I think this culture stays wedged in place because of my own behavior, and the behavior of my colleagues. I’ve said for years that I don’t want my students to like me right now; instead, I want them to look back in twenty years and like what I did, and what effect it had on them. If they like me too much right now, then I’m not challenging them enough. A colleague of mine asked me last spring, “Do you really believe that a class has to be hard for students to be learning?” I bobbled the question at the time, but it’s stayed on my mind ever since. The answer is yes, in a certain sense, it does. Christ’s followers were disciples because they’d taken on discipline, and we today separate our curriculum into academic disciplines because they should have rigor and challenge, and completing them should require more from students than they arrive able to do. A native speaker of Spanish who’s a published author, poet and playwright in Spanish should not enroll at an American university to major in Spanish. That person has mastered the language, so completing the program is a waste of time and effort.
And I’m afraid that we’re all creeping closer and closer to expecting nothing from our students. We do PowerPoint slides because they’re easy to develop into routines. We give cursory attention to written work, because it demands less effort from us than digging in and grading it line-by-line. We take our cues from student performance, easing back on the level of difficulty in tests and assignments if the grades go down. In some cases, if students become enough of a hassle, we cut corners and overlook whatever we need to in order to make them go away. None of that is tolerable. All of it sells the students, and the service of teaching, short. And I am as guilty as anyone of practicing it. And I have decided not to anymore.
It is no more acceptable to play school than it is to play church. We offer teaching and learning as a form of worship. And scripture makes it clear that God doesn’t want offerings brought reluctantly, or from mixed motives; if we offer something to God, it needs to be in joyous gratitude for what we’ve been given, and if it’s not the best we have to give, then the joy and the gratitude are awfully hard to take seriously. I have no reasonable expectation that I can bring this off perfectly, but I am determined to double down on an insistence on learning, and a challenging of play-school routines and behaviors. And I think I might get my wish: fewer and fewer students are going to like me right away, but if I do it right, more of them may like what they see a generation from now.
I know all the fake arguments in favor of colleges having football teams (building work ethic, learning to function in a team environment, teaching leadership) and the real argument (money), but today I choked on the gear-stripping irrationality of college football in the face of recent discoveries about the price the game exacts from its players. Mounting evidence suggests that Owen Thomas of U Penn committed suicide in large part because he suffered from chronic traumatic encephalopathy, a kind of brain damage previously thought to result from too many concussions. But Thomas hadn't had many concussions; instead, he had the little brain traumas that are a part of the ordinary play of the game, and aren't addressed by any medical intervention. Now, he's obviously an outlier; most football players don't suffer brain damage that leads them to suicide. But from what his case reveals, it seems equally obvious that most, if not all, football players suffer brain damage, and a lot more severe damage than we admit to ourselves.
Am I the only one who thinks this is crazy? Is there another human on earth who remembers why colleges exist? I come to work every day, roll up my sleeves and put eight hours of sweat into training students to use their brains in constructive ways. Why on earth is it tolerable for a college to sponsor a brain damage factory? I doubt I could talk an eating disorders clinic into sponsoring the Coney Island 4th of July hot dog eating contest, but the overwhelming majority of institutions of higher education in this country use, as one of their chief marketing tools, a frontal assault on the bricks and mortar of their students' cognitive faculties. Absolutely insane.
So it struck me again this morning what a hard time the Second Amendment folks have keeping their story straight when it comes to their articles of faith. Elsewhere I've written about the fact that "Banning guns won't stop gun crime, but will just drive gun sales underground" applies with equal force to outlawing abortion, but there's no shortage of politicians and private citizens who think a gun ban would be an absurd failure while clinging just as desperately to the dream that criminalizing abortion would stop the procedure like flipping off a light switch.
Here's another: gun advocates say it's not the guns that kill people, but the choices made by the owners of the guns. Okay, stipulated. But then, the same people are often the quickest to bray for "tort reform," which would effectively cripple the ability of any private citizen to file a lawsuit, and the only support they offer for their position is a string of decontextualized anecdotes about "frivolous" suits. Do they not get the disconnect? Is it really that hard to see that even if a handful of people pursue absurd litigation, that says absolutely nothing at all about the importance of access to the courts as a leveling tool between the wealthy and the powerless? Are they equally in favor of tearing down fire stations because from time to time someone calls in a false alarm?
It's bad enough that their argument is dumb; it's maddening that they recognize how dumb an argument it is in another context, then double down on that dumb argument when it apparently fits a different issue.
People aggravate me.
So I've noticed, just this fall, that the mighty wave of girls named Madison from the past generation or so has started to break across our campus. And this morning I was struck by the singularity of naming female children for US presidents' last names: Madison, Kennedy, Reagan. I then decided, as a public service, to single out the presidential last names that would make the most unfortunate girls' names, just in case:
- Polk
- Fillmore
- Cleveland
- Hoover
- Truman
- Johnson
- Bush
Where's Sarah Palin when you need her? Is there some sort of Palin-911 I can dial?
I've got this neat set of really sharp steak knives, but I hear I'm not allowed to perform surgery on anyone because some elitist liberals decided that you have to actually know things like medicine and human anatomy before you can get a license. That's obviously wrong and evil, because it isn't what I want to hear; if I could only find Sarah, I know she'd give me that warm smile and reassure me that I should be able to do anything that licensed doctors get to do. She'd tell me that studying and learning and mental discipline just turn you into a liberal, and ignorance and insistence are the way to true happiness and goodness. Jesus certainly never would've wanted me to know anything, would He? I need a big ol' fix of Sarah, stat.
I also want to pilot a 747 this afternoon. I don't need flying lessons or anything, do I? Tell Sarah to block out a double appointment for me; this could take a while.
This is another silly word game that goes here just so I can look back years from now and shake my head over the wastes of time that I turned to for entertainment.
Troy Dean is our newly-arrived campus pastor, and in one of our first conversations, he told me about a sewing circle at his old campus in California that they called the Stitch-n-Bitch. That was certainly cute in its own right, but it got me to thinking of other crafts that would pair neatly with speech acts, and I came up with ...
- Weave-n-Grieve
- Tattoos-n-Bad-News
- Needlepoint-n-Anoint
- Flower-Pressing-n-Second-Guessing
- Claymation-n-Character-Assassination
- Lithography-n-What's-Wrong-With-Me?
- Macramé-n-Auto-Da-Fé
- Origami-n-I-Want-My-Mommy
Others came up with "Paint-n-Complaint," "Stained-Glass-n-Talk-Out-Your- ..." which you can probably finish.
I have a Palm Z72, a stone-age precursor of the iPad, that goes everywhere with me. I bought it three or four years ago, and I use it as my personal Bible. It has Olive Tree software on it and five Bible translations: the NIV, the Holman, the New King James, the Spanish NIV, and David Stern's Complete Jewish Bible. The three biggest advantages it has are, first, it fits in my pocket, which means I've always got it on me, never true of my previous Bibles; second, I can navigate it a lot more quickly than a bound Bible; and third, it has a search function, so if I remember just a few words of a verse, I can track it down in a matter of seconds. I must admit that I've wondered a few times, since I've had it, why I still put effort into memorizing Scripture.
This is actually a specific example of a wider debate raging in circles from education to journalism to brain science: why should students memorize facts if any conceivable "fact" is a few keystrokes away? Why bother to memorize phone numbers if they can all be saved in your cell phone? But on the other hand, what do you do if you lose your cell phone, or your internet access?
This morning, a story on NPR made it clear to me that this isn't a new problem. They interviewed Leslie Aiello, an anthropologist with the Wenner-Gren Foundation in New York City, and she made the point that human teeth aren't nearly as formidable as the teeth of most other animals, primarily because we're tool users. In other words, our knives, forks, kitchen graters, food processors, all serve as "teeth" in the same sense that a cell phone's contacts list outsources what we once housed in our memories. For that matter, cooking is really just off-site digestion.
The fact that so much of our food is prepared in more and more complex ways gives us a greater degree of control, but it also makes us a lot more vulnerable to mishaps. It's a lot easier to cut yourself with a knife than it is with your own teeth, and a knife makes a better weapon against someone else than an incisor does. And the more we depend on a highly elaborate diet made up of many ingredients and multi-process preparation routines, the more simple disruptions to daily life can sabotage the task of getting fed at all. People who know how to secure simple food, how to live off the land, fare a lot better when things fall apart than highly civilized people do. And for each of those weaknesses, there's an analogue in the storage and retrieval of information.
In particular, primitive hunters and gatherers in prehistoric times had to spend a lot more time and energy just feeding themselves enough to fend off starvation, but in many ways their lifestyle was healthier than ours, and few suffered from obesity or eating disorders. There's plenty to be said about information overload, but what interests me even more is the growing number of people who identify as their number one fear the experience of looking absurd in a social encounter. It used to be that I could count on public speaking turning up as most people's top choice, but I've seen survey results over the past few years that pegged small talk with a distant acquaintance as scarier still. And in my gut I suspect that the different ways we produce, consume and retrieve information are at or near the heart of the forces pushing that change. Definitely something I'm keeping my eye on.
In 1996, I yanked the plug on my cable TV. In the ensuing fourteen years, I haven't been a TV watcher, and I've noticed some huge benefits. This is all very unscientific and speculative, but I have no doubt at all that my attention span and memory both have grown explosively since I gave up TV. There are hints in the literature that because viewing is so passive, long hours of engagement with TV programs cause some vital brain functions to atrophy, but none of the research supplies a definite answer. From my experience, though, I'm entirely sure, which means I'm very happy with that decision and plan to stick to it.
It did come at a price, though: it all but froze my pop cultural literacy back in 1996. These days, with the passage of time, that price has grown more and more noticeable. Often, students try to illustrate a concept in class using a TV commercial, or a character from a TV show, and I have to look helpless and say "Well, that's on TV, so I have no idea about that."
Movies are a bit of a gray area. I tell my students, "I see about a movie a year." The TV embargo has changed my thinking patterns so much that I struggle against succumbing to the created world inside a film. The camera points your eyes where they're supposed to go; the music, and other aesthetic cues, tell you which emotion to feel; it's such a mental frog-march that I feel out of place and cynical, so it's rare, these days, that I enjoy a movie start to finish.
With all that said, a few years back, about a month before I arrived in Eugene, a beloved non-chain video store named Flicks and Pics succumbed to the new media environment, and the Eugene Public Library bought up most of their collection. I discovered the library last summer, and now think it's one of the most potent forces for truth and justice within about a million miles of me, so this summer I finally approached their DVD shelves to take a careful look. And there I discovered movie after movie that at some point I'd wanted to see, but never got around to watching.
This summer has been my movie summer. What's below are all the movies I checked out from Eugene Public Library and watched all the way through. That's not to say I found it easy to do so: there's an even longer list that I quit watching in the middle, or that I checked out and then never watched in my allotted three weeks. With most of these, I had to pause at least once and go do something else. And possibly the most intriguing bit is that I have actually noticed my attention span and memory don't have the edge they had last spring. Even this much viewing time, spread out over nearly three months, has had an effect, and not a good one, on my brain wiring. For that reason, I'm cutting off the film festival at the end of this week, what with the arrival of the new month. I might take it up again next summer, but we'll have to see about that.
One quick gloat: I saw every movie on this list for free. I love the Eugene Public Library so, so, so much.
Without further ado, my summer viewing. The explanation of the stars is at the bottom.
★★★★
Hotel Rwanda
The Up Series (7 Up – 49 Up)
The Devil Wears Prada
That Thing You Do!
★★★☆
The Great Debaters
Harvard Beats Yale 29-29
Wag the Dog
American Gangster
Paranormal Activity
The People vs. Larry Flynt
I ♥ Huckabees
F for Fake
The Color Purple
Erin Brockovich
Me and You and Everyone We Know
Taxi to the Dark Side
Rabbit-Proof Fence
Super Size Me
★★☆☆
Good Night, and Good Luck
The War Room
A Prairie Home Companion
The Last King of Scotland
Charlie Wilson's War
Pan's Labyrinth
Sicko
Monster's Ball
Blades of Glory
All the President's Men
Barbershop
To Sir, With Love
Grave of the Fireflies
The Remains of the Day
★☆☆☆
Grosse Pointe Blank
Hot Shots!
☆☆☆☆
Fantastic 4: Rise of the Silver Surfer
The stars are a measure of how far the movie deviated from my normal enjoyment of movie-watching. The four star movies were so engrossing that I could've, or did, watch them at one sitting, and if I had to pause them, my mind stayed on them and I wanted to get back as soon as possible. Three stars means I got to the end of the film and judged it a positive experience, and two stars signals that it was an acceptable experience, not worse than my average visit to the theater. One star means I was disappointed, and zero stars means the film was embarrassingly bad; there are so few of those because I was more inclined to shut a film off and take it back than to finish it if it was that bad. I'm honestly not sure why I watched Fantastic 4 through to the end. Within each rating category, I've got the films listed in the order I saw them.
So that's that. Now, back to a diet of reality over image.
I remember, in my doctoral seminar on rhetorical criticism, nailing down the difference between a diachronic and synchronic angle of attack on communicative practice. Diachronic refers to movement through time, while synchronic is identification of relationships at one moment in time. The simplest illustration of the concept involved a game of chess: you might map the moves made by one piece, say, the queen's bishop, all the way through the game, and that would be diachronic. Or you could stop the game about five or ten moves in, identify the strategic potential of every piece on the board, which pieces were under attack, which side had the stronger position, etc., and that would be synchronic.
I learned those lessons in a classroom in Georgia, where the home folks have an especially deft grasp of the concept. Small town Southerners want to know two things when they meet you: where are you from, and who are your people? Effectively, those are the two dimensions that Einstein identified as a continuum: where did you come from in space and in time? What is your place and your lineage? Who came before you, and who surrounds you? They ask because they're looking for that one clue that will sum you up.
The answer, in my case, is cartoons. Cartoons play a major role in both my heritage and my neighborhood.
I grew up in Richardson, Texas, a little suburb of Dallas. My mother still lives there, and I go back to visit every summer. Put a blindfold on me and I could probably find almost any square inch of the town. Mike Judge didn't grow up there, but he did live there for part of his childhood, and it was from the Richardson Public Library that he checked out his first books on animation. In case the name doesn't ring a bell, he's the creator of Beavis and Butthead, as well as the second longest-running animated show on network TV, King of the Hill, which, he's said in interviews, he modeled on his memories of Richardson.
The longest running animated show on network TV is The Simpsons, created by Matt Groening. He didn't spend any part of his formative years in Richardson, but rather in Portland, which means he's not from my town. He is, however, one of my people: he's my fifth cousin. On my father's mother's side of the family, three more generations back, one of my female ancestors was a Groening who married into Schmidt-ness. Her great-granddaughter, Anna Schmidt, married Glen Srader, and about fifty years later, I came along. Admittedly, both of these links are pretty tenuous -- Mike Judge and I shared city limits only for a handful of years, and Matt Groening and I are as closely related as, oddly enough, Franklin and Theodore Roosevelt.
But that's a wild enough coincidence to make me stop and appreciate it. Animated shows that have long, healthy runs on network TV aren't as common as houseflies; the only two people in my generation who have succeeded in creating such works both show up in my heritage, one on each of its axes, the diachronic one and the synchronic one, my place and my people.
By the power vested in me as a professor of rhetoric, I'm begging, pleading with the human race, and particularly people who write for a living, to figure out the difference between "begs the question" and "raises the question." They do not mean the same thing.
When an occurrence makes it a good time to take up and discuss a burning question, that's raising it, not begging it. BP's oil spill in the gulf raises the question of whether deep-water offshore drilling should be allowed. Question-begging is a logical fallacy, and has a very precise and technical meaning, namely that an arguer has simply assumed the very part of the argument that needs to be proven. If NCU had a cookie-baking contest, and someone said "Just give the prize to Doyle, since he makes the best cookies of anyone on campus," then that would be question-begging: the whole point of the contest would be to put all the entrants' cookie-baking skills to the test.
And for goodness' sake, there are few enough people left with the crumbs of critical thinking to be aware of, and care about, flawed reasoning, so if we start tossing fallacies onto the linguistic scrap-heap because we're too lazy to get our distinctives right, then we speed up our civilization's decay. Believe me, it doesn't need the help.
I stopped going to church around the time I turned fourteen, and returned just a few months after my thirty-second birthday. Both the stopping and the restarting came shortly after events that could easily be misinterpreted.

June 3, 1983 was my last day of eighth grade, and was also the day my father lay down on the floor to watch television and died of a completely unexpected heart attack. My fourteenth birthday came six weeks later to the day, and, near as I can recall, I stopped attending church that very week. But it would be far too tidy to explain my decision as anger against God. So goes the conventional account: if my father, whom I loved very much, could be torn away from me like that, then I wanted nothing to do with God. Simple set piece in a thousand novels and screenplays. The problem is, it wasn't that way at all: I still gave God all my loyalty and called myself a Christian. What I couldn't stand was church.

I'd been warned that churches don't handle grief very well. I was braced for the fact that they'd be supportive for about two to four weeks, show up with casseroles, keep us company around the clock, and then they would decide we'd grieved long enough, and vanish. Actually, the vanishing wasn't so bad; we were sick of having a full house, and the thought of one more casserole was enough to squelch my appetite. But what was awful was the way they treated us when we did see them.

Comforting, it seems to me, is a very context-specific skill. People tend to be surprisingly good at it when they're actually, physically in attendance at a funeral, or paying a condolence call to the home of someone who's suffered a loss. Where people aren't good at it is anywhere else. Catch them at the grocery store, at school, or, worst of all, in the hallways of the church, and they're like fish out of water. They're absolutely terrified that anything they do or say will cause you to burst out crying, which will immediately make the universe explode.
That's precisely what happened next: people I'd known all my life from church took unmistakably to avoiding us. I wouldn't say we were ostracized or shunned, because there was no sense of hostility or disapproval; worse, people tried to make it look casual, or accidental, as though they just hadn't seen us, which was far, far worse because it was such a glaring, if wordless, lie. I weathered this for a couple of weeks, until one Sunday morning, as we headed home, my mother turned around from the driver's seat of the car and asked a question I never, ever thought she'd ask: "Do you want to keep going to church?"

Until then, it had simply never been open to discussion. There was nothing optional about attending church. But she'd seen what we'd seen, and even if she had the strength to take it, she wasn't about to let it happen to her sons. All three of us stopped going to church. She started back within the year, and my brother returned to regular church attendance, I gather, when his soon-to-be wife conveyed to him that she would only marry a churchgoing man. For me, it took a bit longer.

During my entire time as a debater and debate coach, I didn't take seriously the idea of joining a church. When you're on the road as many weekends as I was, it's virtually impossible to put down roots at a church. If you only show up every third or fourth Sunday, then each time you go, you have to keep reminding people what your name is. I simply didn't bother. Then, for about two years after I walked away from debate, I was still too occupied with decompression, with getting used to a humane rhythm of life and a bit of self-care to think about giving up Sundays for Christian fellowship. And, I suppose, at the back of my mind I was still nursing old resentments.

The other date that's easy to misunderstand is the day I first went to the church I wound up joining: September 23, 2001.
Twelve days after September 11th.

No, I didn't start back to church because September 11th put the fear of God in me. Nothing like that. Even though I've been a Baptist all my life, most of my extended family is Methodist, and with one cousin in particular I used to have a good running bout of mutual teasing about the denominational gap. She moved out to East Texas and joined a Methodist church, but eventually grew disenchanted with it and moved her membership to the local Baptist church. You'd better believe I let her know how good it felt to finally, once and for all, claim victory over the Methodists. A year or so later, she made a mid-summer move to the town where I lived, and called me up one day saying she'd found the church she planned to join, and was I interested in visiting it?

I walked through the front door, and within five seconds I knew I belonged back.

I visited a few more times before I joined, but I've never had any doubts since about whether I belong in a church, in fellowship, in Bible study and teaching, and in service. I remember what my life was like during my unchurched period, and I don't want it back. I remember that my faith was a fact, a single facet of the totality of me, but still something thin and insubstantial and completely unsatisfying. The reality of belonging to a church, of working within it, giving to it, clinging to it as it goes through its ups and downs, is extremely powerful. I'm better with it and weaker without it.

So it's not as simple as quitting church because of a death, and it's not as simple as coming back to church because of a shocking event. An outsider who didn't have all the facts could note the timing and feel very convinced of the cause-effect relationship, but that outsider would stray far from the truth simply due to taking the interpretive path of least resistance.

I remind myself of this when I see sloppy scholarship, much of which consists of the kind of easy-path "reasoning" described here.
In all human activity, and particularly in the traumatic human experiences that work enduring changes, there will nearly always be more to learn, more to explain, than just stringing together each event with the nearest plausible and easily-explained antecedent. But if I had a nickel for every time I saw exactly that kind of ramshackle work lauded as groundbreaking, my church could pay off the mortgage with just one month of my tithe.
So I got to thinking this morning about male nipples, and not for the first time.

Nipples aren't sex-linked; they're like arms, legs, ears, the standard equipment that every human being grows from scratch, whether male or female. Male nipples are vestigial, never having been hooked up to a fully functioning mammary gland. Culturally, at least in our culture, it's only the mildest of aberrations for a man to display his nipples. Certainly he, I, shouldn't do it at a formal dinner party, or where food is being prepared or served, but on a public street there's nothing wrong with it, especially on a hot day.

And from time to time, I give in to my silly side and use the word "nipples" in class, referring to the male variety. One example: people ask me what's the longest my beard has ever grown, and I tell them it's been down to my nipples. That nearly always gets a nervous giggle, because students' first thought is that I've just said something off-color. If any of them try to correct me, I point out what I wrote above.

Today, however, I got to thinking in a different direction: what if women had a visible, non-functional man-bit that it was moderately acceptable to display? I reasoned by analogy from the nipple, which is not really the glandular tissue but merely a covering for the duct, and wondered what it would be like if women had ... well, if they had a part that rhymed with "so dumb," only without the contents that rhyme with "mutts." And what if it was located a bit higher than the male version, which, given how many young women display their bare midriffs, would mean it was often visible? People are certainly weird and irrational enough to find that attractive, sort of like a beauty mark: a little, wrinkly beauty mark, for the abdomen. Wonder if they'd scratch it when it itched?

There's no real point to this. I didn't have any flashes of life-changing insight, or anything.
It's just a sample of what it's like being in a line of work where you get paid to think about what most people ignore. Even when I'm not on the clock, my thoughts still spill out in weird directions.
I often wonder why in the world God spoils me so much. I wonder why He built into me so many quirks and eccentricities that incline me toward teaching, and then shaded my pleasure centers so that I enjoyed it this much. It just seems almost too perfect; I'm designed to do something, and I'm wired together to love doing just that thing. It's a wonderful way to live, and someday I'm going to have to ask Him why I was the lucky one. I get reassurance that the teaching goes well from course evaluations, from occasional teaching awards, but all of those are flawed measures for reasons I've written about elsewhere. But what happened yesterday was flawless and unmistakable.
Last fall, two of my graduating seniors, who happened to be engaged to one another, dropped in during my office hours and asked if I would marry them. Yesterday, I did. That's still sinking in. I can turn that reality over and over and over in my mind, and it is smooth and solid and impermeable. There are absolutely no "Yeah, but" cracks anywhere in it, and for an academic to surrender to an idea's completeness is no small thing. I was not a perfect teacher for Jordan or Tessa; I had my off days, sometimes wasn't patient enough, sometimes explained things poorly, sometimes sat on assignments and didn't get feedback to them in a timely fashion, but there is absolutely no denying, or even shading, the reality that the time we spent as professor and students was a time of growth and transformation. I made a difference with them, and they made a difference with me. I've known for years that I made a difference with students, and I've definitely been aware that they left their mark on me, but usually it's the sort of thing that's in the air, invisible, out there somewhere, but not easily sensed or gauged. In this case, it was right in my face and unmistakable. Once or twice yesterday I gave in to feeling joyful about it, but most of the day I was simply caught up in awe. It's a very big feeling, by which I don't mean that I felt swelled up or important, but simply that the feeling was overwhelming.

I, of course, fell prey to my usual flaw of hanging back, being a little too reserved, doing and saying less rather than taking the risk of doing or saying enough. The wedding party were all in their early to mid-twenties, and although they kept inviting me in to the conversations, inviting me to sit with them and enjoy things, I kept holding back, aware of my age, afraid of being absurd, not wanting to take attention away from Jordan and Tessa in the middle of their celebration by becoming conspicuous.
And following my rule -- at all costs, don't touch students -- I offered Jordan several very professional handshakes, when what I should've offered him was what every other male at the wedding did: a big bear hug. He hasn't been my student for seven months, and won't be ever again, so it was perfectly in line to show, to express, that he wasn't just a student I enjoyed hearing speak up in class, or whose papers I enjoyed grading, but that I now regarded him as a friend, as someone I respected and loved, as a brother in Christ, as someone I was proud to say I knew. I also hung back from Tessa, but that felt different; she was a beautiful bride, radiating joy, surrounded by bridesmaids and family and mentors and friends and teammates and a huge crowd of people, all drinking in her presence, so whether I stepped forward and chipped in fully didn't feel as important. Such things are slippery and hard to frame in words, but that was my take.

Oh well. Even a year into my forties, I've still got a lot of growing up to do.

And on second thought, I don't want to figure out what God is up to, and why I've got it so good. If I ever grasped the reason, I might see my way to where it could stop. And if it's ever going to, I'd rather not know.
We have no reason to believe that we are living in the end times. None.

Get out your Bible, painful though it may be, and read Matthew 24. Stick with it at least through verse 36. See? No one knows the date. The angels don't know; only God the Father knows. Only Him.

This gets on my nerves as much as anything else my brothers and sisters in Christ get up to. "We can tell from the signs that we're living in the end times!" No, we can't. We fit what we notice into Biblical teachings, but that's no different from seeing animal shapes in the clouds. Partly it's how our brains are wired, and partly we do it for the thrill. And sometimes we do it with a conscious agenda of lighting a fire under sluggish Christians, which is probably the worst motive of all.

The end will come when it comes. Our job is to live as though we expect it one second from now. But we have zero, and I mean zero, and let me underscore zero, rational basis for saying it'll be in the next year, next ten years, in our lifetimes, or even in this millennium. If Christ tarries until the year ten thousand, that's His call. So enough with the scraped-up solemnity and suspense over the end times. Really; enough already.
I was just treated to a bad argument that I enjoyed enormously. And I don't mean "enjoyed" in the sense of belittling it, but rather that I wanted badly to agree with it, and wished that it weren't such a bad argument. I'm putting it here just so I can come back later and marvel at its damaged beauty:

Juries and judges in capital cases should be instructed that if the defendant is sentenced to death, the sentence is carried out, and the defendant is subsequently proved innocent, then the judge and the entire jury will be put on trial for murder.
Wow. Terrible reasoning, but I love it. Why oh why can't it make sense?
I have to confess that I don't get cheerleading, and in particular competitive cheerleading. Cheerleading originally had as its purpose whipping up the crowd so the players could feed off their excitement and play harder as a result. The fact that cheerleading is itself a competitive sport seems absurd to me. Often, the cheerleaders ride to the contest site in fifteen-passenger vans, so should we make fifteen-passenger van driving a sport? Have van drill teams? Have the drivers do ballet moves as they climb out of the van and close the door with the perfect measure of loudness, calibrated down to the last decibel? That doesn't seem any less silly to me than judged cheerleading contests.
More importantly, it strikes me that the move in this direction is one symptom of a very serious sickness in our culture, whether we tag it declining social capital, or alienation, or any of a dozen other labels. At the beginning, cheerleaders interacted with the crowd: they projected excitement and enthusiasm, and they led fans to encourage, vocally, the players on the field. What do they lead now? Some places don't even call it cheerleading anymore; they just call it cheer, as in "cheer camp." And now it's all about performance, all about "we'll leech some of your attention away from the field and show off our dance and gymnastics moves." It's atomized, not collective; it's not about putting fans and athletes together into one cohesive group, but rather about letting fans channel-surf from the game to the dance recital and back again.
I have a niece who's very active in cheer, and I suspect she wouldn't agree with much of this. I know the participants enjoy it, and as a performance style I know it has its fans. So why not completely decouple it from athletic events and stage cheer recitals? Why not give it another name -- "cheerdancing," say -- and go back to actual cheerleading at the games? Then those who turned out for the recitals could make up a community of people who appreciate the performance style, and the actual cheerleaders would return to building up cohesion between players and spectators, and instead of fragmenting and pulverizing, everyone could celebrate what they all mutually enjoyed again.