When at home, I fart
Unapologetic'ly.
Pt. Pt. Pt. Pt. Pt.
Time for a new chapter.
Today I got my course evaluations from the term that just ended, and they made plain to me how urgent it is that I charge into battle for the substance and authenticity of what happens in my classroom. I've been a young teacher, trying to figure things out. I’ve been a comfortable teacher with good, developed instincts. I’ve been a popular teacher on a Christian campus with small classes, enjoying positive, light-hearted, friendly relations with my students. None of those teachers are gone; they’re all sedimentary strata in my foundation. But now it’s time for something different, and I plan to pursue it with all the stubborn militancy I can muster.
I do not believe in memorization and will no longer encourage or reward it.
I do not believe in note-taking for note-taking’s sake, and will no longer encourage or reward it.
I do not believe in playing school, and will no longer encourage or reward it.
The overwhelming majority of students here at NCU, and on other campuses across the country, are stubbornly wedged into a set of habits and assumptions that are channeling their time, energy, and potential straight down the drain. I have coexisted with those habits and assumptions for too long. No longer.
To begin with, my colleagues, my students, and I have to fully grasp that learning is worship.
My students are very committed to the idea that worship has to be authentic, that it cannot consist of going through the motions, but somehow they don’t take that idea with them into the classroom. There’s a widely shared separation of NCU life into the sacred and the profane. The sacred is the ministry work, like staging chapel celebrations, doing community service, or leading small group Bible studies. The profane includes things like jury duty, visits to the doctor, and getting an education. Activities in the second category can be ministry opportunities, as it certainly would be possible to witness to someone in the jury pool, but they aren’t anything anyone would seek out for spiritual development: they’re to be tolerated, not wholeheartedly tackled and experienced.
I’m not convinced that in every case our students choose, consciously, to put getting an education into that second category, but the choice is unmistakable based on their behavior. They pour all their ability and energy into ministry work, but laugh to one another about how often they write papers the night before they’re due, or pull all-night cram sessions just before a test. And, naturally, they rarely take a glance at the graded papers, and take it for granted that material learned for a test is to be forgotten the second the test is over. The notion that the papers might be documents of their intellectual development that need periodic revisits, or that they might retain and make use of the material covered on a test, is entirely foreign.
It’s crazy. They don’t study the Bible that way, but every academic subject gets that arm’s length, dismissive treatment. And what’s crazy about it is that this isn’t a monastery or a convent; it’s a university, and they made the deliberate choice to enroll. The primary purpose of this institution is to offer programs of study that culminate in academic degrees. They came here for the purpose of earning such a degree. Now, I do understand that to a certain extent, people at this stage in life struggle with self-discipline; it’s too tempting to go straight to the enjoyable activities, the socializing, the work that yields instant reward. That’s true on Christian and secular campuses alike. But I’ve also seen impressive, substantive, polished work whenever they make the connection between their efforts and direct service to God. It’s not easy to play a musical instrument, but I’ve heard performances that gave me chills. It’s not effortless to plan a worship event, but I’ve seen worship events that went off like clockwork, with truly thoughtful, thought-provoking elements incorporated seamlessly. The problem is that they don’t see the connection between schoolwork and serving God.
That’s a shame. Christ’s followers certainly did.
Yes, He healed. Yes, He worked miracles. But what He did most of all was teach. He didn’t have a lot of use for people who followed Him around only to see the signs and wonders. “Take my yoke upon you and learn from me,” He said. And He didn’t just teach them how to interpret scripture, how to pray, how to do things that felt sacred: He taught them what to do with their money, how to handle conflict, how to manage contracts. Paul, His apostle to the Gentiles, would go on to castigate believers who stopped working at their jobs so they could idly await His return, saying “The one who is unwilling to work shall not eat.”
Oprah Winfrey, explaining why she builds schools in Africa and not in the United States, said “If you ask the kids what they want or need, they will say an iPod or some sneakers. In South Africa, they don't ask for money or toys. They ask for uniforms so they can go to school.” We’ve had plenty of visitors to campus who talk about the level of need they’ve seen outside the United States, and our students overflow with compassion for children who are hungry, who are victims of abuse. But I wonder if a single one of them appreciates how appalled those same children would be to see them squander their opportunity to learn? When they work to feed the hungry, I know it moves them to think about how blessed they are to have enough food; for abused children, to think about how blessed they were to grow up safe, protected, among loving family members. But they work to exhaustion in order to provide for children who are hungry for education, for learning, for a chance to take possession of their own lives, and they never see the slap in the face they give those children by making a mockery of their own access to exactly what the children crave.
I genuinely don’t get the reasoning that leads students to enroll at a Christian university, identify as fellow Christians who are giving up their entire lives to service, but then do a marginal, half-hearted job on the meat of that affiliation, the completion of coursework to earn a degree. Why not cut out the middleman and go straight to work at a church? The answer is, because most healthy churches won’t hire them unless they have a college degree, and, in many cases, seminary training to boot. What can we infer from that? Could it be that their elders, their role models, see value in the discipline of undertaking complete preparation, a wall-to-wall education, before embarking on a life of service?
And if schoolwork is profane, then why a Christian college? Daily toothbrushing is a good idea, but I doubt many of our students go out of their way to insist on a Christian toothbrush. Dental hygiene is one of the necessary, unavoidable tasks that are preparatory to active participation in the Kingdom of Heaven for another day, but I can’t think of a Christian way to brush one’s teeth that is distinct from an atheist’s approach. If schoolwork goes in the same category as toothbrushing, then why a Christian college? It seems beyond obvious to me that exploring the order in God’s creation is, itself, a form of worship, and pushing back ignorance and choosing to learn critical thinking skills is an offering to God. So why do so many students bring such a meager, poor, depleted offering?
I’m not just talking about sloppiness, by the way. Plenty of type-A, very hard-working students approach schoolwork in a spirit that is very self-centered and entitled. Just this semester, I’ve had several of my more successful students insist that I should design my classes around memorization and taking notes off Powerpoint, two activities that have only the most remote relationship to learning, and a much closer relationship to going through the motions. Several highly capable students dropped my Introduction to Mass Communication class after they tried to memorize everything covered on the first test, but met with disaster. One in particular told me that her learning style involved memorization, and if I didn’t re-design the class to reward memorization, then I was a bad teacher. I replied that it was far more important to me that they understand the course content, and that things memorized for tests tended to be forgotten almost immediately. I’m sure I’m correct about that, but I made zero headway in getting any agreement from her. In other classes, students complained that I’m no longer using Powerpoint, because they don’t know how to take notes. There’s ample research supporting the notion that Powerpoint deadens understanding and atrophies listening skills, and I explained as much every time a student asked me to go back to it. But they’ve got their comforting routines of writing down the bullet points, and when I disrupted those routines with the radical notion that they should pay attention and engage the material, they turned sullen and put the blame on me for their struggles.
Finally, I think this culture stays wedged in place because of my own behavior, and the behavior of my colleagues. I’ve said for years that I don’t want my students to like me right now; instead, I want them to look back in twenty years and like what I did, and what effect it had on them. If they like me too much right now, then I’m not challenging them enough. A colleague of mine asked me the question, last Spring, “Do you really believe that a class has to be hard for students to be learning?” I bobbled the question at the time, but it’s stayed on my mind ever since. The answer is yes, in a certain sense, it does. Christ’s followers were disciples because they’d taken on discipline, and we today separate our curriculum into academic disciplines because they should have rigor and challenge, and completing them should require more from students than they arrive able to do. A native speaker of Spanish who’s a published author, poet and playwright in Spanish, should not enroll at an American university to major in Spanish. That person has mastered the language, so completing the program is a waste of time and effort.
And I’m afraid that we’re all creeping closer and closer to expecting nothing from our students. We do Powerpoint slides because they’re easy to develop into routines. We give cursory attention to written work, because it demands less effort from us than digging in and grading it line-by-line. We take our cues from student performance, easing back on the level of difficulty in tests and assignments if the grades go down. In some cases, if students become enough of a hassle, we cut corners and overlook whatever we need to in order to make them go away. None of that is tolerable. All of it sells the students, and the service of teaching, short. And I am as guilty as anyone of practicing it. And I have decided not to anymore.
It is no more acceptable to play school than it is to play church. We offer teaching and learning as a form of worship. And scripture makes it clear that God doesn’t want offerings brought reluctantly, or from mixed motives; if we offer something to God, it needs to be in joyous gratitude for what we’ve been given, and if it’s not the best we have to give, then the joy and the gratitude are awfully hard to take seriously. I have no reasonable expectation that I can bring this off perfectly, but I am determined to double down on an insistence on learning, and a challenging of play-school routines and behaviors. And I think I might get my wish: fewer and fewer students are going to like me right away, but if I do it right, more of them may like what they see a generation from now.
I know all the fake arguments in favor of colleges having football teams (building work ethic, learning to function in a team environment, teaching leadership) and the real argument (money), but today I choked on the gear-stripping irrationality of college football in the face of recent discoveries about the price the game exacts from its players. Mounting evidence suggests that Owen Thomas of U Penn committed suicide in large part because he suffered from chronic traumatic encephalopathy, a kind of brain damage previously thought to result from too many concussions. But Thomas hadn't had many concussions; instead, he had the little brain traumas that are a part of the ordinary play of the game, and aren't addressed by any medical intervention. Now, he's obviously an outlier; most football players don't suffer brain damage that leads them to suicide. But from what his case reveals, it seems equally obvious that most, if not all, football players suffer brain damage, and a lot more severe damage than we admit to ourselves.
Am I the only one who thinks this is crazy? Is there another human on earth that remembers why colleges exist? I come to work every day, roll up my sleeves and put eight hours of sweat into training students to use their brains in constructive ways. Why on earth is it tolerable for a college to sponsor a brain damage factory? I doubt I could talk an eating disorders clinic into sponsoring the Coney Island 4th of July hot dog eating contest, but the overwhelming majority of institutions of higher education in this country use, as one of their chief marketing tools, a frontal assault on the bricks and mortar of their students' cognitive faculties. Absolutely insane.
So it struck me again this morning what a hard time the Second Amendment folks have keeping their story straight when it comes to their articles of faith. Elsewhere I've written about the fact that "Banning guns won't stop gun crime, but will just drive gun sales underground" applies with equal force to outlawing abortion, but there's no shortage of politicians and private citizens who think a gun ban would be an absurd failure while clinging just as desperately to the dream that criminalizing abortion would stop the procedure like flipping off a light switch.
Here's another: gun advocates say it's not the guns that kill people, but the choices made by the owners of the guns. Okay, stipulated. But then, the same people are often the quickest to bray for "tort reform," which would effectively cripple the ability of any private citizen to file a lawsuit, and the only support they offer for their position is a string of decontextualized anecdotes about "frivolous" suits. Do they not get the disconnect? Is it really that hard to see that even if a handful of people pursue absurd litigation, that says absolutely nothing at all about the importance of access to the courts as a leveling tool between the wealthy and the powerless? Are they equally in favor of tearing down fire stations because from time to time someone calls in a false alarm?
It's bad enough that their argument is dumb; it's maddening that they recognize how dumb an argument it is in another context, then double down on that dumb argument when it apparently fits a different issue.
People aggravate me.
So I've noticed, just this fall, that the mighty wave of girls named Madison from the past generation or so has started to break across our campus. And this morning I was struck by the singularity of naming female children for US presidents' last names: Madison, Kennedy, Reagan. I then decided, as a public service, to single out presidential last names that would be most unfortunate girls' names, just in case:
- Polk
- Fillmore
- Cleveland
- Hoover
- Truman
- Johnson
- Bush
Where's Sarah Palin when you need her? Is there some sort of Palin-911 I can dial?
I've got this neat set of really sharp steak knives, but I hear I'm not allowed to perform surgery on anyone because some elitist liberals decided that you have to actually know things like medicine and human anatomy before you can get a license. That's obviously wrong and evil, because it isn't what I want to hear; if I could only find Sarah, I know she'd give me that warm smile and reassure me that I should be able to do anything that licensed doctors get to do. She'd tell me that studying and learning and mental discipline just turn you into a liberal, and ignorance and insistence are the way to true happiness and goodness. Jesus certainly never would've wanted me to know anything, would He? I need a big ol' fix of Sarah, stat.
I also want to pilot a 747 this afternoon. I don't need flying lessons or anything, do I? Tell Sarah to block out a double appointment for me; this could take a while.
This is another silly word game that goes here just so I can look back years from now and shake my head over the wastes of time that I turned to for entertainment.
Troy Dean is our newly-arrived campus pastor, and in one of our first conversations, he told me about a sewing circle at his old campus in California that they called the Stitch-n-Bitch. That was certainly cute in its own right, but it got me to thinking of other crafts that would pair neatly with speech acts, and I came up with ...
- Weave-n-Grieve
- Tattoos-n-Bad-News
- Needlepoint-n-Anoint
- Flower-Pressing-n-Second-Guessing
- Claymation-n-Character-Assassination
- Lithography-n-What's-Wrong-With-Me?
- Macramé-n-Auto-Da-Fé
- Origami-n-I-Want-My-Mommy
Others came up with "Paint-n-Complaint," "Stained-Glass-n-Talk-Out-Your- ..." which you can probably finish.
I have a Palm Z72, a stone-age precursor of the iPad, that goes everywhere with me. I bought it three or four years ago, and I use it as my personal Bible. It has Olive Tree software on it and five Bible translations: the NIV, the Holman, the New King James, the Spanish NIV, and David Stern's Complete Jewish Bible. The three biggest advantages it has are, first, it fits in my pocket, which means I've always got it on me, never true of my previous Bibles; second, I can navigate it a lot more quickly than a bound Bible; and third, it has a search function, so if I remember just a few words of a verse, I can track it down in a matter of seconds. I must admit that I've wondered a few times, since I've had it, why I still put effort into memorizing Scripture.
This is actually a specific example of a wider debate raging in circles from education to journalism to brain science: why should students memorize facts if any conceivable "fact" is a few keystrokes away? Why bother to memorize phone numbers if they can all be saved in your cell phone? But on the other hand, what do you do if you lose your cell phone, or your internet access?
This morning, a story on NPR made it clear to me that this isn't a new problem. They interviewed Leslie Aiello, an anthropologist with the Wenner-Gren Foundation in New York City, and she made the point that human teeth aren't nearly as formidable as the teeth of most other animals, primarily because we're tool users. In other words, our knives, forks, kitchen graters, food processors, all serve as "teeth" in the same sense that a cell phone's contacts list outsources what we once housed in our memories. For that matter, cooking is really just off-site digestion.
The fact that so much of our food is prepared in more and more complex ways gives us a greater degree of control, but it also makes us a lot more vulnerable to mishaps. It's a lot easier to cut yourself with a knife than it is with your own teeth, and a knife makes a better weapon against someone else than an incisor does. And the more we depend on a highly elaborate diet made up of many ingredients and multi-process preparation routines, the more simple disruptions to daily life can sabotage the task of getting fed at all. People who know how to secure simple food, how to live off the land, fare a lot better when things fall apart than highly civilized people do. And for each of those weaknesses, there's an analogue in the storage and retrieval of information.
In particular, primitive hunters and gatherers in prehistoric times had to spend a lot more time and energy just feeding themselves enough to fend off starvation, but in many ways their lifestyle was healthier than ours, and few suffered from obesity or eating disorders. There's plenty to be said about information overload, but what interests me even more is the growing number of people who identify as their number one fear the experience of being absurd in a social encounter. It used to be that I could count on public speaking turning up as most people's top choice, but I've seen survey results over the past few years that pegged small talk with a distant acquaintance as scarier still. And in my gut I suspect that the different ways we produce, consume and retrieve information are at or near the heart of the forces pushing that change. Definitely something I'm keeping my eye on.
In 1996, I yanked the plug on my cable TV. In the ensuing fourteen years, I haven't been a TV watcher, and I've noticed some huge benefits. This is all very unscientific and speculative, but I have no doubt at all that my attention span and memory both have grown explosively since I gave up TV. There are hints in the literature that because viewing is so passive, long hours of engagement with TV programs causes some vital brain functions to atrophy, but none of the research supplies a definite answer. From my experience, though, I'm entirely sure, which means I'm very happy with that decision and plan to stick to it.
It did come at a price, though: it all but froze my pop cultural literacy back in 1996. These days, with the passage of time, that price has grown more and more noticeable. Often, students try to illustrate a concept in class using a TV commercial, or a character from a TV show, and I have to look helpless and say "Well, that's on TV, so I have no idea about that."
Movies are a bit of a gray area. I tell my students, "I see about a movie a year." The TV embargo has changed my thinking patterns so much that I struggle against succumbing to the created world inside a film. The camera points your eyes where they're supposed to go; the music, and other aesthetic cues, tell you which emotion to feel; it's such a mental frog-march that I feel out of place and cynical, so it's rare, these days, that I enjoy a movie start to finish.
With all that said, a few years back, about a month before I arrived in Eugene, a beloved non-chain video store named Flicks and Pics succumbed to the new media environment, and the Eugene Public Library bought up most of their collection. I discovered the library last summer, and now think it's one of the most potent forces for truth and justice within about a million miles of me, so this summer I finally approached their DVD shelves to take a careful look. And there I discovered movie after movie that at some point I'd wanted to see, but never got around to watching.
This summer has been my movie summer. What's below are all the movies I checked out from Eugene Public Library and watched all the way through. That's not to say I found it easy to do so: there's an even longer list that I quit watching in the middle, or that I checked out and then never watched in my allotted three weeks. With most of these, I had to pause at least once and go do something else. And possibly the most intriguing bit is that I have actually noticed my attention span and memory don't have the edge they had last spring. Even this much viewing time, spread out over nearly three months, has had an effect, and not a good one, on my brain wiring. For that reason, I'm cutting off the film festival at the end of this week, what with the arrival of the new month. I might take it up again next summer, but we'll have to see about that.
One quick gloat: I saw every movie on this list for free. I love the Eugene Public Library so, so, so much.
Without further ado, my summer viewing. The explanation of the stars is at the bottom.
★★★★
Hotel Rwanda
The Up Series (7 Up – 49 Up)
The Devil Wears Prada
That Thing You Do!
★★★☆
The Great Debaters
Harvard Beats Yale 29-29
Wag the Dog
American Gangster
Paranormal Activity
The People vs. Larry Flynt
I ♥ Huckabees
F for Fake
The Color Purple
Erin Brockovich
Me and You and Everyone We Know
Taxi to the Dark Side
Rabbit-Proof Fence
Super Size Me
★★☆☆
Good Night, and Good Luck
The War Room
A Prairie Home Companion
The Last King of Scotland
Charlie Wilson's War
Pan's Labyrinth
Sicko
Monster's Ball
Blades of Glory
All the President's Men
Barbershop
To Sir, With Love
Grave of the Fireflies
The Remains of the Day
★☆☆☆
Grosse Pointe Blank
Hot Shots!
☆☆☆☆
Fantastic 4: Rise of the Silver Surfer
The stars are a measure of how far the movie deviated from my normal enjoyment of movie-watching. The four star movies were so engrossing that I could've, or did, watch them at one sitting, and if I had to pause them, my mind stayed on them and I wanted to get back as soon as possible. Three stars means I got to the end of the film and judged it a positive experience, and two stars signals that it was an acceptable experience, not worse than my average visit to the theater. One star means I was disappointed, and zero stars means the film was embarrassingly bad; there are so few of those because I was more inclined to shut a film off and take it back than to finish it if it was that bad. I'm honestly not sure why I watched Fantastic 4 through to the end. Within each rating category, I've got the films listed in the order I saw them.
So that's that. Now, back to a diet of reality over image.
I remember, in my doctoral seminar on rhetorical criticism, nailing down the difference between a diachronic and synchronic angle of attack on communicative practice. Diachronic refers to movement through time, while synchronic is identification of relationships at one moment in time. The simplest illustration of the concept involved a game of chess: you might map the moves made by one piece, say, the queen's bishop, all the way through the game, and that would be diachronic. Or you could stop the game about five or ten moves in, identify the strategic potential of every piece on the board, which pieces were under attack, which side had the stronger position, etc., and that would be synchronic.
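For readers who think better in code than in chess, the same distinction can be sketched in a few lines. This is purely illustrative; the moves and piece names below are invented for the example, not drawn from any real game:

```python
# A game record: each entry is (piece, from_square, to_square).
# The moves are invented purely for illustration.
game = [
    ("queen's bishop", "c1", "g5"),
    ("king's knight", "g1", "f3"),
    ("queen's bishop", "g5", "e3"),
    ("queen", "d1", "d2"),
]

def diachronic(moves, piece):
    """Diachronic view: follow one piece's path through time."""
    return [dest for (p, _src, dest) in moves if p == piece]

def synchronic(moves, n):
    """Synchronic view: freeze the game after n moves and ask
    where every piece that has moved currently stands."""
    positions = {}
    for piece, _src, dest in moves[:n]:
        positions[piece] = dest
    return positions

print(diachronic(game, "queen's bishop"))  # ['g5', 'e3']
print(synchronic(game, 2))  # {"queen's bishop": 'g5', "king's knight": 'f3'}
```

Same data, two slices: one function traces a single thread through the whole timeline, the other stops the clock and maps the relationships at that instant.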
I learned those lessons in a classroom in Georgia, where the home folks have an especially deft grasp of the concept. Small town Southerners want to know two things when they meet you: where are you from, and who are your people? Effectively, those are the two dimensions that Einstein identified as a continuum: where did you come from in space and in time? What is your place and your lineage? Who came before you, and who surrounds you? They ask because they're looking for that one clue that will sum you up.
The answer, in my case, is cartoons. Cartoons play a major role in both my heritage and my neighborhood.
I grew up in Richardson, Texas, a little suburb of Dallas. My mother still lives there, and I go back to visit every summer. Put a blindfold on me and I could probably find almost any square inch of the town. Mike Judge didn't grow up there, but he did live there for part of his childhood, and it was from the Richardson Public Library that he checked out his first books on animation. In case the name doesn't ring a bell, he's the creator of Beavis and Butthead, as well as the second longest-running animated show on network TV, King of the Hill, which, he's said in interviews, he modeled on his memories of Richardson.
The longest running animated show on network TV is The Simpsons, created by Matt Groening. He didn't spend any part of his formative years in Richardson, but rather in Portland, which means he's not from my town. He is, however, one of my people: he's my fifth cousin. On my father's mother's side of the family, three more generations back, one of my female ancestors was a Groening who married into Schmidt-ness. Her great-granddaughter, Anna Schmidt, married Glen Srader, and about fifty years later, I came along. Admittedly, both of these links are pretty tenuous -- Mike Judge and I shared city limits only for a handful of years, and Matt Groening and I are as closely related as, oddly enough, Franklin and Theodore Roosevelt.
But that's a wild enough coincidence to make me stop and appreciate it. Animated shows that have long, healthy runs on network TV are hardly as common as houseflies; the only two people in my generation who have succeeded in creating such works both show up in my heritage, one each on each of its axes, the diachronic one and the synchronic one, my place and my people.
By the power vested in me as a professor of rhetoric, I'm begging, pleading with the human race, and particularly people who write for a living, to figure out the difference between "begs the question" and "raises the question." They do not mean the same thing.
When an occurrence makes it a good time to take up and discuss a burning question, that's raising it, not begging it. BP's oil spill in the gulf raises the question of whether deep-water offshore drilling should be allowed. Question-begging is a logical fallacy, and has a very precise and technical meaning, namely that an arguer has simply assumed the very part of the argument that needs to be proven. If NCU had a cookie-baking contest, and someone said "Just give the prize to Doyle, since he makes the best cookies of anyone on campus," then that would be question-begging: the whole point of the contest would be to put all the entrants' cookie-baking skills to the test.
And for goodness' sake, there are few enough people left with the crumbs of critical thinking to be aware of, and care about, flawed reasoning, so if we start tossing fallacies onto the linguistic scrap-heap because we're too lazy to get our distinctives right, then we speed up our civilization's decay. Believe me, it doesn't need the help.
I stopped going to church around the time I turned fourteen, and returned just a few months after my thirty-second birthday. Both the stopping and the restarting came shortly after events that could easily be misinterpreted.

June 3, 1983 was my last day of eighth grade, and was also the day my father lay down on the floor to watch television and died of a completely unexpected heart attack. My fourteenth birthday came six weeks later to the day, and, near as I can recall, I stopped attending church that very week. But it would be far too tidy to explain my decision as anger against God. So goes the conventional account: if my father, whom I loved very much, could be torn away from me like that, then I wanted nothing to do with God. A simple set piece in a thousand novels and screenplays. The problem is, it wasn't that way at all: I still gave God all my loyalty and called myself a Christian. What I couldn't stand was church.

I'd been warned that churches don't handle grief very well. I was braced for the fact that they'd be supportive for about two to four weeks, show up with casseroles, keep us company around the clock, and then they would decide we'd grieved long enough, and vanish. Actually, the vanishing wasn't so bad; we were sick of having a full house, and the thought of one more casserole was enough to squelch my appetite. But what was awful was the way they treated us when we did see them.

Comforting, it seems to me, is a very context-specific skill. People tend to be surprisingly good at it when they're actually, physically in attendance at a funeral, or paying a condolence call to the home of someone who's suffered a loss. Where people aren't good at it is anywhere else. Catch them at the grocery store, at school, or, worst of all, in the hallways of the church, and they're like fish out of water. They're absolutely terrified that anything they do or say will cause you to burst out crying, which will immediately make the universe explode.
That's precisely what happened next: people I'd known all my life from church took unmistakably to avoiding us. I wouldn't say we were ostracized or shunned, because there was no sense of hostility or disapproval. Instead, people tried to make it look casual, or accidental, as though they just hadn't seen us, which was far worse because it was such a glaring, if wordless, lie. I weathered this for a couple of weeks, until one Sunday morning, as we headed home, my mother turned around from the driver's seat of the car and asked a question I never, ever thought she'd ask.

"Do you want to keep going to church?"

Until then, it had simply never been open to discussion. There was nothing optional about attending church. But she'd seen what we'd seen, and even if she had the strength to take it, she wasn't about to let it happen to her sons. All three of us stopped going to church. She started back within the year, and my brother returned to regular church attendance, I gather, when his soon-to-be wife conveyed to him that she would only marry a churchgoing man. For me, it took a bit longer.

During my entire time as a debater and debate coach, I didn't take seriously the idea of joining a church. When you're on the road as many weekends as I was, it's virtually impossible to put down roots at a church. If you only show up every third or fourth Sunday, then each time you go, you have to keep reminding people what your name is. I simply didn't bother. Then, for about two years after I walked away from debate, I was still too occupied with decompression, with getting used to a humane rhythm of life and a bit of self-care, to think about giving up Sundays for Christian fellowship. And, I suppose, at the back of my mind I was still nursing old resentments.

The other date that's easy to misunderstand is the day I first went to the church I wound up joining: September 23, 2001.
Twelve days after September 11th.

No, I didn't start back to church because September 11th put the fear of God in me. Nothing like that. Even though I've been a Baptist all my life, most of my extended family is Methodist, and with one cousin in particular I used to have a good running bout of mutual teasing about the denominational gap. She moved out to East Texas and joined a Methodist church, but eventually grew disenchanted with it and moved her membership to the local Baptist church. You'd better believe I let her know how good it felt to finally, once and for all, claim victory over the Methodists. A year or so later, she made a mid-summer move to the town where I lived, and called me up one day saying she'd found the church she planned to join, and was I interested in visiting it?

I walked through the front door, and within five seconds I knew I belonged back.

I visited a few more times before I joined, but I've never had any doubts since about whether I belong in a church, in fellowship, in Bible study and teaching, and in service. I remember what my life was like during my unchurched period, and I don't want it back. I remember that my faith was a fact, a single facet of the totality of me, but still something thin and insubstantial and completely unsatisfying. The reality of belonging to a church, of working within it, giving to it, clinging to it as it goes through its ups and downs, is extremely powerful. I'm better with it and weaker without it.

So it's not as simple as quitting church because of a death, and it's not as simple as coming back to church because of a shocking event. An outsider who didn't have all the facts could note the timing and feel very convinced of the cause-effect relationship, but that outsider would stray far from the truth simply due to taking the interpretive path of least resistance.

I remind myself of this when I see sloppy scholarship, much of which consists of the kind of easy-path "reasoning" described here.
In all human activity, and particularly in the traumatic human experiences that work enduring changes, there will nearly always be more to learn, more to explain, than just stringing together each event with the nearest plausible and easily-explained antecedent. But if I had a nickel for every time I saw exactly that kind of ramshackle work lauded as groundbreaking, my church could pay off the mortgage with just one month of my tithe.
So I got to thinking this morning about male nipples, and not for the first time.

Nipples aren't sex-linked; they're like arms, legs, ears, the standard equipment that every human being grows from scratch, whether male or female. Male nipples are vestigial, never having been hooked up to a fully functioning mammary gland. Culturally, at least in our culture, it's only the mildest of aberrations for a man to display his nipples. Certainly he, I, shouldn't do it at a formal dinner party, or where food is being prepared or served, but on a public street there's nothing wrong with it, especially on a hot day.

And from time to time, I give in to my silly side and use the word "nipples" in class, referring to the male variety. One example: people ask me what's the longest my beard has ever grown, and I tell them it's been down to my nipples. That nearly always gets a nervous giggle, because students' first thought is that I've just said something off-color. If any of them try to correct me, I point out what I wrote above.

Today, however, I got to thinking in a different direction: what if women had a visible, non-functional man-bit that it was moderately acceptable to display? I reasoned by analogy from the nipple, which is not really the glandular tissue but merely a covering for the duct, and wondered what it would be like if women had ... well, if they had a part that rhymed with "so dumb," only without the contents that rhyme with "mutts." And what if it was located a bit higher than the male version, which, given how many young women display their bare midriffs, would mean it was often visible? People are certainly weird and irrational enough to find that attractive, sort of like a beauty mark: a little, wrinkly beauty mark, for the abdomen. Wonder if they'd scratch it when it itched?

There's no real point to this. I didn't have any flashes of life-changing insight, or anything.
It's just a sample of what it's like being in a line of work where you get paid to think about what most people ignore. Even when I'm not on the clock, my thoughts still spill out in weird directions.
I often wonder why in the world God spoils me so much. I wonder why He built into me so many quirks and eccentricities that incline me toward teaching, and then shaded my pleasure centers so that I enjoyed it this much. It just seems almost too perfect; I'm designed to do something, and I'm wired together to love doing just that thing. It's a wonderful way to live, and someday I'm going to have to ask Him why I was the lucky one. I get reassurance that the teaching goes well from course evaluations, from occasional teaching awards, but all of those are flawed measures for reasons I've written about elsewhere. But what happened yesterday was flawless and unmistakable.
Last fall, two of my graduating seniors, who happened to be engaged to one another, dropped in during my office hours and asked if I would marry them. Yesterday, I did. That's still sinking in. I can turn that reality over and over and over in my mind, and it is smooth and solid and impermeable. There are absolutely no "Yeah, but" cracks anywhere in it, and for an academic to surrender to an idea's completeness is no small thing. I was not a perfect teacher for Jordan or Tessa; I had my off days, sometimes wasn't patient enough, sometimes explained things poorly, sometimes sat on assignments and didn't get feedback to them in a timely fashion, but there is absolutely no denying, or even shading, the reality that the time we spent as professor and students was a time of growth and transformation. I made a difference with them, and they made a difference with me. I've known for years that I made a difference with students, and I've definitely been aware that they left their mark on me, but usually it's the sort of thing that's in the air, invisible, out there somewhere, but not easily sensed or gauged. In this case, it was right in my face and unmistakable. Once or twice yesterday I gave in to feeling joyful about it, but most of the day I was simply caught up in awe. It's a very big feeling, by which I don't mean that I felt swelled up or important, but simply that the feeling was overwhelming.

I, of course, fell prey to my usual flaw of hanging back, being a little too reserved, doing and saying less rather than taking the risk of doing or saying enough. The wedding party were all in their early to mid-twenties, and although they kept inviting me in to the conversations, inviting me to sit with them and enjoy things, I kept holding back, aware of my age, afraid of being absurd, not wanting to take attention away from Jordan and Tessa in the middle of their celebration by becoming conspicuous.
And following my rule -- at all costs, don't touch students -- I offered Jordan several very professional handshakes, when what I should've offered him is what every other male at the wedding did: a big bear hug. He hasn't been my student for seven months, and won't be ever again, so it was perfectly in line to show, to express, that he wasn't just a student I enjoyed hearing speak up in class, or whose papers I enjoyed grading, but that I now regarded him as a friend, as someone I respected and loved, as a brother in Christ, as someone I was proud to say I knew. I also hung back from Tessa, but that felt different; she was a beautiful bride, radiating joy, surrounded by bridesmaids and family and mentors and friends and teammates and a huge crowd of people, all drinking in her presence, so whether I stepped forward and chipped in fully didn't feel as important. Such things are slippery and hard to frame in words, but that was my take.

Oh well. Even a year into my forties, I've still got a lot of growing up to do.

And on second thought, I don't want to figure out what God is up to, and why I've got it so good. If I ever grasped the reason, I might see my way to where it could stop. And if it's ever going to, I'd rather not know.
We have no reason to believe that we are living in the end times. None.

Get out your Bible, painful though it may be, and read Matthew 24. Stick with it at least through verse 36. See? No one knows the date. The angels don't know; only God the Father knows. Only Him.

This gets on my nerves as much as anything else my brothers and sisters in Christ get up to. "We can tell from the signs that we're living in the end times!" No, we can't. We fit what we notice into Biblical teachings, but that's no different from seeing animal shapes in the clouds. Partly it's how our brains are wired, and partly we do it for the thrill. And sometimes we do it with a conscious agenda of lighting a fire under sluggish Christians, which is probably the worst motive of all.

The end will come when it comes. Our job is to live as though we expect it one second from now. But we have zero, and I mean zero, and let me underscore zero, rational basis for saying it'll be in the next year, next ten years, in our lifetimes, or even in this millennium. If Christ tarries until the year ten thousand, that's His call. So enough with the scraped-up solemnity and suspense over the end times. Really; enough already.
I just was treated to a bad argument that I enjoyed enormously. And I don't mean "enjoyed" in the sense of belittling it, but rather that I wanted badly to agree with it, and wished that it weren't such a bad argument. I'm putting it here just so I can come back later and marvel at its damaged beauty:

Juries and judges in capital cases should be instructed that if the defendant is sentenced to death, the sentence is carried out, and the defendant is subsequently proved innocent, then the judge and the entire jury will be put on trial for murder.
Wow. Terrible reasoning, but I love it. Why oh why can't it make sense?
I have to confess that I don't get cheerleading, and in particular competitive cheerleading. Cheerleading originally had as its purpose whipping up the crowd so the players could feed off their excitement and play harder as a result. The fact that cheerleading is itself a competitive sport seems absurd to me. Often, the cheerleaders ride to the contest site in fifteen-passenger vans, so should we make fifteen-passenger van driving a sport? Have van drill teams? Have the drivers do ballet moves as they climb out of the van and close the door with the perfect measure of loudness, calibrated down to the last decibel? That doesn't seem any less silly to me than judged cheerleading contests.
More importantly, it strikes me that the move in this direction is one symptom of a very serious sickness in our culture, whether we tag it declining social capital, or alienation, or any of a dozen other labels. At the beginning, cheerleaders interacted with the crowd: they projected excitement and enthusiasm, and they led fans to encourage, vocally, the players on the field. What do they lead now? Some places don't even call it cheerleading anymore; they just call it cheer, as in "cheer camp." And now it's all about performance, all about "we'll leech some of your attention away from the field and show off our dance and gymnastics moves." It's atomized, not collective; it's not about putting fans and athletes together into one cohesive group, but rather about letting fans channel-surf from the game to the dance recital and back again.
I have a niece who's very active in cheer, and I suspect she wouldn't agree with much of this. I know the participants enjoy it, and as a performance style I know it has its fans. So why not completely decouple it from athletic events and stage cheer recitals? Why not give it another name -- "cheerdancing," say -- and go back to actual cheerleading at the games? Then those who turned out for the recitals could make up a community of people who appreciate the performance style, and the actual cheerleaders would return to building up cohesion between players and spectators, and instead of fragmenting and pulverizing, everyone could celebrate what they all mutually enjoyed again.
So there's this so-called imponderable question that makes the rounds in all the lists: why did kamikaze pilots wear helmets? I've seen various answers -- Cecil Adams said it was to keep their heads warm in open cockpits, while Marilyn vos Savant said it was because they didn't really want to face what they were about to do -- but as far as I'm concerned, the answer is pretty obvious.

Much of Japanese culture is based on 型, pronounced kata, which means the proper way of doing something. They have a proper way to do everything, from wrapping presents to greeting strangers. And when I say that the way is proper, I don't mean that it's normal or typical or expected, because all cultures have a normal way to give a present or offer a greeting. In Japanese culture, there are fine, precise details to those and other tasks, and there's a good deal of pride, and a good deal of cultural capital, available to those who execute them perfectly.

One way this shows up is in wearing the proper clothing for whatever one is doing. To play golf, you must wear golf clothes. To hike, you must have a hiking outfit. If your clothes aren't consciously and carefully matched to your activity, then the activity itself is less worthy, less satisfactory.

Therefore, if you're going to pilot a plane to your country's glory, you've got to be decked out in proper pilot attire. And what do pilots wear on their heads? (Or, what did they in the 1940s?) An aviator's helmet. Kamikaze pilots wore helmets so they would look the part, which is vitally important from one end of Japanese culture to the other.
This was originally a Facebook note. I posted it to explain why I was cutting back from 400+ friends to about a tenth that many.
- My take on communication technology is that balance is everything, and for most folks, there’s a tech-facilitated path that they won’t have an easy time keeping balanced. I see mothers yack on cellphones while ignoring their infants, I see teens who text obsessively and can’t pay attention to their professors or to traffic, and I see my own struggles keeping Facebook in healthy balance. I tend to hover over it and throw unhealthy amounts of the day into it. That’s been behind my decision to shut it down when school is in session for the past few terms. The problem is, that hovering isn’t any healthier during the summer; I need to be productive during the summer, and even when it’s time to relax, I don’t relax very successfully when I’m Facebook-tethered. I’ve never been a smoker, drug-taker or heavy drinker, but the Facebook dopamine squirt gives me trouble, and I’m convinced that the healthiest way to reach balance is to muscle my way to it with no half measures.
- I take seriously the privacy concerns people are reporting, and I’m very sure we haven’t seen the whole picture. Facebook is not providing all its geegaws as a public service; they are in this to make money, and they are continually working behind the scenes to make it more profitable than it is. Today’s Facebook is effectively already yesterday’s Facebook. While the loss of privacy might seem to be a latent threat, it disturbs me that if it were an active threat and I was at risk, I would have no way of knowing it. I keep my privacy settings fairly tight, but more and more I feel like someone driving on a busy highway who gives the road no more than half their attention; it’s not enough, and I need to drive a lot more defensively.
- Finally, I’m more or less sure Facebook has passed its peak. More and more people use it less and less, and the bulk of material that’s even marginally interesting comes from a handful of my four-hundred-plus friends. For a while, I thought that was seasonal, but I’m growing more sure it isn’t. What was once pretty robust and enjoyable has become thin gruel, so it’s a good time to cut way back before my expectations are dashed any further.
There are little reasons, little annoyances, that add an ounce or two of pressure on top of the above, but those three are the major driving forces. Because of them, I’ve arrived at a tentative plan to go on a mass un-friending, and cut back only to people in three categories:
- Family
- Graduates from my department
- A handful of people who were hard to find and whom I don't want to lose again.
Before I do that, I’ll collect a lot of email addresses of other people, just so I have a way to contact them. But it’s time to become one of the low-activity Facebook users and move to using it for a purpose, not just for the sake of using it.
The Freshman's Alphabet Waltz

You, in a world of expanding diameter
Here's some advice in dactylic tetrameter
A is abandon, so leave your old fears behind
B is beginnings refresh and renew your mind
C is call home so your parents can sleep at night
D is discern and steer clear of what isn't right
E is expect the adjustment to challenge you
F is forgive minor cruelties that others do
G is go places and see things you never saw
H is your honor, a trustworthy inner law
I is "I think," so please think before saying it
J is use judgment, so pause and reflect a bit
K is seek knowledge that drives away mystery
L is let go of mistakes that are history
M is make sure that you sleep enough not to die
N is to never deceive yourself with a lie
O is occasional treats to anticipate
P is have patience for others to imitate
Q is for quiet time just between God and thee
R is your roommate; remember the line for P
S is stay here on the weekend and grow some roots
T is to tame your tongue; don't be a smartyboots
U is umbrellas are useless in Oregon
Soon as you put yours away it'll pour again
V is brief victories, followed by war again
W is weekends; don't play till your work is done
X is exhibit good manners to everyone
Y is your conscience; don't do what it won't allow
Z is for zero regrets twenty years from now
More and more I get the disturbing feeling that what I teach in the classroom has a lot in common with multivitamins, and not in a good way. I gave up on multivitamins about a year ago: I'd read accounts for and against them to get a sense of how the evidence stacked up, and it finally swayed me to the view that they do little besides give Americans the most expensive pee on earth. In fact, a lot of things we do when our health is squarely in the center of our attention have little effect; health, whether good or bad, is accumulated via very long waves of habit and behavior, some of which stretch back before our birth. Much of what we're up against, health-wise, is written in our genes.

That's actually not what got me to thinking this morning, but the analogy is striking. What stirred me up was yet another mention of prior knowledge as a pivotal factor in reading effectiveness. Put plainly, guiding a student to becoming a good reader has less to do with technique, SQ3R, instruction, drills, or anything along those lines than it has to do with simply knowing a good deal about a lot of things. People will find passages more difficult to read if they don't have a foundation of knowledge about the subject, and this degree of difficulty dwarfs verbal skill and instruction as a predictor of reading comprehension.

Me being in the communication racket and all, I immediately see parallels in my field. One interpersonal communication theory, Uncertainty Reduction, says that we communicate for the purpose of reducing uncertainty and beefing up the baseline from which we interpret, explain and predict others' behavior. For the most part it tracks the effect prior knowledge has on reading effectiveness, but with reference to conversation and other forms of relational communication.

That idea has a couple of huge implications for teaching.
I work my hiney off trying to beef up people's communication competence, trying to put them through their paces at communication behaviors and skills that will help them reach out to others more effectively and appropriately, but the truth is that all this concern with technique is a tiny splinter in the huge beam that is situational or contextual knowledge. I'm not giving back my paycheck or anything, but it is a bit humbling.

The other implication was taken up by E. D. Hirsch in a book I was reading this morning. He makes the argument that in the early years of primary school, we need to teach students a much more uniform foundation of core knowledge to help them achieve cultural literacy. And he begins by acknowledging that this runs into trouble with people who are committed to making public education diverse and multicultural. According to him, diversity in the delivery of cultural artifacts is like teaching thirty different students in your English class thirty different versions of the alphabet: laudable in the abstract, but an invitation to chaos when it comes to the simplest learning skills that they'll need later on.

His argument has some appeal, but I'm not convinced. It reminds me of a couple of things I take up in my classes, one of which is the controversy over African-American vernacular English, more commonly known as Ebonics. The way I explain it to my students is that if you've got in your classroom a bunch of kids whose co-cultural heritage gives them a shared way of speaking, then you have no hope of teaching them a different way to speak if your approach is to say "Your way is lazy and wrong, and must be replaced by intelligent, right speaking." Instead, what teachers should do is invite students to become bilingual.
AAVE is an internally consistent dialect, but there's another dialect, Standard Spoken English, that ranges between useful and indispensable in workplace situations, so it's worthwhile to learn it as a marketable skill, same as bookkeeping, to have available for use, rather than to change the worth of anyone's identity. And I think that distinction is important to maintain when we get to thinking about context and background knowledge. Hirsch's argument about the democratizing effect of a cultural core does homogenize and artificially normalize too many ideas held by the dominant group in a way that is false to fact, but if we keep our focus squarely on the usefulness of shared knowledge, as set apart from the correctness of that knowledge, the dangers that come with that homogenization might recede a bit.

The other thing it makes me think of is the never-ending tension between objective and interpretive perspectives on communication. I've written about this elsewhere, and my students have heard me talk the idea to death: some elements of communication can be measured empirically, while others can only be reported as experience, which some hearers say they share to varying degrees of fidelity, but which can't be captured and bottled. No one understands communication if they devote all their attention to one or the other of those two perspectives. Reasoning from that, I think it's probably true that we've neglected the importance of context and background knowledge, but to go so far as to say they're all that matters runs along the same lines as saying the measurable elements of communication tell us everything we need to know about how it works, which is downright silly.
So these are wobbly ideas that are trying to find a balance: in some ways, this tracks the theory-practice dialectic that's coming up over and over again in my work with intraprofessional controversies, because background knowledge is what we accrue inductively through practice, while technique is quite similar to theory: a recipe for behaving, as compared to a recipe for knowing. And as with most things communication-related, it's a bit of a mess. But as with most such things, it's also fascinating and fun to work through, and the more years I do it, the more I enjoy it.
One thing surgeons and serial killers have in common is that people who are squeamish about the sight of blood are that much less likely to become either one. Similarly, one thing police officers have in common with humanity's biggest bungholes is that both tend to be comfortable asserting authority. If you're not the kind of person who can do that, you might have other faults, but there's a glass ceiling separating you from the pinnacle of obnoxiousness. It also limits your career potential in law enforcement.

At this point, it would be easy for you to get the wrong impression. This is not an anti-police officer piece. Quite the opposite actually; as I write this, I'm caught up in a burst of impatience at how deeply and powerfully the anti-police officer feelings run in this town.

Eugene is a town of aging hippies. By and large, hippies don't warm up to people in uniforms who tell them what they may and may not do. Plus, some hippies, although not quite all, gravitate toward recreational activities that are very illegal. Eugene is also a town with a critical mass of citizens who identify themselves politically as left-wing. For that reason, they're very skeptical of appeals to law and order, and believe police activity usually is orchestrated, and almost entirely behind the scenes, to benefit those who have spent many generations in the wealthy and powerful class, and intend to stay there and to keep out anyone who looks, thinks, or lives differently. And Eugene is a college town. The traditional college-age population is finely situated to be anti-police for the same reason that so many of them go through a rough patch with their parents: they feel ready for complete autonomy, but chafe under the last bits of parental authority, and the friction between those two states builds and builds until something gives.

The output of the above factors, and probably a few others I haven't considered, is a seemingly endless flood of anti-police invective.
Lots and lots of people here in this town hate the police. All police. And that goes hurtling past silly, far into the realm of the outright asinine.

Not all police officers are bullies. Not all police officers have a dysfunctional need to give orders, intimidate, demonstrate their power; far from all of them do. But too many of my neighbors and associates put on a convincing imitation of two year olds who fear and hate being vaccinated: they've got a keen memory of a few incidents that involved pain, and they therefore refuse to grapple with the reality that one moment of unpleasantness is probably a small price to pay for protection against a slow death, quite likely dragged through racking, lingering, hellish torment. Two year olds are shortsighted because they're two years old. But at some point, the two year old worldview has to give way, one hopes, to adulthood.

Even given occasions when a police officer behaves badly, it's just absurd to conclude that that police officer, let alone all police officers, carries that as a deeply engraved personality trait. Police work is grueling, and the bad days are unimaginable. A police officer who's gruff during a traffic stop may still be shaking from their own brush with mortality, while grieving the recent loss to violent death of a good friend, or even several good friends. I know how much of my civility I misplace after a day when students have talked back to me, or even just been sluggish in class, so I don't feel as though I'm in any position to hold them to a standard that's ridiculously higher when their working conditions are ridiculously more stressful.

And yes, obviously there are bad police officers. There are also bad plumbers, bus drivers, house painters, pastors, and weasel shavers. But not nearly as many people are willing to condemn those entire professions based on nothing but a few experiences, backed up with images on TV and the trash talk of immature friends.
(Well, maybe pastors, but not the rest.)

The funny thing I'm left wondering is whether this marks the beginning of a swing back to where I started my adult life. I came out of high school far more conservative, politically, than ninety-nine out of a hundred people you'll ever meet. But the lifeblood of my education, from secondary to higher to postgraduate, was in debate, and I was slingshotted to the extreme opposite just by the painful experience of listening to arguments made badly. I went to Baylor, which was a pretty inbred nest of conservative thought, and it was like what I imagine musicians must suffer, having to listen to their favorite music being played sloppily and off-key for year after agonizing year. It soured me on what I'd originally believed. I've since maintained that I'm not really pro-Republican or pro-Democrat, not really pro-liberal or pro-conservative, but rather that I'm anti-bad argument. Throughout most of the Rush Limbaugh era, the boldest and most shameless blast of really embarrassingly bad arguments has come from the right, and that has kept me pinned against the opposite wing of politics. But it looks as though living in Eugene is starting to change that.

To an extent, I'll admit, I think this points to how much growing up I still have to do. The truth is that most of the musicians I know actually seem very patient with bad singers and performers. They apparently have the wisdom and kindness to rejoice at others' enjoyment, and to tame their own prissiness and pedantry enough to look past failures of execution to the overflowing heart that motivated the music. If I were a better person, I would be equally pleased to see the passionate engagement and boldness that drives people to enter into substantive conversation and at least attempt to stake out a defensible position. But that might be one of the marks that years in academia has left on me: hearing badly-made arguments still repels me.
It doesn't say good things about my allegiance to the truth, but it's a consistent pattern.

Have to wait and see where it sends me next, now that I'm here.
Today, I got my first look at my course evaluations for the Spring 2010 term. My Interpersonal evals looked good, the Public Speaking ones were extremely positive, and the evals from Listening Behavior tore the roof off the house. It seems that in the opinion of the students, each of those classes went very, very well. The one outlier was Communication Theory.

I've been thinking a lot about that class. It's only the second time I've taught it, and the first time it's been its own class, as opposed to a special problems. Back in January, I announced that there would be two tests in the class, both at the very, very end: one would be an objective test over all the theories we'd covered, and the other would be an essay test, for which I'd give them the essay questions beforehand. In fact, I posted the essay questions before the first class meeting, so they had fifteen weeks to craft their answers. I also posted a study guide for the objective test around the third or fourth week, and stopped talking about it.

Late in April, about a week before the objective test, I mentioned it. More than half the class looked very surprised. "We have a test next week?" Not only had I told them at the start and provided a study guide, but the test itself was on the syllabus calendar. In bold. Bright red. But it was a complete surprise to them. More than half the class failed the test, and on the course evaluations I just read, they pointed to the arrangement of the class, and that test in particular, as the reason they didn't think the class was well taught.

What this makes me think is that I'm damned if I do and damned if I don't.

I'm also the lead instructor for First Year Seminar, so we did focus groups and other such activities to find out how we could make that class a more useful, positive experience for incoming freshmen. What was the one message everyone agreed upon? What did they hammer into our heads? "Don't talk down to us or treat us like children.
We're adults, and you should show us the same respect you show each other." But what happens when I don't nag them every week to study for their comprehensive final, like mom nagging them to clean their room? Well, that means I don't understand their needs as learners. Their other repeated complaint, on the SSI and in feedback to our marketing firm, is that classes at NCU lack rigor. In this sense as well, my Theory students wound up unhappy receiving exactly what they'd asked for.

What they seem to think will happen in the workplace after they graduate is that their bosses will assign them only short tasks that fit within the attention span they choose to bring to bear, and whenever they do any longer-term work, their managers will manage their time for them. I don't think it works that way, but I suppose one of us is right, and if they are, things will work out. And if I'm right, they won't be able to say no one tried to teach them differently.

Understand that I don't, by any stretch, think the class went perfectly. I learned a lot of lessons about how to tackle that class, and I think it'll look quite different the next time I run it. And I do continue to turn over in my mind what they say, because it is dangerous to rush to judgment and assume my own perspective on the class is all that matters. What they wrote, and what I learned, have a year and a half to percolate through my mind before I have to gear up to do this again. But dangit, this is a class for majors! When I teach the general interest classes that draw people from every major on campus, I'm at peace with the reality that only some of what I talk about will strike a chord with them, and they'll pursue it and connect it with their own experiences and values, and retain that much. All the rest will go pouring out their other ear and be forgotten. But when it comes to Communication majors taking their survey of Communication Theories class, this is their toolbox.
These are the ideas that make up the backbone of the field of study. It is not acceptable to me that they "play school," that they go through the motions, that they cram for a test and forget what was on it as soon as they get to their second post-test beer. Not acceptable. If we talked about Coordinated Management of Meaning in January, then it's downright important that they still grasp CMM in May, and in August, and May of the following year, and on and on. If they disagree, too danged bad: time for me to be a granite wall in their path, and they can either change their ways, or else wipe out on my stubbornness.

And I can also say that this experience provides some measure of reassurance on a worry I nursed through most of last year: it's a bad sign when you're too popular with your students. I don't want them to like me too much right now; instead, I want the twenty-years-from-now version of them to look back and like how much they grew under my instruction. Their work ethic and responsibility are not a fraction of what they will be, and if I fit their expectations right now, then I'm lowballing terribly. With this class, I got a glimmer of hope that they encountered the level of expectation that will stretch them into their best selves.
In most activities people pursue with any appreciable intensity, there is a kernel of value embedded in an outer layer of utter absurdity.

I'll start backing up my assertion with the example of sports. I think any sport that picks up a serious measure of longevity has at its core an essence that is enjoyable and worthwhile. I think athletes who play those sports make great memories for themselves, form powerful bonds with teammates and rival competitors, and learn valuable lessons about discipline, teamwork, patience, humility, and the list goes on. But beyond that core lurk the toxins of popularity and money, and as soon as the sport picks up a sizable audience, whether regional, nationwide, or even global, the latent profit creates a bubble of false, distorted value that utterly skews the priorities of those who play it and those who follow it.

Oddly enough, virtually the same thing is true, straight down the line, for academic research. Any academic field that attracts a critical mass of researchers, plus consumers of the research, is definitely on to something. No matter how many hasty, lazy thinkers want to say we're nearing the end of science, and we know everything that needs to be known, we still discover every day new cracks and crevices of reality and human experience (not the same thing at all) that bear examining. But the way academic research works at universities, those crevices become veins of valuable ore to be mined for profit, until almost overnight the scholars are producing obscure, silly, contrived research projects that have vanishingly small power to change anyone's life for the better.
I trained in the doctoral program of a Research I university, and my professors all assumed I'd go to a huge state school, crank out five or six journal articles each year, teach at most a single class, and effectively work in a think tank, surrounded by grad students who were my research disciples.

Didn't quite turn out that way.

You see, one of the few pursuits that I don't think is a core of value surrounded by a cocoon of absurdity is teaching. Teaching, from inside to out, is pure value. It's definitely the case that learning can range from worthwhile to absurd, and the unbalanced relationship ultra-orthodox Judaism has with Talmud study merely for study's sake has lately underscored that for me. But teaching, as far as I'm concerned, as far as my reasoning can take me, is worthwhile all the way through.

And notice what I do for a living. I dabble in research on the side, and I even serve the athletic program at my college. But the kernel of what I do is teaching. Teaching makes everything else run; teaching is what defines me. It's my top priority. It's easily the most worthwhile, least counterfeit enterprise I pursue, and I believe it's the most world-changing outlet for my energies that God provides.

I take great comfort in that. I think it's probably my best protection against burnout.
Yesterday I had a moment of sledgehammer empathy.

To begin with, I was a very mediocre debater. I had to work incredibly hard to rack up what little competitive success I did finally wind up with. And since I was on one of the very best college debate squads in the nation, that meant I was reminded on a daily basis how far below most of my teammates I was in talent and skill. In part, this was healthy, because all my life I'd been one of the smarter kids, and things had come to me easily, so this showed me how small a pond I'd swum in, and gave me more realistic measures of how I stacked up against the rest of the world. It also lit a fire under me to work harder, which revolutionized my daily routine: since I'd been very young, I'd gotten by despite underperforming. Now there was something I craved, and all the effort I could possibly devote to it resulted only in inching progress. In many ways, that matured me, and made what I do today possible.

Those were the nice parts, but there were dark and painful parts as well. I remember how it felt not to measure up. I remember being a junior, and then a senior, and still not making the elimination rounds at the major tournaments, or winning big debates against good teams. I remember what a ripping, tearing sensation it was to grapple with the reality that my best simply wasn't good enough, that this was something that just wasn't in me. I could put everything else in my life aside and do nothing but debate, but I would never be anything but mediocre. I remember fantasies and daydreams that on particular occasions shriveled up and died, as I figured out that they were out of my reach no matter how hard I tried.

So where does the empathy come in?

Yesterday I had a student visit my office who'd done badly in one of my classes. The student is on academic probation, and is now facing academic dismissal from the school.
In the past, I've been pretty clinical about this: college is not for everyone, and if someone's combined maturity level and intellectual chops don't, after repeated chances, produce the caliber of work required, then it's appropriate and even healthy to remove them, to steer them onto another path. But too often I pull back, brace myself for the tears and obvious grief, go robotically into my soft, gentle voice and relaxed eyes, and simply try to wait them out. Too often I judge them, condemn them, sit still and attentively while they speak their piece, and wonder in my head how much longer it will take. Even if what I do is necessary and proper, and I do believe it is, the detachment from real pain and real grief is not acceptable. I have been guilty of it, and I repent of it. It's not enough just to put on a display of sympathy. It's not enough to show "appropriate empathic concern." If I'm not willing to endure the hurt, to call to mind exactly what scars that hurt left on me, then I don't have enough motivation, enough passion, to tackle the teaching enterprise and give my all to helping them find a way to grow into and through the challenge of college. I do have to keep firmly before me the reality that the pain grew me, that the pain was necessary, but if I fall into the trap of becoming too blithe, too flippant, too much of a spectator and not enough of a participant, then I can't be the teacher I want to be.

And yes, effectively what I'm writing here is "Bring on the burnout." This is a recipe for shortening my teaching career. But no one ever said I should return what God gave me unmarked and undamaged. If I don't guard myself enough, then I can't be available to students in years to come, but if I make the converse error of guarding myself too much, then I become complicit in cruelty, and as far as I'm concerned, the proper direction to err is obvious.
There's something downright Heisenbergian about the question of whether any of us were wanted, or planned, by our parents.

I use that term advisedly because in my case, I both was and wasn't. I know absolutely for sure that I was conceived accidentally, as a result of a failure of birth control. I also know absolutely for sure that my parents wanted me very badly, that they put a lot of planning and thought and patient waiting into the enterprise of bringing me into their household.

I know these things because I'm adopted. My mother has related enough details about the biologicals and their situation that I know my conception was an unwanted surprise. But I also know my parents were fully invested in the project of raising me from moment one.

For most people, the question is cloaked in mystery. It's not an easy topic to take up with one's parents. Amusingly enough, I do know about my pastor, because his father, the pastor emeritus, shared with us the answer to the question in the middle of a sermon. No, they hadn't planned on him. But for the rest, I'm not sure how it could be dropped smoothly into conversation without kicking up a fair amount of discomfort. And I'm not sure how completely it would be possible to believe a positive answer. It's not possible to accidentally adopt a child, but the only evidence that a borne child was planned and wanted is self-reporting, which is necessarily shaky.

I find nicely ticklish the fact that most people do not know the answer to this question, and at the same time that I do know the answer, and that it's really two answers, and that they're opposites.

I like that. It's just sort of the way the world is.