My profession demands infinite alternate explanations. Teaching young writers, I exchange one description for another and turn to what something is like instead of what it is. A research paper is baking a cake, passage analysis is throwing a pebble in the pond, a writer must swing, as Tarzan does, from vines chosen in advance.
I am Mr. Analogy.
I thought about my status when I encountered a post online distinguishing between “reasoning from first principles” and “reasoning by analogy.” The author resorts to an analogy himself in saying that using principles makes you a chef and using analogies makes you a cook. The chef is a scientist who combines ingredients anew. The cook will “look at the way things are already done and… essentially copy it.” The cook might adjust the recipe in a minor way but follows an established approach.
This analogy makes me a cook, and I don’t know how I feel about that.
My first impulse is to extend the comparison. Cooks seek practicality and reliable results. They repeat themselves, sure, but they also hone their approach until every element is just right. If I only have so much to teach about writing—only what I understand and accept—I’d better learn to express it in tried and true ways. My students can take or leave what I have to say. I’m only trying to help.
But I’m defensive. I recognize that, to real writers, each task is a fresh challenge that demands new solutions. They never imitate themselves or settle into a monotonous voice. Maybe my cookery demands compliance instead of genius. Perhaps I should stop saying detail and explanation are like bricks and mortar or that, like a knife, a specific supporting detail can grow dull if it’s used for more than one purpose.
You see I can’t stop. The post I read says, “Your reasoning process can usually be boiled down to fundamentally chef-like or fundamentally cook-like. Creating vs. copying. Originality vs. conformity.” At this stage of my teaching career, I’m too tired to reinvent much. I tell myself what’s worked before is still working. I keep my head down and cook.
Analogies, I figure, demand a specific sort of intelligence, one connecting tasks, appealing to common skills, common patterns of thought and application. The analogy-maker hopes for another avenue of discovery, unfamiliar and familiar at the same time. I want my students to say or feel, “I never thought of it that way.” Of course I have thought of it that way, or I wouldn’t trot out the same exhausted comparisons. I just can’t help it.
The explanation becomes new to me if it’s new to them.
Back when Big Chief tablets reigned, I only had to make my pencil rise and fall between the blue horizontal lines to call myself a writer, and what letters described hardly mattered—a boy, a girl, a dog, a hat, some short verbs. Words were unsure of themselves. They carried little inherent meaning. They sat slack-jawed, evidential.
At each stage of education, however, I burdened words more and more. When they started to disappear beneath their loads of thoughts, my teachers called me a “writer.” At first, the label must have been aspirational, designed to puff up my ambition and flatter my “potential.” But what passed for thought was still often evidential, the mental equivalent of “See?”
There’s no defining what happened next because some of it—like the poetry and hand-wringing prose of middle and high school “journals”—happened during. Along the way, words asserted themselves again, insisting on their beauty, crying to be arranged. I began to call myself a writer, and thoughts became my thoughts, which only the right words could describe. Compositions meant to evidence the voice and mind behind them. Foolishly or selfishly or both, I needed to write and, intermittently, believed the world needed to read me.
You write, writers are told, because you can’t not. It’s a compulsion to be heard, and you go on shouting, speaking, or whispering because you must. You wouldn’t be yourself without something auxiliary to yourself, an outrigger of words built just so. The siren of art calls you onto the rocks, and you give yourself to a doom worth embracing. You get an MFA.
But I wonder lately if I’m over that vision of writing. Like walking or breathing, writing is something we do, and, like walking and breathing, the quality of the act appears only at extremes. For writers like me who reside between failure and success, as much energy goes into convincing ourselves we’re special as goes into craft. Reading others’ work, I see some craft is clearly virtuous, is clearly real. And some writers’ faith is redeemed whether the craft is real or not. Outside those two states, though, writers endure. My endurance has run down.
John Berryman famously said no writer will ever know if he or she is any good or not. It’s true you’ll never be certain because you occupy only your own mind, but not-knowing seems more critical now than good or bad. Ambitious writers cling to hope, dreaming of wordless poems or a finally ideal expression of personal truths. “Who knows?” they think.
Not-knowing is a talent I’ve never possessed for long. Because, most of the time now, whether I’m accurate or not, I think I do know. At least, I’ve read enough great writing that pausing between conception and execution usually assures execution never occurs. Generally, I’m okay with that. I’m working on not-caring. Let others want to be authors.
The urge remains—I’m here now, after all—but it’s an urge, not a compulsion. The reason I write, when I write at all, is that I like to. I’m more at peace with putting my pencil down.
When no real or virtual stack of grading awaits me, when no other deadline looms, when I have time to read carefully, annotate thoroughly, and plan thoughtfully and creatively, I love class.
Question and response and further question and further response come to resemble an intricate, entirely improvised dance. There’s inference and implication and irony and laughter. There’s progress toward answers we didn’t know we wanted, and the slightest signal drops discussion into another, more consequential dimension. Even un-staged epiphanies seem meant to be.
Many teachers must feel as I do. Class time is the pounding heart of teaching that sustains the rest. For me, even after almost 40 years in classrooms, it’s the only part of the job that makes me feel competent. The rest is ash.
My school has a curious custom. At the end of each period, after students gather up their papers, re-zip their laptop covers, and file everything away in overstuffed backpacks, they—almost all of them—stop to tell their teacher “Thank you.”
I’ve never experienced such widespread and ready thanks at any other school where I’ve taught. I’ve asked students new to our school whether that was the convention where they were before, and many say no. We’re an independent school—read: a private school—and admissions people sometimes tout this thanking habit as proof of the special teacher-student relationship here. Everyone, it seems, marvels at this ritual. Most of my colleagues espouse gratitude for this gratitude. They love being thanked.
For some reason, I hate it. I’m reluctant to tell students, but I wish they wouldn’t thank me.
The expression “thank you” looks outward. It contains only the second-person singular pronoun “you” and thus appears selfless. It says, “you deserve thanks,” which suggests it’s all about that offering, all about approval, all about appreciation. Yet, if you listen too closely, you hear the understood “I” at the head of the clause, “I thank you.” A gift can begin to sound like a contract—not clear payment for services exactly, but a transaction nonetheless. Heard from that corner, “Thank you” says, “You’ve been paid. I have paid you.”
The Princess Bride begins with the backstory of Buttercup and Westley’s love. She relishes bossing the farm boy around, and he always replies “As you wish.” However, we soon learn his answer is code. The tasks grow simpler and simpler until she asks him to retrieve a pitcher well within her reach. Westley fulfills her desire with “As you wish.” “That day,” the narration reports, “she was amazed to discover when he was saying ‘as you wish,’ what he meant was, ‘I love you’.”
The moment’s indirection is beautiful because it relies on Buttercup hearing Westley say he loves her and not on his saying it. Love is in the reception and not the transmission.
I wonder what I might think if my students didn’t thank me.
People who grow up, as I did, with the maxim “If you can’t say anything nice, don’t say anything at all” are prone to hear silence as censure.
My emotional memory is deep enough to recall how torturous high school can be. The details of that time might have fled, but the romantic rejections, the relentless assaults on any belief in my academic, athletic, and artistic worth are still with me.
My senior year, I barely dammed tears when I received less than I expected—the score that should have been mine or indifference that, in light of my earnestness, felt like cruelty. Classmates more insulated by ego weren’t so sensitive, but we all rode waves of confirmation and doubt. I remember.
Do my students ride the same waves? I’m not sure, but my interactions with them assume so. If their high school years are like mine, what they need is for their emotions to be accepted and, as well as I can, valued. Who knows if they do feel I value them, but I hope they feel seen.
Even as, the older I get, the less they may see me.
Occasionally, I try to tell my classes that I don’t like being thanked, but there’s no proper way to say so.
If I say, “Don’t thank me, it’s my job,” it sounds like I’m saying teaching is only my job.
If I say “Don’t thank me, it’s unnecessary,” it sounds like I’m diminishing their gratitude, that I don’t appreciate their appreciation enough.
If I say “Don’t thank me, it’s embarrassing,” I risk an unprofessional confession.
If I say, “Don’t thank me, I don’t deserve it,” which too often comes too close to the truth, they think I’m asking them to dispute it.
One deflection is to string together all the forms of “You’re welcome” I know. The more people thank me, the more ridiculous it sounds.
“You’re welcome, any time, my pleasure, it’s nothing, thank you, think nothing of it, a trifle.”
We study vignettes in my senior writing elective, and, after a longer reading of six vignettes, I asked my students to pretend they were determining “The Vignies,” an imaginary award for vignettes aligned with the Oscars, Grammys, or Tonys. They were to name winners in categories like “Top Vignette for Creating an Intimate Connection with a Reader” and “Greatest Mystery of What Was NOT Said (and yet WAS said, in a way… sort of).” They needed to write an awards-show-style speech announcing their selection and how they reached their decision.
It took some coaxing to get them to play along, but they did ultimately buy in, cooperating not just in the over-the-top fiction of those speeches but in the “we was robbed!” responses I insisted they make on behalf of spurned vignettes.
Forty minutes later, the day felt productive. I’d compelled them to scrutinize the reading, to make some thoughtful judgments, and to think about the bigger matter of how vignettes operate. Some of the speeches were funny too.
And, as they exited, several seniors thanked me.
Recently at my school, students have been secretly recording teachers with their phone cameras, then posting the results online. For the faculty, this behavior creates consternation. Some recorders must mean to show how funny or engaging we are, but others are malicious, hoping to show the opposite—how inept or clueless we are.
I’m sure they’ve focused their cameras on me and can only hope that, on balance, I’ve come across well. Made aware of what they’re up to, however, I wonder how many thanked me afterwards.
It occurs to me that, if thanks are transactions, both parties need to believe, the one thanking and the one being thanked.
At this stage of my teaching career, I can’t look for the attention younger colleagues garner. I probably won’t be asked to give another commencement speech. The fellowships and travel grants my school awards will likely land elsewhere, and I can’t fathom what performance might be enough to add my name to the plaque that designates my school’s best teacher each year. Only retiring might convince students to dedicate the yearbook to me.
I’m not insensitive to praise—who could be? And sometimes I’m haunted by the last line of James Wright’s poem, “Lying in a Hammock at William Duffy’s Farm in Pine Island, Minnesota.”
“I have wasted my life,” it says.
All these thanks and still… perhaps the problem is me.
Desire, the Buddha says, is suffering, but what of half desires? What about all you want and, at the same time, don’t?
In seventh grade, I was in what-I-thought-love with Nita Stroud. She seemed to care about me when I didn’t care much for myself, and my desperation soared to quite unquiet protests of affection. When she broke up by telling me I was “too intense,” I remember feeling confused. Was I relieved, even happy? I’m still not sure.
Desiring nothing means getting everything. By that standard, even a half desire can’t satisfy.
One day one of my students—I’ll call him John—lingered after class. He’s the one who asked me to write this essay. I was grumbling again over being thanked, how I perhaps should (but didn’t) know what students felt when they said “thank you.” I should write something, I told him, to figure out the source of my ambivalence.
“I’d read that essay,” John said.
These close moments with students are rare. My colleagues tell me I’m “respected” and a student “had a good experience with me.” I don’t know how to read these compliments. What I want is a sure sign I’m reaching someone after all this time. Yet, that’s not something any teacher can expect. I’ve been to many conferences where we teachers receive a pen, some papers, and a command, “Write about a teacher you meant to thank and didn’t.”
I’ve found something to say and someone to say it to. I recognize which teachers made me. At the time though, the hour passed. Another session demanded I move on.
Many days, I walk to school. It’s no mean distance, two miles or so, but it’s a division between home and work. This time of year, it’s dark, and I barely hear anything other than my steps, barely see anything other than threadbare traffic similarly drawn to starting earlier and better.
Teaching has been my singular devotion. I’d label it “a calling,” if I could be so melodramatic. After all this time, I want—too much—for the sacrifice of money and stature to mean something. I’d like to place my worth on another scale. Still I think, “I could have made more. I could have been more.”
During my own schooling and in my current school, smiles pass between students and teachers, a spotlight of kindness illuminating and redeeming shared troubles. In that, somewhere, are thanks. I’m just unsure how to believe it.
Cetologists identify whales by the scars and general wear of their flukes, their idiosyncratic calling cards slipping into the deep again, marking years by disappearance and reappearance. When we moved in May, I thought I’d do the same with pigeons.
They must be as distinctive, I figured, and I meant to know my new neighborhood by its non-human citizens. For a time, on every walk exploring the new streets around us, I scrutinized each bird that lingered on the sidewalk. I meant to memorize a few, sure I’d meet some familiarity eventually.
Of course, I failed. The proliferation of pigeon colors and patterns can’t be captured by one mind, at least not one as small as mine. Even if I thought I remembered their odd, mixed variations of gray and white and brown, who could be sure? Was this pigeon a friend?
Over the last eight months, since abandoning my blog, I’ve written little, only haiku, and part of me discounts those seventeen (or so) syllables as frivolity, too easy to matter for much. They’re pigeons, perhaps beautiful if you’re prone to scrutinize but likely just another square of a sea’s surface or a patch of sky… more of the same.
Ezra Pound, a great lover of haiku, said, “The image itself is speech. The image is the word beyond formulated language.” If so, I wonder what those images add up to and how a person might turn so many disparate moments into anything comprehensive or consoling.
In May of 2015, I wrote a haiku,
enough raindrops will
wet this field
Maybe. If nothing else, belief intends sense. Each haiku promises content, however fleeting. My pigeon friends gather en masse in a parking lot near where I live. A step in their direction sends them wheeling into the air, and every distinction between them vanishes in shuddering wings and new perspectives of their flight. They’re no longer verifiably separate.
If haiku accomplish so much, perhaps that’s enough. When I was four, I remember scooting along the curb after a storm, my feet driving a wave of rainwater ahead of me. That instant persists because I seldom feel such power now. I’d like to write something substantial—a novel, a poem worthy of public attention, a collection of essays or short stories—and instead settle for the fitful awareness in haiku—they might add up, or, at the other extreme, one will be the apparition of faces in a crowd, petals plastered against a background making them visible at last.
One of Basho’s loveliest haiku reads,
in a world of one color
the sound of the wind
Aren’t we always hoping for that, connectedness and singularity, belonging and the strange joy of feeling so?
I saw a pigeon recently I was sure I’d recall. It was ginger rather than gray, and one wing feather was a white vee, the other not. Turning to me as if it knew me, its strut faced my direction. I thought it spoke, issuing a challenge to be known and understood.
No haiku occurred to me, but I knew then what haiku is.
Personal essays require believing you’re a valuable subject. The principal justification for writing about yourself comes from the granddaddy of personal essayists, Michel de Montaigne, who said individual experience is never purely individual. He believed, “Every man bears the whole stamp of the human condition.” And—if you accept his premise—the particular, paradoxically, illuminates the universal.
Philip Lopate goes further in his introduction to The Art of the Personal Essay by urging confession. Confession garners trust because, “The spectacle of baring the naked soul,” he says, “is meant to awaken the sympathy of the reader, who is apt to forgive the essayist’s self-absorption in return for the warmth of his or her candor.” In indicting yourself, the thinking goes, you must be honest.
If you’re sincere, your “indictment” might include confusion and the hopelessness of ever deciding anything definitively. Admitting you don’t (and maybe can’t) understand could be part of every essay, especially if you undertake issues or questions hoping to resolve them. Montaigne said, “Anyone who studies himself attentively finds in himself and in his very judgment this whirring about and this discordancy.” He also says, “There is nothing I can say about myself as a whole simply and completely, without intermingling and admixture.” Yet confusion will likely frustrate your reader as much as you. Sympathy has limits. You’re supposed to say something worthy or why write? Expressing your finite intelligence isn’t helpful or winning or impressive.
What is? You can’t be sure. Personal essays involve inventing a tolerant audience willing to sympathize with tortuous, circular, and equivocal ruminations, fellow feeling that maybe might occur if your thoughts are new, relevant, incisive, clever, amusing. You could be the worst judge though, and not know it. Just as the tone deaf are least qualified to assess the quality of their own voices, you may sing on, missing cues signaling how discordant or flat you are. And any response, even the most muted and mixed, could produce disproportionate effects. Someone smiles or smirks, and you think, “Ah. I’ve said something. I’m communicating. An ear is listening at the other end of this line, after all.”
The high-wire risk of personal essays is faith. You pray you’re perching on insight. Keep going, write enough, and you’re sure to… you think. Life is finite, you think. One life may be different, you think, but, if you try hard enough or long enough, you’ll reach some truth, minor and irrelevant as it might be. Sure, quantity can be the enemy of impact, yet—you think—you’re an exception.
So you tread on. You reach your foot forward praying for something like solid ground or a great uplift of wind to keep you from falling.
I value critics, but some take the job—and themselves—so seriously they go beyond illuminating their subject. Instead, they hint at their superior understanding. They assume awareness greater than those they criticize. They sound smug or condescending or dismissive and thus elicit criticism themselves.
In these publicity-hungry, hot-headed times, we’re accustomed to vehement critics. How valuable can a half-hearted viewpoint be, after all? Yet egotism often poisons criticism. Confidence helps, but self-assurance without self-awareness reveals ignorance akin to the cluelessness it denounces. Instead of discernment, the critic’s motives come first. Yet fighting over rectitude rarely convinces anyone. It rarely exposes something hidden and important. I wish all our social critics were a little less vociferous, but I prefer Jon Stewart’s dissections to the rants of Sean Hannity, Bill Maher, and Bill O’Reilly.
Printers’ Row, the book supplement associated with The Chicago Tribune, recently started a new feature called “Time Machine” offering old Tribune reviews of famous books. The first entry was H. L. Mencken’s response to The Great Gatsby, which I encountered with some skepticism. I mostly admire Fitzgerald and the novel, and the little I’ve read from and about Mencken fills me with ambivalence. Sometimes he’s witty, incisive, and unstinting. Sometimes he’s sarcastic, biting, and petty. And this review evoked both reactions—demonstrating, for me, when criticism does and doesn’t work.
In this case, I should say, “Doesn’t and does,” for Mencken swings his sword wildly in his opening before calming down to say something valuable. He calls the novel “no more than a glorified anecdote,” and writes off Gatsby as “a clown” and the other characters as “marionettes—often astonishingly lifelike, but nevertheless not quite alive.” In the end, he says, “The immense house of the Great Gatsby stands idle, its bedrooms given over to the bat and the owl, its cocktail shakers dry. The curtain lurches down.”
Maybe Mencken wanted to launch with a blast of his characteristic vitriol, but he seems so self-satisfied. As muscular as Mencken’s prose is and as much as I get his perspective, he speaks to those who enjoy (as Warren Buffett put it) “interpreting all new information so that their prior conclusions remain intact.”
Granted, that’s most humans, but you either revel in his savagery or put the review aside immediately. If you’ve read the novel and agree, fine. If you haven’t, the critic’s snark is all you get. Illustrating broad proclamations is tricky, nigh impossible. Yet, if proof is impractical and explanation superfluous, only empty assertions remain.
Many of our pundits, politicians, and television personalities operate similarly. No longer inhabiting a three- or four-network world, we all have our shows. Whether to the left or right side of blue or red, you need never challenge prior conclusions. You can luxuriate in the affirmation of your disgust. Meanwhile, thought and self-examination suffer. Mencken described the U.S. as a “boobocracy,” ruled by the uninformed. We’re no longer quite that (because it’s hard to be uninformed in a nation saturated with media), but we can bask in the sneering certainty of the critics we accept, which may be worse.
Mencken’s appraisal of Fitzgerald improves after his initial salvo, not because he begins to give the book some credit—Mencken continues to assert rather than demonstrate or prove—but because he uses the book to address the practice of writing, a subject bigger than the author, the novel, and the critic.
At first, Fitzgerald chiefly receives faint praise for improvement. According to Mencken, Fitzgerald’s earlier writing was “slipshod—at times almost illiterate” and “devoid of any feeling for the color and savor of words.” Then, however, Mencken stops punching Fitzgerald, whose progress is, to Mencken, “of an order not witnessed in American writers; and seldom, indeed, in those who start out with popular success.” Mencken’s point also stops being personal. It tackles artistry and success, how the latter blunts the ambition of the former. The popular author who has “struck the bull’s-eye once” may stop learning new techniques, Mencken says, and undergo “a gradual degeneration of whatever talent he had at the beginning. He begins to imitate himself. He peters out.”
Which seems, to me, wise and well-put. Mencken is no longer talking about Fitzgerald at all, but about the temptations and pitfalls of popular fiction. Fitzgerald, in Mencken’s telling, is the opposite of that scenario: an author of little early talent who achieves success and then labors to improve. He is the exception to a rule. Having dropped insults, Mencken also abandons dismissing The Great Gatsby and turns to what’s in it. He notes Fitzgerald’s interest in the elite’s “idiotic pursuit of sensation, their almost incredible stupidity and triviality.” Mencken’s statement that “these are the things that go into his [Fitzgerald’s] notebook” marks a shift toward description and criticism’s real power, its capacity for careful observation and valuable distinctions.
I wish all criticism were as thoughtful as those last few paragraphs and that all critics might leave off hollering to speak in more audible tones. I know that’s less entertaining, and maybe it’s our nature to slip into ad hominem. Yet, to me, criticism seems most effective when it’s respectful. Critics don’t have to love everything—that’d be a different evil—but it’d be nice if they made their work about their subject and not about self-righteousness.
My wife and I sat at a picnic table, and next to us were three strangers eating in advance of the same outdoor Shakespeare performance we were attending.
One of them asked the other about a daughter who recently graduated from college, and she answered, “My daughter wants to be a writer.”
“Has she published anything?” the first said.
“No. Right now, she has a blog.”
I tried not to spy but didn’t need to look over to hear the message behind the answer—embarrassment, putting a positive face on the only response possible. She might have substituted, “No, not yet… but, you know, she’s pretending.”
That’s the trouble with blogging. Anything in magazines, journals, newspapers, books, or even commercial promotions comes with verification. Some authority says this writing deserves notice. In contrast, posts only require clicking “publish,” a faint stamp of approval that—most people assume—comes too readily. Based on this overheard conversation, the writer-daughter takes herself seriously, maybe thinks a great deal of her own work. The rest is up for grabs.
Any blogger’s vindication of blogs sounds like rationalization, further effort to gild the author’s own work. I felt for this girl’s mother. Naturally, a mom wants to believe, and, though blogging is hardly the same as appearing in The New Yorker or even the local paper, her daughter means to ply her craft, to pursue a dream, to practice by taking baby steps toward something brag-worthy. More than that, she may want to be read, and creating a blog assures a voice and audience… albeit a limited, often intimate audience. Which, she may think, isn’t so bad and certainly better than no readers. She might even like blogging and regard it as a distinct form with idiosyncratic challenges and potential.
Eavesdropping, I couldn’t help thinking about this blog as it approaches its 500th post. Am I still, after all this time, practicing for something real? Am I more proud (and appreciative) than I ought to be of my tiny audience? Am I alone in valuing my labor while real writers snicker? Have I, all along, been deluding myself to avoid actual evaluation and accomplishment? Does self-expression only count when someone else says it does?
This week a colleague posted on Facebook, “I’m writing everywhere else but on my blog, which means I’m finally working. I won’t be stopped.” In no way did he mean to direct the comment at me, but my spirit sank nonetheless. My inner Rodney Dangerfield started muttering, “I get no respect. I get no respect at all.”
He meant, I’m sure, to say his blog has faded as more public writing projects took precedence, but the assumption seemed to be—or my defensiveness heard—you can’t be serious and simply blog. Blogging is what you do while waiting for anything better. In itself, as a writing genre (if it is), it sometimes seems the equivalent of copy printed on grocery-brand macaroni and cheese. Though cute, it hardly counts.
A fury of counterarguments rears: if you’re not a published writer does it mean more or less that people choose to read you (based necessarily on content rather than name, reputation or designation by Important People)? What sort of motive to write takes precedence when fame and remuneration are unlikely? Do readers from the Philippines, India, Botswana, and Latvia counterbalance having a small audience? What does it say when readers feel compelled to comment fresh from encountering ideas—can that be bad?
But those are framed questions, as all my questions are. They dig the hole (from which I shout) deeper. They evoke that unfortunate parent proffering her daughter’s blog as proof she’s a writer.
Perhaps there’s no satisfactory vindication or apology. As seriously and carefully as bloggers compose, the possibility lurks they have no place else to be writers and their only claim to the title is one they’ve asserted themselves.
Although, to me, these essays, stories, poems, and haiku feel quite real.
The technical meaning of the word “feedback” doesn’t exactly match its colloquial meaning. In acoustics, feedback is sound doubling back, fuzz reverberating in dammed sound waves. When teachers or other evaluators use “feedback,” by contrast, they mean to say something new, something missed or unnoticed. In sound, feedback is a sort of echo. Teaching feedback says, “Here’s what you haven’t done… and should.”
Not everyone is good at receiving feedback. A teacher points out a glaring error, and suddenly the student’s competence is being questioned. The student’s face clouds. Maybe tears start. The tone of critique can make a big difference, and many teachers rely on “This work” over “You” because they want to emphasize the process over its author. They wish to make feedback an intellectual process, and, as long as any issue is repairable, it’s no reflection on the person who made the mistake. A student who can always improve his or her work—these teachers believe—receives even brutal critique as a ratification of ability and capacity for improvement.
Yet how students respond to feedback often rests more with their personalities than how they’re criticized. Someone burned before may not want to go near any stove, and someone with an insecure sense of self might be hypersensitive to even the mildest threat. Teachers often have to guess which category this individual is in and how he or she might respond. In other words, they need to know students. Sometimes that’s impossible, and yet, paradoxically, teaching means focusing not exclusively on work but on the workers’ feelings and investment.
And as the consequence of the work increases, the gravity of criticism grows. Discuss “the essay” with a student five days before the due date, and he or she might respond positively and hopefully. The day before the due date, some measure of reassurance may be necessary, not just “These repairs are doable” but “YOU can do these repairs.”
Perhaps all work is personal work, inseparable from the person who does it.
Bosses frequently neglect “You can do it” because, after all, employees are compensated for good work. It’s required. What’s more, a boss may think only producing matters. Though studies confirm over and over that output rises when a manager takes interest in developing skills and a worker feels valued and important, concern for employees as people seems too messy, time-consuming, and expensive. It’s easier to bypass the worker and stress the work. Many businesses use feedback exclusively to cull people they deem ineffective. In that case, “evaluation” or “adjudication” might be more honest. In a time of labor surplus, employers are much more interested in finding the right person for a job than helping someone learn how to do the job right.
To a lesser extent, the same issue arises in schools when those giving feedback have more concern for assessment than education. From that perspective, feedback justifies a grade instead of improving either the academic work or the capacities of the student. As in many workplaces, some teachers hope to keep the process of “managing” students clean by stressing the product. They wish to avoid entangling themselves in the idiosyncratic.
When the academic work is central, teaching is supposed to result in the best work possible, and any “feedback” that accomplishes that end, including threats, sarcasm, and personal insults, becomes permissible.
The dilemma is that, here too, personality matters in ways challenging to acknowledge. Teachers (and bosses) aren’t immune to insecurity either and, whether consciously or unconsciously, may express those insecurities in petty authority. An impersonal process has the advantage of protecting them from examining their own motives, even if giving particularly cold or harsh feedback fulfills only a need to believe in their own competence and significance.
As a term and a concept and a practice, feedback is challenging. In the end, however, its acoustic meaning may reflect most on the way people use the word. If feedback labels the need for evaluators to double back and evaluate themselves and their motives, perhaps it’s the right word.
But if the ultimate purpose is progress, growing productivity and confidence, then maybe the word is wrong. Proper feedback doesn’t feed back at all but reaches receivers through careful sensitivity to who’s listening and what they can hear well. It speaks without echo or distortion.
Lately, the philosophical question plaguing me is whether solitude is the natural state of humans… which says something about the state I’m lately in.
It’s July and, as a teacher, I don’t report to work. However, my wife still leaves each morning, my son lives elsewhere, and this summer my daughter has a job in the wilderness of Wisconsin. Between seven am and seven pm, email, Facebook, and the internet generally keep me company. With my sabbatical ahead, I forecast a long stretch of similarly uninterrupted solitude for the next 14 months.
Scientists believe they’ve answered my philosophical question definitively: humans are not solitary, never have been, and, in fact, experience changes in genetic expression in response to social situations. Where scientists once believed you were stuck with the genes you possessed at birth, they now recognize the environment, including the social environment, can turn on certain genes and change traits thought immutable. Research indicates people who live alone develop suppressed immune systems and manifest marked changes in genes linked to depression. Abused children with access to support outside the home, for instance, show, genetically, less sensitivity to stress and trauma. Closeted gay men succumb much more rapidly to AIDS than more connected victims. Solitude, science says, is bad for you.
I’m not naturally social. In that great divide between those energized by company and those taxed by it, I’m squarely in the second group. A day of teaching runs upstream against my disposition, and, by the end of the workday, I have no talk left. As most people do, my wife looks forward to parties, guests, and visits. I try to. I remind myself how much fun I’ll have, how good it will be to reconnect with friends, how exciting meeting new people can be. Nonetheless, my apprehension grows. Almost involuntarily, I experience a kind of dread.
I’m no recluse. I love most humans and seem to function well in public. Some people, I’m always surprised to hear, say I’m interesting, even charming. Still, solitude is easier.
There’s a difference between solitude and loneliness. Solitude is a choice. Loneliness implies unfulfilled desire. A solitary person likes quiet, enjoys controlling his or her time, and finds productive and satisfying ways to spend what may appear to others empty hours. In contrast, a lonely person feels lost in a desert of time and wonders where the oasis is, where life-sustaining company might be, right then. Solitude evokes strength, self-sufficiency, autonomy, confidence, and completion. Loneliness stings. It never feels right and elicits resentment, bitterness at the thought of being dismissed or neglected.
I aim for solitude, but its border with loneliness wavers. I consider calling people so we can get together, then I give the idea up as weakness—they have their own lives and could certainly call me if they wished. I shouldn’t impose. I remind myself of my good fortune, the time to read, and study, and think, and write. Then, when I’m not looking, the switch flips. I feel excruciatingly bored and forgotten. The day begins with journal writing, a to-do list, an hour or so of studying a psychology text, and work on my latest creative projects. It ends with Netflix, iPad games, and anything to pass time before my wife (finally) walks in.
If I complain, she says, rightly, “Do something about it.” And I say, “I should.” Yet, the next day, I return to the same strategy of making the most of being alone. Sometime soon, I may scream. In the meantime, I structure my new solitary life like a dike to keep loneliness out. I mean to keep loneliness out.
A researcher named Steve Cole has devoted his career to studying the physical effect of social isolation and has discovered that, even more than stress, “Social isolation is the best-established, most robust social or psychological risk factor for disease out there. Nothing can compete.”
Scientists may have answered the question of whether humans are solitary, but my own experiment continues. My days negotiate self-reliance and desire, fellowship and autonomy, productivity and yearning to hear another voice. Nothing seems so immediate and real as this battle between being myself and being part of something. Even this post is a skirmish, a surrogate for conversation, piled earthwork, more effort to occupy time.
A fellow blogger once told me, “Don’t expect too much from summer.” She meant visitors, not summer in general.
She’s right about visitors. Something happens in June, and those WordPress bar graphs flatten to foothills. My first two years of blogging, I worried I’d said something so heinous no one liked me anymore. Now the summer lull is a familiar pattern, and, being a grizzled veteran of the sport of blogging, I accept that readers’ attention wanes when the weather encourages healthier alternatives to reading angsty, self-doubting prose.
You can hardly look at an overcoat when it’s boiling out. I get that.
In fact, I more than accept the quiet. I relish it as a resort town must sigh through October or the babysitter must claim the whole couch between lights out and parents’ return. It’s not that I relax so much as I don’t worry about relaxing.
Blogging and publishing offer very different companionship. Real writers must imagine readers. In contrast, bloggers can usually guess how crowded the room is and adjust their volume and tempo, maybe even whisper because more intimate speech is okay right now.
Over the last six months or so, I’ve sent some writing away, and all of it has returned with “No thanks.” So perhaps I’m telling myself summer’s drought shouldn’t be ego-killing the way those rejections are. The alternative is believing I have nothing to say. Maybe I have nothing valuable to say sometimes, but I do desire speech. I want to say something.
And a strange relief arises when less is at stake. The less important the end, the more enjoyable the means. Why not be experimental or confessional or meta-conditional or plainspoken?
Writing is like swimming. It’s strange imagining someone inventing a way to cross a river, but someone must have. Conventional strokes—freestyle and breaststroke and butterfly—have well-polished efficiencies, and they work. They aren’t the only means to reach another shore, however. Trying other methods might be embarrassing, but you could dream up something if you didn’t worry about looking like a fool. Plenty of brilliant writers master conventional syntax to compose lovely prose, but others revise the rules. Virginia Woolf, Ernest Hemingway, William Faulkner, James Joyce all swim oddly.
Probably because they didn’t care and worried little about readers—who readers might be or how they might react to their beautiful fumbling.
Our MFA age is more homogenous, full of MacPoems, MacShort Stories, and MacNovels acceptably well structured, thoughtful, and forgettable. Hell, you might be reading a MacEssay right now. The “focus group” and “workshop” sometimes seem oddly named, as they often center on acceptability instead of vision or idiosyncrasy.
“I don’t mean to be mean,” I can hear a classmate criticizing Fitzgerald’s Gatsby, “but aren’t these opening pages just a lot of throat-clearing?”
Fumbling isn’t always beautiful, but it’s more human than self-consciousness generally permits. I realize all my efforts to “get myself out there” and “learn what editors want” may improve my work because I’ll learn to appraise and revise what’s invisible to me now. But solitude—or an intimate gathering of friends—can be helpful too, especially if I can become comfortable with throat-clearing as I learn to sing.