Finally got an article accepted yesterday. I knew it would happen again eventually, but it was getting rather frustrating.
As I write in October 2018, criticism of Donald Trump’s competence and respect for the law as President of the United States has ceased to be a partisan affair and has become a duty of the citizenry. But he is, I’d argue, only a symptom of a larger problem. From my perspective as a university professor, colleges haven’t been successful enough at liberal arts education in the last 40-50 years to prevent a Trump-like political event.
Consider these numbers.
First, 50% of voters in the 2016 exit polls claimed a college degree or higher, with another 32% “some college.” Pew has corrected this to 37% of voters having degrees. Either is higher than the national average of college degree holders, which is 33.4% as of 2016. Overall, 39% of registered Democrats have degrees and 31% have some college experience; 28% of registered Republicans have degrees, with another 35% having some college experience. Thus, I submit that less than one-third of the electorate had no college experience, one-third had some, and one-third graduated. I further suggest, then, that the majority of voters had encountered the basic required curriculum of any college, including a composition/writing course like the ones I teach.
Second, according to Pew, among white voters with a college degree, Clinton took 55% to Trump’s 38%, with initial exit polls claiming the reverse of Trump winning 49%-45%. Overall, among all college graduates, Clinton took 52% and Trump 42%, with a gender split among whites: white women with degrees, Clinton 51%, Trump 41%, and white men with degrees, Trump 53%, Clinton 39%. I cannot find numbers on non-white degree holders. I find these numbers incredible, whether you favor Pew or the exit polls.
Third, the default explanation that Trump voters were left behind economically is partially mistaken; rather, “growing domestic racial diversity and globalization contributed to a sense that white Americans are under siege by these engines of change” – a polite way of saying those same voters tended to be (but were not necessarily) racist, anti-immigrant, and isolationist.
Fourth, there were about 18 million college-degree-holding Trump voters – my estimate, based on 36% of degreed voters being affiliated with the GOP. Consistent with the third point, they tended to view diversity as threatening, immigrants with fear, and their culture – predominately white – as under siege.
These four points form prima facie evidence that college, as the supposed champion of critical thinking and citizenship, has been a crapshoot for fostering critical thinking or citizenship. If those core courses, like composition, had reliably done the citizen-building job that they claimed to do, the number of degree holders voting for Trump would be much closer to zero. This failure is more apparent when factoring in the millions of graduates who did not vote at all. Turnout for college-educated citizens was about 70%, and post-graduate turnout was 80%.
Writing classrooms in 2016 were not the lone culprit, of course; this was a failure to vote against an authoritarian candidate that has its deep origins in previous decades, as most degree recipients got their degrees many years ago. Still, past Republican candidates – Romney, McCain, Dole, the Bushes, Reagan – were all moderates, worthy of some democratic consideration, compared to Trump’s odious strongman act.
I could blame history or philosophy or political science – how can one get a post-WWII college degree without knowing that electing an authoritarian demagogue is undesirable? But no. Few undergraduates take many courses from these disciplines, but exposure to composition is almost guaranteed. My discipline must share some blame, too. We could have done more.
I used to think my teaching was formative of critical thinking and ethics and built at least a motte-and-bailey defense against the worst excesses. Writing needed teaching to all comers as a communicative civil right. All that seems dangerously stupid now. Increased writing skill does not magically lead to responsible citizenship. If you knew 42% of your composition class was going to ignore your citizen-building pedagogy and vote for Donald Trump, would you not change your strategy? Or would you “do your job” and “teach writing” like thousands of others, especially if, as an adjunct or lecturer, you did not have a reasonably secure job or control over your curriculum?
Repeatedly, we have thrown the difficult and lengthy task of teaching skilled writing to instructors who were underprepared, underpaid, and overworked. When we surrendered collectively and unconditionally to the conclusion that the task was not important enough for the best-trained, best-paid, and best-motivated instructors – who got to become “scholars” with minor teaching responsibilities – that was when the seeds were planted. Now the entire country pays for our neglect: a constitutional crisis that makes Nixon look like a paragon of integrity. If we could have taught just 1% more responsibility – just 1% – Trump would not be president.
Facing our miserable 58% showing (and I refuse to count those college degree holders who rejected Trump merely by not voting at all; that’s sin by omission), we could salvage our idealistic faith in citizenship-building with a dose of realpolitik. Yes, I have the glimmerings of a solution. I’m still working it out, but I think writing classrooms, at least, need to explore and learn the techniques of the direct opposites of “ethical” citizenship – falsehood, obfuscation, and emotion. The link between the teaching of writing and the promotion of citizenship has clearly failed to prevent the development of “anti-citizens” who willfully, and without critical reflection, voted in a demagogue, even though electing one to the highest office in the land undercuts the purpose of the system.

We could teach writing as a neutral tool used for good, evil, and all the gray points in between, as much as we teach the practice of democracy and the performance of citizenship – a marriage of realpolitik and idealism. We could study how to compose “unethical” communication through not just the increasingly prevalent examples but practice, and thus stress the real-world consequences of rhetoric and writing used for nefarious purposes, particularly in civic/political contexts, using the lessons of history – and starting with Trump as Bad Example #1. We have to stress the consequences of dishonest communication and condemn them when we see them.
Or, is it too late? Have we bled out from a self-inflicted wound, and my musings here are part of the last flickers of a dying brain? Certainly, waiting passively for Robert Mueller to save America is a losing bet. The poison has settled in, and the problem is now long-term. Behind Trump is Pence, and behind Pence are other emboldened strongmen, many overseas in parallel tracks. Times are dire.
A college education, on the front lines of voting, may be the best hope for holding the democratic line, but blind idealism, our old pedagogical strategy, is not enough in the face of an evil that conceals its true nature all too well. There are many “anti-citizens” out there who think Trump is the second coming. Lower taxes, reduced immigration, tough trade talk, white male Supreme Court justices, racism and sexism carefully enshrined – all the little things they want, and at what they think is a bargain price: their souls, bundled with the future.
You may note that I used the word evil. I did so purposefully. This is a path of evil we’re on. The election of Trump in 2016 was not a blip. It was a game-changer, a culmination of decades of poor education and careful politicking. Whatever happens in the midterms next month, even a Democratic takeover of both the House and the Senate, will not reverse it. It takes decades to make this kind of mess, and it will take decades to change it. I wonder, though, if we have decades left.
Tiger Woods’s recent “comeback kid” storyline and the ongoing accusations against Judge Kavanaugh remind me that America has an obsession with linking competence to character.
Americans understand competence in two ways. The first is as a minimum. Competent means you meet the minimum requirements for your job or role or sport. You can use it as a pejorative – “He’s just competent” – or as a compliment – “I think you’re competent” – signaling that we ourselves don’t quite know what to make of the concept.
The second way, which is much more insidious and worthy of analysis, is that competence signals good character; a competent person is a good person. When Woods was struggling on the links, it was far easier to attribute that struggle to personal failings of will, talent, or ethics. But when he’s winning, those concerns are forgotten and replaced by their opposites. He is “mentally tough” and “brilliant” and “disciplined” now, an object of celebration and adoration, a victim of his injuries rather than ruled by them, as he would be if he were still losing.
Kavanaugh, too, is a litmus test for how competence is viewed. On one hand, Republicans tend to point to his long career as evidence of competence, and this is extended, by the second definition, to his character. He could not possibly be an attempted rapist because he is competent professionally, the reasoning goes. On the other hand, Democrats reverse this – because he is competent in Republican eyes, they reason, his sterling resume is just the mask of a sexual offender. Either way, it’s a logical mistake. Kavanaugh’s competence as a judge does not cause better personal behavior, nor does an ethical life lead to judicial competence.
Think over your life, of the many people you’ve known, and you’ll recognize many other examples. The selfless saint that can’t hold down a job, the crack businessman that made his fortune cheating customers, the immature star athlete, the idealistic employee passed over for promotion yet again. And yet we insist to ourselves that there must be a link between behavior and competence. There must be. But there isn’t.
All Trump voters in 2016 knew this very well, even though they might not admit such in public. Trump was rich and famous, with all the trappings of success, and a reputation, at least, of business acumen, but no one is seriously going to point to him as a paragon of moral character. And yet, even with his glaring, obvious example, this doesn’t change how we view Woods or Kavanaugh in the slightest.
Disengagement from this kind of thinking is difficult. Among the professoriate of which I am a member, the professors who publish often are seen as hardworking and industrious, and many sins are forgiven. The ones that don’t get as much in print are viewed as lazy, goldbricking deadwood. This happens despite the inherent randomness of the academic publishing process and despite all the other things professors do, like teaching and administrative work. We’re supposed to be the smart ones, but we can’t easily escape the fallacy either.
Curiously, when it comes time to fire someone, the two concepts of competence and character separate a little. Either can be used to fire you without recourse to the other, but there is always an implication that you failed in both areas. Many positions apparently require their holders to be better than the average Joe, character-wise, given employment clauses detailing the requirements of proper behavior. Behind this is the assumption that you can’t really do your job competently if people don’t view you as competent because your behavior suggests otherwise… even though your behavior has no necessary logical connection to your job performance. It is the appearance or performance of competence, then, that matters.
With Woods and Kavanaugh, we can see one figure ascendant, with his competence and character simultaneously restored; with the other man, both concepts are crashing rapidly because they are so closely linked. I am not suggesting that we do away with linking competence to character – I doubt we could, given how hardwired it seemingly is to the American mindset – but we might want to start applying it more carefully and questioning whether the claims it makes are really warranted.
My losing streak for journal submissions has finally ended. Two strong R&Rs that look promising have arrived, along with another crushing and disappointing rejection where I apparently managed to erase scholars of color, queerness, and other diverse groups. Or at least that’s what I’m told.
I remember the first time I got a competent rejection. It was in 2008, while I was finishing my doctorate. I sent a piece to Rhetoric Review and the longtime editor, Theresa Enos, sent back a short letter that more or less said, “Sorry, but there’s not enough there there.”
I thought about this, and I realized she was right. The thesis was of the “gee, this is interesting” variety and ultimately not useful to anyone. After a lengthy revision, the next journal started with what seemed to be a solid R&R but became a bait and switch with a third reviewer who thought they had found the critical flaw in my argument (they hadn’t; they just hadn’t read closely the two-page rebuttal in the middle of the piece). The piece finally hit at the third journal. This added up to a three-year delay in publication, of course. I don’t know how folks who don’t write essentially evergreen arguments manage to sustain careers.
Speaking of Rhetoric Review, until recently I didn’t know it was the first rhetcomp journal to require peer review, in the early 80s. Which makes me think – most of the classic scholarship of rhetcomp is in the 70s. I wonder.
I have a short essay in the Chronicle of Higher Education that just came out, called “The Three Types of Peer Reviewers.”
It’s pretty straightforward, so I don’t think it needs further commentary. However, I would thank Daniel Peña for reminding me that there are many other venues besides journals to write for, especially given that I am increasingly cranky.
Another rejection came in, this one for a co-authored piece.
I can’t figure out what it was rejected for, though; the editor gives no hint, and the reviewers didn’t find anything serious, just trivia. One of them even praises it in their opening (misstating the thesis, of course) like it’s an acceptance or at least an R&R, and doesn’t undercut it afterward.
So my overall reaction is… huh. Like the other two rejections I’ve gotten this year, there is little evidence they understood the argument at all. I have communicated to my co-author, a grad student, that this is not surprising, and the long process continues.
In brighter news, I just completed a very lengthy, complicated R&R that I have been sweating for months, so the same day one comes back, another goes out! There is no other productive response to rejection, especially silly ones, but persistence.
Another rejection. This time it was a split; one reviewer accepted with no changes, and the other rejected outright.
It was a tough journal, but it was a good article, and I’d hoped for more. In particular, I’d hoped the reviewers would actually engage my argument in a way that might suggest that they had read it. Neither did. The quest continues.
My son, seeing I was annoyed, gave me a cheese cracker. That made it better.
Got a rejection on an article today. Both reviewers rejected it – no R&R.
I have two firm rules about what I do when a paper of mine is rejected by a journal.
- Do not send that journal another piece until the editor changes.
- Reflect on the positives rather than the negatives.
The first rule is based on an old lesson that took me forever to learn. I don’t think I fully learned it until I was about 30. Maybe even later. Namely, do not try to win the favor of someone who doesn’t like what you are doing. Not only is it demeaning, but it’s a total waste of time.
In this case, the review took five months, and anyone who can’t find anything good in my ideas in five months is not worth trying to please. (I take one week to do a peer review. Maybe a week and a half. Tops.) The bit about the editorship is mostly wishful thinking on my part, based on a belief – erroneous and idealistic, of course, but I cling to it – that the editor bears the responsibility for accepting or rejecting, not the reviewers. There are a few journals that keep the same editor for decades; I have learned to avoid those.
The second rule is also a practical one. The negatives are considerable – no acceptance or R&R in a journal I had specifically written the piece for, and two ‘peers’ who couldn’t find anything redeemable in ideas that I had shared with several colleagues and that had generated some excitement. That’s a professional blow to anyone.
But the positives are also considerable. One reviewer dwells on two points: 1) my article wasn’t ‘rigorous’, and 2) they couldn’t find a reason for it to exist. Cheap shots like that tell me I hit a nerve – and that’s very interesting, given that I wrote a rather harmless theory piece that shouldn’t have pissed off anyone. The other reviewer said much the same thing – and both finger-wagged about how I had not cited enough literature, yet mentioned only two additional citations, which I could have literally run circles around. It is entirely possible (actually, it’s quite a common occurrence) that one or both wrote the citations in question…
Lit reviews are trivial. They can be easily added or omitted. It’s a dumb reason to reject a paper. Ideas are far more rare. The refusal to engage the thesis meaningfully is more telling.
My conclusion, therefore, is that the journal that I chose and designed the piece around was a mistake. It contradicted the core assumptions of the reviewers about how such an idea was to be handled with ‘rigor’, and they pushed back hard with numerous technicalities that could have been easily resolved in an R&R. Instead, hard reject.
So I made a mistake. Wrong venue, and possibly wrong subfield. That’s a positive – I learned something. I won’t make that mistake again when I revise. Or I will, in which case I will adapt again. It remains a good piece, and it will find a home. I will sift through their comments and use some of them (one reviewer was much more helpful in this regard than the other), but I will also disregard the spurious.
I got an email tonight saying that I have been awarded a Funded Faculty Leave (FFL). Other universities call this a sabbatical. In this case, it means that I will get paid to do little but research for one semester (likely Spring 2019) – no teaching and no service (though I suspect some will sneak in). This is good news. I will use it to draft a new book on the gospels and rhetoric.
In other news, I sent out another article today. That makes four journal submissions in four months. M is due May 7; I should be able to do one more, a chapter in a collection, before he arrives. Actually, I already have a draft; it’s mostly editing at this point.
I could grade, but it’s late, almost 9:15. I think some time with Stellaris is in order…
The NYT has an amusing article on Bolton’s mustache and the political history of the particular trim level.
I have a full beard and long hair past my shoulders, so I feel qualified to hold forth somewhat on this topic.
I hate shaving. Absolutely hate it. I started in middle school and it remained a painful, tedious process for over twenty-five years. I started with an electric razor, but moved to a blade as I got older. It never got comfortable, and I always cut myself and irritated my face. My neck was typically a series of open wounds. Stubble for me appears in 12 hours or so, so shaving every day was mandatory.
As I got older, I went longer and longer between shaves, and often stopped for a week or more during vacations. If I needed to go on a job interview, I would shave close, but typically went 3 days in between. Stubble became acceptable.
Then, around the time H was pregnant with L – when I was 39 or so – I grew a full beard during a vacation and simply neglected to shave it off. I had tried a beard in college once, but it itched so bad I gave up after a month. This time I persisted, and it paid off. After that first month, the itching stopped, and it was a revelation.
It helps, of course, that I have an occupation – professor – that has no special expectation on facial hair, or head hair, for that matter. Most men in my department maintain some facial hair (typically with some gray) like the chin beards that are fashionable lately, or advanced stubble, but I have the only full one.
So I come to the question – what does it mean to have facial hair or not? Well, my long hair signifies – to me – that I’m not a suit. The beard is simply further evidence of this. Frankly, if you think about it, if a man has the genetics to grow facial hair, why fight it? Social expectations? Trends come and go. I wish I had not listened to every conforming male who told me, for decades, to trim that stubble.
Now I’m comfortable with my face. It expresses some of my personality. H says I look ‘blank’ without at least some facial hair, so there is that to consider as well. I have also noticed that people tend to take me more seriously now, even at work. I think it is more the gray than the beard, but I don’t think it hurts.
Ultimately, though, it is a personal decision. I couldn’t care less about fashion. Beards are a little more ‘in’ now, but so what?
If you are judging a person on appearance you are literally and figuratively engaging them at the shallowest level possible. Rather, examine if they are fair and compassionate. Little else matters, and I say that as an academic. Intelligence is common and easily bent to evil, and looks always deceive. A sense of justice and a care for decency, though, are rare and hard to replace.
So, moral: grow your hair any way you want that pleases you.