Evidence Is Old-Fashioned?

So, more wailing and gnashing of teeth about Andrew Breitbart.

The New York Times has a piece on plagiarism that reviews an increasingly prominent argument: that contemporary college students simply don’t know that copying the words of another writer verbatim is plagiarism, and that they’ve grown up in a different kind of textual environment that will eventually produce new norms for everyone.

I’m sympathetic to certain versions of this claim. I’d agree that many students are poorly taught how to cite online material. I’d agree that there really are new kinds of text-making practices in digital environments that arise out of networked or collective systems for sharing information.

What we’ve come to understand as plagiarism is a relatively short-term consequence of a highly individualized and relatively recent conception of authorship, creativity and property rights. Many years ago, I was surprised to find that 18th and 19th Century European travel writers sometimes committed what I saw as outright plagiarism, reproducing or directly paraphrasing work by an earlier traveler. Over time, I began to realize that for some writers, this was a “best practice”: if you didn’t have time to visit an area along your route, but someone else had, you included what they had to say, folding it into your own authoritative account.

But I’m enough of a devotee of our recent view of authorship and creativity (and property) to think that the norms established around plagiarism during the 20th Century need some kind of continuing defense, just with sufficient awareness of the changes in textual production and circulation.

What really worries me is what’s happening to the larger purpose of the analytical writing that tempts some to plagiarism. The thing I’m honestly afraid of is that we’ve come to a point where the professional value of learning to build strong arguments based on and determined by a command of solid evidence is in rapid decline.

I think that in the last four decades of 20th Century American life, the ability to build a strong case whose factual foundation could withstand fairly determined examination by a variety of critics paid off in a wide variety of professional and personal contexts. I’m not saying that the quality of knowledge claims in that era was always beyond dispute: quite the opposite. A lot of social research from that time turns out to have been flawed in its claims, in its evidence, in its rhetoric, in its method. But I do think that both academics and non-academic professionals often tried hard to get it right, changed features of the arguments they were inclined to make based on evidence, and when their evidence was found seriously wanting, abandoned or strongly modified views that they’d previously held.

There are a zillion reasons why that spirit has receded strongly from public life. It’s not all about the sudden surrender of the Republican Party hierarchy to a populist fringe that treats all evidence as infinitely malleable to its needs, and evidentiary debates as the culturally perverse hobby of an elite it disdains. But that’s the latest and strongest fruit to hang from long-growing roots and branches. The upshot is that we’re in a moment where it’s not clear that there are any meaningful professional, social or personal consequences to believing whatever you want and unabashedly cutting “evidence” to fit the Procrustean bed of your beliefs. Evidence or facts are becoming a rhetorical flourish, like opening a letter “Dear Sir:”, or calling an openly totalitarian nation “the Democratic Republic of”. You include “evidence” because that’s the form, but the substance hardly matters.

So here’s the question, then: am I committing a kind of futureward malpractice if I tell students that the quality of their evidence matters? Is this one more way that I’m just an academic training people to be academics and ignoring the future needs of other professions and careers in the world as it actually is? I know this sounds like dramatic pearl-clutching, but I look at the case of Breitbart and a seemingly endless parade of other pundits and writers wrong about small facts and big facts, casually mangling and manipulating evidence, and I don’t see that it hurts any of them. I don’t see that the mainstream media cares much any longer, if it ever did, about enforcing a different standard. I don’t see that this kind of writing or speaking means anything negative for a political career or a career in public service. Business, law, medicine: if you’re on top, you’re not going to get called to account for any distortion, no matter how gross, and if you’re not on top, you’ll be producing distortions on command for those at the top.

It’s not just the professions, either. There’s one blog that I really love to read that has a regular commenter who has a near-perfect style that combines the recirculation of right-wing talking points, the undisguised evasion of unwanted ‘frames’, and a passive-aggressive retreat into personal and anecdotal accounts when directly challenged, a style for which Ronald Reagan should have been awarded a patent. I think this style probably makes this person successful at producing outcomes in her everyday civic and professional life. I know that when I’m in everyday civic contexts and I come up against someone who fuses that kind of approach with a dogged determination to have their way, I just say screw it and walk away unless the stakes are the highest possible. (And that’s partly how we get to situations where the stakes are the highest possible, because of incremental erosion on smaller issues.)

So maybe that’s the kind of writing and speaking we need to train our students to do: rhetorically effective and infinitely mutable on substance, entirely about affect and audience rather than just sensibly attentive to them. At what point is it perverse to continue making buggy whips while the Ford plant churns away right next door?


10 Responses to Evidence Is Old-Fashioned?

  1. evangoer says:

    “…am I committing a kind of futureward malpractice if I tell students that the quality of their evidence matters?”

    Yes, this is pearl-clutching. 🙂

    Taking this at face value: if the students want to be trained in the Dark Arts, they can always sign on with the College Republicans. I am sure that even Swarthmore has College Republicans.

    But even if you decide it’s a good idea to train them in the Dark Arts yourself, you *still* need to teach them about proper evidence collection and analysis anyway. Lies and bullshit are for offense, but traditional analysis is still absolutely necessary for personal defense.

  2. Nate Kogan says:

    In one sense our definitions of “evidence” and “analysis” are very much rooted in an old-style (Rankean, I imagine) sense of what the historical profession is and should look like. We strive to encourage the analysis of sources in the hope of finding the “truth.” Although we’ve generally come to recognize that the “truth” is very much shaped by the author and his or her background, perspective, values, era, etc., we still view this as something to which we as historians should aspire and should teach our students to strive for.

    I think a large part of the problem comes with the conflation of “argument” and “opinion.” It seems like the sophists (that’s really what they are, right? Playing fast and loose with the facts in order to win an argument) you’re concerned about might be conflating the two. As someone who teaches argumentative writing to high school students, I strive to have my students understand my resistance to the term “opinion,” since many see an opinion as something that doesn’t have to be rigorously rooted in anything other than their own sense of right and wrong. (E.g. “I believe chocolate ice cream is better than vanilla, and you can’t argue with that because it’s my opinion.” However, I’d contend that this is better labeled a “preference” than an “opinion.”)

    Therefore, I agree that evidence, its quality (e.g. where does it come from, is it used appropriately in context [e.g. Shirley Sherrod], is it appropriately referenced, etc.), and the interpretation of that evidence remain really important things to teach students. Although this model might be outmoded in popular culture and seem like the province of the academy alone, it is nevertheless important, encouraging students to develop habits of deep thought and an ability to grapple with the nuance of *all* the sources they encounter throughout their lives as students and beyond.

  3. sschnei1 says:

    I prefer just ‘elite,’ but I’ll take ‘elitist’ any day. We’ve made a dichotomy between two different games, in which rhetorical moves of various sorts have different statuses. In one game, derision, ad hominems, anecdotes, etc. have great power. In the other, relatively little (I say “relatively”; let’s not make academics out to be saints). I think on this point alone it’s clear which game is, let us say, more mature. And taking into account the relative roles of evidence, analysis, critique, and argument in the two games, I think it’s clear which outcome we should care about.

    But as long as enough people sail by the winds of pop culture and irresponsible or sensationalist media, they will make the seas stormy for everyone. Like adopting pacifism, ignoring sophists is really only safe if everyone else does it too.

    So if you’re looking for a goal in the classroom, try to teach your students how to take the high road without getting knocked on their asses. Whether that means confronting people like Breitbart or ignoring them is a matter of strategy. One must, I’m sure, be aware of them. But my intuition says it’s a mistake to think you should ‘take them head on.’ This is a difficult line to walk. But I think all the best people try.

  4. Matt Lungerhausen says:

    Not to be too Pollyannaish, but I really think this is where those lessons we learned as kids from our happily bourgeois parents come into play:

    If all the other kids jump off a cliff, are you going to jump off the cliff too? So if the pundits and politicians are too sophistic to bother with getting the evidence and arguments right, are you going to tell your students to follow them off the cliff? It seems to me that the lesson to teach is “these people are sophists, and this is why it’s bad.”

    Along the same lines, if everyone else is breaking the rules, that doesn’t give you license to do so. If other people seem to be “getting away with it” that doesn’t make it right. Those people are still acting unethically, even if their foul deeds go unpunished. You are not doing a student any harm if you give them the tools to make good arguments based on sound evidence.

    In the movie Real Genius, I think Chris says to Mitch, “The world always needs smart people.” If you are smart, it seems a shame to throw that gift away by doing something intellectually dishonest or half-assed. That goes for students who are going into academia, and especially for those who will go to work in business and public service. We’re supposed to teach them the right way to do things; if they turn to the Dark Side, let them do it on their own.

    Even if people don’t appreciate it right now, good ideas and good arguments still matter. Besides, there is such a thing as honor and being able to look yourself in the mirror every morning.

  5. Timothy Burke says:

    Sure. And I am pearl-clutching, no doubt. But at the same time, it’s the same thing I feel about plagiarism. Effective pedagogical address to plagiarism can’t just be an enforcement pitch (“this is the law! you will get caught and punished!”). You have to persuade students that plagiarism is a bad idea independent of it being against the rules. If they know full well that out in the world it’s close to being standard practice, then your job as a persuader has just gotten a lot harder. The same goes here: if you and your students both know that there are almost no professional or personal contexts where the truthfulness or meticulousness of your evidentiary practices actually leads to better outcomes (as opposed to just feeling better about oneself), then what?

    I guess I’m fretting about a broken-windowpane theory of analysis, reason, and argument: if the rhetorical neighborhood has lots of broken windowpanes and no one seems to care, that has systemic consequences that seriously devalue the efforts of the few people in the neighborhood who still work hard to keep their joint clean and well-maintained.

  6. dchudz says:

    “…if you and your students both know that there are almost no professional or personal contexts where the truthfulness or meticulousness of your evidentiary practices actually leads to better outcomes (as opposed to just feeling better about oneself), then what?”

    In that case, isn’t the question (what you should do) a matter of to what extent (and in what ways) you, or your institution, want to be teaching ‘values’? Or want to be ‘bettering the world’?

  7. Western Dave says:

    The writing was on the wall when Sasha took down David Brooks and the media establishment rallied to Brooks’ defense and attacked Sasha. I think it was a major turning point. Hell, as far as I can tell, Brooks never even apologized.

  8. jpnudell says:

    I think as an educator you have to be “pearl-clutching,” whether you are at Swarthmore, or Harvard, or Northwest Missouri State, where one of the graduate students here at Missouri immediately before me got a job. By not demanding these standards from the students, we fail them. I hadn’t thought about plagiarism as just one problem among others in the surrender of rhetoric and argumentation, but I think you make a good point.

    While the article focused on higher education, I think that the problem begins long before then and is not helped by the increase in fact-based standardized testing. At the risk of citing Star Wars, most incidents have multiple points of view that, if argued well, may be considered correct or true. Yet students are not taught how to write in any rigorous, sustained fashion. Having attended public high school, then Brandeis University, and now graduate school at the University of Missouri, I know I never had any such course of study.

    I continue to practice, study language, and read in order to try to improve my writing, but if anything this has made me more convinced of the terrifying lack of training in writing, argumentation, and rhetoric in our schools.

  9. Brutus says:

    Astonishing. When the professor wonders whether to teach sound academic and intellectual values (such as citing sources and using evidence responsibly), the battle is already lost. Students will naturally take the path of least resistance if unchecked, but teaching them how to cheat more effectively (the Dark Arts indeed) to produce a more favorable outcome, with no apparent downside because it’s normative, is just too much.

    Pragmatism has clearly won the day over idealism. Strategies for obtaining desired outcomes are now largely unmoored from all ethical and moral restraints. Perhaps we could learn this lesson better from the mafia, for whom the quickest path to success is simply eliminating the competition and rationalizing the brutality as “just business.”

  10. James_West says:

    The quality of the evidence only matters if you’re arguing with people who are themselves experts. However, in most people’s professional lives, those are the dominant people they’re going to be arguing with.

    I’m science faculty at a university and have also worked as, essentially, a sentencing investigator for the courts; I find that the quality of evidence is of critical importance when talking to experts, and almost irrelevant otherwise.

    However, even with the best evidence, rhetoric is still important: people have to understand how your evidence fits together, and why it’s important. Rhetoric is important whether or not you have facts, and no matter who your audience is. Facts are only important when used to address someone who is also familiar with the facts.
