
In the News

As I’m sure many Americans have, I’ve spent a lot of time lately thinking—and worrying—about the concept of truth. I should say at the outset that while I have some strongly held political views, I generally believe that on any given issue, there are a range of reasonable opinions. And because I don’t appreciate others’ proselytizing, I generally try to keep my politics out of the social media sphere.

That said, I feel compelled to comment on President Trump’s casual disregard for the truth. I want to make clear that this is not a narrowly “political” opinion. Rather, my anxiety on this issue stems from larger, longer-term concerns about the civic health of the United States. To be sure, Trump is not to blame for most of these concerns, but the President (regardless of party, regardless of personality) has an obligation to nurture a healthy American civil society, allowing—even welcoming—dissent even as he promotes his own agenda.

Throughout the campaign, Trump displayed a penchant for making bold claims—fairly typical “red meat” for his political base. This is not unusual. But when those statements later became political liabilities, he did not (as many politicians do) attempt to “massage” them. He did not (with the exception of his infamous “locker room talk”) even attempt to explain them away. He and his team simply denied outright ever having made them. The TV spot produced by the Clinton campaign in the wake of the VP debate illustrated this perfectly.

Trump’s supporters might argue here that all politicians play fast and loose with the truth, and they would not be incorrect. Still, I think there’s a difference between run-of-the-mill “spin” and what Trump has done and continues to do.

Now that he has taken the oath of office, we’ve witnessed a trivial but protracted debate over the size of the crowd at Trump’s inauguration vis-à-vis Obama’s. (In the grand scheme of things, who cares?!—except that, obviously, Trump does.) Most of the notable comments have come from Trump’s press team, with Kellyanne Conway touting “alternative facts” on a Sunday morning talk show and Sean Spicer asserting in a press briefing that “we can disagree with the facts” (as if facts weren’t facts at all).

On top of the inauguration silliness, there have been reports of a forthcoming investigation into allegedly widespread voter fraud. To be sure, if millions of ballots were cast illegally, I—like most Americans—want to know about it. But so far, no one has produced any real evidence to that end, and the Trump team seems convinced that this problem was really only a problem in states where Trump lost.

This is not new, of course. Let us not forget that this is the same Trump who built his political career on “birtherism,” refusing to accept Barack Obama’s citizenship and demanding to see his birth certificate. And yet, once the document was released, Trump refused to accept it. When he was challenged on this issue during the 2016 campaign, he sought to blame Hillary Clinton for the whole charade.

It boggles the mind, and it has me wondering: Is Donald Trump the postmodern president?

As an undergraduate, and especially as a grad student, I dipped my toe into the waters of postmodernism, and I initially found them intellectually stimulating. For someone who hated history in high school, it was exciting to learn that everything—even the very nature of reality—was subject to debate. My experience in high school was essentially: “Here are a bunch of facts that someone else has deemed significant. Now memorize them!” So you can imagine how it felt to be told, in effect, “There are no wrong answers. All perspectives are valid; just take a side!”

In graduate school at the University of Alabama, I took a class with Professor George Williamson (now at Florida State University). We read the German historian Leopold von Ranke, who advocated for a fact-based (as opposed to a mythological) history—history wie es eigentlich gewesen (“how it actually happened”). We read Peter Novick, whose book That Noble Dream questioned the whole Rankean conception of history as an “objective” discipline. And we also read Gertrude Himmelfarb, the conservative historian who decried postmodernism thusly:

In history, [postmodernism] is a denial of the objectivity of the historian, of the factuality or reality of the past, and thus of the possibility of arriving at any truths about the past. For all disciplines it induces a radical skepticism, relativism, and subjectivism that denies not this or that truth about any subject but the very idea of truth—that denies even the ideal of truth, truth as something to aspire to even if it can never be fully attained.

Throughout the semester, we tacked through the heady winds of intellectual discourse, zigging and zagging from left to right and back again. Week after week, Dr. Williamson’s class left me thoroughly confused and convinced that I was stupid. I cannot honestly say that I enjoyed it. Nevertheless, long after the semester ended, I came to appreciate it more than any other class I took at Alabama. It was his class that provided me with a broad theoretical foundation for understanding history as a discipline, and in retrospect, it was his class that transformed me from merely a good student who happened to like history into a historian.

I must admit here that I am still influenced by certain aspects of postmodern thought: I remain skeptical of so-called “metanarratives,” and I do hold that truth—particularly historical truth—is a slippery and contested concept. The facts are always tentative and subject to change, and historians must (to the extent possible) be aware of their own biases. In that sense, I—like most historians, I suspect—share Novick’s view. However, I find much truth (and I choose that word deliberately) in Himmelfarb’s position. Postmodernism, intellectually engaging though it may be, is ultimately nihilistic and self-defeating. If there is no truth, even an imperfect one, then what’s the point? It reminds me of the famous line from Macbeth: “It is a tale / Told by an idiot, full of sound and fury / Signifying nothing.”

As the renowned historian Eric Foner states in his 2002 book Who Owns History?, “There are commonly accepted professional standards that enable us to distinguish good history from falsehoods like the denial of the Holocaust. Historical truth does [exist], not in the scientific sense but as a reasonable approximation of the past.”

In short, even for historians who accept Novick’s contention that “pure objectivity” is unrealistic, a bright line still distinguishes between fact and fiction. Historians acknowledge that they are not writing history wie es eigentlich gewesen, but that is still a far cry from simply “making it up.” The entire historical enterprise is built on documentation of sources and the peer review process. In that, there is, in fact, a connection to the scientific method and, indeed, to the very ideals of the Enlightenment.

Not for nothing have I made a habit of posting the following quotation on the door to my office or classroom throughout my career: “For here we are not afraid to follow truth wherever it may lead, nor to tolerate any error so long as reason is left free to combat it.” Those words come from an 1820 letter Thomas Jefferson wrote describing the University of Virginia, my other alma mater. For all of Jefferson’s many accomplishments, his role in founding the University was one of only three that he wished to have placed on his tombstone, alongside his authorship of both the Declaration of Independence and the Virginia Statute for Religious Freedom. Clearly, Jefferson wished to be remembered as a man of the Enlightenment.

The United States is very much a product of the Enlightenment as well, but it is the Enlightenment values—of truth, of reason—that Donald Trump appears to question. To my mind, few things could be more corrosive to the health of our civic and political institutions or as damaging to our republic in the long run. As Americans, we can (and should) disagree; that is our heritage. It is one of the things that makes America great. But we must disagree reasonably. For here we are not afraid to follow truth wherever it may lead, nor to tolerate any error so long as reason is left free to combat it. Without reason, however—without the pursuit of truth—we are simply wandering in the dark.

It’s been a busy month for me, and I’ve started to develop a backlog of potential posts. Here’s an idea from almost a month ago (!):

On my way into school, I heard this piece on NPR: “Racial Bias Isn’t Just a Police Problem, It’s a Preschool Problem”. With a wife who teaches pre-school in a school that is roughly 98% African American, I was immediately intrigued. The focus of the piece, it turns out, was a study on implicit bias done by the Yale Child Study Center. The findings? That teachers are often implicitly biased in their classroom discipline.

This kind of work is important in helping to bring broader exposure to the problem of implicit bias, something all teachers (all people) need to be made aware of. Until we begin to recognize our implicit biases–thus making them explicit–we can’t work to counteract them. And yet, this story actually obscures a different implicit bias–one which the study seems to suggest might be more substantial than racial bias.

Here’s the gist of the study, and the kernel of the findings:

At a big, annual conference for pre-K teachers, Gilliam and his team recruited 135 educators to watch a few short videos. Here’s what they told them:

We are interested in learning about how teachers detect challenging behavior in the classroom. Sometimes this involves seeing behavior before it becomes problematic. The video segments you are about to view are of preschoolers engaging in various activities. Some clips may or may not contain challenging behaviors. Your job is to press the enter key on the external keypad every time you see a behavior that could become a potential challenge.

Each video included four children: a black boy and girl and a white boy and girl.

Here’s the deception: There was no challenging behavior.

While the teachers watched, eye-scan technology measured the trajectory of their gaze. Gilliam wanted to know: When teachers expected bad behavior, who did they watch?

“What we found was exactly what we expected based on the rates at which children are expelled from preschool programs,” Gilliam says. “Teachers looked more at the black children than the white children, and they looked specifically more at the African-American boy.”

Indeed, according to recent data from the U.S. Department of Education, black children are 3.6 times more likely to be suspended from preschool than white children. Put another way, black children account for roughly 19 percent of all preschoolers, but nearly half of preschoolers who get suspended.

One reason that number is so high, Gilliam suggests, is that teachers spend more time focused on their black students, expecting bad behavior. “If you look for something in one place, that’s the only place you can typically find it.”

However, the next line of the article is the one that really got me thinking: “The Yale team also asked subjects to identify the child they felt required the most attention. Forty-two percent identified the black boy, 34 percent identified the white boy, while 13 percent and 10 percent identified the white and black girls respectively.”

This would appear to be in keeping with the title of the piece, and it fits nicely into the current debate over systemic and institutional bias in other areas of American life, such as criminal justice. If we look a bit more closely, though, we notice that black girls may actually receive less scrutiny than white girls. And girls in general appear to receive about one-third of the scrutiny that boys do. In fact, 76 percent of participants said that boys (regardless of race) required more attention to keep them in line, while only 23 percent said that girls did.

To me, this suggests more of a gender bias than a racial one. I know that, throughout my career, I have tended to see more “troublesome behavior” in boys, and I suspect that girls get away with more in class than boys do. This brings to mind the “boy crisis”–a hotly debated concept in education, but one which I think has at least some merit (even if the name itself is a bit melodramatic). I’m not sure I’d go so far as to call it a crisis, but I do think there is something to the idea. Why is it that boys are prescribed ADHD medications at rates far outstripping girls? Why is it that boys are suspended from school more often? Why is it that more boys are identified as “special needs” students? These things can’t all be accidents. Perhaps some of it stems from this implicit bias–the same one that the author of this piece seems to miss altogether.

To be sure, racially discriminatory discipline practices are real and problematic; they need to be exposed and addressed. I am by no means disputing that. But I think that this story actually obscures another problem that is potentially bigger (at least in terms of sheer numbers). As educators, we must work to ameliorate conditions which disadvantage black boys, absolutely, but as we do so, let’s not forget that most boys find school to be a challenge at some point. Are there changes we can make that would serve them all?

After the latest revelation about Donald Trump, it seems strange to write about anything else related to the presidential campaign, but I’ve been thinking about this for several days, so here it is.

In the wake of last week’s Vice Presidential debate, I read this article from the Washington Post:

Clinton debate prep is focused on what happens once the debate is done

It discusses this ad from the Clinton campaign, which was released mere hours after the debate ended.

Clearly, this informed Kaine’s debate strategy, as he repeatedly (and often awkwardly) said some variation of “I can’t believe Governor Pence is going to defend Donald Trump on this issue.” I suspect the ad was all but ready to go, pending only the video from Mike Pence, who played into their trap nearly perfectly.

I’m not here to talk about politics, though, except in the sense that I wonder if this sort of ad–while clearly effective for Hillary Clinton’s campaign in the short run–is good for our nation’s civic health in the long run. Again, I’m not debating whether these statements by Trump and Pence are “fair game”; they certainly are.

But by using the debate as a venue for producing “gotcha advertising” (similar to “gotcha journalism”), is Clinton’s seeming innovation making it less likely that candidates, who are often reluctant to take the stage to begin with, will debate in the future?

We could argue about whether debates actually serve any real purpose, as they often devolve into little more than constant sniping and bickering, neither of which is good for our civic health either. But they do represent–at least in theory–one of the few times when the candidates appear together and at least attempt to talk about real differences of philosophy and policy. In that sense, I would argue that they are important and meaningful, imperfect though they are.

Of course, this could turn out to have just the opposite effect. Going forward, both parties will adopt this tactic, and candidates will know that they have to be prepared to defend previous statements with more than just a flat denial, lest they be made to look foolish. As Glenn Beck pointed out, “We’re not living in the 1800s. We can go back to the clips on YouTube.” Yes–sources matter!

We can’t predict the future, of course, but thinking historically, I wonder what the long-term (possibly unintended) consequences of this will be.

On my drive to and from school each day, I typically listen to NPR. My commute is only about 10-15 minutes, even with traffic, so I generally only hear a handful of stories. Even so, in the last couple of weeks, I’ve heard a number of stories about the recent decision by ITT Technical Institute, a for-profit college, to shut down virtually overnight.

If you’re not familiar with the situation, the U.S. Department of Education announced in August that ITT Tech would not be permitted to enroll students who use federal financial aid. This came after a number of investigations which alleged fraud and a number of other questionable business practices.

As I was driving in this morning, I heard this story, produced by our local NPR affiliate (WUSF). The gist is that ITT students—some of whom were only months from graduation—are basically left high and dry, and they now face a difficult decision: They can transfer some (though likely not all) of their credits to another institution and continue holding their debt in full, or they can discharge most (though likely not all) of their debt and start over from scratch at another institution. It’s a lose-lose situation.

The story quotes an ITT administrator in Fort Lauderdale as saying, “If I had a magic wand, I would have said, ‘If you’re closing, you teach them out, [show] that there’s a plan in place to teach out anyone who’s currently enrolled and that you don’t just shut the doors, you don’t just do that to people.’”

To me, this story gets at the heart of my concerns about for-profit education ventures. I’ll be the first to admit that I am no expert on this issue, and I’m sure there are some valid arguments in favor of for-profit education. Still, I believe that educational institutions (even private, for-profit ones) have a fiduciary responsibility* to their students—a responsibility to act in their students’ interest that goes beyond their obvious responsibility to provide a quality educational experience. Unfortunately, in a for-profit school (or college, in this case), the motives and responsibilities become blurry.

Is the institution’s motive to educate—or to make a profit, come what may? Is its responsibility to its students—or to its shareholders/stakeholders? Of course, those don’t have to be mutually exclusive, but when push comes to shove, which will it be? We now know the answer for ITT Tech.

* I am not a lawyer and thus don’t use this term in a strictly legal sense. Rather, I use it to suggest that institutions (or, more precisely, the administrators who guide those institutions) must consider students’ interests when making decisions about the company. They must honor the trust that students place in them when they enroll.

Truth be told
I’m not much for the cold
But if you’re selling a snow day,
Mother Nature, I’m sold!

An original poem, perhaps one of my finest! (Excluding those I wrote for school assignments, I think I’ve written–at most–a half-dozen poems in my life. See what you can do on a snow day!)

We missed the brunt of the recent winter storm as it passed through Virginia; just a bit to the north of us they received about 25″. For our part, we got probably 6-8″, though it’s hard to say for sure. Late Friday night it changed over to freezing rain/sleet, which compacted the original snow and turned it to ice. It then changed back to snow for much of the day Saturday, which added a few more inches of powder on top. School was cancelled on Friday, primarily because everything else was cancelled, and it’s cancelled again today. And what a wonderful winter weekend it has been!

A snow day provides an opportunity to think and reflect (and write poetry). Over the past few days, I’ve luxuriated in a slow-sipped pot of tea, helped clean the house, and, yes, done some work–grading, catching up on e-mail, planning my upcoming weekend duty. These are all things I might normally do on a weekend day, but with a sea of white outside my window, it somehow feels different.

Another thing I’ve done this weekend is read–a lot. I just finished Dan T. Carter’s The Politics of Rage: George Wallace, the Origins of the New Conservatism, and the Transformation of American Politics. Now I’m working on Anthony Grooms’s novel Bombingham and Nel Noddings’s Education and Democracy in the 21st Century.

I also read this article from the Washington Post: “Expecting to enjoy a lazy snow day? Teachers urge parents, students to think again.” While it’s not the hardest-hitting journalism I’ve ever read, it does manage to provide an interesting angle on one of education’s stickiest wickets in just a few paragraphs. In short: whereas a heavy snow used to promise kids a fun-filled day of frolicking outdoors, educators now worry about the consequences of such activities. As the article states, “[I]n an era of increased academic testing, stacked curricula and virtual learning, many educators and school officials are urging students to continue their schoolwork during snow days to avoid the dreaded ‘amnesia’ that can set in after a few missed days of class.” Particularly in the upper grades and in Advanced Placement courses, “that can create stress for teachers, who worry about how they will cram a year’s worth of advanced curriculum into one shortened by snow days.”

To me, this is suggestive of the difference between a “teacher-centered” and a “student-centered” (or “learner-centered”) education system. I do sympathize with the teachers, who appear to be caught between an immovable rock (the AP exam) and a cold hard place, though I think the use of the verb “cram” is telling. How much curriculum should we be cramming in the first place? (And if we are cramming, is this really education?)

I think there’s a reasonable and fair conversation to be had about the “costs” of a snow day in the classroom, especially for those students who are really struggling or when a “snow day” becomes a “snow week” or more. For the vast majority of kids, though, the idea of “amnesia” setting in after just a few days is ridiculous. If that were true, we wouldn’t have long weekends, holidays, or spring break. In fact, such breaks are necessary to allow students (and teachers!) the opportunity to recharge and come back into the classroom fresh.

Perhaps most frightening of all is this asinine quote from Connie Skelton, Assistant Superintendent for Instruction in Arlington Public Schools: “In Arlington, we really are moving towards 24/7 learning.” She’s explaining how Arlington’s use of iPads and other technology can be a game-changer, but what does that even mean? Do kids not sleep? Do they not eat or go to the bathroom? And even if we set aside the ridiculousness of the actual claim, we should ask ourselves a more serious question: Just how much learning is necessary and appropriate? I’m all for providing a “rigorous” education, but the law of diminishing marginal returns applies here. There is a point at which enough is enough.

I am heartened, however, by this quote from Evan Glazer, principal of Thomas Jefferson High School for Science and Technology (better known throughout the DC area as “TJ”): “‘We want them to go out and play and make snowmen and snow angels, because it doesn’t happen all that often,’ Glazer said. ‘You might as well take a break when Mother Nature gives you the opportunity.’” This from a school that has been ranked among the very best in the country and sends its graduates to top-notch universities. (I can hear the counterargument now: “With a more advanced student, you can ‘afford’ to take that stance.” Maybe so, but I also wonder to what extent the 24/7 “cramming” mentality contributes to the percentage of students who either drop out of school or simply go through the motions.)