Archive

For Public Consumption

As part of our Martin Luther King, Jr. Day observances, our school took the entire boarding community to view the film Selma. As a history teacher with an avowed interest in civil rights history (not to mention an Alabama native), I was asked to give a short talk providing some historical background during our morning assembly, especially for those students in 9th and 10th grade who may not be quite as familiar with the civil rights movement. Here’s what I wrote:

The setting for the film you’re going to see tonight is Selma, Alabama in 1965. To give you a sense of the historical context for this film, we must keep in mind that this was twenty years after the end of World War II. Those twenty years were exciting ones, if a little scary: it was during that time that the United States had become the wealthiest country on the planet, and the nation was in the thick of Cold War tensions with the Soviet Union.

It was also during this time that what we today know as the civil rights movement emerged into national consciousness. African Americans, particularly in the South, had been advocating for their full rights as American citizens for decades (if not centuries), but it was in the wake of World War II that these scattered efforts became a full-fledged social movement.

There are a few key ideas I think you all should understand before you see this movie. As African American men returned home from war, having fought overseas in defense of democracy, they began to raise, much more vocally, the question of why they should be expected to fight and die for democracy in Europe or Asia when they were denied democracy at home in America. Despite the passage of the Fourteenth and Fifteenth Amendments (which granted African Americans citizenship and the right to vote, respectively) in the years following the Civil War, white southerners had effectively denied those rights through a number of legal tricks, as well as through intimidation and physical violence. Without representation in government, southern states had passed extensive legislation (known as “Jim Crow laws”) that segregated African Americans from the white population and often relegated them to an economic and social condition that was, in some ways, not so different from slavery.

In 1946, a World War II veteran by the name of George Dorsey was murdered (along with his wife and another couple), which brought national attention to the problem of racial violence against African Americans in the South. In the wake of Dorsey’s murder, then-President Harry S. Truman created the President’s Committee on Civil Rights to study the situation of black people in the United States.

In the late 1940s and early 1950s, the National Association for the Advancement of Colored People (also known as the NAACP) began to challenge Jim Crow laws, especially those creating separate schools for whites and blacks. In 1954, this culminated in the famous Supreme Court decision Brown v. Board of Education, in which the Supreme Court of the United States unanimously held that “separate educational facilities are inherently unequal” and ordered the desegregation of American public schools. At last, the American government had signaled to African Americans that someone was listening. Of course, significant progress on the desegregation front was a long time in the making, and by some measures, one could argue that it still has not been fully achieved. But Brown v. Board is often viewed as the beginning of the “civil rights movement,” because in its wake, African Americans began to push much more assertively for their rights.

The following year, in 1955, a seamstress and activist by the name of Rosa Parks boarded a city bus in Montgomery, Alabama and took a seat near the front. When she was ordered by the white driver to move to the back of the bus so that a white passenger could have her seat, Parks refused. She was arrested for violating Montgomery’s segregation laws, sparking a black boycott of the Montgomery bus system that lasted an entire year. As the boycott got underway, local activists searched for a leader, ultimately settling on a young and relatively unknown Baptist preacher by the name of Martin Luther King, Jr. Soon, King was the public face of the boycott and would, in the years to come, become the most recognizable figure of the entire movement.

Now, I don’t think it does any disrespect to Dr. King to point out that although we celebrate today as Martin Luther King, Jr. Day, there were hundreds of other leaders of the civil rights movement, and thousands of people who put their lives and their livelihoods at risk to participate in the various boycotts, sit-ins, marches, and other actions that would dramatize the plight of southern blacks in the media.

As the civil rights movement progressed, King and his followers promoted what they called non-violent direct action. Often, they intentionally violated segregation laws in hopes of eliciting a violent response from the local authorities and gaining media attention for their cause. Often, this strategy worked perfectly. Occasionally—as in Albany, Georgia in 1961 and 1962—it did not. There, Police Chief Laurie Pritchett had read King’s books and studied his tactics, and Pritchett simply enforced the laws without violence. Because there was no dramatic conflict to play on television, the media gave the movement there little attention.

Still, demonstrations spread across the South, and in 1964, President Lyndon Johnson signed the Civil Rights Act into law. The Civil Rights Act outlawed many forms of segregation, but King and others believed that without the right to vote, African Americans would continue to suffer at the hands of whites.

King and his fellow leaders learned from their mistakes in Albany, and as the movement gained strength, they made the conscious decision to target cities with law enforcement known for their violent response. By 1965, they had identified Selma as a prime target for a march, and the focus of their efforts there would be voting rights in particular.

This brings us to the film.

My wife and I saw the film a couple of weeks ago, and I can tell you that it is powerful. For me, it has some personal significance. First, my mother was actually born in Selma in 1953—twelve years before the events depicted in the film. Although she and her family had moved away by 1965, they still lived in Alabama, and I grew up hearing her stories about that time and place. Those stories are ultimately what led me to pursue two degrees in history, and when I was in graduate school at the University of Alabama, I did a lot of research on a very poor rural county about an hour and a half from Selma. In the 1960s, African Americans accounted for about 80% of the population of Greene County, Alabama, but the local government was all white. The banks and most businesses were white-owned as well, so whites had a pretty firm grip on the local economy. The schools remained completely segregated until 1965—the same year as the events in the film, and eleven years after Brown v. Board—and even then, only one black student (a girl named Mattye Hutton) was enrolled in the previously all-white high school. As soon as this happened, white families began to send their children to a newly formed private school called Warrior Academy.

By the end of the decade, though, things had begun to change. As a result of the actions you will see in the film today, African Americans did claim their constitutional right to vote, and in the elections of 1969, Greene County was one of the first counties in the South to elect a majority-black government. One of the first things the new school board did was to fire the superintendent—allegedly because he refused to allow a school program celebrating the life of Dr. Martin Luther King, Jr., who had been assassinated the previous year.

So in that sense, the events in Selma were a success. They did contribute directly to the passage of the Voting Rights Act and made it possible for African Americans to elect officials who would be sensitive to their needs and concerns. But let’s be clear: This story is not all rainbows and sunshine. For all the progress made as a result of the civil rights movement—and there has been much progress—the story does not end with King’s untimely death.

Warrior Academy did not admit its first black student until 2004—fully fifty years after the Supreme Court decision in Brown v. Board. Even today, we need only look at the headlines to see that injustices still exist: after all, Michael Brown and Eric Garner became household names in 2014. And here are a couple of other statistics that may startle you: Despite representing only 30% of the U.S. population, people of color make up roughly 60% of those in prison. On top of that, voting laws in many states permanently deny some convicted felons the right to vote, even after they have served their prison sentences. What this means is that in those states, more than ten percent of the voting-age African American population is ineligible to vote.[*]

So here’s a final thought that struck me as the credits rolled on Selma. In 1965, when he led the marches there, Martin Luther King, Jr. was in his mid-thirties—not that much older than me. Realizing this caused me to reflect on what I’m doing with my life. Am I standing up for the causes I believe in? Too often, I’m afraid, I’m not. Life is too busy, or maybe I’m just too scared to speak up.

John Lewis, another individual you’ll meet in the film, is now entering his twenty-ninth year as a member of Congress—a position he would not even have been allowed to vote for, let alone serve in, prior to Selma. In 1965, John Lewis was in his twenties—not that much older than you.

So to me, that’s what this movie is all about—really, what this day is all about: it is an opportunity to reflect on some very important questions. What do you stand for? What kind of life do you want to lead? On Martin Luther King, Jr. Day, we have an obligation to honor the sacrifices made not only by King, but by the thousands of others who marched alongside him. To do that, I think, we must look not only to the past, but to the future as well. So what are the causes that spur you to action? We don’t necessarily have to put our lives at risk as the marchers in Selma did, but if we don’t work to make the world a better place—if we live only for ourselves, in other words—we will have failed those who paved the road we travel.

Thank you, and I hope you all enjoy Selma as much as I did.

[*] http://www.sentencingproject.org/doc/publications/cjprimer2009.pdf

https://www.americanprogress.org/issues/race/news/2012/03/13/11351/the-top-10-most-startling-facts-about-people-of-color-and-criminal-justice-in-the-united-states/


As I noted in my previous post, I gave my first conference presentation last week at the bi-annual Teacher Conference sponsored by the North Carolina Association of Independent Schools (NCAIS). It was a terrific experience. My slides are below, along with some of the key thoughts that are captured in the PPT.

  • By signaling what a community actually values (regardless of what it says it values), grades have the power to shape curriculum—and the community itself—in meaningful but often undetected ways.
  • I’ve come to the conclusion that grades—at least as they’re typically used—don’t help us accomplish our “21st century” goals. In fact, they often prevent us from accomplishing them. So, if we must assign grades, let’s at least try to do so in ways that ensure the grades are serving the purposes we want them to serve.
  • Re: Pink and Robinson: You’re probably picturing your own students, wondering what planet these guys live on. “Of course grades are necessary!” you’re thinking. “Without grades, kids would never do their work.” There’s an element of truth to this, but I would argue that grading can create a vicious cycle. By reducing intrinsic motivation, grades ensure that more grades will be necessary to “get students moving” in the future. (In this sense, it’s akin to addiction—and we can probably all think of students whom we could describe as “addicted” to grades.)
  • Re: Mastery: In a rapidly changing world, we must teach students to find pleasure and success in the process—not just the results. Grades should reflect this emphasis.
  • Re: Mastery: We must expose students to failure, but without making failure seem insurmountable. Grades make that difficult to do. (If the marking period ends next week, that F in the gradebook isn’t going anywhere, and there’s nothing a kid can do about it.)
  • Re: Purpose: Luckily, many teachers are beginning to implement classroom models—think project-, problem-, or place-based learning—with the potential to address real-world relevance and purpose. The question, then, is: Will we rethink grades so as not to undermine that purpose?
  • Grades are both carrot and stick. As such, they have the potential to undermine autonomy, mastery, and purpose, all in one fell swoop.
  • Re: The Tendency of Fixed-Mindset Students to Avoid Challenging Tasks: To me, this is potentially the most devastating critique of all, because it undermines the entire premise of education. Students—like all people—MUST stretch themselves to continue learning and growing, but with grades, the goal can easily become getting the grade as opposed to learning and growing.
  • As our classrooms become more purposeful, though—as we move in the direction of preparing students for a changing world through more engaging pedagogical approaches—grades should become less necessary as a means of ensuring compliance. So what should they mean? And how often should we give them?
  • Grades are a central feature of American educational culture and are, in most places, at least, very well-entrenched. But if you accept that the world is changing and that education must change to keep pace, you should at least be open to the possibility that grading practices might need to change as well.

Last year, I almost totally revamped the way that I teach, striving to make my classes much more student-centered. Although it wasn’t the only change I made, the implementation of the Harkness method played a significant role in the transformation. Given that the approach was completely new to my school community, I was asked to write about it for last spring’s issue of the school’s alumnae magazine. My article is below.

Educating young people for democratic participation has never been more important. That might sound hopelessly old-fashioned, but it is nevertheless true. In an age of high-tech gadgetry, 24-hour global news coverage, and manipulative political advertising, it is easy to lose sight of the very foundations of democracy: a willingness to talk seriously with others (as opposed to talking at or over them), to listen to their ideas with an open mind, and to make sometimes difficult but always informed decisions. Unfortunately, evidence suggests that such skills may be in short supply these days.

In August [2011], The Washington Post published a story on the decline of political civility in America. It read, “The new basic unit of political discourse [in town hall meetings] is not the question or the comment, but the earful. Even legislators who say they enjoy a spirited give-and-take have had trouble getting the quiet required for such an exchange.” More recently, the bipartisan “Supercommittee” charged with reducing budget deficits failed to reach a compromise, much to the chagrin of Americans across the political spectrum.

It should come as no surprise, then, that as of this writing, more than eighty percent of Americans disapprove of the job that Congress is doing. Of course, when communication between citizens and their elected representatives breaks down, confidence in government is bound to falter. However, we should think twice before assigning all of the blame to our elected officials. As Abraham Lincoln famously stated, the American government is one “of the people, by the people, for the people.” If it’s not working, we must accept a considerable portion of the responsibility, and we must dedicate ourselves to preparing future generations for more productive political leadership.

How exactly to do that is something I’ve wrestled with since I began teaching at Saint Mary’s. This past June, though, I had the privilege of attending the Exeter Humanities Institute. Along with more than fifty other teachers from around the world, I spent a week at Phillips Exeter Academy learning and practicing a student-centered, discussion-based pedagogy that has become known as “the Harkness method.” I originally went to Exeter because I wanted to learn more about discussion-based teaching in general, but once I witnessed the true potential of the Harkness method, I was committed to transforming my classroom. It is without a doubt the best way I’ve found to prepare my students—in the limited time that I have with them—for participation in a democratic society.

My students have heard me say it so many times that they will tell you—in a mocking tone, most likely—that “disagreement is good for discussion.” It’s true, but that’s not to say that Harkness discussions are just open-ended arguments; they also require students to listen and think. The best discussions are those in which students question each other’s ideas, probe each other’s thinking, and ask for evidence of each other’s claims. Not coincidentally, these are all the makings of a strong and productive public discourse.

Quite often in our discussions, there is no “right answer,” but that’s not to say that opinions alone will suffice. Students find that their opinions rarely have enough weight to carry the conversation, so they are forced to examine their views and justify them to their peers. That is democracy writ small, and in this process, students begin to discover who they are and what they believe about the world.

They also discover that a frank and open exchange often leads to greater understanding and a better finished product. My eleventh-grade American history students recently spent about a week writing an essay as a class. The prompt asked students to consider the extent to which American colonists were unified on the eve of the Revolution. As they began their discussions, there was almost unanimous agreement that the colonists had been unified, and even though I had allotted a week for the assignment, the students acted as if they might have the essay written in only a day or two. I just smiled and waited patiently as they began to grapple with the evidence.

Soon enough, one student challenged her classmates to account for a contradictory source. Within minutes, their thesis-in-progress evaporated, and ironically, as their confidence in the unity of the colonists waned, they became more divided themselves. Some students became frustrated that not everyone shared their opinions, and at times the conversations grew heated. I had to intervene once or twice, but they were mostly respectful of each other’s opinions, and I was proud of the fact that they were taking intellectual work so seriously.

Although many students found the exercise uncomfortable—they didn’t like confrontation, they said—they acknowledged that the essay ended up being much better for it. Having to account for contradictory evidence and differing viewpoints made them question and defend their original stance. In the process, ideas that didn’t pass muster fell by the wayside, and those that remained gained strength and clarity.

Reflecting on the experience, I realize that using the Harkness method helped me teach something much more valuable than any “fact” about the American Revolution. That week, my students learned that disagreement and conflict are (and always have been) at the heart of the democratic process, and they learned that making decisions in a democratic way takes time—just as the “Founding Fathers” intended. In today’s politically polarized society, that’s a lesson worth learning.

Back in May, I was named the winner of an annual teaching award at my school. It was an honor to be recognized for my work, but the best part, in my opinion, was that it afforded me a rare opportunity. In recent years, it has become tradition for the winner of this particular award to deliver the address at our school’s Opening Convocation in August. Below are my remarks, delivered early last week to students, faculty, and staff.

OK. I need to give you all a heads up right at the outset: I’m going to use the F-word a lot in this speech. I know that might make some of you uncomfortable, but I’m going to do it anyway because I happen to think that we all need to become more comfortable with the F-word. So there’s your disclaimer.

Before I get to that, though, let me tell you about one of my favorite movies when I was growing up—a movie that taught me a lot about the F-word, actually. In 1995, the movie Apollo 13 came out, and I was obsessed. If you’re not familiar with it, the plot centers on a 1970 NASA mission to land on the moon for just the third time in history. Three days into the mission, though, an oxygen tank onboard the spacecraft exploded, forcing the crew to abandon the moon landing and, in fact, putting their very survival at risk.

But I’m really not here to talk about astronauts. Instead, I want to talk about the F-word: failure.

The approach that many of us have to failure is summed up, I’m afraid, in one of the most famous quotations from Apollo 13. Shortly after learning of the explosion, NASA Flight Director Gene Kranz assembles a team of engineers and charges them with somehow getting the crew home from space safely. He gives them some instructions, and then he sends the team away to work, saying sternly, “Failure is not an option.”

I wonder how many of you feel this way. Maybe someone has said this to you before, or maybe you’ve just gotten the impression that failure is unacceptable from the way that people tend to react to it. Even if they don’t use those famous words exactly, I’m willing to bet that people send you the same message all the time: “Failure is not an option.” Today, I’m here to tell you that this mentality is wrong.

Failure in one form or another is not only an option, it is a near certainty. In life, you will fail. That’s not to say that you will grow up to be a failure, but you will fail at something along the way. Maybe you’ll apply for your dream job but not get it. Maybe you’ll get married—and then divorced. Maybe you’ll engineer a new contraption that can send people to the moon . . . only to see part of it explode before they actually get there. You see, even the very movie that tells us “Failure is not an option” also shows us that failure will happen.

If you followed the Olympics this summer, you saw plenty of examples of failure. Who could forget the heartbreak on gymnast Jordyn Wieber’s face after she failed to qualify for the individual all-around competition? And she wasn’t the only one. Think for a moment of the hundreds of other athletes who trained for years, only to come home without a medal. Or the thousands who failed to qualify for the Olympics at all and never even had the opportunity to represent their countries.

No matter how high we climb, it seems, there’s almost always someone who’s just a little better. A little faster, a little stronger, a little smarter, a little more popular. And even if you reach the top—whatever that means for you—it’s highly unlikely that you’ll stay there forever. As the Olympics proved, even the most successful people in the world experience failure at some point.

Like those of Jordyn and so many others, your failures will be painful. I wouldn’t stand here and tell you that they won’t hurt. They will, and there is nothing we can do about that. But what I will tell you is that failure does not have to be permanent. The key to success in life, I have come to believe, lies in learning to fail well. Put another way: get comfortable with the F-word.

Here’s what I mean: If you try to always keep failure at arm’s length—if you tell yourself that “failure is not an option”—failure will eventually come to consume your thoughts. How can you not be focused on something when you’re constantly trying to avoid it? But avoiding failure and achieving success are not the same.

If you only remember one thing I say today, let that be it: avoiding failure and achieving success are not the same. We can avoid failure for a while and still be entirely mediocre—not failures, but not exactly successes either. But I hope that mediocrity is not what any of us want out of life, and that’s why I talk about failing well. It is by your response to failure that you will ultimately be judged, and to fail well, we have to retrain our focus.

Remember Jordyn Wieber’s tears, and then remember her joy as she helped push her teammates to gold in the team competition. It would have been easy for her to wallow in self-pity, overcome by the fact that her shot at individual glory was gone—but she didn’t. She accepted her failure, and she came back even stronger.

This is the essence of failing well. We don’t have to enjoy failure, but if we accept that failures will occur—if we accept that failure is a natural part of the learning process—we can rob failure of its power over us. Remember: failure isn’t something that only happens to less worthy or less talented or less motivated people. It happens to all of us, and sometimes it happens even when we’re working as hard as we possibly can. We’re all human, and none of us is good at everything. Even in the things we are good at, none of us can possibly be at our very best all the time—and that’s nothing to be ashamed of.

I hope that you will dedicate yourself to the task of failing well this year. It’s a tremendously powerful feeling when you experience the sting of failure followed by the triumph of overcoming it. You would do well to learn that feeling now, while you’re still in school. After all, schools are all about learning, and I for one hope that you will learn more than just names and dates and facts and equations and vocabulary words during your time at Saint Mary’s.

If those things are all you take with you when you leave this place, we will have failed you. So may we all—young and old alike—practice learning to fail well this year.

Thank you.