© Peter Steeves and Figure/Ground Communication
Dr. Steeves was interviewed by Laureano Ralon on October 24th, 2011.
H. Peter Steeves was educated at Indiana University and is a Professor of Philosophy at DePaul University. He has taught at Universidad del Zulia, Venezuela. His main areas of teaching and research include applied ethics (especially animal/environmental ethics and bioethics), social and political philosophy (especially communitarianism), philosophy of culture, philosophy of science, and phenomenology (especially the work of Edmund Husserl). He has published The Things Themselves: Phenomenology and the Return to the Everyday (SUNY Press, 2006) and Founding Community: A Phenomenological-Ethical Inquiry (Kluwer, 1998), and is the editor of and a contributor to Animal Others: On Ethics, Ontology, and Animal Life (SUNY, 1999).
How did you decide to become a university professor? Was it a conscious choice?
I have always been drawn to The Big Questions. Without answers to them, all other questions begin to seem contextually irrelevant. I went to school to study science, especially cosmology and theoretical physics, in hopes of finding out how the universe was created and why there is something rather than nothing—and thus, I hoped, what it all means. It soon became apparent, though, that there were certain questions that were not allowed to be asked in science—particularly the foundational why questions in which I was most interested. I decided, then, to pursue a Ph.D. in philosophy, though I wanted to keep a close relationship with physics. During my undergraduate years in physics, I also became distressed when I realized that I could become so obsessed with The Big Questions that I could easily be distracted from the everyday world and the responsibilities it demands. I went on to do Peace Studies and philosophy at Manchester College. Philosophy offered a way to work on ethics, social-political thought, and ways of making the world a better place while also understanding it, so this was another major draw. Becoming a professor in this or that discipline was not the end goal for me, though I have always loved school, I suppose. Answering questions and making a difference were and are the goals. To be honest, I am not sure if I will continue to be a university professor in the future. I am looking elsewhere in the world, still. The future is open—this is what makes it so full of hope and also so terrifying.
Who were some of your mentors in university and what were some of the most important lessons you learned from them?
You have asked about mentors during my university days, but to be fair, the teachers that have had major influences on me stretch back to my earliest days in school. Mrs. McDonald, my first grade teacher, missed her lunch break one day to stay in the classroom with me to help me finish a poem I was writing—a small gesture that has stayed with me all of these years (and helped me always remember rhymes for the word “horrid”). In the third grade, Mrs. Core made me feel as if I were truly cared for, giving me special reading assignments and extra projects outside of class when my life at home was not so happy. In high school, Mr. Seiling, my math teacher, began each semester by having a discussion about what a number is—a thorough and exciting trip from Platonism on. He loved his work, and he became a friend and mentor outside of school as well. To be honest, to name all of the rest of the teachers that had an impact on me from those early years would take hours.
During my undergraduate years, I was lucky that the trend continued and that I ended up studying with amazing teachers and people. Dr. Ken Brown and Dr. Steve Naragon were instrumental in my philosophical training. Dr. Charles Klingler, my literature professor, continues to be an inspiration to me today. He showed me how love of knowledge, coupled with a sincere respect for a student and a commitment to excellence, can change a life forever. He is an incredibly intelligent man, but also one of the most thoughtful and socially-engaged people one could ever hope to meet. Prof. James Hart, my dissertation director and emeritus professor in the Department of Religious Studies at Indiana University, introduced me to Husserl, and to possibilities in philosophy that were beyond the analytic tradition I was then studying. His mastery of Husserl, astoundingly original work in ethics and social thought, and general scope of knowledge is a model to which I can no doubt merely aspire.
In each of these cases, there is a kindness and compassion that accompanies a great intellect. I suppose that this is one thing I have always admired and surely have learned from past teachers: be interested in everything; be rigorous in your work; and remember that there is always an ethic to knowing (and thus an ethic to learning and teaching).
Joshua Meyrowitz’s thesis in No Sense of Place is that when media change, situations and roles change. In your experience, how has the role of university professor evolved since you were an undergraduate student?
I am not sure that the role of the professor has changed, though society—driven by the ineluctable (at least for the moment) force of capitalism—has attempted to turn us into something closer to “trainers” rather than teachers. The fact that so many people think of college as merely a necessary stepping-stone to a better-paying job does exert a force on the profession. I’m not in a position, really, to say if it is worse or even how it has changed in the last twenty years, though. Each generation tends to look back and think that the earlier generation had it better. Perhaps my university teachers in the 1980s and ‘90s thought the same.
Your question, however, is about the way in which media change, and how this affects education. And here I suppose the biggest cultural problem we face is making sure that we do not let information stand in for knowledge. With so much information available to students, the myth that there is progress—that we know more today than others did in the past—is perpetuated and even augmented. There are many dangers in this sort of thinking. First, I am partial to Bill McKibben’s argument that we are actually living in an age of missing information. We not only know less than others did in the past, but we even have less information. The fact that I can turn on the TV and find out the due date for Beyoncé’s baby, Google from my office computer the name of the singing group the three female castaways on Gilligan’s Island formed in episode 48, or get on a cell phone and find out how many vertebrae giraffes have in their necks, makes it seem as if—even without addressing the question of what constitutes real knowledge—surely we must admit that we have access to more information than ever today. But what is obscured is the information we do not have as well as the bad information we are getting. I know Jay-Z will be a dad in late February 2012, the castaways’ Honeybees were a threat to the Mosquitoes (who were themselves a satire on the Beatles), and giraffes and humans amazingly have the exact same number of neck vertebrae (seven). But in accessing this information, I also get the false information that “celebrities are important to my life,” “most things worth knowing happened after 1940 because 99% of television is about what happened after 1940,” and “a cell phone knows about animals; and now I know something important about giraffes.” I get the wrong information about the world, that is.
And this is not to mention the fact that all of this information is at the expense of information about the local and immediate world around me: information that is almost always more important for achieving real knowledge. So the first thing to say is that I think we actually have less information available to us.
The deeper point, of course, is that in order to have true knowledge there must be some data, some “information,” but we must know how to think with it, what it means, how to put it into context, and what to do with it. Information is not knowledge; information does not require thinking. We are in danger of letting media think for us—and media don’t really think. As a professor, I have to make this clear to students, which can be a challenge because the culture at large—especially as it is run by corporations and the ruling elite—is constantly sending the opposite message: “You know a lot; you know more than anyone else; you are in the age of information (and your iPhone is outdated, loser. Buy a new iPhone!).” Perhaps it is all faster and thus more challenging due to the current technology. To be sure, each form of technology carries with it its own set of values. A cell phone is not just like a phone only it’s now mobile: it creates a completely different person. Email is not just like regular mail only now it’s faster: it is a totally different form of communication. All tools carry values that are more or less necessary to their being. When we pick up a hammer it demands we hold it a particular way, use it a particular way, and view the world around us as a hammer-user. The same is true for picking up a cell phone, using email, or Googling something. These tools necessarily force values on us. That being said, I guess I see this information vs. knowledge problem as part of the Enlightenment project’s misdirected hubris in general. When we think everything is knowable, everything can be abstracted and known, and everything can be reduced to information, we are already on a course for disaster. The Enlightenment wanted to shed light on everything, to live in perpetual daylight. That’s the road to insomnia—and madness. As a professor, I am caught up in that Enlightenment project even as I try to critique it. 
And I am caught up in the web of values that the tools and media create today even as I try to critique them. I personally do not use a cell phone, but I can’t help but be forced to use email in my profession. And no matter what I use or don’t use, the point is that the culture in which I am immersed already has these values built-in such that personal choice is not really relevant. I am a cell-phone-iPad-Xbox-texting-Angry-Birds-skyping person even though I purposefully don’t personally use or own or do these things.
To answer your question succinctly, then, the role of the university professor has changed since I was an undergrad in that I never tried to text secretly while sitting in class like students do today; and by the way it’s never a secret, and I see it, and it’s obvious what’s going on, and it’s really annoying, so please stop it, please stop putting your cell phone under your desk and pretending you aren’t typing on it, please for the love of God, please, stop texting!
What makes a good teacher today? How do you manage to command attention in an age of interruption characterized by fractured attention and information overload?
I have reservations about saying what makes a good teacher in general. One can say such things as “Has respect for the students” or “Shows true enthusiasm about the work,” but on a certain level these become trite or hollow. They are both true, of course, but these qualities can manifest themselves in so many different ways in different teachers that they are not really constructive in terms of pedagogical training.
When I am mentoring graduate students who are just beginning to teach their first classes, we spend hours and hours together talking about how to teach. Like anything, becoming a teacher takes practice. But a great deal of teaching is also in the preparation. When I know I am going to teach a new class, I prepare many months in advance. The syllabus construction is half the battle. Choosing the appropriate texts, constructing the course so that it has a narrative arc, deciding on the written assignments—all of this constitutes a massive amount of work, and this is work that will define the potential of the course long before anyone steps into the classroom. Therefore I tend to stress the importance of course design and syllabus construction.
Then comes the writing of the lectures. I will spend months preparing lectures, finding the best way to explain a difficult concept, coming up with metaphors and analogies and stories to illustrate a point, searching for non-traditional ways to get the students truly thinking. Preparation is again the key. But the way in which the class then unfolds needs to be determined in large part by the personality and the specific talents of the instructor. I tend to lecture in the classroom. I have never had students break into discussion groups or do group projects to present to the class or get in a circle to talk with each other about things. Perhaps these techniques work for other professors, but I tend to feel that our time is so limited in the classroom that I can’t let so much of it go. During lecture, we have “discussions.” I will ask questions and get the students involved, but in general I am trying to model what it is to think, model what it is to be a philosopher.
Teaching is performing. This doesn’t mean that it is about entertainment and show-biz—quite the contrary, actually—but it is an act of performance, as are all cultural roles and relationships one takes up. There are costumes we wear, a set on which it all unfolds, and archetypes for how we structure the drama. I am different when I am teaching than I am when I am taking up all other aspects of my life. I have found that embracing this truth rather than trying to hide from it and collapse the classroom into any other sort of everyday space or event leads to a better experience for everyone. There is also an inherent power imbalance in the classroom. The professor has the chalk, the lectern, and the power to grade. Again, embracing this and taking responsibility for it is, in my opinion, the honest way to proceed. Ethics, in fact, is not about eradicating all power imbalances. It is often about making sure that one uses one’s power justly and for the good of others. Consequently, I am always learning and always open to having new experiences in the classroom, but I try to take seriously the fact that there is a reason I am on one side of the lectern and the students are on the other side. I take seriously that it is my job to put to use the many years of training and thinking that I have completed toward the goal of helping students learn. Striving for social justice—in society and in the classroom—thus takes center stage.
In the end, then, preparation, modeling of one’s craft, and sheer joy and enthusiasm for one’s work cannot help but be inspiring. Yes, we are competing with a world that beeps and blips and flashes and screams and declares that whatever you are doing now is not as important as this other thing that just happened and to which you should be paying attention because you could be doing it right now. But the questions with which we deal in philosophy are themselves the draw. If we state them well, allow them to speak to the students, and make evident the sense in which these are foundational questions for any and all other sorts of questions one could ever ask about anything, it’s philosophy itself that wins the day.
As you probably know, RateMyProfessors.com has included you in their top 20 list of best university professors. What advice would you give to young graduate students and aspiring university professors? Do you believe in faculty/university rankings?
I guess I have just more or less covered my advice—shoddy as it is—for graduate students aspiring to become professors. It is difficult to say things in general that have much import. I suppose I might only add that it is important to remember that though we all want to read and publish and do our research, we have a charge as educators. Teaching is not some ancillary task to the profession. Taking that responsibility seriously, then, means devoting a great deal of one’s time and energy to the preparation and the activity of teaching.
As for this ranking business, it, too, is a difficult question. The obvious and most sincere answer is that rankings are an absurdity. We are a culture that is obsessed with winning and with some “objective” way of evaluating something. It is ridiculous to think, though, that anything of any importance is communicated by saying that Professor X’s lecture today was a 3.4/5—or worse yet, that Professor Y’s lecture was a 3.3/5 so it was not as good as Professor X’s. I am reminded of Alfie Kohn’s point about the insanity of the Olympics. One downhill skier gets a time of 37.645 seconds for her run, and a second downhill skier gets a time of 37.644 seconds, making her the better skier. What has gone wrong in a culture that finds this last sentence meaningful? Husserl taught us, too, that the post-Galilean attempt to mathematize the world is always a doomed project (not to mention a morally-bankrupt one). So rankings are foolish and a waste of time. This goes against the academic culture where assessment and evaluation are the growing norms, but the truth is that I believe all of these sorts of evaluations cause more harm than good, actually. Not just those that rank, but even those meant merely to assess, review, and evaluate. That, of course, is another conversation.
What can be said in the positive, though, is that even given the messed-up systems of our culture, there are many students who took the time and effort to go to the website you mentioned and say nice things about me and our time together in the classroom. Because of this, I was given this award. And I am truly and sincerely grateful for it and humbled. Not so much by the ranking, but by the fact that so many students would take the time to do this, making an attempt to say something good and positive in the only way that our culture really allows. With all sincerity, I am grateful to them and truly hope to live up to their kind words.
In 1964, Marshall McLuhan declared, in reference to the university environment that, “departmental sovereignties have melted away as rapidly as national sovereignties under conditions of electric speed.” This claim could be viewed as an endorsement of interdisciplinary studies, but it could also be regarded as a statement about the changing nature of academia. Do you think the university as an institution is in crisis or at least under threat in this age of information?
That’s an interesting dual reading of McLuhan, to be sure. And a fascinating idea. The question of interdisciplinary studies is a monumental one. I could go on about this topic forever, but suffice it to say that I think this is a key to our collective future in academia, but it is also a great threat. There is a lot of talk these days about supporting interdisciplinarity, but when it comes down to it, there really isn’t that much institutional or financial support anywhere. And to be honest, there is a lot of shoddy work that is interdisciplinary in nature. The best interdisciplinary work holds out for the highest standards of rigor in all of the combined fields in which one is working. But such rigor in thinking, methodology, and training is precisely what is absent in so much of the interdisciplinary work being done today. Not all of it, but certainly a good deal of it. And this is a terrible waste.
One cannot fully blame the individuals involved. Some of my best friends are specialists! It is, rather, our culture that demands specialization, our capitalist job market that demands one carve out a very small niche over which he or she is lord, thus increasing the chances of procuring a job in that specific area. In a job market where it is typical that 300+ candidates apply for a single job, how is a candidate going to stand out? You can’t say: “I know a lot about philosophy.” You can’t say: “I know a lot about Heidegger,” even. So you have to say: “I am the best scholar in the English-speaking world when it comes to comparing Heidegger’s introduction to Being and Time with his unpublished lecture notes from the early-spring of 1925.” Given that academia is just a business, that the job market is merely another market, competition drives specialization.
And it is our further twisted thinking that makes us believe that being smart about subject A is necessarily at the cost of not knowing much at all about subject B: that being a jack of all trades must also mean one is a master of none. Anyone trying truly to become well-rounded and rigorous in more than one area of study is, to be honest, not looked on favorably. This is ridiculous, but it’s the norm.
When McLuhan was writing, new areas of study in the university were being created. Black Studies, Women’s Studies, Gay and Lesbian Studies, etc. were all just burgeoning, and there was great hope that the eroding of departmental boundaries would take place and create an institution where territory-marking and defending were far less important than making real headway in areas of social justice as well as inquiry into the nature of the world that didn’t demand a tunnel-vision of specialization (and adherence to an historical canon of training that was ethically and politically questionable). I believe that for the most part, the promise of those times was not kept, the hope was not fulfilled. There have been some good things that happened because of this, to be sure, but those areas that have hence not been utterly marginalized, with mere lip-service given to their importance, have tended toward a re-enactment of the same closing off. Interdisciplinary studies have become their own isolated disciplines. Again, this is not to put down the people doing this work. Not at all. It is, instead, to see the institutional and cultural forces that drive the academy as so powerful that we have not really begun to address what is, at root, truly the problem. Thus, the eroding of departmental sovereignty in this particular sense of McLuhan’s message has not really taken place.
Which brings us to the second reading of this claim you have raised—a reading I think is directly linked to the first, for one of the reasons that the university is in crisis is precisely due to the forces that have driven specialization. These are the same forces that drive us to think of the university as a training ground for corporate America, by the way. The same forces that have something to gain by recasting education as a business of acquiring information rather than an art of knowledge. The simple fact of the matter is that the humanities are in a state of crisis at the university. This is without question a matter of record: look at the number of philosophy departments and programs shut down and disassembled in the last few years; look at the number of language classes and departments, comparative literature programs, and the like, all canceled and done away with. One cannot say that the humanities are not under attack. There are multiple forces driving this, of course, but I do think that part of it is that we have begun to model higher learning in the same way we model agribusiness: monoculture cropping marked by a disregard for diversity and the way in which all things interconnect. There is thus a parallel, I would argue, between the desire to plant only one type of potato (out of the thousands of different varieties) for acres and acres, and the desire to have professors that do one thing and one thing only. “Mono-cropped” thinking is just as dangerous. And perhaps they are both driven because this is what McDonald’s wants. When farming becomes about serving corporate America (because the assumption is that farmers don’t have to care about feeding people because corporate America will serve this role down the line), we all suffer. And when education becomes about serving corporate America (because the assumption is that corporate America will then serve the students when they are “workers”), we all suffer.
It’s a sort of trickle-down theory of education. Mono-cropping is never good in any context. We could understand McLuhan’s claim about the erosion of sovereignty, then, as a hope that a monolithic model of intelligence and knowledge must give way to a new sort of thinking—one that will have an impact on the institutions in which we participate as educators.
Speaking of McLuhan, I am sure he would have liked your theatrical lectures. How did you come up with such an innovative pedagogic format? Do you feel as though your mode of delivery supports the oft-quoted phrase The Medium is the Message?
Thank you so much for the kind words and for the generous claim that McLuhan would have liked the “lecture-performances” I have been trying to do. These performances are really the main focus of my work in philosophy these days.
You are exactly right to point out that the idea is driven by the claim that the medium is the message. That is, the way in which an argument is presented can be—and, in fact, always necessarily is—an important part of the very content of the argument itself. As I began to move away from analytic philosophy in graduate school, the lack of this sort of thinking is one of the things that most troubled me. The world cannot be reduced to p’s and q’s and a set of modus ponens arguments. In the early-1990s as a graduate student teaching assistant to a world-famous logician (a man, I must say, I still admire and respect, even if I disagree with much of his work), I once was leading a Friday lab/discussion section of a large lecture course, teaching the students how to translate arguments into first-order logical notation. One day I brought in some poems and asked the students to perform the same translations on them. I had some Emily Dickinson, some Galway Kinnell, some T. S. Eliot, and other assorted favorites, including Wallace Stevens’ “Disillusionment of Ten O’Clock”:
The houses are haunted
By white night-gowns.
None are green,
Or purple with green rings,
Or green with yellow rings,
Or yellow with blue rings.
None of them are strange,
With socks of lace
And beaded ceintures.
People are not going
To dream of baboons and periwinkles.
Only, here and there, an old sailor,
Drunk and asleep in his boots,
In red weather.
With a bit of hard work and imagination, one can go about a first-order logic translation here, but what is lost? What refuses the translator’s work? What might be the very point of the poem itself that is obscured by the logician’s formatting and reduction? Context, audience, space and time, the lived-experience, the medium—none of this shows itself in first-order logic, but all of this is incredibly important to meaning. This was, perhaps, the first time I began to think all of this through in a systematic way. It would take several years before I was brave enough—or perhaps foolish enough—to think that even the way in which I was making arguments within the confines of Continental philosophy was restricting me from saying what I really wished to be saying. And so I started to experiment with the medium.
It is not something I have come anywhere near “perfecting.” By no means is it something with which I am yet satisfied. But it has been liberating in many respects to tell myself that if I have an idea, a thought, a claim I want to work through and share with my peers and my community to see what they think and to see how we might think together, I should not limit myself to the traditional formats of my discipline. In the mid-1990s, this began to inform everything I was doing: in the classroom, at conferences, and in terms of my writing.
I began writing essays for books and journals, for instance, that incorporated a great deal of story-telling, autobiography, and a sort of “creative non-fiction” aesthetic. A dozen years ago, and still to this day, it is, to be sure, wildly unpopular and often met with a strangely emotional rejection. This itself is rather intriguing, I think. (I keep many of my “favorite” rejections. One reviewer for an essay I submitted to a journal on Latin American political theory—an essay that I had written in a style that mimicked a sort of “magical realism” in its philosophy—wrote in his or her anonymous review: “This essay should be rejected. If the author is making this up, he is a liar. If he is telling the truth, he needs serious psychological counseling. Regardless, this is not philosophy.”) I count myself lucky that a handful of editors and reviewers have allowed me the chance to write in this hybrid style I have been trying to develop, acknowledging that there are, indeed, multiple ways for an argument to show itself. The medium, indeed, is part of the argument itself.
In the classroom, I began experimenting with ways to engage the students in new ways as well. As I mentioned before, I tend to lecture most of the time. But I also try to find unexpected ways to make a point outside the confines of a lecture and really even the medium of language. In an undergraduate class on philosophy and culture last year, for example, we were looking at the Disney Corporation as a touchstone for corporate control and pervasiveness, and we talked about how it is nearly impossible to eat things at Disneyland that are not in the shape of Mickey Mouse’s head. Cookies, pretzels, waffles, pancakes, even tortilla chips are all Mickey-shaped. In order to think through what it means to ingest an icon—to transubstantiate a mascot of meaning—I baked cookies for the class. I decorated 29 of them with officially-licensed Mickey Mouse™ candy images. And on one of the cookies, I used white icing to paint a swastika. As the plate of cookies went around the classroom, people were shocked to see the swastika cookie (a mixture of gasps and uncomfortable laughing), and each took a Mickey cookie instead. As we ate the cookies together, we discussed how difficult it was to pick up a swastika cookie and eat it simply as a cookie. What sort of person would I be if I did that? Could I separate any meaning of the image from the act of ingesting it? Shouldn’t I be asking who had made this cookie, and why? And then we discussed why we are all reluctant to ask the same questions about the Disney cookie. Being forced to eat Mickey or to eat a swastika made the point visceral and lived. It created, I think, a space in which the more traditional philosophical points that I then made could flourish. All together, it formed an argument.
For another class on the nature of comedy, we were studying the work of Andy Kaufman, putting it in the context of what we had read by Freud, Kant, Bergson, and Hobbes. Andy often tricked his audiences, and made them feel frightened or angry or sad before allowing the comedy to take shape. Other times, he created child-like moments of silliness and innocence to remind his audience that life itself can be something about which we can smile. In his Carnegie Hall show, for instance, he both faked an old woman having a heart attack on stage and also had Santa arrive at the end of the show (when he invited everyone to leave on buses to go eat milk and cookies at another hall he had rented). It was a mixture of fear and delight, and the result was a wide-ranging sense of comedy.
And so on the last day of my class, I planned a spectacle. It was June, and I hired Santa Claus and an elf to burst into the room, pass out presents to the students, and eat milk and cookies with them. After a little while, Santa suggested we all go caroling around the department and throughout the building. Of course, this was loud and disruptive (and it was June), but I made all 35 of the students file down the hallways, singing Christmas carols, passing out candy canes to the confused people we met. A few minutes after returning to our classroom, the then-Chair of the philosophy department, Rick Lee, came in to complain about the noise we had just made and how we had bothered him during his office hours and his appointment with a graduate student. I apologized to Rick, but he continued to seem angry about it all, saying that he was fed up with my “stunts” and my lack of philosophical rigor. I was quite nervous, trying to defuse the situation with apologies and little jokes here and there to remind him that it was all in good fun, but it was a very tense moment. He stayed in the back of the classroom, refusing to leave as I continued with the lecture, angrily muttering under his breath and taking notes in order to use them against me. He spoke up again about ten minutes later, embarrassing me in front of the students by saying that I conflate thinking and performing in my published work as well as in the classroom, and I foolishly think that performing is all that is necessary for good teaching. He said he was going to speak to the Dean about my behavior. Not knowing how to respond to this, I asked him why performance is something that is necessarily separate from Being, separate from thinking, and he said that it didn’t matter because I was incapable of understanding any actual argument he would give me. In a moment of nervous anger, I said to him that maybe the problem was that he couldn’t perform at all. He was silent.
And then I added: “At least that’s what your wife told me in bed last night.” At this point, Rick rushed toward me and slapped my face full-on, harder than I have ever been hit in my life. He knocked my glasses off my face, and the blow actually knocked me to the ground. The students grew instantly silent. I composed myself and lunged at Rick, pulling him to the ground with me, where we wrestled and punched at each other. After a few moments, a student screamed “Oh my God!” and went for the door to go call security. It seemed that our happy Santa-stunt had all gone wrong. Outside the door, however, was Spider-Man. Spider-Man rushed into the classroom, separated us, and broke up the fight. Rick and I moved to opposite sides of the classroom. As I fumbled for my glasses, I said “You know, maybe the whole problem is that we just don’t see eye to eye on things, Rick. For instance, you say po-tay-to and I say po-tah-to.” Rick replied “And you say to-may-to and I say to-mah-to.” And then we rushed toward each other, put our arms across each other’s backs, and sang the entire “Let’s Call the Whole Thing Off” song for the class, still panting, sweating, and slightly bloody from our fake-yet-real brawl. Over the course of a few minutes, the students could thus feel the sense in which laughter directly following fear is a different sort of laughter altogether. They could do a first-hand analysis of what it means to experience exactly what we had been studying over the past several classes. I had bruises and a welt across my face for days, but there was a space that opened up in the classroom for an examination of the experience of comedy that was not there before. It’s my hope, at least, that planning such an extravagant performance was able to accomplish something that would have been impossible otherwise.
Taking this to the level of a conference presentation or a lecture for my peers was the final step for me. I had wanted to do something performative for a long time in these settings, but I was uneasy about taking the medium of a scholarly lecture and trying to upend it in front of a larger audience—and especially an audience of peers and colleagues. I had been adding PowerPoint images and even recorded music to some of my conference lectures for years, but it wasn’t until about six years ago that I decided once and for all to embrace the idea and perform full-on “lecture-performances” that I now call “shows.”
For “The Energy Show” I built a Tesla coil that shot eight-foot-long lightning bolts at me while I lectured. I made a person in a glass pyramid disappear on stage in a puff of smoke and immediately appear at the back of the auditorium to illustrate the equivalence of matter and energy. An a cappella choir sang Radiohead’s “Creep” while I built an extremely high-voltage Jacob’s Ladder on stage. A live funk band played at the end of the evening as someone came on stage to put a robe around my shoulders (like James Brown) and walk me slowly off stage, telling the audience “He can’t go on! You’re killing him! He hasn’t got the energy! He can’t philosophize more!,” only to have me throw off the cape (like James Brown) and come back for another few minutes of lecturing. (This happened, of course, three times.) The lecture itself focused on Heidegger, Nikola Tesla, Michel Serres, Bataille, and Aristotle.
For “The Wired for War Show,” on the topic of the relationship between technology and violence, I had a military drum corps play at different times during the lecture while I broke cell phones, poured acid over a Blackberry, and smashed an iPad over my knee (just a month after the first iPad had been released). Audience members came up on stage and played violent video games behind me while I lectured, flanked by large live video projections of my face (so that I was thus competing with images of myself for the attention of the audience). I sang two songs with a band made up of students and faculty from the college I was visiting, and for the last song of the evening had everyone in the audience get out their cell phones, call the person sitting beside them, and put the call on speakerphone, thus setting up an eerie and otherworldly feedback loop of sound that was the sonic backdrop for the band’s final number (The Decemberists’ “Sons and Daughters”). We thus took a bit of technology meant only to be used for the good of corporate America when we are apart (and, to be sure, to keep us apart), and instead made it work only when we were actually together, closer together, to make art.
For “The Mourning Show,” a string quartet played Mozart’s Requiem while I showed images and spoke about the death of the universe itself. Three different dances of mourning were performed (including my own tango with ‘Death’ on stage). Two audience members “spontaneously” arose at a certain point in my lecture and performed the scene from “Hamlet” in which Hamlet confronts his father’s ghost, quietly sitting back down in each other’s seats as the scene ended and the lecture began again. At another point, a member of the audience got up while I was lecturing, came on stage, pulled back a curtain to reveal a drum kit, and started laying down a beat. Another audience member soon joined him, playing bass. Another came up, playing banjo. Another came up and started to play the piano. And at a certain point, I stopped the lecture, walked over to join them, picked up a guitar, and sang David Bowie’s “Five Years.” The song ended as it began, with each person stopping one-by-one—in reverse order—and retaking his or her seat, seamlessly, as I spoke. ‘Death’ appeared at different times as well when a large lighthouse bell rang, going into the audience, choosing an audience member over whom she sprinkled dried rose petals, and then covering that audience member completely with a black cloth. The audience member sat throughout the whole performance, covered, without moving. Though I had arranged this with each of the three audience members chosen before the show, no one else knew this to be the case. And so the nervousness that everyone felt each time ‘Death’ entered the audience to cover someone was real. No doubt people were nervous because they might be singled out and might have to be covered and participate in the show. But regardless of why they were nervous, they should be nervous when ‘Death’ is around! At the end, I stopped ‘Death’ from making a final trip into the audience, lay her on the ground, and fell to the floor beside her, weeping.
A mariachi band then entered from the doors at the back of the theatre, and as they played, members of the audience came up on stage, picked up ‘Death’’s body, and carried her away, at which point I arose and screamed for an end to all mourning, an end to all death. This, and many other performative elements, were all intermixed with a lecture on the nature of mourning, specifically using Derrida, Heidegger, and Freud to think through what it means to face the death of the Other.
One of my favorite moments came when I was lucky enough to be asked to participate in the Chicago Cultural Center’s annual SITE UNSEEN art exhibition. My show was entitled “You Are Here: Maps and the Nature of Occupation.” I filled several huge rooms of the Cultural Center with installation art pieces, and performed a show in the main theatre there. I spent several months making the art pieces, and it was incredibly rewarding and exhilarating. For one of the installations, several robotic mice that I had built were exploring and trying to solve a large maze (about 150 square feet). There was also a robotic scientist (dressed in a white lab coat) in the maze that the audience could control with a remote, either helping or hindering the mice along the way. Another exhibit was about mapping the human genome and included a carved mannequin, a fan of rotating knives, and a test tube mixer tossing around antique marbles labeled G, A, T, or C so that they clacked and clicked, echoing all throughout the hall. There was also a lighted antique microscope as part of this piece such that the audience was invited to peek inside and see an actual bit of the sequenced human genome. When someone went into the booth (which I built out of amazing 150-year-old barn wood I found), a motion detector triggered hidden speakers that played Gregorian chants directed at the ears of the viewer so that only he or she could hear the music while leaning down to look in the microscope. And inside the microscope the person merely saw—in the tiniest lettering—the words “You are not here.” In another piece, a treasure chest in the corner of one room held 500 treasure maps with clues for finding three “treasures” I had hidden around the downtown Chicago area (e.g., a gold bracelet worth $100 in the pages of an obscure book at the public library).
This piece was called “Buried Capital is a Dead Thing,” and the clues—which required some knowledge of history and philosophy—were printed on the backside of a single-page lecture I had written on Baudrillard, Marx, and the relationship between capitalism, exploration, and mapping. A different exhibit offered what was advertised as a live video feed from Baghdad with a monitor hooked up to a large satellite dish. On the small black and white video screen inside a deep wooden box, there could be seen a panoramic shot of the desert, with someone yelling every once in a while in Arabic in the distance. The video was actually recorded and not live. And every 120 seconds, the video looped, with the pan coming full-circle and always ending up with a shot of Las Vegas. The desert being shown was actually in Nevada, not Iraq. And anyone watching the monitor who understood Arabic would have heard the woman yelling: “You are here. Who are you? America is here. You are not here. What you are seeing is not real. What you are seeing does not exist on the map where you are looking, though it, too, is a territory that is being occupied. The same values that occupy this territory are the values that have led you to occupy Iraq. What you are seeing is not real. It is also the most real thing here.” Nearby, a very old robotic mannequin that I had rebuilt and dressed as a little boy playing war and wearing a gas mask was turning and spinning around, holding a full house poker hand using the “Most Wanted Iraq Playing Cards,” with a 16mm projector lighting him with old Vietnam war newsreels. There was also an exhibit where you could hear sounds from NASA’s latest solar system mapping probe. In this piece, you could also talk to God (actually, my colleague Bill Martin) as he created more spacetime using a bubble machine (and playing his electric bass) while he sat in the middle of the five universes I had created within the room.
Half a dozen other exhibits filled the rooms I was given at the Cultural Center as well; and in the theatre, for a show on maps, we served food from around the world to the audience, had world-dancers performing (tango, flamenco, bellydance, Indian, ballet, modern, Indonesian, breakdance, etc.), and actors on stage made paper airplanes out of maps, throwing them into the audience while I spoke about the way in which colonization and inquiry are related in all manners of human activity and thought.
Another time, for a lecture on zombies, I had half a dozen zombies come out at the end and stalk the audience, ultimately doing the Michael Jackson “Thriller” dance together to end the talk. I’ve also danced with Grover from “Sesame Street,” led an impromptu karaoke competition, plucked ‘photons’ from the air while Metallica played, and been chased by police in the middle of various lectures. Over the past few years, we’ve had singing and dancing and magic and theatre and general mayhem at one time or another. And to be sure, none of it would be possible without the participation and collaboration of so many talented people who have joined me on and off stage to make all of these shows possible. This is all definitely in the first person plural.
In the end, I hope that it is all on the way toward a new medium for philosophical inquiry: a mixture of scholarly lecture and (something that at least aspires to) art. I do believe that sometimes the medium of a song is necessary to make a philosophical point. The medium of dance is the only way to express a conclusion. The medium of a magic trick is key to making an analysis of identity stick. I know it is all just a work in progress, but I hope very much that it is helpful in making the arguments that I hope to make.
Let’s move on. Among your publications are The Things Themselves: Phenomenology and the Return to the Everyday, (SUNY Press, 2006) and Founding Community: A Phenomenological-Ethical Inquiry (Kluwer, 1998). What attracted you to phenomenology?
I tend to be the sort of person who is attracted to an idea and even some notion of ‘the truth’ more than I am attracted to a philosophical figure or movement. I like Husserl and I continue to work on Husserlian phenomenology because I think Husserl got so many things right. But when he isn’t right, I have no stake in defending him. There is something strong and compelling about the phenomenological method, even after the Derridean critique.
One of the things I particularly like about phenomenology is the way in which it puts to bed an entire history of metaphysics—and it does so by returning us to the world. I love philosophy and the abstract, but our discipline, perhaps unlike any other, runs the risk of missing the forest for the trees, of thinking that thinking is all there is to think, of putting Descartes before the horse. Phenomenology gives us a rigorous and insightful way to think about the world around us. For instance, I would argue that the Mickey Mouse/Nazi cookie experiment I mentioned before is best thought through with phenomenology. And this doesn’t mean, of course, just describing one’s experience in a sort of psychological or anthropological manner. Such experiments can be insightful and I don’t mean to put them down (for instance, if you are not psychologically queasy about eating the swastika cookie, I want to know that about you!). But a phenomenological analysis looks at the experience of the thing in order to gain insights into the being of the thing and the necessary structures of our conscious engagement with the world. We learn something about categoriality, for instance, in the cookie example: how it is not merely “cookie + icing + meaning” but rather how the meaning of the symbol/icon is always already there with the cookie, how the thing itself is suffused with ethical import from the start, and how we are necessarily related to this object and the value-laden, public world it inhabits in a corporeal way. What the cookie is and what the cookie means thus become elucidated in an amazing way through phenomenological analysis.
Phenomenology is also important to me because it shows us that ethics is not merely one branch of philosophy or one part of life. As I began developing the notion of “phenomenological communitarianism” in the early 1990s, I came to see this clearly and to count myself very lucky for having been exposed to Husserl and the tradition (through Heidegger, Merleau-Ponty, Levinas, Derrida and others) to which he gave rise. Seeing the way in which the self and Other are co-constituting, and the manner in which Goods are always public and shared, was key to my own excitement about phenomenology, and a good part of why I continue to work in phenomenology today.
I suppose I have no idea what I am, really, but when I have to answer this question in a philosophic setting, I usually say I am a phenomenologist.
Prior to becoming a faculty member at DePaul University, you taught at Universidad del Zulia, Venezuela. How was your experience in South America and what do you make of the social, political and economic transformations taking place there?
To state it up front, I wish that there were a Chávez-figure for the United States right now. I wish that we could have a peaceful “Bolivarian Revolution” of our own. I believe that Hugo Chávez is an amazing force for social justice who, at the start of the new millennium, has put Latin America on a path that will have positive repercussions for countless people. Given his health issues and the political animosity we have shown him in the North, I only hope that he is able to continue his work and fulfill his promise in the years to come.
I went to Venezuela in 1992, just weeks after the failed coup Chávez had attempted when he was in the Army. I had never been in such a situation before, and I was, to be sure, nervous and constantly ill at ease—an American Midwestern fish out of water. When I returned on a Fulbright to teach and do research in 1998, Chávez had just been let out of prison and was running for president for the first time. It was during this election year that I truly began to understand his vision for Venezuela and the South; and it was the first time I realized that my own interest in Latin America, in politics, and in questions of social justice in general needed to be much more than merely intellectual commitments.
What is amazing about Chávez in particular, and the new direction that so many in Latin America are taking in general, is that these are political movements that utterly reject Liberalism and not merely neo-liberalism. That is, the basic assumptions of Western political theory that begin with isolated individuals, social contracts, and autonomous, solipsistic, monadic Selves coming together to form a state are all being questioned. You can’t call it socialism or communism or anything of the sort, because even these movements are part of the Liberal spectrum. Instead, you have something completely different.
Instead, for instance, you have a 2008 Ecuadoran Constitution that states that Nature has rights. Imagine if the U.S. had a Constitution that allowed a river or a tree or the environment to count as a member of the community! You have a 2009 Bolivian Constitution that reads: “The State assumes and promotes the following ethical-moral principles of a plural society: ama qhilla, ama llulla, ama suwa (do not be lazy, do not be a liar, do not be a thief), suma qamaña (living well), ñandereko (harmonious life), teko kavi (good life), ivi maraei (land without evil), and qhapaj ñan (noble path or life)” (article 8). Imagine part of the U.S. Constitution being written in Navajo or Cree, with the federal government promising it will help people be noble! And you have a 1999 Venezuelan Constitution that declares that all people have a right to free education, the ability to read and write, health care, a clean environment, and their own culture/tradition/language. This Venezuelan Constitution further guarantees everyone the right to rebel against injustice, and maintains that the people are the fifth branch of government itself and that the media have a duty to report truthfully because the people have a right to the truth. Furthermore, all of this is to be accomplished from the ground up at the local level wherever possible because “[d]ecentralization, as a national policy…add[s] depth to democracy, bring[s] power closer to the people, and creat[es] optimum conditions both for the exercise of democracy and for the effective and efficient fulfillment of government commitments” (article 158). Imagine if the U.S. Constitution were re-written from top to bottom with a Bolivarian sensibility, thus moving us closer to a real democracy rather than the sham, the shell, the imposter that goes by the name “democracy” in our nation!
This democratic-Doppelgänger in Uncle Sam clothing bows down before his corporate masters and thinks nothing of going around the world asking us to kill in his name. We need to look to the South. We need to imagine what is truly possible in politics and then find our own, local path toward that revolution. This is, in part, what I have learned from the Venezuelan experience.
What are you currently working on?
Thanks for asking. I have several essays and book chapters coming out over the next several months, including ones concerning Samuel Beckett and mourning, Plato’s animals, the films of Michael Haneke, the image of the hand in environmental ethics, the ethics of eating vegetarian meat substitutes, the role of vision (and what is not seen) in oil painting, and the nature of repetition in human and nonhuman language.
There are also four books that will come out in the next few years: one co-authored book on Chávez and the possibility of a non-Liberal democracy; another co-authored book meant for a wider audience entitled “The Lost Socratic Dialogues”; one looking at “The Sopranos” as a touchstone for American culture and communitarianism as the 20th century gave way to the 21st; and one on the relationship between performance and philosophy (which will also include the collected “scripts” for several of the shows I have put on).
I’ll be putting on two new lecture-performance shows in the next year. “The Money Show” will investigate oikonomia and chrematistics, looking at the way in which value is created in the world, and the way in which money functions to break down the binary between sign and signified as well as engender infinite desire in its infinite iterability. And there will also be “The About Time Show,” which will be an investigation into the nature of time—its physics and its (post)metaphysics (plus I’ll also be building a working [?!] time machine on stage that will take the audience back to the 1970s and replace me with a seven-year-old version of myself at the conclusion).
At DePaul, I’m developing a new undergraduate course on the question of freedom and determinism, looking at the historical way in which the question has been shaped and has unfolded. And I’m working on a new graduate seminar on the meeting of deconstruction and science.
In terms of research, most of my time is devoted to projects that are philosophical in nature, but not, perhaps, strictly-speaking philosophy. For the last ten years I have been visiting NASA Ames and spending a good deal of time on astrobiology, pre-biotic chemistry, and the origin of life. There are some incredibly brilliant people working at Ames and in this field in general, and it’s a real privilege to be allowed to be part of that conversation. I think that within the next decade we will have a good sense of how life first arose on Earth—and how it might, thus, arise elsewhere. The move from inorganic to organic chemistry is fascinating, and discovering how our own genesis event took place will be one of the great breakthroughs of our time, I believe.
Finally, the majority of my research time has recently been devoted to questions concerning cosmology and the origins of the universe. In philosophical terms, it is the “why is there something rather than nothing” question. Along the way, though, it has become evident that the nature of the second law of thermodynamics—the law that tells us that entropy and disorder are always on the rise—is key to this investigation. Consequently, I’m trying to work through why scale seems to be important to the second law, and what it means for the law possibly to break down at the start of the universe itself. Interestingly, this requires, I think, a new understanding of what we mean by ‘a law of nature,’ and thus a complete refiguring of the very foundations of physics. Several years ago, Jesús Pando (a cosmologist and Chair of the physics department at DePaul) and I started an interdisciplinary reading and research group on these questions, and we have applied for an NSF grant to cover work in the lab to test some of our ideas. We are hoping that in the years to come, our work will bear some fruit that will change the way these questions are asked and addressed.
And I suppose this brings us full circle. Without confronting The Big Questions, I’m not sure what we are all really doing here, and the question of why anything at all is here is one of the biggest questions I can imagine. Trying to reconceptualize, refigure, and readdress the something-rather-than-nothing question thus seems paramount to me. And I hope what it leads to will be exciting for others as well.
© Peter Steeves and Figure/Ground Communication. Excerpts and links may be used, provided that full and clear credit is given to Peter Steeves and Figure/Ground Communication with appropriate and specific direction to the original content.