© Mark Kingwell and Figure/Ground Communication
Dr. Kingwell was interviewed by Andrew Iliadis on November 5th, 2012
Mark Kingwell is a Canadian professor of philosophy and associate chair at the University of Toronto’s Department of Philosophy. Kingwell is a fellow of Trinity College. He specializes in theories of politics and culture and he has published twelve books, most notably A Civil Tongue: Justice, Dialogue, and the Politics of Pluralism, which was awarded the Spitz Prize for political theory in 1997. In 2000 he received an honorary Doctor of Fine Arts from the Nova Scotia College of Art and Design for contributions to theory and criticism. He has held visiting posts at institutions including the University of Cambridge, the University of California at Berkeley, and the City University of New York, where he held the title of Weissman Distinguished Professor of Humanities. He is a contributing editor for Harper’s Magazine, and has written for publications ranging from Adbusters and the New York Times to the Journal of Philosophy and Auto Racing Digest. Among his twelve books of political and cultural theory are the Canadian best-sellers Better Living, The World We Want, and Concrete Reveries. In order to secure financing for their continued indulgence, he has also written about his various hobbies, including fishing, baseball, cocktails, and contemporary art. His most recent book, Unruly Voices, is an examination of democracy and civil society.
How did you decide to become a university professor? Was it a conscious choice?
It was more like a failure of choice! I had no intention of pursuing an academic career even at the time I started my doctoral degree. I had already spent two years in grad school, in Edinburgh, and I got my letter of admission to Yale the same week I was offered a full-time job at the Canadian newspaper, the Globe and Mail, where I had been working summers as a reporter. My brother was visiting and I bored him silly on long train journeys around Scotland, agonizing over what to do. I bothered strangers, importuned bartenders, the lot. It was, and remains, the only truly difficult career decision I’ve ever had to make—though turning down a named chair at McGill University in order to stay at Toronto comes a close second.
Even so, it was a choice about the next few years, not my whole life. It wasn’t until I was coming to the end of my doctorate that I really thought: “I want to keep doing this.” I had always alternated my academic schedule with regular work at newspapers and magazines, and there was always a sort of grass-is-greener feeling. I’d deplore academic abstraction when I was deep in dissertation writing, but then find myself appalled at how little ideas are taken seriously in most journalism. In the end, it seemed as though an academic career would offer the best of both, since I could write the non-academic stuff I wanted, from a platform of some intellectual stability, and not have to do anything I didn’t want to, like chasing ambulances or interviewing politicians.
The bad news was that I came to this conclusion in 1991, during one of the worst humanities job droughts in recent memory. I spent the next seven years in a series of lectureships, fellowships, and limited-term professorships—the dread ‘folding chair’. It was a tough time, and during those years I often wondered if I would have to do something else, like write full-time or try law school or the Foreign Service, as some of my grad-school friends did. And even though I published a fair number of journal articles and a peer-reviewed book during this period, my non-academic writing was more visible. This tended to make hiring committees uncomfortable, especially here in Canada, where we like to punish a person for anything that seems as though it could be construed as showing off. Being on television was even worse, of course.
Who were some of your mentors in university and what were some of the most important lessons you learned from them?
The most impressive undergraduate teachers I had were two political theorists, Gad Horowitz and Alkis Kontos, and a philosopher, Kenneth Schmitz. Schmitz was the only professor I had who still lectured wearing an academic gown, and I am always happy that I now serve as a fellow in the same college he did, Trinity at University of Toronto. (I only wear my gown to chapel and the dining hall, however.) My most important mentor, though, was my undergraduate thesis director, a scholar of 20th-century British literature, Walter O’Grady. He really pushed me to write better, clearer prose, and to argue with precision. He was not a philosopher but he had a gimlet eye for bad argument.
My M.Litt. supervisors at Edinburgh were a very quiet, careful scholar called Ronald Hepburn, who nevertheless wrote with wit and some ferocity, and Colin Manlove, who suffered from Graves’ disease. He was a brave man who loved books. They were generous and virtuous teachers. I was closer to my Ph.D. supervisors, though: Georgia Warnke, who offered a warm and encouraging intellectual presence, and Bruce Ackerman, who was brilliant, lively and funny. I learned most about teaching from Karsten Harries, who is also responsible, albeit post facto, for my scholarly interests in art and architecture. I admired Jonathan Lear tremendously for his writing ability. Likewise Harry Frankfurt, who left Yale for Princeton during my time in New Haven.
This was a period in which Yale philosophy was in disarray, mostly because of personal and ideological tussles. I remember Frankfurt being asked by a reporter from the Yale Daily News what the solution to the problems might be. He said: “Ritual sacrifice of senior faculty members.” The main thorn in everyone’s side was Ruth Marcus, who could be extremely abrasive, though strangely she was always nice to me. I once described her as a “cult-busting logician,” which I think she secretly quite liked, but then I wrote that she had called Jacques Derrida a “charlatan,” a claim she denied. I had no firm source, just hearsay, and had to withdraw that; but I know it was in fact her view.
I liked Yale in the late 1980s, despite all the feuding and confusion. It was exciting, really, to feel that there were squabbles that were, at least in part, about intellectual positions. Of course a lot of it was just personality—but what isn’t? And it sank the department’s rankings a lot, which probably hurt those of us who came out of there in the midst of all that dust.
In your experience, how did the role of university professor evolve since you were an undergraduate student?
Well, there are all the usual things: the use of technology, the shift towards a customer-satisfaction model of teaching, the relentless emphasis on universities as credential factories or job-training centres. And of course, if you oppose these views, it can seem as though you are arguing for the ‘mental spa’ model instead, where education, especially in the humanities, is reduced to a personal luxury good, like some kind of expensive book club for weirdos and nerds. What happens to liberal education, which conforms to neither of these two poles but is, instead, about creating an engaged, critical citizenry and increasing the body of knowledge for its own sake?
Given all that, one important change in the role of professor is that you can no longer take for granted that people will value the work you do. On the contrary, there is much superficial ridicule of humanities scholarship and of the life of the mind more generally. We live in an age of relentless economic valuation, where everything must be cashed out in terms of its transactional value. Universities need to push back against that, and so professors have an added burden of demonstrating the value of education. Not by showing that philosophy students do well on the LSAT—a move that gives the game away—but by showing why what we do matters to everyone.
That’s looking out. Looking in, the biggest change I see is how much students expect in terms of personal attention and responsiveness. Up to a point, this is good; but everybody needs to set limits. Otherwise, we’ll all drown in a sea of emails.
What makes a good teacher today? How do you manage to command attention in an “age of interruption” characterized by attention deficit and information overload?
I’m very much of the view that what makes for a good teacher doesn’t change much, despite advances (if that’s what they are!) in technology or changes in the details of disciplinary controversy. Enthusiasm, engagement, and openness are the key virtues of a good teacher. I think often of Jacques Rancière’s The Ignorant Schoolmaster (1987), where he advocates a position of equality for all members of a class or educational setting, including crucially the ‘master’, who may therefore abandon any pretence to expertise. This assumption can then eliminate the very idea of an ‘economy’ of attention, in the classroom and out. Premises about competition (what grade will I get?), resistance (is the professor pushing a view I don’t like?), and indoctrination (how can I replicate the desired ends of this professor?) all fall away. The classroom is now a scene of intellectual companionship and shared desire. This can be done even in very large classes, if one imbues the lecture theatre with an atmosphere of joy and fellowship.
Of course, there are many countervailing forces beyond just the usual ones of the academic system. Studies show that undergraduate attention spans max out at about ten minutes or less—perhaps much less. So any lecture has to be performed (I use this word rather than the usual one, ‘delivered’) with a rhythm that accommodates the realities of cognition. Socrates knew that education was sometimes more like seduction than tutelage. You don’t have to subscribe to the idea of maieusis, or intellectual midwifery of ideas already present in the soul, to accept that sometimes your job is not to put things into students’ minds, but rather to draw out what is there, and examine it. And make sure there is a laugh at least every ten minutes, to wake everyone up.
I guess I don’t, finally, view university teaching as in any sense ‘competing’ for attention. It’s not something that should be structured as a rival to television, or Facebook, or whatever. It has its own integrity and its own power, and if you do it right—with passion, humour, lots of examples, and enough precision to be accurate—the students will come. And, better, they will stay.
What advice would you give to young graduate students and aspiring university professors and what are some of the texts that young scholars should be reading in this day and age?
Unsolicited general advice is something one should never give! Young scholars have enough problems just finding work without getting heavy middle-aged guidance from someone like me. Anyway, I’m very much of the Aristotelian persuasion on the question of how to achieve excellence in something: nobody can tell you, they can only show you. You begin by imitating somebody who’s good at something, until, with practice, it becomes a sort of second nature. I learned how to teach by watching great teachers. It’s the only way.
Having said that, there is lots of practical wisdom one can share about the mechanics of university life. It mostly makes for depressing conversation, alas. I suppose the one piece of general counsel I would give is to make sure the institutional pressures of graduate school and pre-tenure university life never stifle the intellectual enthusiasm that got most of us thinking about graduate education in the first place. It has to be about love, a passion for the life of the mind.
I once said, rather pompously I guess, that I treasure philosophy, among other reasons, because it’s the only modern university discipline that has ‘love’ in its name (philo-sophia, love of wisdom). A colleague shot his hand up and said, “No no! Also biomechanical engineering!” He’d misunderstood me, but he also taught me something: biomechanical engineers love what they do. Philosophers have no monopoly on intellectual love. Awesome!
I do recommend a few books that address these questions, and the plight of the contemporary university in its reduced job-training and entitlement-conferring condition. There’s the Rancière I mentioned above. Also Bill Readings, The University in Ruins (1996) and William Deresiewicz, “The Disadvantages of an Elite Education,” The American Scholar (Summer 2008). Martha Nussbaum’s Not For Profit: Why Democracy Needs the Humanities (2010) is also worth a look, though it could have been a much better book. And finally, Laura Penny’s More Money Than Brains: Why Schools Suck, College is Crap, and Idiots Think They’re Right (2010) is, as the title suggests, funny and sad at the same time. (Readings and Penny are Canadian, by the way, so they offer examples from both sides of the border.)
In 1964, Marshall McLuhan declared, in reference to the university environment that, “departmental sovereignties have melted away as rapidly as national sovereignties under conditions of electric speed.” This claim could be viewed as an endorsement of interdisciplinary studies, but it could also be regarded as a statement about the changing nature of academia. Do you think the university as an institution is in crisis or at least under threat in this age of information?
Well, for the last part of that question, see above. Yes, absolutely under threat, mostly because universities have lost their way when it comes to core mission. Are we educators and researchers, or are we warehousing young people while they prepare for—or, more accurately, fantasize about—life in a world of competition and productivity? Rare is the senior administrator who can resist the economic logic that joins post-secondary education to employment statistics, threats of so-called underemployment, and the relentless cost-benefit analysis of everything. Bill Readings is particularly acute on this question, showing how the rhetoric of ‘excellence’ takes on an almost fascistic tinge of economic competitiveness, with global rankings creeping into the consciousness of even the most conscientious dean or president.
I don’t think McLuhan was at all right about disciplinarity, however. There was some disciplinary blurring during the 1980s and 90s, especially in and around the humanities and social sciences, but much of that blurring has since been reversed. Women’s studies, black studies, gay studies, postcolonial studies—at one point not long ago, no self-respecting research university was without a department or centre for each of these and more. Those days are gone, replaced by pervasive disciplinary retrenchment. Many such centres and departments have been closed or re-absorbed into more traditional units such as English or Sociology.
But this moment too is just part of a cycle. The history of universities as institutions is one long debate about how we divide up the work, about what counts as a valid method, and so on. I am mindful of Heidegger’s idea that methods always create the kinds of results they can produce, and then set about producing them. That’s all an academic discipline is, a machine for generating results. As long as we remember that, these tensions and fluidities around the idea of disciplines or their boundaries become less serious, and potentially more fun.
For myself, I am a humanist, in the sense that I see no reason not to think and write about philosophy, art, architecture, literature, and even film or television. I do this ‘philosophically’ in that I use argument forms and writing styles in which I was trained by other philosophers; but I have also experimented with first- and second-person writing, narrative, dialogue, monologue, and so on. The best aphorism for any writer and intellectual is, I think, the famous one from Terence: “Homo sum, humani nihil a me alienum puto.” I am human; nothing human is foreign to me.
I am, to be sure, a misanthropic humanist, but that’s another story.
In 2009, Francis Fukuyama wrote a controversial article for the Washington Post entitled “What are your arguments for or against tenure track?” In it, Fukuyama argues that the tenure system has turned the academy into one of the most conservative and costly institutions in the country, making younger untenured professors fearful of taking intellectual risks and causing them to write in jargon aimed only at those in their narrow subdiscipline. In short, Fukuyama believes the freedom guaranteed by tenure is precious, but thinks it’s time to abolish this institution before it becomes too costly, both financially and intellectually. Since then, there has been a considerable amount of debate about this sensitive issue, both inside and outside the university. What do you make of Fukuyama’s assertion and, in a nutshell, what is your own position about the academic tenure system?
I am on record in a few places as being philosophically opposed to tenure. Tenure cherished is self-defeating: it creates conformity under the sign of production. In that sense, Fukuyama is right about the way it inhibits risk. Tenure as practiced also invites conditions of injustice: able younger scholars are blocked from jobs held in perpetuity by older scholars whose only material advantage was the luck of birthdate. And finally, tenure is mostly unnecessary: those of us who have the privilege to work in democracies need have no real fear of censorship. (We may well have fear of ridicule, but that comes with the territory and should be regarded as a sort of back-handed compliment, or at least as a challenge.)
Of course, I don’t see much chance of tenure being abolished in our universities. Too many people have too much at stake. And ‘freedom’ is always a ringing word to use in any debate, likely to cloud other issues. The irony is that the people who least need tenure, the ones who work hard and creatively no matter what, get locked into supporting it as part of an ideology of faculty solidarity. And there would be no sense in giving up tenure except en masse and all at once. Otherwise, there would still be a system of asymmetric privileges.
The last time I published an article against tenure, I was interested to note that the comments made to the contrary—in the comments section of the publication—often came from academics in professional schools. The argument was that tenure constituted partial compensation for the benefits they had given up by teaching law, business, or medicine rather than practicing them. An ingenious argument, I suppose, and very much of our moment. I find it hard to imagine a humanist making that argument!
In Unruly Voices you write “If McLuhan is right that technological changes can create the notion of the individual – the basis, however fictional and slight, of the equality claims embedded in the very idea of democracy – then it is surely possible that further changes will distort, or even dissolve, that notion. Technology, after all, is not this or that tool, or even the sum total of tools; it is, rather, a world view. In particular, it is that view of the world that frames the world as disposable, as clusters of resources awaiting processing and consumption. But this includes, crucially, ourselves. Suppose the current version of our immersion in the technological world view has the ironic consequence that we hollow out our own sense of worth by way of consumption: of our preferences, of our ‘likes’, our ‘friends’, our ‘stories’.” You then say that we would have become “person casings”. Can you explain this connection between technology and the individual, and can you say something about this term, “person casings”, and some of the theory behind it? This passage seems to imply technology and capitalism are part of the same process; “individual” rights, “individual” choice, etc., but then that eventually the term “individual” might cease to mean what it does and adopt a different meaning.
The notion of person casings comes from Jennifer Egan’s novel A Visit from the Goon Squad, where it figures in the last, dystopian section. A person casing is what is left over after the modern-age idea of the individual self has been hollowed out, replaced by semi-random preferences, instant likes and dislikes, and networks of contacts that lack anything like the substance of friendship imagined by Aristotle or Montaigne. This is a deliberate exaggeration of a real trend, namely the off-loading of the business of personhood to a nexus of social media and consumption tendencies. Egan plays with the effects on art and commerce, and envisions a security-state Manhattan surrounded by an actual wall. Inside, human beings live and work, but they are carapaces of their former selves, or of the kind of self the modern age was meant to create (the self of Montaigne, but also of Descartes and Locke) — that is, self-interested but active, prior to groups but responsible to community, and so on.
Naturally capital and technology are linked to this vision of a postmodern, maybe posthuman, endgame. This is in part because they are so closely linked to each other. The view of technology I give in that passage will be recognizable as a paraphrase of Heidegger’s concept of Gestell, or enframing. The world conceived as resource, ourselves included, makes itself available for exploitation by the forces of capital. This isn’t just a matter of how much consumer markets are now driven by technological products as such—phones, laptops, communication and entertainment devices—though that is certainly considerable. It is also about the ingrained attitudes of upgrade anxiety and technological inevitability. Overall, the tendency within capitalism, always strong, to reduce everything to transaction is aided and abetted just to the extent that we conceive of ourselves as buyers of products and services, rather than citizens.
I should say that there are also positive visions of the transhuman or posthuman future, and I am mindful of those, especially in the last essays in the collection. As Donna Haraway put it many years ago, we are all cyborgs now. The question is, what kind? The erosion of empathy that has been measured in habitual users of social media—there are actual studies showing this!—suggests to me that the future is one of atomization and narcissism, rather than the sort of creative exploration of life possibilities that links Haraway and other radical thinkers to, say, John Stuart Mill’s liberal interest in ‘experiments in living’.
Naturally, I hope I’m wrong about this. But the evidence is not encouraging. Maybe the truth is that we are all zombies now, hollowed-out vessels of pure restless desire, without any controlling centre or higher consciousness. How would we know, after all?
Your book Concrete Reveries is a philosophical examination of the city. What do you think is the biggest factor involved in the genesis of cities? Is it geography, capital, migration, or perhaps information? Has this changed over time? What will be the impetus that drives the growth of our future cities?
Big question! It’s all of those, and more. Cities are extremely complex systems, with multiple layered economies and vast circulatory mechanisms. In Concrete Reveries I note that they are the most complicated machines humans have ever created, and I sift through the various metaphors we use to understand them, none of which is really adequate. The best way into thinking about cities, though, is via an analogy with personhood—at least personhood of the right sort, and not person casings. Persons combine material existence and consciousness, and they channel experiences into decisions, actions, and judgments. They can be good or bad, flourishing or sick, and so on. Cities are the same: they are not reducible to their material basis, their built forms and infrastructures, but neither are they simply the sum total of desires experienced by their inhabitants at time-T.
It’s often remarked that we now live, decisively I think, in an urban world. For the first time in history, the majority of the world’s population lives in cities, often very large ones. Perhaps unsurprisingly, we don’t really know what this means, or what to do about it. We do know that people want to live in cities for sensible reasons, such as access to work, love, stimulation, and security. Because these opportunities entail density, negative externalities are created as well: noise, pollution, cultural conflict, poverty, crime. Many cities teeter on the edge of falling into a version of a collective action problem: a self-defeating cycle of competition and striving.
Somehow or other, most of them manage to avoid falling into complete chaos—though one should note just how fine the balance is, and how little it takes, say in the form of a bad storm such as the one that hit New York recently, to push things toward the edge.
By the way, the one thing we can definitely rule out as an impetus for urban growth is information. This is a new development, because for centuries it was indeed one of the key factors driving people to urban centers: they wanted to know what was going on, to take advantage of knowledge. For us now, there is no such issue because information is comprehensively available from anywhere with electricity. And assuming you have access to a computer, of course! That so-called digital divide still holds for many parts of the world, usually the ones we in the “developed” world hear little about.
In a world of ubiquitous computing, information, apps, and instant messaging, our cognitive capacities seem to be adapting – evolving is the wrong word – to a more streamlined conception of individuality and individuation. At the risk of asking a big, puffy (or old and stuffy) philosophical question, where do you position the individual, today, in terms of metaphysics and epistemology? Do we still have access to an individual self or are we the products of, what they call in epistemology, anti-individualism and semantic externalism?
I find it hard to take a strong stand on this issue, not least because the terrain keeps shifting beneath our feet. But I am drawn to, as well as a little unsettled by, the notion of the ‘dividual’, often associated with Deleuze and defined as “a physically embodied human subject that is endlessly divisible and reducible to data representations via the modern technologies of control.” The less paranoid version of the definition would rejoice in the possibility that we, conceived as our allegedly unique and precious individual consciousness, are far less important than we think. There is, after all, a kind of liberation in realizing not only the contingency of your specific experiences and memories but also their insignificance. Imagine no longer having to worry about what to wear!
In my view, then, semantic externalism is the correct position in philosophy of language, and runs parallel to Heidegger’s more stirring claim that language speaks us. The implications of these views for other areas of philosophy, and life, are really the subject of Unruly Voices. Indeed, the title is meant to suggest, in addition to the obvious allusion to the rough-and-tumble of democratic discourse, the cacophony of voices that make up the experience of consciousness, the different tones and tenors of life. That’s one reason I included essays written in first- and second-person as well as the usual, faux-expert third. The whole voice of authority that dominates writing in philosophy is really worth challenging.
No doubt I have just scratched the surface of the implications these ideas have for us who are here, huddled on the mortal plane and burdened by the complicated gift of consciousness. But it was ever thus for philosophy. And, as I say on the final page of the book: what else?