© Robert C. MacDougall and Figure/Ground Communication
Dr. MacDougall was interviewed by Laureano Ralon on August 9th, 2012
Robert C. MacDougall is a professor of Media Studies in the Communication Department and coordinator of the Faculty Center for Professional Development & Curriculum Innovation at Curry College near Boston. His teaching and research center on the cognitive, social, and epistemological roles played by communication media and technology today and through history. His first book, Cultural Technics: making meaning at the interfaces of oral and electronic culture (2009), interrogates the use of email by a group of Mohawk Indians. His second book, Digination: identity, organization and public life in the age of small digital devices and big digital domains, and an edited volume entitled Drugs and Media: new perspectives on communication, consumption and consciousness were both published in 2012. Dr. MacDougall has also taught at the University at Albany, Allegheny College, and Emerson College. He is the author of numerous journal articles and book chapters related to organizational communication, communication theory, the rhetoric of science, and new media. His most recent investigations incorporate the use of EEG technology to understand some of the interactions, effects and side-effects of multimedia use on the “brain-body.” He is an avid member of the Media Ecology Association and father of two willful boys who wish, among other things, that their parents would let them watch much more television.
How did you decide to become a university professor? Was it a conscious choice?
It’s tricky, the conscious/pre/unconscious thing. Having said that – I was raised in a family of educators. My father was a professional musician, composer, and public high school music teacher. My mom still teaches piano and French. My older brother was a syndicated film critic and taught film courses at the local community college and prison. My other brother and sister are both musicians and music teachers, too.
So while I didn’t think about it much growing up, or even well into college for that matter, the notion of teaching and working in a kind of service capacity certainly wasn’t alien to me. But I was by no means a great student in college, so teaching did not seem likely. In fact, I dropped out of my first communication course my freshman year. I think it was after I got hooked on literature and philosophy (my eventual major and minor) when I admitted to myself that I liked writing and talking about ideas. However, it probably wasn’t until late in grad school – when I decided, with some desperation regarding what to do next, to stay at the University at Albany after earning my master’s degree in political communication – that I started thinking about teaching in a serious way.
I enrolled in the doctoral program in humanistic studies at Albany. I had to decide on a dual course of study; I chose to blend philosophy with communication. On one side I delved much deeper into western thought, continental philosophy, the philosophy of mind and phenomenology – the mid ‘90s here! Again, that’s when I probably started thinking seriously about a career in academe. And on the communication side, I was finally introduced to McLuhan (actually saw him quickly in my side-view mirror as an undergrad, but wrote him off as an impenetrable quack), Meyrowitz, Ong, Eisenstein, Postman, Gumpert and Cathcart, Strate, Lum, etc. I loved all these wild ideas about media and education and consciousness. But I still hadn’t taught a thing yet. I had no idea how to teach anything, as far as I could tell. The first class I taught as “instructor of record” was an entry-level humanities course that the head of my program offered me entitled Concepts of Identity, Race and Culture in the Modern World. While I was given a basic reading list that was typically used by graduate student teachers, I was allowed some latitude to mix things up in some interesting ways. I took the Joan Didion, James Baldwin, and Albert Camus handed to me, and threw in some George Herbert Mead, Philip K. Dick, a couple Star Trek episodes (original series), and the X-Files. It was awesome. And it was then that I knew I wanted to be a college professor. That would have been 1994.
Who were some of your mentors in university and what were some of the most important lessons you learned from them?
If I had a mentor as an undergraduate student, it was a young writing teacher named Jennifer Fleischner. She was a graduate student at the time and I think she was the only professor I decided to take more than one course with. Jennifer was really encouraging – she said I had a knack for it after the first writing course I took with her. She really got me interested in the craft. I probably wrote the detailed opening scenes of half a dozen sci-fi novels that year. But that’s about it for undergrad mentors. Unfortunate, but I just wasn’t that interested in academics yet.
The serious academics, and mentoring, began in grad school, which I sort of just fell into after trying out the “real world” for a year or so. I worked at a couple NPR affiliates, first as a board op, then news editor. But I had a hankering for something more. And I eventually had to get serious, as I soon found myself prepping to write my thesis. It was 1992-93 at this point, and I was really interested in the rhetoric of science. I met Donald Cushman – Don ended up being my thesis advisor and was certainly my first real mentor. He was a big, blustery man who developed a rules-based theory of interpersonal communication back in the 1970s. When we met he was steeped in organizational communication, and was peddling his “High Speed Management” approach. I wasn’t as interested in all that, but learned it well enough to help him teach his courses as a T.A. This is where I got some initial training in talking and working with students.
Thankfully, Don was also interested in rhetorical analysis and taught me how to write with a bit more precision. One of his students, Larry Prelli, had recently edited a special issue of the journal Argumentation. I used the articles contained there as the basis for my thesis. I analyzed the different modes of scientific discourse – the kinds of rhetoric and types of evidence brought forward in different genres of scientific argument: the Bohr-Einstein debates, Darwin’s troubles selling his theory of natural selection, an argument put forward regarding the viability of AI, and several others. I stayed on with Don in the first year of my doctoral studies, too. He helped me get a research fellowship in Poland. I worked in Warsaw for several months at a management development center for ABB, the European equivalent, I suppose, of General Electric. The Iron Curtain had fallen just five years earlier. It was still a Wild West, with new media popping up everywhere. I was sometimes paying more attention to that than my other, more earthly duties. In any case, there I gained invaluable experience as an interviewer and ethnographer, and wrote several white papers. But our interests really started to diverge substantially as Don was pushing me in the direction of management consulting, and I started to veer in the direction of media studies. Once back home, with all the philosophy of mind and cognitive science still going on over on the south side of campus, I was starting to see some natural linkages between communication technology, perception, and cognition. This is when I met Stuart Sigman.
I always say Stuart taught me how to think. By this I mean he helped me figure out how to frame questions – proper research questions in particular. And he helped me develop an intuition for locating the appropriate unit of analysis in any given communicative context. Without lapsing into any kind of reductive outlook – quite the opposite in fact – Stuart helped me begin to see how communication was a total, incessant, environmental process. Stuart trained under Ray Birdwhistell, an anthropologist and communication scholar who is sometimes described as a media ecologist. I came to “media ecology proper” pretty late in the game and Stuart is the guy who introduced me to Marshall McLuhan and most of the other thinkers who make up the media ecology tradition. Barnett Pearce and Paul Watzlawick were also staples with Stuart, and it was these sorts of thinkers who helped broaden my awareness of the natures and functions of human communication.
Ron McClamrock was my most serious mentor over on the philosophy side of campus. Maybe it was a de facto mentorship now that I think about it. He wasn’t my advisor in any official capacity, but he eventually agreed to be one of the readers on my dissertation committee. He introduced me to the philosophy of mind, cognitive science, phenomenology, and even a smattering of A.I. (artificial intelligence) theory. This is where I got hip to folks like John Searle, Douglas Hofstadter, Jerry Fodor, Herbert Simon, Zenon Pylyshyn, David Marr, Thomas Nagel, and JJ Gibson. I just learned an incredible amount over the course of three or four years under his tutelage. (I’m actually gearing up now to write an article about “Gibson as media ecologist” for a special journal issue.) This was the point in my graduate career where my head was on fire. Class never ended, for like four years. We were always talking about this stuff – me and a couple good friends who were also making their way through the “pure” doctoral program in philosophy at Albany.
Ron was just one of those wunderkinds. He took his Ph.D. in philosophy/cognitive science at MIT when he was 25, maybe 24. He studied under a number of impressive folks – including philosopher Ned Block, probably Minsky and Chomsky, too. Ron’s 1995 book Existential Cognition: Computational Minds in the World pretty much blew my mind. We were reading chapters for class as he was tweaking the proofs. The courses were intense and rigorous. You had to hang on with both hands.
So while McClamrock was on the committee, Sigman was my dissertation advisor – and he was a tough nut, too. Another one of those razor-sharp minds. I remember, with mixed feelings, our heated meetings spent hammering out the proposal. These sometimes lasted two or three hours. The dissertation was an ethnographic analysis of a group of Mohawk Indians learning to use an early commercial email system. And it wasn’t until halfway through the project that I realized I was doing media ecology. There was other, more subtle mentoring going on I’m sure, but these are the folks I remember most vividly.
In your experience, how did the role of university professor “evolve” since you were an undergraduate student?
Evolve…that’s a loaded term. I detect some evolution and devolution over the past 20-30 years. For example, when I was an undergraduate student in the mid-late 1980s, most of the professors seemed to subscribe to the model of teaching that was put in place before Gutenberg’s press came on the scene. This has its merits, but it’s not always the most effective way to go about business. For me personally? Aside from a couple of writing courses that were more interactive, it was indeed a matter of passive reception most of the time. Biology and history and literature courses could have been really engaging. But there was a lot of monologue and the professor typically had a pile of notes or a couple books they knew well, and the students had never seen or even heard of these ideas before. The course was really a matter of taking copious notes. It was almost entirely a one-way model. Again, this isn’t necessarily problematic, depending on what sort of class we’re talking about.
There’s a funny scene I recall from some B-movie where a college lecture is in progress. The professor is spouting off, scribbling notes on the board, etc. The camera scans the room. It’s a theatre-in-the-round type setup. You notice a bunch of half-awake students sitting in chairs, but also a slew of portable cassette recorders on desks and chairs where students should be. The next scene shows a room with just a few students, if that, and dozens of those portable devices. The next scene starts there again – maybe no students left. But the real punch line comes when the camera scans to the front of the room. No more professor – just a big reel-to-reel turning on the desk. Quite prescient.
So let me say some more about media. In many contexts today, the professor can easily get sidelined if they’re not careful. And there’s a reciprocal dynamic at work for sure. With smart boards and smart classrooms of various kinds, the expectation in these spaces (and outside too) is for that technology to get utilized. We’ve all heard about the horrors of PowerPoint, but it’s really any technology used in a learning environment that can go wrong. These things aren’t just tools to help us do the things we want to do. They superimpose on the original environment, the original plan, sometimes blotting it out entirely. They do a lot to structure the interaction. They change thought and behavior. But they also leave a cultural residue that functions beyond the original context of use. So if the professor doesn’t turn the devices on, they’d better be ready to at least have a few really good stories to tell in lieu of some snazzy A/V. It’s taken a while to catch on, but there’s a growing awareness that so many college students indeed expect to be entertained.
Neil Postman had this right I think. He might have originally chalked it up primarily to some issues with Sesame Street, but the problem has been exacerbated by the proliferation of various kinds of electronic and digital media in classrooms. I remember TVs on rolling carts in lots of rooms in my high school. But they rarely seemed to be used – it wasn’t ingrained as yet. Some older definition of “classroom” still applied. There was often a solemn feel to things once the bell rang. And I think that was probably O.K. We might wonder, as Postman did: should learning always be fun? Should the professor be a jester or minstrel as well?
Now, if the professor does turn the devices on, they’d better get ready, because they can quickly become superfluous, meaningless really. Part of my job these days is to serve as coordinator of a center for professional development and curriculum innovation. At my school and elsewhere, I know of too many college courses where YouTube or a TV show or a movie is started at the beginning of the hour at least once or twice a week, and – there goes class. It’s kind of like the novice PowerPoint-user (teacher or student) who sets up a slide show with that automatic slide-change function enabled. They start off with an articulate opening then soon find themselves mumbling, turned three-quarters and standing back, watching the screen flicker along with everyone else.
I like to think that the best college professors these days are cautious users of technology. I’ve already participated in some “hybrid” courses. This means part online/remote, part in the traditional classroom. A whole different set of skills and habits has to be developed to run an online course well. And these courses aren’t all run the same way, either. It’s a brave new world, and I know your question doesn’t necessarily imply this, but I wouldn’t say we are evolving in any significant ways in the academy lately.
I don’t know, but maybe e-portfolios are a part of the answer. Maybe this new notion of the “flipped classroom” holds promise: the idea where the hearing of traditional lectures, the participation in discussions, and the reading of books and watching of films are done somewhere, sometime outside of the standard time and place the class meets. The class meeting is then used for problem solving, questioning, and mentoring and coaching – both group and individual. I’m not sure how the authors of this idea think folks will carve out the time to make it all work, but, hey, it’s worth a shot. If we don’t give it a chance the contemporary college professor could soon find themselves obsolesced by Khan Academy or the University of Phoenix. Like corporate logic, the logic of the machine is that of efficiency (in all its guises). And let’s not even consider here the whole mobile-media reality emerging on college campuses. I’m sure so many of my colleagues think we’re set, but I do get the sense that the traditional definition of the classroom – of the entire academy I’ve been alluding to – hangs by a thread. Will something new and better emerge? And what will the role of the college professor be? These are of course open questions. Unless a different set of values is constructed and fostered in the population, “Distance Ed” may well be the future of “higher education” – whatever that ends up meaning on the other side of the transmutation. I hope I’m wrong.
What makes a good teacher today? How do you manage to command attention in an age of interruption characterized by attention deficit and information overflow?
As suggested in my answer to the previous question, I think one needs to find a balance between the old and new models of pedagogy now emerging. Limited and very strategic use of classroom technologies of various kinds has its place and makes good sense. However, I have to remind a few of my colleagues every now and then: a blackboard with chalk, a white board with markers, flip charts, books, etc. – these are all technologies educators have been using for a very long time. PowerPoint and Prezi and all that are the latest, and maybe part of a digital gestalt shift, but they are still tools we need to learn how to use and understand if we want to continue being effective, reflective, and relevant teachers. We need to understand their limitations and affordances. I’ll drop another of my favorite Postman sayings here to help us see why: he said you can’t do politics with smoke signals (or something like that). What he meant by this, as far as I can tell, is that there are certain kinds of knowledge that want a certain kind of communicative context. And this means media contexts as well. And there are, reciprocally, certain kinds of media that help relay (and reshape) certain kinds of knowledge better than others. In this age of Twittered political campaigns, the smoke signal quip gathers new relevance.

And for sure – smart phones on student desks result in texting and such. Information overload, as you say. It’s a no-brainer! Magazines and comic books hidden between the leaves simply did not open up this kind of effect. So I put a few rules in place to discourage mobile media use in my classrooms. That’s a no-brainer, too. Then again, if you want a certain kind of class dynamic you explicitly allow a continual cascade of your Twitter feed on the big screen at the front of the room (people do this! Just not for me quite yet). As Josh Meyrowitz pointed out, media change the definition of spaces and places. They also change definitions of self and other. We can foster some sensible habits that conjure the good pieces of that centuries-old definition of “classroom.” We have to.
Beyond this, I really think every college professor should continue honing their story-telling skills. This is a form of technology as well. Stories, in the “Face-to-Face” mode. (F2F: it gets truncated in more than just graphical ways unfortunately.) In so many instances, the narrative form is a lost art. But, I’ll take this to the bank: few things gather student attention more effectively than a good, compelling story told live that is related in some substantive way to the lesson of the day. It doesn’t need to be a long story to have resonance and a powerful, mnemonic effect. Does this mean we have to keep up a bit on the cultural life of our students? Maybe a little. But the truth is that so many stories we tell can have resonance across cultural space and time.
To think that we can command attention is just sort of wishful thinking. I wrote a chapter related to this question in my most recent book Digination. There are plenty of folks who feel they expect and even demand their students’ full attention. But if you think about it, we’ve never had – nor should we ever expect – all of their attention. Since before electricity, even before the codex, even Professor Socrates, students have been daydreaming, staring at other students, staring at a fly on the wall, their navels, whatever. The trick is to regularly break through (or build on!) that part of human nature that persists. Reverie is a good thing. I’d never want to stamp that out. Besides… Bueller… Bueller… Bueller… you can get a lot of mileage out of that sort of pop-cult reference toward your agenda when you have a dozer in the back of the room. Works for me!
What advice would you give to aspiring university professors and what are some of the texts young scholars should be reading today?
I’ll assume you’re wondering here about young scholars interested in studying media. But in general, read widely. And walk widely. In other words, don’t just attend NCA or MEA conferences. I had great fun attending a mathematics conference a few months ago here in Boston. I sat in on a couple talks where MC Escher and Rene Magritte were employed as thematic guides for doing geometry and calculus. This is cool stuff and it lit some bulbs for me and my own efforts at course design. Either way, get outside the boundaries of the “discipline.” Just branch out. For example, I always suggest to junior colleagues, as well as my own students who are working on their capstone projects and theses, to seek out the sensible, if sometimes subtle, connections between their area of interest and other issues and ideas circulating in the world. Look for the “natural” connections. I’ve been digging into the literature in neuroscience, biology and genetics, and cognitive science most recently to get another angle on the roles played by media in the world today. But there are so many angles one might productively consider as a media scholar: economics, history, psychology, philosophy, almost anything really. This is a crucial part of enhancing one’s theoretical depth and breadth, and also one’s practical engagement with the world. But all of this makes for a more interesting person, too – for others, and for ourselves. We have to live with ourselves most of all, right?
Since it formally began back in the early part of the 20th century, media studies has always been at the vanguard of interdisciplinary pursuits. But it can be so much more. Like I said earlier, communication is the discipline without a discipline par excellence. We shouldn’t be afraid or ashamed to admit this. We can capitalize on the epistemic freedom that allows. Also, finally, practice and become adept in a couple qualitative methods (I worked at depth interviewing and ethnography/thick description), and a couple quantitative approaches as well. And maybe fiddle around with SPSS and learn some basic statistics.
Here are a few texts that made a big impression on me, or continue to impress me (some old, some new). I suspect many others might also find them illuminating if they are interested in media studies: David Abram’s The Spell of the Sensuous: Perception and Language in a More-Than-Human World; Andy Clark’s Natural Born Cyborgs; Antonio Damasio’s The Feeling of What Happens: Body and Emotion in the Making of Consciousness; George Dyson’s Darwin Among the Machines: The Evolution of Global Intelligence; JJ Gibson’s The Ecological Approach to Visual Perception; Eva Jablonka and Marion Lamb’s Evolution in Four Dimensions: Genetic, Epigenetic, Behavioral, and Symbolic Variation in the History of Life; McLuhan’s Understanding Media: The Extensions of Man; Ron McClamrock’s Existential Cognition: Computational Minds in the World; Maurice Merleau-Ponty’s Phenomenology of Perception.
In 1964, Marshall McLuhan declared, in reference to the university environment, that “departmental sovereignties have melted away as rapidly as national sovereignties under conditions of electric speed.” This claim could be viewed as an endorsement of interdisciplinary studies, but it could also be regarded as a statement about the changing nature of academia. Do you think the university as an institution is in crisis or at least under threat in this age of information?
So I dig McLuhan, and think he was spot on with so many things, but we’d have to pick and choose carefully regarding exactly where this particular statement has and has not come to pass. It has not come to pass, for instance, at the school I work at now, nor at the two previous ones since leaving grad school. Pretty sure it also does not apply to three-quarters of the 100+ institutions of higher learning in the Greater Boston area. Of course, you can find little enclaves of inter-disciplinary and even trans-disciplinary work at many schools, but these approaches to knowledge-building are not part of the standard operating procedure of mainstream academic culture. They of course should be. Biological anthropology, sociolinguistics, ethnomusicology, media ecology, etc., etc. It’s out there, but aside from a fairly small number of well-established, elite institutions that can afford to experiment in these ways, and just a smattering of places like the New School and its ilk, worries over preserving the sovereignty of the disciplines are, unfortunately, very real. And the students are the ones who really get shafted by the narrow views onto the world all of this entails.
While it would be nonsensical for a communication scholar to utter such a thing, I’ve heard psychologists say just this – historians, economists, and philosophers, too. They are worried about the sovereignty of their disciplines, the purity of their intellectual pursuits. I cut my teeth in an academic environment where the notion of a “pure discipline” just didn’t make much sense. And I’d agree with McLuhan and Eisenstein and other media ecologists who suggest that academic specialization is one lingering side effect of the printing press. With the massive repeatability the technology allowed, classification schemes start to make sense, become really easy to handle, etc. The building up or accretion of knowledge is the foundation of modern civilization, but there is a downside to any accretive process. In a word, things start to reify. These are processes of social and historical construction. And unfortunately this means to make concrete in more than just the sense of being real and tangible. It also means solidify, and that ends up meaning stultify…to make useless…or pernicious. I say “side effect” of the printing press because I think it really was an unwanted and certainly unpredictable effect thrust upon the world of academics at the time.
Some might have welcomed this effect, but I suspect the committed academic, the person interested in a profound and spiritual way in the world of ideas, did not appreciate what was happening – if they managed to get the sense that it was happening, because it took a couple hundred years. Elizabeth Eisenstein noted that Gutenberg’s press exerted two parallel, but very different, effects over time. It created an explosion of information and new modes of thought, as well as a concretization of certain, favored or popular ways of thinking. She cites increases in nationalism, fundamentalism, certain modes of dress, manner, ways of speaking, etc. But let’s take one example straight from the academy. We see the residual effects of this in the field of genetics, where epigenetics and other unorthodox theories concerning the propagation of species are harkening back, leap-frogging over Darwin in many cases, and landing smack dab in the middle of Jean-Baptiste Lamarck’s thinking. And this is helping us think in new and productive ways (i.e., practical ways) toward solving some long-standing bottlenecks and gaps in our ability to understand how our organism works in the world. Of course, this opens up all kinds of new possibilities with respect to what we can do as causal agents actively involved in the making of the future of humankind. It’s scary stuff, and it’s full of all sorts of new responsibilities. And this seems to be one of the reasons why so many cling to the disciplinary schemes and categories handed down. It’s not for the benefit of students.
I might get accused of cherry picking with this next example, but consider Lucretius’ On the Nature of Things. (I should have included it in my list of texts media students ought to read.) It was really ahead of its time in an almost eerie way. I mean, here’s a 2,000+-year-old chirographic document that was just all over the place in its observations and allusions. It’s a dizzying text to try to get one’s head around, but I think that’s the whole point. The author (or authors – who knows!) was obviously trying to prompt a breakdown and breakthrough of the edifice of confusion that had been amassing around dogmatisms of various kinds. And that’s about 1,500 years before Gutenberg.
And so we come around again to departmental and disciplinary sovereignties. One can maybe think of some forms of mathematics that are well-served by this kind of bracketing. (Maybe accounting?) But there aren’t a lot of disciplines I can think of this minute that are well-served by it. McLuhan was pointing to identity-changing effects of the massive increases in the speed and scale of information exchange ushered in by electronic-then-digital technology. It should not necessarily be seen as a threat to the academy. Instead, it can be a window of opportunity (one that might not stay open for all that long in the grand scheme of things, by the way). It can enhance the relevance of particular epistemes in a much grander epistemic constellation. I wish McLuhan were right across the board on this, but the problem, as suggested above, is that the culture of the academy really is a culture of constraint in most corners.
The books you recommend to young scholars deal with issues of Consciousness, Perception, Embodiment and Existence – themes that are very much present in Phenomenology and Existential Philosophy. Of the pillars that constitute the Self, you seem to favor embodiment and temporality over symbolicity and sociality. Is that a fair assessment?
I guess I’d favor embodiment and sociality above all. “Pillars of the Self” – so, we mean personal identity there. But we would have to involve temporality too (identity equals some semblance of continuity over time). The question is still just a bit too dichotomous, though. But let me try to parse this out. The real trick is to look for emergent, interactive patterns in all four. My suspicion is that we’ll eventually discover how embodiment – specifically these naked bodies we have evolved into – will always be front and center with respect to what it means to be human…humanity with all of its foibles. Being in these bodies…it’s a good place to be. Without them we will become something other. There’s plenty of solid historical and anthropological evidence for that. It’s like we worry so much about murder and mayhem. True, these things have always been, but they didn’t start spiking until we began appending various things to our bodies. I’ve always been impressed with the ancient Japanese theory of warfare. You wanted to be able to look your enemy in the eye. The long-sword, spear, and bow chipped away at that. Then the cannon, gun, plane and bomb finished the job. The result? It’s just too easy to kill now. And that’s not even getting into the whole remote/drone-style warfare so popular on TV today. I don’t want to simplify things too much here, but I get the sense this is a big part of the reason why we have so many wars and so much strife. Mediation, or “mediation on steroids,” as I like to say. It accounts for so much of this.
So? So, we need to be wary of some fairly recent ways of thinking about the body that stem directly from that nagging “dream of modernity.” And from Descartes, too. Walt Disney is apparently frozen somewhere because of it. The human genome project is one manifestation of it, as is computational biology, and we now have movements like “trans-humanism,” and initiatives such as the “connectome project” which seek to rid us of that tenuous, inefficient dependence on the “meat.” “Get rid of the body altogether,” they say. “The meat is obsolete.” It’s the notion that technology can make us better, bring us to a better place without the body. And Facebook and YouTube and the latest social networking techs are all designed to circumvent the body. They functionally subscribe to the same idea. That’s the ethos. But we should retain a healthy dose of skepticism when it comes to that whole notion. The body should not be considered obsolete.
There seems to be plenty of evidence at this point which suggests that the more you mediate – that is, separate or put things in between people (different mind-bodies), and in the midst of or between individual mind-bodies – the more you start to see inhuman and inhumane activity taking place. And remember that we have to read “mediation” really broadly. More than just the warfare example floated a minute ago. It gets down to separating thought from embodied action. Language is of course a wonderful medium that has taken us so far as a species, but it has exerted, and continues to exert, its one fatal flaw. As Alfred Korzybski liked to say, “the map is not the territory.” Before writing and all that, before Ong’s secondary orality kicks in, in its various forms, we have human language in-context (I refer here to embodied social interaction). There’s nothing nostalgic about saying this is human being in its most impressive, egalitarian, elegant form. Two people engaged in an engrossing conversation. Buber’s I/thou in a nutshell. Talk about bandwidth – that’s bandwidth off the gigahertz scale! And yet even in such contexts, we’re only able to grasp a small portion of what it is we are trying to communicate to each other. The French philosopher Henri Bergson wrote beautifully about the problems of language along these same lines. Symbolicity, as you say, sure adds a thick layer of complexity in our efforts to understand who and what we are.
But beyond the body? That’s the rallying cry for many futurists today. Maybe we should start another meme: Beyond language! Because language already gets so misused and abused in these latest guises, let alone its original, embodied form. When we start to strip language away from the body – compressing it down most recently with things like texting and twittering to perhaps the most narrow bandwidth/form of symbolic exchange ever conceived – I think we really open up the possibility of becoming something less than human. That would be Buber’s I/it. It paves the way for everything and everyone to be rendered instrumental when we start giving ourselves over to such media. And this is unavoidable. All media become naturalized over time with enough use.
Consider Skype and FaceTime and such. Most people on the street aren’t aware that we had a commercially viable videophone ready to go in the 1960s. A few companies, including AT&T, had their fingers on the trigger. I have an 8mm movie reel in the next room showing my mom and older brother using it at Expo ’67 in Montreal. There was plenty of “gee-whiz,” but the problem was that folks just weren’t interested in that kind of innovation as part of “real life” at the time. The conventional phone was plenty, too much even. Already too alienating. We were supposed to embrace this thing that suggests we are actually talking with another person? Interestingly enough, today millions are getting comfortable with these kinds of techs because the figure/ground scenario has flipped. I’ve felt the pull myself. And of course it’s all predictable, plain to see, because we start to see these technologies as a natural mode of being. We start to get the sense that they are natural extensions of ourselves. But they are not – they are first and foremost corporate and commercial manifestations or extensions, with powerful affordances (some, not all) specifically tuned to their points of origin. Recent groundswells of discontent with various social networking sites suggest as well that many people are starting to get the same sense. I trust you’ve tried Skype or FaceTime or any one of their analogs? They’re really weird if you let yourself seriously think (and feel) about it. Keep an eye on Facebook especially. Looks like something might be crumbling there as we speak.
Among the books that you suggest students should read is Merleau-Ponty’s Phenomenology of Perception. Well, toward the end of his life, Marshall McLuhan declared: “Phenomenology [is] that which I have been presenting for many years in non-technical terms.” How can phenomenology and communication studies reinforce each other in this age of information and digital interactive media?
I do believe McLuhan was a phenomenologist. And here’s one way to think about it: Both McLuhan and Merleau-Ponty spent their careers problematizing the idea of an encapsulated/unitary, Cartesian self. If I understand these guys, the whole subject-object distinction upon which so much of the modern, western ontological framework rests is pretty much rendered meaningless by the end of both corpuses. But, that might just be my reading. Point is, phenomenologists and communication scholars should be working together to help us understand what sort of environmentally enmeshed, work-in-progress the human animal really is.
Do you think communication studies should be a discipline in the first place – concerned as it is with a sort of nothingness, i.e., the effects stemming from technological environments?
Like I said earlier, I think media studies can be a disciplined intellectual pursuit without having to maintain distinct disciplinary boundaries. You’d think it would be a fool’s game to try and encapsulate the study of human communication within a single department or discipline, yet we’ve been struggling to live with that conceit for more than half a century now. That’s really unfortunate because this has not been in the best interest of students. It sprang more from a way to make faculty feel good about what they do and do not know. Of course it happens in fits and starts, but “communication departments” across the land should be making it a point to hire more people devoted to the primary pursuit, with deep interests also in anthropology, philosophy, history, sociology, economics, and cognitive science. All of that works.
You’ve recently edited a book entitled Drugs and Media. Can mediation theory account for habits of drug use and patterns of addiction?
Yes, I think it has a lot to offer. I try to draw out a broad sketch of why this might be so in a couple of the earlier chapters. And then I get really speculative about it in the final two chapters that consider how all of these media forms have really brought us fully into something like the Anthropocene. I suggest that we have entered a new epoch characterized by “environmental engineering” (sometimes aka “niche construction”), where some fairly strange processes, including transgenerational epigenetic effects, could be triggered by both individual and system-wide media use (including everything from antidepressants to smart phones and video games). This could all be instrumental in redirecting the trajectory of our species. Again, those last two chapters are pretty far out, and are really supposed to serve as probes in the McLuhanesque sense. But all of that is tempered well by the other contributors, starting with Lance Strate and Corey Anton, who do an incredible job of drawing out more specifically why mediation theory and media ecology more broadly can have great explanatory power, even some predictive power, and are particularly good ways to help us understand the relationships we maintain with our media and our drugs. I’ll leave my answer there and shamelessly urge your readers to check out the book.
What are you currently working on?
I’ve been working recently in a psychology lab at Curry College set up to track different physiological/neurological data. We’re using a custom-built EEG apparatus in an attempt to directly test McLuhan’s “light-on, light-in and light-through” distinctions, and just ferret out any meaningful differences detected in the human system between screen and page reading. A formal write-up of those findings will be coming out next year in a book edited by Michael Grabowski at Manhattan College. My research partner Bruce Steinberg and I are also starting up a project investigating the perceptual effects of video game environments of various kinds. And I’m growing increasingly interested in all these “learning technologies” coming on the scene now. I’ve actually been peddling McLuhan and McLuhan’s tetrad as a new tool at a few conferences to assess things like PowerPoint, e-portfolios, real-time clicker systems, etc. But also more conceptual techs like “flipped classrooms,” and various “gamification” strategies. Bruno Latour’s spin on ANT (actor network theory) has also piqued my interest. I’m really itching to learn more about how students are interrogating and constructing knowledge and building skills via these technology-saturated and information-rich contexts they find themselves in. There’s plenty to do.
© Robert C. MacDougall and Figure/Ground Communication. Excerpts and links may be used, provided that full and clear credit is given to Robert C. MacDougall and Figure/Ground Communication with appropriate and specific direction to the original content.
Your feedback is welcome and appreciated! If you like what you see, please consider commenting or donating to help us grow and expand. Figure/Ground is currently on the lookout for collaborators to help with the expansion of this section into the largest repository of scholarly interviews on the net. For specific suggestions regarding future/potential interviewees or to obtain permission to republish any of the interviews already on the site, please contact email@example.com.