Stephen M. Walt

Will the Internet destroy the ivory tower?


Darren McCollester/Getty Images

Apart from a few brief sojourns at various think tanks, I’ve spent most of my professional life in the academic world. Seven of these years were spent helping run various programs, first as deputy dean of social sciences at the University of Chicago and later as academic dean here at the Kennedy School. I have one child in college and another heading there in two years. You can therefore assume I have a certain professional and personal interest in the whole business of higher education.

Which is why I find discussions of how technology might transform this whole enterprise quite fascinating. It’s hard not to read such articles and wonder how my own job might change in the years ahead, and to reflect on how I think it ought to change. I have not studied this issue in detail, so what follows are some purely impressionistic observations, based mostly on my own experience.

1. I think there’s no doubt that the traditional model of the academic lecture is headed the way of the dodo. I say that with a certain wistful regret, because I enjoy lecturing and like to think I’m fairly good at it. But it’s hardly an efficient mode of information transmission, and there are plenty of studies suggesting that students don’t learn particularly well in this sort of passive "I-speak-while-you-listen-and-take-notes" experience. Lecturing of the old-fashioned sort can be entertaining and inspirational, but real learning requires students to engage and wrestle with the material instead of just hearing some older person declaim about it.

2. Given that top-flight faculty are among any college or university’s scarcest resources, having them stand in front of a handful of students and talk is especially inefficient, and all the more so in basic introductory courses. In other words, you probably don’t want Nobel Prize winners teaching basic statistics, Economics 101, or even Intro to Biology — especially when there may be lots of less renowned people who are actually better at doing that. But you do want students to have the opportunity to interact with the most brilliant minds, to argue with them, to see how they do their work, and to be inspired by their example. And that means creating different sorts of educational experiences (seminars, workshops, mini-courses, etc.) rather than just one.

3. Information technology is making it possible to transmit educational content at almost no cost; you can put course materials on the web and stream lectures to anyone with an internet hookup. This is what MIT is doing now, and it doesn’t seem to be discouraging people from wanting to attend full-time and pay full freight. There are also online teaching programs that might do a better job of teaching basic materials (such as introduction to microeconomics, statistics, calculus, etc.) than that old model of the single lecturer with a chalkboard and a pile of notes. This suggests that we ought to be thinking of ways to use faculty rather differently, in more interactive and personal modes where hands-on attention, genuine inspiration, and pedagogical ability can produce big payoffs, while using online tools to deliver basic factual or technical content.

4. I suspect that in the near future we are going to see a lot of experimentation with new forms of higher education, reflecting the fact that these institutions in fact serve many purposes other than merely transmitting knowledge/skills to students. One reason MIT can make its content available for free is that students understand there is a difference between watching lectures online and actually being in the class, being on the campus, and being immersed in the broader in-person environment. In the United States, at least, universities and colleges also provide a relatively safe space for making the transition from adolescence to adulthood. They are environments where young people can meet future spouses of similar class or social backgrounds, have lots of arguments with peers and with their professors, and get a lot of preconceived notions challenged. For many young people (though not all), college is about a lot more than just what they learn in class, which is one reason parents are willing to pay through the nose to make that whole experience possible.

What I’m describing here, of course, is the traditional model of a liberal arts education, and it’s hardly the only model out there. Other institutions (e.g., commuter colleges, junior colleges, vocational institutes) serve somewhat different educational functions and are already organized differently. My guess, therefore, is that changes in information technology and the overall globalization of information and education are going to produce an explosion of innovation over the next few years. The traditional four-year university/college won’t disappear, but it will be coexisting and competing with a lot of other models.

Lastly, this is going to be a painful process. Universities are filled with brilliant and innovative people — as individuals — but they are also incredibly conservative institutions (not politically, but in the sense of being wary of change). As a former Harvard president reportedly said, "trying to change the curriculum is like moving a graveyard." Faculties don’t like having to retool, and alumni and other stakeholders often have powerful emotional attachments to traditional ways of doing business. And the older and more successful a university is, the more impervious to change it is likely to be.

Plus, coming up with new educational models is hard to do if you’re already working pretty hard teaching the existing program. But there’s no stopping this sort of Schumpeterian "creative destruction," and I’d hate to be working for the educational equivalent of Polaroid — a brilliant and innovative company that proved unable to adapt to a rapidly changing technological frontier.

Now if we can just get universities out of the business of running semi-professional athletic teams…

Stephen M. Walt is the Robert and Renée Belfer professor of international relations at Harvard University.
