I have been going to academic conferences since I was about 12 years old. Not that I am any sort of prodigy—both of my parents are, or were at one point, academics, so I was casually brought along for the ride. I spent the bulk of my time at these conferences in hotel lobbies, transfixed by my Game Boy, waiting for my mother to be done and for it to be dinnertime. As with many things that I was made to do as a child, however, I eventually came to see academic conferences as an integral part of my adult life.
So it was that, last year, I found myself hanging out at the hotel bar at the annual conference of the Modern Language Association, despite the fact that I am not directly involved with academia in any meaningful way. As I sipped my old fashioned, I listened to a conversation between several aging literature professors about the “digital humanities,” which, as far as I could tell, was a needlessly jargonized term for computers in libraries and writing on the Internet. The digital humanities were very “in” at MLA that year. They had the potential, said a white-haired man in a tweed jacket, to modernize and reinvigorate humanistic scholarship, something that all involved seemed to agree was necessary. The bespectacled scholars nodded their heads with solemn understanding, speaking in hushed tones about how they wouldn’t be making any new tenure-track hires that year.