In May 2009, the semester before I started graduate school, I attended a conference on Late Antiquity. I was excited for the conference because this was the field I intended to study, and I thought I would be familiar with the subjects of the talks. After all, I had been a star student in my undergraduate program, earning As in Latin and Roman History courses and easily memorizing what I thought were the key components of studying ancient history: emperors’ ruling dates, key battles, and the names of various barbarian confederations.
Very quickly, it became apparent to me just how much I didn’t know. Sure, I could rattle off key dates, but the level of specialized knowledge of individual texts, scholars, and research methods that the conference presenters shared was way over my head. I felt like I had just moved cross-country to devote the next phase of my life to something I had absolutely no grasp of. I was overwhelmed, an impostor, a fraud (or at least that’s how I felt).
Of course, since then, I have become more familiar with the field of Late Antiquity. In graduate courses and my own projects, I learned much more of the specialized knowledge and skills that had seemed so foreign to me at that first conference. I’ve researched and written about a specific area of late antique history, and I’ve completed a dissertation that proved to a community of scholars that I have enough of a grasp on the field to deserve a doctorate.
Here’s the thing, though: all that knowledge isn’t the most important part of my experience. Neither is the dissertation that I produced from it. Even after studying for nearly a decade, there is a lot about Late Antiquity that I still don’t know. I can’t tell you, off the top of my head, what the average fourth-century peasant ate for breakfast in rural Cappadocia, or how bishops in small towns in North Africa conducted business within the imperial bureaucracy, or how widely classical Greek medical theories were understood among educated fourth-century Christians.
The most important thing I learned in graduate school is how to embrace the state of constantly learning.
Even as I write this now, I realize something that I wish I had known back in 2009: knowing how to learn is far more important than knowing particular facts. As I do daily coding exercises, I often find myself frustrated that I can’t get my code to do what it’s supposed to do, and sometimes (but only after struggling through it myself) I start searching StackOverflow and FreeCodeCamp forums for the “answer” to the problem. When I find that “answer,” it usually ends up raising a number of further questions: why does a certain method work? Why does it work on strings, but not integers (or vice versa)? Why did my first attempt to solve the problem not work? How can I remember this in the future and apply it to similar problems? I know more about coding today than I did three months ago. What’s more important, however, is that I know what I don’t know, and I am beginning to know what kinds of questions to ask.
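To make that “works on strings, but not integers” puzzle concrete, here is a minimal Python sketch. This is my own toy example, not taken from any particular exercise or forum answer: the same-looking operation succeeds on a string but fails on an integer, because each type defines its own methods, and the error message is what points you toward the next question to ask.

```python
# Toy illustration (hypothetical example): a method that exists on
# strings but not on integers, because methods belong to types.

text = "42"
padded = text.zfill(5)   # str.zfill() pads with zeros on the left
print(padded)            # prints "00042"

number = 42
try:
    number.zfill(5)      # ints don't define zfill()
except AttributeError as err:
    # The error names the missing method, which explains *why*
    # the first attempt failed, not just *that* it failed.
    print(err)
```

Chasing down why the `AttributeError` happens (methods live on types, and `int` simply doesn’t define `zfill`) is exactly the kind of follow-up question the “answer” on a forum tends to raise.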
Do I still have moments where I start freaking out, asking myself why the hell I thought it would be a good idea to leave my comfort zone and learn to code, and wondering if anyone would ever actually pay me to do this? All the time. I certainly hope that as I continue, I will have fewer of those moments, but the truth is, I think they will continue for quite some time. What is more comforting to me is the knowledge that I’ve had these moments before. Whenever I encounter a new coding problem, read a blog post about something completely foreign to me, or attend a meetup where I’m not even sure that everyone around me is speaking English, I like to remember that I’ve been in this situation before. The most important part of my graduate education is that I’ve learned to accept and embrace it.