Out of all the disciplines affected by the crisis of AI in education, I think most people would agree that none is more affected than computer science. Of course, CS is at the heart of AI's development, but I don't think that's necessarily why AI has become such a prevalent issue in CS education. Rather, I think a combination of the content of the field, the culture of the field, and the pressures on CS students has made the CS major the pre-eminent example of how AI can massively disrupt the collegiate classroom.
Unlike my other essays, I want to use the Core texts here to set the boundaries for what I want to discuss. In Discipline & Punish (CC222: "Unmaking" the Modern World: the Psychology, Politics, and Economics of the Self), Michel Foucault describes the arrangement of panopticism. In the panopticon, a single prison guard sits at the center of a circular prison watching all of the inmates; importantly, the inmates can't tell when the guard is observing them, and they can't see the other prisoners either. Foucault directly (but metaphorically) compares this arrangement to the isolating relationship between the teacher and the student in the traditional classroom environment. It's important to note that Foucault is not necessarily saying that this arrangement is bad, but rather commenting on a sense of isolating cellularity present in a traditional classroom.
The classroom environment is probably one component of the problem. In the CS major, most of the courses, especially lower-level ones, are large lecture-style courses with weekly homework that is either graded on an objective basis or, in the case of code submissions, by an autograder. Lecture by lecture, there is often little student engagement with the professors, because the classes are just so big. Students are often assessed in lecture through in-class multiple choice questions.
This can easily create an environment that feels isolating. But the problem is more than just that. Almost all CS courses offer significant office hour blocks held by both the TAs and the professors; yet many of these sessions sit nearly empty. It feels like there is a dissatisfaction with what is being offered as part of being a CS major.
Let me set a boundary here for a moment. Considering Foucault's description, we need to remember that we are a part of this panopticon system too. I can't exactly know what other people's perspectives are; I just have my own experiences and the descriptions of others' experiences from people I've talked to. But in my perception, a lot of CS majors feel a dissatisfaction with the inherent worth of what they're studying. It is a common perception that what really matters for CS students, particularly those interested in software engineering, is the summer internships they do, not so much the classes they take. The degree just functions as a means to an end, a piece of paper that validates the grind for a tech job, the skills for which were really learned outside of the classroom in the "real world."
It was this mindset that I consistently encountered among my peers when I served as a TA for CS112. I was on the course staff for three semesters; in particular, I was the most active member of the course staff on the course's Piazza page, an online forum where students ask questions and the course staff respond. I tried my best to get to know my students and to keep them engaged as much as I could in the lab sections I taught. But we were all faced with strikingly low office hours attendance and lab participation, declining averages, and a general sense of increased apathy and frustration from the students. I don't blame the students or the staff for any of this; I think we did a good job, and the way the students felt was understandable. But it was clear that we were in a time of rapid change.
In The Protestant Ethic and the Spirit of Capitalism (CC221: Making the Modern World: Progress, Politics, and Economics), Max Weber describes the rise of the "Protestant ethic" in western Europe and the United States, the idea that diligent earthly work is a mark of one's salvation. Over time, this mindset became secularized and permeated through culture, to the point where it affects everyone, including those who aren't particularly religious. I've thought about this a lot when considering the CS work that I push myself to do. From classes to internships to research spanning many different fields, there are a lot of different expectations and pressures placed on CS students to succeed.
This is by no means a bad thing, and in most ways I think it's a very good thing. But all of these expectations placed on CS students can breed apathy once students start to see the work as pointless. And that is exactly what AI has caused for many CS students.
In the context of CS education, AI has become a tool that, in many students' eyes, can effortlessly do any classwork demanded of them; in the context of software engineering, it is already doing nearly all of the work taught in classes, to the extent that what's taught is useful at all. AI tools present themselves to CS students like the forbidden fruit of Genesis (CC101: Core Humanities 1: Ancient Worlds): a cheat code past the struggle of classes, so students can focus on the extracurricular activities, such as internships, that many perceive as being valued more.
Within this framing, it's easy to see why many CS students have flocked to using AI tools in their classes. The question is: to the extent that this framing isn't true, how do we combat this mindset, and to the extent that it is true, how do we change the system?
Well, it depends on what the purpose of collegiate education is for each person. Some people view it as an opportunity to learn different things, without much concern for any future career; others are focused on a specific vision of where they want to work in their major's industry; still others are motivated primarily by financial interests.
I don't think we should discount any of these motivations, and I don't think any of them are particularly more "valid" than others. However, maybe there needs to be a difference in the way the field of computer science is viewed. AI tools have become extremely proficient at programming. Over time, I've realized that I myself do a lot less programming than I used to. In my research projects, most of the code I need to write is pretty boilerplate, such as generating plots, creating scripting infrastructure, and so on. AI tools can vastly speed up this work and allow me to focus on the nontrivial aspects of my research. As for the classes I take, I have shifted to primarily taking theory classes, which I feel I get a lot more out of. These classes involve little to no programming and are much more mathematical. This is one of the biggest ways I have shifted my focus within the field as a result of the proliferation of AI tools.
I think students need to think more carefully about what exactly they want to get out of their classes, and what they see themselves using their CS degrees for. The field of software engineering is rapidly changing, and will probably come to span less and less of the overall field of computer science. With the rise of AI and a much higher unemployment rate among CS graduates, there are significantly fewer CS majors than there used to be. The field might change to more strongly demand higher-order research work while AI tools do the bulk of the lower-level programming work.
What does this mean in the context of classes? As students, we need to remember that our professors are just people trying to run their classes the best they can during this tumultuous era. We shouldn't expect the classroom environment to run perfectly, and we should do what we can to respect both our own and our professors' time by honoring the contracts we make between students and professors when participating in a class, whether there is an explicit class contract or we are just implicitly following the expectations described by the syllabus. As professors work to improve their class environments in a world of AI, we should try our best to be the best students we can within that framework.
That being said, we should push our CS class environments to evolve to better fit a world with AI. Part of that initiative should be teaching students how to use AI usefully and responsibly; another part should be teaching students how to think critically in a world with AI.
I don't know precisely how the courses in institutions as large as CS departments will develop. Maybe there will be a shift away from programming courses and towards more mathematical theory courses; maybe there will be a shift towards courses that cover the usage of AI in detail in different ways; or maybe none of these things will happen.
The important thing is to understand that we're currently in a time of tumult and transition for CS undergraduates, and we might as well make the most of it while we're here. Here is one thing I know for sure: there is absolutely much to be gained from studying computer science in the year 2026. You just need to have a strong sense of what you're getting into.