The Psychology of LLMs in Education
I had a great time speaking with Emily Gonzalez, an Educational Psychology PhD student at USC’s Rossier School of Education. Ms. Gonzalez works with Dr. Mary Helen Immordino-Yang at the Center for Affective Neuroscience, Development, Learning and Education (CANDLE), researching how neuroscience and psychology can inform methods of effective teaching. We spoke about how LLMs like ChatGPT could be affecting students’ learning development, how much of a concern cheating is, how schools can approach creating policy for effective use of these tools, and lots, lots more. I’d like to thank Emily Gonzalez for speaking with me, and I hope you enjoy this interview.
Tell me a little bit about your experiences with AI tools during the second semester this year. How did you use them personally? Did you use them for help with your research? Did you use them for help with your coursework?
So first and foremost, like everyone else, I think I was really interested in ChatGPT specifically. I think there were a lot of conversations regarding the utility of what it can and cannot do. And so, like everyone else, I played with it. I asked it questions that I already knew the answer to and compared its output to what I knew to be true. At the time, I was planning a trip, so I asked it to plan the trip for me and saw how its plan differed from or aligned with what I had already planned to do. So I kind of pushed it and played with it to see what it was capable of doing, beyond the conversations regarding its use in education and all of that. But secondly, as a second-year doctoral student, you are engaging in coursework of some type, and there were times when I would search a concept that was really tricky to grasp and ask it to explain it to me in simpler terms. Beyond that, I didn't really use it for anything in particular with my research. I'm still wondering about its applications to things like research and learning, so I did not use it for research. I think first and foremost, as a researcher and a scientist, you need to be responsible and engage in your own sensemaking before turning to technology to give you the answer or explain something to you. That doesn't mean it's not used that way in research; it is. But I think there's a benefit to grappling with tricky ideas or wondering about the answer to something before just turning to technology. So I think that's generally the approach I took most recently, and it hasn't changed much. I still carry that perspective regarding its utility and, in particular, how it relates to my role as a researcher and PhD student.
In your career, you’ve been a student (through college and now as a PhD student), a teacher (as a TA at Harvard and a teacher at a K-8 school), and a researcher. I want you to give me your opinions about ChatGPT from the three different perspectives you hold: the student, the teacher, and the researcher.
So from a student perspective, particularly as a PhD student, but I'm also thinking about high school students, elementary school students, college students, really everyone: the use of ChatGPT is this really unique experience of having readily available information at your fingertips and being able to use it to get quick answers you might need. I know that there are a lot of complicated questions surrounding that, but it's the same with Google. The way that we've used Google is to get quick information we might need. And I know that there have been questions regarding its use for completing assignments and exams and papers, but I think as a student, it's really useful for getting a quick answer that you don't necessarily need to think deeply about. It's just, “I need to know more about this topic or this content area, and ChatGPT is going to be able to explain it to me in layman's terms,” and that's it, that's all I need to know. So as a student, its utility is really in that capacity.

As an educator, and this is very related to my answer for students, it's another tool in teachers' toolboxes that they can use for that particular purpose. And I use that language really purposefully. It's for getting information that we don't necessarily need to think deeply about at the moment. That doesn't mean that it won't trigger us to think deeply about something, but I think for an educator, it's a tool to help kids access information that will help their thinking in some capacity, and then from that, they can think deeply. I believe that as an educator, your role is to create an environment that allows youth to think deeply about the concepts they are learning about, and to engage in learning activities that push their thinking, push their ability to relate to one another, and push them to think about broader impacts. ChatGPT can be a tool in that equation, but I don't think that it should necessarily replace educators.

As a researcher, that is a really interesting dynamic, because, of course, ed tech has been around for a long time. I used to conduct research regarding the affordances of particular forms of educational technology for youth learning. There are some really important affordances, but they can also come at a cost. So it's always about considering your goal in the classroom. What is your goal as an educator? And as researchers, we're able to think about what happens when teachers use some sort of technology in the classroom. How does it change the learning dynamic? How does it change the way in which youth engage in the thinking process, or just learn differently altogether? How is it impacting youth's psychosocial development over time, their ability to relate to other people, their ability to think deeply and create connections? That's really, as a researcher, the lens that I take and the questions that I ask. We have science that shows that if you're always outwardly focused, maybe on the teacher or the whiteboard, that often comes at the cost of being able to think deeply. You can't do both of those things at once: outwardly focused attention and deep reflection. When you are constantly “out there” all the time, attending, attending, attending, you are not thinking deeply and reflectively. And that can come at a cost. We have brain science that shows us that. It's like a seesaw: you can't always be on one side of the seesaw; sometimes you have to go to the other side, right? It's that coordinated effort that matters.
I worry that technology, depending on how it's used and how its value is perceived, will come at the cost of youth being able to tip into that more reflective mode, especially if they're just scrolling and scrolling and viewing more information, more information, a new TikTok video, right?
One of the key issues with ChatGPT in education has been its potential to enable widespread and turnkey cheating. Do you share that concern? Do you consider it your primary concern? Or do you consider something else as your primary concern?
First and foremost, cheating would not be my primary concern. If our concern is that kids are going to cheat, then we need to seriously interrogate our own beliefs about what academic integrity is. Why are we designing assignments where kids feel like they need to cheat anyway? We need to rethink the way in which we're engaging youth with assessments, so that they don't feel like they need to cheat. Imagine students engaging in a community-based project as their final, where they're working with folks in their community to develop business plans; that can involve social studies, math, science, all of the academic domains. Would students feel like they need to cheat then? If cheating is your primary concern, you need to really interrogate the role and purpose of assessment and why you think kids need to cheat on your assessment. My primary concern is what I discussed earlier, which is more aligned with psychosocial development and patterns of thinking. How is technology interacting with what we know regarding youth development over time, and with how they can develop positive patterns of thinking and dispositions of mind that we know lead to later life satisfaction, a sense of purpose, and identity, right? Patterns of thinking are directly linked to our feelings about ourselves in the moment and later on. Same with relationships: how is technology impacting relationships, and how is that then later impacting our sense of self and fulfillment in other things? So that is my primary concern, less so cheating. I know we have algorithms now on various online learning platforms like Blackboard, where you get a statistic that basically says, “With this much certainty, we can say that this is plagiarized from ChatGPT,” right? We have those tools. But I think we really need to interrogate the question of why folks are worried about cheating in the first place. That should really make us think about something different here, rather than “God forbid the kids cheat.”
Can you tell me a little bit more about how the research you do at CANDLE fits in with AI tools like ChatGPT, if it does fit in at all?
Yeah, at CANDLE we don't focus very specifically on the impact of ed tech on youth in classrooms. We focus more on the neurobiological and psychosocial impacts of educational opportunities, or lack thereof, on youth development. But we do consider the role that technology might play. And that's similar to what I said earlier: when kids are constantly scrolling on TikTok, they're obviously not interacting with their peers in real life. Maybe they're interacting with peers online, but that’s a very different form of interaction that humans are not used to. For many, many years, we have been interacting in person with each other. So this is a very different form of interaction, and it's also tied to constant attention, to the here and now, which can come at the cost of the really positive patterns of thinking that are good for our development. So we, as a lab, take that approach. We think about things like the role of technology in development. We think about what these technological spaces create when youth are engaging with them, but we don't really research that directly. My previous research at Harvard was on how we can use the affordances of various educational technologies to improve youth's ability to think complexly about science concepts, and we focused specifically on ecosystems science. There we were able to look at other things that weren't related to psychosocial development per se, but more to epistemological lenses: what does it mean to be a scientist? How do scientists go about their work? How do you collect evidence and use multiple forms of evidence? That was really the design of the program. We built the program to incorporate those opportunities for youth to think and engage in those ways.
Policy-wise, what would you recommend to a school district or a college, taking into consideration your concerns about how ChatGPT might affect students’ psychology? How would you recommend they integrate such a technology into their classrooms, if at all?
I think it's really important to start at ground zero with the people who are most interested in or worried about this technology and understand why they're feeling that way, whether it's positive or negative. There are probably really important perspectives that you need to understand before giving any sort of advice, especially if someone is feeling very strongly, positively or negatively. There needs to be an understanding of why they're feeling that way, what their goals are, etc. I think the most important part is considering your goal. Is your goal to replace a human teacher with AI? I would argue that is not worth your while, right? There is something so innately human about teaching and learning that we have yet to even really figure out or describe. Our science at CANDLE is starting to unearth some of the things that make effective teaching so important and so powerful. Part of that is your biology: the way you regulate your biology, how you regulate your emotions, even your vagus nerve is involved, something we can't necessarily see or regularly think about, but that is directly implicated in your relationships and engagement with other people. So what is good teaching? We're trying to really understand what that is, because I think it will inform conversations like this. If we think that all kids can just learn on the computer, we're really failing to understand the power of a really effective teacher in a youth's development and long-term trajectory. If their goal is just to have a tool that enables students to look up information that they need to know once and might not ever need to know again, then it's a good tool, right? I think if we are creating engaging learning environments and experiences for youth that are aligned with some of the progressive pedagogical approaches out there, like project-based learning, problem-based learning, and interdisciplinary learning, then AI and ChatGPT can be a tool used to inform learning. But I think it really is going to depend on the perspectives, the goals, and understanding that learning is dynamic because of teaching. I don’t think we can capture the essence of what it means to have a really impactful teacher who is able to see you and guide you and believe in you, and put that into AI. I am not confident in that. So I would really want to talk with the folks who are feeling those really intense emotions, positively or negatively, understand their perspectives, understand their goals, and then move accordingly. I don't think that a sweeping ban or a sweeping integration for everyone everywhere is going to work, because contexts are so different, perspectives are so different, resources are so different, and what is considered equitable access to various technologies differs in different parts of the country, right? So it's a complicated question, and any advice would need to be tailored to the very unique contexts in which people are thinking about this.
Do you think that the current paradigm of teaching is compatible with the rise of tools like ChatGPT? And if not, do you think that there needs to be a shift in the way that schools are teaching their students?
So, as a former educator, the way in which we engage with technology is: “Here is a program, and you're just going to implement it. You don't need to worry; the kids are going to work on the program, and you can do whatever you need for those 40 minutes the kids are on the program.” It’s often seen as a replacement, rather than another form of engaging with material that the teacher is using and working with to teach kids. There are pockets of teachers who understand the technology isn't replacing them, that they need to work in collaboration with the technology to achieve deep, meaningful learning. But I think that, as it stands, we see it as a replacement for a human teacher for whatever learning experience they're implementing at the time. I think, with everything involved in education, we need a paradigm shift in what it means to teach. We need to really understand the relational power and the social, emotional, and cognitive intensity that it takes to be a teacher, and what teachers actually do with students. It's not just reciting the curriculum. You're relating to students, you're thinking about their identity, you're thinking about their longitudinal development, you're thinking about creating spaces for them to authentically develop their own sense of self and create new knowledge on their own. That means I'm not the person sitting at the front of the classroom telling them what the answer is. So I think, regardless, we need a paradigm shift in education, and we also need to think critically about what it means to be a teacher in the age of technology. We need to think about teachers' role with technology in mind. I don't have a clear answer to that, but I do think the way it has been used so far has been as a stand-in for teaching, and that's not working, that's not good, that's not useful. Kids aren't going to learn as much that way as they would with a teacher working with technology to improve learning.
If you could look under the hood of ChatGPT and you had a magic wand to make any changes you wanted, regardless of technical feasibility, what would you do?
With regard to education, I think there would be utility in having some sort of AI program that says, “Stop and think about this before we move on.” I mean, we've incorporated something like that in many educational technologies, where it's like, “You're going to pause now, and you're going to reflect before we move on to the next thing.” We use that in all sorts of technologies. But I have a hard time thinking about what I would do technology-wise, because it's hard for me to take that perspective when I'm coming from the human side of things. I would want people to better perceive and interrogate their own beliefs regarding its value and how they would use it. Rather than saying, “Let's have technology just change for us,” I want us to start thinking really critically about what technology means for us, and to understand that it has value for some things, depending on your goal, and it doesn’t have value for other things. We also shouldn't put it on a pedestal where technology is going to figure out all of our human woes and solve them for us. I think it's really good for some things, so there is utility, but it just depends on what your goal is and the context that you find yourself in.