When ChatGPT surged into public life in late 2022, it brought new urgency to long-running debates: Does technology help or hinder kids' learning? How can we make sure tech's influence on kids is positive?
Such questions live close to the work of Jason Yip, a University of Washington associate professor in the Information School. Yip studies how technology can support collaboration and learning in families.
As another school year approaches, Yip spoke with UW News about his research.
What sorts of family technology issues do you study?
I look at how technologies mediate interactions between kids and their families. That could be parents or guardians, grandparents or siblings. My doctoral degree is in science education, but I study families rather than schools because I think families have the biggest impact on learning.
I have three main pillars of that research. The first is building new technologies that give us creative ways to study different kinds of collaboration. The second is going into people's homes and doing field studies on things like how families search the internet, or how they interact with voice assistants or digital games. We look at how new consumer technologies influence family collaboration. The third is co-design: How do adults work with children to co-create new technologies? I'm the director of KidsTeam UW. We have kids come to the university basically to work with us as design researchers to make technologies that work for other children.
Can you explain some ways you've explored the pros and cons of learning with technology?
I study "joint media engagement," which is a fancy way of saying that kids can work and play with others when using technology. For example, digital games are a great way for parents and kids to learn together. I'm often of the opinion that what matters is not the amount of time people spend looking at their screens, but the quality of that screen time.
I did my postdoc at Sesame Workshop, and we've known for a long time that if a child and parent watch Sesame Street together and they're talking, the kid will learn more. We found the same thing in digital games such as Animal Crossing: New Horizons. With these games, families were learning together and, in the case of Animal Crossing, processing pandemic isolation together.
Whether I'm looking at artificial intelligence or any other new technology, I'm asking: Where does the talking and sharing happen? I think that's what people don't consider enough in this debate. And that dialogue with kids matters much more than these questions of whether technology is frying kids' brains. I grew up in the '90s, when there was this vast worry about video games ruining children's lives. But we all survived, I think.
When ChatGPT came out, it was presented as this huge disruption in how we've dealt with technology. But do you think it's that unprecedented in how kids and families are going to interact and learn with it?
I see the buzz around AI as a hype cycle, with a surge of excitement, then a dip, then a plateau. For a long time, we've had artificial intelligence models. Then someone figured out how to make money off AI models and everything's exploding. Goodbye, jobs! Goodbye, school! Eventually we're going to hit this apex (I think we're getting close) and then this dip.
The question I have for big tech companies is: Why are we releasing products like ChatGPT with these very simple interfaces? Why isn't there a tutorial, like in a video game, that teaches the mechanics and rules, what's allowed, what's not allowed?
Partly, this AI anxiety comes because we don't yet know what to do with these powerful tools. So I think it's really important to try to help kids understand that these models are trained on data with human error embedded in it. That's something that I hope generative AI makers will show kids: This is how this model works, and here are its limitations.
Have you begun studying how ChatGPT and generative AI will affect kids and families?
We've been doing co-design work with children, and when these AI models started coming out, we started playing around with them and asked the kids what they thought. Some of them were unimpressed, because the models couldn't answer simple questions that kids have.
A big fear is that kids and others are going to just accept the information that ChatGPT spits out. That's a very realistic perspective. But there's the other side: People, even kids, have expertise, and they can test these models. We had a kid start asking ChatGPT questions about Pokémon. And the kid is like, "This is not good!" Because the model was contradicting what they knew about Pokémon.
We've also been studying how public libraries can use ChatGPT to teach kids about misinformation. So we asked kids, "If ChatGPT makes a birthday card greeting for you to give to your friend Peter, is that misinformation?" Some of the kids were like, "That's not okay! The card was fine, but Peter didn't know whether it came from a human."
The third research area is going into the homes of immigrant families and trying to understand whether ChatGPT does a decent job of helping them find critical information about health, finances or the economy. We've studied how children in these families search for that kind of information and help their families understand it. Now we're trying to see how AI models affect this relationship.
What are important things for parents and kids to consider when using new technology, AI or not, for learning?
I think parents need to pay attention to the conversations they're having around it. General parenting styles vary widely, and which style is best is very contextual. But the conversations around technology still have to happen, and I think the most important thing parents can do is say to themselves, "I can be a learner, too. I can learn this with my kids." That's hard, but parenting is really hard. Technologies are developing so rapidly that it's OK for parents not to know. I think it's a better position to be in.
You've taught almost every grade level: elementary, junior high, high school and college. What should teachers be conscious of when integrating generative AI into their classrooms?
I feel for the teachers, I really do, because so much of the responsibility is falling on them, and it totally depends on the context of the teaching. I think it's up to school leaders to think really deeply about what they're going to do and ask these hard questions, like: What is the point of education in the age of AI?
For example, with generative AI, is testing the best way to gauge what people know? Because if I hand out a take-home test, kids can run it through an AI model and get the answer. Are the ways we've been teaching kids still appropriate?
I taught AP chemistry for a long time. I don't encounter AP chemistry tests in my daily life, even as a former chemistry teacher. So having kids learn to adapt is more important than learning new content, because without adaptation, people don't know what to do with these new tools, and then they're stuck. Policymakers and leaders will have to help the teachers make these decisions.
For more information, contact jcyip@uw.edu.