Four questions to ask about tech (and AI) products before children use them
by Melissa A. Butler
It is hard to read anything about education without finding a story about a new technology product for children, teachers, schools, or families. Recently the headlines have been full of examples of how AI can benefit children’s learning. Articles answering questions such as “When should we teach AI to children?” and “How do we teach AI to children?” saturate our inbox news feeds.
There are a few things typically missing from such stories, including discussion of what is meant by AI (for background and a distinction between AI and technology, this article is helpful: Artificial Intelligence Is Not A Technology).
To me, what seems to be most missing in stories and conversations about any category of technology product is a discussion of what is meant by learning.
What does learning look like? What does it mean when a child is intrinsically motivated to learn? Who decides what matters most in children’s learning? Who determines what is packaged as a helpful “learning” product? Who decides what is highlighted as successful learning?
These questions are especially important when a product, program, material, or tool is offered with a claim to be in service of “learning.”
To elevate our focus toward deeper conversations about children’s learning, here are four questions we might ask when discussing the relevance of, or need for, any proposed product, technology or otherwise:
Is it necessary for learning?
Sometimes the most obvious questions are the ones that are rarely asked. This is one of those questions. If there are other and better ways to engage children in the desired learning, there is no need for the technology product.
Often a product is framed as relevant simply because it is new or because people think using the product itself is meaningful learning. But we need to think beyond the product itself and ask about what kinds of learning happen with the product.
Does the product support accumulation of separated skills and tasks? Does it support more efficient input of information into students? If so, not only could such learning happen without the product, but it might not be the kind of learning we want for our children. Thus, we need to also ask: What aspects of learning does the product most value? Is this learning relevant for children? What other learning could happen if time wasn’t spent using this product?
Does it enhance relationships?
This has become a frequently asked question when it comes to decisions about whether or not to use technology products with children. Although it is a positive step that more people are noticing and highlighting the importance of relationships in learning, this has also resulted in much image-focused framing of technology products as about “relationships,” when such products are, at best, only relationship adjacent.
We need to think carefully about what we mean by relationship and what we mean by enhance. When a technology product is used alongside a relationship, or when a relationship is needed in order to use a technology product, or when there is a relationship in spite of the technology, this does not mean that the technology product enhances the relationship.
Does it support learning that is personal or personalized?
Most conversations about personalized learning products zoom past foundational questions about “What is learning?” and “What is personal?” into talk of usability, comparison with other products, scalability, etc. But these products shouldn’t get a free pass into our learning spaces by begging the question of learning itself. We should not assume that just because something calls itself “personalized learning” it has anything to do with meaningful learning.
Personal learning (learning grown from inside the child) is meaningful for the person and comes from the thoughts, feelings, and being of the person. It is motivated, sustained, connected, grown by the person in connection with others. It is alive and joyful and expressive and relevant and what is learned is deeply connected to how and why it is learned.
Personalized learning (customized program designed by others often far removed from the child) is individualized inside a frame of knowledge that exists away from the human, relational context of a person’s life. It is designed to be like learning, for the user. It is efficient at targeted practice with isolated skills and specific feedback designed to motivate the user to continue with incremental challenge to learn more inside the frame of knowledge the program has curated.
Is there a place for children’s practice with isolated skills and concepts as part of their learning? Maybe. Might some teachers or parents want such products as one small strand of what and how children learn? Sure. But this doesn’t mean “personalized learning” should be assumed to be helpful for learning, and we certainly shouldn’t assume that such products support children in practicing personal, intrinsic motivation as learners.
Does it deepen engagement or rely on hyper-engagement?
Engagement too often gets reduced to an image of fun and happy and busy: “Look, children love it.” “Look, they stay interested in this program for a long time.” “Look, their scores are up.” “Look, they love school now.” Although having fun and being happy might be desirable outcomes for children, they are not relevant indicators of learning engagement.
We need to look beyond the surface and think about how we are supporting the mess of children’s engagement—the feelings, boredom, not-knowing, daydreams, detours, and surprises of learning. These are the experiences that children most need in order to figure out what it means to motivate, sustain, and challenge their own learning. This is the kind of engagement children need to practice for their learning to grow from inside themselves.
Many technology (and AI) products designed for children’s use rely on programming designed to keep children inside a loop of “engagement” (with constant feedback and incremental increases of challenge). This programming follows similar logic to the design of casinos, by maintaining a closed loop for a user’s attention, creating experiences of intense “engagement” that feel as if time stands still.*
Although the term hyper-engagement means different things in other contexts, in the context of children’s learning I use it to mean that which entices users into a manufactured, and addictive, flow cycle of surface-level engagement. Products designed for hyper-engagement motivate children with gimmicks that are counterproductive to their development of intrinsic motivation for learning.
Even when such products are a small part of a child’s day, we need to ask how such experiences might be negatively impacting children’s overall sense of what it means to deeply engage in the rigor required to follow their intrinsically motivated curiosities.
Technology (including AI) products don’t need to be scary or overwhelming. But we can’t assume that products claiming to be in support of learning actually are. And we can’t rush past foundational questions about learning itself. We need to slow down and create space for central questions about what is valued in learning and what kinds of learning support all children to grow as deeply engaged and intrinsically motivated thinkers in the world.
*For further reading about notions of flow in engagement, see Nakamura, J., & Csikszentmihalyi, M. (2009). “The concept of flow.” In C. R. Snyder & S. J. Lopez (Eds.), Oxford handbook of positive psychology. Also, Flow: The Psychology of Optimal Experience by Mihaly Csikszentmihalyi (2008).
EarthTime is an example of a technology product that enhances learning.
EarthTime provides tangible access to big data in ways that let us process information, find new questions, grow community conversations, and take action. It enables engagement with data that otherwise couldn’t happen.
Message from Me is an example of a technology product that enhances relationships.
Message from Me allows children to photograph and share ideas about a moment of their process at school that couldn’t be shared otherwise (unless the caregiver were present in real time). This sharing through technology allows continued conversation between child and caregiver, extending both the learning and the bond between child and adult.
What can I do?
· Invite these four questions into your routine conversations about technology (and AI) products. If you don’t already have such conversations, try one out.
· Ask educators. Often, educators aren’t at the table when decisions are made about purchasing (or accepting donations) of “learning” products. It is important to invite educators to share more of their knowledge of learning before decisions are made.
· Notice how children handle boredom and not knowing what to do. If they always need to be directed, or need something outside of themselves, in order not to be “bored,” this is a red flag.
· Notice what children do and say. Notice what they do with their time and the kinds of questions they ask about the world. Look for them to have a balance of interests and multiple ways to play and learn.
· Look below the surface when technology products are advertised. Look for rich description of student learning and rich conversation from educators. There are products that support depth of learning; it just takes some digging to find them.
· Notice when things are described as efficient, scalable, or easy for teachers. This typically means they are not in support of messy, deep engagement of learning.
· Talk with librarians. Local libraries are full of resources to help parents and educators make decisions about children’s learning.