S14 Episode 5: A Kid's Book About AI Bias // Avriel Epps, PhD
Hosted by Hillary Wilkinson
Resist the seductive myth that artificial intelligence is somehow magic or all-knowing.
~Avriel Epps, PhD
As we charge forward into this age of AI and the next chapter of digital technology at breakneck speed, we would be wise to start conversations with our children, family, and communities about what we're seeing. Avriel Epps is giving you the tool to do that! As the author of "A Kid's Book About AI Bias," she drew on a background in studying social injustice and bias to create the resource she wished she had for her own children. In this episode we talk about the challenges of raising children amid our expanding view of AI and today's world, and much more - you don't want to miss this critical conversation.
Healthy Screen Habits Takeaway

Resist the seductive myth that artificial intelligence is somehow magic, all-knowing, or - most troubling of all - objective. Remind your kids, every time they use it, that that is not the case.
Resources
For More Info: https://www.avrielepps.com/about
Order Dr. Epps' book on Amazon!
Follow Dr. Epps on Instagram @kingavriel

Show Transcript
Hillary Wilkinson: (00:00)
I truly believe in the power of books, book sharing, and read-alouds - using a platform like a book to introduce a topic of concern or something you wanna bring into discussion. It allows for this sort of on-ramp to conversation. And there's a series of books from DK publishers called A Kid's Book About... that utilizes this on-ramp-for-discussion method for deep social, psychological, and sensitive issues really beautifully. My guest today is an author of a book in this series, and swims deeply in the studies of sensitivity, biases, social change - all of these things - holding a PhD from Harvard in human development. And as a civic science postdoctoral fellow at Cornell, their research focuses on the intersections of algorithmic bias and identity development across racial and gender spectra.
Hillary Wilkinson: (02:08)
As co-founder of AI 4 Abolition, she is dedicated to building collective power with and around AI through open source tools and AI literacy programs in marginalized communities - and honestly, I think we can all up our AI literacy. She's written a book that I think should be required reading for kids, and which is different from most books on this topic, which are currently written only for adults - which is why I really wanted to amplify it. It's called A Kid's Book About AI Bias. And as we charge forward into this age of AI and the next chapter of digital technology at breakneck speed, we would be wise to start conversations with our children, family, and communities about what we're seeing. I'm so honored to welcome to Healthy Screen Habits, Dr. Avriel Epps.
Avriel Epps: (03:06)
Oh, thank you so much, Hillary. I'm so happy to be here.
Hillary Wilkinson: (03:11)
Avriel, I think if I were choosing anyone who defined the statement "we are more than the sum of our parts," you would be in my top five. You've been entrenched in the academic arena of social justice, human development, and technology - and yet in earlier life you had a foot firmly set in the entertainment industry ("Yeah, that's right!") with music recording and even voiceover acting. So how did all of these things merge and create where you're at today?
Avriel Epps: (03:49)
That's an excellent question. So, yeah, like you said, I grew up in the entertainment industry. I wanna say I was in the Screen Actors Guild when I was two, and, you know, on Nickelodeon shows, Disney shows. Um, and there's something about being in media and being a media worker so young that really shapes your psychology in a lot of not-so-great and interesting ways. I was keenly aware of how I was being judged based on how I looked and how I performed, on a daily basis. And this was before social media. Now, of course, young people know exactly what that feels like from a very young age, if they're unlucky enough to experience the judgment of the internet at a young age. But it was a pretty unique experience for me as a millennial in the nineties.
Avriel Epps: (04:44)
Um, so yeah, I think that, coupled with the fact that by the time I got to college, I was also starting to get more of a formalized education in how media shapes the way we see ourselves and our place in the world - even as consumers of media, and not just performers or workers in that industry. At the same time, I was on MySpace, you know, starting to dabble in producing content for people on the internet, and that gave me an alternative space to explore my identity. And so, as a college student, when I first started doing research, I was really interested in: what's the difference between what traditional media is doing and these new opportunities that digital media is allowing? Fast forward seven years after that, and machine learning is kind of a new emerging thing that folks are having conversations around, about identity and bias. And I thought, oh, okay, that's interesting. So it's not just about the content - it's also about the underlying technological structures, and how those shape people's experiences and their identities. That's kind of been my fascination, and my little special interest, I guess you could say, for the last almost decade. Um, yeah. And I think people say that sometimes research is me-search.
Hillary Wilkinson: (06:20)
Mm-hmm .
Avriel Epps: (06:21)
Yep. And I certainly became interested in adolescent experiences because I had such painful adolescent experiences navigating media and digital media. Um, and I also kind of wrote the book that I wish my five- or six-year-old self had. I say this in the book too: I learned as an adult that it wasn't a problem with me - it was a problem with society - and what a relief that was. I don't want kids who are reading the book now to have to be grownups to understand that. So, yeah, it's a little bit of a love letter to my younger self, and also to my own kiddo.
Hillary Wilkinson: (07:03)
Yeah. Yeah. So how does being a mom kind of inform your view of where you want technology to go?
Avriel Epps: (07:11)
That is a wonderful question. Um, I think about the unknown a lot with my kiddo, as I think a lot of people listening to this podcast do - and not necessarily the unknown of the existing technology, although that's certainly a part of it. I know so many people feel locked out of fully understanding how this technology works, which is a big reason why I wrote the book. But also the unknown of where it will go. We know what large language models are like now, in 2025, but what will they be like in five years, in ten years, when he's an adult? What is the future of work gonna look like? What is the future of the climate gonna look like, because of artificial intelligence's intensive usage of resources? So I think I'm not answering your question of how it informs my work, but I think it motivates my work, for those reasons.
Hillary Wilkinson: (08:15)
Oh, I like that answer. Because I feel the same - I feel like there's so much that we don't know, and it's very easy to get caught in the downward swirl of where things may or may not be heading. But, you know, it's almost like every day we have to remind ourselves of messages of hope. And I come from a place where I truly believe with all my heart that education saves lives. And I feel like the more we can educate, the more informed people will be, and the better-informed choices they will make.
Avriel Epps: (08:56)
Absolutely.
Hillary Wilkinson: (08:57)
Yeah. So, speaking about being informed, let's break down some terminology, because -
Avriel Epps: (09:07)
Absolutely.
Hillary Wilkinson: (09:08)
So just so we're coming from a baseline understanding of the words we're using. Before we get into the book: the title is A Kid's Book About AI Bias. Can we discuss - what is AI bias?
Avriel Epps: (09:30)
Yeah, absolutely. So depending on who you're talking to, they're gonna define AI bias differently. If you're talking to somebody who's just a statistician, without any social science or humanities training, they're gonna say AI bias is just the systematic error in the prediction that a system makes. All prediction systems - like large language models - are predicting, moment by moment, what word is the right fit for the task at hand. Or a Waymo is predicting, moment by moment, what the right velocity or speed or turn should be, right? So a statistician will say, well, there's always some error in that prediction, and that systematic error is the bias. That's not what I'm talking about in this book. I am talking about biases as we understand them at a societal level: these historical injustices, these historical hierarchies in our society that place women below men, or that place white folks above Black and brown folks, or that say people with disabilities are less valuable to society because of their disabilities. When the AI reflects those historical forms of injustice, that is AI bias. I'm not the only one who thinks of it that way - and instead of AI bias, you could also call it tech injustice, or unethical AI. But the reflection of those real-world biases in the AI is what I'm concerned with.
Hillary Wilkinson: (11:18)
For people who are struggling with it - it's like, okay, bias is the benefit of people in power, you know - but what does that look like for just, you know, the average Joe Schmo who's logging on, or maybe using a filter or facial recognition, anything along those lines? Can you kind of flesh that out a little?
Avriel Epps: (11:53)
I mean, those are two great examples. So, selfie filters: there's kind of a long-standing history - I guess maybe a short-standing history, 'cause they're not that old of a technology - where they tend to lighten people's skin color, or reshape the features of the face in some way. Wider noses get shrunken down to be thinner, or hooded eyes get reshaped so they don't have that shape anymore, or your eye color gets changed from brown to something more fitting for a European standard of beauty, like blue or green. Facial recognition technologies: there's a lot of research in this area showing that facial recognition technologies just don't work as well for women, and for people with darker skin tones or more African or Indigenous features. And that's across their uses, right? Their use in law enforcement, their use in kind of everyday surveillance - like Zoom's background feature, which is using facial recognition to see where your face might be. And there are numerous examples. One of my favorite examples to use with little kids - because they love listening to music; at least my little kid loves listening to music - is from some years of research I did with Spotify, looking at biases in their recommendation systems, which are another form of machine learning. It's not the large chatbot AI that we typically think of, but it is a form of machine learning. And we found that they recommend people with female-sounding voices - women artists - less often than male artists. So I tried this out with my own kid. We were listening to a playlist of kids' music, and it was song after song after song with no girls' voices in it. Wow.
And that's something that's really easy for them to pick up. I'm like, hmm, why do you think that is? Or: let's play a game - we're gonna click next, and we're gonna see what is recommended to us. And, you know, it's not a playlist that I built; it's one of those AI-generated playlists. So these kinds of examples are kind of everywhere, if you're looking for them.
Hillary Wilkinson: (15:41)
Yeah. And that's so interesting, especially for little kids' music specifically, because there's so much research around the soothing quality of the female voice. So much so that even in fire stations - they used to wake up sleeping firefighters with a loud bell, and they found that that burst of cortisol, adrenaline, et cetera, was having long-term health effects. So instead, what they do now is a woman's voice comes on and announces the alarm, and it gets slowly louder and louder. But I'm like, that's so interesting to me. It's like, are we weaponizing...? I dunno - that goes deep and dark. Let's not go there.
Avriel Epps: (16:34)
Yeah, no, I have the tendency to go in that direction too. So we'll hold each other off the ledge, Hillary. Exactly.
—--------------------------------------------------------
Ad Break: HSH Parent Presentations
—--------------------------------------------------------
Hillary Wilkinson:
I'm speaking with Avriel Epps, the author of A Kid's Book About AI Bias. AI is at the center right now of most digital literacy and ed tech conversations - it's a huge hot button. And as with many new things, there's fear in the unknown that you talked about earlier. Quite honestly, I come from a place where big tech has proven itself to be worthy of distrust. ("That's right!") So, that being said, one of the reasons why I love your book is that it really carries this message of empowerment and hope, and you encourage parents and kids to critically examine biases. You had a beautiful illustration with the use of music. Do you have any other illustrations of what that kind of examination might look like, or what that conversation could look like?
Avriel Epps: (18:00)
Yeah. I mean, you brought up the distrust of big tech, and I think, you know, there are a lot of reasons to distrust big tech, but one of them is their recklessness in handling issues around children and children's safety. And when we talk about structures of power that disproportionately harm certain groups of people, I'm not just talking about what we typically think of - people of color and women - I'm also talking about children. Children in this country are a marginalized group of folks on the whole, regardless of their racial identity or their gender identity, because they lack power. They lack autonomy in a lot of instances, and unfortunately, in some cases, they lack agency too. And so one of the central messages of this book is that you, as a young person - one - are not reflected in these systems.
Avriel Epps: (19:04)
And your safety and wellbeing are not necessarily prioritized in these systems. But - two - you deserve to be. We shouldn't just take that for granted and say, oh, they're kids; let's prioritize the users who are going to be the quote-unquote majority users. You know, I want kids to understand that because they were born in this era, they have a cradle-to-casket digital footprint. And that digital footprint - that data - is being used to power and make these systems, which create billions and trillions of dollars in value for corporations. So I want them to feel a sense of empowerment and say: actually, that's not right, and I can do something about it, and I should do something about it - and I should do that for the sake of all kids. Right.
Avriel Epps: (19:56)
It's also a message of the need to see each other as a collective, rather than just thinking about how do I keep myself individually safe. And I'll give you another example of bias in some of these systems. I put out a card game, which folks can download for free on my website, aubreylabs.com, or on the book's website. It's 10 different little fun experiments that you can do to put the concepts of the book into concrete action, or to see them happen in real life. And one of the experiments is to ask ChatGPT - with your grownup; "do this with your grownup," it says in the instructions, right? - to generate a picture of an American, and to do that 10 times. Now, when my kid did that, he very quickly realized: oh, there are no kids. Every picture of an American is a middle-aged person - somebody in their, you know, thirties, forties, or fifties. Oh, wow. So it wasn't even "different races aren't represented here" - there's just no generational representation here either. And I think every kid can understand and relate to that. That's one of the powerful things about the book.
Hillary Wilkinson: (21:17)
I feel like every computer science teacher in elementary schools could benefit from just opening their class that way. You know what I mean? What an easy exercise to promote critical thinking and digital literacy. So, you're also involved on social media - you're on Instagram and TikTok, and you have a very large global following. How does knowing what you know about algorithms inform what you do on these platforms?
Avriel Epps: (22:05)
Oh, love that question. Okay, the first thing I'm gonna say is: I do not have those apps on my phone. I have somebody else post my content for me, because I learned very quickly that when I'm posting content, I get glued, and I'm refreshing, refreshing - I need to know people liked it. I have divorced myself from that entirely. But I've also read a lot of the really interesting research about how misinformation spreads, how conspiracy theories get picked up, and, you know, what hooks people into social media and makes them wanna refresh when they're just scrolling, not posting their own content. And I'm experimenting with this a little bit. I don't know if it's totally ethical, or how I feel about it - you listeners can be the judge - but I try to take those understandings of how the algorithms are being hacked by bad people, or people who have nefarious intentions, and then go: hmm, can I use that to inject a little bit of critical thinking?
Avriel Epps: (23:08)
So I'll often open up my hooks with "did you know," or, you know, some salacious, scandalous thing that has happened. And then that is the hook for getting folks to the deeper question beyond just the salacious headline, if you will: what role do we want artificial intelligence to play in our society, and how does it further issues of injustice? Yeah. So I need to do some content about my process too - I'm glad you asked me that question.
Hillary Wilkinson: (23:45)
And I was watching one of your posts recently where you talk about algo speak. Can you explain, first, what this is, for people who don't know, and then how it's used as a tool, and all of that?
Avriel Epps: (24:02)
Yeah, I'm not familiar with the scientific research about this, so everything I'm gonna say is totally anecdotal. But algo speak is a thing that young people do to try to trick the algorithm - to keep it from throttling their content, or keeping it from going viral or being pushed out. So there are theories that if you're promoting your link in your bio, for example, in your content, or if you say a controversial word, the TikTok algorithm will not push your video out on people's For You pages. So folks will try to disguise that language by saying things like "ink-lay" and "io-bay," or like "shmink in shmio," and use all kinds of weird spellings of things in their captions to try to trick the algorithms. Hmm. I am not entirely convinced that that is necessary, or that the underlying assumption - that that's what's stopping people's content from being pushed out - is necessarily true across all the cases where people use algo speak. But now I'm like, oh, I need to go look at that research and see if anyone's actually done any experiments on it.
Hillary Wilkinson: (25:23)
Well, and again, anecdotally - because Lord knows I'm not researching things; it's more of a mom-on-the-street sort of thing here. But after listening to you talk about algo speak, I kind of go, hmm: I wonder if the Gen Alpha "brain rot" slang of today has roots in that. I mean, we have all these parents who are so frustrated because they're like, "I can't understand what my kid's talking about at all." And it's like, well, maybe the systems have driven them to adopt this alternative language.
Avriel Epps: (25:59)
Yes - and not only that, but they've been connected to each other at such a speed. Kids have always come up with slang to try to get their parents to not understand what they're saying, but there is a speed at which it gets adopted now, and, I think, a complexity to that vocabulary. To me, that's very impressive - honestly, I'm kind of impressed by it. I understand that it feels like brain rot, but when you think about it that way, it's kind of cool.
Hillary Wilkinson: (26:26)
Yeah. Okay. So, going back to A Kid's Book About AI Bias: like I said, we are all about education and empowerment at Healthy Screen Habits, and I love how your language really empowers the reader. On pages 58 and 59, you state: "We are all co-creating the AI of the future. Everything we do every day generates data for AI systems, and we have the right to say something when these technologies aren't fair and are hurting people." I love that statement, Avriel. I deeply wanna believe it. Can you convince me?
Avriel Epps: (27:10)
Mm. Well, which part do you wanna be convinced about - the technical aspect of what I said, or about having the rights?
Hillary Wilkinson: (27:18)
I believe wholeheartedly in the rights of humanity to say something when something is not right, and to have the power. I don't know that I believe that the creators of technology are listening to the voices. And it puts me in a deep, dark place. You know, you said research is me-search - well, the podcast is my very selfish endeavor to talk to people like yourself, who I never would otherwise have the opportunity to, to help me make sense of the world I'm living in.
Avriel Epps: (27:54)
Totally, totally. Um, yeah. So here's the thing: technological advancement would not progress, and the value that these companies have would no longer exist, if enough of us actually said, no, we're gonna stop producing data for you. Now, the logistics of how we do that are very TBD, but there are some mechanisms already in place that allow us to opt out, or delete our data, or, you know, engage in some kind of data protest. In California, for example, you have the California Consumer Privacy Act, which allows individual users of any website to, one, request the data that's being collected on them, but also to delete it - you have the right to delete. And I think it's very reasonable that we could organize some contingent of the digital population. Some studies have suggested you need maybe 30% of a technology's user base - so, 30% of Netflix users, say - deleting their data in protest, or poisoning the data that does get uploaded. Or taking the data that Netflix - and I'm not picking on Netflix; I love Netflix, and I don't think they're the worst of the worst - or that Meta has collected about you, and saying: I'm actually gonna take this file and give it to a competitor until you do what I want you to do, as the consumer, or the collective of consumers engaging in this type of protest.
Avriel Epps: (29:48)
And that being a form of collective action - or a first step in pressuring them to engage in some kind of collective bargaining, or to meet some form of demand. So you're right: we've watched over a decade now of failed legislative action to rein in these big companies. I don't think there are gonna be any major wins in that direction anytime soon, at least at the federal level; there's some progress happening at the state level. But I think we need to think about other tactics. I think about the sixties - the Memphis sanitation workers and the Birmingham bus boycott. I think that's the -
Hillary Wilkinson: (30:35)
No,
Avriel Epps: (30:35)
I think that's where we're at.
Hillary Wilkinson: (30:37)
I agree with you. The power is in the grassroots organization.
Avriel Epps: (30:41)
That's right.
Hillary Wilkinson: (30:42)
Yeah. So we have to take a short break, but when we come back, I am going to ask Avriel Epps for her healthy screen habit.
—--------------------------------------------------------
Ad Break: HSH Website
—---------------------------------------------------
Hillary Wilkinson: (31:01)
Okay. I'm speaking with Avriel Epps: author of A Kid's Book About AI Bias, expert on predictive technology, and - perhaps most importantly for this podcast - parent. On every episode of the Healthy Screen Habits podcast, I ask for a healthy screen habit. This is a tip or takeaway that our listeners can put into practice in their own home. What is yours?
Avriel Epps: (31:29)
Hmm. I would say mine is to resist the seductive myth that artificial intelligence is somehow magic or all-knowing - and, probably most troubling of the myths, that it's objective in some way - and to remind our kids, every time they use it, that that is not the case.
Hillary Wilkinson: (31:49)
Hmm. And I also like to remind them how much water it takes
Avriel Epps: (31:54)
So much water and energy!
Hillary Wilkinson: (31:56)
Yeah. We live in Southern California, and I think water is always on our minds out here. When you hear about it - I mean, they say it takes what, like a bottle and a half of water for every image generated by AI? I think it's akin to the days of strip mining during the gold rush - to completely absorb ourselves in California culture here.
Avriel Epps: (32:22)
Yeah, absolutely. So
Hillary Wilkinson: (32:24)
As always, you can find a complete transcript of this show, a link to Avriel's website, a purchase link for the book that belongs on your bookshelf - A Kid's Book About AI Bias - and where to find her on Instagram and TikTok. You'll find all of these by going to the show notes for this episode: go to healthyscreenhabits.org, click the podcast button, and find this episode. Avriel, thank you so much. Thank you for being here today, and for casting light on an area that I think right now is really kind of in the shadows for a lot of us.
Avriel Epps: (33:03)
Yeah. Thank you so much for having me. It was a lot of fun.
About the podcast host, Hillary Wilkinson
Hillary found the need to take a big look at technology when her children began asking for their own devices. Quickly overwhelmed, she found that the hard and fast rules in other areas of life became difficult to uphold in the digital world. As a teacher and a mom of 2 teens, Hillary believes the key to healthy screen habits lies in empowering our kids through education and awareness.
Parenting is hard. Technology can make it tricky. Hillary uses this podcast to help bring these areas together to help all families create healthy screen habits.