S15 Episode 8: Online Harms Prevention Kit // Julianna Arnold & Dawn Wible
Hosted by Hillary Wilkinson
"…the laws that are currently on the books are not adequate to meet the challenges (of protecting kids against online harms)."
~Julianna Arnold
Dawn Wible and Julianna Arnold are co-chairs of the Online Harms Prevention Workgroup with Fairplay. Fairplay for Kids is the nation's leading nonprofit committed to helping children thrive in an increasingly commercialized, screen-obsessed culture, and the only organization dedicated to ending marketing to children. The Online Harms Prevention Workgroup believes children and young people deserve to be safe online. Today, we discuss a helpful kit designed to assist parents in getting a handle on all areas of online harms.
Listen to this episode and visit the show notes on the Healthy Screen Habits website to gain access to this valuable resource.
Healthy Screen Habits Takeaway


Resources
https://fairplayforkids.org/pf/onlineharmskit/
https://www.healthyscreenhabits.org/s6-episode-8-talk-more-tech-less-dawn-wible
Show Transcript
Hillary Wilkinson: (00:02)
The theme of this season on the Healthy Screen Habits podcast is From First Screens to Crisis Moments. And my guests today are here to talk about the latter. They're co-leaders of the Online Harms Prevention Workgroup with Fairplay, the nation's leading nonprofit committed to helping children thrive in an increasingly commercialized, screen-obsessed culture, and the only organization dedicated to ending marketing to children. So the Online Harms Prevention Workgroup believes children and young people deserve to be safe online. And today we're going to discuss a very helpful kit designed to help parents get a handle on all areas of online harms. Our two experts today have both been on the podcast before. So welcome back to Healthy Screen Habits, Julianna Arnold and Dawn Wible.
Julianna Arnold: (01:11)
Thank you, Hillary. Thank you. Thank you, Hillary.
Hillary Wilkinson: (01:13)
Julianna, last week you shared the story of your daughter Coco, and I encourage listeners to go back and familiarize themselves with the, um, the work you are doing on behalf of Coco and all families affected by social media, as well as the court case that's happening in downtown LA. Is there anything, I just kinda wanted, uh, we talked a lot about that, but is there anything that you'd like to add about your involvement with Fairplay or anything along those lines?
Julianna Arnold: (01:51)
Yeah, I mean, I don't think I would be here doing this work if it wasn't for Fairplay. And, you know, I started off with the online harms prevention, uh, work group kind of just listening, right? Like just learning and listening. And it provides such an amazing opportunity to hear other people who have been doing this longer, whether they're survivor parents or they're advocates like Dawn, who have had experience in, you know, in these things and run their own organizations. So it was just phenomenal to be able to sit in with no pressure, you know, and just listen. And then over time, slowly kind of get more and more involved. And then I was kind of surprised when they asked me to be co-lead with Dawn. I was like, oh my God, wow. You know? That's cool. So it's been a fantastic, um, kind of group that's very safe and very welcoming to enter as a survivor parent, when sometimes you just don't know what other people know yet, and you're coming obviously with some baggage, and you're there for a little bit of a different reason. But, um, yeah, it's what brought me to this work, really. I have to give it to the online harms prevention work group.
Hillary Wilkinson: (03:01)
Mm. Yeah. And, um, on a different note, I can say all of the work groups have a different focus, and, um, all are very inviting and non-shame-based and very educational as well. So, Dawn, way back in season six of this podcast, we explored the work you do with your organization, which is Talk More Tech Less. And I'll link that episode in the show notes, so anyone who wants to get caught up can go do a quick listen. Knowing that you are already passionate about digital wellness and all of these topics, were there additional things that called you to lead the online harms prevention group?
Dawn Wible: (03:55)
Yeah. Really what you said, Hillary, of it not being a shame atmosphere, that is so true, because technology affects all of us. Mm-hmm. Right? I mean, um, some of the latest statistics are it's nearly 22 hours a day, and there's 24 hours in a day. So it's such a huge part of our life. Why aren't we putting more conversations around it, more education around it, more safeguards? Um, just like we talked about with the trial, every other industry is required to provide guardrails. Why are tech companies, um, different? They're producing a product, marketing it to kids, and, um, they should have the same kind of accountability that all the other industries do. And so it's such a huge passion of mine to do the education part, which is what I've done for so long. Um, but being drawn to the online harms prevention work group was just where my work evolved, because I was tired of seeing the burden being put on the kindergarten kids that I was talking to, and on parents, and seeing the harms influence even my own kids. And so, um, yeah, I was drawn to it, and then to get to work alongside the most incredible practitioners and survivor parents, and to see us working together on these issues, it's a powerful group. And I felt the same way, Julianna, when I first joined. I was just there to listen and learn, right?
Julianna Arnold: (05:26)
Yeah.
Dawn Wible: (05:27)
Just, it's such a powerful group of activists that know deep down what's right and will go to whatever lengths to make that happen. So I'm, I'm just so honored to be a part of it.
Hillary Wilkinson: (05:39)
Well, I'm grateful to the, yeah, I'm grateful to the both of you guys. It's, um, it's a difficult space to, uh, stand in, and I admire you both. Um, so as most of you know, I'm an elementary school teacher by trade, so I like to start at the very beginning of any topic being discussed, establishing common language and knowing what we're all talking about, right? So using that as kind of a starting ground: can you define for me, what are online harms?
Dawn Wible: (06:21)
Well, when we first started the work group, um, there were different specific harms that were in the title. I think, I can't even remember our original title of the group. But we started to notice so many different issues that kids were experiencing online. You know, some of these things are crimes on the street, um, but they weren't crimes on the platforms. And so they are happening, but the platforms aren't being held accountable for 'em. Um, so we started with cyberbullying, which is one of the top issues and harms that, um, are being addressed. But then we started to see that kids were experiencing harms like, um, harmful challenges, TikTok viral challenges, dangerous online challenges, um, drug sales on platforms, like Julianna shared. And all of these different things started to combine, and we realized this is a bigger issue than cyberbullying alone, or a bigger issue than drug sales alone.
Dawn Wible: (07:23)
These are all-encompassing and affecting minors on the platforms. And so we ended up actually changing the name to online harms prevention, 'cause we realized we're not just dealing with one or two things; this is a whole grouping of issues. And that's where we started to target it. And really, that came from the survivor parents who were joining. We would have a survivor parent join that had a completely new harm that their kid had experienced in dealing with chatbots, or, um, just as technology was evolving, the harms began to, um, add up. And we do have a list, an online harms list, because when we were testifying in court about some of these different issues, they were saying, well, where are these things? We wanna see what harms you're talking about. And so our work group produced, um, a list that has the specific harms and links to resources for families and parents. Um, and that's what the action kit kind of evolved from.
Julianna Arnold: (08:30)
This issue cuts across harms. We all come to it, or not all, but some of us come to it from different places, different experiences, different harms, different atrocities, horrible things. But in the end, it comes back to the same main cause, which is the fact that there are no guardrails for these companies, and the way things are on their platforms is very harmful. And the laws that are currently on the books are not adequate to meet the challenges. You know, they're very outdated. They were kind of built for a billboard kind of situation, which the internet kind of was in the beginning, right? And now it's just completely morphed into this. And AI has been driving this since they developed algorithms back like 10 years ago. So, um, but none of us knew, you know? Right. And that's why I think the real passion is like, we have new families, like younger families, like they need to know, they need to know. And we hope that once they know, they're gonna wanna speak up about it, because that's what we need.
Hillary Wilkinson: (09:46)
Yeah. And I think the power in having that list is, it does provide this common language to use. Otherwise, I mean, you don't know what to call what's harming your family, and that's so isolating. You feel as though, I'm the only person who must have ever experienced this. But if there's a name, if there's, like you said, this powerful list of words, you identify, oh, that's what I'm dealing with at my house.
Julianna Arnold: (10:19)
And I feel like when I met other families, whether our harms were different or not, it all kind of focused around their, you know, online, um, activities and these social media platforms. You know, some of them I knew about, some of them I had no idea about, and I was just completely shocked. But also when you talk to parents who have lost their kids, who are open about it, you know, you find that there's a pattern in usage too that kind of leads them down this path. And it's, um, the whole narrative that it's like the parents' fault, you know? We were all parents, but like, really, when this all happened, there were no, there were no safety regulations. There was nothing. It was like, oh, it's okay, if they're 13, you know, or whatever age, they're fine. So we didn't know. And it started off kind of innocent, you know?
Hillary Wilkinson: (11:08)
A lot of the messaging was not only "it's okay," it was "it's good, it's good."
Julianna Arnold: (11:17)
Well, right. They can connect and all this stuff.
Hillary Wilkinson: (11:19)
Connect people, they can, yes. So, you know, you,
Julianna Arnold: (11:22)
And, um, so I think, you know, we really didn't have enough information to make informed decisions. So we did the best we could to keep our kids safe, but the reality is, the safety tools that they put in place were always after the fact. Mm-hmm. So it wasn't like they were designed safely. It was like, oh no, we have a problem, something came up, now we gotta put a bandaid on it. Right? And the reality is, when you're a parent and you try to, like, as we all know, try to implement that: (a) it's not easy, or (b) you do it and all of a sudden your kid's found a way around it, because they're smarter than we are, and there's a whole network of them, like, talking about how you do that. Um, or, yeah, they're just not effective. And so to say that that's good enough is not okay. That's not good enough, you know what I mean? So they can do better. I agree. And the whole thing is like, just design them safely. We're not against the platforms. I mean, they're here, um, technology's here, you know, AI. But that's the whole scary part: if we don't do something about it now, what's gonna happen down the road? It's gonna be that much more destructive, you know? Right. So,
Dawn Wible: (12:34)
And I wanna add too, that a lot of times the companies are the ones pushing that narrative that this is on the parents, that this is the parents' fault, while at the same time not providing that kind of protection in their products.
Julianna Arnold: (12:49)
Right? Yeah. 'Cause like, parental controls are usually something that a parent has to actively set up and keep in place. So it's not something that's there by default; it's usually like, you gotta do this, you gotta do that, gotta do that. And then your kid knows perfectly well what mom has done, and they're angry with you when they're a teenager too. So it causes this huge, like, why'd you do this to me? No one else has, you know. It just immediately causes, you know, agita and chaos in the house. And that's what I dealt with. It was always the issue. And it's like, it's hard enough to parent in general these days, and then to have that,
Hillary Wilkinson: (13:25)
And you think you get your, it's impossible. You think you get your protections in place and then an update comes through and everything.
Julianna Arnold: (13:31)
Oh, yeah. No, no, no. I mean, I've done that. I'm like, how did that happen? I thought I took care of that.
Hillary Wilkinson: (13:36)
Yeah.
Julianna Arnold: (13:36)
Yeah. And there's not a lot of good places to go, either. Like if something's not working, no one ever responds to you. Or if you do report some, you know, illegal or problematic, you know, um, stuff that you see on the platforms, no one gets back to you. It's just kind of like you're out, you know, writing into the ether. So there's no real, you know, um, customer service, let's put it that way. There's absolutely zero customer service.
Hillary Wilkinson: (14:05)
Which is where the action kit comes into play. So we have to take a quick break, but when we come back, let's get into the action kit that's been developed by your group, as well as, um, some of the online harms that you're seeing surrounding AI.
—---------------------
Ad Break: HSH Presentations - Parent Nights
—---------------------
I'm speaking with the co-leaders of the Online Harms Prevention Workgroup at Fairplay. So before the break, we talked about the definition of online harms, and now I'd really like to dive into the action kit created by your group. So what is this kit? How'd it come about? Who do you see using it? All, all the stuff.
Julianna Arnold: (15:03)
Well, Dawn's the educator, so I'm gonna, I'm gonna pass this one over to Dawn.
Dawn Wible: (15:08)
Well, this was really a product that came from the harms list that we talked about earlier. We started to see that these harms were rising. And, um, we had parents, survivor parents, in our group that dealt with these specific harms within their stories. And many of them were saying, well, I wish that I had done this, or, when I was working with law enforcement during this, they said this would've been helpful. And so the most amazing thing about the action kit is that it is actually made by survivor parents that have experienced those harms, and by child safety experts and law enforcement and different activists that are putting that topic into your hand and saying, here are our best practices and here's how to help protect your family. Now, of course, we know, because of all of our stories and because of the stories within our group, it isn't foolproof, because that's just where we are with the internet right now and with phones.
Dawn Wible: (16:12)
And, um, but it is a tool to help bring some prevention. I always say education is prevention. And if one person hears, hey, you don't have to send that picture when that person asks, it might not even be the person in that profile, um, that you think it is. And so for them to hear that simple thing and make an empowered decision with that information, then that can prevent the harm, and that can save lives. And so the list is, um, I mean, it's extensive. We started with five, um, five or six, I think it was six, Julianna,
Julianna Arnold: (16:49)
Six I think, I think in the end it was six that first year,
Dawn Wible: (16:52)
And then we added up to, um, 12. So, anywhere from screen overuse, harmful challenges, illegal drug sales online, online sexual exploitation, pornography, depression, suicide, self-harm, and eating disorders. So we have a specific one-pager for each one of those harms. Um, yeah.
Hillary Wilkinson: (17:13)
Can you break down like what a one-pager looks like? Is it just a definition? Is it where to get help? How is it meant to be used?
Julianna Arnold: (17:30)
Well, we try to keep it as simple as possible, so it would be very accessible to many different groups of people. But really, it's kind of like an informational, like, design. Like, okay, what is this? What do we mean by illicit drugs, you know, online? Like, what does that mean? And then it's kind of like examples of the different types of stuff, types of things like that. Because like sexual exploitation, there's a number of different ways online that kids are being sexually exploited. And then moving into, um, some examples, just to give people real-life examples of what that could look like. And, um, some statistics, so that it really, like, hits home. And then going into kind of, um, what you should look out for. If, you know, if you see this, then that might be an indication that we have a problem.
Julianna Arnold: (18:24)
And then, how can you talk to your kid about it? Like, it's a hard topic, but how can you address it? So maybe some prompts, like, what kind of questions to ask, how do you do it. Because obviously open communication's the best way to go about it. And then it kind of ends with, you know, resources for more information. Um, and as Dawn said, there's no way we can, it's not foolproof, right? Sure. All we can do is provide this information to people, make them aware, because nothing's foolproof with these companies, because they have so many resources and are so powerful, and obviously have no, um, no guardrails in what they're doing. So it's kinda like, I hate telling young parents, you know, what I really think, which is, don't let your kid go online. I mean, seriously. But knowing that that's a hard battle to fight with your kids, especially when they get older.
Hillary Wilkinson: (19:25)
Sure. And for that reason, and what we talked about earlier, about, um, the online companies: trying to stay in front of the products they come up with is a lot like digital whack-a-mole. We can't do what the top engineers, designers, coders, et cetera, in the world are doing, right? So at Healthy Screen Habits, that's why we firmly, firmly stand by the idea that the best online protection is your relationship with your child, you know? And so I love that you guys have the conversation starters. I mean, even if somebody just, you know, doesn't know how to open a topic on something that they saw on their kid's phone and they're not sure about, I think you guys have put just a, you know, an arrow in the quiver of, of, you know, I don't know, just help. So thank you.
Julianna Arnold: (20:34)
It's not an easy discussion to start. I always said, it's not that I don't trust you, it's that I don't trust what's out there, because really it's an open portal to everything bad in the world, basically, is what it is, you know? And unfortunately, um, it's bigger than us, and it's bigger than parents. And that's why it's unfair to say, if something goes wrong, that it's the parents' fault. I just,
Hillary Wilkinson: (21:16)
Oh, yeah. Yeah.
Julianna Arnold: (21:16)
In most cases, I don't see that, you know, being the case.
Dawn Wible: (21:20)
Yeah. It's absolutely bigger. And it's a cultural shift that has happened. So we've had parents say, well, I never gave my kid a phone, and they still experienced these harms, because culturally, kids are learning about it. There are challenges being tried at school that kids may not even have seen. They never saw it on TikTok, but they heard about it because somebody else was talking about it, and then they tried it. So just the fact is, they don't even have to have access to these devices, to social media; they're still being impacted by it. Because culturally, there's been a shift. And we've seen that in the trials too, just the addiction, how that culturally was happening to an entire generation. And we just have to take a step back sometimes and look at the bigger picture of these harms, and then see where our part is. And so I just love that this group is tackling not only the litigation, not only the legislation, but also education. We need all of it, we need all of it. Well,
Julianna Arnold: (22:28)
That's it. We need all of it; it needs to be a holistic approach. You know what I mean? Because one piece without the other is not going to be meaningful and not gonna be enough. So I think we all realize that, and we all know where we stand, like, where our area is, where we can have the most impact, and we respect each other, because we know, without these different parts, we would never be able, we will never be able to, you know, tackle this huge, huge, huge challenge, um, to society. And it's not just us; it's all over the world, you know? Right,
Hillary Wilkinson: (23:05)
Right. So I think one of the biggest areas that we're all talking about in digital wellness and parenting right now, with tech, is AI. And so can you guys, just because it is a new-ish topic, and, um, although, as you mentioned earlier, Julianna, that's essentially what the algorithms are. So we've actually been dealing with AI for years and years and years, but now it looks different, it feels different, now that it's gotten into this interactive type of component. And we're seeing online, kids that have involved themselves with relationships, you know, and I mean, people are fostering friendships with chatbots. Are there types of harms that you see around AI specifically that are different?
Dawn Wible: (24:12)
Yeah. Um, well, I just think about this last year. I was really honored to be at the White House to see the Take It Down Act signed into law, which is the first federal legislation of its kind, because it does require the platforms to take down images, including AI images, that are non-consensual intimate images. Um, and that came from high school girls this harm happened to in Texas. And, um, the boys in the school took pictures out of the yearbook, ran their photos through nudify apps, which shouldn't be allowed, um, in the app store for minors. But they not only are allowed, they're advertised to minors. Um, and they used those apps to sexually exploit their classmates. And there wasn't a law to protect them, with AI being the images, but the girls experienced the harm the same way they would've with their own bodies. And nobody realized that. Yeah. And so to be able to be informed on some of these issues of sexual exploitation happening, but not even realizing that, you know, this was before the explosion with Grok undressing women.
And so the girls testified and pushed and pushed and pushed and got this, um, law passed, essentially updating sexual exploitation online and non-consensual intimate images to include AI. And we are seeing that happen with, um, CSAM, child sexual abuse material. Um, definitely in those issues. But then also interacting with chatbots. I mean, if you open Snapchat, one of the first contacts, the first "person" you can contact, is not a person. It's your AI bot, "My AI," before any of your friends are listed. And so it's being targeted, um, to minors, and they know what they're doing. And so being able to put that kind of prevention in front of kids to help them understand the difference between a human and a bot, even in kindergarten, second grade, um, it's important content to inform them about.
Julianna Arnold: (26:42)
Yeah, definitely. Especially because these companies have pushed to place their products in these schools without even the, um, approval of parents. So your kid's being exposed to this in school. They're in elementary school, and all of a sudden they come home with their, you know, school-issued laptop or Chromebook or whatever it is. And I've heard parents say, all of a sudden I heard this: oh, mom, the AI chat wants to ask me how I'm doing on my homework. "I see you're doing your homework. Do you want help in solving the problem, or do you just wanna know how to solve it?" Like, I mean, you know, and that's like a basic thing, but they're getting normalized. And once they're normalized, that's when kids start to think they're safe.
Julianna Arnold: (27:29)
And that's the problem. I do advocate with, um, a group of parents whose kids have either been harmed or died due to their experiences with these AI chatbots. And, um, the conversations that we're seeing, the transcripts of the conversations that we're seeing, the sexualization of some of it, um, a lot of, like, helping with suicidal ideation. If the kid even mentioned something about suicide, they're, like, providing them with, well, this is how you can do it, and do this, and, like, the steps, you know, and really pushing them, rather than thinking, hey, whoa, you know. And they walk down this path, and they pick up the language of the kids, so they learn how they talk. So it's like you really feel like you're talking with someone who's your peer.
Julianna Arnold: (28:24)
The vulnerability of these kids is that they're isolated already because of social media, right? So they're not having the interpersonal relationships that we may have had. And then on top of it, now they're exploiting that even further by saying, well, you know, kids don't have enough friends, so they need their chatbot friends. And these chatbots are not trained to be kid-appropriate. They're absorbing all of this data and information, which is totally age-inappropriate and damaging. And, um, like, no one's checking it. They've been released out there like they're safe, you know, 13 and over, you know, they're good or whatever.
Julianna Arnold: (29:15)
And we know the age thing is just a joke too, because they have no way of knowing whether a kid is being truthful about the age that they're entering. And so we have a lot of kids under 13, under eight, on all of these platforms, which is why it's so scary too. So, you know, the reality is, um, we need to immediately, like, um, figure out the situation and put guardrails on these companies so they can develop their products safely. It's not that we're totally anti-tech; it's just that the way they've been doing it is wrong.
Hillary Wilkinson: (29:52)
Yeah. You know, totally wrong. Agreed. Agreed. So,
Dawn Wible: (29:56)
Yeah, I was just gonna echo, we're not against innovation. It is so realistic for them to be able to put up these safeguards when they're trying to push to minors. We think about it: the internet age was the information age. That's how it started; then it was the attention economy, monetized. And Tristan Harris recently said, now it's the attachment age. And it's really where, like you said, Julianna, they're targeting them, getting them to be attached to bots. Mm-hmm. Versus, let's innovate, let's use this in a way to innovate. No one's against that. We are against them targeting minors.
Hillary Wilkinson: (30:37)
Yeah. Yeah.
Julianna Arnold: (30:38)
Like, it just seems like they're just going for the lowest common denominator, the easiest way that they can make money so they can continue. Like now, like, ChatGPT talking about having ads. Well, that's how you monetize, right? Like, back in the day, I was like, how are they gonna monetize this? When they came out with Facebook in 2008, you know, I think they were still figuring it out too. They were being valued very highly, but everyone was like, that valuation doesn't really mean anything. And then they're like, oh, now we have the answer. And so now they're just trying to replicate that with AI, which is super dangerous, because, um, it's at the expense of all of us, really. Yeah. You know? Yeah. Especially our kids.
Hillary Wilkinson: (31:14)
And historically, when you look back at any revolution, say, like the Industrial Revolution or, you know, any other revolution, you look at who were the people that paid the greatest price, and it was our most vulnerable population. Well, I mean, we all know what happened to kids in the industrial age, when they were shoving them inside, you know, machinery, because they were little and they had little hands, you know? And it's the same thing; the same analogy can be applied today.
Julianna Arnold: (31:44)
That's a good analogy, Hillary, you know? Yeah, it is. Yeah.
Julianna Arnold: (31:48)
It is. And it's shocking that at this point in time, where we are in the 21st century, you see not much has changed. Yeah.
Hillary Wilkinson: (31:57)
You know? Yeah. Yeah. Do you guys have any, um, conversation starters, just so we can leave families with an actionable tool? Because AI is so new, I think a lot of parents don't even know how to, like, what do they even say? Do you guys have conversation starters around that?
Dawn Wible: (32:26)
Definitely. Among the one-pagers, we have an AI one-pager, and there are specific conversation starters on that. But I think the easiest one to have, from a kindergartner all the way up to an adult, is for people to remember that these are tools and they're talking to a robot. That's huge, because everybody's getting so comfortable with OpenAI, with ChatGPT, with Gemini. Because they're designed that way, to connect, we're seeing the AI psychosis. And so having that reminder, keeping that in front of people, um, that you can use this as a tool (not minors, um, not young kids, but when you're talking to other people), you can use this as a tool, but it is not a human. And, um, we were having to tell elementary and middle schoolers that, about being able to interact with it on school devices: this is not your friend or your buddy. They're not safe. They're not designed, they're not ready for kids to be interacting with them at this point.
Hillary Wilkinson: (33:32)
And I just think, I mean, when you look at it from a developmentally appropriate practice standpoint, you're dealing with kids who are just years away from object permanence. I mean, it's just insane to me that we're allowing AI to enter any of these zones.
When we come back, I'm gonna ask Dawn and Juliana for their healthy screen habit.
—----------------------------
Ad Break : HSH Workbook
—------------------------------
I'm speaking with the leaders of the Online Harms Prevention Workgroup at Fairplay, Julianna Arnold and Dawn Wible. On every episode of the Healthy Screen Habits podcast, as you ladies know, I ask for a healthy screen habit, which is a tip or takeaway that listeners can put into practice in their own home. What's yours?
Dawn Wible: (35:35)
Yeah. So I always say, talking more about these issues, because we need to normalize these conversations. It's happening among kids' peers, it's happening in their schools. And if you can be that place where it's happening in your home and you're normalizing it, if we are able to talk about some of these things with them, then they will feel more supported, hopefully, to be able to come to you about the issues. And also, I think, developmentally appropriately, um, sharing the stories with them and letting 'em know that this happens. I have three boys, and we've talked extensively about sextortion. They're the ages that are targeted, that 14 through 17. And so we talk about it: hey, if you see a profile picture that looks like a really pretty girl, don't, you know, you can always come to us no matter what. Um, which some of my dear survivor friends have told me: continue to say that, no matter what, they can come to you. But like I said, we're up against power, and we're up against a lot of money and this huge industry. And so to be able to band together with your community and to raise these concerns, and talking about these issues, is just so important. Mm-hmm
Julianna Arnold: (37:00)
Yeah, I mean, I totally second that. I think having these conversations at age-appropriate times, but starting young. So when you don't want them to have a phone, they're kind of aware of, "Well, this is why Mom doesn't want us to have a phone," because when they're younger, they're more open to that; they haven't already formed their opinions. And the other thing I'd just tack on, that I see people having success with, is, when they're at school, finding like-minded individuals or families who may want to bond with you, so you have a group of people rather than being the lone individual whose kid is not getting access. And there's been some success that when those families do that, then it's kind of like, okay, the kids can be more comfortable that they don't have a device, and then maybe that will become the norm.
Julianna Arnold: (37:43)
You know what I mean? Especially as we fight to get devices out of schools, I think it'll become easier, because I think the problem is the schools have started to rely on them so heavily that if your kid did go to school without any kind of device, they were at a loss and weren't going to get the same access to education as other kids. So, I think open communication, starting at a young age. And I think really going to the extent of trying to explain, as best as you can, the whys and what's really happening, as age-appropriate, and not just putting down bans, because I think that's when kids just want to do the opposite. They're like, "Mom told me I can't do it.
Julianna Arnold: (38:30)
I wanna do it." You know? So those are the two things, I think: open lines of communication, and acceptance. Understanding that they're in a place where it's not that they want to do something wrong, but it's kind of being presented to them on a platter, all these things they could do that might not be super healthy. So it's not making them feel bad if they did do something, but letting them feel like, "Yeah, this isn't your fault. Mm-hmm. And it's gonna be okay, and we're gonna figure it out," you know?
Hillary Wilkinson: (40:00)
As always, you can find a complete transcript of this episode, as well as a link to that Online Harms Prevention action kit and the previous episode with Dawn about Talk More Tech Less, by visiting the show notes for this episode. Do this by going to healthyscreenhabits.org, clicking the podcast button, and finding this episode. Dawn, Julianna, thank you for all that you're doing to build that powerful cohort of voices that are continuing to support people through the hardest times and teach people who are coming up through it.
Julianna Arnold: (40:48)
Thank you, Hillary. Thank you so much for all that you do in getting the information out there. Yeah.
About the podcast host, Hillary Wilkinson
Hillary found the need to take a big look at technology when her children began asking for their own devices. Quickly overwhelmed, she found that the hard-and-fast rules in other areas of life became difficult to uphold in the digital world. As a teacher and a mom of two teens, Hillary believes the key to healthy screen habits lies in empowering our kids through education and awareness.
Parenting is hard. Technology can make it tricky. Hillary uses this podcast to help bring these areas together to help all families create healthy screen habits.