S15 Episode 7: Online Harms and Prevention Group at Fairplay // Julianna Arnold
Hosted by Hillary Wilkinson
"These companies knew the harm that their platforms were causing, but intentionally designed them to be addictive so they could increase their viewership among young users."
~Julianna Arnold
Companies are not allowed to knowingly produce harmful products without posting consumer warnings.
As internal documents and emails from social media executives get exposed, it becomes clear that these companies know exactly how dangerous their products are for young or vulnerable brains. Today's guest, Julianna Arnold with Fairplay for Kids, tells us about a current court case in Los Angeles and how survivor parents are continuing to fight for all children.
Resources
Our episode with Frances Haugen, Facebook whistleblower and certified Healthy Screen Habits hero: HSH Podcast Season 8 Episode 2
Show Transcript
Hillary Wilkinson: (00:00)
Today I am doing something a little bit different. If you're familiar with Healthy Screen Habits, you know that we are located in Southern California, and when something this important is happening on our doorstep, it's difficult not to highlight it. So I would like to do a brief catch-up on what has been going on in Los Angeles at the landmark trial over whether tech companies like Instagram and YouTube can be held liable for allegedly promoting a harmful product and addicting users to their platforms. Many of the families who have paid the greatest price are the ones you see with signs, supporting demonstrators and providing faces to the deadly consequences of these platforms. And a lot of them are voices that you've heard here. So today I'm joined by Julianna Arnold, a parent who is marching on the front line. And Julianna, can you tell us about your connection to this issue, especially as Coco's mom?
Julianna Arnold: (01:26)
Sure. Um, well, obviously none of us really, you know, thought we'd be in this position, uh, ever, or that we would become advocates or activists or whatever you wanna call us, but we are. And so, um, I got involved in this, obviously, because I lost my daughter Coco, um, in 2022. It was just a couple weeks after her 17th birthday when she was, um, approached, uh, unsolicited by a man on Instagram who befriended her, pretended to be like a big brother friend, and got her trust. He groomed her, lured her to meet him, said he had a, a Percocet, something that was good for anxiety, and she made a bad decision. She left the house that morning at like 11 o'clock, saying she was going thrifting with her friend, and she never came home. Um, whatever she was given, uh, the pill was not Percocet, it was fentanyl.
Julianna Arnold: (02:25)
And, and she died from fentanyl poisoning. So that brought me to try to figure out what I wanted to do. You know, like my whole world was completely, like, blown up, and I really didn't know what would have meaning for me after that. All I knew, though, is I was still angry about so many things, but the thing that always had, like, really angered me was watching what was going on online, both with the way that it became so addictive to my daughter and, I thought, for like certain types of kids that are vulnerable, whether that means they have, you know, something like ADD or a learning disorder or something. Not that it doesn't affect, you know, other kids, 'cause they're all vulnerable, but it really hooks these kids because it's like they found their place where they can get that constant, you know, dopamine rush, and everything's new and everything's fun, and it's just very easy for them to kind of fall into that trap.
Julianna Arnold: (03:29)
And that's what we saw. And um, also, I, I saw what was going on, especially on Snapchat with, you know, um, a lot of the drug stuff 'cause Coco had tried, um, marijuana. And as soon as that happened, I saw a flood of dealers come in with ads and videos, and I was constantly reporting them. And I basically cut down her Snapchat time to like two minutes to say hello to friends. And that was it, so I was monitoring things, but as we all know, I mean, it's like whack-a-mole. You can't keep track. Maybe they had more than one account. You know, they're better at this than we are. And, um, I did the best I could, but it was so frustrating, and I kept on thinking like, if I'm feeling this way, you know, um, then what are other families feeling that maybe don't have access to the resources?
Julianna Arnold: (04:23)
I mean, like, maybe education or the language, or, you know, not just financial, you know? And it really concerned me. And then I met a fellow survivor mom, Deb Schmill. We had a mutual friend, and Deb's like, Hey, come on, like, join us. It was, I think, late 2022. And I, um, got involved with the Online Harms Prevention Group and then started learning about this. And before I knew it, uh, we were taking a trip to DC to advocate for the Kids Online Safety Act. And from there it just felt natural, like it just kind of flowed, and I knew I was in the right place doing the right thing, even if I didn't want to be in this group. Right. Um, I knew that if I was gonna advocate and wanted to do something meaningful, um, this is what it was going to be.
Hillary Wilkinson: (05:17)
Yeah. Yeah. And it was right in your backyard. Why? I mean, you mentioned DC, which is of course the, the place where you think to go for federal legislation. Why was downtown LA chosen as the location for this case?
Julianna Arnold: (06:18)
I mean, that's a good question. Well, California does have, you know, a reputation in their superior courts for being more open to a lot of these issues. But basically, the cases that are involved in this one are from California. And, um, it's hard; the JCCP versus the MDL is different. They're not a class action, they're like a group lawsuit where you have one plaintiff. And so we're not the plaintiff in this case. Mm-hmm. Um, there's, you know, one plaintiff, and then there's the rest of the group, which is, like, almost a couple thousand cases. But this one has been chosen as the bellwether case. So that bellwether case will get decided alone, totally separate from us, but it would then inform what's gonna happen with the rest of the cases that are in that JCCP group.
Hillary Wilkinson: (07:26)
Okay. So that would act as the precedent case.
Julianna Arnold: (07:31)
So from what I heard, I mean, I'm not an attorney, obviously, and I probably can't explain it as clearly as other people. But my understanding is that the MDL is a national case. Mm. Um, and the JCCP is state by state, so it's a state case. Okay. But of the first two bellwether cases that are gonna be heard this, you know, spring and summer, one is a JCCP case that's here down in Los Angeles.
Hillary Wilkinson: (08:02)
What does JCCP stand for? Do you know?
Julianna Arnold: (08:04)
I should, I was like, yeah, I, I, let me look it up. I'm so bad I should know this. Oh,
Hillary Wilkinson: (08:08)
No. Like
Julianna Arnold: (08:08)
Ridiculous that I don't know this. Um, I do know it, but I can't remember it. Oh, that's a whole other
Hillary Wilkinson: (08:14)
Thing. No, that's fine. Clearly I don't know it, so that's why I'm having to ask.
Julianna Arnold: (08:18)
Okay. So JCCP is a Judicial Council Coordinated Proceeding. It's a legal mechanism that consolidates multiple civil cases involving common questions of law or fact into one superior court for efficiency. It's used for complex litigation like mass torts, wildfires, or class actions.
Hillary Wilkinson: (08:39)
Okay. And it's my understanding, correct me if I'm wrong, but it's my understanding that the judge in LA, in the previous case, kind of put a pin in it and said, this is not a Section 230 issue, this is a product safety issue. Exactly. Right. Am I understanding that right?
Julianna Arnold: (09:13)
No, no, no, it's like, I mean, that's the approach that they've taken, because most of the cases that have even tried to move forward immediately get, you know, um, thrown out, don't even make it there, because of 230. The, um, social media companies, uh, will plead, you know, 230, so therefore they don't have any liability for any content, as you know, on that platform. So, yeah, sorry, you can't sue us. And with that case, I know that judge was like, Hmm, I'm not really sure that that's really the case, you know, and that's Neville vs. Snapchat. And then this judge, Carolyn Kuhl, she also was very open to wanting to hear this case. So at every moment, you know, the tech companies were trying to find a way out of this, right.
Julianna Arnold: (10:11)
Trying to, like, play every card they had. And she just repeatedly said no. Like, I mean, this is a product liability case, I'm gonna hear it as a product liability case. Yeah. This is not a content case. This is about the way that they designed their platforms, and that's how it's been, you know, carried out in court. Because anything that comes up about content, you know, they're like, okay, that's 230, we're not talking about content. Right. So it's really not focused on content, which of course is difficult to, like, you know, maneuver, but that's what they've been doing. And it's solely about the design of the platforms, and that these companies, um, knew the harm that their platforms were causing, but intentionally designed them to be addictive so they could increase their viewership among young users and then increase their profitability, because they've monetized their platforms by the number of eyeballs they have.
Julianna Arnold: (11:07)
So they can sell that to advertisers, and hence that's where the profit's coming from. Right. So, um, that's kind of been the difference. And I guess maybe in California they've found judges that are more open to hearing these cases. Mm-hmm. Um, so this case and the Neville case are now in discovery as well. And those are some of the first ones that have been able to go to discovery. And that's what's so meaningful, because when they go to discovery, um, these companies will try to fight back and not produce documentation, like internal documents. Mm-hmm. But they've been forced to, because she's been very much like, no, we've gotta see what we've gotta see. Like, you've gotta act; you don't get any special preference. And so she's been really, um, amazing in just wanting to hear this case in, um, an unbiased way, but also not taking 230 at face value, where the plaintiff has no case whatsoever.
Julianna Arnold: (12:12)
Right? Right. So, um, now all this information's coming out, which is crazy, because it's like a lot of the anecdotal information that survivor parents, and other parents whose kids have been harmed on the platforms, have been talking about. But it's really providing the internal documentation that is starting to, um, show that this in fact was an intentional design feature, and that it went up to the highest level. And, um, you know, there's just never been anything to stop them, because they've had full immunity behind 230. So, right, it's a unique approach to take on, a product liability versus just a, you know, content approach.
Hillary Wilkinson: (13:02)
Right. So just to bring everybody up to speed, in case you're not familiar with what Section 230 is: um, for years, like Julianna has said, social media companies disputed allegations that they harm children's mental health through these deliberate design choices to addict kids to the platforms, and that they failed to protect them from predators and dangerous content, despite the reports from inside whistleblowers, i.e. Frances Haugen and many others who have come forward since then. Um, you can go back and listen to Frances' interview with me in an earlier season. And families who, quite frankly, have lived it and know the difference. And social media platforms continue to hide behind this Section 230 of the 1996 Communications Decency Act, which protects tech companies from liability for material posted on their platforms. It's also referred to as, like, the Blackboard Rule, because it treats online platforms like a public chalkboard or a bulletin board rather than, say, a newspaper editor. It establishes that these platforms are just intermediaries. They're not publishers, so they're not legally liable for content written by third parties. So, like, if there's a blackboard in a coffee shop and somebody writes something illegal on it, the owner isn't liable for that message. And that's what Big Tech has been hiding behind. And that's why the twist, in it being product, not, um, content, is so key.
Julianna Arnold: (15:06)
Exactly.
Hillary Wilkinson: (15:08)
Julianna, were there any aha moments or turning points for you? I know Mark Zuckerberg came up and spoke, and my heart is heavy for you. Like, were there any things?
Julianna Arnold: (15:26)
I mean, I was sitting there, so the first week they didn't have a lottery system; they had first come, first served. And so that Monday when, you know, we were together for, um, the opening statements, it was first come, first served, but it wasn't as crowded. So we were able to get in. We got there very early, in like the very early morning, I'd say like four o'clock in the morning, and we were able to get in line. It was very calm, and we were able to get in. And then, um, we were not there on Tuesday. And the word on the street was, whether it was true or not, that the tech companies had gotten placeholders, like people to come and stand in line so other people would not get the seats. 'Cause there's only 15 seats; at the beginning it was 12, and now it's, like, extended to 15. So it's a very limited number of seats.
Hillary Wilkinson: (16:21)
I had no idea.
Julianna Arnold: (16:22)
So we had survivor parents in town for a couple of weeks. Um, they kind of shifted in and out, but yeah, we had survivor parents. So we, um, decided, 'cause we had heard that the court had started talking about going to a lottery, you know, system, which would mean it's like luck of the draw, like it doesn't make a difference how early you get there. So when we knew that Adam Mosseri was testifying, this was two weeks ago Wednesday, um, we decided, like, the night before, like, okay, well we really, really, really felt strongly that the first thing he saw was a whole row of survivor parents just present. Mm-hmm. And, um, in order to do that, it turned out that we, uh, got hotels downtown. We were planning on going there maybe at two, but when we drove by, we saw a couple people there.
Julianna Arnold: (17:18)
So we're like, oh, we're going now. So basically, um, it was 11:11, Lori Schott ran up, and we were like, woo, we got it, like, we're here. And thank goodness we did; I don't think we would've gotten in otherwise. But that next morning, when, um, Adam Mosseri testified, um, we were the front row, and he was looking at, you know, 12 survivor parents. Um, and I think he knew exactly who he was looking at. Like, it felt palpable that he understood. So we wanted to send that message that we're not gonna be so easily, you know, distracted, I should say. And so that was fantastic. Even though it was hard and we were tired, it really kind of gave us a sense of, like, power, I think, that we can fight back and we're not gonna back down.
Julianna Arnold: (18:13)
Um, which was, which was good. Now, his testimony didn't really give us much to go by. I mean, they've all been very coached and trained to answer these questions in a way where they're not perjuring themselves, but they're not really answering the questions. Mm-hmm. And we're finding that with every executive that's been, you know, on the stand, that they all have a way of, like, not lying, quote unquote, right, but also not telling the truth. And so these documents are put in front of them, and quite often they were compared to maybe a testimony that they gave at a congressional hearing, mm-hmm, especially with Mosseri and Zuckerberg. Right. Because with Zuckerberg they referenced a lot, um, the January 2024, um, hearing in which, um, he was forced to apologize to the survivor parents. And we actually all were there with our, you know, photos of our kids.
Julianna Arnold: (19:15)
And, um, they just managed to skirt the issues. I mean, it was obvious, in my opinion, you know, what they were doing. It seemed very, like, okay, this is what you guys always do. After they get busted for something, or something comes out that's negative, the next day there's always some big press release or some new safety product, or Adam Mosseri on, you know, the Today Show talking about, you know, Instagram teen accounts. So it was more of the same, which, um, I don't know if I expected anything differently. But what is out there now are these documents. Mm. They can't refute that they exist. So they're still there, and they're gonna keep coming; there's, like, thousands more that are being released just this week, and with every trial. 'Cause there'll be, I think, eight, um, bellwether trials in the next two years. Mm-hmm.
Julianna Arnold: (20:11)
Um, more and more documentation's gonna get released. And so that's kind of our hope: that as parents and legislators start to see more and more of what the inner workings of these companies really were, they're not gonna be able to, A, take money from these companies, or B, work with them in, you know, a compatible way. But also they're just gonna have to, you know, put their foot down and do their jobs, which is, you know, put guardrails on these companies so we can keep our kids safe, and force them to design products that, you know, are not deadly or harmful to our kids.
Hillary Wilkinson: (20:57)
Thank you for doing the hard work of showing up day after day.
Julianna Arnold: (21:14)
Oh, no, it's actually quite fascinating. I was so totally hooked. I mean, talk about addicting, I was, like, totally hooked. Oh my God. Because also, the litigator for the plaintiff, um, he was trained as a pastor and he's from Texas. He, um, just presents everything in a way where he is telling a story, and he very much understands that different people learn in different ways. So for some it's auditory, for some it's visual, you know, for some it's tactile. So he uses all these different styles to get the message across, and really has an amazing way to make something that's probably pretty complicated pretty simple.
Hillary Wilkinson: (22:27)
Oh, that's a gift. Right.
Julianna Arnold: (22:29)
Oh my God. So like, he's
Hillary Wilkinson: (22:31)
Taking the complex and making it understandable. That is,
Julianna Arnold: (22:34)
And he does it with, like, great delivery too. Mm-hmm. Like, you just wanna listen to him. So I think it's been fascinating to see him at work, you know, taking these difficult issues. You really gotta think about how you're gonna approach this, because it's like a fine line to not go into content. Absolutely. You know, you have to keep it really narrow. And even today something came up about, um, bullying and stuff, and that's really content, you know what I mean? So you gotta steer away from that, even though we know we have a huge issue, especially with the illegal content that's on these platforms, which they're not doing anything about.
About the podcast host, Hillary Wilkinson
Hillary found the need to take a big look at technology when her children began asking for their own devices. Quickly overwhelmed, she found that the hard and fast rules in other areas of life became difficult to uphold in the digital world. As a teacher and a mom of 2 teens, Hillary believes the key to healthy screen habits lies in empowering our kids through education and awareness.
Parenting is hard. Technology can make it tricky. Hillary uses this podcast to help bring these areas together to help all families create healthy screen habits.