S8 Episode 2, Part 2: How to fix social media...Not kill it // Frances Haugen, AKA the Facebook Whistleblower

Sep 20, 2023

Hosted by Hillary Wilkinson

"We desperately need transparency."

- Frances Haugen

As the source behind the Wall Street Journal's Facebook Files, Frances Haugen exposed Facebook's awareness of, and complicity in, radicalization and political violence around the world, plus its disregard for users' mental health.


Her new book, The Power of One: How I Found the Strength to Tell the Truth and Why I Blew the Whistle on Facebook (available in our Amazon Marketplace), tells the backstory and more.


I had the opportunity to speak with Frances. She was incredibly generous with her time…so generous that I did something I never have before: create a 2-part episode.


Enjoy listening to Part 2 of this fabulous conversation as Frances Haugen talks about ways we can fix social media…not kill it.


Healthy Screen Habits Takeaway


Resources

For more info:

Beyond the Screen: website


Get Frances' book, The Power of One, here


Show Transcript

Hillary Wilkinson (00:00):

This week you'll be listening to part two of a conversation with Frances Haugen, the Facebook whistleblower. If you missed part one of this conversation, I strongly recommend you go back and check it out. This week we're tackling the question she and I ended with last episode: how do we fix social media? Welcome to the Healthy Screen Habits podcast. I'm Hillary Wilkinson. Whether you're starting your parenting journey with a newborn or looking to connect with your teen on technology, let's learn some new healthy screen habits together. I'm speaking with Frances Haugen, aka the Facebook whistleblower and founder of Beyond the Screen, a nonprofit driving systems-level change for social platforms by building an ecosystem of accountability. So let's talk about this accountability. In May, the Surgeon General issued an advisory which stated that social media presents a profound risk of harm for kids. So recognizing that the parent company Meta includes Facebook and Instagram, does Meta/Facebook know how to keep kids off of these platforms?


Frances Haugen (01:29):

Hmm. So that's one of these things that's quite frustrating: right now the line of defense between kids and these platforms is, can you successfully lie about your age? Mm-Hmm. <affirmative>. You know, can you come in there and say, yes, I'm 13? And just for context for your listeners - let's play a guessing game. What year did they start requiring new users to disclose their birthdays on Instagram? Think about this like every other piece of software: you sign up, what do you do? You put your birthday in. What year did Instagram start asking people to put their birthdays in?


Hillary Wilkinson (02:09):

I don't even know.


Frances Haugen (02:11):

2019. So like eight years after they were founded. Wow. Okay. What year did they go back and say, hey, we have a lot of users who have been on here for quite a long time, you know, who joined before 2019, but we've never asked them their age? When do you think they started requiring all users to disclose their age?


Hillary Wilkinson (02:30):

2019?


Frances Haugen (02:33):

2021! So it wasn't until they were asked for comment on the information in my disclosures that they started requiring people, even previous users, to disclose their ages. So there's this question of why would Instagram do that? Why would Instagram not even do the very bare minimum of phoning it in to try to keep kids off platforms? And that's because we're dealing with conflicts between business goals and societal wellbeing. Mm-Hmm. <affirmative>. Any company that is the first mover, that unilaterally says we're gonna actually keep kids off our products, is giving up on the next generation of social media users, because the norm will get established on whatever platform cheats. And I'll give you an example. You can come and say, well, Frances, what if they got more stringent about detecting kids? Let's just talk through some of the many ways that you can find kids on platforms like Instagram. Kids do things like they put in their profiles:


Frances Haugen (03:36):

“I'm a fourth grader at Jones Elementary.” Mm-Hmm. <affirmative>. Kids do things like, “you annoyed me on the playground, so I report your Instagram account and you lose it.” Kids do things like they show up geographically at an elementary school every day, and all their friends also show up at that elementary school, right, or other elementary schools. You know, they have no adult friends, or almost no adult friends. There's a lot of different ways you can find children on these platforms. The only reason why the children are on there is because the platforms are afraid to be the first movers. Hmm. And part of why this matters so much is we're looking at the current research, and it looks like the danger window for kids on social media is like 10 to 13 mm-hmm. <Affirmative>, like when they start to go through puberty.


Frances Haugen (04:21):

And that's because kids' brains begin to change. They start getting more social reward transmitters. So that means, like, more dopamine, more oxytocin. I'm assuming none of your listeners out there are 13-year-old girls, but, you know, no compliment the rest of us will ever get will be as sublime, or any criticism as painful, as what a 13-year-old girl can get in a hallway. And that's because literally our brains are wired to be less sensitive to that feedback. And so it's really interesting. It's a great example of where we need to start having conversations around liability. You know, we don't have to talk about liability about content, but we do need to talk about liability around how these systems are designed and how they perform.


Hillary Wilkinson (05:08):

In your book, it's also shared that Facebook employees considered it problematic that teen siblings were coaching their tween brothers and sisters against oversharing. So mm-hmm. <Affirmative>, I come from this potentially naive place of wanting to believe in humanity <laugh> and assume positive intent, and I have a hard time applying anything positive to this. So could you break it down for me: why was sibling coaching against oversharing considered problematic?


Frances Haugen (05:45):

So this is one of these interesting things where, you know, people can get siloed in what they view as success. And at Instagram, they knew there was a problem across Instagram of kids creating artificially refined versions of their own lives on social media. You know, it wasn't about being your authentic self and showing the good and showing the bad. It was about having, as we used to say at Pinterest, a Pinterest-perfect lifestyle. Yeah.


Hillary Wilkinson (06:18):

Highlight


Frances Haugen (06:19):

Reels. It's a highlight reel. Yeah. And what's interesting about that is that it's a feedback loop, right? When you sign on and you only see the highlight reel of other people's lives, you end up in a situation where that product is actually more dangerous for people in general. And that document was talking about that idea. It was a really interesting document that is now, I believe, available through the Harvard archive. So Harvard has started to make those documents available to people who wanna research them. And it was talking about how Facebook needed to start analyzing how Instagram worked within a family lens. Hmm. And one of the things that they found was that the people who were doing a lot of the work of onboarding younger children onto these platforms were their older siblings.


Frances Haugen (07:11):

And when Facebook delegated that onboarding experience, one of the things that came along with it was that older siblings would coach kids: you know, when you post things online, they last forever, so be really careful about what you post. Or, you know, think carefully about what you want to convey, because other people will see it and they'll judge you. And the document was talking about how this could potentially be a problem because it created barriers to future sharing, right? If you internalize “I need to be really careful about what I post,” you won't post as often. Right. Maybe you'll only post your highlight reel. And so it's one of these interesting things where I think they had positive intentions, but at the same time, we can't let Facebook grade its own homework. We have to have the public be involved for accountability reasons. Yeah.


Hillary Wilkinson (08:05):

So, sadly, I interpreted that exactly correctly. <Laugh> <laugh>.


Frances Haugen (08:13):

But,


Hillary Wilkinson (08:13):

But, but thank you for the explanation. I just go, “Oh man, that's really what I… I really wished I was misunderstanding that <laugh>”.


Frances Haugen (08:21):

Yeah, yeah. No, I get, yeah.


Hillary Wilkinson (08:24):

In your book, The Power of One, you task us with fixing Facebook, not killing it mm-hmm. <Affirmative>. And if I were to substitute social media for Facebook, recognizing mm-hmm. that Facebook is not alone mm-hmm. and that the healthiest path mm-hmm. will be the path forward we figure out as we go. Mm-Hmm. So do you have an idea on how we fix social media?


Frances Haugen (08:51):

Actually, before we discuss that question, I think we should dive a little bit deeper into why it is I say we have to fix these systems. We can't just mm-hmm. <Affirmative> say that the only thing to do if you wanna protect yourself is sign off. And I do think people should monitor how they interact with these tools, and the right thing for you may be to sign off; everyone has a different relationship with them. But I think what we lose when we tell individuals "you should sign off" is we ignore the fact that over the last 40, 50 years, we have slowly gutted our opportunities to socialize with each other in person. It used to be that people belonged to bowling leagues, right? People used to go to church, or more people used to go to church.


Frances Haugen (09:33):

People used to belong to things like Elks lodges, fraternal orders mm-hmm. <Affirmative>, garden clubs. And over time we have done things like defund community centers, senior centers, playgrounds. It used to be we had third spaces other than our home and work to socialize, and we don't anymore. And the people who face these problems most acutely are often people who are already economically marginal, or marginal in some other way, like a physical disability. And so when we come in there and are really absolutist, like, social media is bad, unadulterated bad, I worry we are having the conversation from the wrong angle, because a lot of people don't have alternatives to socialize today. Right.


Hillary Wilkinson (10:19):

And the alternatives cost money, you know. The country clubs, the tennis clubs… totally.


Frances Haugen (10:28):

Mm-Hmm. <Affirmative> mm-hmm. <Affirmative>, a hundred percent. And so it's this question, and I think that actually dovetails into the second thing I was gonna say, which is the other reason I say we have to fix Facebook: Facebook very intentionally bought itself something known as a network effect in African countries and in Southeast Asia. You know, they came in and said, if everyone uses Facebook as the internet, it's gonna be very difficult for people to leave. Hmm. And that's still true, right? The reason why Facebook is the internet for billions of people is because that legacy is gonna take a long, long time to unwind. And so if privileged people in the United States sign off Facebook and say, you know, I'm gonna walk with my feet, but I'm not gonna go and push to reform these systems, we're gonna leave behind people in some of the most fragile corners of the world who have the least ability to put pressure on these systems to change.


Hillary Wilkinson (11:22):

Hmm. Interesting. So do you have a recommendation on where we go, how…


Frances Haugen (11:29):

To make these things faster? Yes.


Hillary Wilkinson (11:30):

Yes. Sure.


Frances Haugen (11:31):

So, I like to think of conversations about how we move forward as falling into two categories. One is we can get on what I call the magic bullet wishlist train. We can enumerate five things that we should fix about social media. We can enumerate 20 or 30 things that are just in the documents, right? Harvard just opened up an archive of a large fraction of the Facebook files mm-hmm. <Affirmative>, and you can go read about solutions that Facebook researchers suggested, right? We have a lot of individual solutions. But the only way any of those solutions will get done or get used is if we change the incentives that these systems operate under.


Frances Haugen (12:20):

So we talked about earlier how right now we're kind of in the situation cars were in in 1965. You know, people had known that seat belts could be saving lives for 20 years mm-hmm. <Affirmative>, and yet they were still elective amenities in cars. And the way we change is by saying, “Hey, you don't get to hold all the cards anymore. If I ask you a question, you have to give me a real answer. You have to let academics study your platforms in authentic ways. You can't sue academics that make you look bad anymore.” Yeah. And that's really happening; Facebook has sued researchers who caught Facebook with egg on its face before. Hmm. Right? And so that's the way we're gonna make these systems safer. But if you want some magic bullets, I'll give you a really simple one that will help on a lot of kid problems, which is: sleep deprivation is really, really serious.


Frances Haugen (13:19):

Mm-Hmm. <affirmative>. Across a bunch of the biggest harms for kids with social media, a thing that comes up over and over again is that it's very easy to get compulsive about your interactions with these products. It's very easy to stay up really late. And, you know, the Surgeon General just two weeks ago said 1 in 3 kids says they use screens, and I'm guessing a lot of that's social media screens, until midnight or later mm-hmm. <affirmative>, most weeknights. Mm-Hmm. <affirmative>. And when it comes to things like Instagram, we have known for 20 years that if you make an application a tiny bit slower, we're talking five milliseconds, 10 milliseconds, 20 milliseconds slower, these are tiny slivers of time, people use those applications less. Oh, wow. Like when I worked at Google, we would obsess over this. You know: yes, this feature you brought in is cool, but it makes Google five milliseconds slower.


Frances Haugen (14:19):

No, you can't have it. Right. That level of obsession. Imagine a world where, instead of popping up a little thing at 10:00 PM saying, “Hey, it's your bedtime” mm-hmm. <Affirmative>, which I have on, like, YouTube, and I don't know about your listeners, but I always snooze it, you know, it doesn't work for me. But imagine instead that any of these apps asked us at noon, “When do you wanna go to bed tonight?” You have willpower right now; what time do you wanna go to bed tonight? And this kid who's hungover in math class 'cause they stayed up till three o'clock last night is like, 11. My mom wants me to go to bed at 10; I wanna go to bed at 11. And imagine for two hours before 11, Instagram got a little bit slower, and a little bit slower, and a little bit slower. Or maybe on TikTok there's a slightly longer gap between videos. Like, you go to play another one and it just takes longer to buffer. Imagine it slowly creeps in for a couple hours before bedtime. Around bedtime you would get tired and go to sleep. Yep.


Frances Haugen (15:24):

Right. It seems like such a simple thing. And it is simple; it is live on Instagram today. If you steal content from Instagram, they don't take your account down, they just slow your account down so you can post less over time. And so it's one of these things where, if Facebook actually cared about the fact that sleep-deprived kids are at higher risk for a host of mental health issues (we're not just talking about depression and anxiety, we're talking about bipolar disorder, schizophrenia), that sleep deprivation has huge academic implications which live with that kid for life, that it makes 'em more likely to use substances, uppers 'cause they're tired, downers 'cause they're depressed, or accidents (not just car accidents, all-cause accidental death), if they cared about this, kids could be going to bed when they want to go to bed two weeks from now. Yeah. But if they do that, they will lose this generation to TikTok. And so they don't do it.


Hillary Wilkinson (16:25):

So it's like you were talking about: the first platform to do it loses. Yeah. Yeah.


Frances Haugen (16:32):

And so I'll give you an example of how I think we get to a world where we care about kids' sleep deprivation. Let's imagine we went so basic as to say: hey Facebook, every week you have to publish a number saying how many kids were online after 10, 11, midnight, 1, 2, 3, 4:00 AM mm-hmm. <Affirmative>. You have to publish it every week. One little number. What's the summary: on average, on any given night that week, how many kids were awake at what times? I guarantee you podcasters would talk about it, and advertisers would boycott, and investors would divest. Right? Any information at all allows the gears of the ecosystem of accountability to start turning. That's how change happens.


Hillary Wilkinson (17:21):

So transparency, right? Transparency. Yeah. We need transparency.


Frances Haugen (17:24):

Yeah. We desperately need transparency.


Hillary Wilkinson (17:26):

Yeah. Yeah.


Frances Haugen (17:28):

And just as a little thing for your listeners out there: there are laws pending in Congress. Yes. Right? Even really basic laws. Yes. Like the Platform Accountability and Transparency Act. People think accountability, excuse me, transparency, isn't sexy. Let's get even basic transparency, because we really can change things in dramatic ways with even shreds of data.


Hillary Wilkinson (17:55):

When we come back, I'm going to ask Frances Haugen for her healthy screen habit, which honestly kind of feels like asking Julia Child or Joanna Gaines for a Rice Krispies treat recipe. <Laugh>. But I'm still gonna do it. <Laugh>


Ad:

Did you know that Healthy Screen Habits does workshops? For 2 days we can come and guide students through an exploration of their own tech use and facilitate a research-based presentation bespoke to their own questions. 87.5% of the teens who participated in this workshop recorded a change in their tech use following participation. For more information, contact us at info@healthyscreenhabits.org.


Hillary Wilkinson:

I'm talking with Frances Haugen, the Facebook whistleblower and author of The Power of One: How I Found the Strength to Tell the Truth and Why I Blew the Whistle on Facebook. Truly a person who embodies the concept of being a Healthy Screen Habits Hero. So Frances, on every episode of the Healthy Screen Habits podcast, I ask each guest for a healthy screen habit, which is a tip or takeaway that our listeners can put into immediate practice in their own home. Do you have one?


Frances Haugen (19:19):

So one of the things I'm always trying to raise awareness around is the idea that, you know, it's actually pretty hard for kids to start making individual choices to spend more time in person with their friends unless their friends make those same choices at the same time. Right. It's not enough for one kid to say, I wanna have an in-person lifestyle. But the crazy thing is that that is a possibility, right? If you were to go to an elite private school in the Bay Area, in Silicon Valley, and talk to a random 17-year-old, there's a good chance that they would, you know, look at you and kind of scoff, you know, be like, we're an in-person community <laugh>. Those are the places where the parents see the inside data.


Frances Haugen (20:09):

Mm-Hmm. <Affirmative>. Those are some of the only places where parents actually have transparency, and they go to pretty big lengths to help cultivate communities for their children, where their kids get to make authentic choices around socializing. And so my healthy screen tip is, you know, be aware of what the digital culture is like for your children's community. Mm. And be aware that that's an intentional choice mm-hmm. <Affirmative>, that you can be having conversations as that local community around: do you want to find more opportunities for kids to get to socialize in person?


Hillary Wilkinson (20:45):

Right. And kind of building your tribe around your mm-hmm. <Affirmative> your people. Yeah. Mm-hmm. <Affirmative>. So as always, you can find a complete transcript of this show, a link to our Amazon Marketplace where you can pick up your own copy of The Power of One, and a link to beyondthescreen.org, where you can find Frances Haugen's ongoing commitment to build the ecosystem needed to create social platforms for the common good. Do all of this by going to healthyscreenhabits.org: click the podcast button and scroll down to find this episode. So, Frances, there are times that even as a podcaster, I feel really limited by language. Mm-Hmm.


Frances Haugen (21:29):

<Laugh>,


Hillary Wilkinson (21:30):

And I need words beyond just the ones that I have. To say thank you cannot even…


Frances Haugen (21:37):

Oh, thank you.


Hillary Wilkinson (21:38):

It cannot convey even a portion of the gratitude that the world is handing you. Honestly, the planet is indebted to you, and I'm limited by language, but I'll just say thank you so very much, and I've really enjoyed talking to you.


Frances Haugen (22:00):

<Laugh>. You’re welcome. Yeah.


Hillary Wilkinson (22:01):

Okay.


Frances Haugen (22:02):

Well, thank you for helping me connect with your listeners. The thing that I always try to leave people with is: every single time we've invented a communication technology, it's been incredibly disruptive. Like, incredibly disruptive. And so it can feel overwhelming in this moment. But I want everyone to remember that every single time before, whether it was printing presses, newspapers, radio, even the telegraph, it was disruptive. We've figured it out, and we figure out how to live with it and make it a positive force for good. So thank you so much.

Hillary Wilkinson (22:34):

That's awesome.


Hillary Wilkinson (22:38):

For more information, you can find us on Instagram and Facebook at Healthy Screen Habits. Make sure to visit our website healthyscreenhabits.org, where you can subscribe to the show on Apple Podcasts or via RSS so you'll never miss an episode. It's free, it's fun, and you get a healthy new screen habit each week. While you're at it, if you found value in this show, we'd appreciate you giving us a quick rating. It really does help other people find us and spread the word of healthy screen habits. Or if you'd simply like to tell a friend, we'd love that too. I so appreciate you spending your time with me this week, and I look forward to learning more healthy habits together.




About the podcast host, Hillary Wilkinson


Hillary found the need to take a big look at technology when her children began asking for their own devices. Quickly overwhelmed, she found that the hard-and-fast rules in other areas of life became difficult to uphold in the digital world. As a teacher and a mom of 2 teens, Hillary believes the key to healthy screen habits lies in empowering our kids through education and awareness.


Parenting is hard. Technology can make it tricky. Hillary uses this podcast to help bring these areas together to help all families create healthy screen habits.

