S8 Episode 2, Part 1: Healthy Screen Habits Hero // Frances Haugen, AKA the Facebook Whistleblower

Sep 13, 2023

Hosted by Hillary Wilkinson

"One of the most painful things you can experience is having someone get taken away from you by the algorithm."

~Frances Haugen

The Facebook Whistleblower, Frances Haugen, deserves a seat in the High Court of Heroes in the “Do The Right Thing” Justice League (I just made that up, but it should absolutely be a thing). A data scientist, Harvard graduate, expert in algorithmic product management, and an Iowan daughter of two professors, Frances truly moved the needle for humanity when she recognized wrongdoing and stood up for the world.


The source behind the Wall Street Journal's Facebook Files, Frances exposed Facebook's awareness of and complicity in radicalization and political violence around the world, as well as its disregard for users' mental health.


Her new book, The Power of One: How I Found the Strength to Tell the Truth and Why I Blew the Whistle on Facebook (available in our Amazon Marketplace), tells the backstory and more.


When I had the opportunity to speak with Frances, she was incredibly generous with her time…so generous that I am going to do something I never have before: break this episode into two parts.



Healthy Screen Habits Takeaway

See part two of this episode next week for Frances Haugen's Healthy Screen Habits takeaway!


Resources

For more info:

Beyond the Screen: website


The Power of One: Amazon book link


Show Transcript

Hillary Wilkinson:

This season our theme is Healthy Screen Habits Heroes, and as we all know, heroes can come in many forms. The hero I am featuring this week is truly someone who moved the needle for us all. As the Facebook Whistleblower, Frances Haugen deserves a seat in the High Court of the “Do The Right Thing” Justice League. When I had the opportunity to speak with Frances, she was incredibly generous with her time…so generous that I am going to do something I never have before: break this episode into two parts. You're going to want to listen to both episodes.


Hillary Wilkinson:

Okay. In soccer, there's the World Cup. In entertainment, there's the Academy Awards. In digital wellness, there was one event that had people canceling meetings and postponing appointments. In 2021, we were all glued to our devices, radios, TVs, anything. And today's guest is the reason why. On October 5th, the Facebook whistleblower testified in front of the US Senate Commerce Committee for three and a half hours. We watched on C-SPAN as this data scientist, a Harvard graduate, an expert in algorithmic product management, and an Iowan daughter of two professors stood up for the world. Two days before, her identity had been unmasked on 60 Minutes when she gave an interview revealing herself as the source behind tens of thousands of pages of internal documents and the source behind the Wall Street Journal's Facebook Files. Facebook's awareness of and complicity in radicalization and political violence around the world, as well as its disregard for users' mental health, were now public knowledge to us all. I am thrilled you are here. Welcome to Healthy Screen Habits: Frances Haugen!


Frances Haugen:

Thank you for inviting me. Happy to be here.


Hillary Wilkinson:

Frances, we have a lot of families who are followers, and I'm positive so many parents are wondering right now: What were you like as a kid? Like, what did you like to do? Mm-hmm. <affirmative>.


Frances Haugen:

Um, so I, sorry, I foolishly ate some peanuts, like, right before I started this interview <laugh>, and so I have a little tickle in my throat.


Frances Haugen:

So, I grew up in Iowa, and, um, I definitely, you know, it's fascinating for me to kind of see what childhoods are like today because, uh, you know, I grew up in a pretty, a pretty safe community. Um, my parents were open-minded, and it meant that I could wander around town. So, you know, I started having, what is now obvious in retrospect, um, some mild neurological deficits. So I started having a little bit of trouble walking when I was, um, seven. Um, and, you know, my parents, you know, bought me a bus pass and, like, let me wander around town by myself instead of going to day camp. And I'm really, really grateful both that I was, like, a level headed enough kid, and that my parents saw that, that, you know, I got to have a very, um, open and kind of expansive childhood.


Frances Haugen:

I really liked building things. So I, uh, used to do everything from, like, building, you know, doll houses out of foam and, like, felt and stuff, to, um, you know, I learned very basic circuitry in, you know, elementary school. Like, our science class, you know, learned how to light up light bulbs with batteries and wires and stuff like that. So I had fun building models. We had a very extensive garden, um, and I really liked gardening, 'cause my parents are both avid gardeners. Um, and so, yeah, I had a childhood that was not super online until maybe I was 12 or 13, 'cause, like, no one was really online. Mm-hmm. <affirmative>.


Hillary Wilkinson:

Yeah. So, um, kind of standing up for what you believed in was critical in this whole whistleblower path slash journey that you were very brave to take on <laugh>. Was there anything in your upbringing that crafted this strong stance of right and wrong?


Frances Haugen:

I think probably the foundational experiences that influenced the course that I eventually took were, um, I was really lucky to grow up in a school district that took high school debate really, really seriously. And so when I was in high school, um, I did high school debate for all four years. Um, I was the novice debate coach, uh, starting my junior year. Um, so I, like, I coached the junior high debate program, which I found out is still going today. Like, I ran the first junior high debate program when I was, um, I guess a junior. Yeah. When I was a junior. And, um, and so high school debate, for those who never did it… you learn both about issues around how the world operates, but you also, um, depending on the style of debate, you do also learn a lot about ethics and philosophy. And I went on to coach high school debate. And so it meant that, uh, I had to get really, really clear on, like, what did I believe? Because I needed to be able to teach it in a clear enough way that, you know, 14 year olds could understand. Um, and that's, like, for anyone who's ever parented a 14 year old, you know, that's another level of clarity.


Hillary Wilkinson:

Right. Right. This is so interesting 'cause I can kind of see this framework of backgrounds of like, um, systems understanding, like as a, as a kid, putting things together, building things, and then the debate kind of honing your, your critical thinking capabilities. I just, I dunno. I love, I love human development. So <laugh>,


Frances Haugen:

I think another one of the, like, subtle ways. So when I was, um, in my early twenties and I was at Google, like, I remember when I first was in search quality, um, at Google, there were very, very few women, and I would agonize over, like, why aren't there people, like, where are the other people who look like me? Right? Like, there'd be blonde men. Why are there no blonde women? Right? Um, and one of the realizations I had at some point was because I struggled. Like, it wasn't all the time, but, like, I had a long series of recurrent ankle injuries because, and I didn't know it at the time, and I talk about this in the book, I was severely malnourished from celiac disease, and, uh, you know, your body can't heal if you're not getting enough nutrients in. Um, you know, I spent a lot of time indoors playing with computers, you know, uh, instead of going out to recess, because, like, you know, I didn't wanna run around. And so it's these interesting things from my childhood where, like, you never think that there's gonna be a long tail of influence on these things. Um, but sometimes, sometimes our limitations become our strengths.


Hillary Wilkinson:

Oh my gosh. So you've written this amazing book, The Power of One: How I Found the Strength to Tell the Truth and Why I Blew the Whistle on Facebook. And it's this story of your professional journey through big tech, whereupon you ultimately end up at Facebook in 2019, working in its Civic Integrity department. And so how long, I'm just wondering, how long was it before your spidey senses were tingling there, where you're like, something's amiss?


Frances Haugen:

So, um, I'm a little bit of an anomaly for someone who has, like, a lot of emphasis on data science or, or algorithms, um, in that I went and got an MBA. And I remember before I got an MBA, you know, people said, like, this is a waste of time. Like, um, like people at Google. And a big part of why I went was I came to appreciate how much organizational design matters, right? That we spend so much of our lives at work, and the choices of how we design these spaces and, like, how we manage them are really, really, um, significant in terms of impacting people's quality of life. But that meant that when I showed up at Facebook and things were just kind of, like, uh, a little off, you know, like they were more chaotic than seemed reasonable, or, or, like, the way decisions were being made just seemed greatly, um, outside of bounds. You know, when people would say to me things like, oh, this is just the way Silicon Valley is, I'd be like, “No, no, no. I've worked at enough companies now.”


Hillary Wilkinson:

I was gonna say, because you worked at Pinterest, Google… you had, yeah.


Frances Haugen:

Uh, so I spent the vast majority of my tech career at Google. Um, but I ended up spending some time at both Yelp and at Pinterest before I ended up at Facebook. Um, and, uh, which is a great example of how, like, you know, you can run into hiccups in your life. Like, the only reason I ended up at those companies was 'cause I got really sick and, you know, my career kind of got deflected for a couple years. Um, but because I had been at these places, when people said to me things like, oh, no, no, no, this is just the way Silicon Valley is, like, I could look at it and interrogate that logic and say, like: One, why do they think that? And I think part of why some of the people who said that to me earnestly believed it was, um, a lot of 'em had never worked anywhere but Facebook. You know, they had come in straight out of college 5 years, 10 years before and just never left.


Frances Haugen:

Right. So they could say, like, oh, yeah, yeah, yeah. Like, everyone in Silicon Valley is as understaffed as we are, you know, that kind of thing. Everyone struggles with hiring. Um, when, like, I knew from looking at how these teams were run at other places, like, no. Like, other places who don't make anywhere near as much money as you do, who should struggle much more than you do with hiring, like, they're able to find people to do these jobs. So, like, what's going on? Mm-hmm. <affirmative>. Um, or just things like, you know, I tell the story towards the beginning of my time at Facebook of, you know, uh, Facebook appreciated that they were different enough internally that if you were going to do the job function I had, which is called product management, it was different enough at, um, Facebook than at other places that they wanted all of the product managers to basically go through a bootcamp for a few weeks.


Frances Haugen:

So, like, two weeks of bootcamp where, you know, they go in there and they give you a bunch of lectures on, like, here's how this is done at Facebook. Um, and my manager pulled me out of that after, like, maybe three or four days. He was like, you know, you have to write the plan for, like, what our team's gonna do for the next six months. And, like, take a moment and just, like, think about the logistics of this. Like, I don't know anything about… you know, I was supposed to work on misinformation in places that third party fact checking, uh, couldn't touch, right? So this is most of the world. Most of the world doesn't have third party fact checkers. Um, in times of crisis, even if there are third party fact checkers, they can't move fast enough, right?


Frances Haugen:

'Cause, like, uh, third party fact checking takes a few days to, like, research and those kinds of things. Um, you know, what do you do? Um, so this is a novel space for the company, and I, knowing nothing, was supposed to write the plan for the next six months after, like, you know, a week. Hmm. And so, just from the beginning, my spidey sense was like, something is very, very wrong here. Like, I don't know exactly what's wrong, but enough things are happening that are not being done in ways that are even approximately similar to how they work elsewhere. Something is definitely off.


Hillary Wilkinson:

Yeah. Yeah. So it was kind of intuitive, as well as, I mean, what you were seeing, but there was kind of that strong sense. Yeah. Was there, like, a turning point for you where, I mean, was it that point, or was there a turning point at which you were like, “okay.”


Frances Haugen:

So it's interesting. Like, I had a number of moments in the first six months where I was just, like, really shocked at the scope of, like, Facebook's influence. 'Cause, like, I had never thought about Facebook being the internet elsewhere in the world. Mm-hmm. <affirmative>, right? So for most listeners, they're probably not aware that Facebook went into some of the most vulnerable places in the world, you know, countries in Africa, countries in Southeast Asia, and said, “Hey, we noticed data costs $65 a gigabyte, a hundred dollars a gigabyte. You know, if you use our products, your data is free. But if you use anything else on the open web, you're gonna pay for it yourself.” And lo and behold, somehow they became the internet for, I would say, a majority of languages in the world. Right. You know, 80, 90% of all the content available on the internet, only available on Facebook.


Frances Haugen:

I knew that Facebook had done that. It's called zero rating. Um, but I hadn't really appreciated that, you know, Facebook had followed that up by, like, not adequately investing in safety systems in those same places. Like, they knew people didn't have any other options. So, like, you know, what are they gonna do? Right? Um, but I would say the first moment where I was like, this is even outside of those bounds, was, um, we were in the runup to the 2020 caucuses, like the Iowa caucuses. So this is, like, an event that I used to volunteer at, like, um, in high school, 'cause I wasn't old enough to vote. You know, we'd go and we'd help collect ballots and, like, you know, set up chairs and make sure people have water and whatever.


Frances Haugen:

Um, you know, the Iowa caucuses are a really foundational thing about our democracy, because for decades they've been the first place, right, where candidates for president actually get voted on. Right. Um, and they've been hugely influential in shaping who gets to be the candidate who we get to vote for in November. Mm-hmm. <affirmative>. So, um, in the runup to the caucuses, so 2019 into 2020, um, I was sitting in our review where we were gonna pick out the plan for the first half of 2020. And my boss, or my boss's boss to be accurate, asks me, “What am I gonna have ready for Iowa?” And I'm like, geographic misinformation, like geographically targeted misinformation, is, like, outside of our scope. Like, we were told by the researchers that it doesn't happen very often, um, that, you know, sociographically targeted… you know, this is, like, um, people's age and sex and gender and interests and all these other things.


Frances Haugen:

Like, that's much more common in terms of targeted misinformation than saying, you live in this state or you live in this zip code, we're gonna target you. Um, but one of the few exceptions to that could be Iowa, right? Because Iowa has enough sway in terms of altering the course of an election that potentially targeted misinformation might actually have an impact in Iowa. Right. And so I'm like, “Uh, so yeah, we're not planning on doing anything, but I will go find out. I'll go find out what we can do for Iowa.” And he's like, “Okay, come back to me.” So it's the 17th of December or something, and I go and talk to the war room people. So these are the people who are, like, writing the software that's gonna, like, run the war room where we're all gonna sit around and do what we can do to keep the Iowa caucuses safe.


Frances Haugen:

And I find out that they will not have the ability to look at individual states ready in time for the Iowa caucuses. And I was like, how did that happen? And it's just, like, they didn't have enough resources, it fell behind schedule, that kind of thing. But they're like, don't worry, we'll have it ready the week after. Like, we'll have it ready for New Hampshire, or, or worse, we'll have it, like, ready for Nevada. Right. So January rolls around and we get to have more meetings, 'cause people are, like, back from vacation, and I ask, and they tell me, we can't actually support all the states, so we're gonna have to choose to give some states support, and other states are not gonna get support, 'cause we'd have to pay too much money to store the data.


Frances Haugen:

We don't have budget for it, so we're gonna pick some states, and then, like, that's how we're gonna do it. And I was like, “Okay, cool. So, like, are you gonna pick, like, the early primaries and caucuses, and then after they pass, are you gonna move on to, like, the next primaries and caucuses?” And they're like, “Oh, no, no, no, we're just gonna do the swing states.” And for context for people, swing states are states that are, um, evenly divided politically. So, like, the reason why they might swing Republican or Democrat is because they have, you know, approximately the similar number of Democrats and Republicans. And the only problem was, in the 2020 primaries, the Republicans weren't voting. Like, they had already picked Donald, like, Donald Trump was the candidate. Mm-hmm. He was the sitting president, he was the candidate. There weren't really primaries going on. And I was like, wait, wait, you're only gonna do the swing states? So, like, you're gonna leave out California and New York, which make up, like, almost 50% of all the delegates for the Democrats? And the person in charge looked at me and said, “Why would that matter?” Like, she didn't understand, like, how primaries work, or, like, the role that they play in the democratic process. Like, it's one of these things where…


Hillary Wilkinson:

Yeah, no, it is. Yeah. It's so fascinating how all of your background, yeah, converged on this moment.


Frances Haugen:

Oh yeah, yeah.


Hillary Wilkinson:

For you to recognize this is a major problem. And, like, yes <laugh>. People, people who were so busy… it's kind of like if you go to a doctor, you know, and say you have shoulder pain. You go to a, mm-hmm. <affirmative>, you go to a neurologist, and they're gonna tell you you've got, you know, oh, like, a neurological problem. You go to a podiatrist, they're gonna give you an orthotic for shoulder pain. You go to, you know, a dentist, they're gonna, oh yeah, say, oh, you're clenching your teeth. But it's like every person had their specialty, but you were uniquely qualified, through your life experience as well as your education and everything else, to see this umbrella effect. That's fascinating.


Frances Haugen:

Yeah. And that's probably one of the reasons I can be, like, very empathetic about, like, how bad things got inside of Facebook. Um, which is that one of the things that we need the public to be aware of is we are training technologists with, like, God-like powers. Um, you know, every year what we can do with computers gets more magical. Mm-hmm. <affirmative>. And yet, you know, if you are an 18 year old who wants to work at Google after you graduate, every class you take that's not a CS, like a computer science, class is gonna make it less likely you're gonna get hired at Google. Right. So we're getting even more specialists, even though they're able to wield more and more power.


Hillary Wilkinson:

Interesting. Okay. Stick around. 'cause after the break we're gonna hear more about the inner workings of Facebook and what steps Frances Haugen believes we can take to right the wrongs of social media.

 

------ad break -----


Hillary Wilkinson:

There was an algorithmic change that happened at Facebook in 2019, mm-hmm. <affirmative>, and I recognize that this was for political reasons, like you had just talked about, and kind of calming, or, you know, trying to not incite outrage. But it would seem like kids kind of got caught in this crossfire, mm-hmm. <affirmative>. So, like, the same algorithms that promote outrage and extreme emotions on the political front don't turn off when a 14 year old is looking at a disordered eating account. So, mm, I guess I'm asking for, uh, um, help here <laugh>, because,


Frances Haugen:

So, I don't, I don't think they were made for political reasons. I think they were made for business reasons.


Hillary Wilkinson:

Okay. Okay. I misunderstood that.


Frances Haugen:

So, the change that gets talked about in the book is, um… So, just like we were talking about, Facebook thinks really acutely, like, intently, about how do they get more content to be created. Because, uh, Facebook is a two-sided marketplace. You know, like, you go to eBay and there's, like, buyers and there's sellers. Um, you can't have a buyer if you don't have someone selling. Right. Um, or either, either or. In the case of social media, without creators, you can't have consumers. And most of us think about these places as places of consumption, because most of what we do on them is we consume. Right. Um, what Facebook found was that over time, and this is a phenomenon that happens at lots of social networks, um, people were creating less and less over time.


Frances Haugen:

Okay. And part of what fuels that is, um, you know, some people get more intent on the platform. You know, like, they buy a drone and they buy, you know, a fancy camera, and they start, like, making really beautiful content. And when people see that content, they go, oh, um, you know, Instagram is not for me. Right? I don't have a drone. Like, I don't have a fancy camera. Um, but the secondary thing is, like, Facebook was like, how can we try to turn this around? So this was on facebook.com, it wasn't on Instagram. They're like, how can we turn this around? Like, how can we get people to just produce more content more often? And what they found was the only sustainable thing was getting people more social rewards. And so that means, um, more likes, more comments, more re-shares, like, making sure that that flow of social rewards, that engagement, didn't stop.


Frances Haugen:

And so in 2018, Facebook started saying, we'll give more distribution to content that gets more engagement. Oh, okay. And, uh, unfortunately there was a side effect, which was the content we are drawn to engage with the most, like, click on, you know, put a comment on, is extreme content. So, like, uh, the shortest path to a click is anger. Mm-hmm. <affirmative>. Um, and unfortunately, in the case of Instagram, those similar kinds of algorithms are in play. And I'll give you a really concrete example of what the consequences are. A journalist, uh, was interviewing me, and he had just had a new baby boy. So this is a happy, healthy baby boy, cute baby boy. That baby boy had an Instagram account, 'cause he's, like, a modern father. That account had maybe five friends, which are other babies that are family friends. The only photos posted to any of these five accounts are cute, healthy, happy babies.


Frances Haugen:

And yet, and yet, about 10% of the content that filled this new father's feed was suffering children. It's, like, kids who have been mangled in accidents, kids who are lying in hospital beds with, like, tubes coming out of them, horribly disfigured kids who look like they're in pain. And he's like, how did we go from healthy, happy babies to this? Like, I haven't clicked on any of this stuff, and it keeps coming back. It's, like, 10% of my feed. And I think what's happening there is, with these algorithms, you know, even dwelling can be considered active engagement. Mm-hmm. <affirmative>. You know, if you just have it open on your screen and you don't continue to scroll by, they're like, oh, you liked it. Mm-hmm. <affirmative>. And this is a great example of how the algorithms are agnostic. Like, they don't know the significance, the meaning, of what they're showing this person. You know, it's kind of like, they don't know that giving more disordered eating content to someone who's struggling with their own body dysmorphia is bad. Like, they don't know that. All they know is, if you like cute babies, you'll not be able to scroll past this suffering child. Mm-hmm. <affirmative>. If you are getting preoccupied with diet culture, you are going to, like, not scroll past this “thinspiration”.


Hillary Wilkinson:

Mm-hmm. <affirmative>. So on behalf of parents who find this kind of, like, dangerous content, mm-hmm, on their kids' accounts, is there a way to get around it? Mm-hmm. Like, how would you suggest… can the user hack the algorithm, I guess is what I'm asking?


Frances Haugen:

Um, I think it's really important, if you have kids, you know, you were talking about digital tips, mm-hmm. <affirmative>, I think it's really important to, um, and I actually think this is true even for adults, you know, it's very easy to have an algorithm drift over time. Um, so I'll give you an example. I got really sick in my late twenties and I had to relearn to walk. Right. You know, I was paralyzed beneath my knees. It's really painful. Took a long time. Very isolating. I must have watched a bunch of really depressing stuff on Netflix, because a couple years later I got enough healthier and, like, stronger that I was like, wow, Netflix is really depressing. You know, I tried to go find comedies, I tried to find other stuff, and, like, I was, like, fighting against the algorithm. Um, I think it's really important to sit down with your kids and say, like, I'll show you my home feed.


Frances Haugen:

I'll show you what I am looking at on Instagram. Like, this is what they're feeding me. Why don't we look at what your Instagram is feeding you? And make sure your kids understand… there's ways of indicating, like, you wanna see less of certain kinds of content. Mm. Mm-hmm. <affirmative>. And, um, I think having that idea, like, making your kids aware, hey, algorithms are just trying to fulfill their own needs. You know, they don't know the meaning of what they're doing. And, you know, we should have systems that are safe by default. That's what our kids actually deserve. But you do have the ability to shape these things, and always monitoring to see if you're being led to a more extreme place is a really important skill and a way we can care for each other.


Hillary Wilkinson:

Yeah. Yeah. And I think it's a really important part of, you know, there's this kind of argument between digital literacy and digital citizenship, and, like, what gets defined where. And so I think that component is, to me, as important as digital literacy, you know, for health and wellness. But, yeah. Anyways. Okay. <laugh>,


Frances Haugen:

Because I think it's one of these things where, like, you know, um, one of the most painful things you can experience is having someone get taken away from you by the algorithm. Mm-hmm. Right? Like, this is a thing that is not necessarily obvious to kids, but, like, watching someone go down the rabbit hole is hard. Mm-hmm. <affirmative>. And, like, I'm super curious about, what does care look like in the 21st century? Like, if you and I are, you know, if you and I are friends, what does it mean to care for each other in a world where there are digital algorithms that impact us? Like, it's not just, like… you know, it used to be, you could come to someone's house and you could see what magazines were on their table. Like, if they were starting to drift, you could see it. Right. And now people end up, you know, sometimes struggling in silence for long periods of time.


Hillary Wilkinson:

Yeah. Yeah. Very true. 


Hillary Wilkinson:

So earlier you talked about how Facebook kind of became the internet, mm-hmm. <affirmative>, for people globally, like, you know, developing nations. And in your book, The Power of One, you task us with fixing Facebook, not killing it, mm-hmm. <affirmative>. And I'm like, what if I were to substitute social media for Facebook, recognizing, mm-hmm, that Facebook is not alone, mm-hmm, and the healthiest path, mm-hmm, will be the path forward, figuring it out as we go, mm-hmm. <affirmative>. So do you have an idea on how, how do we fix social media?


Hillary Wilkinson: 

Next week, on part 2 of this episode, learn how Frances Haugen, the Facebook Whistleblower, believes we can tackle the issue of fixing social media and create the healthiest path forward. In the meantime, you can check out her book, The Power of One, as well as a complete transcript of this episode by clicking on the show notes. Do that by going to HealthyScreenHabits.org, click on the podcast button, and scroll to find this episode.

If you're interested in learning about more topics surrounding healthy screen habits, check out our Episodes By Topic, where you can find a library of previous podcast episodes listed by topic. Thank you so much for listening, and be sure to tune in next Wednesday to hear the rest of this important conversation with Frances Haugen, a true Healthy Screen Habits Hero!



About the podcast host, Hillary Wilkinson


Hillary found the need to take a big look at technology when her children began asking for their own devices. Quickly overwhelmed, she found that the hard and fast rules in other areas of life became difficult to uphold in the digital world. As a teacher and a mom of 2 teens, Hillary believes the key to healthy screen habits lies in empowering our kids through education and awareness. 


Parenting is hard. Technology can make it tricky. Hillary uses this podcast to help bring these areas together to help all families create healthy screen habits.

