S14 Episode 4: Get Help From the Family IT Guy! // Ben Gillenwater

September 25, 2025

Hosted by Hillary Wilkinson

Instagram is one of the few apps that offers the ability to switch from (addictive) algorithm to (non-addictive) time-based feeds.

~Ben Gillenwater

Ben Gillenwater @family_it_guy is a former chief technologist for a $10 billion IT company. He has over 30 years of experience, and (most importantly!) he's a dad. As a friendly and calm IT expert who helps parents navigate their digital parenting journey, Ben offers advice and tips on how to set your family up for safety. This episode is a great conversation. Listen now!


Healthy Screen Habits Takeaway


Resources


Show Transcript

Hillary Wilkinson: (00:00)

So many times when a new platform becomes the cool thing, or a game is making the rounds, it would be great if there was one place you could look to see what was needed to immediately put protections in place, and what conversations to have with your kids prior to them playing, so as to keep them safe. Just sort of a basic roadmap of the path of best safety. You guys, I found it. I found him. He's the keeper of the roadmaps, and he's here today to talk about all of it as an ex-NSA cybersecurity expert.


Hillary Wilkinson: (01:36)

And the former chief technologist for a $10 billion IT company. He has over 30 years of experience and he's a dad. He is a friendly and calm face that is guiding so many parents during this path of parental digital navigation. Welcome to Healthy Screen Habits, Ben Gillenwater!


Ben Gillenwater: (02:07)

Thank you very much, Hillary. Appreciate the very kind introduction. Glad to be here.


Hillary Wilkinson: (02:12)

Absolutely. Ben, your name and your platform kind of really say it all. You are the Family IT Guy. And I can't tell you how many times I've wished for an IT department in my own house. So, so yes,


Ben Gillenwater: (02:28)

I can understand.


Hillary Wilkinson: (02:29)

Yeah. Right. And what led you to become the Family IT Guy? How did you get here?


Ben Gillenwater: (02:37)

Yeah, it's a great question. You know, I like to sort of tell people that I've spent most of my career focused on national security and corporate security, and now I do family security and family IT. Um, and the reason that I'm doing this is because I gave my son an iPad when he was five. And it's ironic to me now, looking back now that he's nine, that I did that. But I did. And I put YouTube on it and games on it, and it didn't even occur to me at the time that that would be problematic. But of course it was, you know, he found all the stuff on YouTube. I was like, oh boy. Okay, hold on. Lemme take that off. Lemme put on YouTube Kids, 'cause it's called Kids.


Ben Gillenwater: (03:23)

And it didn't solve the problem, actually. Like, on the YouTube Kids platform, he found nightmare stuff. He found sexual stuff. Um, he had nightmares for years with the stuff that he found on there. (I'm so sorry.) So we had to pull YouTube Kids off of the iPad. And then it was left as like, okay, it had some kids' games. How bad could it be? It turned out that it's still very addictive. It was affecting his sleep. He was waking up early to play it. He would come home, walk in the door, and go straight to the iPad. It was disruptive to his routine. So we took away the whole iPad. I'm like, darn it, I gave my kid drugs. I didn't even know. And, by the way, he has a device now, but it's a watch.


Ben Gillenwater: (04:06)

Mm-hmm. It's an Apple Watch, and it's all locked down, 'cause Apple has these great Screen Time controls, and he can only receive and make calls to people that are on the list that I control. And so it's this really nice, like, emergency device, and then he can use it to listen to music. And so, you know, if it's that tricky for me as an expert in the field, how hard must it be for people that are not experts in this? And so, um, after talking with a bunch of people about it, and other parents, and people starting to ask me for advice about what they should do for their kids, I looked into it more and more, like, okay, this is an area that needs help. And so, since the beginning of 2024, I've worked almost nonstop on preparing myself and preparing resources on how I can share my experience with parents, and translating things that are very technical into very non-technical means. Like, how do I deliver technical concepts in a way that anybody can digest and actually use to help their kids stay healthy while they use these devices and use the internet?


Hillary Wilkinson: (05:26)

And I have to say, you're so good at it. (Wow. Thank you very much.) And I do not say that lightly. I am so excited to be sitting here having this conversation with you, because I have actually had people contacting me saying, “Hey, have you heard of this guy? Oh, are you talking to the Family IT Guy?” So, I mean, that's cool. You are resonating and hitting people exactly where they need the most help. And so, of course, getting ready for this conversation, I sit down and I peruse things and I look through things, and your website is fantastic. And part of your superpower, I think, is that you're relatable as this IT guy, because quite frankly, I think there are very few of those out there who we can actually understand. Like I said, this superpower of being able to break things down into very easily comprehensible chunks, and, um, also just this fact that you're a parent yourself and you're doing it, and, like you disclosed, you're struggling yourself. Um, without disclosing too much personal information, what are the biggest areas of friction surrounding tech in your own family unit?


Ben Gillenwater: (07:20)

Oh, yeah. Well, gosh, I mean, so there's certainly, uh, some of those things, and I'm happy to talk about them because I think that it's very important. One of my personal values is vulnerability. And I think it's very important for me to be vulnerable as I do my work in this space so that people can understand that like, I'm going through the same stuff, even though I, I have expertise and I sort of know what to do. It doesn't mean that it's easy. And so, for example, um, I think that that kids of course follow and, and, and model the behavior of their parents as well as their friends and other people they interact with, right? But certainly their parents. And so how do I, as a parent, behave with my phone? And how do I behave with my screen time and how does my wife do the same?


Ben Gillenwater: (08:14)

And so it's not easy because these things are designed to be addictive explicitly. And many of us as adults and parents require these devices for work. I do, for sure. I have lots of devices that I use for work, but like, my phone is certainly one of 'em. And so being very cognizant of how I spend my attention and, and how much of my attention I give or is taken by the devices that I use and the systems I interact with, and how does that balance with how much attention I give to my son and how much attention I give to my wife. And so that's something that comes immediately to mind when you ask about that, because I want to just highlight that that is not easy, even though I'm constantly aware of it because I'm involved in this 24 7 professionally, it's, it does not make it easy. And so, but it, it's something that's an ongoing effort. And I, and I'm very intentional about it as much as I can be every day of the week, every moment of every day.


Hillary Wilkinson: (09:18)

Oh, I think that is extremely admirable. I strive to have that level of, uh, awareness .


Ben Gillenwater: (09:26)

Yeah. Well, I tell you. And, and there's tricks involved with doing it. I, I have my device set up. I have a bunch of settings in place to make my phone boring. Mm-hmm . You know, and I have a bunch of blocks in place where I have to undo those blocks in order to access social media when I need to do stuff for work. 


Hillary Wilkinson: (10:03)

Okay. So you're the cybersecurity expert, and I am a child development person. I'm looking for clarity, and I don't even know if you're gonna be able to help me understand, but I thought, ah, I'm gonna ask, right?


Ben Gillenwater: (10:24)

I'll do my best. Yes, yes,


Hillary Wilkinson: (10:26)

Exactly. So as a child development person, it is unfathomable to me why platforms continually develop apps that are explicitly dangerous or harmful to kids. And I know the obvious answer to this question is profits, but I guess I'm digging deep and I'm trying to find hope for humanity here. Are there reasons that you're aware of, other than just putting profits over people, that big tech has failed to protect our most vulnerable users, those being our kids?


Ben Gillenwater: (11:18)

Hmm. That's such a good question. One of the things I've been exposed to is that what gets measured gets done. And incentives are very, very important to focus on and to look at. Like, what are the incentives of a third party that we're going to engage with? And so, of course, any commercial entity must put profit first, because it cannot survive if it doesn't. And so what mechanisms can it use to achieve its profit goals?


Ben Gillenwater: (12:16)

And one of the things that Facebook taught the business world is that attention is one of the most valuable things that a software platform can have. Uh, because the internet is mostly “free,” in air quotes, in that, instead of exchanging money, you exchange your attention. And when you exchange your attention, the company that receives that attention can learn about what causes you to stay, and then can serve you ads as you stay. And that's how they monetize the, air quotes, free tools. And so I also like to think about the fact that anybody outside, I'll just speak for myself personally, anybody outside of my family is absolutely not responsible for keeping my family safe. That includes people that pretend to desire the safety of my family. I find that to be a false premise. Um, anytime a stranger pretends to be interested in the safety of me or my wife or my kid, I'm immediately like, red flag.


Ben Gillenwater: (13:33)

And, and like, what are their incentives? And do their incentives match mine?


Hillary Wilkinson: (15:10)

So what do you think, this is going a little different direction, but what do you think about laws and legislation like the Kids Online Safety Act, or like the Children's Online Privacy Protection Act? The Kids Online Safety Act got shut down right towards the end of last year. Do you think there is a responsibility of big tech to provide a basic level of a do-no-harm type of protection?


Ben Gillenwater: (16:02)

Uh, short answer: no, I don't think so.


Hillary Wilkinson: (16:05)

Oh, wow. We disagree completely, Ben!


Ben Gillenwater: (16:09)

Which is fantastic. That's actually, like, my favorite thing.


Hillary Wilkinson: (16:11)

Oh, yeah. Love it. No, I'm enjoying this. Go ahead.


Ben Gillenwater: (16:16)

Well, I love exploring ideas with other people, and this'll be really cool to do. Okay, so the laws: what gets measured gets done. The incentive of a law is to maximize the votes that the politician who enacted the law will get at the next voting cycle. That's the primary measurement. And votes are given based on the success of that politician's marketing campaign, which is based on their ability to maintain positive perception about how they're doing their job.


Hillary Wilkinson: (17:04)

Uh-huh.


Ben Gillenwater: (17:04)

But none of these laws have measurements that I agree with, that say, like, these are the outcomes that should occur, and we will know if the outcomes are occurring by measuring the following metrics, and if they don't occur, we're gonna cancel the law. So these things become permanent. Every bureaucracy, including the US government, is primarily interested in growing itself. And so each new law grows the bureaucracy and grows the regulatory environment, and always has trade-offs. And so, like, a lot of these privacy mechanisms that want to identify the age of a kid then require identifying the age of adults too. And it's a big privacy problem. And privacy is something that requires trading convenience, and even sometimes trading security. And this is where the, you know, the personal values things come into play: I'm very privacy focused.


Hillary Wilkinson: (18:59)

I think I come from a place of having spoken to too many parents of dead children, who feel very strongly that it would have made a difference had there been just a basic filtering, or a basic alert, or anything. Which, I know there are private companies that you can employ to alert you when your child is looking at something. But some platforms have been kind of hijacked by backdoor apps. I'm talking about Snapchat specifically; they had a whole problem with backdoor apps coming in. And social media by design does not allow for monitoring and filtering by any of those services, like Bark. So I think that there are too many ways that parents are basically blindfolded and don't have the ability to know what's going on. But this is where your website and your services excel: keeping families safe. And when we come back, let's talk a little bit more about those pitfalls and pillars of protection that you have.


Ben Gillenwater: (20:36)

Okay. And can I offer a quick note? (Sure.) Just to clarify my position: Do I think that it would be wonderful if social media platforms and other entities did offer things that were child safe? I think that would be fantastic. And I think that the fact that they don't screams out that they're not in our favor. And so I kind of like how loud the problem is, because it can become obvious when you look at it that way.


______________________________________

HSH Workbook

______________________________________


Hillary Wilkinson:

I'm speaking with Ben Gillenwater, the Family IT Guy, who is instructing us on all things big tech, security, et cetera today. And let's get into the areas that you find are the biggest pitfalls or pain points for most families.


Ben Gillenwater: (22:38)

That's a great question. There's a lot to know. There are so many things, and so if you were to prioritize those things, how might you prioritize them? That's one of the things I focus on helping parents with. And so I like to group this larger problem set into three buckets. The first bucket is blocking harmful content. That could be stuff that is damaging to a child's mental health, or it could be stuff that is inappropriate, like, perhaps, you know, porn or sexual material. And then the second bucket is screen time. And that is very focused on addiction mechanisms, and how addiction affects people, and how it very strongly affects the youngest amongst us.


Ben Gillenwater: (23:39)

When our brains are the least developed, they can also be the most impacted by addictive substances. And then finally, the third bucket is preventing stranger contact. Preventing stranger contact at internet scale. So there's 6 billion people on the internet, and a lot of platforms connect one to everybody. And so when your child goes on certain games or certain systems, they are immediately exposed to that level of scale. And as far as I can tell, there is a percentage of people on the outliers of whatever the bell curve is of, like, average humans, that are intending harm, and many of whom apparently are intending harm specifically to children. And so when you expose your kids to internet-scale stranger contact, the risk becomes higher than you might think, because of the quantity that even those small percentages of 6 billion people represent. And so blocking harmful content, limiting screen time, and preventing stranger contact are kind of the three ways that I bucket my work.


Hillary Wilkinson: (25:21)

Mm-hmm . Yeah. And on your website, you refer to them as the three pillars.


Ben Gillenwater: (25:27)

Yeah.


Hillary Wilkinson: (25:28)

I like a construction analogy. I think it's easy to hang thoughts from. And we teach this too, this whole idea of the internet as a place, because so many of us are still digital immigrants. Our children are digital natives growing up within this time, and they move with this fluidity and speak with a fluency that, as a digital immigrant, I will never have. And oftentimes, as a digital immigrant, you kind of still think of the internet as a thing, and we need to shift that and recognize that the internet is a place. It's a space that people are operating within. And I was having a conversation with somebody lately, and they were saying, yes, and actually the internet as a place is even more dangerous. I'm in Southern California, so I always use the example of taking your 7-year-old and dropping them off on the corner of Hollywood and Vine at, like, you know, 11:30 on a Saturday night.


Ben Gillenwater: (26:46)

Oh My God!


Hillary Wilkinson: (26:46)

By themself, right? (Yeah. Exactly.) Now, if you're in New York, it would be Times Square. I mean, fill in the blank of wherever the least desirable part of town is, right? (Yes. Yes.) Okay. Not that I'm meaning to throw shade on Hollywood at all, but, you know.


Ben Gillenwater: (27:04)

It's special, you know. I think it's a special place.


Hillary Wilkinson: (27:06)

If you understand what I'm implying, what I'm throwing down there. Anyway, what this person said, and I completely agree, was: here's the thing. If you saw a 7-year-old by themself on a street corner late at night, you, as a mom, or as a, you know, citizen, would automatically be like, “Hey buddy, what's going on? You okay?” You know?


Ben Gillenwater: (27:36)

Absolutely


Hillary Wilkinson: (27:36)

You would already enact that kind of community protection. But in the internet place slash space, we don't even have that, because it's just one-on-one. There's no person coming in as a third party saying, “Hi, do you know this person? Yeah? Okay.” You know? So I think that was an interesting add-on to the internet as a place, for me.


Ben Gillenwater: (28:06)

That's a good point. I like that. And it's interesting, too, about the internet as a place, that you can't see it with your eyeballs. And so there's a big part of our brain that can't process, uh, risk, you know? We can look out in front of us and see, oh, that guy across the street, not going next to him. Oh, that person coming next to me, let me kind of scoot over here. You know? Oh, there's a dark alley, not going down there. On the internet, you can't see those things. Right? And if you're a kid, you don't even know which is the dark alley versus the bright alley as you wander through the internet.


Hillary Wilkinson: (28:49)

Right. So, what are your areas of greatest concern as the security expert? I mean, is it privacy protection? Is it AI? I mean, putting your dad hat on, because most of our listeners are parents. What are the areas of greatest concern for you as a parent right now?



Ben Gillenwater: (29:18)

Yeah. So there's two. Um, the first one is addictive algorithms. The primary place we see those is in social media, but they occur anywhere that you find a feed of content.


Hillary Wilkinson: (29:35)

Okay. So can you tease that apart for us?


Ben Gillenwater: (29:38)

Yes, definitely. So I'll offer an exercise to everybody listening. Take out your phone, which you currently might be using to listen to this, and open up Instagram. It'll open up to your main feed. If you look at the bottom of every post, it tells you how old the post is. So look at the first post. It could be a couple days old. The second post could be three hours old. The third post could be two weeks old. They're not time-based. Right? They're algorithmic. They're based on what the algorithm, what the computer system, thinks you're gonna wanna see next, and what you're gonna wanna spend the most time and the most attention engaging with. Now, at the top of the Instagram app, if you tap on the Instagram logo and you go to the Following feed, that is a time-based feed, not an algorithmic feed.


Ben Gillenwater: (30:26)

And you can tell, because if you look at the bottom of every post, the first post is the most recent one. The post after that is the next most recent one. So it'll be 15 minutes old, 22 minutes old, 27 minutes old. It's way more boring. You can tell right away, way less interesting. I use that example 'cause Instagram is one of the few apps that offers the ability to switch from algorithmic to time-based. And so I really recommend that people look at that. And then, from this point forward, every time you look at Netflix, every time you look at Facebook, every time you look at TikTok, every time you look at YouTube, recognize that those are all algorithmic. They're designed to keep you there. They're designed to addict you. They're designed to drip dopamine into your brain so that you like being there, and so that when you're not there, you want to come back.
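As a toy illustration of the difference Ben is describing, here is a minimal sketch with made-up posts and a made-up "predicted engagement" score (real platforms rank with far more complex models; this only shows how the two orderings diverge):

```python
# Toy sketch: a chronological feed sorts strictly by recency, while an
# algorithmic feed ranks by whatever score the platform predicts will
# hold your attention. Posts and scores below are hypothetical.

posts = [
    {"id": "A", "age_hours": 48,  "predicted_engagement": 0.9},  # two days old
    {"id": "B", "age_hours": 3,   "predicted_engagement": 0.4},  # three hours old
    {"id": "C", "age_hours": 336, "predicted_engagement": 0.7},  # two weeks old
]

# Time-based feed (like Instagram's Following view): newest first.
chronological = sorted(posts, key=lambda p: p["age_hours"])

# Algorithmic feed: whatever the model predicts is "stickiest" first.
algorithmic = sorted(posts, key=lambda p: p["predicted_engagement"], reverse=True)

print([p["id"] for p in chronological])  # ['B', 'A', 'C']
print([p["id"] for p in algorithmic])    # ['A', 'C', 'B']
```

Notice that the algorithmic ordering surfaces a two-week-old post above a three-hour-old one, which is exactly the tell Ben points to at the bottom of each post.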


Ben Gillenwater: (31:17)

So that is the first thing in order of priority. That is the number one thing in my mind: if parents only focused on one thing, it would be addictive algorithms, 'cause that's the thing that is causing a massive mental health crisis. I'll give you the second thing, but I'd like to quickly throw out there that I stumbled across data that really impacted the way I think about this. I was looking for anxiety and depression statistics, and I came across a World Health Organization death certificate database. They've been tracking death certificate data since 1951, and they offer it for free on their website. You can go look at this yourself. So I pulled the data for the US for young people aged zero to 39. The age groups between 10 and 24, specifically, had, as I charted them out, a very distinctive pattern.


Ben Gillenwater: (32:08)

And it is as follows: in 1951, 1 in 100 deaths of people in the age group of 10 to 24 were self-inflicted, suicide. In the nineties, it was one out of 20; it was 5%. From 2012 until 2021, which was the most recent data, it increased to one in five, 20%. The primary causes of death amongst young people right now are, first, automotive accidents; second, suicide; third, homicide. That's both boys and girls, ages 10 to 24. And on the chart that I drew, I plotted social media platforms and smartphone releases along the timeline, and it directly correlates to this gigantic spike that went from 5% to 20%. So addictive algorithms are top of the list. These things are dangerous. It's like giving a kid drugs, literally. Okay. The second thing is online chat. That's the exposure-to-strangers thing. That's the medium that predators use to engage with children in order to hurt them and take advantage of them. And it's very unfortunate, because I love the idea of connecting with strangers. You and I have never met in person, and here we are, we get to have this fantastic conversation because of the internet. But online chat is dangerous. It's a high-risk medium for communication. And so I'd like parents to be aware of those two things: addictive algorithms and online chat.
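For clarity, the ratios Ben quotes convert to percentages like this (simple arithmetic on the figures as stated in the conversation; the underlying WHO data is not re-verified here):

```python
# Converting the quoted ratios to percentage shares.
eras = {"1951": (1, 100), "1990s": (1, 20), "2012-2021": (1, 5)}
for era, (numerator, denominator) in eras.items():
    share = numerator / denominator
    print(f"{era}: {numerator} in {denominator} = {share:.0%}")
# 1951: 1 in 100 = 1%
# 1990s: 1 in 20 = 5%
# 2012-2021: 1 in 5 = 20%
```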


Hillary Wilkinson: (33:51)

Okay. Okay. Um, how do you stay relevant with what's coming at your son? Like how I, I think, you know,


Ben Gillenwater: (34:03)

That's a good question.


Hillary Wilkinson: (34:04)

You know, Ben, I gotta admit, I would say 80% of this podcast is for selfish reasons. Totally.


Ben Gillenwater: (34:14)

That's the best way,


Hillary Wilkinson: (34:15)

Because I'm like, oh, I got all these experts, now I'm gonna ask what I need to know. And, um, I'm in a different stage of parenting than yourself. I have older kids, young adult and older. And it's very difficult for me to see what's coming. So how do you stay relevant? How do you see what's coming?


Ben Gillenwater: (34:43)

That's such a good question, and such a truly challenging question. Okay, so the short answer is that I work based off of principles. I work based off of concepts. That's why I talk about these conceptual things, addictive algorithms and online chat and, you know, the three pillars: blocking harmful content, limiting screen time, and preventing stranger contact. These principles can apply across all technologies. Now, AI is unique, and you mentioned that, and we can talk about that separately, 'cause it is its own thing in a way. But I think it's about working from, like, first principles that nobody disagrees about.


Hillary Wilkinson: (35:33)

So almost a values base, is that what you're saying? (Yeah, exactly.) Okay. I don't mean to put words in your mouth.


Ben Gillenwater: (35:39)

Yeah, values, and the concepts that drive, like, the underpinning, the incentives. You know, like we were talking about earlier: should the companies do this or that, should the governments do this or that? What are the incentives underneath? We're very intentional about not setting the precedents that we don't want to see. And so I'm not getting him another device for a long time. I'm not going to give him access to social media. Now, okay, sure, that solves part of the problem. What about when he goes to his friend's house? What about when he goes to other places? What about when he goes to school? Which is insane to me, that this is part of the problem, but it is, 'cause they gave him a Chromebook, and they created a Google account with his full name attached to it and didn't even tell us.


Hillary Wilkinson: (36:43)

Oh, yes.


Ben Gillenwater: (36:44)

Um, and then the principles come into play. And so what we must do there, and what we do in my family, is we are very transparent, and we expose him to things that I'd rather not expose him to. And I tell him about problems that exist, so that he is aware that they're there, and so that when he comes across them, he's able to identify them. And then it's up to him, and how he has absorbed our values and established his own values, as to how he responds during these experiences. And that's challenging as a 9-year-old boy. Like, you know, there's not a lot of good decision making going on when stuff is right there, when the drugs are right there, right? Um, I wouldn't blame anybody, myself included, for just grabbing 'em, you know? So we're just ongoing. And I even take him to events.


Ben Gillenwater: (37:34)

I take him to my lectures. He's nine, and he hears police officers talk about sextortion. He hears me talk about it. Exposing your kids to the realities, I think, is very effective. I've had a very beautiful experience through my videos on TikTok and Instagram, and in person, where people say, “I showed your video on Roblox to my 7-year-old.” And it's like, hey, I made that video for adults, right? Like, in terms of the way I was speaking. “I showed it to my 7-year-old, and now they get it, and they don't wanna play.” Yeah.


Ben Gillenwater: (38:09)

I've had, I've had that happen like 10 times.


Hillary Wilkinson: (38:11)

Yeah. And somehow, you know, because a lot of times, as parents, we just become like the Charlie Brown teacher voice. Right? Our kids hear our voice so much, they're like, yeah, yeah, yeah. But it's like, oh, this guy is talking about it. When you're outsourcing it, it does have great power. Yes.


Ben Gillenwater: (38:41)

Yeah. So find your local events. PTAs, the parent teacher associations, are often hosting events of the nature that I'm speaking about, and the events that sometimes I speak at. Find those events and consider taking your kids to them.


Hillary Wilkinson: (38:57)

Mm, mm-hmm. Yeah. I would love that. And as far as instilling that baseline understanding of your own family philosophy on technology, I'd just like to point out, I know you have tools on your website. We have a free downloadable template as well, called the Family Tech Plan, that's meant to be a conversational springboard. It just kinda leads people through a conversation of, like, what's the role we want tech to have in our family? Where does it belong? Where does it not belong? And it's meant to be a living document, because at different ages and stages it means different things. Right.


Ben Gillenwater: (39:38)

Absolutely, change it. It should change. Yeah. Probably at least annually, you know. Yeah,


Hillary Wilkinson: (39:44)

Yeah. Exactly. Exactly. So we have to take our second short break, but when we come back, I'm gonna ask Ben for his healthy screen habit.

______________________________________________

HSH School Presentations

______________________________________

Hillary Wilkinson: (39:58)

I'm speaking with Ben Gillenwater, the Family IT Guy. And Ben, on every episode of the podcast, I ask for a healthy screen habit. This is going to be a tip or takeaway that our listeners can put into practice in their own home. What's yours?


Ben Gillenwater: (40:16)

Oh yeah, I like that. So if you don't mind, similar to what I've done so far in the conversation, I'll give you a concept one, and then I'll give you a concrete one. (Sure.) So the concept one is taking time off of tech as a family, and intentionally labeling it that. And I like this really easy moniker: Tech-free Tuesday.


Hillary Wilkinson: (40:37)

I love an alliteration.


Ben Gillenwater: (40:39)

Right? Yeah. It's great. It sticks in my head. So, for example, Tuesdays, you come home from work, your kids come home from school, and the devices, if they have devices, go in a drawer. They go away. They go in a room that's not where everybody's gonna be, and they stay there until the next morning. On Tech-free Tuesday, when you get home, it's family time. But even if you do that for an hour, anything more than zero is so impactful, because then you can look back and be like, wow, what was that like? And you can really feel it. And when you do it for a few hours, it is a huge change. It can really jump out. So I love it. It's a very positive activity to practice, and there's intentionality there that the kids get to see.


Ben Gillenwater: (41:27)

So that's the concept. And then the concrete one is a technical thing that I'll throw out there as a challenge for people, and it's effective. It's a DNS filter. This comes from my background in network engineering; it's a thing that network engineers are familiar with. And if you go to my website, familyitguy.com, and you type DNS in the search box, there's a whole description, 'cause it can be lengthy. So I'll be very brief here, but consider a DNS filter. It's a very effective way to filter harmful content. And you can do it at your router level for the whole house, and you can do it on each device, and you can configure the adult filters to be different than the kid filters, and even one for each kid if you have multiple kids. So those are the two things I'd like to recommend.
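For readers curious how the mechanism works: a DNS filter sits between a device and the internet, and answers lookups for blocked domains with a non-routable "sinkhole" address so the page never loads. Here is a minimal conceptual sketch, with a hypothetical hand-rolled blocklist (this is not Ben's setup or any specific product; real filtering services maintain large category-based lists and run at the router or device level):

```python
# Toy model of the blocking decision a DNS filter makes for each lookup.
# Domain names below are hypothetical placeholders.

BLOCKLIST = {"adult-site.example", "gambling-site.example"}

SINKHOLE = "0.0.0.0"  # non-routable answer: the browser gets nowhere to connect

def answer_lookup(domain: str) -> str:
    """Return the sinkhole address for blocked names, else pass through."""
    name = domain.lower().rstrip(".")  # DNS names are case-insensitive
    if name in BLOCKLIST:
        return SINKHOLE
    return "resolve-normally-upstream"  # placeholder for the real DNS answer

print(answer_lookup("adult-site.example"))     # 0.0.0.0
print(answer_lookup("homework-help.example"))  # resolve-normally-upstream
```

Whole-house versus per-device filtering, as Ben notes, is just a matter of where this decision point sits: in the router's DNS settings for everyone on the network, or in each device's DNS configuration for individual profiles.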


Hillary Wilkinson: (42:15)

I love it. I love it. I feel like if people could just enact even one of those, it would be powerful.


Ben Gillenwater: (42:20)

Yes, absolutely.


Hillary Wilkinson: (42:21)

Nuclear level!


Ben Gillenwater: (42:22)

Anything better than zero. Yes. Yes,


Hillary Wilkinson: (42:24)

Exactly. As always, you can find a complete transcript of this show and a link to Ben's website by visiting the show notes for this episode. You do this by going to healthyscreenhabits.org. Click that podcast button and find this episode. Ben, thank you so much for sharing your wealth of knowledge with the world, because I feel like your professional life and your personal life are merging and creating this supernova. Thank you.


Ben Gillenwater: (43:02)

You're welcome. And thank you for having me. I'm very grateful that I get to do what I do.



About the podcast host, Hillary Wilkinson


Hillary found the need to take a big look at technology when her children began asking for their own devices. Quickly overwhelmed, she found that the hard and fast rules in other areas of life became difficult to uphold in the digital world. As a teacher and a mom of 2 teens, Hillary believes the key to healthy screen habits lies in empowering our kids through education and awareness. 


Parenting is hard. Technology can make it tricky. Hillary uses this podcast to help bring these areas together to help all families create healthy screen habits.

