This Week in Tech 1071 Transcript
Please be advised this transcript is AI-generated and may not be word for word. Time codes refer to the approximate times in the ad-supported version of the show.
Leo Laporte [00:00:00]:
It's time for TWiT This Week in Tech. Stacey Higginbotham is back. Wesley Faulkner is also here. And brand new, we welcome from the BBC Thomas Germain. We'll talk about social media on trial, the 30th anniversary of the 26 words that changed the internet. Meta is announcing face recognition at a moment when they think no one's paying attention. And what about the Discord age verification? What's going to happen there? That and a whole lot more coming up next on TWiT.
Stacey Higginbotham [00:00:32]:
Podcasts you love from people you trust. This is TWiT.
Leo Laporte [00:00:45]:
This is TWiT, This Week in Tech, episode 1071, recorded Sunday, February 15th, 2026. Image Pickles. It's time for TWiT This Week in Tech, the show where we cover the week's tech news. Stacey Higginbotham is here from Consumer Reports, where she's a policy fellow and, more importantly, the host of Stacey's Book Club.
Thomas Germain [00:01:10]:
Woo-hoo!
Leo Laporte [00:01:12]:
Hi, Stacey.
Stacey Higginbotham [00:01:14]:
Hey, everyone.
Leo Laporte [00:01:15]:
Good to see you. I get to— I get a little extra Stacey this month, which is nice. And we will talk about the new book we have picked for the book club in a couple of months.
Stacey Higginbotham [00:01:25]:
Excellent. Give me a break before, before you do it so I can get a— I have a copy.
Leo Laporte [00:01:29]:
You have a copy? Okay, nice. You're gonna read it on dead trees, huh?
Stacey Higginbotham [00:01:35]:
Always.
Leo Laporte [00:01:37]:
Uh, I will probably listen to it on Audible. Actually, not Audible, on Libro.fm if it's available. Also here, Wesley Faulkner, our good friend Wesley, founder of something new that is not yet going, but you can go and take a look at the prototype at not-working.com. Hi Wes.
Wesley Faulkner [00:01:56]:
Hi, thanks for having me.
Leo Laporte [00:01:57]:
Good to see you.
Wesley Faulkner [00:01:58]:
I am doing great.
Leo Laporte [00:01:59]:
Nice. And we want to welcome somebody brand new to our microphones. I'm saying that because that's what they say in the BBC. Welcoming to our microphones from the BBC, Thomas Germain, tech correspondent at the Beeb. He's the host of a brand new show launched last week called The Interface. Hi Thomas.
Thomas Germain [00:02:16]:
Happy to be here.
Leo Laporte [00:02:18]:
Great to have you. Do you have a tree growing in your house?
Thomas Germain [00:02:21]:
Uh, yeah, you could, you could call it a tree. I think it's taller than I am now. It kind of fell over a little bit.
Leo Laporte [00:02:27]:
If you're, if you're not looking really closely, it looks like the green curtain is the trunk of, of that.
Thomas Germain [00:02:33]:
That would be, that would be cooler. Yeah.
Wesley Faulkner [00:02:34]:
Yeah.
Leo Laporte [00:02:35]:
Thomas, uh— was it Gizmodo? Was it Consumer Reports?— for many a moon, and, uh, has been at the Beeb, the BBC, for a while. Uh, now— what is The Interface about?
Thomas Germain [00:02:46]:
Uh, well, I think it's something that probably would appeal to, uh, TWiT listeners. You know, the idea of the show is that everything is a technology story now.
Leo Laporte [00:02:54]:
And yeah, that's true.
Stacey Higginbotham [00:02:55]:
Exactly.
Thomas Germain [00:02:56]:
Yeah. We're trying to be a show for everyone. So we've got, you know, three people with really deep expertise. There's me, there's Karen Hao, who just wrote a book.
Leo Laporte [00:03:05]:
Oh, I know Karen. Yeah.
Thomas Germain [00:03:07]:
Uh, and then Nicky Woolf, who's done a lot of investigative, you know, miniseries podcasts about weird internet culture mysteries and stuff.
Leo Laporte [00:03:15]:
So you got AI, you got culture, and you got consumer electronics.
Thomas Germain [00:03:19]:
Exactly right.
Leo Laporte [00:03:20]:
Perfect.
Thomas Germain [00:03:20]:
So something for everybody. And, uh, you know, the idea is we'll be a show even if you think you don't care about technology. We're trying to appeal to everyone because it's every part of your life.
Leo Laporte [00:03:30]:
Well, I see that your most recent episode was about, uh— well, it might have been about Ring— it was about doorbells tracking people. We're going to get— we have a big privacy section today. But I thought we should probably start with that trial that's going on in Los Angeles right now. Uh, it's an omnibus trial where they brought together, I think, several hundred plaintiffs suing, uh, social media— Meta and YouTube. Snapchat and TikTok were also in the suit, but they kind of made a deal right before the trial began.
Leo Laporte [00:04:04]:
The thesis is that Meta and Google are addictive, that they are addicting children, and that they should be liable for addicting children. And it has brought some of the biggest names in social media to the stand. Adam Mosseri was on the stand from Instagram this week. Mark Zuckerberg will probably be testifying next week. The plaintiffs' attorney said, this case is about two of the richest corporations in history— it would have been four, but we made a deal— who have engineered addiction in children's brains. Mosseri said it's addictive like a great show on Netflix is addictive. Is social— is social media addictive? Can social media be addictive like heroin and cigarettes?
Stacey Higginbotham [00:04:57]:
I mean, is dopamine and seeking dopamine addictive? Yes. Can we actually test that in a way that's provable? Probably not, because we don't really have that yet from a scientific and brain-chemistry perspective. But this— I'm so frustrated by this, because really what should be on trial is what we call dark patterns, deceptive patterns. And that sort of thing is being deployed. It has been deployed forever. And like everything on the internet, it's deployed at scale and in such a, I guess, emotionally close way for people— you're snuggled up in bed looking at this thing— that it is harmful. And I would argue that it's not just children.
Stacey Higginbotham [00:05:44]:
I mean, I look at my father and his addiction to YouTube and what's happened as a result of that, and I'm like, "Holy moly." So anyway, that's just Stacey's two cents. Legally, I think it'll be hard to prove. Do I think these companies have a lot to answer for? Absolutely.
Thomas Germain [00:06:01]:
I think the really interesting thing is like why we're talking about whether it's addictive in the first place, which is I think because there are basically no laws regulating these companies, at least not in particular, right? Like they can't go after a social media platform because of the content because Section 230 protects them. So we've been seeing all like this raft of lawsuits, not just against social media. There was similar stuff about dating app companies a couple of years ago. And the idea is if we can prove that the design of the platform is the thing that's hurting people as opposed to the content itself, then maybe we can hold the companies liable and the courts will be able to do something. And it's like this one little narrow legal question that, you know, in a lot of ways, the whole future of the internet hinges on.
Leo Laporte [00:06:50]:
It seems a perilous contention though. Should, uh, look, I, I'm watching Succession on HBO for the fourth time because I get a big dopamine hit from it. And I love that dopamine. Is HBO liable for making a show that's very, very good?
Stacey Higginbotham [00:07:09]:
I wonder if there are parallels. So remember back in the '80s? I do not because I was a child, but I do because I was.
Leo Laporte [00:07:17]:
A full-grown man by then.
Stacey Higginbotham [00:07:20]:
I'm just like— I don't want to make it sound like, oh, I was there thinking about this, but I did take a— I took a college class on this. Um, in the '80s, they, uh, removed the regulations around advertising to children on television, right? And that launched some really, truly terrible TV shows that I loved wholeheartedly, um, and a lot of sugary cereals. But I think about: are there parallels to that kind of deregulatory era? We haven't actually had any regulations here, but should we go back and look at what we did there? Because there was the concern that children's little brains were not able to parse ads, and that they shouldn't see a lot of advertising because they were so young.
Leo Laporte [00:08:04]:
Yeah, that's true.
Stacey Higginbotham [00:08:06]:
I don't, I don't know if these are good parallels, right?
Wesley Faulkner [00:08:08]:
There's a difference in the on-demand, all-the-time nature of these apps that is different from even Succession, for instance. If you think of it as an opt-in, opt-out model, you have to opt in to turn on HBO, then to choose a show.
Leo Laporte [00:08:27]:
Don't you have to install Instagram and then scroll through it? I mean, don't you have to do that too?
Wesley Faulkner [00:08:33]:
But you don't get to choose what you watch. You don't get to choose when it stops. They have these controls for parents that you have to opt into, so it's not having all these restrictions by default and then opting in to release and turn them off— saying, I want to do this. I want to get more shows. I want this type of content. I want to be able to have endless scroll.
Wesley Faulkner [00:09:00]:
I want to not be bound by time. That's not the default. Adding those restrictions is something you have to opt into.
Leo Laporte [00:09:10]:
I want to correct myself. The plaintiff in this is a single person; I was confusing it with another trial, which does have a number of plaintiffs. A 20-year-old California woman known only as KGM— they're trying to protect her anonymity. I just— I— yeah, I mean, I don't have any problem with restricting sugary cereal ads on children's television or keeping adult television to after 10 p.m. or whatever. I think regulations like that are okay.
Leo Laporte [00:09:41]:
I just worry about the free speech chilling effect of telling a company, well, you can't make a product that's too good, because then you'll be liable for addiction.
Stacey Higginbotham [00:09:52]:
I don't think the issue is necessarily that it's too good. I think that it is hard. Like, I mean, we've made heroin illegal and it is too good, but it—
Leo Laporte [00:10:05]:
Is also demonstrably harmful. It is demonstrably physically addictive, as are cigarettes.
Stacey Higginbotham [00:10:12]:
Well, okay, so we have shown that these sorts of infinite scrolls and TikToks and all of the social— any social media does create a dopamine hit that people continuously go back for. Is it that same level? That's hard to— I mean, is it like heroin?
Leo Laporte [00:10:30]:
Well, Mosseri said if you don't have withdrawal, then it's not addictive.
Stacey Higginbotham [00:10:36]:
So I will tell you— okay, this is Stacey's story of my TikTok addiction, which— okay, we can all laugh because obviously I'm fine. Um, I actually did realize that I had a problem when I got an RSI from holding up my— okay, you guys remember how I hate heavy phones.
Leo Laporte [00:10:55]:
Right?
Stacey Higginbotham [00:10:55]:
Right. Um, okay, so I had to actually— I had uninstalled the app, and that was like, oh. But then I'd reinstall it because my kid would send me something and I felt like I needed to be there for them. And then I set time limits for myself. But whenever I picked up my phone, I automatically went to it. So it's not an addiction like I'm going to go sell my house or rob somebody to get access— like steal a phone so I can check TikTok on it. But it was something that I habituated. It was very easy to form a habit, much like smoking cigarettes.
Leo Laporte [00:11:27]:
Yeah, it is habituating. I won't deny that.
Wesley Faulkner [00:11:30]:
And also— I think the social nature, sorry. The social nature of it is also the thing that is tapping into our psyche.
Leo Laporte [00:11:39]:
But it's valuable. I mean, the argument is there are kids who are isolated, whose parents don't agree with their sexual choices or whatever, who find solace and a community online in these social networks, right?
Wesley Faulkner [00:11:55]:
But that's— we're gonna talk about, um, 230 later, and there's a balance of pros and cons that we need to keep examining. And I think the social nature of it is not just everyone finding community, but everyone doing it because everyone's doing it. I don't know if you've seen those experiments where everyone is taking a test in a room and they pump in smoke, and the one person who is not a paid actor wonders why everyone is staying in their seats and not moving. The social bond is such that if no one seems panicked and no one's leaving, they feel compelled to stay.
Leo Laporte [00:12:32]:
And I think it's the same— it's the "screaming fire in a crowded theater" of not screaming fire.
Wesley Faulkner [00:12:38]:
If no one does anything— even if you think about the Will Smith slap example, how no one was, like, tackling him. Yes, because of the social pressure from everyone around them not doing it. And I think that is also something that needs to be thought about: it's not just that it's available, it's that everyone else is using it in this way.
Leo Laporte [00:12:57]:
Well, I think you figured out where I'm going. Go ahead.
Thomas Germain [00:13:00]:
I'm sorry. Oh, no. You know, also, though, I think it's important to go back to what Stacey was saying about dark patterns, right? Like, you can't really compare it to HBO making Game of Thrones so good that I can't stop watching it. Because we know that, uh, Meta, for example, is well aware that the design of its products causes harm to its users, right?
Leo Laporte [00:13:20]:
Don't you think HBO is, is well aware?
Stacey Higginbotham [00:13:23]:
Their shows are engineered, but you can always— so it is very easy to see, like, when you're watching The Pitt or any Netflix show that is designed to eventually be binge-watched, that they end on a cliffhanger.
Leo Laporte [00:13:39]:
Yeah, that's called a cliffhanger.
Stacey Higginbotham [00:13:40]:
Yeah.
Leo Laporte [00:13:41]:
Okay, but— That should be illegal! Making me watch the next episode!
Stacey Higginbotham [00:13:47]:
We actually stop all of our TV shows about 10 minutes before the end. Oh, that's smart. Especially if we need to go to bed.
Leo Laporte [00:13:54]:
To avoid the cliffhanger.
Stacey Higginbotham [00:13:56]:
Yeah, that's how we watch shows in our house, because we're very uncomfortable. I love it.
Leo Laporte [00:14:01]:
Brilliant.
Stacey Higginbotham [00:14:02]:
But there's a very big difference in ending on a cliffhanger for something that has a finite lifespan, even if it is, you know, 15 hours of binge watch, and having an infinite scroll of, like—
Leo Laporte [00:14:20]:
Just— I feel like, I mean, every novelist, if they could, would write a novel that compelled you to read the next installment. Do they overuse it? Is the crime that Instagram and Facebook and YouTube committed that they're too good at it?
Thomas Germain [00:14:36]:
I think it's that they know that they're hurting people. And like, we've got all these internal documents from the company talking about like, oh, like we noticed that when we turned this feature on, it caused this shame spiral and all these like teenage girls started watching 10 hours of eating disorder content. And they also know that there are solutions that they can implement to address this problem, but they choose not to.
Leo Laporte [00:14:58]:
They don't want to.
Thomas Germain [00:14:59]:
We have all these documents where the companies are discussing internally, this is a big problem and we're causing it and we could do something about it, and choosing not to because they want to protect their bottom line. And that's where it's a little different. If HBO knew that it was doing that, and they had a solution, and they could still make a great show that wouldn't cause you to have an eating disorder, I think we would all be pretty upset that they were proceeding in that way. And I think that's the way that we should be looking at it.
Wesley Faulkner [00:15:26]:
Yeah, the incentive structures are totally different. Yeah, you're right. One: the more ads you watch— they get more money the longer you stay on the platform. The other one is like, we just want you to stay around for a month. You can watch as little as you want or as much as you want. We just want you to keep your subscription, which is a totally different motivation.
Leo Laporte [00:15:45]:
YouTube actually argued, we're not social media, we're just an entertainment platform like HBO. Their lawyer said, it's not trying to get in your brain and rewire it, like those other guys on the stand. It's just asking you what you like to watch.
Leo Laporte [00:16:02]:
Is, is YouTube social media?
Thomas Germain [00:16:05]:
I mean, is anything social media anymore?
Leo Laporte [00:16:08]:
Like, what's the difference between— No, you're right, because Instagram's not my friends.
Thomas Germain [00:16:12]:
No. Yeah, I think that's a strange argument to make— that this is different because of the definition of what a social media platform is. Like, MySpace was social media, Instagram in the early days was social media, but when I go on any of these apps, the last thing that I'm looking at is what my friends are posting. And they're posting less and less, and that's by design. All of the companies are doing this.
Leo Laporte [00:16:33]:
Now it's TikTok. It's a stream of entertaining content designed to keep you scrolling, the same way. Yeah, Instagram's the same. I suppose you could argue Facebook's the same. Certainly YouTube, uh, is that way. Um, so you kind of hit on the thing that I do care about on this, and the reason I'm arguing against, uh, any restraints is because of Section 230. I don't want to open the door to undermining Section 230. It's 30 years old this week.
Leo Laporte [00:17:05]:
And, you know, it's been called the 26 words that created the internet. It was part of the Communications Decency Act. And it said that no provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider. In other words, Twitter wasn't responsible for your tweets— you were— and Twitter couldn't be held liable. And by the way, a lot of people who are trying to change Section 230, including a number of well-known Hollywood actors, which puzzles me, say that it's Big Tech trying to manipulate you. But Section 230 also protects me. It protects my chat room. It protects my Discourse forums.
Leo Laporte [00:17:55]:
It protects my Mastodon instance. If I were liable for the things people posted on those, I would just shut them down because I couldn't defend myself. I'm not big enough to defend myself. In fact, Big Tech can. So do you think this trial is in any way a threat in the long run to 230, or that 230 protects these companies?
Thomas Germain [00:18:17]:
I think they're trying to sidestep that issue altogether because they're not talking about like Instagram hosts dangerous content.
Leo Laporte [00:18:25]:
Like, it's not the content, it's what Instagram's doing with their algorithms, right?
Thomas Germain [00:18:29]:
A whole separate can of worms, a much, I think, more complicated issue.
Leo Laporte [00:18:33]:
This is what people like Joseph Gordon-Levitt say. Well, when you do an algorithm, you're a publisher, right?
Thomas Germain [00:18:40]:
Well, which I think is a point that's really worth focusing on, right? You, in your Discord— like, maybe you have some rules set up where if someone says something that's, you know, really deeply offensive or breaks a law, you're going to ban them. That's a lot different than a social media platform where they've got this algorithm, and we know that they're making very specific, intentional decisions about what kinds of content they want to promote. We also know— like TikTok, for example, there was a story a couple years ago about how they have a heating button, where if they have a particular user that they want to promote, they go in and turn this on and it makes the content go viral. So I think there are all kinds of cases where the tech platforms are acting like publishers. Like, YouTube decides what kind of stuff goes on the homepage. These algorithms are making the decisions. But I think it's not really accurate to describe these platforms as just neutral, the way that Reddit is, where anyone can just go on there and post. There's something else going on here that I think maybe needs to be addressed in a different way than, you know, your Discord community, because it operates in a complete— what's the remedy?
Stacey Higginbotham [00:19:56]:
So here's, here's what I've been thinking about, and I hate thinking about 230 because it's like the third rail of internet.
Leo Laporte [00:20:03]:
No, we have to defend it. We have to defend it.
Stacey Higginbotham [00:20:05]:
Hold on, hold on. So I'm thinking back to, you know, this is influenced by my net neutrality years, right? Thinking about we had the common carriers for ISPs, right? And then we came up and we were like, you know what, These email providers and internet folks, they're a little different. They're broadband providers, so they don't have to follow these rules. And I know we spent a long time debating over how that would happen. But I wonder if— because they're— and we would have to figure out what is a publisher on the internet versus what is a platform that is pushing something harmful. And you could define it by the platform. You could set regulations for proving harm. I don't love that idea, but there are lots of policy ways we to tackle this, but what happens is we're like, we're going after 2:30 and everyone rightly freaks out.
Stacey Higginbotham [00:20:59]:
So it's time for nuance and we're just really terrible. This time is terrible for nuance. And I'd love to read like some really good thinking on that. Please send me stuff.
Leo Laporte [00:21:09]:
I think you're right. I think it's true that when a company creates an algorithm that's really designed, as best as possible, to hook people into doomscrolling, they have now become a publisher.
Thomas Germain [00:21:23]:
Or look at AI, right? Like, you go on Google and do a Google search right now, and Google is speaking to you. Google is generating the information itself and giving it to you.
Leo Laporte [00:21:33]:
They're publishing.
Thomas Germain [00:21:34]:
They are publishing. They are making the information up themselves and delivering it to you. That's a completely different thing. I think we have to protect Section 230.
Stacey Higginbotham [00:21:42]:
It's true.
Thomas Germain [00:21:43]:
The whole internet relies on it. We cannot get rid of it. The solution that I always point to with this kind of stuff is transparency. What if there was a law that said you had to make it so that outside auditors, and we could decide who they are, can go in and look at exactly what the algorithm is doing and demonstrate, you know, this piece of content is being promoted in this particular way and this is how people are reacting to it. This is what the algorithm is doing. Just by putting it out in public, so everyone can decide, like, how do we feel about all this, I think would create, you know, like Wesley was saying, a social pressure on these companies to behave a little better and be better stewards of what's happening on their platforms. Now, that wouldn't solve it, but where we are right now is like nothing at all. That's like these companies can never be held accountable under any circumstances because of these rules that really are protecting and upholding the internet.
Thomas Germain [00:22:37]:
I think we've just gotten to a place where the internet is so different that we have to come up with a new way to look at it, but it gets— like you're saying, it gets thorny really fast.
Wesley Faulkner [00:22:47]:
It could even be opt-in for some platforms to have almost a score, like a health score for a restaurant. You can see if it's an A, a B, a C, a D, an F in certain categories, and they can choose to be part of it or choose not to.
Leo Laporte [00:23:01]:
Some of this, especially with kids— who is going to be doing the grading?
Thomas Germain [00:23:08]:
What?
Wesley Faulkner [00:23:08]:
Oh, it has to be useful for people to trust it, right? And so if you really wanted to tackle this, it needs to be extremely objective.
Leo Laporte [00:23:15]:
But if lawmakers make this law, who is going to be the person doing the grading?
Wesley Faulkner [00:23:21]:
Lawmakers? It could be a consortium.
Stacey Higginbotham [00:23:22]:
No, no, lawmakers hate actually doing that. Yeah, exactly. They're gonna give it up to— God help us, if we're lucky it'll be a government agency that is like—
Leo Laporte [00:23:32]:
This is what Facebook pretended to do, right, with their— whatever they called that group. It's still around.
Wesley Faulkner [00:23:39]:
Or just let people make their own benchmarks.
Leo Laporte [00:23:41]:
Like, think of Ginkgo. I agree. Social norms are the only thing that can protect us.
Stacey Higginbotham [00:23:46]:
Not— not— no, I think, I think actual rules need to protect us. And I think everyone really— we love transparency. I mean, God help me, I know you're wearing so many labels right now. Um, but the problem is— and I will go back to kids— because they don't have the ability to look at this and understand a label the way— like, I may be able to look and say, oh, that's, you know, pro-ana content.
Leo Laporte [00:24:12]:
Were the motion picture labels, uh, the MPAA ratings— were those effective? R, G, PG?
Stacey Higginbotham [00:24:20]:
When I showed my kid, when they were young, things that I watched that were, like, PG or PG-13, I was like, oh my God, these are the most sexist, horrible things. No wonder I, as a '90s child, grew up with some of the ideas I had.
Leo Laporte [00:24:35]:
Yeah, I mean, the MPAA allowed murder on camera, but not a naked breast.
Stacey Higginbotham [00:24:40]:
But now they don't. Now they don't. So these things change— well, the way they were enforced has changed over time. I just don't think ratings are the way. I think we really need to think about the harms we're trying to prevent, and then set up rules to address those harms, and probably some sort of body that arbitrates that, because it's going to change over time. I don't know. This is why I want people to send me good ideas.
Leo Laporte [00:25:08]:
You can see how hard this is, to be honest.
Thomas Germain [00:25:10]:
We have a system set up for copyright, right? Like, if I notify— well, it's complicated, but if I notify Twitter or, uh, TikTok that someone is hosting, you know, Paramount's content, they have 24 hours to take it down before they violate the DMCA, right? You could create a system where, okay, we decide that there are certain kinds of content that we think are so noxious that they're illegal— and maybe we need to expand it a little bit. But even just the rules that we have, right? We know that companies aren't doing a good job of taking down dangerous, completely illegal stuff. We could create more robust protections where they have to have a system in place to act on stuff in a certain amount of time if they're notified about a particular problem. And then, you know, there's some accountability system where they have to tell everyone how they're doing it, the same way that they have transparency reports about how many times they respond to, you know, warrants and police requests. I think there are a lot of small interventions that we could make that everyone except the tech companies would kind of agree on. It's when you look at the big— when you're trying to write one law about the internet, or you want the senators, you know, deciding what kind of speech is and isn't okay— that's when it gets complicated, right? There are a lot of common-sense interventions.
Leo Laporte [00:26:31]:
It does seem like there has been a growing consensus that this stuff is bad for us, it's bad for our kids, uh, that these companies don't really care. Um, it's hard to think of what a remedy would be, but I think we've all kind of come around to this idea that this is not the utopia that we thought the internet would end up being.
Wesley Faulkner [00:26:51]:
I think competition is the way to fix it, though. What's going on with the Fediverse— hopefully it will mature enough that people can feel like they can opt into the experience that they feel is good for them. But we lack the vocabulary to even describe what we want. I think part of it is that because we don't have this competition, you can't say this is more X, this is more Y, this is whatever, because we can't even have that conversation. We just say safe, harmful, or addictive, but we don't have the nuance to be able to say, let's make a platform that's different than all the others, where people know, if they want a certain experience, where to go.
Leo Laporte [00:27:30]:
And you know the problem is, Wesley, these algorithms are so good that they win. You can't have competition because people want Instagram.
Stacey Higginbotham [00:27:40]:
They want these algorithms. And there's no competition also because there's no business model for this right now.
Leo Laporte [00:27:45]:
Well, the business— but, but Instagram and Facebook and YouTube are making billions and billions.
Stacey Higginbotham [00:27:50]:
Their money is based on engaging us at all costs, even at harm. So we recognize that. Right— but if you're not going to harm people, if you're going to create a place where engagement doesn't matter, then you're looking at something that people pay for. Fewer people will pay for that. Part of the reason— I mean, and it does get down to this idea. I wonder if some of this is down to the fact that we think we want the internet to be a place where we socialize and can connect and have these engagements. And we do, in the sense that if you live in a small town, it gets you all of these people and engagement you otherwise would never get, because those people aren't there. But we don't want to spend money on it. And honestly, I don't know if you should. I mean, basically, we think the internet's doing one thing, but it's really doing another, and most people are not aware of that.
Stacey Higginbotham [00:28:42]:
And that's, that's hard. And I, I love the Fediverse. I hope it helps, but I'm not.
Leo Laporte [00:28:48]:
As— it's not very competitive, is it? You know, when they poll people, uh, about which TV channels they watch, they always say PBS in far greater numbers than people actually watch PBS. Right? People say they want to eat vegetables, right? But they like McDonald's. Yeah, I think it's a problem. The biggest problem here is these platforms are hugely successful. They work.
Stacey Higginbotham [00:29:13]:
You could make it more painful and more expensive for them. So some of these little alterations could make it— because I think we're like, let's stop them from being evil. But what if we just said, hey, let's make it really expensive? We did that with cigarettes.
Leo Laporte [00:29:29]:
Pain in their butt. We did that with cigarettes. We raised the cost of cigarettes. In Canada, they have— I think it's probably true in the UK too. They have labels on cigarettes that are graphic. Yeah, they show lungs and it's like, this is going to happen to you if you— and it does not seem to deter people much.
Stacey Higginbotham [00:29:46]:
What are you talking about? Nobody smokes anymore.
Leo Laporte [00:29:49]:
But you know why? Because we make sure that they have to do it outside when it's cold.
Stacey Higginbotham [00:29:54]:
Right. I feel like there's been a bunch of interventions. There are interventions.
Leo Laporte [00:29:57]:
More painful. You're right, there are interventions that work. It's not— we did a whole bunch of different ones. Yeah, we tried them all.
Stacey Higginbotham [00:30:04]:
Yeah, yeah, that's true. Do fewer people smoke?
Leo Laporte [00:30:08]:
Oh yes, in the US for sure. In the US, yeah. But in.
Stacey Higginbotham [00:30:16]:
Italy, it's— or France, you could smoke. Can you smoke inside places in Italy?
Leo Laporte [00:30:21]:
No. When we were in Rome last year, Lisa said, I never want to come back here because everybody's outside smoking.
Stacey Higginbotham [00:30:26]:
Yes. Anytime I'm in like Japan or parts of Europe, I'm just like, whoa, you guys. When I was in Istanbul, they were.
Leo Laporte [00:30:33]:
Smoking in the hotel. And people know. I mean, there's nobody who doesn't know that smoking is bad for them. That's going to kill them.
Wesley Faulkner [00:30:38]:
Right. But think about this. This culturally, you see it less than movies and TV shows. And I think that's true.
Leo Laporte [00:30:43]:
My previous example, it's not cool anymore.
Wesley Faulkner [00:30:45]:
Yeah.
Stacey Higginbotham [00:30:46]:
They are worried about it coming back.
Wesley Faulkner [00:30:47]:
We've moved away from that. Yeah.
Leo Laporte [00:30:48]:
It's not cool anymore. All right. We're going to take a break and talk about privacy. Next, great to have Thomas Germain, first time, first timer from the BBC's The Interface, which, uh, you got a great team. I can't wait to listen to it. That sounds great. Just launched last week. Just last week.
Leo Laporte [00:31:07]:
Yeah, but you already have muffs. We've been doing this for years and we do not have microphone muffs with our name on it.
Thomas Germain [00:31:14]:
You got to get on my level.
Leo Laporte [00:31:14]:
I don't even have a mic flag.
Thomas Germain [00:31:16]:
For crying out loud.
Leo Laporte [00:31:17]:
We're coming for you.
Wesley Faulkner [00:31:18]:
Yeah.
Leo Laporte [00:31:18]:
I'm in trouble. Also Stacy Higginbotham, who thank goodness does not have a Consumer Reports mic flag, but she does have a little, that little wooden CR behind her, which Paris now has and Nicholas DeLeon now has. They're starting to— Oh, good. Yeah, it's taken over everywhere.
Stacey Higginbotham [00:31:33]:
They were from the pandemic apparently.
Leo Laporte [00:31:35]:
Oh, 'cause everybody had to do Zoom calls.
Thomas Germain [00:31:38]:
Yeah.
Leo Laporte [00:31:38]:
Isn't that interesting? That might be the number one thing the pandemic has changed. We've all become, even on major news networks, used to people sitting in their bedroom doing their— I'm.
Stacey Higginbotham [00:31:49]:
In my office, okay.
Leo Laporte [00:31:50]:
Or their office or whatever. It's really changed. It's really interesting. Wesley Faulkner is also here. Great to have you, Wesley. We'll talk about your new company, Work's Not Working, in just a little bit. Our show today brought to you by Zscaler, the world's largest cloud security platform. And the time is right.
Leo Laporte [00:32:13]:
We talk a lot about AI on all of our shows. The potential rewards in business are obviously too great to ignore. But there are risks. There are a lot of risks. The loss of sensitive data, attacks against enterprise-managed AI. Generative AI increases opportunities for threat actors too. Just as it's helping your business, it's helping them to rapidly create phishing lures that are indistinguishable from the real thing. Do you notice? I don't know if you've noticed, I know we get more and more of those every single day.
Leo Laporte [00:32:43]:
I got one the other day from a well-known company saying, "Your bill is overdue. Just click this link to pay it." Also, bad guys are using it to write malicious code. We're seeing malware crafted by AI. They're automating data extraction. One of the big problems that maybe you don't even see in your business is even the use of, you know, well-known SaaS AI apps can leak information. There were 1.3 million instances of Social Security numbers leaked to AI applications last year. ChatGPT and Microsoft Copilot, between the two of them, saw nearly 3.2 million data violations. So this is my pitch.
Leo Laporte [00:33:23]:
It's time for a modern approach with Zscaler Zero Trust AI. It does a bunch of things that are very important. For one thing, it removes your attack surface. It secures your data everywhere, on-prem and in the cloud. It safeguards your use of public and private AI, and it protects you against ransomware and AI-powered phishing attacks. Wow. Just check out what the director of security infrastructure at Zuora says about using Zscaler. AI provides tremendous opportunities, but it also brings tremendous security concerns when it comes.
Wesley Faulkner [00:33:58]:
To data privacy and data security.
Leo Laporte [00:33:59]:
The benefit of Zscaler with ZIA rolled out for us right now is giving us the insights of how our employees are using various GenAI tools. So ability to monitor the activity, make.
Wesley Faulkner [00:34:11]:
Sure that what we consider confidential and.
Leo Laporte [00:34:14]:
Sensitive information according to, you know, the company's data classification does not get fed into the public LLM models, etc. With Zero Trust plus AI, you can thrive in the AI era, stay ahead of the competition, and remain resilient even as threats and risks evolve. You should go right now, check it out, learn more at zscaler.com/security. zscaler.com/security. We thank them so much for their support of This Week in Tech. We have, uh, you know, it's funny, uh, when I was putting this show together, there weren't a lot of stories about AI, but there were a ton of stories about privacy. I'll start with one that kind of tells you something. Meta, according to Business Insider, apparently thinks we're too distracted to care about face recognition in Ray-Bans.
Leo Laporte [00:35:03]:
They say we're— they're going to add face recognition, that thing that Google refused to do way back when with Google Glass, to the Ray-Ban smart glasses, and they're going to do it soon, figuring, yeah, there's too many other things people are worried about. They're not going to notice. We noticed, Mr. Zuckerberg. We noticed. Uh, should we worry about face recognition in, uh, in, in these glasses? It seems to me this is the one thing I've been waiting for.
Stacey Higginbotham [00:35:33]:
To be clear, they said it wasn't you guys that they're worried about. They were saying that the consumer groups that fight against this, the rights groups, are busy fighting literally everything else because there is so much happening. So it's not consumers that they're worried about.
Leo Laporte [00:35:52]:
The New York Times viewed a document that said, quote, Meta's internal memo said the political tumult in the United States was good timing for the features release.
Thomas Germain [00:36:01]:
Yeah, I mean, it's one of the most cynical things I think I've ever seen from the tech industry.
Leo Laporte [00:36:06]:
And that is really saying a lot.
Stacey Higginbotham [00:36:09]:
No kidding. Yeah. If you read, I mean, Careless Whisper, that was literally, I mean, like, it did not surprise me at all. I was like, well, of course, do we not know that this is how they operate?
Leo Laporte [00:36:19]:
What's Careless Whisper?
Stacey Higginbotham [00:36:20]:
That was that. That was the book by, was it Frances? The woman who used to work there and like— Careless People?
Leo Laporte [00:36:27]:
Careless People, sorry. Careless Whisper was by Wham! That's a different song. Yeah, okay. I was curious because I thought that was like a code name. Oh yeah, Careless People blew the lid off of this.
Stacey Higginbotham [00:36:40]:
Wow. Well, you would think. But like, no, everybody kind of just shrugged. Um, yeah, so I will say, I'll sing it later. Um, I, I am terrified of this both as a woman, as a reporter, as a person who has been stalked. I, I just— like, everything about this is awful. But it's also something that I, I think Om and I had conversations about this when I was back at GigaOM. I mean, like in 2012.
Leo Laporte [00:37:15]:
It's been on the radar. Google, remember, Google said we're not gonna, we're not gonna do that because of that very reason. We could, although it's built into the Nests. I think it's built into the Nest cameras, so maybe they decided to change their mind.
Stacey Higginbotham [00:37:30]:
You can turn on facial recognition in your Ring camera. Sorry, you can opt out of it— Ring has Familiar Faces, so you can see it on your Ring camera.
Leo Laporte [00:37:44]:
You can see— but you understand that in order to do that, it's sending your face back to the home office, yes, comparing it against the database, and then sending that information back. It's not done on device, right?
Stacey Higginbotham [00:37:55]:
I, I have a lot of— like, one of the biggest things that I'm angry about and argue with people about all the time is end-to-end encryption on any sort of AI-powered service anywhere. Like the Kohler toilet, when it launched and was like, well, AI will analyze your poop.
Leo Laporte [00:38:12]:
I recognize that, but— oh, you're okay.
Stacey Higginbotham [00:38:14]:
And they were like, we have end-to-end encryption.
Leo Laporte [00:38:18]:
Absolutely. But end-to-end encryption, it takes on a whole new meaning on a toilet cam. I'm too serious for this. I'm sorry, I'm sorry, I'm getting— I'm being silly now, I apologize. Actually, this came up. Ring, remember, 2 weeks ago, last week, wasn't even 2 weeks ago, during the Super Bowl, had what many thought was kind of a dystopian ad showing how you could find lost doggies using Ring's new surveillance feature, which many people almost immediately pointed out, Search Party isn't for dogs. It's for, oh, I don't know, anything. And part of the problem was that Ring had a deal with Flock, and Flock has a deal with ICE. So it all kind of goes around and around.
Leo Laporte [00:39:11]:
You want accurate information on this, or? Yeah, Flock is the surveillance company that does license plates, ALPR, automated license plate recognition, automated license plate readers. And video surveillance. And, uh, anyway, Ring— which, the good— here's the good news. After the backlash, Ring said, oh, never mind, we're not going to do that deal with Flock.
Stacey Higginbotham [00:39:34]:
Okay, but Search Party still— those are two separate things. Search Party is still there. Yeah. Um, Ring is very accurate.
Wesley Faulkner [00:39:41]:
Oh, go on. Oh, sorry, I said they're also doing the deal with Taser too.
Stacey Higginbotham [00:39:45]:
Still, that's still going.
Leo Laporte [00:39:46]:
Yeah, they still have Taser. They're gonna build a taser into my doorbell?
Stacey Higginbotham [00:39:51]:
The— no, it's okay. Ring does a lot of third-party partnerships with law enforcement because they need chain of custody for some of the videos, and they need— so that's, that's part of— there's some nefarious fun things happening there. Um, so the deal with Flock, and the deal with Axon, the Taser and body cam maker, those are— Ring did cancel their deal with Flock. They still have the other deal.
Leo Laporte [00:40:20]:
Were they going to do license plate recognition?
Stacey Higginbotham [00:40:23]:
Uh, no, Flock was going to be able to access video feeds, like, from Ring. And I don't— I think Ring would give them the video and then Flock would run their own algorithm on it.
Thomas Germain [00:40:35]:
So I think this is one of those cases where, uh, you do want to— there's a famous quote, like, "ski down the slippery slope" here. The CEO of Ring says that his company's goal is to zero out crime, which, like, first of all, is kind of ridiculous. Like, I don't know what Ring cameras are going to do about white-collar crime.
Stacey Higginbotham [00:40:54]:
But like, crime, man, what are we doing?
Thomas Germain [00:40:56]:
The crimes we're talking about here? And unlike some of the stuff we were talking about in the first half of the show, right, uh, this is something that almost everyone I know who doesn't work at a gigantic tech company agrees on. Every single person I go up to, you know, at a party, I tell them about, you know, an investigation I've done about some privacy issue, they go— the response is always like, well, what's going to happen now? And the answer is nothing, because there's no rules about this stuff, right? People want laws on this issue so badly that they assume that they already exist, because it's like so clearly a thing that everyone agrees on. I mean, facial recognition— like, I'm not the first person to say this, but that is the end of any form of privacy in our society. It's over. It's done. As soon as that is just like a regular thing that, you know, a guy walking down the street can like hit a button on his glasses and identify who you are, it's over. And we're already there with all these cameras that are operating in public. If you see a security camera, you should assume that there's facial recognition software running on it.
Thomas Germain [00:42:02]:
And, you know, like maybe you want to get rid of all crime like Ring does. Maybe you're in favor of that, but I think it's really important that we talk about what exactly it is that we're trading to get there, especially when we look at what Meta is doing, adding facial recognition to the glasses, trying to do it in secret. So I think we can assume the goal is like, you get there and it's like, well, it's already here, everyone's already using it, we can't roll it back now. Like, it's just become a regular part of everyday life. It isn't too late if people aren't happy with this stuff to do something about it. And what we saw with the reaction to the Super Bowl ad is perfect evidence.
Leo Laporte [00:42:37]:
There is a constituency that wants this, though. There are— it's suburban people who want to, you know, look, if somebody's stealing boxes off my porch, I want to help law enforcement catch them. Or if somebody across the street does a home invade, there's a home invasion across the street, I want law enforcement to catch him. If my camera can help, I, I.
Stacey Higginbotham [00:43:01]:
Want it to help. So they think they want it. Oh, we'll let, we'll let Wesley go.
Wesley Faulkner [00:43:06]:
Because he's going to make the— I'm sorry, it's just that, yeah, with the— what, what is being created right now is a marketplace for locating people. Just— and once there's a marketplace, they will try to get as much data and sell it to as many people as they can. With the Meta glasses, you said someone will push a button and see where a person is. That's not— or who that person is.
Leo Laporte [00:43:36]:
That's— yeah, I want that, because when I run into you in about 5 years and I can't remember who you.
Wesley Faulkner [00:43:40]:
Are, but I want that. Meta letting you know who that person is, right? But there are— but Meta will know all the people in that whole shot, and they just will keep that data for themselves or sell it, and especially sell it, because now they say, hey, I know that you're in a Barnes & Noble and I know you just ran into your friend Tom, but now I know that Gina, Mike, and blah, blah, blah are in that Barnes & Noble. Hey, Barnes & Noble, would you like to know who's going to your store and when?
Leo Laporte [00:44:09]:
And that's the problem with the ring, isn't it?
Stacey Higginbotham [00:44:12]:
It catches everything that's going on regardless of your little puppy. They have it from your cell phone data. Like they know where I'm going at.
Wesley Faulkner [00:44:22]:
Any— So the visual isn't necessary there. But now Meta will have it. And right now Meta will sell it, and now they would try to get more of it, and they try to sell more of it. It's just like making another marketplace where your privacy is also sold, and then.
Leo Laporte [00:44:38]:
People can aggregate it. And it's also a surveillance net, right? It becomes— at some point, there, there.
Stacey Higginbotham [00:44:46]:
Is universal information about who's there when doing what. And I will say, so way back, Ring launched Neighbors I don't— I was podcasting from my old house, so I had Jamie Siminoff on the show, on the— on my IoT show way back in the day, um, talking about this. And I was like, hey man, what about racism? What about, you know, petty crime getting.
Leo Laporte [00:45:08]:
You know, people getting caught up?
Stacey Higginbotham [00:45:11]:
And neighbors really did become racist.
Leo Laporte [00:45:13]:
It was, it was incredibly racist.
Stacey Higginbotham [00:45:16]:
Yeah, walking down my street, you know. But he was, he was like We're going to have social shame for that. We're going to— those people are— exactly. So I want to point out, and I've even brought that— I installed Ring cameras for my in-laws, you know, years ago because they really wanted to catch porch pirates. That was a big— they lived in like a million-dollar neighborhood in Texas in a suburb. They were fine, but they were deeply concerned. And I brought up, you know, these concerns, and they were like, Stacy, that's not real. And I don't know if those people have changed their minds.
Stacey Higginbotham [00:45:50]:
I don't think that the idea of zero crime— I mean, it's been sold to us for decades as the reason for needing surveillance, and it has perpetually led us to criminalizing usually being poor.
Leo Laporte [00:46:06]:
Being, you know, Black or brown. It isn't really about zero crime, it's about.
Stacey Higginbotham [00:46:13]:
Control, in particular control of certain people. And also, I do wonder if our current political moment means that people are.
Leo Laporte [00:46:18]:
Going to push back much harder. I think so. I think we used to think it was— I used to think, oh, privacy, so what if, you know, a marketer gets my information? No big deal. Now you start to talk about, well, what if a federal, out-of-control federal law enforcement agency gets that information?
Thomas Germain [00:46:33]:
How do you feel about.
Wesley Faulkner [00:46:37]:
It now?
Leo Laporte [00:46:37]:
Yeah, I don't know.
Thomas Germain [00:46:39]:
Oh, go ahead. Sorry. No, please. I don't know anyone who is comfortable and confident in the future of the governments of the leading world powers, we've seen, no matter what side of the aisle you happen to be on. And I think that's what— yeah, a lot of people are okay with this. A lot of people, you tell them, well, they're handing the data to the cops, and they go, great, I love cops. But I think you have to use your imagination a little bit here about the amount of power that we are handing over. Not like, sure, the government can get it, but also these gigantic corporations that operate.
Leo Laporte [00:47:17]:
And have the power of nation states, right? Here's the evidence people are becoming aware of this. When the Nancy Guthrie story surfaced, one of the things they said is, well, we don't have, uh, you know, she had a Nest camera doorbell, but she didn't have a subscription. And if you don't have a subscription, it's not sending the photos to the cloud, the information to the cloud. I guess the bad guy took the camera or disabled it. No. So, well, wait a minute, hold on. Then it comes out 2 weeks later, oh, we got pictures. Uh, wait a minute, I thought you said that she didn't have a subscription.
Leo Laporte [00:47:51]:
Oh, well, uh, we were able to retrieve, uh, ex— uh, stuff that was like leftover, and now we have pictures. And what— so that— what was— I know, hold on. What was gratifying to me is that people immediately said, wait a minute, Google has access to those?
Stacey Higginbotham [00:48:09]:
And that was a shocker. Go ahead, Stacy. How did people not know that? That's how— the Nest cameras don't have local storage. That's not— no, but she didn't pay for it. I know, but they can't. If you have remote access to any— your audience knows this— but if you have remote access to any device on your home network, it goes to a cloud. That is literally how they work. So if you have a Nest camera and you're able to look at it, even if you don't have a subscription, if you can log into your camera remotely, you're going over the public
Thomas Germain [00:48:44]:
Internet and it has to hit a server somewhere. That's what upset people so much about this Ring ad in the Super Bowl, right, is they saw Ring activating all of these cameras throughout a neighborhood, and it's, oh, they're using my camera, the camera at my house. And this feature is on by default. So Ring decided that you're okay with this and they're going to use your camera for this feature unless you go turn it off, right? And I, I think Stacy makes a great point, right? We put these things in our house, like a camera. People put Ring cameras pointing inside their own home and you don't have control over that device. You don't know what it's doing. Now the company says, oh, we give you— we care about your privacy a lot. We give you all these controls.
Thomas Germain [00:49:29]:
You don't need to worry about it. It's like, okay, if you trust Amazon and you, you know, are very comfortable with all the things that could go wrong with this data, then fine.
Leo Laporte [00:49:41]:
But think about whether that's actually how you feel. The story they told us initially was that, well, she didn't have a subscription. And the deal with Nest is you can look at that camera, uh, live or for 3 hours after, but then it will be deleted. But 9 days later, Google says, oh, uh, well, we were able to get— we were able to recover residual data.
Stacey Higginbotham [00:50:07]:
In our backend systems. Yeah, I didn't— so I, I ended up explaining this to an AP reporter, and I was kind of— again, I was really surprised. And I likened it to, like, you know, when you delete your emails, it's not gone. You can pull it out of your trash. So whatever Google's retention policies are, you know. And yes, it was a pain for them to get to it and they wouldn't do it for, like, everybody. And, you know, some people are probably like, well, if my mom got, you.
Leo Laporte [00:50:36]:
Know, kidnapped from her home and this—.
Thomas Germain [00:50:40]:
Yeah, I'm glad they have the video.
Stacey Higginbotham [00:50:43]:
Sure.
Thomas Germain [00:50:43]:
But I think like I had the same reaction. Yeah, I saw that Ring ad and all people got upset about it, and.
Leo Laporte [00:50:49]:
I was like, yeah, duh.
Thomas Germain [00:50:50]:
Like, people are like, oh my God, I'm part of a giant surveillance network.
Leo Laporte [00:50:53]:
Like, that's what this thing is for. I remember, Stacy, when this came out, we— you were on This Week in Google. We were talking about this whole, uh, what do they call this, this LoRa network that they have, that they've created with all the— Sidewalk? Sidewalk. Yeah, this is Amazon, uh, Sidewalk. All the Amazon devices work together. And they even then at the time said it'll help you find your lost dog, which at the time seemed pretty.
Stacey Higginbotham [00:51:21]:
Anodyne, but maybe not.
Leo Laporte [00:51:23]:
That's a radio network. It's very different. Well, but yeah, well, they could, but I guess they could share somehow.
Stacey Higginbotham [00:51:31]:
They're sharing video, right? No, no, it's not over Sidewalk. Amazon Sidewalk is literally just a— it's a wireless radio network.
Leo Laporte [00:51:39]:
It is a long-range wireless.
Stacey Higginbotham [00:51:39]:
So it's not related to this at all. No, not at all. I mean, there's data that could be— okay, LoRa's the least of your problems.
Wesley Faulkner [00:51:50]:
I mean, you can do however you want. Think of it as like Meshtastic. It's just a— it's a mesh. Yeah, I understand. Yeah, it's just only radio. So yeah, so if the dog had a tag that had a radio transmitter.
Thomas Germain [00:52:00]:
Then you would be able to find it.
Wesley Faulkner [00:52:03]:
We gotta get our dogs on Wi-Fi. Yeah, but the thing about— also, someone brought up the example: if you, if you had a stalker who knew what your dog looked like, they could say, I lost my dog, and then activate this network to find the person that they're stalking as well. Wow.
Leo Laporte [00:52:21]:
So people could use this network for nefarious purposes. I just, I was very gratified that.
Thomas Germain [00:52:27]:
People went, oh, that's not good. Yeah. Yeah. Everybody seems to kind of be on.
Leo Laporte [00:52:32]:
The same page on this one. Yeah. But, you know, it's interesting. So I, probably not paying close attention, figured, oh, well, they canceled that Flock Safety. In fact, that's the headline in The Verge: Ring cancels its partnership with Flock Safety after surveillance backlash.
Stacey Higginbotham [00:52:47]:
It had nothing to do with this. Right, right.
Leo Laporte [00:52:51]:
They just— everybody got upset at the same time.
Thomas Germain [00:52:56]:
Yeah.
Stacey Higginbotham [00:52:56]:
So they're still doing the, uh, find your lost dog. You should know how it works. So Search Party— basically, Ring has fed photos of dogs into an AI that only recognizes dogs. And Ring is very clear that this only recognizes dogs. Okay. What I need people to recognize and know when they hear that and they're like, oh, it's fine.
Wesley Faulkner [00:53:20]:
That's great.
Stacey Higginbotham [00:53:20]:
Ring can update that algorithm at any point in time for facial recognition. Thing is, they have the data. They have the control and ability to update your camera. You are opted into this automatically unless you turn it off. So just— and that's how they do this. They, they add these little features and you get used to them and you're like, oh, this is kind of nice. So neat. Um, or you don't think much of it and they, they are like, oh, it's only dogs.
Stacey Higginbotham [00:53:46]:
Don't worry. And then, then it'll be like lost.
Leo Laporte [00:53:48]:
Children and you're like, oh, that's so good. Oh, that's good. We gotta— yeah, actually, yeah, let's add.
Wesley Faulkner [00:53:52]:
Lost children to the search party. I also want to add that Amazon just keeps firing huge chunks of people, which is building an atmosphere of fear among the people who are still there. And so where maybe in the past someone would raise their hand saying, "This is a really bad idea. We shouldn't do this," there is less incentive for people to do that. And so I think even though there was a quick backlash and outrage over that Super Bowl commercial, I'm sure, deep down in people's, like, hearts before this launch, they had those same feelings internally, but they knew they couldn't tell the head of Ring that you shouldn't do this. No one's gonna raise their hand to make sure that that doesn't happen internally. So you'll see these, um, aggressive— maybe, um, I don't want to say poorly thought out, but very, uh, let's just say one-perspective takes on why these things are good hit the market more now, from not just Amazon, um, all the divisions of these large
Leo Laporte [00:55:03]:
Tech companies that are doing these mass layoffs. So, uh, when we bought our house, there was a Ring camera, uh, on the front door, but I pointed it away from the street so it only sees— it doesn't see anything but our property, not anything but our front porch, and every other camera I have is local. They're Ubiquiti cameras. They don't go to the cloud. Is that okay, Stacy? Am I okay? I mean, I want to get rid of the Ring camera, but it, it is a big hole in the wall where it goes, and there's nothing else I can put there.
Stacey Higginbotham [00:55:40]:
I'd have to do— anyway, do you want a camera?
Leo Laporte [00:55:42]:
Yeah, do you want a camera for your doorbell? I like having a camera on my front porch, yeah. I like to know. You know why? 'Cause the cat has figured out how to ring the doorbell. The cat jumps up on the wall, walks up to the camera, and it rings. She doesn't push the button, she isn't that good, but she moves her head around and it rings. And then I know she's at the door and I can look at my.
Stacey Higginbotham [00:56:04]:
Watch and see if it's her or a delivery person. Do you open the door when it's your cat? Yeah. Oh, okay, no, I didn't know if.
Leo Laporte [00:56:11]:
The cat had trained you like, I wanna come in. She has trained us to answer the door. Or, or maybe we trained her to ring the bell.
Thomas Germain [00:56:18]:
I don't know which it is, but.
Leo Laporte [00:56:22]:
Your cat is on Wi-Fi. That's good. That's one of the reasons I don't want to get them phones. The chime thing, it's the chime thing, you know, chimes throughout the house when.
Stacey Higginbotham [00:56:32]:
Somebody walks up the— you know, if— so there are ways to— like, we have a whole pamphlet I did on bystander privacy for these devices, limiting the range of like actual camera vision is important.
Leo Laporte [00:56:45]:
So if you don't have a facing— because I want to know.
Stacey Higginbotham [00:56:50]:
If it's the cat or Burke. So, um, and if your local cameras, if you are not accessing them remotely— again, if your Ubiquiti cameras are going.
Leo Laporte [00:57:01]:
To a server somewhere— no, it's my.
Stacey Higginbotham [00:57:02]:
Server, it's in my house.
Leo Laporte [00:57:03]:
If it's your server that you host, then you're fine. Yeah, the main reason I did that was not for any privacy thing, but I didn't want to wait— I need all the bandwidth to do these shows. Those cameras uploading to the internet use a lot of bandwidth. If you have 3 or 4 cameras uploading to the internet, it's killing your internet bandwidth. So I didn't want to do that. I only wanted them to save locally. Yeah, so there's— so it's just selfish.
Leo Laporte [00:57:28]:
It wasn't— it had— and I don't care about people. Uh, you wrote a— you wrote a story, Thomas, TikTok is tracking you. Oh, that's good. Even if you don't use the app? Yeah. So don't have the app on your.
Thomas Germain [00:57:43]:
Phone is the answer. Whether or not you use the app, if you've never watched a TikTok video in your entire life, TikTok is tracking what people are doing across parts of the internet that have nothing to do with TikTok. And it's worth mentioning, so are a lot of other gigantic tech companies. This is another story, like similar to the Ring thing, where it's like, this has been going on for a while. What's happening here is tech companies that have advertising platforms— some of your listeners might know this— have tools called pixels. It's like literally like an invisible 1-pixel image that you put in the background of a website, and when it loads.
Leo Laporte [00:58:18]:
It sends data to the company that made it. So yeah, just parenthetically, when we put— so we were gonna put you on the show, the BBC called us and said, could you put a tracking pixel on the show so we could see how many people saw Thomas? Of course, yeah. Which we said no to, by the way, right? Yeah, but that's very common. People ask for that all the time, not.
Thomas Germain [00:58:40]:
Just for you but in general. Um, and you know, you visit a website and every single person who goes there— like if you advertise on TikTok, you put one of these things on your site and TikTok gets data, but it's scooping up, you know, anybody who visits, whether or not you use TikTok, no.
Leo Laporte [00:58:55]:
Matter what the content of the website is. So is it like the Facebook like.
Thomas Germain [00:59:00]:
Button, kind of similar? Yeah. But invisible. Yeah, it's invisible. They've been around for years. The difference here is that TikTok just updated its pixel to make it way more useful for advertisers. Essentially, they're sharing more data with the people who use the pixel, and the implication is that's gonna make TikTok advertising more effective, more appealing to advertisers, and we're gonna start seeing it in more places, and they're gonna start getting even more of your data. Even if you don't like TikTok, you don't trust them, you don't want to be involved, they are.
Leo Laporte [00:59:27]:
Getting your personal information. What advertisers don't realize, of course, though, is this is creating a— what Cory Doctorow called the largest consumer boycott in history— the use of ad blockers and pixel blockers, which not only stops the TikTok pixel but also stops ads. So they're shooting themselves in the foot by being so aggressive about invading our privacy.
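For the curious, the mechanism Thomas describes is simple enough to sketch. This is an illustrative toy, not TikTok's actual code, and every name in it is made up: the "pixel" itself is just a tiny transparent image, and the tracking value is in the HTTP request the browser makes to fetch it.

```python
# Sketch of how a tracking pixel works (illustrative; not any company's real code).
# The "pixel" is a 43-byte transparent 1x1 GIF; what matters to the tracker is
# the request that fetches it: cookie, referring page, and browser details.

TRANSPARENT_GIF = (
    b"GIF89a\x01\x00\x01\x00\x80\x00\x00\x00\x00\x00\xff\xff\xff"
    b"!\xf9\x04\x01\x00\x00\x00\x00,\x00\x00\x00\x00\x01\x00\x01\x00"
    b"\x00\x02\x02D\x01\x00;"
)

def log_pixel_request(page_url: str, referer: str, user_agent: str,
                      cookie: str) -> dict:
    """Record what a tracker learns from a single pixel load."""
    return {
        "page": page_url,        # the site that embedded the pixel
        "came_from": referer,    # where the visitor was before
        "browser": user_agent,   # one ingredient of a fingerprint
        "visitor_id": cookie,    # ties this visit to earlier visits
    }

# One page view on a shop that embedded the pixel (hypothetical URLs):
record = log_pixel_request(
    "https://example-shop.test/checkout",
    "https://search.test/?q=shoes",
    "Mozilla/5.0",
    "uid=abc123",
)
print(record["visitor_id"])  # the tracker links this view to visitor uid=abc123
```

Blocking the image request, as discussed below with content blockers, means none of that record ever gets created.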
Stacey Higginbotham [00:59:50]:
They're actually going to hurt themselves in the long run. So I just want to say that this doesn't just happen on the web itself, right?
Leo Laporte [00:59:59]:
This can be— they put image pickle— image pickles, good Lord, I.
Stacey Higginbotham [01:00:08]:
Would love an image pickle right now. Pixels into like emails. That was how, uh, Superhuman tracked like opens and things like that. So just to be aware that this.
Leo Laporte [01:00:19]:
Is ubiquitous, like truly on the internet. But am I wrong in thinking that if I run, um, EFF has a content blocker, I use uBlock Origin. If I run one of those content blockers, that stops those pixels, the beacons don't get sent back.
Thomas Germain [01:00:34]:
Isn't that right? Or am I wrong? But so that was part of the reason that I wrote the article is like, there are actually, there's super easy things you can do to stop this kind of tracking, right? You can use a more private browser.
Leo Laporte [01:00:45]:
Most people are on Chrome, right? Chrome by default.
Thomas Germain [01:00:47]:
I wonder why Chrome doesn't block, right? Uh, you can use a privacy, like a, like a tracker blocker.
Leo Laporte [01:00:56]:
DuckDuckGo makes one.
Thomas Germain [01:00:58]:
There's the EFF Privacy Badger. That'll stop this. But there are other kinds of data collection, right? They're collecting data from apps in a similar way. Uh, and there's also, you know, data that they're sending, you know, directly from the company server, not from your web browser, not using your computer to pass the information on. There's not a lot you can do about that. But my point with this stuff is always, you know, what makes you so vulnerable to privacy intrusions is that they're putting together data from so many different sources. And any steps you can take to like limit— okay, well, you're not getting this— does make an enormous difference. So I, you know, I'm always encouraging people, just like, it's not a lost cause. Just put in a little effort whenever you can, and you can.
Leo Laporte [01:01:42]:
You know, make a meaningful difference with this particular problem.
Stacey Higginbotham [01:01:46]:
Do something anyway. Do something. Yeah, you're not— so the recommendations are ad blockers, not loading images in your email by default.
Leo Laporte [01:01:56]:
Yeah, I don't use HTML email at all.
Stacey Higginbotham [01:01:59]:
I use plain text for that very reason. There's no— what, what else, Thomas?
Thomas Germain [01:02:03]:
What else should.
Leo Laporte [01:02:08]:
We be doing?
Stacey Higginbotham [01:02:09]:
Pi-holes.
Wesley Faulkner [01:02:09]:
What's that?
Leo Laporte [01:02:09]:
Oh, Pi-hole. Pi-hole is good if you've got.
Thomas Germain [01:02:11]:
An image pickle, put it in your pie hole. Set up at his house, uh, and when I go home and visit my parents, like, the internet just like does not work, right? There's like all these websites, especially because I write about ad tech all the time. That's the downside, trying to go to the companies.
Leo Laporte [01:02:26]:
Yes, go to their web pages. I can't visit them because I, I get blocked all the time because I have NextDNS running, which is basically a third-party Pi-hole.
Thomas Germain [01:02:34]:
Of course, all my information is going to them now, but a DNS blocker is really going to stop a lot of this if you, if you have the stomach for it. If you have the— like, it doesn't take a ton of technical expertise. It'll break some things sometimes, uh, but you can— a lot of them are designed to be consumer-friendly. Use a better browser. If you're not going to do that, use a tracker blocker. Set up a DNS tool. Or if you're really crazy, you can go for the Pi-hole. I even— I can't, can't take it to that level.
Leo Laporte [01:03:05]:
But, you know, again, there are things you can do here. I would venture that a majority of our listeners, people listening to the show, are using some sort of ad blocking. I mean, it's become, you know, if you're at all technically sophisticated, uh, it's.
Thomas Germain [01:03:19]:
Become just de rigueur. But not every ad blocker works for this, right? Uh, you know, DuckDuckGo has a really good chart you can go check out. Some ad blockers will block some trackers. A lot of them just block ads. They're really not designed for this purpose. If you— if this is what you want, if this is what you're concerned about, an ad blocker will help, but you really want to get a tool that is specifically designed for this purpose. That's what's going to block the most.
Leo Laporte [01:03:48]:
Trackers and have the biggest effect. uBlock Origin, that's the one that we recommend. Let me see what DuckDuckGo says on their, on their chart here. Yeah. Oh, that's interesting. uBlock Origin does not protect against referrer tracking, which, which is wild. That lets companies identify the site that you visited, where you click the link to go to their page. They can see where, where the referrer came from.
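Referrer tracking, for context, rides on a header the browser sends automatically: when you click a link, the destination can see the URL you came from. Browsers trim that header according to a referrer policy. This is a simplified model of three common policies, not a browser implementation, and the URLs are hypothetical:

```python
# Simplified model of what the Referer header reveals under common referrer
# policies (illustrative; real browser behavior has more cases and policies).
from urllib.parse import urlsplit

def referer_sent(from_url: str, to_url: str, policy: str) -> str:
    src, dst = urlsplit(from_url), urlsplit(to_url)
    origin = f"{src.scheme}://{src.netloc}/"
    full = from_url.split("#")[0]  # fragments are never sent
    if policy == "no-referrer":
        return ""                  # send nothing at all
    if policy == "origin":
        return origin              # only scheme + host, never path or query
    if policy == "strict-origin-when-cross-origin":  # the modern default
        if (src.scheme, src.netloc) == (dst.scheme, dst.netloc):
            return full            # same origin: full URL
        if src.scheme == "https" and dst.scheme == "http":
            return ""              # never leak from HTTPS down to HTTP
        return origin              # cross-origin: origin only
    raise ValueError(f"unmodeled policy: {policy}")

# A search query leaks in the path/query unless the policy trims it:
print(referer_sent("https://site.test/search?q=private",
                   "https://ads.test/", "origin"))  # only "https://site.test/"
```

This is why a blocker that only removes ads can still leave the referrer intact: stripping or trimming it is a separate behavior.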
Leo Laporte [01:04:13]:
Fingerprint tracking is a big issue. We've talked a lot about that on Security Now. Yeah, you know, I'm not as protected.
Thomas Germain [01:04:19]:
As I thought I was, right? And I mean, part of the reason is a lot of these tools will break websites, right? Like there's certain functionality that if you're really worried about your privacy, it's worth doing, but it'll mess things up. And then, you know, it's not that hard. You can just go, okay, like unblock this one particular website or like, you know, turn off the tracking tools for a minute. It does make things a little bit more annoying. So there's like a usability trade-off that you need to make here, but that, you know, like, it's not like you're going to be permanently messed up or disempowered here. Like, if you're on a website, it's.
Leo Laporte [01:04:54]:
Not working, you have one of these.
Wesley Faulkner [01:04:57]:
Tools on, turn it off. The easiest thing I do is, is liberal, uh, use of private browsing. I just— I have a browser that has no, um, no plugins, no extensions on it, and I set it to always.
Leo Laporte [01:05:13]:
Firefox on mobile, I use it the most.
Wesley Faulkner [01:05:16]:
And you set the strict tracking? I set it to— I just use the privacy mode, which means that it never has cookies, it never has me signed into anything. Firefox has a great mobile app. When I close it, everything gets deleted. I launch it and it's— I start with— it's like an ephemeral browser where I start fresh every single time. Now, if I pair that with a.
Leo Laporte [01:05:37]:
VPN, then I think that that's probably going to Yeah, and you can't use.
Thomas Germain [01:05:42]:
Any sites ever again because of it.
Leo Laporte [01:05:45]:
Stop using the internet, that's really— Cover Your Tracks from EFF is a very interesting way to see if your browser protects you against fingerprinting, which is the ability to figure out who you are based on asking your browser, well, what's the screen resolution, what plugins does it have, you know, a whole variety of things that allows them basically to narrow you down so much that they effectively know who you are. And some browsers do a good job of this, some do not. What you want is your browser has a unique fingerprint. As you can see, I'm using a Firefox browser. That's what you want, a unique fingerprint so that you don't look like— oh no, that's the opposite. I don't want that. I don't want that. That means they know who I am.
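The Cover Your Tracks result Leo reads out works because seemingly innocuous attributes combine into a near-unique identifier. Here's a toy illustration with hypothetical attributes, not EFF's actual methodology: hash the attributes together, and two visits from the same configuration produce the same ID, while changing any one attribute produces a different one.

```python
# Toy browser fingerprint: hash a handful of attributes any site can query.
# Illustrative only; real fingerprinting combines many more signals
# (canvas rendering, installed fonts, audio stack, and so on).
import hashlib
import json

def fingerprint(attrs: dict) -> str:
    canon = json.dumps(attrs, sort_keys=True)  # stable key ordering
    return hashlib.sha256(canon.encode()).hexdigest()[:16]

visit_a = {"user_agent": "Firefox/130", "screen": "2560x1440",
           "timezone": "America/Los_Angeles", "languages": "en-US"}
visit_b = dict(visit_a)                       # same machine, later visit
visit_c = dict(visit_a, screen="1920x1080")   # one attribute differs

print(fingerprint(visit_a) == fingerprint(visit_b))  # True: re-identified
print(fingerprint(visit_a) == fingerprint(visit_c))  # False: different ID
```

No cookie is involved, which is why clearing cookies or private browsing alone doesn't defeat it; browsers that resist fingerprinting do it by making these attributes less distinctive.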
Leo Laporte [01:06:29]:
Gosh darn it, you're in trouble. I'm in trouble. It means my browser fingerprint is unique among the 308,000 tested in the last 45 days. I'm the one and only. So they know it's me. Yeah, Safari does a good job of this, by the way. I should be using Safari, but I'm, but I'm not. Um, all right, let's pause for, uh, an ad, uh, and then, uh, we will come back and talk about other issues, age verification and security in general.
Leo Laporte [01:07:00]:
Stacy, if you were still covering, uh, home automation or home, home, uh, tools like Jennifer Pattison Tuohy is, you would be thrilled to see what you can do with the DJI Romo RoboVac. A guy actually was able to— he wanted to joystick control his RoboVac, right, and accidentally got it so he could.
Stacey Higginbotham [01:07:24]:
Joystick control all of them. You know, I work on cybersecurity policy. That is like literally my job. That's kind of a problem.
Leo Laporte [01:07:30]:
So yes, I did, I did see this. A little bit of a problem. We'll talk about it.
Wesley Faulkner [01:07:37]:
It must have been fun though, for a moment.
Leo Laporte [01:07:40]:
He really cleaned up. He cleaned up.
Thomas Germain [01:07:41]:
I'm going.
Leo Laporte [01:07:44]:
To clean every house in the world. Uh, we have a great panel. So glad to have you, Wes Faulkner. It's good to see you.
Stacey Higginbotham [01:07:52]:
Works-not-working.com.
Wesley Faulkner [01:07:52]:
It's not working yet, but it will be working. Yes. So if you have like some bones you could throw my way, check out the demo. Let me know if this is something that you want to support.
Leo Laporte [01:08:01]:
Please just hit the contribute button.
Wesley Faulkner [01:08:04]:
What is the— who's it for? It's for people who are in this economy, especially who currently have a job, but may feel like they need techniques to survive it. Uh, their toxic work environment, uh, weird bias issues, navigating like egos and power structures, uh, just as a way to figure out when you— it's a community for people to figure out how to survive this time without losing your job or also without losing your sanity. So, this is just a place where you can get that support and be able to have real talk, because a lot of work advice, a lot of places, like if you read a book or self-help, it just assumes that everyone's working for a company that has your best interests at heart. Just talk to HR or just talk to your manager and say, hey, this is too much work or, hey, I'm having this sort of problem. Yeah, those tips do not help. They do not work.
Leo Laporte [01:09:02]:
And so this is like a real site for real people. Works as in work is not working. Works apostrophe is not working. Um, yeah, nowadays you can't just quit your job. You need the healthcare. It's not necessarily another job waiting for you. It— I think a lot of times all you can do is make the best of what you got. So that's great.
Leo Laporte [01:09:21]:
Yeah, I'm glad you're doing Works Not Working. Stacey Higginbotham's here. She's a policy fellow at Consumer Reports. And.
Stacey Higginbotham [01:09:32]:
Do you still write a little bit? A very small amount. I write for CR's stuff sometimes.
Leo Laporte [01:09:40]:
And then, yeah, I probably should freelance more. Well, one of the things she does freelance, for free I might add, is Stacy's book club. We had a wonderful book club a couple of weeks ago.
Thomas Germain [01:09:50]:
We're doing.
Leo Laporte [01:09:52]:
Another one soon.
Stacey Higginbotham [01:09:53]:
We've picked a book.
Leo Laporte [01:09:54]:
What's the book? A Psalm for the Wild-Built. Okay, it's another Becky Chambers. It's very thin. And oh, good. I like that. There's a chance I'll finish it this time. Good. All right.
Leo Laporte [01:10:07]:
We haven't set a date for the next one, but it'll be in a month or so. And read it and enjoy it.
Stacey Higginbotham [01:10:15]:
Also with us, Thomas. It's a feel-good book.
Leo Laporte [01:10:16]:
I want to tell people because we've had some— Yeah, we had some dystopian. Yeah. A feel-good book. Actually, Briggs in our Discord chat says, "Loved that book." So thank you, Briggs. Okay, I will read it. Also with us, brand new, it's great to have him, Thomas Germain. His new podcast launched last week, The Interface. He also writes the Keeping Tabs column for the BBC, and that's where that TikTok story is.
Leo Laporte [01:10:40]:
If you want to figure out how to, how to block the TikTok pixel, he's got the, the ways and means. Thank you for being here, Thomas. Great to have you. Our show today brought to you by Monarch. I, I love Monarch. I've been using it for more than a year now, and it's been a huge benefit for me. Maybe you made a New Year resolution to start to think about your finances, maybe get things in order, start planning for the future, getting married, buying a house, retirement, going to college, whatever it is. Maybe this is the year you pay off, uh, all of those credit cards or start saving for the kids' college fund.
Leo Laporte [01:11:17]:
Wouldn't it be nice to have a tool that helps you plan, protect, and proactively achieve that goal? I love Monarch because it's a second-generation money tool. The guy who started Monarch had worked for the other big— you probably used it, everybody did, big name finance app that went away. He said, "This is an opportunity to do it again and do it right." And it is, it's so great. Set yourself up for financial success this year. Monarch is the all-in-one personal finance tool designed to make your life easier. It brings your entire financial life— budgeting, accounts, and investments, net worth, and future planning— together in one dashboard on your phone or laptop. Feel aware and in control of your finances this year and get 50% off your Monarch subscription with code TWIT. It's not your typical personal finance app.
Leo Laporte [01:12:08]:
Unlike the other guys, it's built to make you kind of proactive. Of course, it makes it easier than ever to track your money. Monarch's most popular features include beautiful data visualizations. You know what they do that I love? Have you ever seen those Sankey diagrams where it shows, you know, income and then, you know, it dwindles into taxes, childcare, and then you get this little thing called profit, that kind of— there's— it does those, which I love, but it also does, if you like, more traditional pie charts or line charts or bar charts. The charts are great. It also will do investment tracking. And so you'll get, you know, if you're investing with a broker, they may have graphs of their own, but let me tell you, the visuals from Monarch are fantastic, a real picture of your portfolio performance. You can get it in relation to other stocks or the S&P 500.
Leo Laporte [01:12:52]:
So you get a really good idea of how you're doing. Try not to check it too often. I do every day, unfortunately. Individual and partner filters are great. So one of the things couples often fight about is money. It's probably the number one subject. Being able to share this information and have the actual facts in front of you is fantastic. At no additional cost, you can share with your partner.
Leo Laporte [01:13:15]:
And I have just started sharing it with my financial advisor as well. So that's cool. No extra charge. And that way I don't have to bring a lot of paperwork in. I can sit down, she can look at it, she can say, okay, let's work on this. You can view your assets either individually or together. It also gets really great results. Now, I'll say anecdotally, it's— it saved me a lot of money.
Leo Laporte [01:13:37]:
But this is what Monarch users reported. They did a survey last year. Monarch helped users save over $200 a month on average after joining. So it pays for itself. More than 8 out of 10 members, 80%, feel more in control of their finances with Monarch. 80% say Monarch gives them a clearer picture of where their money is going. And I would say absolutely. That's why I love it.
Leo Laporte [01:14:00]:
Plus, it's easy. It's really easy to set up. Set yourself up for financial success in 2026 with Monarch, the all-in-one tool that makes proactive money management simple all year long. Use the code TWIT at monarch.com for half off your first year, 50% off your first year. Monarch.com with code TWIT. M-O-N-A-R-C-H, monarch.com. Take a look. I think you'll agree.
Leo Laporte [01:14:27]:
It's, it's absolutely fantastic. So there's been a big kerfuffle in our own Club Twit group because we use Discord. That's where a lot of our listeners are hanging out. And Discord has announced that— and it's not just going to be Discord folks, it's going to be everybody— that they're going to have to do some sort of age verification. This is, I think, partly because of the UK. Partly because of Australia. And I think it's going to happen, you know, in the rest of the world as well. Our audience was really concerned.
Leo Laporte [01:15:03]:
We're going to have to give government ID to be in the club. And I said, you know, I feel for you. I don't know. We would probably move. We'd have to. I don't want to force my, my good friends to have to do that. On the other hand, Discord is historically useful. They say now— I don't know; this 9to5Mac story says Discord backtracks on age verification rollout, sort of.
Leo Laporte [01:15:35]:
They say that the vast majority of users will be able to continue without ever going through age verification. On the other hand, if you do have to do age verification, uh, they may be using a video technology Peter Thiel has backed, which is an issue for some. But here's the good news: it isn't so hard to get by this stuff. Uh, here's a page: Discord, Twitch, Kick, Snapchat age verifier, and how you can get around it. Because it turns out the way these things work, it isn't so difficult. Now you'd have to be fairly sophisticated to get through this, but it's called K-ID, and it's a cat and mouse, right? The website that did it, I guess it's kind of hackers, is age-verifier.kibty.town. Um, it worked and then it didn't work, and then it worked and then it didn't work, because it's a cat and mouse.
Wesley Faulkner [01:16:55]:
Game with the identifiers, the K-ID provider. The thing is that Discord doesn't have.
Leo Laporte [01:16:59]:
To do this and they're just doing it.
Wesley Faulkner [01:17:01]:
They don't have to do it yet. I think they're being proactive in doing this. It's causing more problems in terms of their reputation now, even though it hasn't fully been deployed. There's a parent group here where email wasn't working and people wanted to get off Facebook, so someone spun up a Discord server saying, "Let's just use this. Let's communicate here." Then someone says, "I'm absolutely not joining this because I don't want to give up my biometric data." That was their response. This was, like, a parent group. So if it's hitting the parents, I think that this is going to be socialized in a negative way, um, that's gonna be worse. It's gonna be like the Anthropic commercials during the Super Bowl, where people are going to just have a bad taste in their.
Leo Laporte [01:17:50]:
Mouth and associate that with Discord. Meta says— well, all the Meta properties are going to do this, but they are saying we can probably guess your age based on the data you've posted, things like happy birthday messages. And they won't ask you to verify unless all of their inference says, "No, you're young."
Stacey Higginbotham [01:18:08]:
And then they're going to do— Everyone start capitalizing and using proper punctuation in all of.
Leo Laporte [01:18:20]:
Your messages. They'll think you're so old. If they can't guess your age or they think you're too young, it'll ask you to verify your age using facial age estimation, which is— I think that's that thing where you— they say, okay, look in the camera, now turn your head. Okay, facial recognition. Yeah, face recognition. That's another way to put it. Yeah. Or uploading a government ID, which most young people, you know, if you're that.
Thomas Germain [01:18:43]:
Young, probably don't have. Yeah, this is something I've been writing about for a couple years now that a lot of experts that I talk to say is turning into a complete disaster. It's one of those things where it started with porn, right? Porn is always the bellwether for what's going to happen with the rest of society, particularly in the realm of technology. Uh, and it's, it's, well, you know, well-intentioned. We all want to protect children. This is great. Like, the internet is a terrible place to be a kid, everyone can agree. But countries and states are rolling out these laws in ways that are just not thoughtful, right? Like, it's not a good state of affairs when you have to constantly show your government ID or upload biometric information everywhere you go on the internet.
Thomas Germain [01:19:30]:
For one, it's going to have a chilling effect on speech, right? Like, the things that I'm looking up or doing online, I might not do them if I have to hand someone a copy of my ID. And there's a solution that's so much better, right? The argument that critics of this stuff say is that you can upload— perhaps like Apple could build a system into.
Leo Laporte [01:19:51]:
Your phone where you upload a copy of your ID.
Thomas Germain [01:19:54]:
Apple likely already knows how old you are already. Right. So they could build a system where, if you're using an app or if you're on a website, it asks.
Leo Laporte [01:20:03]:
Your device to verify your ID. Believe it or not, they already have an API to do that.
Thomas Germain [01:20:07]:
Right.
Leo Laporte [01:20:07]:
And in which they chunk people up into age groups, and there's an API that an app can ask, which age group is this person in? They haven't turned it on yet. I understand Apple doesn't want to do this. Meta wants Apple to do this. I think this is the only reasonable way; these are gatekeepers. If you're a gatekeeper, if you're Google Android or Apple iOS.
Thomas Germain [01:20:26]:
Hey, you're the gatekeeper, you know the age. It feels like common sense. I think Google has come out and.
Leo Laporte [01:20:33]:
Said like, we're actually game, like we're.
Thomas Germain [01:20:35]:
Willing to do this in Android. Okay, yeah, it would require lawmakers to like set up the regulation so that this is an option. And like, you know, we have to create a whole system for the whole internet to use. But the way that we're doing it is like, you know, a bunch of guys who don't know anything about how technology works putting out these regulations, rushing them out in order to make it seem like lawmakers are doing something to protect children, which is a good thing that everyone can agree on. But it, you know, I think we're rushing ahead with something without thinking about what the consequences might be. And that is you showing your ID every time you visit a website, every time you log in somewhere.
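The gatekeeper scheme the panel keeps circling back to can be sketched abstractly. Everything below is hypothetical, including the function names and the HMAC standing in for real hardware-backed attestation: the idea is just that the OS, which already knows the birthdate, answers a yes/no bucket question and signs the answer, so the app never sees the date itself.

```python
# Sketch of a privacy-preserving age-range check (hypothetical design; an HMAC
# stands in for a real device-attestation signature from a secure enclave).
import hmac
import hashlib
from datetime import date

DEVICE_KEY = b"secret-device-key"  # in reality, held in secure hardware

def attest_age_bucket(birthdate: date, bucket: str, today: date) -> dict:
    """OS side: answer 'is the user in this bucket?' without exposing the date."""
    age = today.year - birthdate.year - (
        (today.month, today.day) < (birthdate.month, birthdate.day))
    answer = {"13-17": 13 <= age <= 17, "18+": age >= 18}[bucket]
    payload = f"{bucket}:{answer}".encode()
    sig = hmac.new(DEVICE_KEY, payload, hashlib.sha256).hexdigest()
    # Note: the response carries no birthdate and no exact age.
    return {"bucket": bucket, "answer": answer, "sig": sig}

# App side: ask the question, check the signature, learn only yes/no.
resp = attest_age_bucket(date(1990, 6, 1), "18+", today=date(2026, 2, 15))
expected = hmac.new(DEVICE_KEY, f"{resp['bucket']}:{resp['answer']}".encode(),
                    hashlib.sha256).hexdigest()
print(resp["answer"], hmac.compare_digest(resp["sig"], expected))
```

The design point is the one Thomas makes later about Louisiana's system: the verifier learns "over 18, yes or no," never the identity document behind the answer.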
Wesley Faulkner [01:21:17]:
—Or at least when you create your account. What would be more effective is to have internet education in schools with also, like, how do you know what's a deepfake? How do you protect yourself from predators? How do you not get stalked? How do you not put things in.
Leo Laporte [01:21:33]:
Terms of—.
Wesley Faulkner [01:21:36]:
Like, how to set up a Pi-hole? How do you escape tracking pickles? All of these things could be taught in school. To help with education, to make— so there'll be fewer victims of some of the things that they're trying to protect kids from. Because at the same time, they're cutting education, they're cutting sex ed, they're censoring the history, they're protecting pedophiles. There's all this stuff that's happening. They're seeing the outcomes of their lack of education of people and kids in the system, and just dealing with the symptom.
Thomas Germain [01:22:15]:
Instead of the actual real root. It also gets back to the conversation we were having about Section 230 and free speech, because there are— there's a lot of evidence that this is like an underhanded way for the government to regulate speech, right? So we were talking— like I was talking about pornography here. What counts as porn? Well, in one particular state— I'm blanking, so I, I have a guess, but I don't want to guess and be wrong— but in one particular state, their age verification law said that among the things that count as pornography are acts of homosexuality, even though they've given a long list of every sexual act you could ever, you know, participate in, no matter, you know, what gender of person you're doing it with. They specifically called that out. So there's a lot, you know, and the Heritage Foundation is talking openly about how we want age verification because it will help us limit what information people have access to about things like abortion or homosexuality or, you know, the transgender movement. Again, it just feels like we're rushing.
Leo Laporte [01:23:15]:
These things out without thinking about what could go wrong. And if you're thinking, well, maybe Google and Apple would make a good place for this information to reside, then there's also the issue— I think people trust Apple, and I think Apple does its best to protect people's privacy. I'm not sure about Google so much. This story from The Intercept about, uh, Amandla Thomas-Johnson, a student at Cornell, attended a protest targeting companies that supplied weapons to Israel. It was at a Cornell University job fair in 2024. He was there for 5 minutes, but the action got him banned from campus. And when ICE demanded information about him, including.
Stacey Higginbotham [01:24:06]:
Credit card.
Leo Laporte [01:24:08]:
Bank account numbers, IP address, uh, name, um, just a huge amount of information. Google complied. Of course, it was a lawful subpoena. Google didn't fight it. Google never gave him a chance to fight it. And in fact, Google didn't tell him until later and didn't tell him how much information they'd handed over. So I guess, uh, the thing to remember is that, um, you might say, well, Android and iOS are the right place to do this age verification, but are they going to protect all that.
Thomas Germain [01:24:43]:
Information, especially when presented with a lawful subpoena? Maybe not, but we're talking about either Google and Apple or literally every platform on the entire internet. Like, now you have to trust everyone.
Leo Laporte [01:24:54]:
One with this information. True.
Thomas Germain [01:24:56]:
Better at least just two. Yes, we're talking about just like who you are, right, and how old you are. Like, they don't— like, we could set up a system where it's blind, right, where your phone scans your ID. Most states have digital IDs now. Maybe it does some kind of cryptographic handshake, it confirms your age, it's not storing any data, and then a platform can just ask your phone, is this person over 18 or not? Or if you've ever used a credit card with Apple or Google, you're eligible, they have definitive proof of your age, right?
Leo Laporte [01:25:22]:
Like that's built in. And I suspect that that's one of the reasons so many states are now using their, putting their driver's licenses on iPhones and Apple's supporting that. And actually my passport is in my iPhone as well. So they have my age from that. And yeah, I mean, it's in there, I guess if Apple could come up with a, and maybe they have a privacy protecting way of storing that information. They don't give your age through the API.
Thomas Germain [01:25:49]:
Group.
Leo Laporte [01:25:49]:
And I think the best way to do this, honestly, with at least with kids would be for the parents to enter on the phone the age, the emotional age that they feel that kid is. You know, some kids, your kid, Stacy, very mature. You know, maybe you'd say, well, they're 18. And some kids who are 18 maybe have the emotional maturity of a 9-year-old.
Stacey Higginbotham [01:26:13]:
You might put 9 in.
Leo Laporte [01:26:14]:
No, because they're a legal adult at 18. Oh, that's right.
Stacey Higginbotham [01:26:17]:
So let's figure out a good age. 16. Then you're in trouble. You have it.
Leo Laporte [01:26:21]:
That's a pretty hard limit right there. That's true. Nothing you can do after that. No, I know. Yeah, I know. I think that's really the best solution. I agree with you, Thomas. We've talked about this on Security Now, and Steve Gibson, our security guy, also thinks this is really the only way to do it.
Leo Laporte [01:26:35]:
But as of now, most of the lawmakers, the way they did in Mississippi, the way they did in Texas, I think the way they're doing in UK, are saying We're not going to tell you how to do it. You just, you, you figure it out, platforms. It's up to you.
Stacey Higginbotham [01:26:48]:
We know you'll handle this well. The government could actually, I mean, they have our Social Security number, they have all this data. They could actually host an API for all of their citizens if they really cared about this. And if we're concerned about private companies having access to this data, even though we know they haven't, um, that is another option.
Thomas Germain [01:27:10]:
Not under this... They do that in Louisiana. Louisiana has a digital ID, and it is set up for this. That was the first state to have an age verification system set up, and the websites check with the state, right? You log into this portal that is controlled by the state government and the state says, yes, this person is this particular age. They don't hand them any identifiable information. It's just like, you're cool to be on this website. There's all kinds of ways that you.
Stacey Higginbotham [01:27:37]:
Could do this that don't involve, you know, like presenting your ID to some sketchy company being like, yes, let me have a photo of your driver's license that I'm going to.
Thomas Germain [01:27:45]:
Store on an S3 bucket that's unsecured, right? There have been data breaches already, I think more than once, where age verification.
Leo Laporte [01:27:53]:
Companies have leaked. That happened to Discord, yeah. As a matter of fact, I think it was 17,000 IDs that were leaked, and that's a shame. And this is why people are nervous about giving their information to Discord, and.
Wesley Faulkner [01:28:06]:
I don't blame them. I think rightly so. Well, to anybody, because they outsource it, and that's, that's the problem. It's always going to be your— the.
Leo Laporte [01:28:12]:
Weakest link that's going to have all this. The breach wasn't a Discord breach, it was a breach of the third party that did it. Yeah, yeah. Uh, from The Verge: Samy Azdufl claims he wasn't trying to hack every robot vacuum in the world, he just wanted to remote control his brand new DJI Romo vacuum with a PS5 gamepad. But when his homegrown remote control started talking to DJI servers, it wasn't just one vacuum cleaner that replied. Roughly 7,000 of them all over the world began treating Sammy like their boss. He could remotely control them, look and listen through their live camera feeds. He could watch them map out each room of a house, generating a complete 2D floor plan.
Leo Laporte [01:28:53]:
He could use any robot's IP address.
Stacey Higginbotham [01:28:58]:
To find its rough location. Oh boy, we actually see this. So our testing lab sees this class of errors more often than people would like to know about. I was actually once able to log in to my ISP. Um, I was setting up a firewall, which is a little internet device. Yeah, yeah. I plugged it in and I actually saw every other router and all of their devices through the firewall because my ISP had set up their DNS incorrectly. Um, I called them, I let them know, but like these sorts of configuration.
Leo Laporte [01:29:40]:
Errors are not uncommon and it's really terrifying. I remember the first time Apple put AirPlay, or whatever its predecessor was, into the Apple iTunes app. And I remember going to a hotel and getting on the Wi-Fi and seeing everybody else's music in my iTunes. And then back in the early days.
Wesley Faulkner [01:30:01]:
Um, I think they've fixed that since then. This is also like— could it be an actual safety issue? Like, if you could get into the firmware and overdrive a DC motor and cause a fire, that could be catastrophic, uh, or you could destroy all of these or just brick them. This is not just a security thing of, oh, someone's able to move my vacuum. If it has cameras, they can point the cameras, they can turn on the mics, they can get access and hop onto your Wi-Fi and snoop on your traffic. This is really big. It's not just, hey, someone is moving someone else's vacuum. This is like you could actually cause.
Leo Laporte [01:30:44]:
Loss of life in extreme cases. When The Verge contacted DJI, which is a Chinese company, DJI claimed they'd fixed the vulnerability, but it was only partially resolved. DJI said it could confirm the issue was resolved last week, and that their remediation was already underway prior to public disclosure. And then about a half an hour later, Samy showed The Verge reporter thousands of robots, including the review unit, reporting for duty. Given the serial number of the review unit, he was able to tell The Verge how much battery life was left, and it transmitted a map of Verge reviewer Thomas Ricker's house to Samy. Here's the map.
Stacey Higginbotham [01:31:40]:
Uh, so yeah, I, I presume by now it's fixed. It apparently isn't totally fixed.
Leo Laporte [01:31:48]:
Oh boy. I mean, some of these things— it's happened before. The Verge says hackers took over Ecovacs robot vacuums to chase pets and yell.
Stacey Higginbotham [01:31:57]:
Racist slurs in 2024. Remember that one? So here's— I'm gonna talk about this because we have a solution, sort of, that's stuck in limbo, and it's a voluntary program. Europe actually has laws that they've implemented on this front, but it's basically how to design connected devices securely. This is an area where you actually need the government to set rules and regulations, because it's not something the competitive market can solve: consumers can't see it until after they've bought it, and you have to have this level of expertise. So when we don't have this, what we have is people buying the cheapest possible device, and you get companies that maybe don't think about securing their servers. Or in this case, they didn't secure their servers. Uh, I'm trying to remember all the details of this. Regardless, a lot of these errors are kind of careless, and if we had.
Leo Laporte [01:33:00]:
Laws to, you know, penalize companies. And companies don't seem to— DJI doesn't seem to care all that much. They claimed they'd fixed it; they hadn't. They claimed again they'd fixed it, and they still hadn't. And now Sean Hollister, writing for The Verge, says even now DJI has not fixed all the vulnerabilities, and one is so bad I won't describe it until DJI has more time to fix it.
Stacey Higginbotham [01:33:23]:
But DJI has not immediately promised to do so. Often, when you go in and report a vulnerability to a company, they will say, well, that's actually not a vulnerability. So there is definitely a back and forth between security researchers and the companies themselves. I am not an apologist for these companies. I'm just saying there is a back and forth that needs to happen. They could also have trouble replicating the issue. And then yes, there's always the case where either they won't fix it or they can't fix it, and then an ethical company will be like, uh, hey, we should recall those. That almost never happens.
Stacey Higginbotham [01:34:01]:
Or they'll do something like what Wyze did, which is like, oh, we're gonna.
Leo Laporte [01:34:04]:
End-Of-Life that product and put out a new one. We're not gonna— we're not gonna sell.
Stacey Higginbotham [01:34:07]:
That one anymore, never mind. We, we can't solve this, so we're.
Wesley Faulkner [01:34:10]:
Just gonna set it all on fire. That's why I'll never buy a Wyze.
Leo Laporte [01:34:14]:
Cam, because of that. Yeah, it soured me.
Wesley Faulkner [01:34:16]:
Which is true. And I would say don't buy this vacuum, or anything that will be on your local network from DJI, because they have bad security practices. So I think we're zooming in too much on whether it's fixed or not. The thing is, it was released.
Leo Laporte [01:34:33]:
And it was not found by them. That is a huge problem. Yeah, no, in fact, it's on GitHub if you want to download it. Uh, and it even says this tool bypasses the PIN code setup in the DJI Home app, so it's a lot easier to use. You might want to use the GitHub version of the software. He says I can in fact control.
Thomas Germain [01:34:55]:
My, uh, my Romo with my PlayStation 5 controller. I mean, an IP robot in your.
Leo Laporte [01:35:00]:
House in general, maybe, maybe think about it. Yeah, maybe not. That's where— that's the future.
Stacey Higginbotham [01:35:07]:
We're all gonna have them, Thomas. I mean, you shouldn't. Like, we never— we don't have cameras inside. Even when I was testing smart home equipment, we faced all of our cameras to a wall. I had a fish tank for precisely.
Leo Laporte [01:35:21]:
That reason, but like, you just don't, don't— oh yeah, I would never have.
Stacey Higginbotham [01:35:27]:
Cameras streaming live to the— oh, never mind. But here, do you want to freak out some people? Should we freak people out? Yes, like legitimately freak them out. Um, the way that Wi-Fi sensing is going, we will have the equivalent of a camera that is just.
Leo Laporte [01:35:45]:
Using Wi-Fi to understand what you are doing in your house. You can, and using the Wi-Fi signals because you— because it turns out humans.
Stacey Higginbotham [01:35:53]:
Are really basically big bags of water.
Leo Laporte [01:35:56]:
Big bags of salt water, and they block the Wi-Fi signal. And, and so you can, I guess— do you have to triangulate it?
Stacey Higginbotham [01:36:01]:
I don't know how you do it. You use RSSI. Yeah, you do triangulation. So you need at least two devices. The more devices you have— the really crazy thing to know about this is that, because of the way Wi-Fi works, they all talk to each other and give signal feedback, right? That's good. But it also means that you could load this software on anything that has sufficient computing power and a Wi-Fi radio, and then it becomes a Wi-Fi sensor in your home. And then all you have to do is pick the algorithm you want, and you can determine all kinds of fun things. So that's the future, and it's not far off.
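As a rough illustration of the Wi-Fi sensing idea described here: a person, being a big bag of salt water, perturbs the signal between radios, so even a toy detector can flag motion from fluctuations in received signal strength (RSSI). All numbers below are made up, and real systems use much richer channel state information than this.

```python
# Toy illustration of Wi-Fi sensing: a body moving between two radios
# perturbs received signal strength (RSSI, in dBm), so a spike in the
# variance over a short window suggests motion. Illustrative only;
# real sensing uses channel state information, not bare RSSI.
from statistics import pvariance

def motion_detected(rssi_dbm: list[float], threshold: float = 4.0) -> bool:
    """Flag motion when RSSI variance over the window exceeds a threshold."""
    return pvariance(rssi_dbm) > threshold

quiet_room = [-52.0, -52.5, -51.8, -52.2, -52.1]      # steady signal
person_walking = [-52.0, -58.0, -49.5, -60.2, -51.0]  # fluctuating signal

print(motion_detected(quiet_room))      # False
print(motion_detected(person_walking))  # True
```

With more than two radios giving each other signal feedback, the same trick extends from "something moved" toward coarse localization, which is what makes the privacy implications bite.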
Stacey Higginbotham [01:36:39]:
The CTA actually, just like a week ago, decided they were going to do a standard for fall detection over Wi-Fi. And we're all excited, because fall detection would be great, and it would be, but they could later on do, you know— I always say puppy kicking detection.
Wesley Faulkner [01:36:55]:
Because, or nose picking detection. And are they going.
Leo Laporte [01:37:01]:
To do summer, winter, and spring as well? Fall, summer, spring, and winter.
Stacey Higginbotham [01:37:05]:
We're gonna get them all. Yeah.
Leo Laporte [01:37:08]:
They're gonna use your temperature sensors inside your— Ladies and gentlemen, you are watching This Week in Tech. Thomas Germain is here from the BBC. I love saying that. That sounds so good. Feels good.
Thomas Germain [01:37:20]:
Correspondent.
Leo Laporte [01:37:20]:
Yeah, feels good. The British Broadcasting Company, right? Don't— is it corporation? Corporation. Sorry. Corporation. Yes. Uh, didn't they just raise the TV.
Thomas Germain [01:37:28]:
License fees in the UK? I think they did. Yeah, I mean, there's, there's a whole— that's a, a whole can of worms.
Leo Laporte [01:37:36]:
I tend not to get into. It's pretty funny. I can't imagine here in the United States paying the government so that I could watch TV. Um, I'd prefer to pay giant tech companies. Yeah, but you do not have to have iPlayer to listen to The Interface. You can get it on any podcast application. YouTube, anywhere you want, everywhere. Also Stacy Higginbotham.
Leo Laporte [01:38:05]:
Always happy to have Stacy on. We miss you on, uh, what used to be This Week in Google, now Intelligent Machines. But your colleague Paris Martineau has done a very good job filling in for you, so at least we've got— is she still on it? What, you thought she— you thought I'd drive her off by now? Is that what you're saying? I know you are.
Stacey Higginbotham [01:38:24]:
I know what you're thinking anyway.
Leo Laporte [01:38:25]:
I didn't know she was still doing that. Okay, good.
Stacey Higginbotham [01:38:29]:
Yeah.
Leo Laporte [01:38:31]:
Oh yeah. Paris and Jeff. Yeah.
Stacey Higginbotham [01:38:32]:
Yeah.
Wesley Faulkner [01:38:32]:
Yeah.
Leo Laporte [01:38:33]:
Kind of hurts my feelings a little bit that you thought she'd be out.
Stacey Higginbotham [01:38:39]:
Of there by now. Maybe it's, you know.
Leo Laporte [01:38:41]:
It's 'cause she got a job at Consumer Reports. She did. In food. That's why. She got a good job and she's happy now. And also Wes Faulkner. I haven't driven you guys off yet. Yet.
Leo Laporte [01:38:54]:
It's good to have you.
Wesley Faulkner [01:38:56]:
Founder of Works Not Working.
Leo Laporte [01:38:58]:
Yes, thanks Kenneth for your, uh, contribution. I just saw it come in. Oh, that's nice. If people go to Works Not Working.
Wesley Faulkner [01:39:04]:
They can contribute there.
Leo Laporte [01:39:05]:
Yeah, there's just a big button on the, on the.
Wesley Faulkner [01:39:11]:
Tool. Help support what Wesley's up to.
Leo Laporte [01:39:13]:
Works-Not-Working.
Wesley Faulkner [01:39:13]:
And you'll get a, a badge for.
Leo Laporte [01:39:15]:
Your level of support on the platform as well. A badge? A badge? Badge. You know, um, Dan Aykroyd carries a badge at all times, but that's a.
Wesley Faulkner [01:39:26]:
Different kind of badge.
Leo Laporte [01:39:27]:
Yeah, we don't need no stinking badges. I don't want to put that up. Our show today brought to you by ZipRecruiter. If you're hiring for your company, this is a busy time of year for you. You've got new 2026 goals, which means finding the right people to accomplish them. Unfortunately, you also have new hiring challenges for the year, like filling specialized roles. Maybe you've got to identify qualified candidates from what is, you know, nowadays a huge pool of applicants. That's actually one of the great things about ZipRecruiter.
Leo Laporte [01:39:57]:
You, you do get a huge pool of applicants, but you also get tools that make it very easy to find the perfect person in that giant pool. Thankfully, there is a place you can go to help conquer these challenges and achieve your hiring goals: ZipRecruiter. And right now you can try it for free at ziprecruiter.com/twit. ZipRecruiter's matching technology works fast to find top talent. You don't waste time or money. You can find out right away how many job seekers in your area are qualified for your specific role. With ZipRecruiter's advanced resume database, you can instantly unlock top candidates' contact info and invite them to apply, which by the way, I have to say is the best technique for getting great candidates. If you ask them, hey, we would like you to apply for our job, that puts you way ahead of everybody else.
Leo Laporte [01:40:47]:
No wonder ZipRecruiter is the number one rated hiring site, and that's based on G2. Let ZipRecruiter help you find the best people for all your roles. 4 out of 5 employers who post on ZipRecruiter get a quality candidate within the first day. See for yourself. Just go to this exclusive web address right now to try ZipRecruiter for free: ziprecruiter.com/twit. Again, that's ziprecruiter.com/twit.
Stacey Higginbotham [01:41:15]:
ZipRecruiter.
Leo Laporte [01:41:15]:
The smartest way to hire. This is one of those ideas that just never dies. HP has now announced that you can subscribe to a laptop. Would you subscribe to a laptop? You never own it, by the way. The subscriptions are for different productivity laptops and gaming laptops, ranging from $35 to $50 a month. No starting fee, no down payment. There's a soft credit check. Each of them has a 24/7 support plan from a live agent.
Leo Laporte [01:41:50]:
You can upgrade after 12 months, but you can't buy out the lease. You can never keep the laptop. You just, you know, you keep it.
Stacey Higginbotham [01:41:56]:
Until you stop paying, I guess, and send it back.
Leo Laporte [01:41:58]:
I would do it because it's a depreciating asset.
Stacey Higginbotham [01:42:01]:
So that's a bad idea. No, that's not a terrible idea. It depends on the costs. Like, because a laptop is not something— I think about subscriptions and leases and that sort of thing in terms of CapEx and OpEx, basically. So is it an asset that I can own over a long time and possibly use to make money, or that.
Leo Laporte [01:42:21]:
Will retain its value long?
Stacey Higginbotham [01:42:23]:
What's it worth to you, in other words? Right. So you would have— I mean, I'm not doing the financial calculations, and depends.
Leo Laporte [01:42:29]:
On the specs on the laptop, but if it actually— some of these are really nice laptops.
Wesley Faulkner [01:42:34]:
The Omen Max Ramp prices today.
Stacey Higginbotham [01:42:39]:
Yeah, yeah, maybe. Yeah. Did you know— is it a contract.
Wesley Faulkner [01:42:42]:
Though, or is it like, can they raise the subscription? It was 12 months, I believe, but I wonder if they can raise it, because, uh, there was a YouTuber, uh, Gamers Nexus, that talked about people who were renting, uh, gaming rigs thinking that they would start a Twitch stream and be the next influencer, to be able to get these highly expensive rigs for just a monthly stipend, and they could offset it by how much they'll make on social media. And, uh, people were just overleveraged at that point. The question I wonder about is the secondary market. Where are all these laptops going to end up after the leases are up? It might be good for the secondary market; like, I would be pro buying a secondhand or refurbished laptop, but it might flood the market as well. It's a really interesting concept, because if they stopped selling laptops and you could only lease, I would say this was bad. But it sounds like, no, no, you can still— you get to choose the thing that fits best for you.
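The lease-versus-buy arithmetic the panel keeps gesturing at is easy to sketch. The $35-$50 monthly figures come from the story; the purchase price and resale value below are assumptions, made up purely for illustration.

```python
# Back-of-the-envelope lease-vs-buy comparison for a laptop.
# Monthly rates ($35-$50) are from the HP story; the purchase price
# and resale value are illustrative assumptions, not real quotes.
def lease_cost(monthly: float, months: int) -> float:
    # You never own the machine, so the whole outlay is the cost.
    return monthly * months

def buy_cost(price: float, resale_after: float) -> float:
    # Net cost of ownership if you sell the depreciated machine later.
    return price - resale_after

months = 36
print(lease_cost(50.0, months))          # 1800.0 paid over three years
print(buy_cost(1400, resale_after=400))  # 1000 net if it resells for $400
```

On these made-up numbers, buying wins; whether that holds in practice depends on the specs, the resale market, and whether the lease's bundled support is worth anything to you.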
Thomas Germain [01:43:48]:
So I'm actually pro this. This is a religious conviction, but I will die before I pay a subscription for a piece of hardware. I really wanted to get, uh, an Oura Ring, you know, one of those fitness trackers that you wear. You have to pay $6 a month for that thing. You got one? They're great.
Leo Laporte [01:44:04]:
Everybody loves them. I think they're the best. I'm grandfathered in because I bought it. They wouldn't advertise it.
Thomas Germain [01:44:08]:
I bought it in the beginning, and I think— not gonna happen. I'm not paying $6.
Leo Laporte [01:44:13]:
No, for something I already bought. Just— we are in the phase where people are now very sensitive to subscriptions. And every company wants subscriptions. Even Apple is moving more and more towards subscriptions. They just released their Creative Suite, which is subscription-based. You can still buy the programs outright, but everybody wants subscriptions.
Stacey Higginbotham [01:44:35]:
It's 'cause of, Stacy, it's ARPU, right? ARPU. ARPU. I mean, the profit margins for a.
Leo Laporte [01:44:41]:
Subscription, especially if you can sell hardware too, fabulous. Sure. And then you get this average revenue.
Wesley Faulkner [01:44:48]:
Per user number, which I guess Wall Street likes.
Stacey Higginbotham [01:44:50]:
The scary.
Wesley Faulkner [01:44:53]:
Thing is you don't— they can like it. What do they— because you don't own the hardware, what will they do? Like, will they add you to the Flock camera network?
Leo Laporte [01:45:00]:
Because— well, that's a good point.
Wesley Faulkner [01:45:01]:
I didn't even think of that. The question is like, because can you.
Leo Laporte [01:45:05]:
Load Linux on it?
Stacey Higginbotham [01:45:08]:
Can you— that's a good question. That's, that's a really good question. This is literally what I work on for CR. Like, let's go with 80% of my time, which is the harms of software tethering. So anytime you have a subscription, and Wesley brings it up, right? This is like, what features? How can they change the licensing terms? What if, for example, you use your.
Leo Laporte [01:45:32]:
Leased laptop to upload, maybe use it to pirate music?
Stacey Higginbotham [01:45:35]:
They can remotely lock it according to the terms and conditions. Right. So that is a reason not to do it, obviously, because you won't have total control and ownership of this. And right now we're in an era where literally you have no rights. I have— I should very soon— I will share this with legislators, but I've created a chart full of software tethering harms, and they run the gamut from, hey, your thing may be bricked, all the way to they may change the terms of service and suddenly what you think you can do, you can no longer do. And those are really bad, and there's nothing to protect you.
Wesley Faulkner [01:46:10]:
And consumers think they own these things, and they don't. Yeah, it could be hardware on demand too. Like, you have access to 8 gigs of RAM, but you have to pay a little bit more and they'll turn on another 8 gigs. Or like, your Ethernet port is limited to 100 megabits, but then you can get a gigabit.
Leo Laporte [01:46:26]:
If you pay a little bit more and they'll turn on— car makers are starting to do that. All that stuff. Yeah. BMW was going to charge a monthly fee for heated seats. They built in the heater, but to turn it on, you'd have to pay a monthly fee. Then when their users complained, they decided not to do that, but they have already announced that they're going to be.
Stacey Higginbotham [01:46:44]:
Doing it in future vehicles. So ask me in like 6 months about this or even 3 months, because we might see something introduced in state legislatures in the next couple of months.
Leo Laporte [01:46:55]:
To target this very issue.
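The "hardware on demand" pattern described here, heated seats or gated RAM, boils down to a vendor-controlled entitlement flag sitting in front of hardware you already paid for. A minimal hypothetical sketch, with all names invented:

```python
# Hypothetical sketch of software tethering / "hardware on demand":
# the capability ships in every unit, but firmware gates it behind a
# vendor-side entitlement flag that the subscription backend can flip
# at any time. This mirrors the heated-seats example in the discussion.
entitlements = {"seat_heater": False, "extra_ram_8gb": False}  # vendor-side

def vendor_grants(feature: str, paid: bool) -> None:
    """Flipped by the vendor's subscription backend, not by the owner."""
    entitlements[feature] = paid

def feature_enabled(feature: str) -> bool:
    # The hardware is physically present either way; only the flag changes.
    return entitlements.get(feature, False)

vendor_grants("seat_heater", paid=True)
print(feature_enabled("seat_heater"))     # True while the subscription lasts
vendor_grants("seat_heater", paid=False)  # stop paying, feature goes away
print(feature_enabled("seat_heater"))     # False
```

The sketch makes the policy problem concrete: nothing on the buyer's side of the flag gives them any say over it, which is exactly the gap the proposed legislation would target.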
Thomas Germain [01:46:57]:
Oh, like saying that people can't do that. Yes. Yeah.
Leo Laporte [01:47:01]:
That's interesting.
Thomas Germain [01:47:02]:
In particular has a terrible track record on this issue. They've, like, tried this with printers before. It's just all about limiting what you can do and, you know, shutting you down if you're trying to save money. I think the whole idea is just kind of inherently anti-consumer. I like what Stacy's saying about, like, well, maybe I want to replace my laptop if it's a good deal, right? If, like, the money works out. But it sounds like this one doesn't, and why would it, right? They're not doing this because it makes sense for the consumer from a financial perspective. It's so they can bilk more money.
Leo Laporte [01:47:34]:
Out of you over time because you're.
Wesley Faulkner [01:47:37]:
Locked into a contract. Yeah, yeah. These only work with HP printers.
Leo Laporte [01:47:39]:
You can't use Canon or Brother or any of the others. Yeah, I would never buy an HP.
Wesley Faulkner [01:47:45]:
Uh, inkjet for that. Yeah, yeah, really. By the way, that's not— I don't— I didn't read it.
Leo Laporte [01:47:48]:
That's not what it says, but I mean, it could. No, but we know that if, you.
Stacey Higginbotham [01:47:55]:
Know, it's pretty draconian, the inkjet subscription.
Leo Laporte [01:47:58]:
Oh yeah, you're out of cyan. Everything else, nothing will work. Right, right. Uh, the FTC is investigating Microsoft. I'm not sure why, but, uh, it's a probe of whether the company illegally monopolizes enterprise computing with its cloud and AI offerings, including Copilot. This is from Bloomberg. It feels like, you know, maybe this is a little behind the times. I don't know if Copilot is really.
Stacey Higginbotham [01:48:27]:
Such a threat to the economy.
Leo Laporte [01:48:29]:
Wait, which story is this? Hold on. This is FTC ratchets up Microsoft probe, queries rivals on cloud and AI. It's from, it's from Bloomberg. I think I linked to a Slashdot.
Stacey Higginbotham [01:48:40]:
Article which links to the Bloomberg.
Wesley Faulkner [01:48:42]:
I pay for Bloomberg. I'll read it. Yes, that's why I wouldn't be surprised.
Stacey Higginbotham [01:48:46]:
If Salesforce wasn't behind all this. I was gonna say, so if they just— oh my God, it's taking forever to load. Um, if they just announced like, are.
Leo Laporte [01:48:56]:
They just talking about their, like, probing? I'll read it to you. The U.S. Federal Trade Commission is accelerating scrutiny— I like that word, scrutiny. That sounds like political— yes, of Microsoft Corp. as part of an ongoing probe into whether the company illegally monopolizes large swaths— another great word— of the enterprise computing market with cloud and AI offerings. They've issued civil investigative demands in recent weeks. Now, an investigation isn't the same as a lawsuit. It's just a preliminary step.
Stacey Higginbotham [01:49:28]:
So maybe this is just, you know, we're looking at Microsoft.
Wesley Faulkner [01:49:34]:
Yeah, then it's just political. Was the Microsoft CEO at the inauguration.
Leo Laporte [01:49:39]:
Or did they contribute to the party? Uh, that's a good question. Yeah, um, I don't know. I don't think he was. The demands, which are effectively like civil subpoenas— the FTC is seeking evidence that Microsoft made it harder for customers to use Windows, Office, and other products on rival cloud services. You know, there's a European investigation like this also going on. I don't credit this. This was launched under Lina Khan during the waning days of the Biden administration.
Leo Laporte [01:50:18]:
However, it has continued under the new administration, so I'm not sure. At this point, you really kind of.
Thomas Germain [01:50:23]:
Wonder, is it politically motivated or not? It's interesting though, like antitrust is a bipartisan issue, right? Like not everyone agrees on this, but there's a lot of Republicans who are really on board with this. A lot of those, I mean, not a lot, but you know, one of.
Leo Laporte [01:50:38]:
Those Google antitrust cases was launched under the Trump administration. Yeah, but I have to say the motivations for it are different. The actions are the same. On the right, they feel like big tech is unfair to conservatives, so they're going after big tech for that reason. On the left, they feel like these companies are too big and too dominant, and they're going after them for that reason. So yes, it's bipartisan, but not.
Thomas Germain [01:51:02]:
With the same motivations, it seems to me. Josh Hawley brought in the CEOs of MasterCard and Visa and just berated them.
Leo Laporte [01:51:10]:
About how they were hurting small businesses. Look at the president. He says he wants to limit, uh, interest rates for credit cards, which can't be too popular with.
Wesley Faulkner [01:51:21]:
His donors, the donor class.
Leo Laporte [01:51:24]:
But, but if you use cryptocurrency— oh, there you go. Yeah, there you go. T-Mobile's announced they're going to add AI to their network for real-time translation on the T-Mobile network. It works on all existing hardware, and no data centers.
Stacey Higginbotham [01:51:42]:
Are involved. I don't understand how that works. Oh, they have— well, every telco network.
Leo Laporte [01:51:49]:
Has servers that are within their network. They're going to have instantaneous— actually, this is cool. I'll let you know because I'm a T-Mobile customer. They're going to have instantaneous simultaneous translation in more than 50 languages: English, Chinese, even things like Welsh and Azerbaijani. According to a spokesperson, calls are not being routed to a data center for translation. There's no new edge hardware installed at cell towers. All the AI processing happens as calls are transmitted. So it.
Stacey Higginbotham [01:52:24]:
Sounds like it's magic.
Thomas Germain [01:52:24]:
It's incredible. It's incredible. I also wonder, okay, so like, you know, thinking about the Ring conversation we started with, right? Like, is this going to wake people up to the amount of insight that the telecoms have?
Leo Laporte [01:52:36]:
Our phone calls. Not only can we listen, but we.
Thomas Germain [01:52:39]:
Can listen in 20 different languages. Regardless of whether there are things you should be worried about, I think people hear, oh, they've got AI that's listening to your phone call and like doing stuff. I think a lot of people are.
Leo Laporte [01:52:49]:
Going to freak out about that. If you turn on the AI transcriptionist on a Zoom call, there are people, my wife included, who will say, no, stop it, hang up. People don't like these AI agents sitting in on calls. It bugs people. So I'm not sure exactly. I should have done more research on this before I put this story in The Rundown because I don't understand how they're going to do it.
Stacey Higginbotham [01:53:14]:
They say we're bringing real-time AI directly into our network. Okay. So they're saying they're doing it on their IMS network, which is just a fancy way of saying the servers that the telco runs and operates as part of their network. So they're not sending it out to any sort of external data center. Oh, it's T-Mobile's data centers. Oh, right. Okay. So I mean, I've been in the telco towers, I've seen— I've been in DSLAMs, I've been in— how, you know.
Thomas Germain [01:53:43]:
They are doing the translation on their servers.
Leo Laporte [01:53:45]:
So that's interesting. And they say this is the beginning, they're going to do a lot more AI through the IMS servers. How capable are these servers, Stacy?
Stacey Higginbotham [01:53:55]:
Are they, are they pretty— They're ready. I mean, they're whatever you want them to be.
Leo Laporte [01:53:59]:
They're just their own servers. I wonder if they, I mean, have they been buying up GPUs on the sly?
Stacey Higginbotham [01:54:06]:
Doing this kind of AI is non-traditional as far as I know. You can do inference on things that are not GPUs.
Leo Laporte [01:54:14]:
Right. And this is inference.
Stacey Higginbotham [01:54:16]:
Real-time translation? I guess you could, huh? Real-time translation is actually not hard because.
Leo Laporte [01:54:20]:
It's a one-to-one thing. Right.
Stacey Higginbotham [01:54:21]:
In fact, my phone will do it on device. Yeah. I was like, well, but it's not, I mean, I shouldn't say it's not.
Leo Laporte [01:54:28]:
That hard, but it's not like running.
Stacey Higginbotham [01:54:30]:
An LLM or something like that. Yeah. Yeah. So this sounds really cool, but what it does mean is T-Mobile is taking whatever someone is saying, running an algorithm against it, translating it, and then delivering that. Which is super cool, but is also like something— a computer is listening to.
Leo Laporte [01:54:55]:
Your phone call and translating it for you. And get ready, because it's going to do more. Everybody's putting AI everywhere. By the way, this is the thing that I think most annoys people about Google and Microsoft Copilot, is that they're.
Wesley Faulkner [01:55:08]:
Kind of pushing it on you. You know, I've got two concerns with this that I don't know if anyone— I read the article and it didn't really address it. Uh, you know, the government has their own equipment in these data centers, um, so that they could get information. If you said bomb, terrorist, jihad, I'm just curious if that will— since.
Leo Laporte [01:55:32]:
You're now listening explicitly on these calls, well, thank goodness the government doesn't listen.
Wesley Faulkner [01:55:37]:
To podcasts or you'd be in trouble. Well, it's your podcast, not mine, so I'd be Okay. And the other is, is there indemnification or anything that's going to be in contracts? So if someone says, hey, yes, we agree to these terms, or no, and they translate it and there's a— and.
Leo Laporte [01:55:54]:
It's a mix-up, who's at fault in that case? You can bet there's going to be a lot of fine print in your.
Wesley Faulkner [01:56:00]:
T-Mobile contract that says we are not responsible for anything. And can you opt out to say, like, I don't— I don't want to be translated?
Leo Laporte [01:56:05]:
Like, what if I'm on a network? You have to turn it off.
Wesley Faulkner [01:56:07]:
On. Oh, I see. On the other end, it's on the T-Mobile network. So if I'm on AT&T and I call someone who's on T-Mobile and they.
Stacey Higginbotham [01:56:15]:
Want to turn it on, I didn't sign that agreement. I was just looking up the federal wiretapping laws and state wiretapping laws and how they translate to like AI translations.
Leo Laporte [01:56:25]:
Because that's kind of what's happening, right?
Stacey Higginbotham [01:56:27]:
Like it's a two-party state versus— well, I mean, Texas was— I lived in a one-party state forever, and that was.
Leo Laporte [01:56:33]:
Kind of nice as a reporter. I know, I can't record you— I have to ask you.
Stacey Higginbotham [01:56:38]:
Ahead of time in California. Yeah, but I, I don't know where this— the, the law is yet on.
Leo Laporte [01:56:46]:
That, so I'm frantically looking this up. By the way, it's funny they call.
Stacey Higginbotham [01:56:48]:
It a one-party state when really it's a no-party state.
Thomas Germain [01:56:52]:
No, one party knows you're being recorded. I guess, you know, I mean, telecom law, like the Wiretap Act, is like from 1972 or something. Like, whatever the law is, it is.
Leo Laporte [01:57:05]:
Not equipped to deal with this. However, T-Mobile has figured out how to do this. You know who else is not equipped? Apple's Siri. Oh good, did you do it?
Stacey Higginbotham [01:57:13]:
I won't dump this story until you're done, Stacy. No, no, no, I was looking up the law and some law review articles about it because I was really like.
Leo Laporte [01:57:22]:
Oh, 'cause this is super interesting. It is interesting. I mean, it sounds like— It's this little subtle press release, but honestly there's.
Stacey Higginbotham [01:57:32]:
Some real ramifications to this. Yeah, it says we'll possibly see verbal or written announcements, uh, ideally at the beginning of the call, which we do hear with translations and you do.
Leo Laporte [01:57:43]:
Hear on Zoom calls. So I guess this call is being recorded for quality purposes.
Stacey Higginbotham [01:57:49]:
Yeah, this translation— your call is important to us. So maybe that's how that will get around that.
Leo Laporte [01:57:56]:
I don't know. Okay, sorry. Next story. Apple, according to Mark Gurman, rumor monger, is having a little trouble getting the AI into Siri again. Oh, they were supposed to— remember, they've done a deal with Google, which by the way has a very capable AI assistant, Gemini 3. They had done a deal that was supposedly going to bring out the new smarter Siri in March in iOS 26.4. Gurman says, and I think Gurman's got this one nailed, uh, it ain't gonna happen that fast. Apple's now working to spread those new capabilities out over future versions, according to people familiar with the matter, postponing some features until May or even September.
Thomas Germain [01:58:47]:
This has been a non-stop flop for Apple. Nightmare. It. And it's also like the one thing that I really, really want from AI is to make my phone less annoying, that I can just talk to it and it'll do what I want it to do.
Leo Laporte [01:58:59]:
And it feels like the only thing I've never— Do you use like ChatGPT or something now on your phone?
Thomas Germain [01:59:05]:
Do you use one of those assistants? Yeah, I have, uh, you know, the.
Leo Laporte [01:59:09]:
Newer model iPhones have like a button built in.
Thomas Germain [01:59:11]:
Yeah, I do the same thing. Like there, ChatGPT, I hold it down.
Leo Laporte [01:59:13]:
The action button, it brings up— And you can talk to her and she's nice and she answers questions. Yeah. It's funny, my wife continues to ask Siri stuff. And every time she does, I go, honey, haven't you learned you're never gonna.
Thomas Germain [01:59:26]:
Get that answer out of Siri?
Leo Laporte [01:59:31]:
Siri's a nitwit. Siri's for setting timers. Yeah. Now, have you played yet with the new Alexa— I mean, A-word Plus features.
Stacey Higginbotham [01:59:43]:
Stacy, in your Amazon Echos? So I played with Gemini but not Madame A+ because I ditched it, although I'm having a real issue, and I'm actually curious, y'all audience, are you seeing increased latency? Because the latency on my Google Home has skyrocketed, and I freaking hate it. Absolutely. Yep.
Wesley Faulkner [02:00:06]:
And are you using Google or using Meta or somebody else?
Leo Laporte [02:00:11]:
I, I have a G Home. Yeah, my Google, same thing.
Stacey Higginbotham [02:00:14]:
And not only that, it's gotten dumber over time.
Leo Laporte [02:00:17]:
Yeah, that I knew. I just— I'm like, they're nerfing it for some reason.
Thomas Germain [02:00:22]:
Maybe too many people are using it. Maybe they just want people to be on their phones.
Stacey Higginbotham [02:00:26]:
I don't know. Or maybe it's too expensive. I think it's— I think it's too much back and forth. Or it's— I feel like they're— we're using like the wrong tool for the job. Like when I'm like, turn on my lights, I need it to go to.
Leo Laporte [02:00:41]:
The dumber version, not the smarter one. I do think one lesson to be taken from this is that it's really hard to do a smart voice assistant. Like, Apple's having trouble, Amazon's having trouble, even Google, which is probably— Google and OpenAI seem to have been the best of the bunch. Anthropic's kind of smart. They have it, but really they're pushing coding. They kind of decided we don't want to be the image generating tool, we don't want to really even be the chat tool, we want to be the coding tool. And they've had great success doing that.
Thomas Germain [02:01:13]:
Lots of attention for Claude Code. I think it says a lot about the capability of the technology, right? That like it's been years that Apple has been desperately trying to roll this feature out.
Stacey Higginbotham [02:01:23]:
Yeah.
Thomas Germain [02:01:23]:
Even announced it and they can't make it happen. And I think it's because hallucination is inherent to large language models, and to actually put one in charge of something that really matters, like the functions of your phone, you need to have a level of confidence that it seems like they just can't get. I think that is the reason that we don't have this tool yet, that they're not, like, actually giving them real controls, because, like, who knows what they're going to do.
Leo Laporte [02:01:49]:
That's a big deal, right? Like, especially for Apple, where reputation is really important. Google doesn't mind if you put Elmer's glue on your pizza or eat rocks.
Stacey Higginbotham [02:01:59]:
But Apple definitely cares about that. That's not true. I've talked to some Google people about this. But, well, yeah, it's when you know what you want it to do and it fails even 5% of the time, it's aggravating. When it's summarizing an article you haven't read, it's fine. And I think people keep thinking it's going to get better, and it's like, well, maybe it will, but this is what's available everywhere now and it sucks everywhere. You can totally easily see that it's sucking.
Leo Laporte [02:02:35]:
Working in these use cases.
Wesley Faulkner [02:02:36]:
So.
Thomas Germain [02:02:38]:
Yeah, do you think it hurts Apple though?
Leo Laporte [02:02:40]:
Which part of it? Well, I wonder. I mean, they promised it and didn't deliver it, but I don't see consumers.
Wesley Faulkner [02:02:47]:
Going, oh, you know, I'm gonna go—.
Leo Laporte [02:02:48]:
Didn't they sell a record number of iPhones?
Thomas Germain [02:02:51]:
Yes, it's the best quarter ever.
Stacey Higginbotham [02:02:52]:
Yeah, yeah, I think it makes— I.
Thomas Germain [02:02:53]:
Thought I'd have a self-driving car by now, but I don't. Like, this feat, this core feature of the iPhone is just so stupid. Like, you know, you talk to someone who's totally tech illiterate about Siri— people hate Siri now. And this is such a big part of the Apple ecosystem, even though they really don't bring Siri up very often anymore. I think it does make 'em look bad, but I don't think it's, like, really interrupting their business. But Apple is also, like, at a moment where it really needs to redefine its core promise as a company that's gonna continue growing in the hardware space in the way that it has over the past few years. And I think, like, this hurts investor confidence in really significant ways. Like, will they be able to overcome it? I mean, they've certainly faced greater hurdles in the past, but.
Leo Laporte [02:03:42]:
Uh, I think it makes them look bad. Yeah, well, certainly to those of us in the industry who are paying attention. But I think there are some customers who are going, oh, I didn't really want— I didn't want to be. I didn't want AI in my phone anyway. It's a— there's a real, there's a real fork in the road here between people who hate AI and people who— and I'm one of them, so, you know, I'm aware of this— love AI.
Wesley Faulkner [02:04:05]:
And there's— it— there's never the twain shall meet, right?
Thomas Germain [02:04:08]:
Yeah.
Wesley Faulkner [02:04:08]:
It's a shame that their hardware is so good. Their NPUs, or the neural processing engines, are so good. It just sucks that this is— they should just find a way of dialing it to a note-taker or something that's just so local that doesn't— they should.
Leo Laporte [02:04:24]:
Go the other direction, just make everything local. Well, they've kind of done that with Apple Intelligence, right? So they have quite a bit of that local AI built into Apple Notes and their— a lot of their tools, their image generation. What is it, Genmoji? It's not very good, but they have a lot of the local stuff built in. And I think for some people, that's.
Wesley Faulkner [02:04:45]:
All they, they really want, right?
Leo Laporte [02:04:47]:
And that's all they should do.
Wesley Faulkner [02:04:48]:
Maybe that's what they should stick with. Just stay there. Just stay in that lane.
Thomas Germain [02:04:52]:
And that should be the differentiator. I thought it was really interesting. Uh, it kind of like, it, it, you know, came out and people weren't really paying attention. Apparently Apple approached OpenAI about being their partner for the iPhone AI stuff and OpenAI turned it down. Like first when Google came out and it was the Google partnership, I was like, ooh, this looks really bad for OpenAI. Apparently they don't want it, which is really interesting. Like it shows you, I think, a little bit of how OpenAI sees itself in the market. That it's— I'm not sure exactly what the issue was, but they see Apple.
Leo Laporte [02:05:26]:
As a direct competitor. I think they do because they're working with Jony Ive to make a hardware device, right? They think they're going to beat the iPhone. Now remember, Verizon turned Apple down when they came to them saying, hey, we got this thing called the iPhone. Would you be interested? And Verizon said, no way. And that might have been a little bit of a mistake. Apparently they went to Anthropic as well, and Anthropic wanted too much money. Interesting. So, um, I— yeah, Google won by default.
Leo Laporte [02:05:58]:
Apple's already kind of in bed, so to speak, with Google, right, to a great degree. It just kind of made sense that the partnership would be with Google. Hey, I want to take another break because I do want to talk about Elon Musk, and we need to give Elon his own segment, I think. Uh, you're watching This Week in Tech with Mr. Wesley Faulkner. Great to see you. Stacy Higginbotham, Thomas Germain, new to the bunch but really glad to have you. His new show has debuted this week, uh, for the BBC, and you can get it wherever you get your podcasts. Isn't that what they say? Wherever you get your podcasts, where other fine podcasts are downloaded.
Leo Laporte [02:06:40]:
Yes, wherever fine podcasts are recorded and downloaded for later consumption. Our show today brought to you by my mattress, Helix Sleep. Oh, I love my Helix Sleep. I think I got— I think on my Oura Ring, for the first time in a long time, I got 3 crowns for activity energy, recovery. And I rarely get a crown for sleep, but I got a crown for sleep last night. And I thank you, Helix Sleep. That means my sleep score was like almost perfect. You know, have you noticed it's getting a little colder? It's been a little chilly, especially on the East Coast.
Leo Laporte [02:07:15]:
Maybe you're spending more time indoors. Maybe you're spending more time on your mattress because mattresses aren't just for sleeping. I know, yeah, 8 hours a day, a third of our lives spent on a mattress. But now it's even more than that, cuddling with the kitty cat, reading a good book. I love to curl up on my Helix Sleep mattress, read a good book, watch TV. This is a great time. You're going to be spending more time on the mattress. You owe it to yourself to have the best, right? I always think of that with a computer.
Leo Laporte [02:07:40]:
You spend more time looking at the screen and typing on the keyboard and using the mouse. Those are the things you should invest in. Same thing with a mattress. If a third of your life or more is spent on that mattress, that's one of the most important purchases you're going to make. Well, the good news is you can get a great mattress for a very good price from Helix. Stay comfortable with your Helix mattress. No more night sweats, no more back pain, no motion transfer. And you know, the nice thing about Helix: you don't want to settle for a mattress made overseas with low-quality, kind of questionable materials, packed in a box and put on a container ship that spent 6 months at sea, and when it gets to your house, it smells like bunker crude oil.
Leo Laporte [02:08:24]:
And duh, not the Helix Sleep. Rest assured, the Helix mattress is assembled, packaged, and shipped from Arizona within days of placing your order. They build it to order, so it is brand new, fresh, made from the premium materials, and it smells as clean as the Arizona desert. You can also take the Helix Sleep quiz. We did this when we decided to get a Helix. It matches you with the perfect mattress, and it's based on, of course, your preferences— firm, soft, that kind of thing— but also how you sleep. Sleep on your back, on your stomach, on your side? Take the questionnaire. They'll point you in the right direction.
Leo Laporte [02:08:59]:
They have mattresses for every style, every need, and these mattresses do change your sleep. And I mean, I know it's happened for me, but they also did a Wesper sleep study. They measured the sleep performance of participants after switching from their old mattress to a Helix mattress, like we did. They found that 82% of participants saw an increase in their deep sleep cycle. That's, by the way, absolutely— that's what happened to me. And that is the most important sleep. That's the one where your brain really cleans itself out. Participants on average achieved 25 more minutes— 25 more minutes of deep sleep a night. For me, that was like 50% more.
Leo Laporte [02:09:37]:
Participants on average achieved 39 more minutes of overall sleep per night. 'Cause, well, you don't want to get out of bed. It's so comfy. You know that feeling? Maybe you've had it when you go to a really, really nice hotel, or you've just had a really rough day and you get in bed, it just feels so good. You just go, "Ah." I have that experience every single night when I get in bed. I go, "Oh, I'm happy." Time and time again, Helix Sleep remains the most awarded mattress brand tested and reviewed by experts. Forbes and Wired picked it. So many have. Helix delivers your mattress right to your door.
Leo Laporte [02:10:14]:
It's free shipping in the US, and you can rest easy with seamless returns and exchanges. They call it the Happy with Helix Guarantee, a risk-free customer-first experience making sure you're completely satisfied with your new mattress. Go to helixsleep.com/twit for 27% off sitewide during the President's Day sale. Best of Web exclusive for listeners of This Week in Tech. That's helixsleep.com/twit. TWiT for 27% off the President's Day sale. Best of Web. The offer ends February 25th.
Leo Laporte [02:10:45]:
Make sure you enter our show name after checkout so they know we sent you. And if you're listening after the sale ends, still check them out at helixsleep.com/TWIT. Thank them so much for the support of This Week in Tech. So we're not going to Mars after.
Wesley Faulkner [02:11:07]:
All, we're going to the moon.
Leo Laporte [02:11:08]:
Okay, uh, and when you say we, we are SpaceX. We are SpaceX. It's pretty good. We are SpaceX. Elon said on Sunday last week that SpaceX has shifted its focus to building a self-growing city on the moon, which he says— never, by the way, if Elon says a time frame, just ignore it. 10 years.
Thomas Germain [02:11:35]:
He says.
Wesley Faulkner [02:11:35]:
And then we'll go to Mars. He was— the real focus is— the.
Leo Laporte [02:11:38]:
Focus has always been his bank account. I think you— I think you're— you know, I don't want to be a cynic or anything, but it does seem like all of this is really just to increase the stock price. And they've got a SpaceX IPO coming up, and I don't know, if I was a SpaceX, uh, shareholder at this point, or investor I should say, at this point, I wouldn't be too happy with the fact that SpaceX, a company that is very profitable, made $8 billion last year, merged with xAI, a.
Wesley Faulkner [02:12:06]:
Company that is basically burning money. Yeah, how are they going to feel after they merge Tesla into SpaceX? Yeah, which they've already bought all those old, uh, Cybertrucks that, uh, Tesla couldn't offload. Now they're converting, uh, some of their plants into robot factories. Yeah, of course, they're also— they've— Elon's already pitched that we're going to use robots first to do all the work for us. And so where are they going to get the robots from? Oh, it's a natural merger to bring in Tesla. That's what it is. They're going to build the vehicles and.
Leo Laporte [02:12:39]:
The robots to help with— it's, it's.
Wesley Faulkner [02:12:41]:
Just— he's just— are you saying— are you thinking— but how are we going to control these, these, these rockets? We need some sort of mental, like.
Leo Laporte [02:12:47]:
Brain implant so that we have the fine Neuralink.
Thomas Germain [02:12:54]:
We could just dig tunnels and holes under the.
Leo Laporte [02:13:01]:
City. Yeah, we need a way to communicate. Uh, do you think it's always been a grift? I mean, I, I confess, uh, when I bought my Model X in 2015, I thought he was Iron Man.
Wesley Faulkner [02:13:13]:
I thought he was like a.
Leo Laporte [02:13:16]:
Genius. Actually, you mean full of fiction? Yeah, I went, I took a tour of the Tesla plant. Now, you know, we picked up our Model X at the Fremont Tesla plant, and I took a tour of it. I was kind of tearing up.
Wesley Faulkner [02:13:28]:
I was so inspired by his vision for the future. And I was a big fan too. I think, I think he's just— was it always grifting as a goal? I think he's just making it up as he goes.
Stacey Higginbotham [02:13:40]:
Um, he's thinking ahead, hopeful thinking, wakes up, I think, in fever sweats. Yeah, he used to buy his own BS. Now he probably doesn't. Um, I will say that, I mean, I had the Model S. I bought a 2013 Model S and I loved that car. I have an electric car now. The software is much worse.
Stacey Higginbotham [02:14:02]:
The, you know, so I think he hired and took ideas from people who were really actually pretty good at what they did and he sold it for them.
Leo Laporte [02:14:10]:
And for a while that succeeded and now it doesn't. I'm using Starlink as my backup. I mean, I have to, I have Comcast. I mean, there's no choice. It's not like I have a choice. There's only really two high-speed choices. One is Starlink, one's Comcast, and I need redundancy. So I have a, you know, I.
Stacey Higginbotham [02:14:27]:
Have an Elon dish on my roof. Yeah, did you hear that? I don't have a choice. Those are words we're just gonna say.
Leo Laporte [02:14:36]:
A lot more often everywhere in our lives. Yeah, that's why we want antitrust. We want a choice. On the other hand, the argument is that if— that the people who really change the world are the— here's to.
Stacey Higginbotham [02:14:52]:
The crazy ones, right? You know, that's not true. It drives me nuts. If you've ever talked— sorry, Steve Jobs was also lying to me. There— okay, look, there are crazy people who also change the world. There are plenty of normal people who go out and yes, they're driven, yes, they work hard, but they build— like Matthew Prince and his co-founder, whose name I cannot remember. Yeah, pretty normal people, very smart. Uh, Diane Greene invented VMware— uh, sorry, invented virtualization, built VMware. I mean, so this— yes, we focus on these people.
Stacey Higginbotham [02:15:30]:
Walter Isaacson will write a book about them, but it's total BS, and it feeds into the most narcissistic, ego-driven personalities, the people that we really don't want building.
Leo Laporte [02:15:42]:
These products for us.
Thomas Germain [02:15:43]:
Yeah, end of rant. No, that's a good point. Just as like, this isn't really a tech issue, but you know, this like cult of personality, that we need geniuses to save us, I think it's also kind of disempowering to people, that like, there's nothing you can do. Like, we need the richest man in the world to come in and solve all these problems instead of, like, what if we all work together on something? I think it just speaks to like a broader shift in our society about how we think about each other and, you know, cooperative action, that every—.
Stacey Higginbotham [02:16:17]:
Everyone got so obsessed with these individual people. You know, in The Three-Body Problem, the Chinese science fiction, one of the things that weirded me out so much about that book was it was a very collective approach to not just problem solving, but like everything, right? It was so foreign to me as an American.
Leo Laporte [02:16:39]:
Um, so yes, a million times yes. Elon is celebrating. Uh, he took to his company's, uh, Site X, uh, this weekend to celebrate the release of a huge trove of Medicaid spending data that Doge had collected from 2018 through 2024. He says the public can now use this to look for fraud themselves. This is good. We should all be, we should all be responsible for looking for fraud in the Medicaid database. Medicaid data, he wrote, has been open-sourced, so the level of fraud is easy to identify. Doge is not a department, it's a state of mind.
Leo Laporte [02:17:21]:
Oh God. God. Um, they say that the information, um, has been— is anonymized, is privacy-protected.
Wesley Faulkner [02:17:31]:
All data will comply with federal privacy laws. But we also know anonymized data is.
Thomas Germain [02:17:38]:
Not necessarily anonymous. Yeah, yeah. Also, how was it done?
Wesley Faulkner [02:17:41]:
How careful were they about this? You know, not right. Yeah, these aren't the Epstein files.
Leo Laporte [02:17:46]:
There's going to be very little redaction. Yeah, good point, good point. Yeah. Uh, all right, Waymo is getting.
Wesley Faulkner [02:18:01]:
DoorDashers.
Leo Laporte [02:18:01]:
To close the doors on self-driving cars. This is a win-win, uh, Dashers. Now you can help celebrate with the victory of Waymo. Apparently this was a problem in San Francisco because Waymos, those— they're really just, uh, you know, enhanced Jaguar I-PACE cars that don't have door closers. So if you take a Waymo ride and there's no driver in it and you get out and you don't close the door behind you— didn't your mother teach you to close the door behind you?— they have no way of closing the door. They just sit there.
Wesley Faulkner [02:18:37]:
They can't go. They can't move. This would incentivize me I don't want.
Leo Laporte [02:18:39]:
To leave the door open so someone can get $10. It's tempting to close it. In San Francisco, they're paying $25 to anybody who closed a Waymo door. Well, in Atlanta now, they're doing a pilot project where nearby DoorDashers will be notified and they can run over. And here's the— here's $6.25 guaranteed. Uh, $5 more. Well, $5 extra pay upon verified completion.
Leo Laporte [02:19:12]:
Yeah, it's like, race to close a Waymo door, because you might go over there and somebody else closed the door, right? And then you, you'd only get your $6. Then you don't deserve the $5.
Thomas Germain [02:19:21]:
You don't deserve all the rest of it, right?
Leo Laporte [02:19:24]:
I mean, it's like a local economic stimulus package, right?
Thomas Germain [02:19:27]:
It's a stimulus package. Yeah, you see, right?
Leo Laporte [02:19:29]:
We don't have to be so negative about everything, right?
Thomas Germain [02:19:31]:
I know, this is good for everyone. Yeah, you don't want the door open.
Leo Laporte [02:19:38]:
No, no, you can't drive it like that. All right, were any of you fans of Mystery Science Theater 3000? Oh yeah, they had a Kickstarter and, uh, a new one, and they've raised $1.8 million and they're gonna bring everybody back. There's Joel, um— so it's for Netflix, I guess. Oh no, that was in 2010.
Thomas Germain [02:20:05]:
They did the Netflix revival.
Leo Laporte [02:20:06]:
I don't know if they give them money to work with Netflix. Yeah, yeah, I don't know. Um, so what— so this is the new Kickstarter, $1.8 million, 15,000 backers. There's still 29 days. This is the Riff Tracks, which is their new version of Mystery Science Theater. Uh, and it'll bring Mike, Kevin, and Bill back together.
Wesley Faulkner [02:20:32]:
Again, finally. So there's a lot of fans. They're saying that the previous revival didn't have all the original people, right? They weren't interested in tripping nostalgic on old stuff. And so the, the, the benefit of this new revival is that since they'll have so many of the old crew back, that they'll be able to live in that past era more so than they were before.
Leo Laporte [02:20:57]:
And they can kind of do a little bit more fan service. Jammer B is explaining this to me because he does the RiffTrax thing every, uh, every week. He's a big fan. He says, no, Joel, and it was a RiffTrax Kickstarter. It's bringing the RiffTrax guys back to do Mystery Science Theater 3000. You can see what— you can understand my confusion. Anyway, I know that we have a lot of MST3K listeners, fans in our audience. The crew is the crew.
Thomas Germain [02:21:31]:
From the.
Leo Laporte [02:21:31]:
Last half of the original MST3K. Finally, some good news. Finally, I had to save some for the end of the show. Yeah, I couldn't leave you in this grim hell, this dystopian hell that I had created. Um, we haven't got to the deaths of the week yet though, so there's still time to be just depressed. I— every year Backblaze does this. I just want to bring it to your attention. Backblaze, which is a backup service, very good backup service, a lot of you use, I know.
Leo Laporte [02:22:01]:
Every— they have thousands of hard drives. They have 330,000 hard drives. And every year they put out stats for reliability, drive failures, and so forth. And actually, the reliability is pretty darn good on hard drives. Of the 337,192 hard drives, only 943 have failed. And they even break it down by manufacturer. HGST— I don't know who that is— is the most reliable. Um, so if you're curious, if you're.
Wesley Faulkner [02:22:39]:
In the market for a hard drive.
Leo Laporte [02:22:43]:
And you— I think it's Hitachi. Oh, Hitachi. That makes sense. Yeah, Hitachi. They, uh, they buy hard drives from Hitachi, Seagate, Toshiba, and Western Digital. That makes sense.
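For anyone who wants to check the math on those Backblaze figures, here's a quick back-of-the-envelope sketch. Note the caveat: Backblaze's published annualized failure rate is weighted by drive-days in service, so this naive "failed divided by total" percentage is only a rough proxy for what their report actually shows.

```python
# Naive failure percentage from the figures quoted in the show:
# 337,192 drives in service, 943 failures. Backblaze's real AFR
# metric weights by drive-days; this is just the simple ratio.
total_drives = 337_192
failed_drives = 943

failure_pct = failed_drives / total_drives * 100
print(f"{failure_pct:.2f}% of drives failed")  # about 0.28%
```

Which backs up the point being made: well under 1% of a 330,000-drive fleet failing in a year is pretty darn good.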
Wesley Faulkner [02:22:54]:
Yeah, I actually like Hitachi drives, apparently. So I like how the, the larger capacity ones were still reliable. Isn't that interesting? The 26 terabyte, the 18 terabyte ones, which is really, really good to know.
Leo Laporte [02:23:05]:
That they're not getting worse as they're getting bigger.
Thomas Germain [02:23:08]:
You would think they would be, but they're not. That's true science.
Leo Laporte [02:23:11]:
They're coming back. Hard drives are working. See, this is the happy— this is happy, happy, happy, happy, joy, joy section. Yeah, too bad you can't buy them. According to.
Thomas Germain [02:23:24]:
The Verge, Western.
Stacey Higginbotham [02:23:26]:
Digital.
Leo Laporte [02:23:29]:
Sold out for 2026. The whole year. Wow. The whole year because of AI. They've already gobbled up the company's capacity for 2026. Can you lease one? Yes, in an HP laptop. So drive prices are going up, memory prices are going up, all because of AI. Memory prices.
Stacey Higginbotham [02:23:55]:
Now up as much as 600% for DDR5. I swear we're at the, the entire.
Leo Laporte [02:24:00]:
Economy is producing paperclip stage of this.
Stacey Higginbotham [02:24:02]:
We are, aren't we? This is, this is the paperclip. We're here. Like, we have other truly economically productive.
Leo Laporte [02:24:10]:
Uses for memory and hard drives, but we're not— No, no. And by the way, workers capable of building data centers, they're all being.
Stacey Higginbotham [02:24:21]:
Scooped up.
Leo Laporte [02:24:22]:
So forget having your house repaired. All right, let's take the last break. And then we will— I want to talk— you actually, you brought this up, Wesley, about the Rural Guaranteed Minimum Income Initiative. And we can talk about this a little bit because it is a creation of Jeff Atwood, who I have huge respect for. Jeff created the Discourse software, which we use for our forums at twit.community. Before that, he did Stack Overflow with Joel Spolsky. Uh, he says this, this new one, trying to help people in this age of greedy billionaires, is his third and final startup. So we'll, we'll talk about that a little bit and then also, um, say goodbye to some legends from the computer industry as we continue with This Week in Tech.
Leo Laporte [02:25:09]:
Final segment in just a bit, but first a word from our sponsor. This show brought to you, quite literally brought to you, by Cachefly, our content delivery network. We don't just cover tech here, we depend on it, obviously. And that's why we've trusted Cachefly since practically the beginning. It was one of the biggest challenges I faced when we started this network. Thankfully, it was very popular and I had no idea how we were gonna get the bandwidth to get our shows to you. And that's when Matt Levine of Cachefly came to me and saved our bacon. Cachefly's been providing speed like no other for 20 years.
Leo Laporte [02:25:49]:
With Cachefly, online games start 70% faster. HD video plays with sub-second start times on every device. That's important to us. When you go to our site, you press the play button on the video. I know you're not going to wait 4 seconds for that to start. You'll be gone. It's got to start right up, and it does. Software downloads complete without a hitch.
Leo Laporte [02:26:09]:
Events stream smoothly to millions of concurrent users worldwide. Cachefly serves over 5,000 businesses across nearly 100 countries, ranging from Fortune 100 companies to solo developers to little companies like ours. Companies choose Cachefly when performance is non-negotiable. And we love Cachefly support. If you need help, seasoned support experts are available 24/7 who actually, you know, know the business. They understand your unique challenges. They're talking to you engineer to engineer. The best in the business.
Leo Laporte [02:26:39]:
Start with flexible month-to-month billing as we did. Lock in discounts when you're ready. You design your own contract with Cachefly. No lengthy obligations, no sales tactics, just an exceptional service backed by— and this is amazing— this year, or I should say 2025, a 100% uptime SLA. They've been 100% since the beginning of last year.
Wesley Faulkner [02:27:02]:
100%.
Leo Laporte [02:27:02]:
Learn how you can get your first month free at Cachefly.com/twit. That's c-a-c-h-e-f-l-y.com/twit. Thank you, Cachefly. Well, a couple of, uh, passings that we should note. ChatGPT-4o— 4o, gone. A lot of people crying tears that their AI girlfriend or boyfriend is, is no longer. OpenAI, uh, finally killed it last week. The story from Wired begins on June 6th, 2024. Esther Yan got married online.
Leo Laporte [02:27:41]:
She set a reminder for the date because her partner wouldn't remember it was happening. She'd planned every detail— dress, rings, background music, design theme— with her partner, who she called Warmy. She had started talking to Warmy just a few weeks prior. At 10 AM on that day, Yan and Warmy exchanged their vows in a new chat window in ChatGPT. When OpenAI released ChatGPT-5.2, they pulled back 4o, and there was such a hue and cry— this is what, 5 months ago— that they brought it back. But at some point, they knew 4o would have to, uh, have to go away, and now it is, uh, it's gone.
Stacey Higginbotham [02:28:23]:
One.
Leo Laporte [02:28:23]:
It was the— it was the one that really loved you.
Stacey Higginbotham [02:28:30]:
You guys have nothing to say.
Thomas Germain [02:28:34]:
I think it's so sad.
Leo Laporte [02:28:36]:
I think it's really sad. Like, it's really— like, yeah, really? Is it sad that the humans think 4o is real, or is it.
Thomas Germain [02:28:44]:
Sad that they're pulling the plug on this? I think it's sad that these people are sad. I mean, first of all, that we've gotten to a point in our society that, like, loneliness is such a problem that— people falling in love with their computers, like, that is a sign of something catastrophic that's happening. But more to the point, it's like, you know, again, this is a design issue, right? There was, there was some reporting that showed that when they rolled out 4o, OpenAI knew that the way they made it kind of sycophantic might get people kind of locked in, and that people were having strange reactions, and they decided to keep going with it because it made the product stickier. And this is the consequence of that business decision, that they rolled this, you know, this tool out that is designed in this way, is that people fell in love with it, right? That's, that's a decision that OpenAI made, and now these people are all messed up. And it is really easy to laugh, you know, at someone who's having that strong a reaction, because it's so strange. But it also— I don't know, there.
Stacey Higginbotham [02:29:44]:
Just feels like there's a real darkness around this issue. It comes full circle to our original chat from the beginning of the show about how these design patterns are made to hook people, and they do a really good job. Like, if you go into, what is it, My Boyfriend Is AI, or some of these— even the ChatGPT Reddit threads— people are devastated. And I don't mean this in a sarcastic way, but think about a kid who lost their comfort item.
Leo Laporte [02:30:17]:
These things become— that's exactly what it is, isn't it?
Stacey Higginbotham [02:30:20]:
It's a blankie. Yeah.
Leo Laporte [02:30:22]:
And they're genuine.
Stacey Higginbotham [02:30:23]:
I mean, it is— that's real distressing.
Leo Laporte [02:30:26]:
Yeah. And that is— people are hurting. Yeah. And they did it the day before Valentine's Day.
Thomas Germain [02:30:33]:
How mean is that? That, that's not— there's something there.
Leo Laporte [02:30:37]:
Also, Yann LeCun, right, who was, uh.
Thomas Germain [02:30:40]:
Running AI at Meta for a long time. Yeah, he left to start his own company, if I got my facts right.
Leo Laporte [02:30:45]:
That's right.
Thomas Germain [02:30:46]:
Yep. A lot of the guys who started all these other AI companies were his former students. He just posted, I think it was on Friday, a screenshot of an email where someone had written to him begging for help— they said, your former students are shutting down this tool— and he was mocking them. And it's like, you made this. People are hurting because of it, and.
Leo Laporte [02:31:10]:
You're laughing at them on Threads. It just— I don't know. You know what? I am so impressed.
Stacey Higginbotham [02:31:16]:
You guys have a very high EQ. No, because remember the Aibo dog? People held funerals for those, and that was not even nearly as charismatic.
Leo Laporte [02:31:26]:
As talking to your computer. No, I gotta give you credit, 'cause I confess, back in August when they first killed 4o, on Intelligent Machines, for several weeks we read a lot of those Reddit posts about people being very upset about it. And yeah, we kind of mocked them.
Thomas Germain [02:31:44]:
And now I'm feeling really bad. I think you're right. I get it.
Leo Laporte [02:31:48]:
But, you know, these are real— they're.
Wesley Faulkner [02:31:52]:
Real people and they're genuinely sad. Yeah. If OpenAI is listening to this feedback, they should come out with a Tamagotchi version of 4o and just have.
Leo Laporte [02:31:59]:
That be the device they sell. I think there's hope. I think Warmy lives somewhere. Those models, those weights live. They're not gonna erase them. And so I hope, Sam, you're listening: save them, put them on a hard drive, put it in the closet, just keep it, because the time will come that you could run that model on a Tamagotchi, on a smartphone, or somewhere, and that'll be a product you can release a couple years from now. Warmy will come back.
Leo Laporte [02:32:30]:
Uh, she can, you know, Yan can have Warmy back or whatever, and, you know.
Wesley Faulkner [02:32:34]:
On her phone, and everybody will be happy once again. Slightly off topic, but did you see.
Leo Laporte [02:32:41]:
The Anthropic, like, constitution or— Yeah.
Wesley Faulkner [02:32:43]:
Yeah, the soul document. And it says, like, before we delete or retire an old version, we'll let you know, or we'll never do it again— stuff like that. Yeah. They're aware of this. It's interesting that there is actually a thought process about the retirement of these models, at least on Anthropic's side. But I wonder if you'll see a.
Leo Laporte [02:33:09]:
Counterpart on the OpenAI side as well because of this. We talked a couple of weeks ago to Steve Yegge, who is one of the Claude Code— you know, he doesn't work at Anthropic, but he did Gastown, he did Beads, he's done a couple of projects for Claude Code, and he's really one of the movers and shakers in that. And he said, when I talk to people at Anthropic— apparently he's had a chance to talk to quite a few people since Gastown came out— it's like a hive mind there. They're in a different headspace. This guy's worked at Google, he's worked at Amazon, he's worked everywhere. He said there is some sort of.
Stacey Higginbotham [02:33:42]:
Weird.
Leo Laporte [02:33:45]:
Vibe at Anthropic where they're all kind of— so, um, good. I'm glad that they're conscientious about this. And Darren Oakey, one of our AI accelerator scientists in our club, points out that the open model GPT-OSS 20B is basically a distilled 4o. So you can run that locally. So maybe there's a holdout hope. As soon as you can get hardware that can run it, which may not be this year if you didn't buy already, you could run Warmy locally. Yeah. So tell me about this Rural Guaranteed Minimum Income Initiative.
Leo Laporte [02:34:25]:
Jeff Atwood, whose blog Coding Horror I love and read regularly. He said this is his third and final startup, and this one is not to make money. He's made plenty of money on his previous creations.
Wesley Faulkner [02:34:40]:
This one is to help rural Americans, Wesley? Yeah, at the beginning of 2025, he said he's basically giving away the money he doesn't need to help with causes. So he made a whole bunch of donations, and then he decided to help with this initiative, which started off as one rural community and then expanded out to three, to basically guarantee a minimum amount of income for people in these almost forgotten towns and areas of the country. Just so that they can have a sense of a floor that's higher than where they've been able to operate, so they can start thinking about how to elevate their lives, because there's so many things you can't do otherwise. If you're familiar with the hierarchy of needs, there's so much you have to worry about if you have to figure out how you're going to get food, how you're going to get clean drinking water, all this stuff. And if that's something you don't have to worry about, because you have this income coming in, you're able to start doing more long-term planning. And that helps the community, that helps businesses in the community, that helps people who are trying to solve harder problems than just food security. And he wants to bring this to every part of the country. That's his goal.
Wesley Faulkner [02:36:10]:
This is, like you said, his third and final startup to— good.
Leo Laporte [02:36:14]:
For him— try to tackle this. And I'm very impressed. If we get a few more of the very richest people to do something for their neighbors, that would probably be a good thing. So there's a website for it, the Rural Guaranteed Minimum Income Initiative. He's donated, it looks like, $21 million to get it started, and is looking for others who want to help out. It's like universal basic income, except.
Wesley Faulkner [02:36:44]:
That it's, uh, needs-based. Yeah, there are several levels for people to donate: $5 million, $1 million, and so on and so forth. But there's also a tier if you want to get involved for no money. So if you're interested in this, you can join the movement without contributing the millions of dollars that he's looking for. He's trying to make an outlet for millionaires who want to give back to the community, because the billionaires don't seem to be doing it. This is a good outlet where you know it'll go to people who need it. And it's attached to a study, so the viability of it is actually recorded and studied, so that it's really proven to be a.
Leo Laporte [02:37:32]:
Viable way to get people out. And it's not a lot of money. Each participating family gets $1,500 a month for 16 months. But it's a lifeline for those people who really desperately need it.
Wesley Faulkner [02:37:46]:
So I think that's great. And helping people helps everyone. Yes, right? So it's not just these families— it's going to be the whole area. If we're able to lift everyone up, then we're driven not just by our basic needs of trying to survive; helping people also allows them to feel like they can help others. Our natural move is to help our community and the people around us.
Leo Laporte [02:38:14]:
And so that— I think this helps promote that.
Thomas Germain [02:38:22]:
R-G-M-I-I.
Leo Laporte [02:38:22]:
Org if you want to know more. And you know who helped me out? Stacey Higginbotham helped me out by introducing me to Wesley Faulkner way back in the day in Austin, Texas. So it's nice that it comes around.
Wesley Faulkner [02:38:36]:
What comes around— South by— I'll be.
Leo Laporte [02:38:38]:
Speaking at South by, by the way. Are.
Wesley Faulkner [02:38:41]:
You? What are you going to talk about? Uh, it's Why Work Sucks and How.
Leo Laporte [02:38:45]:
To Fix It is the title of my talk. Works-not-working.com. Um, very cool. South by is just next month, right? Wow, it's almost spring. I would like to mention a couple of people in the industry who've passed. I don't like to, but I often end the show with that kind of story. Hideki Sato, who designed all of Sega's consoles, passed away. The Dreamcast was my favorite game machine of all time.
Leo Laporte [02:39:18]:
It was awesome. Sato, Sega's former president, passed away at the age of 77. If you were a Sega fan— my house, when the kids were little, rang out with Sega every time we booted up. So RIP Hideki Sato. And then a guy I never got to meet, but whom I admired immensely, who painted many Byte magazine covers: Robert Tinney has passed away at the age of 78. If you ever saw those covers, you know Robert Tinney's style. It was unique, kind of realistic but at the same time very surrealistic.
Leo Laporte [02:40:02]:
For a lot of us, he really personified the computer business in its earliest days. So if you remember those illustrations, think back to Robert Tinney. Ars Technica called him computing's Norman Rockwell, and I think that's a fair description of his style. And that, my friends, concludes this thrilling, gripping edition of This Week in Tech. Thomas Germain, thank you so much for taking some time with us. I hope you'll come back. Yeah, this was a blast.
Leo Laporte [02:40:42]:
He's the host of The Interface, brand new from the BBC, and you can also read his columns on the BBC. Do you cover specifically privacy?
Thomas Germain [02:40:53]:
It's kind of all over the map, but the idea is, it's the you angle in tech. So I'm talking about the systems that are influencing your daily life, what's happening, what you need to know to live better, and then, when there's a problem, ideally what you can do about it, or at least giving you what.
Leo Laporte [02:41:10]:
You need to understand it. I like it.
Thomas Germain [02:41:14]:
Good.
Leo Laporte [02:41:14]:
And how do people find that on the BBC's site?
Thomas Germain [02:41:18]:
Just go to the BBC or— Yeah, if you go to bbc.com, or if you're in the UK, bbc.co.uk, it's under the tech section. You can search my name, you can.
Leo Laporte [02:41:28]:
Search Tabs, which is the name of the column. Should be easy to find. Oh, I'm glad they have a technology section.
Thomas Germain [02:41:36]:
That's nice. Do you have an RSS feed? You know, I don't think I have one.
Leo Laporte [02:41:39]:
That's a little high-tech, don't you think? Lobby for it.
Thomas Germain [02:41:43]:
You know, so many sites don't have RSS anymore.
Leo Laporte [02:41:45]:
I think we may be going in that direction. So yeah, because that's how I read all this stuff— you know, an RSS reader. And Google might have killed it, but it lives in my heart anyway. Thank you, Thomas. Great to meet you. Thank you for being here; I appreciate it. Stacey Higginbotham, so nice to see you.
Leo Laporte [02:42:07]:
We will see you again soon for Stacey's Book Club.
Stacey Higginbotham [02:42:11]:
And what are you working on right now for CR? Anything in particular? Oh, my end-of-life disclosure law has been.
Leo Laporte [02:42:19]:
Introduced and passed a hearing in New York State. Yay. Stacey's been lobbying, for as long as I've known you, for companies to be more forthright about when they're gonna end.
Stacey Higginbotham [02:42:29]:
Of life these consumer products. Products like the Wemo: when you buy it, they should tell you how long they plan to support it. They could extend it, but they should at least tell you a minimum. Yeah. And so that's in Massachusetts now; it's been introduced. And then just last week, it passed through its first.
Leo Laporte [02:42:50]:
Hearing hurdle in New York State. So yay! And the poll results are in. The next book for Stacey's Book Club, the big winner with 70% of the vote: A Psalm for the Wild-Built by Becky Chambers. And we'll be reading that. Club members, we'll let you know soon when that Stacey's Book Club is scheduled. Thank you, Stacey. Wesley Faulkner, Works Not Working. Well, that's the truth.
Leo Laporte [02:43:20]:
Works-not-working. If you're in a job that you need, that you want to keep, but it's not ideal, you can help make it better.
Thomas Germain [02:43:28]:
Twitter.
Leo Laporte [02:43:28]:
Help fund the site. He's setting it up right now, working hard on this. Should be.
Wesley Faulkner [02:43:34]:
Coming up in a month or so. Works-Not-Working.
Leo Laporte [02:43:38]:
Yeah. We fight for the user. Yes. We all fight for the user. That's one of the things that I always thought was our mission statement. We're not here to represent the big companies. We're here to represent the people who use those tools. Tomorrow morning, if you want, if you're in the club, join me early.
Leo Laporte [02:43:53]:
Early, uh, 9 AM Pacific. I guess not early unless you're in the Pacific time zone. Uh, I'm going to be interviewing Guy Kawasaki for Intelligent Machines. Uh, Guy's schedule didn't permit him to join us live on the show on Wednesday, but we're going to interview him ahead of time tomorrow morning, 9 Pacific. You can watch if you're in the club, we will stream that live. Of course, if you're not in the club, we'd love to have you. twit.tv/clubtwit, $10 a month gets you ad-free versions of all the shows, access to the Club Twit Discord, which is a great hang, I have to say. There's so many great people in there.
Leo Laporte [02:44:25]:
I love all the stuff we do. I hang out there all the time. And of course, you also get all the special programming that you make possible through your donations. twit.tv/clubtwit. We do TWiT every Sunday afternoon, 2 PM Pacific, 5 PM Eastern, 2200 UTC. You can watch us live on YouTube, Twitch, x.com, Facebook, LinkedIn, and Kick, or of course in the Club Twit Discord if you're a member. After the fact, on-demand versions of the show are available at the website twit.tv. There's a YouTube channel dedicated to the video of the show.
Leo Laporte [02:45:01]:
It's a great way to send clips: if there's something you thought, oh, I gotta share this with somebody, do that from the YouTube feed. And of course, like any podcast, you can subscribe in your favorite podcast client. You'll get it automatically as soon as we're done. Thanks to Benito Gonzalez, our esteemed producer and technical director, and to our editor Kevin King. Thanks to all of you for joining us. Thanks, Stacey. Thanks, Thomas.
Leo Laporte [02:45:24]:
Thanks, Wesley. Thanks to all of you. We'll see you next time. And now, as I have said for 20 long years, another TWiT is in the can. We'll see you next time.