Tech News Weekly 425 Transcript
Please be advised this transcript is AI-generated and may not be word for word. Time codes refer to the approximate times in the ad-supported version of the show.
Mikah Sargent [00:00:00]:
Coming up on Tech News Weekly, thank goodness for Jennifer Pattison Tuohy of The Verge, who gives us the lowdown on what's going on with Amazon's Ring, that Super Bowl ad about lost dogs, surveillance, and its partnership with Flock. Then I talk about that LEGO Smart Brick, because it turns out it took a long time to develop the technology. Afterwards, Tariq Malik of Space.com stops by to tell us about AI being used to drive a rover on Mars.
Mikah Sargent [00:00:43]:
This is Tech News Weekly, episode 425 with Jennifer Pattison Tuohy and me, Mikah Sargent. Recorded Thursday, February 19th, 2026. Ring's lost dog ad was never about dogs. Hello and welcome to Tech News Weekly, the show where every week we talk to and about the people making and breaking that tech news. I am your host, Mikah Sargent, and I am very excited because today I have the, the wonderful privilege of being joined by Jennifer Pattison Tuohy of The Verge. Welcome back, Jen.
Jennifer Pattison Tuohy [00:01:15]:
Hello, Mikah. Happy to be here as always.
Mikah Sargent [00:01:19]:
Always a pleasure to get to chat with you. And boy, do we have a lot to talk about because the Superb Owl happened and during that there were some advertisements, as there always are, and there's now been a lot of confusion, questions. I recently had to sort of explain what was going on to someone because they had these devices. I'm hinting at it. They had these devices in their home and wanted to know what was up. So, Jen, for the people who are listening, a little roadmap here. When we've got a big topic like this, we've got to take up the, the, the whole first part of the show to really dig in and understand what's going on. So without further ado, can you tell us what the heck is going on with Ring and flock and dogs and everything else in between?
Jennifer Pattison Tuohy [00:02:14]:
I can, I can. I will try at least. Um, I think— and while this happened a couple weeks ago, as you mentioned, this kind of kicked off with the Super Bowl. Um, why do people have to say Superb Owl? Is it like a licensing thing?
Mikah Sargent [00:02:28]:
Yeah, I think, I think, well, okay, you've got two groups. You've got the nerdy group who wants to show, I'm a nerd and I don't like sports. And so they say it for that reason. But then you do have the second group that's like, if you write Super Bowl or say Super Bowl in the wrong place, you will be sued.
Jennifer Pattison Tuohy [00:02:45]:
You'll get in trouble. Okay.
Mikah Sargent [00:02:46]:
Yeah.
Tariq Malik [00:02:47]:
All right.
Jennifer Pattison Tuohy [00:02:47]:
Sorry for the sidebar then. It was something I've always wondered. But yes, this did sort of kick off on, what was that, February 8th? So this was almost 2 weeks ago now, but it's the story that just keeps on giving.
Mikah Sargent [00:03:02]:
And it does, yes.
Jennifer Pattison Tuohy [00:03:04]:
And then there's also, it's sort of like February has become the month of the doorbell, because there's been some interesting news around Nest doorbells and the case of Nancy Guthrie, Savannah Guthrie's mother, who was kidnapped. So it's really been an interesting time to kind of dig into how this technology impacts people on a daily basis and what these companies are doing with our data and with our video footage. So the story around Ring is that they ran a Super Bowl ad. And Mikah, do you like dogs?
Mikah Sargent [00:03:34]:
I— yes, do I like dogs. I love dogs with my whole heart and soul.
Jennifer Pattison Tuohy [00:03:40]:
I know, me too. Me too. I'm a huge dog lover. And I think this was what Ring was counting on with this ad, which was all about how their new technology helps find dogs. It's a new service called Search Party, and it is tailored to use AI to scan through footage in Ring's cloud for missing dogs. And what happens is if your dog goes missing, anyone, and you don't have to have a Ring camera, can upload a photo of their dog to Ring's Neighbors app, or the Ring app if you have a camera. That initiates a search party, and it sends the picture of your dog out to all of the Ring cameras in the neighborhood. And if a Ring camera sees that dog in its footage, it then tells its owner, the Ring camera's owner, I've seen footage of what could be this missing dog. Would you like to share it with the owner? That is what Search Party does.
Jennifer Pattison Tuohy [00:04:43]:
Ring ran an ad during the Super Bowl explaining this in 30 seconds, which, I mean, I just explained it in 30 seconds, so you'd think it would be quite easy to do. But somehow the ad they ran made it look like, and sparked fears that, and not unfounded ones, I should say, this is a mass surveillance tool. And Ring has well-known ties with law enforcement; they have partnered with police departments in the past. And whilst they canceled that partnership, they have reinstated a system that does allow Ring users to share footage directly with local police and local law enforcement agencies. This all came together in a conflagration of, oh my goodness, Ring has just unleashed a mass surveillance network on our homes. And there was one image from the Super Bowl ad that shows homes in a neighborhood with blue surveillance cones coming out from each door, you know, where the cameras are searching. You look at that and, if you searched Getty for the ideal stock image of mass surveillance, that's what that image looks like.
Mikah Sargent [00:06:05]:
Yeah.
Jennifer Pattison Tuohy [00:06:06]:
So yeah, this caused a lot of consternation and concern, although this feature, Search Party, has actually been active for about 4 months now. It launched last October; they announced it on stage at Amazon's event. So the feature itself isn't new, or that new, but with the Super Bowl ad, it got a lot of attention. Also, what has happened since last October is we've had some really awful news coming out of Minneapolis in Minnesota, where there's been a really difficult situation between law enforcement and ICE. So, sorry, my son is just coming home behind me right in the important part of the story. Sorry.
Mikah Sargent [00:06:54]:
Don't worry, most of our listeners are just listeners, so they won't even have known.
Jennifer Pattison Tuohy [00:06:58]:
Okay, sorry. So what's also happened in the last few months is, um, you know, there's been a lot of backlash against ICE's activities, and much more concern about the idea that police could be sharing footage from people's cameras with law enforcement organizations like ICE. So that's kind of really flipped the script here as well. And the second part of this story was that Ring had announced, also last October, that it had partnered with a technology company called Flock, um, Flock Safety. And Flock Safety has gotten into a bit of hot water in the past for its ties to ICE and Customs and Border Protection. It's a little complicated, but law enforcement agencies use its cameras, which are license plate readers, to track potential crimes. Like, they can track license plates. And those law enforcement agencies had shared footage from the cameras they had access to with ICE. Um, and that kind of chain of custody, I suppose, the fact that footage had been shared to ICE by law enforcement, sparked fears that footage people share from their Ring cameras to law enforcement could then be shared with immigration officials, which in this current climate is something people were really horrified by.
Jennifer Pattison Tuohy [00:08:31]:
So it has been a kind of roller coaster as people have been reacting to this news. Ring responded by actually canceling the partnership with Flock, but they didn't say it was because of this concern people had around ICE and sharing with law enforcement. They said it was because it was going to take too much time and money. And I was like, no, I don't think that's why you decided to do this. And they still have a partnership with Axon, which is another technology company that makes systems law enforcement agencies use to aggregate data, to bring data in. So what's happening now is you have these technology companies helping law enforcement with their work. You have consumer camera companies who are providing access to their cameras to law enforcement, with their users' permission, they say. And the technology and AI that powers this Search Party feature, that can search for dogs in the cloud, is all coming together into what feels like a giant web.
Mikah Sargent [00:09:46]:
Yes.
Jennifer Pattison Tuohy [00:09:46]:
Whilst right now, Ring says categorically that no one has access to its camera feeds other than its users. And you have the choice to share. But once you share, they don't have any control over it after that, which is a bit of a problem. So yes, that is all coming together. And whilst, other than that Flock incident, we've not seen any examples of footage going from Ring cameras to law enforcement agencies like ICE, the concern is there and the technology is there, and it's just like a switch that could be flipped.
Mikah Sargent [00:10:24]:
It just needs to be. Exactly.
Jennifer Pattison Tuohy [00:10:26]:
That's the concern. And Ring hasn't really addressed that concern, other than to shrug, other than to cancel this Flock partnership. They still have the Community Requests feature, which has always been controversial. It's a reincarnation of a previous feature Ring had called Request for Assistance; that one actually went directly to law enforcement. So the advantage of Community Requests, according to Ring, is that it goes through these evidence management systems like Axon and Flock Safety, which in theory creates a stronger evidence chain, you know, so you know the footage hasn't been altered. But you're putting it into this database, and I think that's where people are getting freaked out. And there's been a lot, I mean, politicians have come out and spoken against this.
Jennifer Pattison Tuohy [00:11:18]:
And then, oh, I forgot to mention that at the same time, Ring also launched facial recognition. And that's the other piece of the pie that kind of brings it all together. Facial recognition right now is not tied to Search Party, but again, the pieces are there.
Mikah Sargent [00:11:35]:
Yeah. Let's talk more about the facial recognition aspect and what is involved there after we take a quick break. I want to tell you about Zscaler, bringing you this episode of Tech News Weekly. Zscaler is the world's largest cloud security platform. The potential rewards of AI, we know, I know, you know, they're too great to ignore. But here's the thing, you also can't ignore the risks: loss of sensitive data and attacks against enterprise-managed AI. Generative AI increases opportunities for threat actors, helping them to rapidly create phishing lures, write malicious code, and automate data extraction. There were 1.3 million instances of Social Security numbers leaked to AI applications, and ChatGPT and Microsoft Copilot saw nearly 3.2 million data violations.
Mikah Sargent [00:12:24]:
So it's time to rethink your organization's safe use of public and private AI. Chad Pallett, acting CISO at BioIVT, says Zscaler helped them reduce their cyber premiums by 50% while doubling their coverage and improving their controls. Take a look at this from Chad.
Mikah Sargent [00:12:44]:
With Zscaler, as long as you've got internet, you're good to go. A big part of the reason that we moved to a consolidated solution away from SD-WAN and VPN is to eliminate that lateral opportunity that people had and that opportunity for misdirection or open access to the network. It also was an opportunity for us to maintain and provide our remote users with a cafe style environment.
Mikah Sargent [00:13:08]:
With Zscaler Zero Trust plus AI, you can safely adopt Gen AI and private AI to boost productivity across the business. Their Zero Trust architecture plus AI helps you reduce the risks of AI-related data loss and protects against AI attacks to guarantee greater productivity and compliance. Learn more at zscaler.com/security. That's zscaler.com/security. And we thank Zscaler for sponsoring this week's episode of Tech News Weekly. All right, we are back from the break. As I mentioned, joined this week by the awesome Jennifer Pattison Tuohy, who is helping us understand what is going on with Amazon Ring and the Search Party feature and everything else that's involved. Now, I opened my Ring app the other day, uh, when I— when we moved to this, uh, home, it came with a few Ring devices, a Ring doorbell and a Ring camera that was, you know, sort of on the corner of the home.
Mikah Sargent [00:14:07]:
And so we kind of slotted in and, you know, signed up for the service and have used them regularly. They've actually been very helpful here. But anyway, I launched the app the other day and there were all of these AI features that I hadn't seen before. Can you talk a little bit about these AI features? And does it differ at all from what other companies are doing, like Wyze, for example? Or, I mean, frankly, any of the companies that are doing the thing where, oh, we see that there's a dog walking into the yard, or we see there's a package detected. Is it more than that? Is it less than that? What's the dealio?
Jennifer Pattison Tuohy [00:14:52]:
Yeah. So the AI is really one of the key elements here, because it has gone from a system where you would have to, like, scroll through hundreds of hours of footage to find something, to a system where you can just type in what you're looking for and it will pull up that footage. This uses vision language models, which are kind of part of the same family as large language models. It's changing the technology that cameras use. And this is really great for users of cameras, because it does make them much more useful. So, you know, you used to get, instead of just a motion alert, maybe a person alert or a package alert, so you had some context for what had happened on your camera. That was based on machine learning; they had to train that algorithm to understand what a package looked like.
Jennifer Pattison Tuohy [00:15:46]:
It was very complicated and it took a lot of time. We've seen that on a lot of cameras over the last few years with mixed success, because it's a harder technology. Whereas now, with AI, they don't have to train it on all of that data necessarily; it's a much broader technology. So now Ring has a feature called AI Video Search where you can go into your app and tap into a search bar. And I could put in something like kids on bikes, and it would search through footage from all of the cameras I have selected to search, and it will show me any clips of kids on bikes. And you can extend this to anything.
Jennifer Pattison Tuohy [00:16:32]:
And eventually I think we'll get to the point where you could set an alert to say, like, anytime you see kids on bikes, send me an alert. So this is just making your camera technology a lot more useful and less annoying. I mean, I'm an outlier case because I test a lot of cameras, but the number of notifications I get does make me want to turn them off. And then you turn them off, and then that's the night that someone tries to break into your home. And, you know, there's a larger question here, a broader topic, around do we really need to be surveilling our homes and are there other solutions, which feeds into the broader civil liberties topic we were talking about earlier. But in terms of the technology here, what's different about what Ring is doing versus all the other companies that offer these types of services is that Ring has a communal feature, and that's what Search Party is. So that means it's not just you in control of your footage. Other people have access to it.
Jennifer Pattison Tuohy [00:17:31]:
Not direct access, but, you know, as I mentioned earlier at the top of the show, I could upload a picture of my lost dog and it can search your footage. Now, whilst Ring says this is all controlled and your privacy is protected, and you don't have to share the footage if you find the lost dog. You know, what kind of a person are you if you don't? But you don't have to share that footage. Um, Ring has created a tool that allows the cloud data to be scanned to search for something. Right now it's dogs. They've added wildfire as another element that you can be alerted to around your community. They said they're going to do cats, but they've specifically said it cannot search for people. But this is obviously the concern that has arisen around this technology: they designed it not to search for people, but that doesn't mean it can't one day become a tool that could search for people.
Jennifer Pattison Tuohy [00:18:31]:
And because they've created a communal search process, as opposed to it just being me looking for kids on bikes, the next question is, well, who else can search? And could there be nefarious uses in an increasingly authoritarian government? I mean, where does this end? And that's kind of the controversy. Um, it's definitely been an interesting one. And to add to the whole package, this recent Nancy Guthrie case, where they were able to retrieve thought-to-have-been-deleted footage from her Nest camera, makes everyone wonder, okay, well, is my deleted footage not really gone? But also, "Oh, yay, maybe they're going to be able to find her," which leaves you like, "Well, cameras are good because they help, but cameras could also be bad." And what it comes down to is the tools aren't the problem. It's how we use the technology. Yes.
Mikah Sargent [00:19:33]:
That's a really good point. Ultimately, it is. It's how we use the technology and how we sort of predict how people will use the technology. And that's what we have to be mindful of.
Jennifer Pattison Tuohy [00:19:44]:
And that's hard to do.
Mikah Sargent [00:19:46]:
Yeah, because we don't know until it's out there. And then people make use of it in good ways and bad. Sometimes it's novel and helpful, and sometimes it's not.
Jennifer Pattison Tuohy [00:19:59]:
How do you feel about it, Mikah?
Mikah Sargent [00:20:04]:
I obviously— look, as you asked me at the beginning, do I love dogs? Yes. And I think that's also part of this. It's the same thing as the lawmakers who put forth these different ideas as "save the kids"; it's hard to argue against that aspect of it. And so, 100%, if someone's lost their pet, I want them to be able to find their pet. And if this is a way to do that, great. However, there's a lot here. I've also been part of the Neighbors app and part of the other— I can't think of what it's called right now, but the other one that people use all the time.
Jennifer Pattison Tuohy [00:20:46]:
And Nextdoor.
Mikah Sargent [00:20:47]:
Thank you, Nextdoor. And those are hellscapes of all sorts of isms.
Jennifer Pattison Tuohy [00:20:52]:
Racial profiling. Yeah.
Mikah Sargent [00:20:54]:
And so let's not add to that is how I also feel. So yeah, again, as is often the case with tech, I'm of two minds about it. You wish that you could have the good without the bad. But the potential for the bad is always there and has to be considered. And I have to tell you, I was honestly surprised to see Amazon back away from Flock. Given the administration and the current state of things, it seems like, more than ever, companies feel empowered to do whatever they want to do regardless of any sort of humanist pushback. And so I was honestly surprised that the company decided to step away from it. Doubling down feels like the narrative of everything at the moment.
Mikah Sargent [00:22:05]:
And so, yeah, there was a bit of like, huh, interesting.
Jennifer Pattison Tuohy [00:22:09]:
But that was a pure PR move, because the system is still there. The Community Requests system is still there. The Axon service that they partner with is the same thing as the Flock service. Nothing's really gone away; it's just smaller. It would have been bigger if they had partnered with Flock. But yeah, I think the big backlash we saw here, and I don't have any data on this, affected their users. And I think they maybe saw some cancellations. There was certainly lots of talk about people ripping their Ring doorbells off.
Jennifer Pattison Tuohy [00:22:44]:
Their front doors, whether they actually did or not, we don't know. But that's actually something I'm looking into. So if you have a Ring doorbell and you got rid of it and want to talk to me, let me know.
Mikah Sargent [00:22:53]:
You heard it here, you gotta talk to JPT about your Ring doorbell rippage. I will tell you, mine is still on my door, but I do have all of the AI features turned off.
Jennifer Pattison Tuohy [00:23:04]:
Turned off.
Mikah Sargent [00:23:05]:
And frankly, I regularly go in and check after app updates to make sure it's still toggled off and that they haven't— Oh yes, the Search Party.
Jennifer Pattison Tuohy [00:23:14]:
Was on by default.
Mikah Sargent [00:23:15]:
Exactly.
Jennifer Pattison Tuohy [00:23:16]:
Yeah, so that was another factor in this. Turning things like this on by default, I don't think, is ever a good idea. I get the idea, like it wouldn't work as well without it, but still, you need to let people choose. I mean, you can turn it off, though, as you found out.
Mikah Sargent [00:23:33]:
Yes, but, uh, yeah, the default option should just be to let the user be in control of the thing they purchased.
Jennifer Pattison Tuohy [00:23:43]:
Exactly, exactly.
Mikah Sargent [00:23:45]:
Well, Jennifer Pattison Tuohy, it is always a pleasure to get to chat with you here on the show. I want to thank you so much for taking the time to join us today. If people would like to follow you online and check out all the great work you're doing, you're a prolific writer, uh, and so there's always new stuff to check out. Where are the places they should go to do so?
Jennifer Pattison Tuohy [00:24:02]:
Yeah, um, so at theverge.com/jenniferpattisontuohy you'll get all my articles, and there's a lot if you want to dig into the Ring stuff. And then I'm also on the socials, on Threads and Bluesky and X, @smarthomemama and JP2E. So come and see me there, and I look forward to chatting again next month, Mikah.
Mikah Sargent [00:24:26]:
Yes, can't wait. Thank you so much. Alrighty, folks, it is time to take a little moment here so I can tell you about Club Twit at twit.tv/clubtwit. If you haven't heard the good news, the good word, well, now you have. Head to twit.tv/clubtwit and sign up: $10 a month, $120 a year. When you join the club, uh, you are going to gain access to some awesome benefits. First and foremost, you get every single one of our shows ad-free, just the content.
Mikah Sargent [00:24:55]:
You also gain access to our special feeds. That includes a feed of our behind the scenes, before the show, after the show. We also have a feed of our live coverage of tech news events. And we have a feed for our special shows that are published in the club, like My Crafting Corner, the recent D&D campaign that— or adventure that I ran, Stacy's Book Club, plus so much more. And access to the members-only Discord server, a fun place to go to chat with your fellow Club Twit members and those of us here at Twit. If all of that sounds good to you, well, join the club. Uh, you will— you should head there, twit.tv/clubtwit. You'll see deals, you know, we've got a trial, and then we also regularly offer discounts throughout the year, so be sure to check that out as well.
Mikah Sargent [00:25:43]:
I always like to go in and see the new folks who have joined. Uh, so John, uh, welcome. Also Optimus, welcome to you. And Mary, hello to you. Thank you for being part of the club. It is always a pleasure to have you there. All right, uh, twit.tv/clubtwit. Let's head back to the show.
Mikah Sargent [00:26:05]:
It's time for a quick little story of the week before my final interview of the day. You may remember I talked about LEGO's Smart Brick technology before, right? Well, Wired has a really interesting piece, an exclusive, where senior innovation editor Jeremy White takes you on a deep look inside LEGO's Creative Play Lab in Denmark, where the company has been quietly developing what might be its most significant product since the minifigure was introduced nearly 50 years ago. It's called the Smart Brick. You may remember it's a sensor-laden 2x4 black brick that's packed with an LED array, accelerometer, light and sound sensors, a miniature speaker, wireless charging, an analog synthesizer, and, importantly, a custom 4.1mm chip. This is expected to land in stores on March 1st alongside a wave of Star Wars sets, and the Smart Brick is the culmination of an 8-year development journey that had lots of prototypes, of course sleepless nights, scrapped product lines, and more than 20 patented world firsts. The whole thing started, turns out, because a guy was chiseling damp plaster off a wall during Christmas break. So let's talk about this.
Mikah Sargent [00:27:22]:
I love this. The Smart Brick is the brainchild of Tom Donaldson. Okay, Tom Donaldson is the senior vice president and head of LEGO Group's Creative Play Lab. And during the 2017 Christmas holiday, he was doing some repair work, pulling plaster off of a damp wall with a chisel drill. And he said, you know, I was just really bored and I got this idea. Um, he called them key insights. The first key insight he had was that instead of one big piece of technology, LEGO needed lots of small pieces, because kids don't play with one big thing. It becomes boring, because kids play with lots of stuff.
Mikah Sargent [00:28:00]:
So they needed something that kids could have lots of. Second, the Smart Brick had to be universal, so that the same brick would work no matter which LEGO set you bought. And third, it needed wireless charging, because expecting parents to manage batteries across multiple Smart Bricks would be, as he called it, a nightmare. So those were the insights he came in with, and he took it from there. The Smart Brick has quite a bit of technology inside. It's just a little 8-stud rectangle, and you have an LED array, the accelerometer I mentioned, a light sensor, sound sensor, speaker, battery, and then this copper coil assembly that's controlled by a mixed-signal chip.
Mikah Sargent [00:28:45]:
Now, what's fascinating is that when you hear sounds from this brick, they're not pre-recorded. These aren't audio clips stored in memory somewhere on the device. They're actually generated in real time by the onboard synthesizer. The brick takes its instructions from NFC-enabled Smart Tiles and Smart Minifigures, meaning different tags and characters can change the play experience depending on what you have. So it's almost like the brick itself is the game console, and these little NFC tags are kind of like cartridges that you put into the console. And then the team also went on to build a proprietary brick-to-brick positioning system. These little copper coils inside will sense distance, direction, and orientation between multiple Smart Bricks.
Mikah Sargent [00:29:38]:
And what's great is that the whole network is self-organizing. There's no setup, no app, no central hub, no external controllers. And if the network fails, it resets itself automatically. The Bluetooth-based BrickNet protocol handles data sharing between bricks. Really, really cool stuff. Um, but here's the thing. It started out much more complicated, as is often the case, and then over time got more miniaturized. The work was done in collaboration with a company called Cambridge Consultants, and Andrew Knights, the technical director at Cambridge Consultants, walked White through the various prototype stages.
Mikah Sargent [00:30:18]:
So the first prototype was a chunky crocodile-like creature with wires held down by Blu Tack and capacitive touch that would actually make the crocodile's head spin. He says he built it in about 3 hours and then got on a plane to LEGO HQ. Then they produced these oversized 3D-printed gray bricks, one that had a light sensor, one that had audio, and kind of used those to work on it. And it's really fascinating. I don't want to go into all the details, because I want people to be able to check this out. You've got to read this piece from Wired. But overall, they just continued to iterate and iterate and iterate. Every 2 weeks, the team would put prototypes in front of kids and see how they played with them, trying to discover this sort of sweet spot.
Mikah Sargent [00:31:02]:
And so they talked about, you know, making sure they've got the right stimulus and that they're not going kind of the wrong way. Um, what's fun too is that my comparison of the NFC tag to a console cartridge is not really an original thought. In fact, the system was a nod to 1980s game design. It also fits LEGO's existing manufacturing model, where you have generic components that are produced in bulk and then get customized late in the process. This is a little bit different, because the brain brick had to be universal, and it's the tag that provides the specificity. So it's a little bit like, you know, a tap of your card on a payment terminal. But you have to encode these interactive game experiences in such tiny memory. There's so little memory that you can store on an NFC tag.
Mikah Sargent [00:32:01]:
So they kind of said, you know, what we can do is look back to 1980s game design and see what was able to be encoded at that level. Then, going on, Knights talked about what it's like to write these unique programs and fit them in these little NFC chips, saying, quote, yes, it's difficult, but this project was all about solving problems. And it goes on from there, because that wasn't the only issue, right? Trying to figure out how to program something in this miniaturized way. Yeah. Wireless charging was also an issue. Um, and it turns out that this is a problem that other companies were not able to solve. LEGO needed kids to be able to charge multiple Smart Bricks simultaneously on a single pad, a one-to-many wireless charging solution, right? Does that sound familiar to any of you? If not, then you didn't hear about Apple trying to make their AirPower charging pad, which was supposed to let you put multiple devices down on the surface and charge them all at once. Apple wasn't able to do it, but it was something that LEGO continued to work on until they got it.
Mikah Sargent [00:33:26]:
The team, uh, has built prototype chargers that powered 10 bricks at once. The system is designed to charge through height, so you can actually place an entire assembled model on the charging pad and it'll still charge. The wireless charging system shares the same coils as the tag reading system, so the same coils handle both charging and NFC, including NFC positioning for 3D spatial tracking. They're using— this is so cool— they're using the coil for multiple purposes to let the chip understand where bricks are placed. Like, what? Uh, this is the quote: the nearest thing we saw close to this was in the F-16 fighter jet, which in the helmet has something inside to know where the pilot is looking relative to the cockpit. What? That's where they got the inspiration? So, so cool. Um, there's a lot more to this, but again, I want you to go read the piece. Uh, Knights claims that the team delivered every feature on its original wishlist, which is not very common in product development. And, uh, you know, the project was not kind of put forth as this idea of what is the new thing that we can do to make the most money, uh, because frankly, this has the risk of making LEGO less money if people buy these LEGO sets and can do new things with them instead of having to get a new LEGO set to have a new thing happen.
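Wired doesn't spell out how the shared-coil positioning works, but the general idea can be sketched with a toy model: if several reader coils each report how strongly they sense a tag, a signal-weighted average of the coil positions estimates where the brick sits between them. The function and numbers here are invented for illustration and are not LEGO's algorithm.

```python
def estimate_position(readings):
    """Estimate a tag's (x, y) from per-coil signal strengths.

    readings maps each reader coil's grid position to the strength
    with which it senses the tag; a signal-weighted centroid is a
    simple (toy) way to localize the tag between coils.
    """
    total = sum(readings.values())
    x = sum(pos[0] * s for pos, s in readings.items()) / total
    y = sum(pos[1] * s for pos, s in readings.items()) / total
    return x, y

# A tag seen equally by two coils lands halfway between them
print(estimate_position({(0, 0): 1.0, (2, 0): 1.0}))  # (1.0, 0.0)
```

A real system would also have to calibrate strength against distance and handle height, but the centroid captures the "coils doubling as a position sensor" idea.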
Mikah Sargent [00:34:56]:
So this was really about making something cool, making something big, and making something different. And I think that's a pretty neat idea. So definitely go check out this piece from Wired. A really, really interesting story, and one that I think is worth everybody's time to go read. And thank you to Jeremy White of Wired for writing it. Very cool stuff. All right, folks, we're gonna take a break and come back with our final interview of the show. All right, we are back from the break, and I've got to tell you right now.
Mikah Sargent [00:35:35]:
There is a car-sized rover sitting on Mars that just made history, not because of where it drove, but because of who planned the route. Joining us today to talk about what is going on in space, it's Space.com's editor-in-chief, Tariq Malik. Welcome back, or welcome, I suppose.
Tariq Malik [00:35:55]:
Yeah, thank you, Mikah. Thank you for having me. I'm very excited to talk about stuff that's not with Rod, uh,
Mikah Sargent [00:36:02]:
today. So that's good. Uh, so first and foremost, NASA just announced that the, uh, Perseverance rover completed its first ever drive on Mars that was fully planned by AI. Can you actually walk us through what happened during this demonstration and why it's such a significant milestone?
Tariq Malik [00:36:18]:
Yeah, this is really interesting because this week, as you and I are talking, we are celebrating 5 years of Perseverance being on Mars, which alone would be exciting. But 5 years into this mission, now they have planned the first drive, actually two drives, completely with what they call, uh, I think it's vision-capable generative AI. And normally it can take, you know, days for NASA's rover operators— they call them, uh, joystick operators— to plan a drive for the rover itself. You mentioned it's the size of a car, powered by a nuclear battery. And there's a lot of back and forth that would go through the planning process, 'cause you don't want your $2 billion machine to hit a rock and tip over, and then it's ruined on Mars. So what they have done is they've used this generative AI, they've worked with Anthropic and with Claude, I think it's called Claude AI, is that what it's called? Or do we just call it Claude?
Mikah Sargent [00:37:21]:
Do they have— I think we just, I just call it Claude. But there probably is a special name. But yeah, I just call it Claude.
Tariq Malik [00:37:27]:
Then people, they'll know. So they used this kind of capability, this vision-based system, uh, to marry it with all of the data that the rover collects with its really high-definition cameras, uh, as well as all of the mapping that they have. And we have a lot of mapping of Mars, both from the ground, from the rover, as well as from the satellites that are up in space. You know, they map everything out over time. And what they were able to do is say, hey, instead of me at the controls saying drive, let's say, the distance of a football field. NASA usually would say you can drive maybe 300 feet and then you gotta stop and check in with us. Instead of that, they would say, all right, Claude, you know, or whatever NASA wants to call this system, um, go ahead and generate this and set the waypoints out and then upload everything. And so the system was able to look at all the visual data from the rover, look at all the visual data from these maps, and then identify hazards like scarps or crags or big rocks and whatnot, and compare all that together and say, okay, here is the waypoint.
Tariq Malik [00:38:37]:
Yeah, sure, here's 300 feet, but here's another 300 feet, here's another 300 feet. And they upload all of that. And on these two drives, it didn't have to drive, stop, call home, have, uh, the engineers on the ground compare everything, turn it around, and get new instructions back to the rover, which can take like a day, maybe more. It was able to just keep going. So it goes to that waypoint, and it doesn't have to call home because it already has all of the information from the AI-generated route, and it just keeps going. And they were able to do that twice and drive for longer periods of time in a faster way than ever possible. And that's really exciting when it comes to trying to get to your next
Mikah Sargent [00:39:16]:
science target on Mars. Yeah, absolutely. Now, the way that things have worked up to this point, as far as I understand it, is that human drivers, you know, not actually physically there, uh, have been the ones planning the rover routes from here on Earth. Can you tell us, in as much depth as you want to provide here: what is the traditional process? And I think something I hear people talk about a lot that is kind of hard to understand is what it actually means to control something 140 million miles away. Right. And, like, how much data can be sent at a time? How long does it take to get there? How long does it take for the thing to respond? What happens if you type in the wrong thing and then you drive it over? Yeah, like there's so much to consider there, right? Are human drivers, like, mapping everything out first before they— yeah, I
Tariq Malik [00:40:14]:
have so many questions. There are so many things that can go wrong, you know. Let's consider the longest-running rover off the planet, NASA's Opportunity Mars rover, right? It lasted for years, actually a decade plus, you know, uh, beyond its 3-month, maybe 6-month, primary mission. That was great. How did it die? It drove into a sand dune and got stuck. And then that's it. That's the end of the mission, right? Uh, NASA's Spirit rover, like, some things went wrong.
Tariq Malik [00:40:44]:
Uh, the twin of Opportunity, it got stuck. And that was it. You need to know where all this terrain is and have a way to avoid it. Uh, NASA has lost missions to Mars because, you know, a distance was in metric units when maybe it should have been in imperial, or inches instead of centimeters, you know, things like that, where they make those kinds of basic mistakes. And the hope is that you can take some of that human error out, and you can speed up the decision process too. Because, as you said, Mars is 140 million miles away, give or take. The time it takes one signal from Mars to say, hey, I reached this waypoint, is maybe 12 minutes, 15 minutes. Uh, then the engineers get it. They say, okay, hey, it's 6 o'clock, I just got this thing, I'm gonna go have dinner now and I'll come back tomorrow and look at it. They spend the day analyzing it, then they send that back. That's another, you know, 15 minutes up to half an hour depending on where Mars is in its orbit.
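The delays Tariq quotes follow directly from the speed of light. A quick back-of-the-envelope sketch (the distances are approximate, and the 18-hour ground-analysis figure is an invented placeholder, not NASA's actual turnaround):

```python
C_MILES_PER_SEC = 186_282  # speed of light in vacuum, miles per second

def one_way_delay_min(miles):
    """Minutes for a radio signal to cross the given distance."""
    return miles / C_MILES_PER_SEC / 60

# Earth-Mars distance varies with the planets' orbits
print(round(one_way_delay_min(34e6), 1))   # ~3.0 min at closest approach
print(round(one_way_delay_min(140e6), 1))  # ~12.5 min at a typical distance
print(round(one_way_delay_min(250e6), 1))  # ~22.4 min near the far end

def stop_and_check_days(legs, analysis_hours=18):
    """Naive cost of driving in legs with a ground check-in after each,
    versus one pre-planned route uploaded in a single pass."""
    hours_per_leg = 2 * one_way_delay_min(140e6) / 60 + analysis_hours
    return legs * hours_per_leg / 24

print(round(stop_and_check_days(6), 1))  # ~4.6 days for a six-leg drive
```

The light-travel time per leg is small next to the human turnaround, which is why chaining AI-planned waypoints (skipping the per-leg check-in) saves days, not minutes.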
Tariq Malik [00:41:44]:
Now you've lost a day, maybe a day and a half, of exploration. The rover drives everything and then it says, hey, I'm at the next spot, and they have to do it all over again. So that means that a drive like the one it just did over the course of a couple of days could take weeks, or a month or more, to get through. And that slows down your science. Especially if you're in, let's say, a quote-unquote boring part of Mars where there isn't much science to stop for. I mean, I don't know how that's possible, right? Right. But if it's a place where you're just trying to get across, like when we're on the highway. I grew up in California.
Tariq Malik [00:42:18]:
I grew up in Stockton and went to university in Los Angeles. That's 6 hours of highway, and like 3 hours of it is nothing. Nothing for a long time. If you find a place like that, you don't want to spend weeks with the rover checking in, saying, hey, do I still go straight? You know, that kind of thing. Yeah. And that's what this system can really allow. And, um, I'm not sure if— I just saw this new thing, it got announced right before you and I were meeting, and they've got a new system now that allows the rover to know exactly where it is by itself too.
Tariq Malik [00:42:48]:
And oh wow, that is really interesting. I don't know if you want to
Mikah Sargent [00:42:50]:
get into that now or get through some other questions. Planet GPS? Can you tell us?
Tariq Malik [00:42:54]:
Yeah, what— how does that work? So it's not GPS; they call it, like, the Mars Global Something System. And what they've done is they've taken the communications processor on the rover that was the base station for the helicopter. You know, Perseverance had this helicopter, Ingenuity, a little drone that they could fly, and it would scout ahead and find really cool things, and they'd go, yeah, we'll go there. It crashed and it died, but they still have that hardware. They took it, and they were able to, uh, let it process the panoramic images that the rover takes, compare them with the onboard maps that it has, and it can now, within 2 minutes, pinpoint within about 10 inches of accuracy exactly where it is on Mars using that system. That used to take, again, up to a day, where it would drive, take a picture of where it is, say, I'm going to drive, I don't know, 20 feet, and then say, I've driven 20 feet, I am here now on Mars. But there were maybe some inaccuracies, because the wheels would slip or whatnot.
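The drift problem Tariq describes is just accumulated error: every leg of dead reckoning adds a little uncorrected wheel slip. A toy model (the 2-feet-per-segment slip figure is invented to match the ballpark numbers in the conversation; real slip varies with terrain):

```python
def dead_reckoning_error_ft(segments, slip_ft_per_segment=2.0):
    """Position error after driving in fixed segments where each
    segment adds a small, uncorrected slip. Purely illustrative."""
    return segments * slip_ft_per_segment

# Fifty 20-ft legs (1000 ft of driving) at 2 ft of slip each:
print(dead_reckoning_error_ft(50))  # 100.0 ft of drift, roughly the
                                    # figure mentioned in the conversation
```

A visual fix against onboard maps resets that accumulated error to near zero each time it runs, which is why the 2-minute, 10-inch localization matters.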
Tariq Malik [00:43:56]:
After a while, that's 100 feet of difference between where it thinks it is and where it actually is, so NASA would then have to double-check all of that from home the next day before it could drive. Now it can do all of that by itself, and this AI system can plan out all of the drives, so they can really maybe open up the road a bit, if you will, and let this rover drive maybe a little bit quicker,
Mikah Sargent [00:44:18]:
uh, and explore a bit more of Mars. The scrappy, innovative nature of space scientists and, you know, engineers is just incredible to me. Because I think about how, on Earth, if something fails, you go and grab it and say, you know what, I have the opportunity to fix it and let it serve its original purpose, and we're going to try again. But on Mars, because it's so far away, there's not a lot you can do. Instead of just letting this stuff sit there and not do anything, they said, let's try to do something with it. And you're doing it with tech that is as old as it was when it left. That is just mind-blowing to me. I just think it's so incredible, the stuff that's able to happen so far away. And hearing about it always kind of just makes me go, "Whoa."
Tariq Malik [00:45:16]:
It's really amazing because, you know, JPL, the Jet Propulsion Laboratory, which built the Mars rover, has this saying: dare mighty things. It might seem surprising now, but when Perseverance's predecessor Curiosity landed on Mars, it didn't have, like, a conventional landing system. It didn't have, uh, the big airbags or anything. They had a sky crane, and that lowered it down, and it worked perfectly. It was crazy, this idea that they had. And it worked so perfectly, they did it again with Perseverance, where it again worked perfectly that time as well. And so they keep coming up with these really wild ideas to do bigger and better things. But they are able now, it seems, to just extract the maximum amount of science and engineering out of these systems, because they have tested them so much on Earth, they know what they're capable of. And then as technology progresses, they're able to say, what if we used, you know, X to do Y instead of Z? You know, what if we used this helicopter base station as a communications setup for this other system? What if, now that we have all these AI systems, we just let that do the planning, so that we can now look at a deeper science target and plan out a better science campaign?
Tariq Malik [00:46:38]:
And I think that's what is very interesting, because we're seeing AI in all of our daily lives, and this is a very stark example of how it can make something much faster and more efficient when it comes to robotic exploration of space. Because if they can do it on Mars, then they can do it on the Moon when we have people living there and we want to send rovers out to go find the frozen water ice, or the helium-3 when we get to the point where we can use it. Uh, NASA wants to send a helicopter to Titan, the cloudy moon around Saturn. That's a big drone, also nuclear-powered. And how is that going to fly around that moon? This system can help it fly for longer as well. And I'm pretty sure the folks building
Mikah Sargent [00:47:25]:
that mission are watching this right now too. Wow, wow, wow, wow. Now let's get into it a little bit more. Um, you talked a little bit about this: the AI is described as a vision language model. Can you tell us specifically how this compares to what we know about AI here on our planet, and whether the AI that's being used is different from what we're used to?
Tariq Malik [00:47:56]:
Well, what I was able to discern from researching this and working with our editors, you know, as they were writing the stories, is that they're using this generative AI to plan a path based on existing data that doesn't change. That seems a bit different than, say, asking an AI, you know, between you and me, to write a paper or come up with a proposal, where it's really building something out of whole cloth. They know that they can say, hey, here is the complete dataset that we have of Mars. We have high-resolution imagery from these orbiters. Each of these images has a resolution of XYZ. Here are the dimensions of the spacecraft itself. It has 6 wheels. It's this tall. It has this kind of suspension system.
Tariq Malik [00:48:47]:
Uh, here's the kind of dangers we have to avoid, and this is what they look like, both from above and from the rover. And so this system can look at all of those datasets and then map out a path that identifies where those dangers are, and then it sets some waypoints to check how things compare with that internal map before it keeps going. So I do feel that it has a very specific use case, and you can check exactly what it's using, because you know what the maps are. It's not finding new maps of Mars, because there aren't any new maps of Mars. And so, knowing that it has that concrete dataset, you know exactly what went into the path that it sets. Now, you mentioned that they do have a digital twin on Earth, and that's really crucial, because they can do anything they want with a digital twin.
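NASA's planner works over real imagery with a generative model, but the core task Tariq lays out, charting waypoints across a known map while steering around marked hazards, can be illustrated with a much simpler stand-in: a breadth-first search over a hazard grid. This is a pedagogical sketch, not the actual system.

```python
from collections import deque

def plan_route(hazard_grid, start, goal):
    """Shortest path on a grid where 1 marks a hazard (scarp, rock, ...)."""
    rows, cols = len(hazard_grid), len(hazard_grid[0])
    prev = {start: None}
    queue = deque([start])
    while queue:
        cell = queue.popleft()
        if cell == goal:          # walk back through prev to recover the path
            path = []
            while cell is not None:
                path.append(cell)
                cell = prev[cell]
            return path[::-1]
        r, c = cell
        for nxt in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            nr, nc = nxt
            if (0 <= nr < rows and 0 <= nc < cols
                    and hazard_grid[nr][nc] == 0 and nxt not in prev):
                prev[nxt] = cell
                queue.append(nxt)
    return None  # goal unreachable

# A small map with a ridge of hazards the route must skirt around
terrain = [
    [0, 0, 0, 0],
    [0, 1, 1, 0],
    [0, 0, 1, 0],
]
route = plan_route(terrain, (0, 0), (2, 3))
print(route[0], route[-1])  # (0, 0) (2, 3)
```

Every cell on the returned route is hazard-free by construction, which is the grid-world analogue of the "check against the internal map before continuing" behavior described above.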
Tariq Malik [00:49:38]:
They do have a physical twin as well. Uh, they use those in, like, a Mars yard if there's something that's particularly finicky and they can't figure out what to do, or something's gone wrong and they're trying to recreate it. But this digital twin that they have captures all of the different orientations of the rover itself, and they can manipulate it and pose it exactly as they see it based on the imagery and the data they have from above and from the rover. And they can say, okay, go do this, what will happen? And, you know, 9 times out of 10, that gives them a lot of confidence that the way they're saying the rover should move can work. And when they put these paths into the digital twin, it worked very similarly to what they saw in real life, which gives them a lot more confidence. They're able to test it, uh, in that computer model on the ground, and then it gives them confidence that
Mikah Sargent [00:50:32]:
they can use it on Mars itself. Got it. Uh, one kind of big question that I have for you: beyond Mars, talking about using autonomous technology elsewhere for future exploration, as you mentioned, the Moon, deep space. Um, with this successful demonstration, what do we think, you know, from your crystal ball? What are the longer-term ambitions here as NASA continues to test this technology?
Tariq Malik [00:51:03]:
What's next? Yeah, well, we know that the Moon is next for sure. You know, as you and I are speaking, NASA is fueling up its giant Space Launch System rocket, uh, to see if it's ready to launch 4 astronauts around the Moon later this year, hopefully next month as we're recording this. Uh, and Jared Isaacman, the NASA administrator, said very specifically, with the announcement of this AI pathing system for the rover, that they can see using this on the Moon, because you want astronauts to get the maximum amount of value out of their time. Being able to just say "go here" while they do other things can be very valuable to them. Astronauts or engineers on Mars could have these, like, robot rover handlers, where they'd say, go look at this and see if there's anything interesting there, and then let me know. And then they'll say, okay, that's worth me actually going out there and spending the time and the consumables, like air and all of that, to get there. NASA's hopeful that they can upgrade the system over time so that it not only maps out a path for the rover, but can also take the images from that trip and say, okay, here's interesting science that I found. This is maybe a place where we're going to stop so I can go touch it with my rock abrasion tool or collect a sample or something like that.
Tariq Malik [00:52:23]:
And having that kind of intelligence on a rover, you know, will help you get the most science in the least amount of time. Because Steve Squyres, the famous planetary scientist who led the Mars Exploration Rover missions of Spirit and Opportunity, has said repeatedly that he loves the rovers, but a geologist on Mars could do more in an hour or a day than it would take the rovers a week or more to do, just because they lack that human touch. And this new system gets you one notch closer, uh, one more tool to get closer to
Mikah Sargent [00:53:01]:
that true twin geologist on another planet. Got it. Uh, anything else about this that you would want to tell our listeners about? Or, uh, you know, what else should we be keeping our eye on in terms of AI and space
Tariq Malik [00:53:17]:
exploration and space in general? Yeah, I think this is really interesting because it's been an evolution for Perseverance in particular. Perseverance is different from Curiosity on Mars because, first of all, it has fancier cameras that are higher resolution, more powerful, and faster. It also has its own onboard second brain for image processing, so it can process images as they come in, in near real time, which is pretty great. It also has an auto-nav system. So all of these were like the building blocks for this system over time. And I think that watching the engineers and the scientists at the Jet Propulsion Laboratory build the most capable machine that they could, launch it in 2020, and then now, 6 years after launch and 5 years after landing on Mars, find new ways to use that hardware is phenomenal. And I think we're going to see a lot more of that kind of engineering in space exploration. And with this early success using AI for an actual function, I think we're probably going to see it included in the design of future things.
Tariq Malik [00:54:22]:
Like I said, the Titan Dragonfly probe has not launched yet. I mean, they're building it; they're still working on all of the systems. It'll be very interesting to see how they include AI in terms of tracking and flying that vehicle, based on the tools that they have now and what they might have in a decade or so when it arrives at that place. And then astronauts are going to be using it too, you know, for everything from having someone to talk to on those long dark trips to the Moon or Mars, to planning their days or their calendars. I mean, what we use AI to do now on Earth, they'll be able to use to optimize their days on the Moon or Mars or somewhere else. Hopefully an asteroid, or I'd like to see Venus, you know, and these really harsh places, like Mercury, where it's really hard to land. Having a smart rover that can make decisions for itself, gather everything, and then just say, this is what I found, man, send more if you want.
Mikah Sargent [00:55:22]:
That'd be really cool to see. That would be cool to see. Well, Tariq, I want to thank you so much for taking the time to join us today to help us understand what's going on with AI in space. Always, uh, a pleasure to hear about, um, the state of things. Of course, uh, folks should check out the great work you're doing. So could you tell them all where they should go to make sure they're
Tariq Malik [00:55:46]:
keeping up to date with what you've got? Yeah, well, you can find me at Space.com as always, uh, as well as @tariqjmalik across most, uh, socials. On YouTube, if you like video games, I'm SpaceTron Plays. And every week I'm with Rod Pyle on This Week in Space on the TWiT network. That's a lot of fun. We talk about space. We have great interviews. And we're going to be talking about this stuff as well as Artemis II.
Mikah Sargent [00:56:10]:
So that'll be great to see. Yes, incredible interviews on that show. We, of course, every week sort of plan the week and hear about what's coming up, and every time I go, they're interviewing who? That's awesome. So if you're listening to this show and you haven't checked out This Week in Space, you gotta. Uh, thank you so much for your time.
Tariq Malik [00:56:31]:
We appreciate it. Thank you.
Mikah Sargent [00:56:32]:
Thanks for having me. It's been fun. Alrighty folks, that brings us to the end of this episode of Tech News Weekly. As you know, the show publishes every Thursday. twit.tv/tnw is where you go to subscribe to the show in audio and video formats. I mentioned Club TWiT during the show. Remember, twit.tv/clubtwit, that's where you go to sign up: $10 a month, $120 a year.
Mikah Sargent [00:56:51]:
If you'd like to follow me online, I'm @mikahsargent on many a social media network. Uh, you can also check out iOS Today and Hands On Apple, which will publish later today, and Hands On Tech, which publishes every Sunday. Thank you for being here, and I'll catch you again next week for another episode of Tech News Weekly. Bye-bye!