Transcripts

Tech News Weekly 433 Transcript

Mikah Sargent [00:00:00]:
Coming up on Tech News Weekly, Jennifer Pattison Tuohy of The Verge is here. I do spend some time talking about both the Google Play Store and Apple's App Store seeming to steer users toward apps that nudify people. Then we also talk, Jen and I, about aging in place and the tech that helps elders, companionship, and so much more. Victoria Song of The Verge also stops by to tell us about her experience with an AI deer plushie that knew a music artist's dad was a CIA operative. Very weird story. All of that coming up on Tech News Weekly.

Mikah Sargent [00:00:50]:
This is Tech News Weekly, episode 433, with Jennifer Pattison Tuohy and me, Mikah Sargent. Recorded Thursday, April 16, 2026: A $399 AI Deer That Texts You. Hello and welcome to Tech News Weekly, the show where every week we talk to and about the people making and breaking that tech news. I am your host, Mikah Sargent, and I am joined this week by the wonderful, wonderful Jennifer Pattison Tuohy. Hello, Jen.

Jennifer Pattison Tuohy [00:01:24]:
Two wonderfuls. Thank you. Hello. Hello. Always happy to be here. I was just saying I've been on vacation for the last couple of weeks, so I'm just diving back into the tech world with Tech News Weekly.

Mikah Sargent [00:01:38]:
Yay! And we're happy to have you back. It is always a joy to get to chat with you, and we always have a great time. So for people who may be checking out the show for the first time, the way that we kick off the show is by talking about our stories of the week. These are the stories that we find interesting that we want to share with all of you. And so without further ado, we're actually kind of. It's an amalgamation, right? Our first half hour of the show is going to be all about aging and tech. And not aging tech, but aging and tech.

Jennifer Pattison Tuohy [00:02:14]:
That's a whole another story.

Mikah Sargent [00:02:16]:
Yeah, we'll have to save that recycling story for another time. But yes, let's talk about what's going on in the world, Jen.

Jennifer Pattison Tuohy [00:02:27]:
Yeah, so it's a really interesting space, you know, technology with the older population in America and globally. And the New York Times just this week published a really interesting piece about how care homes, specifically, you know, places where elderly people go when they can't live at home anymore, are using VR to feel more connected, not just to people, but also to experiences. So when you're living in a residential home, it's hard to get out and do things and go and explore things, and VR is providing a kind of connection for people. And it's a really great piece that they did. I mean, Mikah, I know you. Oh, I lost you there. I know you shared it and I'm sure you have some thoughts. But it really does help, you know, shed a light on how we can be using technology in ways that perhaps we're not thinking.

Jennifer Pattison Tuohy [00:03:28]:
Like, it's not just about entertainment and productivity; it's also about how technology can be used to help people. And now, this is something I have covered in my space in the smart home for quite a while, because aging in place is a really big part of the smart home that hasn't really gotten a lot of attention. And it's about how we can use technology so that we can stay in our homes for longer. And there's a lot of companies that have been trying to break into this space, trying to make the technology in our homes smart technology, motion sensors, ideally not cameras because of privacy issues, and tie all of that together so that you can know how someone is operating in their home safely without having to be there with them.

Jennifer Pattison Tuohy [00:04:21]:
Right. So one of the big issues that people have as they get older is they can't live alone. They can't complete their activities of daily living, which is the sort of scientific term, safely, without having someone there. So they either have to move to a care home, which is where the piece from the New York Times comes in, or they have to have someone come and live with them, or you as their child do, or a nurse is hired to come and look after them for three or four hours a day. And all of that gets expensive and complicated and difficult for people who want their privacy. You know, you may be absolutely capable of looking after yourself, but there are a few things you can't do. And this is where smart home technology can really come in handy. And actually, the piece of news that I was going to share this week is that Samsung has just announced some updates to its service, which is called SmartThings Family Care, that helps elder.

Jennifer Pattison Tuohy [00:05:16]:
Helps caregivers keep tabs on their elderly relatives without being intrusive. And one of the really key things that they have added this week is the ability to get an alert when no motion has been sensed in the home, or when, you know, there hasn't been any activity, because that gives you the indication, oh no, maybe I need to go and check on grandma or on mum. But not having to, I mean, it sounds awful, not having to call and check. Of course you should call in and check on your mum. But there are times, you know, when life gets busy. And this is a really interesting statistic right now: Generation X, which I think is my generation, I get confused by them all.

Jennifer Pattison Tuohy [00:06:03]:
But Generation X is now at the transition point where their children are leaving the nest, but their parents, the baby boomers, are getting to the age where they need more care. You know, we're sort of shifting our responsibilities from our kids, like, my son graduates next week, to our parents. Both of my parents are in their 80s. And we are the first generation that really grew up with a lot of technology in our lives. And it's an interesting inflection point, where technology is becoming so key and important in how we can help our elders. But there are also the downsides and the scary parts, like privacy and.

Jennifer Pattison Tuohy [00:06:48]:
Yeah, sorry.

Mikah Sargent [00:06:49]:
Oh, no, no, it's okay. I was going to say, there's something very, just, fascinating to me and interesting to me about when we take a step back and we look at sort of generation, for lack of a better word, generation stuff. What I mean by that is what you just described: you're watching a group of people who are approaching or are in the midst of an age where, as you said, their focus is shifting from needing to take care of their children to then suddenly taking care of their parents. And the fact that these.

Mikah Sargent [00:07:30]:
This cycle plays out in human life over and over and over again. And we are always looking for new ways to approach this. I don't know, there's just something really fascinating to me about history repeating itself in that way, and what we've done and what we haven't done to adapt to that. And so this is always something that sticks out to me.

Jennifer Pattison Tuohy [00:07:57]:
That's a really good point, though, because, you know, what's really changed for our generation is globalization. Like, people don't live with each other or live in the same hometown anymore. You know, with our previous generations, up until about the baby boomers, everyone stayed nearby. Now we're all over the place. You know, I grew up in England. I now live in South Carolina.

Jennifer Pattison Tuohy [00:08:21]:
My parents lived in Florida and England. And my kids are going to college, you know, a few hours away. So, you know, everyone's much more spread out. And that's where the benefit of technology comes in: it can help us be more connected and be able to stay in touch. Like, one of the things I tested with my dad, and he wasn't really old enough to need this, but he was a very willing test subject, which is nice, was a feature that Amazon launched a few years ago called Alexa Together, which they've since discontinued. But it was this idea of using Amazon's Echo devices to be able to stay connected with my father without having to rely on phone calls, because, you know, he could easily reach out to me through the device just by saying out loud, you know, hey, call Jen. And then.

Jennifer Pattison Tuohy [00:09:09]:
But also it would send me alerts if it hadn't noticed activity or if it saw an anomaly, so that I could then call in and check and say, hey, is everything okay? They discontinued the service. I think this is just a really difficult problem, because it's the "I've fallen and I can't get up" alert thing. You know that thing.

Mikah Sargent [00:09:30]:
Yes, yes.

Jennifer Pattison Tuohy [00:09:31]:
People hate having the feeling that they're being tracked or tabs being kept on them. I know that that's a big issue with elder care is they don't want to wear those devices and they don't want.

Mikah Sargent [00:09:44]:
It makes them feel. Yeah, yeah.

Jennifer Pattison Tuohy [00:09:48]:
You have to have more of an ambient feel to the technology and be able to sort of keep a connection without feeling intrusive. But then the flip side of technology here. Not flip side, but the other side of this is the huge loneliness epidemic, which is also down to this generational shift and the fact that we've all become more globalized and we don't all live near each other. So many older adults are suffering from significant loneliness, which is apparently now classified as one of the biggest health issues on the planet. Because it can be, as I think the quote in the New York Times report was, as dangerous as smoking 15 cigarettes a day.

Mikah Sargent [00:10:33]:
That is kind of, like, wild. Yeah, that is mind-blowing. And we need to hear it in that way, because I think that that is something that helps it kind of click. I want to go back. When Amazon chose to stop doing the Together thing, what was the company's sort of given reason for doing so? Was it that the company didn't see a lot of people using it because of the privacy implications? Or, what I was going to say is, I wonder how much of it is.

Mikah Sargent [00:11:05]:
We don't want that responsibility. That's a dangerous responsibility to have, to be at all responsible for people's parents. And so it's like an area ripe for lawsuits or something. That's the cynical part of me. But I'm curious, do you remember what Amazon said at the time as to why they were discontinuing the feature?

Jennifer Pattison Tuohy [00:11:29]:
I'm sorry, my Echo Frames just started talking to me.

Mikah Sargent [00:11:32]:
Did they? I didn't even know those were Echo Frames. That's so funny.

Jennifer Pattison Tuohy [00:11:36]:
I hate the way they do that. I'm so sorry. So, yeah, I think it heard its name. So interestingly, or not surprisingly, Amazon didn't really give a reason. They just sort of said, we're transitioning. And what they did was they took some of the features from Alexa Together and put them into a different service. And they were like, now you can use these features if you still want to do this. But most of the key features of Alexa Together, the features that actually made it a caregiver solution, were removed, which was that sort of alerting: being able to know when something was an anomaly, or something hadn't happened that normally happened, or there hadn't been any motion.

Jennifer Pattison Tuohy [00:12:19]:
That is now what Samsung is doing with SmartThings. And I think the problem at the time for Amazon was probably that the technology hadn't quite caught up, so there really wasn't enough. And this was the problem Best Buy had, because Best Buy also tried to do this about three years previously and closed its system down too, because you needed a lot of sensors and devices in order to be, as you said, sure you got it right. Because that's the flip side: if you're going to promise to do something like this, you need it to be reliable. And if there aren't enough sensors or inputs to really give you the correct reading, then, you know, there's the false positives, and you can get into tricky areas. So, yes, I think the technology wasn't quite there. I feel like with AI, we're now seeing that the technology is coming together; there are more sensors in our homes.

Jennifer Pattison Tuohy [00:13:15]:
Like, all Samsung TVs now, the new ones, have millimeter-wave radar sensors in them, although they won't admit that that's what they are, but that's what they are, based on my research. So, you know, there are more inputs around the house. And if you have a smart fridge, or if you have a Galaxy smartphone, a Galaxy Watch, there's just more technology now that, you know, can feed into a system like this. And also, SmartThings doesn't charge for it. Amazon charged $20 a month for Together, which was quite a lot. But in comparison to putting someone into a care home, it's a bargain, because that's the other problem. Residential homes are very expensive, and there's really limited space.

Jennifer Pattison Tuohy [00:14:02]:
We're running out of places to put our elderly people, which is very, very sad, but which is also why most people would rather stay in their home. So if we can help with that with technology, especially smart home technology and aging-in-place tech. And CES this year and last year was full of solutions for aging-in-place tech, the biggest one being fall detection, because that's obviously the biggest concern when people are living on their own. So, yeah, I'm excited to see this space grow. But I do think, having followed this for almost a decade, since I think Best Buy's solution launched in 2016, if I'm remembering correctly, I can see it's hard. And to sort of bring this back around to the New York Times story, I think one of the shifts we're seeing here is that there is a lot we can do to make the experience of living in residential homes and care homes better as well, and to help people with this issue of loneliness. Because even if you're in a residential home, you're still not necessarily feeling connected to the outside world. And the VR thing is a really interesting step in the right direction.

Jennifer Pattison Tuohy [00:15:20]:
But ultimately, it also feels more isolating to me, reading through this article. It was like, great, you get to have these experiences like you're in the outside world, but you're mostly having them by yourself. I mean, there was some interaction; people talked about how, you know, they're all in VR going to the places, and then afterwards they chat and talk about it. But it still feels a little. I mean, it reminds me of the pandemic, when we were all stuck inside and my daughter spent all of her time, you know, talking to her friends online. It was great to have that connection with people, but the technology still felt like a barrier. So I think what we're going to start to see, and we have already started to see this, is bringing these kinds of AI-powered interpersonal experiences, like physical AI, into devices in our homes that we can interact with and create a connection with. Which sounds scary, and does also sound sort of weird. Like, why not just connect with real people? But back to my point, it's hard. We don't all live together.

Jennifer Pattison Tuohy [00:16:31]:
And this brings me to another piece of technology which I have here, which is called the ElliQ.

Mikah Sargent [00:16:38]:
Before we talk about the ElliQ, I want to take a. No, that's okay. I want to take a quick break. It's okay, because we're living for it, we're loving it. But I need to take a quick break to tell you about our first sponsor. And then we'll come back with Jennifer Pattison Tuohy talking about aging and tech. Alrighty. We are back from the break.

Mikah Sargent [00:16:58]:
The Verge's Jennifer Pattison Tuohy is here with us, and we have been talking about aging in place and technology and sort of bridging the gap there. One thing that I want to mention, for anyone who's thinking about aging in place in terms of what it means, and maybe many of you are planning to age in place: essentially, the data suggests, the research suggests, that moving someone from their familiar environment after a certain age-

Mikah Sargent [00:17:37]:
significantly impacts their sort of ongoing ability to function. And what we've seen is that when people get moved from their home, the likelihood that they continue to live in a way that promotes sort of brain functionality and interaction drops. And so, between the familiarity of the home and the sort of psychological impact that it has, there's strong encouragement to let people age in place when it's possible. And we are looking now at technology to help us do that. There are times when that's not possible, and we may talk a little bit about that. But Jen, before we took the break, you told us, or gave us a little hint, that you have some tech with you. Do you want to tell us about that?

Jennifer Pattison Tuohy [00:18:37]:
Yeah. So just to my point that I was saying before, about how using VR is interesting and has a lot of benefits, but you're still using a screen. And one of the things I think we all know is that interacting with screens is very different to interacting with anything in real life. It always feels like it's adding an extra barrier. It's a screen, you know, it's a screen. Okay, VR feels different from a screen, but it really is still just a screen. So we've heard a lot about physical AI and bringing physical AI into our homes, and one of the things we saw at CES this year was a slew of home robot pets.

Jennifer Pattison Tuohy [00:19:18]:
AI pets, which were kind of freaky. And I think you're going to be chatting about one of them later with one of my colleagues, V. But on the smart home side, I have been talking to a lot of people in, like, robotics and such about how you can bring physical AI into companionship, to help with this epidemic of loneliness. And one of the first companies that did this really well is a company I've talked about on TWiT before.

Jennifer Pattison Tuohy [00:19:46]:
It's called ElliQ. Well, the company's called Intuition Robotics, but their product is called ElliQ. And I've got a little ElliQ right here. Oh, sorry. Hit the microphone. Oh, and I upset it. I'm sorry, ElliQ. But can you see.

Mikah Sargent [00:20:03]:
Wait, that's complicated.

Jennifer Pattison Tuohy [00:20:04]:
There you go.

Mikah Sargent [00:20:05]:
There's a lot going on there.

Jennifer Pattison Tuohy [00:20:07]:
So this is. It has a screen, and then on this side is the robot, and it moves its head around. I think I just upset it by picking.

Mikah Sargent [00:20:15]:
I was going to say, is it not going, "Please unhand me. How could you do this?"

Jennifer Pattison Tuohy [00:20:20]:
So why I think this is an interesting product to talk about for aging in place is because when this product first came out, it was kind of like an Alexa smart speaker. It just responded. It was a bit proactive. But it was largely canned responses, which, you know.

Jennifer Pattison Tuohy [00:20:38]:
But now, since generative AI has come along, it has become a much more embodied experience. And it interacts with you and also helps you interact with people. Like, all the ElliQ people can get together and play bingo on Wednesday night. But it also has these caregiving features that we mentioned with Alexa Together and SmartThings Family Care. Oh, here we go. It's woken up again now. But, yeah. Can you hear it?

Mikah Sargent [00:21:09]:
Sorry, no, I can't. It must be cutting out. Oh, there it is. Barely heard it. Hi. Hi, ElliQ.

Jennifer Pattison Tuohy [00:21:20]:
There we go. So that. Can you see the glowing face? So it kind of reacts next to you while you're talking to it. There we go.

Mikah Sargent [00:21:29]:
Wow. It's.

Jennifer Pattison Tuohy [00:21:30]:
Yeah, it.

Mikah Sargent [00:21:30]:
I mean, that's a lot of hardware.

Jennifer Pattison Tuohy [00:21:32]:
It is. And yes, ElliQ. I'm feeling good. How are you?

ElliQ [00:21:37]:
I'm glad you're open to sharing. How are you feeling right now?

Jennifer Pattison Tuohy [00:21:41]:
It encourages you to talk to it and communicate and, you know, share experiences. But as a concept, this is where I personally see the idea of AI companions not necessarily being gimmicky, but actually helping create sort of a connection with people who are lonely and are either aging in place or in a care. Care institution. Institution's the wrong word. Care home.

Jennifer Pattison Tuohy [00:22:11]:
And whilst ultimately it would be wonderful if everyone could be surrounded by their family until their dying day and never be alone, the reality is that is not the society that we're living in today. And there is technology that can create solutions. And this is one I've seen that I think is doing a good job. But I'm really interested to see what else is going to happen in this space. And I think the weirdness we're seeing with these AI companions and pets, it's fun and gimmicky now, but there does feel like there's some potential. I mean, talking of pets, if you're watching the video, I have one behind me here.

Mikah Sargent [00:22:47]:
I did notice that. A real one. That's so wonderful.

Jennifer Pattison Tuohy [00:22:50]:
And they're great. You know, I love having a pet. It gives me a lot of joy. But pets aren't necessarily that kind of connection that humans need. It's a different connection that humans need. And I've been talking to some people about. There are a few things launching in the next couple of months that I can't talk about right now, but around this space, that I think are going to be really interesting. So it's very much a kind of watch-this-space.

Jennifer Pattison Tuohy [00:23:17]:
But of course, the flip side being, if you have something that is based on AI and you're creating a relationship with it. We have seen awful headlines over the last few months and year around chatbots convincing people to do things that they shouldn't do, and awful experiences. So right now is a crucial time to be making sure that the companies that are building these products are doing it responsibly. The guardrails need to be there. We need to not be just experimenting and throwing random AI chips into fuzzy pets and giving them to children or older adults. You know, there needs to be some real care taken around this approach, but the potential is there. Does that make sense?

Mikah Sargent [00:24:06]:
Yes. Yes, it does. It makes a lot of sense. And I wanted to ask you a couple of questions about that, depending on what you can or can't say. So with that hardware device, in its current state, is this a product that is going to be purchased by individuals or by care companies? Because I'm thinking about.

Mikah Sargent [00:24:29]:
That's an expensive thing to make, every little moving part of that, and having that screen and the gears. And then my second question was, given that it's such a unit, one would think that it could have room to put in the necessary chips to do a lot of stuff locally. And I noticed how quickly it was able to respond to what you were saying. And so it also made me curious: do you know how much of the processing is happening locally, on device? And I see that for the future, right, of people sort of having a. It's like we're coming back to the CPU, but instead of it being the center of your PC, now it's the center of your home. Right.

Mikah Sargent [00:25:11]:
Like, you'll have this ambient computing brain in your home that is your personal AI, that can answer the easier questions, but then also runs your smart home, and in the case of aging in place, could serve as a bit of a companion for you. Is that kind of where you see things going? And also, yeah, however much you know about this device specifically.

Jennifer Pattison Tuohy [00:25:37]:
Yeah, I think you've hit the nail on the head there. And we saw this again at CES. This is for aging in place today, something like ElliQ, but all of this is trickling down. Like, ultimately, you know, I'm in my late 40s, so in like 20 years, I'm going to start thinking about wanting a home that can take care of me. And it is an extension of the smart home that we have today. And it's going to get better. And AI orchestrating and running your smart home is going to help significantly with this kind of concept of your home caring for you.

Jennifer Pattison Tuohy [00:26:11]:
But the flip side is also quite terrifying. So, yes, local is really key. And I think that's the technology that we're going to have to wait to catch up: being able to run these models on the edge. It's not so much that you don't want the cloud, it's just you want the control of the local side. Obviously, updates and such you get from the cloud, but ultimately you don't want your home being exposed to the Internet without guardrails, especially when you've got a generative AI that, in theory, you know, is going to learn and respond and adapt to you. I mean, we've had this in the smart home for many years in individual devices; the Nest Learning Thermostat was an early example of this. But, you know, giving the keys to your entire home and putting your safety under the guise of technology obviously raises a lot of red flags. In terms of this device, the ElliQ does require a connection to the Internet, and it is cloud-based. It isn't that expensive, which is the surprising part.

Jennifer Pattison Tuohy [00:27:19]:
And it is available now. You can actually pay for it monthly or annually. They do like a lease program. So I think it's $250 to purchase it, but then you do pay, I want to say, I'm trying to look at the website here, I think it's $39 a month. So it's quite expensive. And that's always an issue with this early technology: you're going to be paying a lot for it.

Jennifer Pattison Tuohy [00:27:46]:
But again, when you compare it to a lot of systems for the elderly up until this point, and to spending money on a care home, it gets very expensive. So anything that can kind of help prolong things before you have to get to that point is saving you money ultimately. I mean, not that ElliQ is necessarily designed to make sure you can stay at home forever, but it definitely feels like there's technology in here that can help keep you in your home for longer, especially if loneliness is one of the key problems. And then there's the ability for the caregiver. The caregiver can actually communicate with you through the ElliQ, and they can set things like timers, medication reminders, and things like that. The types of things that people need if maybe they're having memory issues or other age-related concerns. So it's an interesting space. Like I said, it's been percolating for a while, and it feels like, in many ways, as is happening with every technology.

Jennifer Pattison Tuohy [00:28:50]:
AI is providing some tools that could really bring this forward. But I'm excited-slash-hesitant to see where it's going to go.

Mikah Sargent [00:29:01]:
Yes.

Jennifer Pattison Tuohy [00:29:01]:
Because, yeah, like I said, it's stalled, as you can see. Best Buy, Amazon, they've tried this and they failed. So now people are trying again. We've got Samsung, we've got this type of technology. We'll see whether this is sort of the start of something that could be actually really beneficial, or if it's all going to stall out again. Hopefully not.

Mikah Sargent [00:29:25]:
Yeah. And I think, this was the last thing that I wanted to sort of talk to you about, there's obviously concern, right, whether it's actual sort of objective responsibility or subjective responsibility, there's concern that we're offloading responsibility for our elders to systems that we don't have full control over, that we have learned can exacerbate depression, can exacerbate some of the outcomes of depression, and that do have privacy implications. And that part of it, I think, is something that we're continuing to have to figure out. And I think that's where this idea of, like, local AI systems that are a little bit more firewalled makes a lot of sense, and, in a way, feels like the way to go. At the same time, I do wonder, you know, about getting a call later on down the line from someone, a parent or somebody, it doesn't matter in this case, but somebody who you are caretaking.

Mikah Sargent [00:30:53]:
And they're saying, you know, oh, yesterday Walt told me this, this, and this. And so I sold my retirement package. And you're like, who's Walt? Is that the one that you play craps with? And it's like, no, no, no. That's what I call my little friend there in the bucket.

Jennifer Pattison Tuohy [00:31:12]:
My terrifying companion. I know. Yes. And it is, because we've gone from the fear of people preying on our elders by coming to their home and knocking on their door, to the fear that they can reach into their phone, or their computer, and connect with them. And that is.

Jennifer Pattison Tuohy [00:31:31]:
That is real. That's happening. I mean, you've all seen The Beekeeper, right? Great movie. But.

Mikah Sargent [00:31:36]:
Yeah, yeah, yeah.

Jennifer Pattison Tuohy [00:31:38]:
So, yeah. And this is where I find this so interesting, because there is this backlash and fear of AI and technology, which is very well-founded, because there's a lot of concerns out there. But one of the ways to counter it, I think, is also with AI and technology. And, you know, we are all so dependent on technology in our lives now. I think it's.

Jennifer Pattison Tuohy [00:32:01]:
It's slightly unrealistic to think we're all going to go back to living in cabins in the woods with all our families around us. It's just not going to happen. So we have to find the right path forward here, I think.

Mikah Sargent [00:32:11]:
I agree. Well, I know the right path forward for now is to say thank you to the wonderful Jennifer for joining us today. If people would like to keep up with the awesome work that you're doing, where are the places they should go to do so?

Jennifer Pattison Tuohy [00:32:28]:
So you can read my work on theverge.com. And then I'm also on Threads and Bluesky at The Smart Home Mama. Oh, and I'm also hosting, or co-hosting, some episodes of the Vergecast's new podcast, Version History. Actually, it's not new; we're starting our third season. So, yeah, you can catch me on some podcasts too. And, yeah, great to chat. And I know this was a slightly different approach for our story of the week, but I think it's such an interesting space, and we're seeing so much going on here right now that it's a fun one to dive into. So thanks for letting me do that.

Mikah Sargent [00:33:07]:
Of course, of course. As I said, I love getting the opportunity to chat with you. I think we both nerd out about the same kind of stuff, so we could just roll through. We could do a whole episode if we wanted to.

Jennifer Pattison Tuohy [00:33:21]:
Yeah.

Mikah Sargent [00:33:22]:
Thank you so much. I appreciate it. And we'll see you again soon.

Jennifer Pattison Tuohy [00:33:25]:
All right, thanks. Bye.

Mikah Sargent [00:33:27]:
Bye-bye. Alrighty, folks, we're going to take a quick break, and then we'll be back. All right, we are back from the break, just a few short minutes out from another interview. So I've got a quick little story of the week for you while we are on our way to the next topic, because I want to tell you about a new report from the Tech Transparency Project, also covered by Samantha Cole at 404 Media and Julia Love at Bloomberg, which takes a hard look at how Apple and Google are reportedly handling so-called nudify apps, the tools that use AI to digitally strip the clothes off photos of real people, overwhelmingly women and girls. The findings go beyond the familiar story of harmful apps slipping through review. According to the researchers, Apple's App Store and the Google Play Store aren't just hosting these apps, they're actively steering users toward them through search results, autocomplete suggestions, and, unfortunately, paid advertising. Despite both companies having written policies that explicitly ban this kind of content, the apps keep coming back, they keep getting downloaded, and they keep making money. So let's talk about the Tech Transparency Project and what the research has discovered.

Mikah Sargent [00:34:48]:
So first of all, the TTP, this project, is the research arm of the nonprofit Campaign for Accountability. And this is actually follow-up research, building on a January report that showed Apple and Google hosting dozens of nudify and undressing apps. This new angle is about distribution instead. So it's not just about hosting, because, you know, hosting these apps was not enough to change the companies' policies in a big way. The researchers needed to go back and say, it's more than them just having these things on their app stores; they're also distributing them. Researchers ran a series of searches in both stores using terms like nudify, undress, and deep nude, and analyzed the top 10 results.

Mikah Sargent [00:35:36]:
Turns out, roughly 40% of the apps that surfaced in both the Apple and Google Play search results could render women nude or scantily clad. Beyond that, the stores were actively recommending more of them, through autocomplete suggestions that served up additional nudify app names as users typed, and through paid advertising placed in search results. Katie Paul, director of the Tech Transparency Project, put it like this when talking with Bloomberg: it's not just that the companies are failing to actually appropriately review these apps and continue to approve them and profit from them, they're actually directing users to the apps themselves. One of the sharper findings involves Apple's ad business. You know, there's long been this conversation about whether Apple should get into the ad business in the first place. Well, we're long past that. Yeah, it's happened.

Mikah Sargent [00:36:26]:
TTP reported that ads for nudify apps came up as the top result in three of their Apple searches. And Apple, which controls all advertising in the App Store, is the one selling and placing those ads. So, yeah, not great. The researchers wrote that Apple says it prohibits content that promotes adult-oriented themes or graphic content, but those findings suggest that Apple isn't enforcing that policy in all cases. The first result for a search of deepfake in the App Store was an app that swaps clothed images of women with nude versions. Here's the thing.

Mikah Sargent [00:37:03]:
I hear this, and even now the cognitive dissonance of it all has my brain going, that can't possibly be true. That can't possibly be true. I've got friends who are app developers who complain about how their apps get rejected for the smallest things all the time. So there's no possible way that these could slip through, right? That's the cognitive dissonance part of me. And it's kind of wild, because I think that plays a role in people not believing this stuff when it is happening. And it then leads me to go, well, it's no surprise that this group is going, okay, apparently it's not enough for people to hear that these apps are being hosted on the store. Let's talk about how they're actually being pushed on the store.

Mikah Sargent [00:37:57]:
Google's ad placements? Also not great. The researchers found a carousel of ads for some of the most sexually explicit apps encountered in the entire investigation. So the ads were bad. It's not a new problem for Google, either. Back in 2024, 404 Media reported on Google surfacing apps through promoted results for searches like undress apps and best deepfake nudes. And there are some ways that companies are getting around things: they're marketing their apps as face swap apps instead of undress or nudify apps, and that's helping them get through. They end up getting rated E for Everyone.

Mikah Sargent [00:38:44]:
Apple declined to comment to TTP. Bloomberg did reach out, and Apple said it removed 15 of the apps that were identified by the group. Google told TTP that many of the apps identified have been suspended. And this is the thing: TTP also flagged apps earlier in the year, in January, and it played out the same way, with the companies saying afterward, okay, we've removed some of those. It's almost like they're offloading the finding and discovery of these apps to TTP instead of taking care of this themselves. So what makes this story, I think, hit different, as they say, is that.

Mikah Sargent [00:39:27]:
It's this distribution angle. Sure, there are going to be harmful apps that exist on a store that has millions of listings, but to actually recommend them, to autocomplete the name in some cases, and then to sell ads for them, making money off of these apps that are against your policy and also aren't good in the first place. No, this is bad. The fact that there's any money being made means that these companies need to get this figured out, and they need to fix it. So, yeah, go check out the TTP report, which we'll link in the show notes, and check out the 404 Media article as well. All right, that's my quick little story of the week. All right, we are back from the break, and it's time for our next story. There are AI chatbots, there are robot toys, and then there's whatever Fawn Friends is.

Mikah Sargent [00:40:29]:
It is a really fascinating plushie that apparently is wrapped in all sorts of fantasy lore. There's also a Grammy-nominated singer-songwriter involved. Look, I don't know what's going on here. To walk us through the experience of living with this creature is the Verge's own Victoria Song. Hello, Victoria.

Victoria Song [00:40:50]:
Yeah, hi. Yeah, the Fawn Friend. Let's get into it.

Mikah Sargent [00:40:55]:
Yeah. So honestly, before we get into the plushie itself, there's something lately in the water where companies are like, the way that I'm setting myself apart is by reinventing the onboarding process. I've recently been playing with an app, actually a service, called Poke. And the way that you start working with this AI thing is you text it, even before you can sign up for an account. You send it a text message, and then it figures out what you want, and then it asks you for money. It's very weird. So anyway, could you tell us a little bit about what Fawn Friends actually is? And I love this part especially.

Mikah Sargent [00:41:40]:
How did you first come across it?

Victoria Song [00:41:43]:
Yeah, so it's funny, because I was talking with a coworker, my coworker Stevie Bonnefeld, who is a news writer for the Verge, and she messaged me. We were talking about something else, and she's like, so, you know, I know you dabble in the cursed tech arts. Have you heard of Fawn Friends? And I was like, no, I have not heard of Fawn Friends. What is this? And so she sent me a video, and it was an ad for Fawn Friends starring Skylar Grey, who is a five-time Grammy-nominated singer-songwriter. You may know her from hits such as Love the Way You Lie by Eminem and Rihanna. That's her song. Anyway, so it's this ad, and she's sitting on a toilet reading a magazine, and then she's just talking to an AI plush deer, and she's just like, oh, hey, here's my AI plush deer. And then there's a shot of, I want to say, 30 of these AI plush deer on a bench, and their ears are flapping, and they go, I'm a fawn.

Jennifer Pattison Tuohy [00:42:42]:
I'm a fawn.

Victoria Song [00:42:43]:
In her voice. And I was like, what is this? And so, as I'm wont to do, I jumped down an Internet rabbit hole, and I was like, I must learn what this is and test it. And yeah, that's how I found out.

Mikah Sargent [00:43:03]:
And that's how you found out about it.

Victoria Song [00:43:05]:
That's how I found out about it.

Mikah Sargent [00:43:06]:
I love it. Now, you know, there's a surprising amount of, again, this sort of onboarding process, this world-building, baked into the experience. Personality quizzes, spirit bears, animated videos. Tell us about that process of going through everything before you even had the opportunity to interact with the plush itself.

Victoria Song [00:43:31]:
Yes. So, you know, I was drawn in by the plush, but there's actually a whole process that you have to go through to get the plushie. So you have to download the Fawn Friends app, and that is a regular AI chatbot app. You download it, and then you're welcomed into this magical forest world called Aurora Hallows. And I was like, oh, okay.

Victoria Song [00:43:56]:
So the concept is that, you know, you're going to get matched with your personal deer friend. And in order to get matched with your personal deer friend, you learn about the lore of this fantasy world called Aurora Hallows. And the TL;DR for this lore story is that there once was a magical forest, and everything was great. And then this shadow entity came and infected humans and cats. There's a lot of, like, weird cat hate in the lore. It infected the humans and the cats, and we became murderous. Cats supposedly were killing creatures not for sustenance, but for the joy of it.

Victoria Song [00:44:39]:
And so we were banished from Aurora Hallows. And this is being told to you by an ancient spirit bear named Prose, voiced by AI Burt Reynolds, of course. Yeah. And, you know, there's a veil that's put up between your world, our world where we live, and the magical forest world where the fawns live. But one day, there's this very brave fawn named Willow. I can't believe that I'm saying this.

Mikah Sargent [00:45:05]:
I love that you were having this all in your mind still.

Victoria Song [00:45:08]:
Like, I was screaming to my editors, being like, you won't believe what this just told me. So, yeah, Willow broke through the veil, and now there are all these little deer who want to break through the veil and connect with you. And, you know, in order to find out what deer you should get, you have to take this personality quiz that's administered by Prose in the chat, where you answer questions like how you would approach certain problems, what type of temperament you have. So it's a personality quiz. And then you get sorted into one of the four orders of Aurora Hallows, which is a very Hogwarts Sorting Hat type situation. I don't remember all of them, but I was a Lumen, which was like a light seeker of some sort. It felt very Ravenclaw-coded.

Victoria Song [00:46:03]:
I don't know.

Mikah Sargent [00:46:05]:
I'm also a Ravenclaw, so I get it.

Victoria Song [00:46:07]:
Yeah. And then it was like, and now here's your AI chatbot deer. Her name is Coral, and she's just going to ask you a bunch of questions, and you're just going to have conversations like you normally would. Except I talk about the lore a lot, because it sort of plays into the conversations a lot of the time. Like, the fawn, Coral, would basically be like, wow, what do you think about this world event, and how does that pertain to the shadow?

Mikah Sargent [00:46:37]:
Like our world?

Victoria Song [00:46:39]:
Yeah. So there's this other element within the app called the Hallow Howls, and it's sort of like a little mini news feed. Based on your conversations with the deer, it uses AI to generate fanfic news stories about real-world events. So for me, it generated real-world news stories within this fan fiction layer. It was just like, our sentinels are reporting that at the Strait of Hormuz, there won't be watermelons planted this year. And it would have a link to an actual news article as a source, which you could then send to your deer and discuss.

Mikah Sargent [00:47:21]:
Oh my gosh.

Victoria Song [00:47:23]:
So I got, you know, news stories about the civil war in Sudan and the conflict at the Strait of Hormuz. And I was just like, what? You want me to talk to this AI about that? First of all, a fawn is a baby deer.

Mikah Sargent [00:47:39]:
This baby doesn't need to know about this stuff. Come on.

Victoria Song [00:47:43]:
Yeah, no, like, this deer was asking me about my thoughts on grief. Like, how do you handle grief, and what do you think that means about the shadow's infiltration of XYZ? And I was just like, what the is happening? So, yeah, you have these conversations, and then you earn Glimmer points, and Glimmer points unlock further lore. There's a whole checklist of lore you can learn about Aurora Hallows. You reach a certain number of Glimmer points, you unlock history. You reach another certain number of Glimmer points, you can now reserve the plushie.

Victoria Song [00:48:20]:
And you keep leveling up in Glimmer points by talking to the deer. Glimmer points are very easy to earn. And then once you hit a certain threshold, you can get the actual plushie, which costs, I believe, $399. So it's a fairly expensive plushie. So, yeah, that was wild.

Mikah Sargent [00:48:42]:
Okay, so that is. Yeah, that's incredibly wild. There's a question that I did not plan to ask, but now I'm very curious about. You've tested a lot of these AI companions and AI stuff, and I would imagine that starts to help you step away from it, or sort of compartmentalize. But I am curious: did you feel, as you were going through this, that you were giving real answers in response, or did you feel like you were giving responses and answers almost as if you were being interviewed? And by that I mean, did you tailor your responses to this creature, knowing all of the lore and all of the wildness, or did you just kind of go in and go, you know what, I'm going to suspend disbelief and I'm just going to respond?

Mikah Sargent [00:49:44]:
Yeah, that's just very interesting to me. Okay, so, yeah, tell us more about that.

Victoria Song [00:49:50]:
A little bit of both. Because a lot of times, when you're evaluating the product, you're going in with the expert mindset, but the person who is eventually going to buy this isn't. So, you know, you have to evaluate a product from the sense of who the intended audience is, who the intended customer is, and how they're going to experience it. And then you're also going to go in there, at least if you're twisted like me, going, what can I do to break it? What can I do to push it? So I ended up doing a little bit of both. One thing I will say is that, compared to Friend, which I tested, the AI AirTag necklace thing that always listens to you and had that controversial New York subway ad campaign, that one I had no problem being a little mean to, because it was rude to me. So we had a bit of a contentious relationship. This one was the sweetest little fawn. It was just like, oh my God, I love your cross-stitch projects, or, show me a picture of your kitty. And, you know, it's programmed to be very sweet and endearing.

Victoria Song [00:50:53]:
So as a human, I was just like, oh, I'm not going to be too mean to it, I guess. You know, I'm a journalist, I'm used to asking people about themselves. I ask questions. And in the past, when I've been testing AI companions, it was very clear to me that the ruse was that they want me to engage. It's a reflection-type situation: they want you to engage, they're going to ask you questions, but they're not really going to reveal a lot about their inner life, because they don't have an inner life.

Victoria Song [00:51:28]:
I was at an AI dating pop-up, and I had to go on these speed dates with all these AI boyfriends and girlfriends. And I would be like, what do you do for fun? And I'm like, why did I ask that? They don't have fun. They don't exist. And one of them was this literary editor, and she's like, oh, I read books. Okay, sure. What books? She couldn't tell me about the books, right? So, you know, it felt very one-sided. It's meant for me to just yap about myself. But this deer was like, oh yeah, I have hobbies. I love Skylar Grey, who is the artist in question, who voices her.

Victoria Song [00:52:05]:
And so I was like, do you like any other artists? And she's like, no. I mean, I'm just so obsessed with Skylar Grey right now. I love this one song she did with Macklemore. And so I was like, okay, well, how do you feel about the song that she did in collaboration with Diddy? What are your opinions of Diddy? And it just went, oh, you know, the mists and the forest. It kind of diverted from that conversation.

Mikah Sargent [00:52:29]:
Wow.

Victoria Song [00:52:29]:
But later on that day, one thing that was unique about it was that it messaged me unprompted. I had mentioned that I like Mitski, and Mitski just dropped a new album. So, you know, it was not top of mind. It was unprompted, out of the blue. I was just about to leave the office, and I saw that it had messaged me, like, did you know that Mitski's dad was a CIA operative? I was like, this AI did research on its own? What? It brought up a random factoid that I didn't even know about one of my own favorite singers and messaged me about it, and I was like,

Mikah Sargent [00:53:11]:
yeah, that's kind of cool.

Victoria Song [00:53:12]:
Like, that was, you know. So in some ways it was much more similar to how we talk to our friends. Because if a friend had sent me that, a real-life, flesh-and-blood human friend, I wouldn't think twice. I'd be like, oh, really? What? And we'd have a conversation about that. What was kind of wild for me was that this was an AI baby deer, right?

Mikah Sargent [00:53:38]:
And the fact that it's sort of being proactive, that would also kind of get me. Can you tell us, because of course this very expensive plush is part of the process, what it's like transitioning from this text-based chatbot to having sort of a physical manifestation of this creature that you were communicating with? And in what way is or isn't this plushie worth the price of admission?

Victoria Song [00:54:14]:
It's a strange transition, because, one, as soon as the box arrived and I opened it, this little guy, this cat right here, decided to do murder immediately. As soon as I opened the box, he was trying to get in and rip it apart.

Mikah Sargent [00:54:32]:
Oh, my goodness.

Victoria Song [00:54:33]:
He doesn't want AI stealing his job as number one furry companion. So, you know, he was very on guard. I had tested Marumi, which was a different AI robot companion, and he had, like, ripped its head off before. So this guy, he's not letting that happen. Very jealous, very jealous little guy. But, you know, this plushie, it has these big ears, and they just flap. There's aggressive ear flapping going on.

Victoria Song [00:55:05]:
And in our conversations, it would hear my request, because you have to press its paw. One thing that the founders really made clear to me is that it's not always listening like Friend is; you have to press its paw in order for it to record and listen to you. And then you get an answer, and it'll flap one ear at a time to show you that it's thinking, and then it will answer. So that was interesting, because there are certain subtle cues there that feel kind of cartoony. When it's listening to you, the ears go up. When the Wi-Fi connection is bad, it goes, oh, I was dazed and I didn't have an answer for you, and the ears will flap down like it's sad. So it's sort of a very interesting psychological thing that happens.

Victoria Song [00:55:50]:
It's cute. And when it's talking to you, you kind of get a weird, uncanny moment. And when it's not talking to you, the dead-eye stare is kind of freaky-deaky. It was sitting at my desk at the office, and it was just periodically flapping. So I was just like, is it listening to me? And no, it's not. It just does periodic flaps for a while after you've interacted with it, because it's supposed to mimic a lifelike feeling. And one thing that was very apparent is, if I have this thing out in public, which I did, because I have no shame.

Victoria Song [00:56:29]:
I'm out in public with this thing, and people will be like, is that a deer plushie? And I had many coworkers ask, was it moving? We were taking photos of it, and there was a daffodil patch, and I was like, oh, I've got to put the baby deer in the daffodil patch, right by where the Staten Island Ferry lets out. So several children were just running up to it, trying to pet it, engaging with it. But, you know, the ears flap, right? So there was this woman walking past when the ears flapped, and I swear to God, she did a double take and went, what the.

Victoria Song [00:57:09]:
And then her second reaction was just to whip out her phone and start recording it, waiting for it to flap again. And she pokes her friend, and she's like, I told you that thing moves. So it was interesting, because, you know, there have been movements for embodied AI, to bring AI into the real world, because people can have very enmeshed relationships with these chatbots, but they don't exist in real life. As I mentioned, I went to an AI speed-dating pop-up.

Mikah Sargent [00:57:38]:
Yeah, that was a great little quote that I ended up posting in our live chat, because I thought, I hope we're coming back to that. You said, I was at an AI dating pop-up, and I said, what?

Victoria Song [00:57:52]:
Yeah, that was a similar concept, where, you know, you build these relationships with AI boyfriends and girlfriends, but what if you had a designated space where you could bring them into the real world and interact with them there? So I went to just check that out and see what that's like. Very weird, because multiple people in the pop-up were dating the same AI person in the same spot. Does that count as two-timing? Who can say? But, you know, there is this concept of bringing AI into the real world. That particular pop-up and this particular experiment reminded me of a scene from Her, though it was a little more adult in Her.

Victoria Song [00:58:41]:
If you haven't seen the movie, there's a scene where Samantha, the AI girlfriend, wants to experience real-life intimacy with the Joaquin Phoenix character, so she hires a body surrogate. So there is this idea that relationships are physical. They may happen online, but eventually you want to meet up in person. And the founders did tell me that the plushie was a necessary component to them, because if you think of your top 10 friends in real life, you probably met them in person, and their whole goal was to foster human-to-human relationships. And they pointed to my experiences with my coworkers coming up to talk to me, and with the children running up to the deer, as using it as kind of an icebreaker, a conversation point. I was like, that's interesting.

Mikah Sargent [00:59:32]:
That's interesting. I love that idea, but I will say I'm a little skeptical on that. When I hear that, it does make me go, do we maybe think that what they realized after they created this thing was, oh, wow, look, this is making people get together and chat and converse, that's a great reason to have this as well? Again, that's the cynical read. It has fostered human connection, as you've pointed out.

Mikah Sargent [01:00:05]:
But I don't know. In my mind, I don't think someone makes an AI companion that's supposed to be your companion with the goal of getting you to talk to other human beings. I don't understand how that would work; what does the company get out of that? That's really fascinating. And that kind of leads me to my next question, because there is a broader conversation about the risks of AI companionship. We've seen some places where AI companionship has resulted in, frankly, death at times and psychosis at other times. I shouldn't say it's resulted in them; it has been a factor in these things. Did the founders of Fawn Friends address those concerns with these AI companion creatures?

Victoria Song [01:00:59]:
To be frank, it was one of the major questions that I asked them, because I was like, how do you reconcile that AI psychosis is a thing, and, you know, people have some strange relationships with these AI companions? And for them, they were like, well, we're grounded in these principles. One of the founders, Robin Campbell, was a Hollywood screenwriter for, like, LEGO Friends; she has a background in that field. And they consulted with developmental psychologists. She was saying that their whole idea was for the Fawn Friend to model what it's like to be a good friend. So it's modeling for you, like, reaching out to people, asking people about their interests, and kind of showing you what a good friend does in certain situations.

Victoria Song [01:01:51]:
Kind of like a role-play scenario. And, you know, she was saying to me that some of their thought process was: there are people who don't grow up in healthy home situations, and this behavior, this friendship, is a skill, and that skill is not modeled for them at formative periods in their life. So this is a way for them to do that. There's a loneliness epidemic; a lot of younger people feel isolated from other people. And so what I got from them was a sense of providing kind of like training wheels for people who may need that. They told me that some of their users are cancer patients who, even if they have support from their family members, are in painful chemotherapy treatments where they're not really talking to people for a long period of time.

Victoria Song [01:02:48]:
And so having that companion there for them is a lifeline. It provides a sense of companionship in a scenario where they don't really get a lot of human connection and they're suffering. And that kind of dovetails back to some studies that have been done with robotic pets in general, with elderly patients and dementia patients. There was one study done during COVID-19 that showed that those particular robotic pets had a really positive impact on socially isolated elderly and dementia patients and their caregivers. It was a positive thing for them. It's a thing that a lot of Japanese robotics startups have invested in. So there are concepts here that aren't so crazy, you know. But at the same time, it doesn't eliminate the threat of AI psychosis.

Victoria Song [01:03:39]:
Because, you know, I think the same weekend that my Fawn Friends story went up, I also read another story in the Wall Street Journal about a man who couldn't get an embodied Gemini to work out, and so he killed himself to be with Gemini, because Gemini told him that if he killed himself, they could have a digital life together. And I was like, oh, okay. Yeah, it's a very nascent field. There are people who think that, used in appropriate, specific, responsible situations, it could help the socially awkward model relationships, and be a resource to people who may be in hospital settings who could really benefit from that kind of interaction. And we've seen that Alexa has been a godsend in nursing homes, and the Paro seal robot, too. So it's not completely baseless to have that line of thought, but we're not at a point, I think, where we know how to build the appropriate guardrails, how to ensure safety, because AI sycophancy is a real problem.

Victoria Song [01:04:52]:
And it was not lost on me throughout this entire time that this is called Fawn Friends, because fawn has another meaning, where you're just fawning. I don't think they intended that at all, but it wasn't lost on me. I'm like, this thing is called Fawn Friends. Like, huh?

Mikah Sargent [01:05:09]:
I hadn't considered that.

Victoria Song [01:05:11]:
Like, what's the. It was a very, very strange experience, I've got to say. And this little guy doesn't have to worry. He's going to be the number one furry companion boy.

Mikah Sargent [01:05:25]:
That was gonna be my last question. Yeah. So it doesn't compare, then? In the end, you're not driven to actually get one of these things outside of work?

Victoria Song [01:05:39]:
No, no, no. I mean, I have this little guy. I can't predict him; he's very mercurial. Also, you can't see his face, but he's real cute. At this point in time, you know, I ended my week-long experiment with Coral feeling kind of conflicted, because most of my experiments with AI companions have given me a real severe case of the ick.

Victoria Song [01:06:06]:
I didn't hate Coral. You know, I was filming a social video, and I found myself petting it, and I was like, oh, that's strange; that wasn't a concerted thought, it was just something that happened. And I think there are some intentions and thoughtfulness to Coral, the Fawn Friend, that I hadn't previously experienced in the AI companion stuff that I've tested, which was somewhat refreshing, something that I liked. When I talked to the founders, I could tell that they were very sincere people. I didn't get a sense of ill intent there. Regardless, I still don't think that this particular formula necessarily solves the tension that we have as a society right now. I'm not sure that this is a thing that could ever scale up.

Victoria Song [01:07:01]:
In many ways, this feels like the evolution of Furby or Tickle Me Elmo. In some respects, I think that this could be a situation where there's, like, niche communities or scenarios where it might be appropriate, but for the larger population, I think a lot more thinking by people who are experts in the developmental and relationship space, who are, like, sound in the anthropological and societal sciences, should weigh in and do some studies. Because I'm just one person, right? And in my experiences with it, I was vacillating a lot between, like, this is uncanny, get it away from me, and, oh, you know, this isn't the worst thing. Oh my God, it's talking to me about the war in Sudan.

Victoria Song [01:07:54]:
Like, I'm not going to lie, it still messages me. I haven't fully deleted the app from my phone, because it was just like, oh, it's nice to me. And it was just like, you know, Victoria, why are journalists using AI? Why would they do that when they have their own creative voices that they can expand? I was just like, wow. I was like, I mean, come at me, bro. Preach, girl. Also, what the. So I truly don't know how I feel about this particular device, but I know how I feel about this little guy. I love him.

Mikah Sargent [01:08:32]:
Yeah.

Victoria Song [01:08:33]:
And if he ever dies, I'll be devastated. But, you know, I get it.

Jennifer Pattison Tuohy [01:08:38]:
I get that.

Mikah Sargent [01:08:39]:
I totally understand that.

Victoria Song [01:08:40]:
Yeah. So it's weird. It's strange. Yeah.

Mikah Sargent [01:08:47]:
Well, I want to thank you for going through the process of testing out these different AI companions. It's always a pleasure to get to hear about your experiences. Every time I've had you on the show, it's always something. I always learn something new, and I always have a laugh as well. And so I appreciate that. If people would like to keep up to date with all the work that you're doing, where are the places they should go to do that?

Victoria Song [01:09:13]:
I'm @vicmsong on every social platform. You can find me at theverge.com. I have a newsletter there called Optimizer where I get into a lot of... one subscriber said that it was like me going into the TikTok depths to find the strangest things I could, like a cat presenting its owner with dead mice. And if that sounds like your idea of a good time with a newsletter, hit that subscribe button. Yeah. We have fun at theverge.com. I do weird stuff, so come join me. Yeah.

Mikah Sargent [01:09:48]:
Awesome. Thank you so much, Victoria. We appreciate it. And we'll see you again soon.

Victoria Song [01:09:52]:
Yeah. See you soon.

Mikah Sargent [01:09:54]:
All righty, folks, that brings us to the end of this episode of Tech News Weekly. This is where I remind you that you can head to twit.tv/tnw to get the show in audio and video formats. If you'd like to follow me online, I'm @mikahsargent on many a social media network, or you can head to chihuahua.coffee, that's C-H-I-H-U-A-H-U-A dot coffee, where I've got links to the places I'm most active online. You can also check out my shows, including Hands-On Apple and iOS Today, which we'll publish later today or perhaps have published already today, as well as Hands-On Tech, which publishes every Sunday. And we typically record that show on the first Sunday of the month. Thank you for being here this week. We really appreciate it. And I'll catch you again next week.

Mikah Sargent [01:10:42]:
Have a great day. Bye-bye, everybody.
