Transcripts

Tech News Weekly 334 Transcript

Please be advised this transcript is AI-generated and may not be word for word. Time codes refer to the approximate times in the ad-supported version of the show.

0:00:00 - Jason Howell
Coming up next on Tech News Weekly. It's me, Jason Howell. Hi, I'm back filling in for Mikah Sargent while he's out this week and having a great time. I spend the first half of the show talking with Emily Dreibelbis from PCMag. We have a couple of really great discussions. One, she wrote an article about the ChatGPT subreddit. She spoke with the moderator, the person who created that subreddit. We have a really interesting discussion about what it takes to do that for free on such a popular subreddit. That's fascinating.

We also talk about the Rabbit R1 launch event that happened earlier this week and whether these AI devices can really deliver on the promises that they're putting out there. Then I speak with Scott Stein from CNET. Scott joins to talk about a Meta announcement, Meta's Horizon OS, where they're opening up their XR, AR and VR operating system for third parties to create specialized hardware around. Really neat to see that happening. It's almost like Android, but for the VR OS. Really cool. And then, finally, we round things out by talking with Victoria Song from The Verge about the Ray-Ban Meta smart glasses and how they just got multimodal support, and it actually sounds like they do a pretty great job of delivering on the promises there. All that and more coming up next on Tech News Weekly.

0:01:24 - Emily Dreibelbis
Podcasts you love. From people you trust. This is TWiT.

0:01:34 - Jason Howell
This is Tech News Weekly episode 334, recorded Thursday, April 25th, 2024: Meta Goes Open With XR. Hello everybody and welcome to another episode of Tech News Weekly, the show where every week, usually, Mikah and guests talk to and about the people making and breaking the tech news.

I, however, am not Mikah Sargent. Mikah is out this week. I like to say gallivanting. I don't know if that's actually descriptive of what Mikah is doing, but he's definitely taking some time away from the show. So he reached out to me, Jason Howell, to fill in. I don't know if you remember, I used to do the show with Mikah. Anyways, that was a while ago, but I'm back for this episode and I'm happy to be here. It's so great to be back in the studio, and I'm thrilled to welcome to the show Emily Dreibelbis from PCMag, here for the first half of the show to talk about some really interesting things. How are you doing? Hey, I'm great. Thank you for having me. Your beat at PCMag, is it pretty AI focused these days? I mean, what technology beat doesn't have some thread of AI drifting through it, but is that really a large part of your focus?

0:02:52 - Emily Dreibelbis
I do AI. I also do a lot of EV content. EVs have so many interesting twists and turns going on right now. Are they failing, are they not? What's Elon Musk saying on earnings calls? I mean, there's always something. So I do a lot with EVs, I do AI, and then the story we're going to talk about today is about Reddit, which is a little bit different. So I kind of like to play in different topics and keep it interesting.

0:03:17 - Jason Howell
Yeah, right on. Well, there is a lot of crossover between EV and AI and, of course, the story we're going to talk about today. Yes, it's Reddit, but it also has an AI thread to it. Tell us a little bit about the story that you wrote.

0:03:29 - Emily Dreibelbis
Yeah, so I find myself on Reddit more and more these days. I don't know, do you use it?

0:03:34 - Jason Howell
Yeah, oh yes, I'm a big-time fan of Reddit. It's one of my favorite go-tos, although it's interesting because I don't consider it the same as, like, how I use Twitter or any of the other social networks. I feel like it's totally different, even though it's a social platform. It's just different.

0:03:53 - Emily Dreibelbis
Yeah, I've even started adding reddit.com onto the end of my Google searches, because I just want to know what's on Reddit.

It's like a very informational thing for me, and then, professionally, I find myself on there too. With AI, a lot of the sentiment of what people are talking about, what they care about, has been bubbling up on Reddit, and that has made its way to the media. It's gotten the attention of OpenAI. So I found myself looking at the ChatGPT subreddit, which has 5 million members. It's top 1% in size on the site, so it really has a big, big spot in the Reddit ecosystem.

And I saw that they were looking for new moderators and they had an application link, and I was like, wow, who moderates these threads? It's so different than scrolling through Twitter, scrolling through TikTok or anything else, where I just imagine an algorithm behind it. I was on Reddit and like, oh, there are people who are curating what I'm seeing, which is just a bit of a different model. So I reached out to the moderators and I was like, hey, what are you looking for in a moderator, and can you tell us a little bit more about what you do on a day-to-day basis to maintain this massive group that's controlling and kind of shifting the AI conversation? So it's a behind-the-scenes look at their day to day, and I ended up chatting with the founder of that group.

0:05:13 - Jason Howell
I love it. So I'm very interested in AI but have not really spent much time in the ChatGPT subreddit, so I'm kind of looking through it now, and I don't know exactly why that is. But yeah, I think what fascinates me about the Reddit model, with the moderation, is that usually at the other big tech companies that have a social media component to them, the moderation is often done in-house, and often it's a very draining, kind of unhealthy job for a lot of people because they're exposed to the absolute worst of the worst on a regular basis. You really can only last in a role like that for so long before you start realizing that your own health is suffering, or your mental health is suffering. And yet on Reddit it's all, you know, people who are donating their time, who care so much about this community that they want to kind of give their time over to it, and I have a lot of respect for that.

0:06:17 - Emily Dreibelbis
Yeah, it's spot on. It's interesting. On that point, I interviewed Craig Newmark, maybe a year ago now, and Craigslist, which he founded, had notorious content moderation issues. Craig's an older guy now, in his late 70s, and he's telling me, what I saw trying to content moderate that site scarred me and will stay with me forever. So it really affects the people behind it, and sometimes we don't hear about that.

When we see Reddit just IPO, that's big news, and it's growing and growing and growing. But it's like, who are the people behind this? So, just to get into it, basically, I get on the phone, it's 11pm because this guy's in India, and I'm like, who is this? I get on there and he's like, yeah, so I founded this group, and I'm like, oh, wow, okay. He came to the call with the full story of this subreddit and all the facts and figures, the dates. He launched it right after ChatGPT burst onto the scene in November 2022. It immediately started getting the attention of OpenAI and Microsoft, because people in the group were testing the limits of what ChatGPT could do through a process called jailbreaking, trying to get it to contradict itself, posting those on the group, and then they would find that OpenAI would sort of address those bugs and those concerns somewhat directly in their next releases.

So really, this group, they're listening, they're watching, and Sam Altman even said we're listening, we're watching in an interview with the New York Times, and he was like, oh, it's great, no surprise, that you have hundreds of thousands of free beta testers. So, yeah, he wanted to be anonymous. He's a 23-year-old, he has a full-time job, he does this on the side and he's also applying to grad school. So he logs in every morning and every night. He said it only takes him about two hours a day to comb through it. He has a queue and a feed where he looks at everything. He has to make decisions on how to curtail certain unhealthy patterns people are posting. A big issue for him is too many art posts, people going on Midjourney or going on ChatGPT and being like, look what I made. And he really has to take command of it and figure out how to guide the conversation, and it's like this invisible hand kind of thing.

0:08:39 - Jason Howell
And really, reading through your article and learning a little bit more about him, you know, a couple of years ago, when this all started, he just realized that a ChatGPT subreddit didn't exist and was like, okay, well, I guess I'll start that. It's not like this person did this knowing the depth of what they were getting themselves into. They just happened to be a fan, and I'm sure a lot of subreddits, maybe the majority of them, kind of start off that way. It's like, oh well, you know, this would be cool.

And then at a certain point, I mean, something like ChatGPT, that's like right place, right time, catches on like wildfire, now top of the list. I mean, there's so much attention and energy around, yes, AI, but also, you know, the poster child for the modern AI movement would have to be ChatGPT right now. And so you're kind of caught in this world, and like, how do you feel moderating something that definitely has a life of its own at this point? I imagine there's some pressure on him to feel, to a certain degree, kind of locked into this, because, like, you started it, what are you going to do, you know? Does it get too big and you want to step away? And if so, how does that even work? I don't even know.

0:09:55 - Emily Dreibelbis
Yeah, the sense I got from him was almost that he was incredulous and proud and intrigued at how it had all unfolded, and just the fact that he was willing to talk to me, I think, speaks to that. He did say he has no plans to stop. He will keep going unless he gets sick of it. There are eight other moderators now, so he has a little team of people who are helping him. But yeah, he had no content moderation experience before. He, like you said, just founded it on a whim. He wanted somewhere to talk about his ChatGPT experiments, and it has really escalated to, like I said, Sam Altman's watching the group. And then he told me that OpenAI will be doing an Ask Me Anything, which is intriguing, so we'll see. He qualified that, they're in talks with OpenAI, it's not scheduled, but it does sound like they reached out to him and are interested in doing something like that. Yeah, curious, Jason, what you would ask.

0:10:55 - Jason Howell
What I would ask the people behind ChatGPT? Oh man, that's a good question.

0:11:02 - Emily Dreibelbis
Yeah, if OpenAI went on Reddit to do an Ask Me Anything, what would you ask? I mean, GPT-5 is a big topic of conversation.

0:11:11 - Jason Howell
I mean, yeah, that's the obvious kind of next step. You know, how do you ensure that a GPT-5 can really differentiate itself and stay ahead, especially considering this moment right now in AI? You know, there was a time not very long ago where ChatGPT was far and away the obvious leader in this space. And now we've got everybody really doing amazing work, all with kind of their separate focuses and everything. But I think at this point you could make a really strong case to say that ChatGPT is not necessarily the leader that it was. So, as you're designing, as you're creating this GPT-5 system, what is the priority in creating something that ensures that you kind of keep the reins of where you're at? Because I know they don't want to give that up.

0:12:04 - Emily Dreibelbis
It's such a fair question, so I hope they do do it, or they ask the group to host them. So yeah, one other thing on the moderator, it's a really interesting detail. So when Reddit IPO'd, they gave moderators access to early shares, like friends and family.

0:12:25 - Jason Howell
Yes, that's right, you wrote about this.

0:12:26 - Emily Dreibelbis
Yeah. So it just shows how important these moderators are. The guy I interviewed, I mean, they're just these people behind the scenes, but they are crucial to the business model, and I think Reddit's also thankful for them. I mean, they're unpaid, so they gave them access to these shares. But this guy who started this group, who's now, you know, in talks directly with the company, it has really escalated, he was unable to partake because he is not a US citizen.

So he lost out on his ability to cash in on all these hours in this group that he has curated, every single day, morning and night. He's assembled a team, he is, you know, influencing the AI conversation around the world, and he is still unpaid.

0:13:09 - Jason Howell
Yeah, what a bummer too, right? I know. Leading one of the most popular subreddits right now, obviously in there early enough to potentially benefit from something that could actually make him, I don't know what that turns into from a monetary sense, but it's a nice opportunity that Reddit is extending, and to not be able to do it. Although, from your writing, I kind of got the sense that he wasn't too bitter about that. It's kind of like, you know, it doesn't qualify for me, but I kind of got the feeling that it was okay. Not ideal, but okay.

0:13:46 - Emily Dreibelbis
Right, he said you can't make everyone happy. And he's like, the majority of people on Reddit are American, and he does have an interesting perspective because he watches as everyone debates the wokeness of ChatGPT, all these super American-centric political and racial debates. He's in India and he's just kind of watching it all go down, and curating the queue. To his credit, he lets people talk about what they want to talk about, but when there's a surge and there's a controversy, he's had surges in racially charged posts that are not allowed on the site, such a high volume of them that Reddit itself has stepped in and asked if he needs help to cut down. And I thought that was really responsible and good of Reddit, and it's not what I hear about on other platforms, like with Twitter or X. We hear kind of the opposite story. So it's just different.

0:14:38 - Jason Howell
So would that be Reddit's team stepping in? Like a hired Reddit community co-manager of some sort? I don't know, I've heard of that.

0:14:46 - Emily Dreibelbis
It sounds like it. He called it a mod mail. He said I get a mod mail.

0:14:49 - Jason Howell
Okay, and so there's some, yeah, I've heard of mod mail. Like, we've had in the past, you know, throughout the years, we've started subreddits, I think more in the 2010s decade, for our podcasts. We would have subreddits and people could suggest or share stories and everything like that, and I was a mod on a few of those. Not a very active mod, like a mod by name, but I was not in there on a regular basis doing a whole lot of work because I had a lot of other things to do. But mod mail was, yeah, kind of a part of that, if my memory serves, and I can't remember much more than that. Shows how much work I did on the subreddit. Not enough, probably.

0:15:27 - Emily Dreibelbis
Yeah, yeah. So this moderation model has just, I think, gone very well for Reddit in recent months. The next big thing, so they just IPO'd a month ago, and then on May 7th they'll have their first earnings call. So we'll see how they're doing with the big boys, kind of being a publicly traded company. So that's next.

0:15:47 - Jason Howell
Interesting. Well, fascinating stuff. Definitely learned a lot about what it means to suddenly kind of stumble upon an incredibly popular community. And I guess he didn't stumble upon it, he created it, but I'm sure at a certain point it took on a life of its own, just due to the sheer success of ChatGPT and interest in ChatGPT, and what that's got to be like. Like, how long does that go? Is that sustainable for this person? That was another question I had, is like, what does this lead to? This is a very specific set of skills that you're creating and learning in real time. That's got to be applicable to something else, and, I don't know, maybe not necessarily a work resume or something directly related to that, but it's building you up for something, and I wonder what that leads to at some point down the line.

0:16:41 - Emily Dreibelbis
So yeah, he said he has a group of other content moderators that he talks to. They try to hone their craft. There's a lot of stuff he's seen that is very disturbing, and he said he's numb to it. I mean, he's 23 now. So if he's going through really offensive and disturbing posts every single day, now to the point where he's numb, how long does that go on, and how does that affect a young person? How sustainable is it for Reddit as a business model, just for the next 50 years, or however the platform will grow?

0:17:13 - Jason Howell
Yeah, and I think that's been going on for a long time too, so they've been sustainable so far. But I think you're right. Yeah, longer term, what does that look like? Fascinating stuff. Everybody should definitely check out your full interview to get all the details. PCMag.com. Look for the article titled Mod Life: What It Takes to Keep r/ChatGPT's 5 Million Members in Check. All right, we got to take a break. I told you that Mikah is not here. I'm sort of wrong, because Mikah is here for this moment. Let's hear what he has to say.

0:17:46 - Mikah Sargent
Hey folks, I hope you're enjoying Jason Howell's episode of Tech News Weekly. I know you are and I will be back soon, but right now I am in Memphis, Tennessee, and so I thought I would take a moment to do an ad break with you all. This episode of Tech News Weekly is brought to you by a new sponsor here on Tech News Weekly. It's IntouchCX. IntouchCX is revolutionizing how brands connect with their customers with its proprietary framework, which identifies key areas of automation that can drive productivity, engagement, quality and cost benefits across the entire customer journey in all industries. The experts at IntouchCX guide you through what many companies find overwhelming: where to start, who to work with and what to prioritize first.

So enhance your customer experiences with IntouchCX's industry-leading AI and automated solutions for all channels: voice, chat and email support. Deliver more engaging and personalized experiences at every stage by harnessing the power of AI and automation. In pre-interaction, you can speed up email handling time by 30% by creating predictive email replies and using AI to automate easy work. During the interaction, apply AI predictive analysis to boost agent productivity and increase customer satisfaction. And, in post-interaction, achieve up to 82% resolution satisfaction by using generative AI to analyze all customer cases and develop smart response templates that improve accuracy, efficiency and productivity. Unlock new opportunities between user experience, customer experience and employee experience to see real improvements in the metrics that matter most. So transform your business by anticipating what's next. Discover new ways of working and how IntouchCX's industry-leading AI and automation solutions can get you there. Learn more at intouchcx.com/twit. That's intouchcx.com/twit. All right, Jason Howell, take it away.

0:19:45 - Jason Howell
All right, thank you, Mikah. Good to see you, have fun in Memphis, although we're going to see you again in a moment. But before we get there, let's talk a little bit more about a different aspect of AI.

This has been kind of an interesting week, right? A couple of weeks ago, we saw the reviews break on the Humane AI Pin, and let me tell you, unless you were hiding under a rock, you know they were not good. This pin got slaughtered, destroyed. There was actually some controversy about that.

And now, this last week, the Rabbit R1, which is another AI-assisted device. It has a very retro quality to it. It's really cool looking. It's a little plastic device with a little screen, a little rolly thing and a camera on it. It's supposed to be kind of your portable AI assistant that does things that, yes, your phone can probably do, but does it without your phone, even though it's tied to your phone. I don't know, this moment in AI devices is a little confusing, and I'm here for it, but I don't know what this all leads to. But anyways, that event happened earlier this week. I do have one on order. I'm not going to get mine until, my understanding is, the beginning of the summer, so I'm not in the first group of people to get them. Are any of these compelling to you so far, Emily? And if so, what is it about them that you're looking forward to?

0:21:23 - Emily Dreibelbis
I've been watching this all and just loving it. I'm just totally interested in where this is going to go. And to me, since I'm not reviewing these devices, I'm thinking about it more in terms of just the question right now, like, when do we get free from our smartphones? Like, is there a time when we won't have laptops, we won't have cell phones like we have today? What would that world look like? Because I think there's so much research now that's showing that we are giving up a piece of our happiness, our peace of mind. For high school students, their GPAs are suffering. It's not without consequence, what we're doing, despite how amazingly convenient all these devices are. So I feel like the fact that there are multiple products that are trying to not have screens and push us in a new direction is really compelling. It's also kind of, I don't know, funny. It's maybe too cruel, but it's like they're all apparently terrible. So intellectually I'm interested, but as a tech user, I'm definitely not going to buy one.

0:22:29 - Jason Howell
But I think everybody recognizes that this moment in technology is big enough and feels like enough of a shift in what we are used to versus where we potentially are going, that it makes sense that there would be something new to come out of this, some sort of hardware, because a hardware device that we either wear on ourselves or have in our pocket.

This is, you know, this is a paradigm that we're very used to. At this point, the smartphone has really done that to us, and so it's easy to see why we feel like we need to figure out like, well, okay, well then, what is an AI specific device? It's gotta be something, and so the companies that are making these devices are coming up with all these different ways to differentiate to, you know, kind of try out based on the current, the current models of AI that are definitely imperfect to try and, you know, make promises that they can deliver on, and I'm not certain that this stage of this development is such that they can actually deliver on a lot of the promises. I just don't know that the technology supports it yet. It might someday, but I'm not quite positive that it does completely right now.

0:23:59 - Emily Dreibelbis
Yeah, it's like the desire's there but not the capability. Yeah. So what's your understanding of how they work?

0:24:05 - Jason Howell
Well, I think it's very different how the different ones work. Some of them, you know, lean very heavily on sending data to the cloud, which was one big disadvantage for the Humane AI Pin. If you saw any of the reviews, Marques Brownlee's being the one that most people talked about, because I think he called it the worst tech product he's ever reviewed, which is, whew, pretty harsh. But it takes time, right?

So I ask it a question, for this thing that's supposed to be really fast. You know, I tap it and I ask it a question, and then it's taking that, and one moment, you know. So you've got to wait for the response. It sends it out to the cloud, and then it does its little thinking thing, and then it sends back the answer, and then it speaks it out at a normal human speaking pace, which, when you've waited that long, kind of feels like, all right, taking your time here, to the point where you feel like, okay, well, maybe I could have just pulled out my phone and done this in half the time, in a paradigm that I'm already used to, and possibly more effectively. So, you know, some of them do that.

Rabbit, I know, has some actions that it does on device, some actions that it sends out to the cloud. They're all different. And then, within that soupy mix, is, you know, this whole idea of data privacy. Who actually has access to the data when it goes out to the cloud? Is that secure? Is it encrypted? There are so many questions.

0:25:29 - Emily Dreibelbis
Yeah, I've just been thinking a lot about where it's all going, and I recently read Ray Kurzweil's new book. Do you remember that guy?

0:25:37 - Jason Howell
Yeah, I remember that guy. I have not read the book though, but I'm curious about it.

0:25:41 - Emily Dreibelbis
It hasn't come out. I have a piece that'll be publishing about it in about a month, but he's all about, what does the future look like, where are we going? And he thinks that humans and machines, computers, are going to merge by the year 2045. And eventually we will all have Neuralink-style chips in our brain.

0:25:59 - Jason Howell
I've heard that one.

0:26:00 - Emily Dreibelbis
Yeah, just thinking things into actions. So if you think of it as a spectrum: with a cell phone, we go in there, we tap around, we say like, oh, call Kevin, okay, we call Kevin. Then these new devices are like, we speak, call Kevin, and it calls Kevin. Maybe in the very distant future we'll think, call Kevin. I don't even know if there'll be calls at that point. Maybe we'll just, like, ET phone home to his brain chip or something. But it's like, does this get us closer to that vision, or is it a temporary distraction? I mean, he thinks in such long, broad strokes, in such long term, which is a very different level than, like, the Marques Brownlees who are reviewing it right here, right now, and comparing it to other devices. Yeah, I guess maybe the voice technology, which has just always been bad in my opinion, is the downfall of these devices. I would agree.

0:26:59 - Jason Howell
Like, you know, just this morning, instead of typing out a text message to my wife, I used my voice and I rattled off a thing that probably said far more words than I would have typed. And maybe it's not as fun to read when it's like that, but I did it because it was the easy thing to do. So there are situations where I don't mind using my voice because it's the path of least resistance, because, I'll be honest, I kind of hate typing on mobile devices usually. It's just never an enjoyable experience for me. But at the same time, for a lot of what we probably envision these devices to be good at, we might also want a little bit of privacy in how we communicate that to them. And so then, getting to the Kurzweil theory or vision of the future that you're talking about, maybe that's where we do end up.

Which is, okay, well, talking has its benefits. Maybe it's a little bit faster, but it also sprays that information out into the world. Anyone that's around you also knows what you're talking about, and it's not very discreet. Maybe we do get to a point where this stuff is done without the need for us to voice it, which also makes it faster. And if that's the case, do these devices, like you said, actually start to pave the way for that future? And actually, I think the question that I have is, do they pave the way for the start of that venture into the future more than the smartphone already does? And right now I'm not entirely convinced that they do. It's same, same but different, yeah.

0:28:38 - Emily Dreibelbis
It's like, do we hate our cell phones or do we not? It seems like we like them very, very much, because we spend so much time on them. We're borderline addicted to them. But do we secretly resent them, and do we desire, you know, this new device that's not as screen heavy and that seems to be more seamless with how our bodies just work? We say something and something happens. So, that's kind of how it feels. It was a flop, to be honest, but it raised some new questions, and I think it could kick off an interesting, let's say, five years of reimagining consumer tech.

0:29:15 - Jason Howell
Yes, and I mean, that's what I'm definitely excited about, is that there is this moment, this new technology to focus on, that, yes, is probably going to be pretty disappointing to a large degree when we're looking at this hardware as far as delivering on promises. But I'm also a fan of new things in technology. You know, like smartphones. By and large, at this point, we kind of know what we get with the smartphone. There's little that happens with the smartphone that's truly like, oh wow, I haven't seen that before, that's amazing, I want that. That doesn't happen as much anymore. Like, I have the Nothing Phone 2a right now, which I'm kind of in the middle of reviewing, and its big kind of thing is that it has these little lights on the back that flash when a notification comes through, or it's got kind of like a see-through casing on the back that shows a little bit of the technology, and stuff like that.

So it's got like a style to it.

But I think they're grasping for the differentiation. Whereas you look at devices like the Rabbit R1, and, you know, I threw in the doc, and I know we're kind of running to the end of our time, but the Limitless, which is like a clip-on AI pin, similar in some ways, I suppose, to the Humane AI Pin, but a little bit smaller, kind of looks like an AirTag or something like that. Then there's also the IO1, which is these huge honking in-ear, AI-interfacing receivers. You know, it's like an audio interface. They're the size of a half dollar, like, in each earlobe, not very attractive. Like, there's so much creativity going on here, and I'm not saying it's all good. When I look at this I'm like, I don't know that I want that huge puck sticking out of my ear. But I applaud the inventiveness and the thinking outside of the box that we're seeing in these devices, when compared to kind of how boring and dull smartphones have become over the years.

0:31:14 - Emily Dreibelbis
Yeah, I'll applaud them by, you know, comfortably typing a tweet. I feel like I've had the same phone for years, but it's actually a different model, you know?

0:31:28 - Jason Howell
Yeah, yeah, you know what you get at this point, you know what you're in store for. So, anyways, none of this is to say that I'm, like, disenchanted or anything like that. I'm just trying to be realistic about it. And, you know, I'm excited to see where it heads. I do think that it heads somewhere, and I think we will be disappointed along the way until a certain point, but I don't think we're near that point yet.

0:31:54 - Emily Dreibelbis
Very fair, yeah.

0:31:55 - Jason Howell
Well, Emily, thank you. We got you for 30 minutes today, and it's been a lot of fun hanging out with you and talking about these many stories, mostly to do with AI, but also Reddit, which is a shared passion of mine as well. So I'm really happy you brought that story to the show.

0:32:10 - Emily Dreibelbis
Yeah, you added a lot to that. Thank you so much. This was really fun.

0:32:13 - Jason Howell
Thank you. Emily Dreibelbis, PCMag.com. Go check out Emily's work there. And yeah, thank you, and I'm sure you'll be back. I think you're back, what, next month? Exactly a month from now?

0:32:27 - Emily Dreibelbis
Yeah, I make it every fourth Thursday.

0:32:29 - Jason Howell
Excellent, what a cool situation for the show. I love it. Right on. Well, thank you again, Emily. It's great talking with you. We'll talk to you soon. All right, take care. All right, have a great afternoon. All right. And before we get into not one but two interviews coming up, let's check back in with Mikah.

0:32:48 - Mikah Sargent
All right. Before we get back to the show, one more quick break. Jason Howell, thank you so much. The show is amazing, but I do want to take a moment to tell you about ACI Learning, who are sponsoring this week's episode of Tech News Weekly. ACI Learning, they're the provider of ITPro, binge-worthy video-on-demand IT and cybersecurity training. With ITPro, you will get certification ready with access to their full video library of more than 7,250 hours of training. Premium training plans also include practice tests to ensure you're ready before you pay for exams, and virtual labs to facilitate hands-on learning. ITPro from ACI Learning makes training fun. All training videos are produced in an engaging talk show format that is truly edutaining. So take your IT or cyber career to the next level. Be bold and train smart with ACI Learning. Visit go.acilearning.com/twit. Use code twit30 at checkout to save 30% on your first month or first year of ITPro training. That's go.acilearning.com/twit and use the code twit30. Thank you so much, ACI Learning, for sponsoring this week's episode of Tech News Weekly. Jason Howell, take it away.

0:34:02 - Jason Howell
All right, back to me. Thanks for throwing me the ball, Mikah. Good to see you. See you soon.

Let's start with our first interview here, and I think this is really interesting. Actually, the last half of this show, fair warning, has to do with Meta, and I think that they deserve it. I'm really interested in kind of the direction of Meta and what it's doing in the realm of openness when it comes to a lot of the technologies that they're really championing right now. Meta is taking a very open approach to many of its bigger efforts at this moment. Obviously, it's open sourcing its AI catalog, which I'm sure Mikah has talked about on Tech News Weekly in the past, kind of like an Android for AI in a sort of way, and also opening up its XR, VR, AR OS to third parties. So, in a certain sense, Meta is doing for AR, VR and AI what Google did for mobile and smartphones with Android, and, you know, wider than just smartphones.

So Scott Stein from CNET is here to talk all about this moment with Meta. How are you doing, Scott? Hey, good, how are you doing? Excellent, it's great to see you. Thank you for hopping on today. I really appreciate it. Yeah, likewise, this is cool. So, yeah, big week for Meta. I mean, they had a string of announcements this week, and let's talk a little bit about the XR strategy, because I know at CNET you really spend a lot of time with these technologies. You spend a lot of time in goggles. I know that you use your VR headset, I'm assuming it's a Quest, for fitness stuff. Talk a little bit about the announcement, first of all. What is new with Meta's XR strategy as of this week, based on this announcement?

0:35:54 - Scott Stein
Well, it was a fascinating chess move, because what they announced was something that Andrew Bosworth, CTO of Meta, had kind of hinted to me in an interview I did with him a few weeks ago, but I didn't realize what he was saying there, because I was asking about, you know, whether the Quest family could expand into fitness, and he was talking about, oh, there'll be different designs and different possibilities and different forms. And then they announced that they're opening it up to third parties. So there are different OEMs that are going to be making Quest-alike things running what they're now calling Horizon OS, which is basically the Quest VR, AR software. And so a likely hardware partner, Lenovo, who's been doing a lot of these things for years, is already in the mix.

ASUS and the Republic of Gamers, that ROG brand, is going to be doing a gaming-focused one, although you would say Meta is already doing a gaming-focused headset, and Microsoft, also gaming.

They talked about a branding partnership with an Xbox one, which is kind of wild on a number of levels, and we'll get into that in a sec, but they are opening this up to hardware.

But they're also suggesting, like you were saying, they're opening it up in other ways. They've already been allowing Valve and Microsoft in for a lot of their services and apps, and there's been Steam Link.

But there's a suggestion from Mark Zuckerberg that, you know, a lot of this will run the apps you want to run, when he did an Instagram talk about it recently. And so it looks like maybe, you know, running whatever apps, and they welcome the Google Play Store coming aboard at some point. There's been this tug of war behind the scenes, about, you know, Meta's wanted Google, and maybe Google hasn't wanted to come on board. But Google also has its own mixed reality headset partnership upcoming with Samsung, and so now there's the question of, is Meta positioning itself to try to align even more with Samsung and Google to do this, and what will Samsung and Google's move be with this? But it does look like Meta will be one of the most widespread and open XR presences out there in the landscape.

0:38:21 - Jason Howell
Yeah, and it gets even more confusing. Like, as I'm listening to that, I'm like, all right, there are so many chess-move kind of directions that this all goes in. You know, Horizon OS is based on Android, right, if I'm not mistaken? So it's Google's Android powering Horizon OS, which might threaten or influence in some way, shape or form Google's own XR strategy. And yeah, it just kind of gets a little confusing. But I mean, I'd say that's a great position for Meta to be approaching this from, because I do think that this particular marketplace could really benefit from having more of an open approach for the OS, to start seeing some hardware differentiation.

That isn't a one-size-fits-all headset that, like you said, is really heavy. I think the real key call-out, as far as, oh, I totally get it, is the fitness headset. You know, throw an Oculus on your head with all the extra weight, and maybe it's an Oculus that you have to plug into your PC to do fitness or whatever, versus getting a specialized headset that is lighter, that is designed for fitness, and that's just what it's really good at. And, you know, maybe it costs less. I don't know what the cost factor will be in all of this, but that really makes a lot of sense to me.

0:39:47 - Scott Stein
Yeah, it does. And the price is a good question there, because I think that's where expectations might be that all of these partners would go on the upper end of the spectrum, because Meta's usually been offering a lot of these headsets at the lower end, to the point where they've been subsidized and really hard to beat. They lowered the price on the Quest 2, however long it remains on the market, to $200, which is to me an absurdly good and low price for something. And that undercuts the Nintendo Switch, it undercuts, you know, iPads, and I don't see a manufacturer being able to beat that. I don't know if it's like phones, but I do think that you'll end up with these, maybe the pro landscape that Meta was trying to do with the Quest Pro. I wonder if they're maybe sort of seeding some of that here to manufacturers, who could get the components for what they want, so that Meta doesn't have to try to build the perfect Apple Vision Pro competitor here, that they might be able to play in a lot of different directions. I think that's super interesting. I wonder too about the OS, as my mind keeps wrestling with this days later. Does the openness mean that they're sort of unbundling the software to move into other people's headsets? We haven't seen that yet, but, you know, would someday Meta want to put its software on a Vision Pro? Would Apple want that? Which, you know, that's interesting. Or, if Google puts its foot down and says we are not putting a Google Play Store on Meta's headset, would Meta be open to putting some of its stuff onto the Google headset, or Samsung headset? And so I think that's interesting too.

There's a whole bunch of parts of this, and the other part that I think about, with rendering this open landscape, and we think about apps being all over the place, is again the AI part. You know, I think AI has been lurking in XR. You've seen wearable pins right now and glasses, and that has been very AI forward right now, particularly in glasses. When I asked Bosworth about where he thinks generative AI will go next, he brought up the Quest, and he brought up generative AI for creating environments in Horizon's user-created worlds, kind of like what Roblox is doing. And, you know, all those chips that Qualcomm made last year, they're all meant to be AI-forward and AI-specialized ones, even the one that's on the Quest 3 and will be on the Samsung-Google one. And I think about sensors and data and AI, and who has that platform seems like an umbrella that's now the larger, more interesting one, even than where your app's at.

So maybe that's, you know, Google's end game. It seems, with these glasses and in AR, to be about assistants and these kind of invisible, flowing things that are kind of beyond apps. And so now we have partners coming in here, but I wonder if this is also where Meta is going to be focusing, on how its AI services and things can be the important glue down the road for all of these.

You know, that's bigger than the hardware. It's fascinating, though, and I think, at this time, when there was no other, you know, Quest is what people have at home. Quest has cemented itself. Whether or not people think of these as for everybody, so many people have them and so many people know what they are that I think it allows them to expand. The Vision Pro is such early days and so expensive, and Google and Samsung haven't even started yet. So I think that this catches them right before the developer conference, if they're going to even talk about that there. You know, Meta's flexing, and, you know, probably they need to, but it's fascinating.

0:44:00 - Jason Howell
Yeah, I'm really happy that you brought up kind of the eventual and, I think, inevitable integration of AI technologies into these experiences.

That's when we start seeing these promises that have been made about immersive quality technology like VR starting to really deliver.

And we're seeing, you know, I think we're starting to see in a non-VR world some of those elements. Like, Microsoft had its VASA-1 demonstration earlier this week that kind of shows, you know, an animated face, I think taken from a single picture, that's animated, moving, looking expressive, kind of talking to the voice that's been fed into it.

And I look at something like that and I'm like, man, once we take those technologies and bring them into a really well-developed VR experience, as like a representative avatar of who I am, that's when it starts to get much more difficult to perceive the difference between reality and non-reality, or virtual reality. And that's when we start to see these technologies delivering on the promises that they've been making for years, which is, like, you know, it's going to make you feel like you're in a different world and everything. Yeah, but I still feel like I'm looking through goggles, you know, and all those things. I think that's really where this stuff combines, and yeah, there's a lot of benefit for Meta to be positioning itself there, with all of this technology coming together openly.

0:45:40 - Scott Stein
Yeah, and I think, like, the AI stuff, when I look at it on those things that we have now, like, you know, the early days of the Humane AI Pin, the Meta glasses, which I've been using, both of those, and a lot of the feelings, even what I've seen with the Rabbit R1, you know, it's like day zero. But there's a lot of thoughts of, like, generative AI being kind of a little flaily, a little, where are you going with it? You know, the precision things, are you getting to what you want to do with it? It's a bit multiverse-y, and I think of it like, you need lots of precision inputs, but also ways you can suss out what's going on. To me, the AR VR world, having these headsets that have big displays and lots of inputs, it's interesting to me because it feels like a way you can kind of explode out some of the generative AI stuff in the future and maybe have control over it. It's like a Doctor Strange-like image, but, you know, it's like you've got all these different things that you're kind of vomiting out with generative AI, and then you've got to kind of wrangle them back in and pick which one you want.

To me, it's not just, oh, what is the best blank, and then you get that and then you go. That seems very simplistic. It's like how I do web research or how people do stuff now. You have a lot of different pieces and you're going to kind of sift them out and be like, I like that one, I'm going to mix it with that one. I just feel like these combine down the road in an interesting way in those big headsets where people go, what are you doing with them? Maybe it's for all the stuff like this, as opposed to, oh, I'm launching one immersive game, one immersive video. I think of it like, no, you're going to be sort of wanting to do multiple things. That's why you have that canvas and the inputs.

0:47:34 - Jason Howell
Yeah, I think you're absolutely right. It's a really interesting time. So much is happening, and, you know, on one hand it seems like these things are happening in their own worlds, but it's starting to become a lot more clear how they do converge and how they combine. And I don't know how much longer before the trials in that realm begin. And, you know, again, like we were talking about earlier with Emily Dreibelbis, we were talking about the Rabbit R1 and these early AI devices not quite delivering on the promises. I don't know that that happens immediately, but we've got to go through the pain points to get there, and I think we're kind of on the horizon, no pun intended, for that to happen.

0:48:21 - Scott Stein
That always happens when I talk about something.

Yes, but yeah, there's like a play period. You know, there's a story I'm working on now, I was talking with another AI wearable maker that I'll be writing up soon, but the need for play, I think, exactly.

It sounds very shallow to say that when everybody is also thinking about, how am I going to get stuff that's useful, and this and that. But I feel like, particularly for AI, it seems like it all got pushed into function really fast, and then it's like, well, I don't want it in my whatever thing where I already know what my workflow is. But is it more of a play thing? Like, VR and AR have been play things for so long that everyone was like, well, when are they going to get useful? But I still feel like, with AI and all these things, they need to kind of play and figure themselves out a little bit more before we know absolutely what we're putting this into, you know.

I just feel like this has been a year where we were prescribed that it's going to be in all of our search engines and phones and everything else, and I'm like, well, for what, you know? And I go, to what end? And I know that it's fun to generate stuff with it, or it can really annoy people or piss people off to generate stuff with it. But beyond that, there's a big road to figuring out how that can be helpful. It seems like a lot bigger road than we've given it, and so maybe that's it. Maybe it's more of a period of playing and experimenting.

0:49:56 - Jason Howell
Yeah, I like that. We could use a lot more play in our lives, I think, with our technology right now. So I think you're absolutely right. Scott Stein writes for CNET and does a lot of work with all of these technologies that we've just talked about, and does excellent work. Scott, thank you for taking time to chat with me today about this. Appreciate it.

0:50:16 - Scott Stein
Hey, thanks, Jason. It was great to chat, and we'll catch up on this more soon.

0:50:21 - Jason Howell
Absolutely. Yeah, like every week, basically, that's the amount of this news that seems to be happening right now. So everybody should check out your work, Scott. You're awesome, I appreciate you, and we'll talk to you soon.

0:50:35 - Scott Stein
All right, thanks a lot.

0:50:36 - Jason Howell
See you later, likewise. All right. Well, speaking of Meta, and, heck, also speaking of AI hardware, I'm sorry you guys, this is like three quarters about AI hardware. It's just the way the day works, you know, the way the week works. This week, in particular: Ray-Ban Meta smart glasses. We're going to talk a little bit about these because they have some new capabilities, which I think is a pretty interesting big deal. I'm also realizing I think I need to pick up a pair of these glasses just to test them out, because when I think of kind of this moment in AI hardware, these might be near the top of proving that it can be successful, or proving that they can not make too many promises but deliver on the promises that they do, which is apparently a theme that keeps coming up on today's show. Victoria Song from The Verge is here to talk about some new features and what she thinks. Victoria, welcome back.

0:51:36 - Victoria Song
Hey, thanks for having me.

0:51:37 - Jason Howell
Yeah, so, okay, first things first. Should I just get one of these? Like, is it worth it? I was looking at the price the other day, and I was like, I could maybe justify it. But, you know, I don't want to be disappointed, I don't want to be let down. Are you let down by them, or are you pretty positive on them so far?

0:51:56 - Victoria Song
I have not been let down by them. I just think you need to be very intentional about the type of glasses you get with them, because I have them as sunglasses, which means I don't get to use them quite as often as if I had gotten them with Transitions or with, like, the clear lenses. So if you're like a super productivity user, I would actually just get the Transitions, because then you can use them day in, day out. My spouse has a pair with the Transitions lenses and you can't pry them from their hands. They're so in love with these things. It's a little unsettling, because, it's really funny, my spouse is not into wearable tech at all. They're not into any of it. That's telling though.

That's really telling. It's so telling, they love them so much. And it's funny because they're on a trip with their dad to Texas right now, and I texted them, I was like, wait a minute, how was it going through TSA with the Ray-Ban glasses? And they laughed, because they were taking them off going through security, and the TSA guy was like, wait, no, no, no, you can leave your glasses on. And he's like, buddy, these have cameras in them, I'm taking them off. So it's totally just mind-blowing.

Because I go through a bunch of different gadgets in a day, I primarily use the Ray-Bans when I don't feel like running with contacts on or it's super bright out. You just have this wonderful ability to have them as headphones, open-ear headphones, which is crucial if you're running outside and don't want to get hit by a truck or anything like that. Great headphones, great sound quality. So I find them super useful in that context. Because they're sunglasses, I myself have been like, you know, I really should just go to my local optometrist and get the lenses switched out so that I can wear them all day, yeah.

0:53:54 - Jason Howell
Yeah, yeah, it kind of sounds that way. So, and this is, so far, your experience using these with their kind of pre-update functionality. I feel like the current update, which, by the way, for those who don't know, is multimodal support for the glasses, which we can talk about, I feel like that's a pretty significant upgrade. What do you think it is? Yeah, tell us a little bit about what that actually means with these glasses.

0:54:20 - Victoria Song
So multimodal AI just basically means they're taking a bunch of different inputs, and the AI is allowed to interact with them. So I think the flashiest thing you can do, probably, is go, hey, Meta, look at this and tell me what it is, or something like that. So I did it with my cat, and I was like, hey, Meta, look at this and tell me what kind of cat this is. And the AI just goes in my ear and it's like, well, this looks like a cat with unique stripe markings, and it's a tabby with bright orange eyes. And I was like, well, you know what? That is my cat. That is an accurate description of my cat.

And other things you can do would be, like, say, hey, take a look at these plants and tell me how often I should water them, and it'll tell you. Or, you know, you can do a little more conversational things like, hey, write an Instagram caption for my last picture, and just a bunch of things that you would ask maybe your phone's voice assistant, if your phone's voice assistant was a little bit smarter than it is. And it's kind of a natural interaction, because you're wearing a pair of cameras on your face and you're actually looking at the world, so it kind of fits into the flow naturally. There's no, like, forced gestures you have to learn. I briefly played around with the Humane Pin and that was just like, wait, what do I do now?

What are the gestures now? Oh no, it's not responding to my fingertips properly. What's going on there? There's nothing to really learn aside from how to speak to the AI in a way that it understands you. It's a very interesting thing to play around with. And again, my husband is also part of the early access, and they have just been going to town identifying every single car that they see on the street, because they're a car nerd. So they'll run up to, like, I don't know, a Corvette on the street, and they'll be like, hey Meta, what model of Corvette is this?

0:56:25 - Jason Howell
And then he'll turn around and be like, oh, I got it right, or, oh, I got it wrong. And every time they do that, that's confirmation that they can trust it to do the next thing. And those kinds of interactions, I've noticed in my own experience, not with these glasses but just with AI in general, those kinds of interactions kind of light the spark for, what else could I do? Oh, okay, now I've got a little bit more trust. It's giving me a little bit more of what I expect out of it when I ask it these things. Now I'm going to level it up. What about this? And that's really telling, that your partner is not big into wearables and yet can't stop using these things. That's pretty dang positive.

0:57:03 - Victoria Song
They're kind of an evangelist, because they go to car meets and everyone's like, oh, what are you doing? Because they're walking around like a little spy recording stuff for content. And he's like, oh, my glasses, they do these sorts of things. People who make car content often have to wear a GoPro strapped to their head, so this is a much better implementation than having to do that. For his particular niche, that's great. But the one thing about the multimodal AI is just how confidently wrong it is sometimes. That's AI in general, that's AI for you. Like, I got a bunch of plants that I rescued from some friends with black thumbs, and they tend to be succulents, and I go, oh, look and tell me what plants these are. And it told me, very confidently, that I had aloe vera and echeveria and some other types of succulents, and, to the best of my knowledge, I do not have an echeveria.

So I was like, oh, you got that one wrong, that's not actually part of that. Or like, we have two Alfa Romeos, one is a sedan and one is an SUV, and it looks at the SUV and says that is the sedan. And it's like, no, you got the carmaker right, but you got the model completely wrong. But it was very confident in giving me that answer.

What's striking to me is how, when it works, it works in a way that's very intuitive. You're not standing on a street, asking a pin and waiting 19 seconds for it to tell you the name of the restaurant. That's not what's happening. You're just going along your way, you see something, and you ask it what you're seeing. That's very intuitive. That makes sense. So you use it in a way that doesn't require you to adapt what you would already have been doing. Instead of taking out your phone, taking a picture and looking something up, you're just doing what you would do naturally.

The one time it doesn't work out so great is when you need to zoom, because you can't zoom in with the camera.

You know, my husband comes running through the room like, is that a giant obese squirrel in our backyard? And I was like, what? And we look, and it's actually a groundhog. And it became this little contest. I was like, okay, I'm going to whip out my phone, zoom in, take a picture and run it through the internet and see what it is. And they were just trying to take the picture with the glasses. It wasn't working because the groundhog was too small in the picture. But then they took a picture of my picture and it was like, okay, it's a groundhog. So that was a really funny instance of tech nerds just experimenting with the limits of the multimodal AI. But I will say it's surprisingly intuitive, just that form factor. You're used to wearing earbuds and talking to your phone's assistant, so that's already built in, and the camera aspect, what you see, is already kind of built in too. So I think that's very smart of Meta to put it in this particular package.

1:00:23 - Jason Howell
Yeah, the way that we interact with a device is, I think, to a large degree going to determine how successful it is in the long run. Maybe there's a steep learning curve on a particular syntax, where you've got to speak the robot speak versus the human speak in order for it to understand you. Or you have to remember to touch it with two fingers instead of one and make sure you do a slight swipe down at the end, because that's what tells it to do this thing versus that. Those are all hurdles that a tech nerd, someone who really loves tech, most of the people probably listening to this show, might be fine with. But your partner, in order for them to be an evangelist of a technology like this, it needs to be dead simple, it needs to be intuitive.

It needs to make sense through and through. And one question I have for you is, as far as this device is concerned, compared to other, let's say, competing technologies, is it a matter of not promising as much but delivering on what it does promise, for the most part? Would you say that's a fair statement about these?

1:01:23 - Victoria Song
I think that's an excellent way of putting it, because you have to prime people for a new future, right? You have to get them used to the idea. So if you show something that's big, grand, disruptive and it doesn't work, well, they're just going to go, oh, it doesn't work.

1:01:39 - Victoria Song
Like, you know, I had that experience a little bit with the Humane AI Pin, where we all saw this demo where it said it was going to translate everything and automatically detect Spanish and just make it work. So I tried it with Japanese and Korean, which I speak a bit of, and it was just spewing gibberish back at me, and that illusion was very broken.

1:02:04 - Jason Howell
So it's like oh, okay, so that was real finicky.

1:02:07 - Victoria Song
Like, the more friction that goes into using it, the less you're going to enjoy it, and a lot of the Humane reviews were just talking about how long the cloud processing time was. With these, you know, the glasses are connected to your phone and using that internet connection, and our phones are pretty fast, so you're not really waiting a whole lot of time for the Meta AI to spit an answer back at you. If it doesn't know something, it just tells me, oh, I can't do that yet, but you don't have to go and go like, hey, do this for me.

1:02:40 - Jason Howell
Right, you don't have to go, hey, do this for me, dot dot dot, and just wait. We're not in a world right now where we're comfortable leaving our smartphone at home anyway, and so a pitch of having that baked into the device, don't worry about it, it can do everything, just leave your smartphone at home, that's a lot of promise to make with the Humane AI Pin, at least in its current state, by comparison.

1:03:05 - Victoria Song
Yeah, I just don't think they got the timing of things quite right. There was a lot of excitement around the idea of it, and I don't think the idea is necessarily bad, it's just, where are we at? Where are people at? Where do we think this AI is going? And I think a lot of companies don't have that answer yet. So there's a lot of experimentation, a lot of playing around. But when you think about the average person, like your grandma, your mom, the least tech-savvy person you know, they have to be the ones who are also on board with it. If they don't see a reason for it, well, you're just preaching to the already converted, and that's not really going to change society as a whole. So, sunglasses. Everybody likes a pair of sunglasses.

1:03:53 - Jason Howell
I'm just saying, you know, everybody knows how to wear earbuds. Yeah, most everybody likes a pair of sunglasses, and it helps if those sunglasses look good. This is where I realize that Google Glass, let's just say, was far and away ahead of its time.

The world was not prepared for that, the technology was not prepared for it, and up until these it was hard to point to any other smart glasses and say those look like actual normal glasses. But that's where we're at, probably because other glasses were trying to mimic the smartphone, put a screen on it and do all this other stuff, and these say, okay, peel that back and don't make as many promises, but deliver on the promises you do make. Any insight or guesses into the broader plan of functionality for these as we go forward, based on where we're at right now?

1:04:49 - Victoria Song
Well, we've heard some rumors. My colleague Alex Heath has written a lot about what Meta says their VR and AR roadmap is going to be, and I do think we're going to see some sort of wearable accessory for these. We've seen a lot of rumors about some sort of wrist-worn device that interplays with the glasses, and I think that's very possible. But just in general, I really do see wearables becoming their own sort of ecosystem. We're seeing it a little bit with smart rings as kind of an accessory for your smartwatch, which is already an accessory for your phone. So I think we're in the accessory section.

1:05:33 - Jason Howell
That's where we're at right now.

1:05:35 - Victoria Song
I really do think that what we're going to see with wearables going forward is just kind of like an ecosystem of accessories which, you know, from an antitrust point of view is maybe not great, but I do think it's just something that companies are going to start playing with to just have these devices interact with each other and less with screens.

1:05:57 - Jason Howell
Yeah, I think you're absolutely right. The final thing that I'll throw to you, and then I'll let you go because I know I've kept you a little long, is about public perception of devices like these, getting back to what I said a couple of minutes ago about Google Glass being ahead of its time and the world not being ready for a camera on our lenses. You've had plenty of experience with these glasses, and now, with these new features, and if you swap out the lenses, it sounds like you want to use them even more than you already do. Do you get the sense that, at least where you are geographically speaking, the real world is more accepting of a device like this now than it was five or ten years ago?

1:06:43 - Victoria Song
Absolutely. Google Glass was just way too early for the time. We didn't really have a cultural acceptance of being filmed all the time. But now, if you walk down the street in a big city, you are an extra in somebody's memories. Somebody is filming something, someone's on a FaceTime, you're in the background. You're constantly a background character in someone else's vlog. So just in general, I think we are a lot more comfortable with being filmed out in public spaces. Public bathrooms, though...

1:07:14 - Jason Howell
That's another story. Take the glasses off.

1:07:18 - Victoria Song
Like that's a whole other story.

But you know, if you're out in a public space where there's a reasonable expectation that other people have phones and know things are being recorded, I think we are a lot more immune, or inured, to that kind of privacy, not violation, so to speak, but to knowing that you don't really have that kind of privacy in public anymore, or even in a restaurant. There are TikTokers in restaurants these days just going, oh my God, this is my day, this is what I'm eating in a restaurant. So is it really that different? I don't think so. I think we're a lot more used to that idea, and that language has just been building over the past 10 years. So, going forward, our idea of what's acceptable may be very different than what it is now, because it's very different than what it was 10 years ago. Yeah, 100%.

1:08:13 - Jason Howell
Absolutely agree. Victoria, what a pleasure to get the chance to talk to you today. Victoria Song writes for the Verge. Definitely follow her work over there because she does great work. Thank you so much for joining us today, Victoria.

1:08:27 - Victoria Song
Thanks for having me. It was a fun chat.

1:08:29 - Jason Howell
All right, right on, we'll talk to you soon. Or Mikah will probably talk to you soon, but maybe I'll talk to you soon as well. See you, Victoria. All right.

1:08:35 - Victoria Song
Bye, have a good one.

1:08:41 - Jason Howell
Bye-bye, you too. All right. With that, we've reached the end of this episode of Tech News Weekly. What a fun show. I'm so happy to be able to drop in and do this. This show records every Thursday, and twit.tv/tnw is where you can go to subscribe to Tech News Weekly in audio and video formats. And, by the way, if you haven't heard, you can get your shows ad-free. That's right, that is possible.

Do you want to check out Club TWiT? $7 a month gets you every TWiT show with no ads. You get an exclusive TWiT+ feed with extra content, stuff you can't find outside of the club, and a members-only Discord channel, which is just so great. I'm so in awe of the Club TWiT Discord. Let me tell you, I've tried to build my own and it's hard getting the level of community that Club TWiT has. It is so active and so vibrant, and that right there is one of the really big reasons you should sign up: twit.tv/clubtwit, $7 per month, and you get so much. You can also subscribe to individual shows on Apple Podcasts for $2.99 a month. You can search for them there and you'll get the audio feed completely ad-free.

As for me, if you want to find me, I'm all over the internet doing my own things these days. I have a new YouTube channel, youtube.com/@techsploder. I love that name. That's where you're going to find my AI Inside podcast, because, big surprise, I like AI. I talked about it a lot on today's show. My apologies if some of you out there were rolling your eyes, but I do a show with Jeff Jarvis every week called AI Inside. That's on the Techsploder YouTube channel.

I also have a new podcast launching next Friday, May 3rd, the Techsploder podcast, where I'm going to talk with people who are very familiar to you, I promise, about their origin story in technology. It's about bringing the humanity back into the world of technology, and I'm super pumped for that. So, youtube.com/@techsploder. Or you can find me on Twitter at Jason Howell and on all the other social media platforms, just search for me there. Big thank you to Mikah and Lisa for asking me to fill in for Mikah while he was out. This is just a heck of a lot of fun. Also thanks to John Ashley, to Burke, to John Slanina, everyone here at the TWiT Studio for making this happen. Mikah will be back next week on Tech News Weekly and I'll see you next time. I'm Jason Howell. Take care, everybody.
