This Week In Law 262 (Transcript)

Denise Howell: Next up on This Week in Law, we’ve got Meg Ambrose, Mark Paulding, Evan Brown and me. We are going to try not to lose track of Justin Timberlake, and discuss the right to be forgotten in the process. We’ll talk about some slightly peeved men, a.k.a. the legal issues in Halt and Catch Fire, trust displacement, the inadequate, irrelevant and excessive steps needed by Google to comply with the right to be forgotten, and lots of other things, including: can you trust Airbnb? All next on This Week in Law.

Netcasts You Love, From People You Trust. This is TWiT! (TWiT logo) Bandwidth for This Week in Law is provided by CacheFly at (CacheFly logo)

This Week in Law, Episode 262 recorded June 6, 2014.

Follow Me, Taser Drone!

Advertisement: This episode of This Week in Law is brought to you by NatureBox, where you can order great-tasting healthy snacks delivered right to your door. Forget the vending machine and get in shape with healthy, delicious treats like dark cocoa nom-noms. To get 50% off your first box, go to … (Advertisement page: deliciously awesome snacks, delivered to your door every month. NatureBox. Go to … to get 50% off your 1st box.)

Denise: Hey folks, halt and don’t go anywhere. We are starting This Week in Law. We have an amazing panel for you today. We are going to talk a lot about, oh, I don’t know, some prime time television and some different considerations around privacy, and we have great folks to do that with here today. Joining us once again is Mark Paulding from InfoLawGroup. Hello Mark.

Mark Paulding: ( - @infolawgroup). Hi, how are you?

Denise: ( - @dhowell) I’m great. I’m so glad you could join us again. We had a technical and scheduling kerfuffle where you were going to join us recently and that didn’t wind up working out, so thank you so much for bearing with us and coming back on the show.

Mark: Oh, absolutely, I’m happy to join.

Denise: And also from InfoLawGroup, my cohost, Evan Brown. Hello Evan.

Evan Brown: ( - @internetcases). Hi Denise, it’s great to see you again. Hope this early summer Friday afternoon is going well for you. It’s great to be here. As always looking forward to this conversation.

Denise: Yeah, me too. And Evan was recently quoted in a piece with Meg Ambrose, who is a professor at Georgetown University. So we had to get her on the show. Hello, Meg.

Meg Ambrose: ( - @megleta) Hi.

Denise: Great to see you. Tell us what you teach at Georgetown.

Meg: I teach a basic cyber law class, I teach privacy and surveillance, I teach a robotics policy class, and a class on big data called Governing Algorithms.

Denise: Wonderful. That’s a full plate.

Meg: Yeah, it’s a fun plate.

Denise: Yeah, absolutely. Well, your students are lucky folks. Let’s get into some of those topics. We are going to start with one that isn’t on your Georgetown curriculum, but hopefully it’s on people’s minds if they have been watching, or started to watch since the pilot just premiered last Sunday, I think. A new AMC show, Halt and Catch Fire, presents some interesting legal issues, primarily on the copyright front. So let’s start there first.

(Copyright law: VCR with tape popping out of machine, music playing. Headline: copyright law on top of FBI warning)

Evan: Or start and stop, whichever works.

Denise: And it’s funny our little bumper has the VCR there, because the show itself is set right in the VCR era, back in the early 80s, a time near and dear to my heart. I was a high school student then and so remember the time frame captured by the show, even if I was not living in what they call the Silicon Prairie in Texas. I remember some of the tech; I’ve seen some of the machines that they put into the show as props, so it feels quite familiar. And actually, like, the show is unfolding in a time that was not all that long ago. And the reason I put it in our topics for today is because (Website: AMC: Halt and Catch Fire, Sundays, 10/9C, episode two premieres) the show kind of revolves around an important legal issue called reverse engineering, that we hardly ever get a chance to talk about because it doesn’t seem to happen that much anymore. And I will get your takes on that in a minute, but first let’s set it up if you haven’t seen the show. First of all, I think it’s pretty interesting, and if you haven’t caught it and you’re an enthusiast of technology and how we got to where we are today, I think it provides some good historical perspective on the PC era, and how the IBM PC was just completely pervasive when personal computers first became a thing. And then, how various competitors decided, well, hey, we need a piece of that action, and due to the IBM PC’s open architecture people were actually able to reverse engineer it, buy sort of off-the-shelf products, and put together their own version, but not without some legal road bumps standing in the way. And this show is about that. It features a character who supposedly was an executive or in sales with IBM, who leaves, goes to Texas, and kind of tries to light a fire under a small company there to be a competitor. So, with that in mind.
It also starts out with an encounter between this guy and a young woman when he is lecturing at her school, I believe this was supposed to be in New York in the show, where he gets to know this sort of, she’s channeling the Clash from that era, kind of punky, perky, very aggressive and self-possessed young woman coder, who doesn’t take any guff from this guy who comes to talk at her school. They sort of have a liaison and then she comes back in the show later on. And she does so in the context of this whole reverse engineering legal conundrum.

(Website: playing video of Halt and Catch Fire)

Denise: Squadrons of IBM lawyers are involved in the first episode. Oh, there we go. The liaison, by the way.

Evan: That’s what you meant.

Denise: If you’re watching the video, yes, we’re getting some kind of a promo for the show here. In any event, lawyers are coming through when it becomes clear that this company is embarked on this course. And in fact, as Dave Levine pointed out to me over on Facebook, which is where I saw this, a writer in the New York Times, Alessandra Stanley, hit on this topic of reverse engineering, tagged it as piracy, and says this is going to be a tough show for people to get their arms around because, let’s see, she says, buccaneering on the high seas, the kind that involves daggers, planks and rum, is romantic because it remains safely in the past;

(Webpage: Television: Blackbeard as a Geek: Plundering in High-Tech)

Denise: copyright piracy, on the other hand, may be too close for comfort. Part of her problem with this show is not just that it is about piracy, but that for a period show you need the show to be older than this one. But we are going to focus on what she calls piracy: reverse engineering. Which actually, when you have spent some time studying copyright law and fair use, has a long and established history as part of the fair use doctrine, and as one of the exceptions to the DMCA anti-circumvention provisions when done in certain ways. So, the reason I put it in the show here is because they just sort of forge ahead in the story without giving you any background as to why, or how, they are walking this fine legal line, and I thought we would try and provide some perspective on that for you. Evan, I think you said you haven’t caught the show yet. Is that right?

Evan: No, I haven’t, but I ought to be sure to now, on your recommendation, because I hear that this may fill the void that’s going to be left by Mad Men. So we can look back on this moment today and see how prescient we were, recognizing this being a good show. So, I will definitely try to look at it. It looks intriguing and, you know, it’s got all the great stereotypes of the early 80s, including making fun of the lawyers, with the matching dark suits and white shirts, like we’re all dressed here today, right? Yeah, I’m glad you brought it up to sort of frame a discussion about reverse engineering, because reverse engineering is one of those concepts in intellectual property law where you set it down and its footprint tends to cover a number of different areas of intellectual property law, which makes for an interesting discussion. I think it comes up pretty often when you’re talking about the law of trade secrets as a form of intellectual property, because reverse engineering something is often looked at as a defense, or a carve-out, from liability for misappropriation of trade secrets. And what we’re talking about here is information that has a commercial value to the company, which undertakes efforts to keep it secret. Well, if somebody takes the device or the process or the technique or what have you and reverse engineers it, comes in and figures out how it works without doing anything unlawful (and already there’s some foreshadowing as to some other areas of intellectual property law that apply here), that’s not a form of misappropriation of trade secrets. So clearly trade secrets law comes into play. Copyright law comes into play, because if you’re reverse engineering the way a piece of software works, you go and you examine it, and look at how the logic works.
And then you figure out a way to make a literal expression of that, writing your own code. If you’re not just copying the code, that’s a form of reverse engineering that wouldn’t be copyright infringement. Patent law plays a role in this too, really sort of conspicuous in its absence, because one of the ideas of patent protection is that the technology will be disclosed to the world, so there is really no reverse engineering to be done if you have the patent claims written and on file at the patent office. I was sort of foreshadowing a moment or two ago the means by which you can access information about what it is you are reverse engineering, so there, in certain circumstances, would come into play the anti-circumvention provisions of the Digital Millennium Copyright Act. So, you can just see by going through this litany of different issues that can arise, it is a fruitful area for legal discussion, one that can certainly bring in some interesting facts. And of course, we can’t talk about reverse engineering without kicking around the idea of how important, how critically important, a role it plays in the process of innovation. There, I said it: innovation really is the key concept going on here.

Denise: Right, because what winds up happening when something is reverse engineered, if you do it the right way, is you wind up with something that is either compatible with the original thing you are reverse engineering, so you can interoperate with it, or you might come up with something that is like the original thing but different in certain key ways that might actually make it better. The latter use case seems to be what is going on here in Halt and Catch Fire: they want to get into the PC business. They are originally a software company, and the storyline pretty closely follows one that actually took place in the early 80s involving Phoenix Technologies Limited. They were somewhat famous at the time for reverse engineering the IBM BIOS, which again is what is going on here in the show. They reverse engineered it using a clean room, or Chinese wall, approach. Here’s how it works: you have a team of engineers that study the BIOS first and describe everything it does as completely as possible without using or referencing any actual code; then you bring in a second team of engineers who have no prior knowledge of that item, here the IBM BIOS, and have never seen the code. They work only from the first team’s functional specifications. Then they can write a new BIOS that operates as specified. That’s what Phoenix did, that was legally okay, and it appears to be why the Clash-loving, kind of punk rock Cameron has a role in the show. She gets, I hope I am not spoiling this terribly for people who have not seen the show, so, a spoiler alert should have been issued a long time ago, but this is only the first episode, so there is much more to come. Anyway, she does circle back from her initial liaison with the former IBM salesperson and comes back into the show in the context of establishing this clean room approach, because she’s such a promising engineer.
So, that’s what’s going on here: as the army of lawyers from IBM are walking through there, they are scrutinizing whether, if a reverse engineering approach is being taken, it’s one that’s going to be legally up to snuff. So, Mark, do you have any thoughts on this at all? Have we at least intrigued you to check out the show, if you haven’t already?
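The clean-room workflow described above can be sketched in code terms. This is a toy illustration only; the spec, the `checksum` function, and the examples are all invented, and stand in for the far larger BIOS specification the show's engineers would produce:

```python
# Toy illustration of a clean-room ("Chinese wall") reimplementation.
# Team one studies the original product and writes ONLY a functional
# specification: observable behavior, no source code.
FUNCTIONAL_SPEC = {
    "name": "checksum",
    "description": "Sum all bytes of the input, modulo 256.",
    "examples": [(b"", 0), (b"\x01\x02", 3), (b"\xff\x01", 0)],
}

# Team two has never seen the original code and implements
# strictly from the specification above.
def checksum(data: bytes) -> int:
    total = 0
    for b in data:
        total = (total + b) % 256
    return total

# The spec's recorded examples double as acceptance tests showing the
# independent implementation matches the documented behavior.
for inp, expected in FUNCTIONAL_SPEC["examples"]:
    assert checksum(inp) == expected
```

The point of the separation is evidentiary: if a dispute arises, the spec documents everything team two ever saw, so the new code can be shown to be an independent expression of the same functionality rather than a copy.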

Mark: Well, I’m pretty sure I will check out the show, if nothing else to be amused by the 80s all over again.

Denise: Yes

Mark: But my experience with reverse engineering tended to be more in the security space, where it’s probably a little bit of a dirty word, because a lot of software publishers are not fond of security researchers reverse engineering their technology to find vulnerabilities and flaws, leading into questions about responsible disclosure of vulnerabilities. And, you know, we have seen examples of where that can work effectively, like with the Heartbleed vulnerability that was disclosed a little while ago. But there are certainly areas where researchers have, I think, sort of rushed to take the credit for finding something new and special, and bad (laughter), and spread it around the Internet as fast as possible. Which, you know, is not always a positive development. So, I think it will be interesting to sort of see it in a more contained, might be a good way to put it, environment, where it is really just about the corporate competition, and if nothing else see if AMC has replaced Mad Men with, I don’t know, disgruntled men, or slightly peeved men (laughter)

Evan: (laughter)

Denise: (laughter) Yeah, they’re definitely slightly peeved. The interesting thing about reverse engineering in the security context and the research that you’re describing, Mark, is that’s where the DMCA exemption really applies, doesn’t it?

Mark: When you do it appropriately, it does, but there are still a lot of challenges to overcome. Certainly software publishers try their best to implement as many contractual safeguards as possible, and access control mechanisms, to take advantage of the restrictions in the DMCA; but it’s, yeah, pretty much.

Denise: Yeah. So, Meg, this isn’t Battlestar Galactica. We were commenting before the show on your great poster behind you, from the Battlestar Galactica universe. So, it may not be quite that compelling a piece of television. It’s not really science fiction; it’s more computer history with some drama thrown in. But do you have any thoughts on the legal aspects of the show?

Meg: No, but I’m going to watch it. I actually really love the history of technology, and the fact that there is a legal piece in this is really interesting. I haven’t seen it yet, but I do think, at least what you’re describing sounds like a really different culture of innovation than the one that I am familiar with. The idea that you would reverse engineer to create something is not an idea that I think resonates with the innovators around me, the makers. It’s more based on functionality, and, I think, the talent and the tools are so much more democratized than they were in the 80s that maybe we’re dealing with something much different than the reverse engineering practices that were happening in the Silicon Prairie. Love that term.

Denise: Yeah. I think you raise a really good point that there has been a movement toward making things more open source and, at least in the engineering community, toward much freer sharing of ideas and techniques, and then whatever you are able to build off the ideas that are out there becomes subject to some company’s proprietary intellectual property. But I do think it was different in the 80s, and the Homebrew Computer Club maybe was an example of the former; but then you had people coming in and clamping down on the engineers and saying no, no, no, thou shalt not share our trade secrets. Evan, what do you think, are things different now on the IP and innovation front than they were then?

Evan: Well, I think they almost have to be because of the greater means that we have now to collaborate with one another. So, I’m glad you mentioned open source, because I think that has had to play into all of this on a sociological level as well. I mean, the thing that is the same across all of it is the basic human inquisitiveness: to hack things, to get in there, to tinker, and to figure out how things work. So, what we have now in these days is the added ability not only to figure out how something works, to go in and reverse engineer it, to tinker with it, disassemble it, decompile it and do whatever you will, but also to share that information, so that the benefits of having done that reverse engineering are magnified and increased exponentially. And so you see so much more innovation that way. And of course with the rising level of sophistication of things, the nature of innovation changes quite a bit. If you read in this month’s issue of Wired Magazine the story of the Oculus Rift: the 21-year-old guy, I forget his name now, IRC can help us out here, who is credited with inventing the current iteration of that in his parents’ garage as a teenager. You know, the process of innovation wasn’t in going in and finding out how transistors work or what have you; it was in compiling elements, very complex elements that already exist; a particular mobile display would come on the market and would be integrated into it as well. So, as the technology becomes much more complex, much more is integrated into it, and the kind of innovation can change as well. Because you are operating at a much higher chunked level, to use a sort of Hofstadter-ish term about it, you’ve got so much technology, millions of person-hours, invested in a small device, and those kinds of things being put together changes the very character of innovation as well.
It’s sort of, almost in a certain sense, the opposite of reverse engineering, because you’re not disassembling things, you’re actually putting them together in new and interesting ways. Of course, that’s nothing new either, or else we wouldn’t have the automobile, if that weren’t the fashion of innovation at some point.

Denise: Yeah, good points, Evan. So, we just thought we’d give you our take on that, since it’s new on the scene and sort of obtuse in the way it presents the legal issues in the show, and also not very charitable to the lawyers, probably with good reason, as they are bringing in the new engineer and putting her through the various questions that will establish her, whether it’s behind a Chinese wall or subject to a clean room approach, however you think about it. The lawyers sit there, prompting her the whole time as to how she should answer.

Evan: Lawyers always make good antagonists, so we can handle it. When in doubt, just blame the lawyer.

Denise: Right, so I can see why this New York Times columnist took sort of a cynical approach toward the show and the “innovation” they are involved in. I think the point of the show is that they aren’t supposed to be involved in a highly innovative kind of activity; the male protagonist works the full 80s power suit, and looks a lot like the people I remember from back then whose main goal in life was to level up economically as soon and as much as they could. (Laughter) So, I think that’s what we’re going to see unfolding here. But it is pretty interesting to watch, and they are sort of this ragtag group of renegades surrounded by equipment that looks a lot like the things you see lying around the TWiT Brickhouse studio and down in the basement there. So, it’s always fun to look at and think about, and to see people using the stuff we all started out with, if we’re old enough to remember the older technologies. Let’s move on to some privacy stories.

(Privacy statement: music playing)

Denise: Before we jump into these, I wanted to go ahead and put our first MCLE passphrase into the show, coming from Mark’s comments about things that are new and special and bad. Let’s make it “new and special, and bad”. The MCLE passphrases that we drop into the show are for people who are listening to this show for continuing legal education credit or other professional education credit. That’s something we encourage you to do, and we have put together some information for you on how you might do that if you are a lawyer. That’s over at the TWiT wiki; find This Week in Law over there, and there’s a whole page devoted to watching the show and getting MCLE credit for it. But, lest we forget, let’s follow up on the right to be forgotten, which at the end of May got codified in Europe. I don’t know if codified is exactly the term when a court does something, but the EU’s highest court has recognized the right to be forgotten and is imposing it on Google. I’m not sure if other search engines are swept up into this; if they are not already, they are likely to be, and other entities as well. Meg has written about this in some detail on her blog. Meg, can you bring us up to speed as to what has unfolded since the directive went into place?

Meg: (silence with lips moving) …place in 1995.

Denise: I’m talking about, well, yes. Why don’t we start there and then bring us up through this current court ruling in the EU.

Meg: Okay, the 1995 directive in the EU was a data protection directive that required all member countries to put certain protections in place within their own national legal cultures. And they’ve done that over the last decade, almost two decades now. Included in that are a number of articles related to a user’s, a European citizen’s, rights in relationship to their data. And that is the piece at issue in this case, which started way down low in the Spanish data protection agency. An individual, well, a number of individuals, 200 individuals, went to the Spanish data protection agency around 2010 and brought these issues, related to content about them that was available online, claiming they had the right to be forgotten in relation to that data. The data protection agency agreed with them and ordered Google to remove links from its index for these 200 incidents, and Google challenged that order. Then it got kicked up through the Spanish court system, in a way that’s at least strange for Americans, and eventually it was punted by the Supreme Court of Spain to the highest court in Europe. So the European Union’s Court of Justice finally heard this case, and handed down this very surprising decision, a very surprising interpretation of the ’95 directive, that did establish a right to be forgotten through two different articles that are in the directive.

(Blog page: Pla(Y)giarizing for Educational Purposes: Untying the Forget-Me-Knot of the Web: EU Right to be Forgotten Case: The Honorable Google Handed Both Burden and Boon)

Denise: Can we look for a second at what, technically, Google is going to have to do here? I know they put up a new form that people can fill out if they want links that come up when searching on their name disassociated from their names.

(Webpage: Google site. Information listed: Search removal request under European data protection law. You will need a copy of a valid form of photo ID to complete the form. Fields marked with an asterisk must be completed for your form to be submitted. We are working to finalize our implementation of removal requests under European data protection law as soon as possible. In the meantime, please fill out the form below and we will notify you when we start processing your request. We appreciate your patience.)

Denise: But my understanding of how this was going to work: I hear people say, and I’ve heard you just say, Meg, that links were going to be removed from the database. But I think when we originally looked at this and talked about it, what was going on here was a little bit more nuanced than that. It’s actually that the information will still remain in the Google database; it just won’t be associated with certain individuals’ names. Am I accurate on that?

Meg: So, the information remains on the Internet, and that’s an important piece, I guess, of the decision. The Spanish Data Protection Agency did go to some of the original sources, newspapers in a lot of the instances, and request that they remove content, but then decided these news sources actually had a right to preserve that information; they had a right to hold it on their servers, and it can remain accessible. But Google did not have the same interests as these papers. So, the disconnect, you’re right: Google is asked to disconnect content and an individual.

Denise: Right, but say you’re just searching for that particular content, subject-matter-wise, not associated with the individual. It’s still going to come up in your search results if you entered the right terms, correct?

Meg: That should be the case, I think, but technically it can be really challenging, depending on how you search and how Google disconnects the individual from the information.
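The disconnect Meg describes can be pictured with a toy search index. Everything here is invented for illustration (the names, URLs, and the shape of the index): delisting removes a link only for queries on a person's name, while the same page stays reachable through subject-matter queries.

```python
# Toy sketch of name-based delisting. A URL is disassociated from
# queries on a person's name only; the underlying page remains in
# the index for other search terms. All data is hypothetical.
index = {
    "jane doe": ["example.com/1998-notice", "example.com/bio"],
    "1998 auction notice": ["example.com/1998-notice"],
}

def delist(name, url):
    """Remove one URL from the results for a name query only."""
    results = index.get(name.lower(), [])
    if url in results:
        results.remove(url)

delist("Jane Doe", "example.com/1998-notice")

# The link no longer appears for searches on the name...
assert index["jane doe"] == ["example.com/bio"]
# ...but the page is still reachable via a subject-matter query.
assert "example.com/1998-notice" in index["1998 auction notice"]
```

This also shows why Meg calls it technically challenging: a real engine would have to decide which of the countless query variants ("Jane Doe Spain", misspellings, name plus keywords) count as searches on the name.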

Denise: Okay, the one thing that we have now that we didn’t when the court’s decision was handed down is Google’s reaction and response. And as Google needs to do here, they are acting quickly and saying, look, we have to respond, and this form may change over time, but here’s what we’ve got right now. They have a form that people can fill out and put in links that they think

(Webpage: Google search removal request form, displaying blank fields to be completed under European data protection law)

Denise: they want disassociated from their name, and Google is giving them some guidelines, saying, “In implementing this decision we will assess each individual request and attempt to balance the privacy rights of the individual with the public’s right to know and distribute information. When evaluating your request, we will look at whether results include outdated information about you, as well as whether there is a public interest in the information, for example, information about financial scams, professional malpractice, criminal convictions or public conduct of government officials.” So Google seems to be saying here, yes indeed, we will process your request, but don’t think that just because you’re filling out this form, everything that you’re requesting be disassociated with your name is going to go away. So, do you think that they have walked the line here that they need to walk, Meg?

Meg: I mean, this case actually made me say “poor Google”. I’ve never said that before. The form: they easily could have automated this and just said, you know what, washed their hands of all of the European Internet stuff, and taken away every link that was requested. They could have automated it, but instead they are obviously having to bump up their compliance staff and go through these; they had 41,000 takedown requests in the first four days. And I think that the problem is not whether they are trying to walk the line or whether they’re doing it well. I think the problem is that they have been asked to do it at all. Google is making up the right to be forgotten; it’s guessing what the right to be forgotten is for each of these countries. All of those things that you just listed off, and the term public interest, are treated differently by all of these different countries. And so, procedurally and technically speaking, it’s a poor Google situation.

Denise: (laughter) Yeah, that’s an excellent point. They could have just kind of punted on this and decided, okay, well, you know, we’re doing what the court told us to do, and in the way that makes the most sense to our shareholders, who need us to act in economically reasonable ways, and we’re just going to pull everything down that gets requested. And they’re not doing that, apparently; that’s what their form would lead you to believe. There’s an article in the Independent that said that as soon as this was available, they started to get, you said it was up to over 40,000 requests; they were coming in at a clip of about 20 per minute. And the majority of the requests were believed to have come from people in the UK who were looking to take advantage of it being easier to remove data. Didn’t you tweet something too, Meg, about most of the people that were trying to expunge various things relating to their criminal records?

Meg: Yeah, I tweeted some article that said, I think, half of the requests that had come from the UK were from criminals. Actually, the right to be forgotten in the European countries that have analog versions was for rehabilitating criminals who had served their time. So that’s actually very shocking to hear as Americans,


Meg: the idea, but that is, the roots of the right to be forgotten are established in criminal rehabilitation in Europe. So it is not quite as strange as it sounds, but it does raise really big public interest concerns, because not all criminals in any European country are granted the right to be forgotten. It goes through the court system; a judge decides the public interest, the public’s right to know versus the criminal rehabilitation issues that the plaintiff is bringing, and here Google, I guess, is trying to figure out what those are now.

Denise: So Mark, what do you make of all of this? This seems to be quite an extension, as Meg has said, of personal privacy law beyond what we are used to. Does it really change the way the web is going to operate in the EU, do you think?

Mark: Well, I think a lot of it is going to depend on how other search engines and web portals respond to this case. If Google’s practice becomes a common practice, then I think it could have a pretty significant impact, certainly for costs for operating in the EU, but also for, you know, how information is maintained, what information is available in the EU, and to some degree whether or not other companies are going to create, I guess, sort of the reverse great firewall, whereby that information may not be available in the EU but may be available outside the EU. And if that is even an acceptable approach, given that it’s not all that difficult, for example, for someone living in the EU to access a site or a service that is theoretically only available to citizens in the United States. And, you know, is an international or multinational business then compelled to actually adopt the EU standard around the world, just to avoid EU residents circumventing any kind of geographic limitations? Does the right to be forgotten suddenly become a US right, simply because it’s too hard to maintain different mechanisms in the United States and in Europe?
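Mark's "reverse great firewall" idea can be sketched as one index serving region-dependent results. This is purely hypothetical: the region flag, the delist table, and the data are invented to illustrate the design, not how any real search engine works.

```python
# Hypothetical sketch of region-dependent delisting: the same index
# returns different results depending on where the request comes
# from. Query/URL pairs and the region flag are invented.
DELISTED_IN_EU = {("jane doe", "example.com/old-story")}

def search(query, urls, region):
    # Filter delisted (query, url) pairs only for EU requests;
    # requests from anywhere else see the full result list.
    if region == "EU":
        return [u for u in urls
                if (query.lower(), u) not in DELISTED_IN_EU]
    return list(urls)

urls = ["example.com/old-story", "example.com/bio"]
assert search("Jane Doe", urls, "EU") == ["example.com/bio"]
assert search("Jane Doe", urls, "US") == urls
```

The sketch also makes Mark's objection concrete: the filter depends entirely on the `region` label, so anyone in the EU who can make their request appear non-EU (a proxy or VPN, for instance) sees the unfiltered results.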

Denise: What do you think of that Meg?

Meg: Well, I’m not concerned. I do worry about cultural privacy rights blurring; I think it’s really important that they stay situated within their national boundaries. I respect the European version of privacy, and I hear a lot of Americans voice very similar views, maybe not realizing that they are quite European views. But as far as how Google is handling this right now, I’m not worried about the drift into the US unless it becomes a really popular idea, Americans really want it, and the right becomes established through some statute. And I say that because Google already requires that you verify that you’re a citizen with a license of some kind, an ID of some kind. So at least right now it’s not really set up so that you can just click through and say, I’m British, delete this.

Denise: Evan, one of the things I worry about with this is that, at least here in the US, and I think elsewhere, people more and more look to the Internet as their, they sort of offshore their knowledge there; they don't have to remember things because the web will remember them. You can see people in bars settling disputes by looking up something in Wikipedia on their iPhone. And I just worry that the next time I can't remember that it was NSYNC and not Boyz II Men that Justin Timberlake was in after he was part of the Mickey Mouse Club, he may well decide that's a bit of his past he'd prefer not be readily accessible, and then I won't be able to remember that anymore. Do you have similar concerns?

Evan: Well, not with that, because if I forget everything about Justin Timberlake I ever learned, I'm not going to be any worse for the wear. You do touch on an interesting issue, which really to me seems like, we can't get away from the big huge problem with the right to be forgotten when it comes to us here in the United States. And that's the thing called the First Amendment. The free speech interest, the free speech right that we all have, and that goes both ways: the right to say certain things and the right to actually get access to certain information as well. Then, of course, you've got the freedom of the press built in there as well, and freedom of association and all the other flavors and variations of rights that the First Amendment gives to us. So, I'm not sure there's much meaningful discussion for us to really have here when it comes to the right to be forgotten, inasmuch as it will ever apply to us, because I just don't see the contours of the First Amendment changing in this regard. If you look at what the Court of Justice's opinion says about where it draws the line on what content can be taken down, it talks about those search results that are inadequate, irrelevant or no longer relevant. That could be a pretty broad scope. In fact, you could argue that 99% of the World Wide Web is inadequate, irrelevant and no longer relevant. So, we will leave that point aside. But in the United States, for content to be deemed unlawful, for it to ... actually, I don't want to get ahead of myself. In the United States, for a court to say, or for a statute to provide, that content cannot be uttered or published or spoken, it has to fit a much narrower criterion than being inadequate, irrelevant or no longer relevant. It has to be unlawful. Like it has to be fighting words or threatening language, or, what's the concept, it's such a cliché, but it's yelling fire in a crowded theater. Or defamatory.
Or pornographic in certain senses, like child pornography. So, there is a very, very, very minuscule portion of content that can be prohibited under the law of the United States as it is. And I don't think it's very likely that the First Amendment is going to be repealed. So, perhaps I'm oversimplifying, but to circle it back to your question of whether Justin Timberlake was part of NSYNC or the Backstreet Boys: I have that same conundrum. I don't think we have to worry about anything here in the US, but I'd love for somebody to point out how I may be oversimplifying things to the detriment of our intelligent conversation.

Denise: No, who would ever do that? Well, just because I can't let Justin Timberlake and NSYNC go at all, we need to keep on that. I'm looking at the Google form, and this is prompting me to think about someone who is really well-known for whatever it is they did that they now no longer want associated with their name. It looks as though, I see 'add an additional URL' after the one URL field on the form. I'm wondering how many times you could click that and get another box and add another link. Do you think, Meg, that Google has some ... although they are definitely bending over backwards to allow people to submit links here.

(Webpage: Google site, information listed: Links associated with your name that you want removed).

Denise: Do you think that there is some top limit on what they are going to allow you to request to have removed?

Meg: That's actually genius. What they should do is have it so that once you add 10 URLs, it automatically rejects you, because with 10 URLs about you, there's enough public interest in what you have created. We can automate some piece of this somehow, I'm sure. But in regard to your being concerned about losing track of Justin Timberlake, which I agree is a problem, I recently heard that if you're under 50 and you say, I don't know, what you really mean is, I don't care.

(Wikipedia entry: Justin Timberlake entry)

Meg: You actually can look it up so quickly that you don't actually know anything more; what you really mean is, I don't care about the answer enough to look it up in three seconds. But in regard to what Evan mentioned, I just wanted to add another American hurdle for the way that Europe is handling this: Section 230 of the CDA also prevents a lot of right-to-be-forgotten action from being taken in the US. So that's an additional thing that would have to be repealed in order to make the right to be forgotten functional in the same way that Europe is trying to make it functional.

Evan: Yeah.

Denise: Well, I've been clicking on 'add additional URL' since I originally brought it up, and it's not cutting me off; I probably have 50 boxes up there now. So maybe they will, if you're persistent enough to enter that many things. Maybe they don't have a top limit; it would be interesting to have someone write a script and see if they can figure out whether there is a top limit or not. All right, I think it's about time for a snack. I'm certainly starting to get a little hungry, and if you are too, I definitely encourage you to grab your Nature Box.
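Denise's idea of scripting the hunt for a hidden cap could be sketched as a binary search over an accept/reject probe. Everything here is hypothetical: `accepts` stands in for whatever check would report if a form with n URL fields goes through (the real form's internals aren't public), and the demo simply simulates a cap of 50.

```python
def find_field_cap(accepts, hi=1 << 20):
    """Binary-search the largest n for which accepts(n) is True.

    `accepts` is a caller-supplied probe (hypothetical here) that returns
    True if a form with n URL fields is accepted. Returns None if even one
    field is rejected, or `hi` if no cap is detected within the probe range.
    """
    if not accepts(1):
        return None
    if accepts(hi):
        return hi  # no cap detected within the probe range
    lo = 1  # invariant: accepts(lo) is True, accepts(hi) is False
    while hi - lo > 1:
        mid = (lo + hi) // 2
        if accepts(mid):
            lo = mid
        else:
            hi = mid
    return lo

# Simulate a form that silently caps out at 50 URL fields:
cap = find_field_cap(lambda n: n <= 50)
```

Binary search keeps the number of probes logarithmic, which matters if each probe were a real form submission.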

(Advertisement: Nature Box: Awesome snacks delivered. 50% off your first box. Use code twit: Nature Box)

Denise: and bring it in to the show, because snacking along with us is always a great way to enjoy TWiL. And the reason you want to be using Nature Box: of course, I've talked before about how we all work ourselves to death past when we should actually get something good to eat, and, you know, we are just prone to doing that, and a good reason to have Nature Box is to have healthy snacks on hand for yourself,

(Webpage: Nature Box: deliciously fun, naturally easy snacks delivered to your door, over 100 ways to snack happy; awesome snacks delivered, 50% off your first box, use code twit, Nature Box)

Denise: but the other thing I find myself doing all the time is just doing and doing and doing everything I need for everyone else in my life, whether it's my clients or my family, and not having enough good things on hand for me. And because I want to do nice things for my clients and family, for them too, you want to have the healthiest kinds of snack foods around possible. So, when you're wanting to give stuff to your family and friends, you want them to be eating right, and Nature Box is your resource for it. Head on over to that URL, click continue, and you will get three subscription options; then you can place your order. Once you're a member, you can select what snacks you want in your monthly box, and they have a huge variety. You can also fine-tune this to whatever dietary needs you or your family or folks who come by your office or home need: it can be vegan, soy free, gluten conscious, lactose free, nut free and non-GMO. You can also select by taste: savory, sweet or spicy. And then Nature Box sends you those great-tasting snacks right to your door, with free shipping anywhere in the United States. So that's how you are going to have these healthy snacks on hand. Healthy and satisfying snacks like banana bread granola, peppery pistachios and over 100 more. Always zero trans fat, zero high fructose corn syrup and nothing artificial. And those of you who are parents know how important that is, making sure you are putting good stuff into your kids' bodies. I think we probably sometimes give our pets healthier foods than we do our kids. And that's not a good thing; you need to be paying attention to keeping those artificial ingredients out of the things that are lying around your family pantry. So, with Nature Box, you can do that. It's the snack-happy gift that keeps on giving. You can get a three, six or 12 month subscription for that special someone, family or friend. And don't forget that swimsuit weather is almost here.
Evan, this would be the time that you get up and model your Speedo for us. It is time to snack smarter. Forget the vending machine and get in shape with healthy, delicious treats like South Pacific plantains. So, what you want to do is get 50% off your first order, and to do that you go to, that's all you have to do. 50% off your first box; stay full and stay strong. Go to nature. Thank you so much, Nature Box, for your support of This Week in Law. All right, now that we have our snack, I think it's time to take a quick vacation. And what better way to do that than with Airbnb. I threw Airbnb in here because I think there are all kinds of legal discussions we could have about the company. On the one front, it's causing all kinds of legal waves because of the way it enables people to rent out properties that technically may be zoned or otherwise regulated against being rented out, for example in New York. It's estimated ... this is from a Katie Couric interview with Airbnb's founder Brian Chesky over at Yahoo. This is her new gig at Yahoo, and actually she did a nice job; I look forward to more of her coverage.

(Webpage: Katie Couric World 3.0: AIRBNB CEO Brian Chesky)

Denise: She says she's scratching her own itch, answering the what-if questions she is interested in, and she is going to be paying attention to a lot of folks in the tech sector. So I think it will be interesting to see; she is doing some in-depth interviews over there. And Brian Chesky is her subject here, and in her coverage she says it's estimated that many of these listings in New York City may be illegal, breaking the law that prohibits rentals of fewer than 30 days; I think a lot of areas have that prohibition on rentals. So, there's that legal aspect, but then there's the whole privacy consideration, on a couple of different fronts. First, in this particular New York scenario, New York State Attorney General Eric Schneiderman wants to go after these people who are renting on Airbnb in violation of that law, and guess what, Airbnb did fork over their data to the Attorney General so they could be sued. So, it's not representing that it's going to keep your information private from law enforcement if, by using this service in your area, you might be breaking the law. So there's that aspect. And there's also the fuzzier aspect of, you know, someone coming to your home, if that's what you're putting out on Airbnb, whom you haven't vetted; you're not a hotel chain. It's a very new way, I think, for people to interact with one another that may well raise some concerns as time goes on. Lord knows Airbnb had its share of road bumps when they first started out. So, I just wanted to toss this out for our panel. Airbnb: what do you make of the various and sundry legal ramifications, Evan?

Evan: Well, you've listed them out well, and to the extent that there are legal issues, the way I like to think of them is really more on the normative level. What is it that's going on culturally, as a society, that is changing, that makes these sorts of things acceptable? When I think of this, the way that sensibilities are changing here, the greater willingness of people to engage in transactions with strangers like this.

(Webpage: business page: How AIRBNB and Lyft Finally Got Americans to Trust Each Other, by Jason Tanz)

Evan: I can't help but think, do you think back to 2003? Do you remember how people were freaking out about MySpace? And this sounds so quaint now, but: oh my gosh, there's this website, you can go on there, put in information about yourself, other people can see it, and sometimes people are meeting up in real life. And guess what, they are getting assaulted. So, there's that; it's the perfect environment, the meteorological conditions are perfect, for some real moral panic to go on here. And some fear mongering, and a bunch of stuff to be said about, oh my gosh, this is awful, why would anybody do this, because you're going to end up dead or worse, or something; but I don't know if there's anything worse than being dead from using it. You hear what I'm saying. So, I look at it as really interesting from that perspective, from where I am coming from. I grew up in a pre-social-media world. And so, you know, on the whole question about MySpace, I was well past the demographic age for the class of people for whom safety was a concern when MySpace came out, and all of that moral panic that went on with that. I'm sort of in the older crowd, and my starting point is to be very resistant and very untrusting of the people I would be interacting with online. I think it is fascinating to see how the availability of a vast amount of information, and here's where I'm trying to work some legal issues into this, how the availability of information about someone can make the service indeed safer, not only seem safer but indeed be safer, because you're not going to decide to engage in a transaction with someone if there is bad information out there about them. So, clearly, then we see issues that have to do with individual responsibility and the information about the individuals. So there are some data security and privacy issues with that.
Like one interesting way of characterizing this, or an issue to raise here, to bring the right to be forgotten into the present conversation: if there were a right to be forgotten, these sorts of trust-sharing services would be less effective, because, hey, I was arrested and incarcerated for aggravated assault 10 years ago, but now that that thing is completely removed, let me come pick you up in my car; no, my van, my white van without any windows, and we will go for a ride. How does that sound?

Denise: Yeah, no, I think this is a really good juxtaposition with our right to be forgotten discussion, because vacation rental sites are probably even more utilized in Europe than they are in the US. You know, people from all over the world travel through to see the great antiquities and museums and the wonderful sights in Europe, and they are more and more turning to, how can I rent a room in someone's home, or someone's whole home? And, by turns, the people in the EU are, I think, probably more than happy to have a little extra income coming in through the availability of these sites, but they are going to have less of an ability to background-check someone with a bad record. Meg, what do you think about this?

Meg: Yeah, maybe you should have to choose: you could either have the right to be forgotten or you could have these services, but you can't have both. Maybe they are just contrary to each other. I'm trying to find this great story that I just read; I'll post it in the IRC chat.

(Webpage: Americans by geographic region who said "most people can be trusted," 1972 to 2012; United States map with rotating demographics)

Meg: Somebody just did a paper that I thought was so fascinating about the decreasing levels of trust by region in the US. And red on the map, when the map is red, there is no trust, and rarely is the whole country red, but recently it's become really red. And I've been having lots of conversations about whether this has to do with the Internet distancing us from each other, and that creating a lack of trust. But it is interesting how these kinds of services have, like, millions of users. I have totally looked at Airbnb to stay in someone's place, and people who love me have not let me do that. So, I am intrigued by these ideas, but there is something unsettled right now about trust, and I think we are going through some kind of a trust displacement with technology; I'm not sure, but that makes these services seem strange. That being said, Craigslist has been around forever, and we've had a few Craigslist murders, but, you know, we have lots of murders related to other things. So I'm not sure how the numbers actually play out; I'm not sure if these services are that much shadier than some of the other ways that people get murdered by strangers. I don't know.

Evan: I think most Craigslist users do not go in and get murdered. So we can go with that.

Denise: Most, most, yeah. I think "may you live in interesting times" is the famous quote; we are living in very interesting times. And in my own life I see the full spectrum: my son has made a great friend by playing on Xbox Live, and this kid lives thousands of miles away in Alaska, and I am certain it is a kid and not a creepy person stalking him. And come on, you know, it's the modern-day equivalent of a pen pal, and I'm completely 100% behind this. And I think this is great,


Denise: and the serendipity of that happening is so delightful. And just yesterday a good friend of mine sent me a text on my phone, and I responded to it. And then hours later I did a double take and went, wait a minute, there was something just off there, like there was something in her phrasing that didn't seem right. And it prompted me to look up, can someone spoof your address book on your phone and start sending you texts masquerading as your best friend in the world? And it turns out that, as of 2012, that was possible in iOS, so hopefully they have plugged that hole. So, you know, all of a sudden I had this huge chill down my spine and had to call my friend to see who I was texting with. It was apropos; I am living in both of those worlds, and it feels very strange.

Evan: Well, you know, there's going to be two different worlds for as long as humans exist because there are always going to be unscrupulous people. And they're going to exploit it because just as much as we see stories — well, no, not as much. This doesn't get nearly as much coverage, but we do talk about it from time to time. Airbnb, Lyft, Uber all that — we talk about that. It gets a lot of press, it gets a lot of money; there's a lot of venture capital going into it. But how many times have we talked about catfishing instances — incidents as well? Sounds like there's the same portion of the brain — the nefarious portion of the brain — that person was spoofing your friend's phone number that you got the text yesterday. So you've just got to be careful. I mean, check twice before you send a selfie is probably the real moral of the story.

Denise: Yeah, exactly. I think we have a good second MCLE pass phrase from this discussion, and that's "trust displacement."

Evan: That is great.

Denise: So — yeah, that is a good phrase. Again, the reason we put these in is in case you need to demonstrate that you watched or listened to the show. You could say, "Hey, I know the secret phrases." So this is our second one: "Trust displacement." And then — see, I have to write them down so that, if someone calls me to check, I can say, "Yep, that was it." So I just wrote it down. (Laughs) Mark, what do you think about all this?

Mark: Well, I guess what I — what I find interesting and what I'll — I think will be an element to watch is to see how much of sort of a trust infrastructure these services have to provide and continue to provide going forward. I mean, I think there's a lot of conversation about, are people — or a certain segment of people today — more trusting than they were in the past; or has the nature of trust changed? But I also — I think we should not dismiss the fact that companies like Airbnb had to invest a lot of money into creating infrastructure to increase trust, and much of that infrastructure is what raises the privacy and security concerns. But that — I think it'll be interesting to see, over the long term, whether or not social patterns have changed so much that people are more willing to trust, or whether or not this is sort of an innovation in I guess what could be called third-party trust mechanisms; right? So we can rely on a company like Airbnb to provide insurance, to protect against the likelihood that I get the one-in-10,000 person that's going to trash my place, as opposed to a completely open system where it really is up to me to exercise a level of trust that I don't possess, frankly. (Laughs) To sort of openly offer my home or my car or anything else I own to strangers without some kind of vetting.

Denise: Right, and —

Mark: So I think that'll be interesting, to see where we draw those lines.

Denise: I haven't done a comprehensive survey of the available 'get a ride, get a place to stay' services out there, but I would expect that we've got the whole panoply going on right now; that there are, Airbnb being an example, places that do provide insurance, and others out there that take much more of a "you're on your own" kind of approach, letting the market sort of suss out what people are comfortable with. Meg, do you have any more insights on that spectrum of availability out there?

Meg: I think that you're right; anything you want, I'm sure there's a service for it.

Denise: (Laughs)

Meg: The only thing I'll add to this is that Bruce Schneier wrote a really, really great book about this called Liars and Outliers: Enabling Trust that Society Needs to Thrive; and it really hits on some of these points about how relying on technology to fill in our trust gaps can be detrimental to society. So if I wanted to go use Airbnb's services, maybe I'd take the step to, like, geo-microchip myself so that I'll be safe.

Denise: (Laughs)

Meg: Or, you know, I have my drone, my taser drone follow me. Like, there are all of these ways you can —

Evan: (Laughs) That's awesome.

Meg: — establish trust through technology, but how that's maybe not the — maybe that's not good for humanity. And that's why I really love the book, and I think that it touches on these topics.

Denise: Awesome. Yeah, we had him on the show not too long ago; and I have not yet read that particular of his books, but that's on my list now.

Shall we — oh, there was one other question I wanted to just hone in on a bit more about this, and I'll ask Mark. What do you think about this whole notion? I guess people have to be really, really careful when they're using these services if they're planning to rent out in an area where it might not be legal for them to do so; to read those terms of service and know whether they're going to get forked over to the attorney general; right?

Mark: Oh, absolutely. I mean, I think — well, to be entirely honest with you, probably the safest rule of thumb is to assume that most businesses are not going to fight a government agency for you. (Laughs) So you should definitely ordinarily assume that information about what you are doing with a private corporation can be discoverable or otherwise accessed by government regulators. So you certainly — I guess, speaking of trust ... (Laughs) You perhaps should not trust third parties to keep anything secret that may run afoul of the law if it were — if — particularly if it would require them to fight your battles for you.

Denise: Yeah.

Mark: And yeah.

Denise: Okay.

Evan: It would also seem relevant to discuss — you know, we were talking about the trust of the platform. To look at the flipside of that and look at any obligations of the platform or liability of the platform if something went wrong here. Like, if I get sued — or, you know, if the — what would it be? The attorney general, yeah — coming after me for having done this, to somehow pin that responsibility on the platform. This is getting kind of tenuous, but I can't stop thinking about MySpace. (Laughs) Because there's that famous Fifth Circuit case where the girl got assaulted, and the mom tried to sue MySpace for that. And this brings in Section 230 once again, trying to pin liability on the intermediary, on the platform for the content that the users of the platform put up there. Section 230 would appear to take the platform out of the scope, out of the zone of liability for things that the users choose to do. So just bears mentioning Section 230 again in this context. We were talking about it earlier in the show. It plays a role in many different areas of the law of the Internet.

Denise: Meg, getting back to what you were saying about — you've thought about using something like Airbnb, but those around you who love you have encouraged you not to do it.

Meg: (Laughs)

Denise: I'd like to poke a little bit harder at that. And it seems like Airbnb has managed to get past the trust and safety issue for most people. That — they have reviews, they have — they require a lot of background information about people who are going to rent on the site. They have a way for you to provide — to verify your identity for your account, whether you're a renter or rentee. And so I'm just wondering what you think, if there are Yelp-like reviews of a particular property saying, "Oh, this host was so gracious and wonderful," and there are lots and lots of them. Do you think that would allay your family's and other loved ones' concerns?

Meg: (Laughs) No. It only helped my own concerns. Like, you read these reviews that talk about people's cooking.

Denise: (Laughs)

Meg: So this woman makes, like, amazing pancakes in the morning —

Denise: The amazing pancakes!

Meg: — and it's $30 to stay. (Laughs)

Denise: Yes, amazing pancakes used to be a really popular thing on Airbnb.

Meg: Yeah. So —

Denise: For people to provide and then write about. (Laughs)

Meg: Yeah. But no, it doesn't make anyone else feel better. That's just how I would choose my own hosts, based on the reviews and cooking. And I probably wouldn't stay at — in a space with a man. I — you know. All of the ones that I've ever considered are women in the home, at least, as far as I know. So that's, I guess, one way that I've sorted through — again, I've never actually done ... (Laughs) No one's ever let me actually do this, so ...

Denise: (Laughs)

Meg: I can't attest to going through the process, but yeah.

Denise: Well, I'll be our canary in the coal mine for the show, unless — Evan and Mark, have you rented on this or similar services yet?

Evan: No, I have not.

Denise: No.

Evan: I have not. I'm much — much —

Denise: I'm doing so later this summer. I've already booked it, went through my own vetting of — but we're not staying in a room in someone's home; we're renting the home. So hopefully — I don't know. That's another thing to consider there. Of course, the people who own the home will have the key. (Laughs) And can come in anytime. What were you going to say, Evan?

Evan: I'm much too dainty to fend off if anything happened, so I'm not going to —

Mark: (Laughs)

Denise: Well, you guys have seen Carmael, our attack dog. I'm bringing her along. It's a pet-friendly place, so we have — it's not quite a taser drone, but ... (Laughs)

Evan: (Laughs)

Denise: She'll be there for us.

Mark: Yeah.

Denise: All right.

Evan: It used to be cans of Mace; now it's taser drones.

Mark: Yeah.

Denise: That's right.

Evan: The world is getting better.

Meg: (Laughs)

Mark: Well, you know, the dog, it's the oldest trust mechanism we have. (Laughs)

Denise: That's right. I don't think we have too much to say — our last privacy-related story is on TrueCrypt. I'm prefacing it saying we don't have much to say because nobody really knows what happened here. But I put it in as a just — sort of, well, this happened. (Laughs) Kind of a —

Mark: (Laughs)

Denise: — story that has legal overtones, because there seems to be consensus that, oh, yeah, TrueCrypt — if you're not familiar with the story, TrueCrypt was an encryption tool that would allow you to encrypt your whole drive, I believe, much like FileVault on the Mac. So I'm reading the coverage of TrueCrypt, going, well, people could just use Macs; but I'm not sure if that's apples and oranges or not. And I encourage you to watch Security Now 458, which I'm in the middle of; I haven't quite gotten to their TrueCrypt discussion yet, but I understand there is a good TrueCrypt discussion there which will get into the technicalities and may well tell you why TrueCrypt, versus FileVault, is more important and more secure, or what have you. And people seem to be quite unhappy that TrueCrypt, for mysterious reasons, has shuttered itself and apparently moved its servers to Switzerland in an effort, or at least an attempt, to keep the doors open; but with caveats that, if you're doing something illegal in another country, their being in Switzerland's not going to help you.

Mark: (Laughs)

Denise: So again, this happened; and I toss it out to our panel to discuss. Evan?

Evan: Well, yeah. I don't know that I have anything more to say about it than "this happened."

Denise: (Laughs)

Evan: There was a — did you mention it? — there was sort of like this weird disingenuous reasoning they gave that had to do with Windows XP — you know, since — I don't even quite get what exactly they were trying to say there. But there are better encryption abilities built in now into Windows 7, or something like that. See, it just, like, sort of went right past me. It's like, oh, there's just something weird here. I'm not going to pay a whole lot of attention to it. So yeah.

Denise: Right.

Evan: This happened.

Denise: So the suggestion here, I guess, is that the implication is that the NSA no likey things that help you encrypt, and the speculation is that there was some sort of government strong-arming of TrueCrypt to shut down. Nobody has any idea what the answer to that question is, and people have been asking us — Evan and I — on Twitter, "Do you think there are legal reasons behind this?" Well, yeah, maybe; but nobody knows.

Evan: (Laughs)

Denise: (Laughs) So any further insights, Meg?

Meg: I sadly have not followed this at all. (Laughs) I have nothing to provide; I'm sorry.

Denise: That's quite all right. What about you, Mark?

Evan: That's essentially what I said. If I just — you know —

Meg: (Laughs)

Mark: (Laughs) Yeah.

Denise: If I need to encrypt my drive, I'm on a Mac, so I feel pretty good about that. And now I expect the flurry of emails as to why I'm living in a fool's paradise. But, Mark?

Mark: So I guess I can say, as a long-time TrueCrypt customer, I'm disappointed. (Laughs)

Denise: That's —

Mark: But I think the big issue here, or the open question, will obviously be whether or not there really was any government strong-arming. Or not even so much government strong-arming, but just an acknowledgement that so long as they operated within the reach of the U.S. Federal Government, whether or not they would be compelled at some time or another to share either their sourcecode to facilitate attempts to find vulnerabilities in their software; or to otherwise make, even alter, their sourcecode to make it easier to crack encryption keys from their technology. I mean, I think it's important to note that even in the light of everything that's been disclosed through the Snowden disclosures and other sources, the NSA cannot crack well-implemented, AES encryption today. For the most part, the way they get around encryption is by either compromising the software itself to provide them with backdoors of some type or to compromise the software to make key generation weak so that they can essentially brute force it by essentially having their computers guess repeatedly until they get it right. And if you can — if the keys are sufficiently weak, that's not very hard to do. So I think it does sort of raise a question as to whether or not any service operating in the United States that offers encryption technology — I guess to get back to our watch word of our previous conversation, how much can you trust them? Because if TrueCrypt felt so pressured — and this is all very hypothetical. (Laughs) But if TrueCrypt felt so pressured to weaken their technology or play ball in some other way, then how much can you trust the players that are still here and selling their products. And I have not found an alternate encryption tool yet. I will say I don't know how sincere the explanation about the changes in Windows was intended to be, but there is a kernel of truth to it. 
The default file encryption options in Windows have improved significantly, but I suppose it still raises that question of trust, and how much pressure is there on software vendors operating in the United States to implement steps to make things a little bit easier for government surveillance? And so — yeah. That ain't good. Obviously —
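To make Mark's point about weakened key generation concrete, here is a back-of-the-envelope sketch. Everything in it is hypothetical — the entropy figure, the key-expansion scheme, and the attacker's guess rate are invented for illustration, and nothing here is known about TrueCrypt's actual code. It simply shows the brute-force arithmetic when a nominally 128-bit key is secretly derived from only 20 bits of real entropy:

```python
# Hypothetical sketch: a nominally 128-bit AES key that secretly carries
# only 20 bits of entropy -- the kind of weakening described above.
WEAK_ENTROPY_BITS = 20   # assumed for illustration
FULL_KEY_BITS = 128      # AES-128 key size

def weakened_key(seed: int) -> bytes:
    """Expand a small seed into a full-size key. The key *looks* like
    16 random bytes, but an attacker who knows the scheme only has to
    search the tiny seed space."""
    assert 0 <= seed < 2 ** WEAK_ENTROPY_BITS
    return seed.to_bytes(FULL_KEY_BITS // 8, "big")

weak_space = 2 ** WEAK_ENTROPY_BITS   # about a million candidates
full_space = 2 ** FULL_KEY_BITS       # about 3.4e38 candidates

GUESSES_PER_SECOND = 10 ** 12         # generously fast attacker, assumed
print(f"weak key:  {weak_space:,} guesses, "
      f"{weak_space / GUESSES_PER_SECOND:.6f} s to exhaust")
print(f"full key:  "
      f"{full_space / GUESSES_PER_SECOND / (3600 * 24 * 365.25):.2e} "
      f"years to exhaust")
```

The gap between the two numbers is the whole point: well-implemented AES is out of reach of exhaustive search, but a backdoored key generator turns the same cipher into a sub-second brute-force problem.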

Denise: And that's —

Mark: (Laughs) Oh, go ahead.

Denise: And that's what people are reading between the lines from the notice to users that TrueCrypt gave; correct? That you've been considering your data secure with our system; we can no longer give you that assurance, they said in as many words; correct?

Mark: Yeah. Now, that was — that looked like an attempt to, as diplomatically as possible, say, "We've shared something or something has been shared." Perhaps even without their consent.

Denise: Right.

Mark: "And we have no confidence that we can stop such vulnerabilities in the future, so you shouldn't either."

Denise: Right.

Mark: (Laughs) So —

Denise: All right. So let's move on to — Meg has been bopping around to various robotics and intelligence systems fora, including We Robot in Florida and, just recently, a Microsoft talk on — let's see — it's the Microsoft Innovation and Policy Center Robotics and Intelligence Systems — forum, whatever it was. Sounds like some interesting stuff. We've already heard about Meg's taser drones. (Laughs) So let's find out —

Meg: Yeah.

Denise: Can you give us your take, both from We Robot and the Microsoft event? It looks like one of the topics you've been investigating is parole boards, using algorithms and intelligence systems to evaluate prisoners.

Meg: So I have not investigated that as much as I want to. I think it's fascinating. I was looking at that in terms of big data and government's use of computational systems, and how we sort of legitimize these things that we're still trying to regulate, but still using at the same time. But the parole board, I think, use is really fascinating; and I think we're going to see more and more of it. So there's just a handful of parole boards that use them now. So I'm very intrigued; I would love to know how they work. I would love to reverse engineer those analytical tools [unintelligible]. (Laughs)

Denise: (Laughs)

Evan: Do you at least know what kinds of information they're looking at? Are they looking at information from before the time the inmate was incarcerated or behaviors while they're incarcerated or both, or, do you have any insight at all on that?

Meg: So my only insight — I have been trying to get more information on this. My only insight is that it is a big data analytics tool. So I'm sure that parole boards were using some — hopefully using some kind of systematic approach, right, where they would take the data of the individual and consider it and weigh it in, hopefully, a fair and consistent way. But now we're talking about big — so we're looking at how other inmates adhere to the law when they are released, which can have major social, racial, gender ramifications for differences in how inmates are released. I don't know exactly what type of data is being used from the individual themselves; but it is — you know, when we talk about a big data analytics tool, we're talking about massive predictions based on a massive amount of prior data. And a lot of times, that data's not very good, and it's based on releases that could have been unfair to begin with. So I think that there might be some really interesting problems. But one of the biggest things I always have to remind myself with robots and intelligence systems is you almost have to compare it to "better than what." So maybe the analytics are super scary and terrible, but are they better than the human boards that are doing this now? Are they — is it still somehow better even though it might be bad and scary?
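Meg's worry about predictions built on "not very good" historical data can be shown with a toy sketch. Every number here is invented, and nothing is known about the actual tools parole boards use; the point is only that a model trained on historical outcome frequencies will faithfully reproduce whatever unfairness is baked into those outcomes:

```python
# Toy illustration: a "risk score" learned from historical release
# records reproduces the bias in the records. Suppose group B was
# historically released under worse conditions, inflating its recorded
# reoffense rate. All data below is fabricated.
history = (
    [("A", False)] * 80 + [("A", True)] * 20 +   # group A: 20% reoffense
    [("B", False)] * 60 + [("B", True)] * 40     # group B: 40% reoffense
)

def learned_risk(group: str) -> float:
    """'Train' by computing the historical reoffense frequency for a group."""
    outcomes = [reoffended for g, reoffended in history if g == group]
    return sum(outcomes) / len(outcomes)

# The model dutifully learns the skewed base rates and would score
# every member of group B as twice as risky, regardless of the individual:
print(learned_risk("A"))  # 0.2
print(learned_risk("B"))  # 0.4
```

Real tools are far more elaborate than a frequency table, but the failure mode is the same: if past releases were unfair, the "big data" prediction launders that unfairness into a number that looks objective.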

Evan: Mm-hmm.

Denise: Anything else that you think folks might be interested in that they haven't stumbled across from We Robot?

Meg: We Robot was so interesting. Still a lot of issue spotting, and we would love for more people to be invested in this area. A really great paper that was presented this year had to do with access to education. So we've seen a few news stories about students who have used a telecommunications robot to attend school when they have some kind of disability, and whether that could become a right that students can invoke to get access to information. And so you could — that's like an actual robot, you know, with a little tutu for this girl, is the one I'm thinking of. You can imagine assistive, sort of cyborg-like technology that could also be provided, which causes all kinds of concerns when you put them in a classroom or in an education setting, but that education law's not necessarily prepared to handle. I thought that was a great conversation that I hadn't considered before. Some great stuff on drones, of course. Drones and the First Amendment; so whether the press has a right to fly drones anywhere that they want because they have a First Amendment right to record. That's Margot Kaminski's project. And there was a great paper outlining sort of a first draft of an outline of robot ethics, you know, a real one. So not the three laws. And it's very long, actually. There was — we needed more than three, apparently, if you wanted to put them into actual practice.

Denise: Right.

Evan: You know, I'm really taking a critical eye to — you're talking about Asimov's Three Laws of Robotics, right?

Meg: Right, right.

Evan: I'm really taking a critical eye to that based on some things that James Barrat has said. He was on the show a couple months ago, and he was on an episode of Triangulation here on the TWIT network. He wrote — Denise, what's the title of the book? Our Final Invention, right?

Denise: Our —

Meg: Yeah.

Denise: Our Last Invention, I believe it's called.

Evan: It's a sinister view — not pessimistic; I would say realistic and thoughtful view — of the future of artificial intelligence and robotics. And he just totally discredits the applicability of Asimov's Three Laws of Robotics, pointing out, "Oh, by the way, these were introduced in a fictional work to" —

Denise: (Laughs) Right.

Evan: — play into the plot. So just — I'm just saying, I'm really giving a critical — taking a critical perspective on how applicable Asimov's Three Laws are in any of these discussions because James Barrat, who's infinitely smarter than I am on these issues, says it. So I'll just do what he says in all this. But it does make sense, the reason that he gives, that it doesn't take into account the real nuances of the issues in real life, but it does work well as a fictional device.

Denise: Yes.

Evan: That's all.

Denise: Hence the long paper, I guess.

Evan: Uh-huh.

Denise: Let's — thank you so much for that, Meg. Always fascinating issues; and yes, I think my son would love to send a robotic telepresence into class every day if he had the chance. (Laughs) But I don't think he or any other hale and hearty soul would have the chance. They've still got to show up for school.

Let's move on to a final story on the legislation and policy front.

(Music starts)

Evan: Aw, yeah.

(Music fades.)

Denise: So Mark, we've been talking about big data and interesting uses of big data and intelligence systems. The FTC has obviously been thinking about this, too, and has issued a recent report on data brokers. Can you give us the bullet points on that?

Mark: Sure. Actually, just to confirm, I may be having problems with my Skype connection. Can you guys see me?

Evan: Yes.

Denise: Yep, see you and hear you.

Mark: Oh. Okay. (Laughs) All right. Well, my screen's frozen; but I'll run with this.

Denise: Good.

Mark: So the FTC actually spent about two years investigating the practices of data brokers, which is a somewhat nebulously defined group of companies. In many ways, they sort of define them as basically any company that collects and processes information for the purposes of sharing that information with other businesses for a variety of purposes such as people search; risk mitigation services like identity verification; as well as marketing. And sort of the upshot of the two years of investigation has been the report that was issued about a week, week and a half ago — maybe two weeks ago — where the FTC called for congressional legislation to regulate data brokers. I think probably the most straightforward way to explain the difference as the FTC sees it — or the concern, perhaps, as the FTC sees it — is that many of the practices that data brokers engage in are very similar to the same types of information collection and sharing that consumer reporting agencies engage in. And there's been increasing concern about the fact that consumer reporting agencies are subject to the Fair Credit Reporting Act and all of its regulations and amendments; and you have this data brokerage industry out there doing somewhat similar things and, to their credit, trying very hard not to get into the consumer reporting business — (Laughs) — that has no real direct regulation at all. And so, as a result, the FTC has recommended to Congress that it pass legislation touching on several key points. I think the big ones are creating more transparency into the data brokerage space because it's quite easy to go through your entire life without ever knowing that a data broker existed and has been collecting information about you and sharing that information with other companies. They have no real consumer-facing aspect to the business —

Denise: Right.

Mark: — which then means that there's not really opportunity for consumers to even know that decisions are being made about them or what kind of information's being used. So the core — the core element is really more about creating systems that would compel the companies that are consumer-facing — so website publishers, retailers, companies like that — to disclose to consumers that they are sharing information with brokers; and then making it easier for consumers, once they know brokers are involved, to be able to access the information that's being maintained about them, to correct inaccurate information, and to even potentially have the ability to opt out of their information being used. Or I suppose the opportunity to be forgotten, at least by data brokers. So that's really sort of the core of the call for legislation; and I think it's been — I don't think that any of the conclusions were particularly surprising to the folks in the data brokerage industry. I think the FTC had shown a little bit of their cards quite a while ago, and now I think it's really — the questions to be answered will be whether or not there's any bandwidth at all in Congress to do anything of the kind. And in the absence of congressional action in the short run, whether or not the FTC might still attempt to pursue any of this agenda or implement any of these recommendations through its authority under the Federal Trade Commission Act, potentially by declaring actions inconsistent with these best principles to be deceptive or unfair trade practices.

Denise: All right. Thanks for the update on that. I think we're going to move on to our tips and resources of the week. In fact, I only — I guess we now, as of, Evan, about two seconds ago, we now have two tips of the week. The first — we'll start with the funny one first; and that is, if you didn't think the CIA was actively scanning Twitter, what universe were you in? And now we know that they actually have an account. Can you tell us about their first tweet, Evan?

Evan: I just saw it in — well, I guess I saw it on Twitter. Actually, the funniest one is what Gawker has said about it. I'm not going to read the tweet because it comes from Gawker, and ... can't read a lot of stuff that they say and still keep our clean tag in iTunes. But yeah, it's @CIA; and the tweet from the CIA is "We can neither confirm nor deny that this is our first tweet." Ha ha, isn't that just — real cute, real funny?

Denise: It's hysterical. (Laughs)

Evan: Yeah, yeah. So — but I mean, Gawker's kind of taken a funny meta approach, making fun of the people who are falling all over themselves, saying how perfect that tweet is. So yeah, @CIA. There's the tip.

Denise: Yeah.

Evan: You want to follow the CIA on Twitter, and — you're already being followed by them, so —

Denise: Gawker says people either really, really love the CIA or really, really fear it. (Laughs) Probably more the latter than the former. The tip I was going to give, in addition to "beware the CIA on Twitter," is beware if you are calling yourself something that is phonetically identical — that means it sounds the same — as something that's already trademarked. Apple actually got bitten by this in Mexico. Thank you, [unintelligible], our intern, for sending this my way. Being phonetically identical might not be the first thing that pops into your mind when you're considering whether you have a good trademark or not. Apple has been in a dispute, apparently, in Mexico with someone who has the — it's a telecommunications provider down there called iFone, with an F. And Apple's been trying to get that trademark — unsuccessfully — been litigating about that trademark; apparently, iFone with an F came after Apple. Or actually, the people who are in the crosshairs here are the Mexican telecommunication services, the wireless services Telcel ... goodness, I'm not going to try and pronounce that one ... and Movistar. (Laughs) Three telecommunications services there are sort of strangely now in the position, since iFone won its trademark case against Apple, of being able to sell Apple iPhones but not being able to use the word "iPhone" in any of their marketing or at all. They can't use that mark because someone else has the phonetically identical mark in the same market in Mexico. So just be aware of what your mark sounds like, not just what it looks like.

Evan: Shame on those lawyers in Mexico for not being able to extract a settlement, a licensing fee for iPhone to be able to do that.

Denise: Yeah.

Evan: What are you — come on. I mean, Apple has more money than — whatever.

Denise: (Laughs)

Evan: Get a licensing fee, and let them use it. Gosh. Anyway.

Denise: Yes, exactly. Interesting —

Evan: That's negotiations professor coming out.

Mark: (Laughs)

Denise: Yes. Interesting story, and you're absolutely right, Evan.

Our resources are twofold, especially if you're in California. I know many of you are not; apologies for that. (Laughs) But even if you're not in California, it's good to know that these things are happening because, as we know, people tweet and discuss the conversations at great industry events. And these are a couple to keep an eye out for. This coming week on Thursday, June 12, in L.A., is the IP and Internet Conference put on by the IP section of the California State Bar. Lots of great speakers, including both Kellys — Ben Kelly, who we've had on the show before; Chris Kelly, former chief privacy officer for Facebook and erstwhile candidate, I think, for State Attorney General in California, if I'm remembering correctly. Also — who else is here? Ian Ballon, who we've had on the show. Lots of great speakers and topics going on on Thursday, June 12, at the IP and Internet Conference. And then, if you can scoot on up to Stanford on the Monday right after that, a really, really great event I've attended a couple of times: the Stanford E-Commerce Best Practices Conference; it's the eleventh annual one of those. For anyone who does business online, this is ground zero of the legal issues that you are likely to encounter. Again, very esteemed program directors, including Ian Ballon, Mark Lemley, Bill Coston, and Ronald Vogel. Always a great — I know Fred von Lohmann is speaking in this, in addition to tons and tons of great other folks. One not to miss if you can go; and if you can't go, just keep an eye out for it. The discussions do get summarized and blogged and tweeted and things; and there's always good stuff coming out of those events. So those are our resources for you. This has been a really, really fun episode of This Week in Law. Can't thank you enough, Meg; it's been great meeting you.

Meg: Thank you. So nice to meet all of you, too.

Denise: All right. We'll let you get back to catching up on back issues — or back episodes of Battlestar Galactica. I know I like to do that in my spare time as well.

Meg: (Laughs)

Denise: When you're not doing that, is there anything — any great conferences or events coming up at Georgetown, or anything else on your radar that you want to let folks know about before we sign off here today?

Meg: Summer's a little bit dead, but I would just like to give another shout-out for We Robot, which is April every year; and in December at the University of Colorado, the Silicon Flatirons, the center there, has a great privacy conference also.

Denise: Good to know. Thanks so much for joining us. We hope you'll do so again sometime in the future.

Meg: Thanks.

Denise: And Mark, I'm so glad we could have you back on the show. Great chatting with you today.

Mark: Oh, thank you, thank you. It was a pleasure to come back.

Denise: Thanks for all your information and insights. And again, just if there's anything that's going on with InfoLawGroup or your practice or anything of interest you want to highlight before we go?

Mark: Nothing too out of the ordinary. I think the only thing I would plug for anyone who's active in the security space is — I cannot recommend enough making it out to the Black Hat Conference, not just because it's in Vegas. (Laughs) But it is an outstanding opportunity; hopefully, I'll be able to make it this year; it's been a couple years. So that would be the one thing I would note.

Denise: And when does that happen?

Mark: I believe it is — don't recall the exact date off the top of my head, but it should be the last week of July or first week of August, so —

Denise: (Laughs) You can neither confirm nor deny. Appropriate.

Mark: (Laughs) I can neither confirm nor deny. Aha! There we go.

Denise: Yes. And Meg, do you turn out for that one, too?

Meg: I have never been to that one, but I've never been to Vegas, so seems like a good excuse.

Denise: That's right.

Evan: That's all right. You're not missing anything.

Mark: It's — (Laughs)

Meg: (Laughs) Evan —

Denise: I could send you a few good — I could send you a few good Airbnb listings for —

Meg: (Laughs)

Denise: (Laughs) Yeah. No, I agree; you're not missing too much in Vegas. All I ever — I just go there and get fleeced. I have the worst luck at the tables, ever, of any human. (Laughs)

Meg and Mark: (Laugh)

Denise: So just — you know, if you ever see me at a blackjack table, just haul me away because I need that money more than the casino does.

Meg: (Laughs)

Denise: Evan, how about you? Anything going on this week, or anything interesting that you'd like to leave people with before we depart and shuffle on our way?

Evan: Nothing coming up right away. I'm getting prepared to speak on — at a conference later in the summer. Well, it's actually next month, a little bit over a month from now — to a group of defense counsel in Chicago about social media as evidence. You know, it's discoverability, it's admissibility. So as we've discussed so many times on the show, Denise, social media as evidence. To talk about that, you can't do that without telling some great stories — you know, the facts of these cases. So I'm really looking forward to that. So I've been reading a lot of the new decisions that have come out, or kind of catching up on that area of the law, the different court decisions about social media as evidence. It's really interesting to see how people make bad decisions in their lives. I think that's the bottom line on it.

Meg: (Laughs)

Mark: (Laughs)

Denise: (Laughs) Yes.

Evan: So it's always fun to talk about that.

Denise: People behaving badly and then behaving badly about behaving badly.

Evan: And making a record of it. That's good. (Laughs)

Denise: Yes, exactly. Well, you guys are too modest to do it, so I'm going to do it for you. If folks are not already frequenting the InfoLawGroup site, it is not just a law firm website where they put up their phone number and contact information and law firm or attorney bios. You guys provide great information over there. I love the stuff that you write and post up, and you have a great stable of folks in your attorney roster, and the insights that you share on your blog are really, really good. So I'm going to give that a plug for the both of you since we have two —

Evan: Right.

Mark: Well, thank you, Denise.

Denise: Yes, exactly.

Mark: Yeah.

Denise: And with that, we'll go ahead and wrap up the show. I'll remind you that you should get in touch with us between the shows. Evan is InternetCases on Twitter; I'm DHowell over there. Let us know what's on your mind, what you'd like to hear us discuss, people you'd like to hear us chat with. You can do that on Facebook and on Google+, too; we've got pages over at those places. Or if you need to be more secretive or confidential about your communications with us, go ahead and email us. Again, we can't make any TrueCrypt representations for you about how secure that's going to be, but you can email Evan at; I'm And we do record the show at 11:00 Pacific Time, 1800 UTC, every week; and it's fun if you can join us live, but you don't have to. You can pick up the shows at our site, at; on YouTube; on Roku; various and sundry ways. iTunes, of course, is a great way to go for video and audio podcasts, and we're, of course, there. So listen to the show on demand at your leisure if you'd like. If you do join us live, jump into IRC with us; that's always fun. Great to have a live audience playing along. And [unintelligible] great information. Thank you, IRC, we love you. We're glad that you're here with us and keeping us honest as we do the show. And until next week we will see you later! Bye-bye.

All Transcripts posts