PGP for Crypto Podcast 3 - Luke Hogg


Paul Brigner: Welcome to the Pretty Good Policy for Crypto Podcast, where we have in-depth discussions on cryptocurrency policy and regulation. My name is Paul Brigner, Head of US Policy and Strategic Advocacy for the Electric Coin Co. In a few moments, you will hear from Gary Weinstein, Head of Global Regulatory Relations at Electric Coin Co., and our guest, Luke Hogg,

Director of Outreach at Lincoln Network. We believe in fostering a respectful and inclusive environment for our discussions. And while we at Electric Coin Co.

hold strong opinions about the need for private and confidential financial transactions in crypto to promote economic freedom, our guests may have differing views. Through these thought-provoking and at times challenging conversations, we aim to deepen our understanding of complex policy and regulatory issues and work towards the development of Pretty Good Policy for the cryptocurrency world. This podcast is for educational purposes only and is not legal or financial advice. Our guest remarks may not reflect those of their organization or of Electric Coin Co. Thanks for tuning in and enjoy the podcast. Gary Weinstein: I'd like to welcome Luke Hogg to the PGP for Crypto Podcast.

Luke serves as Director of Outreach at Lincoln Network, an organization seeking to bridge the gap between innovators and policymakers for a freer and more abundant future. In other words, a focus on the intersection of technological innovation and public policy. Lincoln Policy, under the umbrella organization Lincoln Network, is a boutique think tank that works with policymakers and tech innovators to promote market-oriented ideas to strengthen American innovation. Prior to joining Lincoln Network, Luke was Federal Affairs Manager at FreedomWorks, where he primarily focused on blockchain, internet governance, and regulatory issues.

He holds a BA in Government and Data Science from the College of William and Mary. Welcome, Luke. It's great to have you here. Luke Hogg: Well, thanks, Gary. Thanks for having me. I really appreciate it.

Gary Weinstein: Luke, you recently co-authored, together with Antonio Garcia Martinez, a policy report for Lincoln Network that I would like to explore a bit. The intriguing title is "To Be a Stranger Among Strangers: Ad Tech, Web3, and Data Privacy." But before we do, could you please let us know a bit more about Lincoln Network and your role there? Luke Hogg: Well, sure. So Lincoln Network has been around for a long time and has had several different iterations. We were originally founded out of Silicon Valley, and the original goal was to help bridge the gap between DC and the tech world, so at first it was centered entirely on Silicon Valley and San Francisco. But it has since spread out throughout the country.

Really, what our founders Garrett and Aaron found was that, having spent time in civil society and in the tech world, those two worlds spoke very different languages. And when government came in and tried to regulate or legislate on tech and tech issues, it created problems. I often like to say that it's nearly impossible to properly regulate that which you don't understand.

And when it came to tech, and especially emerging tech, that was really a problem that they saw. And so we were originally founded to just kind of foster conversations between those two worlds of Silicon Valley and DC. And since then we've kind of blossomed into this bigger organization with a broader mission.

And so I work on the policy side of this which, as you mentioned, is what we like to call a boutique think tank. Your traditional think tank has a bunch of scholars who focus on various issues, from healthcare to taxes to tech. But we focus almost entirely on tech and innovation issues. And that's a broad umbrella. My role there as Director of Outreach means I wear a lot of different hats. So I do a lot of our interaction with Capitol Hill.

I do a lot of our interaction with outside organizations, but I also do my own research and advocacy. And that's primarily centered around decentralized technologies, decentralizing technologies, and how we should be thinking about them in the broader conversations of tech policy. Gary Weinstein: That's very helpful, thank you. In your article you talk a little bit about the transition from web two to web three, and before I dive into some questions about your policy report, it might be helpful to have a level-set understanding of what you mean by web three. Certainly our audience has an appreciation for that, but to have your perspective on web three and the transition, to borrow your word, from web two to web three would be helpful as a foundation for our discussion. Luke Hogg: Well, sure.

And I think it's a really important and deceptively difficult question to answer, because there are various ways we can think about what web three is, could be, might be. And people use a lot of different terms. It's web three, it's the DWeb, the decentralized web. And all of those have various tinges of difference in meaning.

So when I'm talking about web three, what I generally am referring to is the decentralized web, right? This idea of a return to web one. And in order to understand what I mean when I refer to web one, you kind of have to take a step back and think about the history of the architecture of the internet as we know it.

Obviously there's the start with ARPANET and these original packet-switching networks connecting colleges and universities. And that's kind of web zero, right? That's this very closed-off system that the DOD started running. And then eventually that develops into a protocol-driven network, the World Wide Web, the Tim Berners-Lee version of the internet, where you start to get some new platforms and some new ways of going about things. But generally, when we think about web one, it's pretty open, right? It's pretty decentralized.

People are running big mainframe computers. And over time more platforms are developed and it kind of has this transition into web two. And so to me, web two is the iteration of the internet that we are experiencing right now.

This is a very centralized and controlled environment that is primarily controlled by a select few firms and companies. Whether you think about the infrastructure of the internet and cloud hosting, that's all really concentrated among AWS and Microsoft Azure. The platforms, the ways that we interact with the internet, are pretty much centralized among a select few players. You have Meta and Google and Amazon, and they all have what we like to call walled gardens.

So they've each built their own little area of the internet. And really, what we're talking about now in the broader conversation of tech policy, the distrust people have towards big tech platforms, the issues around content moderation or antitrust or any of these conversations that pop up in Washington, DC and in the states, is really, in my view, a problem of centralization: who gets control, and, to use the famous phrase, who watches the watchmen, right? These companies have a lot of control over our digital environment.

And so in that sense, web three is sort of an inversion of that. Using things like blockchain, using decentralized technologies, decentralized platforms, decentralized transactions, it really is an attempt to get back to the original vision of what the internet was supposed to be, a really open and iterative place. This is the vision that Tim Berners-Lee obviously espoused and still continues to espouse to this day.

There are a lot of different aspects to this, and a lot of nuances we can get into about what it looks like. But part of the problem of defining what web three is, is that it's a vision of the future. It's an idea of what the internet could be if we implement these decentralizing technologies in the ways that we want to. So I hope that helps. It's not a great answer, and I think we can get into some of the nuances there.

Gary Weinstein: That is a great answer actually. But I will push back a little bit and say there are some in the blockchain ecosystem who are saying it's time to retire the terminology web three. What are your thoughts about that? Luke Hogg: Having been in and dealt with tech issues a lot, and especially emerging tech issues, I think it's funny; you're seeing the same kind of conversation in debates around AI with the advent of ChatGPT and GPT-3 and that suite more generally. It's a constant issue of branding and marketing, right? So to me, whether we use the term web three or the term DWeb, or somebody else comes up with a brand new term that everyone gets on board with, I'm totally for figuring out what the best branding is.

I get the idea that the term web three is kind of wonky and weird, requires a lot of explanation, and has a lot of different interpretations. People kind of project their own vision of the future onto the term as well. So I'm totally open to different ways to describe this, but what we're talking about at the heart of it is what's really important here. Whether we want to call it web three or the DWeb, distributed technologies or decentralized technologies generally, that's a semantic debate that I'm happy to have.

But fundamentally, what we're talking about is the same thing: this vision of a decentralized future. Gary Weinstein: So let's dive into the fundamentals, and I'll raise again the title of your recent policy report, "To Be a Stranger Among Strangers: Ad Tech, Web3, and Data Privacy." This policy report discusses how targeted advertising has raised concerns about data privacy, and how new decentralizing technologies, which you've just described in the web three paradigm, may make existing data privacy regulations unsuitable. You propose a sectoral approach to data privacy laws to govern the appropriate flow of information based on context-specific norms, and you highlight the need for policymakers to reframe data privacy laws to account for technological change.

It sounds like what you are describing is a potential conflict between existing data privacy laws and decentralized systems in web three. Am I understanding this correctly? Luke Hogg: Well, that's exactly right, and I think the best example that we have of this comes immediately from the European Union. In the European Union, they have what's known as the General Data Protection Regulation, GDPR for short. This is a huge omnibus regulation that covers basically all data flows in the digital environment, and I think it's a well-intentioned proposal. There are very real concerns that people have about how their information is being collected, stored, used, processed, and sold throughout our digital environment. And so the European Union took a look at this and said, well, we need to create a set of rules for the flow of information in our digital environment.

The problem, well, one of the problems, there are a lot of problems we could get into, one of the big problems of this approach is that it was really focused on the web two environment. It was really focused on the kind of centralized digital environment that we all inhabit now, and on trying to solve some of the apparent problems with that framework. But in doing so, it also inhibited innovation for the future, right? One of the big problems, and I think this is pretty typical of how the European Union approaches regulation generally, but specifically tech regulation, is that they see a problem and they tend to drive a nail with a sledgehammer. They can solve that problem, but they also tend to create other ripple effects from that one action. And that's what you see with the GDPR, and in fact the European Parliament openly admits this; they have their own report, I think it's something like 250 pages, that examines how the GDPR interacts with blockchain and decentralized technologies more generally.

And they came to the same conclusion that Antonio and I did: these things are incompatible. They're fundamentally at odds with each other.

So that is a specific example, but we're also seeing that conversation start to play out in the United States, whether at the state level or the federal level. And our hope is that as we're having this conversation in the United States, we don't fall into the same trap that the European Union did, that we can think about and create regulations and legislation in a way that allows for this innovation and doesn't get stuck in the mud, as it were. Gary Weinstein: Well, let's talk about a specific example. The GDPR has, as one of its most important rights, the right to be forgotten. But when one thinks about blockchain and distributed ledger technologies, and about transactions and their availability for anyone to explore and track, how does blockchain and distributed ledger technology square with a right to be forgotten? Luke Hogg: Well, it's very difficult to square. In fact, it's almost impossible to square. And I want to be careful here, because there are a lot of different ways that you can build a blockchain and a lot of different ways that these systems can work.

But generally, the whole point of a blockchain is that it's an immutable ledger. The whole point of the technology is that someone can't go in and unilaterally tamper with, delete, or change information. So we point to several different problems, and that's one of the biggest ones, right? If you create a rule that says thou shalt delete data when the consumer requests it, sure, we can argue about how that interacts with a web two environment, where you have traditional data sets, traditional ledgers, where a single company could go in and create a process by which they could delete data. But when you're talking about a distributed network, those questions become much more difficult, if not impossible. And I want to be clear again, there are ways that you could theoretically go in and change information and delete data. However, when you're talking about something so distributed, if I have transactions on X blockchain, do I really have the right to go to the whole network and demand that that data be deleted? That's question one.

Two, what are the technical measures that would have to be put in place to allow that to happen? And the most important question, when you're talking about a distributed or decentralized network, is that you kind of have to get everybody on board, right? If you have thousands of nodes that have all stored either the whole ledger or part of the ledger, and only one individual decides they don't want to go along with this, if I get to delete my data and they don't agree that I should be able to delete my data, well, then they're in violation of the law or the regulation and will be sanctioned for that. And this is one of the fundamental issues when you're thinking about privacy and applying laws that were written for an analog, centralized digital environment to a decentralized environment: the question naturally becomes, well, who is liable? Who actually is involved here, if anyone? And so when it comes to the right to be forgotten, the right to be forgotten is fundamentally at odds with an immutable ledger.
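To make the immutability point concrete, here is a minimal sketch of a hash-chained ledger in Python. The block structure and payloads are invented for illustration and don't match any real chain's format; the point is simply that each block commits to the hash of the block before it, so "forgetting" one entry invalidates everything that follows.

```python
import hashlib
import json

def block_hash(prev_hash: str, payload: dict) -> str:
    """A block's hash commits to the previous block's hash plus this block's data."""
    data = json.dumps({"prev": prev_hash, "payload": payload}, sort_keys=True)
    return hashlib.sha256(data.encode()).hexdigest()

# Build a toy three-block chain (payloads are illustrative).
GENESIS = "0" * 64
chain, prev = [], GENESIS
for payload in ({"tx": "alice->bob: 5"}, {"tx": "bob->carol: 2"}, {"tx": "carol->dan: 1"}):
    h = block_hash(prev, payload)
    chain.append({"prev": prev, "payload": payload, "hash": h})
    prev = h

def verify(chain) -> bool:
    """Recompute every hash; any edited or deleted block breaks every later link."""
    prev = GENESIS
    for block in chain:
        if block["prev"] != prev or block_hash(prev, block["payload"]) != block["hash"]:
            return False
        prev = block["hash"]
    return True

print(verify(chain))                     # True: the untouched chain verifies
chain[0]["payload"]["tx"] = "(deleted)"  # try to "forget" the first transaction
print(verify(chain))                     # False: the edit is immediately detectable
```

On a real network, thousands of independent nodes each run the equivalent of `verify`, which is why no single party can quietly honor a deletion request.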

Gary Weinstein: There's an adage that if you're not paying for the product, then you are the product. And when you were discussing web two and centralized organizations, it caused me to think about all of the various services that are provided for free by those centralized organizations. I'm wondering whether the widespread use of targeted digital advertising, and you do discuss this in your privacy piece, impacts individuals' right to data privacy, and whether there's a solution here where this conflict can be reconciled with new decentralized technologies emerging in the web three paradigm. Luke Hogg: Well, certainly. That's kind of a two-part question, right? And I think it's a really important one, because one thing that we really started to question in putting this paper together, and in having the conversations that came with it, is why data privacy is such a big issue. Why is this something we're even talking about? It's been a big issue in the past three or four years, but for a long time it really wasn't. People didn't think about the data that was being collected on them. They didn't really care that much.

And there's a lot of polling that backs up that claim. So why do we start caring about this now? What is the chain of causality that got us to the point where we have things like GDPR, or, at the federal level, proposals now for big comprehensive privacy regulations in the United States? And our conclusion really was that this all comes back to advertising, right? People are on record saying that the original sin of the internet is advertising, targeted digital advertising.

And I think there's a hint of truth in that, though it's a bit of a hyperbolic claim; there are a lot of sins of the internet, right? But why do companies, say Meta or Google or Amazon or insert X company, collect information in the first place? They're not doing it for no reason. They're not collecting it just because they want to have it, or because they're Evil Corp. and want to do nefarious, terrible things with it.

They're collecting it because it makes the advertising better, and that means they can sell advertising space at a higher premium. They can make more money. Look at Facebook. What made Facebook the economic powerhouse that it is compared to something like Twitter? They started at about the same time, they were targeting roughly the same section of the population, and they're both social media platforms. Really, the difference is that Facebook figured out how to collect more information and more effectively target advertising.

So when you're talking about data privacy, and this is kind of a chain, right, what you're really talking about is data governance. You're talking about how information should be collected, how it should be stored, who should have access to it, and whether it should be able to be commoditized and sold to third parties. Those are really important questions, right? But they all flow from the initial economic reason that you have this bulk data collection in the first place, which is to sell ad space. And in the paper we talk about Google here, right? If you go back to the earlier days of the internet, a lot of it was fee-for-service.

You would pay to use a platform, something like AOL, on a subscription basis. And even now we still have software-as-a-service and subscription models. But the thing that Google figured out really early on, the thing that made it what it is today, is that it could really effectively sell advertising space. There's something about a Google search: if I'm searching for flights to Maui, at that moment I am very susceptible to advertisements.

And so they were able to effectively sell this advertising space at a really high markup. That happened initially, and then they figured out that the more information they collect on you, the user, the more effectively they can target the advertisement and the more money they can get out of advertisers. That's the fundamental chain that we're talking about here.

And we believe that eventually you're gonna have advertising. Advertising is one of those original markets, one of those things that has always been with humanity throughout time. Eventually it's gonna make its way into the web three environment as well. So how do we need to be thinking about those things? That was a very long-winded answer, so I wanna pause here before we move on to the second part of that question.

Gary Weinstein: Yeah, that's fine. And I think that was a very instructive answer, at least to the first part of the question. But I do wanna follow up on the first part. We talked about user data being collected by companies to serve targeted ads, but isn't attribution the key to making it valuable to advertisers? Luke Hogg: Well, right, and this is a big part of what we talk about in the paper. Going back to that Google example, one of the things that makes Google so valuable as a company, and so valuable to advertisers, is that it is really the last link in the chain. When we think about advertising, we're thinking about how you get someone to buy something. Let's say I'm Coca-Cola and I want to get you, Gary, to buy Coca-Cola. There are a lot of different ways I can try to make that happen, and it usually costs a lot of money, but there are a lot of ways I can try to influence you to convince you to buy my product, and in the digital environment that happens everywhere.

So I might have a promoted tweet. I might have a banner advertisement on Facebook. I might have artificially generated content that is steering you towards buying Coca-Cola. But the real question is, who takes credit for that? Advertising businesses are really built on this idea of taking credit for the final end purchase.

So taking it back to a digital example, if I want to get you to download my app and then make an in-app purchase, say I'm Amazon, well, where in that chain of causality do you actually get attribution? Who actually caused you to download the app and buy the product? Well, it turns out Google will always tell you it's Google. If you wanna buy a Coca-Cola, if you see an advertisement, whether on television, on a billboard, on a promoted tweet, any of these types of advertising, where do you normally go in order to make that point of purchase? You go and you Google where to buy Coca-Cola. And Google being the last step in that chain means they effectively get to take all the credit; even though there's this whole long series of actions and money that's been spent, Google gets the credit for that last action. And that's a really important part of talking about this in a web three environment: how do you attribute, who gets the credit for, actions that are taken in the digital environment?

So if something like Filecoin wants to advertise and convince people to use its service, who ends up getting to take credit for that advertisement and monetize it in that way? Gary Weinstein: So let's get to the fun part of the question, which you identified as the second part, and that is: how can this conflict be reconciled with new decentralized technologies emerging in the web three paradigm? Luke Hogg: Sure. There's a lot of conversation in the web three and decentralized ecosystem about ownership, owning identity and then being able to commodify it individually.

I think there are a lot of interesting conversations that can be had around that. But fundamentally, I think people are misled about exactly how valuable their data is. I saw there was a report, and this comes up in the web two environment as well.

People propose that Facebook should pay me to use their service for the data that they collect on me. But your data really isn't that valuable. For all the data that Facebook collects on you in a year, it's worth about $2, right? That's from some report that came out several years ago. So in terms of commodifying data for the individual, that's an interesting conversation, and I think it should be had. But fundamentally, I think the thing we should be talking about is not necessarily commodification but control of identity, and how you can actually influence identity.

So think about a web two environment and some company like Amazon, right? The reason we're having such a conversation about data privacy right now, to me, is that that data is linked directly to your identity, whether that be your email, your phone number, your account, even your device user ID. One of the big reasons people feel uncomfortable with data being collected on them, even though they have read the terms of service, they get the service, and you can figure out what data some of these companies are collecting on you, is that people are, quite frankly, creeped out at the idea of all this information being tied directly to their real-world personal identity. And that's something you're seeing pop up at the state level, and even in the GDPR: these data privacy regulations include a kind of right to anonymity, a right to anonymized data. So there are two ways you can deal with this.

In the web two environment, you can break the link to identity while keeping the data. You can either delete the data entirely, as the GDPR provides, or you can anonymize the data, so you keep it, but it can't be linked, or reasonably linkable, and that's very specific phrasing, "reasonably linkable", to your real-world identity, whether that be your email, your phone number, your device ID. And I think that brings up very interesting conversations in the web three space. The way we think about this, and we break this out in more detail in the paper, is that in the web two environment, information is linked directly to your identity, but it is generally held privately. So going back to that example of Amazon.

If you go and buy Coca-Cola on Amazon, they're gonna keep that bit of data and tie it to your account. They're gonna say Gary bought Coca-Cola on this day, and that helps them later on figure out how to sell you more Coca-Cola. That's kind of how the digital advertising world works. But that information is kept private.

With the exception of selling data to third-party data brokers and the like, there's no huge public ledger where everyone can go and mine and look at what Amazon is doing and what information it has on you. So to reiterate: your data is directly linked to your identity, but it is being held privately. In a web three environment, that calculus is inverted. Information is not directly linked to your identity, your real-world personal identity, but generally, and again I'm speaking generally, it's publicly available in some way.
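A minimal sketch of that inversion, with invented record formats rather than any real platform's or chain's data model: on a public ledger, activity is tied to a pseudonymous address derived from a key, and linking it to a real-world identity requires information from outside the ledger.

```python
import hashlib
import secrets

# Web three (hypothetical): the record is public but pseudonymous.
# Deriving an address by hashing a public key loosely mimics how chains like
# Bitcoin derive addresses; the details here are illustrative only.
public_key = secrets.token_bytes(32)                   # stand-in for a real keypair
address = hashlib.sha256(public_key).hexdigest()[:20]  # pseudonymous address

public_ledger = [
    {"from": address, "to": "9f2c1ab44e", "amount": 5},
]

# Web two (hypothetical): the record is identity-linked but held privately.
private_database = [
    {"account": "gary@example.com", "item": "Coca-Cola", "date": "2023-02-01"},
]

# Anyone can read public_ledger, but nothing in it names Gary. Only someone who
# can tie `address` to Gary, say via an exchange record, collapses the pseudonym.
print(public_ledger[0]["from"])        # a pseudonym, visible to everyone
print(private_database[0]["account"])  # an identity, visible only to the company
```

The policy tension follows directly: web two rules target the private, identity-linked record, while web three data looks like the public, pseudonymous one.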

Gary Weinstein: That's helpful. For our audience diving into the details of your policy report, you talk about a possible solution, a sectoral approach. Can you please explain what you mean by a sectoral approach to data privacy laws, how it would work in practice, and what some of the potential benefits of that approach are? Luke Hogg: Sure. And I think, again, having some historical context here is really helpful.

A sectoral approach to data privacy is really what we have now. We've always been concerned about the flow of information and who gets to control information in certain contexts. We see this in the financial world: there are a lot of rules and regulations about banks collecting your Social Security number so that they can comply with certain laws.

What are they allowed to do with that information? How do they store it? But I think the better example here is actually medical information. Medical information is a really unique area where everyone is very concerned about general access. It's a very private thing in general. And there are two components to that. There's the normative, value-laden side, which is the Hippocratic Oath, right, and these norms that have been established over centuries about privacy between a medical practitioner and their patients. That's a general code that I think most doctors, if you talked to them, would agree with. But there's also a legal component to that in the United States.

It's HIPAA, and this is a law that essentially lays out what doctors are allowed to do with the information they collect on patients. It's basically established under a principle of the appropriate flow of information: given that we are talking about very sensitive healthcare information about individuals, what sorts of flows of information should we allow? A helpful example here: say you went to your general practitioner and got a chest scan of your heart. They took this scan, they were a little bit concerned about something they saw, and so they decided to send it over to the cardiovascular specialist at the hospital. That's a perfectly reasonable flow of information. A long time ago, you would've just put it in an envelope and sent it over, but now a lot of this happens across the internet. We're sending this information so that somebody can look at it, and I think everyone would agree that that's a generally appropriate flow of information.

You've given your doctor permission to send this information to other doctors so that they can help make you better and potentially find any problems that are going on. Now, if your doctor took that chest scan and posted it on their Instagram, I think we would generally say that that is not an appropriate flow of information. It's the same information, it's the same data, but the context in which it is being transmitted, and who it is being transmitted to, is the distinguishing factor. And that is, generally, the approach we have taken to legislating around data privacy: not so much focusing on the specific kind of information, but establishing rules that lay out the appropriate flow of particular kinds of information in particular contexts. And that's something that GDPR, for example, doesn't really do.

GDPR basically says all information of this type must be treated in this way, regardless of the context in which that data is being collected, stored, transferred, and the like.
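Those context-specific norms can be made concrete with a toy policy check. The sketch below loosely follows published formalizations of Nissenbaum-style contextual integrity, where a flow is described by sender, receiver, information type, and context; the roles and norms here are invented for illustration, and real sectoral law like HIPAA is of course far more detailed.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Flow:
    sender: str     # a role, not a person: "treating_physician"
    receiver: str   # "specialist", "social_media_public", ...
    info_type: str  # "cardiac_scan", "purchase_history", ...
    context: str    # "healthcare", "advertising", ...

# Context-specific norms: each entry permits one appropriate flow.
NORMS = {
    Flow("treating_physician", "specialist", "cardiac_scan", "healthcare"),
    Flow("bank", "regulator", "ssn", "finance"),
}

def appropriate(flow: Flow) -> bool:
    """A flow is appropriate only if some context-specific norm permits it."""
    return flow in NORMS

# The chest-scan example from the conversation:
print(appropriate(Flow("treating_physician", "specialist", "cardiac_scan", "healthcare")))          # True
print(appropriate(Flow("treating_physician", "social_media_public", "cardiac_scan", "healthcare"))) # False
```

Under this framing, the same datum can flow appropriately in one context and inappropriately in another, which is exactly what a single omnibus rule struggles to express.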

Gary Weinstein: I'd like to come back to anonymity for a moment. Anonymity may not be the same as privacy, and may not be the same as confidentiality, but you did use the word anonymity, and we are finding that legislators in the European Union are using that term as well. So for the moment, let's just stick with that word when we're trying to reconcile and solve for a pivot from web two to web three. As you pointed out, anonymity is a value proposition in web two, and certainly in the GDPR, as you've discussed. You described the situation in a way that made me feel as if you believe there is overregulation in the European Union that is actually driving out, stifling, or having a chilling effect on innovation. If that's the case, and if there's this value proposition in web two for anonymity under the GDPR, why in your view are European regulators, such as in MiCA, the Markets in Crypto-Assets regulation, Article 68, looking to make it harder for CASPs, or Crypto-Asset Service Providers, to list coins that have anonymity-enhancing features? Unless transaction history can be fully explored, arguably, although there are more regulations to come that further define and clarify this, those CASPs would be prohibited from listing those coins.

It seems to me like there's a disconnect there. Could you please address that? Luke Hogg: Gary, I think that's a really interesting and important question. And if you want someone to divine what is going on in the European Union, I'm probably not the best person to do that. But I'll take a stab at it. From watching what the European Union has done, whether it be with GDPR, the Digital Markets Act and Digital Services Act, or even what they're doing now with cryptocurrency and crypto regulations, there is generally, in my view, a fear that the European Union has about innovation and technology. If you look at how those countries have developed, there's a reason that there basically isn't any major tech player in the EU.

If you look at all of the major tech companies in the world, the big tech projects, the big innovations, the vast majority of them are coming out of the United States. There are a few in China, a few in Japan, some spread around Korea and other places. But they practically never come out of the European Union. In fact, I think we were talking about this previously: I can't think of a single major crypto player that was founded and operates primarily out of the European Union, or was built with that in mind.

But at the same time, the European Union likes to protect what it already has. If you look at the traditional tech world, you can see this with national champions. For a long time the European Union was kind of jealous of the United States and our technological innovation.

And so they established a lot of rules and regulations and subsidies to try to attract tech talent and tech companies and create what they call their own national champions. They've not been super successful at that, and I think there are a few reasons why. One is that these rules and regulations, whether they be consumer-focused or market-focused, are typically reactionary, and they're typically based in a certain amount of fear of what the future might bring. The European Union generally wants to prevent bad things from happening.

In the United States we have a lot of conversations about permissionless innovation versus permissioned innovation, and I think the conversation is somewhat shifting in the United States on this. But in the European context, the Europeans traditionally take a very permissioned approach to innovation. What I mean by that is that they're so fearful of potential harms from innovation that they want to make sure they mitigate every risk before they allow innovation to happen. In the United States, we've typically taken the opposite approach, permissionless innovation: we're gonna allow you to innovate, and if there are harms, then we can figure out ways to address and mitigate those harms once they actually appear.

But it seems to me that there's a general anxiety in the EU over tech and innovation. We're not just seeing this when it comes to crypto, either. In conversations around AI, the EU is also trying to take a very cautious approach, allowing certain types of innovation and disallowing other types, whereas in the United States, we're generally saying, have at it.

There are ethical boundaries we should respect when it comes to artificial intelligence, but generally we want you to go out and innovate, because that's what the United States tech industry has been really good at. So I don't know if that perfectly answers your question, but I think what you're seeing is a continuation of a general trend: a requirement of permissioned innovation in the EU, and anxiety about what the future might hold, whether or not it actually causes those harms. Gary Weinstein: That's a very comprehensive and wonderful answer, and I recognize it was a long and compounded question, so thanks for that. In your policy report, you cite Cornell University professor Helen Nissenbaum.

Full disclosure, I'm a very proud Cornell University alum, and I did take the time to read her paper as well in preparing for this podcast. She has one titled "Cryptography, Trust and Privacy: It's Complicated." It was published in November 2022 and was co-authored, but you cite Helen Nissenbaum. In that paper, she discusses agency with regard to the flow of information. Could you please discuss that topic for a bit, especially since you teased it in an answer to an earlier question? Luke Hogg: Sure. I'm not gonna sit here and speak for Professor Nissenbaum.

If you want to read her, she has an absolutely wonderful book called Privacy in Context where she really teases out this idea of contextual integrity, and I'm going to summarize some of those things here. Her whole career has been focused on a lot of these questions. So I'd highly encourage everyone: don't take my word for it. Go read what she's written.

She'll explain it far better than I could. But to summarize a little bit, really what I think Nissenbaum's big point is when it comes to privacy is this idea of appropriate flows of information for a given context, and the agency that goes with that. Individuals should have a right, to a certain degree, to control information about themselves.

This goes back all the way to Warren and Brandeis and their idea of the right to privacy. This idea of user-centered privacy is that you as the user, you as the individual, should be able to control information about yourself. When Warren and Brandeis, this is the future Supreme Court Justice Brandeis, were writing this in the Harvard Law Review, they were writing about newspapers and yellow journalism, and they were writing in an analog era. I think their ideas have come forward into the digital era in a way that doesn't quite fit. It's hard to have total user-centered privacy. It is very difficult in a digital world to have total control over all information about yourself.

Practically impossible, I would add. And so a lot of what Nissenbaum struggles with and critiques in her work is that idea of privacy: that you as the individual have end-all-be-all control over every piece of information about yourself. Partially from a practical standpoint, but partially from an ethical and normative standpoint. Her idea, and she calls this contextual integrity, is that instead of thinking about total control over all information in all contexts, we should think about information flows within certain contexts, and then establish rules that govern those contexts, whether those be legal rules, ethical rules and norms, or societal rules.

There are different ways you can apply that, but that's fundamentally Nissenbaum's take. In her original book, she was talking about a pre-crypto digital environment, a pre-web three world. But as you mentioned, in her 2022 paper she says this stuff is really complicated.

And it's really difficult to establish a one-size-fits-all rule that applies equally in all contexts, because different contexts are different, right? My relationship with Instagram or Facebook or Twitter and their collection of my information is a very different context than, going back to that earlier example, my doctor collecting information on me. I have very different expectations of those different actors and of the different contexts in which they're using different information. So, again, to paraphrase Nissenbaum: when we're thinking about creating rules and standards, we should be thinking about context-specific, appropriate flows of information.

And we should establish rules, whether they be societal standards or legal rules, that allow for the nuance there, that allow for flexibility, and that govern those flows of information in a way appropriate to the given context. Gary Weinstein: Well, what about the agency of the individual to control that flow of information? Rationality, consent? Luke Hogg: Right.

I think this is something the United States has generally done pretty well on: you should have some agency over information about yourself. In fact, at the end of the day, you are in control of that information.

It's your information, right? But once that information is disclosed and is governed by those appropriate flows of information, you don't get to retroactively decide that the information shouldn't be controlled or shouldn't be allowed. An example here, in the web two context: there are a lot of conversations about terms of service, and there are a lot of ways we can make terms of service better, a lot of ways we can allow more consent and more informed consent in these contexts. But once you give Facebook consent, once you sign those terms of service, you don't get to retroactively come back and say, well, I really didn't mean that. Once the information has been disclosed, as long as you have consented to the standards and the norms of that appropriate flow, you can't retroactively come back and say that this is super problematic.

You can try to establish better flows, try to reframe the context for the next time. But really, you should consent to the flows of information that you think are appropriate, or that we as a society think are appropriate. I think that's a better answer to the question you were asking, but I'm happy to continue going down that rabbit hole as well. Gary Weinstein: So, Luke, we talked about attribution, we talked about targeted ads, we talked about data linkage. But we haven't really talked a lot about communication.

Communication is data, and there are platforms out there that are end-to-end encrypted, whereby I can have a conversation with you and the only people who have access to that conversation are you and me, and potentially a regulator. A regulator who proceeds in a lawful, constitutional way, through a lawful process, perhaps going to a judge and obtaining a warrant, can go to you or me, ask for the device that has that application on it, and then request and obtain the details of that end-to-end encrypted data. But apart from that, and maybe one or two other examples, that communication is entirely private between the two of us.

Society appears to have gotten comfortable with that, and by society I include regulators globally. Apple, WhatsApp, Signal: all of these platforms allow for end-to-end encrypted communication where there is no attribution, no linkage, no targeted advertising, no weaponizing of data. In the financial sphere, what we are seeing in this pivot to web three are blockchain protocols building technological solutions for end-to-end encrypted financial transactions, protected by elegant cryptography, and the same dynamic applies.

Why have society and regulators become mostly comfortable with the notion of end-to-end encryption in communication, yet have a different perspective when it comes to financial transactions on the blockchain?
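For readers who want the mechanics behind the property Gary describes, here is a minimal sketch using the PyNaCl library (`pip install pynacl`). It is illustrative only, not Signal's actual protocol, which adds ratcheting, forward secrecy, and much more, and the key names are invented. The point is that the relay carrying the message can never read it, because decryption requires a private key that never leaves an endpoint device.

```python
from nacl.public import Box, PrivateKey

# Each party generates a keypair; private keys never leave their devices.
gary_key = PrivateKey.generate()
luke_key = PrivateKey.generate()

# Gary encrypts to Luke using his own private key and Luke's public key.
sender_box = Box(gary_key, luke_key.public_key)
ciphertext = sender_box.encrypt(b"confidential message")

# The platform relaying `ciphertext` sees only opaque bytes. Only Luke's
# private key (paired with Gary's public key) can open it.
receiver_box = Box(luke_key, gary_key.public_key)
print(receiver_box.decrypt(ciphertext))  # b'confidential message'
```

The analogy in the conversation is that shielded blockchain transactions aim at the same property for financial data: the network can verify and relay a transaction without being able to read its contents.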

Luke Hogg: So Gary, I really love that analogy; I had never thought about that comparison. I have always been a very big advocate for encryption and encrypted technologies generally. At Lincoln Network, we've worked to support the Open Technology Fund, which is a US-government-run program to build encrypted technologies. If you look into it, Signal was funded by the US federal government, and Tor as well. And so I think it's an issue of control. It's a question of control.

For the vast majority of human history, governments, and, well, law enforcement specifically, I think that's the subtext of part of your question, had to be okay with the fact that they couldn't control what people were doing. Take communications: if we go back to the 15th century, the law enforcement of the time couldn't enter your house and know what you were talking about. They didn't have that level of control and surveillance, quite frankly.

Then you get into the original digital era, even if you wanna go back to the start of the telephone, right? These new innovations allowed for surveillance in new and unique ways. Whether or not you want to call it control is an interesting question, but with a certain level of control, the government could know what you were doing.

And it has this Minority Report quality of being able to predict and stop things before they happen. But now, and we talked about web one, web two, web three, with web three as a reversion back to the original principles of web one, you're seeing the same thing when it comes to encryption more generally, right? You used to be able to have these conversations completely absent of government surveillance. And then, especially around the Cold War, with bugged houses and wiretaps and the advent of all of that, you could never really be totally sure that the government wasn't listening to you, especially if you were, imagine, a Soviet spy or something.

But now we're going back, a kind of reversion, to that surveillance-less environment where you have total control over the information you are sending and receiving. And I think that makes the government nervous, because they've lost a capacity they used to have. And free speech specifically, in Western liberal democracies, is a fundamental thing. We all get it.

There's a reason it's the first of the amendments, right? And so when you tell me what to say, or tell me what not to say, or you are listening to me and what I'm saying, and that might have impacts, I think we have a gut reaction to that as a society. Gary Weinstein: We're controlling what you say. Luke Hogg: Right, exactly. We are controlling what you say. We're punishing you for what you say.

And we as a society, especially in the United States, have a gut reaction to that. That's why I think we've been pretty much okay with things like Signal or iMessage, end-to-end encryption. There are notable examples of the government going in and trying to force companies to break that encryption, and the companies saying no. And then it goes all the way up through the courts, and the courts eventually decide, no, the federal government can't force you to break encryption. They can't force you to put a backdoor in.

I think Apple is a fantastic example of this. For all its faults and flaws, this is one thing it's very, very good at: standing up and saying, no, we're not gonna break our system just because you, the FBI or whoever, want more control over it. I don't think we have as strong a gut reaction in the financial sense.

I think all of those arguments hold true. One of the reasons that governments are so nervous about cryptocurrencies and financial privacy in a digital era is because they had so much control, right? If you go back 500 years, let's go back to the 1500s again, the only real way you could transact was by bartering or using cash, and for a long time cash ruled society. Then you get into checking and banking and digital banking, and each iteration of that gave the government more control and more surveillance capacity over people's financial lives.

And now that is being rolled back, and there are alternatives. People like to call Bitcoin and Ethereum a kind of digital cash. That's not a great analogy, right? But there's a hint of it that rings true: this is something that you as the user have control over, and should have control over.

And there's really nothing the government can do about it. So what you're seeing, and why this is such a big deal in the financial realm, is that they've pretty much given up on the speech side. In fact, I would say they've lost, right? There were a bunch of lawsuits over this, there was legislation that was proposed, and they lost. I don't think they've necessarily lost yet in the financial context.

I think they probably will lose in the long run, because that comparison to me makes perfect sense, right? Why is it different in a financial context versus a speech context? Gary Weinstein: I think it may also be that, and I love your description of control, I think that's right on, one of the other attributes is that financial transactions can now be linked, aggregated, and used to pattern behavior. And in various jurisdictions where certain behavior is deemed undesirable, being able to link financial transactions to behavior provides a measure of control over a population, something we are seeing play out in China in a way that is quite traumatic. And I'm wondering if our inability to find a similar comfort with communication and financial transactions preserving confidentiality will simply land us where China is now, where government can determine patterns of behavior based on what you've purchased, or who you've sent money to, or where you've received money from, and then use that to control behavior.

Luke Hogg: I think it's a bit of a slippery slope to say that this will end up being a totalitarian regime like China. One of my colleagues literally wrote the book: he spent a lot of time in Xinjiang and literally wrote the book on the surveillance state that is being used to oppress the Uyghurs. So I think there's a lot to unpack in that particular example.

But I think your point rings true, and I'm gonna provide a secondary example that might get at this question a little bit better. There's a famous example that's often used around crypto communities, because we like to think about cryptocurrency in a very domestic-centered context and tend to forget about the global context. The example that's given is Afghanistan. Now, I don't know if this is still true.

I imagine it probably is; I'm not a Middle Eastern expert. But as the story goes, it is supposedly illegal in Afghanistan for women to have their own bank account independent of their husband. This is part of the Sharia system and Taliban control. And this obviously creates problems for women who are trying to become economically independent and get out of particularly abusive scenarios. But there's an example that's given of a woman who started her own business selling handcrafted products internationally.

And she started to accept Bitcoin as the way to transact, the way to get value for her goods, that was independent of her husband, and that eventually allowed her to get out of an abusive scenario. And, as the story goes, she then told other women about this, and it started a whole business based around sending these goods abroad and getting those transactions in Bitcoin. Especially from a Western liberal mindset, we look at that and say, wow, isn't that fantastic? Isn't it great that this woman was able to get out of an abusive scenario in an oppressive theocratic regime because of financial freedom, and financial privacy specifically? And yet when we bring it back to our own context, we have a different reaction to it, or maybe not as visceral a reaction to it.

So I think that's an important way to look at this. And that's why I appreciate you bringing up the China context: when you're looking at authoritarianism, digital authoritarianism is a thing we talk a lot about at Lincoln, and these tools can be a very strong factor in countering those kinds of authoritarian regimes. So I think the important thing is to think about the broader context of these technologies.

When you talk about Signal, going back to the messaging context for encryption, the reason that the United States government funds research and development of encrypted technologies is not really because they want people in the United States to be able to use encrypted technologies. They're kind of agnostic about that. Generally, you can say what you wanna say in the United States and you're not gonna be thrown in jail for it.

They develop these programs, and these are DOD-funded programs, CIA-funded programs, because of the international context, because these privacy-preserving technologies can have such a huge impact in other parts of the world where they don't have the same freedoms that we do. If you want evidence of this, go look at the rise of end-to-end encrypted apps in Ukraine right now. Go look up the statistics; it's absolutely astronomical. And when there are people out there who still advocate banning end-to-end encryption, banning privacy-preserving technologies generally, for various reasons I'm happy to get into, they're usually talking about it in a very domestic context, about very small, small in the general sense, domestic risks and concerns, when really we should be thinking about the overarching global context of these technologies.

Gary Weinstein: Your Afghanistan example is quite powerful. Thank you for that. But Bitcoin is not private; it's public, and it's pseudonymous. With easy forensics, those transactions can be attributed to that Afghan woman and all of the friends she told about this wonderful technology.

Should she have, and would she have, benefited from the use of a protocol that allowed for confidential transactions on a blockchain, such that the Afghan government would not be able to, with the push of a few buttons, determine where every single transaction went to and came from? Luke Hogg: Well, sure. I think that's a perfectly fair argument. I use the Afghan example, and if I remember right, it comes from the very early days of cryptocurrencies. I believe it was from a book I read that was written around 2016, which means the example is probably from about 2013.

And Bitcoin is the example there because that pseudonymous aspect gave them a certain amount of privacy and a certain amount of control. But the more privacy and the more control you can give people in those kinds of contexts, the better. Going back to the messaging context, the United States supported a lot of different technologies, for a long time, that allowed for more and more private contexts. Go back to Tor, which, I don't even know how long Tor's been around, but the original idea of it was to allow people to do things more privately than they would on the original internet.

And that's why the US government funded it and helped develop it. But Tor only went so far. There are ways to put in backdoors, ways to get information on Tor, that make it not totally private, which is why we kept developing, right? That's how we ended up with Signal; that's how we ended up with end-to-end encryption. So the farther you can go toward those things, especially in the global context, when you're talking about countering digital authoritarianism, the better, in my view. Gary Weinstein: Thanks, that's very helpful.

In this very podcast, the PGP for Crypto Podcast that I co-host with my colleague Paul Brigner, PGP stands for Pretty Good Policy. So I'd like to discuss the key challenges for a moment. What are some of the key challenges that policymakers will need to address in reconciling data privacy laws and decentralized systems? You might have some advice for policymakers looking to better understand the potential of blockchain technology and create a supportive regulatory environment, or for how they might balance the need to protect data privacy with the need to foster innovation in the decentralized technology space.

It'd be very helpful to get your thoughts.
