Free Thoughts, Ep. 261: Emerging Tech (with Matthew Feeney)



Welcome to Free Thoughts. I'm Aaron Powell, and I'm Paul Matzko, filling in for Trevor Burrus. I am the host of libertarianism.org's newest podcast, Building Tomorrow. Joining us today is Matthew Feeney. He is director of the Cato Institute's new Project on Emerging Technologies. Welcome back to Free Thoughts, Matthew.

Thank you for having me.

What is the Project on Emerging Technologies?

Yeah, the Project on Emerging Technologies is Cato's relatively new endeavor. I'm trying to count now; I think it began a couple of months ago, June or July, I should probably know that, but it's relatively new. I'm running it. It's a project of one at the moment, but the goal of the project is to highlight the difficult policy areas that are raised by what we're calling emerging technologies. Now, this is always a difficult thing to define, right, and of course emerging tech is not just changing technologies but new things arriving on the scene. What I've done is to try and highlight a couple of issues where I think Cato has a unique capability to highlight interesting libertarian policies associated with new tech. Some of the policy areas that we're focusing on include things like artificial intelligence, driverless cars, drones, data and privacy issues, and others. There are a lot of tech issues that have been around for a while, so I don't think net neutrality is going anywhere anytime soon, nor are the numerous antitrust issues associated with big tech companies, and we've certainly at Cato had people write about those issues before. But this new project is confining itself to five specific areas, though I'm sure that as the project grows and develops, the list of issues we'll be tackling will grow.

How did you choose those five in particular?

Yes, so the five were areas where I thought Cato didn't have enough people writing about, and also
areas where I think libertarians have something new and interesting to contribute. So, for example, in my first couple of years at Cato I did write about the sharing economy. I also wrote a little bit about drones, body cameras, new tech issues. But my work on drones, for example, was just on law enforcement use of drones, specifically the concerns associated with drone surveillance. I wasn't really writing at all on the commercial use of drones, the exciting world of taco delivery drones and building inspection drones, and that's a whole different policy area, really, compared to drone surveillance. So that was an area where I thought we should really have someone who can direct a project that will commission work on those kinds of issues. Another one would be artificial intelligence, which is something that I think is very exciting but poses difficult questions to libertarians, and libertarian commentary in that space has been not nearly as robust and as loud, I think, as it could be. So that's another reason why I picked that. But yeah, basically the five, I think, fulfill the criteria of being focused on new and emerging tech that libertarians have something interesting to say about and that Cato is in a good position to

tackle.

Can you give us an example of what you mean by libertarians having something new and interesting to talk about? Because a lot of tech policy in the past has taken the form of regulatory policy: should this thing be regulated or not, typically, and then what form should it be regulated in. And that tends to break down along the standard lines. You have the people who are opposed to regulation, you have people who are generally pro-regulation. But what's uniquely, I guess, libertarian in the way that you're approaching technology issues?

Yeah, so I don't think that the way we're approaching the project is much different from how a lot of us here in the building approach our other policy areas. For me, it's to tackle the issues raised by this tech by embracing a presumption of freedom and trying to minimize coercion. So, number one, on the presumption of freedom: we should act in a way that allows for innovation and entrepreneurship, and make sure that people working in this space are in a position where they're asking for forgiveness more often than they're asking for permission. As far as minimizing coercion, this goes back to some of the work I discussed earlier when we were talking about data privacy and drones. We should be wary of some of the government uses of the technology, making sure that exciting new technologies like drones can be used for really cool stuff like deliveries and other private applications, while also trying to make sure that the scary aspects of it, like surveillance, are being put under lock and key as much as possible. Something like artificial intelligence might be another good example: we want to make sure that people working in the space are free to innovate and to explore new ideas, but we want to make sure that government use of it, especially when it comes to autonomous
weapons and automated surveillance, that we ensure there are policies in place that keep those threats in check.

So, emerging tech by its nature is still, you know, yet to come. It's "already, not yet": it's kind of here, but it's still in prototype or developmental form. So a lot of the potential benefits, as well as potential risks, are still in the future. As you're trying to decide what should be regulated and what shouldn't be, or in what ways it should or should not be regulated, what's your rule of thumb for trying to decide on something that hasn't actually happened yet?

Yeah, I suppose the libertarian response to this is comparatively straightforward, right? We should proceed with caution when dealing with imaginary threats. So let's think of a good example, maybe only because I work on it in my own research, but I think it's fair to say that in the coming decades we will see more and more government use of unmanned aerial surveillance tools. I think that's a fair assumption. I also think it's fair to say that that technology will improve as much as it proliferates. And as I did write, I wrote a paper saying, look, in preparation for this world we should have the following policies in place. What I'm very hesitant to do, and not that it should never be done, but we should be hesitant, I think, to develop new rules because of a new thing coming onto the block. Drones, for example, raise interesting privacy concerns, but it's not clear that they're necessarily unique in the way a lot of people think they are. So we don't like the fact that drones could be used

by people to snoop on us in our bedrooms or to fly over our barbecues, and we don't like that police could use them to do surveillance. But we already have peeping tom laws. We have a tort system that can handle a lot of these complaints. And while the Supreme Court precedent on things like drone surveillance is not very satisfying, it is the case that states can and have gone above and beyond what the Supreme Court requires. So, going forward, I think we should be hesitant to say, well, we need a driverless car policy we're going to write down, or we need a drone policy. We should think about the kinds of threats that come from these fields, but resist the temptation to write a lot of regulation in anticipation of the proliferation of the technology.

But isn't that the problem? Because these are emerging technologies, they're not technologies that we, either as citizens, just ordinary people in our lives, or as lawmakers, legislators, regulators, have any experience with. We haven't used them. We haven't seen how they shake out. So that notion of saying, well, we shouldn't just imagine threats: isn't that what we're kind of forced to do? One of the things that distinguishes emerging technologies now from emerging technologies in the past is the pace at which they can become all-pervasive, the pace at which they can spread. Either they're network technologies that, you know, in a matter of years suddenly everyone is on Facebook, whereas the printing press took a lot longer to get books into everyone's hands. Don't we have to be anticipating threats? Because with a lot of this stuff, if we don't, and we don't protect ourselves now, it might be too late.

Well, too late for what, right? This is the question. I think history has enough examples of people exaggerating
threats that we can learn from. One of my favorite examples of this is the British Locomotive Act of 1865, which required that a vehicle not pulled by an animal, so a steam-powered locomotive, if it was on a road and towing something, was legally required to have a man 60 yards ahead of it with a red flag. Because people were anticipating certain threats, right, that these new technologies are going to cause accidents, and so what we need, it's obvious, is a man running ahead of these things with a red flag to alert people that this very dangerous thing is coming. I don't know if that's the right kind of approach to dealing with emerging technology issues. We can anticipate that with the emergence of the locomotive there will be occasional accidents and some people will get hurt. The early years of flight, for example, are just full of people killing themselves in these new flying machines. And it sounds a little cold-hearted to say, but the price of innovation for something like that is that mistakes get made and people might get hurt. And it's difficult, especially in today's world where news travels so quickly, because the moment someone gets hit by a driverless car or a drone lands on someone's head, everyone's going to hear about it. I think people are thirsty for bad news, unfortunately, and that's something we're always going to be fighting against.

So I actually would go on record right now saying I'm in favor of a law requiring that Elon Musk wave a red flag 60 feet in front of every driverless vehicle, because he has more time on his hands.

So I hear you talking about, essentially, assumption of risk: that when it comes to tech, we have a long history of people overrating or exaggerating fears of the downsides of a technology and having a harder time imagining the beneficial applications. And so
a light-touch regulatory policy wedded with, like, a general cultural sense of, hey, if you want to experiment with this, as long as you limit the externalities, the damage to other people, go for it. Is that kind of the attitude you bring to this stuff, like on, you know, unmanned vehicles?

Yeah, I think that the barrier for government intervention in this space should be

difficult to overcome. So a very high risk of death or serious injury is basically where I would say you can maybe argue for some kind of regulation. And again, we're sitting in the Cato Institute, right? I mean, our approach to regulation, this isn't a unique approach to emerging technology. I think libertarians across the board have a light-touch approach, and I feel like you can have that approach while accepting that there are risks. The problem, of course, is that with a lot of this stuff, an argument can be made that innovators and entrepreneurs might be hesitant to start doing a lot of this work if they feel like they might get in trouble, or they want to wait until there is a safe regulatory space. So Amazon decided to test its delivery drones in England because they knew that the FAA had not cleared delivery drone testing here. I can understand why Amazon didn't say, yeah, well, screw it, we'll do it anyway. If you want to be a respected private business, you don't want to get in trouble with the feds. I get that, but I think that's an unfortunate feature of FAA regulation. The FAA should have an approach of: you had better be careful, because you will be in a position to ask forgiveness. I still think that's a better position than people in the drone space asking for permission.

But, I mean, going kind of back to the question I asked before, with emerging technology and, to quote Donald Rumsfeld, the unknown unknowns at play here, do we want people to be extra-special careful in a lot of these areas? Because you even have situations where, so the story often gets told, this is the narrative, that all of a sudden a handful of people in Palo Alto, while no one was watching, broke American democracy with
social media. Or a situation where, you know, everyone's kind of out there innovating and then suddenly we have a rogue AI and we can't do much about it. Or, you know, gene splicing, CRISPR, people making stuff in their garages, and then we have a pandemic. That kind of threat of regulation, or that asking for permission, does that help at least to mitigate against those kinds of sudden catastrophes?

Well, I think you're highlighting something interesting. First, I'll say hindsight is always 20/20, right? It's easy to look back and say, wow, if we had had X regulation, Y would never have happened. It's easy for people to come up with scenarios. The difficult job is thinking of regulation that would hamper that scenario from ever taking place while also not hurting innovation. So, rampant AI, okay, this is something anyone who's watched a science fiction film worries about. But what's the fix for that? Do we write a law saying no one shall build AI that will run amok on servers and take over? Isolating a threat is not the same thing as coming up with a good regulation for that threat. And social media: "social media companies ruined American democracy," this is sometimes said by people, but what's the regulatory fix that would have stopped a lot of the bots and the trolls that got everyone concerned in the wake of the election? That's a much harder question, it seems to me. It's easy to get outraged and to get worried about possible threats, but coming up with solutions is much, much harder. And I think we should also keep in mind how likely the threat is. It would be a shame if developments in AI were seriously hampered because a couple of lawmakers watched too many science fiction films and got really, really worried about, you know, the Terminators.

Well, how big of a problem is that, specifically? This
is an area where lawmakers, I mean, we at the Cato Institute often lament how little lawmakers seem to know about the subjects they plan to regulate. In fact, we have named our auditorium the F.A. Hayek Auditorium; Hayek famously offered a theory for why lawmakers could never know enough about the stuff they wanted to regulate to regulate it well. But this seems to be an area where lawmakers are particularly ignorant. It's often cringe-inducing to watch congressional testimony, because these lawmakers have levels of understanding of the internet, of networks, of technology, that are substantially worse than the typical middle schooler's. So how do we deal with that kind of problem? We've got a situation where, with this tech, the urge among lawmakers is always to pass a law whenever

there is a threat or potential threat: pass a law. And they're doing that because they want to do it; they're also doing it because constituents demand they pass a law. But this is an area where, almost by definition, you can't know much about it.

Yes. I defy anyone under the age of 30 to watch anything like Zuckerberg's testimony on the Hill and not have their head in their palms by the end of it. It is very worrying that many of the lawmakers on the Hill don't seem to know much about this. And that makes sense, because a lot of the people who would be qualified to be on staff in these offices, to actually give advice and explain to members of Congress how this stuff works, could be paid much, much better doing almost anything else in the tech industry. That's a serious worry. There's also this worrying inclination among some lawmakers to urge technology companies to, and I quote, this isn't a phrase original to me, "nerd harder." Whenever there's a problem like end-to-end encryption, people think, well, we don't like the fact that some terrorists can communicate using WhatsApp or Signal, but there must be a fix; how can you not fix this? There's a frustration there. Where we're sitting, I think we should maybe spend more time focusing on the benefits of this technology, not on potential costs. So, driverless cars will kill some people. They just will, and that's of course regrettable. But we should think about the lives that they could save. The vast majority of auto fatalities in the United States are directly attributable to human error. So from that perspective, driverless cars that are better than human drivers, but not perfect, will save thousands and thousands of lives a year. And once Congress
eventually gets comfortable with the proliferation of driverless cars, we should expect that for the next couple of years there will be headlines of driverless cars killing people. That's to be expected, and it will be a big cultural shift. So emphasizing the benefits rather than the costs, I think, is worthwhile. That's easy for me to say, because I won't be the one sponsoring the bill that allows these things to run rampant, and then who are they going to wag the finger at when the bad things do happen? But like I alluded to earlier, good news rarely makes headlines, and it's also slow-moving. It will take a long time for the benefits of driverless cars to be realized in the data, but the accidents and the deaths will be reported instantly.

So what I hear from you, Matthew, is a sense that our cost-benefit accounting is flawed. It's kind of a seen-versus-unseen situation. It's easier for us to imagine apocalyptic worst-case scenarios and then to discount the possible benefits. So whether it's, you know, pharmaceutical regulation, something like the FDA has a notoriously stringent safety requirement that doesn't really account for the fact that not approving a life-saving drug costs thousands, even millions, of lives. That doesn't play a role; they just ask whether or not the drug itself will harm lives. So in that sense, the kind of accounting ledger is flawed when it comes to emerging technology. But I'm also interested in hearing you talk about ways in which regulators themselves, by regulating too quickly, can actually create a kind of self-fulfilling prophecy when it comes to the downsides of that technology.

So, a good example of that would be what? I just want to make sure I understand the question. I suppose you can imagine a situation where the
FAA says, well, we haven't had as many drone accidents as other countries because we haven't let drones fly, which is probably an accurate statement. We need to keep in mind that while that's true, and the FAA is tasked with safety, right, they need to make sure things are safe, we also need to take into account what we're losing. I think when you ground drones you incur a cost, namely, you are not having as innovative and as exciting an economy as you could have. So, yes, a federal safety agency can stand up and say bad things aren't happening because we're just not letting people experiment, but it's not a particularly useful thing to say, it seems to me. And it's also not helpful because no one who's rational is denying that emerging technologies will come at a price. We're just saying that in the long run the benefits outweigh the costs.

Given that, and given that

bad regulation, or overly burdensome regulation, can not just slow down the pace of progress but can cost lives, and can certainly reduce wealth and economic growth, when is it appropriate, and we've seen this happen a fair amount in the emerging tech space, when is it appropriate, or is it ever appropriate, to intentionally circumvent regulations?

So we're at the part where Aaron asks me when it's okay to break the law. I would like to point out that I think there are a lot of people who do this by accident. I don't know the number, but I imagine there are many people who got drones for Christmas or birthdays and flew them without adhering a hundred percent to FAA regulation. I can say that with almost certainty. The response from the FAA, I think, should not be to bring the hammer down. Now, when is it acceptable? I mean, I don't know...

Sorry, go ahead. The classic example being, like, Uber, which has arguably changed the world, and frequently in a positive way. Granted, they have their problems as a company, but a lot of that came with them basically ignoring local regulations.

Okay. In that case, I would argue that at least in some of the jurisdictions, Uber could have made the argument that, well, we looked at the taxi regulations and we decided that we didn't fit the definition of a taxi, so off we went. That's a much easier argument, it seems to me, than a drone operator saying that they're not an aircraft under FAA definitions. Uber, I think, was doing something very interesting, which was providing obvious competition to an incumbent industry while actually being a very different thing behind the scenes. To customers, I think, Uber and taxis seemed very similar, but they're actually very different kinds of businesses, and it's a very different kind of technology. I take your point, and of course Uber's
opponents would oftentimes portray Uber as a lawless invader. I think at least in some jurisdictions Uber could make the argument that, actually, no, we just felt like we didn't fit into that regulatory definition. And Uber, at least when it began, fit into a very awkward regulatory gray area. So in a situation where you've taken a look at existing regulations and you think that you don't actually run afoul of any of them, I don't see why people shouldn't feel free to get into an area and innovate. Airbnb might be another example, where, okay, well, I took a look at local laws and I figured that I wasn't a hotel. That seems to be a reasonable thing for people to assume. But I won't say this is without risk. You know, I wouldn't advise anyone in a private company to deliberately break the law and hope that you have good lawyers on hand. I don't know if that's the best approach, because

local lawmakers don't like that kind of confrontation, for sure.

I mean, I suppose some of that question comes down to one's own ethics, right? Most people imagine an ethical obligation to break the law when there is some kind of clear cost to life that comes from following the law. So, you know, civil disobedience writ large. Well, some people did hold them responsible, but when Martin Luther King Jr. or another civil rights activist blocked the highway for a march on Selma or Birmingham or whatnot, the idea is that it's okay to circumvent laws when there's a clear ethical obligation to do so, that the law is less important than ethical systems. So that gets complicated really quickly.

I will mention here, though, that Charles Murray, I haven't read the book, but I think in one of his most recent books Charles Murray advocated for a law firm that specializes in protecting entrepreneurs like this, to basically encourage people to go out into the wilderness. Adam Thierer from Mercatus, who wrote an excellent book called Permissionless Innovation, categorizes technologies as born free and born captive: some are born captive into regulatory regimes, and others are born free, truly new and innovative, and regulators haven't caught up yet. But if you're born free, as Adam might call them, I think you had better be ready for certain fights, and Charles Murray's recommendation was, yeah, we should basically have a law firm that specializes in helping entrepreneurs with these kinds of fights. From the regulators' point of view, I think they should perhaps just choose their fights more carefully and not scare people away, but that's not going to happen anytime soon.

The costs that we've been talking about, like
deaths and injuries, are, I think, easier to discuss. But the problem with a lot of emerging technology discussions is that you have these more difficult-to-pin-down complaints about the impact on society: what's it doing to our children, isn't this making us more isolated, think about the citizenry, all that sort of stuff.

Thank you, Tipper Gore.

Well, right. It's interesting because this isn't a new kind of complaint, but it nonetheless remains sticky. I wanted to briefly read out a quote I found from 1992. Neil Postman wrote a book called Technopoly: The Surrender of Culture to Technology, and he was on C-SPAN in 1992, and he had previously complained about television. He was on, and he said: "When I started to think about that issue, television,

I realized that you don't get an accurate handle on what we Americans are all about by focusing on one medium, that you had to see television as part of a kind of system of techniques and technologies that are giving shape to our culture. For instance, if one wants to think about what has happened to public life in America, one has to think of course first about television, but also about CDs and also about faxes and telephones and all the machinery that takes people out of public arenas and puts them fixed in their homes, so that we have a kind of privatization of American life." This is a really interesting kind of complaint, but he goes on to describe a future that we're kind of in now, where he says that people say with some considerable enthusiasm that in the future, putting television, computers, and the telephone together, people will be able to shop at home, vote at home, express political preferences in many ways at home, so that they never have to go out in the street at all and never have to meet their fellow citizens in any context, because we've had this ensemble of technologies that keeps us private, away from citizens. And I hear complaints like this quite regularly. I mean, that's from 1992, but there is still a very persistent worry that emerging tech will make us bad citizens, make us isolated. AI is exciting, but will our children say please and thank you to the robots? Will the robots become our friends or our sex partners? You know, isn't all this stuff making us kind of isolated? This isn't a new concern. It's frustrating, and it's not going away.

So we have been talking largely about policymaking, policymakers, regulators, people who are in the policy world. But how much of that is really just downstream of culture? Such
that when we're dealing with these issues of emerging technology, where the real action is happening is in the culture, in the cultural acceptance of it. And so, to some extent, focusing strictly on the policy is kind of missing where much of the influence is or will be.

I certainly do think that it's important to communicate to the public about this, because, like you mentioned, some of these policy concerns are downstream from the public. In preparation for the podcast I was finding articles from, you know, 1859 editorials in the New York Times complaining about the telegraph, and a 1913 New York Times article complaining about the

telephone and how it encourages bad manners. All this stuff isn't new. But I think when we're sitting in a think tank, we should be ready to communicate with the public in addition to regulators and lawmakers. If we have an optimistic, forward-thinking public, then you hope that will somehow translate to lawmakers. But yeah, lawmakers are made up of human beings, and the public are human beings, and they have a pessimism bias. I think, though, when you focus again on benefits: maybe more parents would be happy if driverless cars could take their kids to baseball practice, and it would be better for people if their elderly parents had appliances and homes that can monitor whether they've fallen down or had a medical emergency. It would be good if we were able to travel more safely, to have our homes know more about us. It would be nice to come home and have the house, you know, set at the right temperature and playing the right kind of music. Making sure that people realize the benefits of a lot of this stuff is certainly, I think, part of the mission. My only audience is not lawmakers, that's for sure.

All of that, the home that knows a lot about you, all these things that can predict stuff about you, keep track of things about you: there's a lot of data there. There's a lot of data gathering, and a lot of it depends on devices that can surveil us in one way or another. We as libertarians, we as Cato Institute scholars, spend a lot of time talking about the problems of government having access to data and government surveillance programs. But are we concerned, should we be concerned, about the level of pervasive private surveillance that that rosy future you just sketched out demands?

I think we should be worried. You can listen to and read a lot of Cato material on the concerns that we have about government
access to data, and I certainly don't want to sound blasé about that. My primary worry is the government, mostly because, as creepy as a lot of this might be when it comes to Amazon and Google, Amazon and Google cannot arrest me or put me in a cage. I think that is a big difference. People might be a little creeped out by these shopping algorithms. They might be a little freaked out by the fact that these companies do know a lot about us, but I want the heavy lifting there to be on government access to that data. When you buy a lot of these appliances, there's a certain degree to which you assume that they will be collecting information about you. But I'm not as worried about Amazon as I am about the government, for the reasons I just outlined, and I don't think Amazon has an interest in creeping out its customers too much.

Should we be worried, though, about companies like Amazon gathering all this data, centralizing all this data, and then that data suddenly becoming accessible to the government, either through the passage of legislation, or through subpoenas or warrants, or through government hacking?

Yeah, there's a degree of trust you have in these big companies. They need to do a good job at being custodians of data. I don't want to speak to, I don't know a lot about, Amazon's actual security, I'm just using them as an example, but they have a very strong profit-seeking incentive to make sure that their customers' privacy is not violated. There's not much, though, that they can do when the government comes to them with a valid court order. They are put in a tough spot. And again, that's why I think that's where we should have the focus. But we shouldn't be in any doubt that a lot of these companies have a huge amount of information on us. I think it was my colleague Julian who once said that, you know, if Google were a state, it would be a
pretty powerful police state, given the amount of information it has. My apologies to Julian if I'm butchering your quote, but the point is that they do gather a huge amount of information on us. Even people like me incur a cost when we use ProtonMail instead of Gmail, or DuckDuckGo instead of Google for web searches, and that cost is that Google now knows a little less about you and can't provide you with the degree of service that most people get. But that's fine by me. There's still choice; Google is not a monopoly when it comes to this sort of stuff. People value their privacy subjectively, and maybe I value it slightly higher than the average person, but I have no problem with people using Google products to make their lives better. I do worry about government access to that data to conduct investigations.

It feels like forever ago now, but it was only a few years ago that there was buzz about Mark Zuckerberg running for president. That blend of a major tech company with the power of the state, while it's not here now, is not outside the realm of possibility, even if it's not as literal as the head of one being the head of the other. To go back to something mentioned before, Matthew: you teased a bit about how in Great Britain, I think it was, regulatory policy toward unmanned aerial vehicles was more favorable, so it pushed Amazon to conduct tests overseas. To broaden that out, how would you say the international regulatory landscape compares to the United States? Where does the U.S. rank when it comes to relative freedom and regulation of emerging technology?

I think it's difficult to say, for the following reason: saying "technology policy" is a bit like saying "economic policy"; it's a huge range of things. Let's think of the plus side first. The United States is still a global leader when it comes to tech innovation. This country is home to some of the best-known, largest, and most interesting tech companies. GlobalData recently produced a list of the 25 most valuable tech companies in the world: 15 are in North America, 7 in the Asia-Pacific, and only 3 in Europe, and that, I think, is not an accident. Europe is, as you alluded, slightly ahead of the United States when it comes to drone policy, but they slapped Google with a huge antitrust fine, I think it was 5 billion dollars. So it depends on the technology you're talking about. They're certainly ahead when it comes to drone policy, but when you're leveling fines worth billions of dollars on Google, it's not a great look.
If you examine the technology-specific policy, I wouldn't want to make a big generalization. I would say, though, that there's probably a reason the United States is still today a massive hub, funder, and innovator when it comes to technology.

Does competition work in that area? Do you see evidence that countries look over at other countries that have better tech policy, and so are getting bigger companies and more innovative products, and say, well, it's probably good for me to loosen things up a bit too?

I don't know; I'd have to look at data. I think the problem for a lot of these countries is that Silicon Valley is still a massive talent suck for them. That's a gut assumption; I'd have to look at data on that. Competition, of course, is an interesting point when you're talking about big companies like Google, Apple, Amazon, and Facebook, because a lot of those companies are big enough that they can buy interesting smaller companies. What would be a good example? YouTube, Instagram, WhatsApp: these are all companies that were bought by much bigger companies. That's not necessarily a bad thing, and it's not necessarily something we should complain about, but for the foreseeable future I imagine that Amazon, Google, Facebook, and Apple are going to be on the lookout for interesting new companies to buy, one, because they view them as competition down the road, but two, because they also feel they can do interesting things with those companies. That's not a bad thing, necessarily. If you are building something that competes with Amazon and you're presented with a life-changing amount of money, there will be some people who say, no thanks, I'll keep plugging away at what I'm doing. I believe it's the case, I'm not a historian when it comes to Facebook, but I believe Facebook faced a buyout offer at a certain point, right? Didn't someone want to buy Facebook? I could be making that up, but my point is that there are very large, successful companies today that said no to buyouts. Netflix, famously: Blockbuster had the offer on the table for some minuscule fraction of what Netflix is valued at now, right? And keep in mind that this competition question is something we're going to hear more of as long as Trump is president, because there's a perceived anti-conservative bias in Silicon Valley that people think is actually affecting the product. I think it's fair to say that most people who work in these big tech companies are probably to the left of the average American, but I'm not convinced that personal bias among employees has had a direct impact on the products. And you've
had this weird situation where self-professed conservatives are now saying, well, they're too big, and we should talk about antitrust. When we're thinking about the big four, Google, Amazon, Facebook, and Apple, I'm not convinced that these companies are monopolies in the true sense, and I think it would be a mistake to bring antitrust action against them.

So, the example that comes to my mind of international regulatory competition: at TechCrunch Disrupt out in San Francisco, a number of panels hit on the idea that when full self-driving cars, level 5, you know, no steering wheel, get rolled out, they'll be rolled out in China before the rest of the world. And that will be because, according to a number of speakers, the central government in China has simply established by fiat that it will be open to autonomous vehicle technology. In fact, by dollar value, investment in AV technology in China just over the past year has matched the rest of the world combined. So you're seeing a shift to that place, because in China the central party can cut through local- and state-level competition. What that brings to mind for me, though, is a question for you, Matthew, about how emerging tech should be regulated by local and state authorities versus federal authorities: the question of federalism and emerging tech policy. How do you approach that as someone analyzing emerging tech?

I'm very interested in a lot of the local regulations that handle industries like ride-sharing and other things you see in the sharing economy. But when it comes to a lot of the technologies we've discussed, the regulators are very powerful federal ones: the FAA, the FCC, and, with bioengineering and all that, the FDA. So I am in a position where I am mostly focused on federal regulations, but I'm certainly keeping an eye on what's happening at the local level. And as we discussed earlier, state and local governments can take it upon themselves to address some of the concerns we've discussed, especially when it comes to drone surveillance, which was an example I used. There are state and local governments that have been comparatively welcoming to the sharing economy, that have decided, no, we're going to be a home of innovation and entrepreneurship, and that's what we want. But I think it's fair to say that for some of the big issues we've been discussing today, driverless cars and drones and things like this, it's ultimately probably going to take some federal leadership to get the kind of regulatory playing field we want implemented.

Thanks for listening. Free Thoughts is produced by Tess Terrible. If you enjoyed today's show, please rate and review us on iTunes, and if you'd like to learn more about libertarianism, find us on the web at www.libertarianism.org.

2018-10-22 01:48


Comments:

Can anyone tell me how they created the video for this (the image with the vertical bars that move when people talk)? I want to create a YouTube podcast, but have so far found the process of converting an audio file into a YouTube video to be too time-consuming. I'm wondering if whatever they used to make this video would cut down on the time used.
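(Not from the episode, but for anyone with the same question: one common low-effort approach is ffmpeg's `showwaves` filter, which renders an audio file as an animated waveform video in a single command. This is just a sketch; it assumes ffmpeg is installed, and the file names are placeholders.)

```shell
#!/bin/sh
# Sketch: build a "moving bars" waveform video from a podcast audio file
# with ffmpeg's showwaves filter. AUDIO and OUT are placeholder names.
AUDIO="episode.mp3"
OUT="episode.mp4"
# showwaves draws the waveform; s= sets the video resolution, and
# mode=cline gives filled vertical lines that move with the audio.
FILTER="[0:a]showwaves=s=1280x720:mode=cline:colors=white[v]"
# Assemble the full command: map the generated video plus the original
# audio, encoding as H.264 + AAC so YouTube accepts the upload.
CMD="ffmpeg -i $AUDIO -filter_complex $FILTER -map [v] -map 0:a -c:v libx264 -c:a aac $OUT"
# Print the command instead of running it, since the input file here is
# only a placeholder; quote the filter string when running it for real.
echo "$CMD"
```

To use it, substitute your real audio file and run the printed ffmpeg command (with the `-filter_complex` argument in quotes). It runs unattended, so it should be much faster than assembling the video by hand in an editor.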

The last comment by the guest, about tech innovation requiring federal regulation, seems to be a problem for those wishing to minimize the state. Further, if consumers are too fragmented to have any real power in forcing big companies to change policies, and the state is minimized (perhaps through demographics and therefore less revenue for the government), doesn't this mean large companies effectively become the state? This raises questions about the ability of grassroots organizations to engage large companies. At that point, it may be necessary to restructure into smaller groups which have more economic clout than individuals or families, while avoiding the anonymity (which dilutes individual, and therefore collective, action) and hierarchy (which encourages corruption) that arise when groups become too large. In other words, many more smaller groups, but connected effectively through internet communication, to enable economic boycotts that would replace or combine with civil protest. This seems to me to be the way to minimize both the state and large companies, since it is non-violent and works through small iterative change. Trying to engage such large structures on their own ground seems fruitless.
