The Ethics and Governance of AI Course: Class 3, February 27, 2018
Well, hello everybody. Boy, low energy. I'm low energy Jeb. Okay. Today's topic, and we compiled it down a tiny bit: ownership, control, and access. Sounds like the name of a law firm. And it might be worthwhile to think about why we would dwell on topics like ownership, control, and access. Sam's going to take some notes behind me if we can put it up as we go. There we are. She's going to embiggen it so you can see it, and I think she sent around in the Slack channel a way for you to access this very doc, so that you can put in bold any questions or other thoughts you have as we go along. We can also interrupt the normal way.

So why might we concern ourselves with ownership, control, and access? Well, I can think of at least three reasons. One reason is that something like that already exists around a resource, something that we all maybe have interests in, or a desire to take part of, and we're wanting to know what to do about that fact. So there's kind of a status quo that we don't know where it came from, but it's already one that involves some form of power over that resource, and we might want to contest or shape or remonstrate against that power. Another reason is that something is dangerous, and a dangerous thing should not just be left to the four winds. So when there's some form of danger, we look to see how, as a governance matter (governance is next week's topic), we might effectuate ownership, control, and access over that dangerous thing so that it doesn't become too dangerous, so that whatever risks it contains don't materialize. And surely a big strand of our thinking around AI in its various guises fits into this category of a dangerous thing that maybe requires some gatekeeping of some kind. And a third reason that we might think about ownership, control, and access has to do with how much of the thing there is. If there is an abundance of something
that we need or want, that abundance might be the end of the story. Call it oxygen-laden air: really important to all of us at all times, and it's sufficiently abundant that there's no scheme generally regulating control or access or ownership of the air we breathe. And that is not to be taken for granted; that abundance is, for some of these things, an ideal. And it's
only when you have scarcity that you need to start thinking about how to allocate that scarce resource. And allocation can be a matter of ownership, control, and access. And I should say, for the law types among us, I think the more law-oriented you are, the more the word ownership is just cause to have a headache and to get depressed. When I say ownership, word association: what do the law people think of when they hear ownership? Anything?

A lot of people: keep off my land. So, a right to exclude, for example. Anything else on ownership?

Liability. With great ownership comes great responsibility: damn, I own this land; who's going to wander on and then sue me for getting injured? That's why I'm keeping them off. Elektra, is that different from the right to exclude them, or is it one more general? Uh-huh. Uh-huh.

Alienability. You can transfer to someone else whatever rights you have; if they are alienable, they can be moved at a price, which then can create a market. So a concept of ownership might often be a condition precedent to having a market in something, because you need to know what you're selling and what you're buying, and that's represented by the notion of alienability. Already in that list of only four things; let's do a fifth one. Yeah.

Well, rights is another word that should give the legal people here a total headache and be like, oh God, rights. Not that they're against them, mind you; it's just a headache. And it actually gets to what I was just about to say, which is: ownership can mean a lot of different things. To say that somebody owns something, unless it's a very common thing with a very broadly understood set of responsibilities and rights that come with it, may not answer a lot of questions you have. And one of the things I think of when I think of ownership, from my old first-year property days, is the bundle
of sticks. Do they still do that, property as a bundle of sticks? I'm
not sure it's the best metaphor in the world; it's a little parched. But bundle of sticks means it can represent a number of different statements you can make about privileges or rights that the owner enjoys, and restrictions that may fall on others, and only when you add them all up in the spreadsheet do we know what ownership means. In fact, it can just be confusing to say A owns X. Better to say: here are the legally protected or other interests that we will say A has in X. And the alienability can be down to the level of the stick. You can say: I'm going to transfer my right to exclude to this person, but my right to something else will go to this one, and you know what, my responsibilities with respect to the land will go to this person. Why would that person ever want that? Because I'm paying them. It's an insurance company, and if somebody gets injured on my property through my negligence, I won't have to pay; the insurance company will, because I'm paying them a monthly fee. That's a great example of just one stick of the bundle (in that case a sharp, negative one) being able to be alienated while others might be kept by the owner. And especially when we think of ownership of intangible things, which, if we're going to end up talking about AI, we may find ourselves talking about often (things usually but not always thought of as intellectual property), it's really important to think of the bundle of sticks. Because it's not as obvious or colloquial as it might be with land: you have this land, so you get to say who can join you on it, because here's the physical property and you're on it and others can't be. When it's intellectual property, it has to be represented in a more nuanced way: this thing that you have, you're not allowed to copy, because I hold the copyright in it, whatever it is, some tangible medium that contains some fixed idea. Patent
is even more weird, because if I hold the patent to something, even if you didn't copy it from me, even if you didn't learn it from me, if you independently invented it after I got the patent on it, you need to license it from me. Which means there might be things you're doing in the privacy of your own home, like, for example, playing with a cat using a laser pointer (one of my favorite granted patents, not that I hold it, mind you; it's just out in the world) such that if you do that, you're infringing the patent of the patent holder. That is pretty far afield from any colloquial notion of ownership. Or, back in the copyright zone: to publicly perform a song covered by copyright is to infringe the copyright in the song. So if I were to sing some song to you all right now. It's tempting, but I'm a law-abiding citizen, and as somebody who's lawful good (at the risk of infringing Gary Gygax's copyright), if I'm lawful good I'd better not do it, because this would be a public performance, and the mere performance of that song would infringe copyright. It also explains the over-decade-long battle over the song "Happy Birthday" and whether performing it infringes copyright. It was last bought, I think, by Warner Music for 25 million dollars, in part to be able to license it if it's going to be sung in a movie or something.
It was later, I think, invalidated for one reason or another, so TGI Fridays no longer has to sing the special it's-not-happy-birthday happy birthday song.

Yes. And just to put a slightly technical and maybe Creative Commons hat on this: I think the way I usually think about it is that ownership is the ownership of the root thing upon which people build all these crazy rights. So, for instance, if you're in Europe, you have database rights, and you have all of these neighboring rights to copyright. So, speaking of the model release forms that we need all of you to sign: we have the copyright to the video, but we can't post it, because you have a model right, and unless you give us a model right, the copyright can't be used. And so rights are a very weird thing, especially when you have different rights in different jurisdictions. For instance, in Europe you have moral rights, which is the ability to determine how your work is used, whereas in the United States you don't have that. And the one physical example that I know (maybe I'm jumping the gun, and you can stop this story if you're going to talk about this later): one really interesting historical thing is that before commercial airlines, the property right that you had when you owned land went all the way up into the sky, so airlines had to negotiate flyover rights with landowners. And I think it was Congress (I'm pretty sure it was Congress) that made a determination that, in order to support the commercial airline industry, they would take away the rights above a certain altitude from these people. So they literally took away an asset that you owned and threw it into the commons, and said: this will support the airline industry. So this is often the argument that Larry Lessig and others would cite when we're talking about copyright, which is: for
the betterment of society and for the commons, we will take (and we have in the past taken) rights away from people in order to share them. Whereas we know from our recent examples that Hollywood has been more successful at preventing the commons from benefiting.

Yeah. The only thing I would say to that is, it's not clear to me that the use of the noun ownership is at all helpful for anything you just said. It's always more precise to simply say what particular rights or allowances are being made. And oh yeah, you think you own your house, and you thought that meant it went all the way up, but in fact that was just a stick that could be disconnected from the rest of the bundle. Ownership is not telling you much, again, unless it's so common, so understood, that it's a shorthand for a very distinct bundle with respect to a very distinct thing.

And I think the place where I often use it is in copyright and patents, because a lot of times you can own it but have no rights. So, for instance, I think in the US you can't transfer ownership of certain things, but you can transfer perpetual rights.

And yes, patent is particularly bizarre, because the inventors have a certain inalienable connection to the work, but for the most part, other than that, it's rare to see things inalienable. And in copyright you get into some really obscure rules, like the Rod Stewart rule (that's just my name for it), which says that after about 35 years, any contracts you signed
assigning bundles of sticks away from yourself, as the creator of a copyrighted work, you can claw back. It's called a termination of transfers provision, and the later it gets, the more attention you have to pay, because there's a distinct window: it's not 37 years, and it's not 34 years; you have to do it right in the window and terminate all those rights to relicense back to the music companies that thought they had cut a great deal when you were young.

And even in the history of that, there were predatory licenses, horrible ones, signed mostly by artists unable to read contracts.

Yes. And interestingly, the music industry ended up getting its revenge in 1998, when Congress, at its behest, passed a law that decreed that those original works that were the subject of the contracts, for which you could then terminate your license, were in fact works for hire, and therefore were never the property of the writer; the writer was working for the company, and the works belonged to the company at t equals zero. That was passed by Congress, and no member of Congress admitted to having any knowledge of the provision. And I don't think they were lying; I think it was a staffer who snuck it into the bill, and Congress just passed it. There was such an uproar over it that Congress repealed it a little bit later. And some of you will be that staffer, or, if you're really lucky, that musician.

So, anyway, this is a long way of saying: we have predicates of status quo, of danger, or of scarcity that might make us want to think about access and control over something, and if there's enough of it, if it becomes common enough, we can start to talk about who owns the thing. But those are the main reasons why we might concern ourselves with that. And it's always good, then, when we talk about allocating
a scarcity, to ask: what's the thing we're allocating? Is it a particular right with respect to something, the right to exclude, the right to make a copy? Especially as we go intangible, that can be quite important. If we choose a market as an allocative mechanism, then alienability is one of the basic things that permits the market to exist. That certainly represents a huge policy choice. It's saying: if this thing is scarce, the place it belongs (again, however we're going to define the rights with respect to it) is in the hands of the person willing to pay the most money for it. Or at least, whoever has it should be free to part with it under whatever circumstances that person wants, and we would typically expect that to be whatever garners the most money. If there are many players in a marketplace and there are low barriers to entry, the thought would be that if there is a big difference between what a seller would accept to part with something and what a buyer would pay in order to obtain it, then, with many players, any difference there will inure to the benefit of the buyer; that's called consumer surplus. You'll have competition among the many sellers, and the buyer gets to shop at the two different gas stations until they lower their price to just above their break-even point, and that's quite good. The biggest difference between a market with a lot of players, in this telling, classically, and one with, say, only one player, is that with one player, with no competition, you get to raise your price to the reservation point of the person most ready to buy, and the surplus goes to the seller. I think to a classical economist that doesn't matter; that's not a terrible thing. It doesn't matter where the surplus goes; transactions are still happening, they just might be going to
different people. But market allocation is itself only one way to allocate scarce resources. On April 14th, 1912, seats were scarce on lifeboats in the North Atlantic, and the folks on the Titanic, on the port side of the ship, chose to allocate the seats women and children first, while on the starboard side they allocated the lifeboats first come, first served. It was not a scheme in which whoever has the most money (which was quickly dwindling in value) is allowed to purchase a seat on the lifeboat. And we might think, regardless of which scheme you choose on lifeboat allocation, or a third one, that a market wouldn't make sense for something like that. And indeed, when we see surge pricing take place on an Uber (as I think I mentioned in the lecture on the opening weekend), there are times when we might find that acceptable: oh yeah, just let the highest bidder be able to get that seat for an Uber, or at least let the prices go up if there's a lot more demand, and that will cause more cars to flow onto the road. But you also might say: if there's some emergency and everybody wants out of town, it doesn't feel right that the rich people get to go first, and maybe it should just be first come, first served, or whatever other scheme. But again, it's a conscious choice that could be society's to make; it again gets back to a governance issue about allocating scarcity on terms other than what we might think of as choice by whoever currently holds the right.

I just want to kind of double-click on when you say conscious choice, right? And this may be for the next class that we're going to do on governance, but how, and who's in control? How is that conscious choice allocated? Because on the Titanic, somebody probably just made a decision: the person who happened to feel like they had the authority, right?

Yeah. And I guess that's going to be more covered
in the governance section, but you sort of just used the term conscious choice, and I think that's actually going to be a big part of this.

Yeah, it's both conscious and then whose choice, right? And of course the Titanic represents admiralty, which poses its own jurisdictional questions, though probably none of the law students have taken Admiralty. It's too bad, right? But it's interesting to think that Second Officer Lightoller has some authority, residual out of habit, even though he's just some guy in a uniform, and he gets to decide on his side of the ship. Also,
he's packing a sidearm. But at some point everybody's just like, screw this, we're going for the boats; it's every man for himself. You could see a breakdown in that. And with these new technologies, where there aren't defined forms of ownership, where we're doing hodgepodges of drawing upon certain notions of intellectual property and trying to make them apply, there's certainly no conscious governance yet about defining the landscape: what should be a market here, what should be abundant, what should be scarce, especially when we don't yet have a theory about these AI technologies, about whether we want them abundant or want them scarce, the threats we don't want abundant. I think it was Jack Clark who just came out with a paper about the instances in which AI forecasting might be put to ill use, and therefore maybe we shouldn't just let any old person do it. I don't know; Jack, you can tell me if I'm misrepresenting the paper.

That is a correct representation of a miserable conclusion.

Excellent. Glad to be the bearer of correct bad news. And that's from OpenAI. So it's wonderful. I mean, these are huge opportunities within this room to start thinking this stuff through, and at least laying it out, I think, will help us start to think about the toolkit we have, to be more analytic, to be less blob-like, as we think about the proper ethics and governance of the different technologies that we call AI.

Yes. To tie it back a little bit to the last class, where we were talking about algorithmic bias: that also requires the governance that we're going to talk about, but it's also interesting because when we talk about abundance or scarcity, it's multi-dimensional, right? That was Cathy's thing, where you have different stakeholders who benefit from different things, and a lot of times it's not just one-dimensional abundance; it's abundance
at a trade-off for something else. And so as you start to increase the complexity,
these become, I think, where we start to stretch the law, because even the word fair: fair for whom? You can't optimize for this fairness without giving up that fairness, and you might not be able to optimize for this scarcity without giving up that scarcity. And in a way, the copyright debate with Hollywood was exactly that. It was: we need to protect industry at the expense of the commons, and that industry is more important, and the trade-off of losing 98 or 99 percent of works to orphan works was a good price to pay for Hollywood being able to save Mickey Mouse and stay in business.

Well, and on that front, to put a slightly less valenced spin on it: a lot of legal doctrines around the control of and access to intangible stuff, what we normally think of, again, as intellectual property, are actually schemes that take a baseline of natural abundance (you can copy the thing fairly cheaply and share knowledge, as Thomas Jefferson said, by lighting my taper upon yours) and create scarcity for the purpose of future abundance. The idea is that people won't bother to have an idea unless they're given some legally defined and defended rights with respect to it, so that they can profit from it through creating a market in that idea as if it were scarce, and everybody has to act as if it's scarce. And I don't think that's a crazy idea, but it is one that carries its own costs and benefits, including in fairness, if you think of certain ideas that could be life-saving. In patent, there's a technique for doing something with an eyelid in order to do better eye surgery, and the surgeon patented it, and if you use that technique without the license, you're infringing the patent; you could be in big trouble. So it's a neat example of some of the trade-offs. And when I think of some of the doctrines, which
It might just be worth to lemon right now we can't do a mini course in each of them but I at least want to put them on this grid. Like table, here. Doctrines. Of copyright, of. Trade. Secret, of. Misappropriation. And of, patent, are each. In the zone not. Of dealing, for the most part with a dangerous. Thing. It's. Only in the weird corners, of patent where there's a little bit of review at the Patent Office that if something involves nuclear, the government, gets to review the patent application and just decree. It to be classified, and like, sorry you lose as the, patent applicant, but like talk about a weird, rare, exception. For, the most part it's creating, a scarcity, for the purpose of having, the. Innovation, take place at all and then for a limited time with. The idea being that after a mere 95, years it's, now free as the air to breathe for. People to, use and that's in the case of copyright. So. That that's one class, of form, of control. And access imposed. For a particular, utilitarian. Reason. That, we could draw from. The. Principles, of those to. Try. To define, with a shorthand control, and access over AI technologies. It's, just especially. If we're not, wanting.
to do this for the purpose of creating scarcity to in turn drive innovation, we may find that the doctrines aren't that helpful. Other forms of doctrine maybe are about enforced scarcity: privacy. If you really care about privacy and there's some form of privacy law, what is it, ultimately, but, with the help of the government, telling people or companies that they cannot utter certain things, they cannot move information from here to there, from one party to another, lest it impinge upon the privacy of the person to whom that information relates? Generally, that's an enforced scarcity, and not even one that would be, like copyright, solved by market tests; it just might be: no, you cannot use this data this way, at least not without the consent, singularly, of the person who was the source of the data or to whom the data relates. Privacy itself turns out to be a huge patchwork rather than a comprehensive schema, especially in the American context.

Yes. To be totally recursive about it, I think about the patent that Microsoft filed on the use of the Kinect camera to detect how many people were in the room for the purpose of charging differentially for the movie they're all about to watch, because if you have enough people in the room, it's a public performance. And even without it being a public performance, they can simply refuse to show the movie until a certain number of people, like clipping a coupon, are willing to hide behind the couch to watch it. Make no mistake, that's a great form of price discrimination, because people are like, ah, screw it, I don't want to sit behind the couch; I'll pay another 50 silver credits, or whatever it is that Microsoft's taking these days. And you're right that that is effected by the technology, and really what that is, is a gateway to something we haven't put on our list yet, and I don't even know if it really has a great name.
Digital rights management is what people sometimes think of it as, but of course that's already using the rights word, and the cool (or terrifying) part of DRM is that it need not map to any legal right. It's why Richard Stallman insists on calling it digital restrictions management. I tend to call it, in my own euphemistic way, technological complements to copyright. It's a form of using technology that may be pervasive for one reason (I got an Alexa not to enforce my laser pointer usage, but to be able to order replacement cats should I run low) and is instead now used for these other purposes. But I think what we see, especially when we start thinking about the AI landscape, are forms of protection, or assertion of control, and even what we might colloquially call ownership, through simple technological fact. And I think that fits into my original first category of status quo-ism. It's just: look, if you want a Kinect and you bought it, this is how you're going to have to watch movies these days, and your only choice there is not to buy it, not to have an Alexa. And if you're on a public street and they're using cameras on the street, you may not have the same kind of opt-out
that you would otherwise have. Joi, were you going to say something on that?

No, I just will tie it back to a conversation we had about autonomous vehicles and the German autonomous vehicle directive. It was really interesting how clearly they called out the notion that we're not going to let it just happen and then say, well, that's just the way it is. And I think the strategy of a lot of these companies, whether it's explicitly thought through or it's just their MO, is that you just kind of roll the stuff out and then say, oh well; move fast and break things. Yeah. And digital rights management, I think, is a really great example of the companies that do it not trying to innovate on any better way to do it; now, through the passage of laws, they say, well, we already have this, we'll just keep doing it this way, even though it's terrible for security audits, it breaks the internet, it makes content fall apart. And so I think the interesting thing is (and this ties to, I think, the lack of technical ability of the lawmakers) that they're not sophisticated enough to push back and say, well, could you think of a better way to do what you want to do? And I think this is partially why we have this class: so that lawmakers can call up their cryptographic friend that they met in this class and say, hey, is what Netflix is saying true?

Yeah. It's a huge question, and short answers include, you know, call your member of Congress; now stop SOPA; start this other thing; that kind of pressure on representatives, if you have a vision of the good, even if it conflicts with the financial interests of the people enjoying the status quo. The other thing is, you can find that the people enjoying the financial interest in this space quite typically see a sea change in their fortunes. The people who thought patents were quite good suddenly
Seen themselves on the defending, end against, what they call patent, trolls and now, the hunter becomes the hunted and it's, one reason why we've seen in the space some of the biggest software companies, that, were gung-ho on software patents, yeah. Not liking them so much anymore and there's. Some technological, ground. Shifts, to where as software becomes a service, rather, than a product it. May be less important. Who's. Running the code somewhere, else running. MediaWiki. Code not, as part of Wikipedia, does not quote. Compete, with Wikipedia, and Wikipedia isn't even caring about who's competing, with them they have a license that says go to town if you want and. Still end up the. Best at wiki oriented, services. So. I think we've seen sometimes, changes, in that and in the United States, as. Judges. Have grown. Have. Senior, doubt and newer judges, have come Hin you've. Seen some skepticism, over the scope of patent for software, you, just even in recent years so I don't I wouldn't call it a brittle. Or an ossified, I wouldn't call it an ossified, space there really is change. It's just often. Patents. On software, is too broad, really, a category, to. Define, when, we think it's helpful and when, it isn't well yeah, but I would point out I think, that for most of these things you can also use, the old follow, the money rule and, I, think the Patent Office still, runs.
the narrative that patents are filed by genius inventors in their garages, when in fact that's no longer the case; it's big companies. And to protect the genius inventors in the garages, it was originally set up so that it's very easy to file patents but very expensive to overturn them, and if you overturn them, you don't actually benefit that much. So you could do something as simple as making it cheap to overturn: say, if you're right, you get a reward, and the person who is overturned has to pay a penalty. Well, if you're a lone inventor, that would be risky, because big companies would come after you. But in fact, what's usually happening, especially with software patents, is that you've got big companies filing the patents and little startups being crushed, because they can't afford to go after these guys. And so I think it's about changing the incentives to fit the reality of society. And the problem is that lobbyists are hired by the people who have the money, and the people who have the money have no incentive to change the law.

We'd wryly call that a public choice problem. Why we put those two words together to describe that as the problem, I don't know, but it's a public choice problem. Now, we have Alex Stamos, the chief security officer of Facebook, joining us at approximately the top of the hour, which is only ten minutes away. He can, among other things, talk about the paper assigned for today, about Facebook's own inventory of the problem of propaganda, especially including automated propaganda (some of which may be complex enough to call it AI, but who knows), and some of the ways in which Facebook thinks defenses should be mounted against it, some by Facebook, some outside of Facebook. And to contextualize our discussion about ownership, control, and access: we're already thinking of Facebook as an environment that, at the moment at least, is so singular,
so totalizing for its zone, that what happens there has public implications. It's not just some random newspaper in some city that has plenty of competition; there's something about Facebook that feels singular. And because it's about the flow of ideas, and what I call agenda setting (what people see), if it goes, quote, wrong, however we want to define wrong, you could end up with a misinformed populace; you could end up with an election with a different result. There seems to be some interest in controlling at least anything we might identify as huge misuse of the platform from outside. And some of the stuff I'll be curious about, in light of our framing today, is: to what extent does Alex think of Facebook's powers here as conferring responsibility that is uniquely Facebook's? Are there other entities that he thinks ought to be taking responsibility for it? How much of it does Facebook not have? Maybe there's no responsibility; it just is what it is, and Facebook, for its own market reasons, might choose to intervene, but otherwise it's basically up to Facebook to decide. Or does he feel differently about that? It's a great example of people not yet having figured out who should own what in that field; I think we all tend to share the idea that it's really important. And if you haven't heard from Alex before, he's a bit of a raconteur,
But it'll be great for this group to kind of pin him down a little bit. Holly, were you going to say something?

I think we should absolutely talk about that; we may even find some time today to talk about it. Alex may find himself not that interested in talking about it; he will, in one way or another, say, I'm just the security guy. But the one thing I would do, maybe as with patents and software, is first get rid of the word ownership. Owning your data: I don't know what that means. When you say own your data, tell me the rights with respect to that data, which translate to restrictions on others about what they can do with it, and I think then we want to be quite refined about what we mean. The usual model for control of one's data, under the rubric typically of privacy, not of intellectual property, tends to be: first, we identify sensitive data versus non-sensitive data. If it's sensitive, then maybe the default rule is that somebody can't do something with it without your permission. If it's non-sensitive and they find it out one way or the other, they're free to do as they please, unless they come to some other arrangement with you, which maybe they would do if you are the one giving them the data.

Yeah, it's so awkward. You've kind of said it a few times, that currently things have evolved in this patchwork way and it's not a schematic structure, with privacy as an example. It feels like, with so many of the things we are talking about, we're cobbling together pre-existing notions of how things should happen to figure out what the solution is for something that never existed before. And with respect to data specifically, and I think this is relevant to other subject areas too, it's almost like right now the different companies who use your data as an individual have figured out how to make it most expedient for them: what do I have to do, what is sensitive, what is protected, what is a protected category, what is PII, personally identifiable information, that we can't share. But this notion that's coming forward in Europe, for example with the GDPR, is that it shouldn't be that I get all of your data until you tell me I can't; rather, the individual should have to opt in, as opposed to needing to opt out of someone using their data for financial gain.

So here are a few quick thoughts around that, and I think it's actually well timed to be talking about it here, because it is ownership, control, and access. First, it might help to think about what are the known unknowns and what are the unknown unknowns, and by that I mean: what are the perennial problems with privacy, to be unduly alliterative, that have haunted us from roughly 1972 and the rise of computerized mass storage, and the use from '72 or so onward of data banks for keeping track of who should get consumer credit, that kind of thing. For that, there are well-known issues that have to do with public choice: oh, it would probably be nice to protect this data, but the industry relies upon it, and consumer credit is good, and they're the ones who write the campaign checks. I would separate out the known issues that aren't distinct to the tightly coupled, autonomous, obscure, adaptive, complex technologies that we're loosely calling AI, and ask what might be different about AI in this zone, and I'll give you a quick example on that front. The normal model of privacy is that we know up front, typically, what's sensitive and what's not sensitive, and if it's sensitive, maybe we have a different default rule about whether it can move without permission, opt in versus opt out.
But with AI, you might be able to say that even the most innocuous things turn out to track other things. Inferences can be made that people can't make but machines can, which makes almost any datum potentially sensitive. The worry is that this now proves too much: does everything have to grind to a halt? And what does it mean for me to opt in at t equals zero, when I don't even know the implications of it? It might mean that thinking about the data as sensitive or not is not the right way to do it; it may have to do with use rather than with collection.

The other puzzle I've been trying to figure out, and I forget if I mentioned this before, is about the way that something like Facebook might be able to make really insightful and against-my-interests predictions about me: what my vulnerabilities are, when is the right minute to show me an ad for a payday loan, that kind of thing. Is it on the basis of a ton of superficial-seeming data about me that results in a deep dive, a portfolio of me? In that case maybe traditional privacy law can help, because I can decide I don't want you to do that with my data: it's my data, don't do that. If, on the other hand, through its two billion users it can know whether I'm in one of the fifteen basic categories of humanity, and once it's put me in box number thirteen it can make all sorts of insights about me, and all it needed to know was my stance on gun control, what topping I like on my pizza, and whether I use many words or few when I make an appointment to see somebody for a specific time or not, then I'm at a real disadvantage, because it's other people's sharing of data that's gleaning insights against me. How to solve that, I think, is a great example of an unknown unknown for which the normal tools that we have in privacy law don't apply at all.

Yeah, true. I want to, as usual, move it up one layer, because I do think this is actually a peculiar affliction of America: if it's legal and it's technically possible, it's probably okay. And I think the problem is, like that paper we read about the FICO score being used to target households, because it's not personally identifying information if you don't target an individual: clearly the people who are doing it know that it goes against the spirit of the law, and clearly there is harm. But there's something about the capitalist democracy that we have in America that, if you can kind of get away with it, people
don't really take them to task. I mean, people write about it and people get outraged, but not a lot happens. And it's not necessarily that this is better, but if you look, for instance, at Japan: even if it's not against the law, if somebody does something kind of bad, people take them to task. The Uber CEO thing was an example of people getting outraged. But I feel like the American tolerance for people doing bad things to make money is pretty high. And I feel like, with a lot of this stuff, as we start to sort through things like privacy, there are always going to be ways to stay ahead of the law. I think, with the law-based ethics system that we have, and this is why we have ethics in the name of this class, we do sometimes want to think about where the law is the appropriate tool, and where the law is always going to be so coarse that there will always be a way around it. I think that's true for data, because it's advancing so fast. There's a great document, a Snowden-leaked document from your agency, the NSA, about how they're getting so much data that they're not storing it; they're just training a model, and that model actually isn't data. So we're going after, okay, not metadata but this data, that data you can't store, but what happens with a model that's trained against data they don't even store? So there are always going to be technical advances a little bit ahead of where we are.

Right. But let me integrate what you're saying back into the different forms of market structures, because it may be that part of what works in Japan is a cultural difference, and it may be that part of what works in Japan is a difference in competition. If there are not low barriers to entry, if there are a few staid players and we know who they are and they kind of defend their turf, it's much easier to impose upon them an expectation of extralegal ethical behavior, because they won't be undercut by a startup. When you have really strong competition, and this is a way of restating what you're talking about with a capitalist liberal democracy, you'll simply have it be that those who have the ethics to withhold will withhold, and that
will create a vacuum for somebody else to come in and do it, and that in turn often creates a decision loop where somebody says, I might as well do it; if I don't, somebody else will. That might not be true in other places where you don't have competition, and a lack of competition is often, from a governmental point of view, an invitation to more effective regulation, because there's only a handful of places to apply the pressure. I think that might account in part for the difference in suasion ability.

I saw at least one of the bold questions, about fairness versus justice. That's a deep question. It's usually on our turf; tell me if I'm right, law students: fairness versus welfare, basically Kant versus Mill, thinking through when we should have an approach that's rights-based and fairness-based versus one that just says the greatest good for the greatest number. When I think of pitting fairness versus justice, I'm not sure I understand that lineup, but whoever asked about it might want to unpack it a little more. Yeah, Kathy.

Hi. This has actually been on my mind since the FAT/ML conference, because the topic of fairness versus justice came up there. It was interesting for me just because I think in tech we talk about fairness a lot, and in some cases probably mean justice as well, but fairness is the word that's used, and maybe that's because justice isn't in our vocabulary. Not that we've never heard the word justice; it's just that, in at least computer science or engineering circles, you don't talk about justice as much. And there were a couple of talks at FAT/ML that really got at this. There was, I guess, the now-famous quote, and I'm going to butcher it, and I'm sure the law students know what it is, but it's the case where someone was asked whether they'd been treated unfairly or something, and the gist was that this guy said: oh yeah, I've been beaten, so I feel like I've been treated unjustly; but then, everyone was beaten, so I was treated fairly. When it's told correctly it's funny, with a punchline.

I love how we can all just assume it was told the right way.

So, this idea of fairness versus justice, and just how to have that conversation around it.

That's interesting. I normally think of justice as something associated with the state: the sovereign has the ability to dispense justice, and can do so in a way that is just or unjust. Interpersonally, we talk about whether we're treating each other fairly. But that might be more a colloquial notion than one we see represented in the professional literature. I don't know, for the philosophers among us, if a big difference leaps to mind between those highly loaded, baggage-laden words. Looking at our resident philosopher. But yeah.

I mean, there's a very famous conception of justice that equates justice with fairness: that's Rawls. But he has a very particular idea of what fairness involves there. You're right that often we talk about questions of fairness versus welfare. I would say that philosophers wouldn't think of justice as only having to do with the state, although they would tend to think that justice is about questions of what we do collectively, and an analogous question individually would be about ethics. So
that would be the distinction we would draw there. But you're right about fairness. I take it that part of what goes on in tech circles is that people think there are various ways of quantifying whether something is fair, and there are sort of technical notions of fairness that might be in play. And then there are things in, you know, social cognition, where we look at non-human animals and they have some conception of something we are inclined to call fairness. And yes, all of these would be very contested.

Well, let me bring it back. I assume we're on standby, so that if Alex were to call on Skype we would know; is that right? And he has not called yet. That's fine. Bringing it back to the ownership, control, and access stuff: I'd like to ask us to think for a moment about other technologies, not AI, for which we have concerned ourselves, as a we, with ownership, control, and access. My thinking is that in doing so, we might be able to identify how we solved that problem, or chose to address it or take it on, in ways that we might borrow to see how to treat some of these problems in AI. So I'm just curious: anybody want to offer up something for which issues of ownership or control or access have seemed to be at the forefront? Yeah, Sarah.

This isn't quite a technology specifically, this is going back further, but something I've been thinking about is that, rather than everything having to come from one paradigm, like we own everything or we own nothing, most of these things as they've evolved in our culture meet somewhere in the middle. An example that comes to mind is around the rights to one's person, to one's physical body, which I guess one could argue is a technology, but not the way we usually talk about technologies. So specifically: a government has the right to ask wealthier people to pay taxes that help support the state, but it's not okay, for example, to harvest a person's organs to save ten people that are in need of organs, and it's not even possible in most contexts, or most countries, to sell one's own organs legally.
And so it feels like, and I don't know why this is exactly, the barrier of the skin, or the body, is often sort of the barrier between ownership and, I don't know if I'm articulating this very well, outside imposition of what can be done. I feel like in AI we might reach something similar, where it's not really wholly one or wholly the other but somewhere in the middle, and I don't know where that boundary would be, because it's a lot fuzzier.

It certainly seems like there's a status quo that one has control over oneself, and then danger can trump it: if you are a danger to yourself or to others in some way, there feels to be license, and there are legal frameworks with which to assert control that overrules that background idea. So that fits into how we start talking about ownership, control, and access around something dangerous, and then maybe about scarcity. I mean, organ donation is a classic vitally scarce, both literally and figuratively, resource, and it sits right at the tension between wanting to respect individual decisions after somebody dies, on the one hand, and on the other hand wanting to encourage as much organ donation as possible for utilitarian purposes. To me it actually connects back to the privacy stuff we were talking about, and the American paradigm versus others. The American paradigm is heavily choice-based: the more you can just say you're giving somebody a choice, whether it's opt in or opt out, you're done; they express their view, and now you're respecting their view when you take all their information, because you gave them a choice at the beginning about accepting or declining. And, that sounds like Alex Stamos, good timing, this is a classic Facebook-model sort of thing. And I've got to say, as you may have heard me say, I find choice to be quite thin: there are too many things to ask our opinions about, and often, if the choice is do you want to get screwed over or not: don't give me the choice, just don't screw me over. Organ
donation is the big exception to that in my mind, where we think reasonable people, or maybe unreasonable people, but people, can disagree, and we really want to respect their choice, and we don't know until we ask them how they're going to feel about it.

I was actually not even thinking about organ donation, but about the faults of utilitarianism, where you could say, well, we could sacrifice this one person to save these ten.

Yeah, maybe Facebook's doing that too. I mean, it's really getting beyond even one's own individual rights to one's body. It certainly gets into when data in bulk might be used to cure a disease, including bulk genetic data sitting in 23andMe; you get very much into that utilitarian territory. I don't know if that's a fairness-versus-justice or fairness-versus-welfare, probably fairness-versus-welfare, kind of model. There's Alex. Before we turn to Alex: hi, Alex, how are you doing? Go ahead, good to see you. Other things I'd put on the table real quick as technologies for which maybe we've worried about ownership, control, and access could include the internet itself, for which we could talk about, you know, net neutrality versus open access versus municipal broadband versus totally private networks, all of which have their application in: oh, there's scarcity, how do we allocate it, do we use a market, do we use some other means? Mass media: we worry about concentration of it, and therefore impose public obligations upon license holders of broadcast media, for example, to have to include a certain amount of nutritious programming amidst their junk. And knowledge itself: when do we think that knowledge is something to be metered out, or something to which there's kind of a basic right?

And yes, to tie a little bit to this: in medicine there's an interesting way to think about consent versus duty of care, and I think that's kind of a first-order versus second-order distinction. But I think the problem I see a lot of times in the American system is that you have these, you know, clickwrap consents where people aren't informed. In medicine you're a little bit more strict about what consent is, and you also have very strict adherence to duty of care. I feel like we're missing that right now in the digital world, and maybe there's a way to learn a little bit from how we think about it in medicine and apply it to places like Facebook.