But if you think about the way our media and information ecosystem is wired at the moment, there's an enormous investment that we're going to need to make together, in order not only to have good lives and livelihoods in this industry here in New York, but also to make sure that ecosystem is future-proof, set for tomorrow, and will get us all where we want to go together. So in order to talk about that a little bit more, and to set up this idea of the Media 2030 discussion, I've put together a panel, and I'm going to invite my panelists to come on out. I'm going to grab my notes while we do it, and you can welcome them to the stage, if you will. I wanted to have just a little bit of a chat about where we see things going. I've sent these folks some questions in advance, and I hope they've had a moment to think about them. But you know, the general frame for this was: let's think about the decade ahead, let's think about the trajectories. If we're in this room together in ten years, what are going to be the things that we look back on that we accomplished? What are going to be the things that are still challenges? What are going to be the research questions in the universities? What are going to be the problems and the opportunities in industry? How are we going to advance this question? So I've got a great group of people here to do this. To my left here, I want to welcome Yaël Eisenstat, thank you very much. You've got bios in your app, of course, but she's a policy advisor at the Center for Humane Technology, and I think she's going to tell you maybe some other affiliations that are notable in her past. She'll get on to that; I'd recommend you Google her quickly if you want the full context. Just to her left, Dr. Desmond Patton, who's an associate professor of social work at Columbia University and founding
director of, I think, one of the most interesting labs in New York City, SAFElab. So if you don't know it, I want you to look it up and acquaint yourself with Desmond's research; I think it's very important. And to the left of him, you've got Tony Parisi, who has got an incredible title at a company, Unity Technologies. How many of you know Unity? All right, still some work to do; tell everybody what it's about. But he said the game engine is Unity, and he's running global ad innovation at their enormous company, which went... are you going to go public, Tony? You can't tell us that, can you? All right. So here's the thing: I think these game engines are a sort of early signal of where all media is going, not just AR, VR, and games. We talked about the media-as-data thing earlier, so an incredibly important perspective, I believe. And then Luke DuBois, my partner in crime at the NYU Tandon School of Engineering on a variety of different projects, co-director of the Integrated Digital Media program, and, you know, a professor of engineering there, with a music PhD, which maybe he'll get onto in a little while. So this panel, we're going to try to cast their minds a little bit forward and think about the decade ahead. And I want to put just a simple question to them at first, which is, going down the line: what do you think is the most important technology trend, how would you frame the most important technology trend that's going to shape things over the next decade? And I'm going to start with Tony Parisi, because he's right here.
So, as Justin alluded to, I'm into spatial computing, the interface going 3D. That is not the most important technology trend. It's a very important one, but it's AI: everything that's happening with artificial intelligence, the machine learning about our lives on a daily basis to, presumably, make our lives better, to help us. But it's also the same family of technologies that will be used for deepfakes and other things that can be incredibly scary going forward. Imagine when we can do deepfakes in real time, for example. So everything we're doing with artificial intelligence needs stewardship and a lot of thought at every moment in time going forward.

Desmond, is it AI for you too? Yeah, what's interesting for me is that I'm less interested in the technology that we will have and more in the questions that we ask of these technologies. As someone that leverages AI to study gun violence, what I'm most concerned about is the surveillance state that may be caused by the use of AI. And so I think that we have to think very critically about the extent to which we bring in diverse voices and opinions around how we create and ask questions of these technologies in the future.

And we are going to get onto that, but I'll ask Luke and Yaël: on the technology trend, is it AI as well for you, Luke? Yeah, but I think I would qualify it in the same way that Desmond did. AI concerns me, but what concerns me more is the idea of a future where not everybody has access to engage with the technology on a level playing field. And so when I think of Media 2030, I think of, you know, we have this opportunity to set the tone for a decade of equity, or a better future where we can really talk about who gets to put content out there, how that content gets capped, all that kind of stuff. And yeah, AI definitely keeps me up at night, but people,
I know, my students, you know, that's what gets me up in the morning: the idea of their voices getting heard in that conversation. AI doesn't get you up in the morning, it keeps you up at night, so you never sleep? I never sleep; I have a child.

To bring this home, continuing the somewhat downer of a trend here: for me it's not about any particular technology, it's the removal of human critical thinking and human analysis from the future of all of these things that concerns me, and an over-reliance on data without human intelligence.

Okay. So let's take stock of what they've said: AI, of course, and what that will do to the mechanism not only of collecting data and information and media, which may well look like surveillance, but also of generating it; real-time deepfakes is the example here. Look, obviously there are lots of questions about the health of media as it exists; it feels like we're on a precipice of some kind. Maybe we always feel like that, I don't know. But if we're going to make progress, what are the questions we're going to have to answer over the next decade? What are the hypotheses in your lab, Desmond?

Well, who gets to be human is the thing that I think about a lot. In my work I've been studying young Black and Latinx young people and their experiences on social media, and we messed up. We were led by narratives about the experiences of young Black youth, and totally missed how they are seeking help on social media, how they are experiencing trauma and grief and loss, the way in which they're trying to get support, and how that communication changes over time. But because we were framing it in a way that was punitive and negative, we missed complete narratives and we dehumanized them. And so I think for us, we want to make sure we bring in young people to help us unpack these things
and ideas, and we want to make sure that their thoughts and opinions are at the forefront, at the center, of how we are doing that work. Yeah. What are the questions we're going to have to answer? Yeah, the questions we ask. I know you're concerned in particular about the global implications of social media, and maybe you'll mention, you know, why, and what your background is with regard to that. Sure, with that little prompt to my background: I spent most of my career in the national security and foreign affairs worlds in government, and then went to Facebook as their head of global elections integrity ops, and walked out in under six months. So that's the context for my statement.
I would say, I think, one of the things for the future, to get to a healthier media state, to the point where we can actually both trust media again and start having the kinds of media that we want for a healthier global society: I think three things have to change. Being first, being fast, and being free cannot be a sustainable model if we want a healthy media environment. And unfortunately, I know that everybody wants all their information for free now, but if you want actual journalism you can trust, you can't just expect it to be free. And so it's not just the free business model; it's that we are forcing journalists to have to be first and fast, and not necessarily accurate, and clickbaity, in order to get the reach that they need on the social media platforms. So, I promise I didn't pre-think this, and I realize there are three F's in there, so that might be my new tagline: first, free, and fast is not sustainable.

So: the problem of who gets to be human, the three F's, and the problem of how we capitalize the search for facts. Luke, what are the questions we have to answer in the next decade?

Well, I mean, I'm going to follow on what Desmond said a little bit. I feel like technology has really depersonalized us in so many ways, and clearly people of color and underprivileged, underserved communities suffer from this far worse than most. But in general, everybody: I feel like we're living in Westworld at this point; we are living in this place where we've forgotten that we're human. I think that what's going on with gun violence in this country, in addition to being bound up in social inequality and racism and everything else, is also a function of the fact that I don't think we're thinking of those people as real people, as a society. So I think going forward, really understanding what it means to be human and, you
know, reconnecting with that, is probably the most important thing, because we're just so bound up in our media and we're so, you know, responding to the clicks. And so I think that's going to be the key question going forward. Yeah, and you can throw a lot of... there's a lot of blame to go around for how that happened. You know, one of the things that we research is the use of data visualization in everyday discourse, right? So when it went from a fairly anodyne context, where, like, you know, we're in the boardroom and I'm showing how profits are going up, and that's all fine, or I'm using it in scientific research, to: it's on the front page of a paper and I'm using a bar graph instead of a photograph, right? So that choice anesthetizes, right?
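A minimal sketch of that "people as numbers" choice, in Python; the names, boroughs, and counts below are invented purely for illustration and stand in for no real dataset.

```python
# Toy sketch of the datafication step described above: the same
# underlying records, rendered either as people or as a bar-graph-ready
# aggregate. All values are invented for illustration.

from collections import Counter

records = [
    {"name": "A. Rivera", "borough": "Bronx"},
    {"name": "J. Chen",   "borough": "Queens"},
    {"name": "M. Okafor", "borough": "Bronx"},
]

# Rendering choice 1: people, with names attached.
as_people = [r["name"] for r in records]

# Rendering choice 2: numbers, ready for a bar chart; the names are gone.
as_numbers = Counter(r["borough"] for r in records)

print(as_people)   # ['A. Rivera', 'J. Chen', 'M. Okafor']
print(as_numbers)  # Counter({'Bronx': 2, 'Queens': 1})
```

The aggregate is exactly what a bar graph consumes; the names never make it into the chart, and that is an editorial decision, not a technical necessity.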
And that choice is actually, like, somebody made an editorial decision to treat people like numbers, right? Or to represent people as numbers rather than people. And that kind of stuff is really pervasive. A lot of the agita that we have as a society about AI is actually, you know, symptomatic of that, right? The cause is really the data, the datafication of us, and the symptom is, you know, sort of the world of AI.

I have a prop, and I thought I'd show you guys. This is a thing called a Cracklebox. It's an electronic music instrument developed by some friends of mine in the Netherlands, and the circuit inside of it is from a polygraph. And so the thing that's kind of fun about it is, you know, this is one of these "technology is never neutral" things, right? Depending on context, I get to turn this into a weird electronic children's toy, but the same signal is the thing that tells me whether I'm lying, in a certain machine context. And I think that we have a lot of that around these days, and we don't think enough about it. Fun fact: does anyone know who invented the polygraph? The creator of Wonder Woman, William Moulton Marston. The smart guy, yeah. Golden lasso, I know that, yeah. I could go on about this. Also trying to lighten it up a little bit, yeah.

Well, since we've lightened it up, let's go back to disinformation in elections, just for a moment. It's kind of incredible what's happening right now; it's almost like disinformation is eating itself, in a way. We've seen the president apparently put himself in an impeachable position because he's bought a disinformation story and pursued it right in office. We have a report out today from the Oxford project on computational propaganda on the Internet. I haven't even seen it yet, but it said 70 nations are engaged in... well, there's evidence that 70 nations are engaged in social media manipulation, either targeted
At foreign, parties. Or at. Their own you, know citizens, this. Is you, know huge, numbers that kind of spiking, up at the moment one of the amazing tables, that I was looking at in that report was. The number of governments that are now investing, millions, and millions of dollars hiring. Hundreds of thousands of people to do. Basically. Information, warfare. And. That strikes, me is extraordinarily concerning. And. I think you're concerned about that as well what. Impact does that have and. You, know to the dystopia, thing are. We on the beginnings, of addressing. That or is, it still gonna get a lot worse over the next decade okay. So I'm gonna try not to take three, hours Ranvir. Which normally I would not do it. First. And foremost propaganda, information, warfare none, of it is new know. The. Difference right now is, the. Easin scale. At which this, can, be spread so. What. The. Biggest concern to me yes propaganda, information, warfare it's, here, it's, not going anywhere, what. I look at I mean if we want to back up and use of course Russia's example. In our last election, to. Me the biggest question wasn't even did the Russians hack our elections the biggest questions wasn't, what did Russia do on Facebook I mean those were all important questions my, question, was why were we so easily manipulated. And so easily persuaded why, was it so easy to get us and. There's. Lots of reasons but since we are here talking about the future of media I want to talk about the social, media angle of and it's. Listen, I get, it at the country of free speech everybody, has the right to say whatever they want apparently, I, have, some thoughts on that but anyway. So it's a country of free speech but. As they. Say often at the, Center for humane tech freedom, of speech should not equal, freedom of reach and it. Is the fact that you have these platforms, who are deciding. What, they will amplify deciding. How they will curate content who. 
Have these algorithms who, I love, how, they. Love to say that algorithms, are a moral or algorithms, don't have politics. Algorithms. Are gonna figure this all out for us and that human bias isn't put into news well, sometimes I actually want a little bit of human bias to actually, look at something and make an assessment on it these. Algorithms are. 100, percent program to keep your eyes on their screen it's all about user engagement and so the, most salacious content, wins the, clickbait wins it, is whatever, it, is that is gonna keep your eyes on that screen, and that.
Is Where, it ends up breaking down our, ability to reason it's what breaks down our ability to to. Decide. What is fact what it's fiction if, I need one quick little kind, of meta example. Because. What brought me into the space to begin with was the breakdown of our civil discourse and exploring, why this is happening, I think the breakdown of civil discourse is one of the biggest existential, threats to, our democracy, and, I've, written enough about that if you're bored you can look that up later. But. Just a meta example, because good. Journalism, can't win in this environment, there. Was an interview on about me about, a month ago and. When. They put out the piece they put it out with the super. Salacious, click Beatty title, which if you know anything about me you understand, I think it is exactly what I talk about as being one of the problems and it was Facebook, knows more about you than the CIA yes. I'm a former CIA officer if. I didn't mention that hey. That's not what I said, be it's super click Beatty and see it has kind, of nothing to do with what we were talking about in the article and. Yet. That. Piece. Spread, like wildfire, cuz everyone's like oh it's Facebook, it says CIA this is super silesius, I guarantee. You most people didn't read it but everybody, spread it so. To me the biggest issue here is as, long. As these. Platforms. Hold no responsibility or, accountability for. The way they're curating, your content, and what, they are deciding, you will or won't see this, won't change Desmond. How does that correspond with your yeah. I think so. Yes everything, that you've said and then I think what I'm most concerned around is, kind of the, privileged. Way in which we get to discuss these topics and so the language that we use around disinformation, and, deep faith and cheap fakes are terms. That are. Not widely, dispersed. Publicly. Or understood, but. If you really want to understand disinformation, talk. 
to Black and brown folks. I think there's a deep understanding of the ways in which information has impacted marginalized communities historically, and yet the language that we use does not allow us to enter into those conversations in a way that actually brings everyone to the table. So I also think that as we engage in these conversations, we need to think about a more publicly accessible language that allows everyone to contribute as well.

Let me ask you guys, and I'm going to point this maybe a little bit towards the idea of change in the decade ahead. You know, the great thing with the Media Lab is I've got this kind of funny perch where I get to see what's going on in the universities, but also in the school system, and in the companies, et cetera, and so that for me is a great education. But I also just see so many different opportunities to change the way that those various systems work, in
order to address these problems. What do you think we have to do? If you had a magic wand and you could address the institutions of New York, the schools of New York, the society more broadly, what would you change first? Desmond, what would you change first? I'll let you take that one on while the others think about it.

Yeah, well, let's keep it immediate. As co-director, yeah, of our summer programs: we have young people, Black and brown folks from New York City and New Jersey, come spend three weeks with us over the summer. And we initially thought that we were going to be doing the training; we actually learned an immense amount from their lived experience. And so, for example, we learned that a lot of young people who may live in NYCHA housing are being impacted by facial recognition systems already. Already. And they have deep concerns about that and want to do something about it. That was not built into our curriculum; had we not had young people, and valued their voices, we wouldn't even have thought about these implications. So I think that we have to, again, invite a more diverse, younger set of people to be a part of developing the questions, the implementation, and the deployment of these technologies. So I think that's where I would like to start.

Okay. Start with you, Tony? But I want to try and tie these last two threads together, because I was going... I like where... yeah, all right, let's go. I mean, it's actually, I think, relevant to how we might change education. I've been thinking about this a lot in the wake of recent political events, and the attack on journalism, and this sort of fall of journalism that, you know, we were talking about. I think we're fundamentally in a place where we're living in Andy Warhol's world, actually. Well, you know, we have celebrity; we no longer have leadership and judgment. I feel like we're rudderless, whether it's in government or, you know... I mean, the golden age of news was all about
having someone curate that stuff. It was Murrow and Cronkite and those people, with a point of view. We've lost all that, you know. And part of that is we do want things to be more open and democratic, and that's okay. You know, we'd all love the Electoral College to go away, we think, especially in a blue state now, because, you know, it kind of sucks, right? But at the same time, there was a certain logic to the idea that there needs to be leadership and judgment, and potentially, you know, protections against rule of the mob. I think we're heading headlong into rule of the mob, and I think we need to bring back some judgment. And I think, from an education standpoint, what we could do in the education systems is encourage that kind of leadership. So I think it's less about media and education, and more about public policy and education in these other areas. I'm sorry, that was a bit of a ramble, but that's how I'm feeling today.

Yeah, no, I'd agree. I mean, I think media literacy is a big part of it. I think, you know, the thing that concerns me is, with all these new technologies that are coming online, who's getting left behind in the conversation about these new technologies? So, like, when I think about AI: yeah, AI is really scary and has a lot of bias and whatever, but one way to improve the signal-to-noise ratio on it is to make sure everyone's informed on what it is and what it isn't. You could say the same thing with, you
know, like, innovations in wireless networking, right? So, like, what is 5G going to do, right? What is better security going to do? All these things you can sort of point at and say: what we really need is just a much better, inclusive conversation about how everybody gets to participate in this. And so there's a lot of, you know... this kind of reflects upon my experience, sort of, in the arts. And when you think about diversity and inclusion issues in the arts, one of the things that's often pointed out is that the real issue is the gatekeeping, not the artists. There are lots and lots of artists from all backgrounds who make excellent work, and then the curatorial system for allowing whose voices get through is incredibly biased, right? So that's the thing you always sort of whittle away at, right? An example from art: so arts organizations are now starting to sort of embrace this; they're wondering, we start saying, you know, we don't just have artists-in-residence, we want to have curatorial fellows here, we're looking at equity, right? Because it's one step up, right? And so when I think about, you know, mixed reality, or when I think about the way the entertainment experiences of the future are going to happen, or when I think about citizen science or citizen journalism or all that kind of stuff, I always think about, like, one step up: who's actually empowering that voice to get out there, and how do we change their ethic to be looking for a deeper catchment of people? Like, I think that's one more positive angle on how to improve the signal-to-noise ratio, and to say, you know, it's not just better curation; it's better in a very specific sense of the word, right? And that's, you know, a thing that we could probably write about in this Media 2030 effort and get people to sign off on. And get some... Yeah, I do want to... yeah, absolutely, please. Two
quick thoughts on this one, one going back to the last question, then quickly answering the question you just asked. Yeah, I've been doing a little experiment on Twitter lately; I don't use Twitter that often. When I write really wonky things, like yesterday I tweeted out sort of "what does the whistleblower really mean, from someone who served in the intel community," it was super wonky, there was nothing salacious in it, and I didn't, like, throw out conspiracy theories. And then I compare that with, like, the one time I might have said something salacious, and I posted them both at the exact same time, both during big media cycles. The more salacious one: thousands of engagements, because you can check your engagement. Yesterday's tweet: seen by almost nobody. And so Twitter has decided who is interesting enough to amplify. Apparently someone who's actually served in that world and has really, like, fact-based ideas about what it means: not cool. The young, hot celebrity with some salacious comment: that person gets way more engagement. So that's one of them, and I have lots of policy ideas on that; for me the biggest thing that I focus on is how to get our government to step up and do the right thing.

But to answer your question, I would like to see a more interdisciplinary approach in our education system, especially for future technologists. I think so many young people are being told they have to learn to code, they have to learn to code, they have to learn to code, and my concern is, well, I mean, beyond wanting the computers of the future: how about learning how to negotiate, how to have critical thinking, how to speak to humans? I would like to see, I think... I mean, this is part of why I try to engage with a lot of students, and might be teaching next year. I would like to see people who are not technologists
teach more courses on things like the unintended consequences of tech; things like teaching the future engineers, the future data scientists,
how to have them, yeah... I was listening to a podcast once; it might have been Scott Galloway who made the comment, someone made the comment: imagine if Mark Zuckerberg hadn't dropped out of college and had maybe taken a few courses on what the world looks like. I don't know if that would have changed much, but that's what I'd like to see: a little more well-rounded education.

Well, I know one of the programs that the NYC Media Lab helps to run is a class called Tech, Media & Democracy, with about a hundred students from five universities: Cornell Tech, Columbia, NYU, CUNY, and The New School. And we bring them together from disciplines ranging from media studies and journalism to design to engineering and computer science. And it's just incredible to see the frictions that arise between the engineers and the journalists, you know: the engineers who immediately think "we're going to build something and solve this," and the journalists who say "maybe we built too many things already, and maybe we shouldn't add some stuff on top of that other stuff," right? Yeah. I have a friend who teaches ethics at Harvard who has Mark Zuckerberg on the roster for his fall class, but he never showed up. He will occasionally post on Facebook and be like, "you know, if you'd just showed up to my class..."

We've got a lot of work to do. We've got to address our education system, we've got to make change in regulation and government, we've got to think about inclusiveness and who's at the table. What should industry do? Tony, I'll put you on the spot on that, as the industry representative at the moment. In what regard? In terms of changing the trajectory of this: is there a role for business leaders to make a different play? Well, yeah, of course, but I mean, every company's going to do what they do, and everyone's under different constraints.
If they're public companies, they have to behave a certain way, because they're, you know, effectively 100% focused on the bottom line. If they're not public yet, they have a little more latitude. Like, Unity is not a public company; we have very strong leadership who's got very progressive values and tries to give back and do a lot of good in the world,
and, you know, support young creators, and do all the things we do. I don't know if I could recommend a general plan for the entirety of the computer industry. You know, we've got politicians coming in saying, why don't you break up the big tech giants; there's a lot there. I think it's mostly about engagement. I mean, the companies that will listen, and respond, and work with the folks that actually have to think about it, you know, they think about it on a daily basis to try to make the world better. I think it's that interface. You know, there's always going to be an interplay between the profit motive and the other things we need to do as a society, and so I think it's mostly about engagement and listening when it comes to industry.

So, these problems we have, look, these are deep, deep issues; they're big problems, and they're exposed in a way right now, I think, that they haven't been in the past. But I did ask one last question in my email, and we've only got a couple of minutes, so I'd love to go down the line: what gives you hope, maybe, about 2030? Against the background of so much challenge: climate change, soaring inequality, all the issues, the polarization. We know we're going to face these things, but what gives you hope? Former CIA officer: what gives you hope?

Funny, someone once introduced me as the most optimistic person they know, and I was like, that is not a way anyone has ever introduced me. And they said, you wouldn't keep working on this stuff so passionately if you didn't believe that it could get better. So what gives me hope is the fact that more and more people will become aware of the way we are being manipulated, of the effects that these things are having on us, and, wow, I'm going to sound like the old lady here, that the next generation is not going to fall for it, and not going to accept the idea that a public CEO's
only responsibility is a fiduciary responsibility to a shareholder; that societal issues matter more. My hope is that the next generation will change, honestly, the biggest part: the way our incentive structures work for business right now.

There is a rising cadre of Black and brown scholars in critical digital studies who are wrecking shop and putting us to task on how we develop these technologies. Ruha Benjamin, Safiya Noble, Mutale Nkonde, André Brock, and Meredith Broussard are just a few of those folks who are really bringing forth important critical theoretical apparatus to help us do this work better, so please look out for them, read their books, and leverage them in your development.

Okay. I'm going to sound old too, but it does come back to the youth, for a couple of reasons. One is they are not wizened and jaded; you know, they have not been beaten on by life enough, so the ideas are fresh.
And so, you know, I see my son, who has just started college at Emerson, and, you know, the whole world is before him: great ideas, you know, there's going to be a voice for those. But there are also young people who don't want to be shot in their schools. They are really being, you know... they have an existential threat in front of them, and that is going to actually engender quite a bit of creativity and industriousness on their part, to make sure we shape the world in a different way going forward. Shot in their schools, or strangled by their own atmosphere, right? I mean, that's the thing, right? Meanwhile, we also broke the planet. And there's that, right? And so, you know, the sort of existential emergency around that is going to cause a lot of these conversations, in this political moment and the way political actors have been behaving in this political moment, to feel alarmingly petty, right? And I think there's a lot to be said for, you know... the American political right sort of co-opted this term "values" a few decades ago, right? Like "values voters," right? Or whatever. But if you think about values: one of the things that I think is really positive, about especially my students today, who I hang out with, is they curate their own world based on who is coming at them with information that embraces things like sustainability and critical discourse and inclusion, right? And the more you can kind of amplify those values, right, and say those are the things that actually make something worth listening to, worth paying attention to, then, yeah, you might actually get somewhere.

Yeah. Well, you all have helped us set some of the key themes today, which we're going to expand on. Sabrina Roche is over there; Sabrina, wave. She's going to be helping the Media Lab put together this
Media 2030 project, so look for her outside if you can. And over the next decade we're going to hope to continue to have this conversation, draw folks together, explore what the right hypotheses and questions are, think about the trends and issues, and hopefully come back to you a year from today, you know, maybe not with answers, but with some ideas about things we could do together over the next decade that might lead us to some conclusions, at least in that timeframe. Because I know these are big questions, and they don't get solved, necessarily, certainly not just in a workshop, and maybe not even in a decade. But I want to thank you all for being with us today, and give them a round of applause. Thank you all so much for joining.
2019-10-05