Must-watch: Future of Social Media. Futurist Keynote Speaker Gerd Leonhard's Keynote, NEC Forum 21 (PT)


Hello everybody, bom dia, bonjour à tous, buenas tardes, grüezi, guten Tag. It's a great pleasure to be with you today for the 27th Forum of National Ethics Councils (NEC) in beautiful Lisbon, and of course virtually. I'm here today to speak to you about the future of social media. I've had a great deal of experience with social media myself over the last decade, so I have some ideas about how we can improve it.

Basically, what is happening is that social media has become an algorithmic medium. There are lots of good things about algorithms, of course, but media is something human; it is a really important human way of communicating. It's a bit like food: of course we could eat other food, but we like real food. So this is a real problem, I think, when we consider what is happening in media and how that is changing our environment and our society.

Clearly we have to start with one question about the future. People talk to me about this a lot: they want to know what the future will bring. But that is the wrong question, because literally, technically speaking, in ten years the future is capable of bringing pretty much anything. What we want is to figure out what we want our future to be. What kind of future do you want for your kids? That is the important question for social media, and for media in general, because when it is about our future, it is about government and governance, about politics and policy, about how we hold together, about so many basic human things. So the question is what we want from social media, exactly what kind of output we are going to define, and whether that output is going to provide human flourishing.

Years ago we had concerns about automation; this is Charlie Chaplin, of course, from a long time ago. Now we are worried about a new kind of automation: technological unemployment, computers taking over everything. Many of us think of that as the end of work. I'm not sure about that, but basically what is happening is that algorithms, machines, robotics and smart, intelligent software are leading us down a path of potential dehumanization, taking out the human element because it is seen as being in the way. My friend Douglas Rushkoff talks about this a lot when he talks about Team Human: as he puts it, they are basically treating us as if we are the problem; the system would be great if it wasn't for the humans. In Silicon Valley, where I lived for a long time, a lot of people essentially say that the real problem is that there are humans in the way: we have to get the humans out and build systems that work with just the data. That is exactly what we have gotten from social media so far, from Facebook, Twitter and many others that I also use a lot, if only to experiment with them.

Here is the challenge. First we have the climate change challenge, which is a result of pollution, and now we have a kind of technological pollution: human change, as technology changes who we are, what we think, what we choose, what we buy and how we react. As technology becomes absolutely everywhere and unavoidable, we are facing a double challenge: climate change and human change, the change in the way we communicate and talk to each other.
For me it is clear that the externalities and unintended consequences of unlimited, unmitigated exponential growth and progress will surpass the problems we have with climate change and the fossil fuel economy. The side effects of all these things will be bigger than the side effects of oil and gas, because they are going to change us as humans: our perception of ourselves, our democracy, our society and our immediate future over the next 20 years.

In many ways you could say that social media has a lot of parallels with the oil and gas industry and fossil fuels, which are now coming to an end, with the end of oil in the next decade. Social media has the same basic components. There is lots of data, which has become the new oil, and that data is wanted by everybody to drive their cars, so to speak. But the externality, the change in how people think, is treated as basically somebody else's problem. That sounds exactly like Shell or ExxonMobil some time ago, when we talked about what was happening with climate change: that's not our problem, because people want to drive cars. That is why we have five trillion dollars in fossil fuel subsidies for oil and gas, so we can drive more cheaply. What we have now is very much the same: we use technology to extract data and then we sell it, and all the other consequences, well, that's not our business; let the state worry about that, or some civil rights group, or somebody else. That cannot continue; it is utterly unsustainable, especially because we are moving into a future where artificial intelligence will follow the same path. Right now it is about intelligent machines and digital assistants, more like IA, intelligent assistance, and then we are heading towards an artificial general intelligence of machines; that is 10, 15, 20 years away, maybe 30, for what is called the singularity. The social media dilemma is just the very first example of what is happening here, and it shows that things can go seriously wrong if we don't ask why we are doing this, who is doing it, and how we can control it. It is only the first iteration: pick your social media challenges, artificial intelligence challenges, human genome editing challenges, and on the story goes. We need to get this right now, because it will set the precedent for the future.

We are going into a future like this. By 2030, artificial intelligence, machine learning, deep learning, quantum computing and language processing will mean that pretty much anything becomes possible, and machines will rival the logical capabilities of humans. Now imagine that moment coming nearer and nearer, and we let machines run the show. What kind of results are we going to get? Potentially some very good, very smart ones, and then, basically, things being out of control: taking the human out of the loop, living in a society that is essentially an algorithm. That is not something we would want, especially not here in Europe, I don't think, because we can also count on the very fact that technology will always be abused, like this technology in a self-driving car where people are sleeping while they are driving. That could definitely have a very bad ending. If we look at social and general media based on algorithms like this, we are going to have a lot of accidents, and we are going to use technology for what it wasn't intended, because technology is changing our world exponentially.

It is very hard for us to understand Moore's law, Metcalfe's law, the law of networks; you have seen all that stuff, but basically it means the future is changing explosively. Look at this one: exponential cost reductions all heading towards zero, whether it is genome editing, solar, batteries or data storage. We are living in a world that is vastly different; it is not changing step by step, and this is very hard for us to get here in Europe. It is not four, five, six, seven; it is four, eight, sixteen, thirty-two. It is leaping. If we get it right now, we can make a good future from what we have with technology, or it can just leap away, and then we are stuck dealing with the consequences.
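A minimal sketch in Python of the linear-versus-doubling contrast just described; the starting value of 4 and the ten-step horizon are arbitrary assumptions chosen only to illustrate the point:

```python
# Illustrative sketch: step-by-step ("four, five, six, seven") versus
# doubling ("four, eight, sixteen, thirty-two") growth.
# The starting value of 4 and the 10 steps are assumptions for illustration.

start, steps = 4, 10

linear = [start + i for i in range(steps)]           # changes step by step
doubling = [start * 2 ** i for i in range(steps)]    # "leaping" change

for step, (lin, dbl) in enumerate(zip(linear, doubling)):
    print(f"step {step}: linear={lin:>4}  doubling={dbl:>6}")

# After ten steps the linear path has reached 13, while the doubling path
# has reached 2048 -- the gap that makes exponential change hard to intuit.
```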
I really believe that the more we connect with technology, the more we must protect what makes us human. It sounds like an oxymoron, how can you have both, but this is what we always do with technology. It is not technology's fault that it can be too powerful; we need to find rules around it. As Tim Cook, the CEO of Apple, likes to say, technology can do great things, but it does not want to do great things; it doesn't want anything. So it is really important that if we want something, we set it forth and make the rules and the circumstances, because what technology is really doing is driving itself: an unsustainable and corrosive obsession with exponential growth, and the source of that is really Silicon Valley, and China to some degree. Algorithmic media is putting us in a box, selling our data and milking us for future profit. That is not always a bad thing, when we are used to it and we allow it, but in general we are going to have to go beyond this. That is not what we signed up for, literally, because technology can initially be a kind of present and then turn into a bomb very quickly, just like anything that is too much of a good thing. Think about cigarettes, alcohol, food, whatever you want: in this direction clearly lies trouble, when we go beyond what is good and use something so much that it becomes a crutch for our lives. Too much of a good thing can be a very bad thing, especially in technology that deals with data, because, as I said years ago, data is the new oil, but now data is kind of the new plutonium: it can be used as a weapon, and it is being used as a weapon, and this is only the beginning. That is what we need to prevent, by figuring out how we can create transparent standards, identity standards and security measures for our citizens and for everybody else. That is why we are talking about ethics: these are the rules of engagement as we go forward into a world that looks at technology essentially as a religion, and as a substitute for human relationships, a world where you could say you have more relationships with your screen than with real people, or of course a future that substitutes virtual reality for our reality. That may be great for doctors, dentists, lawyers and policemen, but think about what that means for yourself and how it will change our social environment: definitely too much of a good thing, as you can see here with Facebook's Oculus Rift and the Infinite Office. We are looking at a future that is becoming possible, like Tom Cruise in Minority Report, pulling out the data and making it come alive in front of you. It could be a boon, especially in a time of working from home, fantastic; but on the other hand, let's not mistake a clear view for a short distance, assuming that this will actually work and deliver benefits just because it seems close, as my colleague Paul Saffo, a fellow futurist, says.
There is also the King Midas problem: let's be careful what we wish for. We may wish for this to happen, but then when it happens, everything turns into gold, and we can't eat gold, and we can't connect with humans anymore because now we are wearing this outfit. This is very important when we think about social media and its context, as we move into a world where we are going to speak to technology, using voice interfaces and natural language processing, speaking 50 languages in real time. This is all happening in the next couple of years. Looking in this direction, it is quite clear that software might not just be eating the world, as Marc Andreessen said some ten years ago; it could also very easily be cheating the world, because we would not know what to think about it. It is the black box problem: we don't really know what is inside, we just trust it. Do you trust Google Maps? That is another one of those questions; well, sometimes. Would you trust social media to give you accurate information, and would there be a filter of some sort, like we have in traditional television? How exactly would that work? We cannot afford technology to cheat the world for us; we have to go a little beyond that.

The idea of a super brain in the sky doing all these things for us sounds very convincing and would of course make trillions of euros. Data is the new oil, and the new plutonium, and artificial intelligence is the electricity, the power that runs through it, and that is going to be mapped out into tens of trillions of euros in enterprises pulling us into that future, where, as Milton Friedman said, this is really what business should be doing, the economic paradigm of the seventies: to engage its resources in activities that increase its profits, that being the purpose of the economy. That was the seventies, and now we are realizing it is probably not going to work, because free markets will not solve that problem for us. They won't solve climate change, and they won't solve data regulation either, because those are things that require us to be more holistic thinkers, an environment where we need a larger story: a stakeholder economy of people, planet, purpose and prosperity, not just one thing.

This is of course not a black or white question, because sometimes technology can be amazing and positive and sometimes not. The question, really, when humans and machines collaborate like this, is that technology is morally neutral until we use it, so whose morals do we apply? The lowest common denominator of our collective society? What exactly do we stand for, not on every tiny detail, but on the bottom line of what we want to achieve? In media, for example, it is quite clear: we want human media, media that means something, that has relevance, that is accurate, that is true to some degree. In Europe we clearly have this collective approach, and we have the humanistic advantage. That seemed like a disadvantage 20 years ago, but now it is an advantage, when we say our world is more than just profit and GDP and growth; it is people, planet, purpose and prosperity. That needs to be the principle for media as well, and it needs to be folded into every media provider, whether they do it voluntarily or otherwise, to think about how we can integrate this.
This is the only question that matters for us: does it make sense, does it have purpose, does it deliver value, why should I be doing it, not just can I do it? This is the why question that we have to answer, the ultimate question of humanity: why, and who, not just if, or with what kind of tools. We are moving into a future where everything is going to be in the cloud, everything: our healthcare records, our financial records. Then the question of good or bad becomes the fundamental question, along with safety and security. Technology drives our societies, but ethics define them, and if we want ethics to define society, that speaks directly to social media. Ethics is knowing the difference between what you have the power and the right to do, and what is the right thing to do. Now we have to sit down and say what the right thing to do is for social media, and what is definitely not. We know what is not the right thing; that is a good start. Now we have to define what the right thing to do is: respect for human rights, respect for democracy, furthering human flourishing, creating collective benefit. Because humans aren't just algorithms; we are not just being programmed into responses like a giant Pavlovian dog. We are beyond algorithms, we are more than that. What matters to us most is engagement, experiences, relationships, coming together. That is why it is so important that media becomes rehumanized again: humans aren't binary, yes or no; we have lots of things in between. We are not machines, not algorithms; we are not easily forced into a model the way an AI would be. We are organic, constantly changing, and that has to be put back into media in some way that goes beyond the tools of the first edition of social media.

Here is a picture of Facebook taking very good aim at democracy. What do we want our future to be? Do we want this? These are the choices: democratization or disinformation; collective society or the polarization we see in many countries with extreme social media use, like China, the US and Brazil. Do we want conversation, or do we just want to make more money with fake conversation? Do we want to create togetherness and collectiveness, or loneliness, of which we have plenty on social networks? Do we want human actions, human feelings and human things, the human qualities that are not algorithms: emotions, respect, imagination, creativity, design? Or do we want the machine to tell us what the future holds and what we are allowed to do?

We are moving into a world of megashifts. I talked about this a lot in my previous book, and you can download the chapter for free at megashifts.digital, but it is basically all of those things shifting together in a quickly moving pattern that carries us into the future very fast. It is going to be crucial that we rehumanize technology, that we create human values, and that we watch that line between a good thing and definitely too much of a good thing, and that this is put into practice. Some practical tips are clearly coming forward from all different sides, and of course Ursula von der Leyen has said this many times: we have to take the decisions that are coming, and we can't just leave them to an AI or to the boardrooms in Silicon Valley. That needs to be decided right here, now; we have to take that right back, to stay human, to be human. And that goes, for example, for things like this: as we decide what data should decide, bias prevention, diversity, transparency.
Trying to get a machine to understand why these things matter is hard: they are soft skills and soft issues, they have to do with emotions, beliefs and feelings, and we need to put the human back in there and create an agenda that is worthy of humans, not just of machines, a scenario where that becomes important again. That also means that the companies providing technology, not just social media and search but also 5G networks, telecom networks, ICT companies and so on, have to be asked to give us a promise that this matters again: to put the human back inside, to create an environment where humans are important and where we create benefit for humans, not just financial benefit but overall benefit.

So here are some ideas about how to rehumanize, some possible solutions to look at. First, redesign social networks completely from scratch, putting collective values and ethics over monetization; that needs to be thought through when we think about advertising campaigns, cookies and tracking, and right now is a very good time to do it, with all the recent technology changes. Second, reprioritize towards benefits for all stakeholders in the value chain, not just the shareholders and the owners of the technology. Third, always keep the human in the loop, even if it is less efficient, slower or more expensive; put the human in the loop for the time being, because that is the only way we are going to get human value. Fourth, treat social media like other media: it has to be treated like the traditional media industry, or of course the new media industry taken all together, and there has to be some sort of regulation. Appointing a regulator for social media has been done in some countries, and I think that is by and large a good idea. And maybe we should consider more dramatic action, like a minimum ratio of revenues to the human beings who actually work there, so that we can enforce the idea of actually using humans for human jobs, and an automation tax when media gets automated, to support public media. I know these are probably radical thoughts, especially when you think about free markets and so on, but let's entertain the possibilities.

To go back to the old Cherokee saying, the wolf you feed is the wolf that wins. If we feed the wolf we want to win, the wolf of equality, collective society and collective benefit, then that is going to require things like the digital ethics councils we already have at this table, which I have been speaking about for roughly five years and which are starting to come together in different places. Maybe we need to enlarge this and think about it in the form of a humanity ethics council, beyond just the digital, also covering genetic engineering and so on: a very big topic, but a council of the wise is really what I am talking about here, and I think we have lots of potentially good candidates here.

So I want to wrap up by saying how important it is for me that we embrace technology, that we embrace progress, that we move forward, improve and build economies. We embrace technology, but we don't want to become technology, and social media is the very first place where we need to set that example and create a collectively beneficial future for all of us. I want to thank you very much for listening and for your patience.
I look forward to connecting with you online in some way or the other, and have a good rest of the conference. Thank you.

2021-06-14
