KAUTE Talks x Aalto University: Data-driven Future


-Welcome to this webinar on a Data-driven future! My name is Matti Suominen. I'm the chairman of the board at KAUTE Foundation, as well as a professor of finance at Aalto University. KAUTE Foundation is a foundation that supports research in business and in all technological fields. I'll take this opportunity to thank the Aalto team for organising these webinars together with Tuomas Olkku from the KAUTE Foundation. Why we are organising these seminars is to bring together business people as well as researchers. Our hope is that our researchers will get a better understanding of what is relevant in the business world, and that business people will get a good understanding of how research is shaping the future.

We start this seminar series with a seminar on the Data-driven future. It will be hosted by Risto Sarvas, one of the most popular professors at Aalto University. So at this moment, I just wish to welcome you all to this seminar. And I hope that you will enjoy this morning with Risto Sarvas, who will introduce our speakers next. -Thank you Matti for those words! Good morning to everybody on my behalf as well. Like Matti said, my name is Risto Sarvas. I'm a Professor of Practice at Aalto University, where my job is Director of the Information Networks Master's and Bachelor's programmes.

And today's theme is Data-driven future. A little bit more about that in a few seconds. But before we actually start our talks and the speakers, just a reminder to every one of you that we need your help in this as well. There on the screen, you have the opportunity to ask questions in the chat. So please do that during the talks, or after them, when we have short Q&A sessions.

Today's theme, to give an introduction for this morning, is Data-driven future. And the first thing I'm actually going to do with you here today is to challenge that phrase a little bit. Because data, just like any technology in general, is passive. It doesn't do anything in itself. It needs something, somebody, to do something with it. So when we start talking today about a data-driven future, the question is: if data, just like any technology, doesn't do anything, then who is actually driving? Who is in the driver's seat? And especially this morning, when we look at data, we look at algorithms a little bit more broadly.

We don't look at the technology as such. We don't look at the business models. Instead, we take a step back and look at societies, at the societal influences of a data-driven future. Now, because the theme of this morning is more about the societal level, let's first remind ourselves of who the traditional players are in public discussions when we talk about the future of societies. Obviously news, the established media, are very important in driving the discussions about what the future is about and what decisions are made about it. Obviously politics and policies play an important role. A lot of economics, macroeconomics if you will, is in the discussions of what kind of societies, what kind of a future we have. But if we take those three traditional public-discussion players, are they the only ones in the current digital age that we live in? So who is in the driver's seat, if these three traditional players are not the only ones? Especially as, I would say, these traditional players seem to be very reactive to the changes when it comes to technologies such as data, artificial intelligence, machine learning and so forth.

So we all know typical examples, I would almost say schoolbook examples, of these reactive ways of behaving. Take platform business models or ethics in artificial intelligence. Or, from a regulatory point of view, competition law. If we look at the public discussions, at how these established societal-discussion players are working, we get the feeling that it's more about reacting to those technologies and those business models that seem to be coming from somewhere, that just seem to be happening. So the question is, who are then shaping the data-driven future of our societies? Perhaps the answer is obvious. And I think in a way, it is obvious.

But nevertheless, in the past years of having these discussions, we need to keep reminding ourselves of who the shapers are, who the drivers are. Because, of course, an important role is played by the organisations, by the professionals who have the know-how, the expertise, the infrastructure and the networks to leverage data and to create business and innovations with data, artificial intelligence, machine learning, high technology and so forth. And why we perhaps need to remind ourselves of this almost obvious thing is that, I would say, these experts, these coders, designers and business people don't necessarily see themselves as actively shaping society.

If you look at the history of these professions, it is quite obvious that they have not been seen as socially active players. Their professional identity is about creating those business models, algorithms and technology. And when we look at the organisations who shape these discussions, who are in the driver's seat about the future, do they really want to engage in societal discussions? Do they want to engage in politics? Because once we step into a societal-discussion state of mind, we are actually talking about politics. We're talking about policies.

Nevertheless, these individuals, these professionals, these organisations with these capabilities, they do shape the discussions and they do shape the narratives about data, the future and society. Because, first of all, they are the ones who understand the limits and possibilities of these technologies that often seem very cryptic. Often they are treated as a black box, because few people really understand what data means. Understanding the difference between data in an Excel sheet and data in a data lake requires a certain expertise, an understanding of what it actually means and of its limits and possibilities. But perhaps more importantly, these people and these organisations are the ones who create the concrete working

examples that shape the actual structures of our everyday life. Which, at the end of the day, is our society. When we wake up in the morning and we brush our teeth and we go to work, which might be in the garage nowadays, it's that everyday life, working life, home life, where our society is more or less built every day. And these professionals, these organisations, they create the society through, of course, the products, the processes and the services. And these things are, of course, created by engineering.

They're created by designing. They're created by business planning. They're created by investing, and so forth. Now, if you are with me this morning and we take the stand that it's not the data that is driving us towards a certain future, that it's actually the people who do things with this data, who do things with these algorithms, then, if these professions and these organisations are in the driver's seat, I think a couple of very important questions start to arise.

First of all, what is their agenda? If they are the ones who have the capabilities and skills to take the technology and the data, what are they doing with it? What is their agenda? And especially if we talk about organisations, whether commercial, public sector or third sector, what are their goals and strategies? Where do they want to go as an organisation, with the capabilities to leverage data and its possibilities? Which very quickly boils down to the questions: what are the values, what are the politics of these individuals? What are the values of these organisations?

And I would say that this resonates quite a lot with another discussion about the purpose of business, the meaningfulness of work, the kind of new rise of individual values in organisations. It is a different discussion, but it overlaps with this one: when these professionals choose where they work, I think there is a new trend of understanding the values, purposes and even politics of the organisations they work for.

But importantly, I also want to underline, especially coming from Aalto University, where we literally educate and teach these professionals: a very good question is whether they themselves are aware, conscious or even interested in this direct and indirect power they have in shaping our future societies. Are they, as professionals? Were they educated for this? When these professionals wake up in the morning, are they going to work to do politics, or are they going to do their job? Do they think of themselves as taking part in societal discussions? I would say no, that is not necessarily the case. But nevertheless, whether or not these professionals and organisations are aware of their societal power, they do have it.

To summarise my agenda, to be transparent in this spirit: my agenda for this morning, and for the discussions we're going to have with you over the chat and with the speakers coming up, is that once we take the point of view that technology is passive, then the question becomes: OK, if technology is not in itself doing anything, if data in itself is not driving us towards the future, then who is shaping the data-driven future? Who is in the driver's seat when we come to discussions about data, technology and society? Who is in the driver's seat in actually building the future? Who are the ones, at the end of the day, that have their fingers on the keyboards? And through that, who are the ones who create the examples, in the public discussions and for all of us, of what is actually possible in a data-driven future?

What are these individual professions? I mentioned a few of them: programmers, designers, business people, entrepreneurs, investors. I think those are quite obvious. What are the organisations? And out of those organisations, which are the big ones, the smaller ones, the medium-sized ones? Are they consciously shaping society, or does it happen as a byproduct of their own strategies and goals? And last but not least, what are the networks? What are the ecosystems or the industries that are driving towards their own goals, that are implementing their own strategies, and that also have the capabilities to take advantage of high technology, of data, of AI, of machine learning and so forth?

That's my introduction to the theme this morning. We're going to have, in addition to myself, two more speakers, two more perspectives, two more opinions on this.

And hopefully, like I said, share your opinions. It doesn't have to be a question. Share your thoughts, share your comments in the chat, so that we can see them here and take them into the discussion after each speaker today. To continue from this, our next speaker is Eero Korhonen. He is the Head of Strategic Relationships at Google. And I'm

sure we're going to have Eero here with us soon. Good morning, Eero! -Good morning Risto, thank you for the opening. -How are things in London? -It's very early here, 7:00 a.m. This is the first time I'm speaking before waking up, but otherwise all right. -Okay. Are people lining up for all those vaccinations there?

-I think so. I think so. We started yesterday, quite exciting. It will probably take months before everybody has received their own. -Okay. But that's not the theme of today. We're talking about the Data-driven future.

Nice to have you here Eero, the stage is yours! -Thank you. Thank you Risto for the intro, and thank you KAUTE Foundation and Aalto University for inviting me. And again, good morning everyone, it's really good to be here with you.

I'm really excited about the event, and I'm looking forward also to the discussion with Meeri and Risto after their presentations. I'm checking... Yes, the technology works here! A remote control from London.

As Risto already suggested in his opening this morning, I'm probably not a good guest here, starting by questioning the title of the event. But I also felt a bit the same. Data-driven indicates that the data is in the driving seat. However, to me it sounds better to talk about a data-supported world, because it is other things that are the driving forces. And "supported" indicates more that data helps us to open new opportunities and make better decisions based on it.

Like in this photo from Stockholm's Skeppsholmen: somebody might be walking the same shoreline there on an early morning in the data-supported future. There's a lot of buzz around AI and ML, and it makes it sound like everyone is now coding with Python as we speak. But if you try any customer service, that pretty much proves the point that there are still so many things to do; we are really far away. There are some basics I wanted to cover at the start of this presentation.

First of all, the data. This may sound trivial, but data is not the finished product. It's just the raw material. You often hear the claim that when you have the data, that as such creates the value and is somehow the magical source of business models. I don't see the slide changing here, I'm pressing again... Now! Just a small delay. The data needs to be structured and transformed into knowledge before it becomes useful.

The right combination and continuous development of tools, skills and processes is the key. This is a multidisciplinary effort, much like KAUTE Foundation and Aalto University as institutions. Regardless of whether you look at it from a research, technology or business perspective, you have to involve many others in the process of refining. Successful refining requires orchestration, and that is not trivial at all.

I'll take an example from the company I work for. This is Google's mission. It has stayed the same since we started. And like in the previous picture, the value, even as stated in this mission, is not in the data as such, but in the process of organising the information and making it universally accessible and useful. In our case, we make the websites on the open web available for our users when they are relevant for them in a particular moment. And this is probably the reason many of the users return to the site many times a day. Also, the user intent typed into the search bar, together with additional data points, makes Google Search relevant for businesses offering their services and products.

You know it pretty well. I'll give some examples about the... orchestration behind it. So, the orchestration of Google Search. Obviously, we can guess it's not very trivial: what worked for users some time ago doesn't necessarily work in the future. Data-supported consumer services are in general never ready. They need continuous adjustments.

For example, 15 % of search queries are new every day. Our interests change continuously. We not only type the search queries, but have also started to use natural language, and later on also voice.

In order to keep Google Search relevant, we change the algorithm roughly six times a day, which equals roughly 2 000 changes a year. And this, however, requires about 200 000 - 300 000 experiments per year. So there are many ideas and experiments that will never go live. If you want to know more about Search, they just recently released a movie about Google Search. If you have free time in the evening or during the week, it's quite entertaining, even for a person working in the company. The next example is from the news industry, which is an industry I've worked with for the past 20 years. I've had the privilege

to work with news publishers across the European continent, and to see their development from legacy publishers to data-driven publishers. They have been going through a massive change in the industry because of consumer behaviour, which is driven by digitalisation making new ways to spend time available. In the past three years, things have really changed within the industry. Risto mentioned that the publishers have been very reactive on the topic of the data-driven, data-supported future.

That's true, but things have really changed a lot. A growing number of publishers are showing healthy profits, and those are specifically coming from data-driven consumer revenues. And it's fair to say that many of the successful ones have taken a massive leap when they've changed their organisations and skills.

This is proof to me that understanding of the opportunities is expanding beyond the tech sector, universities and startups, to more traditional industries as well. Based on this work with news publishers, I'll give a couple of ideas about the success criteria that I see with the successful ones. I have four of them, though obviously only three on this slide. One of them is obviously culture and the way of working.

Data-supported organisations can't have silos. The successful organisations, the successful publishers I work with, have not only brought together and combined the data from different sources for refining, but they have also brought together the people from different organisations, from different publications. Successful publishers have formed teams and acquired talent from universities and from outside the news industry, and they have brought new skills to the organisations that I didn't see 20 years ago; those skills didn't exist then! These publishers build new tools, whether they do it in-house or with vendors: they are building learning models to estimate the propensity to subscribe, personalising front pages for you,

and recommending content, in order to acquire subscribers, but also to manage them. I'm quite happy to say that the Nordic countries, Finland, Sweden, Norway and Denmark, are very much ahead in this transition. One more criterion, an important one, is leadership. In all of those publishers that have shown a successful transition from a legacy publisher to a data-driven, profitable consumer-revenue publisher, I think leadership has played a key role. It's not easy

to change a business that has been similar for 500 years. They have been removing organisational and technological silos. They have been driving the change of culture. They've introduced a culture of experimentation, and they've been able to hire skills that have never been seen there before. I dare to say that even though these are examples from the news industry, it is likely to be similar for other industries as well. So if you are a leader in an organisation, and when you return today to the office or your home office you feel that you don't see a change towards the digital, data-supported future, it is likely to be up to you to start it. I think these are a useful set of criteria for other industries as well.
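As a toy illustration of the "propensity to subscribe" models mentioned above, here is a minimal sketch. All feature names and numbers are invented for illustration; real publisher models use far richer behavioural data and tooling.

```python
# Toy sketch of a "propensity to subscribe" model.
# Features and data below are hypothetical, chosen only to show the idea.
import math

def sigmoid(z):
    # Numerically safe logistic function.
    if z >= 0:
        return 1.0 / (1.0 + math.exp(-z))
    ez = math.exp(z)
    return ez / (1.0 + ez)

def train_logistic(rows, labels, lr=0.1, epochs=2000):
    """Plain stochastic-gradient-descent logistic regression, no libraries."""
    w = [0.0] * len(rows[0])
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(rows, labels):
            p = sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b)
            err = p - y
            w = [wi - lr * err * xi for wi, xi in zip(w, x)]
            b -= lr * err
    return w, b

def propensity(w, b, x):
    return sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b)

# Hypothetical features: [articles read per week, visits per week, newsletter signup]
X = [[1, 1, 0], [2, 1, 0], [10, 5, 1], [12, 6, 1], [3, 2, 0], [9, 7, 1]]
y = [0, 0, 1, 1, 0, 1]  # 1 = became a subscriber

w, b = train_logistic(X, y)
heavy_reader = propensity(w, b, [11, 6, 1])
casual_reader = propensity(w, b, [1, 1, 0])
print(heavy_reader > casual_reader)  # the heavy reader ranks higher
```

A score like this can then drive the personalisation and content recommendation the talk describes, for example by deciding which readers to show a subscription offer to.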

Changing slide, hopefully it's happening here, OK. The organisers also asked what data can change and what value it could add for society. First of all, I feel the data-supported future is not limited to any specific sector. We will see innovations and applications helping us across the board. Any industry is probably seeing something coming from data-supported innovations. Today, however, I selected three sectors that are likely to get a lot of focus within societies, for good reasons.

I'll give examples from the energy sector, food production and the life sciences. I'm afraid I'm not an expert on all the details of the projects, but I think they will give you an idea about the direction and the sentiment of experimentation that is going on already. I'll start with the energy sector. This is something Google has done together with many industry players. It is called Environmental Insights Explorer.

Now I just lost my remote control. This is sometimes difficult... Environmental Insights Explorer is part of our pledge to help 500 global cities reduce one gigaton of carbon emissions annually by 2030. The tool estimates, first of all, building emissions within the city boundary. From Google's point of view, the data is taken from Google Maps. However, it is of course combined with other models that allow estimating what the emissions of the buildings in the region are. It also estimates the transport emissions of all trips that start or end within the city boundaries.

This is based on aggregated, anonymised location history data. Last, as you can see on the right-hand side, there's a small box. The tool also estimates the solar production potential of all buildings, based on sunshine exposure, weather patterns, roof size and roof orientation. The data-supported world requires a lot of data refining, and the data centres require a lot of energy.
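The solar potential estimate just described could be sketched, under heavily simplified assumptions, roughly like this. The efficiency and orientation factors are illustrative guesses, not the tool's actual model.

```python
# Very rough toy estimate of rooftop solar potential, in the spirit of the
# Environmental Insights Explorer description above. All factors here are
# illustrative assumptions.

PANEL_EFFICIENCY = 0.18  # assumed fraction of irradiance converted to power
ORIENTATION_FACTOR = {"south": 1.0, "east": 0.8, "west": 0.8, "north": 0.6}

def yearly_solar_kwh(roof_m2, orientation, sun_hours_per_year, kw_per_m2=1.0):
    """Annual production estimate from roof size, orientation and sunshine.

    kw_per_m2 is the assumed peak irradiance (about 1 kW per square metre
    in full sun).
    """
    usable = roof_m2 * ORIENTATION_FACTOR[orientation]
    return usable * sun_hours_per_year * kw_per_m2 * PANEL_EFFICIENCY

# A hypothetical 80 m2 south-facing roof with 1100 sunshine hours a year:
print(round(yearly_solar_kwh(80, "south", 1100)))  # 15840 kWh
```

The real tool combines imagery-derived roof geometry with weather models, but the structure, roof area times orientation and sunshine factors times efficiency, is the same kind of calculation.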

In 2017, Google became the first company of our size to match 100 % of its electricity consumption with renewable energy. Our ambitious goal goes a little bit further: to be 24/7 carbon-free, anywhere and at all times, by 2030. That means that we are aiming to always have our data centres supplied with carbon-free energy. We have developed algorithms with which we've been able to reduce the energy consumption of cooling our data centres by 30 %. We are also making those algorithms available for other players in the industry. It's not only about Google.

I'll give an example from Finland: Nuuka Solutions, which is an amazing name (Finnish for "thrifty") for a company that helps real estate portfolio managers to reduce energy consumption and to build a healthy environment for the people using the buildings. Nuuka Solutions is also using machine-learning-driven algorithms to help their customers. They are sharing these through their platform and making models that help large portfolios of real estate.

This obviously seems to be very interesting, because just recently, I think roughly a month ago, YIT, one of the largest construction companies in Finland, invested in Nuuka, to help drive their vision to make life better and more sustainable for their real estate portfolio. This one is pretty interesting. This is my second-to-last example of the speech.

This is an experimental project called Mineral. It comes from Alphabet's X, which is also known as the Moonshot Factory. It starts from the premise that in the coming 50 years, the global need for food production is greater than in the previous 10 000 years. And this should happen while climate change is making crops less productive. The Mineral team

builds new software, and designs and builds new hardware tools that can bring together diverse sources of information. They started by gathering information already available about environmental conditions, soil, weather and historical crops: how they correlate and what causality there is between those things. Furthermore, they started to use their own instruments to unearth new data about how the plants in these particular fields were actually growing and responding to the environment. Combining this existing data and the collected data, like plant and leaf area and fruit size, with environmental factors, they can now help breeders to understand better how different varieties of plants respond to the environment. This is experimental and early on,

but it also shows the diversity of opportunities that the data-driven and data-supported future will likely bring for us. Again... Finally! This is my last example; you might have read about it last week. First of all, I'm honestly not an expert on AlphaFold. But I wanted to select this example because of the comments on the left.

DeepMind built a dedicated, interdisciplinary team in hopes of using AI to push basic research forward. They brought together experts from the fields of structural biology, physics and machine learning. They predict the three-dimensional structure of a protein based solely on its genetic sequence. So, as we started with: the right combination of skills and ways of working. If you look at the comments once again, the spirit of the day, as Risto set it in his opening speech, is about data in a supporting role. The refined data, now produced by AlphaFold, opens new opportunities and helps us make better decisions faster. In their case,

they accelerate research in biology and drug development. I see four cornerstones on which the data-supported future will be built. I know Meeri will reflect on some of these areas from her experience in her speech. Most of these are definitely general, but I will look at them from Google's point of view. People today are rightly concerned about how their information is used and shared.

Yet they all define their privacy in their own ways. If you are in a family using the Internet through a shared device, (Soundtrack breaks.) your privacy concern might be that you just don't want your family members to see what you have been doing, and you want to keep your information secure.

To a small business owner, privacy means keeping the customer data secure. To a teenager sharing selfies, privacy could mean the ability to delete the data in the future. At Google, we have (Soundtrack breaks.) billions of people trust products like Search, Chrome, Maps and Android to help them every day. And this trust we match with a commitment to responsibility. For private users, we give very clear and meaningful choices around their data. We have two very clear policies:

we will never sell any personal information, and we always let the users decide what to do with that information. Inclusivity is also extremely important when you develop tools and products in a data-supported world. AI algorithms and data sets can reflect, reinforce or reduce unfair biases.

We also recognise that distinguishing fair and unfair bias is not always simple, and differs across cultures and societies. So we seek to avoid unjust impacts on people, particularly those related to sensitive characteristics such as race, ethnicity, gender, nationality, income, sexual orientation, ability, or political or religious belief.

In product design, we make sure that the underrepresented voices are heard throughout the product development process, from the early phases of ideation and prototyping to the UX design and marketing, all the way to the launch. The product developers ask questions like: does it make sense for people living in different places around the world? Is it useful for people of all ages? Are all races represented in this product? Last, sustainability, which we touched on in an example as well. The data must be refined in an environmentally sustainable manner. The data-supported future we are going towards, I have to admit, is actually better from the environmental point of view than where we are now.

So all of us, today in this call, need to keep this in mind when moving towards the data-supported future. This was my opening today, and if this clicker works, I say thank you and Kiitos! -Thank you Eero! It's always a privilege to hear somebody from inside Google sharing their perspectives. We have two questions in the chat.

I'll start with a shorter one, and then maybe a tough one. The first question is related to the Environmental Insights Explorer that you showed: why is this data not public for cities in Finland? -The service has been launched rather recently, so they are building the database as they go. Reaching all 500 cities is still an ongoing process. You need to stay tuned and follow up as other cities become available. All 500 cities are not there yet. -OK, then the other one: is a data-driven future a global megacorporation future? What do you want to say to that? -I think the examples... Like when you started, you also mentioned that it's in the hands of a few. However, if you see how

the tools and products and skills are expanding to all sectors of industry, there is easier access to machine learning tools. There is easier access to skills. People are training themselves. I believe that we will see it from small companies to large ones. It is not in the hands of a few; this is built for everyone. And I believe that in the next few years, like Nuuka Solutions as one of the examples, you will see small companies coming up with great ideas and great solutions that will be helpful for their own customer bases.

-I do agree. There's a balance here. It's kind of stating the obvious that a company like the one you work for has huge power and, in that sense, responsibility in shaping the future. But also, like you mentioned, actually sharing these tools and sharing the knowledge is a kind of democratisation: letting more people have more societal power is about education and giving tools and all that. And that can be said about Google as well, that it's one of the forerunners in sharing many of these things.

So anyway, there is this balance, obviously, that I'm sure has to be continuously discussed. -Absolutely. -Thank you Eero! Now we're going to our next speaker, and we're going to have you, Eero, back soon after that for a general discussion. Our next speaker, another perspective and point of view on the morning's theme, is Meeri Haataja. Welcome, Meeri. -Thank you very much. -Meeri, you are the CEO and co-founder of Saidot. -Correct. -And you are also the chair of IEEE's Ethics Certification Program for Autonomous and Intelligent Systems. -That's a great name for AI.

-That is! It's a great name. And that's a mouthful for a business card as well, I guess. -Yeah, it is! -Nevertheless, happy to have you here this morning. -Thank you very much! I'm so excited to have this opportunity and to build on these amazing discussions that we heard previously. And I particularly love that Eero raised, for example, AlphaFold and the recent progress in AI that we have seen. It's absolutely critical to start by discussing the opportunities. That's the reason why we are in the data business and changing our societies from that perspective.

But let me expand this discussion with a couple more perspectives, so that we have an even more diverse, wider space to discuss the initial questions that Risto raised in his introduction. I will bring to the table two perspectives, and I want to point to one particular challenge that I see impacting how we drive this business, and raise that as a question for people who might have an influence on it. So when we talk about the impact of AI on our societies, and data and AI, this whole space, it's very clear that we are in a situation where we start to really have an impact on our societies. All the examples that were discussed

in Eero's presentation are proof that we are doing many exciting things and that there are opportunities on the table. But precisely because of that impact, there is this other perspective. Basically, when we follow what kind of impact we are creating, we see that sometimes this impact is something that we can rightfully say is not fair or just. And often these sorts of unintended impacts, things that are secondary or externalities to the original impact we have strived for, are something that the people who have been developing and deploying these technologies haven't even foreseen or been prepared for.

There's a lot of discussion about AI ethics and how people are thinking about the technology and its directions. We see a lot of concerns from the public on this matter. I took one picture over here... This is from EU citizens -

on what people think about deploying AI and what concerns they have in this space. Looking at the figures, around 60 % of the respondents in Finland, for example, think that in order to respond to the ethical challenges of the new technology we are driving, specifically AI, we need public policies: regulations to address them. And when we ask what their concerns are, and why they are looking for this kind of intervention, these are the things that come up in the responses. People are concerned about situations where something harmful happens.

And basically it would be unclear who is responsible for those consequences or that harmful impact. So it's about accountability. That's one of the challenges we have seen in this space: how to define accountability in a context where many different kinds of new players contribute to the development and deployment of technologies. Another angle, which Eero briefly touched on in his talk, is discrimination. In terms of age, gender, race and so forth, how well are our algorithms working? Are we making decisions that are actually fair and non-discriminatory, or do we put some part of our society into a worse position and even amplify already existing problems? Non-discrimination and fairness is -

one of the absolutely most important and most serious concerns in this space. Also agency: the feeling of whether we have a possibility to have an influence, and who we contact if we have problems. Whether there is a possibility to influence this is another important angle. So this is the voice of the people, the problems and concerns people have when we talk about the data business and a data-driven or data-supported future. Let's look at another perspective. If that was what people think about these things, then let's bring to the table one important angle which I feel we discuss too little when talking about data, the data business and AI-driven businesses. And that's what happens on the investor side.

What has happened over the last couple of years is actually very interesting and important from this angle. We have seen the rise of what we call ESG investment. ESG refers to non-financial factors and criteria: Environmental, Social and Governance related factors. Along the same lines... Actually... One figure on that one... 77...

According to this study on ESG investment, 77 % of institutional investors say that they will actually stop buying non-ESG products within the next two years. That means that if these environmental, social or governance related factors are not in the right position for a company, institutional investors, or the majority of them, will not invest in that kind of company. This is in line with what happened last year, when the Business Roundtable, a group of CEOs of nearly 200 major US corporations, issued a statement with a new definition of the purpose of a corporation. That purpose basically challenges the old notion that the purpose of corporations is to create value for the shareholders. In this new statement, this group of the most powerful companies in the US says that the purpose is not only that, but also to benefit the different stakeholders of a company: employees, customers and the whole society.

Many interesting things are happening here. And the last one: during the pandemic this year, one more important change has happened. When we look into this ESG space and its different factors, the importance of the social factors has significantly increased in comparison to the previous emphasis, which was very much on the governance and environmental factors. What I'm trying to say with all of these data points from the investor perspective is that we are living in times where social impact actually matters for investors as well. And that matters for how the financial markets work and how money is being directed to our industries and companies. How does it happen in practice then? Let's continue on this investor perspective. When we say that basically there -

is an interest for investors to invest in responsible companies, the kind of companies that actually consider the social impacts of their businesses and their actions in society, how does it happen in practice? This is a very long story summarised. There is basically an industry supporting investors in doing this analysis and figuring out what impacts companies have on society. These are the players that provide ESG analysis for investors, collecting hundreds of data points on the different aspects of how companies consider environmental, social and governance factors, and how they manage the risks related to these areas. Based on this analysis, -

investors can make considerations and decisions on whether the angles they are concerned about have actually been taken care of in an appropriate manner in the companies. Now I'm getting to my actual key points. Let's have a look at how these criteria now look... Eero was discussing the communication industry, the publishing industry. This is actually the industry category into which Facebook and Alphabet Google were shifted a couple of years ago in the industry categorisation. So it's good to look into that one.

If you see... I need to check from here so that I see the figures. What are highlighted with colours in these pictures are basically the factors that matter for a specific industry. This means that if a company wants to perform well in the ESG ratings, and it comes from this communications industry, the interactive media and services industry, it needs to do really well in these specific factors: carbon emissions, opportunities in clean tech. From the social aspect, we have privacy and data security, and human capital development, which is about the employees. And then we have the different governance, board and pay related factors. But my point is specifically this. When we look at this list, at what the things from the social aspect are that matter in these ratings, we see a clear gap between them and the social impacts we are actually concerned about when looking at the behaviour of these companies. So in practice, -

Twitter, Facebook, Google: when their social responsibility is evaluated, none of these companies is evaluated by whether they discriminate or not, whether they create filter bubbles, for example, or whether they contribute to misinformation or the radicalisation of people. These factors are not built into the criteria used by the financial industry for deciding what is a socially responsible company and what is not. Think about Nvidia, one of the leading AI innovators in the semiconductor industry: they are actually selling their services for health care, for self-driving cars and so forth. They aren't evaluated for their product safety or discrimination related factors. That's not part of the criteria.
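To make the gap concrete, here is a deliberately minimal sketch, not any rating agency's actual methodology, of how an industry-weighted ESG score works and how it silently ignores factors that are absent from the criteria. All factor names, weights and scores below are hypothetical illustrations.

```python
# Hypothetical "materiality" weights for one industry (interactive
# media & services); in a real rating each industry gets its own
# weight table, and the weights sum to 1.0.
WEIGHTS = {
    "carbon_emissions": 0.15,
    "clean_tech_opportunities": 0.10,
    "privacy_and_data_security": 0.30,
    "human_capital_development": 0.20,
    "governance_and_board": 0.25,
}

def esg_score(factor_scores: dict) -> float:
    """Aggregate per-factor scores (0-10) into one weighted rating.

    Factors missing from the industry's weight table are ignored,
    which mirrors the gap discussed in the talk: concerns such as
    discrimination or misinformation simply never enter the score.
    """
    return sum(WEIGHTS[f] * s for f, s in factor_scores.items() if f in WEIGHTS)

company = {
    "carbon_emissions": 8.0,
    "privacy_and_data_security": 6.0,
    "human_capital_development": 7.0,
    "governance_and_board": 5.0,
    "algorithmic_discrimination": 2.0,  # not in the criteria: no effect
}
print(round(esg_score(company), 2))  # prints 5.65
```

Note that the very poor hypothetical "algorithmic_discrimination" score changes nothing: only the criteria the rating framework already defines can move the number investors see.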

Where am I getting with this? When we look at the concerns in civil society, among the people, and when we hear the voices of the investors, we see that there is great willingness to take action, to prioritise where we put our money and our focus based on the social factors. That would be the answer to the challenges we have been talking about when discussing AI ethics. But the mechanisms by which we make these decisions haven't followed where we have gone with data-driven businesses. We don't actually have the right measurements for evaluating companies based on their impact, beyond things like privacy and data security. The things that are actually missing, the biggest concerns in relation to AI ethics and the reason why we are putting together regulations for this area and so forth. Those are -

mostly related to risks with regard to health and safety, and to human rights. There are so many different angles in human rights alone: non-discrimination, children's rights and so forth. And there are the social impacts at large: how do we impact democracy, misinformation and so forth.

To summarise what I'm trying to say and put on the table: this is an opportunity. We have investors waiting to be able to direct their assets into sustainable companies, companies that actually consider all of this, including the secondary impacts and externalities of their businesses. But we need to really focus -

on creating the right kind of means to support this industry, so it can understand how companies are performing on these matters that have now been raised, based on the work we have done on data and analytics. So this is my call for investors: it is also your responsibility to actually look into these criteria and measures, to consider whether a company is putting a reasonable effort into really managing its social impacts, driving for a positive impact and managing the risks related to the negative impacts.

And that's a good ground for driving towards a more sustainable and good impact on society, if we get this whole investment community to drive in that direction as well. Thank you very much! -Thank you very much, Meeri. I appreciate you finding the time to come here this morning. -Absolutely, my pleasure. -Sharing all of this expertise with everybody here. We have one question there. I'm going to take that question -

and kind of spin it more towards your presentation. We had Iines from the audience asking about who actually shapes the shapers that we started talking about this morning. But I really liked it in your presentation that you had the rating agencies as kind of a... I hadn't realised this until hearing your presentation, but they have a very strong role as an intermediary there.

And what they do is take the raw data, summarise and analyse it, and then bring it on a silver tray. So obviously, what you pointed out is that... Do you see that as problematic? What kind of decisions and what kind of societies do these rating agencies want us to have? -I mostly want to see this as an opportunity. It's good to have that industry in place to be able to mediate this and work on it, so that... We need -

these means. It's impossible to do that analysis without this kind of player in place. But what I want to say is, this is a call that we need to keep up with the pace of the technology industry. I feel that there still isn't really a good connection between the whole space of AI ethics, AI governance and regulation, and this rating industry. I don't want to point at anyone on that; I see it more as an opportunity. And we need to see that there is -

something, there is a mismatch over there. And if we get to fix that, that's a great opportunity to get a very powerful stakeholder group, investors, to drive towards this same direction. -It's kind of the theme of a fair society, a just society, from a broad perspective. I find it really interesting that we have these rating agencies and also the investors, of course. Kind of going back to my talk in the morning: what do they see as a fair society? Are they the ones? Because as you said, they have huge leverage and power to shape where technology and business are going with investments. It's kind of interesting if these are the ones who are putting their capital into certain visions of society...

So traditionally that has happened in the democratic process: we kind of vote on what kind of society we have. How do you see that? Are we now actually shifting to a discussion about democracy, about who gets to shape future societies? -Yeah, that's one of the large societal impacts: how to rightly rate whether a company is actually shaping democracy with its technology. It's an amazingly big and difficult question. I don't know how to answer that. But I think the first thing is really to realise that this is not about privacy and data security only.

That alone is not responsible data business. There are so many other aspects, and sometimes they might even be conflicting with privacy requirements. It's great progress that we have made with privacy and GDPR, getting privacy actually on the table; there is no company that doesn't see that we actually need to act on this one. In Finland, we have had calls for that and crises around health data lately. So there's a lot of work to do there as well. But a point of view where we would only consider privacy and data security as the only factors in AI ethics, or in what responsible data business is, is very biased. -That's a good point.

That actually reminds me of the ongoing news about the delivery of the vaccinations that the whole planet is facing, and reading how much that is actually a data-based, data privacy exercise, which I hadn't realised until somebody pointed it out. I had thought of it more as physical logistics. But it turns out... Of course, logistics nowadays is very data-driven. And that's kind of a nice current example of society very concretely becoming this data privacy thing and...

-Absolutely. And then think about how data-driven the discussion about the vaccination is, and all the misinformation, which was really nicely analysed by the national broadcasting company just very lately. But yeah, it's all about data: how we shape people's opinions about these topics as well. But the investment business is also data-driven. So we need to care about the data that is feeding the investment decisions.

And that's maybe my call: to actually figure out more carefully whether this data actually measures the social impact that we are interested in. -Thank you. Let's take another shift in the conversation and bring Eero here. Welcome back! -Thank you! -It's nice to see that the Sun is gradually rising in London. -It was very dark in the beginning. -It's actually raining here.

You don't see the details though. -Any take on what we just discussed, or what Meeri just presented? Any thoughts and comments from you? -I think the responsibility and the investor angle is actually pretty important in this field, because this is a growing field. There are more and more players. I think there should be an understanding of the different impacts. And as you said, -

the secondary impacts that sometimes might not be so positive should be understood. And of course, that should be by those who invest in these businesses and work with these businesses, because it's not only about investment. It's also about selecting partners. Who would you like to work with? What kind of impact does this company or partner have on the environment and so on? I really like the angle that Meeri brought into this discussion. -Thank you.

-I think it's a good point to remind us also that KAUTE is a funding organisation, so there is definitely an investment angle there. Let's see, we have a couple of questions. Jossa asks: what are the great success stories in ethical investment in AI? Investors are, of course, looking for a return on investment, which kind of makes sense; that's their job. But are there successful investments that are ethical as well as good investments? Meeri, does anything come to your mind? -It's a really hard question. I don't know if I have enough of a baseline from the analysis. For me, this is quite a new topic.

And in general, ethical investment... that's... Yeah... Ethical investment in AI... I think that's an area for research as well. We haven't really dived into this. I have found very little information about it; that's why we need our own investigation in this area. Because the communities, -

the ESG community and the AI ethics community, are totally separate at the moment. So when new things appear, they come from certain communities, and it takes a little bit of time for us, as a wider community, to learn to look at those same perspectives. I feel I'm not yet in a position to bring these examples to the table, because this is a very new angle. I'm also looking forward to more research in this area. But if there are other people in the audience who would like to raise comments or thoughts on that one, I'd definitely be very interested to hear them. -Eero, does anything come to your mind?

Success stories or good examples? -I think the success stories probably point to individual cases. It's difficult. I think the whole concept, the framework, of ethical investment requires the players to actually think about how they behave, how they build their services, and what their role in society is. There should probably be, as Meeri said, research on how this existing framework of ethical investment actually impacts the players. Do they design their products differently, or do they behave differently, when they know that there are opportunities with some investment funds? Large pension funds, for example, invest in companies that actually behave in a certain manner.

-But in general, I really think about the whole ESG space and its rise over the last couple of years. I know there have been years of work; people in the sustainability area have been looking at the environmental aspects. It's been hard work for years and years; it's not a new thing. But there is the boom that we have seen over the last couple of years in this space. Now there's no investor who doesn't know this space, or who isn't even pushed or pressured to have a stand on it. Institutions who would say that -

we are not considering, we are not interested in the ESG factors, we only focus on profit: that is probably not the best move, or the most attractive one from a brand perspective. It starts to be a must-have for investors. That's an opportunity. But that hasn't come about -

by a miracle; it's been a long process of creating tools and putting effort, research and a lot of work into this area. And now it's an opportunity for us to join forces with what has grown from the environmental perspective, and to start using the same means for the social impacts. -Let's take that as a good thing, what you just mentioned.

There is hardly an investment organisation that isn't forced to think about these points. And we have another question, from Rebecca: could the human-technology interface not be seen as co-constructive or relational, rather than as a dichotomy of active versus passive? Excellent point, Rebecca. I think we've been -

putting things a bit black and white for the sake of conversation. So I think that's a good point: rather than technology and humans being totally separate worlds, they are actually having a continuous discussion. I think that is being co-constructive.

How do you see... Let's take what I started with, the kind of black and white: what's the greyscale there? How much do people shape the technologies and the data with their own behaviour, and vice versa? Any takes on that matter? Meeri first, then Eero. -Yeah, one of the really important principles from the responsible AI perspective is transparency. I've personally been focusing on that one very much. But it's fair to say that transparency from only the perspective that we broadcast information to people and that's it, that you should trust what we build if we are open about it, -

doesn't sound like the whole picture. What actually contributes to trust, in the way that we are interested in contributing to it? I think that the missing piece is the feedback loop: how do we actually build technologies that broadly allow meaningful feedback from the users? And how do we make use of that feedback in further developing our technologies? I think that's also very important from this trust perspective, because people trust what they can be part of. So if we can actually... One of the concerns that I showed regarding the technology is the feeling that we cannot actually influence it.

And that's pretty scary: if we see errors and we don't have a possibility to influence them, that's a scary scenario. I personally think that this whole area of human-technology interactions and interfaces for promoting a trust-based relationship between technology and humans, which is the space that what we call explainability goes into, is one of the most important and fundamental success factors for the future of technology. I don't know... Any thoughts from Eero? -Eero, take it from there.

-I gave an example during the presentation about... I see multiple angles. But if you take the product development point of view, for a company working on a global scale and having billions of users across different products, we have to have the feedback loop when we develop new features or functionalities. And it doesn't mean only relying on the data that we get from usage, but also really having people from different groups represented when we develop the functionalities, whether it's Android or Maps or others.

I think that's one of these things. The other angle is that we make it so that you can all go and look in your Google account at what information there is. And we have given you the controls, because that's one of the ways we know people can trust us: they know that they are actually in control of the data, and that there is an opportunity to delete it.

And there's an auto-deletion process. We don't keep old data that is unnecessary for running the product successfully. That's one of the angles.

Of course, there are a lot more cases coming on how human interaction and machines combine. For one of these, I think Helsingin Sanomat in Finland wrote a week or two ago about an experiment where mobile phone camera technology was used to help a blind person run without guides, and things like that. So there is a lot of innovation done together with humans.

I'm not sure if I answered that one; it's a difficult one. -Yeah. One more point on that, thinking of some concrete means for improving your position and capability to develop responsible technology.

Stakeholder engagement and how your team is shaped, what kind of people are doing the developing. If you put effort into that and have a diverse team that offers representation of the different kinds of groups you are targeting with your technology, you bring different voices and perspectives to the table. And you collaborate with the people who are actually users of your technology, or indirectly influenced by your technology. Those are probably the two most powerful ways. And you know, this is what design as a practice has...

That's normal practice for them. But introducing these as means to ensure ethical, responsible technology development and AI development is something that we clearly still need to discuss, because we are not living in that world yet. But it's very important. -Yeah, and actually I take... -Sorry, I have to add... Team diversity -

is an excellent comment by Meeri; team diversity is extremely important. I wouldn't limit that to technology companies. I think anywhere in society we should look very carefully at what kind of teams we have in place, and whether we see society from all the angles we need to understand. Media companies, for example, have been investing in this a lot lately, because they tell stories to people. They want to make sure that the stories come from multiple angles and are considered from diverse points of view. -And now, linking to my earlier point about the mechanism of how we rate companies.

What would you think if we actually started to report the diversity of the AI teams? Not only the board or management team, but the people who are actually developing the systems. That's one small item, but imagine that it was something we actually prioritised and were interested in, that companies started to report that kind of factor, and that investors took it into account in their decisions. It would be interesting to see the impact, because this is what we're talking about. No one is measuring that; no one is sharing any data about it. But I think it's a good time for companies to start preparing for that. I don't know what Google thinks about it, and whether you are perhaps already sharing this.

I know you've been putting a lot of effort into this. But as a practice, what we are interested in reporting is the board and management team composition, and maybe high-level figures like the gender balance in the company. But who is developing the AI technologies? No one knows. -I think that actually relates back to, as far as I know, when you invest in startups: you actually invest in the team. And I think traditionally it has -

been the diversity of the team more in terms of technology, design and business diversity. And if I'm optimistic, I see that there is this existing mechanism, a way of thinking, where we actually look into the team rather than at what they say the final product is. Which is something you, Eero, brought up. I made notes here when you talked about the capabilities in the data future, about experimentation. And I think that's a major shift -

in innovation and product development, where we are not so much asking you to tell us what your exact final result is, what the final innovation is, because you don't know yet. Instead we're looking at your capabilities for experimentation, and now you're kind of bringing forward looking at the diversity of the people. And perhaps, going back to what I said earlier, looking at the diversity of the values these people represent. -Yeah. I think that's absolutely an important point.

This whole learning by doing, and basically the interest... We should measure how capable and how interested a company is in recognising all the impacts that it has, how it is managing and improving its capability to have this sort of ethical foresight into what the impact will be. But also how agile, how reactive they are...

How do they adjust based on what they learn? Because no one can claim to be able to see what all the impacts are. These are complex technologies, and society is complex. People are complex. When we put those together, it's a fact that we need to be able to take feedback quickly and adapt how we work. So that's also really important, I think. And we shouldn't even strive for a situation where we act as if we knew all the impacts, as if what we do is ethical and we have managed everything; that's false.

That can't happen. But are we prepared, are we interested to actually see what the impacts are, and to take action when we see something that requires it? I feel that's really important: how we address this and how we evaluate companies' contribution and capability to manage the social risks. -We do report the diversity, Meeri. Of course, at a high level, but there is an annual diversity report publicly available. So what you suggested is, I think, something that every company should do. To really be open about, -

what kind of people are building the tools. That at least gives a hint of who we are hiring, what kind of backgrounds they have, and what the constellation of the organisation is. Of course, it probably doesn't go down to the team level, as you suggest. (Audio unclear.) I think we need to keep it very open, considering the global scale.

-One thing that I would still really like to bring to the table is that the sustainability leaders of companies are in a key position. And I feel that not all of them have figured out that AI and technology ethics is part of their domain. But that's something that... In order to get into this, and into the whole rating discussion that we have had here: sustainability reporting is basically the source of a lot of the information on this. And when we look at the leaders... We have gone through a lot of sustainability reports, looking at how companies who are using AI at scale talk about AI in their sustainability reports.

It's a very interesting space. There are a lot of companies who don't even mention the words AI, machine learning or artificial intelligence in their reports. And on the other end... There are many good examples. Telefonica was something that I was very excited about just a few days ago.

They have all of these perspectives on the table, and they systematically put effort into building and maintaining policies for managing not only privacy and data security, but also the human rights impact. They know what the specific risks are that are material and topical for their industry. They put effort into the education of their teams, the diversity of the teams and so forth. This is the space of the sustainability leaders. And I think it's an opportunity: if we get these folks to actually call for these different aspects and start to report on those externally as well,

we will have a lot more information on which to base our investment decisions from this perspective. -Eero, do you see that happening, either inside Google or in the news media companies you work with? There's this convergence of the two different things that Meeri brought up: there's the sustainability expertise and leadership, and then we have the AI ethics. Do you see interesting, strange overlaps or mixes going on, maybe breaking barriers? -Well, the barriers are definitely being broken. I think our organisation in general is known to be very open, and data flows have been really open. I used to work with very traditional publishing companies, also in Turkey and in Russia and so forth.

So there the data didn't flow within the organisation, and I see this happening in every country. I feel that that's kind of making the overlaps happen: people are working together and meeting together. I think it's just becoming more common.

I think today's topic, a data-supported future, requires these overlaps to happen. -We're running out of time. I don't know if I can get quick answers from you, but from both of you, starting with

2021-01-22 16:59
