The Tech That Comes Next
- Welcome everyone. We'll just be starting in a couple minutes. We'll just let people join the webinar. - Great.
Okay, welcome everybody, I'm gonna go ahead and kick things off today. My name is Sue Hendrickson. My pronouns are she and her, and I'm the executive director at the Berkman Klein Center for Internet and Society at Harvard University. Today, I'm co-hosting this event with Amritha Jayanti, the Associate Director of the Technology and Public Purpose Project at the Belfer Center, and I'm pleased today, really honored, to have with us our special guest, Afua Bruce. Afua is a leading public interest technologist who has spent her career working at the intersection of technology, policy, and society. Her career has spanned the government, nonprofit, private, and academic sectors as she has held senior science and technology positions at a data science nonprofit, the White House, the FBI, and IBM.
I had the privilege and honor of working with her before joining Berkman, when I was in private practice and she was at DataKind, and I'm thrilled to be joining her again. She has a bachelor's degree in computer engineering as well as an MBA and is the author of the book that we're gonna be talking about today, 'The Tech That Comes Next: How Changemakers, Philanthropists, and Technologists Can Build an Equitable World.'
I'd like to just start by acknowledging that Harvard is located on the traditional and ancestral land of the Massachusett, the original inhabitants of what is now known as Boston and Cambridge. We pay respect to the people of the Massachusett tribe, past and present, and honor the land itself, which remains sacred to the Massachusett people. With that, I'd like to just say that the event is being recorded. Audience members will not be shown. If you wish to ask any questions or leave any comments, please use the Q and A function on Zoom. We'd welcome you to use the Q and A function throughout so that we can pull questions together.
So when we get to the end, we'll have a curated set of questions that we can ask, and this is a friendly reminder to ensure that your mics are muted during the event. With that, I'll first turn it over to Amritha to just say hello, and then to our special guest, Afua. - Thanks so much, Sue, and thank you so much, Afua, for joining us today.
I just wanted to briefly also thank Karen (indistinct), and Becca Tabaski, who are on the Berkman and TAPP teams for helping us coordinate this event today. We're really excited to get into your book, Afua, so to kick things off, I'll actually just toss you an initial question to lay some groundwork, which is really introducing the concepts within your book and also the motivation. We heard from Sue that you've had such a breadth of background in the tech sector and in the public sector, private sector, et cetera. So we'd love to hear an overview of the book and what your motivations were for writing it. - Yeah, absolutely and thank you both so much for doing this event, doing it together and letting me be in conversation with both of you, I have tremendous respect for both of your work.
Sue, obviously as you mentioned, we worked together while I was at DataKind, and Amritha, I am happy to be a non-resident fellow in the Technology and Public Purpose Project, so I'm just really grateful to be a part of this conversation today and excited for it.

When I sat down to write the book, 'The Tech That Comes Next,' with my co-author, Amy Sample Ward, we recognized that we both think deeply about how technology affects communities, how it affects communities in positive ways and, a lot of the time, in not so positive ways. We thought a lot about how the technology development process works, how the funding of technology and the use of technology are subject to so many different forces from society, and where communities actually have a voice in that process. As we were thinking about things, we realized that, you know, there is a lot of research and a lot of writing about all of the harms of technology and all the ways that technology has been used to inflict harm and disenfranchise people, but we wanted to write a book that helped people imagine a more equitable world. So what would that look like if we took some time to step back and really think about what the future of technology could look like? What can the future of technology, especially in the social impact space, look like?

And so that's where we started. We started by thinking of the tech that comes next as really being about the tech, but also about community, because we want our future world to really center community needs and community perspectives. In doing that, we started by saying, well, we have to be really clear about what it is we value. Regardless of whether people articulate their values or not, when you develop technology, when you develop ways of interacting in society, it reflects values, and we can see over time some of the values humans have, which can be to concentrate power, to concentrate information, and to concentrate resources.
So if we think about communities more broadly, what would it mean to build a more equitable world? And what values would we wanna be really clear about? So in the book, we actually lay out six different values.
The first is that an equitable world requires that we value the knowledge and wisdom of lived experience, recognizing that the most impacted individuals and communities need to be central to decisions and solutions about priorities. The second value we articulated was that an equitable world requires that we value the participation of a diversity of people in decision making, in planning, and in building technology; regardless of their technical knowledge or training, we need to make space for both technical and non-technical expertise. The third value is that an equitable world requires that we value accessibility, that we value accessibility from the start. So thinking about the language we use, thinking about the tools we design and whether they are accessible to people, thinking about where we hold community meetings and how we solicit feedback from communities and from users: what does that process look like to actually be accessible? And then value number four that we articulated was that an equitable world requires that we value the multiple ways that change is made, meaning that we have to recognize that even though we might want systemic change in a more equitable future, there are needs that need to be met today and there are needs that need to be addressed today, and so how do we live in that tension, both making long term progress while meeting immediate needs? The fifth value we articulated was that an equitable world requires that we value the strength of collectively creating (indistinct), oh, excuse me, creating a vision of a better world.
So it means that, you know, Amy and I spent a lot of time writing this book, but we are not the be all, end all on this. We need other people, which again is why I'm so excited to be in this conversation with both of you, Amritha and Sue, to expand who is included as we are thinking about what it means to collectively create a vision of a better world. And the sixth value we articulated was around the dedication of individuals: valuing the dedication of individuals and communities in pursuing knowledge. So recognizing that there are many different types of knowledge and different types of expertise, and we wanna hold a healthy respect for true expertise and learned expertise in its different forms.

And then, now that we know what the values are that we're trying to build around when we say let's create a more equitable world, let's imagine a more inclusive future as it comes to technology, we had to figure out some way to organize, and the way that we chose was to identify five specific roles that most people will be able to find themselves in. I'll name that you might identify with one role today and with a different one in another couple of weeks, and so recognize that you can move across roles, but also that we need to go across some traditional silos in order to really develop the technology that comes next and to really build a more equitable world.

So the first role we identified was the role of social impact organizations and employees at social impact organizations, and one of the key questions that social impact organization employees are thinking about is, how are staff supported and resourced to align technology with the mission and the community's needs? So how do these organizations really see technology as a way to advance their mission and enhance their mission, and not compete with their mission or other resources?
The second category is technologists. As a trained computer engineer and former practicing software engineer, this is where I most readily identify. With technologists, we are looking at how we can change the tech development process and really think about how technologists are investing in the leadership and capacity of the impacted community to support their long-term ownership. What does it look like to distribute power differently? What does it look like to protect rights and privacy in those situations? The third category is funders and investors.
So everything from philanthropists to venture capitalists, to corporations, to individuals, asking funders and investors to really think about changing the way tech and social impact funding works. So asking investors, "Are you committed to funding inclusively for intentional engagement, iterative processes, and long term support?" And the fourth category is policy makers. Policy makers have the ability to really change laws and policy, one of the strongest levers we have if we're going to make systemic change, and so a question for policy makers to contend with is: are you meaningfully engaging communities most impacted by digital divides and technological harm to inform proactive policies? Are you thinking about the policies that govern technology and the technology that governs policies?

And then the final role that we identified is the role of communities, recognizing that we have people in multiple communities and communities from diverse backgrounds, communities of historically overlooked people, and asking these communities, "What is it that you want to change? And what are your biggest dreams?" And really using the answers to those questions to drive the rest of the process: to drive the policies, to drive the technology, to drive the funding, and to drive even some of the goals and formation of social impact organizations. And so with those framings of both, let's be a little bit more clear about what we mean by equity and what a more equitable world looks like, and then let's give ourselves a framework of five different roles and actions that each of those roles can take, because there's a little bit of responsibility for everyone here. That's the framework that we use to really think about the tech that comes next.
- That's great. Thank you so much, Afua. It's such a comprehensive overview of the content and one thing that I loved about the book is that it's very action oriented.
It's almost like a toolkit with the questions, et cetera, that are in it, so it feels like it really helps us carve a path forward. One concept I really liked in the book too, that I'd love for you to speak a little bit on as we move into more of the meat of the conversation, is the distinction you make that tech is not a solution, it's a tool, and how important that framing is as we think about how tech can be applied in different sectors and spheres to ensure that we're building it to really elevate the values that we've articulated, rather than assuming they're integrated or embedded already. So can you speak to that distinction, especially in the way that you think about how the narrative around tech is now and where it should be as we move forward? - Yeah, absolutely. Again, I am an engineer by training. I love engineering.
I love technology. I think it's amazing. I think engineers are amazing, and we can do wonderful things. But if we always approach all of the problems as though technologists or engineers or computer scientists have all of the answers, and sort of assume that because someone can write some really well designed code, or frankly not well designed code that still runs, they have all of the answers to drive every other aspect of society, we really miss the mark.
We need to really focus on what are the problems that we need to solve and then work backward from there. There's an example in the book of an organization called Rescuing Leftover Cuisine, for example, that existed to rescue leftover cuisine, as the name might imply. They started out as an organization that was literally, you know, a team of volunteers who were trying to solve food insecurity. They'd get a call from someone, you know, a restaurant or someplace, saying, "I have leftover cuisine," and they'd call a volunteer to go and walk to the place, pick it up, and take it to the next place, and that worked for some time. But then they thought about how they could be a little bit more intentional about this, and they thought about the technology that might help facilitate it. Rather than just asking someone to design a technology, a platform, that might facilitate that, they actually spent some time really intentionally designing their own process, you know, creating, I think it was a Trello board, with requests that people could put in. By people I mean volunteers, staff members, people from the organizations donating the food, to document the requests they wanted and the features they wanted in this technology system, and then people could sort of upvote or downvote things, and when decisions were made that either aligned with the clear community wishes or contradicted them but were necessary, clear communication could be had. That is how the platform that still drives Rescuing Leftover Cuisine today was developed, and they were able to go from one small organization that, you know, operated at this human scale to an organization that operates now in, I think, more than eight cities across the country. So, again, it's really by focusing on what the actual needs were, talking with those community members involved, using that to help define the technology, and then moving from there.

When you skip some of those steps, you can get into some of the cases we've heard about, of people designing, you know, an app to solve hunger, and here we are, years later, and that app has not solved hunger; or designing technology to, you know, help assist people in walking, but you didn't actually talk to people who needed the assistance, and so those devices then aren't used, or if they were used, they actually inflicted harm on people. And so that integration and real focus on the problems and the communities who are most affected by them is really important.
- Yeah, I really liked the other example in your book too, that you provided, of DataKind working with John Jay College to collaboratively develop the program to identify students at risk of dropping out and to kind of help propel them across the finish line, 'cause it seemed like it was exactly that kind of model that you were talking about, of engaging with what the community needs were to figure out how to design that. One of the things that stuck out to me with that is that, and I think you noted it in the book, the weight of particular variables in algorithmic decision making had serious real world consequences in determining who received services and who didn't in that context. So it was important for framing, and one of the things, as a kind of technologist, and since we're both in academia here, that I was wondering about is: how do you believe that technologists should be trained differently to handle, and really think about, the ethical and policy implications of their design, deployment, and development choices, given that it really is ultimately not the tool but the people and the problem that you're trying to solve?
- Yeah, absolutely, and just expanding on the John Jay College and DataKind example that you summarized quite well, Sue, I also just wanna note that the algorithm, or the models, built to help identify students who were at risk of dropping out, and then to identify what interventions were best, that was a recommending tool for people who then were ultimately involved in making that final decision and implementing it. And so as we think about training students differently, I think we should also talk about some of the limitations of technology, some of the ways that we want humans to continue to interact with technology, and some of the decisions that we might not wanna cede to computers, which are unfeeling, which have had biases encoded in the way they work, and which then operate at scale and very quickly to make different decisions. So I think that's really important. I think another thing that's important when we discuss training students differently is to recognize the different perspectives that matter in thinking about some of the ethical considerations. So yes, absolutely, engineers should take engineering courses and design courses, but it's also important to start to identify ways to build empathy, ways to assess a data set for completeness and where you might go to help build out that data, ways to work across disciplines and recognize that people from a law background or social science background or other backgrounds have different perspectives that are really important to incorporate in the design process. Because then, as people are thinking about coding different weights and different variables and different values, some of those considerations can be translated into the code to then have a different impact on the overall process.
I think also showing some of the practical side of what's working and what's not working is really important. And then, finally, centers like the ones both of you run, which really have people deeply researching some of these issues and their positive and negative implications, hearing from people who are doing that research and dealing with those technologies on a day-to-day basis, I think, is incredibly important.
- Yeah, I'm awed by the wealth of information that the two of us get from the people who are affiliated with our centers; it really is amazing in kind of driving the research and just understanding of a lot of these issues and implications as we think about how to go forward with them. One thing you touched on in there was about the silos and breaking those down, and I was fascinated, when I was looking at your bio, that in the White House you had overseen a hundred different federal inter-agency working groups, tackling challenges ranging from the environment and sustainability to homeland and national security to STEM. Inter-agency working groups are a really hard thing to do, usually, because you are kind of breaking down silos and, you know, kind of building those challenges across them. But one of the things that we're seeing in all of our work is that in today's world there's so much of a need for interdisciplinary, multi-stakeholder problem solving, but the necessary actors in that historically haven't been broadly enough defined for it. And so I was just wondering whether there is expertise that we should cultivate to bridge those divides and build those solutions, and whether there were lessons from your experience building these inter-agency coalitions that could help us get different silos of expertise to more logically and seamlessly work together on these kinds of issues?
- Yeah, absolutely, I mean, I think what I would first say is to develop some patience. Managing, you know, for myself, a hundred different inter-agency working groups took a lot of patience, a lot of finding people to be the right messenger to different groups and figuring out what that looks like. And even though, you know, I say that somewhat jokingly, patience was really critical then, and I think it's also really true when we think about developing working groups in general and working across different silos, because we need to make sure that we are taking the time to really develop a common language. You know, engineers might have a different language than lawyers, who have a different language than social scientists, and more and more, and so really taking some time at the start to make sure that everyone understands what the other person is saying and what that looks like.
Then recognizing that the time to actually let people speak and then process and react is not insignificant. So as you're building project plans, as you're designing processes to incorporate people from traditionally different silos, allowing the time for some of that development and that information exchange to happen, I think, is really, really important as well. - I think piggybacking on that a bit, there's this idea of cohesion. So you've named a few different stakeholders that really have power in the way we form and create our tech, and so we can think about moving forward and how, for example, funding streams can shift and tech developers can change the way that they're thinking about things.
How do you think about the cohesion of change within the stakeholder groups that we have, to ensure that they're coordinating in the way that they move forward and actually implement change that really integrates, for example, community perspective, when we're thinking about technology? Because again, the tech developer sometimes is sitting in their own corner, thinking about things and doing some design experiments, but that's not as well interfaced with some of the philanthropists who are funding certain projects, et cetera. So to that point of cohesion that you made, how do you think about that as we think about these key stakeholder groups? - Yeah, absolutely, and this, Amritha, I think is a great question, and one that I'd love to hear your thoughts on, and Sue's thoughts on as well, because I know in the very interdisciplinary work that you two have both done, I'm sure you've got a lot of examples, probably both good and bad. But when I think about how we sort of start to encourage this, I think there are a lot of different levers to pull from each of those different groups. And so as we're, you know, thinking about funders, for example, funders and investors: if investors start to ask questions about, did you talk to a certain group of people? Did you do some type of check, or in, you know, a due diligence process, ask who was included and who was not included in those conversations? That can start to move the needle, and I know there's, you know, research by, I think, some folks in the Technology and Public Purpose Project about that. I think Cornell Tech has done some interesting research on this as well, and there are other places too.
I think, you know, on the technologist side as well, it's really important to recognize that there are technologists at all different spots in the process, and there's something, you know, at least one little tweak, that people can make at each step. So maybe your part of the tech development process is when you are gathering user requirements, and so then you're really thinking about what inclusion and equity look like there. Or maybe it is when you're writing code, and then maybe it's your job to think, you know, who did we get this information from? Who did we get our understanding of what our ultimate goal is, and what this code is supposed to do or not do, from? And asking questions along the way. So recognizing that no one person has to transform the entire system, but there are many different levers that many different people can pull. - Sue, I don't know, did you wanna jump in with your thoughts on some of that? - Sure, no, I was just gonna kind of echo that. You know, I'm a committed generalist who's worked in all forms of emerging tech, and each time you have to kind of knit these folks together, and I really appreciated that in the book, because people come from very different perspectives, and you have to figure out a way to create that dialogue and create the baseline for discussion about these issues. One of the things I really enjoyed about the book was both the values based framework that you were giving people and the very concrete questions to enable that dialogue: okay, well, if I wanna have more inclusive development and I'm a funder, how do I go about that process? I think as people start to ask those questions in their respective silos and start to understand the need for each other better, hopefully we can change the paradigm as to, you know, how people approach these different issues.
- Yeah, yeah, definitely. I think one point that you both touched on is the idea of communication too: number one, communicating, bringing your thoughts to the table, but also having the right language to speak. One thing that we notice, especially through our work at TAPP, is that everybody has a different way that they communicate, and so to bring different stakeholders to the table, there's some work to be done to make sure that we're all speaking the same language, and also that people's incentives are presented when they come to do the work, to ensure that we can align those in a way that makes it more productive. But it also points to a cultural shift that kind of needs to happen, and I wanna circle back to a point you made, Afua, about focusing on the problems, and also the fact that you have a computer engineering background and have done the tech development.
So I have a computer science background too. I've worked on some tech teams, and one thing that is common is that often engineers get really excited about hard problems. They care a lot about solving the problem as a certain kind of technical challenge, but then the values get lost and the point actually kind of disappears in the obsession with some of those technical challenges. So I'm curious, from that perspective, how do you ensure, when we think about problem scope and the intentionality around design, et cetera, that we can catalyze the right kind of cultural shift so that we're focusing on the problem in the right way? - Yeah, absolutely. So I think we can tackle it from a couple of different angles. One is just acknowledging that there are really hard problems out there that need to be solved, and that need some super complicated technology and people who really understand the nuances of how to develop different technology systems, how to secure different systems in different ways as people move about, as people change locations, as people adjust their identities or the ways they show up in different spaces. There are really big challenges around privacy concerns and how your data is protected and how your information is shared or not shared. So there are extremely complicated problems that need to be solved, even in the social impact space, and I think galvanizing people around the fact that those do exist, and that they're worthy of the technology expertise, is really important.
That said, sometimes an organization just needs a spreadsheet, maybe not a full-on blockchain solution, just a spreadsheet, and so also recognizing when that is the right solution matters. You know, I think it's important to be upfront with people about that, to say to the engineers that some of your time might be spent on the really exciting work and some of your time might be spent teaching someone how to use a spreadsheet or developing a new spreadsheet, and just setting expectations in a more reasonable way, I think, matters a lot. It's also true that even for complicated problems, or perceived complicated problems, in the private sector, the work is not always complicated or exciting. If you talk to almost any data scientist who wants to write the most complicated data science algorithm, they will also tell you that they probably spend a lot of time cleaning code or cleaning data, which is not anything anyone likes to do. Or maybe there are people out there who like to do it. I have yet to meet them.
I am not one of them, but, you know, you sort of take the good with the bad, and there's that perception of some fields, and so I think doing a better job of setting those perceptions for other types of work, and applying some of those perceptions to technology in the social impact space, could go a long way. - Yeah, it is one of those things where the shiny new tool is not always the best. Sometimes the simple solution is what was needed. Michael Bremer asked a question that was kind of a related follow-on, so I just wanted to throw that in here: "With respect to thinking about the developers, so often the developers do not get to talk or interact with the people who actually use the products and develop (indistinct) instead," given a list of requirements, which we've all had, you know, here's the list of requirements, go execute on this. You know, how have you approached changing that dynamic in organizations that may not be set up for that dynamic to change? - Yeah, absolutely. So again, recognizing, you know, the power imbalances in different spaces and who has access to what levers to pull, in those cases sometimes you start not with the developers themselves but with the requirements process: what the requirements gathering and requirements definition process looks like, what information is captured at that requirements level, and what changes you can make there can have a big impact. So starting in that requirements process by asking for some changes, to document what community this came from, who was spoken to, and what was prioritized or emphasized, I think is important.
Also, many different development shops, and this is certainly true in the federal government and in some private sector places as well, have your privacy impact analysis at the end of the process, and so also thinking through where that needs to happen in different ways.
So that type of privacy impact analysis that takes place at the very end does go to some assessment of how the technology is used and how it impacts communities, but does it have to take place at the very end all the time, or are there some ways to add intermediate checkpoints along the development process? Changing some of those things can start to get at this in an organization that is more rigid and may want to do what is seen as protecting the developers' time. - Yeah, and that goes to one of the other questions as well, as to whether you've seen good examples of ongoing engagement of non-technical but issue-proximate stakeholders engaging with tech development. Are there good models out there that you've witnessed? - Yeah, absolutely.
I think I've witnessed, you know, specific organizations working really hard to do that well, and sometimes specific teams within organizations doing that really well, excuse me. So, you know, now I consult with nonprofits directly, and I work with a particular nonprofit in the benefits access space who, throughout their development process, also had a team of people, not the developers, but a separate team of people, who consistently went out into their community to ask, "How are things working? What do you wanna see? What do you wanna change?" But where the interaction came back into play is that every other week both of those teams met together, to share where we are in the development process and what we have heard from community members, and to have that exchange just be part of an ongoing process that's built into how the organization works. So I think that was one model I've seen that was very successful. - That's really interesting, actually. Could you elaborate more on what even that communication scheme looks like? 'Cause often there's a dance that happens and people bring recommendations to the table, but how it gets translated into actual technical development and requirements is a whole 'nother step of the process. So how have you seen that be successful, and are there best practices that you've noticed that you would then recommend to other groups as they think through this? - Yeah, absolutely.
So I think an underlying theme might be that clarity is kind, and so you should be as clear as possible in collecting that information from community members, not just leaving it as, this person talked to someone at a particular client site and it turned out that most people were on phones, but stepping back from that. To give an example, and this isn't actually from that same organization I mentioned: a team had spent a lot of time developing a website interface just for a computer or laptop, and then when the customer team went out, the particular office that was being used as a community center to help people access benefits only had one computer, and so you had most of the staff standing around on their phones trying to bring it up. So just the message of "a lot of people are on their phones" is probably not sufficient to really think about requirements, but there's also that additional piece of "people don't have access here."
So now let's think about how people might access things and where they might access them, and even how they might do training of others or how they might be interacting with their customers. That could then lead to a different set of requirements, and additional requirements, for the development process. - So I'd love to switch gears a little bit to talk about one of your other stakeholders. You know, I was intrigued by your discussion of the roles of philanthropists. A lot of the time, they're kind of left out of the map of how organizations are working on things, even though they're so critical to the funding for the social impact sector and for what you can actually do with technological solutions in the sector. How do you ensure that the technology funding is democratically accountable and not just reinforcing the funders' values and assumptions when they come in? Because we'd all like funding, but how do you make sure that it's geared toward the values that you're looking for and the projects that you're looking for, and that you have that kind of accountability? - Oh, Sue, there is so much packed into that question. So many different ways to touch on it and so many different points.
So I think there are a couple of different things that are important to think about when we think about funding, and yes, from the philanthropist space as well. You know, organizations that are funded by philanthropic dollars often want more philanthropic dollars in order to execute against their mission, and so some of the points that we make in the book are that it's really important for funders, for philanthropists, to think holistically about the funding process. So yes, perhaps funding an initial project, but also thinking about what sort of overhead support is needed, what sort of support is needed for some time to experiment, some time to explore some new options, a little bit of tolerance of risk that something might not work initially.
The John Jay College example that we mentioned earlier involves some of this freedom to sort of play around, right? That's how the team was able to test something like 20 models before coming up with the final two models that were used, and so building in the time and funding to have that exploration time is really important. I think also, at the end of the process, it's important to build in time and funding to share back out to the community what worked and what didn't work, and so really allowing some funding, again, thinking holistically, for organizations to share with similar organizations what's working, or to train other organizations on what's working, or to allow the community itself to be trained on how to maintain their own technology that has now been developed with them and will be used by them. They should also probably have some agency in maintaining that technology themselves. And then of course, one of my pet issues is the need to fund infrastructure, sort of that digital infrastructure, so that, as an overall space, advancements can continue to be made. - I love that you threw maintenance in there. When I was in private practice, there were so many times when I'd work with nonprofit organizations where they'd had a grant that allowed them to build and to create, and then there were changes that happened either in their world or in the technology, and they just didn't have the follow-on funding to figure out how to have that continuing chapter for the interesting things that had been built. - And yeah, just building on that, it's so rare anywhere in technology that the first version is the final version, right? We wouldn't expect, I don't know, Facebook is for some reason the only tech company that is coming to mind right now, you know, Facebook's first version of technology to be what we use today.
We also probably don't expect today's version of Facebook to be what we use in the future. But, you know, Airbnb just made a big announcement about some big changes they made, and it was adding categories of housing, and so we accept the fact that there'll be multiple iterations of technology in the private sector space. We should also expect that in the philanthropic space, or in the social impact space, and so I think for funders to recognize that, you know, it's cool to fund the first version, but funding version 1.5 and version 2.0 is also really valuable and incredibly important to get to systemic change that really makes a difference in people's lives. - So I'll quickly flag that we're around 2:45, so we're gonna shift over to audience Q and A in about 10 minutes. So if you all think through questions and put them in the Q and A chat, we'll field them in the next several minutes to make sure we can get to those. But I would like to ask a follow-up question on the philanthropic angle, which is, to your point, there's institutional change, but from an individual funder perspective, how can a funder now say, let me shift the way perhaps I'm thinking about this? How can they implement or catalyze some of those changes from the individual's perspective that then leads to the sort of institutional change that you've alluded to? I ask that because there's also actually a TAPP fellow in the audience who works on some of these issues, and I know there are people in the community here that are listening that are probably thinking about what individual power they have to make that change. - Yeah, absolutely.
So I think, start with maybe individual power, in the sense of a specific program officer, is that your question? Sorry, okay. - Yeah. - Great. So I think, from, let's say, an individual at a large philanthropic institution: asking and having conversations with your grantees about what some of that holistic support is; pushing on, okay, we wanna fund this program specifically, but what other support might you need? Have you thought through the maintenance? Have you thought through some of these questions about spending some time to better refine a proposal? Can you think about different ways to even just give out the funding, even if it's the same amount, one part to do some of the exploration and another part to do the implementation based on what you found? I think also really being intentional, as you're assessing different organizations, in asking organizations to get very clear as to what communities might mean to them or what equity means to them. Just as, you know, I started out by saying here is how in the book we define equity, ask organizations specifically, what does it mean when you're going to run an inclusive process? Tell me what that looks like in your overall project plan and your activities. So asking folks to get a little bit more clear and a little bit more specific about what those processes look like, and then what additional funding might be needed to support the time to do that. - And then I'm actually curious also, one thing that we've been looking at at TAPP is the portion of funding that goes towards tech that actually comes from philanthropy versus private money now, or rather, like, venture, et cetera.
So I'm curious, how do you see philanthropy playing a really important role in the way we fund values-oriented tech, as compared to other financial or private capital schemes that exist that flow towards companies like Facebook, et cetera, that fund maybe more consumer-focused tech? What is the role of philanthropy in that larger ecosystem when we think about how we pick the winners that then, I guess, define our tech landscape as we move forward? - Yeah, absolutely, and I'm going to put this question back on you, Amritha, after I have answered it, because I definitely wanna hear a little bit more about some of the research that's been going on. But I think, you know, one thing to think about, from what philanthropy can do here, is to recognize that we don't need to only have a handful of technology solutions which then become the be-all, end-all of technology. I think sometimes in the private sector we can fall into the trap of thinking that only a few organizations are worthy of money and worthy of sort of reverence for how wonderful their technology is. But really, in the social impact space, and I'd say broadly, not just in social impact but in tech in general, it's about understanding and recognizing that we want technology to be more accessible for everyone, and so we're going to need more approaches.
When you look at who receives venture capital funding, or even who receives philanthropic funding, the diversity of people who are leading those organizations is not great. So we really need to think about how we can ensure that a diverse, wide group of people are getting access to the technology, and getting access to the time and the resources, to go from an organization that is walking a neighborhood to pick up food and deliver it to someone else in that neighborhood, to then really building a structure, essentially building a tech platform, that enables neighborhoods and cities across the country to be fed. And so I think that is some of the power that philanthropy can really ignite when we recognize that there are many people who are capable of designing technology systems, many people who are capable of designing technology that will enhance and advance some of the missions that we've said are important. So let's fund them, and let's make sure that we're funding a wide swath of them.
- Yeah, that touches on an issue that a lot of folks in our BKC community have been concerned about: the potential for adopting systems in the social sector that enforce algorithmic bias, further marginalizing and disenfranchising people. Are there specific things that you think philanthropists and change makers can do to harness the knowledge of communities to design less harmful tech? And beyond just reaching out to the community, even when it's developed with that insight, are there still kinds of accountability audits? What kind of procedures and protections do you put around it to make sure that the data sets really are helping, and the solutions really are helping, the marginalized communities that you were seeking to support? - Yeah, absolutely. I'll answer briefly and then let Amritha respond to this question and the last one as well. I think, you know, algorithm audits are incredibly important, and we should continue to do those, or we should do more of them since those are fairly new. But I also think it's important to ask the communities who are impacted what is working and what's not working. Even if someone is not a trained engineer, not a trained technologist, or not a trained policy maker, they're often able to articulate, "Am I being harmed by this? Can I actually access the resources that I was told I was going to be able to get? Can I communicate with my friends and family? Can I get the time and space to sort of think and actually do the things that I was told that I would be able to do, or that I articulated that I wanted to be able to do, with the intervention of this technology? Do I still have some agency in making any changes?" The community will know. So I think philanthropists can have a stronger tie to the community, or ask some of their grantees to really show that interaction with, and the reaction from, really the end users, the people who are most impacted by the solutions.
I think that is one step that we can take. Amritha, what would you say? - Yeah, I think that's spot on, and I'll just echo your points. We've looked at it from a slightly different angle, especially in terms of philanthropy: philanthropy that funds tech policy research, and how that sets priorities in terms of research agendas, et cetera, and how that then informs these conversations about what values-oriented tech development really looks like, and how we craft policy that's informed and effective for the communities that we're looking at. And I think what's interesting is that there's so much potential for philanthropies to play a really big role in this space, but right now there also isn't that much accountability around them as institutions, sometimes, in terms of where the money is going and how it's really benefiting communities in certain areas. Often it's assumed to be of good nature just because it's this form of giving that we really elevate, and it is great, and we think there's a huge role that it can play in the way we think about, again, agenda setting and the kinds of tech that get elevated. But we're trying to think about, okay then, how can we ensure that, Afua, to your point, they are really interfacing with the communities that they say they wanna benefit? How do we make sure that there's an accountability scheme around that? And how do we make sure those dollars are really going towards the kind of impact that they say they wanna have? We don't have the answers to that yet, but those are some of the questions that we're thinking through, which is why it's great to hear your thoughts on that and how we build that community engagement in a way that's really effective. - Exactly, it's pulling all of the different stakeholders together to figure out how to solve this. So, with that, I did wanna turn things over to the Q and A so that we have time for audience questions.
Thank you so much, Afua, for these comments. Maybe we'll just run through the questions in no particular order.
So the first question is, "If the pursuit is to build a better world, could there be a model wherein big tech internet companies, who are criticized among other things for their size and wealth, could be asked to substitute, in part, local taxes on global revenues with a contribution to a global development fund, which could be highly impactful, which they could operate together in the interest of humanity and manage with the efficiency with which they manage their commercial projects? Would that be a business-like design for global philanthropy?" Interesting concept. It's like how to bring private sector dollars into a fund, though some of the questions would still remain as to how that fund gets allocated, for what and for whom. - Yeah, absolutely. So is there a space for private sector dollars to make a difference? I think there absolutely is. We've seen some similar funds for other challenges, right? We have the IMF, world development institutions, and more that pool resources and then manage those resources with varying levels of success. So I think what is important is, yes, to look for different ways to get additional funds and additional attention, but also to make sure that there's accountability built throughout the process: how are we continuing to assess the impact on communities, and how are communities able to assess the impact of what has been said to have been done for them, or said to benefit them? I think that's really important.
I think it's also important to recognize that efficiency is really important, but sometimes we have inefficient systems because we're humans, and humans, you know, like to have relationships, which are often inefficient, or like to have different interactions, which are also inefficient. So I think it's also worth having the awareness that, even as new systems or new structures are built, there might still need to be some inefficiencies to account for the fact that it is humans that we are designing for and humans that need to benefit, and not just technology systems that are sort of divorced from that human implication.
- Yeah, absolutely. Wow, there are a lot of good questions in this chat. Let's see.
Okay, so for the next one, I see from Claire Walsh, "Sometimes we're faced with difficult decisions, like what to do when our company's survival depends on doing business with another company with a poor record or bad practices. So do you have any ideas, in those circumstances, apart from looking for a new job?" I think it goes back to that question of individual power and how we can cultivate the change we wanna see, perhaps in an institution that creates a lot of friction. - Yeah, absolutely. I think it certainly does go to the question of individual power, but it also goes to the value I laid out at the start, of balancing systemic long-term change with immediate needs. So if you're making this decision on an individual level, I think it's really important to recognize that it is an individual decision, and the calculus that you make may be different than other people would make. The calculus you make today may be different than what you would've made six months ago and what you might make six months from now. So again, look for ways where, if you wanna be dedicated to building this more equitable world, you find what feels like the best way for you to do that. That might be finding a different job; it might be staying in your current job but spending more time partnering with an organization like DataKind, or the many other organizations who have structures to combine community input with strong technical expertise, and so then you're sort of contributing that way. Or maybe you have the finances, or partner with other people who have the finances, to do some of this funding differently, as we've talked about before, and create space for organizations to have more of the infrastructure or more of the experimentation or more of the knowledge sharing.
So I think there are a lot of different ways for individuals to contribute to solving these problems, and some of them might ask people to leave jobs or to really advocate within jobs, or might ask people to find different paths in and to pull different levers in the system.
- If you don't mind, I actually would love to probe you a little bit more on that, because you've had such a breadth of experience and you've hopped between sectors and roles. So I guess, is there a time where you felt this kind of tension, and you don't have to name the organization, but rather a moment where you felt like perhaps your individual concept of what change looks like, or what your values were, and the institutional values just didn't align appropriately, and you had to make that decision of, do I stay and make change here, or do I leave and find a new home? - Yeah, absolutely. I would say that I have been fortunate in the positions that I've held, the teams that I've worked on, and the organizations I've been employed at; I've always felt enough of a connection to the organization, or to the team and to the mission, that I haven't struggled in a way that would make me leave an organization for that.
There certainly have been teams that I've been on, for example, or organizations I've been a part of, where I have desired change. You know, one of my strong commitments throughout life is to diversify the tech space, and if you look at my employers, I've worked in spaces where there are not a lot of Black women. And so it's about finding ways to create that change, or work towards the change, because one person can't single-handedly change the numbers or change what diversity looks like in the tech space: finding ways to advocate for that within the organization, finding ways outside of the organization, finding other associations for myself to be a part of that join with other people who also care about diversifying the tech space. So I'd say for myself, that's sort of how it's manifested. - Yeah, it's really important to figure out how we can continue to diversify the tech space and be the voice for the change that you want to happen in your organizations. And I hear what you say, and that's been my experience too, that there isn't a simple answer in a lot of these organizations, and there aren't simple choices about how to effect that change and how to figure out the pathway and work through it. One of the other interesting questions that we got here, going back, I think, toward the private sector, is, how do you think that private companies can join the public sector to achieve social changes? I think it's a really important component that wasn't featured as much in terms of the stakeholder base here, but they obviously play a really important part in the tech space.
- Yeah, absolutely, and this is another question that I'd love to hear from both of you, Sue and Amritha, on. From my perspective, there are many ways the private sector can engage. One is partnering with the social impact sector in different ways, and so there are a number of public-private partnerships that have been created, especially over the past several years.
There have been a number of ways for the private sector to partner with nonprofits to provide technology or expertise. One of the things we talk about in the book is making sure that nonprofits still have agency in that pairing, that it's not just a sales mechanism, but actually, what does that organization need? What are their community's needs, and things like that? So I think that's really important. I think also, you know, finding different funding vehicles, thinking about new ways of funding this work, because it's not just the 501(c)(3) status that's gonna make sense for everyone. So what are different ways of funding this work? How could the private sector also encourage different types of structures, of organizations, of companies, and more, to think about really supporting public goods through technology?
- Yeah, and I worked with a lot of philanthropic arms of private sector organizations, which were really important in driving the change in these spaces from a funding and resource perspective, among others. And I found one of the areas in your book interesting, too, about how private sector solutions don't really perfectly overlay, and oftentimes aren't a wonderful gift when it's just, here's the technology I have, you can have it for free. It's really more about thinking critically about how that engagement can happen in the public-private partnership in a way that is driven by a lot of the values that you mentioned in your book, and driven by the needs of the social impact organization and the needs of the communities that they're serving. - Yeah, absolutely. I think it speaks to the opportunities that exist there, especially when the incentives are aligned.
So there are sometimes specific opportunities where the incentives of the private and public organizations really overlap, and I think when we can find those partnership models, a lot of fruitful things can come out of it. But related to that, there's another question here that I think is really interesting, which is, "How can more responsible companies address competitive pressures from less responsible companies?" So when you're thinking about the marketplace, and perhaps companies that do wanna align more with some idea of social impact or values alignment that makes them potentially less competitive, I guess that's the claim there, how do you think about survival and market pressure and whether to then shift the way you're doing your product development process or putting out products to consumers or communities? - Yeah, absolutely, and I think Amritha said that a TAPP fellow has done some research on this, so if so, I'll ask you to maybe share a recap; I'm sorry, I'm forgetting the name of the person who's done some research on this, but I would love to hear any thoughts you can share on that. You know, I think it's important to think of different vehicles also for private sector companies, and so creating a sort of nonprofit arm as a way to separate the two, I think also looking at who's on the board of organizations, whether i