Hey everyone, thanks for joining me. Today we're going to be diving into something pretty important, I think: the ethical side of engineering and how we adopt new technology. It's not just about building the future, right? It's about building it responsibly.

Yeah, it's like, you know, we've mastered how to make the clock tick, but have you ever stopped to think, what time should we even set it to?

Exactly. We're really going beyond just the how and more into the who, what, where, and when of it all.

Yeah, and thankfully we have some really interesting research to help guide this deep dive today. It's from the 2024 ASEE Southeastern Section Conference, a paper called "Engineers as Agents of Technological Change."

Oh yeah, this is a good one. A lot of different experts weighing in on how all engineers, no matter what your specific discipline is, really do play a huge role in shaping the future.

A huge role. And I love how the paper jumps right in with this statement: it's not just about how. And I think as engineers we love the how, right? We love the nuts and bolts of making technology work. But sometimes we can get, maybe not oblivious, but a little less mindful of all the other questions, the other really important questions.

Well, yeah, it's like having a powerful toolbox. Knowing how to use all the tools, that's important. But so is understanding what you're building and who it's for, you know, thinking about the consequences, intended or unintended. That's where it gets really interesting: the bigger picture.

And they use a great example in the paper. They talk about blitzscaling.

Oh yeah, everyone's familiar with that, that whole move-fast-and-break-things mentality, rapid tech adoption, often for the sake of just dominating the market. And we've seen how many companies have exploded in size using that very principle.

It's fascinating, right? Because on the one hand it can be very, very effective. But I think what the paper argues is that that singular focus on speed can sometimes come at a cost. Other considerations, ethical considerations, might fall by the wayside in that race to be the first, the best, the biggest.

Yeah, and that can lead to a lot of unintended consequences.

It can. It almost reminds me of building a bridge as fast as you can to get ahead of the competition, but forgetting to check whether the materials are actually sound.

Exactly, which leads us perfectly into the next major point the paper makes, which is the illusion of better. We always just assume that newer technology equals better technology.

A hundred percent. But is that always the case?

That's such a slippery slope to go down, isn't it? You just assume, oh, this is newer, therefore it must be better. And the paper really emphasizes that progress isn't always this straight line.

It's true. And sometimes a slower, maybe even more deliberate approach is actually the better way. They quote the philosopher Marc Steen, who argues that this slower pace gives us time to reflect on the uneasy questions, the vulnerable experiences, the uncertainty, all things we don't really have the luxury of pondering in our fast-paced world.

I think that's a really refreshing way to look at it, that slowing down is sometimes maybe the faster path in the long run. But how do you balance that, that need for reflection, with a world that's obsessed with what's next?

Exactly, always moving on to the next big thing.
And that's where I think the paper really digs into what they call the agency dilemma. It gets into this idea that every choice, every single choice you make as an engineer about what technology to adopt or how to use it, matters. Some of them might feel small, you know, insignificant. But others, those choices could have a much, much wider impact.

Yeah, the butterfly effect, right?

It's like the butterfly effect, exactly, but in the tech world. A software engineer in Silicon Valley writes a piece of code, it seems small at the time, but it snowballs into something that affects millions of people's lives.

Exactly, in ways that maybe they couldn't even really imagine when they were first writing that code. And it all comes back to this idea of responsibility, right?

Yeah, and the paper draws a really important distinction here, starting with what they call passive responsibility, you know, being held accountable after something bad has already happened.

Okay, now you've got me interested. Tell me more about this difference.

So passive responsibility is like realizing you left a window open during a rainstorm, but only after your carpet's totally soaked.

Oh no.

Right. But then there's active responsibility, and that's about checking the forecast, being proactive, closing the window before the storm hits. It's about foreseeing those potential issues and actually taking steps to prevent them before they happen.

So thinking ahead, those what-if scenarios. I can see why that's so important, but it also feels like a lot of pressure, you know, to constantly be considering the potential downsides of everything you're working on.

For sure, and the research does acknowledge that complexity. But before we even go there, I think there's an even more fundamental point this research raises, and it's that tech doesn't have ethics, people do.

Yes, because it's very easy, especially now with AI, to start to think of technology as this almost sentient being, like it's making its own calls.

Right, but ultimately it's just lines of code.

Lines of code written by humans, influenced by their experiences, their biases, and yeah, their sense of ethics.

Their sense of ethics, and we can't forget that.

We can't. The research uses the example of smart speakers. They're designed to sound like this helpful friend, but in the end it's the engineers, the companies behind them, that are ultimately setting the ethical boundaries for how that technology is used.

It really makes you think, like, how often do we stop and ask, okay, who's actually making the decisions here? It's easy to get caught up in the technology itself.

Absolutely, and it's not just about smart speakers, it's really across the board. When you think about all the ethical challenges that come up in this digital age, I mean, wow.

Yeah, there's no shortage. There's a lot to unpack there.

There is, and this paper really encourages us to think critically about things like privacy, social equity, job displacement, even the right to repair, things we've talked about a little bit on this show in the past.

It ties them all together.

It does. So let's dig in a little bit, and maybe we can start with privacy, because I feel like that's the one that's always top of mind for a lot of people.

It's in the headlines every day. With all the sensors and facial recognition and data-hungry algorithms, it seems like almost every move we make is being tracked and analyzed in some way.

Yeah, like you're leaving this digital footprint everywhere you go.

Everywhere. And even if you're not online, right? It's offline too.

Oh, absolutely. And I think the research really highlights how crucial it is for engineers to be asking themselves, what are the implications of this data collection? How can we design these systems so that user privacy is baked in from the very beginning?

Designing with intention. I like that, because it's not just about protecting ourselves from some Orwellian surveillance state, which, you know, sometimes it feels like that. It's also about our own autonomy, our right to self-determination in a world that is so driven by data.
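To make that "baked in from the very beginning" idea a little more concrete, here is a minimal sketch of privacy-by-design in Python, assuming a hypothetical telemetry pipeline. The field names, the allowlist, and the salt handling are all illustrative assumptions, not anything the paper prescribes; the point is only that data minimization and pseudonymization happen before anything leaves the device.

```python
import hashlib
import os

# Hypothetical sketch: collect only the fields the feature needs (data
# minimization) and pseudonymize the identifier before anything is stored
# or uploaded. All names here are illustrative, not from the paper.

REQUIRED_FIELDS = {"event_type", "timestamp"}  # allowlist: everything else stays on-device

def pseudonymize(user_id: str, salt: bytes) -> str:
    """Replace a raw identifier with a salted hash, so records can be
    correlated without storing who the user actually is."""
    return hashlib.sha256(salt + user_id.encode("utf-8")).hexdigest()

def prepare_for_upload(raw_event: dict, salt: bytes) -> dict:
    """Build the minimal, pseudonymized record that is allowed to leave the device."""
    minimal = {k: v for k, v in raw_event.items() if k in REQUIRED_FIELDS}
    minimal["user"] = pseudonymize(raw_event["user_id"], salt)
    return minimal

if __name__ == "__main__":
    salt = os.urandom(16)  # in a real system the salt would be managed and access-controlled
    event = {
        "user_id": "alice@example.com",
        "event_type": "page_view",
        "timestamp": "2024-10-25T12:00:00Z",
        "gps": (51.5, -0.1),  # collected locally, but never uploaded
    }
    print(prepare_for_upload(event, salt))
```

The design choice worth noticing is the allowlist: the default is that data does not leave the device, and every exception has to be argued for, which is that "active responsibility" posture applied to data collection.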
Yeah, and that actually ties really well into the next ethical challenge, which is social equity. How do we make sure that all these technological advancements benefit everybody, not just a select few?

Not just a select few. It's like we're building a bridge to the future, but we want to make sure everyone can cross it, not just the people who have the resources and the knowledge to do so.

Exactly, and that means we need to consciously consider the needs of diverse communities. I'm talking about things like access to education, health care, even just basic infrastructure in some places. The digital divide is a real thing, and if we're not careful...

Huge.

...it's only going to get wider. And we have a responsibility to make sure that the technology we're creating isn't contributing to that.

Absolutely. Technology should be a tool for empowerment, not for division. But there's this other side of it too, something that comes up a lot in these discussions about ethics and technology, and that's job displacement.

Sure. Because on one hand, automation can improve efficiency, it can streamline things, it can make things better.

Right, streamline things, exactly. But it can also lead to job losses, and that's something that, understandably, people are concerned about.

Of course, it's a very valid concern. And I think what the paper really pushes us to do is to not shy away from those conversations, even though they might be difficult, but actually engage with them head on. How can we design technologies that, instead of just replacing human capabilities, actually complement them? Can we leverage these advancements for things like retraining and upskilling the workforce? These are questions we can't ignore.

We can't. It's about finding that balance between this drive for progress and making sure we're not leaving people behind.

Right. And speaking of potential unintended consequences, I thought it was really fascinating how the paper also dug into the right to repair, which is something we've talked about on this show before.

It is, and it gets into this whole question of ownership and control. Because, right, who gets to decide how long your devices last? And who gets to fix them? A lot of times, design choices are being made that make it much harder for people to actually repair their own stuff.

Yeah, and then they're forced to go through these expensive, manufacturer-controlled channels, which is great for the companies...

Great for the companies, maybe not so great for the consumer.

Or the environment, let's be honest.

Not really, because it really feels like we're being pushed towards this constant cycle of consumption.
Constantly upgrading, which, again, is good for the bottom line, but is it good for the planet?

Not necessarily.

Not necessarily, no. So I thought it was great how the research challenged us to really think about how our design choices impact these bigger issues of ownership, control, and sustainability. It's not just about the technology itself, it's about what happens after it leaves our hands.

Absolutely. It's about empowering users, not locking them into some kind of system where they're dependent on us.

Right. It makes you wonder what would happen if we designed things to be repairable from the start, or if we actually designed for longevity. But I want to shift gears a little bit and talk about probably the most talked about, and one of the most ethically complex, technologies out there right now, and that is artificial intelligence, or AI.

Oh yeah. Where do we even begin?

I know, I know. Everyone's talking about it, it's everywhere you turn. And it's fascinating, right, because it has the potential to really revolutionize so many different fields, everything from health care to transportation, you name it.

It's already happening.

It is. But like you said, there's this immense potential, and it doesn't come without some serious ethical baggage.

Right, and it seems like every other day there's some new headline about, you know, AI can do this, AI can do that, but also, wait a minute, what about this potential pitfall or that unintended consequence?

Yeah, for every step forward, it feels like there's also this "what could go wrong." And the paper does acknowledge that while some general principles for ethical AI usage are kind of emerging, they even say that that's not enough.

It's not enough. So what do we do?

It really comes down to each engineer, each individual engineer, putting those principles into practice, making sure that the AI tools they're using, the systems they're building, are being used responsibly.

So it's less about waiting for some grand set of rules to be handed down, and more about individual accountability, individual action.

A hundred percent. And being willing to ask those tough questions, like, what biases might be built into this AI system? How can I mitigate those biases?

How can I mitigate those biases, exactly.

What are the potential unintended consequences? These are questions we have to be asking ourselves. It's about going beyond just the capabilities of AI and really thinking deeply about the potential impact on individuals, on society as a whole.
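One concrete way to "ask what biases might be built in" is to actually measure outcomes across groups before shipping. Here is a minimal sketch in Python of one common check, the demographic parity gap; the toy data, the group labels, and the 0.1 tolerance are all illustrative assumptions, and the paper itself doesn't prescribe any particular metric.

```python
# Demographic parity gap: how much a model's positive-outcome rate differs
# between two groups. A large gap doesn't prove bias by itself, but it turns
# a tough question into something measurable. All data below is made up.

def positive_rate(predictions: list, groups: list, group: str) -> float:
    """Fraction of members of `group` that received the positive outcome (1)."""
    outcomes = [p for p, g in zip(predictions, groups) if g == group]
    return sum(outcomes) / len(outcomes)

def demographic_parity_gap(predictions, groups, group_a, group_b) -> float:
    """Absolute difference in positive-outcome rates between two groups."""
    return abs(positive_rate(predictions, groups, group_a)
               - positive_rate(predictions, groups, group_b))

if __name__ == "__main__":
    preds  = [1, 0, 1, 1, 0, 1, 0, 0]                  # e.g. approved / denied
    groups = ["a", "a", "a", "a", "b", "b", "b", "b"]  # group membership
    gap = demographic_parity_gap(preds, groups, "a", "b")
    print(f"demographic parity gap: {gap:.2f}")        # 0.75 vs 0.25 -> 0.50
    if gap > 0.1:  # an illustrative threshold, not an industry standard
        print("flag for review before deployment")
```

The point is less the specific metric than the habit: turning "could this be unfair?" into a number someone has to look at before deployment is one small way an individual engineer exercises active responsibility.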
So how do we prepare engineers for that? I mean, it feels like education has to play a big role.

A huge role, right, and that's actually something the paper delves into as well: the importance of ethics education for engineers. And it's not just about checking the ethics course off your syllabus. It's about really cultivating this culture of ethical reflection and ethical action throughout the field.

Yeah, so it's not just about knowing the rules, it's about really instilling that sense of responsibility in future engineers.

Exactly.

But, and maybe this is just me being a little too cynical, does simply having that knowledge actually translate into ethical behavior in the real world?

That's a great question, because it's one thing to ace your ethics exam, but it's another thing entirely to actually apply what you've learned when you're faced with real-world pressures, deadlines, maybe even conflicting priorities.

Oh, absolutely. And that's where it becomes less about simply teaching the knowledge and more about nurturing this genuine culture of, like you said, awareness and action within the field itself.

So how do you bridge that gap between the theory and the practice?

That is the million-dollar question. It's like they say, actions speak louder than words. Seeing ethical conduct modeled in real life makes a huge difference. It makes those principles feel a lot more attainable, not just some abstract idea in a textbook.

Right, exactly. And the paper talks about how mentorship, both formal and informal, can be so influential in those early stages of an engineer's career. But they also argue that it has to go beyond just the individual level.

Yeah, individual influence is huge, don't get me wrong, but we need a systemic shift, where organizations are actively creating environments, cultures, that actually value these ethical considerations. Not just lip service.

Not just lip service. It can't just be, oh, ethics are important, and then everyone goes back to business as usual.

Exactly, you've got to walk the walk. So how do we actually do that? How do we move from just talking about ethics to actually embedding them in the very fabric of how an organization operates?

That's the million-dollar question, and I think that's where this deep dive really leaves you with some things to mull over. The paper focuses mostly on individual responsibility, which of course is critical, but they also hint at this idea of collective action, this idea that engineers need to work together to address these bigger, systemic ethical issues.

So what does that look like, though? When you say collective action in the engineering world, what are we talking about?

Yeah, it's a good question, and to be honest, the paper doesn't necessarily get into specific strategies. But I think the question they pose is a really important one, and it's, how can engineers use their voice, their influence, collectively, to really advocate for these ethical practices?

I'm imagining engineers banding together to create ethical standards, or to call out practices that maybe aren't up to par.

Yeah, hold each other accountable.

Hold each other accountable, exactly. I think there's a lot of power in that.

There is. Power in numbers, right?

Absolutely. And it's that recognition that we're all in this together, navigating these challenges, shaping the future of technology. If we can approach it from that perspective, I think we can achieve some really great things.

Yeah, and hopefully build a future where technology actually does serve humanity in a way that's both innovative and ethical.

Absolutely. It's possible.

It is possible. We've just got to get there. Well, this has been really a fantastic conversation.

Yeah, this was great. We've covered a lot of ground today, everything from that illusion of better, to the responsibility that comes with individual agency, and of course the importance of building an ethical culture within engineering as a whole. But as we wrap up here, if there's one big takeaway for you listening out there, maybe you're struggling with some of these very same issues in your own work, what would you say is the one thing to keep in mind as you're moving forward?
Yeah, that's a good question. I think if I had to boil it down to one thing, it would be that approaching technology ethically is not about checking boxes or passing some test. It's an ongoing process. It's a journey.

It is, it's a journey.

It's about cultivating this constant awareness, constantly questioning, constantly being willing to learn and adapt, and being committed to using your skills and your knowledge in a responsible way. And, you know, don't be afraid to speak up. Voice your concerns. Think about the impact of your work. And remember, you're not alone in this. There are people out there who want to see the same things that you do.

Absolutely. Connect with those people, keep these conversations going, challenge the status quo if you have to. And as we sign off today, I want to leave you with one last thought-provoking question from the paper, and it's this: think about how you can be more than just an agent of technological change. Be an agent of ethical technological change, because you have the power to make that happen. Until next time, keep asking those tough questions, and keep striving to build that better, more ethical future that we all want.