[Ajmal Aziz]: Good afternoon, everybody. I'm going to be talking to you about Technology Acceptance, the SBIR topic that we have. Before I jump into the specifics of the topic criteria, I think it'll be helpful to provide some context on who I am, the program I oversee, and what the program aims to accomplish. Hopefully that will help you understand how Technology Acceptance fits into the larger portfolio here. I oversee the Public Safety and Violence Prevention program. We conduct social science research to address the Science and Technology requirements that have been identified across the Homeland Security enterprise.
So, following Carl makes a lot of sense here, because you heard a lot about the Screening at Speed topic area. They're looking at developing technologies and capabilities to help improve the passenger experience. From a social science perspective, we're really interested in understanding why things work the way they do.
Are they meeting their intended objectives as designed? That's really at the heart of the Technology Acceptance topic area. At DHS, we develop a lot of technologies and capabilities that are ultimately transitioned back down into the field.
For us to know that these technologies are actually hitting their intended objectives, we also need to understand whether they're being adopted and utilized in the way they need to be. We've implemented technologies like Screening at Speed, facial recognition technologies, and chemical and biological detection capabilities. However, when you roll out these technologies, there's also an embedded risk in terms of how the public will perceive them.
And that's really at the heart of what we're trying to accomplish here. Like Carl, I'm not going to read everything on the slide, but what we're focused on right now is better understanding how workforces and organizations accept technologies. We're interested in the public's perceptions of those technologies before they're formally implemented across the US.
We're looking at developing strategic communication plans for the implementation of these technologies, and we want to support operational experiments and exercises to improve adoption. We don't want to be prescriptive; that's really why we have this topic area. We're interested in better understanding what you all, as leaders and experts in this field, have to propose.
Again, Carl had some really good slides referencing S&T strategic documents. I would very much encourage you all to take a look at those, in addition to the DHS Strategic Plan, so that you have a better understanding of what we're focused on from an S&T perspective. One of the things I heard from the panel earlier today was: please don't submit things just assuming they'll stick to the wall. It's important to do some of that homework early on, to understand which technologies are in development and how you all can help support us in ensuring that, when these technologies are delivered to the field, whether to a state and local entity or to the broader public, the deployment is ultimately well received and accepted by the general public. So, I've talked about some of the various technologies that are currently out there.
With them comes risk as well. That's where the Tech Acceptance piece comes into play: it helps us understand those risks so that we can incorporate enhancements before technologies are rolled out. Public perception and adoption of technology is critical. It helps us identify where technology should be used, and it helps inform multiple stages of the technology acquisition and deployment lifecycle.
What are we hoping to get out of this at the end of the day? We want to improve the adoption of these technologies, whether by the general public or by the workforce. Again, we don't want to be prescriptive about how you get there, but technology adoption will likely entail some interactions with the general public. You heard about privacy earlier. We place a lot of importance on protecting individuals' privacy rights, especially when you're engaging with human subjects and having them inform our research efforts.
I would highly encourage you, when you submit applications, to factor in those compliance-related reviews, because from a DHS perspective, we're not able to hit the ground running and start data collection until we've addressed all of our human subjects compliance requirements, in addition to the privacy requirements. That's one of the biggest challenges we've seen a lot of industry folks run into: an application comes in with a 12 to 24 month period of performance, but the research design doesn't incorporate the compliance review time that's required. Having those relationships already developed and built in on your end really helps advance and accelerate the development and adoption of these technologies.
Working with institutional review boards to ensure your protocol has a green light is extremely helpful for us. So is putting in place a research design that protects the privacy of the individuals working with you to help us understand whether these technologies are being used and implemented the way they need to be. That goes a long way. We've run into a lot of issues in the past where we hadn't been thinking ahead and forecasting some of these compliance-related activities, which ultimately becomes a cause for delay. Not only do you all have to go down that rabbit hole on your end, but there's a similar process we have to go through internally when we get to the stage of actually collecting data.
In addition, another end objective we have here is to ensure that the technology can be successfully embedded within the organizational structure and culture. If you're developing a technology and the workforce is not in a position to accept it, the question becomes: is it even worth developing if it's just going to sit on the shelf? While DHS can help with some of the implementation and adoption pieces, it's also extremely critical to rely on the network industry has, the relationships that you all can bring to the table, to help inform how we go about this.
Implementation is a key aspect of our research portfolio. It's great to develop technologies. It's great to develop the necessary evidence and knowledge products related to Technology Acceptance. But if it's not going to be used in the field, are we really making a difference here? We're all dealing with budget cuts.
So, having a specific use case in mind when you think about technology acceptance really helps a proposal stand out. What does that mean? Are we developing technologies to assist with the passenger experience at TSA? Are we developing technologies or capabilities to help our Customs and Border Protection agents at the southern and northern borders? Are we developing technologies to help Immigration and Customs Enforcement? A lot of these technologies get rolled out, but we need to make sure there's an infrastructure in place that allows them to be used in the manner they were designed to be used. It's interesting: from an S&T perspective, a lot of what we do is develop technologies and capabilities. So this topic area definitely stands out, because we're not looking for technology. We're not looking for hardware. We're not looking for software.
That's one of the biggest challenges we usually run into when we receive applications against this topic area, because that's not the priority here; there are other topic areas for that. Instead, help us understand how we can ultimately implement these technologies. I've got a link here to one of the previous studies we've funded, published by HSOAC; this one looked at public perceptions of the use of 5G technologies. So, when you think about the research designs incorporated in our Tech Acceptance efforts, helping us understand what your research team looks like and what multidisciplinary expertise you're looking to bring to the table really helps the evaluators prioritize and identify applications that meet our evaluation criteria.
You can read the quote here. We think it gives some context in terms of the organizational infrastructure and workforce structure: there has to be something in place that allows for an easy transition. I mentioned some of the challenges; having a good end user in mind is extremely critical. So, leveraging the relationships and infrastructure you have that would allow us to test some of these technologies goes a long way for us.
Again, while we're able to bring some of that to the table, we're reliant on what industry is able to propose. We want to support these operational experiments, because they give us a really good understanding of whether these technologies are meeting the benchmarks and performance metrics we have. So the operational experiments really stand out, and they're necessary to help us understand whether these technologies and capabilities can ultimately be adopted. I talked about strategic communications and the plans for implementing and disseminating new technologies.
That's extremely critical as well. Again, you heard from Carl about the new technologies they rolled out at the Las Vegas airport. It's important for us, from a DHS perspective, to get that information out well in advance of testing, so the public understands the technologies that are under construction, if you will, and the types of efforts we're undertaking to ensure that when we get to the end stage of delivering a technology, it will be accepted. Again, we're dealing with budget cuts across the board, so the last thing we want is for technologies and capabilities with millions of dollars invested in them to ultimately sit on a shelf. In addition, one of the other things I mentioned was workforce acceptance.
It's critical to talk to the workforces. Take the examples I provided: our CBP agents, our ICE agents, or FEMA personnel working on disaster efforts in Florida, if you will. How can we capture their perspectives as well? Not just from the technology implementer, but also from the folks who are going to be utilizing these technologies and capabilities. Again, there are a lot of risks associated with transitioning these technologies.
Being able to get different perspectives from different target populations really helps us stay ahead of the curve, address risks we're not necessarily thinking about during the design phase, and incorporate those fixes in future sprints or future advancements, ultimately ensuring that these technologies meet their intended objectives from the get-go. Public perception and adoption offers a lot of benefits, but the risks are really what stand out, and that's what we're hoping to get ahead of and mitigate. I said this before, and I'm going to keep harping on it:
We do not want to be prescriptive. We don't want to tell you how to get to that desired end goal. We're looking for unique solutions that you all are able to provide, to help us understand what's out there in the field that industry is already tinkering with, and that we can perhaps bring within our respective government walls. Going back to the quote here, the piece that really stands out to me is the failure to successfully transition a technology into the organization. So, think about research designs that bring in all of these perspectives.
Put in place a research design that incorporates all of the privacy protections and human subjects requirements we need to be cognizant of. Leverage the relationships you have with state and local end users and entities that would lend themselves to partnering with you to support some of these operational experiments. I would highly suggest incorporating letters of support from your state and local stakeholders, because that allows us to see that you're not only submitting a research proposal here, but that you also have plans for how these technology acceptance research efforts can be tested with your partners, in addition to the partners that DHS is able to bring to the table. That is extremely vital.
We've seen a lot of instances in the past where it takes time to mature and coalesce those respective partnerships. Being able to provide that upfront allows us to spend the time we need to conduct these necessary exercises, to gather a little more data and a little more information that we can pass back to the developers of said technologies. That, again, ultimately helps us understand whether these things are being used the way they're intended to be, whether they're meeting their intended objectives, and whether they're ultimately going to be adopted at all levels of government, from the federal government down to local community programs. I would love to take some questions here, but I know that's not allowed.
I'm going to go back and keep talking about the topic area. Understanding the factors that influence adoption: I'm going to keep beating that drum over here; it's extremely important. I've got another good example here of some work we've done on the acceptance of shoe scanner technologies for Screening at Speed.
It might sound non-controversial, but when people learn that DHS is collecting data, I can understand how a private citizen's alarm bells would immediately start ringing. That's why it's vital for us to get that information out well in advance.
We need to understand all of the specifics from a TSA perspective: what capabilities they need, what requirements they have for their technologies, and how a shoe scanner can ultimately be integrated into existing structures. There are a lot of different variables that come into play. Again, we are not interested in hardware. We are not interested in software. There are other topic areas specifically tailored for that.
We do social science research. We want to understand how certain things work and why they work in those ways. I want to continue to stress that, because bringing different perspectives into a research team really helps us better understand how these technologies are ultimately going to be adopted and accepted. The workforce piece is extremely important as well. This is where getting perspectives from the networks and organizational relationships you have really helps us better understand the broader picture.
While we can provide some of that from a DHS context, what matters to DHS in terms of technology acceptance might not be a point of focus for some of your partners at a state and local entity. Understanding where those differences lie and where there's overlap also allows us to focus some of our efforts. On the strategic comms plan: I talked about getting the word out well in advance.
That also goes a long way. If there are mechanisms available to the industry partners here, ways we could do press releases or work with you to highlight the research efforts that are underway, how you're planning to work with DHS on these things, how you're planning to work with local programs, then getting the word out early and often really helps us address future issues we might face when rolling out a technology that hasn't properly gone through the necessary PR cycles, if you will. And then there's disseminating these new technologies as well. We don't want to put a technology or capability out there if we don't know it's going to work.
That really doesn't help anybody. That's where the Technology Acceptance topic area stands out when you look at the rest of the topic areas we have, because it's part of the social science portfolio here.
We want to understand what's working, what's not, and what's promising in this field. We've got other technical folks developing these technologies and capabilities, and we work very closely with them to ensure that the social sciences have a seat at the table to help inform the development and implementation of these technologies. One of the other challenges I mentioned that we've seen is a lack of specificity.
Please have a good end user in mind. Again, while we can bring some of those end users to the table, we also sometimes rely on industry partners to bring them. We want to help with the testing of these technologies. We want to help with how these technologies are ultimately fielded, because that also varies depending on the technology.
We also want to understand how we can refine these technologies, so we work very closely with our tech partners to see how we can incorporate lessons learned and best practices as these technologies get built upon, enhanced, and ultimately rolled out. So, when I say be specific: we're able to bring the CBP folks and the ICE folks, but getting that local perspective is extremely vital for us, because at the end of the day, these technologies are going to be implemented, and they're going to affect everybody, down to the state and local level. We want to ensure there's a lot of rigor, objectivity, and independence in these Tech Acceptance efforts, so that we understand exactly what we need to before these technologies get rolled out. Lastly, the research here increases our understanding of how organizations operate and interact, both internally and with constituents, in ways that lead to a successful transition into the organization.
So, I'm going to stop there, because I think I'm repeating myself now. If there are any questions, I believe there's a poll; happy to address any of those. Otherwise, the big takeaways I'd like all of you to walk away with today: letters of support, MOUs, and building the human subjects and privacy requirements upfront into your research designs are extremely helpful, and submitting applications that tie back to core DHS mission sets also makes applications ultimately stand out. Thank you all for your time.