Health Information Technology in Implementation Science


Thank you for joining us today. This webinar is being recorded, so if you have any objections, you may disconnect at this time. On behalf of the National Cancer Institute, I wish to welcome everybody to the Implementation Science Webinar Series. We are excited to be joined by our speakers today, but before we get started, I wanted to cover a few brief logistics. All attendees have been muted upon entry and will remain muted for the duration of the webinar. We encourage questions and comments using the Q&A feature on the right-hand side of your screen.

Please type in your questions, select "All Panelists," and hit submit. Again, feel free to submit your questions throughout today's conversation. You may need to activate the appropriate panel using the menu options at the bottom of your screen. And if you require closed captioning, please refer to the caption panel.

We also recommend navigating to the View menu at the top left of your screen and selecting "hide non-video participants" for optimal viewing of today's presenters. And with that, I'd like to turn it over to our host.

Hello, everyone. Thank you so much for joining us for this webinar today. We're very excited to hear from members of the Consortium for Cancer Implementation Science's Technology and Health Communication Action Group. Rachel Gold is one of the leaders of this group and is joined today by Constance Owens and Amy Huebschmann. You can read their impressive bios by visiting the links on the webinar promotion page. We're very excited today to hear about a scoping review they have been working on as one of the public goods identified by this action group. And with that, let's turn it over to our speakers.

Thank you, and hello everyone again. My name is Constance Owens, and I'm joined here today by Rachel and Amy. We are pleased to share our presentation, entitled "Implementation of Health Information Technology Approaches for Secondary Cancer Prevention in Primary Care: A Scoping Review, Preliminary Results." This review is led by myself and Rachel in collaboration with an outstanding team of members from the Consortium for Cancer Implementation Science, and we would especially like to recognize our co-author Dr. Jinying Chen for her substantial contributions to this work. Before we get started, please go to the next slide. Highlighted here are a few disclaimers we wanted to note: this presentation will cover what we are learning so far from our research.

Since these are preliminary results, the data we report today are subject to change. Next slide. Outlined here is an overview of what we will cover during today's webinar; please save your questions for the end. Next slide. Highlighted here are a few key evidence gaps that led us to this research. As we all know, a lot is known about effective screening guidelines for breast, cervical, and colorectal cancer. However, research shows that many people are still not up to date with their cancer screening. So, when exploring evidence-based practices to address this challenge in screening uptake, there is a need to understand whether health information technology (health IT) is an evidence-based practice for cancer screening. And the short answer to this is yes.

There is evidence that supports health IT's effectiveness for cancer prevention as well as control. However, the question remains: what is known about how health IT tools improve the provision of guideline-concordant cancer prevention services? And this brings us to the last point listed here: there remains limited evidence focused on what works best in primary care settings. Next slide. In an effort to help bridge these gaps, we turned to the Integrated Technology Implementation Model (ITIM) as our conceptual framework for this study. As we continue to grow our knowledge of the intersection of implementation science and technology adoption in cancer prevention, we are finding few models that incorporate technology adoption. As illustrated here, the ITIM provides a cohesive model introducing 12 inner and outer organizational concepts that are essential to the process of both implementation and adoption of technology in health care settings.

So now I'm going to stop and pass it over to Rachel to take us into our research questions and methods. Rachel?

Yeah, thanks, Constance. Next slide, please. Constance really alluded to this in our previous slide, but just to be clear, what we wanted to do with this scoping review was get a sense of the state of the science on how health IT tools enhance cancer screening. We are, of course, very much guided by implementation science in this work; overall, the RE-AIM framework guided our analysis.

We also used the Integrated Technology Implementation Model constructs as a guide, and when we thought about the implementation strategies that the evidence shows are used to support health IT adoption, we used the ERIC taxonomy. Next, please. I'm going to tear through these methods slides because I really want to spend time on the results. Briefly, we went to all the usual data sources for a scoping review, with a set of search term categories. The thing I always want to call out on this slide is that we looked at both the peer-reviewed and the grey literature, because this is not a systematic evidence review; it's a scoping review, and we wanted to get a sense of what we know and what's out there. Next, please. Very briefly, we went through a five-step scoping review process. First, we identified a large set of abstracts that seemed potentially to be a fit; in our initial screen, they were all double reviewed to see if they did fit. Those that clearly didn't fit fell out at that point; the rest were then reviewed as full text, again double reviewed.

Those that made it through the full-text screening were then charted; we extracted data from them. We started with roughly 4,500 abstracts, and a final 103 references made it into the analysis we're going to show you now.
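To make the double-review screening step concrete, here is a minimal sketch of how two reviewers' decisions might be reconciled, with disagreements flagged for adjudication. The reference IDs and decisions are hypothetical illustrations, not the team's actual screening software:

```python
# Minimal sketch of double-review reconciliation during abstract screening.
# Reference IDs and decisions are hypothetical, for illustration only.
reviewer_a = {"ref001": "include", "ref002": "exclude", "ref003": "include"}
reviewer_b = {"ref001": "include", "ref002": "include", "ref003": "include"}

to_full_text, excluded, needs_adjudication = [], [], []
for ref_id in reviewer_a:
    a, b = reviewer_a[ref_id], reviewer_b[ref_id]
    if a == b == "include":
        to_full_text.append(ref_id)        # advances to full-text review
    elif a == b == "exclude":
        excluded.append(ref_id)            # falls out at this stage
    else:
        needs_adjudication.append(ref_id)  # resolved by discussion

print(f"Full text: {to_full_text}; excluded: {excluded}; "
      f"adjudicate: {needs_adjudication}")
```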

Next, please. Just briefly, we limited our scope, because we had to, to US-based work published from 2015 to 2021, set in primary care, with a focus on secondary prevention of cancer (effectively, cancer screening) rather than survivorship care, and we asked whether the tools described in each reference were used to support the various functions we talked about before, for the cancers we're focused on. Next, please. To give you a sense of the kinds of data we extracted from the included manuscripts: information about the study and the setting, specific elements of cancer screening, the functions of the tools themselves, the IT tools themselves, which end users were supposed to use the IT tools, and which part of cancer screening the tool was supposed to support. Next, please. And just to be clear on how we used RE-AIM to pull specific data out of the manuscripts: we looked at reach, effectiveness, adoption, implementation, and maintenance, with a special focus on effectiveness (did the tool show positive results?), and then whether adoption and implementation were reported, and maintenance as well. Next, please. When we looked at implementation strategies, we organized them according to the ERIC taxonomy, which I assume everyone listening to this D&I talk knows about.

We then used the Proctor categories for reporting on those strategies: were those categories reported? Next, please. We also looked to see whether a dissemination and implementation framework or a technology framework was used and was clearly stated to underlie the work, and we looked at any knowledge gaps specifically called out in each paper. Next, please. I think it's back to you now, Connie.

Yep, I'll take it over. Briefly, I just wanted to take a moment to give you an overview of the preliminary results I will cover today. First, I'm going to describe our sample of references and what they included about the tools' impact and adoption, as well as the barriers to and facilitators of adoption that we came across.

I will also go into what we are seeing so far about the implementation strategies used to support adoption, as well as the emerging knowledge gaps we are identifying in our analysis. So, when we go to the next slide: our search yielded a final total, as Rachel mentioned, of 103 references published between 2015 and 2021; we did see a decline in publications during the pandemic. The majority of included references were peer-reviewed articles, but the thing I want to highlight here is that the majority were non-experimental studies. Next slide. Of our included references, we found far more that incorporated health IT tools for colorectal cancer screening than for breast or cervical cancer screening. Next slide. Next, we explored the characteristics of where this research occurred, as well as the percentage of racial and ethnic minorities represented in these intervention samples.

We are seeing that this information is underreported. When we break this down by cancer screening site, we find that research on colorectal and breast cancer screening mostly occurred in primary care practices located in urban areas, whereas cervical cancer screening research was mostly conducted in rural areas. Next, we found that the racial and ethnic composition of the sample was also rarely reported, or not reported at all, across all three cancer screening sites. And last, we found that such reporting mostly occurred in references targeting colorectal cancer screening and was mostly absent from references covering breast and cervical cancer screening. So, overall, we're seeing a lack of evidence not only on where health IT approaches are used in primary care but also on how these tools work for people of color served in these settings. Next slide. As for the characteristics of the tools themselves, we're finding that most references used EHR-based tools, and the most common function we identified supporting all cancer screening types was clinical decision support (CDS). Secondary functions, as listed here, included risk stratification for colorectal cancer screening, patient decision aids for breast cancer screening, and screening outreach to support activities such as patient reminders, scheduling, and educational awareness for cervical cancer screening. And last, we found that these functions were primarily leveraged during panel management and at the point of care.
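To make the point-of-care clinical decision support function concrete, here is a minimal sketch of the kind of rule such a tool might apply. The age band, the 10-year colonoscopy interval, and the field names are illustrative assumptions, not clinical guidance and not the logic of any reviewed tool:

```python
from datetime import date, timedelta
from typing import Optional

def crc_screening_alert(birth_date: date, last_colonoscopy: Optional[date],
                        today: date) -> Optional[str]:
    """Return an alert message if a patient appears due for CRC screening."""
    age = (today - birth_date).days // 365
    if not 45 <= age <= 75:                 # assumed average-risk age band
        return None
    if last_colonoscopy is None:
        return "No colorectal cancer screening on record; discuss screening."
    if today - last_colonoscopy > timedelta(days=365 * 10):
        return "Last colonoscopy more than 10 years ago; patient is due."
    return None

# Would surface in the EHR as a point-of-care alert during the visit.
alert = crc_screening_alert(date(1960, 5, 1), date(2010, 3, 15), date.today())
if alert:
    print(alert)
```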

And this is key, because going back to our first research question about how health IT is used to promote cancer screening in primary care, we see that evidence exists on the types of tools and functions being used, as well as the cancer screening activities supported by these tools in primary care. But when we go to the next slide: the extent to which we understand the adoption and impact of these tools remains limited, and this brings us to our first knowledge gap, illustrated in this table. For each RE-AIM domain shown here, we categorized the percent of references into three groups, outlined at the bottom of the table: low, meaning less than a quarter of references reported evidence for that particular RE-AIM domain; moderate, meaning 25 to 50% of our included references reported evidence for that domain; and high, meaning more than 50% of references reported evidence.
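As a small sketch of this low/moderate/high categorization (the thresholds come from the talk; the example percentages are placeholders, not the study's actual figures):

```python
def reporting_level(pct: float) -> str:
    """Bin the share of references reporting a RE-AIM domain."""
    if pct < 25:
        return "low"       # fewer than a quarter of references
    if pct <= 50:
        return "moderate"  # 25-50% of references
    return "high"          # more than half of references

# Hypothetical percentages, for illustration only.
domains = {"Reach": 10, "Effectiveness": 60, "Adoption": 21,
           "Implementation": 18, "Maintenance": 30}
for domain, pct in domains.items():
    print(f"{domain}: {pct}% -> {reporting_level(pct)}")
```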

Overall, what this table really shows us is that the literature currently does not include a lot of information about the reach of health IT when it's used for cancer prevention and cancer screening in primary care; there is very little information on adoption and implementation for all cancer screening sites; and the evidence varies by cancer screening type for effectiveness and maintenance. So now I want to take a deeper dive into the three domains in the middle, specifically to let you know what we're learning so far about how the effectiveness, adoption, and implementation of health IT is reported in the evidence. So, when we go to the next slide: overall, health IT demonstrated effectiveness across all three screening types.

The majority of our references showed evidence of positive outcomes for colorectal cancer screening, followed by breast and cervical cancer screening, in primary care. These references also revealed that there is not a lot of evidence overall on effectiveness for breast and cervical cancer screening. Next slide. And when we look a little further into effectiveness, we find that positive outcomes mostly involved clinical decision support used during panel management or at the point of care, for all cancer screening types.

We also identified positive outcomes for clinical decision support when it was used during follow-up care activities related to colorectal and cervical cancer screening. So, when we go to the next slide.

As I mentioned earlier, since a high percentage of references in our sample covered effectiveness for colorectal cancer screening, we decided to take another look at those references to see what we're finding about effectiveness there. In particular, I want to point out that, similar to clinical decision support, a number of references report positive outcomes when risk stratification is used.

These occurred during panel management, at the point of care, and during follow-up activities after colorectal cancer screening referrals had been made. So overall, although the evidence on health IT's effectiveness for cancer screening in primary care is limited, we do see, as shown for colorectal cancer screening in primary care, that these tools do work. However, when we go to the next slide: despite this evidence of effectiveness, we are finding that there is hardly any reporting on adoption.

As illustrated here, only 21% of our included references reported adoption rates. Of those, most focused on colorectal cancer screening, and overall the rate of adoption was small, with reported rates of 50% or less for all cancer screening types.

So, when we go to the next slide: using the Integrated Technology Implementation Model I discussed earlier, we identified roughly 34 to 38 references that reported barriers to and facilitators of adoption. Here, it's important to note that these barriers and facilitators mostly related to inner context concepts, the nature of the innovation (which in this case is the health IT tool), and outer context concepts.

Those top three groups of barriers and facilitators really underscore that technology adoption is affected not only by what the tool does or how it's designed; this evidence shows that the setting in which the health IT is being implemented is key to its adoption for cancer screening and prevention in primary care.
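As a minimal sketch of how coded barrier and facilitator excerpts might be tallied against these ITIM construct groups (the excerpts and counts are invented for illustration, not study data):

```python
from collections import Counter

# Hypothetical coded excerpts mapped to ITIM construct groups.
coded_excerpts = [
    ("no on-site IT staff", "inner context", "barrier"),
    ("tool flags the wrong patients", "nature of the innovation", "barrier"),
    ("Medicaid expansion broadened coverage", "outer context", "facilitator"),
    ("leadership prioritized screening", "inner context", "facilitator"),
]

tally = Counter((group, kind) for _, group, kind in coded_excerpts)
for (group, kind), n in sorted(tally.items()):
    print(f"{group} ({kind}): {n}")
```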

So, on to the next slide. As illustrated here, inner context barriers and facilitators related to the existing organizational infrastructure and policies within the primary care practice. Next slide. Examples of barriers and facilitators related to the nature of the innovation really highlight the extent to which the technology itself does what it is programmed to do, as well as its compatibility with the clinical practice setting. Next slide.

And last, examples related to the outer context highlight barriers around access to technology support and limitations on data sharing across settings outside of primary care, while the facilitators shed light on the importance of local, state, and federal policies, such as Medicaid expansion, that impact adoption.

Next slide. And last, when we explored reporting of implementation strategies to support health IT adoption, we found that these strategies were reported in less than a quarter of our references, and a total of 22 ERIC implementation strategies were reported.

Most of those strategies were multifaceted, meaning that the interventions featured two or more implementation strategies to support adoption. The majority of these references featured implementation strategies focused on promoting health IT for colorectal cancer screening, again because the majority of our sample consisted of references covering colorectal cancer screening. And then the last point here.

The common implementation strategies to promote health IT among all the cancer screening types included having centralized technical assistance; conducting small tests of change, featuring things such as PDSA cycles; and the teams holding educational meetings to help facilitate training and ongoing knowledge of using the tools. So, when we go to the next slide: as I mentioned, only a quarter of our references reported anything related to implementation strategies to support health IT for cancer screening.
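As a small illustration of the "multifaceted" definition used here, a sketch that flags interventions reporting two or more ERIC strategies; the study IDs and strategy lists are invented placeholders:

```python
# An intervention is "multifaceted" when it reports 2+ implementation strategies.
reported_strategies = {
    "study_a": ["centralize technical assistance"],
    "study_b": ["conduct cyclical small tests of change",
                "conduct educational meetings"],
}
for study, strategies in reported_strategies.items():
    label = "multifaceted" if len(strategies) >= 2 else "single strategy"
    print(f"{study}: {len(strategies)} strategies ({label})")
```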

But we're also finding that when this information is reported, there is a high percentage of evidence across the majority of the implementation strategy domains listed here. However, as outlined here in our second knowledge gap, when these strategies are reported, the reporting is still limited.

There is limited reporting on when the strategy occurs, on its frequency or intensity (the dosage of that strategy), and on the specific implementation outcome measures related to using that strategy to support adoption.

So, next slide. Our final knowledge gap so far in our analysis is that there is limited use of dissemination and implementation frameworks in our references covering health IT for cancer screening in primary care, and there weren't any technology frameworks, that is, frameworks relating to technology adoption, acceptance, and use, in the interventions reported in the references in our sample.

This really raises the question of why these resources aren't being leveraged for health IT adoption, when there is evidence supporting these frameworks' effectiveness in the design and implementation, as well as the evaluation, of evidence-based practices in clinical settings.

So now I'm going to pause and pass it over to Amy to take us into our discussion.

Thanks, Connie, wonderful job there. Next slide, please. I'll be serving as discussant today, reflecting on these findings, on these tools being used in primary care, through the lens of a primary care physician and implementation scientist.

So, as Connie just relayed, the technology was predominantly electronic health record-based tools, plus some other web-based tools. The biggest uses that came out were for clinical decision support, as well as some other approaches, and panel management was where this came through the most, but also point of care. On the pieces related to the tools' effectiveness, the lion's share of data really comes from colorectal cancer screening, with much less data to tell us about breast and cervical cancer. What really stood out to me from this work is the limited adoption: only 20% of the studies even reported on adoption at all, and within those, fewer than half of the sites were able to adopt these tools. That really shows us this is a major problem if only a minority of sites can do this work.

I'll go through the various barriers and facilitators listed there a little further on some additional slides, but you see we break them down into: the organizational infrastructure, both person-power and, since most of these were electronic health record-based, what our electronic health records can do and to what extent they can function in this way; the technology itself, where you saw there were some issues with it not always doing what it was supposed to do, and some barriers where what it brought forth to the staff wasn't felt to be accurate; and policy, both what we might call big-P policy, the more federal and state matters, and little-p policy, the local policies in the practice, that supported or inhibited adoption. The implementation strategies, as Connie relayed, were often multifaceted. This is true in much of our work: it took more than one approach to get this into place.

And as we might expect, technical assistance was key, as was being able to iteratively test this and move it forward over time. Next slide. On strengths and limitations of the work: we were very clear that the intersection, the niche, we were looking at was health IT for secondary cancer prevention in primary care, looking specifically at the effectiveness of these tools as well as their implementation. We used a mixed-methods protocol that was guided by a model, the model Connie showed you at the beginning, as opposed to much of the work we reviewed, which did not have a technology framework or model to support it. Limitations:

We looked at just secondary prevention in primary care, and there were some technical challenges, as outlined there, that made it difficult to execute all of the pre-specified approaches we wanted to use to extract the data. And as Connie outlined, there were some issues with identifying equity, that is, to what extent this was equitable: even when things like reach and adoption were described, it was not always specified to what extent the population served reflected the baseline population and the population that should be screened. Next slide. These are my reflections. I was really surprised by the limited use of frameworks. Certainly, there were also many studies that didn't report on their use of implementation strategies.

But even that outpaced the number of studies that used a framework, or at least reported on it; they might have used one but didn't report it. And we'd love to see additional thoughts; please put in the chat any surprises to you from this work or areas that stood out to you. Next slide. This just shows again the imbalance between the use of frameworks and the reporting of frameworks across the studies. Next slide. In terms of the biggest modifiable challenges, as I said, adoption really stands out to me, and in particular those inner setting factors: the infrastructure and technical assistance needed, the person-power with staff, and the tools being upgradable. I'm going to go through the next slides and remind you briefly of what we found in that regard. So, next slide. We have to push these things forward to our information technology teams for the whole health system and get them to agree to help support us in putting things in, and it is a burdensome process.

We don't have staff in our clinic who are on site, ready to do these things, unless we can push it up the chain and get policy support. So those two facilitators, at least for my clinic, go hand in hand, and we see the flip side with the barriers. Next slide. And then again, here: the resources, the time, and the optimization and customization. To the extent we can make this usable (I think of things like the System Usability Scale: how usable is this for the clinics?), it makes a lot of difference. And then we see the burden of keeping it up (who's going to do that?), and the inaccuracies are obviously a major issue: if it's inaccurate, I don't want to use it. Next slide.

And here again we see things related to usability on the barrier side, and on the facilitator side, elements that push clinics to have a value proposition to use this: that they're not paid only fee-for-service. When there's an incentive to do more population management, I think those things push clinics to see value in incorporating these tools. Next slide. So then, in terms of important next steps, there are many, and I want to get us to the Q&A so we can hear from you and talk more there as well. I'll just touch on one, and encourage you to keep putting other ideas in the chat about the challenges with this work or potential solutions.

So, next slide. The one thing I will focus on is our gap related to implementation strategies: the elements we can specify according to Proctor's elements and domains for specifying implementation strategies, being more clear about the dose and the implementation outcomes affected, as this particularly relates to our notion of mechanism in this work, of how this work is accomplished. Next slide. And now I'm going to pass it back to Connie to wrap things up before we move into the Q&A.

Yes, thanks, Amy; great job, by the way. We just want to give a special shout-out and thanks to all our co-authors again, as well as our funders listed here. And if you would like more information about this research, which we do plan on publishing, you can reach out to me directly; my email is provided here. And I'm going to turn it over to NCI so we can go ahead and get into the Q&A. Thank you.

Connie and Rachel, what a wonderful presentation. I have to say, when I saw the slides, I didn't think you could get through them in 30 minutes, so great job; that was a lot of information.

I'm going to encourage everyone to put their questions into the comments or the chat function. Rachel has responded to some of the questions as we were going through, which is great. And I'm sorry, I'm kind of being cut off here. But one of the questions that I don't know was fully answered: someone wanted to know if you can elaborate on what it means to be activating the tools, and where they fit in the clinical workflow.

Can you share a bit more about that?

Yeah, why don't I take that one, Connie, is that okay?

Sure, sure.

I'm not sure even where the term "activating" came from. These are all tools that were, effectively, turned on in the user's electronic health record. Again, go back to what the tools were used for: they were either point-of-care alerts, or panel management tools like data roster tools, or similar tools to help with follow-up.

For example: hey, we referred this patient to go get a follow-up colonoscopy, but we didn't see that happen; that hasn't happened yet, right? So the tools were turned on in the EHR and could be used by the folks who had access to them in the study. There's a question about the web-based tools as well; those were more like a resource. Most of those, and Connie, tell me if I'm getting this wrong, my recollection is that most of the web-based tools were effectively patient decision-making tools, like "should I get the following screening?"; there would be a web-based tool for that.

So those were just either available or not, and then the question is, were they used? So if I'm not answering the question about activation effectively, please pipe in and let us answer it.
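A minimal sketch of the panel-management and follow-up functions Rachel describes, building an outreach roster of patients referred for colonoscopy with no completed result on file. Patient IDs and fields are hypothetical:

```python
# Hypothetical panel data: colonoscopy referrals vs. completed procedures.
referred = {"pt01", "pt02", "pt03"}
completed = {"pt01"}               # pt02 and pt03 have no result recorded

# Roster of patients needing follow-up outreach.
needs_outreach = sorted(referred - completed)
print("Follow-up outreach list:", needs_outreach)   # ['pt02', 'pt03']
```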

And actually, I'm just seeing we had the barrier, in the outer context, of it being cumbersome to activate and update tools, speaking to that outer setting barrier.

Yeah, that's interesting. Given that these were all studies of how folks were using tools that were in place, I'm not really sure how to think about that.

Yeah, to my understanding, from what I'm seeing, I perceive it as activating the actual alerts and reminders used to help providers with scheduling appointments, or alerts and reminders to help with identifying eligible patients who are due for their screening.

So maybe when there were updates to customize those features, there were cumbersome challenges in activating those customization features for those specific CDS tools. So, yeah, we have to do a little more digging on that, but from my understanding, that's what I'm seeing so far.

Yeah, I don't recall seeing a lot of articles that said, boy, it was hard to set up this tool. But I have read papers that look at this.

We certainly know that the quality of the data in the EHR drives the quality of these tools, because these decision support tools are driven by EHR data. We certainly understand that the quality of the EHR data is going to drive the tool, and I have seen some papers out there about that. I think that's the correct answer.

All right, we've got another question in here related to health equity.

The person who submitted it is interested in understanding how to deal with health equity in future research. Do you have suggestions? Should you be sampling, or oversampling, groups for effectiveness testing, interviews, et cetera? What are your thoughts on that?

Yes, I can take it first, and then Rachel and Amy can add. I think, first, given what we're seeing in the results from our scoping review, we need to do a better job of just overall ensuring that racial and ethnic minority groups are represented in these studies, making sure during the recruitment process to do the outreach that's needed.

That means ensuring they're included in not only the control group but also the intervention group. So I think it starts right there with the sampling and recruitment piece. And then, to make sure we're addressing health equity in the interventions: not only looking at overall effectiveness for the entire sample, but breaking it out by sociodemographic factors and race and ethnicity, to understand what specifically it was about leveraging, for example, an EHR-based tool that really helped with outreach to improve cancer screening for certain minority and marginalized groups.

So, doing that extra level of stratification in the analysis to really parse out what exactly in the intervention is working: I think that's another step. Once again, we didn't collect that level of data in the scoping review, but I think that could help move the needle on that effort. Rachel and Amy?

The one thing we found was that most of the papers didn't even report race and ethnicity, so we have no idea.

The literature has not just a knowledge gap but a knowledge chasm on whether there are differential impacts in different groups; they're not even reporting it. So, at a minimum, let's start reporting that. What's interesting to me about some of the questions I'm seeing in the chat, about whether these tools have been translated: I want to be clear that a lot of the tools we looked at were provider-facing tools targeting clinical decision support.

They were mostly provider-facing tools, not patient-facing tools, so the translation element seems more relevant to patient-facing tools. But the take-home I want folks to get from this talk, given what you're asking about, is: we're not even ready for that. I was shocked by these findings. There is so little written about the effectiveness of using health IT-based interventions to improve cancer screening in primary care, and even less about how the tools are adopted. So all these other questions, I agree, are important, but we have some basic ABCs to deal with first. There are some basic evidence gaps here that maybe need to be handled first, like: do these tools work in any setting, let alone in more diverse settings? I just want to call that out.

And on the quality of the EHR data: that's a great question; I've certainly seen in some of my research that the data are not complete. But the point I want to call out here is that demographics and preferred language are not what the tools we were looking at were for. Again, they were mostly for point-of-care identification of patients who are due for screening, or panel management for patients who are overdue and need to be brought in, or follow-up on care; in broad strokes, that's what these tools are for, so I don't see how demographics and preferred language would be relevant to those kinds of tools. The point is well taken, though, that if the EHR data is not complete, then the quality of the tools is not going to be as good. But I would say this: if you've got a person's health record that doesn't have their most recent screening data, then what you want is a tool that says, hey, the EHR doesn't even include this patient's screening data, what's going on there? So, in a way, that's not actually missing data.

That is the data; the data is that there's no information. I hope I'm making myself clear.

Yeah, thank you, Rachel. I wanted to give Amy a chance to expand a little on sustainability. There was a question from Natalie about whether the studies we found had any issues related to sustainability.

There were a couple of related things in our qualitative data. There was one piece, in terms of a barrier, on the burden of maintenance as well as development: if the research team was able to bring this in, build it, and get it in there, but then there was no one left in the clinics who could sustain it, that would be a sustainability barrier. Rachel, Connie, I don't know if there's anything else you wanted to comment on related to sustainability.

Based on our preliminary results, as we have outlined here, when we say it was less than a quarter, it was only about five references, if that, that reported anything on sustainability.

It was just very low, so there's a huge gap there in the evidence to address that.

Oh, hey, Cindy, can I address the question about lung cancer? Someone asked whether we considered lung cancer screening. Cindy, may I jump to that, or did you have another point?

Sure.

Really quickly: we did consider looking at lung cancer screening, but we had to limit for scope; there was only so much we could do, so we chose not to look at it.

It was just a matter of keeping the work manageable.

So, Mary actually posted a really good question about what types of studies or research need to be done to fill those gaps: what's the right mix of qualitative and quantitative?

Yeah, I can take it. In regards to the types of studies, I think we're well positioned to really leverage mixed-methods studies as well as qualitative studies and quantitative studies.

But I think qualitative studies in particular could really dig deeper into what specifically we are lacking when it comes to understanding adoption. When we look at these implementation barriers and facilitators: how are they linked back to the implementation strategies we identified? And, like we said, there was a lot of information lacking in regards to implementation outcomes. So we can look at different factors such as feasibility and acceptability.

We can really link those outcomes with the barriers and facilitators if we use a blend of mixed methods. I hope that answers the question.

Can I tag on real quick as well? I thought that was a great answer, Connie. But one other thing that struck me as I was looking at our barriers and facilitators: what we've learned so far is that a lot of them seem related to the process, and not as many seem related to stakeholder priorities. I worry about understanding the value proposition for our stakeholders a little better, and we see such a diversity of areas where these tools can target (panel management, point of care, follow-up), so we need to understand more about the most critical needs of our stakeholders.

I feel that would really be useful for honing in on making these tools more usable for those elements of the process.

Rachel, anything to add before the next question?

No, I agree with what they said.

Okay. So, Natalie had a question: do you have a takeaway of the do's and don'ts of developing tools to promote adoption, or is that just something else we need to do more research on? What are your thoughts?

I think that's a gap as well. When we were looking at the various barriers, one of them was the quality of the tool, and I don't recall seeing a lot of articles that talked about how they developed the tool; some of them didn't talk about that at all. I didn't see anything where they formally tested different strategies for developing health IT tools to make them more adoptable. I think that's the short answer. Connie and Amy, do you have anything you'd add there?

Yeah, I would simply reiterate that this is something we need to do more research on, because that is just what we have to find more evidence about.

To our understanding, these tools have already been in place, so we don't really get much information about their development or the pre-implementation of these tools. So I think leveraging more research priorities that really seek to understand this gap will help fill it.

One thing I would say here, and this is a guess, but having myself written a paper specifically about the process of developing, rather than just simply implementing, an IT tool (not focused on cancer): I think there may be a publication issue here, right?

Folks don't necessarily think, oh, I need to publish about how I developed a given tool; they may not even consider writing that up, even though I do think it's important. I don't know that there's a whole lot of literature out there on the processes used to develop tools; I could be wrong. What we looked at, again, was who reported on barriers to tool adoption and what kinds of barriers they reported on. And one of the things that did get reported on somewhat (not a lot, but somewhat) was the content of the tool itself. So clearly that's an issue, right, that we need to think about how the tools are developed.

But again, I would say the take-home from this talk is that this is a field that is really underdeveloped, much more so than I thought it would be going into this process. Cindy, I'll get back to you, because there are lots of questions coming in.

Yeah, so there are a lot of questions about usability, really trying to dig into whether the implementers found it worthwhile to activate the tools. You mentioned this is really about getting to the end user.

If a health IT system isn't being maintained, they may not see value in it. Would you like to comment on that?

I'm not sure we know; I think that's what these results show us. Again, the take-home here is that there's a lot of literature on colorectal cancer, and a little on the other two cancers, around effective IT tools for supporting cancer prevention in primary care. Where there is literature, it shows the tools are effective when used. So, right, we need a lot more work and research around what kinds of tools are effective, but that's where we're at overall. And I think the issue is really an implementation science issue, right? If you've got tools that are effective when used, then what is going on when folks aren't using them?

And I hear, I see, that maybe folks are not enthusiastic about using the tools; of course that's possible. We know that perceptions of a tool's usefulness drive future use of the tool. So again, my overall take here is that we need to study this. Connie and Amy, do you want to add anything to that? That was not a super answer on my part.

No, I would just add something from the barriers and facilitators we identified in our results.

We understand that context matters; the setting matters. So if the tool is not compatible in a low-resource primary care setting, there's more research that needs to be done to explore why it's not compatible, to explore what the limitations are that hinder it from being used effectively. And if it's working in other settings, what's driving what's working in those settings? So, going back to the question about whether it's mixed methods or quantitative: it's a little bit of everything that we need to do to really dig into it. This review shows us that the gaps are there.

And we have the resources to really go in and research and evaluate how we can help close those gaps.

I'm going to add one other thing here to respond to this question about activation. Again, consider that these were all papers about these tools; they were usually interested in the tools' effectiveness, but somewhat in adoption too. So on the question of whether the end users found it worthwhile to make the effort to activate them: the study teams certainly wanted people to activate them.

Sorry, we lost you for a moment; now you're back, we can hear you better.

Okay, sorry about that; my Internet, of course. Classic timing, right? So, on the question about making it worthwhile to activate the tools: consider that we were looking for studies of tools, right? So in any of these circumstances, the folks who were trying to get the tools used were certainly going to make the effort to activate them, because they wanted to study them. I think what your question points to is that what we learn from this scoping review is largely relevant to tools used in some kind of research context, and that the next step for the field is: okay, what happens in a real-world setting, when there's not necessarily a research team trying to get folks to use the tools? Because we know, or we can assume, adoption is going to be worse there.

I want to take the moderator's prerogative, because this is work that started in the Consortium for Cancer Implementation Science, in the action group that you lead, right, Rachel?

Can you just give a little bit of background on how this all came to be? This wasn't something that just started last year with the consortium. And are you thinking that there's additional work that the action group is going to be able to do based on the scoping review? And that citation people are asking for: when is this going to be made public?

Well, as soon as we can write it, submit it, and get it accepted; I'd say probably within a year. It takes a while; you all know the publication process is slow. So, the action group, the technology action group, was part of the consortium; did I get that wrong? I always mix them up. Thank you, Cindy. A couple of years ago, the action group got together and said: what do we need to do to move the field forward in terms of the use of technology in cancer prevention, or really cancer care?

And how do we move the field forward for implementation science for health IT: are these effective interventions? And what we came to was: we don't even know what's out there. So what we landed on was that our first step is to assess the literature: where are the gaps in the literature? In terms of what's next, I think what we might want to do in the action group is think about what the top priority is, what the ways are to efficiently learn more, and what research needs to happen next. That's a short answer, Cindy, but it's a huge task, right? Building a field of research: this is many years of research to understand this better. Maybe we could think about ways to expedite the research. Maybe studying the impact of these kinds of tools doesn't need a five-year study; maybe we could do it on a shorter timeline. Maybe there are ways to learn more efficiently, so it doesn't take us 10 years to understand how to use decision support tools.

I guess my answer, Cindy, is that the next steps are figuring out what to do next to move the field forward, now that we understand what huge gaps in knowledge there are.

Excellent. I want to see if there are any more questions coming in; there's a lot of great comments and discussion going back and forth in the chat.

Does each of you have one last thing you would like to say about this scoping review, any words of wisdom, other than that more research absolutely needs to be done? This really does show that there's a need to direct research to this area.

Well, this is sort of my soapbox, but given that we are showing that IT tools can improve cancer screening outcomes, improve rates of guideline-concordant cancer screening, do we need a little more research on the effectiveness of such tools? We did look at tools at different stages of the workflow.

But the fact that there's so little reporting on how these tools are used is just a huge knowledge gap. If we have an effective intervention, how do we get people to use it? Cindy, I wish there were a more sophisticated answer, but I think the answer is: we need to learn this. If these are effective interventions, why aren't they being used, and how do we help them get used more effectively? Connie and Amy?

Yeah, and I would just add that this research really helps to identify those gaps: there are a lot of gaps in understanding adoption as well as implementation. We're seeing evidence around health IT's effectiveness, but it's not a lot.

So, once we do get this paper published, we hope it can really help move the research agenda forward and bridge these gaps. We did a lot of work here to just lay it all out, and now it's time to take the next steps to tailor our research to address these emerging knowledge gaps we are identifying. Thank you.

And I can add two quick comments. One is on stakeholder priorities and understanding how these tools could best fit their priorities, because we're hearing that it takes some policy and some person-power at the local level to get them to work. So if we're not right on target with the more important priorities of how to do this, and with which element (panel management, point of care, or other; maybe it varies by type of clinic), that's really key to know. And then there's the equity piece.

My thought on what really should be measured there is, at least, how representative and equitable the population identified and touched by this tool is compared to the general population of the clinic: at least being transparent about that representativeness of reach.

Excellent. There was one last question about whether the training component was included in the implementation phase; I'm not quite sure I got that right.

Sure. Sandra, that really relates to what we looked at: training is an implementation strategy, so we looked at how often people reported on the implementation strategies used to enhance the use of a tool. It was basically fairly underreported.

That's the answer, but we will report in the paper on the different strategies that were used to enhance tool adoption, and training is certainly one of them.

Excellent. Well, thank you for sharing a preview of a future publication and an output from the Consortium for Cancer Implementation Science. We really appreciate all the hard work you put into this. And with that, I'm going to turn it over to NCI to close us out.

Thank you, everyone. Thanks to our presenters for a great presentation and to our audience for a great discussion. Please note that today's webinar will be archived on the IAG website, so please be on the lookout. And with that, please feel free to disconnect at this time. Thank you.

Thanks, everybody. Thank you, Amy and Constance. Thank you.

2022-03-04
