
Thinking of integrating radiology AI? Some answers to your questions

If you are contemplating bringing AI into your practice, you probably have dreamed of a process as easy as downloading an app on your smartphone. It may not appear easy at the outset, but it certainly can be.

In order to frame this discussion, it may be helpful to use a familiar reference point: a PACS system. Some may remember the transition from film to PACS, which happened in the 1990s (depending on the resources and quirks of a given institution). This transition was a shining example of disruptive technology. It seemed that there was no graceful transition to be had. Some radiologists were taken to the digital workstation kicking and screaming. The technology may be completely different, but the user concerns are similar.

The newer generation of radiologists may be excited by the advent of a new tool, and less concerned about practical integration issues, but may nonetheless want to better understand the challenges facing the cause they will champion. On the other hand, radiologists “of a certain age” may have a genuine Luddite fear: a sort of technologic PTSD born of PACS snafus. Regardless of how angsty any particular physician might be feeling, there are answers to the main questions facing radiologists who want to champion AI in their medical organizations.

What’s the ROI of AI for radiology?

Perhaps you are still at the stage of convincing yourself or others that AI is worth the investment. Although there is a lot more for the AI community to learn, the radiology literature is beginning to offer fairly robust studies indicating that radiology-specific AI can positively affect real efficiency metrics. This is true not only for local, upstream metrics such as accuracy and turnaround time, but also for surprisingly downstream metrics such as inpatient length of stay and ER occupancy.

There is a bevy of similar articles in other fields of medicine, also showing promise. For those too young to remember, similar budgetary and cost-effectiveness questions were raised over the adoption of PACS in its day, yet today PACS is a universal requirement for running a radiology department. AI seems like a less pressing expense to many, but will we look back at this point in time and ask why we ever questioned the utility of AI in the radiology department?

Is radiology AI secure?

Perhaps the sticking point in many institutions is the issue of information security. At the dawn of the PACS revolution, physicians and administrators recoiled at sending patient data all over the hospital, and even outside its walls, to be stored in servers beyond their purview. Like a paperback next to an e-reader, there is some intangible comfort in hard copy. 

Nowadays no one questions that the cost-benefit analysis of PACS is favorable even with the data risks. Similarly, today’s administrators are wary of permitting an outside company to deploy an AI analysis system that peeks at the hospital’s scanner output, PACS files, or RIS data. It’s understandable; after all, the public’s wounds are still fresh from recent abuses by big data. This is perhaps exacerbated by the “black box” problem in machine learning.

The solution to these security concerns is, first, a design built with good data hygiene in mind and, second, education of the user community. The ideal deployment has the smallest possible footprint, the leanest use of data, and the most secure de-identification, encryption, and routing.

On the user side, we need a detailed yet non-geeky explanation of how data is de-identified, how it flows to the server housing the algorithm(s), how it is treated there, and how the algorithm’s output then flows back into the hospital system(s). Ongoing education should foster a real understanding of how we can securely move and process data in medicine, rather than a reflexive tendency to lock everything down.
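To make that explanation concrete, here is a minimal sketch of what the de-identification step can look like before a study ever leaves the hospital network. It assumes the open-source pydicom library; the tag list is an illustrative subset, and send_to_inference_server() is a hypothetical routing step, not any vendor’s actual API.

```python
# Minimal sketch of DICOM de-identification before routing to an AI server.
# The tag list below is an illustrative subset of protected health information,
# not an exhaustive or compliant anonymization profile.
from pydicom import dcmread

PHI_TAGS = [
    ("PatientName", "ANONYMIZED"),
    ("PatientID", "ANON-0000"),
    ("PatientBirthDate", ""),
    ("ReferringPhysicianName", ""),
    ("InstitutionName", ""),
]

def deidentify(path):
    """Load a DICOM file and blank identifying tags, keeping pixel data intact."""
    ds = dcmread(path)
    for tag, replacement in PHI_TAGS:
        if hasattr(ds, tag):
            setattr(ds, tag, replacement)
    # Private vendor tags can also carry identifiers, so strip them as well.
    ds.remove_private_tags()
    return ds

# Example flow: de-identify locally, then route only the cleaned study
# to the server housing the algorithm(s).
# study = deidentify("chest_ct_slice_001.dcm")
# send_to_inference_server(study)  # hypothetical routing step
```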

How challenging is AI integration?

Regardless of security or cost issues, a separate, less quantifiable question for those considering AI is how painful it will be to install. “Will this take down my PACS or RIS? My scanner(s)? For how long?” As mentioned above, who among us has not suffered the turbulent takeoff of a PACS transition or upgrade? Here is where an AI enterprise flexes its muscles. Whether a hospital deploys something built in-house or bought from a vendor, there is a vast gulf between designing a machine learning algorithm and deploying that algorithm in the wild.

It’s safe to say that the majority of medical AI algorithms described in published reports today are African violets: lovely, but they work only under special, controlled conditions. This is not a comment on over-cultivated or biased training data sets; that is a well-recognized problem the field has examined thoroughly.

Less examined, however, is what it takes to move this African violet out of its greenhouse and into someone else’s garden, where it will be exposed to vastly different operating conditions and, probably, a degree of neglect as well. Can the AI you have invested in actually be planted in your garden? The successful vendors will be those who have thought through this issue extensively, in advance, and not on the fly at 8 pm with your local IT team on the phone and the PACS server offline.

And this is not science fiction; we are doing it. The same design that ensures a secure deployment also ensures a smooth one: a small footprint, smart routing and, most critically, agnosticism to the various systems with which it will interact.

Will AI impede my workflow?

Finally, you are worried that your AI application(s) will slow you down. After all, disruptive technology is disruptive, sometimes in ways we did not foresee or intend. One of the things it disrupts is the radiologist’s workflow. Workflow is a funny thing: even the most experienced radiologist, or perhaps especially the more experienced radiologist, has a set of conditions (like an African violet?) in which he or she is most productive. The variables are manifold, and many strive to control them as much as possible.

Now enters AI: a new sidekick, insistent, ever vigilant, maybe too vigilant, maybe reading at a position on the ROC curve at which you do not care to operate. It will point out oversights or try, perhaps intrusively, to assist in one way or another. And you cannot get AI off your back by politely sending it off on a coffee run. Or maybe the opposite concern, one I have heard more than once: you worry you may become too reliant on it. Regardless, the solution here, in my opinion, hinges once again on relationship and competence. You need a partner who will anticipate needs and then listen for the echo: the feedback that is critical to adjusting for your specific habitat. AI algorithm output is not monolithic, nor is it always a black box. In the right hands, it can be adjusted, throttled or expanded.
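To ground that claim, here is a minimal sketch of what “adjusting the operating point” can mean in practice: choose the score threshold that meets a target sensitivity on validation data and accept the false-positive rate that comes with it. It assumes scikit-learn and uses made-up labels and scores purely for illustration; it is not how any particular vendor exposes this tuning.

```python
# Minimal sketch: pick an operating threshold on the ROC curve that meets a
# target sensitivity. Labels and scores below are illustrative only.
import numpy as np
from sklearn.metrics import roc_curve

# Hypothetical validation data: 1 = finding present, score = model confidence
y_true = np.array([0, 0, 1, 1, 0, 1, 0, 1, 1, 0])
y_score = np.array([0.1, 0.4, 0.35, 0.8, 0.2, 0.9, 0.55, 0.65, 0.7, 0.3])

fpr, tpr, thresholds = roc_curve(y_true, y_score)

target_sensitivity = 0.9  # e.g., a practice that prefers to over-call
idx = np.argmax(tpr >= target_sensitivity)  # first point meeting the target

print(f"Operating threshold: {thresholds[idx]:.2f}")
print(f"Sensitivity: {tpr[idx]:.2f}, false-positive rate: {fpr[idx]:.2f}")
```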

And it’s not only a question of the algorithm’s output, but also of how that output is presented. Through all this adjustment, it does not hurt to have a partner who has done it before, elsewhere, and has an idea of how your proposed adjustments will pan out. Your partner also needs both the competence and the bandwidth to address your specific deployment: tweak, assess, and repeat as necessary to find the right balance between impact and unobtrusiveness.

A vendor to guide you through it

The journey may pose its challenges, but there are plenty of answers and plenty of guidance along the way. Whatever questions or challenges you are facing, Aidoc’s team of AI and healthcare experts is happy to support you as you champion AI at your healthcare institution. With 12 FDA-cleared and 13 CE-marked solutions, Aidoc is the industry leader in AI for radiology and multidisciplinary response team solutions.
