Welcome, everyone, to today's webinar, "AI Technology That Is Transforming Imaging." I'm Molly Gamble with Becker's Healthcare, and thank you very much for joining us today. Before we begin, I'm going to walk through a few quick housekeeping instructions. We'll begin today's webinar with a panel discussion, and then we'll have time at the end of the hour for a question-and-answer session. You can submit any questions you have throughout the webinar by typing them into the Q&A box you see on your screen.
Today's session is being recorded and will be made available after the event. You can use the same link you used to log in to today's webinar to access that recording. If at any time you're having trouble with the audio, try refreshing your browser. You can also submit any technical questions into that same Q&A box; we are here to help with that. I am pleased to welcome our terrific speakers today.
Dr. Evan Kaminer is the chairman of radiology at Montefiore Nyack Hospital and CEO of Hudson Valley Radiology Associates. Dr. Sonia Gupta is the chief medical officer of Change Healthcare and an abdominal radiologist at Radiology Associates of Florida. Dr. Kaminer, Dr. Gupta, thank you so much for joining us today. We're just thrilled to have you here. Can I begin by asking you each to tell us a bit more about yourself? Dr. Kaminer, let's start with you.
Thank you, and good afternoon, everybody, and thank you, Molly, for hosting this talk. So, I'm at Montefiore Nyack Hospital. It's licensed for 400 beds, but we typically have a census of about 200 to 250 in house at any time. We do about 60,000 visits a year and about 90,000 radiology procedures a year, and we're a level two trauma center. We're about 30 minutes north of New York City, and we've been part of the Montefiore Health System since about 2015. We implemented our first AI algorithms back in 2019 and have seen steady growth in our portfolio since that time.
Great! Thank you so much, Dr. Kaminer. And Dr. Gupta, let me turn to you next.
Yeah, thank you so much for hosting us. I'm really excited to be here. My background is that I was previously an academic radiologist and faculty at Harvard Medical School and University Hospital. As an academic radiologist, I had a great experience teaching residents and fellows and seeing their excitement for artificial intelligence. During that time I also served as an adviser and consultant for multiple Fortune 500 companies trying to figure out, you know, AI strategies. It's all kind of new right now, and I'm really excited to be here.
Attendees, we have a lot in front of us today. Over the next hour, here are a few highlights of the discussion we'll dig into. We're going to cover a number of important topics around AI and medical imaging specifically. Today's experts will explore the current landscape of AI adoption in imaging, how AI has helped across care teams, and what to look forward to when it comes to partnerships and AI investments.

So, that said, let's start by doing some level setting. We hear quite a bit about AI today. Dr. Gupta and Dr. Kaminer, folks are considering what the best approach is to help them determine whether AI should even be part of their practices, and if so, what AI is most effective. Can you begin by telling us where we are today with the adoption of AI in imaging at large? What are you seeing? Dr. Gupta, I'm going to turn to you first.
Sure. Well, we're at a really exciting point, I think, in AI and radiology. About one third of radiologists are currently using AI in some capacity in their practices, and even more plan to integrate AI into their daily practice in the next several years. The way I like to think about AI, broadly speaking, is that we can divide it into two big buckets: image interpretation AI and workflow-related AI.

So right now a lot of large radiology practices and hospitals are using AI for a variety of image interpretation use cases. Examples include identifying stroke, intracranial hemorrhage, acute pulmonary embolism, and rib fractures. Others are using AI more for workflow. Workflows being, you know, incidental finding follow-up, some of the administrative tasks that we need assistance with, assistance with radiology reporting, or even improvement in image quality in MRI and PET, decreasing scan time and using AI to improve the image quality. So we have lots of exciting developments in this space right now, with a lot of investment and innovation in the AI space.
That's really helpful to understand: one third of radiologists using AI right now in their practices, and even more, like you said, planning to integrate it into their daily workflows in the years ahead. Dr. Kaminer, let me turn to you and get your sense of the experience you've had with AI. Can you tell us about what's worked and what hasn't for the radiologists in your group?
So, you know, my practice has an inpatient side and also a large outpatient side, and the AI algorithms we implemented differ between the two. For the hospital-based practice, we focused more on AI that will find ED critical findings that we either might have missed or that we want to catch earlier and move up in our workflow so we read them faster. We first started with these ED interpretation algorithms, as I'll call them, that look for pulmonary embolism, intracranial hemorrhage, free air in the belly; things that, should we miss them, could have severe outcomes for the patient, and things that we might want to alert the ER about faster. So we focused on that first.

Then on the outpatient side, we just recently implemented a new algorithm for reading mammography, which at first just prioritized studies that were abnormal but now will actually show us the cancer, and can find cancers about 10% better and earlier, by as much as two years. So we have implemented that in outpatient. We're close to implementing a prostate MR algorithm in our outpatient practice, because those studies are very difficult to read and we needed some help reading those as well. On the inpatient side we have algorithms for care coordination, which we can talk about later in our discussion. And we're also right now implementing a test of report generation, where an AI algorithm will generate the impression in our reports, to try not only to speed up the radiologist but also to make a cohesive report that, I guess the best way to describe it, abides by standards created by the American College of Radiology. So it kind of puts it all together in an impression for us.
Thank you so much. That's a great overview of the different algorithms that you and your groups are associated with. Like you mentioned, the one for ED interpretation looking for PEs and other serious issues that, if missed, would have severe outcomes for patients, as you said. And then the other algorithm for mammography, identifying issues up to two years sooner than they otherwise would be detected. So some great improvements, it sounds like, Dr. Kaminer, in clinical practice. But if we take a step back and think about Nyack: usually when these solutions are rolled out, there is some problem or problems to be solved, right? I'm curious, in your case, what were some of the challenges Nyack was running into? And how did AI help address them, or drive better workflows to coordinate care, especially across teams? Can you speak to that a bit more?
Yeah. There are a couple of different topics here, and I'm going to break them out a little bit. Let's talk first about how we integrated AI into our workflow. Prior to AI implementation, we had a workflow manager which we had set up to organize the radiologists' workflow so they more optimally read the studies in concert with the hospital's expectations. So we went to the ICU team and asked what time they round and when they want their reports ready. We went to the hospitalist team and asked the same question. We went to the ER and asked the same question, and then we created a workflow that more optimally coordinates the radiologists' reading with that of the care teams we support.

Using that workflow, we were able to reduce our ED turnaround time by 38%. And not only that, but we had an agreement with the ER that we would read the ER cases within 30 minutes, and that compliance went from 79% prior to workflow organization to 94%, which was a great help to the emergency room. We also improved our ICU report turnaround time by 50%. This was before we even had artificial intelligence, and our mantra at that point was that we wanted to read the most important case first. That case might be in the ICU or the ER, depending on how critical the patient is; those are our most critical patients.

But with AI we were able to add a new layer to that, which is: if you want to read the most important case first, what's more important than a case that you know is positive? So the AI would scan the study, determine if there's a positive finding, and then automatically raise that case to the top of our worklist so that it was read next.
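The worklist prioritization described here can be sketched in a few lines of code. This is an illustrative sketch only: the unit names, priority values, and `Study` fields are all hypothetical, not any vendor's actual implementation. Studies are ordered first by whether the AI flagged a positive finding, then by the per-unit priority agreed with each care team.

```python
import heapq
from dataclasses import dataclass, field

# Hypothetical per-unit ordering agreed with the care teams
# (lower number = read sooner).
CARE_TEAM_PRIORITY = {"ICU": 0, "ED": 1, "INPATIENT": 2, "OUTPATIENT": 3}

@dataclass(order=True)
class Study:
    # Only sort_key participates in comparisons for the heap.
    sort_key: tuple = field(init=False, repr=False)
    accession: str = field(compare=False)
    unit: str = field(compare=False)
    ai_positive: bool = field(compare=False, default=False)

    def __post_init__(self):
        # AI-positive cases jump ahead of everything else (0 < 1),
        # then we fall back to the per-unit ordering.
        self.sort_key = (0 if self.ai_positive else 1,
                         CARE_TEAM_PRIORITY[self.unit])

worklist = []
heapq.heappush(worklist, Study("A1", "OUTPATIENT"))
heapq.heappush(worklist, Study("A2", "ED"))
heapq.heappush(worklist, Study("A3", "ICU"))
heapq.heappush(worklist, Study("A4", "OUTPATIENT", ai_positive=True))

# The AI-flagged outpatient study is read next, ahead of the ICU case.
next_study = heapq.heappop(worklist)
print(next_study.accession)  # A4
```

The key design point in the passage is captured by the two-level sort key: a positive AI result outranks any unit-based priority, while untouched studies keep the ordering negotiated with the care teams.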
By adding this layer into our workflow, we were able to improve our ER turnaround time by at least 17%, and that's an underestimate, because all those cases were positive and I could not capture when the ER was actually called with the result; every one of them required a phone call. So that 17% understates the actual improvement in turnaround time. That was how we layered this AI into our workflow to try to support the hospital.
We did have challenges when we went live. Some of the challenges were with the radiologists: just change management, you know, getting the radiologists to accept it. We had a problem with the AI taking too long for the results to come back. The radiologist would read the study before the AI results came back, and we had to create workflows to try to capture those positive results, tell the radiologist that there's an AI-positive finding, and make sure it coordinated with their interpretation. And then we just had some radiologists who didn't want to wait for the AI. But eventually we worked with our vendor, we worked through some of the issues, and now the AI turnaround is such that we have the result before the study is even at the radiologist for interpretation. That was one of our main challenges: just the speed with which the AI results come back to us.
That makes a lot of sense. Thank you for illustrating how this has gone at Nyack, from the beginning to date. And I think you mentioned a few really impressive metrics there, Dr. Kaminer, in terms of compliance with turning reads around in 30 minutes or less, and your turnaround times improving by that 17%, an underestimate, like you said. Is there anything else worth mentioning about feedback from the many teams that your department works with, in terms of how the solution has been embraced? You mentioned there was a little bit of reluctance at first, but people did get on board. Is there anything else worth mentioning? Because, as you said in your remarks, you really are trying to coordinate outcomes for a number of different care teams and departments, and I'm curious if there's any feedback worth sharing here.
So there are actually two phases to that question. When we first went live, the information was just presented to the radiologist, and the ER was thrilled that we had it. It was something they spent a lot of time talking up: that we had AI to improve not only our interpretations, to find those abnormalities which, quite frankly, could be missed, but also to speed up the result turnaround time.
But eventually we also moved into a new AI algorithm in what's called the care coordination space. What that does is use the AI to coordinate various teams in your hospital. The most common example is the stroke team, but there's also a product out there that works with a pulmonary embolism rapid response team. For the stroke team, we were able to coordinate care not only in my own hospital but across the entire health system. What the system does is this: the PACS sends the images from a brain CT or a CTA of the brain to the AI. The AI then interprets it, and if it finds a positive result, it notifies everybody on the care team, which would be the ER, the neurologist, and also the interventional radiologists who do the clot extraction and other procedures, which happen at another hospital. It also notifies the radiologist, who of course then does the final read just to make sure the AI is correct.

But by getting everybody on board and moving that result up sooner, we can improve outcomes, because as we know, in stroke, time is brain tissue. By notifying everybody on the care team at once, everyone can start organizing faster to take care of that patient. And we were also able to make that work across the entire health system, so that all the smaller hospitals can send their more critical patients to the tertiary care center for therapy as
needed.

Dr. Kaminer, this is a departure from the norm, because everyone on that care team, the stroke care team, is getting the same notification at the same time, whereas before it might have been a ladder of calls, or a tree?

Right. Prior to this, we would call the emergency room doctor, who then would activate the care team. They would have to call the neurologist, they'd have to call the pharmacy, and they'd also have to advise the tertiary care center that there might be a patient coming their way. Now everybody gets notified at once.
They're also able to communicate through this application, so they can coordinate their care: "I looked at the CT of the brain; this patient is not a candidate for intervention," which changes our practice here at our local hospital. Or the interventional radiologist could say, "You know what, this patient is a candidate for clot extraction," or whatever they're going to do, and they can then coordinate with our local team to get the patient transferred more quickly. And as we all know, transfers take a long time to happen. So it really does speed up the care of the patient by getting everyone on the team notified at the exact same time.
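The contrast drawn here, a sequential ladder of phone calls versus a simultaneous broadcast, can be sketched roughly as follows. Everything in this sketch (the team roles, the notion of "calls waited") is hypothetical and purely illustrative of the timing difference, not how any real notification product works.

```python
# Hypothetical stroke care team, in the order they would have been
# called under the old sequential model.
STROKE_TEAM = ["ED physician", "neurologist",
               "interventional radiologist", "pharmacy",
               "tertiary care center"]

def notify_sequentially(team):
    # Old model: each call happens after the previous one, so the
    # last member learns about the patient len(team) - 1 calls later.
    return [(member, i) for i, member in enumerate(team)]

def notify_broadcast(team):
    # AI-triggered model: everyone receives the same alert at once,
    # i.e. with zero calls waited.
    return [(member, 0) for member in team]

worst_sequential = max(delay for _, delay in notify_sequentially(STROKE_TEAM))
worst_broadcast = max(delay for _, delay in notify_broadcast(STROKE_TEAM))
print(worst_sequential, worst_broadcast)  # 4 0
```

The point of the sketch is simply that the worst-case delay in the sequential chain grows with the size of the team, while the broadcast model keeps it constant regardless of how many people are on the team.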
You know, it is a little more complicated to implement such a product, because there are more stakeholders in it, and you still have the problem that the AI is not perfect; none of them are perfect. So there are always going to be false positives and, quite frankly, false negatives. The radiologist is still an integral part of this whole process, to make sure that the AI is accurate. Sometimes we get activated and have to stand down, and sometimes we have to use the old method of activation. But in general it makes a significant improvement in patient care across the entire organization, both at my own hospital and across the entire health system.
That's such a great point, that these solutions are by no means replacements for radiologists but can help augment them in so many ways, as your remarks just beautifully illustrated. Dr. Gupta, let me turn to you here, because I think Dr. Kaminer's remarks really help us see all the various types of burdens that AI can help reduce, whether cognitive or administrative. Can you talk a bit about how you see AI helping to alleviate the burdens that physicians might otherwise run into in their day-to-day practice?
Yeah, absolutely. We have to really pay attention to AI in the workflow. You know, we don't want to adopt an AI solution that makes the workflow tougher for the radiologist and puts more of a burden on them, because ultimately that will hinder adoption. It comes down to choosing AI partners and vendors that are able to work with your system and your group to make sure the workflow stays efficient, without a lot of extra applications or clicks involved, so that the interaction is smoother and less distracting, and you're able to get all the benefits without some of the negatives there can be in adopting a new technology.
Dr. Kaminer, anything you would add in terms of burdens, and what you've seen really be alleviated with the rollout of this tool?
Yeah. When we first implemented it, look, the AI is a double read for the radiologist. We know that the radiologist has a certain error rate, and we know that the AI has a certain error rate. We're just hoping that, if you look at the Swiss cheese model of errors, those two holes don't align, so that there isn't an error or a finding that gets missed by both of them.
So the radiologists really ended up liking this. Even though there's not a large number of cases where the AI found something the radiologist did not find, they like having that second read. We know that second reads improve quality; we knew that from the mammography space many years ago, but we just could never afford to have two radiologists read every single study, which is too costly. But now, when one of them is a computer, we get the benefit of that second read at a much more affordable cost, which should improve quality and reduce errors. When we first rolled it out, we asked the radiologists to tell us how many of these cases the AI actually made a difference on, and it turned out to be not a large percentage. But then when we asked the radiologists, okay, granted it didn't increase the number of cases where you had a change in interpretation, should we pay for this or should we pass on this technology? Every single one of my radiologists said we want to keep it.
And the reason why is the benefit. We're trying to reduce stress for all doctors. Burnout and stress are a health-system-wide problem. Every doctor is seeing more patients and has to work faster, and we're all dealing with stress and burnout. It's not the positive findings that cause the stress for the radiologist; it's the negative findings, the study that has nothing on it. Because those are the cases you're signing off when you say, I don't see a lung nodule, I don't see a pulmonary embolism, and you're not just hoping that you're right; you're using the strength of your practice and experience to make sure you don't make an error. But by saying, I've looked at the study and I don't see anything, and now I look at the AI and the AI doesn't see anything either, that extra reassurance is a big stress reliever for my radiologists. Especially when I have radiologists occasionally reading for outside groups, especially on call: for a body radiologist to know that the AI didn't see an intracranial hemorrhage, and for the neuroradiologist to know that the AI did not see a pulmonary embolism, was also a big stress and burden reliever. Knowing that there was a second algorithm looking over your shoulder to make sure you didn't make an error was a big stress reliever, and it led all my radiologists to say we want to keep this technology.
Right. So, like you said, you could never afford a second read for every image, because, as you said in your opening remarks, there are 90,000 diagnostic imaging procedures a year at Nyack. Okay, so I have a question for you, Dr. Kaminer: how do you describe to your leadership team what you just described for us, what the physicians are feeling in terms of that added support? Is it added protection? Is it a safety net? How do you describe what you just discussed?
I actually call it a safety net. I mean, if you think about trapeze artists: they're doing their job, swinging up there, grabbing each other's arms, and occasionally they miss, and there's a safety net at the bottom to catch them if they fall. That's the way I look at it. The AI that we have is a safety net to make sure that we don't miss critical findings. Prior to the AI implementation, we had a patient with an intracranial hemorrhage that was not picked up by the radiologist, who went on to get anticoagulation and had a bad outcome because of that. The AI, in retrospect, would have picked up that case and led to a different outcome for the patient. So this technology does significantly improve the quality of the radiologists' reads. And it also brings an improvement, unquantifiable at this point, in our efficiency, because we don't have to look at a study quite as long when we have that safety net there as well. So it has two improvements right there, at least for the radiologist workflow.
Attendees, I see many questions pouring in. Thank you so much. We have a couple more questions to get through in our prepared discussion today, and then we'll dive into your questions with Dr. Gupta and Dr. Kaminer. So thank you, and keep them coming in if you will. Dr. Gupta, I'm going to turn to you. We've seen AI embedded into workflows a great deal over the past couple of years specifically, and one of the hot topics, if you will, that seems to be popping up time and time again is incidental findings, from screening programs and from other exams. Can you help us understand the impact that AI has on the management of those incidental findings?
Yeah, absolutely. You're right, it is a hot topic and one that we're hearing a lot more about. I think this falls into that bucket of AI for workflow. As we start to move past COVID, our screening programs are seeing all of the volume of patients who were not doing their annual screenings during 2020 and 2021. All of them are coming back now, and incidental findings are popping up on a lot of these exams. Incidental findings are findings that need further care and follow-up, and sometimes that communication gets lost, because the patient came in for one issue and then we see an incidental finding which actually has nothing to do with the reason the patient came to us or why they had their appointment, but it is equally important. It means further care: either repeat imaging in a certain time interval, say six months or a year, or it could mean they need an appointment with a specialist. So this presents a big workflow problem in a lot of ways, because the communication back to the referring physician and the patient about this finding, and the fact that they need a little extra care for it, can sometimes get lost. Our goal is to make sure that that finding doesn't get left unnoticed, or untreated, or just lost among all the other things going on in the patient's care journey.
So AI can be used to address this by communicating with a patient navigator. It can be automated to get an incidental finding back to the referring physician and to track that communication, whether it's by phone or fax or certified mail; it can automate that process and track it. And when the finding occurs, the study can be sent to a special worklist and go through a process to make sure that the diagnosis gets back to not only the physician but also the patient. This kind of workflow improvement alleviates a lot of the manual work that's involved. Again, we talked earlier about Dr. Kaminer's 90,000 exams passing through the system. If you imagine a typical radiologist taking care of probably over 100 patients a day, that's a lot of incidental findings and a lot of manual work that adds up, whether it's keeping a separate list of patients with findings that need to be faxed or called to another office. Keeping track of all of that can really be automated by AI, and that's what I'm really excited about.
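The tracking workflow described here, routing each flagged finding to a follow-up worklist and logging the communication back to the referrer until it is closed out, can be sketched as follows. The class and field names are hypothetical, invented for illustration; real products in this space expose very different interfaces.

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class IncidentalFinding:
    """One flagged finding on a hypothetical follow-up worklist."""
    patient_id: str
    finding: str
    follow_up: str                      # e.g. "CT chest in 6 months"
    communications: list = field(default_factory=list)
    closed: bool = False

    def log_communication(self, channel: str, when: date):
        # channel might be "phone", "fax", or "certified mail",
        # matching the channels mentioned in the discussion.
        self.communications.append((channel, when))

    def close(self):
        # A finding can only be closed once at least one
        # communication to the referring physician is on record,
        # so nothing drops off the list unnoticed.
        if not self.communications:
            raise ValueError("cannot close: referrer never notified")
        self.closed = True

worklist = [IncidentalFinding("P123", "4 mm lung nodule",
                              "CT chest in 6 months")]
worklist[0].log_communication("fax", date(2022, 5, 1))
worklist[0].close()
print(worklist[0].closed)  # True
```

The design point is the guard in `close()`: the item cannot leave the worklist until the communication is logged, which is the automated version of the "separate list of patients to fax or call" that Dr. Gupta describes radiologists keeping by hand.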
I think the context that you place your remarks in is so important: the pandemic, people who put off or had to delay their care, and how costly that has been for people and for the health system. And then it comes down, like you said, Dr. Gupta, to workflow: making sure those incidental findings are communicated with urgency, because if they ever slip through the cracks, that only adds to the problem. So it's great to understand how AI can really solve that challenge, or at least make improvements on it. I wanted to talk about another hot topic, and that is the debate about implementing AI on premises or in the cloud. Dr. Kaminer, what has been your experience with this? Do you see one choice as stronger than the other?
Yeah, I've actually given a lot of thought to this particular topic. The on-premises solution is attractive to many hospitals because they like to maintain control of their server infrastructure and also the patient health information; it's a big concern for institutions to make sure that patient data is secured as it goes over the internet. But the problem with that solution, at least at my institution, is the burden of supporting that equipment. The servers need maintenance, they need operating system upgrades, they need security updates to make sure we don't get a data breach, which would of course put the patient information at risk. The equipment occasionally goes end of life. There's nothing worse for me than having to budget, you know, a million dollars for a disk farm for my PACS because, while it works perfectly fine, it's end of life and the vendor is no longer supporting it. That takes a million dollars out of my budget that I could be using for something else, basically just to buy something that sits in the closet and doesn't have any ROI beyond what already exists; I'm just replacing what I had. So the on-premises solution has a huge cost associated with it, which I would rather shift to my vendor. Let them worry about operating system updates; let them worry about security patches.
The cloud services, like Google Cloud and Amazon Web Services, are going to be much better at security than we are at our hospital, so I'm basically shifting the burden to them. I have a very small footprint at my institution, just a virtual server, which is very inexpensive for me to create and maintain, and I let them worry about everything else. If they have to upgrade their disks, let them worry about it. I have a fixed monthly fee that I can budget for easily. I don't have to worry about these major updates every couple of years to replace servers and disk farms and the like, and I also need a much smaller IT staff to manage this, so I have savings there. So in general I have been looking for solutions that shift the burden of computer infrastructure support to my vendors and off my own internal staff, and that give me a fixed budgetary fee I can maintain, rather than, like I said, these major infrastructure improvements every couple of years that I've got to pay for.
That makes a lot of sense. Thanks for outlining your thinking and the pros and cons there for us so cleanly. Let's talk about the AI partnership itself. You both bring a different point of view to this question, but what are some key takeaways for any attendees with us today who have been debating a partnership or are looking to pursue one in the near future? What are some things they should keep front and center in their thinking? Dr. Gupta, let me turn to you first.
I think the first step is just to identify the issues that you want to solve in your practice, and not just get AI for the sake of getting AI. It really helps to have a clear list of problems that you think AI can help with: specific areas where you'd like some help in your practice. Those problems vary, based on practice size and geographic region. It could be workflow related, it could be image quality related; maybe you need help with scheduling or image interpretation. Once you identify the specific problem you want to solve, the next step is to find resources to support that journey for your practice or your group. Our professional societies in radiology have really done a great job with this. Radiology, as you know, is using more AI than a lot of other specialties within medicine right now, so our professional societies have been on the cutting edge of that, and they have excellent forums to guide practices through these steps. Interacting with your colleagues is another way: contacting friends from residency or fellowship, or colleagues working in other practices, to compare notes and ask what they're doing with AI or what AI they might be planning to adopt. That's a great way to get practical recommendations. And we've heard a lot from Dr. Kaminer, who has great hands-on experience evaluating multiple AI partnerships, and I think that's a really great opportunity for us to learn from.
So come with some clear goals in mind; don't just do it to hop on the AI bandwagon. That being the number one pointer from Dr. Gupta. Dr. Kaminer, you look at things from the other side of the aisle. What would you say to colleagues at health systems and practices who are looking for an AI partner? What are your recommendations?

Yeah,
I think, you know, we started this with a simple trip to the RSNA, which is the big radiology show every year, the week after Thanksgiving. They have a whole floor devoted to AI, with just about every vendor you can imagine there. It's a good way to get a sense of the landscape: who the players are and what types of solutions there are. There are different ways you can do this. You can go to each vendor individually and set up your own solution with them, but there are also what are called AI marketplaces, which are kind of like app stores for AI. There are several of these, and some might be vendors you already work with, like Nuance, whose PowerScribe you may use for radiology reporting. A lot of the PACS vendors have their own AI marketplaces. They act as a middleman between you and the AI vendors and give you a common interface for all of them, which reduces your infrastructure: you have one server serving all the AI algorithms rather than building a separate server for each one. The only problem is you've got to make sure that the workflow works for you. I've seen a bunch of these workflows, and I think a lot of them make it harder for the radiologist to get their work done. Radiologists are very thoughtful about efficiency, so you really want to make sure that whatever solution you choose has a minimal impact on the radiologists' efficiency; otherwise you're not going to get big buy-in from the radiologists.
The other thing I'd like to point the listeners to is a new article just published in the journal Radiology by big players in the radiology informatics space, Keith Dreyer and Curtis Langlotz. They published an article going over AI governance and how to choose AI algorithms. It's an excellent article about how to dive into AI and make investments that are useful for your institution, including an algorithm they developed for deciding which AI algorithms are worthwhile for your institution. So if you're looking at AI, it's worth spending time with that article; it's very informative and helpful.
A helpful suggestion, thank you so much. We are
now going to turn to the questions from our attendees.
There are many that have come in at this
time. If any of you still have
questions that you have not yet entered into that Q&A box
on your screen, by all means, please do so;
we're devoting the rest of our time together to answering
your questions. So I'm going to
go right to one that speaks to what
you were just sharing, Dr. Kaminer. An attendee
would like to know: what AI vendor does Dr. Kaminer
use? Can you tell us more?
Yeah, so we
try to be vendor neutral in these talks, but
we use Aidoc for our
hospital-based solutions,
and we also use RadNet/DeepHealth
for our private
practice solutions, for
our mammography and prostate MRI.
Thank you.
Another
question for you, Dr. Kaminer: are the AI results
imported into the EMR?
So, good question.
We have the choice about whether
to do that or not. We
elected as a group not to include
the AI results in the EMR.
We don't store the AI
results in our PACS system either, but
you could if you wanted to.
We just decided that,
since we're the ultimate
arbiter, the ones who make the final
determination, we didn't think
it was necessary to put the AI result
in the EMR.
Great.
Dr. Kaminer, I'm going to stay with you for this next question;
Dr. Gupta, I would welcome your perspective
on this one as well. I think it's an interesting question.
From your perspective, have you found that
patients are more or less comfortable
with the radiology report knowing
AI was leveraged? Or, in general,
does the patient not know what tools were
leveraged for the interpretation?
Yeah, you know, we did a little advertising
around AI when we first got it,
but not a lot.
And there's no...
well, that's actually not 100% true.
Our mammography reports do say that
they were scanned by an AI, just like
they used to say they were scanned by a CAD program.
So we do put that in our mammography reports.
Our other radiology reports
do not state that the study was scanned by AI.
And I do not know
the answer to the question of
which they would prefer, from the patients'
perspective.
I know my preference would be that
I definitely would want AI scanning, because there's
no question that it improves
outcomes.
It finds mistakes.
From my personal experience, there are plenty
of cases where AI has made a change
in my interpretation. There are also plenty
of times where I found something that the AI did not
find. So it's
a cooperative
relationship between AI and the radiologist.
AI is not going to be replacing radiologists
anytime soon. We're going to
see
an efficiency improvement from AI, just
like we saw an efficiency improvement from PACS systems
back in the mid-2000s.
So eventually we'll see that
it will speed up radiologists and make
us more efficient. But
I don't think it's replacing the radiologist anytime
soon. Anything
you can speak to on this when it comes to patient
attitudes? Any literature, research,
or even anecdotal evidence that you would
bring to the table?
I think it's so early stage right now
that it's hard to answer that question
for patients, because,
like Dr. Kaminer said, typically
the reports don't have a notation
that says AI was used.
I think it would be up to the individual practice or
hospital if they wanted to include that information.
But it is used sometimes for marketing,
to
let patients know that your
hospital system is, in my
opinion, at the cutting edge, because you're evaluating
new technologies and using them
to improve patient care and
kind of taking that leap. But
again, I know my preference
as a patient would be that my doctor
used AI.
Thank you both for sharing your thoughts
on that one. It sounds like there's still a
lot of TBD in terms of
understanding patient attitudes at a large scale
about that tool and how it's communicated to them.
Another question for
either of you, or both of you.
This attendee would like to know about the potential
for result bias based on the AI interpretation.
Have you implemented any different peer review
processes for studies
read with AI?
Dr. Kaminer, can I turn to you first for your
thoughts here? And then, Dr. Gupta, if you have anything to
add, I welcome your input too.
Yeah. When we first started with
AI, we did
have each radiologist
fill out a form. It was
a computer form:
true positive, false positive,
true negative, false negative. From
there we got our accuracies
and all the statistics that come with those values,
and our
measurements were similar to what was published by
the vendor.
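For readers curious how those statistics fall out of such a form, here is a minimal sketch, with made-up counts rather than the practice's actual data, of the metrics you can derive from the four tallies Dr. Kaminer describes:

```python
def confusion_metrics(tp, fp, tn, fn):
    """Compute standard accuracy statistics from confusion-matrix counts."""
    total = tp + fp + tn + fn
    return {
        "sensitivity": tp / (tp + fn),   # true-positive rate (recall)
        "specificity": tn / (tn + fp),   # true-negative rate
        "ppv": tp / (tp + fp),           # positive predictive value (precision)
        "npv": tn / (tn + fn),           # negative predictive value
        "accuracy": (tp + tn) / total,
    }

# Hypothetical tallies gathered from the radiologists' feedback forms
m = confusion_metrics(tp=45, fp=5, tn=940, fn=10)
print(f"sensitivity={m['sensitivity']:.1%}, specificity={m['specificity']:.1%}")
```

These are the same figures a vendor publishes in its validation studies, which is what makes the side-by-side comparison Dr. Kaminer mentions possible.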
We did ask whether AI made a difference
in the interpretation. We found that it was a very small
percentage, but like I said earlier,
it wasn't so much
the number of cases with a changed report; it
was the fact that it reduced stress and
improved accuracy, which overall
led us to keep the product.
There's definitely a bias
in those values. There's no question,
because we did
not screen the AI results from
the radiologists, so there's
no question there's a bias in there. And
no, we have not implemented any
new peer review process for our radiologists
around this.
Dr. Gupta, anything that you would add?
Yeah, it does require,
like Dr. Kaminer said, a manual review,
which can be kind of labor intensive:
having radiologists review
the cases where either there's a disagreement
with the AI, or there's
an agreement but they need to make sure the
report lines up with what the AI found.
And sometimes you can do the review
after the fact. You could ask your
AI vendor to just give you a list of the positive
cases, and then you can have a
radiologist or a group of radiologists
go through the cases manually and look
at them and see if they all agree. So
there are a lot of different ways to do it,
but it can be labor
intensive on the front end. I usually
encourage practices to talk to the partner
that they are considering and
discuss how they want to evaluate
the metrics together, so they can
lean on the vendor's expertise,
especially if the vendor is deployed at multiple sites,
because all the practices are doing roughly the same
thing, and to have something like
a checklist of how to evaluate it.
You know, can I just add one more thing, now that I think
about it? One of the things
that Dr. Langlotz suggested
is that
when you first implement it, you don't roll it out
to everybody; you first implement it silently
in the background, so that you can make sure it
works and make sure you don't get this bias. When
we rolled out our mammography algorithms,
we first did not show
the individual radiologist the results. The
results would go to a second radiologist,
who would get the AI results,
decide whether they agreed with
the AI or the original radiologist,
and then have a conversation with the radiologist
if they thought there was a miss.
That gave us a way of
blinding the radiologists
to the AI results as we worked
through the issues with the AI
and made sure we didn't have a bias
at the outset.
Eventually we unblinded
the radiologists, and now the radiologist
sees the results right away. But they recommend
this first phase where you have a
blinded go-live.
Great. Thank you so much for your thoughts
on that question. It looks like we're
winding down here. There was
one other that I wanted to
share with you, Dr. Kaminer, and get your response to.
This attendee said: Dr. Kaminer, you
surprised me a bit on the cloud approach.
How about the privacy of patient data?
Would you prefer to have AI run behind your firewall
and transfer data directly into your EMR
without going into a virtual server somewhere?
Let's assume the vendor still takes care
of all the updates and maintenance.
So that's a good question. The way
at least our vendor does it, there's no
patient health information that goes across
the internet. The only thing that goes across
the internet is
the images, which have been de-identified
and assigned a token. So
the vendor itself
doesn't know who the patient is; that all stays
local on our site. So only
the images go across, and then we
query their server in the cloud:
I have
this token, do you have a result back on this
token, yes or no? That's
it. So there's never any patient information that goes
across at all. You know, whenever
there's a cloud solution, there's got to be
security behind it.
Look, you know, we all
use the internet for our banking, at
least most of us do, and
that seems to work pretty safely. So
I think these security issues have
been worked out; at
least I have faith in those security measures.
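The tokenized workflow Dr. Kaminer describes can be sketched roughly as follows. This is a hypothetical illustration of the pattern, not any vendor's actual API: the token-to-patient mapping never leaves the hospital, and the cloud only ever sees tokens and pixel data.

```python
import uuid

# On-site mapping: token -> patient identity. This never leaves the hospital.
token_to_patient = {}

def deidentify_study(patient_id, pixel_data):
    """Strip identity, assign an opaque token, and return only what may go to the cloud."""
    token = uuid.uuid4().hex
    token_to_patient[token] = patient_id            # stays on-prem
    return {"token": token, "images": pixel_data}   # no PHI in this payload

# Stand-in for the vendor's cloud service; it stores results keyed by token only.
cloud_results = {}

def cloud_analyze(payload):
    # A real service would run its model on the images; for the sketch we
    # simply record a finding against the token.
    cloud_results[payload["token"]] = "positive"

def poll_result(token):
    """'I have this token -- do you have a result back on it, yes or no?'"""
    return cloud_results.get(token)  # None until the cloud has answered

payload = deidentify_study("MRN-001234", pixel_data=b"...")
cloud_analyze(payload)
finding = poll_result(payload["token"])
# Only on-site can the token be resolved back to a patient:
patient = token_to_patient[payload["token"]]
```

The design choice is that re-identification requires the local mapping, so even a breach of the cloud service exposes images and tokens, not patient identities.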
And quite frankly, you know, there have been data
breaches at hospitals too;
there have been hospitals that have
had to pay ransoms to
get their data back. So
there are security issues on both sides,
but
I think the cost savings outweigh
the security concerns here. I do really think
the security has been worked out.
Yeah, and to add to that, the
information is generally more secure in the
cloud.
I think it's a common misconception that
patient data is more secure on-prem.
There's a lot of security that goes
into a cloud solution,
sometimes even more advanced than what an individual
hospital or health system
or practice might be able to maintain.
I agree 100% with what you just said. I agree
with that 100%.
Thank you to the attendee for sharing their
reaction and their thought process, and for raising
that question so our panelists could expand more
on their thinking. We've
got a few other questions. In the
event that a positive finding is missed
by both the AI and the radiologist,
how will that open not only the radiologist
but the AI vendor to liability?
Any thoughts here, Dr. Gupta?
Yeah, I mean, this is definitely a new
area that's kind of developing. We don't
really have a lot of legal precedent for
it right now. But I think ultimately
the radiologist is responsible for
their read and their interpretation.
Do you see it similarly,
Dr. Kaminer? Ultimately it's the radiologist.
I mean, there will come a time when AI
might screen the normal studies and
only show the abnormals
to a radiologist for interpretation, but we're
not there at the moment.
As far as how
the AI results get discovered by
attorneys and that kind of process, I don't
know. We're still very early in the process
of AI adoption. But
ultimately it's the
radiologist that's responsible.
Okay.
There was a question
about reimbursement that came in from an attendee. Dr.
Gupta, I'm going to turn this one to you. Are
you seeing that AI is reimbursable
in some manner?
Yes, there are, you know, some
CPT codes associated with AI reimbursement,
and I think that
helps with the bottom-line
ROI that needs to be
examined when you're considering investing in
an AI solution. But you know, Dr.
Kaminer brought up a really excellent point earlier
about radiologist stress levels
and burnout, and that is
kind of a soft ROI, you could say,
because we really need radiologists,
and we kind of have a shortage
right now, actually. And so,
really investing for the long term,
if we think about 10 years from now,
we really need to keep our radiologists in the
workforce, and we see rising rates of
burnout and stress. And
so, you know, if all of them are saying
they want to keep the AI because it's helping with that,
that's a really, really important metric as well.
And attendees, if you are curious about
the article that Dr. Kaminer had mentioned,
some questions have come in asking him to re-share
the publication and the authors.
So anyone who has a pen
and paper nearby can jot this down.
Dr. Kaminer, we'll turn it over to you to re-share
that article. Sure.
It was just published; actually, I think
it's only online at the moment, in the journal
Radiology.
The title of the article is "Implementation
of Clinical Artificial Intelligence in
Radiology: Who Decides
and How." The first author's
last name is Daye,
D-A-Y-E. But
the two principal authors are
Keith Dreyer, D-R-E-Y-E-R,
and Curtis Langlotz,
L-A-N-G-L-O-T-Z. A very
good article, very timely, especially
for this particular discussion,
and I advise everyone
to review it.
Attendees, thank you so much for your questions.
This is all the time we have for today.
I want to thank our speakers for an excellent
discussion, and also thank Change
Healthcare for sponsoring today's webinar.
To learn more about the content that was presented
today, please check out the resources section
on your webinar console and fill out the post-webinar
survey. Thank you very
much, on behalf of Becker's, for joining us today.
We hope you have a wonderful afternoon.
Thank you.
Thank you
Related resources
White paper
By applying analytics solutions, states can create a comprehensive roadmap to better health equity.
Case study
Using digital integration to ease administrative burdens can help you spend more time with patients and deliver positive patient outcomes.
White paper
Read the white paper for strategies on controlling fixed costs, efficiency and organizational flexibility.