In the study, they assessed time to treatment across two different care pathways: one without the app, and one with it.

The key difference between workflows is that the app bypasses the radiology read and sends a result straight to an interventionalist.
They use this graphic to show the difference. However, there are some rather odd inconsistencies.
1st - note that the two pathways start completely differently. In fact, the non-AI pathway even repeats a CTA study!

Then, in the pre-AI pathway, the technologist has to reconstruct the images - but this step is missing in the post-AI pathway.
2nd - note that the post-AI pathway shows only 3 total steps, compared to pre-AI of 10. Looks great, huh?

In addition to the missing steps (see the tweet above), they have actually combined three steps into one!
Nothing in this study pertains to the actual accuracy of the AI itself. Most of the time savings can be accounted for by simply skipping the radiology read and sending the scan straight to the interventionalist - but that's not what was tested. A true study would have a control group.
Now, I'm not saying this is a bad product - I'm saying that claiming the AI is the time saver here is clearly not true; it's the app functionality.

So...does the AI save 66 mins of time? Is it worth $1040 a pop? No!

The app functionality plus the care pathway redesign might be.
There is likely other evidence and modelling that I haven't seen which may clarify things further, but based on this study alone, I don't think the argument stands up.
You can follow @DrHughHarvey.