[Guest Blog] Using AI to Save Lives
Published Jul 01 2020 01:13 PM

This post was written by MVP and Regional Director Tim Huckaby as part of the Humans of IT Guest blogger series. Read on to learn how Tim is helping to save lives through computer vision and AI.

I have been on a 5+ year computer vision journey. It’s been an exciting ride. As Moore’s Law continues to hold, delivering ever more powerful processors, the correlated gains in precision and performance in artificial intelligence (AI) - specifically machine learning (ML) - have been staggering.

Computer vision is just one small part of the AI/ML family. It is the execution of algorithms that analyze digital images or video. The most common (and, because of complex privacy laws, sometimes controversial) application of computer vision is facial recognition. But computer vision is so much more. From operational efficiency to real-time weapon detection to cancer research to COVID-19 detection, my company InterKnowlogy has been solving tricky problems with super interesting computer vision use cases for years.

But no computer vision project has meant more to me, or been so personally satisfying, as my story about Melissa, Connor, and a rare (1 in 8,000), lethal condition called Posterior Urethral Valves (PUV). In technical terms, this was also my computer vision pinnacle and breakthrough moment.

 

Melissa Mulholland and son. Photo credit: Suzanne Choney/Microsoft News.

 

Melissa Mulholland is a talented executive at Microsoft. I met her a few years back and we immediately “hit it off” because we shared a few medical challenges in our immediate families. We have been friends ever since. Melissa shared with me that her youngest child, Connor, had PUV. This journey was launched by a simple statement from Melissa: “PUV is rare. It is easy for a trained physician to detect. But, since it’s rare, not all physicians are trained to detect it. If left undetected, it can be fatal.” In simple terms, PUV is an in-utero blockage of the urethra in newborn males.

Since PUV is easy for a human to see and detect when trained to look for it, my theory was that it should also be easy for computer vision to recognize through machine learning. That was the hypothesis I ran with.

The interesting thing from a technology perspective is how spoiled I was (we all are) by Microsoft in terms of the tools, plumbing, and platform we get for AI in Azure. And I didn’t even know it at the time. I “cut my teeth” learning ML, AI, and computer vision on the Microsoft side of the world. It wasn’t until well after that project that I learned how hard computer vision is, especially object detection and recognition, in competing technologies like TensorFlow. Microsoft’s Azure Custom Vision portal allowed me to prototype the PUV solution without any programming. It trained my models and produced a testing harness entirely from the portal’s GUI. And it produced a production-ready REST interface. That kind of productivity just doesn’t exist in the competing machine learning worlds. I’m sure they’ll catch up, but at the time, doing this in competing tech meant a lot of command-line work and Python scripting. And for developers, Azure Custom Vision is free.
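To give a feel for what calling that generated REST interface looks like, here is a minimal Python sketch. The endpoint, project GUID, iteration name, and key below are placeholders, not values from the actual PUV project, and the request shape follows the published Custom Vision prediction API:

```python
import json
import urllib.request

# Placeholder values -- substitute your own Custom Vision project's details.
ENDPOINT = "https://<region>.api.cognitive.microsoft.com"
PROJECT_ID = "<project-guid>"
ITERATION = "<published-iteration-name>"
PREDICTION_KEY = "<prediction-key>"


def classify_image(image_path):
    """POST an image to a published Custom Vision classifier and return
    its predictions (a list of {"tagName": ..., "probability": ...})."""
    url = (f"{ENDPOINT}/customvision/v3.0/Prediction/{PROJECT_ID}"
           f"/classify/iterations/{ITERATION}/image")
    with open(image_path, "rb") as f:
        req = urllib.request.Request(
            url,
            data=f.read(),
            headers={
                "Prediction-Key": PREDICTION_KEY,
                "Content-Type": "application/octet-stream",
            },
        )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["predictions"]


def top_prediction(predictions):
    """Pick the tag the model is most confident about."""
    return max(predictions, key=lambda p: p["probability"])
```

In practice you would feed `classify_image` an ultrasound image and inspect `top_prediction` of the result; the portal supplies the real endpoint and key once an iteration is published.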

Melissa helped me find ultrasound images of male babies in utero with and without PUV. She also gave me the actual image of Connor that was taken by her husband Kyle on his smartphone aimed at a low-quality screen when Connor was diagnosed with PUV. It had been a long time since I pulled an all-nighter trying to figure out a technical solution to a problem. I’m way beyond my production programming years.

But this challenge motivated and inspired me. It was the 3 a.m. breakthrough, tinkering with Azure Custom Vision and training its recognition models on the images I had collected to identify PUV, that got me so excited. I emailed Melissa in the middle of the night to say I had figured it out and could not wait to show her. I demonstrated it to her virtually that morning. She was ecstatic.


The brilliant engineers at IK built me a real-time test container that could use any camera, like the one on my computer, to analyze images with my pre-trained PUV computer vision model. The results were staggering, and it ultimately led to a ton of press and to me doing the demo live at Melissa’s keynote at the Microsoft Inspire conference. You can also read more about Melissa's story on Microsoft News: https://news.microsoft.com/transform/how-one-microsoft-mom-inspired-health-care-companies-to-embrace...

Understand that this project, like most of computer vision, is simply a tool to help humans make decisions. However, when placed into the workflow of the ultrasound process, it can and will help save many lives. It can detect rare anomalies like PUV that a physician might otherwise miss, prompting him or her to take a second look and correctly diagnose the condition.

This project not only inspired me and my company InterKnowlogy to specialize in the computer vision area of AI and machine learning, but it also changed my life through the realization that computer vision really can save lives. How will you use tech to save and change lives today?

#HumansofIT
#TechforGood

3 Comments
Bronze Contributor

Well, I agree with the point that AI could contribute to solving problems and helping humans, but I need to express the concern that it should not be a replacement for current technology. In AI there are false positives and false negatives that we cannot prevent. A false result means incorrect detection: we identify a healthy person as ill, or vice versa.

AI could be one tool, but we need to use other technologies and verify our data and findings.

Sometimes just one false positive or false negative is very costly.

Copper Contributor

With all due respect, @Reza_Ameri-Archived, it seems you didn't read the entire post, specifically towards the end:

 

"Understand that project, like most of computer vision, is simply a tool to help humans make decisions. However, when placed into the workflow of the ultrasound process, this can and will help save many lives. It can detect rare anomalies like PUV that a physician might otherwise miss, prompting him/her to take a 2nd look and correctly diagnose the condition."

 

Additionally, this is not a replacement for any technology that exists. It's simply a new AI tool to help humans take a second look at things they may miss. I'm sure you are also aware that every computer vision algorithm comes with what is effectively a confidence percentage, which is thresholded to ignore the majority of potential false positives and negatives.
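To make the thresholding point concrete, here is a minimal Python sketch. The `triage` helper and the 0.85 cutoff are illustrative assumptions, not values from the actual PUV project; the idea is simply that low-confidence results go to a human for review rather than being acted on:

```python
def triage(predictions, threshold=0.85):
    """Split classifier output into confident detections and
    low-confidence results that should be routed to a human reviewer.

    `predictions` is a list of dicts like
    {"tagName": "puv", "probability": 0.91}.
    """
    confident = [p for p in predictions if p["probability"] >= threshold]
    needs_review = [p for p in predictions if p["probability"] < threshold]
    return confident, needs_review
```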

Bronze Contributor

Thank you @TimHuckaby for highlighting this, and I agree with your points.

However, I shared this comment because my sense of the industry is that companies fully invest in AI and fully trust it, and I want to highlight that this is wrong.

As you said, it is only a supporting technology, not a replacement; in my experience, relying on AI alone is sometimes risky and we should avoid doing so.
