
Let’s assume you are not an expert highly trained in medical imaging. And let’s assume you were invited one day to try out a new technology for heart ultrasounds — diagnostic tools that are notoriously difficult to use because of the chest wall and because some shots must be made while the heart is in motion.

Could you do it?


Maybe. When I gave it a shot on a recent day, I was able to take the ultrasound in a matter of minutes with the help of software developed by a San Francisco-based startup called Caption Health. The software told me how to hold the ultrasound probe against the ribs of a model who had been hired for my visit, and knew on its own when to snap the image. It was a little like having Richard Avedon’s knowledge of photography uploaded into the guts of my iPhone camera.

At the top of this page, you can see the image I took: a parasternal long axis view of the heart pumping.

If the technology holds up, Caption, until recently called Bay Labs, could succeed in making heart sonograms easier to obtain. It has already impressed some in the life sciences. Among them is health care executive Andy Page, who spent four years as Anne Wojcicki’s right-hand man at 23andMe and a year as president and chief financial officer at the digital health startup Livongo. He was introduced to Caption Health last fall by one of its investors, the billionaire Vinod Khosla, and has chosen to become its chief executive.


“I was interested in how AI could impact health care,” Page told STAT. “Knowing it was a trend that was coming, my thought was that to really impact health care, the AI implementation would have to be straightforward, understandable, practical, trusted. And that’s exactly what the company was doing.”

Andy Page (Caption Health)

The use of AI in ultrasound is becoming a hot area. Butterfly Network, which early this year launched a handheld ultrasound device that is much cheaper than competitors’, is working on AI as well. Ultromics, based in London, is also applying AI to ultrasound.

I had used Butterfly’s technology two years ago to take images of my carotid artery. The experience of using Caption’s was similar in many ways, but it was obvious that the images I captured with Caption’s technology were harder-to-get shots.

“The word revolutionary is probably overused a lot these days with a lot of the tech things we have coming out, but this has the potential to really change how we’re treating our patients in the not-distant future,” said Dr. Patrick McCarthy, the executive director of the Northwestern Bluhm Cardiovascular Institute, who was principal investigator of a study of Caption’s AI but said he has no financial relationship with the company. McCarthy said he thinks the AI could “democratize” heart ultrasound by increasing the number of health care professionals who can give the test, meaning that more patients who should have it will.

Caption was founded in 2013 by Charles Cadieu, its president, and Kilian Koepsell, its chief technology officer. Cadieu spent his early adult life moving between the Massachusetts Institute of Technology, from which he has a master’s degree in engineering, and the University of California, Berkeley, where he received a Ph.D. in neuroscience. “I’m kind of the planning/thinker/architect and Kilian is the tuned-in laser beam to get things done,” said Cadieu.

Cadieu and Koepsell both wound up on the founding team at IQ Engines, a company that was involved in using deep learning to identify images. After it was sold to Yahoo in 2013, the pair started working on the idea that deep learning was ready to be applied to medicine. “I was always inspired by applying science to medicine,” said Koepsell, who grew up in a family of doctors. According to family legend, he said, his great-grandfather was present at the lecture where the use of X-rays was demonstrated for the first time.

Ultrasound is particularly suited to AI not just for interpreting the images, but also for tackling a more immediate challenge: getting the images in the first place.

“If you don’t do these every day, you get hesitant about, ‘What am I looking at?’” said Dr. Mark Schneider, chair of the department of anesthesiology and director of applied innovation at Christiana Care in Wilmington, Del. “And then you get hesitant to use it.”

Right now, the quality of the images taken of patients is “all over the place,” said Dr. Arun Nagdev, director of point-of-care ultrasound at Highland General Hospital in Oakland, Calif. “The ability to obtain that image is crucial,” Nagdev said. Once novice users can use the technology, he foresees “hockey stick” growth in ultrasound use.

Page said he thinks of the technology under development as “a co-pilot” that can assist doctors who have trouble getting particular scans, as well as those who have not used ultrasound much before — a use that could expand to hospitalists (who focus on hospitalized patients), anesthesiologists, and nurses.

Caption Health provided me with unpublished data from a study in which eight nurses with no previous experience in cardiac ultrasound performed four different types of scans on 240 patients.

For assessing patients’ left ventricular size and function, as well as for detecting pericardial effusion, or fluid around the heart, the AI fared equally well: 240 scans were performed for each, and 237 of them, or 98.8%, were of sufficient quality, according to a panel of five cardiologists. For images of the right ventricle, which is harder to see, the results were a bit worse: 222 images, or 92.5%, were of adequate quality. Eric Topol, the director and founder of the Scripps Research Translational Institute, commented that this was still a small number of samples for AI work; Caption Health said it “respectfully disagrees” because the study was prospective. The study’s goal was to show that at least 80% of the scans would be usable.
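Those percentages follow directly from the raw counts. As a quick check, here is a minimal sketch in Python that reproduces them from the figures quoted above; the variable names and the reading of the 80% goal as a usable-scan threshold are my own assumptions, not the company’s:

```python
# A minimal sketch reproducing the percentages quoted in the study.
# Counts come from the article; the 80% threshold interpretation is assumed.

scans_per_view = 240
usable_scans = {
    "left ventricular size and function": 237,
    "pericardial effusion": 237,
    "right ventricle": 222,
}
goal = 0.80  # assumed: the study aimed to show at least 80% of scans were usable

for view, usable in usable_scans.items():
    rate = usable / scans_per_view
    verdict = "meets" if rate >= goal else "misses"
    print(f"{view}: {usable}/{scans_per_view} = {rate:.1%} ({verdict} the 80% goal)")
```

Run as written, this prints 98.8% for the first two views and 92.5% for the right ventricle, matching the study figures above.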

Caption Health does not make ultrasound equipment, so it will need to partner with device makers to bring its technology to market; so far it has partnered with one ultrasound manufacturer, Terason. Caption has received breakthrough device status from the Food and Drug Administration, which could expedite its regulatory review. Caption said it has raised $18 million to date from investors including Data Collective and Khosla Ventures; a recent valuation is not available.

Page is nothing if not confident. “Nothing else exists in the market of this nature,” he said.

  • Sonography is so much more than just getting a picture. So many things are missed now by trained sonographers that I can’t imagine the danger this opens up when medical personnel who have no experience in sonography attempt to diagnose with a picture conceived with AI. Will RNs start reading X-rays and CTs now too? Unless the AI can judge the degree of diastolic heart failure, or find a clot in the right atrium and distinguish what the mass is, or measure an aortic valve area on a 450-lb man, I would not want this technology anywhere near a family member of mine. Ultrasound in the hands of just anyone is opening the doors to medical malpractice. The best sonographers have years of training and experience. There are so many small nuances of ultrasound, combined with knowledge of patient history and clinical picture, that are necessary to do a proper exam. In most cases you have to know exactly what you are looking for and exactly how to find it with all the different combinations of 2D, spectral Doppler, and color Doppler. This requires a great amount of training and experience that RNs and MDs don’t have time to learn. Will they learn my job on the job? This leads down a very dangerous path. When are we going to start replacing doctors with just AI computers? Just tell the computer what’s wrong with them and have the CNA draw their blood and the RN take their X-ray and do their MRI. You see how crazy this sounds?

  • I am a board-certified congenital cardiologist. The left ventricular long axis clip above has been around since the 1970s. A whole host of individuals can relatively easily interpret this image; AI is not needed. This article is extremely misleading, as complete echocardiographic imaging, along with pulsed wave, continuous wave, and color Doppler imaging, is infinitely more complex, and the variations in anatomy and physiology are close to infinite. The usefulness of this single image is about the same as trying to identify a person from more than a mile away without binoculars and then determine their date of birth, address, number of siblings, age of their parents, and a near-infinite number of additional factors.
