Deep learning might help you get an ultrasound at Walgreens

Jonathan Rothberg, the founder and CEO of a startup called Butterfly Network, isn’t one for taking small chances. He has already founded and sold multiple genomics startups over the past couple of decades and, by his own account, helped make next-generation gene sequencing possible. His latest endeavor is to build a real-time medical imaging device that’s the size of an iPad and can be used by nearly anyone.

The foundation of Butterfly Network’s device — which Rothberg hopes will be available in the next 18 months — is a special type of chip equipped with ultrasound technology capable of “seeing” inside the body. MIT Technology Review has a good feature on that technology, as well as on the company’s $100 million in funding (some of which came from Rothberg himself), from earlier this month.

However, Rothberg explained in a recent interview with Gigaom, “It wouldn’t help the entire world … if it was too difficult to operate the device or too difficult to analyze the results of the device.”

Butterfly Network’s three-pronged vision.

That’s where artificial intelligence, particularly computer vision powered by deep learning models, comes into play. Trained on enough images of particular medical conditions and enough scans of people’s insides, the models should learn how to guide a scan of a patient’s body so that it covers the right areas, and should be able to identify abnormalities when they appear.
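To give a rough sense of how a model learns to flag abnormalities from labeled scans, here is a toy sketch. Everything in it is hypothetical: synthetic 8×8 "images" stand in for real scans, and simple logistic regression stands in for the deep networks the article describes — the point is only the pattern of training on labeled examples, then predicting on new ones.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for labeled scans: 8x8 "images" where an
# abnormality shows up as a bright patch in the center.
def make_scan(abnormal):
    img = rng.normal(0.0, 1.0, (8, 8))
    if abnormal:
        img[3:5, 3:5] += 3.0  # bright patch = simulated abnormality
    return img.ravel()

X = np.array([make_scan(i % 2 == 1) for i in range(200)])
y = np.array([i % 2 for i in range(200)])

# Logistic regression trained by gradient descent -- a toy
# stand-in for a deep network, not Butterfly's actual method.
w = np.zeros(X.shape[1])
b = 0.0
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))  # predicted P(abnormal)
    w -= 0.1 * (X.T @ (p - y)) / len(y)
    b -= 0.1 * np.mean(p - y)

# After training, the model separates "abnormal" from "normal" scans.
probs = 1.0 / (1.0 + np.exp(-(X @ w + b)))
accuracy = np.mean((probs > 0.5) == y)
print(f"training accuracy: {accuracy:.2f}")
```

In practice the models would be convolutional networks trained on far larger, expert-labeled datasets, but the loop is the same: show the model labeled examples, nudge its parameters to reduce its errors, and repeat.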

The object-recognition part of the formula isn’t entirely novel. There’s another startup, called Enlitic, already trying to help diagnose diseases like cancer by training deep neural networks on medical images. There have been numerous studies testing the effectiveness of deep learning and other machine learning approaches on the same problem, as well.

Notably, though, Butterfly Network’s ultrasound technology doesn’t see what’s going on inside the body so much as it hears what’s going on. So whereas most computer vision algorithms try to mimic how humans see, Rothberg said Butterfly’s will be closer to how bats see. (“Instead of seeing in colors, you might see in echoes,” he analogized.) It should be able to detect the physical properties and densities of what it’s seeing, as well as the shape and size.

These additional features, combined with user guidance to ensure high-quality images, could help Butterfly train accurate models even in the absence of huge collections of images to analyze. Of course, the relatively low cost and high ease of use of the device could also result in lots of them being used in the field, which would result in more images to populate those training datasets. Most of the actual processing will take place in the cloud rather than on the device or on local servers.

Jonathan Rothberg (left) at an event in 2008.

And ubiquity, along with delivering fast results, is one of Rothberg’s biggest goals for Butterfly Network. If all goes according to plan, he explained, an employee at a business like Walgreens or a rural clinic could use the device and send the results (which would include any preliminary findings from the AI models) to a doctor located somewhere else. A doctor or radiologist using the device herself could deliver a diagnosis almost immediately rather than waiting weeks for the results of other imaging technologies.

Rothberg says he hasn’t run into any pushback about the Butterfly device because, just like IBM with its Watson technology, he’s positioning it as a tool to make professionals more efficient rather than obsolete. “I feel that we use the software in a way that makes people comfortable,” he said.

He also doesn’t seem too worried that deep learning will ultimately fail to live up to its incredible hype, leaving companies like his that rely on it looking foolish. For starters, Rothberg said, he has been following the AI space for decades and chose deep learning to underpin Butterfly Network’s imaging algorithms because it’s the right tool for the job, not because everyone is talking about it.

However, he added, all that attention does mean deep learning has one other important thing going for it: money, with lots and lots of research dollars going toward ensuring steady advances in the field. “I want to place my bets where the most money is going into them,” Rothberg said. “I like to leverage other people’s billions.”

Feature image courtesy of Shutterstock user Kaspars Grinvalds. Post images courtesy of Butterfly Network and Flickr user Robert Scoble.