
Bytes, Chips, and Qubits: How the Revolution in Computer Technology Is Changing Healthcare

Today, healthcare and health research rely more than ever on technology, from advanced lifesaving devices in the intensive care unit to the DNA sequencers that drive precision medicine. Though the machines themselves are tangible, it’s the hardware and components underlying those machines that make it all possible. Computers, once the size of entire rooms, can now fit in the palm of your hand. Moore’s Law, introduced in 1965 by Intel co-founder Gordon Moore, observes that the number of transistors on a chip, and with it processing power, doubles roughly every two years [1]. Graphics processing units (GPUs), once used primarily in the video gaming industry, have increased in performance by a factor of 25 over the past five years, while conventional processors have improved only fivefold [2]. The pace of innovation has clearly moved beyond Moore’s Law: with silicon transistors already shrunk to the nanometer scale, companies like Google, Microsoft, Intel, and IBM are turning their attention to new technologies such as quantum computing, GPUs, and reconfigurable chips [3, 4]. These innovations will have a profound impact on healthcare and health research, and this is only the beginning.
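A quick back-of-the-envelope calculation, using only the figures cited above, makes the comparison concrete:

```python
# Compare implied annual growth rates: Moore's Law (2x every 2 years)
# versus the five-year GPU (25x) and CPU (5x) gains cited above.
moores_law = 2 ** (1 / 2)   # ~1.41x per year
gpu_rate = 25 ** (1 / 5)    # ~1.90x per year
cpu_rate = 5 ** (1 / 5)     # ~1.38x per year
print(f"Moore's Law: {moores_law:.2f}x/yr, GPU: {gpu_rate:.2f}x/yr, CPU: {cpu_rate:.2f}x/yr")
```

By this rough measure, GPU performance is compounding well above the Moore’s Law curve, while conventional processors have fallen slightly behind it.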

From Video Games to Mammograms and Ultrasounds

NVIDIA and AMD are names long associated with video games; their graphics cards rank among the top-rated year after year [5]. Their products help video games’ graphics and other multimedia render quickly and seamlessly on high-definition monitors [6]. But these components are being recognized for their abilities in a variety of applications that have little to do with gaming. Bitcoin, Ethereum, Ripple, and other cryptocurrencies are being mined with high-powered graphics cards because those cards can run the necessary mathematical computations in parallel, a demand that has caused shortages and significantly higher prices for the devices [7]. But one of the most surprising forays for these and similar technology companies has nothing to do with video games or Bitcoin: they’re helping doctors and scientists identify cancers earlier, crunch massive amounts of data, and improve imaging for healthcare.
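To see why mining rewards parallel hardware, consider a simplified sketch of the search at its core (the header bytes and difficulty target below are placeholders, not the real Bitcoin protocol, which double-hashes an 80-byte binary header):

```python
import hashlib

# Proof-of-work mining is a brute-force search: hash a block header with
# many different "nonce" values until the result falls below a target
# (approximated here as "starts with four zeros").
block_data = b"example block header"
target_prefix = "0000"  # stand-in for the real difficulty target

nonce = 0
while True:
    digest = hashlib.sha256(block_data + str(nonce).encode()).hexdigest()
    if digest.startswith(target_prefix):
        print(f"Found nonce {nonce}: {digest}")
        break
    nonce += 1
```

Every nonce can be tried independently of every other, so a GPU with thousands of cores can test thousands of candidates at once, exactly the kind of massively parallel arithmetic that also appears in imaging and machine learning.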

NVIDIA, founded in 1993, spent most of the 1990s and 2000s expanding its footprint in the gaming graphics industry [8]. The company soon realized that the graphics processors video gamers clamored for could be used in other industries with graphics-intensive tasks. In 2009, it partnered with Siemens to improve prenatal ultrasonography, giving clinicians and parents-to-be an unprecedented, noninvasive look at the developing fetus [9, 10]. This entry into healthcare was not a one-off, however; in 2015, NVIDIA launched a processor for training deep neural networks and a mobile “super” chip capable of a teraflop of performance for deep learning projects. Today, NVIDIA is pushing further into artificial intelligence and healthcare applications [8, 11]. NVIDIA’s Project Clara, a supercomputer for medical imaging, was unveiled in March 2018. Clara leverages virtual GPUs to accommodate multiple users and numerous instruments at the same time, and it can be scaled up or down as users’ needs change [12].

Advanced Micro Devices, better known simply as AMD, is similarly expanding into the healthcare and scientific industries. For more than four decades, AMD has developed hardware that video game developers and movie studios have used to create realistic-looking visual effects [13]. The company partnered with the Associated Press in 2016 to leverage the lifelike capabilities of virtual reality technology to deliver news, calling it “next-generation” journalism [14]. But as with NVIDIA, AMD has joined forces with other companies to improve healthcare delivery and scientific research. Barco, a manufacturer of audiovisual products for a variety of industries, partnered with AMD to create a radiology display specifically designed for breast imaging; it uses AMD’s graphics cards, which are essential for accurate, high-resolution rendering of clinical images [15]. AMD’s technology was also behind a reported threefold improvement in ultrasound imaging achieved by Analogic [16].

Super-charged GPUs are behind recent advances in medical imaging that give doctors and pathologists better resolution and expose patients to less radiation, because the images can be captured more quickly [12]. Massachusetts General and Brigham and Women’s Hospitals are using these next-generation GPUs for machine learning and decision-support tools that help radiologists interpret imaging and clinical data quickly and accurately [17]. Imaging and machine learning are just some of the ways these companies are changing medicine. As NVIDIA CEO Jensen Huang noted at the company’s 2018 GPU Technology Conference, although we routinely update our personal electronics, such as phones, computers, and even TVs, every few years, replacing the millions of outdated medical devices in hospitals would take more than 30 years! And once replaced, the “new” equipment would rapidly become outdated again, spurring a never-ending cycle [11].

This stark reality, combined with growing financial pressure, is leading many health systems to turn to the cloud. NVIDIA’s Project Clara uses cloud technology to deliver supercomputing capabilities to users. And unlike users of traditional systems, cloud computing users benefit from centralized data storage that remains accessible when workers are remote or spread across multiple locations [12], a point that is critical for researchers collaborating with colleagues at other institutions or medical centers. Cloud systems are also scalable: instead of forcing a medical center or research facility to choose between paying for a system that might not be fully utilized for years and constantly upgrading, the cloud adapts seamlessly, growing (or shrinking) as the facility’s computational and data storage needs change.

Medical imaging is becoming more realistic, tools like virtual reality are being integrated into surgery and medical education, and artificial intelligence and deep learning are helping doctors and scientists make new discoveries and change patient management. For example, researchers at Johns Hopkins Hospital are using NVIDIA’s GPUs and machine learning techniques to help identify pancreatic cancer earlier, when it is more treatable [18]. As AI and machine learning reach more healthcare applications, the need for faster processors and cloud capabilities will grow, fueling new collaborations between NVIDIA, AMD, and others and healthcare device manufacturers. The computing power of video games has come to healthcare, and it’s changing the industry.

TPUs and Quantum Computing

While GPUs are revolutionizing medical imaging and machine learning, some companies are developing alternatives that promise to go beyond even the most powerful GPU. Google, for example, recently unveiled its Tensor Processing Unit (TPU). According to Google’s blog, each cloud-based TPU delivers up to 180 teraflops of floating-point performance [19]. A teraflop is a measure of computing performance: the ability to carry out one trillion floating-point calculations per second. That is far more than an average household computer can manage and has long been the domain of supercomputers. As recently as 2000, the world’s fastest supercomputer, ASCI Red at Sandia National Laboratories, handled roughly one teraflop. Today’s supercomputers are capable of far more: Argonne National Laboratory’s Mira is capable of 10 petaflops (one petaflop is a thousand teraflops) and has been used to model disease outbreaks [20]. Until now, the kind of computing power the Google TPU offers has been impractically expensive for nearly anyone outside a national laboratory, which is why this innovation is so important. With machine learning and artificial intelligence requiring substantial computational power, Google’s cloud-based TPU substantially levels the playing field and can support the kind of technological innovation that is driving healthcare and health research today.
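To make that unit concrete, here is a minimal Python sketch (using NumPy; the matrix size and results are illustrative and will vary by machine) that estimates achieved floating-point performance from a single matrix multiplication, the operation that dominates machine learning workloads:

```python
import time
import numpy as np

# A dense N x N matrix multiply takes roughly 2 * N^3 floating-point
# operations (N^3 multiplies plus N^3 additions).
N = 2048
a = np.random.rand(N, N).astype(np.float32)
b = np.random.rand(N, N).astype(np.float32)

start = time.perf_counter()
c = a @ b
elapsed = time.perf_counter() - start

flops = 2 * N**3 / elapsed
print(f"Achieved ~{flops / 1e9:.1f} gigaflops")
print(f"A 180-teraflop TPU is ~{180e12 / flops:.0f}x faster at this scale")
```

On a typical laptop this lands in the tens of gigaflops, several orders of magnitude below a single 180-teraflop cloud TPU.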

In 2017, Intel launched its first teraflop chip aimed at consumers, the Core i9 Extreme Edition. According to Intel, the chip is designed for users who want to “mega-task”: individuals who might stream in real time while playing a video game in 4K and performing other tasks simultaneously [20]. Though the company may have anticipated the chip’s draw for the gaming and autonomous driving industries, its application to healthcare and scientific research offers real potential to reshape how quickly (and inexpensively) data analytics and imaging analysis can be done.

Years after a shrinking mainframe business forced it to adapt to new technologies and a service-driven industry, the technology stalwart IBM is focusing on innovating the hardware needed for AI and machine learning, with research on silicon-based neural-network methods [21] and quantum computing [22]. Structured similarly to the way a human brain makes connections, neural networks typically run as software on conventional computers. But as described above, how fast those computers work depends in large part on the chips inside them, so research using neural networks is computationally intensive and somewhat slow, and earlier hardware-based attempts have been plagued by inaccuracies. IBM is attempting to make the process substantially faster by building chip-based microelectronic synapses for a blended hardware-software system that is as accurate as software-based neural networks but, running as hardware, consumes only 1% as much energy [21, 23]. While not yet ready for commercialization, the synapses could help IBM extend its reach into healthcare, where AI is just beginning to make inroads.
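For intuition, here is a toy sketch of what “running a neural network as software” means. The layer sizes are arbitrary and this is in no way IBM’s design, but it shows that each layer reduces to a large matrix multiplication, exactly the workload that dedicated hardware synapses aim to accelerate:

```python
import numpy as np

# A toy two-layer neural network, run purely in software. Each layer is a
# matrix multiply followed by a nonlinearity; the "synapses" are the weight
# matrices W1 and W2. All sizes here are arbitrary, for illustration only.
rng = np.random.default_rng(0)
W1 = rng.normal(size=(784, 128))   # input -> hidden weights
W2 = rng.normal(size=(128, 10))    # hidden -> output weights

def forward(x):
    hidden = np.maximum(0, x @ W1)   # ReLU activation
    return hidden @ W2               # output scores

x = rng.normal(size=(1, 784))        # one fake input, e.g. a 28x28 image
print(forward(x).shape)              # (1, 10)
```

Training adjusts millions or billions of such weights over many passes through the data, which is why the speed and energy cost of the underlying chips matter so much.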

Despite the push in healthcare to gather as much data as possible to improve analysis, the reality is that analyses are currently limited to a certain number of variables before computational capabilities break down. Though conventional computers and supercomputers are pushing those boundaries with new processing chips and algorithms, there are limits. Quantum computers are a fundamentally different kind of machine that removes some of those barriers: by exploiting the quantum nature of matter, they can solve problems that conventional computers never will [24]. IBM accurately modeled a small molecule with an early quantum machine in 2017, and if engineers can build a stable quantum computer, researchers could use it to simulate interactions between complex molecules or to design new medications [22]. The potential for quantum computing to revolutionize numerous industries has not gone unnoticed; Microsoft, Google, and Intel are among the other companies developing quantum chips and machines [25].
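For intuition about what makes a qubit different from a bit, here is a minimal classical simulation of one (a sketch in NumPy; real quantum hardware, of course, does not work this way internally):

```python
import numpy as np

# A qubit's state is a vector of two complex amplitudes; measurement
# probabilities are the squared magnitudes of those amplitudes.
zero = np.array([1, 0], dtype=complex)   # the |0> state

# The Hadamard gate puts a qubit into an equal superposition of |0> and |1>.
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)

state = H @ zero
probabilities = np.abs(state) ** 2
print(probabilities)   # [0.5 0.5]: equal chance of measuring 0 or 1
```

Each additional qubit doubles the length of that amplitude vector, so simulating even a few dozen qubits strains the largest conventional supercomputers; a real quantum machine sidesteps that exponential blowup, which is what makes applications like molecular simulation so promising.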

Conclusion (at least for the moment)

Technology is moving at a breakneck pace, and healthcare has the opportunity to reap the benefits. New GPU chips and cloud-based TPUs are enabling faster, less expensive data analytics and higher-resolution medical imaging, not only driving better patient outcomes but also opening opportunities to rethink existing business models and processes. AI and machine learning are only beginning to be exploited for health and science; as computational capabilities improve, we will see more AI- and machine-learning-based systems affecting everything from how basic research is performed to how doctors are empowered and patient care is delivered. Where will these existing and future technologies take healthcare?

References

  1. Intel. 2018. 50 Years of Moore’s Law. https://www.intel.com/content/www/us/en/silicon-innovations/moores-law-technology.html
  2. Morris, J. 2018. Nvidia aims to extend its lead in AI. ZDNet. https://www.zdnet.com/article/nvidia-aims-to-extend-its-lead-in-ai/
  3. CNET. 2013. End of Moore’s Law: It’s not just about physics. Scientific American. https://www.scientificamerican.com/article/end-of-moores-law-its-not-just-about-physics/
  4. Simonite, T. 2016. Moore’s Law Is Dead. Now What? MIT Technology Review. https://www.technologyreview.com/s/601441/moores-law-is-dead-now-what/
  5. Hruska, J. 2018. The Best Graphics Cards of 2018. PC Magazine. https://www.pcmag.com/roundup/355217/the-best-graphics-cards
  6. Papiewski, J. 2018. The Purpose of a Graphics Card. Small Business - Chron.com. http://smallbusiness.chron.com/purpose-graphics-card-55327.html
  7. Warren, T. 2018. Bitcoin mania is hurting PC gamers by pushing up GPU prices. The Verge. https://www.theverge.com/2018/1/30/16949550/bitcoin-graphics-cards-pc-prices-surge
  8. NVIDIA. 2018. About NVIDIA: NVIDIA History. http://www.nvidia.com/page/corporate_timeline.html
  9. Miller, P. 2010. NVIDIA 3D Vision Pro Technology uses RF syncing to woo professionals. Engadget. https://www.engadget.com/2010/08/04/nvidia-3d-vision-pro-technology-uses-rf-syncing-to-woo-professio/
  10. Takahashi, D. 2010. Siemens uses graphics chips to create better 3-D views of babies in wombs. VentureBeat. https://venturebeat.com/2010/02/04/siemens-uses-graphics-chips-to-create-better-3-d-views-of-babies-in-wombs/
  11. Nguyen, C. 2018. When diagnosis time means life or death, NVIDIA’s advanced AI can save lives. Digital Trends. https://www.digitaltrends.com/computing/nvidia-gtc-project-clara-machine-leaning/
  12. Powell, K. 2018. Project Clara: NVIDIA Supercomputing Platform Redefines Medical Imaging. NVIDIA Blog. https://blogs.nvidia.com/blog/2018/03/28/ai-healthcare-gtc/
  13. Advanced Micro Devices. 2018. Our History. https://www.amd.com/en-us/who-we-are/corporate-information/history
  14. Advanced Micro Devices. 2016. AMD and The Associated Press Collaborate To Enable Next-Generation Virtual Reality Journalism. https://www.amd.com/en-us/press-releases/Pages/amd-and-the-2016feb17.aspx
  15. Advanced Micro Devices. 2018. Medical Imaging. https://www.amd.com/en/corporate-responsibility/technology-medical-imaging
  16. Trefis Team. 2015. AMD Eyes The Medical Device Segment In the Embedded Market. Forbes. https://www.forbes.com/sites/greatspeculations/2015/01/14/amd-eyes-the-medical-device-segment-in-the-embedded-market/#6115f42f936f
  17. Siwicki, B. 2018. Mass General, Brigham and Women’s to apply deep learning to medical records and images. Healthcare IT News. https://www.healthcareitnews.com/news/mass-general-brigham-and-womens-apply-deep-learning-medical-records-and-images
  18. Beckett, J. 2018. Hidden Figures: How AI Could Spot a Silent Cancer in Time to Save Lives. NVIDIA Blog. https://blogs.nvidia.com/blog/2018/05/21/ai-pancreatic-cancer/
  19. Barrus, J. and Stone, Z. 2018. Cloud TPU machine learning accelerators now available in beta. Google Cloud Platform Blog. https://cloudplatform.googleblog.com/2018/02/Cloud-TPU-machine-learning-accelerators-now-available-in-beta.html
  20. Verger, R. 2017. Intel’s new chip puts a teraflop in your desktop. Here’s what that means. Popular Science. https://www.popsci.com/intel-teraflop-chip
  21. Knight, W. 2018. AI could get 100 times more energy-efficient with IBM’s new artificial synapses. MIT Technology Review. https://www.technologyreview.com/s/611390/ai-could-get-100-times-more-energy-efficient-with-ibms-new-artificial-synapses/
  22. Knight, W. 2018. Serious quantum computers are finally here. What are we going to do with them? MIT Technology Review. https://www.technologyreview.com/s/610250/hello-quantum-world/
  23. Ambrogio, S., Narayanan, P., Tsai, H., Shelby, R. M., Boybat, I., di Nolfo, C., Sidler, S., Giordano, M., Bodini, M., Farinha, N. C. P., Killeen, B., Cheng, C., Jaoudi, Y. and Burr, G. W. 2018. Equivalent-accuracy accelerated neural-network training using analogue memory. Nature 558(7708):60-67.
  24. Hartnett, K. 2018. Finally, A Problem Only Quantum Computers Will Ever Be Able To Solve. Wired. https://www.wired.com/story/finally-a-problem-only-quantum-computers-will-ever-be-able-to-solve/
  25. Smith, M. 2018. To put a quantum computer on your desk, Intel has a plan unlike any other. Digital Trends. https://www.digitaltrends.com/computing/intel-quantum-computing-research-lab/