From The Editor | May 9, 2024

Digital Twins: From Apollo 13 To Treating Cancer


By John Oncea, Editor


Digital twins are virtual replicas of physical objects or systems, enabling real-time monitoring, analysis, and optimization. An earlier version of them helped save the Apollo 13 astronauts, and future versions of them will be used to treat cancer.

A digital twin is a digital model that serves as the nearly indistinguishable counterpart of a real-world physical product, system, or process for various practical purposes. Used for simulation, integration, testing, monitoring, and maintenance of its physical counterpart, the digital twin concept underlies Product Lifecycle Management, existing through the create, build, operate/support, and dispose stages of a physical entity.

A digital twin can exist before its physical counterpart and may be used in real-time, regularly synchronized with the physical system it represents. Created for value-based use cases, digital twins allow for the modeling and simulation of an intended entity’s entire life cycle.

For instance, car companies are building digital twins of their new designs to see how they'll respond under different conditions. And, as digital twin technology evolves, digital versions of homes could exist, allowing homeowners to do things like track how energy costs would rise or fall if they went solar.
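The defining loop of a digital twin, assimilating telemetry from a physical asset and running what-if simulations against the current virtual state, can be sketched in a few lines. Everything below (the `DigitalTwin` class, the battery-drain model, the field names) is an invented illustration, not any vendor's API:

```python
from dataclasses import dataclass, field

@dataclass
class DigitalTwin:
    """Minimal digital-twin sketch: a virtual state that is
    periodically synchronized with readings from its physical
    counterpart and can be queried for what-if simulations."""
    state: dict = field(default_factory=dict)
    history: list = field(default_factory=list)

    def sync(self, sensor_reading: dict) -> None:
        # Assimilate the latest telemetry from the physical asset.
        self.state.update(sensor_reading)
        self.history.append(dict(self.state))

    def simulate(self, hours: float, drain_watts: float) -> float:
        # Toy model: project remaining battery energy under a load.
        return self.state["battery_wh"] - drain_watts * hours

twin = DigitalTwin()
twin.sync({"battery_wh": 500.0, "temp_c": 21.0})    # telemetry frame 1
twin.sync({"temp_c": 22.5})                         # partial update
remaining = twin.simulate(hours=4, drain_watts=50)  # 500 - 200 = 300.0
```

Real systems replace the toy drain model with physics-based or data-driven models, but the shape is the same: a continuously updated virtual state plus simulation queries against it.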

A Brief History Of Digital Twins

The concept of using a digital twin as a means of studying a physical object was first introduced by NASA in the 1960s, according to Challenge Advisory. “NASA would use basic twinning ideas during this period for space programming. They did this by creating physically duplicated systems at ground level to match the systems in space.”

The most famous example is when NASA developed a digital twin to assess and simulate conditions on board Apollo 13, allowing Mission Control to quickly adapt and modify simulations to match the conditions of the damaged spacecraft and troubleshoot strategies to bring the astronauts safely home.

In the early 1970s, mainframe computers were used as digital twin-esque systems to monitor large facilities such as power plants, adds Unity. “In the 1980s, 2D CAD systems like AutoCAD emerged to produce technical drawings, making it possible to design anything with a computer, and were quickly adopted by millions of designers and engineers.”

In 2010, NASA further applied the digital twin concept for equipment maintenance and simulation use cases during space explorations. John Vickers, a principal NASA technologist, helped popularize the term outside of space exploration, Mike Kalil writes.

“In the mid-2010s, digital twinning started gaining real traction with innovative manufacturing organizations on ambitious digital transformation journeys, including General Electric (GE), which used digital twins for applications such as wind farms and gas turbines,” writes Kalil. The concept surged in popularity in parallel to the explosion of Internet of Things (IoT) technology advances.

The Digital Twin Consortium was formed in 2019 to begin standardization efforts for digital twinning. The COVID-19 pandemic in 2020-21 further drove wider-scale adoption of digital twinning as manufacturers invested in technologies enabling remote collaboration and virtualization.

“Houston, we’ve had a problem here.”

On April 11, 1970, NASA’s Apollo 13 launched. It was supposed to land in the Fra Mauro area, but an explosion on board forced Apollo 13 to circle the moon without landing.

But, nine hours before the on-board explosion, “Apollo 13 was looking like the smoothest flight of the program,” NASA writes. “At 46 hours, 43 minutes Joe Kerwin, the capsule communicator, or Capcom, on duty, said, ‘The spacecraft is in real good shape as far as we are concerned. We’re bored to tears down here.’ It was the last time anyone would mention boredom for a long time.”

At just under 56 hours, Apollo 13’s oxygen tank No. 2 blew up, causing the No. 1 tank to fail as well and taking with it the module’s normal supply of electricity, light, and water, all at about 200,000 miles from Earth.

Commander James Lovell then uttered one of the most famous space-related phrases ever, “Houston, we’ve had a problem here.”

The mission was in turmoil. “Warning lights indicated the loss of two of three fuel cells, which were the spacecraft’s prime source of electricity,” writes NASA. “With warning lights blinking, one oxygen tank appeared to be empty and there were indications that the oxygen in the second tank was rapidly depleting.

“Thirteen minutes after the explosion, Lovell happened to look out of the left-hand window and saw the final evidence pointing toward potential catastrophe. ‘We are venting something out into the… into space,’ he reported to Houston. Capcom Jack Lousma replied, ‘Roger, we copy you venting.’ Lovell said, ‘It’s a gas of some sort.’”

The gas Lovell saw was oxygen gas escaping at a high rate from the second, and last, oxygen tank. The world watched as the astronauts drifted further and further away from Earth.

Lovell and the rest of the Apollo 13 crew – Command Module Pilot John L. “Jack” Swigert and Lunar Module Pilot Fred W. Haise – tried to close the hatch between the Command Module (CM) Odyssey and the Lunar Module (LM) Aquarius before realizing that there wasn’t a cabin leak.

But the oxygen pressure in the CM was getting dangerously low and, with only 15 minutes of power left in it, the crew was forced to make their way into the LM. “Haise and Lovell quickly floated through the tunnel, leaving Swigert to perform the last chores in the command module,” NASA writes.

It was determined that the oxygen supply wouldn’t be a problem but there was concern about power and the availability of enough water for the crew. “There were 2,181 ampere-hours in the LM batteries,” writes NASA. “Ground controllers carefully worked out a procedure where the CM batteries were charged with LM power.

“All noncritical systems were turned off and energy consumption was reduced to 1/5, which resulted in having 20 percent of LM electrical power left when Aquarius was jettisoned. There was one electrical close call during the mission. One of the CM batteries vented with such force that it momentarily dropped off the line. Had the battery failed, there would have been insufficient power to return the ship to Earth.”

The crew had to minimize their water consumption to only six ounces per day, which is one-fifth of their normal intake. They compensated for the lack of water by drinking fruit juices and eating wet-pack foods like hot dogs. However, this still resulted in severe dehydration for the crew. Lovell lost 14 pounds during the mission and the whole crew lost a total of 31.5 pounds, which was almost 50% more than any other crew. Despite the harsh conditions, the crew managed to conserve 28.2 pounds of water, which is approximately 9% of the total.

While Lovell, Swigert, and Haise dealt with this, Mission Control turned to “digital twins” on the ground to figure out a way to bring the astronauts home. NASA had numerous simulators on the ground that were being fed data from the real spacecraft 200,000 miles away, replicating what was happening on board and allowing Mission Control to strategize how to get the astronauts back safely. These simulators – often erroneously referred to as the first digital twins – helped NASA predict and then execute the return mission that brought the astronauts home.

“The ‘twins’ involved in this heroic saga were the physical twins of the spacecraft,” writes Michael Grieves, Executive Director and Chief Scientist of Digital Twin Institute. “The plan to recover functionality and get Apollo 13 home safely was also a critical effort by very talented NASA engineers. Their proposal of workarounds and recovery was validated with physical simulators.

“The simulators were physical counterparts of the actual Apollo spacecraft. There were some digital aspects to them in that they were running primitive computers with the same programs as the actual spacecraft itself. However, these physical space capsules weren’t Digital Twins.”

But, while the claim that the digital twin originated in the Apollo program is unfounded, its development does have a strong connection to NASA.

NASA’s Move From “Digital Twins” To Digital Twins

The idea of a “digital twin” was born at NASA in the 1960s as a “living model” of the Apollo mission. But it wasn’t until 1998 that the phrase was first mentioned, referring to a digital copy of actor Alan Alda’s voice in Alan Alda meets Alan Alda 2.0, Challenge Advisory writes.

“Although the digital twins have been highly familiar since 2002, only as recently as 2017 has it become one of the top strategic technology trends. The Internet of Things enabled digital twins to become cost-effective so they could become as imperative to business as they are today.”

Advancements in computing power, data analytics, and artificial intelligence are also responsible for the rise of digital twins and, as these technologies continue to progress, digital twins are expected to evolve further, potentially becoming capable of proactively searching for insights, interacting with each other, and even learning autonomously.

NASA’s role in the growth of digital twins began with those pioneering physical simulators and continues today as the agency, along with others in the aerospace community, develops and utilizes high-fidelity digital models of physical systems and components as well as the extreme environments in which they operate.

“NASA aims to travel further and stay longer in space as we realize the Artemis program, taking us from the moon to Mars by establishing a sustainable presence on the Moon to prepare for missions to Mars,” NASA writes. “We will no longer be able to rely on constant connectivity with an asset nor be in the loop for on-demand human intervention in the event of an anomaly.

“Further, the importance of digital twins is increasing as we seek alternatives for certification of structures so large that they cannot be fully evaluated in existing test facilities and autonomous systems that are not deterministic. The idea of a digital twin is not a new one. What is new is the scale, ordinality, and non-deterministic nature of the models that are critical to achieving NASA’s goals. Their number and autonomy from each other as well as the reference system are suggestive of a changing ecosystem, returning us the idea of a living model.”

NASA has continued to use digital twins extensively for designing, testing, and operating spacecraft, rovers, and other space assets. For example, they employed digital twins for the Orion spacecraft to optimize its design and analyze mission scenarios. In addition, NASA’s Earth System Digital Twins (ESDT) program aims to develop digital replicas of Earth systems to support dynamic forecasting models and impact assessments for phenomena like wildfires and climate change.

Through initiatives like these, NASA has been at the forefront of advancing digital twin technology and demonstrating its practical applications in complex, mission-critical systems. So, while the term digital twin was coined later, NASA's pioneering work with high-fidelity simulators and virtual replicas of physical assets laid the groundwork for the development and widespread adoption of digital twin technology across various industries.

Digital Twins And Your Health

A more personal way digital twins will enter our lives is in healthcare, where they are improving care through a more data-driven approach. With digital twins, personalized models can be developed for patients and continuously adjusted based on tracked health and lifestyle parameters, resulting in better medical care, improved sports training, and enhanced educational outcomes.

Digital twins have the potential to revolutionize healthcare by creating a virtual patient that can provide a detailed description of their healthy state, based on their characteristics rather than just their medical history. This enables doctors to compare individual patient records with the wider population to identify patterns and gain insights into what constitutes a healthy patient.

The biggest advantage of digital twins in healthcare is that care can be tailored to anticipate the unique responses of individual patients. This approach not only enables more accurate diagnoses and treatments but also changes our understanding of what it means to be healthy. Rather than simply being the absence of disease, health can now be defined by comparing an individual patient to the rest of the population.

According to TNW, an example of this is Foresight, a tool that uses generative pre-trained transformers, the same family of large language models (LLMs) used by ChatGPT. “Researchers in the U.K. first trained the models on medical records. Next, they fed their tool fresh healthcare data to create virtual duplicates of patients. Finally, the digital twins forecast various outcomes, from disease development to medication needs.”
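Foresight itself is a GPT-style transformer trained on full medical records, far beyond a short sketch. As a heavily simplified stand-in, a first-order Markov model over diagnosis codes illustrates the same framing of predicting a patient's next condition from the sequence so far; the condition names and records below are invented:

```python
from collections import Counter, defaultdict

# Toy stand-in for sequence-based condition forecasting.
# Foresight uses a transformer over full medical records; here a
# first-order Markov model over hypothetical diagnosis codes
# illustrates the next-condition prediction framing.

records = [
    ["hypertension", "diabetes", "kidney_disease"],
    ["hypertension", "diabetes", "retinopathy"],
    ["hypertension", "stroke"],
    ["diabetes", "kidney_disease"],
]

# Count how often each condition is followed by each other condition.
transitions = defaultdict(Counter)
for seq in records:
    for prev, nxt in zip(seq, seq[1:]):
        transitions[prev][nxt] += 1

def forecast(condition):
    # Most frequent condition observed to follow `condition`.
    counts = transitions[condition]
    return counts.most_common(1)[0][0] if counts else None

forecast("diabetes")  # → "kidney_disease" (2 of 3 transitions)
```

A transformer like Foresight conditions on the entire record, free text included, rather than just the previous code, but the output contract, a distribution over plausible next clinical events, is the same.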

The digital twins correctly identified patients’ next condition with 88% accuracy when applied to U.S. data but were less successful using British data. Despite the mixed results, the researchers trust their twins can guide diagnoses, treatment, and clinical research.

“Study co-author James Teo, director of data science and AI at King’s College Hospital, believes the forecasts represent ‘possible multiverses’ to understanding diseases. ‘Our generative AI produces forecasts from text in health records across any disorder, test, medication, treatment, or complication across all disease groups into the future. This patient digital twin can provide insights and ‘what if’ scenarios.’”

However, the emergence of the digital twin in healthcare also brings some downsides. The digital twin may lead to inequality, as the technology might not be accessible to everyone by widening the gap between the rich and poor. Furthermore, the digital twin will identify patterns in a population that may lead to discrimination.

Digital Twins And Cancer Treatment

Karen Willcox, an aerospace engineering professor at the University of Texas at Austin, is leading research on using digital twins for personalized cancer treatment, specifically for brain tumors, according to TED Radio Hour.

“Here at UT Austin, we started talking with the Center for Computational Oncology at the Oden Institute,” said Willcox. Those conversations led to the realization that even though “the physics and the biology is very different from what I do as an aerospace engineer, there are many common challenges,” in the idea of digital twins as a way to move toward personalized cancer care.

Using a cancer patient as an example, Willcox says, “We have powerful models – mathematical models – that can represent a tumor and make predictions on how that tumor is going to grow. The future vision is to have the digital twin work hand-in-hand with human clinicians to try to achieve the best outcomes for that individual patient.” Willcox stresses that this technology is still a long way from being applicable because of the complexities of modeling the human body at the scale needed.

Oden Institute writes, “The digital twin is initialized with population-level clinical data and then tailored to an individual patient through Bayesian model calibration, continually assimilating patient-specific magnetic resonance imaging (MRI) data as it is logged in the clinic. Using this data, the digital twin predicts how the patient would respond to different dosages and lengths of radiotherapy. It is the Bayesian framework that sets this research apart. By employing a Bayesian framework, this digital twin accounts for the uncertainty in a patient’s data and their predicted response to treatment. This is essential for making the high-stakes decisions that come with treating cancer.”
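As a rough, hypothetical illustration of that Bayesian calibration idea (not the Oden Institute's actual model), one can infer a patient-specific growth rate for a simple logistic tumor-growth model from noisy synthetic "imaging" measurements, starting from a population-level prior and ending with uncertainty-aware predictions; all parameter values here are invented:

```python
import numpy as np

# Tumor volume follows logistic growth: dV/dt = r * V * (1 - V/K).
# We infer the patient-specific rate r from noisy measurements,
# starting from a population-level prior (all values illustrative).

K = 10.0     # carrying capacity (arbitrary units)
V0 = 1.0     # initial tumor volume
sigma = 0.3  # assumed measurement noise (std dev)

def volume(r, t, v0=V0):
    # Closed-form solution of the logistic ODE.
    return K / (1 + (K / v0 - 1) * np.exp(-r * t))

# Population-level prior over r on a grid (Gaussian, mean 0.5).
r_grid = np.linspace(0.01, 1.5, 300)
prior = np.exp(-0.5 * ((r_grid - 0.5) / 0.2) ** 2)
prior /= prior.sum()

# Synthetic "scans" at t = 1, 2, 3 generated with true r = 0.8.
rng = np.random.default_rng(0)
t_obs = np.array([1.0, 2.0, 3.0])
y_obs = volume(0.8, t_obs) + rng.normal(0, sigma, 3)

# Bayesian update: posterior ∝ prior × Gaussian likelihood,
# assimilating each new measurement as it arrives.
posterior = prior.copy()
for t, y in zip(t_obs, y_obs):
    lik = np.exp(-0.5 * ((y - volume(r_grid, t)) / sigma) ** 2)
    posterior *= lik
posterior /= posterior.sum()

r_mean = float(np.sum(r_grid * posterior))  # calibrated growth rate
# Predict volume at t = 5, averaging over posterior uncertainty.
pred_mean = float(np.sum(volume(r_grid, 5.0) * posterior))
```

The actual research replaces this toy ODE with mechanistic tumor models, the grid with proper Bayesian inference over many parameters, and the synthetic scans with patient MRI data, but the pattern of prior, data assimilation, and uncertainty-carrying prediction is the one described above.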

While still early research, Willcox’s digital twin methodology showed promising results in simulations of 100 virtual brain cancer patients, according to Frontiers. It delayed median tumor progression by around 6 days compared to standard radiotherapy. For some patients, it enabled equivalent tumor control with 16.7% lower radiation doses.

Willcox envisions digital twins shifting cancer care toward truly personalized medicine, minimizing guesswork and maximizing positive outcomes. However, further research and clinical trials are needed before implementing this technology in hospitals.

When asked by TED Radio Hour host Manoush Zomorodi about having the digital twin work hand in hand with the human clinician to try to achieve the best outcomes for that individual patient, Willcox said, “You could create a digital twin based on what your cardiovascular strength is and your blood type and then add variations to see how things would go.

“For example, if you got chemotherapy and you quit smoking and brought your blood pressure down, we think we could predict that your outcome would be X. If you didn't do those things, here's what that could look like. That's the vision.”

Willcox added that we’re still a long way from being able to get to that point because of the complexity of our biological systems, as well as the fact that we don’t yet have the computational power to model a full human being.

She concludes by saying that if the technology had existed when she was 47, it would have told her “to enjoy red wine because, within the next couple of years, (your) body would not be able to metabolize it as well. The prediction would have been there and (I) could have taken that trip to Tuscany before it was too late.”