"Over the past decade, we have strongly promoted the use of computational methods in medicine and drug development"
In hospitals: Over the last 10 years, the international CompBioMed research project has helped give doctors a growing range of software and tools to test or practise treatments digitally first. Photo: Irwan/Unsplash
Peter Coveney is delighted with the outcome: "What I liked about the CompBioMed project was well expressed by one of the evaluators. He said that there's a lot to be gained for a very large number of people and for the future from the software and methods that have been developed," says the Professor of Physical Chemistry and Computer Science, who led the CompBioMed Centre of Excellence and teaches and researches at University College London (UCL) and the Universities of Amsterdam and Yale. "It's a good thing that the use of mechanistic modelling and other computational methods, i.e. predictive methods, is starting to gain ground in biology and medicine, because in the long term that's how we're going to make more progress." The ambitious, multinational CompBioMed project, which brought together 52 partners from industry, universities and research institutes, among them the Leibniz Supercomputing Centre (LRZ), is now coming to an end. At the heart of CompBioMed was a digital human twin, the Virtual Human, which was intended to advance treatments and the development of new drugs, and which also led to new digital methods, a host of software and algorithms, as well as visualisations and technologies. Coveney summarised the work and results of CompBioMed at the Centre for Advanced Studies (CAS) of the Ludwig Maximilian University in Munich, where he spent two months researching, initiating new projects and collaborations, and putting the finishing touches to his new book: "I found an inspiring ambience, a very good environment to work on my book and many interesting people to talk to."
The CompBioMed Centre of Excellence has come to an end - are you satisfied with what has been achieved? Prof. Peter Coveney: Let's let the numbers speak for themselves: more than 200 publications, around 60 research datasets that can be further exploited, around two dozen software packages, digital tools and algorithms for the verification and processing of medical data - yes, I believe that CompBioMed's impact on medicine, pharmaceutical research and bioengineering has been enormous. We recently had our final review with the European Commission under Horizon 2020, and the reviewers were really impressed. But these are only the results of the second phase of the project. In the four years before that, the Centre of Excellence was similarly productive. Over the past decade, we have strongly promoted the use of computational methods in medicine and drug development. The COVID pandemic gave us the opportunity to prove in practice that computational methods can really speed up the development of new drugs many times over.
Could you describe the role of supercomputing in the CompBioMed projects? Coveney: In the way that projects like CompBioMed are set up, it's essential to have access to powerful supercomputers. We can use them to do things that haven't really been done before in biology and medicine, such as using mathematical models that represent the human body and predict what will happen in the body or during treatments. Biologists and medics are not yet used to supercomputing, so you often hear that modelling the human body is far too complex. But powerful computers have now reached a scale where they can tackle it. That is why the supercomputers of the LRZ, the Barcelona Supercomputing Centre and the Argonne National Lab were an essential part of CompBioMed.
The CompBioMed Software Hub
Programmes, tools, databases or AI models developed for CompBioMed:
Alya: computer modelling framework to study, diagnose and potentially treat patients suffering from cardiovascular disease
BAC: high-performance simulation workflow which automates ensemble molecular dynamics simulation and free energy calculation protocols (the sketch after this list illustrates the underlying ensemble idea)
BoneStrength: in silico trial technology to simulate the effect of osteoporosis drug treatments
Computed Tomography to Strength (CT2S): online service developed to predict an individual’s risk of (femoral) fracture
IMPECCABLE workflow: a scalable, flexible, and extensive infrastructure to discover improved leads targeted at SARS-CoV-2
HemeLB: a 3D macroscopic blood flow simulation code
HemoCell: high-performance cellular suspension code primarily aimed at blood flow related simulations.
MonoAlg3D_C: GPU-compatible framework for the modelling and simulation of human cardiac electrophysiology
openBF: a validated open source software for the simulation of blood flow in networks of elastic arteries
Palabos: a massively parallel simulation of complex fluid flow
PlayMolecule: a drug discovery web service that provides a comprehensive set of applications aimed at accelerating and improving drug discovery workflows
Predicting AMR: AI model for forecast about resistance to antibiotics
TIES: a collection of software packages which can be used to calculate protein ligand binding free energies with physics based alchemical methods
TorchMD-NET: a PyTorch-based package which provides state-of-the-art neural network architectures for learning molecular interatomic potentials, integrated in OpenMM
UVA: a database of cellular simulations resulting from investigations carried out with HemoCell
UOXF: a database for in silico populations developed in Task 3.6 for the simulation of large populations of subjects, complete with realistic representations of heart shape and ECGs
UNIBO: a synthetic cohort generator based on 94 proximal femur CT-based subject-specific finite-element models from the Sheffield cohort
You will find more details here: https://www.compbiomed.eu/compbiomed-software-hub/
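Several of the tools above, most visibly BAC and TIES, are built around ensemble simulation: the same free-energy calculation is repeated in many independent replicas, and the result is reported as an ensemble average with an error bar rather than as a single-run number. The short Python sketch below illustrates only that statistical idea; it is not BAC or TIES code, and the curve, noise level and units are invented for the illustration.

```python
# Minimal sketch of the ensemble idea behind workflows such as BAC and TIES.
# NOTE: this is NOT their code or API; the "simulation" is replaced by synthetic
# dU/dlambda data so the script runs anywhere with NumPy alone.
import numpy as np

rng = np.random.default_rng(42)

LAMBDAS = np.linspace(0.0, 1.0, 11)   # alchemical coupling windows (lambda = 0 ... 1)
N_REPLICAS = 25                       # independent replicas, the core of ensemble MD


def fake_dudl(lam: float, rng) -> float:
    """Stand-in for <dU/dlambda> from one MD window: a smooth curve plus noise."""
    true_curve = -20.0 * (1.0 - lam) + 5.0 * lam**2   # arbitrary synthetic shape
    return true_curve + rng.normal(0.0, 1.5)          # replica-to-replica scatter


def replica_delta_g(rng) -> float:
    """One replica's thermodynamic integration: integrate <dU/dlambda> over lambda."""
    dudl = np.array([fake_dudl(lam, rng) for lam in LAMBDAS])
    # trapezoidal rule, written out to stay independent of the NumPy version
    return float(np.sum(0.5 * (dudl[1:] + dudl[:-1]) * np.diff(LAMBDAS)))


estimates = np.array([replica_delta_g(rng) for _ in range(N_REPLICAS)])

# Report the ensemble mean and a bootstrapped standard error, not a single-run value.
boot_means = [rng.choice(estimates, size=N_REPLICAS, replace=True).mean()
              for _ in range(5000)]
print(f"ensemble mean dG = {estimates.mean():7.2f} (synthetic units)")
print(f"bootstrap error  = {np.std(boot_means):7.2f} over {N_REPLICAS} replicas")
```

The point of the bootstrap step is that the spread between replicas, not the value of any single run, determines how much a free-energy prediction can be trusted.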
CompBioMed experts advise specialists in the Slack channel In Silico World and on the website CompBioMed.eu
One of the goals was the Virtual Human, a digital twin. How far has that work progressed? Coveney: Human digital twins will be part of a virtual future of medicine. Building accurate, high-fidelity models of people is a huge task, and CompBioMed has contributed to this in a number of ways. We're not talking about a single representation of a whole human being in the near future. It's more about components of the human body that can be accurately simulated on supercomputers. Something else: normally, when you run a simulation, you simply get information out of it. When you take the next step to a digital twin, you're trying to link the model much more closely to the human subject. This cycle puts much higher demands on the management of experimental or clinical data and on the various imaging technologies that are used to bring that data into the simulations as quickly as possible. In this particular case, we are talking about people and their personal data. If you're going to really personalise the Virtual Human, people have to agree to have their data used in digital models. Are people willing to share their data with a computer?
Are they? After all, many people already use devices to monitor their health data. Coveney: The incessant monitoring of everything may be going too far. But it's also clear that the medical profession is failing a lot of people around the world. There's so much information on the internet now that people can easily see if they're not being treated properly by doctors. A lot of people talk about cases where doctors haven't been able to cure them of an ailment, but they've been able to find a solution themselves online. That's part of the problem in healthcare, that doctors act a bit like AI systems, because they have a lot of experience with a lot of patients and they have a general idea of what's going on. But when it comes to treating a particular patient, they're usually not very good because they're trying to deal with an average patient.
Can this personal monitoring data help improve medicine or individual treatments? Coveney: At present you can get a lot of patient data for research purposes as long as it is anonymised. But that's only the first step; we want to build specific models of a specific individual. Medical data needs to be secure enough that it can be used to help with patient therapies and treatments without posing a risk. So data security is one of the biggest challenges for digital health. On the other hand, it's good to get data - artificial intelligence comes up in all these conversations about the digital twin and medical data. AI needs access to data to run statistical models and methods. But that has led to a lot of ethical and moral questions, because everyone can see that companies are making a lot of money out of taking people's data without them being aware of it or giving permission. I am very concerned about that. And in healthcare, I can get a lot of data about individuals and make predictions about their general state of health, well-being, wellness. But in my view, that's not really personalised medicine. It's predicting someone's health based on everyone else's health. Why would that be a good idea?
So for personalised medicine with digital twins, we should have a single person's health data, as complete as possible, from every day? Coveney: Yes, but there's a lot of misunderstanding about digital twins and data. You might think of a whole person, but in practice doctors only look at one part of a person, for example the heart. Human hearts already exist in computers; they are built by taking imaging data and fitting it to the software. This can be used to create a model for a specific patient or even a whole population of real people. That's what's happening now, so you can do what we call in silico studies to replace testing on real people. These could be drug trials. You can test the effect of a drug on the human heart - not just on an average heart, but for each of the patients whose models are available. One of the companies involved in CompBioMed 2, ELEMBIO, is developing such testing methods. With in silico or virtual trials, we can avoid the risks of side effects in humans and improve the results of clinical trials, because we can statistically represent the population quite accurately.
Are computer models and in silico trials already widely used in the pharmaceutical industry? Coveney: I regard drug discovery as part of the Virtual Human. To figure out what drugs to give to individuals, you need the same kind of data, even though the medical treatments are different. A lot of molecular research is focused on trying to discover drugs that can be used to treat larger organ systems. But the global pharmaceutical industry is not a success story. Statistics show that it takes an average of ten years to develop a drug, which can cost around $3 billion. If it works, it will be used - and when a drug is successful, it is estimated to be used by about half the population. Yet usually drugs are effective in far fewer people, and there will be side effects. We need to be able to develop better targeted drugs that work for different people.
The digital twin for personalised treatments requires a lot of very sensitive personal data about an individual, stored for as long as possible. Are we ready for that? Coveney: That is a question which now arises. We need to inform people about what's going on with the technology and we need to address the ethical, moral and other issues. In some of the surveys that CompBioMed has done, we have found that a lot of people are very interested in this idea. At the end of the day, part of it is actually trying to build personal sovereignty over personal data. You have your own data and you can do what you want with it. Many of us believe that in the future, medical data will be managed in the same way that people's bank accounts are managed today. People do their financial business electronically because it's much more convenient. So the same idea applies to healthcare, which means you will no longer be tied to particular hospitals or doctors. You can also sign up for virtual trials. And you can consult doctors digitally. If you are interested in making positive choices about your lifestyle - diet, exercise and more - you can do that.
With the LRZ, the CompBioMed Centre of Excellence has also visualised processes in human organs, such as blood flow. How important are visualisations for the medicine of the future? Coveney: Visualisation is really important for the work we do. Films can convey scientific processes very effectively. I'm always pushing for a scenario where you have a reliable simulation that you can interact with in the way that you interact with video games. This gives clinicians the opportunity to work through what-if scenarios before performing invasive surgery. The quality of the visualisation depends on the complexity of the model, on how fast you can run it, and on how fast you can actually interact with it.
- 18 core partners; 52 associated partners
- 310 science papers; 2 books: Virtual You / HPC for Drug Discovery and Biomedicine
- 139 events, 26 workshops
- 2 films: Virtual Human / The Next Pandemic
- Education, training, information, software development
- around 190 million people reached
Brexit happened during the second phase of CompBioMed, how did it affect the project? Coveney: Brexit has been very damaging, especially for science and research. It has created barriers to communication and collaboration. Definitely a bad thing. CompBioMed had some bad moments because of Brexit. I noticed that there were significant difficulties in dealing with the Commission, but not at all between the partners in the project. I'm glad we managed to get through it without any significant damage to our work.
What's next for the Centre of Excellence? Coveney: It took until last year for the UK government to finally agree to participate in Horizon Europe. In addition, everything related to high-performance computing in Europe became part of the EuroHPC Joint Undertaking. However, it seems that EuroHPC is currently focusing more on hardware development than on research projects. So unfortunately I do not know whether we will be able to continue CompBioMed.
You spent the last two months in Munich at the CAS at the LMU – what have you been doing here? Coveney: First of all, the mundane - I regularly visit people as part of many collaborations, including at the LRZ. What we like to do is discuss the future, where technology is going, and people there are interested in quantum computing. It can't really offer anything significant today, but maybe it will in the near future. It's a bit of a bet. If we get another CompBioMed project, I would include quantum computing. Bringing quantum devices and supercomputers together is now an essential step, because we can only start with this technology in a hybrid setup. We have to integrate it like GPUs and other forms of accelerators. Quantum computing has to be seen as a new part of supercomputing if it's going to be really useful. Another reason for my stay - Munich is a place I like very much. I came here when I was much younger to study German. And, of course, the Centre for Advanced Studies of the Ludwig-Maximilians-University and its Department of Computer Science provide me with good facilities. So I have found an inspiring spirit, a very good environment for working on my book and a lot of interesting people to talk to.
What do you write about? Coveney: I'm writing about molecular dynamics, more specifically about probability and uncertainty. Molecular dynamics is a science that has been using computers for about 60 years, and it is one of the main uses of supercomputing - probably up to half of the time on a large national supercomputer is spent on molecular dynamics simulations. My book is about the unpredictability and probability of these simulations. It's like weather forecasting. If you do a weather forecast on a computer, everybody knows there's a huge amount of uncertainty and you can't predict tomorrow's weather with a single simulation; you have to do a large number of them. That's why you need a supercomputer. The difference with molecular dynamics is that people don't really understand that it has the same limitations. You need a statistically robust sample. Molecular dynamics is chaotic: if there are small changes in the initial conditions when you run a simulation, the subsequent behaviour can be very different. So we need to develop a better theoretical understanding of molecular dynamics, so that we understand the probability and the associated uncertainty of the predictions. Surprising as it may seem, this is not properly understood in science.

Then we wish you continued success in your writing and research. Thank you for the interview. (Interview: S. Vieser/LRZ)
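The chaos Coveney describes can be made concrete in a few lines. The toy script below (a tiny Lennard-Jones cluster in arbitrary reduced units, not CompBioMed or production MD code) integrates the same system twice, with initial conditions that differ by only 1e-10 in a single coordinate, and shows how the two trajectories drift apart - which is exactly why observables have to be averaged over an ensemble of runs rather than read off a single trajectory.

```python
# Toy illustration of why molecular dynamics needs ensembles: a tiny Lennard-Jones
# cluster is integrated twice, with initial conditions that differ by only 1e-10 in
# a single coordinate, and the two trajectories drift apart. This is a didactic
# sketch with arbitrary reduced units, not CompBioMed or production MD code.
import numpy as np


def lj_forces(pos: np.ndarray) -> np.ndarray:
    """Pairwise Lennard-Jones forces (epsilon = sigma = 1) for an (N, 3) array."""
    forces = np.zeros_like(pos)
    n = len(pos)
    for i in range(n):
        for j in range(i + 1, n):
            r = pos[i] - pos[j]
            d2 = float(np.dot(r, r))
            mag = 24.0 * (2.0 / d2**7 - 1.0 / d2**4)   # -(dU/dd)/d for the LJ potential
            forces[i] += mag * r
            forces[j] -= mag * r
    return forces


def run(pos0: np.ndarray, vel0: np.ndarray, dt: float = 0.001, steps: int = 8000):
    """Velocity-Verlet integration; returns positions sampled at every step."""
    pos, vel = pos0.copy(), vel0.copy()
    forces = lj_forces(pos)
    traj = [pos.copy()]
    for _ in range(steps):
        vel += 0.5 * dt * forces
        pos += dt * vel
        forces = lj_forces(pos)
        vel += 0.5 * dt * forces
        traj.append(pos.copy())
    return np.array(traj)


rng = np.random.default_rng(0)
lattice = 1.5 * np.array([[i, j, k] for i in range(2) for j in range(2) for k in range(2)],
                         dtype=float)
pos0 = lattice + rng.normal(0.0, 0.05, size=lattice.shape)   # 8 particles near a small cube
vel0 = rng.normal(0.0, 0.5, size=lattice.shape)

traj_a = run(pos0, vel0)
pos0_b = pos0.copy()
pos0_b[0, 0] += 1e-10               # a perturbation far below any experimental resolution
traj_b = run(pos0_b, vel0)

# How far the two runs have drifted apart at a few checkpoints:
separation = np.linalg.norm(traj_a - traj_b, axis=(1, 2))
for step in (0, 2000, 4000, 6000, 8000):
    print(f"step {step:5d}: separation {separation[step]:.2e}")
```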
Prof. Dr. Peter Coveney, talking in the garden of the CAS in Munich. His book "Molecular Dynamics: Probability and Uncertainty" will be published in 2025 by Oxford University Press.