From risk profiling to gene therapy and molecular diagnostics, personalized medicine opens new, exciting fields to medical research. Not only is it good news for patients: considerable improvements are at stake for health systems and for pharmaceutical firms now struggling to reinvent themselves. But the road ahead is still full of obstacles.
The concept of personalized medicine was introduced two decades ago by the Swiss company Roche. It was based on a simple reality of medical practice: the same drug may induce different reactions in different patients, and for a given patient, some drugs work while others don’t. With the introduction of a breast cancer treatment in the 1990s, Roche demonstrated that it was possible to anticipate which patients would or would not benefit from the treatment.
It therefore became possible to personalize treatments, that is, to administer a drug only to those patients who would respond positively to it. The impact of this new approach is huge: better efficacy, fewer side effects, and no more time and resources wasted on a treatment that does not work.
This first form of personalized medicine opened the way to five different approaches.
Stratified medicine is the approach developed by Roche for its breast cancer treatment, Herceptin: it consists in dividing patients into four groups according to their reaction to a drug in terms of efficacy and side effects. The drug is given only to the group that responds positively and does not display side effects.
Oncological vaccination is another form of personalized medicine, whereby the patient’s immune system is “trained” to destroy tumor cells. This is achieved by re-injecting a sample of the patient’s own cells after treating them outside the body. The treated cells are designed to stimulate the patient’s immune system specifically against tumor cells.
Tissue reconstruction is a promising new area. It can be done by implanting cells that have the capacity to grow on damaged tissue and repair it. The cells can either come from the patient, after an external treatment, or be stem cells donated by others.
Gene therapy aims at modifying a patient’s genome in order to remove the mutation at the origin of a disease. It can be divided into persistent DNA transfection, transient DNA transfection, and RNA therapy. Persistent DNA transfection has been almost abandoned today: permanently changing someone’s genome can have long-term implications, even after the patient’s death, since genome modifications are transferred to their offspring. It also often has unpredictable consequences on other biological processes. Current research now concentrates on the other two approaches, which have the benefit of being reversible. The first drugs are now reaching the market; an interesting example is Glybera, the first effective treatment for LPLD, a rare disease in which patients are deficient in a protein called LPL, a deficiency that induces diabetes and cardiovascular diseases.
Risk profiling is probably the most publicized form of personalized medicine: by sequencing a patient’s genome, it is now possible to estimate the risk that a person will develop a disease in the future. As the total cost of sequencing a person’s entire genome goes down every year, it will soon be possible for everyone to obtain their own genome sequence, and therefore their risk profile for a number of diseases. Other, more accurate forms of risk profiling also exist: in cardiology, for example, measuring the concentration of a certain protein in the blood makes it possible to accurately predict a patient’s risk of a cardiovascular event (stroke, heart failure) in the coming months or years.
Risk profiling is a particularly important aspect of personalized medicine, since it opens the way to prevention, a huge potential source of well-being for people and cost-savings for healthcare systems.
To understand why and how the whole healthcare industry is now turning to personalized medicine, one has to take a look at the scientific revolution that triggered this paradigm shift.
For the last 10 years, improving R&D productivity has been one of the key challenges for pharmaceutical companies. With most of their top-selling drugs going off-patent between 2004 and 2012, and not enough new products coming up to compensate for the loss, this wealthy industry has been facing increasing pressure to innovate.
Streamlining the management of R&D projects, forming alliances, building long-term agreements with academic or biotech research teams, and licensing-in research projects with sophisticated deal structures have been typical and sometimes very effective responses to this challenge. However, none of these approaches has really addressed the central problem of pharmaceutical R&D today. To find new drugs tomorrow, scientists will need to better understand the molecular processes that govern diseases, and how the genetic diversity of patients influences treatment outcomes.
In parallel, the industrialization of molecular biology techniques has led to significant progress in the detection and analysis of molecules of ever-increasing complexity such as long DNA/RNA fragments, proteins, or even entire cells. These techniques have allowed scientists to gain a much more accurate understanding of the molecular processes which are at the origin of many diseases. The potential of what is now called molecular diagnostics is immense, not only in R&D, but in daily medical practice as well.
At the other end of the spectrum, healthcare systems in most countries are under high pressure. They no longer reimburse “me-too” drugs that add no medical value over existing treatments; and in some cases, they now pay only if the treatment outcome is positive. A growing number of healthcare systems admit the need for return on investment – once a taboo.
Molecular diagnostics provide the scientific answer to both of these challenges: not only do they improve R&D productivity, but they also give clinicians powerful tools to better understand what they are doing and how effective their treatment decisions are. In the end, clinicians treat their patients better, at a lower cost.
Molecular diagnostics detect and measure specific molecular targets using selective probes; they are always associated with a visualization method. The value of these targets – called biomarkers – is that they correlate with a disease, or sometimes even are the root cause of it. Biomarkers can be of very different nature, ranging from simple small molecules (like metabolites) to complex structures such as proteins, nucleic acids, or even entire cells.
Finding the right biomarker that will enable clinicians to understand and accurately track the progression of the disease – or the response to a therapy – is one of the key challenges of pharmaceutical research today. Indeed, there is often no direct causality between a biomarker and the disease/response. Some biomarkers may be representative of several diseases and conversely, it might be necessary to detect several biomarkers to identify a disease.
Detection and amplification technologies have made huge progress in recent years and have become the main drivers in the development of molecular diagnostic tools. The best-known methodology is the Polymerase Chain Reaction (PCR), but others such as micro-arrays, Fluorescent In-Situ Hybridization (FISH), immunohistochemistry, and Fluorescence-Activated Cell Sorting (FACS) have now reached maturity.
Biomarkers play an important role at all levels of the pharmaceutical value chain: very early in the discovery process, during product development, and also at the patient’s level, for prevention, treatment monitoring and disease management.
Biomarkers can be sorted into three categories; their role depends on where they are used in the value chain.
Discovery biomarkers are molecular entities that are usually identified during the early discovery process of a new drug. These biomarkers are related to the biological target, and are used to validate disease mechanisms, probe potential toxicological effects, or anticipate potential genetic variations in drug response. The hope of many pharma R&D executives is that they will help reach proof of concept in humans more quickly: the idea is to be able to perform clinical trials safely, with very low doses, on a few patients much earlier, even before the full pre-clinical package has been completed. Drugs that are shown not to work in these early trials would then be eliminated earlier, which in turn would allow more substances to be tested.
Later in the process, development biomarkers can also be very useful. Usually the same as those identified during the discovery process, they are used in clinical trials to provide early information about the drug’s efficacy and safety, before any clinical parameter can be measured. Because they provide information that is not accessible to conventional clinical monitoring, they can sometimes significantly accelerate the development process. In a nutshell, to measure the efficacy of a new drug, simple blood tests can now (in some cases) replace months or even years of clinical observation.
And finally, commercial biomarkers. By “commercial” we mean biomarkers that have been officially recognized by the regulatory authorities as tools that bring useful information on a patient’s state and that can be used to make clinical decisions; diagnostic tools using these biomarkers can also be reimbursed by the healthcare system. This type of biomarker is not new. To mention two examples, glucose has been recognized as a biomarker of diabetes since the 19th century (it has now been replaced by HbA1c), and PSA (Prostate-Specific Antigen) has served as a biomarker for prostate cancer since the 1980s.
Although biomarkers have been used for a long time, the development of molecular biology, bioinformatics, and innovative detection methods has hugely increased the possibilities, and a new type of tool has emerged: “molecular” diagnostics. The first emblematic examples of these powerful new techniques are the so-called “companion diagnostics”, first introduced in cancer (with Herceptin, mentioned earlier) and more recently in HIV, where the prescription of some expensive treatments is conditional on the result of a specific genetic test. Other well-known examples are routine screening tests for infectious diseases (such as HIV) that have a very high degree of specificity and sensitivity.
Molecular diagnostics are the tools of modern medicine. They enable clinicians to identify early signs of diseases, to select the right treatment based on genetic profiles (companion diagnostics), to monitor treatments’ efficacy and to make much better informed decisions based on each patient’s particular profile and reaction to previous treatments. They also have an economic role, because they may lead in some cases to significant savings in treatment costs.
It seems, then, that an ideal world is ahead, where all diseases would be diagnosed before it is too late and treated with personalized therapies that cure them immediately, without side effects or relapses… There are, however, some barriers on the road.
Firstly, for healthcare manufacturers, the use of biomarkers may in some cases slow down research and development rather than accelerate it: as genetic profiling of patients becomes possible, the temptation is high to divide patients into more groups according to their genetic profile, which in turn increases the number of tests to be performed and the amount of data to be analyzed, and may also slow down patient recruitment.
Secondly, and this is the biggest challenge, the regulatory and economic framework that should favor the development of molecular diagnostics is not yet in place. Molecular diagnostics is an emerging field, and healthcare is a highly regulated environment. For drugs, in most countries today, the regulatory process that leads to marketing authorization is well structured, as is the “market access” process that leads to the reimbursement of the drug by healthcare systems. This is not yet the case for molecular diagnostics. The first reason is that the field of molecular diagnostics is new; the second, more important reason is that what they bring – information – is something healthcare professionals are not yet used to valuing.
Be it in Europe, in the United States, or elsewhere, a new molecular diagnostic tool will always have to prove that the biomarker it is supposed to monitor has both clinical relevance AND economic value in the overall management of the patient. In contrast with drugs, where clinical performance (vs. standard of care) is always the most important criterion, what payers will most often look for in a diagnostic tool is how much can be saved. The medical information brought by a molecular diagnostic tool can save or prolong lives, avoid pain, secure the patient, and help a physician make the right decision; payers, however, especially in Europe, are not prepared to pay for this information – yet. Introducing the principle of “value-based pricing” in future reimbursement negotiations is tomorrow’s main challenge for the diagnostic industry.
The time has not come where molecular diagnostics will be used systematically for every patient in every country, in prevention programs, before any treatment decision is taken, and after each treatment phase. It is also not yet entirely clear when and how payers will recognize the value of the information they bring.
However, there is little doubt that in the end, because the information they bring is so valuable to physicians – and eventually to payers as well – molecular diagnostics will become the main tools of tomorrow’s medicine: personalized medicine.