
Annali dell'Istituto Superiore di Sanità

Print version ISSN 0021-2571

Ann. Ist. Super. Sanità vol.47 n.1 Roma Jan. 2011 



The dawn of mesoscopic approach in drug development





Alessandro Giuliani

Dipartimento di Ambiente e Connessa Prevenzione Primaria, Istituto Superiore di Sanità, Rome, Italy





The feeling of having reached a crucial turning point is shared across the whole spectrum of the sciences. The main features of this turning point are essentially the same across disciplines and can be interpreted as a relocation of the most relevant level of explanation (and, consequently, of intervention in the more applied fields) from the microscopic to the so-called mesoscopic level. Here, the character of the mesoscopic approach in the field of drug development is briefly discussed.

Key words: network pharmacology, target, systems biology, biocomplexity.






In the year 2000, in their seminal paper entitled The middle way, which appeared in the Proceedings of the National Academy of Sciences, the physics Nobel laureate Robert Laughlin and other eminent scientists set the scientific agenda for the 21st century [1]. After three centuries in which the definitive (or at least the most complete) explanation of a natural phenomenon was identified with the construction of a causative model at the most microscopic level, scientists discovered that the place where the "most interesting" things happen is not the level of the "basic bricks" but the level where correlations between the fundamental structure and the global behaviour arise. Using an architectural metaphor, the authors point out that we cannot discriminate between a Romanesque and a Gothic cathedral either at the level of brick composition (microscopic level) or at the level of the general plan (macroscopic level), but we can efficiently perform this task by considering the shape of the arches, the arch being the mesoscopic level that links the microscopic (bricks) and macroscopic (plan) layers.

Far from being a purely philosophical and theoretical proposal, the article reports an impressive list of "mesoscopic approaches" to different natural phenomena, demonstrating the impossibility of tackling these problems with a classical "back to the fundamentals" approach.



Physicists have traditionally regarded differential equations as the privileged form of expression for scientific explanation: this dates back to the dawn of modern science, when every phenomenon was framed in terms of "motion". Biologists, especially over the last four decades, have grown used to looking at the world as a place where microscopic agents (proteins, metabolites, hormones) interact by "mutual recognition" at specific molecular sites. These mutual recognitions, linearly arranged in subsequent steps, give rise to "pathways" that end with a macroscopic (organism-level) consequence. Both approaches are usually called "reductionist".

In the mesoscopic approach both these attitudes are replaced by a geometrical "phase space" approach, in which the system at hand is placed in a multidimensional space corresponding to its relevant features, where it "finds its way" by minimizing some general "energy" function. This space is far from smooth and continuous: it can be likened to a mountain territory where peaks (corresponding to energy maxima) alternate with deep valleys (energy minima). Such representations are in fact called "rugged landscapes" by scientists, as depicted in Figure 1. Probably the most studied of these landscapes are those corresponding to the folding dynamics of proteins [2, 3]: the axes spanning these landscapes are suitable structural descriptors (distances between landmark amino acid residues, contact matrix descriptors, etc.) and the valleys correspond to stable configurations of the protein molecule.
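As a toy illustration (not taken from the article), the "finds its way" idea can be sketched in a few lines: gradient descent started from many initial points on a hypothetical rugged one-dimensional landscape always ends up in one of only a handful of distinct valleys.

```python
import numpy as np

def landscape(x):
    # A hypothetical 1D "rugged landscape": a broad bowl with superimposed
    # ripples, giving several local minima (valleys) separated by peaks.
    return 0.05 * x**2 + np.sin(3.0 * x)

def descend(x, step=0.02, n_steps=2000, eps=1e-6):
    # Plain gradient descent with a numerical derivative: the system moves
    # downhill until it is trapped in the nearest valley (attractor).
    for _ in range(n_steps):
        grad = (landscape(x + eps) - landscape(x - eps)) / (2 * eps)
        x -= step * grad
    return x

# Start from many initial conditions spread over the space ...
starts = np.linspace(-10, 10, 100)
minima = np.round([descend(x) for x in starts], 2)

# ... yet only a small, discrete set of valleys is ever reached.
print(sorted(set(minima)))
```

The continuous range of starting conditions collapses onto a discrete repertoire of end states, which is the essential geometry behind the "rugged landscape" picture.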



As is often the case in the history of science, this kind of representation is not entirely new: thermodynamics has made use of very similar diagrams for decades. Everyone knows the so-called "triple point": that specific combination of pressure and temperature at which water can simultaneously exist in the liquid, gas and solid aggregation phases. An efficient phase space description of a given system depends heavily upon the choice of meaningful "collective state descriptors" (such as pressure, temperature and volume in thermodynamics) that allow us to focus on the relevant emergent properties of the system at hand without getting lost in a myriad of tiny (and globally irrelevant) details.

In the case of ideal gases, in which the interactions between particles can be considered negligible, such collective descriptors arise as simple statistics over myriads of particles; in the case of biological entities, where the correlations between the constituent parts are far from negligible, we need something different.

In the case of proteins, where the correlations in both the spatial arrangement and the chemico-physical properties of the constituent amino acid residues are of utmost importance, phase space representations are based on properties of the molecule strictly dependent on specific correlation structures [4, 5]; the same is starting to happen for microarray-based differential gene expression studies [6, 7].

Pharmacological research offers an exceptionally interesting vantage point from which to observe this "transition to the mesoscopic": while the transition is still in its infancy and, to be completely honest, we still do not know where to start, the deep crisis of pharmacological research, in which the number of newly developed drugs has been declining for thirty years, demands an immediate response.

For decades the dominant paradigm of pharmacology was fully reductionist: the goal of pharmacological research was to identify the molecular determinant of a given disease. Once this determinant, generally a protein molecule such as an enzyme or a signalling protein, was selected, it was considered the "target" of the drug (generally coinciding with the so-called receptor), and candidate drugs were screened on their ability to bind selectively to that receptor. Clearly this is only the very first phase of the screening procedure: the candidates that pass this step enter subsequent phases in which their efficacy is tested on animal models of the disease and their toxicity is assessed, and eventually they go into clinical trials. Even though the great majority of candidate molecules fail in these subsequent phases, all these steps are nevertheless regarded as "contingencies" that merely fail to let the intrinsic efficacy of the molecule (equated with its binding to the receptor) emerge at the macroscopic level. In other words, the "basic science" ends with the initial phase; all the rest is "application".

This strategy worked remarkably well for around thirty years; then, almost abruptly, it stopped working around the nineteen-eighties, producing the apparent paradox of an exponential growth of basic knowledge in biology going hand in hand with a drastic fall in newly marketed drugs. This crisis is very well described in a 2006 paper [8], in which the authors estimated that roughly 76% of the drugs discovered over the preceding twenty years interacted with receptor molecules (targets) discovered around the nineteen-fifties, while only 6% could be interpreted as binding to recently discovered targets; for the remainder, no reasonable hypothesis about the mechanism of action could be sketched. All the drugs actually on the market were estimated to act on around 130 receptors belonging to 5 major protein families, an incredibly small number when compared with the 16 000 protein families identified by genomics. The promise of a "druggable genome" set forth by the completion of human genome sequencing, with its consequent opening of a practically infinite horizon for the development of new drugs, clearly failed: something very fundamental went wrong.



There are two related features of complex system organization and, more generally, of scaling laws in nature that explain the failure of the "druggable genome": the first is the non-linearity of genotype/phenotype relations, whereby many different genotypes end up with the same phenotype [9]; the second is the relative paucity of "short cut" pathways going from microscopic to macroscopic organization scales in a linear manner [10].

The first point has to do with the shape of "rugged energy landscapes", in which only a discrete number of equilibrium states exist in front of a continuous variety of control parameters. This discreteness has huge consequences for biological matter: around 1000 protein folds account for the 3D shapes of all known proteins, and only around 200 different tissues are present in mammals, corresponding to 200 patterns of genome-wide expression of around 30 000 genes potentially varying over four orders of magnitude in expression level [6]. This implies that the repertoire of "possible behaviours" of a given biological entity is highly constrained.
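The collapse of a huge state space onto a few attractors can be illustrated with a minimal random Boolean network, a standard caricature of gene regulation (the wiring below is randomly generated and purely hypothetical, not a model from the article):

```python
import random
from itertools import product

random.seed(0)
N = 10                       # 10 genes -> 2**10 = 1024 possible on/off states

# Each gene's next state is a random Boolean function of two regulator genes.
inputs = [(random.randrange(N), random.randrange(N)) for _ in range(N)]
tables = [[random.randrange(2) for _ in range(4)] for _ in range(N)]

def step(state):
    # Synchronous update: gene i reads its two regulators a and b.
    return tuple(tables[i][2 * state[a] + state[b]]
                 for i, (a, b) in enumerate(inputs))

def attractor(state):
    # Iterate until a state repeats; the repeating cycle is the attractor
    # (a "valley" in the landscape metaphor).
    seen = {}
    while state not in seen:
        seen[state] = len(seen)
        state = step(state)
    first = seen[state]
    return tuple(sorted(s for s, t in seen.items() if t >= first))

# Map every one of the 1024 initial states to the attractor it falls into.
attractors = {attractor(s) for s in product((0, 1), repeat=N)}
print(len(attractors))       # typically a handful: far fewer than 1024
```

The exact count depends on the random wiring, but the qualitative outcome is robust: a combinatorially large state space drains into a small discrete set of stable patterns, much as 30 000 genes support only a couple of hundred tissue-level expression profiles.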

The second point has to do with the presence of many alternative paths in a complex network of relations between constituent elements, so that, in the great majority of cases, even if a given node (target) is involved in a specific process, a specific action on that target has no appreciable effect at the macroscopic scale, because of the myriad alternative pathways through which the system can remain in the same state.
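A minimal sketch of this redundancy, using a hypothetical signalling network (all node names are invented) and plain breadth-first search: knocking out a single node on one pathway leaves the input-output relation intact, and only a multi-node intervention blocks it.

```python
from collections import deque

def reachable(adj, src, dst, removed=frozenset()):
    # Breadth-first search from src to dst, skipping "knocked-out" nodes.
    seen, queue = {src}, deque([src])
    while queue:
        node = queue.popleft()
        if node == dst:
            return True
        for nxt in adj.get(node, []):
            if nxt not in seen and nxt not in removed:
                seen.add(nxt)
                queue.append(nxt)
    return False

# A toy signalling network: the input reaches the output through
# three partially overlapping pathways.
adj = {
    "input": ["A", "B", "C"],
    "A": ["D"], "B": ["D"], "C": ["E"],
    "D": ["output"], "E": ["output"],
}

print(reachable(adj, "input", "output"))                      # True
print(reachable(adj, "input", "output", removed={"D"}))       # still True, via C -> E
print(reachable(adj, "input", "output", removed={"D", "E"}))  # False: all routes blocked
```

Hitting the "target" D alone changes nothing at the output, which is exactly the failure mode of highly selective single-target drugs in a redundant network.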

The presence of few "allowable" states corresponds to what is called "attractor-like" dynamics [12], an attractor being a valley in the energy landscape in which the system tends to remain unless we push it out with an energy high enough to make it exit its equilibrium state; the depth of the valley is roughly proportional to the energy needed to exert an effective change on the system. This is the basis of the reliability and stability of biological systems, which would otherwise be simply inconceivable [11].

The presence of attractor-like dynamics in biological systems implies a very big change in pharmacological research strategies, as aptly shown in [12]. We can try to describe the differences between the two attitudes (classical vs mesoscopic) in a schema reporting the choices linked to the two paradigms at three levels of pharmacological experimentation:

1) Initial screening

Reductionist paradigm. Accept as candidate drugs those molecules that selectively bind to the receptor considered as crucial for the disease. Eliminate the molecules that bind to a multiplicity of different receptors because they are more prone to give rise to undesired side effects.

Mesoscopic paradigm. Prefer as candidate drugs those molecules weakly binding to a multitude of different receptors, because they are more likely to act as "network modifiers", inducing a systemic effect that pushes the dynamics towards another attractor.

2) Choice of biological end-point

Reductionist paradigm. Select a biological test strictly linked to the molecular function of the target.

Mesoscopic paradigm. Select a coarse-grained biological modification (e.g. shape modifications of cell membranes in a tissue, or a metabolomic profile considered as a whole) that is more likely to be linked to a general shift of state.

3) Clinical trials

Reductionist paradigm. Patients to be enrolled in the study are selected with the goal of presenting the disease condition in its "purest form", keeping to a minimum the possible confounding due to other contingencies. The measured end-point is the clinical variable considered most relevant to the patient's state.

Mesoscopic paradigm. Patients to be enrolled in the study should form a sample in which the natural sources of variability of actual patient populations are represented. A multiplicity of clinical end-points should be measured, and the effect of the drug on the correlation structure of these end-points (e.g. principal component scores) considered the signature of drug efficacy.
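To make the last point concrete with a sketch (the data and the "drug effect" below are entirely synthetic assumptions), the share of total variance captured by the first principal component of the end-point correlation matrix gives a one-number signature of how tightly the end-points move together, which a systemic shift of state would change even when no single end-point does:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200                                   # synthetic patients per arm

# Three hypothetical clinical end-points. In the placebo arm they are all
# driven by one shared latent factor, so they are strongly correlated.
latent = rng.normal(size=n)
placebo = np.column_stack([latent + 0.3 * rng.normal(size=n) for _ in range(3)])

# Under the simulated drug the end-points decouple: similar marginal
# behaviour, but the shared factor no longer dominates.
treated = rng.normal(size=(n, 3))

def leading_share(x):
    # Fraction of total variance on the first principal component of the
    # correlation matrix: 1 means one dominant collective mode, 1/3 means
    # three independent end-points.
    corr = np.corrcoef(x, rowvar=False)
    eigvals = np.linalg.eigvalsh(corr)    # ascending order
    return eigvals[-1] / eigvals.sum()

print(round(leading_share(placebo), 2))   # close to 1: one dominant mode
print(round(leading_share(treated), 2))   # close to 1/3: correlation structure broken
```

The drug's signature here is the change in correlation structure between arms, not a shift in any individual end-point, in line with the mesoscopic reading of efficacy.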

Pharmacological research is a mainly empirical affair; this implies that, beyond the shift to a mesoscopic attitude taking place in the more theoretical sciences, working scientists and regulators need something "sturdier" before leaving an old path for a new one.

In my opinion, the two main "sturdy" reasons pushing for the change will be the progressive loss of efficacy of the old path and the existence of already known "network multi-target" drugs (e.g. aspirin) that, after decades, still escape any receptor-like explanation.

All in all, the adoption of a more "realistic" attitude towards the analysis of biological systems through pharmacological research could be very important not only for future drugs but also for a renewed biology that will open up still unexpected avenues of knowledge.

Conflict of interest statement

There are no potential conflicts of interest or any financial or personal relationships with other people or organizations that could inappropriately bias conduct and findings of this study.



1. Laughlin RB, Pines D, Schmalian J, Stojkovic BP, Wolynes P. The middle way. Proc Natl Acad Sci USA 2000;97:32-7.

2. Frauenfelder H, Sligar SG, Wolynes P. The energy landscapes and motions of proteins. Science 1991;254:1598-603.

3. Okazaki K, Takada S. Dynamic energy landscape view of coupled binding and protein conformational change: induced-fit versus population-shift mechanisms. Proc Natl Acad Sci USA 2008;105:11182-7.

4. Krishnan A, Zbilut JP, Tomita M, Giuliani A. Proteins as networks: usefulness of graph theory in protein science. Curr Protein Pept Sci 2008;9(1):28-38.

5. Del Sol A, Fujihashi H, Amoros D, Nussinov R. Residues crucial for maintaining short paths in network communication mediate signaling in proteins. Mol Syst Biol 2006;2:2006.0019.

6. Giuliani A. Collective motions and specific effectors: a statistical mechanics perspective on biological regulation. BMC Genomics 2010;11(Suppl. 1):S2.

7. Huang S. Reprogramming cell fates: reconciling rarity with robustness. BioEssays 2009;31:546-60.

8. Overington JP, Al-Lazikani B, Hopkins AL. How many drug targets are there? Nat Rev Drug Discov 2006;5:993-6.

9. Krishnan A, Giuliani A, Tomita M. Indeterminacy of reverse engineering of gene regulatory networks: the curse of gene elasticity. PLoS ONE 2007;2(6):e562.

10. Hopkins AL. Network pharmacology: the next paradigm in drug discovery. Nat Chem Biol 2008;4(11):682-90.

11. Brock A, Chang H, Huang S. Non-genetic heterogeneity: a mutation-independent driving force for the somatic evolution of tumours. Nat Rev Genet 2009;10:336-42.

12. Csermely P, Agoston V, Pongor S. The efficiency of multi-target drugs: the network approach might help drug design. Trends Pharmacol Sci 2005;26:178-82.



Address for correspondence:
Alessandro Giuliani
Dipartimento di Ambiente e Connessa Prevenzione Primaria, Istituto Superiore di Sanità
Viale Regina Elena 299
00161 Rome, Italy

Submitted on invitation.
Accepted on 20 September 2010.