Luis David Castiel 1
The next millennium and epidemiology: searching for information

Próximo milênio e epidemiologia: em busca da informação

1 Departamento de Epidemiologia e Métodos Quantitativos em Saúde, Escola Nacional de Saúde Pública, Fundação Oswaldo Cruz. Rua Leopoldo Bulhões 1480, sala 828, Rio de Janeiro, RJ 21041-210, Brasil. castiel@manguinhos.ensp.fiocruz.br

Abstract On the eve of the new millennium, it has become 'natural' to admit the emergence of tendencies to perform evaluations and inventories of the past and attempts to forecast future scenarios. While recognizing the ensuing uncertainties, the current paper takes this point of view as the point of departure for proposing a discussion on the future directions and prospects of epidemiology. Based on the pertinent analyses performed by the Sussers (father and son), I approach and discuss the scope and limits of new aspects assumed by the field, especially to the extent that it has included techniques and instruments from bioinformatics and molecular biology. In the latter areas (amongst many others), the notion of information has gained enormous importance. I then proceed to analyze the conceptual origins and shifts in this notion, in addition to possible repercussions and effects on the field of biological sciences in general and their research practices in particular.

Key words Epidemiology; Molecular Biology; Information

Resumo Diante da expectativa de um novo milênio, torna-se "natural" admitir o surgimento de inclinações para procederem-se a avaliações e balanços do passado e a tentativas de prever-se cenários futuros. Este texto parte deste ponto de vista – reconhecendo suas incertezas – para propor uma discussão dos rumos e perspectivas da epidemiologia. A partir das análises nesta direção pelos Sussers (pai e filho), são abordados e discutidos o alcance e os limites de novos aspectos assumidos pela disciplina, especialmente ao incluir técnicas e instrumentos da bioinformática e da biologia molecular. Nestas áreas (entre muitas outras), é notável o fato de a noção de informação possuir enorme importância. São, então, analisadas origens e deslocamentos conceituais desta noção e possíveis repercussões e efeitos no campo das ciências biológicas, em geral, e de suas práticas de pesquisa, em particular.

Palavras-chave Epidemiologia; Biologia Molecular; Informação

"We learn to build discourses on reality that are highly abstract, but which we know how to recognize as true or false. It is a generalization, a fantastic extrapolation of our immediate experience with error or lie. In fact, this is a concrete, unique experience by which we learn simultaneously and concurrently a unique thing and a discourse on this thing that is different from what we learned about it."

Henri Atlan

Introduction: turn-of-the-century, inventories and forecasts

It is known that "ends of periods" exercise curious effects on human beings. They may be weekends (with their obvious, strong link to leisure, as opposed to so-called "workdays"), ends of the month (paychecks and bills...), ends of the year, decades, quinquennials (a fascination with numbers ending in fives and zeros), turns of centuries, and so on... When such periods end, we join in acts of celebration, joint remembrance (co-memoration), weddings, birthdays, quinquecentennials, centennials (of births and deaths of both persons and human creations)...

An "end of period" is obviously an arbitrary, conventional temporal category (human groups culturally define the ways by which they mark the passage of time) created by these peculiar beings that produce things which recursively affect them in what is often an unpredictable way.

In the face of presumed "chronological terminations", there emerges the task of producing evaluations, inventories, and value judgments. To what ends? In very simplified terms, several can be identified:

a) more explicit ends: to monitor and objectively identify trajectories and processes under way over the course of a given period of time;

b) less explicit ends: to (re)describe for ourselves (by redescribing to ourselves) "what happened" in terms of ordered narratives, full of symbolic components, seeking meanings in the "subjective" happenings that accompany the events of chronological time.

There appears to be a particular need to prepare for the vicissitudes of the "fate" that "awaits us"... We need periodic structures to allow for the ordered narratives of our (re)descriptions of ourselves and our surroundings.

A brief comment is in order on our metaphorical constructions regarding the notion of the passing of time. In this sense, the passing of chronological time is usually seen as a movement, allowing for two cases:

1) time as the movement of objects in continuous, linear fashion, with the future moving in our direction and the past being left behind.

2) time as movement across a landscape. In this case, time can also be seen as "stopped", while we are the ones that move through it (e.g., "we are getting to the end of the year, or close to Christmas"). In a word, the metaphorical structure presents "time" passing us from front to back, whether we are standing still and "it" is moving towards us or we are moving towards it, while "it" remains static. That is, something or someone must be moving... There is no other way, since as the poet once said, "Time never stops."

According to Lakoff (1992), such descriptions of time in terms of movement, objects, and places have a biological basis, since our sight specifically detects these phenomena, while we have no specific sensors for the passage of time. In order to perceive time, we must use references obtained from our available visual sensors. Yet we are unaware of this in our daily lives. In fact, this does not even matter in solving everyday problems (and it may be more convenient to ignore it). As Lakoff & Johnson say:

"(...) All this detailed and consistent metaphorical structure is part of our daily literal language for time, so familiar that we normally do not realize that it is a metaphorical construction" (Lakoff & Johnson, 1980:82).

Regardless of the metaphorical configuration of time that comes into play, this preliminary comment requires identifying the "topographical" perspective adopted by this commentator, situated as I am below the Equator, with all the possible sins this may entail, especially the geographical and sociocultural gaps (with unavoidable biases and prejudices) vis-à-vis the players in the Anglo-Saxon scenario, where most epidemiological work and futuristic speculation is produced. I refer to this line of work because of its undeniable influence on the field of epidemiology worldwide. Sooner or later we feel the repercussions of what happens in epidemiology above the Equator.

Although such a speculative undertaking might spark criticism due to the risk of a mistaken reading, I should highlight the pertinence of such an exercise, since it creates possibilities for reflection and perhaps organization vis-à-vis overwhelming and adverse situations (after all, there are always "priorities" defined according to the respective interest groups involved). This paper thus discusses an outline for epidemiology "in the next millennium". In other words, my goal is to call attention to the description of future scenarios (even admitting the decline of futurology, a discipline in vogue in the 1970s) in order to orient what could (or should) be "the best" (insofar as possible) prospects for Sub-Equatorial epidemiology.

While there is an underlying "evaluative furor" in this exercise, justified by the spirit of expectation surrounding this period of "new times", on the other hand a certain indulgence is in order, given the obvious reasons for the fallibility of any current forecast: in addition to the usual "observation biases", there is also a great lack of precision resulting from the instability and rapid pace of today's technoscientific and sociocultural changes.

Outside the walls of academe, astrologers, "wizards", "prophets", "soothsayers", and various crystal-ball "experts" are in great demand. Especially (and for good reason?) at a moment when the natural sciences in general have begun to view most of the systems surrounding us as complex and dynamic, thus highlighting their stochastic character and resulting unpredictability (in deterministic terms). Worse yet, some researchers at "state-of-the-art" research centers, like the Santa Fe Institute in the United States, have begun to doubt the possibility of reaching a unified theory for complex systems. According to them, on the one hand there may be excesses and distortions in so-called "scientific journalism" with regard to such ideas as entropy, chance, chaos, and information, amongst others. On the other, the problems start with the lack of precision in the concept of "complexity". At least 31 definitions of the term have already appeared, to the point where the idea has lost its meaning, and at least one author has bemoaned that we are moving from complexity to perplexity (Horgan, 1995). Whether such issues are pertinent or not, in everyday terms they undermine belief in the redeeming ability of science to mitigate human suffering and to respond to the anxiety identified above in the (re)descriptions of both ourselves and what is going on around us, in the face of the unceasing proliferation, multiplicity, and simultaneity of events...

Before moving on I should touch on two points. First, the "next millennium", or rather its "spirit", is already amongst us... In fact, it has been said that the future began some time ago. Examples abound in other fields of knowledge, in sectors of (bio)technological production, and in the "futuramic" characteristics assumed by sociocultural practices in contemporary societies, along the lines previously hinted at by the genre known as "science fiction".

Predicting the future: a jigsaw puzzle without all the pieces

When we were kids, we used to have fun playing with kaleidoscopes, which would assume various abstract shapes with symmetrical planes [kal(o)- comes from the Greek kállos, "beautiful" (hence calligraphy) (Ferreira, 1986)]. The designs changed their appearance with the mechanical movement of the object, maintaining the same elements through a play of mirrors. We could control the speed of the changes and even carefully stop the toy and show the resulting designs to our friends.

Along the lines of this playful metaphor, the new "kaleidoscopes" are built with electronic microcircuits linked to screens (LCDs or traditional video tubes) and/or internet networks. Their vastly multiplied elements shift cinematically; the images can be animated, scripted, and anthropomorphic, allowing for control (so-called "interactivity") with objectives and phases, (for those who so desire) heavy doses of competition and scoring (videogames), or even "humanoid" demands (see the Tamagotchi, the egg-watch, and its equivalents).

Although it may be a truism, it is important to stress that our observations are a procedure that seeks to demarcate some intelligibility in a hypercomplex, intertwined, and simultaneous picture. Very well: in these times marked by the proliferation of "new kaleidoscopes", mixing the playful and figurative senses (the rapid and changing succession of impressions and sensations), our goal is to ascribe possible meanings to the new "figures" shown to us, viewing them initially as mysteries, like another toy, the puzzle, a word that also designates an "enigma" or "perplexity". The point is that we feel not only delight at the images produced by the neo-kaleidoscopes; we are both obsessed and bewildered by such dazzling virtual aesthetics.

Because of our bewilderment with the speed and proliferation of new enigmas (and their "puzzling effects"), we are in constant need of producing new "solutions", i.e., new meanings (albeit transient, fragile, and local). Thus, while our "game" is now a kaleidoscopic mix of puzzles and enigmas, we must admit that we have no definitive solution, model, or gold standard. There are pieces missing from the puzzle. Meanwhile, other pieces are constantly being added that provide multiple, complex "configurations", depending on the observer's point of view. Thus, more than ever before, faced with the unpredictability of contemporaneity, predicting the future has increasingly become an exercise in providing a feasible order for the present. Such predictive exercises should be viewed in this light.

Describing the "epidemiological situation"

Petersen & Lupton (1996) produced a critical text with brilliant arguments on the results of contributions by risk-factor epidemiology in constituting the "new public health" and its corresponding "new morality". While a rhetoric of regulation is developed through the risk discourse, it is now the "irrationality" of individuals adopting harmful lifestyles that must be approached through the rationalizing light of epidemiology. The authors emphasize the central role of statistical and epidemiological quantification in the construction of epidemiological "truths". In reality, such "facts" are presented under the guise of the neutrality and objectivity of scientific knowledge, without revealing the socially defined contingencies by which epidemiological constructions are built and interpreted. Furthermore, in the public communication of findings, the indetermination and corresponding margin of error inherent to the statistical/epidemiological apparatus when referred to the individual are not usually explained clearly, and when they are, there is no certainty as to the degree or reliability of understanding by the lay public receiving such information. As discussed previously (Castiel, 1998), individuals generally lack the statistical literacy that would allow them to handle the implications of probabilistic reasoning.

Is it still possible to speak of Epidemiology in the singular and with a capital "E"? There is strong evidence that different epidemiologies have taken shape, whose adjectives have become "last names" [in Portuguese and other languages in which adjectives follow nouns – Translator's note], connoting different clans and sparking feuds and disputes for hegemony to achieve capital-letter, dominant status. So as not to dwell too long on this topic, a summary of the essential differences between these watersheds was suggested by Pearce (1996) (slightly modified here), and despite the limitations inherent to summarization, it is sufficiently illustrative (for greater detail, consult the author). On the one hand, there is "traditional" epidemiology (the author uses the term in a favorable light), whose "motivation" is public health and its ideals (promotion, prevention, and the control of injuries to health), pursued through structuralist epistemological strategies with a realist focus, and population-level studies and interventions set in a historical and cultural context, using observational research techniques.

On the other hand, there is "modern" epidemiology, whose "motivation" is scientific/academic, with a predominantly biomedical view using reductionist epistemological strategies of a positivist bent, and studies and interventions at the individual level (in organs, tissues, cells, and molecules...), excluding context, with experimental research techniques (whose basic model is the randomized clinical trial). This scheme, in addition to its poorly disguised Manichaeism, eludes several issues. In principle, it is at least debatable to affirm a clear distinction between the epistemological strategies and approaches of the various watersheds. Furthermore, as analyzed by Ayres (1994), public health underwent a decisive rearrangement in terms of scientific normativeness in the United States with W.H. Frost in the 1940s and 50s, in a process that began in the late 19th century. A hygienist share of the field was incorporated by the state, i.e., public health per se. Another part, in the disciplinary form of preventive medicine, was linked to medical and health care activities, with repercussions on the training of health professionals and the shaping of the corresponding biological knowledge on the human disease process. Epidemiology as a scientific activity took hold in medical schools together with other bioscientific contents in medical training. That is, we are now experiencing the paroxysm of a splitting trend that appears to have generated at least two epidemiologies displaying different structures. One is "sanitary/collective": interventionist, linked to population-based practices, surveillance, disease control, health education, etc., and in a sense a subsidiary of the other, "scientific/academic", which produces evidence on the "natural history of diseases", on which both public health and medicine are expected to base their actions.

Under the scheme of "epidemiological modernity" proposed by Pearce (1996), there is no clear place for the trend called "clinical epidemiology" or "evidence-based medicine". In fact, the latter expression denotes a grammatical inversion signaling the absorption of epidemiological contents by medicine, thereby consummating a disciplinary shift. Clinical medicine becomes the substantive, while the epidemiological techniques supply the qualifier that lends it a suggestively solid basis: evidence!

Barata (1996) identifies the ideological dimensions embedded in this shift, with emphasis on the positivist facet of the technobiosciences, through closely linked myths: 1) the unconditional objectivity of scientific knowledge and its capacity to display "truths"; 2) the power of the probabilistic quantitative arsenal in this process; 3) the notion of unlimited progress in the technological development of products, techniques, and interventions aimed at prevention, detection, and treatment; and 4) the strong belief in the neutrality of scientific enterprise, whose main premise is the dichotomy between subject and object and consequently the control of both, optimizing objectivity and avoiding the hazards of subjectivity.

Despite the trials and tribulations of the struggles for prestige, the situation sometimes shows a picturesque side. A curiously dramatic and unabashed review by Carl M. Shy (1997) in the orthodox American Journal of Epidemiology proceeds literally to a "judgment" of academic/scientific epidemiology. The field's alleged "crime" was to have devoted itself primarily to studies whose main perspective was to "discover" risk factors in the relations between given exposures in groups of individuals and their respective outcomes, a conservative proposal which, according to the prosecution witness (the author), constituted a "failure", since it excluded community and ecological dimensions and their interrelationship with socioeconomic, cultural, and behavioral aspects in the understanding of the individual disease process.

This explicit critical stance by authors from the Anglo-Saxon epidemiological community towards "modern" epidemiology and the assumption of the limitations of risk-factor ideology is quite recent. As suggested in a previous study (Castiel, 1998), it mimics certain aspects of the so-called Latin American social epidemiology of the 1970s, of a Marxist strain. Could it be that the fall of the Berlin Wall in 1989 brought greater freedom for the so-called left-wing intellectuals of the United States, to the point that they can now speak out without fear of a return to witch hunts?

Tracing the future of epidemiology

Let us now turn to a renowned Anglo-Saxon epidemiologist originally from South Africa, now based at Columbia University in New York, together with his son: Mervyn and Ezra Susser (1996). A study divided into two papers (a preliminary version was presented at the Congress of Epidemiology in Salvador in 1995) describes past epidemiological eras and then proposes a picture of the future. The past eras are: that of health statistics, based on the miasma paradigm, in the first half of the 19th century; that of infectious diseases, with the germ theory, in the late 19th and early 20th centuries; and that of chronic/degenerative diseases, in the latter half of the 20th century, with emphasis on the exhaustion of the black box (risk factor) theory.

According to the Sussers, the field's future is shaped by "eco-epidemiology" (whose metaphorical paradigm is that of "Chinese boxes"), characterized by an ecological point of view studying "relations within and between localized structures organized in a hierarchy of levels" (Susser & Susser, 1996:676); an analytical approach involving "[a]nalysis of determinants and outcomes at different levels of organization: within and across contexts (using new information systems) and in depth (using new biomedical techniques)" (Susser & Susser, 1996:676) (here, read especially recombinant DNA manipulation techniques and molecular markers and probes, constituting what is already known as "molecular epidemiology"). The preventive approach is based on "[applying] both information and biomedical technology to find leverage at efficacious levels, from contextual to molecular" (Susser & Susser, 1996:676).

In other words, the Sussers appear to feel that a bright future for epidemiology rests primarily on the transdisciplinary conjugation of bioinformatic techniques with so-called "molecular epidemiology".

I will not dwell here on the undeniably important issues of interdisciplinarity in general and its Collective Health dimensions in particular; interested readers are referred to Almeida-Filho (1997) and the respective debate with other authors. Suffice it to mention the major effort in this direction in successful studies on cholera, a well-known epidemic disease considered paradigmatic in the construction of epidemiological science. Interdisciplinarity has served to propose a consistent predictive model for outbreaks of the disease, involving processes for identifying strains of the cholera vibrio using biochemical techniques (like PCR, monoclonal antibodies, and fluorescently labeled RNA probes), in addition to epidemiology itself, oceanography, ecology, microbiology, marine biology, medicine, satellite image geoprocessing, and (bio)informatic techniques.

This model postulates that the vibrio's hosts include chitinous marine zooplankton such as copepods, tiny crustaceans that are elements in the food chain of fish. The copepod population is a function of global climate changes (like the El Niño phenomenon, which provides rain, brings nutrients from the coastal areas, and heats the ocean), and its movements are related to marine winds and currents. In addition, molecular genetic probes have shown that certain vibrio strains assume a viable and pathogenic state that is nevertheless refractory to culturing in the laboratory. One can thereby detect and count Vibrio cholerae in environmental samples and measure the corresponding degree of contamination (Colwell, 1996).
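
To make the logic of such a model concrete, here is a minimal sketch in Python (the function, its predictors, and all coefficients are invented for illustration; this is not Colwell's fitted model): outbreak probability is treated as a logistic function of a sea-surface temperature anomaly and a copepod-abundance index.

```python
import math

# Illustrative sketch of an environmentally driven outbreak-risk model in the
# spirit of the cholera studies cited above (Colwell, 1996). The predictors
# and coefficients are invented assumptions, not fitted values.

def outbreak_risk(sst_anomaly_c: float, copepod_index: float) -> float:
    """Outbreak probability as a logistic function of the sea-surface
    temperature anomaly (degrees C) and a 0-1 copepod-abundance index."""
    intercept, b_sst, b_copepod = -4.0, 1.2, 3.5  # hypothetical coefficients
    logit = intercept + b_sst * sst_anomaly_c + b_copepod * copepod_index
    return 1.0 / (1.0 + math.exp(-logit))

print(f"baseline:         {outbreak_risk(0.0, 0.1):.3f}")  # cool sea, few copepods
print(f"El Nino scenario: {outbreak_risk(2.5, 0.8):.3f}")  # warm sea, plankton bloom
```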

The expression "bioinformatics" encompasses mathematical and computer techniques employed in the study of biological problems. Such techniques are used increasingly as powerful tools to study natural systems in various branches of biology: ecology, genetics, evolution, immunology, virology, and epidemiology (Levin et al., 1997).

The field incorporates non-linear and non-parametric mathematical methods; the study of the genomic sequences of pathogens (Escherichia and Listeria), the so-called phylogenetic analyses; the investigation of host-agent co-evolutionary interactions; genetic immunoepidemiology; and the modeling of immune response patterns resulting from the complex dynamics between pathogens and the immune system, together with control strategies (Levin et al., 1997). Many new drugs have been developed using these techniques. There are pharmacogenetic prospects for expanding the power to identify genomic characteristics of individuals, grouping them according to their corresponding genotypical configurations, and prescribing more "personalized" and supposedly more effective drugs (Cohen, 1997).

In order to simplify the presentation, issues pertaining to mathematical modeling will be treated separately from the "molecularization" of epidemiological studies. The pertinence of the expression "molecular epidemiology" has been discussed in a previous article (Castiel, 1998), but it is worthwhile to review its descriptive side now and then proceed to the discussion of its basis in molecular biology.

First of all, how does one define the term? Simply speaking, molecular epidemiology consists of the use of biological measures or markers at the molecular level in epidemiological research; in other words, it studies the relations between exposure and disease in populations through methodological approaches proper to epidemiology.

The necessary quantifications or measurements are based on modern molecular biological laboratory techniques, aimed at the following: a) direct detection of changes in molecular structures (both pathogens and individuals susceptible to disease) and b) indirect detection using immunologic techniques to verify the existence of specific molecules from given products of gene activities. The term originated from cancer epidemiology studies using molecular biochemical techniques in the 1980s (McMichael, 1995).

It also serves to: 1) demarcate the gradient of events from exposure to disease (internal dose, biologically effective dose, early biological effect, altered structure/function, and clinical prognostic significance); 2) identify exposures to lower or older doses of presumed noxious agents; 3) reduce classification errors in exposure and disease variables (see the sketch following this list); 4) indicate mechanisms; 5) establish the role of exposure to given factors in the susceptibility and variability of individual response; and 6) expand the verification of risk levels in individual and group terms (Schulte & Perera, 1993).
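
Point 3 deserves a small numerical illustration. The sketch below (all figures invented: exposure prevalence, baseline odds, error rates) shows the well-known pattern that nondifferential misclassification of exposure pulls the odds ratio toward the null, and that a more accurate molecular marker largely recovers the true value.

```python
import random

# Hedged sketch of point 3: nondifferential exposure misclassification biases
# the odds ratio toward the null; a more accurate molecular marker recovers it.
# All numbers (exposure prevalence, odds, error rates) are invented.

random.seed(1)
TRUE_OR, P_EXPOSED, BASELINE_ODDS = 4.0, 0.3, 0.05

def observed_odds_ratio(misclass_rate, n=200_000):
    a = b = c = d = 0  # measured-exposed cases/controls, measured-unexposed cases/controls
    for _ in range(n):
        exposed = random.random() < P_EXPOSED
        odds = BASELINE_ODDS * (TRUE_OR if exposed else 1.0)
        case = random.random() < odds / (1.0 + odds)
        # The measurement flips the true exposure with probability misclass_rate.
        measured = (not exposed) if random.random() < misclass_rate else exposed
        if measured and case:
            a += 1
        elif measured:
            b += 1
        elif case:
            c += 1
        else:
            d += 1
    return (a * d) / (b * c)

print(f"questionnaire, 20% error: OR ~ {observed_odds_ratio(0.20):.2f}")  # pulled toward 1
print(f"biomarker, 2% error:      OR ~ {observed_odds_ratio(0.02):.2f}")  # near the true 4.0
```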

Groups of studies have emerged in the United States that have begun to discuss relative risk/benefit issues involved in the transition of predictive genetic tests from basic research to clinical practice. The benefits are evident: screening of various diseases in neonates, making early intervention possible in many cases. But for diseases like breast cancer, despite the availability of predictive genetic tests, there is still no evidence that preventive measures or optimum treatment are devoid of risk or fully effective. The risks can be summed up in the issue of "predictive uncertainty" as to the occurrence of future disease vis-à-vis some tests. This also applies to non-genetic tests (Holtzman et al., 1997). In fact, this is still one of the crucial problems in risk as a probabilistic category for exposed individuals in clinical contexts. Physicians (and patients) generally feel alone at such times, with no data as to the validity and utility of recently developed tests. And I believe that even when there is access to such data, decisions have not become substantially safer or guaranteed.
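
This "predictive uncertainty" has a simple quantitative face, worth sketching (the sensitivity, specificity, and prevalences below are illustrative figures only): by Bayes' theorem, even an apparently excellent test yields a modest positive predictive value when the condition it predicts is rare.

```python
# Illustrative sketch only: the positive predictive value (PPV) of a test
# collapses as the predicted condition becomes rare, whatever its
# sensitivity and specificity. Figures are invented.

def positive_predictive_value(sens, spec, prevalence):
    true_pos = sens * prevalence
    false_pos = (1.0 - spec) * (1.0 - prevalence)
    return true_pos / (true_pos + false_pos)

for prevalence in (0.10, 0.01, 0.001):
    ppv = positive_predictive_value(sens=0.95, spec=0.95, prevalence=prevalence)
    print(f"prevalence {prevalence:>6.3f} -> PPV {ppv:.2f}")
# prevalence 0.100 -> PPV 0.68; 0.010 -> 0.16; 0.001 -> 0.02
```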

We should also mention the existence of a "molecular" watershed in the epidemiology of infectious/contagious diseases. The approach's principles are based on the fact that bacterial genes coding for molecules that perform the basic maintenance of the microorganism's structure and function do not undergo major changes over the course of evolution. Other genes, on the other hand, such as those coding for cell membrane proteins, are under strong selective pressure. Due to the common origin of medically relevant bacteria, one can construct the respective evolutionary trees based on the analysis of the genes coding for the conserved macromolecules (McDade & Anderson, 1996).

Sequencing of other group-specific variable bacterial genes is used to type strains and identify differences between such bacteria. Although it is not possible to construct an evolutionary tree for all viruses, since there are no conserved molecules as there are with bacteria, there are conserved and variable genes allowing for the identification of relations within viral groups. This technique is also called phylogenetic analysis (McDade & Anderson, 1996).
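
A toy version of this typing logic may help (the sequences below are invented fragments, not real loci): conserved genes barely differ between strains, while variable genes separate them.

```python
from itertools import combinations

# Toy strain typing: each strain carries a "conserved" and a "variable" gene
# fragment; both sequences are invented for illustration.

strains = {
    "strain_A": ("ATGGCTA", "ATCGGCTAAGTC"),
    "strain_B": ("ATGGCTA", "ATCGGATAAGTT"),
    "strain_C": ("ATGGCTT", "TTCGAATACGTT"),
}

def hamming(s, t):
    """Count mismatched positions between two equal-length sequences."""
    return sum(a != b for a, b in zip(s, t))

for (name1, (cons1, var1)), (name2, (cons2, var2)) in combinations(strains.items(), 2):
    print(f"{name1} vs {name2}: conserved {hamming(cons1, cons2)} diffs, "
          f"variable {hamming(var1, var2)} diffs")
```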

Such procedures serve to: 1) study outbreaks of diseases of unknown origin (e.g., hantavirus, a respiratory disease with a high case-fatality rate); 2) detect and identify culture-resistant bacteria (e.g., in Whipple's disease, a systemic disease involving arthralgia, abdominal pain, diarrhea, malabsorption, and wasting); 3) establish unusual forms of disease transmission (e.g., AIDS and HIV-positive dentists); 4) verify long incubation periods in rabies infections (bites in immigrants having occurred in their countries of origin more than six years previously); and 5) perform paleomicrobiology (geographical identification of the origins of retrovirus strains in the case of HIV and HTLV-I) (McDade & Anderson, 1996).

Even so, it is important to highlight that, in the current state of the molecular arts, what can be noted is that exposures to presumed external carcinogenic agents lead to the formation of DNA alterations (adducts) in target tissues. This does not necessarily establish causal nexuses, since elements are still missing at the individual level to sustain the relationship between such molecular alterations and the genesis of cancer (McMichael, 1995). In other words, even with the vigorous evidence backing the determinant role of certain biomarkers in carcinogenesis, exceptions to associations viewed as causal have not been unconditionally ruled out (Vineis & Porta, 1996).

Discussing the scope and limitations

Bioinformatics

The greatest computational challenge in highly non-linear stochastic systems is the representation of complexity and the impact of control measures. Depending on the problem, all scales can be important, from the individual to the greater metropolitan level. The central issue is the following: how does one effectively adjust and calibrate the number of elements in the model to a given context?

There are many epidemiological studies approaching the dynamics of infectious diseases from the above-mentioned perspective (see Levin et al., 1997). Still, mathematical modeling and computer simulation techniques must contend with such complicating factors as the interactions between spatial and genetic heterogeneity, non-linearity, and stochasticity. Even greater problems for modeling in epidemiology are posed by the variability of transmission across social and geographical space and by the diversity and heterogeneity of individuals. How, and at what level, can one represent spatial variations in the intrinsically non-linear contact processes underlying transmission? An example is the highly dynamic spatial and temporal patterns of the AIDS epidemic and the possibility of chaotic, non-linear dynamics in the establishment of complex transmission networks (with high degrees of imprecision) (Levin et al., 1997).
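
The simplest face of this stochasticity can be shown with a minimal chain-binomial SIR model (a sketch with illustrative parameters, not a model of any real disease): identical parameters and initial conditions yield anything from early extinction to a major epidemic, purely by chance.

```python
import random

# Minimal stochastic (chain-binomial) SIR sketch with illustrative parameters:
# the same inputs produce widely different epidemics, run to run.

def final_epidemic_size(n=1000, i0=2, beta=0.3, gamma=0.1, seed=0):
    rng = random.Random(seed)
    s, i, r = n - i0, i0, 0
    while i > 0:
        p_infect = 1.0 - (1.0 - beta / n) ** i  # per-susceptible risk this step
        new_infections = sum(rng.random() < p_infect for _ in range(s))
        new_recoveries = sum(rng.random() < gamma for _ in range(i))
        s = s - new_infections
        i = i + new_infections - new_recoveries
        r = r + new_recoveries
    return r

print([final_epidemic_size(seed=k) for k in range(10)])
# typically a mix of very small outbreaks (chance extinction) and major epidemics
```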

And we should remember that mathematical models can provide an unjustifiable feeling of verisimilitude. In reality they are imitations or simulations that attempt to represent what really occurs in dynamic or complex processes and systems. According to philosopher Naomi Oreskes of Dartmouth College, quoted by Horgan (1995:77), "Verification and validation of numerical models of natural systems is impossible." At best, one can obtain partial, approximate knowledge, mainly because such models deal with "open" systems. Affirmations that can be firmly verified (or validated) are those dealing with "closed" systems, in which all of the variables are taken into account and are amenable to monitoring through mathematical logic and algorithmic approaches.

Oreskes emphasizes the rhetorical power of mathematical models and their potential for convincing people based on the assumption of their capacity to represent "reality". By analogy with literary works, which can have characters based on either existing or fictitious facts and persons, the following crucial question arises: how much of the respective elaboration is based on: 1) the observation and measurement of accessible phenomena; 2) presumably consistent, well-informed assessments; and/or 3) convenience (Horgan, 1995)? Isabelle Stengers (1993) sees models largely as "mathematical fictions". Furthermore, they constitute a new modality for putting fictions to the test. With the new perspective raised by the development of (bio)informatic techniques, the use of increasingly powerful computer systems as simulation tools has led to the rise of "new sophists" in the scientific milieu.

Stengers (1993:153) refers to "researchers whose involvement no longer relates to a truth which lays fictions to rest, but to possibilities, whatever the phenomenon may be, for the mathematical fiction that reproduces it". The same author quite properly raises the ethical issue of simulation: to "what" does an investigation performed on virtual molecules and populations refer? To what extent are such studies performed exclusively on abstractions, and what are their representational links to "true" elements belonging to the so-called "real" world? What kind of enunciates, therefore, can they generate? They obviously no longer constitute experimental or observational findings.

In a word, what kind of data or findings are obtained or produced by simulation studies? This situation challenges the idea of truth as an arrangement between explanation and "reality", a notion dear to the natural sciences. Such contingencies, in which the notion of virtuality imposes itself, further subvert the organization and consistency of scientific disciplines and knowledge.

Lévy (1996) proposed a scheme to deal with this order of problems. According to him, any event can: 1) be latent in its virtuality, and exist as such; or 2) manifest itself in its actualization, and thus occur. In this sense, actualization invents a form of happening as a modality of creation (Lévy, 1996). The "temporality of actualization", according to Lévy (1996), "is that of processes. (...) To the extent that there are as many temporalities as vital problems, virtualization moves in time with the times. Virtualization emerges from time to enrich eternity. It is the source of times, processes, and histories, since it commands actualizations without determining them. Creator par excellence, virtualization invents questions, problems, devices that generate acts, process lineages, and machines for the future" (Lévy, 1996:139-140).

I do not feel that Lévy has solved the problem satisfactorily. In my opinion, a brief observation suggests the risk of semantic unraveling: if the event "exists" at one level and "happens" at another, what does it mean "to exist" after all? That is, we find ourselves in the midst of ontological issues, in a strange context where the borders between what is possible, real, virtual, and actual become fuzzy.

The triumphant tone employed by Lévy suggests a deification of Virtuality [whereby the capital "V" becomes a "logical imposition" (!?)]. Indeed, along this line of reasoning, the actualized event would be a "manifestation" of (and for) virtuality. Would it thus be mandatory to believe that virtuality possesses the (omni)potence of "existing" in order to become an act, i.e., "to happen"? Deriving from this elaboration is the establishment of processes which ineluctably constitute stages or phases of the happening, which, we stress, may or may not actually happen.

From the biological point of view, Lévy's reasoning would be quite applicable to bacteria, which reproduce by fission and occasionally undergo mutations under influences from their context. But a fertilized human egg constitutes a happening of a quite different order from that of an adult organism. One can even conceive of them as distinct happenings, albeit linked to each other. A human egg appears not to possess a mind, while an adult organism appears to possess one. Here, the cautious use of the verb "to appear" reflects our intent not to delve into animist theological discussions. Indeed, note how Lévy leads us towards such issues.

In a word, despite Lévy's efforts, problems remain with the relations between what is possible, real, virtual, and actual, as do problems with their definitions. If whatever is consistent, externally produced, objectifiable, reproducible, and amenable to shared description (and interactivity) is thus valid, would it be absurd to conceive of the paradoxical image of a "true hallucination"? Perhaps the most appropriate pathway through this state of things is to assume the condition of entities with intermediate statutes: hybrids generated by virtual simulations and images, mixtures of "reality" and "representation" that are not symmetrically shared. According to Philippe Quéau (1994), virtual images are a mixture of idol and icon, but with a predominance of the former, as long as we understand /idol/ in the sense derived from its Indo-European root, that of "knowing", and /icon/ as an image that seeks to capture similitude (Quéau & Sicard, 1994): images of reality that produce and multiply knowledge, without further concern over defining them according to their statute as real or virtual objects. But their ethical effects are certainly important, in light of the potential for twisting the role of image-reproduction techniques as documentary proof of facts...

Molecular biology

We turn now to the pertinence of the theoretical and epistemological contents conveyed by molecular biology, whose links to molecular epidemiology (with or without quotation marks) are obvious. This leads unavoidably to a problem: how does one produce a balanced description, with both synthesis and depth, without committing improprieties or neglecting essential aspects of the field, especially when the observer-interpreter's point of view is situated in the epidemiological field...? For better (or worse), at this stage of the game, the allusion to (more) difficulties should not prevent us from continuing our exercise. The greatest risk is to scare off any erstwhile and understanding readers once and for all, and perhaps to fuel the fires of our critics...

Molecular biology (MB) emerged as a discipline through the fusion of chemistry and biology, with the creation of techniques possessing a language of their own, whose object is biological macromolecules (Atlan, 1986). Several expressions juxtapose aspects that are correlates of the so-called MB field. Two are particularly evident: biotechnology and genetic engineering. Technical vigor appears in both, ruled by the criteria of productivity, applicability, and efficacy. The term "engineering" itself derives from the notion of "engines": skills and devices allowing one to overcome opposing forces.

In the technological field, more and more double-faced products and processes for use by men are invented by engineers whose power appears in the "great river of technique, which in overflowing is capable of both fecundating the adjacent plains and causing irremediable erosion, dragging topsoil and causing pollution, relieving men of their burden and subjecting them to new obligations, elaborating a contest that manufactures as many 'winners' as it does outcasts, developing communications fostering improved 'communion' even while multiplying the number of 'excommunicates'" (Lesgards, 1994:11).

This sharp diagnosis by Lesgards is accompanied by a frightening statement. The intellectuals who purport to dwell on "what is going on" and to produce reflections concerning the world around them have never lagged so far behind the changes produced in the vortex of the prevailing technological trend, perhaps due to the fact that the simultaneously proliferative and dizzying effects have uniquely altered our ways of seeking order in the world by subverting notions of time and space, identity, relations with the body, thought, and disease (Lesgards, 1994).

Although it may seem obvious, note that current biotechnological engines require deep and urgent reflection and investigation. In our case, manipulative techniques used on human beings are a particularly hot issue in relation to Lesgards's topics. But what will our "bio-point of view" be? How do we conceive of and deal with the life sciences today? With which analytical tools? Under which epistemological premises? Or are these questions not relevant? They are. But the "engineers" (genetic and otherwise) are preoccupied with more "concrete" "things": to efficaciously produce new (bio)technical objects and make them available as quickly as possible.

In other words, I believe we should doubt whether current concepts and instruments, based on games of language and symbolic analysis, are consistent enough to "monitor" and understand what is going on in the technobioscientific world. To accompany this milieu, I believe we must delve into the "biotechnicalities" and seek, insofar as possible, to accompany their unceasing production, even while knowing that we are at a disadvantage in this "race". It is quite difficult to attempt to decode, translate, and almost simultaneously reflect on the multifaceted repercussions of the technobioscientific field and its prolific production when one is removed from the production centers and/or lacks the minimum technical training (whatever that level might be) for such an undertaking.

Various editions of Science expressed the first doubts as to the reliability of the experiment that produced Dolly. The original paper had been published in Nature. Both journals are sources of frequent consultation by health, science, and technology columns from the lay press. Two articles in Science covered the following themes:

1) Recognition of the pertinence of Carl Woese's theory (two decades after he formulated it) concerning the existence of a domain of unicellular living beings different from all other one-celled beings.

This new branch, called Archaea (including the extremophiles, beings living at extremely high or low temperatures and with great biotechnological potential), completely altered the make-up of the evolutionary tree of living beings, with its two consecrated branches, Bacteria and Eukarya (where we are located, on some branch). The point here is not to dwell on Woese's methods in the 1970s, but to point out that his findings have been confirmed by current sophisticated molecular techniques. Yet his study received no recognition whatsoever when it was published in the Proceedings of the National Academy of Sciences. Woese was considered an introvert; he never attended the scientific meetings of the microbiology societies, and some considered him a "nut". The point was that his article was ignored by the more prestigious microbiologists of the time (Morell, 1997).

Events like this are hardly uncommon. In the field of genetics, the lack of recognition for Mendel's seminal work is infamous. Genetics historians point out that his groundbreaking study was originally published in a minor journal.

2) The recent appearance of a "new" sub-discipline, functional genomics (FG), a field predictable in logical terms, though still poorly defined, yet already frequently cited in the specialized domains (Hieter & Boguski, 1997). Very well: while the term "genome" (an organism's set of genes and chromosomes) was coined over 75 years ago, "genomics" was created only in 1986, to define the discipline in charge of mapping, sequencing, and analyzing the genome. Genomics can now be divided into structural genomics (the complete, high-resolution transcription of the physical genetic maps of an organism's DNA) and functional genomics (the application of structural knowledge to establish the genes' functions, based on statistical and bioinformatic techniques).

"The fundamental strategy (...) is to expand the scope of biological investigation from studying single genes or proteins to 'studying all genes or proteins at once' in a systematic fashion (...) [my emphasis]. Functional genomics promises to rapidly narrow the gap between sequence and function and to yield new insights into the behavior of biological systems" (Hieter & Boguski, 1997, op. cit: 601).

The article quoted above describes studies, ranging from the complete yeast genome to genetic approaches to the diagnosis, prognosis, and treatment of cancer, that could already be included under this new "heading" (Hieter & Boguski, 1997). Strictly speaking, will "functional genomics" turn out to be (or is it already) an important field, thus meriting our efforts to accompany its results? FG is already being viewed as the form the human genome project will assume in the next millennium, after its descriptive/structural phase (Morel, 1997). Or would it be more appropriate to consider it just a passing fad turning the technobiosciences into a spectacle? For better (or for worse?), how much of each (or both) of the above will it turn out to be? Rare indeed are the situations readily discernible in dichotomous terms (black-and-white, when the norm is usually shades of gray), as if it were possible, from an analysis in the heat of events, to reach conclusive judgments of this magnitude vis-à-vis the emergence of a field or discovery. In general, it is impossible to quickly perceive the innocuousness of a "finding" (hence the quotation marks) like "cold fusion", which proved to be a fluke and was relegated to the past. At any rate, there is strong evidence for the relevance of functional genomics. But will it be possible some day "to study all genes or proteins at once" in human beings and measure their effects? The most sensible answer is inconclusive: maybe...
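
As a purely illustrative aside on what the structural/functional division means in practice (the DNA fragment and the find_orfs helper below are invented), a minimal "structural" step might be scanning a sequence for open reading frames, the raw material to which functional genomics would then try to assign roles.

```python
import re

# Toy annotation step: scan a made-up DNA fragment for open reading frames
# (ATG ... stop) on one strand. Everything here is invented for illustration.

def find_orfs(dna, min_codons=4):
    """Yield (start, end, sequence) for ATG...stop stretches of at least min_codons."""
    pattern = re.compile(r"ATG(?:[ACGT]{3})*?(?:TAA|TAG|TGA)")
    for match in pattern.finditer(dna):
        if (match.end() - match.start()) // 3 >= min_codons:
            yield match.start(), match.end(), match.group()

fragment = "CCATGAAACCCGGGTTTTAGGCGATGCTTTGA"
for start, end, orf in find_orfs(fragment):
    print(f"ORF at {start}-{end}: {orf}")
```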

Such examples indicate the great difficulties one faces today in keeping up-to-date and certain of the pertinence of findings presented in the main publications of one's respective field, plus their intersections (not to mention the countless internet sites and links...). I fear this may be the scenario that is unfolding: the great probability of exceeding our capacity to follow and understand the minutiae and spin-offs of what is produced (to the point of saturation) in our areas of interest. There is a plethora of information...

But let us be optimists. Some issues are amenable to specific treatment, providing ways to deal with given orders of problems. Following the line of thinking of Lesgards, Sheps, and Tarnero, an argument worthy of note is developed by Gilbert Hottois (1994), who notes that "what characterizes modern science is the break with symbolic discourse and metalinguistic speculative knowledge. Neither technique nor mathematics belongs to the order of language (...). Games are created which are not new language games, although language is not totally excluded and frequently intervenes (...). Within these games (...) things are not decided through conversation, but through calculation (increasingly performed by computers) and technophysical exchange, whether efficacious or not (...)" (Hottois, 1994:63). From this perspective, studying the concept of information would appear fruitful.

Seeking information

According to Jorge (1993), one can postulate three fundamental concepts for so-called molecular biology: information, adaptation, and self-organization (or autopoiesis). I believe it would not be outrageous to add two more: evolution and natural selection. Yet, as we shall see, the notion of information is particularly important and will be the object of our attention.

Scholars generally establish inaugural moments. With the emergence of the notion of information as a quantifiable element, references tend to converge on the classic work of Shannon and Weaver, The Mathematical Theory of Communication (1949), in which the authors develop a theory for measuring the amount of information in a message transmitted over a communications channel. They create the notion of "binary digits" (bits), the units of basic information for the functioning of computer systems, along with the forms of their mathematical treatment and ways of calculating and determining the storage capacity of these elements for processing and transmission. In other words, if computing operates on symbols, bits constitute the units of these symbols (devoid of meaning) that allow such operations. This is all quite trivial for any novice in today's computer arts. But such were the beginnings of cybernetics (now referred to as first-order cybernetics), the discipline whose seminal text, Norbert Wiener's Cybernetics, was published in 1948 and which would deal with "information" and shape it into "programs"...

Nevertheless, how does the calculable concept of "information", with its heavy mathematical, statistical, and cybernetic content, extend to molecular biology? Maria M. A. Jorge (1993) and J.-P. Dupuy (1995) trace this transition in similar fashion. According to Jorge, the "intellectual infrastructure of molecular biology" is situated in the complementarity between physics and genetics as postulated by Niels Bohr and developed by one of his disciples, Max Delbrück.
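
Parenthetically, the calculable quantity just mentioned is easy to state concretely. A minimal sketch (the toy messages are arbitrary): the average information per symbol of a message, in bits, computed from its symbol frequencies.

```python
import math
from collections import Counter

# Shannon's measure in miniature: average information per symbol, in bits,
# from symbol frequencies. The toy messages are arbitrary.

def entropy_bits(message):
    counts = Counter(message)
    n = len(message)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

print(f"{entropy_bits('ABABABAB'):.3f} bits/symbol")  # two equiprobable symbols: 1.000
print(f"{entropy_bits('AAAAAAAB'):.3f} bits/symbol")  # skewed source: 0.544
print(f"{entropy_bits('ABCDABCD'):.3f} bits/symbol")  # four equiprobable symbols: 2.000
```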

During the 1940s, studies by Delbrück's group convinced him that genes were molecules viewed from the perspective of quantum physics. But there appeared to be a biological principle of uncertainty which hampered an understanding of the genetic minutiae. The two disciplines were drawn together by the discovery of new laws of physics (Jorge, 1993).

The ideas raised by information communications theory and feedback regulation served initially as a new "language game" to approach the phenomena of heredity and genetics. Thus emerged such concepts and terms as "information", "program", "code", "message", "translation", and "transcription".

Another physicist, Erwin Schrödinger, asked the question in 1944 (in the form of a book), What is Life?, and suggested that an answer to the mechanisms of heredity and genetics might come from physical laws (Dupuy, 1995). Fox-Keller (1995) points out that it was Schrödinger who provided the notion of the chromosome as script/code.

According to Fox-Keller (1995), the very expression "information", carrying strong metaphorical connotations since it was explored in the 1950s by Watson and Crick, the discoverers of the DNA double helix, converged towards the notion of instruction. A biologist and historian of science, Fox-Keller analyzed the evolution of the concept over the 20th century and how the original sense of information theory was not maintained in descriptions of the functioning of nucleic acids in protein synthesis. Furthermore, this perspective turned the genetic code into a "message" (see messenger RNA) assuming the form of "orders". Today, the predominant points of view see genes as cause, machines as organisms, and organisms as messages. It is essential to be clear that all language is not only descriptive but also "performative", i.e., socially constructed and context-dependent, and must thus be evaluated as to its effectiveness and not according to true/false criteria (Fox-Keller, 1995).

According to Jorge (1993), one can thus classify molecular biologies into two fundamental watersheds (with intermediate areas): 1) the "official" one, based on the notion of "order from order", in which the living being results from stable processes of ordered construction, by regular, unvarying repetition, such that sooner or later such mechanisms will be discovered (the human genome project appears to feed on this perspective); and 2) the "other" one, whose central idea is "order from disorder" (or noise), in which unpredictability, randomness, instability, bifurcations, and the imponderable are crucial to the genesis of living beings.

I believe, perhaps in simplistic terms, that there are situations where both approaches may apply (in an example mentioned above, the first is particularly applicable to viral, bacterial, and correlate forms and the second is more in keeping with the human experience). In other words, here we are with the recurrent problem of measuring the proportions of the "nature(inborn)/nurture(acquired)" conundrum in the constitution of various living beings (without delving into another recurrent and thorny terrain, that of defining which living beings possess a mind, whatever that is...).

It is also important to highlight Jorge's (1993) emphasis that the current vigor of the notion of "information" can be ascribed to the fact that it serves both the molecular watershed of order (neo-mechanism) and that of disorder (neo-vitalism). In the first case, the notion is linked to calculating and processing so-called informational units (like bits), as applied in the above-mentioned field of bioinformatics. If life is information (and this is the hypothesis of "ordered" molecular biology), then living beings can be explained on the basis of their algorithmic information content (AIC) (Gell-Mann, 1996) [an algorithm, following Alan Turing's concept of the computational machine, being "given sequences of logical/mathematical instructions oriented in a specified direction" (Atlan, 1991:217), or, more simply, a rule (or set of rules, i.e., a program) to calculate/compute something (Gell-Mann, 1996)].
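
The flavor of this algorithmic reading of life can be suggested, very loosely, by a compression experiment (a crude sketch: true algorithmic information content is uncomputable, and compressed size is only a rough upper bound; the sequences are invented): an "ordered" sequence compresses far more than a random-looking one of the same length.

```python
import random
import zlib

# Crude gesture at algorithmic information content: compressed size as a
# rough upper bound on the shortest description. Sequences are invented.

random.seed(0)
ordered = "ACGT" * 250                                       # 1000 bases, "order from order"
noisy = "".join(random.choice("ACGT") for _ in range(1000))  # 1000 bases, "disorder"

for label, seq in (("ordered", ordered), ("random", noisy)):
    compressed_size = len(zlib.compress(seq.encode()))
    print(f"{label:>7}: {len(seq)} bases -> {compressed_size} bytes compressed")
```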

From this perspective, the complexity of biological systems becomes measurable and computable, and above all open to manipulation. This is the position taken by neo-Darwinian Daniel Dennett (1995), who considers evolution by natural selection an algorithmic process occurring in the molecular record of nucleic acids. In his opinion, "Darwin's dangerous idea" pertains to the fact that the "algorithmic level is the level that best accounts for the speed of the antelope, the wing of the eagle, the shape of the orchid, the diversity of the species (...)" (Dennett, 1995:59), even without any obligation to produce such characteristics (and, by extension, without any need to arrive at us). Neuronal functioning and so-called cybernetic systems (analogically called neural networks) also obey algorithmic, and thus intelligible (i.e., modelable), rules from the point of view of a computational neo-mechanism. However, this "computational surveillance" of humans, based on the notion of "cold, calculating information", was challenged by the so-called second-order cybernetics movement captained by Heinz von Foerster (1991). This Vienna-born physicist was one of the forerunners of the notion of information as the underlying element in the self-organization of living beings, which work with information through recursive, autonomous, and self-referring processes, such that they organize themselves and "reality" in infinite circles, in an association of information with life and knowledge. In humans this occurs through the specificity of the human mind, which allows one to be conscious of science itself: to operate oneself with science (the etymological root of conscience).

Such propositions bring about a rapprochement between cybernetics, biology, ontology, and epistemology (both in the sense of questions about knowing and of possible answers concerning the issue of knowledge). Cybernetics looked at itself and proposed as its enunciates the questions of what it means to exist and to know [with their derivations vis-à-vis the observer-subject (who knows?) and the observed-object (what is known?)] (von Foerster, 1991). However, one of the risks of this perspective is falling into a kind of neo-vitalism: the reduction of the biological sphere to the psychic/mental, which after all possesses particular cognitive properties. Such properties originate from emerging "complexological" models, the "interest in which", according to Atlan (1991), "lies in establishing how structures and functions are produced that play the role of creating meanings in the eyes of an objective observer. From there on, such models are confused with the immediate, unique experience of our subjectivity (...) [We confuse] the form of creativity we perceive and describe in certain natural phenomena with that of our own spirit" (Atlan, 1991:110). In general, when we approach evolutionary phenomena in macromolecules and thereby apply informational notions, we are performing analogical/metaphorical transpositions ("nomadism") of concepts across different orders of organization. This occurs whether we affirm either of the following:

a) that evolution occurs by natural selection at the (molecular) level of algorithmic information contents.

This belief fosters the so-called gene fetishists: those who take genomics as the code-of-codes, the book-of-books, the Holy Grail, and the gene as an exclusively material entity, the bearer of a heavily deterministic causal action, a thing in itself. Fetishes, as the substitutes they are, lend the genome a concreteness for operational ends. Their function is to make things appear well demarcated and controllable, something which it is occasionally possible to conceive of and which, above all, allows one to operate. But under many circumstances this proposition is untenable, since "the reality and materiality of the genome is simultaneously semiotic, institutional, machinic, organic, and biochemical" (Haraway, 1997:99) and thus depends on the context and is difficult to control or predict; or

b) that biological systems result from the ways in which organisms exchange "information" with their milieu, and that we, subjects/observers, study them as objects/observed in the form of couplings, under the premise that to exchange and process information is to know, which in turn is to live...

In humans, to live is more than to know, which is more than to process information. Yet currently, "'Life', materialized as information and signified by the gene, displaces 'Nature', preeminently embodied in and signified by old-fashioned organisms" (Haraway, 1997:134). Incidentally, Dennett's peculiar verve (1996) inadvertently illustrates this shift quite clearly. The American philosopher goes so far as to call the evolutionary process by natural selection "Mother Nature". Yet it would appear that this denatured mother refuses to nurse either the mineral kingdom (with its quakes and volcanoes) or meteorological phenomena...

Finally, at the risk of otherwise sustaining conceptual stances whose ideological spin-offs and sociocultural repercussions are problematic, to say the least, it is essential to distinguish between "information" as constituent potential and knowledge, which is what actually occurs in the ordering and integration of the various "(in)formative" elements. The rationalizing discursive pressures exerted by epidemiology (both current and future) through its scientific models of intelligibility are undeniable. But rather than taking them as unconditional, ineluctable truths, it is essential to discern, within such proposals for knowledge, the premises and vicissitudes of the constitution of their elements for our knowledge of and intervention in health, in addition to their functions in the possible idiosyncratic interpretation and creation of meanings in the lives (however they may be...) of each and every one of us.
References
ALMEIDA-FILHO, N., 1997. Saúde coletiva e transdisciplinaridade. Ciência e Saúde Coletiva, 2:5-52.         

ATLAN, H., 1986. A Tort et à Raison. Intercritique de la Science et du Mythe. Paris: Seuil.         

ATLAN, H., 1991. Tout, Non, Peut-être. Éducation et Verité. Paris: Seuil.         

AYRES, J. R. C. M., 1994. Epidemiologia e Emancipação. São Paulo: Hucitec/Rio de Janeiro: Abrasco.         

BARATA, R. B., 1996. Epidemiologia clínica: nova ideologia médica? Cadernos de Saúde Pública, 12:555-560.         

CASTIEL, L. D., 1998. Apocalipse... Now? Molecular epidemiology, predictive genetic tests, and social communication of genetic contents. Cadernos de Saúde Pública (in press).         

COHEN, J., 1997. The genomics gamble. Science, 275:767-776.

COLWELL, R. R., 1996. Global climate and infectious disease: the cholera paradigm. Science, 274:2025-2031.         

DENNETT, D. C., 1995. Darwin's Dangerous Idea. Evolution and the Meanings of Life. New York: Touchstone.         

DENNETT, D. C., 1996. Kinds of Minds. Towards an Understanding of Consciousness. New York: Basicbooks.         

DUPUY, J.-P., 1995. Nas Origens das Ciências Cognitivas. São Paulo: Unesp.         

FERREIRA, A. B. H., 1986. Novo Dicionário da Língua Portuguesa. Rio de Janeiro: Nova Fronteira.         

von FOERSTER, H., 1991. Las Semillas de la Cibernetica. Obras Escogidas. Barcelona: Gedisa.         

FOX-KELLER, E., 1995. Refiguring Life. Metaphors of Twentieth-Century Biology. New York: Columbia University Press.         

GELL-MANN, M., 1996. O Quark e o Jaguar. As Aventuras no Simples e no Complexo. Rio de Janeiro: Rocco.         

HARAWAY, D. J., 1997. Modest_Witness@Second_Millennium.FemaleMan©_Meets_OncoMouse™. Feminism and Technoscience. New York: Routledge.

HIETER, P. & BOGUSKI, M., 1997. Functional genomics: It's all how you read it. Science, 278:601-602.         

HOLTZMAN, N. A.; MURPHY, P. D.; WATSON, M. & BARR, P. A., 1997. Predictive genetic testing: from basic research to clinical practice. Science, 278:602-605.

HORGAN, J., 1995. From complexity to perplexity. Scientific American, 272:74-79.         

HOTTOIS, G., 1994. La science post-moderne. Actes du colloque de l'Institut Piaget. Institut Piaget.

JORGE, M. M. A., 1993. Da Epistemologia à Biologia. Lisboa: Instituto Piaget.         

LAKOFF, G. & JOHNSON, M., 1980. Metáforas de la Vida Cotidiana. Madrid: Cátedra.

LAKOFF, G., 1992. The contemporary theory of metaphor. In: Metaphor and Thought (A. Ortony, ed.), pp. 5-32, Cambridge: Cambridge University Press.         

LESGARDS, R., 1996. Prefácio. In: O Império das Técnicas (R. Scheps, org.), pp. 9-13, Campinas: Papirus.         

LEVIN, S. A.; GRENFELL, B.; HASTINGS, A. & PERELSON, A. S., 1997. Mathematical and computational challenges in population biology and ecosystems science. Science, 275:334-343.

LÉVY, P., 1996. O Que é o Virtual? Rio de Janeiro: Ed. 34.         

McDADE, J. E. & ANDERSON, B. E., 1996. Molecular epidemiology: applications of nucleic acid amplification and sequence analysis. Epidemiologic Reviews, 18:90-97.         

McMICHAEL, A. J., 1995. La 'epidemiología molecular': ¿Nueva ruta de investigación o compañero de viaje? Boletín de la Oficina Sanitaria Panamericana, 119:243-254.

MOREL, C. M., 1997. Personal communication.

MORELL, V., 1997. Microbiology's scarred revolutionary. Science, 276:699-702.         

PEARCE, N., 1996. Traditional epidemiology, modern epidemiology, and public health. American Journal of Public Health, 86:678-683.         

PETERSEN, A. & LUPTON, D., 1996. The New Public Health. Health and Self in the Age of Risk. London: Sage.         

QUÉAU, P. & SICARD, M., 1996. Novas imagens, novos olhares. In: O Império das Técnicas (R. Scheps, org.), pp. 115-126, Campinas: Papirus.         

SCHEPS, R. & TARNERO, J., 1996. Introdução. In: O império das técnicas (R. Scheps, org.), pp. 15-22, Campinas: Papirus.         

SCHULTE, P. A. & PERERA, F. P., 1993. Molecular Epidemiology. Principles and Practices. San Diego: Academic Press.         

SHANNON, C. & WEAVER, W., 1949. The Mathematical Theory of Communication. Urbana: University of Illinois Press.

SHY, C. M., 1997. The failure of academic epidemiology: witness for the prosecution. American Journal of Epidemiology, 145:479-484.         

STENGERS, I., 1993. L'Invention des Sciences Modernes. Paris: La découverte.         

SUSSER, M. & SUSSER, E., 1996. Choosing a future for epidemiology: I. Eras and paradigms. American Journal of Public Health, 86:668-673.         

SUSSER, M. & SUSSER, E., 1996. Choosing a future for epidemiology: II. From black box to Chinese boxes and eco-epidemiology. American Journal of Public Health, 86:674-677.         

VINEIS, P. & PORTA, M., 1996. Causal thinking, biomarkers, and mechanisms of carcinogenesis. Journal of Clinical Epidemiology, 49:951-956.         
