Ethical aspects of genetic engineering and biotechnology


Stefano Fait 

A breeder of people should possess a supermanly foresight. But it is precisely those persons who are ethically and spiritually superior that are conscious of their weaknesses, and would not volunteer for such a tribunal, much the same as earlier on it was certainly not the best people who pressed for the office of Grand Inquisitor
Oscar Hertwig, German cell biologist, 1849–1922.

What is the ape to man? A laughing-stock, a thing of shame. And just the same shall man be to the Superman: a laughing-stock, a thing of shame.

F. Nietzsche, Zarathustra’s Prologue, 3



In assessing the ethical implications of genomics and biotechnology, it is important to acknowledge that science, technology, and bioethics do not exist in a vacuum and are not socially, politically and ethically neutral. Certain technologies have a greater social impact, may require the State to intervene in the private sphere, and may be differentially accessible to users. Also, science and technology can change our relationship with other people and with our environment. Hence the importance of ethnographic, historical, and cross-cultural studies for the analysis of today’s thorniest bioethical controversies.

This chapter discusses some of the most contentious issues surrounding the use of genetic technology in human procreation and gene patenting, including eugenics, genetic consumerism, animal-human hybrids (chimeras), the commodification of life, disability and genetic testing.


Even a casual observer would not fail to notice the pervasiveness of bioethics in contemporary society. How did bioethics come to take on such significance in Western societies? This is a rather puzzling phenomenon given that, in a pluralist society, philosophy cannot deliver incontrovertible moral verdicts and the philosophers’ views are no more binding than those of the man in the street (Maclean, 1993). As logician Charles S. Peirce noted long ago, absolute certainty, absolute exactitude and absolute universality cannot be attained by reasoning and, in a world in which human reason and knowledge are socially, culturally, and historically embedded, it would be misguided to expect bioethicists to provide objective and rigorously codified precepts and indications. Their speculations can only tell us what they believe is right and fair, and their logical demonstrations must first be evaluated against the empirical evidence. Accordingly, this paper only provides one among many possible interpretations of the ethical issues involved in genetic technology, one that is rooted in a specific tradition (Continental/Mediterranean Europe), period of time (early twenty-first century), and discipline (political anthropology).

Following an account of the history of the trans-national movement known as eugenics in the opening section, the chapter then proceeds to examine the future of eugenics as a consumer purchase (designer babies) and the limits of parental decision-making, epitomised by the upbringing of Francis Galton, the founder of modern eugenics. The third section, entitled “Human nature and speciation,” provides a brief outline of some of the issues arising from the Human Genome Project and also covers the debate, which is still in its infancy, on the possible redefinition of personhood and human nature that might be required by future applications of genetic engineering. Questions concerning the commodification of body parts are discussed in the fourth section. In the fifth section, entitled “Disabilities and genetic testing,” I draw the reader’s attention to the impact that biotechnologies are likely to have on the life of people with non-standard bodies and minds. In the concluding remarks I engage with libertarian bioethics, seek to identify some of its most glaring shortcomings and urge bioethicists in general to pay greater attention to social, cultural and political factors in their ethical deliberations.

A brief history of eugenics

The term “eugenics” was coined in 1883 by Sir Francis Galton (1822–1911), after the Greek εύγενής, meaning “wellborn”. The logo of the Third International Congress of Eugenics, held in New York in 1932, defined eugenics as “the self direction of human evolution.” Negative eugenics was concerned with the elimination of inheritable diseases and malformations and involved prenuptial certificates, birth control, selective abortion, sterilization, castration, immigration restriction and, in Nazi-occupied Europe, involuntary “euthanasia.” Positive eugenics would instead encourage the propagation of desirable characteristics via tax incentives for “fit parents”, assortative mating and, in the years to come, cloning and germline engineering.

A combination of Eternal Recurrence – human beings as expressions of an immortal germplasm – and natural teleology of history – biology as destiny – stamped the arguments of early eugenicists and genealogy researchers, who linked folk hereditarian beliefs about the transmission of patrimonial and biological inheritance and the religious notion of the inheritability of sins. They fostered notions of evolutionary throwbacks and of populations as bundles of lineages, and arbitrarily equated genealogical perpetuation with social distinction. When these deterministic explanations of human behaviour were finally challenged, eugenics did not lose its appeal. Mainline eugenics gave way to ‘reform eugenics’, family planning and population control, characterized by a greater emphasis on environmental factors, birth control, the rational management of human resources, and the repudiation of an overtly racist language. This tactic made eugenics far more palatable and effective: if the impact of nurture was so important, then children should be raised in healthy home environments. In order to redress nature’s essential randomness and synchronize biological and socioeconomic processes, irresponsible citizens unable to meet the challenges of modern society would be forced, blackmailed, or cajoled into accepting sterilization or castration. Consequently, by the early 1930s, sterilisation programmes were in full swing. Following the moral panic generated by the Great Depression, few families were prepared to put up with the social protection of what was perceived to be a disproportionate number of dependent people (Paul, 1995).

Some argued that, under exceptional circumstances, basic rights could be withheld and that social services should only be granted to those whose social usefulness and biological capability were certain. The theoretical foundations of constitutional rights were undermined by prominent legal scholars in North America and Northern Europe, who argued that the state was the source of a morality more in line with the demands of modernity, and therefore was not necessarily bound by constitutional principles and norms. Radically realist and functionalist jurists submitted that personal rights were not inalienable, for they really were culturally and historically relative legal fictions or superstitions, their existence being, to a large extent, contingent on the majority’s willingness to uphold them, that is, on considerations of general welfare and public utility. Enlightened governments, like good shepherds, would foster virtues and restrict personal rights for the sake of communal rights and civic responsibility (Alschuler, 2001; Bouquet & Voilley, 2000).

This led to the paradoxical result that involuntary sterilizations and confinements were almost exclusively carried out in the most advanced and progressive democracies, the only exception being Nazi Germany. The following states or provinces adopted laws permitting the eugenic sterilisations of their citizens: Tasmania (1920), the Swiss canton of Vaud (1928), Alberta (1928 and 1933), Denmark (1929 and 1935), the Mexican state of Veracruz (1932), British Columbia (1933), Sweden (1934 and 1941), Norway (1934), Finland (1935), Estonia (1937), Latvia (1937), Iceland (1938), Japan (1940), and thirty-one American states. In 1936, the ‘Lebensborn e. V.’ (‘Spring of Life, registered association’) was launched by the Nazis, which involved the selective breeding of ‘racially superior’ children and the kidnapping of ‘racially valuable’ children across occupied Europe.

By 1914, in the United States, marriage restriction laws targeting “feeble-minded” citizens had been enacted in more than half the states and, by 1917, 15 states had passed sterilization laws. But “only” a few thousand sterilizations had actually been performed, mainly because nearly half of such laws had been struck down on the ground that they violated due process, freedom from cruel and unusual punishment, and the equal protection clause. A second wave of eugenics laws followed the Immigration Restriction Act (1924) and Virginia’s Act to Preserve Racial Integrity (1924). In 1924, Virginia also passed a law authorizing the involuntary sterilization of alleged mental defectives. This law was upheld, 8–1, by the Supreme Court in Buck v. Bell 274 U.S. 200 (1927). As a result of this decision, taken in a country that prided itself on its commitment to individual freedom but favoured scientifically unverifiable notions of social progress over clear constitutional principles, nearly half the U.S. states passed eugenics laws authorizing compulsory and non-voluntary sterilization.

The ostensibly progressive civic religion of eugenics was seen by many as essentially fair and morally unassailable. Various representatives of the judicial branch became self-appointed guardians of the public morality and urged state governments to intrude in people’s private lives “for their own good”. Wayward citizens, namely those who could not be converted to an acceptable lifestyle, and whose behaviour remained unpredictable, were liable to being sterilized or institutionalized. This kind of society, at once ready to embrace an abstract notion of humankind and reluctant to put up with certain categories of human beings, was so insecure, apprehensive, and self-doubting, that it was willing to carry out self-mutilation in order to become risk-free, while refusing to consider the motives of the offenders and “miscreants.”

In the United States, as in Sweden or Alberta, this Machiavellian interpretation of public law made ethics the handmaid of politics: rights could only be granted by law, and social utility overruled the “untenable notion” of human rights. Virtues, rather than rights, were the defining attribute of citizenship. Instead of protecting the citizens, law legitimized the persecution of certain categories of people, purportedly unable to enjoy freedom and to pursue happiness, by gradually stripping them of their rights and legal protections. Such policies were described as politically necessary and ethically indisputable. In a tragic reversal of roles, according to the dominant “discourse of truth,” those who violated the physical integrity of other citizens were fulfilling a constitutionally sanctioned civic duty, while the victims of involuntary sterilization and confinement were a social threat and, as such, subject to legally mandated sterilization or confinement “for the good of society” (Colla, 2000; Morone, 2003).

Eugenicists were persuaded that what stood in the way of the modernizing process was the result of ignorance, parochialism, and backwardness. Those who questioned their ostensibly sophisticated and rational arguments were labelled as uncooperative or reactionary. In a burst of self-serving enthusiasm, they regarded themselves as modern, progressive and boldly experimentalist. This made resistance to ethical self-scrutiny particularly strong, because the project of a rationalist utopia was inextricably bound up with social systems that many believed were a model of humanitarian and enlightened administration, the embodiment of intrinsic benevolence and farsightedness, and therefore eminently fair and morally unassailable. Explicit coercion was often unnecessary, as thousands of people genuinely believed, or were led to believe, that eugenics measures were desirable, and they had themselves or their family-members sterilized or confined. This should remind us that informed consent is not just a signature on a form but a two-way process involving information exchange, education and counselling.

Most North American and Scandinavian laws were only repealed in the late 1960s and 1970s, even though the Supreme Court ruling in Skinner v. Oklahoma 316 U.S. 535 (1942) defined procreation as “one of the basic civil rights of man” and sterilization as an invasion of fundamental interests which, according to Justice William O. Douglas, “in evil or reckless hands,” could have genocidal consequences. As late as the 1980s, 44 percent of the American public was still in favour of compulsory sterilization for “habitual criminals and the hopelessly insane” (Singer et al. 1998). By contrast, in those same years, law-makers in Holland, Britain, and in Latin American and Latin Rim countries[1] objected to selective breeding, involuntary sterilization, the assault on the notion of free will, the spurious conflation of modernization and liberation, and the linear extension of natural laws into the social sphere.[2] Eugenics, genetic fatalism, and the marriage between bureaucratic rationality and scientism did not resonate with every Western repertoire of values and symbols (Baud, 2001). This finding is of signal importance for the analysis of current trends in bioethics, social policy and biotech regulation.

Eugenics as a consumer choice

Western societies are today on the verge of a eugenics revival in the form of reprogenetics, germline engineering, and cloning, a trend which is indirectly reinforced by courts’ recognition of wrongful birth and wrongful life claims, by the commodification of healthcare, by the diffusion of testing for genetic predispositions, and by the rhetoric of genetic responsibility, involving new forms of discrimination and exclusion. Medical, cosmetic, and enhancing technologies are being pursued to meet our needs, such as the self-imposed obligation to be fit, active, self-sufficient and responsible citizens, and an almost universal desire to control our own lives and possibly improve them.

What measure of genetic and personality enhancement are we going to tolerate? In this section I explore continuities and discontinuities between past and future eugenics.

Opponents of genetic engineering of the human germline and human cloning point out that a society in which parents can avail themselves of preimplantation genetic diagnosis (PGD) tests has no use for these techniques. If embryos are affected by serious genetic disorders, they can be discarded and only the healthy ones will be implanted in the womb. Therefore, critics argue, the advocates of germline engineering and cloning do not have therapy in mind, or the noble goal of redressing genetic injustice, but species enhancement. Promoters of human genetic enhancement counter that it would be more sensible and economical to try and eradicate genetic conditions instead of treating them each generation. Their detractors respond that “simple”, single-gene disorders are very rare, and that most severe genetic conditions are complex, involving a combination of genetic and non-genetic factors. The risk of unanticipated inheritable negative consequences that the reconfiguration of human biology entails is simply unacceptable, even if germline manipulation could be made reversible, because the more complex the trait that is changed, the less simple it will be to undo the change.

I will not object to these procedures on philosophical or religious grounds, nor will I dwell on the inevitable widening of the ontological and social gap between the rich and the poor that they are likely to cause, including the prospect of a future caste of uninsurable and unemployable. These arguments have already been addressed “ad nauseam.” A different case against designer babies and the medicalization of childhood can be made, which draws on the life of Francis Galton himself, the father of modern eugenics.

Galton (1822-1911), half-cousin of Charles Darwin, was destined for a life of fame and academic prestige, in fulfilment of his father’s ambitions (Sweeney, 2001). Persuaded that heredity was destiny, and given the outstanding pedigree of the Galton-Darwin-Wedgwood family-stock, his parents decided that he would be taught how to realize his full potential and become the genius he was meant to be. As a result, in the family diaries Francis is only mentioned for his educational achievements and intellectual exploits. Such were the forces at work in the shaping of the character of the proponent of the theory of hereditary genius: destiny was implanted like a programme into Francis, who would grow into a man “suffering considerable angst as a result of seldom achieving the heights of intellectual acclaim to which his parents had encouraged him to aspire and for which he had worked assiduously hard.” (Fancher, 1983).

At the age of four, he was already saving pennies for his university honours and four years later he was encouraged to study French, Latin and Greek. But when he confronted the highly selective environment of Cambridge, he crumbled under the pressure of harsh competition and constant mental strain: dozens of exceptionally gifted students made it exceedingly hard for him to excel and a sudden and severe nervous breakdown ensued (Sweeney, 2001). Little by little, Galton drifted away from his family and devoted himself to those fields of knowledge in which he felt he could stand out. He tried his hand at poetry, soon to realise that he had no literary talent, then he turned his attention to mechanics and devised a number of contrivances that were never patented or manufactured. Even his statistical calculation of the relative efficiency of sailing came to naught when the steam engine was invented (Forrest, 1974). This series of failures brought him to a second and more severe mental breakdown in 1866.

It is easy to see where his unhappiness and frustration came from: not from state coercion, but from parental despotism (Forrest, 1974). Authorizing the creation of designer babies may have dire consequences, because embryo selection carried out for the sake of ‘quality control’ in reproduction is far more restrictive of a child’s freedom and places much more pressure on the offspring. Intuitively, one would expect that it would be more difficult for these children to develop into autonomous beings and to be free to choose not to fulfil the wishes and aspirations of their parents, irreversibly inscribed into their DNA, at least at a symbolic level. Even assuming that most of us would justifiably reject the fallacy of “Genes ‘r’ Us” determinism, “made-to-order” children would be hard put to do the same. As American sociologist W.I. Thomas once said, “if men define situations as real, they are real in their consequences.”

What would the consequences be, in terms of their development as nominally independent and responsible moral agents, if the alterations were of a non-medical nature? Would they take pride in their achievements in the same way as ordinary people do, even if their talents are inherited? Why, in a meritocratic society, should they feel they owe anything to the less fortunate? What could be their response to personal failure: would they assume that they are entitled to the best of everything? Finally, and more importantly, those parents so preoccupied with the uncertainties of life that they would rather have their children genetically engineered, how are they going to deal with the unavoidable challenges of parenting and the realization that control can never be complete? Are we going to rear children who cannot face life’s challenges without the help of chemical and genetic enhancers of mood, memory, cognition, sex life and athletic performances? If the goal posts are constantly being pushed forward, how are we going to prevent what was once regarded as unnecessary from becoming imperative?

Victoria University ethicist Nicholas Agar (Agar, 1998) has argued that if a 6-year-old Mozart had mixed with children of his own age instead of performing in the courts of Europe, today we would not be enjoying The Marriage of Figaro or Don Giovanni. But we could counter that perhaps Wolfi might have preferred to play with his peers and live a longer and less tormented life instead of complying with the requests of his authoritarian and manipulative father. Even a strictly utilitarian perspective should not contemplate a scenario in which kids are sacrificed for the greater good and in the parents’ pursuit of reflected fame and status, to the point of transforming them into biological artefacts designed by others.

Even though past government-sponsored coercive eugenics programmes have been discredited, the mere defence of reproductive freedom is not sufficient in itself to protect citizens from abuses and harm. Unfortunately, the history of libertarianism is replete with examples of citizens claiming liberties for themselves while demanding restrictions for other “less deserving” citizens. Also, there is no such thing as a government stubbornly refusing to cater to the demands of powerful lobbies.

Apart from the fact that, under more strained socio-economic circumstances, democratic states may at some point be forced to recommend compulsory screening for certain genetic conditions, we might also want to consider the historical evidence pointing to a growing presence of the State in the family sphere in Western democracies, motivated by the imperative to better protect the children’s rights (Cavina, 2007). In point of fact, a proposal has been made in Texas, to the effect that the state should embark on mass presymptomatic diagnosis of Attention Deficit Hyperactivity Disorder in schoolchildren, followed by widespread prescription of psychoactive drugs (Rose, 2005). This should be a sufficient warning that the so-called consumerist eugenics will not be a democratic panacea: treating shyness and liveliness as biochemical imbalance, and medicalizing our children to make them well-behaved and cooperative, as though they were faulty devices – regardless of the unforeseeable long-term side-effects of taking drugs at such an early age – is, for all intents and purposes, an experiment in social engineering on an unprecedented scale, and one which can only disempower parents and children and suppress human diversity.

In sum, the language of autonomy, empowerment, choice, and rights ought not to obscure the fact that: a. it is a rather apposite way for medical professionals and the State to be released from their responsibilities vis-à-vis patients and citizens; b. the randomness of sexual fertilization is, alas, the closest thing to freedom (Sandel, 2007) in societies where choices are constrained by legal restrictions, social and gender-related expectations, obligations and imperatives, as well as by prejudices, ignorance, practical impediments, and huge economic and social disparities, which translate into a dramatic differential distribution of power and authority.

A society where individuals are expected to responsibly monitor their health and lifestyle and to act on the available knowledge – “free choice under pressure” is a fitting definition of life in advanced democracies – will look on those who do not fulfil that obligation as reckless and uncaring. This is also what we gather from Nancy Smithers, 36, an American lawyer, and from her first-hand experience of how the line between care and desire is becoming blurred and how the range of human variability that is deemed socially acceptable is being inexorably narrowed: “I was hoping I’d never have to make this choice, to become responsible for choosing the kind of baby I’d get, the kind of baby we’d accept. But everyone – my doctor, my parents, my friends – everyone urged me to come for genetic counselling and have amniocentesis. Now, I guess I’m having a modern baby. And they all told me I’d feel more in control. But in some ways, I feel less in control. Oh, it’s still my baby, but only if it’s good enough to be our baby, if you see what I mean.” (Rapp, 1988: p. 152).

Human nature and speciation

While Sophocles thought that “there are many wonderful things, and nothing is more wonderful than man,” Nietzsche famously portrayed man as das noch nicht festgestellte Tier, “the animal which is yet undefined.”

The Human Genome Project, an international collaboration to decode the information contained in the human genome through DNA sequencing and store the resulting information in databases, was heralded as the means whereby we would attain a more precise definition of human nature. The first working draft of a human genome sequence was published in 2001, but it is important to stress that the genome sequenced by the publicly funded Human Genome Project does not represent the genetic make-up of the human species. Based on blood and sperm samples submitted by several anonymous donors, it really is a statistical artefact, the abstraction of a non-existent species-being standing for all of us, individually and collectively, without being us.

Therefore, genomes are benchmarks against which individual genotypes can be examined and described. This is because the genome is not a fixed essence that we all share in common. Each one of us possesses a unique combination of nucleotides and genes coding for proteins (genotype) and even the same genes shared by identical twins express different phenotypes under different environmental circumstances. In other words, we are at once very similar and very different from one another. Suffice it to say that while we differ from each other by 0.1 percent, humans are reportedly 98 percent genetically identical to chimpanzees, proving that such seemingly slight discrepancies have far-reaching consequences, when combined with environmental factors. In human beings, variation is the norm and, strictly speaking, by “human genome” we should refer to the sum of all genotypes in the human species, a goal that is currently beyond our reach.

One of the outcomes of the Human Genome Project has been the recognition that genetic determinism is incompatible with the evidence provided by the preliminary analysis of the base pair sequence of the “human genome.” From a strictly deterministic point of view, defined by the single-gene single-biological function paradigm, our 30,000 genes, approximately twice the number of genes of a fruit fly and far fewer than most geneticists expected, are simply not enough to make us the way we are.

We have not found the “secret of life” and are not anywhere near to being able to explain human nature, let alone control it. However, the finding that the human genome is a dynamic landscape has important ramifications. Assuming that the “genome” is, to some extent, malleable and adaptable without apparent adverse effects, those who still assume that a common genome plays an important part in the definition of human nature (and human rights) will be inclined to regard human nature as infinitely malleable and its characterization as too fluid to serve any meaningful legal, scientific, ethical, and political purpose. They might raise the question that if the social order reflects a society’s conception of human nature, and there is no fixed human nature, then who is to decide what is just and moral, and on what grounds?

Traditionally, personhood has only been attributed to human beings: then, what would a valid criterion for species differentiation be if we are going to grant personhood to great apes and to create human chimeras, cyborgs, or a new posthuman species/race? The problem really comes down to what makes us human: if patients in permanent vegetative states and severely mentally impaired persons are human, then some commentators would argue that it would be fair to grant human chimeras the same status. In other words, we need to clarify the defining criterion that we use to self-identify as humans: what we can do, what we are, or something else?

In 1974, Joseph Fletcher, Episcopal minister, academician, and one of the founders of bioethics, published a controversial treatise in which he argued that certain “retarded children” should not be viewed as persons, that procreation was a privilege, not a right, and that devising ways to obtain chimeras and cyborgs to be put in the service of humankind would be a morally legitimate enterprise (Fletcher, 1974). In much the same way, in the early Seventies, a Rand Corporation panel agreed that genetic engineering would also be used to create parahumans, namely humanlike animals, or chimeras: these beings would be more efficient than robots, and would be trained to perform low-grade domestic and industrial work or else provide a supply of transplantable organs (Rorvick, 1971).

Given the relative genetic proximity of chimpanzees and human beings, and the fact that the evolutionary split between the two species may have occurred fairly recently, it is conceivable that the first human hybridization would generate a humanzee, that is, a cross between a human and chimpanzee. But there remains the problem of the unpredictable consequences of interspecies transplantations at the embryonic stage, when bodies and brains are highly malleable and every new insertion of non-human stem cells is likely to cause random alterations in the development of the organism and, as a result, in the identity of the individual. Nobody can really anticipate the dynamic interactions of animal mitochondrial DNA and the nuclear DNA of a human embryo. It may even be the case that a chimera might look like a member of one species and behave like the members of the other species.

Supposing that the procedure is successfully and safely applied and repeated, we should then broach various topics related to artificial hominization, and discuss the moral and legal status of these beings. If law only recognizes people (including juridical persons) and property, to which category will they belong? Will they be patentable, that is, could they be owned by a company? Will researchers need their informed consent prior to their inclusion in a medical study? Is personhood coterminous with humanity? Are we going to establish a juridical and moral continuum from inanimate things, to animals, semi-human beings (e.g. chimeras, replicant-like androids), and fully autonomous persons?

The idea of a seamless gradient is reminiscent of the medieval notion of the Scala Naturae, or Great Chain of Being, a linear hierarchy for the zoological classification of living beings which generated the visual metaphor behind the theory of evolution. Yet this model is not without its problems, for it was also employed to establish a pecking order of social worth and, simultaneously, thwart the extension of civil rights to certain categories of “diminished” human beings like women, workers, children, minorities, etc. (Groce & Marks, 2001). Modern advanced democracies will be compelled to blur the boundaries along the abovementioned continuum and make it as inclusive as possible. But this can only mean that many human chimeras and artificial intelligences with highly developed cognitive skills and self-consciousness will be allowed to become our moral equals and, as such, enjoy the attendant legal protection. Ideally, the “borderlines of status” of “artificial human beings” will be removed, with no detriment to the senile, foetuses, and patients in vegetative states: human rights will no longer be the monopoly of Homo sapiens.


The commodification of life

“Does it uplift or degrade the unique human persona to treat human tissue as a fungible article of commerce?” was Justice Arabian’s rhetorical question in his concurring opinion in Moore v. Regents (1990).

For centuries, millions of people were enslaved on the ground that certain human beings could be assimilated to Aristotle’s “natural slaves.” Chattel slavery, that is the extension of market relations to the human person as a marketable property and human commodity (res commerciabilis) or legal tender, outside the domain of mutual obligations, was officially abolished only in the nineteenth century.

In the United States, the Thirteenth Amendment, prohibiting slavery and involuntary servitude, was ratified in 1865: it meant that no human being could be owned by another human being and, by extension, that people’s genotype cannot be patented. But isolated genes and partial gene sequences from human tissue samples can be patented for research purposes, provided that the applicant can “prove” that a “natural” object has been transformed into an “invention.” Ironically, mathematical formulae are not patentable, because they are assumed to be already out there, like laws of nature or natural phenomena, whereas genes, which are verifiably part of nature, can be patented when they are discovered, regardless of the fact that the assignees may have failed to demonstrate a use for their discoveries.

As a result of the 5 to 4 U.S. Supreme Court ruling in Diamond v. Chakrabarty, 447 U.S. 303 (1980), which determined that “anything under the sun that is made by man” is patentable, today more than 6,000 human genes from all around the world are covered by U.S. patents (Lovgren, 2005), on the ground that the mere isolation and purification of genes from their natural state, by now a routine operation, allows an applicant to be issued a patent. This ruling, and the Senate’s refusal to ratify the UN Convention on Biological Diversity (CBD),[3] which had been designed to protect the interests of indigenous peoples, paved the way for the “bioprospecting” (gene hunting), also known as “biopiracy,” practised by pharmaceutical companies in the developing world, so that U.S. private firms and public agencies are granted exclusive rights to decide who can use those cell lines that are profitable for pharma-business, and how much they will have to pay to do so. There are a number of remarkable analogies that can be drawn between today’s biopiracy and the nineteenth-century westward expansion of the United States, when Native Americans were thought to be an inferior race incapable of fulfilling the moral mission of harnessing natural resources. The Doctrine of Discovery meant that the land inhabited by the indigenous peoples was terra nullius (no man’s land). The aboriginal occupants had no legal title to the land because they had no fixed residence and did not till the soil according to European standards. They could only exercise a right of occupancy, under American “protection and pupilage.” White people had “discovered” the land – and nowadays the genes – and therefore they owned it.

In a multibillion dollar market, enormous economic interests are involved in the search for “biovalue” (Waldby, 2002) and ethical considerations are not binding rules and are not always high on governments’ agendas. In the United States, where individual autonomy is oftentimes equated with the privilege of disposing of one’s body as one sees fit, there is a trend to extend market relations to DNA and body parts. Thousands of patents have been granted, mostly to private companies, on human genes whose function is still unknown. This process of parcelization allows companies to gradually take control of the human genome in the form of immortalized cell lines, and put a price on them, without violating the constitutional principle of the non-patentability of human beings.

Today, advances in biotechnology raise new questions about the treatment of individuals and their bodies, which can now be seen – a vestige of Cartesian dualism? – as collections of separable, interchangeable, and commercially transferable parts.

Bioscientists will, unwittingly or intentionally, play an increasingly important role in this process by introducing technologies that will facilitate the exchange of body parts and DNA – now endowed with a social life of their own – in commercial transactions, and by selecting (PGD) or cloning donor babies, namely babies who can supply compatible tissues to treat sick siblings.

This will raise a host of new questions such as: who is the owner of someone’s separated human tissue? If people are the owners of their cell lines, should they not be entitled to share in the profits from scientific findings and commercialization? Are patent claims for intellectual property of DNA from indigenous peoples morally and legally justifiable? In general, who can demand a share of the profit from the commercial exploitation of human DNA?

Most jurists and legislators of Continental Europe, where the intellectual establishment is generally opposed to the idea of the market as a civilizing and liberating force, will presumably continue to favour social cohesiveness, altruism, and an ethics of the good life (eudaimonia) (Gracia, 1995). The primacy of autonomy will most likely be underplayed for the sake of social justice and equality (Braun, 2000). Reasoning that it would be unrealistic to expect societies to be able to protect all citizens, especially the destitute and disenfranchised, from coercion and exploitation, it is to be expected that most will refuse in principle to regard the human body as a repository of economic value, a marketable property and a source of spare parts.

They will stress that Western civilization, from habeas corpus to the abolition of slavery as a commerce in “human commodities”, and to the emancipation of women, has developed in opposition to the objectification of the human body and to the idea that anything can be converted into a commodity and into an object of contractual relationships: the argument that the human body was a res extra commercium[4] was at the heart of the abolitionist movement, for there is no person without a body, and a subject cannot be the object of commercial transactions. Some will further argue in favour of the Kantian normative position, whereby one should always treat human beings as ends in themselves, that is, as having intrinsic value or worth (non-use goods), and therefore as the source of our valuation process, and not as means to satisfy our values (use goods). They will point out that bodies, babies, and life are gifts, and money is no substitute for them (Crignon-De Oliveira & Nikodimov, 2004), mostly because allowing market forces to define a scale to measure the value of all things would be degrading to our sense of personhood and to our values (Gold, 1996).

Accordingly, Article 18 of the European Human Rights and Biomedicine Convention, signed in Oviedo in 1997, forbids the “creation of human embryos for research purposes”, while Article 21 states that “organs and tissues proper, including blood, should not be bought or sold or give rise to financial gain for the person from whom they have been removed or for a third party, whether an individual or a corporate entity such as, for example, a hospital.” Its proponents were concerned, among other things, that relaxing the restrictions on ownership of human body parts would lead to the proverbial slippery slope, with companies legally owning potential human beings (or chimeras) from embryo to birth. In Europe, people cannot make their bodies a source of financial gain.[5] While excised body parts, like hair or placentas, are usually treated as res nullius, that is, free to be owned by the first taker, like abandoned property, European civil codes prohibit the removal of a body part when it causes permanent impairment, unless it is done within a formalized system of transplant donations.

The principle of market-inalienability has gradually replaced the former principle of absolute inalienability. It is now argued that people do own and control their bodies (and tissues) but have no right to sell them, for they cannot exist without them and their rights as human beings and as consumers cannot trump the right of society to attempt to stave off the process of commodification of human life.

Negotiating the economic value of bodies and body parts is therefore out of the question, as it used to be before 1900, when insurance companies reassured their clients that “the term life insurance is a misnomer . . . it implies a value put on human life. But that is not our province. We recognize that life is intrinsically sacred and immeasurable, that it stands socially, morally and religiously above all possible evaluation” (Zelizer, 1978).

Disabilities and genetic testing

In most societies, and especially those with a greying population, people with disabilities constitute the single largest minority group. Depending on how disability is defined, there are currently between 3 and 5 million Canadians, 50 million Americans, and 40 million Western Europeans with disabilities. In half the cases it is a severe impairment. Disability is therefore a fact of life, and the boundary between ability and disability is permeable. It is reasonable to assume that, at some point during one’s life, everybody is going to have to deal personally with a disability or to look after a disabled person, and that it is therefore in everyone’s interest that societies should strive to accommodate disabled people, instead of viewing them as “damaged goods” (Asch, 2001). This is all the more important now that biotechnologies are poised to make the boundary between “abled” and disabled even more porous: the notion of disability will presumably be extended to more individuals (e.g. alcoholism, obesity, predispositions to chronic diseases, etc.).

How is this going to affect the social model of disability and the issue of status recognition? Are further adjustments necessary to accommodate the needs of the “asymptomatic ill”, that is, people with an “abnormal” genetic constitution? Is it possible that this will magnify the problem in unpredictable ways (viz. a proliferation of identity groups)?

In everyday life, there remains an enduring tendency to view human beings as worthwhile not for who they are but for what they do and to confuse facts and values, is and ought. The status of physically and cognitively impaired persons – that is, people with non-standard bodies and minds – best illustrates one of the most glaring antinomies of advanced democracies: they classify their citizens by making up new social categories, labels, and group identities, and they attempt to maximise their potential in order to better include them but, in doing so, they cause the already marginalised to become even more vulnerable and less visible, and they also affect the common perception of what is normal, and therefore acceptable and appropriate, that is, normative (Hoedemaekers & Ten Have, 1999).

Nevertheless, the boundary between “normal variation” and “genetic disease” is in part a social construction, because the notions of “normalcy” and “deviance” are historically and culturally relative. What is more, the consequences of physical impairments can be mitigated by the provision of personal and technical support. It follows that this classificatory exercise is completely out of place (Lewontin, 2001). No one, not even the State, can arbitrate normality (Rapp, 2000) and, it should be added, it is not at all clear that disability is a kind of harm that is qualitatively different from other socially constructed “harms”, such as poverty or race (Shakespeare, 1999).

Historically, there is no such thing as a linear transition from discrimination to acceptance, as far as people judged to be abnormal and pathological are concerned. Instead, economic, social and political determinants (ideologies, cultural trends, and societal arrangements) have changed the experience of disability along an erratic course (O’Brien, 1999). The dependence of disabled people on an artificial environment has been a constant reminder of human imperfection and frailty, and of medical and scientific powerlessness. Furthermore, economic setbacks have often resulted in growing concerns over the financial burden of social spending on people with disabilities. In the 1920s, when the German economy was in a state of collapse after WWI but prior to Hitler’s rise to power, it was revealed that a majority of parents of handicapped children would consent to their “euthanasia” if the medical authorities decided on this course of action (Burleigh, 2002). They sincerely believed that, under those circumstances, it would be best for their children.

Unfortunately, too much insistence on the values of autonomy and self-sufficiency, coupled with cost-benefit considerations of how people might best contribute to production and bolster the economy, is likely to devalue people who are not self-sufficient. If detected abnormalities cannot be treated, prenatal diagnosis and subsequent selective pregnancy termination could still be regarded by many as a quick fix to an intractable social problem, namely society’s unfair treatment of the disabled. The mixed message that society is sending to people with disabilities is that they are mistakes which will hopefully be redressed by technological progress, and yet they are still welcome. The two goals of trying to eradicate disability while steering society towards a more embracing and supportive attitude to diversity may well prove incompatible.

We must also consider that prenatal genetic testing and, in the future, fetal therapy, that is, the medical treatment of babies in the womb, will not guarantee a “normal” baby. Even the systematic screening of foetuses cannot prevent all “defective children” from being born. This raises the issue of how they will be treated in a society which tends to value competence and intelligence more than anything else, that is to say, one where they would be “better-not-born” (Baroff, 2000). We are already witnessing the clash between constitutional equality and the inequality of bodies (Davis, 2002). It is a situation in which women’s rights are pitted against the civil rights of people with disabilities and of unborn children, while individual rights, values, and interests are played against those of the larger society.

Today, prenatal screenings are largely performed by midwives and obstetricians. In some countries, these techniques have already reduced the prevalence of spina bifida and Down syndrome by 30 percent or more, and the incidence of neural tube defects, Tay Sachs, and beta thalassemia (Cooley’s Anemia) by 90 percent (Asch et al., 2003). Genetic testing, which can analyse thousands of mutations, is instead usually carried out by genetic counsellors and clinical geneticists.

We might want to speculate about the range of possible short-term and long-term effects of the application of new technologies in the field of genetic medicine, although they cannot be predicted with certainty. Following the adoption of gene diagnostics (when there is an indication that someone might be affected by a genetic condition) and genetic testing in carrier screening (when no such indication is present), a new class of citizens could arise, comprising people who have been diagnosed as ‘asymptomatic ill’, that is, at a higher risk of contracting certain illnesses. Passing specific legislation to prevent discrimination against them would paradoxically make them seem more different from other people than they really are. It has been suggested (Macintyre, 1997) that discrimination in insurance, employment, and healthcare provision – to contain the costs of healthcare benefits – could be the logical consequence of a double-bind: if you agree that you and your children should be genetically tested, and test positive, you will be treated differently; if you refuse, it will be assumed that you might be concealing some inheritable condition. The bottom line seems to be that an epistemological shift has taken place, so that whenever human life does not conform to increasingly high standards of health and quality, it is likely to be deemed a grievous miscalculation.

Human dignity, a notion that law cannot define unequivocally, lies at the foundation of the human rights doctrine, but it is also indissolubly bound up with the concept of quality of life, which is relative. Because of this, quality and equality are pitted against each other and developments in biotechnology could undermine the constitutional principle of equality of all human lives, which is the glue that holds society together. Finally, because prenatal screening is an expensive procedure, it is even possible that, in countries without universal healthcare, more and more people with disabilities will be born into poverty (Asch et al., 2003).


Studying the ethical implications of the new biomedical technologies involves much more than simply assuming that totally rational agents, altogether free from social and cultural strictures and contingencies, and from their physicality, would arrive at the same conclusions, following a single, completely reliable deductive mode of reasoning or, alternatively, starting from some unverifiable articles of faith. A reality made of moral flexibility, discrimination, inequality, differential power relations and access to healthcare cannot be wished away for the sake of conceptual clarity and simplicity. Yet, historical, political and social issues – including the discussion of the common good, the unfairness of healthcare in the United States and elsewhere, and the sheer nonsense of applying the ethical standards of affluent societies in developing countries – are seldom the object of critical investigation on the part of mainstream bioethicists. These “secular moral experts” understandably prefer to rely on hard logic rather than on the disputable evidence, multiple constraints, relative values, nagging contradictions, and subjective feelings of everyday reality. But that untidy reality, with its attendant uncertainty, is the only one there is, at least for most of us, and this is why there can be no univocal, logically necessary solution to our moral quandaries. Condemning the tendency of ordinary people to cling to their beliefs as a matter of course seems unwarranted. On various important ethical issues people trust their own judgment because they see that their views are widely shared and because they have strong reasons to believe that such a consensus is not going to vanish into thin air any time soon.
Indeed, most of us generally subscribe to those moral precepts that have stood the test of time.[6] It is our appreciation of the practical insights and moral expertise of those who came before us which, for instance, leads many to maintain that human dignity is important even though it is hard to define. Unfortunately, the haste with which common sense is waved aside as an inconsequential distraction, together with a rather strong measure of technological determinism, can only reinforce the impression that bioethics has the justificatory function of bringing the public around to the way of thinking of the most enlightened and glamorous elite and, by extension, of the bio-pharmaceutical industry. The fact of the matter is that a thin bioethics confronting the market and powerful professional and corporate interests is bound either to be crushed or to lend itself to the endorsement of an ideology of unbridled competition and rampant consumerism. Bioethicists would therefore be well advised to pay heed to the words of Jean-Baptiste Henri Lacordaire, who once said that “between the weak and the strong, it is freedom which oppresses and the law which sets free.”[7]


Agar, N. (1998). Liberal eugenics. Public Affairs Quarterly, 12(2), 137-155.

Alschuler, A.W. (2001). Law without Values: The Life, Work, and Legacy of Justice Holmes. Chicago and London: University of Chicago Press.

Asch, A. (2001). Disability, bioethics and human rights. In Albrecht, G .L. (et al.) (eds.), Handbook of disability studies (pp. 297-326). Thousand Oaks, etc.: Sage Publications.

Asch, A. et al. (2003). Respecting persons with disabilities and preventing disability: is there a conflict? In S. S. Herr et al. (Eds.), The human rights of persons with intellectual disabilities (pp. 319-346). Oxford: Oxford University Press.

Bachelard-Jobard, C. (2001). L’éugenisme, la science et le droit. Paris: Presses Universitaires de France.

Baroff, G. S. (2000). Eugenics, “Baby Doe”, and Peter Singer: toward a more “perfect” society. Mental Retardation, 38(11), 73-77.

Bouquet, B., & Voilley, P. (Eds.). (2000). Droit et littérature dans le contexte suédois. Paris: Flies.

Braun, K. (2000). Menschenwürde und Biomedizin. Zum philosophischen Diskurs der Bioethik. Frankfurt/New York: Campus.

Burleigh M. (2002). Death and deliverance: ‘euthanasia’ in Germany, c. 1900-1945. Cambridge: Cambridge University Press.

Cavina, M. (2007). Il padre spodestato. L’autorità paterna dall’antichità a oggi. Roma-Bari: Laterza.

Colla, P. S. (2000). Per la nazione e per la razza. Cittadini ed esclusi nel “modello svedese”. Roma: Carocci.

Crignon-De Oliveira, C., & Nikodimov, M. G. (2004). A qui appartient le corps humain? Médecine, politique et droit. Paris: Les Belles Lettres.

Davis, L. J. (2002). Bending over backwards. Disability, dismodernism & other difficult positions. New York & London: New York University Press.

Fancher, R. (1983). Biographical origins of Francis Galton’s psychology. Isis, 74, 227-233.

Fletcher, J. (1974). The Ethics of Genetic Control: Ending Reproductive Roulette. New York: Anchor Books.

Forrest, D. W. (1974). Francis Galton: The life and work of a Victorian genius. London: Paul Elek.

Gold, E.R. (1996). Body Parts: Property Rights and the Ownership of Human Biological Materials. Washington, D.C.: Georgetown University Press.

Gracia Guillén, D. (1995). Medical ethics: history of Europe – Southern Europe. In W. T. Reich (Ed.), Encyclopedia of Bioethics (Vol. 3, pp. 1556-1563). New York: Simon and Schuster Macmillan.

Groce, N. E., & Marks, J. (2001). The Great Ape Project and disability rights: ominous undercurrents of eugenics in action. American Anthropologist, 102(4), 818-822.

Hoedemaekers, R. & Ten Have, H. (1999). The concept of abnormality in medical genetics. Theoretical Medicine and Bioethics, 20(6), 537–561.

Lewontin, R. C. (2001). It ain’t necessarily so. New York: New York Review Books.

Lovgren, S. (2005, October 13). One-fifth of human genes have been patented, study reveals. National Geographic News. Retrieved May 5, 2007.

Macintyre S. (1997). Social and psychological issues associated with the new genetics. Philosophical Transactions: Biological Sciences, 352(1357), 1095-1101.

Maclean, A. (1993). The elimination of morality. Reflections on utilitarianism and bioethics. London & New York: Routledge.

Morone, J. A. (2003). Hellfire nation. The politics of sin in American history. New Haven and London: Yale University Press.

O’Brien, G. V. (1999). Protecting the social body: use of the organism metaphor in fighting the “menace of the feeble-minded”. Mental Retardation, 37(3), 188-200.

Paul, D. (1995). Controlling human heredity: 1865 to the present. Atlantic Highlands, N.J.: Humanities Press.

Rapp, R. (1988). Chromosomes and Communication: the discourse of genetic counselling. Medical Anthropology Quarterly, 2(2), 143-157.

Rapp, R. (2000). Testing women, testing the fetus. The social impact of amniocentesis in America. New York and London: Routledge.

Rorvik, D. M. (1971). Brave New Baby. Promise and peril of the biological revolution. Garden City, New York: DoubleDay & Co.

Rose, N. (2005). Will biomedicine transform society? The political, economic, social and personal impact of medical advances in the twenty first century. Clifford Barclay Lecture. Retrieved May 24, 2007.

Sandel, M. J. (2007). The case against perfection. Ethics in the age of genetic engineering. Cambridge, Mass.: The Belknap Press of Harvard University Press.

Shakespeare, T. (1999). Losing the plot? Medical and activist discourses of the contemporary genetics and disability. In P. Conrad & J. Gabe (Eds.), Sociological perspectives on the new genetics (pp. 171-190). Oxford: Blackwell Publishers.

Singer, E. et al. (1998). Trends: genetic testing, engineering, and therapy: awareness and attitudes. Public Opinion Quarterly, 52(4), 633-664.

Sweeney, G. (2001). “Fighting for the good cause”: Reflections on Francis Galton’s legacy to American hereditarian psychology. Independence Square, PA: American Philosophical Society.

Waldby, C. (2002). Stem cells, tissue cultures and the production of biovalue. Health, 6(3), 305-323.

Zelizer, V. A. (1978). Human values and the market: The case of life insurance and death in 19th-century America. The American Journal of Sociology, 84(3), 591-610.

Attention Deficit Hyperactivity Disorder (ADHD) – A mental condition, affecting both children and adults, typified by inattention, hyperactivity, and impulsivity. Hundreds of scientists and medical professionals in North America and Europe claim that there is no clear evidence to support the existence of ADHD and contend that most cases fall within the normal range of variation in human behaviour.

Base pair – A structure made of two complementary nucleotides, on opposite strands of the DNA molecule, joined by weak hydrogen bonds. The pairings are adenine (A) with thymine (T) and guanine (G) with cytosine (C) in DNA, and adenine with uracil and guanine with cytosine in RNA. This is where genetic information is stored. It is estimated that the human genome contains around 3 billion base pairs; the two paired strands give DNA its double helix shape.

Chimera – Legendary creature with the head and chest of a lion, the belly and a second head of a goat, and a serpent for a tail. In biology and genetics, a distinction is drawn between mosaics, that is, those plants and animals that contain different sets of genetically distinct cells (e.g. humans with mismatched eyes, but also serious genetic conditions such as Turner’s syndrome) deriving from a single zygote, and chimeras, whose cell populations originated from more than one zygote. Animal chimeras are routinely produced experimentally, whereas the creation of part-human, part-animal hybrids (parahumans) is currently unfeasible and illegal.

Germline engineering – The genetic modification of individuals whose alterations will be passed on to their progeny. It involves altering genes in eggs, sperm, or early embryos, by insertion (e.g. of artificial chromosomes), gene deletion or gene transposition.

Germplasm – hereditary material (chromosomes and DNA) of living organisms. Sometimes it is also the name given to a species’ “genome”, namely the entire repertoire of that species’ genotypes.

Human cloning – If it were legal, reproductive cloning would be used to create children who are genetically identical to a cell donor. At present, it would be a very expensive procedure with a staggering failure rate (about 90%). Therapeutic cloning refers to the creation of identical embryos and tissues in order to harvest stem cells for research and transplantation purposes. There are two main cloning techniques: (a) embryo splitting (also known as artificial twinning, because it occurs naturally with identical twins), in which an embryo is split into individual cells or groups of cells that are then artificially prompted to grow as individual embryos; (b) somatic cell nuclear transfer (SCNT), which is done by transferring genetic material from the nucleus of an adult cell into an enucleated egg, that is, an ovum whose genetic material has been removed. This is the technique used to generate Dolly the sheep.

Hyperparenting – A form of child-rearing in which parents become too involved in the management of their children’s lives.

In vitro fertilization (IVF) – An assisted reproductive procedure in which a woman’s ova (eggs) are removed and fertilized with a man’s sperm in a laboratory dish (a Petri dish). Each IVF cycle is very expensive and has a success rate of no more than 30 percent. It is estimated that there may currently be about half a million IVF babies worldwide.

Mitochondrial DNA (mtDNA) – The portion of the maternally inherited cell DNA which is contained in the mitochondria, tiny organelles that generate energy for the cell by converting carbohydrates into energy.

Preimplantation genetic diagnosis (PGD) – Cells taken from embryos created through in vitro fertilization (IVF) are examined in a Petri dish. Embryos carrying harmful or lethal mutations are discarded and only “healthy” ones are subsequently implanted in the mother’s uterus.

Reprogenetics – The combination of reproductive medicine and biology with genetic technologies. Embryonic stem cell research, the alteration of selected genes, as in germline therapy and in the genetic manipulation of early embryos, cosmetic gene insertion, human embryo cloning, and embryonic preimplantation genetic diagnosis (PGD with IVF) are reprogenetic techniques.

[1] Spain, Portugal, Italy, and France.

[2] In those countries, most scientists and social analysts correctly understood that Charles Darwin had historicized nature without closing the gap between nature, human history and society. Elsewhere, Social Darwinists, who held that the Darwinian revolution had paved the way to the naturalization of history, found a more receptive audience.

[3] In 2007, the United States, Andorra, Brunei, Iraq, and Somalia were the only countries that had not ratified this treaty.

[4] Meaning “beyond commercial appropriation.”

[5] This position finds expression in the principe de non patrimonialité du corps humain of the French civil code, in the principio di gratuità (principle of gratuitousness) of the Italian civil code, and in Article 3 of the European Charter of Fundamental Rights. As an aside, Article 25 of the Civil Code of Québec states that: “the alienation by a person of a part or product of his body shall be gratuitous; it may not be repeated if it involves a risk to his health.”

[6] In the words of Spanish bioethicist Diego Gracia Guillén: “la historia es tambien el gran tribunal de la moralidad,” that is, as it were, “ethics is the daughter of time.”

[7] « Entre le fort et le faible, c’est la liberté qui opprime et la loi qui affranchit. »

We are not naked apes – human rights, animal rights, speciesism and violence

by Stefano Fait

A great deal of nonsense has been said about human beings and primates. Here I try to use common sense and empirical data to settle the matter once and for all.
This is no marginal question: human rights, the prospect of redemption, democracy, hope, trust, the ideas of moral progress, of civilization and maturation… all this and much more depends on which conception of human nature prevails in a given period.

The idea that every human being, regardless of his or her virtues or failings, is precious and irreplaceable informs our moral judgment about how human beings are to be treated. We behave differently towards animals because in them this individuality is markedly attenuated. Insects, in particular, inspire no particular remorse in us if we kill them accidentally.

As a child, my horror at seeing other children torture insects did not stem from empathy with the victims, but from the revulsion I felt, instinctively (there was no deliberation, back then), towards human beings incapable of restraining their bestiality, that is, of curbing their power, disciplining their hubris, listening to the voice of conscience.

We are not naked apes. Human beings learn and progress because they are able to understand that change is preferable to a state of oblivious contentment, and that real life need not follow the instructions contained in the script of tradition, because a creative, innovative and vibrant life, that is, a cultural life, can be immensely more rewarding.

The extremely slow evolution of chimpanzees (if indeed there has been any), even though they are genetically so close to human beings, is the clearest proof of this fundamental distinction. Every human being is given to change and destined to contribute to changing the planet; in this, no animal comes even remotely close to the human condition.

And yet the unshakeable conviction that there has never been a genuine escape from nature, and that our temperament and conduct of life are still strongly determined by an ancestral biological nature, has led some to speculate that the hypothetical Natural Man cannot be very different from the chimpanzee (Stanford, 2001).

According to these evolutionists, humans largely share with their primate "cousins" a universal, genetically transmitted hominid nature. Since culture, in their eyes, is the precipitate of our evolutionary needs, it follows that human culture is nothing but a more sophisticated version of chimpanzee culture (Kuper, 1994). Yet, as Charles S. Peirce pointed out, it is a serious logical error to assume that two things closely resembling each other in certain respects are very likely to be similar in other respects as well. The evolutionary paths of the human species and of the other primate species diverged almost 7 million years ago. It is hard to believe that one can turn a blind eye to what may have happened in those 14 million years (seven million years for each branch) of divergent evolution.

It is possible that the biological evolution of chimpanzees was not accompanied by significant changes in their behaviour and way of life (Tattersall, 1998). In any case, to speak of culture in the case of the other primates is dramatically reductive. The supposed cultural evolution of chimpanzees is so feeble that they have not yet reached even the Stone Age, and the size of their skull has remained virtually unchanged for millions of years, at least as far as we can tell at present.

By contrast dogs, which are genetically more distant from us than chimpanzees but have shared our domestic environment for at least 10,000 years and perhaps far longer, are able to communicate with human beings in more complex ways; in a sense, they partake more of human culture and intelligence.

It seems that some primatologists, instead of focusing on what makes chimpanzees unique, and therefore worthy of consideration in their own right, are obsessed with proving how similar they are to human beings, as if this made them more special and more deserving of protection. In other words, precisely those who accuse theologians and anthropologists of anthropocentrism (as if this were a fault) are by no means free of it themselves, quite the contrary.

The problem is that defending animal rights is "politically correct" and morally laudable, but this does not make it scientifically rigorous. Even less so if it is done by minimising the objective and extraordinary differences that separate human beings from the rest of the animal kingdom. There is nothing anthropocentric in stressing what makes us unique, an incontrovertible matter of fact, just as it would be unfair to accuse of "bonobocentrism" those who point out the uniqueness of bonobo chimpanzees.

Every living being is unique and precious but, as the geneticist and evolutionary biologist Theodosius Dobzhansky observed in a felicitous play on words, "all species are unique, but the human species is the uniquest."

A chimpanzee is a chimpanzee, part of a branch of the tree of life that will never be reborn identical to itself, not a prematurely failed attempt to generate a human being. Forcing captive chimpanzees to behave like human beings, in order to prove that they possess innate capacities for problem-solving, for the use of rudimentary forms of language, and for the formation of self-awareness, seems to testify, at best, to the difficulty some experts have in accepting that a member of one species will never feel at ease behaving like a member of another. At worst, it would be the anthropomorphising obstinacy of those who see in the chimpanzee an unfinished human being, whose development was halted by causes unknown to us, rather than a complete chimpanzee. Since one cannot speak of racism here, one ought to speak of "speciesism."

Is this not the same colonial and imperialist attitude the West adopted towards "primitives"?

The fact is that describing animal behaviour in human terms means metaphorically placing animals within the matrix of human institutions and practices, humanising them, and then inferring from the similarities thus inevitably found that human behaviour cannot but be rooted in animal behaviour. In short, by a circular argument, the conclusions that ought to be demonstrated are smuggled into the premise.

However, even if certain animals behave in ways apparently similar to human beings, it does not follow that they do so for the same reasons that guide our behaviour. After all, human beings are animals, but the converse is not true. This conclusion should be obvious to anyone not beguiled by the charming image of a natural Disneyland. The humanisation of animals and the animalisation of humans are obstacles to science. Even though a certain propensity persists to picture man as "an artichoke from which you can peel away the prickly leaves of culture, leaving only the naked, tender heart of natural man" (Marks, 2003: p. 163), it must be made clear once and for all that there is no human nature outside culture. Nature and culture cannot be separated in order to confirm whichever hypotheses we happen to fancy.

Not even the intent, however noble, of making human beings less arrogant towards the other animals can justify the inclination to minimise the characteristics that distinguish us from the other primates, which are certainly not our hair or our promiscuity, but our intellectual faculties. Rejecting anthropocentrism in order to knock man off his pedestal and promote greater sensitivity towards animals is a moral choice, not a scientific one. Human beings are cultural animals, and our capacities are additional, not alternative, to those of the other animals. This is precisely what makes us animals different from all the others. The quantitative difference between human development and that of the other animals is such that it can legitimately be described as a qualitative divergence.

After all, the other primates have had the same amount of time to develop a culture of their own, and yet so far we have found no unequivocal signs of cultural progress, if by "culture" we mean creativity, originality, innovation, and the ability to pass certain notions on to subsequent generations. Inventions are the fuel of cultural evolution, and animals, until proven otherwise, invent nothing; they merely imitate. They can use simple tools, but they cannot make them, nor do they appear able to transmit their proper use to the following generations. It is as if, at every generation, once the individuals who had learned by experience have died, everything had to start again from scratch.

Man is not a naked ape for the same reason that Barolo is not grape juice, even though both derive from grapes. Culture makes the difference: without culture, human societies would not differ much from those of apes, nor Barolo from grape juice. Anthropology and ethnology have the task of defining what this difference consists of. Ethology, the study of animal behaviour, can tell us very little about it.

The central issue in this debate is therefore clearly the role to be assigned to culture in human development. Culture is the fruit of our mental elaborations, that is, of our mind, which is an emergent property of the brain that we still struggle to understand, and which certainly cannot be reduced to our biological functions, to a matter of molecular biology, even though this is what certain scientists and commentators tend to do. To claim that culture is a biological phenomenon is an enormous piece of nonsense (Jones, 1993).

In the course of human progress our species has completed an essential transition, from body to mind, and the feats of our mind largely transcend our DNA and, in part, our corporeality. There are certainly numerous innate instincts, but our species, alone among all, is not bound to submit to them. Natural selection has been replaced by artificial selection, in an artificial environment which, by definition, does not produce evolution in the naturalistic sense of the term.

As for the capacity to acquire and manipulate a symbolic repertoire, it is not latent in the other primates; that is, it does not develop normally even when the most suitable conditions arise. Language, perhaps the most important component of culture, though certainly not the only one, constitutes such an evolutionary advantage that it is highly improbable that a species endowed with the ability to use it would not have availed itself of it. In nature there are no animals that teach other animals, except by passive emulation. In captivity, animals learn not by imitation, but only thanks to the intervention of human instructors. Language has enabled the human species to invent a specific use for objects, turning them into tools, and to make the members of a group understand the nature and function of those objects. That is, it has made possible the transformation of the living environment and hence of the species, an act that marks the beginning of history. This is what distinguishes culture, which includes technology, behaviour, ideas and language, from nature; the tool from the thing; information from knowledge; invention from the chance act; and the human being from every other animal.

In other words, ethics and culture are a product of the human mind in a humanised environment. We cannot derive any moral prescription from nature, because nature is entirely amoral. Our existence, or the existence of any other living species, is a matter of complete indifference to it. This is what made our escape from nature into culture necessary: nature had become too tight a fit for us.

Culture is therefore rightfully the pedestal that human beings built for themselves in the course of their evolutionary history and onto which they then climbed: no one put us there but ourselves, and not without good reason. At the cognitive level, we are not even a mere extrapolation or refinement of earlier tendencies (Tattersall & Schwartz, 2000).

It is conceptually and methodologically correct to regard ourselves as special and unique, without thereby feeling entitled to display haughty self-satisfaction. Preferring Barolo does not mean despising grape juice, and Humanism and the Enlightenment are the best evidence that feeling special can encourage the development of a sincere sense of universal responsibility, including and above all towards animals. Conversely, one of the lessons of history is that it is precisely the radical naturalisation of human beings that has almost always ended in genocide or ethnic cleansing. Charles Darwin himself understood this well in "The Descent of Man," in which he denounced in no uncertain terms the zoological reductionism of human beings, which blamed evil on the Beast Within (Mayr, 2000).

After all, the central dogma of National Socialism was that the struggle for survival is a law inscribed in nature and must, for that very reason, be applied to human societies. This sanctioned the emergence of the notion of a biotic community (biocracy), in which the separation between animals and human beings was abolished (zoological reductionism), while the dividing line between the sick and the healthy came to coincide with that between death and life (Sax, 2000). Only those who were constitutionally healthy and racially elect were worthy of prevailing in the struggle for life. The rest of the human species could only serve or die out. The tragic consequences of the idolatry of an anthropomorphised nature and of the naturalisation of human beings typical of Nazism can no longer be ignored. As already said, nature was, and increasingly is, too tight a fit for us, and so we were forced to flee from nature into culture. There is nothing morally abominable in this, with all due respect to the militants of Avatarism.


Geneticists, too, are gradually realising that they arbitrarily and hastily simplified what should have been examined with caution. For over thirty years we were told that, biologically and then genetically, the distance between chimpanzees and human beings was very small, about 1 per cent, and that this should suffice to silence those who insisted on stressing human peculiarities. But in recent times this orthodoxy has been called into question, and it has been asked whether that statistical value was not more of an obstacle than a heuristic tool, especially if it led scientists to construct patently flawed conceptual maps. Studies began to appear that reduced the convergence between us and chimpanzees. For many geneticists the distance now ranges between 5 and 17.4 per cent, and they attribute the selection of the inaccurate figure to an error of oversimplification. Some zoologists have remarked that the 1 per cent figure was fine as long as it was useful to highlight similarities, which until then had been underestimated. But then the facts made themselves felt. Steve Jones, professor of genetics at University College London, has rightly observed that "DNA has little to do with it… chimpanzees may resemble us in a literal and cloying sense, but in everything that makes us what we are, Homo sapiens is truly unique" (Cohen, 2007). The very authors of the first article responsible for spreading the "1 per cent myth," Mary-Claire King and Allan Wilson (King & Wilson, 1975), warned readers against jumping to hasty conclusions (p. 113): "It would seem that the methods for assessing the difference between humans and chimpanzees yield rather discordant conclusions," they stressed even then.
Indeed, this scrupulous effort to quantify differences, which had previously been employed to sift out what distinguished the members of the various human races, looks increasingly extravagant and unjustified: "I don't think there's any way to come up with any number at all," says the geneticist Svante Pääbo of the Max Planck Institute for Evolutionary Anthropology in Leipzig. "In the end, how we see our differences is just a political, social and cultural matter" (Cohen, 2007). Which proves once again that the merit of science lies precisely in its capacity for self-correction.

That said, blaming the evil in the world on anthropocentrism, and hence shrinking the distance between humans and anthropomorphic apes as if this could prevent future genocides, makes little more sense than relying on God's mercy to achieve the same result. Both approaches compromise the objectivity of observation and the interpretation of data.

There is a subtle irony in the parallel that can be drawn between 1-per-cent genetics and Enlightenment anthropology. When chimpanzees were discovered by the first explorers, in the seventeenth and eighteenth centuries, the obvious similarities were noted but, instead of a profound affinity and brotherhood being inferred from them, they served to strengthen even further the idea that man was a special being. What more overwhelming proof of man's uniqueness than the existence of anthropoid beings so closely resembling man, including and above all physiologically, yet entirely devoid of the moral character that only the human soul could inspire? Thus, the clearer the analogies appeared, the more undeniable the truth of the uniqueness of the human spiritual essence became (Wokler, 1993).

The pretty fable of the Good Ape, the "naturally" altruistic and peaceful bonobo, has by now been widely discredited as well. Only those who, perhaps dazzled by naturalist prejudice or misplaced sentimentality (perhaps the sense of an injustice crying out to be righted), chose to ignore the abyss of socio-cultural complexity that separates us from the bonobos could believe they might derive from their behaviour useful indications on how human affairs ought to proceed (Goodman et al., 2003). Forty years ago, at the height of the civil-rights struggle, chimpanzees were treated by primatologists as "noble savages." They then became cruel predators given to infanticide and cannibalism in the 1980s of rampant yuppies à la Gordon Gekko, the ruthless Michael Douglas character of the famous "Wall Street"; recently they have become fundamentally good again, though partly "corrupted" by contact with human beings, in the wild as in captivity. The question we must ask ourselves is whether it is they who have changed over these 40 years, or rather the moods of our society. In short, it would seem that our attitude towards others is largely determined by our self-perception.

The problem seems to be that when species very similar to our own are studied, the scientist tends to see what he wishes to see and to suppress the discrepancies that to some extent tarnish the desired image. Thus a description of the sexual and social behaviour of primates will be all the more popular the more it matches the public's expectations, that is, the more it reflects our species' self-portrait: if we perceive ourselves as naturally violent, then the other primates will, not by chance, confirm this perception of ours; if instead we stress our capacity to exercise a broad measure of moral discernment, then scientific studies will appear that "demonstrate" how apes have developed rudimentary but promising ethical systems.

Perhaps the most convincing explanation of the behavioural variations among primates is rather simple. Back in the 1920s, the anatomist Solly Zuckerman reported that the baboons at London Zoo displayed high levels of hierarchy and aggression. But none of his colleagues working in the field, in the baboons' own habitat, managed to replicate these observations. It soon became evident that the behaviour of Zuckerman's baboons had been dramatically altered by the reduction of the space in which they found themselves living (N.B.: wild bonobos, by contrast, have at their disposal an area as vast as England and free of rivals). The artificial restrictions imposed by an enclosed space such as a zoo had generated internal dynamics that could not exist outside, and so Zuckerman's observations were not generalisable (Rose, 1998).
