Ethics in postmodernism

Modernism held sway over Western thought for centuries. It stripped morality of its transcendent religious frame of reference. Away with God, was its cry. Even as it tried to shape a world without reference to restraints, constraints, traditions, and above all religion, modernism did attempt to retain such values as work, saving, and the postponement of immediate satisfaction in order to attain a long-term benefit. The values it did try to retain may owe their origin to a reference outside of the individual, but that was of no immediate concern to modernism. Subjective self-expression was its goal. But when modernism reached its critical point, when the emphasis on subjectivism destroyed the need for objectivism, it eventually led to an almost “lawless” status in human history. Consequently, a new morality emerged. This new morality was pleasure-seeking, playful, individualistic, and geared to the present moment, denying the need to look to the past or gaze into the future. Now became its new mantra. As a result, there arose a stand against all efforts to place limits on individual freedom and fulfillment.

This new morality is at the core of postmodern ethics.

Postmodern ethics

At the foundation of postmodern ethics is an authority crisis.1 The crisis involves traditional institutions (family, school, church, state, justice, police) through which modernism sought to organize a rational and progressive society. The crisis manifests itself in several ways: a society that worships youth and panders to its whims and fancies;2 a culture where wealth is the sign of success and happiness; a consumer economy where “to be” is to buy, consume, use, and throw away; an identity marked by market acquisitions and not by ideologies.3 Gilles Lipovetsky, a contemporary French philosopher, has observed that in postmodernity “imaging” dominates reality. To be somebody is to be on screen or on a web site.4 What is seen defines what is; almost nobody cares anymore about what “really” is: the public image is the object of worship.5

Our postmodern culture has lost its love for the truth.

In contrast to modernism’s work ethic and individual saving, today’s ethic affirms the values of consumer spending,6 free time, and idleness.7 But this could not function without the exaltation of individualism, a devaluation of charitable causes, and indifference toward the public good.8 The pursuit of gratification, pleasure, and private fulfillment is the supreme ideal. The worship of personal independence and diversity of lifestyle becomes important. Pluralism provides a multiplicity of values and individual options, but none with authenticity. Differences in ideology or religion are treated as superficial fashions.9 The culture of personal freedom, relaxation, the natural, the humorous, sincerity, and freedom of expression emerges as something sacred.10 The irrational is legitimized through affections, intuitions, feelings, carnality, sensuality, and creativity.11 All this takes place within the framework of an axiom respected by nearly all: minimize austerity and maximize desire; minimize discipline and maximize understanding.12

At the same time, the media of mass communication and information determine public opinion and the standards of consumer spending and behavior.13 The media replace religious interpretation and ethics with timely, instant, direct, and objective information. They value what seems real now above concepts of good and evil.14 Paradoxically, the influence of the media grows in the midst of a crisis of communication. People talk only of themselves. They want to be heard but do not want to listen. They want communication without commitment. Hence the search for connection at a distance: invisible friends, hotlines, e-mail chat rooms, and virtual friendships.15

A new shape to morality

What shape does morality take in the epistemological-social-cultural context of postmodernism?

According to Lipovetsky, with the dawn of postmodernism in the mid-20th century, an age of post-duty has emerged. This age renounces absolute duty in the field of ethics.16 An ethic has taken shape that proclaims the individual right to autonomy, to happiness, and to individual fulfillment. Postmodernism is a post-morality age because it disregards higher, unconditional values such as service to others and self-denial.

Nevertheless, our society does not exclude repressive and virtuous legislation (against drugs, abortion, corruption, evasion, the death penalty, and censorship; and for the protection of children, hygiene, and a healthy diet).17 Postmodernism does not propose moral chaos but rather redirects ethical concerns through a weak, ephemeral, painless commitment to values that do not interfere with individual freedom: it is not so much hedonistic as neo-hedonistic. This blend of duty and denial of duty in postmodern ethics becomes necessary because absolute individualism would destroy the conditions needed to facilitate the search for pleasure and individual fulfillment. An ethic is needed that prescribes some duties to control individualism without proscribing it. The postmodern moral concern does not express values, but rather indignation against limitations on freedom. The object is not virtue but rather the earning of respect.18 There is an effort to forbid everything that could limit individual rights. That is why the new morality can co-exist with consumer spending, pleasure, and the individual search for private fulfillment. It is a painless, “lite” morality where anything goes, but where unconditional duty and sacrifice are dead. Postmodernism has left behind both moralism and antimoralism.19

But such a course results in an ambiguous morality. On the one hand, we have an individualism without rules, manifested in family indebtedness, families without parents, parents without families, illiteracy, the homeless, ghettos, refugees, marginalized people, drugs, violence, delinquency, exploitation, white-collar crime, political and economic corruption, the unscrupulous grasping of power, genetic engineering, experimentation on human beings, etc. On the other hand, there floats over society a spirit of hyper-moralistic vigilance ready to denounce all attempts against human liberty and the right to individualistic autonomy: an ethical concern for human rights; apologies for errors of the past; environmentalism; campaigns for saying No to drugs, tobacco, pornography, abortion, sexual harassment, corruption, and discrimination; ethical tribunals; silent marches; protection against child abuse; movements to rescue refugees, the poor, etc.20

In this context, the neo-hedonistic morality of postmodern life translates into demands that pull in opposite directions. On the one hand, we have standards: you must eat healthfully, keep your figure, fight wrinkles, keep trim, value the spiritual, relax, be involved in sports, succeed, excel, control violent behavior, etc. On the other hand, we find the promotion of pleasure and the easy life, the exoneration from moral responsibility, the exaltation of consumer spending and image-making, and the valuing of the body to the neglect of the spiritual. As a result, there is depression, emptiness, loneliness, stress, corruption, violence, marginalization, cynicism, etc.21

Postmodern morality in everyday life

To understand how much postmodern morality has affected life around us, consider two typical lists that postmodernism projects: a list of moral “duties” and a list of moral “permissions”:

List 1: Typical “moral” duties in postmodern “ethics”:

  • Don’t discriminate against any kind of lifestyle.
  • Attend benefit concerts for charitable causes.
  • Dial a number to make a donation.
  • Paste an anti-racism logo on your windshield.
  • Walk in a march against perceived injustice.
  • Run in a marathon for a healthy life.
  • Use condoms.
  • Prohibit prohibition (everybody should be free to run his or her own life).
  • Wear a ribbon to protest discrimination against homosexuals.
  • Be an environmentalist.
  • Donate your body organs.
  • Regulate the workplace to prevent sexual harassment.
  • Be faithful (as long as love lasts, but afterward ...).
  • Condemn every kind of violence.
  • Don’t try to convert someone else to another religion.

List 2: Typical “moral” permissions:

  • Provide sexual freedom, but no harassment, and watch out for AIDS.
  • Corruption is better than being considered stupid.
  • Smoke, but not in the non-smoking section.
  • Have no commitments to rules, people, or causes that interfere with personal fulfillment.
  • Prostitution is OK, but only in the red-light district.
  • Lying is OK, but not during a political campaign.
  • Divorce is OK, but only to attain personal fulfillment.
  • Infidelity is OK, but only when love has vanished.
  • Abortion is OK, but only to further family planning.
  • Try anything in the pursuit of self-exploration, in search of personal fulfillment.
  • Adapt religion to the commitment one wants to make.
  • Drink, but not to excess.
  • Collect success, fame, and money, at the expense of whomever.
  • Have a good time; don’t worry about the future.

“Conscience code” of a post-moralist

Postmodern ethics does not stop with such ludicrous lists. Postmodernism’s spirit of ultimate freedom produces its own code of conscience. In an atmosphere of neo-individualism, ideological, socio-cultural, and ethical elements coalesce into a new kind of postmodern conscience. Its particulars would look something like this:

  • I must not discriminate, because I must keep an open mind and there are no absolute truths.
  • I must donate money to charitable causes because I’m turned off looking at hungry children.
  • I must walk in a march against impunity so that criminals will not get off easy.
  • I must live healthfully because my body is my tool to acquire success and pleasure.
  • I should take an interest in some kind of religion because it might energize me.
  • I should show a concern for serious topics so I won’t look like a cheap materialist and copycat.
  • I shouldn’t criticize any lifestyle because anything goes and nothing works.

Critical evaluation: A cynical morality

Having said all this, some may point out that postmodernist ethics is not all bad. Yes, there are some positive contributions made by the postmodern concern for problems that threaten human life today. A healthful lifestyle, care for the environment, and the struggle against violence and discrimination are all commendable. Furthermore, postmodernism points out the theoretical and practical ethical failures of the past. But let us not be deceived. At its core, the postmodern ethic does not have a moral motivation. In reality, it pursues the individualistic search for personal fulfillment and autonomy. While the motive behind all authentic ethics is to overcome evil with good, postmodernism is devoid of moral inspiration. It wants only to combat the excesses of evil but does not want to eradicate evil. It struggles against certain manifestations of evil without recognizing the root of evil. Its goal is the achievement of selfish autonomy—something against which the biblical portrayal of sin speaks so much.

How then can a moral system struggle against evil if its very foundation is the pursuit of self, which is, biblically speaking, the source of evil? Is it possible to achieve happiness within the kind of morality that postmodernism advocates? If happiness is the search for autonomy, personal fulfillment, the satisfaction of immediate desire, and the control of excessive individual freedom without a true opening of the soul to one’s neighbor and to God, then in this morality the search for happiness is a perpetuation of things as they always have been. More of the same: a mixture of life and death, pleasure and pain, success and failure, happiness and sadness. But this ignores what lies behind the human search for happiness: the desire for something else, something different, something that will do away with these antithetical clashes. That “something else” is missing in the postmodern search for happiness. Its ethics settles for a trifle, for a lower goal; it argues that because traditional moralities, including Christian ethics, have not changed us for the better, it is time to set a lower goal and accept people as they are.

However, this attitude of resignation assumes that Christianity has truly been applied and has failed, and that on this basis its potential to contribute must be judged nil. But this assumption contradicts the postmodern maxim that there is no absolute truth. There is no truth, says postmodernism, on the one hand. On the other hand, however, it presumes that traditional morality has run its course, that the human being of today cannot be improved on, that radical change is impossible, and that we should resign ourselves to that. Who can know that, and how can it be known? It would appear that postmodernism has somehow managed to know for sure a few things about human nature and about the future, a knowledge which it denies to all the ideologies and religions of the past. That is why we consider it a cynical posture, affirming (implicitly) on the one hand what it denies (explicitly) on the other.

Raúl Kerbs (Ph.D., Universidad de Córdoba)

Notes and references:

1. Kenneth Gergen, El yo saturado: Dilemas de identidad en el mundo contemporáneo (Barcelona: Paidós, 1992), pp. 164-168.

2. Beatriz Sarlo, Escenas de la vida posmoderna: Intelectuales, arte y videocultura en la Argentina (Buenos Aires: Ariel, 1994), pp. 38-43.

3. Sarlo, pp. 27-33.

4. Gilles Lipovetsky, El imperio de lo efímero (Barcelona: Anagrama, 1990), pp. 225-231.

5. Sarlo, pp. 27-33.

6. Lipovetsky, pp. 225-231.

7. Gilles Lipovetsky, La era del vacío: Ensayos sobre el individualismo contemporáneo (Barcelona: Anagrama, 1986), p. 14.

8. Lipovetsky, El imperio de lo efímero, pp. 201, 202.

9. Ibid., pp. 313-315.

10. Lipovetsky, La era del vacío, pp. 7-11.

11. Lipovetsky, El imperio de lo efímero, p. 196.

12. Lipovetsky, La era del vacío, p. 7.

13. Lipovetsky, El imperio de lo efímero, p. 251.

14. Ibid., pp. 256-258.

15. Ibid., pp. 321-324.

16. Gilles Lipovetsky, El crepúsculo del deber: La ética indolora de los nuevos tiempos democráticos (Barcelona: Anagrama, 1994), pp. 9-12, 46.

17. Lipovetsky, El crepúsculo del deber, p. 13.

18. Ibid., Chapters II, III.

19. Ibid., pp. 47-49.

20. Ibid., pp. 14, 15, 55, 56, 208, 209.

21. Ibid., pp. 55ff.

Have any man-made structures mentioned in the Bible been unearthed by archaeologists?

Yes, quite a number of Biblical structures have been excavated. Some of the most interesting are the following:


Author: Bryant Wood of Associates for Biblical Research

Einstein and Intelligent Design


In the past few years, numerous scientists, scientific journals, and popular authors have published a slew of articles and books attacking the concept of Intelligent Design. While not specifically denying the theory of evolution, the theory of Intelligent Design postulates that the incomprehensible vastness and complexity of the Cosmos are the result of design on the part of an inconceivably intelligent being.

Many scientists dismiss any concept of an intelligent designer as unscientific, and claim that any recognition of or belief in such a designer does harm to the scientific method. However, the greatest scientist who ever lived, Albert Einstein, did not share this outlook. His years of studying the universe not only led him to come up with the Theory of Relativity, but also led him to believe, in his own words, in a “spirit manifest in the laws of the universe,” in a “God who reveals Himself in the harmony of all that exists” (Isaacson 2007: 44). He once wrote:

“The religious inclination lies in the dim consciousness that dwells in humans that all nature, including the humans in it, is in no way an accidental game, but a work of lawfulness that there is a fundamental cause of all existence” (Ibid. 46).

In a 1930 essay entitled “What I Believe” Einstein wrote: “To sense that behind anything that can be experienced there is something that our minds cannot grasp, whose beauty and sublimity reaches us only indirectly: this is religiousness. In this sense, and in this sense only, I am a devoutly religious man” (Ibid. 47).

He also made the following statement in an essay entitled “The Religiousness of Science,” which appeared in a collection of his essays published in English under the title “The World As I See It”:

“The scientist is possessed by the sense of universal causation. His religious feeling takes the form of a rapturous amazement at the harmony of natural law, which reveals an INTELLIGENCE of such superiority that, compared with it, all the systematic thinking and acting of human beings is an utterly insignificant reflection. This feeling is the guiding principle of his life and work, in so far as he succeeds in keeping himself from the shackles of selfish desire” (Updike 2007: 77 [emphasis added]).

These statements are highly significant, considering that no scientist of any worth would dismiss Einstein as superstitious or unscientific. Moreover, the above quotes can’t be dismissed as the product of a religious bias on Einstein’s part, because, except for a brief period of “deep religiousness” when he was twelve, Einstein rejected organized religion (Ibid.).

According to the April 16, 2007, issue of Time magazine, in his youth Einstein “rejected at first his parents’ secularism and later the concepts of religious ritual and of a personal God who intercedes in the daily workings of the world” (Isaacson 2007: 44). The magazine further reported: “Einstein’s parents…were ‘entirely irreligious.’ They did not keep kosher or attend synagogue, and his father Hermann referred to Jewish rituals as ‘ancient superstitions,’ according to a relative” (Ibid.). As mentioned, the 12-year-old Albert briefly embraced strict Judaism, but he later wrote: “Through the reading of popular scientific books, I soon reached the conviction that much in the stories of the Bible could not be true” (Ibid. 46).

Einstein’s belief in an intelligent designer thus derived not from a pre-conceived religious bias, but from the phenomenal insights into the Universe that he possessed as the most brilliant scientist who ever lived. His recognition of a creator refutes the recent claims by atheists that belief in any sort of god is unscientific.

By Stephen Caesar

References:

Isaacson, W. 2007. “Einstein and Faith.” Time, 16 April.

Updike, J. 2007. “The Valiant Swabian.” The New Yorker, 2 April.

Stephen Caesar holds his master’s degree in anthropology/archaeology from Harvard.

Why Evolution is False

What Are We Talking About?

Here is Coyne’s definition of evolution:

In essence, the modern theory of evolution is easy to grasp. It can be summarized in a single (albeit slightly long) sentence: Life on earth evolved gradually beginning with one primitive species—perhaps a self-replicating molecule—that lived more than 3.5 billion years ago; it then branched out over time, throwing off many new and diverse species; and the mechanism for most (but not all) of evolutionary change is natural selection.1

Notice that he intentionally excludes the origin of life. He postulates the existence of a single kind of living thing, “perhaps a self-replicating molecule,” upon which all subsequent changes build. Because of this definition, he avoids all discussion of how a lifeless Earth produced that first living thing.

According to Coyne, evolution begins with a living thing that already has a mechanism for obtaining energy from the environment, a mechanism for storing that energy and converting it to other forms, the ability to use that energy for useful purposes, the ability to grow, the ability to reproduce itself, intrinsic genetic information, and a method for expressing that genetic information as physical features. This living thing came about by some natural process which we can’t even begin to imagine but which, supposedly, isn’t of any real importance to answering the question of how we came to be on this Earth.

Clearly, the origin of that first living thing is vital to the theory of evolution. Why doesn’t Coyne include the origin of life in his definition of evolution? You know the answer. He can’t begin to explain it. Defining evolution as he did gives him an excuse to not even try.

Excuses

If you are expecting a book with the title Why Evolution is True to contain proof for the theory of evolution, you will be disappointed. What it really contains is excuses: why evolutionists can’t prove evolution is true, why it is unreasonable to expect evolutionists to provide proof, and why you should believe in evolution anyway. Let the excuses begin!

Why We’ve Never Seen It

Nobody has ever observed macroevolution in the laboratory or in nature. Here is Coyne’s excuse for why we have not.

Further, we shouldn’t expect to see more than small changes in one or a few features of a species—what is known as microevolutionary change. Given the gradual pace of evolution, it’s unreasonable to expect to see selection transforming one “type” of plant or animal into another—so-called macroevolution—within a human lifetime. Though macroevolution is occurring today, we simply won’t be around long enough to see it. Remember that the issue is not whether macroevolutionary change happens—we already know from the fossil record that it does—but whether it was caused by natural selection, and whether natural selection can build complex features and organisms.2 [italics his]

There is a process known as “microevolution” that really does occur. Microevolution is the variation within a species that occurs because of loss of genetic information. But he is talking about “macroevolution,” which is the creation of a new kind of living thing resulting from genetic information that previously did not exist.

He asserts, without proof, that macroevolution is occurring today, while admitting that one can’t see it happening. That is, genetic information is supposedly arising spontaneously that will create a new kind of creature. He just knows it, even though nobody can actually see it. The alleged reason nobody can see it is because it happens so slowly.

For one thing, natural selection in the wild is often incredibly slow. The evolution of feathers, for example, probably took hundreds of thousands of years. Even if feathers were evolving today, it would simply be impossible to watch this happening in real time, much less to measure whatever type of selection was acting to make feathers larger.3

The real reason why nobody has ever seen it is because it hasn’t happened! Genetic information doesn’t just magically appear.

He thinks he sees macroevolution in the fossil record. This is remarkable because he spends so many pages trying to explain why there are no missing links in the fossil record!

Why There Are No Missing Links

We don’t find any missing links in the fossil record but, according to Coyne, we should not expect to find any.

Taking into account all of these requirements, it’s clear that the fossil record must be incomplete. … we can estimate that we have fossil evidence of only 0.1 percent to 1 percent of all species—hardly a good sample of the history of life!4 [italics his]

What should our “missing link” with apes look like? Remember that the “missing link” is the single ancestral species that gave rise to modern humans on the one hand and chimpanzees on the other. It’s not reasonable to expect the discovery of that critical single species, for its identification would require a complete series of ancestor-descendant fossils on both the chimp and human lineages, series that we could trace back until they intersect at the ancestor. Except for a few marine microorganisms, such complete fossil sequences don’t exist. And our early human ancestors were large, relatively few in number compared to grazers like antelopes, and inhabited a small part of Africa under dry conditions not conducive to fossilization. Their fossils, like those of all apes and monkeys, are scarce. This resembles our problem with the evolution of birds from feathered reptiles, for whom transitional fossils are also rare. We can certainly trace the evolution of birds from feathered reptiles, but we’re not sure exactly which fossil species were the direct ancestors of modern birds.

Given all this, we can’t expect to find the single particular species that represents the “missing link” between humans and other apes. We can hope only to find its evolutionary cousins. Remember also that this common ancestor was not a chimpanzee, and probably didn’t look like either modern chimps or humans. Nevertheless, it’s likely that the “missing link” was closer in appearance to modern chimps than to modern humans. We are the odd man out in the evolution of modern apes, who all resemble one another far more than they resemble us.5 [italics his]

We will return to this issue of humans being so different from modern apes later; but let’s stick to the impossibility of finding missing links for the moment.

Clearly, he is talking out of both sides of his mouth. He says that complete fossil sequences don’t exist, except for a few microscopic marine organisms. Microscopic fossils are controversial because scientists don’t always agree that they even are fossils. But let’s suppose they really are fossils. Just because they look similar doesn’t necessarily mean that they are biologically descended from one another. Even if they are descended from one another, they are all still just microorganisms which demonstrate variation—not evolution. So, actually, the alleged microscopic fossils don’t really show evolution.

Human and bird fossils allegedly provide the best (although incomplete) sequence of fossils, but even they don’t really show a clear pattern of evolution, so Coyne remains in full-blown excuse mode.

Although far from complete, the record of human evolution is one of the best confirmations we have of an evolutionary prediction, and is especially gratifying because the prediction was Darwin’s.

But a few caveats. We don’t (and can’t expect to) have a continuous fossil record of human ancestry. Instead, we see a tangled bush of many different species. Most of them went extinct without leaving descendants, and only one genetic lineage threaded its way through time to become modern humans. We’re not sure yet which fossil species lie along that particular thread, and which were evolutionary dead ends. The most surprising thing we’ve learned about our history is that we’ve had many close evolutionary cousins who died out without leaving descendants. It’s even possible that as many as four humanlike species lived in Africa at the same time, and maybe in the same place. Imagine the encounters that might have taken place! Did they kill one another, or try to interbreed?6

After saying they are unable to tell how the different fossils are related, he next admits they aren’t even able to classify the fossils with any degree of certainty.

And the names of ancestral human fossils can’t be taken too seriously. Like theology, paleontology is a field in which the students far outnumber the objects of study. There are lively—and sometimes acrimonious—debates about whether a given fossil is really something new, or merely a variant of an already named species. These arguments about scientific names often mean very little. Whether a humanlike fossil is named as one species or another can turn on matters as small as half a millimeter in the diameter of a tooth, or slight differences in the shape of the thighbone.7

It is important to remember that when paleontologists talk about “human fossils” they generally aren’t talking about complete skeletons. Often they are talking about one or two bones, a partial skull, or a few teeth. One can’t even be sure that the teeth and bones go together. This is why there are so many arguments. The models of our “human ancestors” that are displayed in museums are based on a few bones and a lot of speculation based on the presumption of evolution.

Here is his self-contradictory summary.

Looking at the whole array of bones, then what do we have? Clearly, indisputable evidence for human evolution from apelike ancestors. Granted, we can’t yet trace out a continuous lineage from an apelike early hominid to modern Homo sapiens. The fossils are scattered in time and space, a series of dots yet to be genealogically connected. And we may never have enough fossils to join them.8

It is indisputable and yet unproven. How can you argue with “logic” like that?

For the Birds

Coyne makes general claims that the evolution of birds from dinosaurs, and the origin of flight, are well documented in the fossil record. But when he gets to specifics, he just makes excuses for why evolutionists don’t really know anything at all about the evolution of birds.

Because reptiles appear in the fossil record before birds, we can guess that the common ancestor of birds and reptiles was an ancient reptile, and would have looked like one. We now know that this common ancestor was a dinosaur.9 [italics his]

Coyne so easily goes from “guess” to “know.” Even if the fossil record showed that a particular reptile died before a particular bird, it doesn’t prove that the bird is a biological descendant of the reptile. It is an indisputable fact that President George Washington died in 1799, long before Big Brown (the horse that won the 2008 Kentucky Derby) was born. Does that prove that Big Brown was a biological descendant of George Washington? Of course not!

We want you to get the full impact of Coyne’s explanation of bird evolution, so here is a long passage. The italics for emphasis in the quote are his.

But if feathers didn’t arise as adaptations for flying, what on earth were they for? Again, we don’t know. They could have been used for ornamentation or display—perhaps to attract mates. It seems more likely, though, that they were used for insulation. Unlike modern reptiles, theropods may have been partially warm-blooded; and even if they weren’t, feathers would have helped maintain body temperature. And what feathers evolved from is even more mysterious. The best guess is that they derive from the same cells that give rise to reptilian scales, but not everyone agrees.

Despite the unknowns, we can make some guesses about how natural selection fashioned modern birds. Early carnivorous dinosaurs evolved longer forelimbs and hands, which probably helped them grab and handle their prey. That kind of grabbing would favor evolution of muscles that would quickly extend the front legs and pull them inward: exactly the motion used for the downward stroke in true flight. Then followed the feathery covering, probably for insulation. Given these innovations, there are at least two ways flight could have evolved. The first is called the “trees down” scenario. There is evidence that some theropods lived at least partly in trees. Feathery forelimbs would help these reptiles glide from tree to tree, or from tree to ground, which would help them escape predators, find food more readily, or cushion their falls.

A different—and more likely—scenario is called the “ground up” theory, which sees flight evolving as an outgrowth of open-armed runs and leaps that feathered dinosaurs might have made to catch their prey. Longer wings could also have evolved as running aids. The chukar partridge, a game bird studied by Kenneth Dial at the University of Montana, represents a living example of this step. These partridges almost never fly, and flap their wings mainly to help them run uphill. The flapping gives them not only extra propulsion, but also more traction against the ground. Newborn chicks can run up 45-degree slopes, and adults can ascend 105-degree slopes—overhangs more than vertical!—solely by running and flapping their wings. The obvious advantage is that uphill scrambling helps these birds escape predators. The next step in evolving flight would be very short airborne hops, like those made by turkeys and quail fleeing from danger.

In either the “trees down” or “ground up” scenario, natural selection could begin to favor individuals who could fly farther instead of merely gliding, leaping, or flying for short bursts. Then would come the other innovations shared by modern birds, including hollow bones for lightness and that large breastbone.

While we may speculate about the details, the existence of transitional fossils—and the evolution of birds from reptiles—is fact.10

The only real science here is the study showing that wings can help birds run uphill. All the rest is, as Coyne admits, speculation—and therefore an undeniable fact!

We don’t have space this month to point out all the times Coyne makes bold general claims about the fossils, and then makes excuses for why the fossil data doesn’t support the general claim. We hope we have given you enough examples to prove our point, and hope that you read his book to find more examples for yourself.

Not Like Apes

Earlier in this essay we did promise, however, to examine Coyne’s statement about humans being so different from apes. This is important because evolutionists are stuck in the middle. On the one hand, they need to prove that we are so close genetically to apes that we must be biologically related to them. On the other hand, they need to explain how such a small genetic difference can produce such obvious, significant differences between men and apes.

That oft-quoted 1.5 percent difference between ourselves and chimps, then, is really larger than it looks … More than 6 percent of genes found in humans simply aren’t found in any form in chimpanzees. There are over fourteen hundred novel genes expressed in humans but not in chimps. … Despite our general resemblance to our primate cousins, then, evolving a human from an apelike ancestor probably required substantial genetic change.11 [italics his]

He is pretty close to the truth here. We’ve shown before that the allegedly small genetic difference between apes and man is a fictitious result of some artful mathematics.12 There really is a substantial genetic difference between apes and humans, which evolutionists don’t like to admit because it weakens their argument that we share a common biological ancestor.

The Discontinuity Problem

The most basic problem with the theory of evolution is staring us right in the face, but it is so obvious that it is often overlooked.

Indeed, perhaps the most striking fact about nature is that it is discontinuous. When you look at animals and plants, each individual almost always falls into one of many discrete groups. When we look at a single wild cat, for example, we are immediately able to identify it as either a lion, a cougar, a snow leopard, and so on. All cats do not blur insensibly into one another through a series of feline intermediates. And although there is a variation among individuals within a cluster (as all lion researchers know, each lion looks different from every other), the clusters nevertheless remain discrete in “organism space.” We see clusters in all organisms that reproduce sexually.

These discrete clusters are known as species. And at first sight, their existence looks like a problem for evolutionary theory. Evolution is, after all, a continuous process, so how can it produce groups of animals and plants that are discrete and discontinuous, separated from others by gaps in appearance and behavior? How these groups arise is the problem of speciation—or the origin of species.

That, of course, is the title of Darwin’s most famous book, a title implying that he had a lot to say about speciation. … Yet Darwin’s magnum opus was largely silent on the “mystery of mysteries.” And what little he did say on this topic is seen by most modern evolutionists as muddled.13 [italics his]

If the theory of evolution were true, then plants and animals really would blur together without clear distinctions. It really is a problem for which Coyne has no good answer.

No Excuse for Sex

The origin of sex is one of the hardest things for evolutionists to explain. Coyne doesn’t have an answer. As usual, he just punts.

The question of the number of sexes is a messy theoretical issue that needn’t detain us, except to note that theory shows that two sexes will evolutionarily replace mating systems involving three or more sexes: two sexes is the most robust and stable strategy.

The theory of why the two sexes have different numbers and sizes of gametes is equally messy. This condition presumably evolved from that in earlier sexually reproducing species in which the two sexes had gametes of equal size.14

False Claims

On those rare occasions when Coyne isn’t attacking creationists or making excuses for why there isn’t any real proof for evolution, he makes false claims about evidence for evolution. Here are just a few.

If we know the half-life, how much of the radioisotope was there when the rock was formed (something that geologists can accurately determine), and how much remains now, it’s relatively simple to estimate the age of the rock.15

Geologists have no possible way of knowing how much radioactive material was in the rock when it formed.
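
To make the disagreement concrete, here is the standard decay arithmetic both sides are referring to. This is a minimal textbook sketch, not Coyne’s own presentation: N_0 is the amount of the parent isotope when the rock formed, N(t) is the amount measured today, λ is the decay constant, and T_{1/2} is the half-life.

\[
N(t) = N_0\,e^{-\lambda t}, \qquad \lambda = \frac{\ln 2}{T_{1/2}},
\qquad\text{so}\qquad
t = \frac{T_{1/2}}{\ln 2}\,\ln\frac{N_0}{N(t)}.
\]

Solving for the age t requires a value for N_0 that is independent of the present-day measurement of N(t); that initial amount is precisely the quantity in dispute here.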

Several radio-isotopes usually occur together, so the dates can be cross-checked, and the ages invariably agree.16

No, they don’t invariably agree, unless you throw out the ages that don’t agree! The discordant dates of the Apollo 11 moon rocks are typical. (Only 10 of 116 measurements agreed with the “accepted” age of the moon.17)

The fossil record documents the gradual loss of toes over time, so that in modern horses only the middle one—the hoof—remains.18

This story about horse evolution has been debunked by evolutionists themselves for years! Even the Chicago Field Museum admits it.19, 20 How could Coyne not know that?

Getting His Haeckels Up

Coyne even goes so far as to try to defend Ernst Haeckel’s biogenetic law, sort of.

Noting this principle, Ernst Haeckel, a German evolutionist and Darwin’s contemporary, formulated a “biogenetic law” in 1866, famously summarized as “Ontogeny recapitulates phylogeny.” This means the development of an organism simply replays its evolutionary history. But this notion is true in only a limited sense. Embryonic stages don’t look like the adult forms of their ancestors, as Haeckel claimed, but like the embryonic forms of ancestors. Human fetuses, for example, never resemble adult fish or reptiles, but in certain ways they do resemble embryonic fish and reptiles. Also the recapitulation is neither strict nor inevitable: not every feature of an ancestor’s embryo appears in its descendants, nor do all stages of development unfold in a strict evolutionary order. Further, some species, like plants, have dispensed with nearly all traces of their ancestry during development. Haeckel’s law has fallen into disrepute not only because it wasn’t strictly true, but also because Haeckel was accused, largely unjustly, of fudging some drawings of early embryos to make them look more similar than they really are. Yet we shouldn’t throw out the baby with the bathwater. Embryos still show a form of recapitulation: features that arose earlier in evolution often appear earlier in development. And this makes sense only if species have an evolutionary history.

Now, we’re not absolutely sure why some species retain much of their evolutionary history during development. The “adding new stuff onto old” principle is just a hypothesis—an explanation for the facts of embryology.21 [italics his]

In summary, embryos look similar during development, except when they don’t; and this only makes sense to evolutionists. They don’t know why this happens. They don’t know why it only happens in some species. But it explains the facts of embryology!

We don’t know why Coyne thinks Haeckel was “unjustly” accused of faking the drawings. There is no question that he did fake them. His guilt has been known for decades.

Ignore the Contradictions

The theory of evolution is full of contradictions, resulting in debates and arguments among evolutionists. Coyne says these controversies prove how strong the theory is.

Critics of evolution seize upon these controversies, arguing that they show something is wrong with the theory of evolution itself. But this is specious. There is no dissent among serious biologists about the major claims of evolutionary theory—only about the details of how evolution occurred, and about the relative roles of various evolutionary mechanisms. Far from discrediting evolution, the “controversies” are in fact the sign of a vibrant, thriving field. What moves science forward is [sic] ignorance, debate, and the testing of alternative theories with observations and experiments. A science without controversy is a science without progress.22

This is just amazing! There are controversies precisely because the theory is wrong. He says all the people who believe in evolution really believe in evolution (they just believe other believers in evolution are wrong). The fact that there is so much ignorance and controversy about evolution proves how true it must be.

If it is true that debate about evolution promotes scientific progress, why is it that evolutionists go to court to prevent debate about evolution from being discussed in American public schools?

The more you read about evolution, written by evolutionists, the less you will believe it!

Footnotes:

1 Coyne, Why Evolution is True, 2009, page 3
2 ibid. page 133
3 ibid. page 132
4 ibid. page 22
5 ibid. pages 195-196
6 ibid. pages 196-197
7 ibid. page 197
8 ibid. page 207
9 ibid. page 34
10 ibid. pages 46-47
11 ibid. pages 210-211
12 Disclosure, January 2003, “98% Chimp”
13 Coyne, Why Evolution is True, 2009, pages 169-170
14 ibid. page 156
15 ibid. page 23
16 ibid. page 24
17 Disclosure, June 2008, “The Age of the Moon”, http://www.scienceagainstevolution.org/v12i9f.htm
18 Coyne, Why Evolution is True, 2009, page 65
19 Disclosure, February 2002, “Horses and Peppered Moths”, http://www.scienceagainstevolution.org/v6i5f.htm
20 Disclosure, October 1997, “Education Behind the Times”, http://www.scienceagainstevolution.org/v2i1e.htm
21 Coyne, Why Evolution is True, 2009, page 78
22 ibid. page 223


Does Religion Always Lose?



A common debating tactic, and a successful one in the eyes of many, is to say that whenever religion and science have a dispute about some question of fact, religion always loses.1 The implication is that religion should never make any factual claims, and it is even implied that religion has no contact with reality. Supporting evidence for this claim is said to include the physics of Galileo, the geology of Hutton and Lyell, the biology of Darwin, and the psychology of Freud and others. Religion, especially supernatural religion, has always lost in the past, and it will always lose in the future. We should either abandon it or at least adopt a liberal version that makes no testable claims.

There are several problems with the above scenario. First, strictly speaking, the disputes were not really between science and religion; there were scientists on the “religion” side, and theologians on the “science” side. It would be more proper to make the claim that the argument was between naturalistic and supernaturalistic philosophies.

If so, the Galileo affair does not really belong with the other examples. The Galileo affair resulted from the reaction of the Catholic Church, which had just been rocked by the Protestant Reformation, to the cosmology of Copernicus. The only issues which might impact the conflict between naturalistic and supernaturalistic philosophy were whether incidental details in the Bible were to be treated as ontologically (really) accurate, or merely phenomenologically (only describing appearances) accurate, and the authority of the Catholic Church. As far as I know, it does not even involve the authority of the Pope speaking ex cathedra, as I know of no such pronouncement of the Pope on the Galileo affair.

It could be (and has been) argued that the other “advances” listed above were not really advances. Certainly a creationist will not find them very persuasive. But there is a more basic flaw in the argument. Specifically, there are important counterexamples to the argument. Religion does not always lose.

We need to rephrase the above statement to give it more empirical content, because we can never be completely certain that science has established a particular theory. Even if a theory appears to be well ahead of another, it is always possible that more evidence will tip the scales in favor of the currently out-of-favor theory. Thus a believer in naturalism could always claim that in a given subject where a supernaturalist explanation fits best with the known facts, more facts will tip the scales. Just wait a while; your supernatural explanation will turn out to be wrong or unnecessary. Of course, a supernaturalist could argue in a similar manner. And both statements are basically faith statements. The only evidence we can have for them is that the same process has occurred in other areas of knowledge in the past.

So we will rephrase the proposition more carefully. Scientific and historical hypotheses arising from and/or compatible with supernaturalistic philosophy sometimes have considerably more empirical support than hypotheses arising from and/or compatible with naturalistic philosophy. Perhaps more importantly, this support has, in some cases, increased with time.

In the domain of history, one counterexample to the “religion always loses” argument is the reliability of the chronology of the books of Kings and Chronicles in the Bible. For a long time, skeptics believed a “Biblical” chronology did not exist, and that what confused pieces of chronology did exist were totally incompatible with the “real,” secular chronology. After Thiele,2 the chronology of Kings and Chronicles was (and is) seen not only as coherent, but able to serve as a corrective to secular chronology.3 A Biblical approach has won, or at least has shown itself to be much better at explaining the data. Religion did not lose in this case, and it appears unlikely to lose in the future here.

Another counterexample is the book of Daniel, where skeptics originally confidently stated that Belshazzar never existed, that the chronology was hopelessly confused, and that since the entire book was fiction, there was no point in looking for the characters in history. With time, that view of history has been forced to change. Belshazzar not only existed, but also turned out to be the crown prince (also a king in Hebrew parlance), able only to offer the third rulership in the kingdom. The chronology of Nebuchadnezzar taking captives from Jerusalem turns out to have been precisely correct. Perhaps most interesting, the names of Daniel4 and his three friends5 have been found in Babylonian documents. This does not mean that every statement in the book of Daniel has been confirmed. The identity of Darius the Mede is still in doubt (although we have not eliminated all candidates). But the case for the historicity of Daniel is clearly better than it was in the past. Religion is winning here.

These cases are from history. Can the same be said of science? If one is a Seventh-day Adventist, it can. For over a century, Adventists defended, on the basis of what they believed to be inspiration, the view that tobacco was an insidious but deadly poison. At the time this view was not shared by the scientific community, but over the last 50 years the evidence has become overwhelming that the hypothesis originally associated with religion was correct. Religion did not lose here. The same comments, although not quite as vigorously, can be made about vegetarianism.

But it could be countered that these supernaturalist positions were sectarian, and in any case did not deal a major blow to naturalism. Are there any cases more directly relevant to the creation-evolution controversy? It turns out there are. The first example is in cosmology. The question at issue was whether the universe extended backwards in time indefinitely or if there was a finite limit to the age of the universe.

The former was strongly favored by most scientists, often with an explicit anti-supernatural bias expressed as the reason for their preference.6 This bias formed a major part of the objection to Big Bang cosmology. If the universe had a beginning, it at least suggested that it might require a Creator. The desire to protect an eternal universe was so great that, in attempting to do so, Einstein made what he later called his “greatest mistake”: introducing a cosmological constant into the equation for the universe to keep it roughly static. However, the weight of evidence now is solidly behind the concept that the universe did have a beginning. Religion is not losing here.

Another example is the existence of vestigial organs. Vestigial organs have been used as an argument against design, and therefore against a designer, since Darwin. In the classical exposition, Wiedersheim listed over 150 structures that he considered vestigial.7 He was careful to note that some of them, such as the thyroid and adrenal glands, probably had some function, in which case they might not be truly vestigial, and that this could be the case with other organs. But some of his followers were not so cautious, and it was not uncommon for such organs as the thymus, the pituitary, and the appendix to be written off as completely useless.

This lack of caution was necessary if vestigial organs were to be used against believers in design, because if there was some function that could be attributed to them, then their existence in a designed organism would not count as evidence against a designer.8 However, this lack of caution was ill-advised, as further investigation has found a reasonable function for all these structures, destroying, sometimes dramatically, the argument against design. It could be argued that in this case anti-supernaturalist prejudice actually was detrimental to science, tending to cause scientists not to investigate possible functions for a structure because the prejudice was that it had no function.

It could be further argued that anti-supernatural prejudice actually killed people. Although the spleen was not on Wiedersheim’s list, when I went to medical school it was commonly written off as a practically useless organ that we would be better off not having, as it tended to bleed when it got injured. Its only use was to show that humans and dogs, for example (where it stores blood for autotransfusion in case of bleeding), shared a common ancestor. As a result, when it did get injured, it was commonly removed, without any attempt to preserve its function. It was only later that it became apparent that not having a spleen predisposed one to overwhelming pneumococcal infections. Surgical practice today is to preserve splenic function whenever possible, either by repairing the spleen or, failing that, by leaving small bits in the abdomen and hoping that they attach themselves.

History repeated itself with the “junk DNA” controversy. When DNA was discovered, many evolutionists predicted that there were vast quantities of totally useless DNA in the genome of various organisms, including humans. As noted by Standish,9 they were perhaps ignoring evolutionary theory in their anti-supernaturalist bias. But the point remains that supernaturalists generally made a better prediction about the extent of “junk DNA,” and that in this case an anti-supernaturalist bias actually hindered research (the reverse of what is usually claimed).

This brings up an important point. One of the reasons “science” (naturalism) claims not to lose is that it incorporates findings which were originally thought to favor “religion” (supernaturalism). Thus the temporality of the universe, and some other ideas such as the harmfulness of tobacco, are simply incorporated into the naturalistic model, and the modern believer in naturalism often may not be aware of the religious overtones to the previous controversies. The topic is viewed as simply another example of the steady advance of science.

The same could have been true for religion. For example, most theologians have incorporated a heliocentric view of the solar system into their theology. But the believers in naturalism will not let them forget that at one time the majority of Christians (not all; note Philip Melanchthon) disagreed with the heliocentric theory, and the Catholic Church disagreed strongly enough that it forced Galileo to recant and banned his books, an action it has been forced to repudiate. The Church was in error here. But if one can hold modern Christianity accountable for the mistakes of the majority of its predecessors, one can also hold naturalism accountable for the mistakes of the majority of its predecessors.

This brings us to a final point. The argument that “religion always loses” is used to avoid having to deal with some subject where supernaturalism is apparently winning at present, and where, if it wins, naturalism is dead. Naturalism can survive the historicity of the numbers in Kings and Chronicles, or the toxicity of tobacco, or even (as deism) the Big Bang. Naturalism cannot survive without a naturalistic explanation for the origin of life. And yet there is no such explanation, not even a remotely plausible one. The more we know, the worse it looks. Naturalism implicitly recognizes this.

The best evidence for this is the insistence on the monophyletic origin of life. In the face of the Cambrian explosion and different genetic codes for some organisms (e.g., Paramecium), naturalists continue to insist that all organisms on Earth share a common ancestor. If they really believed that life were that easy to start, they would simply accept the hypothesis that it started a number of different times. The fact that they insist on the monophyletic origin of life is testimony that they implicitly recognize that it is extremely difficult to get life started even once, let alone multiple times.

But believers in naturalism are absolutely committed to a naturalistic origin for life. Some idea of the strength of the commitment can be gathered from a passage in an excellent (and still accurate) book by Robert Shapiro entitled Origins: A Skeptic’s Guide to the Creation of Life on Earth.10 In it he points out the flaws of the various theories, finally opting for a theory of short non-modern peptides as the least problematic. But on p. 130 he displays his own viewpoint:

Some future day may yet arrive when all reasonable chemical experiments run to discover a probable origin for life have failed unequivocally. Further, new geological evidence may indicate a sudden appearance of life on the earth. Finally, we may have explored the universe and found no trace of life, or processes leading to life, elsewhere. In such a case, some scientists might choose to turn to religion for an answer. Others, however, myself included, would attempt to sort out the surviving less probable scientific explanations in the hope of selecting one that was still more likely than the remainder.


So naturalism requires a defense against the obvious. And the best defense is, “We have never lost yet. You always do if you wait long enough.” In the case of the origin of life, it appears that naturalism would have lost a long time ago if its adherents had not refused to recognize the loss. The major problem with the “religion always loses” defense is that it is not true. Even in hindsight it is not true without distorting the record, and from a prospective point of view (the only point of view from which we can currently view the future), it is certainly not true. It should be recognized as what it is: a faith statement disagreeing with the apparent lessons of history. Religion does not always lose.

P.A.G.

ENDNOTES

1. See, for example: (a) Yandell KE. 1986. Protestant theology and natural science in the
twentieth century. In: Lindberg DC, Numbers RL, editors. God and Nature: Historical
Essays on the Encounter between Christianity and Science, p 448-471. Berkeley and
London: University of California Press; (b) White AD. A history of the warfare of science
with theology in Christendom. 2 vols. NY: Dover Press.
2. Thiele E. 1983. The mysterious numbers of the Hebrew Kings. 3rd ed. Grand Rapids,
MI: Zondervan Publishing House.
3. Strand KA. 1996. Thiele’s biblical chronology as a corrective for extrabiblical dates.
Andrews University Seminary Studies 34:295-317.
4. Shea W. 1988. Bel(te)shazzar meets Belshazzar. Andrews University Seminary Studies
26:67-81.
5. Shea W. 1982. Extra-biblical texts and the convocation on the Plain of Dura. Andrews
University Seminary Studies 20:29-57.
6. Robert Jastrow (1978. God and the astronomers. NY: W. W. Norton and Co.) notes the
phenomenon. Although the supernaturalists were not always on one side, or the naturalists
on the other, as noted by Helge Kragh (1999. Cosmology and controversy. Princeton,
NJ: Princeton University Press, p 251-268), there was still a tendency to line up on the
side most compatible with one’s evaluation of theism.
7. Wiedersheim R. 1895. The structure of man: an index to his past history. Bernard H,
Bernard M, translators; Howes GB, editor. London: Macmillan and Co.
8. For an anti-supernaturalist argument to succeed, it is important for the structure under
consideration to have no function. It is not enough simply for it to have minimal and
easily compensated function. Otherwise, such structures as little fingers or toes could be
considered unnecessary, as there are very few functions that cannot be performed equally
well by humans who have lost their little fingers and toes, and yet it seems unreasonable
to claim that they could not have been designed.
Such an argument retains enough appeal that it is still not completely dead. It
surfaces, for example, in: Miller KR. 1999. Finding Darwin’s God. NY: Cliff Street
Books, p 100-101.
9. Standish TG. 2002. Rushing to judgment: functionality in noncoding or “junk” DNA.
Origins 53:7-20.
10. Shapiro R. 1986. Origins: a skeptic’s guide to the creation of life on Earth. NY: Summit
Books.

Creation and the Law


Attempts to implement a two-model approach to the teaching of origins in the public school science curriculum have been blocked by those who have branded the inclusion of creation in the classrooms as an establishment of religion. Struggles over the teaching of creation, especially in connection with the use of a textbook, Biology: A Search for Order in Complexity, prepared by the Creation Research Society (CRS), have taken place with school boards and textbook commissions in the states of Tennessee, California, and Texas.

Last year in Indiana, the textbook battle was taken to the courtroom. Hopes of seeing a favorable decision for the two-model approach died when a Marion County Superior Court judge ruled that the required use of the CRS book violated the constitutional separation of church and state.

As in many other religion-related legal suits, the underlying problem centers on the interpretation of the opening clauses of the First Amendment to the U.S. Constitution which states: "Congress shall make no law respecting an establishment of religion, or prohibiting the free exercise thereof ...."

The ambiguity of the wording that has plagued both plaintiffs and defendants seems to revolve around the definitions of "religion" and "religious." Once, "religion" was confined solely to theistic connotations, implying a definite belief in a deity. But the definition was broadened in 1961, when the U.S. Supreme Court indicated in Torcaso v. Watkins that non-theistic religions are also protected under the First Amendment's provision of "free exercise." This broader interpretation and definition of "religion" includes non-theistic concepts such as "Ethical Culture" and "Secular Humanism."

When in 1963 the Supreme Court ruled state-required prayer and Bible reading in the public schools to be establishments of religion (Abington School District v. Schempp), it seemed that God was banned from the classrooms. In explaining the ruling, Justice Tom C. Clark stated that its intent was merely to correct abuses of coercion and preference by the state. In other words, the state must remain neutral toward, not opposed to, religion.

In an article entitled "Has the Court Really Outlawed Religion in Schools?" (Worldwide Challenge, November 1977, pp. 9-13), John W. Whitehead argues that in actual practice, however, the state has sanctioned the religion of secular humanism over other religions. He proposes that the state has a duty to balance this trend by allowing a place for the teaching of theistic religion objectively.

Carrying this proposal one step further, Wendell R. Bird applies this idea to the teaching of creation in the science classrooms. In "Freedom of Religion and Science Instruction in Public Schools," an article which appeared in the January 1978 issue of the Yale Law Journal (pp. 515-570), Mr. Bird questions the validity of the Indiana textbook ruling. He examines the current practice of teaching only the general theory of evolution (naturalistic evolution from simple organisms to man) and concludes that the state is violating the free exercise of religion by its refusal to present alternative views. He proposes that this abridgement be neutralized by incorporating creation into the teaching of origins, and he maintains that a non-religious approach to creation should be followed. Even though some aspects of creation are related to religious beliefs, the theory as a whole cannot be banned from the classroom solely for religious reasons, for creation can be taught objectively, on the basis of scientific evidence.

It is probably not easy in practice to maintain the distinction between presenting information about religion and indoctrinating students in religious beliefs. But the distinction is allowed by the First Amendment. Justice Clark has stated that religion may be taught in public schools if it is taught objectively. Even if the courts rule creation ideas to be religious, they should still be allowed in the classroom.

K.C.

Cell phones: Precautions recommended

Several government bodies around the world suggest that anyone who uses a cell phone (and these days, who doesn’t?) would be well advised to keep a little distance between that phone and their body. And when people need to make a call, they should minimize radiation exposure by phoning only where reception is really good.

In justifying these and other precautions at a Senate Appropriations subcommittee hearing on Monday, several scientists observed that recent studies have begun linking heavy use of cell phones over a prolonged period with an increased risk of cancer, especially in the head, and especially on the same side where people normally hold their phones.

Of course, if such a link were robust, cell phones would be sold with little warning labels, much as cigarettes are today. That link is not robust. On the other hand, they argued, it’s also not going away. Quite the contrary.

For instance, Olga Naidenko, a senior scientist with the Environmental Working Group, a research/advocacy organization based in Washington, D.C., led a team that just completed a 10-month analysis of 200 peer-reviewed studies on cell-phone safety.

“We found that the studies amassed during the first two decades of cell-phone use produced conflicting results and few definitive conclusions on cell-phone safety,” Naidenko said. “But, the latest research, in which scientists are for the first time able to study people who have used cell phones for many years, suggests the potential for serious safety issues.”

She and others at the hearing argued that in light of the accumulating — though still far from strong — indications of health risks, people would be wise to adopt the precautionary principle. Israeli physician and cell-phone researcher Siegal Sadetzki put it succinctly: “Better safe than sorry.”

People can and should adopt simple practices that reduce their exposure to cell-phone radiation, said this researcher from the Gertner Institute (affiliated with the Sackler School of Medicine at Tel-Aviv University). Nearly all of the researchers and scientists who spoke at the hearing similarly advocated a precautionary approach.

The lone holdout: Linda Erdreich, who spoke at the behest of CTIA-The Wireless Association®. This international group represents, among others, cell-phone makers and wireless-service providers. Erdreich, a consulting epidemiologist, saw no reason to take precautionary measures, she said, because her reading of the scientific literature suggests wireless phones pose no harm.

“The currently available scientific evidence about the effects of radiation emitted by mobile phones is contradictory,” admits Dariusz Leszczynski of Finland’s Radiation and Nuclear Safety Authority, in Helsinki. “There are both studies showing effects and some studies showing no effect.”

Rather than view this uncertainty as reason for complacency, he says, it makes more sense to consider as “premature” any interpretation that cell phones are safe. In fact, Leszczynski contends, “Current [cell phone] safety standards are not supported by science because of the very limited research on human volunteers and on children.” Rather, he says, “This uncertainty calls not only for precautionary measures but also for further research.”

His agency has issued two cell-phone advisories suggesting what such precautions might include: limiting children’s use of cell phones, and texting, when possible, instead of talking on a cell phone (to keep the phone away from direct contact with the body).

Last year, Sadetzki’s team reported finding a 50 to 60 percent increased risk of parotid-gland tumors (mostly benign tumors of the salivary gland) among certain Israeli adults. The affected group: heavy users of cell phones who did not listen to their calls via a hands-free device (such as a wired earphone or wireless ear piece). Writing in the American Journal of Epidemiology, she and her colleagues noted: “A positive dose-response trend was found for these measurements. Based on the largest number of benign PGT patients reported to date, our results suggest an association between cellular phone use and PGTs.”

At the hearing, Sadetzki said her group’s findings were consistent with research by others linking cell phone use for 10 or more years to tumors in the brain and to acoustic neuroma (a benign tumor affecting the nerve that connects the ear to the brain).

In the July International Journal of Oncology, Lennart Hardell and Michael Carlberg of Orebro University Hospital in Sweden reported an update of their cell-phone studies looking at brain-cancer risk. Here, phone habits were analyzed for some 3,600 people, both individuals with cancer and healthy controls. And people who had developed astrocytoma — a type of brain cancer — were five times as likely to be heavy cell-phone users who got their first mobile phone during their teen years, at least a decade earlier. Cordless home phones did not show a similar link.

It may not be surprising that some of these links to cell phone use are only now emerging, Sadetzki said at the hearing, because there can be long latency periods separating exposures to carcinogens and the development of tumors. For instance, she pointed out that the first reports of brain tumors linked to radiation from the atomic-bomb blasts in Hiroshima and Nagasaki didn’t show up until a half-century after the bombings.

“Since widespread cell-phone use began only in the mid-’90s,” she notes, “the follow-up period in most published studies is only about 10 years.”

Cell-phone technology “is here to stay,” she acknowledges, so “the question that needs to be answered is not whether we should use cell phones, but how.”

For instance, she noted that the French health ministry has warned against excessive cell-phone use by children because their bodies may still be undergoing developmental changes that render them especially susceptible to radiofrequency emissions. Moreover, cell-phone radiation penetrates their brains proportionately more deeply than it does in adults. The Israeli health ministry recommends that cell-phone users employ speakers, earphones or hands-free technologies and limit use of these phones where reception is weak.

Naidenko has yet another recommendation: “Buy low-radiation phones.” Identifying which phones emit the lowest radiation can be difficult, she acknowledged. That’s why her group developed a free, interactive online guide that provides manufacturer-stated radiation-emission values for more than 1,200 different phones.