April 19, 2003
A Scholar Follows Her Family's Dusty Footprints
By MARC LACEY
KOOBI FORA, Kenya — Louise N. Leakey grew up with bones. She first came to this remote region in northern Kenya along Lake Turkana in the early 1970's when she was 4 months old. Back then, her father, Richard, and mother, Meave, were chasing down the origins of the human species, and little Louise would be dunked in a basin of water to protect her from the scorching heat.
By that time, Louise's grandfather, Louis, and her grandmother, Mary, had already put East Africa on the map as the cradle of man. Bones were piled up at home and the topic of dinner conversation. They were what Mum and Dad searched for from sunrise till darkness.
So perhaps it was only natural that fossil hunting would be cast in stone as Louise Leakey's future profession, too, the third generation of a family that sometimes seems to have pieced together the early portrait of Homo erectus all on its own.
Though Louise, now 31, has a quick laugh and an easygoing nature, and the vibrant, outdoorsy look that comes from a life spent on Africa's vast plains, it is no small scientific mantle to inherit.
She says she considered a career in marine biology and kept her options open throughout her studies just in case she decided against joining what amounts to the family business. Her younger sister, Samira, did opt for something else: work at the World Bank.
"I had to decide for myself that this is what I wanted to do," Louise said. "I eventually did decide that life's too short for me to work in a cubicle."
Short, indeed. Louise Leakey cannot seem to sit still.
She rises with the sun and immediately sets to work, whether scratching in the soil for the faintest hint of early man or fixing the thatch roof on her family's research camp. She talks quickly and moves quickly and, for a person more likely than almost any to take a long, long view of time, she seems always to be racing against it.
The daily pursuit of uncovering fossils that date back millions of years has made her keenly aware, she says, that 31 years on the earth, plus however many she has left, are a relative blink. We are here for but a brief moment, and leave behind only traces.
Dr. Leakey — she earned a Ph.D. in paleontology from the University of London two years ago — said she decided to pursue her career without pressure from her parents or grandparents, all of whom have made landmark discoveries in how human beings evolved.
She says she is confident she will extend the family legacy, and is off to a good start.
In her late 20's, Dr. Leakey joined her mother in uncovering a 3.5-million-year-old skull that shook up the prevailing view that early man had a single line of descent. The mother-daughter team, both explorers in residence for the National Geographic Society, co-wrote a paper for the journal Nature last year announcing the discovery of what they called Kenyanthropus platyops.
In many ways, her life can be tracked by her family's big fossil finds. When she was an infant, her parents dug up a Homo habilis skull known as 1470 that was far older than any other found to date.
When she was an adolescent, her father led a team that found the first ever Homo erectus skeleton, the so-called Turkana boy. It was a monumental find, and Dr. Leakey said she still remembers the excitement.
Her chores at that time were usually associated with the fossil field, her mother recalled. "Richard generally felt that children should be seen and not heard and that if they were on field expeditions they should work to earn their keep," Meave Leakey said.
For the first-born daughter, that meant early exposure to everything from gluing skulls together to driving a Land Rover. While the rest of the family was excavating the Homo erectus skeleton, 12-year-old Louise was driving back and forth from the camp to fetch drinking water for the team. She was so tiny that she peered through the middle of the steering wheel as she drove.
"I have not seen a child so tired before or since," Meave Leakey said, "but she had really learned how to drive a Land Rover in difficult terrain, and this has proved an essential skill ever since."
The other day, Louise Leakey showed off that skill with a visitor on the bumpy landscape of northern Kenya, splashing muddy water left by a recent rain. She reached one of the fossil beds her family made famous and demonstrated the other skills that can make fossil hunting with a Leakey an intimidating experience.
Hand her a piece of fossilized bone plucked from the soil and she will twist it around in her hands for a moment and then say, "That's a crocodile mandible."
Show her another, suggest it may be a major find, and she'll smile as she says, "It's the skull plate of a catfish."
One last try. Surely this bone is an ancient man. "It's a pygmy hippo," she says with a laugh, throwing back her short blond hair.
"It's all about reading the soil," she explained. "The sediment tells you everything."
Dr. Leakey said she doubted that there were any new species to uncover in the soils of Turkana but believed that fossil evidence could resolve some of the still roiling controversies about how man developed, including some disagreements in her own home.
"I see my mother's arguments and I see my father's," she said. "I stand in the middle. I want to know who's right."
The busy fossil hunting season runs from June to August. During the rest of the year, Dr. Leakey works to ensure that the fossil hunting infrastructure her parents put in place is around for the next generation, whether they are named Leakey or not.
The camp her parents built decades ago here on a strip of land that juts into the eastern side of the lake has decayed. Dr. Leakey has been devoting time to reviving the place, one of the less glamorous parts of her job.
She has supervised the installation of a water desalination plant and replaced candlelight with electrical power. One of her headaches is trying to persuade local pastoralists not to drive their herds through some of the prime fossil grounds. After surviving millions of years, a fossilized bone can be wiped out by the hoof of a goat, she said.
These fossil fields, though rich in a history all their own, are desolate. Still, having grown up here, Louise feels more comfortable on their arid, empty expanses than in the hubbub of Nairobi, where she spends half her time, flying back and forth in a private plane that is a hand-me-down from her father.
Admittedly, she says, social life suffers, though she is currently dating a fellow scientist who works in the remote reaches of eastern Congo.
The environment encourages her quirky sense of fun. In the evenings, when the sky is lit up by stars, she sometimes leaves the camp, a huge torch in hand, to creep up on the crocodiles that rest at the edge of the lake. By blinding a crocodile with light, she explains to a visitor keeping well behind her, she can get so close that she can touch the reptile's tail.
But the scientist within her cannot be contained. Ages ago, she says, there were five different species of crocodile that lived in these waters. All but the particular variety she is sneaking up on have become extinct.
July 15, 2003
Early Voices: The Leap to Language
By NICHOLAS WADE
Bower birds are artists, leaf-cutting ants practice agriculture, crows use tools, chimpanzees form coalitions against rivals. The only major talent unique to humans is language, the ability to transmit encoded thoughts from the mind of one individual to another.
Because of language's central role in human nature and sociality, its evolutionary origins have long been of interest to almost everyone, with the curious exception of linguists.
As far back as 1866, the Linguistic Society of Paris famously declared that it wanted no more speculative articles about the origin of language.
More recently, many linguists have avoided the subject because of the influence of Noam Chomsky, a founder of modern linguistics and still its best-known practitioner, who has been largely silent on the question.
Dr. Chomsky's position has "only served to discourage interest in the topic among theoretical linguists," writes Dr. Frederick J. Newmeyer, last year's president of the Linguistic Society of America, in "Language Evolution," a book of essays being published this month by Oxford University Press in England.
In defense of the linguists' tepid interest, there have until recently been few firm facts to go on. Experts offered conflicting views on whether Neanderthals could speak. Sustained attempts to teach apes language generated more controversy than illumination.
But new research is eroding the idea that the origins of language are hopelessly lost in the mists of time. New clues have started to emerge from archaeology, genetics and human behavioral ecology, and even linguists have grudgingly begun to join in the discussion before other specialists eat their lunch.
"It is important for linguists to participate in the conversation, if only to maintain a position in this intellectual niche that is of such commanding interest to the larger scientific public," writes Dr. Ray Jackendoff, Dr. Newmeyer's successor at the linguistic society, in his book "Foundations of Language."
Geneticists reported in March that the earliest known split between any two human populations occurred between the !Kung of southern Africa and the Hadza of Tanzania. Since both of these very ancient populations speak click languages, clicks may have been used in the language of the ancestral human population. The clicks, made by sucking the tongue down from the roof of the mouth (and denoted by an exclamation point), serve the same role as consonants.
That possible hint of the first human tongue may be echoed in the archaeological record. Humans whose skeletons look just like those of today were widespread in Africa by 100,000 years ago. But they still used the same set of crude stone tools as their forebears and their archaic human contemporaries, the Neanderthals of Europe.
Then, some 50,000 years ago, some profound change took place. Settlements in Africa sprang to life with sophisticated tools made from stone and bone, art objects and signs of long distance trade.
Though some archaeologists dispute the suddenness of the transition, Dr. Richard Klein of Stanford argues that the suite of innovations reflects some specific neural change that occurred around that time and, because of the advantage it conferred, spread rapidly through the population.
That genetic change, he suggests, was of such a magnitude that most likely it had to do with language, and was perhaps the final step in its evolution. If some neural change explains the appearance of fully modern human behavior some 50,000 years ago, "it is surely reasonable to suppose that the change promoted the fully modern capacity for rapidly spoken phonemic speech," Dr. Klein has written.
Listening to Primates
Apes' Signals Fall Short of Language
At first glance, language seems to have appeared from nowhere, since no other species speaks. But other animals do communicate. Vervet monkeys have specific alarm calls for their principal predators, like eagles, leopards, snakes and baboons.
Researchers have played back recordings of these calls when no predators were around and found that the vervets would scan the sky in response to the eagle call, leap into trees at the leopard call and look for snakes in the ground cover at the snake call.
Vervets can't be said to have words for these predators because the calls are used only as alarms; a vervet can't use its baboon call to ask if anyone noticed a baboon around yesterday. Still, their communication system shows that they can both utter and perceive specific sounds.
Dr. Marc Hauser, a psychologist at Harvard who studies animal communication, believes that basic systems for both the perception and generation of sounds are present in other animals. "That suggests those systems were used way before language and therefore did not evolve for language, even though they are used in language," he said.
Language, as linguists see it, is more than input and output, the heard word and the spoken. It's not even dependent on speech, since its output can be entirely in gestures, as in American Sign Language. The essence of language is words and syntax, each generated by a combinatorial system in the brain.
If there were a single sound for each word, vocabulary would be limited to the number of sounds, probably fewer than 1,000, that could be distinguished from one another. But by generating combinations of arbitrary sound units, a copious number of distinguishable sounds becomes available. Even the average high school student has a vocabulary of 60,000 words.
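The arithmetic behind this combinatorial argument can be sketched in a few lines. The figures below (a 40-phoneme inventory, words of up to five phonemes) are illustrative assumptions, not numbers from the article:

```python
# Rough illustration of the combinatorial power of phoneme sequences.
# Assumed figures: 40 distinguishable phonemes (English has roughly 44)
# and words of at most 5 phonemes.
PHONEMES = 40
MAX_WORD_LENGTH = 5

# One fixed sound per word: vocabulary is capped at the phoneme inventory.
single_sound_vocab = PHONEMES

# Combining phonemes into sequences: the space of possible word forms
# grows exponentially with word length.
combinatorial_vocab = sum(PHONEMES ** k for k in range(1, MAX_WORD_LENGTH + 1))

print(single_sound_vocab)   # 40
print(combinatorial_vocab)  # 105025640
```

Even with these modest assumptions, the combinatorial system yields over a hundred million distinguishable forms, comfortably beyond the 60,000-word vocabulary of a high school student.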
The other combinatorial system is syntax, the hierarchical ordering of words in a sentence to govern their meaning.
Chimpanzees do not seem to possess either of these systems. They can learn a certain number of symbols, up to 400 or so, and will string them together, but rarely in a way that suggests any notion of syntax. This is not because of any poverty of thought. Their conceptual world seems to overlap to some extent with that of people: they can recognize other individuals in their community and keep track of who is dominant to whom. But they lack the system for encoding these thoughts in language.
How then did the encoding system evolve in the human descendants of the common ancestor of chimps and people?
Babbling and Pidgins Hint at First Tongue
One of the first linguists to tackle this question was Dr. Derek Bickerton of the University of Hawaii. His specialty is the study of pidgins, which are simple phrase languages made up from scratch by children or adults who have no language in common, and of creoles, the successor languages that acquire inflection and syntax.
Dr. Bickerton developed the idea that a proto-language must have preceded the full-fledged syntax of today's discourse. Echoes of this proto-language can be seen, he argued, in pidgins, in the first words of infants, in the symbols used by trained chimpanzees and in the syntax-free utterances of children who do not learn to speak at the normal age.
In a series of articles, Dr. Bickerton has argued that humans may have been speaking proto-language, essentially the use of words without syntax, as long as two million years ago. Modern language developed more recently, he suggests, perhaps with the appearance of anatomically modern humans some 120,000 years ago.
The impetus for the evolution of language, he believes, occurred when human ancestors left the security of the forest and started foraging on the savanna. "The need to pass on information was the driving force," he said in an interview.
Foragers would have had to report back to others what they had found. Once they had developed symbols that could be used free of context — a general word for elephant, not a vervet-style alarm call of "An elephant is attacking!" — early people would have taken the first step toward proto-language. "Once you got it going, there is no way of stopping it," Dr. Bickerton said.
But was the first communicated symbol a word or a gesture? Though language and speech are sometimes thought of as the same thing, language is a coding system and speech just its main channel.
Dr. Michael Corballis, a psychologist at the University of Auckland in New Zealand, believes the gesture came first, in fact as soon as our ancestors started to walk on two legs and freed the hands for making signs.
Chimpanzees have at least 30 different gestures, mostly used to refer to other individuals.
Hand gestures are still an expressive part of human communication, Dr. Corballis notes, so much so that people even gesticulate while on the telephone.
He believes that spoken words did not predominate over signed ones until the last 100,000 years or so, when a genetic change may have perfected human speech and led to its becoming a separate system, not just a grunted accompaniment for gestures.
Critics of Dr. Corballis's idea say gestures are too limited; they don't work in the dark, for one thing. But many concede the two systems may both have played some role in the emergence of language.
Search for Incentives
As Societies Grew the Glue Was Gossip
Dr. Bickerton's idea that language must have had an evolutionary history prompted other specialists to wonder about the selective pressure, or evolutionary driving force, behind the rapid emergence of language.
In the mere six million years since chimps and humans shared a common ancestor, this highly complex faculty has suddenly emerged in the hominid line alone, along with all the brain circuits necessary to map an extremely rapid stream of sound into meaning, meaning into words and syntax, and intended sentence into expressed utterance.
It is easy to see in a general way that each genetic innovation, whether in understanding or in expressing language, might create such an advantage for its owners as to spread rapidly through a small population.
"No one will take any notice of the guy who says 'Gu-gu-gu'; the one with the quick tongue will get the mates," Dr. Bickerton said. But what initiated this self-sustaining process?
Besides Dr. Bickerton's suggestion of the transition to a foraging lifestyle, another idea is that of social grooming, which has been carefully worked out by Dr. Robin Dunbar, an evolutionary psychologist at the University of Liverpool in England.
Dr. Dunbar notes that social animals like monkeys spend an inordinate amount of time grooming one another. The purpose is not just to remove fleas but also to cement social relationships. But as the size of a group increases, there is not time for an individual to groom everyone.
Language evolved, Dr. Dunbar believes, as a better way of gluing a larger community together.
Some 63 percent of human conversation, according to his measurements, is indeed devoted to matters of social interaction, largely gossip, not to the exchange of technical information, Dr. Bickerton's proposed incentive for language.
Dr. Steven Pinker of the Massachusetts Institute of Technology, one of the first linguists to acknowledge that language may be subject to natural selection, disputes Dr. Dunbar's emphasis on social bonding; a fixed set of greetings would suffice, in his view.
Dr. Pinker said it was just as likely that language drove sociality: it was because people could exchange information that it became more worthwhile to hang out together.
"Three key features of the distinctively human lifestyle — know-how, sociality and language — co-evolved, each constituting a selection pressure for the others," Dr. Pinker writes in "Language Evolution," the new book of essays.
But sociality, from Dr. Dunbar's perspective, helps explain another feature of language: its extreme corruptibility. To convey information, a stable system might seem most efficient, and surely not beyond nature's ability to devise. But dialects change from one village to another, and languages shift each generation.
The reason, Dr. Dunbar suggests, is that language also operates as a badge to differentiate the in group from outsiders; thus the Gileadites could pick out and slaughter any Ephraimite asked to say "shibboleth" because, so the writer of Judges reports, "He said sibboleth: for he could not frame to pronounce it right."
Language in the Genome
From Family Failing First Gene Emerges
A new approach to the evolution of language seems to have been opened with studies of a three-generation London family known as KE. Of its 29 members old enough to be tested, 14 have a distinctive difficulty with communication. They have trouble pronouncing words properly, speaking grammatically and making certain fine movements of the lips and tongue.
Asked to repeat a nonsense phrase like "pataca pataca pataca," they trip over each component as if there were three different words.
Some linguists have argued that the KE family's disorder has nothing specific to do with language and is some problem that affects the whole brain. But the I.Q. scores of affected and unaffected members overlap, suggesting the language systems are specifically at fault. Other linguists have said the problem is just to do with control of speech. But affected members have problems writing as well as speaking.
The pattern of inheritance suggested that a single defective gene was at work, even though it seemed strange that a single gene could have such a broad effect. Two years ago, Dr. Simon Fisher and Prof. Tony Monaco, geneticists at the University of Oxford in England, discovered the specific gene that is changed in the KE family. Called FOXP2, its role is to switch on other genes, explaining at once how it may have a range of effects. FOXP2 is active in specific regions of the brain during fetal development.
The gene's importance in human evolution was underlined by Dr. Svante Paabo and colleagues at the Max Planck Institute for Evolutionary Anthropology in Leipzig, Germany. In a study last year they reported that FOXP2 is highly conserved in evolution — in other words, that the precise sequence of units in FOXP2's protein product is so important that any change is likely to lead to its owner's death.
In the 70 million years since people and mice shared a common ancestor, there have been just three changes in the FOXP2 protein's 715 units, Dr. Paabo reported. But two of those changes occurred in the last six million years, the time since humans and chimps parted company, suggesting that changes in FOXP2 have played some important role in human evolution.
Sampling the DNA of people around the world, Dr. Paabo found signs of what geneticists call a selective sweep, meaning that the changed version of FOXP2 had spread through the human population, presumably because of some enormous advantage it conferred.
That advantage may have been the perfection of speech and language, from a barely comprehensible form like that spoken by the affected KE family members to the rapid articulation of ordinary discourse. It seems to have taken place about 100,000 years ago, Dr. Paabo wrote, before modern humans spread out of Africa, and is "compatible with a model in which the expansion of modern humans was driven by the appearance of a more proficient spoken language."
FOXP2 gives geneticists what seems to be a powerful entry point into the genetic and neural basis for language. By working out what other genes it interacts with, and the neural systems that these genes control, researchers hope to map much of the circuitry involved in language systems.
Ending the Silence
Linguists Return to Ideas of Origins
The crescendo of work by other specialists on language evolution has at last provoked linguists' attention, including that of Dr. Chomsky. Having posited in the early 1970's that the ability to learn the rules of grammar is innate, a proposition fiercely contested by other linguists, Dr. Chomsky might be expected to have shown keen interest in how that innateness evolved. But he has said very little on the subject, a silence that others have interpreted as disdain.
As Dr. Jackendoff, the president of the Linguistic Society of America, writes: "Opponents of Universal Grammar argue that there couldn't be such a thing as Universal Grammar because there is no evolutionary route to arrive at it. Chomsky, in reply, has tended to deny the value of evolutionary argumentation."
But Dr. Chomsky has recently taken a keen interest in the work by Dr. Hauser and his colleague Dr. W. Tecumseh Fitch on communication in animals. Last year the three wrote an article in Science putting forward a set of propositions about the way that language evolved. Based on experimental work by Dr. Hauser and Dr. Fitch, they argue that sound perception and production can be seen in other animals, though they may have been tweaked a little in hominids.
A central element in language is what linguists call recursion, the mind's ability to bud one phrase off another into the syntax of an elaborate sentence. Though recursion is not seen in animals, it could have developed, the authors say, from some other brain system, like the one animals use for navigation.
Constructing a sentence, and going from A to Z through a series of landmarks, could involve a similar series of neural computations. If by some mutation a spare navigation module developed in the brain, it would have been free to take on other functions, like the generation of syntax. "If that piece got integrated with the rest of the cognitive machinery, you are done, you get music, morality, language," Dr. Hauser said.
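The recursive budding of phrases that linguists describe can be made concrete with a toy phrase-structure grammar, in which a noun phrase may embed a prepositional phrase that itself contains another noun phrase. The grammar and vocabulary here are invented for illustration and are not drawn from the article:

```python
import random

# A toy recursive grammar: NP -> Det N (PP), PP -> P NP.
# The PP rule re-introduces NP, which is the recursive step that
# lets phrases nest inside phrases of the same kind.
GRAMMAR = {
    "NP": [["Det", "N"], ["Det", "N", "PP"]],
    "PP": [["P", "NP"]],          # the self-embedding rule
    "Det": [["the"]],
    "N": [["hunter"], ["elephant"], ["river"]],
    "P": [["near"], ["beyond"]],
}

def expand(symbol, depth=0, max_depth=3):
    """Recursively expand a symbol, bounding depth so generation halts."""
    rules = GRAMMAR[symbol]
    # Past the depth limit, drop the self-embedding option for NP.
    if depth >= max_depth and symbol == "NP":
        rules = [rules[0]]
    choice = random.choice(rules)
    words = []
    for part in choice:
        if part in GRAMMAR:
            words.extend(expand(part, depth + 1, max_depth))
        else:
            words.append(part)
    return words

print(" ".join(expand("NP")))  # e.g. "the hunter near the river"
```

A handful of rules, applied recursively, generates an unbounded set of sentences such as "the hunter near the elephant beyond the river," which is the property the authors argue sets human syntax apart.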
The researchers contend that many components of the language faculty exist in other animals and evolved for other reasons, and that it was only in humans that they all were linked. This idea suggests that animals may have more to teach about language than many researchers believe, but it also sounds like a criticism of evolutionary psychologists like Dr. Pinker and Dr. Dunbar, who seek to explain language as a faculty forced into being by specifics of the human lifestyle.
Dr. Chomsky rejects the notion that he has discouraged study of the evolution of language, saying his views on the subject have been widely misinterpreted.
"I have never expressed the slightest objection to work on the evolution of language," he said in an e-mail message. He outlined his views briefly in lectures 25 years ago but left the subject hanging, he said, because not enough was understood. He still believes that it is easy to make up all sorts of situations to explain the evolution of language but hard to determine which ones, if any, make sense.
But because of the importance he attaches to the subject, he returned to it recently in the article with Dr. Hauser and Dr. Fitch. By combining work on speech perception and speech production with a study of the recursive procedure that links them, "the speculations can be turned into a substantive research program," Dr. Chomsky said.
Others see Dr. Chomsky's long silence on evolution as more consequential than he does. "The fact is that Chomsky has had, and continues to have, an outsize influence in linguistics," Dr. Pinker said in an e-mail message. Calling Dr. Chomsky both "undeniably, a brilliant thinker" and "a brilliant debating tactician, who can twist anything to his advantage," Dr. Pinker noted that Dr. Chomsky "has rabid devotees, who hang on his every footnote, and sworn enemies, who say black whenever he says white."
"That doesn't leave much space," Dr. Pinker went on, "for linguists who accept some of his ideas (language as a mental, combinatorial, complex, partly innate system) but not others, like his hostility to evolution or any other explanation of language in terms of its function."
Biologists and linguists have long inhabited different worlds, with linguists taking little interest in evolution, the guiding theory of all biology. But the faculty for language, along with the evidence of how it evolved, is written somewhere in the now decoded human genome, waiting for biologists and linguists to identify the genetic program that generates words and syntax.
July 25, 2003
New World Ancestors Lose 12,000 Years
By NICHOLAS WADE and JOHN NOBLE WILFORD
Scientists studying the genetic signatures of Siberians and American Indians have found evidence that the first human migrations to the New World from Siberia probably occurred no earlier than 18,000 years ago.
The new estimate undermines arguments for colonization as far back as 30,000 years ago, but reinforces archaeological findings and a linguistic theory that most American languages belong to a single family called Amerind.
The genetic evidence fits neatly, for example, with the discovery of a human campsite in Chile, which is apparently 15,000 years old, and with the well-established presence of big-game hunters in North America, starting 13,600 years ago. The few sites with possibly older human traces have yet to gain wide acceptance among scientists.
By studying the DNA of living Siberian and American Indian populations, geneticists had previously been able to see traces of at least two early migrations from Siberia. But it has been hard to put a date on when the first people set foot in the Americas, for lack of a suitable marker in the Y chromosome.
After much search, a team of geneticists has now detected a change in the DNA sequence of Siberian men's Y chromosomes that took place just before the first of the two migrations into the Americas. They estimate that the DNA change, called M242, occurred 15,000 to 18,000 years ago, meaning the Americas must first have been occupied after that date. The DNA change is not in a gene and makes no known difference to the men who carry it.
The new result, to be published in the American Journal of Human Genetics, is by Dr. Mark Seielstad of the Harvard School of Public Health, Dr. R. Spencer Wells of the University of Oxford and other colleagues.
The migration was probably by land because at that time the world's sea level was much lower and a land bridge, known as Beringia, stretched across what is now the Bering Strait between Siberia and Alaska. Also, people bearing the same genetic marker, called M3, live on either side of the former bridge, suggesting it was the means of passage.
Beringia sank beneath the waves some 11,000 years ago as the glaciers of the last ice age melted. The second migration seen by the geneticists seems to have occurred some 8,000 years ago and was presumably by boat, as the land bridge had long since vanished.
The date based on the new marker is important because it sets an earliest limit on the colonization of America, something that archaeologists find hard to do because they cannot be sure there are not sites they may have missed.
Hitherto some archaeologists have argued that people reached the Americas as long as 30,000 years ago. This date received some genetic support last year in a study by Dr. Douglas Wallace, now of the University of California at Irvine, who matched up male migrations from Siberia with the female migrations that he and colleagues had worked out earlier. The female migrations are traced by analyzing a genetic element in every cell called mitochondrial DNA.
Based on the mitochondrial DNA of the women descended from those in the first migration, Dr. Wallace estimated it occurred 20,000 to 30,000 years ago. Dr. Wells said in an e-mail message that mitochondrial DNA was hard to date accurately and often gave dates that were too old. The Y chromosome is a better genetic clock, if a suitable marker can be found, he said.
Dr. Wallace did not respond to e-mail requests for comments.
The new date derived by Dr. Seielstad and Dr. Wells may strengthen the hand of linguists who argue that all American languages fall into three groups, known as Amerind, Na-Dene and Eskimo-Aleut, with Amerind being by far the largest. Most linguists dispute that classification, saying languages change too fast to allow any very ancient relationships to be discerned. But if the first humans arrived in the Americas only 18,000 years ago, efforts to find links between present languages may seem more plausible.
"If they entered more recently, it is not such a stretch to say you can see a linguistic relationship," Dr. Wells said.
The new archaeological results seem compatible with the younger date adduced by the geneticists. Radiocarbon dating revealed that an occupation site in Siberia was only 13,000 years old and thus too recent to be a critical link in the first migrations, as had been supposed.
The site on the Kamchatka Peninsula of Russia, previously dated at 16,800 years old, was thought to be a way station at the western edge of Beringia, a point of departure for migrants either across the frozen land or by sea along the coasts. The new research challenges the conventional idea that this was the specific site from which people crossed into America, but does not exclude the possibility that they did so from other sites.
Researchers, led by Dr. Ted Goebel of the University of Nevada at Reno, reported the redating of the Siberian site at Ushki Lake on the Kamchatka Peninsula in today's issue of the journal Science. The other authors were Dr. Michael R. Waters of Texas A&M University and Dr. Margarita Dikova, an archaeologist and widow of Dr. Nikolai Dikov, who discovered the site in 1964.
The initial radiocarbon analysis was apparently based on contaminated samples, the researchers said. The 13,000-year-old date, nearly 4,000 years younger than previously thought, effectively removed the site as a way station for the first migrants to America, they concluded.
For most of the last century, the peopling of America was a story of big-game hunters trekking across the Bering land bridge in the last ice age, spreading across North America and within 1,000 years or so reaching the tip of South America. Those who left the most durable traces, fluted projectile points, were the Clovis people, named for the town in New Mexico where their artifacts were first uncovered.
The journal quoted Dr. David J. Meltzer, an archaeologist at Southern Methodist University, as saying the new finding "removes what was, until now, the critical link in the chain connecting Clovis to Siberia."
When people first occupied the Ushki Lake site, Clovis hunters had already been killing mammoths in North America for some 600 years and groups of hunters had left their mark at Monte Verde, Chile, 3,000 years earlier. Radiocarbon dates are lower than calendar dates and they become increasingly so the farther back one goes in time.
If the Ushki site is only 13,000 years old, Dr. Goebel said, the oldest place in the Bering region with human traces now is Broken Mammoth, a 14,000-year-old site in central Alaska.
"It means we have even less evidence than we had before," Dr. Goebel said.
July 26, 2003
Amazon Indians Honor an Intrepid Spirit
By LARRY ROHTER
YAWALAPITI, Brazil, July 20 — Traveling for hours by boat and on foot, the chiefs, shamans and warriors arrived from all over the southeastern Amazon. For two days they danced, sang, chanted and reminisced around a painted tree trunk, decorated with a feathered headdress, that represented the soul of a recently departed friend.
The trunk was placed in the large open space that is the focus of community life here and implanted almost directly above the burial site of a former tribal leader, so he could help guide the spirit to the "village in the stars."
After daubing their bodies and hair with designs in black and red dye, scores of nearly naked men and boys paraded past the totem, whooping and stamping their feet in unison as they moved back and forth.
This elaborate quarup, or traditional Indian ceremony of lamentation for those of noble lineage, has been performed in the splendid isolation of the jungle for time uncounted. The farewell ceremony is normally an insular event performed before members of the community in honor of one of their own.
But the departed friend this time was Orlando Villas Bôas, who died eight months ago at the age of 88 and was buried in São Paulo.
As the eldest survivor of four brothers who devoted their lives to contacting and documenting the native peoples of the Brazilian Amazon and protecting them from the onslaught of modern civilization, he became an especially revered figure here.
"This is the biggest quarup we have ever had, and maybe our last one ever for a white man," said Aritana, 54, the village chief. "It is hard to imagine that any other white man in the future could be a friend of ours as wise and courageous and dedicated as Orlando was."
Mr. Villas Bôas first ventured into the Amazon as part of an official government expedition 60 years ago this month. Together with his younger brothers, Álvaro, Cláudio and Leonardo, he explored thousands of square miles of unmapped jungle, set up jungle outposts that today are cities or towns, wrote 14 books and helped found and administer the government's National Indian Foundation. Disciples of Marshal Cândido Rondon, modern Brazil's first great Indian expert, the Villas Bôas brothers took as their philosophy "Die if necessary, but kill never." They endured countless bouts of malaria, numerous attacks with arrows or spears, and confrontations nearly as fierce with businessmen and politicians eager to open up the Amazon.
But the brothers never wavered, and on this day Mr. Villas Bôas's grateful charges from nearly a dozen tribes here in the Upper Xingu River basin thanked him by consigning his spirit to the heavens.
The brothers' most enduring achievement is probably the creation of the Xingu Indigenous Park. At nearly 11,000 square miles, it is a remote area of jungle and rivers larger than Maryland, home to about 4,400 Indians and closed to outsiders except by the invitation of tribal leaders.
For more than a decade, the brothers fought to obtain the decree setting up the reserve, the first in Latin America. Then they battled 20 more years to have its boundaries legally fixed to protect it from the ranchers, loggers and miners who wanted to see it disappear.
"This park is Orlando's legacy to us, and this quarup is our way of paying him back for the gift he made," said Itiamã, a tribal shaman who believes his age to be 56. "I loved Orlando. After my father died he became like a father to me and always used to tell me that he wanted us to have our own land and to be able to eat what we wanted."
Some anthropologists have criticized the brothers' approach as paternalistic, and the brothers themselves often had mixed feelings about their work. "Each time we contact a tribe we contribute to the destruction of what is most pure in it," Orlando Villas Bôas often said.
Western encroachments are indeed visible here. A painting of Spider-Man adorns the entrance to one large thatched-roof lodge, where a gas stove is also in use; a photograph of the Eiffel Tower hangs in another; and some Indians now travel from village to village on bicycles, and even motorcycles.
But the Indians themselves argue that their situation today would have been much worse had the Villas Bôas brothers not intervened on their behalf. They have been able to retain their original language and religion, and smoked fish and manioc continue to dominate their diet.
"Orlando used to warn us about his concerns for the future, and everything he predicted has come to pass," said Paié, a member of the Kayabi tribe who is the director of the park and came nearly 800 miles from Brasília to attend the ceremony. "Fortunately he trained us and prepared us to deal with the white man and his world."
Many of the 50 or so villages in the reserve would have liked the honor of conducting the ceremony. But the Yawalapiti feel an especially strong bond with the Villas Bôas family, which they credit with their very survival.
When Orlando Villas Bôas first arrived here in the late 1940's, the Yawalapiti had been reduced to fewer than a dozen individuals scattered among several other tribes.
Mr. Villas Bôas brought them together again and encouraged them to marry members of linguistically similar groups, and today 220 people live in the 14 communal lodges that make up this village, with 60 more Yawalapiti at a settlement nearby.
"We owe not just the preservation of our language and our culture to Orlando, but also our very existence today as a people," Aritana, the village chief, said. "He arranged the marriage of my father and my mother, and he saw me born, so he was always a part of the life of the Yawalapiti and my life."
Once the formal ceremony of lamentation concluded, the festivities began. Dozens of men and boys faced off in a huka-huka, or ritualized wrestling tournament, as their mothers, wives and daughters watched and called out encouragement.
The huka-huka has existed for centuries within individual tribes. But the Villas Bôas brothers persuaded the tribes to make the competition regional and transformed it into a substitute for the unceasing wars that had sapped their strength and unity.
"By convincing the tribes to stop their internecine fighting, the Villas Bôas brothers were able to get them to concentrate on the bigger enemy," said John Hemming, the author of "Red Gold," a history of the conquest of the Brazilian Indians, who came here from England for the ceremony at the invitation of Mr. Villas Bôas's family.
Also attending was Orlando's widow, Marina Villas Bôas, who was a 25-year-old nurse when she first arrived here in 1963. She said that from "the very first day," she was swept away by the jungle setting and the man, then nearly twice her age, who dominated it.
"This was where we fell in love and where our sons were conceived and spent the first years of their lives," she said. "So I am of course overcome with emotion to be here again in these circumstances and to see this outpouring of affection and regard from Orlando's friends."
But the region, isolated as it is — a journey of nearly two days by bus, four-wheel drive vehicle and boat from Brasília — is changed from what it was only a generation ago.
Today most of the area around the Xingu park has been deforested. Tribal leaders complain that they now see runoff from pesticides and fertilizer in the headwaters of the Xingu, which lie outside the reserve.
During the quarup, the chiefs and shamans called on Orlando Villas Bôas's two sons, Orlando Jr. and Noel, to continue their father's mission, a challenge the two men said they would accept.
"The destiny of the peoples of the Xingu is still uncertain, because of what is happening around them," said Orlando Villas Bôas Jr., who spent the first four and a half years of his life here. "Brazil may have changed, and the times, too, but in my father's absence someone still needs to work to guarantee that 60 years of effort are not lost."
When Mr. Villas Bôas died last December in the state of São Paulo, where he was born, he was buried with a flood of tributes and a funeral attended by thousands. But Mrs. Villas Bôas said his family regarded the religious ceremony here as being of even greater importance.
"If it had been up to Orlando, this is the place where he would have spent his last day on earth," she said as a pair of shamans sang to his spirit a few feet away. "His work and his memory, his entire life, were here, and we believe, as the Indians do, that once this quarup is over, we will have no more motive to be sad."
August 22, 2003
Where Wombats Roved, and Aborigines Sketched
By JANE PERLEZ
SYDNEY, Australia — In a cave in rugged wilderness not far from the luxurious country resorts of this city's well-to-do, a leading anthropologist has found an unusually rare and pristine cache of ancient Aboriginal rock art.
In all, 11 layers of images of Australian animals — kangaroos, wombats and monitor lizards, which Australians call goannas — as well as drawings of boomerangs and half-human, half-animal creatures are scattered across the back wall of the cave in a giant mural.
The more than 200 images — in faint reds and yellows, stark white and black — stretch from 4,000 years ago to the late 18th century when white settlers first ventured onto Australian soil, said Paul S. C. Tacon, the chief research scientist in anthropology at the Australian Museum, who visited the site with Aboriginal consultants in May.
"I have been to thousands of places with rock art and only a few have affected me in this way," Mr. Tacon said of the cave. "Obviously this was a special place that people made special trips to, either for ceremonies or to stop at on their travels. It shows there was a rich artistic tradition, ranging from naturalistic depictions to stylized forms of expression relating to spiritual beliefs."
The discovery is not the oldest Aboriginal art known in Australia — some drawings in the hard sandstone of the northern desert country are older, Mr. Tacon said.
But the proximity of these works to the country's largest city — just 60 miles west of Sydney and a few miles from resorts in the Blue Mountains — and their impeccable condition and number make the find one of the most important, he said.
The pigments on the drawings have been unusually well preserved because the cave opening faces north and receives little direct sunlight. The only disturbance Mr. Tacon could detect in the cave was some dust kicked up by wombats, the Australian marsupials that frequent the area and are among the creatures depicted on the wall.
Because of the historic value of the art, the announcement of the discovery was made in Parliament last month by Bob Carr, the premier of New South Wales, the state where the cave lies.
A group of hikers stumbled on the rock art eight years ago, but Mr. Tacon is the only expert to have seen it so far. It took a while for Mr. Tacon to get there, he said, because he wanted to consult first with Aborigines and include them in the process, and because drought and bush fires impeded access.
The cave's exact location has been kept secret. Mr. Tacon will say only that it is in Wollemi National Park, an extraordinarily tough walk from a drop-off point at a place called Colo Heights in the Blue Mountains. He said that his stay at the cave, with five Aboriginal colleagues, was limited to two days because of a shortage of fresh water.
Once he came back from the site, Mr. Tacon successfully urged the premier to restrict public access to the cave.
Mr. Carr, a keen hiker and an outspoken advocate for Aboriginal rights, had wanted to go to the cave to see for himself. But when he learned that he would first have to take a two-day course to learn how to be winched down from a helicopter, he decided against a personal visit.
Of the array of drawings in the cave, Mr. Tacon said he was particularly impressed with the charcoals of local animals. The depiction of a swamp wallaby — a smaller version of a kangaroo — was the "spitting image" of the real thing, he said. A drawing of a rock wallaby was particularly strong because over time the charcoal had bonded with the rock. He also liked a goanna drawn in charcoal and outlined with white ocher.
Some of the images, particularly the stenciling of hands and boomerangs, appear to have been done as a way of expressing, "I was here, this is my country," Mr. Tacon said. The primitive artists fashioned the stencils by taking pigment — often red ocher or white pipe clay — putting it in their mouths and blowing it out over a hand or boomerang to leave the shape on the wall. "It was a first form of spray painting," he said.
Of particular interest was a drawing of what appears to be a two-headed figure, or two figures standing one behind the other. They appear to be holding something resembling barbed wire. Other images show human bodies with animal heads — either kangaroos or birds. These were considered ancestral beings of Aborigines, and are referred to in Aboriginal accounts of creation, Mr. Tacon said.
The rock art discovery comes as Australian art collectors have developed a sudden vogue for modern Aboriginal paintings on canvas, paying high prices at auction for relatively new works only 30 to 40 years old.
In Mr. Tacon's mind there is no comparison between the old and the new.
The intensity of the imagery in the cave made one of the Aborigines who visited the site with him fall into unusual dreams the night they camped there, he said. "He dreamed that his ancestors tried to visit him there. He wrote a poem and wanted an hour to sit by himself," before leaving the rock art behind and walking out of the forest.
June 29, 2004
The Oldest Americans May Prove Even Older
By JOHN NOBLE WILFORD
BARNWELL, S.C., June 24 - On a hillside by the Savannah River, under tall oaks bearded with Spanish moss, an archaeologist and a graduate student crouched in the humid depths of a trench. They had reason to think they were in the presence of a breathtaking discovery.
Or at the least, they were on to something more than 20,000 years old that would throw American archaeology into further turmoil over its most contentious issue: when did people first reach America, and who were they?
The sandy soil of the trench walls was flecked with pieces of chert, the source of flint coveted by ancient toolmakers. Some of the stone flakes appeared to be unfinished discards. Others had the sharp-edged look of more fully realized blades, chisels and scrapers. Long ago, it seemed, Stone Age hunter-gatherers had frequently stopped here and, perhaps, these toolmakers were among the first Americans.
With deft strokes of his trowel, the archaeologist, Dr. Albert C. Goodyear of the University of South Carolina, excised a chunk of chert about the size of a cantaloupe. Its sides, he said, had all the marks of flintknappers' work. They had presumably smashed one cobble against another, leaving fracture lines through the rock, and then recovered thin slices for making sharp tools.
"This is not a natural occurrence," Dr. Goodyear said, showing the beaten-about chert cobble afterward. "No river, fire or animals could do this. Too many blows have been struck."
If he is right, American prehistory is being extended deeper in time at this remote dig site near Barnwell. Dr. Robson Bonnichsen, an expert on early Americans who is not directly involved in the excavation, said it could even be "the single most significant Ice Age site in North America" as a place bearing tantalizing evidence for "understanding the earliest prehistory of the Americas."
The land is owned by the Clariant Corporation, the big Swiss chemical company, which allows archaeologists to dig to their hearts' content in the forest at the Topper site, named for the person who brought it to their attention more than 20 years ago.
Judging by the depth of sediments, the site may have been a toolmaking center at least 7,000 years earlier than the arrival of big-game hunters known as the Clovis people. Once thought to be the earliest Americans, Clovis hunters, named for the town in New Mexico where their traces were uncovered 70 years ago, left their finely worked fluted projectile points across the United States over five centuries, beginning 13,000 years ago. All the dates here are based on radiocarbon calculations adjusted to calendar years.
The two men in the trench, their shirts now soaked in sweat, were eager to find evidence that would yield more precise dates for the finds. They leaned into a seam of darker soil interspersed with black grains that the graduate student, Tony Pickering, had found three weeks before. It just might be the remains of a fireplace. If so, any residue of charcoal should give a reliable date through radiocarbon analysis.
Dr. Goodyear emerged from the trench clutching four small plastic zip-lock bags. "I don't know how we ever did archaeology before zip-lock bags," he remarked as he held them up for examination. Each bag contained soil and several pea-size black fragments that he hoped represented the residue of charcoal from a hearth.
"I hope the laboratory gets three dates out of this," he said. "And I hope they're all similar dates."
In his more exuberant moments, Dr. Goodyear ventured that the dates could be as old as 25,000, even 30,000, years ago. He has already found elsewhere on the site what appear to be 16,000-year-old artifacts, evidence for a pre-Clovis peopling of America similar to findings in Virginia and Pennsylvania. None of those discoveries has convinced skeptics.
A few conservative holdouts still question the one widely accepted pre-Clovis claim: that earlier people were living in Chile, at a site known as Monte Verde excavated by Dr. Tom D. Dillehay of the University of Kentucky. A strong endorsement of Monte Verde by prominent archaeologists, published in 1998, encouraged others, including Dr. Goodyear, to dig deeper.
Dr. Bonnichsen, who is director of the Center for the Study of the First Americans at Texas A&M University in College Station and has visited the Topper site and examined some of the possible artifacts, said, "If the preliminary findings hold, this is a tremendous discovery." But he cautioned that "a lot of hard research needs to be done to really test this thing thoroughly."
Dating the putative fireplace will be an important next step. As soon as that is done, Dr. Goodyear said, he and other scientists from several universities expect to announce the age and describe the excavated materials in a journal article, perhaps by the end of the year. Even if the charcoal is from a natural fire, not a human campfire, he said, the analysis should establish the age of any artifacts from the same sediment layer.
A bigger hurdle, scientists said, may be to establish that the stone pieces are indeed human-made tools. Many a presumed pre-Clovis site has failed to gain scholarly acceptance over the question of whether stone pieces that look like tools were the work of early humans or of nature.
Dr. Bonnichsen said much of the 16,000-year-old chert material previously excavated by Dr. Goodyear "looks really good" and might well be tools. At the laboratory at Texas A&M, microscopic examination of the supposed cutting edges showed gouges and scratches that appeared to be wear marks from scraping hides, butchering and cutting wood. They look, he said, "as if they are going to qualify as artifacts."
But it is too soon, he added, to render a nature-versus-culture verdict on the stone pieces from the greater depths and earlier ages at Topper. More experimental work is required to understand how the chert could have been modified into tools.
Dr. Goodyear, whose specialty is the study of stone tools, agreed, though he insisted that "so far we have found no plausible way nature could have made these tools, but we have shown how humans could have made them." The sample collected so far, Dr. Bonnichsen and others said, is too small to be definitive.
Dr. Goodyear said he planned a wider and more intensive search next year. Dr. Sarah C. Sherwood, an anthropologist at the University of Tennessee, is to visit the site next month to investigate the hearthlike material for signs of bone and plant remains, possible evidence for cooking fires, and to determine whether the remains are indeed from a fireplace and are not an accumulation of ash deposited by river floods. Other scientists from Tennessee, Texas A&M, the University of Illinois at Chicago and the Smithsonian Institution have inspected the digs, some of them conducting their own tests.
At the end of dig season this year, Dr. Goodyear seemed reconciled to the prospect of hard years of excavation, research and argument ahead.
"If this is 25,000 years old, and I think it is, then scientists will come here from all over the world to see for themselves," he said, while driving back to Barnwell after a day in the field. "And they will argue about it for another 10 years."
The challenge for the Topper archaeologists, as for others making pre-Clovis discoveries, is not only the ambiguity of the evidence, but also its unfamiliarity. Clovis workmanship was painstaking and distinctive. Nearly all the spear points were several inches long and sharpened on both sides. Many of them were found among the bones of the mammoths they were used to kill, accounting for the long-held reputation of the Clovis people as primarily big-game hunters. That also agrees with the theory that the first Americans crossed from Siberia to Alaska in pursuit of mammoth and mastodon at the end of the last Ice Age.
Yet all claims for pre-Clovis cultures rest largely on finds of a much more primitive technology. If these are tools, they are simpler and the weapon points are not bifacial; they are finished on only one side. For these and other reasons, archaeologists who made their careers on the Clovis culture usually react to possible evidence of predecessors with stiff skepticism.
Calling this the "Clovis bias," Dr. Goodyear said, "You look for something with one idea in mind, and you don't see it, then people become uncomfortable and confused, and they often reject it."
That is changing, though. Three other likely pre-Clovis sites have been found in the eastern United States: at Meadowcroft, Pa., near Pittsburgh, and at Cactus Hill and Saltville in Virginia. Other sites in South America, besides Monte Verde, may precede the Clovis period.
Bluefish Caves, in the Yukon, is still disputed as a possible pre-Clovis site.
Signs of pre-Clovis people are sparse because these mobile bands were few in number and trod lightly on the land, and also because archaeologists had until recently not been looking deeply enough.
"For generations, we assumed that Clovis was the primordial human culture south of the ice sheets, but that model has long been discredited," Dr. Brian M. Fagan, an archaeologist at the University of California, Santa Barbara, wrote in an updated edition of "The Long Journey: The Peopling of America," published this year by the University Press of Florida.
"We simply do not know when the first human settlers moved south of the ice sheets," Dr. Fagan concluded, noting that the archaeological record now showed the migration to be "an untidy process of rapid colonization, by people acquiring foods in many ways, who used a broad range of stone and wooden artifacts and, also occasionally, bone tools to survive."
It makes sense to Dr. Goodyear and his associates at the South Carolina Institute of Archaeology and Anthropology in Columbia that long before Clovis, bands of people moving up the Savannah River from the coast spotted chert washing out of the hillside. It still does. The dirt road at the Topper site is sprinkled with the rock. The hunter-gatherers quarried the chert, made their tools as best they could and then went on their way, to return again and again.
And so will Dr. Goodyear and probably many more archaeologists in search of the earliest people to live in the Americas.
January 3, 2006
Scientist at Work | Shannon Lee Dawdy
Archaeologist in New Orleans Finds a Way to Help the Living
By JOHN SCHWARTZ
NEW ORLEANS - "That's a finger bone."
Shannon Lee Dawdy kneeled in the forlorn Holt graveyard to touch a thimble-size bone poking up out of the cracked dirt. She examined it without revulsion, with the fascination of a scientist and with the sadness of someone who loves New Orleans.
Dr. Dawdy, a 38-year-old assistant professor of anthropology at the University of Chicago, is one of the more unusual relief workers among the thousands who have come to the devastated expanses of Louisiana, Mississippi and Texas in the aftermath of Hurricanes Katrina and Rita. She is officially embedded with the Federal Emergency Management Agency as a liaison to the state's historic preservation office.
Her mission is to try to keep the rebuilding of New Orleans from destroying what is left of its past treasures and current culture.
While much of the restoration of the battered Gulf Coast is the effort of engineers and machines, the work of Dr. Dawdy, trained as an archaeologist, an anthropologist and a historian, shows that the social sciences have a role to play as well. "It's a way that archaeology can contribute back to the living," she said, "which it doesn't often get to do."
Holt Cemetery, a final resting place for the city's poor, is just one example of what she wants to preserve and protect.
Other New Orleans graveyards have gleaming mausoleums that keep the coffins above the marshy soil. But the coffins of Holt are buried, and the ground covering many of them is bordered with wooden frames marked with makeshift headstones.
Mourners decorate the graves with votive objects: teddy bears for children and an agglomeration of objects, including ice chests, plastic jack-o'-lanterns and chairs, on the graves of adults. There is the occasional liquor bottle.
It is part of the soul of New Orleans, a city that through history has had strong ties to its dead. Dr. Dawdy calls it a prime example of "the amazing improvisational impulse of New Orleans," which creates beautiful things and powerful feelings from the everyday.
Dr. Dawdy looked across the ruined graveyard. Holt was a place she knew well and loved, she said, and when she first saw it after the storm, she broke down and cried.
"It made me realize that it's that ephemeral folk expression in New Orleans that is gone," she said, "and that probably, rebuilding efforts ..."
Many of the objects on the graves were washed away by the storm, or shifted from one part of the graveyard to another. Dr. Dawdy has proposed treating the site as archaeologists would an ancient site in which objects have been exposed on the surface by erosion.
Before the hurricanes, the cemetery was often busy, a hub of activity on All Souls' Day, when people came to freshen the grave decorations.
"The saddest thing to me now was how few people we see," she said, looking at the empty expanse and the scarred live oaks. "I realize we're having enough trouble taking care of the living," she added, but the lack of activity in a city normally so close to the spirits of the past "drove home how far out of whack things are."
There is evidence of recent visits: blindingly white gravel sits atop some graves, and a fresh bouquet sits on the grave of Andrew P. Sherman, who was born in September 1924 and died in 1968. Dr. Dawdy picked up the bouquet and checked the tag: it was purchased on Nov. 6. "Here's archaeological dating for you," she said with a small smile.
Treating Holt as an archaeological site means the government should not treat the votive artifacts as debris, she said, but as the religious artifacts that they are, with some effort to restore the damaged site, to find the objects and at least record where they came from.
FEMA simply tries to clean up damaged areas, and its Disaster Mortuary Operational Response Teams, called Dmort, deal with the bodies of the dead and address problems in cemeteries that might lead to disease.
If such places are destroyed, Dr. Dawdy said, "then people don't feel as connected here." She added that they might be more willing to come back to a damaged city if they felt they were returning to a recognizable place.
Though she has deep emotional ties to New Orleans, Dr. Dawdy was born in Northern California. She came here in 1994 to write her master's thesis for the College of William & Mary. "I wrote it all day," she said. "If I had written a minimum of five pages, I could come out for a parade at night." Over the eight weeks it took to finish the project, she said: "I fell in love with New Orleans. I really consider it the home of my ..."
She started a pilot program at the University of New Orleans, working with city planners and grants for research projects that involved excavation, oral history and hands-on work with the city to safeguard its buried treasures.
She left that job to earn a double doctorate at the University of Michigan in anthropology and history, focusing on French colonial times in New Orleans, then landed a coveted faculty position at the University of Chicago. She now lives in Chicago with her husband, Dan McNaughton, and their 5-year-old son.
Jean Comaroff, the head of the anthropology department at the University of Chicago, said in an e-mail message that it was only natural to be supportive of Dr. Dawdy's efforts to help New Orleans.
"I could think of no one better to serve FEMA in this role," she wrote. "The threat is great that much that was unique about New Orleans as a social and cultural world - qualities that are at once creative, poignant and fragile - will be lost in its reconstruction. Those of us who value these qualities feel moved to do all we can to conserve them."
Even before Hurricane Katrina, Dr. Dawdy had found ways to return to New Orleans. In 2004, she made an intriguing discovery while researching a possible archaeological site under an old French Quarter parking garage slated for demolition. Property records and advertisements from the 1820's said that the site had been the location of a hotel with an enticing name: the Rising Sun Hotel.
Dr. Dawdy found a January 1821 newspaper advertisement for the hotel in which its owners promised to "maintain the character of giving the best entertainment, which this house has enjoyed for twenty years past."
It went on: "Gentlemen may here rely upon finding attentive Servants. The bar will be supplied with genuine good Liquors; and at the Table, the fare will be of the best the market or the season will afford."
The historical record made her think that the building might have served
as something more interesting than a mere hotel, a brothel perhaps.
Digging under the garage, she found an unusual number of liquor bottles
and rouge pots.
For Dr. Dawdy, it was a lucky break, the kind of find that can make a
reputation. "Can you prove archaeologically is this a brothel?" she
asked. "I can't prove it with a yes or no answer."
Nor can she say with certainty that this Rising Sun was the inspiration
for "House of the Rising Sun," the famous song first recorded in 1937 by
Alan Lomax, a musicologist and folklorist.
"I love the ambiguity of it all," Dr. Dawdy said.
New Orleans, she noted, has always been known for its libertine
lifestyle. The French all but abandoned the city around 1735, deeming the colony unworthy of the nation's support. Novels like
"Manon Lescaut" portrayed the city as a den of iniquity and corruption,
and across Europe, "they thought the locals were basically a bunch of
rogues, immoral and corrupt," Dr. Dawdy said.
She added that she saw parallels to today, as some skepticism emerges
about rebuilding the city. Dr. Dawdy characterized that posture as,
"Those people in New Orleans aren't worth saving, because they're all
But even if the devastation makes it hard to envision the road back, the
city, she said, is worth fighting for.
"The thing about New Orleans that gives me hope is they are so tied to
family, place, history," Dr. Dawdy said. "If anyone is going to stick it
out, out of a sense of history, out of a sense of tradition, it is New Orleanians."
Copyright 2006 The New York Times Company
Ann Popplestone AAB, BA, MA
CCC Metro TLC
- March 3, 2007
Putting to a Vote the Question 'Who Is Cherokee?'
By EVELYN NIEVES
TAHLEQUAH, Okla., March 1 - The casinos here are crowded by midmorning;
busloads of tourists stroll the streets, and construction crews are
everywhere. But peace of mind eludes the prospering Cherokee Nation of Oklahoma.
The Cherokees, so proud that they survived the racism and greed that
forced them to leave the East and settle in Oklahoma, are embroiled in a
debate that is dredging up some of the most painful chapters of their
history. The fundamental question they are asking is: Who is Cherokee?
And it is raising ugly accusations of racism, from both inside and
outside the tribe.
At issue is a group barely known outside of Indian country, the
Freedmen. These are the descendants of black slaves owned by Cherokees,
free blacks who were married to Cherokees and the children of mixed-race
families known as black Cherokees, all of whom joined the Cherokee
migration to Oklahoma in 1838.
The Freedmen became full citizens of the Cherokee Nation after
emancipation, as part of the Treaty of 1866 with the United States. But
in 1983, by tribal decree, the Freedmen were denied the right to vote in
tribal elections on the ground they were not "Cherokee by blood."
They sued, and in December won their challenge. But that has prompted a
bigger fight. On Saturday, the Cherokee Nation is holding a special
election - believed to be the first of its kind - to decide, in essence,
whether to kick the Freedmen out of the tribe.
Officially, the election will ask voters whether to amend the Cherokee
Nation Constitution. Overriding the 1866 treaty, it would limit
citizenship to those who can trace their heritage to "Cherokee by blood"
rolls, part of a census known as the Dawes Rolls of 1906. The Freedmen
would automatically be denied citizenship because the Dawes Rolls, a
census commissioned by Congress to distribute land to tribal members,
put the Freedmen on a separate roll that made no mention of Indian blood.
Proponents of the amendment say it is about drawing a line, a blood
line. The Cherokee Nation, the second-largest tribe in the country after
the Navajo, is also one of the fastest growing, with 270,000 members and
1,000 new citizens enrolled every month. Members are entitled to federal
benefits and tribal services, including medical and housing aid.
"Every other Indian tribe is based on blood, and they are not accused of
being racists," said John A. Ketcher, a former deputy tribal chief, in a
full-page "Vote Yes" ad in the Cherokee newspaper.
Many tribal leaders are campaigning for the amendment, citing the right
of a sovereign nation to determine its citizenship.
Voters say they have been bombarded with advertisements attacking
"non-Indians" as thieves who would create long lines in Cherokee health
clinics and social service centers.
Freedmen supporters chalk up the claims to bigotry. They say the
Cherokee Nation knows all too well that many Freedmen (who number about
25,000) have Cherokee blood.
When the Dawes Rolls were created, those with any African blood were put
on the Freedmen roll, even if they were half Cherokee. Those with
mixed-white and Cherokee ancestry, even if they were seven-eighths white
and one-eighth Cherokee, were put on the Cherokee by blood roll. More
than 75 percent of those enrolled in the Cherokee Nation have less than
one-quarter Cherokee blood, the vast majority of them of European ancestry.
Marilyn Vann said she could not believe that one election could
determine whether she was allowed to claim Cherokee blood.
"There are Freedmen who can prove they have a full-blooded Cherokee
grandfather who won't be members," said Ms. Vann, president of the
Descendants of Freedmen of the Five Civilized Tribes. "And there are
blond people who are 1/1000th Cherokee who are members."
Mike Miller, the Cherokee Nation spokesman, agreed.
"We are aware that there are those who can prove Indian blood who are
not Cherokee citizens, because they are not on the Dawes 'by blood'
Rolls," Mr. Miller said. "But I don't know of a single tribe that
determines citizenship through a bunch of sources."
This is the second time in recent years that an Indian nation has tried
to remove its Freedmen. The Seminole Freedmen won a similar legal battle in 2003.
The Seminoles were formed when refugees from several tribes joined with
runaway slaves. But after the Seminoles denied their Freedmen voting
rights and financial benefits, effectively abrogating the Treaty of
1866, the federal government refused to recognize the Seminoles as a sovereign nation.
The Cherokees are also risking their tribal sovereignty, said Jon Velie,
a lawyer for the Seminole and Cherokee Freedmen.
"There is this racial schism in Indian Country that is growing and
getting worse," Mr. Velie said. "Even having the debate is the problem.
You then become a lesser person because people get to decide whether
you're in or not."
Taylor Keen, a Cherokee tribal council member who supports Freedmen
citizenship, suggested that proponents of the amendment were pandering
to racism, trying to score political points for when they run for tribal
office in June.
"This is a sad chapter in Cherokee history," Mr. Keen said. "But this is
not my Cherokee Nation. My Cherokee Nation is one that honors all parts
of her past."
- Wow! I had no idea about this! I'm going to share this with my race and
ethnic relations class as well as my cultural anthropology class.
- March 4, 2007
By ROBIN MARANTZ HENIG
God has always been a puzzle for Scott Atran. When he was 10 years old, he scrawled a plaintive message on the wall of his bedroom in Baltimore. "God exists," he wrote in black and orange paint, "or if he doesn't, we're in trouble." Atran has been struggling with questions about religion ever since - why he himself no longer believes in God and why so many other people, everywhere in the world, apparently do.
Call it God; call it superstition; call it, as Atran does, "belief in hope beyond reason" - whatever you call it, there seems an inherent human drive to believe in something transcendent, unfathomable and otherworldly, something beyond the reach or understanding of science. "Why do we cross our fingers during turbulence, even the most atheistic among us?" asked Atran when we spoke at his Upper West Side pied-à-terre in January. Atran, who is 55, is an anthropologist at the National Center for Scientific Research in Paris, with joint appointments at the University of Michigan and the John Jay College of Criminal Justice in New York. His research interests include cognitive science and evolutionary biology, and sometimes he presents students with a wooden box that he pretends is an African relic. "If you have negative sentiments toward religion," he tells them, "the box will destroy whatever you put inside it." Many of his students say they doubt the existence of God, but in this demonstration they act as if they believe in something. Put your pencil into the magic box, he tells them, and the nonbelievers do so blithely. Put in your driver's license, he says, and most do, but only after significant hesitation. And when he tells them to put in their hands, few will.
If they don't believe in God, what exactly are they afraid of?
Atran first conducted the magic-box demonstration in the 1980s, when he was at Cambridge University studying the nature of religious belief. He had received a doctorate in anthropology from Columbia University and, in the course of his fieldwork, saw evidence of religion everywhere he looked - at archaeological digs in Israel, among the Mayans in Guatemala, in artifact drawers at the American Museum of Natural History in New York. Atran is Darwinian in his approach, which means he tries to explain behavior by how it might once have solved problems of survival and reproduction for our early ancestors. But it was not clear to him what evolutionary problems might have been solved by religious belief. Religion seemed to use up physical and mental resources without an obvious benefit for survival. Why, he wondered, was religion so pervasive, when it was something that seemed so costly from an evolutionary point of view?
The magic-box demonstration helped set Atran on a career studying why humans might have evolved to be religious, something few people were doing back in the '80s. Today, the effort has gained momentum, as scientists search for an evolutionary explanation for why belief in God exists - not whether God exists, which is a matter for philosophers and theologians, but why the belief does.
This is different from the scientific assault on religion that has been garnering attention recently, in the form of best-selling books from scientific atheists who see religion as a scourge. In "The God Delusion," published last year and still on best-seller lists, the Oxford evolutionary biologist Richard Dawkins concludes that religion is nothing more than a useless, and sometimes dangerous, evolutionary accident. "Religious behavior may be a misfiring, an unfortunate byproduct of an underlying psychological propensity which in other circumstances is, or once was, useful," Dawkins wrote. He is joined by two other best-selling authors - Sam Harris, who wrote "The End of Faith," and Daniel Dennett, a philosopher at Tufts University who wrote "Breaking the Spell." The three men differ in their personal styles and whether they are engaged in a battle against religiosity, but their names are often mentioned together. They have been portrayed as an unholy trinity of neo-atheists, promoting their secular world view with a fervor that seems almost evangelical.
Lost in the hullabaloo over the neo-atheists is a quieter and potentially more illuminating debate. It is taking place not between science and religion but within science itself, specifically among the scientists studying the evolution of religion. These scholars tend to agree on one point: that religious belief is an outgrowth of brain architecture that evolved during early human history. What they disagree about is why a tendency to believe evolved, whether it was because belief itself was adaptive or because it was just an evolutionary byproduct, a mere consequence of some other adaptation in the evolution of the human brain.
Which is the better biological explanation for a belief in God - evolutionary adaptation or neurological accident? Is there something about the cognitive functioning of humans that makes us receptive to belief in a supernatural deity? And if scientists are able to explain God, what then? Is explaining religion the same thing as explaining it away? Are the nonbelievers right, and is religion at its core an empty undertaking, a misdirection, a vestigial artifact of a primitive mind? Or are the believers right, and does the fact that we have the mental capacities for discerning God suggest that it was God who put them there?
In short, are we hard-wired to believe in God? And if we are, how and why did that happen?
"All of our raptures and our drynesses, our longings and pantings, our questions and beliefs . . . are equally organically founded," William James wrote in "The Varieties of Religious Experience." James, who taught philosophy and experimental psychology at Harvard <http://topics.nytimes.com/top/reference/timestopics/organizations/h/harvard_university/index.html?inline=nyt-org> for more than 30 years, based his book on a 1901 lecture series in which he took some early tentative steps at breaching the science-religion divide.
In the century that followed, a polite convention generally separated science and religion, at least in much of the Western world. Science, as the old trope had it, was assigned the territory that describes how the heavens go; religion, how to go to heaven.
Anthropologists like Atran and psychologists as far back as James had been looking at the roots of religion, but the mutual hands-off policy really began to shift in the 1990s. Religion made incursions into the traditional domain of science with attempts to bring intelligent design into the biology classroom and to choke off human embryonic stem-cell research on religious grounds. Scientists responded with counterincursions. Experts from the hard sciences, like evolutionary biology and cognitive neuroscience, joined anthropologists and psychologists in the study of religion, making God an object of scientific inquiry.
The debate over why belief evolved is between byproduct theorists and adaptationists. You might think that the byproduct theorists would tend to be nonbelievers, looking for a way to explain religion as a fluke, while the adaptationists would be more likely to be believers who can intuit the emotional, spiritual and community advantages that accompany faith. Or you might think they would all be atheists, because what believer would want to subject his own devotion to rationalism's cold, hard scrutiny? But a scientist's personal religious view does not always predict which side he will take. And this is just one sign of how complex and surprising this debate has become.
Angels, demons, spirits, wizards, gods and witches have peppered folk religions since mankind first started telling stories. Charles Darwin noted this in "The Descent of Man." "A belief in all-pervading spiritual agencies," he wrote, "seems to be universal." According to anthropologists, religions that share certain supernatural features - belief in a noncorporeal God or gods, belief in the afterlife, belief in the ability of prayer or ritual to change the course of human events - are found in virtually every culture on earth.
This is certainly true in the United States. About 6 in 10 Americans, according to a 2005 Harris Poll, believe in the devil and hell, and about 7 in 10 believe in angels, heaven and the existence of miracles and of life after death. A 2006 survey at Baylor University found that 92 percent of respondents believe in a personal God - that is, a God with a distinct set of character traits ranging from "distant" to "benevolent."
When a trait is universal, evolutionary biologists look for a genetic explanation and wonder how that gene or genes might enhance survival or reproductive success. In many ways, it's an exercise in post-hoc hypothesizing: what would have been the advantage, when the human species first evolved, for an individual who happened to have a mutation that led to, say, a smaller jaw, a bigger forehead, a better thumb? How about certain behavioral traits, like a tendency for risk-taking or for kindness?
Atran saw such questions as a puzzle when applied to religion. So many aspects of religious belief involve misattribution and misunderstanding of the real world. Wouldn't this be a liability in the survival-of-the-fittest competition? To Atran, religious belief requires taking "what is materially false to be true" and "what is materially true to be false." One example of this is the belief that even after someone dies and the body demonstrably disintegrates, that person will still exist, will still be able to laugh and cry, to feel pain and joy. This confusion "does not appear to be a reasonable evolutionary strategy," Atran wrote in "In Gods We Trust: The Evolutionary Landscape of Religion" in 2002. "Imagine another animal that took injury for health or big for small or fast for slow or dead for alive. It's unlikely that such a species could survive." He began to look for a sideways explanation: if religious belief was not adaptive, perhaps it was associated with something else that was.
Atran intended to study mathematics when he entered Columbia as a precocious 17-year-old. But he was distracted by the radical politics of the late '60s. One day in his freshman year, he found himself at an antiwar rally listening to Margaret Mead, then perhaps the most famous anthropologist in America. Atran, dressed in a flamboyant Uncle Sam suit, stood up and called her a sellout for saying the protesters should be writing to their congressmen instead of staging demonstrations. "Young man," the unflappable Mead said, "why don't you come see me in my office?"
Atran, equally unflappable, did go to see her - and ended up working for Mead, spending much of his time exploring the cabinets of curiosities in her tower office at the American Museum of Natural History. Soon he switched his major to anthropology.
Many of the museum specimens were religious, Atran says. So were the artifacts he dug up on archaeological excursions in Israel in the early '70s. Wherever he turned, he encountered the passion of religious belief. Why, he wondered, did people work so hard against their preference for logical explanations to maintain two views of the world, the real and the unreal, the intuitive and the counterintuitive?
Maybe cognitive effort was precisely the point. Maybe it took less mental work than Atran realized to hold belief in God in one's mind. Maybe, in fact, belief was the default position for the human mind, something that took no cognitive effort at all.
While still an undergraduate, Atran decided to explore these questions by organizing a conference on universal aspects of culture and inviting all his intellectual heroes: the linguist Noam Chomsky, the psychologist Jean Piaget, the anthropologists Claude Lévi-Strauss and Gregory Bateson (who was also Margaret Mead's ex-husband), the Nobel Prize-winning biologists Jacques Monod and François Jacob. It was 1974, and the only site he could find for the conference was at a location just outside Paris. Atran was a scraggly 22-year-old with a guitar who had learned his French from comic books. To his astonishment, everyone he invited agreed to come.
Atran is a sociable man with sharp hazel eyes, who sparks provocative conversations the way other men pick bar fights. As he traveled in the '70s and '80s, he accumulated friends who were thinking about the issues he was: how culture is transmitted among human groups and what evolutionary function it might serve. "I started looking at history, and I wondered why no society ever survived more than three generations without a religious foundation as its raison d'être," he says. Soon he turned to an emerging subset of evolutionary theory - the evolution of human cognition.
Some cognitive scientists think of brain functioning in terms of modules, a series of interconnected machines, each one responsible for a particular mental trick. They do not tend to talk about a God module per se; they usually consider belief in God a consequence of other mental modules.
Religion, in this view, is "a family of cognitive phenomena that involves the extraordinary use of everyday cognitive processes," Atran wrote in "In Gods We Trust." "Religions do not exist apart from the individual minds that constitute them and the environments that constrain them, any more than biological species and varieties exist independently of the individual organisms that compose them and the environments that conform them."
At around the time "In Gods We Trust" appeared five years ago, a handful of other scientists - Pascal Boyer, now at Washington University; Justin Barrett, now at Oxford; Paul Bloom at Yale - were addressing these same questions. In synchrony they were moving toward the byproduct theory.
Darwinians who study physical evolution distinguish between traits that are themselves adaptive, like having blood cells that can transport oxygen, and traits that are byproducts of adaptations, like the redness of blood. There is no survival advantage to blood's being red instead of turquoise; it is just a byproduct of the trait that is adaptive, having blood that contains hemoglobin.
Something similar explains aspects of brain evolution, too, say the byproduct theorists. Which brings us to the idea of the spandrel.
Stephen Jay Gould, the famed evolutionary biologist at Harvard who died in 2002, and his colleague Richard Lewontin proposed "spandrel" to describe a trait that has no adaptive value of its own. They borrowed the term from architecture, where it originally referred to the V-shaped structure formed between two rounded arches. The structure is not there for any purpose; it is there because that is what happens when arches align.
In architecture, a spandrel can be neutral or it can be made functional. Building a staircase, for instance, creates a space underneath that is innocuous, just a blank sort of triangle. But if you put a closet there, the under-stairs space takes on a function, unrelated to the staircase's but useful nonetheless. Either way, functional or nonfunctional, the space under the stairs is a spandrel, an unintended byproduct.
"Natural selection made the human brain big," Gould wrote, "but most of our mental properties and potentials may be spandrels - that is, nonadaptive side consequences of building a device with such structural complexity."
The possibility that God could be a spandrel offered Atran a new way of understanding the evolution of religion. But a spandrel of what, exactly?
Hardships of early human life favored the evolution of certain cognitive tools, among them the ability to infer the presence of organisms that might do harm, to come up with causal narratives for natural events and to recognize that other people have minds of their own with their own beliefs, desires and intentions. Psychologists call these tools, respectively, agent detection, causal reasoning and theory of mind.
Agent detection evolved because assuming the presence of an agent - which is jargon for any creature with volitional, independent behavior - is more adaptive than assuming its absence. If you are a caveman on the savannah, you are better off presuming that the motion you detect out of the corner of your eye is an agent and something to run from, even if you are wrong. If it turns out to have been just the rustling of leaves, you are still alive; if what you took to be leaves rustling was really a hyena about to pounce, you are dead.
A classic experiment from the 1940s by the psychologists Fritz Heider and Marianne Simmel suggested that imputing agency is so automatic that people may do it even for geometric shapes. For the experiment, subjects watched a film of triangles and circles moving around. When asked what they had been watching, the subjects used words like "chase" and "capture." They did not just see the random movement of shapes on a screen; they saw pursuit, planning, escape.
So if there is motion just out of our line of sight, we presume it is caused by an agent, an animal or person with the ability to move independently. This usually operates in one direction only; lots of people mistake a rock for a bear, but almost no one mistakes a bear for a rock.
What does this mean for belief in the supernatural? It means our brains are primed for it, ready to presume the presence of agents even when such presence confounds logic. "The most central concepts in religions are related to agents," Justin Barrett, a psychologist, wrote in his 2004 summary of the byproduct theory, "Why Would Anyone Believe in God?" Religious agents are often supernatural, he wrote, "people with superpowers, statues that can answer requests or disembodied minds that can act on us and the world."
A second mental module that primes us for religion is causal reasoning. The human brain has evolved the capacity to impose a narrative, complete with chronology and cause-and-effect logic, on whatever it encounters, no matter how apparently random. "We automatically, and often unconsciously, look for an explanation of why things happen to us," Barrett wrote, "and 'stuff just happens' is no explanation. Gods, by virtue of their strange physical properties and their mysterious superpowers, make fine candidates for causes of many of these unusual events." The ancient Greeks believed thunder was the sound of Zeus's thunderbolt. Similarly, a contemporary woman whose cancer treatment works despite 10-to-1 odds might look for a story to explain her survival. It fits better with her causal-reasoning tool for her recovery to be a miracle, or a reward for prayer, than for it to be just a lucky roll of the dice.
A third cognitive trick is a kind of social intuition known as theory of mind. It's an odd phrase for something so automatic, since the word "theory" suggests formality and self-consciousness. Other terms have been used for the same concept, like intentional stance and social cognition. One good alternative is the term Atran uses: folkpsychology.
Folkpsychology, as Atran and his colleagues see it, is essential to getting along in the contemporary world, just as it has been since prehistoric times. It allows us to anticipate the actions of others and to lead others to believe what we want them to believe; it is at the heart of everything from marriage to office politics to poker. People without this trait, like those with severe autism, are impaired, unable to imagine themselves in other people's heads.
The process begins with positing the existence of minds, our own and others', that we cannot see or feel. This leaves us open, almost instinctively, to belief in the separation of the body (the visible) and the mind (the invisible). If you can posit minds in other people that you cannot verify empirically, suggests Paul Bloom, a psychologist and the author of "Descartes' Baby," published in 2004, it is a short step to positing minds that do not have to be anchored to a body. And from there, he said, it is another short step to positing an immaterial soul and a transcendent God.
The traditional psychological view has been that until about age 4, children think that minds are permeable and that everyone knows whatever the child himself knows. To a young child, everyone is infallible. All other people, especially Mother and Father, are thought to have the same sort of insight as an all-knowing God.
But at a certain point in development, this changes. (Some new research suggests this might occur as early as 15 months.) The "false-belief test" is a classic experiment that highlights the boundary. Children watch a puppet show with a simple plot: John comes onstage holding a marble, puts it in Box A and walks off. Mary comes onstage, opens Box A, takes out the marble, puts it in Box B and walks off. John comes back onstage. The children are asked, Where will John look for the marble?
Very young children, or autistic children of any age, say John will look in Box B, since they know that's where the marble is. But older children give a more sophisticated answer. They know that John never saw Mary move the marble and that as far as he is concerned it is still where he put it, in Box A. Older children have developed a theory of mind; they understand that other people sometimes have false beliefs. Even though they know that the marble is in Box B, they respond that John will look for it in Box A.
The adaptive advantage of folkpsychology is obvious. According to Atran, our ancestors needed it to survive their harsh environment, since folkpsychology allowed them to "rapidly and economically" distinguish good guys from bad guys. But how did folkpsychology - an understanding of ordinary people's ordinary minds - allow for a belief in supernatural, omniscient minds? And if the byproduct theorists are right and these beliefs were of little use in finding food or leaving more offspring, why did they persist?
Atran ascribes the persistence to evolutionary misdirection, which, he says, happens all the time: "Evolution always produces something that works for what it works for, and then there's no control for however else it's used." On a sunny weekday morning, over breakfast at a French cafe on upper Broadway, he tried to think of an analogy and grinned when he came up with an old standby: women's breasts. Because they are associated with female hormones, he explained, full breasts indicate a woman is fertile, and the evolution of the male brain's preference for them was a clever mating strategy. But breasts are now used for purposes unrelated to reproduction, to sell anything from deodorant to beer. "A Martian anthropologist might look at this and say, 'Oh, yes, so these breasts must have somehow evolved to sell hygienic stuff or food to human beings,' " Atran said. But the Martian would, of course, be wrong. Equally wrong would be to make the same mistake about religion, thinking it must have evolved to make people behave a certain way or feel a certain allegiance.
That is what most fascinated Atran. "Why is God in there?" he wondered.
The idea of an infallible God is comfortable and familiar, something children readily accept. You can see this in the experiment Justin Barrett conducted recently - a version of the traditional false-belief test but with a religious twist. Barrett showed young children a box with a picture of crackers on the outside. What do you think is inside this box? he asked, and the children said, "Crackers." Next he opened it and showed them that the box was filled with rocks. Then he asked two follow-up questions: What would your mother say is inside this box? And what would God say?
As earlier theory-of-mind experiments already showed, 3- and 4-year-olds tended to think Mother was infallible, and since the children knew the right answer, they assumed she would know it, too. They usually responded that Mother would say the box contained rocks. But 5- and 6-year-olds had learned that Mother, like any other person, could hold a false belief in her mind, and they tended to respond that she would be fooled by the packaging and would say, "Crackers."
And what would God say? No matter what their age, the children, who were all Protestants, told Barrett that God would answer, "Rocks." This was true even for the older children, who, as Barrett understood it, had developed folkpsychology and had used it when predicting a wrong response for Mother. They had learned that, in certain situations, people could be fooled - but they had also learned that there is no fooling God.
The bottom line, according to byproduct theorists, is that children are born with a tendency to believe in omniscience, invisible minds, immaterial souls - and then they grow up in cultures that fill their minds, hard-wired for belief, with specifics. It is a little like language acquisition, Paul Bloom says, with the essential difference that language is a biological adaptation and religion, in his view, is not. We are born with an innate facility for language but the specific language we learn depends on the environment in which we are raised. In much the same way, he says, we are born with an innate tendency for belief, but the specifics of what we grow up believing - whether there is one God or many, whether the soul goes to heaven or occupies another animal after death - are culturally shaped.
Whatever the specifics, certain beliefs can be found in all religions. Those that prevail, according to the byproduct theorists, are those that fit most comfortably with our mental architecture. Psychologists have shown, for instance, that people attend to, and remember, things that are unfamiliar and strange, but not so strange as to be impossible to assimilate. Ideas about God or other supernatural agents tend to fit these criteria. They are what Pascal Boyer, an anthropologist and psychologist, called "minimally counterintuitive": weird enough to get your attention and lodge in your memory but not so weird that you reject them altogether. A tree that talks is minimally counterintuitive, and you might believe it as a supernatural agent. A tree that talks and flies and time-travels is maximally counterintuitive, and you are more likely to reject it.
Atran, along with Ara Norenzayan of the University of British Columbia, studied the idea of minimally counterintuitive agents earlier this decade. They presented college students with lists of fantastical creatures and asked them to choose the ones that seemed most "religious." The convincingly religious agents, the students said, were not the most outlandish - not the turtle that chatters and climbs or the squealing, flowering marble - but those that were just outlandish enough: giggling seaweed, a sobbing oak, a talking horse. Giggling seaweed meets the requirement of being minimally counterintuitive, Atran wrote. So does a God who has a human personality except that he knows everything or a God who has a mind but has no body.
It is not enough for an agent to be minimally counterintuitive for it to earn a spot in people's belief systems. An emotional component is often needed, too, if belief is to take hold. "If your emotions are involved, then that's the time when you're most likely to believe whatever the religion tells you to believe," Atran says. Religions stir up emotions through their rituals - swaying, singing, bowing in unison during group prayer, sometimes working people up to a state of physical arousal that can border on frenzy. And religions gain strength during the natural heightening of emotions that occurs in times of personal crisis, when the faithful often turn to shamans or priests. The most intense personal crisis, for which religion can offer powerfully comforting answers, is when someone comes face to face with mortality.
In John Updike's celebrated early short story "Pigeon Feathers," 14-year-old David spends a lot of time thinking about death. He suspects that adults are lying when they say his spirit will live on after he dies. He keeps catching them in inconsistencies when he asks where exactly his soul will spend eternity. "Don't you see," he cries to his mother, "if when we die there's nothing, all your sun and fields and what not are all, ah, horror? It's just an ocean of horror."
The story ends with David's tiny revelation and his boundless relief. The boy gets a gun for his 15th birthday, which he uses to shoot down some pigeons that have been nesting in his grandmother's barn. Before he buries them, he studies the dead birds' feathers. He is amazed by their swirls of color, "designs executed, it seemed, in a controlled rapture." And suddenly the fears that have plagued him are lifted, and with a "slipping sensation along his nerves that seemed to give the air hands, he was robed in this certainty: that the God who had lavished such craft upon these worthless birds would not destroy His whole Creation by refusing to let David live forever."
Fear of death is an undercurrent of belief. The spirits of dead ancestors, ghosts, immortal deities, heaven and hell, the everlasting soul: the notion of spiritual existence after death is at the heart of almost every religion. According to some adaptationists, this is part of religion's role, to help humans deal with the grim certainty of death. Believing in God and the afterlife, they say, is how we make sense of the brevity of our time on earth, how we give meaning to this brutish and short existence. Religion can offer solace to the bereaved and comfort to the frightened.
But the spandrelists counter that saying these beliefs are consolation does not mean they offered an adaptive advantage to our ancestors. "The human mind does not produce adequate comforting delusions against all situations of stress or fear," wrote Pascal Boyer, a leading byproduct theorist, in "Religion Explained," which came out a year before Atran's book. "Indeed, any organism that was prone to such delusions would not survive long."
Whether or not it is adaptive, belief in the afterlife gains power in two ways: from the intensity with which people wish it to be true and from the confirmation it seems to get from the real world. This brings us back to folkpsychology. We try to make sense of other people partly by imagining what it is like to be them, an adaptive trait that allowed our ancestors to outwit potential enemies. But when we think about being dead, we run into a cognitive wall. How can we possibly think about not thinking? "Try to fill your consciousness with the representation of no-consciousness, and you will see the impossibility of it," the Spanish philosopher Miguel de Unamuno wrote in "Tragic Sense of Life." "The effort to comprehend it causes the most tormenting dizziness. We cannot conceive of ourselves as not existing."
Much easier, then, to imagine that the thinking somehow continues. This is what young children seem to do, as a study at Florida Atlantic University demonstrated a few years ago. Jesse Bering and David Bjorklund, the psychologists who conducted the study, used finger puppets to act out the story of a mouse, hungry and lost, who is spotted by an alligator. "Well, it looks like Brown Mouse got eaten by Mr. Alligator," the narrator says at the end. "Brown Mouse is not alive anymore."
Afterward, Bering and Bjorklund asked their subjects, ages 4 to 12, what it meant for Brown Mouse to be "not alive anymore." Is he still hungry? Is he still sleepy? Does he still want to go home? Most said the mouse no longer needed to eat or drink. But a large proportion, especially the younger ones, said that he still had thoughts, still loved his mother and still liked cheese. The children understood what it meant for the mouse's body to cease to function, but many believed that something about the mouse was still alive.
"Our psychological architecture makes us think in particular ways," says Bering, now at Queens University in Belfast, Northern Ireland. "In this study, it seems, the reason afterlife beliefs are so prevalent is that underlying them is our inability to simulate our nonexistence."
It might be just as impossible to simulate the nonexistence of loved ones. A large part of any relationship takes place in our minds, Bering said, so it's natural for it to continue much as before after the other person's death. It is easy to forget that your sister is dead when you reach for the phone to call her, since your relationship was based so much on memory and imagined conversations even when she was alive. In addition, our agent-detection device sometimes confirms the sensation that the dead are still with us. The wind brushes our cheek, a spectral shape somehow looks familiar and our agent detection goes into overdrive. Dreams, too, have a way of confirming belief in the afterlife, with dead relatives appearing in dreams as if from beyond the grave, seeming very much alive.
Belief is our fallback position, according to Bering; it is our reflexive style of thought. "We have a basic psychological capacity that allows anyone to reason about unexpected natural events, to see deeper meaning where there is none," he says. "It's natural; it's how our minds work."
Intriguing as the spandrel logic might be, there is another way to think about the evolution of religion: that religion evolved because it offered survival advantages to our distant ancestors. This is where the action is in the science of God debate, with a coterie of adaptationists arguing on behalf of the primary benefits, in terms of survival advantages, of religious belief.
The trick in thinking about adaptation is that even if a trait offers no survival advantage today, it might have had one long ago. This is how Darwinians explain how certain physical characteristics persist even if they do not currently seem adaptive - by asking whether they might have helped our distant ancestors form social groups, feed themselves, find suitable mates or keep from getting killed. A facility for storing calories as fat, for instance, which is a detriment in today's food-rich society, probably helped our ancestors survive cyclical famines.
So trying to explain the adaptiveness of religion means looking for how it might have helped early humans survive and reproduce. As some adaptationists see it, this could have worked on two levels, individual and group. Religion made people feel better, less tormented by thoughts about death, more focused on the future, more willing to take care of themselves. As William James put it, religion filled people with "a new zest which adds itself like a gift to life . . . an assurance of safety and a temper of peace and, in relation to others, a preponderance of loving affections."
Such sentiments, some adaptationists say, made the faithful better at finding and storing food, for instance, and helped them attract better mates because of their reputations for morality, obedience and sober living. The advantage might have worked at the group level too, with religious groups outlasting others because they were more cohesive, more likely to contain individuals willing to make sacrifices for the group and more adept at sharing resources and preparing for warfare.
One of the most vocal adaptationists is David Sloan Wilson, an occasional thorn in the side of both Scott Atran and Richard Dawkins. Wilson, an evolutionary biologist at the State University of New York at Binghamton, focuses much of his argument at the group level. "Organisms are a product of natural selection," he wrote in "Darwin's Cathedral: Evolution, Religion, and the Nature of Society," which came out in 2002, the same year as Atran's book, and staked out the adaptationist view. "Through countless generations of variation and selection, [organisms] acquire properties that enable them to survive and reproduce in their environments. My purpose is to see if human groups in general, and religious groups in particular, qualify as organismic in this sense."
Wilson's father was Sloan Wilson, author of "The Man in the Gray Flannel Suit," an emblem of mid-'50s suburban anomie that was turned into a film starring Gregory Peck. Sloan Wilson became a celebrity, with young women asking for his autograph, especially after his next novel, "A Summer Place," became another blockbuster movie. The son grew up wanting to do something to make his famous father proud.
"I knew I couldn't be a novelist," said Wilson, who crackled with intensity during a telephone interview, "so I chose something as far as possible from literature - I chose science." He is disarmingly honest about what motivated him: "I was very ambitious, and I wanted to make a mark." He chose to study human evolution, he said, in part because he had some of his father's literary leanings and the field required a novelist's attention to human motivations, struggles and alliances - as well as a novelist's flair for narrative.
Wilson eventually chose to study religion not because religion mattered to him personally - he was raised in a secular Protestant household and says he has long been an atheist - but because it was a lens through which to look at and revivify a branch of evolution that had fallen into disrepute. When Wilson was a graduate student at Michigan State University in the 1970s, Darwinians were critical of group selection, the idea that human groups can function as single organisms the way beehives or anthills do. So he decided to become the man who rescued this discredited idea. "I thought, Wow, defending group selection - now, that would be big," he recalled. It wasn't until the 1990s, he said, that he realized that "religion offered an opportunity to show that group selection was right after all."
Dawkins once called Wilson's defense of group selection "sheer, wanton, head-in-bag perversity." Atran, too, has been dismissive of this approach, calling it "mind blind" for essentially ignoring the role of the brain's mental machinery. The adaptationists "cannot in principle distinguish Marxism from monotheism, ideology from religious belief," Atran wrote. "They cannot explain why people can be more steadfast in their commitment to admittedly counterfactual and counterintuitive beliefs - that Mary is both a mother and a virgin, and God is sentient but bodiless - than to the most politically, economically or scientifically persuasive account of the way things are or should be."
Still, for all its controversial elements, the narrative Wilson devised about group selection and the evolution of religion is clear, perhaps a legacy of his novelist father. Begin, he says, with an imaginary flock of birds. Some birds serve as sentries, scanning the horizon for predators and calling out warnings. Having a sentry is good for the group but bad for the sentry, which is doubly harmed: by keeping watch, the sentry has less time to gather food, and by issuing a warning call, it is more likely to be spotted by the predator. So in the Darwinian struggle, the birds most likely to pass on their genes are the nonsentries. How, then, could the sentry gene survive for more than a generation or two?
To explain how a self-sacrificing gene can persist, Wilson looks to the level of the group. If there are 10 sentries in one group and none in the other, 3 or 4 of the sentries might be sacrificed. But the flock with sentries will probably outlast the flock that has no early-warning system, so the other 6 or 7 sentries will survive to pass on the genes. In other words, if the whole-group advantage outweighs the cost to any individual bird of being a sentry, then the sentry gene will prevail.
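Wilson's sentry arithmetic can be sketched as a small multilevel-selection model. Every number here - survival rates, the sentry's cost, the group bonus, flock sizes - is an illustrative assumption, not a figure from "Darwin's Cathedral":

```python
# Deterministic toy sketch of the sentry-gene trade-off (numbers illustrative).
BASE = 0.50         # baseline per-bird survival each generation
GROUP_BONUS = 0.40  # survival boost for every bird in a flock with sentries
SENTRY_COST = 0.10  # extra mortality a sentry pays (less foraging, exposed)
CAPACITY = 100.0    # each flock's carrying capacity

def step(flock):
    """flock = (sentries, others) as expected counts; survivors leave two offspring."""
    sentries, others = flock
    bonus = GROUP_BONUS if sentries > 0 else 0.0
    sentries *= 2 * (BASE + bonus - SENTRY_COST)
    others *= 2 * (BASE + bonus)
    total = sentries + others
    if total > CAPACITY:  # truncate to capacity, keeping proportions
        sentries *= CAPACITY / total
        others *= CAPACITY / total
    return (sentries, others)

mixed_flock = (10.0, 10.0)  # flock with an early-warning system
plain_flock = (0.0, 20.0)   # flock without one
for _ in range(10):
    mixed_flock = step(mixed_flock)
    plain_flock = step(plain_flock)

print(mixed_flock, plain_flock)
# Within the mixed flock, the sentry share slowly erodes (0.8 vs. 0.9 survival),
# but the flock itself grows to capacity while the sentryless flock merely
# holds steady - so sentry genes persist in the population as a whole.
```

The within-group disadvantage and the between-group advantage are both visible in the output, which is exactly the tension Wilson's argument turns on: the sentry gene survives only if the whole-group benefit outweighs the individual cost.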
There are costs to any individual of being religious: the time and resources spent on rituals, the psychic energy devoted to following certain injunctions, the pain of some initiation rites. But in terms of intergroup struggle, according to Wilson, the costs can be outweighed by the benefits of being in a cohesive group that out-competes the others.
There is another element here too, unique to humans because it depends on language. A person's behavior is observed not only by those in his immediate surroundings but also by anyone who can hear about it. There might be clear costs to taking on a role analogous to the sentry bird - a person who stands up to authority, for instance, risks losing his job, going to jail or getting beaten by the police - but in humans, these local costs might be outweighed by long-distance benefits. If a particular selfless trait enhances a person's reputation, spread through the written and spoken word, it might give him an advantage in many of life's challenges, like finding a mate. One way that reputation is enhanced is by being ostentatiously religious.
"The study of evolution is largely the study of trade-offs," Wilson wrote in "Darwin's Cathedral." It might seem disadvantageous, in terms of foraging for sustenance and safety, for someone to favor religious over rationalistic explanations that would point to where the food and danger are. But in some circumstances, he wrote, "a symbolic belief system that departs from factual reality fares better." For the individual, it might be more adaptive to have "highly sophisticated mental modules for acquiring factual knowledge and for building symbolic belief systems" than to have only one or the other, according to Wilson. For the group, it might be that a mixture of hardheaded realists and symbolically minded visionaries is most adaptive and that "what seems to be an adversarial relationship" between theists and atheists within a community is really a division of cognitive labor that "keeps social groups as a whole on an even keel."
Even if Wilson is right that religion enhances group fitness, the question remains: Where does God come in? Why is a religious group any different from groups for which a fitness argument is never even offered - a group of fraternity brothers, say, or Yankees fans?
Richard Sosis, an anthropologist with positions at the University of Connecticut and Hebrew University of Jerusalem, has suggested a partial answer. Like many adaptationists, Sosis focuses on the way religion might be adaptive at the individual level. But even adaptations that help an individual survive can sometimes play themselves out through the group. Consider religious rituals.
"Religious and secular rituals can both promote cooperation," Sosis wrote in American Scientist in 2004. But religious rituals "generate greater belief and commitment" because they depend on belief rather than on proof. The rituals are "beyond the possibility of examination," he wrote, and a commitment to them is therefore emotional rather than logical - a commitment that is, in Sosis's view, deeper and more long-lasting.
Rituals are a way of signaling a sincere commitment to the religion's core beliefs, thereby earning loyalty from others in the group. "By donning several layers of clothing and standing out in the midday sun," Sosis wrote, "ultraorthodox Jewish men are signaling to others: 'Hey! Look, I'm a haredi' - or extremely pious - 'Jew. If you are also a member of this group, you can trust me because why else would I be dressed like this?' " These "signaling" rituals can grant the individual a sense of belonging and grant the group some freedom from constant and costly monitoring to ensure that its members are loyal and committed. The rituals are harsh enough to weed out the infidels, and both the group and the individual believers benefit.
In 2003, Sosis and Bradley Ruffle of Ben-Gurion University in Israel sought an explanation for why Israel's religious communes did better on average than secular communes in the wake of the economic crash of most of the country's kibbutzim. They based their study on a standard economic game that measures cooperation. Individuals from religious communes played the game more cooperatively, while those from secular communes tended to be more selfish. It was the men who attended synagogue daily, not the religious women or the less observant men, who showed the biggest differences. To Sosis, this suggested that what mattered most was the frequent public display of devotion. These rituals, he wrote, led to greater cooperation in the religious communes, which helped them maintain their communal structure during economic hard times.
In 1997, Stephen Jay Gould wrote an essay in Natural History that called for a truce between religion and science. "The net of science covers the empirical universe," he wrote. "The net of religion extends over questions of moral meaning and value." Gould was emphatic about keeping the domains separate, urging "respectful discourse" and "mutual humility." He called the demarcation "nonoverlapping magisteria," from the Latin magister, meaning "teacher."
Richard Dawkins had a history of spirited arguments with Gould, with whom he disagreed about almost everything related to the timing and focus of evolution. But he reserved some of his most venomous words for nonoverlapping magisteria. "Gould carried the art of bending over backward to positively supine lengths," he wrote in "The God Delusion." "Why shouldn't we comment on God, as scientists? . . . A universe with a creative superintendent would be a very different kind of universe from one without. Why is that not a scientific matter?"
The separation, other critics said, left untapped the potential richness of letting one worldview inform the other. "Even if Gould was right that there were two domains, what religion does and what science does," says Daniel Dennett (who, despite his neo-atheist label, is not as bluntly antireligious as Dawkins and Harris are), "that doesn't mean science can't study what religion does. It just means science can't do what religion does."
The idea that religion can be studied as a natural phenomenon might seem to require an atheistic philosophy as a starting point. Not necessarily. Even some neo-atheists aren't entirely opposed to religion. Sam Harris practices Buddhist-inspired meditation. Daniel Dennett holds an annual Christmas sing-along, complete with hymns and carols that are not only harmonically lush but explicitly pious.
And one prominent member of the byproduct camp, Justin Barrett, is an observant Christian who believes in "an all-knowing, all-powerful, perfectly good God who brought the universe into being," as he wrote in an e-mail message. "I believe that the purpose for people is to love God and love each other."
At first blush, Barrett's faith might seem confusing. How does his view of God as a byproduct of our mental architecture coexist with his Christianity? Why doesn't the byproduct theory turn him into a skeptic?
"Christian theology teaches that people were crafted by God to be in a loving relationship with him and other people," Barrett wrote in his e-mail message. "Why wouldn't God, then, design us in such a way as to find belief in divinity quite natural?" Having a scientific explanation for mental phenomena does not mean we should stop believing in them, he wrote. "Suppose science produces a convincing account for why I think my wife loves me - should I then stop believing that she does?"
What can be made of atheists, then? If the evolutionary view of religion is true, they have to work hard at being atheists, to resist slipping into intrinsic habits of mind that make it easier to believe than not to believe. Atran says he faces an emotional and intellectual struggle to live without God in a nonatheist world, and he suspects that is where his little superstitions come from, his passing thought about crossing his fingers during turbulence or knocking on wood just in case. It is like an atavistic theism erupting when his guard is down. The comforts and consolations of belief are alluring even to him, he says, and probably will become more so as he gets closer to the end of his life. He fights it because he is a scientist and holds the values of rationalism higher than the values of spiritualism.
This internal push and pull between the spiritual and the rational reflects what used to be called the "God of the gaps" view of religion. The presumption was that as science was able to answer more questions about the natural world, God would be invoked to answer fewer, and religion would eventually recede. Research about the evolution of religion suggests otherwise. No matter how much science can explain, it seems, the real gap that God fills is an emptiness that our big-brained mental architecture interprets as a yearning for the supernatural. The drive to satisfy that yearning, according to both adaptationists and byproduct theorists, might be an inevitable and eternal part of what Atran calls the tragedy of human cognition.
Robin Marantz Henig, a contributing writer, has written recently for the magazine about the neurobiology of lying and about obesity.