Post List

Anthropology posts


  • June 9, 2011
  • 05:14 PM

The Leper Warrior: Persistence of Racial Terminology in Biological Anthropology

by Kristina Killgrove in Powered By Osteons

A few months ago, the news media carried a story about "Bones of Leper Warrior found in Medieval Cemetery" in central Italy.  The publication by Mauro Rubini and Paola Zaio was in early view at the time and was just published in the July issue of the Journal of Archaeological Science (see citation below). I noticed that Katy Meyers blogged about it today over at Bones Don't Lie, but I'm afraid I can't be as charitable as she is in pointing out the flaws.

8th c. Avar warrior
(credit: Wikimedia Commons)
Rubini and Zaio studied skeletons from 234 graves in an early Medieval (6th-8th century) cemetery in Molise (south-central Italy).  Based on grave goods, they suggest that the people buried in the cemetery were of different ethnic backgrounds - the Eurasian Avars, Lombards, and indigenous Italians - and were semi-nomadic.  Three of the skeletons appear to have warfare-related wounds, and one of the three also suffered from leprosy in life.  The authors therefore conclude that the three were "warriors from the East."  The finding of a "leper warrior" is actually quite interesting, and I'll return to this at the end.  But there is one very problematic feature of this article that quite frankly surprised me, considering the high profile of the journal and the research caliber of the first author.

The authors offer no evidence to back up their claim that the cemetery was the burial place of people of different geographical origins, aside from the off-hand statement that "the multicultural context of the necropolis is shown by the presence of Lombard, indigenous, and Asian grave goods" (p. 1552).  Sure, different populations may have different artifacts, but the presence of, say, "typical Avar stirrups" does not prove an individual's ethnicity or background, nor does the similarity of the graves to "Pazyryk burials in the placement of the horse, body and grave goods" (ibid.).  As every bioarchaeology student knows from reading The Archaeology of Death and Burial (Parker Pearson 1999), a grave and its associated goods are much more likely to represent the values and mores of the group burying a person than of the person himself.  It's rather curious that Rubini and Zaio include a lengthy explanation of the fact that the Avars were not a distinct ethnic population but rather were "a heterogeneous, multi-ethnic population [... and] a union of multiple cultural patterns" (p. 1551) immediately before talking about the distinct "Asian" (whatever that means) presence in the Campochiaro cemetery.

Especially problematic is Rubini and Zaio's claim that the Lombard/Avar/indigenous classification is "supported also by the preliminary anthropological study of the skeletal sample" (p. 1552).  The study is not specified here, but in the authors' later description of the three graves of interest, the questionable analysis becomes clear (pp. 1554-6):
Grave n. 20 (see Fig. 3) Individual of male sex, with age at death over 55 years. Height was estimated at about 161.5 cm. The skull shows, according to the suggestions of Corrain (2002), characteristics of Mongolic type: dolicomorphous with a superior profile of ovoid shape, flat face and large, low orbit.  [...] Grave n. 102 (Fig. 5) Individual of male sex, with age at death 50-55 years. Height was estimated at about 169.7 cm. According to the suggestions of Corrain (2002), the skull shows characteristics of the Dinarico-Adriatic type: brachimorphous with superior profile of ovoid shape and narrow frontal.

[...] Grave n. 108 (Fig. 7) Individual of male sex, with age at death over 50 years. Height was estimated at about 161.1 cm. The skull shows a mesomorphous shape with a long and narrow face.

Just as biological anthropology in the U.S. in the 19th century was influenced by Samuel Morton's racist ideology in the construction of biological "types" and "races" (q.v. yesterday's post), biological anthropology in 19th century Italy was influenced by Cesare Lombroso's theory of atavism and "born criminals."  Morton interpreted his data on skull shape to fit his preconceived notions of race, and Lombroso did the same to fit ideas of "criminals," who all too often included the economically disadvantaged south Italians (Killgrove 2005).

Within biological anthropology in the U.S., we have worked hard to move on from Samuel Morton, Carleton Coon, and the use of the cephalic index in general.  So it always disheartens me when clearly archaic terminology is used to discuss skeletons.  Mongolic and Dinarico-Adriatic types?  Dolico-, brachi- and mesomorphous?  These terms have no place in today's biological anthropology, which is focused on understanding the diversity of the human population, both modern and ancient.  Skulls are not pots - we can't create a typology of cranial features and expect to be able to pick someone's head out of a lineup.  When we do this, we inevitably learn by using alternate methods that inter-population variation is low and gene flow was considerable (e.g., Killgrove 2009).  Classifications such as Dinarico-Adriatic are based on Coon's The Races of Europe (1939), and we've pretty clearly posthumously castigated Coon for advancing segregation in the U.S. in spite of the lack of scientific data to support clear "racial" differences.

There's absolutely no evidence within the Rubini and Zaio article that the three individuals of interest were from another geographic area - skull morphology doesn't cut it, and "ethnic" artifacts in the grave don't convince me either.  Were these "warriors from the East" as the article title implies?  Maybe.  But I'll need to see some aDNA or isotope data to consider the claim to be plausible.

Skull of the Leper Warrior
(credit: Rubini and Zaio, fig. 8a)
What is interesting about the article - and, I assume, the reason that JAS published it - is the authors' finding that a man over the age of 50 who was afflicted with leprosy likely engaged in warfare.  It shouldn't be revelatory that lepers in the past went to war, just as it shouldn't be revelatory that women in the past were active agents in war.  But considering the dearth of evidence we have for this kind of behavior in antiquity (and our own preconceived contemporary Western notions of alpha male warriors as different from frail, sickly lepers and powerless, weak women), the discovery of a warrior with leprosy is quite cool.

While I don't want to take away from Rubini and Zaio's fascinating discovery of the leper warrior, I do want to point out their employment of problematic and archaic terminology in discussing skeletal remains.  It surprises me that in the 21st century, anthropologists are still using these racially-tinged terms, and it surprises me even more that reviewers for the top-tier journal JAS would let the authors claim they'd found "Eastern" people with no real evidence of it.

... Read more »

M. Rubini, & P. Zaio. (2011) Warriors from the East. Skeletal evidence of warfare from a Lombard-Avar cemetery in central Italy (Campochiaro, Molise, 6th-8th century AD). Journal of Archaeological Science, 38(7), 1551-1559.

K. Killgrove. (2009) Rethinking taxonomies: skeletal variation on the North Carolina coastal plain. Southeastern Archaeology, 28(1), 87-100.

  • June 8, 2011
  • 09:44 PM

Gould's Straw Man

by Kristina Killgrove in Powered By Osteons

Stephen Jay Gould famously argued in his best-known work, The Mismeasure of Man, that Samuel Morton unconsciously manipulated his data on cranial capacity in different populations to fit his own preconceived, racist notions about human variation.  Gould undertook a reanalysis of Morton's data and leveled a variety of accusations against Morton: he incorrectly measured skulls, made mathematical errors, picked and chose his sample populations, and didn't report all of the data he collected.  I would be surprised if any biological anthropologist did not have to read The Mismeasure of Man in a college or graduate course somewhere along the line.  I've even assigned chapters from it when teaching my own classes.  The Gould vs. Morton case is often used to illustrate that scientists are humans and humans can be biased.

Yesterday, an article by Jason Lewis and colleagues came out in PLoS Biology contradicting Gould's conclusions about Morton: "The Mismeasure of Science: Stephen Jay Gould versus Samuel George Morton on Skulls and Bias".  The authors remeasured Morton's skulls and reanalyzed both Morton's and Gould's data and have concluded that Gould's claims about Morton's bias are either poorly supported or falsified.  It's a fascinating read, particularly the powerful last paragraphs:
[...] Our results falsify Gould's hypothesis that Morton manipulated his data to conform with his a priori views. The data on cranial capacity gathered by Morton are generally reliable, and he reported them fully. Overall, we find that Morton's initial reputation as the objectivist of his era was well-deserved. That Morton's data are reliable despite his clear bias weakens the argument of Gould and others that biased results are endemic in science. Gould was certainly correct to note that scientists are human beings and, as such, are inevitably biased, a point frequently made in “science studies.” But the power of the scientific approach is that a properly designed and executed methodology can largely shield the outcome from the influence of the investigator's bias. Science does not rely on investigators being unbiased “automatons.” Instead, it relies on methods that limit the ability of the investigator's admittedly inevitable biases to skew the results. Morton's methods were sound, and our analysis shows that they prevented Morton's biases from significantly impacting his results. The Morton case, rather than illustrating the ubiquity of bias, instead shows the ability of science to escape the bounds and blinders of cultural contexts.

Morton's study of skulls published in Crania Americana seems to have been grounded in good science after all.  His interpretation of the differences he saw, however, was not. We know now - thanks in large part to Franz Boas, the father of American anthropology - that environment most directly influences the size and shape of one's skull, not one's "race" or ethnic background.  The study by Lewis and colleagues, though, shows that Gould's interpretations may have been as clouded as Morton's, and it seems that Gould may have set up a straw man in Morton, attacking his science rather than his interpretations.

Anyone out there who's read The Mismeasure of Man should read the eye-opening PLoS article. (And John Hawks' post on the topic here.)  It might change the way you think about bias in scientific studies.

... Read more »

J.E. Lewis, D. DeGusta, M.R. Meyer, J.M. Monge, A.E. Mann, & R.L. Holloway. (2011) The Mismeasure of Science: Stephen Jay Gould versus Samuel George Morton on Skulls and Bias. PLoS Biology, 9(6). DOI: 10.1371/journal.pbio.1001071

  • June 8, 2011
  • 04:27 PM

Crazy Corn Children & Ritual Form

by Cris Campbell in Genealogy of Religion

In 1977, Stephen King published his short story “Children of the Corn” in Penthouse. Seven years later, movie audiences across the nation were horrified by the ritual doings of small town Nebraska kids who worshiped something malevolent in the corn.
It surely was no coincidence that later in the year, Nebraska experienced a sharp drop in [...]... Read more »

  • June 8, 2011
  • 02:45 PM

Earliest human migrations

by zacharoo in Lawn Chair Anthropology

One of my favorite paleoanthropological sites is Dmanisi, in the Republic of Georgia. It is the oldest securely dated hominid site outside Africa (just under 1.85 million years ago), and the hominids found there display a neat mix of primitive Homo habilis and derived H. erectus features. I consider myself lucky to have had the opportunity to excavate at Dmanisi last year, and to return to Georgia (lamazi Sakartvelo! [I hope I translated that correctly]) for more fieldwork next month.
Recently, Reid Ferring and others (2011) described the results of excavations of M5, a section of the site a bit away from the area where the hominids were found. M5 is pretty cool because it presents a nice geological "layer cake," as Ferring described it to us: each of the strata (different layers of deposition) is nicely and evenly stacked on the one below. Check out the labeled layers on the right of the figure from Ferring et al. (2011). This is in stark contrast to the jumbled strata (like 'spaghetti') where the hominids were found. In geology and archaeology, there is a general "law of superposition," which states that the lowest layers in a sequence were deposited earlier than the layers above them. The A sediments at Dmanisi are thus older than the Bs. Hominids have only been found in the B sediments. But work at M5 has shown that stone tools are found in the older A sediments, meaning that hominids arrived at the site and used it continually, beginning just after 1.85 million years ago.
Tools from the site differ between the older A and slightly later (still older than 1.75 million years!) B sediments in both material and manufacture. As they say in the paper (p. 2/5), a major difference in tool manufacture between the strata A and B occupations could be that during the earlier A times, "either cores were more intensively reduced or selected flakes were made elsewhere and carried to the site." I'm not sure why this may be, but it is neat that within a fairly narrow time span, researchers can see habits change in our early ancestors.
The authors also note that the older tools from the A sediments indicate "that Eurasia was probably occupied before Homo erectus appears in the East African fossil record" (from the paper's abstract). If only hominids also came out of the A sediments! The news media are touting this as meaning H. erectus evolved in Eurasia and then some members of the 'new species' moved back into Africa, but I don't think this is necessarily the case. The Dmanisi hominids are described as H. erectus, but lack some key H. erectus apomorphies (most notably a large brain size) and really look pretty similar to contemporary hominids in Kenya (such as KNM-ER 3733) and Tanzania (such as OH 16). Plus, the E. African hominid fossil record around 1.9 million years ago leaves some tantalizing hints of hominids more erectus-like than habilis-like, such as the ER 2598 occipital fragment.
So while Dmanisi definitely demonstrates the presence of hominids outside Africa earlier than most well-accepted "Homo erectus" (or "ergaster") fossils in E. Africa, I don't think they necessarily indicate that the species arose in Eurasia. Rather, what the fossil record likely shows is the evolution of populations of early Homo, in Africa and Eurasia, toward the more 'advanced' H. erectus we know and love (due to gene flow w/in a widespread species, rather than parallel evolution of similar traits in different species).
... Read more »

Ferring R, Oms O, Agustí J, Berna F, Nioradze M, Shelia T, Tappen M, Vekua A, Zhvania D, & Lordkipanidze D. (2011) Earliest human occupations at Dmanisi (Georgian Caucasus) dated to 1.85-1.78 Ma. Proceedings of the National Academy of Sciences of the United States of America. PMID: 21646521  

  • June 7, 2011
  • 04:30 PM

Universal Moral Grammar?

by Kevin Karpiak in Kevin Karpiak's Blog

Does anthropological evidence support the idea of a universal moral grammar?... Read more »

  • June 7, 2011
  • 12:15 PM

"What Have I Done?"—The Nature of Regret

by Krystal D'Costa in Anthropology in Practice

We've all been there—the "Oh, [expletive]" moment. Perhaps the door just shut and your keys are still sitting on the counter. Or you get to the subway/bus stop just as your mass transit mode of choice is pulling away. Perhaps you've left your wallet at home, and there are blue lights flashing in your rear view mirror. Or maybe your expletive moment is a bit darker: a broken promise, a hurt friend, or a damaged relationship through some fault of your own. After all, regret is all about you and what you could have done differently.  It can certainly vary in intensity, but we've all been there a time or two. Regret is a hard emotion to avoid. It is a curious emotion—a mixture of disappointment, shame, sadness, and self-blame—and it can be both a hindrance and a much-needed push in the face of opportunity. Does the experience of regret serve a purpose? Is it a necessary element of sociality?
One researcher suggests that there are two main forms of regret (1). The first is a "hot emotion" that carries a blow. For example, it's what you feel when you suffer a loss because you didn't follow instructions or seek guidance. It's the punch to the diaphragm as you think about the things you could have done differently, or that sinking feeling of despair as you're confronted with disappointment. The second is a form of wistful thinking. "If only" fills this category: If only I had taken the time to check where my keys were or if only I had walked away from that argument. Common to both forms is a sense that something could have been done differently. That is the nature of regret: the belief that a negative outcome is the result of one's own actions, and could have been avoided if one had taken an alternative path (2, 3).
The role of regret in decision-making is twofold. On one hand, it can be limiting. We tend to overgeneralize negative results, and so it may be that one poor choice can prevent you from taking any action that remotely resembles the regrettable course (4). For example, you may refrain from giving advice if your suggestion doesn't go as planned. Or you may stop purchasing a particular brand if you have problems with one of their products. Or you may avoid a particular topic if others seemed to have responded poorly to those ideas in the past. In this context, regret can easily become consuming. That is, the individual can be so focused on what he did wrong and what he would do differently, that it immobilizes him, rendering him incapable of accepting and resolving the situation. On the other hand, it can make us more attuned to missed opportunities. For example, if you paid more for something because the sale expired as you waited to see if the price would drop further, you may be less inclined to wait going forward. Or if you realize that you should have spoken up to prevent a friend from being hurt, you may decide to speak up sooner if it means saving a friendship.
Regret seems to operate as a social gauge. It may serve as a flag that a transgression has occurred that would not be acceptable on a continued basis. If this is indeed the case, then it may be possible to conceive of regret as a form of social apology, particularly if the regret is public—that is, if the regret is the result of an act that makes you look bad or lose status or standing, then regret may be a form of "remedial self-preservation" (5). In accepting blame for the event, the actor vilifies himself:

Apologies split the self into two parts, a "bad" self that is vilified for the incident and a "good" self that proclaims a recognition of the misconduct and extends a promise (often implicit) of more acceptable behavior in the future (6).

In this regard, regret is a personal assessment of perceived external judgment. It cannot be assigned, like blame. Regret is something we take upon ourselves. And the human tendency is to assume the worst: "People routinely overestimate the emotional impact of negative events ranging from professional failures and romantic breakups to electoral losses, sports defeats, and medical setbacks" (7). So we ultimately determine what to regret and what not to regret. If regret is consuming, is it because we allow it to be so?
Perhaps then regret stems from a conflict between this assessment and the true desires of the individual. It is possible to want something that society tells you is inappropriate—of course, we've moved beyond the regret of forgotten keys at this point, but regret truly does seem to exist on a sliding scale. More than just wistful thinking, which is a past-oriented perspective, regret may very much be rooted in the present in a sense of futility—firmly ensconced in the inability to pursue that which you desire most for whatever reason.
What are your thoughts on this emotion? Is it better to take a chance and pursue what you desire? Or should we wrestle with regret?
Cited:
Gilbert DT, Morewedge CK, Risen JL, & Wilson TD. (2004) Looking Forward to Looking Backward: The Misprediction of Regret. Psychological Science, 15(5), 346-50. PMID: 15102146
John Sabini and Maury Silver. (2005) Why Emotion Names and Experiences Don't Neatly Pair. Psychological Inquiry, 16(1), 1-10.
Schlenker, Barry, & Darby, Bruce. (1981) The Use of Apologies in Social Predicaments. Social Psychology Quarterly, 44(3), 271-278. DOI: 10.2307/3033840

Notes:
1. Sabini and Silver (2005): 9.
2. Gilbert et al. (2004): 346.
3. Sabini and Silver (2005): 7.
4. Gilbert et al. (2004): 346.
5. Schlenker and Darby (1981): 271.
6. Schlenker and Darby (1981): 272.
7. Gilbert et al. (2004): 346.
... Read more »

Gilbert DT, Morewedge CK, Risen JL, & Wilson TD. (2004) Looking Forward to Looking Backward: The Misprediction of Regret. Psychological Science, 15(5), 346-50. PMID: 15102146

John Sabini and Maury Silver. (2005) Why Emotion Names and Experiences Don't Neatly Pair. Psychological Inquiry, 16(1), 1-10.

Schlenker, Barry, & Darby, Bruce. (1981) The Use of Apologies in Social Predicaments. Social Psychology Quarterly, 44(3), 271-278. DOI: 10.2307/3033840  

  • June 7, 2011
  • 11:49 AM

Britain's Not Getting More Mentally Ill

by Neuroskeptic in Neuroskeptic

There's a widespread belief that mental illness is getting more common, or that it has got more common in recent years.

A new study in the British Journal of Psychiatry says: no, it's not. They looked at the UK APMS mental health surveys, which were done in 1993, 2000 and 2007. Long-time readers will remember these.

The authors of the new paper analyzed the data by birth cohort, i.e. when you were born, and by age at the time of the survey. If mental illness were rising, you'd predict that people born more recently would have higher rates of mental illness at any given age.

The headline finding: there was no cohort effect, implying that rates of mental illness aren't changing. There was a strong age effect: in men, rates peak at about age 50; in women the data is rather messy, but in general the rate is flat up to age 50 and then it falls off, as in men. But there's no evidence that those born recently are at higher risk.

The only exception was that men born after 1950 were at somewhat higher risk than those born earlier, as shown by the "break" on the graph above. The effect for women was smaller. The most recent cohort, those born after 1985, were also above the curve, but there was only one datapoint there, so it's hard to interpret.

We also get a rather cute graph showing how life changes with age. As you get older, you get less irritable and, if you're a woman, you'll worry less. But sleep problems and, in men, fatigue, increase. Overall, 50 is the worst age in terms of total symptoms. After that, it gets better. Well, that's nice to know. Or not, depending on your age.

Overall, the authors say:

Our finding of subsequently stable rates contradicts popular media stories of a relentlessly rising tide of mental illness, at least for men. Stable prevalence in the male population, together with peaking of the prevalence of common mental disorder at about age 50 years, indicates that a large increase in projected rates of poor mental health is unlikely in the male population in the near future. ... Trends in women are less clearly identified, with considerable increases in the prevalence of sleep problems, but no clear increase or even some decrease in other measures. Further research is needed to relate these age and cohort differences to drivers of mental health such as employment status and family composition.

Caution's warranted, though, because the APMS data were based on self-reported symptoms of mental illness assessed by lay interviewers. As I've argued before, self-report is problematic, but this is true of almost all of these kinds of studies.

More unusual is that this study didn't attempt to assign formal diagnoses; it just looked at total symptoms on the CIS scale, with a total of 12 or more considered to indicate "probable disorder". Purists would say that this is a weakness and that you ought to be making full DSM-IV diagnoses, but honestly, that's got its own problems, and I think this is no worse.

Finally, this study only looked at "common mental disorders", i.e. depression and various kinds of anxiety symptoms. Things like schizophrenia and bipolar disorder weren't included, but from what I remember they're not rising either.

Spiers N, Bebbington P, McManus S, Brugha TS, Jenkins R, & Meltzer H (2011). Age and birth cohort differences in the prevalence of common mental disorder in England: National Psychiatric Morbidity Surveys 1993-2007. The British Journal of Psychiatry, 198, 479-84. PMID: 21628710... Read more »

  • June 6, 2011
  • 04:55 PM

Osteochronology and the Berenstain Bears

by Kristina Killgrove in Powered By Osteons

Actually, Papa Bear, humans are a bit similar to trees.... Read more »

  • June 6, 2011
  • 11:08 AM

Foreign Ideas & Moral Indigestion

by Cris Campbell in Genealogy of Religion

Imagine you are dining at a friend’s home. Your host is excited because she has prepared a special dish for you. When dinner is finally served, you are surprised to see a whole egg on your plate and when you open the egg, you are even more surprised to see this:
That’s balut, a dish of [...]... Read more »

Ritter, Ryan, & Preston, Jesse Lee. (2011) Gross gods and icky atheism: Disgust responses to rejected religious beliefs. Journal of Experimental Social Psychology. DOI: 10.1016/j.jesp.2011.05.006

  • June 6, 2011
  • 06:00 AM

Trench Fever and Plague in 14th Century France

by Michelle Ziegler in Contagions

The Marseille plague group has been suggesting for some time now that human lice could be a major vector of medieval plague. To test their hypothesis the group devised a multiplex PCR screening method to rapidly screen many aDNA samples for seven pathogens that could cause medieval epidemics, including relapsing fever and trench fever transmitted by human lice. ... Read more »

  • June 6, 2011
  • 03:56 AM

Save the Planet by… Becoming a Vegan! Do I really have to?

by Stuart Farrimond in Dr Stu's Science Blog

Veganism – it’s just something for middle-class ‘hippies’ right? Vegans are those tree-hugging, hemp-wearing festival-goers who say ‘man’ far too much. Well perhaps it’s time for a rethink on that stereotype. At least if you care about environment, that is. If you had thought you could do your bit to fight global warming by getting … Continue reading »... Read more »

Gidon Eshel and Pamela A. Martin. (2006) Diet, Energy, and Global Warming. Earth Interactions, 10(9), 1-17. DOI: 10.1175/EI167.1

Fengxia Dong. (2007) Changing Diets in China's Cities: Empirical Fact or Urban Legend? Center for Agricultural and Rural Development at Iowa State University.

  • June 4, 2011
  • 04:26 PM

Decoding Frazer’s “Golden Bough”

by Cris Campbell in Genealogy of Religion

Few books in the history of anthropology are better known (but never read) than James George Frazer’s The Golden Bough: A Study in Magic and Religion. First published in 1890 (2 volumes), Frazer published a second edition in 1900 (3 volumes), and a rolling third edition between 1911 and 1915 which ballooned to 12 volumes.
Though [...]... Read more »

Ackerman, Robert. (1975) Frazer on Myth and Ritual. Journal of the History of Ideas, 36(1), 115-134. DOI: 10.2307/2709014  

  • June 3, 2011
  • 04:01 AM

Political Suicide

by Neuroskeptic in Neuroskeptic

When is killing yourself not suicide?

In the British Journal of Psychiatry, two psychiatrists and an anthropologist discuss recent cases of self-immolation as a form of political protest in the Arab world:

Since ancient times there has been a difference between suicide (an act of self-destruction) and self-immolation which, although self-destructive, has a sacrificial connotation. Self-immolation is associated with terrible physical pain (burning alive) and with the idea of courage... It is, however, a new phenomenon in Arab Muslim societies. The self-immolation of the young Tunisian Mohamed Bouazizi, a street vendor, expresses both the extreme hurt associated with the harassment and humiliation that was inflicted on him after his wares had been confiscated, and the fact that there were no other ways to be heard in a country where he knew no kind of political system other than dictatorship... His gesture is now being replicated, mostly by other young men in Arab countries. These events... raise important issues for psychiatrists and mental health professionals. First, these events highlight the social, political and cultural dimensions of suicide as a powerful collective idiom of distress. In the Tunisian case there is a shift from an individual sinful suicide to a sacrifice which evokes martyrdom. Fire symbolises purification... Second, in spite of the fact that the idiom of distress put forward by these Arab youth is radically different from the usual profile of youth suicide in Western countries, these events may also be an invitation to rethink the collective dimensions of youth suicide as a protest against society. Without minimising the role of psychopathology and interpersonal factors, it may be time to revisit the collective meaning associated by youth with the decision to exit a world in which they may feel they do not always have a voice.

There's certainly a perception that some suicide is "political", and quite different from similar actions done for "personal" reasons. The same goes for breaking the law: we make a distinction between "common criminals", who do it for their own sake, and people who do so for an ideal.

But I wonder whether this political/personal distinction is so clear-cut, psychologically speaking. Even "political" suicide has a personal component: in most cases, millions of people are in the same political situation, but only a few people burn themselves. Politics alone doesn't explain any individual case.

Conversely, the idea that "personal" suicide is simply a symptom of an individual's mental illness is likewise inadequate: most people with mental illness, even very severe cases, do not do it. We have to look into the social sphere as well.

Emile Durkheim drew a distinction between "egoistic" suicide, related to an individual's "prolonged sense of not belonging, of not being integrated in a community", and "anomic" suicide, caused by upheavals in society leading to "an individual's moral confusion and lack of social direction". But aren't those different ways of looking at the same thing?

Cheikh IB, Rousseau C, & Mekki-Berrada A (2011). Suicide as protest against social suffering in the Arab world. The British Journal of Psychiatry, 198, 494-5. PMID: 21628715... Read more »

Cheikh IB, Rousseau C, & Mekki-Berrada A. (2011) Suicide as protest against social suffering in the Arab world. The British Journal of Psychiatry, 198, 494-5. PMID: 21628715

  • June 2, 2011
  • 02:37 PM

Lost in (Western) Translation

by Cris Campbell in Genealogy of Religion

There is a sense in which we are all cultural narcissists. By this, I mean that because all of us are acculturated at a particular time and in a particular place, we have a strong tendency to view other times and places through our own cultural lens. These lenses are prismatic and what we see [...]... Read more »

  • June 2, 2011
  • 07:44 AM

Language revitalization and liberation

by Ingrid Piller in Language on the Move

I’ve recently come across the story of Chibana Shoichi, who burnt the Japanese flag in 1987 to commemorate the Okinawan victims of WWII Japanese militarism. The story is intriguing not because of the flag-burning incident but because Shoichi also keeps … Continue reading →... Read more »

  • June 1, 2011
  • 01:40 PM

The Arabian Middle Paleolithic and the southern route of human dispersal

by Julien Riel-Salvatore in A Very Remote Period Indeed

In a comment on my last post, Maju who's a regular commenter on this blog, pointed out that recent finds in the Arabian Peninsula and the Persian Gulf suggest that modern humans might have been present in the Middle East by the time Shanidar 3 was killed. Some of the specific evidence in support of this that has come out in the past year include that presented by Armitage et al. (2011) and Rose (... Read more »

Petraglia, Michael D., & Alsharekh, Abdullah. (2003) The Middle Palaeolithic of Arabia: Implications for modern human origins, behaviour and dispersals. Antiquity, 77(298), 671-684.

  • May 31, 2011
  • 04:53 PM

Bioarchaeology of Roman Seafood Consumption

by Kristina Killgrove in Powered By Osteons

How much seafood did the Romans eat, and how does imported seafood affect our understanding of their origins?... Read more »

C. Beltrame, D. Gaddi, & S. Parizzi. (2011) A presumed hydraulic apparatus for the transport of live fish, found on the Roman wreck at Grado, Italy. International Journal of Nautical Archaeology. DOI: 10.1111/j.1095-9270.2011.00317.x

  • May 31, 2011
  • 04:14 PM

Vaccines Cause Autism, Until You Look At The Data

by Neuroskeptic in Neuroskeptic

According to a much-discussed new paper, vaccines may cause autism after all: A Positive Association found between Autism Prevalence and Childhood Vaccination uptake across the U.S. Population.

The author is Gayle DeLong, who "teaches international finance at Baruch College, City University of New York", according to her profile as a board member of anti-vaccine group SafeMinds. She correlated rates of coverage of the government-recommended full set of vaccines in the 51 US states (including Washington D.C.) with registered rates of autism in those states six years later.

Uh-oh: there was a correlation between vaccination in two-year-old kids and the rate of autism in the state six years later, when those kids were eight. As the abstract says:

The higher the proportion of children receiving recommended vaccinations, the higher was the prevalence of AUT... The results suggest that although mercury has been removed from many vaccines, other culprits may link vaccines to autism. Further study into the relationship between vaccines and autism is warranted.

Sounds rather scary. Until you look at the data, helpfully provided in the paper. First up, here's the scatterplot of all of the vaccination rates and all of the autism-six-years-later rates. There are more than 51 data points, as you can see: there are actually 355, because each state had seven different datapoints (1995 vaccines vs 2001 autism through to 2001 vs 2007). This scatterplot shows no correlation. You can tell just from looking at it, but the correlation coefficient confirms it: a tiny r = 0.012 (from a possible range of -1 to 1).

To be fair, that's a very noisy measure, because each state has unique characteristics, so the effect of vaccines will be diluted. However, it's still a useful sanity check, and shows that there can't be a major effect; otherwise it would be too big to get diluted.

To get around this, I next looked at the change in the rates of vaccination from one year to the next, and correlated that with the corresponding change in future rates of autism, within each state. A "change" of 1 means no change, 0.5 means it halved, 2 means it doubled, etc. Zilch: the correlation coefficient r is 0.034.

Maybe the changes year-to-year were too small? So I checked the changes between the last year and the first year. This made the changes bigger, because more tends to change over six years than in just one. And, to be fair, this does produce a slightly stronger vaccine-autism effect... but it's still tiny. The correlation coefficient here is r = 0.18, which means that vaccination changes account for 3% of the variability in autism changes (r^2 = 0.034). The p value is 0.20: not statistically significant.

My conclusion is that this dataset shows no evidence of any association. The author nonetheless found one. How? By doing some statistical wizardry:

The statistical model used took into consideration the unique characteristics of each state. For example, each state had a unique mixture of pollution, which may have affected the prevalence of autism, yet such an effect was not included in this study. A fixed-effects, within-group panel regression (Hall and Cummins 2005) controlled for these unique yet undefined characteristics by deriving a different starting point (intercept) for each state. The 51 different intercepts - one for each state - reflected the base level of autism or speech disorders occurring in that state that were not explained by the other independent variables (vaccination rates, income, or ethnicity). The model then produced a single relationship between the independent variables and the prevalence of autism or speech disorders.

OK, that's all very fancy, but when the raw data shows zilch and you can only find a signal by "controlling for" stuff, alarm bells start ringing. Given sufficient statistical analysis, you can make any data say anything you want.

If the author had given details of the methods, and explained why she chose to control for the variables she did, and not others, that might be different. But she didn't. Nor did she justify only looking at the effects six years later, when five or seven or ten would be just as sensible... and so on.

(Note: whenever I've said "autism", that's my shorthand for autism + SLI, which is what the paper looked at; autism-alone data are not presented. Note also that by "vaccination %" I mean "% who got the full vaccine schedule"; the other kids may have got vaccines, just not all of them.)

Delong G (2011). A Positive Association found between Autism Prevalence and Childhood Vaccination uptake across the U.S. Population. Journal of Toxicology and Environmental Health, Part A, 74 (14), 903-16. PMID: 21623535... Read more »
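The arithmetic behind the "3% of the variability" point is just r squared, and the flat all-states scatterplot is exactly what independent data would look like. A minimal Python sketch (using the post's headline r plus synthetic numbers, not DeLong's actual data) illustrates both:

```python
import numpy as np

# Headline figure from the post: r = 0.18 for last-year-vs-first-year changes.
r = 0.18
r_squared = r ** 2  # ~0.032; the paper reports 0.034, consistent with an unrounded r
print(f"r = {r}, r^2 = {r_squared:.3f} -> about 3% of variance 'explained'")

# Sanity check: two independent series of 355 points (the number of
# state-year pairs) give a correlation near zero, like the raw scatterplot.
rng = np.random.default_rng(0)  # hypothetical data, fixed seed
vaccination = rng.normal(size=355)   # stand-in for vaccination rates
autism_later = rng.normal(size=355)  # stand-in for autism rates six years on
r_noise = np.corrcoef(vaccination, autism_later)[0, 1]
print(f"independent series: r = {r_noise:.3f}")
```

For 355 genuinely independent points, |r| will almost always fall below about 0.1 (the standard error is roughly 1/sqrt(355), about 0.05), which is why an r of 0.012 across all the pairs is indistinguishable from noise.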

  • May 31, 2011
  • 12:25 PM

Ghostbusting with Gozer

by Cris Campbell in Genealogy of Religion

According to the Ghostbusters Wiki, Gozer the Gozerian (known also as Gozer the Destructor, Volguus Zildrohar, and Lord of the Sebouillia) is an ancient entity who “was originally worshiped as a god by the Hittites, Mesopotamians, and the Sumerians around 6000 BC.” When not visiting retribution on New York in the form of the Stay [...]... Read more »

  • May 31, 2011
  • 11:53 AM

Stressed Lemurs and Grass-Eating Humans

by Laelaps in Laelaps

In his 1960 presidential address to the South African Archaeological Society, the anthropologist Louis Leakey cast the fossil humans that had been found in that country as little more than a collection of evolutionary dead-ends. Leakey didn’t put it quite like that – that would have been rude – but he did utilize the platform [...]... Read more »

Cerling, T., Mbua, E., Kirera, F., Manthi, F., Grine, F., Leakey, M., Sponheimer, M., & Uno, K. (2011) Diet of Paranthropus boisei in the early Pleistocene of East Africa. Proceedings of the National Academy of Sciences. DOI: 10.1073/pnas.1104627108  

Leakey, L. (1961) Africa's Contribution to the Evolution of Man. The South African Archaeological Bulletin, 16(61), 3. DOI: 10.2307/3887411  

join us!

Do you write about peer-reviewed research in your blog? Use our site to make it easy for your readers — and others from around the world — to find your serious posts about academic research.

If you don't have a blog, you can still use our site to learn about fascinating developments in cutting-edge research from around the world.

Register Now

Research Blogging is powered by SMG Technology.

To learn more, visit