Dogs are particularly good at tasks that involve communicating or cooperating with humans, which has led some researchers to speculate that they excel at solving social tasks more generally. For example, dogs can figure out where a human's attention is directed, readily pick up on eye-gaze and finger-pointing cues, distinguish among individual humans (by contrast, humans are notoriously bad at distinguishing among individual monkeys, for example), and, in at least one outstanding case, are capable of "fast mapping."
Relative to non-human primates, domestic dogs indeed seem to have exceptional social skills. For example, previous research has demonstrated that dogs are able to use human social cues to find hidden food while non-human primates cannot. Furthermore, cross-sectional studies of dogs and puppies of different ages, as well as longitudinal studies which track the development of individual puppies, have indicated that dogs do not require extensive exposure to humans to skillfully use those cues (though training enhances their skill). By contrast, wolf pups do require extensive exposure to humans to be able to extract meaning from human social cues such as eye-gaze and finger-pointing. So, while both dogs and wolves are able to understand human social cues, the domestication of dogs seems to have selected for this trait and allowed it to emerge early in development without much experience.
Figure 1: Gratuitous picture of my dog? Sure, why not?
These studies of social cognition in dogs share a common theme: they all tested social cognition in the context of a communicative-cooperative task. But do dogs' social skills extend beyond this narrow context, or are they otherwise unremarkable in non-communicative, non-cooperative social tasks? The distinction is not trivial; social information comes in many forms beyond explicit communication. For example, several non-human primate species are known to alter their behavior when trying to steal food from a human, depending on whether or not that human is watching them. This is surely a social problem, but one devoid of explicit communication or cooperation.
Two researchers with whom the regular reader of this blog should now be familiar, Victoria Wobber (who ran the bonobo testosterone study I mentioned in the review of Bonobo Handshake) and Brian Hare, wondered to what extent dogs can reason about the social world more broadly. Specifically, would their impressive social skills persist in a task that did not involve cooperative communication? They compared dogs and chimpanzees in two versions of a reversal learning task: one non-social, one social.
Wobber, V., & Hare, B. (2009) Testing the social dog hypothesis: Are dogs also more skilled than chimpanzees in non-communicative social tasks? Behavioural Processes, 81(3), 423-428. DOI: 10.1016/j.beproc.2009.04.003
Chimpanzees have culture (or not) depending on your definition. Image: Irish Wildcat / Creative Commons
Author's Note: The following is an expansion on my reply to anthropologist Dan Sperber on the PLoS ONE article "Prestige Affects Cultural Learning in Chimpanzees."
Culture is like art or pornography: it's hard to define, but everyone knows it when they see it. Cultural anthropologists have long struggled to develop a consistent definition of the very thing that they study, a problem that has resulted in bitter arguments between scholars that, to an outsider, may seem as esoteric as church doctrinal disputes over how many angels can sit upon the point of a needle.
In his 1959 book The Evolution of Culture anthropologist Leslie White famously defined culture as "the extra-somatic means of adaptation for the human organism." His goal was to bring some consistency to a field that had 164 separate definitions of "culture" being used interchangeably in the anthropological literature (which, predictably, made cross-cultural comparisons challenging at best). Today, this view has expanded beyond the human animal and a widely accepted definition is from Peter Richerson and Robert Boyd's celebrated work Not By Genes Alone: How Culture Transformed Human Evolution:
Culture is information capable of affecting individuals' behavior that they acquire from other members of their species through teaching, imitation, and other forms of social transmission. By information, we mean any kind of mental state, conscious or not, that is acquired or modified by social learning, and affects behavior.
Earlier I reported on a new study in PLoS ONE by Victoria Horner, Darby Proctor, Kristin E. Bonnie, Andrew Whiten, and Frans de Waal that found chimpanzees will adopt novel behaviors after watching them performed by high-ranking members of their group. The authors concluded that these findings demonstrate "prestige-based cultural transmission" for the first time in nonhuman animals. Their results were consistent with Richerson and Boyd's definition of culture as well as their argument that:
[N]atural selection has shaped the psychology of social learning so that we are predisposed to imitate people with prestige and material well-being. . . [M]any phenomena, ranging from maladaptive fads and fashions to group-functional religious beliefs to symbolically marked boundaries between groups, might result from the properties of prestige bias.
However, French anthropologist Dan Sperber (Research Director at the Jean Nicod Institute, CNRS, and 2009 recipient of the Claude Lévi-Strauss Prize in Social Science) has recently challenged these findings, insisting that they do not represent cultural transmission at all. In a critique following from his work in linguistic anthropology, he suggests that humans alone are capable of culture. However, just as in anthropology's past, his conclusion rests on the definition that he prefers to use.
Here is a far-reaching and crucially relevant question for those of us seeking to understand the evolution of culture: Is there any relationship between population size and tool kit diversity or complexity? This question is important because, if met with an affirmative answer, then the emergence of modern human culture may be explained by changes [...]
Kline, M.A., & Boyd, R. (2010) Population size predicts technological complexity in Oceania. Proceedings of the Royal Society B: Biological Sciences. PMID: 20392733
Although some have emphasized the need to breed crops for future climatic conditions, much of the world’s farming population relies on landrace populations, not formal breeding networks.
Undeniable, of course, and a good reason not to forget landraces, or farmers' local varieties, when thinking about how agriculture will (or will not) adapt to climate change. And [...]
Mercer, K., & Perales, H. (2010) Evolutionary response of landraces to climate change in centers of crop diversity. Evolutionary Applications. DOI: 10.1111/j.1752-4571.2010.00137.x
So say Mijares and colleagues (2010), reporting the discovery of a small human third metatarsal from Callao Cave in the northern Philippines. The paper presents a brief overview of fieldwork conducted at Callao since 2003 that exposed Pleistocene deposits at the site. The age of the layer in which the metatarsal was recovered was obtained through Electron Spin Resonance (ESR) and Uranium Series (U-Series) dating on two cervid teeth, one of which yielded an age of 66 +11/-9 kya.
From Mijares et al. (2010: 7, Fig. 8, Copyright © 2010 Elsevier Ltd). The Callao specimen (A) is compared to H. sapiens (B) and H. habilis (C), and three non-human primates to the right.
The really interesting part of the paper comes when the authors discuss the taxonomic attribution of the metatarsal. They compare it to various extant primates and show that it is convincingly a Homo bone, aligning most closely with small-bodied populations such as H. habilis or contemporary Philippine Negritos, the latter of which stand out as likely potential analogs of the hominin to whom the metatarsal belonged. That said, "the dimensions of the base of the bone and the section of the shaft are smaller, indicating peculiar proportions for the Callao metatarsal. At the mid-shaft, the shaft appears to be considerably smaller in the dorso-plantar direction than in the Negrito comparative sample. As shown by the reduced dimensions obtained for the dorso-plantar height and medio-lateral breadth of the proximal facet for the lateral cuneiform, the base is very small. It is the smallest of our sample, confirming the particular shape and proportions of the bone as seen from lateral and superior views" (Mijares et al. 2010: 8).
The authors emphasize that the peculiarities of the Callao metatarsal are unique in the panorama of known foot bones attributed to various Pleistocene Homo. Provocatively, they point out that the dimensions of the H. floresiensis third metatarsal from Liang Bua (LB 1) are very close to those of the Callao specimen (Mijares et al. 2010: 9). While they present this comparison as speculative, the implications of the exercise are clear: they are asking whether something like H. floresiensis could have been present at Callao ca. 67 kya, although they do cover their bases by emphasizing that the closest analog small-bodied humans known in the region today are Negritos.
What's a bit puzzling is their repeated assertion that the Philippines lie east of Wallace's Line. While I know there's a bit of debate over this, I've always understood the Philippines as being located west of Wallace's Line, on the Asian side of things. Mijares et al.'s argument that the Philippines are "beyond Wallace's Line in Island Southeast Asia" appears to be a further way of potentially linking the Callao specimen to those from Flores.
In any case, as the authors conclude, the Callao third metatarsal "documents the presence of a hominin species on the island of Luzon as early as 67 ka, and is testimony to a capability to colonize new territories across open sea gaps. The Philippine specimen also indicates that Flores was not the only island in Wallacea to be occupied by hominins more than 50,000 years ago" (Mijares et al. 2010: 9). Regardless of the precise taxonomic affiliation of the bone, it indicates a great time depth for human presence in that part of the Old World, and provides thought-provoking evidence that seafaring must have been part of the hominin behavioral range by that time, something that seems potentially to have been the case in other parts of East Asia as well.
Mijares, A., Détroit, F., Piper, P., Grün, R., Bellwood, P., Aubert, M., Champion, G., Cuevas, N., De Leon, A., & Dizon, E. (2010) New evidence for a 67,000-year-old human presence at Callao Cave, Luzon, Philippines. Journal of Human Evolution. DOI: 10.1016/j.jhevol.2010.04.008
Bonobo Week continues! I'm donating whatever proceeds I receive from my blogging shenanigans for the entire month of June to help the bonobos at Lola Ya Bonobo.
Imagine that you're wandering in the desert and you come across two magic lamps. One lamp grants three wishes. It's your standard sort of magic lamp with a genie in it. (No wishing for extra wishes, of course.) The second magic lamp is, well, a moody magic lamp. It's inconsistent. Sometimes it grants one wish, and sometimes it grants seven wishes. But the thing is, you don't know, when you rub the lamp and the genie pops out, whether he's going to grant you just one wish or the full seven. But let's make things more interesting. You only get to use one of the lamps. As soon as you rub one of the lamps and the genie comes out, the other lamp disappears. And you might be in the Desert of Infrequent Lamps. Tomorrow you could chance upon two more lamps, with the same rules. But you might not come across any more lamps for many days. So which lamp will you decide to use?
Figure 1: If you're lucky, the genie will have the voice of Robin Williams and will sing to you.
Decades of studies have indicated that, as humans, we tend to avoid risk: when it comes to potential gains, we prefer the safe option over the risky one. Most people would choose the sure thing and summon the three-wish genie, especially since they don't know when they'd be lucky enough to stumble upon their next lamp. Resources (in the shape of magical wish-granting lamps) are scarce. After all, you're wandering through the Desert of Infrequent Lamps. Sucks to be you.
Animals face similar risks on a daily basis, though in the context of things like food acquisition and predator avoidance. So it makes sense that natural selection would, over generations, favor cognitive decision-making mechanisms that most effectively addressed those risks. Risk preference patterns in animals are variable, though. That this variability has been observed, at least under experimental conditions, suggests that animals can adjust their strategies given the parameters of the immediate environment. For example, when the riskier option may not be very costly, or when plenty of food is available in the environment, the animal may opt for the riskier choice, and under those circumstances that may actually be the optimal decision.
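The tension in the lamp scenario comes down to simple expected-value arithmetic. A minimal sketch (purely illustrative; the post never states the moody lamp's odds, so the 50-50 split below is my assumption):

```python
# Expected number of wishes from each lamp in the desert scenario.
# Assumption (not stated in the post): the moody lamp grants its two
# possible outcomes, one wish or seven, with equal probability.
safe_lamp = 3  # the standard genie: always exactly three wishes

risky_outcomes = [1, 7]
risky_lamp = sum(risky_outcomes) / len(risky_outcomes)  # expected value: 4.0

# On average the moody lamp pays out more, yet risk-averse choosers
# still take the guaranteed three wishes.
print(safe_lamp, risky_lamp)
```

Under that assumption the moody lamp is actually worth an extra wish on average, which is exactly what makes risk aversion interesting: the safe choice forgoes expected value in exchange for certainty.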
Heilbronner, S., Rosati, A., Stevens, J., Hare, B., & Hauser, M. (2008) A fruit in the hand or two in the bush? Divergent risk preferences in chimpanzees and bonobos. Biology Letters, 4(3), 246-249. DOI: 10.1098/rsbl.2008.0081
"Dinah", a young female gorilla kept at the Bronx Zoo in 1914. From the Zoological Society Bulletin.
Frustrated by the failure of gorillas to thrive in captivity, in 1914 the Bronx Zoo's director William Hornaday lamented "There is not the slightest reason to hope that an adult gorilla, either male or female, ever will be seen living in a zoological park or garden." Whereas wild adult gorillas were "savage" and "implacable" beasts which could not be captured (a photo of a sculpture included in Hornaday's article depicts a gorilla strangling one man, brandishing another about with its other arm, and standing on the body of a third), young gorillas were fragile animals that did not last long in the concrete and steel enclosures made for them. One gorilla in Germany had survived for seven years, but the average lifespan of a captive juvenile gorilla was about nine months, and often considerably less than that. This was not to say that zoological parks would stop trying to capture and import young gorillas - Hornaday gave no indication that he wished to stop procuring young gorillas for his zoo - but only that visitors to the Bronx Zoo and other menageries would probably never see an adult gorilla.
Zoos had been failing miserably at keeping apes in captivity for centuries. Most of the animals captured were young individuals which had been snatched from their families or had just been orphaned by specimen collectors. They regularly died on the journey out of Africa or shortly after they arrived at their public confines. Many refused to eat, and most would become sick before passing away, but why this should be so puzzled zoologists. Perhaps, they speculated, it was a matter of climate. The cooler climates of Europe and North America were poor proxies for equatorial Africa, so it was hardly surprising that mortality was so high.
Looking back at the practices of zoos during the early 20th century, however, it is apparent that the different climates of Europe and North America cannot solely be blamed for the deaths of these apes. The traditional methods of catching and collecting wild animals which had worked for many other species caused a great deal of stress for captured apes, and the concrete and steel enclosures in which they were placed were cruel by today's standards. (Though, even under today's improved conditions, it can still be questioned whether zoos are capable of keeping apes happy and healthy.) Still, in reference to climate, what is curious about the modern disparity between Africa and the places to which the young apes were shipped is that, not so very long ago, much of the northern hemisphere was inhabited by a variety of ape species. North America never had apes, but Europe and much of Asia did, making today's ape species the tattered branches of what once was a richer family tree.
Merceron, G., Kaiser, T., Kostopoulos, D., & Schulz, E. (2010) Ruminant diets and the Miocene extinction of European great apes. Proceedings of the Royal Society B: Biological Sciences. DOI: 10.1098/rspb.2010.0523
Dunham, A., Erhart, E., & Wright, P. (2010) Global climate cycles and cyclones: consequences for rainfall patterns and lemur reproduction in southeastern Madagascar. Global Change Biology. DOI: 10.1111/j.1365-2486.2010.02205.x
How does natural selection account for language? Darwin wrestled with it, Chomsky sidestepped it, and Pinker claimed to solve it. Discerning the evolution of language is therefore a much sought-after endeavour, with a vast number of explanations emerging that offer a plethora of choice but little in the way of consensus. This is hardly new, [...]
Deacon, T. (2010) Colloquium Paper: A role for relaxed selection in the evolution of the language capacity. Proceedings of the National Academy of Sciences, 107(Supplement_2), 9000-9006. DOI: 10.1073/pnas.0914624107
An unexpected gem from last year's Journal of the American Psychoanalytic Association: Mind over Medicine. Surprisingly, it has nothing to do with psychoanalysis. Rutherford and colleagues performed a meta-analysis of a large number of clinical trials of antidepressants. Neuroskeptic readers will be all too familiar with these. But they did an interesting thing with the data: they compared the benefits of antidepressants in trials with a placebo condition vs. trials with no placebo arm, such as trials comparing one drug to another drug.
Why do that comparison? Because the placebo effect is likely to be stronger in trials with no placebo condition. If you volunteer for a placebo-controlled trial, you'll know that you've got (say) a 50-50 chance of getting inactive sugar pills. You'll probably be uncertain whether or not you'll get better, maybe even quite worried. On the other hand, if you're in a trial where you definitely will get a real drug, you can rest assured that you'll feel better - and that in itself might make your depression improve.
The paper only presents very preliminary results, but they say that:
Our group at Columbia has completed preliminary work involving meta-analyses of randomized controlled trials comparing antidepressant medications to a placebo or active comparator in geriatric outpatients with Major Depressive Disorder (Sneed et al. 2006). In placebo controlled trials, the medication response rate was 48% and the remission rate 33%, compared to a response rate of 62% and remission rate of 43% in the comparator trials (p
Notably, they only looked at trials in elderly patients, but the same probably applies to everyone else.
Why does this matter? The authors suggest one very important implication. There are quite a few trials nowadays comparing the effects of psychotherapy, medication, neither, or both. How it works is that everyone gets pills, 50% of them real drugs and 50% placebos; also, half the people get psychotherapy while the others remain on the waiting list.
These trials often find that medication plus psychotherapy is better than medication alone. This leads to the idea that therapy and drugs should be combined in clinical practice, a message which goes down really well because it gives both psychopharmacologists and therapists the feeling that they have an important job to do. An example of this kind of trial is the influential TADS from 2004, which found that Prozac and therapy both work in depressed teens, and that combining them is best. Everyone's a winner.
But as Rutherford et al. point out, there's a problem with this reasoning. The people who only get antidepressants don't know that they're getting any treatment, because they might be getting placebo. But the people who get antidepressants and therapy know that they're getting at least one real treatment (therapy). This is likely to improve their outcome through an expectation effect. (In fact, in TADS, the people on combination treatment were told that they were getting both - they knew they would never get dummy pills - which will have made this even worse.)
Now you could say that this doesn't matter: TADS and similar studies show that therapy plus medication is better than medication alone, and it's purely academic whether that's "just a placebo effect". But the key point is that in real life people always get medication knowing that it's real - so, like the therapy-plus-medication people in the trials, they get the benefit of knowing that they are getting a real treatment. In the trials the medication-only group don't know that, but in real life they do - so the benefits of adding psychotherapy might be less, or even zero, in real life.
The authors of the TADS study did acknowledge this in their original paper, but only very briefly - here's all they say about it:
Blinding patients in the placebo and fluoxetine alone groups but not in the CBT alone group (participants knew they would not be receiving fluoxetine) and the fluoxetine combined with CBT group (participants knew that they would be receiving fluoxetine) may have interacted with expectancy effects regarding improvement and acceptability of treatment assignment.
Yet this limitation means that, strictly speaking, all TADS showed is that Prozac works in this group. It doesn't prove that adding (very expensive) therapy benefits anyone in the real world. This is not to say that psychotherapy doesn't work, of course - maybe it does - but the point is that therapy-plus-medication trials of this design may not tell us how much the therapy itself adds in real life.
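The quoted rates lend themselves to a quick back-of-the-envelope comparison. A sketch (using only the percentages quoted from the Columbia group's preliminary geriatric meta-analyses; the arithmetic is mine, not the authors'):

```python
# Antidepressant outcomes in two trial designs, as quoted above:
# trials with a placebo arm vs. active-comparator trials (no placebo arm).
placebo_controlled = {"response": 0.48, "remission": 0.33}
comparator_only = {"response": 0.62, "remission": 0.43}

# The gap between designs is a rough index of the expectation effect:
# same class of drugs, but patients who know they are getting an active
# drug do noticeably better.
expectation_gap = {
    outcome: round(comparator_only[outcome] - placebo_controlled[outcome], 2)
    for outcome in placebo_controlled
}
print(expectation_gap)  # {'response': 0.14, 'remission': 0.1}
```

A 14-point swing in response rate from nothing but trial design is the heart of Rutherford et al.'s argument.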
Rutherford, B., Roose, S., & Sneed, J. (2009) Mind Over Medicine: the Influence of Expectations on Antidepressant Response. Journal of the American Psychoanalytic Association, 57(2), 456-460. DOI: 10.1177/00030651090570020909
One of the issues raised by the recent Sarmiento comments is that of the Miocene apes and the evolution of a short back. All extant apes possess a “short back,” by which we mean a reduction in the lumbar spine combined with an upward elongation of the blades of the pelvis. This back is a nice, [...]... Read more »
McCollum, M.A., Rosenman, B.A., Suwa, G., Meindl, R.S., & Lovejoy, C.O. (2010) The vertebral formula of the last common ancestor of African apes and humans. Journal of Experimental Zoology Part B: Molecular and Developmental Evolution, 314(2), 123-134. PMID: 19688850
Pilbeam, D. (2004) The anthropoid postcranial axial skeleton: Comments on development, variation, and evolution. Journal of Experimental Zoology, 302B(3), 241-267. DOI: 10.1002/jez.b.22
Preuschoft, H., Hayama, S., & Günther, M. (1988) Curvature of the Lumbar Spine as a Consequence of Mechanical Necessities in Japanese Macaques Trained for Bipedalism. Folia Primatologica, 50(1-2), 42-58. DOI: 10.1159/000156333
Nakatsukasa, M., Kunimatsu, Y., Nakano, Y., & Ishida, H. (2007) Vertebral morphology of Nacholapithecus kerioi based on KNM-BG 35250. Journal of Human Evolution, 52(4), 347-369. DOI: 10.1016/j.jhevol.2006.08.008
It has been said that "word frequency" is the most important variable in language research, despite the belief by many that it can't be used as a variable because no one really knows what a word is. (see: Minifalsehood: We can't tell what a word is!?!? and A run in my stocking ...)
A recent study in PLoS ONE looks at a heretofore little-investigated area: word and character use in Chinese.
Cai, Q., & Brysbaert, M. (2010) SUBTLEX-CH: Chinese Word and Character Frequencies Based on Film Subtitles. PLoS ONE, 5(6). DOI: 10.1371/journal.pone.0010729
As digital and social media infiltrate the world of sports, making teams, athletes, reporters, and information overall more accessible for fans, there is a greater opportunity for fans to connect to the game. This connection is important to the longevity of the franchises, and has largely been borne on the shoulders of the games' announcers. But why bother turning up the volume on the radio or [...]
Hallett, T. (2003) Emotional Feedback and Amplification in Social Interaction. The Sociological Quarterly, 44(4), 705-726. DOI: 10.1111/j.1533-8525.2003.tb00532.x
Like atlatls, but to an even greater degree, bows are rare in the archaeological record because they were made of perishable materials. While some types of atlatls had more durable attachments such as hooks and weights, bows were almost always made of wood and various fibrous materials, except in some areas where they were made [...]
Hibben, F. (1938) A Cache of Wooden Bows from the Mogollon Mountains. American Antiquity, 4(1), 36. DOI: 10.2307/275360
I’ve said quite a lot about atlatls, so perhaps it’s time to move on to the second part of this series. The bow and arrow is a sufficiently popular weapon system even today that it doesn’t need much introduction. It’s important to note, however, that most archaeologists have concluded that the bow and arrow is [...]
The skull of a spotted hyena (Crocuta crocuta), photographed at the AMNH's "Extreme Mammals" exhibit.
There was something strange about the assemblage of Homo erectus fossils found at Zhoukoudian - the famous 750,000- to 200,000-year-old site in China popularly known as Dragon Bone Hill. Despite the abundance of skulls and teeth, there were hardly any remains of the hominins from below the neck. Where were the bodies?
The majority of Homo erectus fossils from Zhoukoudian were discovered and studied by an international team of scientists during the 1920s and 1930s. (Unfortunately most of the specimens were lost with the outbreak of WWII, but casts of these early discoveries remain.) They were just what paleoanthropologists had been hoping to find - evidence that human evolution had primarily taken place in Asia (a hypothesis later overturned by discoveries in Africa) - but the dearth of postcranial remains was puzzling. Clearly something must have happened to bias the fossil record so that skulls were more likely to be preserved than bodies.
At this time many paleoanthropologists believed that, as the English philosopher Thomas Hobbes once put it, the lives of early humans were "nasty, brutish, and short." The heavy brows and robust bones of hominins like Neanderthals and Homo erectus testified to a time when strength and savagery were more important to survival than intelligence or culture. No doubt these prehistoric people were just as brutal to each other as they were to the animals they hunted, and so the horrifying act of cannibalism seemed like a plausible explanation for what the scientists found at Zhoukoudian.
Boaz, N.T., Ciochon, R., Xu, Q., & Liu, J. (2000) Large Mammalian Carnivores as a Taphonomic Factor in the Bone Accumulation at Zhoukoudian. Acta Anthropologica Sinica, 224-234.
Boaz, N. (2004) Mapping and taphonomic analysis of the Homo erectus loci at Locality 1 Zhoukoudian, China. Journal of Human Evolution, 46(5), 519-549. DOI: 10.1016/j.jhevol.2004.01.007
For those of you familiar with the formal mathematical models of cultural evolution (Cavalli-Sforza & Feldman, 1981; Boyd & Richerson, 1985), you’ll know there is a substantive body of literature behind the process of cultural transmission. It comes as a surprise, then, that experiments in this area are generally lacking. For instance, if we look [...]
Mesoudi, A., & Whiten, A. (2008) Review. The multiple roles of cultural transmission experiments in understanding human cultural evolution. Philosophical Transactions of the Royal Society B: Biological Sciences, 363(1509), 3489-3501. DOI: 10.1098/rstb.2008.0129
Prozac and suicide: what's going on?Many people think that SSRI antidepressants do indeed cause suicide, and in recent years this idea has gained a huge amount of attention. My opinion is that, well, it's all rather complicated...At first glance, it seems as though it should be easy to discover the truth. SSRIs are some of the most studied drugs in the world. We have data from several hundred randomized placebo-controlled trials, totaling tens of thousands of patients. Let's just look and see whether people given SSRIs are more likely to die by suicide than people given placebo.Unfortunately, that doesn't really work. Actual suicides are extremely rare in antidepressant trials. This is partly because most trials only last 4 to 6 weeks, but also because anyone showing evidence of suicidal tendencies is excluded from the studies at the outset. There just aren't enough suicides to be able to study.What you can do is to look at attempted suicide, and at "suicidality", meaning suicidal thoughts and self-harming behaviours. Suicidality is more common than actual suicide, so it's easier to research. Here's the bad news: the evidence from a huge number of trials is that compared to placebo, antidepressants do raise the risk of suffering suicidality(1) and of suicide attempts(1) (from 1.1 per 1000 to 2.7 per 1000), when given to people with psychiatric disorders.There's no good evidence that SSRIs are any worse or any better than other antidepressants, or that any one SSRI stands out as particularly bad(1,2). The risk seems to be worst in younger people: compared to placebo, SSRIs raised suicidality in people below age 25, had no effect in most adults, and lowered it in the oldest age groups(1). This is why SSRIs (and all other antidepressants) now carry a "black box" in the USA, warning about the risk of suicide in young people.*This is very troubling. Hang on though. I mentioned that suicidality is an exclusion criterion from pretty much all antidepressant trials. 
This is for ethical as well as practical reasons: it's considered unethical to give a suicidal person an experimental drug, and it's really impractical to have patients dying during your trial.Indeed the recorded rate of suicidality in these trials is incredibly tiny: only 0.5% of the psychiatric patients experienced any suicidal ideation or behaviour at all(1). The other 99.5% never so much as thought about it, apparently. If that were representative of the real world it would be great; unfortunately it isn't. Yet what this all means is that antidepressants could not possibly reduce suicidality in these trials, because there's just nothing there to reduce. Even if, in the real world, they prevent loads of suicides, these trials wouldn't show it.How do you investigate the effects of drugs "in the real world"? By observational studies - instead of recruiting people for a trial, you just look to see what happens to people who are prescribed a certain drug by their doctor. Observational studies have strengths and weaknesses. They're not placebo controlled, but they can be much larger than trials, and they can study the full spectrum of patients.Observational studies have found very little evidence suggesting that antidepressants cause suicide. Most strikingly, since 1990 when SSRIs were introduced, antidepressant sales have increased enormously, and the suicide rate has fallen steadily; this is true of all Western countries.More detailed analyses of antidepressant sales vs. suicide rates across time and location have generally either found either no effect, or a small protective effect, of antidepressant sales(1,2,3, many others). In the past few years, concern over suicidality has led to a fall in antidepressant use in adolescents in many countries: but there is no evidence that this reduced the adolescent suicide rate(1,2).Another observational approach is to see whether people who have actually died by suicide were taking SSRIs at the time of death. 
Australian psychiatrists Dudley et al. have just published a review of the evidence on this question. They found that out of a total of 574 adolescent suicide victims from the USA, Britain, and Scandinavia, only 9 (1.5%) were taking an SSRI when they died. In other words, the vast majority of youth suicides occur in people not taking SSRIs, which sets a very low upper limit on the number of suicides that could be caused by SSRIs.

So what does all this mean? As I said, it's very controversial, but here's my take, with the standard caveat that I'm just some guy on the internet.

The evidence from randomized controlled trials is clear: SSRIs can cause suicidality, including suicide attempts, in some people, mostly people below age 25. The chance of this happening is below 1% according to the trials, but that is still worrying given how many people take antidepressants. However, the use of antidepressants on a truly massive scale has not led to any rise in the suicide rate in any age group. This implies that, overall, antidepressants prevent at least as many suicides as they cause.

My conclusion is that the clinical trials are not much use when it comes to knowing what will happen to any individual patient. The evidence is that antidepressants could worsen suicidality, or they could reduce it. This is hardly a satisfactory conclusion for people who want neat and tidy answers, but there aren't many of those in psychiatry. For patients, the implication is, boringly, that we should follow the instructions on the packet: be vigilant for suicidality, but don't stop taking antidepressants except on a doctor's orders.
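The trial figures quoted above can be turned into the standard epidemiological quantities. A quick sketch, using only the numbers cited in this post (1.1 vs. 2.7 suicide attempts per 1,000 patients); the variable names are mine, not from any of the cited studies:

```python
# Pooled suicide-attempt rates from the trial data cited above:
# 1.1 per 1,000 on placebo vs. 2.7 per 1,000 on antidepressants.
placebo_rate = 1.1 / 1000
drug_rate = 2.7 / 1000

# Relative risk: how many times more likely an attempt is on the drug.
relative_risk = drug_rate / placebo_rate

# Absolute risk difference: extra attempts per patient treated.
risk_difference = drug_rate - placebo_rate

# Number needed to harm: patients treated per one extra attempt.
number_needed_to_harm = 1 / risk_difference

print(f"Relative risk:         {relative_risk:.2f}")                  # ~2.45
print(f"Risk difference:       {risk_difference * 1000:.1f} per 1,000")  # 1.6
print(f"Number needed to harm: {number_needed_to_harm:.0f}")          # ~625
```

The point of the arithmetic is the scale argument made in the post: a roughly 2.5-fold relative risk sounds alarming, but the absolute difference is about 1.6 extra attempts per 1,000 patients, which is why it only becomes visible when antidepressants are prescribed to millions of people.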
Dudley, M., Goldney, R., & Hadzi-Pavlovic, D. (2010) Are adolescents dying by suicide taking SSRI antidepressants? A review of observational studies. Australasian Psychiatry, 18(3), 242-245. DOI: 10.3109/10398561003681319
It is well documented that Thomas Robert Malthus' An Essay on the Principle of Population greatly influenced both Charles Darwin's and Alfred Russel Wallace's independent conceptions of the theory of natural selection. In it, Malthus puts forward his observation that the finite nature of resources is in conflict with the potentially exponential rate of reproduction, [...]
For some time now, evolutionary biologists have used phylogenetics: a well-established, powerful set of tools for testing evolutionary hypotheses. More recently, however, these methods have been imported to analyse linguistic and cultural phenomena. For instance, the use of phylogenetics has led to observations that languages evolve in punctuational bursts, explored [...]
Lycett, S.J., Collard, M., & McGrew, W.C. (2009) Cladistic analyses of behavioural variation in wild Pan troglodytes: exploring the chimpanzee culture hypothesis. Journal of Human Evolution, 57(4), 337-49. PMID: 19762070
Greenhill, S., Currie, T., & Gray, R. (2009) Does horizontal transmission invalidate cultural phylogenies? Proceedings of the Royal Society B: Biological Sciences, 276(1665), 2299-2306. DOI: 10.1098/rspb.2008.1944
A new study intriguingly suggests that human activities may not always be bad for biodiversity. Long before European colonizers arrived in South America, indigenous farmers belonging to the Arauquinoid cultures had already altered Amazonian biodiversity: their agricultural engineering methods changed the savannah ecosystem, resulting in increased biodiversity. So argues the paper 'Pre-Columbian agricultural landscapes, ecosystem engineers, and self-organized patchiness in Amazonia', published in Proceedings of the National Academy of Sciences (online before print, April 12th 2010), by Doyle McKey (Université de Montpellier II, France), Stéphen Rostain, José Iriarte, Bruno Glaser, Jago Jonathan Birk, Irene Holst, and Delphine Renard.

The savannahs of coastal Guyana tend to flood during the rains and are dry during the summer. Yet strange complexes of mounds run across these plains for 360 miles, from the Berbice River to Cayenne. Because of their regular symmetry, the mounds were deduced to be man-made, and they drained well during the rains and floods (their drainage capacity was nine times that of the seasonally flooded savannah). The authors conclude that these are large raised beds, or fields, built from the surrounding topsoil for cultivating crops by pre-Columbian farmers between about 700 and 1,000 years ago; this interpretation is further substantiated by soil samples containing microfossils of maize, cassava, and squash. The interesting point is that this farming was practised in wastelands considered unsuitable for agriculture, a feat achieved through effective agricultural engineering. When the fields were abandoned, the mounds were colonised by flora and fauna, creating a new ecosystem.
These 'ecosystem engineers' (ants such as Acromyrmex octospinosus and Ectatomma brunneum, termites of the Nasutitermitinae, and earthworms) built their nests on the raised beds so that their colonies wouldn't be flooded. Their burrowing aerated the soil further, helping it retain sufficient rainfall. Moreover, the mounds were fertilised as these animals concentrated organic matter in their nests and accumulated nutrients such as nitrogen, potassium, and calcium. As a result, the perennial plants on the mounds flourished, and their strong roots prevented the mounds from eroding. All of these human-initiated alterations have resulted in higher biodiversity than is seen in the surrounding savannahs.

This study should add impetus to the debate over whether much of the Amazon rainforest and its savannahs, commonly considered pristine, were in fact sites of significant human occupation, especially in pre-Columbian times. The authors suggest that this agricultural system could be a model for modern farming, especially considering its beneficial ecological effects. Although this is a striking example of a terrain modified by humans and maintained by nature, it must be noted that the increase in biodiversity was the result of 400-800 years of minimal or no human intervention. It is also worth noting that the 'punja' technique of rice cultivation, followed in parts of Kerala, has a very similar methodology.

link: http://www.pnas.org/content/early/2010/04/07/0908925107.abstract?sid=93317154-3ce6-43c8-9a1e-456e24272c2b
McKey D, Rostain S, Iriarte J, Glaser B, Birk JJ, Holst I, & Renard D. (2010) Pre-Columbian agricultural landscapes, ecosystem engineers, and self-organized patchiness in Amazonia. Proceedings of the National Academy of Sciences of the United States of America, 107(17), 7823-8. PMID: 20385814