Post List

All posts; Tags Include "Behavioral Neuroscience"


  • March 20, 2017
  • 11:25 AM
  • 114 views

Opioids, Benzos and Risk for Overdose

by William Yates, M.D. in Brain Posts

The evolving epidemic of opioid overdose and overdose deaths is receiving increased public and research attention. Opioid overdoses and overdose deaths are often unintentional or accidental. It has long been known that concurrent use of opioids with alcohol or benzodiazepines (i.e. Valium or Xanax) increases the risk for overdose toxicity.

A recent study published in the British Medical Journal confirmed the association of concurrent benzodiazepine prescription with opioid overdose. The research team examined confidential medical database records from over 500,000 patients in the U.S. Those who were enrolled in a medical plan including pharmaceutical benefits between 2001 and 2013 were included in the analysis. The key findings from the study included the following:

  • The percentage of opioid users concurrently using a benzodiazepine rose from 9% of opioid users in 2001 to 17% in 2013
  • Chronic users of opioids nearly doubled their risk of opioid overdose if they took a concurrent benzodiazepine medication (from about 4 per 100 person-years to 7-8 per 100 person-years)
  • If the association is causal, the authors estimate emergency room visits and inpatient admissions could be reduced by 15% by stopping concurrent prescriptions

This association of risk seems reasonable given the toxicities of opioids and benzodiazepines. At higher doses, both decrease respiratory drive, potentially contributing to hypoxia and death.

The authors note several take-home messages for clinicians. Chronic users of benzodiazepines should be prescribed opioids cautiously, if at all. Opioid prescriptions for chronic benzodiazepine patients should be for short periods of time and at low doses. Likewise, if chronic opioids are necessary, they should rarely be combined with intermittent or long-term benzodiazepine prescriptions.

Readers with more interest in this topic can access the free full-text manuscript by clicking on the PMID link in the citation below.

Photo of the NCAA men's basketball tournament in Tulsa, OK is from my files.

Follow me on Twitter @WRY999

Sun EC, Dixit A, Humphreys K, Darnall BD, Baker LC, & Mackey S (2017). Association between concurrent use of prescription opioids and benzodiazepines and overdose: retrospective analysis. BMJ (Clinical research ed.), 356. PMID: 28292769... Read more »
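To put the rates quoted above in perspective, here is a minimal arithmetic sketch in Python of the implied rate ratio and excess overdose events. The 7.5 figure is simply the midpoint of the reported 7-8 range, an assumption made for illustration rather than a number taken from the paper.

```python
# Overdose rates quoted in the post, per 100 person-years of chronic opioid use.
rate_opioid_only = 4.0   # chronic opioids without a benzodiazepine
rate_with_benzo = 7.5    # midpoint of the reported 7-8 range (assumption)

rate_ratio = rate_with_benzo / rate_opioid_only
excess_per_100_person_years = rate_with_benzo - rate_opioid_only

print(f"Rate ratio: {rate_ratio:.2f}x")  # ~1.9x, i.e. nearly doubled
print(f"Excess overdoses: {excess_per_100_person_years:.1f} per 100 person-years")
```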

  • March 15, 2017
  • 12:57 PM
  • 153 views

Emotional Intelligence and the Physician

by William Yates, M.D. in Brain Posts

Emotional intelligence (EI) is characterized by the ability to recognize emotional states in oneself and in others. This emotional recognition may be helpful in guiding behavior and in improving interpersonal relationships.

It seems logical on a face validity level to assume that higher levels of EI would be desirable when selecting students for medical school. However, there are few studies assessing EI in physicians, and fewer still that examine whether EI influences physician behavior, patient satisfaction and, ultimately, patient outcomes.

Rhamzan Shahid and colleagues at the Stritch School of Medicine at Loyola University Medical Center examined levels of EI in a group of resident physicians in training in the specialties of pediatrics and combined internal medicine-pediatrics. This was a cross-sectional study design that compared residents in the first two years of training with those in years 3 and 4. The main findings included the following:

  • Residents tended to score high on EI overall, with the highest scores on impulse control and the interpersonal composite subscale
  • Residents scored relatively lower on the assertiveness and independence subscales
  • Assertiveness subscale scores were higher in the more senior residents
  • Empathy scores were lower in the more senior residents

Increased assertiveness subscale scores in more senior residents might be a good thing, possibly indicating growth in confidence and skill level. This cannot be stated definitively, as the study was not longitudinally designed.

The lower empathy subscale scores in senior residents are an interesting finding. Some might argue this is a negative consequence of training and reflects an increasing disenchantment with being a physician. The authors of the study encourage interventions to "ensure they (resident physicians) do not lose empathy".

However, it may be that in a group selected for high empathy, a reduction also represents a normal maturational process. Maybe high empathy contributes to greater physician distress in the clinical setting and potentially more burnout and depression. Maybe empathy levels that are too high produce emotional states that actually impair physician behavior and reduce the effectiveness of clinical decision making.

These possibilities should prompt studies correlating EI with patient satisfaction, patient outcomes, physician satisfaction with medicine and their specialty, and risk for physician burnout.

The manuscript reviewed and commented on today is available in free full-text format by clicking on the link in the citation below.

Follow me on Twitter @WRY999

Photo of the lesser scaup duck is from my personal photography files.

Shahid, R., Stirling, J., & Adams, W. (2016). Assessment of Emotional Intelligence in Pediatric and Med-Peds Residents. Journal of Contemporary Medical Education, 4 (4). DOI: 10.5455/jcme.20170116015415... Read more »

  • March 7, 2017
  • 11:01 AM
  • 148 views

Can Older Drivers Benefit From Training?

by William Yates, M.D. in Brain Posts

Older drivers are over-represented in motor-vehicle accidents. The lowest rate of fatal vehicle crashes per 100 million miles driven is found in drivers between the ages of 30 and 69. Fatal crash rates per mile driven are 4 to 5 times higher in drivers over 80 years of age (IIHS.org data).

So can older drivers be trained or educated to improve their safety (and the safety of those around them)? A recent randomized controlled trial examined an educational intervention in drivers 75 years and older in Australia. This intervention targeted a reduction or avoidance of seven high-risk driving situations:

  • Night driving
  • Driving in the rain
  • Right-hand (left-hand in the U.S.) turns across oncoming traffic
  • Driving during heavy traffic
  • Driving on high-speed roads
  • Driving during rush hour
  • Driving alone

The trial found participants in the intervention group showed a greater readiness to make changes that could reduce high-risk accident exposure. However, the intervention group did not reduce total miles driven in the year following the intervention. Additionally, the intervention group did not increase their use of alternate transportation (i.e. buses or cabs) in the follow-up period.

This trial showed a limited response to educational training in an older driver population. Alternate approaches (adoption of vehicles with advanced safety features, use of newer alternative driving programs like Uber, cognitive training programs to improve psychomotor speed) may hold more promise in reducing fatal accident rates in elderly populations.

Follow me on Twitter @WRY999

Photo of the moon is from my photography files.

Coxon K, Chevalier A, Brown J, Clarke E, Billot L, Boufous S, Ivers R, & Keay L (2016). Effects of a Safe Transportation Educational Program for Older Drivers on Driving Exposure and Community Participation: A Randomized Controlled Trial. Journal of the American Geriatrics Society. PMID: 27943260... Read more »

  • March 5, 2017
  • 05:37 PM
  • 266 views

Do you crave music like you crave a cookie?

by Kiralee Musgrove in Neuroscientist on music

Imagine there is a cookie sitting in front of you. You are hungry. You have been on a diet for months. If you have to look at one more raw, paleo, gluten-free snack you are going to scream.... Read more »

  • March 2, 2017
  • 12:19 PM
  • 161 views

Improving Hearing-Aid Access in Older Adults

by William Yates, M.D. in Brain Posts

There are significant barriers to widespread use of hearing aids in older adults with age-related hearing loss. Sensitivity to the stigma of wearing a hearing aid is one barrier. Cost is another significant barrier: in the U.S., a bilateral hearing-aid purchase costs $2,400 to $5,800, and this cost is typically not covered by Medicare or other health insurance plans.

I ran into an interesting manuscript looking at an alternative, less costly approach to hearing-aid selection and purchase. Larry Humes and colleagues compared the outcomes of older adults randomized to one of three hearing-aid interventions. One was standard audiology best practices, one was a placebo hearing aid (a device without amplification), and a third was an over-the-counter (OTC) intervention. The OTC intervention included the following elements:

  • Self-selection of hearing-aid tips, tubes and devices
  • Three types of hearing aids were provided for selection, each programmed with one of the three most common patterns of hearing loss
  • Subjects tried various combinations of devices and listened to sample sounds of speech, music and environmental sounds
  • Subjects were assessed after a six-week trial for hearing function, satisfaction and desire to keep the device

Subjects were also randomized to pay a fee of $3,500 versus $600 for devices that were identical in features. This allowed for study of the effect of cost on outcome measures.

Interestingly, the OTC intervention resulted in outcomes (i.e. hearing improvement) that were very similar to audiology best practices. However, OTC subjects showed a slightly lower satisfaction score and were somewhat more likely to return devices after the study for a refund. A higher price also predicted return for refund following the study.

The authors conclude:

"Efficacious OTC service-delivery models (and devices) may increase accessibility and affordability of hearing aids for millions of older adults, but further research is required to evaluate various devices and approaches as well as to examine the generalization of the findings from this clinical trial."

This study provides an impetus for further study of the OTC model in hearing-aid selection and use. Cost appears to remain a significant barrier to wider hearing-aid access.

Readers with more interest in this study can access the free full-text manuscript by clicking on the link in the citation below.

Photo of the wood duck is from my photography files.

Follow me on Twitter @WRY999

Humes, L., Rogers, S., Quigley, T., Main, A., Kinney, D., & Herring, C. (2017). The Effects of Service-Delivery Model and Purchase Price on Hearing-Aid Outcomes in Older Adults: A Randomized Double-Blind Placebo-Controlled Clinical Trial. American Journal of Audiology. DOI: 10.1044/2017_AJA-16-0111... Read more »

  • February 27, 2017
  • 05:29 AM
  • 173 views

Know your brain: Mammillary bodies

by neurosci in Neuroscientifically Challenged

Where are the mammillary bodies?

The mammillary bodies are part of the diencephalon, which is a collection of structures found between the brainstem and cerebrum. The diencephalon includes the hypothalamus, and the mammillary bodies are found on the inferior surface of the hypothalamus (the side of the hypothalamus that is closer to the brainstem). The mammillary bodies are a paired structure, meaning there are two mammillary bodies---one on either side of the midline of the brain. They get their name because early anatomists thought they had a breast-like shape. The mammillary bodies themselves are sometimes each divided into two nuclei, the lateral and medial mammillary nuclei. The medial mammillary nucleus is the much larger of the two, and is often subdivided into several subregions.

What are the mammillary bodies and what do they do?

The mammillary bodies are best known for their role in memory, although in the last couple of decades they have also come to be recognized as being involved in other functions like maintaining a sense of direction. The role of the mammillary bodies in memory has been acknowledged since the late 1800s, when mammillary body atrophy was observed in Korsakoff's syndrome---a disorder characterized by amnesia and usually linked to a thiamine deficiency. Since then a number of findings---anatomical, clinical, and experimental---have supported and expanded upon a mnemonic role for the mammillary bodies.

The mammillary bodies are directly connected to three other brain regions: the hippocampus via the fornix, the thalamus (primarily the anterior thalamic nuclei) via the mammillothalamic tract, and the tegmental nuclei of the midbrain via the mammillary peduncle and mammillotegmental tract. Two of the three connections are thought to primarily carry information in one direction: the hippocampal connections carry information from the hippocampus to the mammillary bodies, and the thalamic connections carry information from the mammillary bodies to the thalamus (the tegmental connections are reciprocal). These connections earned the mammillary bodies the reputation of being relay nuclei that pass information from the hippocampus on to the anterior thalamic nuclei to aid in memory consolidation. This hypothesis is supported by the fact that damage to pathways connecting the mammillary bodies to the hippocampus or thalamus is associated with deficits in consolidating new memories. Others argue, however, that the mammillary bodies act as more than a simple relay, making independent contributions to memory consolidation. Both perspectives emphasize a role for the mammillary bodies in memory but differ as to the specifics of that role.

Further supporting a role for the mammillary bodies in memory, there is evidence from humans that damage to the mammillary bodies is associated with memory deficits. Several cases of brain damage involving the mammillary bodies, as well as cases of tumor-related damage to the area, suggest that damage to the mammillary bodies is linked to anterograde amnesia. Indeed, mammillary body dysfunction has been identified as a major factor in diencephalic amnesia, a type of amnesia that originates in the diencephalon (Korsakoff's syndrome, an amnesia seen primarily in long-term alcoholics, is one type of diencephalic amnesia).

Experimental evidence from animal studies also underscores the importance of the mammillary bodies in memory. Studies with rodents and monkeys have found deficits in spatial memory after damage to the mammillary bodies or the mammillothalamic tract. In addition to their involvement in memory, there are cells in the mammillary bodies that are activated only when an animal's head is facing in a particular direction. These cells are thought to be involved in navigation and may act somewhat like a compass in creating a sense of direction.

Vann SD, & Aggleton JP (2004). The mammillary bodies: two memory systems in one? Nature Reviews Neuroscience, 5 (1), 35-44. PMID: 14708002... Read more »

Vann SD, & Aggleton JP. (2004) The mammillary bodies: two memory systems in one?. Nature reviews. Neuroscience, 5(1), 35-44. PMID: 14708002  

  • February 3, 2017
  • 11:22 AM
  • 93 views

Brain Shape and Personality Type

by William Yates, M.D. in Brain Posts

Personality has often been conceptualized as a human feature shaped largely by nurture and environment. Unlike major neuroscience medicine disorders, personality features have been considered less influenced by brain structure and genetic influences. A recent brain structure (morphology) study puts these assumptions at risk.

Roberta Riccelli, along with colleagues in Italy and at Florida State University, studied brain structural features across 507 participants in the Human Connectome Project. All subjects completed a personality assessment using the five-factor model (FFM), a widely validated measure of five personality features. Here were the key findings from the study for each personality feature:

  • High neuroticism: increased cortical thickness in the supramarginal gyrus, superior parietal cortex, superior temporal cortex, superior prefrontal cortex and frontal pole. Also decreased surface area of the superior parietal cortex, middle temporal gyrus, cuneus, superior prefrontal cortex and frontal pole.
  • High extraversion: increased cortical thickness in the precuneus and lower surface area and volume of the superior temporal gyrus. Also lower cortical volume of the entorhinal cortex and greater folding in the fusiform gyrus.
  • High openness: lower cortical thickness in the postcentral gyrus, rostral anterior cingulate cortex, superior prefrontal cortex and lateral occipital gyrus. Increased surface area, cortical volume and folding in a series of parietal, temporal and frontal regions.
  • High agreeableness: decreased cortical thickness, surface area, cortical volume and local gyrus formation in frontotemporal regions. Increased local gyrus formation in the inferior temporal gyrus.
  • High conscientiousness: increased cortical thickness in the prefrontal cortex along with lower surface area and cortical volume in middle/inferior temporal gyrus and lateral occipital gyrus regions. Decreased cortical folding in the prefrontal gyrus and several other regions.

These findings are notable for the diffuse number and types of structural correlates of personality features in humans. The authors note the importance of the prefrontal cortex in personality, a region "significantly evolved in human beings and great apes relative to other species. This could reflect that several FFM personality traits are linked to high-level socio-cognitive skills as well as the ability to modulate 'core' affective responses."

Look for more structural, functional and genetic study of the five-factor model of personality in humans to come. Readers with more interest in this research can access the free full-text manuscript by clicking on the PMID link in the citation below.

Follow the author on Twitter @WRY999

Photo of a bald eagle head in profile is from the author's bird photography file.

Riccelli R, Toschi N, Nigro S, Terracciano A, & Passamonti L (2017). Surface-based morphometry reveals the neuroanatomical basis of the five-factor model of personality. Social Cognitive and Affective Neuroscience. PMID: 28122961... Read more »

  • January 25, 2017
  • 12:06 PM
  • 284 views

Jet Lag and Baseball (MLB) Performance

by William Yates, M.D. in Brain Posts

Abrupt changes in the biological clock or circadian rhythm are known to contribute to significant cognitive and psychomotor impairments. One practical area where this effect may be important is sports performance.

Alex Song and colleagues recently completed an interesting study of Major League Baseball (MLB) performance related to team travel patterns. The major leagues are divided into regional divisions (western, central and eastern) to minimize the length of travel to and from away games. Nevertheless, significant travel continues to be part of the league, and travel across the four U.S. time zones is not uncommon.

This study included all MLB games between 1992 and 2011. Baseball performance across a variety of variables was examined with attention to effects linked to travel and jet lag. The key findings from the study included:

  • The most jet lag-related performance impairment was noted for travel from west to east
  • Both eastward and westward travel produced an increase in home runs allowed
  • Offensive slugging percentage declines after teams travel home from away games

The authors conclude that starting pitchers appear to be impacted by travel, giving up more home runs following jet lag. They suggest teams might want to consider sending projected starting pitchers to away games a few days prior to the team's arrival.

This is an important manuscript that demonstrates potential practical psychomotor effects of jet lag. Readers with more interest in the topic can access the free full-text manuscript by clicking on the PMID link in the citation below.

Follow me on Twitter @WRY999

Photo of Albert Pujols hitting in spring training is from the author's files.

Song A, Severini T, & Allada R (2017). How jet lag impairs Major League Baseball performance. Proceedings of the National Academy of Sciences of the United States of America. PMID: 28115724... Read more »

Song A, Severini T, & Allada R. (2017) How jet lag impairs Major League Baseball performance. Proceedings of the National Academy of Sciences of the United States of America. PMID: 28115724  

  • December 8, 2016
  • 05:59 AM
  • 412 views

Know your brain: Septum

by neurosci in Neuroscientifically Challenged

Where is the septum?

The term septum, when used in reference to the brain (it is a common anatomical term used to refer to a partition), indicates a subcortical structure in the forebrain that is found near the midline of the brain. The septum in humans can be separated into two structures: the septum pellucidum and the septum verum. Each of these is sometimes referred to on its own as the "septum," which can make references to the structure a bit confusing.

The septum pellucidum, which is Latin for "translucent wall," is a thin, almost transparent membrane that runs down the middle of the brain from the corpus callosum to the area of a large fiber bundle called the fornix. The septum pellucidum acts as a partition between a portion of the lateral ventricles, forming part of the walls of the anterior region of the lateral ventricles. It is a thin, two-layered structure that consists of white matter, some neurons, fiber bundles, and blood vessels. The septum pellucidum is surrounded by neurons that make up the septum verum, which consists of assorted nuclei commonly referred to as the septal nuclei. The septal nuclei themselves are often categorized based on location and are split up into lateral and medial (and sometimes additional caudal and ventral) divisions.

What is the septum and what does it do?

Little is known about the functional role of the septum pellucidum, and it is often treated as simply an anatomical barrier in many discussions of the septum. However, its connections with the hippocampus and hypothalamus suggest a role at least as a relay station between these structures. Although abnormalities of the septum pellucidum are associated with several neurological conditions, it is at this point unclear what role, if any, the septum pellucidum plays in directly generating the symptoms of such disorders.

A bit more is known about the actions of the septal nuclei, which seem to be involved in a variety of functions, although their exact role in many of these functions is still relatively poorly understood. Additionally, most of what we do know about the septal nuclei comes from animal studies, as there is little research available on the functions of the septal nuclei in humans.

The septal nuclei are considered part of the limbic system, a group of subcortical structures that are often linked to emotion but are really involved in a long list of functions in the human brain. The septal nuclei receive afferent (i.e. incoming) connections from other limbic structures like the hippocampus, amygdala, and hypothalamus, as well as the dopamine-rich ventral tegmental area. The septal nuclei also send projections to the hippocampus, habenula, thalamus, ventral tegmental area, and hypothalamus.

One of the first functional roles to be linked to the septal nuclei was an involvement in processing rewarding experiences. In a now-famous group of experiments, researchers James Olds and Peter Milner found that electrical stimulation of the septal nuclei and several other areas of the brain seemed to be rewarding to rats. Rats, in fact, responded more strongly to stimulation in the septal region than in any other part of the brain studied, leading Olds and Milner to hypothesize that the septal region was perhaps the locus of the reward system.

Although our understanding of the reward system has since expanded and less importance has been placed on the septal nuclei in favor of other structures like the ventral tegmental area and nucleus accumbens, the septal nuclei are still thought to potentially play a role in reward processing. Neurons in the septal nuclei give rise to axons that travel in the medial forebrain bundle, a collection of fibers that connects the nuclei with the hypothalamus and ventral tegmental area. The medial forebrain bundle is an important part of the reward system, thought to stimulate dopamine neurons in the ventral tegmental area in response to rewarding stimuli.

The septal nuclei are also densely interconnected with the hippocampus, and through these connections may play a role in learning and memory. The septal nuclei and hippocampus are sometimes referred to as the septo-hippocampal complex, and projections to the hippocampus (which travel in a fiber bundle called the fornix and are often called septohippocampal fibers) are some of the largest projections from the septal region. Although the precise role of the septal nuclei in memory functions is not yet clear, the hippocampus receives the majority of its acetylcholine projections from the septal region. These neurons are activated during learning and degenerate in conditions like Alzheimer's disease that are characterized by disruptions in memory processes.

The septal nuclei have been implicated in a number of other roles such as social behavior and the expression of fear, and abnormalities in septal functioning have been linked to a variety of disorders ranging from depression to schizophrenia. The septal area, however, including both the septum pellucidum and the septum verum/septal nuclei, is still relatively poorly understood; it will take more research to fully elucidate its functions and influence on behavior.

Sheehan, T., Chambers, R., & Russell, D. (2004). Regulation of affect by the lateral septum: implications for neuropsychiatry. Brain Research Reviews, 46 (1), 71-117. DOI: 10.1016/j.brainresrev.2004.04.009... Read more »

  • December 6, 2016
  • 12:07 PM
  • 391 views

Online Insomnia Therapy Effective in Clinical Trial

by William Yates, M.D. in Brain Posts

Insomnia of sufficient severity to meet clinical significance is estimated to affect up to 20% of the general population. This makes insomnia an important public health challenge. Effective, inexpensive and accessible programs to treat insomnia are needed.

One recent controlled clinical trial supports the promise of an online intervention that incorporates key elements of cognitive behavioral therapy (CBT). Lee Ritterband and colleagues at the University of Virginia recently published a controlled clinical trial of online CBT in 303 adults with chronic insomnia. The key elements of their study design included the following:

  • Subjects: Adults with chronic insomnia, defined as 30 minutes of insomnia (at onset or during the night) 3 nights per week for the last 6 months. Subjects were also required to have total sleep times of 6.5 hours per night or less. Additionally, the insomnia was required to produce significant distress or impairment in function. Subjects had to have a reliable source of access to the internet.
  • Intervention: The experimental intervention was an online automated program known as "Sleep Healthy Using the Internet" (SHUTi). This is a weekly internet-based program lasting 6 weeks that mimics elements of face-to-face CBT for insomnia. The control intervention consisted of a non-tailored internet-based informational program about insomnia.
  • Outcome Measures: A self-report measure known as the Insomnia Severity Index (ISI) along with sleep diaries.

The study demonstrated a statistically significant improvement in multiple sleep measures, including the ISI score, duration of sleep-onset insomnia and duration of wake time after sleep onset, with SHUTi compared to control. SHUTi subjects showed a reduction in sleep-onset insomnia time from an average of around 45 minutes at baseline to about 20 minutes at one-year follow-up.

Interestingly, there was no difference between the experimental treatment and control in total sleep time. However, both groups showed about a 50-minute increase in total sleep time at one year of follow-up.

This study is important because it not only demonstrated a significant therapeutic effect for SHUTi, but this effect was maintained for a full year. This supports the durability of the therapeutic effect for this intervention.

The authors note limitations to the study, including a sample that tended to be highly educated with internet access. Additionally, complete blinding was impossible, as some subjects likely could guess their assignment based on the content of their internet intervention.

Readers with more interest in the SHUTi program can access the official site HERE. The site allows completing the online program for a fee of $135. Readers can access the free full-text manuscript by clicking on the link located in the citation below.

Follow me on Twitter @WRY999

Photo of Christmas lights at Rhema in Tulsa, Oklahoma is from my files.

Ritterband LM, Thorndike FP, Ingersoll KS, Lord HR, Gonder-Frederick L, Frederick C, Quigg MS, Cohn WF, & Morin CM (2016). Effect of a Web-Based Cognitive Behavior Therapy for Insomnia Intervention With 1-Year Follow-up: A Randomized Clinical Trial. JAMA Psychiatry. PMID: 27902836... Read more »

  • November 21, 2016
  • 11:10 AM
  • 403 views

Benefits of Physical Activity in Parkinson's Disease

by William Yates, M.D. in Brain Posts

Parkinson's disease (PD) is a progressive neurodegenerative disorder estimated to affect 7 to 10 million individuals worldwide. The primary mechanism in Parkinson's disease is a reduction in the neurotransmitter dopamine in the substantia nigra, a midbrain region highlighted in red in the figure. PD impairs motor and cognitive functions and leads to significant decline in psychosocial functioning.

Drugs for PD can be effective in reversing and slowing the progression of the illness. However, response is often limited, with relapse over time. Physical exercise appears to be an adjunctive option in the multidisciplinary treatment of PD.

Martine Lauzé, along with colleagues in France and the U.S., recently conducted a literature review of this topic. Their review covered 106 papers published between 1981 and 2015, and a total of 868 outcome measures were examined. The key findings from their review included:

  • Physical activity is most effective in improving physical capabilities
  • Physical activity is also effective in improving physical and cognitive functional capacity
  • Physical activity appears to improve flexibility
  • Lesser response to physical activity was found in the domains of PD clinical symptoms, depression and psychosocial function

They noted that the clinical PD symptoms of bradykinesia, freezing of gait and tremor were very resistant to physical activity interventions. On a more hopeful note, gait and postural alterations in PD are more responsive to physical activity.

Physical therapy protocols that appear to have the best chance of producing improvement in PD include:

  • Gait training and walking for increased speed and endurance
  • Strength training to improve muscle mass in the legs and arms
  • Flexibility exercises for the upper and lower extremities and trunk
  • Interventions to reduce the risk of falls (balance)

The evidence supports routine referral for physical therapy and physical rehabilitation in those suffering from Parkinson's disease.

You can access the free full-text manuscript in this post by clicking on the citation link below.

The figure in this post is a Creative Commons file from Wikipedia, authored by Madhero88 - Own work, CC BY-SA 3.0, https://commons.wikimedia.org/w/index.php?curid=7157181

Follow me on Twitter.

Lauzé M, Daneault JF, & Duval C (2016). The Effects of Physical Activity in Parkinson's Disease: A Review. Journal of Parkinson's Disease, 6 (4), 685-698. PMID: 27567884... Read more »

Lauzé M, Daneault JF, & Duval C. (2016) The Effects of Physical Activity in Parkinson's Disease: A Review. Journal of Parkinson's disease, 6(4), 685-698. PMID: 27567884  

  • October 15, 2016
  • 05:47 AM
  • 422 views

Know your brain: Suprachiasmatic nucleus

by neurosci in Neuroscientifically Challenged

Where is the suprachiasmatic nucleus?

The suprachiasmatic nucleus is represented by a small green area within the hypothalamus (indicated by red arrow).

The suprachiasmatic nuclei are two small, paired nuclei found in the hypothalamus. Each suprachiasmatic nucleus contains only approximately 10,000 neurons. The nuclei rest on each side of the third ventricle, just above the optic chiasm. This location provides the rationale for the structure's name, as supra means above and chiasmatic refers to its proximity to the optic chiasm.

What is the suprachiasmatic nucleus and what does it do?

Circadian rhythms are biological patterns that closely follow a 24-hour cycle. The term circadian comes from the Latin for around (circa) and day (diem), and circadian rhythms govern a large number of biological processes including sleeping, eating, drinking, and hormone release. In the 1960s, researchers noticed that damage to the anterior hypothalamus of the rat caused a disruption in the animal's circadian rhythms. Several years later, the specific hypothalamic nucleus whose integrity was necessary for maintaining circadian rhythms was identified as the suprachiasmatic nucleus.

We now know that the suprachiasmatic nucleus houses a type of biological clock that is able to keep our circadian rhythms on close to a 24-hour cycle, even without the help of external cues like daylight. Thus, if you were to lock someone in a room with no external light and no other way of telling the time, her body would still maintain a circadian rhythm of around 24 hours. The mechanism that regulates this biological clock was first elucidated in Drosophila, more commonly known as the fruit fly.

In Drosophila, the timekeeping of the circadian clock appears to be controlled by a cycle of gene expression that has an ingenious negative feedback mechanism built into it. Clock neurons in Drosophila produce two proteins called Clock and Cycle. Clock and Cycle bind together and act to promote the expression of two genes called period (per) and cryptochrome (cry). The protein products of these two genes, Per and Cry, then bind together and proceed to inhibit the actions of Clock and Cycle, an action which in turn suppresses the production of Per and Cry. Gradually, however, the Per and Cry proteins begin to break down. When Per and Cry degrade fully, Clock and Cycle are free to act again; they go back to promoting the expression of per and cry, starting the cycle anew. The process consistently takes around 24 hours to complete before it repeats. Thus, this cycle of gene expression is thought to act as the molecular clock in Drosophila clock neurons.

This understanding of the timekeeping mechanism in Drosophila laid the foundation for elucidating the process in mammals, where it is thought to be similar but more complex. The mammalian version of Cycle is called BMAL1 (which stands for brain and muscle ARNT-like 1). When CLOCK and BMAL1 bind together, they enhance the transcription of a family of genes that includes multiple period (Per1, Per2, and Per3) and cryptochrome (Cry1 and Cry2) genes. The resultant PER and CRY proteins form complexes with other proteins to inhibit the activity of CLOCK and BMAL1 until the PER and CRY proteins degrade (as above). We know that this process of gene expression and inhibition acts to keep a 24-hour clock within the neurons of the suprachiasmatic nucleus, but less is known about how the timekeeping within these neurons leads to the regulation of rhythmic activity throughout the body. It is believed, however, that the molecular clocks found within the neurons of the suprachiasmatic nuclei regulate neural activity within the nuclei, which in turn coordinates the activity of multiple signaling pathways as well as the stimulation of projections to neuroendocrine neurons in the hypothalamus involved with hormone release.

Additionally, the suprachiasmatic nucleus helps to maintain circadian rhythms by coordinating the timing of billions of other circadian clocks found in cells throughout the rest of the brain and body. Not long after the discovery of the suprachiasmatic nucleus, it was learned that similar types of molecular clocks exist in most other peripheral tissues and in many areas of the brain. These clocks, sometimes called slave oscillators (while the suprachiasmatic nucleus is considered the master oscillator), appear to depend on signals generated by the suprachiasmatic nucleus to synchronize their timekeeping with that of the suprachiasmatic nucleus. These signals can be associated with rhythms that the suprachiasmatic nucleus helps to establish, like feeding patterns and rest and activity behaviors, or with direct neuronal or hormonal output from the suprachiasmatic nucleus.

Although the suprachiasmatic nucleus is capable of maintaining circadian rhythms independently of any environmental signals (e.g. daylight), it does rely on cues from the environment to make adjustments to the circadian clock. For example, when you fly across multiple time zones, your body's circadian clock becomes significantly out of sync with the timing of the day (e.g. your body might be preparing for sleep when it is still light out). To make adjustments to the circadian clock in such instances, the suprachiasmatic nucleus relies on information it receives from the retina about light in the environment. Such information travels from the retina to the suprachiasmatic nucleus along a path called the retinohypothalamic tract. Additional inputs to the suprachiasmatic nucleus provide more information about light in the environment and other non-photic information about time of day to help adjust the circadian clock.

Due to the importance of our circadian rhythms to normal functioning, the integrity of the suprachiasmatic nucleus is essential to health. Disrupted function of the suprachiasmatic nucleus is being explored as a potential influence in a variety of psychiatric disorders as well as a factor in age-related decline in healthy sleep. Thus, although we have much more to learn about the suprachiasmatic nucleus, it is clear that it plays a critical role in healthy brain and bodily function.

Colwell, C. (2011). Linking neural activity and molecular oscillations in the SCN. Nature Reviews Neuroscience, 12 (10), 553-569. DOI: 10.1038/nrn3086

Dibner, C., Schibler, U., & Albrecht, U. (2010). The mammalian circadian timing system: organization and coordination of central and peripheral clocks. Annual Review of Physiology, 72 (1), 517-549.... Read more »
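The gene-expression loop described above is, at its core, a delayed negative feedback circuit. The sketch below, in Python, simulates a deliberately simplified two-variable version of such a loop; the rate constants, delays, and Hill exponent are invented for illustration (chosen so the period comes out on the order of a day) and are not measured values for the Per/Cry system.

```python
# Toy delayed negative-feedback oscillator, loosely inspired by the per/cry loop:
# active CLOCK/CYCLE drives per transcription; PER protein (after a delay for
# translation and nuclear entry) represses that transcription; both species decay.
# All parameters below are illustrative assumptions, not measured rate constants.

DT = 0.01                 # integration step (hours)
HOURS = 240               # simulate 10 days
TAU_M = 4.0               # delay before the repressor reaches the promoter (h)
TAU_P = 4.0               # delay between mRNA and mature repressor protein (h)
K_M, K_P = 1.0, 1.0       # maximal transcription / translation rates
DEG_M, DEG_P = 0.5, 0.5   # degradation rates (1/h)
P0, HILL = 0.5, 4.0       # repression threshold and steepness

steps = int(HOURS / DT)
lag_m, lag_p = int(TAU_M / DT), int(TAU_P / DT)
m = [0.1] * steps         # per mRNA level over time
p = [0.1] * steps         # PER protein level over time

for t in range(1, steps):
    p_past = p[max(t - 1 - lag_m, 0)]   # protein level "seen" by the promoter
    m_past = m[max(t - 1 - lag_p, 0)]   # mRNA level feeding translation
    transcription = K_M / (1.0 + (p_past / P0) ** HILL)   # repressed by PER
    m[t] = m[t - 1] + DT * (transcription - DEG_M * m[t - 1])
    p[t] = p[t - 1] + DT * (K_P * m_past - DEG_P * p[t - 1])

# Estimate the oscillation period from successive peaks of the protein trace,
# skipping the first half of the run to let transients die out.
peaks = [i * DT for i in range(steps // 2, steps - 1)
         if p[i] > p[i - 1] and p[i] >= p[i + 1]]
if len(peaks) > 1:
    intervals = [b - a for a, b in zip(peaks, peaks[1:])]
    print(f"mean peak-to-peak interval: {sum(intervals) / len(intervals):.1f} h")
```

The point of the toy model is only to show why delay plus repression plus degradation is enough to produce sustained, roughly daily oscillations, which is the qualitative logic the post attributes to the Clock/Cycle and Per/Cry cycle.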

  • October 14, 2016
  • 11:50 AM
  • 386 views

Pathways to Substance Use and Abuse

by William Yates, M.D. in Brain Posts

Neuroscience medicine clinicians encounter patients every day who have both a mental disorder and a substance use disorder. This co-occurrence, or comorbidity, complicates diagnosis, treatment and outcome. The exact mechanism behind this comorbidity is unclear.

A recent study out of Washington University in St. Louis and King's College London provides some insight into this comorbidity issue. They examined participants in the Study of Addiction: Genetics and Environment (SAGE). These subjects provided genetic samples and psychiatric interviews to the research team.

Five psychiatric disorders were studied: attention deficit hyperactivity disorder (ADHD), autism spectrum disorder, major depression, bipolar disorder and schizophrenia. An initial finding ruled out any link between genetic risk for autism spectrum disorder and any substance use/abuse risk.

The remaining four psychiatric disorders did increase risk for substance use and abuse in a general manner. This means genetic risk for ADHD, bipolar disorder, major depression and schizophrenia all contribute to a general risk for substance use/abuse across all drug categories. Additionally, the team reported some specific drug use/abuse associations with individual genetic risk for ADHD, bipolar disorder, major depression and schizophrenia. These specific pathways included:

  • Major depression polygenic risk score and non-problem cannabis use
  • Major depression polygenic risk score and severe cocaine dependence
  • Schizophrenia polygenic risk score and non-problem cannabis use and severe cannabis dependence
  • Schizophrenia polygenic risk score and severe cocaine dependence

The take-home message from this study is that genetic risk for many psychiatric disorders also contributes to an increased risk for general substance use/abuse. Additionally, some psychiatric disorders appear to increase risk for specific substance use/abuse issues. Prevention, assessment and treatment services need to address this relationship and the needs of each component of illness in those with comorbidity.

Individuals with more interest in this topic can access the free full-text manuscript by clicking on the link in the citation below.

Follow me on Twitter @WRY999

Image is an original graphic produced by me based on content in the manuscript.

Carey CE, Agrawal A, Bucholz KK, Hartz SM, Lynskey MT, Nelson EC, Bierut LJ, & Bogdan R (2016). Associations between Polygenic Risk for Psychiatric Disorders and Substance Involvement. Frontiers in Genetics, 7. PMID: 27574527... Read more »
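For readers unfamiliar with the term, a polygenic risk score is essentially a weighted sum of risk-allele counts across many genetic variants. Here is a minimal sketch in Python; the SNP names, effect sizes, and genotypes are made-up illustrative values, not data from the SAGE sample or the cited paper.

```python
# Minimal polygenic risk score (PRS) sketch: score = sum over SNPs of
# (number of risk alleles carried, 0-2) x (effect size for that SNP).
# SNP identifiers, effect sizes, and genotypes below are invented for illustration.

gwas_effects = {            # hypothetical log odds ratios from a discovery GWAS
    "rs0000001": 0.12,
    "rs0000002": -0.05,
    "rs0000003": 0.08,
}

def polygenic_risk_score(genotype: dict[str, int]) -> float:
    """Weighted sum of risk-allele counts; SNPs missing from the genotype contribute nothing."""
    return sum(weight * genotype.get(snp, 0)
               for snp, weight in gwas_effects.items())

# One hypothetical subject carrying 0, 1, or 2 risk alleles at each SNP.
subject = {"rs0000001": 2, "rs0000002": 1, "rs0000003": 0}
print(f"PRS = {polygenic_risk_score(subject):.3f}")   # 2*0.12 + 1*(-0.05) + 0*0.08 = 0.19
```

In studies like this one, such scores (computed over many thousands of variants) are then tested for association with substance use outcomes.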

Carey CE, Agrawal A, Bucholz KK, Hartz SM, Lynskey MT, Nelson EC, Bierut LJ, & Bogdan R. (2016) Associations between Polygenic Risk for Psychiatric Disorders and Substance Involvement. Frontiers in genetics, 149. PMID: 27574527  

  • October 10, 2016
  • 11:54 AM
  • 420 views

Alzheimer's Disease: Atrophy Pattern and Symptoms

by William Yates, M.D. in Brain Posts

Memory impairment is a key symptom of Alzheimer's dementia common to patients with the condition. However, additional cognitive and behavioral symptoms vary between patients with a clinical and pathological diagnosis of Alzheimer's disease. A key area of research is focused on understanding factors that contribute to symptom variability in Alzheimer's disease.

A team of researchers from Singapore and Harvard Medical School recently published an important study on this topic. They analyzed structural MRI images from a group of 188 subjects in the Alzheimer's Disease Neuroimaging Initiative. Using a mathematical model known as latent Dirichlet allocation, they were able to identify three distinct patterns of atrophy of variable severity in their cohort. These three factors were:

  • Temporal atrophy: temporal cortex, hippocampus and amygdala
  • Cortical atrophy: frontal, parietal, lateral temporal and lateral occipital cortex regions
  • Subcortical atrophy: striatum, thalamus and cerebellum

Individual patterns of atrophy within subjects were found to be stable over time. This means the atrophy patterns are not just a reflection of the stage of Alzheimer's disease.

As expected, patterns of atrophy correlated with neuropsychological domains of impairment. Temporal atrophy subjects had the greatest memory impairment. Cortical atrophy subjects showed the most impairment in executive function. Subcortical atrophy subjects had lower levels of executive function and memory impairment and showed a slower rate of cognitive decline. Patients tended to fall into factor groups where more than one area of atrophy was noted, e.g. cortical and temporal atrophy but not subcortical atrophy.

The authors note their findings support the heterogeneity of Alzheimer's disease, affecting different cognitive and behavioral features as well as variability in disease progression.

This is an important study in understanding the clinical manifestations of Alzheimer's disease. Readers with more interest in this research can access the free full-text manuscript by clicking on the link in the citation below.

Follow me on Twitter @WRY999

Brain image highlighting regions of the subcortex is an iPad screen shot from the app Brain Tutor.

Zhang X, Mormino EC, Sun N, Sperling RA, Sabuncu MR, Yeo BT, & Alzheimer's Disease Neuroimaging Initiative (2016). Bayesian model reveals latent atrophy factors with dissociable cognitive trajectories in Alzheimer's disease. Proceedings of the National Academy of Sciences of the United States of America. PMID: 27702899... Read more »
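Latent Dirichlet allocation is the same "topic model" used for text, applied here by analogy: subjects play the role of documents, brain regions play the role of words, and each latent factor is a pattern of atrophy that subjects express to varying degrees. The sketch below shows that analogy with scikit-learn on random placeholder data; it is not the authors' Bayesian pipeline, just the shape of the computation.

```python
# Document-topic analogy for latent atrophy factors: subjects ~ documents,
# brain regions ~ words, non-negative "atrophy counts" ~ word counts.
# The data are random placeholders; this is not the authors' actual pipeline.
import numpy as np
from sklearn.decomposition import LatentDirichletAllocation

rng = np.random.default_rng(0)
n_subjects, n_regions = 188, 50
X = rng.poisson(lam=3.0, size=(n_subjects, n_regions))   # fake non-negative counts

lda = LatentDirichletAllocation(n_components=3, random_state=0)  # 3 latent factors
subject_factor_probs = lda.fit_transform(X)   # per-subject loading on each factor
factor_region_weights = lda.components_       # per-factor weights over regions

print(subject_factor_probs.shape)   # (188, 3): each subject is a mixture of factors
print(factor_region_weights.shape)  # (3, 50): each factor emphasizes certain regions
```

The key property this buys, and the reason it matters for the study's conclusions, is that each subject is modeled as a mixture of factors rather than being forced into a single atrophy subtype.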

Zhang X, Mormino EC, Sun N, Sperling RA, Sabuncu MR, Yeo BT, & Alzheimer’s Disease Neuroimaging Initiative. (2016) Bayesian model reveals latent atrophy factors with dissociable cognitive trajectories in Alzheimer's disease. Proceedings of the National Academy of Sciences of the United States of America. PMID: 27702899  

  • September 29, 2016
  • 11:25 PM
  • 439 views

Locating social memories in the brain

by adam phillips in It Ain't Magic

Scientists have identified that social memories are stored in the ventral CA1 region of the brain (in mice). After meeting a mouse and apparently forgetting it, the memories can still be reactivated optogenetically, indicating that they exist but cannot be retrieved naturally after time passes... Read more »

Okuyama, T., Kitamura, T., Roy, D., Itohara, S., & Tonegawa, S. (2016) Ventral CA1 neurons store social memory. Science, 353(6307), 1536-1541. DOI: 10.1126/science.aaf7003  

  • September 28, 2016
  • 11:10 AM
  • 467 views

Lewy Body Versus Alzheimer's Dementias and Parkinson's

by William Yates, M.D. in Brain Posts

One clinical challenge is making an accurate diagnosis in patients with dementia. Alzheimer's disease is typically the predominant diagnosis in dementia. However, a significant number of patients will present with dementia due to Lewy body disease, Parkinson's dementia, frontotemporal dementia or vascular dementia. A recent study helps clinicians distinguish Lewy body dementia from Alzheimer's dementia and Parkinson's disease.

Douglas Scharre and colleagues from Ohio State University conducted a matched-pair analysis of 21 patients with Lewy body dementia, 21 patients with Alzheimer's disease and 21 patients with Parkinson's disease. Parkinson's disease subjects in this study had higher cognitive function scores than the Lewy body disease subjects but were matched on level of motor impairment. Subject groups were assessed on a variety of motor, cognitive and neuropsychological domains.

Lewy body dementia subjects differed from the Alzheimer's group in the following areas:

  • Higher impairment scores on executive function and visuospatial function
  • Lower impairment on memory and orientation
  • Higher scores on measures of sleepiness
  • Higher scores on fluctuation of cognitive and behavioral deficits
  • More hallucinations
  • More sleep apnea

Lewy body dementia subjects differed from the Parkinson's disease group in the following areas:

  • More impairment in axial motor function
  • More impairment in gait and balance function
  • Higher scores on measures of sleepiness
  • Higher scores on fluctuation of cognitive and behavioral deficits
  • More hallucinations
  • More sleep apnea

The authors noted that measures of axial motor, gait and balance impairment correlated more strongly with level of executive function impairment, visuomotor function impairment and global cognitive impairment.

This is an important study and highlights the need for specific neuropsychological testing, along with assessment of motor, gait and balance domains, in dementia evaluations. Readers with more interest in this study can access the authors' uncorrected proofs by clicking on the DOI link in the citation below.

Follow me on Twitter @WRY999

Photo of a brown thrasher from my back yard is from my files.

Scharre, D., Chang, S., Nagaraja, H., Park, A., Adeli, A., Agrawal, P., Kloos, A., Kegelmeyer, D., Linder, S., Fritz, N., Kostyk, S., & Kataki, M. (2016). Paired Studies Comparing Clinical Profiles of Lewy Body Dementia with Alzheimer's and Parkinson's Diseases. Journal of Alzheimer's Disease, 1-10. DOI: 10.3233/JAD-160384... Read more »

Scharre, D., Chang, S., Nagaraja, H., Park, A., Adeli, A., Agrawal, P., Kloos, A., Kegelmeyer, D., Linder, S., Fritz, N.... (2016) Paired Studies Comparing Clinical Profiles of Lewy Body Dementia with Alzheimer’s and Parkinson’s Diseases. Journal of Alzheimer's Disease, 1-10. DOI: 10.3233/JAD-160384  

  • September 12, 2016
  • 12:36 PM
  • 440 views

Opioid Abuse: Treatment Guidelines

by William Yates, M.D. in Brain Posts

Molecular model of the drug buprenorphine

The U.S. Surgeon General recently sent a letter to all physicians about the danger of prescription opioids, particularly when used in combination with benzodiazepines (i.e. Valium, Xanax). This combination greatly increases the risk of overdose death.

Clinicians can successfully assist those with opioid use disorders. Marc Schuckit, M.D. recently published a nice summary review entitled "Treatment of Opioid-Use Disorders" in the New England Journal of Medicine (see citation below). The review contains some useful tables, including tables related to:

  • Diagnostic criteria for opioid use disorder
  • An opiate withdrawal rating scale
  • Opioid-free treatment aids in the management of opioid withdrawal
  • Opioid agonist aids in the management of opioid withdrawal
  • Opioid agonist use in long-term maintenance

Opioid withdrawal is generally non-life-threatening, but it is extremely uncomfortable and a strong motivator for return to opioid use. Use of opioid agonists such as buprenorphine (pictured in the molecular model above) is the best tolerated and least uncomfortable pathway to treatment of opioid withdrawal. However, it is restricted to prescription by a small group of physicians who have completed a special training program. This means many opioid users end up with less well tolerated and more uncomfortable withdrawal episodes treated with drugs such as clonidine, diazepam or even over-the-counter drugs such as Imodium.

The review notes that, given high relapse rates, supervised opioid maintenance with prescribed opioids such as methadone or buprenorphine is often required for long-term abstinence. One barrier to using these types of interventions is availability and cost. Buprenorphine can be expensive and requires regular monitoring in a physician's office. With high costs and lack of access to supervised opioid management, opioid users commonly return to street heroin or prescription opioid pills (licit or illicit).

Readers with more interest in this topic can access the free full-text manuscript by clicking on the DOI link in the citation below.

Follow me on Twitter @WRY999

Model of buprenorphine is from a Wikipedia Creative Commons file attributed to: Crystallographic data from J. Mazurek, M. Hoffmann, A. Fernandez Casares, P. D. Cox and M. D. Minardi (June 2014). "Buprenorphine". Acta Cryst. E70, o635. DOI:10.1107/S1600536814009672

Schuckit, M. (2016). Treatment of Opioid-Use Disorders. New England Journal of Medicine, 375 (4), 357-368. DOI: 10.1056/NEJMra1604339... Read more »

Schuckit, M. (2016) Treatment of Opioid-Use Disorders. New England Journal of Medicine, 375(4), 357-368. DOI: 10.1056/NEJMra1604339  

  • September 6, 2016
  • 11:17 AM
  • 447 views

Keeping the Weight Off

by William Yates, M.D. in Brain Posts

Weight loss and maintenance of weight loss are difficult, if not nearly impossible, for most people.

A registry of individuals who have lost 30 pounds or more and maintained their weight loss over a year exists in the U.S. This research effort is known as the National Weight Control Registry, and it currently has over 10,000 members.

I was looking at some of the published research results from this study. A paper published in 2012 used cluster analysis to identify sub-types of individuals with successful long-term weight loss. The study identified four specific sub-types of successful dieters. They varied on a variety of historical, demographic and behavioral variables. However, I was impressed by the amount of exercise the participants documented.

Over 90% of the successful dieters burned an average of 2,500 calories in a typical week of physical activity. Although calories burned by exercise vary by weight, a typical weekly amount of exercise to burn 2,500 calories would be:

  • Walking 25 miles per week (six-plus hours of walking per week)
  • Running 18 miles per week (three-plus hours of running per week)

These levels of physical activity are two to three times the typical recommendations for physical activity in the general population.

This study suggests successful weight loss maintenance is hard work. Unless you are committed to and capable of this level of activity, it may be nearly impossible to lose weight and keep the weight off.

Readers with more interest in this study can access the free full-text manuscript by clicking on the PMID link in the citation below.

Follow me on Twitter @WRY999

Photo of fireworks is a Moto G4 Plus smartphone pic from my files.

Ogden, L., Stroebele, N., Wyatt, H., Catenacci, V., Peters, J., Stuht, J., Wing, R., & Hill, J. (2012). Cluster Analysis of the National Weight Control Registry to Identify Distinct Subgroups Maintaining Successful Weight Loss. Obesity, 20 (10), 2039-2047. DOI: 10.1038/oby.2012.79... Read more »
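The mileage figures above are back-of-the-envelope conversions, and the sketch below (Python) shows the arithmetic. The calories-per-mile and pace values are rough rule-of-thumb assumptions for an average-weight adult, not numbers taken from the National Weight Control Registry paper.

```python
# Back-of-the-envelope check of the weekly mileage quoted in the post.
# Energy cost per mile and pace are rule-of-thumb assumptions, not registry data.
WEEKLY_CALORIE_TARGET = 2500

KCAL_PER_MILE = {"walking": 100, "running": 135}   # assumed energy cost per mile
MINUTES_PER_MILE = {"walking": 16, "running": 10}  # assumed pace

for activity, kcal in KCAL_PER_MILE.items():
    miles = WEEKLY_CALORIE_TARGET / kcal
    hours = miles * MINUTES_PER_MILE[activity] / 60
    print(f"{activity}: ~{miles:.0f} miles/week, ~{hours:.1f} hours/week")
# walking: ~25 miles/week, ~6.7 hours/week
# running: ~19 miles/week, ~3.1 hours/week
```

Heavier individuals burn more per mile and lighter individuals less, so the required mileage shifts accordingly, but the order of magnitude matches the post's point that this is a substantial weekly commitment.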

  • August 20, 2016
  • 06:54 AM
  • 569 views

What are we getting wrong in neuroscience?

by neurosci in Neuroscientifically Challenged

In 1935, an ambitious neurology professor named Egas Moniz sat in the audience at a symposium on the frontal lobes, enthralled by neuroscientist Carlyle F. Jacobsen's description of some experiments Jacobsen had conducted with fellow investigator John Fulton. Jacobsen and Fulton had damaged the frontal lobes of a chimpanzee named "Becky," and afterwards they had observed a considerable behavioral transformation. Becky had previously been stubborn, erratic, and difficult to train, but post-operation she became placid, imperturbable, and compliant. Moniz had already been thinking about the potential therapeutic value of frontal lobe surgery in humans after reading some papers about frontal lobe tumors and how they affected personality. He believed that some mental disorders were caused by static abnormalities in frontal lobe circuitry. By removing a portion of the frontal lobes, he hypothesized he would also be removing neurons and pathways that were problematic, in the process alleviating the patient's symptoms. Although Moniz had been pondering this possibility, Jacobsen's description of the changes seen in Becky was the impetus Moniz needed to try a similar approach with humans. He did so just three months after seeing Jacobsen's presentation, and the surgical procedure that would come to be known as the frontal lobotomy was born.Moniz's procedure initially involved drilling two holes in a patient's skull, then injecting pure alcohol subcortically into the frontal lobes, with the hopes of destroying the regions where the mental disorder resided. Moniz soon turned to another tool for ablation, however---a steel loop he called a leucotome (which is Greek for "white matter knife")---and began calling the procedure a prefrontal leucotomy. Although his means of assessing the effectiveness of the procedure were inadequate by today's standards---for example, he generally only monitored patients for a few days after the surgery---Moniz reported recovery or improvement in most of the patients who underwent the procedure. Soon, prefrontal leucotomies were being done in a number of countries throughout the world. The operation attracted the interest of neurologist Walter Freeman and neurosurgeon James Watts. They modified the procedure again, this time to involve entering the skull from the side using a large spatula. Once inside the cranium, the spatula was wiggled up and down in the hopes of severing connections between the thalamus and prefrontal cortex (based on the hypothesis that these connections were crucial for emotional responses, and could precipitate a disorder when not functioning properly). They also renamed the procedure "frontal lobotomy," as leucotomy implied only white matter was being removed and that was not the case with their method. Several years later (in 1946), Freeman made one final modification to the procedure. He advocated for using the eye socket as an entry point to the frontal lobes (again to sever the connections between the thalamus and frontal areas). As his tool to do the ablation, he chose an ice pick. The ice pick was inserted through the eye socket, wiggled around to do the cutting, and then removed. The procedure could be done in 10 minutes; the development of this new "transorbital lobotomy" brought about the real heyday of lobotomy.The introduction of transorbital lobotomy led to a significant increase in the popularity of the operation---perhaps due to the ease and expediency of the procedure. 
Between 1949 and 1952, somewhere around 5,000 lobotomies were conducted each year in the United States (the total number of lobotomies done by the 1970s is thought to have been between 40,000 and 50,000). Watts strongly protested the transformation of lobotomy into a procedure that could be done in one quick office visit---and done by a psychiatrist instead of a surgeon, no less---which caused he and Freeman to sever their partnership.Freeman, however, was not discouraged; he became an ardent promoter of transorbital lobotomy. He traveled across the United States, stopping at mental asylums to perform the operation on any patients who seemed eligible and to train the staff to perform the surgery after he had moved on. Freeman himself is thought to have performed or supervised around 3,500 lobotomies; his patients included a number of minors and a 4-year old child (who died 3 weeks after the procedure). Eventually, however, the popularity of transorbital lobotomy began to fade. One would like to think that this happened because people recognized how barbaric the procedure was (along with the fact that the approach was based on somewhat flimsy scientific rationale). The real reasons for abandoning the operation, however, were more pragmatic. The downfall of lobotomy began with some questions about the effectiveness of the surgery, especially in treating certain conditions like schizophrenia. It was also recognized that some types of cognition like motivation, spontaneity, and abstract thought suffered irreparably from the procedure. And the final nail in the coffin of lobotomy was the development of psychiatric drugs like chlorpromazine, which for the first time gave clinicians a pharmacological option for intractable cases of mental disorders. It is easy for us now to look at the practice of lobotomy as nothing short of brutality, and to scoff at what seems like a tenuous scientific explanation for why the procedure should work. It's important, however, to look at such issues in the history of science in the context of their time. In an age when effective psychiatric drugs were nonexistent, psychosurgical interventions were viewed as the "wave of the future." They offered a hopeful possibility for treating disorders that were often incurable and potentially debilitating. And while the approach of lobotomy seems far too non-selective (meaning such serious brain damage was not likely to affect just one mental faculty) to us now, the idea that decreasing frontal lobe activity might reduce mental agitation was actually based on the available scientific literature at the time.Still, it's clear that the decision to attempt to treat psychiatric disorders through inflicting significant brain damage represented a failure of logic at multiple levels. When we discuss neuroscience today, we often assume that our days of such egregious mistakes are over. And while we have certainly progressed since the time of lobotomies (especially in the safeguards protecting patients from such untested and dangerous treatments), we are not that far removed temporally from this sordid time in the history of neuroscience. Today, there is still more unknown about the brain than there is known, and thus it is to be expected that we continue to make significant mistakes in how we think about brain function, experimental methods in neuroscience, and more.Some of these mistakes may be due simply to a natural human approach to understanding difficult problems. 
For example, when we encounter a complex problem we often first attempt to simplify it by devising some straightforward way of describing it. Once a basic appreciation is reached, we add to this elementary knowledge to develop a more thorough understanding, and one that is more likely to be a better approximation of the truth. However, that overly simplistic conceptualization of the subject can give birth to countless erroneous hypotheses when used in an attempt to explain something as intricate as neuroscience. And in science, these types of errors can lead a field astray for years before it finds its way back on course.

Other mistakes involve research methodology. Due to the rapid technological advances in neuroscience that have occurred in the past half-century, we have some truly amazing neuroscience research tools available to us that would have only been science fiction 100 years ago. Excitement about these tools, however, has caused researchers in some cases to begin utilizing them extensively before we are fully prepared to do so. This has resulted in using methods that cannot yet answer the questions we presume they can, and has provided us with results that we are sometimes unable to accurately interpret. In accepting the answers we obtain as legitimate and assuming our interpretations of results are valid, we may commit errors that can confound hypothesis development for some time.

Advances in neuroscience in the 20th and into the 21st century have been nothing short of mind-boggling, and our successes in understanding far outpace our long-standing failures. However, any scientific field is rife with mistakes, and neuroscience is no different. In this article, I will discuss just a few examples of how missteps and misconceptions continue to affect progress in the field of neuroscience.

The ________________ neurotransmitter

Nowadays the fact that neurons use signaling molecules like neurotransmitters to communicate with one another is one of those pieces of scientific knowledge that is widely known even to non-scientists. Thus, it may be a bit surprising that this understanding is less than 100 years old. It was in 1921 that the ... Read more »

  • July 29, 2016
  • 11:22 AM
  • 630 views

Elite Cyclists and Brain Fatigue Resistance

by William Yates, M.D. in Brain Posts

In a Brain Post from 2012 I reviewed a study of fatigue in elite athletic performance. This study supported a key role for the insula in regulating the perception of exercise-induced fatigue. You can access this post by clicking HERE. An update on this topic was recently published in PloS One by a research team in Australia. This study compared performance on a cognitive task after an extreme 20-minute cycling time trial. Professional cyclists were compared to recreational cyclists on the Stroop test, which requires inhibitory control. The results of the study were that elite cyclists performed significantly better on the Stroop test (more correct responses) following exercise than non-elite cyclists. This is indicative of a greater resistance to the effects of fatigue on brain performance. The authors note in the discussion section: "These finding suggest that successful endurance performance may require superior inhibitory control and resistance to mental fatigue." This resistance to mental fatigue at high levels of exercise may be a key component in successful performance at the elite level. Inhibitory control has been shown to have a significant genetic association and to be stable over time. It is possible that training interacts with genetic factors to produce brain fatigue resistance in the elite cyclist population. Readers with more interest in this topic can access the free full-text research manuscript by clicking on the PMID link in the citation below. Follow the author on Twitter HERE. Photo of a non-elite cyclist participating in a triathlon is from the author's files. Martin K, Staiano W, Menaspà P, Hennessey T, Marcora S, Keegan R, Thompson KG, Martin D, Halson S, & Rattray B (2016). Superior Inhibitory Control and Resistance to Mental Fatigue in Professional Road Cyclists. PloS one, 11 (7) PMID: 27441380... Read more »

Martin K, Staiano W, Menaspà P, Hennessey T, Marcora S, Keegan R, Thompson KG, Martin D, Halson S, & Rattray B. (2016) Superior Inhibitory Control and Resistance to Mental Fatigue in Professional Road Cyclists. PloS one, 11(7). PMID: 27441380  

