China probes for Sars links in pneumonia outbreak


Health worker walks past a SARS billboard in Hefei, the capital of China’s Anhui province.

China is investigating an outbreak of atypical pneumonia that is suspected of being linked to Sars, the flu-like virus that killed hundreds of people in 2002 and 2003, state media reported on Tuesday (Dec 31).

A team of experts from the National Health Commission was dispatched on Tuesday to Wuhan, in central China’s Hubei province, and is “currently conducting relevant inspection and verification work”, state broadcaster CCTV reported.

An emergency notification issued on Monday by the Wuhan municipal health committee said hospitals in the city have treated a “successive series of patients with unexplained pneumonia”, without offering details.

Chinese news site The Paper reported 27 cases of viral pneumonia in Wuhan in December, citing unnamed health officials from the city. “Of the 27 cases, seven were critical, the rest were under control, and two patients are expected to be discharged from hospital in the near future,” The Paper said.

It is unclear whether all these patients are suspected of having contracted severe acute respiratory syndrome (Sars), a highly contagious respiratory disease.

The emergency notification has urged hospitals to offer treatment and report cases in a “timely manner”.

The World Health Organisation (WHO) criticised China for under-reporting the number of Sars cases following the outbreak in 2003.

Sars killed 349 people in mainland China and another 299 in Hong Kong in 2003.

The virus, which infected more than 8,000 people around the world, is believed to have originated in the southern Chinese province of Guangdong, according to WHO.

China sacked its then Health Minister Zhang Wenkang for the poor handling of the crisis in 2003, several months after the first case was reported.

WHO announced that China was free of the deadly Sars virus in May 2004.

Wearing shoes from a young age makes your ankles less flexible


© Sebastian Kopp/Getty Images
Wearing shoes from a young age may not help you put your best foot forward

Your shoes are changing your feet. The ankles of people who habitually wear shoes are different to those of people who walk barefoot.

These changes to ankle bones take place over the course of a person’s life, and there is no evidence that they can be passed on genetically.

In modern industrial societies, most people wear shoes from a young age. However, in traditional hunter-gatherer societies people often go barefoot, or wear only very thin footwear.

“We know that there are some variations in the feet of modern humans, due to the use of shoes,” says Rita Sorrentino of the University of Bologna in Italy. But most previous findings relate to the front and middle of the foot. Sorrentino and her team have focused on the ankle, which is crucial because it links the foot to the leg.

They studied 142 ankle bones from 11 populations from North America, Africa and Europe. These included modern sandal-wearing Nguni farmers in southern Africa, people living in modern New York, and preserved bones from Stone Age hunter-gatherers.

The hunter-gatherers’ ankle bones were significantly shorter than those of people living in modern cities, and there were other differences in shape. “They are mostly related to footwear-related behaviours and locomotor behaviours,” says Sorrentino.

The hunter-gatherers walked barefoot for long distances every day over natural terrain. Their ankles were relatively flexible. In contrast, the people who live in big cities wear constrictive footwear and walk short distances on flat surfaces like asphalt roads. Their ankles were more rigid. “There is a loss of flexibility,” says Sorrentino.

There is now debate among scientists about the downsides of wearing shoes or going barefoot. Forgoing footwear may at first seem hard on the feet, but there is evidence that although people who walk barefoot develop thicker soles, this doesn’t stop them from sensing the ground, which is important for locomotion.

Sorrentino says it is an open question whether shoes have disadvantages, but she suspects a key factor is that the rigidity of modern shoes causes our bones to become weaker and thus more prone to fracturing.

Solid evidence for people wearing shoes only exists for the last 10,000 years, says Sorrentino. For instance, a sandal from a Missouri cave may be 8300 years old. These early shoes were all fairly soft, resembling moccasins or sandals, so wouldn’t have restricted the motion of the ankle much.

Journal reference: American Journal of Physical Anthropology, DOI: 10.1002/ajpa.23976

Can lowering inflammation help treat major depression?


It is estimated that 7.1% of the adult population in the U.S. experienced at least one major depressive episode in 2017. The highest rates are among those ages 18 to 25 years. Many people believe depression is caused by a chemical imbalance in the brain. This is a theory that has been widely promoted by drug companies and psychiatrists, to the point it is now accepted as fact.

However, this is just a theory and, worse, it’s a theory that has been largely discredited. The idea spread quickly after it was proposed in the 1960s, when it appeared that antidepressant drugs altered brain chemicals. In the 1980s, Prozac (fluoxetine) was released by Eli Lilly and heavily promoted as balancing brain chemicals to relieve depression.

Prozac had fewer side effects than some of the earlier antidepressants and soon became the poster child for the selective serotonin reuptake inhibitor (SSRI) class of antidepressants. However, although they were heavily prescribed, data repeatedly showed SSRIs worked no better than placebos for those experiencing mild to moderate depression.

Although antidepressants don’t effectively treat depression, they do double the risk of harm from suicide and violence in healthy adults and increase aggression in children and adolescents.

Researchers also suggest major depression could be vastly overdiagnosed and overtreated with antidepressants. The majority who are prescribed these drugs end up staying on them long-term, which may compromise their health.

More Studies Link Depression to Inflammation

Researchers have found yet another link between inflammation and depression. In one study1 published in the Journal of Neurology, Neurosurgery & Psychiatry, researchers systematically reviewed the safety and effectiveness of anti-inflammatory agents in people suffering with major depression.

The literature review included results from 30 randomized controlled trials with 1,610 participants. In an overall analysis of 26 studies, the researchers found anti-inflammatory agents reduced depressive symptoms when compared with placebo. They found no differences in quality-of-life analyses but did find differences in gastrointestinal events between the treatment periods.

A subanalysis of the data demonstrated that adjunctive treatment of antidepressants with nonsteroidal anti-inflammatory drugs, statins, omega-3 fatty acids or minocycline significantly reduced depressive symptoms.2

Results from another large meta-analysis3 carried out by researchers from Aarhus University Hospital in Denmark revealed similar findings, showing anti-inflammatories may be effective in the treatment of depression. One researcher explained that the study showed the combination of anti-inflammatory drugs with antidepressants has beneficial effects.

The results also showed the effect against depression was present when the anti-inflammatory medication was used alone, compared against a placebo. The scientists analyzed 36 international studies of participants who suffered from depression or who had symptoms of depression. One of the researchers, Dr. Ole Köhler-Forsberg, commented on the results of the study:4

“This definitely bolsters our chances of being able to provide personalised treatment for individual patients in the longer term. Of course we always have to weigh the effects against the potential side-effects of the anti-inflammatory drugs.

We still need to clarify which patients will benefit from the medicine and the size of the doses they will require. The findings are interesting, but patients should consult their doctor before initiating additional treatment.”

Yet another recently published study, in Molecular Psychiatry,5 found that patients treated with immunotherapeutics for inflammatory disorders, who also presented with depression or depressive symptoms, experienced symptomatic relief. The team found the reduction in depressive symptoms was not associated with any treatment-related changes in their physical health.

Immune Dysregulation May Trigger Allergic-Type Reaction

There has been an increasing number of studies in which depression is reported to be linked to immune dysregulation and inflammation, mimicking an allergic reaction.6 Your body uses inflammation as a defense mechanism against attack.

A localized, infected wound demonstrates an isolated inflammatory response as it turns red and sore. Inflammation is also triggered by stress and physical trauma; inflammation in turn triggers depression. This is related to the release of cytokines, small signaling proteins the body uses to coordinate the immune response.

This information may ultimately influence emotions and how you feel. By affecting the quality of your sleep, metabolism and stress responses, inflammation may create a biological environment triggering depressive symptoms.

The findings from these studies have contributed to a mounting body of evidence that inflammation may be a biochemical root of mental health symptoms. Thus, it may provide another nonpharmacological route for treating those who suffer with depression. Köhler-Forsberg and colleagues are interested in a pharmacological approach, and he points out:7

“Some studies suggest that the choice of antidepressant can be decided by a blood sample that measures whether there is an inflammatory condition in the body. Other studies show that the same blood sample can be used as a guideline for whether a depressive patient can be treated with anti-inflammatory medicine that has a better effect when there is inflammation present at the same time as the depression.

However, we need to verify these findings and examine which patients can benefit from this before it can be implemented in everyday clinical practice.”

Mental Health Screening May Overlook Contributing Factors

Physicians commonly use mental health screening tests to determine how best to treat depressive symptoms. These screening tests are only as good as the physician administering them, their analysis of the data, and how you feel when you enter the doctor’s office.

In one 2013 study evaluating 5,639 participants identified by their clinicians as suffering with depression, researchers found only 38.4% met the DSM-IV criteria for a major depressive episode. Speaking to The New York Times, one of the researchers pointed out that not only are physicians prescribing more medications, but patients are demanding more as well.

He points out Americans have become used to using drugs to address the stresses of daily life that may trigger short-term situational sadness. Mental health screening tests do not often consider vitamin deficiencies, lack of exercise, poor nutrition, lack of sleep or inflammation.

Antidepressant Use Doubled in Seniors

In the 2013 study evaluating participants who were prescribed antidepressants by their physician, a mere 14.3% of those over age 65 met the DSM-IV criteria for a major depressive episode. To investigate whether the number of antidepressant drugs prescribed to seniors had risen, another team of researchers looked at data from 1991 to 1993.

They compared this against data gathered from studies conducted from 2008 to 2011. During the early period, 4.2% of adults were taking antidepressants. This number more than doubled to 10.7% during the later period. The rate of antidepressant use among older adults living in care homes in English population cohort studies also rose, from 7.4% to 29.2%.

A study published in 2017 reviewed data from 1990 to 2015 gathered from Australia, Canada, England and the U.S. In this report it was noted that the prevalence of disorders and symptoms had not decreased despite an increase in the prescription of antidepressants.

In seniors, depression is associated with cognitive decline, dementia and poor medical outcomes. Those with depression also experience higher rates of suicide and mortality. Guidelines from the American Psychiatric Association suggest combining antidepressant medication with psychotherapy in the elderly. But, despite the increased risks with antidepressants, most seniors receive only medication.

Treatment with antidepressant drugs in seniors increases the risk of Type 2 diabetes, which increases the risk of other comorbid health conditions including heart disease and stroke. The drugs are also linked to the development of thicker arteries and dementia.

In addition, depending on the classification of drug, they are known to deplete several nutrients, including coenzyme Q10, vitamin B12, calcium and folate. Of particular concern in the elderly is the risk of osteoporosis and fractures associated with antidepressant medications.

One 2015 study compared women treated with indigestion drugs against those treated with SSRIs and found a 76% higher rate of fracture in the first year among those taking antidepressants. When these risks are combined with the knowledge that the drugs work no better than placebo for mild to moderate depression, seniors may face greater risk than any benefit they receive.

Consider Nonpharmacological Options to Reduce Depression

Reducing the inflammatory response in your body is crucial as it is a root cause of many chronic conditions, including depression. In addition to strategies to reduce inflammation, there are other approaches with a history of improving symptoms.

As you consider the following nonpharmacological suggestions, remember you don’t have to do them all at once, and you can accomplish them no matter your age or current physical abilities. Begin the journey to better health by taking small, permanent steps.

  • Exercise — Exercise normalizes your insulin and leptin sensitivity and has a significant effect on kynurenine, a neurotoxic stress chemical produced from the amino acid tryptophan; on brain-derived neurotrophic factor (BDNF), a growth factor regulating neuroplasticity and the growth of new neurons; and on your endocannabinoid system
  • Nutrition — There are several nutritional factors that affect your mood and emotions, not the least of which is eating too much sugar. Excessive amounts of sugar disrupt your leptin and insulin sensitivity, affect dopamine levels and damage your mitochondria, all of which affect your mood. Nutrients such as omega-3 fats, magnesium, vitamin D and the B vitamins each influence your mood and brain health. You may experience the beneficial effects of boosting these nutrients to optimal levels in as little as two weeks.
  • Light therapy — Light therapy is an effective treatment for seasonal affective disorder, and researchers8 find it is also effective against moderate to severe depression. Participants simply used a white light box for 30 minutes each day as soon as possible after waking up.
  • Mindful meditation or Emotional Freedom Techniques (EFT) — In a study9 of 30 moderately to severely depressed college students, the depressed students were given four 90-minute EFT sessions. Students who received EFT showed significantly less depression than the control group when evaluated three weeks later.

Widespread overuse of herbicides leading to resistant black-grass is costing UK £400 million per year


Blackgrass in wheat field

Scientists from international conservation charity ZSL (Zoological Society of London) have for the first time put an economic figure on the herbicide resistance of a major agricultural weed that is decimating winter-wheat farms across the UK.

Wheat is a vital ingredient in mince pies, biscuits and stuffing (and of course a large amount is fed to turkeys), so the future of Christmas dinners could be at risk, with the persistent weed making its way across British fields.

Black-grass (Alopecurus myosuroides) is a native annual weed, but large infestations in farmers’ fields can force them to abandon their winter wheat — the UK’s main cereal crop. Farmers have been using herbicides to try to tackle the black-grass problem, but in many areas of England the agricultural weed is now resistant to these herbicides. Black-grass, heralded as ‘Western Europe’s most economically significant weed’, is setting the UK economy back £400 million and 800,000 tonnes of lost wheat yield each year, with potential implications for national food security.

In a study published in Nature Sustainability today (23 December 2019), researchers from ZSL’s Institute of Zoology, Rothamsted Research and Sheffield University devised a new model that helps quantify the economic costs of the resistant weed and its impact on yield under various farming scenarios.

An estimated four million tonnes of pesticide are applied to crops worldwide each year. There are already 253 known herbicide-resistant weeds, and unlike the known costs to the economy of human antibiotic resistance, which run into trillions of dollars, estimates of the costs of resistance to agricultural xenobiotics (e.g. antimycotics, pesticides) are severely lacking.

Overuse of herbicides can lead to poor water quality, loss of wild plant diversity and indirect damage to the surrounding invertebrate, bird and mammal biodiversity that relies on those plants.

The ZSL research found the UK is losing 0.82 million tonnes in wheat yield each year (equivalent to roughly 5% of the UK’s domestic wheat consumption) due to herbicide resistant black-grass. The worst-case scenario — where all fields contained large amounts of resistant black-grass — is estimated to result in an annual cost of £1 billion, with a wheat yield loss of 3.4 million tonnes per year.

Lead author and postdoctoral researcher at ZSL’s Institute of Zoology, Dr Alexa Varah said: “This study represents the first national-scale estimate of the economic costs and yield losses due to herbicide resistance, and the figure is shockingly higher than I think most would imagine.

We need to reduce pesticide use nationwide, which might mean introducing statutory limits on pesticide use, or support to farmers to encourage reduced use and adoption of alternative management strategies. Allocating public money for independent farm advisory services and research and development could help too.”

Industry management recommendations have so far advised using a mixture of herbicides designed to prevent the evolution of ‘specialist’ resistance. Alarmingly, however, recent research has revealed that this method actually shifts plants towards a more generalist resistance, conferring resistance to chemicals they have never been exposed to.

Glyphosate is now one of the few herbicides that black-grass has not evolved resistance to, with farmers now reliant on repeated applications to control the weed. However, evidence from a recent study shows that resistance to glyphosate is now evolving in the field too.

Dr Varah added: “Farmers need to be able to adapt their management to implement more truly integrated pest management strategies — such as much more diverse crop rotations and strict field hygiene measures.

“Currently resistance management is the responsibility of individual practitioners, but this isn’t a sustainable approach. It should be regulated through a national approach, linking the economic, agricultural, environmental and health aspects of this issue in a National Action Plan — that also targets glyphosate resistance.

“Understanding the economic and potential food security issues is a vital step, before looking at biodiversity, carbon emissions and water quality impacts in greater detail. We hope to use this method to aid the development of future models to help us understand how British farmers battling black-grass could do it in a way that is more beneficial to biodiversity like insects, mammals, wild plants and threatened farmland bird species like skylarks, lapwing and tree sparrows — unearthing how their numbers are linked to changes in farming practices.”

Summarizing the evidence for sex differences in cognition

brain skull model

In a previous post I examined the biological and social influences on sex and gender identity. Evidence suggests that biology plays a powerful role in the determination of sex as well as of gender identity, although social forces are also important, particularly as they relate to gender role expression. In this essay I’ll examine the evidence surrounding a related controversial topic: whether or not there are cognitive differences between the sexes and, if so, whether they are biological or social in origin.

In what follows, I’ll focus on individuals whose gender identity matches their biological sex. This leaves out nonbinary and transsexual persons, about whom there is far less research evidence. Nevertheless, given that transsexuals tend to have hypothalamuses that match their identified gender not their biological sex, it would be interesting to know if this produces cognitive differences as well. Some evidence suggests that the administration of sex hormones to those undergoing transition does influence cognition in expected ways. Other studies suggest that cognitive differences exist prior to hormone treatment, and that the cognition of transsexuals resembles that of their identified gender more than that of their biological sex (a finding that appears to lend further support to the hypothesis that gender dysphoria is produced by women’s brains in men’s bodies and vice versa).

This essay offers an exploration of mean group differences. Nothing here should be taken to imply that either sex should be excluded from certain cognitive tasks or professions. Nor should mean group differences, which are often quite small, be used to infer the capabilities of any given individual.

Are there Sex Differences in Overall IQ?

Most of the available evidence suggests there is very little difference in overall IQ between males and females. Controversy arises around the question of whether there is more spread (what statisticians call standard deviation, the degree to which most scores deviate from the average or mean) among males than females. For IQ tests, the mean is typically set at 100, with a standard deviation of 15. This means that most individual scores fall between 85 and 115. Not everyone, even those with “normal” IQs, is going to score exactly 100. The Gender Variability Hypothesis suggests that boys tend to have more spread, or a higher standard deviation, than do girls; in other words, there are proportionally more geniuses and cognitively impaired individuals among males, whereas females cluster closer to the mean. This hypothesis can be very provocative. It was at the center of a 2005 controversy involving then-president of Harvard University Lawrence Summers, who suggested that it might partly explain why males outnumber females in high-echelon STEM (Science, Technology, Engineering, and Math) careers. An outcry ensued, and Summers resigned a year later.

Boys and men tend to be overrepresented among individuals with lower IQs including those with intellectual disabilities. Few people seem to be interested in discussing that. But some studies indicate that boys are also overrepresented in the superior range of IQs (130 and above). In a 2003 study of Scottish youth born in the early twentieth century, girls actually tended to outperform boys until roughly the 115 IQ mark (because there are fewer girls with lower IQs, and more girls in the average to high average range). Only once we pass the 115 IQ mark do we start to see a male advantage. For instance, for IQs 130 and above, boys represent 57.7 percent of the high performers as compared to 50.4 percent of the underlying sample. Biologist Heather Heying recently tweeted a graph from this study. Note that there were still plenty of girls in the highest IQ ranges. So, if IQ differences were the only explanation for, say, STEM Ph.D. discrepancies, we’d still expect 42.3 percent of STEM Ph.D.s to be awarded to women.
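The arithmetic behind the variability hypothesis is easy to sketch. If two normal distributions share a mean but one has a slightly larger standard deviation, the wider one contributes disproportionately many scores in both tails. The following Python snippet, using only the standard library, illustrates this with purely hypothetical parameters (the standard deviations here are chosen for illustration, not taken from the Scottish study):

```python
from statistics import NormalDist

# Hypothetical distributions: identical mean IQ, slightly different spread.
male = NormalDist(mu=100, sigma=15.5)
female = NormalDist(mu=100, sigma=14.5)

def share_above(dist, cutoff=130):
    """Fraction of the distribution scoring above the cutoff."""
    return 1 - dist.cdf(cutoff)

m = share_above(male)
f = share_above(female)

print(f"Fraction of males above 130:   {m:.4f}")
print(f"Fraction of females above 130: {f:.4f}")
# Assuming equal numbers of males and females overall,
# the male share of the 130+ group follows directly:
print(f"Male share of the 130+ group:  {m / (m + f):.1%}")
```

Even this modest one-point difference in spread produces a 130+ group that is noticeably majority male, on the same order as the proportions reported above, while leaving a substantial minority of females in the top range — which is the essay's point about why variability alone cannot explain large outcome gaps.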

A 2016 cross-cultural analysis confirmed the gender variability hypothesis. However, this mainly benefited males in science and math, whereas females scored consistently higher in reading. A 2019 reanalysis of the same dataset confirmed these results, but also suggested that the difference between males and females decreases (but does not vanish) in countries that actively promote egalitarian female participation in the workplace and in education. This suggests that the greater male spread, at least in some abilities, is real and that both biological and social forces play a role.

There are, however, some cautionary notes. One study finds that differential dropout rates in long-term outcome studies may exaggerate the male advantage in IQ variability. Similarly, other analyses suggest that this effect is consistent within the US and UK, but not across all cultures (although the calculations made by that research group in related contexts have sometimes been criticized). Further, any apparent sex variability at the top end of intelligence may be due to specific abilities rather than overall intelligence.

Are there Sex Differences in Specific Cognitive Abilities?

It is often said that women and girls perform better at cognitive tasks testing verbal abilities, whereas men and boys have better visuospatial skills involving mental rotation or hand-eye coordination. The evidence appears to bear this out. For instance, in a large sample of Portuguese youth, most cognitive differences between boys and girls were pretty trivial in size. However, mechanical reasoning showed significant differences, with boys outperforming girls. Another study in the US found that girls tend to outperform boys on memory and processing speed tasks, whereas boys once again have an advantage on visuospatial tasks. A very large study of English schoolchildren lends support to female superiority in verbal skills, with males showing an advantage in quantitative abilities.

These sex differences, with female advantages in verbal ability and memory and male advantages in visuospatial cognition, appear to be fairly consistent across samples and across time, subject to the caveats mentioned below. It’s important to note that most of these differences are quite small, and perhaps not worth worrying about. However, males do appear to be at a significant disadvantage in reading and writing, whereas comparative male advantages in math and science may explain the greater proportion of males in STEM, at least in part.

Biology, Sociology, or Both?

If there is a sex difference in intelligence variability is it innate or socialized? When we’re discussing human psychology, it’s generally a safe bet to assume that both biological and social forces play a role in shaping behavior. Twelve years ago, a diverse panel of psychologists looked at this issue from every angle — evolutionary, biological, and sociocultural — and found that…it’s complex! Although sex differences in cognitive abilities may not be directly evolved, evolution, brain structure, and culture likely interact to produce various outcomes. The authors conclude that “early experience, biological factors, educational policy, and cultural context affect the number of women and men who pursue advanced study in science and math and that these effects add and interact in complex ways. There are no single or simple answers to the complex questions about sex differences in science and mathematics.”

Some studies suggest that, even in cultures where women are encouraged to participate in highly demanding activities like chess, sex discrepancies in top performers remain. However, other studies suggest that sex disparities favoring men tend to be most pronounced in cultures with overall high levels of sexual stratification, such as women being restricted from entering the workforce. Even in the US, sex disparities have declined over time, suggesting a clear sociocultural impact on such abilities, although stabilization appears to have occurred more recently, hinting that sociocultural forces are only one piece of the puzzle and biology is also important. More opportunities for women in education and occupation tend to at least reduce, though not eliminate, male advantages in some cognitive abilities. My impression is that there has been less emphasis on addressing male deficits in reading and writing, even as males fall behind in attending college.

Training also matters. For instance, some research suggests that women who train with fast-paced video games increase their visuospatial cognition, thereby reducing sex disparities. A study I conducted years ago found few overall male advantages for visuospatial tasks, with each sex better at visuospatial tasks involving items traditionally associated with that sex. By contrast, some beliefs such as “stereotype threat” (the theory that stereotypes such as “girls are bad at math” can negatively influence female performance) are now in trouble, potentially part of psychology’s replication crisis.

That said, some scholars probably go too far in denying the involvement of biology at all. For instance, one paper asserts that mathematics performance “is largely an artifact of changeable sociocultural factors, not immutable, innate biological differences between the sexes.” Although this conclusion is no doubt well-intentioned, it probably goes too far, particularly given how powerfully involved genetics are in intelligence and cognition. Rather, it appears that biological and cultural forces interact in complex ways. Statements that suggest that cognitive differences between men and women are entirely innate are similarly reductive.

Being aware of the science can be difficult in a hypercharged political environment, where hyperbole rules on both sides. For instance, a 2005 Washington Post article made striking claims about brain similarities across the sexes, mainly by pointing to studies of sociocultural influences. This is a bit of a dodge, since few scholars who find evidence for biological differences claim sociocultural influences are unimportant. The author suggests that even sex differences in physical aggression may not be real, pointing to evidence of female equivalence in the perpetration of domestic violence. While that particular data point is accurate, males tend to vastly outnumber females in the perpetration of other violent crimes, a fact that went unmentioned.

In closing, there are several reasonable conclusions we can draw from the current data:

  • There is little evidence for an overall sex difference in IQ.
  • Males may show more variability in IQ, resulting in greater proportional representation at both ends of the IQ scale. However, these proportional differences are probably smaller than is often claimed and don’t fully explain outcome discrepancies, such as in STEM careers.
  • Females appear to be generally superior at verbal and memory tasks, with males superior at visuospatial tasks. Male advantage on visuospatial and quantitative tasks may be one factor in explaining STEM discrepancies, whereas female advantages in verbal abilities may explain females outpacing males in higher education more generally.
  • Sex differences in cognitive ability are most pronounced among cultures with more sex stratification.
  • Genetics have a strong influence on IQ.
  • Sex differences in cognitive abilities are likely due to a complex interaction of evolutionary, biological, and sociocultural forces. Exclusive focus on only one of these is likely to result in an incomplete theoretical model.

A final observation: Much of this debate focuses on perceived differences in ability related to outcomes such as STEM careers. There is also a wide range of literature regarding sex differences in interest, which is also complicated, and which may explain a larger portion of the sex discrepancy in STEM careers. Put simply, many women may have the ability to perform in STEM careers but display more interest on average in alternative high-status careers such as medicine or law.

Christopher J. Ferguson is a professor of psychology at Stetson University in Florida. He is author of Moral Combat: Why the War on Violent Video Games is Wrong and the Renaissance mystery novel Suicide Kings. His forthcoming book How Madness Shaped History will be out in January 2020. You can follow him on Twitter @CJFerguson1111

Richard Dawkins discovers his ideal idiom and audience

Some writers struggle for years to achieve a proper harmony in their work between style and substance. For some, that precious concinnity remains elusive till the end. So it is always something of a happy surprise when an author discovers his or her ideal idiom in the twilight of a long career. In a sense, Richard Dawkins has always been a writer of books for children — or, at any rate, for readers with childish minds — but not until now, it seems, has it occurred to him to write explicitly as a children’s author. To this point, he has made a good living out of a relative paucity of gifts. As a third-tier zoologist, a popularizer of both scientific truths and pseudo-scientific speculations, and a tireless enemy of all religious beliefs (whether he understands them or not), he has gone far on an engagingly mediocre prose-style and an inflexible narrowness of mind. But, while his ambition has always been toward a certain intellectual gravity, even his putatively most serious (or most self-important) books have had an undeniably infantile quality about them. The Selfish Gene, for instance, was really little more than a cartoon of the molecular biology it pretended to explicate, a simplistic genetocentric reductionist fantasia so fraught with obvious logical errors and so prone to inadvertently and ineptly metaphysical claims that no truly mature mind could fail to recognize its fatuity. The God Delusion was, if anything, even more of a nursery entertainment: puerile rants, laboriously obvious jokes, winsomely preposterous conceptual confusions, a few dashes of naïve but honest indignation, attempts at philosophical reasoning so maladroit as to be touching in their guileless silliness.
And I think it fair to say that nothing Dawkins has written for public consumption has lacked this element of beguiling absurdity — the delightful atmosphere of playtime on a long golden summer afternoon, alive with small figures shouting happily in shrill little voices and stumbling about in their parents’ clothing, acting out scenes from what they imagine to be the daily lives of adults. But the bewitching effect has also always been diluted by his unfortunate failure to embody his ideas in a form suitable to their triviality.

With Outgrowing God: A Beginner’s Guide, all of that has changed. The warm, languid sunlight of those idyllic revels positively spills across its pages. At last, Dawkins has found an authorial voice entirely adequate to his theme. And it is charming. Yes, of course, the confused and chaotic quality of his arguments remains a constant, and the basic conceptual mistakes have not altered appreciably since the earliest days of his polemics; but here it all comes across as the delightful babble of a toddler. “Do you believe in God?” he asks on the first page, tugging at your sleeve, eager to inform you of all the interesting things he has learned about religion this week. “Which god? Thousands of gods have been worshipped throughout the world, throughout history.” Do tell. And, in fact, tell he does, breathlessly emptying out his whole little hoard of knowledge about the local deities of ancient peoples. The sheer earnest impishness of his manner is almost enough to make you ignore his continued inability — despite decades of attempts by more refined logicians to explain his error to him — to distinguish between the mythic and devotional stories that peoples tell about their gods and the ontological and modal claims that the great monotheistic traditions of the “axial age” and after have made about God, or to grasp the qualitative conceptual gulf that separates them.

To a great degree, as it happens, the first half of the book consists in a long series of variations on just this fundamental incomprehension. Much of the information Dawkins imparts is accurate enough, moreover, with a few notable exceptions (for instance, he believes that the Orthodox and Roman churches actually split from one another over the filioque controversy, but that is a common enough misconception). And, to be fair, taken solely as a riposte to fundamentalist religion, it all has a certain force. Certainly, Dawkins cannot be gainsaid in pointing out that a purely literalist reading of Christian scripture is unsustainable. Of course, he lacks the sophistication to know which of his claims are really sound, which arguments solvent, which criticisms accurate, and so forth, and on the whole these pages read like a digest of material culled from atheist websites. I imagine, in fact, that this is where much of it comes from. So it is hardly surprising that he lazily adopts quite a few inane arguments along with the good ones, or that he assumes that fundamentalist literalism is the Christian norm. And obviously he is insensible to the amusing reality that most of the textual inconsistencies and historical dubia he cites were identified first by Christian scholars. But, again, whereas in the past these bumptious vacuities were annoying, here the effervescent banality of the prose renders it all somehow — how to say it? — renders it all somehow rather cute.

This is, after all, as much as one can ask. At this point in Dawkins’s career, no one could possibly want him to deviate from his accustomed channels. Taking all of his previous publications into account, it would probably be rather unsettling if he all at once began to exhibit philosophical gifts; it would seem eerily unnatural, like a kindergartener suddenly mastering quantum theory. And in this respect his new book does not disappoint. Despite decades of the best and most persistent efforts on the parts of his philosophically literate critics to disabuse him of his crudest conceptual errors and to rouse him from his dogmatic slumber, Dawkins has made not the slightest advance in dialectical subtlety. He is like a marvelously flawless diamond that the winds of time cannot blemish. When, for instance, he considers whether there is any proof of God’s “existence,” it is clear that he still cannot distinguish between the sort of empirical evidence that might confirm the existence of a certain kind of object in the physical world (say, a teapot in orbit, to use the example he borrows from Bertrand Russell) and the sort of logical and metaphysical “proofs” by which one might attempt to establish the reality of a transcendent God who is the source and ground of all existence as such. In fact, all the principal questions of classical metaphysics remain a terra incognita for him. He certainly appears, moreover, to be unconscious of the qualitative difference between specific religious dogmas (which are as a rule intelligible only within certain larger contexts of belief) and ontological or logical asseverations about God as such, or of why the rejection of the former should have no consequences for one’s view of the latter. And, in general, he reprises all his most familiar “philosophical” gestures. 
As he has before, for instance, he invokes Hume’s argument against the plausibility of miracles, in a sweetly oversimplified form, clearly unaware both of its irrelevance to most religious rationales for belief and of its own formidable internal contradictions. And so on. All of his reasoning is dreadful, of course. Even where he gets something right, it is clear that he has done so only by accident, and has reached his correct formulation for all the wrong reasons. Every argument is circular, or incoherent, or simply wrong — which, under normal circumstances, would quickly grow annoying. Once again, however, the book’s lispingly infantile prose makes it impossible for the reader not to feel indulgent. Rather than rebuke Dawkins for his vapidity, one almost wants to give him treats.

The book’s second half, I should note, is somewhat more substantial. At least, it is more informative, being something of a child’s guide to basic evolutionary and organic biology. Rhetorically, however, it is just as insipid as what precedes it, inasmuch as the premise subtending it is that theological statements about God as creator and biological statements about life processes are all situated on the same explanatory level and hence constitute rival narratives of the same natural events. Here, as elsewhere, years of exposure to philosophical correctives have left no mark on Dawkins’s mind. He still cannot grasp that, logically speaking, whatever might account for the existence of causality as such cannot itself be a contingent cause among other causes. Thus, for instance, he seems to think that his discussion of embryology precludes the “alternative” claim that God is life’s creator. This is a curious sort of confusion, of course, since Dawkins must realize that most religious people do in fact believe in embryos, and even in ova and spermatozoa, and that they have no difficulty at all in affirming that nature — created though it be — consists in a large variety of natural processes. But, in another sense, no confusion is more typical of the man’s thinking than this one: this total inability, that is, to grasp that what the great theistic creeds mean by “creation” is qualitatively irreducible to any kind of physical causation, or transition between physical states. And its persistence is almost heroic. A man of lesser resolve might by now have relented just a little bit before the onslaught of philosophical scorn Dawkins has attracted over the years and, if nothing else, conceded that he might not fully understand what distinguishes ontological contingency from physical causation. Not Dawkins. 
For him still, no less than in his salad days, evolutionary science and the doctrine of creation from nothingness represent antithetical causal proposals regarding a single object. And so, once again, he feels it worthwhile to struggle at length against arguments for design in nature of the sort Deists and Intelligent Design theorists promote, even though such arguments are no real part of any major creed’s understanding of creation, and are irrelevant to the issue of creation as such in any event.

I suppose I should mention — though it probably does not need saying — that Dawkins has also gotten no better at distinguishing between the literal and the metaphorical even in his own language, or between empirical facts and speculative fictions. In a sense, much of his public career has been erected on a deep foundation of just such confusions. He may say now, for instance, that he regrets his earlier talk of “selfish genes”; but, take away the ludicrous fantasy of tiny unseen replicators occultly guiding the course of all life, constructing organisms as “robots” whose only purpose is to act as vehicles for those replicators, and then endowing those robots with the illusion of free will and moral sentiments, and so forth, and nothing really remains of the book that first made him famous. Then again, to take another example, there could scarcely be a more ridiculous burlesque of empirical science than the concept of “memes”: free-floating fragments of intentionality magically capable of subsisting apart from and in some sense logically prior to the intentional subjectivity they shape and populate; and yet Dawkins to this day still speaks as if those invisible little sprites — whose nature cannot be clearly defined, whose activities cannot be directly observed, and whose existence is as immune to proof as it is to logical plausibility — were real objects of scientific scrutiny. So it is scarcely surprising that Dawkins is insensible as well to the thoroughly metaphorical and obscure nature of, say, talk of genetic “information,” or that his whole understanding of all physical phenomena depends upon a logically impossible model of irreducible physical emergence. And then . . .

Well, really, perhaps none of this matters very much. Readers with serious minds took leave of Dawkins years ago. The chief lesson of this book may be that it is foolish to resent a childish mind for thinking childishly, especially when — however belatedly — it has learned to express itself in the sort of enchantingly childish voice that suits it best.

One cannot, alas, remain an infant forever; we must all sooner or later put away childish things; toy-land, toy-land, once you pass its portals . . . (and so on). In the end, if we want to think deeply about ultimate questions, Dawkins is not the man for us. We all have to outgrow him and his kind and all that they represent. Happily, the buoyant callowness of his most recent book invites us to do just that. In a sense, it gives expression to a degree of self-awareness on Dawkins’s part that has never been conspicuous in his work in the past, and of which he had seemed until now incapable. It suggests that, at some level, he has learned to recognize his ideas as essentially idle diversions for unformed minds — something on the order of birthday-party clowns or miniature ponies or balloon animals — and in this way it gives us license to ignore him with more geniality than we might otherwise have been able to manage. He means well, after all; he simply is not — and never will be — a thinker for adults. So, though outgrow him we must, we need not do so with rancor or disdain. We can even, if we wish, pause one last time before departing the nursery to appreciate his awkward but earnest ingenuousness, smile at his artless games and rambling stories, and perhaps fondly pat him on the head. In that sense, this book is a gift.

David Bentley Hart is author of more than a dozen books and roughly 750 articles. He is the author of the forthcoming Theological Territories: A David Bentley Hart Digest. He is exceedingly fond of dogs.

Want to change your life? Ditch New Year’s Resolutions for habit tracking

It’s an age-old conundrum: every time January 1st rolls around, millions of Americans set New Year’s Resolutions, but by February, a third of us have abandoned every single one. So, if you want to make 2020 the year you finally organize your finances, get in shape or complete any other monumental task, you may want to forget New Year’s Resolutions. Instead of writing down grandiose goals, turn your attention to your daily habits.

What is habit tracking?

Habit tracking — the practice of monitoring the tiny things you do every single day — helps you set small and achievable goals that add up over time to produce huge changes in your life. It’s a powerful tool that will help you establish healthy habits that’ll stick for years to come.

Habits are so important because they are ultimately responsible for any change we make in our lives. I’ll never publish a novel if I don’t sit down every day and put words on the page. I’ll never get stronger if I don’t work out every day (or at least a few times a week). I’ll never become a more mindful and enlightened person if I don’t meditate every morning.

Any huge goal you want to accomplish is virtually insurmountable unless you break it up into bite-sized habits that will, over time, lead you to your final destination.

Why does habit tracking work?

I’ve been habit tracking for a little over a year, and I’ve seen incredible results. I print out a spreadsheet at the start of every week and hang it on my bedroom door. Some of my personal daily habits are to do a strength-oriented workout, read 10 pages of an academic book, write 500 words and floss. In the past year I’ve finally managed two pull-ups in a row, learned more about philosophy and global conflicts, and had a short story I wrote accepted for reading on one of my favorite podcasts. Best of all, my dentist no longer pesters me about flossing.

Of course, I’ve had many days this year when I failed to complete my habits, but the way I’ve gone about my habit tracking journey highlights many of the reasons why it’s so effective.

First off, it’s extremely visible. I have to look at my habits every single time I exit my bedroom. Setting up obvious reminders to complete your habits is one of the key methods that James Clear lays out in his book Atomic Habits, and by staring at my list several times a day, I do just that.

Secondly, by physically checking off the boxes, I create a reward for myself. Making the habit satisfying is another one of Clear’s guidelines, and I can attest that there’s nothing more satisfying than having an entire week of habits checked off perfectly. There’s a reason why companies like Snapchat use the concept of “streaks” to motivate their users to keep using the app.

And thirdly, my habits are easy to complete. I’m never going to sit down and write an entire book in one go, but 500 (or even 50) words is a very digestible task. It doesn’t take much time or energy, but if I stick with it consistently I’ll have a full-length manuscript by the end of the year.

How can you start habit tracking?

You can turn almost any New Year’s Resolution into a daily habit. For example, saving more money can be as easy as skipping Starbucks and making your coffee at home every day. Investing in self-care can mean meditating or taking a hot bath every night before bed; getting fit is a matter of doing some small exercise every day; and you can achieve your yearly goal of reading more by getting through as little as three pages a day.

First, choose one or two mega-goals you’d like to accomplish. Start small — you can always add on more goals to your habit tracker once you’ve gotten used to it.

Then, choose a method to track your habits. You can go old-school like me and print out an Excel spreadsheet with the days as columns and one habit on each row, or you can try one of the many mobile applications. Momentum Habit Tracker allows you to export data to a spreadsheet and integrates seamlessly with your iPhone. If you’re an Android user, Habitica is a clever app that turns your habits into a motivating role-playing game.
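If you prefer the spreadsheet route but don’t want to build one by hand, a minimal weekly grid is easy to generate yourself. The sketch below (Python, with placeholder habit names and an assumed output filename) writes a CSV with one habit per row, one day per column, and blank cells to check off:

```python
import csv

# Placeholder habits and a Monday-to-Sunday week; adjust to taste.
habits = ["Workout", "Read 10 pages", "Write 500 words", "Floss"]
days = ["Mon", "Tue", "Wed", "Thu", "Fri", "Sat", "Sun"]

# One habit per row, one day per column, with empty cells to tick off.
with open("habit_tracker.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["Habit"] + days)
    for habit in habits:
        writer.writerow([habit] + [""] * len(days))
```

Open the resulting file in Excel or Google Sheets, print it, and you have a fresh grid for the week.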

Don’t let failure discourage you

If you miss a single day, it’s no big deal — researchers have found that a single missed opportunity has a negligible effect on habit formation. Just try to hop back on the habit train the next day.

I’ll also impart a few helpful tips I’ve learned through trial and error. First off, if you find that you’re consistently missing habits, they’re most likely too difficult. You want to set yourself up for success — if you find that you can’t write 1,000 words a day, scale it down to 100, or even just 10.

Similarly, learn to give yourself an occasional break. If you’re going on vacation, maybe ditch the habit of not eating dessert till you get back home. Habit tracking is meant for the long slog, and it’s not helpful to try to be perfect 100% of the time.

Finally, don’t choose habits or New Year’s Resolutions that you don’t enjoy (at least on some level). For years I tried to force myself to lift weights in a strict routine using dumbbells and gym machines, and could never stick with it. Now I focus on bodyweight movements that are fun and dynamic (pull-ups, handstands and other monkey-bar tricks), and I’ve consistently done my workouts for an entire year.

The ripple effects of expressing gratitude

A new study shows that expressing gratitude affects not only the grateful person, but anyone who witnesses it.

Researchers studying gratitude have found that being thankful and expressing it to others is good for our health and happiness. Not only does it feel good, it also helps us build trust and closer bonds with the people around us.

These benefits have mostly been observed in a two-person exchange — someone saying thanks and someone receiving thanks. Now, a new study suggests that expressing gratitude not only improves one-on-one relationships, but could bring entire groups together — inspiring a desire to help and connect in people who simply witness an act of gratitude.

In this extensive study, Sara Algoe of the University of North Carolina at Chapel Hill and her colleagues ran multiple experiments to investigate how witnessing gratitude affects people’s feelings toward the grateful person and the benefactor (the person who is being thanked).

They came up with a few different ways for participants to observe gratitude. In one experiment, participants were tasked with reading a movie review draft and underlining eye-catching passages for the reviewer’s benefit. Before they began, though, they saw an example (supposedly done by a previous participant). Several lines of text were underlined, as the assignment required, but many typos were also corrected, showing effort that went beyond the original assignment. In some cases, this help was acknowledged with a handwritten note from the reviewer saying, “Thank you so much for catching those typos!”

Afterward, participants underlined passages in another article by the reviewer, and researchers counted how many typos they corrected as a measure of their willingness to offer extra help. Then, people were asked how much they might like to be friends with the reviewer.

The results showed that people who had seen a note of gratitude were more willing to correct typos and help out, and more likely to want to become friends with the reviewer, than those who hadn’t.

“When people witness an expression of gratitude, they see that the grateful person is the kind of person who notices when other people do kind things and actually takes the time to acknowledge them — meaning, they’re a good social partner,” says Algoe. “People who are responsive as social partners are really desirable people.”

Based on other survey questions, Algoe and her team also discovered that participants wanted to help and affiliate with the person receiving the gratitude. That’s because receiving gratitude marks you as a person who is effective at being supportive or helpful, says Algoe.

“It’s helpful to know who the people in our environment are who will do nice things for other people, because they are attractive relationship partners,” she says.

What was the exact cause of participants’ reactions, though? It’s possible that people are attracted to others who seem positive in general, like the reviewer, or they simply felt elevated by witnessing the other participant’s generosity. To find out the active ingredient in gratitude and why it may have effects on bystanders, Algoe and her colleagues ran more experiments.

In one, they had participants watch videos where one member of a real-life couple expressed gratitude toward their partner. The videos varied in how much the grateful person praised their partner’s fine qualities — e.g., admiring their partner’s listening skills or their generous nature — and expressed how the partner’s generosity had benefitted them — e.g., helping them work through a difficult problem or save money they would have spent on a cab. The videos also varied in how warm, positive, or competent the person in the video appeared.

The results showed that participants were more drawn to videotaped individuals who praised their partner than to those who focused on how they’d benefitted personally. True, they were also drawn to grateful people who seemed warm, competent, and positive — but those traits didn’t matter nearly as much as praising another’s fine qualities.

To Algoe, this points to a particularly important element of gratitude — its other-focused nature — which may be key to influencing witnesses of gratitude.

“When a grateful person actually takes the time to step outside of themselves and call attention to what was great about the other person’s actions — that’s what distinguishes gratitude from other kinds of positive emotional expressions,” she says.

Interestingly, Algoe’s findings weren’t affected by the gender of the witnesses, grateful people, or benefactors. Although men might fear that gratitude makes them look weak or become indebted, even men who expressed gratitude were rated as more competent than those who didn’t.

These findings build on prior research by showing that expressions of gratitude not only provide social glue for the people involved — the grateful person and the benefactor — but also spread beyond the dyad, affecting witnesses in ways that could reverberate throughout a group.

“It’s easy to imagine how this might work in a workplace, where people are actually attending to and acknowledging other people’s good deeds and kindnesses,” says Algoe. “A whole group of people could be inspired to be kinder to one another, and, through this interwoven kindness, the group itself could become a higher-functioning group.”

Does this mean we should all be expressing gratitude more frequently? Yes, says Algoe — though how it’s expressed could differ by context and culture. For some situations, she says, it may be appropriate to be demonstrative rather than verbal — giving a hug, for example, or bringing a gift of flowers. In another context, a simple thank you — especially if it’s sincere and not manipulative — will get the ball rolling.

Whatever the case, though, it’s clear we can do more to increase social connection if we acknowledge the good in those around us. “Gratitude expression seems to be a unique kind of emotional experience that is really well-suited for relationship building,” says Algoe.

A key area of the brain is smaller in women on the pill

At the base of the brain is a small but crucial area that acts as a control hub for the nervous and hormonal systems. Now, a study has found that among women, it is significantly smaller in those using birth control pills.

New research finds an intriguing link between birth control pills and the size of a brain area key for managing the hormonal system.

The Food and Drug Administration (FDA) first approved birth control pills for use in the United States in 1960. Today, in the U.S., 12.6% of women between the ages of 15 and 49 years take these pills.

Known simply as “the pill,” this oral contraceptive is one of the most popular forms of birth control, but people also use it to help with a wide range of conditions, including irregular menstruation, acne, polycystic ovary syndrome, endometriosis, and cramps.

In essence, the pill began as a way of preventing pregnancy using hormone control.

Originally, manufacturers engineered it to stop ovulation through the hormone progesterone, but it has since evolved to include a myriad of different types. These involve various hormone combinations, doses, and schedules, depending on the desired outcome. People can also use the pill to skip menstruation or stop it entirely.

But what does this harnessing of hormone power mean for the body’s natural system of hormones?

Before the current study, which the researchers presented at the 2019 annual meeting of the Radiological Society of North America, there was very little research into the effects of birth control pills on the hypothalamus.

This small region of the brain, which sits above the pituitary gland at the organ’s base, performs the vital role of producing hormones and helping control a range of bodily functions — including sleep cycles, mood, sex drive, appetite, body temperature, and heart rate.

The researchers who presented the study acknowledged that before their work, there had not been any reporting on the effect of birth control pills on the structure of the human hypothalamus.

“There is a lack of research on the effects of oral contraceptives on this small but essential part of the living human brain,” says Michael Lipton, Ph.D., who is a professor of radiology at the Gruss Magnetic Resonance Research Center at Albert Einstein College of Medicine and medical director of MRI Services at the Montefiore Medical Center, both in New York City, NY.

This may be down to the fact that, until now, there was no known way of quantitatively analyzing MRI exams of the hypothalamus.

Lipton explained to Medical News Today that the team’s previous work also inspired them to investigate these effects. “We have reported some quite interesting findings on sex-based risk in brain injury,” he said. “Specifically, women seem to fare worse than men. Other studies have shown that the female sex hormone progesterone is neuroprotective.”

“Since [oral contraceptive pills] are widely used, we wanted to explore the effects of [oral contraceptive pills] in healthy women to understand their potential role in our sex-divergent findings. The finding we report here is one outcome from that exploration.”

Dramatic difference in hypothalamus size

“I was not expecting to see such a clear and robust effect,” said Lipton, adding, “We found a dramatic difference in the size of the brain structures between women who were taking oral contraceptives and those who were not.”

For the study, the researchers recruited 50 women in good health, 21 of whom were taking birth control pills.

The team carried out MRI scans, which use magnetic fields and radio waves to generate images of organs and tissues, to look at the brain of each of the 50 women. They then used a validated methodology to gauge hypothalamic volume.

“We validated methods for assessing the volume of the hypothalamus and confirm, for the first time, that current oral contraceptive pill usage is associated with smaller hypothalamic volume,” says Lipton.

The researchers found that the women taking birth control pills had a significantly lower hypothalamus volume than those who were not using oral contraceptives.

Hypothalamic volume and anger

Although the study found that there was no noteworthy link between hypothalamic volume and a woman’s cognitive ability, or ability to think, the preliminary findings suggest that there is an association between smaller hypothalamic volume and reduced anger.

“These findings are generally consistent with previous studies of [oral contraceptive pills] that support [an effect] on mood regulation. Our finding might represent a manifestation of the mechanism behind these effects or simply be unrelated. It is just too soon to tell,” said Lipton.

“This initial study shows a strong association and should motivate further investigation into the effects of oral contraceptives on brain structure and their potential impact on brain function,” concludes Lipton.

Regarding plans for future work, Lipton said: “For my group, the most important and immediate goal is to incorporate the role of [oral contraceptive pills] into our ongoing studies and to further explore the role of normal sex hormone cycles related to the menstrual cycle, as well as the role of androgens (testosterone) in men and women.”

Vagus Nerve: The mysterious nerve network that quiets pain and stress — and may defeat disease

Take a deep breath. Hug a friend. Reach for the ceiling and stretch your limbs. Each of these simple acts bestows a sense of calm and comfort. And each works its soothing magic in part by activating a complicated system of nerves that connects the brain to the heart, the gut, the immune system, and many of the organs. That system is known collectively as the vagus nerve.

The vagus nerve is one of the twelve cranial nerves, which sprawl out from the brain and into the body like an intricate network of roots. These nerve networks act as lines of communication between the brain and the body’s many systems and organs. Some of the cranial nerves interpret sensory information collected by the skin, eyes, or tongue. Others control muscles or communicate with glands.

The vagus nerve, also called the “10th cranial nerve,” is the longest, largest, and most complex of the cranial nerves, and in some ways it’s also the least understood. Experts have linked its activity to symptom changes in people with migraine headaches, inflammatory bowel disease, depression, epilepsy, arthritis, and many other common ailments. The more science learns about the vagus nerve, the more it seems like a better understanding of its function could unlock new doors to treating all manner of human suffering.

Vagus is Latin for “wandering,” which is apt when one considers all the different parts of the body the vagus nerve reaches. “It seems like every year somebody finds a new organ or system that it talks with,” says Tiffany Field, PhD, director of the Touch Research Institute at the University of Miami School of Medicine.

Field says that branches of the vagus nerve are connected to the face and voice. “We know that depressed people have low vagal activity, and this is associated with less intonation and less-active facial expressions,” she explains. A separate branch of the vagus nerve runs down to the gastrointestinal tract. Here, low vagal activity is associated with slowed gastric motility, which interferes with proper digestion, she says.

Still other branches of the vagus nerve are connected to the heart, the lungs, and the immune system. The vagus nerve’s activation or deactivation is tied to the ebb and flow of hormones such as cortisol and the digestive hormone ghrelin, the amount of inflammation the immune system produces, and many other internal processes that shape human health and experience. “There’s a massive bioelectrical and biochemical series of events that the vagus nerve is responsible for, and all that is almost impossible to map,” Field says.

How could one nerve system control so much? While some aspects of vagal activity are inscrutable, it’s clear that the nerve is the governor of the parasympathetic nervous system, which helps control the body’s relaxation responses. In simple terms, heightened vagal activity counteracts the stress response, which involves the sympathetic nervous system. “The sympathetic nervous system is fight or flight, while the parasympathetic nervous system is more chill out,” says Stephen Silberstein, MD, a professor of neurology and director of the Headache Center at Philadelphia’s Thomas Jefferson University Hospitals.

Silberstein co-wrote a comprehensive 2016 review of the research on the vagus nerve. He says that heightened vagal activity slows heart rate and also switches off inflammation, in part by triggering the release of immune system calming chemicals. There’s also evidence that activating the vagus nerve through electronic stimulation can produce a range of health benefits. “Depending on the frequency of the stimulation, we know it can turn off an asthma attack or an epileptic seizure,” Silberstein says. “It can turn off a migraine or cluster headache, and it can decrease the perception of acid reflux.”

Pick almost any common medical condition that’s made worse by stress or inflammation — everything from arthritis to inflammatory bowel disease — and there’s research showing that vagus nerve stimulation can help treat it or relieve its symptoms.

In the past, this stimulation required a surgical implant in the chest that transmits electrical pulses directly into the vagus nerve. But some newer, noninvasive devices — including one that has FDA approval for the treatment of migraine and cluster headaches — are capable of stimulating the vagus nerve when pressed against the skin of the neck. Silberstein says doctors are exploring the use of vagus nerve stimulation for a wide range of diseases and disorders, including afflictions of the mind.

“More and more, we’re learning how critical vagal activity is to attention and mood,” says Field. Already, there’s evidence that stimulating the vagus nerve may improve working memory or help people with attention deficit hyperactivity disorder. And since the early 2000s, the FDA has approved vagus nerve stimulation for the treatment of some forms of depression.

While electronic stimulation holds promise — and, in some cases, is already providing relief — for people with a range of ailments, Field says there are plenty of ways to stimulate vagal activity without a device or implant. “We know that massage and yoga promote parasympathetic nervous system activity, which is vagal activity,” she says.

Her research has shown that these and many related activities increase vagal activity via pressure receptors buried beneath the surface of the skin — receptors located throughout the body, and ones that only firm pressure or a deep stretch can reach. She points out that light touching or stroking is arousing, while a bear hug or powerful handshake is inherently soothing. “A strong hug or a handshake promote parasympathetic activity,” she says.

Silberstein says that almost anything people find relaxing — meditation, deep breathing — is also associated with heightened vagal activity and parasympathetic nervous system activity. “We did studies in the past showing that patients with migraine have impaired vagal activity,” he says. “We tried to fix that by doing yoga or deep-breathing meditation, and we found a lot of those things enabled us to activate the vagal nerve.” On the other hand, stress and anxiety are associated with depressed vagal activity, which may help to explain why these conditions are linked with an increased risk for other maladies.

There’s still a lot about the vagus nerve that science doesn’t understand. But as doctors uncover more of its secrets, these discoveries are revealing new and more effective ways to relieve pain, inflammation, sadness, and disease.