Blog

Junk food cravings linked to lack of sleep, study suggests

Having even one night without sleep leads people to view junk food more favourably, research suggests.

Scientists attribute the effect to the way food rewards are processed by the brain. Previous studies have found that a lack of shuteye is linked to expanding waistlines, with some suggesting disrupted sleep might affect hormone levels, resulting in changes in how hungry or full people feel.

But the latest study suggests that hormones may have little to do with the phenomenon, and that the cause could be changes in the activity within and between regions of the brain involved in reward and regulation.

“Our data brings us a little closer to understanding the mechanism behind how sleep deprivation changes food valuation,” said Prof Jan Peters, a co-author of the research from the University of Cologne.

Writing in the Journal of Neuroscience, Peters and colleagues describe how they recruited 32 healthy men aged between 19 and 33 and gave all of them the same dinner of pasta and veal, an apple and a strawberry yoghurt.

Participants were then either sent home to bed wearing a sleep-tracking device, or kept awake in the laboratory all night with activities including parlour games.

All returned the next morning to have their hunger and appetite rated, while 29 of the men had their levels of blood sugar measured, as well as levels of certain hormones linked to stress and appetite. Participants also took part in a game in which they were presented with pictures of 24 snack food items, such as chocolate bars, and 24 inedible items, including hats or mugs, and were first asked to rate how much they would be willing to pay for them on a scale of €0-€3.

During a functional magnetic resonance imaging (fMRI) scan, they were asked to choose whether or not they would actually buy the object when its price was fixed – an experiment that allowed researchers to look at participants’ brain activity upon seeing pictures of food and other items.

A week later, the experiment was repeated, with the participants who had previously stayed up allowed to sleep, and vice versa.

The results showed that whether sleep-deprived or not, participants were similarly hungry in the morning, and had similar levels of most hormones and blood sugar. However, when participants were sleep-deprived, they were willing to pay more for a food snack than when rested, and had higher levels in their blood of a substance called des-acyl ghrelin – which is related to the “hunger hormone” ghrelin, though its function is not clear.

The fMRI results showed that when sleep-deprived, participants had greater activity in the brain’s amygdala (where food rewards are processed) when food images were shown, and a stronger link between the price participants would pay for food and activity in the hypothalamus (which is involved in regulating consumption). Interactions between these two regions increased compared with when participants had slept.

But the team found there was no link between individuals’ changes in levels of des-acyl ghrelin and any of the brain or behaviour differences – although Peters said that could be because levels were very high and participants were equally hungry whether rested or not. This, the team said, suggested changes in brain activity in response to images of food after a bad night’s sleep were not just about hormones.

But Peters said that what was driving the changes in activity in the amygdala and hypothalamus was unclear. “We know that changes in other neurotransmitters such as dopamine occur following sleep deprivation, so this might be another candidate,” he said.

Christian Benedict, a neuroscientist at Uppsala University in Sweden who was not involved in the study, welcomed the research. He said that when individuals were sleep-deprived their brains used more energy, so it made sense that the brain would promote signals that might increase the consumption of food rather than waste energy on controlling impulses.

But he noted that the research had limitations, including that it was small and that blood was not taken when participants were viewing images of food during the scanning task. The study also did not compare the participants’ responses to healthy food.

He said it was important to remember that many factors besides sleep can affect body weight. “It is not only about sleep. Physical activity matters, dietary things, food and accessibility. So we should not break it down only to sleep.”

Doctoring Data – Science has turned to darkness

As readers of this blog know, I was obliterated from Wikipedia recently. Many have expressed support and told me not to get down about it. To be perfectly frank, the only time I knew I was on Wikipedia was when someone told me I was going to be removed. So it hasn’t caused great psychological trauma.

In fact my feelings about this are probably best expressed on a Roman tombstone. It has been translated in different ways, but my favourite version is the following:

I was not

I was

I am not

I care not

However, whilst my removal from Wiki is, in one way, completely irrelevant in the greater scheme of things, in another way it is hugely important. As Saladin said of Jerusalem, whilst he was battling with the Christians during the Crusades: ‘Jerusalem is nothing; Jerusalem is everything.’

My removal from Wikipedia is nothing. My removal from Wikipedia is everything. Not because it is me, but because of what it represents. Not to beat about the bush, there is a war going on out there between scientific enlightenment and the forces of darkness.

You think that is too dramatic? Well, this is what Richard Horton – editor of the Lancet for many years – has to say.

‘The case against science is straightforward: much of the scientific literature, perhaps half, may simply be untrue…science has taken a turn towards darkness.’ – Richard Horton

Science has taken a turn towards darkness… Of course science cannot really turn anywhere. It is not an entity. Science is simply made up of people. The scientific method itself is simply an attempt to discover what is factually true, by being as objective as possible and removing human bias. It is, like everything humans do, imperfect. Bias is always there.

What Horton means is that the methods used to pursue science have increasingly moved from the pure Olympian ideal, a disinterested quest for truth, to something else. Distortion, manipulation and bias. In some cases downright lies. I hesitate to use the term ‘fake news’, but that is what it is. What it is becoming.

As John Ioannidis had to say in his seminal paper ‘Why Most Published Research Findings Are False’:

‘Moreover, for many current scientific fields, claimed research findings may often be simply accurate measures of the prevailing bias. It is more likely for a research claim to be false than true.’

I think, in truth, this very much depends on the area of science you are looking at. Some are highly contentious – global warming, for example – and here you can see dreadful science being done on all sides, as people desperately try to prove their point.

Moving closer to my area, nutritional science is awful. A complete mess. I have virtually given up reading any paper in this area as they just annoy me so much. Ioannidis has looked at the area in some detail. To quote from The American Council on Science and Health:

Dr. Ioannidis bluntly states that nutrition epidemiology is in need of “radical reform.” In a paragraph that perfectly captures the absurdity of the field, he writes:

“…eating 12 hazelnuts daily (1 oz) would prolong life by 12 years (ie,1 year per hazelnut), drinking 3 cups of coffee daily would achieve a similar gain of 12 extra years, and eating a single mandarin orange daily (80 g) would add 5 years of life. Conversely, consuming 1 egg daily would reduce life expectancy by 6 years, and eating 2 slices of bacon (30 g) daily would shorten life by a decade, an effect worse than smoking. Could these results possibly be true?”

The answer to his rhetorical question is obviously no. So, how did this garbage get published?

How did this garbage get published? How does the garbage get published? On the other hand, how did a major study on replacing saturated fat with polyunsaturated fat (the Minnesota Coronary Experiment) NOT get published?

Because it found that polyunsaturated fat did lower cholesterol levels, but the more it lowered the cholesterol level, the greater the risk of death. Those were the results; this was the conclusion:

Available evidence from randomized controlled trials shows that replacement of saturated fat in the diet with linoleic acid effectively lowers serum cholesterol but does not support the hypothesis that this translates to a lower risk of death from coronary heart disease or all causes. Findings from the Minnesota Coronary Experiment add to growing evidence that incomplete publication has contributed to overestimation of the benefits of replacing saturated fat with vegetable oils rich in linoleic acid. https://www.bmj.com/content/353/bmj.i1246

The results of this study were eventually found in the garage of the son of one of the lead investigators. They were recovered and published forty years later, long after one of the lead investigators, Ancel Keys, had died.

I wrote the book Doctoring Data to try and shine some light on the methods used to distort and manipulate data. I try, as best as I can, to follow the scientific method. That includes discussion and debate, to test one’s ideas in the furnace of sustained attacks.

However, if you try to do this, the forces of darkness come after you, and they come hard. Especially if you ever dare to suggest that animal fats, saturated fats, are not in the least harmful. At which point you waken the vegan beast, and this beast is not in the least interested in science, or the scientific method, or discussion or debate.

It has one aim, and that is to silence anyone, anywhere, who dares to question the vegan philosophy. Aided and abetted by the Seventh Day Adventist church. Below is a short, non-exhaustive list of those who have suffered their wrath.

Prof Tim Noakes – South Africa

Dr Maryanne Demasi – Australia

Dr Gary Fettke – Australia

Professor John Yudkin – UK

Dr Aseem Malhotra – UK

Dr Uffe Ravnskov – Sweden

Dr Andreas Eenfeldt – Sweden

Dr Zoe Harcombe – UK

Dr Robert Atkins – US

Nina Teicholz – US

Gary Taubes – US

Dr Annika Dahlqvist – Sweden

Several of these doctors have been dragged in front of the medical authorities, usually by dieticians, who claim that patients are being damaged. So far, they have all won their cases – often after prolonged and expensive legal hearings. Luckily, the courts recognise logic when they see it.

Uffe Ravnskov had his book The Cholesterol Myths, which questions the cholesterol hypothesis, burned live on air. All of the brave souls on this list have been accused of ‘killing thousands’ at one time or another. Maryanne Demasi lost her job with the Australian Broadcasting Corporation.

Now, it seems, the attacks have moved into a different area, such as a determined effort to remove such people from Wikipedia. When the vegans find someone they don’t like, they work tirelessly to extinguish them from the record. They call them kooks and quacks – but they never ever reveal who they truly are. They exist in the shadows.

They got rid of me from Wikipedia, and they are currently attacking Aseem Malhotra, Uffe Ravnskov, Jimmy Moore, and the entire THINCS network (The International Network of Cholesterol Skeptics). There are even worse things going on that I cannot speak about yet.

Yes, this is science today. At least it is one part of science – a part which is not really science at all – and it has definitely turned to the darkness. You can be accused of being a kook and a quack by someone who hides behind anonymity and never dares to show their face. In truth, I know who it is. Someone found out for me. Yes, MCE, it is you.

If you want a debate, come out into the open and reveal yourself, your motives and your arguments to the world. Then we can do science. Until then, please expect me to hold you in the contempt that you deserve.

Addiction and a lack of purpose



How the opioid epidemic is related to a “purpose deficient” culture.

As you are no doubt aware, the United States is presently experiencing an opioid epidemic. There are many reasons for this – one of the most obvious being the reckless over-prescription of opiate-based painkillers by doctors, leading to dependency. But on a psychological level, we have to take into account the strong relationship between addiction and the lack of a sense of purpose.

To some extent, addiction is the result of a lack of purpose. It’s partly the consequence of experiencing what the psychologist Viktor Frankl called the ‘existential vacuum’ – feeling as though there is no purpose or meaning to your life. With a strong sense of purpose, we become very resilient, able to overcome challenges, and to bounce back after setbacks. We are also better able to deal with – and perhaps more motivated to overcome – the painful effects of past trauma. We never wake up in the morning with no reason to get out of bed. Life seems easier, less complicated and stressful. Our minds seem somehow tauter and stronger, with less space for negativity to seep in.

But without a sense of purpose, we are more vulnerable to becoming depressed in response to negative events. We become more susceptible to psychological discord – to boredom, frustration and pessimism. We are more liable to feel the residual pain of trauma from the past (and traumatic past experiences in themselves have also been linked to addiction). Drugs and alcohol are therefore appealing as a way of escaping the psychological problems caused by a lack of purpose. But addiction can also be seen as an attempt to find a purpose. After all, when a person becomes an addict, their life takes on a very strong sense of purpose: to satisfy the addiction. I have often heard addicts describe how simple life becomes in addiction. There is always a clear goal in your mind and a motivation behind every moment of your existence. Everything else is secondary to the overriding purpose of feeding your addiction.

And just as there is a relationship between addiction and a lack of purpose, there is a relationship between recovering from addiction and gaining a new sense of purpose. Research has shown that, without a new sense of purpose, recovery does not tend to last. The stronger and more established a person’s sense of purpose is, the greater chance they have of remaining sober. Partly this is because there has to be something to replace the sense of purpose provided by the substance, otherwise a person is likely to return to that purpose. At the same time, a new sense of purpose can provide the resilience needed to overcome the challenges that arise with new sobriety.

A Crisis of Purpose

In my view, the present-day prevalence of addiction is at least in part symptomatic of a crisis of purpose in secular Western culture. It’s partly the result of a sense of disorientation and frustration due to the lack of availability of fulfilling, purposeful modes of living. The primary purpose that our cultures offer is what you could call a ‘self-accumulative’ purpose. We are encouraged to think of happiness in terms of acquisition and achievement. We try to get as many qualifications as we can so we can get good jobs, earn good money, buy possessions and pleasures, and slowly work our way up the ladder of success. But it’s not surprising that so many people find this type of purpose unfulfilling. After all, there is very little evidence that material success and professional achievement contribute to personal well-being. And because success and wealth are limited commodities, there is a great deal of competition for them. It’s easy for people to fall behind, to lose motivation, or even to fall off the ladder altogether.

Another possibility is a religious purpose. Religion is appealing to many people because it does provide a strong sense of purpose and meaning. However, taking on a religious purpose usually involves accepting questionable and irrational beliefs, and subsuming one’s individuality and intellectual independence within a pre-given framework, which is difficult for many of us to do. As a result, religion offers only limited help.

More Fulfilling Types of Purpose

To my mind, therefore, large-scale problems with substance abuse are an inevitable result of a culture that is ‘purpose deficient.’ It’s clearly essential to take appropriate social and political measures, such as reducing the availability of prescription opioids and funding recovery programs, but at the same time – from a longer-term perspective – we need to encourage the adoption of more satisfactory and fulfilling types of purpose, beyond self-accumulation and religion.

In a research project at my university, we studied the effects of different types of purpose and found that an altruistic and a ‘self-expansive’ purpose were much more strongly associated with well-being. (By ‘self-expansive’ we mean that a person’s sense of purpose is to grow, to expand themselves and their horizons, and to fulfill their potential. This often means following a creative path, or a path of personal or spiritual development.) Altruism is much more fulfilling than self-accumulation because it connects us to other people, and helps us to transcend a self-centered preoccupation with our own desires or worries. Altruism is also non-material, and therefore limitless. We don’t need to compete against each other for kindness. A self-expansive purpose is so fulfilling because it gives us a sense of dynamic movement, with feelings of flow and accomplishment and meaning.

So in order to transcend our present state of purpose deficiency (and so transcend the psychological discord that so easily leads to addiction), there ought to be a cultural movement to encourage the adoption of these different types of purpose. Rather than encouraging consumerism and competitiveness, we should be encouraging altruism, creativity and spirituality. Perhaps most of all, this should begin with an overhaul of education systems, which tragically neglect these areas, in favor of training pupils and students for a treadmill of unfulfilling striving.

Unless this happens – whilst it may be mitigated by social and political changes – the tragic specter of addiction will always hover over us.

About the author

Steve Taylor is a senior lecturer in Psychology at Leeds Beckett University, UK. His latest books in the US are The Calm Center and Back to Sanity: Healing the Madness of the Human Mind. He is also the author of The Fall, Waking From Sleep, and Out Of The Darkness. His books have been published in 19 languages. His research has appeared in The Journal of Transpersonal Psychology, The Journal of Consciousness Studies, The Transpersonal Psychology Review, The International Journal of Transpersonal Studies, as well as the popular media in the UK, including on BBC World TV, The Guardian, and The Independent.

Don’t Deny Girls the Evolutionary Wisdom of Fairy-Tales and Princesses

Psychologist Joyce Benenson, who researches sex differences, traces women’s evolved tendency to opt for indirectness – in both competition and communication – to a need to avoid physical altercation, either with men or other women. This strategy would have allowed ancestral women to protect their more fragile female reproductive machinery and to fulfill their roles as the primary caretaker for any children they might have.

Sure, today, a woman can protect herself against even the biggest, scariest intruder with a gun or a taser – but that’s not what our genes are telling us. We’re living in modern times with an antique psychological operating system – adapted for the mating and survival problems of ancestral humans. It’s often mismatched with our current environment.

Understanding this evolutionary mismatch helps women get why it’s sometimes hard for them to speak up for themselves – to be direct and assertive. And identifying this as a problem handed them by evolution can help them override their reluctance – assert themselves, despite what feels “natural.” Additionally, an evolutionary understanding of female competition can help women find other women’s cruelty to them less mystifying. This, in turn, allows them to take such abuse less personally than if they buy into the myth of female society as one big supportive sisterhood.

Such myths have roots in academia. Academic feminists typically contend that culture alone is responsible for behavior. If pressed, they’ll concede to some biological sex differences – but only below the neck. They deny that there could be psychological differences that evolved out of the physiological differences – and never mind all the evidence.

For example, evolutionary psychologists David Buss and David Schmitt explain – per surveys across cultures – that men and women evolved to have conflicting “sexual strategies.” In general, “a long-term mating strategy” (commitment-seeking) would be optimal for women and a “short-term mating strategy” (the “hit it and quit it” model) would be optimal for men. (Guess which model is championed in princess movies?)

In almost all species, it’s the female that gets pregnant and stuck with mouths to feed. So human women – like most females in other species – evolved to seek high-status male partners with an ability and willingness to invest.

This evolutionary imperative is supported by women’s emotions, which anthropologist John Marshall Townsend explains, “act as an alarm system that urges women to test and evaluate investment and remedy deficiencies even when they try to be indifferent to investment.” In Townsend’s research, even when women wanted nothing but a one-time hookup with a guy, they often were surprised to wake up with worries like “Does he care about me?” and “Is sex all he was after?”

In other words, the allure of “princess culture” was created by evolution, not Disney. Over countless generations, our female ancestors most likely to have children who survived to pass on their genes were those whose emotions pushed them to hold out for commitment from a high status man – the hunter-gatherer version of that rich, hunky prince. A prince is a man who could have any woman, but – very importantly – he’s bewitched by our girl, the modest but beautiful scullery maid. A man “bewitched” (or, in contemporary terms, “in love”) is a man less likely to stray – so the princess story is actually a commitment fantasy.

Because of this, princess films can be the perfect foundation for parents of teen girls to have conversations about the realities of evolved female emotions in the mating sphere. A young woman who’s been schooled (in simple terms) about evolutionary psychology is less likely to behave in ways that will leave her miserable – understanding that being “sexually liberated” might not make her emotionally liberated enough to have happy hookups with a string of Tinder randos.

As for the notion that watching classic princess films could be toxic to a girl’s ambition, let’s be real. Girls are being sent in droves to coding camp and are bombarded with slogans like “The future is female!” And young women – young women who grew up with princess films – now significantly outpace young men in college enrollment.

Children aren’t idiots. They know that talking mirrors and pumpkins that Uber a girl to the royal prom aren’t real, and they aren’t having their autonomy brainwashed away by feature-length cartoons – just as none of us dropped anvils on the neighbor kids after watching Road Runner. Ultimately, these bans on princess movies are really about what’s psychologically soothing for the parents, not what’s good for children. Preventing children from watching princess films and other fantasy kid fare gives parents the illusion of control, the illusion that they’re doing something meaningful and protective for their children.

Author and activist Lenore Skenazy urges parents to go “free range” – give their kids healthy independence, such as by letting them ride their bikes around the neighborhood without being accompanied by a rent-a-mercenary. I suggest parents also go psychologically free range. This means allowing children to watch classic Disney films instead of giving in to the ridiculous panic that their daughters will start seeing “princess” as a career option.

Stories give us insight into successful human behavior; they don’t turn us into fleshy robots who act exactly as the characters do. Understanding this is essential, because if we instead succumb to the current paranoia, we’ll have to remove much of the fiction from the high school curriculum – lest, say, Edgar Allan Poe’s Tell-Tale Heart lead yet another generation of young readers to murder their roommates, cut them up, and stash them under the floorboards.

Amy Alkon sneers at “self-help” books and instead writes “science help” – translating research from across scientific disciplines into highly practical advice. Her new science-based book is Unf*ckology: A Field Guide to Living with Guts and Confidence. Follow Amy Alkon on Twitter at @amyalkon.

Slowly but surely, psychology is accepting that faith might play a role in treatment

For anyone who took a college course in psychology more than a decade ago or who is even casually acquainted with the subject through popular articles, a close examination of today’s field would undoubtedly prove surprising. The science that for most of the 20th century portrayed itself as the enlightened alternative to organized religion has taken a decidedly spiritual turn.

Bowling Green State University professor Kenneth Pargament, who in 2013 edited the American Psychological Association’s Handbook of Psychology, Religion, and Spirituality, notes just how dramatically his profession’s attitude towards faith has changed in recent times. As a young academic interested in the connection between mental health and religion, he would “go to the library once a semester and leisurely review the journals” only to be disappointed by how little his colleagues had to say about it. But “no more,” Pargament happily reports. In fact, he adds, “it is hard to keep up with the research in the field.”

Today’s psychology tells us that faith can be very helpful in coping with major life setbacks, including divorce, serious illnesses, the death of a loved one, and even natural or human-caused disasters. A study by the RAND Corporation, published in the New England Journal of Medicine just after the 9/11 attacks, found that 90 percent of Americans coped with the trauma by “turning to God.” During the week that followed, 60 percent went to a church or memorial service, and sales of the Bible rose more than 25 percent.

Other studies have shown that religious people are less prone to depression and anxiety, are less likely to abuse alcohol and drugs, and have above average immunity to physical diseases. As a result, psychologists are now developing faith-based approaches to treating chronic anger and resentment, the emotional scars of sexual abuse, and eating disorders.

To appreciate this intellectual shift, we must remember that for most of the last century, the leading names in psychology – including Sigmund Freud and behaviorism founder John Watson – regarded a belief in God not merely as wrong-headed, but as itself a form of mental illness. Freud wrote that civilization “has little to fear from educated people and brain-workers” who had properly rejected traditional religion. Watson regarded the concept of a soul as an obvious psychological defect, arguing that no one had ever “touched the soul, or has seen one in a test tube, or has in any way come into a relationship with it.”

This anti-religious bias was for decades compounded by the fact that most people attracted to the mental health field were (and continue to be) very different from average Americans in at least one respect. “Whereas 90% of the population believes in God,” says Dr. Pargament, “studies indicate that only 25% of those who go into psychology do.”

So strong was the resulting professional prejudice against religious faith that for decades the very origins of academic psychology were deliberately ignored or distorted. Completely eliminated from most textbooks was the fact that the scientific study of mind had begun not with Freud and Watson, but a half-century earlier under the auspices of Princeton’s James McCosh, Yale’s Noah Porter, and other American college presidents, most of whom were ordained ministers. These 19th-century educators assumed long-term happiness to be the result of spiritual discipline and established the first social science courses in the belief that researchers could eventually prove this connection.

Twentieth-century psychologists who persisted in studying the upside of faith, most notably Harvard professor William James and Freud’s one-time pupil Carl Jung, were invariably dismissed as “unscientific,” with courses based on their work either banished to religion departments or dropped from the curriculum altogether. At the same time, popular self-help programs that treated emotional complaints as symptoms of a spiritual deficiency – the Emmanuel Movement (1906-1929), the Oxford Group (1921-1938), Alcoholics Anonymous (founded in 1935), and a variety of 1960s group therapies – were either ignored or derided as unprofessional.

Indeed, modern psychology’s return to its 19th-century religious roots has not come without considerable resistance. As recently as 1993, nearly one quarter of all cases described in the Diagnostic and Statistical Manual (DSM), the standard reference used by psychiatrists and insurance companies to define emotional disorders, were framed as problems of “neurotic religious thinking.”

Not that materialistic psychologies had ever accomplished much when it came to the acid test of their utility: curing depression, anxiety, phobias, and other common emotional disorders. “Some Implicit Common Factors in Diverse Methods of Psychotherapy,” published in 1936 by Harvard-trained psychologist Saul Rosenzweig, was the first in a long line of studies to show that no widely practiced psychotherapy was any more effective than any other – or was any more likely to help patients than if they just confided in a close friend. Watson’s behaviorism, with its systematic use of rewards and punishments to alter habits, did work on rats, monkeys, and other experimental animals, but its influence over humans seemed to evaporate outside the laboratory.

Unable to generate any practical applications, 20th-century psychoanalysis, behaviorism, psychopharmacology, and other secular theories derived their authority primarily from the self-serving endorsement of what Irving Kristol famously termed the “New Class.” By that Kristol had meant the growing ranks of government administrators, economists, criminologists, educators, urban planners, and other professionals whose livelihoods depended on the public’s faith in both the feasibility and desirability of social engineering. Whereas the spiritual psychology of early America’s clerical college presidents had placed responsibility for human progress in the hands of the individual – his or her moral choices reverberating in ways that were both personally and socially beneficial – its materialistic successors provided intellectual grounds for empowering credentialed elites.

Even as mid-20th-century advances in attitude measurement, combined with more sophisticated techniques for tracking subjects’ behavior over many decades, finally made it possible to test the early college presidents’ belief in a connection between religious values and long-term health, few researchers were interested in trying. Some prominent figures, like American Psychological Association president Donald T. Campbell, did warn colleagues that they had “special reasons for modesty and caution in undermining traditional belief systems.” But it was not until the 1990s, when the Alcoholics Anonymous model for problem drinking was becoming widely noticed, that research on religion finally ceased to be the academic kiss of death.

Little more than a decade later, it was even safe to suggest that the mental health field was circling back to its 19th-century spiritual roots. Psychology was originally understood to be “the study of the soul,” wrote Dr. Len Sperry, clinical professor of psychiatry at the Medical College of Wisconsin, in a 2012 paper on “Spiritually Sensitive Psychotherapy.” Then, in a misguided quest to appear more scientific, he continued, “psychology sought to free itself from its roots in ‘value-based’ philosophy and become ‘value-free’…. God, afterlife, free-choice, and other spiritual phenomena [were treated as] false projections of the mind.” Today, “a paradigm shift is occurring, [one which] constitutes a radical and revolutionary change with many implications.”

If there is one remaining way that contemporary psychology keeps its distance from religion, it is the tendency to treat a faith in God or a “higher power” as essentially the same across all creeds. Hence a study of “Anxiety in Believers versus Atheists” is far more likely to be undertaken than a study of “Anxiety in Catholics versus Protestants” or “Anxiety in Muslims versus Buddhists.”

This tacit preference for treating devotional differences as secondary to a universal “religiosity” likely has many causes, including researchers’ understandable desire to discourage working therapists from thinking they should impose a belief system on patients. Many psychologists are also undoubtedly motivated by the same assumption that motivates many of today’s theologians – that emphasizing what all religions have in common contributes to social tolerance and cooperation.

Dr. Pargament would add to this list his profession’s tendency to attract spiritual skeptics. Because most psychology researchers “approach faith from a distance,” he says, they may acknowledge that faith is good for you, but “still are inclined to see various creeds as relatively undifferentiated.”

Nor, finally, can we ignore the very practical fact that depicting faiths as interchangeable insulates mental health professionals from the accusation that they are engaged in explicitly religious activity. In a society where so much of psychological research and treatment is government funded – the product of a time when social science and religion were viewed as mutually exclusive – any effort that threatens to frame one faith as superior to any other is certain to trigger First Amendment objections.

If there is a downside to this scrupulous non-sectarianism, it has been the development of a dry, technical language for studying religion that not only leaves the public cold but obscures the extent to which the century-old gap with religion has been bridged. At this point, mental health professionals and the clergy “have a lot in common and a lot to learn from each other,” says Pargament. “Bridges have been built and there’s been some good progress,” he adds, though “not nearly as much as there could be.”

How future psychologists will eventually resolve the tension between their reluctance to be too religiously explicit and their desire to be more relevant to the layperson is hard to predict. All that we can say for certain is that the old wall between science and religion, which not that long ago dominated and defined their field, has clearly been breached.

Dr. Lewis Andrews was executive director of the Yankee Institute for Public Policy at Trinity College from 1999 to 2009. He is writing a self-help book based on the spiritual wisdom of America’s early college presidents.

Heart-breaking season: Christmas Eve the peak time for heart attacks, says study

For some, Christmas can be a time of stress instead of peace and goodwill – and a new Swedish study shows that 10 pm on Christmas Eve is the annual peak time for heart attack risk, particularly for the elderly and those with existing conditions.

Researchers analysed data from 283,014 heart attacks reported to Swedish hospitals between 1998 and 2013, comparing holiday periods with the surrounding non-holiday weeks as a control.

In Sweden, Christmas Eve is actually a bigger event than Christmas Day, and researchers noted a 37 percent increased risk on this day, peaking at 10 pm. More generally, there was a 15 percent increased risk over the Christmas period.
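To make those percentages concrete, here is a minimal sketch (using invented counts purely for illustration, not the study’s actual data) of how an “increased risk” figure corresponds to a rate ratio when event counts on a given day are compared with a control average:

```python
# Minimal sketch: relating an "X percent increased risk" to a rate ratio.
# The counts below are invented for illustration; they are NOT the study's data.

christmas_eve_count = 822     # hypothetical pooled heart attacks on Christmas Eves
control_day_average = 600     # hypothetical average count for comparable control days

rate_ratio = christmas_eve_count / control_day_average    # 1.37
increased_risk_percent = (rate_ratio - 1) * 100           # 37%

print(f"Rate ratio: {rate_ratio:.2f}")
print(f"Increased risk: {increased_risk_percent:.0f}%")
```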

The risk was greatest in the over-75s and those with existing diabetes or heart disease. The study also noted more heart attacks reported during Midsummer holidays, in the early mornings, and on Mondays.

“Christmas and Midsummer holidays were associated with higher risk of myocardial infarction, particularly in older and sicker patients, suggesting a role of external triggers in vulnerable individuals,” the team explains in their study.

Previous studies have also made the link between holiday seasons and more heart attacks, but this new study adds some extra interesting detail from many more years of data.

No increased risk was spotted during sporting events or during the Easter period, for example. And while there was an increased risk at New Year, it was on New Year’s Day rather than New Year’s Eve – perhaps because too much partying the night before was leading to symptoms being ignored or misunderstood.

While this study on its own isn’t enough to prove cause and effect between Christmas and heart attack risk – unmeasured factors might be lurking unseen in the background, as with all observational studies like this – it does fit in well with the existing research out there.

Anger, anxiety, sadness, grief and stress have all been linked to heart attack risk in the past, and while we hope your Christmas is filled with joy and hope, these emotions can also come into play over the holiday season.

“Excessive food intake, alcohol, long distance travelling may also increase the risk of heart attack,” one of the team, David Erlinge from Lund University in Sweden, told ScienceAlert.

“Interestingly, the pattern of increased risk in the morning which dominates the rest of the year was reversed at Christmas, with an increased risk in the evening, indicating that the stress and eating during the day triggered the heart attacks.”

An earlier study from the same team linked increased heart attack risk with cold and cloudy weather, too. Considering the control data in this new study was taken from the weeks close to Christmas, this factor should already be accounted for.

The aim of the study is not to scare you away from indulging in the holiday festivities, but to look out for people who might be at risk and to try and cut down on the number of heart attacks seen over Christmas and New Year.

Erlinge told ScienceAlert that people should be aware of how emotional distress and eating way too much could increase risk – and of course to take good care of friends and family over the holiday season.

“These findings warrant further research to identify the mechanisms behind this phenomenon,” conclude the researchers.

“Understanding what factors, activities, and emotions precede these myocardial infarctions and how they differ from myocardial infarctions experienced on other days could help develop a strategy to manage and reduce the number of these events.”

The findings have been published in the BMJ.

Russian Hachiko: Loyal pooch spends weeks outside hospital awaiting owner’s recovery

Heavy snowfall, chilling wind, and temperatures far below zero are no obstacle for true love as proven by a loyal dog, who has been waiting for her sick master outside a hospital for two weeks now.

Her amazing fidelity has quickly made Cherry, from the Russian city of Voronezh, a media sensation and led to obvious comparisons with Hachiko.

Back in the 1920s, a Japanese dog waited for his owner’s return outside a train station for nine years, not knowing that he had passed away, and went on to become the ultimate symbol of friendship.

But Cherry’s story isn’t a sad one, as her master, Dmitry Bubnov, is recovering from his sickness and will soon be discharged.

However, even a short separation was unbearable for the mutt, who has been spending day and night outside the hospital since her master got there.

The man’s wife, Svetlana, said she tried to take the dog home many times, but Cherry always escaped and returned to the hospital walls. “She loves daddy more. She was his dog initially,” she told Mir 24.

Dmitry comes outside every day to spend time with his pooch. He feeds her cutlets and pies, and hugs and kisses her, making Cherry the happiest animal on the planet.

The ‘CICOpaths’ – Who’s to blame for fat-shaming?

In the 1970s, people largely ate 3 times per day – breakfast, lunch and dinner. If you were not hungry, then it was perfectly acceptable to skip a meal. That was your body telling you that you didn’t need to eat, so you should listen to your body.

By 2004, the number of meals per day had increased to closer to 6 – almost double. Now, snacking was not just an indulgence, it was encouraged as a healthy behavior. Meal skipping was heavily frowned upon. What kind of Bizarro world was this? You need to constantly shove food into your mouth to lose weight? Seriously? If you don’t eat, you’ll gain weight? Seriously? It sounds really stupid, because it is really stupid.

The admonishments against meal skipping were loud. Doctors and dieticians, with the heavy backing of corporate $$$, told patients to never, ever skip a meal. They warned of dire consequences. Magazines blared out warnings of the problems of meal skipping. From a physiologic standpoint, what happens when you don’t eat that is really so bad? Let’s see. If you don’t eat, your body will burn some body fat in order to get the energy it needs. That’s all. There’s nothing else. After all, that is the entire reason the body carries fat in the first place. We store fat so that we can use it. So if we don’t eat, we’ll use the body fat.

As people gained more weight, the calls for people to eat more and more frequently grew louder. It didn’t actually work, but that was beside the point. As people became obese, doctors would say to cut calories and eat constantly – graze, like some dairy cow in a pasture.

But this horrible advice didn’t work. So there are two potential sources of the problem. Either the dietary advice for weight loss was bad, or the advice was good, but the person was not following it. On one hand, the problem was the doctor’s advice. On the other, it was a patient problem. Let’s break it down to the basics. Physicians and other nutritional authorities believe religiously that excess calories cause weight gain. They advise patients to eat fewer calories. Either:

  1. The ‘Eat Fewer Calories’ advice is wrong and doesn’t work (Doctor’s fault)
  2. The advice is good, but the patient could not follow it. The spirit is willing but the flesh is weak. You’ve got the dream but not the drive. (Patient’s fault)

I believe that #1 is correct. Therefore, patients with obesity are victims of poor advice to eat more often and to lower dietary fat in a desperate effort to reduce caloric intake. Their weight problems are a symptom of a failure to understand the disease of obesity. I do not believe they have low willpower or character. To me, it is no different from treating a patient with cancer.

Many physicians and researchers believe option #2. They think the problem is not the advice. They believe the problem is the patients. It’s the ‘A Calorie is a Calorie’ people who are to blame for the phenomenon of fat shaming. They are blaming the victim because it exculpates their own failed understanding of weight gain. They believe that the obesity epidemic was the result of a worldwide collective simultaneous loss of willpower and character.

The name of this game is ‘Blame the Victim’. That way, doctors can go on believing that the advice they give is perfect. It was the patient’s fault. Does this make sense? Somewhere around 40% of the American adult population is classified as obese (BMI>30) and 70% are overweight or obese (BMI>25). Was this obesity crisis actually a crisis of weak willpower?
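For reference, the BMI cut-offs mentioned above come from the usual formula – weight in kilograms divided by height in metres squared. Here is a minimal sketch, with an invented example case (not taken from any study), of how those thresholds are applied:

```python
# Minimal sketch of the BMI thresholds referenced above: BMI = weight (kg) / height (m)^2.
# The example figures are invented purely for illustration.

def bmi(weight_kg: float, height_m: float) -> float:
    return weight_kg / (height_m ** 2)

def classify(value: float) -> str:
    if value >= 30:
        return "obese (BMI > 30)"
    if value >= 25:
        return "overweight (BMI > 25)"
    return "neither overweight nor obese"

example = bmi(95, 1.75)                             # hypothetical person: 95 kg, 1.75 m
print(f"BMI {example:.1f}: {classify(example)}")    # BMI 31.0: obese (BMI > 30)
```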

Consider an analogy. Suppose a teacher has a class of 100 children. If one fails, that may certainly be the child’s fault. Perhaps they didn’t study. But if 70 children are failing, is this more likely the children’s fault, or is it more likely the teacher’s fault? Obviously the teacher’s. In obesity medicine, the problem was never with the patient. The problem was the faulty dietary advice patients were given. But the CICOpaths, in their denial, have heaped the blame onto the obese patients who were the very victims of the doctors’ failure to understand obesity as a disease, not a personal character failing. The Calories people secretly believe that fat people deserve that shame.

This is why obesity is not only a disease with dire health consequences but also one that comes with a huge slice of shame. It is a disease with dire psychological consequences. People blame themselves because everybody tells them it was their fault. Nutritional authorities throw around the euphemism ‘personal responsibility’ when what they really mean is ‘It’s your fault’. But it’s not.

The real problem is the acceptance of the underlying assumption that obesity is all about ‘Calories In Calories Out’. This failed CICO mentality has pervaded our entire universe, and the natural conclusion of this line of thinking is that if you are obese, ‘It’s your fault’ that you ‘let yourself go’. You either failed to control your eating (low willpower, gluttony) or did not exercise enough (laziness, sloth). But it is not true. Obesity is not a disorder of too many calories. It’s a hormonal imbalance of hyperinsulinemia. Cutting calories when the problem is insulin is not going to work. And guess what? It doesn’t.

Not only do people with weight problems suffer all the physical health issues – type 2 diabetes, joint problems, etc. – but they also get the blame for them. Blame that is unfairly targeted toward them because the advice they received to lose weight had a 99% failure rate. Should people get angry about it? Absolutely. The next time some physician tells you that ‘It’s all about calories’ you have my permission to slug him/her.