
Atheist philosopher thinks it’s reasonable to argue against reason

Have you noticed the remarkable cluelessness in atheist arguments about science and metaphysics? It’s really on display in a recent essay by atheist philosopher Justin E. H. Smith. In a nutshell, Smith, a professor of philosophy at Concordia University and author of Irrationality: A History of the Dark Side of Reason, argues that reason is inferior to the non-rational behavior of animals. Permit me to offer some responses to his essay, “If reason exists without deliberation, it cannot be uniquely human,” which advanced that view recently at Aeon:

Philosophers and cognitive scientists today generally comprehend the domain of reason as a certain power of making inferences, confined to the thoughts and actions of human beings alone. Like echolocation in bats or photosynthesis in plants, reason is an evolved power, but unlike these, the prevailing theory goes, it emerged exactly once in the history of evolution (porpoises and shrews also echolocate, cyanobacteria photosynthesise).

It’s not clear why Smith defines reason as “a certain power of making inferences.” The accepted definition of reason is simple and straightforward: it is the power to think abstractly, without concrete particulars. Abstract thought entails comprehension of concepts that are disconnected from particular objects. When I think about the ham sandwich I am eating for lunch, I am thinking concretely. When I think about the nutritional consequences of my choice of sandwich, I am thinking abstractly.

Only man thinks abstractly; that is the ability to reason. No animal, no matter how clever, can think abstractly or reason. Animals can be very clever but their cleverness is always about concrete things – about the bone they are playing with, or about the stranger they are barking at. They don’t think about “play” or “threat” as abstract concepts.

Reason is a power characteristic of man, to be sure, but it is not “an evolved power.” It didn’t “evolve.” The ability to reason didn’t evolve because it’s not a material power of the mind. Reason is an immaterial power of the mind – it is abstracted from particular things, and cannot logically be produced by a material thing.

Material states of the brain can, of course, influence our power of reason – an ounce of whiskey can have quite an effect on our judgment – but the power of reason itself is immaterial. It cannot “evolve” because natural selection, whatever its worth as a scientific hypothesis, needs matter to act on.

Reason is exceedingly rare, a hapax legomenon of nature, and yet this rarity has led to a bind: when pushed to account for its origins, thinkers who champion reason’s human exclusivity are forced to lean on supernaturalism, while those who contend that reason is a fundamentally natural property have then to concede that ‘lower’ lifeforms are capable of exercising it. The question is – how?

Reason isn’t rare at all. 7.7 billion people do it every day. But no non-human animal does it. This immaterial power of the soul is precisely what makes man qualitatively different from every other living thing. And I am not “forced to lean on supernaturalism” by pointing this out. I’m merely making an observation that’s obvious to all. Man, and man alone, has the power to reason.

Most philosophers and scientists who see reason as some sort of inferential ability involving abstract representations will allow that experiments with ‘higher’ animals can yield evidence of some low-level reason-like faculty: for example when apes hide stones in anticipation of future conflicts. But researchers almost always draw the lower limit for such ability in a way that excludes species whose behaviour is not observably similar to ours. The search for reason beyond the bounds of the human species always ends up as a search for beings that remind us of ourselves.

The most dramatic evidence for the difference between human reason and animal perception is the fact that humans and only humans “search for reason beyond the bounds of the human species.” We routinely ask questions that entail reasoning. Animals never do. That’s the point.

But what if reason is not so much an inferential ability, as simply the power to do the right thing in the right circumstances?

Reason has nothing to do with the power to “do the right thing.” The right thing may be a reflex or a brute assessment of a situation and an impulsive response. That may be fine – sometimes fight or flight is the smart thing to do. Reason is, as I’ve pointed out, just a power of abstract thought – the power to think without particular things in mind.

Smith quotes a 16th-century philosopher who believed that animal instinct is superior to human reason:

Rorario’s core idea is that human deliberation – the period of hesitancy when we survey our various options and eventually select what appears to be the best of them – far from being an advantage over other beings, is in fact a mark of our inferiority. Animals and plants do not hesitate. They cut right to the chase and, to the extent that they do not examine alternative options in order to choose among them, they are in a sense incapable of being wrong.

As a tool for survival, reason is neither invariably beneficial nor invariably harmful. It depends on the circumstance. Stopping to reason about a lion about to pounce on you will get you killed. Stopping to reason about why a culture of bacteria died when Penicillium notatum mold grew on the plate has saved hundreds of millions of lives. Reason is neither superior nor inferior to instinct when it comes to adaptation to circumstances. It depends on what we are adapting to.

Smith gets a glimmer of a deeper truth when he uses his (apparently inferior capacity for) reason to contemplate nature:

Nature itself is a rational order, on this alternative view, both as a whole and in any of its subdomains. Reason is everywhere, with human reason being only an instantiation or reflection, within a very tiny subdomain, of the universal reason that informs the natural world. The regularities of the motions of the heavens (to speak with the ancients) or the laws governing the orbits of the planets (to speak with the moderns) are not there in place of reason. Rather, these regularities or laws are the reflection ‘out there’ in the world of what human thought is ‘in here’ inside our minds.

Smith gets spooked, it seems, by the evidence for a Mind as the ground of nature and he begins to ramble about computers and algorithms. Contemplating the rationality that pervades nature, he longs for a way to cling to his materialism and atheism:

… we are able to preserve the naturalism that philosophy and cognitive science insist upon today, while dispensing with the human-exclusivity of reason. And all the better, since faith in the strange idea that reason appears exactly once in nature, in one particular species and nowhere else, seems, on reflection, to be itself a vestige of pre-scientific supernaturalism.

Supernaturalism was, and is, of course, not “pre-scientific” at all. The Scientific Revolution was a manifestly Christian endeavor. In fact, no officially atheist culture has ever had a scientific revolution of any sort, and contemporary atheist cultures contribute little or nothing to scientific progress. Inference to a rational Creator has always been the basis for modern science, and atheists and materialists like Smith are parasites living off of a scientific worldview built almost entirely by passionate theists immersed in “supernaturalism.” Case in point: Newton spent more time working on an exegesis of the Book of Daniel than he did on gravitation and calculus.

What sense does it make to look for reason and purpose in an absurd universe? Belief in God was indispensable to the Scientific Revolution, and where the materialist beliefs of atheists like Smith dominate, science withers. Smith meanders from nonsense about an atheist and materialist basis for science to his certainty about the existence of space aliens:

But terrestrial biology is now considered unexceptional too: most biologists today are inclined to believe that exoplanetary life is a statistical near-certainty. So perhaps it is also time to give up the idea of rationality as nature’s last remaining exception.

Smith is leading the way to the abandonment of rationality. There’s not a shred of reason in his essay. It’s amusing to ask: how can Smith “argue” that reason is inferior to brute action? Why would he choose an inferior method – reasoned argument – when he could simply overwhelm his interlocutors with force? Why waste time with “reason” when you can bash your listener over the head with a billy?

Think of the irony: a professor of philosophy, who is paid only to reason, uses reason to argue against reason. Welcome to the bowels of atheist metaphysics. It would be funny if it were not so dangerous to our culture and to our souls.

Michael R. Egnor, MD, is a Professor of Neurosurgery and Pediatrics at State University of New York, Stony Brook, where he has served as Director of Pediatric Neurosurgery, and is an award-winning brain surgeon. He was named one of New York’s best doctors by New York Magazine in 2005. He received his medical education at Columbia University College of Physicians and Surgeons and completed his residency at Jackson Memorial Hospital. His research on hydrocephalus has been published in journals including Journal of Neurosurgery, Pediatrics, and Cerebrospinal Fluid Research. He is on the Scientific Advisory Board of the Hydrocephalus Association in the United States and has lectured extensively throughout the United States and Europe.

Dim future? IQ scores are mysteriously declining throughout much of the developed world

People are getting dumber. That’s not a judgment; it’s a global fact. In a host of leading nations, IQ scores have started to decline.

Though there are legitimate questions about the relationship between IQ and intelligence, and broad recognition that success depends as much on other virtues, like grit, as on raw smarts, the IQ tests in use throughout the world today really do seem to capture something meaningful and durable. Decades of research have shown that individual IQ scores predict things such as educational achievement and longevity. More broadly, the average IQ score of a country is linked to economic growth and scientific innovation.

So if IQ scores are really dropping, that could not only mean 15 more seasons of the Kardashians, but also the potential end of progress on all these other fronts, ultimately leading to fewer scientific breakthroughs, stagnant economies and a general dimming of our collective future.

As yet, the United States hasn’t hit this IQ wall – despite what you may be tempted to surmise from the current state of the political debate. But don’t rush to celebrate American exceptionalism: If IQs are dropping in other advanced countries but not here, maybe that means we’re not really an advanced country (too much poverty, too little social support).

Or – just as troubling – if we are keeping up with the Joneses (or Johanssons and Jacques) in terms of national development, that means we are likely to experience similarly plummeting IQs in the near future. At which point, the U.S. will face the same dangers of intellectual and economic stagnation.

If we want to prevent America from suffering this fate, we’d better figure out why IQs are dropping elsewhere. But it’s uncharted territory. Until recently, IQ scores only moved in one direction: up. And if you’re thinking, “Isn’t the test set up so that 100 is always the average IQ?,” that’s only true because researchers rescale the tests to correct for improving raw scores. (Also, congrats, that’s the kind of critical thinking we don’t want to lose!)
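
To make the renorming point concrete, here is a minimal sketch, with invented numbers, of how a deviation IQ pins the population average at 100 no matter how raw scores drift:

```python
# Illustrative sketch of deviation-IQ renorming (invented numbers, not from
# this article): raw scores are mapped to a scale with mean 100 and SD 15
# against a norming sample, so when raw scores rise across the population,
# the test is re-normed and the average stays pinned at 100.
from statistics import mean, stdev

def deviation_iq(raw_score, norm_sample):
    """Convert a raw test score to an IQ using a norming sample's mean and SD."""
    mu, sigma = mean(norm_sample), stdev(norm_sample)
    return 100 + 15 * (raw_score - mu) / sigma

old_norms = [40, 45, 50, 55, 60]  # hypothetical mid-century norming sample
new_norms = [48, 53, 58, 63, 68]  # hypothetical modern, higher-scoring sample

# The same raw score of 58 is well above average against the old norms,
# but exactly average once the test has been re-normed:
print(round(deviation_iq(58, old_norms)))  # 115
print(round(deviation_iq(58, new_norms)))  # 100
```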

These raw scores have been rising on a variety of standard IQ tests for over half a century. That may sound odd if you think of IQ as largely hereditary. But current IQ tests are designed to measure core cognitive skills such as short-term memory, problem-solving speed and visual processing, and rising scores show that these cognitive capabilities can actually be sharpened by environmental factors such as higher-quality schools and more demanding workplaces.

For a while, rising IQ scores seemed like clear evidence of social progress, palpable proof that humanity was getting steadily smarter – and might even be able to boost brainpower indefinitely. Scholars called it the “Flynn effect,” in homage to J.R. Flynn, the researcher who recognized its full sweep and import.

These days, however, Flynn himself concedes that “the IQ gains of the 20th century have faltered.” A range of studies using a variety of well-established IQ tests and metrics have found declining scores across Scandinavia, Britain, Germany, France and Australia.

Details vary from study to study and from place to place given the available data. IQ shortfalls in Norway and Denmark appear in longstanding tests of military conscripts, whereas information about France is based on a smaller sample and a different test. But the broad pattern has become clearer: Beginning around the turn of the 21st century, many of the most economically advanced nations began experiencing some kind of decline in IQ.

One potential explanation was quasi-eugenic. As in the movie “Idiocracy,” it was suggested that average intelligence is being pulled down because lower-IQ families are having more children (“dysgenic fertility” is the technical term). Alternatively, widening immigration might be bringing less-intelligent newcomers to societies with otherwise higher IQs.

However, a 2018 study of Norway has punctured these theories by showing that IQs are dropping not just across societies but within families. In other words, the issue is not that educated Norwegians are increasingly outnumbered by lower-IQ immigrants or the children of less-educated citizens. Even children born to high-IQ parents are slipping down the IQ ladder.

Some environmental factor – or collection of factors – is causing children to score lower than their parents did, and younger siblings to score lower than their older brothers and sisters. One leading explanation is that the rise of lower-skill service jobs has made work less intellectually demanding, leaving IQs to atrophy as people flex their brains less.

There are also other possibilities, largely untested, such as global warming making food less nutritious or information-age devices sapping our ability to focus.

Ultimately, it’d be nice to pin down the precise reason IQ scores are dropping before we’re too stupid to figure it out, especially as these scores really do seem connected to long-term productivity and economic success.

And while we might be able to compensate with skills besides intelligence, like determination or passion, in a world where IQ scores continue to fall – and where the drop expands to places like the United States – there’s also a bleaker scenario: a global intelligence crisis that undermines humanity’s problem-solving capacity and leaves us ill-equipped to tackle the complex challenges posed by AI, global warming and developments we have yet to imagine.

Evan Horowitz is the director of research communication at FCLT Global, a financial think tank.

Actor John Cleese talks to reincarnation researcher Dr Jim Tucker about children’s past life memories

Regular readers of the Grail will know that legendary comedian and Monty Python alumnus John Cleese has a deep interest in research into the survival of consciousness beyond death. And if you’re interested in that topic, the place to go is the Division of Perceptual Studies (DoPS) at the University of Virginia, which has hosted researchers of the caliber of Dr Bruce Greyson (NDEs) and the late Dr Ian Stevenson (reincarnation memories).

So it’s no surprise to see a video posted recently online, embedded below, by the DoPS in which another researcher there, Dr Jim Tucker, is interviewed by John Cleese himself.

In the nine-minute-long video, Tucker gives a short history of the reincarnation research performed by the DoPS since the 1960s, beginning with Ian Stevenson, through which they have now collected 2500 cases of past-life recollection.

He then goes into detail about how they collect case information and evidence, along with a description of one of their ‘best’ evidentiary cases.

[embedded content]

John Cleese is a friend and supporter of the research being done at UVA DOPS into the nature of consciousness. Mr. Cleese has a long history of personal interest in this important area of inquiry.

In this video, John Cleese interviews pediatric psychiatrist and director of UVA DOPS, Dr. Jim Tucker, about his research into the memories of young children who appear to recall specific factual memories of a previous life. Dr. Tucker presents some of the details of the case of ‘Ryan’, who recalled specific facts about the life of a man named ‘Marty Martin’.

The Division of Perceptual Studies (DOPS) is a research unit within the Department of Psychiatry and Neurobehavioral Sciences at the University of Virginia Health System. The research faculty of the Division are known internationally for their expertise and research integrity in the investigation of phenomena relevant to the nature of consciousness and its relationship to the physical world.

You can learn more about the Division of Perceptual Studies at their website.

5G danger: Hundreds of respected scientists sound alarm about health effects of 5G networks going up nationwide

Even though many in the scientific community are loudly warning about the potential health effects that 5G technology could have on the general population, Verizon and AT&T are starting to put up their 5G networks in major cities all across the nation. Today, the total number of cell phones exceeds the entire population of the world, and the big cell phone companies are making a crazy amount of money providing service to all of those phones. And now that the next generation of cell phone technology has arrived, millions of cell phone users are looking forward to better connections and faster speeds than ever before. In fact, President Trump says that 5G networks will be up to 100 times faster than the current 4G networks that we are using right now…

5G will be as much as 100 times faster than the current 4G cellular networks. It will transform the way our citizens work, learn, communicate, and travel. It will make American farms more productive, American manufacturing more competitive, and American healthcare better and more accessible. Basically, it covers almost everything, when you get right down to it. Pretty amazing.

And just as 4G networks paved the way for smartphones and all of the exciting breakthroughs – they made possible so many things – this will be more secure and resilient. 5G networks will also create astonishing and really thrilling new opportunities for our people – opportunities that we’ve never even thought we had a possibility of looking at.

Sounds great, right?

But in order to achieve such vastly superior performance, 5G networks will use technology that is completely different from 4G networks.

5G waves are “ultra high frequency” and “ultra high intensity”, but they are also easily absorbed by objects such as buildings and trees. So although cell towers will be much, much smaller, they will also have to be much, much closer together than before. According to CBS News, it is estimated that the big cell phone companies will be putting up at least 300,000 of these small towers, and it has been projected that it will cost hundreds of billions of dollars to fully set up the 5G network nationwide.

Needless to say, there is a tremendous amount of money at stake, and the big cell phone companies are trying very hard to assure everyone that 5G technology is completely safe.

But is it?

Today, there is a growing body of scientific evidence indicating that the electromagnetic radiation we are constantly being bombarded with is not good for us. Hundreds of scientists who are engaged in research in this area have signed the “International EMF Scientist Appeal”, and this is how that document begins…

We are scientists engaged in the study of biological and health effects of non-ionizing electromagnetic fields (EMF). Based upon peer-reviewed, published research, we have serious concerns regarding the ubiquitous and increasing exposure to EMF generated by electric and wireless devices. These include – but are not limited to – radiofrequency radiation (RFR) emitting devices, such as cellular and cordless phones and their base stations, Wi-Fi, broadcast antennas, smart meters, and baby monitors as well as electric devices and infrastructures used in the delivery of electricity that generate extremely-low frequency electromagnetic field (ELF EMF).

In the next paragraph, we are told that “cancer risk”, “genetic damages”, “functional changes of the reproductive system”, and “neurological disorders” are some of the health risks that have been discovered by the scientific research that has been conducted so far…

Numerous recent scientific publications have shown that EMF affects living organisms at levels well below most international and national guidelines. Effects include increased cancer risk, cellular stress, increase in harmful free radicals, genetic damages, structural and functional changes of the reproductive system, learning and memory deficits, neurological disorders, and negative impacts on general well-being in humans. Damage goes well beyond the human race, as there is growing evidence of harmful effects to both plant and animal life.

And remember, 5G technology is going to take all of this to an entirely new level.

Because the 5G towers are going to be so powerful and so close together, it will essentially be like living in a closed radiation chamber 24 hours a day.

Over in Israel, one scientist has discovered that the surface of the human body actually draws in 5G radiation “like an antenna”…

What’s further disturbing about 5G radiation is how the human body responds to and processes it. Dr. Ben-Ishai from The Hebrew University of Jerusalem discovered as part of a recent investigation that human skin acts as a type of receptor for 5G radiation, drawing it in like an antenna.

“This kind of technology, which is in many of our homes, actually interacts with human skin and eyes,” writes Arjun Walia for Collective Evolution about the study.

“… human sweat ducts act like a number of helical antennas when exposed to these wavelengths that are put out by the devices that employ 5G technology,” he adds.

In other words, our bodies are essentially magnets for 5G radiation.

So will it be worth it?

Will you be willing to risk your life in order to have better connections and faster speeds?

Sure, your phone will be more useful than ever before, but there is also the possibility that you could get cancer. Even the American Cancer Society acknowledges the risk…

A recent large study by the US National Toxicology Program (NTP) exposed large groups of lab rats and mice to RF energy over their entire bodies for about 9 hours a day, starting before birth and continuing for up to 2 years (which is the equivalent of about 70 years for humans, according to NTP scientists). The study found an increased risk of tumors called malignant schwannomas of the heart in male rats exposed to RF radiation, as well as possible increased risks of certain types of tumors in the brain and adrenal glands.
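
As a rough illustration of the lifespan equivalence the NTP scientists invoke, the scaling is simply exposure expressed as a fraction of lifespan; the rat lifespan below is an assumption for illustration, and only the ~70-year figure comes from the quote:

```python
# Back-of-the-envelope lifespan scaling (rat lifespan is an assumption;
# the ~70-year human figure is the one cited by the NTP scientists above).
rat_lifespan = 2.0     # years, roughly a lab rat's lifetime (assumption)
human_lifespan = 70.0  # years, the NTP scientists' reference figure
exposure_years = 2.0   # RF exposure in the study, starting before birth

fraction_of_life = exposure_years / rat_lifespan  # 1.0, i.e. the whole lifespan
print(fraction_of_life * human_lifespan)          # 70.0 "human-equivalent" years
```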

Of course, all previous studies have been done on existing cell phone technology.

No studies have been done on the health effects of our new ultra-powerful 5G technology, and this has many scientists extremely concerned.

Dr. Martin Pall, Professor Emeritus of Biochemistry and Basic Medical Sciences at Washington State University, says that rolling out 5G without any safety testing whatsoever “has got to be about the stupidest idea anyone has had in the history of the world”.

Unfortunately, there is no organized opposition and 5G networks are going up all over the country right now.

So it won’t be too long before you are being bombarded by “ultra high frequency” and “ultra high intensity” cell phone radiation wherever you go, and most people won’t even realize what is happening.

And if you do get sick, the cell phone companies sure aren’t going to pay the bill.

About The Author

Michael Snyder is a nationally-syndicated writer, media personality and political activist. He is the author of four books including Get Prepared Now, The Beginning Of The End and Living A Life That Really Matters. His articles are originally published on The Economic Collapse Blog, End Of The American Dream and The Most Important News. From there, his articles are republished on dozens of other prominent websites. If you would like to republish his articles, please feel free to do so. The more people that see this information the better, and we need to wake more people up while there is still time.

Religious couples tend to have happier marriages

While progressive values might be a recipe for a happier marriage, they are less likely to lead to a rising fertility rate. Contra claims that feminism is the new pro-natalism, religion still seems the way to go if you want to have more kids.

Charles Fain Lehman is a staff writer for the Washington Free Beacon. He writes about policy, covering crime, law, drugs, immigration, and social issues. Reach him on Twitter (@CharlesFLehman) or by email at lehman@freebeacon.com.

Study shows CBD reduces cravings and anxiety in recovering heroin abusers

A small study has offered new data on how cannabis-derived cannabidiol (CBD) could help to curb the international opioid crisis.

Results of a study published this week in the American Journal of Psychiatry suggest that CBD could help to reduce the anxiety and cravings experienced by patients recovering from heroin addiction. Researchers reported that CBD may help to dampen signals in the body that trigger cravings and anxiety in former opioid abusers, in addition to lowering patients’ heart rates and levels of stress hormones.

Conducted at Mount Sinai Hospital in New York, the double-blind, randomized, controlled study looked at 42 people with a heroin-use disorder who were abstaining from the drug at the time. Participants took either a placebo, a 400 milligram dose of CBD, or an 800 milligram dose at different intervals, allowing researchers to examine the short- and long-term effects of administered CBD. During the study sessions, researchers monitored participants’ vital signs such as heart rate, blood pressure, respiratory rate, and oxygen saturation.

Researchers found that patients who were given CBD exhibited lowered feelings of anxiety or craving when exposed to certain psychological cues, such as videos of drug paraphernalia. Tests of such participants’ saliva also showed lowered levels of stress hormones, and participants’ heart rates were found to be lower than those in the placebo group. Some of these effects were evident in patients up to a week after their final dose of CBD.

Neuroscientist Yasmin Hurd, the study’s first author and director of the Addiction Institute at the Icahn School of Medicine at Mount Sinai, commented in a statement that the results are a further indication that CBD should be considered as a potentially significant option for treating patients recovering from opioid abuse.

“To address the critical need for new treatment options for the millions of people and families who are being devastated by this epidemic, we initiated a study to assess the potential of a nonintoxicating cannabinoid on craving and anxiety in heroin-addicted individuals,” Hurd wrote. “The specific effects of CBD on cue-induced drug craving and anxiety are particularly important in the development of addiction therapeutics because environmental cues are one of the strongest triggers for relapse and continued drug use.”

Hurd also noted that the cannabinoid chemical’s anxiety-reducing capability extends to other kinds of patients, but could offer key relief for patients recovering from opioid dependence. “[This] particular anxiety leads someone to take a drug that can cause them death, and anything we can do to decrease that means increasing the precious chance of preventing relapse and saving their lives.”

In recent years, CBD has been popularly treated as a potential “miracle drug,” in large part because of its ability to stimulate the endocannabinoid system, which maintains balance and function in the body. When this system itself gets out of balance, introducing phytocannabinoids such as cannabis-derived CBD can help to re-balance that system, allowing it to get back to work throughout the body.

It’s also worth noting that subjects in the Mount Sinai study were administered fairly large doses of CBD, which could have cost as much as $50 in the current medical cannabis market, based on typical per-milligram prices. Everyday sufferers of anxiety, pain, and more serious illnesses may use anywhere between 5 and 200 milligrams per day, and numerous researchers believe that in some patients, a lower dose of cannabidiol may be more efficacious than a higher one.
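
For a rough sense of that cost arithmetic, here is a back-of-the-envelope sketch; the per-milligram price is an assumed illustrative figure, not one reported in the study:

```python
# Dose-cost sketch at an assumed retail price of about $0.06 per mg of CBD.
price_per_mg = 0.0625  # USD per milligram; illustrative assumption only
for dose_mg in (400, 800):
    print(f"{dose_mg} mg/day costs about ${dose_mg * price_per_mg:.2f}")
# Output:
# 400 mg/day costs about $25.00
# 800 mg/day costs about $50.00
```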

As NBC News reported, Hurd and her team previously studied the effects of CBD and opioid abuse in animals, and after observing a reduced tendency to use heroin in those subjects, they decided to extend their research to human groups.

(h/t Scientific American, Newsweek)

Janet Burns covers tech, culture, and other fun stuff from Brooklyn, NY. She also hosts the cannabis news podcast The Toke.

Study shows students learn better when they take handwritten notes

The educational system is being swept along in the race for technological advancement and supremacy in nearly every country. I’m not anti-technology, but I’m aware of the disadvantages of the Computer Age.

Students in some schools around the world no longer engage in interesting practical work. Some instructors merely project a virtual lab or a live video on a big screen. Some students don’t even have to dissect animals anymore. What happened to the hands-on work? The “big screen” shows them exactly how it’s done.

I’m glad some schools are still holding on to the frog-dissection culture.

Laptops and tablets are increasingly replacing exercise books, pens, and paper. In some ways, this has become a standard for measuring the modernity of a school. Laptops are becoming smaller, flatter, faster, and hence more ubiquitous than ever. Tablets aren’t backing down in the race either. Colleges where students still take handwritten notes and use paper textbooks are thought of as archaic and non-progressive; however, that is not entirely accurate. At many schools, it’s a requirement for students to show up with a reliable laptop for their coursework. Some colleges provide these for all their students.

Most would think this a desirable upgrade. Paper can be cumbersome, heavy, and less manageable, not to mention that students suffer shoulder and back pain from carrying heavy books around. Furthermore, it’s easier to access textbooks online in PDF or EPUB format than to be compelled to buy paper books by different authors.

I’ll admit it. It’s faster to type your notes and highlight the critical pieces of information as the lecturer speaks, but what if students are missing out on something beneficial with this trend? It doesn’t much matter whether students are writing with a stylus on a tablet or with a pen on paper. The idea is that writing is more beneficial to the student’s long-term memory than typing [1].

The Pen Is Mightier Than the Keyboard

According to a 2014 study, students who take handwritten notes are more focused, process information more accurately and selectively, and have better long-term memory than laptop note-takers [2].

“Our new findings suggest that even when laptops are used as intended – and not for buying things on Amazon during class – they may still be harming academic performance,” said lead author Pam A. Mueller of Princeton University.

The researchers organized a group of 65 college students to watch five TEDx talks covering various topics. The students were instructed to take notes with their laptops or with pen and paper, as they usually would, dividing them into a handwritten-notes group and a laptop note-takers group. According to the press release, the information contained in the talks was “interesting but not common knowledge.” A half-hour later, both groups of students were asked questions on hard facts such as historical dates and names, and it was recorded that the two groups did “equally well.”

But when they were asked conceptual questions on applications and methods, such as “How do Japan and Sweden differ in their approaches to equality within their societies?”, the laptop note-takers performed “significantly worse.”

Writing fosters selectivity and accuracy

During the second study, the researchers instructed a group of laptop note-takers not to take down notes verbatim in the following lecture. Unfortunately, the students couldn’t help it, and having typed their notes word for word, they had significant trouble recalling essential points.

“Even when we told people they shouldn’t be taking these verbatim notes, they were not able to overcome that instinct. When people type their notes, they have this tendency to try to take verbatim notes and write down as much of the lecture as they can,” Mueller told NPR [3]. “The students who were taking longhand notes in our studies were forced to be more selective – because you can’t write as fast as you can type. And that extra processing of the material that they were doing benefited them.”

Despite this information, pen and paper probably aren’t creeping back in anytime soon. Technology can be addictive, and people would most likely need rehab to accept the old-fashioned way of writing once again.

“I think it is a hard sell to get people to go back to pen and paper,” she said. “But they are developing lots of technologies now like Livescribe and various stylus and tablet technologies that are getting better and better. And I think that will be sort of an easier sell to college students and people of that generation.”

Sources:

  1. Study Shows Students Learn Better When They Take Handwritten Notes. Megan Overdeep. BHG. Retrieved from https://www.bhg.com/news/study-shows-students-learn-better-when-they-take-handwritten-notes/. February 22, 2019.
  2. The Pen Is Mightier Than the Keyboard: Advantages of Longhand Over Laptop Note Taking. Mueller and Oppenheimer. Sage Pub. Retrieved from https://journals.sagepub.com/doi/abs/10.1177/0956797614524581. April 23, 2019.
  3. Attention, Students: Put Your Laptops Away. NPR. Retrieved from https://www.npr.org/2016/04/17/474525392/attention-students-put-your-laptops-away. April 17, 2016.

Child endangerment: Belgian legal opinion declares imposition of vegan diets on children is unethical; may lead to changes in law

A fresh report by the Belgian Royal Academy of Medicine has sparked a debate by claiming that bringing kids up vegan is ‘unethical.’ RT heard differing views on this potentially dangerous parental choice.

Plant-based food lacks proteins and vital amino acids and therefore can stunt growth and cause health problems for children, doctors said in a bombshell legal opinion published last week. Kids raised vegan need to be constantly monitored and require additional dietary intake to avoid deficiencies. Such a “destabilizing diet” is therefore “not ethical to impose on children,” the report argued.

The opinion was issued at the request of a government official after a number of cases in which children faced complications, including critical medical conditions, after switching to a vegan diet, according to the daily Le Soir. The report could lead to changes in the law. Forcing kids to abandon animal proteins could be legally qualified as “non-assistance to a person in danger,” a crime that carries a prison sentence of up to two years in Belgium, Professor Georges Casimir, one of the authors of the report, told the paper.

Should parents be punished for imposing their eating philosophy on their children when it can potentially have grave consequences? Jon Gaunt, a talk radio host and columnist, thinks no moral belief can justify potential harm to minors. “Children should not be forced to do it. There have been deaths in Belgium so you can understand why the state is concerned,” Gaunt said.

Penalties are not a solution for Dr. Myriam van Winckel, a pediatrician at Ghent University Hospital. Parents, however, should be informed that “the more restrictions you use, the more is the risk of deficiencies,” she told RT. The greatest risk comes from a shortage of vitamin B12, because its active form is absent in plant food. Also known as cobalamin, vitamin B12 is required for proper red blood cell formation and DNA synthesis. Its deficiency “can cause irreversible damage to the neurological system,” the doctor said.

Meanwhile, several vegan groups blasted the report, insisting that a balanced plant-based diet can support healthy living at all ages. Parents who choose a vegan lifestyle have every right to raise their children according to their moral convictions, Dr. Jeanette Rowley, a legal expert at the Vegan Society, told RT. Nonetheless, she thinks cases of child neglect “ought to be dealt with very precisely” regardless of the parents’ ethical orientation.