Culture

The universality of shame

Shame on you. These three simple words can have a devastating effect on an individual's psyche.

But why is that? How is the feeling of shame generated, and what is its purpose? Some theorists argue that feeling shame is a pathology, a condition to be cured. Others dismiss it as a useless, ugly emotion.

A research team at the University of Montreal and UC Santa Barbara's Center for Evolutionary Psychology (CEP), however, suggests something altogether different. Shame, they argue, was built into human nature by evolution because it served an important function for our foraging ancestors.

Living in small, highly interdependent bands, the researchers explain, our ancestors faced frequent life-threatening reversals, and they counted on their fellow band members to value them enough during bad times to pull them through. So being devalued by others -- deemed unworthy of help -- was literally a threat to their survival. Therefore, when considering how to act, it became critical to weigh the direct payoff of a potential action (e.g., how much will I benefit by stealing this food?) against its social costs (e.g., how much will others devalue me if I steal the food -- and how likely is it that they will find out?).

The researchers hypothesized that the intensity of anticipated shame people feel is an internally generated prediction of just how much others will devalue them if they take a given action. Moreover, if this feature was part of human nature, it should be observed everywhere -- in every culture.

To test for universality, they selected a linguistically, ethnically, economically and ecologically diverse set of cultures scattered around the world. In these 15 traditional, small-scale societies, the researchers found that the intensity of shame people feel when they imagine various actions (stealing, stinginess, laziness, etc.) accurately predicts the degree to which those actions would lead others in their social world to devalue them. Their findings appear in the Proceedings of the National Academy of Sciences.
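
To make the claimed relationship concrete, the sketch below computes the correlation between two rating profiles across a handful of acts, the way one might summarize a single community's data. The acts and all ratings are invented for illustration; they are not the study's data.

```python
# A minimal sketch with made-up numbers: for each act, one set of
# participants rates the shame they would anticipate feeling, and another
# rates how much they would devalue someone who took that act. The theory
# predicts the two profiles rise and fall together.
import numpy as np

acts = ["stealing", "stinginess", "laziness", "cowardice", "infidelity"]
anticipated_shame = np.array([3.8, 2.9, 2.4, 3.1, 3.6])     # hypothetical 1-4 ratings
audience_devaluation = np.array([3.7, 2.7, 2.2, 3.0, 3.5])  # hypothetical 1-4 ratings

for act, s, d in zip(acts, anticipated_shame, audience_devaluation):
    print(f"{act:11s}  shame={s:.1f}  devaluation={d:.1f}")

# Pearson correlation across acts; a value near 1 would mean anticipated
# shame closely tracks how harshly the audience devalues each act.
r = np.corrcoef(anticipated_shame, audience_devaluation)[0, 1]
print(f"shame-devaluation correlation across acts: r = {r:.2f}")
```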

The Function of Feelings

"In a world without soup kitchens, police, hospitals or insurance, our ancestors needed to consider how much future help they would lose if they took various actions that others disapprove of but that would be rewarding in other ways," said lead author Daniel Sznycer, an assistant professor of psychology at the University of Montreal. "The feeling of shame is an internal signal that pulls us away from acts that would jeopardize how much other people value our welfare."

Noted Leda Cosmides, a professor of psychology at UC Santa Barbara, co-director of the CEP and a co-author of the paper, "For this to work well, people can't just stumble about, discovering after the fact what brings devaluation. That's too late. In making choices among alternative actions, our motivational system needs to implicitly estimate in advance the amount of disapproval each alternative action would trigger in the minds of others."

A person who did only what others wanted would be selected against, the authors point out, because they would be completely open to exploitation. On the other hand, a purely selfish individual would be shunned rapidly as unfit to live with in this highly interdependent world -- another dead end.

"This leads to a precise quantitative prediction," said John Tooby, a professor of anthropology at UC Santa Barbara, CEP co-director and a coauthor of the paper. "Lots of research has shown that humans can anticipate personal rewards and costs accurately, like lost time or food. Here we predicted that the specific intensity of the shame a person would anticipate feeling for taking an action would track how much others in their local world would negatively evaluate the person if they took that specific act.

"The theory we're evaluating," he continued, "is that the intensity of shame you feel when you consider whether to take a potential action is not just a feeling and a motivator; it also carries vital information that seduces you into making choices that balance not only the personal costs and benefits of an action but also its social costs and benefits. Shame takes the hypothetical future disapproval of others, and fashions it into a precisely calibrated personal torment that looms the closer the act gets to commission or discovery."

A Universal Human Quality

According to the authors, shame -- like pain -- evolved as a defense. "The function of pain is to prevent us from damaging our own tissue," said Sznycer. "The function of shame is to prevent us from damaging our social relationships, or to motivate us to repair them if we do."

As a neural system, shame inclines you to factor in others' regard alongside private benefits so the act associated with the highest total payoff is selected, the authors argue. A key part of the argument is that this neurally based motivational system is a part of our species' biology. "If that is true, we should be able to find this same shame-devaluation relationship in diverse cultures and ecologies all around the world, including in face-to-face societies whose small scale echoes the more intimate social worlds in which we think shame evolved," Sznycer noted.

To test this hypothesis, the team collected data from 15 traditional small-scale societies on four continents. The people in these societies speak very different languages (e.g., Shuar, Amazigh, Icé-tód), have diverse religions (e.g., Hinduism, Shamanism), and make a living in different ways (e.g., hunting, fishing, nomadic pastoralism). If shame is part of universal, evolved human nature, the research should find that the emotion closely tracks the devaluation of others, for each specific act, in each community; but if shame is more akin to a cultural invention like agriculture or the alphabet, present in some places but not others, they should find wide variation from place to place in this relationship. Indeed, anthropologists have long proposed that some cultures are guilt-oriented, some are fear-oriented, and some are oriented toward shame and honor.

Yet the authors found the predicted relationships everywhere they tested. "We observed an extraordinarily close match between the community's negative evaluation of people who display each of the acts or traits they were asked about and the intensities of shame individuals anticipate feeling if they took those acts or displayed those traits," Sznycer said. "Feelings of shame really move in lockstep with the values held by those around you, as the theory predicts."

Further studies, he added, have demonstrated that it is specifically shame -- as opposed to other negative emotions -- that tracks others' devaluation. "Moral wrongdoing is not necessary," said Sznycer. "In other research we showed that individuals feel shame when others view their actions negatively, even when they know they did nothing wrong."

Of particular note, anticipated shame mirrored not only the disapproval of fellow community members, but also the disapproval of (foreign) participants in each of the other societies. For example, the shame expressed by the Ik forager-horticulturalists of Ikland, Uganda, mirrored not only the devaluation expressed by their fellow Iks, but also the devaluation of fishermen from the island of Mauritius, pastoralists from Khövsgöl, Mongolia, and Shuar forager-horticulturalists of the Ecuadorian Amazon. What's more, shame mirrored the devaluation of foreigners living nearby in geographic or cultural space just as well as it mirrored the devaluation of foreigners living farther away -- another indication of shame's universality.

Credit: 
University of California - Santa Barbara

Prescribing antibiotics for children with cough does not reduce hospitalisation risk

Doctors and nurses often prescribe antibiotics for children with cough and respiratory infection to avoid return visits, symptoms getting worse or hospitalisation. In a study published in the British Journal of General Practice today [Tuesday 11 September], researchers from the Universities of Bristol, Southampton, Oxford and King's College London found little evidence that antibiotics reduce the risk of children with cough ending up in hospital, suggesting that this is an area in which unnecessary antibiotic prescribing could be reduced.

The team, funded by the National Institute for Health Research, analysed data from a study of 8,320 children (aged three months to 15 years) who had presented to their GP with cough and other respiratory infection symptoms to see whether adverse outcomes occurred within 30 days of seeing their GP.

Sixty-five (0.8 per cent) children were hospitalised and 350 (four per cent) revisited their GP due to a worsening of symptoms.

Compared with no antibiotics, there was no clear evidence that antibiotics reduced hospitalisation for children, supporting similar research findings in adults. However, there was evidence that a strategy of delayed antibiotic prescribing (giving parents or carers a prescription and advising they wait to see if symptoms worsened before using it) reduced the number of return visits to the GP.

Immediate antibiotics were prescribed to 2,313 children (28 per cent) and delayed antibiotics to 771 (nine per cent).

Dr Niamh Redmond, from the Centre for Academic Primary Care at the University of Bristol and NIHR CLAHRC West, and lead author of the study, said: "The good news is that most children who present to their GP with acute cough and respiratory infection symptoms are at low risk of hospitalisation. We know that GPs, for a variety of reasons, commonly prescribe antibiotics in these cases as a precautionary measure. However, our study shows that antibiotics are unlikely to reduce this already small risk. This means that along with other strategies, there is real potential to reduce unnecessary antibiotic prescribing, which is a major contributor to the growing public health threat of antimicrobial resistance.

"If a GP or nurse is considering antibiotic prescribing for a child presenting with acute cough, a delayed prescription may be preferable as we have shown that delayed prescribing reduces the need for further consultations."

Credit: 
University of Bristol

Acute critical illness increases risk of kidney complications and death

Acute critical illness in people without previous renal disease puts them at risk of kidney complications as well as death, according to a study published in CMAJ (Canadian Medical Association Journal).

"[P]atients with acute critical illness without apparent underlying renal disease -- a group traditionally considered to be at low risk of renal diseases -- have clinically relevant long-term renal risks," write Dr. Shih-Ting Huang and Dr. Chia-Hung Kao, Taichung Veterans General Hospital and China Medical University, Taichung, Taiwan, with coauthors.

Most previous studies have looked at patients with pre-existing kidney disease; this study compared data on 33 613 Taiwanese patients with acute critical illness and no pre-existing kidney disease against 63 148 controls for medium-term renal outcomes. More than half the patients (53%) were over age 65 and two-thirds (67%) had high blood pressure. Patients who had experienced acute critical illness were at increased risk of renal complications, developing chronic kidney disease and end-stage renal disease, with septicemia and septic shock being the strongest risk factors. Of the critically ill patients in the study, 335 developed end-stage renal disease, a rate of 21 per 10 000 person-years compared with 4.9 per 10 000 person-years in the control group.
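
For readers unfamiliar with the units, the rates above are incidence rates per 10 000 person-years of follow-up. A minimal sketch of the arithmetic, assuming an invented follow-up denominator (the paper's actual person-year totals are not quoted here):

```python
def rate_per_10k_person_years(events: int, person_years: float) -> float:
    """Incidence rate expressed per 10 000 person-years of follow-up."""
    return events / person_years * 10_000

# 335 end-stage renal disease cases are reported above; the denominator is
# back-calculated to reproduce the reported 21 per 10 000 person-years and
# is illustrative only.
print(f"{rate_per_10k_person_years(335, 159_500):.1f} per 10 000 person-years")  # ~21.0
```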

Patients who developed chronic kidney disease and end-stage renal disease were at a higher risk of death.

The authors suggest clinicians monitor kidney function at 30-90 days in patients with acute critical illness without pre-existing renal disease, and at least yearly afterwards.

"Renal complications and subsequent mortality in acute critically ill patients without pre-existing renal disease" is published September 10, 2018.

Credit: 
Canadian Medical Association Journal

Lifestyle changes reduce the need for blood pressure medications

CHICAGO, Sept 8, 2018 -- Men and women with high blood pressure reduced the need for antihypertensive medications within 16 weeks after making lifestyle changes, according to a study presented at the American Heart Association's Joint Hypertension 2018 Scientific Sessions, an annual conference focused on recent advances in hypertension research.

Lifestyle changes are the first step in reducing blood pressure, according to the 2017 American College of Cardiology/American Heart Association Hypertension Guideline.

"Lifestyle modifications, including healthier eating and regular exercise, can greatly decrease the number of patients who need blood pressure-lowering medicine. That's particularly the case in folks who have blood pressures in the range of 130 to 160 mmHg systolic and between 80 and 99 mmHg diastolic," said study author Alan Hinderliter, M.D., associate professor of medicine at University of North Carolina in Chapel Hill.

The researchers studied 129 overweight or obese men and women between ages 40 and 80 years who had high blood pressure. Patients' blood pressures fell in the range of 130-160/80-99 mmHg, but they were not taking medications to lower blood pressure at the time of the study. More than half were candidates for antihypertensive medication at the study's start, according to recent guidelines.

Researchers randomly assigned each patient to one of three 16-week interventions. Participants in one group changed the content of their diets and took part in a weight management program that included behavioral counseling and three-times-weekly supervised exercise. They changed their eating habits to follow the DASH plan, a nutritional approach proven to lower blood pressure. DASH emphasizes fruits, vegetables and low-fat dairy and minimizes consumption of red meat, salt and sweets. Participants in the second group changed diet only, focusing on the DASH diet with the help of a nutritionist. The third group didn't change their exercise or eating habits.

The researchers found:

Those eating the DASH diet and participating in the weight management group lost an average of 19 pounds and reduced their blood pressure by an average of 16 mmHg systolic and 10 mmHg diastolic by the close of the 16 weeks.

Those following only the DASH eating plan saw blood pressure decrease by an average of 11 mmHg systolic and 8 mmHg diastolic.

Adults who didn't change their eating or exercise habits experienced a minimal blood pressure decline, averaging 3 mmHg systolic and 4 mmHg diastolic.

By the study's end, only 15 percent of those who had changed both their diet and their exercise habits needed antihypertensive medications, as recommended by the 2017 AHA/ACC guideline, compared to 23 percent in the group that only changed their diet. However, there was no change in the need for medications among those who didn't change their diet or exercise habits - nearly 50 percent continued to meet criteria for drug treatment.

Hinderliter suspects lifestyle modifications would be just as helpful for people at higher risk of cardiovascular disease and for patients already taking medications for high blood pressure, but that needs confirmation in future studies, he said.

Credit: 
American Heart Association

Finding that links ALS/ataxia to cellular stress opens new approaches for treatment

SALT LAKE CITY -- Few treatments exist for neurodegenerative diseases that progressively rob a person's ability to move and think, yet the results of a new study could potentially open additional approaches for exploration.

Scientists at University of Utah Health report for the first time that a protein, called Staufen1, accumulates in cells of patients suffering from degenerative ataxia or amyotrophic lateral sclerosis (ALS), also known as Lou Gehrig's disease. Depleting the protein in affected mice improved symptoms, including motor function. These results suggest that targeting Staufen1 could have therapeutic potential in people. The research is published in Nature Communications.

"This is a completely new avenue for thinking about neurodegenerative diseases," says Stefan Pulst, M.D., Dr. Med., chair of Neurology at U of U Health and senior researcher on the study. "A protein that had never been known to be involved in neurodegeneration is now a great target for drug treatments."

Researchers had not considered Staufen1 a culprit in neurodegenerative disease until they discovered its association with ataxia, a rare condition that causes patients to lose control of their movement. They found that Staufen1 binds Ataxin2, a protein that is both responsible for ataxia and a risk factor for ALS.

A role for Staufen1 in disease pathology became evident upon genetically depleting it from mice with an ataxia-like condition. The animals' condition improved at both physiological and molecular levels.

Beginning at 12 weeks of age, mice performed significantly better on a rotarod performance test, measuring the length of time the animals could walk or run on an accelerating spinning rod. In addition, the expression of a handful of proteins that had diminished in brain cells during disease reverted back to near-normal levels.

"Staufen was first discovered in the fruit fly and has been studied for 30 years, but it had never been connected to anything related to disease," says Pulst. "This is a novel finding." Future investigations will focus on determining whether drugs or therapies that reduce Staufen1 could be developed as treatments for multiple diseases.

Beyond these applications, the biology of Staufen1 could reveal new clues about neurodegenerative disease. The protein accumulates with Ataxin2 and other proteins and RNAs in dense clusters called stress granules, a hallmark of ataxia, ALS and other conditions such as frontotemporal dementia. Depleting Staufen1 from mice with ataxia not only improved the pathology of the disease but also rid cells of stress granules.

While the precise role of stress granules is still an intensive area of study, they are believed to help cells weather stress caused by toxins or certain disease conditions, explains co-author Daniel Scoles, Ph.D., associate professor of Neurology at U of U Health. One function could be to prevent proteins from being made under suboptimal conditions.

The findings connect Staufen1 to the emerging concept that neurodegenerative diseases are linked to malfunctions in the way cells cope with cellular stress. One implication, says Scoles, is that Staufen1-targeted therapies could work against a number of disorders in which stress granules emerge, although it remains to be determined whether the aggregates themselves lead to disease.

"Our results put the stress granule in focus as a structure to target in disease," says Scoles.

Credit: 
University of Utah

New guidelines for traumatic brain injury -- Built with input from rehabilitation professionals

September 7, 2018 - Clinical practice guidelines play a critical role in promoting quality care for patients with traumatic brain injury (TBI). A new set of guidelines for rehabilitation of patients with moderate to severe TBI - incorporating insights from the rehabilitation professionals responsible for providing care from initial assessment through long-term follow-up - is introduced in the September issue of The Journal of Head Trauma Rehabilitation (JHTR), official journal of the Brain Injury Association of America. The journal is published in the Lippincott portfolio by Wolters Kluwer.

"The novel approach of consulting and working with end users to develop a clinical practice guideline for moderate to severe TBI should influence knowledge uptake for clinicians wanting to provide evidence-based care," according to an introductory article by Bonnie Swaine, PhD, of Université de Montréal and the Center for Interdisciplinary Rehabilitation Research (CRIR) and colleagues.

Unique Focus on Responding to Needs of Professionals Caring for TBI

The guidelines were developed by a collaborative effort of researchers, clinicians, and policymakers from Ontario and Quebec. The complete, bilingual (English and French) guidelines can be accessed at http://braininjuryguidelines.org. The guidelines were sponsored by the Québec Institut national d'excellence en santé et en services sociaux (INESSS) and the Ontario Neurotrauma Foundation (ONF).

Several sets of guidelines for TBI have been developed in recent years - so why develop a new clinical practice guideline now? "Because clinicians told us that they need specific features and tools," Dr. Swaine and coauthors write. Updated guidelines are also needed to reflect the trend toward community-based rehabilitation, as well as the context of the Canadian healthcare system.

From the outset, the guideline development process assessed the needs and expectations of "end users": the clinicians and managers providing rehabilitation care for patients with moderate to severe TBI. A study by Marie-Eve Lamontagne, PhD, of Université Laval, Québec City, and colleagues found that rehabilitation professionals expressed positive perceptions of clinical practice guidelines - however, only a small proportion of respondents used them in everyday practice. The professionals identified several key topics to be covered in guidelines, including the intensity and frequency of rehabilitation services, managing behavioral disorders and cognitive function impairment, and social participation and community life.

A separate survey asked professionals their views on how well guideline recommendations were implemented into the care of patients with TBI. While a high percentage of recommendations were considered "fully or mostly implemented," several gaps in implementation were recognized, both in acute care and rehabilitation settings.

An article by Mark Bayley, MD, of the University of Toronto and colleagues highlights the unique features of the INESSS/ONF guideline development process that address users' needs, including prioritization of recommendations for implementation, implementation tools, indicators to measure uptake, system implications, and the background rationale and evidence supporting each recommendation. The final clinical practice guideline includes 71 recommendations related to the components of the optimal TBI rehabilitation system, including the intensity/frequency of interventions, rehabilitation mechanisms, duration of interventions, and mechanisms for promoting continuity of care; and 195 recommendations pertaining to assessment and rehabilitation of the sequelae of brain injury, including behavioral disorders, cognitive dysfunction, fatigue and sleep disturbance, and mental health.

The vision behind the guidelines encompasses the whole "knowledge to action cycle," including measures to define and support implementation of the recommendations across Ontario and Québec. Drs. Swaine, Bayley, and Lamontagne and colleagues conclude: "Only time will tell whether our attention to user needs and expectations will positively influence the uptake of knowledge using the clinical practice guideline and, ultimately, patient outcomes following moderate to severe TBI."

Credit: 
Wolters Kluwer Health

What is shared decision-making and how does it work for allergists?

ARLINGTON HEIGHTS, IL (September 7, 2018) - If you and your doctor chat for a few minutes about treatment options for your allergy symptoms, is that considered shared decision-making (SDM)? It is not, according to a new article published in Annals of Allergy, Asthma and Immunology, the scientific publication of the American College of Allergy, Asthma and Immunology (ACAAI).

"Although health care providers think they know what is involved, many aren't aware there are distinct elements which must be included to make it SDM," says allergist Michael Blaiss, MD, ACAAI Executive Medical Director and lead author of the article. "SDM is a collaborative process between the healthcare provider and the patient that takes place throughout the entire visit. It's also an ongoing discussion throughout the course of the provider/patient relationship."

According to the article, SDM is characterized by the absence of orders and interruptions, by the avoidance of medical jargon, and by using dialogue with open-ended questions. Both the patient and the healthcare provider are sources of information, bringing different, but equally important types of expertise to the decision-making process.

"SDM for asthma has been shown to improve adherence and outcomes, as well as improve patient satisfaction with care," says allergist and SDM expert Eli Meltzer, MD, co-author of the article. "SDM is particularly important for any chronic diseases such as asthma and allergies which often require long-term, and potentially complicated or intensive treatments. Adherence to these treatments can be driven by patient characteristics and preferences, which cannot be fully explored without SDM."

The article acknowledges there are barriers to SDM including the perception by healthcare providers that SDM is too time-consuming. However, a review of studies on SDM showed that the median consultation time only increased 2.6 minutes when a patient decision aid was used.

"It's important for healthcare providers to understand that SDM is not interchangeable with informed consent in which patients are educated about options and asked if they agree to a treatment," says allergic diseases researcher Bruce Bender, PhD, co-author of the article. "SDM is also not simply providing patients with educational materials. If used properly, SDM can improve both the health of the patient, and the relationship between the patient and allergist."

ACAAI has introduced three new SDM patient decision aids for use by allergists with their patients. The tools walk patients through SDM for severe adult asthma, for atopic dermatitis (commonly known as eczema) and for allergy immunotherapy.

Credit: 
American College of Allergy, Asthma, and Immunology

Good governance, clean water, & sanitation necessary to curb global antibiotic resistance

Washington DC -- An international team of researchers from CDDEP, the Australian National University, Cardiff University and the Monarch Institute has published a new study showing that the proliferation of disease-causing antibiotic-resistant organisms is correlated with social and environmental factors such as poor sanitation, unsafe water and higher incomes. The results have been published in The Lancet Planetary Health.

The study, which was based on economic and public health data from 73 countries, found that better infrastructure and better governance were significantly associated with lower measures of antimicrobial resistance. Good governance includes lower corruption, political stability, rule of law, and absence of violence, while infrastructure measures include sanitation, safe water, internet accessibility, urbanization, and access to electricity. Improving sanitation, increasing access to clean water, and ensuring good governance, plus increasing public health expenditures, all need to be addressed to reduce global antimicrobial resistance, say the authors of the paper.

Although the use of antibiotics is commonly known to drive the emergence and maintenance of antimicrobial resistance, the team found that antibiotic consumption was not significantly associated with higher antimicrobial resistance. Reducing antibiotic consumption is insufficient to control antimicrobial resistance because contagion--the spread of resistant strains--seems to be the dominant factor.

"While reducing antibiotic consumption is important, we have to remember that resistance genes are already widely disseminated in the environment," according to Ramanan Laxminarayan, one of the study's authors. "Preventing transmission of resistant pathogens through investments in improved water and sanitation, and primary healthcare are central to our ability to tackle antimicrobial resistance."

"There are not magic bullets here," Laxminarayan said. "Any new antibiotic will run into the same challenges as existing ones and resistance will emerge rapidly unless we take the problems of improving the health system head on."

Credit: 
Center for Disease Dynamics, Economics & Policy

Warning about dietary supplements containing higenamine

ANN ARBOR, Mich., September 6, 2018 -- Less than two years after the World Anti-Doping Agency (WADA) added higenamine to its list of substances prohibited in sport, an international team of public health researchers has published a peer-reviewed study documenting inaccurately labeled and potentially harmful levels of the stimulant in weight-loss and sports/energy supplements available in the United States. Based on the findings, the researchers are urging consumers to use caution when consuming supplements labeled as containing higenamine. The research was published in the peer-reviewed journal Clinical Toxicology.

"We're urging competitive and amateur athletes, as well as general consumers, to think twice before consuming a product that contains higenamine," said John Travis, Senior Research Scientist at NSF International and a co-author of the study. "Beyond the doping risk for athletes, some of these products contain extremely high doses of a stimulant with unknown safety and potential cardiovascular risks when consumed. What we've learned from the study is that there is often no way for a consumer to know how much higenamine is actually in the product they are taking."

The independent study was conducted by researchers at global public health organization NSF International, Harvard Medical School and the National Institute for Public Health and the Environment (RIVM) in the Netherlands. The researchers studied 24 products labeled as containing higenamine or the synonyms "norcoclaurine" or "demethylcoclaurine" and found unpredictable and potentially harmful quantities of the stimulant, ranging from trace levels to 62 mg per serving. Of the 24 products tested, only five listed a specific quantity of higenamine on the label, and none of those five quantities was accurate. Based on the labeled directions for use, consumers could be exposed to up to 110 mg of higenamine per day. The health risks of higenamine remain poorly understood, but as a beta-2 agonist it has been prohibited in sport by WADA and therefore poses a risk to competitive athletes' careers.
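
The exposure figure follows from simple label arithmetic: measured dose per serving times the maximum servings per day the label directs. A minimal sketch with invented label values (only the 62 mg per serving and 110 mg per day endpoints come from the study):

```python
# Hypothetical label values chosen to illustrate the worst-case arithmetic;
# they are not measurements from any specific product in the study.
mg_per_serving = 55      # assumed measured higenamine content per serving
servings_per_day = 2     # assumed maximum servings under labeled directions
print(f"worst-case daily exposure: {mg_per_serving * servings_per_day} mg")  # 110 mg
```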

"Some plants, such as ephedra, contain stimulants. If you take too much of the stimulants found in ephedra, it can have life-threatening consequences. Similarly, higenamine is a stimulant found in plants," said Dr. Pieter Cohen, Associate Professor of Medicine at Harvard Medical School, Internist at Cambridge Health Alliance and a co-author of the study. "When it comes to higenamine, we don't yet know for certain what effect high dosages will have in the human body, but a series of preliminary studies suggest that it might have profound effects on the heart and other organs."

Dietary supplements lead to an estimated 23,000 emergency department visits each year in the United States, and weight loss and sports supplements contribute to a large portion of these emergency department visits.

"Higenamine is a natural constituent of several traditional botanical remedies, such as aconite root and Aristolochia brasiliensis," said Travis. "While higenamine is considered a legal dietary ingredient when present as a constituent of botanicals, our research identified concerning levels of the stimulant and wildly inaccurate labeling and dosage information. And, as a WADA-prohibited substance, any amount of higenamine in a dietary supplement should be of concern to the competitive athlete." The research points to the need for independent testing and certification of dietary supplements, a public health service that NSF International provides.

NSF International facilitated the development of the only American National Standard for dietary supplements (NSF/ANSI 173), which became the foundation of NSF's accredited dietary supplement certification program in 2001 (ANSI-Accredited Product Certification Body - Accreditation #0216). To earn NSF certification, products are tested for product formulation, label claims and harmful levels of specific contaminants and potentially harmful ingredients. Additionally, NSF certified dietary supplements must be produced in a manufacturing facility that is inspected twice a year to comply with the U.S. FDA's Good Manufacturing Practice (GMP) requirements.

Products certified under NSF's Certified for Sport® program must meet additional requirements and are screened for more than 272 banned athletic substances. Many professional and elite sports associations and leagues recommend or require the use of Certified for Sport® products, including MLB, NHL, NFL, PGA, LPGA, CFL and the Canadian Centre for Ethics in Sport.

Consumers with questions about NSF certified supplements can contact the NSF International consumer hotline at 1-800-673-8010 or email info@nsf.org.

Credit: 
NSF International

Stress wracks worm nerves, leaving lasting memories

[Image: NIH-funded scientists showed how stress rewired a worm's nervous system and made adults act like juvenile worms. Credit: Hobert lab, Columbia University, N.Y.]

Scientists stunted the puberty of male worms by starving them before they underwent sexual maturation. In a study funded by the National Institutes of Health, the scientists suggested that stress from starvation even days before sexual maturation prevented normal changes in the wiring patterns of key neuronal circuits, which caused adult male worms to act immature.

"We found that environmental stress can permanently and profoundly impact the connectivity of a developing nervous system," said Oliver Hobert, Ph.D., professor of biological sciences at Columbia University in New York City, and a senior author of the study published in Nature.

Dr. Hobert's lab studies the nervous system of tiny see-through worms called Caenorhabditis elegans, or C. elegans. Previously, scientists in his lab showed how sexual maturation genetically reprograms and reshapes the wiring patterns of some neuronal circuits in male worms, making them different from their hermaphrodite mating partners.

In this study, Emily Bayer, a graduate student in the lab, discovered that exposure to stress, specifically starvation, before maturation can interrupt the rewiring program in males, leaving the adult worms with immature circuits. Her results also suggested that these responses to stress were, in part, controlled by serotonin, a neurotransmitter associated with depression in humans.

"Exploring how genes and the environment shape the nervous system is critical to understanding how circuits break down to cause a variety of neurological disorders," said Jill Morris, Ph.D., program director at the NIH's National Institute of Neurological Disorders and Stroke (NINDS).

Initially, Bayer stressed out immature worms when she accidentally left some animals unattended for a few weeks. This caused the worms to pause their normal growth and enter what scientists call a "dauer state."

"Basically, if immature worms sense stress of any kind they can temporarily halt their normal growth for months and then restart it when the stress passes. This temporary freeze in the growth process is the dauer state," said Dr. Hobert.

Eventually, Bayer returned the worms to their normal environment and let them grow into adults. After examining the nervous systems of stressed worms, she noticed something unusual. Normally, some of the neuronal connections in the males' tails are eliminated, or pruned, during sexual maturation. Instead, she found that immature connections in the stressed worms remained. Follow-up experiments suggested that the effect was caused strictly by starvation; other forms of stress that trigger the dauer state, such as heat, did not produce it.

"I was totally surprised. In fact, I never thought stressing the worms out would matter," said Bayer. "It wasn't until I saw differences in their circuits that I realized that stress could remap their wiring diagrams."

She also found that starvation before sexual maturation caused male adult worms to act immaturely during behaviors known to be controlled by these circuits. Unlike normal adult males, the stressed worms were highly sensitive to a noxious chemical called SDS. Stressed worms swam away from SDS while normal males barely responded. The stressed worms also had problems mating. Specifically, they spent much less time in contact with hermaphrodite worms than normal males.

Further experiments suggested that pruning of the circuits was controlled by opposing effects of serotonin and octopamine, a chemical cousin of the human "fight or flight" neurotransmitter norepinephrine. Starved worms made abnormally high amounts of octopamine, which blocked production of serotonin. Exposing immature males to serotonin during starvation caused normal pruning and the adults to have mature reactions to SDS. In contrast, adding octopamine to an immature male worm's diet prevented circuit pruning.

Do these results suggest that stress can disrupt the changes the human nervous system undergoes during early development?

"That's a tricky question. Right now, it's very hard to pin down these kinds of differences in humans and other animals with large complicated nervous systems," said Dr. Hobert. "Nevertheless, worms are so simple and easy to study that we can provide detailed proofs of principals that may guide nervous systems great and small."

Currently his lab is part of a nationwide project, called CeNGEN, to map the genetic activity of every neuron in the C. elegans nervous system. Results from this project may help researchers fully explore how the collisions between genes and experience, or nature and nurture, shape a nervous system.

Credit: 
NIH/National Institute of Neurological Disorders and Stroke

Spinal muscular atrophy disease expression correlated with haplotypes

STRASBURG, PA -- A natural history study has provided the first comprehensive clinical description of spinal muscular atrophy (SMA) within the Amish and Mennonite communities and correlates ancestral chromosome 5 haplotypes and SMN2 copy number with disease severity. SMA is a devastating genetic disease that affects the motor neurons that control movement, eating, and breathing. It is the leading genetic cause of infant death worldwide, with an incidence of approximately 1 per 10,000 newborns and as many as 1 per 2,800 babies of Mennonite descent. The observations were conducted within a population-specific framework to elucidate subtle differences in disease expression and the subsequent impact of disease-modifying therapies administered early in life. Forty-two Mennonite and fourteen Amish patients with SMA were included in the study by practitioners and researchers at the Clinic for Special Children in Strasburg, PA. The study is published online today in PLOS ONE.

DNA microsatellites and 2.6 million-marker single nucleotide polymorphism (SNP) microarrays were used to examine the genetic composition of haplotypes, which revealed structural similarities among the various patient groups. The two major Mennonite haplotypes identified (M1a, M2) included 1 and 2 copies of SMN2, respectively, and the single Amish SMA haplotype (A1) had 1 copy of SMN2. M1a/M1a, M1a/M2, and M2/M2 were the prominent SMA genotypes.

The study reveals important differences in the timing and severity of motor nerve degeneration as a function of both SMN2 copy number and SMA haplotype. Genotypes with two copies of SMN2 were associated with earlier disease onset, more restricted motor development, and shorter survival than genotypes with three or four copies of SMN2. A more surprising finding, however, was the difference in clinical severity and survival between individuals with A1/A1 as compared to M1a/M1a haplotypes, which convey the same number of SMN2 copies. The researchers plan to use whole exome sequence data to compare A1 and M1a at higher resolution in an effort to account for these differences.

Credit: 
Clinic for Special Children

Building a Better Brain-in-a-Dish, Faster and Cheaper

[Image: A false-color image of a slice of a human brain organoid from a patient with autism spectrum disorder. Credit: Alysson Muotri, UC San Diego Health]

Writing in the current online issue of the journal Stem Cells and Development, researchers at University of California San Diego School of Medicine describe development of a rapid, cost-effective method to create human cortical organoids directly from primary cells.

Experimental studies of developing human brain function are limited. Research involving live embryonic subjects is constrained by ethical concerns and the fragile nature of the brain itself. Animal models only partially mimic or recapitulate human biology and cognitive function. Single cell studies do not capture the complexity of neural networks.

In recent years, the development of in vitro human organoids -- three-dimensional, miniaturized, simplified versions of an organ produced from reprogrammed stem cells -- has allowed scientists to study biological functions, diseases and treatments more realistically and in greater detail.

"And that includes the brain," said Alysson R. Muotri, PhD, professor in the UC San Diego School of Medicine departments of Pediatrics and Cellular and Molecular Medicine, director of the UC San Diego Stem Cell Program and a member of the Sanford Consortium for Regenerative Medicine. "Cerebral organoids can form a variety of brain regions. They exhibit neurons that are functional and capable of electrical excitation. They resemble human cortical development at the gene expression levels."

Muotri is among the leaders in the field, having used the "brain-in-a-dish" approach to provide the first direct experimental proof that the Zika virus can cause severe birth defects, to repurpose existing HIV drugs for a rare, inherited neurological disorder and to create Neanderthalized "mini-brains."

But human brain organoids are difficult, time-consuming and expensive to produce. Sophisticated tools and know-how are needed first to generate, from skin cells called fibroblasts, human induced pluripotent stem cells (iPSCs) capable of becoming almost any kind of cell, and then to direct those iPSCs to differentiate into the variety of interconnected cell types that comprise an organ like the brain.

In the new paper, senior author Muotri and colleagues describe a new, rapid and cost-effective method to reprogram individual somatic cells directly into cortical organoids from hundreds of individuals simultaneously. To do so, they compressed and optimized several steps of the process so that somatic cells are reprogrammed, expanded and stimulated to form cortical cells almost simultaneously. The result is a cortical organoid that fully develops from somatic cells with only minor manipulation, Muotri said.

"What we've done is establish a proof-of-principle protocol for a systematic, automated process to generate large numbers of brain organoids," said Muotri. "The potential uses are vast, including creating large brain organoid repositories and the discovery of causal genetic variants to human neurological conditions associated with several mutations of unknown significance, such as autism spectrum disorder. If we want to understand the variability in human cognition, this is the first step."

Credit: 
University of California - San Diego

Is the key to sparking climate action a game?

Research published in PLOS ONE found that 81 percent of participants in the World Climate Simulation, a role-playing game of the UN climate talks, showed increased motivation to combat climate change. The effect held even among Americans who are free market proponents, a belief strongly linked to denial of human-caused climate change in the United States.

Prof. Juliette Rooney-Varga of the UMass Lowell Climate Change Initiative led the research into how the game affected participants' beliefs, emotional responses, and intent to take action on climate change. The study examined how World Climate affected more than 2,000 participants from eight countries and four continents, ranging from middle school students to CEOs. Across this diverse population, and regardless of political orientation, cultural identity, age, or gender, participation in World Climate was associated with increased understanding of climate change science, a greater sense of urgency and hope, and increased motivation to learn and do more about climate change. The more people learned through the game, the more their sense of urgency increased. As Rooney-Varga explains, "it was this increased sense of urgency, not knowledge, that was key to sparking motivation to act."

The researchers also found that the game reaches people outside the traditional climate change "choir," including free-market proponents and people who knew and cared little about climate change before participating. In fact, these people experienced greater gains in knowledge, urgency, and motivation to act. This finding is particularly exciting given the failure of many prior climate change communication efforts to reach across the political spectrum and to engage people who are not concerned about the issue. The study relied on statistical analysis of surveys that participants completed before and after experiencing World Climate.

The World Climate game is "an engaging, social experience grounded in the best available climate science," comments Rooney-Varga. Participants take on the roles of national delegates to the UN climate change negotiations and are charged with creating a global agreement that successfully mitigates climate change. As in the real negotiations, each delegation offers policies for its greenhouse gas emissions. The developed nations pledge contributions to the Green Climate Fund to help developing nations cut their emissions and adapt to change; the developing nations specify how much they need to do so. The delegations' decisions are then entered into the climate policy computer model, C-ROADS, which has been used to support the real negotiations, giving participants immediate feedback on the expected climate impacts of their decisions. First-round results usually fall short, showing everyone the likely harm to their prosperity, health and welfare. Participants then negotiate again, using C-ROADS to explore the consequences of more ambitious emission cuts.

World Climate is designed for ease of use. As of July 2018, more than 43,000 people in 78 countries had participated in it. The simulation has been reviewed by independent educators and scientists, found to support national science education standards in the US, and designated as an official resource for schools in France, Germany, and South Korea.

Co-author Prof. John Sterman of the MIT Sloan School of Management notes that "research shows that showing people research doesn't work. World Climate works because it enables people to express their own views, explore their own proposals and thus learn for themselves what the likely impacts will be."

Dr. Rooney-Varga of UMass Lowell adds, "For most of human history experience has been our best teacher, enabling us to understand the world around us while stimulating emotions--fear, anger, worry, hope--that drive us to act. The big question for climate change communication is: how can we build the knowledge and emotions that drive informed action without real-life experience which, in the case of climate change, will only come too late? The answer appears to be simulated experience."

Co-authors Eduardo Fracassi and Florian Kapmeier have used World Climate extensively across South America and Europe. Fracassi has seen World Climate inspire "life-changing insights," with many participants "embracing projects that reduced greenhouse gas emissions in the real world and taking steps to protect people from future climate risks." Kapmeier shared the simulation with the German Ministry of Education, which designated World Climate as an official resource for German high schools. As Kapmeier explains, the German government "realized that education is a key means to move climate policy forward" and "appreciates that the C-ROADS climate model in World Climate is used by actual policymakers."

Co-author Andrew Jones of Climate Interactive sees relevance for climate communication more generally: "Our findings may be useful to anyone who is engaging others on climate action. They suggest three key ingredients: information grounded in solid science, an experience that helps people feel for themselves, on their own terms, and social interaction arising from conversation with their peers."

Credit: 
University of Massachusetts Lowell

'Double-acting' baking powder - how it works

WASHINGTON, Sept. 4, 2018 -- Baking powder is used to raise baked goods like cakes and cookies. It's often sold under the label "double-acting," but what does that mean? In this video, Reactions explains the chemistry of how baking powder can act twice to make bubbles in your baked goods: https://youtu.be/f16wezzHPzg.

Credit: 
American Chemical Society

US housing subsidy may reduce adolescent girls' binge drinking but worsen boys'

A housing subsidy treatment that enables low-income families in US cities to move from public to private housing appears to reduce adolescent girls' binge drinking but increase adolescent boys' binge drinking. The reasons for these differential gender effects are not yet clear.

Affordable housing policies are common in democratic nations, and the Housing Choice Voucher (HCV) program represents the largest federal investment in affordable housing in the US. Such policies have the potential to address inequality for low-income households by defraying rental costs and promoting housing mobility to higher opportunity areas.

The Moving to Opportunity (MTO) housing subsidy trial randomized volunteer low-income families in public housing to (1) receive rental subsidies redeemable in neighbourhoods with few residents living in poverty plus housing counselling, (2) receive unrestricted rental subsidies or (3) remain in public housing. In this secondary study, results from groups 1 and 2 were pooled to represent the treatment group.

This secondary analysis, published today by the scientific journal Addiction, examined 2829 adolescents (1950 in the treatment group and 879 in the control group) for differences in binge drinking and other alcohol use outcomes 4 to 7 years after the study began -- that is, after families assigned to the treatment condition had received and used a rental subsidy to move to a private rental unit, and families assigned to the control condition had remained in public housing. The treatment-control risk difference (the excess risk that can be attributed to having moved with a subsidy vs. not) for binge drinking was -0.022 for girls (a beneficial effect of treatment) and 0.032 for boys (a harmful effect of treatment).
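
For clarity, a risk difference is simply the difference between the outcome proportions in the two groups. A minimal sketch, with invented per-gender event counts chosen only to reproduce the reported girls' figure (the article gives group sizes but not these counts):

```python
def risk_difference(events_t: int, n_t: int, events_c: int, n_c: int) -> float:
    """Excess risk attributable to treatment:
    P(outcome | treated) - P(outcome | control)."""
    return events_t / n_t - events_c / n_c

# Hypothetical counts for girls: 100 binge drinkers of 975 treated vs.
# 55 of 440 controls. These numbers are invented; only the reported risk
# difference of -0.022 comes from the article.
print(round(risk_difference(100, 975, 55, 440), 3))  # -0.022
```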

Lead author Dr Theresa Osypuk, Associate Professor at the University of Minnesota School of Public Health, says "Our research addresses a key policy question in the popular US-based HCV programme: whether or not to incorporate elements from non-housing sectors to improve outcomes for low-income children. Our findings suggest that although girls benefit from the HCV programme with respect to reducing underage drinking, boys may need additional support to be successful.

"Although we don't know exactly why there is a gender difference in adolescent binge drinking resulting from this housing subsidy program, we suspect the MTO treatment may have influenced binge drinking via social relationship mechanisms, which are different for girls and boys. For example, girls and boys may cope with the stressors of moving differently, and/or parenting relationships affected by residential mobility may influence girls' and boys' drinking behaviours differently. Further research will uncover the exact mechanisms, but until then we need to respond to the fact that boys face increased risk of binge drinking under the current HCV programme."

Binge drinking is a pattern of excessive alcohol consumption that results in elevated blood alcohol concentration, leads to cognitive, sensory and motor impairment, and causes tissue damage in both acute and chronic use. Any alcohol use may cause some level of impairment, but binge drinking has a much higher risk for negative outcomes than measures such as life-time or past month alcohol use.

Credit: 
Society for the Study of Addiction