Culture

Sexual/gender minority patients prefer written self-report for identity info collection

Boston, MA -- Health care and government organizations call for routine collection of sexual orientation and gender identity (SOGI) information in the clinical setting. However, what little research exists suggests that many sexual and gender minority patients find disclosing SOGI to a clinician as difficult as disclosing the same information to other people in their lives. Understanding how health care providers can collect SOGI information in a patient-centered manner is critical to reducing the health care inequities experienced by sexual and gender minorities in the U.S. To this end, researchers at the Center for Surgery and Public Health at Brigham and Women's Hospital, in conjunction with the Johns Hopkins School of Medicine, conducted a matched cohort trial to determine which of two SOGI collection methods was associated with higher patient satisfaction with their emergency department (ED) visit.

"The first step to better understanding disparities impacting sexual and gender minority (SGM) patients is to routinely collect sexual orientation and gender identity information for all patients," said Adil Haider MD, MPH, a trauma surgeon and Kessler Director of the Center for Surgery and Public Health at the Brigham, and first author of the Emergency Department Query for Patient-Centered Approaches to Sexual Orientation and Gender Identity (EQUALITY) study. "Collecting these data is especially important in emergency departments, the source of nearly half of all inpatient admissions and an entry point for many uninsured and underinsured patients. By working in tandem with patients and ED providers over the course of the EQUALITY study, we successfully identified and then tested two data collection approaches to find out which one made patients' ED experiences the most comfortable."

As of 2016, an estimated 12 million American adults identified as SGM, representing 4 percent of the U.S. population. Although the Joint Commission and the Department of Health and Human Services recommend routine collection of SOGI information in health care settings, and electronic health records have the ability to capture it, little work has been done to understand patient preferences for how SOGI should be collected, especially in the ED.

The two data collection approaches were conducted and compared as a matched cohort trial at four emergency departments between February 2016 and March 2017. In the first arm of the study, ED nurses collected SOGI information from patients verbally during the clinical visit as part of the patient history. In the second, registrars asked patients to complete a demographic information form confidentially. Prior to and during the intervention, ED physicians, physician assistants, nurses, and registrars received education on SGM health disparities and terminology. Patient satisfaction was measured with a scale modified from the Communication Climate Assessment Toolkit (CCAT) patient survey, which included questions on overall patient comfort, patient experiences, and patient comfort with SOGI collection.

SGM patients reported greater comfort and improved communication when a standardized collection process was used in which all patients could non-verbally self-report SOGI along with other demographic information, rather than being asked by a nurse during a clinical encounter. Of 23,372 patients from whom SOGI data were collected, 673 were identified as SGM and 213 enrolled and completed outcome surveys. Average modified CCAT scores were six points higher among SGM patients whose information was collected during registration compared to verbal collection (95.6 vs. 89.5, p = 0.03). Compared to non-SGM patients, and controlling for age, race, illness severity, and study site, SGM patients had 2.57 times higher odds of a better CCAT score with registration collection (95% CI 1.13, 5.82). Additionally, most patients said it was important for all patients to report SOGI information, and no significant differences were found between the two approaches among non-SGM patients (91.8 vs. 93.2, p = 0.59) or among patients who did not report SOGI information (92.7 vs. 93.6, p = 0.70).
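The study's 2.57 figure is an adjusted odds ratio from a model controlling for age, race, illness severity, and study site. As a rough illustration of how an unadjusted odds ratio and its Wald 95% confidence interval are computed from a simple 2x2 table, the sketch below uses hypothetical counts (not the study's actual data):

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI for a 2x2 table:
    a = exposed with outcome,   b = exposed without,
    c = unexposed with outcome, d = unexposed without."""
    or_ = (a * d) / (b * c)
    # Standard error of log(OR) is sqrt of summed reciprocal cell counts
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts for illustration only: say 40 of 60 patients had
# above-median satisfaction scores with form-based collection, vs.
# 25 of 62 with verbal collection.
or_, lo, hi = odds_ratio_ci(40, 20, 25, 37)
print(f"OR = {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")
# -> OR = 2.96 (95% CI 1.41-6.20)
```

A CI that excludes 1.0, as here and in the study's reported interval (1.13, 5.82), indicates a statistically significant association at the 5% level.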

"Now that we know the majority of patients feel comfortable reporting SOGI information, and that they think it's important for providers to ask, the emergency medicine community has an imperative to collect these demographic data in a more routine, standardized, and patient-centric way," said Haider.

In phase 1, the EQUALITY study found that nearly 80 percent of clinicians believed that patients would refuse to provide SOGI, yet only 10 percent of patients reported that they would refuse to do so. SGM patients view the standardized collection of SOGI as a step towards recognition as an individual as well as normalization of SGM individuals within society. Results from the study also supported the ability of patients to opt out of reporting SOGI information as well as the importance of nurses and registrars receiving cultural sensitivity and dexterity training on caring for patients who disclose that they identify as SGM.

"A lot of organizations have recommended collecting sexual orientation and gender identity information to improve our understanding of the health and wellness of sexual and gender minority patients and to identify disparities in care. Unfortunately, these recommendations often lack evidence-based guidance for how to best collect these data from patients," said Brandyn Lau, MPH, CPH, Assistant Professor of Radiology and Radiological Sciences at Johns Hopkins Medicine and last author on the study. "We worked with patients and providers throughout this study to develop and rigorously test methods to collect SOGI information that maximize patient comfort with disclosing this information while minimizing disruption to an already busy clinical workflow."

Credit: 
Brigham and Women's Hospital

After naloxone, when can opioid overdose patients be safely discharged?

image: Brian Clemency and colleagues at the University at Buffalo and Erie County Medical Center observed patients brought to the Emergency Department after receiving naloxone for suspected opioid overdose.

Image: 
Douglas Levere/University at Buffalo

BUFFALO, N.Y. -- Naloxone has saved thousands of lives. But can patients be safely discharged from the Emergency Department (ED) just an hour after they receive the medication that curtails drug overdoses?

According to the St. Paul's Early Discharge Rule developed in 2000, that's how long providers should observe patients after naloxone treatment, so long as their vital signs meet specific criteria and they are ambulatory.

But the rule was never externally validated or assessed in light of the changes that have occurred in recent years with opioid use disorder.

That's why University at Buffalo researchers conducted the current study, published today in Academic Emergency Medicine, which is the first to clinically assess the rule developed at St. Paul's Hospital in Vancouver.

Dramatic changes

"The landscape of opioid use disorder has changed dramatically," said Brian Clemency, DO, lead author on the paper, associate professor of emergency medicine in the Jacobs School of Medicine and Biomedical Sciences at UB and an attending physician specializing in emergency medicine at Erie County Medical Center. He also is a physician with UBMD Emergency Medicine.

In 2000, he explained, naloxone was almost exclusively administered intravenously by doctors, nurses and paramedics. Today, the medication is far more widely available, including to members of the public, and is often given in the form of a nasal spray. In addition, the use of heroin and synthetic opioids, such as fentanyl and carfentanil, has increased tremendously.

"Recommendations for patient observation after naloxone administration are inconsistent," said Clemency. "Patients can be observed for six or more hours or they can be immediately discharged with no further evaluation.

"The question is, which of these patients needs to be watched longer?" he asked. "Right now, there isn't a really good rule. This has wide-ranging negative implications for emergency care and opioid use disorder treatment.

"It is our hope that these findings will lead to a reduction in practice variation and allow for better use of resources in the ED, while ensuring patient safety," he added.

Tracking patients in the ED

To determine if the one-hour early discharge rule is valid, given the changes in opioid use disorder, Clemency and his colleagues launched an ambitious study at Buffalo's Erie County Medical Center (ECMC), a busy, urban teaching hospital affiliated with the Jacobs School.

Patients who arrived at the medical center by ambulance after receiving naloxone for suspected opioid overdose had to be enrolled and evaluated within 30-40 minutes of arrival.

One hour after receiving naloxone in the community, patients' vital signs were evaluated, ranging from body temperature and heart rate to blood pressure and blood oxygen level.

A total of 538 patients were included in the study. Patients were typically observed for at least four hours before being discharged.

Patients were tracked through their hospitalization for any adverse events. Medical examiner records were then reviewed for subsequent fatalities.

The authors reported that most adverse events seen in patients with normal examinations after receiving naloxone were minor and unlikely to be life-threatening.

"This rule is a way to predict which patients will have adverse outcomes after they overdose on opiates," concluded Clemency. "The rule is simple to follow and can be used by health care providers with varying levels of training and experience.

"We anticipate this study will lead to nationally standardized recommendations for the observation of patients following the administration of naloxone for suspected opioid overdose," he said.

Credit: 
University at Buffalo

Marine debris study counts trash from Texas to Florida

image: Marine debris collected during the two year study.

Image: 
Caitlin Wessel

Trash, particularly plastic, in the ocean and along the shoreline is an economic, environmental, human health, and aesthetic problem causing serious challenges to coastal communities around the world, including the Gulf of Mexico.

Researchers from the Dauphin Island Sea Lab and the Mission-Aransas National Estuarine Research Reserve teamed up for a two-year study to document the problem along the Gulf of Mexico shorelines. Their findings are documented in the publication, Accumulation and distribution of marine debris on barrier islands across the northern Gulf of Mexico, in ScienceDirect's Marine Pollution Bulletin.

From February 2015 to August 2017, the researchers surveyed marine debris washing up on the shoreline each month at 12 sites on nine barrier islands from North Padre Island, Texas, to Santa Rosa, Florida. The trash was sorted by type, frequency, and location.

The most shocking discovery was that ten times more trash washes up on the coast of Texas than on the coast of any other Gulf state throughout the year.

Most of the trash, 69 to 95 percent, was plastic. The plastic items included bottles and bottle caps, straws, and broken pieces of plastic. The researchers also noted that more trash washed ashore during the spring and summer. This could be because more people are outside and on the water during those months.

Credit: 
Dauphin Island Sea Lab

How quickly can ponds "inhale" and store carbon?

image: Timber rattlesnakes are disproportionately affected by landscape and habitat disturbances by coal mining.

Image: 
Photo courtesy of Thomas Maigret.

Get a sneak peek into these new scientific papers, publishing on January 3, 2019 in the Ecological Society of America's journal Frontiers in Ecology and the Environment.

Flashing lights can protect alpaca and llama herds from pumas
Does mountaintop removal in Appalachia also remove rattlesnakes?
How quickly can ponds "inhale" and store carbon?
Machine-learning shows what's changed in ecological research over forty years

Flashing lights can scare off some predators from llamas and alpacas

Historically, lethal methods for protecting livestock from predators have contributed to the decline of terrestrial carnivore populations in many ecological communities. While non-lethal alternatives abound (and many local indigenous farmers prefer them), most of these methods have not been rigorously tested to determine whether they are effective. A team of researchers from the University of Wisconsin and from Chile's Pontificia Universidad Católica deployed an array of flashing light deterrents across their study area on the Altiplano, or Andean Plateau, which stretches across a portion of Chile and Bolivia, where pumas and Andean foxes prey on llama and alpaca herds. The researchers found that the light devices effectively deterred predation on their herds by pumas, but not by Andean foxes. This study represents a step forward in establishing experimental protocols for assessing the effectiveness of strategies and products that help humans and wildlife coexist.

Author Contact: Omar Ohrens (ohrens@wisc.edu)

Ohrens O, Bonacic C, and Treves A. 2019. Non-lethal defense of livestock against predators: flashing lights deter puma attacks in Chile. Frontiers in Ecology and the Environment. DOI: 10.1002/fee.1952

Coal miners and rattlesnakes prefer the same type of topography

On the Cumberland Plateau in eastern Kentucky, surface coal mining remains a common method of coal extraction and requires complete removal of mature forest cover and the upper soil layers. Even though federal law requires mining companies to approximately reconstruct the original topography after mining has wrapped up, it is rare for natural succession, native forest growth, and terrestrial biodiversity to return to their original levels. To determine how surface mining affects ridgetop habitat availability, researchers from the University of Kentucky implanted radio transmitters in timber rattlesnakes (Crotalus horridus) and tracked their movements until the snakes retreated to hibernation sites in autumn, which provided a roadmap for identifying other potentially viable hibernation sites, or "hibernacula," across the study area. By analyzing a suite of LiDAR remote-sensing data, satellite imagery, mining maps, and permit data from the USGS and other sources, the researchers were then able to determine how mining might affect the range of rattlesnake hibernacula. They found that because timber rattlesnakes tend to overwinter in the same places that make ideal mine sites (along ridgelines and other similar higher-elevation topographic features), surface mining disproportionately alters or eliminates the preferred habitat of this species. These ridgetops contain other plant and animal species that may be similarly affected. The authors' analyses also show that restoring mined areas in Appalachia will be difficult, especially where ridgetop topography has been permanently rearranged.

Author Contact: Thomas A Maigret (thomas.maigret@uky.edu)

Maigret TA, Cox JJ, and Yang J. 2019. Persistent geophysical effects of mining threaten ridgetop biota of Appalachian forests. Frontiers in Ecology and the Environment. DOI: 10.1002/fee.1992

Stashing carbon in small ponds

Forests, grasslands, wetlands, and other vegetated habitat types take in large amounts of carbon that is then stored in plant tissue and sediment - carbon that would otherwise exist in the atmosphere as a greenhouse gas. But there is little consensus about how much carbon can be stored in lakes and ponds (and how quickly it can be stored, and for how long). Researchers from the University of Northumbria and from the University of the Highlands and Islands studied thirty small ponds dug on a former coal mine site in the UK and measured how much organic carbon the ponds had accumulated over an 18- to 20-year period. They found that over 20 years, the ponds acted as carbon "sinks" and had higher rates of organic carbon burial than many other terrestrial and aquatic habitats, including boreal and temperate forests and temperate grasslands.

Author Contact: Michael Jeffries (michael.jeffries@northumbria.ac.uk)

Taylor S, Gilbert PJ, Cook DA, et al. 2019. High carbon burial rates by small ponds in the landscape. Frontiers in Ecology and the Environment. DOI: 10.1002/fee.1988


Machine-learning shows that ecology is shifting to data-intensive research and anthropogenic themes

The discipline of ecology is changing quickly to accommodate shifting societal needs and to make room for new areas of interest, but the vast breadth of research being published makes it difficult to quantify exactly how - and how much - it is really changing. Today, software can learn from data, identify patterns, and make decisions with minimal human intervention - a concept dubbed "machine-learning." Automated content analysis, or ACA, is a machine-learning method that "trains" a program or model to identify key concepts and themes across large amounts of text. Researchers from Purdue University in Indiana asked their ACA software to "read" over 80,000 papers published in the ecological literature between 1980 and 2016 and to identify the key concepts and themes in each. The researchers then analyzed how the prevalence of various themes has changed over time, and found that theoretical research and articles on plant and population ecology are becoming less common, while articles about microbial ecology, genetics, biogeochemistry, macrosystems ecology, and human dimensions of nature are becoming more common.
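The trend analysis the Purdue team describes can be caricatured as tagging each paper with themes and tallying those tags per decade. The sketch below does this with a fixed toy keyword lexicon; all theme names and example papers here are hypothetical, and real ACA software instead learns its concepts from training text rather than from a hand-written word list:

```python
from collections import Counter, defaultdict

# Toy theme lexicon (hypothetical; a trained ACA model replaces this).
THEMES = {
    "microbial": {"microbiome", "bacteria", "microbial"},
    "theory":    {"model", "theory", "equilibrium"},
    "human":     {"urban", "anthropogenic", "land-use"},
}

def tag_themes(abstract):
    """Return the set of themes whose keywords appear in the text."""
    words = set(abstract.lower().split())
    return {t for t, kw in THEMES.items() if words & kw}

def theme_trend(records):
    """Tally theme occurrences per decade from (year, abstract) pairs."""
    trend = defaultdict(Counter)
    for year, abstract in records:
        decade = (year // 10) * 10
        for theme in tag_themes(abstract):
            trend[decade][theme] += 1
    return trend

# Hypothetical example papers:
papers = [
    (1985, "an equilibrium model of plant competition"),
    (2012, "soil bacteria respond to anthropogenic land-use change"),
    (2015, "microbial diversity in urban streams"),
]
trend = theme_trend(papers)
print(sorted(trend[2010].items()))
# -> [('human', 2), ('microbial', 2)]
```

Comparing the per-decade tallies (here, "theory" dominating the 1980s while "microbial" and "human" themes dominate the 2010s) is the shape of the shift the study quantifies across 80,000 real papers.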

Author Contact: Songlin Fei (sfei@purdue.edu)

McCallen E, Knott J, Nunez-Mir G, et al. Trends in ecology: shifts in ecological research themes over the past four decades. Frontiers in Ecology and the Environment. DOI: 10.1002/fee.1993

Credit: 
Ecological Society of America

Adults with cerebral palsy at increased risk of depression, anxiety

Bottom Line: While cerebral palsy is considered a pediatric condition because it develops and is diagnosed in early childhood, it is a lifelong condition, with the majority of children living into adulthood. Little research exists on the mental health of adults with cerebral palsy. This study included 1,700 adults 18 years or older with cerebral palsy and 5,100 adults without cerebral palsy. Adults with cerebral palsy but without an intellectual disability had a higher risk of developing depression and anxiety. The study relied on diagnostic codes for outcomes.

Authors: Kimberley J. Smith, Ph.D., University of Surrey, Guildford, United Kingdom, and coauthors

To Learn More: The full study is available on the For The Media website.

(doi:10.1001/jamaneurol.2018.4147)

Editor's Note: The article includes funding/support disclosures. Please see the article for additional information, including other authors, author contributions and affiliations, financial disclosures, funding and support, etc.

Credit: 
JAMA Network

Study details opioid poisoning deaths among children, teens over two decades

Bottom Line: Nearly 9,000 children and adolescents died from opioid poisonings with prescription and illicit drugs between 1999 and 2016, based on an analysis of national data from the Centers for Disease Control and Prevention (CDC). The death rate almost tripled over that time to nearly 1 per 100,000. Prescription opioids were implicated in 73 percent of the deaths (6,561), and most of the deaths were unintentional (nearly 81 percent). The majority of deaths were among non-Hispanic white males, but over time non-Hispanic black children accounted for a larger proportion of the deaths. The highest annual death rates during the 18 years examined in the study were among teens ages 15 to 19, with heroin implicated in nearly 1,900 deaths. The study relied on data from death certificates, so the potential for misclassification of cause and manner of death exists. The researchers urge lawmakers, public health officials, clinicians and parents to implement protective measures to address the growing public health problem.

Authors: Julie R. Gaither, Ph.D., M.P.H., R.N., Yale School of Medicine, New Haven, Connecticut, and coauthors

To Learn More: The full study is available on the For The Media website.

(doi:10.1001/jamanetworkopen.2018.6558)

Editor's Note: The article includes conflict of interest and funding/support disclosures. Please see the article for additional information, including other authors, author contributions and affiliations, financial disclosures, funding and support, etc.

Credit: 
JAMA Network

Sugar-sweetened beverage pattern linked to higher kidney disease risk

image: Visual Abstract

Image: 
Rebholz

Highlight

In a study of African-American men and women with normal kidney function, a pattern of higher collective consumption of soda, sweetened fruit drinks, and water was associated with a higher risk of developing kidney disease.

Washington, DC (December 27, 2018) -- Higher collective consumption of sweetened fruit drinks, soda, and water was associated with a higher likelihood of developing chronic kidney disease (CKD) in a community-based study of African-American adults in Mississippi. The findings, which appear in an upcoming issue of the Clinical Journal of the American Society of Nephrology (CJASN), contribute to the growing body of evidence pointing to the negative health consequences of consuming sugar-sweetened beverages.

Certain beverages may affect kidney health, but study results have been inconsistent. To provide more clarity, Casey Rebholz PhD, MS, MNSP, MPH (Johns Hopkins Bloomberg School of Public Health) and her colleagues prospectively studied 3003 African-American men and women with normal kidney function who were enrolled in the Jackson Heart Study.

"There is a lack of comprehensive information on the health implications of the wide range of beverage options that are available in the food supply," said Dr. Rebholz. "In particular, there is limited information on which types of beverages and patterns of beverages are associated with kidney disease risk."

For their study, the investigators assessed beverage intake through a food frequency questionnaire administered at the start of the study in 2000-04, and they followed participants until 2009-13.

Among the 3003 participants, 185 (6%) developed CKD over a median follow-up of 8 years. After adjustment for confounding factors, consuming a beverage pattern consisting of soda, sweetened fruit drinks, and water was associated with a higher risk of developing CKD. Participants in the top tertile for consumption of this beverage pattern were 61% more likely to develop CKD than those in the bottom tertile.

The researchers were surprised to see that water was a component of this beverage pattern that was linked with a higher risk of CKD. They noted that study participants may have reported their consumption of a wide variety of types of water, including flavored and sweetened water. Unfortunately, the investigators did not collect information about specific brands or types of bottled water in the Jackson Heart Study.

In an accompanying editorial, Holly Kramer, MD, MPH and David Shoham, PhD (Loyola University Chicago) noted that the findings hold strong public health implications. "While a few select U.S. cities have successfully reduced SSB [sugar sweetened beverage] consumption via taxation, all other municipalities have resisted public health efforts to lower SSB consumption," they wrote. "This cultural resistance to reducing SSB consumption can be compared to the cultural resistance to smoking cessation during the 1960s after the Surgeon General report was released. During the 1960s, tobacco use was viewed as a social choice and not a medical or social public health problem."

In an accompanying Patient Voice editorial, Duane Sunwold explained that he is a patient with CKD who changed his eating and drinking patterns to put his disease in remission. As a chef, he offers a number of recommendations to fellow patients trying to decrease their consumption of sugar-sweetened drinks.

Credit: 
American Society of Nephrology

Breaking down AGEs: Insight into how lifestyle drives ER-positive breast cancer

image: Stylized image suggesting a link between AGEs and diet

Image: 
Illustration by Emma Vought, Medical University of South Carolina

Poor diet and lack of exercise are associated with cancer development, but the underlying biology is not well understood. Advanced glycation end products (AGEs) could offer a biological link to help us understand how certain lifestyle behaviors increase cancer risk or lessen the likelihood that an anti-cancer therapy will be effective.

AGE accumulation is the natural and unavoidable result of the breakdown of nutrients, sugars and fats. AGE levels, however, can be increased by the consumption of processed foods high in sugar and fat. Certain cooking techniques, such as grilling, searing and frying, also increase AGE formation.

High AGE levels could prevent patients with estrogen receptor (ER)-positive breast cancer from responding to tamoxifen therapy, suggest preclinical findings reported by researchers at the Medical University of South Carolina (MUSC) in a recent issue of Breast Cancer Research and Treatment. The MUSC team was led by David P. Turner, Ph.D., an assistant professor in the MUSC College of Medicine and a member of the Hollings Cancer Center, who is one of the two corresponding authors on the article. Marvella E. Ford, Ph.D., a professor in the MUSC College of Medicine and associate director of Cancer Disparities at Hollings Cancer Center, is the other corresponding author.

"By showing that AGEs in the diet may impact how well breast cancer patients respond to therapy we can make breast cancer patients aware of their existence," says Turner. "And we can design lifestyle interventions aimed at reducing AGE intake."

AGEs cause an imbalance between molecules called free radicals and antioxidants, leading to chronic inflammation that can promote the development of a variety of chronic diseases. Furthermore, as AGEs accumulate in our organs, they cause damage that is associated with diseases such as diabetes, Alzheimer's, cardiovascular disease, arthritis and cancer. However, AGEs have not been studied in depth in the context of cancer.

The publication by Turner, Ford and colleagues shows that elevated AGE levels lead to continual activation of pathways that promote cancer cell growth. A key molecule turned on by those pathways is important in the context of ER-positive and -negative breast cancer. This led the MUSC team to explore how AGEs might affect cancer cell signaling in ER-positive breast cancer.

The MUSC team found that AGEs actually increase the phosphorylation (a process that turns on a biological pathway) of a protein called estrogen receptor alpha in a breast cancer cell line model. Adding tamoxifen to the cancer cells prevented their growth. However, adding AGEs caused them to grow once again. This could mean that patients with high AGEs are less likely to respond to tamoxifen treatment.

Turner's team also found that a defined lifestyle intervention of exercise and dietary counseling lowered systemic levels of AGEs in overweight women with non-metastatic ER-positive breast cancer.

Next steps are to expand the published study to determine the effects of the intervention on a larger scale, while also further exploring the biological pathways in animal models. Together, they should shed light on how lifestyle interventions can beneficially affect cancer treatments by reducing AGE levels.

Credit: 
Medical University of South Carolina

Sleeping sickness parasite uses multiple metabolic pathways

Parasitic protozoa called trypanosomes synthesize sugars using an unexpected metabolic pathway called gluconeogenesis, according to a study published December 27 in the open-access journal PLOS Pathogens by David Horn of the University of Dundee in the UK, and colleagues. The authors note that this metabolic flexibility may be essential for adaptation to environmental conditions and survival in mammalian host tissues.

Trypanosomes cause human sleeping sickness and animal African trypanosomiases, which are a range of devastating but neglected tropical diseases affecting cattle, other livestock and horses. The mammalian stage of the parasite circulates in the bloodstream, a nutrient-rich environment with constant temperature and pH and high glucose concentration. Bloodstream-form African trypanosomes are thought to rely exclusively upon a metabolic pathway called glycolysis, using glucose as a substrate, for ATP production. In contrast to this view, Horn and colleagues show that bloodstream-form trypanosomes can use glycerol for ATP production and for gluconeogenesis -- a metabolic pathway that results in the generation of glucose from non-carbohydrate carbon substrates.

The authors showed that even wild-type parasites, grown in the presence of glucose and glycerol, use both substrates and have active gluconeogenesis. Moreover, mammalian-infective parasites assemble a dense surface glycoprotein coat, the glycan components of which incorporate carbons from glycerol. Therefore, gluconeogenesis can be used to drive metabolism and metabolite biosynthesis. The results reveal that trypanosomes exhibit metabolic flexibility and adaptability, which is likely required for survival in multiple host tissue environments. According to the authors, this finding should be considered when devising metabolically targeted therapies.

The authors add, "The findings challenge a dogma that has persisted for more than 30 years; that these parasites rely solely on glucose and glycolysis for energy production in their mammalian hosts."

Credit: 
PLOS

Confronting the side effects of a common anti-cancer treatment

AMHERST, Mass. - Results of a new study by neuroscientists at the University of Massachusetts Amherst and the University of Toronto suggest that a new treatment approach is needed to address the adverse effects of aromatase inhibitors, drugs commonly prescribed to both men and women to prevent recurrence of estrogen-positive breast cancer - and point to how such an approach may be possible.

The current drug therapy is linked to such complaints as hot flashes, memory lapses, anxiety and depression, side effects so bothersome that some patients discontinue the life-saving treatment, the researchers point out. Their study found that aromatase inhibitors do indeed suppress estrogen synthesis in body tissues, but their unexpected findings in the brain could explain some of the negative effects and provide insight into more effective, less disruptive future therapies.

Neuroscientists Agnès Lacreuse, Luke Remage-Healey and their graduate students at UMass Amherst, collaborator Jessica Mong at the University of Maryland and first author Nicole Gervais worked together on this research. Gervais, who conducted the experiments as a postdoctoral researcher at UMass Amherst, is now at the University of Toronto. The authors studied a small group of aged male and female marmosets, non-human primates whose brains are much like humans' and which exhibit "complex behavior," senior author Lacreuse explains.

She adds, "This drug is given to prevent recurrent breast cancer in humans and it does save lives, but a lot of times, patients are not compliant because of unpleasant side effects that affect quality of life." Their study, showing changes in the animals consistent with some of the human complaints, allowed the researchers to assess cognitive behavior, thermal regulation and neuronal changes in drug-treated vs. control groups. Their findings appear this week in the Journal of Neuroscience.

As Gervais explains, studies in humans are hampered by confounders. "The patients have had cancer, so it's hard to disentangle the stress of their disease and treatment from the drug effects." She adds, "We wanted to know if the symptoms while using the aromatase inhibitors can be reproduced in an animal model, and further explore the mechanisms to understand how they work and find alternative treatments."

In this work, supported by the NIH's National Institute on Aging and National Institute of Neurological Disorders and Stroke, the researchers administered the estrogen-inhibiting drug orally "the way it's given to humans and at a similar dose," Gervais explains, for one month, and observed that it did indeed suppress estrogen production in the body. They then compared changes in behavior, memory, electrophysiology, and thermoregulation in the treated and control groups.

Gervais says, "Sure enough, we found deficits in some aspects of memory and we also saw the most striking results in thermal regulation, a deficit in the ability to regulate body temperature when the ambient temperature increases, but only in females. It doesn't match hot flashes exactly but it's consistent with what we know about the regulation of hot flashes by estrogens in women. Females on the drug could not regulate their temperature as well as control females."

It was in the investigation of neurons that the researchers saw something quite surprising, says Remage-Healey. "In the hippocampus, which is thought to be critical for learning and memory functions, instead of reduced estrogen levels we found that the drug caused a paradoxical increase in estrogen levels."

Gervais adds, "We believe that the hippocampus may have synthesized its own estrogens to compensate for low levels it senses in peripheral tissues. According to our results, the mechanism for an adverse effect on memory may be due to an increase of estrogen synthesis in the hippocampus. Perhaps, future treatments could find a way to block this increased synthesis, and maybe prevent some of the negative side effects."

Remage-Healey points out that "We were also able to follow the excitability of hippocampal neurons, which was compromised in the treatment but not control group. This is consistent with the occasional memory problems reported by patients. It seems the hippocampus is particularly sensitive to estrogens and their blockade. But we have a lot of work to do to understand the precise mechanism underlying these effects."

The authors state, "These findings suggest adverse effects of aromatase inhibitors on the primate brain and call for new therapies that effectively prevent breast cancer recurrence while minimizing side effects that further compromise quality of life."

Credit: 
University of Massachusetts Amherst

Producers of white colonies on kimchi surface, mistaken for molds, have been identified

image: Isolation and Whole-Genome Analysis of White Colony-Forming Yeasts on Kimchi Surface

Image: 
WiKim (World Institute of Kimchi)

WiKim (World Institute of Kimchi; General Director, Dr. Jaeho Ha) reported that the white colonies on the surface of kimchi are formed not by molds but by yeasts, and that genomic data were acquired regarding the hygienic safety of the yeast strains.

This report is based on a study conducted by the team of Dr. Tae-Woon Kim and Dr. Seong Woon Roh at the Microbiology and Functionality Research Group of WiKim on the yeasts that cause white colonies on the kimchi surface and on their hygienic safety. The study applied a next-generation sequencing (NGS) approach to white colonies collected from the surface of various kimchi samples, such as cabbage kimchi, mustard leaf kimchi, young radish kimchi, and watery kimchi.

*NGS: Also known as high-throughput sequencing, a term describing a number of modern sequencing technologies that allow DNA and RNA to be sequenced much more quickly and cheaply than the previously used Sanger sequencing, for the study of genomics and molecular biology.

The findings of this study were published in the latest online edition (Oct. 2018) of the Journal of Microbiology, an international academic journal.

In general, yeasts produce alcoholic and aromatic compounds that help generate the flavor of fermented foods; hence, they are frequently used in making bread or rice wine. Kimchi is primarily fermented by lactic acid bacteria rather than yeasts; however, during the later phase of fermentation, when the activity of lactic acid bacteria decreases, yeasts form white colonies on the kimchi surface. Such white colonies are often observed on the surface of moist fermented food products, including soy sauce, soybean paste, rice wine, and kimchi.

The research group performed microbial community structure analysis to identify five representative yeast strains responsible for white colonies on the kimchi surface: Hanseniaspora uvarum, Pichia kluyveri, Yarrowia lipolytica, Kazachstania servazzii, and Candida sake.

Furthermore, whole-genome sequencing of the five yeast strains confirmed that they do not have known toxin-related genes.

This study is unique in that it is the first report to analyze the diversity of microbial community structures and the whole-genome sequences of white colony-forming yeasts on the kimchi surface using NGS technology. In the future, WiKim intends to disseminate this genetic information in the Genome Database of Kimchi-associated Microbes (GDKM), and to perform additional studies, such as animal toxicity tests, to verify the safety of the identified yeasts and to develop methods to prevent their formation.

To prevent white colony formation, the surface of the kimchi should be covered with a sanitized cover or immersed in the kimchi soup so that it is not exposed to the air. Furthermore, it is advised to store kimchi at a temperature below 4°C. If a white colony does form on the kimchi surface, it should be skimmed off, and the kimchi should be washed and heated before eating.

General Director Dr. Jaeho Ha of WiKim said, "This study is significant in that it has scientifically identified the white colony-forming yeasts about which people used to have vague anxiety, and it is a step forward toward alleviating concerns about the hygienic safety of kimchi."

Credit: 
National Research Council of Science & Technology

For patients with kidney disease, genetic testing may soon be routine

NEW YORK, NY (December 26, 2018)--A new study has found that genes cause about 1 in 10 cases of chronic kidney disease in adults, and identifying the responsible gene has a direct impact on treatment for most of these patients.

"Our study shows that genetic testing can be used to personalize the diagnosis and management of kidney disease, and that nephrologists should consider incorporating it into the diagnostic workup for these patients," says Ali Gharavi, MD, chief of nephrology at Columbia University Vagelos College of Physicians and Surgeons, and a co-senior author of the study.

The findings were published on December 26 in the New England Journal of Medicine.

It's estimated that 1 in 10 adults in the United States have chronic kidney disease. Yet, for 15 percent of patients with chronic kidney disease, the underlying cause of kidney failure is unknown.

"There are multiple genetic causes of chronic kidney disease, and treatment can vary depending on the cause," says Gharavi. "But many of the genetic types are rare and can be difficult to detect with traditional diagnostics. And because kidney disease is often silent in the early stages, some patients aren't diagnosed until their kidneys are close to failing, making it more difficult to find the underlying cause."

DNA sequencing has the potential to pinpoint the genetic culprits, but has not been tested in a wide range of patients with chronic kidney disease.

"Our study identifies chronic kidney disease as the most common adult disease, outside of cancer, for which genomic testing has been demonstrated as clinically essential," says David Goldstein, PhD, director of Columbia University's Institute for Genomic Medicine and a co-senior author of the study.

In this study, researchers used DNA sequencing to look for genetic kidney disorders in 3,315 individuals with various types of chronic or end-stage kidney disease. For 8.5 percent of these individuals, clinicians had not been able to identify the cause of disease.

The researchers found a genetic disorder responsible for about 9 percent of the participants' kidney problems, and DNA testing reclassified the cause of kidney disease in 1 out of 5 individuals with a genetic diagnosis. In addition, DNA testing was able to pinpoint a cause for 17 percent of participants for whom a diagnosis was not possible based on the usual clinical workup.
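The percentages above can be turned into rough head counts. The sketch below does this arithmetic in Python; the exact counts appear in the NEJM paper, so these rounded figures are illustrative back-of-envelope values derived only from the percentages quoted in this release, not numbers taken from the study itself.

```python
# Approximate participant counts implied by the percentages quoted above.
# These are rounded illustrations, not figures from the NEJM paper.

total = 3315                      # participants who underwent DNA sequencing
frac_unknown_cause = 0.085        # cause of disease unknown at enrollment
frac_genetic_dx = 0.09            # "about 9 percent" received a genetic diagnosis
frac_dx_in_unknown = 0.17         # diagnostic yield among the unexplained cases

unknown_cause = round(total * frac_unknown_cause)           # ~282 people
genetic_dx = round(total * frac_genetic_dx)                 # ~298 people
dx_in_unknown = round(unknown_cause * frac_dx_in_unknown)   # ~48 people

print(unknown_cause, genetic_dx, dx_in_unknown)
```

The third figure is notable: among participants whose disease was unexplained after a standard workup, sequencing alone resolved roughly one case in six.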

DNA results had a direct impact on clinical care for about 85 percent of the 168 individuals who received a genetic diagnosis and had medical records available for review. "For several patients, the information we received from DNA testing changed our clinical strategy, as each one of these genetic diagnoses comes with its own set of potential complications that must be carefully considered when selecting treatments," Gharavi says.

About half of the patients were diagnosed with a kidney disorder that also affects other organs and requires care from other specialists. A few (1.5 percent) individuals learned they had medical conditions unrelated to their kidney disease. In all of these cases, the incidental findings had an impact on kidney care. "For example, having a predisposition to cancer would modify the approach to immunosuppression for patients with a kidney transplant," adds Gharavi.

"These results suggest that genomic sequencing can optimize the development of new medicines for kidney disease through the selection of patient subgroups most likely to benefit from new therapies," says Adam Platt, PhD, Head of Global Genomics Portfolio at AstraZeneca and a co-senior author of the study.

While the current study shows the utility of DNA testing in people with kidney disease, another study led by Goldstein and Gharavi found that DNA testing in healthy individuals vastly overestimated the prevalence of kidney disease-associated genetic conditions.

"Altogether, our research suggests that DNA testing may be most useful when balanced with clinical information," says Goldstein.

Credit: 
Columbia University Irving Medical Center

Cell size and cell-cycle states play key decision-making role in HIV

image: These are members of the research team (right to left): Bioengineering seniors Erin Tevonian and Melina Megaridis, Post-doctoral fellow Kathrin Bohn-Wippert, Bioengineering graduate students Meng-Yao Huang and Yiyang Lu, and Professor Roy Dar.

Image: 
University of Illinois Department of Bioengineering.

Thanks to the development of antiretroviral drugs, human immunodeficiency virus (HIV) is considered a manageable chronic disease today. However, if left undiagnosed or untreated, HIV can develop into AIDS (acquired immune deficiency syndrome), a disease which led to the deaths of nearly 1 million people worldwide in 2017.

The life-saving drugs don't cure HIV, though, because when the virus infects the body, it insidiously targets the very cells required to trigger the body's immune response to any infection. Specifically, HIV invades CD4 T-cells--a type of white blood cell--making copies of itself and taking over the CD4 host cell's DNA.

"Upon infection of a CD4 T-cell, HIV undergoes one of two fates," said University of Illinois Assistant Professor Roy Dar. "It either integrates into a replicating state, leading to the production of hundreds of infectious virions, or it integrates into a latent state where the provirus lies transcriptionally silent."

According to Dar, researchers have focused on eradicating the latent reservoir because it can reactivate spontaneously, evade drug therapy, or be a source for viral rebound, which can occur if a patient does not strictly adhere to the antiretroviral therapy treatment regimen.

"To date, there is no way to distinguish between uninfected cells and latently infected cells in the body, but such an ability would support existing therapeutic approaches to curing HIV," said Dar, who is affiliated with the Electrical & Computer Engineering Department and Carl Woese Institute for Genomic Biology on campus.

In a recent study, Dar and his research group investigated the reactivation of T-cells that were latently infected with HIV in the lab by using a viral construct that contained a gene for a green fluorescent protein (GFP) that gets expressed when a cell reactivates.

"The method of time-lapse, single-cell imaging allowed us to monitor single latent cells' reactivation from their silent to their active states by calculating the mean fluorescence of GFP," explained Bioengineering Post-Doctoral Researcher Kathrin Bohn-Wippert, the lead author of the study.

Once they identified the cells that were reactivated, they calculated the cell size and determined the mean cell diameter necessary for reactivation. They discovered that within a latent population, only larger host cells reactivate while the smaller cells remain silent or latent.

According to Bohn-Wippert, this larger host-cell size provides a natural cellular mechanism for enhancing burst size of viral expression and is necessary to destabilize the latent state while biasing viral decision making.

"Our results present a case of passive, host-cell-dominated viral decision-making, in which the virus is off when the infected cell is small and can only spontaneously reactivate in larger cell sizes," said Dar. "This presents a case of the host cell dictating the right conditions for viral decision-making to occur."

The team also determined that the cells' transition from latent to active is dependent on the cell cycle--the stages by which cellular DNA replicates--and can be modulated with drug treatments. "We showed that you can use drug treatments to modulate a population of cells in and out of a specific cell cycle state in order to bias their viral reactivation," said Dar.

These findings may be useful in guiding stochastic design strategies for drug therapies, have applications in synthetic biology, and play a role in advancing HIV diagnostics and treatments.

Credit: 
University of Illinois Grainger College of Engineering

UC San Diego researchers identify how skin ages, loses fat and immunity

image: This microscopic image of the skin reveals skin cells in blue and fat cells in green. The fat cell layer forms the final barrier against bacteria entering deep into the body.

Image: 
UC San Diego Health

Dermal fibroblasts are specialized cells deep in the skin that generate connective tissue and help the skin recover from injury. Some fibroblasts have the ability to convert into fat cells that reside under the dermis, giving the skin a plump, youthful look and producing a peptide that plays a critical role in fighting infections.

In a study published in Immunity on December 26, University of California San Diego School of Medicine researchers and colleagues show how fibroblasts develop into fat cells and identify the pathway that causes this process to cease as people age.

"We have discovered how the skin loses the ability to form fat during aging," said Richard Gallo, MD, PhD, Distinguished Professor and chair of the Department of Dermatology at UC San Diego School of Medicine and senior author on the study. "Loss of the ability of fibroblasts to convert into fat affects how the skin fights infections and will influence how the skin looks during aging."

Don't reach for the donuts. Gaining weight isn't the path to converting dermal fibroblasts into fat cells since obesity also interferes with the ability to fight infections. Instead, a protein that controls many cellular functions, called transforming growth factor beta (TGF-β), stops dermal fibroblasts from converting into fat cells and prevents the cells from producing the antimicrobial peptide cathelicidin, which helps protect against bacterial infections, reported researchers.

"Babies have a lot of this type of fat under the skin, making their skin inherently good at fighting some types of infections. Aged dermal fibroblasts lose this ability and the capacity to form fat under the skin," said Gallo. "Skin with a layer of fat under it looks more youthful. When we age, the appearance of the skin has a lot to do with the loss of fat."

In mouse models, researchers used chemical blockers to inhibit the TGF-β pathway, causing the skin to revert back to a younger function and allowing dermal fibroblasts to convert into fat cells. Turning off the pathway in mice by genetic techniques had the same result.

Understanding the biological process that leads to an age-dependent loss of these specialized fat cells could be used to help the skin fight infections like Staphylococcus aureus (S. aureus) -- a pathogenic bacterium that is the leading cause of infections of the skin and heart and a major factor in worsening diseases like eczema. When S. aureus becomes antibiotic resistant, it is known as methicillin-resistant Staphylococcus aureus, or MRSA, a leading cause of death from infection in the United States.

The long-term goal of this research is to understand the infant immune system, said Gallo. The results may also help explain what goes wrong in other diseases, such as obesity, diabetes and autoimmune diseases.

Credit: 
University of California - San Diego

Kicking, yelling during sleep? Study finds risk factors for violent sleep disorder

MINNEAPOLIS - Taking antidepressants for depression and having post-traumatic stress disorder or anxiety diagnosed by a doctor are risk factors for a disruptive and sometimes violent sleep disorder called rapid eye movement (REM) sleep behavior disorder, according to a study published in the December 26, 2018, online issue of Neurology®, the medical journal of the American Academy of Neurology. The study also found men are more likely to have the disorder.

REM sleep is the dream state of sleep. During normal REM sleep, your brain sends signals to prevent your muscles from moving. However, for people with REM sleep behavior disorder, those signals are disrupted. A person may act out violent or action-filled dreams by yelling, flailing their arms, punching or kicking, to the point of harming themselves or a sleep partner.

"While much is still unknown about REM sleep behavior disorder, it can be caused by medications or it may be an early sign of another neurologic condition like Parkinson's disease, dementia with Lewy bodies or multiple system atrophy," said study author Ronald Postuma, MD, MSc, of McGill University in Montreal, Canada, and a member of the American Academy of Neurology. "Identifying lifestyle and personal risk factors linked to this sleep disorder may lead to finding ways to reduce the chances of developing it."

The study looked at 30,097 people with an average age of 63. Researchers screened participants for a variety of health conditions and asked about lifestyle, behavior, social, economic and psychological factors.

In addition, every participant was asked, "Have you ever been told, or suspected yourself, that you seem to act out your dreams while asleep?"

Researchers then identified 958 people, or 3.2 percent, with possible REM sleep behavior disorder, after excluding participants with Parkinson's disease, dementia, Alzheimer's disease or sleep apnea.

Researchers found those with the disorder were over two-and-a-half times as likely to report taking antidepressants to treat depression, with 13 percent of those with the disorder taking them compared to 6 percent of those without the disorder. People with the disorder were also two-and-a-half times as likely to have post-traumatic stress disorder. They were twice as likely to have mental illness, and over one-and-a-half times as likely to have psychological distress.

Other findings were that men were twice as likely as women to have possible REM sleep behavior disorder; 59 percent of those with the disorder were male, compared to 42 percent of those without the disorder. People with possible REM sleep behavior disorder were 25 percent more likely than those without the disorder to be moderate to heavy drinkers, with 19 percent of those with the disorder moderate to heavy drinkers compared to 14 percent of those without the disorder. They had slightly less education, an average of 13.2 years of education compared to an average of 13.6 years for those without the disorder. They also had lower income and were more likely to have smoked.
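The "over two-and-a-half times as likely" figure for antidepressant use is presumably an adjusted estimate from the full study; as a sanity check, the unadjusted odds ratio implied by the two prevalences quoted above (13 percent vs. 6 percent) can be computed directly. This sketch is an illustration of that calculation only, not a reproduction of the study's statistics.

```python
# Unadjusted odds ratio implied by the antidepressant-use prevalences
# quoted above: 13% among people with possible RBD vs. 6% among those
# without. The study's "over two-and-a-half times" figure is presumably
# covariate-adjusted, so this back-of-envelope value differs slightly.

def odds(p):
    """Convert a prevalence (a proportion in [0, 1)) into odds."""
    return p / (1 - p)

or_antidepressants = odds(0.13) / odds(0.06)
print(round(or_antidepressants, 2))  # ~2.34 unadjusted
```

The same arithmetic applied to the other quoted prevalence pairs (e.g. 19 percent vs. 14 percent moderate-to-heavy drinking) gives similarly modest unadjusted ratios, consistent with the "25 percent more likely" framing in the text.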

"Our research does not show that these risk factors cause REM sleep behavior disorder, it only shows they are linked," said Postuma. "Our hope is that our findings will help guide future research, especially because REM sleep behavior disorder is such a strong sign of future neurodegenerative disease. The more we understand about REM sleep behavior disorder, the better positioned we will be to eventually prevent neurologic conditions like Parkinson's disease."

A limitation of the study was that 96 percent of participants were white, meaning the results may not apply to people of other ethnic backgrounds.

Credit: 
American Academy of Neurology