Culture

Archaeological mystery solved with modern genetics

Researchers at the University of Tokyo conducted a census of the Japanese population around 2,500 years ago using the Y chromosomes of men living on the main islands of modern-day Japan. This is the first time an analysis of modern genomes has estimated the size of an ancient human population before it was met by a separate ancient population.

"Evidence at archaeological dig sites has been used to estimate the size of ancient human populations, but the difficulty and unpredictability of finding those sites is a big limitation. Now we have a method that uses a large amount of modern data," said Associate Professor Jun Ohashi, an expert in human evolutionary genetics and leader of the research team that performed the analysis.

Archaeological mystery

The current theory on human migrations into Japan is that the original inhabitants, the Jomon people, were met about 2,500 years ago by a separate group coming mainly from the Korean Peninsula, the Yayoi people.

Archaeologists have identified fewer Jomon sites from the Late Jomon Period, the era immediately before the Yayoi arrival. Global temperatures and sea levels dropped during that period, which could have made life more difficult for the hunter-gatherer Jomon people.

When the Yayoi people arrived, they brought wet rice farming to Japan, which would have led to a more stable food supply for the remaining Jomon people living with the new Yayoi migrants.

The lesser amount of archaeological remains from the Late Jomon Period could be evidence of an actual population decline, or just that the archaeological dig sites have not yet been found.

Genetic evidence

Ohashi's research team decided to start digging through the human genome to address this archaeological mystery. They began by comparing the Y-chromosome sequences of modern Japanese men to those of Korean and other East Asian men. Y chromosomes are passed from father to son with very little change over generations, so modern Y-chromosome sequences can be used to reliably infer the Y chromosomes of men who lived thousands of years ago.

Researchers used DNA samples collected before 1990 from 345 men whose families were from the three main islands of Honshu, Shikoku, and Kyushu in Japan.

The research team identified one group of DNA sequences that only Japanese men had. That unique sequence group likely came from the Jomon people. The researchers identified six sequence groups common to both Japanese men and men with other East Asian heritage (Korean, Vietnamese, Chinese), which likely came from the Yayoi people or other ancestors common to Japanese and East Asian people.

DNA confirms archaeology

Researchers built evolutionary family trees using the Y-chromosome sequences and saw a pattern indicative of a population decrease and sudden increase: a remarkable decrease in the number of ancestral Y-chromosome sequences around 2,500 years ago.

Interestingly, modern Japanese men seem to have a greater percentage of Jomon ancestral DNA in their Y chromosomes than the rest of their genomes.

Previous genetic analyses concluded that modern ethnically Japanese people get about 12 percent of their entire genomes from Jomon ancestors and the rest from Yayoi ancestors. Ohashi's research team calculated that the one group of Jomon sequences they identified accounted for 35.4 percent of the Y chromosomes in their sample, indicating that the sequence would have been extremely common in Jomon men.

Since it is easier for a sequence to become common in a small population, this is another indication that the size of the Jomon population decreased during the Late Jomon Period before the arrival of the Yayoi people.
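
The logic here is standard genetic drift: the smaller the population, the faster a neutral lineage's frequency can wander to high values. A minimal Wright-Fisher sketch in Python illustrates the effect; the population sizes, starting frequency and threshold below are hypothetical illustrations, not values from the study.

```python
import numpy as np

rng = np.random.default_rng(42)

def final_frequency(pop_size, start_freq=0.05, generations=400):
    """Neutral Wright-Fisher drift of a single Y-chromosome lineage."""
    count = round(pop_size * start_freq)
    for _ in range(generations):
        # Each son independently draws his Y lineage from the fathers' pool.
        count = rng.binomial(pop_size, count / pop_size)
        if count in (0, pop_size):  # lineage lost or fixed
            break
    return count / pop_size

# Fraction of 500 simulated populations in which the lineage drifts
# above 30% frequency -- noticeably likelier when the population is small.
for n in (100, 1000, 10000):
    runs = [final_frequency(n) for _ in range(500)]
    print(n, sum(f > 0.3 for f in runs) / len(runs))
```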

"We hope this method might be useful to confirm other ancient human dynamics not fully explained by archaeology," said Ohashi.

Credit: 
University of Tokyo

EHR medication lists lack accuracy, may threaten patient safety

image: A patient's electronic health record may not capture the most accurate, up-to-date information about ophthalmic medications, a new study finds.

Image: 
Michigan Medicine/Manifest

When it comes to keeping track of prescribed medications between clinic visits, many patients rely on printed medication lists automatically generated from electronic health records (EHRs).

An examination of the EHRs of a cohort of ophthalmology patients revealed that one-third had at least one discrepancy between the medications discussed in the clinician's notes and those on the medication list.

These findings raise concerns about patient safety and continuity of care.

The study, published in JAMA Ophthalmology, was conducted by investigators at the University of Michigan Kellogg Eye Center. The team examined medication-related information contained in the EHRs of patients treated for microbial keratitis between July 2015 and August.

"Corneal infection is an important disease condition to study ophthalmic medication lists because the medications change rapidly," says cornea specialist Maria Woodward, M.S., M.D., assistant professor of ophthalmology and the study's lead author.

Often, many medications are used, some requiring compounding, making telephone orders to specialty pharmacies common.

"Because of the multiple clinic visits and frequent medication changes," Woodward says, "it is imperative to have strong verbal and written communication between providers and patients who are battling corneal infections."

In a typical appointment, a provider verbally communicates medication instructions to the patient. At the same time, notes from that discussion are typed into an unstructured or "free text" section of the patient's EHR by either the doctor, a technician or a medical scribe.

The patient then receives a medication list generated from the EHR as part of a printed after-visit summary.

"That summary should confirm how the provider intends medications to be used," says Woodward, also a health services researcher at the U-M Institute for Healthcare Policy and Innovation.

The team found that one-third of patients had at least one medication mismatch in their records.
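
A back-of-the-envelope way to see what counts as a "mismatch" is to compare the medications mentioned in the free-text note against the structured list. The sketch below uses hypothetical drug names and ready-made sets; the study's reviewers did this comparison manually, and automating it on real notes would require clinical text processing.

```python
# Hypothetical example: medications from the free-text clinical note
# vs. the structured EHR medication list for one visit.
note_meds = {"moxifloxacin 0.5% drops", "fortified tobramycin drops"}
list_meds = {"moxifloxacin 0.5% drops", "prednisolone acetate 1% drops"}

in_note_only = note_meds - list_meds  # discussed but never listed
in_list_only = list_meds - note_meds  # listed but absent from the plan

has_discrepancy = bool(in_note_only or in_list_only)
print(has_discrepancy, in_note_only, in_list_only)
```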

While this is the first study focused on ophthalmic medications, the results are consistent with studies of medications used in other medical specialties.

"This level of inconsistency is a red flag," Woodward says. "Patients who rely on the after-visit summary may be at risk for avoidable medication errors that may affect their healing or experience medication toxicity."

Identifying the root cause

The switch to the EHR has led to many improvements in patient care. But as this study shows, it's not a perfect tool for the provider or the patient.

In a typical clinic visit, a prescription entered into the EHR triggers both an order to the patient's pharmacy and an update to the medication list.

But several scenarios can result in mismatches between the clinical notes and the medication list.

"Issues arise when a medication is started by an outside provider and continued at the new hospital and when patients require compounded medications that must be telephoned in to a pharmacist in the evening," Woodward says.

These scenarios expose a shortcoming of the EHR: Data about medications (and other information) is captured in multiple formats in multiple locations.

"The only way to ensure that the medication list is completely accurate is to double-document. The same information must be entered into the clinician's note and the formal medication list -- two separate places," Woodward says.

"In a busy clinical setting, our top priority is communicating directly with the patient and answering their questions," she says. "We're focused on clarifying the treatment plan and addressing concerns, so duplicating note taking does not rise to our primary mission."

To improve both the reliability of medication information patients depend on and the accuracy of data used for research, Woodward's study team recommends that EHR developers create software solutions to ease the burden of clinical documentation and make it easier to reconcile medication names and dosages.

Credit: 
Michigan Medicine - University of Michigan

Slime travelers

image: These are fossil beds of the Nilpena National Heritage Ediacara site.

Image: 
Scott Evans / UCR

New UC Riverside-led research settles a longstanding debate about whether the most ancient animal communities were deliberately mobile. It turns out they were, because they were hungry.

"This is the first time in the fossil record we see an animal moving to get food," said study lead Scott Evans, a UCR paleontology doctoral candidate.

Evans' team demonstrated that the 550-million-year-old ocean-dwelling creatures moved on their own rather than being pushed around by waves or weather. The research answers questions about when, why and how animals first developed mobility.

The team searched for evidence of movement in more than 1,300 fossils of Dickinsonia, dinner-plate-shaped creatures up to a meter long that lived and fed on a layer of ocean slime.

Details of the team's analysis were published this month in the journal Geobiology. The analysis found that Dickinsonia moved like worms, constricting and relaxing their muscles to go after their next meal of microorganisms.

Dickinsonia were first discovered in the 1940s and since then, scientists have debated whether the fossils showed evidence of self-directed movement. To test this, it was crucial that Evans be able to analyze how multiple creatures living in the same area behaved relative to one another.

Evans and study co-author Mary Droser, a UCR professor of paleontology, reasoned that if Dickinsonia were riding waves or caught in storms, then all the individuals in the same area would have been moved in the same direction. However, that isn't what the evidence shows.

"Multiple fossils within the same community showed random movement not at all consistent with water currents," Evans said.

Critically, Evans was able to use fossil communities in the Australian outback unearthed by Droser and paper co-author James Gehling of the South Australian Museum. The duo systematically excavated large bed surfaces containing as many as 200 Dickinsonia fossils, allowing Evans to test whether groups of the animals moved in the same or different directions.
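
One way to formalize that test is with circular statistics: bearings of individuals swept by a current should cluster tightly, while self-directed movers should scatter. A minimal sketch of the idea, with invented bearings rather than measurements from the fossil beds:

```python
import math

def mean_resultant_length(bearings_deg):
    """R near 1: headings aligned (passive transport plausible);
    R near 0: headings scattered (self-directed movement)."""
    n = len(bearings_deg)
    x = sum(math.cos(math.radians(b)) for b in bearings_deg) / n
    y = sum(math.sin(math.radians(b)) for b in bearings_deg) / n
    return math.hypot(x, y)

swept_by_current = [80, 85, 78, 82, 88, 79]  # hypothetical bearings
self_directed = [12, 200, 95, 310, 170, 48]

print(round(mean_resultant_length(swept_by_current), 2))  # close to 1.0
print(round(mean_resultant_length(self_directed), 2))     # much lower
```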

The team also analyzed the directions traveled by individual Dickinsonia.

"Something being transported by current should flip over or be somewhat aimless," Evans said. "These movement patterns clearly show directionality based on the animals' biology, and that they preferred to move forward."

Future studies at UCR will try to determine what Dickinsonia bodies were made of. "The tissues of the animals are not preserved, so it's not possible to directly analyze their body composition," he said. "But we will look at other clues they left behind."

Understanding Dickinsonia's capabilities offers insight not only into the evolution of animal life on Earth, but also into the Earth itself and possibly into life on other planets.

"If we want to search for complex life on other planets, we need to know how and why complex life evolved here," Evans said. "Knowing the conditions that enabled large mobile organisms to move during the Ediacaran era, 550 million years ago, gives us a clue about the habitable zone elsewhere."

That Dickinsonia could move helps confirm a large amount of oxygen was available in Earth's oceans during that time period, since it would have been required to fuel their movement. In a related study, Evans explored a spike in ocean oxygen levels during the Ediacaran period. Later, when oxygen levels dropped, Evans said that Dickinsonia - and things like them - went extinct.

Credit: 
University of California - Riverside

How you lock your smartphone can reveal your age: UBC study

image: Older smartphone users tend to rely more on their phones' auto lock feature compared to younger users, a new UBC study has found. They also prefer using PINs over fingerprints to unlock their phones.

Image: 
University of British Columbia

Older smartphone users tend to rely more on their phones' auto lock feature compared to younger users, a new UBC study has found. They also prefer using PINs over fingerprints to unlock their phones.

Researchers also found that older users are more likely to unlock their phones when they're stationary, such as when working at a desk or sitting at home.

The study is the first to explore the link between age and smartphone use, says Konstantin Beznosov, an electrical and computer engineering professor at UBC who supervised the research.

"As researchers working to protect smartphones from unauthorized access, we need to first understand how users use their devices," said Beznosov. "By tracking actual users during their daily interactions with their device, we now have real-world insights that can be used to inform future smartphone designs."

Analysis also showed that older users used their phone less frequently than younger users. For every 10-year interval in age, there was a corresponding 25 per cent decrease in the number of user sessions. In other words, a 25-year-old might use their phone 20 times a day, but a 35-year-old might use it only 15 times.
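
On one reading of that figure, the decline compounds with each decade of age. A two-line illustration follows: the 20-sessions-a-day baseline comes from the article's own example, while the compounding assumption is ours.

```python
# Roughly 25% fewer daily unlock sessions per decade of age,
# compounding from an example baseline of 20 sessions at age 25.
def daily_sessions(age, base_age=25, base_sessions=20.0):
    return base_sessions * 0.75 ** ((age - base_age) / 10)

for age in (25, 35, 45, 55):
    print(age, round(daily_sessions(age), 1))  # 20.0, 15.0, 11.2, 8.4
```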

The study tracked 134 volunteers, ranging from 19 to 63 years of age, through a custom app installed on their Android phones. For two consecutive months, the app collected data on lock and unlock events, choice of auto or manual lock and whether the phone was locked or unlocked while in motion. The app also recorded the duration of user sessions.

The study also found gender differences in authentication choices. As they age, men become much more likely than women to rely on auto locks rather than manually locking their devices.

In terms of overall use, women on average use their phone longer than men, with women in their 20s using their smartphones significantly longer than their male peers. However, the balance shifts with age, with men in their 50s logging longer usage sessions than women of the same age.

While the study didn't look at the reasons for these behaviours, Beznosov says the findings can help smartphone companies design better products.

"Factors such as age should be considered when designing new smartphone authentication systems, and devices should allow users to pick the locking method that suits their needs and usage patterns," he said, adding that future research should look into other demographic factors and groups of participants, and explore the factors involved in authentication decisions.

Credit: 
University of British Columbia

Study: More aggressive treatments needed to improve 5-year survival rate for glioblastoma

JACKSONVILLE, Fla. -- Despite improvements in median and short-term survival rates for patients with glioblastoma, the most common malignant brain tumor in adults, the percentage of patients achieving five-year survival remains low, according to new Mayo Clinic research.

A study to be published next month in Mayo Clinic Proceedings finds that little has changed in terms of five-year survival -- only 5.5% of patients live for five years after diagnosis -- and calls for more aggressive treatments to be considered for all glioblastoma patients.

Gliomas represent about 75% of malignant primary brain tumors, according to previous studies, and glioblastoma, a grade 4 glioma, is among the most aggressive forms of cancer. The retrospective analysis of 48,652 cases in the National Cancer Database from January 2004 through December 2009 found that 2,249 patients survived at least five years after diagnosis. Among those who reached five-year survival, the median length of survival was 88 months. Patients who did not survive five years had a median survival of just seven months.

"The introduction of chemotherapy in the treatment of glioblastoma was revolutionary, although this research suggests that chemotherapy serves more as a temporizing measure against disease recurrence and death," says Daniel Trifiletti, M.D., a Mayo Clinic radiation oncologist and senior author of the study. "Considerable work needs to be done to provide hope for patients with glioblastoma."

The study involved oncology, radiation and biomedical statistics researchers from Mayo Clinic's Florida and Minnesota campuses, and East Tennessee State University.

According to the study, factors associated with five-year survival included age, race and gender: those who achieved five-year survival were more likely to be younger, female and nonwhite. Other factors included generally good health, higher median income, tumors that were on the left side of the brain or outside the brainstem, and treatment with radiotherapy. Contrary to previous studies, tumor size did not appear to significantly affect the odds of long-term survival.

The findings suggest that more aggressive treatments that focus on long-term survival will be needed. "Although it is uncertain how this may be achieved, it likely will require novel and radical approaches to treatment of the disease," Dr. Trifiletti says. The study recommends that nearly all patients should be offered enrollment in a clinical trial.

Dr. Trifiletti says there are several studies underway testing promising new surgeries, radiation techniques and drug therapies. "To me, the most exciting area is in cellular therapy," he says. "In my lab, I am evaluating the possibility of using targeted cellular therapy as an agent that can synergize with existing therapies, including radiation."

Credit: 
Mayo Clinic

Major study finds no conclusive links to health effects from waste incinerators

Researchers have found no link between exposure to emissions from municipal waste incinerators (MWIs) and infant deaths or reduced foetal growth.

However, the researchers show that living closer to the incinerators themselves is associated with a very small increase in the risk of some birth defects compared with the general population, although whether this is directly related to the incinerators remains unclear.

The findings come from the largest and most comprehensive analysis to date of the effects of MWIs on public health in the UK.

MWIs are used to burn waste that is not recycled, composted or sent to landfill and can include materials such as paper, plastic, wood and metal. While MWI emissions are governed by EU regulations, public concern remains around their potential impact on public health, and scientific studies to date have been inconsistent or inconclusive.

The analysis, led by a team at Imperial College London and funded by Public Health England and the Scottish Government, looked at MWIs at 22 sites across the UK between 2003 and 2010.

Researchers from the UK Small Area Health Statistics Unit (SAHSU) at Imperial first analysed concentrations of fine particles called PM10 (particulate matter measuring 10 micrometres or less in diameter) emitted from the chimneys of the incinerators as waste is burned.

Computer models generated from the data showed how these particles spread over a 10 km radius around 22 MWIs in England, Scotland and Wales. The models show that MWIs added very little to the existing background levels of PM10 at ground level, with existing ground-level PM10 concentrations on average 100 to 10,000 times higher than the modelled contribution from the chimneys (Environmental Science & Technology, 2017).
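
The study relied on detailed atmospheric dispersion modelling; as a much simplified stand-in, the textbook Gaussian plume formula shows how a stack's contribution to ground-level concentration is estimated. All parameter values below are hypothetical.

```python
import math

def gaussian_plume(q, u, y, z, stack_h, sigma_y, sigma_z):
    """Steady-state Gaussian plume concentration (g/m^3) with ground
    reflection. q: emission rate (g/s); u: wind speed (m/s); y, z:
    crosswind and vertical position (m); stack_h: stack height (m);
    sigma_y, sigma_z: dispersion widths at the downwind distance (m)."""
    lateral = math.exp(-y**2 / (2 * sigma_y**2))
    vertical = (math.exp(-(z - stack_h)**2 / (2 * sigma_z**2))
                + math.exp(-(z + stack_h)**2 / (2 * sigma_z**2)))
    return q * lateral * vertical / (2 * math.pi * u * sigma_y * sigma_z)

# Ground-level, plume-centerline concentration with illustrative inputs;
# real models derive the sigma values from local meteorology.
print(gaussian_plume(q=1.0, u=5.0, y=0.0, z=0.0,
                     stack_h=80.0, sigma_y=100.0, sigma_z=50.0))
```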

Using these models, the team then investigated potential links between concentrations of PM10 emitted by MWIs and any increased risk of adverse birth outcomes. In an earlier study (Environment International, 2018), they found that analysis of records covering more than one million births in England, Scotland and Wales revealed no evidence of a link between small particles emitted by the incinerators and adverse birth outcomes such as effects on birthweight, premature birth, infant death, or stillbirth, for children born within 10 km of MWIs in Great Britain.

The team's latest findings, published in the journal Environment International, looked at the occurrence of birth defects within 10 km of a subset of 10 incinerators in England and Scotland between 2003 and 2010. In their analysis, the team used health data on more than 5,000 cases of birth defects among over 200,000 births, stillbirths and terminations in England and Scotland.

They found no association between birth defects and the modelled concentrations of PM10 emitted by MWIs, but there was a small increase in the risk of two birth defects among those living closer to MWIs - specifically congenital heart defects and hypospadias (affecting the male genitalia - where the opening of the urethra is not at the top of the penis). These birth defects typically require surgery but are rarely life-threatening.

In the UK, congenital heart defects affect approximately 5.3 per 1,000 births and 1.9 per 1,000 males are born with hypospadias (Source: NCARDRS 2016).

In terms of excess risk, the team estimates that the associated increase in risk for these two birth defects could be around 0.6 cases per 1,000 total births for congenital heart defects and 0.6 cases per 1,000 male births for hypospadias within 10 km of an incinerator.
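
Set against the national baselines quoted above, the arithmetic looks like this, treating the excess as simply additive to the baseline rate; the study frames it as an association, not a proven causal increment.

```python
# Baseline UK rates (NCARDRS 2016) and estimated excess within 10 km
# of an incinerator, both per 1,000 relevant births.
rates = {
    "congenital heart defects": (5.3, 0.6),  # per 1,000 total births
    "hypospadias": (1.9, 0.6),               # per 1,000 male births
}

for defect, (baseline, excess) in rates.items():
    print(f"{defect}: {baseline} -> {baseline + excess:.1f} "
          f"per 1,000 ({excess / baseline:+.0%})")
# heart defects: 5.3 -> 5.9 (+11%); hypospadias: 1.9 -> 2.5 (+32%)
```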

Professor Paul Elliott, Director of the UK Small Area Health Statistics Unit (SAHSU) said: "Based on the available data, our findings showing that there is no significant increased risk of infant death, stillbirth, preterm birth or effects on birthweight from municipal waste incinerators are reassuring. The findings on birth defects are inconclusive, but our study design means we cannot rule out that living closer to an incinerator in itself may slightly increase the risk of some specific defects - although the reasons for this are unclear."

Professor Mireille Toledano, Chair in Perinatal and Paediatric Environmental Epidemiology at Imperial, said: "In these studies we found a small increase in risk for children living within 10 km of an MWI being born with a heart defect, or a genital anomaly affecting boys, but did not find an association with the very low levels of particulates emitted. This increase with proximity to an incinerator may not be related directly to emissions from the MWIs. It is important to consider other potential factors such as the increased pollution from industrial traffic in the areas around MWIs or the specific population mix that lives in those areas."

Professor Anna Hansell, Director of the Centre for Environmental Health and Sustainability at the University of Leicester, who previously led the work while at Imperial College London, added: "Taken together, this large body of work reinforces the current advice from Public Health England - that while it's not possible to rule out all impacts on public health, modern and well-regulated incinerators are likely to have a very small, or even undetectable, impact on people living nearby."

The team explains that while the results of the emissions studies are reassuring, they cannot rule out a link between the increased incidence of the two birth defects and the activities of the MWIs. They add that while they adjusted their results for socioeconomic and ethnic status, these factors may still influence the birth outcome findings. Poorer families may live closer to MWIs due to lower housing or living costs in industrial areas, and their exposure to industrial road traffic or other pollutants may be increased.

The researchers highlight that their findings are limited by a number of factors. For example, they did not have measurements (for the hundreds of thousands of individual births considered) of metals or chemical compounds such as polychlorinated biphenyls (PCBs) and dioxins, but used PM10 concentrations as a proxy for exposure to MWI emissions, as has been done in other incinerator studies.

They add that ongoing review of evidence is needed to explore links further, as well as ongoing surveillance of incinerators in the UK to monitor any potential long-term impacts on public health.

Credit: 
Imperial College London

Largest study of CTE finds it in 6% of subjects

image: Dr. Kevin Bieniek of UT Health San Antonio is first author on a study that examined the brains of 300 athletes and 450 non-athletes for evidence of chronic traumatic encephalopathy (CTE). The study is the broadest to date of CTE. Dr. Bieniek is with the university's Glenn Biggs Institute for Alzheimer's and Neurodegenerative Diseases.

Image: 
UT Health San Antonio

SAN ANTONIO -- Nearly 6% of a combined group of athletes and non-athletes were found to have the neurodegenerative disorder chronic traumatic encephalopathy (CTE) in the largest and broadest study of the disease conducted to date. The findings were published June 14 in the international journal Brain Pathology.

"Generally our findings point to CTE being more common in athletes and more common in football players, but this study is a bit more balanced and accurately reflects the general population compared to previous studies," said lead author Kevin Bieniek, Ph.D., of UT Health San Antonio. Dr. Bieniek led the research while at the Mayo Clinic before moving to Texas. He now directs the brain bank at the Glenn Biggs Institute for Alzheimer's and Neurodegenerative Diseases, which is part of UT Health San Antonio.

Unbiased screen

CTE, linked with repetitive blows to the head, has been found in 80-99% of autopsied brains of pro football players. "Nobody has really looked at it from kind of an epidemiological perspective," Dr. Bieniek said. "We compared people who played a sport with those who didn't play. We studied both young and old people, and amateur players versus college and professional players. And we studied both men and women, which had not been done previously. What we aimed to do was an unbiased screen for CTE from all sorts of different cases."

Biographical information utilized

The team scanned obituaries and high school yearbooks of 2,566 individuals whose brain autopsies are a part of the Mayo Clinic Tissue Registry. The study focused on a variety of contact sports: baseball, basketball, boxing, football, hockey, lacrosse, soccer and wrestling. Non-contact sports, such as golf and tennis, were excluded.

This analysis identified 300 former athletes and 450 non-athletes. "We screened the brains of all of these cases for evidence of CTE in a blinded fashion, intentionally not knowing which brain tissue was related to which case," Dr. Bieniek said.

Findings

A small number of cases, 42, had CTE pathology (5.6% of the total). CTE was found in 27 athletes and 15 non-athletes, and in 41 men and one woman. American football had the highest frequency of CTE (15%) of the contact sports studied, with participation beyond high school resulting in the highest risk of developing CTE.

"The 42 cases, or 6%, is more of a grounded, realistic number," Dr. Bieniek said. "That might not seem like a lot, but when you consider there are millions of youth, high school and collegiate athletes in the United States alone who play organized sports, it has the potential of being a significant public health issue. There are many ongoing questions regarding CTE pathology, however, and we don't want to discourage sources of healthy physical and cardiovascular activity like these sports. Rather, we emphasize safe strategies to reduce the possibility of head injuries and properly treat them when they are sustained."

Non-athletes' cases

The identification of 15 CTE cases in non-athletes raises interesting questions, Dr. Bieniek said. "Did these people have trauma from another source?" he asked. "Were they actually athletes and we were unable to detect it from biographical information? Is there another disease with similar features?"

Cases with CTE tended to be a bit older than the cases without it, and many CTE cases also showed evidence of Alzheimer's disease. "At the Glenn Biggs Institute, we study the concept of multiple neurodegenerative disorders happening within the brain of a person who has dementia," Dr. Bieniek said.

The crucial role of donors

"This is an important national study led by our brain bank director, Dr. Bieniek," said Sudha Seshadri, M.D., professor of neurology at UT Health San Antonio and director of the Glenn Biggs Institute. "We have a great team of scientists at the Biggs Institute, and the brain bank is key to the research aims of these investigators. We are so grateful for the many patients and normal older persons who have signed on to be brain donors after their death. The program runs 24/7/365, is free to the family, and gives the family the peace and knowledge of a definitive diagnosis for their loved one's condition."

Several studies related to traumatic brain injuries and CTE by Dr. Bieniek and his colleagues are currently ongoing at UT Health San Antonio, including how certain genetic variants might protect or put a person at higher risk for developing CTE.

Credit: 
University of Texas Health Science Center at San Antonio

Deaths from cardiovascular diseases attributable to heat and cold down 38% in Spain

Temperature-related mortality has been decreasing in Spain over the past four decades, according to a new study led by the Barcelona Institute for Global Health (ISGlobal), a research centre supported by "la Caixa". The study analysed the Spanish population's vulnerability to hot and cold temperatures in the context of global warming.

The study, published in The Lancet Planetary Health, analysed temperatures and deaths related to cardiovascular diseases recorded in 48 Spanish provinces between 1980 and 2016. Cardiovascular diseases are the leading cause of death in Spain and there is clear evidence of an association between temperature and cardiovascular mortality.

The findings show that temperature-related cardiovascular disease mortality was 38.2% lower in the period between 2002 and 2016 than in the period between 1980 and 1994. Analysis of the data in 15-year periods revealed that temperature-related cardiovascular mortality decreased at a rate of more than 17% per decade.

Specifically, heat-related cardiovascular mortality for the period 2002-2016 was more than 42% lower in men and more than 36% lower in women than in 1980-1994, while cold-related mortality was 30% lower in women and nearly 45% lower in men.

Notable differences were observed between the sexes: heat-related mortality was much higher in women, while men were more vulnerable to cold temperatures. In older people, the risk of death attributable to high temperatures was significantly higher for both sexes, but in the case of cold the increase was significant only for men.

"We observed two parallel phenomena," explained Hicham Achebak, a researcher at ISGlobal and the Centre for Demographic Studies (CED) and lead author of the study. "First, over the past four decades the mean temperature has risen by nearly 1°C. The trend is towards fewer days of moderate or extreme cold temperatures and more days of high temperatures. Second, the Spanish population has adapted to both cold and warm temperatures. The number of deaths at a given temperature is lower now than it was four decades ago."

The adaptation observed appears to be due to socioeconomic development and structural improvements--including improvements in housing conditions and health care systems. The authors highlighted a number of socioeconomic developments in Spain, including increases between 1991 and 2009 in per capita income (€8,700 to €22,880) and per capita health care spending (€605 to €2,182). In addition, between 1991 and 2011, the proportion of households with central heating went from 25.8% to 56.9% while the proportion of households with air conditioning rose from 4.16% in 1991 to 35.5% in 2008.

"The Spanish population has demonstrated a considerable capacity to adapt to rising temperatures," commented Joan Ballester, ISGlobal researcher and coordinator of the study. "However, as this has not necessarily been the result of a strategy to mitigate the consequences of climate change, it is possible that this adaptive response is limited and will not be sustained at higher temperatures, as climate warming accelerates."

Credit: 
Barcelona Institute for Global Health (ISGlobal)

Brains of pairs of animals synchronize during social interaction

image: Assistant professor of biological chemistry and neurobiology at the David Geffen School of Medicine at UCLA.

Image: 
Dean Ishida/UCLA

FINDINGS

UCLA researchers have published a Cell study showing that the brains of pairs of animals synchronize during social situations. The synchronized activity arose during various types of social behavior, and the level of synchronization predicted how much the animals would interact. The team also found that brain synchrony arises from different subsets of neurons that encode the behavior of the self vs. the social partner, and that the dominant animal's behavior tends to drive synchronization more than the behavior of the subordinate.

BACKGROUND

Considerable research has been devoted to studying brain activity in individual animals behaving alone. Yet much of an animal's life is spent interacting with others -- socializing, competing and so forth -- and these social behaviors are generally quite complex, as an animal must not only react to other individuals, but actively predict their future behavior. Far less is understood about how brain activity functions across interacting animals. Using sophisticated recording devices, the research team set out to simultaneously monitor activity in the brains of two interacting mice, making this the first study to use the technique in two animals behaving naturally together.

METHOD

The researchers attached tiny, high-tech microscopes to the heads of each mouse, which recorded activity in hundreds of individual brain cells. Fitted with the devices, the mice were placed together in pairs, first in open arenas to freely interact, and later in plastic tubes -- a common method of observing competition and social hierarchy, as the dominant mouse tends to claim more of the tube's "territory" by pushing against the subordinate mouse, or pushing it out of the tube completely.

When the mice interacted with each other, their brain activity was correlated, or synced up. The more engaged they were with one another, the more coupled were their brains. This brain synchronization arose from individual cells -- interestingly, some cells responded preferentially to the behavior of the self, while other cells responded only to the behavior of the social partner. The dominant mouse's behavior tended to have more of an effect on synchronization than that of the subordinate mouse, likely because both animals in a pair are paying attention to the dominant animal.
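
In its simplest form, the synchrony measure comes down to correlating activity traces from the two brains. A toy sketch with synthetic data follows; the actual study recorded hundreds of individual neurons per animal, far richer than this one-trace-per-mouse illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical per-frame population activity for two interacting mice:
# a shared, socially driven component plus independent noise.
shared = rng.normal(size=1000)
mouse_a = 0.7 * shared + rng.normal(size=1000)
mouse_b = 0.7 * shared + rng.normal(size=1000)

# Interbrain synchrony as a simple Pearson correlation.
r = np.corrcoef(mouse_a, mouse_b)[0, 1]
print(f"interbrain correlation r = {r:.2f}")  # around 0.3 here
```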

IMPACT

This is the first time that interbrain synchrony has been observed in socializing mice. Researchers believe that the insights gained from this study may shed new light on how brain activity synchronizes across humans during social interaction. Beyond adding clarity to fundamental properties of brain function in social interaction, the findings may also enable researchers to understand more about certain psychiatric and developmental disorders, including autism spectrum disorder, since many of these conditions include symptoms such as social deficits.

Credit: 
University of California - Los Angeles Health Sciences

Study: Phenols in cocoa bean shells may reverse obesity-related problems in mouse cells

image: Cocoa shells, a waste byproduct of roasting cocoa beans to produce chocolate, contain significant amounts of three healthful bioactive compounds that are also found in cocoa, coffee and tea.

Image: 
Fred Zwicky

CHAMPAIGN, Ill. -- Scientists may have discovered more reasons to love chocolate.

A new study by researchers at the University of Illinois suggests that three of the phenolic compounds in cocoa bean shells have powerful effects on the fat and immune cells in mice, potentially reversing the chronic inflammation and insulin resistance associated with obesity.

Visiting scholar in food science Miguel Rebollo-Hernanz and Elvira Gonzalez de Mejia, a professor in the department, found that cocoa shells contain high levels of three beneficial bioactive chemicals also found in cocoa, coffee and green tea - protocatechuic acid, epicatechin and procyanidin B2.

Rebollo-Hernanz, the study's lead author, created a water-based extract containing these compounds and tested its effects on white fat cells called adipocytes and immune cells called macrophages. Using computer modeling and bioinformatic techniques, he also examined the impact that each of the phenolics individually had on the cells.

"The objectives of the study were to test whether the bioactive compounds in the cocoa shells were efficacious against macrophages - the inflammatory cells - at eliminating or reducing the biomarkers of inflammation," said de Mejia, also a director of nutritional sciences. "We wanted to see if the phenolics in the extract blocked or reduced the damage to fat cells' mitochondria and prevented insulin resistance."

Mitochondria act like batteries within cells, burning fat and glucose to generate energy, but they can become damaged when high levels of fat, glucose and inflammation occur in the body, de Mejia said.

When the scientists treated adipocytes with the aqueous extract or the three phenolic compounds individually, damaged mitochondria in the cells were repaired and less fat accumulated in the adipocytes, blocking inflammation and restoring the cells' insulin sensitivity, Rebollo-Hernanz said.

The scientists reported their findings recently in a paper published in the journal Molecular Nutrition and Food Research.

When adipocytes accumulate too much fat, they promote the growth of macrophages. This initiates a toxic cycle in which the adipocytes and macrophages interact, emitting toxins that inflame fat tissue, de Mejia said.

Over time, this chronic inflammation impairs cells' ability to take up glucose, leading to insulin resistance and possibly type 2 diabetes as glucose levels in the blood escalate.

To recreate the inflammatory process that occurs in the body when macrophages and adipocytes begin their toxic dance, Rebollo-Hernanz grew adipocytes in a solution in which macrophages had been cultured.

"That's when we observed that these inflammatory conditions in the solution increased the oxidative damage" to the fat cells' mitochondria, he said.

Fewer mitochondria were present in the adipocytes that were grown in the solution, and the mitochondria that did exist in these cells were damaged, he found.

When the scientists treated the adipocytes with the phenolics in the extract, however, the adipocytes underwent a process called browning, in which they differentiated - that is, converted - from white adipocytes into another form called beige adipocytes.

Beige adipocytes are a specialized form of fat tissue with greater numbers of mitochondria and enhanced fat-burning efficiency.

"We observed that the extract was able to maintain the mitochondria and their function, modulating the inflammatory process and maintaining the adipocytes' sensitivity to insulin," Rebollo-Hernanz said. "Assuming that these phenolics were the main actors in this extract, we can say that consuming them could prevent mitochondrial dysfunction in adipose tissue."

Cocoa shells are a waste byproduct that's generated when cocoa beans are roasted during chocolate production. About 700,000 tons of the shells are discarded annually, causing environmental contamination if not disposed of responsibly, de Mejia said.

In addition to providing cocoa producers with another potential revenue stream, processing the shells to extract the nutrients would reduce the environmental toxicants generated currently by cocoa shell waste, de Mejia said.

Once extracted from cocoa bean shells, the phenolic compounds could be added to foods or beverages to boost products' nutritional value, she said.

Credit: 
University of Illinois at Urbana-Champaign, News Bureau

NASA selects SwRI's PUNCH mission to image beyond the Sun's outer corona

image: NASA has selected Southwest Research Institute to lead a microsatellite mission to image the Sun's outer corona. PUNCH proposes a constellation of four suitcase-sized satellites that will orbit the Earth, studying how the Sun's corona connects with the interplanetary medium, to better understand how coronal structures infuse the solar wind with mass and energy.

Image: 
Southwest Research Institute

SAN ANTONIO -- June 20, 2019 -- NASA has selected Southwest Research Institute to lead the "Polarimeter to Unify the Corona and Heliosphere" (PUNCH) mission, a landmark Small Explorers Program mission that will image beyond the Sun's outer corona.

PUNCH will consist of a constellation of four suitcase-sized microsatellites or "microsats" that will launch as early as 2022. The microsats will orbit the Earth in formation to study how the Sun's atmosphere, or corona, connects with the interplanetary medium. PUNCH will provide the first global images of how the solar corona infuses the solar wind with mass and energy.

"The vacuum of space between the planets is not completely empty -- it is actually filled with a tenuous, hypersonic 'solar wind' that streams out from the corona and affects spacecraft and planets -- including our own," said PUNCH Principal Investigator Dr. Craig DeForest, a scientist and program director in SwRI's Space Science and Engineering Division. "PUNCH will observe the 'no-man's land' between the outer solar corona and the solar wind, giving us our first clear images of the entire system connecting the Sun and Earth."

PUNCH will track and image the solar wind as it emerges from the solar corona, transitions to interplanetary space and streams through the solar system, bathing the planets and other solar system bodies. These measurements will reveal how and why the material coming from the star becomes gusty and turbulent en route to Earth.

In addition, the PUNCH satellites will track in 3D the Sun's coronal mass ejections, also known as "CMEs" or "space storms," as they erupt from the corona out into interplanetary space. CMEs cause some "space weather" events that affect Earth, which can threaten astronauts, damage satellites, black out power grids, and disrupt communication and GPS signals.

"Most of what we know about the space weather delivered by the solar wind comes from direct sampling by spacecraft embedded in it," said PUNCH Project Scientist Dr. Sarah Gibson, acting director of the High Altitude Observatory in Boulder, Colorado. "This is like understanding global weather patterns based on detailed measurements from a few individual weather stations on the ground. PUNCH is more like a weather satellite that can image and track a complete storm system as it evolves across an entire region."

The four spacecraft will fly in a distributed formation spread around the globe, operating in sync to produce polarized images of the entire inner solar system every few minutes. Each of the four PUNCH spacecraft carries a specialized camera to capture faint glimmers of sunlight reflected by free electrons in interplanetary space.

One spacecraft carries a Narrow Field Imager that captures the outer corona itself, and the others carry SwRI-developed Wide Field Imagers (WFIs). Dark baffles enable the WFIs to photograph space weather effects that are over a thousand times fainter than the Milky Way, despite flying in direct sunlight.

"Photographing the sky in polarized light is the secret sauce of the mission," DeForest said. "When sunlight bounces off electrons, it becomes polarized. That polarization effect lets us measure how solar wind features move and evolve in three dimensions, instead of just a 2D image plane. PUNCH is the first mission with the sensitivity and polarization capability to routinely track solar wind features in 3D."

"The Explorers Program seeks innovative ideas for small, cost-constrained missions that can help unravel the mysteries of the universe," said Dr. Paul Hertz, director of NASA's Astrophysics Division. "PUNCH absolutely meets the standard to solve mysteries about the Sun's corona, the Earth's atmosphere and magnetosphere, and the solar wind."

Credit: 
Southwest Research Institute

Moral concerns override desire to profit from finding a lost wallet

ANN ARBOR--The setup of a research study was a bit like the popular ABC television program "What Would You Do?"--minus the television cameras and the big reveal at the end.

An international team of behavioral scientists turned in 17,303 "lost" wallets containing varying amounts of money at public and private institutions in 355 cities across 40 countries. Their goal was to see just how honest the people who handled them would be when it came to returning the "missing" property to its owners. The results were not quite what they expected.

"Honesty is important for economic development and more generally for how society functions in almost all relationships," said Alain Cohn, assistant professor at the U-M School of Information. "Yet, it often is in conflict with individual self-interest."

The wallets either contained no money, a small amount ($13.45) or a larger sum ($94.15). Each wallet had a transparent face revealing a grocery list along with three business cards with a fictitious person's name, title and an email address printed on them.

Research assistants posed as the wallet finders, hurriedly dropping them off in such places as banks, theaters, museums or other cultural establishments, post offices, hotels, police stations, courts of law or other public offices so as to avoid having to leave their own contact information. Most of the activity occurred in 5-8 of the largest cities in each country, totaling approximately 400 observations per country.

The experiment on honesty most likely represents the truest picture of how people respond when no one is looking, and the results were surprising in more ways than one, researchers report in the current issue of Science.

Initially, the researchers went into the field experiment expecting to find a dollar value at which participants would be inclined to keep the money, believing the prevailing thought that the more cash in the wallet, the more tempting it would be for the recipients to take it and run.

Instead, the team from U-M, the University of Zurich and the University of Utah found that in nearly all of the countries, the wallets with greater amounts of money were more likely to be returned.

In 38 of 40 countries, citizens overwhelmingly were more likely to report lost wallets with money than without. Across the globe, 51% of those who were handed a wallet with the smaller amount of money reported it, compared with 40% of those who received a wallet with no money. When the wallet contained a large sum of money, the rate of return increased to 72%.

"The psychological forces--an aversion to not viewing oneself as a thief--can be stronger than the financial ones," said co-corresponding author Michel André Maréchal of the University of Zurich.

Not all wallets in the field experiment were returned, however. Among the other surprises were some of the places where people were not so honest. Wallets dropped off at the Vatican and at two anti-corruption bureaus were among those that never made their way back to the "rightful owners."

Cohn said unlike other research of its kind, in which people knew they were being observed--usually in laboratory settings in wealthier Western, industrialized nations--the data in this field study was gathered from people across the world, in natural settings, who had no idea anyone was watching.

"It involves relatively high stakes in some countries. Previous studies focused on cheating in modest stakes," Cohn said.

After getting the field results, the team surveyed more than 2,500 people in the United Kingdom, the United States and Poland to better understand why honesty matters to us more than the money. The respondents were presented with a scenario that matched the field experiment and asked questions about how they would respond if presented with a lost wallet. Similar to the field study, those in this survey said failing to return a wallet felt like stealing when more money was involved.

The team also conducted a survey of 279 economists and experts in the field, who predicted participants likely would keep the money. Another survey of nearly 300 people in the U.S. also showed that when predicting the behavior of others, respondents believed civic honesty would waver when the amount of money was higher. While the experts had a bit more faith in the honesty of individuals, both groups believed the more money in the wallet, the more tempting it would be to keep it.

Credit: 
University of Michigan

People globally return 'lost' wallets more as money increases

In a study of how people in 40 countries decided to return (or not) "lost" wallets, researchers were surprised to find that - in contrast to classic economic logic - people returned the wallets holding the greater amounts of money more often. Across the globe, in 38 of the countries studied, honesty in this way increased as the incentive to cheat (higher wallet value) went up. Honesty is important for economic development - as it relates to contracts and taxes, for example - and for how society functions in relationships more generally. Yet it often is in conflict with individual self-interest. Alain Cohn et al. wanted to more rigorously evaluate trade-offs between honesty and self-interest, teasing apart the motivations for the former among people handling "missing" property. In a global field study, Cohn and an international team turned in 17,303 "lost" wallets - containing no money or the equivalent of US $13.45, or in some cases US $94.15 - to public and private institutions in 355 cities across 40 countries. Though the researchers went into the experiment expecting to find a dollar value at which participants would be inclined to keep the money, they found that in 38 of 40 countries, citizens overwhelmingly were more likely to report lost wallets with money than without. Overall across the globe, 51% of those who were handed a wallet with the smaller amount of money reported it. When the wallet contained a large sum of money, the rate of return increased to 72%.
The results, a measure of overall rates of civic honesty across the globe, offer a demonstration of how people's inclinations to show concern for others can be greater than their preferences for self, though that's not the whole story, the researchers say. People also returned the wallets with greater values because of concern for self-image. "The psychological forces--an aversion to not viewing oneself as a thief--can be stronger than the financial ones," said co-author Michel André Maréchal. This was explored in follow-up surveys designed to better understand why honesty matters to people more than the money. As well, several additional experiments in the field - including adding a key to the wallet, which increased wallet return - confirmed that wallet returners gave wallets back both because they care about the person who lost the wallet and also because they care about their self-image. The results, say the authors, were a surprise both to the public, who predicted that returns would go down as money increased, and to professional economists. In a Perspective, Shaul Shalvi discusses how the finding rejects the classic economic logic suggesting only self-interest drives behavior. "Instead," says Shalvi, "it shows the psychological considerations of acting altruistically and wishing to do the right thing predicts people's behavior."

Credit: 
American Association for the Advancement of Science (AAAS)

First results from ruminant genome project will inform agriculture, conservation and biomedicine

A trio of Reports and a Perspective in this issue present the Ruminant Genome Project's (RGP) initial findings, which range from explaining how deer antlers exploit cancer-associated signaling pathways to regenerate, to identifying reindeer genetic adaptations - including some related to circadian rhythm - that have helped these animals thrive in the frigid Arctic. The work provides an unprecedented look into the genomics, evolution and adaptation of ruminants, a group of highly successful and diverse mammals of significant agricultural, conservation and biomedical importance, and one that includes many well-known domestic and wild taxa, such as cows, goats, reindeer and giraffes.
Despite the fact that ruminant taxa can be found in most places on the planet, the evolutionary origin and diversification of ruminants as well as the genetics underlying their unique traits remain relatively unknown. To better resolve ruminant genetics, Lei Chen and colleagues assembled the genomes of 44 ruminant species across all six Ruminantia families - a dataset encompassing more than 40 trillion base pairs. Chen et al. then used these, as well as other ruminant genomes, to create a time-calibrated phylogenetic tree of the group, which was able to resolve the evolutionary history of many ruminant genera. Interestingly, the results revealed large declines in ruminant populations nearly 100,000 years ago, reductions that coincide with the migration of humans out of Africa and may be evidence of early humans' impact on various ruminant species, the authors say.
In another Report, Yu Wang and colleagues examine the underlying genetics and evolution of ruminant headgear. Ruminants are the only living group of mammals that possess bony headgear, but despite its common composition and cranial location, its form and function vary among families. Antlers, like those carried by deer, are capable of rapid growth - as much as 2.5 centimeters a day - and have become of particular interest in regenerative biology. By comparing 221 transcriptomes from headgear-bearing ruminant families and the genomes of two lineages that convergently lack headgear against the genomic background provided by the RGP, Wang et al. found that the horns of bovids and the antlers of cervids share similar genetic and cellular roots. Most striking, however, is the discovery that the regenerative properties of antler tissue are made possible through the unique exploitation of cancer-associated signaling pathways and the high expression of tumor-suppressing genes. Intriguingly, the genes and regulatory sequences expressed in these animals allow for tissue regeneration without cancer growth.
Reindeer harbor a variety of biological adaptations that allow them to thrive in Arctic environments and survive harsh conditions such as extreme cold, limited food availability and prolonged periods of light and dark. What's more, reindeer are the only fully domesticated cervid species. However, the underlying genetic basis of their unique traits remains largely unknown. In a third Report, Lin et al. closely evaluate the animal's genome and discover that several genes related to circadian arrhythmicity, vitamin D metabolism, docility and female antler growth are uniquely mutated, under positive selection, or both, in reindeer. While the results provide a genetic basis for reindeer Arctic adaptation and domestication, they may also provide insights relevant to human health, the authors suggest. For example, the newly identified genes related to circadian arrhythmicity could inform approaches to treat seasonal affective disorders, insomnia and depression. Finally, in a Perspective, Dai Fei Elmer Ker and Yunzhi Peter Yang discuss the potential implications of the three RGP Reports for future biomedical efforts.

Credit: 
American Association for the Advancement of Science (AAAS)

For global fisheries, it's a small world after all

Even though many nations manage their fish stocks as if they were local resources, marine fisheries and fish populations are a single, highly interconnected and globally shared resource, a new study emphasizes. This single resource transcends international management and economic zones due in large part to the dispersal of fish eggs and tiny larvae that can drift far and wide upon swift ocean currents, researchers report. The results of their global analysis of the international connectivity and economic contributions of more than 700 commercially harvested species worldwide reveal an ocean-spanning, "small-world" network within global fisheries. According to the authors, such connectivity suggests that poorly managed fisheries or environmental disturbances in some critically important areas could have large economic implications for fisheries further afield and for the millions of people who rely on the food and livelihoods they provide. Many nations manage their fish stocks as if they were a local resource - independently and within internationally defined Exclusive Economic Zones (EEZs). Fish, however, care not for such arbitrary divisions, and populations are often connected well beyond administrative boundaries. While this is particularly true for their itinerant larvae, the impact of larval connectivity across EEZs on fisheries is poorly understood. Nandini Ramesh and colleagues used dynamic ocean modeling, network analysis and life-history data from 747 fished species to determine the dispersal of larvae between EEZs. The results indicate that the international flow of larvae may account for substantial amounts of total marine catch - perhaps more than US $10 billion annually - underscoring the economic risks associated with a nation's dependency on EEZs outside its jurisdiction. According to Ramesh et al., it is this interdependence of fisheries across the globe that forms a single, global network characterized by tight interconnections and particularly important hubs of productivity.
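
The network framing is concrete: treat each EEZ as a node and larval transport as weighted, directed edges, then look for hubs. A minimal sketch with made-up flows; the study derived such connections from ocean-current simulations across the 747 species.

```python
import networkx as nx

# Hypothetical fractions of spawn from one EEZ that settle in another.
flows = {
    ("EEZ_A", "EEZ_B"): 0.30,
    ("EEZ_A", "EEZ_C"): 0.05,
    ("EEZ_B", "EEZ_C"): 0.25,
    ("EEZ_C", "EEZ_A"): 0.10,
}

G = nx.DiGraph()
for (source, sink), weight in flows.items():
    G.add_edge(source, sink, weight=weight)

# Weighted degrees flag hub zones: heavy exporters of larvae (out)
# and zones whose stocks depend on upstream neighbors (in).
print(dict(G.out_degree(weight="weight")))  # larval supply exported
print(dict(G.in_degree(weight="weight")))   # larval supply received
```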

Credit: 
American Association for the Advancement of Science (AAAS)