
Voluntary pact with food industry to curb salt content in England linked to thousands of extra cases of heart disease and stroke

Projected 26,000 extra cases of heart disease/stroke and 3800 cases of stomach cancer by 2025, without strategy change, warn researchers

Those living in areas of greatest deprivation hardest hit

Estimated costs of treatment and lost productivity in excess of £1 billion

Since the introduction of the voluntary pact the UK government made with the food industry in 2011 to curb the salt content of food, the reduction in dietary salt intake in England has slowed significantly, reveals the first study of its kind, published online in the Journal of Epidemiology & Community Health.

It is estimated that this may have been responsible for an additional 9900 cases of heart disease/stroke and an extra 1500 cases of stomach cancer up to 2018--diseases both associated with excess dietary salt--compared with the period before the pact.

Without a change in strategy, this toll is estimated to reach 26,000 extra cases of heart disease/stroke and 3800 additional stomach cancer cases by 2025, widening health inequalities in the process, and adding up to more than £1 billion in healthcare and lost productivity costs, calculate the researchers.

The Public Health Responsibility Deal was a voluntary pact made between the UK government and industry to improve the nation's health by, among other things, reducing the salt content of food.

Before its introduction in 2011, the Food Standards Agency spearheaded a salt reduction strategy, which included voluntary agreements with industry to reformulate processed foods; public awareness campaigns; and food labelling.

But crucially, the strategy set specific targets to be achieved, with the threat of statutory imposition if these weren't met.

Despite the international popularity of public-private partnerships, such as the Responsibility Deal, to improve population health, these collaborations tend not to be properly evaluated, say the researchers.

To try to rectify this, they drew on data from the National Diet and Nutrition Survey (2000, 2001) and national sodium intake surveys taken from the Health Survey for England for the years 2006, 2008, 2011 and 2014.

They then assessed the effect of changes in dietary salt intake on new cases of heart disease/stroke and stomach cancer, using a validated mathematical method (IMPACT) that closely mimics the impact of changing risk factors on the development of disease.

These data were then combined with published estimates of the healthcare and workplace productivity costs associated with cardiovascular disease and stomach cancer.

In 2000-01, average daily dietary salt intake was 10.5 g for men and 8 g for women in England. Between 2003 and 2010, average annual intake fell by 0.2 g among men and by 0.12 g among women.

But between 2011 and 2014, annual reductions in dietary salt intake slowed to 0.11 g among men and to 0.07 g among women.
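The size of the slowdown can be illustrated with plain arithmetic on the reported rates. The sketch below is a simple linear comparison and is not the IMPACT model the study actually used for its projections:

```python
# Back-of-envelope comparison of the salt-intake decline rates reported in
# the article. Linear extrapolation is a simplifying assumption here; the
# study's own projections come from the IMPACT model.

RATE_PRE_2011 = {"men": 0.20, "women": 0.12}   # g/day reduction per year, 2003-2010
RATE_POST_2011 = {"men": 0.11, "women": 0.07}  # g/day reduction per year, 2011-2014

def shortfall(group, years=4):
    """Extra daily salt intake after `years` years, relative to the pre-2011 trend (g/day)."""
    return (RATE_PRE_2011[group] - RATE_POST_2011[group]) * years

for group in ("men", "women"):
    print(f"{group}: +{shortfall(group):.2f} g/day vs the pre-2011 trajectory after 4 years")
```

Over the four years to 2014, this leaves average intake roughly 0.36 g/day (men) and 0.20 g/day (women) above where the pre-pact trend would have put it.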

IMPACT analysis estimated that between 2011 and 2018 this trend may have been responsible for around 9900 extra cases of heart disease/stroke plus 710 associated deaths, as well as 1500 additional cases of stomach cancer and 610 associated deaths.

If current trends in salt intake continue, the equivalent estimates rise to 26,000 extra cases and 5500 extra deaths from heart disease/stroke and 3800 additional cases of stomach cancer by 2025, compared with the trends before 2011.

What's more, those living in the most deprived areas of the country will have been hit the hardest, so widening health inequalities, the calculations suggest.

Estimates of the economic impact in terms of healthcare costs and lost productivity associated with these extra cases add up to £160 million between 2011 and 2018, rising further to more than £1 billion by 2025, the calculations show.

This is an observational modelling study and, as such, can't establish cause. The researchers also acknowledge that the study did not collect long-term data on salt intake from the same people, which may have affected the findings.

Nevertheless, their findings echo those of other studies and official data, they point out. They highlight that the Responsibility Deal may have been particularly unfavourable for those living in the most deprived areas of the country, so widening, rather than narrowing, health inequalities.

"Public-private partnerships such as the [Responsibility Deal], which lack robust and independent target setting, monitoring, and enforcement are unlikely to produce optimal health gains," they conclude.

Credit: 
BMJ Group

Study examines differences over time in home dialysis initiation by race and ethnicity

Highlights

Among U.S. patients who started dialysis in 2005 to 2013, racial/ethnic differences in initiating home dialysis decreased over time, although in the most recent era, Blacks were still less likely to use home dialysis as the initial modality than other groups.

Racial/ethnic differences in transfer from home dialysis to hemodialysis performed in dialysis facilities did not change over time.

Minority patients continued to have lower mortality and kidney transplantation rates than White patients.

Washington, DC (July 18, 2019) -- A recent analysis reveals that as home dialysis increased from 2005 to 2013 among U.S. patients with kidney failure, racial/ethnic differences in initiating home dialysis narrowed. The findings, which appear in an upcoming issue of CJASN, indicate that all racial/ethnic groups are increasingly using this form of dialysis.

Dialysis is a life-saving treatment for many individuals with kidney failure that can be done either at home by the patient or at a dialysis facility by trained personnel. The use of home dialysis in the United States has increased over the last 10 years due to payment reforms and educational efforts. Because home dialysis provides patients with more autonomy and flexibility, it is important that all patients are educated about and offered home dialysis as an option.

Historically, minority patients have been less likely to use home dialysis than non-Hispanic White patients. To examine whether the recent growth in home dialysis use was proportional among all racial/ethnic groups, Jenny Shen, MD, MS (Los Angeles Biomedical Institute at Harbor-UCLA Medical Center) and her colleagues analyzed information on all patients listed in the United States Renal Data System who initiated dialysis from 2005 to 2013.

Of the 523,526 patients initiating dialysis from 2005 to 2013, 55% were White, 28% Black, 13% Hispanic, and 4% Asian. Among the study's findings:

In the earliest era (2005 to 2007), 8.0% of White patients initiated dialysis with home modalities, as did a similar proportion of Asians (9.2%), while lower proportions of Black (5.2%) and Hispanic (5.7%) patients did so.

Over time, home dialysis use increased in all groups and racial/ethnic differences decreased. In 2011 to 2013, rates were 10.6% for Whites, 8.3% for Blacks, 9.6% for Hispanics, and 14.2% for Asians.

Compared with White patients, the risk of transferring to in-center hemodialysis was higher in Blacks, similar in Hispanics, and lower in Asians, and these differences remained stable over time.

The mortality rate was lower for minority patients than for White patients, and this difference increased over time.

Transplantation rates were lower for Blacks and similar for Hispanics and Asians, and the difference in transplantation rates between Blacks and Hispanics vs. Whites increased over time.
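The narrowing of the initiation gap can be read directly off the percentages above. A minimal sketch, using only the figures quoted in the findings:

```python
# Illustrative check that the racial/ethnic gap in home-dialysis initiation
# narrowed between eras, using only the percentages quoted above
# (pp = percentage points).

early = {"White": 8.0, "Black": 5.2, "Hispanic": 5.7, "Asian": 9.2}   # 2005-2007
late  = {"White": 10.6, "Black": 8.3, "Hispanic": 9.6, "Asian": 14.2} # 2011-2013

def gap_change(group):
    """Return the White-minus-group gap in each era (pp)."""
    gap_early = early["White"] - early[group]
    gap_late = late["White"] - late[group]
    return gap_early, gap_late

for group in ("Black", "Hispanic", "Asian"):
    g0, g1 = gap_change(group)
    print(f"{group}: gap vs White went from {g0:+.1f} pp to {g1:+.1f} pp")
```

For Black patients the gap shrinks from 2.8 to 2.3 percentage points, and for Hispanic patients from 2.3 to 1.0, while Asian patients start and remain ahead of White patients.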

"The nephrology community has been understandably pleased by the increase in the use of home dialysis in the past decade. It was important to study whether all racial and ethnic groups have experienced this rise in home dialysis use or whether some groups had continued to lag behind," said Dr. Shen. "We found that racial and ethnic differences in the initiation of dialysis with home dialysis have narrowed without any deleterious impact in relative rates of transfer to in-center hemodialysis and death."

In an accompanying editorial, Kerri Cavanaugh, MD, MHS (Vanderbilt University Medical Center) noted that "although disparities remain in the use of home dialysis by race/ethnicity in the U.S., there is optimism that this will be a statistic of the past as we keep it front and center as we show that health equity is an achievable outcome when the many talents, resources, and compassion of the nephrology community come together."

Credit: 
American Society of Nephrology

Access to contraception not 'silver bullet' to stem population growth in Africa

Greater economic development across Africa in the years ahead could cause its population to grow at an even quicker rate than current projections, according to an important new demographic study released today.

According to UN estimates, the population of sub-Saharan Africa is set to double by 2050, which could add an additional one billion people to the world's population. Yet the new findings argue that the eventual figure could be even higher, challenging common misconceptions about the reasons for population change in Africa.

The research, from University of Bath demographer Dr Melanie Channon, suggests that a focus on contraception and access to family planning as a means to stem population growth, while important, fails to address significant cultural factors impacting the decisions of women across Africa. Whilst many believe that rising levels of economic prosperity and higher levels of education should equate with women automatically desiring fewer children, the results from Africa challenge this assumption. Across sub-Saharan Africa, and in spite of rising prosperity and increasing education, many women still want large families.

The researchers suggest that the reasons for this are complex; however, they point to factors such as the status and social security children provide as potential explanations for why women might still desire bigger families.

Published in the journal PLOS ONE, the research looked at the mismatch between the number of children women want and the number of children they actually have. The team used survey data from 58 low- and middle-income countries across Africa, Latin America and Asia to highlight how:

In sub-Saharan Africa as a whole, women have just over 5 children on average, yet want more (5.9). In the same region, women with higher education have 2.7 children on average, yet again want more (3.7)

There are no countries in Africa where women want fewer than 2.5 children. In most countries women want more than 4 children.

Findings from Africa are at odds with results observed in Asia and South America where fertility and population growth are much lower, where access to contraception is generally better and where it is far more common for women to have more children than they want.

On average, women in Africa with no education want 2.4 more children than is the case in other parts of the world; women in Africa with higher education want 1 more child than women in an equivalent position in other lower and middle income countries.

Describing this situation as 'puzzling' and an example of 'African exceptionalism', the team behind the study suggest that the high levels of women wanting more children might explain the continent's unusual fertility transition, which started later and is proceeding slower than in other regions. They highlight how fertility declines have not only been modest in Africa, but that the decline in fertility has stalled in multiple countries; something rarely seen in other contexts.

Dr Melanie Channon from the University of Bath's Department of Social & Policy Sciences explains: "While it is true that women in many African countries are beginning to have fewer children, it is common for this to be in spite of their preferences rather than because of them.

"There may be many reasons why there is a mismatch between desired numbers of children and the actual children women have, but on the basis of these results we should not ignore the possibility that women in Africa could actually choose to have more children if their circumstances improve."

According to the paper, a small group of African countries - Central African Republic, Chad and Niger - stand out: even women with higher education often still want 5 children or more. The study describes this as 'an extraordinarily high number for such a high level of education' and suggests that population growth in Africa is following a path unlike that seen in the rest of the world.

Dr Channon adds: "There is a strong tendency, in particular in the West, to assume that African women would have less children if only they had better access to contraception and high quality education and that rising population growth is caused by insufficient family planning. But this misconception presupposes that women want to have fewer children, which our research shows is often not the case. Increased access to contraception and education are important, but may not result in fertility declines as substantial as we have seen in other areas of the world.

"On average, nearly three quarters of women in African countries are failing to have the number of children that they want, but for many this is not down to a lack of contraception - they want more children not fewer. This is particularly common in Western and Middle African countries, where the average woman has over 5 children, but still wants more.

"Given the challenges that population growth in Africa present, we must listen to why women want so many children rather than just concentrating on providing family planning services. We need to recognise that access to contraception is not the 'silver bullet' to tackling population growth that many believe."

Credit: 
University of Bath

TGen-led study finds link between gene and severe liver damage

PHOENIX, Ariz. -- July 18, 2019 -- Researchers have found that a gene known as AEBP1 may play a central role in the development, severity and potential treatment of liver disease, according to a study by Temple University, the Geisinger Obesity Institute and the Translational Genomics Research Institute (TGen), an affiliate of City of Hope.

The findings are detailed in a study published in the scientific journal PLOS ONE.

The study results suggest that increased expression of AEBP1 correlates with the severity of liver fibrosis in patients with NASH (nonalcoholic steatohepatitis), which is a type of NAFLD (nonalcoholic fatty liver disease), the most common cause of liver damage. NASH indicates there is both inflammation and liver cell damage, along with fat in the liver.

"Given the strong link between fibrosis and risk of liver-related mortality, efforts to identify and characterize the specific mechanisms contributing to NAFLD progression are critical for the development of effective therapeutic and preventative strategies," said Dr. Johanna DiStefano, head of the Diabetes and Fibrotic Disease Unit at TGen.

One of the study's major findings is that AEBP1 regulates the expression of a network of at least nine genes related to fibrosis: AKR1B10, CCDC80, DPT, EFEMP1, ITGBL1, LAMC3, MOXD1, SPP1, and STMN2.

"These findings indicate that AEBP1 may be a central regulator of a complex fibrosis gene expression network in the human liver," said Dr. DiStefano, the study's senior author.

The study suggests that:

AEBP1 contributes to liver fibrosis by modulating a gene network specific to stellate cells, the key fibrogenic cells of the liver.

AEBP1 expression is increased by obesity-related factors linked to NAFLD, including the sugars glucose and fructose, and palmitate, a fatty acid commonly found in processed foods.

Two microRNAs -- miR-372-3p and miR-373-3p -- which otherwise limit the expression of AEBP1, are reduced in NASH patients with advanced fibrosis. MicroRNAs constitute a recently discovered class of non-coding RNAs that play key roles in the regulation of gene expression.

"Our work shows that AEBP1 parallels the onset and severity of fibrosis in NASH patients, and suggest that AEBP1 may represent a specific therapeutic target to prevent the development of NASH fibrosis," said Dr. Glenn S. Gerhard, Chair of the Department of Medical Genetics and Molecular Biochemistry at the Lewis Katz School of Medicine at Temple University. Dr. Gerhard is the study's lead author.

Credit: 
The Translational Genomics Research Institute

Canada's high school curricula not giving students full picture of climate change

Canada's high school students may not be getting enough information on the negative impacts of climate change, scientific consensus behind human-caused warming or climate solutions, according to new research from the University of British Columbia and Lund University.

In a study published in PLOS ONE, researchers analyzed textbooks and curricula from Canada's 13 provinces and territories and interviewed curriculum designers. They concluded that while the material did a good job of explaining that climate change is caused by humans, it missed opportunities to educate students on its impacts and on solutions.

In addition, curricula from Manitoba, Newfoundland and Labrador, and Prince Edward Island presented human-caused climate change as being a subject of debate among experts when, in fact, there is overwhelming scientific consensus that humans are driving climate change.

"A focus on inaccurate scientific controversy is problematic," said lead author Seth Wynes, PhD candidate at UBC in the department of geography. "If you ask students to debate whether or not climate change is happening, or if it's caused by humans, it gives them the idea that there's disagreement on facts established with great scientific certainty."

The researchers rated high school science curricula documents across Canada on six core areas: basic knowledge of the physical climate system ("it's climate"); observations of rising temperatures ("it's warming"); warming is caused by human activities ("it's us"); scientific consensus ("experts agree"); negative consequences associated with warming ("it's bad"); and the possibility of avoiding the worst effects of climate change through rapidly reducing greenhouse gas emissions ("we can fix it").

"Canadian curricula often focus on how the greenhouse effect works, and demonstrating that Earth is warming--but only Saskatchewan covers that there's such a strong scientific consensus, and only five provinces focus on the solutions that can fix climate change. This gap in knowledge might be one of the reasons past surveys have shown that 52 per cent of young adults in Canada are only "somewhat" or "not at all" concerned about climate change," said Wynes.

Given that burning fossil fuels is the major driver of climate change, the researchers expected that provinces with a larger fossil-fuel industry might have less climate change coverage in their science curriculum. However, they found no such correlation. In fact, Saskatchewan, which has the highest per capita greenhouse gas emissions, had the most comprehensive coverage of climate change in its high school curricula, followed by Ontario.

Nova Scotia and New Brunswick, which had the oldest curriculum documents, had the least comprehensive coverage of climate change, covering only the single topic "it's warming" in mandatory courses, while B.C. covered only half of the core topics, although it is rolling out major updates to its science curriculum over the 2019/2020 school year.

The researchers believe that climate change education in Canada would benefit from reform to accurately reflect scientific understanding and to support environmental citizenship in the next generation of Canadians.

"Curriculum documents take a lot of time and effort to produce, but the earth itself is changing fast," said Wynes, a former high school chemistry teacher. "Educators should be supported in keeping up to date with the latest scientific understandings of climate change."

"Scientifically accurate and comprehensive climate change education is absolutely essential for the 21st century," concluded study author Kimberly Nicholas, associate professor of sustainability science at Lund University in Sweden. "The science is clear on the urgent need to rapidly reduce emissions to zero. Every country needs students prepared to contribute to meeting the climate challenge."

Credit: 
University of British Columbia

China's plans to solve the mysteries of the moon

image: An overview of what the China Lunar Exploration Program (CLEP) has accomplished since its launch in 2007, and its plans for the coming decades, as laid out in an article published on July 18 in Science.

Image: 
NAOC/CNSA

Fifty years ago, on July 20, 1969, the world watched as Neil Armstrong walked on the Moon. Since then, space agencies around the globe have sent rovers to Mars and probes to the outer reaches of the solar system and beyond, yet humanity's curiosity about and fascination with the Moon have never abated.

China, in collaboration with several countries, is now at the forefront of lunar exploration. In an article published on July 18 in Science, researchers laid out what the China Lunar Exploration Program (CLEP) has accomplished since its launch in 2007 and its plans for the next three decades.

"Fifty years after Neil Armstrong took, 'one small step for man, one giant leap for mankind' as the first human to set foot on the Moon, China's CE-4 lander and Yutu 2 rover left the footprints of humanity's first robotic visit to the surface of the far side of the Moon," said LI Chunlai, article author and the Deputy Director-General of National Astronomical Observatories of the Chinese Academies of Science (NAOC).

The exploration of the far side of the Moon led to the unexpected discovery of possible lunar mantle material on the surface - a potential indicator of the severity of asteroid impacts in the early days of the Moon. The Chinese missions also led to the highest resolution global image and topographic data of the Moon to date.

"CLEP has brought Chinese lunar science to a great stage of development," LI said, noting the program has pushed technology forward with regard to lunar remote sensing, lunar geomorphology and lunar geology.

CLEP's next mission is set to launch in early 2020. Dubbed Chang'E 5 for the Chinese moon goddess, the goal of this mission is to collect lunar rock and soil that will be sent to Earth in a sample-return vehicle. It'll be the first sample-return mission of any country since 1976. This technological advancement - bringing samples to Earth - signals the third phase of CLEP.

LI and his team hope these developments will eventually translate to great strides in scientific application through a Lunar Scientific Research Station. The plan is to have the station in place by 2030 to carry out technical verification and scientific validation of various experiments, with the ultimate goal of hosting astronauts for long-term stays on the Moon.

First, though, there's work to be done. CLEP's planned lunar exploration and scientific studies would be significantly limited by current technology, according to LI. While China has made remarkable progress through CLEP, international collaboration is critical for the next phase of lunar exploration.

"The Moon belongs to all of us. Just as the Apollo program played a positive role in promoting the development of human society, China will work with countries around the world in its forward-looking lunar and deep space exploration projects," LI said. "We hope to cooperate with other countries in the exploration, research and utilization of the Moon to jointly create a better future for humanity through achievements in space science and technology."

Credit: 
Chinese Academy of Sciences Headquarters

Cleaning our water with groundbreaking 'bioinspired' chemistry

image: The evolved design protocol that delivered the record-holding peroxidase-mimicking 'NewTAML' catalyst. The protocol strives to integrate well-balanced and positive technical, cost, health, environmental and fairness performances into the value propositions of NewTAML/peroxide processes, which can make major contributions to sustainability by advancing safe water purification.

Image: 
Carnegie Mellon University

The 20th and 21st centuries have seen an explosion in the use of synthetic chemicals worldwide, including pesticides, medications and household cleaners -- many of which end up in our waterways. Even in small amounts these substances can affect wildlife, plants and humans, and a number of them have shown resistance to normal water treatment methods, leaving them to build up in the environment unchecked.

In a study published in ACS Catalysis, researchers at Carnegie Mellon University's Institute for Green Science (IGS) blazed the trail for a new field of sustainable chemistry by unveiling powerful, safe and inexpensive oxidation catalysts inspired by the biological processes within us that break down even the most stubborn micropollutants.

"It's maybe the most important paper that we've produced in 20 years," said Teresa Heinz Professor in Green Chemistry Terrence J. Collins, who directs the IGS.

Collins, who has been concerned with the harmful biological effects of synthetic chemicals since his days as an undergraduate student in New Zealand, has spent the last four decades working to develop methods to remove these chemicals from water using the process of oxidation, a process familiar to the human body.

"Oxidation chemistry is some considerable percentage of the biochemistry going on within us," Collins noted. "This is how nature deals with the problem of converting organic matter, particularly very chemically resistant organic matter, into usable material for its biochemistry or into energy to keep the organism going. Sometimes the resistance is too great for the enzymes that drive the oxidation chemistry and we have persistent compounds against which nature is powerless."

The substrate of choice for many oxidation reactions within our bodies and elsewhere in nature is hydrogen peroxide, which peroxidase enzymes activate to break down molecules from food and other substances we take in. Collins' goal since 1980 essentially has been to recreate the power and efficiency of these enzymes with artificial catalysts of his creation called tetra-amido macrocyclic ligands (TAMLs).

"We had to make the iron center of our catalysts do the same kind of chemistry as the iron center of the peroxidase enzymes," Collins said. "We spent 15 years systematically figuring out how to make the TAML catalyst composition perform properly. Then having got the first one -- we spent 20 years trying to make it better."

In this new study, Collins describes the "record-setting" performances of these improved catalysts, called NewTAMLs. Testing has shown that infinitesimal amounts of these catalysts activate hydrogen peroxide to eliminate the pharmaceutical and common persistent micropollutant propranolol from water in less than five minutes.

Because of their speed and efficiency, Collins envisions NewTAMLs having major cost savings over current water treatment techniques, such as ozone purification. Even more important to him than cost and power, however, is safety. A catalyst that eliminated micropollutants would be pointless if the catalyst itself ended up causing harmful effects in living organisms.

"It's trivial to find out if something's acutely toxic -- it's when something is surreptitiously toxic in parts per trillion in your body that you have a big problem," Collins explained. "Endocrine hormones in your body work in parts per trillion to low parts per billion concentrations. They control how much of life develops and what we become. The current host of everyday, everywhere chemicals that we have discovered are endocrine disruptors reads like a science fiction horror story -- but it is reality."

To test the catalysts' safety, Collins worked with leading experts in endocrine disruption science to identify appropriate assays and arrange them logically to screen for the low-dose adverse effects of chemicals. TAMLs and NewTAMLs have since been used to beta-test the resulting Tiered Protocol for Endocrine Disruption. The NewTAML paper incorporates an endocrine disruption assay in mice, which the candidate catalyst passed with flying colors.

Furthermore, Collins and his team rejected potential catalyst elements that could have greatly improved the performance of TAMLs, because those elements are rarely present in living organisms. Adding fluorine to TAMLs, for example, greatly improved their performance and stability, but fluorine is rarely found in living things, and the researchers worried that building it into catalysts deployed in drinking water could increase fluoride and fluorochemical degradation products in the treated water. "There were no negative toxicity results when we tested," said Collins. "The decision to try to reproduce fluorine's singular and remarkable electronic properties without using it turned out to be a key reason we were rewarded with NewTAMLs."

"The number one way to stay safe for toxicity is to build your chemical technologies from the same elements that you're made of," Collins said.

This "bioinspired" approach to chemistry is a pillar of the new field of sustainable ultradilute oxidation catalysis that Collins and the Institute for Green Science are pioneering.

Credit: 
Carnegie Mellon University

Toward molecular computers: First measurement of single-molecule heat transfer

image: The illustration shows the heat flow through a single molecule -- a chain of carbon atoms bridging the room-temperature electrode and the pointed, atomic-scale tip of the heated electrode.

Image: 
Longji Cui, Nanomechanics and Nanoscale Transport Labs, Michigan Engineering

ANN ARBOR--Heat transfer through a single molecule has been measured for the first time by an international team of researchers led by the University of Michigan.

This could be a step toward molecular computing--building circuits up from molecules rather than carving them out of silicon as a way to max out Moore's Law and make the most powerful conventional computers possible.

Moore's Law began as an observation that the number of transistors in an integrated circuit doubles every two years, doubling the density of processing power. Molecular computing is widely believed to be Moore's Law's end game, but many obstacles stand in the way, one of which is heat transfer.

"Heat is a problem in molecular computing because the electronic components are essentially strings of atoms bridging two electrodes. As the molecule gets hot, the atoms vibrate very rapidly, and the string can break," said Edgar Meyhofer, U-M professor of mechanical engineering.

Until now, the transfer of heat along these molecules couldn't be measured, let alone controlled. But Meyhofer and Pramod Reddy, also a professor of mechanical engineering at U-M, have led the first experiment observing the rate at which heat flows through a molecular chain. Their team included researchers from Japan, Germany and South Korea.

"While electronic aspects of molecular computing have been studied for the past 15 or 20 years, heat flows have been impossible to study experimentally," Reddy said. "The faster heat can dissipate from molecular junctions, the more reliable future molecular computing devices could be."

Meyhofer and Reddy have been building the capability to do this experiment for nearly a decade. They developed a heat-measuring device, or calorimeter, that is almost totally isolated from the rest of the room, giving it excellent thermal sensitivity. They heated the calorimeter to about 20 to 40 degrees Celsius above room temperature.

The calorimeter was equipped with a gold electrode with a nanometer-sized tip, roughly a thousandth the thickness of a human hair. The U-M group and a team from Kookmin University, visiting Ann Arbor from Seoul, South Korea, prepared a room temperature gold electrode with a coating of molecules (chains of carbon atoms).

They brought the two electrodes together until they just touched, which enabled some chains of carbon atoms to attach to the calorimeter's electrode. With the electrodes in contact, heat flowed freely from the calorimeter, as did an electrical current. The researchers then slowly drew the electrodes apart, so that only the chains of carbon atoms connected them.

Over the course of the separation, these chains continued to break or drop away, one after the other. The team used the amount of electrical current flowing across the electrodes to deduce how many molecules remained. Collaborators at the University of Konstanz in Germany and the Okinawa Institute of Science and Technology Graduate University in Japan had calculated the current expected when just one molecule remained--as well as the expected heat transfer across that molecule.

When a single molecule remained between the electrodes, the team held the electrodes at that separation until it broke away on its own. This caused a sudden, minuscule rise in the temperature of the calorimeter, and from that temperature increase, the team figured out how much heat had been flowing through the single-molecule carbon chain.

They conducted heat flow experiments with carbon chains between two and 10 atoms long, but the length of the chain did not seem to affect the rate at which heat moved through it. The heat transfer rate was about 20 picowatts (20 trillionths of a watt) per degree Celsius of difference between the calorimeter and the electrode held at room temperature.

"In the macroscopic world, for a material like copper or wood, the thermal conductance falls as the length of the material increases. The electrical conductance of metals also follows a similar rule," said Longji Cui, first author and a 2018 U-M Ph.D. graduate, currently a postdoctoral researcher in physics at Rice University.

"However, things are very different at the nanoscale," Cui said. "One extreme case is molecular junctions, in which quantum effects dominate their transport properties. We found that the electrical conductance falls exponentially as the length increases, whereas the thermal conductance is more or less the same."
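The contrast Cui describes can be sketched numerically. The snippet below is an illustrative model only: the ~20 pW/K thermal conductance comes from the article, but the exponential decay constant for electrical conductance is a placeholder assumption, not a value measured in the study.

```python
import math

G_THERMAL = 20e-12   # thermal conductance, ~20 pW per kelvin (reported in the article)
BETA = 1.0           # assumed electrical decay constant per carbon atom (placeholder)

def heat_flow(delta_t_kelvin):
    """Heat flow (watts) through one molecule for a given temperature difference."""
    return G_THERMAL * delta_t_kelvin

def relative_electrical_conductance(n_atoms):
    """Electrical conductance falls off exponentially with chain length."""
    return math.exp(-BETA * n_atoms)

# Thermal conductance is roughly length-independent: a 30 K difference
# drives the same ~600 pW through a 2-atom or a 10-atom chain, while the
# relative electrical conductance collapses by orders of magnitude.
for n in (2, 6, 10):
    q = heat_flow(30)  # identical for every chain length
    g_rel = relative_electrical_conductance(n)
    print(f"{n:2d} atoms: heat ~= {q * 1e12:.0f} pW, "
          f"electrical (relative) ~= {g_rel:.2e}")
```

The point of the sketch is the asymmetry itself: in these junctions, length barely matters for heat but matters exponentially for charge.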

Theoretical predictions suggest that heat's ease of movement at the nanoscale holds up even as the molecular chains get much longer, 100 nanometers in length or more--roughly 100 times the length of the 10-atom chain tested in this study. The team is now exploring how to investigate whether that is true.

Credit: 
University of Michigan

Jurassic fossil shows how early mammals could swallow like their modern descendants

video: Tiny hyoid bones are preserved below the skull in the Jurassic mammaliaform Microdocodon. Comparison of Microdocodon to distant cynodont relatives and modern mammals reveals new information about hyoids and middle ears of Mesozoic mammals. The discovery of modern mammal-like hyoids of a mammaliaform sheds light on the first appearance of the swallowing function in mammal evolution.

Image: 
April I. Neander

The 165-million-year-old fossil of Microdocodon gracilis, a tiny, shrew-like animal, shows the earliest example of modern hyoid bones in mammal evolution.

The hyoid bones link the back of the mouth, or pharynx, to the openings of the esophagus and the larynx. The hyoids of modern mammals, including humans, are arranged in a "U" shape, similar to the saddle seat of a children's swing, suspended by jointed segments from the skull. This apparatus helps us transport and swallow chewed food and liquid, a crucial function on which our livelihood depends.

Mammals as a whole are far more sophisticated than other living vertebrates in chewing up food and swallowing it one small lump at a time, instead of gulping down huge bites or whole prey like an alligator.

"Mammals have become so diverse today through the evolution of diverse ways to chew their food, whether it is insects, worms, meat, or plants. But no matter how differently mammals can chew, they all have to swallow in the same way," said Zhe-Xi Luo, PhD, a professor of organismal biology and anatomy at the University of Chicago and the senior author of a new study of the fossil, published this week in Science.

"Essentially, the specialized way for mammals to chew and then swallow is all made possible by the agile hyoid bones at the back of the throat," Luo said.

'A pristine, beautiful fossil'

This modern hyoid apparatus is mobile and allows the throat muscles to control the intricate movements needed to transport and swallow chewed food or drink fluids. Other vertebrates also have hyoid bones, but their hyoids are simple and rod-like, without mobile joints between segments. They can only swallow food whole or in large chunks.

When and how this unique hyoid structure first appeared in mammals, however, has long been in question among paleontologists. In 2014, Chang-Fu Zhou, PhD, from the Paleontological Museum of Liaoning in China, the lead author of the new study, found a new fossil of Microdocodon preserved with delicate hyoid bones in the famous Jurassic Daohugou site of northeastern China. Soon afterwards, Luo and Thomas Martin from the University of Bonn, Germany, met up with Zhou in China to study the fossil.

"It is a pristine, beautiful fossil. I was amazed by the exquisite preservation of this tiny fossil at first sight. We got a sense that it was unusual, but we were puzzled about what was unusual about it," Luo said. "After taking detailed photographs and examining the fossil under a microscope, it dawned on us that this Jurassic animal has tiny hyoid bones much like those of modern mammals."

This new insight gave Luo and his colleagues added context on how to study the new fossil. Microdocodon is a docodont, from an extinct lineage of near relatives of mammals from the Mesozoic Era called mammaliaforms. Previously, paleontologists anticipated that hyoids like this had to be there in all of these early mammals, but it was difficult to identify the delicate bones. After finding them in Microdocodon, Luo and his collaborators have since found similar fossilized hyoid structures in other Mesozoic mammals.

"Now we are able for the first time to address how the crucial function for swallowing evolved among early mammals from the fossil record," Luo said. "The tiny hyoids of Microdocodon are a big milestone for interpreting the evolution of mammalian feeding function."

New insights on mammal evolution as a whole

Luo also worked with postdoctoral scholar Bhart-Anjan Bhullar, PhD, now on the faculty at Yale University, and April Neander, a scientific artist and expert on CT visualization of fossils at UChicago, to study casts of Microdocodon and reconstruct how it lived.

The jaw and middle ear of modern mammals develop from (or around) the first pharyngeal arch, a structure in the vertebrate embryo that gives rise to other recognizable bones and tissues. The hyoids, meanwhile, develop separately from the second and third pharyngeal arches. Microdocodon has a primitive middle ear still attached to the jaw, as in cynodonts and other forerunners of mammals, unlike the ear of modern mammals. Yet its hyoids are already like those of modern mammals.

"Hyoids and ear bones are all derivatives of the primordial vertebrate mouth and gill skeleton, with which our earliest fishlike ancestors fed and respired," Bhullar said. "The jointed, mobile hyoid of Microdocodon coexists with an archaic middle ear -- still attached to the lower jaw. Therefore, the building of the modern mammal entailed serial repurposing of a truly ancient system."

The tiny, shrew-like creature likely weighed only 5 to 9 grams, with a slender body, and an exceptionally long tail. The dimensions of its limb bones match up with those of modern tree-dwellers.

"Its limb bones are as thin as matchsticks, and yet this tiny Mesozoic mammal still lived an active life in trees," Neander said.

The fossil beds that yielded Microdocodon are dated 164 to 166 million years old. Microdocodon co-existed with other docodonts like the semiaquatic Castorocauda, the subterranean Docofossor, the tree-dwelling Agilodocodon, as well as some mammaliaform gliders.

Credit: 
University of Chicago Medical Center

How sex affects gene expression in mammals

Researchers report the discovery of genome-wide variations in gene expression between mammalian females and males, offering new insights into the molecular origins and evolution of sexual dimorphism in mammal species, according to a new study. The findings could help explain the wide range of sex-specific differences in human health and disease. Female and male mammals often exhibit a variety of differences in biological processes and phenotypic traits. For example, in most mammal species males are larger than females, and because sex differences appear common across many species, animal models are often used to investigate sex-biased traits and diseases in humans. However, the effects of sex on gene expression, particularly in autosomal genes, are not well known. To investigate how sex affects the genome, Sahin Naqvi and colleagues performed a genome-wide, multi-tissue, comparative survey of sex-biased gene expression across five mammalian species. Naqvi et al. collected RNA sequencing data from male and female macaques, mice, rats and dogs for 12 different tissues, representing each germ layer as well as most major organ systems. The non-human data were compared with corresponding human RNA-seq data from the Genotype-Tissue Expression (GTEx) project, which catalogs gene expression across all major tissues in the human body. The comparative analyses revealed hundreds of genes with conserved sex-biased expression in each tissue, which contribute to differences in traits between the sexes. For example, nearly 12% of the sex difference observed in average human height can be accounted for by conserved sex biases in gene expression. The results also show, however, that most sex bias in gene expression is evolutionarily recent and thus not shared across all mammalian lineages - findings that warrant careful attention in the use of non-human models of sex differences.

Credit: 
American Association for the Advancement of Science (AAAS)

Sperm may offer the uterus a 'secret handshake'

image: Before they reach the egg, sperm have to survive the female immune system. An interaction between glycans (branched structures) on the surface of sperm and receptors on cells of the endometrium may act as a 'secret handshake,' helping sperm survive -- or it may help the immune system target faulty sperm.

Image: 
(c) American Society for Biochemistry and Molecular Biology

Why does it take 200 million sperm to fertilize a single egg?

One reason is that sperm, when they arrive in the uterus, face a bombardment by the immune system. Perhaps, says molecular anthropologist Pascal Gagneux, many are needed so that some will survive. On the other hand, there may be a benefit to culling so many sperm.

"I'm a lonely zoologist in a medical school," Gagneux said. "My elevator spiel is that all of life is one big compromise. [For an egg], being too easy to fertilize is bad; being too difficult to fertilize is also bad."

Gagneux's lab at the University of California, San Diego, has discovered the makings of a "secret handshake" between sperm and the cells lining the uterus. Uterine cells, they report in the Journal of Biological Chemistry, express a receptor that recognizes a glycan molecule on the surface of sperm cells. It's possible that this interaction may adjust the female's immune response and help sperm make it through the leukocytic reaction.

The leukocytic reaction is not well understood. What we do know, Gagneux explained, is that "after crossing the cervix, millions of sperm -- a U.S. population worth of sperm -- that arrive in the uterus are faced by a barrage of macrophages and neutrophils."

This attack by the innate immune system kills a majority of the sperm cells in semen, winnowing hundreds of millions of sperm down to just a few hundred that enter the fallopian tubes. The defensive response may be beneficial in preventing polyspermy, when an egg is fertilized by more than one sperm and cannot develop.

Because sperm are coated in sialic acid-rich glycans, and the innate immune system uses sialic acid to differentiate human cells from invaders, Gagneux and his lab initially expected that the glycan might be involved in interactions with innate immune cells called neutrophils. But neutrophils they tested did not seem to see much difference between sperm with and without sialic acid.

Meanwhile, the team observed sialic acid-binding receptors called siglecs on endometrial cells. In solution, these endometrial receptors can bind to whole sperm. According to Gagneux, the binding interaction may help sperm run this gauntlet--for example, by dampening the immune response. Alternatively, it may be a way for uterine cells to weed out faulty sperm. In the immune system, this receptor class helps cells recognize sialic acid molecules as "self," and in that context the receptors can dial inflammation either up or down.

"It's somewhat embarrassing how little we can say about what this [interaction] means," he said. The first step in understanding its physiological significance will be to look for direct interaction between sperm and intact uterine tissue--this paper looked at only sperm interacting with purified proteins.

In some ways, Gagneux added, it's humbling to be working in such a poorly understood area. Reproduction, he said, "is a very, very delicate tug-of-war at many levels. The fact that there is (also) this immune game going on is completely fascinating."

Credit: 
American Society for Biochemistry and Molecular Biology

Stimulating life-like perceptual experiences in brains of mice

Using a new and improved optogenetic technique, researchers report the ability to control - and even create - novel visual experiences in the brains of living mice, even in the absence of natural sensory input, according to a new study. The results not only broaden our understanding of how the perceptions of the outside world are initiated and manifested in living mammalian brains but could also aid in the development of neurotherapeutics for those who suffer neuropsychiatric symptoms like hallucinations or delusions. Perceptual experiences of the surrounding environment likely stem from sensory-driven neuronal activity patterns in the mammalian neocortex. However, the relationships between this activity and its influence on perception and behavior remain unclear. While there has been significant scientific interest in the ability to study and perhaps influence perception and behavior through optogenetics, technological limitations have made progress towards these goals difficult, according to the authors. To overcome these challenges, James Marshel and colleagues developed a new optogenetic technique capable of individual-cell observation and control of hundreds of neurons across a mouse neocortex. Through genome mining of more than 600 microbial genomes, Marshel et al. identified a new channelrhodopsin (ChRmine) with exceptional optogenetic properties. Combined with an improved holographic photostimulation technique, this allowed the researchers to deeply probe - and even elicit - activity within the visual cortex of living mice. According to the results, optogenetic stimulation of specific neuron ensembles previously activated by natural perception of visual stimuli recreated the original activity, suggesting the ability to successfully elicit perception and guide behavior in mice. What's more, the new optogenetic method's ability to examine activity spanning large cortex volumes revealed new insights into the dynamics of neuronal activity between cortical layers and the neurobiology that underpins mammalian behavior.

Credit: 
American Association for the Advancement of Science (AAAS)

Stimulating neurons to induce particular perceptions in mice's minds

Hallucinations are spooky and, fortunately, fairly rare. But, a new study suggests, the real question isn't so much why some people occasionally experience them. It's why all of us aren't hallucinating all the time.

In the study, Stanford University School of Medicine neuroscientists stimulated nerve cells in the visual cortex of mice to induce an illusory image in the animals' minds. The scientists needed to stimulate a surprisingly small number of nerve cells, or neurons, in order to generate the perception, which caused the mice to behave in a particular way.

"Back in 2012, we had described the ability to control the activity of individually selected neurons in an awake, alert animal," said Karl Deisseroth, MD, PhD, professor of bioengineering and of psychiatry and behavioral sciences. "Now, for the first time, we've been able to advance this capability to control multiple individually specified cells at once, and make an animal perceive something specific that in fact is not really there -- and behave accordingly."

The study, to be published online July 18 in Science, holds implications for obtaining a better understanding of natural information processing in the brain, as well as psychiatric disorders such as schizophrenia, and points to the possibility of designing neural prosthetic devices with single-cell resolution.

Deisseroth is the study's senior author. Lead authorship is shared by staff scientists James Marshel, PhD, and Sean Quirin, PhD; graduate student Yoon Seok Kim; and postdoctoral scholar Timothy Machado, PhD.

Using optogenetics

Deisseroth, who is a Howard Hughes Medical Institute investigator and holds the D. H. Chen Professorship, pioneered optogenetics, a technology enabling researchers to stimulate particular neurons in freely moving animals with pulses of light, and to observe the resulting effects on the animals' brain function and behavior.

In the new study, Deisseroth and his colleagues inserted a combination of two genes into large numbers of neurons in the visual cortex of lab mice. One gene encoded a light-sensitive protein that caused the neuron to fire in response to a pulse of laser light of a narrowly defined color -- in this case, in the infrared spectrum. The other gene encoded a fluorescent protein that glowed green whenever the neuron was active.

The scientists created cranial windows in the mice by removing a portion of the animals' skulls to expose part of the visual cortex, which in both mice and humans is responsible for processing information relayed from the retina. The investigators protected this exposed area with a clear glass covering. They could then use a device they developed for the purpose of the study to project holograms -- three-dimensional configurations of targeted photons -- onto, and into, the visual cortex. These photons would land at precise spots along specific neurons. The researchers could monitor the resulting activity of nearly all individual neurons in two distinct layers of the cerebral cortex spanning about 1 square millimeter and containing on the order of several thousand neurons.

With their heads fixed in a comfortable position, the mice were shown random series of horizontal and vertical bars displayed on a screen. The researchers observed and recorded which neurons in the exposed visual cortex were preferentially activated by one or the other orientation. From these results, the scientists were able to identify dispersed populations of individual neurons that were "tuned" to either horizontal or vertical visual displays.

They were then able to "play back" these recordings in the form of holograms that produced spots of infrared light on just neurons that were responsive to horizontal, or to vertical, bars. The resulting downstream neuronal activity, even at locations relatively far from the stimulated neurons, was quite similar to that observed when the natural stimulus -- a black horizontal or vertical bar on a white background -- was displayed on the screen.

The scientists trained the mice to lick the end of a nearby tube for water when they saw a vertical bar but not when they saw a horizontal one or saw neither. Over the course of several days, as the animals' ability to discriminate between horizontal and vertical bars improved, the scientists gradually reduced the black-white contrast to make the task progressively harder. They found that the mice's performance perked up if the scientists supplemented the visual displays with simultaneous optogenetic stimulation: For example, if an animal's performance deteriorated as a result of a lowered contrast, the investigators could boost its discrimination powers by stimulating neurons previously identified as preferentially disposed to fire in response to a horizontal or vertical bar.

This boost occurred only when the optogenetic stimulation was consistent with the visual stimulation -- for example, a vertical bar display plus stimulation of neurons previously identified as likely to fire in response to vertically oriented bars.

Hallucinating mice

Once the mice had become adept at discriminating between horizontal and vertical bars, the scientists were able to induce tube-licking behavior in the mice simply by projecting the "vertical" holographic program onto the mice's visual cortex. But the mice wouldn't lick the tube if the "horizontal" program was projected instead.

"Not only is the animal doing the same thing, but the brain is, too," Deisseroth said. "So we know we're either recreating the natural perception or creating something a whole lot like it."

In their early experiments, the scientists had identified numerous neurons as being tuned to either a horizontal or a vertical orientation, but they hadn't yet directly stimulated each of those particular neurons optogenetically. Once the mice were trained, optogenetic stimulation of small numbers of these neurons was enough to get mice to respond with appropriate licking or nonlicking behavior.

The researchers were surprised to find that optogenetically stimulating about 20 neurons -- or fewer in some cases -- selected only for being responsive to the right orientation, could produce the same neuronal activity and animal behavior that displaying the vertical or horizontal bar did.

"It's quite remarkable how few neurons you need to specifically stimulate in an animal to generate a perception," Deisseroth said.

"A mouse brain has millions of neurons; a human brain has many billions," he said. "If just 20 or so can create a perception, then why are we not hallucinating all the time, due to spurious random activity? Our study shows that the mammalian cortex is somehow poised to be responsive to an amazingly low number of cells without causing spurious perceptions in response to noise."

Credit: 
Stanford Medicine

Study reveals unusually high carbon stocks and tree diversity in Panama's Darien forest

image: Researchers found that the amount of carbon a forest stores is mainly affected by the selective extraction of large trees.

Image: 
Sean Mattson/Smithsonian Tropical Research Institute

Forests in Darien, an eastern province of Panama, are crucial for carbon storage, biodiversity conservation and the livelihoods of indigenous groups, yet they are under threat due to illegal logging. Through a participatory forest-carbon monitoring project, scientists from the Smithsonian Tropical Research Institute (STRI), McGill University and the National Research Council of Canada uncovered sources of above-ground biomass (AGB) variation and explored considerations for implementing Reducing Emissions from Deforestation and Forest Degradation (REDD+) in Darien.

"Indigenous authorities were interested in quantifying forest-carbon stocks using field-based measurements to validate the REDD+ potential of their forests and engage in informed discussions with REDD+ proponents in the country," said Javier Mateo-Vega, former research fellow at STRI and main author of the study.

As part of the study, the scientists and a team of trained indigenous technicians analyzed 30 one-hectare plots distributed across a large, mature forest landscape, in undisturbed and disturbed areas. They found that Darien has the highest carbon stocks among nine mature forest sites across the Neotropics, and the second-highest tree species richness among five mature forest sites in the region, supporting the need to protect it in a culturally appropriate way with the region's indigenous peoples.

"I have been working in Darien since 1993 and also perceived these forests as exceptional. It was very exciting when we analyzed the results to see just 'how' exceptional they really are," said Catherine Potvin, research associate at STRI and Canada Research Chair in Climate Change Mitigation and Tropical Forests at McGill University. "Hopefully our results will help give visibility to their global importance for carbon and for biodiversity."

They also discovered that, although half of the plots in the sample had experienced traditional indigenous extractive activities, satellite analyses of vegetation cover did not detect changes in canopy height or the kind of conspicuous damage to the landscape that agriculture or cattle ranching would cause. In the field, however, disturbed plots harbored 54% less biomass than intact forests: their AGB differed vastly from that of undisturbed plots, even though their structure and characteristics did not.

This led researchers to conclude that the main determinant of AGB variation is the level of disturbance in the forest. That is, the amount of organic matter above the ground--in standing trees--and the amount of carbon it stores is mainly affected by the selective extraction of large trees rather than by differences across forest types or any other factor.
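The 54% figure can be translated into absolute terms with simple arithmetic. In the sketch below, the undisturbed biomass value is a made-up placeholder (the article does not give per-hectare tonnages), and the biomass-to-carbon ratio is a commonly used assumption, not a number from the study.

```python
UNDISTURBED_AGB = 100.0   # assumed tonnes of above-ground biomass per hectare (placeholder)
DISTURBANCE_LOSS = 0.54   # disturbed plots held 54% less biomass (reported in the article)
CARBON_FRACTION = 0.47    # assumed biomass-to-carbon conversion ratio

# Biomass remaining in a disturbed plot, and the carbon difference implied.
disturbed_agb = UNDISTURBED_AGB * (1 - DISTURBANCE_LOSS)
carbon_lost = (UNDISTURBED_AGB - disturbed_agb) * CARBON_FRACTION

print(f"Disturbed plot AGB: {disturbed_agb:.1f} t/ha")
print(f"Carbon difference:  {carbon_lost:.1f} t C/ha")
```

Whatever the absolute baseline, selectively removing the largest trees cuts the carbon stock roughly in half while leaving the canopy looking intact from above, which is why the field plots and the satellite analyses disagreed.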

The study also revealed that even though disturbed forests had lost half of their carbon compared with undisturbed ones, they maintained the same tree species richness. In addition, disturbed forests retained a disproportionately high capacity to sequester carbon, suggesting they should not necessarily be excluded from REDD+ investments, given the initiative's interest in targeting areas where climate-change mitigation and biodiversity conservation can be achieved simultaneously.

"Decades of efforts to protect Darien's natural and cultural heritage through different protected areas' management categories and land-tenure regimes for indigenous peoples are being stripped away by rampant illegal logging," Mateo-Vega said. "Our study conclusively demonstrates how important these forests are for climate-change mitigation, biodiversity conservation and the well-being of indigenous peoples."

Credit: 
Smithsonian Tropical Research Institute

Scientists discover group of genes connected to longer life in fruit flies

image: E(z) longer life - New insights on genes linked to longer life.

Image: 
Insilico Medicine

Thursday, July 18 - Alexey Moskalev, Ph.D., Head of the Laboratory of Geroprotective and Radioprotective Technologies, and co-authors from the Institute of Biology of the Komi Science Centre of RAS, the Engelhardt Institute of Molecular Biology of RAS and the Moscow Institute of Physics and Technology published a scientific article titled "Transcriptome Analysis of Long-lived Drosophila melanogaster E(z) Mutants Sheds Light on the Molecular Mechanisms of Longevity" in the multidisciplinary journal Scientific Reports.

Scientists are now closer to understanding how a genetic mutation found in a fruit fly could hold the key to a longer lifespan. Using genome-wide transcriptome analysis, the team noted that lifespan extension and stress resistance in fruit flies -- Drosophila -- carrying the E(z) histone methyltransferase heterozygous mutation, or the E(z) mutation, were correlated with changes in the expression levels of 239 genes. The expression levels of some of the genes were doubled in flies with the E(z) mutation.

According to the results of the study, the mutant flies had a 22 to 23 percent lifespan extension compared to the control group. In addition, these flies were more resistant to hyperthermia, oxidative stress and endoplasmic reticulum stress, which can disrupt processes designed to help cells stay healthy. The mutant flies were also more fertile, the researchers added.
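To make the reported 22 to 23 percent extension concrete, the sketch below converts it into days. The control lifespan used here is a placeholder assumption; the article reports only the relative extension, not absolute lifespans.

```python
CONTROL_LIFESPAN_DAYS = 60.0   # assumed median lifespan of control flies (placeholder)

# Convert the reported relative extensions into absolute days gained.
for extension in (0.22, 0.23):
    mutant = CONTROL_LIFESPAN_DAYS * (1 + extension)
    gained = mutant - CONTROL_LIFESPAN_DAYS
    print(f"{extension:.0%} extension -> {mutant:.1f} days (+{gained:.1f})")
```

Under this assumed baseline, the E(z) mutants would live roughly two weeks longer than controls; with a different baseline the absolute gain scales proportionally.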

The E(z) mutation appears to affect the expression of genes involved in metabolism, including carbohydrate, lipid, drug and nucleotide metabolism. The differentially expressed genes related to aging were involved in pathways tied to the immune response, the cell cycle and ribosome biogenesis.

"The findings of the conducted research may be a step toward investigating whether the E(z) mutation could play a role in human longevity and have implications for understanding the role of global derepression of chromatin in aging," said Dr. Alexey Moskalev, Ph.D., Head of the Laboratory of Geroprotective and Radioprotective technologies.

The Laboratory of Geroprotective and Radioprotective Technologies of the Institute of Biology, Komi Science Centre, UrB RAS, regularly publishes research papers in peer-reviewed journals. Its research aims to elucidate the molecular and genetic mechanisms of lifespan regulation, aging, stress resistance and radioresistance. The team has identified several dozen genes with pro-longevity action. The geroprotective effects of various pharmacological agents, natural compounds and plant extracts on aging-related signaling pathways, lifespan and physiological functions are being actively studied. In collaboration with leading institutions worldwide, members of the laboratory have also created the DrugAge and Geroprotectors.org data libraries. In addition, studies on the influence of factors of different kinds, primarily in small doses and concentrations, on lifespan and on non-linear effects (such as hormesis and adaptive response) are being carried out. All studies are conducted at a high scientific level, using modern methods of data collection and statistical and bioinformatic analysis.

Credit: 
InSilico Medicine