Culture

New research shows that mites and ticks are close relatives

image: There is a phenomenal diversity of mites (as shown by these two examples), and ticks are close relatives.

Image: 
David Walter

Scientists from the University of Bristol and the Natural History Museum in London have reconstructed the evolutionary history of the chelicerates, the mega-diverse group of more than 110,000 arthropod species that includes spiders, scorpions, mites and ticks.

They found, for the first time, genomic evidence that mites and ticks do not constitute two distantly related lineages but are instead part of the same evolutionary line. This makes them the most diverse group of chelicerates, changing our perspective on their biodiversity.

Arthropoda, or jointed-legged animals, make up the majority of animal biodiversity. They pollinate our crops (bees) and destroy them (locusts), are major food sources (shrimps and crabs), and are vectors of serious diseases such as malaria and Lyme disease (mosquitoes and ticks).

Arthropods are ancient, and fossils show that they have been around for more than 500 million years. The secret of their evolutionary success, reflected in their outstanding species diversity, is still unknown. To clarify what makes arthropods so successful, we first need to understand how the different arthropod lineages relate to each other.

Co-author of the study, Professor Davide Pisani, from the University of Bristol's School of Earth Sciences and Biological Sciences, said: "Finding that mites and ticks constitute a single evolutionary lineage is really important for our understanding of how biodiversity is distributed within Chelicerata.

"Spiders, with more than 48,000 described species, have long been considered the most biodiverse chelicerate lineage, but 42,000 mite and 12,000 tick species have been described. So, if mites and ticks are a single evolutionary entity rather than two distantly related ones, they are more diverse than the spiders."

Dr Greg Edgecombe of the Natural History Museum London added: "Because of their anatomical similarities it has long been suspected that mites and ticks form a natural evolutionary group, which has been named Acari. However, not all anatomists agreed, and genomic data never found any support for this idea before."

Lead author, Dr Jesus Lozano Fernandez, from Bristol's School of Biological Sciences, said: "Spiders are iconic terrestrial animals that have always been part of the human imagination and folklore, representing mythological and cultural symbols, as well as often being objects of inner fears or admiration.

"Spiders have long been considered the most biodiverse chelicerate lineage, but our findings show that Acari is, in fact, bigger."

To reach their findings, the researchers used an almost even representation of mites and ticks (10 and 11 species, respectively) - the most complete species-level genomic sampling of these groups to date.

Dr Lozano-Fernandez added: "Regardless of the methods we used, our results converge on the same answer - mites and ticks really do form a natural group. Evolutionary trees like the one we've reconstructed provide us with the background information we need to interpret processes of genomic change.

"Our genealogical tree can now be used as the foundation for studies using comparative genomics to address problems of potential biomedical and agricultural relevance, like the identification of the genomic changes that underpinned the evolution of blood-feeding parasitic ticks from ancestors that weren't blood-feeders."

Credit: 
University of Bristol

Drug-resistant infections: If you can't beat 'em, starve 'em, scientists find

BUFFALO, N.Y. -- How do you fight a fungal infection that is becoming increasingly resistant to medicine? By starving it, found a team of University at Buffalo and Temple University researchers.

To treat Candida albicans, a common yeast that can cause illness in those with weakened immune systems, researchers limited the fungus' access to iron, an element crucial to the organism's survival.

They did so using deferasirox, a medication used to treat blood disorders. Tested in mice, the results were promising: the treatment reduced iron levels in saliva fourfold, altered the expression of more than 100 of the fungus' genes, diminished its ability to infect oral mucosal tissue and halved the organism's survival rate.

"In the absence of novel drug candidates, drug repurposing aimed at using existing drugs to treat diseases is a promising strategy," says Mira Edgerton, DDS, PhD, co-lead investigator of the study and research professor in the Department of Oral Biology at the UB School of Dental Medicine.

Edgerton, along with Sumant Puri, PhD, co-lead investigator and assistant professor in the Kornberg School of Dentistry at Temple University, published the study in March in Antimicrobial Agents and Chemotherapy.

Currently, only three major classes of clinical antifungal drugs exist. However, fungal drug resistance has steadily increased and no new classes of antifungals have emerged in decades, says Edgerton.

Candida albicans, a fungus among the group building resistance, is the agent behind a number of infections. They include oral thrush, a yeast infection in the mouth identified by a white film that coats the tongue and throat, causing painful swallowing; and denture-related stomatitis, a fungal infection that affects nearly two-thirds of U.S. denture wearers, causing inflammation, redness and swelling in the mouth.

The yeast is also the fourth leading cause of hospital-acquired bloodstream infections, which often have high mortality rates, says Edgerton.

Candida albicans is the most abundant fungus in the oral microbiome and relies heavily on saliva as a source for essential elements. Iron, the second most abundant metal in saliva, is a critical nutrient used by the fungus in several cellular processes, including energy production and DNA repair.

In mice, the group added deferasirox to drinking water to lower iron levels in saliva and reduce the availability of iron needed to sustain an infection.

The investigators found that Candida albicans in the mice that received the treatment was less likely to survive attacks by the immune system, surviving at a rate of 12 percent compared with 25 percent in mice that did not receive the treatment.

The therapy also altered the expression of 106 of the fungus' genes, a quarter of which were involved in the regulation of iron metabolism, were directly regulated by iron or had iron-related functions. The study is the first report of iron starvation affecting gene expression of Candida albicans in real time during live infection, says Puri.

Other research has shown that treatment with deferasirox does not cause iron deficiency in adults with normal iron levels, raising the potential for preventive treatment in people who are also vulnerable to mucosal infections, says Puri.

Credit: 
University at Buffalo

Researchers propose new federal rule of evidence for more accurate verdicts in court

While many juries use common sense when deciding whether a defendant is guilty, research has shown that common sense can be misleading and inaccurate. In a new study, researchers propose a new federal rule of evidence that would ensure a jury is educated on theories of false memory in order to produce more just verdicts -- a rule that would be especially helpful when evaluating testimony from children.

Referencing previous cases and research, the researchers found that because some court testimonies rely largely on how a person remembers a scenario, an assessment based on common sense is not a sound way for a jury to reach a verdict; memory can be incorrect, as can our instincts for judging credibility. The study relates six research-grounded principles of false memory to legal cases, demonstrating the fallibility of testimony based on memory and of verdicts based on common sense.

According to the researchers, juries would reach more just verdicts if they weighed testimony against these six principles rather than against common sense. And because children are so often disbelieved on account of their age, jurors would be taught the real role age plays in memory.

Credit: 
SAGE

Soil communities threatened by destruction, instability of Amazon forests

image: Invertebrates such as earthworms, ants and termites were more vulnerable to the replacement of forests by pastures than by crops, while microbes showed the opposite pattern.

Image: 
Andre L.C. Franco/ Colorado State University

The clearing and subsequent instability of Amazonian forests are among the greatest threats to tropical biodiversity conservation today.

Although the devastating consequences of deforestation to plants and animal species living above the ground are well-documented, scientists and others need to better understand how soil communities respond to this deforestation to create interventions that protect biodiversity and the ecosystem. But that information has been lacking.

A team of researchers led by Colorado State University's André Franco, a research scientist in the Department of Biology, conducted a meta-analysis of nearly 300 studies of soil biodiversity in Amazonian forests and sites in various stages of deforestation and land-use.

The new study, "Amazonian deforestation and soil biodiversity," is published in the June issue of Conservation Biology and is co-authored by CSU Distinguished Professor Diana Wall, Bruno Sobral, professor in the Department of Microbiology, Immunology and Pathology at CSU, and Artur Silva, professor at the Universidade Federal do Pará in Belém, Brazil.

Overall, the researchers found that the abundance, biomass, richness and diversity of soil fauna and microbes were all reduced following deforestation. Soil fauna or animals that were studied include earthworms, millipedes, dung beetles, nematodes, mites, spiders and scorpions.

Franco, who hails from Brazil, said that this is the first time that all of the available scientific data related to soil biodiversity in Amazonian forests has been synthesized.

The research team also found that the way the land is used after the forest is cleared matters to soil biodiversity. Species of invertebrates such as earthworms, ants and termites -- which are described as soil engineers -- were more vulnerable to the replacement of forests by pastures than by crops, while microbes showed the opposite pattern.

Franco said the highest biodiversity losses were found on the side of the Amazon with the highest mean annual precipitation and in areas where the soil was very acidic.

"That means these areas should be higher priorities for conservation efforts," he said.

Scientists also uncovered gaps in existing research.

"Very few studies looked at the impact of disturbances like wildfires and selective logging on these forests," Franco said. "Yet logging is an official management strategy in the Amazon forest."

In addition, the team found a lack of data from seven of the nine countries and territories that the Amazon biome spans: Bolivia, Peru, Ecuador, Venezuela, Guyana, Suriname and French Guiana.

Sobral noted that biodiversity is a hot topic and was elevated recently with the release of a report from the United Nations, which found that nature is declining globally at unprecedented rates. But most of the scientific knowledge in the world about biodiversity relates to birds and mammals, he said.

The team is continuing this research in the Amazon, working with farmers' associations and two research institutes in Brazil to collect and analyze soil samples with the goal of studying the consequences of this loss of biodiversity.

Zaid Abdo, a bioinformatics expert and associate professor in the Department of Microbiology, Immunology and Pathology, has joined the CSU-based research team.

Sobral said it's extremely important that the scientists work with local farmers and others who are impacted by the deforestation.

"We're very focused on making sure the research isn't disconnected from local communities' needs and aspirations," he said. "Our work is guided by what the farmers want to know and how scientific knowledge could shape their future sustainable development."

Credit: 
Colorado State University

Don't overdo omega-6 fat consumption during pregnancy

In Western societies, we are eating more omega-6 fats, particularly linoleic acid, which is present in foods such as potato chips and vegetable oil. Other research has shown that linoleic acid can promote inflammation and may be associated with an increased risk of heart disease.

New research in The Journal of Physiology showed that eating a diet with three times the recommended daily intake of linoleic acid might be harmful in pregnancy. The researchers found three changes in rat mothers that ate a high linoleic acid diet: their livers had altered concentrations of inflammatory proteins, their circulating concentrations of a protein that can cause contraction of the uterus during pregnancy were increased, and a hormone that regulates growth and development was decreased. These changes may result in an increased risk of pregnancy complications and poor development of the babies.

If the effects of a high linoleic acid diet are the same in rats and humans, this would suggest that women of child-bearing age should consider reducing the amount of linoleic acid in their diet.

The researchers fed rats for 10 weeks on a high linoleic acid diet, mated them and then investigated the effects of the diet on their pregnancy and developing babies. They specifically investigated any changes in the body and organ weights of the mothers and their babies, and in concentrations of circulating and liver inflammatory proteins, cholesterol, and the hormone leptin. Rats typically give birth to multiple babies in each pregnancy. Rat mothers that ate a high linoleic acid diet had a reduced number of male babies.

It is important to note that when humans eat a diet rich in linoleic acid, the diet also tends to be high in fat, sugar and salt. However, in the study, the only change in the diet was higher linoleic acid, but no changes in fat, sugar or salt.

Deanne Skelly, senior author on the paper, said:

"It is important for pregnant women to consider their diet, and our research is yet another example that potentially consuming too much of a certain type of nutrient can have a negative impact on the growing baby."

Credit: 
The Physiological Society

Biomarkers help tailor diuretic use in acute heart failure patients

Athens, Greece - 25 May 2019: Adrenomedullin activity predicts which acute heart failure patients are at the greatest risk of death without diuretic treatment post-discharge, according to late breaking research presented today at Heart Failure 2019, a scientific congress of the European Society of Cardiology (ESC).

"Therapy at discharge often remains unchanged for several weeks and even months in acute heart failure patients," said first author Dr Nikola Kozhuharov, of the University Hospital Basel, Switzerland. "Our study shows that not re-evaluating the need for diuretics in this critical time period has detrimental consequences for patients."

Acute heart failure is the most common cause of hospitalisation in people over 50 and up to 30% die in the year after discharge. "This is in part due to the challenge of predicting which patients are at the greatest risk of death and the subsequent uncertainty in defining the appropriate intensity of in-hospital and immediate post-discharge management," said Dr Kozhuharov.

The study aimed to find biomarkers that predict risk levels in acute heart failure patients discharged from hospital and who would benefit from heart failure drugs. The drugs were diuretics, angiotensin-converting enzyme inhibitors or angiotensin receptor blockers, beta blockers, and aldosterone antagonists.

For the biomarkers, the study used two components of adrenomedullin, a peptide hormone that is a vasodilator, meaning it dilates (opens) blood vessels. Adrenomedullin was selected after pilot studies suggested it can quantify dysfunction of small blood vessels and the associated mortality risk. In addition, activity of adrenomedullin reflects residual congestion in acute heart failure patients and the researchers hypothesised that this could be used to guide diuretic therapy at discharge.

The two components used to quantify the activity of adrenomedullin were midregional proadrenomedullin (MR-proADM), a stable precursor, and the biologically active form of adrenomedullin (bio-ADM).

The study enrolled 1,886 acute heart failure patients presenting with acute breathlessness to emergency departments of university hospitals in the UK, France, and Switzerland. Plasma concentrations of MR-proADM and bio-ADM were assessed within 12 hours of presentation and at discharge from an acute ward.

A total of 514 patients (27%) died during the 365-day follow-up. Patients with bio-ADM levels above the median had significantly lower survival if they were not receiving diuretics at discharge. A similar result was found for MR-proADM. Both associations remained significant after adjusting for age and plasma creatinine concentration at discharge. Associations with the other drugs were not significant after correction for multiple testing.

Patients with bio-ADM plasma concentrations above the median had an 87% increased risk of death during follow-up compared to those with levels below the median. MR-proADM was even more accurate than bio-ADM for predicting death and the combined risk of death and/or acute heart failure rehospitalisation.
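An 87% increased risk corresponds to a risk ratio of roughly 1.87 between the median-split groups. As an illustration only - the per-group event counts below are hypothetical, since the article reports only the 1,886 patients and 514 deaths overall - here is how such a risk ratio and its 95% confidence interval are computed from a two-by-two table:

```python
import math

# Hypothetical counts for a median split of 1,886 patients (assumed 943 per
# group); the actual group sizes and event counts are not given in the article.
deaths_high, n_high = 335, 943   # bio-ADM above the median
deaths_low, n_low = 179, 943     # bio-ADM below the median

risk_high = deaths_high / n_high
risk_low = deaths_low / n_low
rr = risk_high / risk_low        # risk ratio; 1.87 means an 87% increased risk

# 95% confidence interval via the standard log risk-ratio variance formula.
se_log_rr = math.sqrt(1 / deaths_high - 1 / n_high + 1 / deaths_low - 1 / n_low)
ci_lo = math.exp(math.log(rr) - 1.96 * se_log_rr)
ci_hi = math.exp(math.log(rr) + 1.96 * se_log_rr)

print(f"risk ratio = {rr:.2f} (95% CI {ci_lo:.2f}-{ci_hi:.2f})")
```

Note that the published analysis also adjusted for age and creatinine, which a raw two-by-two table cannot do; a Cox regression would be needed for that.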

Dr Kozhuharov said: "The observation that patients with high bio-ADM have much higher mortality rates if not treated with diuretics at discharge has immediate clinical consequences. Reasons for stopping diuretics during hospitalisation included worsening renal function and low blood pressure. Our study shows that patients should be reassessed for contraindications before discharge so that diuretics can be restarted if appropriate, particularly if they have elevated bio-ADM."

Credit: 
European Society of Cardiology

Daily self-weighing can prevent holiday weight gain

image: Associate professor Jamie Cooper shows graduate student Liana Rodrigues how to take height and weight of a subject for clinical health measures with undergraduate Allison Jones in Cooper's clinical lab in Dawson Hall.

Image: 
Andrew Davis Tucker/UGA

Athens, Ga. - Researchers at the University of Georgia have shown that a simple intervention - daily self-weighing - can help people avoid holiday weight gain.

Participants in a 14-week UGA study who weighed themselves daily on scales that also provided graphical feedback showing their weight fluctuations managed to maintain or lose weight during and after the holiday season, while a control group gained weight.

Researchers speculate that participants' constant exposure to weight fluctuations - along with being able to see a target or goal weight line (their baseline weight) - motivated behavioral change that led to weight maintenance, or in the case of overweight subjects, weight loss.

"Maybe they exercise a little bit more the next day (after seeing a weight increase) or they watch what they're eating more carefully," said study author Jamie Cooper, an associate professor in the department of foods and nutrition within the UGA College of Family and Consumer Sciences. "The subjects self-select how they're going to modify their behavior, which can be effective because we know that interventions are not one-size-fits-all."

After determining their baseline weight prior to the holidays, participants in an intervention group were told to try not to gain weight above that number, but with no additional instructions as to how to accomplish this goal.

Participants in the control group were given no instructions. A total of 111 adults between the ages of 18 and 65 participated in the study.

Michelle vanDellen, an associate professor in the UGA Department of Psychology and second author on the paper, said the findings support discrepancy theories of self-regulation.

"People are really sensitive to discrepancies or differences between their current selves and their standard or goal," she said. "When they see that discrepancy, it tends to lead to behavioral change. Daily self-weighing ends up doing that for people in a really clear way."

Previous research has shown that daily self-weighing can also prevent weight gain in college freshmen, but the researchers wanted to apply the intervention to another time of year historically associated with weight gain.

With the average American reportedly gaining a pound or two a year, overeating during the holiday season has been identified as a likely contributor to small weight gains that add up over time and can lead to obesity.

"Vacations and holidays are probably the two times of year people are most susceptible to weight gain in a very short period of time," Cooper said. "The holidays can actually have a big impact on someone's long-term health."

Cooper said future research may investigate if daily self-weighing alone - without the graphical feedback - is the driving force behind the behavioral changes that led to weight maintenance.

The fact that subjects knew researchers would be accessing their daily weight data also could have contributed to behavioral change, she said.

What seems clear is the intervention is effective, largely because of its simplicity and adaptability.

"It works really well in the context of people's busy lives," vanDellen said. "The idea that people might already have all the resources they need is really appealing."

Credit: 
University of Georgia

Chemical juggling with three particles

image: Dr. Andreas Gansäuer and Anastasia Panfilova during epoxy hydrogenation at the Kekulé Institute of Organic Chemistry and Biochemistry at the University of Bonn.

Image: 
© Photo: Volker Lannert/Uni Bonn

Chemists from the University of Bonn and their US colleagues at Columbia University in New York have discovered a novel mechanism in catalysis. It allows certain alcohols to be synthesized more cheaply and in a more environmentally friendly way than before. The reaction follows a previously unknown pattern in which hydrogen is split into three components in a time-coordinated manner. More than five years passed between the idea and its practical realization. The results are published in the prestigious journal Science.

Alcohols are common chemical compounds which, in addition to carbon and hydrogen, contain at least one OH group. They serve as starting materials for a whole series of chemical syntheses and are often produced directly from olefins by the addition of water (chemical formula: H2O). Olefins are hydrocarbons with a double bond that are obtained from oil. The water molecule serves as a "donor" of the OH group characteristic of alcohols.

This synthesis is simple and efficient, but it has a decisive disadvantage: It can only be used to produce certain alcohols, the so-called "Markovnikov alcohols". The OH group cannot simply be attached to any position of the olefin - one of two positions is excluded. "We have now found a new catalytic method that can produce exactly these 'impossible' alcohols," explains Prof. Dr. Andreas Gansäuer.

Gansäuer works at the Kekulé Institute of Organic Chemistry and Biochemistry at the University of Bonn. The idea for the new synthesis emerged in 2013 in a collaboration with the group of Prof. Dr. Jack Norton of Columbia University in New York. However, it took almost five years until the synthesis of the so-called "anti-Markovnikov alcohol" using the new catalytic system worked well enough to be published.

Acceleration and slowing down by the catalysts' ligands

The fact that the two groups succeeded in making it into the renowned journal Science is due to the unusual reaction mechanism. Epoxides, common and valuable intermediate products of the chemical industry, serve as starting materials. Epoxides can be produced by adding an oxygen atom (chemical symbol: O) to olefins. If they are allowed to react with hydrogen molecules (H2), the oxygen becomes an OH group. Normally, with this approach only Markovnikov alcohols are produced.

"In our reaction, however, we successively transfer the hydrogen in three parts," explains Gansäuer. "First a negatively charged electron, then a neutral hydrogen atom and finally a positively charged hydrogen ion, a proton. We use two catalysts, one of which contains titanium and the other chromium. "This allows us to convert epoxides into anti-Markovnikov alcohols." The timing of the entire process must be strictly coordinated - like in juggling, where each ball has to maintain a specified flight duration. To achieve this, the chemists had to synchronize the speed of three catalytic reactions. To this end, they attached the 'right' ligands, molecules that control the metals' reactivity, to the titanium and chromium atoms.

Until now, anti-Markovnikov alcohols have been produced through a so-called hydroboration followed by an oxidation. However, this reaction is relatively complex and not particularly sustainable. The new mechanism, on the other hand, does not produce any by-products and is thus practically waste-free. "Titanium and chromium are also very common metals, unlike many other noble metals that are often used in catalysis," Gansäuer emphasizes.

In 2013, Norton and Gansäuer submitted their idea to a call for proposals on sustainable catalysis by the International Union of Pure and Applied Chemistry (IUPAC), winning first place. The project was largely financed with the grant money. "But the good cooperation within my institute has certainly also contributed to the success," emphasizes Gansäuer. "For instance, I had access not only to the institute's resources, but also to equipment of the other groups from Bonn."

Credit: 
University of Bonn

Tiny fish live fast, die young

video: Fish on coral reefs manage to thrive in isolated areas where there are very low levels of nutrients for them to use. How? The answer may lie in the tiny fish that live in the gaps in the coral structure.
From Brandl et al: 'Demographic dynamics of the smallest marine vertebrates fuel coral-reef ecosystem functioning.' https://science.sciencemag.org/cgi/doi/10.1126/science.aav3384

Image: 
Tane Sinclair-Taylor

New research has revealed that the short lives and violent deaths of some of coral reefs' smallest tenants may be vital to the health of reef systems, including the iconic Great Barrier Reef.

Dr Simon Brandl, from Simon Fraser University in Canada, led an international team of researchers searching for answers to the longstanding puzzle of 'Darwin's paradox'.

Co-author Prof David Bellwood from the ARC Centre of Excellence for Coral Reef Studies (Coral CoE) at James Cook University (JCU) said: "Charles Darwin wondered how fish on coral reefs manage to thrive in isolated areas where there are very low levels of nutrients for them to use. We thought the answer may lie in the tiny fish that live in the gaps in the coral structure."

"These tiny fish are less than five centimetres long and are known as 'cryptobenthics'. They include gobies, blennies, cardinalfish, and several other families," Prof Bellwood said.

The team surveyed reefs around the globe and examined records of larval abundance. They discovered that cryptobenthics and their larvae make up nearly 60% of all fish flesh eaten on the reef.

"Because of their size and tendency to hide, these little fish are commonly overlooked," Dr Brandl said, "but their unique demographics make them a cornerstone of the ecosystem."

"Their populations are completely renewed seven times a year, with individuals in some species living only a few days before they are eaten. The only way they can sustain this is by a spectacular supply of local larvae," added co-author Renato Morais, a PhD student at JCU.

Prof Bellwood said almost anything capable of eating cryptobenthics does so, including juvenile fish and invertebrates such as mantis shrimps, which then become food for other creatures.

"These factors have made it hard for researchers in the past to realise the importance of cryptobenthics and discover the food supply that the 'crypto-pump' supplies."

He said that the cryptobenthics have finally emerged from the shadows. "Their extraordinary larval dynamics, rapid growth, and extreme mortality underpin their newly discovered role as a critical functional group on coral reefs."

Credit: 
ARC Centre of Excellence for Coral Reef Studies

Game theory highlights power of local reporting in vaccine decisions

image: Individuals are more likely to choose to get vaccinated if they have access to detailed local information on disease prevalence.

Image: 
FEMA/Mark Wolfe

Computational modeling of social networks suggests that vaccination programs are more successful in containing disease when individuals have access to local information about disease prevalence. Anupama Sharma of The Institute of Mathematical Sciences in Chennai, India, and colleagues present these findings in PLOS Computational Biology.

The success of vaccination programs can eventually undercut their effectiveness when individuals choose not to get vaccinated because they believe they are protected by herd immunity. During an epidemic, a person who previously avoided vaccination may perceive higher risk of infection and choose to get vaccinated. Such decision-making is influenced by the person's access to information about disease prevalence at the local or global level.

To explore how disease prevalence information influences the success of a vaccination program, Sharma and colleagues employed a computational modeling approach. Using principles of game theory, they probed the relative importance of information about disease prevalence in an individual's local neighborhood versus disease prevalence in the entire population.

The analysis shows that when individuals rely on global disease prevalence information, for instance obtained from mass media, vaccination is unable to prevent a large section of the population from becoming infected. In contrast, when individuals make vaccination decisions that are appropriate to their immediate circumstances, the final size of an epidemic outbreak may be smaller.
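As an illustration of this class of model - not the authors' actual implementation, and with every parameter value invented for the sketch - a small agent-based simulation can contrast vaccination decisions driven by local versus global prevalence information on a simple contact network:

```python
import random

random.seed(1)

# Toy SIR epidemic on a ring network in which susceptible agents may choose
# to vaccinate based on perceived infection risk. All parameters below are
# illustrative assumptions, not values from the published study.
N = 500            # number of agents
BETA = 0.3         # per-contact infection probability per step
GAMMA = 0.1        # recovery probability per step
SENSITIVITY = 0.5  # how strongly perceived prevalence drives vaccination

def run_epidemic(info="local"):
    """Return the fraction ever infected when agents use 'local' or 'global'
    disease-prevalence information to decide whether to vaccinate."""
    state = ["S"] * N
    for i in random.sample(range(N), 5):   # seed five initial infections
        state[i] = "I"
    for _ in range(200):
        infected = [i for i in range(N) if state[i] == "I"]
        if not infected:
            break
        global_prev = len(infected) / N
        # Vaccination decisions: each susceptible agent weighs the prevalence
        # it can observe (its two ring neighbours, or the whole population).
        for i in range(N):
            if state[i] != "S":
                continue
            if info == "local":
                neigh = [(i - 1) % N, (i + 1) % N]
                prev = sum(state[j] == "I" for j in neigh) / len(neigh)
            else:
                prev = global_prev
            if random.random() < SENSITIVITY * prev:
                state[i] = "V"
        # Transmission along ring edges, then recovery.
        newly_infected = set()
        for i in infected:
            for j in ((i - 1) % N, (i + 1) % N):
                if state[j] == "S" and random.random() < BETA:
                    newly_infected.add(j)
        for i in infected:
            if random.random() < GAMMA:
                state[i] = "R"
        for j in newly_infected:
            state[j] = "I"
    return sum(s in ("I", "R") for s in state) / N

print(f"local info : {run_epidemic('local'):.2f} of population infected")
print(f"global info: {run_epidemic('global'):.2f} of population infected")
```

The study's model is richer than this sketch (realistic network structure, game-theoretic payoffs for vaccinating versus free-riding), but the sketch captures the core comparison: the same epidemic run with the two information regimes.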

"While mass immunization programs are a crucial bulwark against pandemic outbreaks, it is essential to ensure that incidence information is disseminated strategically," Sharma says. "Our findings argue strongly for a transparent system of disseminating detailed incidence data, whereby individuals have real-time access to local information and do not rely only on mass media coverage."

Future research could take into account further complexities, such as social networks that change dynamically over the course of an epidemic, or individuals' varying perceptions of the severity of different diseases, such as Ebola versus influenza. The authors also suggest that their computational framework could be extended to explore the influence of opinions about vaccination shared over online social networks that are very different from the networks over which a disease spreads.

Credit: 
PLOS

Home-schoolers see no added health risks over time

image: Laura Kabiri

Image: 
Jeff Fitlow/Rice University

Years of home-schooling don't appear to influence the general health of children, according to a Rice University study.

A report by Rice kinesiology lecturer Laura Kabiri and colleagues in the Oxford University Press journal Health Promotion International puts forth evidence that the amount of time a student spends in home school is "weakly or not at all related to multiple aspects of youth physical health."

"Although there may be differences in the health of elementary through high school home-schoolers, those differences don't seem to change with additional time spent in home school," Kabiri said. "In other words, staying in home school longer isn't related to increased health benefits or deficits."

Earlier this year, Kabiri and her Rice team reported that home-schooled students who depended on outside activities to maintain physical fitness were often falling short.

The flip side presented in the new report should come as good news to parents and students. The study was conducted by Kabiri and colleagues at Texas Woman's University and the University of Texas Health Science Center (UTHealth) at San Antonio.

The results, drawn from studies of more than 140 children in kindergarten through fifth grade who were tested against statistically normed data for their age and gender, accounted for prior published research showing that home-schooled children have less upper-body and abdominal muscle strength and more abdominal fat than public school students. Other studies showed that home-schooling benefited sleep patterns, overall body composition and diet.

However, to the researchers' surprise, these differences in home-schooler health did not appear to be affected either way by increased time in home school.

"Body composition can relate to sleep as well as diet," Kabiri said. "And as far as muscular health goes, these kids are still active. We're not saying there's not an upfront benefit or detriment to their health, but after an initial gain or loss, there aren't additional gains or losses over time if you're going to home-school your children for one year or their entire careers. The relationship between their health and the time they spend in home school seems to be irrelevant."

Credit: 
Rice University

Unique Iron Age shield gives insight into prehistoric technology

image: Radiocarbon dating has revealed that the shield was made between 395 and 255 BC

Image: 
University of Leicester

A unique bark shield, stiffened with wooden laths and dating to the Iron Age, has provided new insight into the construction and design of prehistoric weaponry.

The only one of its kind ever found in Europe, the shield was found south of Leicester on the Everards Meadows site, in what is believed to have been a livestock watering hole.

Analysis of the shield's construction by Michael Bamforth at the University of York revealed that it had been carefully built with wooden laths to stiffen the structure, a wooden edging rim, and a woven boss to protect the wooden handle.

Although prior evidence has shown that prehistoric people used bark to make bowls and boxes, this is the first time researchers have seen the material used for a weapon of war.

The outside of the shield has been painted and scored in red chequerboard decoration. Radiocarbon dating has revealed that the shield was made between 395 and 255 BC.

The shield was severely damaged before being deposited in the ground, with some of the damage likely to have been caused by the pointed tips of spears. Further analysis is planned to help understand if this occurred in battle or as an act of ritual destruction.

Michael Bamforth, from the University of York's Department of Archaeology, said: "This truly astonishing and unparalleled artefact has given us an insight into prehistoric technology that we could never have guessed at.

"Although we know that bark has many uses, including making boxes and containers, it doesn't survive well in the archaeological record. Initially we didn't think bark could be strong enough to use as a shield to defend against spears and swords, and we wondered if it could be for ceremonial use.

"It was only through experimentation that we realised it could be tough enough to protect against blows from metal weapons. Although a bark shield is not as strong as one made from wood or metal, it would be much lighter, allowing the user much more freedom of movement."

The shield was first discovered by archaeologists at the University of Leicester's Archaeological Services in 2015 at an Iron Age site within a farming landscape known to have been used and managed by Iron Age communities, with the Fosse Way Roman road running close by.

Many cutting-edge analytical techniques have been used to understand the construction of the object, including CT scanning and 3D printing.

Dr Rachel Crellin, Lecturer in later Prehistory at the University of Leicester, who assessed the evidence for impact damage, said: "The first time I saw the shield I was absolutely awed by it: the complex structure, the careful decorations, and the beautiful boss.

"I must admit I was initially sceptical about whether the shield would have functioned effectively. However, the experimental work showed that it would have worked very effectively, and analysis of the surface of the object has identified evidence of use."

Credit: 
University of York

GRACE data contributes to understanding of climate change

image: Illustration of the twin GRACE follow-on satellites.

Image: 
NASA/JPL-Caltech

The University of Texas at Austin team that led the Gravity Recovery and Climate Experiment (GRACE), a twin-satellite system launched in 2002 to take detailed measurements of the Earth, reports in the most recent issue of the journal Nature Climate Change on the contributions its nearly two decades of data have made to our understanding of global climate patterns.

Among the many contributions that GRACE has made:

GRACE recorded three times the mass of ice lost in polar and mountainous regions since it first began taking measurements -- a consequence of global warming.

GRACE enabled measurement of how much heat has been added to the ocean and where that heat remains stored. Its detailed observations confirm that the majority of the warming occurs in the upper 2,000 meters of the oceans.

GRACE has observed that 13 of the 37 largest land-based aquifers have undergone critical mass loss. This loss, driven by both climate-related and anthropogenic (human-induced) effects, documents the reduced availability of clean, fresh water supplies for human consumption.

The information gathered from GRACE provides vital data for the United States Drought Monitor and has shed light on the causes of drought and aquifer depletion in places worldwide, from India to California.

Intended to last just five years in orbit for a limited, experimental mission to measure small changes in the Earth's gravitational fields, GRACE operated for more than 15 years and has provided unprecedented insight into our global water resources, from more accurate measurements of polar ice loss to a better view of the ocean currents, and the rise in global sea levels. The mission was a collaboration between NASA and the German Aerospace Centre and was led by researchers in the Center for Space Research (CSR) in UT's Cockrell School of Engineering.

UT's Texas Advanced Computing Center (TACC) has played a critical role in this international project over the last 15 years, according to Byron Tapley, the Clare Cockrell Williams Centennial Chair Emeritus in the Department of Aerospace Engineering and Engineering Mechanics who established the Center for Space Research at UT in 1981 and who served as principal investigator of the GRACE mission.

"As the demand for the GRACE science deliverables has grown, TACC's ability to support those demands has grown. It has been a seamless transition to a much richer reporting environment," he said.

By measuring changes in mass that cause deviations in the strength of gravity's pull on the Earth's various systems -- water systems, ice sheets, atmosphere, land movements, and more -- the satellites can measure small changes in the Earth system interactions.

"By monitoring the physical components of the Earth's dynamical system as a whole, GRACE provides a time variable and holistic overview of how our oceans, atmosphere and land surface topography interact," Tapley said.

The data system for the mission is highly distributed and requires significant data storage and computation across an internationally distributed network. Although the final data products for the CSR solutions are generated at TACC, considerable work is carried out in Germany by the German Research Centre for Geosciences (GFZ) in Potsdam and at the NASA Jet Propulsion Laboratory (JPL) in Pasadena, California. The final CSR analysis at TACC starts with a data downlink from the satellites to a raw data collection center in Germany. The data is then transmitted to JPL, where the primary measurements are converted into geophysical measurements consisting of GPS, accelerometer, attitude quaternion, and high-accuracy intersatellite ranging data collected by each satellite during a month-long observation span.

"The collection of information from this international community is brought together by the fundamental computing capability and the operational philosophy at TACC to undergo the challenging data analysis required to obtain the paradigm-shifting view of the Earth's interactions," Tapley said.

Despite being a risky venture operating on minimal funding, the GRACE mission surpassed all expectations and continues to provide a critical set of measurements.

"The concept of using the changing gravimetric patterns on Earth as a means to understanding major changes in the Earth system interactions had been proposed before," Tapley said. "But we were the first to make it happen at a measurement level that supported the needs of the diverse Earth-science community."

One of the remarkable benefits of working with TACC, according to Tapley, is the ability to pose questions whose solutions would not have been feasible before TACC, and to find the capability to answer them.

"As an example, when we began the GRACE mission, our capability was looking at gravity models that were characterized by approximately 5,000 model parameters, whose solution was obtained at approximately yearly analysis intervals. The satellite-only GRACE models today are based on approximately 33,000 parameters that we have the ability to determine at a daily interval. In the final re-analysis of the GRACE data, we're looking to expand this parameterization to 4,000,000 parameters for the mean model. The interaction with TACC has always been in the context of: 'If the answer to a meaningful question requires extensive computations, let's find a way to satisfy that requirement,'" Tapley said.
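For context, GRACE gravity fields are conventionally represented as spherical harmonic expansions, in which the model parameters are the Stokes coefficients up to some maximum degree. The sketch below shows how the coefficient count grows with degree; the specific degrees chosen are illustrative assumptions, not figures from the mission.

```python
def sh_coeff_count(max_degree):
    """Number of spherical-harmonic Stokes coefficients (C_lm and S_lm)
    for degrees l = 2..max_degree: each degree l contributes (l + 1)
    C terms (m = 0..l) and l S terms (m = 1..l), i.e. 2l + 1 in total.
    Degrees 0 and 1 are excluded (fixed by total mass and the choice
    of coordinate origin).  In closed form: (max_degree + 1)**2 - 4."""
    return sum(2 * l + 1 for l in range(2, max_degree + 1))

# A degree-70 field has about 5,000 coefficients and a degree-180 field
# about 33,000 -- the same order of magnitude as the parameter counts
# Tapley describes.
print(sh_coeff_count(70))    # 5037
print(sh_coeff_count(180))   # 32757
```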

Now that the GRACE Follow-On mission, in which the CSR will continue to play a role, has launched successfully, researchers have the chance to extend the GRACE record with a second multi-decadal measurement of changes in mass across the Earth system. Engineers and scientists anticipate that the longer data interval will give them an even clearer picture of how the planet's climate patterns behave over time.

Credit: 
University of Texas at Austin, Texas Advanced Computing Center

Scientists teach old worms new tricks

image: The nematode worm Caenorhabditis elegans at three life stages; egg, larva, and adult.

Image: 
Marie-Anne Félix, Ecole Normale Supérieure, Paris, France

May 23, 2019 - Model organisms such as yeast, fruit flies, and worms have advanced the study of genomics, eukaryotic biology, and evolution. An important resource for any model organism is a near-complete reference genome from which a multitude of scientific questions can be answered. Caenorhabditis elegans has been widely studied due to its short generation time and transparent anatomy and was one of the first multicellular organisms sequenced, yet gaps in its reference genome remain.

Three studies, published today in Genome Research, provide novel insights into C. elegans genomics and gene expression. Advances in sequencing technologies toward longer reads have facilitated genome assembly, finishing, and the sequencing of highly repetitive regions. In a study by Yoshimura and colleagues, researchers used a combination of short- and long-read assemblies to generate a more complete reference genome of a modern laboratory strain of C. elegans. The new sequence has an extra two million nucleotides that were absent from the previous sequence, including highly repetitive regions and approximately 50 new genes. Likewise, Kim and colleagues used long-read sequencing to construct a high-quality reference genome of a wild C. elegans strain for comparative studies, detecting new regions at chromosome ends that maintain genome integrity and more than 600 genes that diverge between the wild strain and a laboratory strain.

Knowledge of where and when these genes are expressed is key to understanding how organisms develop and how tissues carry out their specific functions. Robert Waterston and colleagues utilized C. elegans to study spatial and temporal gene expression in a cell- and tissue-specific manner during development. This gene regulatory network library provides a framework for studying cell lineage from egg to adulthood.

These studies expand the usefulness of this model organism and provide an important resource to C. elegans biologists. Equally, the development of advanced techniques may facilitate the generation of more complete genome assemblies and the discovery of context- and tissue-specific gene regulation where small cell niches may play important biological roles in development, immune response, and cancer.

Credit: 
Cold Spring Harbor Laboratory Press

Scientists create new standard genome for heavily studied worm

image: The nematode worm Caenorhabditis elegans. Three life stages of C. elegans are shown: an adult hermaphrodite, a smaller larva, and eleven newly laid eggs (with embryos developing inside them). These are all on a lawn of Escherichia coli bacteria (being eaten by the worms as food). The fully grown adult is roughly 1 millimeter long.

Image: 
Marie-Anne Félix, Ecole Normale Supérieure

ITHACA, N.Y. - A new Cornell University-led study finds that the genome for a widely researched worm, on which countless studies are based, was flawed. Now, a fresh genome sequence will set the record straight and improve the accuracy of future research.

When scientists study the genetics of an organism, they start with a standard genome sequenced from a single strain that serves as a baseline. It's like a chess board in a chess game: every board is fundamentally the same.

One model organism that scientists use in research is a worm called Caenorhabditis elegans. The worm - the first multicellular eukaryote (animal, plant or fungus) to have its genome sequenced - is easy to grow and has simple biology with no bones, heart or circulatory system. At the same time, it shares many genes and molecular pathways with humans, making it a go-to model for studying gene function, drug treatments, aging and human diseases such as cancer and diabetes.

Genetic studies of C. elegans were based on a single strain, called N2, which researchers have ordered for decades from the C. elegans stock center at the University of Minnesota. Though people tried to uphold a common standard, individual labs grew N2 strains on their own, and the strains gradually diverged genetically.

"Over the last decade, with more advanced genetic experiments using high levels of DNA sequencing, scientists were alarmed to discover that there is no longer a single laboratory strain that everyone was using," said Erich Schwarz, assistant research professor in the Department of Molecular Biology and Genetics. "Over 40 years there have arisen many different N2 strains; we can't rely on any one of them to do experiments."

Schwarz is a senior author of a new study published in Genome Research that describes a single genetically clean strain, called VC2010, where each individual is truly identical. Schwarz and colleagues from the University of Tokyo, Stanford University, the University of British Columbia and the University of Minnesota used cutting-edge techniques to sequence VC2010's genome and create a new standard.

As part of the study, the researchers compared VC2010 to the original N2 genome. They expected a near-perfect match but got a surprise. "Along with the 100 million nucleotides we expected to see, we discovered an extra 2 million nucleotides, an extra two percent of the genome," which had been hidden in the original, likely due to limitations of older sequencing technology, Schwarz said.

Schwarz added that similar issues are likely occurring in the standard genomes of other organisms, including humans. "It shows us that having the true complete DNA of an animal is not as easy as we thought it was," he said.

Other labs have begun using modern sequencing tools to reassess other genomes, which has implications for synthetic biology, where scientists are creating life - such as bacteria - from scratch. "Having a really good DNA sequence is an important baseline," Schwarz said.

Credit: 
Cornell University