Culture

Rare 10 million-year-old fossil unearths new view of human evolution

image: Rudapithecus was pretty ape-like and probably moved among branches like apes do now -- holding its body upright and climbing with its arms. However, it would have differed from modern great apes by having a more flexible lower back, which would mean when Rudapithecus came down to the ground, it might have had the ability to stand upright more like humans do.

Image: 
Illustration courtesy of John Sibbick

COLUMBIA, Mo. - Near an old mining town in Central Europe, known for its picturesque turquoise-blue quarry water, lay Rudapithecus. For 10 million years, the fossilized ape waited in Rudabánya, Hungary, to add its story to the origins of how humans evolved.

What Rudabánya yielded was a pelvis -- among the most informative bones of a skeleton, but one that is rarely preserved. An international research team led by Carol Ward at the University of Missouri analyzed this new pelvis and discovered that human bipedalism -- the ability to move on two legs -- may have deeper ancestral origins than previously thought.

The Rudapithecus pelvis was discovered by David Begun, a professor of anthropology at the University of Toronto who invited Ward to collaborate with him to study this fossil. Begun's work on limb bones, jaws and teeth has shown that Rudapithecus was a relative of modern African apes and humans, a surprise given its location in Europe. But information on its posture and locomotion has been limited, so the discovery of a pelvis is important.

"Rudapithecus was pretty ape-like and probably moved among branches like apes do now -- holding its body upright and climbing with its arms," said Ward, a Curators Distinguished Professor of Pathology and Anatomical Sciences in the MU School of Medicine and lead author on the study. "However, it would have differed from modern great apes by having a more flexible lower back, which would mean when Rudapithecus came down to the ground, it might have had the ability to stand upright more like humans do. This evidence supports the idea that rather than asking why human ancestors stood up from all fours, perhaps we should be asking why our ancestors never dropped down on all fours in the first place."

Modern African apes have a long pelvis and short lower back because they are such large animals, which is one reason why they typically walk on all fours when on the ground. Humans have longer, more flexible lower backs, which allow them to stand upright and walk efficiently on two legs, a hallmark characteristic of human evolution. Ward said if humans evolved from an African ape-like body build, substantial changes to lengthen the lower back and shorten the pelvis would have been required. If humans evolved from an ancestor more like Rudapithecus, this transition would have been much more straightforward.

"We were able to determine that Rudapithecus would have had a more flexible torso than today's African apes because it was much smaller -- only about the size of a medium dog," Ward said. "This is significant because our finding supports the idea suggested by other evidence that human ancestors might not have been built quite like modern African apes."

Ward teamed up with Begun to study the pelvis along with MU alumna Ashley Hammond, Assistant Curator of Biological Anthropology at the American Museum of Natural History, and J. Michael Plavcan, a professor of anthropology at the University of Arkansas. Since the fossil was not 100% complete, the team used new 3D modeling techniques to digitally complete its shape, then compared their models with modern animals. Ward said their next step will be to conduct a 3D analysis of other fossilized body parts of Rudapithecus to gather a more complete picture of how it moved, giving more insight into the ancestors of African apes and humans.

Credit: 
University of Missouri-Columbia

'Death Star' bacterial structures that inject proteins can be tapped to deliver drugs

image: 'Death Star' bacterial syringe structures -- Metamorphosis Associated Contractile structures (MACs) discovered by SDSU marine microbiologist Nicholas Shikuma and his lab team. They inject a protein into animal cells that triggers metamorphosis; similar structures may also exist in the human gut and could be tapped to deliver targeted drugs in the future.

Image: 
Shikuma Lab

By scraping tubeworms off the bottoms of boats in San Diego harbor, San Diego State University researchers discovered that a beneficial bacterium that helps the worms establish colonies could also be a boon for human health, because the same process might already take place in the human gut.

By examining this bacterium, which triggers metamorphosis in the humble tubeworm, marine microbiologists at SDSU discovered that the nanoscale syringe-like structures it produces -- nicknamed the Death Star for the effect they have -- could one day be used to deliver novel therapeutics or vaccines to targeted cells and tissues in humans.

Tubeworms (Hydroides elegans) are tiny marine creatures with hard shells that cause a lot of trouble and economic loss for boat and ship owners. They stick to the bottoms of boats and form inches-thick crusty layers, and also attract other invertebrates, such as barnacles, that then settle on top of them. This so-called 'biofouling' leads to additional weight and higher fuel consumption. So everyone from the U.S. Navy to the shipping and boatbuilding industry is interested in finding out how the worms do this and what can be done to prevent it from happening.

Marine research led to significant discovery

Nicholas Shikuma of SDSU and the students in his lab have been studying tubeworms for several years to understand exactly why the worms are drawn to certain places in the ocean to establish colonies.

Previous research by others showed that, like corals, sea urchins and sea squirts, tubeworms need a conducive environment to reproduce, so they typically gravitate to areas with healthy populations of beneficial bacteria such as Pseudoalteromonas. Shikuma discovered that this bacterium has Metamorphosis Associated Contractile structures (MACs) -- syringe-like structures that inject their contents into tubeworm larvae, helping transform them into juvenile worms.

What he and fellow scientists did not know was whether the MACs were injecting a biochemical into the tubeworm that causes it to metamorphose and stick to boat hulls. Shikuma's lab used cryo-electron tomography to image the structures and found arrays of Death Star-shaped injection systems released by the bacterium.

They discovered that the syringe structures contained a novel effector protein, Mif1, that regulates biological activity in the tubeworm host, and it's this protein that's responsible for causing metamorphosis.

"Lots of pathogens produce these syringe structures that typically cause disease," Shikuma said. "But this is the first time we discovered bacteria that use the syringe for a symbiotic purpose."

Stealing syringes from phages, but for good cause

The MACs resemble the syringe structures found on bacteriophages -- viruses that infect bacteria. Over the course of evolution, the bacteria have 'stolen' this structure from the phages and put it to good use.

"Phage typically attack bacteria with these structures, but instead of using it to infect other bacteria, the Pseudoalteromonas now uses it to interact with other animals, such as tubeworms, insects, and mouse cells," Shikuma said.

"MACs are created when the bacteria undergo cell lysis - when the cells blow themselves up - and the bacteria that do this die afterwards, so it's almost like altruism because it benefits the rest of the bacterial population."

Not every bacterium in this strain produces MACs -- only about one in 50 does -- but since trillions of these bacteria can be cultured, supply will not be an issue, and more of them can be engineered to produce MACs, he explained.

The findings will be published September 17th in the journal eLife, and follow on the heels of a paper from Shikuma's lab published in Cell Reports in June this year, which looked at how this bacterium interacts in vitro with insect and mouse cells. That paper showed how the microscopic syringe structures could be loaded with payloads that could potentially carry therapeutics or vaccines.

Shikuma has obtained a provisional patent covering the findings in both papers, on using MACs to deliver modified proteins. As a next step, his lab is mining data from the Human Microbiome Project to see whether humans harbor the same bacterial syringe structures in the gut that could be harnessed for therapeutics.

Credit: 
San Diego State University

New hunt for dark matter

image: Researchers have designed a way to give the long tunnel arms of gravitational-wave observatories like KAGRA in Japan the ability to potentially also detect axion dark matter. https://doi.org/10.1103/PhysRevLett.123.111301

Image: 
© 2019 University of Tokyo Institute for Cosmic Ray Research

Dark matter is known only by its effect on massive astronomical bodies; it has yet to be directly observed or even identified. One theory suggests that dark matter could be a particle called the axion, and that axions could be detectable with laser-based experiments that already exist: gravitational-wave observatories.

The hunt is on for dark matter. There are many theories as to what it might turn out to be, but many physicists believe dark matter is a weakly interacting massive particle, or WIMP, meaning it does not interact easily with ordinary matter. This is consistent with the fact that it has never been seen directly. But it must also have mass, since its presence can be inferred from its gravitational attraction.

There have been enormous efforts to detect WIMP dark matter, including with the Large Hadron Collider in Switzerland, but WIMPs haven't been observed yet. An alternative candidate particle gaining attention is the axion.

"We assume the axion is very light and barely interacts with our familiar kinds of matter. Therefore, it is considered a good candidate for dark matter," said Assistant Professor Yuta Michimura from the Department of Physics at the University of Tokyo. "We don't know the mass of axions, but we usually think it is less than that of an electron. Our universe is filled with dark matter, and it's estimated there are 500 grams of dark matter within the Earth, about the mass of a squirrel."

Axions seem like a good candidate for dark matter, but since they may only interact very weakly with ordinary matter, they are extraordinarily difficult to detect. So physicists devise increasingly intricate ways to compensate for this lack of interaction in the hope of revealing the telltale signature of dark matter, which is estimated to make up about a quarter of the mass-energy of the universe.

"Our models suggest axion dark matter modulates light polarization, which is the orientation of the oscillation of electromagnetic waves," explained Koji Nagano, a graduate student at the Institute for Cosmic Ray Research at the University of Tokyo. "This polarization modulation can be enhanced if the light is reflected back and forth many times in an optical cavity composed of two parallel mirrors set a distance apart. The best-known examples of these kinds of cavities are the long tunnel arms of gravitational-wave observatories."
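For context, the modulation Nagano describes can be sketched with a standard result from axion electrodynamics (this equation comes from the general literature, not from the article itself, and the notation is illustrative): the axion-photon coupling $g_{a\gamma}$ gives left- and right-circularly polarized light slightly different phase velocities, so linear polarization is rotated by an angle set by the change in the oscillating axion field $a$:

```latex
\delta\phi(t) \;\simeq\; \frac{g_{a\gamma}}{2}\,\bigl[a(t) - a(t_0)\bigr],
\qquad a(t) = a_0 \cos(m_a t)
```

Because the field oscillates at a frequency set by the axion mass $m_a$, each round trip between the mirrors can add to the rotation coherently as long as the light's storage time is shorter than the oscillation period, which is why long, high-finesse cavity arms would enhance the signal.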

Dark matter research does not get as much attention or funding as other more applicable areas of scientific research, so great efforts are made to find ways to make the hunt cost-effective. This is relevant as other theoretical ways to observe axions involve extremely strong magnetic fields which incur great expense. Here, researchers suggest that existing gravitational-wave observatories such as the Laser Interferometer Gravitational-Wave Observatory (LIGO) in the USA, Virgo in Italy or KAGRA in Japan could be cheaply modified to hunt for axions without detriment to their existing functions.

"With our new scheme, we could search for axions by adding some polarization optics in front of photodiode sensors in gravitational-wave detectors," described Michimura. "The next step I would like to see is the implementation of optics to a gravitational-wave detector like KAGRA."

This idea has promise because the upgrades to the gravitational-wave facilities would not reduce the sensitivity they rely on for their primary function, which is to detect distant gravitational waves. Attempts have been made with experiments and observations to find the axion, but thus far no positive signal has been found. The researchers' proposed method would be far more precise.

"There is overwhelming astrophysical and cosmological evidence that dark matter exists, but the question 'What is dark matter?' is one of the biggest outstanding problems in modern physics," said Nagano. "If we can detect axions and say for sure they are dark matter, it would be a truly exciting event indeed. It's what physicists like us dream for."

Credit: 
University of Tokyo

Screening mammography could benefit men at high risk of breast cancer

image: Images in a 53-year-old man with Ashkenazi Jewish ancestry with BARD1 genetic mutation and strong family history of breast cancer including male breast cancer in father and premenopausal breast cancer in multiple sisters. Patient was found to have left breast cancer on (a-c) baseline screening mammogram and contralateral right breast cancer on (d-f) subsequent-year screening mammogram. Grouped coarse heterogeneous calcifications in left breast on baseline screening mammogram as shown on magnification views in (a) mediolateral and (b) craniocaudal projections (circles) underwent excisional biopsy, with surgical specimens showing (c) inclusion of targeted calcifications. Pathologic result yielded estrogen receptor (ER)-positive, progesterone receptor (PR)-positive, and human epidermal growth factor receptor 2 (HER2)-negative grade 2 invasive and in situ carcinoma, ultimately treated with mastectomy. One year later, grouped and scattered calcifications were seen on right breast screening mammogram as shown on magnification views in (d) mediolateral and (e) craniocaudal projections (arrows), which underwent SAVI Scout radar-localized excisional biopsy with surgical specimens showing (f) inclusion of targeted calcifications. Pathologic result yielded ER-positive and PR-positive grade 2 ductal carcinoma in situ, which subsequently underwent mastectomy.

Image: 
Radiological Society of North America

OAK BROOK, Ill. - Selective mammography screening can provide potentially lifesaving early detection of breast cancer in men who are at high risk for the disease, according to a landmark study published in the journal Radiology.

Breast cancer in men is a rare but often deadly disease. The American Cancer Society projects that 2,670 new cases of invasive breast cancer will be diagnosed in men in 2019, and about 500 men will die from it.

There are no formal screening guidelines for men in high-risk groups such as those who have a personal history of the disease, breast-cancer-associated genetic mutations or family members who had breast cancer. As a result, men diagnosed with breast cancer tend to have worse outcomes than women.

"Mammographic screening has helped improve the prognosis for women with breast cancer," said study lead author Yiming Gao, M.D., from the Department of Radiology at New York University Langone Medical Center in New York City. "But men don't have any formalized screening guidelines, so they are more likely to be diagnosed at a more advanced stage and often don't do as well as women."

There is anecdotal evidence that selective screening with mammography in men with identifiable risk factors is beneficial, but little is known as to how and to what extent breast imaging is used in this population.

In the first study of its kind, Dr. Gao and colleagues evaluated breast imaging utilization patterns and screening outcomes in 1,869 men, median age 55, who underwent mammography over a 12-year period.

Mammography helped detect a total of 2,304 breast lesions, 149 of which were biopsied. Of those, 41 (27.5 percent) proved to be malignant. The cancer detection rate of 18 per 1,000 exams in men at high risk of breast cancer was significantly higher than the average rate of three to five per 1,000 exams in average-risk women. In addition, the cancers detected in men were at an early stage, before they had spread to the lymph nodes, improving the prognosis for survival.

"These results show that it is possible to detect male breast cancer early, and it appears that mammography is effective in targeted screening of high-risk men," Dr. Gao said. "We've shown that male breast cancer doesn't have to be diagnosed only when symptomatic."

In men, mammographic screening sensitivity, or the ability to detect cancer, was 100 percent, while specificity, or the ability to distinguish breast cancer from other findings, was 95 percent. This excellent performance is related to men having a relative lack of breast fibroglandular tissue that in women often masks abnormal results, the researchers said.
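As a back-of-the-envelope illustration (the counts below are hypothetical, chosen only to reproduce the percentages quoted in the article; the study's actual exam-level tallies are not given here), these screening metrics follow from a standard confusion matrix:

```python
# Illustrative calculation of screening metrics from a confusion matrix.
# The counts below are hypothetical; the article reports only the resulting
# figures (sensitivity 100%, specificity 95%, and roughly 18 cancers
# detected per 1,000 exams).

def screening_metrics(tp, fn, tn, fp):
    """Return sensitivity, specificity, and detections per 1,000 exams."""
    sensitivity = tp / (tp + fn)      # ability to detect cancer
    specificity = tn / (tn + fp)      # ability to rule out cancer
    exams = tp + fn + tn + fp
    rate_per_1000 = 1000 * tp / exams
    return sensitivity, specificity, rate_per_1000

# Hypothetical counts chosen to mirror the reported percentages:
sens, spec, rate = screening_metrics(tp=41, fn=0, tn=2150, fp=113)
print(f"sensitivity={sens:.0%}, specificity={spec:.0%}, "
      f"rate={rate:.0f} per 1,000 exams")
```

Note that sensitivity and specificity answer different questions (catching true cancers versus avoiding false alarms), which is why both must be high for a screening program to be worthwhile.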

Personal history of breast cancer was the most significant risk factor associated with breast cancer in men. Ashkenazi Jewish ancestry, genetic mutations, and first-degree family history of breast cancer were also significant factors.

Currently, the National Comprehensive Cancer Network (NCCN) does not support screening because of a lack of evidence, even in men with elevated risk. Earlier NCCN guidelines suggested consideration of baseline mammograms on an individual basis, an approach the new study results may support.

Moving forward, the researchers hope to see larger multi-institutional studies that have the statistical power to delineate more nuanced information based on different breast cancer risk factors in men.

"Rethinking our strategy toward male breast cancer diagnosis is necessary," Dr. Gao said. "We hope these results will provide a foundation for further investigations, and potentially help pave the way to standardizing screening for certain high-risk groups of men."

Credit: 
Radiological Society of North America

Pros and cons of genetic scissors

image: Thorsten Müller researches in Bochum into so-called organoids from stem cells that function like mini brains.

Image: 
RUB, Marquard

Crispr technology has greatly facilitated gene editing. Associate Professor Thorsten Müller from Ruhr-Universität Bochum and Dr. Hassan Bukhari from Harvard Medical School discuss its pros and cons in a review article in the journal Trends in Cell Biology from 12 September 2019. They believe Crispr technology's greatest future potential lies in making it usable for stem cell research.

To analyse the effects of genes or gene products, researchers used to artificially over-activate them. "Thus, they would occur up to 1,000 times more frequently than in nature," says Thorsten Müller. "The cell was flooded with gene products, the proteins, which can distort functional analysis." The Crispr method eliminates this difficulty. It can be used to insert the blueprints of fluorescent proteins into cells and position them behind a specific gene. "This has enabled us to monitor a protein's function live under natural conditions for the first time -- rather than after 1,000-fold overproduction," explains the biochemist.

The Crispr method itself has also been optimised. Initially, researchers had to create so-called vectors in a time-consuming process in order to label genes in the genome. Vectors are DNA segments whose sequence must be partially identical to the DNA of the target cell, to ensure that the inserted gene finds the right spot. Today, researchers take advantage of the cells' natural DNA repair machinery, which considerably simplifies the creation of vectors, and are consequently able to introduce fluorescent proteins quickly and easily.

How drugs work

In addition, fluorescent tagging makes it possible to observe live under the microscope where the labelled gene products are located in the cell. "This could be interesting for testing the effects of drugs on certain gene products," explains Müller. To this end, researchers would stimulate the cell with the active substance and track whether and how the position of the gene product changes.

Different genes could be labelled with fluorescent biosensors in different colours and analysed simultaneously. The more strongly a tagged gene is expressed, the more brightly the cell fluoresces in the respective colour.

Organoids could replace animal testing

The authors believe a combination of this method with so-called organoids offers considerable potential. Organoids are mini organs derived from induced pluripotent stem cells, which can be obtained from an adult organism. They can be used, for example, to build mini brains that are the functional equivalent of the human brain.

Should the application of Crispr technology in stem cells become more widespread, researchers would be able to analyse the effects of gene modification in complex tissues, rather than just in isolated cells. "We could study human genes live in tissues resembling those of humans and wouldn't have to rely on animal models as much as we do now," concludes Müller.

Following these considerations, Müller and Bukhari have compiled a number of key research questions in the review article, which have to be answered in order to consolidate Crispr technology and organoid technology.

Credit: 
Ruhr-University Bochum

How nitrogen-fixing bacteria sense iron

image: University of East Anglia research reveals how nitrogen-fixing bacteria sense iron -- an essential but deadly micronutrient.

Image: 
University of East Anglia

Researchers at the University of East Anglia have discovered how nitrogen-fixing bacteria sense iron - an essential but deadly micronutrient.

Some bacteria naturally fix nitrogen from the air into a form that plants can use. In nature, most plants get nitrogen either from soil bacteria that do this work or from plants and microbes that die and recycle their nitrogen into the soil. In agriculture, soil is enriched with synthetic nitrogen fertilizers.

Virtually all life forms require iron to survive, yet too much of the metal can be catastrophic. In healthy cells, many systems regulate this delicate balance.

In many nitrogen-fixing bacteria, a protein called RirA plays a key role in regulating iron. It senses high levels of the metal and helps to shut down the production of proteins that bring in more iron.

RirA contains a cluster of four iron and four sulfur atoms, which acts as a sensor for iron availability. But until now, exactly how this cluster structure detects iron levels in a cell was unclear.

The UEA research team was led by Prof Nick Le Brun from the School of Chemistry in collaboration with researchers at the University of Essex.

They used a technique known as time-resolved mass spectrometry to examine the sensory response of the iron-sulfur cluster of RirA when different levels of iron were available.

The results revealed a 'loose' iron atom in the cluster. When iron levels drop, this atom is rapidly lost as it is scavenged for use in other essential cellular processes.

Without it, the cluster in RirA collapses and the protein becomes inactive, which prompts the cell to produce proteins that enable the cell to take up iron from its surroundings.

Once iron levels are sufficient again, RirA regains its cluster and becomes active again, stopping the production of proteins that bring in more iron.

Iron-sulfur clusters are common in many proteins, and this work offers new insight into their various roles. It also highlights the potential to use time-resolved mass spectrometry to examine biological processes in depth.

Prof Le Brun said: "This research provides unprecedented detail of how the iron-sensing cluster of RirA responds to low iron conditions, and establishes, for the first time, how an iron-sulfur cluster can be used to sense iron.

"This is an important piece in the bigger puzzle of how life deals with iron, a nutrient it cannot do without but one it must also avoid having in excess."

Credit: 
University of East Anglia

Stroke patients relearning how to walk with peculiar shoe

video: The iStride device is strapped over the shoe of the good leg and generates a backwards motion, exaggerating the existing step, making it harder to walk while wearing the shoe. The awkward movement strengthens the stroke-impacted leg, allowing gait to become more symmetrical once the shoe is removed.

Image: 
University of South Florida

TAMPA, Fla. (September 17, 2019) - A therapeutic shoe engineered to improve stroke recovery is proving successful and is expected to hit the market by the end of the year. Clinical trials have been completed on the U.S.-patented and licensed iStride Device, formerly the Gait Enhancing Mobile Shoe (GEMS), with results just published in the Journal of NeuroEngineering and Rehabilitation.

Stroke sufferers experience muscle weakness or partial paralysis on one side of the body, which greatly affects their manner of walking, known as gait. Gait asymmetry is associated with poor balance, a major cause of degenerative issues that make individuals more susceptible to falls and injuries.

The iStride device is strapped over the shoe of the good leg and generates a backwards motion, exaggerating the existing step, making it harder to walk while wearing the shoe. The awkward movement strengthens the stroke-impacted leg, allowing gait to become more symmetrical once the shoe is removed. The impaired foot wears a matching shoe that remains stationary.
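Gait symmetry can be quantified in several ways; as one hedged illustration (this particular index and the numbers are hypothetical, not taken from the study), a simple ratio of step times between the two legs gives 1.0 for a perfectly symmetric gait:

```python
def symmetry_ratio(step_time_affected, step_time_unaffected):
    """Simple gait-symmetry index: 1.0 means the two steps take equal time.

    This is one common way to express asymmetry; the study measured
    symmetry on a ProtoKinetics Zeno Walkway, and its exact metric
    may differ from this sketch.
    """
    return step_time_affected / step_time_unaffected

# Hypothetical values: a stroke survivor often lingers on the stronger leg,
# so the step of the affected side takes noticeably longer.
before_training = symmetry_ratio(0.75, 0.55)
after_training = symmetry_ratio(0.62, 0.55)
print(before_training > after_training > 1.0)  # asymmetry shrinks toward 1.0
```

A ratio drifting back toward 1.0 after the shoe is removed is the kind of carry-over effect the training aims for.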

"The backward motion of the shoe is generated passively by redirecting the wearer's downward force during stance phase. Since the motion is generated by the wearer's force, the person is in control, which allows easier adaptation to the motion," said developer Kyle Reed, PhD, associate professor of mechanical engineering at the University of South Florida. "Unlike many of the existing gait rehabilitation devices, this device is passive, portable, wearable and does not require any external energy."

The trial included six people between the ages of 57 and 74 who had suffered a cerebral stroke at least one year prior to the study. All had asymmetry large enough to impact their walking ability. Each received twelve 30-minute gait training sessions over four weeks. With guidance from a physical therapist, the patients' gait symmetry and functional walking were measured using the ProtoKinetics Zeno Walkway system.

All participants improved their gait's symmetry and speed, including how long it took to stand up from a sitting position and begin walking, how long it took to walk to a specific location, and the distance traveled within six minutes. Four improved the percentage of time spent in the gait cycle with both feet simultaneously planted on the ground, known as double limb support. Of the two who didn't improve, one started the study with severe impairment, while the other was already highly functional. Notably, three participants entered the study able to walk only within their homes; following the trial, two of them could successfully navigate public venues.

Reed compared his method to a previous study conducted on split-belt treadmill training (SBT), which is commonly used by physical therapists to help stroke patients improve their gait. The equipment allows the legs to move at different speeds, forcing the patient to compensate in order to remain on the treadmill. While the SBT improves certain aspects of gait, unlike the iStride, it doesn't strengthen double limb support.

That research concluded that only about 60 percent of patients trained on the SBT corrected their gait when walking in a normal environment. Walking is context-dependent: visual cues affect how quickly one tries to move, and in what direction. The iStride allows patients to adjust accordingly, whereas movement on a treadmill is predictable and presents a static scene.

Since patients are often disappointed in their progress after being discharged from rehabilitation, the iStride's portability allows patients to relearn to walk in a typical setting more often and for a longer duration. Reed is now working on a home-based clinical trial with 21 participants and expects to publish results within the next year. He recently received a Fulbright scholarship to conduct research at Hong Kong Polytechnic University. He's working in the rehabilitation sciences and biomedical engineering departments throughout the 2019-2020 academic year.

Credit: 
University of South Florida

Complexity of plastics makes it impossible to know which are dangerous

image: Researchers recommend that consumers try to avoid PVC, which is labelled as #3 plastic in the recycling code, and all products labelled as #7 plastic, which are "other types of plastic".

Image: 
Illustration: Teleman84, Wikimedia Commons

A lot of people worry about microplastics and plastic pollution, but not as many of us are aware of the large number of chemicals we encounter in plastic products that we use every day.

Researchers know of more than 4,000 chemicals that are currently used in plastic food packaging. But with more than 5,000 different types of plastic on the market, the number of chemicals used to make plastics is likely even larger.

"The problem is that plastics are made of a complex chemical cocktail, so we often don't know exactly what substances are in the products we use. For most of the thousands of chemicals, we have no way to tell whether they are safe or not," says Martin Wagner, a biologist at the Norwegian University of Science and Technology (NTNU). "This is because, practically speaking, it's impossible to trace all of these compounds. And manufacturers may or may not know the ingredients of their products, but even if they know, they are not required to disclose this information."

Wagner specializes in the environmental and health impacts of plastics, and is the senior author of a new study on the toxicity and chemical composition in various plastic products.

The work is part of the PlastX research group led by ISOE - the Institute for Social-Ecological Research in Frankfurt - and was carried out in collaboration with Goethe University Frankfurt am Main, Germany. The article was recently published in Environmental Science and Technology.

"We studied eight types of plastics commonly used to make everyday products, such as yogurt cups and bath sponges, and examined their toxicity and chemical composition. Three out of four products contained toxic chemicals," Lisa Zimmermann, Wagner's colleague and first author of the study, says.

The researchers used cell cultures to investigate the effects of the mix of chemicals in each product. They found that many plastics contain chemicals that induced general toxicity (six out of ten products), oxidative stress (four out of ten) and endocrine-disrupting effects (three out of ten).

It is impossible to pinpoint specifically which chemicals were the culprits: the research group discovered more than 1,400 substances in plastics but identified only 260 of them. That means that most of the plastic chemicals remain unknown and cannot be assessed for their safety.

Even so, the authors were able to conclude that the plastic chemicals in polyvinyl chloride (PVC) and polyurethane (PUR) were the most toxic. Compared to PVC and PUR, polyethylene terephthalate (PET) and high-density polyethylene (HDPE) were less toxic.

"Plastics contain chemicals that trigger negative effects in a culture dish. Even though we do not know whether this will affect our health, such chemicals simply shouldn't be in plastics in the first place," Wagner says.

Traditionally, plastics consist of polymers and, as a rule, are made from petroleum products. Now "green" alternatives to regular plastic can be found as well.

Bioplastics are made from renewable biomass sources, such as plants, instead of from petroleum. But they are still plastic - the polymers that make up the plastic just come from another source.

"Regarding toxicity," Wagner says, "it's the same problem. We are in the dark as to which chemicals are used in the bioplastics as well."

The PlastX team also investigated products made of polylactic acid (PLA), a common type of bioplastic. They found toxicity in all of them.

Even plastic products that seem similar may contain different substances.

"We examined four different yogurt cups and found toxicity in two of them, but not in the other two," Zimmermann says.

In other words, it is almost impossible for consumers to know whether a product is safe or not. This sounds discouraging, but there is something you can do.

The PlastX researchers suggest:

Refuse to buy unnecessary plastics and reduce your plastics exposure and footprint, for example by buying fresh, unpackaged products.

When possible, avoid PVC products, which are labelled #3 in the recycling code, as well as all "other" plastics labelled #7, because it is not clear which materials they are made from.

Consumers can and should demand safer plastics. One way is to ask retailers for transparency regarding what materials a product is made of and which chemicals are in the product.

Ultimately, Wagner says, it is the responsibility of manufacturers and retailers to improve the chemical safety of their products.

"We found that there was at least one product that was far less toxic than others made from the same material," Wagner says. "This is encouraging because it shows that safer plastics are already on the market."

That means manufacturers could conceivably make plastics that do not contain toxic chemicals. Wagner says that big retail chains and brand owners have the power to make that happen. However, at this stage, transparency regarding the chemical composition remains an issue that must be resolved.

Wagner calls on policymakers to make the safety of plastics to which the public is regularly exposed a priority. "We need to avoid demonizing plastics. But given that we live in the plastic age, we need to make sure they don't affect our health," he argues.

This requires taking action, from consumer choices in the supermarket to the international level.

Credit: 
Norwegian University of Science and Technology

Short-term study suggests vegan diet can boost gut microbes related to body weight, body composition and blood sugar control

New research presented at this year's Annual Meeting of the European Association for the Study of Diabetes (EASD) in Barcelona, Spain (16-20 Sept) suggests that a 16-week vegan diet can boost the gut microbes that are related to improvements in body weight, body composition and blood sugar control. The study is by Dr Hana Kahleova, Physicians Committee for Responsible Medicine (PCRM), Washington, DC, USA, and colleagues.

Gut microbiota play an important role in weight regulation, the development of metabolic syndrome, and type 2 diabetes. The aim of this study was to test the effect of a 16-week plant-based diet on gut microbiota composition, body weight, body composition, and insulin resistance in overweight adults with no history of diabetes.

The study included 147 participants (86% women and 14% men; mean age was 55.6±11.3 years), who were randomised to follow a low-fat vegan diet (n=73) or to make no changes to their diet (n=74) for 16 weeks. At baseline and 16 weeks, gut microbiota composition was assessed, using uBiome kits. Dual energy X-ray absorptiometry was used to measure body composition. A standard method called the PREDIM index was used to assess insulin sensitivity.

Following the 16-week study, body weight was reduced significantly in the vegan group (treatment effect average -5.8 kg), particularly due to a reduction in fat mass (average -3.9 kg) and in visceral fat. Insulin sensitivity also increased significantly in the vegan group.

The relative abundance of Faecalibacterium prausnitzii increased in the vegan group (treatment effect +4.8%). Relative changes in Faecalibacterium prausnitzii were associated with decreases in body weight, fat mass and visceral fat. The relative abundance of Bacteroides fragilis also increased in the vegan group (treatment effect +19.5%). Relative changes in Bacteroides fragilis were associated with decreases in body weight, fat mass and visceral fat, and increases in insulin sensitivity.

The authors conclude: "A 16-week low-fat vegan dietary intervention induced changes in gut microbiota that were related to changes in weight, body composition and insulin sensitivity in overweight adults."

However, the authors acknowledge that further work is needed to separate out the effects of the vegan diet itself from those of the reduced calories. They say: "A plant-based diet has been shown to be effective in weight management, and in diabetes prevention and treatment. This study has explored the link between changes in the gut microbiome, and changes in body weight, body composition, and insulin sensitivity. We have demonstrated that a plant-based diet elicited changes in gut microbiome that were associated with weight loss, reduction in fat mass and visceral fat volume, and increase in insulin sensitivity."

They add: "The main shift in the gut microbiome composition was due to an increased relative content of short-chain fatty acid producing bacteria that feed on fibre. Therefore, high dietary fibre content seems to be essential for the changes observed in our study. We plan to compare the effects of a vegan and a standard portion-controlled diet on gut microbiome in people with type 2 diabetes, in order to separate out the positive effects of the reduced calories in the diet from those caused by the vegan composition of the diet."

They continue: "This is a fascinating area of research and we have been collecting data from more study participants. We hope we will be able to present them at next year's 2020 EASD meeting."

The authors say that fibre is the most important component of plant foods that promotes a healthy gut microbiome. Faecalibacterium prausnitzii is one of the short-chain fatty acid-producing bacteria, which degrade complex plant sugars and starch to produce health-promoting butyrate and other short-chain fatty acids that have been found to have a beneficial effect on body weight, body composition, and insulin sensitivity. The authors say: "Eating more fibre is the number one dietary recommendation for a healthy gut microbiome."

Credit: 
Diabetologia

Study finds virtual reality training could improve employee safety

A new study suggests employee safety could be improved through use of Virtual Reality (VR) in Health and Safety training, such as fire evacuation drills.

The Human Factors Research Group at the University of Nottingham developed an immersive VR system to stimulate participants' perception of temperature and their senses of smell, sight and hearing, to explore how they behaved during two health and safety training scenarios: an emergency evacuation in the event of a fire and a fuel leak.

In one scenario, participants had to evacuate from a virtual fire in an office, seeing and hearing the environment through a VR headset, but they could also feel heat from three 2kW heaters and smell smoke from a scent diffuser, creating a multisensory virtual environment. This group was compared against another group who experienced the same scenario using only the audio-visual elements of VR.

Observing real life behaviours

Previous research on human behaviour during real-world fire incidents has shown that a lack of understanding of the spread and movement of fire often means that occupants are unprepared and misjudge appropriate actions. Immersive health and safety training enables employers to train people about hazards and hazardous environments without putting anyone at risk.

The Nottingham research, funded by the Institution of Occupational Safety and Health (IOSH), found contrasts between the groups in the way participants reacted to the scenario. Those in the multi-sensory group had a greater sense of urgency, reflecting a real-life scenario, and were more likely to avoid the virtual fires. Evidence from the audio-visual participants suggested that they were treating the experience more like a game and behaviours were less consistent with those expected in a real world situation.

Dr Glyn Lawson, Associate Professor in the Faculty of Engineering, University of Nottingham, said: "Health and safety training can fail to motivate and engage employees and can lack relevance to real-life contexts. Our research, which has been funded by the Institution of Occupational Safety and Health, suggests that virtual environments can help address these issues, by increasing trainees' engagement and willingness to participate in further training. There are also business benefits associated with the use of virtual environment training, such as the ability to deliver training at or near the workplace and at a time that is convenient to the employee."

Virtual Reality vs. PowerPoint

A further test was done, as part of the study, to measure the effectiveness of VR training versus traditional PowerPoint training. Participants took questionnaires, testing their knowledge on either fire safety or safe vehicle disassembly procedure, before and after training as well as one week later.

While those trained via PowerPoint appeared to have gained more knowledge when tested directly after training, their knowledge scores dropped significantly more when they were retested one week later. In comparison, the VR group retained knowledge better over the long term and reported higher levels of engagement, more positive attitudes to occupational safety and health, and greater willingness to undertake training in the future.

The research suggests that the increased cognitive engagement of learning in the virtual environment creates more established and comprehensive mental models which can improve recall, and implies that testing an employee's knowledge immediately following health and safety training may not be an effective means of gauging long-term knowledge of health and safety.

Applications to the workplace

Mary Ogungbeje, Research Manager at IOSH, said: "The wheels are turning so that virtual and smart learning is increasingly ingrained in the workplace and everyday life.

"Technology is continuously advancing and in many cases becoming more affordable, so this study gives us a taste of what's to come. By improving training strategies with the use of technology and stimulated sensory experiences, we are heading in a direction where the workforce will not just enjoy a more immersive and interesting training course but participate in an effective learning experience, so they are better prepared and equipped to stay safe, healthy and well at work."

The researchers conducted meetings, discussions, and visits with partners including Rolls-Royce, for expert advice around fire safety and safe handling of hazardous chemicals. The University of Nottingham's Health and Safety advisors also contributed to help the researchers better understand how the training may be implemented in industry.

The study aims to produce evidence-based guidance for the development and use of virtual environments in engaging and effective training using cost-effective and accessible solutions. The full study features in a report, titled 'Immersive virtual worlds: Multisensory virtual environments for health and safety training', to be released at the IOSH's annual conference on Tuesday 17 September.

Credit: 
University of Nottingham

Renegade genes caught red-handed

image: Potentially dangerous genes embedded within human DNA were once thought to be locked down by helpful DNA structures called heterochromatin. A University of Arizona researcher disputes that belief and hopes to change the paradigm even further.

Image: 
Kelvin Pond

The guardians of the human genome that work to prevent potentially disease-causing gene expression might not be as effective at their jobs as previously thought, according to new University of Arizona research.

Human chromosomes are made up of DNA, about half of which includes ancient remnants of a type of virus called transposons. Also known as "jumping genes," transposons have the potential to attack other parts of the genome and cause mutations and damage if they're ever free to be expressed.

"To keep transposons from disrupting how our genes function, cells create a structure called heterochromatin," said Keith Maggert, UA associate professor of cellular and molecular medicine and member of the UA Cancer Center.

Heterochromatin, the guardian that basically handcuffs the dangerous transposons in a compact tangle of DNA strands, prevents transposons from being copied or expressed. But researchers are still working to understand the fundamentals of heterochromatin, and much of what is thought is based on assumption, Maggert said.

One assumption was that once heterochromatin formed on a transposon, it was stable, locked up for good. But research done by Maggert and his team suggests that even early on, some cells fail to silence the transposons, and even the silenced ones aren't completely quiet.

"People thought heterochromatin was good at its job," said Maggert, lead author on the paper published today in the journal Proceedings of the National Academy of Sciences. "But heterochromatin makes mistakes, and so it slips from time to time, flickering on and off constantly. Each time it drops the ball, we're at risk, and certain environmental conditions can lead to increased instability."

It's Maggert's ongoing hypothesis that transposons can do their damage as heterochromatin flickers on and off and becomes unstable, allowing for errors in DNA transcription and, ultimately, the emergence of diseases such as cancer. More research is required to confirm this linkage, Maggert stressed.

Until now, studies were only able to assess how genes were expressed at the end of a fruit fly's life, but the team's novel methods allowed for the observation of heterochromatin activity throughout cell development.

"We designed a system to track the history of silencing (flickering) throughout an organism's life," Maggert said. "Imagine a game of pinball. Before, we could only see if the ball finally went in the hole, but we couldn't see all the interesting stuff - the ball bouncing against the weird bumpers, what path it took, what it did before reaching that final state. The flickering on and off was a total mystery no one expected."

It took the team four years to create the experiment, which combined biological observations of fruit flies, a novel mathematical model and genes borrowed from yeast, jellyfish and coral. Yet it took just 24 hours to get the results, which have implications for how human heterochromatin functions.

"It was an exciting day, but it was also a very scary day to come in and look at the results," Maggert said.

Not only has his team learned that heterochromatin is surprisingly unstable, but "the very unusual thing about heterochromatin is it seems to 'remember' whether it's been strong or weak (handcuffed or not) even after a cell divides," Maggert said. "It's wiped away during cell division then restored afterward. This is called epigenetics - the term for silencing transposons."

No one yet understands how it happens.

"I've always been fascinated by the idea of epigenetics, and when I started working on it as a graduate student, I was totally on board. But as I've done more and more research - it's now been 20 years - I think it's all wrong," he said. "How did I lose the faith? Memory requires a mechanism, and a lot of smart people have been looking for one for 20 years and have come up empty handed. Lack of evidence is not the evidence of the lack, but at some point, you start to think, maybe we won't find it."

Next, Maggert, who is in year three of a five-year $2.5 million Transformative Research Award grant from the National Institutes of Health, hopes to complete a study where he explains how heterochromatin memory works without invoking epigenetics.

Credit: 
University of Arizona

Taking evolution to heart

image: A chimpanzee echocardiogram is performed by Aimee Drane from the International Primate Heart Project.

Image: 
Robert Shave

An international research group at UBC, Harvard University and Cardiff Metropolitan University has discovered how the human heart has adapted to support endurance physical activities.

This research examines how the human heart has evolved and how it adapts in response to different physical challenges, and will bring new ammunition to the international effort to reduce hypertensive heart disease--one of the most common causes of illness and death in the developed world.

The landmark study analyzed 160 humans, 43 chimpanzees and five gorillas to gain an understanding of how the heart responds to different types of physical activity. In collaboration with Harvard University's Daniel Lieberman and Aaron Baggish, UBC Professor Robert Shave and colleagues compared left ventricle structure and function in chimpanzees and a variety of people, including some who were sedentary but disease-free, highly active Native American subsistence farmers, resistance-trained football linemen and endurance-trained long-distance runners.

The wide variety of participants was specifically recruited in order to examine cardiac function in an evolutionary context. From the athletic stadium to wildlife sanctuaries in Africa, the team measured a diverse array of cardiac characteristics and responses to determine how habitual physical activity patterns, or a lack of activity, influence cardiac structure and function, explains Shave.

"While apes showed adaptations to support the pressure challenge associated with activities such as climbing and fighting, humans showed more endurance-related adaptations," says Shave, director of UBCO's School of Health and Exercise Sciences.

Guiding their inquiry is the well-known idea that the heart remodels itself in response to different physiological challenges, he notes.

"Moderate-intensity endurance activities such as walking and running stimulate the left ventricular chamber to become larger, longer and more elastic--making it able to handle high volumes of blood," he says. "But pressure challenges like chronic weight-lifting or high blood pressure stimulate thickening and stiffening of the left ventricular walls."

Among humans, the research team showed there is a trade-off between these two types of adaptations. This trade-off means that people who have adapted to pressure cannot cope as well with volume and vice versa. Basically, the hearts of endurance runners aren't great at dealing with a pressure challenge, and the weight lifter's heart will not respond well to increases in volume.

This new research provides evidence that the human heart evolved to support moderate-intensity endurance activities, but adapts to different patterns of physical activity and inactivity.

"As a result, today's epidemic of physical inactivity in conjunction with highly processed, high-sodium diets contributes to thicker, stiffer hearts that compromise the heart's ability to cope with endurance physical activity, and importantly this may start to occur prior to increases in resting blood pressure," explains Shave.

This is often followed by the onset of high blood pressure and can eventually lead to hypertensive heart disease.

"We hope our research will inform those at highest risk of developing hypertensive heart disease," says Shave. "And ensure that moderately intense endurance-type activities are widely encouraged in order to ultimately prevent premature deaths."

Credit: 
University of British Columbia Okanagan campus

How gut bacteria negatively influence blood sugar levels

Millions of people around the world experience serious blood sugar problems that can lead to diabetes, but a world-first study reveals how gut bacteria affect serotonin, normally a feel-good hormone, to negatively influence blood sugar levels.

Serotonin, a neurotransmitter in the brain, is nicknamed the 'happy hormone' and is normally linked with regulating sleep, well-being and metabolism. But the gut actually produces 95 percent of the body's serotonin, and not in the 'happy' form we know from the brain.

In a study published in the leading international journal Proceedings of the National Academy of Sciences (PNAS) today, researchers from Flinders, SAHMRI, and McMaster University in Canada show exactly how bacteria living in the guts of mice, collectively known as the microbiome, communicate with serotonin-producing cells to influence blood sugar levels in the host body.

Professor Damien Keating, Head of Molecular and Cellular Physiology at Flinders University and Deputy Director of the Flinders Health and Medical Research Institute, says the study sheds light on the unanswered question of exactly how bacteria in the microbiome communicate with the host to control blood glucose levels.

"We found that the microbiome worsens our metabolism by signalling to cells in the gut that produce serotonin. They drive up serotonin levels, which we previously showed to be increased in obese humans, and this rise in blood serotonin causes significant metabolic problems."

"The next step will be to understand exactly which bacteria do this, and how, in the hope that this could lead to new approaches to regulating blood sugar levels in humans," says Professor Keating.

This study is the first to show how the microbiome, the community of bacteria that lives in the gut, effectively communicates with an organism to affect its host's metabolism.

If researchers can better understand which bacteria signal the gut to produce serotonin, treatments could one day be developed to reduce blood sugar levels; this study is a first step towards understanding that process.

"This is an exciting revelation that can one day have direct implications for human health disorders such as diabetes, but much more research like this is required in the years to come."

Credit: 
Flinders University

For lemurs, sex role reversal may get its start in the womb

image: A mother lemur's pregnancy hormones affect her daughter's aggression later in life, researchers report.

Image: 
Photo by David Haring, Duke Lemur Center

DURHAM, N.C. -- Anyone who says females are the 'gentle sex' has never met a lemur. Lady lemurs get first dibs on food, steal their mates' favorite sleeping spots and even attack males, swatting or biting those that annoy them.

What gives these female primates the urge and ability to reign supreme while the meeker males give in or get out of the way? New research suggests that female lemurs' bullying behavior may get programmed early, before birth.

A Duke University study shows that a mother lemur's hormone levels during pregnancy can have long-term effects on her daughters. Female fetuses that are naturally exposed to higher doses of a sex hormone called androstenedione while still in the womb grow up to be more aggressive as adults, researchers report.

Understanding female domination in lemurs has been a puzzle, said lead author Nicholas Grebe, a postdoctoral associate working with Duke professor Christine Drea. Female lemurs are no bigger or brawnier than their male counterparts. Nor are they better armed with horns or tusks like some male animals.

Testosterone levels are significantly lower in females than males too. If the "aggression hormone" doesn't explain females' drive to run the show, the researchers wondered, could other hormonal pathways shape their behavior in some way?

For the study, Grebe, Drea and colleagues followed 24 young lemurs from three to 30 months of age, collecting blood and observing their behavior from infancy to adulthood at the Duke Lemur Center.

The researchers watched the lemurs play for 20 minutes at a time, noting every time they chased or wrestled with a playmate, who instigated and who was on the receiving end, and when their horseplay crossed the line from play fighting to real fighting.

The researchers also had blood samples collected from the lemur moms during each trimester of pregnancy.

Blood hormone test results confirmed that lemur males have much more testosterone than females: 17 times as much. Yet when it comes to being rough and rowdy, girl lemurs play-fight about as often as boys. And once they reach puberty, females but not males quickly cross a line from play to abuse.

The team found no link between a lemur's own hormone levels while it was growing up and whether it was more likely to be on the giving or the receiving end of aggression. But hormone exposure before birth was a different story.

The researchers found strong links between hormones circulating in a mother's bloodstream during late pregnancy and aggressive behavior in her offspring later on, particularly in females.

While in the womb, developing lemurs are bathed in high doses of the hormone androstenedione, known to increase in lemur moms during pregnancy. Female fetuses that were exposed to higher doses of androstenedione in utero were less likely to be bullied while growing up.

The findings suggest that prenatal exposure to the hormone has lasting effects on their development, permanently programming their bodies and brains in a more aggressive direction.

"The prenatal hormonal soup that lemurs were swimming in before they were born predicted their behavior later on in life," Grebe said.

Credit: 
Duke University

Hiding in plain sight

image: The common form of barnyard grass (top) has red stems, while the mimic has green stems -- more like rice.

Image: 
Jordan R. Brock/Washington University

Early rice growers unwittingly gave barnyard grass a big hand, helping to give root to a rice imitator that is now considered one of the world's worst agricultural weeds.

New research from Zhejiang University, the Chinese Academy of Sciences and Washington University in St. Louis provides genomic evidence that barnyard grass (Echinochloa crus-galli) benefited from human cultivation practices, including continuous hand weeding, as it spread from the Yangtze River region about 1,000 years ago.

Barnyard grass is a globally common invasive weed of cultivated row crops and cereals. The new study was published Sept. 16 in the journal Nature Ecology & Evolution.

"In Asia, rice farmers have traditionally planted and weeded their paddies by hand. Any weeds that stick out are easily detected and removed," said Kenneth Olsen, professor of biology in Arts & Sciences. "Over hundreds of generations, this has selected for some strains of barnyard grass that specialize on rice fields and very closely mimic rice plants. This allows them to escape detection."

Olsen collaborated on data analyses and interpretation for the new study. He is working with the study's corresponding author, Longjiang Fan of Zhejiang University, on other research related to rice evolutionary genomics and agricultural weed evolution.

This study sequenced the genomes of rice-mimic and non-mimic forms of the weed as a step towards understanding how this process has occurred.

This form of mimicry, called Vavilovian mimicry, is an adaptation of weeds to mimic domesticated plants. In the case of barnyard grass, the rice mimics grow upright like a rice plant instead of sprawling along the ground like most barnyard grass. They also have green stems like rice plants instead of the red stems more commonly found in the weed.

"With the advent of agriculture about 10,000 years ago, humans all over the planet began creating a wonderful habitat for naturally weedy plant species to exploit," Olsen said. "The most successful and aggressive agricultural weeds were those that evolved traits allowing them to escape detection and proliferate in this fertile new environment."

The researchers estimate that the mimic version of E. crus-galli emerged at about the same time that Chinese historical records indicate the regional economic center was shifting from the Yellow River basin to the Yangtze River basin. During this period of the Song Dynasty, human populations were growing rapidly and demand for rice as the staple grain was paramount. This is also the time when a quick-maturing, drought-resistant variety of rice called Champa rice was introduced to the Yangtze basin from Southeast Asia to allow two harvests in a year. Under these conditions, weed management in paddies might well have intensified.

However, while common barnyard grass is a major agricultural weed in the U.S., the rice mimic form has never become widespread in the main rice growing region -- the southern Mississippi valley.

Olsen speculates that this is because U.S. rice farmers rely on mechanized farming instead of hand labor.

"Without farmers out in the fields planting and weeding by hand, there's not such strong selection for weeds to visually blend in with the rice crop," he said.

Credit: 
Washington University in St. Louis