
Do animals control Earth's oxygen level?

image: The history of animal life and the environment is preserved in rocks that formed in ancient oceans.

Image: 
Artem Kouchinsky

Around 540 million years ago, there was a huge boom in the diversity of animal life on Earth. The first large animals evolved in what is today known as the Cambrian explosion. In the time that followed, animals continued to evolve and grow larger, but alongside their evolution the oxygen level in the atmosphere dropped, and this temporarily slowed the radiation. Subsequent oxygenation and the growth of algae, however, added energy to the food chain and set the explosion of life going again.

In a new scientific study, researchers from the GLOBE Institute at the Faculty of Health and Medical Sciences, University of Copenhagen, have now found that the animals themselves probably contributed to an adjustment of the oxygen level and thus indirectly controlled their own development.

'For the first time, we have succeeded in measuring 'Earth's heartbeat' - understood as the dynamics between the oxygen level and productivity on Earth. We have found that it is not just the environment and the oxygen level that affect the animals; most likely, the animals also affect the oxygen level', says Associate Professor Tais Wittchen Dahl from the GLOBE Institute.

To understand what controls the oxygen level on Earth, the researchers looked at limestone deposited on the ocean floor during the Cambrian explosion 540-520 million years ago. The ratio of uranium-238 to uranium-235 in the ancient limestone reveals how much oxygen there was in the oceans at that time. The researchers were thus able to see massive fluctuations between two extreme conditions, in which the ocean floor was covered by oxygenated or oxygen-depleted bodies of water, respectively. It is these global-scale fluctuations that they believe the animals themselves contributed to.

During the Cambrian explosion, marine animals evolved rapidly. They became larger, began to move on the ocean floor, ate each other and developed skeletons and shells. The new ability to move is particularly interesting because the animals ploughed through the mud on the ocean floor, and - as a result - much of the phosphate contained in the water became bound in the ocean floor instead. Phosphate is a nutrient for algae in the oceans, and algae perform photosynthesis, which produces oxygen.

'Less phosphate produced fewer algae, which over geological time led to less oxygen on Earth, and due to the oxygen-poor conditions, the larger animals moved away. Once the animals were gone, the oxygen level could go up again and create favourable living conditions, and then the process repeated itself', explains Tais Wittchen Dahl.
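The boom-and-bust cycle Dahl describes (animals bury phosphate, oxygen falls, animals retreat, oxygen recovers) can be sketched as a toy feedback simulation. This is purely illustrative: the variables, rates, and units below are hypothetical and are not taken from the study, which inferred the dynamics from uranium isotopes in limestone.

```python
# Toy negative-feedback model of the animal-phosphate-oxygen loop.
# All quantities are dimensionless values in [0, 1]; rates are arbitrary.

def step(oxygen, animals):
    # Burrowing animals thrive when oxygen is above a comfort threshold...
    animals += 0.1 * (oxygen - 0.5)
    animals = min(max(animals, 0.0), 1.0)
    # ...but their ploughing buries phosphate, so more animals means
    # less dissolved phosphate, fewer algae, and less oxygen production.
    phosphate = 1.0 - animals
    oxygen += 0.1 * (phosphate - oxygen)
    return oxygen, animals

oxygen, animals = 0.8, 0.2
history = []
for _ in range(200):
    oxygen, animals = step(oxygen, animals)
    history.append(oxygen)
```

Because each variable suppresses the other with a delay, the oxygen level in this sketch rises and falls in damped swings rather than settling immediately, a crude analogue of the fluctuations read from the uranium isotope record.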

'In this way, the mud-burrowing animals themselves helped control the oxygen level and slow down the otherwise explosive evolution of life. It is entirely new that we can show such dynamics between the animals and the environment to be probable. And it is a very important discovery for understanding the mechanisms that control the oxygen level on Earth'.

Understanding the mechanisms that control the oxygen level on our planet is not just important for life on Earth. A better understanding of the dynamics between oxygen and life - Earth's heartbeat - will also bring us closer to an understanding of possible life on other planets.

'Oxygen is a biomarker - some of what you look for when you look for life elsewhere in the universe. And if life in itself helps control the oxygen level, it is much more likely that there will also be life in places where oxygen is present', says Tais Wittchen Dahl.

Interpreting these million-year-old dynamics is the closest we can come to conducting a global experiment. As it is not possible to test how the global oxygen level might be influenced today, scientists must instead turn to the past to understand the dynamics that make up Earth's heartbeat - and in this way perhaps make it a little easier to understand life on our own planet and on others.

Credit: 
University of Copenhagen - The Faculty of Health and Medical Sciences

Colorful microreactors utilize sunlight

The sun is the most sustainable energy source available on our planet and could be used to power photochemical reactions. In the journal Angewandte Chemie, scientists present a widely applicable, cost-effective photomicroreactor. It is based on "Luminescent Solar Concentrators", which harvest, convert, and make photons available for chemical reactions. Thus, the researchers were able to synthesize various substances, including two pharmaceuticals.

To date, research into the use of sunlight has focused on solar electricity, solar thermal, and solar fuels, while the solar-powered synthesis of chemicals is still in its infancy. Light energy can power chemical reactions; for example, by moving a catalyst into an excited state and thereby accelerating a reaction or even making it possible in the first place. However, the sun as a light source is disadvantageous in certain respects because the bulk of the solar spectral irradiance (the radiant flux received by a surface per unit area) falls within the relatively narrow visible range. Moreover, fluctuations in irradiance are caused by phenomena such as passing clouds.

The scientists from the Eindhoven University of Technology (the Netherlands) and the Max-Planck Institute of Colloids and Interfaces (Potsdam, Germany) now show for the first time that a diverse set of photon-driven transformations can be efficiently powered by solar irradiation. The secret of success is their specially designed, cost-effective "photomicroreactor" based on luminescent solar concentrators (LSCs).

The LSCs consist of light-guiding slabs made of polymethylmethacrylate (PMMA) doped with special luminophores that capture photons from the solar spectrum and subsequently re-emit them as fluorescence at a longer wavelength characteristic of the luminophore. In this way, the sunlight is concentrated into a narrow wavelength range, and daylight- and weather-dependent fluctuations of the spectral distribution become negligible.

Tiny channels made of a solvent-resistant polymer, which contain the reaction mixture, are embedded in the LSC slabs. A light sensor that monitors the light intensity is connected to an integrated circuit that autonomously adjusts the flow rate of the mixture: the lower the light intensity, the more slowly the mixture passes through the channel, thereby receiving the light dose needed for an adequate reaction yield. In this way, fluctuations in solar irradiance are compensated for and the quality of the product remains consistent.
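The controller logic described above amounts to keeping the photon dose per unit of reaction mixture constant. A minimal sketch of that idea follows; the function, constants, and units are hypothetical illustrations, not the actual device firmware.

```python
# Keep (light intensity) x (residence time) constant so every slug of
# reaction mixture receives the same photon dose.

TARGET_DOSE = 120.0   # required light dose per unit volume (arbitrary units)
CHANNEL_VOLUME = 1.0  # reactor channel volume (arbitrary units)

def flow_rate(light_intensity: float) -> float:
    """Residence time must satisfy intensity * time = TARGET_DOSE,
    so the flow rate scales linearly with the measured intensity."""
    residence_time = TARGET_DOSE / light_intensity
    return CHANNEL_VOLUME / residence_time  # volume per unit time

# A passing cloud halves the intensity, so the controller halves the
# flow, doubling residence time and keeping the dose constant.
print(flow_rate(60.0))  # full sun: 0.5
print(flow_rate(30.0))  # cloudy: 0.25
```

The linear scaling is the whole trick: because dose = intensity x time, halving the intensity while halving the flow leaves the product, and hence the reaction yield, unchanged.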

The selection of the doping luminophores depends on the wavelength needed for excitation of the catalyst. The team led by Timothy Noel generated red, green, and blue LSC reactors for reactions catalyzed by the photocatalysts methylene blue for the red device, eosin Y and rose Bengal for the green, and ruthenium-based metal complexes for the blue reactor. "Using these devices, we succeeded in synthesizing the antiworm drug ascaridole and an intermediate of the antimalarial drug artemisinin, among others," says Noel. "A solar-based production approach is of high interest for products with high added value, such as fine chemicals, drugs, and fragrances. It would be particularly suited to limited-resource settings."

Credit: 
Wiley

Researchers find earliest evidence of milk consumption

image: A jaw bone used in the study -- from the collections of the Dorset County Museum.

Image: 
Dr Sophy Charlton, University of York

Researchers have found the earliest direct evidence of milk consumption anywhere in the world in the teeth of prehistoric British farmers.

The research team, led by archaeologists at the University of York, identified a milk protein called beta lactoglobulin (BLG) entombed in the mineralised dental plaque of seven individuals who lived in the Neolithic period, around 6,000 years ago.

The human dental plaque samples in the study are the oldest to be analysed for ancient proteins to date globally and the study represents the earliest identification of the milk whey protein BLG so far.

The Neolithic period in Britain ran from 4,000 to 2,400 cal. BC and saw the emergence of farming, with the use of domesticated animals such as cows, sheep, pigs and goats, alongside crops such as wheat and barley. Archaeologists have also discovered evidence of complex cultural practices, with Neolithic communities building large monumental and burial sites.

The ancient human remains tested in the study come from three different Neolithic sites - Hambledon Hill and Hazleton North in the south of England, and Banbury Lane in the East Midlands. Individuals from all three sites showed the presence of milk proteins from cows, sheep or goats, suggesting people were exploiting multiple species for dairy products.

Dental plaque can offer unique insights into the diets of ancient people because dietary proteins are entrapped within it when it is mineralised by components of saliva to form tartar or 'dental calculus'.

Lead author of the study, Dr Sophy Charlton, from the Department of Archaeology at the University of York, said: "The fact that we found this protein in the dental calculus of individuals from three different Neolithic sites may suggest that dairy consumption was a widespread dietary practice in the past.

"It would be a fascinating avenue for further research to look at more individuals and see if we can determine whether there are any patterns as to who was consuming milk in the archaeological past - perhaps the amount of dairy products consumed or the animals utilised varied along the lines of sex, gender, age or social standing."

The discovery of milk proteins is particularly interesting as recent genetic studies suggest that people who lived at this time did not yet have the ability to digest the lactose in milk. To get around this, the ancient farmers may have been drinking just small amounts of milk or processing it into other foodstuffs such as cheese (which removes most of the lactose), the researchers say.

'Lactase persistence', which allows for the continued consumption of milk into adulthood, is the result of a genetic mutation in a section of DNA that controls the activity of the lactase gene. However, the mechanisms behind how and when we evolved this ability remain a mystery.

Dr Charlton added: "Because drinking any more than very small amounts of milk would have made people from this period really quite ill, these early farmers may have been processing milk, perhaps into foodstuffs such as cheese, to reduce its lactose content."

"Identifying more ancient individuals with evidence of BLG in the future may provide further insights into milk consumption and processing in the past, and increase our understanding of how genetics and culture have interacted to produce lactase persistence."

Credit: 
University of York

Software companies follow the skills and move where the staff are

Software companies are more likely to base their operations in locations where skilled potential recruits already work - rather than staff moving to new areas for fresh opportunities.

New research from Lancaster University, the University of Illinois and Texas Tech University shows new firms commonly establish headquarters within a mile of existing software businesses. However, while this makes recruitment easier, there is no evidence of increased profitability as a result.

The study, published in the Journal of Economic Behavior & Organization, looks at software firms in Texas over a six-year period. There were 431 new establishments in this time, and software firms regularly set up close to established businesses, with an average of 10 within one mile, employing 394 workers between them.

Across the sample period, the number of establishments dropped from a high of 648 in late 2002 to 581 at the end of 2006, but employment rose from 16,600 to 21,000. The research found sustained employment in the software sector in the areas of business clustering came not from the growth of firms, but rather from new companies taking on employees from those already in place.

The study suggests that new firms setting up close to established businesses are in fact negatively affected by the presence of other firms in terms of the total jobs created. These companies do not grow faster than those located in isolation; rather, job losses at one business are more likely to be compensated for by other firms capturing the employees, which explains the persistence of employment within the sector in certain areas.

"We found that clusters of software establishments within small geographic areas are highly persistent, as is employment across the sector - though not within individual companies," said co-author Professor Dakshina De Silva, of Lancaster University Management School.

"The industry is dynamic, with a large turnover of establishments and substantial changes in their scale - approximately half of those in operation at the start of our study had exited by the end.

"Despite this, and the great potential for the distribution of firms' locations to change, almost all the employment within the software industry at the end of our six-year study was in areas with active establishments at the start of the period. This was even though employment in those already-established firms declined substantially and the majority of jobs present at the start were lost, with only 7,015 of the initial 16,645 jobs surviving in the same establishments.

"Instead, many of the jobs lost by one software establishment were captured by another firm less than a mile away; nearby firms act as a sponge for jobs lost at neighbouring companies."

The study suggests that companies are more likely to locate close to competitors as it makes recruiting skilled workers easier, with a ready supply of employees on tap, who are more likely to switch companies if they do not have to relocate. There is also the effect of start-up firms being established by employees leaving other companies.

"Software companies tend to enter into locations less than a mile away from other software establishments, and view these locations as significantly more attractive than those further away from other firms in the sector," said co-author Professor George Deltas, of the University of Illinois.

"There are lower costs for employees who switch between establishments, as they can retain their living and commuting arrangements. The large pool of software engineers and programmers within commuting distance allows a firm to expand more easily by poaching the employees of other firms, which reduces the set-up and subsequent recruitment costs.

"New firms would find it advantageous to locate close to their competitors as it is often harder to induce employees to relocate, both because of the upfront costs and because moving would see them forgo the option to work for other employers in that same area at a later time."

Associate Professor Robert McComb, of Texas Tech University, added: "In the software industry, jobs go where the workers are, rather than the other way around. This leaves cities that want to become magnets for an industry in a conundrum - they need concerted efforts to attract multiple establishments to a location in a short period of time, which is a daunting task."

Credit: 
Lancaster University

Brain circuit controls individual responses to temptation in rats

image: One of the rats from the eLife study by Campus et al.

Image: 
Flagel Lab

Differences in a key brain circuit may suggest why some individuals are less able than others to resist tempting cues, according to a study in rats published today in eLife.

The findings help explain the results of previous behavioral studies in rats which showed that some are more attracted to cues associated with food. The current work may have important implications for scientists trying to understand why some people have a harder time resisting cravings triggered by sights, sounds or places.

"Those who are unduly attracted to reward-associated stimuli are likely to be at greater risk of impulse control disorders, including eating disorders and substance abuse," says lead author Paolo Campus, PhD, Postdoctoral Research Fellow at the Molecular and Behavioral Neuroscience Institute, University of Michigan in Ann Arbor, US. "We wanted to investigate which brain regions and chemicals may render an individual more susceptible to these conditions."

To do this, the team used a combination of genetic engineering and a drug, otherwise known as chemogenetics, to 'turn on' the brain circuit that connects the prelimbic cortex with the paraventricular nucleus of the thalamus in rats that find it difficult to resist food-related cues. They then exposed the animals to these food cues and found that their attraction was reduced. Turning on this circuit in rats that were less responsive to food cues had no effect on their behaviour.

Conversely, using this technique to 'turn off' the circuit in rats that were previously uninterested in food-linked cues made them instead seek out these cues. The team also found that turning on this pathway in animals that were indifferent to these cues increased the amount of dopamine in their brain - a chemical that makes reward stimuli more attractive.

"These findings suggest that the circuit reduces the incentive value of a food cue," Campus explains. "It does this by controlling processes in the brain that contribute to the association of a cue with a reward, such as the release of dopamine."

Currently, the techniques used in this study to manipulate the brain cannot be applied to humans. Future research will therefore be needed to confirm that this circuit is essential to regulating reward cue-related behaviour in humans also. "This may help scientists better understand why some people develop substance abuse disorders, overeating, compulsive gambling or other impulse control disorders," notes senior author Shelly Flagel, PhD, Research Associate Professor at the Molecular and Behavioral Neuroscience Institute, University of Michigan in Ann Arbor. "This brain circuit could potentially be a target for new therapies to treat or prevent such conditions."

Credit: 
eLife

Sex for cooperation

image: Same-sex sexual behavior in female bonobos increases friendly social interactions, including cooperation.

Image: 
Zanna Clay, Lola ya Bonobo Sanctuary

Of our two closest phylogenetic relatives, chimpanzees remain by far the more thoroughly studied and widely recognized species, known for their high levels of cooperation, especially among males, which include sharing food, supporting each other in aggressive conflicts and defending their territories against other communities. In contrast, insights into the social dynamics of wild bonobos are available from only a small number of long-term field sites, and bonobos are probably best known for their diverse sexual behavior, which, together with their proposed peacefulness between communities and co-dominance between the sexes, has earned them the nickname of 'hippie apes'.

The stereotype of bonobos as hyper-sexual is an over-simplification, but it does capture a fascinating aspect of bonobo social behavior. Bonobos are one of the few species in which all adult members of one sex engage in habitual same-sex sexual interactions that occur at frequencies similar to, or even greater than, those of opposite-sex interactions. In the wild, all adult females perform same-sex genital contacts, known as genito-genital rubbing (or GG-rubbing), on a regular basis with many other females in their community. In contrast, male bonobos rarely engage in same-sex sexual behavior. There are several theories to explain the function of same-sex sexual behavior in bonobos, including as a way to reduce social tension, prevent aggression or form social bonds. However, none of these theories can explain why such behavior occurs so frequently only among females.

To clarify why same-sex sexual behavior is so important specifically for female bonobos, we collected behavioral and hormonal data for over a year from all adult members of a habituated bonobo community at the long-term LuiKotale field site in the Democratic Republic of Congo. In addition to our focus on sexual interactions, we identified preferred partners for other social activities such as giving support in conflicts. We also collected urine to measure the hormone oxytocin, which in other species is released in the body after friendly social interactions, including sex, and helps to promote cooperation.

We found that in competitive situations, females preferred to have sex with other females rather than with males. After sex, females often remained closer to each other than did mixed-sex pairs, and females had measurable increases in urinary oxytocin following sex with females, but not following sex with males. Among same-sex and opposite-sex pairs, individuals who had more sex also supported each other more often in conflicts, but the majority of these coalitions were formed among females. "It may be that a greater motivation for cooperation among females, mediated physiologically by oxytocin, is the key to understanding how females attain high dominance ranks in bonobo society", explained co-lead author Martin Surbeck, a researcher at the Max Planck Institute for Evolutionary Anthropology and Harvard University.

For humans as well, alliances between members of the same sex provide many benefits, including mutual social support and sharing of resources. There is also historical and cross-cultural evidence that such alliances are often reinforced through sexual interactions. "While it is important to not equate human homosexuality with same-sex sexual behavior in animals", cautions co-lead author Liza Moscovice, a researcher at the Leibniz Institute for Farm Animal Biology, "our study suggests that in both humans and a close phylogenetic relative, the evolution of same-sex sexual behavior may have provided new pathways to promote high levels of cooperation."

Credit: 
Max Planck Institute for Evolutionary Anthropology

Precious metal flecks could be catalyst for better cancer therapies

Tiny extracts of a precious metal used widely in industry could play a vital role in new cancer therapies.

Researchers have found a way to dispatch minute fragments of palladium - a key component in motor manufacture, electronics and the oil industry - inside cancerous cells.

Scientists have long known that the metal, used in catalytic converters to detoxify exhaust, could be used to aid cancer treatment but, until now, have been unable to deliver it to affected areas.

A molecular shuttle system that targets specific cancer cells has been created by a team at the University of Edinburgh and the Universidad de Zaragoza in Spain.

The new method, which exploits palladium's ability to accelerate - or catalyse - chemical reactions, mimics the process some viruses use to cross cell membranes and spread infection.

The team has used bubble-like pouches that resemble the biological carriers known as exosomes, which can transport essential proteins and genetic material between cells. These exosomes exit and enter cells, dump their content, and influence how the cells behave.

This targeted transport system, which is also exploited by some viruses to spread infection to other cells and tissues, inspired the team to investigate their use as shuttles of therapeutics.

The researchers have now shown that this complex communication network can be hijacked. The team created exosomes derived from lung cancer cells and cells associated with glioma - a tumour that occurs in the brain and spinal cord - and loaded them with palladium catalysts.

These artificial exosomes act as Trojan horses, taking the catalysts - which work in tandem with an existing cancer drug - straight to primary tumours and metastatic cells.

Having proved the concept in laboratory tests, the researchers have now been granted a patent that gives them exclusive rights to trial palladium-based therapies in medicine.

The study was funded by the Engineering and Physical Sciences Research Council and the European Research Council. It has been published in the journal, Nature Catalysis.

Professor Asier Unciti-Broceta, from the University of Edinburgh's CRUK Edinburgh Centre, said: "We have tricked exosomes naturally released by cancer cells into taking up a metal that will activate chemotherapy drugs just inside the cancer cells, which could leave healthy cells untouched."

Professor Jesús Santamaría, of the Universidad de Zaragoza, said: "This has the potential to be a very exciting technology. It could allow us to target the main tumour and metastatic cells, thus reducing the side effects of chemotherapy without compromising the treatment."

Credit: 
University of Edinburgh

Near misses at Large Hadron Collider shed light on the onset of gluon-dominated protons

image: Daniel Tapia Takaki of the University of Kansas at work at the Large Hadron Collider's Compact Muon Solenoid.

Image: 
Tapia Takaki

LAWRENCE -- New findings from University of Kansas experimental nuclear physicists Daniel Tapia Takaki and Aleksandr (Sasha) Bylinkin were just published in the European Physical Journal C. The paper centers on work at the Compact Muon Solenoid, an experiment at the Large Hadron Collider, to better understand the behavior of gluons.

Gluons are elementary particles that are responsible for "gluing" together quarks and anti-quarks to form protons and neutrons -- so, gluons play a role in about 98% of all the visible matter in the universe.

Previous experiments at the now-decommissioned HERA electron-proton collider found that when protons are accelerated close to light-speed, the density of gluons inside them increases very rapidly.

"In these cases, gluons split into pairs of gluons with lower energies, and such gluons split themselves subsequently, and so forth," said Tapia Takaki, KU associate professor of physics & astronomy. "At some point, the splitting of gluons inside the proton reaches a limit at which the multiplication of gluons ceases. Such a state is known as the 'color glass condensate,' a hypothesized phase of matter that is thought to exist in very high-energy protons as well as in heavy nuclei."

The KU researcher said his team's more recent experimental results at the Relativistic Heavy Ion Collider and LHC seemed to confirm the existence of such a gluon-dominated state. The exact conditions and the precise energy needed to observe "gluon saturation" in the proton or in heavy nuclei are not yet known, he said.

"The CMS experimental results are very exciting, giving new information about the gluon dynamics in the proton," said Victor Goncalves, professor of physics at Federal University of Pelotas in Brazil, who was working at KU under a Brazil-U.S. Professorship given jointly by the Sociedade Brasileira de Física and the American Physical Society. "The data tell us what energies and dipole sizes are needed to get deeper into the gluon-dominated regime where nonlinear QCD effects become dominant."

Although experiments at the LHC don't directly study the interaction of the proton with elementary particles such as the electrons used at the late HERA collider, it's possible to use an alternative method to study gluon saturation. When accelerated protons (or ions) narrowly miss each other, photon interactions occur with the proton (or the ion). These near misses are called ultra-peripheral collisions (UPCs) because the photon interactions mostly occur when the colliding particles are significantly separated from each other.

"The idea that the electric charge of the proton or ions, when accelerated at ultra-relativistic velocities, will provide a source of quasi-real photons is not new," Tapia Takaki said. "It was first discussed by Enrico Fermi in the late 1920s. But it's only since the 2000s at the RHIC collider and more recently at the LHC experiments where this method has been fully exploited."

Tapia Takaki's group has played a significant role in the study of ultra-peripheral collisions of ions and protons at two instruments at the Large Hadron Collider, first at the ALICE Collaboration and more recently with the CMS detector.

"We have now a plethora of interesting results on ultra-peripheral heavy-ion collisions at the CERN's Large Hadron Collider," said Bylinkin, an associate researcher in the group. "Most of the results have been focused on integrated cross-sections of vector mesons and more recently on measurements using jets and studying light-by-light scattering. For the study of vector meson production, we are now doing systematic measurements, not only exploratory ones. We are particularly interested in the energy dependence study of the momentum transfer in vector meson production since here we have the unique opportunity to pin down the onset of gluon saturation."

The researchers said the work is significant because it is the first to establish four measured points in terms of the energy of the photon-proton interaction and as a function of the momentum transfer.

"Previous experiments at HERA only had one single point in energy," Tapia Takaki said. "For our recent result, the lowest point in energy is about 35 GeV and the highest one is about 180 GeV. This does not sound like a very high energy point, considering that for recent J/psi and Upsilon measurements from UPCs at the LHC we have studied processes up to thousands of GeV. The key point here is that although the energy is much lower in our Rho0 studies, the dipole size is very large."

According to team members, many questions remain unanswered in their line of research to better understand the makeup of protons and neutrons.

"We know that at the HERA collider there were already hints for nonlinear QCD effects, but there are many theoretical questions that have not been answered such as the onset of gluon saturation, and there are at least two main saturation models that we don't know yet which one is the closest to what nature says the proton is," said Goncalves. "We've used the latest results from the CMS collaboration and compared them to both the linear and nonlinear QCD-inspired models. We observed, for the first time, that the CMS data show a clear deviation from the linear QCD model at their highest energy point."

Credit: 
University of Kansas

Multicomponent home-based treatments improve mobility in older adults after hip fracture

Each year more than 260,000 older Americans are hospitalized for hip fractures, a debilitating injury that can severely and permanently impair mobility. Researchers at the University of Maryland School of Medicine (UMSOM) studied two types of home-based interventions and found that these treatments are effective in helping individuals regain their ability to walk, but not enough for everyday functions such as crossing the street.

Jay Magaziner, PhD, MSHyg, Professor and Chair of the Department of Epidemiology and Public Health at UMSOM was the Principal Investigator for this research and Rebecca L Craik, PT, PhD, FAPTA, Dean of the College of Health Sciences at Arcadia University was Co-Principal Investigator. The research was a multidisciplinary partnership involving investigators from epidemiology, physical therapy, geriatrics, orthopedics, gerontology, health economics, biostatistics and health services research. It was conducted at UMSOM, Arcadia University and UConn Health at the University of Connecticut. The research compared two different types of multi-component home-based physical therapy programs, both of which showed significant improvements in the ability to walk but not enough to be independent in the wider community.

This research, which was published today in JAMA, involved 210 participants aged 60 and older recovering from hip fractures. One group received aerobic, strength and balance training. The other group received nerve stimulation and active range-of-motion exercises. Both groups received as many as three weekly home visits from a physical therapist over a 16-week period. In addition, the participants received nutritional counseling and daily vitamin D, calcium and multivitamin supplements.

The research measured the participants' so-called "community ambulation," which is the ability to cross a street before a traffic light changes. To determine community ambulation, participants were timed to see whether they could walk 300 meters (approximately the length of three football fields) within six minutes after the 16-week treatment. Researchers found that after all the regular in-home interventions, 23 percent of the participants in the group receiving aerobic, strength and balance training could walk more than 300 meters in six minutes. In the other group, which received nerve stimulation and range of motion exercises, 18 percent of the participants could walk 300 meters or more in six minutes, a difference that was not statistically significant.
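The 300-meters-in-six-minutes threshold corresponds to a specific walking speed, which a quick calculation makes concrete:

```python
# Walking speed implied by the study's community-ambulation threshold:
# 300 meters covered within a six-minute walk test.
distance_m = 300
time_s = 6 * 60  # six minutes in seconds

speed_m_per_s = distance_m / time_s
print(round(speed_m_per_s, 2))  # -> 0.83 (meters per second)
```

That is roughly 3 km/h, a modest pace but one that many patients recovering from hip fracture do not reach.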

"Both groups showed significant improvement, which highlights the importance of multi-component home-based interventions," said Dr. Magaziner. "The equal level of professional attention both groups received may explain why the difference in the percentage of patients becoming community ambulators between the two groups was relatively small."

The Community Ambulation Project tested the effectiveness of a home-based, multicomponent 16-week intervention that addressed specific walking-related abilities such as endurance, balance, strength and lower extremity function. Both treatment groups were provided regular 60-minute in-home interventions for 16 weeks. The patients were then tested to determine if they could walk more than 300 meters in six minutes.

"Many older adults face challenges regaining mobility after hip fracture. We were pleased that in this study a sizeable number of participants achieved community ambulation capacity. However, much more needs to be done to develop and carry out targeted and creative rehabilitation programs that will benefit greater numbers of older adults who strive to become community ambulators following hip fracture." said Richard Fortinsky, PhD, Professor and Health Net, Inc. Endowed chair in Geriatrics and Gerontology at UConn Health, who was the study's lead researcherat the UConn Health Campus of the University of Connecticut.

During the 16-week period, physical therapists were engaged actively with the study participants, providing motivation and positive reinforcement throughout the exercise sessions. This therapeutic alliance may have also contributed to improved walking ability, the research showed. Going forward, this research could help shed light on improving home-based interventions. For example, future research could assess the influence of factors such as the amount of exercise, how patients adhere to the treatments, behavior and environmental factors, and body composition in response to in-home interventions.

"The number of hip fractures throughout the world is increasing, and nearly half of those individuals who have a fracture will not be able to walk independently a year later. To date, no single intervention has been able to provide a remedy to this growing problem. Our research here at the University of Maryland can help set a clear path for treating the most challenging mobility cases," said UMSOM Dean E. Albert Reece, MD, PhD, MBA, who is also the Executive Vice President for Medical Affairs, University of Maryland, and the John Z. and Akiko K. Bowers Distinguished Professor.

Credit: 
University of Maryland School of Medicine

Mathematical model could help correct bias in measuring bacterial communities

Researchers from North Carolina State University have developed a mathematical model that shows how bias distorts results when measuring bacterial communities through metagenomic sequencing. The proof-of-concept model could be the first step toward developing calibration methods that could make metagenomic measurements more accurate.

Metagenomic sequencing identifies the number and type of bacteria present in a particular community - for example, in a human gut microbiome - through DNA extracted from the sample. "We're measuring communities of bacteria - which ones are present and how many of each one are there," says Ben Callahan, assistant professor of population health and pathobiology and corresponding author of a paper describing the work. "However, the measurement technology isn't perfect, which introduces bias into the results. And that means we don't get an accurate picture of the community we're trying to measure."

According to Callahan, since metagenomic sequencing is a multi-step process, biases can be introduced at every step.

"The most well-known step is DNA extraction, where we break open the bacteria to get to the DNA," Callahan says. "The cells of some bacteria are harder to break open then others. Let's say I have a bacterium that makes up half of the community but doesn't break very well. I could end up with only 10% of this bacterium in my measurement, instead of the 50% that is actually there. That introduces bias. Now every measurement or calculation I do from that point onward is systematically skewed."

Callahan, with NC State postdoctoral researcher Michael McLaren and biostatistician Amy Willis from the University of Washington, tested their model of bias against two types of metagenomic sequencing - 16S rRNA gene sequencing and shotgun metagenomics - in microbial communities of known composition, and found that the model accurately described bias in those circumstances.

"What this experiment shows is that the model we propose works in at least these limited circumstances," Callahan says. "The long-term goal is to provide a calibration tool for metagenomic measurements of complex natural communities, just as we have standards that we use to calibrate measurement technologies like scales, oscilloscopes and microscopes. This work is a first step toward that."

Credit: 
North Carolina State University

URI scientists establish link between prenatal HIV exposure and decreased infant immunity

KINGSTON, R.I. - September 10, 2019 - In the August 16 edition of Scientific Reports, scientists at the University of Rhode Island provide concrete evidence linking the specific immune responses in HIV-negative babies to the HIV-positive status of their mothers. The work was carried out in the laboratory of Barbara Lohman-Payne, associate research professor with URI's Institute for Immunology and Informatics within its Department of Cell and Molecular Biology, in collaboration with colleagues at the University of Nairobi, the University of Washington and the Karolinska Institute in Sweden.

To arrive at their conclusions, researchers compared the T-cell receptor beta-chain repertoire of cord blood samples from HIV-exposed uninfected infants to samples collected from mother-child pairs unaffected by HIV, but who were living in the same communities.

Despite the success of antiretroviral therapies in helping to suppress the risk of HIV being passed from mother to child, globally, HIV-exposed uninfected infants are a vulnerable population characterized by increased morbidity and mortality, higher rates of hospitalizations and childhood infections with more severe outcomes.

According to Lohman-Payne, "The reasons for these epidemiological findings are poorly understood, largely because it has been difficult to obtain suitably controlled participant samples and health data to allow for careful analysis." Fortunately, Lohman-Payne and her team were able to work with researchers at the University of Nairobi and University of Washington that had established cohorts as part of their work in the 1990s and 2000s, prior to the wide introduction of antiretroviral therapy in Kenya, in order to obtain samples for analysis.

The analysis revealed two clusters of HIV-exposed but uninfected babies: one cluster was born with immune repertoires similar to those of healthy infants without HIV exposure, while the other cluster was born with a significantly reduced immune repertoire.

The reduced immune repertoire in HIV-exposed but uninfected infants found in the study may help explain the increased risk of morbidity in this population and emphasizes the need for special care and attention to this group.

"For this study to succeed required a multi-year, multi-continent collaboration that provided generous access to the samples we needed," said Lohman-Payne. "Thanks to that collaboration, importantly, we now have a fuller picture of the potential impact of HIV-exposure on infants regardless of their HIV-status."

Credit: 
University of Rhode Island

Slowing brain rhythms can serve as a marker for delirium and its clinical outcomes

Study shows that the slowing of brain rhythms can serve as a marker for delirium and its clinical outcomes

BOSTON - An EEG (electroencephalogram) can provide a valuable biomarker for detecting delirium, a serious mental disturbance that is often underrecognized, as well as predicting poor clinical outcomes, researchers from Massachusetts General Hospital (MGH) have found. In a paper published in Neurology, the team reported that the generalized slowing of brain rhythms, shown as abnormal theta or delta waveforms on a routine clinical EEG, was associated with longer patient hospitalizations, worse functional outcomes and increased mortality.

"There is growing concern that delirium severity is associated with worse prognosis, and our study provides for the first time hard clinical data that allows physicians to quantify, track and predict patient outcomes in a more accurate way," says Eyal Kimchi, MD, PhD, investigator in the department of Neurology at MGH and lead author of the study. "This information could play a critical role in identifying high-risk individuals and then monitoring the clinical course of their condition, including potentially their responses to novel therapies."

Delirium is an acute and fluctuating disturbance of attention and awareness that is often associated with dementia, dependence and death. While clinical tools in the form of structured cognitive assessments with standardized questions are commonly used to measure delirium presence, they involve subjective evaluation of a complex neurologic condition, generating disagreement even among experts in the field. Generalized EEG slowing was shown in the MGH study to be highly and significantly predictive of delirium when compared to the results of the 3-Minute Diagnostic Interview for Confusion Assessment Method (3D-CAM), the study's primary assessment tool.

"There's been an explosion of clinical tools to assess delirium, but very little in the way of biomarker development," says Kimchi. "Having a biomarker which is more rigorous and quantitative would give us a better understanding of what's going on in the brain and provide a solid foundation for our diagnoses and prognoses. It would also allow us to develop therapies and determine if they are working."

The study consisted of 200 inpatients undergoing EEG for various causes of altered mental status. Over 60 percent of this cohort screened positive for delirium based on the 3D-CAM criteria. While previous research has linked EEG slowing to poor clinical outcomes among specific patient populations, such as those with postanoxic coma or sepsis in the ICU, the MGH study was the first to draw a connection to a wider patient population with a variety of disease states.

In terms of predicting clinical outcomes among these patients, investigators found that EEG slowing was associated with a median length of hospital stay eight days longer than for those with no slowing. Moreover, EEG slowing was a marker for worse functional outcomes as measured by ability to perform routine daily tasks required for independent living. And lastly, EEG slowing was associated with increased in-hospital mortality. Specifically, the study found 19 of the 127 patients with delirium and EEG slowing died in the hospital, compared to no deaths among the cohort without EEG slowing.

"Our study suggests that even a single observation of EEG slowing is a significant marker for poor clinical outcomes among people with altered mental status and shouldn't be ignored by neurologists and other medical professionals," says Kimchi, who is also an assistant professor of Neurology at Harvard Medical School (HMS). "These are vulnerable individuals and being able to identify and monitor them through EEG slowing is a valuable way to ensure they get the care and treatment they need."

Credit: 
Massachusetts General Hospital

Nobel laureate Tom Cech, PhD, suggests new way to target third most common oncogene, TERT

image: Thomas Cech, PhD and colleagues show that TERT mRNA may be waylaid in the cell nucleus, offering the possibility to trap this mRNA and thus silence the effect of this dangerous oncogene.

Image: 
University of Colorado Cancer Center

Healthy cells have a built-in self-destruct mechanism: Strands of DNA called "telomeres" act as protective caps on the ends of your chromosomes. Each time a cell replicates, telomeres get a little shorter. Think of it like filing your nails with an emery board - after enough filing, you hit your fingertip - ouch! In the case of healthy cells, after enough replications, telomeres are "filed" away, leaving bare ends of the chromosomes exposed. At that point, healthy cells are inactivated or die. The eventual loss of telomeres is a major reason you are not immortal. This cellular mortality is also a major way your body fights cancer.

That's because a hallmark of cancer is cellular immortality. And for that to happen, somehow, some way, cancer cells have to break the body's system of telomere degradation - cancer needs to keep chromosomes safely capped. One way they do it is by spackling new DNA onto telomeres faster than it's lost. This involves supercharging the gene that codes for new telomere material.

The TERT gene is the third most commonly mutated gene in cancer. When cancer over-activates TERT, the cell manufactures more of the enzyme "telomerase," which rebuilds telomeres faster than they are degraded. With telomeres being built faster than they degrade, cancer cells gain immortality. Cancers such as melanoma, glioblastoma and bladder cancer (among others) are especially defined by TERT mutation. It's likely that without TERT mutation, there would be none of these cancers.

Unfortunately, despite a massive effort by research and pharmaceutical communities, there is no silver bullet that successfully turns off over-activated TERT in human cancer patients (though anti-TERT drugs have done well against cancer cells in dishes and even in mouse models).

Now a study by Nobel laureate Thomas Cech, PhD, University of Colorado Cancer Center investigator and Director of the BioFrontiers Institute at the University of Colorado Boulder, offers an intriguing new way to target TERT over-activation. Between a gene and its expression lies an important step: The blueprints from the TERT gene have to get from the cell's nucleus out into the cell's cytoplasm, where tiny manufacturing centers called ribosomes turn these TERT blueprints into the telomerase enzyme that rebuilds telomeres.

The current study, published in the Proceedings of the National Academy of Sciences, shows that unlike "messenger RNA" (mRNA) for other genes, the messengers in charge of bringing TERT blueprints to ribosome manufacturing centers dally in the cell nucleus.

Let's let Dr. Cech deliver the punchline:

"Our hypothesis is that there's a special mechanism for transporting TERT mRNA into the cytoplasm that is really slow or inefficient, and that this is another level of regulation that most genes don't need. But TERT needs this special step in regulating the transport of mRNA - and if that's the case, maybe that could be a new target for interfering with this transport step. Maybe we could attack TERT in cancer cells by keeping mRNA trapped in the nucleus," Cech says.

In other words, maybe the silver bullet in the fight against TERT over-activation has nothing to do with TERT over-activation at all. If we could trap TERT mRNA in the nucleus, it wouldn't matter how much of it there is - without the ability to reach ribosomes in the cytoplasm, the TERT blueprints couldn't be manufactured into the telomerase enzyme.

In fact, Cech's lab discovered the action of the TERT gene in cancer nearly 22 years ago.

"At the time, it was just a basic science discovery. We weren't working on cancer, just interested in this little machine," he says. "Time went on and now several hundred labs around the world are working on TERT."

One thing these labs agree on is that TERT is a major cancer-causing gene. They see TERT over-activation and the accumulation of telomerase in cancer cells. But labs had never looked at the timeline of this telomerase accumulation. Here's where this science story becomes a people story.

"Two years ago, John Rinn moved his lab to our institute from Harvard. One of his postdocs, Gabrijela Dumbovic, struck up conversations with a postdoc in my lab, Teisha Rowland [first author on the current paper]. Gabby had the ability to see RNA in individual cells and even track the movement over time of specific RNA molecules. The result is this collaboration: We can see RNA spewing out from the TERT gene as it's being made, and we can also see individual mRNA molecules moving from the nucleus to the cytoplasm," Cech says.

What they saw is that while all other messenger RNAs move quickly from nucleus to cytoplasm, TERT messenger RNA is held up in the nucleus.

"It's not that none of it gets out. It's like half gets out. It's like a flipbook: At the first snapshot, we can see RNA being made at the TERT gene. Then as we flip through later snapshots, we see, surprisingly, a lot of it still stuck in the same compartment it's made, even after days. Some gets out, but a lot appears to be stuck. Keeping TERT RNA trapped in the nucleus would be just as good as knocking out the gene - if it were stuck in the nucleus, it couldn't do anything," Cech says.

However, Cech also cautions that TERT-trapping isn't ready for primetime. Beyond better understanding the mechanism that keeps TERT messenger RNA trapped in the nucleus, and designing and testing a way to magnify or manipulate this effect, there is the major challenge that some telomerase is important for the function of the body's healthy cells, especially stem cells. Completely erasing the action of TERT would be, as scientists say, a bad thing.

"Unfortunately, it's a double-edged sword. If telomeres get too short, you get genome instability, which is another hallmark of cancer. You need TERT action in a Goldilocks range - not too hot, not too cold. Just the right telomeres," Cech says.

"It took 22 years to go from the discovery of the TERT gene to this finding," he says. "We hope it's not another 22 years before we find a way to act against TERT in cancer."

Credit: 
University of Colorado Anschutz Medical Campus

Europeans face significant challenges to participate in lung cancer clinical trials

Barcelona--A survey of patients with lung cancer in several European countries revealed that half did not know what a cancer clinical trial is, and 22 percent had never heard of a cancer clinical trial. The research was reported by Dr. A.M. Baird, on behalf of Lung Cancer Europe, today at the IASLC 2019 World Conference on Lung Cancer hosted by the International Association for the Study of Lung Cancer.

Lung Cancer Europe (LuCE) is the voice of people impacted by lung cancer in Europe. LuCE aims to increase knowledge of lung cancer and provide a platform to raise awareness about disparities in detection, diagnosis, treatment and care across Europe.

Dr. Baird reported that the study was undertaken to gain a better insight into the clinical trial experience from a patient perspective and to improve clinicians' and public health professionals' understanding of patients' awareness of and attitudes towards clinical trials.

The LuCE team developed the survey and qualitative interview questions based on a review of relevant literature and policy sources.

"We shared our online survey with lung cancer advocates and patients with lung cancer," Dr. Baird said. The organization conducted qualitative interviews with 15 individuals, covering the medical community, representatives from patient advocate organizations and the pharmaceutical industry.

The survey was shared with patients with lung cancer across Europe, and those who took part resided mostly in Poland (19.5 percent), Italy (18.7 percent), Denmark (9.9 percent) and Spain (9.2 percent).

"Fortunately, over 50 percent of these respondents stated that their trial experience was positive and 80 percent wanted to find out more about clinical trials, while 75 percent believed that it would be beneficial for patients to work together with researchers in the clinical trial development process," she said.

Survey respondents identified several barriers to accessing lung cancer clinical trials, including difficulties in cross-border access, language barriers, lack of accurate accessible information, lack of awareness by patients and clinicians and disparities in access across Europe.

"The lung cancer community must work together to overcome these barriers and ensure access to clinical trials for all people impacted by lung cancer," Baird concluded.

Credit: 
International Association for the Study of Lung Cancer

Nurse-led follow-up service aids patients with resected early stage lung cancer, improves clinic efficiency

Barcelona-- The presence of the specialist nurse within thoracic surgical centers in the United Kingdom increased clinic capacity and efficiency, reduced waiting time for appointments, promoted junior medical training and ensured continuity of care for the patients, according to an analysis reported today by Jenny Mitchell from Oxford University Hospitals NHS Foundation Trust, in Oxford, United Kingdom.

Mitchell presented the report at the IASLC 2019 World Conference on Lung Cancer hosted by the International Association for the Study of Lung Cancer.

Specialist nursing roles within thoracic surgical centers in the UK are unique to each center and develop to meet the needs of the local service. In Oxford, Mitchell identified that the follow-up of patients after resection of early stage lung cancer could be improved and would be suitable for management by a specialist nurse.

Prior to the introduction of the specialist nursing role, patients were reviewed by the junior doctors working in the clinic, offering limited continuity of care and often presenting challenges in following-up abnormal results.

Following the successful development of a nurse-led early follow-up clinic, Jenny Mitchell and her colleagues instituted a nurse-led CT follow-up program for patients on long-term surgical follow-up after lung cancer surgery.

According to Mitchell, following a review of international guidelines and in conjunction with the lung cancer multidisciplinary team, the research team devised a CT follow-up program that provided:

CT chest, abdomen and pelvis every six months for two years after surgery, followed by an appointment to be given the results.

CT chest, abdomen and pelvis at three, four and five years after surgery, followed by an appointment to be given the results.

The program is coordinated, and CT results are triaged, by the specialist nurse, building on the successful introduction of nurse-led follow-up in the face-to-face clinics.
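As a rough illustration (not the team's actual scheduling tooling), the program's imaging time points can be listed as months after surgery:

```python
# CT follow-up time points (months after surgery) implied by the program
# described above: six-monthly scans for two years, then annual scans at
# three, four and five years.
six_monthly = list(range(6, 25, 6))  # 6, 12, 18, 24 months
annual = [36, 48, 60]                # 3, 4 and 5 years

schedule = six_monthly + annual
print(schedule)  # -> [6, 12, 18, 24, 36, 48, 60]
```

Each time point is paired with a results appointment, which is what made the volume of face-to-face visits burdensome and motivated the telephone clinic described below.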

"We found that feedback from patients on our CT follow-up program indicated they find two trips to the hospital burdensome and they frequently requested results of surveillance imaging over the telephone," Mitchell said. "In addition, limited capacity in the thoracic surgery clinics led to patients waiting a long time for a face to face appointment to be informed of their imaging results.

To address these issues, the Oxford team developed a model of nurse led telephone follow-up after surveillance imaging. The criteria for telephone appointments are: CT results show no abnormality or minor changes requiring a repeat CT chest in three months and patients can communicate adequately over the telephone.

The specialist nurse reviews all the CT follow-up results and allocates patients to the most appropriate clinic, ensuring patients are reviewed in the appropriate setting for their needs and those who need to be seen urgently are prioritized. Abnormalities and concerns detected during the follow-up program are presented at the multidisciplinary meetings by the specialist nurse, who takes responsibility for the actions requested by the team.

From January 2013 to December 2017 there were 546 specialist nurse face-to-face clinic appointments in 189 clinics for 285 patients with primary lung cancer. The telephone clinic commenced in April 2017 and in the first twelve months there were 254 patient appointments in 51 telephone clinics.

The presence of the specialist nurse within the follow-up clinics has increased clinic capacity and efficiency, reduced waiting time for appointments, promotes junior medical training and ensures continuity of care for the patients.

"The patients appreciate the continuity of care and improved access to specialist nursing support and the role is appreciated and respected by the multidisciplinary team," Mitchell reported.

The introduction of the telephone clinic has increased overall clinic capacity and reduced the waiting time for appointments within the face-to-face clinics.

"The specialist nurse is able to provide continuity of care and ensure that all imaging results are followed up appropriately," said Mitchell." The role requires the support of the multidisciplinary lung cancer team to work effectively across all elements of the patient pathway. "

Credit: 
International Association for the Study of Lung Cancer