New epidemic forecast model could save precious resources

image: Taking into account the way an individual changes their behavior when faced with an outbreak can help researchers advise communities on the best use of their resources.

Image: 
Texas A&M Engineering

When governments and institutions deploy epidemic forecast models when facing an outbreak, they sometimes fail to factor in human behavior and over-allocate precious resources as a result. Thanks to new research authored by a Texas A&M University engineering professor, that may no longer be the case.

Dr. Ceyhun Eksin, lead author and assistant professor in the Texas A&M Department of Industrial & Systems Engineering, and his colleagues at the University of California, Santa Barbara, and the Georgia Institute of Technology have published an article in the journal Epidemics that focuses on incorporating behavior change criteria into disease outbreak models.

Adding these criteria will allow professionals and communities to mobilize adequate resources during epidemic outbreaks and reduce public mistrust caused by the overallocation of resources.

"Our goal was to adapt these findings to forecast the disease trajectory, even if the initial information the model received was inaccurate," Eksin said. "The findings show there is value to incorporating a behavior aspect into forecast models."

A modified SIR model

The models currently used to predict the impact of an outbreak, called simple susceptible-infected-recovered (SIR) models, do not take changes in an individual's behavior into account and can overpredict the number of infected individuals during an outbreak. This can lead to an overuse of resources.

The research team hypothesized that individuals would take action during an outbreak to reduce their exposure by avoiding infected individuals and as a result would change the number of individuals infected during the outbreak. To put this idea to the test, the researchers created a modified SIR model that included the ability to pick up a change in an individual's behavior.
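The intuition behind the modified model can be illustrated with a minimal sketch. The damping term below, which reduces contacts as the infected fraction rises, is an illustrative stand-in chosen for this example, not the authors' published formulation:

```python
# Minimal SIR sketch with a simple behavioral response. The damping
# term (1 - i)**k is an illustrative assumption: k = 0 recovers the
# simple SIR model; larger k means individuals cut contacts more
# aggressively as the infected fraction i grows.

def simulate_sir(beta=0.3, gamma=0.1, k=0.0, i0=0.001, days=365):
    """Euler-integrate SIR population fractions; return the peak
    infected fraction over the simulated period."""
    s, i, r = 1.0 - i0, i0, 0.0
    peak = i
    for _ in range(days):
        contact_damping = (1.0 - i) ** k   # behavior: fewer contacts as i rises
        new_inf = beta * contact_damping * s * i
        new_rec = gamma * i
        s -= new_inf
        i += new_inf - new_rec
        r += new_rec
        peak = max(peak, i)
    return peak

peak_plain = simulate_sir(k=0)     # simple SIR: no behavior change
peak_aware = simulate_sir(k=30)    # individuals avoid contact during the outbreak
print(peak_plain > peak_aware)     # behavioral response lowers the predicted peak
```

A forecast that ignores the behavioral term predicts a higher peak, which is exactly the overprediction that leads to over-allocated resources.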

By testing their modified model against the simple SIR model using past outbreak data, Eksin and his colleagues showed that the modified model predicted the number of infected individuals more accurately.

Putting resources to better use

Predicting the number of individuals who will become infected during an outbreak is valuable for determining how to use limited resources, and interdisciplinary research can help clarify the link between a public health response and behavior change. If a community is better able to plan for an outbreak, without over-preparing, it can save resources and reduce the possibility of losing public support during future outbreaks.

Credit: 
Texas A&M University

Carbohydrate in the heart seems to help regulate blood pressure

New research suggests that a particular type of carbohydrate plays an important role in regulating blood pressure in the human body. This has been shown by researchers from the University of Copenhagen and Rigshospitalet in a new study using rats. The researchers believe that the finding may hold vast potential for improved medications for high blood pressure.

Both hypertension and hypotension can have adverse health consequences, leading to cardiovascular disease and syncope, respectively. Now, researchers from the University of Copenhagen and Rigshospitalet know a little more about the factors that help regulate blood pressure. The new research results have been published in the Journal of Biological Chemistry.

In an interdisciplinary collaboration between researchers at the University of Copenhagen and Rigshospitalet, PhD student Lasse Holst Hansen found a particular form of carbohydrate, or sugar, on a particular peptide hormone in humans. In tests with rats, the research team found that the peptide hormone carrying that form of sugar affects the regulation of blood pressure. They hope that in the long term, their results can be used to develop better medications for hypertension.

'It may be a really good bet for a modern way to treat hypertension without side effects, such as syncope. It has long been known that this peptide hormone is extremely important for the blood pressure, but so far it has not been possible to use it in the treatment. This finding was only possible because we collaborated across disciplines and combined basic and clinical research,' says Professor Jens Peter Gøtze, Rigshospitalet, the Department of Clinical Biochemistry.

About one in five Danes has hypertension. This increases the risk of cardiovascular diseases, such as coronary thrombosis and heart failure. According to the Danish Heart Association, one in four Danes will die from cardiovascular diseases.

New Insight into Physiological Processes

The cells of the body use sugar to decorate proteins - a process called glycosylation - in order to control the function and stability of the proteins. In the study, the researchers show how a particular type of sugar attaches to a peptide hormone called atrial natriuretic peptide (ANP). This peptide hormone is secreted from the heart and is important for regulating blood pressure and the fluid balance in the body.

'We can see that when that particular sugar is located on the peptide hormone, it regulates the fluid balance and blood pressure differently than if the sugar is not located there. In our animal models, we could see that the peptide hormone with and without sugar behaves differently. It gives us an insight into a new mechanism for regulation of these important physiological processes in the body,' says Associate Professor Katrine Schjoldager, Copenhagen Center for Glycomics.

The next step for the researchers will now be in-depth studies of the function of that particular sugar and studies as to how the heart regulates the attachment of the sugar. At the same time, the researchers wish to investigate the function in humans to find out whether the phenomenon is more prevalent in some patient groups than in others, such as patients with heart failure.

Credit: 
University of Copenhagen - The Faculty of Health and Medical Sciences

DNA from tooth in Florida man's foot solves 25-year-old shark bite mystery

image: In the summer of 2018, Florida resident Jeff Weakley noticed a blister-like bulge on his foot. Chalking it up to a recent increase in running miles, he ignored it, but it continued to grow. When he finally tweezed it open, he found a tooth fragment from a shark that bit him in 1994. DNA from the tooth revealed he was bitten by a blacktip.

Image: 
Florida Museum photo by Kristen Grace

GAINESVILLE, Fla. --- When Jeff Weakley tweezed open a blister-like bulge on his foot, he was not expecting to find a piece of tooth from a shark that bit him while he was surfing off Flagler Beach in 1994.

He also did not imagine that a DNA test of the tooth, conducted by scientists at the Florida Museum of Natural History, would reveal the kind of shark that had nabbed his foot nearly a quarter century ago: a blacktip.

Weakley was planning to turn the small sliver of tooth into a pendant when he read about how researchers in the Florida Program for Shark Research identified the shark species responsible for a bite off New York by analyzing DNA from a tooth retrieved from the victim's leg.

He decided to offer the tooth to science.

"I was very excited to determine the identity of the shark because I'd always been curious," said Weakley, editor of Florida Sportsman magazine. "I was also a little bit hesitant to send the tooth in because for a minute I thought they would come back and tell me I'd been bitten by a mackerel or a houndfish - something really humiliating."

But Weakley's bite was the real deal, caused by Carcharhinus limbatus, a shark species commonly involved in bites in Florida.

The result came as no surprise to Weakley, who had always suspected a blacktip. But to Gavin Naylor, director of the shark research program, the fact that any viable DNA was left in the tooth fragment to analyze - after 24 years in Weakley's foot where it would have been attacked by his immune system - was a shocker.

"I had put our odds of success at slim to none," he said.

His doubts were shared by his laboratory manager Lei Yang who said he thought "it was kind of weird" to test the tooth, but was also intrigued to try.

"It was a mystery waiting for us to uncover," he said.

Yang cleaned the tooth of contaminants, removed part of the enamel and scraped pulp tissue from the tooth's cavity. He extracted DNA from the tissue, purified it, broke it into small pieces and then added molecular "bookends" on either side of each piece. These bookends made a genomic "library" out of the DNA, which Yang could then search for the sequences he needed to identify the shark. He compared the target sequences against two databases of shark and ray genetic information to determine Weakley had been bitten by a blacktip.
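The final matching step can be pictured with a toy example: count shared subsequences (k-mers) between a query read and candidate reference sequences. The sequences below are invented for illustration; real identification uses curated genetic databases and proper sequence alignment, not this shortcut:

```python
# Hedged sketch of species identification by sequence matching:
# score each reference by how many k-mers it shares with the query.
# All sequences here are hypothetical, made up for illustration.

def kmers(seq, k=4):
    """All length-k substrings of a sequence."""
    return {seq[j:j + k] for j in range(len(seq) - k + 1)}

def best_match(query, references, k=4):
    """Return the reference name whose k-mer set overlaps the query most."""
    q = kmers(query, k)
    return max(references, key=lambda name: len(q & kmers(references[name], k)))

references = {  # hypothetical reference snippets, not real database entries
    "Carcharhinus limbatus (blacktip)": "ATGGCACTAGCCTTTACAGGC",
    "Prionace glauca (blue shark)":     "ATGTTGCGTAGACCTTAACGG",
}
query = "GCACTAGCCTTTACA"  # hypothetical read recovered from the pulp tissue
print(best_match(query, references))  # → Carcharhinus limbatus (blacktip)
```

The real pipeline compares many target sequences against two shark-and-ray databases, but the principle is the same: the species whose reference sequences best match the recovered DNA wins.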

About 70 percent of shark bites are caused by unidentified species, and more precise data on which species are involved could improve bite mitigation strategies, Yang said. He also understood Weakley's personal curiosity.

"If I was bitten by a shark, I would want to know what it was," Yang said.

Weakley, who was bitten while surfing at a college beach mixer, said he was back in the water - foot encased in a waterproof bandage and bootie - within a couple of weeks. Twenty-five years later, he surfs and fishes weekly and regards the sharks he frequently sees in the same light as dogs that menace him when he jogs.

"I've been lucky to have not been bitten by a dog, but I would regard that interaction I had with that shark as being no different or more destructive than a dog bite," he said. "I certainly don't have a hatred of sharks or any feeling of vindictiveness toward them. They're part of our natural world."

But he doesn't wax romantic about them either.

"I've consumed blacktip shark and thought it was delicious."

Credit: 
Florida Museum of Natural History

Even today, we want our heroes to know right from wrong

COLUMBUS, Ohio - In a world of sympathetic villains and flawed heroes, people still like fictional characters more when they have a strong sense of morality, a new study finds.

Researchers found that people best liked the heroes they rated as most moral, and least liked villains they rated as most immoral.

Antiheroes and morally ambiguous characters like Walter White - the chemistry teacher turned drug kingpin in the show Breaking Bad - were more complicated for people to rate on likability.

But across all of the character types, morality and likability were closely related to each other, said Matthew Grizzard, lead author of the study and assistant professor of communication at The Ohio State University.

"Character morality is intricately tied to how much people like them," Grizzard said.

"People still don't like evil characters."

The study was published online in the Journal of Media Psychology and will appear in a future print edition.

Scholars have long believed that character morality is linked to how much people like them, Grizzard said.

"But then in the 90s we started to see antiheroes get popular in our culture, in TV shows like The Sopranos and NYPD Blue, for example. Characters did bad things, but people still rooted for them," he said.

"That got us thinking: Does character morality not matter anymore? Or does it matter and we're just not seeing the whole picture?"

So Grizzard and his colleagues put this question to the test by asking 262 college students to think of characters they liked or characters they disliked.

All of the students were given descriptions of three character types - heroes, villains and "morally ambiguous characters," or antiheroes. They were then asked to recall a fictional character in one of those categories that they either "really liked" or "really disliked."

Some of the liked characters participants chose included Superman and Batman as heroes, Deadpool and Batman as morally ambiguous characters and The Joker and Voldemort as villains.

Some of the disliked characters also included Batman and Superman as heroes, Dexter Morgan (of the TV show Dexter) and Spider-Man as morally ambiguous characters and The Joker and Voldemort as villains.

Participants rated how much they liked the character they chose and how moral they thought the character was on scales from 1 to 7.

This put some participants in a difficult and unnatural position: They had to choose a villain they liked or a hero they disliked, Grizzard said.

"If there is really no connection between morality and liking, we should clearly see it here. But that's not what we found," he said.

The disliked heroes were rated as less moral than the liked heroes. The liked villains were rated as more moral than the disliked villains. And the liked antiheroes were rated as more moral than the disliked antiheroes.

"The more moral a character is, the more I like them. The more I like a character, the more moral I perceive them to be. It is nearly impossible to separate these factors," Grizzard said.

It was most difficult for researchers to predict the relationship between liking and morality in the morally ambiguous characters.

These antiheroes show that it may be too simple to say that as characters increase in morality, they will always be more liked.

"This middle ground where the characters are somewhat good and somewhat bad are harder to predict. But even then, there is still some relationship between morality and liking," he said.

Grizzard gives the example of Walter White, the main character in the show Breaking Bad, who starts producing and selling crystal meth to help secure his family's financial future after he is diagnosed with cancer.

In many movies, the Walter White character would be the villain and no one would root for him, Grizzard said. But in Breaking Bad, White is still more moral than most of the other characters, so viewers root for him.

"It is a relative morality. Because all of the other characters are worse than he is, we have something to compare him to. We don't exactly like him, but he is the best we can hope for in this show."

Credit: 
Ohio State University

Creating 'movies' of thin film growth at NSLS-II

video: This animation is a simplified representation of thin film growth. As C60 molecules are deposited onto a material, they form multiple layers simultaneously -- not one layer at a time. After a molecule reaches the surface of the material, it migrates by surface diffusion towards the boundary of an existing layer, or the 'step-edge,' causing the step-edge to move out from the center of the mound. This process repeats as new layers are continuously formed in an organized pattern. The mound increases in height by one layer after an equivalent of one full layer of molecules has been deposited onto the material. The pattern of step-edges is self-similar after each full-layer-equivalent is deposited, just displaced one layer higher. The main result of the study is that this repeating self-similarity, or 'autocorrelation,' can be quantitatively measured with coherent x-rays, and that the autocorrelations can be used to deduce certain details of how step-edges propagate during the deposition.

Image: 
Brookhaven National Laboratory

From paint on a wall to tinted car windows, thin films make up a wide variety of materials found in ordinary life. But thin films are also used to build some of today's most important technologies, such as computer chips and solar cells. Seeking to improve the performance of these technologies, scientists are studying the mechanisms that drive molecules to uniformly stack together in layers--a process called crystalline thin film growth. Now, a new research technique could help scientists understand this growth process better than ever before.

Researchers from the University of Vermont, Boston University, and the U.S. Department of Energy's (DOE) Brookhaven National Laboratory have demonstrated a new experimental capability for watching thin film growth in real-time. Using the National Synchrotron Light Source II (NSLS-II)--a DOE Office of Science User Facility at Brookhaven--the researchers were able to produce a "movie" of thin film growth that depicts the process more accurately than traditional techniques can. Their research was published on June 14, 2019 in Nature Communications.

How thin films grow

Like building a brick wall, thin films "grow" by stacking in overlapping layers. In this study, the scientists focused on the growth process of a nanomaterial called C60, which is popular for its use in organic solar cells.

"C60 is a spherical molecule that has the structure of a soccer ball," said University of Vermont physicist Randall Headrick, lead author of the research. "There is a carbon atom at all of the corners where the 'black' and 'white' patches meet, for a total of 60 carbon atoms."

Though spherical C60 molecules don't fit together perfectly like bricks in a wall, they still create a uniform pattern.

"Imagine you have a big bin and you fill it with one layer of marbles," Headrick said. "The marbles would pack together in a nice hexagonal pattern along the bottom of the bin. Then, when you laid down the next layer of marbles, they would fit into the hollow areas between the marbles in the bottom layer, forming another perfect layer. We're studying the mechanism that causes the marbles, or molecules, to find these ordered sites."

But in real life, thin films don't stack this evenly. When filling a bin with marbles, for example, you may have three layers of marbles on one side of the bin and only one layer on the other side. Traditionally, this nonuniformity in thin films has been difficult to measure.

"In other experiments, we could only study a single crystal that was specially polished so the whole surface behaved the same way at the same time," Headrick said. "But that is not how materials behave in real life."

Studying thin film growth through coherent x-rays

To collect data that more accurately described thin film growth, Headrick went to the Coherent Hard X-ray Scattering (CHX) beamline at NSLS-II to design a new kind of experiment, one that made use of the beamline's coherent x-rays. The team used a technique called x-ray photon correlation spectroscopy.

"Typically, when you do an x-ray experiment, you see average information, like the average size of molecules or the average distance between them. And as the surface of a material becomes less uniform or 'rougher,' the features you look for disappear," said Andrei Fluerasu, lead beamline scientist at CHX and a co-author of the research. "What is special about CHX is that we can use a coherent x-ray beam that produces an interference pattern, which can be thought of like a fingerprint. As a material grows and changes, its fingerprint does as well."

The "fingerprint" produced by CHX appears as a speckle pattern and it represents the exact arrangement of molecules in the top layer of the material. As layers continue to stack, scientists can watch the fingerprint change as if it were a movie of the thin film growth.

"That is impossible to measure with other techniques," Fluerasu said.

Through computer processing, the scientists are able to convert the speckle patterns into correlation functions that are easier to interpret.
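The core of that processing is correlating speckle intensities recorded at different times. Real analysis works on two-dimensional detector frames; the sketch below uses short made-up 1-D intensity traces just to show the principle:

```python
# Toy version of the correlation step: a normalized cross-correlation
# between intensity traces from two moments in the growth. Values near
# 1 mean the speckle "fingerprint" has repeated (self-similar growth);
# values near 0 mean the surface has decorrelated.

def correlation(a, b):
    """Normalized cross-correlation of two equal-length intensity traces."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = sum((x - ma) ** 2 for x in a) ** 0.5
    vb = sum((y - mb) ** 2 for y in b) ** 0.5
    return cov / (va * vb)

# Hypothetical traces: frame_b repeats frame_a's pattern one layer
# later (self-similar step-edge pattern); frame_c does not.
frame_a = [5, 9, 2, 7, 4, 8, 1, 6]
frame_b = [6, 10, 3, 8, 5, 9, 2, 7]   # same pattern, uniformly brighter
frame_c = [2, 7, 5, 9, 6, 1, 8, 4]
print(correlation(frame_a, frame_b))  # ≈ 1: pattern repeats
print(correlation(frame_a, frame_c))  # much smaller: pattern lost
```

In the study, it is exactly this kind of autocorrelation, peaking each time a full layer's worth of molecules has been deposited, that reveals how step-edges propagate during deposition.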

"There are instruments like high resolution microscopes that can actually make a real image of these kinds of materials, but these images usually only show narrow views of the material," Headrick said. "A speckle pattern that changes over time is not as intuitive, but it provides us with data that is much more relevant to the real-life case."

Co-author Lutz Wiegart, a beamline scientist at CHX, added, "This technique allows us to understand the dynamics of growth processes and, therefore, figure out how they relate to the quality of the films and how we can tune the processes."

The detailed observations of C60 from this study could be used to improve the performance of organic solar cells. Moving forward, the researchers plan to use this technique to study other types of thin films as well.

Credit: 
DOE/Brookhaven National Laboratory

Cholesterol that is too low may boost risk for hemorrhagic stroke

UNIVERSITY PARK, Pa. -- Current guidelines recommend lowering cholesterol for heart disease risk reduction. New findings indicate that if cholesterol dips too low, it may boost the risk of hemorrhagic stroke, according to researchers.

Over a period of nine years, a Penn State-led study examined the relationship between low-density lipoprotein cholesterol -- LDL, commonly known as "bad" cholesterol -- and hemorrhagic stroke. This type of stroke occurs when a blood vessel bursts in the brain.

The researchers found that participants with LDL cholesterol levels below 70 mg/dL had a higher risk of hemorrhagic stroke.

Xiang Gao, associate professor of nutritional sciences and director of the Nutritional Epidemiology Lab at Penn State, said the results -- published today (date) in Neurology -- may help refine and personalize recommendations for ideal target cholesterol levels.

"As is true with many things in nutrition, moderation and balance is key when deciding the optimal target level of LDL cholesterol," Gao said. "You can't go to either extreme -- too high or too low. And if you're at a high risk for hemorrhagic stroke due to family history or risk factors like high blood pressure and heavy alcohol drinking, you may want to be extra careful about LDL cholesterol levels."

According to the researchers, low LDL cholesterol is recommended as a way to reduce the risk of a heart attack or ischemic stroke -- the latter when a blood vessel in the brain becomes blocked by a clot. But previous research has suggested a link between very low LDL cholesterol levels and hemorrhagic stroke.

Chaoran Ma, a nutritional sciences graduate student at Penn State, said that while previous studies suggested this connection, there was a need for additional validation in a separate cohort.

"For our study, we wanted to expand the scope of knowledge in this area by investigating the issue prospectively in a large cohort with multiple LDL cholesterol measurements to capture variation over time," Ma said.

The study included 96,043 participants with no history of stroke, heart attack or cancer when the study began. LDL cholesterol levels were measured when the study began and yearly thereafter for nine years. Reported incidents of hemorrhagic stroke were confirmed by medical records.

The researchers found that participants who had LDL cholesterol levels between 70 and 99 mg/dL had a similar risk of hemorrhagic stroke. But, when LDL cholesterol levels dipped below 70 mg/dL, the risk of hemorrhagic stroke increased significantly. For example, the risk increased by 169 percent for participants with LDL levels less than 50 mg/dL relative to those with LDL levels between 70 and 99 mg/dL. These findings were consistent after controlling for age, sex, blood pressure and medication.

"Traditionally, an LDL cholesterol level of less than 100 mg/dL had been considered optimal for the general population, and lower in individuals at elevated risk of heart disease," Gao said. "We observed that the risk of hemorrhagic stroke increased in individuals with LDL cholesterol levels below 70 mg/dL. This observation, if confirmed, has important implications for treatment targets."

Ma said the findings may be able to help health care professionals continue to refine guidelines.

"The results were based on a large community-based study, which is an advantage because it focused on healthy people in a non-clinical setting," Ma said.

Credit: 
Penn State

A NEAT discovery about memory

image: Farah Lubin, Ph.D., associate professor, Department of Neurobiology, University of Alabama at Birmingham

Image: 
UAB

BIRMINGHAM, Ala. - You could call this a neat discovery.

Researchers at the University of Alabama at Birmingham have found that a tissue-specific, non-coding RNA called NEAT1 has a major, previously undescribed role in memory formation. The findings are presented in a paper published in Science Signaling on July 2.

We have long known that DNA contains the instructions -- or the code -- that gives cells the genetic information they need to build and maintain an organism, much as the letters of the alphabet are the code used to make words. Messenger RNA carries that code to the cell's protein-building machinery, which translates it into proteins. However, there are also non-coding RNAs, which act in the cell without coding for proteins and whose role -- if any -- has been poorly understood. Recently, science has come to understand that non-coding RNA may play a more important role than originally believed.

"NEAT1 is a tissue-specific, non-coding RNA found in the hippocampus region of the brain. This brain region is most associated with learning and memory," said Farah Lubin, Ph.D., associate professor in the Department of Neurobiology and principal investigator of the study. "While it has some association with cancer in other parts of the body, we have discovered that, in the hippocampus, NEAT1 appears to regulate memory formation."

Lubin says that when NEAT1 is on, or active, we do not learn as well; when presented with an outside learning experience, it turns off, allowing the brain to learn from the stimulus. She uses a car analogy: the engine might be running, but when the brakes are on, the car does not move. You have to release the brakes and hit the gas to get the car moving.

"NEAT1 is the brake: When it is on, we aren't learning, at least not as much as we might with it off," Lubin said. "In a younger brain, when presented with a stimulus that promotes learning, NEAT1 turns off. Since one of the hallmarks of aging is a decline in memory, we wondered if NEAT1 was implicated in that decline."

Lubin says one of the genes that NEAT1 acts upon is c-FOS, which is necessary for memory formation. In an aging brain, NEAT1 is on more than it is in a younger brain, interfering with the epigenetic regulation of c-FOS, which disrupts its memory functions.

Using siRNA techniques in a mouse model, Lubin's team was able to turn off NEAT1 in older mice. With NEAT1 off, the mice demonstrated normal abilities in learning and memory.

The next step was to change the level of NEAT1 in younger mice, using CRISPR/dCas9 gene-activation technology. Boosting the presence of NEAT1 in younger mice caused a decline in their ability to learn and remember.

"Turning NEAT1 off in older animals boosted memory, while increasing NEAT1 in younger animals diminished memory," Lubin said. "This gives us very strong evidence that NEAT1 and its effects on the epigenetic control of c-FOS are one of the keys to memory formation. These are significant findings, for not only did we find a novel epigenetic initiator and regulator, we identified a new role for the NEAT1 non-coding RNA. This sets the stage for more research into the potential roles played by other non-coding RNAs."

Lubin says further research should also examine the potential of using the same CRISPR/dCas9 technology to prevent NEAT1 overexpression in older humans and help boost memory formation. The goal is to find ways to enhance memory in people with deficits due to aging or to conditions such as Alzheimer's disease and other dementias.

Credit: 
University of Alabama at Birmingham

Fast radio burst pinpointed to distant galaxy

image: Owens Valley Radio Observatory.

Image: 
Caltech/OVRO/Gregg Hallinan

Fast radio bursts (FRBs) are among the most enigmatic and powerful events in the cosmos. Around 80 of these events--intensely bright millisecond-long bursts of radio waves coming from beyond our galaxy--have been witnessed so far, but their causes remain unknown.

In a rare feat, researchers at Caltech's Owens Valley Radio Observatory (OVRO) have now caught a new burst, called FRB 190523, and, together with the W. M. Keck Observatory in Hawaii, have pinpointed its origins to a galaxy 7.9 billion light-years away. Identifying the galaxies from which these radio bursts erupt is a critical step toward solving the mystery of what triggers them.

A paper about the discovery appears online July 2 in Nature.

Before this new discovery, only one other burst, called FRB 121102, had been localized to a host galaxy. FRB 121102 was reported in 2014 and then later, in 2017, was pinpointed to a galaxy lying 3 billion light-years away. Recently, a second localized FRB was announced on June 27, 2019. Called FRB 180924, this burst was discovered by a team using the Australian Square Kilometre Array Pathfinder and traced to a galaxy about 4 billion light-years away.

FRB 121102 was easiest to find because it continues to burst every few weeks. Most FRBs, however--including the Australian and OVRO finds--just go off once, making the job of finding their host galaxies harder.

"Finding the locations of the one-off FRBs is challenging because it requires a radio telescope that can both discover these extremely short events and locate them with the resolving power of a mile-wide radio dish," says Vikram Ravi, a new assistant professor of astronomy at Caltech who works with the radio telescopes at OVRO, which is situated east of the Sierra Nevada mountains in California.

"At OVRO, we built a new array of ten 4.5-meter dishes that collectively act like a mile-wide dish to cover an area on the sky the size of 150 full moons," he says. "To do this, a powerful digital system ingests and processes an amount of data equivalent to a DVD every second."

The new OVRO instrument is called the Deep Synoptic Array-10, with the "10" referring to the number of dishes. This array serves as a stepping stone for the planned Deep Synoptic Array (DSA), funded by the National Science Foundation (NSF), which, when completed by 2021, will ultimately consist of 110 radio dishes.

"The DSA is expected to discover and localize more than 100 FRBs per year," says Richard Barvainis, program director at the NSF for the Mid-Scale Innovations Program, which is funding the construction of the DSA. "Astronomers have been chasing FRBs for a decade now, and we're finally drawing a bead on them with new instruments like DSA-10 and, eventually, the full DSA. Now we have a chance of figuring out just what these exotic objects might be."

The new observations show that the host galaxy for FRB 190523 is similar to our Milky Way. This is a surprise because the previously located FRB 121102 originates from a dwarf galaxy that is forming stars more than a hundred times faster than the Milky Way.

"This finding tells us that every galaxy, even a run-of-the-mill galaxy like our Milky Way, can generate an FRB," says Ravi.

The discovery also suggests that a leading theory for what causes FRBs--the eruption of plasma from young, highly magnetic neutron stars, or magnetars--may need to be rethought.

"The theory that FRBs come from magnetars was developed in part because the earlier FRB 121102 came from an active star-forming environment, where young magnetars can be formed in the supernovae of massive stars," says Ravi. "But the host galaxy of FRB 190523 is more mellow in comparison."

Ultimately, to solve the mystery of FRBs, astronomers hope to uncover more examples of their host galaxies.

"With the full Deep Synoptic Array, we are going to find and localize FRBs every few days," says Gregg Hallinan, the director of OVRO and a professor of astronomy at Caltech. "This is an exciting time for FRB discoveries."

The researchers also say that FRBs can be used to study the amount and distribution of matter in our universe, which will tell us more about the environments in which galaxies form and evolve. As radio waves from FRBs head toward Earth, intervening matter causes some of the wavelengths to travel faster than others; the wavelengths become dispersed in the same way that a prism spreads apart light into a rainbow. The amount of dispersion tells astronomers exactly how much matter there is between the FRB sources and Earth.
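This dispersion effect follows a well-known relation: the arrival delay of a lower frequency relative to a higher one scales with the dispersion measure (DM), the column density of free electrons along the line of sight. The DM and observing band below are illustrative assumptions (a DM of roughly 760 pc cm^-3 was reported for FRB 190523), not figures taken from this article:

```python
# Dispersion delay of a radio burst: lower frequencies lag higher ones
# by an amount proportional to the dispersion measure (DM).

K_DM = 4.149e-3  # s GHz^2 cm^3 / pc, standard dispersion constant

def dispersion_delay(dm, f_low_ghz, f_high_ghz):
    """Arrival-time delay in seconds of f_low relative to f_high for a
    burst with dispersion measure dm (in pc cm^-3)."""
    return K_DM * dm * (f_low_ghz ** -2 - f_high_ghz ** -2)

# Illustrative values: DM ~760 pc cm^-3 swept across an assumed
# 1.28-1.53 GHz observing band.
delay = dispersion_delay(760.0, 1.28, 1.53)
print(round(delay, 2))  # roughly half a second of sweep across the band
```

Measuring that sweep is what lets astronomers read off exactly how much matter the burst traversed on its way to Earth.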

"Most matter in the universe is diffuse, hot, and outside of galaxies," says Ravi. "This state of matter, although not 'dark,' is difficult to observe directly. However, its effects are clearly imprinted on every FRB, including the one we detected at such a great distance."

Credit: 
California Institute of Technology

Higher risk of stillbirth in longer pregnancies, study finds

image: The longer a pregnancy continues past 37 weeks gestation, the higher the risk of a stillbirth.

Image: 
Tatiana Vdb, Flickr

The longer a pregnancy continues past 37 weeks gestation, the higher the risk of a stillbirth, according to a new meta-analysis published this week in PLOS Medicine by Shakila Thangaratinam of Queen Mary University of London, UK, and colleagues.

Of the 3000 babies stillborn every year in the UK, a third appeared healthy at 37 weeks. In the new work, researchers searched major electronic databases for studies on term pregnancies that included weekly numbers of stillbirths or neonatal deaths. Thirteen studies, providing data on 15 million pregnancies and 17,830 stillbirths, were identified and included in their analysis.

The risk of stillbirth increased with gestational age from 0.11 stillbirths per 1000 pregnancies at 37 weeks (95% CI 0.07 to 0.15) to 3.18 stillbirths per 1000 pregnancies at 42 weeks (95% CI 1.84 to 4.35). From 40 to 41 weeks, there was a 64% increase in the risk of stillbirth. Neonatal mortality remained steady in babies born from 38 to 41 weeks, but was significantly higher for babies born at 42 weeks compared to 41 weeks (RR 1.87, 95% CI 1.07 to 2.86, p=0.012).
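The comparisons above can be reproduced from the published point estimates, treating them as simple rates per 1000 ongoing pregnancies (this sketch ignores the confidence intervals; the 40-week rate below is an illustrative placeholder, not a figure from the study):

```python
# Stillbirth point estimates from the meta-analysis, per 1000
# ongoing pregnancies at each gestational age.
risk_per_1000 = {37: 0.11, 42: 3.18}

# Risk ratio between the two ends of the term period.
rr_37_to_42 = risk_per_1000[42] / risk_per_1000[37]
print(f"Risk at 42 weeks is ~{rr_37_to_42:.0f}x the risk at 37 weeks")

# "A 64% increase from 40 to 41 weeks" means a risk ratio of 1.64:
# given a hypothetical 40-week rate r, the 41-week rate is 1.64 * r.
r_40 = 1.0          # per 1000, illustrative value only
r_41 = 1.64 * r_40
print(f"Illustrative 41-week rate: {r_41:.2f} per 1000")
```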

"Any mother considering prolongation of pregnancy beyond 37 weeks should be informed of the additional small but significantly increased risks of stillbirths with advancing gestation," the authors say. "There is a need to assess the acceptability of early delivery at term to parents and healthcare providers to avoid the small risk of stillbirth. Better stratification of apparently low risk women for complications using individualized prediction models could reduce the number of women who need to be delivered to avoid one additional stillbirth."

Credit: 
PLOS

Bench to bedside study of a targetable enzyme controlling aggressive prostate cancer

PHILADELPHIA -- Prostate cancer represents a major health challenge, and there is currently no effective treatment once it has advanced to the aggressive, metastatic stage. A new study has revealed a key cellular mechanism that contributes to aggressive prostate cancer, supporting a new clinical trial. The study was published in the journal Clinical Cancer Research.

The research, led by investigators at the Sidney Kimmel Cancer Center - Jefferson Health (SKCC) and their collaborators at Memorial Sloan Kettering Cancer Center, the University of California, San Francisco, and Celgene Corporation, focused on an enzyme called DNA-PK (DNA-dependent protein kinase), a pivotal component of the cellular machinery that controls both DNA repair and gene expression. The work was orchestrated by the laboratory of Karen E. Knudsen, PhD, EVP of Oncology Services and Enterprise Director of SKCC.

Previous studies showed that DNA-PK is excessively active in metastatic prostate cancer and that its hyper-activation is associated with a poor outcome in prostate cancer patients. "Our study further elucidates the functions of DNA-PK and identifies this protein as a master regulator of gene networks that promote aggressive cancer behaviors," says lead author Emanuela Dylgjeri.

A companion study in the same issue, led by the laboratory of Felix Feng, MD, in collaboration with the Knudsen laboratory, identified DNA-PK as the kinase most significantly associated with metastatic progression of the disease. In an effort to understand how DNA-PK induces poor outcomes, the investigators found that DNA-PK modulates the expression of gene networks controlling a variety of important cancer-related cellular events, including a developmental process termed the epithelial-mesenchymal transition, the immune response, metabolic pathways (Dylgjeri et al.) and Wnt signaling (Kothari et al.).

The new findings suggest that targeting DNA-PK might allow the development of effective strategies to prevent or treat aggressive, late-stage prostate cancer. Data from the studies was used to develop a clinical trial combining standard-of-care with a first-in-man DNA-PK inhibitor. Early results of the trial have been promising, and the researchers have demonstrated in a laboratory setting that the combined approach is more effective than either single treatment in eliciting anti-tumor effects. The clinical trial is still underway and has now entered the expansion phase of testing.

The newly published studies are focused on translating basic science findings from the laboratory to the clinic, but the investigators also plan to take the lessons learned in the clinic back to the laboratory. The results of the clinical trial will offer important clues and raise new questions that will guide the design of new experiments, with the ultimate goal of understanding how DNA-PK regulates specific cellular pathways to promote more aggressive cancer behavior. These studies in turn will aid in the development of more accurate genetic tests to detect advanced prostate cancer, identify the most appropriate course of treatment for individual patients, and predict treatment outcomes. Team leader Dr. Karen Knudsen envisions long-term practical outcomes of this research, saying "it is our hope to use the information gained by these studies to understand which prostate cancer patients might benefit the most from combination treatments with a DNA-PK inhibitor drug".

Credit: 
Thomas Jefferson University

Can mathematics help us understand the complexity of our microbiome?

image: The microbiome is complex due to many interacting species. Ludington and his team sought to deconstruct this complexity by calculating the geometric structure of the interactions. Their method measures interactions in high dimensional space, considering each species to have its own dimension. One analogy for understanding the mathematical structure is to think of it as foam being simplified into a single bubble by progressively merging adjacent bubbles.

Image: 
Wikimedia Commons

Baltimore, MD--How do the communities of microbes living in our gastrointestinal systems affect our health? Carnegie's Will Ludington was part of a team that helped answer this question.

For nearly a century, evolutionary biologists have probed how genes encode an individual's chances for success--or fitness--in a specific environment.

In order to reveal a potential evolutionary trajectory, biologists measure the interactions between genes to see which combinations are most fit. An organism that is evolving should take the most fit path. This concept is called a fitness landscape, and various mathematical techniques have been developed to describe it.

Like the genes in a genome, microorganisms in the gut microbiome interact, yet there isn't a widely accepted mathematical framework to map the patterns of these interactions. Existing frameworks for genes focus on local information about interactions but do not put together a global picture.
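The basic "local" interaction measurement that such frameworks generalize can be sketched with just two species. Each species is present or absent, so the landscape lives on the corners of a square (with n species, an n-dimensional hypercube, which is where the high-dimensional geometry comes in). The fitness values below are hypothetical, for illustration only:

```python
# Minimal two-species interaction ("epistasis") measurement on a
# fitness landscape. Keys are (species A present?, species B present?).
fitness = {
    (0, 0): 1.0,   # germ-free host
    (1, 0): 1.3,   # species A alone
    (0, 1): 1.2,   # species B alone
    (1, 1): 1.9,   # both species together
}

# The pairwise interaction is the deviation from additivity:
# u = w11 - w10 - w01 + w00. Zero means the two species' effects
# simply add up; positive means they boost fitness more together
# than their separate effects would predict.
u = fitness[1, 1] - fitness[1, 0] - fitness[0, 1] + fitness[0, 0]
print(f"Pairwise interaction: {u:+.2f}")
```

With many species, quantities like this must be computed and compared across every face of the hypercube, which is why a global geometric framework is needed.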

"If we understand the interactions, we can make predictions about how these really complex systems will work in different scenarios. But there is a lot of complexity in the interaction networks due to the large number of genes or species. These add dimensions to the problem and make it tricky to solve," said Ludington.

So, Ludington began talking to mathematician Michael Joswig of the Technical University in Berlin.

"Michael thinks natively in high dimensions--many more than four. He understood the problem right away," said Ludington.

Joswig and Ludington then joined with Holger Eble of TU Berlin, a graduate student working with Joswig, and Lisa Lamberti of ETH Zurich. Lamberti had previously collaborated with Ludington to apply a slightly different mathematical framework for the interactions to microbiome data. In the present work, the team expanded upon that previous framework to produce a more global picture by mapping the patterns of interactions onto a landscape.

"In humans, the gut microbiome is an ecosystem of hundreds to thousands of microbial species living within the gastrointestinal tract, influencing health and even longevity," Ludington explained. "As interest in studying the microbiome continues to increase, understanding this complexity will give us predictive power to engineer it."

But the sheer diversity of species in the human microbiome makes it very difficult to elucidate how these communities influence our physiology. This is why the fruit fly makes such an excellent model. Unlike the human microbiome, it consists of only a handful of bacterial species.

"We've built a rigorous mathematical framework that describes the ecology of a microbiome coupled to its host. What is unique about this approach is that it allows a global view of a microbiome-host interaction landscape," said Ludington. "We can now use this approach to compare different landscapes, which will let us ask why diverse microbiomes are associated with similar health outcomes."

The authors note that the framework applies equally well to traditional genetic interactions. Their work is published in the Journal of Mathematical Biology.

Credit: 
Carnegie Institution for Science

Study: Brain injury common in domestic violence

COLUMBUS, Ohio - Domestic violence survivors commonly suffer repeated blows to the head and strangulation, trauma that has lasting effects that should be widely recognized by advocates, health care providers, law enforcement and others who are in a position to help, according to the authors of a new study.

In the first community-based study of its kind, researchers from The Ohio State University and the Ohio Domestic Violence Network found that 81 percent of women who have been abused at the hands of their partners and seek help have suffered a head injury and 83 percent have been strangled.

The research suggests that brain injury caused by blows to the head and by oxygen deprivation are likely ongoing health issues for many domestic violence survivors. Because of poor recognition of these lasting harms, some interactions between advocates and women suffering from the effects of these unidentified injuries were likely misguided, said the authors of the study, which appears in the Journal of Aggression, Maltreatment & Trauma.

"One in three women in the United States has experienced intimate partner violence. What we found leads us to believe that many people are walking around with undiagnosed brain injury, and we have to address that," said lead researcher Julianna Nemeth, an assistant professor of health behavior and health promotion at Ohio State.

The study included 49 survivors from Ohio and 62 staff and administrators from five agencies in the state.

Previous research has acknowledged brain injury as a product of domestic violence. But this is the first study to gather this kind of detailed information from the field. It's also the first to establish that many survivors have likely experienced repeated head injury and oxygen deprivation - a combination that could contribute to more-severe problems including memory loss, difficulty understanding, loss of motivation, nightmares, anxiety and trouble with vision and hearing, Nemeth said.

"Nobody really knows just what this combination of injuries could mean for these women," she said. "When we looked at our data, it was an 'Oh my gosh' moment. We have the information we need now to make sure that people recognize this as a major concern in caring for survivors."

Almost half of the women in the study said they'd been hit in the head or had their head shoved into another object "too many times to remember." More than half were choked or strangled "a few times" and one in five said that happened "too many times to remember."

In some cases, the survivors lived through both experiences multiple times.

The reports from women in domestic violence programs throughout Ohio already have prompted changes to how the statewide advocacy group and the programs it works with are helping the survivors they serve. They've created a model called "CARE" for "Connect, Acknowledge, Respond and Evaluate."

They've adjusted their training and developed materials that address the "invisible injuries" to the brain. They're encouraging providers at agencies to tailor care plans to the specific needs of women who've had traumatic brain injury, and to help them seek medical care to get an appropriate diagnosis and treatment.

Currently, the team is working on an evaluation to see how well that new programming is working.

"Brain injury was not something we really talked about much until now. It wasn't part of any routine training and we're trying to address that now because of what we learned from these survivors," said Rachel Ramirez, a study co-author and training director for the Ohio Domestic Violence Network. She's been exploring ways to ensure better diagnosis and treatment for women with brain injuries, and said there's a long way to go.

"Almost all of the best-practice recommendations for TBI are focused on athletes and soldiers, and some of the guidance is impractical for our population," she said. "These women could be having trouble being able to plan for the future, to make decisions about their safety, to come to appointments, to do their jobs. Many have likely been wondering for years what's going on with them."

Emily Kulow, accessibility project coordinator for the Ohio Domestic Violence Network, said that it's likely some of the survivors who've suffered from severe head trauma and oxygen deprivation have been slipping through the cracks because their symptoms aren't well-understood.

For instance, someone who can never remember to show up for counseling at the right time or who is combative with a roommate might be seen as a troublemaker when she's really at the mercy of her brain injury, Kulow said.

"Regardless of why we're seeing these behaviors, we should be serving all the women who have survived domestic violence and a one-size-fits-all approach won't work."

Added Nemeth, "The survivors who have severe brain injury are likely some of those with the greatest unmet need."

In addition to memory problems and cognitive impairment, poor mental health may arise or be exacerbated by brain injury - and addressing the mental health needs of survivors is an ongoing challenge for agencies, Kulow said.

The researchers also authored another study, recently published in the Journal of Family Violence, that documents challenges that agency employees face when dealing with the complex mental health needs of survivors. The study authors, led by Ohio State Assistant Professor of Social Work Cecilia Mengo, call for care models tailored to survivors who have a mental health disability.

But they also recognize the challenges faced by advocates and survivors, particularly in areas where residents have poor access to counselors, psychologists and psychiatrists.

"It's not that they don't recognize the need for mental health services, but that need is difficult to meet in a state with inadequate mental health services," Ramirez said.

Added Kulow, "We also heard from programs that there's a lack of understanding of the more-serious mental health disorders that people have, such as bipolar disorder."

Credit: 
Ohio State University

The secret of mushroom colors

The fly agaric with its red hat is perhaps the most evocative of the diverse and variously colored mushroom species. Hitherto, the purpose of these colors was shrouded in mystery. Researchers at the Technical University of Munich (TUM), in collaboration with the Bavarian Forest National Park, have now put together the first pieces of this puzzle.

In nature, specific colors and patterns normally serve a purpose: The eye-catching patterns of the fire salamander convey to its enemies that it is poisonous. Red cherries presumably attract birds that eat them and thus disperse their seed. Other animals such as chameleons use camouflage coloring to protect themselves from discovery by predators.

But climate also plays a role in coloration: Insects and reptiles, in particular, tend to be darker in colder climates. Cold-blooded animals rely on the ambient temperature to regulate their body temperature, and dark coloration allows them to absorb heat faster. The same mechanism could also play a role in fungi, as suspected by the research team of Franz Krah, who wrote his doctoral thesis on the topic at TUM, and Dr. Claus Bässler, a mycologist at TUM and researcher at the Bavarian Forest National Park. Mushrooms, too, might benefit from solar energy to improve their reproduction.

Distribution of 3054 fungus species studied

To test their theory, the researchers combed through vast volumes of data. They investigated the distribution of 3054 species of fungi throughout Europe. In the process, they analyzed the lightness of their coloration and the prevailing climatic conditions in the respective habitats. The results showed a clear correlation: Fungal communities have darker mushrooms in cold climates. The scientists also accounted for seasonal changes. They discovered that fungal communities that decompose dead plant constituents are darker in spring and autumn than in summer.
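At its core, the analysis described above relates the color lightness of a fungal community to the climate of its habitat, which is a correlation across sites. A toy version with made-up numbers (the study's actual analysis covered 3054 species and controlled for many more factors):

```python
# Hypothetical sketch: mean community color lightness (0 = black,
# 1 = white) against mean annual habitat temperature (degrees C).
temps = [-2.0, 1.5, 4.0, 7.5, 10.0, 14.0]
lightness = [0.30, 0.35, 0.42, 0.50, 0.55, 0.62]  # invented values

def pearson_r(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

r = pearson_r(temps, lightness)
print(f"r = {r:.2f}")  # positive: warmer climates, lighter communities
```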

"Of course, this is just the beginning," explains Krah. "It will take much more research before we develop a comprehensive understanding of mushroom colors." For example, further seasonal coloring effects cannot be detected in fungi that live in symbiosis with trees. "Here, other coloration functions, such as camouflage, also play a role." The researchers also need to study the degree to which dark coloration influences the reproductive rate of fungi.

Credit: 
Technical University of Munich (TUM)

Remote but remarkable: Illuminating the smallest inhabitants of the largest ocean desert

image: The South Pacific Gyre is the largest ocean gyre, covering 37 million km2.

Image: 
Tim Ferdelman / Max Planck Institute for Marine Microbiology

The middle of the South Pacific is as far away from land as you can possibly get. Solar irradiance is dangerously high, reaching a UV index that is labelled 'extreme'. There are no dust particles or inflows from the land, and as a result these waters have extremely low nutrient concentrations and are thus termed 'ultraoligotrophic'. Chlorophyll-containing phytoplankton (minute algae) are found only at depths greater than a hundred meters, making surface South Pacific waters the clearest in the world. Due to its remoteness and enormous size - the South Pacific Gyre covers 37 million km2, while the US covers less than 10 million km2 - it is also one of the least studied regions on our planet.

Despite its remoteness, both satellite and in situ measurements indicate that the microorganisms living in the waters of the South Pacific Gyre (SPG) contribute significantly to global biogeochemical cycles. Thus, the scientists from Bremen were interested in discovering which microbes are living and active in this ocean desert. During a six-week research cruise on the German research vessel FS Sonne, organized and led by the Max Planck Institute for Marine Microbiology, Greta Reintjes, Bernhard Fuchs and Tim Ferdelman collected hundreds of samples along a 7000-kilometre track through the South Pacific Gyre from Chile to New Zealand. The scientists sampled the microbial community at 15 stations in water depths from 20 to more than 5000 metres, that is, from the surface all the way down to the seafloor.

Low cell numbers and unexpected distributions

"To our surprise, we found about a third less cells in South Pacific surface waters compared to ocean gyres in the Atlantic", Bernhard Fuchs reports. "It was probably the lowest cell numbers ever measured in oceanic surface waters." The species of microbes were mostly familiar: "We found similar microbial groups in the SPG as in other nutrient-poor ocean regions, such as Prochlorococcus, SAR11, SAR86 and SAR116", Fuchs continues. But there was also a surprise guest amongst the dominant groups in the well-lit surface waters: AEGEAN-169, an organism that was previously only reported in deeper waters.

Reintjes and her colleagues discovered a pronounced vertical distribution pattern of microorganisms in the SPG. "The community composition changed strongly with depth, which was directly linked to the availability of light", Reintjes reports. Surprisingly, the dominant photosynthetic organism, Prochlorococcus, was present in rather low numbers in the uppermost waters and more frequent at 100 to 150 meters water depth. The new player in the game, AEGEAN-169, however, was particularly numerous in the surface waters of the central gyre. "This indicates an interesting potential adaptation to ultraoligotrophic waters and high solar irradiance", Reintjes points out. "It is definitely something we will investigate further." AEGEAN-169 has so far only been reported in water depths around 500 metres. "It is likely that there are multiple ecological species within this group and we will carry out further metagenomic studies to examine their importance in the most oligotrophic waters of the SPG."

Methodological milestone

The current research was only possible thanks to a newly developed method that enabled the scientists to analyse samples right after collection. "We developed a novel on-board analysis pipeline", Reintjes explains, "which delivers information on bacterial identity only 35 hours after sampling." Usually, these analyses take many months, collecting the samples, bringing them home to the lab and analysing them there. This pipeline combines next-generation sequencing with fluorescence in situ hybridisation and automated cell enumeration. "The outcome of our method developments is a readily applicable system for an efficient, cost-effective, field-based, comprehensive microbial community analysis", Reintjes points out. "It allows microbial ecologists to perform more targeted sampling, thereby furthering our understanding of the diversity and metabolic capabilities of key microorganisms."

Credit: 
Max Planck Institute for Marine Microbiology

Antibiotics weaken flu defenses in the lung

image: Scanning electron micrograph of epithelial cells lining the inside of the lung. Different cell types with different shapes and different functions compose this inner lining, and the right balance between these cell types is important for a healthy lung.

Image: 
Andreas Wack, Francis Crick Institute

Antibiotics can leave the lung vulnerable to flu viruses, leading to significantly worse infections and symptoms, finds a new study in mice led by the Francis Crick Institute.

The research, published in Cell Reports, discovered that signals from gut bacteria help to maintain a first line of defence in the lining of the lung. When mice with healthy gut bacteria were infected with the flu, around 80% of them survived. However, only a third survived if they were given antibiotics before being infected.

"We found that antibiotics can wipe out early flu resistance, adding further evidence that they should not be taken or prescribed lightly," explains Dr Andreas Wack, who led the research at the Francis Crick Institute. "Inappropriate use not only promotes antibiotic resistance and kills helpful gut bacteria, but may also leave us more vulnerable to viruses. This could be relevant not only in humans but also livestock animals, as many farms around the world use antibiotics prophylactically. Further research in these environments is urgently needed to see whether this makes them more susceptible to viral infections."

The study found that type I interferon signalling, which is known to regulate immune responses, was key to early defence. Among the genes switched on by interferon is a mouse gene, Mx1, which is the equivalent of the human MxA gene. This antiviral gene produces proteins that can interfere with influenza virus replication. Although often studied in immune cells, the researchers found that microbiota-driven interferon signals also keep antiviral genes in the lung lining active, preventing the virus from gaining a foothold.

"We were surprised to discover that the cells lining the lung, rather than immune cells, were responsible for early flu resistance induced by microbiota," says Andreas. "Previous studies have focused on immune cells, but we found that the lining cells are more important for the crucial early stages of infection. They are the only place that the virus can multiply, so they are the key battleground in the fight against flu. Gut bacteria send a signal that keeps the cells lining the lung prepared, preventing the virus from multiplying so quickly.

"It takes around two days for immune cells to mount a response, in which time the virus is multiplying in the lung lining. Two days after infection, antibiotic-treated mice had five times more virus in their lungs. To face this bigger threat, the immune response is much stronger and more damaging, leading to more severe symptoms and worse outcomes."

To test whether the protective effect was related to gut bacteria rather than local processes in the lung, the researchers treated mice with antibiotics and then repopulated their gut bacteria through faecal transplant. This restored interferon signalling and associated flu resistance, suggesting that gut bacteria play a crucial role in maintaining defences.

"Taken together, our findings show that gut bacteria help to keep non-immune cells elsewhere in the body prepared for attack," says Andreas. "They are better protected from flu because antiviral genes are already switched on when the virus arrives. So when the virus infects a prepared organism, it has almost lost before the battle starts. By contrast, without gut bacteria, the antiviral genes won't come on until the immune response kicks in. This is sometimes too late as the virus has already multiplied many times, so a massive, damaging immune response is inevitable."

Credit: 
The Francis Crick Institute