Culture

Family experience influences diabetes risk, management for African Americans

image: Researchers asked families about the significance of food to understand how it influences prevention and management of Type 2 diabetes.

Image: 
Iowa State University

AMES, Iowa - African American families not only share a higher risk for Type 2 diabetes; they also often pass myths and misconceptions about the disease from one generation to the next.

To understand how family experiences influence risk and management of the disease, a team of Iowa State University researchers interviewed parents and adult children of 20 African American families with strong histories of Type 2 diabetes. The interviews focused on diagnosis, physical activity and nutrition, resources for managing the disease, and family support and communication, and they allowed researchers to dig deeper into family dynamics.

"We wanted to get into the meanings and nuances within the family culture to identify some malleable targets we can work with," said Tera Jordan, an ISU associate professor of human development and family studies who conducted the interviews. "By understanding the meanings people have around food and nutrition, we might find ways they can make change."

The study, published in the journal Global Qualitative Nursing Research, identified two primary themes - family interactions and intergenerational openness - that shaped what younger generations knew and did about the disease. Brianna Routh, lead author and assistant professor at Montana State University, was part of the research team while a graduate student at Iowa State. She says the results provide insight that may help doctors and nurses counsel African American patients.

"Some family interactions could directly inform an individual's behaviors and ability to manage their diabetes, so it is important for medical professionals to ask questions to better understand how these interactions might support or hinder the individual's health goals," Routh said.

For example, many in the African American community know there is a genetic component, but are less aware of how physical activity and nutrition can prevent or mitigate effects of the disease, Jordan said.

Significance of food and family

Previous studies have shown family history - regardless of race, age or income - is a strong predictor of obesity and Type 2 diabetes. In their paper, ISU researchers explained that cultural acceptance of larger body size and optimistic bias may limit African Americans' perceived risk for the disease. Some are also skeptical of the medical system, which can have negative consequences.

As with many cultures, food plays a central role in African American families. In the interviews, Jordan says several parents and children talked about special foods prepared for holidays and family meals. Parents explained that they made the dishes for their children, but the children often did not realize the health consequences for their parents. For some families, the interviews were the first time they had talked about these issues.

"I really saw the process unfolding of sharing information between the two generations, but also raising awareness about misconceptions, such as how we treated diabetes a generation ago is not the way we're treating it now. Families must be diligent and open to getting updated and new information," Jordan said.

Lessons for families, medical professionals

Some parents openly talked about diabetes with their children and modeled healthy behaviors, but the study revealed a need for greater communication within families as well as with medical professionals. The researchers noted gender differences - men were less likely to discuss their diagnosis - as well as concerns from parents reluctant to talk about it because they did not want to worry their children.

"Too often we heard about someone who had a major medical event, unfortunately, may have lost their life, simply because they were not sharing information with loved ones," Jordan said.

The researchers offer the following recommendations to improve communication and education about Type 2 diabetes:

Open communication: To provide support, family members need to discuss the diagnosis and understand the course for management. Designate a point person who can attend doctor's appointments and make sure medication and management plans are followed.

Notify a coworker or manager: A medical emergency can happen at any time. Tell someone at work that you have Type 2 diabetes and how they should respond if your blood sugar drops and you are unresponsive.

Recognize potential barriers: Doctors and nurses need to make sure patients understand their diagnosis and feel comfortable asking questions. Awareness of cultural and family differences can help break down the barriers to successful management.

"I'd tell health care providers to do anything they can to help facilitate communication and build trust, recognizing they might not have done anything personally to violate that trust, but it's systemic, it's cultural," Jordan said. "We've overcome a lot of things culturally and just recognize that it is probably in the room."

Credit: 
Iowa State University

Disrupting immune cell behavior may contribute to heart disease and failure, study shows

image: Colorized photomicrograph showing a macrophage (orange cell with large red nucleus in center of image) interacting with a fibroblast (blue cell with green nucleus at bottom right) in a mouse heart. Johns Hopkins Medicine researchers have shown that a protein, interleukin-17A, can prevent critical monocyte cells from differentiating into macrophages that protect cardiac muscle from inflammation and, in turn, lessen the threat of heart failure.

Image: 
X. Hou and D. Cihakova, Department of Pathology, Johns Hopkins University School of Medicine

On an ice hockey team, the players all start off with identical uniforms, skates and a stick. But if you take one of them, add padding, a glove, and a mask, and switch the stick to one with a larger blade, you get a goalie. Now, the player has morphed -- or differentiated -- into one with a specific function: protect the goal from invading pucks.

Differentiation of cells in the human immune system is critical to give them the ability to perform specific tasks that, like a biological goaltender, help protect the body from foreign invaders. A new study, led by researchers at Johns Hopkins Medicine, provides evidence that when circulating anti-inflammatory white blood cells known as monocytes fail to properly differentiate into macrophages -- the cells that engulf and digest cellular debris, bacteria and viruses -- certain forms of heart disease may result.

The research, reported in a recent issue of the journal Cell Reports, shows the presence of a specific protein prevents this monocyte-to-macrophage transition from occurring in the heart. This triggers a cascade of events that can cause heart muscle inflammation, or myocarditis; remodeling of the cardiac muscle structure; enlargement of the heart, or dilated cardiomyopathy; and weakening of the organ's ability to pump blood. Eventually, this can result in heart failure.

"The good news, also shown by our study, is that blocking a key protein, known as interleukin-17A or IL-17A, permits the differentiation of anti-inflammatory monocytes, promotes healthy cardiac function, and allows the newly created macrophages to protect, rather than attack, cardiac muscle," says Daniela Cihakova, M.D., Ph.D., associate professor of pathology at the Johns Hopkins University School of Medicine and the senior author of the paper.

In previous live mouse and "test-tube" laboratory studies, Johns Hopkins Medicine researchers determined that IL-17A stimulates spindle-shaped cardiac cells called fibroblasts to release a mediator that causes one type of monocyte, an inflammatory cell known as Ly6Chi, to accumulate in greater numbers in the heart than the anti-inflammatory type known as Ly6Clo. Fibroblasts help direct and maintain the structural, biochemical, mechanical and electrical properties of the myocardium, the heart muscle whose cells beat continuously from before birth until death. Fibroblasts also are involved in replacing these cells with scar tissue once they are damaged by disease or a heart attack.

The researchers found that another function of fibroblasts -- mediating the immune response during myocarditis -- is affected by IL-17A in a way that leads to cardiac fibrosis (the overproduction of collagen, a fibrous tissue normally responsible for wound repair, which stiffens cardiac muscle and decreases its ability to pump blood) and eventually, dilated cardiomyopathy. However, the mechanism behind the disruption remained unclear.

To figure it out, the researchers conducted experiments using a mouse model of human myocarditis. Known as experimental autoimmune myocarditis, or EAM, it is created by immunizing mice with cardiac myosin, a protein that normally regulates heart muscle contraction, but in this case, triggers the immune system to attack the cardiac muscle cells and produce inflammation.

First, Cihakova and her colleagues wanted to determine the source of the macrophages seen in the hearts of mice during myocarditis: whether they arose from circulating Ly6Clo and Ly6Chi monocytes from the bone marrow that differentiated into macrophages in the heart, or from mature macrophages already residing there. They achieved this by surgically joining two mice -- each with a different immune cell biomarker -- so that they shared a single circulatory system. EAM was induced in one of the conjoined pair and then the macrophages in the hearts of both were examined after myocarditis set in. Macrophages from both joined mice were found in the heart of the one with EAM, indicating that they were circulating cells and not in place before the surgery.

Next, the researchers tried to learn if cardiac fibroblasts directly influence the differentiation of Ly6Clo and Ly6Chi monocytes to macrophages. They isolated both types of monocytes from the spleen of one mouse and fibroblasts from the heart of another, then grew the cells together in a culture dish. After 40 hours, no macrophages arose from the Ly6Clo monocytes but nearly all of the Ly6Chi monocytes developed into macrophages. It took 160 hours for one-third of the Ly6Clo monocytes to differentiate into macrophages as well. According to Cihakova, this indicated that the fibroblasts were promoting the differentiation of both monocyte types.

Based on their previous studies with IL-17A, the researchers focused further on this protein as the potential catalyst for fibroblast-directed disruption of monocyte-to-macrophage conversion. Returning to a test tube experiment, they grew both types of monocytes with either cardiac fibroblasts alone or cardiac fibroblasts that had been stimulated with IL-17A. After nearly a week, Ly6Clo monocyte differentiation was completely shut down while there was little effect on Ly6Chi monocyte maturation into macrophages.

The researchers then verified their findings in living animals by separately injecting Ly6Clo and Ly6Chi monocytes into the hearts of mice with EAM. Previously, they had found that EAM results in high levels of IL-17A in the heart. What they observed in this test was the exact same pattern of differentiation as in the test tube trial: none for Ly6Clo monocytes and little impact on Ly6Chi cells.

However, Ly6Clo differentiation to macrophages did occur in the hearts of IL-17A "knockout" mice, a strain bred without the gene to manufacture the interleukin protein.

Taking their investigation of the IL-17A/fibroblast link to the disruption of differentiation one step further, Cihakova and her colleagues focused on a finding from an earlier study of IL-17A.

"We knew that cardiac fibroblasts stimulated by IL-17A are potent producers of a protein, granulocyte-macrophage colony-stimulating factor, or GM-CSF, that is a cytokine, a molecule that evokes an immune response and inflammation in tissues," says Xuezhou (Snow) Hou, Ph.D., the lead author of the Cell Reports paper and formerly at the Johns Hopkins Bloomberg School of Public Health.

"So, thinking that GM-CSF might be the key to why differentiation is disrupted, we added antibodies against GM-CSF to a mix of cardiac fibroblasts, IL-17A and Ly6Clo and found that we could counter IL-17A's influence on the fibroblasts, and in turn, restore normal Ly6Clo monocyte-to-macrophage differentiation," she said.

Cihakova says another problem arises when IL-17A-stimulated fibroblasts also affect the macrophages that arise from Ly6Chi monocytes.

"We learned that IL-17A signaling through the fibroblasts increases the shedding of receptors from the surface of Ly6Chi-derived macrophages in the heart," Cihakova explains. "These receptors produce an enzyme, myeloid-epithelial-reproductive tyrosine kinase, or MerTK, that helps direct the macrophages to engulf dead heart cells and remove them. If this doesn't happen effectively -- as when IL-17A leads to a decreased amount of MerTK -- the chemical contents of uncleared dead cells can spill into the heart and inflame the cardiac muscle."

The researchers also showed that MerTK levels were significantly lower in cardiac biopsies of human myocarditis patients when compared with heart tissue in persons with myocardial ischemia, a condition that occurs when blood flow to the heart is reduced.

"This is important because a measure of the MerTK level in circulating blood might one day be used as a diagnostic biomarker to distinguish between myocarditis and ischemia since the two exhibit very similar symptoms," Hou says.

Cihakova and Hou caution that further studies and, eventually, clinical trials will be needed before IL-17A or GM-CSF could be considered as potential therapeutic targets for drugs that can prevent myocarditis.

Credit: 
Johns Hopkins Medicine

Seeing clearly: Revised computer code accurately models an instability in fusion plasmas

image: PPPL physicist Mario Podestà, one of the scientists who contributed to new research involving the sawtooth instability in fusion plasma.

Image: 
Elle Starkman

Subatomic particles zip around ring-shaped fusion machines known as tokamaks and sometimes merge, releasing large amounts of energy. But these particles -- a soup of charged electrons and atomic nuclei, or ions, collectively known as plasma -- can sometimes leak out of the magnetic fields that confine them inside tokamaks. The leakage cools the plasma, reducing the efficiency of the fusion reactions and damaging the machine. Now, physicists have confirmed that an updated computer code could help to predict and ultimately prevent such leaks from happening.

The research team updated TRANSP, the plasma simulation code developed at the U.S. Department of Energy's (DOE) Princeton Plasma Physics Laboratory (PPPL) and used in fusion research centers around the world, by installing a new bit of code known as a kick model into one of the TRANSP components. The kick model -- so called because it simulates jolts of energy that kick the particles within the plasma -- allows TRANSP to simulate particle behavior more accurately than before. Aided by subprograms known as NUBEAM and ORBIT that model plasma behavior by distilling information from raw data, this updated version of TRANSP could help physicists better understand and predict the leaks, as well as create engineering solutions to minimize them.

Fusion, the power that drives the sun and stars, is the fusing of light elements in the form of plasma -- the hot, charged state of matter composed of free electrons and atomic nuclei -- that generates massive amounts of energy. Scientists are seeking to replicate fusion on Earth for a virtually inexhaustible supply of power to generate electricity.

The team found that the updated version of TRANSP accurately modeled the effect of the sawtooth instability -- a kind of disturbance affecting the fusion reactions -- on the movement of highly energetic particles that help cause fusion reactions. "These results are important because they may allow physicists to use the same approach to deal with a broad spectrum of instabilities without switching from one model to another depending on the specific problem," said PPPL physicist Mario Podestà, a coauthor of the paper that reported the findings in Nuclear Fusion. The results, based on sawtooth instabilities that occurred during operation of PPPL's National Spherical Torus Experiment-Upgrade (NSTX-U) in 2016, extend previous PPPL research into putting kick models into TRANSP.

The updated version of TRANSP can simulate plasma behavior of experiments that have not been conducted yet, Podestà said. "Because we understand the physics built into the kick model, and because that model successfully simulated results from past experiments for which we have data, we have confidence that the kick model can accurately model future experiments," he said.

In the future, the researchers want to determine what happens between instabilities to get a fuller sense of what's occurring in the plasma. In the meantime, Podestà and the other scientists are encouraged by the current results. "We now see a path forward to improving the ways that we can simulate certain mechanisms that disturb plasma particles," Podestà said. "This brings us closer to reliable and quantitative predictions for the performance of future fusion reactors."

Credit: 
DOE/Princeton Plasma Physics Laboratory

Newly identified meningeal lymphatic vessels answer key questions about brain clearance

image: The waste of the brain is drained by the CSF and exits the brain through the mLVs at the skull base (basal mLVs). The basal mLVs are connected to the lymphatic system through a hole in the skull (skull foramen) and they have abundant lymphatic vessel branches with finger-like protrusions. Valves within the vessel structure of the basal mLVs allow the lymph to flow in one direction. In particular, the basal mLVs are anatomically located in close proximity to the CSF and have structures favoring the absorption and drainage of the CSF. The CSF is cleared outside the central nervous system into the deep cervical lymph nodes after drainage from the basal mLVs. Figure B is a schematic diagram showing that the basal mLVs undergo a severe deformation process and that their functionality is impaired with age.

Image: 
IBS

Just see what happens when your neighborhood's waste disposal system is out of service. Not only do the piles of trash stink, but they can hinder the area's normal functioning. That is also the case when the brain's waste management is on the blink. The buildup of toxic proteins in the brain causes massive damage to the nerves, leading to cognitive dysfunction and an increased probability of developing neurodegenerative disorders such as Alzheimer's disease. Though the brain drains its waste via the cerebrospinal fluid (CSF), the precise route of this cleansing mechanism has remained poorly understood.

Scientists led by Dr. Gou Young Koh at the Center for Vascular Research within the Institute for Basic Science (IBS) at the Korea Advanced Institute of Science and Technology (KAIST), Daejeon, South Korea, have reported the basal side of the skull as the major route, or "hotspot," for CSF drainage. They found that basal meningeal lymphatic vessels (mLVs) function as the main plumbing pipes for CSF. They confirmed that macromolecules in the CSF mainly run through the basal mLVs. Notably, the team also revealed that this major drainage system, the basal mLVs, becomes impaired with aging. Their findings were reported in the journal Nature on July 24.

Throughout our body, excess fluids and waste products are removed from tissues via lymphatic vessels. It was only recently discovered that the brain also has a lymphatic drainage system. mLVs are thought to carry waste from the brain tissue fluid and the CSF down to the deep cervical lymph nodes for disposal. Still, scientists are left with one perplexing question -- where is the main exit for the CSF? Though mLVs in the upper part of the skull (dorsal meningeal lymphatic vessels) were reported as the brain's clearance pathways in 2014, no substantial drainage mechanism was observed in that section.

"As a hidden exit for CSF, we looked into the mLVs trapped within complex structures at the base of the skull," says Dr. Ji Hoon Ahn, one of the first authors of this study. The researchers used several techniques to characterize the basal mLVs in detail. They used a genetically engineered lymphatic-reporter mouse model to visualize mLVs under a fluorescence microscope. By carefully examining the mouse skull, they found distinctive features of basal mLVs that make them suitable for CSF uptake and drainage. Just like typical functional lymphatic vessels, basal mLVs were found to have abundant lymphatic vessel branches with finger-like protrusions. Additionally, valves inside the basal mLVs allow the flow to go in one direction. In particular, they found that the basal mLVs are located close to the CSF. Dr. Hyunsoo Cho, a co-first author of this study, explains, "All up, it seemed a solid case that basal mLVs are the brain's main clearance pathways."

The researchers verified that these specialized morphologic characteristics of basal mLVs indeed facilitate CSF uptake and drainage. Using CSF contrast-enhanced magnetic resonance imaging in a rat model, they found that CSF is drained preferentially through the basal mLVs. They also utilized a lymphatic-reporter mouse model and discovered that a fluorescence-tagged tracer injected into the brain itself or into the CSF is cleared mainly through the basal mLVs. Jun-Hee Kim, a co-first author of this study, notes, "We literally saw the brain clearance mechanism utilizing the basal outflow route to exit the skull."

It has long been suggested that CSF turnover and drainage decline with ageing. However, how mLVs change with ageing is poorly understood. In this study, the researchers observed changes in the mLVs of young (3-month-old) and aged (24- to 27-month-old) mice. They found that the structure of the basal mLVs and their lymphatic valves in aged mice becomes severely flawed, hampering CSF clearance. The corresponding author of this study, Dr. Koh, says, "By characterizing the precise route for fluids leaving the brain, this study improves our understanding of how waste is cleared from the brain. Our findings also provide further insights into the role of impaired CSF clearance in the development of age-related neurodegenerative diseases."

Many current therapies for Alzheimer's disease target abnormally accumulated proteins, such as beta-amyloid. By mapping out a precise route for the brain's waste clearance system, this study may help find ways to improve the brain's cleansing function. Such a breakthrough could offer a powerful strategy for eliminating the buildup of aging-related toxic proteins. "It definitely warrants more extensive investigation of mLVs in patients with age-related neurodegenerative diseases such as Alzheimer's disease prior to clinical investigation," adds Dr. Koh.

Credit: 
Institute for Basic Science

How random tweaks in timing can lead to new game theory strategies

Most game theory models don't reflect the relentlessly random timing of the real world. In some models, players may receive information at the same time, and they act simultaneously. Others may include randomness in terms of sharing information or acting, but that randomness occurs at discrete steps.

But that's not the way the world works, notes economist Justin Grana at the RAND Corporation in Washington, D.C., a former postdoctoral scholar at the Santa Fe Institute (SFI). Competing companies make decisions based on when they receive information, as well as what that information is. Timing can make or break a decision, and randomness evolves continuously, not step-wise.

"We don't know when things will happen," says Grana. "The environment changes according to some kind of random processes, and we have to act at random times. You might get information, and that information causes you to act - or it might cause delays in what you do."

In a new paper in the Berkeley Electronic Journal of Theoretical Economics, Grana and his collaborators investigate game theory models that address what happens when players receive information or act at random times, determined as part of a continuous time evolution.

"We wanted to introduce the uncertainty of timing in these scenarios," says Grana. He developed the models with physicist David Wolpert of SFI and economist James Bono.

In the new work, the researchers looked at models of Bertrand competition -- scenarios that predict how consumers will respond when sellers set the price of a good. They wanted to know under what circumstances random time fluctuations could lead to collusion, a specific kind of cooperation in which two parties may share information if both benefit.

Imagine two gas stations, for example, facing each other at the same remote exit off a lonely interstate. The owners buy gas at the same price. Knowing that customers will choose the cheaper option, one owner may lower the price, prompting her competitor to lower his price, and so on until neither station can make much of a profit. In the interest of keeping the businesses alive, the two may instead decide to keep prices high and share the customers equally.

The model could help identify, for example, at what rate the station owners would need to have new information about demand in order to sustain this collusive structure, says Grana. It could predict how fluctuation in that timing could influence the strategic decisions of the players involved.
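
To make the flavor of such models concrete, here is a minimal toy sketch (emphatically not the authors' model; every parameter and rule below is invented for illustration) of Bertrand-style undercutting in which each seller can only revise its price at random, exponentially distributed times. Prices erode toward marginal cost, as in the classic story, but how quickly depends on how often those random opportunities arrive.

```python
import random

# Toy illustration of Bertrand-style undercutting with random decision timing.
# Not the model from the paper; parameters and rules are invented.

COST = 1.00          # marginal cost per unit
START_PRICE = 1.50   # both stations start at the same high price
UNDERCUT = 0.02      # how much a station undercuts its rival
RATE = 0.5           # average number of price reviews per day, per station
HORIZON = 60.0       # days to simulate

def simulate(seed=0):
    rng = random.Random(seed)
    prices = [START_PRICE, START_PRICE]
    t = 0.0
    while True:
        # Waiting time to the next review by either station
        # (two superposed Poisson processes, combined rate 2 * RATE).
        t += rng.expovariate(2 * RATE)
        if t > HORIZON:
            break
        station = rng.randrange(2)      # which station acts this time
        rival = 1 - station
        # Undercut the rival unless that would drop the price below cost.
        prices[station] = max(COST, prices[rival] - UNDERCUT)
    return prices

if __name__ == "__main__":
    print("final prices:", [round(p, 2) for p in simulate()])
```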

The model is part of an emerging research interest in a variety of fields -- ranging from economics to engineering to air traffic control -- that focuses on how asynchronous events can influence game theory strategies. Although it's too early to see if real-world data lines up with the predictions of such an abstract model, Grana says this exploratory work suggests that small tweaks in timing can make a big difference in decision-making.

"Those changes are rich enough to show that it's worthwhile to explore loosening our assumptions about timing," he says.

Credit: 
Santa Fe Institute

Found: Fastest eclipsing binary, a valuable target for gravitational wave studies

video: Artist's animation depicting the eclipsing binary ZTF J1539+5027, which is composed of two extremely dense objects (white dwarfs) that orbit each other roughly every seven minutes. One second of time in the animation represents two minutes of real time. The smaller white dwarf is slightly larger than Earth and is the more massive of the two, with about 60% the mass of the sun. Its companion is larger but less massive, with only about 20% the mass of the sun. The orbital separation of these objects is shrinking by about 26 centimeters per day due to the emission of gravitational waves, depicted in green near the end of the movie.

Image: 
Caltech/IPAC

Observations made with a new instrument developed for use at the 2.1-meter (84-inch) telescope at the National Science Foundation’s Kitt Peak National Observatory have led to the discovery of the fastest eclipsing white dwarf binary yet known. Clocking in with an orbital period of only 6.91 minutes, the rapidly orbiting stars are expected to be one of the strongest sources of gravitational waves detectable with LISA, the future space-based gravitational wave detector.

The Dense "Afterlives" of Stars

After expanding into a red giant at the end of its life, a star like the Sun will eventually evolve into a dense white dwarf, an object with a mass like that of the Sun squashed down to a size comparable to Earth. Similarly, as binary stars evolve, they can engulf their companion in the red giant phase and spiral close together, eventually leaving behind a close white dwarf binary. White dwarf binaries with very tight orbits are expected to be strong sources of gravitational wave radiation. Although anticipated to be relatively common, such systems have proven elusive, with only a few identified to date.

Record-setting White Dwarf Binary

A new survey of the night sky, currently underway at Palomar Observatory and Kitt Peak National Observatory, is changing this situation.

Each night, Caltech’s Zwicky Transient Facility (ZTF), a survey that uses the 48-inch telescope at Palomar Observatory, scans the sky for objects that move, blink, or otherwise vary in brightness. Promising candidates are followed up with a new instrument, the Kitt Peak 84-inch Electron Multiplying Demonstrator (KPED), at the Kitt Peak 2.1-meter telescope to identify short period eclipsing binaries. KPED is designed to measure with speed and sensitivity the changing brightness of celestial sources.

This approach has led to the discovery of ZTF J1539+5027 (or J1539 for short), a white dwarf eclipsing binary with the shortest period known to date, a mere 6.91 minutes. The stars orbit so close together that the entire system could fit within the diameter of the planet Saturn.

"As the dimmer star passes in front of the brighter one, it blocks most of the light, resulting in the seven-minute blinking pattern we see in the ZTF data," explains Caltech graduate student Kevin Burdge, lead author of the paper reporting the discovery, which appears in today's issue of the journal Nature.

A Strong Source of Gravitational Waves

Closely orbiting white dwarfs are predicted to spiral together closer and faster, as the system loses energy by emitting gravitational waves. J1539's orbit is so tight that its orbital period is predicted to become measurably shorter after only a few years. Burdge's team was able to confirm the prediction from general relativity of a shrinking orbit, by comparing their new results with archival data acquired over the past ten years.

J1539 is a rare gem. It is one of only a few known sources of gravitational waves--ripples in space and time--that will be detected by the future European space mission LISA (Laser Interferometer Space Antenna), which is expected to launch in 2034. LISA, in which NASA plays a role, will be similar to the National Science Foundation's ground-based LIGO (Laser Interferometer Gravitational-wave Observatory), which made history in 2015 by making the first direct detection of gravitational waves from a pair of colliding black holes. LISA will detect gravitational waves from space at lower frequencies. J1539 is well matched to LISA; the 4.8 mHz gravitational wave frequency of J1539 is close to the peak of LISA's sensitivity.
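
As a rough back-of-the-envelope check (not a calculation taken from the paper), the quoted frequency follows directly from the orbital period: for a circular binary, the dominant gravitational wave signal is emitted at twice the orbital frequency.

```python
# Back-of-the-envelope check: the dominant gravitational-wave frequency of a
# circular binary is twice its orbital frequency.
P_orb = 6.91 * 60              # orbital period in seconds
f_gw = 2.0 / P_orb             # gravitational-wave frequency in Hz
print(f"{f_gw * 1e3:.1f} mHz")  # about 4.8 mHz, near the peak of LISA's sensitivity
```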

Discoveries Continue for Historic Telescope

Kitt Peak’s 2.1-meter telescope, the second major telescope to be constructed at the site, has been in continuous operation since 1964. Its history includes many important discoveries in astrophysics, such as the Lyman-alpha forest in quasar spectra, the first gravitational lens by a galaxy, the first pulsating white dwarf, and the first comprehensive study of the binary frequency of stars like the Sun. The latest result continues its venerable track record.

Lori Allen, Director of Kitt Peak National Observatory and Acting Director of NOAO says, “We’re thrilled to see that our 2.1-meter telescope, now more than 50 years old, remains a powerful platform for discovery.”

“These wonderful observations are further proof that cutting-edge science can be done on modest-sized telescopes like the 2.1-meter in the modern era,” adds Chris Davis, NSF Program Officer for NOAO.

More Thrills Ahead!

As remarkable as it is, J1539 was discovered with only a small portion of the data expected from ZTF. It was found in the ZTF team's initial analysis of 10 million sources, whereas the project will eventually study more than a billion stars.

"Only months after coming online, ZTF astronomers have detected white dwarfs orbiting each other at a record pace," says NSF Assistant Director for Mathematical and Physical Sciences, Anne Kinney. "It's a discovery that will greatly improve our understanding of these systems, and it's a taste of surprises yet to come."

Credit: 
Association of Universities for Research in Astronomy (AURA)

A new framework to study congenital heart defects

image: Gladstone scientists Deepak Srivastava (left), Yvanka De Soysa (center), and Casey Gifford (right) publish a complete catalog of the cells involved in heart development.

Image: 
Photo: Gladstone Institutes

Each year, 9 months of dreams and anticipation shared by millions of parents-to-be turn to despair and fright when they learn their child has been born with a birth defect, an often-devastating event affecting one out of 20 children born worldwide. The formation of our organs, limbs, and face is the result of carefully choreographed movement and behavior by millions of cells, much like dancers in a troupe. If even a few cells don't get to the right position and do their job correctly, the end result is a birth defect. Yet, how each individual cell knows what to do at precisely the right time and place has largely been a mystery.

In a new study published in the scientific journal Nature, a team of researchers at the Gladstone Institutes, in collaboration with the Luxembourg Centre for Systems Biomedicine (LCSB) of the University of Luxembourg, reveal for the first time the full spectrum of cells that come together to make a heart at the earliest stages of embryo formation. They also uncovered how the cells are controlled, and how a mutation in just one gene can have catastrophic consequences by affecting a tiny group of cells that make up the organ.

Congenital heart defects are the most common and most lethal type of human birth defect. Thanks to the advent of a powerful new technology known as single-cell RNA sequencing, the researchers were finally able to discern the role of tens of thousands of individual cells during the formation of the heart, which is essential to determine how genetic mutations cause disease.

"With genome sequencing, we can now more easily find genetic variants that we think are contributing to a disease," said Gladstone President and Senior Investigator Deepak Srivastava, MD, who led the study. "The big challenge is figuring out the specific cell type in which this variant is functioning and how those cells are impacted. This has been particularly difficult for birth defects, given that genetic variants affect only a small subset of the cells in the organ. With single-cell technologies, we can finally begin to unravel the mechanisms behind defects for which we know the genetic cause."

The catalog that Srivastava and his team compiled contains all the genes that are active during different stages of heart development and identifies the cells in which they can be found. It represents the first step in making the connection between a genetic variant and a specific cell type.

"This can tell us, among many other things, which subset of cells are performing critical functions in specific regions of the heart and which are contributing to the underlying cause of a disease associated with genetic mutations," explained Yvanka De Soysa, a graduate student in Srivastava's laboratory and first author of the study.

A Rich Source of Data on Heart Development

To complete the repository, the researchers studied nearly 40,000 individual cardiac cells from a mouse model of heart development. The technology that made this study possible is single-cell RNA sequencing. This sophisticated method, which has only been commercially available for the past 3 years in its current form, enabled the scientists to capture data about thousands of individual cells at once.

"This sequencing technique allowed us to see all the different types of cells present at various stages of heart development and helped us identify which genes are activated and suppressed along the way," said Casey A. Gifford, PhD, a staff scientist at Gladstone who is a senior author on the paper. "We were not only able to uncover the existence of unknown cell types, but we also gained a better understanding of the function and behavior of individual cells--information we could never access before."

Once they identified the numerous types of cells involved in heart development, the team wanted to learn how these diverse cell types are generated. To do so, they teamed up with computational biologists at the LCSB who specialize in using data from single-cell RNA sequencing to uncover the molecular drivers of different cell types.

"Our group has a long history of developing computational models to understand cell conversion," explained Antonio Del Sol, head of the Computational Biology group at the LCSB and Ikerbasque Research Professor at the Research Center CIC bioGUNE in Bilbao, Spain. "We have the expertise to study whole networks of genes that control cell identity. When we joined the project, we applied our method to predict--without any kind of prior knowledge--which molecular factors govern the fate of these different cardiac cells."

A Discovery 20 Years in the Making

The computational analysis predicted the genes involved in generating specific cell types in the heart, which shed light on those cells' function. The analysis also pointed to one major player, a gene called Hand2 that can control the activity of thousands of other genes, and which Srivastava discovered and named over two decades ago.

Then, as a young researcher, Srivastava spent years investigating the role of this gene and master regulator. He eventually found that it is one of the most important genes for the formation of the heart. But about 10 years ago, in trying to discover how this gene actually affects heart cells that make up the organ, his work reached an impasse because the scientific tools to pursue the research didn't exist. Today, his efforts have finally been revived thanks to new technology.

By applying single-cell RNA sequencing, he and his collaborators were able to get a much more detailed and complete picture of how the loss of Hand2 causes different cell populations to become dysregulated.

Mice lacking the gene Hand2 fail to form the right ventricle chamber, which pumps blood to the lungs. Surprisingly, the new prediction made by the Luxembourg researchers suggested that Hand2 is not required to instruct cells to become right ventricular cells, but that it is critical in forming the cells of the outflow tract, the structure where the major outgoing blood vessels of the heart arise.

"This didn't make sense based on previous findings," said De Soysa. "However, we found that, in fact, Hand2 has very distinct functions for different cell types."

The computational prediction turned out to be correct. The team discovered that hearts without the Hand2 gene never made cells of the outflow tract, but did make right ventricular cells. In the choreography of the heart, it is not enough for a cell to be made, it must also get to the right place relative to the other "dancers." Without Hand2, right ventricle cells were created but stuck at their origin, failing to move into the developing heart.

"Our collaborative findings made us change the way we think about heart formation, and showed how disruption of cell fate, migration, or survival of a few cells can cause a heart defect," De Soysa added.

A Hopeful Future for the Treatment of Congenital Heart Disease

The study has revealed the mechanisms by which relatively small populations of cells are affected during development and lead to defects in the formation of the heart. It also represents a discovery that would not have been possible without single-cell RNA sequencing technology.

"Single-cell technologies can inform us about how organs form in ways we couldn't understand before and can provide the underlying cause of disease associated with genetic variations," said Gifford. "We revealed subtle differences in very, very small subsets of cells that actually have catastrophic consequences and could easily have been overlooked in the past. This is the first step toward devising new therapies."

Significantly, the new catalog of cardiac cells can now serve scientists and physicians interested in various aspects of heart development. With knowledge of the types of cells involved in normal and abnormal formation of the heart, the scientific community can begin to design strategies to correct genetic variants that cause congenital heart disease.

These findings could also guide therapeutic approaches to help both newborns and the growing adult population with congenital heart disease.

"With surgical interventions, we've become very good at keeping most kids with heart defects alive," said Srivastava, who is also a pediatric cardiologist at UCSF Benioff Children's Hospital and professor of pediatrics at UC San Francisco. "The result is that we have nearly 2.5 million survivors of congenital heart disease in the United States today."

When children with a birth defect are fortunate enough to survive, the same genetic condition that caused the developmental problem can lead to ongoing difficulties with maintaining a healthy heart over a lifetime.

"We're beginning to see the long-term consequences in adults, and right now, we don't really have any way to treat them," Srivastava added. "My hope is that if we can understand the genetic causes and the cell types affected, we could potentially intervene soon after birth to prevent the worsening of their state over time."

For Srivastava, the holy grail would be to get such a clear picture of the mechanisms involved in causing congenital heart defects that they could develop preventive strategies for people who are genetically at risk.

"Folic acid being the best paradigm--expecting mothers now take higher levels of this vitamin and can successfully prevent nearly two thirds of cases of spina bifida," he said. "The ultimate goal is to create similar public health measures that could reduce the overall incidence of birth defects through prevention. But first, we have to know where and how to intervene."

Credit: 
Gladstone Institutes

Volcanoes shaped the climate before humankind

The volcanoes in the tropics went crazy between 1808 and 1835: Not only did Tambora erupt in Indonesia during this short period of time but there were also four other large eruptions. This unusual series of volcanic eruptions caused long-lasting droughts in Africa and contributed to the last advance of Alpine glaciers during the Little Ice Age.

"Frequent volcanic eruptions caused an actual gear shift in the global climate system," says Stefan Brönnimann, head of the international research team that discovered the effects of the series of eruptions on the oceans and thus on atmospheric circulation. Brönnimann is Professor of Climatology at the University of Bern and a member of the Oeschger Centre for Climate Research. Their research has been published in the journal Nature Geoscience.

Less rain in Africa and India, more rain and snow in Europe

For their investigations, the researchers analyzed new climate reconstructions that include atmospheric circulation and compared the results to observation-based data. Model simulations finally helped to pin down the role of the oceans in climate change in the early 19th century and showed that they could not recover from the effects of the sequence of eruptions for several decades. The consequences: the persistent weakening of the African and Indian monsoon systems and a shift of atmospheric circulation over the Atlantic-European sector. This led to an increase in low-pressure systems crossing Central Europe.

The last glacier advance in the Alps from the 1820s to the 1850s, depicted in paintings and even old photographs, is a consequence of increased precipitation due to the altered circulation in combination with low temperatures. However, global temperature increased again from the late 19th century onward. The Little Ice Age was eventually superseded by a first phase of global warming, culminating in the 1940s, with a significant manmade contribution.

Important for the definition of "pre-industrial climate"

The new Bern study not only explains the global early 19th century climate, but it is also relevant for the present. "Given the large climatic changes seen in the early 19th century, it is difficult to define a pre-industrial climate," explains lead author Stefan Brönnimann, "a notion to which all our climate targets refer." And this has consequences for the climate targets set by policymakers, who want to limit global temperature increases to between 1.5 and 2 degrees Celsius at the most. Depending on the reference period, the climate has already warmed up much more significantly than assumed in climate discussions. The reason: Today's climate is usually compared with an 1850-1900 reference period to quantify current warming. Seen in this light, the average global temperature has increased by 1 degree. "1850 to 1900 is certainly a good choice but compared to the first half of the 19th century, when it was significantly cooler due to frequent volcanic eruptions, the temperature increase is already around 1.2 degrees," Stefan Brönnimann points out.

Credit: 
University of Bern

Study reveals top tools for pinpointing genetic drivers of disease

image: Dr. Daniel Cameron and Professor Tony Papenfuss led the study.

Image: 
Walter and Eliza Hall Institute of Medical Research

Published in Nature Communications, the study is the largest of its kind and was led by Walter and Eliza Hall Institute computational biologists Professor Tony Papenfuss, Dr Daniel Cameron and Mr Leon Di Stefano.

The new study reveals the world's top genomic rearrangement detection tools, providing summaries of their performance and recommendations for use. Dr Cameron said the study could ultimately help clinicians determine the best treatments for their patients.

"Basically, you have to understand what is going wrong before you can work out how to fix the problem. In the context of cancer for instance, an understanding of the genetic mutations driving tumour growth could help oncologists determine the most appropriate treatment for their patients," he said.

To determine the best genomic rearrangement detection methods, the researchers comprehensively tested 12 of the most widely used tools to see which ones could accurately identify the differences between a patient's genetic information and the standard human reference genome. The findings revealed that a tool called GRIDSS, developed by Professor Papenfuss and Dr Cameron, was one of the best performing options, most accurately able to detect DNA rearrangements.
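
The release does not spell out the evaluation metrics, but benchmarking of this kind typically comes down to matching each tool's predicted breakpoints against a curated truth set within some positional tolerance and reporting precision and recall. The sketch below illustrates that idea only; the breakpoints, the 100 base-pair tolerance and the function names are invented, and the study's actual evaluation is more sophisticated.

```python
# Illustrative only: simplified precision/recall scoring of structural variant
# calls against a truth set. Data and tolerance below are invented.

def matches(call, truth, tol=100):
    """A call matches a truth event if the chromosome agrees and both
    breakpoint positions are within `tol` base pairs."""
    return (call[0] == truth[0]
            and abs(call[1] - truth[1]) <= tol
            and abs(call[2] - truth[2]) <= tol)

def precision_recall(calls, truth_set, tol=100):
    true_positives = sum(any(matches(c, t, tol) for t in truth_set) for c in calls)
    found_truths = sum(any(matches(c, t, tol) for c in calls) for t in truth_set)
    precision = true_positives / len(calls) if calls else 0.0
    recall = found_truths / len(truth_set) if truth_set else 0.0
    return precision, recall

# (chromosome, breakpoint_1, breakpoint_2) -- made-up rearrangements
truth = [("chr1", 10_000, 55_000), ("chr2", 200_000, 340_000)]
calls = [("chr1", 10_040, 54_980), ("chr3", 5_000, 9_000)]
print(precision_recall(calls, truth))   # (0.5, 0.5) for this toy example
```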

Dr Cameron said the study would not have been possible without the Institute's high-performance computing resource.

"Over the course of two years, we tested 12 of the most popular genomic rearrangement detection tools, generating more than 50 terabytes of data, to determine which tools perform well, and when they perform badly. Without these computing resources, we estimate the study would have taken us more than ten years," he said.

The Institute's Theme Leader for Computational Biology, Professor Papenfuss, said computational methods were required, more than ever before, for making sense of the vast and complex datasets being generated by research.

"Computational studies like this one keep the field up to date with best practice approaches for data analysis. This particular study provides a comprehensive resource for users of genomic rearrangement detection methods, as well as developers in the field. It will also help to direct the next iteration of genomic rearrangement tool development at the Institute," he said.

As new experimental techniques and DNA sequencing machines become available, the very nature of the data they generate is changing. Professor Papenfuss said that older analysis tools, while heavily cited and widely used, could lead to erroneous interpretations if used on data produced by the latest DNA sequencing machines. "This is why it is so important for researchers to find the right match between the analysis tool and dataset at hand," he said.

Credit: 
Walter and Eliza Hall Institute

Researchers develop new technology for multiple sclerosis diagnosis and treatment

image: Dinesh Sivakolundu, Dr. Bart Rypma and Dr. Darin T. Okuda.

Image: 
Center for BrainHealth

DALLAS (July 19, 2019) - Researchers at the Center for BrainHealth®, part of The University of Texas at Dallas, in collaboration with a team from UT Southwestern, have developed technology for a novel diagnostic method for multiple sclerosis (MS). The new approach has the potential to determine which damaged regions in an MS patient's brain have the capacity to heal themselves, and which do not.

The researchers examined brain scans from 23 patients and 109 different lesions. Some lesions showed increased levels of surrounding oxygen, a biomarker that is known to correlate with the capacity for healing. The researchers then created 3D images of the lesions using a new, patent-pending technology tool, which revealed that the metabolically active lesions are more spherical with a rough surface, whereas the metabolically inactive lesions are irregular in shape and have a smooth surface. The results are published in the Journal of Neuroimaging (May 2019).
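
The patent-pending tool itself is not described in the release, but one standard way to quantify how "spherical" a lesion is from a 3D imaging mask is the sphericity index: the surface area of a sphere with the same volume as the lesion, divided by the lesion's actual surface area. The sketch below shows that calculation under the assumption that a lesion is available as a binary voxel mask; the voxel spacing and the example shape are invented, and this is not the study's method.

```python
# Illustrative only: sphericity of a 3D lesion mask (1.0 for a perfect sphere,
# lower for more irregular shapes). Not the study's patent-pending tool.
import numpy as np
from skimage import measure

def sphericity(mask, spacing=(1.0, 1.0, 1.0)):
    """Surface area of an equal-volume sphere divided by the actual surface area."""
    volume = mask.sum() * np.prod(spacing)
    # Triangulate the lesion surface and measure its area.
    verts, faces, _, _ = measure.marching_cubes(mask.astype(float), level=0.5,
                                                spacing=spacing)
    area = measure.mesh_surface_area(verts, faces)
    return (np.pi ** (1 / 3)) * (6 * volume) ** (2 / 3) / area

# Toy example: a ball of radius 8 voxels inside a 32^3 volume.
z, y, x = np.ogrid[:32, :32, :32]
ball = ((z - 16) ** 2 + (y - 16) ** 2 + (x - 16) ** 2) <= 8 ** 2
print(round(sphericity(ball), 2))   # close to 1 for a ball
```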

"This diagnostic method represents a significant advance in our field, considering the new MS drugs being developed to heal damaged areas of the brain," said Dinesh Sivakolundu, the study's lead author, and a teaching and doctoral student working in Dr. Bart Rypma's lab at the Center for BrainHealth. "Using our new technology, we could potentially determine which patients would benefit from such new drugs and which patients would not."

The lab of Dr. Darin T. Okuda, a neurologist and MS expert with UT Southwestern Medical Center, looked at the 3D lesion images and studied lesion phenotypes to uncover the relevant biomarker: a blood oxygen level dependent (BOLD) slope that compares the amount of oxygen available at the injury site to that of its surroundings.

As a result of the collaboration, the researchers were able to study the physiology within and around MS lesions and its relationship to lesion shape.

"Our new technology has the potential to be a game-changer in the treatment of MS by helping doctors be more precise in their treatment plans," added Sivakolundu.

Credit: 
Center for BrainHealth

Dangers of the blame game

When David Dao was forcibly removed from a United Airlines flight in 2017 after declining to give up his seat for United employees, there was immediate public outrage against the airline. But quickly after the event, news spread that Dao had used his medical license to trade prescription drugs for sex. Online reports implied that Dao, rather than United Airlines, was to blame because he was viewed as a bad person.

This tendency to shift blame to victims based on assumptions about their character can have serious consequences in today's free market, where consumers have more power than ever to hold companies accountable for faulty products or services. In a new study, researchers discovered which factors lead people to assign blame to a victim and the repercussions of this judgement. The study, which was led by marketing assistant professor Brandon Reich of Portland State University, was recently published online in the Journal of Consumer Psychology.

In one experiment, researchers summarized the United Airlines incident for participants. One group heard that Dao had traded prescription drugs for sex while the second group did not learn about this detail. Then the participants were told that a lawsuit against the airline required a certain number of signatures to proceed, and they were invited to sign the petition. The results of the study showed that participants who were informed about Dao's past moral transgression were significantly less likely to sign the petition against United Airlines: 65 percent signed, compared with 84 percent of those who did not know about the transgression.

"The evidence suggests that introducing the morality of the victim, which is irrelevant to the harm they suffered from a faulty product or service, leads to changes in our thinking about who is to blame and who should be held responsible," says Troy Campbell, PhD, an assistant professor of marketing at the University of Oregon who was involved in the study. "Individuals, groups and society can be harmed by irrational victim blaming, so we need to more fully understand this psychological tendency."

In a second experiment, participants read about a scenario in which a bank employee noticed that he had an extra $200 in his register due to another employee's error. In one group, participants learned that the bank employee kept the money for himself, while the second group learned that the employee told his manager about the discrepancy. The next day, the bank employee's hot coffee spilled all over his lap due to faulty threading in his travel mug. Then the participants answered questions about the scenario, and they were more likely to blame the employee for the spilled coffee when he had kept the $200 for himself. They were also less likely to write a negative review about the travel mug brand or verbally discourage others from using the brand.

Campbell hopes these findings will increase awareness among consumers as they decide when to take action to protect the public. Consumers may have fodder to blame victims of food poisoning, malfunctioning cars, faulty technology products or many other product or service failures. Moral judgements about the victims of sexual or racial harassment can also impact public responses to incidents.

"The more we blame a victim, the less likely we are to take action," says Campbell. "If people are aware of how irrelevant information can influence their behavior, my hope is that they will be empowered to make rational decisions that enable the free market to work more effectively."

Credit: 
Society for Consumer Psychology

Lobster organs and reflexes damaged by marine seismic surveys

image: This is a seismic air gun test during experiments in Storm Bay, Tasmania.

Image: 
Rob McCauley

A new study of the impact on marine life of seismic air guns, used in geological surveys of the seafloor, has found that the sensory organs and righting reflexes of rock lobster can be damaged by exposure to air gun signals.

Published in the journal Proceedings of the Royal Society B, the research by scientists from IMAS and the Centre for Marine Science and Technology at Curtin University is the latest in a series of studies they have conducted into how seismic surveys affect marine animals.

The study was funded by the Australian Government through the Fisheries Research and Development Corporation (FRDC), Origin Energy, and the Victorian Government's CarbonNet Project.

Lead author Dr Ryan Day said researchers exposed rock lobster to seismic air gun noise during field tests in Tasmania's Storm Bay and examined the effects on a key sensory organ, the statocyst, and the lobsters' reflexes.

"While the impact of air guns on whales and fishes has been relatively well-studied, the effects on marine invertebrates such as lobsters, crabs and squid remain poorly understood," Dr Day said.

"We chose to study the impact on rock lobster because they are a high value fishery and an important part of global marine ecosystems.

"Previous studies have shown that the statocyst, a sensory organ on a lobster's head, is critical in controlling their righting reflex, enabling them to remain coordinated and evade predators.

"After exposing lobsters to the equivalent of a commercial air gun signal at a range of 100-150 metres, our study found that the animals suffered significant and lasting damage to their statocyst and righting reflexes.

"The damage was incurred at the time of exposure and persisted for at least one year - surprisingly, even after the exposed lobsters moulted," Dr Day said.

The study's Principal Investigator, Associate Professor Jayson Semmens, said that while the ecological impacts of the damage were not evaluated, the impairment would likely affect a lobster's ability to function in the wild.

"This study adds to a growing body of research that shows marine invertebrates can suffer physiological impacts and changes to their reflexes in response to anthropogenic noise such as seismic surveys," Associate Professor Semmens said.

"In recent years our research team has also looked at the impact of seismic surveys on lobster embryos, scallops and zooplankton

"Such studies are important to enable government, industry and the community to make informed decisions about how such activities can best be conducted while minimising negative outcomes for fisheries and ecosystems globally," he said.

Credit: 
University of Tasmania

Is your favorite brand authentic?

In the modern media world, consumers are constantly bombarded with advertisements claiming that products are "luxury," "European," created with "old-world traditions and craftsmanship" and more, but how do people know if these descriptions are true? The name Haagen-Dazs evokes a premium, imported brand image, but the company's original brand name was Senator Frozen Foods.

Being authentic is trendy today, and researchers have discovered one of the critical factors that influences consumers to believe a brand is in fact authentic. The investigators found that information about a founder's motivation for creating a company has a powerful effect on whether consumers deem a brand authentic, which in turn influences judgments about the quality of the product. The findings were recently published online in the Journal of Consumer Psychology.

In one experiment, participants reviewed product information about Sweet Things granola, and half of the group read that the brand was founded by a young woman named Kelly who was known among her friends for her homemade granola. She decided to make a living selling the product she loved making, which the researchers labeled as the "intrinsic motivation" condition. The other group read that the brand was created as an extension of a larger company that already made gourmet snack foods. The company wanted to expand its market, which the researchers labeled "extrinsic motivation." The study results showed that participants in the intrinsic motivation condition believed the granola brand was more authentic.

In a follow-up experiment, participants saw a list of the ingredients in Sweet Things granola, including rolled oats, nuts, chocolate and dried fruit, as well as possible uses, such as breakfast or a snack while hiking. Then they rated the expected quality of the product. Next, participants read about the origins of the company, and one group read the intrinsic motivation story while the other group read the extrinsic motivation story. Then they re-rated the quality of the granola.

The results showed that participants who had read the story about the founder named Kelly, who had demonstrated intrinsic motivation, re-rated the quality of the product significantly higher than those who had read the extrinsic motivation story.

The researchers discovered that even for products that are generally disliked, the same trend applied. They shared different stories about the origins of a cigarette brand, and the group in the intrinsic motivation condition read that the owner was a nightclub manager who had been hand-rolling cigarettes using unique tobacco blends for himself and decided to start a business based on this hobby. The extrinsic motivation participants read that the nightclub manager noticed that blended cigarettes were becoming more popular and he capitalized on this market knowledge by starting a business. As expected, the people in the intrinsic motivation condition rated the brand as more authentic and higher in quality than those in the extrinsic motivation condition.

"The findings suggest that people draw many inferences about a company based on their beliefs about its authenticity," says study author Melissa Cinelli, an associate professor of marketing at the University of Mississippi. "A company's story telling strategy can shift opinions."

Consumers' tendency to make assumptions about authenticity can also serve as a word of caution to marketers, she says. If consumers, for example, believe that a granola brand is authentically all-natural, but they notice high-fructose corn syrup as an ingredient, they could feel betrayed if they considered this ingredient to be artificial. This could prompt consumers to discourage others from using the brand.

Now the researchers hope to investigate how perceptions about brand authenticity influence buying decisions. "If inferences about authenticity lead to judgments that products are higher quality, then people may be more willing to buy products and also pay more for them," says Cinelli.

Credit: 
Society for Consumer Psychology

Penn engineers' 'LADL' uses light to serve up on-demand genome folding

image: A modification of CRISPR/Cas9 allowed researchers to home in on the desired sequences of DNA on either end of the loop they wanted to form. If those sequences could be engineered to seek one another out and snap together under the other necessary conditions, the loop could be formed on demand.

Image: 
University of Pennsylvania

Every cell in your body has a copy of your genome, tightly coiled and packed into its nucleus. Since every copy is effectively identical, the difference between cell types and their biological functions comes down to which, how and when the individual genes in the genome are expressed, or translated into proteins.

Scientists are increasingly understanding the role that genome folding plays in this process. The way in which that linear sequence of genes is packed into the nucleus determines which genes come into physical contact with each other, which in turn influences gene expression.

Jennifer Phillips-Cremins, assistant professor in Penn Engineering's Department of Bioengineering, is a pioneer in this field, known as "3-D Epigenetics." She and her colleagues have now demonstrated a new technique for quickly creating specific folding patterns on demand, using light as a trigger.

The technique, known as LADL or light-activated dynamic looping, combines aspects of two other powerful biotechnological tools: CRISPR/Cas9 and optogenetics. By using the former to target the ends of a specific genome fold, or loop, and then using the latter to snap the ends together like a magnet, the researchers can temporarily create loops between exact genomic segments in a matter of hours.

The ability to make and undo these genome folds on such a short timescale makes LADL a promising tool for studying 3D-epigenetic mechanisms in more detail. With previous research from the Phillips-Cremins lab implicating these mechanisms in a variety of neurodevelopmental diseases, they hope LADL will eventually play a role in future studies, or even treatments.

Alongside Phillips-Cremins, lab members Ji Hun Kim and Mayuri Rege led the study, and Jacqueline Valeri, Aryeh Metzger, Katelyn R. Titus, Thomas G. Gilgenast, Wanfeng Gong and Jonathan A. Beagan contributed to it. They collaborated with associate professor of Bioengineering Arjun Raj and Margaret C. Dunagin, a member of his lab.

The study was published in the journal Nature Methods.

"In recent years," Phillips-Cremins says, "scientists in our fields have overcome technical and experimental challenges in order to create ultra-high resolution maps of how the DNA folds into intricate 3D patterns within the nucleus. Although we are now capable of visualizing the topological structures, such as loops, there is a critical gap in knowledge in how genome structure configurations contribute to genome function."

In order to conduct experiments on these relationships, researchers studying these 3D patterns were in need of tools that could manipulate specific loops on command. Beyond the intrinsic physical challenges -- putting two distant parts of the linear genome in physical contact is quite literally like threading a needle with a thread that is only a few atoms thick -- such a technique would need to be rapid, reversible and work on the target regions with a minimum of disturbance to neighboring sequences.

The advent of CRISPR/Cas9 solved the targeting problem. A modification of the gene editing tool allowed researchers to home in on the desired sequences of DNA on either end of the loop they wanted to form. If those sequences could be engineered to seek one another out and snap together under the other necessary conditions, the loop could be formed on demand.

Cremins Lab members then sought out biological mechanisms that could bind the ends of the loops together, and found an ideal one in the toolkit of optogenetics. The proteins CIB1 and CRY2, found in Arabidopsis, a flowering plant that's a common model organism for geneticists, are known to bind together when exposed to blue light.

"Once we turn the light on, these mechanisms begin working in a matter of milliseconds and make loops within four hours," says Rege. "And when we turn the light off, the proteins disassociate, meaning that we expect the loop to fall apart."

"There are tens of thousands of DNA loops formed in a cell," Kim says. "Some are formed slowly, but many are fast, occurring within the span of a second. If we want to study those faster looping mechanisms, we need tools that can act on a comparable time scales."

Fast-acting folding mechanisms also have the advantage of causing fewer perturbations to the surrounding genome, reducing the potential for unintended effects that would add noise to an experiment's results.

The researchers tested LADL's ability to create the desired loops using their high-definition 3D genome mapping techniques. With the help of Arjun Raj, an expert in measuring the activity of transcriptional RNA sequences, they also were able to demonstrate that the newly created loops were impacting gene expression.

The promise of the field of 3D-epigenetics is in investigating the relationships between these long-range loops and the mechanisms that determine the timing and quantity of the proteins they code for. Being able to engineer those loops means researchers will be able to mimic those mechanisms in experimental conditions, making LADL a critical tool for studying the role of genome folding in a variety of diseases and disorders.

"It is critical to understand the genome structure-function relationship on short timescales because the spatiotemporal regulation of gene expression is essential to faithful human development and because the mis-expression of genes often goes wrong in human disease," Phillips-Cremins says. "The engineering of genome topology with light opens up new possibilities to understanding the cause-and-effect of this relationship. Moreover we anticipate that, over the long term, the use of light will allow us to target specific human tissues and even to control looping in specific neuron subtypes in the brain."

Credit: 
University of Pennsylvania

Study highlights the benefits of a US salt reduction strategy to the US food industry

New research, published in The Milbank Quarterly, highlights the potential health and economic impact of the United States (US) Food and Drug Administration's (FDA) proposed voluntary salt policy on workers in the US food industry.

Excess salt consumption is associated with a higher risk of cardiovascular disease (CVD). Globally, more than 1.5 million CVD-related deaths every year can be attributed to excess dietary salt intake.

The World Health Organization has recommended sodium reduction as a "best buy" to prevent CVD. In 2016, the US Food and Drug Administration (FDA) set voluntary targets for the food industry to reduce sodium in processed foods.

In a previous study, researchers from the University of Liverpool, Imperial College London, the Friedman School of Nutrition Science and Policy at Tufts, and collaborators in the Food-PRICE project found that, for the whole US population, the optimal reformulation scenario (100% compliance with the 10-year FDA targets) could prevent approximately 450,000 CVD cases, gain 2 million quality-adjusted life years (QALYs) and produce discounted cost savings of approximately $40 billion over a 20-year period.
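
For readers unfamiliar with how such "discounted" savings are tallied in cost-effectiveness models, the short Python sketch below illustrates the standard present-value calculation; the annual savings stream and the 3% discount rate are hypothetical placeholders for illustration only, not figures taken from the Food-PRICE analysis.

# Illustrative only: how discounted cost savings are accumulated over a
# multi-year horizon in a cost-effectiveness model. The annual savings
# stream and the 3% discount rate are hypothetical placeholders.

def discounted_total(annual_savings, discount_rate=0.03):
    """Sum a stream of annual savings, discounting each year back to present value."""
    return sum(
        saving / (1 + discount_rate) ** year
        for year, saving in enumerate(annual_savings, start=1)
    )

# Hypothetical example: $3 billion of health-related savings per year for 20 years.
savings_stream = [3.0e9] * 20
print(f"Undiscounted total: ${sum(savings_stream) / 1e9:.1f} billion")
print(f"Discounted total:   ${discounted_total(savings_stream) / 1e9:.1f} billion")

Discounting in this way reflects the convention that savings realised years from now are valued less than the same savings today, which is why modelled totals such as the $40 billion figure are reported as discounted sums.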

Despite this, Congress has temporarily blocked the FDA from implementing the voluntary industry targets for sodium reduction in processed foods; meeting the targets could cost the industry around $16 billion over 10 years.

This new study examined the impact of the policy on the food industry itself to determine the cost-effectiveness of meeting these draft sodium targets.

The team modelled the health and economic impact of meeting the two-year and 10-year FDA targets, from the perspective of people working in the food system itself, over 20 years, from 2017 to 2036.

They found that the benefits of implementing the FDA voluntary sodium targets extend to food companies and food system workers, and that the value of the CVD-related health gains and cost savings together exceeds the government and industry costs of reformulation.

The researchers found that achieving the long-term sodium reduction targets could produce 20-year health gains of approximately 180,000 QALYs and health cost savings of approximately $5.2 billion.

Because many health benefits may occur in individuals over age 65 or in the uninsured, these health savings would be shared among individuals, industry, and government.

Brendan Collins, Public Health Economist at the University of Liverpool, who co-led the study with Dr Chris Kypridemos, said: "Excess dietary salt kills people. Salt reduction has therefore been recommended by the World Health Organisation as a 'best buy'. Around three quarters of salt is hidden in packaged foods before we buy them. That makes it very hard for people to cut their intake."

"We have shown that reducing the salt hidden in processed foods would have huge economic benefits because fewer people get high blood pressure which leads to strokes and heart attacks. That remains true even when just looking at people working in the food industry itself, because they cost their company less in healthcare, and they can continue working and looking after their families for longer."

Professor Simon Capewell, co-author, University of Liverpool, said: "The message for our own UK Government is also clear. A laissez-faire approach will kill or maim thousands of people. We therefore welcome the new Prevention Green Paper focus on salt. Reactivating the previously successful FSA approach would prevent thousands of deaths, and powerfully assist the NHS, UK employers and the UK economy."

Credit: 
University of Liverpool