Study: First clinical proof that genotypes determine if Alzheimer's drugs will work

BUFFALO, N.Y. -- University at Buffalo researchers have determined that a human gene present in 75% of the population is a key reason why a class of drugs for Alzheimer's disease seemed promising in animal studies only to fail in human studies.

The researchers say the work suggests that in different Alzheimer's disease patients, different mechanisms are at work that determine whether or not a given therapy will be effective.

While a previous study by the researchers examined the function of the gene in tissue culture, this is the first time that a drug effect based on a patient's genotype has been demonstrated clinically.

The UB researchers caution that the study has its limitations and that randomized, double-blind studies are needed to confirm the results.

The research was presented today at the annual Alzheimer's Association International Conference (AAIC) in Los Angeles. It was conducted on data from a ten-year, longitudinal, multicenter cohort study by the Texas Alzheimer's Research and Care Consortium (TARCC) on 345 Alzheimer's patients. The UB researchers are collaborators on the TARCC.

Proof of concept

"This research provides proof of concept that since different mechanisms are at work in Alzheimer's in different patients, we need to develop more personalized treatments that will prove more effective in individuals," said Kinga Szigeti, MD, PhD, lead investigator, director of UB's Alzheimer's Disease and Memory Disorders Center, part of UBMD Neurology, and associate professor of neurology in the Jacobs School of Medicine and Biomedical Sciences at UB.

The gene, CHRFAM7A, is a fusion between a gene that codes for the Alpha 7 receptor for acetylcholine -- a neurotransmitter involved in memory and learning, long associated with Alzheimer's -- and a kinase, a type of enzyme.

Szigeti explained that the gene is present in two variants: a functional gene and one that is not translated into protein, data the UB team is also presenting this week at AAIC.

"This splits the population 1-to-3 between non-carriers and carriers," said Szigeti. CHRFAM7A has been implicated in many neuropsychiatric disorders, such as schizophrenia and bipolar disease.

Szigeti explained that three of the four drugs now available for Alzheimer's work by stimulating all receptors that respond to acetylcholine. More specific drugs for Alpha 7 have been in development for over 10 years but failed when moved to the clinical phase.

The human fusion gene modulates the Alpha 7 receptor, one of the receptors binding amyloid beta, the protein that is the hallmark of Alzheimer's that disrupts neuronal communication.

"Since this human fusion gene was not present in the animal models and screening systems used to identify drugs, 75 % of Alzheimer's patients who do carry this gene are less likely to benefit and therefore are at a disadvantage," she said. "This may account for the translational gap."

Gene carriers

"With this study, we compared the effect of cholinesterase inhibitors in patients who did or didn't carry this gene," said Szigeti. "People who don't have the gene respond better to the drugs available now."

She added that neurons vulnerable to Alzheimer's express Alpha 7 and that may be the reason why they die first.

"Our work confirms that Alpha 7 is a very important target for treating Alzheimer's but the right model--a human model--has to be used when testing new drugs," said Szigeti.

Credit: 
University at Buffalo

About 44% of high school seniors who misuse prescription drugs have multiple drug sources

ANN ARBOR--Roughly 11% of high school seniors reported prescription drug misuse during the past year, and of those, 44% used multiple supply sources, according to a pair of University of Michigan studies.

More than 70% of adolescents who obtained prescription drugs from multiple sources had a substance use disorder--involving prescription medications, other drugs and alcohol--within the previous year.

The national average for a substance use disorder for all adolescents is 5%, said senior author Sean Esteban McCabe, a professor at the U-M School of Nursing.

Both studies, published in the July issue of the Journal of the American Academy of Child and Adolescent Psychiatry, found that adolescents using multiple sources for prescription medications were at high risk for other substance use and substance use disorders, among other disturbing patterns.

One study identified sources of misuse for three classes of prescription drugs--opioids, stimulants and tranquilizers--and the differences in motives and behavior among 18,549 high school seniors. The other study identified sources of controlled medications and related behaviors in 103,920 adolescents ages 12 to 17.

A "very concerning" finding is that 30% of prescription drug misusers took their own leftover medication, with girls more likely to take leftovers than boys, said McCabe, who is also co-director of the U-M Center for the Study of Drugs, Alcohol, Smoking and Health. Boys were more likely to obtain prescription drugs from friends or purchase them.

The most common sources of prescription drugs among 12-to-17-year-olds were getting them free from friends and relatives, physician prescriptions in the case of opioids, and illegal purchases in the case of stimulants and tranquilizers.

"These adolescents are most in need of intervention to address their substance use and any other medical and mental health issues," said Ty Schepis, associate professor at Texas State University and lead author of one of the studies.

This is the first known research to look at adolescent misuse of leftover medications across these three prescription drug classes, McCabe said.

"The implications from these two studies could not be clearer," McCabe said. "Parents, public health experts and clinicians must rally to address this problem. There is a critical need for clinical workforce training to support clinic and school-based education, screening, prevention and early intervention."

Credit: 
University of Michigan

Lifting the fog on carbon budgets

The concept of a carbon budget has become a popular tool in guiding climate policy since the Intergovernmental Panel on Climate Change's (IPCC) Fifth Assessment Report was released in 2014. IIASA researchers were involved in the development of a framework that can help scientists determine which factors affect the size of the remaining carbon budget and how they interact.

Research over the course of the past decade has shown that global warming is more or less proportional to the total amount of CO2 released into the atmosphere. This makes it possible to estimate the total amount of CO2 we can still emit while having a chance to limit global warming to a certain level - a concept known as the remaining carbon budget. The simplicity of the notion has made it an attractive tool for policymakers, even though it is strongly dependent on the assumption of a linear relationship between global temperature rise and cumulative CO2 emissions due to human activity. Given that we are aiming to keep warming well below 2°C and preferably even below 1.5°C, this is an extremely important number to inform decision makers about how rapidly we have to bring our emissions down to zero.
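
In its simplest form, this linear relationship turns the budget into one line of arithmetic: divide the warming still allowed by the warming caused per unit of CO2 emitted. The sketch below illustrates the idea with placeholder numbers; they are not the study's or the IPCC's estimates.

```python
# Minimal sketch of the linear carbon-budget logic described above.
# All numbers are hypothetical placeholders, not values from the study.
def remaining_budget_gtco2(target_c, warming_to_date_c, warming_per_1000_gtco2):
    """Remaining CO2 budget in GtCO2 under a linear warming/emissions model."""
    return (target_c - warming_to_date_c) / warming_per_1000_gtco2 * 1000.0

# Hypothetical example: 1.5 C limit, ~1.0 C already realised,
# and 0.45 C of warming per 1000 GtCO2 emitted.
print(remaining_budget_gtco2(1.5, 1.0, 0.45))  # roughly 1100 GtCO2
```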

As low-carbon policies and technologies continue to advance, policymakers, companies, and investors are increasingly relying on carbon budgets as a core component for analyzing the potential implications of a carbon-constrained future. Over the past couple of years, several estimates were published that all differed to a greater or lesser extent, for reasons that were previously not well understood. We know that global warming is not driven by CO2 emissions alone - other greenhouse gases such as methane, fluorinated gases, nitrous oxide, and aerosols also affect global temperatures, and estimating remaining carbon budgets therefore also implies making assumptions about these non-CO2 contributions. This added uncertainty and reduced confidence in using the linear relationship between warming and cumulative CO2 emissions for target setting.

In their study published in Nature, researchers from IIASA and colleagues from, among others, the Grantham Institute at Imperial College London, the University of Leeds, Météo-France, and the Potsdam Institute for Climate Impact Research, provide a way to understand and track changes in the remaining carbon budget. The framework they propose can help clarify how this number might change in the future and contribute to a more constructive and informed discussion of the topic, while also facilitating better communication across the disciplines and communities that research, quantify, and apply estimates of remaining carbon budgets. The study defines the size of the remaining carbon budget through five main factors: the amount of warming expected per tonne of CO2 emitted; the amount of warming observed to date; the amount of future warming expected from gases other than CO2; whether warming stops instantly once CO2 emissions reach zero; and a correction for any reinforcing cycles in the Earth system that have not yet been sufficiently accounted for.
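
At a purely schematic level, the five factors combine into a single budget estimate along the following lines. The function and values below are our illustrative assumptions, not the paper's formal equations.

```python
# Schematic combination of the five factors named above
# (illustrative assumptions only, not the paper's actual framework).
def remaining_budget_five_factor(target_c,          # warming limit being aimed for
                                 historical_c,      # human-induced warming to date
                                 non_co2_c,         # expected future non-CO2 warming
                                 zero_emissions_c,  # warming still unfolding after CO2 hits zero
                                 tcre_c_per_gtco2,  # warming expected per GtCO2 emitted
                                 feedback_gtco2):   # correction for unrepresented feedbacks
    co2_warming_allowance = target_c - historical_c - non_co2_c - zero_emissions_c
    return co2_warming_allowance / tcre_c_per_gtco2 - feedback_gtco2

print(remaining_budget_five_factor(1.5, 1.0, 0.1, 0.0, 0.00045, 100.0))  # ~789 GtCO2
```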

"Our paper provides a new tool to clearly communicate up to date insights about carbon budgets. With the framework, changes can be pinpointed to single contributions that are much easier to understand. This should increase confidence in carbon budget estimates among policymakers, or at least their advisors, because changes in carbon budget estimates cease to appear to be random but rather are the result of clear progress of science in various areas," explains Joeri Rogelj, a senior researcher with the IIASA Energy Program and lead author of the study.

According to the researchers, their framework can play a role in contextualizing new estimates in the future, even if they use alternative methods. It can also be used in combination with expert judgment to anticipate potential surprise changes in remaining carbon budgets and allow for a more independent assessment by drawing on multiple lines of evidence. A simplified version of this framework was already applied in the recent IPCC Special Report on Global Warming of 1.5°C.

"The remaining carbon budget is a key quantity for defining the challenge of limiting climate change to safe levels. With this paper, we can understand and track this quantity much better. If you carefully look at carbon budgets, they become easy to understand. The fog is lifted, so to speak, and shows even more clearly that the remaining carbon budget to limit global warming to safe levels is tiny - action in the next decade is essential to stay within it," concludes Rogelj.

Credit: 
International Institute for Applied Systems Analysis

Insurance linked to hospitals' decision to transfer kids with mental health emergencies

A national study finds children without insurance who seek treatment for a mental health disorder in the emergency department (ED) are more likely than those with private insurance to be transferred to another hospital.

The study, conducted by researchers at UC Davis Children's Hospital and the UC Davis Department of Psychiatry, showed differences in the decisions to admit or transfer children with mental health emergencies based on the patients' insurance type.

More hospital transfers for children with no insurance

For the study, the researchers assessed a national sample of 9,081 acute mental health events among children in EDs. They looked at patients' insurance coverage and hospitals' decisions to admit or transfer patients with a mental health disorder.

"We found that children without insurance are 3.3 times more likely to be transferred than those with private insurance," said Jamie Kissee Mouzoon, research manager for the Pediatric Telemedicine Program at UC Davis Children's Hospital and first author on the study. "The rate was even higher for patients presenting with bipolar disorder, attention-deficit and conduct disorders and schizophrenia."

Inequities in mental health emergencies

The study shows there may be gaps in providing equitable and quality care to pediatric patients with mental health emergencies based on their insurance coverage.

Transferring a child creates additional burdens for the patient, family and health care system as a whole. It can add to overcrowding in busy emergency departments, higher costs of care and higher out-of-pocket costs for the family.

According to James Marcin, senior author on the study, there are regulations in place to prevent EDs from making treatment decisions based on patients' insurance. Transferring a patient for any reason other than clinical necessity should be avoided.

"Unfortunately, the financial incentives are sometimes hard to ignore and can be even unconscious," Marcin said. "What we have found in this study is consistent with other research that demonstrates that patients without health insurance are more likely to get transferred from clinic to clinic or hospital to hospital."

Marcin also is director of the UC Davis Center for Health and Technology and leads the telemedicine program at UC Davis Health. He is looking into ways that telemedicine - video visits delivered to children who seek care in remote EDs - might offer a solution to the tendency to transfer patients to another hospital.

Credit: 
University of California - Davis Health

Timing is everything for the mutualistic relationship between ants and acacias

image: A founding queen acacia-ant (Pseudomyrmex ferruginea) cuts her first entrance hole into the swollen thorn (Vachellia cornigera) in which she will start her colony, the first thorn made by this young seedling ant-acacia in Veracruz, Mexico, 1962. Scientists from Penn have made new insights into the genetic drivers of this mutualistic relationship.

Image: 
D. H. Janzen

In the 1960s, Penn biologist Dan Janzen, as part of earning his Ph.D., re-described what has become a classic example of biological mutualism: the obligate relationship between acacia-ants and ant-acacia trees. The acacia trees produce specialized structures to shelter and feed the ant colony, and the ants, in turn, defend the tree against herbivores.

In a new study in Proceedings of the National Academy of Sciences, colleagues of Janzen's in the Penn Biology Department uncover a genetic mechanism that programs the plant side of the ant-acacia relationship. Scott Poethig, a plant biologist, and Aaron Leichty, who earned his Ph.D. working under Poethig and is now a postdoctoral researcher at the University of California, Davis, showed that these species of acacia develop the traits necessary to feed the ant colony--hollow swollen thorns to house them, and nectaries and nutrient-rich leaflet tips called Beltian bodies to feed them--as part of an age-dependent phenomenon in plant development.

"There is a cost associated with making these traits," says Poethig, senior author on the report, "but the plant needs them, otherwise it's a goner. Dan showed: no ants, no plants. The plant is eaten by everything from grasshoppers to mice.

"So there's a tradeoff happening. And what we found is that these traits seem to have evolved on the back of a preexisting pathway that governs a developmental transition in plants."

Adds Leichty: "When we dug into the literature, we found that a lot of plant defense strategies are age-dependent. It's counterintuitive because you think the young plants would want to start making these structures right away so they wouldn't get eaten, but our findings as well as profound logic suggest there are biological constraints on making them."

Poethig has spent a large part of his career studying this transition, what some regard as a plant's "adolescence." But he hadn't considered it in the context of the ant-acacia relationship until his son took Janzen's Humans and the Environment course at Penn and learned about the textbook example of mutualism.

"I thought, Wouldn't it be interesting if this suite of traits was controlled by the developmental pathway I've been studying for the last 30 years?" he recalls.

One clue suggested it might: In Janzen's observations from the field, he had noted that the plants do not make these features right away, suggesting they may need to reach a more mature stage to do so.

To get his hands on some seeds to begin probing this question with molecular tools, Poethig asked Janzen to procure some but also did what any 21st century biologist would: He looked online.

"I landed on an Etsy site called Mr. Nature that sells seeds of Vachellia cornigera," a species of acacia native to Mexico and Central American, says Poethig.

The delivery arrived weeks later with not only the seeds but also an acacia pod and some thorns "and a little sign that said, 'Ouch! Very sharp!'" Poethig says. Those plant parts enabled Janzen to later confirm the species identification from an otherwise questionable source.

Poethig gathered other seeds from a seller in Belize and, finally, from Janzen himself.

"When I went over to pick up the seeds from Dan, his biologist wife Winnie was there and said, 'Dan, maybe you should tell Scott where you got the seeds,'" Poethig recalls. Janzen went on to explain that he had seen a monkey eating the acacia pods in their front yard in Costa Rica. Janzen then collected the seeds from the monkey's scat, dropped while he was eating yet more. The next month he discovered the tree lacked its usual ant colony, so the monkey had easy pickings.

With the seeds in hand, Leichty began to develop strategies to grow and study them in the lab. Once he had the plants growing reliably, he observed what Janzen had seen in the wild a half-century before.

"Sure enough, the traits appear but not right away," Leichty says.

Looking at the three different acacias they had on hand--Vachellia collinsii from Belize, V. collinsii from Costa Rica, and V. cornigera from Florida--Leichty and Poethig found that, while the precise timing differed depending on the species, the plants' extrafloral nectaries, which are made by all acacia species, appeared first. Swollen thorns developed next, and the Beltian bodies appeared last.

The researchers then turned their attention to the possible genetic programs for these traits. They obtained the first genome sequence of a Vachellia species and looked specifically at two microRNAs--short, regulatory RNA molecules--called miR156 and miR157, which they had previously found to be associated with controlling the developmental timing of traits in other plant species.

As the swollen thorn and other ant-attracting traits began to appear in the acacia, levels of miR156 and miR157 declined, and the levels of different protein transcription factors repressed by these microRNAs increased.

For the next step in their research, Poethig and Leichty considered another observation that Janzen had made in the field: acacia trees growing in the shade developed these specialized traits more slowly. In the lab, they again found a connection to miR156 and miR157. Plants grown in low-light conditions had much higher levels of the microRNAs and a later appearance of the swollen-thorn traits compared to their counterparts grown in full light.

"The shade experiments led to the delay of this whole pathway," Leichty says, "and offered a simple way to experimentally perturb the timing of these traits while also controlling for the developmental age of our samples."

To get a sense of how the regulation of these traits may have arisen evolutionarily, the researchers explored other acacia species that do not make Beltian bodies or swollen thorns but do make nectaries on their leaves. In these species, as in the ant-acacias, miR156's decline coincided with the appearance of the nectaries. The similarity among the acacias in this regard suggests that the existing pathway was coopted to regulate the other traits that are required for a healthy bodyguard--swollen thorns and good food--the researchers say.

To Janzen, the finding supports his field discoveries and makes a case for blending field and lab investigations.

"Looking from the outside, as ecologists are wont to do," he says, he discerned from his observations from the 1960s in Veracruz, Mexico that the youngest ant-acacia plants "switch on their defenses against herbivores" only when they appeared to have garnered enough resources.

"Scott and Aaron peered at the same event from the inside, at all that DNA stuff I cannot see," adds Janzen. "I have to take their word for its existence. They have to take my word for the herbivory and the protective ant colony. 'Tis the difference between whole-organism biologists and molecular biologists. I watched and asked why. They watched and asked how."

Credit: 
University of Pennsylvania

Study: Rugby-style tackling may have lower force of impact than football-style tackling

The style of tackling used in rugby may be associated with a lower force of impact than the style used in football, according to a preliminary study of college athletes released today that will be presented at the American Academy of Neurology Sports Concussion Conference in Indianapolis July 26-28, 2019.

"For athletes who participate in a sport that involves a tackle or direct contact, adapting a rugby-style tackle where the players lead with their shoulders, not their heads, could make college sports safer," said study author Zach Garrett, DHS, of Marshall University in Huntington, W.Va. "A small number of NFL teams have incorporated the rugby-style tackle in an effort to reduce risk of concussion."

The study measured impact data from 30 male university athletes during their spring practice season. Twenty of the participants were football players who had impact sensors placed in their helmets. Ten of the participants were rugby players who had mouthguards with sensors inserted into them.

At the end of the practice season, the football participants totaled 3,921 impacts over the course of 12 practices, compared to 1,868 impacts over nine practices received by rugby participants. After researchers adjusted for other factors such as false impacts, different sample sizes, and practices, they found that the frequency of impacts was lower for the rugby players than for the football players. The research team also found that the sensors recorded lower impact forces to the head in rugby in comparison to football.

Impact was measured in g-force, a measure of acceleration expressed in multiples of the acceleration due to gravity. Overall, the rugby players had impacts with an average of 21 g-force. Football players had impacts with an average of 63 g-force.

"Further studies with larger numbers of participants are needed to confirm these results and also to determine whether using a rugby-style tackle could effectively reduce the force of impact and potentially reduce the number or severity of concussions in college football," said Garrett.

Credit: 
American Academy of Neurology

Cancer device created at Rutgers to see if targeted chemotherapy is working

image: This image shows six devices with biosensors to detect whether a cancer cell is alive when it passes through a tiny hole for fluids. The devices fit on a 3-inch wide piece of glass.

Image: 
Zhongtian Lin

Rutgers researchers have created a device that can determine whether targeted chemotherapy drugs are working on individual cancer patients.

The portable device, which uses artificial intelligence and biosensors, is up to 95.9 percent accurate in counting live cancer cells when they pass through electrodes, according to a study in the journal Microsystems & Nanoengineering.

"We built a portable platform that can predict whether patients will respond positively to targeted cancer therapy," said senior author Mehdi Javanmard, an assistant professor in the Department of Electrical and Computer Engineering in the School of Engineering at Rutgers University-New Brunswick. "Our technology combines artificial intelligence and sophisticated biosensors that handle tiny amounts of fluids to see if cancer cells are sensitive or resistant to chemotherapy drugs."

The device provides immediate results and will allow for more personalized interventions for patients as well as better management and detection of the disease. It can rapidly analyze cells without having to stain them, allowing for further molecular analysis and instantaneous results. Current devices rely on staining, limiting the characterization of cells.

"We envision using this new device as a point-of-care diagnostic tool for assessing patient response and personalization of therapeutics," the study says.

Treatment of cancer patients often requires drugs that can kill tumor cells, but chemotherapy destroys both tumor cells and healthy cells, causing side effects such as hair loss and gastrointestinal problems.

Co-author Joseph R. Bertino, a resident researcher at Rutgers Cancer Institute of New Jersey and professor at Rutgers Robert Wood Johnson Medical School, and his team previously developed a therapeutic approach that targets cancer cells, such as those in B-cell lymphoma, multiple myeloma and epithelial carcinomas. It binds a chemotherapy drug to an antibody so only tumor cells are targeted, and minimizes interaction with healthy cells. Patients will respond positively to this therapy if their tumor cells generate a protein called matriptase. Many patients will benefit while the side effects from standard chemotherapy are minimized.

"Novel technologies like this can really have a positive impact on the standard-of-care and result in cost-savings for both healthcare providers and patients," Bertino said.

The Rutgers team tested their new device using cancer cell samples treated with different concentrations of a targeted anticancer drug. The device detects whether a cell is alive based on the shift in its electrical properties as it passes through a tiny fluidic hole. The next step is to perform tests on tumor samples from patients. The researchers hope the device will eventually be used to test cancer therapies on samples of patient tumors before treatment is administered.
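
The release does not describe the authors' actual classifier, but the general flavor of such a pipeline -- labeling each cell live or dead from the electrical signature it produces while crossing the sensing channel -- can be sketched as follows. The synthetic impedance features and the logistic-regression model are our illustrative assumptions, not the published method.

```python
# Illustrative sketch only: classifying cells as live vs. dead from
# hypothetical impedance features (magnitude, phase shift) per transit.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 1000
# Synthetic stand-in data: live cells assumed to show higher impedance
# magnitude and phase shift than dead cells.
live = rng.normal(loc=[1.0, 0.30], scale=0.08, size=(n, 2))
dead = rng.normal(loc=[0.7, 0.10], scale=0.08, size=(n, 2))
X = np.vstack([live, dead])
y = np.array([1] * n + [0] * n)  # 1 = live, 0 = dead

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = LogisticRegression().fit(X_train, y_train)
print(f"held-out accuracy: {clf.score(X_test, y_test):.3f}")
```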

Credit: 
Rutgers University

Treating stroke patients just 15 minutes earlier can save lives

Initiating stroke treatment just 15 minutes faster can save lives and prevent disability, according to a new UCLA-led study, published today in JAMA. The research also determined that busier hospitals -- those that treat more than 450 people for stroke each year -- have better outcomes than those that treat fewer than 400 stroke patients per year.

Researchers at the David Geffen School of Medicine at UCLA and five other institutions in the U.S. and Canada examined data for 6,756 people who experienced ischemic strokes. The patients' median age was 71, and 51.2% were women.

The researchers looked at stroke patients' treatment results in light of their "door-to-puncture" time -- that is, the interval from their arrival at the hospital to the time their treatment began.

The data showed that for every 1,000 people whose door-to-puncture time was 15 minutes sooner, 15 fewer died or were discharged to hospice care, 17 more were able to walk out of the hospital without assistance and 22 more could care for themselves after being discharged from the hospital. Researchers found that patients' median time from arriving at the hospital to the beginning of treatment was one hour, 27 minutes, and the median time from the onset of symptoms to treatment was three hours, 50 minutes.

All of the patients in the study were treated with endovascular reperfusion therapy, which is used to treat strokes caused by a blockage in one of the major arteries of the brain.

The study is one of the largest to quantify the number of patients per thousand that could be saved by earlier stroke treatment, and to do so using real-world data as opposed to a clinical trial, according to Dr. Reza Jahan, the study's co-lead author and a professor of interventional neuroradiology at the Geffen School of Medicine.

About 795,000 people in the U.S. have strokes each year, and about 140,000 die as a result. Ischemic strokes, which occur when a vessel supplying blood to the brain is obstructed, account for 87% of all strokes. (Other types of strokes include hemorrhagic strokes and transient ischemic attacks, which are sometimes referred to as mini strokes.)

Based on the study's results, shaving 15 minutes off treatment time could potentially improve outcomes for thousands of people each year.

The study found that hospitals that perform endovascular reperfusion therapy on more than 50 patients per year generally begin treatment faster than hospitals that perform fewer than 30; and that initial treatment tends to be delayed at hospitals that are not certified as comprehensive stroke centers or are located in the Northeast, as well as for people who have a stroke during hospital "off hours" -- weekends, holidays, and before 7 a.m. and after 6 p.m. on weekdays.

"We're trying to improve treatment with better staffing on off hours and getting doctors to the hospital quicker when they're on call," Jahan said. "Patients who arrive at the hospital at 2 a.m. should be treated no differently than people who arrive at 2 p.m."

Treatment delays also are more likely for people who live alone or fail to recognize their own stroke symptoms.

Based on the study results, the American Heart Association has already published new goals regarding how fast patients should be treated at comprehensive stroke centers, Jahan said.

Credit: 
MediaSource

Risk and progression of Alzheimer's disease differ by sex

The abnormal accumulation of proteins in the brain is a biological marker for Alzheimer's disease, but the ways in which these proteins spread may help explain why the prevalence of Alzheimer's is higher in women than in men.

A recent study by researchers from the Center for Cognitive Medicine at Vanderbilt University Medical Center identified differences between men and women in the spread of a protein called tau, which is linked to cognitive impairment: women showed a larger brain-wide accumulation of tau than men, driven by an accelerated spread.

The findings were presented at the Alzheimer's Association International Conference July 14-18 in Los Angeles.

Accumulating evidence suggests that tau spreads through brain tissue like an infection, traveling from neuron to neuron and turning other proteins into abnormal tangles, subsequently killing brain cells. Using data from positron emission tomography (PET) scans of healthy individuals and patients with mild cognitive impairment who were enrolled in the Alzheimer's Disease Neuroimaging Initiative (ADNI) database, the center's researchers constructed in vivo networks modeling tau spread using graph theory analysis.

"It's kind of like reconstructing a crime scene after a crime. You weren't there when it happened, but you can determine where an intruder entered a house and what room they entered next," said Sepi Shokouhi, PhD, assistant professor of Psychiatry and Behavioral Sciences and lead investigator for the study. "The graph analysis does something similar to show how tau spreads from one region to another."

The results of the analysis showed the architecture of tau networks is different in men and women, with women having a larger number of "bridging regions" that connect various communities in the brain. This difference may allow tau to spread more easily between regions, boosting the speed at which it accumulates and putting women at greater risk for developing Alzheimer's disease.
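
The release does not spell out the graph metrics involved, but "bridging" can be quantified with standard network tools. The toy sketch below scores each node in a stand-in graph by the fraction of its connections that cross community boundaries; the example graph and the measure are illustrative assumptions, not the study's data or analysis.

```python
# Illustrative sketch: scoring "bridging" nodes in a network, i.e., nodes
# whose edges frequently connect different communities.
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

G = nx.karate_club_graph()  # stand-in for a brain network of tau accumulation
communities = greedy_modularity_communities(G)
membership = {node: i for i, comm in enumerate(communities) for node in comm}

def bridging_fraction(graph, node):
    """Fraction of a node's edges that connect to other communities."""
    neighbors = list(graph.neighbors(node))
    if not neighbors:
        return 0.0
    cross = sum(membership[n] != membership[node] for n in neighbors)
    return cross / len(neighbors)

scores = {node: bridging_fraction(G, node) for node in G.nodes}
top = sorted(scores, key=scores.get, reverse=True)[:5]
print("most 'bridging' nodes:", top)
```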

If proven, an accelerated spread of tau in women may indicate a need for sex-specific approaches for the prevention of Alzheimer's disease, including earlier therapies, lifestyle interventions and/or cognitive remediation. More studies are needed to validate the accelerated tau spread model in women.

"Understanding how different biological processes influence our memory is a really important topic. Sex-specific differences in the brain's pathological, neuroanatomical and functional organization may map into differences at a neurobehavioral and cognitive level, thus explaining differences in the prevalence of neurodegenerative disorders and helping us develop appropriate treatments," said Shokouhi.

Credit: 
Vanderbilt University Medical Center

Health impairment from carbofuran in red chili unlikely

Carbofuran is a plant protection product that can be used against certain insects, mites, ticks and nematodes. Based on the amount of carbofuran residues measured and the estimated dietary intake of red chilies, the acute reference dose (ARfD) is not expected to be exceeded for children or adults. The ARfD describes the quantity of a substance per kilogram of body weight that consumers can ingest with their food in one meal, or in several meals spread over one day, without any recognisable effect on health. The ARfD is therefore a limit value for assessing risk from short-term intake.
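
The exceedance check described here has a simple structure: the single-occasion intake of the residue per kilogram of body weight is compared against the ARfD. The sketch below uses invented placeholder values throughout; they are not BfR's measurements or the actual ARfD for carbofuran.

```python
# Schematic ARfD exceedance check (all numbers are invented placeholders).
def acute_intake_mg_per_kg_bw(residue_mg_per_kg, portion_kg, body_weight_kg):
    """Single-occasion intake of a residue per kilogram of body weight."""
    return residue_mg_per_kg * portion_kg / body_weight_kg

ARFD_MG_PER_KG_BW = 0.001  # hypothetical reference value
intake = acute_intake_mg_per_kg_bw(residue_mg_per_kg=0.05,  # hypothetical residue level
                                   portion_kg=0.01,         # 10 g of chili in one meal
                                   body_weight_kg=16.0)     # small child
print(f"intake: {intake:.6f} mg/kg bw, exceeds ARfD: {intake > ARFD_MG_PER_KG_BW}")
```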

Other plant protection products were also detected in the 10 samples from the manufacturer (2,308 grams in total), but their contribution is negligible in relation to carbofuran, so no health risk is to be expected from them either. As the number of samples is small, the results cannot be generalised to red chili as a whole.

Credit: 
BfR Federal Institute for Risk Assessment

Researchers wirelessly hack 'boss' gene, a step toward reprogramming the human genome

image: The left image above shows the gene FGFR1 in its natural state. The right image shows the gene when exposed to laser light, which causes the gene to activate and deactivate.

Image: 
University at Buffalo

BUFFALO, N.Y. -- It seems like everything is going wireless these days. That now includes efforts to reprogram the human genome.

A new University at Buffalo-led study describes how researchers wirelessly controlled FGFR1 -- a gene that plays a key role in how humans grow from embryos to adults -- in lab-grown brain tissue.

The ability to manipulate the gene, the study's authors say, could lead to new cancer treatments, and ways to prevent and treat mental disorders such as schizophrenia.

The work -- spearheaded by UB researchers Josep M. Jornet, Michal K. Stachowiak, Yongho Bae and Ewa K. Stachowiak -- was reported in the June edition of the Proceedings of the Institute of Electrical and Electronics Engineers.

It represents a step forward toward genetic manipulation technology that could upend the treatment of cancer, as well as the prevention and treatment of schizophrenia and other neurological illnesses. It centers on the creation of a new subfield of research the study's authors are calling "optogenomics," or controlling the human genome through laser light and nanotechnology.

"The potential of optogenomic interfaces is enormous," says co-author Josep M. Jornet, PhD, associate professor in the Department of Electrical Engineering in the UB School of Engineering and Applied Sciences. "It could drastically reduce the need for medicinal drugs and other therapies for certain illnesses. It could also change how humans interact with machines."

From "optogenetics" to "optogenomics"

For the past 20 years, scientists have been combining optics and genetics -- the field of optogenetics -- with a goal of employing light to control how cells interact with each other.

By doing this, one could potentially develop new treatments for diseases by correcting the miscommunications that occur between cells. While promising, this research does not directly address malfunctions in genetic blueprints that guide human growth and underlie many diseases.

The new research begins to tackle this issue because FGFR1 -- it stands for Fibroblast Growth Factor Receptor 1 -- holds sway over roughly 4,500 other genes, about one-fifth of the human genome, as estimated by the Human Genome Project, says study co-author Michal K. Stachowiak.

"In some respects, it's like a boss gene," says Stachowiak, PhD, professor in the Department of Pathology and Anatomical Sciences in the Jacobs School of Medicine and Biomedical Sciences at UB. "By controlling FGFR1, one can theoretically prevent widespread gene dysregulations in schizophrenia or in breast cancer and other types of cancer."

Light-activated toggle switches

The research team was able to manipulate FGFR1 by creating tiny photonic brain implants. These wireless devices include nano-lasers and nano-antennas and, in the future, nano-detectors.

Researchers inserted the implants into the brain tissue, which was grown from induced pluripotent stem cells and enhanced with light-activated molecular toggle switches. They then shone different laser lights -- a common blue laser, a red laser and a far-red laser -- onto the tissue.

The interaction allowed researchers to activate and deactivate FGFR1 and its associated cellular functions -- essentially hacking the gene. The work may eventually enable doctors to manipulate patients' genomic structure, providing a way to prevent and correct gene abnormalities, says Stachowiak, who also holds an appointment in UB's Department of Biomedical Engineering, a joint program between the Jacobs School and UB's engineering school.

Next steps

The development is far from entering the doctor's office or hospital, but the research team is excited about next steps, which include testing in 3D "mini-brains" and cancerous tissue.

Additional study authors include Pei Miao and Amit Sangwan of the UB Department of Electrical Engineering; Brandon Decker, Aesha Desai and Christopher Handelmann of the UB Department of Pathology and Anatomical Sciences; Liang Feng, PhD, of the University of Pennsylvania; and Anna Balcerak of the Maria Sklodowska-Curie Memorial Cancer Center and Institute of Oncology in Poland.

Credit: 
University at Buffalo

Micro-ribonucleic acid in milk: Health risk very unlikely

Milk contains various types of ribonucleic acid (RNA). One type is micro-RNA (miRNA), whose job is to regulate numerous processes in a cell. It has been suggested, however, that some of these miRNAs are involved in the emergence of tumours and other health problems.

The German Federal Institute for Risk Assessment (BfR) was requested to assess the potential health risks of the miRNAs contained in cows' milk and dairy products. Data on such factors as the intake of miRNAs are urgently needed for a definitive risk assessment, but no such data are available at this point in time. The data that are currently available do not permit the conclusion that miRNAs in milk pose a health risk.

Based on the available data on miRNAs, the BfR views it as highly unlikely that the miRNAs ingested with milk have any effect on human health. Current scientific knowledge does not supply any grounds to advise the general population to refrain from consuming milk and dairy products in the recommended quantities and amounts that are common in Germany.

Credit: 
BfR Federal Institute for Risk Assessment

Gaia starts mapping the galactic bar in the Milky Way

video: 3D Visualisation of the density of stars in the ESA/Gaia data release 2. The view rotates around the solar position. The colour encodes density: blue is low, yellow/orange are high. Due to the Gaia selection function, the highest stellar density is measured close to the Sun, but it is also possible to discern some nearby star clusters, and even the Galactic bar (mainly its orange-clump stars).

Image: 
StarHorse team. Visualisation: Arman Khalatyan. Background image at the beginning of the video: NASA/Caltech/R. Hurt

The second release of data from ESA's Gaia star-mapping satellite, published in 2018, has been revolutionising many fields of astronomy. The unprecedented catalogue contains the brightness, positions, distance indicators and motions across the sky for more than one billion stars in our Milky Way galaxy, along with information about other celestial bodies.

This is just the beginning. While the second release is based on the first twenty-two months of Gaia's surveys, the satellite has been scanning the sky for five years, and will keep doing so at least until 2022. New data releases planned in coming years will steadily improve measurements as well as provide extra information that will enable us to chart our home galaxy and delve into its history like never before.

Meanwhile, a team of astronomers have combined the latest Gaia data with infrared and optical observations performed from the ground and from space to provide a preview of what future releases of ESA's stellar surveyor will reveal.

"We looked in particular at two of the stellar parameters contained in the Gaia data: the surface temperature of stars and the 'extinction', which is basically a measure of how much dust there is between us and the stars, obscuring their light and making it appear redder," says Friedrich Anders ICCUB member and lead author of the new study.

"These two parameters are interconnected, but we can estimate them independently by adding extra information obtained by peering through the dust with infrared observations", continues the expert.

The team combined the second Gaia data release with several infrared surveys using a computer code called StarHorse, developed by co-author Anna Queiroz and other collaborators. The code compares the observations with stellar models to determine the surface temperature of stars, the extinction and an improved estimate of the distance to the stars.

As a result, the astronomers obtained a much better determination of the distances to about 150 million stars -- in some cases, the improvement is 20% or more. This enabled them to trace the distribution of stars across the Milky Way to much greater distances than is possible with the original Gaia data alone.
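
StarHorse itself is far more sophisticated, but the core idea of comparing observations against a grid of stellar models can be caricatured in a few lines. The sketch below picks a best-fitting distance and extinction for a single star from its apparent magnitude and parallax; the model grid, noise levels and assumed absolute magnitude are all our own simplifications, not the StarHorse code.

```python
# Toy caricature of model-grid fitting in the spirit of StarHorse:
# choose the (distance, extinction) pair that best explains an observed
# apparent magnitude and parallax. All values are assumptions.
import numpy as np

obs_mag, mag_err = 14.2, 0.05   # observed apparent magnitude
obs_plx, plx_err = 0.40, 0.10   # observed parallax, milliarcseconds
abs_mag = 0.5                   # assumed absolute magnitude of the model star

distances = np.linspace(500, 5000, 200)    # parsec
extinctions = np.linspace(0.0, 2.0, 100)   # magnitudes of extinction
d, a = np.meshgrid(distances, extinctions)

model_mag = abs_mag + 5 * np.log10(d / 10.0) + a  # distance modulus + extinction
model_plx = 1000.0 / d                            # parallax in mas

chi2 = ((model_mag - obs_mag) / mag_err) ** 2 + ((model_plx - obs_plx) / plx_err) ** 2
i, j = np.unravel_index(np.argmin(chi2), chi2.shape)
print(f"best-fit distance: {distances[j]:.0f} pc, extinction: {extinctions[i]:.2f} mag")
```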

"With the second Gaia data release, we could probe a radius around the Sun of about 6500 light years, but with our new catalogue, we can extend this 'Gaia sphere' by three or four times, reaching out to the centre of the Milky Way," explains co-author Cristina Chiappini from Leibniz Institute for Astrophysics Potsdam, Germany, where the project was coordinated.
At the centre of our galaxy, the data clearly reveals a large, elongated feature in the three-dimensional distribution of stars: the galactic bar.

"We know the Milky Way has a bar, like other barred spiral galaxies, but so far we only had indirect indications from the motions of stars and gas, or from star counts in infrared surveys. This is the first time that we see the galactic bar in three-dimensional space, based on geometric measurements of stellar distances," says Friedrich Anders.

"Ultimately, we are interested in galactic archaeology: we want to reconstruct how the Milky Way formed and evolved, and to do so we have to understand the history of each and every one of its components," adds Cristina Chiappini.

"It is still unclear how the bar - a large amount of stars and gas rotating rigidly around the centre of the galaxy - formed, but with Gaia and other upcoming surveys in the next years we are certainly on the right path to figure it out", notes the researcher.

The team is looking forward to the next data release from the Apache Point Observatory Galaxy Evolution Experiment (APOGEE-2), as well as upcoming facilities such as the 4-metre Multi-Object Survey Telescope (4MOST) at the European Southern Observatory in Chile and the WEAVE (WHT Enhanced Area Velocity Explorer) survey at the William Herschel Telescope (WHT) in La Palma (Canary Islands).

The third Gaia data release, currently planned for 2021, will include greatly improved distance determinations for a much larger number of stars, and is expected to enable progress in our understanding of the complex region at the centre of the Milky Way.

"With this study, we can enjoy a taster of the improvements in our knowledge of the Milky Way that can be expected from Gaia measurements in the third data release," explains co-author Anthony Brown of Leiden University (the Netherlands).

"We are revealing features in the Milky Way that we could not see otherwise: this is the power of Gaia, which is enhanced even further in combination with complementary surveys," concludes Timo Prusti, Gaia project scientist at ESA.

Credit: 
University of Barcelona

First ever state sepsis regulation in US tied to lower death rates

PITTSBURGH, July 16, 2019 - Death rates from sepsis fell faster in New York than expected--and faster than in peer states--following the introduction of the nation's first state-mandated sepsis regulation, according to an analysis led by University of Pittsburgh researchers and published today in JAMA. The policy requires all New York hospitals to quickly implement certain protocols when the deadly condition is suspected.

The finding is good news for the nearly dozen other states in varying stages of adopting similar policies to reduce deaths from sepsis, the leading cause of death in hospitalized patients. Sepsis is a life-threatening condition that arises when the body's response to an infection injures its own tissues and organs.

"Rarely in the U.S. do we force hospitals to implement specific clinical protocols. Typically, quality improvement is achieved through financial incentives and public reporting," said lead author Jeremy Kahn, M.D., M.S., professor in the Department of Critical Care Medicine at Pitt's School of Medicine and the Department of Health Policy and Management at Pitt's Graduate School of Public Health. "For the first time, state officials are enshrining in regulations that hospitals must follow certain evidence-based protocols when it comes to sepsis. And our study finds that, at least in New York, it seemed to work."

Rory's Regulations were issued by the New York State Department of Health in 2013 after 12-year-old Rory Staunton died of undiagnosed sepsis. The regulations require that hospitals in New York follow protocols for sepsis that include giving antibiotics within three hours and intravenous fluids within six hours of hospitalization. The hospitals also are required to regularly train staff in the protocols and to report adherence and clinical outcomes to the state.

Kahn and his team analyzed records of more than a million sepsis admissions in 509 hospitals in New York and four control states without a sepsis regulation: Florida, Maryland, Massachusetts and New Jersey. The team looked at data from the two years before Rory's Regulations were adopted and the two years after.

In the years before the regulations went into place, 26.3% of the people diagnosed with sepsis in New York died while hospitalized, compared to a rate of 22% in the control states. Following the regulations, New York's sepsis mortality rate dropped 4.3 percentage points to 22%, while the death rate in the control states fell only 2.9 percentage points, to 19.1%.

After accounting for patient and hospital characteristics, as well as pre-existing sepsis trends in the states, New York's sepsis death rate was 3.2 percentage points lower following the regulation than would have been expected relative to the control states. This comparison was crucial to estimating the improvement and sets this study apart from prior work. Sepsis outcomes are known to improve over time, so a study looking only at New York would not be able to differentiate the effects of the regulations from underlying trends. Because these improvements occurred more quickly in New York than in the other states, the researchers are more confident that the regulations are the source of the improvement.
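
The comparison the researchers describe is a difference-in-differences design. Stripped of the study's statistical adjustments, its arithmetic can be shown with the unadjusted rates quoted above.

```python
# Difference-in-differences arithmetic using the unadjusted rates above.
# The published 3.2-point estimate additionally adjusts for patient,
# hospital and trend differences; this sketch shows only the raw logic.
ny_before, ny_after = 26.3, 22.0      # New York mortality, % of sepsis admissions
ctrl_before, ctrl_after = 22.0, 19.1  # control-state mortality, %

ny_change = ny_after - ny_before        # -4.3 percentage points
ctrl_change = ctrl_after - ctrl_before  # -2.9 percentage points
did = ny_change - ctrl_change           # extra improvement seen in New York
print(f"unadjusted difference-in-differences: {did:.1f} percentage points")
```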

"Sepsis is a tremendous global health burden, so developing proven ways to quickly recognize and treat people who have it is a top public health priority," said senior author Derek Angus, M.D., M.P.H., professor and chair of Pitt's Department of Critical Care Medicine and director of Pitt's Clinical Research, Investigation, and Systems Modeling of Acute Illness (CRISMA) Center. "While every state should consider their specific population and needs when developing regulations, our analysis reveals that policies enforcing evidence-based clinical protocols for the timely recognition and treatment of sepsis saves lives."

Credit: 
University of Pittsburgh

Risk of death before and after state-mandated protocols for sepsis care in New York

What The Study Did: Hospital discharge data was used to examine the association between New York state sepsis regulations and the outcomes of patients hospitalized with sepsis.

Authors: Jeremy M. Kahn, M.D., M.S., of the University of Pittsburgh, is the corresponding author.

(doi:10.1001/jama.2019.9021)

Editor's Note: The article includes conflict of interest and funding/support disclosures. Please see the article for additional information, including other authors, author contributions and affiliations, financial disclosures, funding and support, etc.

Credit: 
JAMA Network