Women's basic rights under threat from Trump gag rule, warn experts

The Trump administration is mounting a ferocious attack on abortion rights with plans for a domestic gag rule on abortion counselling and provision, warn experts in The BMJ today.

Under this rule, clinics or programs that receive federal family planning funds "would be prohibited from providing abortions, referring women to places that do, or even counselling women that abortion is an option," explain Dr Natalie Gladstein and colleagues at the University of Michigan.

Such funds currently go towards comprehensive healthcare services like contraception, cancer screening, and sexually transmitted disease (STD) treatment and are accessed by over 4 million Americans a year.

They point out that this rule is opposed by the US medical community, including the American College of Obstetricians and Gynecologists (ACOG), the American Academy of Pediatrics (AAP), and the American College of Physicians (ACP).

And it is well documented that when this law is in effect "there are more unplanned pregnancies, more unsafe abortions, and more maternal deaths," they warn.

They point to a 2017 poll showing only 18% of Americans feel that abortion should be illegal in all or most instances, meaning the majority of Americans support a woman's right to safe, legal abortion.

The proposed rule therefore "imposes the conservative religious beliefs of a minority on the entire American population, directly contradicting the fundamental rights of freedom of speech and freedom of religion on which our nation was founded," they argue.

This rule also represents a gross interference in the patient-physician relationship, by preventing healthcare providers from giving their patients comprehensive, medically accurate information, they write.

It is also unclear whether the policy is actually legal under legislation that requires the government to support "a broad range of acceptable and effective family planning services."

The authors acknowledge that plans to block the policy in federal court may not necessarily be bad for Republican politicians "since it could be used to energize their voters in the November elections." But they argue that if this proposed rule is allowed to go into effect, people will not get the health care they need.

"Everyone, regardless of their race, income, or where they live, deserves the best medical care and information available," they write. "Under this rule, they won't get it."

This partisan policy "is not based in medical fact or established law and will lead to poorer quality healthcare and less access to health care. This is an attempt to take away women's basic rights. Period," they conclude.

Credit: 
BMJ Group

Research finds fair classroom practices disarm threat of evaluation retaliation

image: This is Thomas Tripp.

Image: 
WSU

VANCOUVER, Wash. - While tuition inflation presents a challenge for many college-bound students, an area of growing concern for many universities is "grade inflation" -- in part caused when instructors grade more leniently to discourage students from retaliating by giving low teaching evaluations.

Washington State University researchers say instructors can stop worrying about evaluation revenge as long as they use practices in the classroom that students perceive as fair.

"We've long known there's an association between students' expected course grades and how they evaluate teachers. However, our study is the first to show that grades influence evaluations much less, if at all, when students can see what fair processes instructors use to assign grades," said lead author Thomas Tripp, Carson College of Business associate dean, WSU Vancouver.

Tripp conducted this study with former WSU doctoral students Lixin Jiang, University of Auckland; Kristine Olson, Dixie State University; and Maja Graso, University of Otago.

"Faculty may not feel a need to award artificially high grades if they know how students' perceptions of justice might influence this relationship," Tripp said.

Fairness is more than grades

The researchers found students' perception of fair classroom processes revolves around four essential teaching practices: (1) following the course rules by using grading rubrics that match stated criteria, and by aligning their course presentation and expectations to the syllabus; (2) obtaining student feedback and incorporating their interests and voice; (3) being aware of bias and grading blindly; and, (4) correcting grades by providing policies for make-up work and absences.

But students' concept of fairness extends beyond just grades, the researchers found.

"We were a bit surprised to learn of other criteria that students defined as 'fair,' including how well the class is run and how much the professor goes out of the way to help students," said Tripp. "We can see how these are important to students, but they don't fit any definition of 'fair' that we know of."

"The most interesting thing we found in our study is that perception of fair process completely eliminated the threat of student retaliation via low teaching evaluations," Tripp said.

Recommendations for instructors

Based on the findings, the researchers recommend instructors follow specific procedures to ensure a fair classroom.

For instance:

Use grading rubrics consistently and share them with students.

Put course policies, such as those for late assignment submissions, in writing in the syllabus.

Include grade-appeal procedures in the course policies, and if possible, have students submit their appeals by student ID number rather than by name.

Should a grade appeal move up to a panel, the panel could include students to increase representativeness.

Fair processes worth the effort

"While adding such processes may seem like a lot of work, we believe instituting fair processes is the superior option for several reasons," said Tripp. "Rampant use of grading leniency may contribute to grade inflation, which is advancing each decade and diminishing the power of grades to motivate students to work harder."

Tripp said by ignoring fair classroom processes and by grading leniently, instructors risk creating perceptions of both unfair outcomes and unfair process, a deadly combination that is associated with lower student evaluations of teachers.

Credit: 
Washington State University

Medscape report finds physicians are sexually harassed on the job

New York, NY, June 13, 2018: A new report from Medscape finds that more than 1 in 10 female physicians and 16% of female residents have experienced sexual harassment within the past three years. Overall, 7% of physicians (12% women, 4% men), and 9% of medical residents (16% women, 4% men) reported harassment.

More than 3,700 physicians and medical residents responded to the 2018 Medscape Report: Sexual Harassment of Physicians. The report found that nearly half (47%) of physicians who indicated they had been harassed said they were harassed by another physician (54% for residents), with other harassers identified as administrators, non-medical personnel or patients (29%), nurses or nurse practitioners (17%), medical residents and fellows (4%) or medical students (1%). Nearly all (97%) of the female physicians who responded that they had been harassed said the perpetrator was male. Of male physicians who were harassed, 23% were harassed by another man, and 77% were harassed by a woman. Most physicians reporting harassment were between the ages of 35 and 44.

The most common types of harassment reported by survey respondents included sexual comments about body parts or anatomy, unwanted groping, hugging, patting, or other physical contact, sexual remarks and leering, and deliberately infringing on personal space/standing too close. One in 5 physicians reported being asked repeatedly for a date, and more than 20% were harassed with explicit or implicit propositions to engage in sexual activity or received unwanted sexual texts or emails. Sexual assault, rape, promotions or raises in exchange for sexual relations and retaliations for refusal of sexual advances were reported at lower rates.

To read the report, click here: https://www.medscape.com/viewarticle/897786?faf=1

Comprehensive Report of Recent Behaviors

Medscape’s report provides a comprehensive view of the current state of sexual harassment for physicians, medical residents, and other health care professionals, i.e. incidents since 2015. Part 1, released today, focuses on the experiences of physicians and medical residents. Part 2 will report on the experiences of nurses, nurse practitioners and physician assistants, and Part 3 on sexual harassment of physicians from patients. Parts 2 and 3 will be released separately. More than 6,200 health care professionals responded to the survey overall.

The findings come amid reports of sexual misconduct in numerous professions and at a time when the percentage of female physicians and medical students is increasing.

“The Medscape report underscores the need to take on the issue of harassment within the medical community and ensure that those who are victimized will be heard,” said Hansa Bhargava, M.D., Medscape Medical Editor. “Now is the time to come to terms with the reality of the problem – that harassment can occur in healthcare institutions and many victims feel that their complaints will not be taken seriously. Healthcare organizations and practices need to work to change their cultures and to fully investigate the incidents.”

Fears of Retaliation, Trivialization and Loss of Reputation

About half of physicians and residents said they did not confront the issue when the incident happened, saying nothing to their harasser. Forty percent of physicians said they reported the offensive behavior. Of those 40% who did, 54% said that their organizations either did nothing or trivialized the incident, and more than half said that reporting the incident had a negative impact on their job or was not taken seriously. Only one-quarter of all incidents that were reported resulted in an investigation. Action was taken in about 38% of those cases, including the harasser being reprimanded, fired, moved or made to apologize.

Emotional and Professional Impact

Most physicians experiencing harassment said the incidents took place primarily in areas away from patients, such as administrative areas, on-call rooms, and hallways. One in 5 residents said the abuse took place in the operating room. More than one-third (34%) of physicians who were harassed said it interfered with their ability to do their job. Nearly 40% said they avoided working with specific colleagues when possible, and more than 14% decided to quit their jobs because of harassment.

“Even when looking at the issue within the past three years, the Medscape report finds that sexual harassment is happening, and sometimes at the hands of colleagues,” said Leslie Kane, MA, Senior Director of Medscape Business of Medicine. “Incidents of harassment can damage physicians professionally and personally, and in some cases interfere with their ability to care for patients. We hope that the report findings increase awareness of the problem and contribute to change.”

Methodology

Survey Method: Physicians, residents, nurses, nurse practitioners, and physician assistants were invited to participate in a 5- to 7-minute online survey.

Screening Requirements: Respondents were required to reside and practice in the United States.

Sample Size: 6,235 respondents across 29+ specialties met the screening criteria and completed the survey; residents were weighted to Association of American Medical Colleges distribution by gender.

Total physicians: n = 3,711

Total residents: n = 440

Data Collection Period: March 2-April 23, 2018

Sampling Error: The margin of error for the survey was ± 1.24% at a 95% confidence level using a point estimate of 50%. The margin of error for physicians who experienced harassment was ± 5.92%.
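The full-sample figure is consistent with the standard margin-of-error formula for a proportion, z·√(p(1−p)/n), with z = 1.96 for 95% confidence and p = 0.5. A quick check (the function name is mine, for illustration):

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Margin of error for a sample proportion at ~95% confidence
    (z = 1.96), using point estimate p."""
    return z * math.sqrt(p * (1 - p) / n)

# Full sample: 6,235 respondents, point estimate 50%
print(f"{100 * margin_of_error(6235):.2f}%")  # 1.24%
```

The larger ±5.92% figure reflects the much smaller subgroup of physicians who reported experiencing harassment.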

Credit: 
DKC

Network biology reveals pathogen targets in the model plant Arabidopsis thaliana

image: This is Shahid Mukhtar.

Image: 
UAB

BIRMINGHAM, Ala. - How are proteins in the cells of a flowering plant similar to social networks on Twitter or Facebook? And how might both of those be related to the way pathogens make plants or people sick?

Shahid Mukhtar, Ph.D., and colleagues at the University of Alabama at Birmingham address these questions in a collaborative study with researchers at the Gregor Mendel Institute, Vienna, Austria. Using systems biology, they successfully identified previously unknown protein targets of plant pathogens in the flowering plant Arabidopsis thaliana, employing some of the same methods used to analyze social networks or biological networks. Their theoretical framework, they say, could help analyze other interactions between species to reveal pathogen contact points.

In a social network, one can map the connections between followers or friends on Twitter and Facebook. A few people will have a huge number of connections, some will have many, and the vast majority will have far fewer. A map of these connections is akin to an airline route map, and the architecture of the networks shows topological features like hubs and bottlenecks. Analysis of social networks has allowed experts to identify people who are the "best information spreaders."

Similarly, for biology, deciphering the network architecture in an ecosystem or among macromolecules within cells of a lifeform can help discover novel components in those complex systems and provide biological insights and testable hypotheses.

One biological network is the web of protein-protein interactions inside cells. Such networks have been studied in organisms as diverse as plants, humans and roundworms. A network map of those protein-protein interactions, which signify proteins operating in conjunction with other partners, is called a protein interactome.

In a study published in Nature Communications, Mukhtar, an assistant professor in the Department of Biology, UAB College of Arts and Sciences, together with colleagues at UAB and the Gregor Mendel Institute, analyzed two different interactomes. The first one is the global protein interactome for proteins inside leaf cells of the model plant A. thaliana, called the Arabidopsis interactome. They also analyzed another, more specific protein interactome for receptors on the cell surface that allow plants to see, hear, smell and respond to environmental cues and dangers -- especially to virulent pathogens. This second network is called the cell surface interactome.

For the proteins inside cells, they first matched more than 4,300 Arabidopsis protein-coding genes to five classes of observable traits, known as phenotypes. The five phenotype groups were essential genes that are needed for plant survival; morphological genes that control a plant's shape or appearance; genes for cellular or biochemical processes; conditional genes, where a mutation shows its effect only when the plant is subjected to stresses like water shortage or temperature extremes; and genes with no known phenotype.

When they associated the phenotypes with a protein interactome network they had previously mapped, they found the large hub and bottleneck nodes were enriched for conditional phenotype genes and depleted for essential genes. This contrasts with the controversial centrality-lethality rule seen in yeast protein interactomes, where the large hub and bottleneck nodes are enriched for essential genes.

Pathogens of Arabidopsis are able to inject pathogen proteins into the plant cells, and those injected "effector" proteins manipulate the plant's network to the pathogen's advantage. Mukhtar and colleagues had previously constructed two interspecies plant-pathogen interaction networks between pathogen effector proteins and Arabidopsis proteins inside the cell.

The team found that large hubs in the Arabidopsis interactome comprised only 6.5 percent of the targets of the pathogen effector proteins, which made it a limited way to identify targets using network biology. But when the researchers applied a method called weighted k-shell decomposition to identify the best "information spreaders," akin to recent analysis of social networks, the Arabidopsis protein nodes in the internal layers of the k-shell decomposition gave a 33 percent discovery rate for pathogen effector targets.

Thus, k-shell decomposition analysis surpasses other centrality measures for effector target discovery.
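The idea can be illustrated with a plain, unweighted k-shell decomposition (the study itself used a weighted variant, and the toy graph below is purely illustrative): repeatedly peel off the lowest-degree nodes, and treat the innermost surviving shell as the candidate set of "information spreaders."

```python
def k_shell_indices(edges):
    """Unweighted k-shell decomposition: repeatedly peel off the
    lowest-degree nodes; a node's shell index is the level k at
    which it is removed. Higher k = closer to the network core."""
    adj = {}
    for u, v in edges:
        adj.setdefault(u, set()).add(v)
        adj.setdefault(v, set()).add(u)
    shell, k = {}, 0
    while adj:
        k = max(k, min(len(nbrs) for nbrs in adj.values()))
        low = [u for u, nbrs in adj.items() if len(nbrs) <= k]
        while low:
            for u in low:
                shell[u] = k
                for v in adj.pop(u):
                    adj[v].discard(u)
            low = [u for u, nbrs in adj.items() if len(nbrs) <= k]
    return shell

# Toy "interactome": a tight triangle (core) with a dangling chain.
edges = [("A", "B"), ("A", "C"), ("B", "C"),
         ("C", "D"), ("D", "E"), ("E", "F")]
shell = k_shell_indices(edges)
k_max = max(shell.values())
print(sorted(n for n, k in shell.items() if k == k_max))  # ['A', 'B', 'C']
```

Note how the chain nodes D, E and F land in the outer shell even though D has the same degree as A and B; k-shell ranking captures closeness to the densely connected core, not raw connection count.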

To test this, they then looked at an unrelated network, the cell surface interactome. These cell-surface proteins allow plants to sense the environment around them. The k-shell analysis predicted that 35 of these cell-surface proteins were the most influential spreaders of information. When the internal portions of the cell-surface proteins were tested for interactions with effector proteins from the bacterial pathogen Pseudomonas syringae, the researchers were able to identify seven previously unknown effector targets, a discovery rate of 40 percent.

The researchers made mutants of the seven effector targets and found that all of the newly identified targets showed changes in pathogen growth on Arabidopsis plant leaves when mutated.

"Our network-centric approach," the researchers wrote, "has exciting potential applicability on diverse intra- and inter-species interactomes, including human protein-protein interaction networks, in efforts to unravel host-pathogen contact points, while fostering the design of targeted therapeutic strategies."

Credit: 
University of Alabama at Birmingham

New and improved way to find baby planets

image: An artist's impression of protoplanets forming around a young star.

Image: 
NRAO/AUI/NSF; S. Dagnello

Washington, DC--New work from an international team of astronomers including Carnegie's Jaehan Bae used archival radio telescope data to develop a new method for finding very young extrasolar planets. Their technique successfully confirmed the existence of two previously predicted Jupiter-mass planets around the star HD 163296. Their work is published by The Astrophysical Journal Letters.

Of the thousands of exoplanets discovered by astronomers, only a handful are in their formative years. Finding more baby planets will help astronomers answer the many outstanding questions about planet formation, including the process by which our own Solar System came into existence.

Young stars are surrounded by rotating disks of gas and dust from which planets are formed. The 60 radio telescope antennae of the Atacama Large Millimeter/submillimeter Array, ALMA, have been able to image these disks with never-before-seen clarity.

The research team--including lead author Richard Teague and co-author Edwin Bergin of the University of Michigan, Tilman Birnstiel of the Ludwig Maximilian University of Munich, and Daniel Foreman-Mackey of the Flatiron Institute--used archival ALMA data to demonstrate that anomalies in the velocity of the gas in these rotating protoplanetary disks can be used to indicate the presence of giant planets.

Other techniques for finding baby planets in the disks surrounding young stars are based on observations of the emission coming from a disk's dust particles. But dust only accounts for one percent of a disk's mass, so the team decided to focus instead on the gas that comprises 99 percent of a young disk.

Their new technique focuses on the motion of the gas, probing radial pressure gradients in the gas to see the shape of the perturbations--like swirls and eddies in a rocky streambed--allowing astronomers to make a more precise determination of the masses and locations of any planets embedded in the disk.

Their new method successfully confirmed the previously predicted existence of two Jupiter-mass planets around HD 163296. They orbit at distances of 83 and 137 times that between the Sun and the Earth, although their host star is much brighter than our own Sun.
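For a sense of scale, the bulk gas motion the method measures deviations against is roughly Keplerian. A back-of-the-envelope sketch (the ~2-solar-mass value for HD 163296 is an assumed figure for illustration, not taken from this release):

```python
import math

G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30   # solar mass, kg
AU = 1.496e11      # astronomical unit, m

def v_keplerian_kms(r_au, m_star_msun=2.0):
    """Circular Keplerian orbital speed (km/s) at radius r_au around
    a star of m_star_msun solar masses (2.0 assumed for HD 163296)."""
    return math.sqrt(G * m_star_msun * M_SUN / (r_au * AU)) / 1000

# The two confirmed planets orbit at 83 and 137 au:
for r in (83, 137):
    print(f"{r} au: ~{v_keplerian_kms(r):.1f} km/s")
```

The planet-induced velocity anomalies the team looked for are only a small fraction of these bulk orbital speeds, which is why precise ALMA line data are needed.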

"Although dust plays an important role in planet formation and provides us invaluable information, it is the gas that accounts for 99 percent of protoplanetary disks' mass. It is therefore crucial to study kinematics, or motion, of the gas to better understand what is happening in the disks we observe," explained Bae.

"This method will provide essential evidence to help interpret the high-resolution dust images coming from ALMA. Also, by detecting planets at this young stage we have the best opportunity yet to test how their atmospheres are formed and what molecules are delivered in this process," said lead author Teague.

Credit: 
Carnegie Institution for Science

Turning the tables on the cholera pathogen

image: This plate assay shows how normal L. lactis bacteria (top row) can suppress the growth of V. cholerae (bottom row). Normal L. lactis bacteria (upper left circle) acidify the medium around them (yellow color) and hinder V. cholerae (lower left circle) from growing in this area. A mutant L. lactis strain (middle upper circle) has lost its acidifying ability and can no longer suppress the pathogen, whereas a repaired mutant L. lactis strain (upper right circle) has both abilities again.

Image: 
Wyss Institute at Harvard University

(BOSTON) -- Recent cholera outbreaks in regions that are ravaged by war, struck by natural disasters, or simply lack basic sanitation, such as Yemen or Haiti, are making the development of new and more effective interventions a near-term necessity. Sometimes within hours, the water- and food-borne diarrheal disease caused by the bacterium Vibrio cholerae can lead to severe dehydration, putting victims' lives at immediate risk, especially if rehydration and antibiotic therapies are not readily available or started early enough. The World Health Organization has made it a top priority to identify cholera-managing measures and a cost-effective diagnostic test to detect V. cholerae early on in individuals, which may help prevent its spread.

Now, researchers at Harvard's Wyss Institute for Biologically Inspired Engineering, the Massachusetts Institute of Technology (MIT) and Boston University, led by James J. Collins, are reporting a two-pronged probiotic strategy in Science Translational Medicine that is able to suppress V. cholerae's colonization of the intestinal tract in mice and to indicate the pathogen's presence by simple stool sampling. The approach's first arm leverages the ability of another bacterium normally found in certain foods, Lactococcus lactis, to create an inhospitable intestinal environment for V. cholerae, while its second arm incorporates a synthetic gene circuit into L. lactis that senses a secreted signal from V. cholerae, enabling its detection in the animals.

"Our probiotic strategy presents a conceptually new way to prevent and diagnose cholera infection. First, we harnessed a naturally occurring interaction of the cholera pathogen with the microbiome as a 'living therapeutic,' and then used that interaction to engineer an organism carrying a synthetic sensing circuit as a 'living diagnostic'," said Wyss Institute Core Faculty member James Collins, Ph.D., who also is the Termeer Professor of Medical Engineering & Science and a Professor of Biological Engineering at MIT. "Further translated into human conditions of V. cholerae infection, it could offer an inexpensive and extendable point-of-need intervention for managing cholera in populations at risk of outbreaks."

Collins also is faculty leader of the Wyss Institute's Living Cellular Devices initiative, which genetically re-engineers living cells as programmable devices for various biomedical applications.

Prompted by the early observation that V. cholerae bacteria are sensitive to acidic conditions, the team investigated whether L. lactis, a safe bacterium that ferments the milk sugar lactose into lactic acid and is used in the production of buttermilk, could be used as a probiotic intervention against V. cholerae. They demonstrated first in co-culture assays in vitro that normal L. lactis bacteria efficiently inhibited the growth of V. cholerae, while a mutant strain of L. lactis that could no longer produce lactic acid had lost this inhibitory potential. When the researchers co-introduced L. lactis and V. cholerae into infant mice -- which can be orally infected with the cholera pathogen much more easily than adult mice -- they found that L. lactis significantly increased the likelihood that the animals would survive the V. cholerae threat, and decreased V. cholerae's ability to thrive in mouse intestinal tissue. Again, the effect depended on L. lactis' ability to produce lactic acid, and thus to generate an inhospitable microenvironment for V. cholerae in vivo.

"The lethal dose of V. cholerae is very high. By suppressing the pathogen's development in the early phases, our approach slows the progression of the disease and wins the body valuable time to develop its own immune response to clear out the pathogen," said co-first author Ning Mao, Ph.D., who worked on the project as a graduate student in Collins' group. She now is a consultant at Simon-Kucher & Partners in their Life Science Division in Singapore. The two other co-first authors on the study are Ewen Cameron, Ph.D., a former postdoctoral fellow and expert on cholera microbiology on Collins' team who now is Senior Associate at Flagship Pioneering; and Andres Cubillos-Ruiz, Ph.D., another postdoctoral fellow working with Collins.

The team then extended their approach by using L. lactis bacteria to develop a living diagnostic. "Essentially, we used synthetic biology to make L. lactis 'hear' the language that V. cholerae bacteria use to talk to themselves," said Mao. Specifically, the researchers grafted a so-called 'quorum sensing' receptor from V. cholerae for a signal called CAI-1 onto L. lactis. Sensing CAI-1 normally allows the pathogen to continuously gauge the density of its population within the intestinal microbiome. However, in the engineered L. lactis bacteria, a synthetic gene circuit links the detection of CAI-1 to parts of one of L. lactis' own signal sensing mechanisms and additional elements, so that a reporter protein is produced only when CAI-1 is present.

In infant mice that were dosed with both the engineered cholera-sensing L. lactis bacteria and the cholera pathogen, the team could conveniently diagnose cholera infection by tracing the activity of the reporter protein in fecal pellets of the mice. Because the engineered bacteria had a lowered capacity to create an acidic environment and actively prevent infection themselves, the team created a combined therapeutic and diagnostic probiotic intervention by treating mice with a mixture of natural and engineered L. lactis and demonstrated that it could simultaneously suppress and detect cholera infections.

"This study highlights how targeting the human gut microbiome with appropriate probiotics and engineered living cellular devices could lead to new therapeutics and diagnostics for management of infectious and epidemic diseases in many parts of the world," said Wyss Institute Founding Director Donald Ingber, M.D., Ph.D., who is also the Judah Folkman Professor of Vascular Biology at HMS and the Vascular Biology Program at Boston Children's Hospital, as well as Professor of Bioengineering at SEAS.

Credit: 
Wyss Institute for Biologically Inspired Engineering at Harvard

Research shows short gamma-ray bursts do follow binary neutron star mergers

CORVALLIS, Ore. - Researchers at Oregon State University have confirmed that last fall's union of two neutron stars did in fact cause a short gamma-ray burst.

The findings, published today in Physical Review Letters, represent a key step forward in astrophysicists' understanding of the relationship between binary neutron star mergers, gravitational waves and short gamma-ray bursts.

Commonly abbreviated as GRBs, gamma-ray bursts are narrow beams of electromagnetic waves of the shortest wavelengths in the electromagnetic spectrum. GRBs are the universe's most powerful electromagnetic events, occurring billions of light years from Earth and able to release as much energy in a few seconds as the sun will in its lifetime.

GRBs fall into two categories, long duration and short duration. Long GRBs are associated with the death of a massive star as its core becomes a black hole and can last from a couple of seconds to several minutes.

Short GRBs had been suspected to originate from the merger of two neutron stars, which also results in a new black hole - a place where the pull of gravity from super-dense matter is so strong that not even light can escape. A short GRB lasts up to 2 seconds.

The term neutron star refers to the gravitationally collapsed core of a large star; neutron stars are the smallest, densest stars known. According to NASA, neutron stars' matter is packed so tightly that a sugar-cube-sized amount of it weighs in excess of a billion tons.

In November 2017, scientists from U.S. and European collaborations announced they had detected an X-ray/gamma-ray flash that coincided with a blast of gravitational waves, followed by visible light from a new cosmic explosion called a kilonova.

Gravitational waves, ripples in the fabric of space-time, were first detected in September 2015, a red-letter event in physics and astronomy that confirmed one of the main predictions of Albert Einstein's 1915 general theory of relativity.

"A simultaneous detection of gamma rays and gravitational waves from the same place in the sky was a major milestone in our understanding of the universe," said Davide Lazzati, a theoretical astrophysicist in the OSU College of Science. "The gamma rays allowed for a precise localization of where the gravitational waves were coming from, and the combined information from gravitational and electromagnetic radiation allows scientists to probe the binary neutron star system that's responsible in unprecedented ways."

Prior to Lazzati's latest research, however, it had been an open question as to whether the detected electromagnetic waves were "a short gamma-ray burst, or just a short burst of gamma rays" - the latter being a different, weaker phenomenon.

In summer 2017, Lazzati's team of theorists had published a paper predicting that, contrary to earlier estimates by the astrophysics community, short gamma-ray bursts associated with the gravitational emission of binary neutron star coalescence could be observed even if the gamma-ray burst was not pointing directly at Earth.

"X- and gamma rays are collimated, like the light of a lighthouse, and can be easily detected only if the beam points toward Earth," Lazzati said. "Gravitational waves, on the other hand, are almost isotropic and can always be detected."

Isotropic refers to being evenly transmitted in all directions.

"We argued that the interaction of the short gamma-ray burst jet with its surroundings creates a secondary source of emission called the cocoon," Lazzati said. "The cocoon is much weaker than the main beam and is undetectable if the main beam points toward our instruments. However, it could be detected for nearby bursts whose beam points away from us."

In the months following the November 2017 gravitational wave detection, astronomers continued to observe the location from which the gravitational waves came.

"More radiation came after the burst of gamma rays: radio waves and X-rays," Lazzati said. "It was different from the typical short GRB afterglow. Usually there's a short burst, a bright pulse, bright X-ray radiation, then it decays with time. This one had a weak gamma-ray pulse, and the afterglow was faint, brightened very quickly, kept brightening, then turned off."

"But that behavior is expected when you're seeing it from an off-axis observation point, when you're not staring down the barrel of the jet," he said. "The observation is exactly the behavior we predicted. We haven't seen the murder weapon, we don't have a confession, but the circumstantial evidence is overwhelming. This is doing exactly what we expected an off-axis jet would do and is convincing proof that binary neutron star mergers and short gamma-ray bursts are indeed related to each other."

Credit: 
Oregon State University

NASA finds weakening rainfall in Bud

video: This animation fades from NOAA's GOES-17 satellite enhanced infrared image to rainfall rates derived from the GPM core satellite. On June 12, 2018 at 7:27 p.m. EDT (2327 UTC) the GPM core satellite found moderate to heavy precipitation was only present in the southeastern quadrant of the weakening Hurricane Bud. The heaviest rainfall in the area, of over 78 mm (3.1 inches) per hour, was occurring near Mexico's coastline, well to the northeast of Bud's center of circulation.

Image: 
Credits: NASA/JAXA, Hal Pierce

NASA examined the rainfall rates occurring in former Hurricane Bud as it continued moving north in the Eastern Pacific Ocean, paralleling the western coast of Mexico. On June 13, Bud weakened to a tropical storm, and the Mexican government posted warnings.
On June 12, 2018 at 7:27 p.m. EDT (2327 UTC), the Global Precipitation Measurement mission or GPM core observatory satellite passed above hurricane Bud in the eastern Pacific Ocean. Bud's movement over colder waters had caused its eye to become less defined as the storm weakened.

Data collected by GPM's Microwave Imager (GMI) showed that moderate to heavy precipitation was only present in the southeastern quadrant of the weakening hurricane. GMI also indicated that the heaviest rainfall in the area, of over 78 mm (3.1 inches) per hour, was occurring near Mexico's coastline, well to the northeast of Bud's center of circulation. GMI and Dual-Frequency Precipitation Radar (DPR) data were used in this image to show the location and intensity of rainfall within Hurricane Bud. GPM's radar swath only covered the nearly rain-free area west of Bud's center of circulation. GPM is a joint mission between NASA and the Japan Aerospace Exploration Agency (JAXA).

On June 13, the government of Mexico issued a Tropical Storm Warning for southern Baja California Sur from Santa Fe to La Paz, including Cabo San Lucas.

At 11 a.m. EDT (1500 UTC), the center of Tropical Storm Bud was located near latitude 19.4 degrees north and longitude 108.8 degrees west. That's about 250 miles (405 km) south-southeast of Cabo San Lucas, Mexico. The National Hurricane Center (NHC) said "Bud is moving toward the north-northwest near 3 mph (6 kph) and this general motion is expected to continue today. Bud is forecast to accelerate northward on Thursday and continue that motion into Friday. On the forecast track, the center of Bud will cross southern Baja California Sur late Thursday and move over the Gulf of California later on Friday.

"Maximum sustained winds have decreased to near 65 mph (100 kph) with higher gusts. Although additional weakening is expected during the next day or so, Bud is forecast to still be a tropical storm when it reaches southern Baja California Sur late Thursday, June 14."

The rainfall observed by GPM is expected to affect the warning area. NHC said "Bud is expected to produce additional rainfall of 1 to 2 inches across much of southwestern Mexico through Thursday, with isolated maximum amounts of 4 inches. These rains could cause life-threatening flash floods and mud slides. Bud is also expected to produce 1 to 3 inches of rain with isolated totals of 5 inches across southern portions of Baja California Sur and Sonora in northwestern Mexico through Saturday."

Moisture from Tropical Storm Bud is predicted to spread over the Desert Southwest over the weekend with possible heavy rainfall and flash floods in that area.

For updated forecasts on Bud, visit: http://www.nhc.noaa.gov

By Rob Gutro / Hal Pierce
NASA's Goddard Space Flight Center

Credit: 
NASA/Goddard Space Flight Center

Geologic history of Ayeyawady River delta mapped for the first time

image: This is a misty morning on an oxbow lake in the Ayeyawady River delta.

Image: 
Photo by Liviu Giosan, Woods Hole Oceanographic Institution

The Ayeyawady River delta in Myanmar is home to millions of people, and is a hub of agricultural activity. Unlike other large rivers across the world, however, the Ayeyawady has been relatively untouched by large infrastructure and dam projects for the past 50 years, and its geologic evolution has never previously been studied.

A team from the Woods Hole Oceanographic Institution (WHOI) has collaborated with Myanmar scientists to present the first extensive view of the delta's history, revealing how its floodplains and shorelines on the Andaman Sea have grown and shrunk over the past 10,000 years. The study, published June 12, 2018, in the journal Earth Surface Dynamics, shows that shifts in monsoon climate have changed the shape of the delta in the past, and suggests that human impact could drastically alter the region in the future, says lead author Liviu Giosan, a geologist at WHOI.

"In geological terms, the Ayeyawady, like any delta, is kind of an ephemeral feature. If the river can dump enough sediment onto the area through seasonal floods, the surrounding land will stay above sea level. But if those sediments can't keep up with sea level rise, the delta will eventually be submerged and overtaken by the sea," Giosan says.

According to the new study, the delta has changed repeatedly over the past several thousand years, showing that the region is particularly vulnerable to changes in outflow, ocean currents, or storm frequency.

To map the delta's geologic history, Giosan's team consulted 19th century British charts--which even today are the only available depth soundings off the Ayeyawady--to analyze long-submerged landscapes. They also turned to satellite imagery and NASA's digital elevation data to spot not only past channels but also shorelines, which appear as slight ridges in the otherwise flat landscape.

On the ground, the researchers dug trenches and drilled cores into those former shorelines and channels, revealing layers of sand and silt that they could then date in the lab. The resulting data showed that the ebb and flow of the delta was linked to extreme climate events: during long periods of heavy monsoons, more sediment traveled downriver, maintaining the delta's coastline. In drier intervals, that flow dwindled, and waves and tides from the Andaman Sea chipped away at the landscape.

Most surprisingly, Giosan notes, plate tectonics has played a major role in the delta's shape. Instead of the uniform fan shape of most deltas, the modern shoreline of the Ayeyawady advanced far more quickly to the western side of the river than to the east, thanks to a major system of fault lines that run through the area.

"Myanmar sits directly where the Indian plate slides underneath the Asian plate. The fault system extends into the Andaman Sea, maintaining deeper waters and stronger tidal currents on the eastern side of the river mouth that move sediments offshore. In the past the western side grew faster than the eastern coast, but this situation is about to change as dams are being built upriver," Giosan notes.

The team's study will provide a baseline for future scientific work in the region, as well as data that will help inform human development in the future. Its findings suggest that major dam projects in the future could alter the outflow of the Ayeyawady, reducing the sediments deposited downstream. That scenario could upset an already delicate balance, rendering homes and farmland on the delta potentially uninhabitable as the land erodes and seawater encroaches.

"Those effects can't be understood without showing how the delta evolved, how it formed, what direction it went at certain times, or why it switched from one region to another. You need that basis to address specific questions," Giosan adds.

Although he's passionate on a professional level about the work he's doing in Myanmar, Giosan is also quick to note that it appeals to him personally. Like Myanmar, his homeland of Romania was ruled for decades by a closed regime under dictator Nicolae Ceaușescu.

"I know what it's like to live in a closed country," he says. "It's important to me that my colleagues in Myanmar get all the help they can, whether it's in education, or scientific areas."

Credit: 
Woods Hole Oceanographic Institution

'Gut instinct' may have been the GPS of human ancestors

Ask anyone if they remember where they ate the juiciest burger, the sweetest cupcake or the smoothest bisque, and they probably can describe the location in great detail, down to the cross streets, the décor, and the table where they sat. A new USC study in Nature Communications gives a possible explanation for food's prominence in memory.

The body's longest nerve, the vagus nerve, is the autobahn between what scientists have referred to as the "two brains" -- the one in your head and the other in your gastrointestinal tract. The nerve is key for telling you the tank is full and to put the fork down because it helps transmit biochemical signals from the stomach to the most primitive part of the brain, the brainstem.

But in this animal study, researchers may have found a greater purpose behind this complex circuitry involving the vagus nerve. This "gut-brain axis" may help you remember where you ate by directing signals to another part of the brain, the hippocampus, the memory center.

Following our stomach

The scientists believe that this gut instinct, this connection between spatial awareness and food, is likely a neurobiological mechanism that dates back ages to when the definition of fast food was a herd of deer running away from the nomadic hunters who tracked them.

Back then especially, it would be critical for the gut to work with the brain like a Waze or Google Maps navigation app, said Scott Kanoski, an assistant professor of biological sciences at USC Dornsife and corresponding author of the paper. Those wandering early humans could remember a site where they had found and collected food and return repeatedly for more.

"When animals find and eat a meal, for instance, the vagus nerve is activated and this global positioning system is engaged," Kanoski said. "It would be advantageous for an animal to remember their external environment so that they could have food again."

The study was published on June 5.

Disruption disorients the internal compass

To examine this gut-brain connection, the research team conducted the study on rats. They saw that rats with their gut-brain vagus nerve pathway disconnected could not remember information about their environment.

"We saw impairments in hippocampal-dependent memory when we cut off the communication between the gut and the brain," said lead author Andrea Suarez, a PhD candidate in biological sciences. "These memory deficits were coupled with harmful neurobiological outcomes in the hippocampus."

Specifically, the disconnected pathway affected markers in the brain that are key for the growth of new neural connections and new brain cells.

However, it did not appear to affect the rats' anxiety levels or their weight, the scientists noted.

The scientists wrote that their findings may raise an important and timely medical question that merits further exploration: Could bariatric surgeries or other therapies that block gut-to-brain signaling affect memory?

Credit: 
University of Southern California

Getting heart disease patients to exercise: Study says wearables could help but only if money is on the line

PHILADELPHIA -- Combining financial incentives and personalized goal-setting with wearable devices may be an effective way of encouraging patients with heart disease to increase their physical activity. In patients with heart disease, regular physical activity has been shown to decrease the risk of a future heart attack, but getting these patients into a regular exercise program such as cardiac rehab has remained a challenge. Results of a clinical trial led by researchers at Penn Medicine, and published today in the Journal of the American Heart Association (JAHA), show that a home-based program offering payment upfront with money taken away if step goals were not met - a design that leverages the concept of loss aversion - increases activity levels and may help to form a more long-lasting habit.

"Regular exercise and cardiac rehab have been shown to have significant benefit in those with heart disease, but participation in such programs is extremely low for various reasons, including patient motivation and access to exercise facilities. There is interest in developing creative remote strategies to engage patients in exercise programs, but there is little research for guidance," said Neel Chokshi, MD, MBA, medical director of the Penn Sports Cardiology and Fitness Program and assistant professor of Clinical Medicine in Cardiology. "In this clinical trial, we tested a scalable approach combining wearables and principles from behavioral economics to show significantly increased activity levels even after incentives were stopped."

The study enrolled 105 patients into a home-based, remotely monitored program using the Misfit Shine wearable device for a 24-week period to determine the impact of personalized feedback with goals coupled to financial incentives for the first 16 weeks. Patients in the control arm received the wearable but no other interventions. In the intervention group, patients were given personalized step goals and allocated $14 at the beginning of each week for 16 weeks ($224 in total). Each day the step goal was not met, $2 was taken away. During the main intervention period (weeks 9 to 16), patients in the intervention arm increased their physical activity by 1,368 steps per day more than patients in the control arm. After 16 weeks, financial incentives were stopped and patients were followed for another 8 weeks. During the 8-week follow-up period, patients in the intervention arm still walked 1,154 more steps per day than patients in the control arm.

"While many are hopeful that wearable devices can motivate high-risk patients, we found that wearables alone did not increase physical activity levels," said Mitesh Patel, MD, MBA, MS, an assistant professor of Medicine and Health Care Management, and director of the Penn Medicine Nudge Unit. "However, framing rewards as a loss - a technique from behavioral economics - led to a meaningful difference in behavior. During the 6-month trial, the average patient in the intervention arm had step counts that totaled about 100 miles more than the average patient in control."

All participants were given a wearable device with a two-week startup period to establish baseline step counts. The intervention group then received weekly increases in step goals with daily feedback via text message or email on their performance. Progress was divided into two phases: during the "ramp-up incentive" phase (weeks 1-8), daily step goals increased from baseline by 15 percent each week with a maximum goal of 10,000 steps per day. After 8 weeks, step goals remained fixed and participants moved into the "maintenance incentive" phase (weeks 9-16), followed by an 8-week follow-up phase without incentives (weeks 17-24). During the 16-week intervention, participants in this arm were offered a loss-framed financial incentive. Each week, participants were informed that $14 was allocated to a virtual account. Each day the patient achieved his or her step goal, the balance remained unchanged, but each day the step goal was not achieved, the participant was informed that $2 had been deducted. The balance was refreshed with $14 every week on Monday.
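The weekly account mechanics described above lend themselves to a short sketch. The Python snippet below (the function name and the example step data are illustrative, not taken from the study) simulates one week of the loss-framed virtual account:

```python
# Sketch of the loss-framed weekly incentive account described above.
# Names and example step data are illustrative, not from the study.

WEEKLY_ALLOCATION = 14.00   # dollars placed in the virtual account each Monday
DAILY_DEDUCTION = 2.00      # dollars removed for each day the goal is missed

def week_balance(daily_steps, step_goal):
    """Return the end-of-week balance after loss-framed deductions."""
    balance = WEEKLY_ALLOCATION
    for steps in daily_steps:
        if steps < step_goal:          # goal missed -> money is taken away
            balance -= DAILY_DEDUCTION
    return max(balance, 0.0)           # the weekly account cannot go negative

# Example: goal of 8,000 steps, met on 5 of 7 days -> $14 - 2 x $2 = $10
steps = [8500, 7900, 9100, 8200, 6000, 10000, 8800]
print(week_balance(steps, 8000))  # 10.0
```

The deduction framing is the key design choice: the participant starts the week holding the full $14 and watches it shrink, rather than earning $2 per successful day, which is the loss-aversion lever the trial tested.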

Chokshi and team suggest that additional studies should be conducted to evaluate the sustainability of incentive effects over longer-term periods, to compare incentive designs that vary in magnitude, duration, or frequency, and to evaluate financial incentives and personalized feedback independently to assess effects. This study was supported in part by Grant Number UL1TR000003 from the National Center for Advancing Translational Science. The study was also supported in part by the Institute for Translational Medicine and Therapeutics (ITMAT) and the University of Pennsylvania Health System through the Penn Medicine Nudge Unit.

Credit: 
University of Pennsylvania School of Medicine

Cash and goal-setting help motivate heart patients to take healthy steps

DALLAS, June 13, 2018 -- The thought of losing up to $14 a week along with personalized goal setting may have motivated ischemic heart disease patients to increase their exercise, according to a new clinical trial published in Journal of the American Heart Association, the Open Access Journal of the American Heart Association/American Stroke Association.

Ischemic heart disease is the leading cause of death in the United States. Yet, while regular exercise has been shown to reduce the risk of cardiovascular events and risk of death by up to 30 percent among these patients, most don't participate in exercise-based rehabilitation programs or obtain enough physical activity on their own.

"There is a lot of interest in using wearable devices to increase activity levels among high-risk cardiovascular patients, but the best way to design these types of programs is unknown," said Neel Chokshi, M.D., M.B.A., first author and cardiologist at the Perelman School of Medicine and medical director of the Sports Cardiology and Fitness Program at Penn Medicine, both located in Philadelphia. "Our trial is one of the first to test the use of mobile technology through a home-based program and found that while wearable devices alone were not effective, combining them with financial incentives and personalized goal-setting significantly increased physical activity levels during the 6-month period."

Researchers obtained baseline step counts and tracked 105 ischemic heart disease patients (average age 60; 70 percent men) for 24-weeks to see if financial incentives and personalized goal setting would increase physical activity. Patients in the incentive group received a wrist-worn activity tracking device, personalized step goals, daily feedback and were allocated $14 each week to a virtual account for the first 16 weeks - $2 of which could be lost per day for not achieving step goals. They also selected whether to receive personalized goal-setting communications by text, email, interactive voice recording or a combination.

Patients in the control group received a wearable device that counted steps but no incentives or feedback.

Researchers found:

Patients in the incentive group significantly increased their physical activity levels, 1,368 more steps per day during the main intervention period, compared to the control group.

After financial incentives were stopped in the follow up period, the incentive group still increased their physical activity by 1,154 steps per day compared to the control group.

Patients in the control group had no significant change in their physical activity levels.

"This is one of the first clinical trials that used financial incentives and found increases in physical activity were sustained even after incentives stopped, a potential sign of habit formation," said Mitesh Patel, M.D., senior author and assistant professor at the Perelman School of Medicine at the University of Pennsylvania and director of the Penn Medicine Nudge Unit, both located in Philadelphia. "A key element of our study was that incentives were designed to leverage the behavioral economic principle of loss aversion, which finds that for the same reward size, most people are more motivated when they are told they might lose a reward than when told they could earn a reward."

Credit: 
American Heart Association

Robots learn by checking in on team members

image: Mohamed Abdelkader is one of the researchers that developed an algorithm that enables a team of unmanned aerial vehicles to work together in real time under a capture the flag scenario to intercept an attacker drone.

Image: 
© 2018 Kuat Telegenov

The software and hardware needed to co-ordinate a team of unmanned aerial vehicles (UAVs) that can communicate and work toward a common goal have recently been developed by KAUST researchers.

"Giving UAVs more autonomy makes them an even more valuable resource," says Mohamed Abdelkader, who worked on the project with his colleagues under the guidance of Jeff Shamma. "Monitoring the progress of a drone sent out on a specific task is far easier than remote-piloting one yourself. A team of drones that can communicate among themselves provides a tool that could be used widely, for example, to improve security or capture images simultaneously over a large area."

The researchers trialed a capture the flag game scenario, whereby a team of defender drones worked together within a defined area to intercept an intruder drone and prevent it from reaching a specific place. To give the game more authenticity, and to check if their algorithms would work under unpredictable conditions, the intruder drone was remote-piloted by a researcher.

Abdelkader and the team quickly dismissed the idea of having a central base station that the drones would communicate with. Instead, they custom-built UAVs and incorporated a light-weight, low-power computing and wi-fi module on each one so that they could talk to each other during flight.

"A centralized architecture takes significant computing power to receive and relay multiple signals, and it also has a potential single point of total failure--the base station," explains Shamma. "Instead, we designed a distributed architecture in which the drones coordinate based on local information and peer-to-peer communications."

The team's algorithm aims to achieve an optimal level of peer-to-peer messaging--which needed to be not too much, not too little--and rapid reaction times, without too much heavy computation. This allows the algorithm to work effectively in real time while the drones are chasing an intruder.

"Each of our drones makes its own plan based on a forecast of optimistic views of their teammates' actions and pessimistic views of the opponent's actions," explains Abdelkader. "Since these forecasts may be inaccurate, each drone executes only a portion of its plan, then reassesses the situation before re-planning."
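The plan/partially-execute/replan loop Abdelkader describes resembles a receding-horizon controller. The following deliberately simplified, one-dimensional Python sketch illustrates the idea; the greedy planner, horizon length, and dynamics are my own illustrative assumptions, not the KAUST algorithm:

```python
# Minimal sketch of the plan / execute-a-portion / replan loop described
# above, reduced to one dimension. The planner, horizon, and dynamics are
# illustrative assumptions, not the KAUST implementation.

PLAN_HORIZON = 5      # steps planned ahead each cycle
EXECUTE_STEPS = 1     # only this portion of the plan is actually executed

def plan(position, target, horizon):
    """Greedy plan: move one unit toward the forecast target each step."""
    path, pos = [], position
    for _ in range(horizon):
        pos += 1 if target > pos else (-1 if target < pos else 0)
        path.append(pos)
    return path

def pursue(defender, intruder_positions):
    """Replan every cycle as fresh (possibly surprising) intruder data arrives."""
    for observed in intruder_positions:         # new observation each cycle
        path = plan(defender, observed, PLAN_HORIZON)
        defender = path[EXECUTE_STEPS - 1]      # execute only the first step
    return defender

# The intruder's observed position changes unpredictably; because the
# defender replans every cycle, stale forecasts never commit it too far.
print(pursue(0, [10, 10, 7, 7, 7]))  # 5
```

Executing only the first step of each plan is what makes inaccurate forecasts of teammates and the opponent tolerable: the cost of a bad prediction is bounded by one cycle before the drone reassesses.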

Their algorithm worked well in both indoor and outdoor arenas under different attack scenarios. Abdelkader hopes their software, which is now available as open-source, will provide the test-bed for multiple applications. The KAUST team hope to enable the drones to work in larger, outdoor areas and to improve the software by incorporating adaptive machine-learning techniques.

Credit: 
King Abdullah University of Science & Technology (KAUST)

Dementia risk increased in 50-year-olds with blood pressure below hypertension threshold

New findings from the long-running Whitehall II study of over 10,000 civil servants have found that 50-year-olds who had blood pressure that was higher than normal, but still below the threshold commonly used when deciding to treat the condition, were at increased risk of developing dementia in later life.

This increased risk was seen even when the study participants did not have other heart or blood vessel-related problems, according to the research, which is published in the European Heart Journal [1] today (Wednesday).

Although there have been previous studies that have linked raised blood pressure in midlife to an increased risk of dementia in later life, the term 'midlife' has been poorly defined and ranged from 35 to 68 years.

The first author of the paper, Dr Jessica Abell, post-doctoral research fellow at the French National Institute of Health and Medical Research in Paris (INSERM) and a research associate in dementia and epidemiology at University College London (UCL), UK, said: "Previous research has not been able to test the link between raised blood pressure and dementia directly by examining the timing in sufficient detail. In our paper we were able to examine the association at age 50, 60 and 70, and we found different patterns of association. This will have important implications for policy guidelines, which currently only use the generic term 'midlife'."

Participants in the Whitehall II study, who were aged between 35-55 in 1985, had their blood pressure measured in 1985, 1991, 1997 and 2003. Other medical information was also taken, such as age, sex, lifestyle behaviours (such as smoking and alcohol intake), and socio-demographic factors.

Among the 8,639 people analysed for this study, 32.5% of whom were women, 385 developed dementia by 2017. Those who had a systolic blood pressure of 130 mmHg or more at the age of 50 had a 45% greater risk of developing dementia than those with a lower systolic blood pressure at the same age. This association was not seen at the ages of 60 and 70, and diastolic blood pressure was not linked to dementia [2].

The link between high blood pressure and dementia was also seen in people who had no heart or blood vessel-related conditions (cardiovascular disease) during the follow-up period; they had an increased risk of 47% compared to people with systolic blood pressure lower than 130 mmHg.

Guidelines from NICE (National Institute for Health and Care Excellence) in the UK and the European Society of Cardiology both give a threshold of 140/90 mmHg for hypertension, although 2017 guidelines from the American Heart Association, the American College of Cardiology and nine other health organisations lowered the threshold to 130/80 mmHg for all adults. Ideal blood pressure is considered to be between 90/60 mmHg and 120/80 mmHg.
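For readers comparing the two thresholds, the following hypothetical Python helper (the function name and guideline labels are mine, not drawn from the guideline texts) contrasts the two definitions of hypertension mentioned above:

```python
# Illustrative comparison of the two hypertension thresholds cited above.
# The function name and guideline labels are hypothetical conveniences.

def hypertensive(systolic, diastolic, guideline="ESC/NICE"):
    """Return True if the reading meets the named guideline's threshold."""
    if guideline == "ESC/NICE":          # 140/90 mmHg threshold
        return systolic >= 140 or diastolic >= 90
    if guideline == "AHA/ACC-2017":      # lowered to 130/80 mmHg
        return systolic >= 130 or diastolic >= 80
    raise ValueError("unknown guideline")

# A 50-year-old at 132/78 mmHg falls below the ESC/NICE treatment threshold,
# yet sits in the systolic range (>= 130 mmHg) the study associates with
# elevated dementia risk.
print(hypertensive(132, 78))                  # False
print(hypertensive(132, 78, "AHA/ACC-2017"))  # True
```

The gap between the two return values for the same reading is exactly the population the study highlights: above 130 mmHg systolic but below the 140/90 mmHg treatment threshold.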

Professor Archana Singh-Manoux, research professor at INSERM and honorary professor at UCL, who led the research, said: "Our work confirms the detrimental effects of midlife hypertension for risk of dementia, as suggested by previous research. It also suggests that at age 50, the risk of dementia may be increased in people who have raised levels of systolic blood pressure below the threshold commonly used to treat hypertension.

"Our analysis suggests that the importance of mid-life hypertension on brain health is due to the duration of exposure. So we see an increased risk for people with raised blood pressure at age 50, but not 60 or 70, because those with hypertension at age 50 are likely to be 'exposed' to this risk for longer." The average age at which the study participants developed dementia was 75.

Possible reasons for the link between raised blood pressure and dementia include the fact that high blood pressure is linked to silent or mini strokes (where symptoms often are not noticeable), damage to the white matter in the brain, which contains many of the brain's nerve fibres, and restricted blood supply to the brain. This damage may underlie the resulting decline in the brain's processes.

Dr Abell said: "It is important to emphasise that this is observational, population-level research and so these findings do not translate directly into implications for individual patients. Furthermore, there is considerable discussion on the optimal threshold for the diagnosis of hypertension. There is plenty of evidence to suggest that maintaining a healthy blood pressure in middle age is important for both your heart and your brain later in life. Anyone who is concerned about their blood pressure levels should consult their GP."

Limitations of the study include the fact that diagnosis of dementia was made by linking to electronic medical records that might miss milder cases of dementia; the researchers were not able to examine whether the association of hypertension was stronger with Alzheimer's disease or vascular dementia because of the small numbers in the study affected by dementia, and this requires further research; and the researchers do not know whether effective management of high blood pressure in people in mid-life might weaken the risk of later dementia.

"One of the strengths of this study was having repeat blood pressure measurements on the same people, which allowed us to examine their blood pressure status over an 18-year period. This is rare, since previous research has often used a single measure of hypertension," concluded Professor Singh-Manoux.

Credit: 
European Society of Cardiology

Clever bees can identify different flowers by patterns of scent

image: A captive bumblebee walks across the surface of an artificial flower, working out the pattern of scent that has been made by placing peppermint oil in some of the holes.

Image: 
Dave Lawson, University of Bristol

New research led by scientists from the University of Bristol and Queen Mary University of London has revealed that bumblebees can tell flowers apart by patterns of scent.

Flowers have lots of different patterns on their surfaces that help to guide bees and other pollinators towards the flower's nectar, speeding up pollination.

These patterns include visual signals like lines pointing to the centre of the flower, or colour differences.

Flowers are also known to have different patterns of scent across their surface, and so a visiting bee might find that the centre of the flower smells different from the edge of the petals.

This new research, published today in the journal Proceedings of the Royal Society B, shows that bumblebees can tell flowers apart by how scent is arranged on their surface.

Lead author Dr Dave Lawson, from the University of Bristol's School of Biological Sciences, said: "If you look at a flower with a microscope, you can often see that the cells that produce the flower's scent are arranged in patterns.

"By creating artificial flowers that have identical scents arranged in different patterns, we are able to show that this patterning might be a signal to a bee. For a flower, it's not just smelling nice that's important, but also where you put the scent in the first place."

The study also shows that once bees had learnt how a pattern of scent was arranged on a flower, they then preferred to visit unscented flowers that had a similar arrangement of visual spots on their surface.

Dr Lawson added: "This is the equivalent of a human putting her hand in a bag to feel the shape of a novel object which she can't see, and then picking out a picture of that object. Being able to mentally switch between different senses is something we take for granted, but it's exciting that a small animal like a bee is also able to do something this abstract."

Professor Lars Chittka, from Queen Mary's School of Biological and Chemical Sciences, said: "We already knew that bees were clever, but we were really surprised by the fact that bees could learn invisible patterns on flowers - patterns that were just made of scent.

"The scent glands on our flowers were either arranged in a circle or a cross, and bees had to figure out these patterns by using their feelers. But the most exciting finding was that, if these patterns are suddenly made visible by the experimenter, bees can instantly recognise the image that formerly was just an ephemeral pattern of volatiles in the air."

Senior author, Dr Sean Rands, also from Bristol, added: "Flowers often advertise to their pollinators in lots of different ways at once, using a mixture of colour, shape, texture, and enticing smells.

"If bees can learn patterns using one sense (smell) and then transfer this to a different sense (vision), it makes sense that flowers advertise in lots of ways at the same time, as learning one signal will mean that the bee is primed to respond positively to different signals that they have never encountered.

"Advertising agencies would be very excited if the same thing happened in humans."

Around 75 percent of all food grown globally relies on flowers being pollinated by animals such as bees. The work published today is part of ongoing research at the University of Bristol that explores the many different ways in which plants communicate with their pollinators, using different innovative techniques to explore how bees perceive the flowers that they visit.

Credit: 
University of Bristol