Culture

Drug resistant infections associated with higher in-hospital mortality rates in India

Washington, DC - In one of the largest studies to measure the burden of antibiotic resistance in a low- or middle-income country, researchers at the Center for Disease Dynamics, Economics & Policy report that in-hospital mortality is significantly higher among patients infected with multi-drug resistant (MDR) or extensively drug resistant (XDR) pathogens including Staphylococcus aureus, Escherichia coli, Klebsiella pneumoniae and Acinetobacter baumannii.

Researchers analyzed antimicrobial susceptibility testing results and mortality outcomes for over 4,000 patients who visited one of ten tertiary or quaternary referral hospitals across India in 2015. Pathogens were classified as MDR or XDR based on drug susceptibility profiles proposed by the European Centre for Disease Prevention and Control and the US Centers for Disease Control and Prevention. Mortality data were restricted to in-hospital deaths. Additional demographic and clinical data including age, sex, place of infection acquisition, and location in the hospital (i.e., intensive care unit [ICU] or non-intensive care unit) were also collected.

The overall mortality rate among all study participants was 13.1 percent, with mortality as high as 29.0 percent among patients infected with A. baumannii. Patients who died were more likely to have been older and admitted to the ICU at the time of testing. Researchers also found that among MDR infections, those caused by Gram-negative bacteria were associated with higher mortality rates compared to those caused by Gram-positive bacteria, with rates of 17.7 percent and 10.8 percent, respectively.

Study results indicate that, after accounting for age, sex, site of infection, and number of coinfections, patients who acquired MDR bacterial infections were 1.57 times more likely to die than patients with comparable susceptible infections, while patients who acquired XDR infections were 2.65 times more likely to die.
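As a minimal sketch of the arithmetic behind figures like these, the snippet below computes an unadjusted odds ratio from a 2x2 outcome table. The counts are hypothetical, chosen only to illustrate the calculation; the study's 1.57 and 2.65 estimates were additionally adjusted for age, sex, infection site, and coinfections.

```python
# Sketch: an (unadjusted) odds ratio from a 2x2 table of outcomes.
# All counts below are hypothetical illustrations, not study data.

def odds_ratio(exposed_dead, exposed_alive, unexposed_dead, unexposed_alive):
    """Odds of death in the exposed group divided by odds in the unexposed group."""
    odds_exposed = exposed_dead / exposed_alive
    odds_unexposed = unexposed_dead / unexposed_alive
    return odds_exposed / odds_unexposed

# Hypothetical: 150 of 1,000 patients with MDR infections died,
# versus 100 of 1,000 patients with susceptible infections.
or_mdr = odds_ratio(150, 850, 100, 900)
print(round(or_mdr, 2))  # 1.59
```

Note that an odds ratio is not the same as a ratio of mortality rates (a relative risk); the two diverge as the outcome becomes more common.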

In both the ICU and non-ICU settings, the odds of mortality were higher among patients with XDR infections. This association was driven by Gram-negative infections (e.g., XDR K. pneumoniae), highlighting the importance of rapidly identifying these infections in all patients.

In India, MDR and XDR Gram-negative bacterial infections are frequent, and the availability of effective antibiotic therapies is declining. This study provides greater insight into the urgent need to increase surveillance, research, and antimicrobial stewardship efforts worldwide. The researchers further note that these findings on the mortality burden of antibiotic resistance can aid in the development of policy efforts to prioritize antibiotic resistance as a global public health threat and inform future efforts to quantify and track the burden of resistance across low- and middle-income countries.

Credit: 
Center for Disease Dynamics, Economics & Policy

Milk allergy affects half of US food-allergic kids under age 1

SEATTLE (November 16, 2018) - Although parents often focus on peanuts as the food allergy they need to worry about most, cow's milk is the most common food allergy in children under the age of 5. New research being presented at the American College of Allergy, Asthma and Immunology (ACAAI) Annual Scientific Meeting found that over two percent of all U.S. children under the age of 5 have a milk allergy, and 53 percent of food-allergic infants under age 1 have a cow's milk allergy.

"Children in the U.S. spend their early years drinking milk, so it's important to know that many of them - at least in the first few years - may be allergic," says Christopher Warren, PhD(c), lead author of the study. "Our findings suggest that while milk allergy is relatively common during infancy, many children are likely to outgrow their milk allergies. We observed that while an estimated 53 percent of food-allergic infants under age 1 have a milk allergy, the number drops to 41 percent of 1- to 2-year-olds, 34 percent of 3- to 5-year-olds and 15 percent of 11- to 17-year-olds."

The study surveyed more than 53,000 parents in households with children across the U.S. The survey was done over a one-year period from October 2015 to September 2016.

"We know confusion exists over what a real milk allergy looks like," says Ruchi Gupta, MD, ACAAI member and study author. "A child may have a milk intolerance that his parents mistake for a milk allergy. It's important that any child suspected of having a milk allergy have the allergy confirmed with an allergist. A food allergy of any kind can have a big effect on a household, including food costs and quality of life. A child with a milk allergy should receive counseling on how to avoid milk, but also on what it means to unnecessarily cut out foods. You don't want to get rid of necessary nutrients."

According to the study, only 26 percent of milk-allergic children in the US have a current epinephrine auto-injector prescription - the lowest reported rate among the top nine food allergies. "Parents need to make sure they have an epinephrine auto-injector available and should talk to their child's allergist if they have any questions," says Dr. Gupta. Allergists are trained to help you live the life you want by working with you to treat allergic diseases and avoid severe reactions.

Credit: 
American College of Allergy, Asthma, and Immunology

A new story about England's ancient 'Domesday book'

image: University of Illinois history professor Carol Symes, an expert on medieval manuscripts, dug into the origins of a book revered in English history - the "Domesday Book" - and found a different story of its creation, as well as a record of grievances against the Norman conquest.

Image: 
Photo by L. Brian Stauffer

CHAMPAIGN, Ill. -- Nearly a thousand years ago, a famous king created a famous book, later given the title "Domesday" (pronounced "doomsday").

At least that's been the common story: William the Conqueror, 20 years after his 1066 invasion of England from Normandy, ordered a massive survey of his new realm. One year later, he got a book with the results - a record of the nation's wealth and resources, everything from property to sheep to servants.

The "Great Domesday Book," as it was later named, is perhaps the most famous document in English history after the Magna Carta.

The book's origin story, however, had not been thoroughly investigated until University of Illinois history professor Carol Symes took up the task. "What had never been resolved is how this massive text was really created," Symes said, "and in this incredibly narrow timeframe."

Now, after years of research, Symes makes the case in the journal Speculum that the final "Great Domesday Book" came years and perhaps decades later than the 1087 date to which it's attributed, also the year of William's death.

It also was not the orderly bureaucratic enterprise that's often assumed, but instead "enabled hundreds of thousands of individuals and communities to air grievances and to make their own ideas of law and justice a matter of public record," Symes wrote.

"This is documentation of the trauma of conquest. We're watching people pushing back, or at least letting their voices be heard because they're fed up," she said. In one example, the text records townspeople bitterly complaining about the leveling of houses to build a castle.

"We need to rethink what has seemed to be a rather straightforward, top-down royal project, but is revealed to be the tip of a big, monstrous iceberg that involves the agency of many historical actors and often preserves their voices. This helps to tell a very different story about one of the landmark events of England - the Norman conquest and its aftermath - that is not just a story about 'the great man.'"

The universe of the "Domesday Book" is complicated, to say the least. The name is attached to two different bodies of text, "Great Domesday" and "Little Domesday" - the first covering all of the country's shires except three in the southeast, the second covering those three, but in more detail, suggesting it was an earlier draft.

There's also "Exeter Domesday," a collection of 103 booklets that appears to be an even earlier draft of survey results, mostly covering three shires in the southwest.

Curiously, London does not appear in any of these records, which likely is a sign its citizens either ignored the inquest or overwhelmed it with grievances, Symes said.

The Exeter collection is just one of many "satellite" documents that have some connection with the survey or book but have received little scholarly attention, Symes said. For many who focus their research on "Great Domesday," the book has been "the sun around which everything else spins."

Among Symes' contributions is to suggest ways that the different texts relate to each other, since that hasn't been clear. "I think I have figured out the workings behind how this book ("Great Domesday") was made," she said.

Most of Symes' research focused on the Exeter collection and another satellite document, a small fragment of parchment roll, perhaps the oldest in England, from an abbey at Burton-on-Trent in the northwest of the country. In both cases, she examined the original documents.

The Exeter documents provide numerous clues on how "Great Domesday" was assembled, but also serve as a window on the people and the process. A bishop can be seen intervening with the king's advisers when his property is not recorded. Teenage scribes make drinking plans in the marginal notes of manuscripts.

The abbey's parchment fragment, however, is key to Symes' contention that the final book came years and even decades later. She ties its contents to the comings and goings of a man who served at one time as its abbot, who had access to the survey data that went into "Domesday" and may even have been involved in the survey.

"It plugs a huge hole that we had in our evidence. It suggests that the process of creating the thing we call 'Great Domesday' actually took a lot longer than people had thought."

Symes said she was attracted to this particular book as part of her interest in medieval manuscripts, especially the complex ways in which they were "mediated" - i.e., written, handled, copied, recopied, added to, edited, interpreted and heard by audiences, all in an age before the printing press. Historians need to take a text's complex mediation into account, she said, even considering the parchment on which it was written, to fully understand and not misinterpret it.

Symes also likes messiness - finding out "how the sausage gets made." She was attracted to Domesday, in part, "because it's a messy document that people pretend is not messy. It's taken to be this pristine, transparent thing when it's not."

One value in the Domesday research, she said, is in "realizing that the people of almost a thousand years ago were real people with real human emotions and needs. We're putting on a different set of glasses to look at these sources, and what we see is all those people who were written out of the record. We're getting to see and hear them again."

The "wonderful irony," Symes said, is that we can do that through one of the most famous books created in the Middle Ages, by a king.

Credit: 
University of Illinois at Urbana-Champaign, News Bureau

Cosmic fireworks

image: UD Professor Jamie Holder (left) and doctoral student Tyler Williamson have been studying gamma rays with the help of the VERITAS telescopes located at the Fred Lawrence Whipple Observatory in Amado, Arizona.

Image: 
Photo by Evan Krape

Scientists have discovered something amazing.

In a cluster of some of the most massive and luminous stars in our galaxy, about 5,000 light years from Earth, astronomers detected particles being accelerated by a rapidly rotating neutron star as it passed by the massive star it orbits only once every 50 years.

The discovery is extremely rare, according to University of Delaware astrophysicist Jamie Holder and doctoral student Tyler Williamson, who were part of the international team that documented the occurrence.

Holder called this eccentric pair of gravitationally linked stars a "gamma-ray binary system" and likened the once-in-a-lifetime event to the arrival of Halley's comet or last year's U.S. solar eclipse.

Massive stars are among the brightest stars in our galaxy. Neutron stars are extremely dense and energetic stars that result when a massive star explodes.

This binary system is a massive star with a neutron star orbiting around it. Of the 100 billion stars in our galaxy, fewer than 10 are known to be this type of system.

Even fewer -- only two systems, including this one -- are known to have an identified neutron star, or pulsar, that emits pulses of radio waves that scientists can measure. This is important because it tells astronomers very accurately how much energy is available to accelerate particles, something scientists know little about.

"You couldn't ask for a better natural laboratory to study particle acceleration in a continually changing environment - at energies far beyond anything we can produce on the Earth," said Holder, a professor in UD's Department of Physics and Astronomy.

The project was led by a team of scientists, including Holder and Williamson, using the VERITAS telescope array at the Fred Lawrence Whipple Observatory in Arizona, in collaboration with scientists using the MAGIC telescopes at the Roque de los Muchachos Observatory located in La Palma, an island of the Canary Islands, Spain. (VERITAS stands for Very Energetic Radiation Imaging Telescope Array System and MAGIC stands for Major Atmospheric Gamma Imaging Cherenkov telescopes.)

The researchers recently reported their findings in the Astrophysical Journal Letters.

Hoping for fireworks

The natural question, to many minds, is why do scientists care about accelerated particles?

"Because our galaxy is full of them. We call them cosmic rays and they carry as much energy as the light from all the stars," said Holder.

Astronomers discovered more than 100 years ago that accelerated particles exist, yet how or where these particles speed up remains a mystery. Pulsars are among the most extreme objects in the universe, with magnetic fields millions of times stronger than anything scientists could hope to build on Earth. When a pulsar encounters dust or gas close to a massive star, the particles nearby accelerate to velocities approaching the speed of light and collide with what's around them. The result is a beam of high-energy light called gamma radiation, or gamma rays.

Sophisticated telescopes, like those operated by VERITAS and MAGIC, can detect these gamma rays because, when the rays reach the Earth's atmosphere, they produce a brief flash of blue light. Our eyes can't see these flashes because they last only nanoseconds, but these telescopes can.

Once-in-a-lifetime doctoral experience

Astronomers first discovered gamma rays coming from the pulsar in this unusual pair of stars in 2008. About the size of Newark, Delaware, the pulsar is spinning like the attachment on a kitchen blender, emitting little pulses of gamma rays and radio waves with every rotation.

By measuring these radio pulse frequencies, astronomers were able to tell how fast the pulsar was moving and calculate exactly when it would be closest to the massive star that it was orbiting -- Nov. 13, 2017. It's a trip that took 50 years.

The VERITAS and MAGIC teams began monitoring the night sky and tracking the pulsar's orbit in September 2016. At first, they weren't even sure if they would see anything. But in September 2017 the astronomers began to detect a rapid increase in the number of gamma rays hitting the top of Earth's atmosphere.

Holder and Williamson realized that the pulsar was doing something different each day.

"I would wake up every morning and check and see if we had new data, then analyze it as fast as I could, because there were times where the number of gamma rays we were seeing was changing rapidly over a day or two," said Williamson, a fourth-year doctoral student.

During the closest approach between the star and the pulsar in November 2017, Williamson noticed that the VERITAS telescopes had -- overnight -- recorded ten times the number of gamma rays detected only a few days before.

"I double checked everything before sending the data to our collaborators," Williamson said. "Then one of our partners, Ralph Bird at UCLA, confirmed he'd gotten the same results; that was exciting."

Even more interesting -- the observational data did not match the models' predictions.

Generally speaking, Holder said, existing models predicted that as the pulsar approached the massive star it was orbiting, the number of gamma rays produced would slowly rise, experience some volatility and then slowly decline over time.

"But our recorded data showed a huge spike in the number of gamma rays instead," Holder said. "This tells us that we need to revise the models of how this particle acceleration is happening."

What's more, according to Holder, while astrophysicists expected the National Aeronautics and Space Administration's (NASA) Fermi gamma-ray space telescope to record these gamma rays, it didn't. Holder said the reason for this is unclear, but that is part of what makes the VERITAS results so interesting.

Astrophysicists want to learn just which particles are being accelerated, and what processes are pushing them up to these extreme speeds, in order to understand more about the Universe. Holder said that although gamma-ray binary systems probably don't accelerate a large portion of the particles in our galaxy, they allow scientists to study the type of acceleration mechanisms which could produce them.

Charting a promising future

Astronomers won't be able to see this binary system at work again until 2067 when the two stars are once again close together. By then, Williamson joked that he just might be an emeritus professor with time on his hands.

At the moment, Williamson is not worried about running out of things to do. He spent three months at the Arizona-based observatory earlier this year, taking measurements, performing hardware maintenance and devising a remote control to allow the researchers to turn on the telescope's cameras from a computer inside a control room.

"It was a great chance to spend hands-on time with the telescopes and get to know the instrument," said Williamson.

Going forward, he'll spend the remainder of his doctoral studies combing through and analyzing in greater detail the nearly 175 hours of data the VERITAS telescopes collected in 2016 and 2017.

"Tyler is, without a doubt, the luckiest graduate student I've ever met because this event that happens only once every 50 years -- one of the most exciting things we've seen with our telescopes in a decade -- occurred right in the middle of his doctoral work," said Holder.

Credit: 
University of Delaware

Astronomers find possible elusive star behind supernova

image: This is an artist's concept of a blue supergiant star that once existed inside a cluster of young stars in the spiral galaxy NGC 3938, located 65 million light-years away. It exploded as a supernova in 2017, and Hubble Space Telescope archival photos were used to locate the doomed progenitor star, as it looked in 2007.

The star may have been as massive as 50 suns and burned at a furious rate, making it hotter and bluer than our Sun. It was so hot, it had lost its outer layers of hydrogen and helium. When it exploded in 2017, astronomers categorized it as a Type Ic supernova because of the lack of hydrogen and helium in the supernova's spectrum.

In an alternative scenario (not shown here) a binary companion to the massive star may have stripped off its hydrogen and helium layers.

Image: 
NASA, ESA, and J. Olmsted (STScI)

Astronomers may have finally uncovered the long-sought progenitor to a specific type of exploding star by sifting through NASA Hubble Space Telescope archival data. The supernova, called a Type Ic, is thought to detonate after its massive star has shed or been stripped of its outer layers of hydrogen and helium.

These stars could be among the most massive known -- at least 30 times heftier than our Sun. Even after shedding some of their material late in life, they are expected to be big and bright. So it was a mystery why astronomers had not been able to nab one of these stars in pre-explosion images.

Finally, in 2017, astronomers got lucky. A nearby star ended its life as a Type Ic supernova. Two teams of astronomers pored through the archive of Hubble images to uncover the putative precursor star in pre-explosion photos taken in 2007. The supernova, catalogued as SN 2017ein, appeared near the center of the nearby spiral galaxy NGC 3938, located roughly 65 million light-years away.

This potential discovery could yield insight into stellar evolution, including how the masses of stars are distributed when they are born in batches.

"Finding a bona fide progenitor of a supernova Ic is a big prize of progenitor searching," said Schuyler Van Dyk of the California Institute of Technology (Caltech) in Pasadena, lead researcher of one of the teams. "We now have for the first time a clearly detected candidate object." His team's paper was published in June in The Astrophysical Journal.

A paper by a second team, which appeared in the Oct. 21, 2018, issue of the Monthly Notices of the Royal Astronomical Society, is consistent with the earlier team's conclusions.

"We were fortunate that the supernova was nearby and very bright, about 5 to 10 times brighter than other Type Ic supernovas, which may have made the progenitor easier to find," said Charles Kilpatrick of the University of California, Santa Cruz, leader of the second team. "Astronomers have observed many Type Ic supernovas, but they are all too far away for Hubble to resolve. You need one of these massive, bright stars in a nearby galaxy to go off. It looks like most Type Ic supernovas are less massive and therefore less bright, and that's the reason we haven't been able to find them."

An analysis of the object's colors shows that it is blue and extremely hot. Based on that assessment, both teams suggest two possibilities for the source's identity. The progenitor could be a single hefty star between 45 and 55 times more massive than our Sun. Another idea is that it could have been a massive binary-star system in which one of the stars weighs between 60 and 80 solar masses and the other roughly 48 suns. In this latter scenario, the stars are orbiting closely and interact with each other. The more massive star is stripped of its hydrogen and helium layers by the close companion, and eventually explodes as a supernova.

The possibility of a massive double-star system is a surprise. "This is not what we would expect from current models, which call for lower-mass interacting binary progenitor systems," Van Dyk said.

The identity of the progenitors of Type Ic supernovas has been a puzzle. Astronomers have known that the supernovas were deficient in hydrogen and helium, and initially proposed that some hefty stars shed this material in a strong wind (a stream of charged particles) before they exploded. When they didn't find the progenitor stars, which should have been extremely massive and bright, they suggested a second method to produce the exploding stars that involves a pair of close-orbiting, lower-mass binary stars. In this scenario, the heftier star is stripped of its hydrogen and helium by its companion. But the "stripped" star is still massive enough to eventually explode as a Type Ic supernova.

"Disentangling these two scenarios for producing Type Ic supernovas impacts our understanding of stellar evolution and star formation, including how the masses of stars are distributed when they are born, and how many stars form in interacting binary systems," explained Ori Fox of the Space Telescope Science Institute (STScI) in Baltimore, Maryland, a member of Van Dyk's team. "And those are questions that not just astronomers studying supernovas want to know, but all astronomers are after."

Type Ic supernovas are just one class of exploding star. They account for about 20 percent of massive stars that explode from the collapse of their cores.

The teams caution that they won't be able to confirm the source's identity until the supernova fades in about two years. The astronomers hope to use either Hubble or the upcoming NASA James Webb Space Telescope to see whether the candidate progenitor star has disappeared or has significantly dimmed. They also will be able to separate the supernova's light from that of stars in its environment to calculate a more accurate measurement of the object's brightness and mass.

SN 2017ein was discovered in May 2017 by Tenagra Observatories in Arizona. But it took the sharp resolution of Hubble to pinpoint the exact location of the possible source. Van Dyk's team imaged the young supernova in June 2017 with Hubble's Wide Field Camera 3. The astronomers used that image to pinpoint the candidate progenitor star nestled in one of the host galaxy's spiral arms in archival Hubble photos taken in December 2007 by the Wide Field Planetary Camera 2.

Kilpatrick's group also observed the supernova in June 2017 in infrared images from one of the 10-meter telescopes at the W. M. Keck Observatory in Hawaii. The team then analyzed the same archival Hubble photos as Van Dyk's team to uncover the possible source.

The Hubble Space Telescope is a project of international cooperation between NASA and ESA (European Space Agency). NASA's Goddard Space Flight Center in Greenbelt, Maryland, manages the telescope. The Space Telescope Science Institute (STScI) in Baltimore, Maryland, conducts Hubble science operations. STScI is operated for NASA by the Association of Universities for Research in Astronomy in Washington, D.C.

Credit: 
NASA/Goddard Space Flight Center

Ringling train chugs into digital world

image: John Ringling's personal train car used in the early 1900s.

Image: 
The Ringling Museum

SARASOTA, Fla. (November 14, 2018)- The century-old train car known as the site of business transactions for the Ringling Bros. and Barnum & Bailey Circus has long been too fragile for visitors to step inside. The Wisconsin has faced conservation issues due to the discontinuation of spare train parts. Through a project led by the University of South Florida, visitors will no longer be limited to glancing through a window.

During a presentation at the International Conference on Cultural Heritage and New Technologies in Vienna, Austria, Davide Tanasi, PhD, assistant professor of history, and Michael Celestin, PhD, senior research engineer, revealed 3D models of the train, including its luxurious interior, providing full digital access.

To make this happen, they used digital photogrammetry and terrestrial laser scanning to create models for 3D printing furniture elements and metal components of the train's undercarriage, allowing for the creation of spare parts for future restoration efforts. They printed using metal, wood and porcelain, closely mimicking the train, which was built in 1896. It measures 79 by 14 feet and includes three staterooms, a kitchen, a dining room, servants' quarters and bathrooms.

"The virtualization of the Wisconsin train car is the result of an innovative approach aimed at popularizing, through digital technology, this crown jewel of Florida's cultural heritage, which is currently only partly accessible on site and essentially invisible to the remote public," said Tanasi.

"What we tried to ensure is that we are able to take real parts out of service so that perfect weight and appearance replicas can take their place," said Celestin. "In so doing, we are able to further preserve the fragile antique components while being able to 'adjust the slider' of time's patina on an object--stopping at as little or as much age-related damage as we want. This time-machine approach to preservation allows you to very easily create a model of how a part would look if brand new today, and also allows realistic, 3D printed replacement parts to sit alongside their time-aged counterparts."

The Wisconsin was John Ringling's personal train car that traveled the country, often accompanying the cast of his famous circus, which came to be known as "The Greatest Show on Earth" and ceased performances in May 2017. It changed hands throughout the 20th century, eventually landing at the Ringling Museum in Sarasota in 2003. Just prior, it underwent significant restoration to uncover the original paint, gold plating, and beautiful stained glass windows.

Dr. Tanasi is founder of the Institute for Digital Exploration (IDEx) in the USF College of Arts and Sciences Department of History. IDEx also led a massive 3D scanning project of the 36,000 square-foot Ca'd'Zan, the one-time residence of John and Mable Ringling.

"The mission of IDEx to document, preserve, and protect cultural heritage aligns with that of The Ringling, which aims to collect, preserve, and exhibit art for the benefit of the public," said David Berry, assistant director of academic affairs at The Ringling. "The technology employed by IDEx will be used to help The Ringling make its collections more accessible to visitors, on site and online."

Credit: 
University of South Florida

Earth's magnetic field measured using artificial stars at 90 kilometers altitude

image: The experiment on La Palma: The laser beam (yellow) generates an artificial guide star in the mesosphere. This light is collected in the receiver telescope (front left). The laser source and the receiver telescope are eight meters away from each other.

Image: 
photo/©: Felipe Pedreros Bustos

The mesosphere, at heights between 85 and 100 kilometers above the Earth's surface, contains a layer of atomic sodium. Astronomers use laser beams to create artificial stars, or laser guide stars (LGS), in this layer to improve the quality of astronomical observations. In 2011, researchers proposed that artificial guide stars could also be used to measure the Earth's magnetic field in the mesosphere. An international group of scientists has recently managed to do this with a high degree of precision. The technique may also help to identify magnetic structures in the solid Earth's lithosphere, to monitor space weather, and to measure electrical currents in the part of the atmosphere called the ionosphere.

Astronomers have been using lasers to generate artificial stars for the past 20 years. A laser beam is directed from the ground into the atmosphere. In the sodium layer, it strikes sodium atoms, which absorb the energy of the laser and then start to glow. "The atoms emit light in all directions. Such artificial stars are barely visible to the naked eye but can be observed with telescopes," explained Felipe Pedreros Bustos of Johannes Gutenberg University Mainz (JGU). In connection with the work on his doctoral thesis, the Chilean-born physicist has spent four years working on the project, which besides JGU involves the European Southern Observatory (ESO), the University of California, Berkeley and Rochester Scientific in the USA, the Italian National Institute for Astrophysics (INAF-OAR), and the University of British Columbia in Vancouver, Canada.

The artificial guide stars help astronomers to correct the distortions of light that travels through the atmosphere. The light from the artificial guide star is collected on the ground by telescopes, and the information is used to adjust in real time state-of-the-art deformable mirrors, compensating the distortions and allowing astronomical objects to be imaged sharply, down to the optical resolution, the so-called diffraction limit, of the telescope.

The precession of sodium atoms reveals the strength of the magnetic field

The participants in the collaborative project are using laser guide stars to measure the Earth's magnetic field. An ESO LGS unit dedicated to research and development is housed at the Roque de los Muchachos Observatory on La Palma, the westernmost Canary Island. The availability of this LGS unit made it possible to perform the joint experiments reported here, which also aim at increasing the brightness of laser guide stars. From the observatory, a laser beam is directed at the sodium layer, exciting and spin-polarizing the atoms so that most of their atomic spins point in the same direction. Due to the surrounding magnetic field, the polarized atomic spins rotate around the direction of the field, much as a gyroscope tilted from the vertical precesses, a phenomenon known as Larmor precession. "A guide star becomes brighter when the modulation frequency of our laser coincides with the precession frequency of sodium," explained Pedreros Bustos. "As the Larmor frequency is proportional to the strength of the magnetic field, we can use this method to measure the Earth's magnetic field in the sodium layer." The detection scheme is similar to a stroboscope.
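The proportionality the researchers exploit can be sketched numerically. The snippet below uses standard textbook values for ground-state sodium (not figures from the study) to show why the laser modulation frequency that brightens the guide star lands in the few-hundred-kilohertz range for Earth-strength fields.

```python
# Sketch: Larmor precession frequency of ground-state sodium, nu = gamma * B.
# Constants are standard reference values, not taken from the article.

MU_B_OVER_H = 13.996e9   # Bohr magneton / Planck constant, in Hz per tesla
G_F = 0.5                # Lande g-factor of the sodium F=2 ground state

GAMMA_NA = G_F * MU_B_OVER_H   # ~7 GHz/T, i.e. about 7 Hz per nanotesla

def larmor_frequency_hz(b_tesla):
    """Precession frequency (Hz) of sodium spins in a field of b_tesla."""
    return GAMMA_NA * b_tesla

def field_from_frequency_tesla(nu_hz):
    """Invert the relation: infer the field from a measured resonance."""
    return nu_hz / GAMMA_NA

# Earth's field at mid-latitudes is roughly 45-50 microtesla, so the
# resonance sits at a few hundred kilohertz:
print(round(larmor_frequency_hz(50e-6) / 1e3))  # ~350 kHz
```

Sweeping the laser's modulation frequency and watching for the brightness peak thus amounts to reading off the field strength in the sodium layer.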

Hence, the group has succeeded in using a well-studied, fundamental laboratory technique to observe the natural world. The method fills a gap in our knowledge of the Earth's magnetic field by enabling ground-based measurements in the mesosphere, a region that was previously difficult to access: until now, the magnetic field could only be measured directly on the ground, from airplanes, from balloons in the stratosphere, or from satellites.

In May 2018, a US research group published similar findings. However, these latest measurements are considerably more precise, and the scientists hope to improve them further by using higher-energy lasers. "We can also use the technique to study atomic processes in the atmosphere, for example, how often sodium collides with other atoms such as oxygen or nitrogen. This is something that hasn't been done before," said Pedreros Bustos.

This artificial guide star measuring technique will be particularly useful in geophysics. It will make it possible to track changes in the magnetic field of the Earth's ionosphere caused by the solar wind. In addition, continuous monitoring of the Earth's magnetic field at altitudes of 85 to 100 kilometers would make it feasible to observe oceanic currents and large-scale magnetic structures in the upper mantle.

Credit: 
Johannes Gutenberg Universitaet Mainz

Difficult-to-treat bowel cancers respond in first study of new drug combination

Dublin, Ireland: Early results from a phase I trial of two immune-stimulating drugs (nivolumab and pixatimod) in a small group of patients with advanced cancer suggest that patients with bowel cancer may benefit from the combination.

Dr James Kuo, a medical oncologist and the deputy medical director of Scientia Clinical Research, Sydney, Australia, presented the first data from this trial at the 30th EORTC-NCI-AACR [1] Symposium on Molecular Targets and Cancer Therapeutics in Dublin, Ireland, today (Thursday). He said that the data suggested that a population of colorectal cancer patients, considered to be microsatellite stable (MSS), received benefit from the drug combination. MSS patients, unlike microsatellite unstable patients, have tumours bearing fewer signals that alert the immune system to the cancer. This is thought to be a major reason why checkpoint inhibitors, such as nivolumab, have been unsuccessful in treating MSS colorectal cancer (CRC).

"No patients with microsatellite stable colorectal cancers have been reported to respond to monotherapy with immune checkpoint inhibitor therapy," he said. "However, in this study of a new drug combination, we observed clinical benefit in four out of five MSS CRC patients enrolled, including two demonstrating a reduction in the tumour burden."

Nivolumab is a programmed cell death protein 1 (PD-1) immune checkpoint inhibitor that is designed to help T cells recognise and attack cancer cells. In the study presented today, nivolumab was combined with pixatimod, a new drug that stimulates different cells of the immune system, namely Dendritic Cells (DCs) and Natural Killer (NK) Cells. This is the first clinical study to investigate the safety and efficacy of this combination in patients with advanced cancer.

From October 2017 to September 2018, a total of 16 patients were enrolled in the study. They had pancreatic cancer (7 patients), colorectal cancer (5), uterine adenosarcoma (1), squamous cell carcinoma (1), endometrial carcinoma (1) and adrenocortical carcinoma (1) [2]. The patients were still fairly fit when they joined the study, being either fully active, with everyday life unaffected by their disease, or restricted in physically strenuous activity but able to carry out work of a light or sedentary nature (ECOG performance status 0-1). Previous therapies had been unsuccessful in treating their cancer.

These patients were given 240 mg of nivolumab every two weeks, while pixatimod was given weekly, starting at a dose of 25 mg. This was determined to be the maximum tolerated dose after two dose-limiting toxicities occurred at 50 mg: one patient died after organ failure, and another accumulated fluid in the lung but recovered. Both drugs were given via a short intravenous infusion. One patient developed inflammation of the lung tissue (pneumonitis) at 25 mg of pixatimod and another was believed to have inflammation of the brain (encephalitis). Other side effects were manageable and included fatigue, nausea and elevated liver enzymes.

Dr Kuo said, "We observed a response in two patients with metastatic CRC: one has had a sustained reduction of 86% in tumour burden for just over twelve months of treatment now, and another remains on treatment for over six months with ongoing reduction of 38% in tumour burden despite a new growth in the bone. One further patient with CRC has stable disease at sixteen weeks. All three patients were confirmed to have microsatellite stable disease. The clinical improvement of the three patients with CRC has allowed them to continue functioning as they did before they developed cancer, including the ability to perform all the activities of daily living. In total, four patients with CRC remain in the study."

The study is continuing to recruit patients and was expanded in October 2018 to investigate the safety and efficacy of 25 mg pixatimod in combination with 240 mg nivolumab in patients with metastatic pancreatic cancer - one of the most difficult cancers to treat successfully, with less than 5% of patients alive five years after diagnosis.

Dr Kuo continued: "We have also examined the blood of each patient to look for changes in immune cell type, number and function. Although preliminary, the data demonstrate changes likely to be associated with the treatment of a PD-1 inhibitor and an immune-stimulating agent."

He concluded: "While these are early data, the combination of pixatimod with nivolumab may have the potential to overcome intrinsic resistance to PD-1 inhibitors in MSS CRC. The implication of these preliminary data is that patients with MSS CRC, who make up the vast majority of CRC patients, may benefit from combination immunotherapy. Although the study now moves on to determine the effect in metastatic pancreatic cancer patients, the data also support further investigation in MSS CRC."

Co-chair of the EORTC-NCI-AACR Symposium, Professor Antoni Ribas from the University of California Los Angeles, who was not involved in the research, commented: "Although these are results from just a small number of patients, the effect of this drug combination on a cancer subtype known to be inherently resistant to immune checkpoint inhibitors suggest that pixatimod may boost the effectiveness of nivolumab by providing another signal to the immune system, alerting it to these cancers."

Credit: 
ECCO-the European CanCer Organisation

Breakthrough in treatment of restless legs syndrome

New research published in the Journal of Physiology presents a breakthrough in the treatment of Restless Legs Syndrome (RLS).

RLS is a common condition of the nervous system that causes an overwhelming irresistible urge to move the legs. Patients complain of unpleasant symptoms such as tingling, burning and painful cramping sensations in the leg. More than 80% of people with RLS experience their legs jerking or twitching uncontrollably, usually at night.

Until now, RLS was thought to be caused by genetic, metabolic and central nervous system mechanisms. For the first time, the researchers show that it is in fact not only the central nervous system but also the nerve cells targeting the leg muscles themselves that are responsible.

This new research indicates that the involuntary leg movements in RLS are caused by increased excitability of the nerve cells that supply the muscles in the leg, which results in an increased number of signals being sent between nerve cells.

Targeting the way messages are sent between nerve cells to reduce the number of messages to normal levels may help prevent the symptoms of RLS occurring. This could be achieved by new drugs that block the ion channels that are essential for the communication between nerve cells.

The research, conducted by the University of Göttingen in conjunction with the University of Sydney and Vanderbilt University, involved measuring the nerve excitability of motor nerve cells in patients suffering from RLS and in healthy subjects.

The next step is to investigate the effects of different medications in patients and their impact on RLS.

Dirk Czesnik, corresponding author of the study, commented on the findings:

'Patients who suffer from restless legs syndrome complain of painful symptoms in the legs leading to sleep disturbances. The mechanisms behind RLS are still not completely understood. We have shown that the nerve cells supplying the muscles in the leg are also responsible, so additional drug treatments targeting these nerve cells may lie ahead.'

Credit: 
The Physiological Society

New virtual reconstruction of a Neanderthal thorax suggests another breathing mechanism

image: This is an image of the reconstruction of the thorax of Kebara 2. Scale = 5 cm.

Image: 
A. Gómez-Olivencia, A. Barash and E. Been

Neanderthals were hunter-gatherers who inhabited western Eurasia for more than 200 thousand years, during glacial as well as interglacial periods, until they became extinct around 40 thousand years ago. While some anatomical regions of these extinct humans are well known, others, such as the vertebral column and the ribs, are less well known because these elements are more fragile and not well preserved in the fossil record. In 1983, a partial Neanderthal skeleton (known officially as Kebara 2, and nicknamed "Moshe"), belonging to a young male Neanderthal who died some 60,000 years ago, was found at the Kebara site (Mount Carmel, Israel). The skeleton does not preserve the cranium, which was removed some time after burial, probably as part of a funerary ritual. However, all the vertebrae and ribs are preserved, as are other fragile anatomical regions such as the pelvis and the hyoid bone (a bone in the neck to which some of the tongue muscles are attached). It is therefore the skeleton that preserves the most complete thorax in the fossil record.

New statistical and virtual reconstruction methods have enabled the researchers to extract new information, which has just been published in the prestigious journal Nature Communications.

For over 150 years, Neanderthal remains have been found at many sites in Europe and Western Asia (including the Middle East), and the thorax morphology of this human species has been a subject of debate since 1856, when the first ribs belonging to this human group were found. Over the past decade, virtual reconstruction has become a tool that is increasingly used in fossil study. This methodology is particularly useful with fragile fossils, such as the vertebrae and ribs that form the thorax. Nearly two years ago, the same research team created a reconstruction of the spine of this Neanderthal individual; it shows that the preserved spine of Kebara 2 has less pronounced curves than that of Homo sapiens. The team's paper, published in the book "Human Paleontology and Prehistory," pointed to a straighter spine than that of modern humans.

For this virtual model of the thorax, researchers used both direct observations of the Kebara 2 skeleton, currently housed at Tel Aviv University, and medical CT (computerized axial tomography) scans of the vertebrae, ribs and pelvic bones. Once all the anatomical elements had been assembled, the virtual reconstruction was done by means of 3D software specifically designed for this purpose. "This was meticulous work," said Alon Barash of Bar Ilan University in Israel. "We had to scan each vertebra and all of the rib fragments individually and then reassemble them in virtual 3D."

"In the reconstruction process, it was necessary to virtually 'cut' and realign some of the parts that displayed deformation, and mirror-image the ribs that had been best preserved in order to substitute the poorly preserved ones on the other side," said Asier Gómez-Olivencia, an Ikerbasque research fellow at the University of the Basque Country.

"The differences between the thorax of a Neanderthal and of a modern human are striking," said Daniel García-Martínez and Markus Bastir, researchers at the National Museum of Natural Sciences (MNCN-CSIC) and co-authors of the work. "The Neanderthal spine is located more inside the thorax with respect to the ribs, which provides more stability. The thorax is also wider in its lower part," added Mikel Arlegi (UPV/EHU).

"The wider lower thorax of Neanderthals and the more horizontal orientation of the ribs, as shown in its reconstruction, suggest that Neanderthals relied more on the diaphragm for breathing," said Ella Been of the Ono Academic College. "Modern humans rely on both the diaphragm and on the expansion of the rib cage. Here we can see how new technologies and methodologies in the study of fossil remains are providing new information to understand extinct species."

This new information is consistent with recent work on the larger lung capacity of Neanderthals published by two of the co-authors of this study, Markus Bastir and Daniel García-Martínez (Virtual Anthropology Laboratory of the MNCN), which supports the presence of greater lung capacity in Neanderthals.

Patricia Kramer of the University of Washington sums it all up thus: "This is the culmination of 15 years of research into the Neanderthal thorax; we hope that future genetic analyses will provide additional clues about the respiratory physiology of the Neanderthals".

Credit: 
University of the Basque Country

Half of older patients exposed to potentially inappropriate prescribing

Inappropriate prescribing can include the intensification of existing drugs and the failure to stop or reduce doses of certain drugs after discharge from hospital.

The findings suggest that better coordination of care is needed to reduce avoidable medication related harms among these patients.

Potentially inappropriate prescribing is common among older adults and is associated with adverse outcomes including emergency hospital attendances and admissions, adverse drug events, and poorer quality of life.

Yet research to date has focused on characteristics of patients and general practitioners as risk factors for poor prescribing quality. There has been less focus on how health system factors, such as hospital admission or care transitions, may contribute to the appropriateness of prescribing for these patients.

So researchers, led by Tom Fahey at the Royal College of Surgeons in Ireland, in collaboration with the Department of Statistics and Data Science, Complutense University of Madrid, set out to determine whether hospital admission is associated with potentially inappropriate prescribing among older primary care patients (aged 65 years or more) and whether such prescribing was more likely after hospital admission than before.

They analysed data from 44 general practices in Ireland from 2012 to 2015. A total of 38,229 patients living in the community were included in the analyses. Average age was 77 years, 43% were male, and 10-15% of patients had at least one hospital admission each year.

Rates of potentially inappropriate prescribing were assessed using 45 criteria from the Screening Tool for Older Persons' Prescription (STOPP). The overall level of potentially inappropriate prescribing ranged from 45.3% of patients in 2012 to 51% in 2015.

After accounting for age, sex, number of prescription items, other conditions, and health cover, hospital admission was associated with a higher rate of potentially inappropriate prescribing.

And among participants who were admitted to hospital, the likelihood of potentially inappropriate prescribing after admission was consistently higher than before admission, even after controlling for patients' characteristics.

This is an observational study, so no firm conclusions can be drawn about cause and effect, and the researchers cannot rule out the possibility that other unmeasured factors may have affected the results. However, the study included data from a large number of patients, and the findings are consistent with previous research in the field.

As such, the researchers say that hospital admission is "an important driver of potentially inappropriate prescribing and the overuse and/or misuse of drugs."

And they call for better coordination of care, particularly for older patients with complex care needs, to help reduce risk of medication errors, adverse drug events, and readmissions.

"Identifying optimal management strategies for older people is vital to ensure that the risk of inappropriate drugs is minimised after transitions of care," they conclude.

In a linked editorial, Professor Anthony Avery at the University of Nottingham and Professor Jamie Coleman at the University of Birmingham, say opportunities to intervene are often missed.

They point to the importance of interventions known to improve outcomes at discharge, including better communication between secondary and primary care, involvement of pharmacists, and closer monitoring of patients. In addition, making the best use of electronic health records for identifying patients at risk and providing decision support, is key to tackling potentially inappropriate prescribing, they conclude.

Credit: 
BMJ Group

Older people are more likely to have an inappropriate prescription after hospitalization

image: Dr. Frank Moriarty is a senior research fellow with the HRB Centre for Primary Care Research at RCSI.

Image: 
RCSI

A new study has found that older patients who were hospitalised were 72% more likely to be given a potentially inappropriate prescription after their hospital admission, independent of other patient factors.

The study, conducted by the HRB Centre for Primary Care Research based in the Department of General Practice at RCSI (Royal College of Surgeons in Ireland), is published in the current edition of The BMJ.

RCSI researchers looked at data from general practice records of 38,229 patients (aged ≥65 years) in Ireland from 2012 to 2015. To determine if the prescriptions were potentially inappropriate, they assessed the records using 45 criteria from the Screening Tool for Older Persons' Prescription (STOPP) version 2.

Commenting on the findings, senior research fellow with the HRB Centre for Primary Care Research at RCSI Dr Frank Moriarty said: "Adults aged 65 years and older are a growing population and represent the largest consumers of prescribed medications. When caring for older patients in primary care, achieving the balance of maximising patients' benefits from medicines while minimising harms and cost can be challenging.

"Research to date has focused on patient and GP characteristics as risk factors for poor prescribing quality. Our study illustrates the need to consider and address potential adverse effects of hospitalisation on prescribing appropriate medication for older patients."

The study found that potentially inappropriate prescribing (PIP) is becoming increasingly prevalent in older people, and hospitalisation is independently associated with an increased risk of PIP. When compared to older people who had not been hospitalised in the past year, the probability of at least one PIP during a year increases by 49% for hospitalised patients after adjusting for other factors, such as the number of prescriptions and type of healthcare cover.

Dr Moriarty said: "Although we adjusted for a range of patient characteristics, there is potential for unmeasured confounding variables, as with any observational study, which may partly or fully explain the results.

"However, many of the common criteria in our study relate to inappropriate duration of use for medicines used for sleep, acid suppression, and anti-inflammatory effect. Documenting and clearly communicating the intended prescription duration or planned review date would ensure that other clinicians, such as GPs, would have complete information for reviewing and stopping such prescriptions. It is vital to identify optimal management strategies for older people to ensure the risk of inappropriate medications is minimised following their time in hospitals."

Credit: 
RCSI

Study of 2,000 children suggests London air pollution is restricting lung development

Children exposed to diesel-dominated air pollution in London are showing poor lung capacity, putting them at risk of lifelong breathing disorders, according to a study led by Queen Mary University of London, King's College London and the University of Edinburgh.

The research, published in The Lancet Public Health journal, shows that whilst traffic pollution control measures have improved air quality in London, they still need significant strengthening to protect children's health.

Air pollution is a leading cause of global mortality, with the World Health Organization estimating over four million deaths annually caused by outdoor air pollution. Children are especially vulnerable and at risk of lifelong breathing disorders, asthma attacks, chest infections and earlier death.

Professor Chris Griffiths from Queen Mary University of London said: "Despite air quality improvements in London, this study shows that diesel-dominated air pollution in cities is damaging lung development in children, putting them at risk of lung disease in adult life and early death.

"We are raising a generation of children reaching adulthood with stunted lung capacity. This reflects a car industry that has deceived the consumer and central government which continues to fail to act decisively to ensure towns and cities cut traffic."

Low Emission Zones (LEZ) restrict or penalise vehicle entry into urban areas to encourage the uptake of lower emission technologies. London introduced the world's largest city-wide LEZ in 2008, roughly contiguous with the M25 orbital motorway and encompassing around 8.5 million residents. But up until now, there has been little evidence on whether LEZs improve air quality or public health.

A total of 2,164 children aged 8-9 were enrolled into the study from 28 primary schools in the London boroughs of Tower Hamlets, Hackney, Greenwich and the City of London (all areas which fail to meet current EU nitrogen dioxide limits). The research team monitored children's health and exposure to air pollutants over five years, covering the period when the LEZ was introduced, and found:

- Children exposed to air pollution showed significantly smaller lung volume (a loss of approximately 5 per cent in lung capacity). This was linked to annual exposures to nitrogen dioxide (NO2) and other nitrogen oxides (NOx), both of which are found in diesel emissions, and to particulate matter (PM10).

- Following the implementation of London's LEZ, there were small improvements in NO2 and NOx levels, but no improvement in PM10 levels.

- Despite these improvements in air quality, there was no evidence of a reduction in the proportion of children with small lungs or asthma symptoms over this period.

- The percentage of children living at addresses exceeding the EU limit for NO2 fell following the LEZ's introduction, from 99 per cent in 2009 to 34 per cent in 2013, but children were exposed to higher levels while at school, many of which were next to busy roads.

- Significant areas of inner and outer London still remain above the EU NO2 limits.

The researchers warn that, at the current rate of change of pollution levels, full compliance with EU limits for NO2 for London remains distant, unless there is significant tightening of current emission controls.

In the meantime, they say clinicians should consider advising parents of children with significant lung disease to avoid living in high pollution areas, or to limit their exposures.

Dr Ian Mudway from King's College London said: "There is an urgent need to improve our air quality, especially within our congested cities. Policies such as the Low Emission Zone strive to do this, but their effectiveness needs careful and objective evaluation, not only in terms of whether they improve air quality, but more importantly, whether they deliver better health. As the evidence base grows demonstrating that air pollution impacts on the health of children born and growing up in our cities, so the justification for decisive action increases."

Dr Samantha Walker, Director of Research and Policy at Asthma UK, said: "It is disappointing that the Low Emission Zone in London has not helped to improve children's lung capacity and shows that a piecemeal approach to reducing air pollution does not work. If children's lungs don't develop properly as a result of air pollution it can increase their likelihood of developing asthma, leaving them coughing, wheezing and at risk of a life-threatening asthma attack. The Government needs to tackle toxic air by putting in place a new Clean Air Act to keep everyone, especially children, safe."

Professor Frank Kelly from NIHR Health Impact of Environmental Hazards HPRU said: "These new findings linking air pollution and children's lung growth provide further support for the introduction of the ultra Low Emission Zone in London early next year."

Credit: 
Queen Mary University of London

Vapers do not undermine desire to quit smoking

Smokers who regularly spend time with vapers (people who use e-cigarettes) are more likely to try quitting smoking, according to a new study carried out by UCL.

The study, published today in BMC Medicine and funded by Cancer Research UK, found that smokers who were regularly exposed to vapers (as opposed to other smokers) were around 20% more likely to have reported both a high current motivation to quit and made a recent quit attempt.

"It is becoming increasingly commonplace for smokers to come into contact with vapers, and some concerns have been raised that this could 'renormalise' smoking in England and undermine smokers' motivation to quit," said the study's lead author, Dr Sarah Jackson (UCL Institute of Epidemiology & Health Care).

"Our results found no evidence that spending time with vapers discourages smokers from quitting, which should help to alleviate concerns about the wider public health impact of e-cigarettes."

Around a quarter (25.8%) of smokers in the study said they regularly spent time with vapers. Of these, around a third (32.3%) had made an attempt to quit smoking in the previous year - a higher rate than was observed among smokers who did not regularly spend time with vapers (26.8%).

"A key factor driving these differences may be that smokers who are regularly exposed to e-cigarette use by others are more likely to use e-cigarettes themselves. When smokers' own use of e-cigarettes was taken into account, exposure to other people using e-cigarettes appeared to have little impact on how motivated smokers were to stop, and whether they made a recent quit attempt," explained Dr Jackson.

The study was conducted over a period of three and a half years, from November 2014 to May 2018. Data was provided by almost 13,000 participants of the Smoking Toolkit Study, an ongoing monthly study about smoking habits in England.

E-cigarettes are estimated to be around 95% safer than smoking tobacco, according to Public Health England. The authors say the findings should offer some reassurance in terms of the wider public health impact of e-cigarettes, particularly given evidence that the alternative, cigarette smoking, appeared to reduce other smokers' motivation to quit.

Kruti Shrotri, Cancer Research UK's tobacco control expert, said: "So far, there hasn't been much evidence about whether e-cigarettes might make smoking tobacco seem normal again. So it's encouraging to see that mixing with people who vape is actually motivating smokers to quit. As the number of people who use e-cigarettes to quit smoking rises, we hope that smokers who come into contact with them are spurred on to give up tobacco for good."

Credit: 
University College London

Genomics provide hope for those with 'one in a million' cancer diagnosis

image: The DNA of rare cancers could hold the clue to improving treatment options, according to a new clinical trial being presented by Professor Clare Scott at the 2018 COSA Annual Scientific Meeting

Image: 
Walter and Eliza Hall Institute, Australia

New research has shown that many Australians with rare cancers can benefit from genomic profiling. The findings of the patient-driven trial are being presented today at the Clinical Oncology Society of Australia Annual Scientific Meeting and could result in dramatic changes to the way those with rare cancers are diagnosed and treated.

The initial data from the pilot study for the Nominator Trial are being presented by Professor Clare Scott from the Walter and Eliza Hall Institute of Medical Research and Peter MacCallum Cancer Centre; the study was funded in part by Rare Cancers Australia.

The data show that genomic profiling provides meaningful information that influences diagnosis and treatment in approximately 50 percent of people with rare cancers: 20 percent of those tested received a new treatment plan as a result, and 6 percent of participants were given a new diagnosis.

The aim of the national initiative is to trial the use of genomic testing to match rare cancers to cancer treatments. Testing is used to identify molecular features of the cancer or genetic mutations that can be targeted with existing treatments used in other cancer types with the same characteristics.

While genomic testing is becoming increasingly used in other cancer types, this is one of the first Australian studies of its kind to look at the potential benefits for those with rare cancers, which have very low survival rates.

Professor Clare Scott says the initial pilot data are exciting and demonstrate a current unmet need.

"The treatment options for Australians with rare cancers are currently extremely limited and this ultimately leads to poor survival rates. Research has also typically been restricted because of the challenges of finding enough of each type of cancer patient to design appropriate clinical trials.

"Australians in this trial came to us after they had exhausted all their options. The cancers they had are extremely rare - the chances of being diagnosed with these cancer types are often around one in a million.

"Using genomic profiling we were able to uncover new information that gave many patients new treatment options - and ultimately, new hope."

Professor Scott will share examples during her presentation of those who have benefited.

"In one case we were able to identify that a rare heart tumour actually had a genetic profile most closely resembling a melanoma. Using that information we were able to get access to the latest treatments that are benefiting melanoma patients - which we hope will provide better outcomes for this patient."

The Nominator Pilot Study results released today included 36 patients. The two-year study will eventually include 100 patients and will lay the groundwork for other national initiatives looking into genomic profiling across a range of cancer types.

Professor Phyllis Butow, President, Clinical Oncology Society of Australia said one of the impressive things about the study was that it was driven by Australians directly affected by rare cancers.

"Around 52,000 Australians are diagnosed with rare or less common cancers each year. Those directly affected by the disease, led by Kate and Richard Vines from Rare Cancers Australia, helped call for and fund this research, so it's great to see these initial promising results being presented to cancer experts from across the country."

Credit: 
Walter and Eliza Hall Institute