Culture

Avoid piperacillin-tazobactam when treating BSI caused by ceftriaxone-resistant pathogens

Madrid, Spain: The antibiotic combination treatment piperacillin-tazobactam was significantly less effective than meropenem when treating potentially fatal bloodstream infections (BSI) caused by ceftriaxone-resistant Escherichia coli and Klebsiella pneumoniae and should be avoided when treating these organisms, according to research presented at the 28th European Congress of Clinical Microbiology and Infectious Diseases (ECCMID) [1].

Researchers from the Centre of Clinical Research at the University of Queensland determined whether piperacillin-tazobactam, a penicillin-based combination therapy, was as effective for treating BSI as the commonly used antibiotic meropenem. Their hypothesis was that definitive therapy with piperacillin-tazobactam was non-inferior to meropenem.

While there was no difference between the two groups regarding subsequent infections with drug-resistant bacteria or C. difficile, the difference in mortality rate was significant. Twenty-three patients, or 12.3%, treated with piperacillin-tazobactam died by the 30-day mark compared with seven patients, or 3.7%, treated with meropenem.

"The use of piperacillin-tatobactam as definitive therapy for bloodstream infections caused by E. coli or K. pneumoniae with non-susceptibility to third-generation cephalosporins was inferior to meropenem and should be avoided in this context," presenting author Dr Patrick Harris concluded in his presentation.

During the last 10 years the rate of carbapenem resistance has been increasing exponentially worldwide. Researchers urgently need reliable data from well-designed trials to guide clinicians in the treatment of antibiotic-resistant Gram-negative infections. Physicians face a situation where meropenem, a carbapenem commonly used for bloodstream infections, is suspected of driving resistance to this class of highly effective antibiotics, which is usually reserved for known or suspected difficult-to-treat multidrug-resistant (MDR) bacterial infections.

Bloodstream infections carry a high risk for morbidity and mortality. Such infections are common in the hospital setting and they often are difficult to treat because K. pneumoniae and E. coli, the leading causes of BSIs, have developed resistance to cephalosporins, a class of antibiotics originally made from fungi.

The team enrolled adult patients from 32 sites in nine countries - most patients were recruited in Singapore, Australia and Turkey. The study included 378 patients between February 2014 and July 2017. Healthcare-associated infections were the most common, accounting for more than half of the infections in the study group. Most infections, 60.9%, originated in the urinary tract before spreading to the bloodstream, and E. coli caused 86.5% of the cases.

Harris's team examined the primary outcome for these patients, which was mortality at 30 days after the randomisation. Randomisation occurred within 72 hours of the initial blood culture.

The team also tracked secondary outcomes: the number of days it took each patient to reach resolution of the infection; clinical and microbiological success at day four; relapse of the bloodstream infection; and secondary infection with an organism resistant to the trial drugs or with Clostridium difficile, another type of bacteria that may cause life-threatening symptoms and is sometimes a side effect of antibiotic treatment.

Credit: 
European Society of Clinical Microbiology and Infectious Diseases

Seven-day antibiotic course delivers similar outcomes to 14-day course for Gram-negative bacteraemia

Madrid, Spain: A seven-day course of antibiotic treatment for Gram-negative bacteraemia (GNB), a serious infection that occurs when bacteria get into the bloodstream, was shown to offer similar patient outcomes as a 14-day course, according to research presented at the 28th European Congress of Clinical Microbiology and Infectious Diseases (ECCMID) [1].

Researchers assessed a composite primary outcome, which included mortality and whether a patient was readmitted to hospital or had to remain in hospital longer than 14 days. In the group of 306 patients treated with a seven-day course, 46%, or 141 patients, experienced this outcome. In the group of 298 patients who received the longer treatment course, that figure was 50%, or 149 patients.

The findings of a 604-patient clinical trial presented by Dr Dafna Yahav from Tel Aviv University show that mortality rates and other outcomes are similar with the shorter treatment length. Additionally, the shorter treatment may allow patients to return to their everyday activities faster.

"In patients hospitalized with GNB and sepsis resolution before day seven, a course of seven antibiotic days was not inferior to 14 days, reduced antibiotic days and resulted in a more rapid return to baseline activity," Yahav said. "This could lead to a change in accepted management algorithms and shortened antibiotic therapy. Potentially, though we did not show that in our trial, it may lead to reduced cost, resistance development and adverse events."

GNB is a major cause of morbidity and mortality in hospitalized patients. It occurs when bacteria get into the bloodstream as a result of an infection (e.g. a urinary tract infection), surgery or the inappropriate use of medical devices (e.g. catheters). There has been little data available to physicians to assist them in determining the most beneficial length of antibiotic treatment.

Yahav anticipates that healthcare workers will implement shorter treatment protocols based on these findings. "Shorter therapy had no proven disadvantages. As true for all trials, before physicians use our findings to lead their therapy they should verify that our results are valid for their patients. We included patients that were stable at time of randomization, mostly patients with UTI (68%) and with Enterobacteriaceae as the causative organism (90%)."

Yahav and last author Mical Paul assessed the outcomes for patients admitted to three hospitals in Israel and Italy between January 2013 and August 2017. They analysed 90-day outcomes for 604 patients: 306 patients received a seven-day course of antibiotics and 298 patients received the longer 14-day treatment. Patients with ongoing sepsis or cases where there was an uncontrolled source of infection were excluded from the study.

Additionally, the researchers examined a host of other patient outcomes, such as mortality at 30 and 90 days, whether the patient developed secondary infections, and whether they developed a Clostridium difficile infection, which can be common after antibiotic treatment. Yahav's team also recorded the total number of days patients were treated with antibiotics and remained in hospital, their functional capacity, how long it took them to return to everyday activities, and any development of resistance or other negative side effects of treatment.

Thirty-six patients on the shorter treatment course, or 11.8%, had died at the 90-day mark compared with 32 patients, or 10.7%, in the longer treatment group.

In the shorter treatment group, the number of antibiotic days was reduced significantly. Patients in the short-term treatment arm were treated for a median of five days, compared with a median of 12 days in the long-term arm. This resulted in a reduction of 1,551 antibiotic days and, Yahav said, had the benefit of allowing patients to return to regular daily activities faster. Patients in the seven-day group were able to return to baseline activities within a range of zero to just over eight weeks. The 14-day group saw a slower return to baseline activities, within a range of one to twelve weeks.

Credit: 
European Society of Clinical Microbiology and Infectious Diseases

Correcting tiny differences in patients' position for radiotherapy could increase survival chances

Barcelona, Spain: Very small differences in the way a patient lies during radiotherapy treatment for lung or oesophageal cancer can have an impact on how likely they are to survive, according to research presented at the ESTRO 37 conference.

These differences of only a few millimetres can mean that the radiation treatment designed to target patients' tumours can move fractionally closer to the heart, where it can cause unintentional damage and reduce survival chances.

The finding suggests that survival could be improved by tightening up treatment guidelines to ensure patients are positioned more accurately.

Radiotherapy plays an important role in cancer care, including for hard-to-treat tumours such as lung and oesophageal cancer. However, it can cause side-effects, and previous research shows that radiotherapy to the chest can have negative long-term effects on the heart, for example by increasing the risk of heart disease.

When planning radiotherapy treatment, cancer specialists create a CT image of their patient. This reveals the exact position and size of the tumour within the body. At each subsequent treatment, another image is created and used to check that the patient and, therefore, the tumour is in the same position, within a certain threshold, before the treatment is delivered.

The new research was presented by Corinne Johnson, a medical physics PhD student at the Manchester Cancer Research Centre, part of the Christie NHS Foundation Trust and the University of Manchester, UK.

She and her colleagues studied a group of 780 patients with non-small cell lung cancer who were treated with radiotherapy. For each treatment, patients were positioned on the treatment machine and an image was taken to confirm that they lay within 5mm of their original position. They used the data from these images to gauge how accurately the radiotherapy dose was delivered over the course of treatment, and whether it was shifted slightly closer or slightly further away from the patient's heart.

When they compared these data with how likely patients were to survive, they found that patients with slight shifts towards their hearts were around 30 per cent more likely to die than those with similar sized shifts away from their hearts.

When they repeated the research with a group of 177 oesophageal cancer patients, they found an even greater difference of around 50 per cent. In both groups the pattern of survival remained even when researchers took other factors such as the patient's age into account.

Johnson explains: "We already know that using imaging can help us to target cancers much more precisely and make radiotherapy treatment more effective.

"This study examines how small differences in how a patient is lying can affect survival, even when an imaging protocol is used. It tells us that even very small remaining errors can have a major impact on patients' survival chances, particularly when tumours are close to a vital organ like the heart.

"By imaging patients more frequently and by reducing the threshold on the accuracy of their position, we can help lower the dose of radiation that reaches the heart and avoid unnecessary damage."

Johnson and her colleagues are now looking at the data in more detail to see whether particular regions of the heart are more sensitive to radiation than others, and they hope to investigate the effect of differences in patient position in other types of cancer.

President of ESTRO, Professor Yolande Lievens, head of the department of radiation oncology at Ghent University Hospital, Belgium, said: "Radiotherapy treatments are given according to strict protocols to ensure that patients get the most effective treatment with the fewest possible side-effects. This research suggests that changes to lung and oesophageal cancer protocols could positively impact the overall survival of patients with these cancers, both of which have relatively high mortality rates."

Credit: 
European Society for Radiotherapy and Oncology (ESTRO)

New leads in the development and treatment of liver disease

A treatment gap remains for many conditions involving damage to the liver, the body's main organ for removing toxins, among other functions. The Experimental Biology 2018 meeting (EB 2018) will feature important research announcements related to the causes of liver degradation and possible treatments.

Receptor for sleep hormone melatonin may play a role in liver cirrhosis

Texas A&M University College of Medicine and Central Texas Veterans Health Care System researchers have discovered a potential new lead for treating chronic liver diseases. The research focuses on melatonin, a hormone associated with maintaining circadian rhythms. Receptors for this hormone can be found in the liver, as well as elsewhere in the body, and previous experiments using mice have shown that melatonin helps reduce the processes that cause liver fibrosis (scarring that ultimately leads to cirrhosis). When the researchers bred mice that were incapable of expressing different kinds of melatonin receptors, the mice showed different rates of liver fibrosis. Fibrosis was significantly decreased in mice incapable of expressing one receptor in particular, known as MT1. This suggests that drugs designed to block MT1 activity could potentially help slow liver disease progression.

Nan Wu will present this research at the American Society for Investigative Pathology annual meeting at EB on Saturday, April 21, from 8:35-8:45 a.m. in Room 2 (abstract) and on Tuesday, April 24, from 4:15-4:30 p.m. in Room 5A.

Deciphering the links between alcoholism and liver cancer

Steatohepatitis is a type of fatty liver disease that can lead to cirrhosis and liver cancer. While it can occur in people who drink little or no alcohol, it is far more common--and more likely to progress to liver cancer--in people with alcoholism. A new study by researchers at Harbor-UCLA Medical Center reveals how the expression of certain proteins in the liver differs between patients with non-alcoholic steatohepatitis and alcoholic steatohepatitis. The researchers investigated 10 proteins that are known to play a role in cancer development. Both patient groups showed increased levels of most of the proteins compared to healthy people, but the protein levels were much higher in those with alcoholic steatohepatitis, which helps explain why these patients face such a high risk of liver cancer.

Jiajie Lu will present this research at the American Society for Investigative Pathology annual meeting at EB on Sunday, April 22, from 11:45 a.m.-12:45 p.m. in the Exhibit Hall (poster D31) (abstract) and on Tuesday, April 24, from 5:30-7:30 p.m. in Ballroom 20BC.

Potential therapeutic target for liver damage from acetaminophen

Taking too much acetaminophen (the active ingredient in Tylenol®) can cause serious liver damage and even death. In a new study, researchers at the Central Texas Veterans Health Care System and Texas A&M University Health Science Center identify a possible new way to interfere with the process by which acetaminophen damages liver cells. The research focuses on the role a protein, transforming growth factor beta 1 (TGFβ1), plays in the cascade of events that leads to cell death. Scientists discovered that the damage caused by acetaminophen was reversed in mice bred without the ability to produce TGFβ1 and in genetically normal mice that were treated with a TGFβ1-disabling agent. The results suggest that interrupting TGFβ1's activity could be one way to prevent or treat acetaminophen-related liver injury. This work was supported by Central Texas Veterans Health Care System and Texas A&M University Health Science Center, Temple, Texas.

Matthew McMillin will present this research at the American Society for Investigative Pathology annual meeting at EB on Tuesday, April 24, from 2:15-2:30 p.m. in Room 5A (abstract) and on Tuesday, April 24 from 5:30-7:30 p.m. in Ballroom 20BC (poster 415.2).

New insights on non-coding RNA in alcoholic liver disease

A tiny segment of RNA known as microRNA-21 has been found to play a role in cancer and heart disease. New research from the University of Connecticut suggests the molecule also influences the processes involved in alcoholic liver disease, a leading cause of cirrhosis. While microRNA-21 does not itself code for cellular functions the way DNA does, it can interfere with how other genes are expressed. In the study, mice fed a diet spiked with alcohol produced significantly higher amounts of microRNA-21 in the liver compared to mice on a normal diet. Tissue samples from human volunteers also showed that microRNA-21 levels were markedly increased in people with alcohol-related cirrhosis compared to healthy individuals. The researchers gained additional insights about the ways microRNA-21 affects liver health by breeding mice that were incapable of producing microRNA-21 in their livers.

Yulan Zhao will present this research at the American Society for Investigative Pathology annual meeting at EB on Sunday, April 22, from 4:15-5:15 p.m. in Room 4 (abstract) and on Tuesday, April 24, from 5:30-7:30 p.m. in Ballrooms 20BC (poster 150.10).

EB 2018 is the premier annual meeting of five scientific societies to be held April 21-25 at the San Diego Convention Center. Contact the media team for abstracts, images and interviews, or to obtain a free press pass to attend the meeting.

Credit: 
Experimental Biology

New infection prevention tool improves transparency and standardization of practice

Madrid, Spain: Researchers have developed a new colour-coded visual tool called Infection Risk Scan, or IRIS, which is set to make it easier for healthcare workers to measure the areas in which a hospital complies with guidelines and where it needs to implement measures to improve infection control and the use of antimicrobial therapies, according to research presented at the 28th European Congress of Clinical Microbiology and Infectious Diseases (ECCMID) [1].

Dr Ina Willemsen, presenting author, and her team of researchers looked at several variables in care and compared them to local quality standards and data. These variables included hand hygiene, environmental contamination, healthcare workers' personal hygiene, appropriate use of antibiotics and any transmission of antibiotic-resistant Gram-negative bacteria, such as Klebsiella spp. or Escherichia coli (E. coli). From this, the risk factors were displayed as an image that shows patient-related risks and any factors that can be controlled by healthcare workers.

"Infection control needs user-friendly standardized instruments to measure the compliance to guidelines and to implement improvement actions," Willemsen said.

"The IRIS method provides a multifactorial tool ensuring transparency in the infection control practices and outcomes. Repeated use of the IRIS makes it possible to monitor outcome and offers opportunities for targeted adjustment where needed. This results in a plan-do-check-act (PDCA) quality cycle in infection control."

Over three years, Willemsen's team conducted four consecutive IRIS assessments in five wards of a Dutch hospital. All wards improved hand hygiene compliance, which increased to 68% overall from 43%. Environmental contamination, evaluated using adenosine triphosphate measurements, improved, but the improvement could not be sustained. Personal hygiene was already good and was maintained over the observation period. The appropriate use of antibiotics did not improve: although the researchers identified clear opportunities for improvement, no changes were implemented. Finally, the researchers noted only one instance of drug-resistant bacteria transmission, involving two patients.

In the next two years, IRIS will be implemented in nine hospitals and 40 nursing homes in the Dutch/Belgian border area. In addition to this research, Willemsen also explored whether this model could be useful in comparing the quality of infection control programmes between hospitals and between hospitals in different health systems. To that end, they compared IRIS results from a hospital in the Netherlands and one in the United States.

The same variables were used to determine the levels of infection control and antimicrobial use as described above. Willemsen's team compared the US results with the Dutch guidelines and vice versa.

The hospitals varied greatly in their approach to antimicrobial therapy. Narrow-spectrum antimicrobials are used in the Netherlands whereas in the United States, guidelines call for broad-spectrum antimicrobials. There was also a large difference in environmental contamination levels, with the higher levels being found in the Dutch hospital. Personal hygiene practices differ as well, for instance, national guidelines forbid the wearing of jewellery in the Dutch hospital.

Willemsen said: "The standards and guidelines in the two hospitals showed substantial differences, which makes it impossible to compare the level of quality of infection control and antimicrobial use in the two hospitals in different countries with different national guidelines. More evidence-based standardization of guidelines around the globe is needed to allow international comparisons of the standard of care."

Willemsen will present this research in an oral session on Saturday and as a paper poster on Sunday. Last month she published a paper about IRIS in the journal Antimicrobial Resistance & Infection Control.

Credit: 
European Society of Clinical Microbiology and Infectious Diseases

Study recommends strong role for national labs in 'second laser revolution'

image: A view of BELLA, the Berkeley Lab Laser Accelerator.

Image: 
Roy Kaltschmidt/Berkeley Lab

A new study calls for the U.S. to step up its laser R&D efforts to better compete with major overseas efforts to build large, high-power laser systems, and notes progress and milestones at the Department of Energy's Berkeley Lab Laser Accelerator (BELLA) Center and other sites.

An investment in this so-called "second laser revolution" promises to open up a range of applications, from machining to medicine to particle acceleration, according to the December report by the National Academies of Sciences, Engineering, and Medicine, which offers independent analysis to government agencies and policymakers.

The 280-page report, "Opportunities in Intense Ultrafast Lasers: Reaching for the Brightest Light ", recommends increased coordination and collaboration by government labs and agencies, universities, and industry to build up U.S. laser facilities and capabilities.

It also recommends that the DOE lead the creation of a national strategy to develop and operate large-scale national laboratory-based laser projects, midscale projects that could potentially be hosted at universities, and a laser tech-transfer program connecting industry, academia, and national labs.

The committee that prepared the report visited Berkeley Lab and other Northern California national labs, including SLAC National Accelerator Laboratory and Lawrence Livermore National Laboratory. The committee also visited the Extreme Light Infrastructure Beamlines laser facility site under construction in the Czech Republic, and the Laboratory for Laser Energetics of the University of Rochester in New York.

At the DOE's Lawrence Berkeley National Laboratory (Berkeley Lab), BELLA scientists are working to develop laser-based acceleration techniques that could lead to more compact particle accelerators for high-energy physics and drivers for high-energy light sources; also, the report notes, "laser expertise and utilization" that had been concentrated at other laboratories "is now broadening with plans for utilization of lasers at (Berkeley Lab)" and elsewhere.

BELLA has made progress in demonstrating the rapid acceleration of electrons using separate stages of laser-based acceleration by forming and heating plasmas in which a powerful wave is created for electrons to "surf" on.

"There's a lot of work that's been done already, and Berkeley Lab has been a key developer for the vision of where things need to go," said Wim Leemans, director of the BELLA Center and the Lab's Accelerator Technology & Applied Physics Division.

Berkeley Lab was home to a pioneering experiment in 2004 that showed laser plasma acceleration can produce relatively narrow energy spread beams - reported in the so-called "Dream Beam" issue of the journal Nature - and in 2006 used a similar laser-driven acceleration technique to accelerate electrons to a then-record energy of 1 billion electron volts, or GeV. That achievement was followed in 2014 by a 4.2 GeV beam, using the powerful new laser that is at the heart of the BELLA Center and will be key to its ongoing campaign for 10 GeV. In 1996, Berkeley Lab also logged the first demonstration of X-ray pulses lasting just quadrillionths of a second with a technique known as "inverse Compton scattering," the report notes.

K-BELLA: combining speed and power

"What industry is seeing is the push toward higher-average-power lasers and ultrafast lasers, and it's starting to impact machining and industrial applications," Leemans said. "That's really good news for us." In laser lingo, average power relates to how much total power the laser puts out over time, counting the pulses and the "off time" between pulses, while the peak power is that of an individual pulse.

A rapid-fire rate of high-power pulses gives a laser higher average power and can potentially be applied to a wider range of uses. The National Academies report recommends that U.S. scientific stakeholders should work to define the technical specifications in laser performance goals, such as targets for peak power, repetition rate, length of pulses, and the wavelength of laser light.

In 2012 the BELLA Center's laser set a record by delivering a petawatt (quadrillion watts) of power packed into pulses that measured 40 quadrillionths of a second in length and came at a rate of one per second.
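As a rough, back-of-the-envelope illustration of the peak-versus-average-power distinction described above (these figures are derived only from the pulse parameters quoted in this article, not taken from the National Academies report), the energy in a single pulse and the resulting average power can be estimated as:

\[
E_{\text{pulse}} \approx P_{\text{peak}} \times \tau \approx 10^{15}\,\mathrm{W} \times 40\times10^{-15}\,\mathrm{s} = 40\,\mathrm{J},
\qquad
P_{\text{avg}} = E_{\text{pulse}} \times f_{\text{rep}} \approx 40\,\mathrm{J} \times 1\,\mathrm{Hz} = 40\,\mathrm{W}.
\]

On this estimate, raising the repetition rate to a kilohertz while keeping the same energy per pulse would push the average power into the tens-of-kilowatts range, even though the peak power of each individual pulse stays at the petawatt level.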

A new goal is to up this pulse rate to 1,000 per second, or a kilohertz, for a next-gen upgrade dubbed K-BELLA. Producing pulse rates of up to 10,000 or 100,000 per second could make this machine relevant for a new type of laser-based particle accelerator.

"There are lots of applications for a k-BELLA-style laser," Leemans said. The vision is for k-BELLA to be a collaborative research facility that would be open to scientists from outside the Lab, he said, which also syncs with the recommendations in the report to foster a more cooperative environment for laser science and scientists. Forging and maintaining connections to other world-class laser centers is also key for the U.S. laser program, the report notes.

Another upgrade that may be useful to the U.S. laser program is the addition of a second beamline at BELLA, Leemans said. A second beamline could enable exotic collisions between a beam of light and an electron beam, or between two beams of light.

Laser-produced beams of light elements, and laser-produced low-energy electron beams, could also be pursued at BELLA to develop the biomedical basis for new types of medical treatments that better target cancers, for example. "We look forward to enhancing our own laser capabilities at Berkeley Lab while working with our partners to strengthen the nation's laser R&D efforts," said James Symons, associate laboratory director for physical sciences. "Higher average power lasers will be essential for all practical applications of laser plasma accelerators."

Credit: 
DOE/Lawrence Berkeley National Laboratory

Measles serious threat for babies, toddlers, unvaccinated youths, ECDC says

Madrid, Spain: The vast majority of measles cases in Europe were reported in unvaccinated patients, and children younger than two years old were at a higher risk of dying from measles than older patients, according to research presented at the 28th European Congress of Clinical Microbiology and Infectious Diseases (ECCMID) [1].

Presenting author Dr Emmanuel Robesyn of the European Centre for Disease Prevention and Control (ECDC) in Stockholm said the team analysed the data to support European states in reaching the recommended 95% two-dose vaccination coverage. The study also set out to determine any possible differences between society's youngest individuals and older populations when infected with the disease.

The study examined all 37,365 measles cases reported to the ECDC from 1 January 2013 through 31 December 2017. The researchers found 81% of all reported cases were patients who were not vaccinated. Most cases were in Italy, Romania, Germany, the Netherlands and the United Kingdom, with each reporting more than 5% of the cases. These countries also had the most cases that had not been connected with importation of the disease.

The study also noted that 33% of the patients were hospitalised and 11% had pneumonia. Most cases, 81%, involved those who were two years old and older. Of the remaining 19%, 9% were one year old and 10% were younger than one year.

The rate at which patients died from the disease highlighted the impact measles had on the very youngest populations. ECDC's analysis showed that one in 1,000 measles patients died, and the fatality rate was highest among the youngest cases. One-year-olds were six times more likely to die than patients who were two years old or older, and infants younger than one year were seven times more likely to die.

The findings are based on ECDC data collected in the most recent years in the EU/EEA and can inform communication efforts to tackle the resurgence of measles in Europe.

The World Health Organization has set goals for the elimination of measles and rubella. One of the main actions to achieve those goals is to maintain high rates of sustained immunizations.

Credit: 
European Society of Clinical Microbiology and Infectious Diseases

What's in a name? Yale researchers track PTSD's many identities during war

Posttraumatic stress disorder (PTSD) has been associated with military activities for as long as wars have been fought - but this disorder was only named in the 1980s. A new Yale paper published April 16, 2018 in Chronic Stress documents a different kind of war - a war of words - that has been fought over the name of the disorder, and may have slowed clinical and scientific progress on the disorder.

The study looked back through 14 million newspaper articles published between 1900 and 2016, and found trauma-related symptoms such as flashbacks, trouble sleeping, and severe anxiety were called different names after World War I, World War II, Vietnam, and the Gulf War. Terms such as shell shock, war neurosis, and battle fatigue were used to describe symptoms that many soldiers experienced after returning home from war.

The Yale-led research team used a complex computer coding program to comb through the archives of the New York Times, Reuters, and Associated Press, and found that as each war came and went, a new term was used to describe the disorder now referred to as PTSD. They argue that the lack of clear terminology may have slowed professional understanding, knowledge, and research into PTSD.

"Each (military) conflict basically had its own new name for what were really the same group of symptoms," said Adam Chekroud, PhD, Assistant Professor of Psychiatry and the paper's first author. "For us, this was an opportunity to dig into the data, and to quantify this phenomenon."

The paper revealed that PTSD symptoms were known as shell shock during World War I, and irritable heart or soldier's heart during World War II. The term gross stress reaction was introduced in the first edition of the Diagnostic and Statistical Manual in 1952, but was omitted in a second edition in 1968 during the Vietnam War.

It wasn't until 1980, with the publication of the manual's third edition, that the term PTSD was introduced to describe military trauma and non-war related factors, such as sexual abuse. "PTSD has existed forever," Chekroud said. "It's just a question of what we've been calling it."

Chadi Abdallah, MD, Assistant Professor of Psychiatry at Yale and the editor of Chronic Stress, said the history of disjointed terminology resulted in a 60-year delay in understanding traumatic symptoms experienced by veterans and others.

"Society finally recognized that many people were suffering from the same symptoms," he said. "(The study) provides objective measures to how society has reacted to this invisible wound of war."

Credit: 
SAGE

Brachytherapy for cervical cancer does not increase the risk of ureteral stricture

Barcelona, Spain: A rare but potentially serious complication following radiation treatment for cervical cancer is a narrowing of the tube (the ureter) that takes urine from the kidneys to the bladder, which can lead to kidney damage and sometimes life-threatening infections. This is called ureteral stricture and, until now, there have been concerns that brachytherapy might increase the risk, although the treatment itself is associated with better survival.

However, new research presented at the ESTRO 37 conference today (Saturday) from two large international trials, shows that intracavitary and interstitial (IC/IS) brachytherapy is safe and does not increase the risk of ureteral stricture. Intracavitary (IC) brachytherapy involves placing an applicator in the uterus, while interstitial (IS) brachytherapy involves inserting needles directly into the tumour. Then the appropriate radiation dose is delivered to the cancer via one or both of these approaches. The procedure is performed after a CT or MRI scan has pinpointed the exact position of the cancer, so that the radiation treatment can be targeted precisely, and this is called image-guided brachytherapy (IGBT).

Dr Lars Fokdal (MD, PhD), a consultant at the Aarhus University Hospital, Denmark, who led the study, said: "These results show that image-guided brachytherapy using intracavitary combined with interstitial techniques is safe and is not associated with more cases of ureteral stricture afterwards compared to using intracavitary techniques alone."

Dr Fokdal and his colleagues looked at data from 1772 patients with cervical cancer that had started to spread to nearby tissues (locally advanced) who were enrolled in two trials of the treatment - EMBRACE and retro-EMBRACE - that were being carried out in 12 countries. IC/IS IGBT was delivered to 36% of the patients.

After following the patients for between one and 163 months (the middle or 'median' number of months was 29), 36 patients were diagnosed with the more severe form of ureteral stricture (grade 3-4). The overall risk of developing grade 3-4 ureteral stricture was 2% after three years and 3.2% after five years. The risk was lowest (1.3%) among the 1370 patients with small tumours, and highest among the 130 patients with more advanced cancer who also had swollen ureters due to a build-up of urine (hydronephrosis) at the time of diagnosis. In these patients the risk of ureteral stricture was 13.6% after three years and 23.4% after five years.

"The incidence of ureteral stricture in cervical cancer patients generally is between 2-3%, so the overall risk of developing the complication after IC/IS image-guided brachytherapy compares well. It is good to know that the interstitial component of IGBT does not increase the risk of this relatively rare but sometimes serious complication. However, the risk is more pronounced in patients with advanced stage and hydronephrosis at diagnosis," said Dr Fokdal.

"There are different strategies that can be used to avoid ureteral stricture in the subset of patients who are at higher risk. One strategy could be closer observation following IC/IS IGBT so that ureteral strictures could be spotted earlier before they become too severe. Another strategy could be insertion of ureteral stents before radiotherapy in order to visualise the organ on imaging and reduce the dose delivered during brachytherapy."

He continued: "Results from the retro-EMBRACE and EMBRACE trials have also shown that IC/IS image-guided brachytherapy is associated with a better outcome for patients in terms of survival and adverse side-effects. The increased, but targeted radiation dose to the tumour controls the cancer better without adversely affecting nearby organs and tissues. Taking all these results together, we have growing evidence in favour of IC/IS IGBT for treating cervical cancer."

Symptoms of a ureteral stricture include back pain, loss of kidney function and a risk of severe kidney or urinary tract infection. It is treated either by widening the blocked tube and inserting a stent to keep it open or by inserting a tube directly into the kidney through the skin that is connected to a urine collection bag secured to the patient's back. Some patients will need this for a short time, but for some it may be permanent.

President of ESTRO, Professor Yolande Lievens, head of the department of radiation oncology at Ghent University Hospital, Belgium, said: "These results from the EMBRACE and retro-EMBRACE trials provide reassuring evidence on the benefits of combining interstitial and intracavitary brachytherapy to treat cervical cancer patients. While the study showed that patients who are at higher risk of complications can be identified, thus allowing us to monitor them more closely and maybe use a slightly different treatment approach to decrease their risk, it also clearly illustrated that using the latest technology translates into better outcomes and value for our patients."

Credit: 
European Society for Radiotherapy and Oncology (ESTRO)

Researchers illuminate the path to a new era of microelectronics

image: Photograph of the bulk silicon electronic-photonic chip designed by the MIT, UC Berkeley and Boston University team.

Image: 
Amir Atabaki

A new microchip technology capable of optically transferring data could solve a severe bottleneck in current devices by speeding data transfer and reducing energy consumption by orders of magnitude, according to an article published in the April 19, 2018 issue of Nature.

Researchers from Boston University, Massachusetts Institute of Technology, the University of California Berkeley and University of Colorado Boulder have developed a method to fabricate silicon chips that can communicate with light and are no more expensive than current chip technology. The result is the culmination of a several-year-long project funded by the Defense Advanced Research Project Agency that was a close collaboration between teams led by Associate Professor Vladimir Stojanovic of UC Berkeley, Professor Rajeev Ram of MIT, and Assistant Professor Milos Popovic from Boston University and previously CU Boulder. They collaborated with a semiconductor manufacturing research team at the Colleges of Nanoscale Science and Engineering (CNSE) of the State University of New York at Albany.

The electrical signaling bottleneck between current microelectronic chips has left light-based communication as one of the only remaining options for further technological progress. The traditional method of data transfer - electrical wires - has a limit on how fast and how far it can transfer data. It also uses a lot of power and generates heat. With the relentless demand for higher performance and lower power in electronics, these limits have been reached. But with this new development, that bottleneck can be solved.

"Instead of a single wire carrying 10 to 100 gigabits per second, you can have a single optical fiber carrying 10 to 20 terabits per second--so about a thousand times more in the same footprint," says Popovic.

"If you replace a wire with an optical fiber, there are two ways you win," he says. "First, with light, you can send data at much higher frequencies without significant loss of energy as there is with copper wiring. Second, with optics, you can use many different colors of light in one fiber and each one can carry a data channel. The fibers can also be packed more closely together than copper wires can without crosstalk."

In the past, progress to integrate a photonic capability onto state-of-the-art chips that are used in computers and smartphones was hindered by a manufacturing roadblock. Modern processors are enabled by highly developed industrial semiconductor manufacturing processes capable of stamping out a billion transistors that work together on one chip. But these manufacturing processes are finely tuned and designing an approach to include optical devices on chips while keeping the current electrical capabilities intact proved difficult.

The first major success in overcoming this roadblock came in 2015, when the same group of researchers published another paper in Nature that solved this problem, but in a setting of limited commercial relevance. The paper demonstrated the world's first microprocessor with a photonic data transfer capability and the approach to manufacturing it without changing the original manufacturing process - a concept the researchers have termed a zero-change technology. Ayar Labs, Inc., a startup that Ram, Popovic and Stojanovic co-founded, has recently partnered with major semiconductor industry manufacturer GlobalFoundries to commercialize this technology.

However, this previous approach was applicable to a small fraction of state-of-the-art microelectronic chips that did not include the most prevalent kind, which use a starting material referred to as bulk silicon.

In the new paper, the researchers present a manufacturing solution applicable to even the most commercially widespread chips based on bulk silicon, by introducing a set of new material layers in the photonic processing portion of the silicon chip. They demonstrate that this change allows optical communication with no negative impact on electronics. By working with state-of-the-art semiconductor manufacturing researchers at CNSE Albany to develop this solution, the scientists ensured that any process that was developed could be seamlessly inserted into current industry-level manufacturing.

"By carefully investigating and optimizing the properties of the additional material layers for photonic devices, we managed to demonstrate state-of-the-art system-level performance in terms of bandwidth density and energy consumption while starting from a much less expensive process compared to competing technologies," says Fabio Pavanello, a former postdoctoral associate from Popovic's research group who is a co-first author of the paper with both Amir Atabaki, a research scientist at MIT, and Sajjad Moazeni, a graduate student at UC Berkeley. "It took a major collaboration over several years by our three groups across different disciplines to achieve this result," adds Atabaki.

The new platform, which brings photonics to state-of-the-art bulk silicon microelectronic chips, promises faster and more energy efficient communication that could vastly improve computing and mobile devices. Applications beyond traditional data communication include accelerating the training of deep-learning artificial neural networks used in image and speech recognition tasks, and low-cost infrared LIDAR sensors for self-driving cars, smartphone face identification and augmented reality technology. In addition, optically enabled microchips could enable new types of data security and hardware authentication, more powerful chips for mobile devices operating on 5th generation (5G) wireless networks, and components for quantum information processing and computing.

"For the most advanced current state-of-the-art and future semiconductor manufacturing technologies with electronic transistor dimensions below 20nm, there is no other way to integrate photonics than this approach.", concluded Vladimir Stojanovic, whose team led some of the work, "All of the material layers used to form transistors become too thin to support photonics, so the additional layers are needed."

Credit: 
Boston University College of Engineering

Wood formation model to fuel progress in bioenergy, paper, new applications

image: Lignin, a primary component of wood, imparts strength and density to timber but must be removed for biofuel, paper and pulp production through costly treatments that require high heat and harsh chemicals.

Image: 
Jack Wang, North Carolina State University. Image by Hao-Chuan Huang.

A new systems biology model that mimics the process of wood formation allows scientists to predict the effects of switching on and off 21 pathway genes involved in producing lignin, a primary component of wood. The model, built on more than three decades of research led by Vincent Chiang of the Forest Biotechnology Group at North Carolina State University, will speed the process of engineering trees for specific needs in timber, biofuel, pulp, paper and green chemistry applications.

"For the first time, we can predict the outcomes of modifying multiple genes involved in lignin biosynthesis, rather than working with a single gene at a time through trial and error, which is a tedious and time-consuming process," says Jack Wang, assistant professor in NC State's College of Natural Resources and lead author of a paper about the research in Nature Communications.

Lignin, which forms in the plant cell wall, is an essential component for tree growth that imparts strength and density to timber. But lignin must be removed from wood during biofuel, paper and pulp production through costly treatments that require high heat and harsh chemicals.

"Having a model such as this, which allows us to say if you want this type of wood, here are the genes that you need to modify, is very beneficial, especially when you have an enormous number of possible combinations with 21 pathway genes," Wang says. "It's only possible through integrated analysis which allows us to look at this process at a systems level to see how genes, proteins and other components work together to regulate lignin production."

The landmark lignin study may represent the most comprehensive model of a single pathway in a single plant species, Wang says. The model will serve as a foundation for future work and can expand to incorporate new components and processes, he adds.

The model tracks 25 key wood traits. For timber, density and strength are paramount. Biofuel producers home in on genes linked to high polysaccharide levels, allowing wood to be more easily converted to biodiesel or jet fuel. Pulp and paper producers look for wood with low lignin levels or wood that is more readily hydrolyzed. High lignin woods are novel resources for the production of special value-added phenolic compounds.

New applications are already in the works. "NC State researchers in Bob Kelly's lab are looking at how we can produce trees that can be paired with thermophilic bacteria for optimal conversion to biofuels and biochemicals," Wang says. "We are also looking at this integrative analysis to generate trees specifically tailored for production of nanocellulose fibers to replace petroleum-based materials such as plastic."

More than three dozen molecular geneticists, engineers, chemists and mathematicians have contributed to the model. The research, which began in 2008, included the painstaking process of producing thousands of transgenic trees, using the model tree species black cottonwood (Populus trichocarpa). Several researchers who initially contributed to the project as graduate students in the program now run labs of their own around the world.

The study highlights the utility of systems-level plant research, which researchers hope will inspire similar work on related pathways in other species.

"The complexity of biological pathways is such that it's no longer sufficient to look at small-scale, independent analysis of one or two genes," Wang says. "We should use a systems biology approach to look at entire pathwaywide or organismwide analysis at a systems level, to understand how individual genes, proteins and other components work together to regulate a property or a behavior."

"We now have a long-awaited base model where new higher level regulatory factors, such as transcription factors (regulatory proteins), regulatory RNAs and others important to growth and adaptation, can be incorporated to continuously improve the predictability and extend the application of the model," says Chiang, longtime leader of the Forest Biotechnology Group. "So our next step is the production of large varieties for field testing to acquire these important regulatory factors and to produce enough wood to identify their application specificity."

Credit: 
North Carolina State University

New testing of model improves confidence in the performance of ITER

image: Physicists Brian Grierson of PPPL and Gary Staebler of General Atomics.

Image: 
Shaun Haskey

Scientists seeking to bring fusion -- the power that drives the sun and stars -- down to Earth must first make the state of matter called plasma superhot enough to sustain fusion reactions. That calls for heating the plasma to many times the temperature of the core of the sun. In ITER, the international fusion facility being built in France to demonstrate the feasibility of fusion power, the device will heat both the free electrons and the atomic nuclei -- or ions -- that make up the plasma. The question is, what will this heating mix do to the temperature and density of the plasma that are crucial to fusion production?

New research indicates that understanding the combined heating could show how to improve the production of fusion in ITER and other next-generation fusion facilities -- a key finding of physicists at the U.S. Department of Energy's (DOE) Princeton Plasma Physics Laboratory (PPPL), the DIII-D National Fusion Facility that General Atomics operates for the DOE, and other collaborators. "This shows what happens when electron heating is added to ion heating," said PPPL physicist Brian Grierson, who led testing of a computer model that projected the DIII-D results to ITER.

The model, created by Gary Staebler of General Atomics and reported in a paper in Physics of Plasmas with Grierson as first author, investigated the DIII-D experimental results in conditions mimicking those expected in ITER. Diagnostics supplied by the University of Wisconsin-Madison and the University of California, Los Angeles measured the resulting turbulence, or random fluctuations and eddies, that took place in the plasma.

Multiscale turbulence

The measurements revealed turbulence with short-to-long wavelengths caused by electron and ion heating, respectively. The combination produced "multiscale" turbulence that modified the way particles and heat leak from the plasma. Turbulence can reduce the rate of fusion reactions.

The combined electron and ion heating altered the gradient, or spatial rate of change in the plasma density. This finding was significant because the fusion power that ITER and other next-generation tokamaks produce will increase as the density grows greater. Moreover, the increase took place without causing impurities to accumulate in the core of the plasma and cool it down, which could halt fusion reactions.

The scientists used a "reduced physics" model called TGLF that simplified the massively parallel and costly simulations of multiscale turbulence that require millions of hours of computing time on supercomputers. The researchers ran this simplified version hundreds of times on PPPL computers to test the impact on the model of uncertainties stemming from the DIII-D experiments.

"The TGLF model exploits the weak turbulence properties of tokamaks like ITER," said Staebler. "It approximately computes the plasma transport billions of times faster than a gyrokinetic multiscale turbulence simulation run on high-performance supercomputers."

Impact of electron heating

The model looked specifically at the impact of electron heating on the overall heating mix. Researchers produce such heating by aiming microwaves at the electrons gyrating around magnetic field lines -- a process that increases the thermal energy of the electrons, transfers it to the ions through collisions, and supplements the heating of the ions by neutral beam injection.

Results indicated that studying multiscale turbulence will be essential to understanding how to deal with the multiscale effect on the transport of heat, particles and momentum in next-generation tokamaks, or fusion devices, Grierson noted. "We need to understand transport under ion and electron heating to confidently project to future reactors," he said, "because fusion power plants will have both types of heating."

Credit: 
DOE/Princeton Plasma Physics Laboratory

Fat cells seem to remember unhealthy diet

video: Immature fat cells have their development damaged when exposed to the fatty acid palmitate or the hormone TNF-alpha, a new study shows. The researchers from Novo Nordisk Foundation Center for Basic Metabolic Research hope this new knowledge may be used to develop new preventive strategies for diabetes.

Image: 
University of Copenhagen

24 hours. That is all it takes for a so-called precursor fat cell to have its 'epigenetic recipe' - the instructions for how to develop correctly into a mature fat cell - reprogrammed. This change occurs when the cell is put into contact with the fatty acid palmitate or the hormone TNF-alpha, a study conducted by researchers from the Novo Nordisk Foundation Center for Basic Metabolic Research at the University of Copenhagen shows.

Precursor cells are cells that have not yet matured to undertake a specific function in the body, e.g. the function of a muscle or fat cell. Palmitate and TNF-alpha are able to disturb the development of the cell, causing it to develop into a dysfunctional fat cell later in its life. In particular, this reprogramming is found in obese patients suffering from type 2 diabetes, the researchers have found.

We are exposed to palmitate through the food we eat. Especially foods containing large amounts of saturated fat such as dairy products, meat and palm oil are plentiful in palmitate. TNF-alpha is an inflammatory hormone that is secreted in the body during illness. Obese patients also have a higher level of TNF-alpha, as obesity is linked to inflammation.

'Our results stress the importance of a healthy diet and lifestyle for our metabolic health in the years to come. To a large extent, a healthy diet and healthy lifestyle can help prevent the reprogramming of our precursor cells. In the long term, we hope our study may be at the origin of new strategies to reverse the abnormal programming of fat precursor cells, making them healthy and functional once again', says Romain Barrès, who headed the study.

Environmental Factors Play a Key Role

Within so-called epigenetic research, several studies have suggested that human precursor cells have a memory of past environmental exposures. But until now, no one had been able to identify the factors affecting the reprogramming of precursor fat cells or establish how quickly the cells are reprogrammed.

In cooperation with the Surgical Gastroenterology unit at Hvidovre Hospital, the researchers collected fat tissue from 43 planned operations. 15 patients were lean, 14 were obese and 14 were obese and suffered from type 2 diabetes. By collecting samples from three different groups of patients, the researchers were able to compare the health of the precursor fat cells of the three groups.

The team learned that the cells from the group of obese patients suffering from type 2 diabetes had been reprogrammed and therefore did not function like normal, healthy fat cells. By exposing healthy precursor fat cells to the two external factors for just 24 hours the researchers were able to mimic the reprogramming they had observed in cells from the diabetic patients.

A Promising Research Area

The researchers are unsure whether it is possible to reverse the programming to make the cells healthy and functional again. And even if it is possible, they do not know how to do it.

'We now know that precursor cells can be reprogrammed in such a way that function is impaired at the final stage of their development, but so far no one has discovered how to reverse the process. But it is a promising field', says Romain Barrès.

The study, 'Preadipocytes from obese humans with type 2 diabetes are epigenetically reprogrammed at genes controlling adipose tissue function' has been published in the International Journal of Obesity.

Fact Box:

Imagine that the cells in our organs have a recipe book with programmes telling them how to develop into mature cells. The programmes tell the cells within our fat tissue which pages of the book to read in order to develop into fat cells. However, exposure to large amounts of palmitate or TNF-alpha causes the cookbook to change. Now the cells do not have access to all the pages of the book, and they will not be able to develop in the right way. They still develop into fat cells, but they are dysfunctional.

Credit: 
University of Copenhagen - The Faculty of Health and Medical Sciences

Immune diversity among the KhoeSan population

AURORA, Colo. (April 20, 2018) - A new study of the KhoeSan of Southern Africa has improved the understanding of immune diversity among the oldest surviving indigenous population in the world.

By analyzing genes of two distinct groups of the KhoeSan, investigators were able to find a level of diversity and divergence in immune cell repertoires much higher than identified in any other population. The findings are described in an article published this month in The Journal of Immunology.

"This study enhances our knowledge of the timing and causes of the selection pressures that contribute to the genetic diversity and may help us understand immune cell functions in disease resistance and reproductive success," said senior author Paul J. Norman, PhD, who recently joined the CU School of Medicine and the Colorado Center for Personalized Medicine.

The scientists examined how certain cell receptors and their ligands affect the function of "natural killer" cells, a type of immune cell that is critical both in defense against pathogens and in placental development during reproduction.

The genomes of the KhoeSan population have become a major focus for studies of human ancestry and evolution because the KhoeSan hold a pivotal position in human history.

KhoeSan refers to a broadly dispersed set of human populations indigenous to southern Africa who speak a distinct family of languages characterized by "click" consonants. Studies of Y chromosome, mitochondrial, and autosomal DNA show the KhoeSan as the most genetically variable of human populations.

By studying variation in the receptors and ligands of the immune gene families in two groups of the KhoeSan, the scientists found "the highest diversity and divergence of polymorphic NK cell receptors and ligands observed to date." On top of this diverse background, they found that distinct genetic variants had risen to high frequency independently in the two KhoeSan groups. Surprisingly, each of these variants has the same dramatic impact on natural killer cell function, and likely combats disease specific to this geographic area. This knowledge can be used to manipulate natural killer cell functions in future studies and disease treatment.

The detailed review of genetic factors is a primary goal of personalized medicine. Clinicians and researchers use detailed information about molecular functions to study patterns across populations and to evaluate disease risks and potential treatments for individuals. In 2014, the University of Colorado Anschutz Medical Campus, along with its partners UCHealth, Children's Hospital and CU Medicine, invested in the creation of the Colorado Center for Personalized Medicine.

Another goal is broad collaboration among scientists and clinicians. For this paper in the Journal of Immunology, there are 16 authors listed, including Christopher Gignoux, PhD, associate professor in the Division of Biomedical Informatics and Personalized Medicine at the CU School of Medicine.

Credit: 
University of Colorado Anschutz Medical Campus

E. coli's internal bomb may provide novel target for treatment strategy

Madrid, Spain: Bacteria's internal bomb, the so-called toxin-antitoxin (TA) system that is part of the normal bacterial makeup, may be triggered to make bacteria turn on themselves, providing a valuable target for novel antimicrobial approaches in drug design, according to research presented at the 28th European Congress of Clinical Microbiology and Infectious Diseases (ECCMID) [1].

The TA system comprises genes that encode both a toxin and the toxin's antidote. Presenting author Dr Marcin Równicki from the Centre of New Technologies at the University of Warsaw, Poland demonstrated a potential means to stop the growth of Escherichia coli by targeting its TA system, called mazEF.

"Our findings suggest that mazEF toxin-antitoxin system is a potent and sensitive target or antisense peptide nucleic acids to inhibit E. coli growth," Równicki said. "The greatest strength of the proposed strategy is that it may work as a selective inhibitor and can precisely inhibit the growth of one type of bacteria. Also, the proposed strategy is universal and can be adapted to work on all bacteria with toxin-antitoxin systems."

Trylska's team, in which Równicki works, tested two strategies for setting off this internal bomb. The first was to stop production of the antitoxin (mazE). The second was to interfere with a gene (thyA) whose disruption would indirectly trigger the toxin-antitoxin system. The team used the Mfold web server, a widely used tool for predicting nucleic acid folding, to predict the secondary structures of the mazE and thyA transcript regions they hoped to target.
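To give a feel for this kind of analysis, the sketch below predicts the secondary structure of a candidate target window. It is not the authors' Mfold workflow: it uses the ViennaRNA Python bindings as a stand-in, and the sequence is a made-up placeholder rather than the real mazE or thyA transcript.

    # Minimal sketch: predict the secondary structure of a candidate mRNA window.
    # Assumption: the ViennaRNA Python bindings stand in for the Mfold web server
    # used in the study; the sequence below is a made-up placeholder, not mazE or thyA.
    import RNA

    candidate_window = "GGAUACGGAUUCACUGGAACUACAGGAACCUUUAGC"  # hypothetical 36-nt window

    structure, mfe = RNA.fold(candidate_window)  # dot-bracket string + minimum free energy

    print("Structure:", structure)
    print("Minimum free energy: %.2f kcal/mol" % mfe)

    # Windows with little predicted base pairing (many '.') are generally more
    # accessible to an antisense oligomer such as a PNA.
    unpaired_fraction = structure.count(".") / len(structure)
    print("Unpaired fraction: %.2f" % unpaired_fraction)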

To activate the E. coli TA system, the researchers relied on antisense peptide nucleic acid (PNA) oligomers. PNAs are artificial polymers that mimic DNA and can be used to alter the expression of genes. They attached the two PNAs, anti-mazE and anti-thyA, to a cell-penetrating peptide, a short chain of amino acids that carried the PNAs across the E. coli cell envelope.
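As a rough illustration of the design step, an antisense oligomer is essentially the antiparallel complement of the mRNA window it is meant to block. The short sketch below is a generic example, not the sequences used in the study; the target window is a made-up placeholder.

    # Minimal sketch of antisense design: the oligomer is the antiparallel
    # complement of the targeted mRNA window. PNAs carry DNA-like bases, so the
    # complement of A is written as T. The target window is a made-up placeholder,
    # not the actual mazE or thyA sequence.
    COMPLEMENT = {"A": "T", "U": "A", "G": "C", "C": "G"}

    def antisense(target_mrna):
        """Return the antiparallel (reverse) complement of an RNA window, 5'->3'."""
        return "".join(COMPLEMENT[base] for base in reversed(target_mrna.upper()))

    target_window = "AUGGAAACGCAUUAC"  # hypothetical 15-nt window around a start codon
    print("Target (5'->3'):    ", target_window)
    print("Antisense (5'->3'): ", antisense(target_window))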

The team then examined the decay of messenger ribonucleic acid (mRNA), which carries genetic information copied from DNA, and tested how the PNAs interacted with three antibiotics (polymyxin B, trimethoprim and sulfamethoxazole). They found that both strategies produced concentration-dependent growth inhibition of the tested E. coli, meaning the higher the concentration, the greater the effect on the bacteria. There was also a synergistic effect with two of the antibiotics, polymyxin B and trimethoprim.
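Synergy between two antimicrobial agents is commonly quantified with the fractional inhibitory concentration (FIC) index from a checkerboard assay. The sketch below shows only that standard calculation; the MIC values are invented placeholders, not data from this study.

    # Minimal sketch of the standard FIC-index calculation used to call synergy
    # in checkerboard assays. The MIC values below are invented placeholders,
    # not results from the Warsaw study.
    def fic_index(mic_a_alone, mic_a_combo, mic_b_alone, mic_b_combo):
        """FIC index = MIC_A(combo)/MIC_A(alone) + MIC_B(combo)/MIC_B(alone)."""
        return mic_a_combo / mic_a_alone + mic_b_combo / mic_b_alone

    # Hypothetical MICs (same units for each agent), e.g. a PNA plus polymyxin B.
    fici = fic_index(mic_a_alone=16, mic_a_combo=2, mic_b_alone=8, mic_b_combo=1)
    print("FIC index: %.2f" % fici)
    print("synergy" if fici <= 0.5 else "no synergy by the usual cutoff")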

"The antisense oligonucleotide can be designed to broaden the spectrum of activity and inhibit the growth of different types of bacteria simultaneously," Równicki explained. "Moreover, if mutations of the mRNA target appear, the sequence of the antisense oligonucleotide could be quickly redesigned to overcome resistance."

Most new antibiotics belong to already established classes and hit the same cellular targets, so they are often subject to at least some of the resistance mechanisms that defeated earlier members of the class. For that reason, finding new targets is fundamental to antimicrobial development. Antisense-based drugs are part of a growing number of pharmaceutical and biotech programmes aimed not only at infectious diseases but also at some types of cancer.

"We have provided proof-of-concept for the exploitation of toxin-antitoxin systems in antibacterial strategies," Równicki said. "The proposed strategy may be important in the future when designing new classes of antibiotics, but our research is in the early stage and further investigation is a must."

Credit: 
European Society of Clinical Microbiology and Infectious Diseases