Immersive virtual reality therapy shows lasting effects in treating autism-related phobias

Image: The Blue Room displaying a bus scenario to help people with autism overcome their fears and phobias. (Credit: Third Eye NeuroTech and Newcastle University)

Virtual reality has been shown to help children with autism overcome fears and phobias, with nearly 45% remaining free of them six months after treatment.

A separate study, to be published tomorrow, shows for the first time that the treatment works for some autistic adults.

The Blue Room, developed by specialists at Newcastle University working alongside innovative technology firm Third Eye NeuroTech, allows the team to create a personalised 360-degree environment involving the fear that may debilitate the person with autism in real life.

Within this virtual environment, which requires no goggles, the child can comfortably investigate and navigate various scenarios, working with a therapist via iPad controls while remaining in full control of the situation.

The research, funded by the National Institute for Health Research (NIHR), is published today in the Journal of Autism and Developmental Disorders.

"For many children and their families, anxiety can rule their lives as they try to avoid the situations which can trigger their child's fears or phobia," says Professor Jeremy Parr, who led the study.

"To be able to offer an NHS treatment that works, and see the children do so well, offers hope to families who have very few treatment options for anxiety available to them."

Autism can affect a child's learning and development, often resulting in impaired social and communication skills. Many children with autism also have fears or phobias that can be very distressing but are often overlooked; these phobias are thought to affect around 25% of children with autism. Phobias in the trial included travelling on public transport, school classrooms, dogs and balloons.

The Newcastle University experts describe a randomised controlled trial involving 32 children with autism aged 8 to 14 years. Half received treatment in the Blue Room straight away and half acted as a control group, receiving delayed treatment six months later.

Accompanied by a psychologist, the children underwent four sessions in a week, each involving a personalised scenario in the Blue Room. Parents were able to watch the treatment via a video link.

"People with autism can find imagining a scene difficult which is why the Blue Room is so well-received. We are providing the feared situation in a controlled way through virtual reality and we are sitting alongside them to help them learn how to manage their fears," explains Dr Morag Maskey, researcher from the Institute of Neuroscience, Newcastle University.

"It is incredibly rewarding to see the effect it can have for some, overcoming a situation which just a week previously would have been so distressing."

After receiving the treatment and with the support of their parents, the children were then introduced to the scenario in the real world.

Two weeks after treatment, the research shows that four of the first 16 children (25%) had responded and were able to cope with their specific phobia. The effect persisted, with a total of six showing improvement after six months (38%); however, one child reported a worsening of their phobia. Meanwhile, in the control group, five untreated participants had become worse over the six months.

The control group went on to be treated in the Blue Room after this time. Overall, 40% of the children treated showed improvement at two weeks, and 45% at six months.

This improvement is comparable with other treatments, and the team intend to examine further why some children do not respond.

For the first time, the Blue Room treatment was offered to autistic adults. In a separate publication in Autism in Adulthood by the same team, the VR treatment was shown to work in five out of eight autistic adults.

Aged 18 to 57, the adults received four 20-minute sessions in the Blue Room with a personalised computer-generated scene. Six months after the sessions, five of the eight participants still had real-life, day-to-day improvements in relation to their phobia.

NHS treatment is available to UK families through the Complex Neurodevelopmental Disorders Service at Northumberland, Tyne and Wear NHS Foundation Trust: https://www.ntw.nhs.uk/resource-library/complex-neuro-developmental-service-cnds/

Dr Rajesh Nadkarni, Executive Medical Director at Northumberland, Tyne and Wear NHS Foundation Trust, said: "We are proud to be a partner of the Newcastle Blue Room Treatment which is helping people with autism to manage their anxiety. Northumberland, Tyne and Wear NHS Foundation Trust has a strong track record in providing nationally recognised autism services, and we welcome this new research demonstrating the positive effects of this highly innovative treatment."

Eddie Nelson is Director of Third Eye NeuroTech, (http://thirdeye.tv/index.html ) the immersive reality technology company based in County Durham which provides the Blue Room facility. He says: "It is rare as a business that we get the chance to help young people and their families in such a dramatic and tangible way. But what we see with the Blue Room is very anxious young people and adults coming in, yet within four of these specialised sessions they come out having combatted their fears."

Specific phobias addressed in the Blue Room treatment included: dogs, wasps/bees, lifts, the dark, flying, dolls, balloons, public transport, school and walking into rooms.

Alongside the NHS service, the Newcastle University team are continuing further research into the effectiveness and lasting effects of the Blue Room.

Credit: 
Newcastle University

Tiny satellites reveal water dynamics in thousands of northern lakes

Image: A new study uses CubeSats to measure short-term changes in northern hemisphere lakes. The study region includes (clockwise from top left): Mackenzie River Delta, Northwest Territories, Canada; Canadian Shield, north of Yellowknife, Northwest Territories, Canada; Yukon Flats, Alaska; and Tuktoyaktuk Peninsula, Northwest Territories, Canada. (Credit: Planet)

PROVIDENCE, R.I. [Brown University] -- Using an army of small satellites, researchers have shown that water levels in small lakes across northern Canada and Alaska are far more variable during the summer than previously thought. The findings, published in Geophysical Research Letters, could have implications for how scientists calculate the natural greenhouse gas emissions from these northern lakes.

The study used images taken by a network of more than 150 CubeSats -- small satellites about the size of shoeboxes -- which made nearly daily observations of more than 85,000 small North American lakes during the summer of 2017. The images enabled the researchers to see how the lakes changed over time. They found small but significant shoreline changes in individual lakes that added up to hundreds of square kilometers of lake area change across the study region.

"There's been a lot of research on climate-driven changes in lake area, but it's mainly focused on long-term changes," said Sarah Cooley, a Ph.D. student at Brown University and the study's lead author. "This is the first time that anyone has looked at fine-scale, short-term changes, and we found that there's much more variability within a season than expected."

The study area captures a substantial swath of Arctic tundra and boreal forest, a biome that circles the Earth's northern hemisphere in a band from about 50 to 70 degrees north latitude. The region is home to critical forest and tundra ecosystems as well as the planet's highest density of lakes, so understanding its hydrology is scientifically important. One reason for that is that boreal lakes are a significant source of natural greenhouse gas emissions. Their sediments contain tons of organic carbon, which washes in from the surrounding landscape. Some of that carbon then decomposes and is emitted into the atmosphere as the greenhouse gases carbon dioxide and methane.

This new finding of substantial summer shoreline fluctuation has implications for how scientists calculate these emissions, the researchers say. That's because shoreline areas where water ebbs and flows from season to season are known hotspots for greenhouse gas production and emission. But estimates of lake emissions generally assume shorelines to be stable within each season. The finding of surprising within-season shoreline fluctuation, the researchers say, suggests that current models may underestimate emissions from boreal lakes.

"A shoreline that's fluctuating is going to emit more carbon than a stable shoreline," Cooley said. "These short-term fluctuations, which no one had ever mapped before, suggest these lakes are potentially emitting more gas than people thought."

Another finding that surprised the research team was the large overall importance of shoreline fluctuations on the ancient Canadian Shield, a rocky, wet landscape in central Canada where millions of small lakes cover 20 percent of the land surface.

"Previous studies assumed lakes in this area to be relatively stable," said Laurence C. Smith, a co-author of the study and project leader for NASA's Arctic-Boreal Vulnerability Experiment, which helped fund the study. "To our surprise, the high-resolution, high-frequency imaging afforded by CubeSats revealed that small shoreline fluctuations in this lake-rich area sum to impressively large numbers."

In all, the study explored four sub-areas of the North American Arctic and sub-Arctic and found the little-studied Canadian Shield to be the most dynamic of all, with about 1.4 percent of its landscape seasonally inundated by small fluctuations in lake levels.

Big data, small satellites

Another takeaway from the study, Cooley says, is that it shows the power of CubeSats to acquire data that larger satellites can't gather.

"What I'm most excited about from a science perspective is the ability to make use of this new CubeSat imagery," Cooley said. "We couldn't have made these observations without the CubeSats, and here we show that it's possible to extract valuable scientific information from those images."

Large space agency satellites festooned with sensitive scientific instruments can gather all sorts of information, but simply don't make enough overhead passes to catch changes that occur over short periods of time. And the satellites that do pass over on a daily basis lack the camera resolution to make fine-scale observations of lake area.

The CubeSats, recently launched by a company called Planet, offered a potential solution. The company operates more than 150 satellites, which orbit the Earth in an arrangement that enables them to image Earth's entire landmass each day as the planet rotates beneath them. And while the tiny satellites lack sophisticated scientific equipment, they do have high-powered cameras capable of capturing images with 3-meter resolution.

But the CubeSat data present some unique challenges, Cooley says. For example, the location data from CubeSats tend not to be as precise as those from space agency satellites. And the CubeSat images lack filters that make them easier to analyze. NASA or European Space Agency (ESA) satellite data are filtered to eliminate images taken on cloudy days or other low-quality images.

So Cooley had to design her own system to compensate for those issues. For the study, she trained a machine learning algorithm to spot anomalous data patterns and throw them away. For example, instances in which a lake suddenly disappears in a day only to return to view a few days later are most likely due to cloud cover or glitchy observations, not an actual lake disappearance. The algorithm could flag such instances and remove them from the data.
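
The study's trained classifier is not described in detail in this article, but the underlying idea can be sketched. Below is a minimal, rule-based illustration in Python of the kind of filtering described; the one-day rule and the threshold values are assumptions for illustration only, not the study's actual method.

```python
# Minimal sketch: flag implausible one-day lake "disappearances" in a
# daily lake-area time series (hypothetical data and thresholds; the
# study itself used a trained machine-learning classifier).
import numpy as np

def flag_glitches(areas, drop_frac=0.8):
    """Return a boolean mask marking observations that look like cloud
    cover or sensor glitches: a sudden large drop in lake area that
    recovers by the next observation."""
    areas = np.asarray(areas, dtype=float)
    flags = np.zeros(len(areas), dtype=bool)
    for i in range(1, len(areas) - 1):
        prev_, cur, next_ = areas[i - 1], areas[i], areas[i + 1]
        # A lake that "loses" most of its area for a single day and then
        # reappears is almost certainly an artifact, not real change.
        if cur < (1 - drop_frac) * prev_ and next_ > 0.8 * prev_:
            flags[i] = True
    return flags

daily_area_km2 = [1.20, 1.19, 0.05, 1.18, 1.17, 1.21]  # toy series
print(flag_glitches(daily_area_km2))  # [False False  True False False False]
```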

Using that algorithm, Cooley and her colleagues were able to sift through more than 25 terabytes of CubeSat data. Cooley says she expects more interesting earth science findings to come from CubeSats in the coming years.

"I see this as a beginning of a new period in remote sensing, in that suddenly all sorts of earth observations that may not have been possible before will become possible with these small, simple satellites," Cooley said.

Credit: 
Brown University

The ways of wisdom in schizophrenia

While wisdom is closely linked to improved health and well-being, its role and impact among persons with schizophrenia, possibly the most devastating of mental illnesses, are not known.

In a new paper, publishing February 14, 2019 in Schizophrenia Research, researchers at University of California San Diego School of Medicine report that, on average, persons with schizophrenia (PwS) obtained lower scores on a wisdom assessment than non-psychiatric comparison participants (NPCPs), but that there was considerable variability in levels of wisdom. Nearly one-third of PwS had scores in the "normal" range, and these PwS with higher levels of wisdom displayed fewer psychotic symptoms as well as better cognitive performance and everyday functioning.

"Taken together, our findings argue for the value of assessing wisdom in persons with schizophrenia because increasing wisdom may help improve their social and neuro-cognition, and vice versa," said senior author Dilip Jeste, MD, Distinguished Professor of Psychiatry and Neurosciences and director of the Stein Institute for Research on Aging at UC San Diego School of Medicine.

"There is a concept of 'wellness within illness,'" said Jeste. "Our findings support the hypothesis that wisdom and schizophrenia co-exist in a proportion of these patients, specifically those functioning at a higher level. Furthermore, the data suggest that treatments which enhance positive psychological traits, such as wisdom, may promote health and well-being in persons with schizophrenia.

"We have a tendency in medicine to focus attention on remediating symptoms and impairments to the exclusion of individual strengths. A sustained effort to assess and enhance positive traits in a person with severe mental illness, such as their levels of wisdom, happiness and resilience, might do much to improve the quality of their lives."

In this study, 65 stable adult outpatients with a diagnosis of chronic schizophrenia or schizoaffective disorder and 96 NPCPs completed a commonly used, three-dimensional wisdom scale (3D-WS) that includes measures of cognitive, reflective, and affective (relating to emotion) wisdom.

While people with schizophrenia had lower average scores on the wisdom scale than NPCPs, those PwS with higher wisdom scores performed better on neurocognitive and functional assessments than those with lower scores. Indeed, level of wisdom positively correlated with performance on multiple neurocognitive tests in PwS; no such relationships were detected in NPCPs.

The researchers also found that the reflective domain of wisdom -- the ability to look inward accurately and without bias or to recognize others' perspectives -- was significantly related to the mental health and well-being of PwS, while levels of cognitive and affective wisdom appeared to have no such correlative effect.

"In healthy adults, wisdom seems to reduce the negative effects of adverse life events. It boosts a sense of well-being, life satisfaction and overall health," said first author Ryan Van Patten, a postdoctoral scholar in the UC San Diego School of Medicine Department of Psychiatry. "In persons with schizophrenia, wisdom is also connected to levels of happiness, resilience, and subjective recovery."

In recent years, there has been increasing empirical evidence that wisdom has a biological basis and can be modified and enhanced through a variety of interventions. This study, said Jeste, supports the idea that therapies which promote wise behavior in persons with schizophrenia can also boost their broader capacity to function in society, more healthfully and with greater happiness and satisfaction.

Credit: 
University of California - San Diego

Oldest Americans most focused on reducing food waste

Image: © MSU AgBioResearch

WASHINGTON, D.C. -- The vast majority of Americans are paying attention to reducing food waste with the oldest being the most cognizant, according to the latest Michigan State University (MSU) Food Literacy and Engagement Poll.

The fourth wave of this poll, conducted Jan. 15-21, 2019, surveyed 2,090 Americans on their attitudes and knowledge of food issues. The results were released today at the American Association for the Advancement of Science Annual Meeting.

The majority of all Americans (88 percent) say they take steps to reduce food waste at home. This includes 94 percent of consumers age 55 and older and 81 percent of those under 30 years old.

Among respondents who make efforts to reduce food waste:

Seventy-one percent said they try not to purchase excess food.

Seventy-one percent said they often consume food before it spoils.

Thirty-four percent share excess food when possible.

Of the 12 percent of Americans who say they do not take steps to reduce food waste at home:

Thirty-one percent say they do not waste food.

Twenty-three percent are not familiar with the term "food waste".

Twenty-one percent do not know how to reduce food waste.

Twenty percent are not concerned about it.

Eighteen percent do not have the time.

This fourth wave of the survey revealed that 41 percent of Americans correctly recognize that 31-50 percent of the food annually produced in the United States goes to waste, including 44 percent of those age 55 and older and 36 percent of those under 30 years old.

"Older Americans pay the closest attention to limiting food waste compared to their peers," said Sheril Kirshenbaum, co-director of the MSU Food Literacy and Engagement Poll. "Previous waves of the survey have revealed this group also performs best on general food literacy questions."

Additional survey highlights include:

Forty-eight percent of Americans say they never, rarely, or aren't sure how often they consume genetically modified organisms, often called GMOs.

Forty-nine percent say they never or rarely seek information about where their food was grown or how it was produced, with an additional 15 percent responding once a month.

Forty-one percent would be willing to buy a GMO-derived fruit or vegetable that stayed fresh longer than currently available produce.

"These findings continue to expand our insights into the attitudes and behaviors of consumers," said poll co-director Doug Buhler, director of MSU AgBioResearch. "Given the challenges ahead in feeding more people while preserving our natural resources and protecting our climate, getting a handle on the causes and remedies of food waste is key to meeting global food demand. It takes months to produce food, but we can waste it in an instant."

For more information about the MSU Food Literacy and Engagement Poll, visit food.msu.edu/poll.

Data from the MSU Food Literacy and Engagement Poll were weighted using U.S. Census Bureau figures, to ensure the sample's composition reflects the actual U.S. population. Launched in 2017, the poll was developed by Food@MSU and is supported by MSU AgBioResearch. The survey, conducted twice per year, is intended to provide an objective, authoritative look at consumer attitudes and perspectives on key food issues, and is designed to help inform national discussion, business planning and policy development.
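
As an illustration of the weighting technique described above, the sketch below shows simple post-stratification in Python: each respondent is weighted by the ratio of their group's population share to its sample share. The age groups and shares are made up for illustration; the poll's actual strata and Census figures are not given here.

```python
# Illustrative post-stratification weighting sketch (hypothetical strata).
import pandas as pd

sample = pd.DataFrame({
    "age_group": ["18-34"] * 300 + ["35-54"] * 400 + ["55+"] * 300,
})

# Hypothetical population shares (in practice, U.S. Census Bureau figures).
population_share = {"18-34": 0.30, "35-54": 0.33, "55+": 0.37}

sample_share = sample["age_group"].value_counts(normalize=True)
# Weight = population share / sample share, so weighted totals match Census.
sample["weight"] = sample["age_group"].map(
    lambda g: population_share[g] / sample_share[g]
)
print(sample.groupby("age_group")["weight"].first())
```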

Credit: 
Michigan State University College of Agriculture & Natural Resources

Risk Analysis releases special issue on the social science of automated driving

Risk Analysis: An International Journal has published a special issue, "Social Science of Automated Driving," featuring several articles that examine the human side of automated driving, focusing on questions about morality, the role of feeling, trust and risk perceptions.

Autonomous vehicles are more than just an improvement on existing vehicles; they are a brand new technology. The widespread acceptance and adoption of autonomous vehicles hinges less on the technological challenges of creating the vehicles and more on the attitudes and perceptions of the people the technology is intended to serve. Public uncertainty raises significant societal questions about safety, infrastructure spending, regulations, insurance law and more. This collection of papers underscores the key roles of consumer attitudes and perceptions of risk in understanding acceptance of autonomous vehicles.

The issue begins with an article that explores whether an autonomous vehicle should swerve or stay in its lane when confronted with a situation in which either action could result in a collision with a pedestrian. In two empirical studies, Meder and his colleagues found that most people generally preferred that the autonomous vehicle default to staying in its travel lane, especially when the likelihood of collision was unknown. This preference held up even in hindsight, when a hypothetical accident had already occurred.

The next article, by Liu, Yang and Xu explores the expected safety levels of automated vehicles. The authors used an expressed-preference approach to measure the acceptable level of risk as compared with human-driven vehicles. They found that people expect autonomous vehicles to be four to five times safer than human drivers and that the autonomous vehicles would have to reduce traffic fatalities by 75 percent before they would be accepted.

In a second paper Liu, Yang and Xu investigated the role of social trust and risk/benefit perceptions in the public acceptance of automated driving. The researchers employed a survey to measure three facets of acceptance: general acceptance of automated driving; willingness to pay for automated vehicles; and the intention to use, purchase or recommend automated vehicles. They found that social perceptions of trust directly affected all measures of acceptance.

The study by Brell, Philipsen and Ziefle also looked at risk and benefit perceptions by using a two-step empirical approach to explore risk perceptions of connected and autonomous vehicles in comparison to conventional driving. They found that autonomous driving was perceived as riskier but that increased experience with driver assistance systems resulted in decreased perceptions of riskiness.

The special issue concludes with a study aimed at understanding how feelings related to conventional driving affect the perception and acceptance of autonomous vehicles. Raue and her colleagues explored how feelings related to traditional driving were used as information to make judgments about self-driving cars. They also found that those who had more experience with vehicle automation technologies had lower risk and higher benefit perceptions, as well as higher trust feelings with regard to autonomous vehicles.

Articles included in this special issue:

"How should autonomous cars drive? A preference for defaults in moral judgments under risk and uncertainty" by Björn Meder, Max Planck Institute for Human Development, Nadine Fleischhut, Max Planck Institute for Human Development, Nina-Carolin Krumnau, Max Planck Institute for Human Development, and Michael R. Waldmann, University of Göttingen

"How safe is safe enough for self-driving vehicles?" by Peng Liu, Tianjin University, Run Yang, Tianjin University, and Zhigang Xu, Chang'an University

"Public acceptance of fully automated driving: Effects of social trust and risk/benefit perceptions" by Peng Liu, Run Yang and Zhigang Xu

"sCARy! Risk perceptions in autonomous driving - The influence of experience on perceived benefits and barriers" by Teresa Brell, Ralf Philipsen and Martina Ziefle, RWTH Aachen University

"The influence of feelings while driving regular cars on the perception and acceptance of self-driving cars" by Martina Raue, Massachusetts Institute of Technology (MIT), Lisa A. D'Ambrosio, MIT, Carley Ward, MIT, Chaiwoo Lee, MIT, Claire Jacquillat, Carnegie Mellon University, and Joseph F. Coughlin, MIT

Credit: 
Society for Risk Analysis

What's age got to do with it?

Sophia Antipolis, 14 February 2019: It's often said: It's not how old you are, it's how old you feel. New research shows that physiological age is a better predictor of survival than chronological age. The study is published today in the European Journal of Preventive Cardiology, a journal of the European Society of Cardiology (ESC).

"Age is one of the most reliable risk factors for death: the older you are, the greater your risk of dying," said study author Dr Serge Harb, cardiologist at the Cleveland Clinic in the United States. "But we found that physiological health is an even better predictor. If you want to live longer then exercise more. It should improve your health and your length of life."

Based on exercise stress testing performance, the researchers developed a formula to calculate how well people exercise - their "physiological age" - which they call A-BEST (Age Based on Exercise Stress Testing). The equation uses exercise capacity, how the heart responds to exercise (chronotropic competence), and how the heart rate recovers after exercise.
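
The published A-BEST equation is not reproduced in this article, so the following Python sketch is purely illustrative: it shows only the general idea of offsetting chronological age by performance relative to age-expected norms. Every coefficient, norm and reference value below is invented.

```python
# Purely illustrative: the published A-BEST equation is NOT reproduced in
# this article. The coefficients, norms and functional form below are
# invented solely to show the idea of shifting chronological age by
# exercise performance relative to age-expected values.

def illustrative_physiological_age(chronological_age,
                                   exercise_capacity_mets,
                                   peak_hr_pct_predicted,
                                   hr_recovery_bpm):
    expected_mets = 10.0 - 0.06 * chronological_age  # made-up norm
    adjustment = (
        -2.0 * (exercise_capacity_mets - expected_mets)  # exercise capacity
        - 0.1 * (peak_hr_pct_predicted - 85.0)           # chronotropic competence
        - 0.2 * (hr_recovery_bpm - 15.0)                 # heart rate recovery
    )
    return round(chronological_age + adjustment, 1)

# A fit 45-year-old scores "younger"; an unfit one scores "older".
print(illustrative_physiological_age(45, 12.0, 95.0, 25.0))  # 32.6
print(illustrative_physiological_age(45, 5.0, 70.0, 8.0))    # 52.5
```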

"Knowing your physiological age is good motivation to increase your exercise performance, which could translate into improved survival," said Dr Harb. "Telling a 45-year-old that their physiological age is 55 should be a wake-up call that they are losing years of life by being unfit. On the other hand, a 65-year-old with an A-BEST of 50 is likely to live longer than their peers."

The study included 126,356 patients referred to the Cleveland Clinic between 1991 and 2015 for their first exercise stress test, a common examination for diagnosing heart problems. It involves walking on a treadmill, which gets progressively more difficult. During the test, exercise capacity, heart rate response to exercise, and heart rate recovery are all routinely measured. The data were used to calculate A-BEST, taking into account gender and use of medications that affect heart rate.

The average age of study participants was 53.5 years and 59% were men. More than half of patients aged 50-60 years - 55% of men and 57% of women - were physiologically younger according to A-BEST. After an average follow-up of 8.7 years, 9,929 (8%) participants had died. As expected, the individual components of A-BEST were each associated with mortality.

Patients who died were ten years older than those who survived. But A-BEST was a significantly better predictor of survival than chronological age, even after adjusting for sex, smoking, body mass index, statin use, diabetes, hypertension, coronary artery disease, and end-stage kidney disease. This was true for the overall cohort and for both men and women when they were analysed separately.

Dr Harb said doctors could use A-BEST to report results of exercise testing to patients: "Telling patients their estimated age based on exercise performance is a powerful estimate of longevity and easier to understand than providing results for the individual components of the examination."

Dr Harb noted that this type of approach has shown merit in specific disease areas. For example, ESC guidelines advocate using "cardiovascular risk age" - based on risk factors including smoking, blood cholesterol and blood pressure - to communicate with patients.

Credit: 
European Society of Cardiology

Study on measles transmission in China has implications for controlling the epidemic worldwide

February 14, 2019 -- A new study on the measles epidemic in China has far-reaching implications for eliminating the infection globally, according to researchers at Columbia University Mailman School of Public Health. Using a new model-inference system developed at the Columbia Mailman School, the researchers were able to estimate population susceptibility and demographic characteristics in three key locations in China, over a period that spans the pre-vaccine and modern mass-vaccination eras. Until now, the dynamics of measles transmission in these settings had been largely unknown. The findings are published online in PLOS.

Despite widespread vaccination, measles remains a leading cause of death in children globally, and elimination of measles has been particularly challenging in China, the largest country with endemic measles transmission. "While since 2008 China has reported greater than the 95 percent vaccination target coverage set by the World Health Organization for measles elimination, measles has continued to cause large outbreaks every year for reasons that remain undetermined," said Wan Yang, PhD, assistant professor of Epidemiology at the Columbia Mailman School, and lead author.

With the ground-breaking modeling tool, the Columbia Mailman researchers were able to estimate key epidemiological parameters in Beijing, Guangzhou, and Shandong during 1951-2004.

The research team collected detailed population data on yearly measles incidence and vaccination rates for the three sites. Since the data were often sparse for an entire city or province in any given year, the model-inference system enabled the investigators to study the long-term transmission dynamics of the disease and identify key epidemiological factors that may be contributing to measles' persistent transmission.

"After thoroughly validating the model-inference system, we were able to show its accuracy in estimating out-of-sample data not used in the model and then apply it to estimate epidemiological and demographical characteristics key to measles transmission during 1951-2004 for these three key locations," noted Yang.

Much of past measles research focused only on industrialized countries. The inference system that Yang and colleagues developed takes into account complex and changing population demographics, contact patterns, age structure, mass vaccination, as well as under-reporting.
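
As a rough illustration of what such a model-inference system fits, the sketch below implements a toy discrete-generation measles model in Python with births, vaccination and under-reporting. All parameter values are hypothetical, and the real system is far richer (age structure, migration, changing contact patterns); inference amounts to tuning the unknown parameters until simulated reported cases match the historical incidence series.

```python
# Toy discrete-generation measles model (biweekly steps, roughly the
# measles serial interval) with births, vaccination and under-reporting.
# All values are hypothetical; the study's model is far more detailed.

def simulate_reported_cases(years=30, N=1e7, R0=15.0, birth_rate=0.02,
                            vacc_coverage=0.0, report_rate=0.3,
                            gens_per_year=26):
    S, I = 0.06 * N, 100.0          # initial susceptibles and infectious
    reported = []
    for _ in range(years * gens_per_year):
        new_inf = min(R0 * I * S / N, S)         # next generation of cases
        births = birth_rate * N / gens_per_year  # new susceptibles
        S = S + births * (1 - vacc_coverage) - new_inf
        I = new_inf
        reported.append(report_rate * new_inf)   # only a fraction observed
    return reported

# Inference = adjusting R0, vacc_coverage, report_rate, etc. until the
# simulated series matches the observed historical incidence.
cases = simulate_reported_cases(vacc_coverage=0.95)
print(f"Simulated reported cases, final year: {sum(cases[-26:]):.0f}")
```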

Large migrant populations have made endemic measles transmission a particular problem in major Chinese cities during the most recent decade, Yang points out. Both Beijing and Guangzhou are among such cities; for example, in 2010, 36 percent of Beijing's population were migrants, of whom 8.5 percent came from Shandong, a province of moderate economic development where estimated susceptibility has been twice as high as in Beijing and Guangzhou since the mid-1990s.

"Due to uneven economic development, currently over 100 million workers in China migrate from less developed regions like Shandong to large cities for jobs. This key difference points to possible persistent measles transmission in large cities linked to China's massive migrant worker population," said Yang. "Simple assessment suggests that migrants may have been, and continue to be, a vulnerable subpopulation, contributing to the persistent transmission of measles in the big cities despite high vaccination coverage. And catch-up vaccination targeting migrant populations might be an efficient means of controlling current epidemics in these big cities."

"In addition to issues of migration, our study also revealed interesting differences in measles seasonality among the three study locations," said senior author Jeffrey Shaman, PhD, Mailman School professor of Environmental Health Sciences and director of the Climate and Health program. The literature shows, for example, in industrialized countries increased mixing among school-age children during school terms can facilitate measles transmission, and climate condition may also play a role in measles seasonality. "More specifically, winter indoor heating in cold climates in cities like Beijing and Shandong may increase crowding or reduce ventilation and increase the risk of infection during cold months."

"Our findings revealed characteristics that we believe are crucial to understanding the current persistence of measles epidemics in China and for devising future elimination strategies," said Yang.

Credit: 
Columbia University's Mailman School of Public Health

C-sections by trained health officers a safe alternative

Image: CapaCare trainees during a training session. (Credit: Magnus Endal, CapaCare)

Sierra Leone has one of the highest maternal mortality ratios in the world -- for every 100,000 live births, 1,360 women will die. In Norway, that number is just 5 women per 100,000 live births; in the US, it's 14, according to the United Nations Population Fund.

Why are so many women dying in Sierra Leone? The answer is straightforward: childbirth can be complicated, and there are simply not enough health care providers to address the needs of this small West African country of 7 million people.

Before the 2014-2016 Ebola outbreak, just 10 specialist surgeons were available in public hospitals, with about 150 doctors in total.

But there is hope: a new study of more than 1,200 women who had C-sections in nine Sierra Leone hospitals shows that community health officers with surgical training are a safe alternative to medical doctors when it comes to outcomes for Caesarean sections. The study was conducted between October 2016 and May 2017, and has just been published in BJS, the British Journal of Surgery.

"Some of these hospitals have just one or two doctors for the whole district," said the study's first author Alex van Duinen, a Dutch medical doctor and PhD candidate at the Norwegian University of Science and Technology (NTNU). "Human resources are a major problem. Task sharing -- teaching community health officers to provide basic lifesaving surgeries -- can improve that."

Building acceptance in the medical community

Van Duinen's research is being conducted as part of his work with a non-profit organization called CapaCare, co-founded in 2011 by Håkon Bolkan, a surgeon at St Olavs Hospital and a postdoc at NTNU.

CapaCare offers a two-year training programme for selected community health officers (CHO), with the goal of teaching them to do surgeries such as Caesarean sections, appendectomies and hernia repairs.

After the two years of training, graduates also spend a year as interns at one of the main hospitals in Freetown, after which they are given the position of surgical assistant community health officer (SACHO). To date, 31 participants have graduated from the programme, with another 33 in training.

But CapaCare's work is not as simple as just producing qualified graduates -- Sierra Leone's medical community, health care officials and the population as a whole have to have confidence that the CapaCare graduates can perform surgeries at least as well as medical doctors.

The evidence has to be clear and provide scientifically documented proof that the training works. That's where van Duinen's research comes in.

Prospective study with a home visit

Other researchers have looked at the issue of task sharing for emergency obstetric care, mainly in East Africa, but most of these studies have been retrospective, which means that researchers essentially look for patterns in medical data that have already been collected.

Nevertheless, all of these studies show that task sharing can work well in places where there are just not enough doctors to go around.

A prospective study like the one conducted by van Duinen and his colleagues, on the other hand, provides much stronger support for its findings.

Researchers in a prospective study decide which data to include before they collect it, which ensures data quality. The Sierra Leone C-section study also included a home visit 30 days after the birth, which allowed van Duinen and his colleagues to see how mother and child were doing in the period after discharge.

"We were able to check some of the outcome data," van Duinen said. "Home visits were one of the distinguishing factors of our study."

For example, one baby that had been recorded as a stillbirth in the hospital and one baby that had been recorded as a death were both found to be alive when the researchers did the home visits.

Three of the mothers and 28 of the babies who were alive when they left the hospital also died in the month after birth, which researchers wouldn't have discovered without this extra follow up, van Duinen said.

Not inferior to doctors

When all the numbers were tallied, the researchers found 16 postoperative maternal deaths: 15 among women treated by a doctor and one among women treated by a graduate of the CapaCare programme.

These numbers indicate that doctors may have worse outcomes. "Although our data suggest that patients treated by doctors and graduates were mostly similar, doctors may have treated more of the complicated patients," van Duinen said.

The difference in the mortality numbers also reflects the fact that there were 50 doctors in the study compared with 12 graduated clinical officers. These 50 doctors performed 831 of the C-sections included in the study, or two-thirds, compared with 443 performed by the graduates. Of all these surgeries, 85 per cent were done as emergencies.
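
From the counts quoted above, the crude 30-day maternal mortality per group works out as follows (a raw calculation only; as noted, doctors likely handled more of the complicated cases, so these crude rates are not a fair comparison):

```python
# Crude 30-day maternal mortality by provider group, using the counts
# reported in the study (this ignores case mix: doctors likely treated
# more complicated patients, so crude rates are not directly comparable).
deaths = {"doctors": 15, "SACHO graduates": 1}
operations = {"doctors": 831, "SACHO graduates": 443}

for group in deaths:
    rate = deaths[group] / operations[group] * 100
    print(f"{group}: {deaths[group]}/{operations[group]} = {rate:.2f}%")
# doctors: 15/831 = 1.81%
# SACHO graduates: 1/443 = 0.23%
```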

However, the mortality numbers and the number of emergency surgeries are "still very high," van Duinen said, but there are also reasons for that.

"There's a huge disincentive for women in the districts to come into the hospital to give birth," he said. "That's why the outcomes are so bad."

Very often, he said, women will wait until the absolute last minute to come in for care, because it may be too costly for them to both travel and get treated at the hospital. They may also be afraid of coming to the hospital because of concern about what they might find there -- it was not that long ago, for example, that Ebola ravaged the country's health care system, infecting more than 14,000 people and killing 4,000.

When the researchers analysed the numbers, they found that caesarean sections done by the graduates were not associated with higher maternal mortality 30 days after the surgery than caesarean sections performed by doctors.

"The message is that there is a slight difference in the two groups, because the doctors may get the more complicated cases, but overall we see that there is non-inferiority," van Duinen said.

Kismet in the jungle

Just by chance, van Duinen has worked with CapaCare since the organization first started its training programme in 2011. That's because he was working at the time at Masanga Hospital, in Sierra Leone's Tonkolili District, as hospital director. Masanga Hospital has been the base for CapaCare's training in Sierra Leone.

The first week he was at Masanga, van Duinen met Bolkan with the first CapaCare student who was just beginning his training.

The idea of training non-doctor medical personnel to do emergency surgeries -- task sharing -- was already something he knew about. During his medical training in the Netherlands, he was sent to Malawi for an internship, where the practice is well established.

"I saw that task sharing could actually work, and then I met Håkon, and I said, 'Yes, of course, we have to do this! It works, I have seen it!'," he said.

Credit: 
Norwegian University of Science and Technology

Recurring infections could lead to delayed bladder or kidney cancer diagnosis

Women with bladder or kidney cancer may lose out on a prompt diagnosis if they are already being regularly treated for recurring urinary tract infections (UTIs), according to new research presented at Cancer Research UK's Early Diagnosis Conference in Birmingham today (Wednesday).

The research suggests this may be because a person prone to infection is assumed to be suffering from yet another UTI rather than being investigated for potential cancer.

The analysis of 24 studies and more than 100,000 people found that up to two thirds of those with blood in their urine - a UTI symptom and a possible sign of cancer - had no check-up in the six months after their initial visit to a doctor.

UTIs are more common in women than in men. Although UTIs are not directly related to cancer, failing to get recurring infections or blood in the urine checked by a doctor could lead to a delayed cancer diagnosis, particularly among older women.

The researchers think that developing electronic tools within existing GP computer systems to flag-up patients with ongoing symptoms, or recurring UTIs in older people, could help GPs to identify who may need further investigations or a referral to a urologist.

Dr Yin Zhou, lead author from the University of Cambridge, said: "Detecting cancer early is vital for giving patients the best treatment options and improving survival. This research is an important step towards improving our understanding of why some people are diagnosed later than others.

"Although UTIs are the second most common condition that GPs are prescribing antibiotics for, in some people, symptoms of a UTI may be masking symptoms of bladder or kidney cancer. Only a small number of patients with persistent symptoms and recurring UTIs will go on to develop cancer but it's important that we don't miss them. The next step will be to find a way to detect these patients earlier."

More than 10,500 people are diagnosed with kidney cancer and around 8,500 people are diagnosed with bladder cancer each year in England. Of those, around 3,400 are diagnosed with late stage kidney cancer and 1,800 with late stage bladder cancer.

When caught early at stage I, kidney cancer patients' five-year survival is seven times greater than when diagnosed at stage IV. And more than twice as many patients survive bladder cancer for at least a year when diagnosed with stage I disease compared with stage IV.

Sara Hiom, director of early diagnosis at Cancer Research UK, said: "Early stage cancers may not always display obvious symptoms, but this research highlights the importance of tracking persistent symptoms and ensuring ongoing problems are not ignored. We continue to use the latest evidence to find new ways to support GPs and practices to ensure all patients receive an accurate diagnosis as swiftly as possible - this can make all the difference to their experience and outcome."

Dr Richard Roope, Cancer Research UK's GP expert, said: "GPs see many patients with symptoms suggestive of a urinary infection - thankfully the vast majority will never go on to develop kidney or bladder cancer. But this research shines a light on the importance of taking a step back to consider what might be causing any recurrence of symptoms, rather than assuming the diagnosis is the same as it has been before.

"There's no easy way to know which patients need to be referred or seen again. All GPs want the best for their patients so research like this, highlighting where improvements need to be made, such as arranging a review, is very useful."****

Credit: 
Cancer Research UK

Human cells can also change jobs

Image: Pseudo-islets made up of human alpha cells. These cells produce glucagon (blue) but can "learn" to make insulin (red). The GFP protein (green) allows tracing the origin of the cells, thus certifying their change of identity. (Credit: © Pedro Herrera, UNIGE)

Biology textbooks teach us that adult cell types remain fixed in the identity they have acquired upon differentiation. By inducing non-insulin-producing human pancreatic cells to modify their function to produce insulin in a sustainable way, researchers at the University of Geneva (UNIGE), Switzerland, show for the first time that the adaptive capacity of our cells is much greater than previously thought. Moreover, this plasticity may not be exclusive to human pancreatic cells. The findings, a potential revolution for cell biology, are published in the journal Nature.

The human pancreas harbours several types of endocrine cells (α, β, δ, ε and ϒ) that produce different hormones responsible for regulating blood sugar levels. These cells are bundled into small clusters, called pancreatic islets or islets of Langerhans. Diabetes occurs when, in the absence of functional β cells, blood sugar levels are no longer controlled. At the UNIGE Faculty of Medicine, Professor Pedro Herrera and his team had already demonstrated, in mice, that the pancreas has the ability to regenerate new insulin cells through a spontaneous mechanism of identity change in other pancreatic cells. But what about humans? And is it possible to promote this conversion artificially?

From one hormone to another: a long-term change

To explore whether human cells have this ability to adapt, the Geneva scientists used islets of Langerhans from both diabetic and non-diabetic donors. They first sorted the different cell types to study two of them in particular: α cells (glucagon producers) and ϒ cells (pancreatic polypeptide cells). "We divided our cells into two groups: one where we introduced only a fluorescent cell tracer, and the other where, in addition, we added genes that produce insulin transcription factors specific to β cells," explains Pedro Herrera.

The researchers then reconstructed "pseudo-islets", with only one cell type at a time to accurately study their behaviour. "First observation: the simple fact of aggregating cells, even into monotypic pseudo-islets, stimulates the expression of certain genes linked to insulin production, as if the 'non-β' cells naturally detected the absence of their 'sisters'. However, in order for the cells to start producing insulin, we had to artificially stimulate the expression of one or two key β cell genes," says Kenichiro Furuyama, a researcher in the Department of Genetic Medicine at the Faculty of Medicine of the UNIGE and the first author of this work. One week after the experiment began, 30% of the α cells were producing and secreting insulin in response to glucose. ϒ cells, under the same treatment, were even more effective and numerous in converting and secreting insulin in response to glucose.

In a second step, the researchers transplanted these monotypic pseudo-islets of modified human α cells into diabetic mice. "Human cells proved to be very effective. The mice recovered!" rejoices Pedro Herrera. "And as expected, when these human cell transplants were removed, the mice became diabetic again. We obtained the same results with cells from both diabetic and non-diabetic donors, showing that this plasticity is not damaged by the disease. In addition, this works in the long term: six months after transplantation, the modified pseudo-islets continued to secrete human insulin in response to high glucose."

Cells that are more resistant in autoimmune diabetes

A detailed analysis of these human glucagon cells that have become insulin producers shows that they retain a cell identity close to that of α cells. Autoimmune diabetes, or type 1 diabetes, is characterized by the destruction of β cells by the immune system of patients. The researchers then wondered whether these modified α cells would also be targeted by autoimmunity, since they remain different from β-cells. To test their resistance, they co-cultured them with T cells from patients with type 1 diabetes. "We found that modified α cells triggered a weaker immune response, and therefore might be less likely to be destroyed than native β cells."

Today, pancreas transplantation is performed in cases of extremely severe diabetes, by transplanting either the entire pancreas or, preferably, only pancreatic islets, a much less invasive approach. This technique is very effective, but has its limits: like any transplant, it goes hand in hand with immunosuppressive treatment. Despite this, the transplanted cells disappear after a few years. "The idea of using the intrinsic regenerative capacities of the human body makes sense here," Pedro Herrera emphasizes, "but many hurdles remain before a treatment resulting from our discovery can be proposed. We must find a way - pharmacological or by gene therapy - to stimulate this change of identity in the cells concerned within the patient's own pancreas, without causing adverse effects on other cell types," he adds. The road will be difficult and long.

This work was funded by the NIH-NIDDK (National Institute of Diabetes and Digestive and Kidney Diseases, part of the US National Institutes of Health), an excellence grant from the Swiss National Science Foundation (SNSF) and the "Fondation privée des HUG", among others.

Credit: 
Université de Genève

Decoding the human immune system

For the first time ever, researchers are comprehensively sequencing the human immune system, which is billions of times larger than the human genome. In a new study published in Nature from the Human Vaccines Project, scientists have sequenced a key part of this vast and mysterious system -- the genes encoding the circulating B cell receptor repertoire.

Sequencing these receptors in both adults and infants, the scientists found surprising overlaps that could provide potential new antibody targets for vaccines and therapeutics that work across populations. As part of a large multi-year initiative, this work seeks to define the genetic underpinnings of people's ability to respond and adapt to an immense range of disease.

Led by scientists at Vanderbilt University Medical Center and the San Diego Supercomputer Center, this advancement is possible due to the merging of biological research with high-powered frontier supercomputing. While the Human Genome Project sequenced the human genome and led to the development of novel genomics tools, it did not tackle the size and complexity of the human immune system.

"A continuing challenge in the human immunology and vaccine development fields has been that we do not have comprehensive reference data for what the normal healthy human immune system looks like," says James E. Crowe, Jr., MD, Director of the Vanderbilt Vaccine Center of Vanderbilt University Medical Center, senior author on the new paper, which was published online in Nature on Feb. 13. "Prior to the current era, people assumed it would be impossible to do such a project because the immune system is theoretically so large, but this new paper shows it is possible to define a large portion, because the size of each person's B cell receptor repertoire is unexpectedly small."

The new study specifically looks at one part of the adaptive immune system: the circulating B cell receptors responsible for producing the antibodies that are considered the main determinant of immunity in people. These receptors are formed by randomly selecting and joining gene segments, creating unique nucleotide sequences known as receptor "clonotypes." In this way, a small number of genes can lead to an incredible diversity of receptors, allowing the immune system to recognize almost any new pathogen.

Conducting leukapheresis on three adults, the researchers cloned and sequenced up to 40 billion cells to capture the combinations of gene segments that comprise the circulating B cell receptors -- achieving a depth of sequencing never before attained. They also sequenced umbilical cord blood from three infants. The idea was to collect a vast amount of data on a few individuals, rather than the traditional model of collecting only a few points of data on many.

"The overlap in antibody sequences between individuals was unexpectedly high," Crowe explains, "even showing some identical antibody sequences between adults and babies at the time of birth." Understanding this commonality is key to identifying antibodies that can be targets for vaccines and treatments that work more universally across populations.

A central question was whether the shared sequences across individuals were the result of chance, rather than the result of some shared common biological or environmental factor. To address this issue, the researchers developed a synthetic B cell receptor repertoire and found that "the overlap observed experimentally was significantly greater than what would be expected by chance," says Robert Sinkovits, Ph.D., of the San Diego Supercomputer Center at the University of California, San Diego.
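
The logic of that null comparison can be sketched briefly: draw two random "repertoires" from a large space of possible sequences, measure their overlap, and compare the observed sharing against this chance expectation. The sizes below are toy numbers, not the study's.

```python
# Sketch of a chance-overlap null model: how many clonotypes would two
# individuals share if repertoires were drawn at random? (Toy sizes;
# real clonotype spaces and repertoires are vastly larger.)
import random

def expected_chance_overlap(space_size, rep_size, trials=100):
    shared_counts = []
    for _ in range(trials):
        a = set(random.sample(range(space_size), rep_size))
        b = set(random.sample(range(space_size), rep_size))
        shared_counts.append(len(a & b))
    return sum(shared_counts) / trials

# If observed overlap between two real repertoires greatly exceeds this
# null expectation, the sharing is unlikely to be due to chance alone.
print(expected_chance_overlap(space_size=10_000_000, rep_size=10_000))
```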

As part of a unique consortium created by the Human Vaccines Project, the San Diego Supercomputer Center applied its considerable computing power to working with the multiple terabytes of data. A central tenet of the Project is the merger of biomedicine and advanced computing. "The Human Vaccines Project allows us to study problems at a larger scale than would be normally possible in a single lab and it also brings together groups that might not normally collaborate," Sinkovits says.

Continued collaborative work is now under way to expand this study, including: sequencing other areas of the adaptive immune system, such as the T cell repertoire; adding additional demographics such as supercentenarians and international populations; and applying AI-driven algorithms to further mine the datasets for insights. The goal is to continue to interrogate the shared components of the immune system to develop safer and highly targeted vaccines and immunotherapies that work across populations.

"Due to recent technological advances, we now have an unprecedented opportunity to harness the power of the human immune system to fundamentally transform human health," says Wayne Koff, Ph.D., CEO of the Human Vaccines Project. "Decoding the human immune system is central to tackling the global challenges of infectious and non-communicable diseases, from cancer to Alzheimer's to pandemic influenza. This study marks a key step toward understanding how the human immune system works, setting the stage for developing next-generation health products through the convergence of genomics and immune monitoring technologies with machine learning and artificial intelligence."

Credit: 
Human Immunome Project

Common virus in early childhood linked to coeliac disease in susceptible children

Infection with a common intestinal virus, enterovirus, in early childhood may be a trigger for later coeliac disease in children at increased genetic risk of the condition, finds a small study published in The BMJ today.

But adenovirus, another common virus, was not associated with a risk of later coeliac disease.

This preliminary finding adds new information on the role of viral infections as a potential underlying cause of coeliac disease, say the researchers.

Coeliac disease is a common digestive condition caused by an adverse reaction to gluten, a dietary protein found in wheat, barley and rye. It is believed to develop from a combination of genetics and environmental triggers.

Previous studies suggest that stomach and intestinal infections, which are common in childhood, play a role in the development of coeliac disease. But no firm conclusions have been made.

So researchers tested whether enterovirus and adenovirus infections - before the development of coeliac disease antibodies - were more common in children who were later diagnosed with coeliac disease, compared to those who were not.

Between 2001 and 2007, they recruited 220 Norwegian children who all carried both the HLA-DQ2 and HLA-DQ8 genotypes. The large majority of patients with coeliac disease carry at least one of these, which confer an increased risk of both coeliac disease and type 1 diabetes.

The researchers collected stool samples from ages 3 to 36 months to detect the viruses, and blood samples were tested for coeliac disease antibodies at ages 3, 6, 9 and 12 months, and then yearly until 2016.

After an average of nearly 10 years, 25 children were diagnosed with coeliac disease. Each child was then matched to two healthy controls.

Enterovirus was found in 370 (17%) of 2,135 stool samples, with 73 children having at least one positive sample. It was significantly more frequent in samples collected before the development of coeliac disease antibodies in cases than in controls - 84 out of 429 (20%) in cases versus 129 out of 855 (15%) in controls.
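
For illustration, the crude odds ratio implied by these counts can be computed directly (the paper itself reports results from a matched, adjusted analysis; this is only the raw arithmetic behind the comparison):

```python
# Crude odds ratio from the counts quoted above (the study's published
# estimates come from a matched, adjusted analysis; this is raw arithmetic).
import math

pos_cases, n_cases = 84, 429    # enterovirus-positive / total samples, cases
pos_ctrls, n_ctrls = 129, 855   # enterovirus-positive / total samples, controls

odds_cases = pos_cases / (n_cases - pos_cases)
odds_ctrls = pos_ctrls / (n_ctrls - pos_ctrls)
or_crude = odds_cases / odds_ctrls

# 95% CI via the standard error of the log odds ratio
se = math.sqrt(1/pos_cases + 1/(n_cases - pos_cases)
               + 1/pos_ctrls + 1/(n_ctrls - pos_ctrls))
lo = math.exp(math.log(or_crude) - 1.96 * se)
hi = math.exp(math.log(or_crude) + 1.96 * se)
print(f"Crude OR = {or_crude:.2f} (95% CI {lo:.2f}-{hi:.2f})")
# Crude OR = 1.37 (95% CI 1.01-1.86)
```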

There was a significant association between exposure to enterovirus and later risk of developing coeliac disease, but adenovirus was not linked to the development of the disease.

Enterovirus infections caught after gluten was introduced to the child's diet were associated with coeliac disease, whereas those before or at the time of introduction were not, suggesting that infection acts as a trigger only once gluten is present in the diet.

This is an observational study, so no firm conclusions can be drawn about cause, and the researchers cannot rule out the possibility that other unmeasured factors influenced the results. What's more, the number of children with coeliac disease was limited, and the findings may not be generalisable to wider genetic profiles.

But the authors point out that this is the first study of its kind to explore the link between viruses in childhood and later coeliac disease.

And the HLA-DQ2 or HLA-DQ8 gene variants are carried by nearly all "genetically susceptible" people, so the researchers believe their findings are likely to apply to a large proportion of those with coeliac disease.

With almost 40% of the population being genetically prone to coeliac disease, the authors highlight the "major problem" of identifying environmental triggers.

The authors suggest that identifying specific viruses as triggers may justify preventative strategies: "If enterovirus is confirmed as a trigger factor, vaccination could reduce the risk of development of coeliac disease," they conclude.

Credit: 
BMJ Group

Should we screen people for irregular heartbeat?

Should we screen people for irregular heartbeat (known as atrial fibrillation, or AF for short) in an effort to prevent strokes? Experts debate the issue in The BMJ today.

The prevalence of irregular heartbeat is rising significantly and is associated with increased risk of heart failure, heart attack, strokes, and potentially dementia, explains Mark Lown at the University of Southampton.

Studies have shown that screen-detected irregular heartbeat can be dangerous, and therefore the possibility of using blood thinners to prevent AF-related strokes should be considered, he argues.

In a previous study, the use of blood thinners (anticoagulation therapy) was associated with "significantly reduced adjusted risk of stroke from 4% to 1%, and the risk of death from 7% to 4% in just 1.5 years", Lown writes.
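Those percentages can be unpacked into the absolute and relative risk reductions, and the number needed to treat, that clinicians usually reason with. A minimal sketch using only the figures quoted above (the underlying study reports adjusted risks, so treat these as approximations):

```python
# Back-of-the-envelope risk arithmetic from the quoted figures.
def risk_summary(name, risk_untreated, risk_treated):
    arr = risk_untreated - risk_treated  # absolute risk reduction
    rrr = arr / risk_untreated           # relative risk reduction
    nnt = 1 / arr                        # number treated to prevent one event
    print(f"{name}: ARR = {arr:.0%}, RRR = {rrr:.0%}, NNT ≈ {nnt:.0f}")

risk_summary("stroke", 0.04, 0.01)  # 4% -> 1%: ARR 3%, RRR 75%, NNT ~33
risk_summary("death",  0.07, 0.04)  # 7% -> 4%: ARR 3%, RRR 43%, NNT ~33
```

On these numbers, anticoagulating roughly 33 screen-detected patients for 1.5 years would be expected to prevent one stroke, which is the kind of arithmetic that underpins Lown's case for screening.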

The main risk linked with AF screening is treating false positive cases (when healthy people are wrongly identified as sick), but Lown believes that trained clinicians can accurately confirm positive irregular heartbeat diagnoses from single-lead ECGs (which record the electrical activity of the heart) and further reduce the risk of false positives.

In addition, "intermittent screening together with repeated screening every few years could reduce the risk of false negative cases" (when sick people are wrongly identified as healthy).
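The trade-off Lown describes between false positives, false negatives, and clinician confirmation is ordinary predictive-value arithmetic. A minimal sketch with hypothetical numbers (the article quotes no prevalence, sensitivity, or specificity figures, so the values below are assumptions for illustration only):

```python
# Predictive-value arithmetic for a screening test, with hypothetical numbers.
def predictive_values(prevalence, sensitivity, specificity):
    tp = prevalence * sensitivity              # true positives
    fp = (1 - prevalence) * (1 - specificity)  # false positives
    fn = prevalence * (1 - sensitivity)        # false negatives
    tn = (1 - prevalence) * specificity        # true negatives
    ppv = tp / (tp + fp)  # chance a positive result is a real AF case
    npv = tn / (tn + fn)  # chance a negative result is truly AF-free
    return ppv, npv

# Assume 3% AF prevalence and a single-lead ECG algorithm that is
# 95% sensitive and 95% specific (illustrative values only).
ppv, npv = predictive_values(0.03, 0.95, 0.95)
print(f"PPV = {ppv:.0%}, NPV = {npv:.1%}")  # PPV ~37%, NPV ~99.8%
```

Even with a seemingly accurate test, most positive results in a low-prevalence population are false, which is why Lown leans on clinician confirmation of single-lead ECG traces before any treatment decision.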

Lown also suggests that single-lead ECG devices are "inexpensive, non-invasive, re-usable, and convenient" and can help to "greatly reduce workload".

And he notes that due to advances in algorithms and wearable technology, such as the Apple Watch, screening for irregular heartbeat could become a part of many people's daily routine "whether we like it or not."

But Patrick Moran at Trinity College Dublin raises concerns over important gaps in the evidence regarding the effect of screening on stroke outcomes.

While he acknowledges that there is evidence showing that screening increases the detection of irregular heartbeat, Moran argues that there are currently no studies which show that screening reduces the risk or severity of stroke, and so we don't know whether the harms inherent in any screening programme would outweigh any benefits.

In an era when the scale of overdiagnosis and overtreatment in modern medicine is becoming increasingly clear, it is unwise to assume that increased detection always translates into improved health outcomes, he adds. Nor can it be assumed that the balance of risks and benefits for those identified through screening, including the risk of bleeding from the use of blood thinners, will be the same as for those diagnosed after they develop symptoms.

Clinical trials aiming to address these important issues are already underway, and Moran stresses that "we must wait for their results rather than pushing ahead with implementing a costly public health intervention".

"From a policy perspective, there is also considerable ambiguity about how screening would be scaled up and implemented in practice," he says.

Furthermore, while Moran notes the rapid development in ECG diagnostics through apps and wearable devices, he says that these have "the potential to diminish the applicability of previous research carried out using older technology."

Like Lown, Moran believes that action is needed to combat the "looming epidemic" of irregular heartbeat.

But in the absence of reliable research confirming the health benefits of AF screening, he argues that the "growing international momentum" behind it should be "harnessed to ensure that important gaps in knowledge are filled."

Credit: 
BMJ Group

Gender and cultural bias exists against teachers at university level

Students are more likely to rate male university teachers higher than their female counterparts in some areas of STEM and Business, according to Australia's largest review of student experience surveys.

The study, published today in PLOS ONE, examined almost 525,000 individual student experience surveys from UNSW Sydney students from 2010-2016 across five faculties. It is the first study to examine the interaction between gender and cultural bias.

"These results have enormous flow-on effects for society, beyond education, as over 40% of the Australian population now go to university, and graduates may carry these biases with them into the workforce," said Associate Professor Yanan Fan, lead author on the study and statistician from UNSW Science.

The study showed that in Business and Science, a male teacher from an English-speaking background was more than twice as likely to get a higher score on a student evaluation than a female teacher from a non-English-speaking background. In Engineering there was no significant swing against female teachers, although male English-speaking teachers were 1.4 times more likely to get a higher score than teachers in all other categories. For Medicine, local students were more likely to give lower scores to female teachers from non-English-speaking backgrounds.

"In the Business and Science faculties in particular, male English-speaking teachers have the highest probability of getting the highest possible grade at six, out of six possible scores," Associate Professor Fan said.

In Arts and Social Sciences, there was no statistically significant bias against female teachers. The results suggest that where there is a larger proportion of female teachers, such as in Arts and Social Sciences, there is less bias. Bias was observed, however, against male non-English speaking background teachers when evaluated by local students.

"The results show universities must be models of equity and diversity in order to breakdown inequalities that persist in even the most progressive of workplaces," said Professor Merlin Crossley, UNSW Deputy Vice-Chancellor Academic.

"We regard student experience surveys as essential, but we have to know how to interpret the results in order understand unconscious bias and how we can bring about change. UNSW is driving a strategy that embraces diversity and we believe these biases will diminish over time. Diversity is a great strength of UNSW and we must keep celebrating it," said Professor Crossley.

Professor Crossley pointed to unconscious bias training, one of the key initiatives of UNSW's Equity, Diversity and Inclusion Board, as a program that tackles often hidden beliefs and attitudes about gender and culture.

In 2017, UNSW appointed Professor Eileen Baldry as UNSW's first Deputy Vice-Chancellor Inclusion and Diversity. One of the key objectives of the role, and of UNSW's 2025 Strategy, is achieving gender equity targets at all staff grades.

Associate Professor Fan said there was growing evidence to suggest that all aspects of employment, from hiring to performance evaluation to promotion, are affected by gender and cultural background.

"Reducing bias will have great benefits for society as university students represent a large proportion of future leaders in government and industry," said Associate Professor Fan.

Dean of Science at UNSW and co-author of the study, Professor Emma Johnston, says encouraging more women at the professorial level, in leadership positions and in membership of key committees will help shrink these biases.

"We need to continue to support women at all levels of academia in STEM across Australia, in order to smash stereotypes that create the partiality that exists within our community."

"We have clear targets for more gender diversity, particularly at the top. The Science Equity, Diversity and Inclusion working group is crucial for improving the number of women at all levels and helping to remove unconscious bias in performance evaluation," said Professor Johnston.

Last year, UNSW appointed five new Diversity Champions who will act as advocates, lead diversity working groups and identify ways UNSW can achieve its goal of becoming a global leader in equity, diversity and inclusion. The new Diversity Champions will serve a two-year term from 2019 to 2020.

Credit: 
University of New South Wales

Drug-induced cellular membrane complexes induce cancer cell death

image: MUSC Hollings Cancer Center researcher Besim Ogretmen in his laboratory.

Image: 
The Medical University of South Carolina Hollings Cancer Center

Old molecules and new complexes: researchers at Hollings Cancer Center at the Medical University of South Carolina (MUSC) have discovered cell membrane complexes, called ceramidosomes, that may be a new target for cancer-killing drugs. The discovery grew out of efforts to explain the unexpected cancer cell-killing activity of FTY720 (Gilenya, Novartis), an FDA-approved multiple sclerosis drug. The findings are reported in the January 2019 issue of the Journal of Biological Chemistry.

"These complexes are really interesting as they form in the cell membrane and cause the cells to explode," says Besim Ogretmen, Ph.D., Endowed Chair in Lipidomics & Drug Discovery in the SmartState Center for Lipidomics, Pathobiology and Therapy at Hollings Cancer Center and professor in the Department of Biochemistry and Molecular Biology.

The Ogretmen Lab coined the term "ceramidosome" when it identified this complex, a structure made up of lipid (fatty) molecules called ceramides together with two protein components. The identification of "-somes" has been growing in recent years as researchers gain a better understanding of how individual molecules come together to perform important but previously unknown functions.

Simply put, a cell is like an egg and the cell membrane is like the eggshell. Just as eggshells have tiny openings or pores that allow for air and moisture to move in and out, cell membranes have thousands of pores. Ceramides are chains of lipid molecules that are in the cell membrane and have a function in controlling cell death. The Ogretmen research team discovered that ceramides can come together with other molecules and form a new type of cell membrane pores: ceramidosomes.

Ceramidosomes are large membrane pores that cause the cell membrane to ripple. A rippled cell membrane is weak, and the cell explodes and dies, similar to how a cracked eggshell cannot contain the contents of the egg.

The Ogretmen Lab has been studying the multiple biological functions of lipid molecules such as ceramides for many years. FTY720, it turned out, also has anti-cancer activity, in part because it induces ceramidosome formation in the cell membrane: when the researchers put FTY720 on cancer cells, they were surprised to find that the cancer cells died. Using FTY720 as a tool to look into what was going on, the Ogretmen lab set out to understand how the cancer cells were dying.

The lab's main focus was to study whether ceramide production plays any role in the observed FTY720-driven cancer cell death. The drug is approved to treat multiple sclerosis. It works by reducing immune system activity, but it must first be specially modified by the liver in order to work for immune suppression.

The Ogretmen lab found that this drug kills cancer cells without the liver modification step. FTY720 works by activating molecules that induce tumor suppression; like a chain reaction, the drug turns cancer cells against themselves. This is a novel mechanism of cell death, both in the molecules involved and in the way the cell essentially explodes. If the ceramidosome is prevented from forming, drug-treated cells do not die, which revealed that ceramidosome formation is integral to drug-induced cancer cell death.

The lab's initial discovery has opened the way for many further studies. More in-depth molecular work is needed to determine how ceramidosomes are activated and how they move around in the cell membrane. They may also have functions beyond causing cell death: the researchers found, for example, that certain cells, such as germline stem cells, have ceramidosomes even without FTY720 treatment.

These initial findings are a positive step in the search for more effective cancer drugs. Since FTY720 is FDA approved, there is the possibility that it can be used off-label for cancer treatments. Different versions of the drug are also being studied to find modifications that still kill cancer cells but do not suppress the immune system. Just like good detectives, the researchers are moving clue by clue to figure out the unknown functions of this newly discovered ceramidosome complex. Using FTY720 as a tool to induce ceramidosomes, the researchers hope to accelerate progress along the path from lab bench to patient bedside.

"Understanding the fundamental mechanisms of how cancer cells grow uncontrollably is a key for developing more effective drugs to kill them, and this is our goal," says Ogretmen.

Credit: 
Medical University of South Carolina