UMN researcher studies hip fracture probability in women in late life

MINNEAPOLIS, MN - June 27, 2019 - New University of Minnesota Medical School research evaluates the impact of multimorbidity on the probability of hip fractures.

In an article recently published in JAMA Internal Medicine, lead author Kristine Ensrud, MD, Professor of Medicine at the University of Minnesota Medical School, examines the impact of disease definition, comorbidity burden and prognosis on hip fracture probability among women 80 years and older. Late-life women account for the majority of hip fractures in the United States, but are often not screened for osteoporosis. Older age, multimorbidity and poor health are risk factors for hip fracture, but these characteristics also increase the risk of competing, non-fracture related mortality. Thus, clinicians have difficulty identifying late-life women most likely to benefit from drug treatment to prevent hip fractures.

"Older patients with multiple medical conditions or poorer prognosis are excluded from clinical trials, but this is exactly the population that clinicians see," explained Ensrud. "That is why a lot of times when you look at clinical trials, the findings are not necessarily relevant to the patients you take care of."

The study found that late-life women with osteoporosis, including those with comorbidities or poorer prognosis, had a high probability of hip fracture in the next 5 years, even after accounting for competing mortality risk. These results suggest that this group of women may derive a high absolute benefit from initiation of drug treatment to prevent fracture. In contrast, among late-life women without osteoporosis but still considered to be drug treatment candidates by the National Osteoporosis Foundation, mortality probability far outweighed the probability of hip fracture, especially among those with more comorbidities or poorer prognosis. These findings suggest that the absolute benefit of drug treatment to prevent fracture is much lower in this latter patient population.

Ensrud hopes this kind of study might inform treatment guidelines, making them less likely to focus on a single disease in isolation and more patient-focused.

"Patients have multiple diseases and clinicians can't just solely focus on osteoporosis or cardiovascular disease. They have to focus on the whole patient."

Credit: 
University of Minnesota Medical School

Despite the ACA, millions of Americans with cardiovascular disease still can't get care

Cardiovascular disease (CVD) is the leading cause of death for Americans, yet millions with CVD or cardiovascular risk factors (CVRF) still can't access the care they need, even years after the implementation of the Affordable Care Act (ACA).

In a new study published today in the Journal of General Internal Medicine, researchers at Harvard Medical School found the ACA increased insurance coverage and improved access to medical care for Americans with CVD such as heart disease and stroke, and for those with CVD risk factors including smokers and those with diabetes and obesity. However, three years after the ACA took full effect, 20.6 million remained uninsured, and many were unable to get regular medical care due to cost. The study is the first to document the effect of the ACA on Americans with CVD and CVRF, a group with complex health care needs that faces significant health consequences when it lacks coverage.

Researchers analyzed nationally representative data on more than one million adults aged 18 to 64 with a range of cardiovascular conditions, comparing data from before ACA implementation (2012-13) to data after the ACA was in effect (2015-16). They estimated that nearly 7% of those with CVD or CVRF gained insurance coverage in the first three years after the ACA went into effect, and that the ACA slightly narrowed previous disparities in coverage for ethnic and racial minorities.

Insurance coverage increased most in states that expanded Medicaid, from 81% to 89%. In non-expansion states, coverage increased more modestly, from 76% to 81%. Post-ACA, the percentage of chronically ill people with insurance ranged from a high of 94% in Massachusetts to a low of 73% in Texas. In addition to increases in coverage, the study found that post-ACA, Americans with CVD were less likely to forgo a doctor visit due to cost, more likely to have had a check-up in the last year, and more likely to have a primary care physician.

Despite these gains, researchers discovered major gaps in coverage and access to care: Nearly one-quarter of low-income Americans with a CVD or CVRF -- more than 20 million people -- still lacked coverage, and approximately 13% of black and 29% of Hispanic adults in the U.S. with CVD/CVRF remained uninsured. Among Hispanic people with CVD/CVRF in non-Medicaid expansion states, 42% still lacked insurance, 25% could not afford a doctor visit, 40% did not have a checkup in the last year, and 48% did not have a personal physician.

"Patients with existing cardiovascular conditions require regular medical care and daily medications to prevent another heart attack or stroke," said lead author Ameen Barghi of Harvard Medical School. "The good news is that the medications for cardiovascular disease are very effective, and millions of Americans gained some coverage under the ACA. Unfortunately, despite these gains, millions of Americans with these conditions remain uncovered, and many of those will likely suffer serious complications and even death because they cannot get the care they need."

"The data shows that insurance coverage increased more in states that opted to expand Medicaid, and also that coverage rates were already lowest in non-expansion states before the ACA, highlighting the importance of the Medicaid expansion for people with serious pre-existing medical conditions," said Dr. Hugo Torres, a study author, physician at Massachusetts General Hospital and instructor of medicine at Harvard Medical School. "If more states expanded Medicaid, many more people with cardiovascular disease would be covered, particularly among people with low incomes, and in communities of color."

Dr. Torres notes that these findings should be considered in light of the Trump administration's intention to repeal and replace the ACA with a more "market-based" system.

"Repealing or weakening the ACA would strip coverage from millions of Americans with cardiovascular disease or cardiovascular risk factors, spelling disaster for many of them," said the study's senior author, Dr. Danny McCormick, a physician at Cambridge Health Alliance and Associate Professor at Harvard Medical School. "However, today's piecemeal approaches to coverage, including the ACA, will not get all these patients covered. Only a comprehensive Medicare-for-All plan could provide both coverage and good access to care for everyone with a cardiovascular disease or risk factor. Polls show that such reform is popular with the American people, and increasingly on the agenda of many elected officials."

Credit: 
Physicians for a National Health Program

Mum's workplace exposure to solvents may heighten child's autism risk

A mother's workplace exposure to solvents may heighten her child's risk of autism, suggests research published online in the journal Occupational & Environmental Medicine.

While cautious interpretation is needed, the findings add to a growing body of evidence indicating that environmental and workplace factors may be linked to the development of the condition, say the researchers.

Autism is a neurodevelopmental disorder that includes repetitive behaviours and difficulties communicating and socialising with others. In the US alone, recent figures indicate that one in every 68 children is on the autistic spectrum.

The speed with which the number of new cases has increased suggests that factors other than genes may be involved, say the researchers. And several studies have suggested links between prenatal exposure to environmental chemicals and pollutants and the development of the condition.

As workplace exposures are often higher than environmental ones, the researchers wanted to explore if these might potentially be linked to the development of autism.

They drew on data collected for the CHildhood Autism Risks from Genetics and Environment (CHARGE) study. These included personal, health, and job history information for the parents of 537 children formally diagnosed with autism spectrum disorders and 414 children with typical neurodevelopment.

For each job, specialists (industrial hygienists) assessed the intensity and frequency of exposure for 750 mothers and 891 fathers to 16 agents that have been linked to neurological and/or congenital abnormalities from three months before pregnancy through to the birth of the child.

These included medicines, metals, pesticides, anaesthetics, asphalt, brake fluid, plastics and polymers, radiation, cleaners/disinfectants, and solvents, including paints and degreasers, as well as other chemicals.

Exposure levels were classified as none; rare (a few times a year); moderate (weekly); and frequent (several times a week/daily). Intensity was categorised as low, moderate or high, the last of which was taken to mean well above background levels.

The most common occupational exposures among the mums were to disinfectants/cleaners, solvents and ethylene oxide; the least common were to perchlorates, asphalt, PCBs, and machine fluids.

For dads, the most common occupational exposures were to disinfectants/cleaners, solvents and metals; the least common were to perchlorates, PCBs, and asphalt.

Mums with autistic children had been more frequently exposed to solvents than those whose children weren't on the spectrum, the findings showed.

Mums exposed to solvents were 1.5 times more likely to have a child on the autistic spectrum. And moderate-intensity cumulative exposure to solvents was associated with a near doubling in risk.

None of the other agents was associated with heightened risk in either parent or when the exposures of both parents were combined.

This is an observational study, and as such, can't establish cause. Only a few parents had been exposed to some of the agents, and after correcting for statistical bias, the observed associations no longer remained significant.

"However, these results are consistent with earlier reports that have identified solvents as a potential risk factor for [autism spectrum disorders]," write the researchers. Solvents can be absorbed through the skin and lungs, and many remain in the body, including in the brain, they point out.

Credit: 
BMJ Group

Menstrual symptoms linked to nearly 9 days of lost productivity through presenteeism every year

Menstrual period symptoms may be linked to nearly nine days of lost productivity every year through presenteeism, suggests the largest study of its kind, published in the online journal BMJ Open.

But the real impact on women and society is underestimated and poorly appreciated, say the researchers.

They set out to evaluate lost productivity associated with menstrual symptoms, as measured by time off from work or school (absenteeism) and working or studying while feeling ill (presenteeism) in 32,748 Dutch women between the ages of 15 and 45.

All the participants were recruited through social media from July to the end of October 2017.

They filled in a comprehensive online questionnaire about the frequency and length of their menstrual cycle and the severity of any associated symptoms, measured using a validated pain score (visual analogue scale or VAS for short).

They were asked if their symptoms had prompted them to take time off from work or school and/or had affected their productivity while working or studying, as well as how often this had happened.

Their blood loss lasted an average of 5 days. Menstrual symptoms prompted nearly a third (31%) of the women to visit their family doctor, and around one in seven (14.4%) to see a gynaecologist.

In all, around one in seven respondents (just under 14%, 4514) said they had taken time off from work or school during their period, with nearly 3.5% (1108) saying this happened during every, or nearly every, menstrual cycle.

The average amount of sickness leave taken came to just over one day a year.

Younger women under the age of 21 were around three times more likely to say they had taken time off because of their menstrual symptoms than were older women.

Most respondents (just under 81%; 26,438) reported presenteeism, saying that they were less productive as a result of their symptoms. In all, productivity was below par on more than 23 days of the working/study year, with lost productivity amounting to almost 9 days each year.

The researchers calculated that, on average, each woman was less productive a third of the time (33%) because of menstrual symptoms.

Notably, when women called in sick because of menstrual symptoms, only one in five (20%; 908) told their employer or school the real reason for their absence. And around two thirds (just under 68%; 22,154) of respondents said they wished they had the option of more flexible work/study hours during their period.

This is an observational study, and as such can't establish cause. The method of recruitment may also have introduced an element of 'selection bias' whereby those with debilitating symptoms might have been more likely to take part; the analysis also relied on what women said rather than objective assessment.

But the researchers nevertheless conclude: "Menstruation-related symptoms cause a great deal of lost productivity, and presenteeism is a bigger contributor to this than absenteeism.

"Taking all the symptoms into account, it seems likely that the real impact of [menstruation related symptoms] is underestimated in the general population."

But it's not openly talked about, they add. "Despite being almost two decades into the 21st century, discussions about [symptoms] may still be rather taboo."

Credit: 
BMJ Group

What agility and agile mean for you

image: Official logo of the Society for Industrial and Organizational Psychology

Image: 
SIOP

Agility and Agile: An Introduction for People, Teams, and Organizations is the latest in the Society for Industrial and Organizational Psychology (SIOP) white paper series. Written by Ben Baran, assistant professor of management at Cleveland State University and co-founder and principal of the consulting firm Indigo Anchor, and Scott Bible, assistant professor of practice at The University of Akron, this paper provides an overview of the increasingly common terms "agility" and "agile" along with practical implications for leaders who are operating in complex, changing environments.

"This topic is both timely and important," said Baran. "Executives, their teams, and other leaders within organizations are increasingly looking for new ways to deal with the turbulence they're facing. They want to be nimble to succeed amid shifting market conditions, and the concepts of agility and agile describe a mindset and a set of practices and principles that can help."

"Additionally, the field of industrial and organizational psychology has much to offer with regard to how organizations can build their agility," he added. "In particular, what we know about how people make sense of complex environments, how they can build and sustain high-performance teams, and how all of that can fit together within an organization can be of great benefit to any executive who is trying to make his or her organization increasingly adaptive and proactive."

In addition to providing an overview and background on the topic, Baran and Bible discuss eight specific implications for executives and other organizational leaders. Each implication stems from a combination of the research literature on these topics, the authors' experience working with companies, and emerging practices from a wide range of sectors and industries including the military and Silicon Valley.

Credit: 
Society for Industrial and Organizational Psychology

What journalism professors are teaching students -- about their futures

image: Max Besbris is an assistant professor of sociology at Rice University.

Image: 
Rice Department of Sociology

HOUSTON - (June 27, 2019) - As the journalism industry rapidly evolves, what are professors in the field telling students about their job prospects?

A new study from Rice University and Rutgers University finds educators are encouraging aspiring journalists to look for work outside the news business.

"Professionalizing Contingency: How Journalism Schools Adapt to Deprofessionalization" will appear in an upcoming edition of Social Forces. Authors Max Besbris, an assistant professor of sociology at Rice, and Caitlin Petre, an assistant professor of journalism and media studies at Rutgers, conducted the study in response to the massive transformations taking place in journalism, particularly in the field's labor market.

"The post-Watergate media era where you would work for a local paper or TV station and work your way up to retirement with a nice pension is behind us," Besbris said. "Now, papers are shutting down, news outlets are consolidating, and information is widely available on the internet. We wanted to see how these drastic changes in media and media consumption over the past 20 years were impacting journalism education."

For the study, Besbris and Petre conducted in-depth interviews with 113 faculty, staff and administrators from 44 U.S. journalism programs that varied in size, prestige, location and other factors. The authors argue that journalism schools have sought to reframe the industry's unstable labor market as an inevitable and even desirable part of the business and its professional identity.

"Professional schools in general seem to be a means by which we can get a good career," Besbris said. "A medical degree is a pretty clear path, as is the path of a social worker or engineer. However, journalism is a less defined profession and you don't need a license to practice. That's an interesting aspect of this case. Master's degrees are on the rise but more of them -- including journalism degrees -- don't necessarily offer a clear path to a secure career."

Indeed, the authors found that journalism educators are "very aware" and sensitive to changes in the industry. The majority interviewed said they accept the changes in the field as a reality and see no way of returning to old models. They also agreed that students must move away from thinking about journalism as a coherent career path and instead must accept the precarious nature of their jobs.

"They're telling their students that they don't have to, in fact shouldn't, go work for traditional news organizations -- they can do temporary, contract or freelance work, or work for non-news corporations, the government, NGOs (nongovernmental organizations) or almost any other place," Besbris said. "For a long time journalism had been trying to cultivate the difference between journalism and PR (public relations), so it was really interesting to see this change in thinking, and hear individuals say that students should prepare to work as journalists in non-news organizations."

Besbris also said most of the educators they interviewed stressed that students should be "as entrepreneurial as possible" and be willing to start their own businesses or websites. They encouraged students to not only become good writers or photojournalists, but also develop the skills to do just about anything from writing and editing to recording and designing.

"Many of these J-school professors are telling students to learn to hustle, be game for anything and even to celebrate the precariousness of the labor market," Besbris said.

To be sure, there's pushback from some instructors, Besbris said. Some of those interviewed were "very upset" about the changes taking place in their schools and within the industry. However, Besbris said, those people -- who were mostly Ph.D.s with little or distant experience in the field -- comprised a small minority.

Besbris and Petre hope the research will illuminate how professional schools writ large are adjusting to labor market instability in the fields for which they're training students.

Credit: 
Rice University

NIST presents first real-world test of new smokestack emissions sensor designs

video: To monitor emissions from coal-fired power plants, technicians need to measure the rate at which flue gas is emitted from the smokestack. The flow inside the smokestack contains eddies and swirls but generally travels upwards. In the NIST tests, four probes -- called pitot tubes -- are inserted horizontally into the smokestack. The four probes each take a flow measurement at four different spots, for a total of 16 measurements. With this information, NIST scientists could test the precision and accuracy of a new pitot tube design and measurement method.

Image: 
Sean Kelley/NIST

In collaboration with industry, researchers at the National Institute of Standards and Technology (NIST) have completed the first real-world test of a potentially improved way to measure smokestack emissions in coal-fired power plants. The researchers are presenting their work this week at the 2019 International Flow Measurement Conference (FLOMEKO) in Lisbon, Portugal.

Each year, to meet requirements set by the Environmental Protection Agency (EPA), coal-fired power plants must have their smokestack emissions audited, or checked by an independent third party. NIST researchers wanted to make this test quicker to save the plants money during their audits, while also improving the accuracy of the sensors. So, a NIST team has designed new probes for sensing emission flow rates and a new measurement method that could potentially speed up on-site audits by a factor of 10, researchers say.

The fieldwork results were "promising," said NIST engineer Aaron Johnson, and were in reasonable agreement with the laboratory findings. "We were surprised; it did quite well compared to what the EPA has on its books as its 'best practices' method."

NIST conducted this work as part of a cooperative research and development agreement (CRADA) with the Electric Power Research Institute (EPRI), an independent nonprofit organization whose members include electric utility companies, businesses and government agencies.

"Coal-fired electric generating units may benefit from the current NIST work by having improved standards and techniques to measure mass emissions more accurately, with increased confidence that all entities are reporting on a uniform basis," said EPRI program manager Tom Martz. He added that the potential time savings "is not something we can accurately quantify at this time, but this will be a key objective of future work."

The ultimate goal is to provide research that the EPA might someday develop into a new standard for smokestack emissions calibration.

"The advantages for industry are that it will reduce test time and cost and has the potential to be more accurate" than current industry-standard probes, Johnson said.

Even if the EPA does not create a new standard, however, the work could have benefits for industry by providing power plant companies with more choices for managing their emissions tests. "Our goal is to get it written as an EPA standard," Johnson said. "But it's still up to industry members to decide whether they would want to use it."

Go With the Flow

Smokestacks at coal-fired power plants are equipped with monitors that continuously measure the concentration of flue gas emissions, which include carbon dioxide, mercury, sulfur dioxide and nitrogen oxides, as well as the flow rate of the flue gas. By federal law, the built-in flow-rate sensors need to be calibrated -- that is, checked for accuracy -- during the annual audit.

To conduct the yearly calibration, auditors use small portable devices called pitot tubes. The audit technicians climb the stack - usually several dozen meters (hundreds of feet) tall - and insert their pitot probes horizontally into the gases churning their way up the smokestack. They take several readings of the flow at various points within a cross section of the stack, which is typically 7 or 8 meters (25 feet) in diameter.

By far the most common kind of sensor used for this work is an "S-probe." It has two holes, or ports. One port faces directly into the flow of gas and detects the pressure that builds up in the tube. The other port faces the opposite direction. The faster the flow, the higher the pressure difference between the two ports; measuring this difference in pressure allows auditors to calculate the flow's speed.
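The pressure-to-speed conversion described above follows the standard pitot-tube relation, v = C·√(2Δp/ρ), where C is the probe's calibration coefficient. The sketch below is illustrative only: the coefficient, pressure and gas-density values are assumed for the example and are not NIST's or the EPA's figures.

```python
import math

def pitot_velocity(delta_p_pa, gas_density_kg_m3, probe_coefficient=0.84):
    """Estimate flow speed (m/s) from a pitot-tube differential pressure.

    Standard pitot relation: v = C * sqrt(2 * dp / rho).
    The coefficient C comes from calibration; 0.84 here is an
    assumed, typical-looking value for illustration only.
    """
    return probe_coefficient * math.sqrt(2.0 * delta_p_pa / gas_density_kg_m3)

# Illustrative inputs: 150 Pa differential pressure, hot flue gas
# density of roughly 0.9 kg/m^3 (both assumptions).
v = pitot_velocity(150.0, 0.9)

# Volumetric flow through a stack of the size mentioned in the article
# (~7 m diameter), using one point velocity in place of the averaged
# multi-point traverse an auditor would actually perform.
area = math.pi * (7.0 / 2) ** 2  # cross-sectional area, m^2
q = v * area                     # volumetric flow rate, m^3/s
```

In a real audit the velocity would be the average of many point readings across the cross section, and the result would then be corrected for gas temperature, pressure and composition.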

S-probes don't require calibration, but each measurement can take several minutes, since the technician has to manually rotate the sensor until one side is facing directly into the flow. This is complicated because the flow is not necessarily traveling directly upward at the point being tested. At the base of the stack, flue gas usually travels around a sharp bend, which creates complicated eddies and swirls that don't go away even in tall smokestacks.

Using S-probes is so labor-intensive that an on-site annual calibration can take a day or more to complete. "And the power plant is losing money all the time the auditors are there, so they want the technicians in and out as fast as possible," Johnson said.

To speed up this process, the NIST scientists have made three innovations. First, they have created two new models of pitot tubes, with five holes instead of two, which perform better than S-probes and may offer advantages over other five-hole models of pitot tubes currently in use.

The probes, designed by NIST physicist Iosif Shinder, come in two shapes: hemispherical and conical.

Second, the scientists have developed a calibration scheme for their new sensors that does not require a technician to rotate the probe inside a smokestack to find the true direction of the flow for each measurement. So, although the sensors would have to be calibrated before use, they would take much less time to use during an actual audit.

Third, NIST's Jim Filla developed software that is compatible with a commercially available automated system to measure flow in real time.

The Real Deal

Until now, the new probes' performance had been measured only at NIST's test facility, which includes a scale-model smokestack simulator and a wind tunnel. But NIST's laboratories cannot replicate all aspects of a real power plant, such as the presence of soot in the smokestack's flow.

"It's one thing to test it in our wind tunnel," Johnson said. "It's another to prepare to test it in a stack that's 120 degrees F."

The first field run, in July 2018, took place at a natural gas plant, where flow is relatively straightforward to measure.

The second, in September 2018, was conducted at a coal-fired power plant with a particularly complicated flow.

The coal-fired plant had an enclosed platform where the pitot tubes were inserted into the smokestack. But the natural gas plant's platform was open to the elements. And at roughly 45 meters (145 feet) in the air, "things shake," said NIST technician Joey Boyd. "While you're working, the stack is swaying, and the floor beneath you is moving."

When NIST researchers analyzed the data, their results were promising, agreeing to within 2% with their laboratory findings.

"The probes performed equally well in the smokestack as they did at NIST's test facility," Johnson said.

Future field tests will help the researchers solve the biggest problem they had: sensor clogging, in which the pitot tubes' ports get gummed up with water and particulate matter and have to be flushed before a test can continue.

Also, the work taught them they needed to write special software signaling to their equipment every time there was a "purge" - a high-pressure blast of air through the pitot probe that could damage a key part of the apparatus if certain valves were not closed in time.

Credit: 
National Institute of Standards and Technology (NIST)

Functional hair follicles grown from stem cells

image: Hair growth in nude mice transplanted with human iPSC-derived dermal papilla cells that were combined with mouse epithelial cells inside a biodegradable scaffold. Left insert: enlarged outside view. Right insert: fluorescent microscopy image of hair follicles under the skin; cell nuclei (blue), epithelial cells (green), human dermal papilla cells (red).

Image: 
Sanford Burnham Prebys

LOS ANGELES - June 27, 2019 - Scientists from Sanford Burnham Prebys have created natural-looking hair that grows through the skin using human induced pluripotent stem cells (iPSCs), a major scientific achievement that could revolutionize the hair growth industry. The findings were presented today at the annual meeting of the International Society for Stem Cell Research (ISSCR) and received a Merit Award. A newly formed company, Stemson Therapeutics, has licensed the technology.

More than 80 million men, women and children in the United States experience hair loss. Genetics, aging, childbirth, cancer treatment, burn injuries and medical disorders such as alopecia can cause the condition. Hair loss is often associated with emotional distress that can reduce quality of life and lead to anxiety and depression.

"Our new protocol described today overcomes key technological challenges that kept our discovery from real-world use," says Alexey Terskikh, Ph.D., an associate professor in Sanford Burnham Prebys' Development, Aging and Regeneration Program and the co-founder and chief scientific officer of Stemson Therapeutics. "Now we have a robust, highly controlled method for generating natural-looking hair that grows through the skin using an unlimited source of human iPSC-derived dermal papilla cells. This is a critical breakthrough in the development of cell-based hair-loss therapies and the regenerative medicine field."

Terskikh studies a type of cell called dermal papilla. Residing inside the hair follicle, these cells control hair growth, including hair thickness, length and growth cycle. In 2015, Terskikh successfully grew hair underneath mouse skin (subcutaneous) by creating dermal papilla derived from human pluripotent stem cells--a tantalizing but uncontrolled process that required further refinement.

The approach detailed in the ISSCR presentation, which was delivered by lead researcher Antonella Pinto, Ph.D., a postdoctoral researcher in the Terskikh lab, features a 3D biodegradable scaffold made from the same material as dissolvable stitches. The scaffold controls the direction of hair growth and helps the stem cells integrate into the skin, a naturally tough barrier. The current protocol relies on mouse epithelial cells combined with human dermal papilla cells. The experiments were conducted in immunodeficient nude mice, which lack body hair.

The derivation of the epithelial part of a hair follicle from human iPSCs is currently underway in the Terskikh lab. Combined human iPSC-derived epithelial and dermal papilla cells will enable the generation of entirely human hair follicles, ready for allogenic transplantation in humans. Distinct from any other approaches to hair follicle regeneration, human iPSCs provide an unlimited supply of cells and can be derived from a simple blood draw.

“Hair loss profoundly affects many people’s lives. A significant part of my practice involves both men and women who are seeking solutions to their hair loss,” says Richard Chaffoo, M.D., F.A.C.S., a triple board-certified plastic surgeon who founded La Jolla Hair MD and is a medical adviser to Stemson Therapeutics. “I am eager to advance this groundbreaking technology, which could improve the lives of millions of people who struggle with hair loss.”

Credit: 
Sanford Burnham Prebys

Seven-country study reveals viruses as new leading cause of global childhood pneumonia

Respiratory syncytial virus (RSV) and other viruses now appear to be the main causes of severe childhood pneumonia in low- and middle-income countries, highlighting the need for vaccines against these pathogens, according to a study from a consortium of scientists from around the world, led by a team at the Johns Hopkins Bloomberg School of Public Health.

Pneumonia is the leading cause of death worldwide among children under 5 years old, with about 900,000 fatalities and more than 100 million reported cases each year. This makes pneumonia a greater cause of childhood mortality than malaria, tuberculosis, HIV, Zika virus and Ebola virus combined.

The study, to be published June 27 in The Lancet, was the largest and most comprehensive of its kind since the 1980s. It included nearly 10,000 children in seven African and Asian countries. After testing for viruses, bacteria, and other pathogens in children hospitalized with severe pneumonia--and in community children without pneumonia--the study found that 61 percent of severe pneumonia cases were caused by viruses, led by RSV, which alone accounted for 31 percent of cases.

"Prior to this study, we didn't know which specific viruses and bacteria are now causing most of the severe childhood pneumonia cases in the world, but public health organizations and vaccine manufacturers really need that information to work toward reducing the substantial childhood mortality that pneumonia still causes," says study co-principal investigator Maria Deloria Knoll, PhD, a senior scientist in the Bloomberg School's Department of International Health, and associate director of science at the Johns Hopkins International Vaccine Access Center (IVAC).

Identifying the germs that cause pneumonia is difficult in individual cases and much more so on a scale of thousands of cases, especially in low- and middle-income countries where most pneumonia deaths occur. Researchers in prior pneumonia studies simply lacked the microbiological and analytical resources to produce estimates of the major pneumonia pathogens, Knoll says. And, in the past two decades, many low- and middle-income countries have introduced effective vaccines against known major bacterial causes of pneumonia--Haemophilus influenzae type b and Streptococcus pneumoniae--so the global mix of pathogens causing childhood pneumonia has changed as a result.

The new, IVAC-led study, known as the Pneumonia Etiology Research for Child Health (PERCH) study, included 4,232 cases of severe pneumonia requiring hospitalization among children under 5 years and 5,119 community children without pneumonia during a two-year period. The study was carried out at sites in Bangladesh, The Gambia, Kenya, Mali, South Africa, Thailand, and Zambia.

For their study, researchers took nasal and throat swabs as well as blood, sputum and other fluid samples from cases and controls and tested them for pathogens using state-of-the-art laboratory techniques. Cases for the primary analysis were limited to those whose pneumonia was confirmed by chest X-ray, and children with HIV were considered in a separate analysis because the causes of their pneumonia would likely differ from those of children without HIV. Using analytic methods tailored to etiology research, the researchers compared the pathogens found in samples from severe pneumonia cases with those from other children in the community to estimate the likeliest cause of each case. In this way they were able to identify the leading causes of childhood pneumonia among children in these settings.
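The core of this case-control logic can be sketched with a toy calculation. This is purely illustrative and is not the actual PERCH/BAKER model; the detection prevalences below are made up. If a pathogen is detected far more often in cases than in community controls, the classic odds-ratio-based attributable-fraction formula estimates what share of cases it likely caused.

```python
# Illustrative sketch (not the PERCH/BAKER method): estimating the fraction
# of pneumonia cases attributable to a pathogen by comparing how often it is
# detected in cases versus community controls. All numbers are hypothetical.

def odds_ratio(p_cases, p_controls):
    """Odds ratio of pathogen detection in cases vs. controls."""
    return (p_cases / (1 - p_cases)) / (p_controls / (1 - p_controls))

def attributable_fraction(p_cases, p_controls):
    """Fraction of cases attributable to the pathogen.

    Classic case-control formula: among pathogen-positive cases, the fraction
    attributable to the pathogen is (OR - 1) / OR; weighting by the detection
    prevalence in cases gives the overall attributable fraction.
    """
    or_ = odds_ratio(p_cases, p_controls)
    return p_cases * (or_ - 1.0) / or_

# Hypothetical detection prevalences for an RSV-like pathogen.
p_in_cases, p_in_controls = 0.45, 0.10
af = attributable_fraction(p_in_cases, p_in_controls)
print(f"Estimated attributable fraction: {af:.2f}")
```

A pathogen found equally often in cases and controls (odds ratio of 1) contributes an attributable fraction of zero under this formula, which is why the community-control arm of the study matters.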

The researchers concluded that, across all study sites combined, viruses accounted for 61.4 percent of cases, bacteria for 27.3 percent, and Mycobacterium tuberculosis for 5.9 percent. Fungal and unknown causes accounted for the remainder of cases.

RSV accounted for nearly a third of all cases and was the leading cause of severe pneumonia in each of the seven countries studied. Other top causes were rhinovirus, human metapneumovirus, parainfluenza viruses, and S. pneumoniae bacteria.

"We now have a much better idea of which new vaccines would have the most impact in terms of reducing illness and mortality from childhood pneumonia in these countries," says Katherine O'Brien, MD, who led the PERCH study as a professor at the Johns Hopkins Bloomberg School of Public Health and now serves as Director of Immunizations, Vaccines and Biologicals at the World Health Organization.

RSV has long been known as a common and potentially serious respiratory pathogen among children and the elderly. It remains the leading cause of pneumonia in children younger than 1 year in the United States, according to the Centers for Disease Control and Prevention. Several RSV vaccine candidates are being developed and evaluated in clinical trials. A monoclonal antibody therapy, palivizumab, is available for the prevention of RSV disease in children with underlying medical conditions but is not suitable programmatically or financially for widespread use in routine immunization programs.

The analytical technique developed for the study to estimate the cause of individual cases of childhood pneumonia is called the Bayesian Analysis Kit for Etiology Research (BAKER), and is available online as an open-source application for use by other public health researchers.

"Estimating the etiology of pneumonia was like a complex jigsaw puzzle where the picture could only be seen clearly by assembling multiple, different pieces of information using innovative epidemiologic and statistical methods," says Scott Zeger, PhD, Malone Professor of Biostatistics in the Bloomberg School's Department of Biostatistics.

Credit: 
Johns Hopkins Bloomberg School of Public Health

Risk prediction model may help determine if a lung nodule will progress to cancer

Bottom Line: A risk prediction model developed using clinical and radiological features could stratify individuals presenting with a lung nodule as having high or low risk for lung cancer.

Journal in Which the Study was Published: Cancer Prevention Research, a journal of the American Association for Cancer Research

Author: Barbara Nemesure, PhD, director of the Cancer Prevention and Control Program and the Lung Cancer Program at Stony Brook Cancer Center in New York

Background: "While lung nodules are not uncommon, a major challenge in the field is determining which nodules will progress to cancer," said Nemesure.

Even though lung and bronchus cancer is the leading cause of cancer mortality in the United States, the five-year survival rate for localized disease is greater than 50 percent, according to recent statistics. However, the majority of lung cancer cases are diagnosed after the cancer has metastasized. "Lung cancer is often asymptomatic in early stages, and the identification of high-risk individuals is a major priority," Nemesure said.

Prior studies in this area include a retrospective analysis of lung cancer patients and an analysis of high-risk individuals undergoing screening for the disease, noted Nemesure. The current study aimed to prospectively predict lung cancer incidence among the general population presenting with a lung nodule, she said.

How the Study Was Conducted and Results: Nemesure and colleagues analyzed data from 2,924 patients presenting with a lung nodule assessed at the Stony Brook Cancer Center's Lung Cancer Evaluation Center between Jan. 1, 2002, and Dec. 31, 2015. Patients were excluded if they had a history of lung cancer or if they were diagnosed with lung cancer within six months of the initial consultation. Participants were randomly assigned to discovery (1,469 patients) and replication (1,455 patients) cohorts. Among them, 171 developed lung cancer over the 13-year period.

Clinical and radiological data were collected to develop a risk prediction model. Using multivariable analyses, the researchers found that the combined variables of age, smoking pack-years, personal history of cancer, the presence of chronic obstructive pulmonary disease, and nodule characteristics such as size, the presence of spiculation, and the presence of a ground-glass opacity, could best predict who would develop lung cancer among the discovery cohort. These factors were combined to develop an overall risk score to stratify patients into high- and low-risk categories.

When the risk score was applied to the replication cohort, the researchers found that the model could discriminate cancer risk with a sensitivity and specificity of 73 percent and 81 percent, respectively. Compared with individuals in the low-risk category, those in the high-risk category had more than 14 times the risk of developing lung cancer.
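For readers unfamiliar with these metrics, a minimal sketch shows how sensitivity and specificity fall out of dichotomizing a risk score at a threshold. The threshold and the tiny cohort below are made up for illustration; this is not the Stony Brook model or its data.

```python
# Illustrative only: computing sensitivity and specificity after splitting a
# risk score into high- vs. low-risk groups. Toy data, hypothetical threshold.

def sensitivity_specificity(scores, labels, threshold):
    """Score >= threshold counts as high risk; labels are True for cancer."""
    tp = sum(1 for s, y in zip(scores, labels) if s >= threshold and y)
    fn = sum(1 for s, y in zip(scores, labels) if s < threshold and y)
    tn = sum(1 for s, y in zip(scores, labels) if s < threshold and not y)
    fp = sum(1 for s, y in zip(scores, labels) if s >= threshold and not y)
    # Sensitivity: fraction of cancers flagged high risk.
    # Specificity: fraction of non-cancers correctly left low risk.
    return tp / (tp + fn), tn / (tn + fp)

# Toy cohort: risk scores and eventual lung cancer status.
scores = [0.9, 0.8, 0.7, 0.4, 0.6, 0.3, 0.2, 0.1, 0.5, 0.85]
labels = [True, True, False, True, False, False, False, False, False, True]
sens, spec = sensitivity_specificity(scores, labels, threshold=0.65)
print(f"sensitivity={sens:.2f}, specificity={spec:.2f}")
```

Raising the threshold trades sensitivity for specificity, which is why a model like this must be tuned to the clinical cost of missing a cancer versus over-monitoring benign nodules.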

Author's Comments: "Through our model, we can identify which individuals with lung nodules should be closely monitored, so that we can catch the disease at an early stage and ultimately reduce the burden of lung cancer deaths," Nemesure said.

"Even though the majority of lung nodules do not progress to cancer, it is still vitally important that patients seek follow-up care," noted Nemesure.

Credit: 
American Association for Cancer Research

3D body mapping could identify, treat organs, cells damaged from medical conditions

WEST LAFAYETTE, Ind. - Medical advancements can come at a physical cost. Often, following diagnosis and treatment for cancer and other diseases, patients' organs and cells remain damaged by the medical condition even after the disease itself has been treated.

In fact, one of the fastest-growing medical markets involves healing and/or replacing organs and cells that have been treated yet remain damaged by cancer, cardiovascular disease and other medical issues. The global tissue engineering market is expected to reach $11.5 billion by 2022. That market involves researchers and medical scientists working to repair tissues damaged by some of the world's most debilitating cancers and diseases.

One big challenge remains for the market - how to monitor and continuously test the performance of engineered tissues and cells to replace damaged ones. Purdue University researchers have come up with a 3D mapping technology to monitor and track the behavior of the engineered cells and tissues and improve the success rate for patients who have already faced a debilitating disease. The technology is published in the June 19 edition of ACS Nano.

"My hope is to help millions of people in need," said Chi Hwan Lee, an assistant professor of biomedical engineering and mechanical engineering in Purdue's College of Engineering, who leads the research team. "Tissue engineering already provides new hope for hard-to-treat disorders, and our technology brings even more possibilities."

The Purdue team created a tissue scaffold with sensor arrays in a stackable design that can monitor electrophysiological activities of cells and tissues. The technology uses the information to produce 3D maps to track activity.

"This device offers an expanded set of potential options to monitor cell and tissue function after surgical transplants in diseased or damaged bodies," Lee said. "Our technology offers diverse options for sensing and works in moist internal body environments that are typically unfavorable for electronic instruments."

Lee said the Purdue device is an ultra-buoyant scaffold that allows the entire structure to remain afloat on the cell culture medium, providing complete isolation of the entire electronic instrument from the wet conditions inside the body.

Lee and his team have been working with Sherry Harbin, a professor in Purdue's Weldon School of Biomedical Engineering, to test the device in stem cell therapies with potential applications in the regenerative treatment of diseases.

Their work aligns with Purdue's Giant Leaps celebration, which marks global advancements in health as part of Purdue's 150th anniversary. Health, including disease monitoring and treatment, is one of the four themes of the yearlong celebration's Ideas Festival, designed to showcase Purdue as an intellectual center solving real-world issues.

Lee and the other researchers worked with the Purdue Research Foundation Office of Technology Commercialization to patent the new device.

Credit: 
Purdue University

One in five haematological cancer patients suffer blood clots or bleeding

New Danish research may help direct focus towards the serious complications that, on average, every fifth haematological cancer patient suffers. This is according to medical doctor and PhD Kasper Adelborg from Aarhus University and Aarhus University Hospital, who has studied the cases of 32,000 haematological cancer patients between 2000 and 2013. Haematological cancer includes leukaemia, bone marrow cancer and cancers of the lymph nodes.

"This is a broad group of patients with very different disease experiences depending on the type of haematological cancer. Some patients have a particular risk of suffering blood clots, while others instead have a higher risk of bleeding, such as gastrointestinal bleeding," says Kasper Adelborg, before stating that the new knowledge can be used for even better prevention and individualised treatment:

"If a person has a high risk of suffering a blood clot, treatment with anticoagulant medicine can benefit some patients. But anticoagulant medicine is not desirable if the risk of suffering bleeding is higher. This is a difficult clinical problem, but our study can set goals for what carries most weight for each individual type of cancer," he says.

One example is the disease myelodysplastic syndrome (MDS), which is a type of bone marrow cancer. Here the study showed that the risk of bleeding within ten years was approximately fifteen per cent, while the risk of suffering a blood clot was lower.

"This means that doctors who help these patients should be aware that they have a high risk of bleeding and should therefore not prescribe too much anticoagulant medicine," says Kasper Adelborg.

He adds that with each individual patient there is still a need to weigh up the overall risk of a blood clot and bleeding, which includes taking into account the patient's age, medical history, other diseases, lifestyle etc. before choosing a treatment.

Major preventative potential:

The new study, which has been published in the Journal of Thrombosis and Haemostasis, corroborates previous studies, though researchers have not previously looked at the entire group of haematological cancer patients together - and no previous studies have covered so many years. Additionally, previous studies have focused solely on either blood clots or bleeding.

Kasper Adelborg emphasises that there are major differences in the prognoses for the different patient groups. For example, only a few children develop a blood clot or suffer bleeding in the years after suffering from leukaemia, while far more patients with e.g. bone marrow cancer develop blood clots and/or bleeding.

"The potential for prevention is particularly large in the latter group," he says.

In relation to the population as a whole, the study shows the heightened risk for haematological cancer patients:

Blood clot in the heart: Forty per cent higher.

Blood clot in the brain: Twenty per cent higher.

Blood clot in the legs and lungs: over three hundred per cent higher.

Bleeding: Two hundred per cent higher.
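These "per cent higher" figures can be restated as relative-risk multipliers: "X per cent higher" than the general population corresponds to a relative risk of 1 + X/100. A quick arithmetic sketch (treating "over three hundred per cent" as the stated lower bound of 300):

```python
# Converting the reported excess risks into relative-risk multipliers.
# Figures are the percentages quoted in the study summary above.

excess_percent = {
    "blood clot in the heart": 40,
    "blood clot in the brain": 20,
    "blood clot in the legs and lungs": 300,  # stated as "over" 300
    "bleeding": 200,
}

for outcome, pct in excess_percent.items():
    relative_risk = 1 + pct / 100
    print(f"{outcome}: {relative_risk:.1f}x the general-population risk")
```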

Credit: 
Aarhus University

Mimicking the ultrastructure of wood with 3D-printing for green products

image: 3D printing with sustainable Swedish forest materials. The microscopy images of real wood tissue and the 3D printed version show how the researchers mimicked the real wood's cellular architecture. The printed version is at a larger scale for ease of handling and display, but the researchers are able to print at any scale.

Image: 
Yen Strandqvist/Chalmers University of Technology

Researchers at Chalmers University of Technology, Sweden, have succeeded in 3D printing with a wood-based ink in a way that mimics the unique 'ultrastructure' of wood. Their research could revolutionise the manufacturing of green products. Through emulating the natural cellular architecture of wood, they now present the ability to create green products derived from trees, with unique properties - everything from clothes, packaging, and furniture to healthcare and personal care products.

The way in which wood grows is controlled by its genetic code, which gives it unique properties in terms of porosity, toughness and torsional strength. But wood has limitations when it comes to processing. Unlike metals and plastics, it cannot be melted and easily reshaped, and instead must be sawn, planed or curved. Processes which do involve conversion, to make products such as paper, card and textiles, destroy the underlying ultrastructure, or architecture of the wood cells. But the new technology now presented allows wood to be, in effect, grown into exactly the shape desired for the final product, through the medium of 3D printing.

Having previously converted wood pulp into a nanocellulose gel, researchers at Chalmers had already succeeded in creating a type of ink that could be 3D printed. Now, they present a major progression: successfully interpreting wood's genetic code and digitising it so that it can instruct a 3D printer.

It means that now, the arrangement of the cellulose nanofibrils can be precisely controlled during the printing process, to actually replicate the desirable ultrastructure of wood. Being able to manage the orientation and shape means that they can capture those useful properties of natural wood.

"This is a breakthrough in manufacturing technology. It allows us to move beyond the limits of nature, to create new sustainable, green products. It means that those products which today are already forest-based can now be 3D printed, in a much shorter time. And the metals and plastics currently used in 3D printing can be replaced with a renewable, sustainable alternative," says Professor Paul Gatenholm, who has led this research through the Wallenberg Wood Science Centre at Chalmers.

A further advance is the addition of hemicellulose, a natural component of plant cells, to the nanocellulose gel. The hemicellulose acts as a glue, giving the cellulose sufficient strength to be useful, in a similar manner to the natural process of lignification, through which cell walls are built.

The new technology opens up a whole new area of possibilities. Wood-based products could now be designed and 'grown' to order - at a vastly reduced timescale compared with natural wood.

Paul Gatenholm's group has already developed a prototype for an innovative packaging concept. They printed out honeycomb structures, with chambers in between the printed walls, and then managed to encapsulate solid particles inside those chambers. Cellulose has excellent oxygen barrier properties, meaning this could be a promising method for creating airtight packaging for foodstuffs or pharmaceuticals for example.

"Manufacturing products in this way could lead to huge savings in terms of resources and harmful emissions," he says. "Imagine, for example, if we could start printing packaging locally. It would mean an alternative to today's industries, with heavy reliance on plastics and CO2-generating transport. Packaging could be designed and manufactured to order without any waste".

They have also developed prototypes for healthcare products and clothing. Another area where Paul Gatenholm sees huge potential for the technology is in space; he believes space offers the perfect first test bed for developing the technology further.

"The source material of plants is fantastically renewable, so the raw materials can be produced on site during longer space travel, or on the moon or on Mars. If you are growing food, there will probably be access to both cellulose and hemicellulose," says Paul Gatenholm.

The researchers have already successfully demonstrated their technology at a workshop at the European Space Agency, ESA, and are also working with Florida Tech and NASA on another project, including tests of materials in microgravity.

"Traveling in space has always been the catalyst for material development on earth," he says.

Credit: 
Chalmers University of Technology

Some crocs of the past were plant eaters

image: This image shows crocodyliform life reconstructions.

Image: 
Jorge Gonzalez

Based on careful study of tooth remains, researchers have found that ancient groups of crocodyliforms--the group including living and extinct relatives of crocodiles and alligators--were not the carnivores we know today, as reported in the journal Current Biology on June 27. In fact, the evidence suggests that a veggie diet arose in the distant cousins of modern crocodylians at least three times.

"The most interesting thing we discovered was how frequently it seems extinct crocodyliforms ate plants," says Keegan Melstrom (@gulosuchus) of the University of Utah. "Complex teeth, which we infer to indicate herbivory, appear in the extinct relatives of crocodiles at least three times and maybe as many as six in our dataset alone."

All living crocodylians possess a similar general morphology and ecology to match their lifestyle as semiaquatic generalist carnivores, which includes relatively simple, conical teeth. It was clear from the start that extinct species showed a different pattern, including species with many specializations not seen today. One such specialization is a feature known as heterodonty, regionalized differences in tooth size or shape.

"Carnivores possess simple teeth whereas herbivores have much more complex teeth," Melstrom says. "Omnivores, organisms that eat both plant and animal material, fall somewhere in between. Part of my earlier research showed that this pattern holds in living reptiles that have teeth, such as crocodylians and lizards. So these results told us that the basic pattern between diet and teeth is found in both mammals and reptiles, despite very different tooth shapes, and is applicable to extinct reptiles."

To infer what those extinct crocodyliforms most likely ate, Melstrom and his advisor Randall Irmis compared the tooth complexity of extinct crocodyliforms to those of living animals using a method originally developed for use in living mammals. Overall, they measured 146 erupted teeth from 16 different taxa of extinct crocodyliforms at a resolution of 25 data rows per tooth.
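The release does not name the complexity metric, but dental-complexity measures of this kind typically resemble an orientation patch count: the crown surface is divided into contiguous patches of cells that "face" the same compass direction, and a higher patch count indicates a more complex, herbivore-like tooth. A simplified, hypothetical sketch of that idea:

```python
# Simplified illustration in the spirit of orientation patch count; the study's
# exact metric is not named in this release, so treat every detail here as a
# hypothetical sketch. A tooth crown is a small height grid; each cell is
# binned by its downhill compass direction, and contiguous same-direction
# patches are counted. More patches suggest a more complex tooth surface.

def orientation(grid, r, c):
    """Bin the downhill direction at (r, c) into N/E/S/W (forward difference)."""
    rows, cols = len(grid), len(grid[0])
    dzdx = grid[r][min(c + 1, cols - 1)] - grid[r][c]
    dzdy = grid[min(r + 1, rows - 1)][c] - grid[r][c]
    if abs(dzdx) >= abs(dzdy):
        return "E" if dzdx < 0 else "W"
    return "S" if dzdy < 0 else "N"

def patch_count(grid):
    """Count 4-connected patches of cells sharing one orientation bin."""
    rows, cols = len(grid), len(grid[0])
    bins = [[orientation(grid, r, c) for c in range(cols)] for r in range(rows)]
    seen, patches = set(), 0
    for r in range(rows):
        for c in range(cols):
            if (r, c) in seen:
                continue
            patches += 1
            stack = [(r, c)]  # flood-fill one same-orientation patch
            while stack:
                y, x = stack.pop()
                if (y, x) in seen:
                    continue
                seen.add((y, x))
                for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
                    if (0 <= ny < rows and 0 <= nx < cols
                            and (ny, nx) not in seen
                            and bins[ny][nx] == bins[y][x]):
                        stack.append((ny, nx))
    return patches

# A single smooth cusp (carnivore-like) vs. a bumpy surface (herbivore-like).
cone = [[-max(abs(r - 2), abs(c - 2)) for c in range(5)] for r in range(5)]
bumpy = [[(r + c) % 2 for c in range(5)] for r in range(5)]
print(patch_count(cone), patch_count(bumpy))
```

The bumpy grid yields far more patches than the smooth cusp, which is the intuition behind reading diet from tooth-surface complexity.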

Using a combination of quantitative dental measurements and other morphological features, the researchers reconstructed the diets of those extinct crocodyliforms. The results show that those animals had a wider range of dental complexities and presumed dietary ecologies than had been appreciated previously.

Plant-eating crocodyliforms appeared early in the evolutionary history of the lineage, the researchers conclude, shortly after the end-Triassic mass extinction, and persisted until the end-Cretaceous mass extinction. Their analysis suggests that herbivory arose independently a minimum of three times, and possibly six times, in Mesozoic crocodyliforms.

"Our work demonstrates that extinct crocodyliforms had an incredibly varied diet," Melstrom says. "Some were similar to living crocodylians and were primarily carnivorous, others were omnivores, and still others likely specialized in plants. The herbivores lived on different continents at different times, some alongside mammals and mammal relatives, and others did not. This suggests that an herbivorous crocodyliform was successful in a variety of environments!"

Melstrom says they are continuing to reconstruct the diets of extinct crocodyliforms, including fossil species that are missing teeth. He also wants to understand why the extinct relatives of crocodiles diversified so radically after one mass extinction but not another, and whether dietary ecology could have played a role.

Credit: 
Cell Press

Growing embryonic tissues on a chip

image: Colony of human embryonic stem cells that was exposed to a microfluidic gradient of a morphogen (BMP4), resulting in the establishment of different cell types arranged in layers.

Image: 
Andrea Manfrin, EPFL

It's no surprise that using human embryos for biological and medical research comes with many ethical concerns. Correct though it is to proceed with caution in these matters, the fact is that much science would benefit from being able to study human biology more accurately.

One solution lies with alternative tools - what scientists call in vitro models. But despite some advancements with adult tissues, when it comes to modelling the early developmental processes of the human embryo, things become complicated.

Now, scientists at EPFL's Institute of Bioengineering have simulated aspects of embryo formation in vitro starting from embryonic stem cells.

"A tricky problem in reliably constructing tissues outside of an organism in general is how to present key signaling molecules, also termed morphogens, to the cells in culture at the right time and dose," says EPFL Professor Matthias Lütolf, whose group led the study. "Simply exposing a collection of stem cells to a single concentration of a morphogen ends in uncontrolled morphogenesis because the cells lack important instructions."

But in a developing embryo, stem cells receive a highly dynamic range of morphogen concentrations from so-called "signaling centers". It is this gradient of morphogens that tells stem cells what type of specialized cell and tissue to become.

To implement this principle, Dr Andrea Manfrin in Lütolf's lab developed a method for exposing human embryonic stem cells in culture to gradients of morphogens, mimicking the real-life conditions of gastrulation - an early stage of the developing embryo where its cells begin to transform into different cell types and tissues.

The method involves growing the stem cells in a microfluidic device, which is a chip with small channels that allow the precise control of tiny amounts of fluid. The researchers grew stem cells in a culture chamber on the microfluidic chip, and were able to expose them to carefully controlled concentration gradients of various morphogens.
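The physics behind such a chip can be sketched with a toy one-dimensional model. This is illustrative only and does not describe the EPFL device: if one channel holds a morphogen at a fixed source concentration and another acts as a sink, diffusion across the culture chamber relaxes to a stable linear gradient between the two boundary values.

```python
# Hypothetical 1D sketch of a microfluidic concentration gradient: diffusion
# between fixed "source" and "sink" boundary concentrations converges to a
# linear steady-state profile. All concentrations are arbitrary units.

def steady_state_gradient(c_source, c_sink, n_points=11, n_steps=20000):
    """Relax a 1D diffusion profile with fixed ends until it converges."""
    c = [c_source] + [0.0] * (n_points - 2) + [c_sink]
    for _ in range(n_steps):
        # Jacobi relaxation of the diffusion equation on interior points.
        c = [c[0]] + [0.5 * (c[i - 1] + c[i + 1])
                      for i in range(1, n_points - 1)] + [c[-1]]
    return c

profile = steady_state_gradient(c_source=100.0, c_sink=0.0)
print([round(v, 1) for v in profile])
```

Cells sitting at different positions along such a profile see different morphogen doses, which is how a single chamber can produce the position-dependent cell fates described above.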

The results were impressive: the cells developed and organized into domains of different cell types, depending on the concentration they were exposed to, like they do in the body. In fact, the scientists report that they were able to successfully mimic aspects of gastrulation, paving the way for growing specific human tissues in the lab in a more controlled manner.

"We hypothesized that engineering an artificial signaling center 'ex vivo' could allow us to steer the self-organization of a stem cell population towards a desired outcome," explains Manfrin. "This has obvious advantages for tissue and organ engineering."

These advantages include new tools for drug testing and regenerative medicine. The new technique can also help scientists study processes related to developmental biology - like gastrulation - and could provide alternatives to animal experimentation in some areas of research.

"One of our long-term goals is to engineer organs for transplantation," says Lütolf, who is already working with groups at the Lausanne University Hospital (CHUV) and elsewhere to generate miniaturized organs ('organoids') from patient-derived cells. "We are still far from growing functional organs in a dish; but recent progress in stem cell biology and bioengineering make me optimistic that this can become a reality. The key is to better understand how cells themselves build tissues and organs in the embryo".

Credit: 
Ecole Polytechnique Fédérale de Lausanne