Gene activity database could spare thousands of mice

A comprehensive database of gene activity in mice across ten disease models, which could significantly reduce animal use worldwide, has been developed by scientists at the Francis Crick Institute. It gives a full picture of the immune response to different pathogens.

The data, published in Nature Communications and available through an online app, shows the activity of every mouse gene - more than 45,000 genes - in the blood of mice with ten different diseases. For the six diseases that involve the lung, lung samples were also examined.

Previously, to study genes of interest, researchers would have to breed and infect mice, cull them, obtain samples, and extract and sequence the RNA. Using a new app which the lab created for this study, researchers can check the activity of any gene across a range of diseases without needing their own mice. This could prevent thousands of mice from being used in individual experiments.

The research team, led by Crick group leader Anne O'Garra and co-ordinated by Christine Graham, worked with many collaborators from the Crick, the UK and the USA. They used next-generation sequencing technology, 'RNA-seq', to measure gene activity across the different diseases. As genes must be transcribed from DNA into RNA in order to function, analysing the RNA reveals how active each gene is - in this case after infection or allergen challenge.

"Gene activity can show us how the body responds to infections and allergens," explains Anne. "There are thousands of genes involved in any immune response, so Akul Singhania, a Bioinformatics Postdoc, in our lab used advanced bioinformatics approaches to cluster the genes into modules. These modules represent clusters of genes that are co-regulated and can often be annotated to determine their function and known physiological roles. For example, of 38 lung modules there is a module associated with allergy, and seen only in the allergy model, containing over 100 genes and another module associated with T cells containing over 200 genes."

"By sequencing both lung tissue and blood, we can also see how the immune response in the blood reflects the local response in the lung, and vice versa. This will help us to understand what we can learn from genetic signatures in the blood, since for most diseases doctors can't realistically get lung samples from patients."

A panoply of pathogens

Using the new app, researchers anywhere in the world can look up gene activity in the lungs and blood of mice infected with a range of pathogens: the parasite Toxoplasma gondii, influenza virus and Respiratory Syncytial Virus (RSV), the bacterium Burkholderia pseudomallei, the fungus Candida albicans, or the allergen, house dust mite. They can also see gene activity in the blood of mice with listeria, murine cytomegalovirus, the malaria parasite Plasmodium chabaudi chabaudi, or a chronic Burkholderia pseudomallei infection.

In the study, the research team analysed the genetic signatures associated with these diseases to help understand the immune response. They discovered a broad range of immune responses in the lung, where discrete modules were dominated by genes associated with Type I or Type II interferons, IL-17 or allergy type responses. Type I interferons are known to be released in response to viruses, while Type II interferon (IFN-γ) activates phagocytes to kill intracellular pathogens, and IL-17 attracts neutrophils causing early inflammatory immune responses. Interestingly, interferon gene signatures were present in blood modules similarly to the lung, but IL-17 and allergy responses were not.

Surprisingly, genes associated with type I interferon were highly active in both the lungs and blood of mice infected with the Toxoplasma gondii parasite and also seen in response to the Burkholderia pseudomallei bacterium, albeit to a lesser extent. This challenges the view that type I interferon-associated genes are necessarily indicative of viral infections, as the lab had previously shown in tuberculosis.

"We found that mice without functioning interferon pathways were less able to fight off Toxoplasma infection. This was true for both Type I and Type II interferons, which have a complex relationship with each other. We found that both play a key role in protection against the parasite in part by controlling the neutrophils in the blood which in high numbers can cause damage to the host."

From obsolescence to opportunity

The research project began in 2009, using a technique known as microarray to detect gene activity in lung and blood samples, and was almost complete and ready to be analysed by 2015. Microarray was then a well-established technique, but the necessary reagents were suddenly discontinued by the manufacturer before the final samples had been processed. Without the reagents to finish the analysis, the project was in trouble.

With this microarray technology no longer possible, the team needed a different approach. At this time, a technique called RNA-Seq had come onto the market, offering a better way to quantify gene activity.

Following negotiations between Anne and the manufacturer, her team was offered cutting-edge RNA-Seq reagents free of charge to reprocess the samples, starting in late 2016. They were also provided with storage space for the huge amounts of data generated.

As the tissue and blood samples from the microarray experiments were all frozen in storage, Christine Graham in Anne's lab was able to go back to the same samples and heroically process them again, this time for RNA sequencing. Thanks to the excellent storage of the samples, this was possible without use of additional animals. Although time-consuming and a huge task for Christine, by 2018 the team had all the sequencing data they needed.

With a huge amount of data to process, Akul Singhania set about making sense of it all. Using advanced bioinformatics techniques, he clustered the thousands of genes and millions of data points into a meaningful, visual form - the modules described above - and created the app to make the data accessible to anyone.

"Ten years since the project began, we now have an open access resource of gene expression that anyone in the world can use to look up their favourite genes and also see if they are regulated by type I or type II interferon signalling," says Anne. "Nobody said science was easy, but it's certainly worthwhile."

Credit: 
The Francis Crick Institute

Going the distance: Brain cells for 3D vision discovered

image: This is a 3D neuron captured under a microscope.

Image: 
Newcastle University, UK

Scientists at Newcastle University, UK have discovered neurons in insect brains that compute 3D distance and direction. Understanding these neurons could help improve vision in robots.

In stunning images captured under the microscope for the first time, the neurons were found in praying mantises. The work is published in Nature Communications today.

In a specially-designed insect cinema, the mantises were fitted with 3D glasses and shown 3D movies of simulated bugs while their brain activity was monitored. When the image of the bug came into striking range for a predatory attack, scientist Dr Ronny Rosner was able to record the activity of individual neurons.

Dr Rosner, Research Associate in the Institute of Neuroscience at Newcastle University, is lead author of the paper. He said: "This helps us answer how insects achieve surprisingly complex behaviour with such tiny brains and understanding this can help us develop simpler algorithms to develop better robot and machine vision."

The "3D neurons"

Praying mantises use 3D perception, scientifically known as stereopsis, for hunting. By using the disparity between the images on their two retinas, they are able to compute distances and trigger a strike of their forelegs when prey is within reach.

The neurons recorded were stained, revealing their shape, which allowed the team to identify four classes of neuron likely to be involved in mantis stereopsis.
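The distance computation behind that striking-range judgement can be illustrated with textbook stereo triangulation: the nearer the target, the larger the disparity between the two eyes' images. This is an idealised sketch with invented numbers, not a model of how the mantis neurons actually compute depth.

```python
# Idealised stereopsis arithmetic: depth from binocular disparity via
# triangulation (Z = f * B / d for a rectified stereo pair). All numbers
# are hypothetical; mantis neurons need not implement this equation.

def distance_from_disparity(baseline_m: float, focal_px: float,
                            disparity_px: float) -> float:
    """Distance (m) from eye separation, focal length and disparity."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite distance")
    return focal_px * baseline_m / disparity_px

# "Mantis-scale" guesses: eyes ~1 cm apart, focal length 500 px.
for d in (5.0, 20.0, 80.0):  # disparity in pixels
    z = distance_from_disparity(baseline_m=0.01, focal_px=500.0, disparity_px=d)
    print(f"disparity {d:5.1f} px -> distance {z:.3f} m")
```

The larger the disparity, the nearer the target, which is the kind of signal that could gate a strike once simulated prey enters reach.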

The images captured using a powerful microscope show the dendritic tree of a nerve cell - where the nerve cell receives inputs from the rest of the brain - believed to enable this behaviour.

Dr Rosner explains: "Despite their tiny size, mantis brains contain a surprising number of neurons which seem specialised for 3D vision. This suggests that mantis depth perception is more complex than we thought. And while these neurons compute distance, we still don't know how exactly.

"Even so, as theirs are so much smaller than our own brains, we hope mantises can help us develop simpler algorithms for machine vision."

The wider research programme, which is funded by the Leverhulme Trust, is led by Jenny Read, professor of Vision Science at Newcastle University. She says: "In some ways, the properties in the mantises are similar to what we see in the visual cortex of primates. When we see two very different species have independently evolved similar solutions like this, we know this must be a really good way of solving 3D vision.

"But we've also found some feedback loops within the 3D vision circuit which haven't previously been reported in vertebrates. Our 3D vision may well include similar feedback loops, but they are much easier to identify in a less complex insect brain and this provides us with new avenues to explore."

It's the first time that anyone has identified specific neuron types in the brain of an invertebrate which are tuned to locations in 3D space.

The Newcastle team intend to further develop their research to better understand the computations performed by the relatively simple brain of the praying mantis, with the aim of developing simpler algorithms for machine and robot vision.

Credit: 
Newcastle University

One in 10 people have 'near-death' experiences, according to new study

image: 5th Congress of the European Academy of Neurology

Image: 
European Academy of Neurology

(Oslo, Saturday, 29 June, 2019) Mystical near-death experiences where people report a range of spiritual and physical symptoms, including out-of-body sensations, seeing or hearing hallucinations, racing thoughts and time distortion, affect around 10 per cent of people, according to a new study that analysed participants from 35 countries.

These near-death experiences (NDEs) are equally as common in people who are not in imminent danger of death as in those who have experienced truly life-threatening situations such as heart attacks, car crashes, near drowning or combat situations.

The new findings were presented at the 5th European Academy of Neurology (EAN) Congress by researchers from the Rigshospitalet, Copenhagen University Hospital, University of Copenhagen, Denmark, the Center for Stroke Research, Berlin, and the Norwegian University of Science and Technology, Trondheim, Norway.

Experiences most frequently reported by participants in their study included: abnormal time perception (87 per cent), exceptional speed of thought (65 per cent), exceptionally vivid senses (63 per cent) and feeling separated from, or out of their body (53 per cent).

The study group who reported NDEs variously described feeling at total peace, having their 'soul sucked out', hearing angels singing, being aware they were outside their body, seeing their life flashing before them, and being in a dark tunnel before reaching a bright light. Others spoke of being aware of another's presence before they went to sleep, or of a demon sitting on their chest while they lay paralysed, unable to move.

The team recruited 1,034 lay people from 35 countries via a crowdsourcing platform online (to eliminate selection bias) and asked them if they'd ever had an NDE. If they answered 'yes', they were asked for more details, using a detailed questionnaire assessment tool called the Greyson Near-Death Experience Scale, which asks about 16 specific symptoms.

A total of 289 people reported an NDE, and 106 of those reached a threshold of 7 on the Greyson NDE Scale (which confirms a true NDE). Some 55 per cent perceived the NDE as truly life-threatening and 45 per cent as not truly life-threatening.
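The scale's arithmetic is simple to sketch: the Greyson instrument has 16 items, each scored 0, 1 or 2, and this study treated a summed total of 7 or more as a confirmed NDE. The responses below are invented for illustration.

```python
# Minimal sketch of tallying a Greyson-type questionnaire. The real scale
# has 16 items scored 0-2 (total 0-32); a total >= 7 counts as a
# confirmed NDE in this study. The example responses are invented.

THRESHOLD = 7

def greyson_total(item_scores: list[int]) -> int:
    assert len(item_scores) == 16, "the scale has 16 items"
    assert all(s in (0, 1, 2) for s in item_scores), "items score 0, 1 or 2"
    return sum(item_scores)

responses = [2, 1, 0, 0, 2, 0, 1, 0, 0, 1, 0, 0, 0, 1, 0, 0]  # hypothetical
total = greyson_total(responses)
print(total, "-> confirmed NDE" if total >= THRESHOLD else "-> below threshold")
```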

Far from NDEs being pleasant experiences associated with feelings of peacefulness and wellbeing, as some previous studies have reported, the new study found a much higher rate of people reporting their NDE as unpleasant. Overall, of all the people who claimed an NDE, 73 per cent said it was unpleasant and only 27 per cent said it was pleasant. However, in those with a score of 7 or above on the Greyson NDE Scale (a confirmed NDE), this changed to 53 per cent reporting a pleasant experience and 14 per cent an unpleasant one.

Based on insight gained from previous studies, the researchers found an association between NDEs and Rapid Eye Movement (REM) sleep intrusion into wakefulness. REM sleep is a phase of the sleep cycle where the eyes move rapidly, the brain is as active as when someone is awake, dreaming is more vivid, and most people experience a state of temporary paralysis, as the brain sends a signal to the spinal cord to stop the arms and legs moving. When REM sleep intrudes into wakefulness, some people report visual and auditory hallucinations and other symptoms such as sleep paralysis, where they feel conscious but cannot move.

REM sleep intrusion into wakefulness was found to be more common in people with scores of 7 or above on the Greyson NDE Scale (47 per cent) than in people with scores of 6 or below (26 per cent), or in those who reported no such experiences (14 per cent).

Lead researcher Dr Daniel Kondziella, a neurologist at the University of Copenhagen, said, "Our central finding is that we confirmed the association of near-death experiences with REM sleep intrusion. Although association is not causality, identifying the physiological mechanisms behind REM sleep intrusion into wakefulness might advance our understanding of near-death experiences."

Dr Kondziella said that the 10 per cent prevalence figure of NDE was higher than in previous studies conducted in Australia (8 per cent) and Germany (4 per cent). He said this could be explained by the fact they had been conducted on cardiac arrest survivors rather than unprimed lay people, as in this study.

Dr Kondziella said the study replicated the findings of an earlier study by Nelson et al in 2006 that had been criticised for selection bias, but the new study addressed those potential flaws by recruiting via a crowdsourcing platform.

Credit: 
Spink Health

UMN researcher studies hip fracture probability in women in late life

MINNEAPOLIS, MN - June 27, 2019 - New University of Minnesota Medical School research evaluates the impact of multimorbidity on the probability of hip fractures.

In an article recently published in JAMA Internal Medicine, lead author Kristine Ensrud, MD, Professor of Medicine at the University of Minnesota Medical School, examines the impact of disease definition, comorbidity burden and prognosis on hip fracture probability among women 80 years and older. Late-life women account for the majority of hip fractures in the United States, but are often not screened for osteoporosis. Older age, multimorbidity and poor health are risk factors for hip fracture, but these characteristics also increase the risk of competing, non-fracture related mortality. Thus, clinicians have difficulty identifying late-life women most likely to benefit from drug treatment to prevent hip fractures.

"Older patients with multiple medical conditions or poorer prognosis are excluded from clinical trials, but this is exactly the population that clinicians see," explained Ensrud. "That is why a lot of times when you look at clinical trials, the findings are not necessarily relevant to the patients you take care of."

The study found that late-life women with osteoporosis, including those with comorbidities or poorer prognosis, had a high probability of hip fracture in the next 5 years, even after accounting for competing mortality risk. These results suggest that this group of women may derive a high absolute benefit from initiation of drug treatment to prevent fracture. In contrast, among late-life women without osteoporosis but still considered to be drug treatment candidates by the National Osteoporosis Foundation, mortality probability far outweighed the probability of hip fracture, especially among those with more comorbidities or poorer prognosis. These findings suggest that the absolute benefit of drug treatment to prevent fracture is much lower in this latter patient population.
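The phrase "accounting for competing mortality risk" has a concrete computational meaning: a woman who dies of another cause can no longer fracture a hip, so naive survival-curve methods overstate fracture probability. The sketch below shows a discrete Aalen-Johansen-style cumulative incidence calculation on invented data; it illustrates the concept only and is not the study's analysis.

```python
# Cumulative incidence of hip fracture with death as a competing risk
# (discrete Aalen-Johansen-style estimate). The event data are invented.

# (time_in_years, event): 0 = censored, 1 = hip fracture, 2 = death
data = [(0.5, 2), (1.0, 1), (1.5, 0), (2.0, 2), (2.5, 1),
        (3.0, 2), (3.5, 0), (4.0, 1), (4.5, 2), (5.0, 0)]

event_times = sorted({t for t, e in data if e != 0})
surv = 1.0            # probability of being event-free just before t
cif_fracture = 0.0    # cumulative incidence of fracture
for t in event_times:
    n_at_risk = sum(1 for ti, _ in data if ti >= t)
    d_frac = sum(1 for ti, e in data if ti == t and e == 1)
    d_death = sum(1 for ti, e in data if ti == t and e == 2)
    cif_fracture += surv * d_frac / n_at_risk   # fractures that actually occur
    surv *= 1.0 - (d_frac + d_death) / n_at_risk

print(f"5-year fracture probability, competing death accounted for: {cif_fracture:.2f}")
```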

Ensrud hopes this kind of study might inform treatment guidelines, making them less likely to focus on a single disease in isolation and more patient-focused.

"Patients have multiple diseases and clinicians can't just solely focus on osteoporosis or cardiovascular disease. They have to focus on the whole patient."

Credit: 
University of Minnesota Medical School

Despite the ACA, millions of Americans with cardiovascular disease still can't get care

Cardiovascular disease (CVD) is the leading cause of death for Americans, yet millions with CVD or cardiovascular risk factors (CVRF) still can't access the care they need, even years after the implementation of the Affordable Care Act (ACA).

In a new study published today in the Journal of General Internal Medicine, researchers at Harvard Medical School found the ACA increased insurance coverage and improved access to medical care for Americans with CVD such as heart disease and stroke, and for those with CVD risk factors, including smokers and those with diabetes and obesity. However, three years after the ACA took full effect, 20.6 million remained uninsured, and many were unable to get regular medical care due to cost. The study is the first to document the effect of the ACA on Americans with CVD and CVRF, a group with complex health care needs whose members face significant health consequences when they lack coverage.

Researchers analyzed nationally representative data on more than one million adults aged 18 to 64 with a range of cardiovascular conditions, comparing data from before ACA implementation (2012-13) to data after the ACA was in effect (2015-16). They estimated that nearly 7% of those with CVD or CVRF gained insurance coverage in the first three years after the ACA went into effect, and that the ACA slightly narrowed previous disparities in coverage for ethnic and racial minorities.

Insurance coverage increased most in states that expanded Medicaid, from 81% to 89%. In non-expansion states, coverage increased more modestly, from 76% to 81%. Post-ACA, the percentage of chronically ill people with insurance ranged from a high of 94% in Massachusetts to a low of 73% in Texas. In addition to increases in coverage, the study found that post-ACA, Americans with CVD were less likely to forgo a doctor visit due to cost, and were more likely to have a check-up in the last year, and to have a primary care physician.

Despite these gains, researchers discovered major gaps in coverage and access to care: Nearly one-quarter of low-income Americans with a CVD or CVRF -- more than 20 million people -- still lacked coverage, and approximately 13% of black and 29% of Hispanic adults in the U.S. with CVD/CVRF remain uninsured. Among Hispanic people with CVD/CVRF in non-Medicaid expansion states, 42% still lack insurance, 25% could not afford a doctor visit, 40% did not have a checkup in the last year, and 48% did not have a personal physician.

"Patients with existing cardiovascular conditions require regular medical care and daily medications to prevent another heart attack or stroke," said lead author Ameen Barghi of Harvard Medical School. "The good news is that the medications for cardiovascular disease are very effective, and millions of Americans gained some coverage under the ACA. Unfortunately, despite these gains, millions of Americans with these conditions remain uncovered, and many of those will likely suffer serious complications and even death because they cannot get the care they need."

"The data shows that insurance coverage increased more in states that opted to expand Medicaid, and also that coverage rates were already lowest in non-expansion states before the ACA, highlighting the importance of the Medicaid expansion for people with serious pre-existing medical conditions," said Dr. Hugo Torres, a study author and also a physician at Massachusetts General Hospital and Instructor of Medicine at Harvard Medical School. "If more states expanded Medicaid, many more people with cardiovascular disease would be covered, particularly among people with low incomes, and in communities of color."

Dr. Torres notes that these findings should be considered in light of the Trump administration's intention to repeal and replace the ACA with a more "market-based" system.

"Repealing or weakening the ACA would strip coverage from millions of Americans with cardiovascular disease or cardiovascular risk factors, spelling disaster for many of them," said the study's senior author, Dr. Danny McCormick, a physician at Cambridge Health Alliance and Associate Professor at Harvard Medical School. "However, today's piecemeal approaches to coverage, including the ACA, will not get all these patients covered. Only a comprehensive Medicare-for-All plan could provide both coverage and good access to care for everyone with a cardiovascular disease or risk factor. Polls show that such reform is popular with the American people, and increasingly on the agenda of many elected officials."

Credit: 
Physicians for a National Health Program

Mum's workplace exposure to solvents may heighten child's autism risk

A mother's workplace exposure to solvents may heighten her child's risk of autism, suggests research published online in the journal Occupational & Environmental Medicine.

While cautious interpretation is needed, the findings add to a growing body of evidence indicating that environmental and workplace factors may be linked to the development of the condition, say the researchers.

Autism is a neurodevelopmental disorder that includes repetitive behaviours and difficulties communicating and socialising with others. In the US alone, recent figures indicate that one in every 68 children is on the autistic spectrum.

The speed with which the number of new cases has increased suggests that factors other than genes may be involved, say the researchers. And several studies have suggested links between prenatal exposure to environmental chemicals and pollutants and the development of the condition.

As workplace exposures are often higher than environmental ones, the researchers wanted to explore if these might potentially be linked to the development of autism.

They drew on data collected for the CHildhood Autism Risks from Genetics and Environment (CHARGE) study. These included personal, health, and job history information for the parents of 537 children formally diagnosed with autism spectrum disorders and 414 children with typical neurodevelopment.

For each job, specialists (industrial hygienists) assessed the intensity and frequency of 750 mothers' and 891 fathers' exposure to 16 agents that have been linked to neurological and/or congenital abnormalities, from three months before pregnancy through to the birth of the child.

These included medicines, metals, pesticides, anaesthetics, asphalt, brake fluid, plastics and polymers, radiation, cleaners/disinfectants, and solvents, including paints and degreasers, as well as other chemicals.

Exposure levels were classified as none; rare (a few times a year); moderate (weekly); and frequent (several times a week/daily). Intensity was categorised as low, moderate or high, the last of which was taken to mean well above background levels.

The most common occupational exposures among the mums were to disinfectants/cleaners, solvents and ethylene oxide; the least common were to perchlorates, asphalt, PCBs, and machine fluids.

For dads, the most common occupational exposures were to disinfectants/cleaners, solvents and metals; the least common were to perchlorates, PCBs, and asphalt.

Mums with autistic children had been more frequently exposed to solvents than those whose children weren't on the spectrum, the findings showed.

Mums exposed to solvents were 1.5 times as likely to have a child on the autistic spectrum. And moderate intensity cumulative exposure to solvents was associated with a near doubling in risk.

None of the other agents was associated with heightened risk in either parent or when the exposures of both parents were combined.

This is an observational study, and as such, can't establish cause. Only a few parents had been exposed to some of the agents, and after correcting for statistical bias, the observed associations no longer remained significant.

"However, these results are consistent with earlier reports that have identified solvents as a potential risk factor for [autism spectrum disorders]," write the researchers. Solvents can be absorbed through the skin and lungs, and many remain in the body, including in the brain, they point out.

Credit: 
BMJ Group

Menstrual symptoms linked to nearly 9 days of lost productivity through presenteeism every year

Menstrual period symptoms may be linked to nearly nine days of lost productivity every year through presenteeism, suggests the largest study of its kind, published in the online journal BMJ Open.

But the real impact on women and society is underestimated and poorly appreciated, say the researchers.

They set out to evaluate lost productivity associated with menstrual symptoms, as measured by time off from work or school (absenteeism) and working or studying while feeling ill (presenteeism) in 32,748 Dutch women between the ages of 15 and 45.

All the participants were recruited through social media from July to the end of October 2017.

They filled in a comprehensive online questionnaire about the frequency and length of their menstrual cycle and the severity of any associated symptoms, measured using a validated pain score (visual analogue scale or VAS for short).

They were asked if their symptoms had prompted them to take time off from work or school and/or had affected their productivity while working or studying, as well as how often this had happened.

Their blood loss lasted an average of 5 days. Menstrual symptoms prompted nearly a third (31%) of the women to visit their family doctor, and around one in seven (14.4%) to see a gynaecologist.

In all, around one in seven respondents (just under 14%, 4514) said they had taken time off from work or school during their period, with nearly 3.5% (1108) saying this happened during every, or nearly every, menstrual cycle.

The average amount of sickness leave taken came to just over one day a year.

Younger women under the age of 21 were around three times more likely to say they had taken time off because of their menstrual symptoms than were older women.

Most (just under 81%; 26,438) respondents reported presenteeism, and said that they were less productive as a result of their symptoms. In all, productivity was below par on more than 23 days out of the working/study year, with lost productivity amounting to almost 9 days each year.

The researchers calculated that, on average, each woman was less productive for a third of the time (33%), because of menstrual symptoms.
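As a rough check on that arithmetic, multiplying the average number of symptomatic days by the average productivity reduction gives a ballpark for whole days of lost output. The inputs below are assumptions read off the article's rounded figures, and a product of two averages will not exactly reproduce the per-respondent figure reported in the paper.

```python
# Ballpark reproduction of the presenteeism arithmetic reported above.
# 23.2 days is an assumed reading of "more than 23 days"; 0.33 is the
# reported average productivity reduction.
avg_presenteeism_days = 23.2
avg_productivity_loss = 0.33

naive_lost_days = avg_presenteeism_days * avg_productivity_loss
print(f"Naive estimate of lost output: {naive_lost_days:.1f} days/year")
# Prints ~7.7 days. The study's "almost 9 days" comes from averaging each
# respondent's own (days x loss); the average of products generally
# differs from the product of the two averages.
```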

Notably, when women called in sick because of menstrual symptoms, only one in five (20%; 908) told their employer or school the real reason for their absence. And around two thirds (just under 68%; 22,154) of respondents said they wished they had the option of more flexible work/study hours during their period.

This is an observational study, and as such can't establish cause. The method of recruitment may also have introduced an element of 'selection bias' whereby those with debilitating symptoms might have been more likely to take part; the analysis also relied on what women said rather than objective assessment.

But the researchers nevertheless conclude: "Menstruation-related symptoms cause a great deal of lost productivity, and presenteeism is a bigger contributor to this than absenteeism.

"Taking all the symptoms into account, it seems likely that the real impact of [menstruation related symptoms] is underestimated in the general population."

But it's not openly talked about, they add. "Despite being almost two decades into the 21st century, discussions about [symptoms] may still be rather taboo."

Credit: 
BMJ Group

What agility and agile mean for you

image: Official logo of the Society for Industrial and Organizational Psychology

Image: 
SIOP

Agility and Agile: An Introduction for People, Teams, and Organizations is the latest in the Society for Industrial and Organizational Psychology (SIOP) white paper series. Written by Ben Baran, assistant professor of management at Cleveland State University and co-founder and principal of the consulting firm Indigo Anchor, and Scott Bible, assistant professor of practice at The University of Akron, this paper provides an overview of the increasingly common terms "agility" and "agile" along with practical implications for leaders who are operating in complex, changing environments.

"This topic is both timely and important," said Baran. "Executives, their teams, and other leaders within organizations are increasingly looking for new ways to deal with the turbulence they're facing. They want to be nimble to succeed amid shifting market conditions, and the concepts of agility and agile describe a mindset and a set of practices and principles that can help."

"Additionally, the field of industrial and organizational psychology has much to offer with regard to how organizations can build their agility," he added. "In particular, what we know about how people make sense of complex environments, how they can build and sustain high-performance teams, and how all of that can fit together within an organization can be of great benefit to any executive who is trying to make his or her organization increasingly adaptive and proactive."

In addition to providing an overview and background on the topic, Baran and Bible discuss eight specific implications for executives and other organizational leaders. Each implication stems from a combination of the research literature on these topics, the authors' experience working with companies, and emerging practices from a wide range of sectors and industries including the military and Silicon Valley.

Credit: 
Society for Industrial and Organizational Psychology

What journalism professors are teaching students -- about their futures

image: Max Besbris is an assistant professor of sociology at Rice U.

Image: 
Rice Department of Sociology

HOUSTON - (June 27, 2019) - As the journalism industry rapidly evolves, what are professors in the field telling students about their job prospects?

A new study from Rice University and Rutgers University finds educators are encouraging aspiring journalists to look for work outside the news business.

"Professionalizing Contingency: How Journalism Schools Adapt to Deprofessionalization," will appear in an upcoming edition of Social Forces. Authors Max Besbris, an assistant professor of sociology at Rice, and Caitlin Petre, an assistant professor of journalism and media studies at Rutgers, conducted the study in response to the massive transformations taking place in journalism, particularly in the field's labor market.

"The post-Watergate media era where you would work for a local paper or TV station and work your way up to retirement with a nice pension is behind us," Besbris said. "Now, papers are shutting down, news outlets are consolidating, and information is widely available on the internet. We wanted to see how these drastic changes in media and media consumption over the past 20 years were impacting journalism education."

For the study, Besbris and Petre conducted in-depth interviews with 113 faculty, staff and administrators from 44 U.S. journalism programs that varied in size, prestige, location and other factors. The authors argue that journalism schools have sought to reframe the industry's unstable labor market as an inevitable and even desirable part of the business and its professional identity.

"Professional schools in general seem to be a means by which we can get a good career," Besbris said. "A medical degree is a pretty clear path, as is the path of a social worker or engineer. However, journalism is a less defined profession and you don't need a license to practice. That's an interesting aspect of this case. Master's degrees are on the rise but more of them -- including journalism degrees -- don't necessarily offer a clear path to a secure career."

Indeed, the authors found that journalism educators are "very aware" and sensitive to changes in the industry. The majority interviewed said they accept the changes in the field as a reality and see no way of returning to old models. They also agreed that students must move away from thinking about journalism as a coherent career path and instead must accept the precarious nature of their jobs.

"They're telling their students that they don't have to, in fact shouldn't, go work for traditional news organizations -- they can do temporary, contract or freelance work, or work for non-news corporations, the government, NGOs (nongovernmental organizations) or almost any other place," Besbris said. "For a long time journalism had been trying to cultivate the difference between journalism and PR (public relations), so it was really interesting to see this change in thinking, and hear individuals say that students should prepare to work as journalists in non-news organizations."

Besbris also said most of the educators they interviewed stressed that students should be "as entrepreneurial as possible" and be willing to start their own businesses or websites. They encouraged students to not only become good writers or photojournalists, but also develop the skills to do just about anything from writing and editing to recording and designing.

"Many of these J-school professors are telling students to learn to hustle, be game for anything and even to celebrate the precariousness of the labor market," Besbris said.

To be sure, there's pushback from some instructors, Besbris said. Some of those interviewed were "very upset" about the changes taking place in their schools and within the industry. However, Besbris said, those people -- who were mostly Ph.D.s with little or distant experience in the field -- comprised a small minority.

Besbris and Petre hope the research will illuminate how professional schools writ large are adjusting to labor market instability in the fields for which they're training students.

Credit: 
Rice University

NIST presents first real-world test of new smokestack emissions sensor designs

video: In the NIST tests, four probes called pitot tubes are inserted horizontally into the smokestack, each taking flow measurements at four spots, for a total of 16 measurements used to test the precision and accuracy of a new pitot tube design and measurement method.

Image: 
Sean Kelley/NIST

In collaboration with industry, researchers at the National Institute of Standards and Technology (NIST) have completed the first real-world test of a potentially improved way to measure smokestack emissions in coal-fired power plants. The researchers are presenting their work this week at the 2019 International Flow Measurement Conference (FLOMEKO) in Lisbon, Portugal.

Each year, to meet requirements set by the Environmental Protection Agency (EPA), coal-fired power plants must have their smokestack emissions audited, or checked by an independent third party. NIST researchers wanted to make this test quicker to save the plants money during their audits, while also improving accuracy of the sensors. So, a NIST team has designed new probes for sensing emission flow rates and a new measurement method that could potentially speed up on-site audits by a factor of 10, researchers say.

The fieldwork results were "promising," said NIST engineer Aaron Johnson, and were in reasonable agreement with the laboratory findings. "We were surprised; it did quite well compared to what the EPA has on its books as its 'best practices' method."

To monitor emissions from coal-fired power plants, technicians need to measure the rate at which flue gas is emitted from the smokestack. The flow inside the smokestack contains eddies and swirls but generally travels upwards. In the NIST tests, four probes -- called pitot tubes -- are inserted horizontally into the smokestack.

The four probes each take a flow measurement at four different spots, for a total of 16 measurements. With this information, NIST scientists could test the precision and accuracy of a new pitot tube design and measurement method.

NIST conducted this work as part of a cooperative research and development agreement (CRADA) with the Electric Power Research Institute (EPRI), an independent nonprofit organization whose members include electric utility companies, businesses and government agencies.

"Coal-fired electric generating units may benefit from the current NIST work by having improved standards and techniques to measure mass emissions more accurately, with increased confidence that all entities are reporting on a uniform basis," said EPRI program manager Tom Martz. He added that the potential time savings "is not something we can accurately quantify at this time, but this will be a key objective of future work."

The ultimate goal is to provide research that the EPA might someday develop into a new standard for smokestack emissions calibration.

"The advantages for industry are that it will reduce test time and cost and has the potential to be more accurate" than current industry-standard probes, Johnson said.

Even if the EPA does not create a new standard, however, the work could have benefits for industry by providing power plant companies with more choices for managing their emissions tests. "Our goal is to get it written as an EPA standard," Johnson said. "But it's still up to industry members to decide whether they would want to use it."

Go With the Flow

Smokestacks at coal-fired power plants are equipped with monitors that continuously measure the concentration of flue gas emissions, which include carbon dioxide, mercury, sulfur dioxide and nitrogen oxides, as well as the flow rate of the flue gas. By federal law, the built-in flow-rate sensors need to be calibrated -- that is, checked for accuracy -- during the annual audit.

To conduct the yearly calibration, auditors use small portable devices called pitot tubes. The audit technicians climb the stack - usually several dozen meters (hundreds of feet) tall - and insert their pitot probes horizontally into the gases churning their way up the smokestack. They take several readings of the flow at various points within a cross section of the stack, which is typically 7 or 8 meters (25 feet) in diameter.

By far the most common kind of sensor used for this work is an "S-probe." It has two holes, or ports. One port faces directly into the flow of gas and detects the pressure that builds up in the tube. The other port faces the opposite direction. The faster the flow, the higher the pressure difference between the two ports; measuring this difference in pressure allows auditors to calculate the flow's speed.
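The pressure-to-speed conversion an S-probe relies on follows from Bernoulli's principle: the upstream-facing port sees an extra dynamic pressure of half the gas density times the velocity squared. The sketch below inverts that relation; the probe coefficient and gas values are illustrative assumptions, not a compliance calculation.

```python
# Flow speed from pitot differential pressure via Bernoulli's relation:
# delta_p ~= (1/2) * rho * v^2, scaled by a probe calibration coefficient
# (a value around 0.84 is commonly cited for S-type probes). Numbers are
# illustrative only.
import math

def flow_speed(delta_p_pa: float, gas_density_kg_m3: float,
               probe_coeff: float = 0.84) -> float:
    """Axial flow speed (m/s) from differential pressure (Pa)."""
    return probe_coeff * math.sqrt(2.0 * delta_p_pa / gas_density_kg_m3)

# Hypothetical flue-gas reading: 120 Pa differential, hot gas ~0.9 kg/m^3.
print(f"flow speed ~= {flow_speed(120.0, 0.9):.1f} m/s")
```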

S-probes don't require calibration, but each measurement can take several minutes, since the technician has to manually rotate the sensor until one side is facing directly into the flow. This is complicated because the flow is not necessarily traveling directly upward at the point being tested. At the base of the stack, flue gas usually travels around a sharp bend, which creates complicated eddies and swirls that don't go away even in tall smokestacks.

Using S-probes is so labor-intensive that an on-site annual calibration can take a day or more to complete. "And the power plant is losing money all the time the auditors are there, so they want the technicians in and out as fast as possible," Johnson said.

To speed up this process, the NIST scientists have made three innovations. First, they have created two new models of pitot tubes, with five holes instead of two, which perform better than S-probes and may offer advantages over other five-hole models of pitot tubes currently in use.

The probes, designed by NIST physicist Iosif Shinder, come in two shapes: hemispherical and conical.

Second, the scientists have developed a calibration scheme for their new sensors that does not require a technician to rotate the probe inside a smokestack to find the true direction of the flow for each measurement. So, although the sensors would have to be calibrated before use, they would take much less time to use during an actual audit.

Third, NIST's Jim Filla developed software that is compatible with a commercially available automated system to measure flow in real time.

The Real Deal

Until now, the new probes' performance had been measured only at NIST's test facility, which includes a scale-model smokestack simulator and a wind tunnel. But NIST's laboratories cannot replicate all aspects of a real power plant, such as the presence of soot in the smokestack's flow.

"It's one thing to test it in our wind tunnel," Johnson said. "It's another to prepare to test it in a stack that's 120 degrees F."

The first field run, in July 2018, took place at a natural gas plant, where flow is relatively straightforward to measure.

The second, in September 2018, was conducted at a coal-fired power plant with a particularly complicated flow.

The coal-fired plant had an enclosed platform where the pitot tubes were inserted into the smokestack. But the natural gas plant's platform was open to the elements. And at roughly 45 meters (145 feet) in the air, "things shake," said NIST technician Joey Boyd. "While you're working, the stack is swaying, and the floor beneath you is moving."

When NIST researchers analyzed the data, their results were promising, agreeing to within 2% with their laboratory findings.

"The probes performed equally well in the smokestack as they did at NIST's test facility," Johnson said.

Future field tests will help the researchers solve the biggest problem they had: sensor clogging, in which the pitot tubes' ports get gummed up with water and particulate matter and have to be flushed before a test can continue.

Also, the work taught them they needed to write special software signaling to their equipment every time there was a "purge" - a high-pressure blast of air through the pitot probe that could damage a key part of the apparatus if certain valves were not closed in time.

Credit: 
National Institute of Standards and Technology (NIST)

Functional hair follicles grown from stem cells

image: Hair growth in nude mice transplanted with human iPSC-derived dermal papilla cells that were combined with mouse epithelial cells inside a biodegradable scaffold. Left insert: enlarged outside view. Right insert: fluorescent microscopy image of hair follicles under the skin; cell nuclei (blue), epithelial cells (green), human dermal papilla cells (red).

Image: 
Sanford Burnham Prebys

LOS ANGELES - June 27, 2019 - Scientists from Sanford Burnham Prebys have created natural-looking hair that grows through the skin using human induced pluripotent stem cells (iPSCs), a major scientific achievement that could revolutionize the hair growth industry. The findings were presented today at the annual meeting of the International Society for Stem Cell Research (ISSCR) and received a Merit Award. A newly formed company, Stemson Therapeutics, has licensed the technology.

More than 80 million men, women and children in the United States experience hair loss. Genetics, aging, childbirth, cancer treatment, burn injuries and medical disorders such as alopecia can cause the condition. Hair loss is often associated with emotional distress that can reduce quality of life and lead to anxiety and depression.

Terskikh, an associate professor in Sanford Burnham Prebys' Development, Aging and Regeneration Program and the co-founder and chief scientific officer of Stemson Therapeutics, studies a type of cell called dermal papilla. Residing inside the hair follicle, these cells control hair growth, including hair thickness, length and growth cycle. In 2015, Terskikh successfully grew hair underneath mouse skin (subcutaneous) by creating dermal papilla cells derived from human pluripotent stem cells--a tantalizing but uncontrolled process that required further refinement.

"Our new protocol described today overcomes key technological challenges that kept our discovery from real-world use," says Terskikh. "Now we have a robust, highly controlled method for generating natural-looking hair that grows through the skin using an unlimited source of human iPSC-derived dermal papilla cells. This is a critical breakthrough in the development of cell-based hair-loss therapies and the regenerative medicine field."

The approach detailed in the ISSCR presentation, which was delivered by lead researcher Antonella Pinto, Ph.D., a postdoctoral researcher in the Terskikh lab, features a 3D biodegradable scaffold made from the same material as dissolvable stitches. The scaffold controls the direction of hair growth and helps the stem cells integrate into the skin, a naturally tough barrier. The current protocol relies on mouse epithelial cells combined with human dermal papilla cells. The experiments were conducted in immunodeficient nude mice, which lack body hair.

The derivation of the epithelial part of a hair follicle from human iPSCs is currently underway in the Terskikh lab. Combined human iPSC-derived epithelial and dermal papilla cells will enable the generation of entirely human hair follicles, ready for allogenic transplantation in humans. Distinct from any other approaches to hair follicle regeneration, human iPSCs provide an unlimited supply of cells and can be derived from a simple blood draw.

“Hair loss profoundly affects many people’s lives. A significant part of my practice involves both men and women who are seeking solutions to their hair loss,” says Richard Chaffoo, M.D., F.A.C.S., a triple board-certified plastic surgeon who founded La Jolla Hair MD and is a medical adviser to Stemson Therapeutics. “I am eager to advance this groundbreaking technology, which could improve the lives of millions of people who struggle with hair loss.”

Credit: 
Sanford Burnham Prebys

Seven-country study reveals viruses as new leading cause of global childhood pneumonia

Respiratory syncytial virus (RSV) and other viruses now appear to be the main causes of severe childhood pneumonia in low- and middle-income countries, highlighting the need for vaccines against these pathogens, according to a study from a consortium of scientists from around the world, led by a team at the Johns Hopkins Bloomberg School of Public Health.

Pneumonia is the leading cause of death worldwide among children under 5 years old, with about 900,000 fatalities and more than 100 million reported cases each year. This makes pneumonia a greater cause of childhood mortality than malaria, tuberculosis, HIV, Zika virus and Ebola virus combined.

The study, to be published June 27 in The Lancet, was the largest and most comprehensive of its kind since the 1980s. It included nearly 10,000 children in seven African and Asian countries. After testing for viruses, bacteria, and other pathogens in children with severe hospitalized pneumonia--and in community children without pneumonia--the study found that 61 percent of severe pneumonia cases were caused by viruses led by RSV, which alone accounted for 31 percent of cases.

"Prior to this study, we didn't know which specific viruses and bacteria are now causing most of the severe childhood pneumonia cases in the world, but public health organizations and vaccine manufacturers really need that information to work toward reducing the substantial childhood mortality that pneumonia still causes," says study co-principal investigator Maria Deloria Knoll, PhD, a senior scientist in the Bloomberg School's Department of International Health, and associate director of science at the Johns Hopkins International Vaccine Access Center (IVAC).

Identifying the germs that cause pneumonia is difficult in individual cases and much more so on a scale of thousands of cases, especially in low- and middle-income countries where most pneumonia deaths occur. Researchers in prior pneumonia studies simply lacked the microbiological and analytical resources to produce estimates of the major pneumonia pathogens, Knoll says. And, in the past two decades, many low- and middle-income countries have introduced effective vaccines against known major bacterial causes of pneumonia--Haemophilus influenzae type b and Streptococcus pneumoniae--so the global mix of pathogens causing childhood pneumonia has changed as a result.

The new, IVAC-led study, known as the Pneumonia Etiology Research for Child Health (PERCH) study, included 4,232 cases of severe hospitalized pneumonia among children under 5 years and 5,119 community children without pneumonia during a two-year period. The study was carried out at sites in Bangladesh, The Gambia, Kenya, Mali, South Africa, Thailand, and Zambia.

For their study, researchers took nasal and throat swabs as well as blood, sputum and other fluid samples from cases and controls and tested them for pathogens using state-of-the-art laboratory techniques. Cases for the primary analysis were limited to those whose pneumonia was confirmed by chest X-ray, and children with HIV were considered in a separate analysis because the causes of their pneumonia would likely differ from those without HIV. With analytic methods unique for an etiology study, the researchers compared the pathogens found in samples from severe pneumonia cases to those from other children in the community in order to estimate the likeliest cause of each case. In this way they were able to identify the leading causes of childhood pneumonia among children in these settings.
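The case-control logic at the heart of that comparison can be sketched with a crude attributable-fraction calculation: a pathogen detected far more often in cases than in community controls explains more of the pneumonia. PERCH's actual method was the custom Bayesian analysis (BAKER) described later in the article, so treat this as the underlying intuition only; all counts are invented.

```python
# Crude case-control intuition behind etiology estimation: compare how
# often a pathogen is detected in pneumonia cases vs healthy controls.
# This is NOT the PERCH/BAKER method; the counts below are invented.

def attributable_fraction(case_pos: int, case_n: int,
                          ctrl_pos: int, ctrl_n: int) -> float:
    """Fraction of cases attributable to the pathogen, via Miettinen's
    formula: (detection rate in cases) * (OR - 1) / OR."""
    p_case = case_pos / case_n
    p_ctrl = ctrl_pos / ctrl_n
    odds_ratio = (p_case / (1 - p_case)) / (p_ctrl / (1 - p_ctrl))
    if odds_ratio <= 1.0:
        return 0.0
    return p_case * (odds_ratio - 1.0) / odds_ratio

# Hypothetical detections: a virus common in cases but rare in controls
# explains more cases than one detected often in both groups.
print(f"pathogen A: {attributable_fraction(350, 1000, 60, 1000):.0%} of cases")
print(f"pathogen B: {attributable_fraction(300, 1000, 220, 1000):.0%} of cases")
```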

The researchers concluded that, across all study sites combined, viruses accounted for 61.4 percent of cases, bacteria for 27.3 percent, and Mycobacterium tuberculosis for 5.9 percent. Fungal and unknown causes accounted for the remainder of cases.

RSV accounted for nearly a third of all cases and was the leading cause of severe pneumonia in each of the seven countries studied. Other top causes were rhinovirus, human metapneumovirus, parainfluenza viruses, and S. pneumoniae bacteria.

"We now have a much better idea of which new vaccines would have the most impact in terms of reducing illness and mortality from childhood pneumonia in these countries," says Katherine O'Brien, MD, who led the PERCH study as a professor at the Johns Hopkins Bloomberg School of Public Health and now serves as Director of Immunizations, Vaccines and Biologicals at the World Health Organization.

RSV has long been known as a common and potentially serious respiratory pathogen among children and the elderly. It remains the leading cause of pneumonia in children younger than 1 year in the United States, according to the Centers for Disease Control and Prevention. Several RSV vaccine candidates are being developed and evaluated in clinical trials. A monoclonal antibody therapy, palivizumab, is available for the prevention of RSV disease in children with underlying medical conditions but is not suitable programmatically or financially for widespread use in routine immunization programs.

The analytical technique developed for the study to estimate the cause of individual cases of childhood pneumonia is called the Bayesian Analysis Kit for Etiology Research (BAKER), and is available online as an open-source application for use by other public health researchers.

"Estimating the etiology of pneumonia was like a complex jigsaw puzzle where the picture could only be seen clearly by assembling multiple, different pieces of information using innovative epidemiologic and statistical methods," says Scott Zeger, PhD, Malone Professor of Biostatistics in the Bloomberg School's Department of Biostatistics

Credit: 
Johns Hopkins Bloomberg School of Public Health

Risk prediction model may help determine if a lung nodule will progress to cancer

Bottom Line: A risk prediction model developed using clinical and radiological features could stratify individuals presenting with a lung nodule as having high or low risk for lung cancer.

Journal in Which the Study was Published: Cancer Prevention Research, a journal of the American Association for Cancer Research

Author: Barbara Nemesure, PhD, director of the Cancer Prevention and Control Program and the Lung Cancer Program at Stony Brook Cancer Center in New York

Background: "While lung nodules are not uncommon, a major challenge in the field is determining which nodules will progress to cancer," said Nemesure.

Even though lung and bronchus cancer is the leading cause of cancer mortality in the United States, the five-year survival rate for localized disease is greater than 50 percent, according to recent statistics. However, the majority of lung cancer cases are diagnosed after the cancer has metastasized. "Lung cancer is often asymptomatic in early stages, and the identification of high-risk individuals is a major priority," Nemesure said.

Prior studies in this area include a retrospective analysis of lung cancer patients and an analysis of high-risk individuals undergoing screening for the disease, noted Nemesure. The current study aimed to prospectively predict lung cancer incidence among the general population presenting with a lung nodule, she said.

How the Study Was Conducted and Results: Nemesure and colleagues analyzed data from 2,924 patients presenting with a lung nodule assessed at the Stony Brook Cancer Center's Lung Cancer Evaluation Center between Jan. 1, 2002, and Dec. 31, 2015. Patients were excluded if they had a history of lung cancer or if they were diagnosed with lung cancer within six months of the initial consultation. Participants were randomly assigned to discovery (1,469 patients) and replication (1,455 patients) cohorts. Among them, 171 developed lung cancer over the 13-year period.

Clinical and radiological data were collected to develop a risk prediction model. Using multivariable analyses, the researchers found that the combined variables of age, smoking pack-years, personal history of cancer, the presence of chronic obstructive pulmonary disease, and nodule characteristics such as size, the presence of spiculation, and the presence of a ground-glass opacity, could best predict who would develop lung cancer among the discovery cohort. These factors were combined to develop an overall risk score to stratify patients into high- and low-risk categories.
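A model of the kind described above - several clinical and radiological predictors combined into one score, then dichotomised into high and low risk - is commonly implemented as a logistic regression. The sketch below is generic and trains on synthetic data; it is not the published Stony Brook model, and every variable distribution, coefficient and cutoff is an assumption.

```python
# Generic sketch of a multivariable lung-nodule risk score: fit a
# logistic regression on (synthetic) predictors named after those in the
# article, then split patients into high/low risk. Illustrative only.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(42)
n = 2000
age = rng.normal(65, 10, n)
pack_years = rng.gamma(2.0, 15.0, n)
prior_cancer = rng.integers(0, 2, n)
copd = rng.integers(0, 2, n)
nodule_size_mm = rng.gamma(2.0, 4.0, n)
spiculation = rng.integers(0, 2, n)

X = np.column_stack([age, pack_years, prior_cancer, copd,
                     nodule_size_mm, spiculation])
# Synthetic outcome loosely tied to the predictors (invented weights).
logit = (-8.0 + 0.04 * age + 0.02 * pack_years + 0.6 * prior_cancer
         + 0.5 * copd + 0.15 * nodule_size_mm + 0.8 * spiculation)
y = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))

model = LogisticRegression(max_iter=1000).fit(X, y)
risk = model.predict_proba(X)[:, 1]
high_risk = risk >= np.quantile(risk, 0.8)   # arbitrary cutoff for the sketch
print(f"cancer rate, high-risk group: {y[high_risk].mean():.1%}")
print(f"cancer rate, low-risk group:  {y[~high_risk].mean():.1%}")
```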

When the risk score was applied to the replication cohort, the researchers found that the model could discriminate cancer risk with a sensitivity and specificity of 73 percent and 81 percent, respectively. Compared with individuals in the low-risk category, those in the high-risk category had more than 14 times the risk of developing lung cancer.

Author's Comments: "Through our model, we can identify which individuals with lung nodules should be closely monitored, so that we can catch the disease at an early stage and ultimately reduce the burden of lung cancer deaths," Nemesure said.

"Even though the majority of lung nodules do not progress to cancer, it is still vitally important that patients seek follow-up care," noted Nemesure.

Credit: 
American Association for Cancer Research

3D body mapping could identify, treat organs, cells damaged from medical conditions

WEST LAFAYETTE, Ind. - Medical advancements can come at a physical cost. Often, following diagnosis and treatment for cancer and other diseases, patients' organs and cells can be healed of the disease yet remain damaged by it.

In fact, one of the fastest growing medical markets is healing and/or replacing organs and cells that have already been treated yet remain damaged by cancer, cardiovascular disease and other medical issues. The global tissue engineering market is expected to reach $11.5 billion by 2022. That market involves researchers and medical scientists working to repair tissues damaged by some of the world's most debilitating cancers and diseases.

One big challenge remains for the market - how to monitor and continuously test the performance of engineered tissues and cells to replace damaged ones. Purdue University researchers have come up with a 3D mapping technology to monitor and track the behavior of the engineered cells and tissues and improve the success rate for patients who have already faced a debilitating disease. The technology is published in the June 19 edition of ACS Nano.

"My hope is to help millions of people in need," said Chi Hwan Lee, an assistant professor of biomedical engineering and mechanical engineering in Purdue's College of Engineering, who leads the research team. "Tissue engineering already provides new hope for hard-to-treat disorders, and our technology brings even more possibilities."

The Purdue team created a tissue scaffold with sensor arrays in a stackable design that can monitor electrophysiological activities of cells and tissues. The technology uses the information to produce 3D maps to track activity.

"This device offers an expanded set of potential options to monitor cell and tissue function after surgical transplants in diseased or damaged bodies," Lee said. "Our technology offers diverse options for sensing and works in moist internal body environments that are typically unfavorable for electronic instruments."

Lee said the Purdue device is an ultra-buoyant scaffold that allows the entire structure to remain afloat on the cell culture medium, providing complete isolation of the entire electronic instrument from the wet conditions inside the body.

Lee and his team have been working with Sherry Harbin, a professor in Purdue's Weldon School of Biomedical Engineering, to test the device in stem cell therapies with potential applications in the regenerative treatment of diseases.

Their work aligns with Purdue's Giant Leaps celebration of global advancements in health as part of Purdue's 150th anniversary. Health, including disease monitoring and treatment, is one of the four themes of the yearlong celebration's Ideas Festival, designed to showcase Purdue as an intellectual center solving real-world issues.

Lee and the other researchers worked with the Purdue Research Foundation Office of Technology Commercialization to patent the new device.

Credit: 
Purdue University

One in five haematological cancer patients suffer blood clots or bleeding

New Danish research may help direct focus towards the serious complications that on average every fifth haematological cancer patient suffers. This is according to medical doctor and PhD Kasper Adelborg from Aarhus University and Aarhus University Hospital, who has studied the cases of 32,000 haematological cancer patients between 2000 and 2013. Haematological cancer includes leukaemia, bone marrow cancer and cancers of the lymph nodes.

"This is a broad group of patients with very different disease experiences depending on the type of haematological cancer. Some patients have a particular risk of suffering blood clots, while others have instead a higher risk of bleeding such as e.g. gastrointestinal bleeding," says Kasper Adelborg, before stating that the new knowledge can be used for even better prevention and individualised treatment:

"If a person has a high risk of suffering a blood clot, treatment with anticoagulant medicine can benefit some patients. But anticoagulant medicine is not desirable if the risk of suffering bleeding is higher. This is a difficult clinical problem, but our study can set goals for what carries most weight for each individual type of cancer," he says.

One example is the disease myelodysplastic syndrome (MDS), which is a type of bone marrow cancer. Here the study showed that the risk of bleeding within ten years was approximately fifteen per cent, while the risk of suffering a blood clot was lower.

"This means that doctors who help these patients should be aware that they have a high risk of bleeding and should therefore not prescribe too much anticoagulant medicine," says Kasper Adelborg.

He adds that with each individual patient there is still a need to weigh up the overall risk of a blood clot and bleeding, which includes taking into account the patient's age, medical history, other diseases, lifestyle etc. before choosing a treatment.

Major preventative potential

The new study, which has been published in the Journal of Thrombosis and Haemostasis, corroborates previous studies, though researchers have not previously looked at the entire group of haematological cancer patients together - and no earlier studies covered so many years. Additionally, previous studies have focused solely on either blood clots or bleeding.

Kasper Adelborg emphasises that there are major differences in the prognoses for the different patient groups. For example, only a few children develop a blood clot or suffer bleeding in the years after suffering from leukaemia, while far more patients with, for example, bone marrow cancer develop blood clots and/or bleeding.

"The potential for prevention is particularly large in the latter group," he says.

In relation to the population as a whole, the study shows the heightened risk for haematological cancer patients:

Blood clot in the heart: Forty per cent higher.

Blood clot in the brain: Twenty per cent higher.

Blood clot in the legs and lungs: Over three hundred per cent higher.

Bleeding: Two hundred per cent higher.

Credit: 
Aarhus University