Culture

Poor communication between nurses, doctors a primary reason for patient care mistakes

Communication breakdown among nurses and doctors is one of the primary reasons for patient care mistakes in the hospital. In a small pilot study, University of Michigan researchers learned about potential causes of these communication failures by recording interactions among nurses and doctors, and then having them watch and critique the footage together.

Latent TB treatment: Shorter is better

image: Treatment of latent tuberculosis is set to transform after a pair of studies by Dr. Richard Menzies of the Research Institute of the McGill University Health Centre (RI-MUHC) revealed that a shorter treatment was safer and more effective in children and adults compared to the current standard. These findings are published today in the New England Journal of Medicine.

Image: 
Courtesy of McGill University Health Centre

Montreal, Aug. 1, 2018 - Treatment of latent tuberculosis is set to transform after a pair of studies from the Research Institute of the McGill University Health Centre (RI-MUHC) revealed that a shorter treatment was safer and more effective in children and adults compared to the current standard. These findings are published today in the New England Journal of Medicine.

Led by Dr. Dick Menzies, the study followed 850 children and 6,800 adults with latent TB, a dormant version of the disease that does not cause symptoms but may lead to serious illness if treatment is not provided. The study in children is one of the largest pediatric clinical trials related to TB.

Dr. Menzies' team compared results among latent TB patients who underwent either the current standard treatment of nine months of isoniazid (INH) or a four-month treatment with rifampin. Over 85 per cent of children completed the rifampin treatment without developing active TB, compared with 76 per cent of children who completed isoniazid, two of whom developed active TB. Results were similar in adults; acceptance and completion of rifampin therapy were much better, with significantly fewer serious side effects, particularly drug-induced hepatitis (INH can cause serious liver toxicity, which can prove fatal or require a liver transplant).

In addition to being much shorter, the rifampin treatment produced slightly lower rates of active TB, indicating that it is at least as effective as the nine-month course of INH in preventing TB.

"This four-month therapy is a fundamental game-changer in TB prevention," says Dr. Menzies, who is a respirologist with the MUHC and a professor of Medicine, Epidemiology and Biostatistics at McGill University. "The four month treatment was as effective in preventing TB, safer and more acceptable. We believe this four month rifampin treatment should replace the nine months on INH for most people who need therapy for latent TB."

With patients from nine different countries (Australia, Benin, Brazil, Canada, Ghana, Guinea, Indonesia, Saudi Arabia and South Korea) enrolled in the study, which was supported by the Canadian Institutes of Health Research, Dr. Menzies expects the findings to have a major impact on shaping future global guidelines for treating latent TB.

"These discoveries will fuel a new look at global practices," says Dr. Menzies, who has acted as an advisor to the Public Health Agency of Canada, Citizenship and Immigration Canada, the Centres for Disease Control (CDC), and the World Health Organization (WHO) on TB. "We expect this discovery to have a substantial impact on TB, which remains the number one infectious disease killer globally, causing more deaths than AIDS, malaria, diarrhoeal diseases, or other tropical illnesses."

Treatment of latent TB infection is a key part of the WHO's End TB Strategy and of TB-elimination plans in high-income countries. One-quarter of the global population is infected with latent TB, and 10 per cent of those infected will develop active TB.

Credit: 
McGill University Health Centre

Understanding soil through its microbiome

image: Global warfare between bacteria and fungi.

Image: 
EMBL/Hildebrand/Krolik in collaboration with Campbell Medical Illustration

Soil is full of life, essential for nutrient cycling and carbon storage. To better understand how it functions, an international research team led by EMBL and the University of Tartu (Estonia) conducted the first global study of bacteria and fungi in soil. Their results show that bacteria and fungi are in constant competition for nutrients and produce an arsenal of antibiotics to gain an advantage over one another. The study can also help predict the impact of climate change on soil, and help us make better use of natural soil components in agriculture. Nature publishes the results on 1 August 2018.

Research on the soil microbiome requires scientists to get their hands dirty. Over the course of five years, 58,000 soil samples were collected from 1,450 sites all over the world (40 subsamples per site), carefully selected to be unaffected by human activities such as agriculture. First authors Mohamad Bahram (University of Tartu) and Falk Hildebrand (EMBL), together with a large team of collaborators, set up this massive project, gathered samples, and analysed the 14.2-terabyte dataset. Of the 1,450 sites sampled, 189 were selected for in-depth analysis (see Figure 1), covering the world's most important biomes, from tropical forests to tundra, on all continents.

Global microbial war

Only half a percent of the millions of genes found in this study overlapped with existing data from gut and ocean microbiomes. "The amount of unknown genes is overwhelming, but the ones we can interpret clearly point to a global war between bacteria and fungi in soil," says Peer Bork, EMBL group leader and corresponding author of the paper.

Overall, the bacterial diversity in soil is lower if there are relatively more fungi. The team also found a strong link between the number of antibiotic resistance genes in bacteria and the amount of fungi, especially those with potential for antibiotics production such as Penicillium. Falk Hildebrand: "This pattern could well be explained by the fact that fungi produce antibiotics in warfare with bacteria, and only bacteria with adequate antibiotic resistance genes can survive this."

"The antagonism between fungi and bacteria influences the overall diversity of bacterial communities and determines their genetic repertoire of antibiotic resistance", says Mohamad Bahram. This information can be used to predict the spread of genes that lead to antibiotic resistance in different ecosystems, and via what routes they may reach human pathogens. It could also help predict and pinpoint locations with high levels of natural antibiotics producers.

Regional differences

The team also found regional differences in the distribution of bacteria and fungi. Bacteria are everywhere, with the highest genetic diversity in temperate zones with a moderate climate. Environmental factors such as temperature are most decisive in their relative abundance: they often prefer hot and wet locations. Fungi are usually more prevalent in colder and drier climates like the tundra. They also tend to be more geographically restricted, with differences in populations between continents. This implies that the relative contributions of bacteria and fungi to nutrient cycling are different around the world, and that global climate change may affect their composition and function differently.

Effects of human activity

When comparing data from the unspoiled soil sites with data from locations affected by humans, such as farmland or garden lawns, the ratios between bacteria, fungi and antibiotics were completely different. According to the scientists, this shift in the natural balance - which probably evolved over most of Earth's history - shows the effect of human activities on the soil microbiome, with as-yet-unknown consequences. However, a better understanding of the interactions between fungi and bacteria in soil could help to reduce the use of fertilizer in agriculture, as one could give beneficial microorganisms a better chance of survival in their natural environment.

Credit: 
European Molecular Biology Laboratory

Birds categorize colors just like humans do

image: The rainbow of visible colors varies over a continuous range of wavelengths, but zebra finches break it into discrete colors much like humans, researchers report.

Image: 
Ryan Huang, TerraCommunications LLC

DURHAM, N.C. -- For a small, reddish-beaked bird called the zebra finch, sexiness is color-coded. Males have beaks that range from light orange to dark red. But from a female's point of view, a male's colored bill may simply be hot, or not, new findings suggest.

Due to a phenomenon called categorical perception, zebra finches partition the range of hues from red to orange into two discrete categories, much like humans do, researchers report August 1 in the journal Nature.

The finding comes from a Duke University experiment that tested the birds' ability to tell whether colors are the same or different.

Using different pairwise combinations of eight hues representing the range of male beak colors, the researchers showed 26 females a set of quarter-sized paper discs, some two-toned and some solid colored.

The birds learned that each time they flipped over a two-toned disc with their beak, they found a millet seed treat hidden underneath. If they flipped over a solid colored disc, they got nothing. Picking a particular disc before the others was a sign that a bird perceived it as having two colors rather than one.

Some trials involved color pairs that were closer together on the color spectrum, while others involved pairs that were farther apart.

Females had no difficulty discriminating the most dissimilar pairings. What was interesting was how they treated the various hues in between.

The findings suggest a threshold effect at work -- a sharp perceptual boundary where orange turns to red.

The birds were much better at distinguishing two colors from opposite sides of the boundary than pairs from the same side, even when the pairs were all equally far apart on the color spectrum.

Previous research had shown that zebra finch females prefer red-beaked males to orange ones, because redness correlates with good health.

This study didn't test whether the birds preferred some disc hues over others, cautions Patrick Green, Duke postdoctoral associate and study co-first author. But the results shed light on what might be happening when a female gazes at a potential mate, Green said.

If the birds lump all hues on one side of a certain redness threshold into the same category, then at least when it comes to beak color, females may not be picky about whether a potential mate is Mr. Perfection or Mr. Good Enough.

"What we're showing is: he's either red enough or not," said Duke biology professor and senior author Stephen Nowicki.

The researchers don't yet know if the threshold between what humans perceive as "orange" versus "red" is the same for birds. But the findings lend support to the idea that such color labels have deep biological roots, and aren't just arbitrary divisions shaped by human culture or language.

Categorical color perception in zebra finches isn't likely to be just the result of variation in how well the light-sensitive cells in the birds' eyes distinguish different wavelengths, the researchers say. It may be in their minds.

"What hits the retina is not always what we see," said Duke postdoctoral associate Eleanor Caves. Signals from the retina are conveyed to the brain for decoding.

"We're taking in this barrage of information, and our brain is creating a reality that is not real," Nowicki said.

Categorical perception may be a cognitive shortcut that helps animals make tough decisions in the face of noisy, limited or ambiguous information, Nowicki said.

"Categorical perception -- what we show in zebra finches -- is perhaps one strategy the brain has for reducing this ambiguity," Caves said. "Categories make it less crucial that you precisely interpret a stimulus; rather, you just need to interpret the category that it's in."

Credit: 
Duke University

US opioid use has not declined, despite focus on abuse and awareness of risks

Use of prescription opioids in the United States has not substantially declined over the last decade, despite increased attention to opioid abuse and awareness of their risks, finds a study published by The BMJ today.

The results show that, although opioid use and average dose of opioids levelled off after a peak in 2012-13, all patient groups had a higher average daily dose in 2017 than in 2007, and use was particularly high among patients with a disability.

The US has the highest rate of opioid use in the world, consuming seven times more prescription opioids per person than the UK. An average of 40 people die in the US every day from prescription opioid overdose, and opioid use has been declared a public health emergency.

Recent studies have focused on the sale and supply of opioids, but information on patient demographics is limited. As a result, relatively little is known about opioid use among people outside of the government-provided Medicare insurance scheme.

So a team of US-based researchers used data from a national database of medical and pharmacy claims to examine trends in opioid use among 48 million people with health insurance at any time between 2007 and 2016.

Participants were covered either by commercial (private) insurance, or by Medicare Advantage (cover offered by private insurers on behalf of Medicare).

The majority of non-elderly people in the US are covered by commercial insurance, often through their employer or a family member's employer. Most US citizens aged 65 and older are eligible for Medicare, while others are eligible owing to permanent disability.

The research team took certain information into account, such as age, sex, place of residence, race or ethnicity, and type of medical coverage.

To allow for comparison of doses across different drugs, they used conversion factors from the Centers for Disease Control and Prevention to translate prescriptions of each drug into milligram morphine equivalents (MME).
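
As a rough illustration of that conversion, the sketch below sums a patient's average daily dose in MME from per-drug conversion factors. The factor values and the example prescription are hypothetical placeholders, not the CDC's published table or the study's claims data.

```python
# Illustrative sketch only: converting prescriptions to milligram morphine
# equivalents (MME) with per-drug conversion factors. The factors and the
# example prescription below are assumed placeholder values.

CONVERSION_FACTORS = {   # MME per mg of drug (assumed)
    "hydrocodone": 1.0,
    "oxycodone": 1.5,
}

def average_daily_mme(prescriptions):
    """Sum a patient's average daily dose in MME.

    Each prescription is a tuple of (drug name, strength in mg, pills per day).
    """
    return sum(
        strength_mg * pills_per_day * CONVERSION_FACTORS[drug]
        for drug, strength_mg, pills_per_day in prescriptions
    )

# Example: 10 mg of oxycodone three times a day -> 45 MME per day
print(average_daily_mme([("oxycodone", 10, 3)]))
```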

The researchers found that although the rate of opioid use and average dose of opioids levelled off after a peak in 2012-13, all three insurance groups had a higher average MME dose in 2017 than in 2007.

Disabled Medicare beneficiaries were much more likely to use opioids than others. They were also more likely to take higher daily doses over a longer period of time.

For example, they found that 52% of disabled Medicare beneficiaries used opioids annually, compared with 14% of commercially insured people and 26% of aged Medicare Advantage beneficiaries.

Disabled Medicare beneficiaries aged 45 to 54 had the highest rate of opioid use. In the third quarter of 2012, 45% of disabled Medicare beneficiaries aged 45 to 54 used opioids.

They also report that within the commercially insured group, by far the most commonly dispensed drug was hydrocodone, but in terms of volume, oxycodone and hydrocodone were similar.

During the study period, the average daily observed dose for disabled Medicare beneficiaries using opioids never dropped below 50 MME per day, a level at which odds of overdose are up to four times higher than with doses of less than 20 MME per day.

The researchers point out that this is an observational study so cannot establish cause, and they outline some limitations. For example, the study did not capture all groups of people, including uninsured people, and claims data may have missed prescriptions for people with multiple sources of insurance.

Nevertheless, they say their results make clear that opioid use rates are high in the US compared with other countries.

And they suggest that doctors and patients should consider whether long term opioid use is improving the patient's ability to function, and if not, should consider other treatments, either in addition to or as a replacement for opioids.

Credit: 
BMJ Group

Harmless or hormone disorder? A new test enables quick diagnosis for drinking by the liter

Drinking excessive amounts of fluids can be a medically unremarkable habit, but it could also signify a rare hormone disorder. A new procedure now enables a fast and reliable diagnosis. Researchers from the University of Basel and University Hospital Basel reported these findings in the New England Journal of Medicine.

Drinking more than three liters per day, with a corresponding increase in urination, is regarded as too much. This drinking by the liter - known as "polyuria-polydipsia syndrome" - usually develops over time through habit, or can be a side effect of a mental illness.

Hormone deficiency as a cause

In rare cases, however, it may be caused by diabetes insipidus. This is when the pituitary gland lacks the hormone vasopressin, which regulates the water and salt content in our body. Patients have a decreased ability to concentrate their urine, and therefore lose a lot of fluid and have to increase their fluid intake accordingly to prevent dehydration.

The distinction between "harmless" primary polydipsia and diabetes insipidus is crucial, as the two conditions are treated in fundamentally different ways. Diabetes insipidus must be treated with the hormone vasopressin, while patients with primary polydipsia require behavioral therapy to reduce their habitual drinking. The wrong therapy can have life-threatening consequences: treatment with vasopressin without indication can lead to water intoxication.

Blood test instead of water deprivation test

Previously, the differentiation between these two conditions was made using a "water deprivation test", in which the patient was not allowed to drink any liquid for 16 hours, after which the doctors would interpret the concentration of the urine. However, this test was often misleading and led to a correct diagnosis in only about half of all cases. Furthermore, a 16-hour water deprivation test is extremely unpleasant and stressful for the patients.

A study involving around 150 patients in 11 clinics compared the conventional "water deprivation test" with a new diagnostic method. It consists of a 2-hour infusion with a hypertonic saline solution; after that, the concentration of the biomarker copeptin, which reflects the content of the hormone vasopressin in the blood, is measured in the patients' blood.

Improved diagnosis and therapy

This method has a much higher diagnostic accuracy: 97 percent of all patients were correctly diagnosed and treated quickly. The new test is now available for clinical use.

Credit: 
University of Basel

Researchers uncover molecular mechanisms of rare skin disease

image: Keratinocyte skin cells are common targets of the beta subtype of human papilloma virus. This usually harmless infection causes skin disease in people with rare gene mutations.

Image: 
The Rockefeller University

You're probably infected with one or more subtypes of the human papilloma virus--and, as alarming as that may sound, odds are you will never show any symptoms. The beta subtype of the virus, β-HPV, is widespread in the general population and the least pathogenic; in fact, most carriers don't even know that they have it.

Yet, in people with the rare disease epidermodysplasia verruciformis (EV), some of these viruses cause skin lesions known as flat warts; later in life they can also lead to skin cancer.

Led by Rockefeller scientist Jean-Laurent Casanova, head of the St. Giles Laboratory of Human Genetics of Infectious Diseases, a group of researchers recently elucidated the molecular mechanisms that make people with EV vulnerable to β-HPVs. In a new study, published in the Journal of Experimental Medicine, the scientists trace the disease to changes in a group of proteins that normally protect skin cells from the viruses.

Cracks in the armor

In patients with EV, β-HPVs infect skin cells known as keratinocytes and provoke their proliferation. Searching for genetic causes of the disease, Casanova's team found that it is sometimes associated with mutations in a human gene coding for the protein CIB1.

They observed that individuals with CIB1 deficiency have symptoms identical to those with mutations to EVER1 or EVER2, two genes previously linked to EV. The researchers also noted that patients with EVER1 or EVER2 mutations have very low levels of CIB1, suggesting that the three proteins interact.

The scientists concluded that EVER1, EVER2, and CIB1 form a protein unit that, when functioning correctly, protects keratinocytes from β-HPVs. If any part of this unit is compromised, however, the viruses are able to replicate and lead to pathological transformations of skin cells.

A flawed virus finds a hospitable host

β-HPVs are, in a sense, defective: they fail to make two proteins, E5 and E8, which are produced by all other subtypes of HPV and can occasionally cause common warts. Both E5 and E8 interact with CIB1. Without these proteins, Casanova says, β-HPVs are no match for the CIB1-EVER1-EVER2 unit, which prevents them from harming cells.

Casanova posits that people with EV are vulnerable to β-HPVs because their own genetic irregularities nullify those of the viruses. That is, the viruses can compensate for their missing proteins because the corresponding human defense mechanism is awry.

"When people have mutations in CIB1, EVER1 or EVER2, the ß-HPVs can promote the growth of keratinocytes--and can form warts and cancer--because there's nothing to stop them, even though they are intrinsically defective, lacking E5 and E8" he says.

Credit: 
Rockefeller University

Single-payer plan in New York could cover all without increasing spending

A single-payer health care plan could expand coverage for all New York State residents, but would require significant new tax revenue, according to an analysis released today by the RAND Corporation and the New York State Health Foundation.

A plan outlined by the New York Health Act is likely to increase use of health services as more people receive coverage. But overall health care costs would decrease slightly over time if administrative costs are reduced and state officials slow the growth of payments to health care providers, according to the analysis.

The New York Health Act proposes progressively graduated taxes to fund the plan, but does not specify tax rates or structure. Researchers estimate that possible tax schedules imposed to support a single-payer plan would cut health care payments for most of the state's households, while the highest-income households would pay substantially more than they do today.

"Our analysis finds that a single-payer plan in New York does not have to increase the amount of money spent overall on health care in the state, but it would substantially change who pays for health care," said Jodi Liu, the study's lead author and an associate policy researcher at RAND, a nonprofit research organization. "While we estimate the impacts of the New York Health Act across a number of reasonable assumptions, the actual effects would be subject to many future decisions that ultimately influence cost and who pays."

"There was a great need for an independent, rigorous, and credible analysis of an issue that has arrived center stage for New York State and the nation," said David Sandman, president and CEO of the New York State Health Foundation, a private, statewide foundation. "With a fair and factual assessment in hand, the public and policymakers can make up their own minds about the merits of a single-payer approach."

About the New York Health Act

The New York State Legislature is considering a bill that would create a single-payer plan providing coverage to all state residents. In addition, calls for some type of single-payer health plan have increased at the national level.

As outlined in the legislation, the plan -- to be called New York Health -- would offer comprehensive benefits, except for long-term care benefits that may be included later. Patients would have no deductibles, copayments or other out-of-pocket costs at the point of service for covered services.

The plan would be funded by a new trust fund, which would receive funding from the federal government (in lieu of federal financing for current health programs in the state, if federal waivers are approved), current state and local funding for health care, and revenue from two new state taxes. One would be a payroll tax paid jointly by employers (80 percent) and employees (20 percent). A second tax would be on income not subject to payroll taxes, such as interest, dividends and capital gains.

RAND researchers used a microsimulation model to estimate the plan's effects on health care use, spending, and payments in New York compared to what is expected under the status quo for the years 2022, 2026 and 2031.

Key Findings

The analysis estimates that under the New York Health Act, total health spending in 2022 would be similar to spending under the status quo, and would become 3 percent lower by 2031. The decrease reflects the assumption that provider payment rates would grow more slowly over time under New York Health than under the current health system, as has been the case with other publicly financed health programs.

Researchers estimate that new taxes for health care would need to be about $139 billion in 2022 and $210 billion in 2031 to fully finance New York Health. Under the status quo, the state is expected to collect about $89 billion in taxes from all sources in 2022; thus, the new taxes would be a 156 percent increase in total state tax revenue.

As payments for health care shift from premiums and out-of-pocket payments to progressive taxes, most households in New York could pay less and the highest-income households could pay substantially more, suggests the RAND study. The shift in who pays more or less would ultimately depend on the design of the tax schedule.

Estimates of New Taxes

While the bill mandates only that the new taxes be graduated and does not specify the levels, RAND researchers estimate the impacts of one possible set of graduated marginal tax rates applied to three income brackets. The analyzed scenario has a payroll tax that begins at 6 percent for the lowest bracket (those earning under $27,500 in 2022), rises to 12 percent for the middle bracket, and to 18 percent for those in the highest bracket (above $141,200 in 2022). The nonpayroll tax follows the same schedule and the rates are about 6 percent, 12 percent, and 19 percent for the three brackets.
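
For illustration, here is a minimal sketch of how such a graduated marginal tax could be computed, using the 2022 bracket thresholds and payroll rates from the scenario described above. The function is a generic marginal-rate calculation, not RAND's microsimulation model, and it omits details such as the employer/employee split.

```python
# Minimal sketch of a graduated marginal payroll tax, using the 2022 bracket
# thresholds and rates from the analyzed scenario. This is an illustration,
# not RAND's model; the employer/employee split and other details are omitted.

BRACKETS = [                 # (upper bound of bracket in dollars, marginal rate)
    (27_500, 0.06),
    (141_200, 0.12),
    (float("inf"), 0.18),
]

def payroll_tax(compensation):
    """Apply each marginal rate only to the compensation inside its bracket."""
    tax, lower = 0.0, 0.0
    for upper, rate in BRACKETS:
        if compensation <= lower:
            break
        tax += (min(compensation, upper) - lower) * rate
        lower = upper
    return tax

# Example: $100,000 of payroll compensation
# = 27,500 * 6% + (100,000 - 27,500) * 12% = $10,350
print(payroll_tax(100_000))
```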

Under these analyzed tax schedules in 2022, New Yorkers with household compensation (income plus employer contributions to health care) below the 90th percentile would pay an average of $2,800 less per person for health care. While payments decline for most people in this group, payments would likely increase for those people who work for employers that did not previously offer health care coverage.

For New York residents in the 90th to 95th percentile of household compensation, average health care payments would increase by $1,700 per person in 2022. The top 5 percent of New Yorkers by household compensation -- a heterogeneous group with average household compensation of about $1,255,700 in 2022 -- would pay an average of $50,200 more per person.

The effects on employer health care payments would vary by whether the business currently provides health care to workers, according to the study. Employers that currently offer health coverage would contribute $200 to $800 less per worker on average for employee health benefits under the single-payer plan in 2022. Meanwhile, employers who do not offer coverage under the status quo would pay an estimated $1,200 to $1,800 more per worker on average in 2022 because of new payroll taxes.

Researchers say that the introduction of new taxes could result in some residents and businesses leaving the state, potentially altering the financing of the plan. If even a small percentage of the highest-income residents move or find a way to shield income from the new taxes, tax schedules would need to be revised and could require increasing the financial burden on middle and lower-income residents.

"The estimated effects of the New York Health Act are highly dependent on assumptions about provider payments, administrative costs, and drug prices," said Christine Eibner, co-author of the study and the Paul O'Neill Alcoa Chair in Policy Analysis at RAND. "The actual cost of a single-payer health plan in New York would be sensitive to the extent to which state officials would negotiate or set price levels and generate efficiencies that would curb health care spending."

Credit: 
RAND Corporation

Researchers discover new type of lung cell, critical insights for cystic fibrosis

image: Pulmonary ionocytes (orange) extend through neighboring epithelial cells in the upper respiratory tract of the mouse, to the surface of the epithelial lining. Cell nuclei in cyan.

Image: 
Montoro et al./Nature 2018

Researchers have identified a rare cell type in airway tissue, previously uncharacterized in the scientific literature, that appears to play a key role in the biology of cystic fibrosis. Using new technologies that enable scientists to study gene expression in thousands of individual cells, the team comprehensively analyzed the airway in mice and validated the results in human tissue.

Led by researchers from the Broad Institute of MIT and Harvard and Massachusetts General Hospital (MGH), the molecular survey also characterized gene expression patterns for other new cell subtypes. The work expands scientific and clinical understanding of lung biology, with broad implications for all diseases of the airway -- including asthma, chronic obstructive pulmonary disease, and bronchitis.

Jayaraj Rajagopal, a physician in the Pulmonary and Critical Care Unit at MGH, associate member at the Broad Institute, and a Howard Hughes Medical Institute (HHMI) faculty scholar, and Broad core institute member Aviv Regev, director of the Klarman Cell Observatory at the Broad Institute, professor of biology at MIT, and an HHMI investigator, supervised the research. Daniel Montoro, a graduate student in Rajagopal's lab, and postdoctoral fellows Adam Haber and Moshe Biton in the Regev lab are co-first authors on the paper published today in Nature.

"We have the framework now for a new cellular narrative of lung disease," said Rajagopal, who is also a professor at Harvard Medical School and a principal faculty member at the Harvard Stem Cell Institute. "We've uncovered a whole distribution of cell types that seem to be functionally relevant. What's more, genes associated with complex lung diseases can now be linked to specific cells that we've characterized. The data are starting to change the way we think about lung diseases like cystic fibrosis and asthma."

"With single-cell sequencing technology, and dedicated efforts to map cell types in different tissues, we're making new discoveries -- new cells that we didn't know existed, cell subtypes that are rare or haven't been noticed before, even in systems that have been studied for decades," said Regev, who is also co-chair of the international Human Cell Atlas consortium. "And for some of these, understanding and characterizing them sheds new light immediately on what's happening inside the tissue."

Using single-cell RNA sequencing, the researchers analyzed tens of thousands of cells from the mouse airway, mapping the physical locations of cell types and creating a cellular "atlas" of the tissue. They also developed a new method called pulse-seq to monitor development of cell types from their progenitors in the mouse airway. The findings were validated in human tissue.

One extremely rare cell type, making up less than one percent of the cell population in mice and humans, appeared radically different from other known cells in the dataset. The team dubbed this cell the "pulmonary ionocyte" because its gene expression pattern was similar to that of ionocytes -- specialized cells that regulate ion transport and hydration in fish gills and frog skin.

Strikingly, at levels higher than any other cell type, these ionocytes expressed the gene CFTR -- which, when mutated, causes cystic fibrosis in humans. CFTR is critical for airway function, and for decades researchers and clinicians have assumed that it is frequently expressed at low levels in ciliated cells, a common cell type spread throughout the entire airway.

But according to the new data, the majority of CFTR expression occurs in only a few cells, which researchers didn't even know existed until now.

When the researchers disrupted a critical molecular process in pulmonary ionocytes in mice, they observed the onset of key features associated with cystic fibrosis -- most notably, the formation of dense mucus. This finding underscores how important these cells are to airway-surface regulation.

"Cystic fibrosis is an amazingly well-studied disease, and we're still discovering completely new biology that may alter the way we approach it," said Rajagopal. "At first, we couldn't believe that the majority of CFTR expression was located in these rare cells, but the graduate students and postdocs on this project really brought us along with their data."

The results may also have implications for developing targeted cystic fibrosis therapies, according to the team. For example, a gene therapy that corrects for a mutation in CFTR would need to be delivered to the right cells, and a cell atlas of the tissue could provide a reference map to guide that process.

The study further highlighted where other disease-associated genes are expressed in the airway. For example, asthma development has been previously linked with a gene that encodes a sensor for rhinoviruses, and the data now indicate that this gene is expressed by ciliated cells. Another gene linked with asthma is expressed in tuft cells, which separated into at least two groups -- one that senses chemicals in the airway and one that produces inflammation. The results suggest that a whole ensemble of cells may be responsible for different aspects of asthma.

Using the pulse-seq assay, the researchers tracked how the newly characterized cells and subtypes in the mouse airway develop. They demonstrated that mature cells in the airway arise from a common progenitor: the basal cells. The team also discovered a previously undescribed cellular structure in the tissue. These structures, which the researchers called "hillocks," are unique zones of rapid cell turnover, and their function is not yet understood.

"The atlas that we've created is already starting to drastically re-shape our understanding of airway and lung biology," said Regev. "And, for this and other organ systems being studied at the single-cell level, we'll have to drape everything we know on top of this new cellular diversity to understand human health and disease."

Credit: 
Broad Institute of MIT and Harvard

Scientists discover why elusive aye-aye developed such unusual features

image: 3D models of aye-aye and squirrel skulls

Image: 
Philip Cox, University of York

It is one of the most unusual primates on the planet - famed for its large eyes, big ears and thin, bony finger used for probing.

Often persecuted as a harbinger of evil, the aye-aye has long fascinated scientists, who have puzzled over how and why it evolved such unusual features.

But now a new study has, for the first time, measured the extent to which the endangered aye-aye has evolved similar features to squirrels, despite being more closely related to monkeys, chimps, and humans.

When two aye-ayes were first brought back to Europe from their native Madagascar by French explorers in 1780, they were "ranked with the rodents" and believed to be "more closely allied to the genus of squirrel than any other".

By the mid-19th Century the aye-aye had been correctly identified as a primate, but its squirrel-like appearance is often cited as a striking example of "evolutionary convergence", or how unrelated species can independently evolve the same traits.

Now, using techniques developed collaboratively by researchers at the University of York, a new study has employed high-resolution microCT scanning to image the skulls of the two species, mapping and modelling the level of convergence in their physical features.

The findings suggest that the demands of needing to produce a high bite force with the two front teeth - in the squirrel for cracking nuts and in the aye-aye for biting into tree bark to feed on wood-boring beetle larvae - have not only led to the aye-aye evolving the ever-growing incisors characteristic of rodents, but have also given it a squirrel-like skull and jaw.

The study shows how lifestyle and ecology can have such a strong influence on the way a species looks that they can almost override ancestry.

Senior author of the study, Dr Philip Cox from the Department of Archaeology at the University of York and the Hull York Medical School, said: "Examples of convergent evolution can be seen throughout nature - for example, despite belonging to separate biological groups, dolphins and sharks have converged in body shape due to their shared need to move efficiently through the water.

"Aye-ayes and squirrels have become an iconic example of convergence because of their similar teeth, but our study has shown for the first time that the evolution of their skulls and jaws has also converged.

"Our analysis suggests that the skulls of both species have not evolved simply to house their teeth, but that the distinctive shape may be what allows them to exact a high bite force. The shape of the skull is what makes the aye-aye look so similar to squirrels in particular."

Using skeletons borrowed from the collections of natural history museums, the research team made 3D reconstructions of the skulls and mandibles of the aye-aye and squirrel, plus a variety of other primates and rodents.

They then took 3D co-ordinates from these reconstructions and put this data into statistical software.

Plotting the evolutionary trees of the two biological groups allowed the team to visualise how the evolutionary paths of the aye-aye and squirrel incline towards each other - showing the high degree of convergence in the skull and jaw, despite the completely different ancestry of the two species.
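
As a hypothetical sketch of the kind of shape comparison that underlies such an analysis, the code below runs a Procrustes superimposition on two sets of corresponding 3D landmarks, which removes differences in location, scale and orientation so that only shape differences remain. The landmark arrays are random placeholders, not the study's data, and the snippet is not the authors' exact statistical pipeline.

```python
# Hypothetical sketch: comparing two skulls via Procrustes superimposition of
# corresponding 3D landmarks. The coordinates below are random placeholders
# standing in for points digitized from microCT reconstructions.

import numpy as np
from scipy.spatial import procrustes

rng = np.random.default_rng(42)
aye_aye_landmarks = rng.random((20, 3))    # 20 landmarks x (x, y, z)
squirrel_landmarks = rng.random((20, 3))

# procrustes() removes location, scale and rotation, then returns the residual
# sum of squared differences ("disparity"); smaller values mean more similar shapes.
_, _, disparity = procrustes(aye_aye_landmarks, squirrel_landmarks)
print(f"Procrustes shape distance: {disparity:.3f}")
```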

Dr Cox added: "Our study shows the extent to which functional pressures, such as having to eat mechanically demanding food, can significantly alter an animal's skeleton and result in distantly-related species evolving to resemble one another very closely".

Credit: 
University of York

845-Page analytical report on the longevity industry in the UK released

image: Infographic depicting the United Kingdom Longevity Industry landscape.

Image: 
Biogerontology Research Foundation

Tuesday, July 31, 2018, London, UK: The Biogerontology Research Foundation announces the publication of a new analytical report titled Longevity Industry in UK Landscape Overview 2018. 845 pages in length, the report aims to outline the history, present state and future of the Longevity Industry in the United Kingdom, profiling hundreds of companies, investors, and trends, and offering guidance on the optimal ways in which UK longevity industry stakeholders, as well as government officials, can work to strengthen the industry and allow it to reach its full potential as a global longevity science and preventive medicine hub.

The report uses comprehensive infographics to distill the report's data and conclusions into easily understandable portions, and interested readers can get a quick understanding of the report's main findings and conclusions in its 10-page executive summary.

This special regional case study follows up on the content and general outline of the previous Longevity Industry Landscape Overviews produced by our consortium, including Volume I "The Science of Longevity" (750 pages) and Volume II "The Business of Longevity" (650 pages), published earlier this year.

These ongoing analytical reports are part of a collaborative project by The Global Longevity Consortium, consisting of the Biogerontology Research Foundation, Deep Knowledge Analytics, Aging Analytics Agency and the Longevity.International platform.

The report seeks to document what makes the UK fertile ground for a global Longevity industry hub, and how that ground could be developed. The UK is an undoubted world leader in both the finance and FinTech industries. There is growing evidence that it will also become a global hub for AI, in the form of both an exponentially growing UK-based AI industry and government initiatives that have prioritised development of the Artificial Intelligence industry. Indeed, the UK government has announced Grand Challenges poised to transform the nation's future, which will be treated as priority areas of national development: Artificial Intelligence and The Ageing Society are among them.

If the UK government were to apply a strategy for the synergetic development of these two industries, especially on the front of applying AI to preventive medicine and primary care, then the United Kingdom would be in an excellent position to reap the many synergies resulting from the convergence of these two spheres, i.e. AI and Longevity, increasing healthy lifespans while decreasing the economic burden of the Ageing Society.

Even more importantly, it is clear from our analysis, both in the UK and globally, that the spheres of Finance, AI and Longevity, while most often considered separate industries with fairly little overlap, should in fact be considered in combination and convergence. Our previous report, Global Longevity Industry Landscape Overview Volume II: "The Business of Longevity", concluded that the Longevity Industry does not consist solely of biomedicine: P3 Medicine (precision, preventive, personalised), the AgeTech market and the financial industry are also crucial subsectors of the Longevity industry. This is because of the significant impact that societal ageing and Longevity have upon economies in general, and upon the economic burden of healthcare, pension funds and insurance companies in particular, as well as the fact that multiple types of financial entities have the potential to tie their performance to quantitative measures of healthy longevity, such as HALY and QALY, in order to help economies thrive as their citizens' healthy longevity increases.

One of the strongest conclusions to come out of this report is that the United Kingdom has enough resources in each of the three most crucial sectors - Longevity, Artificial Intelligence and the Financial Industry - to be in a strong position to succeed; nonetheless, the nation should focus on cross-sector collaboration and synergetic convergence to accelerate innovation and progress in the Longevity industry at scale.

About the Biogerontology Research Foundation:

The Biogerontology Research Foundation is a UK non-profit research foundation and public policy center seeking to fill a gap within the research community, whereby the current scientific understanding of the ageing process is not yet being sufficiently exploited to produce effective medical interventions. The BGRF funds and conducts research which, building on the body of knowledge about how ageing happens, aims to develop biotechnological interventions to remediate the molecular and cellular deficits which accumulate with age and which underlie the ill-health of old age. Addressing ageing damage at this most fundamental level will provide an important opportunity to produce the effective, lasting treatments for the diseases and disabilities of ageing, required to improve quality of life in the elderly. The BGRF seeks to use the entire scope of modern biotechnology to attack the changes that take place in the course of ageing, and to address not just the symptoms of age-related diseases but also the mechanisms of those diseases.

Credit: 
Biogerontology Research Foundation

Do spiders have a favorite color?

image: Color is a consideration for wolf spiders in recognizing communication behaviors, a UC study found.

Image: 
Joseph Fuqua II/UC Creative Services

Scientists recently discovered the aptly named peacock jumping spiders have the color vision needed to appreciate the male's gaudy display.

Now biologists at the University of Cincinnati are studying whether that ability translates to the more humdrum-looking wolf spiders that are muted browns and tans instead of electric blue, fiery orange and stoplight red.

UC biology professor George Uetz and his students presented their work in June at the American Arachnological Society meeting at the University of Michigan.

"The assumption was wolf spiders don't pay attention to color. But we found that isn't really true," Uetz said. "We need to look more closely at the neurobiology of their eyes. We need to understand what their retinas do."

Like most spiders, wolf spiders have four pairs of eyes, some of which have a reflective lens called a tapetum that sparkles in bright light. If you are an arachnophobe and want a reason never to go in your backyard again, try shining an LED light there some evening and see all the little predators staring back.

Wolf spiders are quickly becoming a model system for study because of labs such as UC's. Uetz has been examining spider behavior, vision and personality for most of his career. Every study reveals there is more to these creatures than meets their eight eyes.

Most humans have trichromatic vision -- they have retinal cells called cones that can see red, green and blue. Wolf spiders, by comparison, have dichromatic vision and see only green and ultraviolet.

"That means they're basically colorblind. But they're sensitive to light in the green wavelength," Uetz said.

In one study presented in June, UC researchers looked at how spiders reacted to a video of courting spiders in which they manipulated the background color, contrast and intensity. Would they react to the courting spider in monochrome? What if the contrast were exaggerated?

Uetz created videos featuring a digital spider and background, both of which could be manipulated to adjust the color and contrast. They played the video for female spiders as well as for male spiders called "eavesdroppers" for their habit of lurking in the background while learning how to mimic other male spiders' courting displays. They found that female spiders were more likely to respond to videos of males that contrasted sharply with their background. Female spiders also responded better to the color and monochrome versions than to the grayscale version, suggesting color makes a difference to spiders.

"What we found is that for female spiders, intensity matters more than color. But for male eavesdroppers, color matters, too. That is the odd finding. We didn't expect that at all," Uetz said.

One surprising finding was that spider eyesight seems to adapt to the changing seasons.

"That makes a lot of sense because when you go out in the early season when the spiders first come out, there are no leaves on the trees so there is broad spectrum light," Uetz said. "But as the seasons change, leaves come out and everything turns green. Spiders have to be able to see the contrast against a lot of color backgrounds."

UC's spider lab keeps about 1,200 wolf spiders (virtually all of them, researchers assure visitors, are accounted for). Students collect juvenile spiders from the same populations of wild spiders living in forests near UC.

In another study, UC postdoctoral researcher Alex Sweger examined the way male wolf spiders use vibrations to woo females. Spiders don't have ears but can "hear" with tiny sensory organs on their legs that pick up the faint vibrations of prey. Male spiders use a special rasping organ on their pedipalps to produce vibrations that drum the ground, rattling leaves or soil, as part of their ritual mating dance.

Sweger used a laser Doppler vibrometer to measure the spider's vibrations and reproduce them with a device called a piezoelectric disc bender.

"It's very similar to the vibrations made by an actual spider. We calibrate the device and attach it to a leaf and see how the female spider responds," he said.

The ruse works.

Sweger suspected that summer rains are the bane of these spider drummers. He found that males tried to woo females regardless of the weather. But when the ground is wet, they rely more on their visual cues -- waving their forelegs in a dance that only female wolf spiders might appreciate.

"They shift to visual behaviors over vibrations on wet leaves, suggesting they are flexible in using different communication modes to suit the conditions," Sweger said.

Even so, males have far less mating success under wet conditions.

"Their breeding season isn't very long. Males have a lot of pressure to mate with as many females as possible to increase their genetic success," Sweger said. "So if you can overcome a hurdle like rain rather than wait for ideal conditions, it benefits you."

For another study, UC biology student Trinity Walls examined whether juvenile spiders that were classified as shy or bold would maintain that behavior later in life. They did.

To classify her subjects as bold or shy, Walls poked at juvenile spiders with a pair of forceps that simulated a bird's beak. Shy spiders typically froze in place, relying on camouflage for long periods after the scare, while the bold spiders resumed their foraging or exploration much more quickly. She repeated the scare tactic when the juveniles were older and compared her results.

Intrepid spiders might have more hunting or mating opportunities because of their bold behavior, but they're also more likely to be seen and eaten. Shy spiders might be fearful, but this excess caution means they might be more likely to pass on their genes.

"There are pros and cons to each behavior," Uetz said. "Bold spiders face more risks from predators drawn to movement. But by moving, they're more likely to find prey or mates."

Walls came to UC because of the biology department's spider lab. She has been fascinated by spiders her entire life, she said.

"I had a pet Mexican red-knee tarantula named Anastasia for eight years," Walls said. "I love spiders."

UC student Olivia Bauer-Nilsen examined whether a bacterial infection common to spiders affected the mating behavior of female wolf spiders. Bauer-Nilsen suspected that the immune response from the infection would make the spider too weak or fatigued to mate. Instead, she found the infection had no discernible effect. She presented a poster on the study at the conference.

"It was my first poster. A lot of people say don't talk to me about spiders ever again. But my family and close friends are not averse to spiders. They're excited that I'm excited," she said.

Uetz said even he wasn't always the fan of spiders he is today.

"I was terrified of spiders before college. Everyone seems to react that way. Spiders are the No. 1 most-feared species on the planet now," Uetz said. "It's completely unjustified."

Uetz said he learned to appreciate spiders in his first biology class when he took a close look.

"When you look at these animals under a microscope, you see them in a completely different way," he said. "These animals are alien but no less interesting."

Credit: 
University of Cincinnati

Dietary competition played a key role in the evolution of early primates

image: Three models of niche competition between euprimates and non-euprimate mammals. Non-euprimates thrived across North America prior to euprimate arrival ~55 Ma (large tree, left). After euprimate arrival (center column), these two groups could have: occupied separate niches with no competition (top row, right); occupied the same niche with one group ultimately displacing the other to reduce competition (middle row, right); or coexisted with minimal competition (bottom row, right).

Image: 
Laura Stroik and Gary T. Schwartz

Since Darwin first laid out the basic principles of evolution by means of natural selection, the role of competition for food as a driving force in shaping and shifting a species' biology to outcompete its adversaries has taken center stage. So important is the notion of competition between species that it is viewed as a key selective force that resulted in the split of the lineage leading to modern humans from that of our early ape ancestors.

The earliest true primates, called "euprimates," lived about 55 million years ago across what is now North America. Two major fossil euprimate groups existed at this time: the lemur-like adapids and the tarsier-like omomyids. Dietary competition with other similarly adapted mammals was presumably equally critical in the origin and diversification of these two groups. Though it's been hinted at, the exact role of dietary competition and overlapping food resources in early adapid and omomyid evolution has never been directly tested.

New research published online today in the Proceedings of the Royal Society B: Biological Sciences, led by Laura K. Stroik, an assistant professor of biomedical sciences at Grand Valley State University, and Gary T. Schwartz, associate professor and research scientist at Arizona State University's Institute of Human Origins, confirms the critical role that dietary adaptations played in the survival and diversification of North American euprimates.

"Understanding how complex food webs are structured and the intensity of competition over shared food resources is difficult enough to probe in living communities, let alone for communities that shared the same landscape nearly 55 million years ago," said Stroik.

The researchers utilized the latest in digital imaging and microCT scanning on more than 350 fossil mammal teeth from geological deposits in North America. They sought to quantify the 3D surface anatomy of molars belonging to extinct representatives of rodents, marsupials, and insectivores - all of which were found within the same geological deposits as the euprimates and were thus likely real competitors.

The high-resolution scans allowed them to capture and quantify details of how sharp, cresty, or pointy the teeth were. In particular, they looked at molars, or the teeth at the back of the mouth, useful in pulverizing and crushing food or prey. The relative degree of molar sharpness is directly linked to the broad menu of dietary items consumed by each species.

Stroik and Schwartz used these aspects of molar anatomy to compute patterns of dietary overlap across some key fossil groups through time. These results were then weighed against predictions from three models of how species compete with one another, drawn from the world of theoretical ecology. The signal was clear: lineages belonging to the adapids largely survived and diversified without facing competition for food. The second major group, the omomyids, had to sustain periods of intensive competition with at least one contemporaneous mammal group. As omomyids persisted into more recent geological deposits, it is clear that they evolved adaptive solutions that gave them the ability to compete, and they were usually victorious.
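
As a rough illustration of what "dietary overlap" means quantitatively, the sketch below computes Pianka's niche overlap index, a standard ecological measure, from two species' proportional use of shared food categories. The diet proportions are invented placeholders, and the study's own overlap analysis is based on 3D molar shape metrics rather than this simple index.

```python
# Illustration only: Pianka's niche overlap index for two species' diets.
# The study's actual overlap analysis is built on 3D molar shape data; the
# proportions below are made-up placeholders over three food categories.

import math

def pianka_overlap(p, q):
    """Niche overlap between two resource-use distributions (0 = none, 1 = identical)."""
    numerator = sum(pi * qi for pi, qi in zip(p, q))
    denominator = math.sqrt(sum(pi ** 2 for pi in p) * sum(qi ** 2 for qi in q))
    return numerator / denominator

# Hypothetical proportions of fruit, insects and leaves in each diet
euprimate_diet = [0.5, 0.3, 0.2]
rodent_diet = [0.4, 0.2, 0.4]
print(round(pianka_overlap(euprimate_diet, rodent_diet), 2))  # ~0.92 -> high overlap
```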

"The results showed adapids and omomyids faced different competitive scenarios when they originated in North America," said Stroik.

"Part of what makes our story unique is that for the first time we compared these fossil euprimates to a range of potential competitors from across a diverse group of mammals living right alongside adapids and omomyids, not just to other euprimates," said Schwartz. "Doing so allowed us to reconstruct a far greater swath of the ecological landscape for these important early primate relatives than has ever been attempted previously."

The key advance of this new research is the demonstration that diet did in fact play a fundamental role in the establishment, and continued success, of euprimates within the North American mammalian paleocommunity. An exciting outcome is the development of a new quantitative toolkit to diagnose patterns of dietary competition in past communities. This will now allow them to explore the role that diet and competition played in how some of these fossil euprimates continued to evolve and diversify to give rise to living lemurs and all other higher primates.

Credit: 
Grand Valley State University

Research suggests coffee consumption associated with reduced risk of death

A new roundtable report from the Institute for Scientific Information on Coffee (ISIC) titled 'Coffee, caffeine, mortality and life expectancy' highlights the potential role of coffee consumption on all-cause mortality, examining both published and yet-to-be published research to date.

Roundtable delegates including academics, healthcare professionals and dietitians from across six European countries met to discuss the most recent research into coffee and life expectancy, and the potential mechanisms behind an association with reduced risk of all-cause mortality. The roundtable, held at the Royal Society of Medicine in London, was chaired by Sian Porter RD MBDA (Consultant Dietitian and spokesperson for The British Dietetic Association, UK).

Roundtable speaker Professor Miguel Martínez-González (University of Navarra, Spain; Harvard TH Chan School of Public Health) presented unpublished original research studying a cohort of almost 20,000 participants over an average of ten years. Professor Martínez-González's research suggests that coffee consumption of 3-6 cups a day reduces all-cause mortality. Within this cohort, there was a 22% lower risk of all-cause mortality for each two additional cups of coffee per day.

During the roundtable, potential mechanisms behind coffee consumption and reduced all-cause mortality were discussed. It was suggested that caffeine alone was unlikely to explain the effect on mortality; delegates noted a potential role for the polyphenols found in coffee, which may have antioxidant and anti-inflammatory effects.

Key research findings highlighted in the roundtable report include:

Meta-analyses have suggested that coffee consumption versus no coffee consumption is associated with an up to 17% risk reduction of all-cause mortality [1-6]

A study by Imperial College London and IARC found that participants with the highest consumption of coffee had a lower risk of all causes of death [7]

A study from the US found that participants who consumed a cup of coffee a day were 12% less likely to die compared to those who didn't drink coffee [8]

Sian Porter, Consultant Dietitian and spokesperson for the British Dietetic Association, UK, said: "Data on cause of death and years lived combined with life expectancy data can be a useful way to understand the general population's health, and is research frequently examined by health organisations to help inform policy to guide people towards healthier diets and lifestyles. The growing body of research on coffee consumption and all-cause mortality presents new data for consideration, although more evidence is needed to understand the association and mechanisms behind the results."

The current peer-reviewed body of research on coffee consumption and all-cause mortality was discussed at the roundtable in the context of how coffee may fit into a healthy diet and lifestyle. Delegates questioned how frequently healthcare professionals currently discuss coffee consumption with patients, particularly when dealing with patients at risk of developing cardiovascular disease (CVD) or type 2 diabetes, given the known relationship between coffee consumption and these diseases.

Credit: 
Kaizo

Just two weeks' inactivity can trigger diabetic symptoms in vulnerable patients: Research

image: McMaster University researchers Stuart Phillips, left, and Chris McGlory, centre, work with a research subject, right.

Image: 
JD Howell, McMaster University

HAMILTON, ON, July 31, 2018 - Just two weeks without much activity can have a dramatic impact on health from which it is difficult to recover, according to researchers who studied overweight older adults at risk of developing Type 2 diabetes.

Not only did an abrupt, brief period of inactivity hasten the onset of the disease and elevate blood sugar levels among pre-diabetic patients, but researchers reported that some study participants did not fully recover when they returned to normal activity for two weeks.

The findings are published online in The Journals of Gerontology.

"We expected to find that the study participants would become diabetic, but we were surprised to see that they didn't revert back to their healthier state when they returned to normal activity," says Chris McGlory, a Diabetes Canada Research Fellow in the Department of Kinesiology at McMaster University and lead author of the study.

Participants were asked to reduce their activity to no more than 1,000 steps per day, the equivalent of being housebound due to, for example, illness. Their steps and activity were measured using pedometers and specialized activity monitors, while researchers tested their blood sugar levels and took blood samples during the two-week period.

The results imply that seniors who experience periods of physical inactivity from illness, hospitalization and bed rest, for example, are more likely to suffer harmful consequences to their overall health.

"Treatment of type 2 diabetes is expensive and often complicated," explains Stuart Phillips, the professor in the Department of Kinesiology at McMaster who oversaw the research.

"If people are going to be off their feet for an extended period they need to work actively to recover their ability to handle blood sugar," he says.

According to the most recent statistics from the Centers for Disease Control and Prevention, more than 30 million Americans have diabetes and more than 84 million are prediabetic.

In Canada, Type 2 diabetes is one of the fastest growing diseases, with nearly 60,000 new cases reported each year, according to the Public Health Agency of Canada. It is the sixth leading cause of death and the leading cause of adult blindness and adult amputation.

"In order for pre-diabetic older adults to recover metabolic health and prevent further declines from periods of inactivity, strategies such as active rehabilitation, dietary changes and perhaps medication might be useful," says McGlory.

Research has shown that within days of the onset of inactivity, there are notable reductions in skeletal muscle mass and strength, along with a rapid onset of insulin resistance, a common feature of type 2 diabetes.

Credit: 
McMaster University