
Dementia and transitional care: Gaps in research and practice

Patients with dementia are hospitalized at higher rates and involved in transitional care more frequently than those who are cognitively unimpaired. Yet, current practices for managing transitional care--and the research informing them--have overlooked the needs of patients with dementia and their caregivers.

"Patients with dementia have only been considered in a small portion of decades of transitional care studies," said Beth Prusaczyk, Ph.D., a recent postdoctoral fellow in The Center for Clinical Quality and Implementation Research at Vanderbilt University Medical Center. Prusaczyk is among the experts beginning to develop evidence-based practices to support patients with dementia in transitional care.

"The research has excluded patients with dementia for several reasons: because of IRB hurdles, and out of the concern that they can't fully appreciate participation," Prusaczyk said. "There is also an erroneous assumption you can't get good data."

Needs Going Unmet

In a new study published in the Journal of Gerontological Nursing, Prusaczyk and colleagues showed that older patients with dementia at one major teaching hospital were less often provided with transitional care steps including patient education, discharge planning, and documentation of medication history, as compared to patients without dementia.

The study used a chart review of 210 patients aged 70 and older who were discharged from an inpatient stay (rather than from the ED). The 126 patients with dementia--60 percent of those included--experienced significant differences in their transitional care. The researchers assumed charts reflected steps taken with patients or caregivers.

Care teams collected medical histories from the patient, family member or other provider for only 60 percent of patients with dementia, compared to 86 percent of patients without. Patients with dementia also received discharge education far less frequently. This included education about:

In-hospital medications (77 percent of patients with dementia; 100 percent of patients without)

Diagnoses (45 percent; 83 percent)

Follow-up needs (42 percent; 81 percent)

Medication regimens after discharge (47 percent; 80 percent)
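For a sense of how such documentation gaps are tested statistically, here is a two-proportion z-test sketch using counts back-calculated from the reported percentages; the exact per-item denominators are assumptions, not figures from the study:

```python
from math import sqrt, erfc

def two_proportion_z(x1, n1, x2, n2):
    """Two-sided two-proportion z-test (normal approximation)."""
    p1, p2 = x1 / n1, x2 / n2
    p = (x1 + x2) / (n1 + n2)          # pooled proportion
    se = sqrt(p * (1 - p) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    p_value = erfc(abs(z) / sqrt(2))   # two-sided p from the normal CDF
    return z, p_value

# Approximate counts back-calculated from the reported percentages:
# roughly 60% of 126 dementia patients vs 86% of 84 non-dementia patients.
z, p = two_proportion_z(76, 126, 72, 84)
print(f"z = {z:.2f}, p = {p:.5f}")
```

A gap of this size in a cohort of 210 is well beyond what chance alone would produce.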

Prusaczyk says the researchers confirmed these trends during qualitative interviews with providers at the hospital. "They didn't necessarily set a high priority for these types of transitional care activities for patients with dementia or their caregivers," she said.

Aligning Care

In a separate study of the same patient cohort, published in the Journal of Interprofessional Care, Prusaczyk's group applied social mapping and network analyses to identify 14 unique types of actors engaged in discharge communications. Both clinicians and non-clinicians (e.g., social workers and case managers) contributed to discharge planning. Notably, primary care physicians did not participate except when responding to queries initiated by case managers.
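A toy sketch of the kind of network analysis described; the actor types and message log below are invented for illustration and are not the study's data:

```python
from collections import defaultdict

# Hypothetical discharge-communication log as (sender, receiver) pairs.
# These actor types and edges are illustrative, not the study's findings.
messages = [
    ("hospitalist", "case_manager"),
    ("case_manager", "social_worker"),
    ("case_manager", "primary_care_physician"),  # PCP only hears from case managers
    ("nurse", "case_manager"),
    ("social_worker", "caregiver"),
    ("pharmacist", "nurse"),
]

# Undirected degree count: how connected is each actor type?
degree = defaultdict(int)
for sender, receiver in messages:
    degree[sender] += 1
    degree[receiver] += 1

for actor, d in sorted(degree.items(), key=lambda kv: -kv[1]):
    print(f"{actor}: degree {d}")
```

In such a graph a central broker (here, the case manager) accumulates the highest degree, while an actor reachable through only one channel (here, the primary care physician) sits at the periphery.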

"I'd like more research to understand why transition care teams are still struggling to communicate internally, but especially with primary care providers. This is obviously significant with an older patient with dementia," Prusaczyk said.

Prusaczyk, who is a former hospital social worker, acknowledges the "pressures and challenges" of managing care transitions across complex teams. Still, there are clear opportunities to better understand and serve the needs of patients with dementia.

"My focus is identifying changes we can enact now. System-level changes are needed, but there are communication tools available today that improve retention--making sure patients have their glasses, using teach-back and more visual aids," Prusaczyk said. "The change can happen if it is prioritized."

Credit: 
Vanderbilt University Medical Center

$4.6 million award creates program to train cybersecurity professionals

A five-year, $4.63 million award from the National Science Foundation will enable a multi-disciplinary team of researchers at the University of Arkansas to recruit, educate and train the next generation of cybersecurity professionals.

The program will provide the knowledge and tools necessary to protect network and computer systems in three critical areas - cybersecurity, transportation security, and critical infrastructure security.

"The federal agencies that support these industries - all critical to our nation's security and economic health - understand that new cybersecurity challenges are met with an increasingly insufficient security workforce," said Jia Di, professor of computer science and computer engineering and principal investigator for the program. "But people at these agencies also understand that our university, with its specific research strengths, is uniquely positioned to expand the pool of highly skilled professionals who can address these challenges."

The "Cyber-Centric Multidisciplinary Security Workforce Development" program will draw on faculty research expertise in the departments of Computer Science and Computer Engineering, Electrical Engineering and Industrial Engineering. Faculty members will design curriculum focused on cybersecurity in the areas of computer and information systems, transportation and critical infrastructure with specific focus on the electrical power grid. The program will provide job training and research opportunities for graduate and undergraduate students, and all students will be offered internships at government agencies, where additional training could lead to job placement.

The program will focus on attracting students from underrepresented populations and will partner with Northwest Arkansas Community College to open paths for its students to pursue bachelor's and advanced degrees at the university.

The program will address a national shortage of highly skilled cybersecurity professionals. Over a one-year period, from September 2017 to August 2018, for example, there were more than 300,000 open cybersecurity jobs in the United States, Di said. Employers cited a lack of education as the reason for this shortage. To qualify for these jobs, students must understand not only computer systems, networks and software, but also data storage protection, cryptography, malware and software vulnerabilities, as well as the nature of cyber-crimes and other threats to infrastructure.

THE PROGRAM

Led by the Arkansas Security Research and Education Institute (ASCENT), which Di directs, the "Cyber-Centric Multidisciplinary Security Workforce Development" program will include investigators affiliated with several U of A research centers - Center for Information Security and Reliability, Mack-Blackwell Transportation Center and Cybersecurity Center for Secure Evolvable Energy Delivery Systems. Students will conduct research at these centers.

Co-principal investigators for the program are Brajendra Panda, professor of computer science and computer engineering; Alan Mantooth, Distinguished Professor of electrical engineering; Dale Thompson, associate professor of computer science and computer engineering; and Chase Rainwater, associate professor of industrial engineering.

Credit: 
University of Arkansas

VR lullaby machine shown to induce tranquil pre-sleep states

video: Inter-Dream involves an interactive bed and ambient music controlled by the artists, and kaleidoscopic visuals controlled by the user with their own brainwaves, via EEG.

Image: 
RMIT University

Designed by PluginHUMAN art duo, Dr Betty Sargeant and Justin Dwyer, the system involves an interactive bed and ambient music controlled by the artists, and kaleidoscopic visuals controlled by the user with their own brainwaves, via EEG.

With each brain frequency assigned a different colour and brainwave intensity tied to movement, each person's brain activity generates unique imagery.
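The mapping described can be sketched as follows; the specific band-to-colour assignments and the intensity scaling are assumptions for illustration, since the article only says each frequency band gets a colour and intensity drives movement:

```python
# Illustrative mapping of EEG frequency bands to display parameters.
# The band-to-colour table and the motion rule are invented for this sketch.
BAND_COLOURS = {
    "delta": (70, 70, 255),    # ~0.5-4 Hz
    "theta": (120, 60, 200),   # ~4-8 Hz
    "alpha": (60, 200, 120),   # ~8-13 Hz
    "beta":  (255, 160, 60),   # ~13-30 Hz
}

def render_params(band_power):
    """Blend band colours weighted by relative power; total intensity drives motion."""
    total = sum(band_power.values())
    colour = [0.0, 0.0, 0.0]
    for band, power in band_power.items():
        weight = power / total
        for i in range(3):
            colour[i] += weight * BAND_COLOURS[band][i]
    motion_speed = total  # overall brainwave intensity tied to movement
    return tuple(round(c) for c in colour), motion_speed

colour, speed = render_params({"delta": 0.5, "theta": 0.2, "alpha": 0.2, "beta": 0.1})
print(colour, speed)
```

Because the blend is weighted by each person's own band powers, no two users would see quite the same imagery, which is the property the article describes.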

PhD researcher Nathan Semertzidis, of RMIT University's Exertion Games Lab, has now assessed the system for inducing pre-sleep states and supporting general mental wellbeing.

"Technology and sleep are always talked about as incompatible," Semertzidis says. "Our findings flip that notion upside down and show how technology can also aid rest and relaxation."

Analysis of physiological and psychological data, along with user interviews, was recently presented at CHI 2019, the premier international conference on human-computer interaction.

"Results demonstrated statistically significant decreases in pre-sleep cognitive arousal and negative emotion," Semertzidis says.

"EEG readings were also indicative of restorative restfulness and a clear mind, while interview responses described experiences of mindfulness."

Participants reported a 21 per cent drop in general negative emotion and a 55 per cent drop in feelings of fear after using Inter-Dream. Meanwhile, general positive emotions increased by 8 per cent and feelings of serenity by 13 per cent.

Good sleep is acknowledged to be preceded by a characteristic set of cognitive and mood states, as well as physiological changes, says Semertzidis.

"Good sleepers are generally more relaxed and positive, while bad sleepers are more likely to focus on unpleasant and intrusive worries and stressors because of involuntary mental rehearsal of the past day's events," he says.

"It makes sense that influencing these cognitive mood states can help us better ease into rest."

Semertzidis says people not only passively relaxed into the experience but creatively interacted with it too.

"Some of them reported really going on a journey by manipulating the system with their minds," he says.

"This is no trivial notion in the context of inducing positive pre-sleep states, as it has been well documented that creative expression is strongly associated with positive effects on emotion and affect."

Another prevalent theme from the 12 user interviews was cognitive states consistent with those of mindfulness.

"This was often voiced as a redirection of thought away from life stressors and toward the present experience as a result of the system's neurofeedback reactivity," he says.

Semertzidis' supervisor and co-author on the paper, Associate Professor Fabio Zambetta, said while the system itself was not the answer to healthy sleep - clinical interventions would require larger samples and control groups, and technology would need to be explored that was less invasive during sleep - it presented a fascinating case study.

"Our findings are really significant in pointing a possible way forward using neurofeedback technology to facilitate restfulness and sleep onset," Zambetta said.

Semertzidis' co-supervisor Professor Florian 'Floyd' Mueller, who heads RMIT University's Exertion Games Lab, said the project demonstrated how art and science can complement each other in novel approaches to address old problems.

"The work really puts the human body centre stage, allowing us to see the human body not just as a mere input controller, but rather allowing people to experience their bodies as play," Mueller said.

"For me, this is a beginning to facilitate a more playful future, in particular, one where we can even experience rest and ultimately sleep as a form of (digital) play: it would be fascinating to explore what such a future could look like and we are always looking for PhD candidates who want to explore this."

Credit: 
RMIT University

The protein that gives identical cells individuality

New insight into a protein's role in regulating tight DNA packing could have implications for combating tumor cell resistance to anti-cancer treatments.

Hokkaido University researchers have revealed how a protein maintains a delicate balance of tightly packing DNA inside yeast cells with the same genetic material, while also allowing for variation amongst them. The findings, published in the journal PLOS Genetics, could help researchers identify ways to suppress the formation of tumor cells that are resistant to anti-cancer drugs.

The incredibly long strands of DNA found in cells are packed into a structure called chromatin with the help of proteins called histones. Heterochromatin is where parts of chromatin are really tightly packed together. This makes certain genes difficult to access, effectively silencing them. Abnormal heterochromatin formation can inhibit genes that are essential for basic cell functions. But it can also play a role in cellular adaptation to changing circumstances by modifying gene accessibility. The mechanisms that regulate heterochromatin distribution are not yet fully understood.

A protein, called Epe1, is known for its suppressive role in the formation of heterochromatin. Biological chemist Yota Murakami of Hokkaido University led a team of scientists in Japan to find out what was happening at the molecular level.

When a fission yeast cell divides into two, each cell has identical genetic material. Murakami's team found that turning off Epe1 in yeast cells led to stochastic heterochromatin formation, altering the characteristics of some cells and leading to the production of a more diverse yeast population.

At the molecular level, Epe1 works against a molecular label on histone called H3K9me which recruits gene-silencing proteins for heterochromatin formation. The team found that Epe1 prevents H3K9me deposition at sites where abnormal "ectopic" heterochromatin has the potential to form. It also promotes the removal of H3K9me on already-formed ectopic heterochromatin, destabilizing the tight structure. "Interestingly, our study showed that the removal of ectopic heterochromatin by Epe1 is incomplete. Thus, it creates diversity in gene expression and cell characteristics in a population with the same genetic material," says Yota Murakami. "In other words, while Epe1 prevents the emergence of extreme diversity caused by accidental heterochromatin formation, it also allows individuality."

Cell diversity is thought to help adaptation to ever-changing environments, but, it is not always a good thing. Dividing tumor cells can also acquire diversity and develop resistance to anti-cancer treatments. "Since the chromatin regulatory mechanisms found in fission yeast cells are similar to those in humans and other mammals, this work could improve understandings of how cells in our body adapt to changing environments and develop resistance to anti-cancer treatments," explains Yota Murakami.

Credit: 
Hokkaido University

Study identifies potential markers of lung cancer

BOSTON - By examining both blood samples and tumor tissues from patients with non-small-cell lung cancer (NSCLC), investigators at Massachusetts General Hospital (MGH) have identified markers that can distinguish between major subtypes of lung cancer and can accurately identify lung cancer stage. Their proof-of-concept test accurately predicted whether the blood samples they examined came from patients with shorter or longer survival following lung cancer surgery, including patients with early-stage disease.

Their findings could eventually help physicians decide whether an individual patient with lung cancer can benefit from standard treatment or may need more aggressive therapy. The study is published in the open-access journal Scientific Reports.

The US Preventive Services Task Force currently recommends that middle-aged and older persons with a history of heavy smoking be screened annually for lung cancer with low-dose CT. Low-dose CT is effective at detecting small lung tumors, but the cost of CT screening and risks of repeated radiation exposure prevent its use for screening of the general population. This points to a need for a low cost, minimally invasive method for identifying people who may require further CT screening to catch the disease at earlier, more readily treatable stages, says co-principal investigator Leo L. Cheng, PhD, an associate biophysicist in the departments of Pathology and Radiology at MGH.

"You cannot use CT as a screening tool for every patient or even for every at-risk patient every year, so what we're trying to do is to develop biomarkers from blood samples that could be incorporated into physical exams, and if there is any suspicion of lung cancer, then we would put the patient through CT," Cheng says.

Along with co-principal investigator David C. Christiani, MD, MPH, a physician in the Department of Medicine at MGH, Cheng and other colleagues studied paired blood samples and tumor tissues taken at the time of surgery and looked for unique metabolomic markers using high-resolution magnetic resonance spectroscopy (MRS), a sensitive technique for characterizing the chemical composition of tissues.

Cheng says that although other research groups have used MRS to identify potential biomarkers of lung cancer in serum, "the uniqueness of our study is that we have paired samples from patients obtained at the same time as surgery."

The paired specimens came from 42 patients with squamous cell carcinomas (SCC) of the lung, and 51 patients with adenocarcinomas of the lung. The investigators also examined blood samples from 29 healthy volunteers who served as controls. The patients included 58 with early (Stage I) lung cancer, and 35 with more advanced disease (Stage II, III, or IV).

The experiments were designed to see whether blood samples and tumor tissue samples from the same patient had common features that would identify the presence or absence of lung cancer, discriminate between cancer subtypes, and confirm the diagnostic accuracy of a simple blood test.

The investigators identified specific profiles of metabolites common to both types of samples and showed the differences between the profiles could signal whether a patient had SCC or adenocarcinoma, which require different treatments. They also found that the profiles could distinguish between early-stage disease, which is often highly treatable, and later disease stages which require more aggressive or experimental treatments.
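As an illustration of how profile differences could separate subtypes, here is a generic nearest-centroid sketch on made-up metabolite values; the study's actual statistical model, metabolites, and numbers are not described in this article:

```python
from math import dist  # Euclidean distance (Python 3.8+)

# Made-up three-metabolite "profiles" per class; real MRS profiles have many
# more dimensions, and these centroids are purely illustrative.
centroids = {
    "SCC":            (2.1, 0.8, 1.5),
    "adenocarcinoma": (1.2, 1.9, 0.7),
    "control":        (0.9, 1.0, 1.0),
}

def classify(profile):
    """Assign a sample to the class whose centroid is nearest."""
    return min(centroids, key=lambda label: dist(profile, centroids[label]))

print(classify((2.0, 0.9, 1.4)))
```

The point of the sketch is only that once class-specific metabolite profiles exist, assigning a new blood sample reduces to a distance comparison.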

Importantly, the tests also identified whether the samples came from patients who lived an average of 41 months after surgery, or from those patients who lived longer than 41 months. This finding, if validated in further studies, could identify early on those patients at especially high risk for early death, who might benefit from clinical trials of new drugs.

The ultimate goal of the study is to develop a blood test that could be included as part of a standard physical and could indicate whether a specific patient has suspicious signs pointing to lung cancer. Patients identified by the blood screen would then be referred for CT.

Credit: 
Massachusetts General Hospital

UMN researcher identifies differences in genes that impact response to cryptococcus infection

MINNEAPOLIS, MN- July 15, 2019 - Cryptococcus neoformans is a fungal pathogen that infects people with weakened immune systems, particularly those with advanced HIV/AIDS. New University of Minnesota Medical School research could mean a better understanding of this infection and potentially better treatments for patients.

In "Identification of Pathogen Genomic Differences That Impact Human Immune Response and Disease during Cryptococcus neoformans Infection," published in the journal mBio by the American Society for Microbiology, Kirsten Nielsen, PhD, professor in the Department of Microbiology and Immunology at the University of Minnesota Medical School, and colleagues were the first to examine how Cryptococcus genes impact the disease using human data.

After her last study, which found that the pathogen was driving the outcome of the Cryptococcus infection, Nielsen went on to examine the underlying genetic differences in her current study.

"We looked at differences in disease between patients - whether the patient lived or died, how the patient's immune system responded to the infection, and whether the antifungal drug treatment worked well - and we asked 'How do genetic differences in the Cryptococcus strains impact the disease variables?'" explained Nielsen.

The study identified 40 genes that are crucial to the ability of Cryptococcus to change the outcome of human disease, none of which had previously been recognized as important. These genes give researchers an entirely new set of information to work with.
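A minimal sketch of the kind of gene-outcome association behind such a finding, using an invented 2x2 table; the study's actual counts and statistics are not reproduced here:

```python
# Association between carrying a strain variant and patient outcome,
# summarised as an odds ratio from a 2x2 table. All counts are invented.

def odds_ratio(a, b, c, d):
    """a, b = died/survived with the variant; c, d = died/survived without it."""
    return (a * d) / (b * c)

# Hypothetical: 30 of 40 patients infected by a variant-carrying strain died,
# versus 20 of 60 patients infected by strains without the variant.
or_ = odds_ratio(30, 10, 20, 40)
print(f"odds ratio = {or_:.1f}")
```

Repeating a test like this across candidate genes (with appropriate multiple-testing correction) is the general shape of a pathogen genome-wide association analysis.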

"We can take this new information generated using the human data and show how the genes work in other models," said Nielsen. "When we deleted the genes, it changed the ability of Cryptococcus to cause disease in a model system, so we know that they are important in disease."

Nielsen and her colleagues hope that identifying which versions of genes are important for patient survival will ultimately lead to better treatment of patients.

"We hope that this will have clinical benefits in the future. If we can figure out why certain strains are more deadly, and identify which patients have those strains, we can treat them differently. This will hopefully decrease reliance on toxic antifungals," said Katrina Jackson, a Graduate Student in the University of Minnesota Medical School, who was involved in the project.

Credit: 
University of Minnesota Medical School

Move over tin foil hat worriers, 'beyond 5G' wireless transceiver developed

image: The 'end-to-end transmitter-receiver' chip boasts a unique architecture combining digital and analog components on a single platform, resulting in ultra-fast data processing and reduced energy consumption.

Image: 
Steve Zylius / UCI

Irvine, Calif., July 16, 2019 - A new wireless transceiver invented by electrical engineers at the University of California, Irvine boosts radio frequencies into 100-gigahertz territory, quadruple the speed of the upcoming 5G, or fifth-generation, wireless communications standard.

Labeled an "end-to-end transmitter-receiver" by its creators in UCI's Nanoscale Communication Integrated Circuits Labs, the 4.4-millimeter-square silicon chip is capable of processing digital signals significantly faster and more energy-efficiently because of its unique digital-analog architecture. The team's innovation is outlined in a paper published recently in the IEEE Journal of Solid-State Circuits.

"We call our chip 'beyond 5G' because the combined speed and data rate that we can achieve is two orders of magnitude higher than the capability of the new wireless standard," said senior author Payam Heydari, NCIC Labs director and UCI professor of electrical engineering & computer science. "In addition, operating in a higher frequency means that you and I and everyone else can be given a bigger chunk of the bandwidth offered by carriers."

He said that academic researchers and communications circuit engineers have long wanted to know if wireless systems are capable of the high performance and speeds of fiber-optic networks. "If such a possibility could come to fruition, it would transform the telecommunications industry, because wireless infrastructure brings about many advantages over wired systems," Heydari said.

His group's answer is in the form of a new transceiver that leapfrogs over the 5G wireless standard - designated to operate within the range of 28 to 38 gigahertz - into the 6G standard, which is expected to work at 100 gigahertz and above.

"The Federal Communications Commission recently opened up new frequency bands above 100 gigahertz," said lead author and postgraduate researcher Hossein Mohammadnezhad, a UCI grad student at the time of the work who this year earned a Ph.D. in electrical engineering & computer science. "Our new transceiver is the first to provide end-to-end capabilities in this part of the spectrum."

Having transmitters and receivers that can handle such high-frequency data communications is going to be vital in ushering in a new wireless era dominated by the "internet of things," autonomous vehicles, and vastly expanded broadband for streaming of high-definition video content and more.

While this digital dream has driven technology developers for decades, stumbling blocks have begun to appear on the road to progress. According to Heydari, changing frequencies of signals through modulation and demodulation in transceivers has traditionally been done via digital processing, but integrated circuit engineers have in recent years begun to see the physical limitations of this method.

"Moore's law says we should be able to increase the speed of transistors - such as those you would find in transmitters and receivers - by decreasing their size, but that's not the case anymore," he said. "You cannot break electrons in two, so we have approached the levels that are governed by the physics of semiconductor devices."

To get around this problem, NCIC Labs researchers utilized a chip architecture that significantly relaxes digital processing requirements by modulating the digital bits in the analog and radio-frequency domains.
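The idea of carrying digital bits in the analog domain can be illustrated with a simple quadrature modulation sketch; the chip's actual modulation scheme is not specified in the article, so 4-QAM (QPSK) here is only a stand-in:

```python
import math

# Map digital bits onto analog in-phase/quadrature (I/Q) amplitudes.
# QPSK is chosen purely for illustration of analog-domain modulation.

def qpsk_symbol(bits):
    """Map two bits to a unit-energy I/Q amplitude pair."""
    i = 1.0 if bits[0] else -1.0
    q = 1.0 if bits[1] else -1.0
    return complex(i, q) / math.sqrt(2)

def modulate(bitstream):
    """Convert a bit sequence into a stream of analog symbol amplitudes."""
    return [qpsk_symbol(bitstream[k:k + 2]) for k in range(0, len(bitstream), 2)]

symbols = modulate([1, 0, 0, 1])
print(symbols)
```

Each symbol carries two bits as a pair of analog amplitudes, so the digital processor handles half as many symbol decisions as raw bits; denser constellations push that ratio further, which is the general motivation for shifting modulation work out of the digital domain.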

Heydari said that in addition to enabling the transmission of signals in the range of 100 gigahertz, the transceiver's unique layout allows it to consume considerably less energy than current systems at a reduced overall cost, paving the way for widespread adoption in the consumer electronics market.

Co-author Huan Wang, a UCI doctoral student in electrical engineering & computer science and an NCIC Labs member, said that the technology combined with phased array systems - which use multiple antennas to steer beams - facilitates a number of disruptive applications in wireless data transfer and communication.

"Our innovation eliminates the need for miles of fiber-optic cables in data centers, so data farm operators can do ultra-fast wireless transfer and save considerable money on hardware, cooling and power," he said.

Credit: 
University of California - Irvine

Rutgers collaborates with WHO to more accurately describe mental health disorders

A Rutgers University researcher contributed to the first study to seek input from people with common mental health issues on how their disorders are described in diagnostic guidelines.

The study, which was conducted by researchers in the United Kingdom and the United States in collaboration with the World Health Organization Department of Mental Health, appears in The Lancet.

"Including people's personal experiences with disorders in diagnostic manuals will improve their access to treatment and reduce stigma," said Margaret Swarbrick, an adjunct associate professor and Director of Practice Innovation and Wellness at Rutgers University Behavioral Health Care, who collaborated with Kathleen M. Pike, executive director and scientific co-director of the Global Mental Health Program on the U.S. portion of the study.

The researchers talked to people with five common disorders -- schizophrenia, bipolar disorder type 1, depressive episode, personality disorder and generalized anxiety disorder -- about how their conditions should be described in the upcoming 11th revision of the International Classification of Diseases and Related Health Problems (ICD-11). The ICD is the most widely used classification system for mental disorders. This is the first time people with diagnosed mental health disorders who are not health practitioners have been invited to give input on any published mental health diagnostic guidelines.

The project surveyed 157 people diagnosed with these conditions in the United Kingdom, India and the United States. The participants reviewed an initial draft of the ICD-11 chapter on mental, behavioral and neurodevelopmental disorders and recommended changes to more accurately reflect their experiences and/or remove objectionable language.

Many participants said the draft omitted emotional and psychological experiences they regularly have. People with schizophrenia added references to anger, fear, memory difficulties, isolation and difficulty communicating internal experiences. People with bipolar disorder added anxiety, anger, nausea and increased creativity. People with generalized anxiety disorder added nausea and anger. People with depression added pain and anxiety. People with personality disorder added distress and vulnerability to exploitation.

The participants also suggested removing confusing or stigmatizing terms such as "retardation," "neuro-vegetative," "bizarre," "disorganized" and "maladaptive."

"We discovered that the current draft reflected an external perspective of these conditions rather than the perspective of the person's lived experience," Swarbrick said. "This is a needed perspective for clinicians and researchers. Participants appreciated the non-technical summaries, which suggest that using such common language would go a long way in bridging the communication gap between the people being diagnosed and clinicians."

Credit: 
Rutgers University

State capacity: How it is measured and compared

image: The state capacity ranking's top 20 'leaders' (final grades, on a scale of 0 to 10).

Image: 
HSE University

'State capacity' refers to a state's ability to make and effectively implement decisions in domestic and foreign policy. In a study, HSE University political scientists evaluated the state capacity of 142 countries. Based on their findings, the researchers created and trialed a state capacity index, identified eight models of state capacity, and compiled a general international ranking. https://publications.hse.ru/mirror/pubs/share/direct/276237531

Types of Power

The researchers identified three main dimensions of state capacity:

coercive (ensuring external security and internal order);

extractive (financial resources, including taxation, available to the state);

administrative-bureaucratic (quality of administrative and bureaucratic institutions).

Each dimension is measured by specific indicators, six in total:

Military expenditures for ensuring external security (% of GDP);

Aggregated indicator of violence control inside the country (statistics on homicides and victims of internal conflicts - cases per 100,000 population);

Tax revenue collected (% of GDP);

Aggregate income of the state budget (% of GDP);

WGI (World Governance Indicators), including Government Effectiveness, Regulatory Quality, Rule of Law, and Control of Corruption;

Share of shadow economy (% of GDP).

All of these factors formed the basis of the HSE University study. The researchers analyzed these indicators in 142 countries based on data from 2015.
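A minimal sketch of how several indicators might be combined into a 0-10 score of the kind shown in the ranking; the min-max scaling, equal weights, and sample values are all assumptions, since the article does not detail the index's aggregation method:

```python
# Toy index: min-max normalise each indicator across countries, average
# with equal weights, and rescale to 0-10. Values and weighting are invented.

countries = {
    "A": {"wgi": 1.8,  "tax_gdp": 40.0},
    "B": {"wgi": -0.5, "tax_gdp": 15.0},
    "C": {"wgi": 0.6,  "tax_gdp": 25.0},
}

INDICATORS = ("wgi", "tax_gdp")

def index_score(values, lo, hi):
    """Average of min-max normalised indicators, rescaled to 0-10."""
    norm = [(values[k] - lo[k]) / (hi[k] - lo[k]) for k in INDICATORS]
    return 10 * sum(norm) / len(norm)

lo = {k: min(c[k] for c in countries.values()) for k in INDICATORS}
hi = {k: max(c[k] for c in countries.values()) for k in INDICATORS}

for name, vals in countries.items():
    print(name, round(index_score(vals, lo, hi), 1))
```

The single-number form is convenient for ranking, but as the authors note below, it hides which combination of dimensions produced the score.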

Leaders and Outsiders

The ranking that resulted from the analysis revealed groups of 'leaders' and 'outsiders'. The first group included countries with the highest performance in all three dimensions of state capacity, while the second included those with the lowest.

Russia falls approximately in the middle of the ranking, alongside Argentina and Mauritius. Meanwhile, according to the authors, apart from the groups at the ranking's top and bottom, its largest segment -- the middle (which includes Russia) -- is 'full of if not awkward then difficult to explain groupings', with equally surprising results. For example, Ukraine appears close to Suriname and Senegal, while Kazakhstan ranks next to Bhutan and Albania.

From Ranking to Clusters

The resulting ranking demonstrates that the state capacity of different countries cannot be studied and compared using a single value. Methodological problems arise from an approach built on the principle of 'the more, the better': if one wants a certain country to move up in the ranking, all one has to do is increase one indicator while holding the others constant. This loses the multi-dimensionality and the importance of specific combinations of state capacity indicators. For example, 'Is it that necessary for countries, such as those in Scandinavia, to increase their coercive capacity, when their institutions work quite well?'

That is why cluster analysis has been used to detect different models of state capacity, which 'do not fit into one quantitative scale'.

The scholars used it to divide the countries into eight clusters (types) of state capacity, which were named as follows:

Successful development;

Second echelon;

Individual trajectories;

The oil and gas needle;

Outsiders;

On the verge of failure;

Rising Asian giants;

Variations of post-Soviet trajectories.

The researchers do not consider this list indisputable or final. It is just a starting point for further investigation. But they do not underestimate the magnitude of their work: '142 countries have been clustered, 79 of which fit into stable associations'.
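The release does not specify which clustering algorithm the HSE team used. As a rough illustration of the general approach -- grouping countries by standardized indicator vectors rather than ranking them on one scale -- here is a minimal k-means sketch. The indicator matrix below is invented for demonstration; it is not the study's 2015 data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical indicator matrix: rows = 142 countries, columns = indicators
# (tax collectability, budget revenue, WGI average, shadow-economy share, ...).
# These values are invented, not the study's data.
X = rng.normal(size=(142, 5))

# Standardize each indicator so no single one dominates the distances.
Z = (X - X.mean(axis=0)) / X.std(axis=0)

def kmeans(Z, k=8, iters=100):
    """Plain k-means: alternate nearest-center assignment and center update."""
    centers = Z[rng.choice(len(Z), k, replace=False)]
    for _ in range(iters):
        # Assign each country to its nearest cluster center.
        labels = np.argmin(((Z[:, None, :] - centers) ** 2).sum(-1), axis=1)
        # Move each center to the mean of its assigned countries.
        new = np.array([Z[labels == j].mean(axis=0) if (labels == j).any()
                        else centers[j] for j in range(k)])
        if np.allclose(new, centers):
            break
        centers = new
    return labels

labels = kmeans(Z, k=8)   # eight clusters, matching the study's typology
print(np.bincount(labels, minlength=8))  # cluster sizes
```

Unlike a ranking, the cluster labels carry no order: two clusters can differ in the *combination* of indicators rather than in overall level, which is exactly the multi-dimensionality the authors argue a single scale destroys.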

Successful Development and Second Echelon

The first two clusters are variations of one model. It is characterized by:

High quality of institutions;

Insignificant share of shadow economy;

Very high level of violence control;

Small military expenditure.

The clusters include:

Successful Development: Western European socio-economic leaders (Austria, Belgium, Denmark, Germany, Ireland, Iceland, Luxembourg, the Netherlands, Norway, Finland, France, Sweden, Switzerland); the British Commonwealth countries (Australia, Great Britain, Canada, New Zealand); and Japan in Asia.

Second Echelon: Eastern European countries (Hungary, Poland, Slovakia, Slovenia, Czech Republic); Baltic countries (Latvia, Lithuania, Estonia); EU countries (Italy, Spain, Portugal, Malta, Cyprus); and South Korea.

The administrative-bureaucratic component of state capacity prevails in countries of this cluster. 'The quality of institutions and the legality of the economy provide for high tax collection rates, while the differences may be considerable. For example, even in the Successful Development group, the share of tax revenues in GDP varies from 18.7% in Japan to 44.7% in Denmark, which is quite natural, since this indicator depends on the country's economic policy.'

Individual Trajectories

The USA, Israel, and Singapore.

These countries also have high state capacity, with powerful administrative-bureaucratic potential (low shares of shadow economy and high quality of institutions), and high level of violence control.

The difference from the Successful Development model is a stronger coercive function (expressed in the considerable share of military expenditures in GDP).

'Maintaining almost "maximum" levels of institutional and coercive potentials is not achieved by "pumping" a lion's share of national wealth through the state apparatus. This is particularly clear from the share of public revenues in GDP: it is 29.9% in Israel, 27.8% in the USA, and only 18.9% in Singapore.'

The Oil and Gas Needle

Bahrain, Qatar, Kuwait, the United Arab Emirates, Oman, Saudi Arabia.

In this model, the coercive function dominates (these countries have some of the world's biggest military expenditures and effective violence control within the state). This dominance is sustained mainly by budget revenues from the extraction of mineral resources.

Easily collected oil and gas revenues and their concentration in the legal sector make the existence of a shadow economy unnecessary, and they do not require high quality institutions and serious inflows of 'traditional' taxes: the share of tax revenue in the GDP is comparatively high only in the UAE (12%), with 0.64% in Bahrain, 1.17% in Kuwait, 1.68% in Saudi Arabia, 2.79% in Oman, and 6.51% in Qatar.

Outsiders

These include countries where all state capacity indicators are weak or close to minimal, although the violence control function is maintained.

Core of the cluster: Gabon, Haiti, Gambia, Guinea-Bissau, Zambia, Cameroon, Democratic Republic of the Congo, Kenya, Côte d'Ivoire, Madagascar, Nicaragua, Nigeria, Papua New Guinea, Paraguay, Tanzania, Uganda, Chad, Equatorial Guinea, Eritrea.

On the Verge of Failure

Venezuela, Honduras, El Salvador, Trinidad and Tobago, Jamaica.

In terms of state capacity, these countries are close to 'outsiders'. The critical difference is that they have almost lost their ability to control violence.

Rising Asian Giants

China and India.

These countries have considerable state capacity potential, but are not leaders in these terms.

Despite their status as the biggest arms importers, their military expenditures as a share of their GDP are moderate (about 2% in China and about 2.5% in India).

The quality of administrative institutions is not high, while 'public systems provide for a rather low level of violence with legitimate economic operations'.

Variations of Post-Soviet Trajectories

Russia and Azerbaijan.

The fundamental component of state capacity is military-coercive with a focus on military expenditures (over 4% of the GDP in both countries in 2015).

Violence control within the country is average. The administrative-bureaucratic component looks 'not impressive, to say the least, particularly in terms of quality of institutions'.

'Talks of "total governmentalisation" of the Russian economy are not confirmed by the data, at least that of 2015. Russia, with its public revenue comprising 29% of its GDP, is comparable neither to Oman and Qatar (where it comprises 47.5% and 42.7%), on the one hand, nor to Norway or Finland (44% and 41.4%), on the other'.

Azerbaijan and Russia are the only post-Soviet countries with stable results in the cluster analysis. The researchers named only one stable feature of other post-Soviet countries: 'the ability to join the weirdest associations, which can hardly be explained reasonably.' The scholars concluded that 'probably this means that there is no common "post-Soviet" type of state capacity.'

Credit: 
National Research University Higher School of Economics

Save your money: Vast majority of dietary supplements don't improve heart health or put off death

image: Vitamins for heart health.

Image: 
Johns Hopkins Medicine

In a massive new analysis of findings from 277 clinical trials using 24 different interventions, Johns Hopkins Medicine researchers say they have found that almost all vitamin, mineral and other nutrient supplements or diets cannot be linked to longer life or protection from heart disease.

Although they found that most of the supplements or diets were not associated with any harm, the analysis showed possible health benefits only from a low-salt diet, omega-3 fatty acid supplements and possibly folic acid supplements for some people. Researchers also found that supplements combining calcium and vitamin D may in fact be linked to a slightly increased stroke risk.

Results of the analysis were published on July 8 in Annals of Internal Medicine.

Surveys by the Centers for Disease Control and Prevention show that 52% of Americans take at least one vitamin or other dietary/nutritional supplement daily. As a nation, Americans spend $31 billion each year on such over-the-counter products. An increasing number of studies--including this new one from Johns Hopkins--have failed to prove health benefits from most of them.

"The panacea or magic bullet that people keep searching for in dietary supplements isn't there," says senior author of the study Erin D. Michos, M.D., M.H.S., associate director of preventive cardiology at the Ciccarone Center for the Prevention of Cardiovascular Disease and associate professor of medicine at the Johns Hopkins University School of Medicine. "People should focus on getting their nutrients from a heart-healthy diet, because the data increasingly show that the majority of healthy adults don't need to take supplements."

For the current study, the researchers used data from 277 randomized clinical trials that evaluated 16 vitamins or other supplements and eight diets for their association with mortality or heart conditions including coronary heart disease, stroke, and heart attack. Altogether, they included data gathered on 992,129 research participants worldwide.

The vitamin and other supplements reviewed included: antioxidants, β-carotene, vitamin B-complex, multivitamins, selenium, vitamin A, vitamin B3/niacin, vitamin B6, vitamin C, vitamin E, vitamin D alone, calcium alone, calcium and vitamin D together, folic acid, iron and omega-3 fatty acid (fish oil). The diets reviewed were a Mediterranean diet, a reduced saturated fat (less fat from meat and dairy) diet, modified dietary fat intake (less saturated fat or replacing calories with more unsaturated fats or carbohydrates), a reduced fat diet, a reduced salt diet in healthy people and those with high blood pressure, an increased alpha linolenic acid (ALA) diet (nuts, seeds and vegetable oils), and an increased omega-6 fatty acid diet (nuts, seeds and vegetable oils). Each intervention was also ranked by the strength of the evidence for its impact as high, moderate, low or very low.

The majority of the supplements, including multivitamins, selenium, vitamin A, vitamin B6, vitamin C, vitamin E, vitamin D alone, calcium alone and iron, showed no link to an increased or decreased risk of death or to heart health.

In the three studies of 3,518 people that looked at a low-salt diet in people with healthy blood pressure, there were 79 deaths. The researchers say that they found a 10% decrease in the risk of death in these people, which they classified as a moderate associated impact.

In the five studies in which 3,680 participants with high blood pressure were put on a low-salt diet, the researchers found that the risk of death due to heart disease decreased by 33%; there were 674 heart disease deaths during the study periods. They also classified this intervention as moderate evidence of an impact.

Forty-one studies with 134,034 participants evaluated the possible impact of omega-3 fatty acid supplements. In this group, 10,707 people had events such as a heart attack or stroke indicating heart disease. Overall, these studies suggested that supplement use was linked to an 8 percent reduction in heart attack risk and a 7 percent reduction in coronary heart disease compared to those not on the supplements. The researchers ranked evidence for a beneficial link to this intervention as low.
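Headline figures like the 8 percent reduction above are relative-risk comparisons between trial arms. The toy calculation below shows the arithmetic with hypothetical event counts chosen only for illustration; they are not the study's actual per-arm data.

```python
# Hypothetical arm-level counts (NOT the study's data), chosen only to
# illustrate how a relative risk and a percent risk reduction are derived.
events_suppl, n_suppl = 480, 10_000   # e.g. heart attacks on omega-3
events_ctrl, n_ctrl = 522, 10_000     # heart attacks on placebo

rr = (events_suppl / n_suppl) / (events_ctrl / n_ctrl)  # ratio of event rates
reduction_pct = (1 - rr) * 100                          # relative risk reduction
print(f"relative risk = {rr:.3f}; risk reduction = {reduction_pct:.1f}%")
```

Note that a relative reduction says nothing about the absolute rates involved; with rare events, a large-sounding relative reduction can correspond to a very small absolute difference.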

Based on 25 studies in 25,580 healthy people, data also showed that folic acid was linked to a 20 percent reduced risk of stroke. Some 877 participants had strokes during the trials. The authors graded evidence for a link to beneficial effects as low.

The authors point out that the studies suggesting the greatest impact of folic acid supplementation on reducing stroke risk took place in China, where cereals and grains aren't fortified with folic acid like they are in the U.S. Thus, they say, this apparent protective effect may not be applicable in regions where most people get enough folic acid in their diet.

Twenty studies evaluated the combination of calcium with vitamin D in a supplement. Of the 42,072 research participants, 3,690 had strokes during the trials, and taken together the researchers say this suggests a 17% increased risk for stroke. The risk evidence was ranked as moderate. There was no evidence that calcium or vitamin D taken alone had any health risks or benefits.

"Our analysis carries a simple message that although there may be some evidence that a few interventions have an impact on death and cardiovascular health, the vast majority of multivitamins, minerals and different types of diets had no measurable effect on survival or cardiovascular disease risk reduction," says lead author Safi U. Khan, M.D., an assistant professor of Medicine at West Virginia University.

Credit: 
Johns Hopkins Medicine

A new tool for data scientists and biologists and more

The social network LinkedIn can tell a user how he or she is connected to another. In real life, points of connection are not always so evident. Yet identifying patterns, relationships, and commonalities among entities is a task of critical importance for businesses, biologists, doctors, patients and more.

A new computational tool, developed in the lab of Paul Bogdan, a professor in the USC Viterbi School of Engineering's Ming Hsieh Department of Electrical and Computer Engineering, in collaboration with Ming Hsieh professor Edmond Jonckheere, is able to quickly identify the hidden affiliations and interrelationships among groups, items, or persons with greater accuracy than existing tools.

The researchers in Bogdan's lab are a bit like detectives, and the puzzle they are trying to solve is how one clue, person, item or action is connected to another entity. Imagine a lab dedicated to a scientific "Six degrees of ..." to discover hidden interrelationships. The problem they are tackling is known to researchers who study complex networks as the "community detection problem": identifying and mapping out what individuals or items have in common and how they are connected.

Such a computational tool could be leveraged by various groups: political strategists trying to find voters' overlapping values or shared attributes, or biologists who want to predict the potential of a drug's side effects or interactions--without running years' worth of live experiments. Their research is also being deployed to identify which parts of the brain are working on the same functions--information that neuroscientists and individuals suffering from brain damage need in order to anticipate whether certain areas of the brain might take over functionality for injured tissue. One can also imagine this lab's algorithm finding points of contact in seemingly unrelated information.

Their recent paper, titled "Ollivier-Ricci Curvature-Based Method to Community Detection in Complex Networks", in the journal Nature Scientific Reports, documents the method the group has developed to create this improved tool.

Methodology/Proof of Concept:

PhD candidate Jayson Sia, who worked on the research, says the algorithm they developed, Ollivier-Ricci curvature (ORC)-based community identification, was tested and validated on four well-known real-world data sets in the field, for which the goal is to find the points of connection among the "nodes" (the individuals or individual items in a group) by looking at the links between them, known in technical jargon as "edges." The data sets include a drug-drug interaction network, Zachary's Karate Club, a network of college football conference affiliations, and a set of over 1,000 political blogs.

Says lead author Sia, "In this paper, we utilized a novel geometric approach via the Ollivier-Ricci curvature, which offers a natural method to discover inherent network community structures."

Curvature in the geometric context, explains Sia, "essentially measures how a surface deviates from being flat (or how a surface 'curves'). The geometry of surfaces is related to the study of map projections and how distances are measured in a curved surface such as the Earth. The Ollivier-Ricci curvature extends this concept of 'curvature' to networks with positively curved edges being 'well connected' and naturally forming a 'community.' Negatively curved edges on the other hand are interpreted as 'bridges' between communities and cutting such edges would isolate information flow between communities."
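The paper's full algorithm is not reproduced in this release, but the idea Sia describes can be sketched from the standard definition of Ollivier-Ricci curvature. The code below uses a lazy random-walk measure with self-weight alpha = 0.5 (an assumed parameter, not taken from the paper), computes the curvature of each edge on the Zachary Karate Club network (one of the four test sets mentioned above), and cuts the negatively curved "bridge" edges to expose communities. It is an illustrative sketch, not the authors' implementation.

```python
import networkx as nx
import numpy as np
from scipy.optimize import linprog

def edge_curvature(G, dist, x, y, alpha=0.5):
    """Ollivier-Ricci curvature of edge (x, y): 1 - W1(m_x, m_y) / d(x, y)."""
    def measure(v):
        # Lazy random-walk measure: keep mass alpha at v, spread the rest
        # uniformly over its neighbours.
        nbrs = list(G.neighbors(v))
        return [v] + nbrs, [alpha] + [(1 - alpha) / len(nbrs)] * len(nbrs)

    sx, mx = measure(x)
    sy, my = measure(y)
    n, m = len(sx), len(sy)
    # Wasserstein-1 distance as a linear program over the transport plan,
    # with shortest-path lengths as the ground cost.
    cost = np.array([[dist[a][b] for b in sy] for a in sx], float).ravel()
    A_eq, b_eq = [], mx + my
    for i in range(n):                      # row sums equal m_x
        row = np.zeros(n * m); row[i * m:(i + 1) * m] = 1.0
        A_eq.append(row)
    for j in range(m):                      # column sums equal m_y
        col = np.zeros(n * m); col[j::m] = 1.0
        A_eq.append(col)
    res = linprog(cost, A_eq=np.vstack(A_eq), b_eq=b_eq, method="highs")
    return 1.0 - res.fun / dist[x][y]

G = nx.karate_club_graph()                  # one of the paper's test networks
dist = dict(nx.all_pairs_shortest_path_length(G))
curv = {(u, v): edge_curvature(G, dist, u, v) for u, v in G.edges()}

# Negatively curved edges act as "bridges" between communities;
# cutting them isolates information flow, separating the groups.
bridges = [e for e, k in curv.items() if k < 0]
H = G.copy()
H.remove_edges_from(bridges)
print(len(bridges), "bridge edges;",
      nx.number_connected_components(H), "components after cutting")
```

Solving one small linear program per edge is fine for toy graphs like this one; for the thousand-node blog network the paper mentions, practical implementations approximate the transport problem rather than solving it exactly.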

Credit: 
University of Southern California

Human pancreas on a chip opens new possibilities for studying disease

image: This microscopic image shows human pancreatic cells color coded to show the presence of insulin and the gene CFTR (cystic fibrosis transmembrane conductance regulator). Researchers report in Nature Communications they used bioengineered human pancreatic tissue on a chip to determine that in addition to causing cystic fibrosis, disruption in CFTR gene expression also may help drive the dangerous complication Cystic Fibrosis-Related Diabetes.

Image: 
Cincinnati Children's Hospital Medical Center

CINCINNATI--Scientists created a human pancreas on a chip that allowed them to identify the possible cause of a frequent and deadly complication of cystic fibrosis (CF) called CF-Related Diabetes, or CFRD.

It may be feasible to also use the small two-chambered device, which features bioengineered human pancreatic organoids to study the causes of non-CF-related conditions such as type 1 and 2 diabetes, according to researchers at Cincinnati Children's Hospital Medical Center, who report findings in Nature Communications.

First, however, the scientists want to see if their device can help people with CF--a genetic lung disease caused by a mutation in the CFTR gene. The mutation leads to a water and salt imbalance on cell surfaces that clogs the lungs with thick mucus.

As people with CF get older, they become increasingly at risk for CFRD, according to Anjaparavanda Naren, PhD, the study's principal investigator and Director of the Cystic Fibrosis Research Center (Division of Pulmonary Medicine). Making matters worse is that until now there hasn't been an effective way to study CFRD in the lab to look for better treatments.

"Mouse models of CF don't faithfully recreate CF-Related Diabetes in the lab, and it wasn't possible to study the disease at the depth we achieved in this study," said Naren. "Our technology closely resembles the human pancreas and potentially may help us find therapeutic measures to manage glucose imbalance in people with CF, which is linked to increased illness and death."

The in vitro chip technology can be used to study CFRD and glucose imbalance in specific individuals with the condition, creating the potential for diagnosing different disease manifestations on a highly personalized basis. The chip can help assay variability in the glucose measures of different people, determine correlation of glucose levels with the CFTR mutation type, and test small-molecule interventions.

Chipping Away at CFTR Conundrum

Although mutations in the CFTR gene are known to cause cystic fibrosis, its role in CFRD is unclear. To answer that question, the researchers started by isolating pancreatic ductal epithelial cells and pancreatic islets donated by surgical patients.

The ductal organoids were cultured in a transparent dual-chamber device, called a microfluidic device, which contained specific biochemical solutions to generate the pancreas-on-a-chip. Ductal epithelial cells were cultured in the top chamber and pancreatic islet cells in the bottom chamber, separated by a thin layer of porous membrane that allowed the two chambers to interact.

The cells grew and expanded into three-dimensional pancreatic organs that mimicked cell-to-cell communications and fluid exchange, similar to the function of a naturally developed human pancreas.

When the researchers tested the pancreas-on-a-chip by disrupting CFTR gene expression, it impaired cell-to-cell communication and fluid exchange and negatively affected endocrine function. This caused an insulin deficiency and recreated a CFRD disease process similar to that observed in the pancreas of a person. Researchers said this confirmed that the CFTR gene has a direct role in regulating insulin secretion and causing diabetes in people with CF.

Microfluidic devices have existed since 1979. But innovations in their design and functionality, especially since the advent of organoid technology, now allow researchers to bioengineer human organ tissues and mimic the function of natural organs in a laboratory setting.

Next Steps

The research team, which includes study first author and research associate Kyu Shik Mun, PhD, now will use the devices in a pilot study to test FDA-approved drugs that modulate CFTR gene expression. The goal will be to determine how well different CFTR drugs can slow or reverse lab-simulated CFRD.

Credit: 
Cincinnati Children's Hospital Medical Center

Forces behind growing political polarization in congress revealed in new model

image: A new model accurately predicted the nature of changes in polarization in 28 of the 30 US Congresses elected in the past 6 decades.

Image: 
Rensselaer Polytechnic Institute

TROY, N.Y. -- For much of the 20th century, political polarization within the United States House of Representatives tended to decrease over the course of a two-year term. But starting in the mid-1980s, that trend reversed, and in recent decades, polarization has been more likely to grow.

These findings, published today in Royal Society Interface, are the result of a model developed by researchers at Rensselaer Polytechnic Institute who analyzed millions of roll call votes taken in the U.S. Congress. It was able to accurately predict the nature of changes in polarization in 28 of the 30 U.S. Congresses elected in the past six decades.

"Like the economist Adam Smith's 'invisible hand,' which is the unobservable market force balancing the supply and demand in free markets, we propose in our paper an invisible hand of polarization utility, which decides the level of polarization in votes of legislators," said Boleslaw Szymanski, a distinguished professor of computer science at Rensselaer and a co-author of the paper. "We tend to think humans behave unpredictably, but more and more we see that in a lot of settings, human choices can be explained by abstract and elegant models."

In constructing their model, the authors isolated a force that, they said, plays a role in determining polarization in politics that is analogous to that of gravitational force in physics. They call it "polarization utility," which Szymanski defined as a measure of "how much benefit members of Congress can realize by focusing on issues appealing to their supporters."

The authors identify two critical factors that determine the level of polarization utility: the polarization of voters and an increase in the influence of campaign donors driven by the increasing costs of election campaigns. Lending credence to the power of these two factors is the fact that the two biggest jumps in polarization utility over the last 60 years occurred in 1960 - with the rise of the Civil Rights movement and increased U.S. involvement in the Vietnam War - and in 2010 following the Citizens United decision in the U.S. Supreme Court that opened the door to unrestricted political donations.

"The beauty of this work is that we can use a simple parameter -- polarization utility -- to quantify and predict how polarized the system is," said Jianxi Gao, an assistant professor of computer science at Rensselaer and a co-author on the paper. "This means that, if we want to encourage or discourage polarization of the Congress, we can do so by changing the utility."

Steps that would influence polarization utility might include limiting campaign donations and changing the length of terms that legislators serve. However, the authors stress that they are not calling for either of these steps to be taken.

"This paper enables us to better understand what are the factors impacting polarization in legislative houses," Szymanski said. "Whether we want polarization and how much is not a decision for us scientists, but rather for society as a whole."

While the model used in this study accounts for how the polarization of voters affects members of the legislative body, Szymanski said that an important next step would be to research how the inverse dynamic works. If polarized politicians influence electorates to become more polarized, the authors write, it could create a feedback loop that "might destabilize democracy." They plan to address this question in future research.

Xiaoyan Lu, who recently received his doctoral degree from Rensselaer, was also a co-author on the paper, the full title of which is "The evolution of polarization in the legislative branch of government."

Credit: 
Rensselaer Polytechnic Institute

Avian malaria behind drastic decline of London's iconic sparrow?

image: House sparrow

Image: 
ZSL, BTO

London's house sparrows (Passer domesticus) have plummeted by 71% since 1995, with new research suggesting avian malaria could be to blame.

Once ubiquitous across the capital city, the sudden and unexplained decline of the iconic birds led a team from ZSL (Zoological Society of London), the RSPB, the British Trust for Ornithology (BTO) and the University of Liverpool to investigate whether parasite infections were involved.

Researchers collected data between November 2006 and September 2009 at 11 sites across London. Each site was centred around a single breeding colony and spaced at least four kilometres apart to ensure that birds from different groups didn't mix. The team estimated changes in bird numbers by counting the mature males and took tiny blood and faecal samples from sparrows, carefully caught and soon released, to monitor infection rates and severity.

Of the 11 colonies studied, seven were declining. On average 74% of sparrows carried avian malaria - a strain that only affects birds - but this differed between groups with some as high as 100%. However, it was infection intensity (i.e. the number of parasites per bird) that varied significantly and was higher on average in the declining colonies.

Former ZSL Institute of Zoology researcher and lead author Dr Daria Dadam, now of the BTO, said: "Parasite infections are known to cause wildlife declines elsewhere and our study indicates that this may be happening with the house sparrow in London. We tested for a number of parasites, but only Plasmodium relictum, the parasite that causes avian malaria, was associated with reducing bird numbers."

Professor Andrew Cunningham, Deputy Director of Science at ZSL said: "Although we found that nearly all sparrows carry Plasmodium, there was no association between the number of carriers and local sparrow population growth. Infection intensity, however, was significantly higher in young birds in the declining populations with fewer of the sparrows monitored in those groups surviving from year to year."

The malaria strains the study identified are widespread and infect multiple bird species. They are, therefore, likely to have been native to the UK, and to house sparrows, long before their numbers started to fall. The parasite is spread by mosquitos, which transfer it when they bite to feed. It has been suggested that avian malaria will become more common across Northern Europe due to climate change as higher temperatures and wetter weather favour mosquito reproduction, and more mosquitos will help the disease to spread. Researchers think this could be behind the sudden change.

Dr Will Peach, Head of Research Delivery at RSPB said: "House sparrow populations have declined in many towns and cities across Europe since the 1980s. This new research suggests that avian malaria may be implicated in the loss of house sparrows across London. Exactly how the infection may be affecting the birds is unknown. Maybe warmer temperatures are increasing mosquito numbers, or the parasite has become more virulent."

ZSL works to protect wildlife health and understand how animal diseases spread between populations and habitats. Diseases like avian malaria are a significant cause of wildlife decline and a direct threat to a number of endangered species, and can infect domestic animals too. Only by understanding the mechanisms of infection and the effect that these diseases have can we put in place strategies to mitigate them.

Credit: 
Zoological Society of London

Exercise offers protection against Alzheimer's

BOSTON - Higher levels of daily physical activity may protect against the cognitive decline and neurodegeneration (brain tissue loss) from Alzheimer's disease (AD) that alters the lives of many older people, researchers from Massachusetts General Hospital (MGH) have found. In a paper in JAMA Neurology, the team also reported that lowering vascular risk factors may offer additional protection against Alzheimer's and delay progression of the devastating disease. The findings from this study will be presented at the Alzheimer's Association International Conference (AAIC) in Los Angeles by the first author of the study, Jennifer Rabin, PhD, now at the University of Toronto, Sunnybrook Research Institute.

"One of the most striking findings from our study was that greater physical activity not only appeared to have positive effects on slowing cognitive decline, but also on slowing the rate of brain tissue loss over time in normal people who had high levels of amyloid plaque in the brain," says Jasmeer Chhatwal, MD, PhD of the MGH Department of Neurology, and corresponding author of the study. The report suggests that physical activity might reduce b-amyloid (Ab)-related cortical thinning and preserve gray matter structure in regions of the brain that have been implicated in episodic memory loss and Alzheimer's-related neurodegeneration.

The pathophysiological process of AD begins decades before clinical symptoms emerge and is characterized by early accumulation of β-amyloid protein. The MGH study is among the first to demonstrate the protective effects of physical activity and vascular risk management in the "preclinical stage" of Alzheimer's disease, while there is an opportunity to intervene prior to the onset of substantial neuronal loss and clinical impairment. "Because there are currently no disease-modifying therapies for Alzheimer's disease, there is a critical need to identify potential risk-altering factors that might delay progression of the disease," says Chhatwal.

The Harvard Aging Brain Study at MGH assessed physical activity in its participants - 182 normal older adults, including those with elevated β-amyloid who were judged at high risk of cognitive decline - through hip-mounted pedometers which counted the number of steps walked during the course of the day.

"Beneficial effects were seen at even modest levels of physical activity, but were most prominent at around 8,900 steps, which is only slightly less than the 10,000 many of us strive to achieve daily," notes co-author Reisa Sperling, MD, director of the Center for Alzheimer's Research and Treatment, Brigham and Women's Hospital and Massachusetts General Hospital and co-principal investigator of the Harvard Aging Brain Study.

Interventional approaches that target vascular risk factors along with physical exercise have added beneficial properties, she adds, since both operate independently. Vascular risk factors measured by the researchers were drawn from the Framingham Cardiovascular Disease Risk Score Calculator, and include age, sex, weight, smoking/non-smoking, blood pressure, and whether people are on treatment for hypertension.

Through ongoing studies MGH is working to characterize other forms of physical activity and lifestyle changes that may help retard the progress of Alzheimer's disease. "Beta amyloid and tau protein build-up certainly set the stage for cognitive impairment in later age, but we shouldn't forget that there are steps we can take now to reduce the risk going forward - even in people with build-up of these proteins," says Chhatwal. "Alzheimer's disease and the emergence of cognitive decline is multifactorial and demands a multifactorial approach if we hope to change its trajectory."

Credit: 
Massachusetts General Hospital