Culture

New techniques can detect Lyme disease weeks before current tests

Newark, N.J. (October 11, 2018) - Researchers have developed techniques to detect Lyme disease bacteria weeks sooner than current tests, allowing patients to start treatment earlier.

The findings appear in the journal Clinical Infectious Diseases. The authors include scientists from Rutgers Biomedical and Health Sciences, Harvard University, Yale University, the National Institute of Allergy and Infectious Diseases, FDA, Centers for Disease Control and Prevention, and other institutions.

The new techniques can detect an active infection with the Lyme bacteria faster than the three weeks it takes for the current indirect antibody-based tests, which have been a standard since 1994. Another advantage of the new tests is that a positive result in blood indicates the infection is active and should be treated immediately, allowing quicker treatment to prevent long-term health problems. The techniques detect DNA or protein from the Lyme disease bacteria Borrelia burgdorferi.

"These direct tests are needed because you can get Lyme disease more than once, features are often non-diagnostic and the current standard FDA-approved tests cannot distinguish an active, ongoing infection from a past cured one," said lead author Steven Schutzer, a physician-scientist at Rutgers New Jersey Medical School. "The problem is worsening because Lyme disease has increased in numbers to 300,000 per year in the United States and is spreading across the country and world."

Lyme disease signs frequently, but not always, include a red ring or bull's eye skin rash. When there is no rash, a reliable laboratory test is needed and preferably one that indicates active disease. The only FDA-approved Lyme disease tests rely on detecting antibodies that the body's immune system makes in response to the disease. Such a single antibody test is not an active disease indicator but rather only an exposure indicator -- past or present.

"The new tests that directly detect the Lyme agent's DNA are more exact and are not susceptible to the same false-positive results and uncertainties associated with current FDA-approved indirect tests," said Schutzer. "It will not be surprising to see direct tests for Lyme disease join the growing list of FDA-approved direct tests for other bacterial, fungal and viral infections that include Staphylococcus, Streptococcus, Candida, influenza, HIV, herpes and hepatitis, among others."

The authors developed the paper after a meeting at Cold Spring Harbor Laboratory's Banbury Conference Center, a nonprofit research institution in New York, convened to discuss current Lyme disease tests and the potential of new scientific advances to increase the accuracy of an early diagnosis.

Credit: 
Rutgers University

Army-funded research results in new kits for teaching science

image: BioBits kits are designed to be used by students and teachers with no biological training. They use simple, hands-on experiments, some using common household fruits, to teach concepts of synthetic and molecular biology.

Image: 
(Courtesy photo by Wyss Institute at Harvard University)

ABERDEEN PROVING GROUND, Md. (Oct. 11, 2018) -- An affordable children's educational kit is the latest commercial spinoff of research pursued by the U.S. Army to create advanced materials for Soldier systems.

A team of researchers funded by the Army designed and created a new resource for science teachers. BioBits™ is an educational platform for teaching synthetic biology to students from kindergarten through high school.

The team from the Massachusetts Institute of Technology, Harvard University and Northwestern University recently tested the platform in the Chicago Public Schools, where students and teachers helped test it.

The test findings showed that students and teachers performed experiments successfully. The program fills a gap in current science, technology, engineering and math education, said Dr. Dawanne Poree, the Army's lead program manager on this project.

"The BioBitTM Explorer kit enables hands-on demonstrations of cutting-edge science that is inexpensive and easy to use, circumventing many current barriers for implementing exploratory biology experiments in classrooms," Poree said.

Chicago-area teachers partnered to develop curricula for high school math and middle school science classes, emphasizing the cross-cutting nature and the value of this activity at various educational levels, officials said.

The high cost and specialized equipment required for even the simplest demonstration of synthetic biology tools prevent many K-12 and some undergraduate schools from effectively teaching these methods, Poree said.

"If the United States is to continue leading research in synthetic biology in the coming decades, then current students must be encouraged to pursue long term study in this field," Poree said.

The kits are commercially available for $100 to $200, which covers materials for a classroom of about 30 students.

BioBits is based on the latest methods in synthetic biology. The kits incorporate methods for freeze-drying harmless cellular extracts into pellets that can be packaged and distributed to classrooms.

These pellets provide individual experiments that can be activated in the classroom to reveal the power of synthetic biology through pre-designed reactions, said Dr. Michael Jewett, of Northwestern University.

This platform grew out of prior and ongoing research funded by the Army Research Laboratory's Army Research Office to develop new tools in synthetic biology with the goal of adapting cellular machinery to produce non-biological materials.

Specifically, researchers were initially exploring the mechanisms of protein synthesis inside cells, with the long-term goal of harnessing and adapting cellular machinery to produce non-biological materials. This research involves the development of new methods in synthetic biology that will be required to utilize powerful biological enzymes to create polymers of interest for the Army.

Research in synthetic biology will be essential for realizing scientific discoveries in the 21st century such as responsive materials, game-changing chemical, material, and drug manufacturing methods, the detection and elimination of toxic chemicals, and medical applications ranging from the detection and treatment of traumatic brain injury to the development of integrated prosthetics.

Credit: 
U.S. Army Research Laboratory

White Americans see many immigrants as 'illegal' until proven otherwise, survey finds

image: Bar chart shows percentage of White Americans who suspected an immigrant was illegal based on biographical details shown. (Source: Who are the "Illegals"? The Social Construction of Illegality in the United States; René D. Flores, Ariela Schachter, American Sociological Review 2018)

Image: 
Washington University in St. Louis

Fueled by political rhetoric evoking dangerous criminal immigrants, many white Americans assume low-status immigrants from Mexico, El Salvador, Syria, Somalia and other countries President Donald Trump labeled "shithole" nations have no legal right to be in the United States, new research in the journal American Sociological Review suggests.

In the eyes of many white Americans, just knowing an immigrant's national origin is enough to believe they are probably undocumented, said Ariela Schachter, study co-author and assistant professor of sociology in Arts & Sciences at Washington University in St. Louis.

"Our study demonstrates that the white American public has these shared, often factually incorrect, stereotypes about who undocumented immigrants are," Schachter said. "And this is dangerous because individuals who fit this 'profile' likely face additional poor treatment and discrimination because of suspicions of their illegality, regardless of their actual documentation."

Findings suggest that the mere perception of illegal status may be enough to place legal immigrants, and even U.S. citizens, at greater risk for discrimination in housing and hiring, for criminal profiling and arrest by law enforcement, and for public harassment and hate crimes in the communities they now call home.

"When people form impressions about who they think is 'illegal,' they often do not have access to individuals' actual documents. There have actually been a number of recent incidents in which legal immigrants and even U.S. born Americans are confronted by immigration authorities about their status. So these judgments seem to be based on social stereotypes. Our goal was to systematically uncover them," said study co-author René D. Flores, the Neubauer Family Assistant Professor of Sociology at the University of Chicago.

From a broader sociological perspective, Schachter and Flores argue that an immigrant's real standing in American society is shaped not just by legal documentation, but also by social perceptions.

"These findings reveal a new source of ethnic-based inequalities -- 'social illegality' -- that may potentially increase law enforcement scrutiny and influence the decisions of hiring managers, landlords, teachers and other members of the public," they conclude in the research.

Conducted in November 2017, the experimental survey asked a representative sample of 1,500 non-Hispanic white Americans to guess whether a hypothetical immigrant was in the country illegally -- and perhaps a threat worth reporting to authorities -- based on the reading of a brief biographical sketch.

By systematically varying the immigrant's nation of origin, education level, language skills, police record, gender, age, race and other variables, researchers created a pool of nearly 7 million unique immigrant sketches that touched on a range of stereotypes. Each respondent was randomly assigned to view 10 of these unique sketches during the survey.
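To make this kind of factorial survey design concrete, here is a minimal Python sketch that builds hypothetical immigrant profiles by combining attribute values and randomly assigns a handful to a respondent. The attribute names and values below are illustrative assumptions, not the instrument used in the published study.

```python
import random

# Hypothetical attribute pools -- illustrative stand-ins, not the study's actual categories.
ATTRIBUTES = {
    "national_origin": ["Mexico", "El Salvador", "Syria", "Somalia", "Germany", "India", "China"],
    "education": ["no high school", "high school diploma", "college degree"],
    "english": ["speaks no English", "speaks some English", "speaks fluent English"],
    "police_record": ["no police record", "arrested once", "convicted of a crime"],
    "gender": ["man", "woman"],
    "age": ["22", "35", "54"],
    "benefits": ["receives no benefits", "receives government benefits"],
}

# Size of the full factorial pool: the product of the number of values per attribute.
pool_size = 1
for values in ATTRIBUTES.values():
    pool_size *= len(values)
print(f"unique sketches in this toy design: {pool_size}")

def draw_sketch(rng: random.Random) -> dict:
    """Draw one hypothetical profile by picking a value for each attribute at random."""
    return {attr: rng.choice(values) for attr, values in ATTRIBUTES.items()}

# As in the study design, each respondent evaluates a small random subset of sketches.
rng = random.Random(42)
respondent_sketches = [draw_sketch(rng) for _ in range(10)]
for sketch in respondent_sketches[:2]:
    print(sketch)
```

With many more attributes and levels, the same construction quickly yields millions of distinct profiles, which is why each respondent only sees a small random sample.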

Using complex statistical analysis, researchers estimated how much each of these individual immigrant traits and stereotypes influenced the assumptions of white Americans from various demographic backgrounds, geographic regions and self-identified political affiliations.

Surprisingly, the study found that white Republicans and white Democrats jump to many of the same conclusions about the legal status of hypothetical immigrants -- except when it comes to the receipt of government benefits.

Democrats rightfully recognize that in order to receive government benefits, immigrants must have legal documentation, whereas Republicans are more likely to suspect that receiving benefits marks an immigrant as illegal, even though by law undocumented immigrants are blocked from receiving federal benefits such as welfare.

Most tellingly, even the slightest hint of an immigrant with a criminal background has a huge effect on whether a white American suspects that the immigrant is in the country illegally.

"Saying an immigrant committed a crime had a larger impact on suspicions of illegality than saying they were, say, Mexican," Schachter said. "This is true for both white Democrats and white Republicans. There's a clear implication that the Trump administration's rhetoric on immigrant criminality is driving these beliefs, which, again, are not based in reality. In fact, other research finds that undocumented immigrants are less likely to commit crimes than native-born Americans."

The study also demonstrates significant differences in how immigrants from various countries and differing social statuses are likely to be treated in the United States.

Credit: 
Washington University in St. Louis

Your smartphone could soon be making your commute much less stressful

image: Researchers at the University of Sussex used mobile phones to collect data on different modes of transport.

Image: 
The University of Sussex

Apps that can detect what mode of transport phone users are travelling on and automatically offer relevant advice are set to become a reality after extensive data-gathering research led by the University of Sussex.

Researchers at the University of Sussex's Wearable Technologies Lab believe that the machine learning techniques developed in a global research competition they initiated could also lead to smart phones being able to predict upcoming road conditions and traffic levels, offer route or parking recommendations and even detect the food and drink consumed by a phone user while on the move.

Professor Daniel Roggen, a Reader in Sensor Technology at the University of Sussex, said: "This dataset is truly unique in its scale, the richness of the sensor data it comprises and the quality of its annotations. Previous studies generally collected only GPS and motion data. Our study is much wider in scope: we collected all sensor modalities of smartphones, and we collected the data with phones placed simultaneously at four locations where people typically carry their phones such as the hand, backpack, handbag and pocket.

"This is extremely important to design robust machine learning algorithms. The variety of transport modes, the range of conditions measured and the sheer number of sensors and hours of data recorded is unprecedented."

Prof Roggen and his team collected the equivalent of more than 117 days' worth of data monitoring aspects of commuters' journeys in the UK using a variety of transport methods to create the largest publicly available dataset of its kind.

The project, whose findings will be presented at the Ubicomp conference in Singapore on Friday [October 12], gathered data from four mobile phones carried by researchers as they went about their daily commute over seven months.

The team launched a global competition challenging teams to develop the most accurate algorithms to recognize eight modes of transport (sitting still, walking, running, cycling or taking the bus, car, train or subway) from the data collected from 15 sensors measuring everything from movement to ambient pressure.
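As a rough illustration of the classification task the competition posed, the sketch below trains an off-the-shelf classifier on summary features computed over fixed windows of multichannel sensor data. The synthetic data, the feature set, and the model choice are assumptions made for illustration only; they are not the SHL dataset or any competition entry.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Toy stand-in for the task: recognize 8 locomotion/transport modes from windowed signals.
MODES = ["still", "walk", "run", "bike", "car", "bus", "train", "subway"]
rng = np.random.default_rng(0)

def window_features(window: np.ndarray) -> np.ndarray:
    """Per-window summary statistics (mean, std, mean absolute difference) for each channel."""
    return np.concatenate([
        window.mean(axis=0),
        window.std(axis=0),
        np.abs(np.diff(window, axis=0)).mean(axis=0),
    ])

# Synthetic dataset: 100 windows per mode, each 500 samples x 6 channels
# (e.g. 3-axis accelerometer plus 3-axis gyroscope).
X, y = [], []
for label in range(len(MODES)):
    for _ in range(100):
        window = rng.normal(loc=label * 0.1, scale=1.0 + 0.05 * label, size=(500, 6))
        X.append(window_features(window))
        y.append(label)
X, y = np.array(X), np.array(y)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0, stratify=y)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)
print("toy accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```

Real entries worked from far richer signals (all the phone's sensor modalities, at four body positions), and the strongest ones combined deep and classical models, but the overall shape of the pipeline is the same: windowing, feature extraction, supervised classification.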

The project, supported by Chinese telecoms giant Huawei with academics at Ritsumeikan University and Kyushu Institute of Technology in Japan and Saints Cyril and Methodius University of Skopje in Macedonia, saw 17 teams take part with two entries achieving results with more than 90% accuracy, eight with between 80% and 90%, and nine between 50% and 80%.

The winning team, JSI-Deep of the Jozef Stefan Institute in Slovenia, achieved the highest score of 93.9% through the use of a combination of deep and classical machine learning models. In general deep learning techniques tended to outperform traditional machine learning approaches, although not to any significant degree.

It is now hoped that the highly versatile University of Sussex-Huawei Locomotion-Transportation (SHL) dataset will be used for a wide range of studies into electronic logging devices exploring transportation mode recognition, mobility pattern mining, localization, tracking and sensor fusion.

Prof Roggen said: "By organising a machine learning competition with this dataset we can share experiences in the scientific community and set a baseline for future work. Automatically recognising modes of transportation is important to improve several mobile services - for example to ensure video streaming quality despite entering in tunnels or subways, or to proactively display information about connection schedules or traffic conditions.

"We believe other researchers will be able to leverage this unique dataset for many innovative studies and novel mobile applications beyond smart-transportation, for example to measure energy expenditure, detect social interaction and social isolation, or develop new low-power localisation techniques and better mobility models for mobile communication research."

Credit: 
University of Sussex

Recent National Academies report puts research participants' rights at risk, say law scholars

In a Policy Forum article appearing in the Oct. 12 issue of Science, leading bioethics and legal scholars sound the alarm about a recent report from the National Academies of Sciences, Engineering, and Medicine. The Academies' report on "Returning Individual Research Results to Participants" makes recommendations on how to share research results and data with people who agree to participate in research studies and calls for problematic changes to federal law. This report proclaims its support for research participants' rights but, in reality, creates major new roadblocks to the return of data and results to participants and would roll back important privacy protections people have under current law, according to the analysis in the new Science article.

The article's authors, Susan M. Wolf and Barbara J. Evans, collaborated as part of the "LawSeq: Building a Sound Legal Foundation for Translating Genomics into Clinical Application" project funded by the National Human Genome Research Institute and National Cancer Institute of the National Institutes of Health. Wolf is the McKnight Presidential Professor of Law, Medicine & Public Policy; Faegre Baker Daniels Professor of Law; and Professor of Medicine at the University of Minnesota and is Chair of the University's Consortium on Law and Values in Health, Environment & the Life Sciences. Evans is the Mary Ann and Lawrence E. Faust Professor of Law, Professor of Electrical and Computer Engineering, and Director of the Center for Biotechnology & Law at the University of Houston.

"Researchers conducting imaging, environmental health, and genetics studies have offered participants their research findings for years," Wolf and Evans point out. Research participants value access to their results for a wide range of reasons, including protecting their health, and evaluating the privacy risks posed by circulation of their data. People value access to results even when the results are still under study and may be uncertain. Over the past 20 years, researchers have developed pathways for returning results in situations where the results raise clinical concerns, such as suggesting that the person may have a medical condition that needs clinical follow-up evaluation. These pathways are ethically sound and protect the participants' safety by ensuring compliance with necessary laws and regulations. Unfortunately, the Policy Forum article asserts, "the Academies' report rejects this widely supported, legally sound approach" and instead recommends restrictions on access to research results and data.

Wolf and Evans write that, "Efforts to turn back the clock on return of results appear rooted in confusion about the law." The Academies' report incorporates incorrect statements about the federal CLIA legal framework, which aims to ensure the quality of laboratory tests conducted for health care purposes.

The report overstates the degree to which research laboratories can be regulated under the CLIA statute.

The Academies' report also conflicts with existing federal privacy laws that protect research participants' access to their own data. For more than 50 years, Congress has treated individual access to one's own data as an essential element of personal privacy protection, as seen in the Privacy Act that protects data stored in governmental databases, the HIPAA Privacy Rule that protects Americans' medical privacy, and the Genetic Information Nondiscrimination Act that expanded HIPAA's protections to genetic information. Only by seeing the personal data collected can an individual assess the privacy risks involved. Yet the Academies' report recommends that an individual's access to their data be restricted to the subset of data that meets certain quality standards. Wolf and Evans explain how this would undermine federal privacy protections, which recognize that privacy can be put at risk even by low-quality data and data that is wrongly attributed to a person.

Finally, the Policy Forum article criticizes the Academies' recommendation to load multiple decisions about return of results on Institutional Review Boards (IRBs). This would place "substantial new burdens on IRBs, despite extensive literature on the limits of IRB decision making." The report "maximizes the burden on IRBs by mischaracterizing existing consensus guidelines and suggesting that IRBs start over."

Wolf and Evans conclude, "The Academies' report endorses the idea of participant access to results and data, but then builds daunting barriers. The report rejects established legal rights of access, two decades of consensus guidelines, and abundant data showing that participants benefit from access while incurring little risk. The report too often prefers paternalistic silence over partnership."

"True progress on return of results requires accepting participants' established rights of access and respecting the value that participants place on broad access to their data and results. The next step is not to build barriers but to promote transparency."

Credit: 
University of Minnesota

UK Alabama rot risk may be linked to certain types of dog breed and habitat

The risk of contracting cutaneous and renal glomerular vasculopathy (CRGV), popularly known as Alabama Rot, may be higher in certain types of dog breed and land habitat, indicate two linked studies published in this week's Vet Record.

The clinical signs of Alabama Rot typically include skin ulcers and anaemia, progressing to kidney damage and renal failure. As yet, the cause is unknown.

The first known cases in the UK were reported in 2012 in the New Forest in southern England, and cases have tended to occur more frequently at certain times of the year, and in certain geographical areas. But it's not clear what other potential risk factors there might be.

In the first study, the researchers assessed whether certain types of breed might be more at risk. They looked at 101 reported cases (out of 103) diagnosed between November 2012 and May 2017, comparing them with more than 446,000 dogs in receipt of veterinary care at practices submitting data on health issues to the VetCompass programme during 2013.

On average, the vet practice dogs were nearly 4.5 years old, and just over half were male. The most common Kennel Club breeds were gun dogs (spaniels and retrievers), terriers, and toy dogs; working dogs and hounds (salukis, whippets, and Hungarian vizslas) were the least common.

Crossbreds made up over a third (just under 38%) of all the dogs, with labrador retrievers, Staffordshire bull terriers, and Jack Russells, the most common specified breeds.

Compared with the vet practice dogs, those diagnosed with Alabama Rot were more likely to be female (58% vs 48%) and neutered (69% vs 45.5%).

Among the Kennel Club breeds, gun dogs and hounds made up nearly two thirds (60%) of the Alabama Rot cases. They were between 9 and 11 times as likely to have been diagnosed as terriers.

Among the specified breeds, Staffordshire bull terriers, Jack Russells and German shepherds were the least likely to have been diagnosed, while English springer spaniels, whippets, flat-coated retrievers and Hungarian vizslas were the most likely.

"It is possible that these breed associations result from an inherent susceptibility among these breeds as a result of genetic or behavioural patterns, but it is also possible that the predisposition results from geographical confounding whereby these breeds may occur more commonly in areas with a high risk of CRGV occurrence," explain the researchers.

In the second study, the researchers looked only at the 101 dogs that had been diagnosed with Alabama Rot to see if there were any patterns in timing, geography, and terrain.

Most cases (90%) were reported between December and May, with a third diagnosed between January and March. Fewer than one in 10 cases were diagnosed between June and August.

Cases were reported from most of the western and southern regions of England, over the five years, with the lowest risk seemingly in eastern England, in particular, East Anglia.

Habitat emerged as an influential factor, accounting for more than 20 per cent of the difference in the geographical distribution of cases. Dry lowland heathland and woodland areas were the most likely to be associated with a diagnosis, while pasture was the least likely.

Areas with higher maximum temperatures in winter, and higher average rainfall in winter and spring (such as the West and South of England), were also associated with a heightened risk of a diagnosis.

Both these studies are observational, and as such, no definitive conclusions can be reached about causality.

But the researchers say their findings may help raise the index of suspicion among vets, as it is particularly important to treat Alabama Rot promptly, as well as giving dog owners an indication of when to be extra vigilant.

But further research is warranted to find out if the breeds seemingly at higher risk are inherently more vulnerable or whether there are higher proportions of these breeds in areas of greater risk, they conclude.

In a linked editorial, managing editor Suzanne Jarvis reiterates this uncertainty, emphasising the need for caution until further research can shed more light on the matter.

"The next step is to see if this suspected geographical connection holds true; what it is not, is to scare owners of these identified breeds," she insists.

Credit: 
BMJ Group

Ability to recover after 'maximum effort' is crucial to soccer stardom

Footballers' ability to recover after high-intensity effort may not depend on their age, but on their division level, a new study has suggested.

A multinational team of scientists led by the Complutense University of Madrid (UCM) carried out maximum-effort tests with Spanish division one and division two soccer players.

They then measured the players' oxygen consumption, heart rate and ventilation during recovery.

Professor Francisco Javier Calderón Montero, from UCM, is the study's lead author. He said: "Regardless of a player's technical ability, the ability to repeat sprints is essential in soccer. Players may need to sprint every 90 seconds during a game, meaning the available recovery time will be short."

"We wanted to discover if the difference in the recovery time before the next sprint were linked to the level at which a soccer player competes."

The researchers' findings, published today in Physiological Measurement, show that compared to first division players, second division players took longer to recover from maximum effort exertion.

Professor Montero explained: "Our results showed second division players had higher oxygen consumption and heart rate than first division players after 90 seconds of recovery time. These differences were still clear after 180 seconds of recovery time."

"The second division players, therefore, took much longer to recover to the point where they were able to repeat the effort. They are therefore unlikely to be able to repeat sprints as often and as intensely as first division players."

A sample of 194 male soccer players, from seven clubs in the Spanish Professional Football League, took part in the study. There were 114 first division and 80 second division players, comprising 12 goalkeepers, 57 defenders, 86 midfield players, and 39 strikers.

All underwent the same maximum effort test. They first warmed up by running on a treadmill for two minutes at 4 km/h, before increasing their speed until they reached a heart rate of 120-130 bpm. They maintained this for three minutes. After this warm-up period, they rested until their respiratory quotient (RQ) stabilized. The maximum effort test began at a speed of 6 km/h on a one per cent slope. The players' running speed increased by 2 km/h every two minutes until they reached maximum effort. The players then underwent a three-minute period of active recovery.
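The staged part of the protocol can be written down as a simple schedule. The sketch below enumerates the stages from the parameters stated above (6 km/h start, +2 km/h every two minutes, 1% slope); the 20 km/h upper bound is an illustrative assumption standing in for each player's individual point of exhaustion, which the protocol itself does not fix.

```python
# Minimal sketch of the incremental treadmill test stages described above.
# The max_kmh cut-off is a hypothetical cap, not part of the published protocol.
def incremental_stages(start_kmh=6.0, step_kmh=2.0, stage_min=2.0, max_kmh=20.0):
    """Yield (stage_index, speed_kmh, cumulative_minutes) for each two-minute stage."""
    speed, minutes, stage = start_kmh, 0.0, 1
    while speed <= max_kmh:
        minutes += stage_min
        yield stage, speed, minutes
        speed += step_kmh
        stage += 1

for stage, speed, minutes in incremental_stages():
    print(f"stage {stage}: {speed:.0f} km/h, ends at {minutes:.0f} min (1% slope)")
```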

Although the team found a difference in recovery time between players from divisions one and two, the position in which they played had no bearing on the results.

Corresponding author Dr Luca Paolo Ardigò, from the University of Verona, said: "A possible explanation may be that the second division players always played in lesser category clubs with lower training demands. However, the players in our study showed no differences in self-declared duration of training or weekly recovery."

"It is possible that the intensity of training differed, and further study is needed to uncover whether this is a factor."

Credit: 
IOP Publishing

Organs-on-chip technology reveals new drug candidates for Lou Gehrig's disease

image: ALS on a chip with hiPS-derived optogenetic motor neuron from ALS patient (green) and hiPS-derived skeletal muscle cells (purple) was established to represent ALS pathology.

Image: 
Tatsuya Osaki/MIT

The investigation of amyotrophic lateral sclerosis (ALS) - also known as Lou Gehrig's disease - through muscle-on-a-chip technology has revealed a new drug combination that may serve as an effective treatment of the progressive neurodegenerative disease. These findings highlight organ-on-a-chip technologies - in which live conditions of the body are mimicked in a microfluidic cell culture - as promising platforms for testing drug candidates and investigating the pathogenesis of ALS, which remains largely unknown.

The disease currently impacts around 12,000 to 15,000 people in the U.S. ALS involves the loss of motor neurons in the spinal cord and motor cortex, which leads to progressive paralysis, muscle atrophy and death. While roughly 10% of ALS patients have a familial version of the disease, which can typically be traced back to a genetic mutation, 90% of patients have "sporadic ALS," which has no known familial links or causes. As the few FDA-approved drugs currently on the market for ALS lack full effectiveness, there is an urgent need for ALS therapy investigations in the clinic, using better clinical models that can go beyond the limitations of animal models.

Here, Tatsuya Osaki and colleagues created a disease-on-a-chip technology-based approach. It features a microfluidic chip loaded with healthy skeletal muscle bundles and induced pluripotent stem cell-derived, light-sensitive motor neurons from a sporadic ALS patient. Light was used to activate muscle contraction and control neural activity on the chips. Compared to chips with non-ALS-patient-derived cells, the ALS-on-a-chip exhibited fewer and weaker muscle contractions, degraded motor neurons, and increased muscle cell death. Application of two neuroprotective molecules - rapamycin and bosutinib (both in clinical trials) - helped recover muscle contraction induced by motor neuron activity and improve neuronal survival in the chip-based model of disease. Importantly, each treatment on its own has a limited ability to penetrate the blood-brain barrier, but when combined, the molecular duo could efficiently cross blood-brain-barrier-like cell layers built onto the chip.

Credit: 
American Association for the Advancement of Science (AAAS)

New appropriate use criteria for lumbar puncture in Alzheimer's diagnosis

In preparation for more tools that detect and measure the biology associated with Alzheimer's and other dementias earlier and with more accuracy, an Alzheimer's Association-led Workgroup has published appropriate use criteria (AUC) for lumbar puncture (spinal tap) and spinal fluid analysis in the diagnosis of Alzheimer's disease.

The AUC is available online from Alzheimer's & Dementia: The Journal of the Alzheimer's Association as an article in press, corrected proof.

"Early and accurate diagnosis of Alzheimer's disease is critical as therapies that have the potential to stop or slow the progression of the disease become available," said Maria C. Carrillo, Ph.D., Chief Science Officer at the Alzheimer's Association. "These criteria will arm medical professionals with necessary guidance when the use of lumbar puncture is an appropriate part of the process to diagnose Alzheimer's disease and other dementias, thereby giving people with dementia and their families the possibility of a head start in preparing for the course of their disease."

Alzheimer's disease is commonly diagnosed by a thorough examination of physical health, medical history and assessment of memory, thinking and reasoning. Lumbar puncture, while not currently in routine clinical practice in the U.S., is anticipated to be a safe and cost-effective way to retrieve cerebrospinal fluid (CSF) to test for biological markers of Alzheimer's disease, potentially delivering valuable diagnostic information to clinicians and their patients earlier in the course of the disease.

The Workgroup's efforts complement the 2013 AUC for brain amyloid PET scans developed by the Society of Nuclear Medicine and Molecular Imaging (SNMMI) and the Alzheimer's Association.

The lumbar puncture AUC criteria recommend clinicians consider the following patient populations as appropriate and inappropriate:

Appropriate uses of lumbar puncture:

A patient has subjective cognitive decline (SCD) and is considered to be at an increased risk for Alzheimer's disease based on indicators that include a persistent decline in memory, younger onset age (under 60), onset in the last 5 years and others. The decision to perform CSF biomarker testing in this case should be individualized and most strongly supported when the individual, family and clinician all are concerned about the patient's cognitive decline.

A patient has mild cognitive impairment (MCI) that is persistent, progressive and unexplained. MCI includes mild deficits on cognitive testing but no change in functional abilities.

A patient has symptoms that suggest possible Alzheimer's disease, meaning the dementia could be due to another cause.

A patient has MCI or dementia with onset at an early age.

A patient meets core clinical criteria for probable Alzheimer's disease with typical age of onset.

A patient's dominant symptom is an unexplained change in behavior, such as delusions and delirium, and an Alzheimer's disease diagnosis is being considered.

Inappropriate uses of lumbar puncture:

A patient is cognitively unimpaired, is within the normal range of functioning for their age and lacks significant risk factors for Alzheimer's disease.

A patient is cognitively unimpaired but is considered to be at risk for Alzheimer's disease based on their family history.

A patient has SCD and has been evaluated and found by a clinician not to be at high risk for Alzheimer’s disease based on indications such as no family history or limited concern from an informant like a partner or family member.

A patient has symptoms of rapid eye movement (REM) sleep behavior disorder, which is a strong predictor of disorders such as Parkinson's disease and Lewy body dementia.

A patient already has been diagnosed with Alzheimer's and the test's use would be to determine the stage of their disease or its severity.

A patient is an apolipoprotein E-e4 (ApoE-e4) carrier who has no cognitive impairment. ApoE-e4 is a gene variant strongly associated with risk for late-onset Alzheimer's.

The test is being used in lieu of genotyping for individuals who are suspected to carry a rare genetic mutation that causes an early-onset form of Alzheimer's disease.

The AUC includes suggestions from the workgroup on implementing the criteria in clinical practice. They recommend that CSF biomarker testing be done by dementia experts who can determine the appropriateness of the test, educate the patient and family about the benefits and risks, ensure the procedure follows established guidelines, and integrate the results into the patient's treatment plan.

Credit: 
Alzheimer's Association

Sit-stand office desks cut daily sitting time, may boost job performance

Sit-stand workstations that allow employees to stand, as well as sit, while working on a computer reduce daily sitting time and appear to have a positive impact on job performance and psychological health, finds a trial published by The BMJ today.

The results show that employees who used the workstations for 12 months, on average, reduced their sitting time by more than an hour a day, with potentially meaningful benefits.

High levels of sedentary behaviour (sitting) have been associated with an increased risk of chronic diseases (type 2 diabetes, heart disease, and some cancers) as well as death and have been shown to be detrimental for work related outcomes such as feelings of engagement and presenteeism (going to work despite illness).

Office workers are one of the most sedentary populations, spending 70-85% of time at work sitting, but studies looking at ways to reduce sitting in the workplace have been deemed low quality.

So a team of researchers based in the UK, with collaborators in Australia, set out to evaluate the impact of Stand More AT (SMArT) Work, an intervention designed to reduce sitting time at work.

The trial involved 146 office workers based at the University Hospitals of Leicester NHS Trust, of whom 77 were randomly assigned to the intervention group and 69 to the control group over a 12-month period.

The average age of participants was 41 years, 78% reported being of white European ethnicity, and the majority (80%) were women.

The intervention group were given a height adjustable workstation, a brief seminar with supporting leaflet, and workstation instructions with sitting and standing targets. They also received feedback on sitting and physical activity, an action planning and goal setting booklet, a self monitoring and prompt tool, and coaching sessions. The control group carried on working as usual.

Workers' sitting time was measured using a device worn on the thigh at the start of the study (baseline) and at 3, 6, and 12 months. Daily physical activity levels and questions about work (e.g. job performance, engagement) and health (e.g. mood, quality of life) were also recorded.

At the start of the study, overall sitting time was 9.7 hours per day. The results show that sitting time was lower by 50.62 minutes per day at 3 months, 64.40 minutes per day at 6 months, and 82.39 minutes per day at 12 months in the intervention group compared with the control group. Prolonged sitting time was also reduced in the intervention group.

The reduction in sitting was largely replaced by time spent standing rather than moving, as stepping time and physical activity remained unchanged.

The results also suggest improvements in job performance, work engagement, occupational fatigue, presenteeism, daily anxiety and quality of life, but no notable changes were found for job satisfaction, cognitive function, and sickness absence.

The authors say this was a well-designed trial and their results remained largely unchanged after further analyses. But they acknowledge that their findings may not apply to other organisations, and that self-reporting of work related outcomes may have affected the results.

Nevertheless, they say SMArT Work successfully reduced sitting time over the short, medium, and longer term, and positive changes were observed in work related and psychological health.

And they suggest future research should assess the longer term health benefits of displacing sitting with standing and how best to promote movement rather than just standing while at work.

In a linked editorial, Dr Cindy Gray at the University of Glasgow says this is an important study that demonstrates lasting reductions in sedentary behaviour and other work-related benefits. But she questions the potential health gains of simply replacing sitting with standing. The intervention did not increase potentially more beneficial physical activity.

She also questions SMArT Work's transferability and suitability for other types of employees, including shift workers, as well as its cost-effectiveness, which she says should be addressed in future research.

Credit: 
BMJ Group

Nutrients may reduce blood glucose levels

image: Mary-Elizabeth Patti, MD, is investigator in the Section on Integrative Physiology and Metabolism at Joslin Diabetes Center and associate professor of medicine at Harvard Medical School.

Image: 
John Soares

BOSTON - (October 10, 2018) - Type 2 diabetes is driven by many metabolic pathways, with some pathways driven by amino acids, the molecular building blocks for proteins. Scientists at Joslin Diabetes Center now have shown that one amino acid, alanine, may produce a short-term lowering of glucose levels by altering energy metabolism in the cell.

"Our study shows that it's possible we can use specific nutrients, in this case amino acids, to change metabolism in a cell, and these changes in metabolism can change how cells take up and release glucose in a beneficial way," says Mary-Elizabeth Patti, MD, an investigator in Joslin's Section on Integrative Physiology and Metabolism and senior author on a paper about the work recently published in Molecular Metabolism.

Performed in cells and in mice, her group's research began with an attempt to see what nutrients might activate a key protein called AMP kinase (AMPK), says Patti, who is also an associate professor of medicine at Harvard Medical School.

"AMPK is an enzyme in cells throughout the body that is activated when nutrient supplies are low, or in response to exercise," she explains. "AMPK then causes a lot of beneficial changes in the cell, turning on genes that serve to increase energy production. AMPK is a good thing, and it also can be activated by a variety of treatments for type 2 diabetes, such as metformin."

That raised a question for Patti and her colleagues: Could an amino acid switch on this beneficial enzyme?

The investigators began their study by testing many amino acids in rat liver cells (the liver is a crucial organ in glucose metabolism). "Alanine was the one amino acid that was consistently able to activate AMPK," Patti says.

The researchers then confirmed that AMPK was producing some of its usual metabolic effects after alanine activation. Additionally, the activation could be seen in human and mouse liver cells as well as rat liver cells, and was present with either high or low levels of glucose in the cells.

Next, scientists gave alanine by mouth to mice and found that levels of AMPK rose in the animals. Moreover, if mice ate alanine before they received a dose of glucose, their resulting blood glucose levels were significantly lower. And while glucose metabolism often behaves quite differently in lean mice than in obese mice, this mechanism was seen in both groups of mice.

Following up, the Joslin team found that the glucose lowering didn't seem to be driven by increases in insulin secretion or decreases in secretion of glucagon, a hormone that increases glucose. Instead, AMPK was boosting glucose uptake in the liver and decreasing glucose release. Further experiments in cells demonstrated that the activated enzyme was altering the Krebs cycle, a central component of cell metabolism.

"All these data together suggest that amino acids, and specifically alanine, may be a unique potential way to modify glucose metabolism," Patti sums up. "If it eventually turns out that you can do that by taking an oral drug as a pre-treatment before a meal, that would be of interest. However, this is early-stage research, and we need to test the concept both in mice and ultimately in humans."

Credit: 
Joslin Diabetes Center

New type of stellar collision discovered

image: This object is possibly the oldest of its kind ever catalogued: the hourglass-shaped remnant named CK Vulpeculae. Originally thought to be a nova, classifying this unusually shaped object correctly has proven challenging over the years. A number of possible explanations for its origins have been considered and discarded. It is now thought to be the result of two stars colliding. These new observations are the first to bring this system into focus, suggesting a solution to a 348-year-old mystery.

Image: 
ALMA (ESO/NAOJ/NRAO)/S. P. S. Eyres

For three and a half centuries, astronomers have pondered a mystery: What did the French monk and astronomer Père Dom Anthelme see when he described a star that burst into view in June 1670, just below the head of the constellation Cygnus, the swan?

It was long thought to be a nova--a star that periodically brightens as it blows off mass. But now, an international team of astrophysicists, including two professors at the University of Minnesota, have cracked the 348-year-old conundrum. The monk witnessed the explosive merger of white and brown dwarf stars--the first ever identified.

The work, led by astrophysicists at Keele University (England), is published in the Monthly Notices of the Royal Astronomical Society.

White dwarfs are the remnants of stars like the sun at the end of its life, while brown dwarfs are "failed stars" that have 15-75 times the mass of Jupiter, but not enough to ignite the thermonuclear fusion reactions that power the sun and other stars. The two stars orbited each other until they got too close and merged, spewing out debris whose chemical composition gave away the secret of the mystery object's origin.

The brown dwarf got the raw end of the deal.

"It was as if you put salsa fixings into a blender and forgot to put the lid on," said Charles Woodward, a physics and astronomy professor in the College of Science and Engineering at the University of Minnesota. "The white dwarf was like the blades at the bottom and the brown dwarf was the edibles. It was shredded, and its remains spun out in two jets--like a jet of goop shooting from the top of your blender as you searched frantically for the lid."

Woodward and fellow University of Minnesota physics and astronomy professor Robert Gehrz were members of the team that proposed studying the object and assisted in designing the program of observations, which were done at the Atacama Large Millimeter/submillimeter Array (ALMA) of telescopes in Chile.

Beneath the swan, an odd duck

The unusual star has been dubbed CK Vulpeculae, as it resides in the constellation Vulpecula (the little fox). It is just below the star Albireo, the head of Cygnus, the swan. That location is inside the Summer Triangle of bright stars, which is now high in the south after nightfall. The star is approximately 2,200 light-years from Earth.

The white dwarf and brown dwarf started out fairly ordinary--orbiting each other in a binary system, as astrophysicists believe most stars are born. The white dwarf had an estimated 10 times the brown dwarf's mass. As they merged, the brown dwarf was torn apart and its remains dumped on the surface of the white dwarf. That star's crushing gravity heated the brown dwarf material and caused thermonuclear "burning" that spilled out a cocktail of molecules and unusual forms (isotopes) of chemical elements. It also caused the brightening that caught the eye of the monk Anthelme.

Rounding up the unusual suspects

CK Vulpeculae isn't visible to the naked eye, but through the telescope, the debris ejected during the merger appears as two bright rings of dust and gas that form a glowing hourglass structure around a compact central object. Studying the light from two background stars as it passed through the system, the researchers noted the presence of lithium, a light element that can't exist in the interiors of stars, where nuclear fusion occurs. They also found organic molecules like formaldehyde and methyl alcohol, which also would perish in stellar interiors. Thus, these molecules must have been produced in the debris from the collision.

The amount of dust in the debris was about one percent the mass of the sun.

"That's too high for a classical nova outburst and too low for mergers of more massive stars, as had been proposed earlier," said Sumner Starrfield, a professor at Arizona State University who was involved in the study.

That evidence, plus isotope data, led to the conclusion that the collision was between a white dwarf and brown dwarf. And the remnant star is still blowing off material.

"Collisions like this could contribute to the chemical evolution of our galaxy and universe," noted Minnesota's Gehrz. "The ejected material travels out into space, where it gets incorporated into new generations of stars."

Credit: 
University of Minnesota

More young people are choosing not to drink alcohol

Young people in England aren't just drinking less alcohol - a new study published in BMC Public Health shows that more of them are never taking up alcohol at all, and that the increase is widespread among young people.

Researchers at University College London analysed data from the annual Health Survey for England and found that the proportion of 16-24 year olds who don't drink alcohol has increased from 18% in 2005 to 29% in 2015.

The authors found this trend to be largely due to an increasing number of people who had never been drinkers, from 9% in 2005 to 17% in 2015. There were also significant decreases in the number of young people who drank above recommended limits (from 43% to 28%) or who binge drank (27% to 18%). More young people were also engaging in weekly abstinence (from 35% to 50%).

Dr Linda Ng Fat, corresponding author of the study, said: "Increases in non-drinking among young people were found across a broad range of groups, including those living in northern or southern regions of England, among the white population, those in full-time education, in employment and across all social classes and healthier groups. That the increase in non-drinking was found across many different groups suggests that non-drinking may be becoming more mainstream among young people, which could be caused by cultural factors."

Dr Ng Fat said: "These trends are to be welcomed from a public-health standpoint. Factors influencing the shift away from drinking should be capitalised on going forward to ensure that healthier drinking behaviours in young people continue to be encouraged."

Dr Linda Ng Fat added: "The increase in young people who choose not to drink alcohol suggests that this behaviour may be becoming more acceptable, whereas risky behaviours such as binge drinking may be becoming less normalised."

Increases in non-drinking, however, were not found among ethnic minorities, those with poor mental health and smokers, suggesting that the risky behaviours of smoking and drinking continue to cluster.

The researchers examined data on 9,699 people aged 16-24 years collected as part of the Health Survey for England 2005-2015, an annual, cross-sectional, nationally representative survey looking at changes in the health and lifestyles of people across England. The authors analysed the proportion of non-drinkers among social demographic and health sub-groups, along with alcohol units consumed by those that did drink and levels of binge drinking.

The authors caution that the cross-sectional, observational nature of this study does not allow for conclusions about cause and effect.

Credit: 
BMC (BioMed Central)

Earlier treatment could help reverse autistic-like behavior in tuberous sclerosis

image: Research in mice indicates that there's a sensitive period for reversing social deficits in tuberous sclerosis complex, a genetic condition that commonly includes autism spectrum disorder. In the model, the TSC1 gene was deleted only in cerebellar Purkinje cells, which have been implicated in autism.

Image: 
Peter Tsai

New research on autism has found, in a mouse model, that drug treatment at a young age can reverse social impairments. But the same intervention was not effective at an older age.

The study is the first to shed light on the crucial timing of therapy to improve social impairments in a condition associated with autism spectrum disorder. The paper, from Boston Children's Hospital, the University of Texas, Harvard Medical School and Toronto's Hospital for Sick Children, was published today in Cell Reports.

Tuberous sclerosis and autism

Many of the hundreds of genes that likely regulate complex cognitive and neuropsychiatric behaviors in people with autism still remain a mystery. However, genetic disorders such as tuberous sclerosis complex, or TSC, are providing clues. Patients often have mutations in the TSC1 or TSC2 gene, and about half develop autism spectrum disorder.

The investigators, led by Peter Tsai, MD, PhD, at UT Southwestern Medical Center, used a mouse model in which the TSC1 gene is deleted in a region of the brain called the cerebellum.

"There were several mouse models of TSC previously published, but they all had seizures and died early in life, making it difficult to study social cognition," says Mustafa Sahin, MD, PhD, who directs the Translational Neuroscience Center and the Translational Research Program at Boston Children's and was the study's senior investigator. "That is one reason why we turned to knocking out the TSC1 gene only in cerebellar Purkinje cells, which have been implicated in autism. These mice have normal lifespans and do not develop seizures."

Timing is everything

The new research fed off a previous study published in 2012. In that study, Sahin and colleagues treated the mutant mice starting in the first week of life with rapamycin, a drug approved by the FDA for brain tumors, kidney tumors and refractory epilepsy associated with TSC. They found that they could rescue both social deficits and repetitive behaviors.

But when a similar drug, everolimus, was tested in children with TSC, neurocognitive functioning and behavior didn't significantly improve. Sahin and his colleagues wondered whether there was a specific developmental period during which treatment would be effective.

The new mouse study delineates not only the timeframe for effective rapamycin treatment of certain autism-relevant behaviors, but also some of the cellular, electrophysiological and anatomic mechanisms for these sensitive periods.

"We found that treatment initiated in young adulthood, at 6 weeks, rescued social behaviors, but not repetitive behaviors or cognitive inflexibility," says Sahin.

More importantly, neither the social deficits nor the repetitive behaviors responded when the treatment was started at 10 weeks.

Using advanced imaging, the researchers went on to show that the rescue of social behaviors correlates with reversal of specific MRI-based structural changes, cellular pathology and Purkinje cell excitability. Meanwhile, motor learning rescue appeared independent of Purkinje cell survival or rescue of cellular excitability.

A new clinical trial?

Based on the mouse findings, Sahin is now seeking funds to test whether early treatment can improve a broad range of autistic-like behaviors in children with TSC. Specifically, he'll explore whether treatment as early as 12 to 24 months can help prevent both social deficits and repetitive inflexible behaviors. He hopes to see better results than in the earlier clinical trial, which involved children ages 6 to 21.

Past research indicates that different autism-related disorders may have different windows of treatment. For example, animal studies of Rett syndrome suggest that treatment can be effective relatively late in life and still improve neurological outcome.

Credit: 
Boston Children's Hospital

15 emerging technologies that could reduce global catastrophic biological risks

image: Strategic investment in 15 promising technologies could help make the world better prepared and equipped to prevent future infectious disease outbreaks from becoming catastrophic events. This subset of emerging technologies and their potential application are the focus of a new report, Technologies to Address Global Catastrophic Biological Risks, by a team of researchers at the Johns Hopkins Center for Health Security.

Image: 
Harry Campbell/Johns Hopkins Center for Health Security

Strategic investment in 15 promising technologies could help make the world better prepared and equipped to prevent future infectious disease outbreaks from becoming catastrophic events.

This subset of emerging technologies and their potential application are the focus of a new report, Technologies to Address Global Catastrophic Biological Risks, by a team of researchers at the Johns Hopkins Center for Health Security. The study is among the first to assess technologies for the purpose of reducing GCBRs--a special category of risk defined previously by the Center as threats from biological agents that could lead to sudden, extraordinary, widespread disaster beyond the collective capability of national and international organizations and the private sector to control.

"While systems to respond [to an outbreak] are in place in many areas of the world, traditional approaches can be too slow or limited in scope to prevent biological events from becoming severe, even in the best of circumstances," wrote the Center authors. "This type of response remains critically important for today's emergencies, but it can and should be augmented by novel methods and technologies to improve the speed, accuracy, scalability, and reach of the response."

Through an extensive literature review and interviews with more than 50 experts, the Center project team identified 15 example technologies and grouped them into 5 broad categories that are significantly relevant to public health preparedness and response:

Disease Detection, Surveillance, and Situational Awareness: Ubiquitous Genomic Sequencing and Sensing, Drone Networks for Environmental Detection, Remote Sensing for Agricultural Pathogens

Infectious Disease Diagnostics: Microfluidic Devices, Handheld Mass Spectrometry, Cell-Free Diagnostics

Distributed Medical Countermeasure Manufacturing: 3D Printing of Chemicals and Biologics, Synthetic Biology for Manufacturing MCMs

Medical Countermeasure Distribution, Dispensing, and Administration: Microarray Patches for Vaccine Administration, Self-Spreading Vaccines, Ingestible Bacteria for Vaccination, Self-Amplifying mRNA Vaccines, Drone Delivery to Remote Locations

Medical Care and Surge Capacity: Robotics and Telehealth, Portable Easy-to-Use Ventilator

The project team noted their list is not exhaustive or an endorsement of specific companies. The team used a modified version of DARPA's Heilmeier Catechism to standardize the process of evaluating each technology and formulating guidance for funding decisions. That process informed the team's high-level assessment of the readiness of each technology (from early development to being field-ready), the potential impact of the technology on GCBR reduction (from low to high), and the amount of financial investment that would be needed to meaningfully deploy the technology (from low to high). Details on these findings are included in the report.

Crystal Watson, DrPH, MPH, a senior scholar at the Center, Senior Analyst Matthew Watson, and Senior Scholar Tara Kirk Sell, PhD, MA, co-led the project team, which also included Caitlin Rivers, PhD, MPH; Matthew Shearer, MPH; former Analyst Christopher Hurtado, MHS; former Research Assistant Ashley Geleta, MS; and Tom Inglesby, MD, the Center's director. Their work contributes new ideas to a field in need of innovation despite important, ongoing progress in both the public and private sectors to address pandemic risk.

"The adoption and use of novel technologies for the purpose of epidemic control and public health often lag well behind the innovation curve because they do not have a lucrative market driving their development," wrote the authors. "This leaves unrealized opportunities for improved practice."

They recommend creating a consortium of technology developers, public health practitioners, and policymakers tasked with understanding pressing problems surrounding pandemics and GCBRs and jointly developing technology solutions.

Credit: 
Johns Hopkins Center for Health Security