
New hope from the 'seven year switch' in Type 1 diabetes

New research has shown that the decline in insulin production that underlies Type 1 diabetes continues for seven years after diagnosis and then stabilises.

A team at the University of Exeter Medical School found evidence that the amount of insulin produced declines by almost 50% each year for seven years. At that point, the insulin levels stabilise.
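
The arithmetic behind that trajectory is easy to sketch. Below is a minimal, illustrative model in Python assuming a constant 50 percent annual fall that stops at year seven; the starting level and exact plateau are assumptions for illustration, not values reported by the study.

# Illustrative model of the "seven year switch": insulin production
# (proxied by C-peptide) falls by about half each year for seven years,
# then stabilises. Rates and plateau are assumptions, not study data.
def relative_insulin_level(years_since_diagnosis: float) -> float:
    """Fraction of initial insulin production remaining."""
    plateau_year = 7
    annual_retention = 0.5  # "declines by almost 50% each year"
    return annual_retention ** min(years_since_diagnosis, plateau_year)

for year in range(11):
    print(f"Year {year:2d}: {relative_insulin_level(year):7.2%} of initial level")

Under these assumptions, less than 1 percent of the original insulin production remains by year seven, roughly where the measured levels flatten out.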

The finding is a major step forward in understanding Type 1 diabetes and contradicts previous beliefs that the insulin produced by people with the condition drops relentlessly with time. It offers the hope that, by understanding what changes after seven years, new strategies could be developed to preserve insulin-secreting beta-cells in patients.

The study, published in Diabetes Care, measured C-peptide, which is produced at the same time and in the same quantities as the insulin that regulates our blood sugar. By measuring C-peptide levels in blood or in urine, scientists can tell how much insulin a person is producing themselves, even if they are taking insulin injections as treatment. The team studied 1,549 people with Type 1 diabetes from Exeter, England and Tayside, Scotland in the UNITED study.

Dr Beverley Shields, at the University of Exeter Medical School, who led the research, said: "This finding is really exciting. It suggests that a person with Type 1 diabetes will keep any working beta-cells they still have seven years after diagnosis. We are not sure why this is; it may well be that there is a small group of "resilient" beta-cells resistant to immune attack and these are left after all the "susceptible" beta-cells are destroyed. Understanding what is special about these "resilient" beta-cells may open new pathways to treatment for Type 1 diabetes."

Type 1 diabetes affects around 400,000 people in the UK. The disease commonly starts in childhood but can develop at any age, and causes the body's own immune system to attack and destroy the insulin-producing cells in the pancreas, leaving the patient dependent on life-long insulin injections.

Professor Andrew Hattersley, a Consultant in Diabetes at the Royal Devon and Exeter Hospital and Research Professor at the University of Exeter Medical School, looked forward. "Now we know there is a "seven year switch", the next question is why? Has the immune attack stopped or are we left with "super beta-cells" that can resist the immune onslaught? Any insights into halting the relentless destruction of the precious insulin-producing cells are valuable. We could not have made this progress without the help of over 1,500 patients. We owe it to them to try to find answers that might help patient care quickly."

Karen Addington, UK Chief Executive of the type 1 diabetes charity JDRF, said: "These results provide further evidence that the immune system's assault on insulin-producing beta cells is not as complete as we once believed - and may change over time. This further opens the door to identifying ways to preserve insulin production in people diagnosed with or living with type 1 diabetes."

Credit: 
University of Exeter

Science of racism examined in new set of research articles

White supremacist marches and xenophobic Twitter rants have brought overt racism to the center of public attention in recent months. Even so, subtle, structural, and systemic forms of racism continue to lurk in what is becoming an increasingly racially diverse United States. In a new collection of scholarly articles, psychological scientists describe research on the enduring and often hidden presence of racism at both the interpersonal and societal levels.

The articles are published in a special issue of Current Directions in Psychological Science, a journal of the Association for Psychological Science. Yale University professor Jennifer Richeson, who has earned multiple awards and honors for her research on intergroup relations, including a MacArthur Genius Award, is editor of the special issue.

"Although a special issue is decidedly insufficient to cover all of the emerging research on the psychology of racism, the papers included here are poised to better position psychological science to inform and shape more thoughtful discourse regarding the nature of racism, how it affects individual cognition and health, and, importantly, how best to combat it," Richeson writes in her introduction to the issue.

In one article, Richeson, with lead author Maureen Craig of New York University, and Julian Rucker of Yale University, highlight research showing the perceived threat that many White people feel when they anticipate increases in the population of racial and ethnic minorities. These perceptions can generate prejudice, discrimination, and anti-immigration sentiments. Future research should examine how resistance to demographic changes can be tempered (or worsened) by the rise in intergroup interactions that will occur as neighborhoods and communities grow more diverse, the authors say.

Other articles in the issue review research examining how individuals and societies sustain racial disparities. Studies have shown that White people often cloak or deny their privileged position by, for example, underestimating their advantages in wealth and employment. Even the egalitarian "color-blind" ideology often embraced in schools and workplaces can reduce sensitivity to racism and the unique needs of minorities. Research suggests that learning about racial disparities, such as the disproportionate number of Black individuals in prison, can prompt people to associate Blackness with crime and, consequently, lead them to justify existing crime policies.

Several articles in the issue address research on ethnic/racial identity and psychological health, cultural patterns and institutional realities that support racism, and the association between discrimination and physical health among African Americans.

One theme throughout the articles is the power of positive interactions between majority and marginalized groups. The more that White individuals interact with people of other races and ethnicities, the less threatened they feel by changing demographics and the more invested they become in the well-being of those groups, research suggests. But the persistence of racial segregation in neighborhoods, schools, and the like limits the potential for such contact to take place. There is also evidence indicating that the endorsement of multiculturalism -- as opposed to color-blindness -- is an effective way to lower prejudice (although some studies show negative effects such as heightened racial stereotyping).

Some of the foremost scholars on racism and diversity contributed to the collection, including Stanford University professor Jennifer Eberhardt, whose work on racial disparities in criminal justice earned her a MacArthur Genius Award; Victoria Plaut, a University of California, Berkeley professor widely recognized for her research on multiculturalism and diversity; Fordham University professor Tiffany Yip, who studies ethnic identity and academic outcomes; and Linda Tropp, an Emory University professor who studies intergroup relations.

The special issue is available free to the public at http://journals.sagepub.com/toc/cdp/current

Credit: 
Association for Psychological Science

Ten thousand bursting genes

image: Intron seqFISH enables 3D reconstruction of nascent transcription active sites (colored spots) in an embryonic stem cell (blue), with individual chromosomes occupying distinct spatial territories (colored differently). Here, 982 transcription active sites, corresponding to individual genes, are present in this cell.

Image: 
Cai laboratory / Cell

A breakthrough new technique enables scientists to image 10,421 genes at once within individual cells.

The work was done in the laboratory of Long Cai, research professor in biology and an affiliated faculty member of the Tianqiao and Chrissy Chen Institute for Neuroscience at Caltech. A paper describing the research appears in the June 7 issue of the journal Cell.

The new technique, dubbed intron seqFISH (sequential fluorescence in situ hybridization), is a major advance in being able to identify what goes on across the genome in hundreds of different cells at once. Previously, researchers could only image four to five genes at a time in cells with microscopy. This work builds on previous advances from the Cai laboratory, including an earlier version of seqFISH from 2014 and research from 2017 that profiled over 10,000 genes under a microscope. Scaling seqFISH up to a genomic level now enables the imaging of over 10,000 genes--about half of the total number of genes in mammals--within single cells.

In order for genetic instructions to be turned into an actual functioning protein, a process called transcription must first occur. This process often occurs in pulses, or "bursts." First, a gene will be read and copied into a precursor messenger RNA, or pre-mRNA, like jotting a quick, rough draft. This molecule then matures into a messenger RNA, or mRNA, akin to editing the rough draft. During the "editing" process, certain regions called introns are cut out of the pre-mRNA.

The team chose to focus on labeling introns because they are produced so early in the transcription process, giving a picture of what a cell is doing at the precise moment of gene expression.

With the newly developed intron seqFISH technique, each intron is labeled with a unique fluorescent barcode, enabling it to be seen with a microscope. Seeing introns reveals which genes are currently turned on in individual cells, how strongly they are expressed, and where they are located. In all, 10,421 introns--and therefore 10,421 genes--can be imaged at once.
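
The scale comes from combinatorial barcoding: the same spot is imaged over multiple hybridization rounds, and the sequence of colours it shows acts as its barcode. A minimal sketch of the counting argument follows; the colour and round numbers are chosen purely for illustration and are not the probe design used in the paper.

# With C colours read over R rounds, a spot can display C**R distinct
# colour sequences, so thousands of genes fit in one imaging run.
# Colour/round counts here are illustrative assumptions only.
from itertools import product

def barcode_capacity(colours: int, rounds: int) -> int:
    """Number of distinct colour-sequence barcodes available."""
    return colours ** rounds

def assign_barcodes(genes, colours, rounds):
    """Give each gene a unique colour sequence, if capacity allows."""
    if len(genes) > barcode_capacity(colours, rounds):
        raise ValueError("not enough barcodes; add colours or rounds")
    return dict(zip(genes, product(range(colours), repeat=rounds)))

# Example: 12 colours over 4 rounds gives 20,736 barcodes -- more than
# enough to cover the 10,421 genes imaged in the study.
print(barcode_capacity(12, 4))
print(assign_barcodes(["gene_a", "gene_b", "gene_c"], colours=12, rounds=4))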

Previous work that developed the barcoding technique focused on labeling mRNA itself, providing a measurement of how gene expression changed over several hours as the mRNA developed. Looking at introns enabled the researchers to examine, for the first time, so-called nascent transcriptomes--newly synthesized gene expression. This led them to discover that the transcription of genes oscillates globally across many genes on what Cai calls a "surprisingly short" timescale--only about two hours--compared to the time it takes for a cell to divide and replicate itself, which takes from 12 to 24 hours. This means that over the course of a two-hour period, many genes within a cell will burst on and off.

There are several reasons why the oscillation phenomenon had not been observed previously. First, because these two-hour oscillations are not synchronized amongst different cells, the fluctuations are averaged out by methods that require many cells. Second, the high accuracy of the seqFISH method allows the researchers to be certain that what they observe represents real biological fluctuations, rather than technical noise. Lastly, these two-hour oscillations are obscured when mRNAs rather than introns are measured, because mRNA molecules have a longer lifetime, three to four hours, in mammalian cells.
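
The first of those points, that unsynchronized oscillations vanish in a bulk average, can be seen with a toy simulation. The sketch below assumes a sinusoidal two-hour cycle with a random phase in each cell; the waveform and amplitudes are illustrative only, not the study's measurements.

# Toy illustration: a ~2-hour oscillation with a random phase per cell
# looks flat once averaged over many cells. Values are assumptions for
# illustration, not data from the seqFISH experiments.
import math
import random

PERIOD_HOURS = 2.0

def cell_signal(t_hours: float, phase: float) -> float:
    """Nascent-transcription signal of one cell at time t."""
    return 1.0 + math.sin(2 * math.pi * t_hours / PERIOD_HOURS + phase)

random.seed(0)
phases = [random.uniform(0, 2 * math.pi) for _ in range(1000)]

for t in (0.0, 0.5, 1.0, 1.5, 2.0):
    one_cell = cell_signal(t, phases[0])
    bulk_avg = sum(cell_signal(t, p) for p in phases) / len(phases)
    print(f"t = {t:.1f} h   single cell: {one_cell:.2f}   bulk average: {bulk_avg:.2f}")

In this toy model the single cell swings between roughly 0 and 2 while the population average stays near 1, which is why single-cell imaging was needed to see the bursts.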

Additionally, because introns stay where the gene is physically located, fluorescently imaging introns allows researchers to visualize where genes are located within the chromosome, the large structure that DNA folds into within the cell's nucleus. In this work, the team was surprised to discover that most active, protein-encoding genes are located on the surface of the chromosome, not buried inside of it.

"This technique can be applied to any tissue," says Cai, who is a collaborator on the Human Cell Atlas, a project that aims to define all cell types in the human body. "Intron seqFISH can help identify cell types and also what the cells are going to do, in addition to giving us a look at the chromosome structure in the same cells."

Credit: 
California Institute of Technology

Face transplantation -- An established option to improve quality of life in patients with severe facial trauma

June 8, 2018 - Thirteen years after the first successful face transplant, US trauma surgeons should be aware of the current role of facial transplantation for patients with severe facial disfigurement - including evidence that the final appearance and functioning are superior to those provided by conventional reconstructive surgery. That's the message of a special update on 'Face Transplantation Today' in the June issue of The Journal of Craniofacial Surgery, edited by Mutaz B. Habal, MD, and published in the Lippincott portfolio by Wolters Kluwer.

Eduardo D. Rodriguez, MD, DDS, and colleagues of the Hansjörg Wyss Department of Plastic Surgery at NYU Langone Health, New York, summarize the world experience with facial transplantation to date, along with a new study showing better aesthetic outcomes with facial transplant, compared to conventional reconstruction. The researchers write, "It is therefore important for trauma surgeons who deal with these injuries regularly to be familiar with the literature on face transplantation following traumatic injuries."

Face Transplant Should Be an Option for Patients with Severe Facial Trauma

The researchers provide an update on all full or partial facial transplant procedures performed to date - emphasizing the risks and benefits, surgical indications, and aesthetic and functional outcomes. They write, "Face transplantation has evolved...into a safe and feasible reconstructive solution, with good aesthetic and functional outcomes for patients with severe facial defects that are not amenable to reconstruction through conventional and autologous [using the patient's own tissues] approaches."

Face transplantation may be considered for patients with defects involving at least 60 percent of the facial surface area, with irreparable damage or loss of the "aesthetic units" of the central face (eyelids, nose, lips). While such severe facial injuries are rare, the trauma mechanisms causing them are not. Dr. Rodriguez and colleagues note that most facial transplants performed to date have been in patients who suffered ballistic (firearms) trauma or burns.

In such severe cases, skin grafts and other conventional reconstructive techniques fall short of providing adequate aesthetic and functional outcomes. Trauma surgeons need to be aware of the potential benefits and limitations of facial transplantation. "This can potentially expedite the reconstructive process for patients who may benefit from face transplant," the researchers write.

Yet there are still important gaps in research on the full benefits of facial transplantation. In a new survey study, Dr. Rodriguez's group asked members of the general public to rate before-and-after pictures of patients with severe facial deformities, treated by either conventional reconstruction or facial transplantation.

Ratings were performed using a validated nine-point scale, from minimal (1 point) to severe (9 points) disfigurement. The average perceived disfigurement scores were 4.9 points for the facial transplant recipients versus 8.5 points for those who underwent conventional reconstruction (compared to 1.2 points for a group of individuals with no apparent facial disfigurement).

That supports the impression, communicated to patients considering facial transplantation, that while they may not appear completely normal after the procedure, their appearance "will likely improve dramatically" compared to conventional reconstructive surgery. Recipients have also reported becoming more active in their communities after facial transplantation, due to feeling less conspicuous when out in public. Further research is needed, including assessment of the impact on quality of life and other patient-reported outcomes.

Dr. Rodriguez and coauthors hope their studies will help to make the trauma community more aware of the option of facial transplantation in appropriate cases, and provide a step toward comparing its outcomes to those of conventional reconstruction. With ongoing advances - including the development of less toxic, more effective immunosuppressive therapies to prevent rejection - facial transplantation may become a more widely available alternative for patients with severe disfiguring facial trauma.

Credit: 
Wolters Kluwer Health

Holes in the head

image: More ancient skulls bearing evidence of trepanation -- a telltale hole surgically cut into the cranium -- have been found in Peru than the combined number found in the rest of the world.

Image: 
University of Miami

Even with a highly skilled neurosurgeon, the most effective anesthesia, and all the other advances of modern medicine, most of us would cringe at the thought of undergoing cranial surgery today.

After all, who needs a hole in the head? Yet for thousands of years, trepanation--the act of scraping, cutting, or drilling an opening into the cranium--was practiced around the world, primarily to treat head trauma, but possibly to quell headaches, seizures and mental illnesses, or even to expel perceived demons.

But, according to a new study led by the University of Miami Miller School of Medicine's David S. Kushner, M.D., clinical professor of physical medicine and rehabilitation, trepanation was so expertly practiced in ancient Peru that the survival rate for the procedure during the Incan Empire was about twice that of the American Civil War--when, more than three centuries later, soldiers were trepanned presumably by better trained, educated and equipped surgeons.

"There are still many unknowns about the procedure and the individuals on whom trepanation was performed, but the outcomes during the Civil War were dismal compared to Incan times," said Kushner, a neurologist who has helped scores of patients recover from modern-day traumatic brain injuries and cranial surgeries. "In Incan times, the mortality rate was between 17 and 25 percent, and during the Civil War, it was between 46 and 56 percent. That's a big difference. The question is how did the ancient Peruvian surgeons have outcomes that far surpassed those of surgeons during the American Civil War?"

In their study published in the June issue of World Neurosurgery, "Trepanation Procedures/Outcomes: Comparison of Prehistoric Peru with Other Ancient, Medieval, and American Civil War Cranial Surgery," Kushner and his co-authors--biological anthropologists John W. Verano, a world authority on Peruvian trepanation at Tulane University, and his former graduate student, Anne R. Titelbaum, now of the University of Arizona College of Medicine--can only speculate on the answer.

But hygiene, or more accurately the lack of it during the Civil War, may have contributed to the higher mortality rates in the later time period. According to the study, which relied on Verano's extensive field research on trepanation over a nearly 2,000-year period in Peru and a review of the scientific literature about trepanation around the world, Civil War surgeons often used unsterilized medical tools and their bare fingers to probe open cranial wounds or break up blood clots.

"If there was an opening in the skull they would poke a finger into the wound and feel around, exploring for clots and bone fragments," Kushner said, adding that nearly every Civil War soldier with a gunshot wound subsequently suffered from infection. "We do not know how the ancient Peruvians prevented infection, but it seems that they did a good job of it. Neither do we know what they used as anesthesia, but since there were so many (cranial surgeries) they must have used something--possibly coca leaves. Maybe there was something else, maybe a fermented beverage. There are no written records, so we just don't know."

Whatever their methods, ancient Peruvians had plenty of practice. More than 800 prehistoric skulls with evidence of trepanation--at least one but as many as seven telltale holes--have been found in the coastal regions and the Andean highlands of Peru, the earliest dating back to about 400 B.C. That's more than the combined total number of prehistoric trepanned skulls found in the rest of the world. Which is why Verano devoted an entire book, Holes in the Head--The Art and Archeology of Trepanation in Ancient Peru, to the 800-plus skulls, most of which were collected from burial caves and archaeological digs in the late 1800s and early 1900s and reside in museums and private collections today.

It's also why Kushner, a medical history buff and Tulane alumnus, jumped at the chance to join Titelbaum in co-authoring one of the book's chapters, "Trepanation from the Perspective of Modern Neurosurgery," and continues to research the subject.

Published in 2016, the book analyzes the techniques and survival rates of trepanation in Peru through the demise of the Incan Empire in the early 1500s. The researchers gauged survival by classifying the extent of bone remodeling around the trepanned holes, which indicates healing. If there was no evidence of healing the researchers assumed the patient died during or within days of the surgery. If the margins of the trepanation openings showed extensive remodeling, they considered the operation successful and the patient long-lived.

Those classifications, Kushner, Verano and Titelbaum reported in the World Neurosurgery paper, show how ancient Peruvians significantly refined their trepanation techniques over the centuries. They learned, for example, not to perforate the protective membrane surrounding the brain--a guideline Hippocrates codified in ancient Greece at about the same time, the 5th century B.C., that trepanning is thought to have begun in ancient Peru.

The long-term survival rates from such "shallow surgeries" in Peru during those early years, from about 400 to 200 B.C., proved to be worse than those in the Civil War, when about half the patients died. But, from 1000 to 1400 A.D., survival rates improved dramatically, to as high as 91 percent in some samples, to an average of 75 to 83 percent during the Incan period, the study showed.

"Over time, from the earliest to the latest, they learned which techniques were better, and less likely to perforate the dura," said Kushner, who has written extensively about modern-day neurosurgical outcomes. "They seemed to understand head anatomy and purposefully avoided the areas where there would be more bleeding. They also realized that larger-sized trepanations were less likely to be as successful as smaller ones. Physical evidence definitely shows that these ancient surgeons refined the procedure over time. Their success is truly remarkable."

Almost as remarkable is how, by the end of World War I, cranial surgery evolved into the distinct profession of neurosurgery, which continues to improve our understanding of brain anatomy, physiology and pathology. As Kushner notes, today's neurosurgeons regularly cut into the brain to remove tumors and blood clots, reduce intracranial pressure from massive strokes and trauma, repair vascular and structural anomalies and treat a myriad of other complex problems--with great success.

"Today, neurosurgical mortality rates are very, very low; there is always a risk but the likelihood of a good outcome is very high," he said. "And just like in ancient Peru, we continue to advance our neurosurgical techniques, our skills, our tools, and our knowledge."

Credit: 
University of Miami

The cartography of the nucleus

image: A 3D model of the nucleus made with SPRITE: DNA regions in the "inactive hub" on chromosome 15 (orange) and chromosome 18 (green) coming together around a large nuclear body in the nucleus (blue) called the nucleolus (red).

Image: 
Guttman laboratory / Cell

Nestled deep in each of your cells is what seems like a magic trick: Six feet of DNA is packaged into a tiny space 50 times smaller than the width of a human hair. Like a long, thin string of genetic spaghetti, this DNA blueprint for your whole body is folded, twisted, and compacted to fit into the nucleus of each cell.

Now, Caltech researchers have shown how cells organize the seemingly immense genome in a clever manner so that they can conveniently find and access important genes. Understanding the delicate three-dimensional organization of the genome is crucial, particularly because alterations in DNA structure have been linked to certain diseases such as cancer and early aging. Mapping and pinpointing alterations in nuclear structure may help in finding solutions to these diseases.

The work was done in the laboratory of Mitchell Guttman, assistant professor of biology and Heritage Medical Research Institute investigator. A paper describing the research appears in the June 7 online issue of the journal Cell.

Though the vast majority of cells in every human body contain identical genomes, different types of cells are able to have diverse functions because genes can be expressed at varying levels--in other words, they can be turned on or off. For example, when a stem cell is developing into a neuron, a flurry of activity happens in the nucleus to dial up and down levels of gene expression. These levels would be different, for example, if the stem cell was turning into a muscle cell or if the cell were making the decision to self-destruct.

In addition to the genome, the nucleus also contains structures called nuclear bodies, which are like miniature factories in the nucleus that contain a high concentration of cellular machinery all working to accomplish similar tasks, such as turning on specific sets of genes or modifying RNA molecules to produce proteins in the cell. This cellular machinery needs to be able to efficiently search through six feet of DNA--approximately 20,000 total genes, in mammals--in order to precisely find and control its targets. This is made possible because DNA is organized into three-dimensional structures that make certain genes more or less accessible.

In the new research, Guttman and his team describe a method to three-dimensionally map out how DNA is organized within the space of the nucleus and how regions of chromosomes interact with each other and with nuclear bodies. The technique, dubbed SPRITE (Split-Pool Recognition of Interactions by Tag Extension), allows researchers to examine clusters (or "complexes") of molecules within the nucleus to see which molecules are interacting with each other and where they are located.

In the technique, each complex in the nucleus is given a different molecular barcode, with all of the molecules within a single complex receiving the same barcode. Then, the complexes can be broken open and the molecules analyzed. This way, scientists can determine if two or more molecules were interacting, depending on whether they had the same barcode.
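
The read-out step is conceptually simple once every molecule carries its complex's barcode: group molecules by barcode and treat co-barcoded molecules as interacting. The sketch below illustrates only that grouping logic with hypothetical reads; the actual SPRITE split-pool tagging and sequencing pipeline is far more involved.

# Group sequenced molecules by shared barcode: molecules carrying the
# same barcode are inferred to have come from the same complex, i.e.
# to have been near each other in the nucleus. Reads are hypothetical.
from collections import defaultdict

reads = [
    ("barcode_A", "chr15:region_1"),
    ("barcode_A", "chr18:region_7"),
    ("barcode_A", "nucleolar_RNA"),
    ("barcode_B", "chr2:region_3"),
    ("barcode_B", "speckle_RNA"),
    ("barcode_C", "chr4:region_9"),  # barcode seen only once: no interaction called
]

complexes = defaultdict(set)
for barcode, molecule in reads:
    complexes[barcode].add(molecule)

for barcode, members in sorted(complexes.items()):
    if len(members) > 1:
        print(f"{barcode}: {sorted(members)} shared a complex")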

Led by graduate student Sofia Quinodoz, the team used SPRITE to discover that genes across different chromosomes (large folded structures of DNA) cluster together around specific nuclear bodies. Specifically, inactive genes--those that are turned off--across different chromosomes cluster together around a particular nuclear body called the nucleolus, which contains repressive proteins on DNA that keep genes turned off. Conversely, active genes grouped around another kind of nuclear body called the nuclear speckle, which contains molecules that help turn genes on and make them into proteins.

"With SPRITE, we were able to see thousands of molecules--DNAs and RNAs--coming together at various 'hubs' around the nucleus in single cells," says Quinodoz, the study's first author. "Previously, researchers theorized that each chromosome is kind of on its own, occupying its own 'territory' in the nucleus. But now we see that multiple genes on different chromosomes are clustering together around these bodies of cellular machinery. We think these 'hubs' may help the cell keep DNA that are all turned on or turned off neatly organized in different parts of the nucleus to allow cellular machinery to easily access specific genes within the nucleus."

Credit: 
California Institute of Technology

New insight into why Pierce's disease is so deadly to grapevines

image: Symptoms of Pierce's disease on a grapevine leaf.

Image: 
University of California

Scientists are gaining a better understanding of Pierce's disease and how it affects grapevines. The disease, which annually costs California more than $100 million, comes from a bacterium called Xylella fastidiosa. While the bacterium has been present in the state for more than 100 years, Pierce's disease became a more serious threat to agriculture with the arrival of the glassy-winged sharpshooter insect, which can carry the bacterium from plant to plant.

In a new study, published in Frontiers in Plant Science, researchers at the University of California, Davis, have identified a set of molecular markers that influence the onset of Pierce's disease in grapevines.

"We now have a very good idea of the plant responses to the disease," said lead author Paulo Zaini, a postdoctoral researcher in the Department of Plant Sciences at UC Davis. "This will help us in early diagnosis and help us design strategies to protect the plant from damaging itself."

HOW INFECTION DEVELOPS

The glassy-winged sharpshooter injects the Xylella fastidiosa bacterium into the plant's xylem, which is the part of the plant that carries water. The disease causes leaves to yellow or "scorch," eventually drying up and dropping from the vine. It can kill a plant in three to five years. Few diseases can kill grapevines so quickly.

The glassy-winged sharpshooter was first reported in California in 1994 and can travel greater distances than native sharpshooters. By 2002, the glassy-winged sharpshooter had infested more than 1,100 acres of grapevines statewide.

"What growers do to stop the bug is just apply insecticides at an increasingly growing rate," said Zaini. "It's not a sustainable strategy."

In this study, the authors compared the responses of diseased plants with those of healthy plants. A better understanding of the biochemical changes that accompany the onset of disease could foster new strategies to improve plant health, rather than relying on insecticides to fight the disease.

Scientists have long thought the bacteria growing in the xylem blocked the flow of water to the leaves.

"We thought that the blockage causes a drought stress, but there's much more to it than that." said Abhaya Dandekar, professor of plant sciences and the study's principal investigator. "Not all the vessels are blocked."

The blockage might be part of the problem, but it doesn't answer all the questions. More than 200 plant species harbor the bacterium but are asymptomatic.

Having identified molecular markers important for Pierce's disease in grapevines, researchers can use them to study grapevine varieties or other plants that do not develop disease.

Credit: 
University of California - Davis

UMSOM researchers find that silent carriers of malaria are unlikely to develop the disease

In regions where malaria illness is widespread, it is common to find many individuals who are infected with malaria parasites (Plasmodium falciparum), but without symptoms. New research conducted by the University of Maryland School of Medicine (UMSOM) shows that treating these silent malaria cases could help stop the spread of malaria to others.

UMSOM researchers conducted a study of 114 participants in Malawi ranging from children to adults to better understand the role asymptomatic malaria infections have in the spread and occurrence of malaria illness. It is the first study to use prospective, longitudinal detection of asymptomatic malaria infection to examine subsequent risk of malaria illness among all ages.

These asymptomatic infections may never develop into illness, but they are an important contributor to the spread of malaria and pose a public health challenge.

"We know that in Malawi, like many parts of Africa, most of the malaria parasites are being carried by people who are not sick. They don't get treatment for their infections, because their infections a silent, but when they get bitten by mosquitoes, they can transmit malaria" said Miriam Laufer, MD, MPH, Associate Professor of Pediatrics and Associate Director for Malaria Research in UMSOM's Center for Vaccine Development and Global Health (CVD).

Researchers examined the association between asymptomatic malaria infections and subsequent risk of malaria illness and demonstrated that carrying P. falciparum infection without symptoms was associated with a 50% decrease in the risk of malaria illness.

Using a genotyping method to determine the molecular fingerprint of each parasite, they discovered that when people who carry asymptomatic malaria infection get sick from malaria, it is because they acquired a new infection (from the bite of a mosquito), not because the asymptomatic infection developed into clinical disease. With new infections, adults and children with and without asymptomatic infection were equally likely to get sick. The researchers concluded that asymptomatic infection did not protect against new infections that made people sick.

"We have always worried that if we give medicine to treat malaria to people with asymptomatic infection, they might get sicker the next time they get malaria. This has been a challenge to introducing new policies like mass drug administration or screening and treating campaigns to interrupt malaria transmission. Our results suggest that treating asymptomatic infection will not lead to increased risk of disease in the short term. Now we need to evaluate these new interventions to determine the long term impact both on the individual's health and also on malaria transmission" said Dr. Laufer.

Researchers enrolled participants seeking treatment for uncomplicated malaria at the Mfera Health Centre in Chikhwawa district in Malawi between June 2014 and March 2015. Subjects were eligible if they had symptomatic P. falciparum infection, detected by malaria rapid diagnostic test (RDT) and confirmed by microscopy, and were HIV-negative at time of screening. They were treated for their initial illness and then followed every month and evaluated every time they were ill.

Credit: 
University of Maryland School of Medicine

Mars rover finds ancient organic compounds that match meteoritic samples

image: A composite self-portrait by NASA's Mars Curiosity Rover taken at the Windjana site in Gale Crater.

Image: 
NASA/JPL-Caltech/MSSS.

Washington, DC-- NASA's Curiosity rover has discovered new "tough" organic molecules in three-billion-year-old sedimentary rocks on Mars, increasing the chances that the record of habitability and potential life could have been preserved on the Red Planet, despite extremely harsh conditions on the surface that can easily break down organic molecules.

"The Martian surface is exposed to radiation from space and harsh chemicals that break down organic matter, so finding ancient organic molecules in the top five centimeters, from a time when Mars may have been habitable, bodes well for us to learn the story of organic molecules on Mars with future missions that will drill deeper," said lead author Jen Eigenbrode of NASA's Goddard Space Flight Center. (She also happens to be a former Carnegie postdoc at our Geophysical Laboratory.)

Organic molecules contain carbon and hydrogen, and can include oxygen, nitrogen, and other elements. Organic compounds are commonly associated with life, although they can be created by non-biological processes as well, processes referred to as abiotic organic chemistry. There is no way for Curiosity to determine if the materials it found came from ancient Martian life or not, according to Eigenbrode.

"Whether it holds a record of ancient life, is the food for extant life, or has existed in the absence of life, organic matter in Martian materials holds chemical clues to planetary conditions and processes," Eigenbrode said.

Carnegie's Andrew Steele was a key member of the research team, whose work on this project built on his discovery six years ago of indigenous organic carbon in 10 Martian meteorites. The organic molecules he found in 2012 are comparable to those found by Curiosity.

Like the meteoritic samples, the rocks sampled by Curiosity must be heated by the rover's instruments to very high temperatures, ranging between 500 and 800 degrees Celsius (932 and 1,472 degrees Fahrenheit), to have their organics released as gas. Because the hydrocarbons were released at such high temperatures, they may be coming from bigger, tough organic molecules within the rock.

Sedimentary rocks (mudstones) were drilled from four areas at the base of Mount Sharp, the central mound in Gale crater. Although the surface of Mars is inhospitable today, there is evidence that in the distant past, the Martian climate allowed the presence of liquid water--an essential ingredient for life--at the surface.

Analysis by Curiosity indicates that billions of years ago, a lake inside Gale crater held all the ingredients necessary for life, including chemical building blocks, energy sources, and liquid water. The mudstone gradually formed from silt that settled out of the water and accumulated at the bottom of the lake. Scientists estimated the age of the rocks by the crater count method. Since meteorite impact craters accumulate over time, the more craters a region has, the older it is. Although there was no way to directly date the organic material found within the rocks, it has to be at least as old as the rocks themselves.

The results indicate organic carbon concentrations on the order of 10 parts per million or more. This is close to the amount observed in Martian meteorites and about 100 times greater than prior in-situ detections of organic carbon. Some of the molecules identified include thiophenes, benzene, toluene, and small carbon chains, such as propane or butene.

Organic molecules containing chlorine were detected on Mars before.

Finding ancient carbon preserved right on the Martian surface gives scientists confidence that NASA's Mars 2020 rover and the European Space Agency's ExoMars rover will find even more organics, both on the surface and in the shallow subsurface.

"Are there signs of life on Mars?" asks Michael Meyer, NASA Program Scientist for the Mars Science Laboratory mission. "We don't know but these results tell us we are on the right track."

Steele says that the next steps must be looking for organic compounds that are released from the rock samples at lower temperatures.

"The next target is material that comes out when heated to less than 600 degrees Celsius, which is where the molecules are that will provide evidence of biological activity or the kinds of abiotic chemistry that could give rise to life," he said.

Adapted from an article provided courtesy of NASA.

Credit: 
Carnegie Institution for Science

Systemic racism needs more examination related to health, says UofL researcher

image: Billie Castle, Ph.D.

Image: 
UofL

Although the discipline of public health has recently recognized racism as a social determinant of health, little research examines the issue related to systems and structures.

University of Louisville School of Public Health and Information Sciences researcher Billie Castle, Ph.D., a post-doctoral associate in the Department of Health Promotion and Behavioral Sciences, conducted a literature review on the terms racism and systemic racism and found 85 published articles on the topic.

In a paper published in the Journal of Racial and Ethnic Health Disparities, Castle analyzes themes from the 85 articles and provides discussion on what is needed to move toward equitable solutions.

The themes include: approaches to address systemic racism; the impact of residential and racial segregation on health outcomes; policy implications for reducing health inequities; and systemic racism's impact on health outcomes.

In the discussion section, Castle points out the absence of research connecting systemic racism to the full set of social determinants of health. Although the literature examined many determinants, such as education, neighborhoods, environment and health care, Castle said there was no examination of systemic racism across the connections among all of these social determinants.

"Public health researchers and practitioners need to look beyond only changing behaviors to include changing the systems and structures that influence the environments in which certain behaviors are necessary to survive," Castle said.

As an example, she said community-based programming is often seen as a hopeful means to prevent youth violence. The problem, Castle said, is that perpetual violent behavior is often in reaction to environmental factors created through historic systemic racist policies and practices.

"It is challenging to change your behavior, but still have to survive in an environment that does not provide the support to sustain that changed behavior," she said. "Changes to inequitable systemic policy and practice that intentionally create healthy economic and socially thriving communities are needed to reduce youth violence and change behaviors."

In the article, Castle also underscores the role of public health practitioners to "actively call out racist practices and move toward utilizing practices that are more racially and socially equitable."

Including more minorities in public health decision-making also is key, Castle said.

"We need to make sure we are equitable in the decisions of who we include in our work. We should immediately think about how our research and practice impacts multiple social identities including race, gender, sexuality, class, religion, etc. -- and how to improve health outcomes for the most marginalized social identities," she said.

Castle's next publication will expand on this topic by examining the historic practice of redlining and its impact on youth participating in violent behaviors.

Credit: 
University of Louisville

USC scientists discover schizophrenia gene roles in brain development

A USC research team identified 150 proteins affecting cell activity and brain development that contribute to mental disorders, including schizophrenia, bipolar disorder and depression.

It's the first time these molecules, which are associated with the disrupted-in-schizophrenia 1 (DISC1) protein linked to mental disorders, have been identified. The scientists developed new tools involving stem cells to determine chemical reactions the proteins use to influence cell functions and nerve growth in people.

"This moves science closer to opportunities for treatment for serious mental illness," said Marcelo P. Coba, the study author and professor of psychiatry at the Zilkha Neurogenetic Institute at the Keck School of Medicine of USC.

The findings appear in Biological Psychiatry.

Schizophrenia affects less than 1 percent of the U.S. population, but has an outsized impact on disability, suicide and premature deaths.

The DISC1 gene was linked to schizophrenia nearly 20 years ago. It controls how nerve cells called neurons develop, as well as how the brain matures. DISC1 also directs a network of signals across cells that can contribute to the disease. Scientists say errors in these chemical reactions contribute to schizophrenia.

But the identity of proteins that DISC1 can regulate is poorly understood, prompting the USC researchers and colleagues from the State University of New York Downstate Medical Center to undertake the research. The challenge was to simulate conditions inside the human brain, Coba explained.

Using stem cells, they conducted assays in conditions resembling the habitat where DISC1 does its work. Then, they used gene editing to insert a molecular tag on DISC1, allowing them to extract it from brain cells and identify the proteins with which it associates.

Identifying the proteins that interact with DISC1 in brain cells could lead to understanding how the risk factors for psychiatric diseases are connected to specific molecular functions, Coba explained. The discovery enables researchers to determine specific processes that differ in patients suffering from specific mental illnesses.

"This gives researchers specific trails to follow within cells from both healthy patients and those diagnosed with disorders," Coba said.

Schizophrenia is one of the top 15 leading causes of disability worldwide. People with schizophrenia live an average of nearly 29 years less than those without the disorder, according to the National Institutes of Mental Health (NIMH).

The illness is often accompanied by conditions such as heart disease and diabetes, which contribute to the high premature mortality rate among people with schizophrenia. About 5 percent of people with schizophrenia die by suicide, a rate far greater than the general population, with the highest risk in the early stages of illness, according to the NIMH.

Credit: 
University of Southern California

Why seashells are tougher than chalk (video)

image: Seashells are made mostly of calcium carbonate, also known as chalk, a mineral soft and crumbly enough to use for sidewalk doodles. Yet seashells are tough and resilient. In this video, Reactions explains why seashells are so different, and why you can't use them to draw on your driveway: https://youtu.be/iUeMxjkSPyM.

Image: 
The American Chemical Society

WASHINGTON, June 7, 2018 -- Seashells are made mostly of calcium carbonate, also known as chalk, a mineral soft and crumbly enough to use for sidewalk doodles. Yet seashells are tough and resilient. In this video, Reactions explains why seashells are so different, and why you can't use them to draw on your driveway: https://youtu.be/iUeMxjkSPyM.

Credit: 
American Chemical Society

International agreement that human-relevant research is needed to enhance drug discovery

image: Dr. Kate Willett, senior author of the report, describes the need for a paradigm shift towards human-relevant research methods.

Image: 
Troy Seidle HSI

Washington DC (June 7, 2018) - The average pre-approval cost of research and development for a successful drug is estimated to be US$2.6 billion and the number of new drugs approved per billion US dollars spent has halved roughly every 9 years since 1950. More than 90% of drug candidates entering clinical trials fail to gain regulatory approval, mainly as a result of insufficient efficacy and/or unacceptable toxicity, because of the limited predictive value of preclinical, animal-based studies. Without significant intervention, the pipeline responsible for new drug production is predicted to dry up completely within 50 years.
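
The quoted halving rate compounds sharply over decades. Here is a minimal sketch of that arithmetic, assuming a steady halving every nine years since 1950; the numbers illustrate the trend described above rather than reproduce the report's figures.

# Drugs approved per billion US dollars of R&D, relative to 1950,
# assuming productivity halves every 9 years (illustrative only).
def relative_productivity(year: int, start_year: int = 1950,
                          halving_period_years: float = 9.0) -> float:
    return 0.5 ** ((year - start_year) / halving_period_years)

for year in (1950, 1977, 2004, 2018):
    print(f"{year}: {relative_productivity(year):.3f} of the 1950 level")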

However, great advances have been made in life science technologies and computer science, increasing our ability to generate and analyze data, and there is a growing recognition that, to improve the success rate, a stronger focus on human-relevant data is needed. The proceedings of a multistakeholder workshop co-organized by The Humane Society of the United States, Humane Society International and the National Institutes of Health under the auspices of the global BioMed21 Collaboration (biomed21.org) have been published in Drug Discovery Today, presenting a comprehensive overview of existing efforts to prioritize human-based biology for health research and proposing key recommendations required to revitalize the drug discovery process.

Report co-author Dr Kate Willett, Senior Director for Science and Regulatory Affairs for HSUS and HSI, said: "Through the BioMed21 collaboration, we are stimulating strategic scientific dialogue on regional and global levels, bringing key stakeholders together to explore and develop consensus recommendations around barriers, opportunities, and priorities for future research funding. Improvements to the drug development process are possible, but stakeholders need to work together to shift toward improved understanding of disease pathways and networks in humans, together with continued development and exploitation of human-relevant enabling technologies such as microphysiological systems and computational systems biology."

In 2007, the National Academy of Sciences first articulated how a transition to an approach based on explicit delineation of biological pathways could improve chemical safety assessment. Since then, the Adverse Outcome Pathway (AOP) framework has developed into a central tool for realizing this vision, and beyond. Such a framework could provide a more predictive and effective rubric for understanding disease pathophysiology across levels of biological organization, and for targeting and evaluating new interventions using the growing toolbox of modern, human-specific tools for biomedical research.

The publication makes key recommendations for enhancing drug discovery and development, including the need for interdisciplinary and international collaboration and cooperation. Workshop participants - who included experts from 6 NIH institutes, 5 FDA centers, and other key stakeholders - agreed that, in order to incentivize global data sharing, there is a need for standardized data and consistent ontologies, and that global funding calls should prioritize human-based methods such as induced pluripotent stem cells, organoids and organs-on-chips.

Credit: 
Humane Society International

Sustained use of opioids before spine surgery increases risk of continued use after surgery

June 7, 2018 - Patients who take prescription opioids for a longer period before spinal surgery are more likely to continue using opioids several months after surgery, reports a study in the June 6, 2018, issue of The Journal of Bone & Joint Surgery. The journal is published in the Lippincott portfolio in partnership with Wolters Kluwer.

According to the new research, led by Andrew J. Schoenfeld, MD, MSc, of Brigham and Women's Hospital, Harvard Medical School, nearly nine percent of patients were still taking opioids six months after spinal surgery, and duration of opioid use before surgery was the main risk factor for continued use.

Sustained Preoperative Opioid Use Predicts Continued Use After Spine Surgery

Using insurance claims data, the researchers identified more than 27,000 patients who underwent various types of lower (lumbar) spine surgery between 2006 and 2014. Most of the patients underwent removal of a spinal disc (discectomy) or spinal fusion (arthrodesis). Although the data came from the US Department of Defense's Tricare insurance program, most of the patients in the study were civilians (such as retired military personnel or dependents of active-duty or retired personnel).

Nearly all patients had at least some opioid exposure before spinal surgery. They were classified into four groups (see the sketch after this list):

Exposed: 60 percent had used opioids in the past, but were not actively using them at the time of surgery.

Acute exposure: 34 percent had their first opioid prescription within one month before surgery.

Intermediate sustained use: two percent had uninterrupted opioid use for less than six months before surgery.

Chronic sustained use: three percent had uninterrupted opioid use for six months or longer before surgery.
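
Those four definitions amount to a simple decision rule, sketched below. The field names and the helper function are hypothetical; the study assigned groups from insurance claims data, not from code like this.

# Hypothetical sketch of the four preoperative exposure groups defined
# above; thresholds follow the definitions given in the text.
def classify_preop_opioid_exposure(ever_used: bool,
                                   active_at_surgery: bool,
                                   first_rx_within_month_of_surgery: bool,
                                   months_of_uninterrupted_use: float) -> str:
    if not ever_used:
        return "no exposure"
    if not active_at_surgery:
        return "exposed"                      # past use, not active at surgery
    if first_rx_within_month_of_surgery:
        return "acute exposure"               # first prescription < 1 month pre-op
    if months_of_uninterrupted_use < 6:
        return "intermediate sustained use"   # uninterrupted use < 6 months
    return "chronic sustained use"            # uninterrupted use >= 6 months

print(classify_preop_opioid_exposure(True, True, False, 8.0))  # chronic sustained use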

After surgery, 67 percent of the patients stopped taking opioids within 30 days, and 86 percent discontinued opioids by 90 days. Six months after surgery, 8.8 percent of patients were still taking prescription opioids.

Longer duration of opioid use before spinal surgery was an independent risk factor for continued use after surgery. After adjustment for other patient characteristics, the authors found that the likelihood of discontinuing opioid use within six months was 65 percent lower for patients in the "intermediate sustained" and 74 percent lower in the "chronic sustained" groups, compared to the "acute exposure" group. Somewhat surprisingly, even the patients who were "exposed" but not actively using opioids before surgery were 29 percent less likely than those in the "acute exposure" group to discontinue opioids after surgery.

Several other factors were associated with long-term opioid use after surgery: spinal fusion surgery, preoperative depression or anxiety, preoperative spinal fracture, a longer hospital stay, and junior enlisted rank (suggesting lower socioeconomic status).

The ongoing opioid crisis in the United States has prompted increased attention to the use of pain medications prescribed before and after surgery. Previous opioid use has been linked to an increased risk of complications and adverse outcomes after spinal surgery. This new study focuses on how preoperative opioid use affects continued opioid use after lumbar spine surgery, and finds evidence of a "dose-response" effect: patients taking opioids for a longer period before surgery are less likely to discontinue opioid use after surgery.

"Our results indicate that the majority of patients who are using prescription opioids prior to spine surgery discontinue these medications following surgical intervention," Dr. Schoenfeld and coauthors write. However, because close to 1 out of 10 patients are still taking opioids at six months after spinal surgery, the researchers highlight the need for surgeons to recognize the "biopsychosocial" factors contributing to chronic opioid use.

Since nearly all patients receive opioids before spinal surgery, Dr. Schoenfeld believes it's "reasonable" for surgeons to discuss risk factors for sustained opioid use with patients at the time of surgery. He adds, "Expectation management - defining shared goals of post-surgical pain control and a suspense date when the surgeon and patient agree opioids should likely no longer be necessary - could go a long way toward smoothing the opioid cessation process following surgery."

Credit: 
Wolters Kluwer Health

When did animals leave the first footprint on Earth?

image: Trackways and burrows excavated in situ from the Ediacaran Dengying Formation.

Image: 
NIGP

On July 20, 1969, Neil Armstrong put the first footprint on the moon. But when did animals leave the first footprint on Earth?

Recently, an international research team reported discovering fossil footprints made by animal appendages in rocks of the Ediacaran Period (about 635-541 million years ago) in China. This is considered the earliest known fossil record of animal footprints. The research was published in Science Advances on June 6, 2018.

Bilaterian animals such as arthropods and annelids have paired appendages and are among the most diverse animals today and in the geological past. They are often assumed to have appeared and radiated suddenly during the "Cambrian Explosion" about 541-510 million years ago, although it has long been suspected that their evolutionary ancestry was rooted in the Ediacaran Period. Until the current discovery, however, no fossil record of animal appendages had been found in the Ediacaran Period.

Researchers from the Nanjing Institute of Geology and Palaeontology of the Chinese Academy of Sciences and Virginia Tech in the United States studied trackways and burrows discovered in the Ediacaran Shibantan Member of the Dengying Formation (551-541 million years ago) in the Yangtze Gorges area of South China. The trackways are somewhat irregular, consisting of two rows of imprints that are arranged in series or repeated groups.

The characteristics of the trackways indicate that they were produced by bilaterian animals with paired appendages that raised the animal body above the water-sediment interface. The trackways appear to be connected to burrows, suggesting that the animals may have periodically dug into sediments and microbial mats, perhaps to mine oxygen and food.

These trace fossils represent some of the earliest known evidence for animal appendages and extend the earliest trace fossil record of animals with appendages from the early Cambrian to the late Ediacaran Period. The body fossils of the animals that made these traces, however, have not yet been found. Maybe they were never preserved.

Credit: 
Chinese Academy of Sciences Headquarters