Face transplantation -- An established option to improve quality of life in patients with severe facial trauma

June 8, 2018 - Thirteen years after the first successful face transplant, US trauma surgeons should be aware of the current role of facial transplantation for patients with severe facial disfigurement - including evidence that the final appearance and functioning are superior to that provided by conventional reconstructive surgery. That's the message of a special update on 'Face Transplantation Today' in the June issue of The Journal of Craniofacial Surgery, edited by Mutaz B. Habal, MD, and published in the Lippincott portfolio by Wolters Kluwer.

Eduardo D. Rodriguez, MD, DDS, and colleagues of the Hansjörg Wyss Department of Plastic Surgery at NYU Langone Health, New York, summarize the world experience with facial transplantation to date, along with a new study showing better aesthetic outcomes with facial transplant, compared to conventional reconstruction. The researchers write, "It is therefore important for trauma surgeons who deal with these injuries regularly to be familiar with the literature on face transplantation following traumatic injuries."

Face Transplant Should Be an Option for Patients with Severe Facial Trauma

The researchers provide an update on all full or partial facial transplant procedures performed to date - emphasizing the risks and benefits, surgical indications, and aesthetic and functional outcomes. They write, "Face transplantation has evolved...into a safe and feasible reconstructive solution, with good aesthetic and functional outcomes for patients with severe facial defects [that] are not amenable to reconstruction through conventional and autologous [using the patient's own tissues] approaches."

Face transplantation may be considered for patients with defects involving at least 60 percent of the facial surface area, with irreparable damage or loss of the "aesthetic units" of the central face (eyelids, nose, lips). While such severe facial injuries are rare, the trauma mechanisms causing them are not. Dr. Rodriguez and colleagues note that most facial transplants performed to date have been in patients who suffered ballistic (firearms) trauma or burns.

In such severe cases, skin grafts and other conventional reconstructive techniques fall short of providing adequate aesthetic and functional outcomes. Trauma surgeons need to be aware of the potential benefits and limitations of facial transplantation. "This can potentially expedite the reconstructive process for patients who may benefit from face transplant," the researchers write.

Yet there are still important gaps in research on the full benefits of facial transplantation. In a new survey study, Dr. Rodriguez's group asked members of the general public to rate before-and-after pictures of patients with severe facial deformities, treated by either conventional reconstruction or facial transplantation.

Ratings were performed using a validated nine-point scale, from minimal (1 point) to severe (9 points) disfigurement. The average perceived disfigurement scores were 4.9 points for the facial transplant recipients versus 8.5 points for those who underwent conventional reconstruction (compared to 1.2 points for a group of individuals with no apparent facial disfigurement).

That supports the impression, communicated to patients considering facial transplantation, that while they may not appear completely normal after the procedure, their appearance "will likely improve dramatically" compared to conventional reconstructive surgery. Recipients have also reported becoming more active in their communities after facial transplantation, due to feeling less conspicuous when out in public. Further research is needed, including assessment of the impact on quality of life and other patient-reported outcomes.

Dr. Rodriguez and coauthors hope their studies will help to make the trauma community more aware of the option of facial transplantation in appropriate cases, and provide a step toward comparing its outcomes to those of conventional reconstruction. With ongoing advances - including the development of less toxic, more effective immunosuppressive therapies to prevent rejection - facial transplantation may become a more widely available alternative for patients with severe disfiguring facial trauma.

Credit: 
Wolters Kluwer Health

Holes in the head

image: More ancient skulls bearing evidence of trepanation -- a telltale hole surgically cut into the cranium -- have been found in Peru than the combined number found in the rest of the world.

Image: 
University of Miami

Even with a highly skilled neurosurgeon, the most effective anesthesia, and all the other advances of modern medicine, most of us would cringe at the thought of undergoing cranial surgery today.

After all, who needs a hole in the head? Yet for thousands of years, trepanation--the act of scraping, cutting, or drilling an opening into the cranium--was practiced around the world, primarily to treat head trauma, but possibly to quell headaches, seizures and mental illnesses, or even to expel perceived demons.

But, according to a new study led by the University of Miami Miller School of Medicine's David S. Kushner, M.D., clinical professor of physical medicine and rehabilitation, trepanation was so expertly practiced in ancient Peru that the survival rate for the procedure during the Incan Empire was about twice that of the American Civil War--when, more than three centuries later, soldiers were trepanned presumably by better trained, educated and equipped surgeons.

"There are still many unknowns about the procedure and the individuals on whom trepanation was performed, but the outcomes during the Civil War were dismal compared to Incan times," said Kushner, a neurologist who has helped scores of patients recover from modern-day traumatic brain injuries and cranial surgeries. "In Incan times, the mortality rate was between 17 and 25 percent, and during the Civil War, it was between 46 and 56 percent. That's a big difference. The question is how did the ancient Peruvian surgeons have outcomes that far surpassed those of surgeons during the American Civil War?"

In their study published in the June issue of World Neurosurgery, "Trepanation Procedures/Outcomes: Comparison of Prehistoric Peru with Other Ancient, Medieval, and American Civil War Cranial Surgery," Kushner and his co-authors--biological anthropologists John W. Verano, a world authority on Peruvian trepanation at Tulane University, and his former graduate student, Anne R. Titelbaum, now of the University of Arizona College of Medicine--can only speculate on the answer.

But hygiene, or more accurately the lack of it during the Civil War, may have contributed to the higher mortality rates in the later time period. According to the study, which relied on Verano's extensive field research on trepanation over a nearly 2,000-year period in Peru and a review of the scientific literature about trepanation around the world, Civil War surgeons often used unsterilized medical tools and their bare fingers to probe open cranial wounds or break up blood clots.

"If there was an opening in the skull they would poke a finger into the wound and feel around, exploring for clots and bone fragments," Kushner said, adding that nearly every Civil War soldier with a gunshot wound subsequently suffered from infection. "We do not know how the ancient Peruvians prevented infection, but it seems that they did a good job of it. Neither do we know what they used as anesthesia, but since there were so many (cranial surgeries) they must have used something--possibly coca leaves. Maybe there was something else, maybe a fermented beverage. There are no written records, so we just don't know."

Whatever their methods, ancient Peruvians had plenty of practice. More than 800 prehistoric skulls with evidence of trepanation--at least one but as many as seven telltale holes--have been found in the coastal regions and the Andean highlands of Peru, the earliest dating back to about 400 B.C. That's more than the combined total number of prehistoric trepanned skulls found in the rest of the world. That is why Verano devoted an entire book, Holes in the Head--The Art and Archeology of Trepanation in Ancient Peru, to the 800-plus skulls, most of which were collected from burial caves and archaeological digs in the late 1800s and early 1900s and reside in museums and private collections today.

It's also why Kushner, a medical history buff and Tulane alumnus, jumped at the chance to join Titelbaum in co-authoring one of the book's chapters, "Trepanation from the Perspective of Modern Neurosurgery," and continues to research the subject.

Published in 2016, the book analyzes the techniques and survival rates of trepanation in Peru through the demise of the Incan Empire in the early 1500s. The researchers gauged survival by classifying the extent of bone remodeling around the trepanned holes, which indicates healing. If there was no evidence of healing, the researchers assumed the patient died during or within days of the surgery. If the margins of the trepanation openings showed extensive remodeling, they considered the operation successful and the patient long-lived.
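
The study's outcome classification amounts to a simple decision rule on the observed bone remodeling. A minimal sketch in Python (the category labels and the intermediate "partial" case are illustrative, not the paper's exact terms):

```python
def classify_outcome(remodeling):
    """Infer a trepanation patient's outcome from bone remodeling around
    the opening, following the study's logic: no healing means death
    during or shortly after surgery; extensive remodeling means the
    operation succeeded and the patient was long-lived.
    """
    if remodeling == "none":
        return "died during or within days of surgery"
    if remodeling == "extensive":
        return "long-term survivor"
    # Intermediate remodeling (an assumed category, not from the paper):
    return "survived at least weeks after surgery"

print(classify_outcome("extensive"))  # long-term survivor
```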

Those classifications, Kushner, Verano and Titelbaum reported in the World Neurosurgery paper, show how ancient Peruvians significantly refined their trepanation techniques over the centuries. They learned, for example, not to perforate the protective membrane surrounding the brain--a guideline Hippocrates codified in ancient Greece at about the same time, the 5th century B.C., that trepanning is thought to have begun in ancient Peru.

The long-term survival rates from such "shallow surgeries" in Peru during those early years, from about 400 to 200 B.C., proved to be worse than those in the Civil War, when about half the patients died. But, from 1000 to 1400 A.D., survival rates improved dramatically, to as high as 91 percent in some samples, to an average of 75 to 83 percent during the Incan period, the study showed.

"Over time, from the earliest to the latest, they learned which techniques were better, and less likely to perforate the dura," said Kushner, who has written extensively about modern-day neurosurgical outcomes. "They seemed to understand head anatomy and purposefully avoided the areas where there would be more bleeding. They also realized that larger-sized trepanations were less likely to be as successful as smaller ones. Physical evidence definitely shows that these ancient surgeons refined the procedure over time. Their success is truly remarkable."

Almost as remarkable is how, by the end of World War I, cranial surgery evolved into the distinct profession of neurosurgery, which continues to improve our understanding of brain anatomy, physiology and pathology. As Kushner notes, today's neurosurgeons regularly cut into the brain to remove tumors and blood clots, reduce intracranial pressure from massive strokes and trauma, repair vascular and structural anomalies and treat a myriad of other complex problems--with great success.

"Today, neurosurgical mortality rates are very, very low; there is always a risk but the likelihood of a good outcome is very high," he said. "And just like in ancient Peru, we continue to advance our neurosurgical techniques, our skills, our tools, and our knowledge."

Credit: 
University of Miami

The cartography of the nucleus

image: A 3D model of the nucleus made with SPRITE: DNA regions in the "inactive hub" on chromosomes 15 (orange) and chromosome 18 (green) coming together around a large nuclear body in the nucleus (blue) called the nucleolus (red).

Image: 
Guttman laboratory / Cell

Nestled deep in each of your cells is what seems like a magic trick: Six feet of DNA is packaged into a tiny space 50 times smaller than the width of a human hair. Like a long, thin string of genetic spaghetti, this DNA blueprint for your whole body is folded, twisted, and compacted to fit into the nucleus of each cell.

Now, Caltech researchers have shown how cells organize the seemingly immense genome in a clever manner so that they can conveniently find and access important genes. Understanding the delicate three-dimensional organization of the genome is crucial, particularly because alterations in DNA structure have been linked to certain diseases such as cancer and early aging. Mapping and pinpointing alterations in nuclear structure may help in finding solutions to these diseases.

The work was done in the laboratory of Mitchell Guttman, assistant professor of biology and Heritage Medical Research Institute investigator. A paper describing the research appears in the June 7 online issue of the journal Cell.

Though the vast majority of cells in every human body contain identical genomes, different types of cells are able to have diverse functions because genes can be expressed at varying levels--in other words, they can be turned on or off. For example, when a stem cell is developing into a neuron, a flurry of activity happens in the nucleus to dial up and down levels of gene expression. These levels would be different, for example, if the stem cell were turning into a muscle cell or if the cell were making the decision to self-destruct.

In addition to the genome, the nucleus also contains structures called nuclear bodies, which are like miniature factories in the nucleus that contain a high concentration of cellular machinery all working to accomplish similar tasks, such as turning on specific sets of genes or modifying RNA molecules to produce proteins in the cell. This cellular machinery needs to be able to efficiently search through six feet of DNA--approximately 20,000 genes in mammals--in order to precisely find and control its targets. This is made possible because DNA is organized into three-dimensional structures that make certain genes more or less accessible.

In the new research, Guttman and his team describe a method to three-dimensionally map out how DNA is organized within the space of the nucleus and how regions of chromosomes interact with each other and with nuclear bodies. The technique, dubbed SPRITE (Split-Pool Recognition of Interactions by Tag Extension), allows researchers to examine clusters (or "complexes") of molecules within the nucleus to see which molecules are interacting with each other and where they are located.

In the technique, each complex in the nucleus is given a different molecular barcode, with all of the molecules within a single complex receiving the same barcode. Then, the complexes can be broken open and the molecules analyzed. This way, scientists can determine if two or more molecules were interacting, depending on whether they had the same barcode.
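The barcoding logic can be sketched in a few lines of Python. This is a toy model, not the actual SPRITE pipeline: the complexes, molecule names, and round/well counts are invented, and the real protocol builds barcodes chemically through repeated split-and-pool tagging rounds.

```python
import random

def split_pool_barcode(complexes, n_rounds=5, n_wells=96, seed=0):
    """Assign each complex a barcode, mimicking split-pool tagging: in
    every round a complex lands in one well and all of its molecules
    receive that well's tag, so molecules in the same complex end up
    with the same concatenated barcode."""
    rng = random.Random(seed)
    used, barcodes = set(), {}
    for molecules in complexes.values():
        tags = tuple(rng.randrange(n_wells) for _ in range(n_rounds))
        while tags in used:  # barcode space is vastly larger than needed
            tags = tuple(rng.randrange(n_wells) for _ in range(n_rounds))
        used.add(tags)
        for m in molecules:
            barcodes[m] = tags
    return barcodes

def infer_interactions(barcodes):
    """Molecules sharing an identical barcode were in the same complex."""
    groups = {}
    for molecule, code in barcodes.items():
        groups.setdefault(code, set()).add(molecule)
    return [g for g in groups.values() if len(g) > 1]

# Two hypothetical complexes: an inactive hub and a separate DNA pair.
complexes = {
    "hub1": ["dna_chr15", "dna_chr18", "rna_a"],
    "hub2": ["dna_chr3", "dna_chr7"],
}
print(infer_interactions(split_pool_barcode(complexes)))
```

Grouping by shared barcode recovers the original complexes, which is the core inference step the technique relies on.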

Led by graduate student Sofia Quinodoz, the team used SPRITE to discover that genes across different chromosomes (large folded structures of DNA) cluster together around specific nuclear bodies. Specifically, inactive genes--those that are turned off--across different chromosomes cluster together around a particular nuclear body called the nucleolus, which contains repressive proteins on DNA that keep genes turned off. Conversely, active genes on different chromosomes group around another kind of nuclear body called the nuclear speckle, which contains molecules that help turn genes on and make them into proteins.

"With SPRITE, we were able to see thousands of molecules--DNAs and RNAs--coming together at various 'hubs' around the nucleus in single cells," says Quinodoz, the study's first author. "Previously, researchers theorized that each chromosome is kind of on its own, occupying its own 'territory' in the nucleus. But now we see that multiple genes on different chromosomes are clustering together around these bodies of cellular machinery. We think these 'hubs' may help the cell keep DNA that are all turned on or turned off neatly organized in different parts of the nucleus to allow cellular machinery to easily access specific genes within the nucleus."

Credit: 
California Institute of Technology

New insight into why Pierce's disease is so deadly to grapevines

image: Symptoms of Pierce's disease on a grapevine leaf.

Image: 
University of California

Scientists are gaining a better understanding of Pierce's disease and how it affects grapevines. The disease, which annually costs California more than $100 million, comes from a bacterium called Xylella fastidiosa. While the bacterium has been present in the state for more than 100 years, Pierce's disease became a more serious threat to agriculture with the arrival of the glassy-winged sharpshooter insect, which can carry the bacterium from plant to plant.

In a new study, published in Frontiers in Plant Science, researchers at the University of California, Davis, have identified a set of molecular markers that influence the onset of Pierce's disease in grapevines.

"We now have a very good idea of the plant responses to the disease," said lead author Paulo Zaini, a postdoctoral researcher in the Department of Plant Sciences at UC Davis. "This will help us in early diagnosis and help us design strategies to protect the plant from damaging itself."

HOW INFECTION DEVELOPS

The glassy-winged sharpshooter injects the Xylella fastidiosa bacterium into the plant's xylem, which is the part of the plant that carries water. The disease causes leaves to yellow or "scorch," eventually drying up and dropping from the vine. It can kill a plant in three to five years. Few diseases can kill grapevines so quickly.

The glassy-winged sharpshooter was first reported in California in 1994 and can travel greater distances than native sharpshooters. By 2002, the glassy-winged sharpshooter had infested more than 1,100 acres of grapevines statewide.

"What growers do to stop the bug is just apply insecticides at an increasingly growing rate," said Zaini. "It's not a sustainable strategy."

In this study, the authors compared the responses of infected plants with those of healthy plants. A better understanding of the biochemical changes that accompany disease onset could foster new strategies to increase plant health, reducing the need for insecticides to fight the disease.

Scientists have long thought the bacteria growing in the xylem blocked the flow of water to the leaves.

"We thought that the blockage causes a drought stress, but there's much more to it than that." said Abhaya Dandekar, professor of plant sciences and the study's principal investigator. "Not all the vessels are blocked."

The blockage might be part of the problem, but it doesn't answer all the questions. More than 200 plant species harbor the bacterium but are asymptomatic.

Having identified molecular markers important for Pierce's disease in grapevines, researchers can use them to study grapevine varieties or other plants that do not develop disease.

Credit: 
University of California - Davis

UMSOM researchers find that silent carriers of malaria are unlikely to develop the disease

In regions where malaria illness is widespread, it is common to find many individuals who are infected with malaria parasites (Plasmodium falciparum), but without symptoms. New research conducted by the University of Maryland School of Medicine (UMSOM) shows that treating these silent malaria cases could help stop the spread of malaria to others.

UMSOM researchers conducted a study of 114 participants in Malawi ranging from children to adults to better understand the role asymptomatic malaria infections have in the spread and occurrence of malaria illness. It is the first study to use prospective, longitudinal detection of asymptomatic malaria infection to examine subsequent risk of malaria illness among all ages.

These asymptomatic infections may never develop into illness, but they are an important contributor to the spread of malaria and pose a public health challenge.

"We know that in Malawi, like many parts of Africa, most of the malaria parasites are being carried by people who are not sick. They don't get treatment for their infections, because their infections a silent, but when they get bitten by mosquitoes, they can transmit malaria" said Miriam Laufer, MD, MPH, Associate Professor of Pediatrics and Associate Director for Malaria Research in UMSOM's Center for Vaccine Development and Global Health (CVD).

Researchers examined the association between asymptomatic malaria infections and subsequent risk of malaria illness and demonstrated that carrying P. falciparum infection without symptoms was associated with a 50% decrease in the risk of malaria illness.

Using a genotyping method to determine the molecular fingerprint of each parasite, they discovered that when people with asymptomatic malaria infection get sick from malaria, it is because they have acquired a new infection (from the bite of a mosquito), rather than because the asymptomatic infection has developed into clinical disease. Faced with new infections, adults and children with and without asymptomatic infection were equally likely to get sick. The researchers concluded that asymptomatic infection did not protect against new infections that made people sick.
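
The logic of that genotype comparison can be sketched as follows. This is a simplified rule with made-up marker names; real parasite fingerprinting compares multiple loci and must handle mixed infections:

```python
def is_new_infection(asymptomatic_genotypes, illness_genotypes):
    """Return True when none of the parasites causing the illness match
    the molecular fingerprints carried asymptomatically beforehand --
    i.e., the illness reflects a newly acquired infection rather than
    the silent infection progressing to disease."""
    return set(illness_genotypes).isdisjoint(asymptomatic_genotypes)

# Hypothetical example: the sick visit shows only a genotype never seen
# during asymptomatic carriage, so it counts as a new infection.
print(is_new_infection({"geno_A", "geno_B"}, {"geno_C"}))  # True
```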

"We have always worried that if we give medicine to treat malaria to people with asymptomatic infection, they might get sicker the next time they get malaria. This has been a challenge to introducing new policies like mass drug administration or screening and treating campaigns to interrupt malaria transmission. Our results suggest that treating asymptomatic infection will not lead to increased risk of disease in the short term. Now we need to evaluate these new interventions to determine the long term impact both on the individual's health and also on malaria transmission" said Dr. Laufer.

Researchers enrolled participants seeking treatment for uncomplicated malaria at the Mfera Health Centre in Chikhwawa district in Malawi between June 2014 and March 2015. Subjects were eligible if they had symptomatic P. falciparum infection, detected by malaria rapid diagnostic test (RDT) and confirmed by microscopy, and were HIV-negative at time of screening. They were treated for their initial illness and then followed every month and evaluated every time they were ill.

Credit: 
University of Maryland School of Medicine

Mars rover finds ancient organic compounds that match meteoritic samples

image: A composite self-portrait by NASA's Mars Curiosity Rover taken at the Windjana site in Gale Crater.

Image: 
NASA/JPL-Caltech/MSSS.

Washington, DC-- NASA's Curiosity rover has discovered new "tough" organic molecules in three-billion-year-old sedimentary rocks on Mars, increasing the chances that the record of habitability and potential life could have been preserved on the Red Planet, despite extremely harsh conditions on the surface that can easily break down organic molecules.

"The Martian surface is exposed to radiation from space and harsh chemicals that break down organic matter, so finding ancient organic molecules in the top five centimeters, from a time when Mars may have been habitable, bodes well for us to learn the story of organic molecules on Mars with future missions that will drill deeper," said lead author Jen Eigenbrode of NASA's Goddard Space Flight Center. (She also happens to be a former Carnegie postdoc at our Geophysical Laboratory.)

Organic molecules contain carbon and hydrogen, and can include oxygen, nitrogen, and other elements. Organic compounds are commonly associated with life, although they can be created by non-biological processes as well, processes referred to as abiotic organic chemistry. There is no way for Curiosity to determine if the materials it found came from ancient Martian life or not, according to Eigenbrode.

"Whether it holds a record of ancient life, is the food for extant life, or has existed in the absence of life, organic matter in Martian materials holds chemical clues to planetary conditions and processes," Eigenbrode said.

Carnegie's Andrew Steele was a key member of the research team, whose work on this project built off his discovery six years ago of indigenous organic carbon in 10 Martian meteorites. The organic molecules he found in 2012 are comparable to those found by Curiosity.

Like the meteoritic samples, the rocks sampled by Curiosity must be heated by the rover's instruments to very high temperatures, ranging between 500 and 800 degrees Celsius (932 and 1,472 degrees Fahrenheit), to have their organics released as gas. Because the hydrocarbons were released at such high temperatures, they may be coming from bigger, tough organic molecules within the rock.

Sedimentary rocks (mudstones) were drilled from four areas at the base of Mount Sharp, the central mound in Gale crater. Although the surface of Mars is inhospitable today, there is evidence that in the distant past, the Martian climate allowed the presence of liquid water--an essential ingredient for life--at the surface.

Analysis by Curiosity indicates that billions of years ago, a lake inside Gale crater held all the ingredients necessary for life, including chemical building blocks, energy sources, and liquid water. The mudstone gradually formed from silt that settled out of the water and accumulated at the bottom of the lake. Scientists estimated the age of the rocks by the crater count method. Since meteorite impact craters accumulate over time, the more craters a region has, the older it is. Although there was no way to directly date the organic material found within the rocks, it has to be at least as old as the rocks themselves.
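
The relative-dating idea behind crater counting can be sketched simply: denser cratering means longer exposure to impacts. The densities below are invented, and real crater-count chronology fits measured densities to calibrated production and chronology functions rather than merely ranking them:

```python
def rank_oldest_first(crater_densities):
    """Order surfaces from oldest to youngest by crater density
    (craters per unit area): more craters, longer exposure, older."""
    return sorted(crater_densities, key=crater_densities.get, reverse=True)

# Hypothetical densities (craters per million km^2) for three regions.
regions = {"gale_floor": 350.0, "young_lava_plain": 40.0, "highlands": 900.0}
print(rank_oldest_first(regions))  # ['highlands', 'gale_floor', 'young_lava_plain']
```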

The results indicate organic carbon concentrations on the order of 10 parts per million or more. This is close to the amount observed in Martian meteorites and about 100 times greater than prior in-situ detections of organic carbon. Some of the molecules identified include thiophenes, benzene, toluene, and small carbon chains, such as propane or butene.

Organic molecules containing chlorine had been detected on Mars before.

Finding ancient carbon preserved right on the Martian surface gives scientists confidence that NASA's Mars 2020 rover and the European Space Agency's ExoMars rover will find even more organics, both on the surface and in the shallow subsurface.

"Are there signs of life on Mars?" asks Michael Meyer, NASA Program Scientist for the Mars Science Laboratory mission. "We don't know but these results tell us we are on the right track."

Steele says that the next step must be to look for organic compounds that are released from the rock samples at lower temperatures.

"The next target is material that comes out when heated to less than 600 degrees Celsius, which is where the molecules are that will provide evidence of biological activity or the kinds of abiotic chemistry that could give rise to life," he said.

Adapted from an article provided courtesy of NASA.

Credit: 
Carnegie Institution for Science

Systemic racism needs more examination related to health, says UofL researcher

image: This is Billie Castle, Ph.D.

Image: 
UofL

Although the discipline of public health has recently recognized racism as a social determinant of health, little research examines the issue related to systems and structures.

University of Louisville School of Public Health and Information Sciences researcher Billie Castle, Ph.D., a post-doctoral associate in the Department of Health Promotion and Behavioral Sciences, conducted a literature review on the terms racism and systemic racism and found 85 published articles on the topic.

In a paper published in the Journal of Racial and Ethnic Health Disparities, Castle analyzes themes from the 85 articles and provides discussion on what is needed to move toward equitable solutions.

The themes include: approaches to address systemic racism; the impact of residential and racial segregation on health outcomes; policy implications for reducing health inequities; and systemic racism's impact on health outcomes.

In the discussion section, Castle points out a gap in the research connecting systemic racism to the full set of social determinants of health. Although the literature examined many determinants such as education, neighborhoods, environment and health care, Castle said there was no examination of systemic racism across the connection of all social determinants.

"Public health researchers and practitioners need to look beyond only changing behaviors to include changing the systems and structures that influence the environments in which certain behaviors are necessary to survive," Castle said.

As an example, she said community-based programming is often seen as a hopeful means to prevent youth violence. The problem, Castle said, is that perpetual violent behavior is often in reaction to environmental factors created through historic systemic racist policies and practices.

"It is challenging to change your behavior, but still have to survive in an environment that does not provide the support to sustain that changed behavior," she said. "Changes to inequitable systemic policy and practice that intentionally create healthy economic and socially thriving communities are needed to reduce youth violence and change behaviors."

In the article, Castle also underscores the role of public health practitioners to "actively call out racist practices and move toward utilizing practices that are more racially and socially equitable."

Including more minorities in public health decision-making also is key, Castle said.

"We need to make sure we are equitable in the decisions of who we include in our work. We should immediately think about how our research and practice impacts multiple social identities including race, gender, sexuality, class, religion, etc. -- and how to improve health outcomes for the most marginalized social identities," she said.

Castle's next publication will expand on this topic by examining the historic practice of redlining and its impact on youth participating in violent behaviors.

Credit: 
University of Louisville

USC scientists discover schizophrenia gene roles in brain development

A USC research team identified 150 proteins affecting cell activity and brain development that contribute to mental disorders, including schizophrenia, bipolar disorder and depression.

It's the first time these molecules, which are associated with the disrupted-in-schizophrenia 1 (DISC1) protein linked to mental disorders, have been identified. The scientists developed new tools involving stem cells to determine chemical reactions the proteins use to influence cell functions and nerve growth in people.

"This moves science closer to opportunities for treatment for serious mental illness," said Marcelo P. Coba, the study author and professor of psychiatry at the Zilkha Neurogenetic Institute at the Keck School of Medicine of USC.

The findings appear in Biological Psychiatry.

Schizophrenia affects less than 1 percent of the U.S. population, but has an outsized impact on disability, suicide and premature deaths.

The DISC1 gene was linked to schizophrenia nearly 20 years ago. It controls how nerve cells called neurons develop, as well as how the brain matures. DISC1 also directs a network of signals across cells that can contribute to the disease. Scientists say errors in these chemical reactions contribute to schizophrenia.

But the identity of proteins that DISC1 can regulate is poorly understood, prompting the USC researchers and colleagues from the State University of New York Downstate Medical Center to undertake the research. The challenge was to simulate conditions inside the human brain, Coba explained.

Using stem cells, they conducted assays in conditions resembling the habitat where DISC1 does its work. Then, they used gene editing to insert a molecular tag on DISC1, allowing them to extract it from brain cells and identify the proteins with which it associates.

Identifying the proteins that interact with DISC1 in brain cells could lead to understanding how the risk factors for psychiatric diseases are connected to specific molecular functions, Coba explained. The discovery enables researchers to determine specific processes that differ in patients suffering from specific mental illnesses.

"This gives researchers specific trails to follow within cells from both healthy patients and those diagnosed with disorders," Coba said.

Schizophrenia is one of the top 15 leading causes of disability worldwide. People with schizophrenia live an average of nearly 29 years less than those without the disorder, according to the National Institute of Mental Health (NIMH).

The illness is often accompanied by conditions such as heart disease and diabetes, which contribute to the high premature mortality rate among people with schizophrenia. About 5 percent of people with schizophrenia die by suicide, a rate far greater than that of the general population, with the highest risk in the early stages of illness, according to the NIMH.

Credit: 
University of Southern California

Why seashells are tougher than chalk (video)

WASHINGTON, June 7, 2018 -- Seashells are made mostly of calcium carbonate, also known as chalk, a mineral soft and crumbly enough to use for sidewalk doodles. Yet seashells are tough and resilient. In this video, Reactions explains why seashells are so different, and why you can't use them to draw on your driveway: https://youtu.be/iUeMxjkSPyM.

Credit: 
American Chemical Society

International agreement that human-relevant research is needed to enhance drug discovery

image: Dr. Kate Willett, senior author of the report, describes the need for a paradigm shift towards human-relevant research methods.

Image: 
Troy Seidle HSI

Washington DC (June 7, 2018) - The average pre-approval cost of research and development for a successful drug is estimated to be US$2.6 billion and the number of new drugs approved per billion US dollars spent has halved roughly every 9 years since 1950. More than 90% of drug candidates entering clinical trials fail to gain regulatory approval, mainly as a result of insufficient efficacy and/or unacceptable toxicity, because of the limited predictive value of preclinical, animal-based studies. Without significant intervention, the pipeline responsible for new drug production is predicted to dry up completely within 50 years.

However, great advances have been made in life science technologies and computer science, increasing our ability to generate and analyze data, and there is growing recognition that a stronger focus on human-relevant data is needed to improve the success rate. The proceedings of a multistakeholder workshop co-organized by The Humane Society of the United States, Humane Society International and the National Institutes of Health under the auspices of the global BioMed21 Collaboration (biomed21.org) have been published in Drug Discovery Today, presenting a comprehensive overview of existing efforts to prioritize human-based biology for health research and proposing key recommendations for revitalizing the drug discovery process.

Report co-author Dr Kate Willett, Senior Director for Science and Regulatory Affairs for HSUS and HSI, said: "Through the BioMed21 collaboration, we are stimulating strategic scientific dialogue on regional and global levels, bringing key stakeholders together to explore and develop consensus recommendations around barriers, opportunities, and priorities for future research funding. Improvements to the drug development process are possible, but stakeholders need to work together to shift toward improved understanding of disease pathways and networks in humans, together with continued development and exploitation of human-relevant enabling technologies such as microphysiological systems and computational systems biology."

In 2007, the National Academy of Sciences first articulated how a transition to an approach based on explicit delineation of biological pathways could improve chemical safety assessment. Since then, the Adverse Outcome Pathway (AOP) framework has developed into a central tool for realizing this vision, with applications beyond safety assessment. Such a framework could provide a more predictive and effective rubric for understanding disease pathophysiology across levels of biological organization, and for targeting and evaluating new interventions using the growing toolbox of modern, human-specific tools for biomedical research.

The publication makes key recommendations for enhancing drug discovery and development, including the need for interdisciplinary and international collaboration and cooperation. Workshop participants - who included experts from 6 NIH institutes, 5 FDA centers, and other key stakeholders - agreed that standardized data and consistent ontologies are needed to incentivize global data sharing, and that global funding calls should prioritize human-based methods such as induced pluripotent stem cells, organoids and organs-on-chips.

Credit: 
Humane Society International

Sustained use of opioids before spine surgery increases risk of continued use after surgery

June 7, 2018 - Patients who take prescription opioids for a longer period before spinal surgery are more likely to continue using opioids several months after surgery, reports a study in the June 6, 2018, issue of The Journal of Bone & Joint Surgery. The journal is published in the Lippincott portfolio in partnership with Wolters Kluwer.

According to the new research, led by Andrew J. Schoenfeld, MD, MSc, of Brigham and Women's Hospital, Harvard Medical School, nearly nine percent of patients were still taking opioids six months after spinal surgery, and duration of opioid use before surgery was the main risk factor for continued use.

Sustained Preoperative Opioid Use Predicts Continued Use After Spine Surgery

Using insurance claims data, the researchers identified more than 27,000 patients who underwent various types of lower (lumbar) spine surgery between 2006 and 2014. Most of the patients underwent removal of a spinal disc (discectomy) or spinal fusion (arthrodesis). Although the data came from the US Department of Defense's Tricare insurance program, most of the patients in the study were civilians (such as retired military personnel or dependents of active-duty or retired personnel).

Nearly all patients had at least some opioid exposure before spinal surgery. They were classified into four groups:

Exposed: 60 percent had used opioids in the past, but were not actively using them at the time of surgery.

Acute exposure: 34 percent had their first opioid prescription within one month before surgery.

Intermediate sustained use: two percent had uninterrupted opioid use for less than six months before surgery.

Chronic sustained use: three percent had uninterrupted opioid use for six months or longer before surgery.

After surgery, 67 percent of the patients stopped taking opioids within 30 days, and 86 percent discontinued opioids by 90 days. Six months after surgery, 8.8 percent of patients were still taking prescription opioids.

Longer duration of opioid use before spinal surgery was an independent risk factor for continued use after surgery. After adjustment for other patient characteristics, the authors found that the likelihood of discontinuing opioid use within six months was 65 percent lower for patients in the "intermediate sustained" group and 74 percent lower for those in the "chronic sustained" group, compared with the "acute exposure" group. Somewhat surprisingly, even patients who were "exposed" but not actively using opioids before surgery were 29 percent less likely than those in the "acute exposure" group to discontinue opioids after surgery.
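To make the "percent lower likelihood" figures concrete, the sketch below treats them as odds ratios relative to the "acute exposure" reference group and applies them to a hypothetical baseline discontinuation probability. The 80 percent baseline and the odds-ratio interpretation are assumptions for illustration only, not figures from the study.

```python
# Illustrative only: convert "X percent lower likelihood" into an odds ratio
# and see what it implies for a hypothetical reference-group probability.

def adjusted_probability(baseline_prob, pct_lower):
    """Apply an 'X percent lower odds' adjustment to a baseline probability."""
    odds_ratio = 1 - pct_lower / 100                 # e.g. 65% lower -> OR = 0.35
    baseline_odds = baseline_prob / (1 - baseline_prob)
    adjusted_odds = baseline_odds * odds_ratio
    return adjusted_odds / (1 + adjusted_odds)

# Hypothetical 80% discontinuation probability in the acute-exposure group
baseline = 0.80
for group, pct in [("exposed", 29),
                   ("intermediate sustained", 65),
                   ("chronic sustained", 74)]:
    print(f"{group}: {adjusted_probability(baseline, pct):.0%}")
```

The point of the sketch: a "65 percent lower" adjusted likelihood does not mean the group's probability drops by 65 percentage points; it shrinks the odds, and the resulting probability depends on the baseline.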

Several other factors were associated with long-term opioid use after surgery: spinal fusion surgery, preoperative depression or anxiety, preoperative spinal fracture, a longer hospital stay, and junior enlisted rank (suggesting lower socioeconomic status).

The ongoing opioid crisis in the United States has prompted increased attention to the use of pain medications prescribed before and after surgery. Previous opioid use has been linked to an increased risk of complications and adverse outcomes after spinal surgery. This new study focuses on how preoperative opioid use affects continued opioid use after lumbar spine surgery, and finds evidence of a "dose-response" effect: patients taking opioids for a longer period before surgery are less likely to discontinue opioid use after surgery.

"Our results indicate that the majority of patients who are using prescription opioids prior to spine surgery discontinue these medications following surgical intervention," Dr. Schoenfeld and coauthors write. However, because close to 1 out of 10 patients are still taking opioids at six months after spinal surgery, the researchers highlight the need for surgeons to recognize the "biopsychosocial" factors contributing to chronic opioid use.

Since nearly all patients receive opioids before spinal surgery, Dr. Schoenfeld believes it's "reasonable" for surgeons to discuss risk factors for sustained opioid use with patients at the time of surgery. He adds, "Expectation management - defining shared goals of post-surgical pain control and a suspense date when the surgeon and patient agree opioids should likely no longer be necessary - could go a long way toward smoothing the opioid cessation process following surgery."

Credit: 
Wolters Kluwer Health

When did animals leave the first footprint on Earth?

image: Trackways and burrows excavated in situ from the Ediacaran Dengying Formation.

Image: 
NIGP

On July 20, 1969, Neil Armstrong put the first footprint on the moon. But when did animals leave the first footprint on Earth?

Recently, an international research team reported discovering fossil footprints made by animal appendages, dating to the Ediacaran Period (about 635-541 million years ago), in China. This is considered the earliest fossil record of animal footprints. The research was published in Science Advances on June 6, 2018.

Bilaterian animals such as arthropods and annelids have paired appendages and are among the most diverse animals today and in the geological past. They are often assumed to have appeared and radiated suddenly during the "Cambrian Explosion" about 541-510 million years ago, although it has long been suspected that their evolutionary ancestry was rooted in the Ediacaran Period. Until the current discovery, however, no fossil record of animal appendages had been found in the Ediacaran Period.

Researchers from the Nanjing Institute of Geology and Palaeontology of the Chinese Academy of Sciences and Virginia Tech in the United States studied trackways and burrows discovered in the Ediacaran Shibantan Member of the Dengying Formation (551-541 million years ago) in the Yangtze Gorges area of South China. The trackways are somewhat irregular, consisting of two rows of imprints that are arranged in series or repeated groups.

The characteristics of the trackways indicate that they were produced by bilaterian animals with paired appendages that raised the animal body above the water-sediment interface. The trackways appear to be connected to burrows, suggesting that the animals may have periodically dug into sediments and microbial mats, perhaps to mine oxygen and food.

These trace fossils represent some of the earliest known evidence for animal appendages and extend the earliest trace fossil record of animals with appendages from the early Cambrian to the late Ediacaran Period. The body fossils of the animals that made these traces, however, have not yet been found. Maybe they were never preserved.

Credit: 
Chinese Academy of Sciences Headquarters

End-to-end blood testing device shows capacity to draw sample and provide diagnostic results

image: Design of the automated blood testing and analysis device. The main subsystems include a venipuncture robot, sample handling module, and blood analyzer.

Image: 
Max Balter, Ph.D. of Rutgers University

Researchers from the Biomedical Engineering Department at Rutgers University have developed an end-to-end blood testing device that integrates robotic phlebotomy with downstream sample processing. This platform device performs blood draws and provides diagnostic results in a fully automated fashion at the point-of-care. By reducing turnaround times, the device has the potential to expedite hospital workflow, allowing practitioners to devote more time to treating patients. The research has been published in a paper in the June 2018 issue of TECHNOLOGY.

Diagnostic blood testing is the most commonly performed clinical procedure in the world and influences the majority of medical decisions made in hospital and laboratory settings. However, manual blood draw success rates are dependent on clinician skill and patient physiology, and results are generated almost exclusively in centralized labs from large-volume samples using labor-intensive analytical techniques.

To address these issues, the team of researchers at Rutgers University created a device that combines an image-guided venipuncture robot, which addresses the challenges of routine venous access, with a centrifuge-based blood analyzer that obtains quantitative hematology measurements. In the paper, results are presented for a white blood cell assay using a blood-mimicking fluid spiked with fluorescent microbeads. Studies were conducted on the integrated device -- from blood draw to analysis -- using blood vessel phantoms, demonstrating both high accuracy and repeatability of the cannulation and the resulting white blood cell assay.

"This device represents the holy grail in blood testing technology," stated Martin Yarmush, M.D., Ph.D., the paper's senior author. "Integrating miniaturized robotic and microfluidic systems, this technology combines the breadth and accuracy of traditional laboratory testing with the speed and convenience of point-of-care testing."

"When designing the system, our focus was on creating a modular and expandable device", stated Max Balter, Ph.D., first author of the paper. "With our relatively simple chip design and analysis techniques, the device can be extended to incorporate a broader panel of assays in the future".

Credit: 
World Scientific

Optimal sleep linked to lower risks for dementia and early death

Both short and long daily sleep durations were risk factors for dementia and premature death in a study of Japanese adults aged 60 years and older. The findings are published in the Journal of the American Geriatrics Society.

Among 1,517 adults who were followed for 10 years, 294 developed dementia and 282 died. Age- and sex-adjusted incidence rates of dementia and all-cause mortality were greater in those with daily sleep duration of less than 5.0 hours and 10.0 hours or more, compared with those with daily sleep duration of 5.0 to 6.9 hours. Participants with short sleep duration who had high physical activity did not have a greater risk of dementia and death, however.

"Given the beneficial effects of physical activity on risk of sleep disturbance, these findings indicate that not only maintenance of appropriate sleep duration, but also modification of lifestyle behaviors related to sleep may be an effective strategy for preventing dementia and premature death in elderly adults," the authors wrote.

Credit: 
Wiley

Scientists see inner workings of enzyme telomerase, which plays key roles in aging, cancer

image: This is an image of telomerase's catalytic core

Image: 
Juli Feigon, et al./UCLA/Cell

Cancer, aging-related diseases and other illnesses are closely tied to an important enzyme called "telomerase." UCLA researchers report in the journal Cell the deepest scientific understanding yet of this once-mysterious enzyme, whose catalytic core -- where most of its activity occurs -- can now be seen in near atomic resolution.

"We're now seeing not just the face of the clock, we're seeing how the components inside interact to make it work," said Juli Feigon, a professor of chemistry and biochemistry in the UCLA College and a senior author of the study. "At each step, we zoom in closer and see more and more details, and can now begin to deduce not just what the enzyme looks like, but also how it functions. Knowing that may lead to the development of new drugs that target specific parts of the enzyme."

In addition to reporting the highest level of detail ever seen of the structure of telomerase's catalytic core, the researchers report for the first time that they have captured telomerase in the process of making DNA.

"For the first time, we have a framework, or blueprint, of telomerase," said Lukas Susac, a UCLA postdoctoral scholar in Feigon's laboratory and a co-lead author. "We know people have telomerase mutations and get sick, but we have had no understanding of how this came to be, beyond knowing their telomerase doesn't work. Now we can say the problem is with a specific site within telomerase and perhaps see why the enzyme sometimes does not work properly. To treat an illness, first we have to locate where the problem occurs, and now this is possible. Of course, there are still steps to go."

Telomerase's main job is to maintain the DNA in telomeres, the structures at the ends of human chromosomes. When telomerase is not active, each time the cells divide, the telomeres get shorter. When that happens, the telomeres eventually become so short that the cells stop dividing or die.

Cells with abnormally active telomerase can continually rebuild their protective chromosomal caps and won't die, said Feigon, who also is a member of UCLA's Molecular Biology Institute and an associate member of the UCLA-Department of Energy Institute of Genomics and Proteomics. Over time, this is harmful because DNA errors accumulate and damage cells. Telomerase is especially active in cancer cells, which enables cancer to grow and spread.

Feigon's research team conducted the study using single-celled microorganisms called "Tetrahymena thermophila," which are commonly found in freshwater ponds. Telomerase's components are relatively well-known in Tetrahymena, and it is the organism in which telomerase and telomeres were first discovered. The central catalytic core of telomerase is similar in all organisms, including humans.

Telomerase contains a specialized protein called a "reverse transcriptase," which has four major regions and several sub-regions. In this research, the scientists have revealed a large, previously unstudied sub-region called "TRAP" in the enzyme's reverse transcriptase. Instead of copying from DNA to RNA -- typically DNA makes RNA, which makes proteins -- reverse transcriptases use RNA to make DNA; one that's especially well-known is the HIV reverse transcriptase, the target of many drugs.

While other reverse transcriptases can copy any arbitrary RNA sequence and make DNA out of it, telomerase's reverse transcriptase copies only a specific six-nucleotide RNA and does so many times to make a long chain of DNA. (Nucleotides are the building blocks of DNA and RNA.) TRAP plays a crucial role in adding on small pieces of DNA to the ends of chromosomes to keep them from shortening every time cells divide.
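The contrast with a generic reverse transcriptase can be pictured as repeated copying of one fixed template. The toy sketch below uses the Tetrahymena telomeric repeat (TTGGGG); the starting DNA end and the number of cycles are arbitrary choices for illustration, not details from the study.

```python
# Toy illustration of processive repeat addition: telomerase copies one short
# RNA template over and over, extending a chromosome end by one fixed
# six-nucleotide repeat per catalytic cycle.

def extend_telomere(dna_end, repeat="TTGGGG", cycles=3):
    """Append `cycles` copies of the telomeric repeat to a DNA end."""
    return dna_end + repeat * cycles

# Each cycle adds one repeat; a generic reverse transcriptase would instead
# copy an arbitrary RNA template a single time.
print(extend_telomere("TTGGGG", cycles=3))
```

In the real enzyme, regions such as TRAP help the catalytic core hold onto the growing DNA strand between cycles so that the same short template can be reused many times.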

The researchers report for the first time the structure, shape and significance of TRAP, and the region with which it interacts.

"A joy of science is the moment when you are the first person in the world to see something important," said Feigon, a member of the National Academy of Sciences. "I remember looking at this structure when we got it and thinking we had solved a significant piece of the puzzle and were the only people who had seen this. It's very exciting."

Feigon's research team is learning how the regions interact and communicate with one another. In a 2015 study in the journal Science, Feigon and colleagues reported the location of a major region called "TEN." Now the researchers report the structures of TEN and TRAP, and how they interact with each other and with the telomerase RNA. Many mutations that scientists attributed to the TEN region in fact disrupt TEN's interaction with TRAP, the researchers report in Cell.

This is the first time researchers have seen telomerase in the process of making DNA. The researchers captured telomerase immediately after it added a nucleotide to a growing DNA chain in the catalytic core. (The catalytic core consists of telomerase's reverse transcriptase and an RNA.)

What are the implications of the research for fighting cancers? Cancer cells keep reproducing, and for this to occur, telomerase must be highly active -- which it is not in healthy cells. To reduce this, it would be useful to know how to target the enzyme's activity. This new research brings this goal closer to reality by providing clues about what parts to target.

"We have very deep insights into how telomerase works and how the components work together," Susac said. "Each of these interactions could be a point to target, and possibly disrupt or enhance the function of telomerase. Precision will be very important; simply hitting telomerase with a hammer won't work. Telomerase is a very central and unique enzyme in many organisms. Now we have locations to aim for."

The scientists used a technique called "cryo-electron microscopy" that enables them to see the enzyme in extraordinary detail, and used computational modeling to interpret their data. The research team has expertise in several fields, including biochemistry, molecular biology, computational biology and biophysics.

Credit: 
University of California - Los Angeles