
DNA design that anyone can do

Researchers at MIT and Arizona State University have designed a computer program that allows users to translate any free-form drawing into a two-dimensional, nanoscale structure made of DNA.

Until now, designing such structures has required technical expertise that puts the process out of reach of most people. Using the new program, anyone can create a DNA nanostructure of any shape, for applications in cell biology, photonics, and quantum sensing and computing, among many others.

"What this work does is allow anyone to draw literally any 2-D shape and convert it into DNA origami automatically," says Mark Bathe, an associate professor of biological engineering at MIT and the senior author of the study.

The researchers published their findings in the Jan. 4 issue of Science Advances, and the program, called PERDIX, is available online. The lead authors of the paper are Hyungmin Jun, an MIT postdoc, and Fei Zhang, an assistant research professor at Arizona State University. Other authors are MIT research associate Tyson Shepherd, recent MIT PhD recipient Sakul Ratanalert, ASU assistant research scientist Xiaodong Qi, and ASU professor Hao Yan.

Automated design

DNA origami, the science of folding DNA into tiny structures, has its roots in the early 1980s, when Ned Seeman of New York University proposed taking advantage of DNA's base-pairing abilities to create arbitrary molecular arrangements. In 2006, Paul Rothemund of Caltech created the first scaffolded, two-dimensional DNA structures by weaving a long single strand of DNA (the scaffold) through the shape such that short DNA strands known as "staples" would hybridize to it and help the overall structure maintain its shape.

Others later used a similar approach to create complex three-dimensional DNA structures. However, all of these efforts required complicated manual design to route the scaffold through the entire structure and to generate the sequences of the staple strands. In 2016, Bathe and his colleagues developed a way to automate the process of generating a 3-D polyhedral DNA structure, and in this new study, they set out to automate the design of arbitrary 2-D DNA structures.

To achieve that, they developed a new mathematical approach to the process of routing the single-stranded scaffold through the entire structure to form the correct shape. The resulting computer program can take any free-form drawing and translate it into the DNA sequence to create that shape and into the sequences for the staple strands.

The shape can be sketched in any computer drawing program and then converted into a computer-aided design (CAD) file, which is fed into the DNA design program. "Once you have that file, everything's automatic, much like printing, but here the ink is DNA," Bathe says.
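To make the scaffold-and-staple idea concrete, the toy sketch below (written in Python, and not the PERDIX algorithm itself) derives staple sequences as reverse complements of fixed-length stretches of an already-routed scaffold; real origami staples also span crossovers between helices, which this simplification ignores.

```python
# Toy illustration only: staples hybridize to the scaffold, so each staple is
# the reverse complement of a stretch of scaffold sequence. Actual designs
# route staples across crossovers between helices, which is omitted here.

COMPLEMENT = {"A": "T", "T": "A", "G": "C", "C": "G"}

def reverse_complement(seq: str) -> str:
    return "".join(COMPLEMENT[base] for base in reversed(seq))

def toy_staples(scaffold: str, staple_len: int = 16) -> list:
    """Split the scaffold into fixed-length windows and return one staple per window."""
    return [
        reverse_complement(scaffold[i:i + staple_len])
        for i in range(0, len(scaffold) - staple_len + 1, staple_len)
    ]

if __name__ == "__main__":
    scaffold = "ATGCGTACGTTAGCCGATCGATTACGGCTAAGCTTGCATGC"
    for staple in toy_staples(scaffold):
        print(staple)
```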

After the sequences are generated, the user can order them to easily fabricate the specified shape. In this paper, the researchers created shapes in which all of the edges consist of two duplexes of DNA, but they also have a working program that can utilize six duplexes per edge, which are more rigid. The corresponding software tool for 3-D polyhedra, called TALOS, is available online and will be published soon in the journal ACS Nano. The shapes, which range from 10 to 100 nanometers in size, can remain stable for weeks or months, suspended in a buffer solution.

"The fact that we can design and fabricate these in a very simple way helps to solve a major bottleneck in our field," Bathe says. "Now the field can transition toward much broader groups of people in industry and academia being able to functionalize DNA structures and deploy them for diverse applications."

Nanoscale patterns

Because the researchers have such precise control over the structure of the synthetic DNA particles, they can attach a variety of other molecules at specific locations. This could be useful for templating antigens in nanoscale patterns to shed light on how immune cells recognize and are activated by specific arrangements of antigens found on viruses and bacteria.

"How nanoscale patterns of antigens are recognized by immune cells is a very poorly understood area of immunology," Bathe says. "Attaching antigens to structured DNA surfaces to display them in organized patterns is a powerful way to probe that biology."

Another key application is designing light-harvesting circuits that mimic the photosynthetic complexes found in plants. To achieve that, the researchers are attaching light-sensitive dyes known as chromophores to DNA scaffolds. In addition to harvesting light, such circuits could also be used to perform quantum sensing and rudimentary computations. If successful, these would be the first quantum computing circuits that can operate at room temperature, Bathe says.

Credit: 
Massachusetts Institute of Technology

Metabolic syndrome patients need more vitamin C to break cycle of antioxidant depletion

CORVALLIS, Ore. - A higher intake of vitamin C is crucial for metabolic syndrome patients trying to halt a potentially deadly cycle of antioxidant disruption and health-related problems, an Oregon State University researcher says.

That's important news for the estimated 35 percent of the U.S. adult population that suffers from the syndrome.

"What these findings are really saying to people as we move out of the rich-food holiday season and into January is eat your fruits and vegetables," said Maret Traber, a professor in the OSU College of Public Health and Human Sciences and Ava Helen Pauling Professor at Oregon State's Linus Pauling Institute. "Eat five to 10 servings a day and then you'll get the fiber, you'll get the vitamin C, and you'll really protect your gut with all of those good things."

A diet high in saturated fat results in chronic low-grade inflammation in the body that in turn leads to the development of metabolic syndrome, a serious condition associated with cognitive dysfunction and dementia as well as being a major risk factor for cardiovascular disease, fatty liver disease and type 2 diabetes.

A patient is considered to have metabolic syndrome if he or she has at least three of the following conditions: abdominal obesity, high blood pressure, high blood sugar, low levels of "good" cholesterol, and high levels of triglycerides.
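That "three of five" rule can be written down directly; the sketch below is a minimal illustration in Python, with boolean inputs standing in for the clinical cutoffs, which the article does not specify.

```python
# Minimal sketch of the "at least three of five conditions" rule described above.
# Boolean inputs stand in for clinical measurements and cutoff values.

def has_metabolic_syndrome(abdominal_obesity: bool,
                           high_blood_pressure: bool,
                           high_blood_sugar: bool,
                           low_hdl_cholesterol: bool,
                           high_triglycerides: bool) -> bool:
    """Return True if at least three of the five conditions are present."""
    conditions = [abdominal_obesity, high_blood_pressure, high_blood_sugar,
                  low_hdl_cholesterol, high_triglycerides]
    return sum(conditions) >= 3

# Example: abdominal obesity, high blood pressure and high triglycerides
# together meet the definition (3 of 5 conditions).
print(has_metabolic_syndrome(True, True, False, False, True))  # True
```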

Findings published in Redox Biology suggest the type of eating that leads to metabolic syndrome can prompt imbalances in the gut microbiome, with impaired gut function contributing to toxins in the bloodstream, resulting in vitamin C depletion, which subsequently impairs the trafficking of vitamin E.

It's a treadmill of antioxidant disruption that serves to make a bad situation worse; antioxidants such as vitamins C and E offer defense against the oxidative stress brought on by inflammation and the associated free radicals, unstable molecules that can damage the body's cells.

"Vitamin C actually protects vitamin E, so when you have lipid peroxidation, vitamin E is used up and vitamin C can regenerate it," Traber said. "If you don't have the vitamin C, the vitamin E gets lost and then you lose both of those antioxidants and end up in this vicious cycle of depleting your antioxidant protection."

Lipid peroxidation is the oxidative degradation of polyunsaturated fatty acids that are a major component of living cells; it's the process by which free radicals try to stabilize themselves by stealing electrons from cell membranes, causing damage to the cell.

"If there's too much fat in the diet, it causes injury to the gut," Traber said. "Bacterial cell walls can then leak from the gut and slip into circulation in the body, and they're chased down by neutrophils."

Neutrophils are the most abundant type of white blood cells, a key part of the immune system. Neutrophils attack bacteria with hypochlorous acid: bleach.

"The white blood cells are scrubbing with bleach and that destroys vitamin C," Traber said. "The body is destroying its own protection because it got tricked by the gut dysbiosis into thinking there was a bacterial invasion."

And without intervention, the process keeps repeating.

"People with metabolic syndrome can eat the same amount of vitamin C as people without metabolic syndrome but have lower plasma concentrations of vitamin C," Traber said. "We're suggesting that's because this slippage of bacterial cell walls causes the whole body to mount that anti-inflammatory response."

Vitamin C is found in fresh vegetables and fruits; sources of vitamin E include almonds, wheat germ and various seeds and oils.

Federal dietary guidelines call for 65 to 90 milligrams daily of vitamin C, and 15 milligrams of vitamin E.

Credit: 
Oregon State University

Controlling neurons with light -- but without wires or batteries

image: Wireless and battery-free implant with advanced control over targeted neuron groups.

Image: 
Philipp Gutruf

University of Arizona biomedical engineering professor Philipp Gutruf is first author on the paper "Fully implantable, optoelectronic systems for battery-free, multimodal operation in neuroscience research," published in Nature Electronics.

Optogenetics is a biological technique that uses light to turn specific neuron groups in the brain on or off. For example, researchers might use optogenetic stimulation to restore movement in case of paralysis or, in the future, to turn off the areas of the brain or spine that cause pain, eliminating the need for -- and the increasing dependence on -- opioids and other painkillers.

"We're making these tools to understand how different parts of the brain work," Gutruf said. "The advantage with optogenetics is that you have cell specificity: You can target specific groups of neurons and investigate their function and relation in the context of the whole brain."

In optogenetics, researchers load specific neurons with proteins called opsins, which convert light into the electrical potentials that underlie a neuron's activity. When a researcher shines light on an area of the brain, it activates only the opsin-loaded neurons.

The first iterations of optogenetics involved sending light to the brain through optical fibers, which meant that test subjects were physically tethered to a control station. Researchers went on to develop a battery-free technique using wireless electronics, which meant subjects could move freely.

But these devices still came with their own limitations -- they were bulky and often attached visibly outside the skull, they didn't allow for precise control of the light's frequency or intensity, and they could only stimulate one area of the brain at a time.

Taking More Control and Less Space

"With this research, we went two to three steps further," Gutruf said. "We were able to implement digital control over intensity and frequency of the light being emitted, and the devices are very miniaturized, so they can be implanted under the scalp. We can also independently stimulate multiple places in the brain of the same subject, which also wasn't possible before."

The ability to control the light's intensity is critical because it allows researchers to control exactly how much of the brain the light is affecting -- the brighter the light, the farther it will reach. In addition, controlling the light's intensity means controlling the heat generated by the light sources, avoiding the accidental activation of heat-sensitive neurons.

The wireless, battery-free implants are powered by external oscillating magnetic fields, and, despite their advanced capabilities, are not significantly larger or heavier than past versions. In addition, a new antenna design has eliminated a problem faced by past versions of optogenetic devices, in which the strength of the signal transmitted to the device varied with the orientation of the subject's head: A subject would turn its head and the signal would weaken.

"This system has two antennas in one enclosure, which we switch the signal back and forth very rapidly so we can power the implant at any orientation," Gutruf said. "In the future, this technique could provide battery-free implants that provide uninterrupted stimulation without the need to remove or replace the device, resulting in less invasive procedures than current pacemaker or stimulation techniques."

Devices are implanted with a simple surgical procedure similar to surgeries in which humans are fitted with neurostimulators, or "brain pacemakers." They cause no adverse effects to subjects, and their functionality doesn't degrade in the body over time. This could have implications for medical devices like pacemakers, which currently need to be replaced every five to 15 years.

The paper also demonstrated that animals implanted with these devices can be safely imaged with computed tomography, or CT, and magnetic resonance imaging, or MRI, which allow for advanced insights into clinically relevant parameters such as the state of bone and tissue and the placement of the device.

Credit: 
University of Arizona College of Engineering

University of Nevada, Reno uses 15 years of satellite imagery to study snow's comings and goings

image: In collaboration with NASA, Rose Petersky collected snow information around weather towers in the Sagehen Creek watershed.

Image: 
Photo by Adrian Harpold, University of Nevada, Reno.

RENO, Nev. - Winter snows are accumulating in the Sierra Nevada Mountains, creating the snowpacks that serve as a primary source of water for the western U.S.

However, due to rising average temperatures, snowpacks in the Great Basin appear to be transitioning from seasonal, with a predictable amount and melt rate, to "ephemeral," or short-lived, snowpacks that are less predictable and last no more than 60 days. Unfortunately, ephemeral snow, along with the reasons for and impacts of this transition, has been poorly tracked and understood. Recent research and two published papers by a former University of Nevada, Reno graduate student and her professors are shedding some light on the subject.

"Small temperature changes can lead to large ecological changes," explained Hydrology Graduate Student Rose Petersky. "More intermittent snowpacks means water flow is more difficult to predict. We might not get as much water into the ground, throwing off the timing of water for plant root systems, reducing our supply and use, and even affecting businesses such as tourism."

Petersky, under the guidance of Natural Resources and Environmental Science Assistant Professor Adrian Harpold in the College of Agriculture, Biotechnology and Natural Resources, was the lead author of two recently published papers analyzing the change. One reports on the causes of the ephemeral snow, and the other reports on the impact of the transition on vegetation in the Great Basin. Natural Resources and Environmental Science Assistant Professor Kevin T. Shoemaker and Professor Peter J. Weisberg also worked on the project and are coauthors.

With funding through the Nevada Agricultural Experiment Station and from NASA, Petersky and the team analyzed both ground-based and satellite-based remote sensing data collected every day from 2001 to 2015. Petersky also wrote an algorithm, or computer formula, to fill in data lost due to cloud cover. To map the changes, the team ran the data and algorithm through Google Earth Engine, performing many millions of computations in a few minutes.
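The article does not spell out the gap-filling algorithm; the minimal sketch below shows one common, assumed approach for a daily snow-cover series, carrying the most recent clear-sky observation forward across cloud-obscured days.

```python
# A minimal, assumed sketch of temporal gap-filling for a daily snow-cover
# record, where cloud-obscured days are marked as None (1 = snow, 0 = no snow).
# This is illustrative only, not the algorithm used in the published study.

from typing import List, Optional

def fill_cloud_gaps(series: List[Optional[int]]) -> List[Optional[int]]:
    """Replace None (cloud) with the most recent clear-sky value."""
    filled = []
    last_clear = None
    for value in series:
        if value is not None:
            last_clear = value
        filled.append(last_clear if value is None else value)
    return filled

# Example: days 4 and 5 are cloudy; they inherit the snow flag seen on day 3.
daily_snow = [0, 1, 1, None, None, 1, 0]
print(fill_cloud_gaps(daily_snow))  # [0, 1, 1, 1, 1, 1, 0]
```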

With the resulting maps, they discovered that topography can play an important role, with more snow at higher elevations and on more north-facing slopes. In the Great Basin and eastern Sierra Nevada, shifts to more ephemeral snowpacks are due primarily to more rain falling instead of snow. The maps also show that warming is likely to increase ephemeral snowpacks, even beyond the extreme 2015 drought. The vegetation types at the greatest risk from more ephemeral snowpacks were quaking aspen, red fir, Gambel oak and big mountain sagebrush, which represent ecosystems across the Great Basin.

"When it comes to managing natural resources, more information is better," said Harpold. "It will help us identify targets for intervention and work toward better managing the important water resource issues."

The team hopes others can use their results to identify species and areas most in need of management intervention in the form of forest thinning or assisted migration.

"Ultimately, this work will lead to more accurate models and reliable predictions for better water allocation and vegetation management in Nevada and beyond," Petersky concluded.

Credit: 
University of Nevada, Reno

No compelling evidence for health benefits of non-sugar sweeteners

There is no compelling evidence to indicate important health benefits of non-sugar sweeteners, and potential harms cannot be ruled out, suggests a review of published studies in The BMJ today.

Growing concerns about health and quality of life have encouraged many people to adopt healthier lifestyles and avoid foods rich in sugars, salt, or fat. Foods and drinks containing non-sugar sweeteners rather than regular ("free") sugars have therefore become increasingly popular.

Although several non-sugar sweeteners are approved for use, less is known about their potential benefits and harms within acceptable daily intakes because the evidence is often limited and conflicting.

To better understand these potential benefits and harms, a team of European researchers analysed 56 studies comparing no intake or lower intake of non-sugar sweeteners with higher intake in healthy adults and children.

Measures included weight, blood sugar (glycaemic) control, oral health, cancer, cardiovascular disease, kidney disease, mood and behaviour. Studies were assessed for bias and certainty of evidence.

Overall, the results show that, for most outcomes, there seemed to be no statistically or clinically relevant differences between those exposed to non-sugar sweeteners and those not exposed, or between different doses of non-sugar sweeteners.

For example, in adults, findings from a few small studies suggested small improvements in body mass index and fasting blood glucose levels with non-sugar sweeteners, but the certainty of this evidence was low.

Lower intakes of non-sugar sweeteners were associated with slightly less weight gain (-0.09 kg) than higher intakes, but again the certainty of this evidence was low.

In children, a smaller increase in body mass index score was seen with non-sugar sweeteners compared with sugar, but intake of non-sugar sweeteners made no difference to body weight.

And no good evidence of any effect of non-sugar sweeteners was found for overweight or obese adults or children actively trying to lose weight.

The researchers point out that this is the most comprehensive review on this topic to date, and will inform a World Health Organization guideline for health experts and policy makers.

However, they stress that the quality of evidence in many of the studies was low, so confidence in the results is limited. And they say longer term studies are needed to clarify whether non-sugar sweeteners are a safe and effective alternative to sugar.

In a linked editorial, Vasanti Malik at Harvard T.H. Chan School of Public Health agrees that more studies are needed to understand the potential health effects of non-sugar sweeteners and to guide policy development.

Based on existing evidence, she says use of non-sugar sweeteners as a replacement for free sugars, particularly in sugar sweetened drinks, "could be a helpful strategy to reduce cardiometabolic risk [chances of having diabetes, heart disease or stroke] among heavy consumers, with the ultimate goal of switching to water or other healthy drinks."

"Policies and recommendations will need updating regularly, as more evidence emerges to ensure that the best available data is used to inform the important public health debate on sugar and its alternatives," she concludes.

Credit: 
BMJ Group

Medicare's bundled payment experiment for joint replacements shows moderate savings

Boston, MA - Medicare's randomized trial of a new bundled payment model for hip and knee replacement surgeries led to $812 in savings per procedure, or a 3.1% reduction in costs, when compared with traditional means of paying for care, according to new research from Harvard T.H. Chan School of Public Health and Harvard Medical School. The study found that the bundled payment model was also associated with a reduction in use of skilled nursing care after the hospitalization, but had no effects on complication rates among patients.
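As a rough reader's check on those figures (an illustration, not part of the study's analysis), a saving of $812 per procedure at a 3.1% relative reduction implies baseline episode spending of roughly $26,000:

```python
# Back-of-the-envelope arithmetic implied by the figures quoted above.
savings_per_procedure = 812   # dollars saved per joint-replacement episode
relative_reduction = 0.031    # reported 3.1% reduction in costs

implied_baseline = savings_per_procedure / relative_reduction
print(f"Implied baseline spending per episode: ${implied_baseline:,.0f}")
# Implied baseline spending per episode: $26,194
```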

The study will be published online January 2, 2019 in the New England Journal of Medicine.

"Interest in bundled payments has exploded the past few years," said Michael Barnett, one of the lead authors on the study and assistant professor of health policy and management at Harvard Chan School. "The big question has always been whether this new model can lead hospitals to meaningfully reduce spending without harming patients. This study indicates that with the right financial incentive, hospitals can save money without compromising quality by sending more patients home rather than to a nursing facility."

Bundled payments are an alternative payment strategy that health plans, Medicare, and Medicaid are experimenting with to reduce expenses. Unlike traditional fee-for-service payments, bundled payments provide a single, fixed payment for a procedure and its follow-up care, rather than paying each party separately.

In 2016, Medicare implemented the Comprehensive Care for Joint Replacement Model. In this randomized controlled trial, all hospitals in randomly selected cities had to accept a bundled payment model for hip or knee joint replacements. The program is Medicare's largest randomized policy experiment of a new payment model to date. Under the model, hospitals in the selected cities received bonuses or penalties depending on how much they spent on follow-up care in the 90 days after joint-replacement patients were discharged.

To determine if the bundled payment model was effective at reducing costs and complications, the researchers analyzed data from the first two years of the program (2016-2017). They compared costs associated with 280,161 joint replacement procedures in 803 hospitals that were required to participate in the bundled payment program with 377,278 procedures in 962 hospitals that were not participating in the program.

The analysis showed that, before accounting for administrative costs, bonuses and penalties, the bundled payment model resulted in a modest 3% savings for each patient and that complication rates did not increase. The cost savings were driven almost exclusively by reduced use of post-acute care in skilled nursing facilities, the researchers said.

One concern that has been expressed over bundled payments is that such models may incentivize hospitals to avoid operating on sicker, more costly patients. This study, however, showed that the model had little impact on the number of higher-risk patients who received lower extremity joint replacements.

The study adds to the growing body of evidence that bundled payment models reduce spending without sacrificing quality of care, the researchers said. They added that the cost savings associated with bundled payments grew during the 18-month study period and that it is likely savings would continue to grow as the bundled payment model matured.

"While there is widespread agreement that we need to move away from our typical payment system, how to do so remains unclear. We need more rigorous experiments such as this one," said Ateev Mehrotra, senior author and associate professor of health care policy at Harvard Medical School.

Credit: 
Harvard T.H. Chan School of Public Health

Lipoprotein apheresis and PCSK9-inhibitors

Research suggests a specific treatment for homozygous familial hypercholesterolaemia (HoFH) patients: combination therapy with PCSK9-inhibitors (PCSK9-I) and lipoprotein apheresis (LA) may have synergistic effects on circulating lipid and lipoprotein levels. The relationship with this treatment regimen can be investigated further with larger datasets.

This review details the role of lipoprotein apheresis in the management of familial hypercholesterolaemia and discusses the potential advantages and disadvantages of combining it with PCSK9 inhibitors. Because it also affects inflammation and related mediators, LA is a potent therapeutic player, and a large body of evidence supports this view. By contrast, only a few observations are available on the effects of PCSK9-I on inflammation.

It is quite clear that further investigation of possible direct and/or indirect pleiotropic effects of PCSK9-I on inflammatory molecules is necessary. Evidence on both points with regard to HoFH and HeFH is briefly reported.

Credit: 
Bentham Science Publishers

New findings on genes that drive male-female brain differences, timing of puberty

image: The discovery of genes in roundworms raises new questions about whether differences in male and female behavior are hardwired in our brains.

Image: 
Nicoletta Barolini/Columbia University

Researchers have identified a group of genes that induces differences in the developing brains of male and female roundworms and triggers the initiation of puberty, a genetic pathway that may have the same function in controlling the timing of sexual maturation in humans.

The study, led by Columbia University scientists, offers new evidence for direct genetic effects in sex-based differences in neural development and provides a foundation to attempt to understand how men's and women's brains are wired and how they work.

The research was published Jan. 1 in eLife, an open-access journal founded by the Howard Hughes Medical Institute, the Max Planck Society and the Wellcome Trust.

Scientists have long known that puberty is accompanied by substantial changes in the brain characterized by the activation of neurons that produce hormonal signals. But what causes the brain to start releasing the hormones that switch on puberty remains elusive.

"In this paper we show that a pathway of regulatory genes acts within specific neurons to induce anatomical and functional differences in the male versus female brain," said lead study author Oliver Hobert, professor in Columbia's Department of Biological Sciences and a Howard Hughes Medical Institute investigator.

"Remarkably, we found that each member of this pathway is conserved between worms and humans, indicating that we have perhaps uncovered a general principle for how sexual brain differences in the brain are genetically encoded."

For their study, the researchers worked with the transparent roundworm C. elegans, the first multicellular organism to have its genome sequenced. The worm's genetic makeup is similar to that of humans, making it one of the most powerful research models in molecular genetics and developmental biology.

The research team singled out C. elegans with a mutation in a single gene known as Lin28. More than a decade ago, scientists discovered a link between mutations in the Lin28 gene and early-onset puberty in adolescent humans, a highly heritable condition that affects about 5 percent of the population. Conversely, overexpression of Lin28 is also associated with a delay in puberty.

"We knew the gene existed in humans, mice and worms, but we didn't understand how it controlled the onset of puberty," Hobert said. "Did Lin28 work directly with the brain? In what tissue type? What other genes did Lin28 control?"

In analyzing mutant C. elegans strains, the researchers found that worms with early-onset puberty carried the mutated Lin28 gene, similar to humans. They also discovered three additional genes associated with premature sexual maturation -- the most interesting of which was the fourth gene in the pathway, called Lin29.

Lin29 turned out to be present only in the male brain and expressed in the central neurons, establishing a distinct difference in the neural structures of males and females. Even more interesting, male C. elegans missing the Lin29 gene had a male appearance but moved and behaved more like females.

"If you look at animals, including humans, there are dramatic physical and behavioral differences between males and females, including, for example, how they move," Hobert said. "The Lin29-deficient male worms, in essence, were feminized."

Laura Pereira, the paper's first author and a postdoctoral fellow in Columbia's Department of Biological Sciences, said the study is important because it makes the case that specific genes control sex differences in neural development. "It opens up new questions about whether differences in male and female behavior are hardwired in our brains," she said.

Credit: 
Columbia University

Patients now living a median 6.8 years after stage IV ALK+ lung cancer diagnosis

According to the National Cancer Institute, patients diagnosed with non-small cell lung cancer (NSCLC) between the years 1995 and 2001 had a 15 percent chance of being alive 5 years later. For patients with stage IV disease, describing cancer that has spread to distant sites beyond the original tumor, that statistic drops to 2 percent. Now a University of Colorado Cancer Center study published in the Journal of Thoracic Oncology tells a much more optimistic story. For stage IV NSCLC patients whose tumors test positive for rearrangements of the gene ALK (ALK+ NSCLC), treated at UCHealth University of Colorado Hospital between 2009 and 2017, median overall survival was 6.8 years. This means that in this population, instead of only 2 percent of patients being alive 5 years after diagnosis, 50 percent of patients were alive 6.8 years after diagnosis.
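To illustrate what a median overall survival of 6.8 years means when some patients are still alive at last follow-up (censored), the sketch below applies the standard Kaplan-Meier estimator to a small synthetic dataset; it assumes the `lifelines` Python package and does not use the study's patient records.

```python
# Kaplan-Meier illustration of "median overall survival" on synthetic data.
# The median is the earliest time at which estimated survival falls to 50%.
from lifelines import KaplanMeierFitter

# Synthetic follow-up times in years; event=1 means death observed,
# event=0 means the patient was still alive at last follow-up (censored).
durations = [1.2, 2.5, 3.0, 4.1, 5.6, 6.8, 7.0, 7.5, 8.2, 9.0]
events    = [1,   1,   0,   1,   1,   1,   0,   1,   0,   0]

kmf = KaplanMeierFitter()
kmf.fit(durations, event_observed=events)
print(f"Estimated median survival: {kmf.median_survival_time_:.1f} years")
# Estimated median survival: 6.8 years (for this toy dataset)
```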

"What this shows is that with the development of good targeted therapies for ALK-positive lung cancer, even patients with stage IV disease can do well for many, many years," says Jose Pacheco, MD, investigator at CU Cancer Center and the study's first author.

Of the 110 patients in the current study, 83 percent were never-smokers, and the median age was 53 years. Almost all of these patients were initially treated with the drug crizotinib, which earned FDA approval in August 2011 to treat ALK+ NSCLC but had previously been available in Colorado and at other academic medical centers in the setting of clinical trials. Importantly, when patients in the current study showed evidence of worsening disease after treatment with crizotinib, 78 percent were transitioned to another ALK inhibitor, commonly brigatinib, alectinib or ceritinib.

"Many studies have reported shorter overall survival for patients with stage IV ALK+ NSCLC treated with crizotinib. These studies had lower survival outcomes in large part because of a lower percentage of patients receiving next-gen ALK inhibitors after progressing on crizotinib. Patients here were getting next-gen ALK inhibitors in phase 1 and 2 clinical trials before many other centers had access to them," Pacheco says.

Another factor that influenced survival was the use of pemetrexed-based chemotherapies in ALK+ lung cancer. Often, in addition to targeted therapy with ALK inhibitors, patients will undergo chemotherapy (and sometimes radiation). However, there are many chemotherapies to choose from, and it is often unclear which specific chemotherapies are most successful with specific cancers, stages, and patient characteristics. A 2011 study by CU Cancer Center investigator D. Ross Camidge, MD, PhD, who is also senior author of the current study, suggested that pemetrexed works especially well against the ALK+ form of the disease.

"We try to use mainly pemetrexed-based chemotherapies in ALK+ lung cancer," Pacheco says, "It is possible shorter survival in other studies may be associated with use of non-pemetrexed based chemotherapies."

Interestingly, the existence of brain metastases at time of diagnosis did not predict shorter survival.

"A lot of the new ALK inhibitors that were developed after crizotinib get into the brain very well, and they work similarly in the brain when compared to outside the brain. And we're doing more careful surveillance of patients to see when they develop brain mets - instead of waiting for symptoms and then treating, we're monitoring for the development of metastases with imaging of the brain and if we see something new, we sometimes treat it before it causes symptoms," Pacheco says.

The most predictive factor of shorter survival was the number of organs that were found to carry cancer at the time of diagnosis.

"At this point, 6.8 years one of longest median survivals ever reported for a NSCLC subpopulation with stage IV disease," Pacheco says. "It shows the benefit of targeted therapy and how it's changing survival for a lot of patients. And I think it suggests that for some types of NSCLC, it may become much more of a chronic condition rather than a terminal disease."

Credit: 
University of Colorado Anschutz Medical Campus

A new 'atlas' of genetic influences on osteoporosis

A ground-breaking new study led by researchers from the Lady Davis Institute (LDI) at the Jewish General Hospital (JGH) has succeeded in compiling an atlas of genetic factors associated with estimated bone mineral density (BMD), one of the most clinically relevant factors in diagnosing osteoporosis. The paper, published in Nature Genetics, identifies 518 genome-wide loci, of which 301 are newly discovered, that explain 20% of the genetic variance associated with osteoporosis. Having identified so many genetic factors offers great promise for the development of novel targeted therapeutics to treat the disease and reduce the risk of fracture.

"Our findings represent significant progress in highlighting drug development opportunities," explains Dr. Brent Richards, the lead investigator, a geneticist at the LDI's Centre for Clinical Epidemiology who treats patients with osteoporosis in his practice at the JGH. "This set of genetic changes that influence BMD provides drug targets that are likely to be helpful for osteoporotic fracture prevention."

Osteoporosis is a very common age-related condition characterized by the progressive reduction of bone strength, which results in a high risk of fracture. Especially among older patients, fractures can have severe consequences, including the risk of mortality. Among all sufferers, fractures impose major burdens of hospitalization and extended rehabilitation. As the population ages, the urgency of improving preventive measures becomes all the more intense.

"We currently have few treatment options," said Dr. Richards, a Professor of Medicine, Human Genetics, and Epidemiology and Biostatistics at McGill University, "and many patients who are at high risk of fractures do not take current medications because of fear of side effects. Notwithstanding that it is always better to prevent than to treat. We can prescribe injectables that build bone, but they are prohibitively expensive. We have medications that prevent loss of bone, but they must be taken on a strict schedule. As a result, the number of people who should be treated, but are not, is high. Therefore, we believe that we will have greater success in getting patients to follow a treatment regimen when it can be simplified."

This was the largest study ever undertaken of the genetic determinants of osteoporosis, assessing more than 426,000 individuals in the UK Biobank. After analyzing the data, the researchers further refined their findings to isolate a set of genes that are very strongly enriched for known drug targets. This smaller set of target genes will allow drug developers to narrow their search for a solution to the clinical problem of preventing fractures in those people who are predisposed to osteoporotic fractures. Animal models have already proven the validity of some of these genes.

"Although we found many genetic factors associated with BMD, the kind of precision medicine that genetics offers should allow us to hone in on those factors that can have the greatest effect on improving bone density and lessening the risk of fracture," said Dr. John Morris, also from the LDI and McGill University, the lead author on the study.

Credit: 
McGill University

Physicists record 'lifetime' of graphene qubits

CAMBRIDGE, MA -- Researchers from MIT and elsewhere have recorded, for the first time, the "temporal coherence" of a graphene qubit -- meaning how long it can maintain a special state that allows it to represent two logical states simultaneously. The demonstration, which used a new kind of graphene-based qubit, represents a critical step forward for practical quantum computing, the researchers say.

Superconducting quantum bits (simply, qubits) are artificial atoms that use various methods to produce bits of quantum information, the fundamental component of quantum computers. Similar to traditional binary circuits in computers, qubits can maintain one of two states corresponding to the classic binary bits, a 0 or 1. But these qubits can also be a superposition of both states simultaneously, which could allow quantum computers to solve complex problems that are practically impossible for traditional computers.

The amount of time that these qubits stay in this superposition state is referred to as their "coherence time." The longer the coherence time, the greater the ability for the qubit to compute complex problems.

Recently, researchers have been incorporating graphene-based materials into superconducting quantum computing devices, which promise faster, more efficient computing, among other perks. Until now, however, there's been no recorded coherence for these advanced qubits, so there's no knowing if they're feasible for practical quantum computing.

In a paper published today in Nature Nanotechnology, the researchers demonstrate, for the first time, a coherent qubit made from graphene and exotic materials. These materials enable the qubit to change states through voltage, much like transistors in today's traditional computer chips -- and unlike most other types of superconducting qubits. Moreover, the researchers put a number to that coherence, clocking it at 55 nanoseconds, before the qubit returns to its ground state.
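A coherence figure such as 55 nanoseconds is typically extracted by fitting an exponential decay to the measured qubit signal as a function of delay time. The sketch below does this on synthetic data generated with T2 = 55 ns; it is a generic illustration using NumPy and SciPy, not the authors' analysis code.

```python
# Fit an exponential decay envelope to a synthetic coherence measurement.
import numpy as np
from scipy.optimize import curve_fit

def decay(t, amplitude, t2, offset):
    """Exponential decay envelope: A * exp(-t / T2) + offset."""
    return amplitude * np.exp(-t / t2) + offset

rng = np.random.default_rng(0)
delays_ns = np.linspace(0, 300, 60)                 # pulse-delay sweep, in ns
signal = decay(delays_ns, 1.0, 55.0, 0.0) + rng.normal(0, 0.02, delays_ns.size)

popt, _ = curve_fit(decay, delays_ns, signal, p0=[1.0, 50.0, 0.0])
print(f"Fitted T2 ≈ {popt[1]:.0f} ns")              # close to the true 55 ns
```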

The work combined expertise from co-authors William D. Oliver, a physics professor of the practice and Lincoln Laboratory Fellow whose work focuses on quantum computing systems, and Pablo Jarillo-Herrero, the Cecil and Ida Green Professor of Physics at MIT who researches innovations in graphene.

"Our motivation is to use the unique properties of graphene to improve the performance of superconducting qubits," says first author Joel I-Jan Wang, a postdoc in Oliver's group in the Research Laboratory of Electronics (RLE) at MIT. "In this work, we show for the first time that a superconducting qubit made from graphene is temporally quantum coherent, a key requisite for building more sophisticated quantum circuits. Ours is the first device to show a measurable coherence time -- a primary metric of a qubit -- that's long enough for humans to control."

There are 14 other co-authors, including Daniel Rodan-Legrain, a graduate student in Jarillo-Herrero's group who contributed equally to the work with Wang; MIT researchers from RLE, the Department of Physics, the Department of Electrical Engineering and Computer Science, and Lincoln Laboratory; and researchers from the Laboratory of Irradiated Solids at the École Polytechnique and the Advanced Materials Laboratory of the National Institute for Materials Science.

A pristine graphene sandwich

Superconducting qubits rely on a structure known as a "Josephson junction," where an insulator (usually an oxide) is sandwiched between two superconducting materials (usually aluminum). In traditional tunable qubit designs, a current loop creates a small magnetic field that causes electrons to hop back and forth between the superconducting materials, causing the qubit to switch states.

But this flowing current consumes a lot of energy and causes other issues. Recently, a few research groups have replaced the insulator with graphene, an atom-thick layer of carbon that's inexpensive to mass produce and has unique properties that might enable faster, more efficient computation.

To fabricate their qubit, the researchers turned to a class of materials called van der Waals materials -- atomically thin materials that can be stacked like Legos on top of one another, with little to no resistance or damage. These materials can be stacked in specific ways to create various electronic systems. Despite their near-flawless surface quality, only a few research groups have ever applied van der Waals materials to quantum circuits, and none have previously been shown to exhibit temporal coherence.

For their Josephson junction, the researchers sandwiched a sheet of graphene in between the two layers of a van der Waals insulator called hexagonal boron nitride (hBN). Importantly, graphene takes on the superconductivity of the superconducting materials it touches. The selected van der Waals materials can be made to usher electrons around using voltage, instead of the traditional current-based magnetic field. Therefore, so can the graphene -- and so can the entire qubit.

When voltage gets applied to the qubit, electrons bounce back and forth between two superconducting leads connected by graphene, changing the qubit from ground (0) to excited or superposition state (1). The bottom hBN layer serves as a substrate to host the graphene. The top hBN layer encapsulates the graphene, protecting it from any contamination. Because the materials are so pristine, the traveling electrons never interact with defects. This represents the ideal "ballistic transport" for qubits, where a majority of electrons move from one superconducting lead to another without scattering with impurities, making a quick, precise change of states.

How voltage helps

The work can help tackle the qubit "scaling problem," Wang says. Currently, only about 1,000 qubits can fit on a single chip. Having qubits controlled by voltage will be especially important as millions of qubits start being crammed on a single chip. "Without voltage control, you'll also need thousands or millions of current loops too, and that takes up a lot of space and leads to energy dissipation," he says.

Additionally, voltage control means greater efficiency and a more localized, precise targeting of individual qubits on a chip, without "cross talk." That happens when a little bit of the magnetic field created by the current interferes with a qubit it's not targeting, causing computation problems.

For now, the researchers' qubit has a brief lifetime. For reference, conventional superconducting qubits that hold promise for practical application have documented coherence times of a few tens of microseconds, a few hundred times greater than the researchers' qubit.

But the researchers are already addressing several issues that cause this short lifetime, most of which require structural modifications. They're also using their new coherence-probing method to further investigate how electrons move ballistically around the qubits, with aims of extending the coherence of qubits in general.

Credit: 
Massachusetts Institute of Technology

Unmuting large silent genes lets bacteria produce new molecules, potential drug candidates

image: Illinois researchers developed a technique to unmute silent genes in Streptomyces bacteria using decoy DNA fragments to lure away repressors. Pictured, from left: postdoctoral researcher Fang Guo, professor Huimin Zhao and postdoctoral researcher Bin Wang.

Image: 
Photo by L. Brian Stauffer

CHAMPAIGN, Ill. -- By enticing away the repressors dampening unexpressed, silent genes in Streptomyces bacteria, researchers at the University of Illinois have unlocked several large gene clusters for new natural products, according to a study published in the journal Nature Chemical Biology.

Since many antibiotics, anti-cancer agents and other drugs have been derived from genes readily expressed in Streptomyces, the researchers hope that unsilencing genes that have not previously been expressed in the lab will yield additional candidates in the search for new antimicrobial drugs, says study leader and chemical and biomolecular engineering professor Huimin Zhao.

"There are so many undiscovered natural products lying unexpressed in genomes. We think of them as the dark matter of the cell," Zhao said. "Anti-microbial resistance has become a global challenge, so clearly there's an urgent need for tools to aid the discovery of novel natural products. In this work, we found new compounds by activating silent gene clusters that have not been explored before."

The researchers previously demonstrated a technique to activate small silent gene clusters using CRISPR technology. However, large silent gene clusters have remained difficult to unmute. Those larger genes are of great interest to Zhao's group, since a number of them have sequences similar to regions that code for existing classes of antibiotics, such as tetracycline.

To unlock the large gene clusters of greatest interest, Zhao's group created clones of the DNA fragments they wanted to express and injected them into the bacteria in hopes of luring away the repressor molecules that were preventing gene expression. They called these clones transcription factor decoys.

"Others have used this similar kind of decoys for therapeutic applications in mammalian cells, but we show here for the first time that it can be used for drug discovery by activating silent genes in bacteria," said Zhao, who is affiliated with the Carle Illinois College of Medicine, the Carl R. Woese Institute for Genomic Biology and the Center for Advanced Bioenergy and Bioproducts Innovation at Illinois.

To verify that the targeted gene clusters were being expressed, the researchers tested the decoy method first on two known gene clusters that synthesize natural products. Next, they created decoys for eight silent gene clusters that had not previously been explored. In bacteria injected with the decoys, the targeted silent genes were expressed and the researchers harvested new products.

"We saw that the method works well for these large clusters that are hard to target by other methods," Zhao said. "It also has the advantage that it does not disturb the genome; it's just pulling away the repressors. Then the genes are expressed naturally from the native DNA."

In the search for drug candidates, each product needs to be isolated and then studied to determine what it does. Of the eight new molecules produced, the researchers purified and determined the structure of two molecules, and described one in detail in the study - a novel type of oxazole, a class of molecules often used in drugs.

The researchers plan next to characterize the rest of the eight compounds and run various assays to find out whether they have any anti-microbial, anti-fungal, anti-cancer or other biological activities.

Zhao's group also plans to apply the decoy technique to explore more silent biosynthetic gene clusters of interest in Streptomyces and in other bacteria and fungi to find more undiscovered natural products. Other research groups are welcome to use the technique for gene clusters they are exploring, Zhao said.

"The principle is the same, assuming that gene expression is repressed by transcription factors and we just need to release that expression by using decoy DNA fragments," Zhao said.

Credit: 
University of Illinois at Urbana-Champaign, News Bureau

NYUAD study suggests that 'Actin' is critical in genome regulation during nerve cell formation

(December 31, 2018--Abu Dhabi) -- A new NYU Abu Dhabi study suggests for the first time that Actin, which is a cytoskeleton protein found in the cell, is critical to regulating the genome - the genetic material of an organism - during the formation of "neurons" or nerve cells. The study was published today in PLOS Genetics.

Led by NYU Abu Dhabi Associate Professor of Biology Piergiorgio Percipalle, along with other researchers, the study involved converting "fibroblasts" - cells that maintain connective tissues - with impaired actin expression into neurons in order to identify the role of Actin in neurogenesis. The implications of the methodology, together with the availability of fibroblasts that do not express actin, are far-reaching. It will enable researchers to understand novel concepts in genome regulation and, in the long term, to model diseases to identify druggable targets.

"The technology we've applied in my lab has given us the opportunity to identify novel factors and pathways involved in the regulation of the mammalian genome during neurogenesis - the formation of neurons - and has a lot of potential for the development of personalized medicines," says Percipalle, the study's lead researcher.

Credit: 
New York University

Community-based HIV testing effective in reaching undiagnosed populations, new study finds

image: A lay provider performs a finger-prick blood-based rapid diagnostic test.

Image: 
PATH

Hanoi, Vietnam, December 31, 2018--One in three people living with HIV in Vietnam remain undiagnosed, according to recent estimates. New strategies and models of HIV testing are urgently needed to reach undiagnosed populations and help them enroll in antiretroviral therapy (ART), in Vietnam and throughout the world.

Results from an evaluation study now published in PLOS ONE demonstrate that HIV testing by lay providers can serve as a critical addition to efforts to achieve the United Nations' 90-90-90 global HIV targets by 2020 and help to cover the "last mile" of HIV services to at-risk populations in Vietnam.

Evidence from the study--conducted by PATH in partnership with the Vietnam Ministry of Health, the United States Agency for International Development (USAID), and the Center for Creative Initiatives in Health and Population in Hanoi--suggests that community-based HIV testing is an effective approach to reach people at risk of HIV who have never been tested or test infrequently. Key at-risk populations include people who inject drugs, men who have sex with men, female sex workers, and first-time HIV testers.

A cross-sectional survey of 1,230 individuals tested by lay providers found that 74 percent of clients belonged to at-risk populations, 67 percent were first-time HIV testers, and 85 percent preferred lay provider testing to facility-based testing. Furthermore, lay provider testing yielded a higher HIV positivity rate compared to facility-based testing and resulted in a high ART initiation rate of 91 percent.

"PATH is committed to achieving the global 90-90-90 goals by 2020 and to ending AIDS by 2030," said Dr. Kimberly Green, PATH HIV & TB Director. "Innovation in HIV testing is absolutely critical to meet these ambitious targets, and community-based HIV testing offers a promising solution to connect undiagnosed people with the services they need."

Lay providers participating in the study belonged to community-based organizations led by at-risk populations in urban areas and to village health worker networks in rural mountainous areas. Providers used a single rapid diagnostic test in clients' homes, at the offices of community-based organizations, or at any private place preferred by the client. This approach helped to overcome barriers that had prevented key populations from seeking facility-based testing services, such as a perceived lack of confidentiality, fear of stigma and discrimination, inconvenient service opening times and distance, and long waiting times for test results.

Clients who had an HIV-reactive test were referred to the nearest health facility for HIV confirmatory testing, and those who received a confirmed HIV-positive result were referred to a public or private clinic for enrollment in ART. Clients with non-reactive test results received counseling to re-test after three or six months and were referred to a local health facility for HIV prevention services.
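The referral pathway described above can be summarized in a short sketch; the returned actions paraphrase the article's description, and the function name is an illustrative placeholder.

```python
# Minimal sketch of the community-based testing referral pathway described above.
def next_step(rapid_test_reactive: bool, confirmed_positive: bool = False) -> str:
    if not rapid_test_reactive:
        return "Counsel to re-test in 3-6 months; refer to local HIV prevention services"
    if confirmed_positive:
        return "Refer to a public or private clinic for ART enrollment"
    return "Refer to the nearest health facility for confirmatory HIV testing"

print(next_step(rapid_test_reactive=False))
print(next_step(rapid_test_reactive=True))
print(next_step(rapid_test_reactive=True, confirmed_positive=True))
```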

The study provides new evidence on the effectiveness of HIV testing administered by non-health care workers representing key populations and frontline village health volunteers. The results also support findings from community-based HIV testing approaches in other regions, including sub-Saharan Africa, that have demonstrated comparatively high rates of HIV testing uptake, high HIV positivity yields, and high success rates in linking people to care.

Based on the results of the study, the Vietnam Ministry of Health developed and approved national guidelines on community-based HIV testing and counseling in April 2018, and wider adoption of lay provider HIV testing was included in the 2018-2020 Global Fund concept note and in programming from the United States President's Emergency Plan for AIDS Relief (PEPFAR). Lay provider HIV testing services are now available in 32 of Vietnam's 63 provinces.

Credit: 
PATH

Abstaining from alcohol in January will mean better sleep, weight loss, and saving money

New research from the University of Sussex shows that taking part in Dry January - abstaining from booze for a month - sees people regaining control of their drinking, gaining more energy, enjoying better skin and losing weight. They also report drinking less months later.

The research, led by Sussex psychologist Dr Richard de Visser, was conducted with over 800 people who took part in Dry January in 2018. The results show that Dry January participants are still drinking less in August. They reported that: