
Abramson Cancer Center studies show promise of immunotherapy combinations, including CAR T

CHICAGO - As immunotherapies continue to make up a larger share of new cancer drugs, researchers are looking for the most effective ways to use these cutting-edge treatments in combination with each other or with existing options. New studies from the Abramson Cancer Center of the University of Pennsylvania are providing fresh clues on potentially effective combinations with CAR T therapy in brain cancer as well as a novel therapeutic target in head and neck cancer, along with a greater understanding of the mechanisms of resistance in pancreatic cancer. All three studies will be presented as late-breaking abstracts at the American Association for Cancer Research Annual Meeting in Chicago.

Combining CAR T Therapy with Checkpoint Inhibitors in Glioblastoma

The first study combines CAR T cell therapy with immune checkpoint inhibitors in glioblastoma - an aggressive form of brain cancer (Abstract LB-340). Researchers used two different types of CAR T cells. One was specifically engineered to bind to epidermal growth factor receptor variant three (EGFRvIII), a gene that is commonly mutated in glioblastoma. Another targeted a protein known as interleukin-13 receptor subunit alpha-2 (IL-13Rα2). Researchers tested a variety of checkpoint inhibitors in combination and found that CARs targeting EGFRvIII were five times more effective when paired with an anti-PD-1 treatment, while CARs targeting IL-13Rα2 were five times more effective when paired with CTLA-4 inhibitors.

"This not only shows that the combination of CAR T cells and checkpoint inhibitors can have an enhanced effect compared to what either can do alone, it also shows some combinations work better than others, and that a more personalized evaluation of each tumor may result in more effective therapy," said the study's lead author Zev A. Binder, MD, PhD, a senior research investigator in Neurosurgery at Penn. Donald M. O'Rourke, MD, the John Templeton, Jr., M.D. Associate Professor in Neurosurgery at Penn, was the senior author.

The authors say this study lays the groundwork for a better understanding of CAR T combination therapy in glioblastoma, a concept they plan to continue advancing. They also say the IL-13Rα2 CARs may have other implications, since that protein is also expressed in dogs. They've partnered with researchers at the University of Pennsylvania School of Veterinary Medicine, and studies in canine patients are already in progress.

The study was partially supported by Tmunity, the Templeton Family Initiative in Neuro-Oncology, and the Maria and Gabriele Troiano Brain Cancer Immunotherapy Fund.

Finding Checkpoints to Target Outside of PD-1/PDL-1 in Head and Neck Cancer

Penn researchers are also at the forefront of another approach to combination immunotherapies that involves exploiting immune checkpoints beyond PD-1/PDL-1 - in this case, in the treatment of advanced head and neck cancer (Abstract CT158). This international, multisite trial evaluated the drug monalizumab in patients with recurrent or metastatic squamous cell carcinoma of the head and neck. Monalizumab targets an immune checkpoint called CD94/NKG2A that is distinct from the better-known PD-1/PDL-1 system. Inhibition of this checkpoint with monalizumab frees up T cells and Natural Killer (NK) cells to fight off cancer.

This trial combined monalizumab with cetuximab - a drug approved by the U.S. Food and Drug Administration for the treatment of patients with advanced head and neck cancer. Cetuximab is thought to work, in part, by activating NK cells. The study found that the combination of these two drugs has more anti-tumor activity than cetuximab given by itself. Of the 26 patients evaluable on this trial so far, eight had a partial response and 14 had stable disease. Monalizumab was generally well tolerated, although one patient did stop treatment due to side effects. The trial is ongoing.

"The data so far show that this therapy is active in head and neck cancer, and since monalizumab targets a checkpoint that is different from other inhibitors that are currently available, it's an interesting option as a combination partner for a variety of novel immunotherapeutic approaches," said the trial's principal investigator Roger B. Cohen, MD, a professor of Hematology-Oncology at Penn and Associate Director of Clinical Research in the Abramson Cancer Center.

The trial is supported by Innate Pharma, which manufactures monalizumab.

Translating Early Results Into Future Research in Pancreatic Cancer

Another key to immunotherapies is understanding how cancer is able to fight these therapies off, and a Penn-led trial sheds light on that question in pancreatic cancer (Abstract CT085). The study involved two groups of patients, both of which received a combination of chemotherapy drugs. One group also got a drug called hydroxychloroquine (HCQ) to block a process called autophagy - a built-in resistance mechanism that allows cells under attack to survive by breaking down unneeded parts and recycling them.

Twenty-one of the 46 patients who got the autophagy blocker showed a partial response (46 percent), compared to just eight out of 48 (17 percent) who didn't get HCQ. However, overall survival at one year was slightly lower in the HCQ group (41 percent) than in the group that only got chemotherapy (51 percent).

"While the trial did not reach its overall goal of prolonged survival, HCQ did improve the response rate, which may mean we can harness this therapy in a different setting, perhaps in patients with locally advanced disease in hopes of getting them to a point where we can surgically remove the tumor after initial chemotherapy," said the study's lead author Thomas B Karasic, MD, an Instructor of Hematology-Oncology at Penn. Peter J. O'Dwyer, MD, a professor of Hematology Oncology and director of the Developmental Therapeutics Program in the Abramson Cancer Center, was the study's senior author.

The authors say they benefitted from Penn's ability to accrue patients for large-scale clinical trials like this one, which they believe to be the largest trial ever to evaluate autophagy inhibition. They say that helps provide insights into future therapeutic possibilities.

"If we understand the mechanism of why this didn't extend survival despite the improvement in response, it may be able to help us guide the development of future trials and the use of this therapy in other cancers," Karasic said.

Credit: 
University of Pennsylvania School of Medicine

Marine fish won an evolutionary lottery 66 million years ago, UCLA biologists report

image: An evolutionary history of major groups of acanthomorphs, an extremely diverse group of fish.

Image: 
Michael Alfaro/UCLA Ecology and Evolutionary Biology

Why do our oceans contain such a staggering diversity of fish of so many different sizes, shapes and colors? A UCLA-led team of biologists reports that the answer dates back 66 million years, when a six-mile-wide asteroid crashed to Earth, wiping out the dinosaurs and approximately 75 percent of the world's animal and plant species.

Slightly more than half of today's fish are "marine fish," meaning they live in oceans. And most marine fish, including tuna, halibut, grouper, sea horses and mahi-mahi, belong to an extraordinarily diverse group called acanthomorphs. (The study did not analyze the large numbers of other fish that live in lakes, rivers, streams, ponds and tropical rainforests.)

The aftermath of the asteroid crash created an enormous evolutionary void, providing an opportunity for the marine fish that survived it to greatly diversify.

"Today's rich biodiversity among marine fish shows the fingerprints of the mass extinction at the end of the Cretaceous period," said Michael Alfaro, a professor of ecology and evolutionary biology in the UCLA College and lead author of the study.

To analyze those fingerprints, the "evolutionary detectives" employed a new genomics research technique developed by one of the authors. Their work is published in the journal Nature Ecology and Evolution.

When they studied the timing of the acanthomorphs' diversification, Alfaro and his colleagues discovered an intriguing pattern: Although there were many other surviving lineages of acanthomorphs, the six most species-rich groups of acanthomorphs today all showed evidence of substantial evolutionary change and proliferation around the time of the mass extinction. Those six groups have gone on to produce almost all of the marine fish diversity that we see today, Alfaro said.

He added that it's unclear why the other acanthomorph lineages failed to diversify as much after the mass extinction.

"The mass extinction, we argue, provided an evolutionary opportunity for a select few of the surviving acanthomorphs to greatly diversify, and it left a large imprint on the biodiversity of marine fishes today," Alfaro said. "It's like there was a lottery 66 million years ago, and these six major acanthomorph groups were the winners."

The findings also closely match fossil evidence of acanthomorphs' evolution, which also shows a sharp rise in their anatomical diversity after the extinction.

The genomic technique used in the study, called sequence capture of DNA ultra-conserved elements, was developed at UCLA by Brant Faircloth, who is now an assistant professor of biological sciences at Louisiana State University. Where previous methods used just 10 to 20 genes to create an evolutionary history, Faircloth's approach creates a more complete and accurate picture by using more than 1,000 genetic markers. (The markers include genes and other DNA components, such as parts of the DNA that turn proteins on or off, and cellular components that play a role in regulating genes.)

The researchers also extracted DNA from 118 species of marine fish and conducted a computational analysis to determine the relationships among them. Among their findings: It's not possible to tell which species are genetically related simply by looking at them. Seahorses, for example, look nothing like goatfish, but the two species are evolutionary cousins -- a finding that surprised the scientists.
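The release does not describe the analysis pipeline itself. As a rough, hypothetical illustration of the general idea of inferring relationships from many genetic markers, the sketch below builds a toy distance-based tree from made-up aligned sequences; the species labels, sequences, and clustering method are assumptions for illustration only, not the study's actual workflow.

```python
# Illustrative sketch only: a toy distance-based tree from concatenated markers.
# The real study used sequence capture of >1,000 ultraconserved elements and
# dedicated phylogenetics software; species names and sequences here are made up.
from itertools import combinations

from scipy.cluster.hierarchy import dendrogram, linkage

# Hypothetical concatenated marker alignments (equal length per species).
alignments = {
    "seahorse": "ACGTACGTACGTTTGA",
    "goatfish": "ACGTACGAACGTTTGA",
    "tuna":     "ACGAACGTTCGTATGA",
    "grouper":  "ACGAACGTTCGTATGC",
}
species = list(alignments)

def p_distance(a, b):
    """Fraction of aligned sites that differ -- a crude evolutionary distance."""
    return sum(x != y for x, y in zip(a, b)) / len(a)

# Condensed pairwise distance vector in the order scipy expects.
dists = [p_distance(alignments[a], alignments[b]) for a, b in combinations(species, 2)]

# Average-linkage clustering stands in for a real tree-building method.
tree = linkage(dists, method="average")
print(dendrogram(tree, labels=species, no_plot=True)["ivl"])  # leaf order of the toy tree
```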

"We demonstrate this approach works, and that it sheds new light on evolutionary history for the most species-rich group of marine vertebrates," Alfaro said.

Credit: 
University of California - Los Angeles

Gene mapping lays groundwork for precision chemotherapy

Despite the great successes of targeted cancer drugs and the promise of novel immunotherapies, the vast majority of people diagnosed with cancer are still first treated with chemotherapy. Now a new study by UCSF researchers using techniques drawn from computational biology could make it much easier for physicians to use the genetic profile of a patient's tumor to pick the chemotherapy treatment with the fewest side effects and best chance of success.

"Since 95 percent of cancer patients still get chemo, we realized we could make a major impact on cancer treatment by helping clinicians prescribe the right chemotherapy drug," said Sourav Bandyopadhyay, PhD, a professor of bioengineering and therapeutic sciences in UCSF's Schools of Pharmacy and Medicine and senior author on the new study.

Chemotherapies are potent toxins delivered into the bloodstream to kill tumor cells throughout the body by damaging DNA in rapidly dividing cells. However, these poisons can also do significant harm to other dividing cells such as those found in the stomach lining and in hair and nail follicles, as well as the blood and immune stem cells in the bone marrow. In addition, cancer cells' susceptibility to these agents varies widely, and tumors often develop resistance to drugs that initially seem effective.

There are more than 100 chemotherapy agents in wide use, but oncologists have very little information to guide their decisions about which of these drugs to use in a given patient. These decisions are typically guided by the drugs' average historical success rate for different types of cancer, rather than any understanding of how the chemotherapy drug will interact with the genetic profile of a specific tumor.

"We know very little about how gene mutations in tumor cells can change how a tumor might respond or not to certain chemotherapy drugs. Mapping these sorts of connections could make it possible to optimize which drugs patients get based on their tumor genetics," said Bandyopadhyay, a member of the UCSF Helen Diller Family Comprehensive Cancer Center and the Quantitative Biosciences Institute.

Now -- in a paper published online April 17, 2018 in Cell Reports -- Bandyopadhyay's lab has systematically mapped connections between 625 breast and ovarian cancer genes and nearly every FDA approved chemotherapy for breast or ovarian cancer. Led by Hsien-Ming "Kevin" Hu, PhD, Bandyopadhyay's group developed a high-throughput combinatorial approach that allowed them to perform 80,000 experiments in laboratory dishes in a matter of weeks. The authors said their results, which they have made publicly available, constitute an invaluable resource to help clinicians predict which chemotherapies will be most effective against tumor cells with particular genetic mutations, and how to rationally combine therapies to prevent cancers from developing resistance.

"We're trying to take a systems view of chemotherapy resistance," said Bandyopadhyay. "With rarer mutations in particular there aren't enough patients for large clinical trials to be able to identify biomarkers of resistance, but by considering all the different potential genetic factors that have been identified together in one study, we can robustly predict from experiments in laboratory dishes how cancers with different genetic mutations will respond to different treatments."

The team began by identifying hundreds of genes frequently mutated in human cancers: 200 implicated in breast cancer, 170 linked to ovarian cancer, and 134 involved in DNA repair, which is compromised in many types of cancer. They then mimicked the effects of such mutations in lab dishes by systematically inactivating each of these cancer-associated genes in healthy human cells, creating 625 different perturbations that mirrored distinct genetic mutations seen in real breast and ovarian cancers.

The researchers then exposed cells from each of these lines to a panel of 31 different drug treatments -- including 23 chemotherapy compounds approved by the FDA for breast and ovarian cancers, six targeted cancer drugs, and two common drug combinations. An automated microscopy system monitored the cells' health and recorded which groups of cells were killed, which survived, and which developed resistance when exposed to a particular treatment.

The resulting "map" of gene-drug interactions allowed the researchers to accurately predict the responses of multiple human cancer cell lines to different chemotherapy agents based on the cell lines' genetic profiles and also revealed new genetic factors that appear to determine the response of breast and ovarian tumor cells to common classes of chemotherapy treatment.

As a proof of principle, the researchers collaborated with Clovis Oncology, a biotech company based in Boulder, Colorado, which is running a clinical trial of drugs known as PARP inhibitors in patients with stage II ovarian cancer. Based on their gene-drug interaction map, the researchers predicted that mutations in two genes, called ARID1A and GPBP1, could contribute to ovarian cancer's ability to develop resistance to this class of drugs. Results from the clinical trial bore out these predictions: patients with these mutations were significantly more likely to develop resistance.

Bandyopadhyay's team has deposited the trove of data generated in the new study in a database maintained by the National Cancer Institute so that other researchers can mine it for information about drug combinations and derive new biological insights about the basis for chemotherapy's success or failure. The lab is also working with the Breast Oncology Program at UCSF to make this data part of an adaptive clinical trial called I-SPY, which lets researchers identify the most effective therapies based on patient molecular profiling, and is collaborating with members of the UCSF Institute for Computational Health Sciences (ICHS) to put these and other public data into a centralized database that clinicians can access through an app to help make the most appropriate treatment decisions.

In the future, Bandyopadhyay says, a better understanding of how chemotherapy agents impact specific biological pathways should allow drug trials to focus on patients who are more likely to respond to the drugs being tested and enable clinicians to identify targeted or combination therapies for patients with a genetic predisposition to resistance.

Credit: 
University of California - San Francisco

How does plant DNA avoid the ravages of UV radiation?

CHAPEL HILL, NC - If ultraviolet radiation from the sun damages human DNA, causing health problems, does UV radiation also damage plant DNA? The answer is yes, but because plants can't come in from the sun or slather on sunblock, they have a super robust DNA repair kit. Today, the UNC School of Medicine lab of 2015 Nobel laureate Aziz Sancar, MD, PhD, has published an exquisite study of this powerful DNA repair system in plants, which closely resembles a repair system found in humans and other animals.

The study, published in Nature Communications, is the first repair map of an entire multicellular organism. It revealed that the "nucleotide excision repair" system works much more efficiently in the active genes of plants as compared to humans. And this efficiency depends on the day/night cycle.

"These findings advance our understanding of DNA repair mechanisms common among all organisms and may also have practical applications," said co-corresponding author Ogun Adebali, PhD, a postdoctoral researcher in the Sancar lab.

First author Onur Oztas, PhD, a postdoctoral researcher in the Sancar lab, said, "DNA damage accumulating in a plant will impair its growth and development, so boosting the excision repair system could be a good strategy for improving crop yields."

Sancar, the Sarah Graham Kenan Professor of Biochemistry and Biophysics, was awarded the 2015 Nobel Prize in Chemistry for his studies of excision repair, which is now widely viewed as the major mechanism of DNA repair - including repair of UV damage - in living organisms. Most prior studies of this repair system have been in mammalian and bacterial cells; much less is known about how the system works in plants. However, plants must have efficient systems for DNA repair, since they cannot easily avoid sunlight and of course need it for their growth.

For the study, Oztas and colleagues used an excision-repair mapping technique they recently developed, known as XR-seq. The technique enables them to detect and sequence the short lengths of damaged DNA that are cut from chromosomes during the excision repair process. The sequences of these DNA snippets can be matched to corresponding stretches of DNA on a reference genome, in order to map precisely the spots where DNA damage is under repair.
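As a purely illustrative sketch of the counting step such a repair map implies (not the published XR-seq pipeline), the code below tallies hypothetical excision-fragment alignments that overlap annotated genes; all coordinates and gene names are made up.

```python
# Toy sketch of the counting step behind a repair map (not the published XR-seq
# pipeline): tally excised-fragment alignments that fall inside annotated genes.
from collections import Counter

# Hypothetical gene annotation: name -> (chromosome, start, end), 0-based half-open.
genes = {
    "GeneA": ("chr1", 1_000, 4_000),
    "GeneB": ("chr1", 10_000, 15_000),
}

# Hypothetical aligned excision fragments: (chromosome, start, end) per read.
fragments = [
    ("chr1", 1_200, 1_226),
    ("chr1", 3_950, 3_976),
    ("chr1", 12_000, 12_026),
    ("chr2", 500, 526),  # falls outside the toy annotation, so it is not counted
]

repair_counts = Counter()
for chrom, start, end in fragments:
    for gene, (g_chrom, g_start, g_end) in genes.items():
        if chrom == g_chrom and start < g_end and end > g_start:  # any overlap
            repair_counts[gene] += 1

for gene, n in repair_counts.items():
    print(f"{gene}: {n} repair events mapped")
```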

The UNC researchers performed XR-seq scans on cells from UV-exposed plants - Arabidopsis thaliana, the "lab rat" of plant research, also known as thale cress or mouse-ear cress. The resulting repair maps revealed that excision repair in Arabidopsis works faster on genes that are active. When active, genes are transcribed into strands of RNA that may then be translated into proteins, the main machinery of cells. Prior studies from the Sancar lab showed that excision repair works more efficiently for actively transcribed genes in animals and bacteria. The phenomenon, called transcription-coupled repair, is thought to have evolved as a way to direct DNA repair where it is most acutely needed.

"Here we found that the jump in efficiency for transcription-coupled repair is even more pronounced in plants than it is in animals or bacteria," Oztas said.

Sancar's lab performed XR-seq on UV-exposed Arabidopsis over 24-hour periods to discover that the efficiency of transcription-coupled repair also varies according to the "circadian" day/night cycle for 10 to 30 percent of Arabidopsis's genes. This reflects the normal daily variations of transcription activity in these genes.

"The results show that excision repair in plants is regulated in much the same way it is in other organisms - in order to maximize efficiency," Oztas said.

The Sancar lab plans to follow up with studies aimed at solving two lingering mysteries about excision repair in plants. One is that knocking out the excision repair system leads to an increase in plant genome mutations even when the plant is kept in the dark, away from UV or other forms of light.

"This implies that excision repair is needed to fix DNA damage from other, unknown factors besides UV," Oztas said. "We'd like to identify and characterize those unknown factors and find out how excision repair fixes the types of damage they cause."

The plant excision repair system also involves a slightly different set of repair proteins than are found in other organisms. The UNC scientists hope to determine why that is and precisely how plants' distinctive set of excision repair proteins work together to keep plant genomes in good repair.

Credit: 
University of North Carolina Health Care

Two robots are better than one for NIST's 5G antenna measurement research

image: NIST's new Large Antenna Positioning System (LAPS) uses two robotic arms to measure and test antennas for applications such as advanced communications systems.

Image: 
Burrus/NIST

Researchers at the National Institute of Standards and Technology (NIST) continue to pioneer new antenna measurement methods, this time for future 5G wireless communications systems.

NIST's new Large Antenna Positioning System (LAPS) has two robotic arms designed to position "smart" or adaptable antennas, which can be mounted on base stations that handle signals to and from huge numbers of devices. Future 5G systems will operate at higher frequencies and offer more than 100 times the data-carrying capacity of today's cellphones, while connecting billions of mobile broadband users in complex, crowded signal environments.

Among its many special capabilities, the LAPS can test transmissions to and from antennas located on fast-moving mobile devices, which requires coordination between the timing of communication signals and robot motion.

"Measurements of antenna signals are a great use for robotics," NIST electronics engineer Jeff Guerrieri said. "The robotic arms provide antenna positioning that would be constrained by conventional measurement systems."

NIST researchers are still validating the performance of the LAPS and are just now beginning to introduce it to industry. The system was described at a European conference last week.

Today's mobile devices such as cell phones, consumer Wi-Fi systems and public safety radios mostly operate at frequencies below 3 gigahertz (GHz), a crowded part of the spectrum. Next-generation mobile communications are starting to use the more open frequency bands at millimeter wavelengths (30-300 GHz), but these signals are easily distorted and more likely to be affected by physical barriers such as walls or buildings. Solutions will include transmitter antenna arrays with tens to hundreds of elements that focus the antenna power into a steerable beam that can track mobile devices.

For decades, NIST has pioneered testing of high-end antennas for radar, aircraft, communications and satellites. Now, the LAPS will help foster the development of 5G wireless and spectrum-sharing systems. The dual-robot system will also help researchers understand the interference problems created by ever-increasing signal density.

The new facility is the next generation of NIST's Configurable Robotic Millimeter-Wave Antenna (CROMMA) Facility, which has a single robotic arm. CROMMA, developed at NIST, has become a popular tool for high-frequency antenna measurements. Companies that integrate legacy antenna measurement systems are starting to use robotic arms in their product lines, facilitating the transfer of this technology to companies like The Boeing Co.

CROMMA can measure only physically small antennas. NIST developed the LAPS concept of a dual robotic arm system, one robot in a fixed position and the other mounted on a large linear rail slide to accommodate larger antennas and base stations. The system was designed and installed by NSI-MI Technologies. The LAPS also has a safety unit, including radar designed to prevent collisions of robots and antennas within the surrounding environment, and to protect operators.

The LAPS' measurement capabilities for 5G systems include flexible scan geometries, beam tracking of mobile devices and improved accuracy and repeatability in mobile measurements.

The LAPS has replaced NIST's conventional scanners and will be used to perform near-field measurement of basic antenna properties for aerospace and satellite companies requiring precise calibrations and performance verification. The near-field technique measures the radiated signal very close to the antenna in a controlled environment and, using mathematical algorithms developed at NIST, calculates the antenna's performance at its operating distance, known as the far field.
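The NIST algorithms themselves are not described here. For planar scans, the standard textbook relationship is that the far-field pattern follows from a two-dimensional Fourier transform of the field sampled near the antenna; the sketch below illustrates only that idea, with an assumed 30 GHz signal and a made-up uniform aperture, and is not NIST's method.

```python
# Conceptual sketch of a planar near-field to far-field transform -- not NIST's
# algorithms. The far-field pattern is approximated by the 2-D Fourier transform
# of the field sampled on a plane close to the antenna. All numbers are assumed.
import numpy as np

wavelength = 0.01             # 10 mm, i.e. roughly a 30 GHz signal (assumed)
dx = wavelength / 2           # sample spacing on the measurement plane
n = 64                        # samples per side

# Hypothetical measured near-field: a uniformly illuminated square aperture.
x = (np.arange(n) - n / 2) * dx
X, Y = np.meshgrid(x, x)
near_field = ((np.abs(X) < 5 * wavelength) & (np.abs(Y) < 5 * wavelength)).astype(complex)

# Plane-wave spectrum via 2-D FFT; its magnitude approximates the far-field pattern.
spectrum = np.fft.fftshift(np.fft.fft2(near_field))
pattern_db = 20 * np.log10(np.abs(spectrum) / np.abs(spectrum).max() + 1e-12)

# Peak of the pattern sits at broadside (the center of the shifted spectrum).
print(f"boresight level: {pattern_db[n // 2, n // 2]:.1f} dB on a {n}x{n} angular grid")
```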

But the ultimate goal for the LAPS is to perform dynamic, over-the-air tests of future 5G communication systems. Initial validation shows that basic mechanical operation of the LAPS is within the specified design tolerances for still and moving tests to at least 30 GHz. Final validation is ongoing.

Credit: 
National Institute of Standards and Technology (NIST)

Regional health system growth and implications for stroke care

New research shows that stroke patients are increasingly being transferred out of smaller community and rural hospitals and sent to larger medical centers for their care and rehabilitation. While this is a positive sign for patients who need more advanced treatments, the trend has drawbacks in terms of cost and points to the need to improve the coordination of care between hospitals.

"The underlying goal of stroke care is to get the right person to the right hospital at the right time," said University of Rochester Medical Center (URMC) neurologist Benjamin George, M.D., M.P.H., a co-author of the study which appears this month in the journal Neurology. "The findings of this study show that in recent years community-based hospitals are erring on the side of caution and transferring more patients from their emergency departments to larger hospitals. Given the high cost and burden associated with these transfers, striking a balance between cost and need is essential."

A national movement toward the formation of large regional health systems has occurred since the passage of the Affordable Care Act and its emphasis on population-based health management. This has created opportunities and benefits for both small and large hospitals and their patients, such as increased access to specialized care, continuity of care across health systems, and the ability to treat patients closer to home.

In the case of stroke, the creation of regional health networks appears to have accelerated the transfer of these patients away from smaller hospitals. Using national databases, researchers examined transfer patterns for stroke and transient ischemic attack (TIA) patients from emergency departments of hospitals that serve rural or underserved populations to larger teaching hospitals and academic medical centers. They found that national transfer rates had doubled from 2006 to 2014.

In instances where patients require more advanced care, such as a thrombectomy or neuro-critical care, these inter-hospital transfers are necessary and important. The report also notes that transfer rates vary widely, indicating that some hospitals have developed protocols and strong consultation relationships with specialists in stroke centers that allow them to more effectively evaluate and care for patients in their own emergency rooms.

However, in some instances, such as TIA and certain less severe strokes, these transfers do not automatically produce an improvement in care and result in unnecessary costs and inconvenience to patients and their families. The authors speculate that one of the reasons behind the high transfer rates is a general uneasiness on the part of caregivers in smaller hospitals to treat stroke patients. They note that neurological conditions such as stroke represent the largest category of patients transferred between hospitals - two times greater than trauma, the next largest group.

"Understanding the factors associated with this growth in transfers is necessary in order to build a stroke care system that can economically deliver care, optimize outcomes, and minimize unnecessary costs," said University of Florida neurologist Adam Kelly, M.D., a co-author of the study. Kelly was a faculty member at URMC when the study was conducted.

The authors contend that the growth in transfer rates points to the need for better coordination between hospitals to help determine which patients require transfer to stroke centers and those that can be effectively treated where they are. This includes establishing criteria and protocols to evaluate stroke victims and the expanded use of telemedicine - or telestroke - so emergency department providers can consult directly with stroke specialists in other institutions and quickly determine the best course of action.

"Recent advances in care have transformed the way we treat stroke and the partnerships between hospitals that have formed as a result of the creation of regional health systems has expanded access to that care for more patients," said Robert Holloway, M.D., M.P.H., chair of the URMC Department of Neurology and a co-author of the study. "But the goal of these partnerships should be, when appropriate, to deliver care that keeps patients closer to home and this study shows that on a national level there remains work to be done when it comes to stroke patients."

Credit: 
University of Rochester Medical Center

Study shows potential cost savings for early detection and treatment of type 2 diabetes

Health checks including diabetes risk assessment have been introduced in a number of countries. However, there are few population-based trials assessing the benefits, harms and costs of these screening programmes, and these have shown mixed results.

Between 2001 and 2006, a population-based cardiovascular and diabetes screening programme was introduced in five out of sixteen Danish counties. Over 150,000 individuals registered with 181 practices participating in the ADDITION-Denmark study were sent a diabetes risk score questionnaire, and if their score indicated moderate to high risk they were invited to attend for a diabetes test and cardiovascular risk assessment with their family doctor.

More than 27,000 attended for screening, and 1,533 were diagnosed with diabetes. A further 1,760,000 individuals were identified for a matched no-screening control group. Participants were followed for approximately six years following diagnosis until 31 December 2012, when national registers were searched for healthcare usage and healthcare cost.

The researchers found that those individuals with clinically-diagnosed diabetes were identified on average 2.2 years later than individuals whose diabetes was detected in the screening practices. Healthcare costs were significantly lower in the screening group compared with the no-screening control group, with an average annual difference in healthcare costs of €889 per individual with diabetes. The results have just been published in the scientific journal Diabetologia.

Lead author, Camilla Sortsø, says "While trials of population-based screening for type 2 diabetes have not demonstrated beneficial effects at the population level, we have previously shown that there are benefits for those found to have diabetes. This study contributes to our previous research by showing that early detection and treatment among individuals at high risk of type 2 diabetes has the potential to reduce costs."

About the study:

Type of study: A non-randomised register-based study. The basic data originates from the ADDITION study, while the new data has been collected from the Danish Diabetes Database and the National Patient Register.

Partners: Applied Economics and Health Research, Copenhagen, University of Cambridge School of Clinical Medicine, the Danish Diabetes Academy, Odense University Hospital, University of Southern Denmark.

About the ADDITION study

The published results are based on the Danish part of the ADDITION study. The ADDITION study is a Danish-initiated, international study with partners in Cambridge, UK, and Utrecht, the Netherlands.

The study was begun in the year 2000 as a randomised study with the participation of general practitioners from 181 practices in Denmark. All of the general practitioners were trained in systematic early detection of diabetes.

The majority of the general practitioners sent a risk questionnaire to all patients aged 40-69, while a smaller group of general practitioners handed the questionnaire to patients when they visited the doctor for reasons other than diabetes.

In addition, half of the doctors were trained to give intensive treatment consisting of lifestyle changes (healthy diet, exercise, smoking cessation) and preventive medication to lower, for example, blood sugar levels, blood pressure and cholesterol in the blood. The other half of the general practitioners treated patients in accordance with the existing treatment guidelines.

The randomised study showed a 17 per cent decrease in cardiovascular disease after five years - a relevant but not statistically significant difference. Morbidity and mortality in the group that was treated in accordance with the existing treatment guidelines proved to be considerably lower than among patients who were given the diabetes diagnosis without systematic screening.

Torsten Lauritzen, Aarhus University, guarantor for the present study, says: "The present paper is based on a recent publication in Diabetologia (2017) demonstrating that a single round of diabetes screening and cardiovascular risk assessment was associated with a 16% risk reduction in cardiovascular disease and a 21% reduction in all-cause mortality in individuals diagnosed with diabetes between 2001 and 2009."

Credit: 
Aarhus University

Pancreatitis in minorities linked to triglycerides, gallstones, alcohol abuse

MAYWOOD, IL - Pancreatitis in ethnic minorities is linked to very severe levels of triglycerides and the risk is further increased by alcohol abuse and gallstones, a study has found.

Loyola Medicine gastroenterologist Ayokunle Abegunde, MD, is a co-author of the study, published online ahead of print in the journal Endocrine Practice.

Pancreatitis is inflammation in the pancreas, a large gland behind the stomach that produces enzymes that aid in digestion and hormones that help maintain blood sugar balance. Acute pancreatitis is responsible for more than 220,000 hospital admissions in the United States each year and has a mortality rate ranging from 3 to 30 percent.

Triglycerides, a type of fat the body uses for energy, can increase the risk of heart disease. Normal levels are below 150 milligrams per deciliter (mg/dL). Levels higher than 500 mg/dL are considered very high, levels between 1,000 and 1,999 mg/dL are severe and levels above 2,000 mg/dL are very severe.
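Those cutoffs translate directly into a simple classification rule. The sketch below encodes only the thresholds quoted above; categories between 150 and 500 mg/dL are not detailed in this article, so they are lumped together here.

```python
def triglyceride_category(level_mg_dl: float) -> str:
    """Classify a triglyceride level using the cutoffs quoted in the article."""
    if level_mg_dl < 150:
        return "normal"
    if level_mg_dl >= 2000:
        return "very severe"
    if level_mg_dl >= 1000:
        return "severe"
    if level_mg_dl >= 500:
        return "very high"
    return "elevated (150-499 mg/dL; sub-categories not given here)"

for tg in (120, 800, 1500, 2600):
    print(tg, "mg/dL ->", triglyceride_category(tg))
```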

Previous studies have found that severe levels of triglycerides are associated with pancreatitis, but the cutoff level has not been confirmed and there has been limited research on minority populations. The new study is the first to report on pancreatitis, due to severe levels of triglycerides, in a U.S. multiethnic minority population.

Researchers performed a retrospective study of 1,157 adult patients in the Cook County Health & Hospitals System who had triglyceride levels higher than 1,000 mg/dL. The ethnic breakdown was Hispanics, 38.4 percent; African Americans, 31.6 percent; Caucasians, 22.7 percent; Asians, 5.7 percent; and Pacific Islanders, 1.6 percent.

Among all patients, 9.2 percent had pancreatitis. Among patients with triglyceride levels higher than 2,000 mg/dL, there was a 4.3-fold increase in pancreatitis compared to patients with triglyceride levels between 1,000 and 1,999 mg/dL. This finding validates the Endocrine Society's suggested cut-off of 2,000 mg/dL as a risk factor for pancreatitis.

Previous studies have shown that alcohol is another risk factor. The new study found that patients with a history of excessive alcohol intake were four times more likely to develop pancreatitis.

The study confirmed earlier findings that gallstone disease is an additional risk factor. Researchers further found that in patients with pancreatitis, the prevalence of gallstones was significantly higher in women.

Only 2 percent of patients with triglyceride levels below 2,000 mg/dL had pancreatitis, compared to 33.6 percent of patients who had triglyceride levels higher than 2,000 mg/dL and one other risk factor.

The study also found that younger adults were more likely to get pancreatitis. "It is not clear why older patients with similar risk factors were less susceptible to developing acute pancreatitis," researchers wrote.

The study's findings may help physicians assess the pancreatitis risk among patients with severe levels of triglycerides and decide on the urgency and intensity of managing the risk. Early detection and counseling on behavioral risk factors could help reduce the risk of pancreatitis, researchers wrote.

The study is titled "Acute Pancreatitis in Patients with Severe Hypertriglyceridemia in a Multiethnic Minority Population."

Credit: 
Loyola Medicine

When three months from now feels right around the corner

image: Jing Hu is a Ph.D. student in Organizational Behaviour and Human Resource Management at the University of Toronto's Rotman School of Management. She holds an M.A. and B.Sc. from Beijing Normal University. Her research interests include employee's work meaningfulness, time in organizational research, employees' well-being, and cross-culture research. She studies the ups and downs of having a meaningful job and explores the antecedents of work meaningfulness from different levels, including the macro, organizational, and individual level. Also, she studies the topics related to time in organizational research. Moreover, she is interested in the factors that impact employees' well-being. Her work appears in psychology and management journals, including Journal of Experimental Social Psychology, Frontiers in Psychology, and Journal of Personnel Psychology.

Image: 
Rotman School

Toronto - If you've ever noticed yourself thinking about the timing of a plan in two opposing ways -- a plan that feels further away than your actual time calculation suggests -- you're on to something. New research shows our different ways of estimating time don't necessarily move in lock-step.

Relative time estimates refer to how distant or close a future event feels, such as "soon" or "far away." Absolute time estimates, however, use objective units -- days, weeks, months or years -- to describe when an event may occur.

The study from researchers at the University of Toronto's Rotman School of Management revealed that when we consider unknown future events, such as when we'll use a gift certificate, our relative and absolute time estimates tend to contradict each other. I'll use that gift certificate soon, we might think, even though our actual objective time estimate is three months from now.

As well, the frame of mind we bring to the consideration -- whether we're thinking broadly and abstractly, or using more concrete, detail-oriented thinking -- influences which direction our relative and absolute time estimates will flow.

Several experiments showed that abstract vs. concrete thinking tended to yield reverse results. Study participants induced into an abstract frame of mind felt that a personal activity would occur sooner than those thinking about the same activity who were in a concrete frame of mind. However, when asked when the activity would take place in days or weeks, the abstract thinkers gave longer time estimates than the concrete thinkers.

"It reminds me that when I plan for the future, I shouldn't think just about what is the calendar date, but also how I'm looking at it -- whether in an abstract or concrete mindset," said lead study author Jing Hu, a doctoral student in organizational behaviour and human resource management, whose research was partly inspired by her own reflections on making plans to visit family. Sam Maglio, an assistant professor of marketing at the University of Toronto Scarborough, who is cross-appointed to the Rotman School, co-authored the study with her.

In addition to the study's academic contributions, it suggests that frames of mind can affect the urgency we bring to completing tasks and projects. For example, adopting an abstract mindset by thinking about why we should do something rather than how may yield a greater sense of urgency to get it done, even though the actual time when it will occur is further away. That could be applied to a variety of situations, including leadership contexts, said Ms. Hu.

"If the leader creates a big vision for the subordinate, such as why their work is important, the subordinate will think about their work abstractly," she said. "Then, when the subordinate plans their future activities, the timing will feel shorter to them and they will start doing the work sooner because of the temporal pressure."

The study will appear in the May 2018 issue of the Journal of Experimental Social Psychology.

Credit: 
University of Toronto, Rotman School of Management

Women remain less likely to receive high-intensity statins following heart attack

Less than half of women who filled a statin prescription following a heart attack received a high-intensity statin--indicating they continue to be less likely than men to be prescribed this lifesaving treatment, according to a study published today in the Journal of the American College of Cardiology. The persistent gap in heart disease treatment between women and men continues despite similar effectiveness of more-intensive statins for both sexes and recent efforts to reduce sex difference in guideline-recommended treatment.

The 2013 American College of Cardiology/American Heart Association Guideline on the Treatment of Blood Cholesterol to Reduce Atherosclerotic Cardiovascular Risk in Adults recommends the use of high-intensity statin therapy for women and men less than 75 years of age with established heart disease for secondary prevention.

"Prior studies have found that women are less likely than men to receive treatment with statins following a heart attack. Our study shows that even when women receive statins, these continue to be in lower intensities than the guidelines recommend. The underutilization of these drugs in women was not explained by sex differences in demographics, comorbidities or health care utilization," said Sanne A.E. Peters, PhD, a research fellow in epidemiology at the George Institute for Global Health at the University of Oxford and the lead author of the study.

Using the MarketScan and Medicare databases, researchers analyzed data from 88,256 U.S. adults who filled a statin prescription within 30 days after hospital discharge for a heart attack between January 2014 and June 2015. High-intensity doses were the first statin prescription fill following hospital discharge after heart attack for 47 percent of women and 56 percent of men.

Trends in sex differences in high-intensity statin use over time were examined using beneficiaries with the same inclusion criteria between January 2007 and June 2015. Overall, high-intensity statin prescription fills increased from 22 percent to 50 percent in women and from 27 percent to 60 percent in men.
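The underlying claims analysis is not shown in this release. As a hedged illustration of the kind of calculation involved (the share of first post-discharge statin fills that were high-intensity, by sex), the sketch below uses invented rows and column names, not MarketScan or Medicare data.

```python
# Hedged sketch (not the study's code): share of first post-discharge statin fills
# that were high-intensity, split by sex. Rows and column names are invented,
# not MarketScan or Medicare records.
import pandas as pd

fills = pd.DataFrame(
    {
        "patient_id": [1, 2, 3, 4, 5, 6],
        "sex": ["F", "F", "F", "M", "M", "M"],
        "days_from_discharge": [5, 12, 40, 3, 25, 28],
        "statin_intensity": ["high", "moderate", "high", "high", "high", "moderate"],
    }
)

# Keep only fills within 30 days of discharge, mirroring the inclusion window above.
window = fills[fills["days_from_discharge"] <= 30]

share_high = (
    window.assign(is_high=window["statin_intensity"].eq("high"))
          .groupby("sex")["is_high"]
          .mean()
)
print(share_high)  # fraction of qualifying fills that were high-intensity, per sex
```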

Researchers found no evidence of the sex difference in the use of high-intensity statins post heart attack diminishing between 2007 and 2015 or following the publication of the 2013 ACC/AHA cholesterol guideline.

"While we found that the magnitude of the sex difference in the use of high-intensity statins after heart attack was larger among the youngest and oldest patients and among those without comorbidities, women were consistently less intensively treated across a broad range of patient characteristics," Peters said. "This gap between our youngest and oldest patients is concerning because the oldest are at the highest risk and young women have been shown to have the slowest rate of decline in heart disease rates in the United States. The underutilization of high-intensity statins in women can be expected to result in a substantial number of preventable vascular events."

The researchers said clinicians may perceive women who have experienced a heart attack to be at a lower risk of recurrence than their male counterparts. A previous study found that sex disparities in treatment were in part due to clinicians' lower perceived heart disease risk in women. The sex difference in the use of high-intensity statins may be explained by variation at the hospital or health care provider level.

"Clinicians should communicate the benefits of high-intensity statins to their female patients in terms of reducing the risk of another heart attack and discuss possible concerns about side effects," Peters said. "Moreover, clinicians themselves should also be aware of the risk of recurrent heart attack in their female patients and the persistent sex disparity in the utilization of high-intensity statins."

In an accompanying editorial, Annabelle Santos Volgman, MD, FACC, and colleagues at Rush Medical College stated the importance of determining the barriers facing both women and men in receiving guideline-recommended care. They also noted that pathophysiologic differences in disease presentation in women and men may contribute to clinicians treating women less aggressively. Women are more likely to present with nonobstructive disease, which is not benign, but patients and doctors minimize the significance of nonobstructive coronary artery disease, leaving patients undertreated with lower-intensity statins.

They said, "We think sex should matter, as well as age, race and ethnicities when it comes to patient care and adherence to guidelines. Implementation of such sex-specific strategies will improve CVD outcomes for women and by doing so may also improve outcomes for men."

Limitations of the study include that pharmacy claims identified whether a prescription was filled and do not provide information about the actual written prescription, medication adherence or the reason a clinician may have prescribed a lower-intensity statin.

Credit: 
American College of Cardiology

Elevation in buildings can affect the decisions we make

image: Sina Esteky is the lead author from Miami University.

Image: 
Miami University

People rely on financial managers, doctors and lawyers to be as objective as possible when making decisions about investments, health and legal issues, but findings from a new study suggest that an unexpected factor could be influencing these choices.

In a series of experiments, researchers found that people at higher elevations in an office building were more willing to take financial risks. The study is available online in the Journal of Consumer Psychology.

"When you increase elevation, there is a subconscious effect on the sense of power," says lead author Sina Esteky, PhD, an assistant professor of marketing in the business school at Miami University. "This heighted feeling of power results in more risk-seeking behavior."

In a pilot study, the researchers analyzed data from more than 3,000 hedge funds throughout the world that accounted for assets over $500 billion. They correlated the level of volatility of the fund with the floor level of the firm--which ranged from the first to the 96th floor. The researchers found a slight but significant correlation between increased elevation and the volatility of the fund.
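As a toy illustration of that correlation analysis (not the study's data or code), the sketch below generates placeholder fund volatilities with an assumed weak dependence on floor level and computes a Pearson correlation.

```python
# Illustrative only: correlate fund volatility with the floor of the firm's office.
# The values are random placeholders with an assumed weak trend, not the hedge-fund
# data analyzed in the study.
import numpy as np

rng = np.random.default_rng(0)
floor = rng.integers(1, 97, size=3000)                                # floors 1-96
volatility = 0.10 + 0.0005 * floor + rng.normal(0, 0.05, size=3000)   # assumed trend

r = np.corrcoef(floor, volatility)[0, 1]
print(f"Pearson correlation between floor level and fund volatility: r = {r:.3f}")
```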

Esteky's team also conducted experiments in which participants were asked to make a betting decision as they were either ascending or descending in the glass elevator of a tall building. The participants who were going up to the 72nd floor were more likely to opt for the risky lottery that could result in either a small or significant win. Those who were descending preferred the conservative lottery with either a moderate or slightly larger win.

In another experiment, participants were either on the ground floor or third floor of a university building, and they were asked to make 10 decisions with differing levels of risk and payoff. Again, the researchers found that participants on the third floor more frequently chose risky options than their ground floor counterparts. To better understand the reason for this behavior, participants completed a series of unfinished words, and people on the third floor were more likely to create words associated with power than the participants on the ground floor. The researchers believe that increased power-related thoughts might explain how elevation affects risk preferences.

Esteky suggests that elevation of an office building may be one reason certain hedge fund managers are willing to invest in risky assets such as Bitcoin, a cryptocurrency and worldwide payment system that is highly volatile. Although this study was limited to financial decisions, further studies could show whether the subconscious effect of elevation influences other professionals like doctors who are choosing treatment plans for patients, Esteky says. In another experiment, he discovered that people were more open to taking a sensory risk by trying an unfamiliar fruit when they were at a higher elevation in a building.

Although the implications of these findings could be unsettling for consumers who are relying on themselves or paid experts to make rational choices, the elevation effect vanished when participants were informed that floor level influences behavior, Esteky explains. The effect also disappeared when people could not see that they were on a higher floor level, such as those in cubicles without a window view.

"The important lesson is that when people become aware of the potential impact of elevation, it doesn't happen anymore," Esteky says. "The brain is very susceptive to subtle situational factors, but also really good at correcting for such effects, so awareness can help us be more rational in our decisions."

Credit: 
Society for Consumer Psychology

Structure of a protein complex related with cell survival revealed

image: R2TP structure.

Image: 
Spanish National Cancer Research Centre (CNIO)

A team from the Spanish National Cancer Research Centre (CNIO) has determined for the first time the high-resolution structure of a complex (R2TP) involved in key processes for cell survival and in diseases such as cancer. This achievement has been made possible by using high-resolution cryo-electron microscopy, a technique brought to the CNIO thanks to Óscar Llorca, director of the Structural Biology Programme and lead author on the paper published in Nature Communications.

In 2017, the Nobel Prize for Chemistry was awarded to three scientists (Jacques Dubochet, Joachim Frank, and Richard Henderson) for their work on the development of cryo-electron microscopy. This technique can capture images of individual molecules, which are used to determine their structure and to ascertain biological processes in atomic detail.

Óscar Llorca and his team have used this technique to learn about the structure and functioning of a complex system called R2TP, which is involved in various key processes for cell survival such as the activation of the kinases mTOR, ATR and ATM, proteins that are the target of various cancer drugs currently being developed.

This work is the result of a collaborative project with the research group led by Professor Laurence H. Pearl at Sussex University in the UK, and has also involved the CSIC Centre for Biological Research (CIB), CIC bioGUNE, and the Laboratory of Molecular Biology in Cambridge, UK.

Complex and versatile 'assembly machinery'

mTOR, ATR and other related kinases do not work in isolation but rather by interacting and forming complexes with other proteins, which are essential for their normal functioning. The assembly of these structures with multiple components does not take place in cells spontaneously. The R2TP system and the HSP90 chaperone are crucial for the assembly and activation of mTOR and other related kinases, but how this happens in cells is still somewhat of a mystery. "If we understand this assembly pathway - explains the researcher - we will be able to identify new ways of targeting the activity of these kinases".

Thanks to cryo-electron microscopy, "we have been able to visualise, for the first time, the high-resolution structure of the human R2TP system", states Llorca. What has most surprised the researchers is the unexpected complexity of the human R2TP system, compared to its yeast homologues. The microscope images show that R2TP is a large platform capable of putting HSP90 in contact with the kinases on which HSP90 must act. When viewed under the microscope, R2TP looks like a jellyfish with three very flexible 'tentacles' made up of RPAP3 protein. The kinases of the mTOR family are recruited to the base of this jellyfish's 'head', while HSP90 is hooked by the tentacles and taken to the kinases, thanks to their flexibility.

"This first observation of the human R2TP system has allowed us to understand its structure and functioning mechanisms, which were previously unknown. Our next steps will be to study the details of how R2TP and HSP90 are able to assemble the complexes made up of kinases of the mTOR family, in order to find ways of interfering with these processes", concludes Llorca. "The R2TP system is also involved in the activation of other essential molecules for the cell and in the development of cancer, such as the RNA polymerase, telomerase, or the 'splicing' system, areas that we intend to explore in the future".

Credit: 
Centro Nacional de Investigaciones Oncológicas (CNIO)

Artificial antimicrobial peptides could help overcome drug-resistant bacteria

CAMBRIDGE, MA -- During the past several years, many strains of bacteria have become resistant to existing antibiotics, and very few new drugs have been added to the antibiotic arsenal.

To help combat this growing public health problem, some scientists are exploring antimicrobial peptides -- naturally occurring peptides found in most organisms. Most of these are not powerful enough to fight off infections in humans, so researchers are trying to come up with new, more potent versions.

Researchers at MIT and the Catholic University of Brasilia have now developed a streamlined approach to developing such drugs. Their new strategy, which relies on a computer algorithm that mimics the natural process of evolution, has already yielded one potential drug candidate that successfully killed bacteria in mice.

"We can use computers to do a lot of the work for us, as a discovery tool of new antimicrobial peptide sequences," says Cesar de la Fuente-Nunez, an MIT postdoc and Areces Foundation Fellow. "This computational approach is much more cost-effective and much more time-effective."

De la Fuente-Nunez and Octavio Franco of the Catholic University of Brasilia and the Dom Bosco Catholic University are the corresponding authors of the paper, which appears in the April 16 issue of Nature Communications. Timothy Lu, an MIT associate professor of electrical engineering and computer science, and of biological engineering, is also an author.

Artificial peptides

Antimicrobial peptides kill microbes in many different ways. They enter microbial cells by damaging their membranes, and once inside, they can disrupt cellular targets such as DNA, RNA, and proteins.

In their search for more powerful, artificial antimicrobial peptides, scientists typically synthesize hundreds of new variants, which is a laborious and time-consuming process, and then test them against different types of bacteria.

De la Fuente-Nunez and his colleagues wanted to find a way to make computers do most of the design work. To achieve that, the researchers created a computer algorithm that incorporates the same principles as Darwin's theory of natural selection. The algorithm can start with any peptide sequence, generate thousands of variants, and test them for the desired traits that the researchers have specified.

"By using this approach, we were able to explore many, many more peptides than if we had done this manually. Then we only had to screen a tiny fraction of the entirety of the sequences that the computer was able to browse through," de la Fuente-Nunez says.

In this study, the researchers began with an antimicrobial peptide found in the seeds of the guava plant. This peptide, known as Pg-AMP1, has only weak antimicrobial activity. The researchers told the algorithm to come up with peptide sequences with two features that help peptides to penetrate bacterial membranes: a tendency to form alpha helices and a certain level of hydrophobicity.

After the algorithm generated and evaluated tens of thousands of peptide sequences, the researchers synthesized the 100 most promising candidates to test against bacteria grown in lab dishes. The top performer, known as guavanin 2, contains 20 amino acids. Unlike the original Pg-AMP1 peptide, which is rich in the amino acid glycine, guavanin 2 is rich in arginine but contains only one glycine residue.
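To make the idea of an evolution-mimicking search concrete, here is a minimal sketch of how such a loop over peptide sequences could look. It is an illustration only: the seed sequence is an arbitrary placeholder (not Pg-AMP1), and the scoring uses crude stand-ins for the two properties described above (the Kyte-Doolittle hydropathy scale and a simple set of helix-favoring residues), not the scoring or algorithm used in the actual study.

```python
# Toy evolutionary search over peptide sequences (illustrative only).
import random

AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"

# Kyte-Doolittle hydropathy scale (positive = more hydrophobic).
KD = {"A": 1.8, "R": -4.5, "N": -3.5, "D": -3.5, "C": 2.5, "Q": -3.5,
      "E": -3.5, "G": -0.4, "H": -3.2, "I": 4.5, "L": 3.8, "K": -3.9,
      "M": 1.9, "F": 2.8, "P": -1.6, "S": -0.8, "T": -0.7, "W": -0.9,
      "Y": -1.3, "V": 4.2}

# Residues often treated as helix-favoring; a rough proxy for helicity.
HELIX_FAVORING = set("ALMEKQR")

def fitness(seq, target_hydro=1.0):
    """Score a peptide: helicity proxy plus closeness to a target mean hydropathy."""
    helicity = sum(aa in HELIX_FAVORING for aa in seq) / len(seq)
    mean_hydro = sum(KD[aa] for aa in seq) / len(seq)
    return helicity - abs(mean_hydro - target_hydro)

def mutate(seq, rate=0.1):
    """Point-mutate each position with the given probability."""
    return "".join(random.choice(AMINO_ACIDS) if random.random() < rate else aa
                   for aa in seq)

def crossover(a, b):
    """Single-point crossover between two equal-length parent sequences."""
    point = random.randint(1, len(a) - 1)
    return a[:point] + b[point:]

def evolve(seed, generations=200, pop_size=100, elite=10):
    """Evolve variants of a seed peptide toward higher fitness."""
    population = [mutate(seed, rate=0.3) for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=fitness, reverse=True)
        parents = population[:elite]
        children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                    for _ in range(pop_size - elite)]
        population = parents + children
    return sorted(population, key=fitness, reverse=True)[:5]

if __name__ == "__main__":
    seed = "GIWSSIKNLASKAWNSDIGQ"  # arbitrary 20-residue placeholder, not Pg-AMP1
    for candidate in evolve(seed):
        print(candidate, round(fitness(candidate), 3))
```

In practice only the handful of top-scoring sequences from a search like this would be synthesized and tested in the lab, which is the cost saving the researchers describe.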

More powerful

These differences make guavanin 2 much more potent, especially against Gram-negative bacteria, which include many of the species responsible for the most common hospital-acquired infections, such as pneumonia and urinary tract infections.

The researchers tested guavanin 2 in mice with a skin infection caused by the Gram-negative bacterium Pseudomonas aeruginosa, and found that it cleared the infection much more effectively than the original Pg-AMP1 peptide.

"This work is important because new types of antibiotics are needed to overcome the growing problem of antibiotic resistance," says Mikhail Shapiro, an assistant professor of chemical engineering at Caltech, who was not involved in the study. "The authors take an innovative approach to this problem by computationally designing antimicrobial peptides using an 'in silico' evolutionary algorithm, which scores new peptides based on a set of properties known to be correlated with effectiveness. They also include an impressive array of experiments to show that the resulting peptides indeed have the properties needed to serve as antibiotics, and that they work in at least one mouse model of infections."

De la Fuente-Nunez and his colleagues now plan to further develop guavanin 2 for potential human use, and they also plan to use their algorithm to seek other potent antimicrobial peptides. There are currently no artificial antimicrobial peptides approved for use in human patients.

"A report commissioned by the British government estimates that antibiotic-resistant bacteria will kill 10 million people per year by the year 2050, so coming up with new methods to generate antimicrobials is of huge interest, both from a scientific perspective and also from a global health perspective," de la Fuente-Nunez says.

Credit: 
Massachusetts Institute of Technology

Logging in tropical forests jeopardizing drinking water

image: River plume, full of sediment from upstream logging activity, draining into the sea in Western Province, Solomon Islands.

Image: 
Wade Fairley

SOLOMON ISLANDS (April 16, 2018) - Globally, remaining tropical forests are being rapidly cleared, particularly in countries like the Solomon Islands, where commercial logging accounts for about 18 percent of government revenue and at least 60 percent of exports while providing the largest number of formal-sector jobs. However, the loss of native forests has huge ecological and social consequences, many of which are poorly documented.

A team of researchers from The University of Queensland (UQ), Wildlife Conservation Society (WCS), and other groups has found that increasing land clearing for logging in Solomon Islands, even with best management strategies in place, will lead to unsustainable levels of soil erosion and significant impacts on downstream water quality.

Combined, these impacts will compromise the integrity of the land for future agricultural uses, interrupt access to clean drinking water and degrade important downstream ecosystems.

The researchers published the results of the study in the journal Environmental Research Letters.

The work focused on Kolombangara Island, where efforts are underway to create a national park to safeguard unlogged forests above 400 meters that have both cultural and ecological significance. This effort is being led by the Kolombangara Island Biodiversity Conservation Association (KIBCA), a community-based organisation focused on conserving the island's rich marine and terrestrial biodiversity. The declaration of a protected area would add significant levels of legal protection and explicit controls over land clearing.

UQ School of Earth and Environmental Sciences Postdoctoral Research Fellow Dr. Amelia Wenger said the research can provide insight into the full range of impacts from logging activities, which are often not taken into consideration.

"When land-clearing extent reached 40 percent in our models, international standards for safe drinking water were exceeded nearly 40 percent of the time, even if best practices for logging were followed. Loss of the upland forest will compromise local access to clean water essential for drinking, bathing, and household washing," said Wenger.

Findings of this study are being used by KIBCA to communicate to island residents the potential impacts that could occur as a result of logging if the forest was not protected.

KIBCA coordinator Ferguson Vaghi said: "Previously people in Solomon Islands made decisions about logging from a selfish economic perspective. This study highlights that we also need to consider the impacts to the downstream environment."

More broadly, the findings demonstrate that national policies for logging practice must explicitly link soil erosion reduction strategies to natural and ecological thresholds, otherwise they will be ineffective at minimizing impacts.

WCS Melanesia Director Dr. Stacy Jupiter concurs: "Saving tropical forests worldwide depends upon tighter regulation of national laws and policies, as well as local buy-in for forest management. This study nicely illustrates why we need to take action now to protect the world's remaining intact forest landscapes in order to preserve their biodiversity and important ecosystem services for people."

Credit: 
Wildlife Conservation Society

Hello DARKNESS

Somewhere in the vastness of the universe another habitable planet likely exists. And it may not be that far -- astronomically speaking -- from our own solar system.

Distinguishing that planet's light from that of its star, however, can be problematic. But an international team led by UC Santa Barbara physicist Benjamin Mazin has developed a new instrument to detect planets around the nearest stars. It is the world's largest and most advanced superconducting camera. The team's work appears in the journal Publications of the Astronomical Society of the Pacific.

The group, which includes Dimitri Mawet of the California Institute of Technology and Eugene Serabyn of the Jet Propulsion Laboratory in Pasadena, California, created a device named DARKNESS (the DARK-speckle Near-infrared Energy-resolved Superconducting Spectrophotometer), the first 10,000-pixel integral field spectrograph designed to overcome the limitations of traditional semiconductor detectors. It employs Microwave Kinetic Inductance Detectors that, in conjunction with a large telescope and an adaptive optics system, enable direct imaging of planets around nearby stars.

"Taking a picture of an exoplanet is extremely challenging because the star is much brighter than the planet, and the planet is very close to the star," said Mazin, who holds the Worster Chair in Experimental Physics at UCSB.

Funded by the National Science Foundation, DARKNESS is an attempt to overcome some of the technical barriers to detecting planets. It can take the equivalent of thousands of frames per second without any read noise or dark current, which are among the primary sources of error in other instruments. It also has the ability to determine the wavelength and arrival time of every photon. This time domain information is important for distinguishing a planet from scattered or refracted light called speckles.
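The value of tagging every photon in time can be illustrated with a toy calculation: speckles flicker rapidly from one short time bin to the next, while a planet contributes a steady trickle of photons, so a pixel that contains a planet almost never goes completely dark. The photon rates, the speckle statistics, and the simple samplers below are illustrative assumptions, not the DARKNESS data-reduction pipeline.

```python
# Toy demonstration of time-domain photon statistics for speckle discrimination.
import math
import random

def poisson(lam):
    """Simple Poisson sampler (Knuth's method), adequate for small rates."""
    limit, k, p = math.exp(-lam), 0, 1.0
    while p > limit:
        k += 1
        p *= random.random()
    return max(k - 1, 0)

def photon_counts(mean_speckle, planet_rate, n_bins=100000):
    """Photon counts per short time bin for a flickering speckle plus an
    optional steady planet signal (rates in photons per bin)."""
    counts = []
    for _ in range(n_bins):
        speckle_rate = random.expovariate(1.0 / mean_speckle)  # speckle flickers bin to bin
        counts.append(poisson(speckle_rate + planet_rate))
    return counts

def dark_fraction(counts):
    """Fraction of time bins with zero detected photons."""
    return sum(c == 0 for c in counts) / len(counts)

if __name__ == "__main__":
    speckle_only = photon_counts(mean_speckle=0.5, planet_rate=0.0)
    with_planet = photon_counts(mean_speckle=0.5, planet_rate=0.3)
    print("dark fraction, speckle only :", round(dark_fraction(speckle_only), 3))
    print("dark fraction, with planet  :", round(dark_fraction(with_planet), 3))
```

A pixel hosting a steady planetary signal shows a noticeably smaller "dark fraction" than a speckle-only pixel, which is the kind of time-domain discrimination that per-photon arrival times make possible.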

"This technology will lower the contrast floor so that we can detect fainter planets," Mazin explained. "We hope to approach the photon noise limit, which will give us contrast ratios close to 10-8, allowing us to see planets 100 million times fainter than the star. At those contrast levels, we can see some planets in reflected light, which opens up a whole new domain of planets to explore. The really exciting thing is that this is a technology pathfinder for the next generation of telescopes."

Designed for the 200-inch Hale telescope at the Palomar Observatory near San Diego, California, DARKNESS acts as both the science camera and a focal-plane wave-front sensor, quickly measuring the light and then sending a signal back to a rubber mirror that can reshape itself 2,000 times a second. This process cleans up the atmospheric distortion that causes stars to twinkle, suppressing the starlight and enabling higher contrast between the star and the planet.
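A minimal sketch of such a closed correction loop is shown below: at each step of a 2 kHz loop the camera "measures" the residual distortion and the deformable mirror is nudged to cancel it. The random-walk turbulence model, the gain, and the leak factor are illustrative assumptions, not the DARKNESS control software.

```python
# Toy 2 kHz adaptive-optics correction loop (leaky integrator controller).
import random

LOOP_RATE_HZ = 2000      # mirror reshaped 2,000 times per second
GAIN = 0.4               # fraction of the measured residual corrected each step
LEAK = 0.99              # leaky integrator keeps the mirror command bounded

def run_loop(n_steps=LOOP_RATE_HZ):   # simulate one second of operation
    atmosphere = 0.0
    mirror = 0.0
    residuals = []
    for _ in range(n_steps):
        # Atmospheric distortion drifts as a random walk (toy model).
        atmosphere += random.gauss(0.0, 0.02)
        # Residual error seen at the focal plane after the mirror's correction.
        residual = atmosphere - mirror
        residuals.append(residual)
        # Integrator update: nudge the mirror toward cancelling the residual.
        mirror = LEAK * mirror + GAIN * residual
    return residuals

if __name__ == "__main__":
    residuals = run_loop()
    rms = (sum(r * r for r in residuals) / len(residuals)) ** 0.5
    print(f"closed-loop RMS residual over one second: {rms:.3f} (arbitrary units)")
```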

During the past year and a half, the team has employed DARKNESS on four runs at Palomar to work out bugs. The researchers will return in May to take more data on certain planets and to demonstrate their progress in improving the contrast ratio.

"Our hope is that one day we will be able to build an instrument for the Thirty Meter Telescope planned for Mauna Kea on the island of Hawaii or La Palma," Mazin said. "With that, we'll be able to take pictures of planets in the habitable zones of nearby low mass stars and look for life in their atmospheres. That's the long-term goal and this is an important step toward that."

Credit: 
University of California - Santa Barbara