Culture

Big cities breed partners in crime

image: As cities grow, crime grows disproportionately faster.

Image: 
Rafael de Nadai

EVANSTON, Ill. -- Your intuition is correct. You are more likely to become a victim of a crime if you live in a big city.

Researchers have long known that bigger cities disproportionately generate more crime. Now a new study from Northwestern University and the Santa Fe Institute explains why: It's easier for criminals to find collaborators.

"In a big city, you have the potential to meet more distinct people each day," said Northwestern's Daniel Abrams, senior author of the study. "You're more likely to find an appropriate partner to start a business or invent something. But perhaps you're also more likely to find the partner you need to commit a burglary."

The study was published this week (Sept. 17) in the journal Physical Review E.

Abrams is an associate professor of engineering sciences and applied mathematics in Northwestern's McCormick School of Engineering. Vicky Chuqiao Yang, a postdoctoral fellow at the Santa Fe Institute and a former Ph.D. student in Abrams' lab, is the paper's first author.

As cities grow, crime grows even faster. And certain types of crime -- such as robbery, car theft and murder -- disproportionately outpace population growth.

"If you double the size of a city, you don't just double the amount of crime," Abrams said. "You actually more than double it."
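The "more than double" claim describes superlinear power-law scaling, in which a city-level quantity grows as population raised to an exponent greater than one. A minimal sketch of the arithmetic, assuming an illustrative exponent of 1.16 (a value often quoted in the urban-scaling literature, not a figure taken from this study):

```python
# Toy illustration of superlinear scaling: quantity = c * N**beta with beta > 1.
# The exponent below is an assumed, illustrative value, not one from the study.

def scaled_quantity(population: float, c: float = 1.0, beta: float = 1.16) -> float:
    """Power-law scaling of a city-level quantity with population."""
    return c * population ** beta

small_city = scaled_quantity(1_000_000)
doubled_city = scaled_quantity(2_000_000)

# The ratio depends only on beta: it equals 2**beta, which exceeds 2 whenever beta > 1.
ratio = doubled_city / small_city
print(f"Doubling population multiplies the quantity by {ratio:.2f}x")
```

With beta = 1.16, doubling the population multiplies the quantity by about 2.23, matching the intuition that crime "more than doubles."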

To learn why, Abrams and Yang collaborated with Northwestern sociology professor Andrew Papachristos, who studies social networks in urban neighborhoods. The researchers developed a new mathematical model that predicted the number of crimes as a function of social interactions.

To do this, they used data from the FBI, Chicago Police Department and the National Incident-Based Reporting System. The team was specifically interested in co-arrest records -- or records in which multiple people were arrested for the same crime -- across seven categories: robbery, motor vehicle theft, murder, aggravated assault, burglary, larceny-theft and rape. Of these categories, only rape grew linearly, at roughly the same pace as a city's population. The other crimes are more "social" in nature and often require a team effort.

Although this project focuses on crime data, the researchers stress that their model also applies to positive outcomes. For example, big cities also produce more patents, more small businesses and more income per resident.

"The world is becoming very quickly urbanized," Abrams said. "There is a huge mass migration to cities from rural areas. It's important to understand what good and bad effects come with that. In order to promote the good side and reduce crime, we have to understand why it occurs."

Credit: 
Northwestern University

Wild African buffalo provide key insights into the genetics of TB resistance

Morris Animal Foundation-funded researchers at Oregon State University discovered areas in the African buffalo genome linked to risk for TB infection. Their findings also demonstrate the complex interplay between host immune responses and the spread of infectious disease.

The researchers discovered areas of genetic code variation near genes associated with immunity to bacterial infections and demonstrated that these variations correlated with TB resistance. The team published their findings in the Proceedings of the Royal Society B.

TB is an important disease of both people and animals. The disease is caused by members of the genus Mycobacterium, has a global distribution and affects many different types of animals. Controlling the spread of infection is a considerable challenge for wildlife managers.

"We wanted to know more about the genetic basis of resistance to TB and African buffalo are a great model to study," said Dr. Hannah Tavalire, now a Postdoctoral Research Scholar at the University of Oregon and lead author of the paper. "This study is one of the first looking at immunity to TB in the wild and how it affects herd dynamics, including fitness. Finding a genetic underpinning in individuals who are disease resistant could help us do a better job managing the disease in a real-world setting."

Dr. Tavalire had access to DNA samples and recorded health outcomes from a well-studied herd of African buffalo. Using DNA sequencing technology, Dr. Tavalire looked for genetic regions associated with a longer interval before development of TB, a sign of disease resistance. She identified two areas that were associated with delayed infection. Further analysis demonstrated these regions were near genes known to be important in fighting bacterial infections.

The team's findings dovetail with another paper published by the group last year, which examined the impact of resistance on overall fitness. Although beneficial in preventing infection in younger animals, the team found that resistance might come at a cost once animals become infected later in their lives.

"There are genetic variants that protect a buffalo to some degree from acquiring TB, but there's a big trade-off in fitness once they get the disease," said Dr. Anna Jolles, Associate Professor of Epidemiology at Oregon State University. "You get this interesting dynamic. Being resistant is great for an animal if TB exposure is low, but if exposure is high and they get the disease despite being somewhat resistant, their outlook is very poor. There is this feedback in the population that may lead to perpetuation of both resistant and susceptible individuals."

This paradoxical situation emphasizes that simple test and cull protocols for controlling TB in wild animals may have undesired genetic consequences for the long-term health of a population. Drs. Tavalire and Jolles note that the situation is complex but they hope their findings will translate to other ecosystems.

"Tuberculosis is a huge problem for many wildlife species," said Dr. Janet Patterson-Kane, Morris Animal Foundation Chief Scientific Officer. "The more we learn about how this disease tenaciously persists in animal populations the more we will be able to help veterinarians, managers and public health officials make rational control decisions to limit spread of the disease."

Credit: 
Morris Animal Foundation

McLean successfully integrates spirituality and religion with mental health treatment

image: McLean's Spiritual Psychotherapy for Inpatient, Residential & Intensive Treatment (SPIRIT) program is an optional program that integrates religion and spirituality with mental health treatment.

Image: 
McLean Hospital

In a paper published in the latest issue of the American Journal of Psychotherapy, McLean Hospital clinicians describe the success of the hospital’s Spiritual Psychotherapy for Inpatient, Residential & Intensive Treatment (SPIRIT) program. The clinicians drew their conclusions from a sample of nearly 1,500 patients.

SPIRIT is an optional program that is available to all patients who come to McLean each year for mental health care. It integrates religion and spirituality with mental health treatment. According to the paper, many patients have cited the program as a major factor in their successful treatment.

The paper’s author, David H. Rosmarin, PhD, ABPP, said that “the statistical majority of our patients want to address spirituality in their mental health care, but few clinicians have any training in this area.” To address this need, Rosmarin and his colleagues developed SPIRIT. Its creation was aided by funding from the John Templeton Foundation’s Bridges Consortium.

Rosmarin, director of McLean’s Spirituality and Mental Health Program, reported that SPIRIT offers “a flexible protocol that gives clinicians the tools they need to address a fair amount of spiritual needs.” The protocol, he said, “can be used with any kind of patient, including those with anxiety, mood, substance use, or psychotic disorders; those in inpatient, residential, and intensive programs; and with patients who do or do not have religious backgrounds.”

Paper co-author Brent P. Forester, MD, MSc, said that integrating religion and spirituality with mental health is gaining acceptance among clinicians. “Historically, there has been an unease and tension in the field of psychiatry when it comes to the topic of religion,” he said. “But we have come a long way towards an understanding that spirituality and religion are distinct and that there may be benefits towards asking about the role that spirituality plays in one’s mental health.”

This study, Forester stated, “provides evidence that integrating spiritually informed interventions into standard psychiatric care is well accepted and greatly helps individuals who are struggling with various forms of psychiatric illness.”

Central to the program are group sessions to help patients explore how their spirituality or religion can be incorporated into their treatment plan and help bring about emotional change. SPIRIT participants also receive handouts about a variety of topics, including ways that prayer, meditation, and sacred verses can work with mental health treatment plans.

McLean’s Hilary S. Connery, MD, PhD, believes that SPIRIT can be particularly helpful for people with highly stigmatized illnesses. Often, she said, these illnesses “place sick persons in circumstances that cause them—and their families and sometimes even health care providers—to question whether or not they have done something ‘wrong’ or ‘evil’ that has directly resulted in their illness. Their recovery is fraught with shame and guilt, often disproportionate to their actual illness onset and development. This shame and guilt fosters treatment avoidance.”

However, Connery asserted that “spiritual interventions can be lifesaving for such individuals in correcting inappropriately personalized experiences of ill health, thus allowing them to greatly improve treatment engagement, adherence, and positive outcomes, and to reduce risk for suicide and other stress-related self-harm behaviors.”

Forester, chief of McLean’s Division of Geriatric Psychiatry, believes that seniors may benefit from programs like SPIRIT. “Older adults, in particular, suffer from loneliness, anxiety, and depression and face significant medical challenges,” he said. “This population, therefore, may greatly benefit from assessing attitudes about spirituality and the role of spirituality in one’s life.”

Regardless of age or specific mental health issue, the SPIRIT program has proved to be popular with patients. Rosmarin cited recent studies in which more than 80% of McLean patients report using religion to cope with stress, and 58.2% say they want spirituality as part of their care.

In addition, interviews with patients after treatment have revealed their positive feelings about SPIRIT. “We’ve never had an adverse event or received negative feedback,” Rosmarin said.

SPIRIT is not only popular. It is also effective. “One of the biggest issues in psychiatry is that people don’t complete their treatment or don’t come in the first place,” Rosmarin said. “SPIRIT has helped people stay on track.”

Moreover, he said, “There is a lot of evidence that many people utilize spirituality to cope with stress and other issues.” For example, he stated, “Spiritually based 12-step programs for substance use disorders are by far the most common pathway to recovery in this country.”

Rosmarin hopes the paper will lead to more institutions adopting programs like McLean’s. “SPIRIT could be syndicated regionally or nationally,” he said. “I see no reason why SPIRIT can’t be running at other hospitals. It’s easy to train clinicians, patients want this, and the program works. The proof is in the pudding.”

Journal

American Journal of Psychotherapy

Credit: 
McLean Hospital

Religious hospitals often fail to supply adequate family planning training

AURORA, Colo. (Sept. 19, 2019) - Nearly half of all Catholic and other religious hospitals fail to comply with required abortion and family planning training for obstetrics and gynecology residents, putting women at potential risk, according to a new study from the University of Colorado Anschutz Medical Campus.

The study was published online this month in the American Journal of Obstetrics & Gynecology.

While the Catholic Church's opposition to abortion is well known, certain kinds of family planning training are required for all Ob/Gyn doctors in residency. This has been true since 1996, when the Accreditation Council for Graduate Medical Education (ACGME) recognized that abortion and family planning are essential parts of Ob/Gyn residency programs.

Yet the study shows that many religious hospitals neglect that training, putting their residents behind peers at other hospitals. One fifth of the programs surveyed said they didn't believe all of their residents would complete the required 20 D&Cs (dilation & curettage) before graduation, suggesting possible limitations to their miscarriage training.

"These residents are clearly not getting sufficient training and then going into other settings where this kind of experience is necessary to care for their patients," said the study's lead author Maryam Guiahi, MD, associate professor of obstetrics and gynecology at the University of Colorado School of Medicine. "Some of them need a catch-up period."

The researchers identified 30 of 278 accredited 2017-2018 programs where at least 70% of resident time is spent in faith-based hospitals that restrict family planning services. These sites account for 11% of all Ob/Gyn training programs. Educational leaders at these programs were asked about education and family planning training. Twenty-five programs responded - an 83% response rate - of which 76% were Catholic hospitals.

All of the programs reported adequate contraceptive training, although 47% of the Catholic hospitals relied on off-site locations, and some Catholic programs expressed concerns about training in copper IUD placement because the Church views this method as an abortifacient despite evidence to the contrary. Respondents from Catholic hospitals most commonly cited inadequate training in postpartum tubal ligations. About half of the hospitals offered abortion training as part of the curriculum, as ACGME requires; 32% offered residents the chance to arrange training on their own, and 12% did not offer it at all. Most relied on off-site collaborations.

Five Catholic programs reported that their residents did not meet the requirement of 20 D&Cs. One third reported being cited by the Residency Review Committee for their lack of adequate family planning training. Almost half (47%) reported the abortion training they provided to their residents was "poor."

"Even though many Catholic and other faith-based hospitals have developed strategies to respond to institutional restrictions," Guiahi said, "the deficiencies are ongoing."

Guiahi believes the Obstetrics and Gynecology Residency Review Committee may have to consider better measures to flag inadequate family planning training, especially at Catholic sites. These would include training in the placement of IUDs, postpartum sterilizations and improved abortion training assessments.

"If adequate training is not ensured," Guiahi said, "the harm will ultimately fall on the women who seek out these physicians when they need these common reproductive services."

Credit: 
University of Colorado Anschutz Medical Campus

Latest issue of Alzheimer's & Dementia

image: Alzheimer's & Dementia: The Journal of the Alzheimer's Association September 2019 issue.

Image: 
Alzheimer's Association

CHICAGO - A new study finds that individuals thought to be at increased risk of developing Alzheimer's disease who also have high blood pressure and a higher risk of developing heart disease (based on a Framingham risk score) may show a faster accumulation of tau tangles, one of the hallmarks of Alzheimer's disease. The authors conclude that risk factors including high blood pressure, high cholesterol levels and weight should be monitored from midlife onward to detect early changes that may be related to Alzheimer's.

The study in the September 2019 issue of Alzheimer's & Dementia: The Journal of the Alzheimer's Association, looked at the association between vascular factors - including high blood pressure and elevated cholesterol - and brain health in individuals who were cognitively normal but had abnormal amyloid and tau levels. The ability to check for changes in the brain before symptoms of dementia appear is important for the development of strategies to reduce the risk in individuals at increased risk for Alzheimer's and other dementias. It may also be useful for predicting who might be at greatest risk for developing dementia in the future.

Link: Vascular Risk Factors are Associated with Longitudinal Changes in Cerebrospinal Fluid Tau Markers

Other articles published in September include:

Link: The Alzheimer's Disease Exposome

In a Theoretical Article, researchers from the University of Southern California and Duke University propose a new roadmap to look at the complexity of Alzheimer's disease. Borrowing a term from the cancer world, the researchers "propose the 'Alzheimer's disease exposome' to address the major gaps in understanding environmental contributions to the genetic and nongenetic risk of Alzheimer's disease and other dementias." The researchers suggest a comprehensive assessment of all the modifiable environmental factors associated with cognitive aging and Alzheimer's disease, including diet, exercise, air pollution, socio-economic status, toxins, traumatic brain injuries, stress, inflammation, infections, hypertension, cells and gender. The authors suggest that using this new lens through which to view the disease will result in new research directions and therapy targets.

Link: Longitudinal Analysis of Dementia Diagnosis and Specialty Care Among Racially Diverse Medicare Beneficiaries

Most individuals with dementia do not receive a diagnosis or their medical care from a specialist, according to a new study. Researchers from the University of Southern California, Johns Hopkins and the University of Washington used Medicare data to review dementia diagnoses of 226,604 individuals. They found that 85% were diagnosed by a non-dementia specialist physician, usually a primary care doctor, and received an unspecified dementia diagnosis. The use of dementia specialty care was especially low among Hispanics and Asian-Americans. Dementia specialists include neurologists, geriatricians, psychiatrists and neuropsychiatrists. The authors conclude, "it is unknown whether low rate of dementia specialty care lead to worse outcomes. A critical next step is to investigate the links between these patterns of diagnosis, health care use and outcomes in the U.S. population."

Credit: 
Alzheimer's Association

Researchers relate neuropsychological tests with real-life activity in multiple sclerosis

image: Dr. Weber is a research scientist in the Center for Traumatic Brain Injury Research at Kessler Foundation.

Image: 
Kessler Foundation/Jody Banks

East Hanover, NJ. September 19, 2019. A recent review by Kessler Foundation researchers explored the relationship between neuropsychological assessment and predictability of performance of everyday life activities among people with multiple sclerosis (MS). The article, "Beyond cognitive dysfunction: Relevance of ecological validity of neuropsychological tests in multiple sclerosis," was epublished on August 30, 2019 by the Multiple Sclerosis Journal in a special issue on rehabilitation.

The authors are Erica Weber, PhD, and John DeLuca, PhD, of Kessler Foundation, and Yael Goverover, PhD, of New York University, who is a visiting scientist at Kessler Foundation. Drs. Weber and Goverover are former Switzer fellows. The Switzer Fellowship is awarded by the National Institute on Disability, Independent Living, and Rehabilitation Research to postdoctoral fellows with the potential to change the lives of people with disabilities through their research.

Link to abstract: https://doi.org/10.1177%2F1352458519860318

Individuals with neurological diseases such as MS often undergo neuropsychological testing to evaluate how cognitive deficits affect their ability to perform everyday life activities. For testing to be a useful tool for the clinicians who care for these individuals, performance on neuropsychological tests must parallel function in everyday life. The Kessler team examined this issue, as well as the broader context for the question: "Are neuropsychological tests ecologically valid?"

The authors examined the literature on the relationships between cognitive and functional domains in the MS population. Cognitive functions included processing speed, executive function, visuospatial function, learning and memory, working memory, and verbal fluency. Functional domains included driving, employment, internet shopping and financial/medical decision-making.

They found that neuropsychological tests do have predictive value for individuals' behavior in these real life settings, according to Dr. Weber, research scientist in the Center for Traumatic Brain Injury Research. "While neuropsychological tests are ecologically valid in this population, other measures need to be considered in the clinical evaluation of individuals with MS, in order to understand the impact of the disease on everyday function," she explained. "Everyday life is complex, and there is no single measure for predicting the performance of complex daily activities. This is especially true for MS."

In summary, to best serve the clinical needs of individuals with MS, neuropsychological testing needs to be viewed in a larger context that includes non-cognitive variables, such as motor ability and demographic factors, fatigue and depression, and disease activity and level of disability, as well as person-specific factors such as personality and coping style. "It's important to note that other types of assessments used to evaluate individuals with MS need to be subjected to the same standards for validity as neuropsychological assessments," added Dr. Weber.

Credit: 
Kessler Foundation

First glimpse at what ancient Denisovans may have looked like, using DNA methylation data

image: This image shows a portrait of a juvenile female Denisovan based on a skeletal profile reconstructed from ancient DNA methylation maps.

Image: 
Maayan Harel

If you could travel back in time 100,000 years, you'd find yourself living among multiple groups of humans, including anatomically modern humans, Neanderthals, and Denisovans. But exactly what our Denisovan relatives might have looked like had been anyone's guess for a simple reason: the entire collection of Denisovan remains includes a pinky bone, three teeth, and a lower jaw. Now, researchers reporting in the journal Cell have produced reconstructions of these long-lost relatives based on patterns of methylation in their ancient DNA.

"We provide the first reconstruction of the skeletal anatomy of Denisovans," says author Liran Carmel of the Hebrew University of Jerusalem. "In many ways, Denisovans resembled Neanderthals, but in some traits, they resembled us, and in others they were unique."

Overall, the researchers identified 56 anatomical features in which Denisovans differed from modern humans and/or Neanderthals, 34 of them in the skull. For example, the Denisovan's skull was probably wider than that of modern humans or Neanderthals. They likely also had a longer dental arch.

Carmel, along with study first author David Gokhman and their colleagues, came to this conclusion by using genetic data to predict the anatomical features of the Denisovans. Rather than relying on DNA sequences, they extracted anatomical information from gene activity patterns. Those gene activity patterns were inferred based on genome-wide DNA methylation or epigenetic patterns, chemical modifications that influence gene activity without changing the underlying sequence of As, Gs, Ts, and Cs.

The researchers first compared DNA methylation patterns between the three hominin groups to find regions in the genome that were differentially methylated. Next, they looked for evidence about what those differences might mean for anatomical features based on what's known about human disorders in which those same genes lose their function.

"By doing so, we can get a prediction as to what skeletal parts are affected by differential regulation of each gene and in what direction that skeletal part would change--for example, a longer or shorter femur," Gokhman explains.
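As a rough sketch of the inference logic described above (every region, gene, threshold, and methylation value below is invented for illustration; none comes from the paper): find regions whose methylation differs between groups, then look up the skeletal trait known to change when the nearby gene loses activity.

```python
# Toy sketch of methylation-based trait prediction. All data here is
# hypothetical, invented purely to illustrate the comparison logic.

# Hypothetical per-region methylation levels (0 = unmethylated, 1 = fully methylated).
methylation = {
    "region_A": {"modern_human": 0.2, "denisovan": 0.8},
    "region_B": {"modern_human": 0.5, "denisovan": 0.55},
}

# Hypothetical mapping from a region to a nearby gene and the skeletal trait
# affected when that gene's activity is reduced (as known from human disorders).
known_phenotypes = {
    "region_A": ("GENE_X", "longer dental arch"),
}

THRESHOLD = 0.3  # assumed cutoff for calling a region differentially methylated

for region, levels in methylation.items():
    diff = levels["denisovan"] - levels["modern_human"]
    if abs(diff) >= THRESHOLD and region in known_phenotypes:
        gene, trait = known_phenotypes[region]
        direction = "up" if diff > 0 else "down"
        print(f"{region}: methylation {direction} near {gene} -> predicted trait: {trait}")
```

Only region_A clears the threshold here, so only it yields a trait prediction; region_B's small difference is treated as noise, mirroring the paper's focus on clearly differentially methylated regions.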

To test the method, the researchers first applied it to two species whose anatomy is known: the Neanderthal and the chimpanzee. They found that roughly 85% of the trait reconstructions were accurate in predicting which traits diverged and in which direction they diverged. By focusing on consensus predictions and the direction of the change rather than trying to predict precise measurements, they were able to produce the first reconstructed anatomical profile of the little-understood Denisovan.

The evidence suggests that Denisovans likely shared Neanderthal traits such as an elongated face and a wide pelvis. It also highlighted Denisovan-specific differences, such as an increased dental arch and lateral cranial expansion, the researchers report.

Carmel notes that while their paper was in review, another study came out describing the first confirmed Denisovan mandible. And, it turned out that the jaw bone matched their predictions.

The findings show that DNA methylation can be used to reconstruct anatomical features, including some that do not survive in the fossil record. The approach may ultimately have a wide range of potential applications.

"Studying Denisovan anatomy can teach us about human adaptation, evolutionary constraints, development, gene-environment interactions, and disease dynamics," Carmel says. "At a more general level, this work is a step towards being able to infer an individual's anatomy based on their DNA."

Credit: 
Cell Press

Study finds hub linking movement and motivation in the brain

image: Figure S4A from the study shows that researchers were able to estimate (decode) the animals' actual movement velocity from the spiking activity of neurons in the lateral septum.

Image: 
Hannah Wirtshafter/The Picower Institute for Learning and Memory

Our everyday lives rely on planned movement through the environment to achieve goals. A new study by MIT neuroscientists at The Picower Institute for Learning and Memory identifies a well-connected brain region as a crucial link between circuits guiding goal-directed movement and motivated behavior.

Published Sept. 19 in Current Biology, the research shows that the lateral septum (LS), a region considered integral to modulating behavior and implicated in many psychiatric disorders, directly encodes information about the speed and acceleration of an animal as it navigates and learns how to obtain a reward in an environment.

"Completing a simple task, such as acquiring food for dinner, requires the participation and coordination of a large number of regions of the brain, and the weighing of a number of factors: for example, how much effort is it to get food from the fridge versus a restaurant," said Hannah Wirtshafter, the study's lead author. "We have discovered that the LS may be aiding you in making some of those decisions. That the LS represents place, movement, and motivational information may enable the LS to help you integrate or optimize performance across considerations of place, speed, and other environmental signals."

Previous research has attributed important behavioral functions to the LS, such as modulating anxiety, aggression, and affect. It is also believed to be involved in addiction, psychosis, depression, and anxiety. Neuroscientists have traced its connections to the hippocampus, a crucial center for encoding spatial memories and associating them with context, and to the ventral tegmental area (VTA), a region that mediates goal-directed behaviors via the neurotransmitter dopamine. But until now, no one had shown that the LS directly tracks movement or communicates with the hippocampus -- for instance, by synchronizing to certain neural rhythms -- about movement and the spatial context of reward.

"The hippocampus is one of the most studied regions of the brain due to its involvement in memory, spatial navigation, and a large number of illnesses such as Alzheimer's disease," said Wirtshafter, who recently earned her PhD working on the research as a graduate student in the lab of senior author Matthew Wilson, Sherman Fairchild Professor of Neurobiology. "Comparatively little is known about the lateral septum, even though it receives a large amount of information from the hippocampus and is connected to multiple areas involved in motivation and movement."

Wilson said the study helps to illuminate the importance of the LS as a crossroads of movement and motivation information between regions such as the hippocampus and the VTA.

"The discovery that activity in the LS is controlled by movement points to a link between movement and dopaminergic control through the LS that could be relevant to memory, cognition, and disease," he said.

Tracking thoughts

Wirtshafter was able to directly observe the interactions between the LS and the hippocampus by simultaneously recording the electrical spiking activity of hundreds of neurons in each region in rats both as they sought a reward in a T-shaped maze, and as they became conditioned to associate light and sound cues with a reward in an open box environment.

In that data, she and Wilson observed a speed and acceleration spiking code in the dorsal area of the LS, and saw clear signs that an overlapping population of neurons were processing information based on signals from the hippocampus, including spiking activity locked to hippocampal brain rhythms, location-dependent firing in the T-maze, and cue and reward responses during the conditioning task. Those observations suggested to the researchers that the septum may serve as a point of convergence of information about movement and spatial context.
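The release does not specify how velocity was decoded from spiking. As a hedged illustration of what "decoding" typically means in this setting, here is a toy linear decoder fit by least squares on synthetic firing-rate data (all numbers and the decoder choice are assumptions, not the study's method):

```python
# Toy neural decoder: predict movement velocity from firing rates with a
# linear model fit by ordinary least squares. Data is synthetic.
import numpy as np

rng = np.random.default_rng(0)

n_samples, n_neurons = 500, 40
true_weights = rng.normal(size=n_neurons)

# Synthetic firing rates and a velocity that is a noisy linear readout of them.
rates = rng.poisson(lam=5.0, size=(n_samples, n_neurons)).astype(float)
velocity = rates @ true_weights + rng.normal(scale=1.0, size=n_samples)

# Fit the decoder on the first half of the data, evaluate on the second half.
split = n_samples // 2
w_hat, *_ = np.linalg.lstsq(rates[:split], velocity[:split], rcond=None)
predicted = rates[split:] @ w_hat

# Correlation between decoded and actual velocity on held-out data.
corr = np.corrcoef(predicted, velocity[split:])[0, 1]
print(f"decoded-vs-actual velocity correlation: {corr:.2f}")
```

In a real experiment the "velocity" would come from position tracking and the decoder would be cross-validated; the point of the sketch is only that a good held-out fit demonstrates that the population activity carries velocity information.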

Wirtshafter's measurements also showed that coordination of LS spiking with the hippocampal theta rhythm is selectively enhanced during choice behavior that relies on spatial working memory, suggesting that the LS may be a key relay of information about choice outcome during navigation.

Putting movement in context

Overall, the findings suggest that movement-related signaling in the LS, combined with the input that it receives from the hippocampus, may allow the LS to contribute to an animal's awareness of its own position in space, as well as its ability to evaluate task-relevant changes in context arising from the animal's movement, such as when it has reached a choice point, Wilson and Wirtshafter said.

This also suggests that the reported ability of the LS to modulate affect and behavior may result from its ability to evaluate how internal states change during movement, and the consequences and outcomes of these changes. For instance, the LS may contribute to directing movement toward or away from the location of a positive or negative stimulus.

The new study therefore offers new perspectives on the role of the lateral septum in directed behavior, the researchers added, and given the known associations of the LS with some disorders, it may also offer new implications for broader understanding of the mechanisms relating mood, motivation, and movement, and the neuropsychiatric basis of mental illnesses.

"Understanding how the LS functions in movement and motivation will aid us in understanding how the brain makes basic decisions, and how disruption in these processes might lead to different disorders," Wirtshafter said.

Credit: 
Picower Institute at MIT

Cell biology: Endocannabinoid system may be involved in human testis physiology

The endocannabinoid system (ECS) may be directly involved in the regulation of the physiology of the human testis, including the development of sperm cells, according to a study in tissue samples from 15 patients published in Scientific Reports.

The ECS is a signalling system consisting of endocannabinoids - a type of neurotransmitter - their associated receptors, enzymes and proteins. In humans, the ECS has been associated with sperm quality and function. However, there is little information about the presence of ECS components in human testis tissue or their possible involvement in sperm cell development.

Niels E. Skakkebaek and colleagues investigated the presence of the individual components of the ECS in testis tissue samples from 15 patients with testicular germ cell cancer. They found that ECS components were present in the human testis, including 2-arachidonoylglycerol (2-AG), one of the main endocannabinoids. Enzymes that degrade 2-AG, such as FAAH and ABHD2, as well as endocannabinoid receptors, were present at various stages of germ cell development, with a distinct pattern of ECS components across different maturation stages.

The findings indicate that the ECS is present in the human testis and that it may be involved in testicular function, particularly in the regulation of sperm cell development. The findings add to our knowledge of possible associations between marijuana use and changes in semen quality and reproductive hormone levels in young men. However, additional studies of endocannabinoid function in human reproductive organs are needed to address the possible health impacts of cannabis use.

Credit: 
Scientific Reports

Scientists develop new methodology to genetically modify lab mice and human cells

image: Artist's representation of brain cells of laboratory mice altered by a new technique called mosaic analysis with dual recombinase-mediated cassette exchange, or MADR.

Image: 
Illustration by Giorgio Luciano for Cedars-Sinai.

LOS ANGELES -- A team led by Cedars-Sinai has designed a rapid method to genetically alter laboratory mice and then used this method to produce personalized animal models of pediatric glioma, an aggressive type of malignant brain cancer in children.

The new method overcomes several drawbacks in current techniques. The goal is to make it easier for laboratories to achieve precise, reliable results when modifying mice for research studies, especially those involving cancers driven by multiple genetic variations. The method also can be used to modify patient-derived cells to study diseases in a culture dish.

In an article published Sept. 19 in the journal Cell, the scientists described their technique, called mosaic analysis with dual recombinase-mediated cassette exchange, or MADR.

"We imagine our method as a continually evolving platform," said Joshua Breunig, PhD, assistant professor of Biomedical Sciences at the Board of Governors Regenerative Medicine Institute at Cedars-Sinai. "We hope to keep incorporating new methods to create a type of Swiss army knife for tumor modeling, adaptable to many types of tissues." Breunig is the corresponding and senior author of the Cell study.

The MADR method involves inserting a single copy of genetic material into a specific point on a chromosome in the cell of a mouse embryo or newborn mouse. Each genetic alteration, or mutation, can cause a protein to either stop functioning or acquire a new, abnormal function. Using MADR, scientists were able to introduce both types of mutations into the same mouse.

The mutations reproduced the complex gene expression patterns and pathology found in the pediatric glioma tumors of patients who provided the genetic material. Further, MADR was able to accurately model another aggressive type of pediatric brain tumor, ependymoma, using multi-gene fusions that produced cancer-causing fusion proteins. Importantly, the method precisely targeted the correct tissues, leaving other tissues unaltered, Breunig said.

Currently, scientists modify mice for research in several ways, which can include transplanting cells, using deactivated viruses to ferry genetic material into cells or breeding genetically engineered animals. These techniques can present incomplete pictures of tumors, entail biohazard risk, be costly and time-consuming, or cause unintended cancers in other tissues, Breunig said. MADR is designed to overcome some of these problems.

Moise Danielpour, MD, associate professor of Neurosurgery at Cedars-Sinai, explained that better mouse models are critical to understanding diseases such as pediatric glioma, in which a single tumor may contain many different clones, each with its own combination of gene mutations and altered biochemical pathways. This complexity helps the tumor evolve to resist treatments. Danielpour was a key collaborator and co-author who provided clinical insight for the project.

"My hope is that our new method, by modeling the spectrum of heterogeneity of tumors, will help us develop new, more-targeted therapeutics for pediatric glioma, one of the leading causes of cancer deaths in children and young adults," he said.

Breunig added the study demonstrated that MADR successfully altered not only mouse cells but also human cells that had been modified to include an appropriate recipient site. "This finding shows the potential for scientists to incorporate MADR into 'disease in a dish' studies involving a wide range of disorders," he said.

Clive Svendsen, PhD, director of the Board of Governors Regenerative Medicine Institute and professor of Medicine and Biomedical Sciences, noted that modifying the mouse brain with human genes involved with brain tumors is playing an increasingly important role in modeling cancers.

Both Breunig and Danielpour are affiliated with Cedars-Sinai Cancer. "This study shows how Cedars-Sinai is shaping the future of cancer medicine by fostering creative collaborations between top investigators and clinicians," said Dan Theodorescu, MD, PhD, director of Cedars-Sinai Cancer and Phase ONE Foundation Distinguished Chair. "Our goal is to translate discoveries into lifesaving treatments."

Credit: 
Cedars-Sinai Medical Center

USC researchers home in on the elusive receptor for sour taste

Sour is the taste of summer, a taste that evokes lemonade stands and vine-ripe tomatoes. Among the five basic tastes -- the others being bitter, sweet, salty and umami -- it is arguably the most subtle. In small amounts, it adds a critical tang to an otherwise bland dish. At higher concentrations and on its own, it's unpleasant or even painful.

But what causes the sensation of sourness, and why do we like it so much? USC scientists may have solved the first mystery: how sour tastes are sensed by animals.

Fruits and vegetables that taste sour are high in acids, including citric acid in lemons, tartaric acid in grapes and acetic acid in fermented foods like vinegar. It has been recognized for more than a century -- since the introduction of the pH meter -- that the low pH and high concentration of H+ ions in these foods generate a perception of sourness in humans. But how pH is sensed at the level of the tongue, and specifically what molecule constitutes the pH sensor, was not known.
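The pH relationship the passage relies on is logarithmic: pH = -log10[H+], so each one-unit drop in pH means a tenfold rise in H+ concentration. A minimal sketch of that conversion (the food pH values below are rough, illustrative figures, not from the study):

```python
def h_concentration(ph: float) -> float:
    """Molar H+ concentration implied by a pH value (pH = -log10[H+])."""
    return 10.0 ** (-ph)

# Rough, illustrative pH values for common sour foods:
for food, ph in [("lemon juice", 2.0), ("vinegar", 2.5), ("ripe tomato", 4.3)]:
    print(f"{food}: pH {ph} -> [H+] ~ {h_concentration(ph):.1e} M")
```

The tenfold-per-unit scaling is why lemon juice, at a pH near 2, tastes dramatically more sour than a tomato at a pH above 4.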

A group led by Emily Liman, professor of biological sciences at the USC Dornsife College of Letters, Arts & Sciences, has reported in Current Biology that a sensor for pH on the tongue is the otopetrin 1 gene (Otop1). Otop1 is a member of a class of molecules called ion channels, which allow charged ions to cross cell membranes. In the case of Otop1, the charged ion carried across the membrane is H+, which is released into the mouth by acids.

Last year, Liman's team published research in Science that closed in on the sour-taste sensor. In that study, they used high-throughput sequencing methods made possible by advances in genomics to identify a list of roughly 40 previously uncharacterized genes that could encode a sour sensor. By studying the function of each gene, they whittled the list down to Otop1 because it was the only candidate that, when introduced into non-taste cells, gave them the ability to respond to acids.

While the USC scientists had identified OTOP1 -- the protein encoded by the Otop1 gene -- as forming a proton channel, they did not show that it was required for sour-taste responses in an intact animal.

Taste occurs when ingested chemicals interact with specialized cells on the tongue and palate. These cells are called taste receptor cells and are found in taste buds, which are concentrated on the back, sides and front of the tongue and on the roof of the mouth. Different taste receptor cells respond to each of the five basic tastes, and they release neurotransmitters onto gustatory nerves that send signals to the brain. This allows the nervous system to determine whether the ingested chemical has qualities perceived as bitter, sweet, umami, sour, salty or a mix of the five.

The new study followed up on previous findings that OTOP1 gave cells the ability to detect low pH. Graduate student Yu-Hsiang Tu used gene-editing technology to generate mice with an inactivated Otop1 gene to test if the OTOP1 protein was necessary for responding to sour chemicals -- or acids. When sour taste receptor cells are exposed to acids, they respond by producing an electrical signal -- or current -- because of the movement of H+ ions across the cell's membrane.

Graduate student Bochuan Teng showed that the sour taste receptor cells from the mice with nonfunctional OTOP1 did not have detectable currents representing the movement of H+ into cells. The sour taste receptor cells from the mutant mice also did not decrease their intracellular pH when exposed to acids, which would happen if H+ ions moved into the cell. Finally, the sour taste receptor cells from the mutant mice did not produce action potentials -- another electrical signal -- which are needed to activate the gustatory nerve and signal to the brain in response to some acidic solutions.

Whereas the previous experiments were performed with isolated taste receptor cells, the researchers also studied the importance of OTOP1 in mice by measuring the activity of the gustatory nerves in response to sour-tasting solutions introduced into the mouth of the mice. For these experiments, they teamed up with leading taste researcher Sue Kinnamon and graduate student Courtney Wilson at the University of Colorado Medical School. As expected, the activity of these nerves was severely reduced in mice with nonfunctional OTOP1, showing that the ability of the mice to sense acidic solutions -- and thus sour tastes -- was impaired.

"Our results show that OTOP1 is a bona fide sour taste receptor," Liman said. "This is the first definitive evidence for a protein that is both necessary and sufficient for sour taste receptor cells to respond to acids and stimulate the nerves to enable sour taste perception."

Surprisingly, the scientists found that mice with a nonfunctional Otop1 gene could still produce a small response to sour taste stimuli; the sour taste receptor cells still produced a few action potentials and the gustatory nerve produced a small response to very acidic stimuli. They postulate that another signaling mechanism, unrelated to OTOP1, also contributes to sour taste. They also tested the behavior in mice and found that the mice with a nonfunctional Otop1 gene still found acidic stimuli aversive.

"The behavioral response to acidic stimuli that are ingested is complex. You have the taste receptor cells that can detect acids, but you also have the pain system, which responds to low pH," Liman said. "Finding the molecular basis for the sour taste sensor takes us one step closer to understanding how different animals and individuals perceive the world."

Identification of the molecule responsible for taste opens up possibilities for wide application. This information may lead to an understanding of individual differences in food preferences and flavor perception, guide nutrition science and lead to novel approaches to pest control. Professional flavorists and chemists can leverage this information to manipulate flavors to make food or even medications more pleasing to the palate while also making household products containing toxic chemicals less pleasing.

Credit: 
University of Southern California

Key similarities discovered between human and archaea chromosomes

image: Stephen Bell

Image: 
Indiana University

A study led by researchers at Indiana University is the first to find similarities between the organization of chromosomes in humans and archaea. The discovery could support the use of archaea in research to understand human diseases related to errors in cellular gene expression, such as cancer.

The lead author on the study is Stephen Bell, a professor of biology and chair of the Department of Molecular and Cellular Biochemistry in the College of Arts and Sciences at IU Bloomington. The study will publish Sept. 19 in the journal Cell.

The similar clustering of DNA in human and archaeal chromosomes is significant because certain genes activate or deactivate based upon how the DNA is folded.

"The inaccurate bundling, or 'folding,' of DNA can lead to the wrong gene being switched on or off," Bell said. "Studies have shown that switching the wrong genes on or off during cellular growth in humans can lead to changes in gene expression that can ultimately be carcinogenic."

Archaea are simple single-celled organisms that comprise one of the three domains of life on Earth. Although found in every type of environment, including the human body, archaea are poorly understood compared to the other two domains: bacteria and eukaryotes, which include mammals such as humans. They're also more similar to eukaryotes on the genetic level than bacteria.

The IU study is the first to visualize the organization of DNA in archaeal chromosomes. The key similarity is the way in which the DNA is arranged into clusters -- or "discrete compartmentalizations" -- based upon their function.

"When we first saw the interaction patterns of the archaea's DNA, we were shocked," Bell said. "It looked just like what has been seen with human DNA."

The study is also the first to describe the protein used to assemble archaeal DNA during cellular growth. The researchers dubbed this large protein complex "coalescin" due to its similarities to a protein in eukaryotes called "condensin."

The advantage of using archaea as a model for studying the organization of DNA during cellular growth in humans -- and the relationship between that organization and the activation of genes that may trigger cancers -- is their relative simplicity.

"Human cells are horrifyingly complex, and understanding the rules that govern DNA folding is extremely challenging," Bell said. "The simplicity of archaea means that they've got the potential to be a terrific model to help understand the fundamentally related -- but much more complicated -- cellular processes in humans."

The study was conducted using Sulfolobus, a genus of archaea that thrives at extremely high temperatures; its physical durability makes it easier to use in experiments. Sulfolobus species are found across the globe, notably at locations such as the volcano at Mount St. Helens and the hot springs of Yellowstone National Park.

Credit: 
Indiana University

Study: Even short-lived solar panels can be economically viable

A new study shows that, contrary to widespread belief within the solar power industry, new kinds of solar cells and panels don't necessarily have to last for 25 to 30 years in order to be economically viable in today's market.

Rather, solar panels with initial lifetimes of as little as 10 years can sometimes make economic sense, even for grid-scale installations -- thus potentially opening the door to promising new solar photovoltaic technologies that have been considered insufficiently durable for widespread use.

The new findings are described in a paper in the journal Joule, by Joel Jean, a former MIT postdoc and CEO of startup company Swift Solar; Vladimir Bulović, professor of electrical engineering and computer science and director of MIT.nano; and Michael Woodhouse of the National Renewable Energy Laboratory (NREL) in Colorado.

"When you talk to people in the solar field, they say any new solar panel has to last 25 years," Jean says. "If someone comes up with a new technology with a 10-year lifetime, no one is going to look at it. That's considered common knowledge in the field, and it's kind of crippling."

Jean adds that "that's a huge barrier, because you can't prove a 25-year lifetime in a year or two, or even 10." That presumption, he says, has left many promising new technologies stuck on the sidelines, as conventional crystalline silicon technologies overwhelmingly dominate the commercial solar marketplace. But, the researchers found, that does not need to be the case.

"We have to remember that ultimately what people care about is not the cost of the panel; it's the levelized cost of electricity," he says. In other words, it's the actual cost per kilowatt-hour delivered over the system's useful lifetime, including the cost of the panels, inverters, racking, wiring, land, installation labor, permitting, grid interconnection, and other system components, along with ongoing maintenance costs.

Part of the reason that the economics of the solar industry look different today than in the past is that the cost of the panels (also known as modules) has plummeted so far that now, the "balance of system" costs -- that is, everything except the panels themselves -- exceeds that of the panels. That means that, as long as newer solar panels are electrically and physically compatible with the racking and electrical systems, it can make economic sense to replace the panels with newer, better ones as they become available, while reusing the rest of the system.

"Most of the technology is in the panel, but most of the cost is in the system," Jean says. "Instead of having a system where you install it and then replace everything after 30 years, what if you replace the panels earlier and leave everything else the same? One of the reasons that might work economically is if you're replacing them with more efficient panels," which is likely to be the case as a wide variety of more efficient and lower-cost technologies are being explored around the world.

He says that what the team found in their analysis is that "with some caveats about financing, you can, in theory, get to a competitive cost, because your new panels are getting better, with a lifetime as short as 15 or even 10 years."
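The levelized-cost argument can be made concrete with a toy calculation. This is a deliberately simplified, undiscounted sketch with hypothetical dollar and output figures; the paper's actual models use NREL benchmark parameters and proper financing assumptions:

```python
def simple_lcoe(bos_cost, installs, annual_kwh, years):
    """Toy undiscounted LCOE: total cost / total energy over the system life.
    `installs` lists (start_year, panel_cost, relative_output) for each
    panel set; financing, O&M, and degradation are ignored."""
    total_cost = bos_cost + sum(cost for _, cost, _ in installs)
    total_kwh = 0.0
    for i, (start, _, rel_output) in enumerate(installs):
        # Each panel set produces until the next install (or end of life).
        end = installs[i + 1][0] if i + 1 < len(installs) else years
        total_kwh += annual_kwh * rel_output * (end - start)
    return total_cost / total_kwh

# Hypothetical residential system: $9,000 balance of system, $3,000 panels,
# 9,000 kWh/year output, 30-year site life.
keep_30yr = simple_lcoe(9000, [(0, 3000, 1.0)], 9000, 30)
# Swap in cheaper, 25%-more-productive panels at year 15, reusing the BOS:
swap_15yr = simple_lcoe(9000, [(0, 3000, 1.0), (15, 1200, 1.25)], 9000, 30)
print(f"keep panels 30 yr: {keep_30yr:.4f} $/kWh")
print(f"replace at 15 yr : {swap_15yr:.4f} $/kWh")
```

With these made-up numbers, the mid-life swap comes out slightly cheaper per kilowatt-hour because the balance of system is reused while output rises. Whether that holds in practice depends on financing and real panel price trajectories, which is exactly what the study models.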

Although the costs of solar cells have come down year by year, Bulović says, "the expectation that one had to demonstrate a 25-year lifetime for any new solar panel technology has stayed as a tautology. In this study we show that as the solar panels get less expensive and more efficient, the cost balance significantly changes."

He says that one aim of the new paper is to alert researchers that their new solar inventions can be cost-effective even if relatively short-lived, and hence may be adopted and deployed more rapidly than expected. At the same time, he says, investors should know that they stand to make bigger profits by opting for efficient solar technologies that may not have been proven to last as long, knowing that the panels can periodically be replaced by newer, more efficient ones.

"Historical trends show that solar panel technology keeps getting more efficient year after year, and these improvements are bound to continue for years to come," says Bulović. Perovskite-based solar cells, for example, when first developed less than a decade ago, had efficiencies of only a few percent. But recently their record performance exceeded 25 percent efficiency, compared to 27 percent for the record silicon cell and about 20 percent for today's standard silicon modules, according to Bulović. Importantly, in novel device designs, a perovskite solar cell can be stacked on top of another perovskite, silicon, or thin-film cell, to raise the maximum achievable efficiency limit to over 40 percent, which is well above the 30 percent fundamental limit of today's silicon solar technologies. But perovskites have issues with longevity of operation and have not yet been shown to be able to come close to meeting the 25-year standard.

Bulović hopes the study will "shift the paradigm of what has been accepted as a global truth." Up to now, he says, "many promising technologies never even got a start, because the bar is set too high" on the need for durability.

For their analysis, the team looked at three different kinds of solar installations: a typical 6-kilowatt residential system, a 200-kilowatt commercial system, and a large 100-megawatt utility-scale system with solar tracking. They used NREL benchmark parameters for U.S. solar systems and a variety of assumptions about future progress in solar technology development, financing, and the disposal of the initial panels after replacement, including recycling of the used modules. The models were validated using four independent tools for calculating the levelized cost of electricity (LCOE), a standard metric for comparing the economic viability of different sources of electricity.

In all three installation types, they found, depending on the particulars of local conditions, replacement with new modules after 10 to 15 years could in many cases provide economic advantages while maintaining the many environmental and emissions-reduction benefits of solar power. The basic requirement for cost-competitiveness is that any new solar technology to be installed in the U.S. should start with a module efficiency of at least 20 percent, a cost of no more than 30 cents per watt, and a lifetime of at least 10 years, with the potential to improve on all three.

Jean points out that the solar technologies that are considered standard today, mostly silicon-based but also thin-film variants such as cadmium telluride, "were not very stable in the early years. The reason they last 25 to 30 years today is that they have been developed for many decades." The new analysis may now open the door for some of the promising newer technologies to be deployed at sufficient scale to build up similar levels of experience and improvement over time and to make an impact on climate change earlier than they could without module replacement, he says.

"This could enable us to launch ideas that would have died on the vine" because of the perception that greater longevity was essential, Bulović says.

Credit: 
Massachusetts Institute of Technology

Minorities more likely to have diabetes at lower weights

Oakland, Calif. -- Being overweight or obese is commonly associated with diabetes, but a new Kaiser Permanente study finds the connection differs widely by race or ethnicity. Members of racial and ethnic minority groups were much more likely to have diabetes or prediabetes at lower weights -- even at normal or below-normal body mass index (BMI), according to research published in Diabetes Care.
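BMI itself is a simple ratio, weight over height squared, and the weight classes the study stratifies by are standard cut points. A minimal sketch using the conventional WHO bands (the study's exact categories, and any group-specific thresholds, may differ):

```python
def bmi(weight_kg: float, height_m: float) -> float:
    """Body mass index: weight in kilograms over height in meters squared."""
    return weight_kg / height_m ** 2

def who_category(b: float) -> str:
    """Conventional WHO BMI bands; cut points used in studies can vary."""
    if b < 18.5:
        return "underweight"
    if b < 25.0:
        return "normal"
    if b < 30.0:
        return "overweight"
    return "obese"

print(who_category(bmi(70, 1.75)))   # BMI ~22.9 falls in the "normal" band
```

The study's central point is that a "normal" result from this calculation carries very different diabetes risk across racial and ethnic groups.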

The large analysis included more than 4.9 million people of diverse backgrounds and geographies who were part of the nationwide Patient Outcomes Research to Advance Learning network. The PORTAL study group, supported by the Patient-Centered Outcomes Research Institute, includes data on more than 12 million patients contributed by all regions of Kaiser Permanente, along with HealthPartners in Minnesota and Denver Health.

Normal-weight Hawaiians and Pacific Islanders were more than 3 times as likely to have diabetes as normal-weight white people: diabetes prevalence at normal BMI was 18% for Hawaiians/Pacific Islanders versus just 5% for whites. Prevalence was also high for blacks (13.5%), Hispanics (12.9%), Asians (10.1%) and American Indians/Alaskan Natives (9.6%).

Disparities were also found in prediabetes but were not as pronounced. Results also differed by gender. Asians, Hispanics, and Hawaiians/Pacific Islanders had a higher prevalence of prediabetes at lower BMIs than other groups, particularly among women.

For primary care clinicians, the findings could signal a change in how they screen racial and ethnic minority patients for diabetes and prediabetes, said senior author Assiamira Ferrara, MD, PhD, a senior research scientist with the Kaiser Permanente Division of Research in Oakland, California. "This study suggests that along with screening patients who are overweight and obese, minorities should probably be screened even if they have a normal BMI, particularly as they get older," Ferrara said.

This study is one of the largest that has examined the relationship between BMI and diabetes and prediabetes prevalence. The study also included large enough samples of some understudied minority groups to draw conclusions about them, the authors said. The study offers new information about diabetes prevalence across BMI categories among Asians, Hawaiians, Pacific Islanders, American Indians, and Alaskan Natives across the country.

This study took into account neighborhood-level measures of income and education, neither of which fully explained the racial/ethnic differences in diabetes prevalence beyond BMI. While access to primary care is a major factor in health care disparities, it was not seen as a contributor in this study because all of the patients had health insurance and were members of integrated health systems.

The authors speculated that there could be physiological differences among people of varying races and ethnicities relating to diabetes, citing the example of Asians having a higher share of body fat and visceral fat at the same BMI as other groups, which could lead to insulin resistance, prediabetes, and diabetes.

Lead author Yeyi Zhu, PhD, a research scientist with the Kaiser Permanente Division of Research, called for better understanding of how the physiological mechanisms of diabetes may vary. "Future research could focus on body composition, genetics, and other lifestyle factors that may contribute to disparities in chronic disease burden," Zhu said.

She also noted that the analysis identified a group of people at risk who don't get as much attention for diabetes risk: those who are underweight. The study found significant differences in diabetes prevalence among underweight men, ranging from 7.3% in whites to 16.8% in American Indians/Alaskan Natives.

Credit: 
Kaiser Permanente

Survival in women, men diagnosed with breast cancer

Bottom Line: An analysis of nearly 1.9 million patients diagnosed with breast cancer suggests overall survival is lower among men than women and that undertreatment and clinical characteristics account for much of the difference. The study included National Cancer Database data for 16,025 male and 1.8 million female patients diagnosed with breast cancer between 2004 and 2014. Men had higher mortality across all breast cancer stages. For men, the three-year survival rate was 86.4% and the five-year survival rate was 77.6%. For women, the three-year survival rate was 91.7% and the five-year survival rate was 86.4%. Limitations of the study include a lack of information on cancer recurrence and cause of death, as well as missing information on details of cancer pathology and treatment, patient compliance, lifestyle factors and coexisting illnesses. The study authors suggest future research focus on why clinical characteristics and biological features may have different implications for survival in male and female patients with breast cancer.

To access the embargoed study: Visit our For The Media website at this link https://media.jamanetwork.com/

Authors: Xiao-Ou Shu, M.D., Ph.D., the Vanderbilt University Medical Center, Nashville, and coauthors

(doi:10.1001/jamaoncol.2019.2803)

Editor's Note: The article includes conflict of interest and funding/support disclosures. Please see the article for additional information, including other authors, author contributions and affiliations, financial disclosures, funding and support, etc.

Credit: 
JAMA Network