Indications of why older people are more susceptible to Alzheimer's disease

The risk of developing Alzheimer's disease increases with age. Susanne Wegmann of the German Center for Neurodegenerative Diseases (DZNE) in Berlin and colleagues have uncovered a possible cause for this connection: Certain molecules involved in the disease, termed tau proteins, spread more easily in the aging brain. This was determined in laboratory experiments. The current study was carried out in close collaboration with researchers in the US at Harvard Medical School and Massachusetts General Hospital. The results were recently published in the journal "Science Advances".

Alzheimer's disease usually begins with memory decline and later affects other cognitive abilities. Two different kinds of protein deposits in the patient's brain are involved in the disease: "Amyloid beta plaques" and "tau neurofibrillary tangles". The emergence of tau neurofibrillary tangles reflects disease progression: they first manifest in the brain's memory centers and then appear in other areas in the course of the disease. Tau proteins or tau aggregates probably migrate along nerve fibers and thereby contribute to the spreading of the disease throughout the brain.

Tau spreads more rapidly in aging brains

What is the role of aging in tau propagation? If the protein spread more easily in older brains, this could explain the increased susceptibility of older people to Alzheimer's disease. Wegmann and her colleagues tested this hypothesis.

Using a "gene vector" - a tailored virus particle - the scientists channeled the blueprint of the human tau protein into the brains of mice. Individual cells then began to produce the protein. Twelve weeks later, the researchers examined how far the tau protein had travelled from the production site. "Human tau proteins spread about twice as fast in older mice as compared to younger animals," Wegmann summarized the results.

The experimental part of the study was carried out in the laboratory of Bradley Hyman at Harvard Medical School in Boston, USA, where Susanne Wegmann worked for several years. In 2018, she moved to the DZNE's Berlin site, where her research group addresses various questions on tau-related disease mechanisms. Most of the data analysis and the summarizing of the results took place there.

Healthy and pathological tau

The experimental setting also allowed the scientists to analyze tau propagation in more detail. The protein exists in a healthy, soluble form in every neuron of the brain. However, in Alzheimer's disease, it can change its shape and convert into a pathological form prone to aggregate into fibrils. "It has long been thought that it is primarily the pathological form of tau that passes from one cell to the next. However, our results show that the healthy version of the protein also propagates in the brain and that this process increases in old age. Cells could also be harmed by receiving and accumulating large amounts of healthy tau," said Wegmann.

The findings from the study raise a number of questions that Wegmann will now tackle with her research group at the DZNE: Which processes underlie the increased spreading of tau in the aging brain? Is too much tau protein produced or too little defective protein removed? Answering these questions may open up new therapeutic options in the long term.

Credit: 
DZNE - German Center for Neurodegenerative Diseases

Looking at how the brain reacts to boredom could help people cope

image: A student digitally turns pegs for 10 minutes while wearing an EEG cap to measure her brain waves as she becomes more bored.

Image: 
WSU

Boredom is a common human experience. But how people cope with or handle being bored is important for mental health.

"Everybody experiences boredom," said Sammy Perone, Washington State University assistant professor in the Department of Human Development. "But some people experience it a lot, which is unhealthy. So, we wanted to look at how to deal with it effectively."

The brains of people who are prone to boredom react differently from those of people who are not, Perone and his colleagues found in a new paper recently published in the journal Psychophysiology.

Among their findings, those who experience boredom more often tend to have more anxiety and are more prone to depression.

Perone conducted his research and wrote the paper with WSU assistant professor Elizabeth Weybright and WSU graduate student Alana Anderson.

"Previously, we thought people who react more negatively to boredom would have specific brain waves prior to being bored," Perone said. "But in our baseline tests, we couldn't differentiate the brain waves. It was only when they were in a state of boredom that the difference surfaced."

That means the big difference between people who often experience boredom and those who don't is how they react to a boring situation. The implication is they can be taught coping mechanisms to avoid those negative responses.

How to bore people

To study how the brain reacts to boredom, you must first get a baseline screening, then bore people almost to tears. So Perone studied 54 people in his lab, where they came in, filled out a survey, and were fitted with a special cap that measures brain waves at 128 spots on the scalp.

The survey consisted of a series of questions about boredom and how participants react to it. Next, researchers measured the brain waves of each participant with their eyes open and then eyes closed to get the baseline reading. Then the boredom started.

The subjects sat in front of a computer displaying eight pegs on the screen. Their job was to click on the peg that got highlighted. Each click turned the peg a quarter turn. Then another would be highlighted. The experiment consisted of 320 quarter turns, taking around 10 minutes.

"I've never done it, it's really tedious," Perone said. "But in researching previous experiments, this was rated as the most boring task tested. That's what we needed."

Reactions to boredom are key

When analyzing the brain wave results, researchers looked at two specific areas. The right frontal and left frontal areas of the brain are active for different reasons. Left frontal activity is higher when people are looking to engage or stimulate themselves by thinking about other things. The right frontal activity is increased when people are feeling more negative emotions or becoming more anxious.
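
As a rough illustration of this kind of left-versus-right comparison, the sketch below computes band power for one assumed left and one assumed right frontal channel and expresses the difference as a log ratio. The frequency band, sampling rate, channel choice and synthetic signals are all assumptions made for illustration; this is not the study's actual analysis pipeline.

    import numpy as np

    def band_power(signal, fs, band=(8.0, 13.0)):
        """Approximate power in a frequency band via a simple periodogram."""
        freqs = np.fft.rfftfreq(signal.size, d=1.0 / fs)
        psd = np.abs(np.fft.rfft(signal)) ** 2 / (fs * signal.size)
        mask = (freqs >= band[0]) & (freqs <= band[1])
        return psd[mask].sum() * (freqs[1] - freqs[0])

    def asymmetry_index(left_channel, right_channel, fs):
        """Positive values mean more band power on the right channel.
        The band (8-13 Hz) and this index are illustrative assumptions,
        not the measure used in the study."""
        return np.log(band_power(right_channel, fs)) - np.log(band_power(left_channel, fs))

    # Toy 10-second signals standing in for a left and a right frontal electrode
    fs = 256
    t = np.arange(0, 10, 1.0 / fs)
    rng = np.random.default_rng(0)
    left = np.sin(2 * np.pi * 10 * t) + 0.3 * rng.standard_normal(t.size)
    right = 1.4 * np.sin(2 * np.pi * 10 * t) + 0.3 * rng.standard_normal(t.size)
    print(asymmetry_index(left, right, fs))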

In baseline testing, there was no difference between the people who reacted with more left brain activity vs. right brain activity. But people who answered the survey questions saying that they're more prone to experience boredom in their daily life had more right frontal brain activity as they got more bored doing the peg activity.

"We found that the people who are good at coping with boredom in everyday life, based on the surveys, shifted more toward the left," Perone said. "Those that don't cope as well in everyday life shifted more right."

Coping with boredom

There are several ways that people cope positively with boredom, Perone said. They seek out a book or something to read. They create a grocery list or think about what they're going to make for dinner, for example.

"We had one person in the experiment who reported mentally rehearsing Christmas songs for an upcoming concert. They did the peg turning exercise to the beat of the music in their head," Perone said. "Doing things that keep you engaged rather than focusing on how bored you are is really helpful."

Real-world application

The next steps for the research will involve how to get people to be more proactive in their thinking when bored.

"The results of this paper show that reacting more positively to boredom is possible," Perone said. "Now we want to find out the best tools we can give people to cope positively with being bored. So, we'll still do the peg activity, but we'll give them something to think about while they're doing it.

"It's really important to have a connection between the lab and the real world. If we can help people cope with boredom better, that can have a real, positive mental health impact."

Credit: 
Washington State University

Live fast and die young, or play the long game? Scientists map 121 animal life cycles

image: The pace and shape of life varies hugely in the animal kingdom.

Image: 
Kevin Healy

Scientists have pinpointed the "pace" and "shape" of life as the two key elements in animal life cycles that affect how different species get by in the world. Their findings, which come from a detailed assessment of 121 species ranging from humans to sponges, may have important implications for conservation strategies and for predicting which species will be the winners and losers from the global environment crisis.

"Pace of life" relates to how fast animals reach maturity, how long they can expect to live, and the rate at which they can replenish a population with offspring. "Shape of life", meanwhile, relates to how an animal's chance of breeding or dying is spread out across its lifespan.

The scientists, from the National University of Ireland Galway, Trinity College Dublin, Oxford University, the University of Southampton, and the University of Southern Denmark, have today [Monday July 8] published their work in leading journal, Nature Ecology & Evolution.

The wide range of animal life cycles

Animal life cycles vary to a staggering degree. Some animals, such as the turquoise killifish (a small fish that can complete its life cycle in 14 days), grow fast and die young, while others, like the Greenland shark (a fish that glides around for up to 500 years), grow slowly and have extraordinarily long lifespans.

Similarly, the spread of death and reproduction across animal life cycles also varies greatly. Salmon, for example, spawn over a short period of time with the probability of dying being particularly high both at the start of their life cycle and when they reproduce. Fulmars and some other sea birds, on the other hand, have wider time periods of reproduction and face relatively similar chances of dying throughout their lives.

Humans and Asian elephants have long lifespans and face a relatively low risk of mortality until later ages but have a fairly narrow age range for reproduction because they have long juvenile periods and live a long time after the reproductive part of their life-cycles. Both species share a similar lifespan with the Australian freshwater crocodile, but the crocodile has a completely different reproductive strategy - its reproduction is spread relatively evenly throughout its lifespan but its young have a low chance of reaching adulthood and reproducing.

The puzzle of different life cycles - why so many?

Why animal life cycles vary so much has long been an important puzzle that scientists have sought to solve. Among the reasons for solving it: understanding why animals age, reproduce and grow at different rates may 1) help shed light on the evolution of aging itself, and 2) help identify how species will respond to global environmental change.

In their study, the scientists used population data to compare detailed life cycles for species ranging from sponges to corals, salmon to turtles, and vultures to humans. By mapping 121 life cycles, the scientists noticed that certain animal ecologies and physiologies were associated with certain life cycles.

Dr Kevin Healy who conducted the research at Trinity and is now Lecturer of Zoology at the National University of Ireland Galway, is the lead author of the study. He said: "When we mapped out the range of life cycles in the animal kingdom we saw that they follow general patterns. Whether you are a sponge, a fish or a human, your life cycle can, in general, be described by two things - how fast you live and how your reproduction and chance of dying is spread out across your lifespan."

"As we expected, species with low metabolic rates and slow modes-of-life were associated with slower life cycles. This makes sense; if you don't burn much energy per second, you are restricted in how fast you can grow. Similarly, if you are an animal that doesn't move around a lot, such as a sponge or a fish that lives on the sea bed, playing a longer game in terms of your pace of life makes sense as you may need to wait for food to come to you."

Conservation implications

The scientists also investigated whether certain life cycles made animals more susceptible to ecological threats, by looking for associations between an animal's life cycle and its position on the IUCN red list of endangered species.

Professor of Zoology and Head of the Zoology Department at Trinity, Yvonne Buckley, is co-senior author of the research. She said: "We found that extinction risks were not confined to particular types of life history for the 121 species. Despite these animals having very different ways of maintaining their populations, they faced similar levels of threat."

"Populations of a particular species, like the Chinook salmon or Freshwater crocodile, vary more in how mortality and reproduction are spread across their life-spans than they vary in their pace of life. This is important for the animal populations that we need to conserve as it suggests it may be wiser to consider actions that boost reproduction and/or impart bigger effects on the periods of the life cycles when mortality and reproduction are more likely - rather than simply aiming to extend the lifespans of these animals."

Associate Professor in Ecology at the University of Oxford, Dr Rob Salguero-Gómez, is also co-senior author of the research. He said: "This comparative work, which builds on previous research we have developed testing basic assumptions of how life structures the Plant Kingdom, highlights important commonalities in the ways that both animals and plants go about making a living and adapting to different environments. Indeed, classical works in life history theory predicted a single way to structure life strategies. Our work with plants and now here with animals shows the range of possibilities is much wider than previously believed."

"The unparalleled wealth of animal demographic schedules used in this research produced by an initiative led by Assoc. Prof Salguero-Gómez & co-author Assoc. Prof. Owen Jones, opens up new exciting ways to explore what are the most common strategies used by different species to thrive in their environments, but also to use demographic models to make predictions about what species will be the winners and losers of climate change."

Credit: 
Trinity College Dublin

Heat transport can be blocked more effectively with an optimized holey nanostructure

image: There is an optimal periodic structure, which minimizes the thermal conduction to a record low level, with a period of about 10 micrometers.

Image: 
University of Jyvaskyla/Ilari Maasilta

The group of professor Ilari Maasilta at the Nanoscience Center, University of Jyväskylä specializes in studying how different nanostructures can be used to enhance or impede the transport of heat. The group's latest results, published in the journal Physical Review Applied on the 3rd of July, 2019, confirm its earlier observations that by exploiting the wave nature of heat in holey nanostructures, heat conduction can be reduced more than a hundredfold.

The most important applications of controlling heat transport are in fields such as thermoelectric power conversion and cooling, and bolometric radiation detection.

The holey structures consisted of thin insulating silicon nitride plates containing a periodic array of holes in two directions. In principle, any other material could be used, as well. In particular, the group demonstrated that there is an optimal periodic structure, which minimizes the thermal conduction to a record low level, with a period of about 10 micrometers.

In addition, the researchers found that if the hole side surfaces could be fabricated with atomic precision, heat conduction could be reduced even further with larger-period structures.

"In the future, we will use these results to improve sensitive infrared radiation detectors for future space research, in collaboration with NASA", says professor Ilari Maasilta.

Credit: 
University of Jyväskylä - Jyväskylän yliopisto

Is nonmedical opioid use by adolescents associated with later risk of heroin use?

What The Study Did: This observational study used data from a survey of behavioral health that included students from 10 Los Angeles-area high schools to examine whether nonmedical prescription opioid use was associated with later risk of heroin use in adolescents.

Author: Adam M. Leventhal, Ph.D., of the University of Southern California in Los Angeles, is the corresponding author.

(doi:10.1001/jamapediatrics.2019.1750)

Editor's Note: The article contains funding/support and conflict of interest disclosures. Please see the article for additional information, including other authors, author contributions and affiliations, financial disclosures, funding and support, etc.

Credit: 
JAMA Network

GW pilot study finds collagen to be effective in wound closure

WASHINGTON (July 8, 2019) -- Collagen powder is just as effective in managing skin biopsy wounds as primary closure with non-absorbable sutures, according to a first-of-its-kind study published in the Journal of Drugs in Dermatology by a team of physician researchers at the George Washington University.

The team investigated the efficacy of topical collagen powder compared to primary closure on the rate and quality of full-thickness wound healing through histopathological analysis of healing and comparison of symptoms and early cosmetic outcomes.

Collagen is best known as an essential structural component of several organs, notably the skin, but it also plays a pivotal role as a signaling molecule in the regulation of all phases of wound healing. Externally applied collagen powder therefore has significant potential for wound healing and care: it can stop bleeding, recruit immune and skin cells central to wound healing, and stimulate new blood vessel formation, and it can be left in wounds without causing irritation or facilitating bacterial growth.

"During normal wound healing, collagen acts as a scaffold for cellular entry and growth in the wound bed and encourages the deposition of new collagen," said Adam Friedman, MD, interim chair of the Department of Dermatology at the GW School of Medicine and Health Sciences and an author on the study. "While collagen has been used as a wound healing adjuvant, a good comparison to the standard of wound care has been lacking."

Friedman's team administered two punch biopsies to eight volunteers and treated one wound with a daily topical collagen powder and the other with primary closure. Wounds were biopsied at four weeks for analysis and subjects were asked to rate the itch, pain, and treatment preferences throughout the process.

The researchers found that six out of eight collagen-treated wounds were completely healed four weeks after initial wounding, and all wounds were completely healed eight weeks after the second biopsy. Patients reported similar pain and itch for both primary closure and collagen, with most attributing the itch to the adhesive dressings rather than the wound itself. The team concluded that based on these findings, wounds treated with collagen powder healed at least as well as those treated with primary closure and that the powder can be applied safely for at least four weeks.

"Given the cost and time to place and remove sutures and the potential reimbursement for collagen, using topical collagen powder for punch biopsy wounds may be easier on the patient, not requiring an additional visit for suture removal and yielding an equivalent or possibly better wound healing outcome," Friedman said. "Moving forward we need to further examine the parameters of collagen use on wounds, including duration of therapy and wound sizes."

Credit: 
George Washington University

Ancient Saharan seaway shows how Earth's climate and creatures can undergo extreme change

image: During the Late Cretaceous-early Paleogene, the shallow waters of the Trans-Saharan Seaway were teeming with aquatic species ranging from small mollusks to giant sea snakes and catfish.

Image: 
© Carl Buell

A new paper to be published in the Bulletin of the American Museum of Natural History integrates 20 years of research by a diverse scientific team and describes the ancient Trans-Saharan Seaway of Africa that existed 50 to 100 million years ago in the region of the current Sahara Desert. The work, led by Maureen O'Leary, Professor of Anatomical Sciences at the Renaissance School of Medicine at Stony Brook University, is a comprehensive synthesis that contains the first reconstructions of extinct aquatic species in their habitats along the seaway and places in context the massive climate and sea level changes that can occur on Earth.

The region now holding the Sahara Desert was once under water, in striking contrast to the present-day arid environment. This dramatic difference in climate over time is recorded in the rock and fossil record of West Africa during a time range that extends through the Cretaceous-Paleogene (KPg) boundary. West Africa was bisected by a shallow saltwater body that poured onto continental crust during a time of high global sea level. The Bulletin paper involves an assessment and continued analysis of three expeditions led by Professor O'Leary (1999, 2003, and 2008) within rock exposures in the Sahara Desert in Mali, as well as the subsequent laboratory work on the fossil finds from the region.

"Fossils found on the expeditions indicate that the sea supported some of the largest sea snakes and catfish that ever lived, extinct fishes that were giants compared to their modern day relatives, mollusk-crushing fishes, tropical invertebrates, long-snouted crocodilians, early mammals and mangrove forests," explained Professor O'Leary, who is also a Research Associate in the Division of Paleontology, American Museum of Natural History. "Because the seaway changed in size and geography frequently, we propose that it may have resulted in 'islands of water' that stimulated species gigantism."

The paper contains the first reconstructions of ancient relatives of elephants and large apex predators such as sharks, crocodilians and sea snakes.

"With our analysis and new technologies, such as a computer-aided map of the seaway, our work is an important step toward increasing our understanding of the KPg boundary event, the time of non-avian dinosaur extinction," said Professor O'Leary.

She and colleagues point out that the paper places in context climate and sea level changes that can occur on Earth.

For example, scientists currently predict that global warming will result in the sea rising two meters by the end of the 21st century. The study in the Bulletin describes how, in the Late Cretaceous, the time under study, sea level rise far exceeded that which is predicted by human-induced climate change. In the Late Cretaceous sea level was 300m higher than present - 40 percent of current land was under water, which is very different from today. This information underscores the dynamic nature of Earth.

Professor O'Leary explained that scientists do not have detailed stratigraphic terrestrial/near shore sections with fossils on every continent to examine exactly how the KPg boundary unfolded globally. There is only one good nearshore or terrestrial section with vertebrate fossils in the western United States. The expeditions in Mali, she added, created a new section which, though imperfect and missing some of the earliest Paleogene, contributes to a better understanding of global events 50 to 100 million years ago.

The expeditions, spanning 20 years, involved Professor O'Leary and numerous international colleagues who excavated the fossils and conducted the research. The collaborative research team consists of paleontologists and geologists from the United States, Australia and Mali.

"Few paleontologists had worked the region, given its remoteness and scorching 125 degree F temperatures. The shifting sand dunes made it difficult to find rocky outcrops, and worse still, a flash rain storm flooded the roadways making navigation nearly impossible," said Leif Tapanila, PhD, Professor of Geosciences at Idaho State University and a co-author of the paper. "These expeditions could not have succeeded without the experience of local Malian drivers and guides, and I was amazed by the quality and diversity of marine fossils we found in the Sahara Desert."

Credit: 
American Museum of Natural History

Want to boost creativity? Try playing Minecraft

image: Study participants were split into groups with some playing Minecraft and others playing a race car video game or watching TV.

Image: 
Iowa State University News Service

AMES, Iowa - The next time you need to get the creative juices flowing, playing some types of video games may help.

Video games that foster creative freedom can increase creativity under certain conditions, according to new research from Iowa State University. The experimental study compared the effect of playing Minecraft, with or without instruction, to watching a TV show or playing a race car video game. Those given the freedom to play Minecraft without instruction were most creative.

"It's not just that Minecraft can help induce creativity. There seems to be something about choosing to do it that also matters," said Douglas Gentile, a professor of psychology.

If you are not familiar with the game, Gentile says Minecraft is like a virtual Lego world. The game, which has sold more than 100 million copies, allows players to explore unique worlds and create anything they can imagine. Study participants randomly assigned to play Minecraft were split into two groups. The one receiving instruction was told to play as creatively as possible.

After 40 minutes of play or watching TV, the 352 participants completed several creativity tasks. To measure creative production, they were asked to draw a creature from a world much different than Earth. More human-like creatures scored low for creativity and those less human-like scored high. Surprisingly, those instructed to be creative while playing Minecraft were the least creative.

Gentile says there's no clear explanation for this finding. In the paper published by Creativity Research Journal, he and his co-authors - lead author Jorge Blanco-Herrera, a former master's student in psychology, and Jeffrey Rokkum, a former Ph.D. student in psychology - outlined possible reasons why the instructed Minecraft group scored lower. Blanco-Herrera says the instructions may have changed subjects' motivation for play.

"Being told to be creative may have actually limited their options while playing, resulting in a less creative experience," Blanco-Herrera said. "It's also possible they used all their 'creative juices' while playing and had nothing left when it came time to complete the test."

Games teach creativity similar to aggression

Video games can have both harmful and beneficial effects. Gentile's previous research has shown the amount, content and context of video games influence what players learn through repeated experiences. While much of Gentile's research has focused on aggression or prosocial behavior, he says the same appears to be true for creativity.

Most video games encourage players to practice some level of creativity. For example, players may create a character and story for role-playing games or be rewarded for creative strategies in competitive games. The researchers say even first-person shooter games can potentially inspire creativity as players think about strategy and look for advantages in combat.

"The research is starting to tell a more interesting, nuanced picture. Our results are similar to other gaming research in that you get better at what you practice, but how you practice might matter just as much," Gentile said.

The researchers say based on these findings, it is important to not disregard the potential video games have as engaging and adaptive educational opportunities.

Credit: 
Iowa State University

Prediction tool from Kaiser Permanente researchers may identify patients at risk for HIV

Oakland, Calif. -- Researchers have developed a new analytical tool that identifies people at risk of contracting HIV so they may be referred for preventive medication. A study describing the tool was published July 5, 2019, in The Lancet HIV by investigators at Kaiser Permanente San Francisco, the Kaiser Permanente Division of Research, Beth Israel Deaconess Medical Center, and Harvard Medical School.

Looking at medical records of 3.7 million Kaiser Permanente patients, researchers developed a machine-learning algorithm to predict who would become infected with HIV during a 3-year period. The algorithm flagged 2.2% of all patients as high or very high risk; this group included nearly half the men who later became infected, a significant improvement from other published HIV risk prediction tools.

"In preexposure prophylaxis, or PrEP, we have an incredibly powerful tool to stop HIV transmission," said senior author Jonathan Volk, MD, MPH, an infectious disease physician who treats patients with HIV at Kaiser Permanente San Francisco Medical Center. PrEP is a daily antiretroviral pill that is more than 99% effective in preventing HIV.

"It is critical that we identify our patients at risk of HIV acquisition," Dr. Volk said. "We used our electronic medical record to develop a tool that could be implemented in a busy clinical practice to help providers identify patients who may benefit most from PrEP."

The Centers for Disease Control and Prevention estimates that just 7% of the people who could benefit from PrEP are taking it. Health care providers have difficulty identifying people at risk for HIV acquisition. Relying on the CDC's indications for PrEP -- sexual orientation and a history of sexually transmitted infections -- underestimates risk for some populations, including African Americans, who have relatively high HIV incidence and low PrEP use.

Finding a reliable and automated approach to predict which patients are at risk of HIV infection is a high priority for public health officials. The U.S. Preventive Services Task Force recently gave PrEP therapy its highest, grade A rating and urged researchers to develop tools to identify individuals at risk for HIV.

"Our predictive model directly addresses this gap and may be substantially more effective than current efforts to identify those who may be good PrEP candidates," Dr. Volk said. He emphasized that it does not replace the clinical judgment of medical providers but could save them time and address misconceptions about HIV risk.

The accuracy of the tool is possible because of Kaiser Permanente Northern California's comprehensive electronic health records, which track many demographic and clinical data points for its members. "Development of the tool required a setting like KP Northern California that had high-quality individual-level data on enough people to identify new HIV infections, which are rare events," explained Michael Silverberg, PhD, MPH, from the Kaiser Permanente Division of Research and co-author of the paper.

The investigators analyzed 81 variables in the electronic health record, finding 44 of them most relevant for predicting HIV risk. A tool that used these 44 variables identified 2.2% of the population as having a high or very high risk of HIV infection within 3 years. This high-risk group included 38.6% of all new HIV infections (46.4%, or 32 of 69, of the men diagnosed with HIV during the study period, but none of the 14 women).
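
The release does not specify the underlying algorithm, so purely as an illustration the sketch below uses an L1-regularized logistic regression on simulated data to show the general workflow described: fit a model to many candidate variables, let the penalty retain the informative ones, and flag roughly the top 2.2% of predicted risk. The model choice, simulated data and thresholds are assumptions, not the study's implementation.

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    # Simulated stand-in for EHR data: 81 candidate variables, rare outcome
    rng = np.random.default_rng(42)
    n_patients, n_vars = 10_000, 81
    X = rng.normal(size=(n_patients, n_vars))
    true_w = np.zeros(n_vars)
    true_w[:30] = rng.normal(size=30)                          # only some variables matter
    y = rng.binomial(1, 1.0 / (1.0 + np.exp(-(X @ true_w - 6.0))))

    # The L1 penalty keeps only informative variables (an assumption about the method)
    model = LogisticRegression(penalty="l1", solver="liblinear", C=0.1)
    model.fit(X, y)
    selected = np.flatnonzero(model.coef_[0])

    # Flag roughly the top 2.2% of predicted risk as "high or very high"
    risk = model.predict_proba(X)[:, 1]
    cutoff = np.quantile(risk, 0.978)
    high_risk = risk >= cutoff
    print(f"{selected.size} variables retained, {high_risk.mean():.1%} of patients flagged")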

The tool is limited, as are others, in identifying women at risk of contracting HIV. That's because risk for females may be dependent on the risk factors of their partners, which are not captured by the variables included in the tool, Dr. Volk explained. The tool also performs less well among patients for whom the electronic health record contains less data (because they are new enrollees or access care less frequently).

The authors compared their new tool with simpler models and found it identified more patients who acquired HIV. Importantly, simpler algorithms were less likely to identify African Americans who became infected, whereas the new tool performed well for both white and African American patients.

While access to a wide variety of patient information is an advantage for Kaiser Permanente, other health care organizations could build similar algorithms using fewer electronic health record variables, Dr. Volk said. The study found simpler models that included only 6 variables still helped identify patients at risk for HIV.

The tool could be incorporated in electronic health record systems to alert primary care providers to speak with patients most likely to benefit from discussions about PrEP. Clinicians could also take this opportunity to explain the availability of drug manufacturer and publicly funded programs that may cover all or part of the drug's copay cost. The risk thresholds established in this study flagged a small proportion (2.2%) but large number (13,463) of patients over a 3-year period as potential candidates for PrEP based on HIV risk. The study's authors note that there are no established HIV-risk thresholds for determining PrEP candidacy, and that few of the patients flagged as high risk in this study received PrEP during the study period.

"Embedding our algorithm in the electronic health record could support providers in discussing sexual health and HIV risk with their patients, ultimately increasing the uptake of PrEP and preventing new HIV infections," said lead author Julia L. Marcus, PhD, MPH, now at Harvard Medical School and Harvard Pilgrim Health Care Institute but formerly of the Kaiser Permanente Division of Research.

Credit: 
Kaiser Permanente

Researchers: Eggshells can help grow, heal bones

image: Gulden Camci-Unal, second from left, is leading a UMass Lowell team, including Xinchen Wu, that has determined eggshells inserted into a hydrogel mix can be used to strengthen bone grown in a laboratory for use in bone grafts and other procedures.

Image: 
(Photo Edwin Aguirre for UMass Lowell)

LOWELL, Mass. - Eggshells can enhance the growth of new, strong bones needed in medical procedures, a team of UMass Lowell researchers has discovered.

The technique developed by UMass Lowell could one day be applied to repair bones in patients with injuries due to aging, accidents, cancer and other diseases or in military combat, according to Assistant Prof. Gulden Camci-Unal, who is leading the study.

Through the innovative process, crushed eggshells are inserted into a hydrogel mixture that forms a miniature frame to grow bone in the laboratory to be used for bone grafts. To do so, bone cells would be taken from the patient's body, introduced into this substance and then cultivated in an incubator before the resulting new bone is implanted into the patient.

The research demonstrates that when eggshell particles - which are primarily made of calcium carbonate - are incorporated into the hydrogel mixture, they increase bone cells' ability to grow and harden, which could potentially result in faster healing. And, because the bone would be generated from cells taken from the patient, the possibility the individual's immune system would reject the new material is greatly reduced, according to Camci-Unal.

The process could also be used to help grow cartilage, teeth and tendons, she said.

"This is the first study that uses eggshell particles in a hydrogel matrix for bone repair. We have already filed a patent for it and are very excited about our results. We anticipate the process can be adapted for use in many significant ways," said Camci-Unal, adding that one day, eggshell particles could also serve as a vehicle to deliver proteins, peptides, growth factors, genes and medications to the body.

UMass Lowell students participating in the research include biomedical engineering and biotechnology Ph.D. candidates Sanika Suvarnapathaki and Xinchen Wu of Lowell, along with Darlin Lantigua of Lawrence. Wu was the lead author of the team's research findings, which have been published in the academic journal Biomaterials Science and will be featured on the cover of the publication's print edition this month.

Using eggshells to support bone growth provides a sustainable way to reuse them while advancing the technology behind these procedures, according to the researchers.

"Global waste of discarded eggshells typically amounts to millions of tons annually form household and commercial cooking. By repurposing them, we can directly benefit the economy and the environment while providing a sustainable solution to unmet clinical needs," Camci-Unal said.

This is not the first time Camci-Unal has used an unconventional approach to design new materials for biomedical engineering. Last year, she and her team used the principles behind origami - the ancient Japanese art of paper folding - as inspiration to build tiny 3D structures where biomaterials can be grown in the lab to create new tissues.

Credit: 
University of Massachusetts Lowell

Knowing BRCA status associated with better breast cancer outcomes even without surgery

image: Rachel Rabinovitch, MD, and colleagues show that overall survival is greater in Ashkenazi Jewish women who know their BRCA status, even if women choose against prophylactic mastectomy.

Image: 
University of Colorado Cancer Center

Ashkenazi Jewish women have a 1-in-40 chance of carrying the BRCA mutation, and these BRCA-positive women have an 80 percent lifetime risk of developing breast or ovarian cancer. A study by the University of Colorado Cancer Center and Shaarei Zedek Medical Center, Israel, presented at the European Society of Human Genetics Annual Meeting 2019, shows the importance of healthy women knowing their BRCA status, even when they choose not to undergo prophylactic mastectomy. The study compared 63 Ashkenazi Jewish women who were unaware of their BRCA+ status at the time of breast cancer diagnosis with 42 women who knew they were BRCA+ before their diagnosis (but had decided against surgical prevention): the women who knew their BRCA+ status were diagnosed with earlier stage breast cancer, needed less chemotherapy and less extensive axillary surgery, and had greater overall 5-year survival (98 percent vs. 74 percent).

"The problem is that genetic screening for BRCA by a saliva or blood test is not recommended by any medical body or health care organization in healthy Ashkenazi Jewish women without a strong family history. Therefore, the way women usually find out they have the BRCA gene is only after they are diagnosed with breast cancer, at which time we've lost the opportunity to offer surgery that could prevent breast cancer or start high-risk breast cancer screening at an age young enough to detect cancers earlier," says Rachel Rabinovitch, MD, FASTRO, CU Cancer Center investigator and professor in the CU School of Medicine Department of Radiation Oncology.

The BRCA genes, BRCA1 and BRCA2, are tumor-suppressor genes, meaning that their function is to identify and correct genetic mistakes that could lead to cancer. When these genes become disabled by mutation, they can no longer act against tumors, leading to a much greater chance of developing breast and ovarian cancers. In the general population, about 1 in 400 women carry BRCA mutations, which most health experts feel is too rare to recommend the wide use of screening (which leads insurance companies to not cover the test). However, for Ashkenazi Jewish women, the risk is ten times higher, leading to calls for increased screening in this population. The test itself is relatively simple, checking for three specific mutations and costing less than $200.

"We found that women who knew they were BRCA positive and chose to keep their breasts were much more likely to be diagnosed with noninvasive breast cancer, earlier stage invasive breast cancer, and need less morbid cancer therapy; but most importantly their survival was better," Rabinovitch says, suggesting that results argue for routine BRCA screening in women of Ashkenazi Jewish descent.

"However even among those of us who did the study, we do not agree on when is the best age for genetic screening," Rabinovitch says. "My colleagues who live in Israel suggest it should be done at age 30 (the age at which significant cancer risk begins), but I think that's too late."

The reason has to do with the ability to screen embryos for BRCA mutations and to choose to implant only the embryos without BRCA mutation.

"Testing at age 30 is useful for the woman - it's before she is likely to develop breast or ovarian cancer and so offers her the opportunity for surgical prevention or increased cancer screening. However, by that age, many women have made the decision to get pregnant, and you've deprived them of the option to undergo procedures which would prevent passing on the BRCA mutation to their children. While not all women will choose this, I think they should be given the opportunity to do so," Rabinovitch says.

Another challenge with early BRCA testing is the social stigma in some communities that can be associated with a positive finding. "When do you share with a partner your BRCA status? Would this impact a partner's interest in marriage or having children? These legitimate and important questions are among the consequences of genetic testing," Rabinovitch explains.

Overall, Rabinovitch advises Ashkenazi Jewish women (and men) to be screened for the BRCA mutation "if they are willing to deal with the consequences of that information." If not, Rabinovitch points out, "it can result in a whole lot of anxiety without much benefit."

Study results add to a growing body of evidence that Rabinovitch hopes will eventually be used to guide new BRCA screening recommendations in the United States and Israel for healthy Ashkenazi Jews and other high-risk populations.

Credit: 
University of Colorado Anschutz Medical Campus

Problematic smartphone use linked to poorer grades, alcohol misuse, more sexual partners

A survey of more than 3,400 university students in the USA has found that one in five respondents reported problematic smartphone use. Female students were more likely to be affected, and problematic smartphone use was associated with lower grade averages, mental health problems and higher numbers of sexual partners.

Smartphones offer the potential of instant, round-the-clock access for making phone calls, playing games, gambling, chatting with friends, using messenger systems, accessing web services (e.g. websites, social networks and pornography), and searching for information. The number of users is rapidly increasing, with some estimates suggesting that there are now more than 2.7 billion users worldwide.

While most people using smartphones find them a helpful and positive part of life, a minority of users develop excessive smartphone use, meaning that smartphone use has significant negative effects on how people function in life. Previous research has linked excessive smartphone use to mental health issues such as anxiety, depression, post-traumatic stress disorder (PTSD), attention deficit hyperactivity disorder (ADHD) and problems with self-esteem.

A collaborative team of researchers from the University of Chicago, University of Cambridge, and the University of Minnesota, developed the Health and Addictive Behaviours Survey to assess mental health and well-being in a large sample of university students. They used the survey to investigate the impact of smartphone use on university students. Just over a third (3,425) of students invited to take the test responded. The results are published today in the Journal of Behavioral Addictions.

The self-report survey consisted of 156 questions. Based on their responses, the students were given a score ranging from 10 to 60, with a score of 32 and above being defined as problematic smartphone use. This definition was based on a threshold recommended previously in clinical validation studies using the scale. Typical characteristics of problematic use include: excessive use; trouble concentrating in class or at work due to smartphone use; feeling fretful or impatient without their smartphone; missing work due to smartphone use; and experiencing physical consequences of excessive use, such as light-headedness or blurred vision.

The researchers found that one in five (20%) of respondents reported problematic smartphone use.

Problematic smartphone use was greater among female than male students - 64% of problem users were female. Importantly, the researchers found a link between problematic smartphone use and lower grade point averages (academic achievement).

"Although the effect of problematic smartphone use on grade point averages was relatively small, it's worth noting that even a small negative impact could have a profound effect on an individual's academic achievement and then on their employment opportunities in later life," said Professor Jon Grant from the Department of Psychiatry & Behavioral Neuroscience at the University of Chicago.

While students reporting problematic smartphone use tended be less sexually active than their peers (70.9% compared to 74%), the proportion of students reporting two or more sexual partners in the past 12 months was significantly higher among problem users: 37.4% of sexually-active problematic smartphone users compared with 27.2% sexually-active students who reported no problem use. The proportion with six or more sexual partners was more than double among sexually-active problematic smartphone users (6.8% compared to 3.0%).

"Smartphones can help connect people and help people feel less isolated, and our findings suggest that they may act as an avenue for sexual contact, whether through sustained partnerships or more casual sex," added Dr Sam Chamberlain, Wellcome Trust Clinical Fellow and Honorary Consultant Psychiatrist from the Department of Psychiatry at the University of Cambridge and the Cambridge & Peterborough NHS Foundation Trust.

The researchers found that alcohol misuse was significantly higher in those with problematic smartphone use compared to the control group. To assess this, the team used a scale known as the Alcohol Use Disorders Identification Test: a score of eight or above indicates harmful alcohol use. 33.3% of problematic smartphone users scored eight or above compared to 22.5% of other smartphone users. The researchers found no significant link with any other form of substance abuse or addiction.
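
As a toy illustration of the threshold rules described above (a survey score of 32 or more defining problematic smartphone use, and an AUDIT score of eight or more indicating harmful alcohol use), the sketch below groups invented scores and compares the share of harmful drinking across the two groups. The data and column names are made up for the example.

    import pandas as pd

    # Invented scores for illustration only
    df = pd.DataFrame({
        "smartphone_score": [18, 35, 41, 22, 30, 50, 12, 33],
        "audit_score":      [ 3, 10,  7,  5,  9, 12,  2,  8],
    })

    df["problematic_use"] = df["smartphone_score"] >= 32   # survey threshold from the study
    df["harmful_drinking"] = df["audit_score"] >= 8        # AUDIT threshold from the study

    # Share of harmful drinking within each smartphone-use group
    print(df.groupby("problematic_use")["harmful_drinking"].mean())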

In terms of other mental health problems, the researchers found that problematic smartphone use was significantly associated with lower self-esteem, ADHD, depression, anxiety, and PTSD, mirroring similar findings elsewhere.

"It's easy to think of problematic smartphone use as an addiction, but if it was that simple, we would expect it to be associated with a wide range of substance misuse problems, especially in such a large sample, but this does not seem to be the case," added Dr Sam Chamberlain.

"One possible explanation for these results is that people develop excessive smartphone use because of other mental health difficulties. For example, people who are socially isolated, those who experience depression or anxiety, or those who have attention problems (as in ADHD) may be more prone to excessive smartphone use, as well as to using alcohol. Smartphone use likely develops earlier in life - on average - than alcohol use problems and so it is unlikely that alcohol use itself leads to smartphone use."

While the sample size for this study was relatively large, suggesting that the findings should be fairly robust, the researchers point out that it was a cross-sectional study (one that takes a 'snapshot' at one particular time, rather than following people over a longer period), and so the direction of causality cannot be established. In other words, the study cannot say that problematic smartphone use leads to mental health issues or vice versa.

The researchers point out the effect sizes were also generally small, and that more research is needed into positive and negative effects of smartphone use and mental health, including how this changes over time.

Credit: 
University of Cambridge

Transformed tobacco fields could cut costs for medical proteins

video: 'If we're not using that cropland to make cigarettes anymore, then perhaps we could use that cropland to make enzymes.' Beth Ahner, professor of biological and environmental engineering at Cornell's College of Agriculture and Life Sciences, shows how tobacco plants can be used to make functional proteins for the manufacture of products such as denim, laundry detergent, paper and ethanol.

Image: 
John Munson, Cornell University

ITHACA, N.Y. - A new Cornell University-led study describes the first successful rearing of engineered tobacco plants in the field to produce medical and industrial proteins outdoors - a necessity for economic viability, since the plants must be grown at large scales.

The market for such biologically derived proteins is forecast to reach $300 billion in the near future. Industrial enzymes and other proteins are currently made in large, expensive fermenting reactors, but making them in plants grown outdoors could cut production costs roughly threefold.

Researchers from Cornell and University of Illinois have engineered plants capable of making proteins not native to the plant itself. "We knew these plants grew well in the greenhouse, but we just never had the opportunity to test them out in the field," said Beth Ahner, professor of biological and environmental engineering and senior author of "Field-grown tobacco plants maintain robust growth while accumulating large quantities of a bacterial cellulase in chloroplasts," published in the journal Nature Plants.

That opportunity came when University of Illinois plant biology professor Stephen Long obtained a permit from the U.S. Department of Agriculture to grow the genetically modified plants in the field.

Conventional wisdom suggested that the burden of asking plants to turn 20 percent of the proteins they have in their cells into something the plant can't use would greatly stunt growth.

"When you put plants in the field, they have to face large transitions, in terms of drought or temperature or light, and they're going to need all the protein that they have," Ahner said. "But we show that the plant still is able to function perfectly normally in the field [while producing nonnative proteins]. That was really the breakthrough."

Jennifer Schmidt, a graduate student in Ahner's lab, and Justin McGrath, a research scientist in Long's lab, are co-first authors of the paper. Maureen Hanson, the Liberty Hyde Bailey Professor in the Department of Molecular Biology and Genetics, is also a co-author.

Credit: 
Cornell University

Grazing animals drove domestication of grain crops

image: Large grazing animals exert a strong selective force on plants, and certain plants have evolved traits to thrive on pastoral landscapes. Spengler and Mueller theorize that yak herding may have helped drive buckwheat domestication in the southern Himalaya. This lone yak in the Lhasa region of Tibet is a significant evolutionary force on the plant communities where it grazes.

Image: 
Robert Spengler

Many familiar grains today, like quinoa, amaranth, the millets, hemp, and buckwheat, have traits that indicate they coevolved to be dispersed by large grazing mammals. During the Pleistocene, massive herds directed the ecology across much of the globe and caused evolutionary changes in plants. Studies of the ecology and growing habits of certain ancient crop relatives indicate that megafaunal herds were necessary for the dispersal of their seeds prior to human intervention. Understanding this process is providing scientists with insights into the early domestication of these plants.

The domestication of small-seeded annuals involved an evolutionary switch from dispersal through animal ingestion to human dispersal. Those are the findings of a new study by Robert Spengler, director of the Paleoethnobotany Laboratories at the Max Planck Institute for the Science of Human History, and Natalie Mueller, a National Science Foundation fellow at Cornell University, published in Nature Plants. Spengler and Mueller demonstrate, by looking at rangeland ecology and herd-animal herbivory patterns, that the progenitors of small-seeded crops evolved to be dispersed by megafaunal ruminants. Although today the wild varieties of these species grow in small, isolated patches, the researchers illustrate that heavy grazing of these plants by herd animals causes dense patches to form near rivers or other areas that the animals frequent. In ancient times, these dense patches of plants could have easily been harvested, just like modern farmers' fields - explaining how and why ancient people might have focused on these specific plants. This study provides an answer for this long-standing mystery of plant domestication.

Small-seeded crops are products of another age

During the mid-Holocene (7,000-5,000 years ago), in ecologically rich river valleys and grasslands all around the world, people started to cultivate small plants for their seed or grain. Wheat, barley, and rice are some of the earliest plants to show signs of domestication and scientists have extensively studied the domestication process in these large-seeded cereal crops. Researchers know significantly less about the domestication of small-seeded grain crops, such as quinoa, amaranth, buckwheat, the millets, and several now-lost crops domesticated in North America. The wild ancestors of these crops have small seeds with indigestible shells or seed coats. Today, these wild plants exist in small fragmentary patches dispersed across huge areas - the fact that they do not grow in dense clusters, like the ancestors of wheat and rice, would seem to have made these crop ancestors unappealing targets for foragers. The small seed sizes and hard seed shells, combined with the lack of dense wild populations, led many researchers to argue that they must have been a famine food.

Foraging enough wild seeds from these varieties to grind into flour to bake a loaf of bread would take weeks, especially for rarer or endangered crop ancestors. So why did early foragers focus so heavily on these plants and eventually adopt them as crops?

Spengler and Mueller present a new model, suggesting that when humans first encountered these plants, they would have grown in dense stands created by grazing megafauna, making them easy to harvest. As humans began to cultivate these plants, they took on the functional role of seed dispersers, and eventually the plants evolved new traits to favor farming and lost the old traits that favored being spread by herd animals. The earliest traits of domestication - thinning or loss of indigestible seed protections, loss of dormancy, and increased seed size - can all be explained by the loss of the ruminant dispersal process and the concomitant human management of wild stands.

A novel model for the domestication of small-seeded grain crops

Spengler and Mueller have been interested in plant domestication since graduate school, when they studied under Dr. Gayle Fritz, one of the first scholars to recognize the importance of the American Midwest as a center of crop domestication. Despite decades of research into the nature of plant domestication in North America, no one recognized that the true key was the massive bison herds. The plants that were domesticated, what Mueller calls the "Lost Crops," would have been dispersed by bison in large swaths, making them easy to collect by ancient people and perhaps encouraging these communities to actively plant them themselves. When Europeans exterminated the herds, the plants that relied on these animals to disperse their seeds began to diminish as well. Because the wild ancestors of these lost crops are rare today and the bison herds are effectively extinct, researchers have overlooked this important coevolutionary feature in the domestication process.

However, this process is not unique to the American Midwest and the researchers suggest that there may be links between buckwheat domestication and yak herding in the Himalaya and amaranth domestication and llama herding in the Andes. The authors have identified parallel patterns in rangeland ecology studies, noting that heavy herd animal herbivory can homogenize vegetation communities. For example, heavy pastoralist grazing in the mountains of Central Asia causes many plants to die, but certain plants with adaptations for dispersal by animals thrive. The depositing of plant seeds in nutrient rich dung leads to ecological patches, often called hot spots, that foragers can easily target for seed collecting.

For over a century, scholars have debated why early foragers targeted small-seeded annuals as a major food source (eventually resulting in their domestication). Today, the progenitors of many of these crops have highly fragmentary populations and several are endangered or extinct. Likewise, without large dense homogenous stands of these plants in the wild, such as what exists in the wild for the progenitors of large-seeded cereal crops, it would have been impossible to harvest their seeds. The conclusions that Spengler and Mueller draw help explain why people targeted these plants and were able to domesticate them. "Small-seeded annuals were domesticated in most areas of the world," explains Spengler. "So the ramifications of this study are global-scale. Scholars all over the world will need to grapple with these ideas if they want to pursue questions of domestication."

Spengler and Mueller are continuing their research into the role that grazing animals played in plant domestication. "Currently, we're studying the ecology of fields where modern herd animals graze as proxies to what the ecology would have looked like during the last Ice Age, when large herds of bison, mammoths, and wooly horses dictated what kinds of plants could grow across the American Midwest and Europe," explains Spengler. "We hope these observations will provide even greater insight into the process of domestication all over the world."

Credit: 
Max Planck Institute of Geoanthropology

Indian Ocean causes drought and heatwaves in South America

image: The marine heatwave that accompanied the South American drought of 2013/14 was one of the strongest ever recorded for the area. Trends show that these marine heatwaves are getting longer, more intense and cover larger areas.

Image: 
Dr. Regina Rodrigues

New research has found that the record-breaking South American drought of 2013/14, with its succession of heatwaves and long-lasting marine heatwave, had its origins in a climate event half a world away - over the Indian Ocean.

The findings published in Nature Geoscience by an international research team with authors from the Federal University of Santa Catarina in Brazil, Australia's ARC Centre of Excellence for Climate Extremes and NOAA in the US suggest this may not have been the first time the Indian Ocean has brought extraordinary heat to the region.

It all started with strong atmospheric convection over the Indian Ocean that generated a powerful planetary wave that travelled across the South Pacific to the South Atlantic where it displaced the normal atmospheric circulation over South America.

You can think of these atmospheric waves as being similar to an ocean swell generated by strong winds that travel thousands of kilometres from where they were generated. Large-scale atmospheric planetary waves form when the atmosphere is disturbed and this disturbance generates waves that travel around the planet.

"The atmospheric wave produced a large area of high pressure, known as a blocking high, that stalled off the east coast of Brazil," said lead author Dr Regina Rodrigues.

"The impacts of the drought that followed were immense and prolonged, leading to a tripling of dengue fever cases, water shortages in São Paulo, and reduced coffee production that led to global shortages and worldwide price increases."

That impact wasn't just felt on land as the high-pressure system stalled over the ocean.

"Highs are associated with good weather. This means clear skies - so more solar energy going into the ocean - and low winds - so less ocean cooling from evaporation."

"The result of this blocking high was an unprecedented marine heatwave that amplified the unusual atmospheric conditions and likely had an impact on local fisheries in the region."

The researchers found this atmospheric wave was not an isolated event and that strong convection far away in the Indian Ocean had previously led to drought impacts in South America.

"Using observations from 1982 to 2016, we noticed an increase not only in frequency but also in duration, intensity and area of these marine heatwave events. For instance, on average these events have become 18 days longer, 0.05°C warmer and 7% larger per decade." said CLEX co-author Dr Andrea Taschetto.

The 2013/14 South American drought and marine heatwave is the latest climate case study to show how distant events in one region can have major climate impacts on the other side of the world.

"Researchers found that Australia's 2011 Ningaloo Nino in the Indian Ocean, which completely decimated coastal ecosystems and impacted fisheries, was caused by a La Niña event in the tropical Pacific," said Australian co-author Dr Alex Sen Gupta.

"Here we have yet another example of how interconnected our world is. Ultimately, our goal is to understand and use these complex remote connections to provide some forewarning of high impact extreme events around the world."

Credit: 
University of New South Wales