Culture

God doesn't play dice -- does cancer?

image: James DeGregori, Ph.D., and colleagues show that much of what we think we know about the causes of cancer is wrong.

Image: 
University of Colorado Cancer Center

The saying "God doesn't play dice" is meant to suggest that nothing happens by chance. On the other hand, cancer seems like the ultimate happenstance: Don't we all have a 43-year-old, vegan, triathlete friend fighting cancer? Does this mean that cancer plays dice? According to the traditional model of how cancer develops, yes: Every time a cell divides, you roll a die, and the more years you roll, the greater your chance of rolling an unfortunate mutation that causes cancer. Some young people get very unlucky and some older people get very lucky, but overall, the longer you live, the more times you roll the die, the greater your risk of developing cancer. It makes perfect sense.

Only, a University of Colorado Cancer Center study published in the journal eLife points out a simple problem with this model: Many cancers require more than one activating mutation. In other words, not only one but multiple unlikely bad things have to happen to cause cancer. Think of this like rolling multiple dice, or perhaps like rolling an unlucky number on a single die, multiple times. Say you're rolling a 100-sided die with "42" being a cancer-causing mutation. You would expect it to take longer to roll four 42s than it does to roll one 42, right?

But the current study shows that no matter the number of unlucky events needed to cause a specific kind of cancer, cancer risk rises equally with age. On average, it takes only one mutation to cause mesothelioma, four mutations to cause pancreatic cancer, and eleven mutations to cause colorectal cancer. But despite the dramatically different Vegas odds of "rolling" one, four and eleven "42s," the incidence of these cancers goes up uniformly with age, accelerating from about age 60 to about age 85. In this case, it does not, in fact, take longer to roll eleven 42s than it takes to roll one 42.
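The traditional model's prediction is easy to simulate. The sketch below is hypothetical code, not from the study: it rolls a 100-sided die and averages how many rolls it takes to accumulate one, four, or eleven 42s. Under pure chance, the waiting time grows roughly linearly with the number of required hits - which is exactly the spread in onset age that the incidence data fail to show.

```python
import random

def rolls_until_k_hits(k, sides=100, hit=42):
    """Roll a fair `sides`-sided die until `hit` has come up k times;
    return how many rolls that took."""
    rolls, hits = 0, 0
    while hits < k:
        rolls += 1
        if random.randint(1, sides) == hit:
            hits += 1
    return rolls

random.seed(0)
trials = 2000  # simulated "lives" per case
mean_rolls = {k: sum(rolls_until_k_hits(k) for _ in range(trials)) / trials
              for k in (1, 4, 11)}
for k, mean in mean_rolls.items():
    print(f"mutations needed: {k:2d}  average rolls: {mean:7.1f}")
```

With 2,000 simulated "lives" per case, the averages come out near 100, 400 and 1,100 rolls respectively - a roughly elevenfold spread that real cancer incidence curves do not exhibit.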

The evidence against the traditional model of oncogenesis gets curiouser and curiouser.

Some cancers arise from giant pools of stem cells, while other cancers arise from very small stem cell pools. These larger stem cell pools are like rolling far more dice - sometimes thousands of times more dice - and when rolling more dice, you would expect to get the needed number of unfortunate 42s much more quickly than if rolling fewer dice. This means that cancers that grow from large stem cell pools should arise much earlier in life. They do not. Like the number of activating mutations, no matter the size of the stem cell pool, the incidence of most cancers rises uniformly with age. (Also, whales have millions of times more cells than do mice, and yet cancer is no more common in whales than it is in mice - this lack of correlation between number of cells and cancer incidence is called Peto's Paradox.)

Again, the point is that according to the traditional model of oncogenesis, it should take you longer to randomly generate four cancer-causing mutations than it takes you to generate one. And if you are rolling dice for a billion stem cells, generating these mutations should take less time than if you are rolling dice for a million stem cells. In reality, neither is true: Neither the number of activating mutations nor the size of the stem cell pool affects the age at which people develop these different kinds of cancer.

"What was really striking to us is that if you normalize all the cancer incidences, they all fall on top of each other - for different cancers that require such a different number of drivers and that arise from stem cell pools that range thousands of times in size, incidence is the same across age," says CU Cancer Center Deputy Director, James DeGregori, PhD. "What this means is that, mathematically, one cannot account for carcinogenesis simply with mutation accumulation over age."

In other words, something other than the traditional model of oncogenesis is in play.

It seems somehow fitting that DeGregori's answer to the questions of God and cancer playing dice comes from the field of evolution. And his methodology comes from, well, playing dice.

"Essentially what we show is that a logical way to get these patterns is to assume that selection for these mutations is disadvantageous in youth, but becomes advantageous later in life," DeGregori says. Lead author of the current study, Andrii Rozhok, PhD, demonstrated this differential selection by using a mathematical technique called Monte Carlo modeling, which, appropriately named after the gambling mecca, is basically a way to incorporate randomness into prediction models.

What DeGregori means is that selection pressure acting on healthy cells and cancer cells competing in the ecosystem of the body - and not necessarily random, cancer-causing mutations - in large part decides who does and does not get cancer. In young people, cancer-causing mutations make a cell less fit for the tissue ecosystem; in older people, cancer-causing mutations make a cell more fit for the new, age-altered tissue ecosystem.

"If you think about the classic example of evolution, before the Industrial Revolution, it was advantageous for moths to be fairly light colored to match light-colored lichens on English trees. In pre-Industrial England, some moths were born black, but they were selected against. It was only when soot covered the trees that being dark colored became advantageous, allowing dark moths to out-compete light ones," DeGregori says.

Like black moths, cancer cells may pop up from time to time in a young, healthy body, but they are generally selected against. It is only when old age, smoking (soot!), UV exposure or other carcinogenic factors adjust the tissue ecosystem that the black moths of cancer cells suddenly find themselves most fit.

"If these mutations are disadvantageous, they are disadvantageous in both large and small stem cell pools. And in terms of cancer risk, it matters less at what age you pick up the one or two or four needed cancer-causing mutations than at what age these mutations become advantageous," DeGregori says.

DeGregori calls this model of cancer driven by natural selection Adaptive Oncogenesis. And, in fact, cancer is not the only bad actor that takes advantage of tissue alterations that we experience at older ages.

"If you graph out heart disease, kidney disease, basically all the bad stuff that starts happening to us late in life, they all display similar patterns as cancer incidence. In this way, we can think of cancer as being cut from the same cloth as other diseases of aging - the same tissue changes that increase the risk of many other diseases make tissues more hospitable to cells with cancer-causing mutations," DeGregori says.

Basically, you get old and then you die. But it's not all bad news!

"The flip side is that youth is powerfully tumor suppressive," DeGregori says. "We are each composed of 30-40 trillion cells and we maintain this cooperative structure for more than half a century with minimal risk of cancer, even with all the exposures, all the cell divisions. It all comes down to how natural selection has invested in our bodies, simply as a means of maximizing return on investment - of investing in ensuring we reach reproductive age. Cancer reflects the waning of selection to maintain us."

Does God play dice? Ask the theologians and philosophers. As for cancer, it seems that it does play dice - chance and randomness certainly play major roles in deciding which individuals get cancer. But the dice are loaded. On a population level, what really affects cancer risk is not the fact of rolling a 42, but how well adapted these dangerous number-42 cells are to their environment. In young, healthy tissues, 42 cells lose. In older tissues, they win.

Credit: 
University of Colorado Anschutz Medical Campus

New study maps how ocean currents connect the world's fisheries

A new study published in the journal Science finds that the world's marine fisheries form a single network, with over $10 billion worth of fish each year being caught in a country other than the one in which they spawned.

While fisheries are traditionally managed at the national level, the study reveals the degree to which each country's fishing economy relies on the health of its neighbors' spawning grounds, highlighting the need for greater international cooperation.

Led by researchers at the University of California, Berkeley, the London School of Economics, and the University of Delaware, the study used a particle tracking computer simulation to map the flow of fish larvae across national boundaries. It is the first to estimate the extent of larval transport globally, putting fishery management in a new perspective by identifying hotspots of regional interdependence where cooperative management is needed most.

"Now we have a map of how the world's fisheries are interconnected, and where international cooperation is needed most urgently to conserve a natural resource that hundreds of millions of people rely on," said co-author Kimberly Oremus, assistant professor at the University of Delaware's School of Marine Science and Policy.

The vast majority of the world's wild-caught marine fish, an estimated 90%, are caught within 200 miles of shore, within national jurisdictions. Yet even these fish can be carried far from their spawning grounds by currents in their larval stage, before they're able to swim. This means that while countries have set national maritime boundaries, the ocean is made up of highly interconnected networks where most countries depend on their neighbors to properly manage their own fisheries. Understanding the nature of this network is an important step toward more effective fishery management, and is essential for countries whose economies and food security are reliant on fish born elsewhere.

The authors brought together their expertise in oceanography, fish biology, and economics to make progress on this complex problem.

"Data from a wide range of scientific fields needed to come together to make this study possible," said lead author Nandini Ramesh, a post-doctoral researcher in the Department of Earth and Planetary Science at the University of California, Berkeley. "We needed to look at patterns of fish spawning, the life cycles of different species, ocean currents, and how these vary with the seasons in order to begin to understand this system." The study combined data from satellites, ocean moorings, ecological field observations, and marine catch records, to build a computer model of how eggs and larvae of over 700 species of fish all over the world are transported by ocean currents.
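The core idea of particle tracking can be caricatured in a few lines. The 1-D sketch below is purely illustrative - the current speed, larval duration, eddy diffusion and boundary distance are invented numbers, and the real model uses global ocean-current data for over 700 species - but it shows the mechanism: advect each larva by a mean current plus random eddies, then count how many settle beyond a national boundary.

```python
import random

def simulate_larval_drift(n_larvae=10000, days=30, current_km_per_day=5.0,
                          diffusion_km=5.0, boundary_km=200.0, seed=1):
    """Toy 1-D particle-tracking sketch: larvae spawned at x=0 drift with a
    mean current plus random daily eddy 'diffusion' for their larval
    duration. Returns the fraction that settles beyond a neighbouring
    country's maritime boundary. All numbers are illustrative."""
    rng = random.Random(seed)
    crossed = 0
    for _ in range(n_larvae):
        x = 0.0
        for _ in range(days):
            x += current_km_per_day + rng.gauss(0.0, diffusion_km)
        if x > boundary_km:
            crossed += 1
    return crossed / n_larvae

print(f"{simulate_larval_drift():.1%} of larvae settle across the boundary")
```

Even with a mean drift that falls short of the boundary, random eddies carry a small but economically meaningful fraction of larvae into the neighbour's waters - and a stronger current sends nearly all of them across.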

The research shows that ocean regions are connected to each other in what's known as a "small world network", the same phenomenon that allows strangers to be linked by six degrees of separation. That adds a potential new risk: threats in one part of the world could result in a cascade of stresses, affecting one region after another.

"We are all dependent on the oceans," said co-author James Rising, assistant professorial research fellow at the Grantham Research Institute in the London School of Economics. "When fisheries are mismanaged or breeding grounds are not protected, it could affect food security half a world away."

A surprising finding of the study was how interconnected national fisheries are, across the globe. "This is something of a double-edged sword," explained lead author Ramesh, "On one hand, it implies that mismanagement of a fishery can have negative effects that easily propagate to other countries; on the other hand, it implies that multiple countries can benefit by targeting conservation and/or management efforts in just a few regions."

"By modeling dispersal by species, we could connect this ecosystem service to the value of catch, marine fishing jobs, food security and gross domestic product," Oremus added. "This allowed us to talk about how vulnerable a nation is to the management of fisheries in neighboring countries."

They found that the tropics are especially vulnerable to this larval movement--particularly when it comes to food security and jobs.

"Our hope is that this study will be a stepping stone for policy makers to study their own regions more closely to determine their interdependencies," said Ramesh. "This is an important first step. This is not something people have examined before at this scale."

Credit: 
University of Delaware

The richer the pickings, the more honest the people

The more money there is in a lost wallet, the more likely it is to be returned to its owner, researchers from the Universities of Zurich, Michigan and Utah show in a global study. They explain this surprising result by the fact that dishonest finders have to adapt their self-image, which carries psychological costs that can exceed the material value of the wallet.

The classic economic model predicts that individuals will typically keep a lost wallet. The financial incentive to keep the wallet is particularly large if it contains a large sum of money. However, this assumption is refuted by a recent study. In 355 cities in 40 countries, the research team investigated what causes people to return a wallet to its owner. To this end, they handed in more than 17,000 apparently lost wallets at the receptions of various institutions such as hotels, banks, museums, post offices or police stations.

The researchers examined four factors that influence the decision to return the wallet: The monetary incentive to keep the money, the effort involved in contacting the owner, altruistic considerations about the welfare of the owner, and what is known as the "psychological costs of dishonest behavior". The latter are caused by the fact that keeping a lost wallet is often perceived as theft, and the finder has to adapt their self-image.

Preserving one's self-image as an honest person

The researchers were able to show that these psychological costs, i.e. the preservation of one's self-image as an honest person, can explain the finders' behavior. "People want to see themselves as an honest person, not as a thief. Keeping a found wallet means having to adapt one's self-image, which comes with psychological costs," explains Michel Maréchal, professor of economics at the UZH Department of Economics. Participants in an additional survey confirmed that the more money there was in a lost wallet, the more likely not returning it was to be classified as theft, raising the psychological costs of dishonest behavior.
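The decision mechanism the researchers describe can be caricatured as a cost-benefit rule. In the toy sketch below, the functional form and parameters are invented for illustration and are not taken from the study: a finder keeps the wallet only when the cash outweighs the psychological cost of self-identifying as a thief, and if that cost grows faster than the cash does, larger sums make returns more likely.

```python
def returns_wallet(cash_usd, aversion=0.5, exponent=1.2):
    """Toy decision rule for the 'psychological cost' account.
    A finder keeps the wallet only when the cash exceeds the cost of
    re-labelling themselves a thief; that cost is assumed (purely for
    illustration) to grow superlinearly with the amount, since keeping
    more money is more clearly perceived as theft."""
    psychological_cost = aversion * cash_usd ** exponent
    return psychological_cost >= cash_usd  # True: the wallet is returned

for cash in (5, 50, 500):
    print(f"${cash}: {'returned' if returns_wallet(cash) else 'kept'}")
```

With these illustrative parameters, small amounts are pocketed while anything above a few tens of dollars is returned - the direction of the effect the study observed, though of course real finders are not solving an equation.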

The wallets contained a business card, a shopping list, a key and a varying amount of money. A key is only of value to the owner, but not to the finder. To measure altruistic concerns, the researchers also handed in some wallets without a key. A wallet with money, but no key was less likely to be returned than a wallet with the same amount of money and a key. Based on this finding, the researchers conclude that altruistic considerations play a further, albeit subordinate, role when returning wallets.

Less selfish than expected

Although reality disproves the economic model, an additional survey shows that many academic economists and the general population assume that lost wallets containing large sums of money are less likely to be returned. "We mistakenly assume that our fellow human beings are selfish. In reality, their self-image as an honest person is more important to them than a short-term monetary gain," explains Alain Cohn, assistant professor of economics at the University of Michigan and co-author of the study, summarizing the contradiction between expected and actual behavior.

Where finders are most honest

In countries such as Switzerland, Norway, the Netherlands, Denmark and Sweden, between 70 and 85 percent of the wallets were returned to their owners. The Swiss are the most honest when it comes to returning wallets containing a key but no money. Danes, Swedes and New Zealanders were even more honest when the wallets contained larger sums. In countries such as China, Peru, Kazakhstan and Kenya, on average only between 8 and 20 percent of the wallets were returned to their owners. Although the proportion of returned wallets varied widely between countries, in almost all countries wallets with large sums of money or valuable contents were more likely to be returned.

Credit: 
University of Zurich

Why does the moon smell like gunpowder? (video)

image: After walking on the Moon astronauts hopped back into their lunar lander, bringing Moon dust with them. They were surprised, and perplexed, to find that it smelled like spent gunpowder. This week on Reactions, learn why Moon dust might smell like the aftermath of a Civil War reenactment: https://youtu.be/iQod_oYnFTc.

Image: 
The American Chemical Society

WASHINGTON, June 20, 2019 -- After walking on the moon, astronauts hopped back into their lunar lander, bringing the heavenly body's dust along with them on their spacesuits. They were surprised, and perplexed, to find that it smelled like spent gunpowder. This week on Reactions, learn why moon dust might smell like the aftermath of a Civil War reenactment: https://youtu.be/iQod_oYnFTc.

Credit: 
American Chemical Society

A study from IRB Barcelona describes the reaction mechanism of DNAzymes

image: The DNAzyme 9DB1, whose catalytically active structure and reaction mechanism were unknown, makes use of a mechanism involving two ions, similar to that used by natural enzymes. The results reported by the IRB Barcelona researchers will lead to improvements in current DNAzymes and the design of more efficient ones for diagnostic and therapeutic purposes.

Image: 
Juan Aranda, IRB Barcelona

Researchers at the Institute for Research in Biomedicine (IRB Barcelona) have published a study in the journal Nature Catalysis that describes the reaction mechanism used by the DNAzyme 9DB1, the first catalyser formed by DNA for which a structure is available.

Until recently, it was widely assumed that DNA served to store genetic information in a stable and irreversible manner. However, in the last ten years, the discovery of the epigenetic code and the finding that nucleic acids can also catalyse certain reactions have changed this view.

The team headed by Modesto Orozco, head of the Molecular Modelling and Bioinformatics Lab at IRB Barcelona, found that this DNAzyme catalyses RNA ligation through a similar mechanism to that used by natural enzymes.

The conclusion drawn by the study may lead to improvements in current catalysers and in the design of novel biocatalysers formed by DNA. Indeed, given that DNAzymes can carry out a variety of reactions on messenger RNA and can trigger the silencing of genes, they are being developed for diagnostic and biomedical applications.

"The role of DNAzymes as catalysers is of great interest since they are easier to synthesise than proteins and RNA molecules, as well as being more stable and less expensive. However, to date, the catalytic mechanism used by DNAzymes was unknown, as were the differences between catalysers made of DNA or RNA and protein enzymes," says Orozco, senior professor at the University of Barcelona.

The study published by the IRB Barcelona team aimed to unravel the details of the catalytic mechanism of DNAzymes. To this end, Juan Aranda and Montserrat Terrazas, postdoctoral fellows at IRB Barcelona and first authors of the work, studied DNAzyme 9DB1 at the atomic level using computational simulations and then experimentally validated their findings.

The computational techniques included in the study, ranging from molecular dynamics to the combined use of quantum and classical mechanics, allowed the researchers to characterise the catalytic state of 9DB1. Using these approaches, they achieved the first atomic description of the reaction mechanism of a DNAzyme and characterised the most important interaction in the catalysis and in the transition state of the reaction.

To confirm the mechanism predicted by the computational approach, they then synthesised variants of 9DB1 in vitro. The reaction mechanism used by the DNAzyme resembles that of polymerases, which use two divalent cations.

Finally, the scientists have analysed the differences and similarities between the catalytic capacity of DNA, RNA and polymerases. Such atomic information is expected to lead to the design of more efficient DNAzymes.

Credit: 
Institute for Research in Biomedicine (IRB Barcelona)

Vanilla makes milk beverages seem sweeter

Adding vanilla to sweetened milk makes consumers think the beverage is sweeter, allowing the amount of added sugar to be reduced, according to Penn State researchers, who will use the concept to develop a reduced-sugar chocolate milk for the National School Lunch Program.

"We are utilizing a learned association between an odor and a taste that will allow us to reduce the added sugar content," said Helene Hopfer, assistant professor of food science. "Reducing added sugar in products, just like reducing fat and salt, is the holy grail of food science."

The idea that congruent or harmonious odors enhance certain tastes is not new, explained Hopfer, whose research group in the College of Agricultural Sciences has been experimenting with these "cross-modal interactions" in food since she came to Penn State three years ago. Her goal is to see them actually incorporated into foods.

In a blind taste test that provided new insights into taste enhancement by an aroma, participants -- who did not know vanilla had been added to the milk -- consistently indicated that samples with vanilla were significantly sweeter than their added sugar concentrations could explain.

The subjects' responses indicate that with the addition of vanilla, the added sugar content in flavored milk could potentially be reduced by 20 to 50 percent without consumers perceiving the beverage as less sweet, suggested lead researcher Gloria Wang.

"We maintain the sweetness perception by having this congruent odor -- this learned, associated odor -- basically trick the brain into thinking that there is still enough sweetness there," she said. "Based on our results, taste-aroma interaction is a robust effect."

Wang, now an associate scientist in product development with Leprino Foods Co. in Colorado, conducted the research at Penn State as part of her master's degree thesis in food science. She tested not only congruent taste-aroma combinations but incongruent combinations as well. It turned out that even a beef odor in milk slightly enhanced sweetness for study participants.

Given widespread concerns about sugar intake and health, manufacturers are reformulating their products to help address consumer demand, Wang noted. She believes the findings of the research, recently published in Food Quality and Preference, offer them a workable option to reduce added sugar in their products and retain the sweetness consumers demand.

The study was novel because it did not ask participants to rate individual attributes of the milk such as sweetness, intensity of vanilla odor or milk taste. Instead, participants took a more holistic approach and simply selected the best match for the vanilla milk from four differently sweetened milk choices.

Later this summer, Hopfer's lab in the Department of Food Science will start working on a two-year project, funded by the National Dairy Council, aimed at developing a reduced-sugar chocolate milk for the National School Lunch Program. The effort, based on the recent research using the synergistic actions between vanilla and sugar to reduce the added sugar content, will be a challenge because of the inherent bitterness of cocoa.

"The amount of sugar in chocolate milk is quite high because cocoa is very bitter, so you need some sugar to decrease the bitterness of the cocoa and then more to make it sweet," Hopfer said. "We are hoping to utilize what we found with odors to reduce the added sugar content by experimenting to find the sweet spot between cocoa powder, sugar content and vanilla flavor. We know that if it isn't sweet, children won't drink it."

Credit: 
Penn State

Frustrated fish give up thanks to glia, not just neurons

image: When a zebrafish tries to swim but doesn't get anywhere, noradrenergic neurons (magenta) send a message to radial astrocytes (green). When the astrocytes accumulate enough signals of failure from the neurons, they'll tell the fish to give up.

Image: 
Ahrens Lab/Janelia Research Campus

Secured in place in a virtual-reality-equipped chamber, frustrated zebrafish just didn't want to swim anymore.

They had been "swimming" along fine, until the virtual reality system removed visual feedback associated with movement. To the fish, it appeared as if they were drifting backward, regardless of how hard they stroked.

First, the fish thrashed harder. Then, they simply gave up, says neuroscientist Misha Ahrens, a group leader at the Howard Hughes Medical Institute's Janelia Research Campus. "Giving up is a very important thing for animals to be able to do," he says. Without the ability to stop a behavior that's not working, animals would needlessly deplete their energy.

Ahrens and his team at Janelia wanted to identify the neurons responsible for the decision to quit. The researchers watched the zebrafish's brain activity patterns as they struggled. But the clearest signal wasn't coming from neurons. The cells that sprang into action just before the zebrafish called it quits were actually glia, long thought to play a supporting role in the brain.

The find, reported June 20, 2019, in the journal Cell, is clear evidence that cells other than neurons can perform computations that influence behavior - a discovery so surprising that the team took pains to verify their work, Ahrens says.

"We were excited and also very skeptical," he says. "We challenged ourselves to try and disprove it."

More than glue

Until about two decades ago, scientists thought glia (from the Greek for "glue") just provided support and insulation for neurons. But recent research has begun to uncover new roles for glia in processing. Now, Ahrens, Janelia Research Scientist Yu Mu, and their colleagues - Davis Bennett, Mikail Rubinov and others - have shown that, in zebrafish, one type of glial cell can calculate when an effort is futile.

"The original hope was that we would find the neurons that drive this 'giving-up' behavior," Ahrens says.

A whole-brain imaging technique previously developed at Janelia let the researchers look at all of a fish's brain cells, both neurons and glia, while it tried to swim. Then, the team compared the different cells' impact on behavior.

But surprisingly, the team had trouble identifying specific neurons that clearly impacted swimming behavior. Glia were a different story, Mu says. Certain glia, called radial astrocytes, amped up their activity in one part of the brain when the animals stopped trying to swim.

Neurons weren't completely out of the loop: each time a movement attempt failed, certain neurons revved the astrocytes up, until at last they crossed a threshold and sent the quit command. That command went out to a different set of neurons, which then suppressed swimming.

"You could think of the astrocytes as a counter for how many swim bouts have failed," says Mu. It's not a simple job: To tell the fish when to give up, the glia must monitor movement attempts, note repeated failures, and then send the "quit" message to the body.
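The "counter" role can be illustrated with a toy integrate-to-threshold model. The sketch below is hypothetical - the actual astrocyte dynamics in the paper are biochemical and far richer - but it captures the logic Mu describes: each failed bout pushes a leaky accumulator upward, and the "give up" signal fires once a threshold is crossed.

```python
def bouts_until_giving_up(failures, threshold=3.0, decay=0.8):
    """Toy integrate-to-threshold model of the astrocyte 'failure counter'.
    failures: sequence of 1 (swim bout failed) or 0 (bout succeeded).
    The accumulator leaks between bouts (decay) and, once it crosses the
    threshold, the 'give up' signal fires. All parameters are illustrative."""
    accumulator = 0.0
    for bout, failed in enumerate(failures, start=1):
        accumulator = accumulator * decay + failed
        if accumulator >= threshold:
            return bout  # the fish quits on this bout
    return None  # the fish never gives up

print(bouts_until_giving_up([1] * 30))           # unbroken failure: quits quickly
print(bouts_until_giving_up([1, 0, 0, 0] * 10))  # sparse failures leak away
```

The leak matters: a run of consecutive failures trips the threshold within a few bouts, while the same number of failures spread out over time decays away and never triggers quitting - so the counter responds to persistent futility, not to isolated mishaps.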

Control astrocytes, change behavior

To verify the astrocytes' role, the researchers first used a laser to kill only the ones that consistently turned on when the fish gave up. Indeed, the team found that fish that lacked those cells continued struggling to swim much longer than fish whose astrocytes remained.

Next, the team created fish with astrocytes the team could control. Switch on the astrocytes, and the fish stop swimming, the team found, even when the visual environment wasn't messing with them. While normal fish rarely pause, fish with overactive astrocytes spent over half their time languishing in defeat. Taken together, these experiments confirmed that radial astrocytes indeed control the decision to stop swimming, Ahrens says.

One next step for the group will be studying exactly how the astrocytes communicate with neurons. Astrocytes can, for example, release chemical messengers that affect neuron behavior, Mu says. "Astrocytes are like a Swiss Army knife." Mu wants to identify which of their many tools they deploy to halt unproductive struggle.

Credit: 
Howard Hughes Medical Institute

Canadian researchers discover new genetic link to premenopausal breast cancer

image: Graduate student Mahalakshmi Kumaran (left) and professor Sambasivarao Damaraju led a team that discovered a new genetic marker linked to an increased risk of breast cancer in premenopausal Caucasian women.

Image: 
Jordan Carson

University of Alberta researchers have added a new genetic marker to the breast cancer map, helping to expand the list of genetic mutations clinicians can watch for in cancer screenings.

The genetic marker--called rs1429142--was found to confer a higher risk of breast cancer in Caucasian women carrying the genetic variation compared to women without the variation. In premenopausal women, that risk reached as high as 40 per cent. The ability to identify those genes and their variants (called alleles) can be vital to early detection and life-saving treatment.

"This is important because the more we are able to create a complete picture of all the genes and all the variations and mutations that contribute to breast cancer, the closer we get to developing a genetic screen for breast cancer on a population level," said Sambasivarao Damaraju, a professor at the U of A's Department of Laboratory Medicine & Pathology and a member of the Cancer Research Institute of Northern Alberta. "If we can identify women at risk before they are diagnosed, and as long as we have the resources to mitigate that risk through preventative approaches, we can reduce the overall burden of breast cancer risk in a population."

Though the study primarily focused on genetic causes of breast cancer in Caucasian women, Damaraju's team went on to validate their findings in women of Chinese and African descent to explore the impact demographics may have on cancer risk.

Breast cancer is the most common cancer in women, with an estimated one out of every eight women expected to develop it in their lifetimes. While environmental factors like smoking, diet or lack of physical activity can lead to cancer, a person's genes also contribute to the risk of getting the disease.

"One of the real benefits of this research is that it brings a lot of focus to premenopausal breast cancer, which otherwise wasn't thought much about," said Mahalakshmi Kumaran, Damaraju's graduate student and first author on the paper.

The study, published in the International Journal of Cancer, is the first of its kind to examine breast cancer risk in Caucasian women divided into premenopausal and postmenopausal groupings. Because the majority of breast cancers are diagnosed in women over the age of 55, most genetic association studies focus on postmenopausal women. However, inherited forms of cancer, typically related to genetic mutations, are more likely to be aggressive and to be diagnosed earlier in life.

The researchers examined more than 9,000 women from Alberta for the study, utilizing samples from patients diagnosed with breast cancer and unaffected healthy controls from the Alberta Cancer Research Biobank and The Alberta's Tomorrow Project, respectively.

The DNA isolated from the participants' blood provided clues to specific chromosomes that showed links to breast cancer risk. Using those clues, the team began to zero in on the specific regions of the chromosome to locate genetic variations across samples. They noticed that rs1429142 showed a consistent association with breast cancer risk in multiple tests. When the data was analyzed based on menopausal status, the risk was shown to be significantly higher for premenopausal women.

After confirming the link between the genetic variant and breast cancer, the team then took the extra step of zooming in on the genomic region to identify the specific location of the gene on the chromosome and marking it for future researchers.

"Finding this genetic marker is like starting with a high-resolution Google map of the world, and then slowly zooming in to the image of your house," Damaraju said. "It is valuable to do because now we have essentially planted a road sign on the chromosome that can help future researchers to carry out further in-depth studies."

Using international data from other genetic studies of breast cancer, and contributions from Vanderbilt University and St. Jude Children's Research Hospital investigators from Tennessee, the team was also able to validate their findings in women of Chinese and African descent. They found that women of African descent were at a particularly high risk of premenopausal breast cancer as a result of the variant gene. This underlined the idea that genetic ancestry plays an important role in cancer risk.

While the study focused primarily on the genetic variation present on a single chromosome, Damaraju said they also found promising leads for identifying more cancer-related genetic markers on other chromosomes as well. In the future, he hopes to see his research contribute to a more precise method of treating breast cancer by tailoring therapies to the specific needs of the patient.

"My focus for the last 20 years has been to build a pipeline from genetic research to benefit the patient," he said. "We identify the genetic predispositions and focus on developing population-related risk models to enable potential screening of populations and eventually possible interventions."

Credit: 
University of Alberta Faculty of Medicine & Dentistry

Not always reaching your potential is okay, but overthinking it is a problem

Having aspirations helps us navigate life in a meaningful and fulfilling way, but it can also cause psychological distress when hopes are left unfulfilled.

New Edith Cowan University (ECU) research has found that it's not failing to make progress toward our 'ideal-self' that is problematic but rather the tendency to focus on that lack of progress in a negative way that leads to psychological distress.

In other words, it pays to be kind to yourself, say the key researchers.

The study, led by Associate Professor Joanne Dickson from ECU's School of Arts and Humanities, explored whether 'ideal-self' and 'actual-self' discrepancies were associated with depressive and anxious symptoms.

It also considered whether 'rumination', or excessive negative thinking, played a role in these relationships.

Professor Dickson said there are two key 'self-guides' that typically motivate us and provide standards for self-evaluation: the 'ideal-self' and the 'ought-self'.

"The 'ideal-self' is the person we ideally want to be - our hopes and aspirations. The 'ought self' is who we believe we ought to be - our duties, obligations, and responsibilities," she said.

"Our findings showed that perceiving one's hopes and wishes as unfulfilled and the loss of desired positive outcomes increases emotional vulnerability and psychological distress.

"Whereas actual-ought self-discrepancies were associated with anxiety (but not depression)."

The role of excessive negative thinking

Professor Dickson said a novel finding was the role of 'rumination', the tendency to engage in repetitive negative thinking.

"It's not failing to make progress toward our 'ideal-self' that is necessarily problematic but rather the tendency to repetitively think about this lack of progress that represents a significant vulnerability that, in turn, leads to increased psychological distress," she said.

In contrast, a lack of progress in relation to our 'ought self' (i.e., duties, responsibilities, obligations) directly increased anxiety (but not depression), and this effect was not mediated by repetitive thinking.

"It may be that fulfilling obligations, duties and responsibilities is more pressing or urgent than the pursuit of hopes and the more immediate negative consequences of not fulfilling these 'ought to' obligations may mean there is less time to engage in reflective contemplation," Professor Dickson said.

Advice for minimising psychological distress

Professor Dickson said self-guides, as standards we aspire to, are beneficial in giving a sense of purpose and direction in life and in promoting wellbeing, even if we don't always reach them, but that turning the focus toward negative self-evaluation and self-criticism is counterproductive.

"Reflecting on and at times modifying our self-guides may be helpful, particularly if we are caught in a spiral of negative self-evaluation that is accompanied by a constant sense of failing to meet overly high standards.

"We need to be kind to ourselves and keep our self-guides in perspective," she said.

Credit: 
Edith Cowan University

Pigs help scientists understand human brain

video: An animation showing different networks in the pig brain.

Image: 
Gregory Simchick

Athens, Ga. - For the first time, researchers in the University of Georgia's Regenerative Bioscience Center have used an imaging method normally reserved for humans to analyze brain activity in live agricultural swine models, and they have discovered that pig brains are even better platforms than previously thought for the study of human neurological conditions such as Alzheimer's and Parkinson's.

One immediate potential application is in the study and diagnosis of CTE, a progressive brain disease caused by repeated blunt trauma and usually seen in military veterans and NFL players. Currently, CTE can be diagnosed only through an autopsy. The new study strongly suggests that a translational swine model for mapping functional brain connectivity is a promising approach to determining biomarkers or brain signatures that lead to CTE. Using this type of data, doctors would have the opportunity to diagnose CTE while a veteran or athlete is still alive.

By using resting-state functional magnetic resonance imaging (rs-fMRI), the researchers demonstrated functional connectivity in sensorimotor regions of the swine brain that parallels that of the human brain. These regions include those where all our perceptions, feelings, movements and memories are encoded. The similarities of these functional networks, as published in the journal Brain Connectivity, set the stage for targeted clinical applications in the treatment and prevention of neurological disorders.
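In rs-fMRI, "functional connectivity" generally means correlating spontaneous signal fluctuations between brain regions: regions whose time series rise and fall together are considered part of the same network. A minimal illustration with synthetic time series (the regions, sampling rate, and signal shapes here are invented, not the study's data):

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.arange(200) * 0.5           # 200 volumes, 0.5 s apart (illustrative)
shared = np.sin(0.1 * t)           # a common slow fluctuation

# Hypothetical signals for 3 regions: two "sensorimotor" regions share
# the slow fluctuation, the third is independent noise.
roi = np.stack([
    shared + 0.3 * rng.standard_normal(t.size),
    shared + 0.3 * rng.standard_normal(t.size),
    rng.standard_normal(t.size),
])

conn = np.corrcoef(roi)            # 3x3 functional-connectivity matrix
print(conn.round(2))
```

The first two rows of `conn` correlate strongly with each other and weakly with the third, which is exactly the pattern a functional map reveals: which parts of the brain are "talking to each other."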

Franklin West, associate professor of animal and dairy science in the College of Agricultural and Environmental Sciences, and his RBC collaborator, Qun Zhao, drew comparisons between the sensory and cognitive relevance found in swine and that previously established in humans.

"Most of the models to-date deal with structural comparisons," said Zhao, associate professor of physics in the Franklin College of Arts and Sciences. "Our model goes beyond brain mass and allows us to address questions related to brain connectivity and memory function. Without a functional map of the brain it's hard to tell what parts of the brain are talking to each other."

Previous research has shown that the shape and exact location of brain regions strongly influence the modeling of brain connectivity. For years, researchers have posited that the shape and size of the swine brain bear physiological and anatomical similarities to the human brain, and swine are therefore considered a good animal model for neurological disease. However, according to the RBC team, scientists had not yet developed a model that captures functional connectivity or details the wiring diagram of the brain.

Neuroimaging typically helps researchers identify which regions of the brain activate when a person carries out a task, such as the simple task of starting a car. To turn on your car, you first have to look for and find where to insert the key, as your brain takes visual cues and stimulates different parts of your arm to complete the action. Each part of your arm activates a different part of the brain in the act of inserting the key. If there's any interruption in the connections, those functions don't happen.

Those interrupted connections play a role in neurological disorders, such as Alzheimer's disease, Parkinson's disease, chronic traumatic encephalopathy (CTE) and autism. With any of these disorders, the RBC collaborators can now model a 360-degree view of which parts of the brain are no longer talking to each other and which centers in the brain are being reactivated and reconnected.

"What this new model allows and has never been done before," West said, "is for researchers to ask more refined questions about how the brain talks to itself, functions and coordinates action."

"What we tend to say is the brain is a black box and we don't know how it works," said West. "This study is a game changer. It gives us a light to shine inside the box."

Credit: 
University of Georgia

Eye exams common among US adults but some disparities persist

Bottom Line: A substantial proportion of U.S. adults reported recently having an eye exam in this online survey study that included 2,013 adults ages 50 to 80. About 82% of those surveyed reported an eye exam in the past two years. Reasons for not getting an exam included not having eye problems, cost or lack of insurance. A limitation of the study is that adults with vision impairment may have been less likely to participate because the survey was administered online.

Authors: Joshua R. Ehrlich, M.D., M.P.H., University of Michigan, Ann Arbor, and coauthors

(doi:10.1001/jamaophthalmol.2019.1927)

Editor's Note: The article includes conflict of interest and funding/support disclosures. Please see the article for additional information, including other authors, author contributions and affiliations, financial disclosures, funding and support, etc.

Credit: 
JAMA Network

Spiders risk everything for love

image: Can you spot the spider? Researchers superimposed images of wolf spiders on a leaf litter background to see if blue jays could find them.

Image: 
UC

University of Cincinnati biologist George Uetz long suspected the extravagant courtship dance of wolf spiders made them an easy mark for birds and other predators.

But it was only when he and colleague Dave Clark from Alma College teamed up with former University of Minnesota researcher Tricia Rubi and her captive colony of blue jays that he could prove it.

For a study published in May in the journal Behavioural Processes, Rubi trained a captive colony of blue jays to peck at buttons to indicate whether or not they saw wolf spiders (Schizocosa ocreata) on video screens.

Clark made videos superimposing images of courting, walking and stationary male spiders on a leaf litter background. Rubi presented the videos to blue jays on a flat display screen on the ground.

When viewed from above, the brindled black and brown spiders disappear amid the dead leaves.

The jays had trouble finding spiders that stayed motionless in the videos. This confirmed the adaptive value of the anti-predator "freeze" behavior.

The jays had less trouble seeing spiders that walked in the video. And the jays were especially quick to find male spiders engaged in ritual courtship behavior, in which they wave their furry forelegs in the air like overzealous orchestra conductors.

"By courting the way they do, they are clearly putting themselves at risk of bird predation," UC's Uetz said.

His lab studies the complementary methods spiders employ to communicate with each other, called multimodal communication. Female spiders deposit a pheromone trail in the silk they leave behind them and when they rub their abdomens on the ground, Uetz said.

And when male spiders come into visual range, they bounce and rattle their legs on the leaf litter to create vibrations that can travel some considerable distance to the legs of potential mates. The males also wave their front legs in a unique pattern to captivate females.

The males of the species have especially furry front legs that look like black woolen leg warmers. The combination of thick fur and vigorous dancing are indicators that the male is fit and healthy, Uetz said.

"The displays and the decorations show off male quality," Uetz said. "The males that display vigorous courtship and robust leg tufts are showing off their immune competence and overall health. They, in turn, will have sons that have those qualities."

That is, if they live so long. Many birds, including blue jays, find spiders to be tasty. And a chemical called taurine found in spiders is especially important for the neurological development of baby birds, he said.

Spiders instinctively fear predatory birds. In previous studies, Uetz, Clark and UC student Anne Lohrey determined that wolf spiders would freeze in place when they detected the sharp, loud calls of blue jays, cardinals and other insect-eating birds. By comparison, they ignored the calls of seed-eating birds such as mourning doves along with background forest noises such as the creak of katydids.

"They clearly recognized these birds as some kind of threat," Uetz said. "Robins will hunt them on the ground. Lots of other birds do, too. Turkeys will snap them up."

When Uetz proposed a spider experiment, Rubi said it wasn't hard to train her colony of blue jays. The jays quickly learned to peck at different buttons when they either observed a spider or didn't see one on a video screen.

"Birds are super visual. They have excellent color vision and good visual acuity. It's not surprising they would have no trouble seeing spiders in motion," she said.

Rubi now studies genetic evolution at the University of Victoria in British Columbia.

If natural selection means the most avid courting spiders are also most likely to get eaten, why does this behavior persist across generations? Wouldn't meeker spiders survive to pass along their genes?

Rubi said the explanation lies in another selective force.

"Natural selection is selection for survival, which would lead to spiders that are less conspicuous to predators," she said. "But sexual selection is driven by females. And they select for a more conspicuous display."

In genetic terms, Rubi said, fitness is measured in the number of healthy offspring produced. So while staying alive by minimizing risk is a good strategy for the individual, it's not a viable strategy for the species.

"The longest-lived male can still have a fitness of 'zero' if he never mates," Rubi said. "So there appears to be a trade-off between being safe and being sexy. That balance is what shapes these courtship displays."

Uetz said female wolf spiders can be very choosy about the qualities they value in a mate.

"The tufts on their forelegs are very important. Their size and symmetry play a big role," he said. "They're so tiny and have brains the size of a poppy seed. You wouldn't think they could discriminate, but they do."

And at least for successful male wolf spiders living in a hostile world, this means love wins over fear.

Credit: 
University of Cincinnati

Research details response of sagebrush to 2017 solar eclipse

image: University of Wyoming researchers Daniel Beverly, Mario Bretfeld and Adam Nibbelink set up devices in sagebrush at a site 50 miles south of Yellowstone National Park to measure sagebrush response to the total solar eclipse in August 2017.

Image: 
Krag Beverly

The total solar eclipse's swath across Wyoming and the United States in August 2017 provided an opportunity for scientists to study a variety of celestial and earthly phenomena, from learning more about the sun's corona to the behavior of animals and plants.

University of Wyoming botany and hydrology doctoral student Daniel Beverly used the eclipse to examine the impact of the moon's shadow on an iconic plant species of Wyoming and the Intermountain West: big sagebrush. He found that the short period of darkness caused a significant reduction in photosynthesis and transpiration in the desert shrub, but not quite to the levels of nighttime.

Additionally, the circadian rhythm -- the internal clock common to nearly all organisms, including humans -- was disrupted by sudden changes in sunlight well beyond those of typical cloud cover.

"The reduced temperature and lack of sunshine shocked the circadian clock of big sagebrush, triggering a response far beyond what happens when clouds block sunlight," says Beverly, whose research appears today (Thursday) in the journal Scientific Reports. "However, the duration of eclipse totality was not sufficient to bring the plants completely to their nighttime state."

Scientists have extensively studied the response of animals to total solar eclipses. Those findings have been mixed, with birds, bees and spiders behaving just as they do at dusk, while no behavioral change was observed in animals such as dairy cattle and captive chimpanzees. On the other hand, very little is known about plant responses to eclipses, either on the small scale or across broad ecosystems. Beverly's study -- which involved fellow UW scientists in the Department of Botany, the Program in Ecology and the Wyoming Geographic Information Science Center -- offers some of the most detailed information about individual plant response and potential broad ecosystem impacts ever reported.

Beverly's fieldwork took place at a site about 50 miles southeast of Yellowstone National Park, which was within the eclipse's path of totality Aug. 21, 2017. That area is dominated by mountain big sagebrush. The site experienced 2 minutes, 18 seconds of eclipse totality, with the total duration of partial and total solar eclipse reaching 2 hours, 45 minutes, 36 seconds.

Devices deployed during the study included a micrometeorological tower to record the sun's radiation and changes in temperature; an infrared gas analyzer to measure photosynthesis; and fluorometers to measure leaf response to changing light conditions. During the short duration of near darkness, they found significant reductions in transpiration -- evaporation of water from sagebrush leaves -- as well as photosynthesis, the transformation of light energy into chemical energy that converts carbon dioxide and water into sugar and oxygen.

In addition to documenting the response of specific sagebrush plants to the eclipse, Beverly and his colleagues used those results to estimate the eclipse's impact on sagebrush-dominated areas within the path of totality. They calculated a 14 percent reduction in the carbon taken up by Western big sagebrush ecosystems on that day.
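The ecosystem-scale figure comes from integrating carbon uptake over the whole day, with the eclipse-driven dip in photosynthesis included. A toy numerical version of that kind of calculation (the curve shapes, dip depth, and timing below are all invented for illustration, not the study's measurements):

```python
import numpy as np

hours = np.linspace(6, 20, 841)    # daylight at ~1-minute resolution
# Idealized clear-sky photosynthesis curve (arbitrary units).
baseline = np.sin(np.pi * (hours - 6) / 14)

# Hypothetical eclipse: uptake drops ~90% near totality (~11:36 local)
# and recovers through the partial phases (Gaussian dip, ~0.5 h sigma).
dip = 1 - 0.9 * np.exp(-((hours - 11.6) ** 2) / (2 * 0.5 ** 2))
eclipsed = baseline * dip

# Uniform sampling, so the ratio of means equals the ratio of integrals.
reduction = 100 * (1 - eclipsed.mean() / baseline.mean())
print(f"~{reduction:.0f}% less daily carbon uptake in this toy model")
```

Even a dip lasting under three hours shaves a double-digit percentage off the daily total in this sketch, which is why a brief eclipse can register as a significant reduction in estimated daily carbon uptake.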

"Despite its relatively short duration, the eclipse caused a significant reduction in estimated daily carbon uptake for Aug. 21, 2017, in big sagebrush ecosystems," Beverly says. "This information gives us a more comprehensive understanding of plant physiological responses to sudden changes in light, temperature and humidity that the internal clock fails to predict."

Credit: 
University of Wyoming

How bacteria kill host cells from the inside

image: Model for the intramacrophage fate of P. aeruginosa. Phagocytosed P. aeruginosa PAO1 first resides in a vacuole before escaping the phagosome and promoting macrophage lysis. Live imaging of macrophages infected with fluorescent P. aeruginosa makes it possible to follow the lysis of a specific macrophage (white arrow in the picture). MgtC and OprF act positively on the expression of the T3SS, which is involved in cell lysis driven by intracellular P. aeruginosa through the ExoS protein.

Image: 
Anne Blanc-Potard

A bacterial pathogen that typically multiplies outside of host cells can enter and induce the destruction of cells called macrophages, according to a study published June 20 in the open-access journal PLOS Pathogens by Anne-Béatrice Blanc-Potard of the Université de Montpellier in France, and colleagues.

Pathogenic bacteria are commonly classified as intracellular or extracellular pathogens. Intracellular bacterial pathogens can replicate within host cells, including macrophages, which ingest and kill microorganisms in a process called phagocytosis. By contrast, extracellular pathogens such as Pseudomonas aeruginosa multiply outside of cells. However, recent data have shown that several extracellular pathogens can enter host cells. For example, P. aeruginosa has been reported to be engulfed by macrophages in animal models.

In the new study, Blanc-Potard and her colleagues visualized the fate of P. aeruginosa within macrophages. P. aeruginosa first resided in vesicles called phagosomes and subsequently could be detected in the cytoplasm, indicating that the pathogen had escaped degradation within the phagosomes. The intracellular bacteria could eventually induce cell lysis - the disintegration of a cell through membrane rupture. Two bacterial molecules, MgtC and OprF, recently identified as important for the survival of P. aeruginosa in macrophages, were found to activate intracellular production of the type III secretion system (T3SS), which in turn was found to be involved in bacterial escape from the phagosome as well as in cell lysis caused by the bacteria. According to the authors, the transient stage in which P. aeruginosa resides inside of macrophages could contribute to bacterial dissemination during infection.

The authors add, "While the role of macrophages is to ingest and kill microorganisms, the pathogenic bacterium Pseudomonas aeruginosa can induce cell lysis when it resides inside macrophages. The weapons used by internalized bacteria to lyse the macrophage are deciphered."

Credit: 
PLOS

Cereal grains scientists fight hidden hunger with new approach

image: Global demand for staple crops like maize, wheat, and rice is on the rise -- making these crops ideal targets for improving nutrition through biofortification. Biofortification is the process of developing micronutrient-dense staple crops by combining traditional breeding practices with modern biotechnology.

Image: 
N. Palacios and V. Govindan

After a prolonged decline, global hunger is on the rise--affecting more than 820 million individuals in 2017. Additionally, more than 2 billion people suffer from "hidden hunger," which occurs when individuals eat foods that don't provide the nutrients they need to lead healthy, productive lives. Children who suffer from hidden hunger have more difficulty developing to their full mental and physical potential.

Hidden hunger is more prevalent in developing countries that rely heavily on staple crops like wheat, maize, and rice. These populations often do not have access to nutrient-rich foods, such as fruits, vegetables, and fish, and tend to suffer from vitamin A and zinc deficiencies. Vitamin A deficiencies can lead to vision-related disorders, such as corneal scarring and blindness, while zinc deficiencies increase the risk of diarrheal diseases, pneumonia, malaria, and even mortality.

Global demand for staple crops like maize, wheat, and rice is on the rise--making these crops ideal targets for improving nutrition through biofortification. Biofortification is the process of developing micronutrient-dense staple crops by combining traditional breeding practices with modern biotechnology.

Biofortified crop systems are highly sustainable one-time investments--the varieties developed and released can continue to be grown annually, even without government attention or external funding, and breeding for higher levels of vitamin A and zinc does not penalize yield. There are currently 290 varieties of 12 biofortified crops, including rice, wheat, and maize, that target the low-income households that rely on these staple crops and may suffer from hidden hunger.

"Maize and wheat are excellent targets for biofortification because they are widely cultivated, have wide agroecosystem coverage, are important in diets and trade, have useful native genetic variation for improving micronutrient density in the grain, and have a long history as subjects of breeding and genetic research," according to the authors of "Improving Nutrition through Biofortification: Preharvest and Postharvest Technologies," published in Cereal Foods World.

Organizations such as the International Maize and Wheat Improvement Center (CIMMYT), HarvestPlus, and the International Institute of Tropical Agriculture have worked to develop provitamin A-enriched maize varieties and zinc-enriched maize varieties, which have been released in South America and Africa. These organizations have also worked on enhancing levels of iron and zinc in wheat grain, releasing six biofortified wheat varieties in Pakistan and India in recent years.

Nutritional studies have found these biofortified varieties to be effective. A study in India found that young children who ate zinc-biofortified wheat experienced 17% fewer days with pneumonia and 39% fewer days vomiting than those children who ate conventional wheat products. Provitamin A-biofortified maize is a proven source of vitamin A. Additional studies are likely to reveal more positive effects of biofortified staple crops on nutrition.

Biofortification is one way to address hidden hunger, but it is not without challenges. According to the Cereal Foods World article, "To realize their full potential, biofortified maize and wheat varieties must be integrated as core products in research, policy, and food value chains for these crops, which implies that all participants in the value chain, particularly farmers and consumers, must be convinced of their value."

Credit: 
AACC International