
Skoltech scientists found a way to control the electrical characteristics of optical memory devices

Organic electronics has been developing at a blistering pace over the last decade: flexible thin-film electronic circuits, sensors, displays, solar light converters, batteries, LEDs, and other components have already found valuable applications in product packaging (smart packaging), clothing (wearable electronics, electronic textiles), electronic skin, robotics, and prosthetics, in particular smart prosthetic limbs and exoskeletons sensitive to touch, pressure, heat, and cold. Further advancement of organic electronics should result in a functional interface between classical "solid-state" electronics and living objects, such as the human body. The Smart Healthcare concept, which enables continuous monitoring of the human condition and its timely adjustment at the first signs of disease, is believed to hold game-changing potential for healthcare, shifting its focus to prevention rather than treatment at an advanced stage of disease, when even a vast arsenal of treatment methods may not be enough to save a patient or improve quality of life.

Practical applications of organic electronics require that all of its functional components, including organic memory elements, are fully developed. From this perspective, of particular interest are photochromic compounds, whose molecules are by nature single-bit memory cells that undergo reversible isomerization between two quasi-stable states when exposed to light (similar to "0" and "1" in the binary system). Unfortunately, the current lack of technical capability makes it almost impossible to reliably switch a single molecule and register its state. This means that photochromic molecules need to be integrated into more complex and larger systems, where the transition from one state to another produces a response that can be captured, for example, as an electrical signal.

Earlier, Professor Troshin's team developed the structure of organic field-effect transistors with a photosensitive photochromic layer and demonstrated the possibility of their optoelectrical switching between multiple electrical states. However, the effect of the photochromic material's structure and properties on the device's electrical characteristics had remained unclear until now. In their recent study, the researchers from Skoltech, the Institute of Problems of Chemical Physics, RAS, and the N.D. Zelinsky Institute of Organic Chemistry, RAS, succeeded in identifying the relationships between the structure of photochromic materials and their electrical performance in devices.

"We studied three different photochromic materials of similar structure in optical memory elements based on organic field-effect transistors and found some meaningful patterns following a detailed analysis of the characteristics, such as the switching speed and amplitude, memory window width, and operating stability in the multiple data write-read-erase mode. We showed that having a carbonyl group in the photochromic dihetarylethene bridge moiety makes the switching easier, while reducing the stability of induced states. In contrast, a photochromic compound with an unsubstituted propylene bridge and a relatively narrow memory window ensures reliable switching and long-term device stability. The correlations that we found between the molecular structure of photochromic compounds and the electrical characteristics of the devices made using these materials provide a solid background for the rational development of a new generation of materials for organic memory elements and photodetectors," says the first author of the study Dolgor Dashitsyrenova.

Credit: 
Skolkovo Institute of Science and Technology (Skoltech)

Researchers look to unlock post-traumatic stress disorder puzzle

A team of Penn State and University of Puerto Rico School of Medicine researchers is attempting to answer a question that has long puzzled experts: Why do some individuals suffer post-traumatic stress disorder (PTSD) after experiencing trauma, and others do not?

The research, led by Nanyin Zhang, professor of biomedical engineering and Lloyd & Dorothy Foehr Huck Chair in Brain Imaging at Penn State, explores whether individual vulnerability to PTSD is due to pre-existing conditions or to a response to trauma exposure.

The team used the predator scent model of PTSD in rats and longitudinal design, which involves repeated observations of the same subject over a period of time. Using this methodology, they measured pre-trauma, brain-wide, neural circuit functional connectivity; behavioral responses to trauma exposure; release of corticosterone, a steroidal hormone produced in the cortex of adrenal glands; and post-trauma anxiety.

The results, reported in a recent issue of Nature Communications, showed that the rats' tendency to freeze and become motionless in response to predator scent exposure correlated with pre-existing functional connectivity in a set of neural circuits in their brains. Functional connectivity is the connectivity between different brain regions that share functional properties, and is measured via magnetic resonance imaging.

The researchers found that pre-existing neural circuit function can predispose animals to different fearful responses to threats.

"The data we gathered provides a framework of pre-existing circuit function in the brain that determines threat responses," Zhang said. "This may directly relate to PTSD-like behaviors."

Such a framework has a variety of potential benefits for further research into PTSD prevention and treatment.

"This research can help us understand core components of the vulnerability to stress-induced neuropsychiatric disorders," Zhang said. "These components can potentially serve as indicators to not only predict risk for developing anxiety disorders like PTSD but also assist in evaluating different stages of PTSD and possible recovery."

Using rats as test subjects helped overcome a major obstacle to investigating risk factors for PTSD in humans -- the difficulty of monitoring PTSD development from pre- through post-trauma via exposure to well-controlled traumatic events. Studies in humans focus on populations already exposed to a variety of uncontrolled traumatic events and can yield inconsistent results. The present study overcame these barriers by using rats and applying a longitudinal design with controlled traumatic stressors.

"The outcomes of the research can potentially be translated to human studies," Zhang said. "For instance, a biomarker predicting a vulnerability to stress-induced disorders will help determine the risk of assigning an individual to a highly stressful environment, such as combat."

One interesting aside in the study was a counterintuitive finding. Rats with lower freezing behavior showed more avoidance of the predator scent, a prolonged corticosterone response, and higher anxiety long after exposure to the scent.

"It is very likely that they froze less as they adopted different reactions to threats, such as fleeing," Zhang said.

Zhang said the next steps for the research team include identifying neuroimaging biomarkers that can predict an individual's response to threats and developing a process for determining the probability an individual will develop PTSD-like behaviors when exposed to trauma. The team will also explore methods to protect animals with high-risk factors from developing PTSD-like behaviors, such as through optogenetics, which is the use of light to control the activities of individual neurons in freely moving animals.

Credit: 
Penn State

Teens who can describe negative emotions can stave off depression

image: Youth who are poor at differentiating their negative emotions are more susceptible to depressive symptoms following stressful life events, while those who display high NED are better at managing the aftermath of stress.

Image: 
Lisa Starr and Stephen Dow/University of Rochester

Teenagers who can describe their negative emotions in precise and nuanced ways are better protected against depression than their peers who can't. That's the conclusion of a new study about negative emotion differentiation, or NED--the ability to make fine-grained distinctions between negative emotions and apply precise labels--published in the journal Emotion.

"Adolescents who use more granular terms such as 'I feel annoyed,' or 'I feel frustrated,' or 'I feel ashamed'--instead of simply saying 'I feel bad'--are better protected against developing increased depressive symptoms after experiencing a stressful life event," explains lead author Lisa Starr, an assistant professor of psychology at the University of Rochester.

Those who score low on negative emotion differentiation tend to describe their feelings in more general terms such as "bad" or "upset." As a result, they are less able to benefit from useful lessons encoded in their negative emotions, including the ability to develop coping strategies that could help them regulate how they feel.

"Emotions convey a lot of information. They communicate information about the person's motivational state, level of arousal, emotional valence, and appraisals of the threatening experience," says Starr. A person has to integrate all that information to figure out: "Am I feeling irritated, or am I feeling angry, embarrassed, or some other emotion?"

Once you know that information you can use it to help determine the best course of action, explains Starr: "It's going to help me predict how my emotional experience will unfold, and how I can best regulate these emotions to make myself feel better."

The team found that a low NED strengthens the link between stressful life events and depression, leading to reduced psychological well-being.

By focusing exclusively on adolescence, which marks a time of heightened risk for depression, the study zeroed in on a gap in the research to date. Prior research suggests that during adolescence a person's NED plunges to its lowest point, compared to that of younger children or adults. It's exactly during this developmentally crucial time that depression rates climb steadily.

Previous research had shown that depression and low NED were related to each other, but the research designs of previous studies did not test whether a low NED temporally preceded depression. To the researchers, this phenomenon became the proverbial chicken-and-egg question: did those youth who showed signs of significant depressive symptoms have a naturally low NED, or was their NED low as a direct result of their feeling depressed?

The team, made up of Starr, Rachel Hershenberg, an assistant professor of psychiatry at Emory University, and Rochester graduate students Zoey Shaw, Irina Li, and Angela Santee, recruited 233 mid-adolescents in the greater Rochester area with an average age of nearly 16 (54 percent of them female) and conducted diagnostic interviews to evaluate the participants for depression.

Next, the teenagers reported their emotions four times daily over a period of seven days. One and a half years later, the team conducted follow-up interviews with the original participants (193 of whom returned) to study longitudinal outcomes.

The researchers found that youth who are poor at differentiating their negative emotions are more susceptible to depressive symptoms following stressful life events. Conversely, those who display high NED are better at managing the emotional and behavioral aftermath of being exposed to stress, thereby reducing the likelihood of having negative emotions escalate into a clinically significant depression over time.

Depression ranks among the most challenging public health problems worldwide. As the most prevalent mental disorder, it not only causes recurring and difficult conditions for sufferers, but also costs the U.S. economy tens of billions of dollars each year and has been identified by the World Health Organization as the number one cause of global burden among industrialized nations. Depression in adolescent girls is a particularly important area to study, the researchers note, as this age brings a surge in depression rates, with a marked gender disparity that continues well into adulthood.

Adolescent depression disrupts social and emotional development, which can lead to a host of negative outcomes, including interpersonal problems, reduced productivity, poor physical health, and substance abuse. Moreover, people who get depressed during adolescence are more likely to become repeatedly depressed throughout their life span, says Starr. That's why mapping the emotional dynamics associated with depression is key to finding effective treatments.

"Basically you need to know the way you feel, in order to change the way you feel," says Starr. "I believe that NED could be modifiable, and I think it's something that could be directly addressed with treatment protocols that target NED."

The team's findings contribute to a growing body of research that tries to make inroads in the fight against rising rates of adolescent depression, suicidal thoughts, and suicide. According to the most recent CDC data, about 17 percent of high school students nationwide say they have thought of suicide, more than 13 percent say they have made a suicide plan, and 7.4 percent attempted suicide in the past year.

"Our data suggests that if you are able to increase people's NED then you should be able to buffer them against stressful experiences and the depressogenic effect of stress," says Starr.

Credit: 
University of Rochester

Short sleep duration and sleep variability blunt weight loss

This is one of the conclusions of the Predimed-Plus study, Prevention with the Mediterranean Diet, which has been published in the June issue of the International Journal of Obesity. It is the first study to examine whether the quality of sleep is related to weight loss and a reduction in adipose tissue.

In their study, the researchers from the Human Nutrition Unit of the Rovira i Virgili University, in conjunction with other research groups involved in the Predimed-Plus study, assessed the changes in weight and adiposity (body fat) of the 1,986 participants over a whole year; all of them were overweight or obese and had metabolic syndrome. The patients followed an intensive lifestyle intervention programme designed for weight loss, based on a low-calorie Mediterranean diet, physical activity and behaviour therapy. The researchers observed that individuals with highly variable sleep patterns at the beginning of the study - that is to say, those who did not sleep the same number of hours every night - lost less weight after a follow-up period of 12 months. What is more, high sleep variability and short sleep - less than six hours a day - were associated with a smaller decrease in body mass index and waist circumference.

These results reveal that adopting measures to achieve an appropriate sleep pattern may have an impact on maintaining the correct weight and preventing other metabolic disorders associated with excess body fat.

Credit: 
Universitat Rovira i Virgili

Researchers decipher the history of supermassive black holes in the early universe

image: This is an illustration of a supermassive black hole

Image: 
Scott Woods, Western University

Astrophysicists at Western University have found evidence for the direct formation of black holes that do not need to emerge from a star remnant. The production of black holes in the early universe, formed in this manner, may provide scientists with an explanation for the presence of extremely massive black holes at a very early stage in the history of our universe.

Shantanu Basu and Arpan Das from Western's Department of Physics & Astronomy have developed an explanation for the observed distribution of supermassive black hole masses and luminosities, for which there was previously no scientific explanation. The findings were published today by Astrophysical Journal Letters.

The model is based on a very simple assumption: supermassive black holes form very, very quickly over very, very short periods of time and then suddenly, they stop. This explanation contrasts with the current understanding of how stellar-mass black holes are formed: they emerge when the centre of a very massive star collapses in upon itself.

"This is indirect observational evidence that black holes originate from direct-collapses and not from stellar remnants," says Basu, an astronomy professor at Western who is internationally recognized as an expert in the early stages of star formation and protoplanetary disk evolution.

Basu and Das developed the new mathematical model by calculating the mass function of supermassive black holes that form over a limited time period and undergo rapid exponential growth of mass. The mass growth can be regulated by the Eddington limit, which is set by a balance of radiative and gravitational forces, or can even exceed it by a modest factor.
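As a reference point for the growth rate the model invokes, Eddington-limited accretion gives exponential mass growth. The following is a standard textbook relation (with radiative efficiency ε), not an equation taken from the paper itself:

```latex
M(t) = M_0\, e^{t/t_{\mathrm{S}}}, \qquad
t_{\mathrm{S}} \;=\; \frac{\epsilon}{1-\epsilon}\,\frac{\sigma_T\, c}{4\pi G m_p}
\;\approx\; 50\ \mathrm{Myr} \quad \text{for } \epsilon \approx 0.1
```

Growth at a modest factor above the Eddington limit shortens this e-folding time by the same factor, which is what allows masses a billion times that of the Sun to be reached within a few hundred million years.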

"Supermassive black holes only had a short time period where they were able to grow fast and then at some point, because of all the radiation in the universe created by other black holes and stars, their production came to a halt," explains Basu. "That's the direct-collapse scenario."

During the last decade, many supermassive black holes that are a billion times more massive than the Sun have been discovered at high 'redshifts,' meaning they were in place in our universe within 800 million years after the Big Bang. The presence of these young and very massive black holes calls into question our understanding of black hole formation and growth. The direct-collapse scenario allows for initial masses that are much greater than implied by the standard stellar remnant scenario, and can go a long way to explaining the observations. This new result provides evidence that such direct-collapse black holes were indeed produced in the early universe.

Basu believes that these new results can be used with future observations to infer the formation history of the extremely massive black holes that exist at very early times in our universe.

Credit: 
University of Western Ontario

Medically unnecessary ambulance rides soar after ACA expansion

By 2016, two years into the expansion of the Affordable Care Act (ACA), 17.6 million previously uninsured people around the U.S. had gained health insurance coverage. But with the expansion, researchers at the University of Colorado Denver and the University of Kentucky found that ambulance dispatches for minor injuries like abrasions, minor burns and muscle sprains rose by a staggering 37% in New York City.

"Policymakers were operating under the assumption that the expansion was going to get people out of emergency rooms," says Andrew Friedson, PhD, assistant professor of economics at CU Denver. "Few people thought a larger enrollment would lead to a larger utilization of emergency care, because an emergency is an emergency. Insurance shouldn't make anything more of an emergency."

The findings are described today in JAMA Network Open, in a study by Friedson, along with CU Denver Professor of Economics Daniel Rees, PhD, and University of Kentucky Associate Professor of Economics Charles Courtemanche, PhD.

Dispatches to minor injuries jumped 37%

The authors analyzed data from all of the 911 ambulance dispatches in New York City between January 1, 2013, and July 31, 2016. In New York City, 911 calls are routed through a central dispatch to a trained EMS dispatcher, who triages the call based on type and severity of injury, alerting ambulances in one of the city's 31 zones.

In the years before and after the ACA, dispatches to more severe injuries (such as chest pain, compound fractures and unconsciousness) remained roughly unchanged. But dispatches to minor injuries leapt 37.2%, from an average of 20.75 dispatches per dispatch zone per month before the ACA to 28.46 in the years following. The increase is equivalent to approximately 239 additional dispatches a month - or 2,868 per year - for minor injuries.
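The percentage and the citywide totals quoted above follow directly from the per-zone averages and the 31 dispatch zones mentioned earlier; a quick sketch of that arithmetic:

```python
# Reproduce the dispatch figures quoted in the article from its per-zone averages.
before = 20.75   # minor-injury dispatches per dispatch zone per month, pre-ACA
after = 28.46    # the same figure in the years following the ACA
zones = 31       # New York City's dispatch zones

pct_increase = (after - before) / before * 100   # relative jump in minor-injury dispatches
extra_per_month = (after - before) * zones       # additional dispatches citywide per month
extra_per_year = extra_per_month * 12

print(round(pct_increase, 1))    # 37.2
print(round(extra_per_month))    # 239
print(round(extra_per_year))     # 2868
```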

"I was expecting to find an increase under 5%. The size of the association was surprising," says Friedson.

Ambulances are now cheaper than Uber

Previous research found that when Uber shows up in a city, the usage of ambulance services drops off. With the expansion of the ACA, the out-of-pocket cost of ambulances tumbled for many people. When patients bear a smaller portion of the cost, researchers argue, they will be more likely to use an ambulance for medical transportation in less emergent situations.

"Medicaid patients in particular have incredibly low out-of-pocket responsibility for ambulances," says Friedson. "The most an ambulance ride covered under Medicaid costs the patient is three dollars. If there's a low-cost alternative to Uber to get to the hospital, you're going to take it."

As a result, the medically unnecessary rides may add to city congestion, slow response time to actual emergencies and increase the risk of death for those in dire situations.

Health care policy needs better guardrails

"When the ACA was enacted, policymakers may not have had sufficient guardrails in place with regards to emergency care or ambulance utilization," says Friedson. "One solution would have been to give the law more nuance; it needed to spell out that if you're going to take an ambulance, you're covered as long as you meet a certain acuity level. If not, it will involve additional cost-sharing. Or, if you don't want to make those determinations, because they are difficult to make, policymakers could have included money for expanding the emergency response system."

A handful of major U.S. cities are implementing 911 nurse triage call centers to address non-emergency calls and redirect those patients away from ambulances. NYC - and most U.S. cities - don't do that yet, but as dispatches for scrapes and sprains tie up emergency responders, that may soon change.

Credit: 
University of Colorado Denver

How to improve corporate social and environmental responsibility

Social and environmental responsibility in globalized supply chains is hard to police. That task often falls to nongovernmental organizations, or NGOs, that publicize abuses and call out irresponsible corporations and industries.

According to new research led by the University of California, Riverside, NGOs are more likely to sway companies toward ethical behavior with carefully targeted reports that take into consideration a range of factors affecting the companies and industries. The study also finds that too much pressure can actually backfire.

The study suggests that in some circumstances, vertical integration, where companies own and control all steps of the production process, can be both economically feasible and promote responsible sourcing throughout an industry.

"Vertical integration costs firms a lot of money. It's not easy and not everyone can do it," said Adem Orsdemir, an assistant professor of operations and supply chain management at UC Riverside and first author of the study. "But if you can do it, it's best."

Manufactured goods usually require raw materials or components produced in countries far away from where they are assembled or marketed. Although many countries have laws to protect workers and the natural environment, enforcement is often lax or nonexistent. Companies therefore rely on independent audits to inspect and verify that their suppliers comply with laws.

Certification by independent auditors can't always be trusted. The auditing organizations can be inefficient or corrupt, and their inspectors are often thwarted or fooled by unscrupulous producers. This leaves NGOs with the heavy responsibility of monitoring supply chains and holding corporations accountable.

Orsdemir wants to help NGOs direct their limited resources. Intrigued by the example of Taylor Guitars, he examined several scenarios in which exposure by NGOs could influence corporate behavior.

Taylor Guitars, a high-end El Cajon-based guitar manufacturer, learned there was no way to ensure that ebony--black wood from an endangered species of tree--was produced sustainably. So the company bought an ebony sawmill in Cameroon, where ebony can be legally harvested. This move to vertical integration allowed Taylor to oversee and control every step of the production process, from harvesting the ebony to building the guitars.

Taylor doubled the company's wages, instituted and enforced strict labor and environmental policies, and invested in social welfare programs in ebony-producing communities.

Because companies that import illegal ebony face hefty fines by the U.S. Fish and Wildlife Service as well as potential consumer backlash, other musical instrument manufacturers started buying ebony from Taylor Guitars. This practice is known as horizontal sourcing.

Orsdemir wondered under what conditions it would be economically feasible for companies to vertically integrate instead of maintaining an unethical or unsustainable status quo. He mathematically modeled two competing firms selling in the same market under pressures that included the risk of being exposed for failed corporate social and environmental responsibility, or CSER, and the effects such exposure could have on consumer demand for their products.

He found that in industries where horizontal sourcing is unlikely, firms keep the status quo under low CSER violation exposure risk and vertically integrate under moderate violation exposure risk. Surprisingly, they may maintain the status quo under high violation exposure risk, even when it has a strong negative effect on the overall consumer demand for the industry.

"Essentially, what happens is, even if the firm vertically integrates to become responsible, there is a high chance that its competitor would get caught in a violation, impacting the whole industry demand negatively and making responsibility efforts of the firm futile. So, it prefers to stay as it has always been," Orsdemir said.

Where horizontal sourcing is possible, a firm vertically integrates under moderate-to-high CSER violation exposure risk. However, the firm may decline to share responsible supply through horizontal sourcing: if negative attention on a competitor's violations spurs customers to buy only from the responsible company, that company has no incentive to supply its competitors. The responsible firm benefits, but industry-wide responsibility doesn't necessarily improve.

The results mean that firms should be conscious about external pressures and the possibility of horizontal sourcing in the industry when considering vertical integration for CSER.

The results also provide guidance for NGOs' violation scrutiny and reporting policies for firms that could adopt vertical integration and horizontal sourcing.

"NGOs can blame companies or industries," Orsdemir said.

Where horizontal sourcing is unlikely, NGOs should name both violating and nonviolating firms in their reports, rather than call out an entire industry. Good firms benefit when they are named.

"If horizontal sourcing is not possible, too much pressure may backfire and discourage firms from vertical integration," Orsdemir said. "On the other hand, trying to create positive consumer demand for responsible firms is always good."

When horizontal sourcing is possible, NGOs should allocate more resources to scrutinizing firms' CSER violations and should create industry-wide violation reports. But the researchers say NGOs should avoid naming specific firms, which may discourage the sharing of responsibly sourced supply by driving customers to only the responsible firms. This could help improve responsibility within the industry as a whole.

Credit: 
University of California - Riverside

Students chowing down tuna in dining halls are unaware of mercury exposure risks

image: Tuna and other large fish contain significant amounts of mercury in its most toxic form (methylmercury). In a survey of college students, half of the tuna eaters reported eating three or more tuna meals per week, potentially exceeding the EPA's 'reference dose' for mercury.

Image: 
Nick Gonzales

A surprising number of students eating in university dining halls have been helping themselves to servings of tuna well beyond the amounts recommended to avoid consuming too much mercury, a toxic heavy metal.

Researchers at UC Santa Cruz surveyed students outside of campus dining halls on their tuna consumption habits and knowledge of mercury exposure risks, and also measured the mercury levels in hair samples from the students. They found that hair mercury levels were closely correlated with how much tuna the students said they ate. And for some students, their hair mercury measurements were above what is considered a "level of concern."

"It doesn't necessarily mean that they would be experiencing toxic effects, but it's a level at which it's recommended to try to lower your mercury exposure," said Myra Finkelstein, an associate adjunct professor of environmental toxicology at UC Santa Cruz. "Our results were consistent with other studies of mercury levels in hair from people who eat a lot of fish."

Tuna and other large fish contain significant amounts of mercury in its most toxic form (methylmercury), and exposure to high levels of methylmercury can cause neurological damage. Because of its effects on neurological development and reproductive health, concerns about mercury exposure are greatest for pregnant women and children. Finkelstein said college students should also limit their exposure to mercury because their nervous systems are still developing and they are of reproductive age.

She said the study was prompted by her experiences teaching students about mercury in the environment and hearing about how much tuna some students eat. "I've been dumbfounded when students have told me they eat tuna every day," Finkelstein said. "Their lack of knowledge about the risk of exposure to mercury is surprising."

Graduate student Yasuhiko Murata led the study and is first author of a paper on their findings, which has been accepted for publication in Environmental Toxicology and Chemistry and is available online. In the surveys, about a third of students reported weekly tuna consumption, and 80 percent of their tuna meals were at the campus dining halls, where tuna is regularly available from the salad bar. Half of the tuna eaters reported eating three or more tuna meals per week, potentially exceeding the "reference dose" established by the U.S. Environmental Protection Agency (EPA), considered a maximum safe level (0.1 micrograms of methylmercury per kilogram of body weight per day).

Before the results were published, Finkelstein discussed her team's findings with UCSC administrators who oversee the dining halls. New signs in the campus dining halls will now give students information about mercury in tuna and guidelines for fish consumption. Other changes may be made after a more thorough assessment, said William Prime, executive director of dining services.

Finkelstein said this issue could be a concern for all kinds of institutions with dining halls, especially those serving children, such as boarding schools. "Any time you have a dining hall situation where people are helping themselves, some residents may be eating way too much tuna," she said.

Nearly all fish contain some mercury, but tuna, especially the larger species, are known to accumulate relatively high levels of the toxic metal. Consumers are advised to eat no more than two to three servings per week of low-mercury fish (including skipjack and tongol tuna, often labeled "chunk light") or one serving per week of fish with higher levels of mercury (including albacore and yellowfin tuna).

Some of the students surveyed at UC Santa Cruz reported having more than 20 servings of tuna per week. The researchers analyzed the mercury content of the tuna being served in the dining halls, collecting samples periodically over several months, and found that the mercury content was variable, with some samples having five times as much mercury as others.

"Some chunk light tuna was actually quite high in mercury, although typically it has only half or one-third as much as albacore," Finkelstein said.

The researchers calculated that, to stay below the EPA reference dose, a 140-pound person could consume up to two meals per week of the lower-mercury tuna but less than one meal per week of the higher-mercury tuna.
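The reference-dose arithmetic behind that estimate can be sketched as follows. The serving size and mercury concentrations below are illustrative assumptions chosen to match the article's conclusions, not measurements from the study:

```python
# Illustrative EPA reference-dose arithmetic. Serving size and mercury
# concentrations are assumed values, not data from the study.
LB_TO_KG = 0.4536
RFD = 0.1                       # EPA reference dose: ug methylmercury / kg body weight / day

body_kg = 140 * LB_TO_KG        # the 140-pound person from the study
weekly_budget = RFD * body_kg * 7   # ug methylmercury allowed per week

serving_g = 140                 # assumed dining-hall serving size
for label, conc_ug_per_g in [("lower-mercury tuna", 0.15),
                             ("higher-mercury tuna", 0.40)]:
    per_serving = conc_ug_per_g * serving_g
    meals_per_week = weekly_budget / per_serving
    print(f"{label}: {meals_per_week:.1f} servings/week stay under the reference dose")
```

With these assumed concentrations the sketch reproduces the article's figures: roughly two servings per week of the lower-mercury tuna, but less than one of the higher-mercury tuna.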

After conducting an initial survey and hair analysis, the researchers conducted a second survey with more detailed questions designed to probe students' knowledge about mercury in tuna and recommended consumption rates. Whether they were tuna eaters or not, most students had very little knowledge about this issue, Finkelstein said. A majority of students answered that it is safe to eat two to three times as much tuna per week as is recommended.

"It was not a large sample size, but only one out of 107 students surveyed had a high level of knowledge as well as confidence in that knowledge, so I think it's important to provide students with more information about safe levels of tuna consumption," she said.

Recommendations regarding consumption of tuna and other fish are complicated by the fact that fish is highly nutritious and contains beneficial omega-3 fatty acids and other nutrients. In addition, mercury concentrations vary widely among different types of fish. The U.S. Food and Drug Administration and EPA have issued advice on eating fish for pregnant women, parents, and caregivers of young children.

Credit: 
University of California - Santa Cruz

Low-income, less educated women least likely to access infertility care

ANN ARBOR, Mich. - Despite similar rates of infertility among all socioeconomic groups, white women, women with higher education levels, and women with higher incomes are at least twice as likely to seek treatment as other groups of women, new research suggests.

Nearly 12.5 percent of women - or about 1 in 8 - in a Michigan Medicine study reported experiencing infertility. While older age was linked to higher infertility rates, a woman's race and ethnicity, education and income did not appear to influence her chance of conceiving.

However, those with higher education and income levels were significantly more likely to get infertility treatment, researchers report in the journal Fertility and Sterility.

"Our study highlights important unmet infertility needs at a national level," says senior author James Dupree, M.D., M.P.H., a Michigan Medicine urologist and member of the University of Michigan Institute for Healthcare Policy and Innovation.

"While infertility prevalence is equal among women of varying socioeconomic, education and racial and ethnic backgrounds, our findings suggest several significant disparities among women accessing infertility care."

The study included responses from a nationally representative sample of 2,502 participants of reproductive age (20 to 44). Researchers used data from the National Health and Nutrition Examination Survey (NHANES) collected between 2013 and 2016, which reflects an estimated weighted population of 45.6 million women.

More than 80 percent of women with a college degree or higher who reported infertility saw a medical provider - compared to just 33 percent of women with a high school education or less, the study suggests. More than two-thirds of women with household incomes greater than $100,000 who reported infertility also sought care - compared to a third of women from households making $25,000 or less.

Uninsured women experiencing infertility also reported fewer medical visits than insured women having issues getting pregnant (39 percent compared to 65 percent).

"Infertility is a medical disease and we hope to better understand existing disparities that may hinder care," says lead author Angela Kelley, M.D., an OB-GYN at the University of Michigan's Von Voigtlander Women's Hospital.

"More research is needed to assist policymakers and medical providers in their efforts to improve diagnosis and treatment for infertility, particularly among underserved women."

Researchers looked at two questions participants answered in the NHANES survey: whether they had attempted to get pregnant for at least a year without success, and whether they had seen a medical provider because they were unable to conceive.

Infertility prevalence in the study was nearly twice that of previous estimates by the National Survey of Family Growth (NSFG), which reported a 6.7 percent rate of infertility. The authors point to different measures of infertility as a possible explanation: previously reported data were limited to women who were married or cohabitating and trying to conceive with the same partner for at least 12 months.

The higher infertility rates may also be influenced by recall bias among women because they were interviewed about infertility and care retrospectively.

Another limitation was that the data could not show whether state-mandated insurance coverage of some infertility care (available in 16 states) influenced reported access to infertility care.

However, the authors say the study provides the most current information available on infertility prevalence and access to care on a national scale, and should be used to improve care for women who experience infertility.

"We hope these findings spur more research and policy changes to address inequities in infertility access," Kelley says. "Clinicians may also consider outreach to target specific, under-represented and under-served patient populations who may not seek infertility care but who would benefit from seeing a provider."

Credit: 
Michigan Medicine - University of Michigan

Smart materials provide real-time insight into wearers' emotions

image: Co-creator Muhammad Umair wearing one of the prototype smart-material wrist bands.

Image: 
Paul Turner/Lancaster University

Smart wearable technology that changes colour, heats up, squeezes or vibrates as your emotions are heightened has the potential to help people with affective disorders better control their feelings.

Researchers from Lancaster University's School of Computing and Communications have worked with smart materials on wrist-worn prototypes that can aid people diagnosed with depression, anxiety, and bipolar disorder in monitoring their emotions.

Wrist bands that change colour depending upon the level of emotional arousal allow users to easily see or feel what is happening without having to refer to mobile or desktop devices.

"Knowing our emotions and how we can control them are complex skills that many people find difficult to master," said co-author Muhammad Umair, who will present the research at DIS 19 in San Diego.

"We wanted to create low-cost, simple prototypes to support understanding and engagement with real-time changes in arousal. The idea is to develop self-help technologies that people can use in their everyday life and be able to see what they are going through. Wrist-worn private affective wearables can serve as a bridge between mind and body and can really help people connect to their feelings.

"Previous work on these technologies has focused on graphs and abstract visualisations of biosignals, on traditional mobile and desktop interfaces. But we have focused on devices that are wearable and provide not only visual signals but can also be felt through vibration, a tightening feeling or a heat sensation, without the need to access other programmes - as a result we believe the prototype devices provide real-time rather than historic data."

The researchers worked with thermochromic materials that change colour when heated up, as well as devices that vibrate or squeeze the wrist. Tests of the devices saw participants wearing the prototypes for between eight and 16 hours, each reporting between four and eight occasions when the device activated - during events such as playing games, working, having conversations, watching movies, laughing, relaxing and becoming scared.

A skin response sensor picked up changes in arousal - through galvanic skin response, which measures the electrical conductivity of the skin - and represented them through the various prototype designs. The smart materials that were both instant and constant, and that had a physical rather than a visual output, were the most effective.
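The kind of mapping involved can be sketched in a few lines. The baseline, threshold and conductance values here are hypothetical, not the researchers' calibration:

```python
# Hypothetical mapping from a galvanic skin response reading to an actuation
# level for heat, vibration or squeezing. Baseline and threshold values are
# illustrative assumptions, not from the study.
def actuation_level(gsr_microsiemens, baseline=2.0, threshold=1.5):
    """Return a 0.0-1.0 drive level as skin conductance rises above baseline."""
    rise = gsr_microsiemens - baseline
    if rise <= threshold:
        return 0.0                      # arousal within normal range: stay off
    return min(1.0, (rise - threshold) / threshold)

print(actuation_level(2.0))   # resting: prints 0.0
print(actuation_level(6.0))   # strong arousal: prints 1.0
```

A real device would also smooth the sensor signal over time and adapt the baseline per wearer; this sketch omits both.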

Muhammad added: "Participants started to pay attention to their in-the-moment emotional responses, realising that their moods had changed quickly and understanding what it was that was causing the device to activate. It was not always an emotional response, but sometimes other activities - such as taking part in exercise - could cause a reaction.

"One of the most striking findings was that the devices helped participants start to identify emotional responses which they had been unable to identify beforehand, even after only two days.

"We believe that a better understanding of the materials we employed and their qualities could open up new design opportunities for representing heightened emotions and allowing people a better sense of self and emotional understanding."

Credit: 
Lancaster University

Controlling deadly malaria without chemicals

image: Female Anopheles stephensi mosquito obtaining a blood meal from a human host. This mosquito is a known transmitter of malaria, with a distribution that ranges from Egypt all the way to China.

Image: 
Jim Gathany/CDC

Scientists have finally found malaria's Achilles' heel, a neurotoxin that isn't harmful to any living thing except Anopheles mosquitoes that spread malaria.

Nearly half the world's population lives in areas vulnerable to malaria which kills roughly 450,000 people per year, most of them children and pregnant women. Progress fighting the disease is threatened as Anopheles develop resistance to chemical insecticides used to control them. There is also great concern about toxic side effects of the chemicals.

About 30 years ago, scientists identified a strain of bacteria that kills Anopheles. Since the bacteria's method of attack was not understood, it couldn't be replicated or used as an alternative to chemical insecticides -- until now.

An international team led by Sarjeet Gill, distinguished professor of molecular, cell and systems biology at UC Riverside, has identified a neurotoxin produced by the bacteria, and determined how it kills Anopheles. Their work is detailed in a paper published today in Nature Communications.

It took Gill and his team 10 years to achieve a breakthrough in their quest to understand the bacteria, and Gill attributes the success to modern gene sequencing techniques. They hit the bacteria with radiation, creating mutant bacterial strains that could not produce the toxin. By comparing the nontoxic strain to the one that kills Anopheles, they found proteins in the bacteria that are the keys to toxin production.

"Identifying the mechanisms by which the bacteria target Anopheles has not been easy," Gill said. "We were excited not only to find the neurotoxin, called PMP1, but also several proteins that likely protect PMP1 as it's being absorbed in the mosquito's gut."

Most neurotoxins target vertebrates, and PMP1 bears 30 percent chemical similarity to botulinum and tetanus toxins, both highly toxic to humans. Because the neurotoxin does not affect humans, other vertebrates, fish, or even other insects, Gill believes the bacteria that produce PMP1 likely co-evolved with Anopheles mosquitoes.

"It was surprising for us that PMP1 is not toxic to mice even by injection," Gill said.

Members of Gill's team include postdoctoral scholars Estefania Contreras, Jianwu Chen, Harpal Dhillon, and Nadia Qureshi as well as graduate student Swati Chawla from UC Riverside, Geoffrey Masuyer and Pål Stenmark from Stockholm University and Han Lim Lee from the Institute for Medical Research in Malaysia. Their work was funded by the U.S. National Institutes of Health.

The team has applied for a patent on this discovery, and now hopes to find partners that will help them develop their bacteria-based Anopheles insecticide. These findings also open the door to new avenues of research on additional environmentally friendly insecticides.

"There is a high likelihood that if PMP1 evolved to kill the Anopheles mosquito, there are other toxins that can kill other disease-spreading pests," Gill said. "This could just be the start of a new way to prevent hundreds of thousands from getting sick and dying every year."

Credit: 
University of California - Riverside

Global agriculture: Impending threats to biodiversity

A new study compares the effects of expansion vs. intensification of cropland use on global agricultural markets and biodiversity, and finds that the expansion strategy poses a particularly serious threat to biodiversity in the tropics.

Global agricultural production must be further increased in the coming years in order to meet rising demand and changing patterns of consumption. This will require either intensification of cropland use or an expansion of farmland. Researchers based at Ludwig-Maximilians-Universitaet (LMU) in Munich, at the Kiel Institute for the World Economy, at the Helmholtz Center for Environmental Research (Leipzig) and at Palacký University in Olomouc (Czech Republic) have now evaluated the trade-offs between food security and the preservation of biodiversity associated with both strategies in the context of global agricultural markets. The study appears in the journal Nature Communications.

"Agriculture is one of the major drivers of biodiversity loss worldwide, and increases in production are almost always achieved at the expense of biodiversity. But whether and where production rises due to intensification or expansion of cropland does make a difference," says Dr. Florian Zabel of the Department of Geography and Remote Sensing at LMU.

The researchers involved in the interdisciplinary collaboration set out to identify those areas in which it would be profitable, under projected climatic and socioeconomic conditions for the next decade, to increase agricultural production by intensifying or expanding the use of land for farming. They then asked what effects each of these strategies would have on biodiversity and global agricultural markets.

"Our results show that, for a given rise in food production, the impact of cropland expansion on biodiversity is many times greater than that of the intensification scenario. This is because expansion can be expected to occur in those regions with the highest existing levels of biodiversity, mainly in Central and South America," says Dr. Tomáš Václavík, who is in the Department of Ecology and Environmental Sciences at Palacký University in Olomouc. Near-term intensification of agriculture on existing cropland, on the other hand, primarily presents a threat to biodiversity in Sub-Saharan Africa.

However, while biodiversity is put at risk in those regions in which more food is produced, the study suggests that all parts of the world - including those in which the local rise is modest - will profit from the fall in food prices that ensues as a result of the overall growth in global production. "This result has potentially critical implications, because it suggests that, while all regions - including North America and the EU - will profit from falling food prices, the threat to biodiversity is greatest in developing countries in the tropical regions," says Dr. Ruth Delzeit of the Kiel Institute for the World Economy. The effects of intensification and expansion are also predicted to play out differently within these regions. Intensification promises the highest gains in food security in some regions of the tropics, principally India and Sub-Saharan Africa. In contrast, the study sees inhabitants of Latin American countries such as Brazil as the primary beneficiaries of lower food prices brought about by cropland expansion. However, in this region, the expansion strategy presents an especially serious threat to biodiversity.

In addition, the study shows that most existing nature reserves are not located in those regions of high species diversity that were identified as likely targets of cropland expansion. "Most of the areas with high levels of biodiversity that are suitable for agricultural expansion and intensification in the coming years are not currently protected. We therefore recommend developing global mechanisms which recognize land as a limited resource. Measures should be implemented to protect biodiversity in landscapes that are in use, rather than focusing solely on protected sites," says Professor Ralf Seppelt of the Helmholtz Center for Environmental Research in Leipzig. This is the only practicable way to achieve a balance between the conservation of existing biodiversity and the need to increase global agricultural production.

Credit: 
Ludwig-Maximilians-Universität München

Gene activity database could spare thousands of mice

Scientists at the Francis Crick Institute have developed a comprehensive database of gene activity in mice across ten disease models, giving a full picture of the immune response to different pathogens. The resource could significantly reduce animal use worldwide.

The data, published in Nature Communications and available through an online app, shows the activity of every mouse gene - more than 45,000 genes - in the blood of mice with ten different diseases. For the six diseases that involve the lung, samples from lung were also examined.

Previously, researchers would have to breed, infect and cull mice, obtain samples, and extract and sequence the RNA to study the genes they are interested in. Using a new app which the lab created for this study, researchers will be able to check the activity of any gene across a range of diseases without needing their own mice. This could prevent thousands of mice being used in individual experiments.

The research team, led by Crick group leader Anne O'Garra and co-ordinated by Christine Graham, worked with many collaborators from the Crick, UK and the USA. They used next-generation sequencing technology, 'RNA-seq', to measure gene activity across the different diseases. As genes need to transcribe their DNA into RNA in order to function, analysing the RNA reveals how active each gene is - in this case after infection or allergen challenge.

"Gene activity can show us how the body responds to infections and allergens," explains Anne. "There are thousands of genes involved in any immune response, so Akul Singhania, a bioinformatics postdoc in our lab, used advanced bioinformatics approaches to cluster the genes into modules. These modules represent clusters of genes that are co-regulated and can often be annotated to determine their function and known physiological roles. For example, of the 38 lung modules, one associated with allergy - and seen only in the allergy model - contains over 100 genes, while another, associated with T cells, contains over 200 genes."
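The module idea can be illustrated with generic hierarchical clustering of a gene-gene correlation matrix. This is a toy sketch, not the pipeline used in the study; the expression data and cluster count are invented:

```python
# Toy illustration of co-expression modules: genes whose expression profiles
# rise and fall together across samples are grouped into one module.
# Generic clustering for illustration only, not the authors' pipeline.
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage

rng = np.random.default_rng(0)
programs = rng.normal(size=(2, 12))  # two underlying "programs" across 12 samples
expr = np.vstack([
    programs[0] + 0.1 * rng.normal(size=(15, 12)),  # 15 genes following program A
    programs[1] + 0.1 * rng.normal(size=(15, 12)),  # 15 genes following program B
])

corr = np.corrcoef(expr)                       # 30 x 30 gene-gene correlation
dist = 1 - corr[np.triu_indices(30, k=1)]      # condensed dissimilarity vector
modules = fcluster(linkage(dist, method="average"), t=2, criterion="maxclust")
print(modules)  # genes 0-14 land in one module, genes 15-29 in the other
```

Real pipelines add steps this sketch omits (normalisation, filtering of low-expression genes, and annotation of each module against known gene functions).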

"By sequencing both lung tissue and blood, we can also see how the immune response in the blood reflects the local response in the lung, and vice versa. This will help us to understand what we can learn from genetic signatures in the blood, since for most diseases doctors can't realistically get lung samples from patients."

A panoply of pathogens

Using the new app, researchers anywhere in the world can look up gene activity in the lungs and blood of mice infected with a range of pathogens: the parasite Toxoplasma gondii, influenza virus and Respiratory Syncytial Virus (RSV), the bacterium Burkholderia pseudomallei, the fungus Candida albicans, or the allergen, house dust mite. They can also see gene activity in the blood of mice with listeria, murine cytomegalovirus, the malaria parasite Plasmodium chabaudi chabaudi, or a chronic Burkholderia pseudomallei infection.

In the study, the research team analysed the genetic signatures associated with these diseases to help understand the immune response. They discovered a broad range of immune responses in the lung, where discrete modules were dominated by genes associated with Type I or Type II interferons, IL-17 or allergy type responses. Type I interferons are known to be released in response to viruses, while Type II interferon (IFN-γ) activates phagocytes to kill intracellular pathogens, and IL-17 attracts neutrophils causing early inflammatory immune responses. Interestingly, interferon gene signatures were present in blood modules similarly to the lung, but IL-17 and allergy responses were not.

Surprisingly, genes associated with type I interferon were highly active in both the lungs and blood of mice infected with the Toxoplasma gondii parasite and also seen in response to the Burkholderia pseudomallei bacterium, albeit to a lesser extent. This challenges the view that type I interferon-associated genes are necessarily indicative of viral infections, as the lab had previously shown in tuberculosis.

"We found that mice without functioning interferon pathways were less able to fight off Toxoplasma infection. This was true for both Type I and Type II interferons, which have a complex relationship with each other. We found that both play a key role in protection against the parasite in part by controlling the neutrophils in the blood which in high numbers can cause damage to the host."

From obsolescence to opportunity

The research project began in 2009, using a technique known as microarray to detect gene activity in lung and blood samples and was almost complete and ready to be analysed by 2015. Microarray was then a well-established technique, but the necessary reagents were suddenly discontinued by the manufacturer before the final samples had been processed. Without the equipment to finish the sequencing, the project was in trouble.

With this microarray technology no longer possible, the team needed a different approach. At this time, a technique called RNA-Seq had come onto the market, offering a better way to quantify gene activity.

Following negotiations between Anne and the manufacturer, her team was offered cutting-edge RNA-Seq reagents free of charge to reprocess the samples, starting in late 2016. They were also provided storage space for the huge amounts of data generated.

As the tissue and blood samples from the microarray experiments were all frozen in storage, Christine Graham in Anne's lab was able to go back to the same samples and heroically process them again, this time for RNA sequencing. Thanks to the excellent storage of the samples, this was possible without use of additional animals. Although time-consuming and a huge task for Christine, by 2018 the team had all the sequencing data they needed.

With a huge amount of data to process, Akul Singhania set about making sense of it all. Using advanced bioinformatics techniques, he clustered the thousands of genes and millions of data points into a meaningful and visual form which we refer to as modules, and created the app to make the data accessible to anyone.

"Ten years since the project began, we now have an open access resource of gene expression that anyone in the world can use to look up their favourite genes and also see if they are regulated by type I or type II interferon signalling," says Anne. "Nobody said science was easy, but it's certainly worthwhile."

Credit: 
The Francis Crick Institute

Going the distance: Brain cells for 3D vision discovered

image: This is a 3D neuron captured under a microscope.

Image: 
Newcastle University, UK

Scientists at Newcastle University, UK have discovered neurons in insect brains that compute 3D distance and direction. Understanding these could help improve vision in robots.

In stunning images captured under the microscope for the first time, the neurons were found in praying mantises. The work is published in Nature Communications today.

In a specially-designed insect cinema, the mantises were fitted with 3D glasses and shown 3D movies of simulated bugs while their brain activity was monitored. When the image of the bug came into striking range for a predatory attack, scientist Dr Ronny Rosner was able to record the activity of individual neurons.

Dr Rosner, Research Associate in the Institute of Neuroscience at Newcastle University, is lead author of the paper. He said: "This helps us answer how insects achieve surprisingly complex behaviour with such tiny brains and understanding this can help us develop simpler algorithms to develop better robot and machine vision."

The "3D neurons"

Praying mantises use 3D perception, scientifically known as stereopsis, for hunting. By using the disparity between the two retinas they are able to compute distances and trigger a strike of their forelegs when prey is within reach.
The neurons recorded were stained, revealing their shapes, which allowed the team to identify four classes of neuron likely to be involved in mantis stereopsis.
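The underlying geometry is textbook stereo triangulation: distance is inversely proportional to the disparity between the two eyes' images. This sketch only illustrates the principle; the parameter values are invented, and the paper does not claim the neurons literally evaluate this formula:

```python
# Textbook stereo triangulation: estimate distance from binocular disparity.
# Illustrative only; the baseline and focal length values are invented and
# this is not a model of the mantis neurons themselves.
def distance_from_disparity(baseline, focal_length, disparity):
    """All quantities in the same length unit; disparity measured on the image plane."""
    return baseline * focal_length / disparity

# A nearer target projects with a larger disparity between the two eyes:
near = distance_from_disparity(baseline=8.0, focal_length=1.0, disparity=0.5)
far = distance_from_disparity(baseline=8.0, focal_length=1.0, disparity=0.25)
print(near, far)  # prints 16.0 32.0
```

The mantis exploits exactly this inverse relationship: when disparity grows large enough that the computed distance falls within foreleg reach, the strike is triggered.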

The images captured using a powerful microscope show the dendritic tree of a nerve cell - where the nerve cell receives inputs from the rest of the brain - believed to enable this behaviour.

Dr Rosner explains: "Despite their tiny size, mantis brains contain a surprising number of neurons which seem specialised for 3D vision. This suggests that mantis depth perception is more complex than we thought. And while these neurons compute distance, we still don't know how exactly.

"Even so, as their brains are so much smaller than our own, we hope mantises can help us develop simpler algorithms for machine vision."

The wider research programme which is funded by the Leverhulme Trust, is led by Professor Jenny Read, professor of Vision Science at Newcastle University. She says: "In some ways, the properties in the mantises are similar to what we see in the visual cortex of primates. When we see two very different species have independently evolved similar solutions like this, we know this must be a really good way of solving 3D vision.

"But we've also found some feedback loops within the 3D vision circuit which haven't previously been reported in vertebrates. Our 3D vision may well include similar feedback loops, but they are much easier to identify in a less complex insect brain and this provides us with new avenues to explore."

It's the first time that anyone has identified specific neuron types in the brain of an invertebrate which are tuned to locations in 3D space.

The Newcastle team intend to further develop their research to better understand the computation of the relatively simple brain of the praying mantis with the aim of developing simpler algorithms for machine and robot vision.

Credit: 
Newcastle University

One in 10 people have 'near-death' experiences, according to new study

image: 5th Congress of the European Academy of Neurology

Image: 
European Academy of Neurology

(Oslo, Saturday, 29 June, 2019) Mystical near-death experiences where people report a range of spiritual and physical symptoms, including out-of-body sensations, seeing or hearing hallucinations, racing thoughts and time distortion, affect around 10 per cent of people, according to a new study that analysed participants from 35 countries.

These near-death experiences (NDEs) are equally as common in people who are not in imminent danger of death as in those who have experienced truly life-threatening situations such as heart attacks, car crashes, near drowning or combat situations.

The new findings were presented at the 5th European Academy of Neurology (EAN) Congress by researchers from the Rigshospitalet, Copenhagen University Hospital, University of Copenhagen, Denmark, the Center for Stroke Research, Berlin, and the Norwegian University of Science and Technology, Trondheim, Norway.

Experiences most frequently reported by participants in their study included: abnormal time perception (87 per cent), exceptional speed of thought (65 per cent), exceptionally vivid senses (63 per cent) and feeling separated from, or out of their body (53 per cent).

The study group who reported NDEs variously described feeling at total peace, having their 'soul sucked out', hearing angels singing, being aware they were outside their body, seeing their life flashing before them, and being in a dark tunnel before reaching a bright light. Others spoke of being aware of another's presence before they went to sleep, or of a demon sitting on their chest while they lay paralysed unable to move [see Notes to Editors for selected quotes].

The team recruited 1,034 lay people from 35 countries via a crowdsourcing platform online (to eliminate selection bias) and asked them if they'd ever had an NDE. If they answered 'yes', they were asked for more details, using a detailed questionnaire assessment tool called the Greyson Near-Death Experience Scale, which asks about 16 specific symptoms.

A total of 289 people reported an NDE, and 106 of those reached a threshold of 7 on the Greyson NDE Scale (which confirms a true NDE). Some 55 per cent perceived the NDE as truly life-threatening and 45 per cent as not truly life-threatening.
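The scale's threshold logic is simple to state. This is a simplified sketch of the scoring scheme (16 items, each rated 0, 1 or 2); consult the published scale for the actual item wording:

```python
# Simplified sketch of Greyson NDE Scale scoring: 16 symptom items, each
# rated 0 (absent), 1 (mild) or 2 (strong). A total of 7 or more is the
# threshold the study used to classify a report as a "true" NDE.
def greyson_total(item_ratings):
    """Sum the 16 item ratings after validating their range."""
    assert len(item_ratings) == 16
    assert all(r in (0, 1, 2) for r in item_ratings)
    return sum(item_ratings)

# A respondent with five moderately rated symptoms crosses the threshold:
ratings = [2, 2, 1, 1, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0]
print(greyson_total(ratings) >= 7)  # prints True
```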

Far from being a pleasant experience associated with feelings of peacefulness and wellbeing, as some previous studies have reported, the new study found a much higher rate of people reporting their NDE as unpleasant. Overall, of all the people who claimed an NDE, 73 per cent said it was unpleasant and only 27 per cent said it was pleasant. However, in those with a score of 7 or above on the Greyson NDE Scale (a confirmed NDE), this changed to 53 per cent reporting a pleasant experience and 14 per cent an unpleasant one.

Based on insight gained from previous studies, the researchers found an association between NDEs and Rapid Eye Movement (REM) sleep intrusion into wakefulness. REM sleep is a phase of the sleep cycle where the eyes move rapidly, the brain is as active as when someone is awake, dreaming is more vivid, and most people experience a state of temporary paralysis, as the brain sends a signal to the spinal cord to stop the arms and legs moving. When REM sleep intrudes into wakefulness, some people report visual and auditory hallucinations and other symptoms such as sleep paralysis, where they feel conscious but cannot move.

REM sleep intrusion on wakefulness was found to be more common in people with scores of 7 or above on the Greyson NDE Scale (47 per cent) than in people with scores of 6 or below (26 per cent), or in those below the threshold with no such experiences (14 per cent).

Lead researcher Dr Daniel Kondziella, a neurologist at the University of Copenhagen, said, "Our central finding is that we confirmed the association of near-death experiences with REM sleep intrusion. Although association is not causality, identifying the physiological mechanisms behind REM sleep intrusion into wakefulness might advance our understanding of near-death experiences."

Dr Kondziella said that the 10 per cent prevalence figure of NDE was higher than in previous studies conducted in Australia (8 per cent) and Germany (4 per cent). He said this could be explained by the fact they had been conducted on cardiac arrest survivors rather than unprimed lay people, as in this study.

Dr Kondziella said the study replicated the findings of an earlier study by Nelson et al in 2006 that had been criticised for selection bias, but the new study addressed those potential flaws by recruiting via a crowdsourcing platform.

Credit: 
Spink Health