
Space travel can make the gut leaky

image: Declan McCole is a professor of biomedical sciences at UC Riverside.

Image: 
Carrie Rosema.

RIVERSIDE, Calif. -- Bacteria, fungi, and viruses can enter our gut through the food we eat. Fortunately, the epithelial cells that line our intestines serve as a robust barrier to prevent these microorganisms from invading the rest of our bodies.

A research team led by a biomedical scientist at the University of California, Riverside, has found that simulated microgravity, such as that encountered in spaceflight, disrupts the functioning of the epithelial barrier even after removal from the microgravity environment.

"Our findings have implications for our understanding of the effects of space travel on intestinal function of astronauts in space, as well as their capability to withstand the effects of agents that compromise intestinal epithelial barrier function following their return to Earth," said Declan McCole, a professor of biomedical sciences at the UC Riverside School of Medicine, who led the study published today in Scientific Reports.

The microgravity environment encountered in space has profound effects on human physiology, leading to clinical symptoms and illnesses including gastroenteritis; previous studies have shown microgravity weakens the human immune system. Microgravity has also been shown to increase the intestinal disease-causing ability of food-borne bacteria such as salmonella.

"Our study shows for the first time that a microgravity environment makes epithelial cells less able to resist the effects of an agent that weakens the barrier properties of these cells," McCole said. "Importantly, we observed that this defect was retained up to 14 days after removal from the microgravity environment."

The permeability-inducing agent McCole's team chose to investigate was acetaldehyde, an alcohol metabolite. McCole explained alcohol compromises barrier function and increases gastrointestinal permeability in normal subjects and in patients with alcoholic liver disease.

The barrier function of the intestinal epithelium, he added, is critical for maintaining a healthy intestine; when disrupted, it can lead to increased permeability or leakiness. This, in turn, can greatly increase the risk of infections and chronic inflammatory conditions such as inflammatory bowel disease, celiac disease, Type 1 diabetes, and liver disease.

McCole's team used a rotating wall vessel -- a bioreactor that maintains cells in a controlled rotation environment that simulates near weightlessness -- to examine the impact of simulated microgravity on cultured intestinal epithelial cells.

Following culture for 18 days in the vessel, the team discovered that the intestinal epithelial cells showed delayed formation of "tight junctions," the structures that connect individual epithelial cells and are necessary for maintaining impermeability. The rotating wall vessel also produced an altered pattern of tight junction assembly that was retained up to 14 days after the intestinal epithelial cells were removed from the vessel.

"Our study is the first to investigate if functional changes to epithelial cell barrier properties are sustained over time following removal from a simulated microgravity environment," McCole said. "Our work can inform long-term space travel and colonization where exposure to a food-borne pathogen may result in a more severe pathology than on Earth."

Credit: 
University of California - Riverside

Once hidden cellular structures emerge in fight against viruses

image: A negative stain electron micrograph of SgrAI/DNA filaments.

Image: 
Nancy Horton

New University of Arizona-led research has revealed the structure and function of one of bacteria's latest strategies in the fight against viruses: a fleet of highly organized enzymes that provide a rapid immune response capable of quickly shredding the harmful DNA of viral invaders.

"This is part of what's often referred to as the world's oldest war," said Nancy Horton, associate professor of molecular and cellular biology who leads the laboratory that conducted the research published last month in Structure. "This war takes place everywhere - from the oceans to the soil to our own guts."

Enzymes are proteins within living cells that speed up chemical reactions. Some enzymes can assume multiple shapes, each with a different function, and toggle between them. In this case, a specific enzyme - SgrAI - has a shape that slowly cuts invasive DNA. However, when many such enzymes link up and wrap around a length of DNA, they create a filament that increases DNA-cleaving ability by 200 times.

"SgrAI enzymes contain two metal atoms, and they have to put them right next to the place where the DNA is going to get cleaved," Horton said. "The structure of the enzyme in the non-filament form holds one of the two metal atoms in the wrong spot. In the filament structure, we see a change in the enzyme shape that pushes that atom into place."

A rapid immune response is important because bacteria-attacking viruses, called bacteriophages, attach to the outside of the bacterial cell before injecting it with their own genetic material. Once inside, the bacteriophage hijacks the bacterium's replication machinery to make copies of itself. Eventually, the newly synthesized viruses burst from the captured cell to infect other bacteria.

"This is really basic research," Horton said. "But you have to know how things work before you can fix them. And there are a lot of other medically important enzymes that use this mechanism. When they don't work, they have been implicated in cancer, autoimmune diseases, diabetes and more - it's that fundamental to cell biology."

The findings are part of Horton's larger research interest into the existence of filament-forming enzymes.

Finding Filaments

Filaments were first discovered about 50 years ago before being lost to science by the very methods meant to reveal a cell's inner workings.

In the 1960s, researchers used microscopes that bounced electrons off their subjects to render detail smaller than visible wavelengths of light. But then x-ray crystallography - the technique that led to the discovery of the structure of DNA - came along and gave researchers the ability to achieve even higher resolution images. X-ray crystallography overtook labs in the ensuing decades, and filaments went undetected because they don't form detectable crystalline structures.

Filaments were forgotten to science until around 2010, when a handful of labs around the world, including Horton's, began investigating cellular structures again using newer, higher resolution electron microscopes.

"When my lab first published a paper on the existence of filaments (in 2010), I was met with a lot of resistance," Horton said. "Around the time I was discovering this, I noticed some other labs had also discovered it. After going back through the scientific literature, I realized we knew about this decades ago, but we forgot about it. That's why I call it a renaissance - it's a rediscovery. But then we didn't know why and how filaments formed. So the recent work we've done is to look at why filaments form, what advantages they provide and what their purpose is."

This month, Horton published a compendium in Nature Reviews Molecular Cell Biology of the independent filament discoveries, in which she identifies them as the same phenomenon.

She is now a leader in the burgeoning field of enzyme filamentation. She is using National Science Foundation grants totaling more than $2.3 million to study the structure and mechanism of filament formation by SgrAI and phosphofructokinase, a human metabolic enzyme known as PFK, to learn about the benefits of filament formation for the regulation of enzyme activity.

Credit: 
University of Arizona

Caring for family is what motivates people worldwide

video: Arizona State University psychology graduate students Ahra Ko and Cari Pick explain how an international study including 27 countries and more than 7,000 people has identified caring for family as the most important motivation in people's lives. For 40 years researchers focused on romantic and sexual partners, but these motivations were rated as relatively low priorities. Caring for family members was associated with well-being, and seeking romantic partners was associated with unhappiness.

Image: 
Robert Ewing, ASU

Across the globe, caring for loved ones is what matters most.

But, for decades this has not been the focus of many social psychology studies. An international team of researchers led by evolutionary and social psychologists from Arizona State University surveyed over 7,000 people from 27 different countries about what motivates them, and the findings go against 40 years of research. The study will be published on December 3 in Perspectives on Psychological Science.

"People consistently rated kin care and mate retention as the most important motivations in their lives, and we found this over and over, in all 27 countries that participated," said Ahra Ko, an ASU psychology graduate student and first author on the paper. "The findings replicated in regions with collectivistic cultures, such as Korea and China, and in regions with individualistic cultures like Europe and the US."

The study included people from diverse countries - ranging from Australia and Bulgaria to Thailand and Uganda - that covered all continents except Antarctica. The ASU team sent a survey about fundamental motivations to scientists in each of the participating countries. Then, the researchers in each country translated the questions into the native language and made edits so that all the questions were culturally appropriate.

For the past 40 years, evolutionary psychological research has focused on how people find romantic or sexual partners and how this desire affects other behaviors, like consumer decisions. But study participants consistently rated this motivation - called mate seeking - as the least important factor in their lives.

Evolutionary psychologists define kin care as caring for and supporting family members, and mate retention as maintaining long-term committed romantic or sexual relationships. These two motivations were the most important even in groups of people thought to prioritize finding new romantic and sexual partnerships, like young adults and people not in committed relationships.

"The focus on mate seeking in evolutionary psychology is understandable, given the importance of reproduction. Another reason for the overemphasis on initial attraction is that college students have historically been the majority of participants," said Cari Pick, an ASU psychology graduate student and second author on the paper. "College students do appear to be relatively more interested in finding sexual and romantic partners than other groups of people."

In all 27 countries, singles prioritized finding new partners more than people in committed relationships, and men ranked mate seeking higher than women. But, the differences between these groups were small because of the overall priority given to kin care.

"Studying attraction is easy and sexy, but people's everyday interests are actually more focused on something more wholesome - family values," said Douglas Kenrick, President's Professor of Psychology at ASU and senior author on the study. "Everybody cares about their family and loved ones the most, which, surprisingly, hasn't been as carefully studied as a motivator of human behavior."

The motivations of mate seeking and kin care were also related to psychological well-being, but in opposite ways. People who ranked mate seeking as the most important were less satisfied with their lives and were more likely to be depressed or anxious. People who ranked kin care and long-term relationships as the most important rated their lives as more satisfying.

"People might think they will be happy with numerous sexual partners, but really they are happiest taking care of the people they already have," Kenrick said.

The research team is currently working on collecting information about the relationships among fundamental motivations and well-being around the world.

Credit: 
Arizona State University

Additives result in higher toxins for vape users, Portland State study finds

The vaping industry is filled with unknowns. Those unknowns are raising more questions as the number of users suffering injuries, or in some cases dying, continues to rise.

Portland State University Chemistry Professor Rob Strongin led a research team to study what happens when additives are put into vaping products. Specifically, they studied the chemical reaction that takes place when cannabis is consumed using a vape pen or dab rig. The study, "Aerosol Gas-Phase Components from Cannabis E-Cigarettes and Dabbing: Mechanistic Insight and Quantitative Risk Analysis," was published in the September issue of ACS Omega.

"What's inhaled is actually different than what's listed in the ingredients," Strongin said.

The researchers found that of the known toxins formed during vaping, more toxins came from terpenes than from THC, the main mind-altering chemical found in marijuana. Terpenes occur naturally in cannabis but at lower levels than seen in some concentrated THC products. Strongin said they found vendors adding up to 30% (or more) additional terpenes to their products. Terpenes can impact the flavor and smell of cannabis when inhaled.

"There are fewer toxins formed from THC as compared to terpenes," Strongin said. "This is consistent with some of the vaping-related injuries we're seeing. It's not the active ingredients, like THC or nicotine, that appear to be causing the hospitalizations and deaths, but what they are combined with."

Vitamins and thickening agents like Vitamin E have unknown effects on the lungs, and they can undergo reactions during vaping to produce both well-known and potential aerosol toxins, Strongin said.

"There's a reason that no one studied the inhalation toxicity of a lot of the ingredients in e-cigarettes because nobody thought we'd be crazy enough to be inhaling them," he said. "But the problem now is that there's a huge gap in our knowledge."

The main concern for Strongin and his fellow researchers is manufacturers altering their products from their natural state. For those who choose to vape, the study suggests, the fewer additives the better, although much more research is still needed on the subject, Strongin added.

To continue expanding on knowledge about cannabis and inhalation, Strongin said they're interested in studying the actual dosage people inhale.

"People are assuming the dose is the amount of THC in the product before vaping," he said. "It's not true. There's more work to be done."

Credit: 
Portland State University

Satellite broken? Smart satellites to the rescue

image: University of Cincinnati aerospace engineering student Yufeng Sun holds a laser scanner used to measure and render objects in three dimensions.

Image: 
Andrew Higley/UC Creative Services

When satellites break, which is surprisingly often, there isn't much you can do about them.

They become expensive and dangerous flotsam, orbiting Earth for years or generations until gravity eventually draws them to a fiery death in the atmosphere.

University of Cincinnati professor Ou Ma is engineering robotics technology to fix orbiting satellites in his Intelligent Robotics and Autonomous Systems Lab. He envisions robotic satellites that can dock with other satellites for repairs or refueling.

The most useful repair satellite will be able to complete multiple tasks, Ma said. During his career, he has worked on various projects relating to the robotic arms aboard the International Space Station and the former space shuttle program. His signature is floating in orbit on a piece of equipment aboard the space station.

In his lab, Ma and UC senior research associate Anoop Sathyan are developing robotic networks that can work independently but collaboratively on a common task.

For their latest study, Ma and Sathyan put a group of robots to the test with a novel game that uses strings to move an attached token to a designated spot on a table.

Since the robots each control just one string, they need the other robots' cooperation to move the token to the right spot by increasing or relaxing tension on the string in response to each robot's actions.

Using an artificial intelligence called genetic fuzzy logic, the researchers were able to get three robots and then five robots to move the token where the researchers wanted.
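The release does not include the controller itself, so the sketch below is only a rough illustration of the fuzzy-inference step each robot might perform, with hypothetical membership functions and rule outputs; in a genetic fuzzy system like the one described, a genetic algorithm tunes these parameters against a performance measure rather than a person choosing them by hand.

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function with peak at b and feet at a and c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def fuzzy_tension_adjustment(error):
    """Map a robot's token-position error (meters) to a change in string tension (newtons).

    Membership functions and rule outputs here are hypothetical placeholders;
    they are exactly the kind of parameters a genetic algorithm would evolve.
    """
    # Firing strengths of three rules: error is Negative, Zero, or Positive.
    mu_neg = tri(error, -0.20, -0.10, 0.0)
    mu_zero = tri(error, -0.10, 0.0, 0.10)
    mu_pos = tri(error, 0.0, 0.10, 0.20)
    # Rule consequents: relax, hold, or increase tension on this robot's string.
    outputs = np.array([-1.0, 0.0, 1.0])
    weights = np.array([mu_neg, mu_zero, mu_pos])
    if weights.sum() == 0.0:
        return 0.0  # error outside the fuzzy range; make no adjustment
    return float(weights @ outputs / weights.sum())  # weighted-average defuzzification

# Each robot runs its own copy of this rule; cooperation emerges because all
# of the string tensions act on the same shared token.
print(fuzzy_tension_adjustment(0.05))
```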

Their results were published this month in the journal Robotica.

The researchers found that by using five robots, the collective could accomplish the task even if one of the robots malfunctioned.

"This will be especially true for problems with larger numbers of robots where the liability of an individual robot will be low," the researchers concluded.

Ma said a million things can go wrong with every satellite launch. But for most of those glitches, nothing can be done once the satellite is deployed.

A $400 million Intelsat satellite the size of a small school bus malfunctioned this year after reaching a high elliptical orbit, according to SpaceNews. A few of the first 60 Starlink satellites launched by SpaceX malfunctioned as well this year, but their low Earth orbit is designed to decay to oblivion in just a few years.

Perhaps the most famous satellite glitch of all time occurred in 1990 when the Hubble Space Telescope was deployed only for NASA to learn its pricey mirror was flawed. A subsequent repair mission aboard the space shuttle Endeavour in 1993 installed corrective optics that compensated for the defect, allowing the telescope to provide astonishing images of the universe.

Sending humans to space for satellite repairs is prohibitively expensive, Ma said. Four subsequent Hubble service missions costing billions of dollars combined were performed by astronauts from the space shuttle. 

Faulty satellites have dogged most international space programs from Japan to Russia. The problem isn't limited to Earth orbit. In 1999, a NASA orbiter crashed into Mars because engineers used pounds of force instead of metric newtons in thruster software. The thrusters fired with about one-quarter of the anticipated force, leaving the spacecraft's orbit critically low.

The inability to repair satellites becomes a more pressing concern with every launch, Ma said.

"Big commercial satellites are costly. They run out of fuel or malfunction or break down," Ma said. "They would like to be able to go up there and fix it, but nowadays it's impossible."

NASA is hoping to change that. In 2022, the agency will launch a satellite capable of refueling other satellites in low Earth orbit. The goal is to intercept and refuel a U.S. government satellite. The project called Restore-L is expected to provide proof of concept for autonomous satellite repairs, NASA said.

A Colorado company called Maxar is providing the spacecraft infrastructure and robotic arms for the project.

Most satellites fall into disuse because they exhaust their supply of fuel - not from a critical malfunction, said John Lymer, chief roboticist for Maxar. Refueling alone would be a boon for the industry, he said.

"You're retiring a perfectly good satellite because it ran out of gas," he said.?Lymer said he's familiar with the work Ma is doing in his Intelligent Robotics and Autonomous Systems Lab.

"Ou Ma, who I've worked with for many years, works on rendezvous and proximity organization. There are all kinds of technical solutions out there. Some will be better than others. It's about getting operational experience to find out whose algorithms are better and what reduces operational risk the most."

Lymer said the industry is poised to take off, creating a boon for aerospace engineering students like those at UC.

"I think it's the future. We're going to crawl into it -- not leap," he said.

In Ma's lab, students are working on the automated navigation that satellites will need to dock with other satellites in space. It's tricky business since an inadvertent bump in zero gravity can send one or both vehicles tumbling.

"It's easy to make it tumble in space because nothing holds it. Then the satellite becomes even more difficult to grab. If it starts to tumble, it can tumble forever basically. It won't stop on its own," Ma said.

Engineering simulations can predict the dynamic behavior of a target satellite so an approaching satellite can safely arrest it, he said.

"We have simulation tools so from there we can accurately predict its behavior," he said.

"To grab something in space is really difficult. And grabbing something that's tumbling in space is even more difficult."

Time is of the essence. With every launch and every failed satellite, low Earth orbit edges closer to the Kessler effect, the scenario proposed by Donald Kessler in which satellite collisions create a cascade of debris that threatens the safety of future launches, as depicted in the 2013 Oscar-winning film "Gravity."

"Think of the speed of these objects. We're not talking about highway speed or even aircraft speed. They're traveling at 17,000 mph," Ma said. 

His research is helping to push the frontiers of knowledge that will pave the way for future space projects.

"We're not developing an entire mission. We're developing the underlying technology," Ma said. "Once the technology is proven, NASA or a commercial company would take it to the next step." 

At a university where Neil Armstrong worked as an aerospace engineering professor, first steps can be big ones.

Credit: 
University of Cincinnati

Imaging study provides new biological insights on functional neurological disorder

BOSTON - Individuals with functional neurological disorder (FND) have symptoms not explained by traditional neurological conditions, including limb weakness, tremor, gait abnormalities, seizures and sensory deficits. New research led by investigators at Massachusetts General Hospital (MGH) and published in Psychological Medicine has uncovered pathways in the brain's white matter that may be altered in patients with FND. The findings advance current understanding of the mechanisms involved in this disease, and offer the possibility of identifying markers of the condition and patients' prognosis.

Because conventional tests such as clinical magnetic resonance imaging (MRI) brain scans and electroencephalograms (EEGs) are usually normal in patients with FND, there are currently no brain-based markers for this disorder and diagnoses are made using physical examination signs. More precise research-based imaging methods such as functional MRI and quantitative MRI have revealed several differences in the brains of some patients, including in gray matter regions. To look for any differences in the brain's white matter--which is composed of bundles of axons coated with protective myelin to help conduct nerve signals--investigators used a technique called diffusion tensor imaging (DTI), which measures the diffusion of water molecules.
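The article does not go into the imaging mathematics, but the scalar most commonly derived from the fitted diffusion tensor as an index of white matter integrity is fractional anisotropy (FA). The short sketch below, using illustrative eigenvalues rather than study data, shows how FA is computed from a 3x3 tensor:

```python
import numpy as np

def fractional_anisotropy(D):
    """Fractional anisotropy of a 3x3 diffusion tensor D.

    FA runs from 0 (water diffuses equally in all directions) to 1 (diffusion
    confined to a single direction, as along a coherent white-matter fiber bundle).
    """
    lam = np.linalg.eigvalsh(D)  # eigenvalues of the tensor
    num = np.sqrt(((lam[0] - lam[1]) ** 2 +
                   (lam[1] - lam[2]) ** 2 +
                   (lam[2] - lam[0]) ** 2) / 2.0)
    den = np.sqrt((lam ** 2).sum())
    return num / den if den > 0 else 0.0

# Illustrative tensor: diffusion much faster along one axis than the other two,
# roughly what is expected inside a healthy myelinated fiber tract.
D = np.diag([1.7e-3, 0.3e-3, 0.3e-3])  # mm^2/s, made-up values
print(round(fractional_anisotropy(D), 3))  # ~0.8
```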

The team used DTI to examine the brain white matter of 32 patients with FND and 36 healthy controls. Patients also provided information on the severity of their symptoms, the extent of the physical disability they experience, and the duration of their illness.

The researchers found that patient-reported impairments in physical health and illness duration were each associated with disruptions in white matter fibers within the stria terminalis/fornix, a pathway that is the principal output of the amygdala and hippocampus (brain areas that play roles in emotion/salience and learning/memory, respectively). This is notable given that several structural and functional neuroimaging studies have identified amygdalar and hippocampal abnormalities in patients with FND. Furthermore, reduced integrity within another pathway called the medial forebrain bundle also showed a relationship to patient-reported physical health impairments.

"The findings point to the potential importance of white matter pathways in the biology of FND," said lead author Ibai Diez, PhD, a senior research fellow in Neurology at MGH. "Our methodological approach here is another novelty. We conducted a set of network analyses that not only identifies patterns of white matter alterations, but also links specific patterns of white matter changes to cortical and subcortical brain areas."

Additional research is needed to determine the potential clinical relevance of the results. "Given that white matter disruptions in the stria terminalis/fornix and medial forebrain bundle related to patient-reported impairments in physical health and illness duration, future analyses should evaluate if these white matter profiles might be connected to specific clinical outcomes," said senior author David Perez, MD, MMSc, director of the MGH FND Clinical and Research Programs. "Our work also requires further clarification and replication in future studies with larger samples."

Credit: 
Massachusetts General Hospital

Smooth operator: When earnings management is a good thing

New research from the Kelley School of Business makes the case that "smoothing the numbers" can be beneficial -- if you have the right team in place to handle the job.

Smoothing, in this case, means adjusting accounting reserves up or down. And contrary to the common wisdom that all earnings management is bad, researchers have identified a setting in which it can be good.

In a paper titled "Managerial Ability and Income Smoothing," David Farber, an associate professor of accounting at the Indiana University Kelley School of Business at IUPUI, and fellow researchers Bok Baik of Seoul National University and Sunhwa Choi of Sungkyunkwan University find that when high-ability management teams use their discretion to smooth bumps in earnings, future earnings and cash flows become more predictable, and a firm's stock price improves.

"We found that more-capable managers who use discretionary accounting choices to signal future performance provide more-useful financial reporting," Farber said. "Firms with high-ability managers who smooth earnings have more-predictable earnings and cash flows, and the stock market incorporates that information into a firm's stock price.

"High-ability management teams are better able to anticipate changes in their firms' prospects and can therefore better estimate accrual adjustments necessary to smooth their earnings. These managers are trying to communicate to the market by saying, essentially, 'We had some volatility in earnings this period, but going forward, we expect earnings to follow the path based on the smoother earnings.'"

The study also showed that if a firm's management team is not high ability and attempts to smooth earnings, the firm will likely see negative implications, like less-predictable earnings and lower stock prices. Additionally, the reputations of managers in these firms will likely suffer.

A final version of the paper will be published in The Accounting Review, one of the top scholarly accounting journals in the world, in July.

Credit: 
Indiana University

Earthquakes, chickens, and bugs, oh my!

Two new algorithms could help earthquake early warning systems buy you a few extra seconds to drop, cover, and hold on before the ground begins to shake.

Computer scientists at the University of California, Riverside have developed two algorithms that will improve earthquake monitoring and help farmers protect their crops from dangerous insects, or monitor the health of chickens and other animals. The algorithms spot patterns in enormous datasets more quickly, with less computing power and at lower cost than other methods, and they have been used to improve earthquake detection, monitor the insect vector Asian citrus psyllid, and evaluate the feeding behavior of chickens.

Big data, big problems

Sensors, such as seismic sensors, which automatically record events that happen repeatedly over a period of time, have a problem. They gather so much data that it's hard to spot patterns. Time series analysis remedies this by looking for other examples of a sample sequence within a dataset, usually using graphics processing units, or GPUs. But for very large datasets this becomes impractical because it requires too many GPUs, which increases the cost.

Zachary Zimmerman, a doctoral student in computer science in the Marlan and Rosemary Bourns College of Engineering, built on an algorithm previously developed by co-author and professor of computer science Eamonn Keogh to handle extremely large datasets and ran it on 40 GPUs hosted on the Amazon Web Services cloud.

The algorithm, called SCAMP, sorted nearly two years of seismic recordings from California's Parkfield Fault, a segment of the San Andreas Fault located near the town of Parkfield, in just 10 hours, at a reasonable cost of about $300, and discovered 16 times more earthquakes than were previously known.
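SCAMP itself is a research GPU implementation and its code is not reproduced here; as an illustration of the underlying matrix profile idea, the open-source stumpy library performs the same kind of all-pairs self-join on a time series. The sketch below uses a synthetic signal with a repeated pattern standing in for repeated seismic waveforms in a continuous recording:

```python
import numpy as np
import stumpy  # open-source matrix profile library (pip install stumpy)

# Synthetic "sensor" trace: noise with the same short waveform buried twice.
rng = np.random.default_rng(0)
signal = rng.normal(0.0, 1.0, 2000)
pattern = np.sin(np.linspace(0, 3 * np.pi, 50))
signal[300:350] += 3 * pattern
signal[1500:1550] += 3 * pattern

m = 50                        # subsequence (window) length
mp = stumpy.stump(signal, m)  # matrix profile via an all-pairs self-join

# Column 0 holds each window's distance to its nearest neighbor elsewhere in the
# series; the smallest values mark the most similar repeated shape (a "motif"),
# which is how repeated earthquake waveforms surface without a template.
motif_idx = int(np.argmin(mp[:, 0].astype(float)))
neighbor_idx = int(mp[motif_idx, 1])
print(motif_idx, neighbor_idx)  # expected near 300 and 1500
```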

"It is difficult to overemphasize how scalable this algorithm is," Keogh said. "To demonstrate this, we did one quintillion--that's 1 followed by 18 zeros--pairwise comparisons of snippets of earthquake data. Nothing else in the literature comes within one-tenth of a percent of that size."

Identifying earthquakes isn't always easy

"The most fundamental problem in seismology is identifying earthquakes at all. There have been a number of methodological improvements by seismologists applying strategies from computer science to look for similar patterns," said co-author Gareth Funning, an associate professor of seismology. "The big advance here is that the dataset you can manage is way, way bigger. When we're looking at seismic data we used to think we were doing well comparing everything in a two-month time window."

Other methods of earthquake detection require the algorithm to find sequences that match a known earthquake. The UC Riverside method instead compares everything within a given time and thus can identify earthquakes that don't necessarily match one given as a model.

For example, their analysis of the Parkfield data discovered subtle, low-frequency earthquakes underneath the San Andreas fault. Sequences of these earthquakes, also known as nonvolcanic tremors, accompany deep, slow movements of tectonic plates.

Flurries of low-frequency earthquakes have occasionally preceded massive earthquakes, like the one in Japan 10 years ago. Better detection of low-frequency earthquakes could help improve forecasts of the largest earthquakes and also help scientists better monitor movements of tectonic plates.

From earthquakes to chickens and insect pests

The SCAMP algorithm can also detect harmful agricultural pests. Keogh attached sensors that recorded the motions of insects as they sucked juices out of leaves and used the algorithm to identify Asian citrus psyllid, the insect responsible for devastating citrus crops by spreading the bacteria that causes Huanglongbing, or citrus greening disease. He also used the algorithm to analyze a dataset from accelerometers, which measure various kinds of movements, attached to chickens over a period of days. SCAMP then identified specific patterns related to feeding and other behaviors.

SCAMP has one limitation, however.

"SCAMP requires you to have the entire time series before you search. In cases of mining historic seismology data, we have that. Or in a scientific study, we can run the chicken around for 10 hours and analyze the data after the fact," said co-author Philip Brisk, an associate professor of computer science and Zimmerman's doctoral advisor. "But with data streaming right off the sensor, we don't want to wait 10 hours. We want to be able to say something is happening now."

Faster real-time earthquake detection

Zimmerman used the billion datapoints, called a matrix profile, generated by SCAMP's analysis of the Parkfield fault data to train an algorithm he called LAMP. LAMP compares the streaming data to examples it has seen before to select the most relevant data as it comes off the sensor.

"Having the matrix profile available to you at the sensor means that you can immediately know what's important and what's not. You can do all your checks in real time because you're just looking through the important bits," Zimmerman said.

The ability to more quickly interpret seismic data could improve earthquake warning systems that already exist.

"With earthquake early warning, you're trying to detect things at monitoring stations and then forward the information to a central system that evaluates whether or not it's a big earthquake," said Funning. "A setup like this could potentially do a lot of that discrimination work before it's transmitted to the system. You could shave time off the computation required to determine that a damaging event is in progress, buying people a couple extra seconds to drop, cover, and hold on."

"A couple of seconds is huge in earthquake early warning," he added.

Credit: 
University of California - Riverside

Image release: Giant magnetic ropes in a galaxy's halo

IMAGE: This combined radio/optical image of the "Whale Galaxy," NGC 4631, reveals magnetic structures. The spiral galaxy is seen edge-on, with its disk of stars shown in pink. The filaments, shown...

Image: 
Composite image by Jayanne English of the University of Manitoba, with NRAO VLA radio data from Silvia Carolina Mora-Partiarroyo and Marita Krause of the Max-Planck Institute for Radioastronomy. The...

This image of the "Whale Galaxy" (NGC 4631), made with the National Science Foundation's Karl G. Jansky Very Large Array (VLA), reveals hair-like filaments of the galaxy's magnetic field protruding above and below the galaxy's disk.

The spiral galaxy is seen edge-on, with its disk of stars shown in pink. The filaments, shown in green and blue, extend beyond the disk into the galaxy's extended halo. Green indicates filaments with their magnetic field pointing roughly toward us and blue with the field pointing away. This phenomenon, with the field alternating in direction, has never before been seen in the halo of a galaxy.

"This is the first time that we have clearly detected what astronomers call large-scale, coherent, magnetic fields far in the halo of a spiral galaxy, with the field lines aligned in the same direction over distances of a thousand light-years. We even see a regular pattern of this organized field changing direction," said Marita Krause, of the Max-Planck Institute for Radioastronomy in Bonn, Germany.

An international team of astronomers who are part of a project called the Continuum HAlos in Nearby Galaxies -- an EVLA Survey (CHANG-ES), led by Judith Irwin of Queen's University in Ontario, said the image indicates a large-scale, coherent magnetic field that is generated by dynamo action within the galaxy and spirals far outward in the form of giant magnetic ropes perpendicular to the disk.

"We are a little bit like the blind men and the elephant, since each time we look at the galaxy in a different way we reach a different conclusion about its nature! However, we seem to have one of those rare occasions where a classical theory, about magnetic generators called dynamos, predicted the observations of NGC 4631 quite well. Our dynamo model produces spiralling magnetic fields in the halo that are a continuation of the normal spiral arms in the galaxy's disc," said Richard Henriksen, of Queen's University.

The scientists are continuing their work to further refine their understanding of the galaxy's full magnetic structure.

The image was made by combining data from multiple observations with the VLA's giant dish antennas arranged in different configurations to show both large structures and finer details within the galaxy. The naturally-emitted radio waves from the galaxy were analyzed to reveal the magnetic fields, including their directions.
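The release does not spell out the analysis, but the standard radio technique for inferring whether the line-of-sight field points toward or away from us is the Faraday rotation measure (RM): the polarization angle rotates linearly with the square of the wavelength, and the sign of the slope gives the direction (positive RM means the field points toward the observer). A toy sketch with made-up measurements, ignoring the 180-degree ambiguity in polarization angles that real analyses must resolve:

```python
import numpy as np

# Hypothetical polarization angles (radians) measured at several wavelengths (meters).
# chi(lambda^2) = chi_0 + RM * lambda^2, so RM is the slope in lambda-squared space.
wavelengths = np.array([0.20, 0.13, 0.06])
angles = np.array([0.95, 0.62, 0.41])  # invented values for illustration

lam2 = wavelengths ** 2
RM, chi0 = np.polyfit(lam2, angles, 1)  # linear fit: slope = rotation measure

direction = "toward us" if RM > 0 else "away from us"
print(f"RM = {RM:.1f} rad/m^2 -> mean line-of-sight field points {direction}")
```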

The scientists said the techniques used to determine the direction of the magnetic field lines, illustrated by this image, now can be used on this and other galaxies to answer important questions about whether coherent magnetic fields are common in galactic halos and what their shapes are.

Building such a picture, they said, can answer important questions such as how galaxies acquire magnetic fields, and whether all such fields are produced by a dynamo effect. Can these galaxy halo fields illuminate the mysterious origin of the even larger intergalactic magnetic fields that have been observed?

NGC 4631, 25 million light-years from Earth in the constellation Canes Venatici, is about 80,000 light-years across, slightly smaller than our own Milky Way. It was discovered by the famous British astronomer Sir William Herschel in 1787. This image also shows a companion, NGC 4627, a small elliptical galaxy, just above NGC 4631.

Credit: 
National Radio Astronomy Observatory

Cardiac events in First Nations people with diabetes have decreased, but still higher than in non-First Nations people

A new study provides insight into the cardiovascular health and health care services accessed by First Nations people with diabetes over a 20-year period in Ontario. It showed a decrease in cardiac events, but hospitalizations and death were still more frequent in this population than in non-First Nations people. The research is published in CMAJ (Canadian Medical Association Journal).

"This study provides new insights into patterns of cardiovascular health and related care of First Nations people with diabetes," writes Dr. Michael Green, professor of family medicine and public health at Queen's University, Kingston, and a senior scientist at ICES, Toronto, Ontario, with coauthors. "While progress is being made overall in diabetes management and reducing cardiac events and mortality, First Nations people with diabetes remain at greater risk of cardiovascular disease and complications than other people in Ontario."

This research is published concurrently with a related article in CMAJ Open that describes the methods used to identify the cohorts of First Nations people and other people in Ontario. These papers are the start of a series on diabetes and First Nations health. They are part of a partnership between researchers and Chiefs of Ontario that engages First Nations patients, families, elders and community members in the project: http://www.cmajopen.ca/lookup/doi/10.9778/cmajo.20190096

"When Ontario First Nations take charge of our own research agenda intertwined with true meaningful partnership from researchers who value and incorporate First Nations perspectives regarding research, this results in a report that includes First Nations patients' knowledge and understanding of this devastating disease," says Carmen Jones, co-investigator and Health Director, Chiefs of Ontario.

The 20-year study (1996-2015) of more than 1.3 million people aged 20 to 105 years included 22,000 First Nations people.

Rates of various cardiac events, such as heart failure and myocardial infarction, among First Nations people in Ontario with diabetes decreased up to 60% during the study period, but rates were still higher than in non-First Nations people. Use of revascularization procedures increased 2- to 3-fold, and use of medications such as statins to protect against cardiac events also increased substantially, suggesting these may have contributed to the declining cardiac event rates.

"Although our finding of higher cardiac event rates among First Nations people is consistent with those of previous studies, the observed decreasing rates over time contrasts with earlier Canadian studies showing increasing rates of cardiovascular disease among First Nations people in general, suggesting that progress has been made in the management of diabetes and other cardiac risk factors in this population," write the authors.

"Efforts to further close the gap in cardiovascular risk will require emphasis on early prevention strategies that not only include Western medical therapies, but must also account for the complex context of the health of First Nations people in Canada and address their unique social and cultural determinants of health," the authors conclude.

The authors note study limitations, including low absolute numbers of First Nations people that did not allow for more detailed analysis of subgroups. As the study was focused on diabetes, researchers did not look at the effect of other illnesses on the differences in outcomes between First Nations and non-First Nations peoples.

Credit: 
Canadian Medical Association Journal

Study reveals lower rates of cancer and early death in Adventists, including among black individuals

A recent study found lower rates of premature death and cancer in Seventh-day Adventists, a Protestant denomination long known for health promotion, compared with individuals in the general U.S. population. Published early online in CANCER, a peer-reviewed journal of the American Cancer Society, the study also found similar results when limiting the analysis to Black Adventists and the Black general population.

Health behaviors promoted by the Seventh-day Adventist Church include not smoking, eating a plant-based diet, regular exercise, and maintaining normal body weight. Previous research suggests that Seventh-day Adventists have lower risks of many cancers, heart disease, and diabetes, and in California, live longer than individuals in the general population. Results vary by cancer type, however, with little published data for Black individuals.

To provide additional insights, Gary Fraser, PhD, of Loma Linda University in Loma Linda, California, and his colleagues compared death rates and cancer incidence between a national Seventh-day Adventist population and a representative sample of the U.S. population. Specifically, the researchers analyzed data from the nationally inclusive Adventist Health Study-2 and a U.S. Census population, and they adjusted for differences in education, location of residence, and past smoking habits, so that these factors would not explain any of the results.

The team found significantly lower rates of death from any cause, as well as a lower incidence of all cancers combined in the Adventist population (by 33 percent and 30 percent, respectively), and lower incidence rates specifically for breast, colorectal, rectal, and lung cancer (by 30 percent, 16 percent, 50 percent, and 30 percent, respectively). Death rates and incidence of all cancers combined were also significantly lower among Black Adventist individuals compared with Black individuals in the U.S. Census population (by 36 percent and 22 percent, respectively).

"This is the first confirmation of previous reports, now using national populations. Further, we were able to control for differences in tobacco use by excluding any currently smoking non-Adventists, and adjusting for past smoking and time since quitting in previous smokers," said Dr. Fraser. "In addition, this is the first report that includes a comparison among Black individuals alone."

Dr. Fraser noted that the findings do not clearly identify the causes of the health benefits experienced by Adventists, but other studies have provided evidence that the plant-based dietary habits chosen by many Adventists are one important factor. "Adventist vegetarians have less overweight, diabetes, hypertension, elevated blood cholesterol, coronary heart disease, and several cancers compared with Adventist non-vegetarians, who themselves are lower than usual consumers of animal foods," he said. "Thus, the findings in this report comparing all Adventists--vegetarians and non-vegetarians--to average Americans are largely as expected, and strongly suggest that these health advantages may be available to all Americans who choose similar diets, in addition of course to other well-known prudent lifestyle choices such as regular physical activity, avoiding smoking, and care with body weight."

Credit: 
Wiley

Thermal cameras effective in detecting rheumatoid arthritis

A new study, published today in Scientific Reports, highlights that thermal imaging has the potential to become an important method to assess Rheumatoid Arthritis.

Results of the study, carried out with 82 participants, confirm that both palm and finger temperature increase significantly in patients with Rheumatoid Arthritis (RA).

RA patients were examined by two rheumatologists. A subset of these participants underwent diagnostic ultrasonography by a trained rheumatologist in order to ensure that the recruited participants had no active signs of synovitis in their hands and wrists.

Dr Alfred Gatt, from the University of Malta and a Visiting Fellow at Staffordshire University, was lead author of the report. He explained: "We used a FLIR T630 thermal camera and followed the guidelines of the American Thermology Association.

"The results of our study show that the two probability curves intersect at 31.5°C for palm temperatures, indicating that individuals whose palm temperatures is less than 31.5°C are more likely to be healthy; while those persons whose palm temperature is less than 31.5°C are more likely to have Rheumatoid Arthritis. Similarly, for finger temperatures, the two probability curves intersect at 30.3°C"

"While ultrasonography had not detected any significant changes in our study population, thermography flagged a possible ongoing disease process by reporting these higher temperatures".

"We hypothesise that this temperature difference may be attributed to underlying subclinical disease activity or else that the original inflammatory process may cause irreversible thermal changes that persist after the disease activity has resolved. We will need further studies to substantiate this."

Dr Gatt added: "Thermal imaging is an emerging technology within medicine and has the potential to become an important clinical tool as disease processes can vary the magnitude and pattern of emitted heat in a person with Rheumatoid Arthritis."

Associate Professor Cynthia Formosa, also from the University of Malta and Visiting Fellow at Staffordshire University, said: "This is the first study to explore thermographic patterns of patients with Rheumatoid Arthritis comparing them to healthy controls. Our results have clearly shown that an RA hand without active synovitis [the medical term for inflammation of the synovial membrane] exhibits higher temperatures when compared to healthy individuals."

Professor Nachi Chockalingam, Director of the Centre for Biomechanics and Rehabilitation Technologies at Staffordshire University, co-authored the study. He added: "Rheumatoid Arthritis affects more than 400,000 adults in the UK and can lead to deformity, disability and cardiovascular problems. Timely detection of ongoing synovitis in RA is of paramount importance to help enable tight disease control. However, we know RA can be difficult to diagnose."

"This work showcases our successful collaboration with colleagues in Malta and the potential thermal imaging has in helping practitioners to assess the disease. In addition to making some seminal scientific contributions, our collaborative research work informs our curriculum development and teaching."

Credit: 
Staffordshire University

A new world map rates food sustainability for countries across the globe

image: This map scores the sustainability of food systems country-by-country. The scale goes from blue to red, with blue indicating a high sustainability score. Gray means there is not enough data available to rate the country.

Image: 
Béné et al., International Center for Tropical Agriculture

Increased awareness of how human diets exacerbate climate change - while failing to properly nourish more than 800 million people - makes a better understanding of food systems a global priority. Global initiatives now call for us to transform our diets - for our health and the health of the planet - to help make food systems "sustainable." But researchers at the International Center for Tropical Agriculture (CIAT) and colleagues argue that social and economic variables also need to be included if we are to understand exactly how sustainable our food systems are.

The researchers scoured almost two decades of scientific literature related to food systems. They settled on 20 indicators that are available to 97 countries from low-, middle- and high-income regions, and built a global map to rate the sustainability of food systems across the globe. The indicator can be used to track changes in sustainability over time and has the potential to guide policy and action as climate change, rising populations, and increased demand for food place unprecedented pressure on global food systems.

"Addressing the question of the (un)sustainability of our food systems is critical as the world is bracing for hard-choice challenges and potentially massive tradeoffs around issues related to food quality and food security in the coming decades," wrote the authors in the journal Scientific Data, which is published by Nature. The research was published on ­­­November 25.

Food systems - which refer to the whole web of food production and consumption, from pre-production to food waste - are still a relatively new area of research. There is still little uniformity in the indicators used by researchers, governments and international development organizations. This research also aimed to establish standardized terms and methods to help further food systems research.

The study's authors sorted the 20 indicators into four dimensions: environment, economic, social, and food and nutrition. The indicators cover a broad range of factors including greenhouse gas emissions from agriculture, size of the female labor force, fair trade, food price volatility, and food loss and waste.
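The release does not describe the paper's exact scoring method; a common way to build a composite index of this kind is to rescale every indicator to a shared range and then average within and across dimensions, as in the hypothetical sketch below (indicator names and values are invented for illustration):

```python
import numpy as np

# Hypothetical, already-normalized indicator values for one country, grouped into
# the four dimensions named in the study (1 = most sustainable; indicators where
# "more is worse" are assumed to have been inverted first).
dimensions = {
    "environment":        {"ghg_emissions_agriculture": 0.62, "water_use": 0.48},
    "economic":           {"food_price_volatility": 0.35, "fair_trade": 0.70},
    "social":             {"female_labor_share": 0.55, "rural_infrastructure": 0.40},
    "food_and_nutrition": {"undernourishment": 0.30, "food_loss_and_waste": 0.52},
}

def sustainability_score(dims):
    """Average indicators within each dimension, then average the dimension scores,
    so a dimension with many indicators does not dominate the overall score."""
    dim_scores = {name: float(np.mean(list(vals.values()))) for name, vals in dims.items()}
    return dim_scores, float(np.mean(list(dim_scores.values())))

per_dimension, overall = sustainability_score(dimensions)
print(per_dimension)
print(f"overall food-system sustainability score: {overall:.2f}")
```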

"This is the first attempt to empirically measure and characterize the sustainability of the food systems worldwide considering not only the dimension food security and nutrition, or environment, but also economic, and social dimensions," said Camila Bonilla, a co-author at the University of California, Davis.

The world's largest employer

"The food system is probably the largest employer in the world, so the sustainability of food systems is also about the economic and social contributions of those hundreds of thousands of people and enterprises that are involved in some aspect of the system -- from production all the way to food retail and distribution and consumption," said Christophe Béné, the study's lead author and senior policy expert at CIAT's Decision and Policy Analysis (DAPA) research area. "It means that the economic and social dimensions of food system sustainability cannot be ignored."

The map identifies some important knowledge gaps.

"Our research highlights how little is currently known about food systems," said Béné. "The reason is that national statistical systems, in both high- and lower-income countries, are collecting only a small portion of the information that is needed to build a comprehensive picture of the whole system."

In the 83 documents used in their literature review, the researchers found 192 different indicators, many of which have some level of overlap but not all of which were directly comparable across countries.

"This research represents a critical step forward in understanding the relationship between the structure and function of food systems and their sustainability," said Steven Prager, a co-author senior scientist at CIAT who works on integrated modeling. "The global food system is really a set of interconnected subsystems and this work offers one of the most systematic attempts to date to unpack food system dynamics, from farm to fork to policy."

Credit: 
The Alliance of Bioversity International and the International Center for Tropical Agriculture

Using artificial intelligence to analyze placentas

UNIVERSITY PARK, Pa. -- Placentas can provide critical information about the health of the mother and baby, but only 20 percent of placentas are assessed by pathology exams after delivery in the U.S. The cost, time and expertise required to analyze them are prohibitive.

Now, a team of researchers has developed a novel solution that could produce accurate, automated and near-immediate placental diagnostic reports through computerized photographic image analysis. Their research could allow all placentas to be examined, reduce the number of normal placentas sent for full pathological examination and create a less resource-intensive path to analysis for research -- all of which may positively benefit health outcomes for mothers and babies.

"The placenta drives everything to do with the pregnancy for the mom and baby, but we're missing placental data on 95 percent of births globally," said Alison Gernand, assistant professor of nutritional sciences in Penn State's College of Health and Human Development. "Creating a more efficient process that requires fewer resources will allow us to gather more comprehensive data to examine how placentas are linked to maternal and fetal health outcomes, and it will help us to examine placentas without special equipment and in minutes rather than days."

The team's study was presented at the International Federation of Placenta Associations meeting held in Buenos Aires, Argentina, in September and at the International Conference on Medical Image Computing and Computer Assisted Intervention held in Shenzen, China, in October.

The patent-pending technology uses artificial intelligence to analyze an image of each side of the placenta after delivery and then produces a report with critical information that could impact the clinical care of the mother and child, such as whether the fetus was getting enough oxygen in the womb or if there is a risk of infection or bleeding.

Currently, there are no evidence-based standards to determine when a placenta should be examined, and low-income countries and areas where home births are more prevalent often lack resources to conduct even a baseline placental analysis. This digital tool could offer a solution, as an individual would need only a smartphone or tablet with the appropriate software.

"Even in very low-resource areas, someone typically has a smartphone," explained Gernand. "Our goal is for a medical professional or trained birth attendant to take a photo which, after analysis through licensed software, could provide immediate information that aids in the care of the mother and baby."

For example, an umbilical cord with an abnormal insertion point or excessive twisting can be a predictor of neonatal stroke. Examination after a stillbirth could give a family information about whether future stillbirths may reoccur and help medical professionals advise them on possible interventions.

To create the system, the researchers analyzed 13,000 high-quality images of placentas and their corresponding pathology reports from Northwestern Memorial Hospital. Then, the researchers labeled a training set of images with data points critical to understanding the placenta, such as areas of incompleteness and the umbilical cord insertion point.

The images were used to train neural networks using CPU and GPU servers that could automatically analyze new placental images to detect features linked to abnormalities and potential health risks. Their system produced predictions on unlabeled images efficiently, and comparisons with the original pathology reports demonstrated the system's high accuracy and clinical potential.
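The team's model and label set are not published in the release; as a rough illustration of the general approach, a convolutional network fine-tuned to predict several placental findings from one photo at a time might be set up as below (the findings listed are hypothetical placeholders, and pretrained weights would normally be loaded for the backbone):

```python
import torch
import torch.nn as nn
from torchvision import models

# Hypothetical multi-label setup: each placenta photo gets an independent yes/no
# prediction for a handful of findings.
FINDINGS = ["incomplete_placenta", "abnormal_cord_insertion", "meconium_staining"]

class PlacentaNet(nn.Module):
    def __init__(self, n_findings=len(FINDINGS)):
        super().__init__()
        backbone = models.resnet18(weights=None)  # swap in pretrained weights in practice
        backbone.fc = nn.Linear(backbone.fc.in_features, n_findings)
        self.backbone = backbone

    def forward(self, x):            # x: (batch, 3, 224, 224) photo tensor
        return self.backbone(x)      # one logit per finding

model = PlacentaNet()
criterion = nn.BCEWithLogitsLoss()   # multi-label: independent sigmoid per finding

# One illustrative training step on random tensors standing in for labeled photos.
images = torch.randn(4, 3, 224, 224)
labels = torch.randint(0, 2, (4, len(FINDINGS))).float()
loss = criterion(model(images), labels)
loss.backward()
print(float(loss))
```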

"Past analyses have typically examined features independently and used a limited number of images," said James Wang, professor in Penn State's College of Information Sciences and Technology. "Our tool leverages artificial intelligence and a large and comprehensive dataset to make multiple decisions at the same time by treating the different parts of the placenta as complimentary. To our knowledge, this is the first system for comprehensive, automated placental analysis."

Additionally, this tool could advance pregnancy research and be useful for long-term care by providing clinically meaningful information to patients and practitioners.

Said Wang, "We don't want to replace pathologists, but rather we want to provide physicians with more information right at birth so they can make an efficient and informed decision about how to care for the mother and child."

"We're working to make the placental data accessible by translating it into something that's both clinician and patient friendly," Gernand concluded. "We know placental development and function is vital to the health of the pregnancy, but we only know a fraction of how much it can tell us about the health of the mom and baby. This research is a critical first step in building big data to better understand what we can learn from the placenta."

Credit: 
Penn State

Safety evaluation of conditionally immortalized cells for renal replacement therapy

image: Temperature-dependent effect of SV40T expression on ciPTEC-OAT1 proliferation and apoptosis-sensitivity. (A) Western blot analysis of SV40T levels in ciPTEC-OAT1 cultured at the permissive (33° C) temperature and the non-permissive (37° C) temperature for 1 day, 7 days or 7 days followed by a 4 h incubation at 33° C (switch). Intensity of the bands was normalized to GAPDH and quantification is depicted in the bar graph. Human kidney tissue protein sample served as control. Representative histograms and analysis of cell cycle distribution of ciPTEC-OAT1 cultured at (B) subconfluent and (C) confluent levels at 33° C and 37° C for 1 day, 7 days or 7 days followed by 4 h at 33° C (switch). (D) Cell viability analysis and (E) caspase-3/7 expression in ciPTEC-OAT1 cultured at 33° C and 37° C and exposed to increasing concentrations of nutlin-3a for 24 h. All values are expressed as the mean ± SEM of three independent experiments performed in triplicate. *p < 0.05, **p < 0.01, ***p < 0.001 (unpaired two-tailed Student's t-test and one-way ANOVA followed by Dunnett's multiple comparison test).

Image: 
Correspondence to: Rosalinde Masereeuw, email: r.masereeuw@uu.nl

Here, the research team assessed the safety of conditionally immortalized proximal tubule epithelial cells for bioartificial kidney application using in vitro assays and athymic nude rats.

They demonstrate that these cells do not possess key properties of oncogenically transformed cells, including anchorage-independent growth, lack of contact inhibition and apoptosis-resistance.

Taken together, this study lays an important foundation towards bioartificial kidney development by confirming the safety of the cell line intended for incorporation.

Dr. Rosalinde Masereeuw from the Division of Pharmacology, Utrecht Institute for Pharmaceutical Sciences, Utrecht University, Utrecht, The Netherlands said, "End-stage kidney disease represents irreversible kidney failure through a variety of causes."

Human primary proximal tubule epithelial cells have a limited life span in vitro and present risks such as functional changes occurring upon culturing, as well as dedifferentiation and senescence of the cells.

Issues related to the use of animal-derived cells in bioartificial kidney (BAK) devices include safety concerns that compromise approval for clinical application, as well as species differences in cell behaviour.

Due to the expression of temperature-sensitive SV40T, the cells can be expanded at the permissive temperature of 33°C and differentiated into mature cells at the non-permissive temperature of 37°C.

With this cell line, the authors demonstrated the capacity for efficient removal of uremic toxins when the cells are cultured on hollow fiber membranes, thereby creating fully functional kidney tubules.

Finally, they evaluated cell transforming properties and tumorigenic potential in vivo to gain more insight into safety and suitability of these cells for applications in renal replacement therapies.

The Masereeuw research team concluded, "by showing that ciPTEC-OAT1 do not portray fundamental characteristics of oncogenically transformed cells, do not present negative consequences of viral transductions and genomic transgene integrations, such as insertional mutagenesis, nor possess tumorigenic capacity in vivo, the present study lays an important foundation towards validating the safety of a conditionally immortalized cell line for clinical application as cell-based renal replacement therapy."

Credit: 
Impact Journals LLC