Culture

Diving into water treatment strategies for swimming pools

With summer in full swing, many people are cooling off in swimming pools. However, some of the substances that are made when chlorine in the water reacts with compounds in human sweat, urine or dirt aren't so refreshing. Now, researchers have compared the effectiveness of different water treatment processes in mitigating these so-called disinfection byproducts (DBPs). They report their results in ACS' journal Environmental Science & Technology.

Chlorine is usually added to pool water to kill harmful microbes. However, this disinfectant can react with substances in the pool water -- many of which are introduced by swimmers themselves -- to form DBPs, which can irritate the eyes, skin and lungs. Most pool systems continuously recirculate water through various treatment steps to both disinfect the water and reduce DBPs and their precursors. But because of the difficulty of comparing swimming pools with different conditions, such as number of swimmers, chlorine dosing or filling-water quality, scientists don't currently know which strategy is the best. So, Bertram Skibinski, Wolfgang Uhl and colleagues wanted to compare several water treatment strategies under the controlled and reproducible conditions of a pilot-scale swimming pool system.

The researchers continuously added compounds to their model swimming pool that simulated dirt and body fluids and added chlorine according to regulations for full-scale pools. Then, they treated the water with one of seven water treatment strategies. They found that the treatment using coagulation and sand filtration combined with granular activated carbon filtration was the most effective at lowering DBP concentrations. But even this treatment did not completely remove the contaminants because new DBPs were made more quickly than the old ones could be removed. When UV irradiation was used as a treatment step, the levels of some DBPs increased because the UV light elevated the reactivity of organic matter toward chlorine. New strategies need to be explored to more effectively remove DBPs and prevent new ones from forming, the researchers say.

Credit: 
American Chemical Society

Does likelihood of survival differ between patients with single vs. multiple primary melanomas?

Bottom Line: Patients with multiple primary melanomas had a higher likelihood of dying than those with a single primary melanoma in a study that used data from registries in the Netherlands. This observational study included nearly 57,000 patients (54,645 with a single primary melanoma and 2,284 with multiple primary melanomas). The study has limitations to consider, including a lack of information about family history; another possible limitation is that the researchers excluded melanoma in situ (the earliest form of the disease) because they analyzed features not associated with it. The findings suggest that stricter follow-up may be warranted for patients with multiple primary melanomas.

Authors: Mary-Ann El Sharouni, M.D., University Medical Center Utrecht, the Netherlands, and coauthors

(doi:10.1001/jamadermatol.2019.1134)

Editor's Note:  Please see the article for additional information, including other authors, author contributions and affiliations, financial disclosures, funding and support, etc.

Credit: 
JAMA Network

To increase bike commuters, look to neighborhoods

COLUMBUS, Ohio--People agree that bike commuting improves health, reduces air pollution and eases traffic, a recent survey suggests. But that wasn't enough to get most people to commute by bike.

The new research indicates that a person's neighborhood may play a large role in influencing the decision to commute by bike.

The study, published recently in the Journal of Transport and Land Use, could give city and regional planners new clues about how to design neighborhoods, streets and bike trails with active commuting in mind, said Yujin Park, lead author of the study and a doctoral candidate at The Ohio State University Knowlton School of Architecture.

"Bicycling contributes to urban vitality and, as a planner, I was interested in what the most influencing factors could be to make people be willing to choose a bicycle to commute," Park said. "We are interested in the urban factors that make a person ride a bicycle more."

The study found that people who live in high-density, mixed-use neighborhoods--the kinds of neighborhoods found in vibrant downtowns or near large college campuses--are more likely to commute by bike than their peers in suburban or rural areas.

Those findings held even in suburban neighborhoods residents considered "bike friendly." The study found that people who live in those neighborhoods might ride bikes for recreation or fun, but are less likely to commute to jobs or classes by bike than their peers who live in higher-density parts of the city.

The study was based on a survey of 1,200 people who commuted to Ohio State, one of the nation's largest public universities.

About 12.6 percent of those people classified themselves as bicyclists, and about 5.4 percent reported that a bicycle is their main mode of transportation to campus. People who lived in high-density areas were more than twice as likely to commute by bike as people in medium-density areas--and more than three times as likely to commute by bike as people in suburban areas.

Both bicyclists and non-bicyclists in the survey agreed that bicycling reduced environmental impacts of commuting, created health benefits and would save money, indicating that recognizing the benefits of bicycling is not a strong enough motivator to push non-bicyclists to start commuting on two wheels.

However, most bicyclists surveyed said they would commute by bike more frequently if they had access to more bike trails, bike-sharing opportunities and covered parking for their bikes at home or at work and school.

Non-bicyclists who lived in high-density neighborhoods appeared to be more concerned about safety--both from other vehicles and from crime--when traveling by bike than their peers who commute by bike.

Previous studies about bike commuting decisions have looked individually at commuters' attitudes about bike commuting and at their psychological perceptions about bike commuting. This research combines those attitudes and perceptions with neighborhood data.

"The conditional willingness to ride a bicycle to commute gradually decreases from high-density neighborhoods to low-density, single-family neighborhoods," Park said.

She said the findings indicate that if campus, city and regional planners want to increase the percentage of people commuting by bike, they might want to target public investment in protected bike lanes, bike paths and bike parking near downtown and campus areas.

Columbus, which won the U.S. Department of Transportation's first-ever Smart City Challenge grant, was awarded $50 million to improve mobility throughout central Ohio. Ohio State has been working with city leaders to increase sustainable transportation to, from and around campus.

"The people who live in those higher-density neighborhoods are the most likely to commute by bike," Park said. "Removing obstacles for them might make the most sense for where we invest our resources."

Credit: 
Ohio State University

The case of the poisoned songbirds

image: A selection of the birds collected after the reported die-off following a drench application of imidacloprid.

Image: 
Photo: Krysta Rogers

Researchers from the California Department of Fish and Wildlife's Wildlife Investigations Laboratory present their results from a toxicological investigation into a mortality event involving songbirds in a new publication in Environmental Toxicology and Chemistry.

On 17 March 2017, residents in Modesto, California, reported dead birds along the street and in front yards in a section of the town. The day prior to the incident, the city had made a drench application of imidacloprid, a pesticide synthetically derived from nicotine, to the base of trees that lined the street. The pesticide was reportedly mixed and applied according to package directions. Researchers at the Wildlife Investigations Laboratory were notified of the incident and conducted a postmortem investigation on the dead songbirds, which were identified as American goldfinches. The cause of death was determined to be imidacloprid poisoning, likely due to the ingestion of fallen elm tree seeds contaminated during the drench application.

Lead author, Krysta Rogers, and her colleagues noted that "The mortality event investigated in the present study highlights a previously unidentified risk of drench application for imidacloprid. The pesticide label states that the product be applied to the base of the tree and directly to the root zone. [However] Seeds, insects, or other invertebrates consumed by birds and other animals may be present within that zone. If these food items were contaminated during the drench application, they would be highly toxic to animals when ingested."

The authors recommend that "drench applications not occur during seed drop to minimize the risk of exposure to animals that consume fallen seeds and that mitigation measures could be taken to prevent small animals from accessing areas treated with the pesticide, at a minimum." Finally, the authors encourage integrated pest management, rather than the prophylactic use of pesticides, as the ideal approach.

Credit: 
Society of Environmental Toxicology and Chemistry

Computer scientists predict lightning and thunder with the help of artificial intelligence

image: Christian Schön and Professor Jens Dittrich from Saarland University have developed software that predicts thunderstorms with the help of artificial intelligence.

Image: 
Oliver Dietze

At the beginning of June, the German Weather Service counted 177,000 lightning bolts in the night sky within a few days. The natural spectacle had consequences: Several people were injured by gusts of wind, hail and rain. Together with Germany's National Meteorological Service, the Deutscher Wetterdienst, computer science professor Jens Dittrich and his doctoral student Christian Schön from Saarland University are now working on a system that is intended to predict local thunderstorms more precisely than before. It is based on satellite images and artificial intelligence. In order to investigate this approach in more detail, the researchers will receive 270,000 euros from the Federal Ministry of Transport.

One of the core tasks of weather services is to warn of dangerous weather conditions. These include thunderstorms in particular, as these are often accompanied by gusts of wind, hail and heavy rainfall. The Deutscher Wetterdienst (DWD) uses the "NowcastMIX" system for this purpose. Every five minutes it polls several remote sensing systems and observation networks to warn of thunderstorms, heavy rain and snowfall in the next two hours. "However, NowcastMIX can only detect the thunderstorm cells when heavy precipitation has already occurred. This is why satellite data are used to detect the formation of thunderstorm cells earlier and thus to warn of them earlier," explains Professor Jens Dittrich, who teaches computer science at Saarland University and heads the "Big Data Analytics" group. Together with his doctoral student Christian Schön and the meteorologist Richard Müller from DWD, he has therefore developed a system that could soon supplement NowcastMIX in predicting thunderstorms. Their project is a first step to explore the applicability of artificial intelligence in the prediction of weather and climate phenomena.

In order to accurately predict thunderstorms in a specific region, the so-called convection of air masses, i.e. the rise of heated air while colder air sinks in the surrounding area, must be detected early and precisely. This has been known for a long time. The key feature of the new system, however, is that it requires only two-dimensional images, namely satellite images, to detect these three-dimensional air movements.

In order to see what is happening in the sky three-dimensionally based on the two-dimensional images, the researchers use photographs taken fifteen minutes apart. Part of the image series for the respective area is fed as input to an algorithm that calculates what the future image, which is withheld from the algorithm, should look like. The scientists then compare this result with the real image. The size of the deviation between the forecast and reality - the researchers call it the "error" - then serves as input for a second algorithm, which the researchers have trained using machine learning to recognize the relationship between the size of the error and the occurrence of a thunderstorm. In this way, they can calculate whether there will be thunder and lightning or not. "This is the strength of applying artificial intelligence to large amounts of data. It recognizes patterns that remain hidden from us," explains Professor Dittrich. This is one of the reasons why he and other colleagues have just initiated the new bachelor's and master's degree program "Data Science and Artificial Intelligence".
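The two-stage scheme described above can be sketched in a few lines. This is a minimal illustration only: the frame predictor (simple linear extrapolation), the error measure (mean squared deviation), and the threshold "classifier" are stand-ins for the trained models the Saarland/DWD system applies to real satellite imagery.

```python
import numpy as np

def predict_next_frame(f_prev, f_curr):
    # Stage 1 (stand-in): linearly extrapolate pixel intensities from
    # two frames taken fifteen minutes apart.
    return 2.0 * f_curr - f_prev

def prediction_error(f_predicted, f_actual):
    # The deviation between forecast and reality - the "error".
    return float(np.mean((f_predicted - f_actual) ** 2))

def fit_threshold(errors, labels):
    # Stage 2 (stand-in): learn a decision boundary on the error from
    # labelled examples; the real system trains a full ML classifier.
    e = np.asarray(errors, dtype=float)
    y = np.asarray(labels, dtype=bool)
    return (e[y].mean() + e[~y].mean()) / 2.0

def predict_thunderstorm(error, threshold):
    # Large prediction errors indicate convection the extrapolation missed.
    return error > threshold
```

A calm sky evolves smoothly, so extrapolation matches the next frame and the error stays small; rapid convective development produces a large error, which the second stage maps to a thunderstorm warning.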

In the case of lightning and thunder, this combination is definitely "multifaceted," says Dittrich. "Based on the satellite images alone, we can predict flashes with an accuracy of 96 percent for the next 15 minutes. If the prediction window is opened further, the accuracy decreases, but still remains above 83 percent for up to five hours."

However, the rate of false alarms is still too high, according to the researchers. Nonetheless, they believe that they can significantly reduce these false alarms by training their model on additional features that are also used by the current NowcastMIX system, for example. The Federal Ministry of Transport has already granted the computer scientists from Saarbruecken 270,000 euros to investigate this in more detail.

Credit: 
Saarland University

Food insecurity linked to type 2 diabetes

A collaborative study by a team of Connecticut researchers shows there is a strong connection between food insecurity and insulin resistance, the underlying problem in type 2 diabetes. Insulin resistance occurs when cells are not able to respond normally to the hormone insulin.

The research by UConn School of Medicine, UConn School of Dental Medicine, Yale School of Public Health, Quinnipiac University, Hartford Hospital, and the Hispanic Health Council, suggests that for Latinos with type 2 diabetes food insecurity is linked to the disease's development and progression.

Published in the June issue of the Journal of Nutrition, the study points to the more than 40 million Americans, including 6.5 million children, who live in food-insecure households where access to nutritionally adequate and safe food is limited or uncertain.

In the United States, the rate of food-insecure households is higher for Latinos, who are also disproportionately affected by metabolic disorders such as type 2 diabetes. In fact, rates of type 2 diabetes are 12.1% among Hispanics compared with 7.4% for non-Hispanic whites.

According to the researchers, food insecurity may increase inflammation in the body. This inflammation can be caused by diet-related obesity and excess abdominal fat. Food insecurity is also stressful: it is often accompanied by mental distress, which triggers the release of cortisol and other stress hormones. These hormones may contribute to the progression of insulin resistance.

"Our findings support the plausibility of links between food insecurity and poor health," says Dr. Angela Bermúdez-Millán, assistant professor in the Department of Community Medicine and Health Care at UConn School of Medicine.

"Resources should be redirected toward ending or decreasing food insecurity, a powerful social determinant of health."

The study included 121 Latinos with type 2 diabetes. Sixty-eight percent of the participants were classified as food insecure.

Researchers tested the relationship between household food insecurity and insulin resistance using baseline data from the Community Health Workers Assisting Latinos Manage Stress and Diabetes (CALMS-D) randomized controlled trial. Fasting blood glucose, insulin levels, stress hormones, and markers of inflammation were measured.

They found that, compared with food-secure individuals, food-insecure individuals had significantly higher insulin resistance, insulin, glucose, stress hormones, inflammation, and total cholesterol. Inflammation and stress hormones were the mechanisms through which food insecurity and insulin resistance were linked.
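The "mechanism" claim above is the kind of result a mediation analysis produces. The sketch below shows the product-of-coefficients approach on simulated data; the variable names, effect sizes, and unadjusted regressions are illustrative assumptions, not the actual CALMS-D analysis.

```python
import numpy as np

def ols_slope(x, y):
    # Least-squares slope of y on x (with an intercept term).
    X = np.column_stack([np.ones(len(x)), np.asarray(x, float)])
    beta, *_ = np.linalg.lstsq(X, np.asarray(y, float), rcond=None)
    return beta[1]

def indirect_effect(exposure, mediator, outcome):
    # Product-of-coefficients mediation:
    #   a: effect of exposure (food insecurity) on the mediator
    #      (e.g. an inflammation marker)
    #   b: effect of the mediator on the outcome (insulin resistance)
    # The indirect (mediated) effect is a * b. Unadjusted, for illustration.
    a = ols_slope(exposure, mediator)
    b = ols_slope(mediator, outcome)
    return a * b

# Simulated cohort: food insecurity raises inflammation, which in turn
# raises insulin resistance (hypothetical effect sizes 2.0 and 3.0).
rng = np.random.default_rng(42)
food_insecure = rng.integers(0, 2, 500)
inflammation = 2.0 * food_insecure + rng.normal(0, 0.5, 500)
insulin_resistance = 3.0 * inflammation + rng.normal(0, 0.5, 500)

effect = indirect_effect(food_insecure, inflammation, insulin_resistance)
# effect should recover roughly 2.0 * 3.0 = 6.0 for this simulation
```

A full analysis would adjust the second regression for the exposure and other covariates and bootstrap a confidence interval for the indirect effect; this fragment only illustrates the logic of testing a mediating pathway.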

According to Bermúdez-Millán, the findings highlight the importance of implementing interventions that address food insecurity in order to mitigate its effects on inflammation, stress, and insulin resistance.

"Food insecurity is prevalent, widespread, and detrimental to health," she says. "Health care facilities can also help address the issue by screening for food insecurity and connecting patients to available resources and interventions."

Bermúdez-Millán is also calling on legislators to create policies to decrease food insecurity. She recommends modifying disbursement of SNAP benefits to possibly yield downstream benefits for diabetes control, and increasing access to minimally processed foods and more fruits, vegetables, and whole grains in local stores or community or home gardening venues.

The researchers note that a limitation of their study was that it was cross sectional, meaning participants were studied at one point in time. It is possible that those people with worse insulin resistance develop more inflammation and stress hormones, which make them sick or disabled, which in turn can deplete financial resources and lead to food insecurity.

Credit: 
University of Connecticut

Cascade exacerbates storage diseases

image: Dr. Susi Anheuser, Professor Dr. Konrad Sandhoff and Dr. Bernadette Breiden.

Image: 
(c) Photo: Volker Lannert/Uni Bonn

In rare, hereditary storage diseases such as Sandhoff's disease or Tay-Sachs syndrome, the metabolic waste from accumulating gangliosides cannot be properly disposed of in the nerve cells because important enzymes are missing. The consequences for the patients are grave: They range from movement restrictions to blindness, mental decline and early death. Scientists at the University of Bonn now demonstrate why these gangliosides also accumulate in patients with other storage diseases and cause a deterioration in them. The results will soon be presented in the Journal of Lipid Research and can be read online in advance.

Lysosomes take over the function of the stomach in living cells. They digest superfluous metabolic products and break them down into their components, which are then recycled into the building blocks and operating materials the cell urgently needs. If digestion in the lysosomes is blocked, for example as a result of genetic defects, this "cell waste" accumulates. The growing mountain of waste material can be toxic to nerve cells and may result in an early death, even in childhood.

In Tay-Sachs syndrome and Sandhoff's disease, components of nerve cell membranes cannot be properly degraded, resulting in the ganglioside GM2 being stored in the lysosomes. Gangliosides are water-insoluble fats, so-called lipids, which occur mainly in the ganglion cells of the nervous system. If the GM2-degrading enzyme Hex A is missing or impaired, for example due to genetic defects, destructive ganglioside storage occurs.

In Sandhoff's disease the degradation enzymes Hex A and Hex B are inactive. As with Tay-Sachs syndrome, GM2 storage leads to the destruction of nerve cells. Affected children develop normally in the first months of life; later on, blindness, movement restrictions and mental decline occur - and eventually an early death. "Previous therapeutic approaches have not led to any significant successes in these neurodegenerative gangliosidoses," says Prof. Dr. Konrad Sandhoff, senior professor at the LIMES Institute of the University of Bonn. Enzyme replacement therapies have failed due to the impermeability of the blood-brain barrier for these substances.

Researchers recreate conditions in the test tube

Together with his colleagues Dr. Susi Anheuser and Dr. Bernadette Breiden, Prof. Sandhoff has deciphered the role of the molecular environment in the lysosome for the successful degradation of GM2. In the test tube, the scientists reconstructed the tiny bubbles (vesicles) on which GM2 is degraded in the "cell stomach" (the lysosome). Normally, the auxiliary protein GM2AP helps to catch and release the GM2 that sits on the surface of the vesicles. It can then be degraded by the enzyme Hex A to harmless GM3. However, when the function of Hex A is blocked, GM2 accumulates - with fatal consequences for nerve cells.

In the test tube, the scientists investigated the factors that inhibit or improve the degradation of GM2. For example, the smaller the vesicles in the "cell stomach" and the more negatively charged their surface, the easier the degradation enzyme's access to the GM2 and the better the "digestion" functions. The presence of cholesterol and sphingomyelin, on the other hand, significantly reduces GM2 degradation. The investigations showed that the storage of these lipids in Niemann-Pick diseases triggers an additional GM2 accumulation in the lysosome, which significantly exacerbates the type C clinical picture even though the GM2-degrading enzyme Hex A is intact and active. "Genetic disorders of the degradation enzyme evidently trigger a cascade of as yet unknown consequential damages," summarizes Sandhoff.

In another study, Sandhoff's team shows that this cascade principle also applies to hereditary mucopolysaccharidoses. In these disorders, one of the storage substances, chondroitin sulphate, triggers additional ganglioside storage in the nerve cells by inhibiting GM2 degradation. In addition to the existing short stature, coarse facial features and liver enlargement, this also causes learning difficulties and startle reflexes, which can, however, be alleviated in the animal model by inhibitors of GM2 formation.

"The aim of current therapeutic approaches is to prevent the production of GM2 for these hereditary storage diseases," said Sandhoff. Drugs that are currently on the market only partially fulfill this requirement. "Perhaps gene replacement therapy, which has already been successful in animal models and will soon be used in patients, will be more successful," said the biochemist. With the help of gene shuttles (vectors), genes with the correct blueprint for the GM2 degradation enzyme will be introduced into the nerve cells of the brain. Sandhoff: "Hopefully the future will soon show whether this is actually a therapeutic option."

Credit: 
University of Bonn

The two faces of the Jekyll gene

image: Activity of Jekyll promoter as visualised by expression of Jekprom: GFP (green signal).

Image: 
Stefan Ortleb & Twan Rutten

The corresponding genes are lineage-specific for the grass tribes Triticeae and Bromeae and functioned as drivers for the speciation process within the Poaceae.

The Jekyll gene was first described in 2006 by researchers from the IPK in Gatersleben. They found that while it was crucial for sexual reproduction and fertility in barley (Hordeum vulgare), it was also partially similar to the Cn4 toxin produced by scorpions and played a role in cell autolysis. Inspired by this seemingly two-faced nature of the gene, the researchers had named it after Dr. Jekyll, the split-personality protagonist of the eponymous gothic novella Strange Case of Dr Jekyll and Mr Hyde. A follow-up study by the same group of IPK researchers, led by Dr. Ljudmilla Borisjuk, has now shown how stunningly apt their choice of name was.

Whilst working on Jekyll, Dr. V. Radchuk discovered that the gene exists as two different and much diverged allelic variants, Jek1 and Jek3. The Jek1/Jek3 sequences are located at the same chromosomal locus and are inherited in a monogenic Mendelian fashion; Jek1 and Jek3 share identical signal peptides, conserved cysteine positions and direct repeats. Although the encoded protein sequences share just over 50% similarity, the researchers found that Jek3 can complement the function of Jek1 in Jek1-deficient plants. Further investigations showed that Jekyll likely emerged in the common ancestor of the tribes Triticeae, which includes barley, and Bromeae, functioning as a lineage-specific gene and a probable driver of the separation of lineages within the Poaceae.

The dual allelic nature of Jekyll made the cover of The Plant Journal and was featured in the accompanying Research Highlight. In the meantime, the authors have started looking into the newly arisen questions about the cause and benefits of this allelic diversity in barley.

Credit: 
Leibniz Institute of Plant Genetics and Crop Plant Research

New contents: Neuronal Parkinson inclusions are different than expected

image: Content of Lewy bodies: The inclusions in the neurons contain mainly a membranous medley instead of the anticipated protein fibrils.

Image: 
University of Basel, Biozentrum

An international team of researchers involving members of the University of Basel's Biozentrum challenges the conventional understanding of the cause of Parkinson's disease. The researchers have shown that the inclusions in the brain's neurons, characteristic of Parkinson's disease, are comprised of a membranous medley rather than protein fibrils. The recently published study in "Nature Neuroscience" raises new questions about the etiology of Parkinson's disease.

Parkinson's disease is one of the most common neurodegenerative diseases worldwide. It is typically accompanied by motor defects such as tremor of the arms and legs, slowness of movement and muscle rigidity, which occur together with other non-motor symptoms. A hallmark of this progressively worsening and currently unstoppable disease is the presence of neuronal inclusions, so-called Lewy bodies, which occur in many regions of the brain in the course of the disease. For decades, it was assumed that Parkinson's disease is caused by deposits of insoluble fibrils of the protein alpha-synuclein in the Lewy bodies.

Membrane fragments instead of protein fibrils

In their current study, the Dutch, German and Swiss researchers, including Prof. Henning Stahlberg's team, refute this long-held thesis. Using state-of-the-art electron microscopes, they have been able to show that the Lewy bodies contain mainly membrane fragments, lipids and other cellular material instead of the anticipated fibrils.

"We used correlative light and electron microscopy and other advanced light microscopy methods to take a closer look at the brain of deceased Parkinson's patients and discovered that the Lewy bodies consist mainly of membrane fragments from mitochondria and other organelles, but have in most cases no or only negligible quantities of protein fibrils," says Stahlberg. "The discovery that alpha-synuclein did not present in the form of fibrils was unexpected for us and the entire research field."

Currently, the researchers do not know yet where and in what form the protein alpha-synuclein is hidden amongst the membrane fragments and how it is involved in the formation of Lewy bodies. However, their work indicates that the laboratory-based model of alpha-synuclein fibrils as a cause and mechanism of Parkinson's disease should be revisited. "Our finding indicates that in order to uncover the causes of a disease one needs to be more strongly guided by the exploration of the pathology in humans," explains Stahlberg.

Ultrastructural insights into cell organelles

"The question of why it has taken so long to better characterize Lewy bodies can perhaps be answered by the limitations of previous sample preparation and electron microscopy methods. Today's technologies enable us to take a much more detailed look into the morphology of the human brain," explains Stahlberg. "The big question for us now is: How does alpha-synuclein contribute to the formation of Lewy bodies, if it is not present in the form of fibrils?"

With their work, the researchers raise many new questions regarding the role of the Lewy bodies in the etiology of Parkinson's disease. The insights into such intracellular structures also provide important clues for potential therapeutic approaches to prevent or stop the formation and propagation of Lewy pathology in the brain.

Credit: 
University of Basel

New blood test for detecting Alzheimer's disease

Researchers from Lund University, together with the pharmaceutical company Roche, have developed a new blood marker capable of detecting whether or not a person has Alzheimer's disease. If the method is approved for clinical use, the researchers hope eventually to see it used as a diagnostic tool in primary healthcare. This autumn, they will start a trial in primary healthcare to test the technique.

Currently, a major support in the diagnostics of Alzheimer's disease is the identification of abnormal accumulation of the substance beta-amyloid, which can be detected either in a spinal fluid sample or through brain imaging using a PET scanner.

"These are expensive methods that are only available in specialist healthcare. In research, we have therefore long been searching for simpler diagnostic tools", says Sebastian Palmqvist, associate professor at the unit for clinical memory research at Lund University, physician at Skåne University Hospital and lead author of the study.

In this study, which is a collaboration between several medical centres, the researchers investigated whether a simple blood test could identify people in whom beta-amyloid has started to accumulate in the brain, i.e. people with underlying Alzheimer's disease. Using a simple and precise method that the researchers think is suitable for clinical diagnostics and screening in primary healthcare, the researchers were able to identify beta-amyloid in the blood with a high degree of accuracy.

"Previous studies on methods using blood tests did not show particularly good results; it was only possible to see small differences between Alzheimer's patients and healthy elderly people. Only a year or so ago, researchers found methods using blood sample analysis that showed greater accuracy in detecting the presence of Alzheimer's disease. The difficulty so far is that they currently require advanced technology and are not available for use in today's clinical procedures", says Sebastian Palmqvist.

The results are published in JAMA Neurology and based on studies of blood analyses collected from 842 people in Sweden (The Swedish BioFINDER study) and 237 people in Germany. The participants in the study are Alzheimer's patients with dementia, healthy elderly people and people with mild cognitive impairment.

The method studied by the researchers was developed by Roche and is a fully automated technique which measures beta-amyloid in the blood, with high accuracy in identifying the protein accumulation.

"We have collaborated with Roche for a long time and it is only now that we are starting to approach a level of accuracy that is usable in routine clinical care around the world", says Oskar Hansson, professor of neurology and head of the unit for clinical memory research at Lund University.

The researchers believe that this new blood sample analysis could be an important complement for screening individuals for inclusion in clinical drug trials against Alzheimer's disease or to improve the diagnostics in primary care which will allow more people to get the currently available symptomatic treatment against Alzheimer's disease.

"The next step to confirm this simple method to reveal beta-amyloid through blood sample analysis is to test it in a larger population where the presence of underlying Alzheimer's is lower. We also need to test the technique in clinical settings, which we will do fairly soon in a major primary care study in Sweden. We hope that this will validate our results", concludes Sebastian Palmqvist.

Credit: 
Lund University

Dung beetles use wind compass when the sun is high

image: In the new study, the researchers investigated dung beetles both out in the field and in the laboratory. Using fans to create wind, they could select the wind direction. They changed the sun's position in the sky using a mirror.

Image: 
Chris Collingridge

Researchers have shown for the first time that an animal uses different directional sensors to achieve the highest possible navigational precision in different conditions. When the sun is high, dung beetles navigate using the wind.

The discovery of the dung beetles' wind compass and how it complements the sun compass was made by an international research team comprising biologists from Sweden and South Africa.

"This is the first study that shows how an animal's biological compass can integrate different directional sensors, in this case wind and sun, in a flexible way. This enables the highest possible precision at all times", says Marie Dacke, professor of Sensory Biology at Lund University and leader of the research team.

The dung beetles cannot use the sun as a directional guide when the sky is cloudy, or during the few hours in the middle of the day when the sun is higher than 75 degrees above the horizon. A while later, when the sun is a little lower, they switch off the wind compass and again rely on the sun.

In the new study, the researchers investigated dung beetles both out in the field and in the laboratory. Using fans to create wind, they could select the wind direction. They changed the sun's position in the sky using a mirror.

The experiment shows that when the sun is at a low or medium elevation in the sky, the dung beetles change direction by 180 degrees if the sun's position is changed by 180 degrees. However, the dung beetles were not affected when the researchers changed the wind direction by 180 degrees when the sun was at these elevations.

When the sun was highest, the situation was reversed. The wind then showed the way, so the insects responded to a change in the wind direction of 180 degrees.

The results show that directional information can be transferred from the wind compass to the sun compass and vice versa. In this way, the dung beetles can continue on in one direction when one of the compasses becomes less reliable.
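The cue-switching behaviour described above can be written as a simple decision rule. The sketch below is a hypothetical illustration only: the 75-degree cutoff and the cloudy-sky condition come from the study, but the function, its parameters and return format are assumptions made for clarity, not the researchers' model.

```python
SUN_CUTOFF_DEG = 75.0  # above this elevation, the sun compass becomes unreliable

def choose_heading(sun_elevation_deg, sun_azimuth_deg, wind_direction_deg,
                   sky_is_clear=True):
    """Pick a directional cue the way the beetles appear to:
    use the sun when it is visible and not too high, otherwise the wind."""
    if sky_is_clear and sun_elevation_deg <= SUN_CUTOFF_DEG:
        return ("sun", sun_azimuth_deg)
    # Cloudy sky or midday sun: transfer the heading to the wind compass.
    return ("wind", wind_direction_deg)
```

Under this rule, rotating the sun's apparent position by 180 degrees changes the beetle's heading at low and medium elevations, while rotating the wind does so only near midday, matching the mirror-and-fan experiments.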

The sensors that register wind direction are on the insect's antennae.

"The insect brain is definitely not pre-programmed to always follow the same set of actions. On the contrary, we can show that such small brains work according to very dynamic principles that adapt to the conditions prevailing at a given moment", says Marie Dacke.

The researchers had previously shown that, during the night, some dung beetles navigate by the Milky Way and polarised moonlight while rolling their dung balls in a straight line. Combined with the results of the new study, these findings show that the insects' compass works at all times of the day or night and probably under almost any conditions.

"Now we will go on to study whether they can also use the wind at night. Another aspect we are curious about is what guides them when there is no wind and it's cloudy", comments Marie Dacke.

The aim of the research is to fully understand how very small brains handle large amounts of information in order to make a relevant decision: is it appropriate to turn left or right, or continue straight on?

Marie Dacke believes that the results will be of direct benefit within a few years, in areas like robot development and artificial intelligence (AI). Just like dung beetles, robots must take large amounts of information into consideration in order to direct their next action.

"Developments in AI are happening at breath-taking speed and part of my research is directly aimed at creating a model of how networks function to integrate information in a smart way", she concludes.

Credit: 
Lund University

New research shows how melting ice is affecting supplies of nutrients to the sea

The findings of a research expedition to coastal Greenland, which examined for the first time how melting ice is affecting supplies of nutrients to the oceans, have been published in the journal Progress in Oceanography.

The European Research Council-funded expedition on board the RRS Discovery took place during the summer of 2017. It was led by Dr Kate Hendry, a geochemist from the University of Bristol's School of Earth Sciences.

The scientific crew spent about five weeks at sea in 2017, mostly near the western coast of Greenland, sampling waters, sediments and marine life using a range of cutting-edge technologies.

A Remotely Operated Vehicle (ROV) took high-definition, real-time video of the seafloor and collected samples of marine life, water and sediments, which were then analysed by the scientists on board.

The paper highlights the influence of glacial meltwaters, combined with shelf currents and biological production, on biogeochemical cycling in these high-latitude regions over a range of timescales.

Previous work from the Bristol Glaciology Centre has shown that meltwaters released from underneath glaciers are rich in important nutrients. However, until now it's not been clear to what extent these nutrients reach the open ocean where they can 'fertilise' marine life.

Dr Hendry said: "Vigorous biological uptake in the glacial fjords keeps the surface concentration of key dissolved nutrients needed for algae, such as nitrate, phosphate and silicon, very low.

"However, sediment particles from the glaciers reach the shelf waters, albeit in a patchy way, and are then rapidly transported away from the shore.

"These particles, together with the remains of algal shells and biological material, are rapidly dissolved and cycled through shallow marine sediments. This means that the seafloor is a very important source of nutrients - especially silicon - to the overlying waters."

Future changes in the supply of these reactive, glacial sediments, as well as changes in the shelf currents that transport them, will have a profound impact on the nutrient balance and ecosystem structure in the fjords and coastal waters, and potentially even further afield.

Dr Hendry added: "This study shows how geochemical and oceanographic analyses can be used together to probe not only modern nutrient cycling in this region, but also changes in glacial meltwater discharge through time."

Credit: 
University of Bristol

Solving a condensation mystery

video: Smaller droplets move toward a large droplet.

Image: 
(Animation: Weisensee lab)

Condensation might ruin a wooden coffee table or fog up glasses when you enter a warm building on a winter day, but it isn't all inconvenience: the condensation and evaporation cycle has important applications.

Water can be harvested from "thin air," or separated from salt in desalination plants, by way of condensation. Because condensing droplets take heat with them when they evaporate, the cycle is also part of the cooling process in industrial and high-powered computing settings. Yet when researchers examined the newest method of condensation, they saw something strange: when a special type of surface is covered in a thin layer of oil, condensed water droplets seemed to fly randomly across the surface at high velocities, merging with larger droplets, in patterns not caused by gravity.

"They're so far apart, in terms of their own, relative dimensions" -- the droplets have a diameter smaller than 50 micrometers -- "and yet they're getting pulled, and moving at really high velocities," said Patricia Weisensee, assistant professor of mechanical engineering & materials science in the McKelvey School of Engineering at Washington University in St. Louis.

"They're all moving toward the bigger droplets at speeds of up to 1 mm per second."

Weisensee and Jianxing Sun, a PhD candidate in her lab, have determined that the seemingly erratic movement is the result of unbalanced capillary forces acting on the droplets. They also found that the droplets' speed is a function of the oil's viscosity and the size of the droplets, which means droplet speed can be controlled.

Their results were published online in Soft Matter.

WHY ARE THEY MOVING?

In the most common type of condensation in industry, water vapor condenses to form a thick layer of liquid on a surface. This method is known as "filmwise" condensation. But another method has been shown to be more efficient at promoting condensation and the transfer of heat that comes along with it: dropwise condensation.

It has been used on traditionally hydrophobic surfaces -- those that repel water, such as the Teflon coating on a non-stick pan. However, these traditional non-wetting surfaces degrade rapidly when exposed to hot vapor. Instead, a few years ago, researchers discovered that infusing a rough or porous hydrophobic surface with a lubricant, such as oil, leads to faster condensation. Importantly, these lubricant-infused surfaces (LIS) led to the formation of highly mobile and smaller water droplets, which are responsible for most of the heat transfer when it comes to condensation and evaporation.

During the process, however, the movement of water droplets on the surface seemed erratic -- and fast. "They move at a really high velocity for their size," -- about 100 microns --"just by sitting there," Weisensee said.

"The question is, 'Why are they moving?' "

Using high-speed microscopy and interferometry to watch the process play out, Weisensee and her team were able to discern what was happening and the relationships between droplet size, speed and oil viscosity.

They created water vapor and watched as small droplets formed on the surface. "The first process is that small droplets coalesce and form bigger droplets," Weisensee said. Capillary forces cause the oil to creep up and over the droplets, forming a meniscus -- not the knee cartilage, but rather a curved layer of oil surrounding the droplet.

The oil is continuously moving around, trying to strike a balance as it covers different-sized droplets in different places on the surface -- if a large droplet forms here, the meniscus stretches over it, causing the oil layer to contract somewhere else. Any smaller droplets in the area of contraction are swiftly pulled to the larger droplets, leading to oil-rich and oil-poor regions.

During the process, larger droplets are essentially clearing the space, which in turn makes room for the formation of more small droplets.

Since most of the heat transfer (about 85 percent) occurs via these small droplets, using LIS for dropwise condensation should be a more efficient way to dissipate heat and harvest water from vapor. And since the droplets are very small, less than 100 microns in diameter, condensation can occur in a smaller area.

There's another benefit, too. During "traditional" condensation, gravity is the force that clears water from the surface, making room for new droplets to form. The surface is placed vertically, and the water simply runs off. Since capillary forces are doing the work in dropwise condensation on liquid-infused surfaces, however, the orientation of the surface is of no consequence.

"It could potentially be used on personal devices," where orientation is constantly changing, she said, "or in space." And because the entire process is more efficient than traditional condensation, Weisensee said, "This might be a nice way of clearing up space without having to rely on gravity."

Going forward, Weisensee's team will measure heat transfer to determine if the smaller droplets during dropwise condensation on LIS are, in fact, more efficient. They also plan to investigate different surfaces in order to maximize droplet movement.

Credit: 
Washington University in St. Louis

Researchers create first portable tech for detecting cyanotoxins in water

image: North Carolina State University researchers have developed the first portable technology that can test for cyanotoxins in water. To test for cyanotoxins, users place a drop of water on a customized chip, then insert it into a reader device which connects to a smartphone. These images are screenshots of the smartphone app of the cyanotoxin sensor. Left: Welcome page. Right: Data analysis page.

Image: 
Qingshan Wei, NC State University

North Carolina State University researchers have developed the first portable technology that can test for cyanotoxins in water. The device can be used to detect four common types of cyanotoxins, including two for which the U.S. Environmental Protection Agency (EPA) recently finalized recreational water quality criteria.

Cyanotoxins are toxic substances produced by cyanobacteria. At high enough levels, cyanotoxins can cause health effects ranging from headache and vomiting to respiratory paralysis and death.

The new technology is capable of detecting four common types of cyanotoxins: anatoxin-a, cylindrospermopsin, nodularin and microcystin-LR. One reason the portable technology may be particularly useful is that EPA finalized water quality criteria this month for both microcystin-LR and cylindrospermopsin in recreational waters.

"Our technology is capable of detecting these toxins at the levels EPA laid out in its water quality criteria," says Qingshan Wei, an assistant professor of chemical and biomolecular engineering at NC State and corresponding author of a paper on the work.

"However, it's important to note that our technology is not yet capable of detecting these cyanotoxins at levels as low as the World Health Organization's drinking water limit. So, while this is a useful environmental monitoring tool, and can be used to assess recreational water quality, it is not yet viable for assessing drinking water safety."

To test for cyanotoxins, users place a drop of water on a customized chip developed in Wei's lab, then insert it into a reader device, also developed in Wei's lab, which connects to a smartphone. The technology is capable of detecting and measuring organic molecules associated with the four cyanotoxins, ultimately providing the user's smartphone with the cyanotoxin levels found in the relevant water sample. The entire process takes five minutes.

"The reader cost us less than $70 to make, each chip cost less than a dollar, and we could make both even less expensive if we scaled up production," says Zheng Li, a postdoctoral researcher at NC State and first author of the paper.

"Our current focus with this technology is to make it more sensitive, so that it can be used to monitor drinking water safety," Wei says. "More broadly, we believe the technology could be modified to look for molecular markers associated with other contaminants."

Credit: 
North Carolina State University

New osteoporosis therapy's dual effects on bone tissue

image: Bone-forming and antiresorptive effects of romosozumab in postmenopausal women with osteoporosis: bone histomorphometry and microCT analysis after 2 and 12 months of treatment.

Image: 
Dr. Pascale Chavassieux

Sclerostin is a protein produced by osteocytes in the bone that inhibits bone formation. A recent analysis of results from a clinical trial reveals the beneficial effects of romosozumab, an antibody therapy that targets sclerostin, on bone tissue in postmenopausal women with osteoporosis. The findings are published in the Journal of Bone and Mineral Research.

Romosozumab increases serum markers of bone formation and decreases those of bone breakdown, or resorption. This is associated with an increased bone mineral density and a reduced risk of bone fractures.

This latest analysis included 107 patients with osteoporosis who were enrolled in the multicenter, phase 3 clinical trial called the Fracture Study in Postmenopausal Women with Osteoporosis (FRAME) study and who underwent bone biopsies. The analysis showed that at the tissue level, romosozumab produced an early and transient increase in bone formation and a persistent decrease in bone resorption. This led to significant increases in bone mass and improved bone microarchitecture (Figure) after 12 months of therapy. These effects contribute to the reduced fracture risk previously reported in postmenopausal women with osteoporosis treated with romosozumab.

"Romosozumab is the first osteoporosis therapy with a dual effect on bone tissue, increasing bone formation and decreasing resorption" said lead author Dr. Pascale Chavassieux, of the University of Lyon, in France.

Credit: 
Wiley