Culture

Unlocking secrets of the ice worm

image: A closeup of glacier ice worms.

Image: 
Photo by Trinity Hamilton

The ice worm is one of the largest organisms that spends its entire life in ice, and Washington State University scientist Scot Hotaling is one of the few people on the planet studying it.

He is the author of a new paper that shows ice worms in the interior of British Columbia have evolved into what may be a genetically distinct species from Alaskan ice worms.

Hotaling and colleagues also identified an ice worm on Vancouver Island that is closely related to a separate population of ice worms located 1,200 miles away in southern Alaska. The researchers believe the genetic intermingling is the result of birds eating the glacier-bound worms (or their eggs) at one location and then dropping them off at another as they migrate up and down the west coast.

"If you are a worm isolated on a mountaintop glacier, the expectation is you aren't going anywhere," said Hotaling, a postdoctoral biology researcher. "But low and behold, we found this one ice worm on Vancouver Island that is super closely related to ice worms in southern Alaska. The only reasonable explanation we can think of to explain this is birds."

Super cool organism

The ice worm resembles the common earthworm but is smaller and darker in color. What sets the ice worm apart from other members of the Mesenchytraeus genus is its ability to live its entire life in glacial ice.

Millions, perhaps hundreds of millions, of ice worms can be seen wriggling to the top of glaciers from the Chugach Mountains in southeast Alaska to the Cascade Volcanoes of Washington and Oregon during the summer months. In the fall and winter, ice worms subsist deep beneath the surface of glaciers where temperatures stay around freezing.

Hotaling's interest in ice worms began back in 2009 while he was working as a mountaineering ranger on the high elevation slopes of Mt. Rainier. He was climbing at three in the morning when he noticed a lot of small, black worms crawling around on the surface of a glacier.

"I wasn't even a biology undergraduate yet but I remember being so fascinated by the fact that there is this worm that can live in a glacier," he said. "It is not a place where we think of life being able to flourish and these things can be present at like 200 per sq. meter, so dense you can't walk without stepping in them."

Hotaling eventually went back to school and earned a PhD in biology at the University of Kentucky where he studied how climate change is affecting mountain biodiversity.

In the summer of 2017, he finally got the opportunity to circle back and do some research on the ice worm when he arrived in Pullman to start a postdoc position in the laboratory of Associate Professor Joanna Kelley, senior author of the study who specializes in evolutionary genomics and extremophile lifeforms.

"In the Kelley lab, we study organisms that have evolved to live in places that are inhospitable to pretty much everything else," Hotaling said. "Determining the evolutionary mechanisms that enable something like an ice worm to live in a glacier or bacteria to live in a Yellowstone hot spring is a really exciting way to learn about what is possible at the bounds of evolution. That's where we are working now, understanding the evolution of ice worms."

In the study

Hotaling and colleagues extracted and sequenced DNA from 59 ice worms collected from nine glaciers across most of the species' geographical range. Their analysis revealed a genetic divergence between ice worm populations to the north and west versus the south and east of the Coast Mountains of British Columbia.
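
As a rough illustration of what such a divergence analysis measures, the sketch below computes pairwise p-distances (the fraction of differing aligned sites) between invented 12-bp haplotypes; the sequences and sample labels are made up to mirror the reported pattern and are not the study's data or pipeline.

```python
# Toy illustration of the signal behind a "genetic divergence" result.
# The 12-bp haplotypes are invented stand-ins for real sequence data.
def p_distance(a: str, b: str) -> float:
    """Fraction of aligned sites at which two sequences differ."""
    return sum(x != y for x, y in zip(a, b)) / len(a)

samples = {
    "alaska_1":     "ACGTACGTACGT",
    "alaska_2":     "ACGTACGTACGA",
    "bc_interior":  "ACGTTCGAACTT",  # diverged lineage (assumed)
    "vancouver_is": "ACGTACGTACGA",  # nearly identical to an Alaskan worm
}

names = list(samples)
for i, n1 in enumerate(names):
    for n2 in names[i + 1:]:
        print(f"{n1:12s} vs {n2:12s}: {p_distance(samples[n1], samples[n2]):.2f}")
```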

The researchers propose that this deeper split into two genetically distinct ice worm groups occurred when glacial ice sheets contracted a few hundred thousand years ago, isolating worms in the Pacific Northwest from their counterparts in Alaska.

The most surprising finding of the study was the discovery of a single ice worm on Vancouver Island that was closely related to a population of ice worms 1,200 miles away in Alaska.

"At first we thought there has to be some kind of error in the analysis or prep methods but upon further investigation we confirmed our initial results," Hotaling said. "These are worms isolated on mountain tops and there is no explanation for how they covered that gap than on, or perhaps within, migrating birds."

The research illuminates an important relationship between two of the few large organisms that inhabit North America's high-elevation alpine ecosystems: the ice worm and the Gray-crowned Rosy-Finch, one of North America's highest-elevation nesting birds.

"We knew that ice worms were an important source of food for the birds but we didn't know until now that the birds are also likely very important for the ice worms," Hotaling said. "If you are super isolated like an ice worm, you could easily become inbred. But if birds are bringing little bits of new diversity to your mountaintop glacier that could be really good for you."

Hotaling and Kelley's study was published this month in Proceedings of the Royal Society B.

Credit: 
Washington State University

Read how TV advertisers can measure the impact of their spots with second-screen searching

Researchers from the University of Houston, University of Minnesota, and University of California-San Diego published a new paper in the Journal of Marketing, which finds that TV ads lead to a variety of online responses and that advertisers can use these signals to enrich their media planning and ad evaluations.

The study, forthcoming in the July issue of the Journal of Marketing and titled "Immediate Responses of Online Brand Search and Price Search to TV Ads," is authored by Rex Du, Linli Xu, and Kenneth Wilbur.

Media producers are hard-pressed to capture viewers' full attention because 178 million Americans regularly use a second-screen device while watching TV. On the other hand, ready access to a second screen empowers TV viewers to take immediate actions after seeing an ad, such as search for product reviews and prices, express opinions on social media, or place an order on the advertiser's website.

This phenomenon has the potential to allow TV advertisers to link post-ad spikes in online activities to the individual TV ads that caused them. This is important to marketers, who want to link dollars spent on ads to consumer activities, including sales at the cash register. Armed with such insights, advertisers can assess the relative effectiveness of ad spots and improve ad copy and media placement decisions. The ultimate goal is to improve the cost-effectiveness of TV as an advertising medium. The new study explores the inner workings of this phenomenon to give advertisers the tools they need to make these choices.

The researchers present a rigorous, yet practical, framework that links TV ad insertions to minute-by-minute online search. Du explains that "Our research offers several key takeaways. First, for both brand search and price search, there is a detectable spike immediately after a regular ad insertion, be it on national or local TV. Second, nearly all of the immediate response occurs within five minutes of an ad insertion. Third, besides generating immediate own-brand searches, national TV ad insertions also lead to significant competitor-brand searches but little competitor-price searches. Fourth, national spots appear to be more cost-effective at generating immediate brand search response, whereas local spots appear to be more cost-effective at generating immediate price search response."

However, the findings about divergent effects of broadcast/cable, weekend/weekday, national/local, and ad creative characteristics on brand versus price search are a caution to advertisers against relying on any single immediate online response metric to assess their ads. "There is not likely a media plan or ad creative that is optimal for all types of online response," says Xu. Wilbur continues, "Our proposed framework for modeling behavioral response at the minute level is transparent and readily replicable. Advertisers, agencies, and networks can use our method to include website traffic, online transactions, social media activities or other important behavioral indicators that vary at the minute level."
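
To make the minute-level logic concrete, here is a minimal sketch, not the authors' model: it assumes a minute-indexed pandas Series of search counts and a list of ad-insertion timestamps (both hypothetical), and compares the five minutes after each insertion against a pre-ad baseline.

```python
# Minimal sketch of minute-level "immediate response" measurement around TV ad
# insertions. Not the paper's model; variable names and windows are assumed.
import pandas as pd

def immediate_search_lift(searches: pd.Series, ad_times,
                          window: int = 5, baseline: int = 30) -> pd.DataFrame:
    """searches: minute-indexed counts; ad_times: insertion timestamps."""
    rows = []
    for t in ad_times:
        pre = searches.loc[t - pd.Timedelta(minutes=baseline):
                           t - pd.Timedelta(minutes=1)].mean()
        post = searches.loc[t:t + pd.Timedelta(minutes=window - 1)].mean()
        rows.append({"ad_time": t, "baseline": pre, "post": post,
                     "lift": post - pre})
    return pd.DataFrame(rows)

# Synthetic example: a flat baseline of ~10 searches/min with a post-ad spike.
idx = pd.date_range("2019-07-01 20:00", periods=120, freq="min")
volume = pd.Series(10.0, index=idx)
ad = pd.Timestamp("2019-07-01 20:45")
volume.loc[ad:ad + pd.Timedelta(minutes=4)] += 25  # the "immediate response"
print(immediate_search_lift(volume, [ad]))
```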

Credit: 
American Marketing Association

Study: Internet perpetuates job market inequality

Recent research finds the internet is giving employers and job seekers access to more information, but has not made the hiring process more meritocratic. Instead, lower-wage jobs have become "black holes," with intense competition for positions, while many higher-wage jobs are going to targeted candidates and are open to only limited competition.

"In theory, the internet gives job seekers access to a wider range of job opportunities than was available previously, and also gives organizations access to a much larger pool of job candidates," says Steve McDonald, lead author of a paper on the work and a professor of sociology at North Carolina State University. "We wanted to see how the internet was affecting the hiring process in the real world. What we found is that access to information has not led to access to opportunity for many people."

For this study, researchers interviewed 61 human resources (HR) professionals to learn how their organizations used online tools to advertise job openings and recruit candidates. The study participants came from the public and private sectors across a range of industries, including a handful who were themselves seeking new employment.

"We found that the internet has led to a sharp bifurcation of the job market, split between lower-skill, lower-wage jobs and higher-wage, often managerial, positions," McDonald says.

"Lower-wage jobs are often advertised on large job sites, such as Monster.com," says Annika Wilcox, co-author of the paper and a Ph.D. student at NC State. "This gives the jobs good visibility, and results in hundreds or thousands of applicants for many of the positions. This makes it hard for any individual job seeker to find employment, and poses challenges for the HR professionals tasked with sorting through a flood of applications.

"Some of the study participants referred to this sort of job announcement as a 'black hole,' because applicants are unlikely to hear anything after they submit their applications."

At the other end of the spectrum are high-wage positions, often calling for specific, narrowly defined sets of skills. Many study participants referred to job candidates for these positions as "purple squirrels," because it is so difficult to find candidates who meet all of the relevant criteria.

"These positions are often posted on big job sites, but there is a bias against candidates who actually apply for those jobs," McDonald says. "Instead, HR professionals use sites like LinkedIn to seek out and recruit 'purple squirrels' who are currently employed and not actively seeking new positions."

"This work highlights the fact that the labor market is an uneven playing field," Wilcox says. "If you're in the black hole job market, it's hard to find work. If you are trying to advance in your career, applying for higher-wage positions makes it less likely that you'll get the job. And if you are already in a higher-wage position, you are more likely to be approached about new opportunities.

"Taken collectively, this shows us how difficult it is for people to work their way out of the low-income labor market," Wilcox says.

"There's this idea out there that because there are a lot of jobs online, anyone who is motivated can find work - but that's not always the case," McDonald says. "This is particularly worth noting in the context of debates about whether an individual's access to health care or food-assistance programs should hinge on work requirements."

"It's not simply the number of online job postings that matters," says Amanda Damarin, co-author of the paper and an associate professor at Georgia State University. "It's how employers use those postings versus other recruiting tools."

Credit: 
North Carolina State University

Diving into water treatment strategies for swimming pools

With summer in full swing, many people are cooling off in swimming pools. However, some of the substances that are made when chlorine in the water reacts with compounds in human sweat, urine or dirt aren't so refreshing. Now, researchers have compared the effectiveness of different water treatment processes in mitigating these so-called disinfection byproducts (DBPs). They report their results in ACS' journal Environmental Science & Technology.

Chlorine is usually added to pool water to kill harmful microbes. However, this disinfectant can react with substances in the pool water -- many of which are introduced by swimmers themselves -- to form DBPs, which can irritate the eyes, skin and lungs. Most pool systems continuously recirculate water through various treatment steps to both disinfect the water and reduce DBPs and their precursors. But because of the difficulty of comparing swimming pools with different conditions, such as number of swimmers, chlorine dosing or filling-water quality, scientists don't currently know which strategy is the best. So, Bertram Skibinski, Wolfgang Uhl and colleagues wanted to compare several water treatment strategies under the controlled and reproducible conditions of a pilot-scale swimming pool system.

The researchers continuously added compounds to their model swimming pool that simulated dirt and body fluids and added chlorine according to regulations for full-scale pools. Then, they treated the water with one of seven water treatment strategies. They found that the treatment using coagulation and sand filtration combined with granular activated carbon filtration was the most effective at lowering DBP concentrations. But even this treatment did not completely remove the contaminants because new DBPs were made more quickly than the old ones could be removed. When UV irradiation was used as a treatment step, the levels of some DBPs increased because the UV light elevated the reactivity of organic matter toward chlorine. New strategies need to be explored to more effectively remove DBPs and prevent new ones from forming, the researchers say.
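
The dynamic described here, formation outpacing removal, can be illustrated with a toy mass balance; the rates below are invented numbers rather than values from the study, chosen only to show why the DBP level settles at a nonzero steady state even under continuous treatment.

```python
# Illustrative mass balance for a recirculating pool; not the study's model.
# DBPs form continuously from swimmer-introduced precursors while treatment
# removes a fixed fraction per hour. All rate values are invented numbers.

formation_rate = 2.0       # µg/L of DBPs formed per hour (assumed)
removal_fraction = 0.05    # fraction of the DBP load removed per hour (assumed)

dbp = 0.0                  # starting concentration, µg/L
for hour in range(1, 201):
    dbp += formation_rate              # new DBPs from chlorine + precursors
    dbp -= removal_fraction * dbp      # treatment removes a share of the load
    if hour % 50 == 0:
        print(f"hour {hour:3d}: {dbp:6.1f} µg/L")

# Steady state where formation equals removal:
print("steady state:", formation_rate / removal_fraction, "µg/L")  # 40.0
```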

Credit: 
American Chemical Society

Does likelihood of survival differ between patients with single vs. multiple primary melanomas?

Bottom Line: Patients with multiple primary melanomas had a higher likelihood of dying than those with a single primary melanoma in a study that used data from registries in the Netherlands. This observational study included nearly 57,000 patients (54,645 with a single primary melanoma and 2,284 with multiple primary melanomas). The study has limitations to consider, including a lack of information about family history and the exclusion of melanoma in situ (the earliest form of the disease), which the researchers left out because they analyzed features not relevant to it. The findings suggest that stricter follow-up may be warranted for patients with multiple primary melanomas.

Authors: Mary-Ann El Sharouni, M.D., University Medical Center Utrecht, the Netherlands, and coauthors

(doi:10.1001/jamadermatol.2019.1134)

Editor's Note: Please see the article for additional information, including other authors, author contributions and affiliations, financial disclosures, funding and support, etc.

Credit: 
JAMA Network

To increase bike commuters, look to neighborhoods

COLUMBUS, Ohio--People agree that bike commuting improves health, reduces air pollution and eases traffic, a recent survey suggests. But that wasn't enough to get most people to commute by bike.

The new research indicates that a person's neighborhood may play a large role in influencing the decision to commute by bike.

The study, published recently in the Journal of Transport and Land Use, could give city and regional planners new clues about how to design neighborhoods, streets and bike trails with active commuting in mind, said Yujin Park, lead author of the study and a doctoral candidate at The Ohio State University Knowlton School of Architecture.

"Bicycling contributes to urban vitality and, as a planner, I was interested in what the most influencing factors could be to make people be willing to choose a bicycle to commute," Park said. "We are interested in the urban factors that make a person ride a bicycle more."

The study found that people who live in high-density, mixed-use neighborhoods--the kinds of neighborhoods found in vibrant downtowns or near large college campuses--are more likely to commute by bike than their peers in suburban or rural areas.

Those findings held even in suburban neighborhoods that residents considered "bike friendly." The study found that people who live in those neighborhoods might ride bikes for recreation or fun, but are less likely to commute to jobs or classes by bike than their peers who live in higher-density parts of the city.

The study was based on a survey of 1,200 people who commuted to Ohio State, one of the nation's largest public universities.

About 12.6 percent of those people classified themselves as bicyclists, and about 5.4 percent reported that a bicycle is their main mode of transportation to campus. People who lived in high-density areas were more than twice as likely to commute by bike as people in medium-density areas--and more than three times as likely to commute by bike as people in suburban areas.

Both bicyclists and non-bicyclists in the survey agreed that bicycling reduced environmental impacts of commuting, created health benefits and would save money, indicating that recognizing the benefits of bicycling is not a strong enough motivator to push non-bicyclists to start commuting on two wheels.

However, most bicyclists surveyed said they would commute by bike more frequently if they had access to more bike trails, bike-sharing opportunities and covered parking for their bikes at home or at work and school.

Non-bicyclists who lived in high-density neighborhoods appeared to be more concerned about safety--both from other vehicles and from crime--when traveling by bike than their peers who commuted by bike.

Previous studies about bike commuting decisions have looked individually at commuters' attitudes about bike commuting and at their psychological perceptions about bike commuting. This research combines those attitudes and perceptions with neighborhood data.

"The conditional willingness to ride a bicycle to commute gradually decreases from high-density neighborhoods to low-density, single-family neighborhoods," Park said.

She said the findings indicate that if campus, city and regional planners want to increase the percentage of people commuting by bike, they might want to target public investment in protected bike lanes, bike paths and bike parking near downtown and campus areas.

Columbus, which won the U.S. Department of Transportation's first-ever Smart City Challenge grant, was awarded $50 million to improve mobility throughout central Ohio. Ohio State has been working with city leaders to increase sustainable transportation to, from and around campus.

"The people who live in those higher-density neighborhoods are the most likely to commute by bike," Park said. "Removing obstacles for them might make the most sense for where we invest our resources."

Credit: 
Ohio State University

The case of the poisoned songbirds

image: A selection of the birds collected after the reported die-off following a drench application of imidacloprid.

Image: 
Photo: Krysta Rogers

Researchers from the California Department of Fish and Wildlife's Wildlife Investigations Laboratory present their results from a toxicological investigation into a mortality event involving songbirds in a new publication in Environmental Toxicology and Chemistry.

On 17 March 2017, residents in Modesto, California, reported dead birds along the street and in front yards in a section of the town. The day prior to the incident, the city had made a drench application of imidacloprid, a pesticide synthetically derived from nicotine, to the base of trees that lined the street. The pesticide was reportedly mixed and applied according to package directions. Researchers at the Wildlife Investigations Laboratory were notified of the incident and conducted a postmortem investigation on the dead songbirds, which were identified as American goldfinches. The cause of death was determined to be imidacloprid poisoning, likely due to the ingestion of fallen elm tree seeds contaminated during the drench application.

Lead author Krysta Rogers and her colleagues noted that "The mortality event investigated in the present study highlights a previously unidentified risk of drench application for imidacloprid. The pesticide label states that the product be applied to the base of the tree and directly to the root zone. [However] Seeds, insects, or other invertebrates consumed by birds and other animals may be present within that zone. If these food items were contaminated during the drench application, they would be highly toxic to animals when ingested."

The authors recommend that "drench applications not occur during seed drop to minimize the risk of exposure to animals that consume fallen seeds and that mitigation measures could be taken to prevent small animals from accessing areas treated with the pesticide, at a minimum." Finally, the authors encourage integrated pest management over the prophylactic use of pesticides as the ideal.

Credit: 
Society of Environmental Toxicology and Chemistry

Computer scientists predict lightning and thunder with the help of artificial intelligence

image: Christian Schön and Professor Jens Dittrich from Saarland University have developed software that predicts thunderstorms with the help of artificial intelligence.

Image: 
Oliver Dietze

At the beginning of June, the German Weather Service counted 177,000 lightning bolts in the night sky within a few days. The natural spectacle had consequences: Several people were injured by gusts of wind, hail and rain. Together with Germany's National Meteorological Service, the Deutscher Wetterdienst, computer science professor Jens Dittrich and his doctoral student Christian Schön from Saarland University are now working on a system that is supposed to predict local thunderstorms more precisely than before. It is based on satellite images and artificial intelligence. In order to investigate this approach in more detail, the researchers will receive 270,000 euros from the Federal Ministry of Transport.

One of the core tasks of weather services is to warn of dangerous weather conditions. These include thunderstorms in particular, as these are often accompanied by gusts of wind, hail and heavy rainfall. The Deutscher Wetterdienst (DWD) uses the "NowcastMIX" system for this purpose. Every five minutes it polls several remote sensing systems and observation networks to warn of thunderstorms, heavy rain and snowfall in the next two hours. "However, NowcastMIX can only detect the thunderstorm cells when heavy precipitation has already occurred. This is why satellite data are used to detect the formation of thunderstorm cells earlier and thus to warn of them earlier," explains Professor Jens Dittrich, who teaches computer science at Saarland University and heads the "Big Data Analytics" group. Together with his doctoral student Christian Schön and the meteorologist Richard Müller from DWD, he has therefore developed a system that could soon supplement NowcastMIX in predicting thunderstorms. Their project is a first step to explore the applicability of artificial intelligence in the prediction of weather and climate phenomena.

In order to accurately predict thunderstorms in a specific region, the so-called convection of air masses, i.e. the rise of heated air while colder air sinks in the surrounding area, must be detected early and precisely. This has been known for a long time. What sets the new system apart, however, is that it requires only two-dimensional images, namely satellite images, to detect these three-dimensional air shifts.

In order to see what is happening in the sky three-dimensionally based on the two-dimensional images, the researchers use photographs taken fifteen minutes apart. Part of the image series for the respective area goes as input to an algorithm that calculates what the future image, not supplied to the algorithm, would look like. The scientists then compare this result with the real image. The size of the deviation between the forecast and reality - the researchers call it the "error" - then serves as input for a second algorithm, which the researchers have trained using machine learning to recognize the relationship between the size of the error and the occurrence of a thunderstorm. In this way, they can calculate whether there will be thunder and lightning or not. "This is the strength of applying artificial intelligence to large amounts of data. It recognizes patterns that remain hidden from us," explains Professor Dittrich. This is one of the reasons why he and other colleagues have just initiated the new bachelor's and master's degree program "Data Science and Artificial Intelligence".
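
A hypothetical two-stage sketch of this pipeline follows; it is not the Saarland/DWD code. A naive linear extrapolation stands in for the frame predictor, a logistic regression stands in for the trained error-to-thunderstorm classifier, and all data are synthetic.

```python
# Two-stage sketch: (1) extrapolate the next satellite frame from the two
# previous frames; (2) classify thunderstorm risk from the prediction error.
import numpy as np
from sklearn.linear_model import LogisticRegression

def predict_next_frame(f_prev: np.ndarray, f_curr: np.ndarray) -> np.ndarray:
    # naive stand-in for the real predictor: linear extrapolation of the trend
    return f_curr + (f_curr - f_prev)

def prediction_error(f_true: np.ndarray, f_pred: np.ndarray) -> float:
    # the "error" that feeds the second, learned algorithm
    return float(np.mean((f_true - f_pred) ** 2))

rng = np.random.default_rng(0)
errors, labels = [], []
for _ in range(500):
    storm = bool(rng.random() < 0.3)          # synthetic ground truth
    f0, f1 = rng.random((2, 16, 16))          # two past "satellite frames"
    # convection makes the true next frame deviate from smooth extrapolation
    noise = rng.normal(0.0, 0.6 if storm else 0.1, (16, 16))
    f2 = predict_next_frame(f0, f1) + noise
    errors.append([prediction_error(f2, predict_next_frame(f0, f1))])
    labels.append(storm)

clf = LogisticRegression().fit(errors, labels)  # error size -> storm probability
print("storm probability at error 0.4:", clf.predict_proba([[0.4]])[0, 1])
```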

In the case of lightning and thunder, this combination is definitely "multifaceted," says Dittrich. "Based on the satellite images alone, we can predict flashes with an accuracy of 96 percent for the next 15 minutes. If the prediction window is opened further, the accuracy decreases, but still remains above 83 percent for up to five hours."

However, the rate of false alarms is still too high, according to the researchers. Nonetheless, they believe that they can significantly reduce these false alarms by training their model on additional features that are also used by the current NowcastMIX system, for example. The Federal Ministry of Transport has already granted the computer scientists from Saarbruecken 270,000 euros to investigate this in more detail.

Credit: 
Saarland University

Food insecurity leading to type 2 diabetes

A collaborative study by a team of Connecticut researchers shows there is a strong connection between food insecurity and insulin resistance, the underlying problem in type 2 diabetes. Insulin resistance occurs when cells are not able to respond normally to the hormone insulin.

The research by UConn School of Medicine, UConn School of Dental Medicine, Yale School of Public Health, Quinnipiac University, Hartford Hospital, and the Hispanic Health Council suggests that, for Latinos with type 2 diabetes, food insecurity is linked to the disease's development and progression.

Published in the June issue of the Journal of Nutrition, the study points to the more than 40 million Americans, including 6.5 million children, who live in food-insecure households where access to nutritionally adequate and safe food is limited or uncertain.

In the United States, the rate of food-insecure households is higher for Latinos, who are also disproportionately affected by metabolic disorders such as type 2 diabetes. In fact, rates of type 2 diabetes are 12.1% among Hispanics compared with 7.4% for non-Hispanic whites.

According to the researchers, food insecurity may increase inflammation in the body; this inflammation can be driven by diet-related obesity and excess abdominal fat. Food insecurity is also stressful. It is often accompanied by mental distress, which triggers the release of cortisol and other stress hormones. These hormones may promote the progression of insulin resistance.

"Our findings support the plausibility of links between food insecurity and poor health," says Dr. Angela Bermúdez-Millán, assistant professor in the Department of Community Medicine and Health Care at UConn School of Medicine.

"Resources should be redirected toward ending or decreasing food insecurity, a powerful social determinant of health."

The study included 121 Latinos with type 2 diabetes. Sixty-eight percent of the participants were classified as food insecure.

Researchers tested the relationship between household food insecurity and insulin resistance using baseline data from the Community Health Workers Assisting Latinos Manage Stress and Diabetes (CALMS-D) randomized controlled trial. Fasting blood glucose, insulin levels, stress hormones, and markers of inflammation were measured.

They found that, compared with food secure individuals, food insecure individuals had significantly higher insulin resistance, as well as higher insulin, glucose, stress hormones, inflammation, and total cholesterol. Inflammation and stress hormones were the mechanisms through which food insecurity and insulin resistance were linked.
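
For readers curious what "mechanisms through which ... were linked" means statistically, below is a minimal Baron-Kenny-style mediation sketch on synthetic data; it is not the study's actual model, and the data-generating coefficients are invented for illustration.

```python
# Mediation sketch: food insecurity -> inflammation -> insulin resistance.
# Synthetic data; decomposes the total effect into direct + mediated parts.
import numpy as np

rng = np.random.default_rng(1)
n = 121                                   # same sample size as the study
food_insecure = rng.integers(0, 2, n).astype(float)
# assumed data-generating process (illustrative only)
inflammation = 1.0 + 0.8 * food_insecure + rng.normal(0, 0.5, n)
insulin_resistance = (2.0 + 1.2 * inflammation + 0.1 * food_insecure
                      + rng.normal(0, 0.5, n))

def slope(x, y):
    """OLS slope of y on x (with intercept)."""
    X = np.column_stack([np.ones_like(x), x])
    return np.linalg.lstsq(X, y, rcond=None)[0][1]

total = slope(food_insecure, insulin_resistance)       # total effect (path c)
a = slope(food_insecure, inflammation)                 # path a
# direct effect c' and path b from a regression on both predictors
X = np.column_stack([np.ones(n), food_insecure, inflammation])
_, direct, b = np.linalg.lstsq(X, insulin_resistance, rcond=None)[0]
print(f"total {total:.2f} = direct {direct:.2f} + mediated {a * b:.2f}")
```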

According to Bermúdez-Millán, the findings highlight the importance of implementing interventions that address food insecurity in order to mitigate its effects on inflammation, stress, and insulin resistance.

"Food insecurity is prevalent, widespread, and detrimental to health," she says. "Health care facilities can also help address the issue by screening for food insecurity and connecting patients to available resources and interventions."

Bermúdez-Millán is also calling on legislators to create policies to decrease food insecurity. She recommends modifying disbursement of SNAP benefits to possibly yield downstream benefits for diabetes control, and increasing access to minimally processed foods and more fruits, vegetables, and whole grains in local stores or community or home gardening venues.

The researchers note that a limitation of their study was that it was cross sectional, meaning participants were studied at one point in time. It is possible that those people with worse insulin resistance develop more inflammation and stress hormones, which make them sick or disabled, which in turn can deplete financial resources and lead to food insecurity.

Credit: 
University of Connecticut

Cascade exacerbates storage diseases

image: Dr. Susi Anheuser, Professor Dr. Konrad Sandhoff and Dr. Bernadette Breiden.

Image: 
(c) Photo: Volker Lannert/Uni Bonn

In rare, hereditary storage diseases such as Sandhoff's disease or Tay-Sachs syndrome, the metabolic waste from accumulating gangliosides cannot be properly disposed of in the nerve cells because important enzymes are missing. The consequences for the patients are grave: They range from movement restrictions to blindness, mental decline and early death. Scientists at the University of Bonn now demonstrate why these gangliosides also accumulate in patients with other storage diseases and worsen their course. The results will soon be presented in the Journal of Lipid Research and can be read online in advance.

Lysosomes take over the function of the stomach in living cells. They digest superfluous metabolic products and break them down into their components, which the cell then recycles as urgently needed building and operating materials. If digestion in the lysosomes is blocked, for example as a result of genetic defects, this "cell waste" accumulates. The growing mountain of waste materials can be toxic to nerve cells and may result in an early death, even in childhood.

In Tay-Sachs syndrome and Sandhoff's disease, components of nerve cell membranes cannot be properly degraded, resulting in the ganglioside GM2 being stored in the lysosomes. Gangliosides are water-insoluble fats, so-called lipids, which occur mainly in the ganglion cells of the nervous system. If the GM2-degrading enzyme Hex A is missing or impaired, for example due to genetic defects, destructive ganglioside storage occurs.

In Sandhoff's disease the degradation enzymes Hex A and Hex B are inactive. As with Tay-Sachs syndrome, GM2 storage leads to the destruction of nerve cells. Affected children develop normally in the first months of life; later on, blindness, movement restrictions and mental decline occur - and eventually an early death. "Previous therapeutic approaches have not led to any significant successes in these neurodegenerative gangliosidoses," says Prof. Dr. Konrad Sandhoff, senior professor at the LIMES Institute of the University of Bonn. Enzyme replacement therapies have failed due to the impermeability of the blood-brain barrier for these substances.

Researchers recreate conditions in the test tube

Together with his colleagues Dr. Susi Anheuser and Dr. Bernadette Breiden, Prof. Sandhoff has deciphered the role of the molecular environment in the lysosome for the successful degradation of GM2. In the test tube, the scientists reconstructed the tiny bubbles (vesicles) on which the GM2 is degraded in the "cell stomach" (the lysosome). Normally, the auxiliary protein GM2AP helps to catch and release the GM2 that sits on the membrane of these vesicles, so that the degradation enzyme Hex A can break it down to harmless GM3. However, when the function of Hex A is blocked, GM2 accumulates - with fatal consequences for nerve cells.

In the test tube, the scientists investigated the factors that inhibit or improve the degradation of GM2. For example, the smaller the vesicles in the "cell stomach" and the more negatively charged their surface, the easier it is for the degradation enzyme to access the GM2 and the better the "digestion" functions. The presence of cholesterol and sphingomyelin, on the other hand, significantly reduces GM2 degradation. The investigations showed that the storage of these lipids in Niemann-Pick diseases triggers an additional GM2 accumulation in the lysosome, which significantly exacerbates the clinical picture of type C even though the GM2-degrading enzyme Hex A is intact and active. "Genetic disorders of the degradation enzyme evidently trigger a cascade of as yet unknown consequential damages," summarizes Sandhoff.

In another work, Sandhoff's team shows that this cascade principle also applies to hereditary mucopolysaccharidoses. In these disorders, one of the storage substances, chondroitin sulphate, triggers additional ganglioside storage in the nerve cells by inhibiting GM2 degradation. In addition to the existing short stature, coarse facial features and liver enlargement, this also causes learning difficulties and startle reflexes, which can, however, be alleviated in the animal model by inhibitors of GM2 formation.

"The aim of current therapeutic approaches is to prevent the production of GM2 for these hereditary storage diseases," said Sandhoff. Drugs that are currently on the market only partially fulfill this requirement. "Perhaps gene replacement therapy, which has already been successful in animal models and will soon be used in patients, will be more successful," said the biochemist. With the help of ferries, genes with the correct blueprint for the GM2 degradation enzyme will be introduced into the nerve cells of the brain. Sandhoff: "Hopefully the future will soon show whether this is actually a therapeutic option."

Credit: 
University of Bonn

The two faces of the Jekyll gene

image: Activity of Jekyll promoter as visualised by expression of Jekprom: GFP (green signal).

Image: 
Stefan Ortleb & Twan Rutten

The Jekyll gene's allelic variants are lineage-specific to the grass tribes Triticeae and Bromeae and functioned as drivers of the speciation process within the Poaceae.

The Jekyll gene was first described in 2006 by researchers from the IPK in Gatersleben. They found that while it was crucial for sexual reproduction and fertility in barley (Hordeum vulgare), it was also partially similar to the Cn4 toxin produced by scorpions and played a role in cell autolysis. Inspired by this seemingly two-faced nature of the gene, the researchers named it after Dr. Jekyll, the main character of the eponymous gothic novella, who harbours the split personalities Dr. Jekyll and Mr. Hyde. A follow-up study by the same group of IPK researchers, led by Dr. Ljudmilla Borisjuk, has now shown how stunningly apt their choice of name was.

Whilst working on Jekyll, Dr. V. Radchuk discovered that the gene exists as two highly diverged allelic variants, Jek1 and Jek3. The Jek1/Jek3 sequences are located at the same chromosomal locus and are inherited in a monogenic Mendelian fashion; Jek1 and Jek3 share identical signal peptides, conserved cysteine positions and direct repeats. Although the encoded protein sequences share just over 50% similarity, the researchers found that Jek3 complements the function of Jek1 in Jek1-deficient plants. Further investigations showed that Jekyll likely emerged in the common ancestor of the tribes Triticeae, which include barley, and Bromeae, thus functioning as a lineage-specific gene and a probable driver of the separation of lineages within the Poaceae.

The dual allelic nature of Jekyll made the cover of The Plant Journal and was featured in the accompanying Research Highlight. In the meantime, the authors have begun investigating the newly arisen questions of the cause and benefits of this allelic diversity in barley.

Credit: 
Leibniz Institute of Plant Genetics and Crop Plant Research

New contents: Neuronal Parkinson's inclusions are different than expected

image: Content of Lewy bodies: The inclusions in the neurons contain mainly a membranous medley instead of the anticipated protein fibrils.

Image: 
University of Basel, Biozentrum

An international team of researchers involving members of the University of Basel's Biozentrum challenges the conventional understanding of the cause of Parkinson's disease. The researchers have shown that the inclusions in the brain's neurons, characteristic of Parkinson's disease, are comprised of a membranous medley rather than protein fibrils. The recently published study in "Nature Neuroscience" raises new questions about the etiology of Parkinson's disease.

Parkinson's disease is one of the most common neurodegenerative diseases worldwide. It is typically accompanied by motor defects such as tremor of the arms and legs, slowness of movement and muscle rigidity, which occur together with other non-motor symptoms. A characteristic of this progressively worsening and currently unstoppable disease is the presence of neuronal inclusions, so-called Lewy bodies, in many regions of the brain in the course of the disease. For decades, it was assumed that Parkinson's disease is caused by deposits of insoluble fibrils of the protein alpha-synuclein in the Lewy bodies.

Membrane fragments instead of protein fibrils

In their current study, the Dutch, German and Swiss researchers, including Prof. Henning Stahlberg's team, refute this long-held thesis. Using state-of-the-art electron microscopes, they have been able to show that the Lewy bodies contain mainly membrane fragments, lipids and other cellular material instead of the anticipated fibrils.

"We used correlative light and electron microscopy and other advanced light microscopy methods to take a closer look at the brain of deceased Parkinson's patients and discovered that the Lewy bodies consist mainly of membrane fragments from mitochondria and other organelles, but have in most cases no or only negligible quantities of protein fibrils," says Stahlberg. "The discovery that alpha-synuclein did not present in the form of fibrils was unexpected for us and the entire research field."

Currently, the researchers do not know yet where and in what form the protein alpha-synuclein is hidden amongst the membrane fragments and how it is involved in the formation of Lewy bodies. However, their work indicates that the laboratory-based model of alpha-synuclein fibrils as a cause and mechanism of Parkinson's disease should be revisited. "Our finding indicates that in order to uncover the causes of a disease one needs to be more strongly guided by the exploration of the pathology in humans," explains Stahlberg.

Ultrastructural insights into cell organelles

"The questions why it has taken so long to better characterize Lewy bodies, can perhaps be answered with the previous sample preparation and electron microscopy methods. Today's technologies enable us to have a much more detailed look into the morphology of human brain," explains Stahlberg. "The big question for us now is: How does alpha-synuclein contribute to the formation of Lewy bodies, if not present in form of fibrils?"

With their work, the researchers raise many new questions regarding the role of the Lewy bodies in the etiology of Parkinson's disease. The insights into such intracellular structures also provide important clues for potential therapeutic approaches to prevent or stop the formation and propagation of Lewy pathology in the brain.

Credit: 
University of Basel

New blood test for detecting Alzheimer's disease

Researchers from Lund University, together with the pharmaceutical company Roche, have developed a new blood marker capable of detecting whether or not a person has Alzheimer's disease. If the method is approved for clinical use, the researchers hope eventually to see it used as a diagnostic tool in primary healthcare. This autumn, they will start a trial in primary healthcare to test the technique.

Currently, a major support in the diagnostics of Alzheimer's disease is the identification of abnormal accumulation of the substance beta-amyloid, which can be detected either in a spinal fluid sample or through brain imaging using a PET scanner.

"These are expensive methods that are only available in specialist healthcare. In research, we have therefore long been searching for simpler diagnostic tools", says Sebastian Palmqvist, associate professor at the unit for clinical memory research at Lund University, physician at Skåne University Hospital and lead author of the study.

In this study, which is a collaboration between several medical centres, the researchers investigated whether a simple blood test could identify people in whom beta-amyloid has started to accumulate in the brain, i.e. people with underlying Alzheimer's disease. Using a simple and precise method that the researchers think is suitable for clinical diagnostics and screening in primary healthcare, the researchers were able to identify beta-amyloid in the blood with a high degree of accuracy.

"Previous studies on methods using blood tests did not show particularly good results; it was only possible to see small differences between Alzheimer's patients and healthy elderly people. Only a year or so ago, researchers found methods using blood sample analysis that showed greater accuracy in detecting the presence of Alzheimer's disease. The difficulty so far is that they currently require advanced technology and are not available for use in today's clinical procedures", says Sebastian Palmqvist.

The results are published in JAMA Neurology and are based on analyses of blood samples collected from 842 people in Sweden (the Swedish BioFINDER study) and 237 people in Germany. The participants in the study are Alzheimer's patients with dementia, healthy elderly people and people with mild cognitive impairment.

The method studied by the researchers was developed by Roche and is a fully automated technique which measures beta-amyloid in the blood, with high accuracy in identifying the protein accumulation.
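
As an illustration of how such diagnostic accuracy is typically summarized (this is not Roche's assay or the study's analysis; the marker values are simulated), one can compare marker levels in amyloid-positive and amyloid-negative participants with a ROC curve:

```python
# Simulated sketch of evaluating a blood marker against amyloid status.
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(7)
# assumed setup: PET/CSF-defined amyloid status and a plasma marker whose
# level drops when amyloid accumulates in the brain (values are invented)
status = rng.integers(0, 2, 300)                  # 1 = amyloid-positive
marker = np.where(status == 1,
                  rng.normal(0.8, 0.1, 300),      # lower ratio if positive
                  rng.normal(1.0, 0.1, 300))
# lower marker indicates accumulation, so score with the negated marker
print("AUC:", round(roc_auc_score(status, -marker), 2))
```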

"We have collaborated with Roche for a long time and it is only now that we are starting to approach a level of accuracy that is usable in routine clinical care around the world", says Oskar Hansson, professor of neurology and head of the unit for clinical memory research at Lund University.

The researchers believe that this new blood sample analysis could be an important complement for screening individuals for inclusion in clinical drug trials against Alzheimer's disease, or for improving diagnostics in primary care, which would allow more people to receive the currently available symptomatic treatment against Alzheimer's disease.

"The next step to confirm this simple method to reveal beta-amyloid through blood sample analysis is to test it in a larger population where the presence of underlying Alzheimer's is lower. We also need to test the technique in clinical settings, which we will do fairly soon in a major primary care study in Sweden. We hope that this will validate our results", concludes Sebastian Palmqvist.

Credit: 
Lund University

Dung beetles use wind compass when the sun is high

image: In the new study, the researchers investigated dung beetles both out in the field and in the laboratory. Using fans to create wind, they could select the wind direction. They changed the sun's position in the sky using a mirror.

Image: 
Chris Collingridge

Researchers have shown for the first time that an animal uses different directional sensors to achieve the highest possible navigational precision in different conditions. When the sun is high, dung beetles navigate using the wind.

The discovery of the dung beetles' wind compass and how it complements the sun compass was made by an international research team comprising biologists from Sweden and South Africa.

"This is the first study that shows how an animal's biological compass can integrate different directional sensors, in this case wind and sun, in a flexible way. This enables the highest possible precision at all times", says Marie Dacke, professor of Sensory Biology at Lund University and leader of the research team.

The dung beetles cannot use the sun as a directional guide when it is cloudy, or when the sun is higher than 75 degrees above the horizon for a few hours in the middle of the day. A while later, when the sun is a little lower, they turn off the wind compass and again rely on the sun.

In the new study, the researchers investigated dung beetles both out in the field and in the laboratory. Using fans to create wind, they could select the wind direction. They changed the sun's position in the sky using a mirror.

The experiment shows that when the sun is at a low or medium elevation in the sky, the dung beetles change direction by 180 degrees if the sun's position is changed by 180 degrees. However, the dung beetles were not affected when the researchers changed the wind direction by 180 degrees when the sun was at these elevations.

When the sun was highest, the situation was reversed. The wind then showed the way, so the insects responded to a change in the wind direction of 180 degrees.

The results show that directional information can be transferred from the wind compass to the sun compass and vice versa. In this way, the dung beetles can continue on in one direction when one of the compasses becomes less reliable.
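
One way to picture this flexible integration is a toy model that weights each compass by its reliability. The weighting rule below is an assumption made for illustration, not the authors' model; it only borrows the reported 75-degree threshold as the point where the sun compass begins to fade.

```python
# Toy cue-integration model: blend sun and wind bearings, with the sun's
# weight fading above 75 degrees of solar elevation (assumed rule).
import math

def integrated_heading(sun_bearing_deg: float, wind_bearing_deg: float,
                       solar_elevation_deg: float) -> float:
    # assumed reliability: full sun weight below 75 degrees elevation,
    # fading to zero as the sun approaches the zenith at 90 degrees
    w_sun = max(0.0, min(1.0, (90.0 - solar_elevation_deg) / 15.0))
    w_wind = 1.0 - w_sun
    # average bearings as unit vectors so that 350 and 10 degrees mean 0
    x = (w_sun * math.cos(math.radians(sun_bearing_deg))
         + w_wind * math.cos(math.radians(wind_bearing_deg)))
    y = (w_sun * math.sin(math.radians(sun_bearing_deg))
         + w_wind * math.sin(math.radians(wind_bearing_deg)))
    return math.degrees(math.atan2(y, x)) % 360

print(integrated_heading(40, 130, solar_elevation_deg=30))  # sun rules: 40.0
print(integrated_heading(40, 130, solar_elevation_deg=88))  # wind-dominated: ~121
```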

The sensors that register wind direction are on the insect's antennae.

"The insect brain is definitely not pre-programmed to always follow the same set of actions. On the contrary, we can show that such small brains work according to very dynamic principles that adapt to the conditions prevailing at a given moment", says Marie Dacke.

The researchers had previously shown that, during the night, some dung beetles navigate by the Milky Way and polarised moonlight while rolling their dung balls in a straight line. Combined with the results from the new study, these findings show that the insects' compass works at all times of the day or night and probably under almost any conditions.

"Now we will go on to study whether they can also use the wind at night. Another aspect we are curious about is what guides them when there is no wind and it's cloudy", comments Marie Dacke.

The aim of the research is to fully understand how very small brains handle large amounts of information in order to make a relevant decision: is it appropriate to turn left or right, or continue straight on?

Marie Dacke believes that the results will be of direct benefit within a few years, in areas like robot development and artificial intelligence (AI). Just like dung beetles, robots must take large amounts of information into consideration in order to direct their next action.

"Developments in AI are happening at breath-taking speed and part of my research is directly aimed at creating a model of how networks function to integrate information in a smart way", she concludes.

Credit: 
Lund University

New research shows how melting ice is affecting supplies of nutrients to the sea

The findings of a research expedition to coastal Greenland, which examined for the first time how melting ice is affecting supplies of nutrients to the oceans, have been published in the journal Progress in Oceanography.

The European Research Council-funded expedition on board the RRS Discovery took place during the summer of 2017. It was led by Dr Kate Hendry, a geochemist from the University of Bristol's School of Earth Sciences.

The scientific crew spent about five weeks at sea in 2017, mostly near the western coast of Greenland, sampling waters, sediments and marine life using a range of cutting-edge technologies.

A Remotely Operated Vehicle (ROV) took high-definition, real time videos of the seafloor and collected samples of marine life, water and sediments which were then analysed by the scientists on board.

The paper highlights the influence of glacial meltwaters, combined with shelf currents and biological production, on biogeochemical cycling in these high-latitude regions over a range of timescales.

Previous work from the Bristol Glaciology Centre has shown that meltwaters released from underneath glaciers are rich in important nutrients. However, until now it's not been clear to what extent these nutrients reach the open ocean where they can 'fertilise' marine life.

Dr Hendry said: "Vigorous biological uptake in the glacial fjords keeps the surface concentration of key dissolved nutrients needed for algae, such as nitrate, phosphate and silicon, very low.

"However, sediment particles from the glaciers reach the shelf waters, albeit in a patchy way, and are then rapidly transported away from the shore.

"These particles, together with the remains of algal shells and biological material, are rapidly dissolved and cycled through shallow marine sediments. This means that the seafloor is a very important source of nutrients - especially silicon - to the overlying waters."

Future changes in the supply of these reactive, glacial sediments, as well as changes in the shelf currents that transport them, will have a profound impact on the nutrient balance and ecosystem structure in the fjords and coastal waters, and potentially even further afield.

Dr Hendry added: "This study shows how geochemical and oceanographic analyses can be used together to probe not only modern nutrient cycling in this region, but also changes in glacial meltwater discharge through time."

Credit: 
University of Bristol