Culture

Are hyoliths Palaeozoic lophophorates?

image: Reconstruction of Triplicatella opimus from the Chengjiang Lagerstätte in a proposed deposit-feeding lifestyle.

Image: 
©Science China Press

Hyoliths are extinct invertebrates with calcareous shells that were common constituents of the Cambrian fauna and formed a minor component of benthic faunas throughout the Palaeozoic until their demise in the end-Permian mass extinction. The biological affinity of hyoliths has long been controversial and the group has been compared with a number of animal phyla, most frequently the Mollusca or the Sipuncula, although other researchers have considered hyoliths a separate "extinct phylum". However, recent discoveries of a tentaculate feeding apparatus ('lophophore') and fleshy apical extensions from the shell ('pedicle') have resulted in hyoliths being placed within the lophophorates with a close relationship to the brachiopods.

A new article by Zhifei Zhang and his research group at Northwest University, China, together with Dr. Christian Skovsted from the Swedish Museum of Natural History, has questioned this phylogenetic placement, after analyzing hundreds of hyolith fossils from the lower Cambrian (520 million years ago) Chengjiang Biota of South China (Liu et al.). In their material from South China, the first credible soft parts of an orthothecid hyolith other than the gut have been preserved in the species Triplicatella opimus.

The soft part morphology of Triplicatella opimus confirms the presence of a tentaculate feeding organ in orthothecids, demonstrating that both recognized orders of hyoliths possessed one. The tuft-like arrangement of the tentacles of T. opimus differs from that of hyolithids, suggesting that the feeding organ functioned differently in orthothecids (collecting food directly from the substrate) and hyolithids (filter feeding).

A comparative study was undertaken by Liu et al., investigating the structure of the feeding organ in hyoliths and other recognized fossil and modern lophophore-bearing animals. This analysis indicated that the structure lacked many morphological features that are distinctive of a lophophore, and consequently it is likely that the feeding organ of hyoliths is not a lophophore. The tuft-like morphology of the feeding apparatus of Triplicatella from South China additionally suggests that the organ was adapted to feeding on nutrients directly from the substrate rather than filter feeding as seen in younger hyolith specimens. Liu et al. further suggest that filter feeding in hyoliths may have been a secondary adaptation, evolving later with the appearance of helens, mineralized structures used to lift the body of the hyolith above the seafloor.

Recently, scientists illustrated apical structures from a species of hyolith from the Cambrian of South China, claiming that they represent an attachment structure similar to the brachiopod pedicle. A detailed analysis of the apical structures by Liu et al. has demonstrated that these structures represent crushed portions of the shell and are not in any way comparable to the brachiopod pedicle. The same morphology of apical structures could also be observed in hyolith specimens from a nearly contemporaneous fauna (the Shipai Biota), which allows for a better understanding of how this part of the shell is preserved. The similarity in ornament between the apical structure and the rest of the shell and the similarity in preservation indicate that the purported pedicle in orthothecid hyoliths represents a partly crushed apical shell section and is not a biological analogue to the complex organ that constitutes a brachiopod pedicle.

In their article for NSR, Liu et al. consider that this new evidence suggests that hyoliths did not possess a lophophore or a pedicle similar to those of brachiopods. Liu et al. instead argue that hyoliths likely occupied a more basal position in the Lophophorata, a conclusion which is strengthened by recently published data on hyolith shell structures.

Credit: 
Science China Press

Self-cannibalizing mitochondria may set the stage for ALS development

Northwestern Medicine scientists have discovered a new phenomenon in the brain that could explain the development of early stages of neurodegeneration seen in diseases such as ALS, which affects voluntary muscle movement such as walking and talking.

The discovery was so novel, the scientists needed to coin a new term to describe it: mitoautophagy, a collection of self-destructive mitochondria in diseased upper motor neurons of the brain that begin to disintegrate from within at a very early age. Upper motor neurons in the brain are responsible for initiating muscle movement and relaxation and are among the first to break down in neurodegenerative diseases.

The study was published in the journal Frontiers in Cellular Neuroscience.

The phenomenon is observed mainly in the context of TDP-43 pathology, one of the most common pathologies in neurodegenerative diseases, which is seen in more than 90% of ALS cases. When a pathology is present in the body, it indicates that something is wrong or functioning abnormally.

"I think we have found the culprit that primes neurons to become vulnerable to future degeneration: suicidal mitochondria," said senior study author Hande Ozdinler, associate professor of neurology at Northwestern University Feinberg School of Medicine. "The mitochondria basically eat themselves up very early in the disease. This occurs selectively in the neurons that will soon degenerate in patient's brains."

"This type of degeneration begins much earlier than previously thought," said study lead author Mukesh Gautam, the A Long Swim Ellen Blakeman fellow at Northwestern.

Using a process called immuno-coupled electron microscopy, the scientists investigated the cellular events that go wrong inside the neurons that become vulnerable to disease. After analyzing more than 200 neurons, they observed the self-destruction of mitochondria only in the diseased neurons, and especially within the context of TDP-43 pathology.

Mitochondria are the powerhouses of the cell, responsible for creating and maintaining its energy supply. In the diseased upper motor neurons, mitochondria self-destruct first by elongating, then forming a ring-like structure, until they finally disintegrate from the inside out.

It is a type of degeneration that has never been seen before, and it is different from previously described stages of mitochondrial degeneration.

The study analyzed mitochondria in the upper motor neurons of three different mouse models of ALS at only 15 days old - equivalent to a toddler in humans. While the study was in mice, Ozdinler and her team have shown many times before that upper motor neurons, even in different species, are almost identical at a cellular level, especially within the context of TDP-43 pathology.

These self-destructive mitochondria could become a future target for drug therapies to treat ALS and other neurodegenerative diseases in which a person's movement is affected, Ozdinler said. She and her team are currently working with drug companies to see if drugs used for human patients with mitochondrial disease could in fact improve the health of diseased motor neurons.

"Many of the drugs currently on the market that target the health and the integrity of mitochondria may well be repurposed and considered for neurodegenerative diseases in the future," Ozdinler said. "Maybe we don't need to reinvent the wheel to cure ALS and other neurodegenerative diseases.

"To overcome neurodegeneration, we need to improve the health and the stability of mitochondria. If we improve the health of the mitochondria early, we may even eliminate protein aggregate formation, a pathology broadly observed in many diseases."

Credit: 
Frontiers

Measuring online behavioral advertising: One more step to protect users

image: Dr. Nikolaos Laoutaris, Research Professor at IMDEA Networks.

Image: 
IMDEA Networks Institute

When we search for information on the Internet, buy online or use social networks, we often see ads relating to our likes or profile. The fact is that not everyone knows how online advertising works or what data companies use to create the personalized ads they show us. Dr. Nikolaos Laoutaris, Research Professor at IMDEA Networks, has published new research results on the detection of behavioural targeting in online advertising.

The relevance of this study, according to Laoutaris, is that researchers can "test whether an advertisement is targeted or not, and if so, towards which demographic groups; it's fundamental for being able to check (proactively or reactively) the application of new data protection laws that, among others, put restrictions on targeted advertising".

The General Data Protection Regulation (GDPR), which came into force on the 25th of May, 2018, was adopted precisely by the European Union to return control of personal data to the individual and to unify the regulatory framework for multinationals. "GDPR defines several sensitive categories such as health, religion, political beliefs, sexual orientation, all of which require a special type of consent that has usually not been granted when targeted advertising happens", states Dr. Nikolaos Laoutaris. For this reason, it's very important to identify targeted ads in order to protect specific demographic groups, like minors.

Earlier work by the IMDEA Networks Research Professor has found correlations between specific targeting and sensitive topics; for instance, ads aimed at people who suffer from cancer, Alzheimer's disease or sexually transmitted diseases.

How can users and Data Protection Authorities (DPAs) test ad targeting? There are two ways: detection via content-based analysis, and count-based detection & crowdsourcing. Following these methodologies, it's possible to know whether ads have been targeted at the groups protected by the GDPR and the FTC's Children's Online Privacy Protection Act (COPPA). "By doing so they offer a valuable tool for Data Protection Authorities and can even help the advertising sector itself validate its own self-regulation programs like AdChoices [...] It is really difficult to see how these laws will be applied if there are no easy methods for detecting violators", highlights Dr. Laoutaris.

The first method basically looks at the browsing history of a user and tries to understand the types of websites they visit. It then looks for correlations between these websites and the advertisements offered. Laoutaris explains that to do that "the method needs to understand and characterise the type of content of web-sites and ads and, hence, the name content-based analysis". The second, more recent method is, in the words of the researcher, even more powerful. Why? Because it uses frequency-based analysis to detect targeting. All that is needed to use it is a free tool called eyeWndr, which simply checks whether a particular advertisement "follows" a user across websites more than other ads do. "This alone proves to be enough for distinguishing targeted from non-targeted ads", says Laoutaris. The app is very simple to use: it marks each ad with a coloured circle, and by clicking on it the user can see a score; a high score means the ad is targeted. The results of this study will be presented at the ACM CoNEXT Conference to be held in Orlando, Florida, this December. An early version of this paper can be found here.
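
To make the idea of count-based detection concrete, the following is a minimal, hypothetical Python sketch. It is not the eyeWndr implementation; the function name and the example data are invented for illustration. The intuition is simply that an ad which keeps reappearing for one user, far more often than it appears for a control pool of users, earns a high score and is flagged as likely targeted.

```python
from collections import Counter

def targeting_scores(user_ad_views, control_ad_views, min_views=5):
    """Score each ad by how much more often it follows this user than it
    appears in a control pool; a score well above 1 suggests targeting."""
    user_counts = Counter(user_ad_views)        # ads seen by this user
    control_counts = Counter(control_ad_views)  # ads seen across a control pool
    total_user = sum(user_counts.values())
    total_control = sum(control_counts.values()) or 1
    scores = {}
    for ad, count in user_counts.items():
        if count < min_views:
            continue  # too few sightings to judge
        user_rate = count / total_user
        control_rate = (control_counts.get(ad, 0) + 1) / total_control  # +1 smoothing
        scores[ad] = user_rate / control_rate
    return scores

# Hypothetical example: "ad_42" keeps following this user but is rare elsewhere.
user = ["ad_42"] * 8 + ["ad_7", "ad_9", "ad_42", "ad_11"]
control = ["ad_7", "ad_9", "ad_11"] * 30 + ["ad_42"] * 2
print(targeting_scores(user, control))  # ad_42 gets a score far above 1
```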

Going beyond online advertising, Dr. Laoutaris has presented, in a seminar held at IMDEA Networks within the Science Week of Madrid, earlier work on detecting online price discrimination, as well as his community-building efforts in setting up and growing the Data Transparency Lab. He found that, depending on the country from which you connect, you may have to pay a different price for the same product. The researcher also talked about why it is important to make online services pay users for their data.

Credit: 
IMDEA Networks Institute

Job losses during the Great Recession may be responsible for decline in US birth rates

(Carlisle, Pa.) - New research published this month in the Southern Economic Journal reveals job losses during the Great Recession (2007-2009) may be partly responsible for the recent drop in U.S. birth rates. Further, researchers found job losses for men and women affect fertility in different ways--as do women's age and marital status--shedding light on possible drivers of the falling U.S. birth rate, which has been slipping since 2007.

Economist Shamma Alam, assistant professor of international studies at Dickinson College, and Bijetri Bose of UCLA's Fielding School of Public Health found that for married or cohabitating couples, job losses for male heads-of-household led to a significant decline in fertility. In contrast, job losses for female heads-of-household did not affect the likelihood of birth in the short term or medium term. However, job losses for single women decreased the likelihood of birth, and this negative effect on fertility persisted in the medium term, further illustrating that the fertility of single women has been decreasing for a longer time compared to married couples. Collectively, the data show a relationship between job losses and the falling birth rate.

The data also reveal women over 40 are more likely to have a birth following a job loss. "Losing a job, while creating stress and uncertainty, can also lead to additional time to care for a child," said Alam. The authors say the data showed extra time after a job loss could be a factor encouraging older women to conceive, especially as fertility decreases with age.

Alam suggests a potential reason for male job loss affecting fertility more than female job loss could be due to earnings. "Men are traditionally the primary income earners for many households, so losing a larger proportion of household income could cause couples to delay births or have fewer children," said Alam. Additionally, the researchers noted poorer families reduced their fertility more compared to wealthier families. They said this was due to more affluent families having greater income/wealth to fall back on following a job loss.

The study looked at data during the recession, which lasted from December 2007 to June 2009, and for the four years following the recession. In the study, which is the first to use longer-term individual and household data on fertility and the recession, Alam and Bose analyzed data from the Panel Study of Income Dynamics, a national survey with detailed information on employment, fertility and household wealth and income, which has been running since 1968.

Credit: 
Dickinson College

Subtle changes, big effects

image: The pathogenic bacterium Shigella is a major cause of a million diarrhoea-related deaths per year.

Image: 
iStock (CC BY 4.0)

"Does the flap of a butterfly's wings in Brazil set off a tornado in Texas?"
- Edward Lorenz, at the 139th meeting of the American Association for the Advancement of Science

Scientists have recently discovered the mechanism by which a minuscule change in 3 atoms in a protein molecule can affect immune signalling in cells. This 'butterfly effect' is used by the bacterium Shigella flexneri to survive within the host cells that it infects.

Ranabir Das' team at the National Centre for Biological Sciences (NCBS), Bangalore, has found that a tiny change in the protein UBC13, caused by a bacterial enzyme, creates a cascade of small atomic alterations that add up until they prevent UBC13 from binding to a partner protein, TRAF6. Without the UBC13-TRAF6 complex, the host cell is unable to begin signalling for an immune response against the bacteria. This study examines the mechanics of a subtle alteration--involving the loss of just 3 atoms--and investigates how it can have far-reaching consequences.

According to the chaos theory in mathematics, a minute change such as the 'flap of a butterfly's wing' could cause huge changes elsewhere. This seems to hold true at much smaller scales too--for example, within a cell. Scientists from the National Centre for Biological Sciences (NCBS), Bangalore, have now found how a minuscule atomic change in a protein molecule enables the bacterium to shut down its host's immune signalling system.

Shigella flexneri is a sneaky and highly infective bacterium. The organisms, which cause diarrhoea in humans, first attach to the cells lining the host's gut. Then, using a needle-like apparatus, S. flexneri begin pumping in their secret weapon--an enzyme that alters a single amino acid in the host cell protein UBC13 (UBiquitin Conjugating enzyme E2 13). This change renders UBC13 unable to bind to a partner protein, TRAF6 (TumouR necrosis factor Associated Factor 6), and effectively blocks the host cell from signalling for an inflammatory response. Subsequently, Shigella penetrates into the host cells to multiply.

Until now, little was known about how such a subtle effect--a change in just 3 atoms in a protein molecule containing roughly 3000 atoms--could shut down a host's immune response.

A new study by Ranabir Das' team at NCBS, however, has shown that when an amine group (formed of one nitrogen and two hydrogen atoms) is removed from a single amino acid in the host protein UBC13, a series of relatively small atomic changes pile up to block a key step in the immune response pathway. The work, which received funding support from the Tata Institute of Fundamental Research and the Department of Biotechnology, India, has been published as a paper in the journal eLife.

"We used a combination of structural studies, computational modelling, and enzymatic experiments to find that the atomic changes in UBC13 do not alter its structure. Rather, these changes completely disrupt its ability to bind its partner protein, TRAF6," says Priyesh Mohanty, who is part of Das' team, and the first author in the paper that describes these results.

The 14th amino acid in UBC13, an arginine residue (Arginine14), is critical in forming a 'salt bridge'--a bond between oppositely charged ions--with TRAF6. This salt bridge between the surfaces of the two proteins is necessary to stabilize and maintain the UBC13-TRAF6 complex, which in turn plays a pivotal role in immune response signalling. When the bacterial enzyme, named OspI, deamidates a glutamine residue (Glutamine100), a neutral amine group is replaced by a negatively charged hydroxyl group. This group then steals away Arginine14 from the UBC13-TRAF6 inter-molecular salt bridge to form an intra-molecular salt bridge with itself. This, in turn, compromises the transient interactions between UBC13 and TRAF6, which are necessary for forming the final complex. Finally, the new negative charge on the UBC13 surface creates a transient repulsive force between the UBC13 and TRAF6 proteins.

The salt bridge and the surface attractive forces collectively create a force powerful enough to steady (or hold) the UBC13-TRAF6 complex. With the loss of these forces, the complex falls apart, and without the complex, the immune signal against the bacteria is blocked.

"This investigation has really helped us understand the role of individual amino acids in the associations between proteins and how proteins function within the cell," say Mohanty and Das. "To the best of our knowledge, our study is the first to provide a mechanism by which glutamine deamidation of a target protein hinders its function inside the host cell. Interestingly, this mechanism allows the bacteria to attenuate the inflammatory response and promotes its ability to survive in a human host," they add.

Currently, Shigella infections cause almost 0.2 million deaths due to diarrhoea every year, of which one-third occur in infants. Several multidrug-resistant strains of these bacteria have also appeared, and yet there are no vaccines or drugs to prevent or limit the spread of Shigella outbreaks.

"It is, therefore, very important to study what mechanism Shigella uses to attenuate inflammatory responses in human hosts. Such studies may help researchers to identify new drugs that could control Shigella outbreaks in humans," say Mohanty and Das.

Credit: 
National Centre for Biological Sciences

Heart-on-a-chip technology predicts preclinical systolic and diastolic in vivo observations for novel cardiac drug in development

NEW YORK, November 18, 2019--TARA Biosystems, Inc. today reported in vivo and in vitro functional data from a study of investigational candidate, MYK-491, showing that TARA's human iPSC-derived organ-on-a-chip technology can directly measure in vivo cardiac performance. These data will be presented today at the American Heart Association's Scientific Sessions in Philadelphia. MYK-491 is MyoKardia, Inc.'s lead clinical-stage activator candidate designed to increase the contractility of the heart (systolic function) with minimal or no effect on myocardial relaxation and compliance (diastolic function) by acting directly on the proteins in the heart muscle responsible for contraction.

"These results are exciting because they demonstrate how TARA's advanced biology can really make an impact on the translation of clinical compounds," said Michael P. Graziano, PhD, chief scientific officer of TARA Biosystems, "Replicating complex physiology in systems that up to now could only be seen in animals positions our technology as a faster, cheaper, and more human-relevant alternative to animal testing."

In the presented study, the effects of MYK-491 were evaluated in instrumented canine models and TARA's human cardiac organoid model. The results indicate agreement between the two models, both showing improvements in systolic elastance (force production) with negligible effects on diastolic function. Both systolic and diastolic tension are dysregulated in patients with heart failure and, given their load dependency, systolic and diastolic mechanics have been difficult to measure in an in vitro setting, typically requiring studies in large animals with advanced instrumentation to capture such complex, integrated functional effects preclinically. TARA's organ-on-a-chip platform may offer an in vitro alternative to collect such measurements in a human setting.

"In the study reported today at AHA, TARA's human heart-on-a-chip technology provided confirmatory preclinical evidence of what we have seen in our other preclinical and clinical studies: MYK-491 appears to increase systolic contractility without impacting diastolic relaxation," said Robert McDowell, PhD, chief scientific officer of MyoKardia. "This platform may serve as a valuable human translational model for cardiovascular drug discovery with its ability to capture the nuances of human heart contraction and relaxation mechanics."

The use of human induced pluripotent stem cells (iPSCs) holds great promise as a foundation to bridge the human translation gap. However, experimental models that rely on iPSCs alone lack the relevant physiological hallmarks and drug responses seen in human heart muscle. TARA leverages the power of iPSCs and subjects them to a rigorous maturation process on its patented Biowire™ II system, producing 3D human cardiac tissues called Cardiotype™ tissues. In a study published earlier this year in Cell, TARA's scientific founders validated the ability of the Biowire™ II platform to create physiologically relevant human cardiac tissues. The research also showed how the platform could be used to model different heart diseases by using iPSCs from patients. Additionally, findings published recently in the Journal of Toxicological Sciences show that TARA's 3D cardiac tissue platform predicts responses to a wide range of drugs known to affect cardiac function in humans, something that has been a challenge in pre-clinical models until now.

Credit: 
CG Life

Should scientists change the way they view (and study) same sex behavior in animals?

Over the years, scientists have recorded same-sex sexual behavior in more than 1,500 animal species, from snow geese to common toads. And for just as long evolutionary biologists studying these behaviors have grappled with what has come to be known as a "Darwinian paradox": How can these behaviors be so persistent when they offer no opportunity to produce offspring?

In a new article, researchers from the Yale School of Forestry & Environmental Studies make the case that it's time to reframe the question from "why do animals engage in same sex behavior (SSB)" to "why not?" Writing in the journal Nature Ecology & Evolution, the authors suggest that these behaviors may actually have been part of the original, ancestral condition in animals and have persisted because they have few -- if any -- costs and perhaps some important benefits.

"We propose a shift in our thinking on the sexual behaviors of animals," says Julia Monk, lead author and F&ES doctoral candidate. "We're excited to see how relaxing traditional constraints on evolutionary theory of these behaviors will allow for a more complete understanding of the complexity of animal sexual behaviors."

Typically, research into these behaviors has rested on two assumptions, the authors state. The first is that same-sex behavior has high costs because individuals spend time and energy on activities that have no potential for reproductive success. The other is that same-sex behaviors emerged independently in different animal lineages.

They argue that a combination of same-sex and different-sex sexual behaviors (DSBs) is an original condition for all sexually reproducing animals -- and that these tendencies likely evolved in the earliest forms of sexual behavior.

They also dispute the assumption that, because different-sex behaviors are essential for sexual reproduction, selection -- the tendency for beneficial traits that promote increases in population size or resilience to spread -- will eliminate sexual behaviors that do not immediately result in reproduction. On the contrary, they suggest that SSB is not always -- and maybe even seldom -- very costly. This would suggest that this behavior is actually what evolutionary biologists call "neutral," meaning that it has neither negative nor positive effects and therefore persists because there's no reason for natural selection to weed it out.

Moreover, the authors suggest that same-sex behaviors are not only often "not costly," but can also be advantageous from a natural selection perspective, because individuals are more likely to mate with more partners. Many species aren't inherently monogamous but instead try to mate with more than one individual. In many species it can be difficult for individuals to even discern between different sexes.

"So, if you're too picky in targeting what you think is the opposite sex, you just mate with fewer individuals. On the other hand, if you're less picky and engage in both SSB and DSB, you can mate with more individuals in general, including individuals of a different sex," says co-author Max Lambert, a postdoctoral fellow at the University of California-Berkeley's Departmental of Environmental Science.

For example, scientists have found that male burying beetles engage in increased same-sex behavior when they perceive a higher cost of missed mating opportunities with females. This suggests that engaging exclusively in different-sex behaviors is actually disadvantageous because it reduces chances to display mating potential when mating opportunities are rare.

Such examples only hint at what scientists don't know about same-sex behaviors in animals, Lambert said. There are thousands of examples of SSB in animals, he said, yet most of these observations occurred by chance and scientists rarely if ever actively study how often these behaviors occur compared with different-sex sexual behaviors.

"So far, most biologists have considered SSB as extremely costly and, consequently, something that is aberrant," he says. "This strong assumption has stopped us as a community from actively studying how often and under what conditions SSB is happening. Given our casual observations suggests that SSB seems to happen pretty commonly across thousands of species, imagine what we would have learned if we had assumed this was something interesting and not just a rampant accident."

Other co-authors include Erin Giglio from the Department of Integrative Biology at the University of Texas at Austin; Ambika Kamath from the University of California Berkeley; and Caitlin McDonough from the Center for Reproductive Evolution at Syracuse University.

For the paper, the researchers explained that they use the terms "same-sex behaviors" and "different-sex behaviors" rather than terms such as homosexuality or heterosexuality to avoid conflation with terms for human sexual identities.

Nonetheless, Monk notes that scientific questioning into the persistence of same-sex sexual behaviors has long been observed through the lens of a human society that has historically judged some behaviors to be "normal" or "abnormal." This tendency, she says, has hindered our understanding of animal behavior in that it has promoted research that only confirms pre-existing assumptions or even averts important steps in the scientific process.

"Once you really dig into the research on the behavior of animals you can't help but be impressed by the diversity of life and how animals are out there defying our expectations all the time," she says. "And this should lead us to question those expectations."

Credit: 
Yale School of the Environment

Ohio University research shows 'bad cholesterol' is only as unhealthy as its composition

ATHENS, Ohio (Nov. 18, 2019) - New research at Ohio University shows that a particular subclass of low-density lipoproteins (LDL), also known as "bad cholesterol," is a much better predictor of potential heart attacks than the mere presence of LDL, a measure that is incorrect more often than not.

The presence of LDL is considered an indicator for the potential risk of heart attacks or coronary disease, but studies have shown that about 75 percent of patients who suffer heart attacks have cholesterol levels that don't indicate a high risk for such an event. Research by Ohio University Distinguished Professor Dr. Tadeusz Malinski and researcher Dr. Jiangzhou Hua in Ohio University's Nanomedical Research Laboratory shows that of the three subclasses that comprise LDL, only one causes significant damage.

"Our studies can explain why a correlation of total "bad" cholesterol with a risk of heart attack is poor and dangerously misleading - it's wrong three quarters of the time," Malinski said. "These national guidelines may seriously underestimate the noxious effects of LDL cholesterol, especially in cases where the content of subclass B in total LDL is high (50% or higher)."

Malinski's team used nanosensors to measure the concentration of nitric oxide and peroxynitrite in endothelium stimulated by LDL subclasses and reported the findings in a study published in the current issue of the International Journal of Nanomedicine. Subclass B of LDL was found to be the most damaging to endothelial function and can contribute to the development of atherosclerosis. Therefore, it's not the total amount of LDL cholesterol one has, but rather the concentration of subclass B relative to the other two, subclass A and subclass I, that should be used to diagnose atherosclerosis and the risk of heart attack.
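
As a purely illustrative sketch (not a clinical tool and not the authors' method), the arithmetic point can be made concrete in a few lines of Python: two hypothetical patients can have identical total LDL while the share contributed by subclass B, the figure Malinski highlights when it reaches 50% or more, differs sharply. The function name and numbers below are invented for illustration.

```python
def subclass_b_fraction(subclass_a: float, subclass_b: float, subclass_i: float) -> float:
    """Fraction of total LDL contributed by subclass B."""
    total_ldl = subclass_a + subclass_b + subclass_i
    return subclass_b / total_ldl

# Two hypothetical patients with the same total LDL (100 units) but very
# different composition: only the second reaches the 50% subclass-B share
# mentioned in the quote above.
print(subclass_b_fraction(60, 30, 10))  # 0.3 -> subclass B is a minor share
print(subclass_b_fraction(25, 60, 15))  # 0.6 -> subclass B dominates
```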

"Understanding this could lead to improving the accuracy of diagnosis for the evaluation of cardiovascular disease rates," Malinski said. "Analyzing the mixture of LDL subclasses may provide a parameter-based model for an early medical diagnosis of estimating the risk of cardiovascular disease."

Credit: 
Ohio University

The measurements of the expansion of the universe don't add up

image: Solving the discordant data on the expansion rate of the universe is like trying to thread a 'cosmic needle', where its hole is the H0 value measured today and the thread is brought by the model from the furthest Universe we can observe: the cosmic microwave background.

Image: 
NASA/JPL-Caltech/ESA-Planck Collaboration/SINC

Physicists use two types of measurements to calculate the expansion rate of the universe, but their results do not coincide, which may make it necessary to touch up the cosmological model. "It's like trying to thread a cosmic needle," explains researcher Licia Verde of the University of Barcelona, co-author of an article on the implications of this problem.

More than a hundred scientists met this summer at the Kavli Institute for Theoretical Physics at the University of California (USA) to try to clarify what is happening with the discordant data on the expansion rate of the universe, an issue that affects the very origin, evolution and fate of our cosmos. Their conclusions have been published in the journal Nature Astronomy.

"The problem lies in the Hubble constant (H0), a parameter which value -it is actually not a constant because it changes with time- indicates how fast the Universe is currently expanding," points out cosmologist Licia Verde, an ICREA researcher at the Institute of Cosmos Sciences of the University of Barcelona (ICC-UB) and the main author of the article.

"There are different ways of measuring this quantity," she explains, "but they can be divided into two major classes: those relying on the Late Universe (the closest to us in space and time) and those based on the Early Universe, and they do not give exactly the same result."

A classic example of measurements in the late universe are those provided by the regular pulsations of cepheid stars, which the astronomer Henrietta Swan Leavitt already observed a century ago and which helped Edwin Hubble calculate distances between galaxies and prove in 1929 that the Universe is expanding.

The current analysis of the variable brightness of cepheids with space telescopes such as the Hubble, along with other direct observations of objects in our cosmic environment and more distant supernovae, indicates that the H0 value is approximately 73.9 kilometres per second per megaparsec (a unit of astronomical distance equivalent to about 3.26 million light years).

However, measurements based on the early Universe provide an average H0 value of 67.4 km/s/Mpc. These other records, obtained with data from the European Space Agency's Planck Satellite and other instruments, are obtained indirectly on the basis of the success of the standard cosmological model (Lambda-CDM model), which proposes a Universe made up of 5 % atoms or ordinary matter, 27 % dark matter (made up of particles, as yet undetected, that provide additional gravitational attraction so that galaxies can form and clusters of galaxies are held together) and 68 % dark energy, which is responsible for accelerating the expansion of the Universe.

"In particular, these measurements of the primordial Universe focus on the farthest light that can be observed: the cosmic microwave background, produced when the Universe was only 380,000 years old, in the so-called recombination era (where protons recombined with electrons to form atoms)," says Licia Verde.

The researcher highlights a relevant fact: "There are very different and independent ways (with totally different instruments and scientific tools) to measure the H0 on the basis of the early Universe, and the same goes for the late Universe. What is interesting is that all the measurements of one type are in mutual agreement with one another, at an exquisite precision of 1 or 2 %, as are those of the other type, with the same great precision; but when we compare the measurements of one class with those of the other, the discrepancy arises."

"It looks like a small difference, only 7%, but it is significant considering that we are talking about precisions of 1 or 2% in the value of the Hubble constant," as emphasised by Licia Verde, who jokes: "It is like trying to thread a 'cosmic needle' where its hole is the H0 value measured today and the thread is brought by the model from the furthest Universe we can observe: the cosmic microwave background."

In addition, she points out some of the consequences of the discrepancy: "The lower the H0 is, the older the Universe is. Its current age is calculated at about 13.8 billion years considering that the Hubble constant is 67 or 68 km/s/Mpc; but if its value were 74 km/s/Mpc, our universe would be younger: it would be approximately 12.8 billion years old."
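
The inverse relationship Verde describes can be illustrated with a rough back-of-the-envelope calculation. The Python sketch below is an approximation added here for illustration, not part of the study: it uses the Hubble time, 1/H0, as a proxy for the age of the Universe. The precise ages quoted above come from integrating the full Lambda-CDM model, but the trend is the same: a larger H0 means a younger Universe.

```python
# Hubble time 1/H0 as a rough proxy for the age of the Universe.
KM_PER_MPC = 3.0857e19      # kilometres in one megaparsec
SECONDS_PER_GYR = 3.156e16  # seconds in one billion years

def hubble_time_gyr(h0_km_s_mpc: float) -> float:
    """Return 1/H0 expressed in billions of years."""
    h0_per_second = h0_km_s_mpc / KM_PER_MPC  # convert H0 to units of 1/s
    return 1.0 / h0_per_second / SECONDS_PER_GYR

for h0 in (67.4, 73.9):
    print(f"H0 = {h0} km/s/Mpc -> Hubble time ~ {hubble_time_gyr(h0):.1f} Gyr")
# Prints ~14.5 Gyr for 67.4 and ~13.2 Gyr for 73.9: the higher measured H0
# implies a younger Universe, mirroring the 13.8 vs 12.8 billion-year figures
# quoted above (which come from the full cosmological model, not from 1/H0).
```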

Modifying the model in the early Universe

The authors point out in their study that this anomaly does not seem to depend on the instrument or method used for measuring, or on human equipment or sources. "If there are no errors in the data or measurements, could it be a problem with the model?" the researcher asks.

"After all, the H0 values of the primordial Universe class are based on the standard cosmological model, which is very well established, very successful, but which we can try to change a little to solve the discrepancy," says the expert. "However, we cannot tamper with the characteristics of the model that work very well".

If the data continue to confirm the problem, theoretical physicists seem to agree that the most promising route for solving it is to modify the model just before the light observed of the cosmic microwave background was formed, i.e. just before recombination (in which there was already 63 % dark matter, 15 % photons, 10 % neutrinos and 12 % atoms). One of the ideas proposed is that, shortly after the Big Bang, an intense episode of dark energy could have occurred that expanded the Universe faster than previously calculated.

"Although it is still highly speculative, with this fine-tuned model, the H0 value obtained with measurements based on the primordial Universe could coincide with local measurements," notes Licia Verde, who concludes: "It won't be easy, but in this way we could thread the cosmic needle without breaking what works well in the model."

Credit: 
Spanish Foundation for Science and Technology

Steep energy bills can lead families into poverty, nationwide study shows

image: Households considered energy-burdened spend 10% or more of their income on heating and electricity.

Image: 
UW Oshkosh graphics

While it makes sense that families living below the poverty line have a difficult time covering their energy bills, new University of Wisconsin Oshkosh research shows the reverse to be true as well: high energy bills can lead a household into poverty.

The nationwide study--led by UWO environmental sociologist Jeremiah Bohr and published Nov.15 in the peer-reviewed journal Social Forces--indicates that dedicating inordinate amounts of income to energy services can threaten a family's well-being over time.

"In a state like Wisconsin with harsh winters, it is very important to think of the families that have trouble covering the heating bills," he said. "At a certain point, it is non-negotiable. You have to heat your home or the pipes will freeze."

Bohr and Anna McCreery with Elevate Energy, an economic development agency in Chicago, analyzed household income and energy expenditures of thousands of American households across two decades. They paid special attention to households classified as "energy burdened"--those spending 10% or more of their income on heating and electricity.
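
In code, that classification is a simple ratio. The following minimal Python sketch is an illustration of the 10% threshold assuming annual figures; it is not the authors' analysis code, and the function name and example numbers are invented.

```python
def is_energy_burdened(annual_income: float, annual_energy_spend: float,
                       threshold: float = 0.10) -> bool:
    """Flag a household that spends at least `threshold` of its income
    on heating and electricity."""
    if annual_income <= 0:
        return True  # no income at all to cover energy bills
    return (annual_energy_spend / annual_income) >= threshold

# Hypothetical example: $30,000 of income and $3,600 of energy bills is a
# 12% energy burden, so the household is classified as energy burdened.
print(is_energy_burdened(30_000, 3_600))  # True
```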

"When comparing households living beneath the poverty line, those that were energy burdened were about twice as likely to remain in poverty two years later," Bohr said. "This pattern repeated when comparing households living above the poverty line--energy-burdened households had double the odds of transitioning into poverty within two years."

Bohr said the results have implications for discussions of poverty reduction, energy consumption and climate policy.

"Although renewable energy costs have fallen dramatically and are near parity with the cost of other fuels, carbon taxes or related instruments may nonetheless contribute to higher energy costs for some consumers or the perception of higher costs," he said.

"As policymakers and activists continue to address energy consumption as part of climate change mitigation, it is important to recognize how energy costs can affect economically vulnerable households."

Households may experience the impacts of high energy costs immediately, forcing families to perhaps reduce spending on food or other necessities to heat the home or causing them to fall behind on their utility bills and, consequently, harm their credit rating.

"Activists and policymakers should keep in mind that too many households lack a proper safety net to secure them from policies that might increase energy costs. This research emphasizes the importance of energy assistance and energy efficiency for low-income households," he said.

For example, improving the energy efficiency of low-income housing will likely produce environmental benefits while potentially reducing the risk of energy burdens that can threaten a household's economic well-being.

When analyzing energy consumption in the U.S., sociologists typically have focused on excessive use. But energy scarcity for households is an important topic for environmental sociology and social science more broadly.

"Focusing on energy insecurity presents an opportunity for social scientists to engage in a larger conversation about social inequality as it affects energy and climate policy," Bohr said.

Credit: 
University of Wisconsin Oshkosh

Antibiotics from the sea

image: Divers collect leaves of Posidonia oceanica in the bay of Calvi (Corsica). On this plant many planctomycetes have been found.

Image: 
Christian Jogler

Nearly three-quarters of all clinically relevant antibiotics are natural substances, produced by bacteria. However, the antibiotics that are currently available are losing their effectiveness and increasing numbers of pathogens are becoming resistant. This means there is an urgent need for new antibiotics, but at present fewer than one per cent of known species of bacteria are available for the search for active substances. The remaining 99 per cent are considered 'impossible to cultivate' and are therefore hardly studied.

In addition, the ability to produce antibiotics is not evenly distributed among bacteria. "Talented producers are primarily microorganisms with complex lifestyles, an unusual cell biology and large genomes," explains microbiologist Christian Jogler of Friedrich Schiller University, Jena. "Such organisms produce antibiotic compounds and deploy them in the fight against other bacteria for nutrients and habitats," he adds. Anywhere that such microbiological battles over resources take place and nutrients are scarce is a promising place to search for potential producers of antibiotics.

Targeted cultivation of potential antibiotic producers

That is exactly what Jogler and his team have done. With the help of diving robots and scientific divers, they looked for Planctomycetes in a total of 10 marine locations. "We know that Planctomycetes live in communities with other microorganisms and compete with them for habitat and nutrients," says Jogler, explaining what makes this group of bacteria of interest to the researchers.

With samples from the Mediterranean, the North Sea, the Baltic Sea and the Black Sea, as well as the Atlantic, the Pacific and the Arctic Ocean, the scientists succeeded in creating pure cultures of 79 new Planctomycetes. "These pure cultures together represent 31 new genera and 65 new species," adds lead author Dr Sandra Wiegand.

Unknown cell division and antibiotic potential

Bioinformatic and microscopic methods were used to characterise the newly obtained pure cultures. "The bioinformatic analysis was holistic in its approach," says Wiegand. The potential to produce small molecules such as antibiotics was studied, as were the processes of cellular signal transduction. The latter are a measure of the complexity of the microbial lifestyle and therefore a further pointer towards antibiotic production. "The results of these analyses show that the newly obtained Planctomycetes have extraordinarily complex lifestyles and have the potential to produce new antibiotics," she added.

The researchers were able to provide experimental confirmation of some of their bioinformatic analyses in this study. Among other things, they investigated the cell biology of the Planctomycetes that had been isolated. "They divide in a very different way from all other important pathogenic bacteria," says Jogler. Furthermore, the research shows unexpected new mechanisms of bacterial cell division. Above all, however, the study provides impressive evidence that supposedly 'non-cultivable' bacteria can be obtained and characterised in pure culture.

According to the authors of the study, many aspects of their current work can be transferred to other potential antibiotic producers. "Hypothesis-driven cultivation and holistic characterisation are essential for discovering something really new and opening up new therapeutic avenues," stresses Jogler, who only moved to Jena a few weeks ago. The Professor for Microbial Interactions is pleased that he has found an ideal environment for his research at Friedrich Schiller University, with its Cluster of Excellence 'Balance of the Microverse'.

Credit: 
Friedrich-Schiller-Universitaet Jena

Disparities in care, mortality among hospitalized homeless adults with cardiovascular conditions

What The Study Did: Disparities in hospital care as measured by procedure rates and in-hospital death rates between homeless and nonhomeless adults hospitalized for cardiovascular conditions in New York, Massachusetts and Florida were examined in this observational analysis.

To access the embargoed study: Visit our For The Media website at this link https://media.jamanetwork.com/

Authors: Rishi K. Wadhera, M.D., M.P.P., M.Phil., of Beth Israel Deaconess Medical Center and Harvard Medical School in Boston, is the corresponding author.

(doi:10.1001/jamainternmed.2019.6010)

Editor's Note: Please see the article for additional information, including other authors, author contributions and affiliations, financial disclosures, funding and support, etc.

#  #  #

Media advisory: The full study is linked to this news release. This study is being released to coincide with presentation at the American Heart Association's Scientific Sessions 2019.

Embed this link to provide your readers free access to the full-text article: This link will be live at the embargo time https://jamanetwork.com/journals/jamainternalmedicine/fullarticle/10.1001/jamainternmed.2019.6010?guestAccessKey=493acc16-47eb-468b-a5e8-665788f93fad&utm_source=For_The_Media&utm_medium=referral&utm_campaign=ftm_links&utm_content=tfl&utm_term=111819

Credit: 
JAMA Network

Heart pumps associated with complications in some patients after heart stent procedure

In critically ill patients who require a heart pump to support blood circulation as part of stent procedures, specific heart pumps have been associated with serious complications, according to a new study led by cardiologists at Washington University School of Medicine in St. Louis.

Though the observational study does not prove that the heart pumps -- ventricular assist devices -- are the cause of complications, it suggests that with current practice patterns, there is an association between the use of the pumps and an increased risk of bleeding, kidney problems, stroke and death in patients undergoing stent procedures. The study authors are calling for more research evaluating the heart pumps marketed under the brand name Impella.

Results from the study were presented Nov. 17 at the American Heart Association's Scientific Sessions 2019 in Philadelphia and published simultaneously in the journal Circulation.

After statistically adjusting for certain variables, the researchers found an increased risk of death, bleeding, acute kidney injury and stroke among patients while they were still hospitalized after receiving Impella pumps versus balloon pumps. In particular, the Impella pump was associated with a 24% higher risk of death than seen with balloon pumps and a 34% increased risk of stroke compared with the balloon pump. Both of these differences are statistically significant. In no category was the Impella pump associated with improved outcomes.

"These results deserve a closer look to try to better understand the link between the device and its complications," said lead author Amit P. Amin, MD, a Washington University cardiologist and associate professor of medicine, who is presenting the data. "They suggest that perhaps a more measured approach -- one that balances risks and benefits -- is needed in this critically ill population. These data are observational, so they can't prove causation. But they underscore the need for large, randomized clinical trials and prospective registries to better understand and guide the use of cardiac support devices."

The researchers analyzed data from the Premier Healthcare Database that included information from 48,000 patients treated at 432 U.S. hospitals. Each patient in the study underwent a heart stent procedure, which involves opening a blocked artery in the heart to improve blood flow. Some patients undergoing the stent procedure are seriously ill, often having other medical conditions including heart failure, low blood pressure, complex blockages and other cardiac problems that might lead doctors to decide to add a mechanical assist device during the procedure to help the heart pump a greater volume of blood. Of the patients in this study, just under 10% (4,782 patients) received an Impella heart pump. The remaining 90% (43,524 patients) received an intra-aortic balloon pump.

Most patients undergoing stent procedures don't need a ventricular assist device. This study is focused on the small segment (roughly 3% to 5%) of patients undergoing stent procedures for more advanced heart problems -- such as complex blockages or heart failure or cardiogenic shock, in which the heart loses its capacity to pump sufficient blood -- and need a ventricular assist device. Most patients receive an intra-aortic balloon pump, which rhythmically inflates and deflates in coordination with the heart's natural rhythm, to help push blood through the vessels. These pumps have been in use since the 1960s. But since 2008, a steadily increasing proportion of patients receive the more recently approved Impella pumps, which have small rotors that create a continuous flow of blood.

The data came from patients treated from 2004 to 2016. The Impella pump was introduced into clinical practice in 2008, allowing for comparisons from the time periods before and after this type of pump came into use. Impella use increased steadily from about 1% of patients receiving a pump in 2008 to almost 32% of all patients in 2016 undergoing stent procedures with support devices.

The researchers also found large variations in how often hospitals used Impella pumps. The hospitals that used Impella pumps more frequently had higher adverse outcomes, as well as higher costs associated with caring for these patients, despite controlling for clinical factors. The researchers analyzed the possibility that sicker patients were more likely to receive the Impella pump, perhaps explaining at least part of that association. Instead, they found a trend showing lower Impella use among more critically ill patients.

The authors caution that there are limitations to this observational study, such as physician preference for use of Impella or balloon pumps, or inability to account for factors that were not measured in the observational study. But since the majority of the data suggest no improvement in outcomes linked to the use of the Impella pump as well as serious complications, Amin and his colleagues call for more definitive research to better understand the appropriate role for circulatory support devices in clinical practice.

"These mechanical support devices are innovative and can efficiently pump blood to the body, but in this study, we found no association with improved outcomes with the Impella pumps," Amin said. "This warrants more study so we can understand which patients are likely to benefit from these cardiac assist devices and which are more likely to develop problems."

Credit: 
Washington University School of Medicine

Fertilization discovery could lead to new male contraceptive, help infertile couples

image: Claudia Rival, PhD, (from left) shares notes with Kodi Ravichandran, PhD, and Jeffrey J. Lysiak, PhD. The researchers have made a surprising discovery about fertilization and the egg's role in it, potentially paving the way for a new contraceptive.

Image: 
Dan Addison | UVA Communications

An unexpected discovery about fertilization from the University of Virginia School of Medicine reveals new insights on how sperm and egg fuse and could have major implications for couples battling infertility - and may lead to a future male contraceptive.

The finding has the potential both to boost the success rate of in-vitro fertilization and reduce its cost. "The infertility experts here at UVA are very excited about this," said researcher Jeffrey J. Lysiak, PhD. "This tells us a lot about fundamental biology, but we think it could also have important clinical applications."

The finding recasts the role of the egg in the fertilization process. The old notion of the egg as a passive partner for sperm entry is out. Instead, the researchers found, there are molecular players on the surface of the egg that bind with a corresponding substance on the sperm to facilitate the fusion of the two. "High school biology taught us a very sperm-centric version of fertilization," said UVA researcher Kodi Ravichandran, PhD. "And now it's very clear that it is a dynamic process where both the sperm and egg are equally and actively involved in the ultimate biological goal of achieving fertilization."

Not Dead Yet

Ravichandran and Lysiak's laboratories started collaborating a few years ago on how immature sperm go through developmental stages in the testes. During their studies, they noticed something unusual. Some immature sperm that appeared to be dying weren't dying at all - they were alive and healthy. These sperm had a molecular marker on their surface suggestive of a dying cell, and this marker grew stronger as the sperm matured. "This initially made no sense," Lysiak said. "We had to do a lot of experiments to show that, indeed, these were live, motile sperm."

It turned out that this marker on the sperm that often tells the body to remove dying cells is used differently and in an important way during fertilization. This marker, phosphatidylserine (PS), is normally held inside cells until they die, but it is also exposed, quite deliberately, on healthy, live sperm.

The egg, meanwhile, expresses protein partners that specifically and actively engage the PS on the sperm. This PS-based recognition, along with other interactions, promotes sperm-egg fusion. Masking the PS on the sperm, or preventing the receptors on the egg from recognizing the sperm, blocks fertilization quite efficiently.

That has several intriguing implications. First, for couples struggling with infertility, doctors one day might try to enhance the exposure of PS on the sperm to promote the chance of conception. They also could examine a man's sperm before in-vitro fertilization to select sperm that are most likely to result in pregnancy. This could help prevent the need for multiple attempts and reduce the cost couples must bear.

"When men go in for infertility tests, they do a basic semen analysis, and the current analysis primarily looks at the number of sperm, can it swim and how does it look," Lysiak said. "It doesn't provide much of an idea of the sperm's fitness to fertilize." As part of this work, the Ravichandran and Lysiak groups have also designed a new test to determine the fertilization fitness of the sperm based on exposure of PS.

Second, the researchers believe that finding a way to mask phosphatidylserine on the sperm head could be a potential form of contraception. "It is a very likely possibility," Ravichandran said. "We blocked phosphatidylserine by three or four different ways [in lab dishes], and we are pleasantly surprised how well it blocks sperm-egg fusion."

Understanding Fertilization

Ravichandran, chairman of UVA's Department of Microbiology, Immunology and Cancer Biology, and Lysiak, of the Department of Urology, plan to explore the basic-science questions related to fertilization in their laboratories and the potential therapeutic applications through a company they have formed called PS-Fertility.

They noted that the exciting possibilities are very much a testament to the importance of fundamental scientific research into basic biological questions. "Fertilization has been studied for 100 years. One would think we would have figured out something as fundamental as fertilization," Ravichandran said. "But the answer is, although surprising, not really. There are still many black boxes we don't understand, and this opens up several new avenues to pursue."

Credit: 
University of Virginia Health System

Mantis shrimp vs. disco clams: Colorful sea creatures do more than dazzle

image: A peacock mantis shrimp guards its eggs.

Image: 
Lindsey Dougherty

When Lindsey Dougherty was an undergraduate student at CU Boulder in 2011, she got the chance to visit North Sulawesi, Indonesia, on a research trip. There, in the clear tropical waters off the coast, she encountered an animal that would change the course of her career.

It was the disco clam (Ctenoides ales). And it caught Dougherty's eye for good reason: Even in a coral reef, these tropical bivalves are explosions of color. They have bright-red appendages that dangle out of their shells and thin strips of tissue that pulse with sparkly light like a disco ball--hence their name.

In that moment, she found her research calling.

"How do they flash?'" Dougherty remembered thinking as she dove through the reef with scuba gear.

As a graduate student at the University of California, Berkeley, the young scientist solved that first puzzle: the clams, she discovered, carry tiny silica spheres in their tissue.

Now back in Colorado as an instructor in the Department of Ecology and Evolutionary Biology (EBIO), Dougherty is pursuing an even trickier mystery: Why are these bivalves so colorful in the first place?

The answer could reveal new clues to how the interaction between species drives the evolution of ocean animals over millions of years.

It's a pursuit that has expanded to include several high school students and introduced Dougherty to an animal that may be even more groovy-looking than the disco clam--a fierce predator on the same coral reefs called the peacock mantis shrimp (Odontodactylus scyllarus).

And in a recently published paper in the journal Royal Society Open Science, she and her colleagues report that they may be finally getting close to solving that puzzle.

"It's a long time to spend on one organism," Dougherty said. "But I think it also shows how many questions there are about one seemingly simple clam."

Clams vs. mantis shrimp

To grasp Dougherty's obsession with this shelled organism, it helps to understand the weirdness of the disco clam.

Jingchun Li is a curator of invertebrates at the CU Museum of Natural History and advised Dougherty during her postdoctoral studies at CU Boulder. Li has spent her career exploring the diversity of the world's bivalves--a class of aquatic mollusks that include animals like clams, scallops and mussels.

"Normally, if you think about clams like the ones in clam chowder--they're little white things," said Li, also an assistant EBIO professor. "But these clams are so colorful. One hypothesis we had is this might be some sort of warning signal to predators saying, 'Don't eat me.'"

In other words, disco clams might taste really bad, and they advertise that to the world using their bright colors. Kind of like a coral reef version of poison dart frogs in the Amazon.

To test that idea, Li and Dougherty recruited several peacock mantis shrimp--which, despite their names, aren't actually shrimp--and kept them in tanks on the CU Boulder campus.

Like disco clams, these animals are pretty wild to look at. They come in a rainbow of colors, from blues and greens to neon orange and yellow. But don't let their appearance fool you. Known for their powerful punches, mantis shrimp can extend their front claws at speeds of nearly 75 miles per hour--fast enough to generate an underwater shock wave that can shatter aquarium glass on impact.

The researchers, in other words, set up an ecological contest between the colorful mantis shrimp and flashing clams.

They offered their mantis shrimp a choice between two types of disco clam tissue. The mantis shrimp could either eat bright-red meat from the clams' exterior or normal, white meat from their inner muscles.

The mantis shrimp didn't even hesitate. They went for the white meat.

"It turns out they really hate the red tissue," Li said.

That, along with chemical analyses of the two types of meat, certainly seemed to suggest that the team's poison dart frog hypothesis had been spot on.

But another wrinkle emerged: When the group offered the mantis shrimp white meat that was dyed to look red, the invertebrates still chowed down.

As Dougherty put it, "Whether or not the red color is a warning needs more research."

Vinegar vs. Sriracha

It's work that's happening now in Li's lab. She's hoping to discover whether the same mantis shrimp can learn to fear the color red--a key step in determining whether that shade may act as a warning signal.

Aiding Li in that effort are two seniors from Monarch High School in Louisville, Colorado, who are helping out in her lab through the Science Research Seminar program in the Boulder Valley School District. They're tackling a pretty basic question: What kinds of food taste gross to a mantis shrimp?

The students, Grateful Beckers and Elysse DeBarros, have spent their semester trying out different flavor combinations, which they add to chunks of supermarket shrimp meat. Once they find a suitably yucky taste, they'll mix it with meat dyed red to see if mantis shrimp will become wary of that hue over time.

Early results suggest that mantis shrimp can't stand the taste of vinegar but, like many people, don't seem to mind Sriracha sauce.

"They're so unique," Beckers said. "They're a large, interesting shrimp with a lot of interesting adaptations."

Dougherty herself may still have a way to go before she resolves the mystery that first caught her attention on the Indonesian reef all those years ago. But it's been a fascinating road of discovery for this Colorado native who first learned to scuba dive in the Pueblo Reservoir.

"I love the mountains, and I love diving. I don't think they should ever be mutually exclusive," Dougherty said. "Everyone is connected to the ocean whether they realize it or not."

Credit: 
University of Colorado at Boulder