
National livestock movement bans may prove economically damaging 

New research from the University of Warwick has pioneered an economic perspective on controlling livestock diseases. Focusing on Foot and Mouth Disease (FMD), bovine TB (bTB) and bluetongue virus (BTV), the researchers draw striking conclusions about the role of movement bans in controlling an outbreak.

In the 2001 outbreak of FMD, the movement of cattle, sheep and other livestock was generally banned in an effort to prevent the spread of infection. Similarly, in 2007, an outbreak of bluetongue virus led to large-scale movement bans across eastern England.

Given that the livestock industry relies on the movement of animals (between farms or from farm to slaughter) to make a profit, such movement bans can have a profound and wide-ranging impact on farmers. Moreover, in 2001 the general message that "the countryside is closed" resulted in enormous losses to the tourist industry.

The research, "The Role of Movement Restrictions in Limiting the Economic Impact of Livestock Infections", published today (19 August 2019) in Nature Sustainability, found that the current UK government policy of national movement bans when an outbreak of FMD is detected (and large-radius bans for BTV) may cause unnecessary economic harm: a more localised movement ban could be as successful in halting the spread of the disease while limiting the subsequent negative economic impact.

Led by Dr Mike Tildesley, of Warwick's Zeeman Institute for Systems Biology and Infectious Disease Epidemiology Research (SBIDER), the researchers use state-of-the-art predictive models to examine the consequences of different control options.

The researchers argue that whilst livestock movements bring the risk of long-range spread of infection, this risk is strongest from farms in close proximity to where infection has been detected; therefore a limited movement ban (only preventing movements from farms near to known cases) brings most of the benefits at a fraction of the economic cost.

By not automatically implementing national bans during FMD or BTV outbreaks, geographical regions unaffected by the outbreak would be spared the economic impact of the restrictions a national ban puts in place.

Accordingly, whilst a national ban on livestock movement was an appropriate initial response to the FMD outbreak of 2001 given its widely dispersed nature, the policy caused potentially avoidable economic harm in the outbreak of 2007.

Commenting on the research Dr Tildesley says:

"Our research says that movement controls need to be carefully matched to both the epidemiological and economic consequences of the disease, and optimal movement bans are often far shorter than existing policy.

"For example, our work suggests that movement bans of between 15 and 60 km are optimal for FMD (with larger radii preferable if tourism losses can be ignored), while for BTV the optimal policy is to allow all movements."

"Adopting these optimal movement bans could lead to vast savings compared to more stringent policies. We fully recognise the need for the government to rapidly contain novel outbreaks in the face of uncertainty, but our work suggests that optimal movement bans should be enacted as soon as possible."

The researchers also looked at bovine tuberculosis (bTB), concluding that the economic cost of any movement ban outweighs the epidemiological benefits; however, if tests are sufficiently cheap, a localised testing programme around infected farms could be economically viable in the long term.
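The radius trade-off the researchers describe can be sketched with a toy cost model (not the authors' model; every parameter below is invented purely for illustration): the expected epidemiological cost falls as the ban radius grows, while the economic cost of restricted trade and lost tourism rises roughly with the banned area, so the total cost is minimised at an intermediate radius.

```python
import numpy as np

# Toy illustration (not the authors' model): total cost of a movement ban
# as a function of ban radius. All parameters are made up.

def epidemic_cost(radius_km, scale=100.0, decay=0.05):
    # Expected outbreak cost decays as the ban radius grows: fewer
    # long-range transmissions escape the restricted zone.
    return scale * np.exp(-decay * radius_km)

def economic_cost(radius_km, per_area=0.002):
    # Losses from restricted trade and tourism grow with the banned
    # area, proportional to pi * r^2.
    return per_area * np.pi * radius_km ** 2

radii = np.linspace(0, 150, 301)
total = epidemic_cost(radii) + economic_cost(radii)
best = radii[np.argmin(total)]
print(f"toy optimal ban radius: {best:.0f} km")
```

With these invented parameters the minimum happens to fall inside the 15-60 km band quoted for FMD, but the point is only the shape of the trade-off, not the numbers.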

Credit: 
University of Warwick

Lighting up proteins with Immuno-SABER

image: This image shows a cryosection of the mouse retina in which the researchers visualized 10 proteins at a time with Immuno-SABER's multiplexing abilities.

Image: 
Wyss Institute at Harvard University

(BOSTON) -- To better understand how tissues and organs develop, fail to function, and regenerate over time, researchers would like to visualize their constituent cells' repertoires of molecules within 3D space. Ambitious efforts like the "Human BioMolecular Atlas Program", the "Human Cell Atlas Project", and several brain atlas projects are underway to map the presence and abundance of many proteins - the products of gene expression - in organs and tissues of the human body at the scale of single cells. However, existing imaging methods are typically limited in various aspects of their performance, their accessibility to researchers, or both.

As reported in Nature Biotechnology, a team led by Peng Yin, Ph.D., at Harvard's Wyss Institute for Biologically Inspired Engineering and Harvard Medical School (HMS) has now filled this void with a new DNA-nanotechnology-based approach called Immuno-SABER, short for "Immunostaining with Signal Amplification By Exchange Reaction." The method combines the protein-targeting specificity of commonly available antibodies with a DNA-based signal-amplification strategy that enables the highly multiplexed visualization of many proteins in the same sample with pre-programmable and tunable fluorescence signals at each target site. The team has validated their method in a broad range of cell and tissue preparations.

"We demonstrated that Immuno-SABER provides the capability to independently tune the signal intensity for individual protein targets 5- to 180-fold, with multiplexing capability to allow the simultaneous detection of many proteins. Together with its speed, relative ease of use and low costs, this technique has the potential to fast-forward ongoing large-scale protein-mapping studies and biomarker discovery efforts across many tissues and diseases," said Peng Yin, who is a Wyss Institute Core Faculty member.

Based on his group's advances in harnessing DNA nanotechnology-driven barcoding and signal-amplification technologies, Yin was recently also selected as an awardee of both the Human BioMolecular Atlas Program (HuBMAP) and the Human Cell Atlas Project. He also is co-leader of the Wyss Institute's Molecular Robotics Initiative, and Professor of Systems Biology at HMS.

Antibodies are the most common detection reagents for proteins both in research and clinical settings. They are typically tagged with fluorescent stains to make them detectable by microscopy. However, conventional antibody staining methods typically allow only a maximum of five different stains to be used simultaneously, and target proteins can differ significantly in their abundances, making it difficult to distinguish rare protein targets with high sensitivity from the background fluorescence that many tissues display.

Immuno-SABER utilizes the "Primer Exchange Reaction" (PER) method previously reported by Yin's group to synthesize long concatemers of short DNA primer sequences with the help of a catalytic DNA hairpin structure. The PER-generated concatemers are attached via short handle sequences to DNA-barcodes on antibodies that bind to target proteins in fixed cell and tissue samples with high specificity. At the target site, SABER concatemers provide a scaffold with multiple binding sites for complementary fluorescent oligonucleotides ("imagers"), and thus a means to amplify the signal emanating from each protein target.

"By barcoding antibodies with unique short DNA sequences and applying Immuno-SABER, we can simultaneously visualize multiple protein targets on the same sample and with high specificity. This essentially opens up a way to analyze the protein variety present in tissues in a robust and multiplexed fashion," said co-first and co-corresponding author Sinem Saka, Ph.D., who works as a Postdoctoral Fellow on Yin's team.

The team significantly boosted the multiplexing potential of their Immuno-SABER approach by coupling it with their previously developed "DNA-Exchange" technique. In DNA-Exchange, imagers that mark one set of target proteins are washed off and replaced by another set of imagers marking a different group of target proteins and this can be repeated multiple times.

Previously developed methods for highly multiplexed protein detection that work by repeating some of their key steps at different protein targets tend to suffer from suboptimal sensitivities, or take considerable time (low throughput) and finesse to execute. "Exchange-SABER provides high sensitivity with a single step of staining and amplification, and high multiplexing and throughput with multiple fast imager-exchange steps," said co-first author Yu Wang, who is a graduate student on Yin's team. "As a proof-of-concept, we visualized 10 different proteins in cryosections of the mouse retina."

Wang was co-mentored by co-author George Church, Ph.D., who is a Core Faculty member at the Wyss Institute and Professor of Genetics at HMS and of Health Sciences and Technology at Harvard University and the Massachusetts Institute of Technology (MIT).

Another key facet of Immuno-SABER facilitating the parallel detection of many proteins at a time is its ability to tune signal strength. The team achieved this by assembling more complex branched structures from PER-generated concatemers that contain higher numbers of binding sites for fluorescent imagers. "Programming the complexity of PER-based concatemer structures allows us to tune the signal strength to the abundance of particular proteins. We can at the same time visualize rare proteins with branched SABER products that enable higher signal amplification, and abundant proteins with linear SABER products," said Saka. In their study, the team combined linear and branched SABER concatemers to, for example, simultaneously visualize six protein targets with different abundances and cellular locations in human tonsil samples.
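The abundance-matched tuning described above can be sketched numerically. This is a hypothetical illustration: the protein names and abundances are invented, and only the 5- to 180-fold amplification range comes from the study. The idea is simply to pick, per target, a concatemer design whose fold-amplification brings the target's signal into a similar detectable range.

```python
# Hypothetical targets with relative abundances (invented numbers).
targets = {"rare_protein": 1.0, "mid_protein": 10.0, "abundant_protein": 60.0}

TARGET_SIGNAL = 180.0      # desired signal level, arbitrary units
MIN_FOLD, MAX_FOLD = 5, 180  # amplification range reported for Immuno-SABER

def choose_fold(abundance):
    # Pick the amplification so that abundance * fold ~ TARGET_SIGNAL,
    # clipped to the achievable 5x..180x range (linear vs branched SABER).
    fold = round(TARGET_SIGNAL / abundance)
    return max(MIN_FOLD, min(MAX_FOLD, fold))

for name, abundance in targets.items():
    fold = choose_fold(abundance)
    print(f"{name}: {fold}x amplification -> signal {abundance * fold:.0f}")
```

A rare target gets the maximal branched amplification while an abundant one gets a modest linear concatemer, mirroring the linear/branched combination the team used in the tonsil samples.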

Yin's team's existing suite of DNA nanotechnology-powered imaging technologies, including DNA-PAINT and Discrete Molecular Imaging, has advanced the field of super-resolution microscopy, which allows researchers to study single molecules at their normal locations. To achieve similarly high resolution of proteins in more complex tissue environments, the team combined Immuno-SABER with a method known as "Expansion Microscopy", which was previously developed by co-author Edward Boyden, Ph.D., the Y. Eva Tan Professor in Neurotechnology at MIT. The expansion method swells fixed tissues artificially to larger volumes, which increases the separation distance between individual molecules and thus improves their effective resolution without the need for specialized instruments. "Combining Expansion Microscopy with Exchange-SABER simultaneously gives us the high-multiplexing, -throughput, and -resolution capabilities needed to move efforts such as building molecular atlases of the human body forward more effectively," said Wang.

"Peng Yin's team again demonstrates how they can program engineered DNA molecules to carry out specific tasks like molecular robots, in this case allowing us to visualize simultaneously the location of numerous proteins within human cells and tissues with high resolution, which should greatly accelerate discovery of molecular mechanisms of biological control as well as new disease biomarkers," said Wyss Institute Founding Director Donald Ingber, M.D., Ph.D., who is also the Judah Folkman Professor of Vascular Biology at HMS and the Vascular Biology Program at Boston Children's Hospital, and Professor of Bioengineering at Harvard's John A. Paulson School of Engineering and Applied Sciences (SEAS).

Credit: 
Wyss Institute for Biologically Inspired Engineering at Harvard

Researchers find hurricanes drive the evolution of more aggressive spiders

image: The spider known as Anelosimus studiosus, which lives along the Gulf and Atlantic coasts of the United States and Mexico.

Image: 
Joseph T Lapp

Researchers at McMaster University who rush in after storms to study the behaviour of spiders have found that extreme weather events such as tropical cyclones may have an evolutionary impact on populations living in storm-prone regions, where aggressive spiders have the best odds of survival.

Raging winds can demolish trees, defoliate entire canopies and scatter debris across forest floors, radically altering the habitats and reshaping the selective pressures on many organisms, suggests a new study published today in the journal Nature Ecology & Evolution.

"It is tremendously important to understand the environmental impacts of these 'black swan' weather events on evolution and natural selection," says lead author Jonathan Pruitt, an evolutionary biologist and Canada 150 Chair in McMaster's Department of Psychology, Neuroscience & Behaviour.

"As sea levels rise, the incidence of tropical storms will only increase. Now more than ever we need to contend with what the ecological and evolutionary impacts of these storms will be for non-human animals," he says.

Pruitt and his team examined female colonies of the spider known as Anelosimus studiosus, which lives along the Gulf and Atlantic coasts of the United States and Mexico, directly in the path of tropical cyclones that form in the Atlantic basin from May to November.

To conduct the research, scientists had to tackle many logistical and methodological challenges which included anticipating the trajectory of the tropical cyclones. Once a storm's path was determined, they sampled populations before landfall, then returned to the sites within 48 hours.

They sampled 240 colonies throughout the storm-prone coastal regions, and compared them to control sites, with particular interest in determining if extreme weather--in this case areas disturbed in 2018 by subtropical storm Alberto, Hurricane Florence and Hurricane Michael--caused particular spider traits to prevail over others.

As a species, A. studiosus is divided into two sets of inherited personality traits: docile and aggressive. The aggressiveness of a colony is determined by the speed and number of attackers that respond to prey, the tendency to cannibalize males and eggs, and the vulnerability to infiltration by predatory foreign spiders, among other characteristics.

Aggressive colonies, for example, are better at acquiring resources when scarce but are also more prone to infighting when deprived of food for long periods of time or when colonies become overheated.

"Tropical cyclones likely impact both of these stressors by altering the numbers of flying prey and increasing sun exposure from a more open canopy layer," explains Pruitt. "Aggressiveness is passed down through generations in these colonies, from parent to daughter, and is a major factor in their survival and ability to reproduce."

The analysis suggested that after a tropical cyclone event, colonies with more aggressive foraging responses produced more egg cases and had more spiderlings survive into early winter. The trend was consistent across multiple storms that varied in size, duration and intensity, suggesting the effects are robust evolutionary responses, says Pruitt.

Credit: 
McMaster University

A second planet in the Beta Pictoris system

image: At least two giant planets, aged 20 million years at most, orbit around the star (which is hidden): β Pictoris c, the nearest one, which has just been discovered, and β Pictoris b, which is more distant. The disk of dust and gas can be seen in the background.

Image: 
P Rubini / AM Lagrange

A team of astronomers led by Anne-Marie Lagrange, a CNRS researcher at the Institut de Planétologie et d'Astrophysique de Grenoble (CNRS/Université Grenoble Alpes) (1), has discovered a second giant planet in orbit around β Pictoris, a star that is relatively young (23 million years old) and close (63.4 light years), and surrounded by a disk of dust. The β Pictoris system has fascinated astronomers for the last thirty years since it enables them to observe a planetary system in the process of forming around its star. Comets have been discovered in the system, as well as a gas giant, β Pictoris b, detected by direct imaging and described in 2009 by Lagrange's team. This time, the team had to analyse more than 10 years of high-resolution data, obtained with the HARPS instrument at ESO's La Silla Observatory in Chile, in order to indirectly detect the presence of β Pictoris c (2). This second giant planet, which has a mass nine times that of Jupiter, completes its orbit in roughly 1,200 days, and is relatively close to its star (approximately the distance between the Sun and the asteroid belt, whereas β Pictoris b is 3.3 times more distant). The researchers hope to find out more about the planet from data from the GAIA spacecraft and from the future Extremely Large Telescope now under construction in Chile.
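The reported orbit can be sanity-checked with Kepler's third law, a³ = M★ P² (a in AU, M★ in solar masses, P in years). The stellar mass used below (~1.75 solar masses for β Pictoris) is an assumed literature value, not a figure from the press release.

```python
# Back-of-the-envelope check (not from the paper) of the beta Pic c orbit.
M_STAR = 1.75        # beta Pictoris mass in solar masses (assumed value)
P_DAYS = 1200.0      # orbital period of beta Pic c, from the text

P_years = P_DAYS / 365.25
# Kepler's third law in solar units: a^3 = M_star * P^2
a_au = (M_STAR * P_years ** 2) ** (1.0 / 3.0)

print(f"semi-major axis of beta Pic c: {a_au:.2f} AU")
print(f"beta Pic b at 3.3x that distance: {3.3 * a_au:.1f} AU")
```

The result, roughly 2.7 AU, is consistent with the "distance between the Sun and the asteroid belt" quoted in the text.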

Credit: 
CNRS

Lithium fluoride crystals 'see' heavy ions with high energies

image: Lithium fluoride crystal with heavy ion tracks recorded during viewing under a fluorescence microscope.

Image: 
Source: IFJ PAN

Lithium fluoride crystals have recently been used to register the tracks of nuclear particles. Physicists from the Institute of Nuclear Physics of the Polish Academy of Sciences in Cracow have just demonstrated that these crystals are also ideal for detecting tracks of high-energy ions of elements even as heavy as iron.

When a nuclear particle enters a crystal, it interacts with the atoms or molecules in its crystal lattice. In certain crystals and under the appropriate conditions, the resulting defect can be a source of weak light - luminescence. At the Institute of Nuclear Physics of the Polish Academy of Sciences (IFJ PAN) in Cracow, research has been conducted for many years on materials exhibiting this type of property. One of them is lithium fluoride (LiF). Its crystals have recently been used to detect low-energy particles such as alpha particles (helium nuclei). In their latest publication in the Journal of Luminescence, the Cracow-based physicists show that the field of application of lithium fluoride also extends to the detection of particles with significant energies and even includes ions of such heavy elements as iron 56Fe, completely stripped of electrons.

"Lithium fluoride track detectors are simply crystals. Unlike detection devices that monitor near-real time tracks of particles, they are passive detectors. In other words, they work like photographic film. Once crystals are exposed to radiation, we need to use a fluorescence microscope to find out what tracks we have recorded," says Prof. Pawel Bilski (IFJ PAN).

Fluorescent nuclear track detectors have been known for about a decade. So far, they have been made only from appropriately doped Al2O3 aluminium oxide crystals in which, under the influence of radiation, permanent colour centres are created. Such centres, when excited by light of an appropriate wavelength, emit photons (with lower energies) which make it possible to see the track of a particle under a microscope. In the case of lithium fluoride, the excitation is carried out with blue light and the emission of photons takes place in the red range.

"Detectors with doped aluminium oxide require an expensive confocal microscope with a laser beam and scanning. Tracks in lithium fluoride crystals can be seen with a much cheaper, standard fluorescence microscope," says Prof. Bilski, and emphasizes: "Tracks recorded in crystals very accurately reproduce the path of a particle. Other detectors, such as the well-known Wilson chamber, usually widen the track. In the case of LiF crystals, the resolution is restricted only by the diffraction limit."
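The diffraction limit Prof. Bilski refers to can be estimated with the Abbe formula d = λ / (2·NA). The emission wavelength and numerical aperture below are assumed typical values for red colour-centre emission and a high-NA objective, not figures from the paper.

```python
# Rough estimate of the diffraction-limited resolution for LiF readout.
WAVELENGTH_NM = 650.0  # red emission of LiF colour centres (assumed typical)
NA = 1.3               # oil-immersion objective numerical aperture (assumed)

# Abbe diffraction limit: d = lambda / (2 * NA)
d_nm = WAVELENGTH_NM / (2 * NA)
print(f"diffraction-limited resolution: {d_nm:.0f} nm")
```

Under these assumptions the track width resolvable in LiF is on the order of a quarter of a micrometre, far finer than the widened tracks of a Wilson chamber.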

While the impossibility of observing tracks of particles in near real time is difficult to call an advantage, it does not always have to be a disadvantage. For example, in personal dosimetry, detectors are needed to determine the dose of radiation to which the user has been exposed. These devices must be small and easy to use. The millimetre-sized crystalline lithium fluoride plates meet this requirement perfectly. This is one of the reasons why these crystals, grown by Czochralski method in the IFJ PAN, can now be found in the European Columbus module of the International Space Station, among many other types of passive detectors. Replaced every six months within the DOSIS 3D experiment, the detectors make it possible to determine the spatial distribution of the radiation dose within the station and its variability over time.

During the latest research, crystalline lithium fluoride plates were exposed to high-energy ions. The irradiation was carried out in the HIMAC accelerator in the Japanese city of Chiba. During the bombardment with various ion beams, the energies of particles ranged from 150 megaelectronvolts per nucleon in the case of 4He helium ions to 500 MeV/nucleon in the case of 56Fe iron ions. The detectors were also irradiated with 12C carbon, 20Ne neon and 28Si silicon ion beams.

"In the crystal plates placed perpendicularly to the ion beam, we observed practically point sources of light of a size on the border of the optical resolution of a microscope. These were the places where the high-energy ion pierced the crystal," says Prof. Bilski. "As part of the tests, some of the plates were also placed parallel to the beam. The probability of registering a track was then lower, but when it did happen, a long fragment of the track of the particle was 'imprinted' in the crystal."

The tests carried out confirm that lithium fluoride track detectors are ideal for recording the passage of heavy ions with high energies. In addition, these do not seem to be the only possibilities of LiF crystals. Every other atom in their interior is lithium, which interacts very well with neutrons. Lithium fluoride detectors, especially those enriched with the lithium 6Li isotope, will probably allow for very effective registration of low-energy neutrons, and there are strong indications that higher-energy neutrons could be registered as well. If future studies confirm this assumption, it will be possible to construct personal neutron dosimeters. The small size of LiF crystals would also allow for interesting technical applications that are inaccessible today. LiF track detectors could be used, for example, to study secondary particles formed around the primary proton beam produced by accelerators used in medicine to fight cancer.

Credit: 
The Henryk Niewodniczanski Institute of Nuclear Physics Polish Academy of Sciences

WVU study investigates rural LGBTQ youth's motivations for participating in activism

image: This is WVU Assistant Professor of Social Work Megan Gandy-Guedes.

Image: 
West Virginia University

While marriage equality, legalized nationwide in the U.S. in 2015, continues to be a big win for the LGBTQ movement, many activists are concerned about what's next.

Researchers from West Virginia University and the University of Kansas have spent the intervening years studying young adults comprising the next generation of LGBTQ activists to understand their aspirations for the movement's future.

Working with a lobbying organization in the rural southwest, WVU Assistant Professor of Social Work Megan Gandy-Guedes and KU Assistant Professor of Social Welfare Megan Paceley sought to understand the social, economic and environmental issues important to LGBTQ young adults living in the rural U.S. and their motivations for engaging in activism and social justice efforts.

"Seeing the challenges of LGBTQ youth in states that are in the middle of the country or in more rural or more conservative areas - needing a voice in the younger generation really motivated us to do this research," Gandy-Guedes said.

They surveyed young adults ages 18 to 29 who attended the lobbying organization's annual leadership symposium to inform its future programming.

"The organization's leaders felt like they needed a better approach to reaching the younger generation of leaders, activists and advocates," Gandy-Guedes said. "They asked us to do a survey to find out the motivations of these young people and the things they are interested in being involved in."

Five issues resonated the most for the young adults surveyed: police use of force, conversion therapy (meant to change an individual's sexual orientation), pay inequality, lived equality for transgender individuals and preservation of land, water and wildlife.

"There are so many issues on the forefront of these young peoples' minds," Gandy-Guedes said. "It really emphasizes that if organizers are going to engage these young people, they need to engage them with a multitude of issues in mind."

The young adults' motivations for getting involved in activism included a desire to help other LGBTQ individuals, experiences with or fears of isolation and victimization, and societal influence. The researchers noted that these concerns were shared by individuals who had experienced multiple levels of marginalization.

"Because of the way people talked about their interests in social change and activism, they weren't issues that just affected LGBTQ people. The issues affected LGBTQ people across different identities and people outside of the LGBTQ community," Paceley said. "It wasn't just LGBTQ marginalization. It was gender, sexuality, race, economic status or a combination of these things."

The organization Gandy-Guedes and Paceley partnered with has already hired a program director to focus on these issues.

"To engage with the next generation of activists, you have to approach social justice issues from a multitude of identities and come at it with a multitude of outcomes in mind," Gandy-Guedes said. "They already had an executive director and some staff, but a program director is an important way to engage young people."

To improve engagement with young adults, Gandy-Guedes and Paceley recommend that organizations continue to listen to their needs.

"Just ask them. That's what I love about the organization we worked with. They recognized a gap in what they were doing, and they reached out to us to help understand that gap," Paceley said. "They didn't just listen - they responded to it. Things change, even in just a short period of time of a few years. Continually asking people who these issues affect, what they want to see done, how they want to do this work, what their motivations are and what they are afraid of is crucial for engaging them and continuing the work."

Credit: 
West Virginia University

Research shows TCOM and osteopathic approach making a difference

image: The 2½-year study was conducted by the PRECISION Pain Research Registry and TCOM's John Licciardone, DO, MS, MBA

Image: 
UNTHSC

Osteopathic medicine's emphasis on physician empathy and understanding leads to higher patient satisfaction, a study by researchers at UNTHSC's Texas College of Osteopathic Medicine indicates.

The 2½-year study, conducted by the PRECISION Pain Research Registry and TCOM's John Licciardone, DO, MS, MBA, reaffirmed the importance of empathy and better interpersonal manner when treating patients with chronic pain.

Third-year TCOM student Monika Schmitt, a PRECISION Clinical Research Fellow, helped craft the article published in the Journal of the American Osteopathic Association.

"The greater levels of physician empathy and the better interpersonal manner reported by patients who were treated by osteopathic physicians may be important mediators of clinical outcomes in people with chronic pain," Dr. Licciardone said. "That patients of osteopathic physicians reported lower levels of pain catastrophizing and were more resilient and better able to cope with their pain may explain their lower levels of disability."

Focusing on interpersonal manner, empathy and communication style, the results of the study showed patients seeing osteopathic physicians had a more favorable perception of their treatment. In keeping with some of the foundational osteopathic tenets, such as the body's capacity for self-regulation and self-healing, the study illustrated that DOs were less likely to prescribe opioids.

The results also raise awareness of the burgeoning opioid crisis and how osteopathic physicians can help decrease potential addiction to opioids.

"Our results show that osteopathic physicians are less likely to prescribe opioids or nonsteroidal anti-inflammatory drugs for their patients with pain," Dr. Licciardone said. "Thus, osteopathic philosophy and medical care is very consistent with current guidelines from the Centers for Disease Control and Prevention, which emphasize the importance of initiating chronic pain management with non-pharmacological treatments and avoiding opioids unless the benefits outweigh the risk for a patient."

Schmitt started as a PRECISION Clinical Research Fellow in the summer of 2018 and soon after found herself working on the research paper in the fall.

"Working with Dr. Licciardone on the PRECISION project has shown me how evidence-based medicine is continually evolving," Schmitt said. "The importance of this research is its contribution to the body of knowledge on physician behaviors. As an osteopathic medical student, I am excited to study what makes osteopathic physicians unique."

Schmitt is part of the first class of the PRECISION Clinical Research Fellowship Program, which began in 2018. The program welcomed its second group in June 2019.

"An important aspect of the PRECISION Pain Research Registry is that it provides a mechanism for medical students to learn about and participate in research through its Clinical Research Fellowship Program," Dr. Licciardone said. "The registry facilitates student involvement by providing access to its resources, including research data and biological samples."

Credit: 
University of North Texas Health Science Center

Scientists uncover mystery of DNA methylation

All species mark their DNA with methyl groups. This is done to regulate gene expression, distinguish indigenous DNA from foreign DNA, or to mark old DNA strands during replication. Methylation is carried out by certain enzymes called methyltransferases, which decorate DNA with methyl groups in certain patterns to create an epigenetic layer on top of DNA.

Until now, scientists have not been able to tell which enzymes are responsible for which patterns. But in a new study, recently published in Nature Communications, scientists from The Novo Nordisk Foundation Center for Biosustainability (DTU Biosustain) at the Technical University of Denmark have coupled enzymes with specific methylation patterns in two bacteria.

"Knowing which enzyme does what opens up to a lot of applications. With this knowledge, you can construct model organisms with artificial methylomes, mimicking the methylation pattern of the strain you want to introduce DNA to. In this way you can ensure 'survival' of introduced DNA," says Specialist and first-author of this paper Torbjørn Ølshøj Jensen from DTU Biosustain.

Scientists often run into problems with methylation, when they try to introduce foreign DNA to a host organism, for instance bacteria or yeast. But introducing foreign DNA is essential when building production hosts - often called cell factories - capable of producing for instance medicine, sustainable bio-chemicals and food ingredients. Often, the host needs genes (DNA) from other organisms in order to produce the sought-for compounds.

But just as often, the host will reject the foreign DNA and chop it into pieces, simply because the methylation patterns reveal that the DNA is alien. Scientists working with E. coli as their host usually have fewer problems than others when introducing new DNA, since E. coli is well-known and rather "well-behaved." But moving into lesser-known hosts can become a great problem.

"Working in other bacteria than E. coli, you often have to do a lot of trial and error when it comes to DNA transformation, but that's just not good enough. You need knowledge and tools. With this, you have a systematic and rational way of fixing the problems," Torbjørn Ølshøj Jensen says.

The goal was to find out which enzymes are responsible for which patterns. To uncover this, the researchers constructed DNA rings (plasmids), each containing one of the methyltransferases and "cassettes" holding multiple copies of certain DNA patterns. These DNA patterns, called motifs, are the targets of methyltransferases. By coupling the two, the methyltransferase expressed from the plasmid would mark the DNA in a specific way, thus revealing the enzyme's methylation pattern.

This was done for all methyltransferases. Afterwards, all the plasmids (in a pool) were read using a sequencing method designed to reveal methyl groups. This gave the researchers a "library" of enzyme-to-motif couplings.
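As an illustration of how such a library of enzyme-to-motif couplings could be used downstream, here is a minimal Python sketch: given a table of enzymes and their target motifs, it lists which methyltransferases a piece of foreign DNA would need mimicked in order to be protected from host restriction. All enzyme names, motifs and sequences below are invented placeholders, not data from the study.

```python
# Hypothetical enzyme-to-motif "library"; names and motifs are invented.
ENZYME_TO_MOTIF = {
    "M.HypAI": "GATC",    # hypothetical Dam-like methyltransferase
    "M.HypBII": "CCWGG",  # W = A or T (IUPAC ambiguity code)
}

IUPAC = {"A": "A", "C": "C", "G": "G", "T": "T", "W": "AT", "S": "CG", "N": "ACGT"}

def motif_sites(sequence, motif):
    """Return start positions in `sequence` matching `motif` (IUPAC-aware)."""
    hits = []
    for i in range(len(sequence) - len(motif) + 1):
        window = sequence[i:i + len(motif)]
        if all(base in IUPAC[m] for base, m in zip(window, motif)):
            hits.append(i)
    return hits

def required_methyltransferases(foreign_dna):
    """Enzymes whose motifs occur in the foreign DNA, and whose methylation
    would therefore need to be mimicked to protect it in the host."""
    return sorted(enz for enz, motif in ENZYME_TO_MOTIF.items()
                  if motif_sites(foreign_dna, motif))

print(required_methyltransferases("AAGATCCATGGCCTGGTT"))  # -> ['M.HypAI', 'M.HypBII']
```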

This quick method of identifying methyltransferase methylation patterns holds great promise to other researchers struggling with DNA degradation, according to the research team.

To validate the method, the scientists analysed the genomes of the temperature-resistant bacterium M. thermoacetica as well as the bacterium A. woodii. Both bacteria are hosts with great potential for industrial applications, and both have substantially modified genomes.

In total, the two bacteria hold 23 methyltransferase genes, but show modification at only 12 different DNA motifs in their genomes, meaning that not all the methyltransferases are active.

The team assessed all 23 methyltransferases, looking for those active on the genomes. For 11 of the 12 motifs, they were able to couple activity to specific methyltransferase genes.

Using this method, the team hopes to design hosts with an unambiguous "methylome" - meaning that the organism only harbours wanted methyltransferases, which will ease introduction of foreign DNA into non-model organisms.

This can be useful both when building cell factories based upon new or lesser-known hosts, and in order to understand the regulation of gene expression and cell differentiation.

Credit: 
Technical University of Denmark

Black hole holograms

image: Theoretical prediction of the image of the black hole from the table-top experiment. The radius of the ring depends on the temperature. The image of the black hole is deformed as the observation point θobs is varied.

Image: 
©Hashimoto, K., Kinoshita, S., Murata, K., "Einstein Rings in Holography", Phys. Rev. Lett. 123, 031602, DOI: 10.1103/PhysRevLett.123.031602

Osaka, Japan - A research team from Osaka University, Nihon University and Chuo University has proposed a novel theoretical framework for a tabletop experiment that could be performed in a laboratory to better understand the physics of black holes. This project can shed light on the fundamental laws that govern the cosmos on both unimaginably small and vastly large scales.

Recently, the world was transfixed when the first ever images of a black hole were released by the Event Horizon Telescope. Or, to be more precise, the pictures showed the bright circle, called an Einstein ring, made by light that just barely escaped the grasp of the black hole's immense gravity. This ring of light was due to the fact that, according to the theory of general relativity, the fabric of spacetime itself becomes so contorted by the mass of the black hole that it acts like a huge lens.
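To give a feel for the scale of ordinary gravitational lensing, the textbook point-lens Einstein-radius formula, theta_E = sqrt(4GM/c^2 * D_LS / (D_L * D_S)), can be evaluated in a few lines. This is the standard general-relativity result, not the paper's holographic calculation, and the lens mass and distances below are purely illustrative.

```python
import math

G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8          # speed of light, m/s
M_SUN = 1.989e30     # solar mass, kg
KPC = 3.086e19       # kiloparsec, m

def einstein_radius(mass_kg, d_lens_m, d_source_m):
    """Angular Einstein radius (radians) of a point lens:
    theta_E = sqrt(4GM/c^2 * D_LS / (D_L * D_S))."""
    d_ls = d_source_m - d_lens_m
    return math.sqrt(4 * G * mass_kg / c**2 * d_ls / (d_lens_m * d_source_m))

# Illustrative numbers only: a 10-solar-mass lens halfway to a source 8 kpc away.
theta = einstein_radius(10 * M_SUN, 4 * KPC, 8 * KPC)
print(f"{math.degrees(theta) * 3600e3:.3f} milliarcseconds")
```

The ring radius grows with the lens mass, which is why only very massive, compact objects such as black holes produce rings resolvable by instruments like the Event Horizon Telescope.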

Unfortunately, our understanding of black holes remains incomplete, because the theory of general relativity--which is used to describe the laws of nature at the scale of stars and galaxies--is not currently compatible with quantum mechanics, our best theory of how the Universe operates on very small scales. Since black holes, by definition, have a huge mass compressed into a tiny space, reconciling these wildly successful but thus far conflicting theories is necessary to understand them.

One possible approach for solving this conundrum is called string theory, which holds that all matter is made of very tiny vibrating strings. One version of this theory predicts a correspondence between the laws of physics we perceive in our familiar four dimensions (three dimensions of space plus time) and strings in a space with an extra dimension. This is sometimes called a "holographic duality," because it is reminiscent of a two-dimensional holographic plate that holds all the information of a 3D-object.

In the newly published research, the authors, Koji Hashimoto (Osaka University), Keiju Murata (Nihon University) and Shunichiro Kinoshita (Chuo University) apply this concept to show how the surface of a sphere, which has two dimensions, can be used in a tabletop experiment to model a black hole in three dimensions. In this setup, light emanating from a source at one point of the sphere is measured at another, which should show the black hole if the spherical material allows holography.

"The holographic image of a simulated black hole, if observed by this tabletop experiment, may serve as an entrance to the world of quantum gravity," says author Koji Hashimoto. The researchers also calculated the radius of the Einstein ring that would be observed if this theory is correct.

"Our hope is that this project shows the way forward towards a better understanding of how our Universe truly operates on a fundamental level," says the author Keiju Murata.

Credit: 
Osaka University

Weather on ancient Mars: Warm with occasional rain, turning cold

image: This is the site of the Mars2020 landing. Chemical Alteration by Water, Jezero Crater Delta: On ancient Mars, water carved channels and transported sediments to form fans and deltas within lake basins (colour enhanced to show mineral types).

Image: 
NASA/JPL-Caltech/MSSS/JHU-APL Full image available for download at: http://tinyurl.com/yxrq8eb3

A new study of conditions on Mars indicates that the climate 3 to 4 billion years ago was warm enough to provoke substantial rainstorms and flowing water, followed by a longer cold period in which the water froze. This may have implications for the conditions needed for the development of life on Mars.

Scientists have long known that water was abundant on ancient Mars, but there has been no consensus on whether liquid water was common, or whether it was largely frozen in ice. Was the temperature high enough to allow the water to flow? Did this happen over an extended period, or just occasionally? Was the surface a desert or frozen? Warm conditions make it much more likely that life would have developed independently on the surface of ancient Mars. Now a new comparison of patterns of mineral deposition on the red planet with similar depositions on Earth lends weight to the idea that early Mars had one or more long periods dominated by rainstorms and flowing water, with the water later freezing.

Presenting the findings today at the Goldschmidt Geochemistry Conference in Barcelona, Professor Briony Horgan (Purdue University) said:

"We know there were periods when the surface of Mars was frozen; we know there were periods when water flowed freely. But we don't know exactly when these periods were, and how long they lasted. We have never sent unmanned missions to areas of Mars which can show us these earliest rocks, so we need to use Earth-bound science to understand the geochemistry of what may have happened there. Our study of weathering in radically different climate conditions such as the Oregon Cascades, Hawaii, Iceland, and other places on Earth, can show us how climate affects pattern of mineral deposition, like we see on Mars.

"Here on Earth, we find silica deposits in glaciers, which are characteristic of melting water. On Mars, we can identify similar silica deposits in younger areas, but we can also see older areas which are similar to deep soils from warm climates on Earth. This leads us to believe that on Mars 3 to 4 billion years ago, we had a general slow trend from warm to cold, with periods of thawing and freezing.

"If this is so, it is important in the search for possible life on Mars.

"We know that the building blocks of life on Earth developed very soon after the Earth's formation, and that flowing water is essential for life's development. So evidence that we had early, flowing water on Mars will increase the chances that simple life may have developed at around the same time as it did on Earth. We hope that the Mars 2020 mission will be able to look more closely at these minerals, and begin to answer exactly what conditions existed when Mars was still young."

Analysis of the surface geology of Mars supports a trend from a warm to a cold climate, but the climate models themselves don't support this, due to the limited heat arriving from the young Sun.

"If our findings are correct, then we need to keep working on the Mars climate models, possibly including some chemical, geological or other process which might have warmed the young planet," said Horgan.

The research team compared Earth data to Martian minerals detected using the NASA CRISM spectrometer, currently orbiting Mars, which can remotely identify surface chemicals where water once existed. They also took data from the Mars Curiosity Rover. Professor Horgan is a co-investigator on the Mars 2020 mission, due to be launched in July 2020 and to begin to explore the Jezero Crater in February 2021.
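Conceptually, remote mineral identification of the kind CRISM performs boils down to comparing a measured reflectance spectrum against laboratory reference spectra and picking the closest match. The toy Python sketch below illustrates the idea with invented numbers; real CRISM analysis involves hundreds of wavelength channels and far more sophisticated spectral modelling.

```python
# Toy spectral-matching sketch. The five-channel "spectra" below are
# made-up illustrative numbers, not CRISM or laboratory data.
LAB_LIBRARY = {
    "opaline silica": [0.30, 0.28, 0.22, 0.25, 0.27],
    "olivine":        [0.15, 0.18, 0.24, 0.28, 0.30],
    "smectite clay":  [0.33, 0.31, 0.26, 0.20, 0.24],
}

def rms_error(a, b):
    """Root-mean-square difference between two equal-length spectra."""
    return (sum((x - y) ** 2 for x, y in zip(a, b)) / len(a)) ** 0.5

def best_match(measured):
    """Return the library mineral whose spectrum is closest to `measured`."""
    return min(LAB_LIBRARY, key=lambda name: rms_error(measured, LAB_LIBRARY[name]))

measured = [0.31, 0.29, 0.23, 0.24, 0.26]
print(best_match(measured))  # -> opaline silica
```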

Commenting, Professor Scott McLennan (Stony Brook University) said, "What is especially exciting about this work is that it used well understood Earth-based geological processes from regions that are good analogs for Mars. The results not only make sense from the perspective of developing climate evolution models for Mars but also demonstrated a possible mechanism for forming the most interesting and perplexing non-crystalline components that have been found in all of the samples analysed so far by the Curiosity rover."

Professor McLennan was not directly involved in this work; this is an independent comment.

Credit: 
Goldschmidt Conference

A laser-driven programmable non-contact transfer printing technique

image: (A) Schematic illustration of the laser-driven programmable non-contact transfer printing process via an active elastomeric micro-structured stamp. (B) Printing single Si platelet and LED chip onto various receivers. (C) Programmable printing Si platelets and LED chips onto various receivers.

Image: 
©Science China Press

Transfer printing is an emerging assembly technique that transfers micro/nano-objects (i.e., inks) from one substrate (i.e., donor) to another (i.e., receiver) using soft polymeric stamps. The technique enables the assembly of diverse materials in various structural layouts at throughputs of thousands of objects per second. It is valuable for developing advanced electronic systems, such as flexible and stretchable inorganic electronics that require the heterogeneous integration of inorganic materials with soft elastomers - one of the ongoing technology revolutions in the electronics industry.

Various approaches based on tunable dry adhesives have been utilized to develop transfer printing techniques, including contact techniques and non-contact techniques. The performance of contact techniques critically depends on the receiver's geometry and properties, since the printing requires contact between the stamp and the receiver. In contrast, non-contact approaches eliminate the influence of the receiver on the transfer yield and allow non-contact printing of inks onto arbitrary receivers. However, existing non-contact transfer printing techniques usually induce undesired high temperature increases in the system, which may cause permanent interfacial damage and limit their broad utility in transfer printing of brittle materials (e.g., Si) widely used in conventional electronics.

In response to this challenge, Song's group at Zhejiang University developed a laser-driven programmable non-contact transfer printing technique via a simple yet robust design of an active elastomeric micro-structured stamp with tunable adhesion. The tunable adhesive features cavities filled with air and encapsulated by a micro-patterned surface membrane duplicated from low-cost, easily available sandpaper. The micro-patterned surface membrane can be inflated dynamically to control the interfacial adhesion by heating the air in the cavities through a metal layer (e.g., iron particles) on the inner cavity surface, which serves as the laser-absorbing layer. This construct offers continuously thermally controlled tunable adhesion with a switchability of more than three orders of magnitude at a temperature increase below 100 °C.
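The mechanism can be illustrated with a back-of-the-envelope ideal-gas estimate (our simplification for illustration, not the paper's model): heating the air sealed in a cavity at roughly fixed volume raises its pressure by Gay-Lussac's law, P2 = P1 * T2 / T1, and the resulting over-pressure inflates the membrane, shrinking the contact area and dropping the adhesion.

```python
# Back-of-the-envelope estimate of the over-pressure that inflates the
# stamp's membrane when the laser heats the sealed cavity air.
# Gay-Lussac's law at fixed volume: P2 = P1 * T2 / T1 (absolute temperatures).

def cavity_pressure(p1_kpa, t1_c, t2_c):
    """Sealed-cavity pressure (kPa) after heating, ideal gas at fixed volume."""
    return p1_kpa * (t2_c + 273.15) / (t1_c + 273.15)

# Heating from room temperature to 120 C (a sub-100 C increase, as in the paper)
p2 = cavity_pressure(101.3, 25.0, 120.0)
print(f"{p2:.1f} kPa")  # ~133.6 kPa, a ~32% over-pressure pushing the membrane out
```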

This active adhesive extends concepts developed for contact printing techniques and enables the development of a novel laser-driven programmable non-contact transfer printing technique. Theoretical and experimental studies reveal the fundamental aspects of the design and fabrication of the active elastomeric micro-structured stamp, and the operation of non-contact transfer printing. Demonstrations in programmable transfer printing of micro-scale Si platelets and micro-scale LED chips onto various challenging flat or rough receivers (e.g., paper, steel sphere, leaf) with ultra-low adhesion illustrate the unusual capabilities for deterministic assembly that have been difficult to address by existing printing schemes. This innovative laser-driven non-contact transfer printing technique creates engineering opportunities in a wide range of applications such as flexible electronics, paper-based electronics, bio-integrated electronics, and MicroLED display, where the heterogeneous integration of diverse materials is required.

Credit: 
Science China Press

Genetic risk is associated with differences in gut microbiome

image: This is Johnny Ludvigsson, professor emeritus, Linköping University.

Image: 
Thor Balkhed/Linköping University

Children with a high genetic risk of developing type 1 diabetes have different gut microbiomes than children with a low risk, according to a new study from Linköping University in Sweden and the University of Florida in the US. The results, published in the scientific journal Nature Communications, suggest that genetic risk can shape an individual's response to environmental factors in the development of autoimmune diseases.

Both hereditary and environmental factors play a role in the development of type 1 diabetes, a serious autoimmune disease that often develops in childhood or adolescence. Once developed, type 1 diabetes requires lifelong intensive treatment with insulin. An increased genetic risk is not sufficient to cause the disease: environmental factors are also necessary and play a crucial role. The gut flora is one such factor, and its role has been extensively studied in various diseases, particularly autoimmune diseases, in which the gut flora contributes to the maturation of the immune system.

The current study is based on the ABIS study (All Babies in Southeast Sweden) at Linköping University. The aim of ABIS is to determine why children develop immune-mediated diseases, especially type 1 diabetes. In total, 17,000 children born between 1997 and 1999 have been followed with questionnaires and biological samples from birth, and then at the ages of 1, 3, 5 and 8 years and beyond. In this new study of 403 children, the researchers wanted to study the connections between genetic predisposition for disease and gut flora. The children were analysed for genetic risk based on genetic variation in HLA genes, which play an important role in the immune system. Certain variants of the HLA genes are the strongest genetic risk factors for type 1 diabetes. The gut microbiome was analysed in stool samples collected when the children were one year old. The results showed significant differences in the microbiome of children with different genetic risks.

"Certain bacterial species were not found at all in children with high genetic risk, but were found in those with low or no risk. This is very interesting, as this could mean that certain species have protective effects and may be useful in future treatment to prevent autoimmune diseases. It may be that certain species cannot survive in individuals with high genetic risk", says Johnny Ludvigsson, senior professor in the Department of Clinical and Experimental Medicine, Linköping University, and senior consultant at HRH Crown Princess Victoria Children's Hospital, Linköping University Hospital.

Previous studies of gut flora in relation to the development of type 1 diabetes have been based on children who are all at high genetic risk for the disease. The present study is the first in which a relationship has been studied in a general population that includes children with low, neutral, and high genetic risk.

"The ABIS cohort is uniquely valuable as it allows certain types of study on the importance of environmental factors for the development of type 1 diabetes. ABIS is the only large prospective cohort in the world in which a general population has been followed from birth, which allows this type of study of how genetic and environmental factors work together", says Johnny Ludvigsson.

Further research is needed to obtain deeper knowledge about how the combined effect of genetics and gut flora influences the development of type 1 diabetes. The results may also be important for the development of other autoimmune diseases in which genetics is important, such as celiac disease and rheumatoid arthritis.

Credit: 
Linköping University

Lab-based dark energy experiment narrows search options for elusive force

image: This is the atom interferometer.

Image: 
Imperial College London

An experiment to test a popular theory of dark energy has found no evidence of new forces, placing strong constraints on related theories.

Dark energy is the name given to an unknown force that is causing the universe to expand at an accelerating rate.

Some physicists propose dark energy is a 'fifth' force that acts on matter, beyond the four already known - gravitational, electromagnetic, and the strong and weak nuclear forces.

However, researchers think this fifth force may be 'screened' or 'hidden' for large objects like planets or weights on Earth, making it difficult to detect.

Now, researchers at Imperial College London and the University of Nottingham have tested the possibility that this fifth force is acting on single atoms, and found no evidence for it in their most recent experiment.

This could rule out popular theories of dark energy that modify the theory of gravity, and leaves fewer places to search for the elusive fifth force.

The experiment, performed at Imperial College London and analysed by theorists at the University of Nottingham, is reported today in Physical Review Letters.

Professor Ed Copeland, from the Centre for Astronomy & Particle Physics at the University of Nottingham, said: "This experiment, connecting atomic physics and cosmology, has allowed us to rule out a wide class of models that have been proposed to explain the nature of dark energy, and will enable us to constrain many more dark energy models."

The experiment tested theories of dark energy that propose the fifth force is comparatively weaker when there is more matter around - the opposite of how gravity behaves.

This would mean it is strong in a vacuum like space, but is weak when there is lots of matter around. Therefore, experiments using two large weights would mean the force becomes too weak to measure.

The researchers instead paired a large weight with an incredibly small one - a single atom - for which the force should be observable if it exists.

The team used an atom interferometer to test whether there were any extra forces that could be the fifth force acting on an atom. A marble-sized sphere of metal was placed in a vacuum chamber and atoms were allowed to free-fall inside the chamber.

The theory is, if there is a fifth force acting between the sphere and atom, the atom's path will deviate slightly as it passes by the sphere, causing a change in the path of the falling atom. However, no such force was found.
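To give a sense of how sensitively an atom interferometer reads out forces, the standard relation phi = k_eff * a * T^2 (a textbook atom-interferometry formula, not the paper's full analysis) can be evaluated directly. The rubidium wavelength and pulse timing below are illustrative assumptions.

```python
import math

# Standard atom-interferometer relation: a uniform acceleration `a` along
# the laser axis produces a phase shift phi = k_eff * a * T^2, where T is
# the time between interferometer pulses. Any anomalous "fifth force"
# acceleration would appear as an extra phase on top of gravity's.

def phase_shift(a_ms2, t_s, wavelength_m=780e-9):
    """Phase shift (rad) for a two-photon counter-propagating transition."""
    k_eff = 2 * (2 * math.pi / wavelength_m)  # effective wavevector, rad/m
    return k_eff * a_ms2 * t_s ** 2

# Illustrative numbers: Earth gravity, 10 ms pulse separation, 780 nm (Rb D2 line).
print(f"{phase_shift(9.81, 10e-3):.1f} rad")
```

Even with these modest numbers, gravity alone produces a phase of order ten thousand radians, which is why tiny fractional deviations in the falling atom's path are detectable.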

Professor Ed Hinds, from the Department of Physics at Imperial, said: "It is very exciting to be able to discover something about the evolution of the universe using a table-top experiment in a London basement."

Credit: 
Imperial College London

'Hidden' data exacerbates rural public health inequities

Differences in the health of rural residents compared to their urban neighbors are startling. In Washington, for instance, rural residents are one-third more likely to die from intentional self-harm and 13 percent more likely to die from heart disease.

However, while statistics like these help guide public health policy and spending, they can hide even greater health disparities within those rural communities, said Betty Bekemeier, director of the UW School of Public Health's Northwest Center for Public Health Practice and a professor in the UW School of Nursing.

"Populations in rural areas already have suffered disproportionately from a lot of negative health outcomes," she said. "Then on top of that, they lack the data, capacity and infrastructure to understand and better address those problems."

Yet, some of the data rural public health officials need to better serve their communities exists but is hard to access and use. So, what gives?

To find out, Bekemeier and her colleagues at the Northwest Center embarked on the SHARE-NW project: a five-year effort to identify, gather and visualize data in four Northwest states to help rural communities more effectively address health disparities and achieve health equity.

"Rural communities in Washington, Oregon, Idaho and Alaska face high poverty and are home to large populations of Alaska Native, Native American, Latino and other residents who are often marginalized and impacted by health disparities," explains the SHARE-NW website.

The SHARE-NW project is currently in its third year. The results of the group's survey of rural leaders were published Aug. 18 in a special health equity issue of the Journal of the American Medical Informatics Association.

The UW researchers conducted phone interviews in 2018 with officials in the four Northwest states, including staff from the Alaska Native Tribal Health Consortium, the Panhandle Health District in Idaho, the Crook County Health Department in Oregon, the Wahkiakum County Health Department in Washington and 21 other rural health organizations.

"In our study with rural public health system leaders, we identified barriers to using data, such as 1) lack of easy access to timely data, 2) data quality issues specific to rural and tribal communities, and 3) the inability for rural leaders to use those data," wrote the study authors.

"You may have a very seemingly homogenous population on the face of it," said Bekemeier, the study's lead author. "But you have small population groups that are very disproportionately impacted by certain issues, and leaders in those communities may not be aware that these problems exist, let alone how deeply individuals are affected."

For instance, one agency told the researchers: "What immediately comes to mind is our migrant farmworker community, especially with their language barriers and their temporary status in our community ... It's really hard to get the data to know who we're looking at ..."

To address this problem, SHARE-NW is building a readily accessible database and the related visualizations so local health officials can more easily talk about the makeup of their communities, identify local needs and foster data-supported decisions.

"If you have data, you can talk about what the issues are that need to be prioritized," Bekemeier explained. "Now, we're focusing on the six priority areas that were common across their community health assessments."

Those areas are: obesity, including physical activity and nutrition/food access; diabetes; tobacco; mental health, including suicide and substance abuse; violence and injury; and oral health, including access to dental care.

"We're doing this with them," Bekemeier said of the rural health leaders. One of the key elements for creating this robust data and tool set is local participation in not only using the data, but also adding to it from their own local research.

Credit: 
University of Washington

Shedding light on how the human eye perceives brightness

image: Brightness perception can be explained by the summation of a non-linear luminance term and a linear melanopsin term, suggesting that the melanopsin signal may express the absolute brightness level.

Image: 
Yokohama National University

Japanese scientists are shedding new light on the importance of light-sensing cells in the retina that process visual information. The researchers isolated the functions of melanopsin cells and demonstrated their crucial role in the perception of the visual environment. This ushers in a new understanding of the biology of the eye and how visual information is processed.

The findings could contribute to more effective therapies for complications that relate to the eye. They can also serve as the basis for developing lighting and display systems.

The research was published in Scientific Reports on May 20th, 2019.

The back of the human eye is lined with the retina, a layer of various types of cells, called photoreceptors, that respond to different amounts of light. The cells that respond in bright light are called cones, and those that respond at lower light levels are called rods.

Up until recently, researchers have thought that when light struck the retina, rods and cones were the only two kinds of cells that react. Recent discoveries have revealed an entirely new type of cells, called intrinsically photosensitive retinal ganglion cells (ipRGCs). Unlike rods and cones, ipRGCs contain melanopsin, a photopigment that is sensitive to light. While it has been established that ipRGCs are involved in keeping the brain's internal clock in sync with changes in daylight, their importance in the detection of the amount of light had not yet been well understood.

"Until now, the role of retinal melanopsin cells and how they contribute to the perception of the brightness of light have been unclear," said Katsunori Okajima, a professor at the Faculty of Environment and Information Sciences, Yokohama National University and one of the authors of the study.

"We've found that melanopsin plays a crucial role in the human ability to see how well-lit the environment is. These findings are redefining the conventional system of light detection that so far has only taken into consideration two variables, namely brightness and the amount of incoming light. Our results suggest that brightness perception should rely on a third variable -- the intensity of a stimulus that targets melanopsin."

In the study, the authors showed how cones and melanopsin combine to allow the perception of brightness. In order to better assess the contribution of melanopsin to the detection of light, melanopsin's signals were isolated from those of the cones and rods. This separation allowed for more accurate observation of the melanopsin signal alone. Visual stimuli were carefully designed and positioned in order to specifically stimulate the light-sensitive chemical. The researchers also used tracking software to measure study participants' pupil diameters under each visual stimulus. This served as a way to determine the relationship between brightness perception and the actual visual stimulus intensity on the retina.

The researchers were able to show that the perceived brightness of an image is the sum of the melanopsin response and the response generated by the cones. The former is a linear readout; the latter is not. The results also show that melanopsin is not a minor contributor to brightness perception but a crucial player in it.
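The two-term model can be sketched in a few lines: a non-linear cone-driven (luminance) term plus a linear melanopsin term. The exponent and weights below are illustrative placeholders, not the fitted values from the study.

```python
# Toy version of the two-term brightness model: a non-linear luminance
# (cone) term summed with a linear melanopsin term. Parameters are
# illustrative placeholders, not the study's fitted values.

def perceived_brightness(luminance, melanopsin, gamma=0.5, w_cone=1.0, w_mel=0.3):
    """Brightness = w_cone * luminance**gamma + w_mel * melanopsin."""
    return w_cone * luminance ** gamma + w_mel * melanopsin

# Two stimuli with equal luminance but different melanopsin stimulation
# are predicted to differ in apparent brightness:
low_mel = perceived_brightness(100.0, 10.0)
high_mel = perceived_brightness(100.0, 50.0)
print(low_mel, high_mel)
```

Under such a model, two scenes with identical cone-measured luminance can still look differently bright, which is the behavioural signature the researchers attribute to melanopsin.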

This work was supported by the Japan Society for the Promotion of Science Grants-in-Aid for Scientific Research (Grant Numbers 15H05926 and 18H04111).

Credit: 
Yokohama National University