Culture

First accurate data show male-to-female gender surgery can improve quality of life

Scientists have developed a transgender-specific questionnaire which confirms, for the first time, that gender surgery significantly improves quality of life for the majority of patients. The study shows that 80% of male-to-female patients perceived themselves as women post-surgery. However, the quality of life of transgender individuals remains significantly lower than that of the general population.

Many transgender individuals request gender reassignment surgery, but until now the only available information concerned general aspects of health-related quality of life (QoL), gathered with non-validated questionnaires about QoL improvement. A team at University Hospital Essen, Germany, led by Dr. Jochen Hess, followed 156 patients for a median of more than six years after surgery. They developed and validated the new Essen Transgender Quality of Life Inventory (ETL), the first instrument to specifically measure transgender QoL.

They found a high overall level of satisfaction with the outcomes of surgery. When the QoL of the last four weeks was compared with QoL at the time of publicly identifying as transgender, there was a highly significant increase on all subscales of the ETL, as well as in the global score, indicating a large improvement in QoL over the course of the transitioning process.

Dr. Hess commented:

"The good news is that we found that around three-quarters of patients showed a better quality of life after surgery. 80% perceived themselves to be women, and another 16% felt that they were 'rather female'. Three women in four were able to have orgasms after reassignment surgery.

It's very important that we have good data on quality of life in transgender people. They generally suffer from a worse QoL than the non-transgender population, with higher rates of stress and mental illness, so it's good not only that surgery can change this, but also that we can now show that it has a positive effect. Until now we have been using general methods to understand quality of life in transgender individuals, but this new method means that we can address well-being in greater depth".

Recent data estimate that 1.4 million adults in the USA identify as transgender, about 0.6% of the population. Comparable European figures are not available, and reported prevalence varies widely between individual European countries. Transgender individuals have seen greater visibility in recent years due to the openness of personalities such as Caitlyn Jenner, Chelsea Manning, and Andreja Pejic.

The team notes that there are limitations to the study: there was a high drop-out rate, and the results are from a single centre.

"Nevertheless, we now have the first specific validated tool for measuring QoL in transgender patients, and we hope this means that we can go forward to gather better information to help us improve treatment", said Dr. Hess.

Commenting, Professor Piet Hoebeke (Ghent University Hospital, Belgium) said:

"As patients develop a better understanding and higher acceptance of transgender surgery, more will seek gender confirming surgery. Despite this observation, many doctors are still not convinced that this is a medical condition for which surgery can be offered as a valuable treatment. We need studies like this one to convince the medical world that these patients can achieve a better QoL with treatment".

Professor Jens Sønksen (University of Copenhagen) commented:

"This study suffered from a high drop-out rate, which needs to be considered alongside the main data. Nevertheless, this is a large and important study, one of the largest clinical transsexual surveys ever attempted, and the fact that it has been performed using a specific validated questionnaire is significant. This is probably the best view of quality of life after sex reassignment that we have".

Neither Professor Hoebeke nor Professor Sønksen was involved in this work; these are independent comments.

Credit: 
European Association of Urology

Online intervention improves depression treatment rates in teen moms

image: This is M. Cynthia Logsdon, Ph.D., W.H.N.P.-B.C., F.A.A.N.

Image: 
University of Louisville

LOUISVILLE, Ky. - An online program persuaded teenage mothers across 10 Kentucky counties to seek medical help for depression, highlighting an inexpensive way to increase mental health treatment rates for the vulnerable group, according to a University of Louisville study.

The website included videos of adolescent mothers describing their experiences with postpartum depression and treatment, questions and answers, and local and national resources, including referrals for counseling services and suicide and child-abuse prevention hotlines.

Untreated postpartum depression hinders a mother's relationship with her child, her functioning at work and school, and her mothering skills and development. The condition also can harm a baby's development and attachment to the mother, said M. Cynthia Logsdon, Ph.D., W.H.N.P.-B.C., UofL School of Nursing professor and lead researcher of the study.

Half of the roughly 400,000 adolescents 18 and younger who give birth annually in the United States experience depressive symptoms, but less than 25 percent follow referrals for depression evaluation and treatment, according to the study.

The research, conducted from 2013 to 2016, involved more than 200 teen moms in urban, suburban and rural counties in Kentucky and was funded by a nearly $440,000 grant from the National Institutes of Health.

Study participants on average were 18 years old, primarily African-American, did not have a high school diploma and had given birth in the past year.

For both rural and urban counties, the intervention led to significant changes in attitude, intention to seek depression treatment and actually seeking treatment.

Credit: 
University of Louisville

80% cut in liver metastasis by restricting the blood vessels supplying it

image: The researchers from left to right: Joana Márquez, Fernando Unda, Iker Badiola and Gaskon Ibarretxe.

Image: 
(UPV/EHU)

Metastasis is the process whereby a tumour that grows in one organ breaks away from it and travels to another organ and colonises it. In the colonisation process it needs to create new blood vessels through which the cancer cells obtain the nutrients and oxygen they need to grow. This blood vessel formation process is called angiogenesis and is carried out by the endothelial cells. "Unlike normal endothelial cells and due to the signals that reach them from the tumour cells, the cells that supply the tumours have increased growth and tend to move towards the metastatic mass to help it grow," said Iker Badiola, member of the Signaling Lab research group in the Department of Cell Biology and Histology of the UPV/EHU's Faculty of Medicine and Pharmacy.

In order to find out what actually causes this change in the endothelial cells, the UPV/EHU's Signaling Lab research group and the Department of Pharmacology, Pharmacy and Pharmaceutical Technology of the University of Santiago de Compostela, in collaboration with other groups of researchers, embarked on research using mice. The ultimate aim was, as Badiola pointed out, "to slow down the metastatic process by impacting on angiogenesis, restoring the endothelial cells to their normal state". In the research they induced liver metastasis in mice using colon cancer cells and extracted endothelial cells from the resulting mass. They then compared these endothelial cells with healthy ones. The comparison covered two aspects: the protein level, where they examined which proteins appeared in each cell type, and to what degree; and, in the same way, the micro-RNA level. Micro-RNAs are small elements which for some time were not thought to perform any function, but which are now known to play a role in protein regulation.

Using biocomputing tools they screened and selected the relevant proteins and micro-RNA elements, and "in the final step of this selection process we ended up with a specific micro-RNA: miR-20a. This is an element that appears in healthy endothelial cells, but disappears in those that are in contact with the tumour. We saw that due to the disappearance of miR-20a in the endothelial cells, a set of proteins appeared, and that was when their behaviour began to change and they started to grow and move around," explained Badiola.

Restoring miR-20a using nanoparticles

They then started experiments to see whether including the miR-20a element would restore the behaviour of the endothelial cells that supply the tumours. To do this, they developed nanoparticles "designed to target the endothelial cells in the liver and loaded with miR-20a. We administered them to mice in which we had previously induced metastasis to find out the effect. The pathological analysis revealed that, in the cases treated, far fewer new blood vessels had formed inside the tumours. We also confirmed that the number and size of the metastatic masses had fallen by 80%", he said.

Badiola regards the 80% reduction in metastasis size as a positive result, but makes it clear that "if it is ever used as a treatment, it will be a complementary one. You can't ignore the fact that the metastasis goes on growing by the remaining 20% and, what is more, at no time are the tumour cells destroyed or attacked directly. The strategy for tackling the metastasis that we have achieved involves limiting the supply of nutrients and oxygen; in other words, we restrict the help".

Credit: 
University of the Basque Country

New report examines scientific evidence on safety and quality of abortion care in US

WASHINGTON -- While legal abortions in the U.S. are safe, the likelihood that women will receive the type of abortion services that best meet their needs varies considerably depending on where they live, says a new report from the National Academies of Sciences, Engineering, and Medicine. In addition, the report notes, the vast majority of abortions can be provided safely in office-based settings.

The committee that wrote the report examined the scientific evidence on the safety and quality of the four abortion methods used in the U.S. -- medication, aspiration, dilation and evacuation (D&E), and induction. It assessed quality of care based on whether it is safe, effective, patient-centered, timely, efficient, and equitable according to well-established standards. Most abortions in the U.S. are performed early in pregnancy; in 2014, 90 percent occurred by 12 weeks of gestation. Medication and aspiration abortions are the most common methods and, together, account for about 90 percent of all abortions. Serious complications from abortion are rare regardless of the method, and safety and quality are enhanced when the abortion is performed as early in pregnancy as possible.

Abortion-specific regulations in many states create barriers to safe and effective care. These regulations may prohibit qualified providers from performing abortions, misinform women of the risks of the procedures they are considering, or require medically unnecessary services and delay care, the report says. Examples of these policies include mandatory waiting periods, pre-abortion ultrasound, and a separate in-person counseling visit. Some states require abortion providers to provide women with written or verbal information suggesting that abortion increases a woman's risk of breast cancer or mental illness, despite the lack of valid scientific evidence of increased risk.

In 2014, there were 17 percent fewer abortion clinics than in 2011, and 39 percent of women of reproductive age resided in a county without an abortion provider. In 2017, 25 states had five or fewer abortion clinics, and five states had only one abortion clinic. In addition, approximately 17 percent of women travel more than 50 miles to obtain an abortion.

The vast majority of abortions can be provided safely in office-based settings, the report says. In 2014, 95 percent of abortions were provided in clinics and other office-based settings. For any outpatient procedure, including abortion, the important safeguards are whether the facility has the appropriate equipment, personnel, and an emergency transfer plan to address complications that might occur. The committee found no evidence indicating that clinicians who perform abortions require hospital privileges to ensure a safe outcome for the patient.

No special equipment or emergency arrangements are required for medication abortions. For other abortion methods, the minimum facility characteristics depend on the level of sedation used, the report says. If moderate sedation is used, the facility should have equipment to monitor oxygen saturation, heart rate, and blood pressure as well as have emergency resuscitation equipment and an emergency transfer plan. Deeper sedation requires equipment to monitor ventilation.

The committee also reviewed the evidence on what clinical skills are necessary for health care providers to safely perform the various components of abortion care, including pregnancy determination, counseling, gestational age assessment, medication dispensing, procedure performance, patient monitoring, and follow-up assessment and care. It concluded that trained physicians -- such as OB-GYNs and family medicine physicians -- as well as advanced practice clinicians -- such as certified nurse-midwives, nurse practitioners, and physician assistants -- can safely and effectively provide medication and aspiration abortions. Physicians with appropriate training and sufficient experience to maintain requisite surgical skills can provide D&E abortions. Clinicians with training in managing labor and delivery can safely and effectively provide induction abortions.

In its review of abortion's potential long-term health effects, the committee examined the evidence on future childbearing and pregnancy, risk of breast cancer, and mental health effects. It found that having an abortion does not increase a woman's risk of secondary infertility, pregnancy-related hypertensive disorders, preterm birth, breast cancer, or mental health disorders such as depression, anxiety, or post-traumatic stress disorder. The risk of a very preterm first birth appears to be associated with the number of prior abortions. For example, an increased risk of a first birth earlier than 28 weeks of gestation was found to be associated with having two or more aspiration abortions, compared with the first birth of women with no history of prior abortion.

Nineteen states require a physician to be physically present to provide mifepristone -- the only medication specifically approved by the FDA for use in medication abortions -- and 17 states require medication abortions to be performed in a facility with attributes of an ambulatory surgery center or hospital. There is no evidence that these practices improve safety or quality of care, the report says. How the limited distribution of mifepristone affects quality of abortion care merits further investigation.

Access to clinical education and training in abortion care in the U.S. is highly variable at both the undergraduate and graduate levels, the report says. Medical residents and other advanced clinical trainees often have to find abortion training and experience in settings outside of their educational program. In addition, training opportunities are particularly limited in the Southern and Midwestern states, as well as in rural areas throughout the country.

The committee also looked at trends in abortion care. Between 1980 and 2014, the abortion rate in the U.S. decreased by more than half, from an estimated 29 to 15 per 1,000 women of reproductive age. The reason for this decline is not fully understood, but it has been attributed to the increasing use of contraceptives, especially long-acting methods such as intrauterine devices; historic declines in the rate of unintended pregnancy; and increasing numbers of state regulations that limit the availability of otherwise legal abortion services.

Credit: 
National Academies of Sciences, Engineering, and Medicine

Chemical peels are safe for people with darker skin, result in few side effects and complications

BOSTON-- Results from a new study led by Boston Medical Center (BMC) indicate that, when performed appropriately, chemical peels can be a safe treatment option for people with darker skin. The findings, first published online in the Journal of the American Academy of Dermatology, show that less than four percent of people with darker skin experienced unwanted side effects from a chemical peel. In addition, the researchers observed a lower rate of side effects compared to previous studies that included all skin types.

Chemical peels have been shown to effectively treat acne, premature aging, and dark or light spots on the skin. Side effects of chemical peels include swelling, crusting, reddening, acne, and pigmentation changes in the skin. Although having darker skin is a risk factor for complications during a chemical peel, no large-scale studies have looked into the long-term side effects of the treatment in a racially and ethnically diverse population.

"These findings should give some assurances to people with darker skin who are considering getting a chemical peel," says lead researcher Neelam Vashi, MD, director of the Center for Ethnic Skin at BMC and at Boston University School of Medicine (BUSM). "People with darker skin have long been underrepresented in dermatological research, and it's important to make sure we know how safe and effective these treatments are for them."

Researchers followed 132 patients with darker skin who received a total of 473 chemical peels to determine how prevalent side effects were. The same dermatologist performed all of the chemical peels, and the peel was applied all at once, rather than in sections on the skin. Eighteen participants experienced side effects, the most common being crusting, dark spots, and reddening. The side effects lasted an average of 4.5 weeks.

The study also found that side effects were least likely to occur in the winter, which could be attributed to the limited sun exposure participants experience during that time. People with the darkest skin tones were most likely to experience side effects and complications, a trend that could be better understood with additional research.

Credit: 
Boston Medical Center

Plasmons triggered in nanotube quantum wells

image: A wafer of highly aligned carbon nanotubes, seen in gray on a piece of glass, facilitated a novel quantum effect in experiments at Rice University.

Image: 
Jeff Fitlow/Rice University

A novel quantum effect observed in a carbon nanotube film could lead to the development of unique lasers and other optoelectronic devices, according to scientists at Rice University and Tokyo Metropolitan University.

The Rice-Tokyo team reported an advance in the ability to manipulate light at the quantum scale by using single-walled carbon nanotubes as plasmonic quantum confinement fields.

The phenomenon found in the Rice lab of physicist Junichiro Kono could be key to developing optoelectronic devices like nanoscale, near-infrared lasers that emit continuous beams at wavelengths too short to be produced by current technology.

The new research is detailed in Nature Communications.

The project came together in the wake of the Kono group's discovery of a way to achieve very tight alignment of carbon nanotubes in wafer-sized films. These films allowed for experiments that were far too difficult to carry out on single or tangled aggregates of nanotubes and caught the attention of Tokyo Metropolitan physicist Kazuhiro Yanagi, who studies condensed matter physics in nanomaterials.

"He brought the gating technique (which controls the density of electrons in the nanotube film), and we provided the alignment technique," Kono said. "For the first time we were able to make a large-area film of aligned nanotubes with a gate that allows us to inject and take out a large density of free electrons."

"The gating technique is very interesting, but the nanotubes were randomly oriented in the films I had used," Yanagi said. "That situation was very frustrating because I could not get precise knowledge of the one-dimensional characteristics of nanotubes in such films, which is most important. The films that can only be provided by the Kono group are amazing because they allowed us to tackle this subject."

Their combined technologies let them pump electrons into nanotubes that are little more than a nanometer wide and then excite them with polarized light. The width of the nanotubes trapped the electrons in quantum wells, in which the energy of atoms and subatomic particles is "confined" to certain states, or subbands.

Light then prompted them to oscillate very quickly between the walls. With enough electrons, Kono said, they began to act as plasmons.

"Plasmons are collective charge oscillations in a confined structure," he said. "If you have a plate, a film, a ribbon, a particle or a sphere and you perturb the system (usually with a light beam), these free carriers move collectively with a characteristic frequency." The effect is determined by the number of electrons and the size and shape of the object.

Because the nanotubes in the Rice experiments were so thin, the energy between the quantized subbands was comparable to the plasmon energy, Kono said. "This is the quantum regime for plasmons, where the intersubband transition is called the intersubband plasmon. People have studied this in artificial semiconductor quantum wells in the very far-infrared wavelength range, but this is the first time it has been observed in a naturally occurring low-dimensional material and at such a short wavelength."

Detecting a very complicated gate voltage dependence in the plasmonic response was a surprise, as was its appearance in both metallic and semiconducting single-walled nanotubes. "By examining the basic theory of light-nanotube interactions, we were able to derive a formula for the resonance energy," Kono said. "To our surprise, the formula was very simple. Only the diameter of the nanotube matters."

The researchers believe the phenomenon could lead to advanced devices for communications, spectroscopy and imaging, as well as highly tunable near-infrared quantum cascade lasers.

While traditional semiconductor lasers depend on the width of the lasing material's bandgap, quantum cascade lasers do not, said Weilu Gao, a co-author on the study and a postdoctoral researcher in Kono's group who is spearheading device development using aligned nanotubes. "The wavelength is independent of the gap," he said. "Our laser would be in this category. Just by changing the diameter of the nanotube, we should be able to tune the plasma resonance energy without worrying about the bandgap."

Kono also expects the gated and aligned nanotube films will give physicists the opportunity to study Luttinger liquids, theoretical collections of interacting electrons in one-dimensional conductors.

"One-dimensional metals are predicted to be very different from 2-D and 3-D," Kono said. "Carbon nanotubes are some of the best candidates for observing Luttinger liquid behaviors. It's difficult to study a single tube, but we have a macroscopic one-dimensional system. By doping or gating, we can tune the Fermi energy. We can even convert a 1-D semiconductor into a 1-D metal. So this is an ideal system to study this kind of physics."

Credit: 
Rice University

Wandering greenhouse gas

On the seafloor of the shallow coastal regions north of Siberia, microorganisms produce methane when they break down plant remains. If this greenhouse gas finds its way into the water, it can also become trapped in the sea ice that forms in these coastal waters. As a result, the gas can be transported thousands of kilometres across the Arctic Ocean and released in a completely different region months later. This phenomenon is the subject of an article by researchers from the Alfred Wegener Institute, published in the current issue of the online journal Scientific Reports. Although this interaction between methane, ocean and ice has a significant influence on climate change, to date it has not been reflected in climate models.

In August 2011, the icebreaker Polarstern from the Alfred Wegener Institute, Helmholtz Centre for Polar and Marine Research (AWI) was making its way through the ice-covered Arctic Ocean, on a course that took her just a few hundred kilometres from the North Pole. Back then, AWI geochemist Dr Ellen Damm tested the waters of the High North for the greenhouse gas methane. In an expedition to the same region four years later, she had the chance to compare the measurements taken at different times, and found significantly less methane in the water samples.

Ellen Damm, together with Dr Dorothea Bauch from the GEOMAR Helmholtz Centre for Ocean Research in Kiel and other colleagues, analysed the samples to determine the regional levels of methane, and the sources. By measuring the oxygen isotopes in the sea ice, the scientists were able to deduce where and when the ice was formed. To do so, they had also taken sea-ice samples. Their findings: the ice transports the methane across the Arctic Ocean. And it appears to do so differently every year, as the two researchers and their colleagues from the AWI, the Finnish Meteorological Institute in Helsinki and the Russian Academy of Science in Moscow relate in the online journal Scientific Reports.

The samples from 2011 came from sea ice that had started its long journey north in the coastal waters of the Laptev Sea of eastern Siberia nearly two years earlier, in October 2009. The samples from 2015, which had only been underway in the Arctic Ocean half as long, showed a markedly lower level of the greenhouse gas. The analysis revealed that this ice was formed much farther out, in the deeper ocean waters. However, until now, the climate researchers' models haven't taken into consideration the interaction between methane, the Arctic Ocean and the ice floating on it.

Every molecule of methane in the air has 25 times the effect on temperature rise compared to a molecule of carbon dioxide released into the atmosphere by burning coal, oil or gas. Methane in the Arctic also has an enormous impact on warming at northerly latitudes, and further exacerbates global warming - a good reason to investigate the methane cycle in the High North more closely.
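The 25-fold warming effect cited above is, in effect, a global warming potential (GWP) factor, which can be used to express a methane release in carbon-dioxide-equivalent terms. The sketch below assumes the factor applies on a mass basis over a 100-year horizon (the convention in which a GWP of 25 is usually quoted); the figures in the example are purely illustrative.

```python
# Convert a methane emission into its CO2 equivalent using the
# factor of 25 cited in the text, assumed here to be a mass-based
# 100-year global warming potential. Illustrative sketch only.

METHANE_GWP_100 = 25  # kg CO2-equivalent per kg CH4 (assumed mass basis)

def co2_equivalent(methane_kg: float) -> float:
    """Return the CO2-equivalent mass (kg) of a methane emission."""
    return methane_kg * METHANE_GWP_100

# Example: releasing 1 tonne (1000 kg) of methane
print(co2_equivalent(1000.0))  # 25000.0 kg CO2e
```

This kind of conversion is what lets methane's outsized per-unit effect be compared directly with CO2 emissions from burning coal, oil or gas.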

Methane is produced by cattle breeding and rice cultivation, as well as various other natural processes. For example, the remains of algae and other plant materials collect on the floor of the shallow Laptev Sea, and in other shallow waters off the Arctic coast. If there is no oxygen there, microorganisms break down this biomass, producing methane. To date, simulations have paid too little attention to the routes taken by carbon and the release of methane from the Arctic regions.

In autumn, when air temperatures drop, many areas of open water also begin to cool. "Sea ice forms on the surface of the Russian shelf seas, and is then driven north by the strong winds," explains AWI sea-ice physicist Dr Thomas Krumpen, who also took part in the study. The ice formation and offshore winds produce strong currents in these shallow marginal seas, which stir up the sediment and carry the methane produced there into the water column. The methane can also be trapped in the ice that rapidly forms in these open areas of water - known as polynyas - in the winter.

"As more seawater freezes, it can expel the brine contained within, entraining large quantities of the methane locked in the ice," explains AWI researcher Ellen Damm. As a result, a water layer forms beneath the ice that contains large amounts of both salt and methane. The ice on the surface and the dense saltwater below, together with the greenhouse gas it contains, are all pushed along by the wind and currents. According to Thomas Krumpen, "It takes about two and a half years for the ice formed along the coast of the Laptev Sea to be carried across the Arctic Ocean and past the North Pole into the Fram Strait between the east coast of Greenland and Svalbard." Needless to say, the methane trapped in the ice and the underlying saltwater is along for the ride.

The rising temperatures produced by climate change are increasingly melting this ice. Both the area of water covered by sea ice and the thickness of the ice have been decreasing in recent years, and thinner ice is blown farther and faster by the wind. "In the past few years, we've observed that ice is carried across the Arctic Ocean faster and faster," confirms Thomas Krumpen. And this process naturally means major changes in the Arctic's methane turnover. Accordingly, quantifying the sources, sinks and transport routes of methane in the Arctic continues to represent a considerable challenge for the scientific community.

Credit: 
Alfred Wegener Institute, Helmholtz Centre for Polar and Marine Research

Land under water: Estimating hydropower's land use impacts

image: Norway is one of the top-ten hydropower electricity producers worldwide, with more than 95 percent of domestic power production coming from hydropower. A new tool developed at the Norwegian University of Science and Technology (NTNU) enables policymakers and industry to understand the tradeoffs between hydropower and the loss of land and biodiversity when lands are drowned under reservoirs.

Image: 
Ånund Killingtveit/NTNU

Hydropower is the world's top provider of renewable energy, producing a whopping 16 per cent of the global energy supply. That's a good thing when it comes to the climate, especially compared to energy from fossil fuels. But hydropower is not without its environmental costs, particularly when it comes to the land that is drowned under reservoirs or gobbled up by roads and power lines built for a hydropower project.

Now, a team of Norwegian-based researchers has developed an innovative way to describe how much land it takes to generate a kilowatt-hour of electricity from hydropower. The goal is to make it easier for policymakers and businesses to assess the environmental trade-offs of current hydropower plants and of investments in new ones.

"Some hydropower reservoirs may look natural at first. However, they are human influenced and if land has been flooded for their creation, this may impact terrestrial ecosystems," said Martin Dorber, a PhD candidate at the Norwegian University of Science and Technology's (NTNU) Industrial Ecology Programme.

Life-cycle analysis

There's widespread agreement that boosting the amount of electricity that the world gets from renewable energy like hydropower is key to combatting global climate change.

The Intergovernmental Panel on Climate Change (IPCC) has looked at this issue in a special report on renewable energy sources and climate change. There, the organization says governments and industry need to include the long-term environmental consequences of hydropower into current and future projects. That way, they can identify the environmental trade-offs that will result from expanding hydropower production.

Dorber and his colleagues Francesca Verones from NTNU's Industrial Ecology Programme, and Roel May from the Norwegian Institute for Nature Research realized that they had the perfect tool for quantifying the environmental effects of hydropower production. It's an analysis tool called Life Cycle Assessment, commonly abbreviated as LCA.

LCA gives researchers a methodology to look at all the environmental impacts of a product or process during its life cycle. That means they start at the very beginning, from the production of the item's components, to when the product or process is created and in use, and finally to when it is no longer in use and is recycled or otherwise disposed of. The idea is to give a picture of the complete environmental cost of something.

As an example, if you were to conduct a life-cycle assessment of a beer can, you would need to know everything starting with the environmental costs of mining the raw material (bauxite), shipping it to be made into aluminium, the making of the can itself, and what it takes to recycle it after it has been used. It's complicated, but researchers at NTNU's Industrial Ecology Programme have perfected this approach for hundreds of different products and processes.

One of the potential environmental effects of hydropower development is what it can do to biodiversity. It can alter freshwater habitat, degrade water quality, and change land use, both by flooding land for reservoirs and through the construction of the dam, power lines, and access roads the project needs. The researchers realized that there isn't enough information available yet to allow LCA to assess all of these impacts from hydropower, so they decided to focus on one key issue: land use and land use change.

"Land use and land use change is a key issue, as it is one of the biggest drivers of biodiversity loss, because it leads to loss and degradation of habitat for many species," Dorber said.

Flooding natural lakes

The first step for the researchers was to conduct what is called a life cycle inventory, by figuring out just how much land is used to produce a kilowatt-hour of electricity.

Since Norway is one of the top ten hydropower producers in the world, with more than 95 per cent of all domestic power production from hydropower, the researchers realized they needed to create an inventory specific to Norway.

There are databases that attempt to provide this information, but the largest of these only had information on hydropower production for Switzerland and Brazil. And none of the databases accounted for the water area of a natural lake that might have been flooded to make the hydro reservoir, the researchers said.

"Most of the Norwegian hydropower reservoirs are created by impounding natural lakes," Dorber said. "So if we applied the information from the databases that don't account for the water area of a natural lake, that would lead to a gross overestimation of the environmental impact."

The trouble is, there's minimal information available on the size of the lakes that were inundated to create Norway's 1289 hydropower reservoirs. So the researchers found a way to estimate what original lake sizes would have been -- by using satellite imagery.

How they did it

Fortunately, the researchers had access to two excellent sources of information to make their estimates. The first was measurements of the actual reservoir surface areas at their highest regulated water level, provided by the Norwegian Water Resources and Energy Directorate (NVE).

The second was free, downloadable satellite images from the NASA-USGS Global Land Survey data set, where they used images from 1972-1983 for Norway. The researchers also used aerial photographs as needed from an Internet portal called Norge i Bilder, which provides aerial photographs for Norway as early as 1937.

The dates of the images matter, because the researchers needed to be able to see the land before the hydropower projects were constructed. That meant they could only assess water surface areas before dam construction for hydropower plants built on or after 1937 for the area covered by the aerial photographs, and for dams built on or after 1972 for the area covered by the Landsat photos. As a result, they were only able to calculate how much land was inundated for 184 hydropower reservoirs in total.

For a variety of reasons, they weren't able to use data for 77 reservoirs that they had land area information for. In the end, they were able to calculate land occupation by reservoirs that provided roughly 20 per cent of the total average annual hydropower electricity produced in Norway between 1981 and 2010.

"By dividing the inundated land area by the annual electricity production of each hydropower reservoir, we calculated site-specific net land occupation values for the Life Cycle Inventory," Dorber said. "While it's beyond the scope of this work, our approach is a crucial step towards quantifying impacts of hydropower electricity production on biodiversity for Life Cycle Analysis."
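The calculation Dorber describes can be sketched in a few lines. The reservoir figures below are purely illustrative and are not actual NVE data:

```python
# Net land occupation of a hydropower reservoir: only newly inundated
# land (reservoir area minus the pre-existing natural lake) is charged
# to the electricity produced.

def net_land_occupation(reservoir_area_m2, original_lake_area_m2,
                        annual_production_kwh):
    """Return site-specific net land occupation in m2*yr/kWh."""
    inundated_m2 = reservoir_area_m2 - original_lake_area_m2
    return inundated_m2 / annual_production_kwh

# Hypothetical reservoir: 12 km2 at highest regulated water level,
# impounding a natural lake of 10 km2, producing 100 GWh per year.
occupation = net_land_occupation(12e6, 10e6, 100e6)
print(f"{occupation:.3f} m2*yr/kWh")  # 0.020 m2*yr/kWh
```

Ignoring the pre-existing lake (passing 0 for its area) would report 0.120 m2·yr/kWh for the same hypothetical reservoir, a six-fold overestimate, which is exactly the kind of error Dorber warns about.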

Other uses, other countries

Dorber points out that their approach could be employed by other countries that wanted to know more about the effects of hydropower on the environment, because the Landsat data covers the entire globe and is freely available.

And when they compared the Norwegian information they generated to the hydropower information they had from Switzerland and Brazil, they saw just how different the effects are in the different countries.

"The average land occupation in our study across all the hydropower plants we looked at is 0.027 m2·yr/kWh and is larger than the existing 0.004 m2·yr/kWh in the database for Switzerland," Dorber said. "However, when we adjusted the land occupation value to address uncertainties, the adjusted average land occupation (0.007 m2·yr/kWh) is lower than our average land occupation (0.027 m2·yr/kWh) and is thus closer to the existing 0.004 m2·yr/kWh in the Ecoinvent database."

Knowing how much land was occupied when a dam was built can also help to calculate how much water is lost on average to evaporation, which can affect aquatic ecosystems by reducing the amount of water released from the dam.

And because the creation of hydropower reservoirs leads to an initial increase in greenhouse gas emissions from the decomposition of organic matter that was flooded by the reservoir or flushed into the reservoir, the information can also be used to calculate net water consumption and net greenhouse gas emissions for the Life Cycle Inventory, Dorber said.

"We have shown that remote sensing data can be used to quantify the land use change caused by hydropower reservoirs. At the same time our results show that the land use change differs between hydropower reservoirs," Dorber said. "Therefore, more reservoir-specific land use change assessment is a key component that is needed to quantify the potential environmental impacts in connection with hydropower reservoirs."

Credit: 
Norwegian University of Science and Technology

The complex journey of red blood cells through microvascular networks

image: A snapshot showing red blood cells deforming as they flow through another microvascular network geometry. The large deformation of each individual cell is captured to better understand how individual cells behave as they flow through these networks.

Image: 
Rutgers University

In the human body, microvascular networks composed of the smallest blood vessels are central to the body's function. They facilitate the exchange of essential nutrients and gases between the bloodstream and surrounding tissues, and regulate blood flow in individual organs.

While the behavior of blood cells flowing within single, straight vessels is a well-studied problem, less is known about the individual cellular-scale events that give rise to blood behavior in microvascular networks. To better understand this, researchers Peter Balogh and Prosenjit Bagchi published a recent study in the Biophysical Journal. Bagchi is based in the Mechanical and Aerospace Engineering Department at Rutgers University, and Balogh is his PhD student.

To the researchers' knowledge, theirs is the first work to simulate and study red blood cells flowing in physiologically realistic microvascular networks, capturing both the highly complex vascular architecture as well as the 3D deformation and dynamics of each individual red blood cell.

Balogh and Bagchi developed and used a state-of-the-art simulation code to study the behavior of red blood cells as they flow and deform through microvascular networks. The code simulates 3D flows within complex geometries, and can model deformable cells, such as red blood cells, as well as rigid particles, such as inactivated platelets or some drug particles.

"Our research in microvascular networks is important because these vessels provide a very strong resistance to blood flow," said Bagchi. "How much energy the heart needs to pump blood, for example, is determined by these blood vessels. In addition, this is where many blood diseases take root. For example, for someone with sickle cell anemia, this is where the red blood cells get stuck and cause enormous pain."

One of the paper's findings involves the interaction between red blood cells and the vasculature within the regions where vessels bifurcate. They observed that as red blood cells flow through these vascular bifurcations, they frequently jam for very brief periods before proceeding downstream. Such behavior can cause the vascular resistance in the affected vessels to increase, temporarily, by several orders of magnitude.

There have been many attempts to understand blood flow in microvascular networks, dating back to the 1800s and the French physician and physiologist Jean-Louis-Marie Poiseuille, whose interest in the circulation of blood led him to conduct a series of experiments on the flow of liquids in narrow tubes. He also formulated a mathematical expression for the non-turbulent flow of fluids in circular tubes.
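Poiseuille's expression survives as the Hagen-Poiseuille law for laminar flow in a circular tube. The capillary-scale numbers below are illustrative and are not taken from the study:

```python
import math

# Hagen-Poiseuille law: volumetric flow rate Q through a circular tube
# of radius r and length L, under pressure drop dP, for a fluid of
# dynamic viscosity mu:  Q = pi * dP * r**4 / (8 * mu * L)

def poiseuille_flow(delta_p_pa, radius_m, viscosity_pa_s, length_m):
    return math.pi * delta_p_pa * radius_m**4 / (8 * viscosity_pa_s * length_m)

# Illustrative capillary-like tube: 4-micron radius, 1 mm long,
# viscosity ~3 mPa*s (order of blood), 1 kPa pressure drop.
q = poiseuille_flow(1e3, 4e-6, 3e-3, 1e-3)  # a few 1e-14 m3/s
```

The fourth-power dependence on radius means a small constriction changes resistance enormously, which hints at why the brief cell jams at bifurcations that Balogh and Bagchi observed can raise vascular resistance by orders of magnitude, and why realistic winding, bifurcating networks need full simulation rather than the straight-tube formula.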

Updating this research, Balogh and Bagchi use computation to enhance the understanding of blood flow in these networks. Like many other groups, they originally modelled capillary blood vessels as small, straight tubes and predicted their behavior.

"But if you look at the capillary-like vessels under the microscope, they are not straight tubes...they are very winding and continuously bifurcate and merge with each other," Bagchi said. "We realized that no one else had a computational tool to predict the flow of blood cells in these physiologically realistic networks."

"This is the first study to consider the complex network geometry in 3D and simultaneously resolve the cell details in 3D," Balogh said. "One of the underlying goals is to better understand what is occurring in these very small vessels in these complex geometries. We hope that by being able to model this next level of detail we can add to our understanding of what is actually occurring at the level of these very small vessels."

In terms of cancer research, this model may have tremendous implications. "This code is just the beginning of something really big," Bagchi said.

In the medical field today, there are advanced imaging systems that image the capillary network of blood vessels, but it's sometimes difficult for those imaging systems to predict the blood flow in every vessel simultaneously. "Now, we can take those images, put them into our computational model, and predict even the movement of each blood cell in every capillary vessel that is in the image," Bagchi said.

This is a huge benefit because the researchers can see whether the tissue is getting enough oxygen or not. In cancer research, angiogenesis -- the physiological process through which new blood vessels form from pre-existing vessels -- is dependent upon the tissue getting enough oxygen.

The team is also working on modeling targeted drug delivery, particularly for cancer. In this approach, nanoparticles are used to carry drugs to the specific location of the disease. For example, if someone has cancer in the liver or pancreas, then those specific organs are targeted. Targeted drug delivery allows an increased dose of the drug, so other organs don't get damaged and side effects are minimized.

"The size and shape of these nanoparticles determine the efficiency of how they get transported through the blood vessels," Bagchi said. "We think the architecture of these capillary networks will determine how well these particles are delivered. The architecture varies from organ to organ. The computational code we developed helps us understand how the architecture of these capillary networks affects the transport of these nanoparticles in different organs."

This research used computational simulations to answer questions like: How accurately can a researcher capture the details of every blood cell in complex geometries? How can this be accomplished in 3D? How do you take into account the many interactions between these blood cells and vessels?

"In order to do this, we need large computing resources," Bagchi said. "My group has been working on this problem using XSEDE resources from the Texas Advanced Computing Center. We used Stampede1 to develop our simulation technique, and soon we will be moving to Stampede2 because we'll be doing even larger simulations. We are using Ranch to store terabytes of our simulation data."

The eXtreme Science and Engineering Discovery Environment (XSEDE) is a National Science Foundation-funded virtual organization that integrates and coordinates the sharing of advanced digital services -- including supercomputers and high-end visualization and data analysis resources -- with researchers nationally to support science. Stampede1, Stampede2, and Ranch are XSEDE-allocated resources.

The simulations reported in the paper took a few weeks of continuous simulation and resulted in terabytes of data.

In terms of how this research will help the medical community, Bagchi said: "Based on an image of capillary blood vessels in a tumor, we can simulate it in 3D and predict the distribution of blood flow and nanoparticle drugs inside the tumor vasculature, and, perhaps, determine the optimum size, shape and other properties of nanoparticles for most effective delivery. This is something we'll be looking at in the future."

Credit: 
University of Texas at Austin, Texas Advanced Computing Center

One quarter of penis cancer sufferers don't get recommended treatment -- halving the survival rate

A major international survey has found that around a quarter of patients are not receiving the recommended treatment for cancer of the penis. It also found that these patients had half the survival rate of those who were treated according to guidelines. The study, presented at the EAU conference in Copenhagen, finds that non-adherence is partly due to patients refusing treatment, or doctors being reluctant to treat appropriately or being unfamiliar with the best procedures.

Penis cancer is considered a rare human cancer. Around 1 in 100,000 men in the West contract it every year, which translates to around 640 cases per year in a country such as the UK and around 2,320 in the US. In recent years, however, this rate has risen by 20% to 25% in many countries1, especially among older men.

Cancer of the penis is extremely distressing. As the American Cancer Society says "cancer of the penis can be a frightening prospect. Partially or completely removing the penis is often the most effective way to cure penile cancer, but for many men this cure seems worse than the disease"2.

In this large international survey, researchers found that a significant minority (25%) of patients do not receive the recommended treatment. This is due in part to patients being reluctant to go ahead with surgery that removes all or part of the penis, and in part to doctors not proceeding with the appropriate surgery to treat this rare cancer.

Researchers from 12 centres in Italy, Spain, the USA, Brazil, and Hungary looked at adherence to the EAU guidelines on treatment of penile cancer. They retrospectively examined the records of 425 patients who had been treated in the 2010-2016 period.

Lead author Dr Luca Cindolo (Abruzzo, Italy) said:

"We found that most patients were treated in accordance with the gold-standard EAU recommendations3, but around 25% of patients had not received appropriate treatment. From our work, we see that around twice as many patients survive if they have been treated according to recommended guidelines.

In around half of those patients not treated according to guidelines, the decision was made by the doctor, and we suspect that this is because many doctors are unfamiliar with treating this rare, but devastating cancer. In one in six cases, the patient, or the patient's carers, made the decision not to be treated according to guidelines. We often find that patients don't want to be treated, or that the patients' carers are unwilling to take the decision to treat.

These are often difficult treatment decisions to take, and so they need to be arrived at after open discussion between the patient and the medical team. It's a condition which most urologists don't see very often, so it's best if the medical team is experienced in dealing with the condition. This may mean that treatment in national or international centres of excellence is the best way to proceed".

Commenting, Dr Vijay Sangar (Director of Surgery, Christie Hospital, Manchester) said:

"We often find that patients with rare cancers get short-changed because the cancer is so seldom encountered by doctors. We can suggest that if we treat rare cancers in national or even international centres of excellence, the chances of better management improve. In the UK for example, we centralised the treatment of penis cancer into just 10 centres of excellence, whereas in some countries such as Hungary, Spain, and Italy, these rare urological cancers are still treated locally, which may be reflected in their lower survival rates. Generally, the more penile cancer a team sees the better they become at managing the disease. The recently established eUROGEN consortium will make a huge difference to European patient care; this gives patients with rare urological diseases access to the best management no matter where they are in Europe".

Credit: 
European Association of Urology

Large racial and ethnic disparity in world's most common STI

image: This graph shows prevalence of the STI.

Image: 
Johns Hopkins Medicine

In a new Johns Hopkins study, researchers have added to evidence that Trichomonas vaginalis (TV), the world's most common curable sexually transmitted infection (STI), disproportionately affects the black community.

A report of the findings, published March 15 in Clinical Infectious Diseases, highlights the major racial/ethnic disparities in TV infection in the United States population, calling on the need to improve racial equity in sexual health.

"These findings are likely reflective of real social and structural disparities, such as lower access to health care, that result in high infection rates in the black community. Targeted public health education about Trichomonas will be critical," says Aaron Tobian, M.D., Ph.D., associate professor of pathology at the Johns Hopkins University School of Medicine and the study's senior author.

Trichomonas vaginalis infection is caused by a protozoan parasite and is asymptomatic in most people. In women, TV infection can be characterized by a foul-smelling vaginal discharge, painful urination and abdominal pain. Similar urination and abdominal symptoms can be seen in men. However, if TV infection is left untreated, it can lead to severe consequences in women, such as pelvic inflammatory disease and infertility.

Unlike with some other STIs, there are currently no routine surveillance or reporting programs for TV in the United States, and only two previous studies have ever examined its prevalence on a national scale, according to Tobian. Those studies, he says, only focused on women or a younger population.

To examine the prevalence of TV infection in both males and females 18 to 59 years old, Tobian and colleagues used data from the 2013-2014 National Health and Nutrition Examination Survey, a collection of health information from the United States' noninstitutionalized, civilian population that includes valid TV results from urine samples.

Of the 4,057 participants, 1,942 were males, 2,115 were females, 822 identified as non-Hispanic black and 3,235 identified as other races/ethnicities.

While the prevalence of TV infection was 0.03 and 0.8 percent among males and females of other races/ethnicities, the burden of TV infection was significantly higher among black males and females: 4.2 percent and 8.9 percent, respectively.

Higher prevalence of TV infection was associated with being female, black, older, having less than a high school education and living below poverty level, independent of having multiple sexual partners. TV prevalence was also higher among older individuals (i.e., 25-59 years old compared to age 18-24).

People below poverty level had a prevalence of 3.9 percent versus 0.6 percent for those at or above poverty level, and individuals without a high school education had a prevalence of 2.9 percent versus 0.8 percent for those with at least a high school education.

Currently, screening for TV infection is only recommended for people who are HIV-positive. Eshan U. Patel, M.P.H., lead author of the study, says the new findings should encourage broader screening initiatives, educational programming and policy changes to ensure access to sexual health care.

"It's unfortunate that TV infection hasn't received a stronger public health response, especially since it is easy to diagnose and treat," says Patel. TV infection can be detected using the same diagnostic platform as the one used for Chlamydia and can be cured with just one pill (metronidazole).

Credit: 
Johns Hopkins Medicine

UH scientists investigating mysterious dark matter

image: DarkSide-50 time projection chamber interior at Gran Sasso National Laboratory in Italy.

Image: 
DarkSide Collaboration

University of Houston scientists are helping to develop a technology that could hold the key to unraveling one of the great mysteries of science: what constitutes dark matter? Scientists believe dark matter makes up 85 percent of the matter in the universe, but nobody actually knows what dark matter is.

"If we are the experiment that finds dark matter, we can change the fundamental understanding of the universe as we know it," said UH assistant professor Andrew Renshaw. "We can really start to understand the fundamental properties of the universe - how we got from the big bang to where we are, and what the future holds."

Renshaw and professor Ed Hungerford are leading a team of physicists from the College of Natural Sciences and Mathematics in the DarkSide program, an international research collaboration seeking to detect dark matter in the form of weakly interacting massive particles (WIMPs). In principle, when WIMP particles collide with ordinary nuclei, extremely small, low-energy nuclear recoil would result. In very simple terms, the scientists are trying to build technology that can detect WIMPs by detecting this very tiny, but observable recoil.

The UH team is using the DarkSide program's first physics detector, DarkSide-50 (DS-50), located underground at the Gran Sasso National Laboratory in Central Italy. The team and their collaborators have improved the sensitivity of the DS-50 detector in recent years by switching from atmospheric argon to low-radioactivity liquid argon, which was extracted from underground gas wells in Colorado. But a next-generation detector in development will take it even further.

DarkSide-20k (DS-20k) is currently being constructed using similar components from the present DarkSide experiment. Whereas DS-50 holds about 9.5 gallons (50 kilograms) of low-radioactivity liquid argon, this new detector, DS-20k, will employ new readout technology and will be some 400 times larger, holding 3,800 gallons (20,000 kilograms) of liquid argon. The new experiment is expected to start acquiring data at the Gran Sasso National Laboratory in 2021.
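The quoted figures are self-consistent, as a quick check shows. The liquid argon density used here (~1.4 kg/L) is a standard reference value and not taken from the article:

```python
# Sanity-check the DS-50 / DS-20k figures quoted above.
GAL_TO_L = 3.785  # US gallons to litres

ds50_mass_kg = 50
ds50_volume_l = 9.5 * GAL_TO_L  # ~36 L
density_kg_per_l = ds50_mass_kg / ds50_volume_l  # ~1.39, consistent with liquid argon

mass_scale = 20_000 / 50    # DS-20k holds 400x the argon mass of DS-50
volume_scale = 3_800 / 9.5  # and 400x the volume
```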

This detector, said Hungerford, will push the search for WIMP dark matter to new levels of sensitivity, hopefully finding the elusive WIMP. Or, he said, it could demonstrate that dark matter is not a particle, since this technology has now proven capable of searching for types of dark matter other than WIMPs.

"Previously, if you wanted to look for a specific kind of dark matter, you really had to look for a specific kind of detector. Now with this liquid argon technology, it's really opening the door to using a single technology to search for a handful of different kinds of dark matter," added Renshaw, who recently presented DarkSide findings at the UCLA Dark Matter Conference.

While Hungerford and Renshaw continue their research in Houston, three other members of the UH team are manning the day-to-day operations in Italy. Research associate Nicola Canci manages the DS-50 detector and monitors its performance.

"The cryogenic system keeping the argon in the liquid phase needs to be monitored, and some operations are needed to maintain good detector performance. Electronics are monitored. Signals coming from the detector are improved, if needed, and the quality of data is routinely checked," Canci said.

Credit: 
University of Houston

The coffee cannabis connection

CHICAGO --- It's well known that a morning cup of joe jolts you awake. But scientists have discovered coffee affects your metabolism in dozens of other ways, including your metabolism of steroids and the neurotransmitters typically linked to cannabis, reports a new study from Northwestern Medicine.

In a study of coffee consumption, Northwestern scientists were surprised to discover coffee changed many more metabolites in the blood than previously known. Metabolites are chemicals in the blood that change after we eat and drink or for a variety of other reasons.

The neurotransmitters related to the endocannabinoid system -- the same ones affected by cannabis -- decreased after drinking four to eight cups of coffee in a day. That's the opposite of what occurs after someone uses cannabis. Neurotransmitters are the chemicals that deliver messages between nerve cells.

Cannabinoids are the chemicals that give the cannabis plant its medical and recreational properties. The body also naturally produces endocannabinoids, which mimic cannabinoid activity.

In addition, certain metabolites related to the androsteroid system increased after drinking four to eight cups of coffee in a day, which suggests coffee might facilitate the excretion or elimination of steroids. Because the steroid pathway is a focus for certain diseases including cancers, coffee may have an effect on these diseases as well.

"These are entirely new pathways by which coffee might affect health," said lead author Marilyn Cornelis, assistant professor of preventive medicine at Northwestern University Feinberg School of Medicine. "Now we want to delve deeper and study how these changes affect the body."

Little is known about how coffee directly impacts health. In the new study, Northwestern scientists applied advanced technology that enabled them to measure hundreds of metabolites in human blood samples from a coffee trial for the first time. The study generates new hypotheses about coffee's link to health and new directions for coffee research.

The paper will be published March 15 in the Journal of Internal Medicine.

Drinking lots of coffee for science

In the three-month trial based in Finland, 47 people abstained from coffee for one month, consumed four cups a day for the second month and eight cups a day for the third month. Cornelis and colleagues used advanced profiling techniques to examine more than 800 metabolites in the blood collected after each stage of the study.

Blood metabolites of the endocannabinoid system decreased with coffee consumption, particularly with eight cups per day, the study found.

The endocannabinoid metabolic pathway is an important regulator of our stress response, Cornelis said, and some endocannabinoids decrease in the presence of chronic stress.

"The increased coffee consumption over the two-month span of the trial may have created enough stress to trigger a decrease in metabolites in this system," she said. "It could be our bodies' adaptation to try to get stress levels back to equilibrium."

The endocannabinoid system also regulates a wide range of functions: cognition, blood pressure, immunity, addiction, sleep, appetite, energy and glucose metabolism.

"The endocannabinoid pathways might impact eating behaviors," suggested Cornelis, "the classic case being the link between cannabis use and the munchies."

Coffee also has been linked to aiding weight management and reducing risk of type 2 diabetes.

"This is often thought to be due to caffeine's ability to boost fat metabolism or the glucose-regulating effects of polyphenols (plant-derived chemicals)," Cornelis said. "Our new findings linking coffee to endocannabinoids offer alternative explanations worthy of further study."

It's not known if caffeine or other substances in coffee trigger the change in metabolites.

Although Cornelis studies the effects of coffee, she didn't drink it growing up in Toronto or later living in Boston.

"I didn't like the taste of it," Cornelis said. But when she moved to join Northwestern in 2014, she began to enjoy several cups a day. "Maybe it's the Chicago water," she mused, "but I do have to add cream and sweetener."

Credit: 
Northwestern University

Health chiefs failing to investigate rising deaths in England and Wales, argue experts

Health chiefs are failing to investigate a clear pattern of rising death rates and worsening health outcomes in England and Wales, argue experts in The BMJ today.

Lucinda Hiam at the London School of Hygiene & Tropical Medicine and Danny Dorling at the University of Oxford say weekly mortality figures show 10,375 additional deaths (a rise of 12.4%) in England and Wales in the first seven weeks of 2018 compared with the previous five years.
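A back-of-envelope reading of those figures gives the implied baseline. The baseline and total below are inferred from the quoted numbers, not stated in the article:

```python
# If 10,375 additional deaths represent a 12.4% rise over the previous
# five-year average for the same seven weeks, the baseline and the
# implied 2018 total follow directly.

excess_deaths = 10_375
rise_fraction = 0.124

baseline = excess_deaths / rise_fraction  # ~83,700 deaths in a typical period
total_2018 = baseline + excess_deaths     # ~94,000 deaths in early 2018
```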

This rise cannot be explained by ageing of the population, a flu epidemic, or cold weather - and no official explanation has been forthcoming as to why death rates have continued to be so high relative to previous trends, they write.

However, they note that the first seven weeks of 2018 were unusual in terms of the operation of the NHS.

On 2 January, after "an unprecedented step by NHS officials," thousands of non-urgent operations were cancelled, a clear sign of a system struggling to cope. Many hospitals were already at or beyond their safe working levels, "with high numbers of frail patients stuck on wards for want of social care," and a rise in influenza cases had begun.

However, they then show that influenza can only have accounted for a very small part of the overall rise in mortality in early 2018.

The past five years have been challenging in terms of health outcomes in the UK, they add. For example, spending on health and social care year on year has increased at a much slower rate than in previous years, while outcomes in a large number of indicators have deteriorated, including a very rapid recent increase in the numbers of deaths among mental health patients in care in England and Wales.

They point out that the Office for National Statistics has in the past 12 months reduced its projections of future life expectancy for both men and women in the UK by almost a year each, and, in doing so, has estimated that more than a million lives will now end earlier than expected.

Mortality in infants born into the poorest families in the UK has also risen significantly since 2011.

Hiam and Dorling argue that there remains "a clear lack of consensus" over the reasons for the rise in deaths - and say they and others have already called for an urgent investigation by the Health Select Committee of the House of Commons.

"The latest figures for this year make the case for an investigation stronger and more urgent with each passing day," they conclude.

Credit: 
BMJ Group

Nanostructures created at UCLA could make gene therapies safer, faster & more affordable

image: This image shows an array of nanospears before being released for delivery of genetic information to cells.

Image: 
UCLA Broad Stem Cell Research Center/ACS Nano

UCLA scientists have developed a new method that utilizes microscopic splinter-like structures called "nanospears" for the targeted delivery of biomolecules such as genes straight to patient cells. These magnetically guided nanostructures could enable gene therapies that are safer, faster and more cost-effective.

The research was published in the journal ACS Nano by senior author Paul Weiss, UC Presidential Chair and distinguished professor of chemistry and biochemistry, materials science and engineering, and member of the Eli and Edythe Broad Center of Regenerative Medicine and Stem Cell Research at UCLA.

Gene therapy, the process of adding or replacing missing or defective genes in patient cells, has shown great promise as a treatment for a host of diseases, including hemophilia, muscular dystrophy, immune deficiencies and certain types of cancer.

Current gene therapy approaches rely on modified viruses, external electrical fields or harsh chemicals to penetrate cell membranes and deliver genes straight to patient cells. Each of these methods has its own shortcomings; they can be costly, inefficient or cause undesirable stress and toxicity to cells.

To overcome these barriers, Weiss and Dr. Steven Jonas, a clinical fellow in the UCLA Broad Stem Cell Research Center Training Program, led a research team that designed nanospears composed of silicon, nickel and gold. These nanospears are biodegradable, can be mass-produced inexpensively and efficiently, and, because of their infinitesimal size -- their tips are about 5,000 times smaller than the diameter of a strand of human hair -- they can deliver genetic information with minimal impact on cell viability and metabolism.

Jonas compared the new biomolecule delivery method to real-world delivery technologies now appearing on the horizon.

"Just as we hear about Amazon wanting to deliver packages straight to your house with drones, we're working on a nanoscale equivalent of that to deliver important health care packages straight to your cells," explained Jonas, who is training in the division of pediatric hematology/oncology at UCLA Mattel Children's Hospital. In the near future, Jonas hopes to apply nanotechnologies to deploy cell and gene therapies quickly and widely to the pediatric cancer patients he treats.

The construction of nanospears was inspired by the work of their collaborators, Hsian-Rong Tseng, a professor of molecular and medical pharmacology, and Xiaobin Xu, a postdoctoral fellow in Weiss' interdisciplinary research group. Tseng and Xu are both co-authors of the study.

"Based on Xiaobin's nanomanufacturing work, we knew how to make nanostructures of different shapes in massive numbers using simple fabrication strategies," said Weiss, who is also a member of the California NanoSystems Institute. "Once we had that in hand, we realized we could make precise structures that would be of value in gene therapies."

Weiss and Jonas are not the first to conceive of using guided nanostructures or robotic "nanomotors" to enhance gene therapies, but existing methods have limited precision and require potentially toxic chemicals to propel the structures to their targets.

By coating their nanospears with nickel, Weiss and Jonas eliminated the need for chemical propellants. A magnet can be held near a lab dish containing cells to manipulate the direction, position and rotation of one or many nanospears. In the future, Weiss and Jonas envision that a magnetic field could be applied outside of the human body to guide nanospears remotely within the body to treat genetic diseases.

Weiss and Jonas tested their nanospears as vehicles for a gene that causes cells to produce a green fluorescent protein. About 80 percent of targeted cells exhibited a bright green glow, and 90 percent of those cells survived -- both figures a marked improvement over existing delivery strategies.

Much like gene therapy, many forms of immunotherapy -- a process in which patient-specific immune cells are genetically engineered to recognize and attack cancer cells -- rely on expensive or time-consuming processing methods.

"The biggest barrier right now to getting either a gene therapy or an immunotherapy to patients is the processing time," Jonas said. "New methods to generate these therapies more quickly, effectively and safely are going to accelerate innovation in this research area and bring these therapies to patients sooner, and that's the goal we all have."

Weiss and Jonas have been collaborating with UCLA researchers to optimize the delivery of gene therapy strategies that have long been in the works.

"One of the amazing things about working at UCLA is that for each of the targeted diseases, we collaborate with leading clinicians who already have gene therapies in development," Weiss said. "They have the gene-editing cargo, model cells, animal models and patient cells in place so we are able to optimize our nanosystems on methods that are on the pathway to the clinic."

Credit: 
University of California - Los Angeles Health Sciences