
TSRI scientists zero in on treatment for Charcot-Marie-Tooth disease

Image: Xiang-Lei Yang, PhD, and Zhongying Mo, PhD, led the study at The Scripps Research Institute. (Photo by Cindy Brauer)

LA JOLLA, CA - March 8, 2018 - About 1 in 2,500 people have a degenerative nerve disease called Charcot-Marie-Tooth (CMT). The disease is typically diagnosed in children, who can lose their ability to walk and use their hands for fine motor skills. There is no cure--yet.

Scientists at The Scripps Research Institute (TSRI) have now shown a path to developing treatments for disease subtype CMT2D. As they report in the journal Nature Communications, it may be possible to reverse the disease by using a small molecule to restore normal protein function in the nervous system.

"This study provides guidance for developing therapeutics," says Xiang-Lei Yang, PhD, TSRI professor and senior author of the study.

Importantly, the study reveals how a better understanding of the fundamental causes of CMT can point researchers toward a cure for other subtypes.

Detective work reveals new role for mutant protein

Here's a puzzle: CMT2D is caused by mutations in a protein called GlyRS, which is expressed by cells throughout the body. Yet, the disease only damages the peripheral nervous system--the nerves in hands and feet.

Adding to the mystery, studies show that GlyRS primarily functions in a process called protein synthesis, in which genetic information is translated into proteins. Again, this process happens in all cells, so why would hands and feet be most affected?

"Our everyday research is like a detective role," says Zhongying Mo, PhD, senior research associate at TSRI and first author of the study.

The new study offers the answer: GlyRS has a role outside protein synthesis.

The researchers discovered that mutations in GlyRS trigger unusual interactions between GlyRS and a protein called HDAC6. Normally, HDAC6 would regulate a process called acetylation, which readies a protein called α-tubulin for its role in forming microtubules. Yang compares microtubules to a highway. Thanks to α-tubulin, signaling proteins and other important molecules can zip along, sending signals from your tiptoes to your brain.

But in CMT, the aberrant protein interactions with HDAC6 prevent proper α-tubulin acetylation, turning that highway into a dirt road. Nervous system signals can't run smoothly, and the longer the nerve, the rougher the road. Because our longest nerves reach our feet and hands, this finding explains why CMT2D is most severe in the peripheral nervous system--even though the mutant proteins are everywhere in the body.

Further experiments in a mouse model of CMT2D showed that researchers could bring back proper nerve function by injecting the mice with a small molecule that blocks HDAC6 from interfering in α-tubulin acetylation. Although this particular small molecule would not be safe for humans to take, Yang and Mo believe a similar molecule may work as a future CMT2D therapy.

"It's exciting when you can accumulate all the evidence and point to a specific target," says Mo.

Targeting the root cause of CMT

Yang and Mo are excited to find this potential treatment target, but their ultimate goal is to treat the root cause of all types of CMT. To do this, they need to do more studies like this one, which reveal the fundamental pathology of the disease.

From patient to patient, different mutations can cause either mild or very severe symptoms. Some types of CMT are diagnosed in infancy, while others don't appear until adolescence. "That variability is striking," Yang says.

Now that the researchers know about this GlyRS interaction with HDAC6, they would like to investigate where else mutant proteins in CMT are causing problems. In fact, an earlier study from the Yang lab identified another problem caused by the mutant proteins, one that appears to interfere with a nerve maintenance signal. Yang hopes future studies can solve these mysteries and even show a way to target mutant GlyRS itself.

"Our understanding of the disease is ever-increasing," says Yang.

Credit: 
Scripps Research Institute

Report: Big tobacco is targeting the world's most vulnerable to increase profits

The sixth edition of The Tobacco Atlas and its companion website TobaccoAtlas.org find that the tobacco industry is increasingly targeting vulnerable populations in emerging markets, such as Africa, Asia, and the Middle East, where people are not protected by strong tobacco control regulations. The report was released at the 17th World Congress on Tobacco OR Health in Cape Town, South Africa.

The Atlas, which is co-authored by American Cancer Society (ACS) and Vital Strategies, graphically details the scale of the tobacco epidemic around the globe. It shows where progress has been made in tobacco control, and describes the latest products and tactics being deployed by the tobacco industry to grow its profits and delay or derail tobacco control efforts. In response to an evolving tobacco control landscape, the Sixth Edition includes new chapters on regulating novel products, partnerships, tobacco industry tactics and countering the industry.

In 2016 alone, tobacco use caused over 7.1 million deaths worldwide (5.1 million in men, 2.0 million in women). Most of these deaths were attributable to cigarette smoking, while 884,000 were related to secondhand smoke. The increase in tobacco-related disease and death has been outpaced by the increase in industry profits. The combined profits of the world's biggest tobacco companies exceeded US $62.27 billion in 2015, the last year on record for all the major companies. This is equivalent to US $9,730 for the death of each smoker, an increase of 39% since the last Atlas was published, when the figure stood at US$7,000.
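For readers who want to check the per-smoker arithmetic, here is a minimal back-of-the-envelope sketch in Python. The profit and per-death figures come from the text above; the death count used as the denominator is not stated in this excerpt, so the value derived below is only an illustration of how the headline numbers relate.

# Back-of-the-envelope check of the Tobacco Atlas figures quoted above.
# The implied death count is derived purely for illustration; the Atlas's
# actual denominator is not given in this excerpt.
total_profit_usd = 62.27e9        # combined 2015 profits of the biggest tobacco companies
profit_per_death_now = 9730       # US$ per smoker death, sixth edition
profit_per_death_before = 7000    # US$ per smoker death, previous edition

increase = profit_per_death_now / profit_per_death_before - 1
print(f"Rise in per-death profit: {increase:.0%}")   # ~39%, matching the report

implied_deaths = total_profit_usd / profit_per_death_now
print(f"Implied deaths behind the per-death figure: {implied_deaths / 1e6:.1f} million")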

"Every death from tobacco is preventable, and every government has the power reduce the human and economic toll of the tobacco epidemic," said Jeffrey Drope, PhD, co-editor and author of The Atlas and Vice President, Economic and Health Policy Research at the American Cancer Society. "It starts by resisting the influence of the industry and implementing proven tobacco control policies. The Atlas shows that progress is possible in every region of the world. African countries in particular are at a critical point - both because they are targets of the industry but also because many have opportunity to strengthen policies and act before smoking is at epidemic levels."

"Tobacco causes harm at every stage of its life cycle, from cultivation to disposal," said Dr. Neil Schluger, Vital Strategies' Senior Advisor for Science and co-editor and author of The Atlas. "It is linked to an ever-increasing list of diseases, burdens health systems, and exacerbates poverty, especially when a breadwinner falls ill and dies from tobacco use. At a conservative estimate, there are more than 7 million tobacco-related deaths and global economic costs of two trillion dollars (PPP) each year, not including costs such as those caused by second-hand smoke and the environmental and health damages of tobacco farming. The only way to avert this harm is for all governments to vigorously implement the Framework Convention on Tobacco Control and to enforce the proven strategies that reduce tobacco use."

Tobacco use and exposure to secondhand smoke cost the global economy more than two trillion dollars (PPP) every year - equivalent to almost 2% of the world's total economic output. More than 1.1 billion people are current smokers, while 360 million people use smokeless tobacco. Low- and middle-income countries account for over 80% of tobacco users and tobacco-related deaths, placing an increased share of tobacco-related costs on those who can least afford it. A growing proportion of that burden will fall on countries across Africa in the future, if governments do not implement tobacco control policies now to prevent it.

Africa is at a tipping point

The Sixth Edition of The Tobacco Atlas reveals that the tobacco industry deliberately targets countries that lack tobacco control laws and exploits governments, farmers and vulnerable populations across Africa. In Sub-Saharan Africa alone, consumption increased by 52% between 1980 and 2016 (from 164 billion to 250 billion sticks). This is being driven by population growth and aggressive tobacco marketing in countries like Lesotho, where prevalence is estimated to have increased from 15% in 2004 to 54% in 2015. Economic growth has increased consumers' ability to afford tobacco products and there is a lack of tobacco control interventions to deter tobacco use. Furthermore, in countries like Ethiopia, Nigeria, and Senegal, smoking is now more common among youth than adults - potentially increasing the future health and economic burden of tobacco in these countries.

Yet Africa has also seen real successes in tobacco control recently, according to The Tobacco Atlas. Ghana and Madagascar have introduced comprehensive bans on tobacco advertising, promotion and sponsorship. Burkina Faso, Djibouti, Kenya, and Madagascar have implemented graphic warnings on cigarettes, an important intervention in countries where multiple dialects are spoken and many citizens have low levels of literacy. South Africa has implemented consecutive tobacco tax increases to deter consumption, and Kenya has implemented a highly effective track-and-trace system to reduce illicit trade. These countries are setting an example to others across the world.

Other examples of effective tobacco control policies

In spite of the tobacco industry's efforts to impede progress, global cigarette consumption and tobacco use prevalence have declined recently thanks to an overall increase in the adoption of proven and innovative tobacco control measures. Tobacco taxes alone could deliver a 30% relative reduction in smoking prevalence by 2025. This would save 38 million lives and $16.9 trillion, just from former smokers becoming healthier.

In 2013, the Philippines implemented one of the largest tobacco tax increases seen in a low- or middle-income country, leading more than 1 million smokers to quit. Kenya implemented a successful track and trace system for tobacco products, which helped to stem the illicit market.

Turkey's comprehensive tobacco control strategy reduced smoking prevalence from 39.3% in 2000 to 25.9% in 2015.

Analysis by Australia's government found that plain packaging alone resulted in 108,228 fewer smokers between December 2012 and September 2015.

Brazil has banned all tobacco additives such as flavors used to attract children. WHO predicts that there will be 3 million fewer smokers in Brazil between 2015 and 2025.

"We are proud that our two organizations have worked together for almost two decades to engender a healthier world," said Dr Otis Brawley, Chief Medical Officer, American Cancer Society. "The data in The Tobacco Atlas depict a sobering look at the daunting magnitude of the epidemic, but also show considerable progress in places where governments take up solutions that are proven to work. For the first time, more than two billion people are protected by at least one WHO MPOWER measure, but very few countries have taken up every measure. The data are clear that measures like raising taxes and enacting 100% smoke-free air laws indisputably work, but too many governments have not yet committed to adopting them. Our life-saving opportunity lies in that gap."

"The ultimate path to improved tobacco control is political will," said José Luis Castro, President and CEO, Vital Strategies. "Strong tobacco control policies deliver a significant return on investment, and The Tobacco Atlas offers the best and most recent data on the tobacco epidemic as a resource for governments to pursue effective strategies. The answer does not lie with the industry: as The Atlas makes clear, there is a complete disconnect between the tobacco industry's claims about harm reduction and its actual work to grow tobacco use among vulnerable populations. Governments must be accountable to their citizens in reducing tobacco use and improving health. They must prepare to rebuff the tobacco industry's challenges to legislation, seek the appropriate assistance to build capacity, and be transparent about the industry's inevitable approaches. We urge governments, advocates, organizations and people who care about health, the environment and development to stand together to reduce this man-made epidemic in pursuit of a healthier planet."

Credit: 
American Cancer Society

Should doctors recommend acupuncture for pain?

Some see acupuncture as a safe alternative to drugs, while others argue there's no convincing evidence of clinical benefit and potential for harm. So should doctors recommend acupuncture for pain? Experts debate the issue in The BMJ today.

Acupuncture is a safe alternative to drugs for chronic pain, argues Mike Cummings, Medical Director of the British Medical Acupuncture Society and Associate Editor of the journal Acupuncture in Medicine, published by BMJ.

In the US, acupuncture is recommended for back pain, but in the UK, it is no longer included in the National Institute for Health and Care Excellence's (NICE) guidelines for low back pain, although it remains in the NICE guideline on headaches, he explains.

The biggest and most robust dataset for acupuncture in chronic pain comes from a review of data from 20,827 patients, showing moderate benefit for acupuncture compared with usual care, but smaller effects compared with sham acupuncture, he writes. Importantly, it also shows that 85% of the effect of acupuncture is maintained at one year.

Further evidence that sham acupuncture is linked to better quality of life compared with usual care for patients with chronic pain "should urge a more flexible approach from guideline developers," adds Cummings.

He acknowledges that acupuncture "seems to incur more staffing and infrastructure costs than drug based interventions, and in an era of budget restriction, cutting services is a popular short term fix." But he argues that group clinics in the community "can provide more treatment at much lower cost."

Another challenge is the lack of commercial sector interest in acupuncture, he adds, meaning that it does not benefit from the lobbying seen for patented drugs and devices.

In summary, he says the pragmatic view sees acupuncture as a relatively safe and moderately effective intervention for a wide range of common chronic pain conditions.

"For those patients who choose it and who respond well, it considerably improves health related quality of life, and it has much lower long term risk for them than non-steroidal anti-inflammatory drugs. It may be especially useful for chronic musculoskeletal pain and osteoarthritis in elderly patients, who are at particularly high risk from adverse drug reactions," he concludes.

In a linked patient commentary, Kumari Manickasamy says acupuncture gave her hope when she had exhausted all avenues offered by conventional medicine for severe pelvic girdle pain during pregnancy.

She points out that there are few safe options for pain relief in pregnancy, and says women with pelvic girdle pain "have to strike a difficult balance between controlling their pain and risking harm to their child."

But Professors Asbjørn Hróbjartsson at the University of Southern Denmark and Edzard Ernst at the University of Exeter argue that doctors should not recommend acupuncture for pain "because there is insufficient evidence that it is clinically worthwhile."

Overviews of clinical pain trials comparing acupuncture with placebo find a small, clinically irrelevant effect that "may be due to bias rather than acupuncture," they write.

Acupuncture enthusiasts often emphasise "pragmatic" comparisons between acupuncture and usual care. However, they argue that "unblinded pragmatic trials cannot differentiate possible true effects of acupuncture from placebo effects and bias." To inform us reliably of any causal relation between acupuncture and effect, "we need to focus on adequately blinded 'explanatory' acupuncture trials," they say.

"If acupuncture is endorsed as a theatrical placebo we should be discussing the ethics of placebo interventions, not the elusive effect of acupuncture," they add.

They also point to harms of acupuncture as well as costs to the NHS, which they say may amount to £25m (€28m; $34m) a year. "Health services funded by taxpayers should use their limited resources for interventions that have been proved effective."

"After decades of research and hundreds of acupuncture pain trials, including thousands of patients, we still have no clear mechanism of action, insufficient evidence for clinically worthwhile benefit, and possible harms. Therefore, doctors should not recommend acupuncture for pain," they conclude.

Credit: 
BMJ Group

Recovery from spinal cord injuries can be predicted

Image: How well patients recover from a spinal cord injury can be reliably predicted. (Image: Marc Bolliger)

A trauma to the spinal cord quickly leads to a progressive loss of nerve tissue. This not only affects the injured area, but over time also affects other parts of the spinal cord and even the brain. These neurodegenerative changes can be explored in detail using magnetic resonance imaging. An international team of researchers headed up by Patrick Freund from the Spinal Cord Injury Center of the University of Zurich and the Balgrist University Hospital has now for the first time investigated the extent and progression of microstructural changes over the first two years after a spinal cord injury.

The smaller the initial nerve loss, the better the long-term recovery

In their study, the scientists examined 15 patients who had suffered acute traumatic injuries to the spinal cord, as well as 18 healthy study participants, at 2, 6, 12, and 24 months after the injury. In the brain as well as the spinal cord, they determined the anatomical extent of neurodegeneration, the loss of myelin (the insulating layer surrounding nerve fibers), as well as the accumulation of iron in the nerve tissue as a result of degeneration and inflammation. It emerged that there was a direct link between the recovery levels of patients after two years and the extent of neurodegenerative change within the first six months after injury. "The smaller the overall loss of nerve tissue across the neuroaxis at the beginning, the better the patients' long-term clinical recovery," summarizes Patrick Freund.

Predicting long-term recovery by measuring early changes

What the researchers found surprising was that recovery was steepest within the first six months, yet the neurodegenerative changes were also greatest within that same period and showed no signs of slowing in the spinal cord and brain over the full two years. This indicates a fierce competition between compensatory and neurodegenerative changes early after injury, a battle that over time seems to be decided in favor of neurodegeneration. Nevertheless, the magnitude of early microstructural changes is predictive of the long-term recovery of patients suffering from a spinal cord injury. Crucially, non-invasive, high-resolution neuroimaging provides a means to predict recovery trajectories and to distinguish between neurodegeneration caused by the spinal cord injury itself and beneficial changes resulting from therapy. "We now have a tool to reliably predict recovery and determine the effects of treatments and rehabilitation measures as opposed to spontaneous neurodegeneration in humans," adds neuroimaging specialist Freund. "Clinical studies can thus be carried out more efficiently and cost-effectively in the future."

Clinical studies into the influence of arm and leg exercises planned

The patients who took part in the study will be examined again after five years using the same method. The scientists want to determine whether the neurodegenerative changes will have ceased by then or whether they will still be ongoing. Patrick Freund and his team are also planning training studies that aim to show whether the high-intensity exercising of arm and leg functions helps to slow down or stop the loss of nerve tissue.

Credit: 
University of Zurich

US cancer treatment guidelines 'often based on weak evidence'

Cancer treatment guidelines produced by the US National Comprehensive Cancer Network (NCCN) are often based on low quality evidence or no evidence at all, finds a study published by The BMJ today.

The researchers, led by Dr Vinay Prasad at Oregon Health & Science University, say their findings "raise concern that the NCCN justifies the coverage of costly, toxic cancer drugs based on weak evidence."

NCCN guidelines are developed by a panel of cancer experts who make recommendations based on the best available evidence.

These recommendations are used by US private health insurers and social insurance schemes to make coverage decisions, and guide global cancer practice, but it is not clear how the evidence is gathered or reviewed.

In the US, the Food and Drug Administration (FDA) approves all new drugs and grants new indications for drugs already on the market. The NCCN makes recommendations both within and outside of FDA approvals, but patterns of NCCN recommendations beyond FDA approvals have not been analysed.

So Dr Prasad and his team compared FDA approvals of cancer drugs with NCCN recommendations in March 2016 for a contemporary sample of drugs. When the NCCN made recommendations beyond the FDA's approvals, the evidence used to support those recommendations was evaluated.

A total of 47 new cancer drugs were approved by the FDA for 69 indications over the study period, whereas the NCCN recommended these drugs for 113 indications, of which 69 (62%) overlapped with the 69 FDA approved indications and 44 (39%) were additional recommendations.

Only 10 (23%) of these additional recommendations were based on evidence from randomised controlled trials, and seven (16%) were based on evidence from phase III studies. Most relied on small, uncontrolled studies or case reports, or offered no evidence at all.

And almost two years after their analysis, the researchers found that only six (14%) of the additional recommendations by the NCCN had received FDA approval.
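As a quick sanity check of how those percentages are derived (assuming, as the text implies, that they are shares of the 44 additional recommendations), a minimal sketch:

# Shares of the NCCN's 44 additional recommendations, as quoted above.
# The denominator of 44 is an assumption based on the surrounding text.
additional = 44
evidence_counts = {
    "randomised controlled trials": 10,
    "phase III studies": 7,
    "subsequent FDA approval": 6,
}
for label, n in evidence_counts.items():
    print(f"{label}: {n}/{additional} = {n / additional:.0%}")
# Prints roughly 23%, 16% and 14%, matching the figures in the article.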

"The NCCN frequently makes additional recommendations for the use of drugs beyond approvals of the FDA and when it does so, it often fails to cite evidence or relies on low levels of evidence," write the authors.

"Few of these additional recommendations subsequently lead to drug approval," they add. "If there is additional evidence in support of these recommendations the NCCN should improve its process and cite all evidence used."

This is an observational study, so no firm conclusions can be drawn about cause and effect, and the researchers point to some limitations. However, they say, given that NCCN endorsement is linked to reimbursement by many commercial insurers and social insurance schemes, "our results suggest that payers may be covering cancer drugs with varying and scientifically less robust justification."

Finally, they point out that 86% of NCCN guidelines members have financial ties to the pharmaceutical industry, with 84% receiving personal payments and 47% receiving research payments.

"The presence of conflicted physicians has been shown to lead to more optimistic conclusions regarding disputed practices," they say. "Thus our findings raise concern about the nature of the recommendations offered by these individuals."

Credit: 
BMJ Group

First look at Jupiter's poles shows strange geometric arrays of storms

Image: Five massive storms form a pentagon around a storm at the center of Jupiter's south pole--the first look we've ever gotten at the gas giant's poles, and a scientific mystery. (Image: NASA/SWRI/JPL/ASI/INAF/IAPS)

Jupiter's got no sway. The biggest planet in the solar system has almost no tilt as it moves, so its poles have never been visible from Earth.

But in the past two years, with NASA's Juno spacecraft, scientists have gotten a good look at the top and bottom of the planet for the first time. What they found astounded them: bizarre geometric arrangements of storms, each arrayed around one cyclone over the north and south poles--unlike any storm formation seen in the universe.

The study, authored by scientists from an international group of institutions including the University of Chicago, is published in the March 8 issue of Nature as part of a set of four papers dedicated to new observations from the Juno spacecraft.

Juno launched in 2011 with the ambitious mission of finally seeing beneath the dense clouds covering Jupiter. On July 4, 2016, it finally entered orbit around the planet. Since then it has been circling Jupiter, taking pictures and measuring the planet's profile in infrared, microwave, ultraviolet, gravity and magnetism--and answering questions scientists have had about Jupiter for decades.

One of these was the question of what lay at its elusive poles. When scientists got the first images, they were stunned. At the north pole, eight storms surrounded one storm at the center. At the south pole, it was the same arrangement, only with five storms. But the numbers stayed oddly constant; the storms weren't drifting and merging, as our current understanding of the science suggested they should.

"They are extraordinarily stable arrangements of such chaotic elements," said Morgan O'Neill, a University of Chicago postdoctoral scholar and a co-author on the paper. "We'd never seen anything like it."

The geometry rang a faint bell in O'Neill's mind, though. She found it in the library of strange physical phenomena only observed under special conditions in the laboratory. In the 1990s, scientists observed similar behavior when they used electrons to simulate a frictionless, turbulent 2-D fluid as it cooled. Instead of merging, which tends to happen in such 2-D flows, small vortices would clump together and form equally spaced arrays, or "vortex crystals," around a center.

It's not yet clear whether the same physics underlies both these behaviors, O'Neill said, but it is tantalizing. "The next step is: Can you create a model that builds a virtual planet and predicts these flows?" she said. With further studies, they can understand the forces at play in the swirling storms.

A greater understanding of the physics behind the flows and dynamics of storms is helpful on every planet; though O'Neill did her PhD on the dynamics of cyclones on gas giants (including a prediction that Jupiter's poles would not look like Saturn's: "I got it...partially right," she said), she now uses similar storm modeling to study hurricanes on Earth.

Credit: 
University of Chicago

Why people experience seasonal skin changes

A new British Journal of Dermatology study provides information that may help explain why many people experience eczema and dry skin in the winter.

In tests of skin on 80 adults, the levels of breakdown products of filaggrin--a protein that helps maintain the skin's barrier function--changed between winter and summer on the cheeks and hands. Changes were also seen in the texture of corneocytes, cells in the outermost part of the skin's epidermis.

"This study shows clearly that the skin barrier is affected by climatic and seasonal changes. Both children and adults suffer from red cheeks in the winter in northern latitudes and some may even develop more permanent skin conditions such as atopic eczema and rosacea," said senior author Dr. Jacob Thyssen, of the University of Copenhagen, in Denmark. "By the use of high magnification we show that the skin cells suffer from shrinkage and therefore change their surface. The clinical message to individuals are that they should protect their skin with emollients in the winter and sunscreen in the summer.

Nina Goad of the British Association of Dermatologists said: "We already know that humidity can affect the texture of the skin and impact on skin disorders like eczema, and humidity fluctuates according to season. In the winter, rapidly changing temperatures, from heated indoors to cold outdoors environments, can affect the capillaries, and prolonged exposure to wet weather can strip the skin's barrier function. This latest study is interesting as it sheds new light on further reasons for seasonal skin changes, at a cellular level. Given that skin problems are the most common reason for people to visit their doctor, any research that improves our understanding of skin disorders and how best to manage them is always a positive step."

Credit: 
Wiley

Warm showers and ball exercises may help women during childbirth

A new International Journal of Nursing Practice study demonstrates that during childbirth, women may benefit from warm showers, perineal exercises with a ball, or the combination of both strategies. The study found positive effects of these strategies in terms of lessening pain, anxiety, and stress.

The study was a randomized controlled trial conducted with 128 women in labor who were admitted for hospital birth in São Paulo, Brazil, from June 2013 to February 2014.

"When we evaluated pain and anxiety using a visual analog scale, and also evaluated the salivary release of stress hormones before and after interventions of warm showers and perineal exercises with a ball, we found greater tolerance regarding pain, reduction of anxiety, a decrease in the release stress hormones, and an increase in well-being hormones," said lead author Dr. Angelita José Henrique, of the Federal University of São Paulo. "Our results indicate that these interventions should be encouraged because they are safe practices, low-cost, and are directly related to comfort, and they should be used as an adjuvant to medications and anesthesia during childbirth."

Credit: 
Wiley

When fee-pressured audit offices focus on non-audit services, financial statements suffer, study shows

Image: Erik Beardsley, assistant professor of accountancy in Notre Dame's Mendoza College of Business. (Photo: Matt Cashore/University of Notre Dame)

Firms hire auditors to create independent assessments of their financial statements, providing assurance to investors and outside parties that they are free from material misstatement.

However -- especially since the economic downturn -- companies pressure auditors to lower their fees as a way to reduce costs. Auditors, in turn, place greater emphasis on more-profitable non-audit services, such as consulting, which can negatively impact audit quality, according to new research from the University of Notre Dame.

"How do Audit Offices Respond to Audit Fee Pressure? Evidence of Increased Focus on Non-audit Services and their Impact on Audit Quality" by Erik Beardsley, assistant professor of accountancy in Notre Dame's Mendoza College of Business, along with Dennis Lassila of Texas A&M University and Thomas Omer from the University of Nebraska-Lincoln, is forthcoming in Contemporary Accounting Research.

The team examined audit fees, non-audit fees and client misstatement rates of 561 audit offices from 2004 to 2013.

"Audit offices experiencing audit fee pressure appear to focus more on providing non-audit services in relation to their total fees," Beardsley says, "and we find that when they do that, audit quality suffers.

"If financial statements are misstated, then later re-stated," he says, "it means the auditor didn't catch the misstatement before the financial statements were presented, meaning audit quality was low."

Beardsley says firms and investors should be wary of auditors trying to sell more non-audit services, which has been an ongoing concern for the Public Company Accounting Oversight Board. The board has focused on whether non-audit services impair auditor independence and whether they have an effect on audit quality.

The Sarbanes-Oxley Act of 2002, enacted in response to a series of high-profile financial scandals including Enron and WorldCom, set requirements for all U.S. public companies in an effort to improve corporate governance and accountability. Among these requirements were restrictions on the type of non-audit services an auditor may provide.

Beardsley notes, "These restrictions put in place in the early 2000s certainly decreased the amount of non-audit services that auditors provide. However, some firms seem to be focusing on them again, and our study suggests that this could be due in part to the reduced profitability of audit engagements."

Credit: 
University of Notre Dame

ALMA reveals inner web of stellar nursery

Image: Part of the Orion Nebula, combining millimetre-wavelength data from ALMA and the IRAM 30-metre telescope (red) with an infrared view from the HAWK-I instrument on ESO's Very Large Telescope (blue). The group of bright blue-white stars is the Trapezium Cluster. (Image: ESO/H. Drass/ALMA (ESO/NAOJ/NRAO)/A. Hacar)

This spectacular and unusual image shows part of the famous Orion Nebula, a star formation region lying about 1350 light-years from Earth. It combines a mosaic of millimetre-wavelength images from the Atacama Large Millimeter/submillimeter Array (ALMA) and the IRAM 30-metre telescope, shown in red, with a more familiar infrared view from the HAWK-I instrument on ESO's Very Large Telescope, shown in blue. The group of bright blue-white stars at the upper-left is the Trapezium Cluster -- made up of hot young stars that are only a few million years old.

The wispy, fibre-like structures seen in this large image are long filaments of cold gas, only visible to telescopes working in the millimetre wavelength range. They are invisible at both optical and infrared wavelengths, making ALMA one of the few instruments with which astronomers can study them. This gas gives rise to newborn stars -- it gradually collapses under the force of its own gravity until it is sufficiently compressed to form a protostar -- the precursor to a star.

The scientists who gathered the data from which this image was created were studying these filaments to learn more about their structure and make-up. They used ALMA to look for signatures of diazenylium gas, which makes up part of these structures. In doing so, the team identified a network of 55 filaments.

The Orion Nebula is the nearest region of massive star formation to Earth, and is therefore studied in great detail by astronomers seeking to better understand how stars form and evolve in their first few million years. ESO's telescopes have observed this interesting region multiple times, yielding a series of previous discoveries.

This image combines a total of 296 separate individual datasets from the ALMA and IRAM telescopes, making it one of the largest high-resolution mosaics of a star formation region produced so far at millimetre wavelengths [1].

Credit: 
ESO

Half of Scots 'not confident' in giving CPR, study finds

Image: Dr. Gareth Clegg. (Photo: University of Edinburgh)

Half of the Scottish adult population do not feel confident administering CPR - and more than a fifth do not know when it is required, according to a new study led by the University of Stirling.

The study, which has been welcomed by the Scottish Government, is the first to examine the readiness and willingness of Scots to carry out cardiopulmonary resuscitation (CPR). Experts believe the work could help to explain why Scotland's survival rates from cardiac arrest are poor compared with those of other countries.

Fiona Dobbie, a Research Fellow at the Institute of Social Marketing, part of the Faculty of Health Sciences and Sport, led the work, which also involved the Resuscitation Research Group at the University of Edinburgh and the Scottish Government.

"The findings of our study will help develop policy and future interventions to improve the rate of bystander CPR," Ms Dobbie said. "From a policy perspective, there is a need for more tailored and targeted interventions to encourage CPR training, which has been linked with improving confidence in CPR. As confidence increases, so does the likelihood of providing emergency aid in an out-of-hospital cardiac arrest.

"Our findings suggest that priority groups are people who are not working, in a lower social grade and the elderly."

In 2015, Scotland's Strategy for Out-of-Hospital Cardiac Arrest was launched with the aim of equipping 500,000 people with CPR skills in a bid to save an additional 300 lives per year following an out-of-hospital cardiac arrest.

The new study, which informs the Strategy, comprised an Ipsos MORI survey that canvassed 1,027 adults in Scotland.

Fifty per cent said they would not feel confident administering bystander CPR, with a further 21 per cent admitting that they would not know if it was required.

Twenty-two per cent would not be comfortable giving CPR, for fear of causing an injury to the victim, while 19 per cent would be reluctant due to their lack of skills. The same proportion would be put off by visible vomit or blood and 16 per cent by indications that the ill person is a drug user, the poll found.

The team found that confidence was affected by age, social grade and employment status; the older the person, the less likely they were to be trained in CPR, to show willingness to be trained, or to feel confident administering CPR. Fifty-eight per cent of 35 to 44 year olds said that they would like to be trained in CPR, compared to just 37 per cent of 55 to 64 year olds and 23 per cent of those aged 65 and over.

Respondents with professional, managerial and non-manual occupations were more likely to have been trained in CPR than those in manual, unskilled occupations and the long-term unemployed.

Dr Gareth Clegg, a Senior Clinical Lecturer at the University of Edinburgh, an Honorary Consultant in Emergency Medicine at Edinburgh Royal Infirmary, and an Associate Medical Director of the Scottish Ambulance Service, co-authored the paper.

He said: "Survival from cardiac arrest in Scotland is a poor relative to the best performing centres in the world. One of the most important determinants of survival is bystander CPR, which more than doubles chances of survival.

"We already know that people in the most deprived areas in Scotland are much more likely to have a cardiac arrest, at a younger age, and less likely to survive than those in affluent areas.

"This work is important because it suggests that those living in communities which are most likely to need CPR are least ready to carry it out."

Dr Clegg added: "Using the findings from this research, we hope to develop ways to teach hundreds of thousands of people in Scotland how to perform CPR - and save hundreds more lives each year."

Minister for Public Health Aileen Campbell said: "That half of adults in Scotland in this survey were already confident giving CPR gives us a firm foundation to build on, and to date more than 200,000 people across the country have learned CPR since 2015.

"This is great progress towards our 500,000 target by 2020 and a testimony to the work put in by our Save A Life For Scotland partnership, who have brought these lifesaving skills to more people across the country.

"Any CPR is better than no CPR and we know it's the main way we can increase survival after Out of Hospital Cardiac Arrest."

Credit: 
University of Stirling

Combating childhood obesity by preventing 'fatty liver' in fetus

Image: Fat content in fetal livers at the end of pregnancy from (A) the fetus of a normal-weight mother and (B) the fetus of an obese mother. The red stain represents fat. (Image: Edward Dick)

New research published in The Journal of Physiology indicates that maternal obesity, together with exposure to a high-fat, high-sugar diet during pregnancy, produces a "fatty liver" in the fetus, potentially predisposing children to obesity and to metabolic and cardiovascular disorders later in life. The research aims to understand the cellular mechanisms involved in laying down fat in the liver of a fetus - leading to a "fatty liver". This knowledge is essential to developing strategies to combat childhood obesity.

Throughout the world over fifty percent of women of reproductive age are either overweight or obese. Obesity in pregnant women, combined with intake of high-fat, high-sugar diets during pregnancy predisposes their children to obesity and other chronic metabolic and cardiovascular diseases. These complications probably account for a significant proportion of the current epidemic of childhood obesity.

This research aimed to shed light on the mechanisms underlying this link between obese pregnant mothers and obesity in their children, called developmental programming, which is currently poorly understood. This study found that when a fetus develops in an obese pregnant mother, fat accumulates in its liver and many metabolic pathways are disturbed. This is the first study to report that recently discovered microRNAs (small RNA molecules that modify protein synthesis) play a role in this increased deposition of fat in the liver in our closest relatives, the nonhuman primates. The observations of this study may explain why children of obese mothers live shorter lives than the offspring of normal-weight mothers.

While there is always some fat in the liver, when liver fat rises above normal an individual is said to have a "fatty liver". As the amount of liver fat increases, troubles begin. If this is dealt with early, it can be reversed; however, if fat deposition persists, the damage can lead to liver scarring and even liver cancer later in life. It is therefore important to understand the cellular mechanisms involved in laying down fat early in life, while the fetus is developing.

The research was a collaboration between the University of Wyoming in Laramie, Wyoming; the Texas Biomedical Research Institute in San Antonio, Texas; Wake Forest Baptist Medical Center; Indiana University School of Medicine; and the University of Texas Health Science Center. In order to determine which genes were changed in the fetal liver of obese pregnant monkeys and to identify which microRNAs regulate these genes, genomic and epigenomic methods were used. The altered cellular signalling pathways were identified using bioinformatics approaches, and microscopic studies were conducted to quantify the amount of stored fat and sugar present in the liver cells, as well as assessing their shape, which is an indicator of liver cell health.

Although the researchers were able to see significant changes in many cellular functional pathways, the study may have been limited in its ability to detect changes that have not yet emerged at such an early stage of life. A major principle of developmental programming is that significant changes may lie dormant, only to emerge under stress or when hormones begin to change with puberty or aging.

The researchers plan to now investigate metabolic and cardiovascular health of monkey offspring of obese mothers, including liver function at regular intervals across the life course to follow the progress of these fetal changes. This will allow them to assess whether unwanted consequences of maternal obesity can pass across generations from mother to daughter to grandchildren. They also plan to identify interventions that can reverse these unwanted changes using the same technologies employed in this study.

Peter Nathanielsz, one of the lead investigators on the projects commented on the significance of the results: "This research is important as throughout the world over fifty percent of women of reproductive age are overweight or obese. Maternal obesity, combined with high fat, high sugar diets, makes it more likely that children will suffer from liver disease and face health problems such as obesity and heart disease later in life.

"It wasn't until we saw the microscope slides for the staining of liver sections showing very high amounts of lipid in fetuses of obese mothers that we realized the dramatic impact of maternal obesity at such an early developmental time point. Histological analyses of these livers showing the condition steatosis, underlined the detrimental impact of maternal obesity on the developing fetus."

Credit: 
The Physiological Society

Seeing is believing -- precision atom qubits achieve major quantum computing milestone

Image: An artist's impression of two qubits -- one made of two phosphorus atoms and one made of a single phosphorus atom -- placed 16 nanometres apart in a silicon chip. UNSW scientists were able to control the interactions between the two qubits so the quantum spins of their electrons became correlated. When the spin of one electron is pointing up, the other points down. (Image: UNSW)

The unique Australian approach of creating quantum bits from precisely positioned individual atoms in silicon is reaping major rewards, with UNSW Sydney-led scientists showing for the first time that they can make two of these atom qubits "talk" to each other.

The team - led by UNSW Professor Michelle Simmons, Director of the Centre of Excellence for Quantum Computation and Communication Technology, or CQC2T - is the only group in the world that has the ability to see the exact position of their qubits in the solid state.

Simmons' team creates the atom qubits by precisely positioning and encapsulating individual phosphorus atoms within a silicon chip. Information is stored on the quantum spin of a single phosphorus electron.

The team's latest advance - the first observation of controllable interactions between two of these qubits - is published in the journal Nature Communications. It follows two other recent breakthroughs using this unique approach to building a quantum computer.

By optimising their nano-manufacturing process, Simmons' team has also recently created quantum circuitry with the lowest recorded electrical noise of any semiconductor device.

And they have created an electron spin qubit with the longest lifetime ever reported in a nano-electric device - 30 seconds.

"The combined results from these three research papers confirm the extremely promising prospects for building multi-qubit systems using our atom qubits," says Simmons.

2018 Australian of the Year inspired by Richard Feynman

Simmons, who was named 2018 Australian of the Year in January for her pioneering quantum computing research, says her team's ground-breaking work is inspired by the late physicist Richard Feynman.

"Feynman said: 'What I cannot create, I do not understand'. We are enacting that strategy systematically, from the ground up, atom by atom," says Simmons.

"In placing our phosphorus atoms in the silicon to make a qubit, we have demonstrated that we can use a scanning probe to directly measure the atom's wave function, which tells us its exact physical location in the chip. We are the only group in the world who can actually see where our qubits are.

"Our competitive advantage is that we can put our high-quality qubit where we want it in the chip, see what we've made, and then measure how it behaves. We can add another qubit nearby and see how the two wave functions interact. And then we can start to generate replicas of the devices we have created," she says.

For the new study, the team placed two qubits - one made of two phosphorus atoms and one made of a single phosphorus atom - 16 nanometres apart in a silicon chip.

"Using electrodes that were patterned onto the chip with similar precision techniques, we were able to control the interactions between these two neighbouring qubits, so the quantum spins of their electrons became correlated," says study lead co-author, Dr Matthew Broome, formerly of UNSW and now at the University of Copenhagen.

"It was fascinating to watch. When the spin of one electron is pointing up, the other points down, and vice versa.

"This is a major milestone for the technology. These type of spin correlations are the precursor to the entangled states that are necessary for a quantum computer to function and carry out complex calculations," he says.

Study lead co-author, UNSW's Sam Gorman, says: "Theory had predicted the two qubits would need to be placed 20 nanometres apart to see this correlation effect. But we found it occurs at only 16 nanometres apart.

"In our quantum world, this is a very big difference," he says. "It is also brilliant, as an experimentalist, to be challenging the theory."

Leading the race to build a quantum computer in silicon

UNSW scientists and engineers at CQC2T are leading the world in the race to build a quantum computer in silicon. They are developing parallel patented approaches using single atom and quantum dot qubits.

"Our hope is that both approaches will work well. That would be terrific for Australia," says Simmons.

The UNSW team have chosen to work in silicon because it is among the most stable and easily manufactured environments in which to host qubits, and its long history of use in the conventional computer industry means there is a vast body of knowledge about this material.

In 2012, Simmons' team, who use scanning tunnelling microscopes to position the individual phosphorus atoms in silicon and then molecular beam epitaxy to encapsulate them, created the world's narrowest conducting wires, just four phosphorus atoms across and one atom high.

In a recent paper published in the journal Nano Letters, they used similar atomic scale control techniques to produce circuitry about 2-10 nanometres wide and showed it had the lowest recorded electrical noise of any semiconductor circuitry. This work was undertaken jointly with Saquib Shamim and Arindam Ghosh of the Indian Institute of Science.

"It's widely accepted that electrical noise from the circuitry that controls the qubits will be a critical factor in limiting their performance," says Simmons.

"Our results confirm that silicon is an optimal choice, because its use avoids the problem most other devices face of having a mix of different materials, including dielectrics and surface metals, that can be the source of, and amplify, electrical noise.

"With our precision approach we've achieved what we believe is the lowest electrical noise level possible for an electronic nano-device in silicon - three orders of magnitude lower than even using carbon nanotubes," she says.

In another recent paper in Science Advances, Simmons' team showed their precision qubits in silicon could be engineered so the electron spin had a record lifetime of 30 seconds - up to 16 times longer than previously reported. The first author, Dr Thomas Watson, was at UNSW undertaking his PhD and is now at Delft University of Technology.

"This is a hot topic of research," says Simmons. "The lifetime of the electron spin - before it starts to decay, for example, from spin up to spin down - is vital. The longer the lifetime, the longer we can store information in its quantum state."

In the same paper, they showed that these long lifetimes allowed them to read out the electron spins of two qubits in sequence with an accuracy of 99.8 percent for each, which is the level required for practical error correction in a quantum processor.

Australia's first quantum computing company

Instead of performing calculations one after another, like a conventional computer, a quantum computer would work in parallel and be able to look at all the possible outcomes at the same time. It would be able to solve problems in minutes that would otherwise take thousands of years.

Last year, Australia's first quantum computing company - backed by a unique consortium of governments, industry and universities - was established to commercialise CQC2T's world-leading research.

Operating out of new laboratories at UNSW, Silicon Quantum Computing Pty Ltd has the target of producing a 10-qubit demonstration device in silicon by 2022, as the forerunner to a silicon-based quantum computer.

The Australian government has invested $26 million in the $83 million venture through its National Innovation and Science Agenda, with an additional $25 million coming from UNSW, $14 million from the Commonwealth Bank of Australia, $10 million from Telstra and $8.7 million from the NSW Government.

It is estimated that industries comprising approximately 40% of Australia's current economy could be significantly impacted by quantum computing. Possible applications include software design, machine learning, scheduling and logistical planning, financial analysis, stock market modelling, software and hardware verification, climate modelling, rapid drug design and testing, and early disease detection and prevention.

Credit: 
University of New South Wales

Dispelling the myth that scientists don't care about teaching

A new study using surveys and classroom noise analysis shows the success of a three-year effort by faculty in the Biology Department at San Francisco State University to get smarter about their teaching. The results run counter to conventional wisdom that scientists care more about research than they do about the students in their classrooms.

At the request of her faculty colleagues, Professor of Biology Kimberly Tanner led the effort, which started with a five-day summer training institute in 2013 and snowballed into more workshops and follow-up programs throughout the semester. By the end of the program, 89 percent of the faculty ended up participating in at least one workshop, and 83 percent participated in follow-up programs. Faculty who went through the entire program spent more than 100 hours each on training.

The training focused on a few main areas: "active learning" techniques that give students more control over how they learn, designing tests that accurately assess student knowledge, and creating a more inclusive classroom environment.

To figure out whether the program was actually working, the researchers developed a technique for measuring student participation in class by analyzing recordings of classroom noise. They found that 81 percent of the faculty taking part in the study used active learning techniques in at least half of their class sessions. And surveys of the participating faculty members showed that 96 percent were more confident in their teaching after the training. The results were published in a paper in the journal CBE -- Life Sciences Education, which was posted online in January and is part of the March 1 issue. Almost 70 members of the department were featured as authors.

In the process, the researchers dispelled another myth. "A lot of faculty at other universities think if they devote time to their teaching, their research will suffer," said Tanner. But when surveyed, only 6 percent of study participants reported that. On the other hand, 30 percent said the opposite -- that their research had been positively affected. Tanner suspects this shift is due to a stronger sense of community and a more collaborative atmosphere fostered by the training.

In a discipline that experiences heavy student attrition, these techniques to foster a more engaging classroom are crucial. "The majority of students leave biology," says Tanner. "And they leave based on personal demographics -- more women leave, more students of color leave. These strategies will help us retain more of those students." Tanner hopes the study will also serve as an example that inspires other universities to follow.

Credit: 
San Francisco State University

Linking virus sensing with gene expression, a plant immune system course-corrects

Plant immune systems, like those of humans and animals, face a difficult balancing act: they must mount responses against ever-evolving pathogens, but they must not overdo it. Immune responses require energy and resources and often involve plants killing their own infected cells to prevent the pathogens from spreading.

Researchers at Durham University in the UK have identified a crucial link in the process of how plants regulate their antiviral responses. The research is published in the March 2 issue of the Journal of Biological Chemistry.

Martin Cann's lab at Durham, in collaboration with the laboratories of Aska Goverse at Wageningen University and Frank Takken at the University of Amsterdam, studied a receptor protein called Rx1, which is found in potato plants and detects infection by a virus called potato virus X.

Binding to a protein from the virus activates Rx1 and starts a chain of events that results in the plant mounting an immune response. But the exact sequence of cellular events - and how Rx1 activation was translated into action by the rest of the cell - was unknown.

"Our study revealed an exciting, and unexpected, link between pathogen attack and plant DNA," Cann said.

Specifically, the study showed that Rx1 joins forces with a protein called Glk1. Glk1 is a transcription factor, meaning it binds to specific regions of DNA and activates genes involved in cell death and other plant immune responses. The team found that when Glk1 bound to virus-activated Rx1, it was able to turn on the appropriate defense genes.

Interestingly, when the viral protein was absent, Rx1 seemed to have the opposite effect - actually keeping Glk1 from binding to DNA. In this way, it prevented an inappropriate immune response.

"(T)he immune response involves reprogramming the entire cell and also often the entire plant," Cann said. "(A)n important part of this regulatory process is not only allowing activation but also making sure the entire system is switched off in the absence of infection."

As over a third of the annual potential global crop harvest is lost to pathogens and pests, breeding plants with better immune systems is an important challenge. Understanding how this immune system is regulated at the appropriate level of activity gives the researchers more ideas about points in the immune signaling pathway that could be targeted to increase the plant's baseline ability to resist disease.

"To increase (crop) yield, there is an urgent need for new varieties that are resilient to these stresses," Cann said. "A mechanistic understanding of how plants resist or overcome pathogen attack is crucial to develop new strategies for crop protection."

Credit: 
American Society for Biochemistry and Molecular Biology