Culture

A decade after housing bust, mortgage industry on shaky ground, experts warn

Despite tough banking rules put in place after last decade's housing crash, the mortgage market again faces the risk of a meltdown that could endanger the U.S. economy, warn two Berkeley Haas professors in a paper co-authored by Federal Reserve economists. The threat reflects a boom in nonbank mortgage companies, a category of independent lenders that are more lightly regulated and more financially fragile than banks--and which now originate half of all U.S. home mortgages.

"If these firms go out of business, the mortgage market shuts down, and that has dire Implications for the overall health of the economy," says Richard Stanton, professor of finance and Kingsford Capital Management Chair in Business at Haas. Stanton authored the Brookings paper, "Liquidity Crises in the Mortgage Market," with Nancy Wallace, the Lisle and Roslyn Payne Chair in Real Estate Capital Markets and chair of the Haas Real Estate Group. You Suk Kim, Steven M. Laufer, and Karen Pence of the Federal Reserve Board were coauthors.

Bank regulation fueled boom in nonbank lenders

During the housing bust, nonbank lenders failed in droves as home prices fell and borrowers stopped making payments, fueling a wider financial crisis. Yet when banks dramatically cut back home loans after the crisis, it was nonbank mortgage companies that stepped into the breach. Now, nonbanks are a larger force in residential lending than ever. In 2016, they accounted for half of all mortgages, up from 20 percent in 2007, the Brookings Institution paper notes. Their share of mortgages with explicit government backing is even higher: nonbanks originate about 75 percent of loans guaranteed by the Federal Housing Administration (FHA) or the U.S. Department of Veterans Affairs (VA).

Nonbank lenders are regulated by a patchwork of state and federal agencies that lack the resources to watch over them adequately, so risk can easily build up unchecked. While the Federal Reserve lends money to banks in a pinch, it does not do the same for independent mortgage companies.

Scant access to cash

In stark contrast with banks, independent mortgage companies have little capital of their own and scant access to cash in an emergency. They have come to rely on a type of short-term funding known as warehouse lines of credit, usually provided by larger commercial and investment banks. It's a murky area: most nonbank lenders are private companies that are not required to disclose their financial structures, so Stanton and Wallace's paper provides the first public tabulation of the scale of this warehouse lending. They calculated that warehouse lending commitments totaled $34 billion at the end of 2016, up from $17 billion at the end of 2013. Because each credit line revolves repeatedly as loans are originated and quickly sold off, that translates to about $1 trillion in short-term "warehouse loans" funded over the course of a year.
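
As a rough consistency check (our arithmetic, not a figure from the paper), those two numbers imply that each dollar of committed warehouse credit turns over roughly 29 times a year, i.e., the average loan sits on a warehouse line for under two weeks before being sold:

$$ \frac{\$1{,}000\ \text{billion funded per year}}{\$34\ \text{billion committed}} \approx 29\ \text{turns per year} \quad\Longrightarrow\quad \frac{365\ \text{days}}{29} \approx 12\ \text{days per loan}. $$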

If rising interest rates were to choke off the mortgage refinance market, if an economic slowdown prompted more homeowners to default, or if the banks that extend credit to mortgage lenders cut them off, many of these companies would find themselves in trouble with no way out. "There is great fragility. These lenders could disappear from the map," Stanton notes.

Risk to taxpayers

The ripple effects of a market collapse would be severe, and taxpayers would potentially be on the hook for losses posted by failed mortgage companies. In addition to loans backed by the FHA or VA, the government is exposed through Ginnie Mae, the federal agency that provides payment guarantees when mortgages are pooled and sold as securities to investors. The mortgage companies are supposed to bear the losses if these securitized loans go bad. But if those companies go under, the government "will probably bear the majority of the increased credit and operational losses," the paper concludes. Ginnie Mae is especially vulnerable because almost 60 percent of the dollar volume of the mortgages it guarantees comes from nonbank lenders.

Vulnerable communities would be hit hardest. In 2016, nonbank lenders made 64 percent of the home loans extended to black and Latino borrowers, and 58 percent of the mortgages to homeowners living in low- or moderate-income tracts, the paper reports.

The authors emphasize that they hope their paper raises awareness of the risks posed by the growth of the nonbank sector. Most of the policy discussion on preventing another housing crash has focused on supervision of banks and other deposit-taking institutions. "Less thought is being given, in the housing finance reform discussions and elsewhere, to the question of whether it is wise to concentrate so much risk in a sector with such little capacity to bear it," the paper concludes.

Stanton adds, "We want to make the nonbank side part of the debate."

Credit: 
University of California - Berkeley Haas School of Business

Flipping lipids for cell transport-tubules

image: The phospholipid bilayer is composed of many different lipids. Flipping certain ones from the outer to inner layer (and vice versa) allows the cell to change dynamics and interact with the environment.

Image: 
Kyoto University / Tomoki Shimizu / Shin Lab

Researchers are getting closer to understanding the molecular processes that cause parts of cell membranes to morph into tiny tubes that can transport molecules in and out of cells.

Kyoto University cell biologists wanted to find out if 'flipping' enzymes belonging to the P4-ATPase family were involved in inducing cell membranes to change shape. These enzymes flip specific lipids between the inner and outer layers of the membrane. Until now, it hasn't been clear if they played a role in changing the membrane's curvature, because scientists were not able to see their activity in conjunction with membrane deformation.

The researchers developed a process to allow them to do just that. They attached fluorescent tags to curvature-sensing 'BAR' proteins present in the cytosol and observed how the proteins behaved.

It is known that when one type of BAR-domain, called N-BAR, is recruited to the cell membrane, it penetrates the cell's lipid bilayer, inducing the formation of a small inward curvature. The protein senses this change in curvature, leading to the recruitment of more N-BAR domains, which attach to each other along a part of the membrane, triggering its transformation into a tube.

BAR and F-BAR domains, on the other hand, do not do this -- unless, the researchers found, a special flipping enzyme called ATP10A is activated in the cells. ATP10A flips the lipid phosphatidylcholine from the outer to the inner layer of the cell membrane, causing a small change in its curvature. When ATP10A was activated, BAR and F-BAR domains sensed a change in curvature and bound to the cell membrane. The proteins then gathered, attached to each other along the cell membrane, and transformed that part into an inwardly protruding tubule. This did not happen in cells in which ATP10A was turned off.

"Increased inward plasma membrane bending by ATP10A expression enhances endocytosis," explains Hye-Won Shin of Kyoto University's Graduate School of Pharmaceutical Sciences. Endocytosis is the process in which a part of a cell engulfs external molecules for further processing.

"The plasma membrane also dynamically changes shape during cell migration, cancer cell invasion, cell division, nutrient uptake, and entry of pathogens and viruses into cells," Shin explains. "This study is the first evidence that changes in the transbilayer lipid composition induced by P4-ATPases can deform biological membranes," she says.

The study is published in the latest issue of The EMBO Journal.

Credit: 
Kyoto University

Poor grades tied to class times that don't match our biological clocks

image: Owls performed worst of all the groups due to chronic social jet lag.

Image: 
Benjamin Smarr

It may be time to tailor students' class schedules to their natural biological rhythms, according to a new study from UC Berkeley and Northeastern Illinois University.

Researchers tracked the personal daily online activity profiles of nearly 15,000 college students as they logged into campus servers.

After sorting the students into "night owls," "daytime finches" and "morning larks" -- based on their activities on days they were not in class -- researchers compared their class times to their academic outcomes.

Their findings, published today in the journal Scientific Reports, show that students whose circadian rhythms were out of sync with their class schedules - say, night owls taking early morning courses - received lower grades due to "social jet lag," a condition in which peak alertness times are at odds with work, school or other demands.

"We found that the majority of students were being jet-lagged by their class times, which correlated very strongly with decreased academic performance," said study co-lead author Benjamin Smarr, a postdoctoral fellow who studies circadian rhythm disruptions in the lab of UC Berkeley psychology professor Lance Kriegsfeld.

In addition to learning deficits, social jet lag has been tied to obesity and excessive alcohol and tobacco use.

On a positive note: "Our research indicates that if a student can structure a consistent schedule in which class days resemble non-class days, they are more likely to achieve academic success," said study co-lead author Aaron Schirmer, an associate professor of biology at Northeastern Illinois University.

While students of all categories suffered from class-induced jet lag, the study found that night owls were especially vulnerable, many appearing so chronically jet-lagged that they were unable to perform optimally at any time of day.

But it's not as simple as students just staying up too late, Smarr said.

"Because owls are later and classes tend to be earlier, this mismatch hits owls the hardest, but we see larks and finches taking later classes and also suffering from the mismatch," said Smarr. "Different people really do have biologically diverse timing, so there isn't a one-time-fits-all solution for education."

In what is thought to be the largest-ever survey of social jet lag using real-world data, Smarr and Schirmer analyzed the online activity of 14,894 Northeastern Illinois University students as they logged in and out of the campus's learning management system over two years.

To separate the owls from the larks from the finches, and gain a more accurate alertness profile, the researchers tracked students' activity levels on days that they did not attend a class.
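
To illustrate the idea (a minimal sketch of ours, not the authors' analysis code; the hour cutoffs are hypothetical), a chronotype classifier over login timestamps might look like this:

```python
from statistics import median

def chronotype(free_day_login_hours):
    """Classify a student from the hours of day (0-23) at which they were
    active online on days with no scheduled classes."""
    m = median(free_day_login_hours)
    if m < 12:       # activity centered in the morning
        return "morning lark"
    elif m < 17:     # activity centered in the daytime
        return "daytime finch"
    else:            # activity centered in the evening or night
        return "night owl"

print(chronotype([7, 8, 9, 11, 13]))     # morning lark
print(chronotype([15, 18, 21, 23, 23]))  # night owl
```

Comparing each student's chronotype against their scheduled class times then gives a per-student measure of social jet lag.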

Next, they looked at how larks, finches and owls had scheduled their classes during four semesters from 2014 to 2016 and found that about 40 percent were mostly biologically in sync with their class times. As a result, they performed better in class and enjoyed higher GPAs.

However, 50 percent of the students were taking classes before they were fully alert, and another 10 percent had already peaked by the time their classes started.

Previous studies have found that older people tend to be active earlier while young adults shift to a later sleep-wake cycle during puberty. Overall, men stay up later than women, and circadian rhythms shift with the seasons based on natural light.

Finding these patterns reflected in students' login data spurred researchers to investigate whether digital records might also reflect the biological rhythms underlying people's behavior.

The results suggest that "rather than admonish late students to go to bed earlier, in conflict with their biological rhythms, we should work to individualize education so that learning and classes are structured to take advantage of knowing what time of day a given student will be most capable of learning," Smarr said.

Credit: 
University of California - Berkeley

Software automatically generates knitting instructions for 3-D shapes

image: James McCann, assistant professor of robotics, and Carnegie Mellon graduate students Lea Albaugh and Vidya Narayanan check a computer-controlled knitting machine. Their system translates 3-D shapes into stitch-by-stitch instructions so the machine can automatically produce them.

Image: 
Carnegie Mellon University/Michael Henninger

PITTSBURGH--Carnegie Mellon University computer scientists have developed a system that can translate a wide variety of 3-D shapes into stitch-by-stitch instructions that enable a computer-controlled knitting machine to automatically produce those shapes.

Researchers in the Carnegie Mellon Textiles Lab have used the system to produce a variety of plush toys and garments. What's more, James McCann, assistant professor in the Robotics Institute and leader of the lab, said this ability to generate knitting instructions without need of human expertise could make on-demand machine knitting possible.

McCann's vision is to use the same machines that routinely crank out thousands of knitted hats, gloves and other apparel to produce customized pieces one at a time or in small quantities. Gloves, for instance, might be designed to precisely fit a customer's hands. Athletic shoe uppers, sweaters and hats might have unique color patterns or ornamentation.

"Knitting machines could become as easy to use as 3-D printers," McCann said.

That's in stark contrast to the world of knitting today.

"Now, if you run a floor of knitting machines, you also have a department of engineers," said McCann, who noted that garment designers rarely have the specialized expertise necessary to program the machines. "It's not a sustainable way of doing one-off customized pieces.

In their latest work, to be presented this summer at SIGGRAPH 2018, the Conference on Computer Graphics and Interactive Techniques in Vancouver, Canada, McCann and his colleagues developed a method for transforming 3-D meshes -- a common method for modeling 3-D shapes -- into instructions for V-bed knitting machines.

These widely used machines manipulate loops of yarn with hook-shaped needles, which lie in parallel needle beds angled toward each other in an inverted V shape. The machines are highly capable, but are limited in comparison with hand knitting, said Vidya Narayanan, a Ph.D. student in computer science.

The CMU algorithm takes these constraints into account, she said, producing instructions for patterns that work within the limits of the machine and reduce the risk of yarn breaks or jams.

A front-end design system such as this is common in 3-D printing and in computer-driven machine shops, but not in the knitting world, McCann said. Likewise, 3-D printing and machine shops use common languages and file formats to run their equipment, while knitting machines use a variety of languages and tools that are specific to particular brands of knitting machines.

McCann led an earlier effort to create a common knitting format, called Knitout, which can be implemented on any brand of knitting machine.
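
To make the pipeline concrete, here is a toy sketch (ours, not the paper's code, and the emitted instruction strings are hypothetical stand-ins rather than actual Knitout syntax) of lowering the simplest 3-D shape, a tube, into row-by-row stitch instructions for a two-bed machine:

```python
# Toy sketch: a knitted tube of given circumference is one pass of stitches
# across the front bed followed by one pass across the back bed, repeated
# per row. Instruction names here are illustrative, NOT real Knitout ops.
def tube_program(circumference, rows):
    program = []
    for row in range(rows):
        for needle in range(circumference):
            program.append(f"row {row}: knit front bed, needle {needle}")
        for needle in range(circumference):
            program.append(f"row {row}: knit back bed, needle {needle}")
    return program

for instruction in tube_program(circumference=4, rows=2):
    print(instruction)
```

The real system performs the analogous flattening for arbitrary 3-D meshes while respecting the machine constraints described above.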

Further work is needed to make on-demand knitting a reality. For instance, the system now only produces smooth knitted cloth, without the patterned stitching that can make knitted garments distinctive. The knitting ecosystem also needs to be expanded, with design tools that will work with any machine. But progress could be rapid at this point, McCann said.

"The knitting hardware is already really good," he explained. "It's the software that needs a little push. And software can improve rapidly because we can iterate so much faster."

In addition to McCann and Narayanan, the research team included Jessica Hodgins, professor of computer science and robotics; Lea Albaugh, a Ph.D. student in the Human-Computer Interaction Institute; and Stelian Coros, a faculty member at ETH Zurich and an adjunct professor of robotics at CMU.

The research paper, along with a video, is available on GitHub.

Credit: 
Carnegie Mellon University

Herring larvae could benefit from an acidifying ocean

image: These strange-looking floating objects are really giant plastic test tubes called KOSMOS mesocosms that extend roughly 19 meters deep. Researchers used them to test the response of young herring to ocean acidification.

Image: 
Photo: KOSMOS/GEOMAR

One of the many downsides of too much carbon dioxide in the atmosphere is what happens when some of that CO2 is absorbed by the oceans. As atmospheric CO2 levels increase from burning fossil fuels, this carbon dioxide is soaked up by seawater and makes the oceans more acidic.

Increased acidity is bad news for coral reefs and creatures whose shells are made from calcium carbonate, but how does it affect the entire food web?

Using an unusual long-term research installation in the waters off the west coast of Sweden, a team of researchers from Germany, Sweden and Norway decided to find out. Their results have just been published in Nature Ecology and Evolution.

19-meter-long giant test tubes

Most studies on the effects of acidified seawater on fish species are done in the laboratory, says Fredrik Jutfelt, an associate professor at the Norwegian University of Science and Technology's (NTNU) Department of Biology, who was one of the study's authors. Some of those studies -- of which there are relatively few -- show reduced survival for fish in their early life stages, he said.

But studying fish in a tank doesn't necessarily give researchers the chance to study the entire food web. For that, you ideally need to study fish in their natural habitat, the ocean. That's where the experimental setup in Gullmarsfjord, Sweden, comes in.

Here, researchers were able to use what they call "mesocosms": tall, vertical floating "test tubes" filled with ocean water, 2.8 meters in diameter and 19 meters long, set out in the fjord. Each mesocosm contained 50 cubic meters of seawater.

The bags are long enough to contain an entire small ocean food web, from the tiny plankton that form the base of the food web, up to and including herring larvae.

Five of the bags were filled with "normal" seawater to serve as a control, while the remaining five bags were filled with seawater that had been acidified with CO2. The researchers used predictions of the amount of CO2 that would be in the atmosphere at the end of this century to set the CO2 levels in these last five bags.

They then added fertilized herring eggs to all of the bags in late April 2013, when the phytoplankton in the bags began to wake up and reproduce. The herring larvae that hatched lived in and fed exclusively on the plankton in the bags until the experiment ended on 28 June.

The researchers took regular samples of the physical, chemical and biological conditions in the mesocosms.

More food outweighs negative effects of acidity

The earliest days of a young herring's life are among the most critical for its survival, because the larva has to find enough of the food it likes to eat.

What the researchers found was that the increased CO2 they introduced to the test bags stimulated the growth of plankton enough to improve the survival of these tiniest of young herring.

In the end, the researchers found that the young herring in the mesocosms exposed to elevated CO2 levels had a survival rate that was nearly 20 per cent higher than the herring in the control mesocosms. That's in direct contrast to laboratory studies that have reported decreased larval fish survival.

"It appears that the herring will have an advantage over other more sensitive species in a future acidified ocean," said Michael Sswat, a researcher from the GEOMAR Helmholtz Centre for Ocean Research Kiel, who was the lead author of the study.

Sswat and his colleagues also ran a parallel laboratory study, in which they raised siblings of the mesocosm herring larvae in the lab at comparable pH and CO2 levels.

"The surprising finding here was that while the herring were unaffected by CO2 in our simultaneous lab experiment, the herring in the natural ecosystem experiments in the mesocosms benefitted from high CO2," said NTNU's Jutfelt. "This finding was likely due to changes in the ecosystem, as that was the main difference between the lab and the field experiments. We also measured the ecosystem changes in the mesocosms and we saw that the extra CO2 stimulated growth of algae, which led to more zooplankton, meaning more food for fish."

Catriona Clemmesen from GEOMAR, another study co-author, says that herring larvae may be naturally more able to adapt to ocean acidification.

"The tolerance of herring larvae to pH changes could be due to their life history strategy. "Herring spawn mostly near the ground, where naturally high CO2 levels prevail. They are therefore probably better adapted to ocean acidification than other fish species such as the cod that spawns near the surface," Clemmesen said.

Not necessarily good news

Although the researchers found that the herring larvae did well in the high CO2 conditions, they caution that their findings are not a cause for celebration.

"Whereas herring larvae were shown to be tolerant to CO2 levels projected for the end of this century, larval development in other fish species, including the Atlantic cod (Gadus morhua), is negatively affected under projected ocean acidification scenarios," the researchers wrote. "Also biodiversity of fish may be affected, as shown at natural volcanic vents, where changes in food availability and predation benefitted dominant fish species."

Credit: 
Norwegian University of Science and Technology

Opioid use prevalent among electronic dance music partygoers

One in 10 electronic dance music (EDM) party attendees has misused opioids in the past year, exceeding the national average, finds a study by the Center for Drug Use and HIV/HCV Research (CDUHR) at NYU Meyers College of Nursing.

The study, published in the journal Drug and Alcohol Dependence, suggests that prevention and harm reduction efforts need to target this increasingly popular scene as efforts continue toward reducing the opioid crisis.

Opioid use has grown to epidemic proportions in the United States and has been a main contributor to a resurgence of heroin use, as well as the spread of HIV and Hepatitis C. In 2016, approximately 11.5 million Americans had misused prescription opioids, with 1.8 million meeting criteria for dependence or abuse.

"We've always known that electronic dance music party attendees are at high risk for use of club drugs such as ecstasy or Molly, but we wanted to know the extent of opioid use in this population," said CDUHR researcher Joseph Palamar, PhD, MPH, the study's lead author and an associate professor of population health at NYU School of Medicine.

Since the study was conducted specifically on EDM partygoers, the researchers note that the results may not apply to the general population, but, rather, highlight the need for prevention efforts in this high-risk group.

"This population of experienced drug users needs to be reached to prevent initiation and continued use, which can lead to riskier and more frequent use, dependence, and deleterious outcomes such as overdose - particularly if opioids are combined with other drugs," Palamar said.

Throughout the summer of 2017, the researchers surveyed 954 individuals (ages 18 to 40) about to enter EDM parties at nightclubs and dance festivals in New York City. Attendees were asked about nonmedical use - defined as using in a manner which is not prescribed (such as to get high) - of 18 different opioids - including OxyContin, Percocet, Vicodin, codeine, fentanyl, and heroin.

The researchers found that almost a quarter (23.9 percent) of EDM party attendees have used opioids nonmedically in their lifetime and one out of 10 (9.8 percent) did so in the past year, which is higher than the national prevalence of past-year use of approximately 4 percent of adults 18 and older. Five percent of respondents reported misusing opioids in the past month.

OxyContin was the most highly reported opioid used in this scene, followed by Vicodin, Percocet, codeine, and Purple Drank, also known as Sizzurp or Lean (which typically contains codeine syrup). A smaller portion of users also snorted (15 percent) or injected opioids (11 percent) in the past year, which increases risk for overdose and dependence.

People who smoked cigarettes or used other drugs (including amphetamine, methamphetamine, and cocaine) were more likely to report misusing opioids in the past month. In particular, nonmedical users of benzodiazepines such as Xanax had high odds of also using opioids and, on average, reported using a wider variety of opioids in the past year than those who did not use benzodiazepines. While the study did not measure whether multiple drugs were used simultaneously, research has shown that users commonly combine benzodiazepines and prescription opioids to enhance or come down from the effects of other drugs.

Previous opioid use predicted the likelihood of someone reporting willingness to use if offered in the future. Among non-users, 5.7 percent reported that they would take opioids if offered by a friend in the next month. However, among those who had taken opioids in the past year, almost three out of four (73.6 percent) reported they would be willing to use again.

Notably, almost nine out of 10 past-year users of Purple Drank indicated that they would use again if offered by a friend, although these findings on this cocktail may be limited. "While real Sizzurp, Lean, or Purple Drank contains codeine syrup, it is likely that many people consumed concoctions without codeine," Palamar cautioned. However, the authors note that prevalence of opioid misuse did not change when removing this concoction from the list of opioids examined.

Credit: 
New York University

Walleye fish populations are in decline

image: Walleye is a popular game fish in Wisconsin, Minnesota and Canada. It's prized for its flaky meat and mild, sweet flavor. A new study shows its population is in decline.

Image: 
Andrew Rypel/UC Davis

Walleye, an iconic native fish species in Wisconsin, the upper Midwest and Canada, are in decline in northern Wisconsin lakes, according to a study published this week in the Canadian Journal of Fisheries and Aquatic Sciences.

The study does not pinpoint the exact causes for the decline, though it suggests it is likely a combination of factors, including climate change, habitat degradation and harvest rates that might at times outpace production levels if not monitored closely. Additional research is ongoing regarding what declining production means for future walleye harvests in this region.

'SOMETHING IS NOT RIGHT'

For the study, researchers analyzed production statistics collected between 1990 and 2012 for adult walleye populations in Wisconsin lakes. They found that annual walleye production across all lakes decreased by 27 percent during that time. It takes 1.5 times longer to produce the same amount of walleye biomass, or fish weight, now as it did in 1990.

Lakes experiencing declines are often stocked with walleye to make up for a loss in natural production. However, the data show that stocked lakes have seen larger declines in walleye production. Lakes with a mixture of both stocked and naturally reproducing walleye experienced declines of 47 percent, while lakes with only stocking and no natural reproduction declined by 63 percent.

"This is a clear warning sign that something is not right," said lead author Andrew Rypel, an ecologist at University of Wisconsin-Madison and the Wisconsin Department of Natural Resources during the time of the study. He is currently an associate professor and the Peter B. Moyle and California Trout Chair in Coldwater Fish Ecology at the University of California, Davis. "The results suggest that anglers, tribes and resource management agencies will all need to work together to craft new science-based management policies for moving forward."

ABOUT WALLEYE

Rypel, who grew up in Wisconsin fishing walleye, notes that walleye are "kind of a big deal" in the area. As salmon are to the West, walleye are to this region. People travel to the state's Northwoods just to fish for walleye; a large Catholic population enjoys them for Friday fish fries; and Native American tribes spear walleye in the early spring in accordance with their cultural and religious traditions. The meat is prized for its flaky, mild and sweet flavor.

"People catch and release bass," Rypel said. "That's not the case with walleye. People love to eat walleye."

THREATS TO WALLEYE

Bass like warmer waters, and unlike walleye, their populations have increased over the study period. Walleye require cooler waters, and their decline has accompanied a rise in lake temperatures. This points to climate change as one factor in the loss of walleye that the authors say should continue to be examined.

Habitat alteration, residential development around walleye-bearing lakes and indirect food web interactions are likely additional factors.

Another complication is that while most Wisconsin lakes are dominated by low-producing walleye populations, some of the highest-producing walleye lakes are used to estimate sustainable harvest for the region. Consequently, studies on those healthier walleye lakes may not adequately represent the overall walleye population.

ACTION IS UNDERWAY

State, tribal and community leaders have already taken several actions to help the struggling walleye. These include major stocking initiatives, new fishing regulations, programs to enhance habitat, bass removals and even moratoriums on walleye harvests.

"Most people interested in the outdoors, fishing and hunting are interested in leaving something for future generations, hopefully something better," Rypel said. "It's essential that we work collaboratively when we see trends that fisheries like these are in decline."

Credit: 
University of California - Davis

How self-driving cars could shrink parking lots

image: Left to right: Sina Bahrami, Mehdi Nourinejad and Professor Matthew Roorda designed an algorithm to optimize the design of parking lots for autonomous vehicles, increasing their capacity by an average of 62 per cent.

Image: 
Roberta Baker

New U of T Engineering research shows that adoption of self-driving cars -- also known as autonomous vehicles (AVs) -- could significantly reduce the amount of valuable urban space dedicated to parking.

"In a parking lot full of AVs, you don't need to open the doors, so they can park with very little space in between," says Professor Matthew Roorda, senior author of a new study in Transportation Research Part B. "You also don't need to leave space for each car to drive out, because you can signal the surrounding AVs to move out of the way."

While traditional parking lots are configured for "islands" of cars that can each pull in or out of a spot, an AV parking lot could resemble a solid grid, with outer cars moving aside as needed to let the inner cars enter and exit. The researchers' challenge was to determine the optimal size of the grid to maximize storage while minimizing the number of moves required to extract any given car.

"There's a trade-off," says Mehdi Nourinejad, a recent PhD graduate from the Department of Civil Engineering and the study's lead author. "If you have a very large grid, it leads to a lot of relocations, which means that it takes longer on average to retrieve your vehicle. On the other hand, if you have a number of smaller grids, it wastes a lot of space."

Nourinejad, Roorda and their co-author Sina Bahrami created a computer model in which they could simulate the effects of various layouts for AV parking lots. They then used an algorithm to optimize the design for various factors, including minimizing the number of relocations and maximizing the proportion of the lot that was used for parking versus lanes for relocation, entering or exiting.

Their analysis showed that a well-designed AV parking lot could accommodate 62 per cent more cars than a conventional one of the same size. Depending on parking lot dimensions, in some cases they were able to increase the capacity even further -- square-shaped AV parking lots could accommodate up to 87 per cent more cars. This improved use of space could translate into much smaller parking lot footprints, provided the total number of cars that need to park in them remains constant.

Another advantage of AV parking lots is that the design is not fixed. "If demand changes -- for example, if you need to pack more cars into the lot -- you don't need to paint new parking spaces," says Bahrami. "Instead, the operator can just signal the cars to rearrange themselves. It will take longer to retrieve your vehicle, but you will fit more cars in."

Roorda hopes that municipal parking authorities will be able to use their design approach to enhance urban spaces. "Right now, our downtown cores have giant municipal parking lots next to major attractions," he says. "AVs could allow us to both shrink and relocate these parking lots, opening up valuable space in cities."

The concept of an AV dropping off a passenger, navigating to an ultra-efficient AV parking lot, and later returning to pick up the passenger sounds attractive. But this new paradigm could also introduce negative consequences, such as a potential increase in traffic congestion.

"Right now, we have a lot of cars on the road with just one passenger," says Roorda. "If we locate AV parking lots too far away from major attractions, we could end up with streets crowded with vehicles that have zero passengers, which would be worse."

Another drawback is that the team's designs only work for parking lots reserved exclusively for AVs, rather than a mix of AVs and conventional vehicles, though Roorda says that a single lot could have both AV and non-AV areas.

Roorda and his team also can't predict when the number of AVs on the road will reach the critical mass required to make use of their designs.

"We're talking about large numbers of vehicles that can fully drive themselves, with no requirement for a driver to take over if something goes wrong," says Roorda. "There's a lot that has to happen before we get to that stage."

Credit: 
University of Toronto Faculty of Applied Science & Engineering

West Greenland Ice Sheet melting at the fastest rate in centuries

image: Ice cores from the West Greenland Ice Sheet 'percolation zone' were studied under a light table at Dartmouth's Ice Core Laboratory to reveal ice layers that tell the history of how much melt has occurred through time.

Image: 
Robert Gill/Dartmouth College

HANOVER, N.H. - March 28, 2018 - The West Greenland Ice Sheet melted at a dramatically higher rate over the last twenty years than at any other time in the modern record, according to a study led by Dartmouth College. The research, appearing in the journal Geophysical Research Letters, shows that melting in west Greenland since the early 1990s is at the highest levels in at least 450 years.

While natural patterns of certain atmospheric and ocean conditions are already known to influence Greenland melt, the study highlights the importance of a long-term warming trend in accounting for the unprecedented west Greenland melt rates in recent years. The researchers suggest that climate change associated with human greenhouse gas emissions is the probable cause of the additional warming.

"We see that west Greenland melt really started accelerating about twenty years ago," said Erich Osterberg, assistant professor of earth sciences at Dartmouth and the lead scientist on the project. "Our study shows that the rapid rise in west Greenland melt is a combination of specific weather patterns and an additional long-term warming trend over the last century."

According to research cited in the study, loss of ice from Greenland is one of the largest contributors to global sea level rise. Although glaciers calving into the ocean cause much of the ice loss in Greenland, other research cited in the study shows that the majority of ice loss in recent years is from increased surface melt and runoff.

While satellite measurements and climate models have detailed this recent ice loss, there are far fewer direct measurements of melt collected from the ice sheet itself. For this study, researchers from Dartmouth and Boise State University spent two months on snowmobiles to collect seven ice cores from the remote "percolation zone" of the West Greenland Ice Sheet.

When warm temperatures melt snow on the surface of the percolation zone, the meltwater trickles down into the deeper snow and refreezes into ice layers. Researchers were easily able to distinguish these ice layers from the surrounding compacted snow in the cores, and the layers preserve a history of how much melt occurred back through time: the more melt, the thicker the ice layers.

"Most ice cores are collected from the middle of the ice sheet where it rarely ever melts, or on the ice sheet edge where the meltwater flows into the ocean. We focused on the percolation zone because that's where we find the best record of Greenland melt going back through time in the form of the refrozen ice layers," said Karina Graeter, the lead author of the study as a graduate student in Dartmouth's Department of Earth Sciences.

The cores, some as long as 100 feet, were transported to Dartmouth, where the research team used a light table to measure the thickness and frequency of the ice layers. The cores were also sampled for chemical measurements in Dartmouth's Ice Core Laboratory to determine the age of each ice layer.
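
Schematically (our illustration, not the lab's actual workflow; the core-log values are made up), the light-table measurements reduce to a melt record: for each dated layer of the core, the fraction of its thickness that is refrozen ice rather than compacted snow.

```python
# Made-up example core log: for each dated annual layer, the total layer
# thickness (cm) and the summed thickness of refrozen ice within it (cm).
core_log = {
    1985: (80.0, 4.0),
    1995: (78.0, 9.5),
    2010: (75.0, 18.0),  # thicker, more frequent ice layers after the 1990s
}

for year, (total_cm, ice_cm) in sorted(core_log.items()):
    melt_percent = 100.0 * ice_cm / total_cm
    print(f"{year}: melt layer percentage = {melt_percent:.1f}%")
```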

The cores reveal that the ice layers became thicker and more frequent beginning in the 1990s, with recent melt levels that are unmatched since at least the year 1550 CE.

"The ice core record ends about 450 years ago, so the modern melt rates in these cores are the highest of the whole record that we can see," said Osterberg. "The advantage of the ice cores is that they show us just how unusual it is for Greenland to be melting this fast".

Year-to-year changes in Greenland melt since 1979 were already known to be closely tied to North Atlantic ocean temperatures and high-pressure systems that sit above Greenland during the summer - known as summer blocking highs. The new study extends the record back in time to show that these were important controls on west Greenland melt going back to at least 1870.

The study also shows that an additional summertime warming factor of 2.2 degrees Fahrenheit is needed to explain the unusually strong melting observed since the 1990s. The additional warming caused a near-doubling of melt rates in the twenty-year period from 1995 to 2015 compared to previous times when the same blocking and ocean conditions were present.

"It is striking to see how a seemingly small warming of only 2.2 degrees Fahrenheit can have such a large impact on melt rates in west Greenland," said Graeter.

The study concludes that North Atlantic ocean temperatures and summer blocking activity will continue to control year-to-year changes in Greenland melt into the future. Some climate models suggest that summer blocking activity and ocean temperatures around Greenland might decline in the next several decades, but it remains uncertain. However, the study points out that continued warming from human activities would overwhelm those weather patterns over time to further increase melting.

"Cooler North Atlantic ocean temperatures and less summer blocking activity might slow down Greenland melt for a few years or even a couple decades, but it would not help us in the long run," said Osterberg. "Beyond a few decades, Greenland melting will almost certainly increase and raise sea level as long as we continue to emit greenhouse gases."

Credit: 
Dartmouth College

The American Society for Aesthetic Plastic Surgery reports that modern cosmetic surgical procedures are on the rise

image: New data from the American Society for Aesthetic Plastic Surgery (ASAPS) shows that many modern cosmetic surgical procedures are on the rise, and that surgical procedures account for 77% of all surveyed physicians' business. The latest annual survey (Cosmetic Surgery National Data Bank Statistics) from the organization now reflects input exclusively from ABPS board-certified plastic surgeons; previous surveys encompassed data from physicians in a wider range of specialties.

Image: 
American Society for Aesthetic Plastic Surgery

NEW YORK, NY (March 28, 2018) - New data from the American Society for Aesthetic Plastic Surgery (ASAPS) shows that many modern cosmetic surgical procedures are on the rise, and that surgical procedures account for 77% of all surveyed physicians' business. The latest annual survey (Cosmetic Surgery National Data Bank Statistics) from the organization now reflects input exclusively from ABPS board-certified plastic surgeons; previous surveys encompassed data from physicians in a wider range of specialties.

"We opted to change the format of our survey to better represent the specialty of plastic surgery," explains Clyde H. Ishii, MD, President of ASAPS. "After more than two decades of collecting data from various specialties it made sense for us to fine tune our survey and take a closer look at what board-certified plastic surgeons are seeing in their practices," he explains.

The new data demonstrates that many surgical procedures, previously believed to be on the decline or leveling out, are increasing in popularity. "Contrary to popular belief and what is depicted in mainstream media, the facelift is by no means dead," said W. Grant Stevens, MD, President-elect of ASAPS. "In fact, the new data indicates that the number of facelifts performed in the United States increased by 21.9% in the past year alone and by 21.8% over the past five years," he added. "With advances including less invasive techniques resulting in less post-operative downtime, an increasing number of patients are warming up to the idea of going under the knife, as surgery still promises the longest-term, if not permanent, results," Stevens said.

Other plastic surgical procedures seeing significant growth include breast lifts, blepharoplasty (upper and lower eyelid surgery), upper arm lifts and liposuction, all of which have seen double-digit increases over both the past year and the past five years:

Breast lifts are up by 13.9% over the past year and 57.5% over the past five years

Eyelid surgery (blepharoplasty) is up 26.3% over the past year and 33.5% over the past five years

Liposuction is up 16.5% over the past year and 58.0% over the past five years

Upper arm lifts are up 20.1% over the past year and 59.1% over the past five years

The data also identified the top 5 surgical and nonsurgical procedures for men and women as follows:

Top 5 Surgical Procedures for Women:

Breast Augmentation

Liposuction

Breast Lift

Tummy Tuck

Eyelid Surgery (Blepharoplasty)

Top 5 Nonsurgical Procedures for Women:

Botulinum Toxin

Hyaluronic Acid

Hair Removal

Nonsurgical Fat Reduction

Chemical Peels

Top 5 Surgical Procedures for Men:

Liposuction

Eyelift Surgery (Blepharoplasty)

Breast Reduction (treatment of Gynecomastia)

Tummy Tuck

Facelift

Top 5 Nonsurgical Procedures for Men:

Botulinum Toxin

Hyaluronic Acid

Nonsurgical Fat Reduction

Hair Removal

Photo Rejuvenation (IPL)

To obtain a full copy of ASAPS' latest statistics, including a PDF book containing press-ready infographics, please visit https://surgery.org/sites/default/files/ASAPS-Stats2017.pdf

Credit: 
American Society for Aesthetic Plastic Surgery

Supernova may have 'burped' before exploding

image: The FELT, captured in 2015, rose in brightness over just 2.2 days and faded completely within 10 days.

Image: 
NASA/JPL-Caltech

The slow fade of radioactive elements following a supernova allows astrophysicists to study them at length. But the universe is also packed with flash-in-the-pan transient events that last only a brief time -- so quick and hard to study that they remain a mystery.

Only by increasing the rate at which telescopes monitor the sky has it been possible to catch more Fast-Evolving Luminous Transients (FELTs) and begin to understand them.

According to a new study in Nature Astronomy, researchers say NASA's Kepler Space Telescope captured one of the fastest FELTs to date. Peter Garnavich, professor and department chair of astrophysics and cosmology physics at the University of Notre Dame and co-author of the study, described the event as "the most beautiful light curve we will ever get for a fast transient."

"We think these might actually be very common, these flashes, and we have just been missing them in the past because they are so fast," Garnavich said. "The fact that one occurred in the small area of the sky being monitored by Kepler means they are probably fairly common."

The FELT, captured in 2015, rose in brightness over just 2.2 days and faded completely within 10 days. Most supernovae can take 20 days to reach peak brightness and weeks to become undetectable.

Researchers debated what could be causing these particularly fast events but ultimately settled on a simple explanation: The stars "burp" before exploding and don't generate enough radioactive energy to be seen later. As the supernova runs into the gas expelled in the burp, astrophysicists observe a flash. The supernova then fades beyond their ability to detect it.

"Our conclusion was that this was a massive star that exploded, but it had a mass loss -- a wind -- that started a couple of years before it exploded," Garnavich described. "A shock ran into that wind after the explosion, and that's what caused this big flash. But it turns out to be a rather weak supernova, so within a couple of weeks we don't see the rest of the light."

The only visible activity is from the quick collision of the gas and the exploding star, where some of the kinetic energy is converted to light. One mystery that remains is why the "burp" would happen such a short time before the supernova explosion. Astrophysicists want to know how the outside of the star reacts to what's happening deep in the core, Garnavich said.

While the Kepler telescope is expected to run out of fuel in the coming months, ending its K2 mission, NASA's Transiting Exoplanet Survey Satellite (TESS) is planned for launch after K2 concludes. Garnavich said data retrieved during the TESS mission could also be used to study FELTs.

Credit: 
University of Notre Dame

Potential biomarkers in animals could signal Ebola virus infection before symptoms appear

image: Investigators at USAMRIID and Boston University have used a model of Ebola virus infection in primates that better mimics human disease to identify early markers of infection. These markers of infection can be detected as early as 4 days before the first signs of disease such as fever.

Image: 
US Army Medical Research Institute of Infectious Diseases

Scientists have identified potential biomarkers in nonhuman primates exposed to Ebola virus (EBOV) that appeared up to four days before the onset of fever, according to research published today in the journal Science Translational Medicine.

The work, a collaboration between the U.S. Army Medical Research Institute of Infectious Diseases (USAMRIID) and Boston University (BU), could pave the way for developing diagnostic tools to identify EBOV infection in humans even before symptoms appear. Such tools would be invaluable in limiting the spread of disease where there are cases of known potential exposure to the virus, according to USAMRIID investigator Sandra L. Bixler, Ph.D., the paper's co-first author.

Bixler said previously developed animal models of EBOV infection have an acute disease course lasting only 7-10 days on average. This makes therapeutic intervention challenging, since the timeframe for administering treatment is very short. In addition, such models are based on high viral doses and are uniformly lethal, which does not reflect the variable and comparatively extended time to disease onset seen in humans.

"Those models make sense for testing vaccines and therapeutics," Bixler commented. "But for human infection, they don't really match what we see in the field--especially given what we've learned from the most recent Ebola virus disease outbreak in Western Africa."

So Bixler and USAMRIID colleague Arthur J. Goff, Ph.D., decided to investigate alternative models that could still replicate human infection while extending the disease course. Instead of challenging the animals via injection, which is a standard laboratory model, they tested the intranasal route--which would be more likely to occur in a natural outbreak where people may be exposed to infected bodily fluids.

The team designed a study using a lower dose of EBOV in 12 cynomolgus macaques. The animals, exposed by intranasal infection and closely monitored for signs of disease, fell into four categories. Three succumbed to disease in the usual timeframe of 7-10 days following infection; four had a delayed onset of 10-15 days; three were late-onset (20-22 days); and two survived.

"We were then faced with the challenge of teasing apart any differences between acute versus delayed disease onset, and survivors versus non-survivors," said Louis A. Altamura, Ph.D., one of the USAMRIID scientists who performed gene expression profiling to monitor the host response via changes in RNA transcripts over time. Thanks to a long-standing collaboration between USAMRIID and BU, investigators at USAMRIID's Center for Genome Sciences, along with BU scientists John H. Connor, Ph.D., and Emily Speranza, Ph.D., performed further genomic data analysis and began to look for early markers of infection.

What they found--in all the animals except the two survivors--was expression of interferon-stimulated genes that appeared soon after infection with Ebola virus. Importantly, this gene signature can be detected four days before the onset of fever, which is one of the earliest clinical signs of EBOV exposure. When Speranza compared the results to humans, using Ebola patient blood samples from the most recent outbreak, she found the same pattern.

"This demonstrates that lethal Ebola virus disease has a uniform and predictable response to infection, regardless of the time to onset," commented Gustavo Palacios, Ph.D., who directs USAMRIID's Center for Genome Sciences. "Furthermore, expression of a subset of genes could predict disease development prior to other host-based indications of infection, such as fever."

EBOV causes severe hemorrhagic fever in humans and nonhuman primates with high mortality rates and continues to emerge in new geographic locations, including Western Africa, the site of the largest recorded outbreak to date. More than 28,000 confirmed, probable and suspected cases have been reported in Guinea, Liberia and Sierra Leone, with more than 11,000 reported deaths, according to the World Health Organization (WHO). Today, WHO estimates that there are over 10,000 survivors of Ebola virus disease.

Research on Ebola virus is conducted under Biosafety Level 4, or maximum containment conditions, where investigators wear positive-pressure "space suits" and breathe filtered air as they work. USAMRIID is the only laboratory in the Department of Defense with Biosafety Level 4 capability, and its research benefits both military personnel and civilians.

Credit: 
US Army Medical Research Institute of Infectious Diseases

Dark matter is a no show in ghostly galaxy

image: (Left) The ultra-diffuse galaxy is rich with globular clusters, which hold the key to understanding this mysterious object's origin and mass. (Right) A closer look at one of the globular clusters within the galaxy; the clusters are all much brighter than typically seen, with the brightest emitting almost as much light as the brightest globular cluster within the Milky Way. The spectrum, obtained by Keck Observatory, shows the calcium absorption lines used to determine the velocity of this object. Ten clusters were observed, providing the information needed to determine the mass of the galaxy, revealing its lack of dark matter.

Image: 
W. M. KECK OBSERVATORY/GEMINI OBSERVATORY/NSF/AURA/J. MILLER/J. POLLARD

Maunakea, Hawaii - Galaxies and dark matter go hand in hand; you typically don't find one without the other. So when researchers uncovered a galaxy, known as NGC1052-DF2, that is almost completely devoid of the stuff, they were shocked.

"Finding a galaxy without dark matter is unexpected because this invisible, mysterious substance is the most dominant aspect of any galaxy," said lead author Pieter van Dokkum of Yale University. "For decades, we thought that galaxies start their lives as blobs of dark matter. After that everything else happens: gas falls into the dark matter halos, the gas turns into stars, they slowly build up, then you end up with galaxies like the Milky Way. NGC1052-DF2 challenges the standard ideas of how we think galaxies form."

The research, published in the March 29th issue of the journal Nature, amassed data from Gemini North and W. M. Keck Observatory, both on Maunakea, Hawaii, the Hubble Space Telescope, and other telescopes around the world.

Given its large size and faint appearance, astronomers classify NGC1052-DF2 as an ultra-diffuse galaxy, a relatively new type of galaxy first discovered in 2015. These barely visible galaxies are surprisingly common. However, no other galaxy of this type discovered so far is so lacking in dark matter.

"NGC 1052-DF2 is an oddity, even among this unusual class of galaxy," said Shany Danieli, a Yale University graduate student on the team.

Van Dokkum and his team first spotted NGC1052-DF2 with the Dragonfly Telephoto Array, a custom-built telescope in New Mexico that they designed to find these ghostly galaxies. NGC1052-DF2 stood out in stark contrast when comparisons were made between images from the Dragonfly Telephoto Array and the Sloan Digital Sky Survey (SDSS). The Dragonfly images show a faint "blob-like" object, while SDSS data reveal a collection of relatively bright point-like sources.

To get a closer look at this inconsistency, the team dissected the light from several of the bright sources within NGC1052-DF2 using Keck Observatory's Deep Imaging Multi-Object Spectrograph (DEIMOS) and Low-Resolution Imaging Spectrometer (LRIS), identifying 10 globular clusters. These clusters are large compact groups of stars that orbit the galactic core.

The spectral data obtained on the Keck telescopes revealed that the globular clusters were moving much slower than expected. The slower the objects in a system move, the less mass there is in that system. The team's calculations show that all of the mass in the galaxy could be attributed to the mass of the stars, which means there is almost no dark matter in NGC1052-DF2.
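
The reasoning behind that statement is a standard dynamical-mass estimate (shown schematically here as the general scaling, not the paper's exact calculation). For tracers such as globular clusters moving with velocity dispersion $\sigma$ at a characteristic radius $R$ around a galaxy, the enclosed mass scales as

$$ M_{\mathrm{dyn}} \sim \frac{\sigma^{2} R}{G}, $$

where $G$ is the gravitational constant. Because the measured $\sigma$ was unexpectedly small, the inferred total mass is small -- small enough to be accounted for by the stars alone.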

"If there is any dark matter at all, it's very little," van Dokkum explained. "The stars in the galaxy can account for all of the mass, and there doesn't seem to be any room for dark matter."

"Keck is in a very small group of telescopes that could even attempt these observations, because you need a large telescope to measure these accurate velocities," van Dokkum added. "Keck also has some of the best spectrographs in the world for measuring the velocities of faint objects. We had the opportunity to check and make sure we got the same result within the uncertainties, and that gave us confidence that we were doing things right."

To peer even deeper into this unique galaxy, the team used the Gemini-North Multi Object Spectrograph (GMOS) to capture detailed images of NGC1052-DF2, assess its structure, and confirm that the galaxy had no signs of interactions with other galaxies.

"Without the Gemini images dissecting the galaxy's morphology we would have lacked context for the rest of the data," said Danieli. "Also, Gemini's confirmation that NGC1052-DF2 is not currently interacting with another galaxy will help us answer questions about the conditions surrounding its birth."

The team's results demonstrate that dark matter is separable from galaxies.

"This discovery shows that dark matter is real - it has its own separate existence apart from other components of galaxies," said van Dokkum.

NGC1052-DF2's globular clusters and atypical structure have perplexed astronomers aiming to determine the conditions under which this galaxy formed.

"It's like you take a galaxy and you only have the stellar halo and globular clusters, and it somehow forgot to make everything else," van Dokkum said. "There is no theory that predicted these types of galaxies. The galaxy is a complete mystery, as everything about it is strange. How you actually go about forming one of these things is completely unknown."

However, researchers do have some ideas. NGC1052-DF2 resides about 65 million light-years away in a collection of galaxies dominated by the giant elliptical galaxy NGC 1052. Galaxy formation is turbulent and violent, and van Dokkum suggests that the growth of the fledgling giant NGC 1052 billions of years ago may have played a role in NGC1052-DF2's dark-matter deficiency.

Another idea is that a cataclysmic event within the oddball galaxy, such as the birth of myriad massive stars, swept out all the gas and dark matter, halting star formation.

These possibilities are speculative, however, and don't explain all of the characteristics of the observed galaxy, the researchers said.

The team continues the hunt for more dark-matter-deficient galaxies. They are analyzing Hubble images of 23 other diffuse galaxies. Three of them appear to share similarities with NGC1052-DF2, which van Dokkum plans to follow up on in the coming months at Keck Observatory.

"Every galaxy we knew about before has dark matter and they all fall in familiar categories like spiral or elliptical galaxies," van Dokkum said. "But what would you get if there were no dark matter at all? Maybe this is what you would get."

Credit: 
W. M. Keck Observatory

Dining out associated with increased exposure to harmful chemicals called phthalates

WASHINGTON, DC (March 28, 2018)--Dining out more at restaurants, cafeterias and fast-food outlets may boost total levels of potentially health-harming chemicals called phthalates in the body, according to a study out today. Phthalates, a group of chemicals used in food packaging and processing materials, are known to disrupt hormones in humans and are linked to a long list of health problems.

The study is the first to compare phthalate exposures in people who reported dining out with those in people more likely to eat home-cooked meals. People who reported consuming more restaurant, fast food and cafeteria meals had phthalate levels nearly 35 percent higher than those of people who reported eating food mostly purchased at the grocery store, according to the study.

"This study suggests food prepared at home is less likely to contain high levels of phthalates, chemicals linked to fertility problems, pregnancy complications and other health issues," says senior author Ami Zota, ScD, MS, an assistant professor of environmental and occupational health at Milken Institute School of Public Health (Milken Institute SPH) at the George Washington University. "Our findings suggest that dining out may be an important, and previously under-recognized source of exposure to phthalates for the U.S. population."

Lead author Julia Varshavsky, PhD, MPH, who did the work while at the University of California, Berkeley, School of Public Health, together with Zota and their colleagues, used data from the National Health and Nutrition Examination Survey (NHANES) collected between 2005 and 2014. The 10,253 participants in the study were asked to recall what they ate and where their food came from in the previous 24 hours. The researchers then analyzed the links between what people ate and the levels of phthalate break-down products found in each participant's urine sample.
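
As a rough illustration of this kind of analysis--not the authors' actual model, and using entirely synthetic data with hypothetical variable names--a regression of log-transformed urinary phthalate levels on dining-out status might look like this:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 1000
df = pd.DataFrame({
    "dined_out": rng.integers(0, 2, n),  # 1 = ate food from outside the home in prior 24 h
    "age": rng.integers(12, 80, n),
    "female": rng.integers(0, 2, n),
})

# Synthetic outcome: log urinary phthalate level with a built-in dining-out effect
df["log_phthalate"] = (0.3 * df["dined_out"]
                       + 0.002 * df["age"]
                       + rng.normal(0, 1, n))

# Ordinary least squares of log levels on dining-out status plus basic covariates
model = smf.ols("log_phthalate ~ dined_out + age + female", data=df).fit()

# A coefficient of ~0.3 on the log scale corresponds to exp(0.3) ~ 1.35,
# i.e. roughly 35 percent higher levels (the effect size was chosen to
# echo the figure reported above, not estimated from real data)
print(model.params["dined_out"])
```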

The team found that 61 percent of the participants reported dining out the previous day. In addition, the researchers found:

The association between phthalate exposure and dining out was significant for all age groups but the magnitude of association was highest for teenagers;

Adolescents who were high consumers of fast food and other food purchased outside the home had 55 percent higher levels of phthalates compared to those who only consumed food at home;

Certain foods, especially cheeseburgers and other sandwiches, were associated with increased levels of phthalates--but only if they were purchased at a fast-food outlet, restaurant or cafeteria. Sandwiches consumed at fast-food outlets, restaurants or cafeterias were associated with 30 percent higher phthalate levels in all age groups.

"Pregnant women, children and teens are more vulnerable to the toxic effects of hormone-disrupting chemicals, so it's important to find ways to limit their exposures," says Varshavsky, who is now a postdoctoral scientist at the University of California, San Francisco. "Future studies should investigate the most effective interventions to remove phthalates from the food supply."

A previous study by Zota and colleagues suggested that fast food may expose consumers to higher levels of phthalates. That study found that people who ate the most fast food--burgers, fries and other such items--had phthalate levels as much as 40 percent higher than those of people who rarely ate such foods.

The new study looked more broadly at dining out--not just at fast-food outlets--and found that it was significantly associated with increased exposure to phthalates. The authors say the findings are worrisome because two-thirds of the U.S. population eats at least some food outside the home daily.

Additional authors of the study include Rachel Morello-Frosch at the University of California, Berkeley, and Tracey Woodruff at the University of California, San Francisco.

The team used an innovative method of assessing real-world exposures to multiple phthalates, called cumulative phthalate exposure, which takes into account evidence that some phthalates are more toxic than others. The National Academies of Sciences has weighed in twice on phthalates--first in a 2008 report that recommended using cumulative risk assessments to estimate the human health risk posed by this class of chemicals, and again in a 2017 report finding that certain phthalates are presumed to be reproductive hazards to humans.
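
As a minimal sketch of what a potency-weighted index might look like--the metabolite names and weights below are hypothetical, not the study's actual values--the idea is simply to scale each measured level by its relative toxicity before summing:

```python
# Hypothetical urinary metabolite concentrations for one person (ng/mL)
metabolites = {"MEHP": 4.2, "MnBP": 18.0, "MiBP": 9.5}

# Hypothetical relative-potency weights (reference compound = 1.0);
# real weights would be derived from toxicological benchmark doses
potency_weights = {"MEHP": 1.0, "MnBP": 0.25, "MiBP": 0.25}

def cumulative_exposure(levels, weights):
    """Sum each metabolite level scaled by its relative potency."""
    return sum(levels[m] * weights[m] for m in levels)

print(cumulative_exposure(metabolites, potency_weights))  # 11.075
```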

Many products contain phthalates, including take-home boxes, gloves used in handling food, food processing equipment and other items used in the production of restaurant, cafeteria and fast food meals. Previous research suggests these chemicals can leach from plastic containers or wrapping into food.

If verified by additional research, the findings from this study suggest that people who love dining out are getting a side of phthalates with their entrée.

Home-cooked meals may be one way to limit exposure to these harmful chemicals. "Preparing food at home may represent a win-win for consumers," adds Zota. "Home cooked meals can be a good way to reduce sugar, unhealthy fats and salt. And this study suggests it may not have as many harmful phthalates as a restaurant meal."

At the same time, phthalate contamination of the food supply also represents a larger public health problem, one that must be addressed by policymakers. Zota and Woodruff's previous research shows that policy actions, such as bans, can help reduce human exposure to harmful phthalates.

Credit: 
George Washington University

Want people to fund your Kickstarter project? Sell them on your reputation first

BINGHAMTON, NY - When you try to entice people to invest in your product on a crowdfunding website, potential funders are more concerned about your ethical characteristics than about your actual ability to make and deliver the product, according to new research from Binghamton University, State University of New York.

Popular crowdfunding sites, like Kickstarter and Indiegogo, give people a platform to pitch ideas for products or services they'd like to create, and let virtually anyone fund a project. Funders sometimes give money with the promise that they'll get the product in return once it's fully funded and completed.

Unlike other e-commerce platforms such as eBay and Amazon, most crowdfunding websites don't have a traditional product and seller rating system, meaning funders often enter into the process with a sense of uncertainty.

"Crowdfunding is interesting because you're literally buying something that isn't finished from a person who has never made it before. There are no product reviews, and there are no seller reviews," said Ali Alper Yayla, associate professor in Binghamton University's School of Management.

Binghamton University researchers Yayla and associate professor Surinder Kahai, along with Yu Lei from SUNY College at Old Westbury, dove into the uncertainties that funders experience in their decision-making process on crowdfunding platforms. They were particularly interested in how those uncertainties shifted in relation to product complexity. An example of a low-complexity product would be a t-shirt or a photo album, while a high-complexity product would be a 3D printer.

The researchers showed mock crowdfunding campaign pages for products of varying complexity to over 300 subjects. They found that no matter how complex the promised product was, potential funders were more concerned with a seller's reputation and potential opportunism than with the seller's competence and expertise to actually make the promised product.

"We found that people worry more about the seller's honesty than whether the seller actually has the ability and knowledge to finish and deliver on the product," said Yayla. "People don't want sellers to just take their money and run."

Their research also found that as a product gets more complex, concern for the seller's reputation increases, while concern over the seller's ability to create the product actually decreases.

"This was an unexpected finding," said Yayla. "You'd assume that people would think if the product is very complex, the seller may not actually have the ability to make it. On the other hand, you'd think that people wouldn't worry about seller competence in low-complexity products."

Yayla said one possible explanation is that funders of high-complexity products may already be familiar with the science behind the product, while unfamiliar funders probably wouldn't consider backing it in the first place.

Based on the findings, Yayla said those looking to start a crowdfunding project should be willing to provide plenty of detail about themselves to their potential funders. He said project initiators should consider providing links to social media pages or other sites that feature ongoing projects in order to help bolster their reputation.

Credit: 
Binghamton University