We’re probably undervaluing healthy lakes and rivers

For sale: Pristine lake. Price negotiable.

Most U.S. government attempts to quantify the costs and benefits of protecting the country’s bodies of water are likely undervaluing healthy lakes and rivers, researchers argue in a new study. That’s because some clean water benefits get left out of the analyses, sometimes because those benefits are difficult to pin numbers on. As a result, the apparent value of many environmental regulations is probably understated.

The study, published online October 8 in the Proceedings of the National Academy of Sciences, surveyed 20 government reports analyzing the economic impacts of U.S. water pollution laws. Most of these laws have been enacted since 2000, when cost-benefit analyses became a requirement. Analysis of a measure for restricting river pollution, for example, might find that it increases costs for factories using that river for wastewater disposal, but boosts tourism revenues by drawing more kayakers and swimmers.
Only two studies out of 20 showed the economic benefits of these laws exceeding the costs. That’s uncommon among analyses of environmental regulations, says study coauthor David Keiser, an environmental economist at Iowa State University in Ames. Usually, the benefits exceed the costs.

So why does water pollution regulation seem, on paper at least, like such a losing proposition?

Keiser has an explanation: Summing up the monetary benefits of environmental policies is really hard. Many of these benefits are intangible and don’t have clear market values. So deciding which benefits to count, and how to count them, can make a big difference in the results.
Many analyses assume water will be filtered for drinking, Keiser says, so they don’t count the human health benefits of clean lakes and rivers (SN: 8/18/18, p. 14). That’s different from air pollution cost-benefit studies, which generally do include the health benefits of cleaner air by factoring in data tracking things like doctor’s visits or drug prescriptions. That could explain why Clean Air Act rules tend to get more favorable reviews, Keiser says — human health accounts for about 95 percent of the measured benefits of air quality regulations.

“You can avoid a lake with heavy, thick, toxic algal blooms,” Keiser says. “If you walk outside and have very polluted air, it’s harder to avoid.”

But even if people can avoid an algae-choked lake, they still pay a price for that pollution, says environmental scientist Thomas Bridgeman, director of the Lake Erie Center at the University of Toledo in Ohio.
Communities that pull drinking water from a lake filled with toxic blooms of algae or cyanobacteria spend more to make the water safe to drink. Bridgeman’s seen it firsthand: In 2014, Lake Erie’s cyanobacteria blooms from phosphorus runoff shut down Toledo’s water supply for two days and forced the city to spend $500 million on water treatment upgrades.

Most of the studies surveyed by Keiser and his team were missing other kinds of benefits, too. The reports usually left out the value of eliminating certain toxic and nonconventional pollutants — molecules such as bisphenol A, or BPA, and perfluorooctanoic acid, or PFOA (SN: 10/3/15, p. 12). In high quantities, these compounds, which are used to make some plastics and nonstick coatings, can cause harm to humans and wildlife. Many studies also didn’t include discussion of how the quality of surface waters can affect groundwater, which is a major source of drinking water for many people.

A lack of data on water quality may also limit studies, Keiser’s team suggests. While there’s a national database tracking daily local air pollution levels, the data from various water quality monitoring programs aren’t centralized. That makes gathering and evaluating trends in water quality harder.

Plus, there are the intangibles — the value of aquatic species that are essential to the food chain, for example.
“Some things are just inherently difficult to put a dollar [value] on,” says Robin Craig, an environmental law professor at the University of Utah in Salt Lake City. “What is it worth to have a healthy native ecosystem?… That’s where it can get very subjective very fast.”

That subjectivity can allow agencies to analyze policies in ways that suit their own political agendas, says Matthew Kotchen, an environmental economist at Yale University. An example: the wildly different assessments by the Obama and Trump administrations of the value gained from the 2015 Clean Water Rule, also known as the Waters of the United States rule.

The rule, finalized under President Barack Obama, clarified the definition of waters protected under the 1972 Clean Water Act to include tributaries and wetlands connected to larger bodies of water. The Environmental Protection Agency estimated in 2015 that the rule would result in yearly economic benefits ranging from $300 million to $600 million, edging out the predicted annual costs of $200 million to $500 million. But in 2017, Trump’s EPA reanalyzed the rule and proposed rolling it back, saying that the agency had now calculated just $30 million to $70 million in annual benefits.

The difference in the conclusions came down to the consideration of wetlands: The 2015 analysis found that protecting wetlands, such as marshes and bogs that purify water, tallied up to $500 million in annual benefits. The Trump administration’s EPA, however, left wetlands out of the calculation entirely, says Kotchen, who analyzed the policy swing in Science in 2017.
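The effect of that single choice is easy to see with back-of-envelope arithmetic. The sketch below is a toy comparison using only the ranges quoted in this story (midpoints, for simplicity); it is illustrative, not a reconstruction of either agency’s actual model.

```python
# Toy recalculation with the figures quoted above, showing how dropping
# one benefit category can flip the sign of a cost-benefit analysis.
# Midpoints of the reported ranges are used for simplicity.

costs = (200 + 500) / 2           # $ millions per year, 2015 analysis
benefits_2015 = (300 + 600) / 2   # includes roughly $500M in wetland benefits
benefits_2017 = (30 + 70) / 2     # wetlands left out entirely

print(f"2015 net benefit: {benefits_2015 - costs:+.0f} $M per year")  # +100
print(f"2017 net benefit: {benefits_2017 - costs:+.0f} $M per year")  # -300
```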

The rule has taken effect in 26 states but remains tied up in legal challenges.

It’s an example of how methodology — and what counts as a benefit — can have a huge impact on the apparent value of environmental policies and laws.

The squishiness in analyzing environmental benefits underlies many of the Trump administration’s proposed rollbacks of Obama-era environmental regulations, not just ones about water pollution, Kotchen says. There are guidelines for how such cost-benefit analyses should be carried out, he says, but there’s still room for researchers or government agencies to choose what to include or exclude.

In June, the EPA, then under the leadership of Scott Pruitt, proposed revising the way the agency does cost-benefit analyses to no longer include so-called indirect benefits. For example, in evaluating policies to reduce carbon dioxide emissions, the agency would ignore the fact that those measures also reduce other harmful air pollutants. The move would, overall, make environmental policies look less beneficial.

These sharp contrasts in how presidential administrations approach environmental impact studies are not unprecedented, says Craig, the environmental law professor. “Pretty much every time we change presidents, the priorities for how to weigh those different elements change.”

‘End of the Megafauna’ examines why so many giant Ice Age animals went extinct

Ross D.E. MacPhee and Peter Schouten (illustrator)
W.W. Norton & Co., $35

Today’s land animals are a bunch of runts compared with creatures from the not-too-distant past. Beasts as big as elephants, gorillas and bears were once much more common around the world. Then, seemingly suddenly, hundreds of big species, including the woolly mammoth, the giant ground sloth and a lizard weighing as much as half a ton, disappeared. In End of the Megafauna, paleomammalogist Ross MacPhee makes one thing clear: The science on what caused the extinctions of these megafauna — animals larger than 44 kilograms, or about 100 pounds — is far from settled.
MacPhee dissects the evidence behind two main ideas: that as humans moved into new parts of the world over the last 50,000 years, people hunted the critters into oblivion, or that changes in climate left the animals too vulnerable to survive. As MacPhee shows, neither scenario matches all of the available data.

Throughout, Peter Schouten’s illustrations, reminiscent of paintings that enliven natural history museums, bring the behemoths back to life. At times, MacPhee slips in too many technical terms. But overall, he offers readers an informative, up-to-date overview of a fascinating period in Earth’s history.


A lack of sleep can induce anxiety

SAN DIEGO — A sleepless night can leave the brain spinning with anxiety the next day.

In healthy adults, overnight sleep deprivation triggered anxiety the next morning, along with altered brain activity patterns, scientists reported November 4 at the annual meeting of the Society for Neuroscience.

People with anxiety disorders often have trouble sleeping. The new results uncover the reverse effect — that poor sleep can induce anxiety. The study shows that “this is a two-way interaction,” says Clifford Saper, a sleep researcher at Harvard Medical School and Beth Israel Deaconess Medical Center in Boston who wasn’t involved in the study. “The sleep loss makes the anxiety worse, which in turn makes it harder to sleep.”
Sleep researchers Eti Ben Simon and Matthew Walker, both of the University of California, Berkeley, studied the anxiety levels of 18 healthy people. Following either a night of sleep or a night of staying awake, these people took anxiety tests the next morning. After sleep deprivation, anxiety levels in these healthy people were 30 percent higher than when they had slept. On average, the anxiety scores reached levels seen in people with anxiety disorders, Ben Simon said November 5 in a news briefing.

What’s more, sleep-deprived people’s brain activity changed. In response to emotional videos, brain areas involved in emotions were more active, and the prefrontal cortex, an area that can put the brakes on anxiety, was less active, functional MRI scans showed.

The results suggest that poor sleep “is more than just a symptom” of anxiety but, in some cases, may be a cause, Ben Simon said.

Engineers are plugging holes in drinking water treatment

Off a gravel road at the edge of a college campus — next door to the town’s holding pen for stray dogs — is a busy test site for the newest technologies in drinking water treatment.

In the large shed-turned-laboratory, University of Massachusetts Amherst engineer David Reckhow has started a movement. More people want to use his lab to test new water treatment technologies than the building has space for.

The lab is a revitalization success story. In the 1970s, when the Clean Water Act put new restrictions on water pollution, the diminutive gray building in Amherst, Mass., was a place to test those pollution-control measures. But funding was fickle, and over the years, the building fell into disrepair. In 2015, Reckhow brought the site back to life. He and a team of researchers cleaned out the junk, whacked the weeds that engulfed the building and installed hundreds of thousands of dollars’ worth of monitoring equipment, much of it donated or bought secondhand.

“We recognized that there’s a lot of need for drinking water technology,” Reckhow says. Researchers, students and start-up companies all want access to test ways to disinfect drinking water, filter out contaminants or detect water-quality slipups. On a Monday afternoon in October, the lab is busy. Students crunch data around a big table in the main room. Small-scale tests of technology that uses electrochemistry to clean water chug along, hooked up to monitors that track water quality. On a lab bench sits a graduate student’s low-cost replica of an expensive piece of monitoring equipment. The device alerts water treatment plants when the by-products of disinfection chemicals in a water supply are reaching dangerous levels. In an attached garage, two startup companies are running larger-scale tests of new kinds of membranes that filter out contaminants.
Parked behind the shed is the almost-ready-to-roll newcomer. Starting in 2019, the Mobile Water Innovation Laboratory will take promising new and affordable technologies to local communities for testing. That’s important, says Reckhow, because there’s so much variety in the quality of water that comes into drinking water treatment plants. On-site testing is the only way to know whether a new approach is effective, he says, especially for newer technologies without long-term track records.

The facility’s popularity reflects a persistent concern in the United States: how to ensure affordable access to clean, safe drinking water. Although U.S. drinking water is heavily regulated and pretty clean overall, recent high-profile contamination cases, such as the 2014 lead crisis in Flint, Mich. (SN: 3/19/16, p. 8), have exposed weaknesses in the system and shaken people’s trust in their tap water.
Tapped out
In 2013 and 2014, 42 drinking water–associated outbreaks resulted in more than 1,000 illnesses and 13 deaths, based on reports to the U.S. Centers for Disease Control and Prevention. The top culprits were Legionella bacteria and some form of chemical, toxin or parasite, according to data published in November 2017.

Those numbers tell only part of the story, however. Many of the contaminants that the U.S. Environmental Protection Agency regulates through the 1974 Safe Drinking Water Act cause problems only when exposure happens over time; the effects of contaminants like lead don’t appear immediately after exposure. Records of EPA rule violations note that in 2015, 21 million people were served by drinking water systems that didn’t meet standards, researchers reported in a February study in the Proceedings of the National Academy of Sciences. That report tracked trends in drinking water violations from 1982 to 2015.
Current technology can remove most contaminants, says David Sedlak, an environmental engineer at the University of California, Berkeley. Those include microbes, arsenic, nitrates and lead. “And then there are some that are very difficult to degrade or transform,” such as industrial chemicals called PFAS.

Smaller communities, especially, can’t always afford top-of-the-line equipment or infrastructure overhauls to, for example, replace lead pipes. So Reckhow’s facility is testing approaches to help communities address water-quality issues in affordable ways.
Some researchers are adding technologies to deal with new, potentially harmful contaminants. Others are designing approaches that work with existing water infrastructure or clean up contaminants at their source.

How is your water treated?
A typical drinking water treatment plant sends water through a series of steps.

First, coagulants are added to the water. These chemicals bind sediments, which can cloud water or make it taste funny, into bigger clumps that are easier to remove. A gentle shaking or spinning of the water, called flocculation, helps those clumps form (1). Next, the water flows into big tanks to sit for a while so the sediments can fall to the bottom (2). The cleaner water then moves through membranes that filter out smaller contaminants (3). Disinfection, via chemicals or ultraviolet light, kills harmful bacteria and viruses (4). Then the water is ready for distribution (5).
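As a rough mental model, those five numbered steps behave like an ordered pipeline, each stage handing its output to the next. The Python sketch below is purely schematic; the stage functions, water properties and removal factors are invented placeholders, not engineering values.

```python
# Purely schematic sketch of the five-step treatment train described
# above. Stage names follow the numbered steps; all values and removal
# factors are illustrative placeholders.

def coagulate_and_flocculate(water):   # (1) clump sediments together
    water["sediment_clumped"] = True
    return water

def settle(water):                     # (2) sedimentation tanks
    if water.get("sediment_clumped"):
        water["turbidity_ntu"] *= 0.2  # most clumps sink out
    return water

def filter_membranes(water):           # (3) membrane filtration
    water["turbidity_ntu"] *= 0.1
    return water

def disinfect(water):                  # (4) chemicals or UV light
    water["pathogens_per_ml"] = 0
    return water

def distribute(water):                 # (5) ready for the pipes
    water["ready"] = True
    return water

water = {"turbidity_ntu": 20.0, "pathogens_per_ml": 1000}
for stage in (coagulate_and_flocculate, settle, filter_membranes,
              disinfect, distribute):
    water = stage(water)
print(water)
```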
There’s a lot of room for variation within that basic water treatment process. Chemicals added at different stages can trigger reactions that break down chunky, toxic organic molecules into less harmful bits. Ion-exchange systems that separate contaminants by their electric charge can remove ions like magnesium or calcium that make water “hard,” as well as heavy metals, such as lead and arsenic, and nitrates from fertilizer runoff. Cities mix and match these strategies, adjusting chemicals and prioritizing treatment components, based on the precise chemical qualities of the local water supply.

Some water utilities are streamlining the treatment process by installing technologies like reverse osmosis, which removes nearly everything from the water by forcing the water molecules through a selectively permeable membrane with extremely tiny holes. Reverse osmosis can replace a number of steps in the water treatment process or reduce the number of chemicals added to water. But it’s expensive to install and operate, keeping it out of reach for many cities.

Fourteen percent of U.S. residents get water from wells and other private sources that aren’t regulated by the Safe Drinking Water Act. These people face the same contamination challenges as municipal water systems, but without the regulatory oversight, community support or funding.

“When it comes to lead in private wells … you’re on your own. Nobody is going to help you,” says Marc Edwards, the Virginia Tech engineer who helped uncover the Flint water crisis. Edwards and Virginia Tech colleague Kelsey Pieper collected water-quality data from over 2,000 wells across Virginia in 2012 and 2013. Some were fine, but others had lead levels of more than 100 parts per billion. For comparison, when levels in a city water system top the EPA’s 15 ppb threshold, the agency mandates that the city take steps to control corrosion and notify the public about the contamination. The researchers reported those findings in 2015 in the Journal of Water and Health.

To remove lead and other contaminants, well users often rely on point-of-use treatments. A filter on the tap removes most, but not all, contaminants. Some people spring for costly reverse osmosis systems.
New tech solutions
These three new water-cleaning approaches wouldn’t require costly infrastructure overhauls.

Ferrate to cover many bases
Reckhow’s team at UMass Amherst is testing ferrate, an ion of iron, as a replacement for several water treatment steps. First, ferrate kills bacteria in the water. Next, it breaks down carbon-based chemical contaminants into smaller, less harmful molecules. Finally, it makes ions like manganese less soluble in water so they are easier to filter out, Reckhow and colleagues reported in 2016 in Journal – American Water Works Association. With its multifaceted effects, ferrate could potentially streamline the drinking water treatment process or reduce the use of chemicals, such as chlorine, that can yield dangerous by-products, says Joseph Goodwill, an environmental engineer at the University of Rhode Island in Kingston.

Ferrate could be a useful disinfectant for smaller drinking water systems that don’t have the infrastructure, expertise or money to implement something like ozone treatment, an approach that uses ozone gas to break down contaminants, Reckhow says.

Early next year, in the maiden voyage of his mobile water treatment lab, Reckhow plans to test the ferrate approach in the small Massachusetts town of Gloucester.
In the 36-foot trailer is a squeaky-clean array of plastic pipes and holding tanks. The setup routes incoming water through the same series of steps — purifying, filtering and disinfecting — that one would find in a standard drinking water treatment plant. With two sets of everything, scientists can run side-by-side experiments, comparing a new technology’s performance against the standard approach. That way researchers can see whether a new technology works better than existing options, says Patrick Wittbold, the UMass Amherst research engineer who headed up the trailer’s design.

Charged membranes
Filtering membranes tend to get clogged with small particles. “That’s been the Achilles’ heel of membrane treatment,” says Brian Chaplin, an engineer at the University of Illinois at Chicago. Unclogging the filter wastes energy and increases costs. Electricity might solve that problem and offer some side benefits, Chaplin suggests.

His team tested an electrochemical membrane made of titanium oxide or titanium dioxide that both filters water and acts as an electrode. Chemical reactions happening on the electrically charged membranes can turn nitrates into nitrogen gas or split water molecules, generating reactive ions that can oxidize contaminants in the water. The reactions also prevent particles from sticking to the membrane. Large carbon-based molecules like benzene become smaller and less harmful.
In lab tests, the membranes effectively filtered and destroyed contaminants, Chaplin says. In one test, a membrane transformed 67 percent of the nitrates in a solution into other molecules. The finished water was below the EPA’s regulatory nitrate limit of 10 parts per million, he and colleagues reported in July in Environmental Science & Technology. Chaplin expects to move the membrane into pilot tests within the next two years.
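That result is easy to sanity-check with simple arithmetic. In the sketch below, only the 67 percent conversion figure and the 10 ppm limit come from the study; the influent concentration is an assumed value for illustration.

```python
# Back-of-envelope check of the nitrate result described above.
# The influent concentration is an assumption for illustration.

EPA_NITRATE_LIMIT_PPM = 10.0
influent_ppm = 25.0          # assumed illustrative starting concentration
converted_fraction = 0.67    # reported conversion into other molecules

effluent_ppm = influent_ppm * (1 - converted_fraction)
print(f"effluent: {effluent_ppm:.1f} ppm")                  # 8.2 ppm
print("below EPA limit:", effluent_ppm < EPA_NITRATE_LIMIT_PPM)  # True
```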

Obliterate the PFAS
The industrial chemicals known as PFAS present two challenges. Only the larger ones are effectively removed by granular activated carbon, the active material in many household water filters. The smaller PFAS remain in the water, says Christopher Higgins, an environmental engineer at the Colorado School of Mines in Golden. Plus, filtering isn’t enough because the chunky chemicals are hard to break down for safe disposal.

Higgins and colleague Timothy Strathmann, also at the Colorado School of Mines, are working on a process to destroy PFAS. First, a specialized filter with tiny holes grabs the molecules out of the water. Then, sulfite is added to the concentrated mixture of contaminants. When hit with ultraviolet light, the sulfite generates reactive electrons that break down the tough carbon-fluorine bonds in the PFAS molecules. Within 30 minutes, the combination of UV radiation and sulfites almost completely destroyed one type of PFAS, other researchers reported in 2016 in Environmental Science & Technology.

Soon, Higgins and Strathmann will test the process at Peterson Air Force Base in Colorado, one of nearly 200 U.S. sites known to have groundwater contaminated by PFAS. Cleaning up those sites would remove the pollutants from groundwater that may also feed wells or city water systems.

Why a chemistry teacher started a science board game company

A physicist, a gamer and two editors walk into a bar. No, this isn’t the setup for some joke. After work one night, a few Science News staffers tried out a new board game, Subatomic. This deck-building game combines chemistry and particle physics for an enjoyable — and educational — time.

Subatomic is simple to grasp: Players use quark and photon cards to build protons, neutrons and electrons. With those three particles, players then construct chemical elements to score points. Scientists are the wild cards: Joseph J. Thomson, Maria Goeppert-Mayer, Marie Curie and other Nobel laureates who discovered important things related to the atom provide special abilities or help thwart other players.
The game doesn’t shy away from difficult or unfamiliar concepts. Many players might be unfamiliar with quarks, a group of elementary particles. But after a few rounds, it’s ingrained in your brain that, for example, two up quarks and one down quark create a proton. And Subatomic includes a handy booklet that explains in easy-to-understand terms the science behind the game. The physicist in our group vouched for the game’s accuracy but had one qualm: Subatomic claims that two photons, or particles of light, can create an electron. That’s theoretically possible, but scientists have yet to confirm it in the lab.

The mastermind behind Subatomic is John Coveyou, who has a master’s degree in energy, environmental and chemical engineering. As the founder and CEO of Genius Games, he has created six other games, including Ion (SN: 5/30/15, p. 29) and Linkage (SN: 12/27/14, p. 32). Next year, he’ll add a periodic table game to the list. Because Science News has reviewed several of his games, we decided to talk with Coveyou about where he gets his inspiration and how he includes real science in his products. The following discussion has been edited for length and clarity.
SN: When did you get interested in science?

Coveyou: My mom was mentally and physically disabled, and my dad was in and out of prison and mental institutions. So early on, things were very different for me. I ended up leaving home when I was in high school, hopscotching among 12 different homes during my junior and senior years. I almost dropped out, but I had a lot of teachers who were amazing mentors. I didn’t know what else to do, so I joined the army. While I was in Iraq, I had a bunch of science textbooks shipped to me, and I read them in my free time. They took me out of the environments I was in and became extremely therapeutic. A lot of the issues we face as a society can be worked on by the next generation having a command of the sciences. So I’m very passionate about teaching people the sciences and helping people find joy in them.

SN: Why did you start creating science games?

Coveyou: I was teaching chemistry at a community college, and I noticed that my students were really intimidated by the chemistry concepts before they even came into the classroom. They really struggled with a lot of the basic terminology. At the same time, I’ve been a board gamer pretty much my whole life. And it kind of hit me like, “Whoa, wait a second. What if I made some games that taught some of the concepts that I’m trying to teach my chemistry students?” So I just took a shot at it. The first couple of games were terrible. I didn’t really know what I was doing, but I kept at it.

SN: How do you test the games?

Coveyou: We first test with other gamers. Once we’re ready to get feedback from the general public, we go to middle school or high school students. Once we test a game with people face-to-face, we send it across the world to about 100 to 200 different play testers, and those vary from your hard-core gamers to homeschool families to science teachers, who try it in the classroom.

SN: How do you incorporate real science into your games?

Coveyou: I pretty much always start with a science concept in mind and think about how we can create a game that best reflects the science we want to communicate. For all of our upcoming games, we include a booklet about the science. That document is not created by Genius Games. We have about 20 to 30 Ph.D.s and doctors across the globe who write the content and edit each other. That’s been a real treat, to actually show players how the game is accurate. We’ve had so many scientists and teachers who are just astonished that we created something like this that was accurate, but also fun to play.

Voyager 2 spacecraft enters interstellar space

Voyager 2 has entered interstellar space. The spacecraft slipped out of the huge bubble of particles that encircles the solar system on November 5, becoming the second human-made craft ever to cross the heliopause, the boundary between the sun and the stars.

Coming in second place is no mean achievement. Voyager 1 became the first spacecraft to exit the solar system in 2012. But that craft’s plasma instrument stopped working in 1980, leaving scientists without a direct view of the solar wind, hot charged particles constantly streaming from the sun (SN Online: 9/12/13). Voyager 2’s plasma sensors are still working, providing unprecedented views of the space between stars.

“We’ve been waiting with bated breath for the last couple of months for us to be able to see this,” NASA solar physicist Nicola Fox said at a Dec. 10 news conference at the American Geophysical Union meeting in Washington, D.C.

NASA launched the twin Voyager spacecraft in 1977 on a grand tour of the solar system’s planets (SN: 8/19/17, p. 26). After that initial tour was over, both spacecraft continued traveling through the bubble of plasma that originates at the sun.
“When Voyager was launched, we didn’t know how large the bubble was, how long it would take to get [to its edge] and whether the spacecraft could last long enough to get there,” said Voyager project scientist Edward Stone of Caltech.

For most of Voyager 2’s journey, the spacecraft’s Plasma Science Experiment measured the speed, density, temperature, pressure and other properties of the solar wind. But on November 5, the experiment saw a sharp drop in the speed and the number of solar wind particles that hit the detector each second. At the same time, another detector started picking up more high-energy particles called cosmic rays that originate elsewhere in the galaxy.
Those measurements suggest that Voyager 2 has reached the region where the solar wind slams into the colder, denser population of particles that fill the space between stars. Voyager 2 is now a little more than 18 billion kilometers from the sun.

Intriguingly, Voyager 2’s measurements of cosmic rays and magnetic fields — which Voyager 1 could still make when it crossed the boundary — did not exactly match up with Voyager 1’s observations.
“That’s what makes it interesting,” Stone said. The variations are probably from the fact that the two spacecraft exited the heliosphere in different places, and that the sun is at a different part of its 11-year activity cycle than it was in 2012. “We would have been amazed if they had looked the same.”

The Voyagers probably have between five and 10 years left to continue exploring interstellar space, said Voyager project manager Suzanne Dodd from NASA’s Jet Propulsion Laboratory in Pasadena, Calif.

“Both spacecraft are very healthy if you consider them senior citizens,” Dodd said. The biggest concern is how much power they have left and how cold they are — Voyager 2 is currently about 3.6° Celsius, close to the freezing point of its hydrazine fuel. In the near future, the team will have to turn off some of the spacecraft’s instruments to keep the craft operating and sending data back to Earth.

“We do have difficult decisions ahead,” Dodd said. She added that her personal goal is to see the spacecraft last until 2027, for a total of 50 years in space. “That would be fantastic.”

NASA’s OSIRIS-REx finds signs of water on the asteroid Bennu

As the asteroid Bennu comes into sharper focus, planetary scientists are seeing signs of water locked up in the asteroid’s rocks, NASA team members announced December 10.

“It’s one of the things we were hoping to find,” team member Amy Simon of NASA’s Goddard Space Flight Center in Greenbelt, Md., said in a news conference at the American Geophysical Union meeting in Washington, D.C. “This is evidence of liquid water in Bennu’s past. This is really big news.”
NASA’s OSIRIS-REx spacecraft just arrived at Bennu on December 3 (SN Online: 12/3/18). Over the next year, the team will search for the perfect spot on the asteroid to grab a handful of dust and return it to Earth. “Very early in the mission, we’ve found out Bennu is going to provide the type of material we want to return,” said principal investigator Dante Lauretta of the University of Arizona in Tucson. “It definitely looks like we’ve gone to the right place.”

OSIRIS-REx’s onboard spectrometers measure the chemical signatures of various minerals based on the wavelengths of light they emit and absorb. The instruments were able to see signs of hydrated minerals on Bennu’s surface about a month before the spacecraft arrived at the asteroid, and the signal has remained strong all over the asteroid’s surface as the spacecraft approached, Simon said. Those minerals can form only in the presence of liquid water, suggesting that Bennu had a hydrothermal system in its past.

Bennu’s surface is also covered in more boulders and craters than the team had expected based on observations of the asteroid taken from Earth. Remote observations led the team to expect a few large boulders, about 10 meters wide. Instead they see hundreds, some of them up to 50 meters wide.

“It’s a little more rugged of an environment,” Lauretta said. But that rough surface can reveal details of Bennu’s internal structure and history.
If Bennu were one solid mass, for instance, a major impact could crack or shatter its entire surface. The fact that it carries large craters yet survived those impacts intact suggests something that can absorb a blow: Bennu may be more of a rubble pile, loosely held together by its own gravity.
The asteroid’s density supports the rubble pile idea. OSIRIS-REx’s first estimate of Bennu’s density shows it is about 1,200 kilograms per cubic meter, Lauretta said. The average rock is about 3,000 kilograms per cubic meter. The hydrated minerals go some way toward lowering the asteroid’s density, since water is less dense than rock. But up to 40 percent of the asteroid may be full of caves and voids as well, Lauretta said.
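The porosity reasoning can be followed with quick arithmetic. In the sketch below, the 1,200 and 3,000 kilograms-per-cubic-meter figures come from the story; the lower grain density standing in for hydrated minerals is an assumption for illustration.

```python
# Rough porosity arithmetic following the reasoning above. Only the
# 1,200 and 3,000 kg/m^3 figures come from the article; the hydrated
# grain density is an assumed illustrative value.

bulk_density = 1200.0          # OSIRIS-REx estimate for Bennu, kg/m^3
rock_grain_density = 3000.0    # "average rock," per the article

# If Bennu were pure dry rock, the implied void fraction would be:
porosity = 1 - bulk_density / rock_grain_density
print(f"porosity with dry rock grains: {porosity:.0%}")          # 60%

# Water-bearing minerals are less dense than dry rock, so a lower
# grain density leaves less volume to explain with caves and voids:
hydrated_grain_density = 2000.0    # assumption for illustration
porosity_hydrated = 1 - bulk_density / hydrated_grain_density
print(f"porosity with hydrated grains: {porosity_hydrated:.0%}")  # 40%
```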

Some of the rocks on the surface appear to be fractured in a spindly pattern. “If you drop a dinner plate on the ground, you get a spider web of fractures,” says team member Kevin Walsh of the Southwest Research Institute in Boulder, Colo. “We’re seeing this in some boulders.”

The boulders may have cracked in response to the drastic change in temperatures they experience as the asteroid spins. Studying those fracture patterns in more detail will reveal the properties of the rocks.

The OSIRIS-REx team also needs to know how many boulders of various sizes are strewn across the asteroid’s surface. Any rock larger than about 20 centimeters across would pose a hazard to the spacecraft’s sampling arm, says Keara Burke of the University of Arizona. Burke, an undergraduate engineering student, is heading up a boulder mapping project.
“My primary goal is safety,” she says. “If it looks like a boulder to me, within reasonable guidelines, then I mark it as a boulder. We can’t sample anything if we’re going to crash.”

The team also needs to know where the smallest grains of rock and dust are, as OSIRIS-REx’s sampling arm can pick up grains only about 2 centimeters across. One way to find the small rocks is to measure how well the asteroid’s surface retains heat. Bigger rocks are slower to heat up and slower to cool down, so they’ll radiate heat out into space even on the asteroid’s night side. Smaller grains of dust heat up and cool down much more quickly.

“It’s exactly like a beach,” Walsh says. “During the day it’s scalding hot, but then it’s instantly cold when the sun sets.”

Measurements of the asteroid’s heat storage so far suggest that there are regions with grains as small as 1 or 2 centimeters across, Lauretta said, though it is still too early to be certain.

“I am confident that we’ll find some fine-grained regions,” Lauretta said. Some may be located inside craters. The challenge will be finding an area wide enough that the spacecraft’s navigation system can steer to it accurately.

A new implant uses light to control overactive bladders

A new soft, wireless implant may someday help people who suffer from overactive bladder get through the day with fewer bathroom breaks.

The implant harnesses a technique for controlling cells with light, known as optogenetics, to regulate nerve cells in the bladder. In experiments in rats with medication-induced overactive bladders, the device alleviated animals’ frequent need to pee, researchers report online January 2 in Nature.

Although optogenetics has traditionally been used for manipulating brain cells to study how the mind works, the new implant is part of a recent push to use the technique to tame nerve cells throughout the body (SN: 1/30/10, p. 18). Similar optogenetic implants could help treat disease and dysfunction in other organs, too.
“I was very happy to see this,” says Bozhi Tian, a materials scientist at the University of Chicago not involved in the work. An estimated 33 million people in the United States have overactive bladders. One available treatment is an implant that uses electric currents to regulate bladder nerve cells. But those implants “will stimulate a lot of nerves, not just the nerves that control the bladder,” Tian says. That can interfere with the function of neighboring organs, and continuous electrical stimulation can be uncomfortable.

The new optogenetic approach, however, targets specific nerves in only one organ and only when necessary. To control nerve cells with light, researchers injected a harmless virus carrying genetic instructions for bladder nerve cells to produce a light-activated protein called archaerhodopsin 3.0, or Arch. A stretchy sensor wrapped around the bladder tracks the wearer’s urination habits, and the implant wirelessly sends that information to a program on a tablet computer.
If the program detects the user heeding nature’s call at least three times per hour, it tells the implant to turn on a pair of tiny LEDs. The green glow of these micro light-emitting diodes activates the light-sensitive Arch proteins in the bladder’s nerve cells, preventing the cells from sending so many full-bladder alerts to the brain.
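That control logic amounts to a sliding-window rate threshold. Here is a minimal sketch of the idea in Python; the function names, data structures and event-detection details are illustrative stand-ins, not the published device’s software.

```python
# Minimal sketch of the implant's trigger logic as described above:
# count voiding events in a sliding one-hour window and switch the
# micro-LEDs on when the rate reaches three per hour. All names and
# details are illustrative assumptions.

from collections import deque

WINDOW_SECONDS = 3600
THRESHOLD = 3

events = deque()   # timestamps (seconds) of detected voiding events
leds_on = False

def record_void(timestamp):
    """Called when the bladder sensor detects a voiding event."""
    global leds_on
    events.append(timestamp)
    # drop events older than one hour
    while events and timestamp - events[0] > WINDOW_SECONDS:
        events.popleft()
    leds_on = len(events) >= THRESHOLD   # green light activates Arch

for t in (0, 600, 1500, 2000):   # four events in about 33 minutes
    record_void(t)
print("LEDs on:", leds_on)       # True: rate exceeds three per hour
```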
John Rogers, a materials scientist and bioengineer at Northwestern University in Evanston, Ill., and colleagues tested their implants by injecting rats with the overactive bladder–causing drug cyclophosphamide. Over the next several hours, the implants successfully detected when rats were passing water too frequently, and lit up green to bring the animals’ urination patterns back to normal.

Shriya Srinivasan, a medical engineer at MIT not involved in the work, is impressed with the short-term effectiveness of the implant. But, she says, longer-term studies may reveal complications with the treatment.

For instance, a patient might develop an immune reaction to the foreign Arch protein, which would cripple the protein’s ability to block signals from bladder nerves to the brain. But if proven safe and effective in the long term, similar optogenetic implants that sense and respond to organ motion may also help treat heart, lung or muscle tissue problems, she says.

Optogenetic implants could also monitor other bodily goings-on, says study coauthor Robert Gereau, a neuroscientist at Washington University in St. Louis. Hormone levels and tissue oxygenation or hydration, for example, could be tracked and used to trigger nerve-altering LEDs for medical treatment, he says.

New Horizons shows Ultima Thule looks like a snowman, or maybe BB-8

The results are in: Ultima Thule, the distant Kuiper Belt object that got a close visit from the New Horizons spacecraft on New Year’s Day, looks like two balls stuck together.

“What you are seeing is the first contact binary ever explored by a spacecraft, two separate objects that are now joined together,” principal investigator Alan Stern of the Southwest Research Institute in Boulder, Colo., said January 2 in a news conference held at the Johns Hopkins University Applied Physics Laboratory in Laurel, Md.

“It’s a snowman, if it’s anything at all,” Stern said. (Twitter was quick to supply another analogy: the rolling BB-8 droid from Star Wars.)

That shape is enough to lend credence to the idea that planetary bodies grow up by the slow clumping of small rocks. Ultima Thule, whose official name is 2014 MU69, is thought to be among the oldest and least-altered objects in the solar system, so knowing how it formed can reveal how planets formed in general (SN Online: 12/18/18).
“Think of New Horizons as a time machine … that has brought us back to the very beginning of solar system history, to a place where we can observe the most primordial building blocks of the planets,” said Jeff Moore of NASA’s Ames Research Center in Moffett Field, Calif., who leads New Horizons’ geology team. “It’s gratifying to see these perfectly formed contact binaries in their native habitat. Our ideas of how these things form seem to be somewhat vindicated by these observations.”

The view from about 28,000 kilometers away shows that MU69 is about 33 kilometers long and has two spherical lobes, one about three times the size of the other. The spheres are connected by a narrow “neck” that appears brighter than much of the rest of the surface.
That could be explained by small grains of surface material rolling downhill to settle in the neck, because small grains tend to reflect more light than large ones, said New Horizons deputy project scientist Cathy Olkin of the Southwest Research Institute. Even the brightest areas reflected only about 13 percent of the sunlight that hit them, though. The darkest reflected just 6 percent, about the same brightness as potting soil.

Measurements also show that MU69 rotates once every 15 hours, give or take one hour. That’s a Goldilocks rotation speed, Olkin said. If it spun too fast, MU69 would break apart; too slow would be hard to explain for such a small body. Fifteen hours is just right.
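A back-of-envelope calculation shows why spinning too fast would be fatal. For a cohesionless rubble pile, the fastest stable rotation period is roughly sqrt(3π/Gρ). MU69’s density isn’t reported here, so the sketch below assumes a comet-like value purely for illustration.

```python
import math

# Back-of-envelope check on the "Goldilocks" spin rate. For a rubble
# pile with negligible cohesion, the critical (fastest stable) rotation
# period is roughly T = sqrt(3 * pi / (G * rho)). The density below is
# an assumed comet-like value, not a reported measurement.

G = 6.674e-11    # gravitational constant, m^3 kg^-1 s^-2
rho = 500.0      # assumed bulk density, kg/m^3

t_critical_hours = math.sqrt(3 * math.pi / (G * rho)) / 3600
print(f"breakup period: ~{t_critical_hours:.1f} hours")   # ~4.7 hours

# A 15-hour day is comfortably slower than that limit, consistent with
# the body holding together under its own weak gravity.
```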

The lobes’ spherical shape is best explained by collections of small rocks glomming together to form larger rocks, Moore said. The collisions between the rocks happened at extremely slow speeds, so the rocks accreted rather than breaking each other apart. The final collision was between the two spheres, which the team dubbed “Ultima” (the bigger one) and “Thule” (the smaller one).
That collision probably happened at no more than a few kilometers per hour, “the speed at which you might park your car in a parking space,” Moore said. “If you had a collision with another car at those speeds, you may not even bother to fill out the insurance forms.”

New Horizons also picked up MU69’s reddish color. The science team thinks the rusty hue comes from radiation altering exotic ice, frozen material like methane or nitrogen rather than water, although they don’t know exactly what that ice is made of yet.

The spacecraft is still sending data back to Earth, and will continue transmitting details of the flyby for the next 18 months. Even as the New Horizons team members shared the first pictures from the spacecraft’s flyby, data was arriving that will reveal details of MU69’s surface composition.

“The real excitement today is going to be in the composition team room,” Olkin said. “There’s no way to make anything like this type of observation without having a spacecraft there.”

One Antarctic ice shelf gets half its annual snowfall in just 10 days

Just a few powerful storms in Antarctica can have an outsized effect on how much snow parts of the southernmost continent get. Those ephemeral storms, preserved in ice cores, might give a skewed view of how quickly the continent’s ice sheet has grown or shrunk over time.

Relatively rare extreme precipitation events are responsible for more than 40 percent of the total annual snowfall across most of the continent — and in some places, as much as 60 percent, researchers report March 22 in Geophysical Research Letters.
Climatologist John Turner of the British Antarctic Survey in Cambridge and his colleagues used regional climate simulations to estimate daily precipitation across the continent from 1979 to 2016. Then, the team zoomed in on 10 locations — representing different climates from the dry interior desert to the often snowy coasts and the open ocean — to determine regional differences in snowfall.

While snowfall amounts vary greatly by location, extreme events packed the biggest wallop along Antarctica’s coasts, especially on the floating ice shelves, the researchers found. For instance, the Amery ice shelf in East Antarctica gets roughly half of its annual precipitation — which typically totals about half a meter of snow — in just 10 days, on average. In 1994, the ice shelf got 44 percent of its entire annual precipitation on a single day in September.
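The statistic behind these numbers is straightforward to compute from a daily precipitation series: rank the days, then ask what share of the annual total the biggest few deliver. The sketch below uses randomly generated, storm-skewed data purely for illustration, not the Amery record.

```python
import numpy as np

# Sketch of the bookkeeping behind the statistic above: what fraction
# of a year's snowfall arrives on the N biggest days? The daily series
# is randomly generated for illustration, not real Antarctic data.

rng = np.random.default_rng(0)
daily_precip = rng.gamma(shape=0.3, scale=2.0, size=365)  # storm-skewed

def top_n_fraction(series, n=10):
    """Share of the annual total delivered by the n largest days."""
    biggest = np.sort(series)[-n:]
    return biggest.sum() / series.sum()

print(f"top 10 days: {top_n_fraction(daily_precip):.0%} of the annual total")
```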

Ice cores aren’t just a window into the past; they are also used to predict the continent’s future in a warming world. So characterizing these coastal regions is crucial for understanding Antarctica’s ice sheet — and its potential future contribution to sea level rise.
Editor’s note: This story was updated April 5, 2019, to correct that the results were reported March 22 (not March 25).