‘End of the Megafauna’ examines why so many giant Ice Age animals went extinct

Ross D.E. MacPhee and Peter Schouten (illustrator)
W.W. Norton & Co., $35

Today’s land animals are a bunch of runts compared with creatures from the not-too-distant past. Beasts as big as elephants, gorillas and bears were once much more common around the world. Then, seemingly suddenly, hundreds of big species, including the woolly mammoth, the giant ground sloth and a lizard weighing as much as half a ton, disappeared. In End of the Megafauna, paleomammalogist Ross MacPhee makes one thing clear: The science on what caused the extinctions of these megafauna — animals larger than 44 kilograms, or about 100 pounds — is far from settled.
MacPhee dissects the evidence behind two main ideas: that as humans moved into new parts of the world over the last 50,000 years, people hunted the critters into oblivion, or that changes in climate left the animals too vulnerable to survive. As MacPhee shows, neither scenario matches all of the available data.

Throughout, Peter Schouten’s illustrations, reminiscent of paintings that enliven natural history museums, bring the behemoths back to life. At times, MacPhee slips in too many technical terms. But overall, he offers readers an informative, up-to-date overview of a fascinating period in Earth’s history.


A lack of sleep can induce anxiety

SAN DIEGO — A sleepless night can leave the brain spinning with anxiety the next day.

In healthy adults, overnight sleep deprivation triggered anxiety the next morning, along with altered brain activity patterns, scientists reported November 4 at the annual meeting of the Society for Neuroscience.

People with anxiety disorders often have trouble sleeping. The new results uncover the reverse effect — that poor sleep can induce anxiety. The study shows that “this is a two-way interaction,” says Clifford Saper, a sleep researcher at Harvard Medical School and Beth Israel Deaconess Medical Center in Boston who wasn’t involved in the study. “The sleep loss makes the anxiety worse, which in turn makes it harder to sleep.”
Sleep researchers Eti Ben Simon and Matthew Walker, both of the University of California, Berkeley, studied the anxiety levels of 18 healthy people. Following either a night of sleep or a night of staying awake, these people took anxiety tests the next morning. After sleep deprivation, anxiety levels in these healthy people were 30 percent higher than when they had slept. On average, the anxiety scores reached levels seen in people with anxiety disorders, Ben Simon said November 5 in a news briefing.

What’s more, sleep-deprived people’s brain activity changed. In response to emotional videos, brain areas involved in emotions were more active, and the prefrontal cortex, an area that can put the brakes on anxiety, was less active, functional MRI scans showed.

The results suggest that poor sleep “is more than just a symptom” of anxiety but may, in some cases, be a cause, Ben Simon said.

Sound-absorbent wings and fur help some moths evade bats

Some moths aren’t so easy for bats to detect.

The cabbage tree emperor moth has wings with tiny scales that absorb sound waves sent out by bats searching for food. That absorption reduces the echoes that bounce back to bats, allowing Bunaea alcinoe to avoid being so noticeable to the nocturnal predators, researchers report online November 12 in the Proceedings of the National Academy of Sciences.

“They have this stealth coating on their body surfaces which absorbs the sound,” says study coauthor Marc Holderied, a bioacoustician at the University of Bristol in England. “We now understand the mechanism behind it.”

Bats sense their surroundings using echolocation, sending out sound waves that bounce off objects and return as echoes picked up by the bats’ supersensitive ears (SN: 9/30/17, p. 22). These moths, without ears that might alert them to an approaching predator, have instead developed scales of a size, shape and thickness suited to absorbing ultrasonic sound frequencies used by bats, the researchers found.
The team shot ultrasonic sound waves at a single, microscopic scale and observed it transferring sound wave energy into movement. The scientists then simulated the process with a 3-D computer model that showed the scale absorbing up to 50 percent of the energy from sound waves.

What’s more, it isn’t just wings that help such earless moths evade bats. Other moths in the same family as B. alcinoe also have sound-absorbing fur, the same researchers report online October 18 in the Journal of the Acoustical Society of America.
Holderied and his colleagues studied the fluffy thoraxes of the Madagascan bullseye moth and the promethea silk moth, and found that the fur also absorbs sound waves through a different process called porous absorption. In lab tests, the furry-bellied moths absorbed as much as 85 percent of the sound waves encountered. Researchers suspect that the equally fluffy cabbage tree emperor moth also has this ability.

Other moths that have ears can hear bats coming, and can quickly swerve out of the way of their predators, dipping and diving in dizzying directions (SN: 5/26/18, p. 11). Some moths also have long tails on their wings that researchers suspect can be twirled to disrupt bats’ sound waves (SN: 3/21/15, p. 17). Still other moths produce toxins to fend off foes.

Having sound-absorbent fur and scales “might require a lot less energy in terms of protection from the moth’s side,” says Akito Kawahara, an evolutionary biologist at the Florida Museum of Natural History in Gainesville who was not involved with the study. “It’s a very different kind of passive defense system.”

Holderied and his colleagues hope next to study how multiple scales, locked together, respond to ultrasonic sound waves. The findings could one day help in developing better soundproofing technology for sound engineers and acousticians.

50 years ago, screwworm flies inspired a new approach to insect control

Screwworm fly upsurge

Screwworms, the first pest to be eliminated on a large scale by the use of the sterile male technique, have shown an alarming increase, according to U.S. and Mexican officials…. The screwworm fly lays its eggs in open wounds on cattle. The maggots live on the flesh of their host, causing damage and death, and economic losses of many millions of dollars.
— Science News, November 23, 1968

Update
Though eradicated in the United States in 1966, screwworms reemerged two years later, probably coming up from Mexico. Outbreaks in southern U.S. states in 1972 and in Florida in 2016 were both handled with the sterile male technique, considered one of the most successful approaches for pest control. Males are sterilized with radiation, then released into a population to breed with wild counterparts; no offspring result. The method has been used against other pests, such as mosquitoes, which were dropped by drones over Brazil this year as a test before the technology is deployed against outbreaks of mosquito-borne diseases such as Zika.
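The logic of the sterile male technique can be captured in a toy calculation: when released sterile males far outnumber wild ones, most matings produce no offspring and the population collapses within a few generations. The sketch below is purely illustrative; the release numbers and offspring rate are hypothetical, not figures from the eradication programs.

```python
# Toy model of the sterile insect technique. All numbers are hypothetical.

def next_generation(wild, sterile_released, offspring_per_female=5):
    """Wild population after one breeding cycle, assuming a 50/50 sex ratio
    and females choosing mates at random from wild plus sterile males."""
    females = wild / 2
    wild_males = wild / 2
    fertile_fraction = wild_males / (wild_males + sterile_released)
    return females * offspring_per_female * fertile_fraction

population = 100_000
for generation in range(1, 6):
    population = next_generation(population, sterile_released=400_000)
    print(f"generation {generation}: ~{population:,.0f} wild flies")
```

With sterile males heavily outnumbering wild ones, the modeled population falls from 100,000 to a handful of flies within a few generations, which is why the technique works best when releases dwarf the wild population.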

Engineers are plugging holes in drinking water treatment

Off a gravel road at the edge of a college campus — next door to the town’s holding pen for stray dogs — is a busy test site for the newest technologies in drinking water treatment.

In the large shed-turned-laboratory, University of Massachusetts Amherst engineer David Reckhow has started a movement. More people want to use his lab to test new water treatment technologies than the building has space for.

The lab is a revitalization success story. In the 1970s, when the Clean Water Act put new restrictions on water pollution, the diminutive gray building in Amherst, Mass., was a place to test those pollution-control measures. But funding was fickle, and over the years, the building fell into disrepair. In 2015, Reckhow brought the site back to life. He and a team of researchers cleaned out the junk, whacked the weeds that engulfed the building and installed hundreds of thousands of dollars’ worth of monitoring equipment, much of it donated or bought secondhand.

“We recognized that there’s a lot of need for drinking water technology,” Reckhow says. Researchers, students and start-up companies all want access to test ways to disinfect drinking water, filter out contaminants or detect water-quality slipups. On a Monday afternoon in October, the lab is busy. Students crunch data around a big table in the main room. Small-scale tests of technology that uses electrochemistry to clean water chug along, hooked up to monitors that track water quality. On a lab bench sits a graduate student’s low-cost replica of an expensive piece of monitoring equipment. The device alerts water treatment plants when the by-products of disinfection chemicals in a water supply are reaching dangerous levels. In an attached garage, two startup companies are running larger-scale tests of new kinds of membranes that filter out contaminants.
Parked behind the shed is the almost-ready-to-roll newcomer. Starting in 2019, the Mobile Water Innovation Laboratory will take promising new and affordable technologies to local communities for testing. That’s important, says Reckhow, because there’s so much variety in the quality of water that comes into drinking water treatment plants. On-site testing is the only way to know whether a new approach is effective, he says, especially for newer technologies without long-term track records.

The facility’s popularity reflects a persistent concern in the United States: how to ensure affordable access to clean, safe drinking water. Although U.S. drinking water is heavily regulated and pretty clean overall, recent high-profile contamination cases, such as the 2014 lead crisis in Flint, Mich. (SN: 3/19/16, p. 8), have exposed weaknesses in the system and shaken people’s trust in their tap water.
Tapped out
In 2013 and 2014, 42 drinking water–associated outbreaks resulted in more than 1,000 illnesses and 13 deaths, based on reports to the U.S. Centers for Disease Control and Prevention. The top culprits were Legionella bacteria and some form of chemical, toxin or parasite, according to data published in November 2017.

Those numbers tell only part of the story, however. Many of the contaminants that the U.S. Environmental Protection Agency regulates through the 1974 Safe Drinking Water Act cause problems only when exposure happens over time; the effects of contaminants like lead don’t appear immediately after exposure. Records of EPA rule violations note that in 2015, 21 million people were served by drinking water systems that didn’t meet standards, researchers reported in a February study in the Proceedings of the National Academy of Sciences. That report tracked trends in drinking water violations from 1982 to 2015.
Current technology can remove most contaminants, says David Sedlak, an environmental engineer at the University of California, Berkeley. Those include microbes, arsenic, nitrates and lead. “And then there are some that are very difficult to degrade or transform,” such as industrial chemicals called PFAS.

Smaller communities, especially, can’t always afford top-of-the-line equipment or infrastructure overhauls to, for example, replace lead pipes. So Reckhow’s facility is testing approaches to help communities address water-quality issues in affordable ways.
Some researchers are adding technologies to deal with new, potentially harmful contaminants. Others are designing approaches that work with existing water infrastructure or clean up contaminants at their source.

How is your water treated?
A typical drinking water treatment plant sends water through a series of steps.

First, coagulants are added to the water. These chemicals clump together sediments, which can cloud water or make it taste funny, so they are bigger and easier to remove. A gentle shaking or spinning of the water, called flocculation, helps those clumps form (1). Next, the water flows into big tanks to sit for a while so the sediments can fall to the bottom (2). The cleaner water then moves through membranes that filter out smaller contaminants (3). Disinfection, via chemicals or ultraviolet light, kills harmful bacteria and viruses (4). Then the water is ready for distribution (5).
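For readers who like to see the flow spelled out, here is a minimal sketch of those five stages as a processing pipeline. The stage functions and removal fractions are invented for illustration; real plants tune each step to their source water.

```python
# Illustrative pipeline of the five treatment stages described above.
# Removal fractions are hypothetical placeholders, not real plant data.
from dataclasses import dataclass, replace

@dataclass
class Water:
    sediment: float        # suspended particles, arbitrary units
    microbes: float        # bacteria and viruses, arbitrary units
    clumped: bool = False  # have coagulants clumped the sediment?

def coagulate_and_flocculate(w):                    # step 1: coagulants + gentle mixing
    return replace(w, clumped=True)                 # clumps form; nothing removed yet

def settle(w):                                      # step 2: clumps sink in settling tanks
    fraction_left = 0.1 if w.clumped else 0.8
    return replace(w, sediment=w.sediment * fraction_left)

def filter_membranes(w):                            # step 3: membranes catch smaller bits
    return replace(w, sediment=w.sediment * 0.1, microbes=w.microbes * 0.5)

def disinfect(w):                                   # step 4: chlorine or UV light
    return replace(w, microbes=w.microbes * 0.001)

def distribute(w):                                  # step 5: send to the taps
    return w

water = Water(sediment=100.0, microbes=100.0)
for step in (coagulate_and_flocculate, settle, filter_membranes, disinfect, distribute):
    water = step(water)
print(water)   # roughly: sediment ~1, microbes ~0.05 after treatment
```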
There’s a lot of room for variation within that basic water treatment process. Chemicals added at different stages can trigger reactions that break down chunky, toxic organic molecules into less harmful bits. Ion-exchange systems that separate contaminants by their electric charge can remove ions like magnesium or calcium that make water “hard,” as well as heavy metals, such as lead and arsenic, and nitrates from fertilizer runoff. Cities mix and match these strategies, adjusting chemicals and prioritizing treatment components, based on the precise chemical qualities of the local water supply.

Some water utilities are streamlining the treatment process by installing technologies like reverse osmosis, which removes nearly everything from the water by forcing the water molecules through a selectively permeable membrane with extremely tiny holes. Reverse osmosis can replace a number of steps in the water treatment process or reduce the number of chemicals added to water. But it’s expensive to install and operate, keeping it out of reach for many cities.

Fourteen percent of U.S. residents get water from wells and other private sources that aren’t regulated by the Safe Drinking Water Act. These people face the same contamination challenges as municipal water systems do, but without the regulatory oversight, community support or funding.

“When it comes to lead in private wells … you’re on your own. Nobody is going to help you,” says Marc Edwards, the Virginia Tech engineer who helped uncover the Flint water crisis. Edwards and Virginia Tech colleague Kelsey Pieper collected water-quality data from over 2,000 wells across Virginia in 2012 and 2013. Some were fine, but others had lead levels of more than 100 parts per billion. When levels exceed the EPA’s 15 ppb threshold, the agency mandates that cities take steps to control corrosion and notify the public about the contamination. The researchers reported those findings in 2015 in the Journal of Water and Health.

To remove lead and other contaminants, well users often rely on point-of-use treatments. A filter on the tap removes most, but not all, contaminants. Some people spring for costly reverse osmosis systems.
New tech solutions
These three new water-cleaning approaches wouldn’t require costly infrastructure overhauls.

Ferrate to cover many bases
Reckhow’s team at UMass Amherst is testing ferrate, an ion of iron, as a replacement for several water treatment steps. First, ferrate kills bacteria in the water. Next, it breaks down carbon-based chemical contaminants into smaller, less harmful molecules. Finally, it makes ions like manganese less soluble in water so they are easier to filter out, Reckhow and colleagues reported in 2016 in Journal – American Water Works Association. With its multifaceted effects, ferrate could potentially streamline the drinking water treatment process or reduce the use of chemicals, such as chlorine, that can yield dangerous by-products, says Joseph Goodwill, an environmental engineer at the University of Rhode Island in Kingston.

Ferrate could be a useful disinfectant for smaller drinking water systems that don’t have the infrastructure, expertise or money to implement something like ozone treatment, an approach that uses ozone gas to break down contaminants, Reckhow says.

Early next year, in the maiden voyage of his mobile water treatment lab, Reckhow plans to test the ferrate approach in the small Massachusetts town of Gloucester.
In the 36-foot trailer is a squeaky-clean array of plastic pipes and holding tanks. The setup routes incoming water through the same series of steps — purifying, filtering and disinfecting — that one would find in a standard drinking water treatment plant. With two sets of everything, scientists can run side-by-side experiments, comparing a new technology’s performance against the standard approach. That way researchers can see whether a new technology works better than existing options, says Patrick Wittbold, the UMass Amherst research engineer who headed up the trailer’s design.

Charged membranes
Filtering membranes tend to get clogged with small particles. “That’s been the Achilles’ heel of membrane treatment,” says Brian Chaplin, an engineer at the University of Illinois at Chicago. Unclogging the filter wastes energy and increases costs. Electricity might solve that problem and offer some side benefits, Chaplin suggests.

His team tested an electrochemical membrane made of titanium oxide or titanium dioxide that both filters water and acts as an electrode. Chemical reactions happening on the electrically charged membranes can turn nitrates into nitrogen gas or split water molecules, generating reactive ions that can oxidize contaminants in the water. The reactions also prevent particles from sticking to the membrane. Large carbon-based molecules like benzene become smaller and less harmful.
In lab tests, the membranes effectively filtered and destroyed contaminants, Chaplin says. In one test, a membrane transformed 67 percent of the nitrates in a solution into other molecules. The finished water was below the EPA’s regulatory nitrate limit of 10 parts per million, he and colleagues reported in July in Environmental Science and Technology. Chaplin expects to move the membrane into pilot tests within the next two years.
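A quick back-of-the-envelope check shows what a 67 percent conversion implies relative to the EPA limit. The influent concentration below is an assumed value for illustration, not a number from the study.

```python
# Hypothetical arithmetic around the reported 67 percent nitrate conversion.
EPA_NITRATE_LIMIT_PPM = 10.0
CONVERSION = 0.67                    # share of nitrates transformed in the test

influent_ppm = 25.0                  # assumed raw-water nitrate level
effluent_ppm = influent_ppm * (1 - CONVERSION)
print(f"effluent: {effluent_ppm:.1f} ppm "
      f"({'meets' if effluent_ppm <= EPA_NITRATE_LIMIT_PPM else 'exceeds'} the EPA limit)")

# Highest raw-water level that a 67 percent conversion could bring under the limit
max_treatable_ppm = EPA_NITRATE_LIMIT_PPM / (1 - CONVERSION)
print(f"works for raw water up to ~{max_treatable_ppm:.0f} ppm nitrate")
```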

Obliterate the PFAS
The industrial chemicals known as PFAS present two challenges. Only the larger ones are effectively removed by granular activated carbon, the active material in many household water filters. The smaller PFAS remain in the water, says Christopher Higgins, an environmental engineer at the Colorado School of Mines in Golden. Plus, filtering isn’t enough because the chunky chemicals are hard to break down for safe disposal.

Higgins and colleague Timothy Strathmann, also at the Colorado School of Mines, are working on a process to destroy PFAS. First, a specialized filter with tiny holes grabs the molecules out of the water. Then, sulfite is added to the concentrated mixture of contaminants. When hit with ultraviolet light, the sulfite generates reactive electrons that break down the tough carbon-fluorine bonds in the PFAS molecules. Within 30 minutes, the combination of UV radiation and sulfites almost completely destroyed one type of PFAS, other researchers reported in 2016 in Environmental Science and Technology.

Soon, Higgins and Strathmann will test the process at Peterson Air Force Base in Colorado, one of nearly 200 U.S. sites known to have groundwater contaminated by PFAS. Cleaning up those sites would remove the pollutants from groundwater that may also feed wells or city water systems.

Why a chemistry teacher started a science board game company

A physicist, a gamer and two editors walk into a bar. No, this isn’t the setup for some joke. After work one night, a few Science News staffers tried out a new board game, Subatomic. This deck-building game combines chemistry and particle physics for an enjoyable — and educational — time.

Subatomic is simple to grasp: Players use quark and photon cards to build protons, neutrons and electrons. With those three particles, players then construct chemical elements to score points. Scientists are the wild cards: Joseph J. Thomson, Maria Goeppert-Mayer, Marie Curie and other Nobel laureates who discovered important things related to the atom provide special abilities or help thwart other players.
The game doesn’t shy away from difficult or unfamiliar concepts. Many players might be unfamiliar with quarks, a group of elementary particles. But after a few rounds, it’s ingrained in your brain that, for example, two up quarks and one down quark create a proton. And Subatomic includes a handy booklet that explains in easy-to-understand terms the science behind the game. The physicist in our group vouched for the game’s accuracy but had one qualm: Subatomic claims that two photons, or particles of light, can create an electron. That’s theoretically possible, but scientists have yet to confirm it in the lab.
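For readers new to quark bookkeeping, the arithmetic behind those card combinations is easy to check: up quarks carry a charge of +2/3 and down quarks −1/3, so two ups and a down sum to the proton’s +1 charge. A minimal sketch (the code is just an illustration, not part of the game):

```python
# Charge arithmetic behind Subatomic's quark combinations,
# in units of the elementary charge e.
from fractions import Fraction

CHARGE = {"up": Fraction(2, 3), "down": Fraction(-1, 3)}

def total_charge(quarks):
    return sum(CHARGE[q] for q in quarks)

proton = ["up", "up", "down"]      # two up quarks + one down quark
neutron = ["up", "down", "down"]   # one up quark + two down quarks

print("proton charge: ", total_charge(proton))    # prints 1
print("neutron charge:", total_charge(neutron))   # prints 0
```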

The mastermind behind Subatomic is John Coveyou, who has a master’s degree in energy, environmental and chemical engineering. As the founder and CEO of Genius Games, he has created six other games, including Ion (SN: 5/30/15, p. 29) and Linkage (SN: 12/27/14, p. 32). Next year, he’ll add a periodic table game to the list. Because Science News has reviewed several of his games, we decided to talk with Coveyou about where he gets his inspiration and how he includes real science in his products. The following discussion has been edited for length and clarity.
SN: When did you get interested in science?

Coveyou: My mom was mentally and physically disabled, and my dad was in and out of prison and mental institutions. So early on, things were very different for me. I ended up leaving home when I was in high school, hopscotching among 12 different homes throughout my junior and senior year. I almost dropped out, but I had a lot of teachers who were amazing mentors. I didn’t know what else to do, so I joined the Army. While I was in Iraq, I had a bunch of science textbooks shipped to me, and I read them in my free time. They took me out of the environments I was in and became extremely therapeutic. A lot of the issues we face as a society can be worked on by the next generation having a command of the sciences. So I’m very passionate about teaching people the sciences and helping people find joy in them.

SN: Why did you start creating science games?

Coveyou: I was teaching chemistry at a community college, and I noticed that my students were really intimidated by the chemistry concepts before they even came into the classroom. They really struggled with a lot of the basic terminology. At the same time, I’ve been a board gamer pretty much my whole life. And it kind of hit me like, “Whoa, wait a second. What if I made some games that taught some of the concepts that I’m trying to teach my chemistry students?” So I just took a shot at it. The first couple of games were terrible. I didn’t really know what I was doing, but I kept at it.

SN: How do you test the games?

Coveyou: We first test with other gamers. Once we’re ready to get feedback from the general public, we go to middle school or high school students. Once we test a game with people face-to-face, we will send it across the world to about 100 to 200 different play testers, and those vary from your hard-core gamers to homeschool families to science teachers, who try it in the classroom.

SN: How do you incorporate real science into your games?

Coveyou: I pretty much always start with a science concept in mind and think about how we can create a game that best reflects the science we want to communicate. For all of our upcoming games, we include a booklet about the science. That document is not created by Genius Games. We have about 20 to 30 Ph.D.s and doctors across the globe who write the content and edit each other. That’s been a real treat, to actually show players how the game is accurate. We’ve had so many scientists and teachers who are just astonished that we created something like this that was accurate, but also fun to play.

Voyager 2 spacecraft enters interstellar space

Voyager 2 has entered interstellar space. The spacecraft slipped out of the heliosphere, the huge bubble of particles that encircles the solar system, on November 5, becoming the second human-made craft ever to cross the heliopause, the boundary where the sun’s influence gives way to the space between the stars.

Coming in second place is no mean achievement. Voyager 1 became the first spacecraft to exit the solar system in 2012. But that craft’s plasma instrument stopped working in 1980, leaving scientists without a direct view of the solar wind, hot charged particles constantly streaming from the sun (SN Online: 9/12/13). Voyager 2’s plasma sensors are still working, providing unprecedented views of the space between stars.

“We’ve been waiting with bated breath for the last couple of months for us to be able to see this,” NASA solar physicist Nicola Fox said at a Dec. 10 news conference at the American Geophysical Union meeting in Washington, D.C.

NASA launched the twin Voyager spacecraft in 1977 on a grand tour of the solar system’s planets (SN: 8/19/17, p. 26). After that initial tour was over, both spacecraft continued traveling through the bubble of plasma that originates at the sun.
“When Voyager was launched, we didn’t know how large the bubble was, how long it would take to get [to its edge] and whether the spacecraft could last long enough to get there,” said Voyager project scientist Edward Stone of Caltech.

For most of Voyager 2’s journey, the spacecraft’s Plasma Science Experiment measured the speed, density, temperature, pressure and other properties of the solar wind. But on November 5, the experiment saw a sharp drop in the speed and the number of solar wind particles that hit the detector each second. At the same time, another detector started picking up more high-energy particles called cosmic rays that originate elsewhere in the galaxy.
Those measurements suggest that Voyager 2 has reached the region where the solar wind slams into the colder, denser population of particles that fill the space between stars. Voyager 2 is now a little more than 18 billion kilometers from the sun.
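To put that distance in perspective, a quick conversion using standard constants (figures rounded):

```python
# Context for "a little more than 18 billion kilometers" from the sun.
AU_KM = 149_597_870.7            # one astronomical unit, in kilometers
LIGHT_SPEED_KM_S = 299_792.458   # speed of light

distance_km = 18e9
print(f"~{distance_km / AU_KM:.0f} astronomical units from the sun")    # ~120 AU
hours = distance_km / LIGHT_SPEED_KM_S / 3600
print(f"~{hours:.1f} hours for a radio signal to cross that distance")  # ~16.7 hours
```

In other words, Voyager 2 is about 120 times farther from the sun than Earth is, and its radio messages take roughly 17 hours to reach home.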

Intriguingly, Voyager 2’s measurements of cosmic rays and magnetic fields — which Voyager 1 could still make when it crossed the boundary — did not exactly match up with Voyager 1’s observations.
“That’s what makes it interesting,” Stone said. The variations probably arise because the two spacecraft exited the heliosphere in different places, and because the sun is at a different point in its 11-year activity cycle than it was in 2012. “We would have been amazed if they had looked the same.”

The Voyagers probably have between five and 10 years left to continue exploring interstellar space, said Voyager project manager Suzanne Dodd from NASA’s Jet Propulsion Laboratory in Pasadena, Calif.

“Both spacecraft are very healthy if you consider them senior citizens,” Dodd said. The biggest concern is how much power they have left and how cold they are — Voyager 2 is currently about 3.6° Celsius, close to the freezing point of its hydrazine fuel. In the near future, the team will have to turn off some of the spacecraft’s instruments to keep the craft operating and sending data back to Earth.

“We do have difficult decisions ahead,” Dodd said. She added that her personal goal is to see the spacecraft last until 2027, for a total of 50 years in space. “That would be fantastic.”

NASA’s OSIRIS-REx finds signs of water on the asteroid Bennu

As the asteroid Bennu comes into sharper focus, planetary scientists are seeing signs of water locked up in the asteroid’s rocks, NASA team members announced December 10.

“It’s one of the things we were hoping to find,” team member Amy Simon of NASA’s Goddard Space Flight Center in Greenbelt, Md., said in a news conference at the American Geophysical Union meeting in Washington, D.C. “This is evidence of liquid water in Bennu’s past. This is really big news.”
NASA’s OSIRIS-REx spacecraft just arrived at Bennu on December 3 (SN Online: 12/3/18). Over the next year, the team will search for the perfect spot on the asteroid to grab a handful of dust and return it to Earth. “Very early in the mission, we’ve found out Bennu is going to provide the type of material we want to return,” said principal investigator Dante Lauretta of the University of Arizona in Tucson. “It definitely looks like we’ve gone to the right place.”

OSIRIS-REx’s onboard spectrometers measure the chemical signatures of various minerals based on the wavelengths of light they emit and absorb. The instruments spotted signs of hydrated minerals on Bennu’s surface about a month before the spacecraft arrived at the asteroid, and the signal remained strong across the asteroid’s surface as the spacecraft closed in, Simon said. Those minerals can form only in the presence of liquid water, suggesting that Bennu had a hydrothermal system in its past.

Bennu’s surface is also covered in more boulders and craters than the team had expected based on observations of the asteroid taken from Earth. Remote observations led the team to expect a few large boulders, about 10 meters wide. Instead they see hundreds, some of them up to 50 meters wide.

“It’s a little more rugged of an environment,” Lauretta said. But that rough surface can reveal details of Bennu’s internal structure and history.
If Bennu were one solid mass, for instance, a major impact could crack or shatter its entire surface. The fact that it bears large craters yet remains intact suggests that the asteroid may instead be a rubble pile loosely held together by its own gravity.
The asteroid’s density supports the rubble pile idea. OSIRIS-REx’s first estimate of Bennu’s density shows it is about 1,200 kilograms per cubic meter, Lauretta said. The average rock is about 3,000 kilograms per cubic meter. The hydrated minerals go some way towards lowering the asteroid’s density, since water is less dense than rock. But up to 40 percent of the asteroid may be full of caves and voids as well, Lauretta said.
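A rough way to see where the “up to 40 percent” figure comes from is to compare the bulk density with the density of the solid rock itself. The grain density used below is an assumed value for hydrated, carbonaceous rock, not a number from the mission:

```python
# Rough porosity estimate for a rubble pile (grain density is an assumption).
bulk_density = 1_200.0    # kg per cubic meter, OSIRIS-REx's first estimate
grain_density = 2_000.0   # kg per cubic meter, assumed for water-bearing rock

porosity = 1 - bulk_density / grain_density
print(f"roughly {porosity:.0%} of Bennu's volume could be empty space")   # ~40%
```

The looser the packing of boulders and dust, the further the bulk density falls below that of the rock it is made of.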

Some of the rocks on the surface appear to be fractured in a spindly pattern. “If you drop a dinner plate on the ground, you get a spider web of fractures,” says team member Kevin Walsh of the Southwest Research Institute in Boulder, Colo. “We’re seeing this in some boulders.”

The boulders may have cracked in response to the drastic change in temperatures they experience as the asteroid spins. Studying those fracture patterns in more detail will reveal the properties of the rocks.

The OSIRIS-REx team also needs to know how many boulders of various sizes are strewn across the asteroid’s surface. Any rock larger than about 20 centimeters across would pose a hazard to the spacecraft’s sampling arm, says Keara Burke of the University of Arizona. Burke, an undergraduate engineering student, is heading up a boulder mapping project.
“My primary goal is safety,” she says. “If it looks like a boulder to me, within reasonable guidelines, then I mark it as a boulder. We can’t sample anything if we’re going to crash.”

The team also needs to know where the smallest grains of rock and dust are, as OSIRIS-REx’s sampling arm can pick up grains only about 2 centimeters across. One way to find the small rocks is to measure how well the asteroid’s surface retains heat. Bigger rocks are slower to heat up and slower to cool down, so they’ll radiate heat out into space even on the asteroid’s night side. Smaller grains of dust heat up and cool down much more quickly.

“It’s exactly like a beach,” Walsh says. “During the day it’s scalding hot, but then it’s instantly cold when the sun sets.”

Measurements of the asteroid’s heat storage so far suggest that there are regions with grains as small as 1 or 2 centimeters across, Lauretta said, though it is still too early to be certain.

“I am confident that we’ll find some fine-grained regions,” Lauretta said. Some may be located inside craters. The challenge will be finding an area wide enough that the spacecraft’s navigation system can steer to it accurately.

The list of extreme weather caused by human-driven climate change grows

WASHINGTON – A months-long heat wave that scorched the Tasman Sea beginning in November of 2017 is the latest example of an extreme event that would not have happened without human-caused climate change.

Climate change also increased the likelihood of 15 other extreme weather events in 2017, from droughts in East Africa and the U.S. northern Plains states to floods in Bangladesh, China and South America, scientists reported December 10 at a news conference at the American Geophysical Union’s fall meeting. The findings were also published online December 10 in a series of studies in a special issue of the Bulletin of the American Meteorological Society.
One study, of wildfires in Australia, was inconclusive on whether climate change influenced the event. And for the first time, none of the extreme events studied was determined to be the product of natural climate variability.

The findings mark the second year in a row — and only the second time — that scientists contributing to this special issue have definitively linked human-caused climate change with specific extreme weather events (SN: 1/20/18, p. 6). To the editors of the special issue, this latest tally is representative of the new normal in which the world finds itself.

“Many events were found to have appreciable climate change input; that’s not itself a surprise,” said Martin Hoerling, a special editor of the issue, at the news conference. “We are in a world that is warmer than it was in the 20th century, and we keep moving away from that baseline….”

“Nature is unfolding itself in front of our eyes,” added Hoerling, a research meteorologist with the U.S. National Oceanic and Atmospheric Administration in Boulder, Colo.
Marine heat waves
Several marine heat waves have struck the Tasman Sea, located between Australia and New Zealand, in the last decade, including a severe heat wave during the Southern Hemisphere summer of 2015 to 2016. But the 2017–2018 event extended across a much broader area, encompassing the entire sea. At its most severe point, ocean temperatures climbed to at least 2 degrees Celsius above average, devastating the region’s iconic kelp forests and contributing to record-breaking summer temperatures in New Zealand.

Climate change was also responsible for another marine heat wave off the coast of East Africa that lasted from March to June 2017, according to a separate study. That marine heat wave, which the researchers found could not have happened in a preindustrial climate, also may have contributed to a drought in East Africa that caused food shortages for millions of people in the Horn of Africa, including 6 million in Somalia alone. The hot sea surface temperatures, the researchers found, doubled the probability that such a drought would occur.

“Any given extreme event might occur, but the severity of the events, that’s really what has changed. And it’s going to continue to change,” says Karsten Haustein, a climate scientist at the University of Oxford who is part of a research group that specializes in such climate attribution studies. Haustein is a coauthor on a study included in the collection that found that climate change dramatically increased the likelihood — by as much as 100 percent — of a six-day rainstorm that inundated Bangladesh in March 2017. The rainfall, which caused a flash flood, occurred before the onset of the monsoon season and proved devastating to farmers, Haustein says.

Legal liability
The new issue highlights how the field of climate attribution science overall has crossed a critical threshold when it comes to liability, Lindene Patton, a strategic advisor at the Earth & Water Law Group in Washington, D.C., who specializes in climate attribution, said at the news conference. Although climate change was not found to be definitively to blame in most of the studies, it very likely was responsible for or intensified the impacts of nearly every extreme event examined in the issue — and that level of statistical certainty is enough to be legally important, Patton said. “The sufficiency of certainty differs in a court of law and in science. Perfection is not required; you just need to know if it’s more likely than not.”

The threat of liability may not be the ideal way to achieve more environment-friendly policies — but there is a precedent for it, she noted. “We clearly saw the emergence of liability in the 1970s with pollution” as a precursor to pollutant legislation.

BAMS Editor in Chief Jeff Rosenfeld acknowledges that in a world where real-time attribution studies of events such as 2018’s Hurricane Florence are becoming more common (SN Online: 9/13/18), the detailed, retrospective analyses of the BAMS special issue that lag by a year may seem a bit slow. “The funny thing is, initially, we considered it fast response,” he says.

But he thinks the looming question of climate liability highlights why the slower, more deliberate BAMS studies will continue to remain relevant, even in the swiftly changing climate of attribution science. “The people who are decision makers want numbers. They want risk factors.”

A new implant uses light to control overactive bladders

A new soft, wireless implant may someday help people who suffer from overactive bladder get through the day with fewer bathroom breaks.

The implant harnesses a technique for controlling cells with light, known as optogenetics, to regulate nerve cells in the bladder. In experiments in rats with medication-induced overactive bladders, the device alleviated animals’ frequent need to pee, researchers report online January 2 in Nature.

Although optogenetics has traditionally been used for manipulating brain cells to study how the mind works, the new implant is part of a recent push to use the technique to tame nerve cells throughout the body (SN: 1/30/10, p. 18). Similar optogenetic implants could help treat disease and dysfunction in other organs, too.
“I was very happy to see this,” says Bozhi Tian, a materials scientist at the University of Chicago not involved in the work. An estimated 33 million people in the United States have overactive bladders. One available treatment is an implant that uses electric currents to regulate bladder nerve cells. But those implants “will stimulate a lot of nerves, not just the nerves that control the bladder,” Tian says. That can interfere with the function of neighboring organs, and continuous electrical stimulation can be uncomfortable.

The new optogenetic approach, however, targets specific nerves in only one organ and only when necessary. To control nerve cells with light, researchers injected a harmless virus carrying genetic instructions for bladder nerve cells to produce a light-activated protein called archaerhodopsin 3.0, or Arch. A stretchy sensor wrapped around the bladder tracks the wearer’s urination habits, and the implant wirelessly sends that information to a program on a tablet computer.
If the program detects the user heeding nature’s call at least three times per hour, it tells the implant to turn on a pair of tiny LEDs. The green glow of these micro light-emitting diodes activates the light-sensitive Arch proteins in the bladder’s nerve cells, preventing the cells from sending so many full-bladder alerts to the brain.
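The decision logic the researchers describe (a stretch sensor logs each void, and three or more within an hour triggers the LEDs) can be sketched in a few lines. The class and method names below are hypothetical, for illustration only, and are not the team’s actual software:

```python
# Simplified sketch of the implant's control loop, as described in the text.
# Names and structure are illustrative, not the researchers' code.
from collections import deque
import time

class BladderController:
    def __init__(self, max_voids_per_hour=3):
        self.max_voids = max_voids_per_hour
        self.void_times = deque()               # timestamps of recent voids

    def record_void(self, timestamp):
        """Log a urination event detected by the stretchy bladder sensor."""
        self.void_times.append(timestamp)
        while self.void_times and timestamp - self.void_times[0] > 3600:
            self.void_times.popleft()           # keep only the last hour
        if len(self.void_times) >= self.max_voids:
            self.turn_on_leds()

    def turn_on_leds(self):
        print("LEDs on: green light activates Arch, quieting bladder nerves")

controller = BladderController()
now = time.time()
for offset in (0, 900, 1_700):                  # three voids within one hour
    controller.record_void(now + offset)
```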
John Rogers, a materials scientist and bioengineer at Northwestern University in Evanston, Ill., and colleagues tested their implants by injecting rats with the overactive bladder–causing drug cyclophosphamide. Over the next several hours, the implants successfully detected when rats were passing water too frequently, and lit up green to bring the animals’ urination patterns back to normal.

Shriya Srinivasan, a medical engineer at MIT not involved in the work, is impressed with the short-term effectiveness of the implant. But, she says, longer-term studies may reveal complications with the treatment.

For instance, a patient might develop an immune reaction to the foreign Arch protein, which would cripple the protein’s ability to block signals from bladder nerves to the brain. But if proven safe and effective in the long term, similar optogenetic implants that sense and respond to organ motion may also help treat heart, lung or muscle tissue problems, she says.

Optogenetic implants could also monitor other bodily goings-on, says study coauthor Robert Gereau, a neuroscientist at Washington University in St. Louis. Hormone levels and tissue oxygenation or hydration, for example, could be tracked and used to trigger nerve-altering LEDs for medical treatment, he says.