Cause of mass starfish die-offs is still a mystery

In the summer of 2013, an epidemic began sweeping through the intertidal zone off the west coast of North America. The victims were several species of sea star, including Pisaster ochraceus, a species that comes in orange and purple variants. (It’s also notable because it’s the starfish that provided ecology with the fundamental concept of a keystone species.) Affected individuals appeared to “melt,” losing their grip on the rocks to which they were attached — and then losing their arms. This sea star wasting disease, as it is known, soon killed sea stars from Baja California to Alaska.

This wasn’t the first outbreak of sea star wasting disease. A 1978 outbreak in the Gulf of California, for instance, killed so many Heliaster kubiniji sun stars that the once ubiquitous species is now incredibly rare.

These past outbreaks, though, unfolded quickly and over smaller regions, so scientists struggled to pin down a cause. Because the latest outbreak spans such a large — and well-studied — region and stretch of time, marine biologists have been able to gather more data on the disease than ever before. And they’re getting closer to figuring out just what triggered this latest incident.

One likely factor is the sea star-associated densovirus, which, in 2014, scientists reported finding in greater abundance in starfish with sea star wasting disease than in healthy sea stars. But the virus can’t be the only cause of the disease; it’s found in both healthy and sick sea stars, and it has been around since at least 1942, the earliest year it has been found in museum specimens. So there must be some other factor at play.

Earlier this year, scientists studying the outbreak in Washington state reported in the Proceedings of the Royal Society B that warm waters may increase disease progression and rates of death. Studies of California starfish came to a similar conclusion. But a new study, appearing May 4 in PLOS ONE, finds that may not be true for sea stars in Oregon. Bruce Menge and colleagues at Oregon State University took advantage of their long-term study of Oregon starfish to evaluate what happened to sea stars during the recent epidemic and found that wasting disease increased with cooler, not warmer, temperatures. “Given conflicting results on the role of temperature as a trigger of [sea star wasting disease], it seems most likely that multiple factors interacted in complex ways to cause the outbreak,” they conclude.
What those factors are, though, is still a mystery.

Also unclear is what long-term effects this outbreak will have on Pacific intertidal communities.

In the 1960s, Robert Paine of the University of Washington performed what is now considered a classic experiment. For years, he removed starfish from one area of rock in Makah Bay at the northwestern tip of Washington and left another bit of rock alone as a control. Without the starfish to prey on them, mussels were able to take over. The sea stars, Paine concluded, were a “keystone species” that kept the local food web in check.

If sea star wasting disease has similar effects on the Pacific intertidal food web, Menge and his colleagues write, “it would result in losses or large reductions of many species of macrophytes, anemones, limpets, chitons, sea urchins and other organisms from the low intertidal zone.”

What happens, the group says, may depend on how quickly the disease disappears from the region and how many young sea stars can grow up and start munching on mussels.

Stephen Hawking finds the inner genius in ordinary people

It’s hard to believe that it took reality television this long to get around to dealing with space, time and our place in the cosmos.

In PBS’ Genius by Stephen Hawking, the physicist sets out to prove that anyone can tackle humankind’s big questions for themselves. Each of the series’ six installments focuses on a different problem, such as the possibility of time travel or the likelihood that there is life elsewhere in the universe. With Hawking as a guide, three ordinary folks must solve a series of puzzles that guide them toward enlightenment about that episode’s theme. Rather than line up scientists to talk at viewers, the show invites us to follow each episode’s trio on a journey of discovery.

By putting the focus on nonexperts, Genius emphasizes that science is not a tome of facts handed down from above but a process driven by curiosity. After working through a demonstration of how time slows down near a black hole, one participant reflects: “It’s amazing to see it play out like this.”

The show is a fun approach to big ideas in science and philosophy, and the enthusiasm of the guests is infectious. Without knowing what was edited out, though, it’s difficult to say whether the show proves Hawking’s belief that anyone can tackle these heady questions. Each situation is carefully designed to lead the participants to specific conclusions, and there seems to be some off-camera prompting.

But the bigger message is a noble one: A simple and often surprising chain of reasoning can lead to powerful insights about the universe, and reading about the cosmos pales next to interacting with stand-ins for its grandeur. It’s one thing, for example, to hear that there are roughly 300 billion stars in the Milky Way. But to stand next to a mountain of sand where each grain represents one of those stars is quite another. “I never would have got it until I saw it,” says one of the guests, gesturing to the galaxy of sand grains. “This I get.”
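
For a sense of scale, here is a rough back-of-envelope calculation (an illustration under stated assumptions, not the show’s prop design): with roughly 300 billion stars and a typical sand grain of about one cubic millimeter, the galaxy of sand works out to about 300 cubic meters, a mound several meters high.

```python
# Back-of-envelope estimate (illustrative assumptions, not the show's figures):
# how much sand does one grain per Milky Way star add up to?
import math

stars = 300e9              # rough star count from the text
grain_volume_mm3 = 1.0     # assumed volume of a typical sand grain

total_volume_m3 = stars * grain_volume_mm3 * 1e-9   # 1 mm^3 = 1e-9 m^3

# Model the pile as a cone with height equal to half its base radius:
# V = (1/3) * pi * r^2 * (r/2) = pi * r^3 / 6, so r = (6V / pi) ** (1/3)
radius_m = (6 * total_volume_m3 / math.pi) ** (1 / 3)
height_m = radius_m / 2

print(f"about {total_volume_m3:.0f} cubic meters of sand")
print(f"a cone roughly {2 * radius_m:.0f} m across and {height_m:.1f} m tall")
```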

Snot could be crucial to dolphin echolocation

In hunting down delicious fish, Flipper may have a secret weapon: snot.

Dolphins emit a series of quick, high-frequency sounds — probably by forcing air over tissues in the nasal passage — to find and track potential prey. “It’s kind of like making a raspberry,” says Aaron Thode of the Scripps Institution of Oceanography in San Diego. Thode and colleagues tweaked a human speech modeling technique to reproduce dolphin sounds and discern the intricacies of their unique style of sound production. He presented the results on May 24 in Salt Lake City at the annual meeting of the Acoustical Society of America.

Dolphin clicks have two parts: a thump and a ring. The researchers’ model worked on the assumption that lumps of tissue bumping together produce the thump and that those tissues pulling apart produce the ring. But to match the high frequencies of live bottlenose dolphins, the researchers had to make the surfaces of those tissues sticky. That suggests that mucus lining the nasal passage tissue is crucial to dolphin sonar.
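
As a rough illustration only, a toy sketch (not the researchers’ adapted speech model) can caricature a click as a brief broadband “thump” followed by a damped, ringing sinusoid. The ring frequency here is simply a free parameter; in the actual model, it was the sticky, mucus-coated tissue surfaces that pushed the ring up to realistic bottlenose dolphin frequencies.

```python
# Toy caricature of a dolphin click (not the researchers' model): a short,
# broadband "thump" transient followed by an exponentially damped "ring."
# The ring frequency is a free parameter here; in the study, sticky tissue
# surfaces were needed for the model to reach frequencies this high.
import numpy as np

def synth_click(ring_hz=110_000, sample_rate=500_000, duration=0.0004):
    t = np.arange(0, duration, 1.0 / sample_rate)
    rng = np.random.default_rng(0)
    thump = 0.3 * np.exp(-t / 20e-6) * rng.normal(size=t.size)   # fast-decaying noise burst
    ring = np.exp(-t / 60e-6) * np.sin(2 * np.pi * ring_hz * t)  # damped sinusoid
    return t, thump + ring

t, click = synth_click()
print(f"synthesized {click.size} samples spanning {t[-1] * 1e6:.0f} microseconds")
```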

The vocal model also successfully mimicked whistling noises used to communicate with other dolphins and faulty clicks that probably result from inadequate snot. Such techniques could be adapted to study sound production or echolocation in sperm whales and other dolphin relatives.
Researchers modified a human speech model developed in the 1970s to study dolphin echolocation. The animation above mimics the vibration of lumps of tissue (green) in the dolphin’s nasal passage (black) that are drenched in mucus. Snot-covered tissues (blue) stick together (red) and pull apart to create the click sound.

Jupiter’s stormy weather no tempest in teapot

Jupiter’s turbulence is not just skin deep. The giant planet’s visible storms and blemishes have roots far below the clouds, researchers report in the June 3 Science. The new observations offer a preview of what NASA’s Juno spacecraft will see when it sidles up to Jupiter later this year.

A chain of rising plumes, each reaching nearly 100 kilometers into Jupiter, dredges up ammonia to form ice clouds. Between the plumes, dry air sinks back into the Jovian depths. And the famous Great Red Spot, a storm more than twice as wide as Earth that has churned for several hundred years, extends at least dozens of kilometers below the clouds as well.

Jupiter’s dynamic atmosphere provides a possible window into how the planet works inside. “One of the big questions is what is driving that change,” says Leigh Fletcher, a planetary scientist at the University of Leicester in England. “Why does it change so rapidly, and what are the environmental and climate-related factors that result from those changes?”

To address some of those questions, Imke de Pater, a planetary scientist at the University of California, Berkeley, and colleagues observed Jupiter with the Very Large Array radio observatory in New Mexico. Jupiter emits radio waves generated by heat left over from its formation about 4.6 billion years ago. Ammonia gas within Jupiter’s atmosphere intercepts certain radio frequencies. By mapping how and where those frequencies are absorbed, the researchers created a three-dimensional map of the ammonia that lurks beneath Jupiter’s clouds. Those plumes and downdrafts appear to be powered by a narrow wave of gas that wraps around much of the planet.
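
The logic of that mapping can be sketched with a toy example (hypothetical frequencies, probing depths and absorption values, not the team’s actual VLA analysis): each observing frequency is assumed to be most sensitive to a characteristic depth below the clouds, and stronger absorption at a frequency is read as more ammonia near that depth.

```python
# Toy sketch of multi-frequency radio sounding (all numbers invented):
# assume each frequency probes a characteristic depth, and treat the fraction
# of emission absorbed at that frequency as a proxy for the relative amount
# of ammonia near that depth. Stacking frequencies gives a crude profile.

probe_depth_km = {10: 20, 8: 40, 6: 60, 4: 90}        # GHz -> assumed depth below clouds
absorption = {10: 0.15, 8: 0.35, 6: 0.60, 4: 0.70}    # fraction absorbed at one spot (made up)

print("depth (km)   absorption (proxy for relative ammonia)")
for freq_ghz, depth in sorted(probe_depth_km.items(), key=lambda kv: kv[1]):
    print(f"{depth:9d}   {absorption[freq_ghz]:.2f}")
```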

The depth of Jupiter’s atmospheric choppiness isn’t too surprising, says Scott Bolton, a planetary scientist at the Southwest Research Institute in San Antonio. “Almost everyone I know would have guessed that,” he says. But the observations do provide a teaser for what to expect from the Juno mission, led by Bolton. The spacecraft arrives at Jupiter on July 4 to begin a 20-month investigation of what’s going on beneath Jupiter’s clouds using tools similar to those used in this study.

The new observations confirm that Juno should work as planned, Bolton says.

By getting close to the planet — just 5,000 kilometers from the cloud tops — Juno will break through the fog of radio waves from Jupiter’s radiation belts that obscures observations made from Earth and limits what telescopes like the Very Large Array can see. But the spacecraft will see only a narrow swath of Jupiter’s bulk at a time. “That’s where ground-based work like the research de Pater has been doing is really essential,” Fletcher says. Observations such as these will let Juno scientists know what’s going on throughout the atmosphere so they can better understand what Jupiter is telling them.

Empathy for animals is all about us

There’s an osprey nest just outside Jeffrey Brodeur’s office at the Woods Hole Oceanographic Institution in Massachusetts. “I literally turn to my left and they’re right there,” says Brodeur, the organization’s communications and outreach specialist. WHOI started live-streaming the osprey nest in 2005.

For the first few years, few people really noticed. All that changed in 2014. An osprey pair had taken up residence and produced two chicks. But the mother began to attack her own offspring. Brodeur began getting e-mails complaining about “momzilla.” And that was just the beginning.

“We became this trainwreck of an osprey nest,” he says. In the summer of 2015, the osprey family tried again. This time, they suffered food shortages. The camera received an avalanche of attention, complaints and e-mails protesting the institute’s lack of intervention. One scolded, “it is absolutely disgusting that you will not take those chicks away from that demented witch of a parent!!!!! Instead you let them be constantly abused and go without [sic] food. Yes this is nature but you have a choice to help or not. This is totally unacceptable. She should be done away with so not to abuse again.” By mid-2015, Brodeur began to receive threats. “People were saying ‘we’re gonna come help them if you don’t,’” he recalls.

The osprey cam was turned off, and remains off to this day. Brodeur says he’s always wondered why people had such strong feelings about a bird’s parenting skills.

Why do people spend so much time and emotion attempting to apply their own moral sense to an animal’s actions? The answer lies in the human capacity for empathy — one of the qualities that helps us along as a social species.

When we are confronted with another person — say, someone in pain — our brains respond not just by observing, but by copying the experience. “Empathy results in emotion sharing,” explains Claus Lamm, a social cognitive neuroscientist at the University of Vienna in Austria. “I don’t just know what you are feeling, I create an emotion in myself. This emotion makes connections to situations when I was in that emotional state myself.”

Lamm and his colleagues showed that viewing someone in pain activates certain brain areas such as the insula, anterior cingulate cortex and medial cingulate cortex, regions that are active when we ourselves are in pain. “They allow us to have this first person experience of the pain of the other person,” Lamm explains.

When participants viewed someone reacting as though they were in pain to a stimulus that wasn’t painful for the viewer, the participants showed activity in the frontal cortex in areas important for distinguishing between “self” and “other.” We can still sympathize with someone else’s pain, even if we don’t know what it feels like, Lamm and his colleagues reported in 2010 in the Journal of Cognitive Neuroscience.

This works for animals, too: We ascribe certain emotions or feelings to animals based on their actions. “You know you have a mind, thoughts and feelings,” says Kurt Gray, a psychologist at the University of North Carolina in Chapel Hill. “You take it for granted that other people do too, but you can never really know. With animals, you can’t know for sure, so your best guess is what you would do in that situation.”

When people see an animal suffering — such as, say, a starving osprey chick — they feel empathy. They then categorize that sufferer as a “feeler,” or victim. But that suffering chick can’t exist in a vacuum. “When there’s a starving chick, we think, ‘oh, it’s terrible!’” Gray says. “It’s not enough for us to say nature is red in tooth and claw. There must be someone to blame for this.”

In a theory he calls dyadic completion, Gray explains that we think of moral situations — situations in which there is suffering — as dyads, or pairs. Every victim needs a perpetrator. A sufferer with no one responsible is psychologically incomplete, and viewers will fill in a perpetrator in response. In the case of suffering osprey chicks, he notes, that perpetrator might be an uncaring osprey mom, or the camera operator who refuses to intervene in a natural process. Gray and his colleagues published their ideas on dyadic completion in 2014 in the Journal of Experimental Psychology.

Anthropomorphizing animals — whether or not it is logical or realistic — is usually pretty harmless. “It’s probably OK to say a cat is content,” says John Hadley, an ethicist at Western Sydney University in Australia. Similarly, it’s OK to say that a mother osprey is being violent when she attacks her own young. People are describing what they see in emotional terms they recognize. But this doesn’t mean that these animals should be held responsible for their actions, he says. When we judge an animal for its parenting skills, “in one sense it implies we want to hold these animals up as objects of praise or blame.” The natural tendency to ascribe emotions to animals, he says, is “only really problematic if [the emotions] are inaccurate or if they lead to some kind of ethical problem.”

People can’t put an osprey on trial for being a bad parent. But as in the case of an abandoned bison calf in Yellowstone, people do sometimes intervene — even though their actions might not be helpful. “That’s a question of ethical systems coming into conflict,” Hadley says. “National parks apply a holistic ethic, try to let nature run its course…. But a more common-sense approach would be that you can intervene, there’s suffering you can stop and you should try and stop it.”

The feelings of pity and the desire to intervene are really all about us. “When we look at nonhuman animals and we read them as if they are humans … that might just be our being narrow and unable to imagine any creature that is not somehow a reflection of us,” says Janet Stemwedel, a philosopher at San Jose State University in California. “There’s a way in which looking at animals and reading them as human and imagining them as having emotions and inner lives is maybe a gateway to caring,” Stemwedel says. This caring might mean erring on the side of caution, she explains, “acknowledging the limits of what we can know about how [animals] experience the world.” If we fail to imagine what animals might be feeling, “we could do a great deal of harm, [and] put suffering in the world that doesn’t need to be there,” she notes.

Our caring for the suffering and the lonely is part of what makes us a social species. “Evolution endowed us with a moral sense because it was useful for living in groups,” Gray notes. “It’s not crazy. It’s the same impulse that leads us to protect children from child abuse, and it so happens that we extend that to osprey children.” Those anthropomorphizing impulses aren’t stupid or useless. Instead, they tell us something, not about animals, but about ourselves.

Swapping analogous genes no problem among species

ORLANDO, Fla. — Organisms as different as plants, bacteria, yeast and humans could hold genetic swap meets and come away with fully functional genes, new research suggests.

Researchers have known for decades that organisms on all parts of the evolutionary tree have many of the same genes. “How many of these shared genes are truly functionally the same thing?” wondered Aashiq Kachroo, a geneticist at the University of Texas at Austin, and colleagues. The answer, Kachroo revealed July 15 at the Allied Genetics Conference, is that about half of shared genes are interchangeable across species.

Last year, Kachroo and colleagues reported that human genes could substitute for 47 percent of yeast genes that the two species have in common (SN: 6/27/15, p. 5). Now, in unpublished experiments, the researchers have swapped yeast genes with analogous ones from Escherichia coli bacteria or with those from the plant Arabidopsis thaliana. About 60 percent of E. coli genes could stand in for their yeast counterparts, Kachroo reported. Plant swaps are ongoing, but the researchers already have evidence that plant genes can substitute for yeast genes involved in some important biological processes.

In particular, many organisms share the eight-step biochemical chain reaction that makes the molecule heme. The researchers found that all but one of yeast’s heme-producing genes could be swapped with a counterpart from E. coli or plants.

Magnetic fields in sun rise at 500 kilometers per hour

About 20,000 kilometers beneath the sun’s surface, magnetic fields rise no faster than about 500 kilometers per hour. That speed (roughly one-third of previous estimates) is about the same as the rate at which gas rises and falls within the sun, implying that moving parcels of gas help steer magnetic fields toward the surface, researchers report July 13 in Science Advances.

Aaron Birch of the Max Planck Institute for Solar System Research in Göttingen, Germany, and colleagues estimated the speed by combining observations of the sun’s surface with computer simulations of how gas moves within the hot orb. By studying the sun’s inner workings, researchers hope to understand what drives sunspots and flares — the blemishes and eruptions triggered by magnetic fields punching through the surface.

Genes that control toxin production in C. difficile ID’d

A new genetic discovery could equip researchers to fight a superbug by stripping it of its power rather than killing it outright.

Scientists have identified a set of genes in Clostridium difficile that turns on its production of toxins. Those toxins can damage intestinal cells, leading to diarrhea, abdominal pain and potentially life-threatening disease. Unlocking the bug’s genetic weapon-making secret could pave the way for new nonantibiotic therapies to disarm the superbug while avoiding collateral damage to other “good” gut bacteria, researchers report August 16 in mBio.

Identifying a specific set of genes that controls toxin production is a big step forward, says Matthew Bogyo, a chemical biologist at Stanford University who also studies ways to defuse C. difficile’s toxin-making.

C. difficile bacteria infect a half million people and kill about 29,000 each year in the United States. In some individuals, though, the microbe hangs out in the gut for years without causing trouble. That’s because human intestines normally have plenty of good bacteria to keep disease-causing ones in check. However, a round of antibiotics can throw the system off balance, and if enough good bugs die off, “C. difficile takes over,” says lead author Charles Darkoh, a molecular microbiologist at the University of Texas Health Science Center at Houston. As infection rages, C. difficile can develop resistance to antibiotic drugs, turning it into an intractable superbug.

Darkoh’s team reported last year that C. difficile regulates toxin production with quorum sensing — a system that lets bacteria conserve resources and launch an attack only if their numbers reach a critical threshold. That study identified two sets of quorum-signaling genes, agr1 and agr2, that could potentially activate toxin production.

In the new analysis, Darkoh and colleagues tested the ability of a series of C. difficile strains to make toxins when incubated with human skin cells. Some C. difficile strains had either agr1 or agr2 deleted; others had all their quorum-sensing genes or lacked both gene sets. Agr1 is responsible for packing the superbug’s punch, the researchers found. C. difficile mutants without that set of genes made no detectable toxins, and skin cells growing in close quarters stayed healthy. Feeding those mutant bugs to mice caused no harm, whereas mice that swallowed normal C. difficile lost weight and developed diarrhea within days. In the skin cell cultures, agr2-deficient strains were just as lethal as normal C. difficile, showing that only agr1 is essential for toxin production.
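
A minimal sketch of the logic those experiments point to, with made-up numbers rather than C. difficile’s real signaling chemistry: signal level tracks cell density, and toxin genes switch on only when the signal crosses a threshold and the agr1 gene set is intact.

```python
# Minimal quorum-sensing sketch (hypothetical parameters, not C. difficile's
# actual chemistry): cells secrete a signal molecule, so signal level tracks
# population density; toxin genes turn on only above a threshold, and only
# if the agr1 gene set is present (deleting agr1 abolished toxins in the study).

SIGNAL_PER_CELL = 0.002   # arbitrary signal units produced per cell
TOXIN_THRESHOLD = 1.0     # signal level at which toxin genes activate

def toxin_genes_on(cell_density: float, agr1_intact: bool = True) -> bool:
    """Return True if this population would switch on toxin production."""
    if not agr1_intact:
        return False
    return SIGNAL_PER_CELL * cell_density >= TOXIN_THRESHOLD

for density in (100, 400, 800):
    state = "ON" if toxin_genes_on(density) else "off"
    print(f"{density} cells per unit volume: toxins {state}")

print("agr1 deleted at high density:", toxin_genes_on(800, agr1_intact=False))
```

The sketch captures only the on/off threshold logic, not the underlying signaling machinery that the agr genes encode.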

Based on their new findings, Darkoh and colleagues have identified several compounds that inactivate C. difficile toxins or block key steps in the molecular pathway controlling their production. The researchers are testing these agents in mice.

In a mouse study published in Science Translational Medicine last year, Bogyo and colleagues found a different compound that could disarm C. difficile by targeting its toxins. And several companies are trying to fight C. difficile with probiotics — cocktails of good bacteria. Results have been mixed.

Doctors need better ways to figure out fevers in newborns

Two days after my first daughter was born, her pediatrician paid a house call to examine her newest patient. After packing up her gear, she told me something alarming: “For the next few months, a fever is an emergency.” If we measured a rectal temperature at or above 100.4° Fahrenheit, she said, we should go to the hospital. Call her on the way, but don’t wait.

I, of course, had no idea that a fever constituted an emergency. But our pediatrician explained that a fever in a very young infant can signal a fast-moving and dangerous bacterial infection. These infections are rare (and fortunately becoming even rarer thanks to newly created vaccines). But they’re serious, and newborns are particularly susceptible.

I’ve since heard from friends who have been through this emergency. Their newborns were poked, prodded and monitored by anxious doctors, in the hopes of quickly ruling out a serious bacterial infection. For infants younger than two months, it’s “enormously difficult to tell if an infant is seriously ill and requires antibiotics and/or hospitalization,” says Howard Bauchner, a pediatrician formerly at Boston University School of Medicine and now editor in chief of the Journal of the American Medical Association.

A new research approach, described in two JAMA papers published in August, may ultimately lead to better ways to find the cause of a fever.

These days, for most (but not all) very young infants, their arrival at a hospital will trigger a workup that includes a urine culture and a blood draw. Often doctors will perform a lumbar puncture, more commonly known as a spinal tap, to draw a sample of cerebrospinal fluid from the area around the spinal cord.

Doctors collect these fluids to look for bacteria. Blood, urine and cerebrospinal fluid are smeared onto culture dishes, and doctors wait and see if any bacteria grow. In the meantime, the feverish infant may be started on antibiotics, just in case. But this approach has its limitations. Bacterial cultivation can take several days. The antibiotics may not be necessary. And needless to say, it’s not easy to get those fluids, particularly from a newborn.

Some scientists believe that instead of looking for bacteria or viruses directly, we ought to be looking at how our body responds to them. Unfortunately, the symptoms of a bacterial and viral infection are frustratingly similar. “You get a fever. You feel sick,” says computational immunologist Purvesh Khatri of Stanford University. Sadly, there are no obvious telltale symptoms of one or the other, not even green snot. In very young infants, a fever might be the only sign that something is amiss.

But more subtle clues could betray the cause of the fever. When confronted with an infection, our immune systems ramp up in specific ways. Depending on whether we are fighting a viral or bacterial foe, different genes turn up their activity. “The immune system knows what’s going on,” Khatri says. That means that if we could identify the genes that reliably get ramped up by viruses and those that get ramped up by bacteria, then we could categorize the infection based on our genetic response.
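
A minimal sketch of that idea, using invented gene-activity numbers and an off-the-shelf classifier rather than the gene panels or statistics from the JAMA studies: measure a handful of virus-responsive and bacteria-responsive genes in blood, then let a simple model call the infection type.

```python
# Sketch only: made-up gene-activity data, not the studies' genes or methods.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Rows are patients; columns are activity of [virus-responsive gene, bacteria-responsive gene].
gene_activity = np.array([
    [8.1, 1.2], [7.4, 0.9], [6.8, 1.5],   # viral infections: first gene ramped up
    [1.3, 7.9], [0.8, 6.5], [1.1, 8.4],   # bacterial infections: second gene ramped up
])
labels = np.array(["viral", "viral", "viral", "bacterial", "bacterial", "bacterial"])

model = LogisticRegression().fit(gene_activity, labels)

# A new feverish infant whose blood shows high bacteria-responsive gene activity:
print(model.predict([[1.0, 7.2]]))   # expected output: ['bacterial']
```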

That’s the approach used by two groups of researchers, whose study results both appear in the August 23/30 JAMA. One group found that in children younger than 2, two specific genes could help make the call on infection type. Using blood samples, the scientists found that one of the genes ramped up its activity in response to a viral infection, and the other responded to a bacterial infection.

The other study looked at immune responses in even younger children. In infants younger than 60 days, the activity of 66 genes measured in blood samples did a pretty good job of distinguishing between bacterial and viral infections. “These are really exciting preliminary results,” says Khatri, who has used a similar method for adults. “We need to do more work.”

Bauchner points out that in order to be useful, “the test would have to be very, very accurate in very young infants.” There’s very little room for error. “Only time will tell how good these tests will be,” he says. In an editorial that accompanied the two studies, he evoked the promise of these methods. If other experiments replicate and refine the results of these studies, he could envision a day in which the parents of a feverish newborn could do a test at home, call their doctor and together decide if the child needs more care.

That kind of test isn’t here yet, but scientists are working on it. The technology can’t come soon enough for doctors and parents desperate to figure out a fever.

Endurance training leaves no memory in muscles

Use it or lose it, triathletes.

Muscles don’t have long-term memory for exercises like running, biking and swimming, a new study suggests. The old adage that once you’ve been in shape, it’s easier to get fit again could be a myth, at least for endurance athletes, researchers in Sweden report September 22 in PLOS Genetics.

“We really challenged the statement that your muscles can remember previous training,” says Maléne Lindholm of the Karolinska Institute in Stockholm. But even if muscles forget endurance exercise, the researchers say, other parts of the body may remember, and that could make retraining easier for people who’ve been in shape before.

Endurance training is amazingly good for the body. Weak muscle contractions, sustained over a long period of time — as during a bike ride — change proteins, mainly ones involved in metabolism. This translates into more energy-efficient muscle that can help stave off illnesses like diabetes, cardiovascular disease and some cancers. The question is, how long do those improvements last?

Previous work in mice has shown that muscles “remember” strength training (SN: 9/11/10, p. 15). But rather than making muscles more efficient, strength-training moves like squats and push-ups make muscles bigger and stronger. The muscles bulk up as they develop more nuclei. More nuclei lead to more production of proteins that build muscle fibers. Cells keep their extra nuclei even after regular exercise stops, to make protein easily once strength training restarts, says physiologist Kristian Gundersen at the University of Oslo in Norway. Since endurance training has a different effect on muscles, scientists weren’t sure if the cells would remember it or not.

To answer that question, Lindholm’s team ran volunteers through a 15-month endurance training experiment. In the first three months, 23 volunteers trained four times a week, kicking one leg 60 times per minute for 45 minutes. Volunteers rested their other leg. Lindholm’s team took muscle biopsies before and after the three-month period to see how gene activity changed with training. Specifically, the scientists looked for changes in the number of mRNAs (the blueprints for proteins) that each gene was making. Genes associated with energy production showed the greatest degree of change in activity with training.
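
The kind of comparison described above can be sketched with made-up mRNA counts (not the study’s data or statistical methods): compute each gene’s change in activity between the before-training and after-training biopsies, then rank genes by the size of that change.

```python
# Sketch with invented mRNA counts: rank genes by how much their activity
# changed between pre-training and post-training biopsies (log2 fold change).
import math

counts = {  # gene -> (mRNA count before training, mRNA count after training)
    "energy_metabolism_gene_A": (120, 410),
    "energy_metabolism_gene_B": (200, 560),
    "structural_gene_C": (300, 330),
    "unrelated_gene_D": (150, 140),
}

fold_change = {
    gene: math.log2(after / before) for gene, (before, after) in counts.items()
}

for gene, lfc in sorted(fold_change.items(), key=lambda kv: abs(kv[1]), reverse=True):
    print(f"{gene:26s} log2 fold change = {lfc:+.2f}")
```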

At a follow-up, after participants had stopped training for nine months, scientists again biopsied muscle from the thighs of 12 volunteers, but didn’t find any major differences in patterns of gene activity between the previously trained legs and the untrained legs. “The training effects were presumed to have been lost,” says Lindholm. After another three-month bout of training, this time in both legs, the researchers saw no differences between the previously trained and untrained legs.

While this study didn’t find muscle memory for endurance — most existing evidence is anecdotal — it still might be easier for former athletes to get triathlon-ready, researchers say. The new result has “no bearing on the possible memory in other organ systems,” Gundersen says. The heart and cardiovascular system could remember and more easily regain previous fitness levels, for example, he says.

Even within muscle tissue, immune cells or stem cells could also have some memory not found in this study, says molecular exercise physiologist Monica Hubal of George Washington University in Washington, D.C. Lindholm adds that well-trained connections between nerves and muscles could also help lapsed athletes get in shape faster than people who have never exercised before. “They know how to exercise, how it’s supposed to feel,” Lindholm says. “Your brain knows exactly how to activate your muscles, you don’t forget how to do that.”