NASA’s Juno spacecraft will stay in its current 53-day orbit around Jupiter instead of closing into a 14-day orbit as originally planned, the Juno team announced February 17.
An issue with two helium check valves, which are tied to the spacecraft’s main engine, had scientists concerned. The valves took several minutes to open when the team pressurized the spacecraft’s propulsion system in October. During previous main engine firings, the valves took only a few seconds to open.
Another main engine burn to put the spacecraft into a shorter orbit could jeopardize the mission’s science goals, mission scientists say.
Juno has been circling Jupiter since July 4. Staying in the longer orbit will not change the date of the next flyby, nor will it affect public voting on which Jovian features JunoCam will image. It will allow the team to probe Jupiter’s magnetic field in more depth than originally planned. And it may also help to maintain the health of the spacecraft because Juno will spend less time exposed to the planet’s radiation belts, the team noted.
As the planet warms, carbon stashed in Earth’s soils could escape into the atmosphere far faster than previously thought. In the worst-case scenario for climate change, carbon dioxide emissions from soil-dwelling microbes could increase by 34 to 37 percent by 2100, researchers report online March 9 in Science. Previous studies predicted a more modest 9 to 12 percent rise if no steps are taken to curb climate change. Those extra emissions could further intensify global warming.
Much of that extra CO2 will originate from soils at depths overlooked by previous measurements, says study coauthor Margaret Torn, a biogeochemist at Lawrence Berkeley National Laboratory in California. “We ignore the deep at our peril,” she says. Soils cover about two-thirds of Earth’s ice-free land area and store nearly 3 trillion metric tons of organic carbon — more than three times the amount of carbon in the atmosphere. Dead organisms such as plants contribute to this carbon stockpile, and carbon-munching microbes belch some of that carbon into the atmosphere as CO2. Rising temperatures will spur the microbes to speed up their plant consumption, scientists warn, releasing more CO2 into the air. And the data back up that fear.
Scientists have mimicked future warming by heating the top 5 to 20 centimeters of experimental soil plots and measuring the resulting CO2 emissions. Those studies missed deeper soils, though, which are known to contain more than half of all soil carbon. Warming such deep soils is technically challenging, and scientists had generally assumed that any emission increases from so far down were insignificant, says study coauthor Caitlin Hicks Pries, an ecosystem ecologist at Lawrence Berkeley.
Using heating coils and rods embedded in the soil, Hicks Pries, Torn and colleagues warmed a plot of soil for over two years in the forested foothills of California’s Sierra Nevada. The warmth extended to a meter below ground, the full depth of the soil in the area. That heating replicated the roughly 4 degrees Celsius of warming expected by the end of the century in a worst-case scenario. Annual carbon emissions from the soil jumped from 1,100 grams per square meter to 1,450 grams per square meter. Around 40 percent of this emissions increase originated below a depth of 15 centimeters, with 10 percent originating below 30 centimeters.
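The arithmetic behind those figures is simple to check. The sketch below (illustrative only, built from the rounded numbers reported above) works out the relative jump in emissions and how much of the increase came from deeper soil layers:

```python
# Illustrative arithmetic only, using the figures reported in the article.
baseline = 1100.0  # annual soil carbon emissions before warming, g per square meter
warmed = 1450.0    # emissions after ~4 degrees C of experimental warming

increase = warmed - baseline             # 350 g per square meter per year
pct_increase = 100 * increase / baseline

print(f"extra emissions: {increase:.0f} g/m^2 per year")
print(f"relative increase: {pct_increase:.0f}%")  # prints 32%

# Share of the increase attributed to deeper soil in the experiment
below_15cm = 0.40 * increase  # ~140 g/m^2 from below 15 cm depth
below_30cm = 0.10 * increase  # ~35 g/m^2 from below 30 cm depth
print(f"from below 15 cm: {below_15cm:.0f} g/m^2")
print(f"from below 30 cm: {below_30cm:.0f} g/m^2")
```

The roughly 32 percent rise at this single site sits close to the 34 to 37 percent worst-case projection the study reports for soils globally.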
Assuming other soils behave similarly, by 2100, the increase in the CO2 emission rate from just the soils deeper than 30 centimeters could equal modern-day CO2 emission rates from oil burning, the researchers estimate.
While only 13.5 percent of Earth’s soils resemble the woodland soils examined in the study, Torn says that the experiment shows that scientists need to consider deep soils when calculating future climate change. Studies already in the works will test if the results hold true for other soil types.
The new experiment is exciting and well executed, says Katherine Todd-Brown, a biogeochemist at the Pacific Northwest National Laboratory in Richland, Wash. The net impact soils will have on future climate change, however, remains unclear, she says. The amount of carbon from the atmosphere entering soils could also increase as higher CO2 concentrations and warmer environments promote plant growth. That increased carbon drawdown could offset the climate impacts of the increased emissions, though the magnitude of that effect is still debated (SN Online: 9/22/16). “You really have to take both the inputs and outputs into account,” Todd-Brown says.
In a spaceflight first, the aerospace company SpaceX has successfully launched and landed a previously used rocket.
The Falcon 9 rocket blasted off March 30 from NASA’s Kennedy Space Center in Florida at 6:27 p.m. EDT carrying a commercial telecommunications satellite. After separating from the rest of the rocket and its payload, the refurbished first stage touched back down smoothly on a platform in the Atlantic Ocean. The stage is the same one SpaceX used in its first successful landing on an ocean barge in April 2016.
Although the aerospace company has recovered eight Falcon 9 rockets after previous launches, this homecoming marks the first time it has reflown one of those used boosters. In September, a Falcon 9 rocket and its payload exploded on the launchpad at Cape Canaveral during a routine test.
In the past, the spent first stages of rockets have been lost to the ocean. Capturing and reusing rockets may lead to cheaper spaceflights, the company says.
When I was pregnant, my pronoun shifted automatically. My “I” turned into “we,” as in, “What are we going to eat for dinner?” and, “Should we sit in that hot tub?” I thought about that shift to the majestic plural as we got our Tdap shot in our third trimester.
The Tdap vaccine protects against tetanus, diphtheria and pertussis, or whooping cough. Doctors recommend that women receive a dose with each pregnancy because the diseases can be particularly dangerous for young babies. But good, hard evidence for the benefits of vaccinating women while pregnant instead of shortly after giving birth has been lacking. A new study of nearly 150,000 newborns fills that gap for whooping cough.
Researchers at the Kaiser Permanente Vaccine Study Center in Oakland, Calif., studied the medical records of mothers who gave birth to babies between 2010 and 2015. Overall, about 46 percent of the mothers received a Tdap vaccine at least 8 days before giving birth.
Seventeen of the 150,000 babies got whooping cough by the time they were 2 months old. Of these 17 babies, only one had been born to a mother who had received the Tdap vaccine during her pregnancy. And this baby, the researchers note, had a mild case of whooping cough and wasn’t admitted to the hospital.
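Those raw counts imply a large protective effect. The back-of-the-envelope sketch below uses only the rounded figures reported here (about 46 percent of roughly 150,000 mothers vaccinated, 1 of the 17 cases in the vaccinated group) and applies no statistical adjustment, so it is a crude illustration rather than the study’s own effectiveness estimate:

```python
# Back-of-the-envelope only: rounded figures from the article, no adjustment
# for confounders. The study's published estimate may differ.
total_babies = 150_000
frac_vaccinated = 0.46  # mothers given Tdap at least 8 days before birth

n_vax = frac_vaccinated * total_babies  # ~69,000 babies
n_unvax = total_babies - n_vax          # ~81,000 babies

cases_vax = 1    # whooping cough by 2 months, vaccinated-in-pregnancy group
cases_unvax = 16 # the remaining cases

rate_vax = cases_vax / n_vax
rate_unvax = cases_unvax / n_unvax

# Crude effectiveness: how much the case rate drops in the vaccinated group
effectiveness = 1 - rate_vax / rate_unvax
print(f"crude vaccine effectiveness: {effectiveness:.0%}")  # prints 93%
```

Even this crude calculation suggests that vaccination during pregnancy cut the risk of early whooping cough by roughly 90 percent in this cohort.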
The maternal protection against whooping cough stuck around beyond 2 months, the researchers found. Though babies got their own vaccines in their first year of life, those babies who got their mothers’ antibodies during pregnancy were less likely to get whooping cough before their first birthdays than babies whose mothers had not been vaccinated while pregnant.
Babies whose mothers were vaccinated after giving birth didn’t get similar protection. The researchers found no evidence that postpartum Tdap vaccinations for mothers prevented whooping cough in babies. “Our results demonstrate the substantial benefit of vaccinating during pregnancy rather than waiting until after birth,” pediatrician and vaccine researcher Nicola Klein and colleagues wrote online April 3 in Pediatrics.
Since 2013, doctors have recommended that women get Tdap shots between weeks 27 and 36 of every pregnancy, a window that’s thought to be prime for antibody sharing. Babies usually get their first vaccine against whooping cough at 2 months of age. The new study shows how antibodies received in utero from mom can shepherd babies through this vulnerable unvaccinated period.
These days, whooping cough is making a comeback. That reemergence stems in part from a switch in the 1990s to a vaccine that comes with fewer side effects but is less effective. Changes in the bacterial culprit itself and lower vaccination rates also contribute. One of the best things mothers-to-be can do to keep their newborns healthy, the study shows, is to deliver those antibodies to their babies themselves by getting vaccinated during pregnancy.
Taking antidepressants during pregnancy does not increase the risk of autism or attention-deficit/hyperactivity disorder, two new large studies suggest. Genetic or environmental influences, rather than prenatal exposure to the drugs, may have a greater influence on whether a child will develop these disorders. The studies are published online April 18 in JAMA.
Clinically, the message is “quite reassuring for practitioners and for mothers needing to make a decision about antidepressant use during pregnancy,” says psychiatrist Simone Vigod, a coauthor of one of the studies. Past research has questioned the safety of expectant moms taking antidepressants (SN: 6/5/10, p. 22). “A mother’s mood disturbances during pregnancy are a big public health issue — they impact the health of mothers and their children,” says Tim Oberlander, a developmental pediatrician at the University of British Columbia in Vancouver. About one in 10 women develop a major depressive episode during pregnancy. “All treatment options should be explored. Nontreatment is never an option,” says Oberlander, who coauthored a commentary, also published in JAMA.
Untreated depression during pregnancy creates risks for the child, including poor fetal growth, preterm birth and developmental problems. Some women may benefit from psychotherapy alone. A more serious illness may require antidepressants. “Many of us have started to look at longer term child outcomes related to antidepressant exposure because mothers want to know about that in the decision-making process,” says Vigod, of Women’s College Hospital in Toronto.
Previous studies indicated that the use of antidepressants came with its own developmental risks: autism spectrum disorder, ADHD, premature birth and poor fetal growth. “The key question is whether those risks are due to the actual medication,” says psychologist Brian D’Onofrio of Indiana University Bloomington. “Could the negative outcomes be due to the depression itself, or stress or genetic factors?” D’Onofrio and his group authored the other study.
To attempt to isolate the impact of antidepressants on exposed children, both studies relied on big sample sizes and sophisticated statistical techniques. D’Onofrio’s team looked at more than 1.5 million Swedish children born from 1996 to 2012 to nearly 950,000 mothers. More than 22,000, or 1.4 percent, of these kids had mothers who reported using antidepressants, mostly selective serotonin reuptake inhibitors, in the first trimester. The researchers compared siblings in families where the mother used antidepressants in one pregnancy but not the other. “This helps account for all of the factors that make siblings similar — their shared genetics and environment,” D’Onofrio says. In the sibling matchup, the children had essentially the same risk for autism, ADHD and poor fetal growth whether they were exposed to antidepressants in the womb or not. There remained a small increased risk of preterm birth among exposed siblings compared to their unexposed siblings.
In the whole sample, looking at antidepressant use only without accounting for other possible influences, “children have roughly twice the risk of having autism if the mother takes antidepressant medication during the first trimester,” says D’Onofrio. “But that association goes completely away when you compare siblings.” Although it’s not clear exactly what’s responsible for the increased risk — depression, stress, genetic factors, poor prenatal care — “our results suggest that it is actually not due to the medication itself,” he says.
Vigod and colleagues looked at mothers who qualified for public drug coverage in Ontario, Canada, from 2002 to 2010. The women gave birth to 35,906 children; in 2,837 of those pregnancies, nearly 8 percent, the women took antidepressants, also primarily selective serotonin reuptake inhibitors. The team compared exposed children to their unexposed siblings, too, and found no association between autism risk and antidepressant use.
“The use of sibling matches in both studies is a very innovative way to account for genetics and a shared environment,” says Oberlander. “We can’t ignore the fact that there are shared genetic mechanisms that might relate autism and depression. The genetic reason that brought the mom to use the drug may say more about the risk of autism in the child.”
NEW ORLEANS — Ever since Ötzi’s mummified body was found in the Italian Alps in 1991, researchers have been trying to pin down how the 5,300-year-old Tyrolean Iceman died. It now looks like this Copper Age hunter-gatherer simply froze to death, perhaps after suffering minor blood loss from an arrow wound to his left shoulder, anthropologist Frank Rühli of the University of Zurich reported April 20 at the annual meeting of the American Association of Physical Anthropologists.
“Freezing to death is quite likely the main cause of death in this classic cold case,” Rühli said. Ötzi succumbed to exposure within anywhere from a few minutes to a few hours, he estimated.
New analyses of the Iceman’s body, based on X-rays and CT scans, argue against the idea that Ötzi died from a stone arrowhead shot into his shoulder (SN: 9/6/14, p. 6). Surprisingly shallow penetration of that weapon into Ötzi’s shoulder ruptured a blood vessel but caused no major tissue damage, Rühli said. Internal bleeding totaled only about 100 milliliters, or a half cup, he and his colleagues concluded. That’s enough of a poke to cause plenty of discomfort but not death, Rühli said.
Several depressions and fractures on the Iceman’s skull also couldn’t have proven fatal, he added. Some researchers regard those injuries as signs that Ötzi was clubbed to death. Rühli’s team found that those skull injuries are more consistent with the ancient man having accidentally fallen and hit his head while walking over rough ground. The Iceman was found with fur headgear that probably helped to protect his noggin when he took a headlong tumble, Rühli suggested.
The enzyme that turns on the light for a glow-in-the-dark mushroom seems “promiscuous.” But in a good way.
Researchers have worked out new details of how two Neonothopanus fungi shine softly green at night. The team had earlier figured out that the basic starting material for bioluminescence in these fungi is a compound called hispidin, found in some other fungi as well as plants such as horsetails. Those plants don’t spontaneously give off light, but in the two Neonothopanus mushroom species, an enzyme rejiggers a form of hispidin into a compound that glows.
The enzyme that turns a fungus into a natural night-light isn’t that fussy as enzymes go, says Cassius V. Stevani of the University of São Paulo in Brazil. He and colleagues can tweak the compound that the enzyme normally reacts with and still get a glow, the researchers report April 26 in Science Advances.
This easygoing chemistry has allowed the team to develop blue to orange glows instead of just the natural yellowish-green. These bonus colors might mark the beginnings of a new labeling tool for molecular biologists, the researchers say.
Pancreatic cancer is hard to detect early, when the disease is most amenable to treatment. But a new study describes a blood test that may aid the diagnosis of pancreatic cancer and someday make earlier screening feasible, the authors say.
The test detects a combination of five tumor proteins that appear to be a reliable signature of the disease, the researchers report in the May 24 Science Translational Medicine. In patients undergoing pancreatic or abdominal surgery, the test was 84 percent accurate at picking out those who had pancreatic cancer. “What’s exciting about the study is that it further favors the belief that one biomarker by itself may not be able to successfully identify a disease,” says Raghu Kalluri, a cancer biologist at the University of Texas MD Anderson Cancer Center in Houston who was not involved in the study. By putting the five protein biomarkers together, he says, “the power of the analysis might be more beneficial in differentiating healthy individuals and ones with pancreatic cancer.”
The National Cancer Institute estimates that in 2017 there will be more than 53,000 new cases of pancreatic cancer in the United States and just over 43,000 deaths from the cancer. Individuals with the most common form of pancreatic cancer, called pancreatic ductal adenocarcinoma, have a five-year survival rate of less than 10 percent. The cancer is usually caught late because the symptoms, including weight loss and abdominal pain, often don’t arise until the cancer has spread. And current imaging technology can’t detect the cancer at the start, says study coauthor Cesar Castro, a translational oncologist at Massachusetts General Hospital in Boston.
“The unmet need here is finding some other form of detection before a cancer grows large enough for the CT scan to detect it,” Castro says.
In their hunt for better detection methods, the researchers turned to tumor-derived extracellular vesicles, small sacs shed by tumor cells that circulate in the bloodstream. The sacs “are almost like mini-mes” of the parent tumor, Castro says, because they contain proteins and genetic material that often match the tumor.
The researchers selected five promising protein biomarkers from tumor-derived extracellular vesicles. Using a gold-coated silicon chip covered with antibodies and sporting nanopores, the team tested how well the biomarkers signaled the presence of pancreatic cancer in plasma samples from patients. When light shining through the pores encountered the extracellular vesicles, bound to the chip because of the interaction between the protein biomarkers and the antibodies, the light’s wavelength changed — signaling the presence of a tumor. In plasma samples taken from 43 patients before scheduled surgery for a medical issue in the pancreas or abdomen, the panel of five biomarkers distinguished pancreatic ductal adenocarcinoma from pancreatitis — an inflammation of the pancreas — and from benign cysts as well as from control patients’ samples. A pathology report after the surgeries confirmed the results.
Using their sensing device, the researchers report that the combined five biomarkers correctly identified whether a patient had pancreatic ductal adenocarcinoma or not in 84 percent of cases. The highest accuracy for any one of these biomarkers used by itself was just 70 percent.
The next step is to test patients at high risk for pancreatic cancer, and eventually those who are healthy, to see if these biomarkers are effective at early screening. “What we need to do now is pivot towards precancerous lesions,” Castro says. “Can they pick up any precancerous changes?”
“I’m excited about anything that can happen for these patients who are in desperate need for biomarkers and treatment,” Kalluri says. But he cautions that studies reporting the effectiveness of biomarkers as cancer screening tools often use different technologies for their assessments, making it hard for academic laboratories to reproduce the results. “There’s a tremendous lack of organized effort in the biomarker field,” he says, and if there is no way to come to a consensus on which biomarkers are most promising, “it’s very difficult for a patient to realize any benefit.”
Some of the earliest settlers of the Americas curtailed their coastal migration to hunker down in what’s now northwestern Peru, new finds suggest. Although researchers have often assumed that shoreline colonizers of the New World kept heading south from Alaska in search of marine foods, staying put in some spots made sense: Hunter-gatherers needed only simple tools to exploit rich coastal and inland food sources for thousands of years.
Excavations at two seaside sites in Peru find that people intermittently camped there from about 15,000 to 8,000 years ago, say anthropologist Tom Dillehay of Vanderbilt University in Nashville and his colleagues. Ancient people along Peru’s Pacific coast didn’t leave behind fishhooks, harpoons, nets or boats that could have been used to capture fish, sharks and sea lions, the scientists report May 24 in Science Advances. Yet remains of those sea creatures turned up at coastal campsites now buried beneath a human-made, earthen mound called Huaca Prieta and an adjacent mound called Paredones. Fish and other marine animals probably washed up on beaches or were trapped in lagoons that formed near the shore, Dillehay’s group proposes. Hungry humans needed only nets or clubs to subdue these prey. Other marine foods found at the ancient Peruvian campsites included snails, crabs, clams, sea gulls and pelicans. Fragments of material woven out of rush plants, the earliest dating to between 10,600 and 11,159 years ago, may have come from fish traps or baskets, the researchers say. Radiocarbon dating of burned wood, animal bones and plant seeds provided age estimates for a series of buried campsites at Huaca Prieta and Paredones.
Present-day hunters on Peru’s coast eat fish and small sharks that get trapped on the beach or in shallow shoreline lagoons. Hunters also build blinds where they wait to net and club birds, a tactic probably also used by ancient Americans, the investigators suspect.
Deer bones indicate that ancient Huaca Prieta and Paredones visitors hunted on land as well. And remains of avocado, beans and possibly cultivated squash and chili peppers at the ancient campsites — foods known to have been gathered or grown at inland locations — suggest that people transported these foods to the coast, possibly via trading. Evidence that early New World settlers trekked back and forth from coastal to interior parts of Peru coincides with similar human movements in southern Chile more than 14,000 years ago (SN Online: 5/8/08). A team led by Dillehay uncovered seaweed fragments in hearths and structures at Monte Verde II, located 30 kilometers from Chile’s coast. Edible rushes, reeds and stones from the coast also turned up at Monte Verde II.
“Just as there was some contact with the sea at Monte Verde II, there was some contact with the interior at Huaca Prieta,” Dillehay says.
Simple stone tools, sharpened on one side, dominate implements excavated at the Peruvian sites and at Monte Verde II. Basic tools suitable for all sorts of cutting and scraping tasks fit a lifestyle in which people sought food across varied landscapes, the researchers contend. Similar conditions may have characterized some North American coastlines by around 15,000 years ago, Dillehay says. “The problem is that these areas are now underwater” due to a global sea level rise between 20,000 and 6,000 years ago (SN: 8/13/11, p. 22).
Accumulating evidence supports the idea that early Americans favored the coast over an inland lifestyle, says archaeologist Daniel Sandweiss of the University of Maine in Orono. An ice-free corridor into North America’s interior may not have formed before 12,600 years ago (SN Online: 8/10/16), after people had reached Peru and Chile.
The pace at which people moved south from Huaca Prieta is unknown, Sandweiss says. Monte Verde II dates to roughly 500 years after the first coastal campsites in Peru, raising the possibility that Huaca Prieta folk founded the Chilean site, he suggests.
Dillehay doubts it. Modern hunter-gatherer groups vary greatly in size but usually don’t exceed several hundred members, making it unlikely that the ancient Huaca Prieta and Paredones populations grew large enough to face food shortages that would have forced a move, he says. Even if food did run out, hunter-gatherers needed only to move a few kilometers north or south to find abundant grub. “We really don’t know where these people were coming and going,” Dillehay cautions.
If the Milky Way exists in the biggest cosmic void ever observed, that could solve a puzzling mismatch between ways to measure how fast the universe is expanding.
Observations of 120,000 galaxies bolstering the Milky Way’s loner status were presented by Benjamin Hoscheit June 7 at a meeting of the American Astronomical Society in Austin, Texas. Building on earlier work by his adviser, University of Wisconsin‒Madison astronomer Amy Barger, Hoscheit and Barger measured how the density of galaxies changed with distance from the Milky Way. In agreement with the earlier study, the pair found that the Milky Way has far fewer neighbors than it should. There was a rise in density about 1 billion light-years out, suggesting the Milky Way resides in an abyss about 2 billion light-years wide.
Simulations of how cosmic structures form suggest that most galaxies clump along dense filaments of dark matter, which are separated by vast cosmic voids.
If the Milky Way lives in such a void, it could help explain why the universe seems to be expanding at different rates depending on how it’s measured (SN: 8/6/16, p. 10). Measurements based on the cosmic microwave background, the earliest light in the universe, suggest one rate of expansion, while measurements of nearby supernovas suggest a faster one.
Those supernovas could be feeling an extra gravitational pull from all the matter at the edges of the void, Hoscheit says. The actual expansion rate is probably the slower one measured in the universe’s early light.
“If you don’t account for the void effects, you could mistake this relationship to indicate that there is too much expansion,” Hoscheit says.