50 years ago, contraception options focused on women

The pill is a sledgehammer approach to contraception…. A second-generation of [drugs] is being designed to do the job without upsetting a woman’s normal cycle of ovulation and menstruation…. A contraceptive administered to the man can be given only for a short time without actually affecting the development of sperm … and, therefore, is not being considered for actual clinical use. —Science News, April 15, 1967

Update
Contraceptives have come a long way since 1967. Women can choose low-dose pills, hormonal rings, implants and intrauterine devices — effective methods that can be less disruptive to normal menstrual cycles. Men have far fewer options, but that may eventually change. A long-acting gel injected into 16 adult male rhesus monkeys’ reproductive tracts completely prevented pregnancy in their partners over one to two breeding periods. The gel works like a vasectomy but is less invasive and can be reversed more easily, researchers report February 7 in Basic and Clinical Andrology.

Vaccinating pregnant women protects newborns from whooping cough

When I was pregnant, my pronoun shifted automatically. My “I” turned into “we,” as in, “What are we going to eat for dinner?” and, “Should we sit in that hot tub?” I thought about that shift to the majestic plural as we got our Tdap shot in our third trimester.

The Tdap vaccine protects against tetanus, diphtheria and pertussis, or whooping cough. Doctors recommend that women receive a dose with each pregnancy because the diseases can be particularly dangerous for young babies. But good, hard evidence for the benefits of vaccinating women while pregnant instead of shortly after giving birth has been lacking. A new study of nearly 150,000 newborns fills that gap for whooping cough.

Researchers at the Kaiser Permanente Vaccine Study Center in Oakland, Calif., studied the medical records of mothers who gave birth to babies between 2010 and 2015. Overall, about 46 percent of the mothers received a Tdap vaccine at least 8 days before giving birth.

Seventeen of the 150,000 babies got whooping cough by the time they were 2 months old. Of these 17 babies, only one had been born to a mother who had received the Tdap vaccine during her pregnancy. And this baby, the researchers note, had a mild case of whooping cough and wasn’t admitted to the hospital.

The maternal protection against whooping cough stuck around beyond 2 months, the researchers found. Though babies got their own vaccines in their first year of life, those babies who got their mothers’ antibodies during pregnancy were less likely to get whooping cough before their first birthdays than babies whose mothers had not been vaccinated while pregnant.

Babies whose mothers were vaccinated after giving birth didn’t get similar protection. The researchers found no evidence that postpartum Tdap vaccinations for mothers prevented whooping cough in babies. “Our results demonstrate the substantial benefit of vaccinating during pregnancy rather than waiting until after birth,” pediatrician and vaccine researcher Nicola Klein and colleagues wrote online April 3 in Pediatrics.

Since 2013, doctors have recommended that women get a Tdap shot during every pregnancy, between weeks 27 and 36, a window that’s thought to be prime for antibody sharing. Babies usually get their first vaccine against whooping cough at 2 months of age. The new study shows how antibodies received in utero from mom can shepherd babies through this vulnerable unvaccinated period.

These days, whooping cough is making a comeback. That reemergence comes in part from a switch in the 1990s to a vaccine that has fewer side effects but is less effective. Changes in the bacterial culprit itself and lower vaccination rates also contribute. One of the best things mothers-to-be can do to keep their newborns healthy, the study shows, is to deliver those antibodies to their babies themselves by getting vaccinated during pregnancy.

Autism, ADHD risk not linked to prenatal exposure to antidepressants

Taking antidepressants during pregnancy does not increase the risk of autism or attention-deficit/hyperactivity disorder, two new large studies suggest. Genetic or environmental influences, rather than prenatal exposure to the drugs, may have a greater influence on whether a child will develop these disorders. The studies are published online April 18 in JAMA.

Clinically, the message is “quite reassuring for practitioners and for mothers needing to make a decision about antidepressant use during pregnancy,” says psychiatrist Simone Vigod, a coauthor of one of the studies. Past research has questioned the safety of expectant moms taking antidepressants (SN: 6/5/10, p. 22).
“A mother’s mood disturbances during pregnancy are a big public health issue — they impact the health of mothers and their children,” says Tim Oberlander, a developmental pediatrician at the University of British Columbia in Vancouver. About one in 10 women develop a major depressive episode during pregnancy. “All treatment options should be explored. Nontreatment is never an option,” says Oberlander, who coauthored a commentary, also published in JAMA.

Untreated depression during pregnancy creates risks for the child, including poor fetal growth, preterm birth and developmental problems. Some women may benefit from psychotherapy alone. A more serious illness may require antidepressants. “Many of us have started to look at longer term child outcomes related to antidepressant exposure because mothers want to know about that in the decision-making process,” says Vigod, of Women’s College Hospital in Toronto.

Previous studies indicated that the use of antidepressants came with its own developmental risks: autism spectrum disorder, ADHD, premature birth and poor fetal growth. “The key question is whether those risks are due to the actual medication,” says psychologist Brian D’Onofrio of Indiana University Bloomington. “Could the negative outcomes be due to the depression itself, or stress or genetic factors?” D’Onofrio and his group authored the other study.

To attempt to isolate the impact of antidepressants on exposed children, both studies relied on big sample sizes and sophisticated statistical techniques. D’Onofrio’s team looked at more than 1.5 million Swedish children born from 1996 to 2012 to nearly 950,000 mothers. More than 22,000, or 1.4 percent, of these kids had mothers who reported using antidepressants, mostly selective serotonin reuptake inhibitors, in the first trimester.
The researchers compared siblings in families where the mother used antidepressants in one pregnancy but not the other. “This helps account for all of the factors that make siblings similar — their shared genetics and environment,” D’Onofrio says.
In the sibling matchup, the children had essentially the same risk for autism, ADHD and poor fetal growth whether they were exposed to antidepressants in the womb or not. There remained a small increased risk of preterm birth among exposed siblings compared to their unexposed siblings.

In the whole sample, looking at antidepressant use only without accounting for other possible influences, “children have roughly twice the risk of having autism if the mother takes antidepressant medication during the first trimester,” says D’Onofrio. “But that association goes completely away when you compare siblings.” Although it’s not clear exactly what’s responsible for the increased risk — depression, stress, genetic factors, poor prenatal care — “our results suggest that it is actually not due to the medication itself,” he says.

Vigod and colleagues looked at mothers who qualified for public drug coverage in Ontario, Canada, from 2002 to 2010. The women gave birth to 35,906 children; in 2,837 of those pregnancies, nearly 8 percent, the women took antidepressants, also primarily selective serotonin reuptake inhibitors. The team compared exposed children to their unexposed siblings, too, and found no association between autism risk and antidepressant use.

“The use of sibling matches in both studies is a very innovative way to account for genetics and a shared environment,” says Oberlander. “We can’t ignore the fact that there are shared genetic mechanisms that might relate autism and depression. The genetic reason that brought the mom to use the drug may say more about the risk of autism in the child.”

Ötzi the Iceman froze to death

NEW ORLEANS — Ever since Ötzi’s mummified body was found in the Italian Alps in 1991, researchers have been trying to pin down how the 5,300-year-old Tyrolean Iceman died. It now looks like this Copper Age hunter-gatherer simply froze to death, perhaps after suffering minor blood loss from an arrow wound to his left shoulder, anthropologist Frank Rühli of the University of Zurich reported April 20 at the annual meeting of the American Association of Physical Anthropologists.

“Freezing to death is quite likely the main cause of death in this classic cold case,” Rühli said. Ötzi succumbed to exposure within a few minutes to a few hours, he estimated.

New analyses of the Iceman’s body, based on X-rays and CT scans, argue against the idea that Ötzi died from a stone arrowhead shot into his shoulder (SN: 9/6/14, p. 6). Surprisingly shallow penetration of that weapon into Ötzi’s shoulder ruptured a blood vessel but caused no major tissue damage, Rühli said. Internal bleeding totaled only about 100 milliliters, or a half cup, he and his colleagues concluded. That’s enough of a poke to cause plenty of discomfort but not death, Rühli said.

Several depressions and fractures on the Iceman’s skull also wouldn’t have been fatal, he added. Some researchers regard those injuries as signs that Ötzi was clubbed to death. Rühli’s team found that the skull injuries are more consistent with the ancient man having accidentally fallen and hit his head while walking over rough ground. The Iceman was found with fur headgear that probably helped to protect his noggin when he took a headlong tumble, Rühli suggested.

How a mushroom gets its glow

The enzyme that turns on the light for a glow-in-the-dark mushroom seems “promiscuous.” But in a good way.

Researchers have worked out new details of how two Neonothopanus fungi shine softly green at night. The team had earlier figured out that the basic starting material for bioluminescence in these fungi is a compound called hispidin, found in some other fungi as well as plants such as horsetails. Those plants don’t spontaneously give off light, but in the two Neonothopanus mushroom species, an enzyme rejiggers a form of hispidin into a compound that glows.

The enzyme that turns a fungus into a natural night-light isn’t that fussy as enzymes go, says Cassius V. Stevani of the University of São Paulo in Brazil. He and colleagues can tweak the compound that the enzyme normally reacts with and still get a glow, the researchers report April 26 in Science Advances.

This easygoing chemistry has allowed the team to develop blue to orange glows instead of just the natural yellowish-green. These bonus colors might mark the beginnings of a new labeling tool for molecular biologists, the researchers say.

50 years ago, U.S. fell short on mosquito eradication

Mosquitoes on the way out

By 1973, just nine years after the start of an antimosquito campaign, the Aedes aegypti will be eradicated from the United States. The mosquito, a potential carrier of yellow fever, dengue and hemorrhagic fever, has been the target of a $23 million attack launched in 1964…. The carrier of these viral diseases can still be found in 10 southern states, Hawaii, the Virgin Islands and Puerto Rico. — Science News, May 20, 1967

Update
The eradication program, which used chemical sprays and eliminated breeding sites, never came close to getting rid of A. aegypti. Today, the virus-carrying insect’s potential range includes more than 20 states, as well as U.S. territories. And A. aegypti mosquitoes carrying the Zika virus have been found in the continental United States. Researchers are investigating new ways to conquer A. aegypti, by inserting faulty genes into its DNA or dosing it with a sterilizing bacterium (SN: 4/1/17, p. 10).

Watch male cuttlefish fight over a female in the wild

The Bro Code apparently does not exist among wild cuttlefish. The first field video of male European cuttlefish (Sepia officinalis) getting physical over a female shows that they are not above stealing another guy’s girl.

Cuttlefish, cephalopods known for their ability to alter their skin color, have complex and competitive courtship rituals. While scientists have extensively studied common European cuttlefish fights over mates in the lab, observing such altercations in the wild has proved elusive.
In 2011, biologists Justine Allen of Brown University in Providence, R.I., and Derya Akkaynak of the University of Haifa in Israel lucked out. They were in the Aegean Sea off the coast of Turkey following a female cuttlefish with an underwater camera to study camouflage, when a male cuttlefish approached the female, and the pair mated. Soon after, another male appeared on the scene and edged in on the female. A battle of ink and arms ensued. “I just remember there being a lot of ink everywhere — so much ink,” Allen recalls.
It took the original male three tries to reclaim his mate, the team writes in the July issue of The American Naturalist. Each attempt escalated in intensity. That’s consistent with a game theory model where opponents assess peers’ abilities as well as their own, the scientists suggest.
The footage confirms that males in the wild use an arsenal of aggressive behaviors to oust romantic rivals — tactics like darkening the skin around their eyes and face, displaying a zebra pattern on their body, spraying ink while jetting through the water, biting and wrestling. Lab bouts pale in comparison to the viciousness of this encounter.

Determining whether any of this is typical for fights between males of this species requires more data and more cuttlefish.

Global access to quality health care has improved in the last two decades

Health care quality and availability improved globally from 1990 to 2015, but the gap between the haves and the have-nots widened in those 25 years, researchers report online May 18 in the Lancet.

As an approximate measure of citizens’ access to quality health care, an international team of researchers analyzed mortality rates for 32 diseases and injuries that are typically not fatal when effective medical care is available. The team summarized this data as a number on a scale from zero to 100, called the Healthcare Access and Quality Index, for 195 countries and territories.
Places with the highest scores in 2015 include Canada, Australia, Japan and much of Europe, while some African countries as well as India, Pakistan, Afghanistan and Papua New Guinea have the lowest scores. The countries with the greatest improvement since 1990 include South Korea, Peru and China.

The growing gap between countries with the highest and lowest scores suggests that health care inequalities due to geography may be on the rise, the authors say.

New test may improve pancreatic cancer diagnoses

Pancreatic cancer is hard to detect early, when the disease is most amenable to treatment. But a new study describes a blood test that may aid the diagnosis of pancreatic cancer and someday make earlier screening feasible, the authors say.

The test detects a combination of five tumor proteins that appear to be a reliable signature of the disease, the researchers report in the May 24 Science Translational Medicine. In patients undergoing pancreatic or abdominal surgery, the test was 84 percent accurate at picking out those who had pancreatic cancer.
“What’s exciting about the study is that it further favors the belief that one biomarker by itself may not be able to successfully identify a disease,” says Raghu Kalluri, a cancer biologist at the University of Texas MD Anderson Cancer Center in Houston who was not involved in the study. By putting the five protein biomarkers together, he says, “the power of the analysis might be more beneficial in differentiating healthy individuals and ones with pancreatic cancer.”

The National Cancer Institute estimates that in 2017 there will be more than 53,000 new cases of pancreatic cancer in the United States and just over 43,000 deaths from the cancer. Individuals with the most common form of pancreatic cancer, called pancreatic ductal adenocarcinoma, have a five-year survival rate of less than 10 percent. The cancer is usually caught late because the symptoms, including weight loss and abdominal pain, often don’t arise until the cancer has spread. And current imaging technology can’t detect the cancer at the start, says study coauthor Cesar Castro, a translational oncologist at Massachusetts General Hospital in Boston.

“The unmet need here is finding some other form of detection before a cancer grows large enough for a CT scan to detect it,” Castro says.

In their hunt for better detection methods, the researchers turned to tumor-derived extracellular vesicles, small sacs shed by tumor cells that circulate in the bloodstream. The sacs “are almost like mini-mes” of the parent tumor, Castro says, because they contain proteins and genetic material that often match the tumor.

The researchers selected five promising protein biomarkers from tumor-derived extracellular vesicles. Using a gold-coated silicon chip covered with antibodies and sporting nanopores, the team tested how well the biomarkers signaled the presence of pancreatic cancer in plasma samples from patients. When light shining through the pores encountered the extracellular vesicles, bound to the chip because of the interaction between the protein biomarkers and the antibodies, the light’s wavelength changed — signaling the presence of a tumor.
In plasma samples taken from 43 patients before scheduled surgery for a medical issue in the pancreas or abdomen, the panel of five biomarkers distinguished pancreatic ductal adenocarcinoma from pancreatitis — an inflammation of the pancreas — and from benign cysts as well as from control patients’ samples. A pathology report after the surgeries confirmed the results.

Using their sensing device, the researchers report that the combined five biomarkers correctly identified whether a patient had pancreatic ductal adenocarcinoma or not in 84 percent of cases. The highest accuracy for any one of these biomarkers used by itself was just 70 percent.

The next step is to test patients at high risk for pancreatic cancer, and eventually those who are healthy, to see if these biomarkers are effective at early screening. “What we need to do now is pivot towards precancerous lesions,” Castro says. “Can they pick up any precancerous changes?”

“I’m excited about anything that can happen for these patients who are in desperate need for biomarkers and treatment,” Kalluri says. But he cautions that studies reporting the effectiveness of biomarkers as cancer screening tools often use different technologies for their assessments, making it hard for academic laboratories to reproduce the results. “There’s a tremendous lack of organized effort in the biomarker field,” he says, and if there is no way to come to a consensus on which biomarkers are most promising, “it’s very difficult for a patient to realize any benefit.”

Peru’s plenty brought ancient human migration to a crawl

Some of the earliest settlers of the Americas curtailed their coastal migration to hunker down in what’s now northwestern Peru, new finds suggest. Although researchers have often assumed that shoreline colonizers of the New World kept heading south from Alaska in search of marine foods, staying put in some spots made sense: Hunter-gatherers needed only simple tools to exploit rich coastal and inland food sources for thousands of years.

Excavations at two seaside sites in Peru find that people intermittently camped there from about 15,000 to 8,000 years ago, say anthropologist Tom Dillehay of Vanderbilt University in Nashville and his colleagues. Ancient people along Peru’s Pacific coast didn’t leave behind fishhooks, harpoons, nets or boats that could have been used to capture fish, sharks and sea lions, the scientists report May 24 in Science Advances. Yet remains of those sea creatures turned up at coastal campsites now buried beneath a human-made, earthen mound called Huaca Prieta and an adjacent mound called Paredones. Fish and other marine animals probably washed up on beaches or were trapped in lagoons that formed near the shore, Dillehay’s group proposes. Hungry humans needed only nets or clubs to subdue these prey.
Other marine foods found at the ancient Peruvian campsites included snails, crabs, clams, sea gulls and pelicans. Fragments of material woven out of rush plants, the earliest dating to between 10,600 and 11,159 years ago, may have come from fish traps or baskets, the researchers say.
Radiocarbon dating of burned wood, animal bones and plant seeds provided age estimates for a series of buried campsites at Huaca Prieta and Paredones.

Present-day hunters on Peru’s coast eat fish and small sharks that get trapped on the beach or in shallow shoreline lagoons. Hunters also build blinds where they wait to net and club birds, a tactic probably also used by ancient Americans, the investigators suspect.

Deer bones indicate that ancient Huaca Prieta and Paredones visitors hunted on land as well. And remains of avocado, beans and possibly cultivated squash and chili peppers at the ancient campsites — foods known to have been gathered or grown at inland locations — suggest that people transported these foods to the coast, possibly via trading.
Evidence that early New World settlers trekked back and forth from coastal to interior parts of Peru coincides with similar human movements in southern Chile more than 14,000 years ago (SN Online: 5/8/08). A team led by Dillehay uncovered seaweed fragments in hearths and structures at Monte Verde II, located 30 kilometers from Chile’s coast. Edible rushes, reeds and stones from the coast also turned up at Monte Verde II.

“Just as there was some contact with the sea at Monte Verde II, there was some contact with the interior at Huaca Prieta,” Dillehay says.

Simple stone tools, sharpened on one side, dominate implements excavated at the Peruvian sites and at Monte Verde II. Basic tools suitable for all sorts of cutting and scraping tasks fit a lifestyle in which people sought food across varied landscapes, the researchers contend.
Similar conditions may have characterized some North American coastlines by around 15,000 years ago, Dillehay says. “The problem is that these areas are now underwater” due to a global sea level rise between 20,000 and 6,000 years ago (SN: 8/13/11, p. 22).

Accumulating evidence supports the idea that early Americans favored the coast over an inland lifestyle, says archaeologist Daniel Sandweiss of the University of Maine in Orono. An ice-free corridor into North America’s interior may not have formed before 12,600 years ago (SN Online: 8/10/16), after people had reached Peru and Chile.

The pace at which people moved south from Huaca Prieta is unknown, Sandweiss says. Monte Verde II dates to roughly 500 years after the first coastal campsites in Peru, raising the possibility that Huaca Prieta folk founded the Chilean site, he suggests.

Dillehay doubts it. Modern hunter-gatherer groups vary greatly in size but usually don’t exceed several hundred members, making it unlikely that ancient Huaca Prieta and Paredones people were numerous enough to encounter food shortages, he says. Even if food ran out, hunter-gatherers only had to move a few kilometers north or south to find abundant grub. “We really don’t know where these people were coming and going,” Dillehay cautions.