This robot automatically tucks its limbs to squeeze through spaces

Inspired by how ants move through narrow spaces by shortening their legs, scientists have built a robot that draws in its limbs to navigate constricted passages.

The robot was able to hunch down and walk quickly through passages that were narrower and shorter than itself, researchers report January 20 in Advanced Intelligent Systems. It could also climb over steps and move on grass, loose rock, mulch and crushed granite.

Such generality and adaptability are the main challenges of legged robot locomotion, says robotics engineer Feifei Qian, who was not involved in the study. Some robots have specialized limbs to move over a particular terrain, but they cannot squeeze into small spaces (SN: 1/16/19).
“A design that can adapt to a variety of environments with varying scales or stiffness is a lot more challenging, as trade-offs between the different environments need to be considered,” says Qian, of the University of Southern California in Los Angeles.

For inspiration, researchers in the new study turned to ants. “Insects are really a neat inspiration for designing robot systems that have minimal actuation but can perform a multitude of locomotion behaviors,” says Nick Gravish, a roboticist at the University of California, San Diego (SN: 8/16/18). Ants adapt their posture to crawl through tiny spaces. And they aren’t perturbed by uneven terrain or small obstacles. For example, their legs collapse a bit when they hit an object, Gravish says, and the ants continue to move forward quickly.

Gravish and colleagues built a short, stocky robot — about 30 centimeters wide and 20 centimeters long — with four wavy, telescoping limbs. Each limb consists of six nested concentric tubes that can draw into each other. What’s more, the limbs do not need to be actively powered or adjusted to change their overall length. Instead, springs that connect the leg segments automatically allow the legs to contract when the robot navigates a narrow space and stretch back out in an open space. The goal was to build mechanically intelligent structures rather than algorithmically intelligent robots.

“It’s likely faster than active control, [which] requires the robot to first sense the contact with the environment, compute the suitable action and then send the command to its motors,” Qian says of these legs. Removing the sensing and computing components can also make robots smaller, cheaper and less power hungry.

The robot could modify its body width and height to achieve a larger range of body sizes than other similar robots. The leg segments contracted into themselves to let the robot wiggle through small tunnels, and the legs sprawled outward to lower the body under low ceilings. This adaptability let the robot squeeze into spaces as small as 72 percent of its full width and 68 percent of its full height.
Next, the researchers plan to actively control the stiffness of the springs that connect the leg segments to tune the motion to terrain type without consuming too much power. “That way, you can keep your leg long when you are moving on open ground or over tall objects, but then collapse down to the smallest possible shape in confined spaces,” Gravish says.
Such small-scale, minimal robots are easy to produce and can be quickly tweaked to explore complex environments. However, despite being able to walk across different terrains, these robots are, for now, too fragile for search-and-rescue, exploration or biological monitoring, Gravish says.

The new robot takes a step closer to those goals, but getting there will take more than just robotics, Qian says. “To actually achieve these applications would require an integration of design, control, sensing, planning and hardware advancement.”

But that’s not Gravish’s interest. Instead, he wants to connect these experiments back to what was observed in the ants originally and use the robots to ask more questions about the rules of locomotion in nature (SN: 1/16/20).

“I really would like to understand how small insects are able to move so rapidly across certain unpredictable terrain,” he says. “What is special about their limbs that enables them to move so quickly?”

The Kuiper Belt’s dwarf planet Quaoar hosts an impossible ring

The dwarf planet Quaoar has a ring that is too big for its metaphorical fingers. While all other rings in the solar system lie within or near a mathematically determined distance of their parent bodies, Quaoar’s ring is much farther out.

“For Quaoar, for the ring to be outside this limit is very, very strange,” says astronomer Bruno Morgado of the Federal University of Rio de Janeiro. The finding may force a rethink of the rules governing planetary rings, Morgado and colleagues say in a study published February 8 in Nature.
Quaoar is an icy body about half the size of Pluto that’s located in the Kuiper Belt at the solar system’s edge (SN: 8/23/22). At such a great distance from Earth, it’s hard to get a clear picture of the world.

So Morgado and colleagues watched Quaoar block the light from a distant star, a phenomenon called a stellar occultation. The timing of the star winking in and out of view can reveal details about Quaoar, like its size and whether it has an atmosphere.

The researchers took data from occultations from 2018 to 2020, observed from all over the world, including Namibia, Australia and Grenada, as well as space. There was no sign that Quaoar had an atmosphere. But surprisingly, there was a ring. The finding makes Quaoar just the third dwarf planet or asteroid in the solar system known to have a ring, after the asteroid Chariklo and the dwarf planet Haumea (SN: 3/26/14; SN: 10/11/17).

Even more surprisingly, “the ring is not where we expect,” Morgado says.
Known rings around other objects lie within or near what’s called the Roche limit, an invisible line within which the main body’s tidal pull dominates. Inside the limit, those tidal forces can rip a moon to shreds, turning it into a ring. Outside, the mutual gravity of smaller particles wins out over the tides, and rings should coalesce into one or several moons.
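The limit itself has a standard textbook form. For a strengthless, fluid satellite of density ρ_s orbiting a primary of radius R_p and density ρ_p, the Roche limit is approximately (a classical approximation, not a figure from the study):

```latex
% Classical fluid-body Roche limit (textbook approximation)
d \approx 2.44\, R_p \left(\frac{\rho_p}{\rho_s}\right)^{1/3}
```

A ring is expected to survive only inside this distance; Quaoar’s ring sits well beyond it, which is why it looks so out of place.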

“We always think of [the Roche limit] as straightforward,” Morgado says. “One side is a moon forming, the other side is a ring stable. And now this limit is not a limit.”

For Quaoar’s far-out ring, there are a few possible explanations, Morgado says. Maybe the observers caught the ring at just the right moment, right before it turns into a moon. But that lucky timing seems unlikely, he notes.

Maybe Quaoar’s known moon, Weywot, or some other unseen moon contributes gravity that holds the ring stable somehow. Or maybe the ring’s particles are colliding in such a way that they avoid sticking together and clumping into moons.

The particles would have to be particularly bouncy for that to work, “like a ring of those bouncy balls from toy stores,” says planetary scientist David Jewitt of UCLA, who was not involved in the new work.

The observation is solid, says Jewitt, who helped discover the first objects in the Kuiper Belt in the 1990s. But there’s no way to know yet which of the explanations is correct, if any, in part because there are no theoretical predictions for such far-out rings to compare with Quaoar’s situation.

That’s par for the course when it comes to the Kuiper Belt. “Everything in the Kuiper Belt, basically, has been discovered, not predicted,” Jewitt says. “It’s the opposite of the classical model of science where people predict things and then confirm or reject them. People discover stuff by surprise, and everyone scrambles to explain it.”

More observations of Quaoar, or more discoveries of seemingly misplaced rings elsewhere in the solar system, could help reveal what’s going on.

“I have no doubt that in the near future a lot of people will start working with Quaoar to try to get this answer,” Morgado says.

Muon scanning hints at mysteries within an ancient Chinese wall

For nearly 650 years, the fortress walls in the Chinese city of Xi’an have served as a formidable barrier around the central city. At 12 meters high and up to 18 meters thick, they are impervious to almost everything — except subatomic particles called muons.

Now, thanks to their penetrating abilities, muons may be key to ensuring that the walls that once protected the treasures of the first Ming Dynasty — and are now a national architectural treasure in their own right — stand for centuries more.

A refined detection method has provided the highest-resolution muon scans yet produced of any archaeological structure, researchers report in the January 7 Journal of Applied Physics. The scans revealed interior density fluctuations as small as a meter across inside one section of the Xi’an ramparts. The fluctuations could be signs of dangerous flaws or “hidden structures archaeologically interesting for discovery and investigation,” says nuclear physicist Zhiyi Liu of Lanzhou University in China.
Muons are like electrons, only heavier. They rain down all over the planet, produced when charged particles called cosmic rays hit the atmosphere. Although muons can travel deep into earth and stone, they are scattered or absorbed depending on the material they encounter. Counting the ones that pass through makes them useful for studying volcano interiors, scanning pyramids for hidden chambers and even searching for contraband stashed in containers impervious to X-rays (SN: 4/22/22).

Though muons stream down continuously, their numbers are small enough that the researchers had to deploy six detectors for a week at a time to collect enough data for 3-D scans of the rampart.

It’s now up to conservationists to determine how to address any density fluctuations that might indicate dangerous flaws, or historical surprises, inside the Xi’an walls.

Chicken DNA is replacing the genetics of their ancestral jungle fowl

Today’s red jungle fowl — the wild forebears of the domesticated chicken — are becoming more chickenlike. New research suggests that a large proportion of the wild fowl’s DNA has been inherited from chickens, and relatively recently.

Ongoing interbreeding between the two birds may threaten wild jungle fowl populations’ future, and even hobble humans’ ability to breed better chickens, researchers report January 19 in PLOS Genetics.

Red jungle fowl (Gallus gallus) are forest birds native to Southeast Asia and parts of South Asia. Thousands of years ago, humans domesticated the fowl, possibly in the region’s rice fields (SN: 6/6/22).
“Chickens are arguably the most important domestic animal on Earth,” says Frank Rheindt, an evolutionary biologist at the National University of Singapore. He points to their global ubiquity and abundance. Chicken is also one of the cheapest sources of animal protein that humans have.

Domesticated chickens (G. gallus domesticus) were known to be interbreeding with jungle fowl near human settlements in Southeast Asia. Given the unknown impacts on jungle fowl and the importance of chickens to humankind, Rheindt and his team wanted to gather more details. Wild jungle fowl contain a store of genetic diversity that could serve as a crucial resource for breeding chickens resistant to diseases or other threats.

The researchers analyzed and compared the genomes — the full complement of an organism’s DNA — of 63 jungle fowl and 51 chickens from across Southeast Asia. Some of the jungle fowl samples came from museum specimens collected from 1874 through 1939, letting the team see how the genetic makeup of jungle fowl has changed over time.

Over the last century or so, wild jungle fowl’s genomes have become increasingly similar to chickens’. Between about 20 and 50 percent of the genomes of modern jungle fowl originated in chickens, the team found. In contrast, many of the roughly 100-year-old jungle fowl had a chicken-ancestry share in the range of a few percent.

The rapid change probably comes from human communities expanding into the region’s wilderness, Rheindt says. Most modern jungle fowl live in close vicinity to humans’ free-ranging chickens, with which they frequently interbreed.

Such interbreeding has become “almost the norm now” for any globally domesticated species, Rheindt says, such as dogs hybridizing with wolves and house cats crossing with wildcats. Pigs, meanwhile, are mixing with wild boars and ferrets with polecats.
Wild populations that interbreed with their domesticated counterparts could pick up physical or behavioral traits that change how the hybrids function in their ecosystem, says Claudio Quilodrán, a conservation geneticist at the University of Geneva not involved with this research.

The effect is likely to be negative, Quilodrán says, since some of the traits coming into the wild population have been honed for human uses, not for survival in the local environment.

Wild jungle fowl have also lost genetic diversity as they’ve interbred. The birds’ heterozygosity — a measure of a population’s genetic diversity — is now just a tenth of what it was a century ago.
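The idea behind heterozygosity can be illustrated with a small sketch. This toy calculation uses expected heterozygosity, one common estimator of genetic diversity; the study’s exact method may differ, and the allele frequencies below are invented for illustration:

```python
# Toy sketch of expected heterozygosity (H_e), one common measure of
# genetic diversity. For each locus, H_e = 1 - sum(p_i^2) over allele
# frequencies p_i; genome-wide diversity is the mean across loci.

def expected_heterozygosity(allele_freqs):
    """allele_freqs: one list of allele frequencies per locus."""
    per_locus = [1.0 - sum(p * p for p in freqs) for freqs in allele_freqs]
    return sum(per_locus) / len(per_locus)

# Hypothetical numbers, for illustration only:
historical = expected_heterozygosity([[0.5, 0.5], [0.4, 0.6]])   # varied loci
modern = expected_heterozygosity([[0.95, 0.05], [0.99, 0.01]])   # swamped loci
print(historical, modern)  # diversity drops as one allele comes to dominate
```

When one allele version sweeps through a population, as the text describes, the p_i² term for that allele approaches 1 and heterozygosity collapses toward zero.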

“This result is initially counterintuitive,” Rheindt says. “If you mix one population with another, you would generally expect a higher genetic diversity.”

But domesticated chickens have such low genetic diversity that certain versions of jungle fowl genes are being swept out of the population by a tsunami of genetic homogeneity. The whittling down of these animals’ genetic toolkit may leave them vulnerable to conservation threats.

“Having lots of genetic diversity within a species increases the chance that certain individuals contain the genetic background to adapt to a varied range of different environmental changes and diseases,” says Graham Etherington, a computational biologist at the Earlham Institute in Norwich, England, who was not involved with this research.

A shallower jungle fowl gene pool could also mean diminished resources for breeding better chickens. The genetics of wild relatives are sometimes used to bolster the disease or pest resistance of domesticated crop plants. Jungle fowl genomes could be similarly valuable for this reason.

“If this trend continues unabated, future human generations may only be able to access the entirety of ancestral genetic diversity of chickens in the form of museum specimens,” Rheindt says, which could hamper chicken breeding efforts using the wild fowl genes.

Some countries such as Singapore, Rheindt says, have started managing jungle fowl populations to reduce interbreeding with chickens.

Procrastination may harm your health. Here’s what you can do

The worst procrastinators probably won’t be able to read this story. It’ll remind them of what they’re trying to avoid, psychologist Piers Steel says.

Maybe they’re dragging their feet going to the gym. Maybe they haven’t gotten around to their New Year’s resolutions. Maybe they’re waiting just one more day to study for that test.

Procrastination is “putting off to later what you know you should be doing now,” even if you’ll be worse off, says Steel, of the University of Calgary in Canada. But all those tasks pushed to tomorrow seem to wedge themselves into the mind — and it may be harming people’s health.
In a study of thousands of university students, scientists linked procrastination to a panoply of poor outcomes, including depression, anxiety and even disabling arm pain. “I was surprised when I saw that one,” says Fred Johansson, a clinical psychologist at Sophiahemmet University in Stockholm. His team reported the results January 4 in JAMA Network Open.

The study is one of the largest yet to tackle procrastination’s ties to health. Its results echo findings from earlier studies that have gone largely ignored, says Fuschia Sirois, a behavioral scientist at Durham University in England, who was not involved with the new research.

For years, scientists didn’t seem to view procrastination as something serious, she says. The new study could change that. “It’s that kind of big splash that’s … going to get attention,” Sirois says. “I’m hoping that it will raise awareness of the physical health consequences of procrastination.”

Procrastinating may be bad for the mind and body
Whether procrastination harms health can seem like a chicken-and-egg situation.

It can be hard to tell if certain health problems make people more likely to procrastinate — or the other way around, Johansson says. (It may be a bit of both.) And controlled experiments on procrastination aren’t easy to do: You can’t just tell a study participant to become a procrastinator and wait and see if their health changes, he says.
Many previous studies have relied on self-reported surveys taken at a single time point. But a snapshot of someone makes it tricky to untangle cause and effect. Instead, in the new study, about 3,500 students were followed over nine months, so researchers could track whether procrastinating students later developed health issues.

On average, these students tended to fare worse over time than their prompter peers. They were slightly more stressed, anxious, depressed and sleep-deprived, among other issues, Johansson and colleagues found. “People who score higher on procrastination to begin with … are at greater risk of developing both physical and psychological problems later on,” says study coauthor Alexander Rozental, a clinical psychologist at Uppsala University in Sweden. “There is a relationship between procrastination at one time point and having these negative outcomes at the later point.”

The study was observational, so the team can’t say for sure that procrastination causes poor health. But results from other researchers also seem to point in this direction. A 2021 study tied procrastinating at bedtime to depression. And a 2015 study from Sirois’ lab linked procrastinating to poor heart health.

Stress may be to blame for procrastination’s ill effects, data from Sirois’ lab and other studies suggest. She thinks that the effects of chronic procrastinating could build up over time. And though procrastination alone may not cause disease, Sirois says, it could be “one extra factor that can tip the scales.”

No, procrastinators are not lazy
Some 20 percent of adults are estimated to be chronic procrastinators. Everyone might put off a task or two, but chronic procrastinators make it their lifestyle, says Joseph Ferrari, a psychologist at DePaul University in Chicago, who has been studying procrastination for decades. “They do it at home, at school, at work and in their relationships.” These are the people, he says, who “you know are going to RSVP late.”

Though procrastinators may think they perform better under pressure, Ferrari has reported the opposite. They actually worked more slowly and made more errors than non-procrastinators, his experiments have shown. And when deadlines are slippery, procrastinators tend to let their work slide, Steel’s team reported last year in Frontiers in Psychology.

For years, researchers have focused on the personalities of people who procrastinate. Findings vary, but some scientists suggest that procrastinators may be impulsive, prone to worry and inclined to struggle with regulating their emotions. One thing procrastinators are not, Ferrari emphasizes, is lazy. They’re actually “very busy doing other things than what they’re supposed to be doing,” he says.

In fact, Rozental adds, most research today suggests procrastination is a behavioral pattern.

And if procrastination is a behavior, he says, that means it’s something you can change, regardless of whether you’re impulsive.

Why procrastinators should be kind to themselves
When people put off a tough task, they feel good — in the moment.
Procrastinating is a way to sidestep the negative emotions linked to the task, Sirois says. “We’re sort of hardwired to avoid anything painful or difficult,” she says. “When you procrastinate, you get immediate relief.” A backdrop of stressful circumstances — say, a worldwide pandemic — can strain people’s ability to cope, making procrastinating even easier. But the relief it provides is only temporary, and many seek out ways to stop dawdling.

Researchers have experimented with procrastination treatments that run the gamut from the logistical to the psychological. What works best is still under investigation. Some scientists have reported success with time-management interventions. But the evidence for that “is all over the map,” Sirois says. That’s because “poor time management is a symptom not a cause of procrastination,” she adds.

For some procrastinators, seemingly obvious tips can work. In his clinical practice, Rozental advises students to simply put down their smartphones. Silencing notifications or studying in the library rather than at home can quash distractions and keep people on task. But that won’t be enough for many people, he says.

Hard-core procrastinators may benefit from cognitive behavioral therapy. In a 2018 review of procrastination treatments, Rozental found that this type of therapy, which involves managing thoughts and emotions and trying to change behavior, seemed to be the most helpful. Still, not many studies have examined treatments, and there’s room for improvement, he says.

Sirois also favors an emotion-centered approach. Procrastinators can fall into a shame spiral where they feel uneasy about a task, put the task off, feel ashamed for putting it off and then feel even worse than when they started. People need to short-circuit that loop, she says. Self-forgiveness may help, scientists suggested in one 2020 study. So could mindfulness training.

In a small trial of university students, eight weekly mindfulness sessions reduced procrastination, Sirois and colleagues reported in the January Learning and Individual Differences. Students practiced focusing on the body, meditated during unpleasant activities and discussed the best ways to take care of themselves. A little self-compassion may snap people out of their spiral, Sirois says.

“You made a mistake and procrastinated. It’s not the end of the world,” she says. “What can you do to move forward?”

A bacteria-virus arms race could lead to a new way to treat shigellosis

When some bacteria manage to escape being killed by a virus, the microbes end up hamstringing themselves. And that could be useful in the fight to treat infections.

The bacterium Shigella flexneri — one cause of the infectious disease shigellosis — can spread within cells that line the gut by propelling itself through the cells’ barriers. That causes tissue damage that can lead to symptoms like bloody diarrhea. But when S. flexneri in lab dishes evolved to elude a type of bacteria-killing virus, the bacteria couldn’t spread cell to cell anymore, making it less virulent, researchers report November 17 in Applied and Environmental Microbiology.

The research is a hopeful sign for what’s known as phage therapy (SN: 11/20/02). With antibiotic-resistant microbes on the rise, some researchers see viruses that infect and kill only bacteria, known as bacteriophages or just phages, as a potential option for treating antibiotic-resistant infections (SN: 11/13/19). In phage therapy, infected people are given doses of a particular phage, which kills off the problematic bacteria. The problem, though, is that over time those bacteria can evolve to become resistant to the phage, too.

“We’re kind of expecting phage therapy to fail, in a sense,” says Paul Turner, an evolutionary biologist and virologist at Yale University. “Bacteria are very good at evolving resistance to phages.”
But that doesn’t mean the bacteria emerge unscathed. Some phages attack and enter bacteria by latching onto bacterial proteins crucial for a microbe’s function. If phage therapy treatments relied on such a virus, that could push the bacteria to evolve in such a way that not only helps them escape the virus but also impairs their abilities and makes them less deadly. People infected with these altered bacteria might have less severe symptoms or may not show symptoms at all.

Previous studies with the bacteria Pseudomonas aeruginosa, for instance, have found that phage and bacteria can engage in evolutionary battles that drive the bacteria to be more sensitive to antibiotics. The new study hints that researchers could leverage the arms race between S. flexneri and the newly identified phage, which was dubbed A1-1 after being found in Mexican wastewater, to treat shigellosis.

S. flexneri in contaminated water is a huge problem in parts of the world where clean water isn’t always available, such as sub-Saharan Africa and southern Asia, says Kaitlyn Kortright, a microbiologist also at Yale University. Every year, approximately 1.3 million people die from shigellosis, which is caused by four Shigella species. More than half of those deaths are in children younger than 5 years old. What’s more, antibiotics to treat shigellosis can be expensive and hard to access in those places. And S. flexneri is becoming resistant to many antibiotics. Phage therapy could be a cheaper, more accessible option to treat the infection.

The blow to S. flexneri’s cellular spread comes because to enter cells, A1-1 targets a protein called OmpA, which is crucial for the bacteria to rupture host cell membranes. The researchers found two types of mutations that made S. flexneri resistant to A1-1. Some bacteria had mutations in the gene that produces OmpA, damaging the protein’s ability to help the microbes spread from cell to cell. Others had changes to a structural component of bacterial cells called lipopolysaccharide.

The mutations in lipopolysaccharide were surprising, Kortright says, because the relationship between that structural component and OmpA isn’t fully worked out. One possibility is that those mutations distort OmpA’s structure in a way that the phage no longer recognizes it and can’t enter bacterial cells.

One lingering question is whether S. flexneri evolves in the same way outside a lab dish, says Saima Aslam, an infectious diseases physician at the University of California, San Diego who was not involved in the work. Still, the findings show that it’s “not always a bad thing” when bacteria become phage-resistant, she says.

How sleep may boost creativity

The twilight time between fully awake and sound asleep may be packed with creative potential.

People who had recently drifted off into a light sleep were later more likely to crack a hidden-rule problem, scientists report December 8 in Science Advances. The results help demystify the fleeting early moments of sleep and may even point to ways to boost creativity.

Prolific inventor and catnapper Thomas Edison was rumored to chase those twilight moments. He was said to fall asleep in a chair holding two steel ball bearings over metal pans. As he drifted off, the balls would fall. The ensuing clatter would wake him, and he could rescue his inventive ideas before they were lost to the depths of sleep.

Delphine Oudiette, a cognitive neuroscientist at the Paris Brain Institute, and colleagues took inspiration from Edison’s method of cultivating creativity. They brought 103 healthy people to their lab to solve a tricky number problem. The volunteers were asked to convert a string of numbers into a shorter sequence, following two simple rules. What the volunteers weren’t told was that there was an easy trick: The second number in the sequence would always be the correct final number, too. Once discovered, this cheat code dramatically cut the solving time.
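The task as described resembles the classic number-reduction task, so a sketch under that assumption may help. The digit set, the two rules and the function names below are assumptions for illustration, not details from the study:

```python
# Sketch of a number-reduction-style task (rules assumed from the classic
# version of this kind of task; the study's exact setup may differ).
# Digits come from a fixed set of three. Working left to right, each step
# combines the running result with the next digit: two identical digits
# yield that digit; two different digits yield the third digit of the set.
# The final result is the answer. In the experiment, strings were built
# so that a hidden shortcut made the answer predictable early on.

DIGITS = {1, 4, 9}  # hypothetical digit set

def reduce_step(a, b):
    if a == b:
        return a                    # "same" rule
    return (DIGITS - {a, b}).pop()  # "different" rule: the remaining digit

def solve(sequence):
    result = sequence[0]
    for d in sequence[1:]:
        result = reduce_step(result, d)
    return result

print(solve([1, 1, 4, 9, 4]))  # prints 1, the digit a volunteer would report
```

A solver who notices the hidden regularity in the constructed strings can skip most of these steps, which is why spotting the trick cut solving times so sharply.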
After doing 60 of these trials on a computer, the volunteers earned a 20-minute break in a quiet, dark room. Reclined and holding an equivalent of Edison’s “alarm clock” (a light drinking bottle in one dangling hand), participants were asked to close their eyes and rest or sleep if they desired. All the while, electrodes monitored their brain waves.

About half of the participants stayed awake. Twenty-four fell asleep and stayed in the shallow, fleeting stage of sleep called N1. Fourteen people progressed to a deeper stage of sleep called N2.

After their rest, participants returned to their number problem. The researchers saw a stark difference between the groups: People who had fallen into a shallow, early sleep were 2.7 times as likely to spot the hidden trick as people who didn’t fall asleep, and 5.8 times as likely to spot it as people who had reached the deeper N2 stage.

Such drastic differences in these types of experiments are rare, Oudiette says. “We were quite astonished by the extent of the results.” The researchers also identified a “creative cocktail of brain waves,” as Oudiette puts it, that seemed to accompany this twilight stage — a mixture of alpha brain waves that usually mark relaxation mingled with the delta waves of deeper sleep.

The study doesn’t show that the time spent in N1 actually triggered the later realization, cautions John Kounios, a cognitive neuroscientist at Drexel University in Philadelphia who cowrote the 2015 book The Eureka Factor: Aha Moments, Creative Insight, and the Brain. “It could have been possible that grappling with the problem and initiating an incubation process caused both N1 and the subsequent insight,” he says, making N1 a “by-product of the processes that caused insight rather than the cause.”

More work is needed to untangle the connection between N1 and creativity, Oudiette says. But the results raise a tantalizing possibility, one that harks back to Edison’s self-optimizations: People might be able to learn to reach that twilight stage of sleep, or to produce the cocktail of brain waves associated with creativity on demand.

It seems Edison was onto something about the creative powers of nodding off. But don’t put too much stock in his habits. He is also said to have considered sleep “a criminal waste of time.”

In 2021, COVID-19 vaccines were put to the test. Here’s what we learned

2021 was the year the COVID-19 vaccines had to prove their mettle. We started the year full of hope: With vaccines in hand in record-breaking time and their rollout ramping up, we’d get shots in arms, curb this pandemic and get life back to normal. That was too optimistic.

Roughly 200 million people in the United States — and billions globally — have now been fully vaccinated. Three vaccines — one from Pfizer and its partner BioNTech, and the other two from Moderna and Johnson & Johnson — are available in the United States. Pfizer’s is even available for children as young as 5. About two dozen other vaccines have also been deployed in other parts of the world. In some higher-income countries, the United States included, people have already queued up for booster shots.

But 2021 has also been the year of learning the limits of the vaccines’ superpowers. With the vaccines pitted against aggressive coronavirus variants, inequitable distribution, some people’s hesitancy and the natural course of waning effectiveness, there’s still a lot of work to do to bring this pandemic to an end. As if to hammer home that point, the detection of the omicron variant in late November brought new uncertainty to the pandemic’s trajectory. Here are some of the top lessons we’ve learned in the first year of the COVID-19 vaccine. — Macon Morehouse
The shots work, even against emerging variants
Many COVID-19 vaccines proved effective over the last year, particularly at preventing severe disease and death (SN: 10/9/21 & 10/23/21, p. 4). That’s true even with the emergence of more transmissible coronavirus variants.

In January, in the midst of a bleak winter surge that saw average daily cases in the United States peaking at nearly 250,000, the vaccination rollout here began in earnest. Soon after, case numbers began a steep decline.

Over the summer, though, more reports of coronavirus infections in vaccinated people began to pop up. Protection against infection becomes less robust in the months following vaccination in people who received Pfizer’s or Moderna’s mRNA vaccines, multiple studies have shown (SN Online: 9/21/21). Yet the shots’ original target — preventing hospitalization — has held steady, with an efficacy of about 80 percent to 95 percent.
A single dose of Johnson & Johnson’s vaccine is less effective at preventing symptoms or keeping people out of the hospital than the mRNA jabs. The company claims there’s not yet evidence that the protection wanes. But even if that protection is not waning, some real-world data hint that the shot may not be as effective as clinical trials suggested (SN Online: 10/19/21).

Evidence of waning or lower protection ultimately pushed the United States and some other countries to green-light COVID-19 booster shots for adults (SN: 12/4/21, p. 6).

Much of the worry over waning immunity came amid the spread of highly contagious variants, including alpha, first identified in the United Kingdom in September 2020, and delta, first detected in India in October 2020 (SN Online: 7/30/21). Today, delta is the predominant variant globally.

The good news is that vaccinated people aren’t unarmed against these mutated foes. The immune system launches a multipronged attack against invaders, so the response can handle small molecular tweaks to viruses, says Nina Luning Prak, an immunologist at the University of Pennsylvania. Dealing with variants “is what the immune system does.”
Vaccine-prompted antibodies still attack alpha and delta, though slightly less well than they tackle the original virus that emerged in Wuhan, China, two years ago. Antibodies also still recognize more immune-evasive variants such as beta, first identified in South Africa in May 2020, and gamma, identified in Brazil in November 2020. Although protection against infection dips against many of these variants, vaccinated people remain much less likely to be hospitalized compared with unvaccinated people.
Experts will continue to track how well the vaccines are doing, especially as new variants, like omicron, emerge. In late November, the World Health Organization designated the omicron variant as the latest variant of concern after researchers in South Africa and Botswana warned that it carries several worrisome mutations. Preliminary studies suggest that, so far, omicron is spreading fast in places including South Africa and the United Kingdom, and can reinfect people who have already recovered from an infection. The variant might be at least as transmissible as delta, though that’s still far from certain, according to a December 9 report from researchers with Public Health England, a U.K. health agency. How omicron may affect vaccine effectiveness is also unclear. Pfizer’s two-dose shot, for instance, may be about 30 percent effective at preventing symptoms from omicron infections while a booster could bring effectiveness back up to more than 70 percent, according to estimates from Public Health England. But those estimates are based on low case numbers and could change as omicron spreads.

“This is the first time in history that we’re basically monitoring virus mutations in real time,” says Müge Çevik, an infectious diseases physician and virologist at the University of St. Andrews in Scotland. “This is what the viruses do. It’s just that we’re seeing it because we’re looking for it.”

But it’s unlikely that any new variant will take us back to square one, Çevik says. Because of the immune system’s varied defenses, it will be difficult for a coronavirus variant to become completely resistant to vaccine-induced protection. The vaccines are giving our immune systems the tools to fight back. — Erin Garcia de Jesús

The shots are safe, with few serious side effects
With billions of doses distributed around the world, the shots have proved not only effective, but also remarkably safe, with few serious side effects.

“We have so much safety data on these vaccines,” says Kawsar Talaat, an infectious diseases physician at the Johns Hopkins Bloomberg School of Public Health. “I don’t know of any vaccines that have been scrutinized to the same extent.”

Commonly reported side effects include pain, redness or swelling at the spot of the shot, muscle aches, fatigue, fever, chills or a headache. These symptoms usually last only a day or two.
Rarer, more serious side effects have been noted. But none are unique to these shots; other vaccines — plus infectious diseases, including COVID-19 — also cause these complications.

One example is inflammation of the heart muscle, known as myocarditis, or of the sac around the heart, pericarditis. Current estimates are a bit squishy since existing studies have different populations and other variables (SN Online: 10/19/21). Two large studies in Israel estimated that the risk of myocarditis after an mRNA vaccine is about 4 of every 100,000 males and 0.23 to 0.46 of every 100,000 females, researchers reported in October in the New England Journal of Medicine. Yet members of Kaiser Permanente Southern California who had gotten mRNA vaccines developed myocarditis at a much lower rate: 5.8 cases for every 1 million second doses given, researchers reported, also in October, in JAMA Internal Medicine.

What all the studies have in common is that young males in their teens and 20s are at highest risk of developing the side effect, and that risk is highest after the second vaccine dose (SN Online: 6/23/21). But it’s still fairly rare, topping out at about 15 cases for every 100,000 vaccinated males ages 16 to 19, according to the larger of the two Israeli studies. Males in that age group are also at the highest risk of getting myocarditis and pericarditis from any cause, including from COVID-19.
Components of the mRNA vaccines may also cause allergic reactions, including potentially life-threatening anaphylaxis. The U.S. Centers for Disease Control and Prevention calculated that anaphylaxis happens at a rate of about 0.025 to 0.047 cases for every 10,000 vaccine doses given.

But a study of almost 65,000 health care system employees in Massachusetts suggests the rate may be as high as 2.47 per 10,000 vaccinations, researchers reported in March in JAMA. Still, that rate is low, and people with previous histories of anaphylaxis have gotten the shots without problem. Even people who developed anaphylaxis after a first shot were able to get fully vaccinated if the second dose was broken down into smaller doses (SN Online: 6/1/21).

The only side effect of the COVID-19 vaccines not seen with other vaccines is a rare combination of blood clots accompanied by low numbers of blood-clotting platelets. Called thrombosis with thrombocytopenia syndrome, or TTS, it’s most common among women younger than 50 who got the Johnson & Johnson vaccine or a similar vaccine made by AstraZeneca that’s used around the world (SN Online: 4/23/21).
About 5 to 6 TTS cases were reported for every 1 million doses of the J&J vaccine, the company reported to the U.S. Food and Drug Administration. The clots may result from antibodies triggering a person’s platelets to form clots (SN Online: 4/16/21). Such antibodies also cause blood clots in COVID-19 patients, and the risk of developing strokes or clots from the disease is much higher than with the vaccine, Talaat says. In one study, 42.8 of every 1 million COVID-19 patients developed one type of blood clot in the brain, and 392.3 per 1 million developed a type of abdominal blood clot, researchers reported in EClinicalMedicine in September.

“Your chances of getting any of these side effects, except for the sore arm, from an illness with COVID are much higher” than from the vaccines, Talaat says. — Tina Hesman Saey

Getting everyone vaccinated is … complicated
The quest to vaccinate as many people as quickly as possible this year faced two main challenges: getting the vaccine to people and convincing them to take it. Strategies employed so far — incentives, mandates and making shots accessible — have had varying levels of success.

“It’s an incredibly ambitious goal to try to get the large majority of the country and the globe vaccinated in a very short time period with a brand-new vaccine,” says psychologist Gretchen Chapman of Carnegie Mellon University in Pittsburgh, who researches vaccine acceptance. Usually “it takes a number of years before you get that kind of coverage.”
Globally, that’s sure to be the case due to a lack of access to vaccines, particularly in middle- and lower-income countries. The World Health Organization set a goal to have 40 percent of people in all countries vaccinated by year’s end. But dozens of countries, mostly in Africa and parts of Asia, are likely to fall far short of that goal.

In contrast, the United States and other wealthy countries got their hands on more than enough doses. Here, the push to vaccinate started out with a scramble to reserve scarce appointments for a free shot at limited vaccination sites. But by late spring, eligible people could pop into their pharmacy or grocery store. Some workplaces offered vaccines on-site. For underserved communities that may have a harder time accessing such vaccines, more targeted approaches where shots are delivered by trusted sources at community events proved they could boost vaccination numbers (SN Online: 6/18/21).

Simply making the shot easy to get has driven much of the progress made so far, Chapman says. But getting people who are less enthusiastic has proved more challenging. Many governments and companies have tried to prod people, initially with incentives, later with mandates.
Free doughnuts, direct cash payments and entry into million-dollar lottery jackpots were among the many perks rolled out. Before the pandemic, such incentives had been shown to prompt some people to get vaccines, says Harsha Thirumurthy, a behavioral economist at the University of Pennsylvania. This time, those incentives made little difference nationwide, Thirumurthy and his colleagues reported in September in a preliminary study posted to SSRN, a social sciences preprint website. “It’s possible they moved the needle 1 or 2 percentage points, but we’ve ruled out that they had a large effect,” he says. Some studies of incentives offered by individual states have found a marginal benefit.

“People who are worried about side effects or safety are going to be more difficult to reach,” says Melanie Kornides, an epidemiologist at the University of Pennsylvania. And with vaccination status tangled up in personal identity, “you’re just not going to influence lots of people with a mass communication campaign right now; it’s really about individual conversations,” she says, preferably with someone trusted.
“Or,” she adds, “they’re going to respond to mandates.” Historically, sticks such as being fired from a job or barred from school are the most effective way of boosting vaccination rates, Kornides says. For example, hospitals that require flu shots for workers tend to have higher vaccination rates than those that don’t. For decades, mandates in schools have helped push vaccination rates up for diseases like measles and chickenpox, she says.

As COVID-19 mandates went into effect in the fall, news headlines often focused on protests and refusals. Yet early anecdotal evidence suggests some mandates have helped. For instance, after New York City public schools announced a vaccine requirement in late August for its roughly 150,000 employees, nearly 96 percent had received at least one shot by early November. Still, about 8,000 employees opted not to get vaccinated and were placed on unpaid leave, the New York Times reported.

Many people remain vehemently opposed to the vaccines, in part because of rampant misinformation that can spread quickly online. Whether more mandates, from the government or private companies, and targeted outreach will convince them remains to be seen. — Jonathan Lambert

Vaccines can’t single-handedly end the pandemic
One year in, it’s clear that vaccination is one of the best tools we have to control COVID-19. But it’s also clear vaccines alone can’t end the pandemic.

While the jabs do a pretty good job preventing infections, that protection wanes over time (SN Online: 3/30/21). Still, the vaccines have “worked spectacularly well” at protecting most people from severe disease, says Luning Prak, the University of Pennsylvania immunologist. And as more people around the world get vaccinated, fewer people will die, even if they do fall ill with COVID-19.

“We have to make a distinction between the superficial infections you can get — [like a] runny nose — versus the lower respiratory tract stuff that can kill you,” such as inflammation in the lungs that causes low oxygen levels, Luning Prak says. Preventing severe disease is the fundamental target that most vaccines, including the flu shot, hit, she notes. Stopping infection entirely “was never a realistic goal.”
Because vaccines aren’t an impenetrable barrier against the virus, we’ll still need to rely on other tactics to help control spread amid the pandemic. “Vaccines are not the sole tool in our toolbox,” says Saad Omer, an epidemiologist at Yale University. “They should be used with other things,” such as masks to help block exposure and COVID-19 tests to help people know when they should stay home.

For now, it’s crucial to have such layered protection, Omer says. “But in the long run, I think vaccines provide a way to get back to at least a new normal.” With vaccines, people can gather at school, concerts or weddings with less fear of a large outbreak.

Eventually the pandemic will end, though when is still anyone’s guess. But the end certainly won’t mean that COVID-19 has disappeared.

Many experts agree that the coronavirus will most likely remain with us for the foreseeable future, sparking outbreaks in places where there are pockets of susceptible people. Susceptibility can come in many forms: young children who have never encountered the virus before and can't yet get vaccinated, people who choose not to get the vaccine, and people whose immunity has waned after an infection or vaccination. Or the virus may evolve in ways that help it evade the immune system.

The pandemic’s end may still feel out of reach, with the high hopes from the beginning of 2021 a distant memory. Still, hints of normalcy have returned: Kids are back in school, restaurants and stores are open and people are traveling more.

Vaccines have proved to be an invaluable tool to reduce the death and destruction that the coronavirus can leave in its wake. — Erin Garcia de Jesús

When James Webb launches, it will have a bigger to-do list than 1980s researchers suspected

The James Webb Space Telescope has been a long time coming. When it launches later this year, the observatory will be the largest and most complex telescope ever sent into orbit. Scientists have been drafting and redrafting their dreams and plans for this unique tool since 1989.

The mission was originally scheduled to launch between 2007 and 2011, but a series of budget and technical issues pushed its start date back more than a decade. Remarkably, the core design of the telescope hasn’t changed much. But the science that it can dig into has. In the years of waiting for Webb to be ready, big scientific questions have emerged. When Webb was an early glimmer in astronomers’ eyes, cosmological revolutions like the discoveries of dark energy and planets orbiting stars outside our solar system hadn’t yet happened.

“It’s been over 25 years,” says cosmologist Wendy Freedman of the University of Chicago. “But I think it was really worth the wait.”

An audacious plan
Webb has a distinctive design. Most space telescopes house a single lens or mirror within a tube that blocks sunlight from swamping the dim lights of the cosmos. But Webb’s massive 6.5-meter-wide mirror and its scientific instruments are exposed to the vacuum of space. A multilayered shield the size of a tennis court will block light from the sun, Earth and moon.

For the awkward shape to fit on a rocket, Webb will launch folded up, then unfurl itself in space (see below, What could go wrong?).

“They call this the origami satellite,” says astronomer Scott Friedman of the Space Telescope Science Institute, or STScI, in Baltimore. Friedman is in charge of Webb’s postlaunch choreography. “Webb is different from any other telescope that’s flown.”
Its basic design hasn’t changed in more than 25 years. The telescope was first proposed in September 1989 at a workshop held at STScI, which also runs the Hubble Space Telescope.

At the time, Hubble was less than a year from launching, and was expected to function for only 15 years. Thirty-one years after its launch, the telescope is still going strong, despite a series of computer glitches and gyroscope failures (SN Online: 10/10/18).

The institute director at the time, Riccardo Giacconi, was concerned that the next major mission would take longer than 15 years to get off the ground. So he and others proposed that NASA investigate a possible successor to Hubble: a space telescope with a 10-meter-wide primary mirror that was sensitive to light in infrared wavelengths to complement Hubble’s range of ultraviolet, visible and near-infrared.

Infrared light has a longer wavelength than light that is visible to human eyes. But it’s perfect for a telescope to look back in time. Because light travels at a fixed speed, looking at distant objects in the universe means seeing them as they looked in the past. The universe is expanding, so that light is stretched before it reaches our telescopes. For the most distant objects in the universe — the first galaxies to clump together, or the first stars to burn in those galaxies — light that was originally emitted in shorter wavelengths is stretched all the way to the infrared.
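The stretching is simple multiplication: an observed wavelength equals the emitted wavelength times (1 + z), where z is the redshift. As an illustrative sketch (the specific spectral line and redshift are assumptions for illustration, not figures from the article), ultraviolet light from the era of the first galaxies lands squarely in Webb's infrared range:

```python
# Cosmic expansion stretches light: observed = emitted * (1 + z).
# Example: hydrogen's Lyman-alpha line, emitted in the ultraviolet
# at 121.6 nm, seen from a galaxy at redshift z = 10 (roughly the
# epoch of the first galaxies).

def observed_wavelength_nm(emitted_nm, z):
    return emitted_nm * (1 + z)

shifted = observed_wavelength_nm(121.6, 10)
print(f"{shifted:.1f} nm")  # 1337.6 nm, about 1.3 micrometers: infrared
```

Light that left a galaxy as ultraviolet arrives more than ten times longer in wavelength, beyond what an optical telescope like Hubble can fully capture.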

Giacconi and his collaborators dreamed of a telescope that would detect that stretched light from the earliest galaxies. When Hubble started sharing its views of the early universe, the dream solidified into a science plan. The galaxies Hubble saw at great distances “looked different from what people were expecting,” says astronomer Massimo Stiavelli, a leader of the James Webb Space Telescope project who has been at STScI since 1995. “People started thinking that there is interesting science here.”

In 1995, STScI and NASA commissioned a report to design Hubble’s successor. The report, led by astronomer Alan Dressler of the Carnegie Observatories in Pasadena, Calif., suggested an infrared space observatory with a 4-meter-wide mirror.

The bigger a telescope’s mirror, the more light it can collect, and the farther it can see. Four meters wasn’t that much larger than Hubble’s 2.4-meter-wide mirror, but anything bigger would be difficult to launch.

Dressler briefed then-NASA Administrator Dan Goldin in late 1995. In January 1996 at the American Astronomical Society’s annual meeting, Goldin challenged the scientists to be more ambitious. He called out Dressler by name, saying, “Why do you ask for such a modest thing? Why not go after six or seven meters?” (Still nowhere near Giacconi’s pie-in-the-sky 10-meter wish.) The speech received a standing ovation.

Six meters was a larger mirror than had ever flown in space, and larger than would fit in available launch vehicles. Scientists would have to design a telescope mirror that could fold, then deploy once it reached space.

The telescope would also need to cool itself passively by radiating heat into space. It needed a sun shield — a big one. The origami telescope was born. It was dubbed James Webb in 2002 for NASA’s administrator from 1961 to 1968, who fought to support research to boost understanding of the universe in the increasingly human-focused space program. (In response to a May petition to change the name, NASA investigated allegations that James Webb persecuted gay and lesbian people during his government career. The agency announced on September 27 that it found no evidence warranting a name change.)
Goldin’s motto at NASA was “Faster, better, cheaper.” Bigger was better for Webb, but it sure wasn’t faster — or cheaper. By late 2010, the project was more than $1.4 billion over its $5.1 billion budget (SN: 4/9/11, p. 22). And it was going to take another five years to be ready. Today, the cost is estimated at almost $10 billion.

The telescope survived a near-cancellation by Congress, and its timeline was reset for an October 2018 launch. But in 2017, the launch was pushed to June 2019. Two more delays in 2018 pushed the takeoff to May 2020, then to March 2021. Some of those delays were because assembling and testing the spacecraft took longer than NASA expected.

Other slowdowns were because of human errors, like using the wrong cleaning solvent, which damaged valves in the propulsion system. Recent shutdowns due to the coronavirus pandemic pushed the launch back a few more months.

“I don’t think we ever imagined it would be this long,” says University of Chicago’s Freedman, who worked on the Dressler report. But there’s one silver lining: Science marched on.

The age conflict
The first science goal listed in the Dressler report was “the detailed study of the birth and evolution of normal galaxies such as the Milky Way.” That is still the dream, partly because it’s such an ambitious goal, Stiavelli says.

“We wanted a science rationale that would resist the test of time,” he says. “We didn’t want to build a mission that would do something that gets done in some other way before you’re done.”

Webb will peek at galaxies and stars as they were just 400 million years after the Big Bang, which astronomers think is the epoch when the first tiny galaxies began making the universe transparent to light by stripping electrons from cosmic hydrogen.

But in the 1990s, astronomers had a problem: There didn’t seem to be enough time in the universe to make galaxies much earlier than the ones astronomers had already seen. The standard cosmology at the time suggested the universe was 8 billion or 9 billion years old, but there were stars in the Milky Way that seemed to be about 14 billion years old.

“There was this age conflict that reared its head,” Freedman says. “You can’t have a universe that’s younger than the oldest stars. The way people put it was, ‘You can’t be older than your grandmother!’”
In 1998, two teams of cosmologists showed that the universe is expanding at an ever-increasing rate. A mysterious substance dubbed dark energy may be pushing the universe to expand faster and faster. That accelerated expansion means the universe is older than astronomers previously thought — the current estimate is about 13.8 billion years old.

“That resolved the age conflict,” Freedman says. “The discovery of dark energy changed everything.” And it expanded Webb’s to-do list.

Dark energy
Top of the list is getting to the bottom of a mismatch in cosmic measurements. Since at least 2014, different methods for measuring the universe’s rate of expansion — called the Hubble constant — have been giving different answers. Freedman calls the issue “the most important problem in cosmology today.”

The question, Freedman says, is whether the mismatch is real. A real mismatch could indicate something profound about the nature of dark energy and the history of the universe. But the discrepancy could just be due to measurement errors.

Webb can help settle the debate. One common way to determine the Hubble constant is by measuring the distances and speeds of far-off galaxies. Measuring cosmic distances is difficult, but astronomers can estimate them using objects of known brightness, called standard candles. If you know the object’s actual brightness, you can calculate its distance based on how bright it seems from Earth.

Studies using supernovas and variable stars called Cepheids as candles have found an expansion rate of 74.0 kilometers per second for every megaparsec, or about 3.3 million light-years, of distance between objects. But using red giant stars, Freedman and colleagues have gotten a smaller answer: 69.8 km/s/Mpc.

Other studies have measured the Hubble constant by looking at the dim glow of light emitted just 380,000 years after the Big Bang, called the cosmic microwave background. Calculations based on that glow give a smaller rate still: 67.4 km/s/Mpc. Although these numbers may seem close, the fact that they disagree at all could alter our understanding of the contents of the universe and how it evolves over time. The discrepancy has been called a crisis in cosmology (SN: 9/14/19, p. 22).
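One way to see why a few kilometers per second per megaparsec matters: the inverse of the Hubble constant sets a rough timescale for the universe's age, the "Hubble time." This is only a naive sketch (the actual 13.8-billion-year estimate also folds in how the expansion rate has changed over cosmic history), but it shows how the quoted values pull the answer apart:

```python
# Naive age scale of the universe: Hubble time = 1 / H0.
# H0 in km/s/Mpc, so convert megaparsecs to kilometers first.

KM_PER_MPC = 3.0857e19
SECONDS_PER_YEAR = 3.156e7

def hubble_time_gyr(h0_km_s_mpc):
    seconds = KM_PER_MPC / h0_km_s_mpc
    return seconds / SECONDS_PER_YEAR / 1e9

for h0 in (74.0, 69.8, 67.4):
    print(h0, round(hubble_time_gyr(h0), 1))
# 74.0 gives ~13.2 billion years; 67.4 gives ~14.5 billion years
```

A spread of under 10 percent in the expansion rate translates into more than a billion years of difference on this crude clock, which is why the measurements cannot all be right.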

In its first year, Webb will observe some of the same galaxies used in the supernova studies, using three different objects as candles: Cepheids, red giants and peculiar stars called carbon stars.

The telescope will also try to measure the Hubble constant using a distant gravitationally lensed galaxy. Comparing those measurements with each other and with similar ones from Hubble will show if earlier measurements were just wrong, or if the tension between measurements is real, Freedman says.

Without these new observations, “we were just going to argue about the same things forever,” she says. “We just need better data. And [Webb] is poised to deliver it.”
Exoplanets
Perhaps the biggest change for Webb science has been the rise of the field of exoplanet explorations.

“When this was proposed, exoplanets were scarcely a thing,” says STScI’s Friedman. “And now, of course, it’s one of the hottest topics in all of science, especially all of astronomy.”

The Dressler report’s second major goal for Hubble’s successor was “the detection of Earthlike planets around other stars and the search for evidence of life on them.” But back in 1995, only a handful of planets orbiting other sunlike stars were even known, and all of them were scorching-hot gas giants — nothing like Earth at all.

Since then, astronomers have discovered thousands of exoplanets orbiting distant stars. Scientists now estimate that, on average, there is at least one planet for every star we see in the sky. And some of the planets are small and rocky, with the right temperatures to support liquid water, and maybe life.

Most of the known planets were discovered as they crossed, or transited, in front of their parent stars, blocking a little bit of the parent star’s light. Astronomers soon realized that, if those planets have atmospheres, a sensitive telescope could effectively sniff the air by examining the starlight that filters through the atmosphere.

The infrared Spitzer Space Telescope, which launched in 2003, and Hubble have started this work. But Spitzer ran out of coolant in 2009, keeping it too warm to measure important molecules in exoplanet atmospheres. And Hubble is not sensitive to some of the most interesting wavelengths of light — the ones that could reveal alien life-forms.

That’s where Webb is going to shine. If Hubble is peeking through a crack in a door, Webb will throw the door wide open, says exoplanet scientist Nikole Lewis of Cornell University. Crucially, Webb, unlike Hubble, will be particularly sensitive to several carbon-bearing molecules in exoplanet atmospheres that might be signs of life.

“Hubble can’t tell us anything really about carbon, carbon monoxide, carbon dioxide, methane,” she says.

If Webb had launched in 2007, it could have missed this whole field. Even though the first transiting exoplanet was discovered in 1999, their numbers were low for the next decade.

Lewis remembers thinking, when she started grad school in 2007, that she could make a computer model of all the transiting exoplanets. “Because there were literally only 25,” she says.
Between 2009 and 2018, NASA’s Kepler space telescope raked in transiting planets by the thousands. But those planets were too dim and distant for Webb to probe their atmospheres.

So the down-to-the-wire delays of the last few years have actually been good for exoplanet research, Lewis says. “The launch delays were one of the best things that’s happened for exoplanet science with Webb,” she says. “Full stop.”

That’s mainly thanks to NASA’s Transiting Exoplanet Survey Satellite, or TESS, which launched in April 2018. TESS’ job is to find planets orbiting the brightest, nearest stars, which will give Webb the best shot at detecting interesting molecules in planetary atmospheres.

If it had launched in 2018, Webb would have had to wait a few years for TESS to pick out the best targets. Now, it can get started on those worlds right away. Webb’s first year of observations will include probing several known exoplanets that have been hailed as possible places to find life. Scientists will survey planets orbiting small, cool stars called M dwarfs to make sure such planets even have atmospheres, a question that has been hotly debated.

If a sign of life does show up on any of these planets, that result will be fiercely debated, too, Lewis says. “There will be a huge kerfuffle in the literature when that comes up.” It will be hard to compare planets orbiting M dwarfs with Earth, because these planets and their stars are so different from ours. Still, “let’s look and see what we find,” she says.

A limited lifetime
With its components assembled, tested and folded at Northrop Grumman’s facilities in California, Webb is on its way by boat through the Panama Canal, ready to launch in an Ariane 5 rocket from French Guiana. The most recent launch date is set for December 18.

For the scientists who have been working on Webb for decades, this is a nostalgic moment.

“You start to relate to the folks who built the pyramids,” Stiavelli says.

Other scientists, who grew up in a world where Webb was always on the horizon, are already thinking about the next big thing.

“I’m pretty sure, barring epic disaster, that [Webb] will carry my career through the next decade,” Lewis says. “But I have to think about what I’ll do in the next decade” after that.

Unlike Hubble, which has lasted decades thanks to fixes by astronauts and upgrade missions, Webb has a strictly limited lifetime. Orbiting the sun at a gravitationally fixed point called L2, Webb will be too far from Earth to repair, and will need to burn small amounts of fuel to stay in position. The fuel will last for at least five years, and hopefully as much as 10. But when the fuel runs out, Webb is finished. The telescope operators will move it into retirement in an out-of-the-way orbit around the sun, and bid it farewell.

Space rocks may have bounced off baby Earth, but slammed into Venus

Squabbling sibling planets may have hurled space rocks when they were young.

Simulations suggest that space rocks the size of baby planets struck both the newborn Earth and Venus, but many of the rocks that only grazed Earth went on to hit — and stick — to Venus. That difference in early impacts could help explain why Earth and Venus are such different worlds today, researchers report September 23 in the Planetary Science Journal.

“The pronounced differences between Earth and Venus, in spite of their similar orbits and masses, has been one of the biggest puzzles in our solar system,” says planetary scientist Shigeru Ida of the Tokyo Institute of Technology, who was not involved in the new work. This study introduces “a new point that has not been raised before.”

Scientists have typically thought that there are two ways that collisions between baby planets can go. The objects could graze each other and each continue on its way, in a hit-and-run collision. Or two protoplanets could stick together, or accrete, making one larger planet. Planetary scientists often assume that every hit-and-run collision eventually leads to accretion. Objects that collide must have orbits that cross each other’s, so they’re bound to collide again and again, and eventually should stick.
But previous work from planetary scientist Erik Asphaug of the University of Arizona in Tucson and others suggests that isn’t so. It takes special conditions for two planets to merge, Asphaug says, like relatively slow impact speeds, so hit-and-runs were probably much more common in the young solar system.

Asphaug and colleagues wondered what that might have meant for Earth and Venus, two apparently similar planets with vastly different climates. Both worlds are about the same size and mass, but Earth is wet and clement while Venus is a searing, acidic hellscape (SN: 2/13/18).

“If they started out on similar pathways, somehow Venus took a wrong turn,” Asphaug says.

The team ran about 4,000 computer simulations in which Mars-sized protoplanets crashed into a young Earth or Venus, assuming the two planets were at their current distances from the sun. The researchers found that about half of the time, incoming protoplanets grazed Earth without directly colliding. Of those, about half went on to collide with Venus.
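The bookkeeping behind those fractions can be sketched with a toy calculation. This is a minimal illustrative sketch, not the team's N-body code: the run count and the two branch probabilities below are assumptions chosen only to mirror the fractions reported in the article (about half of impactors graze Earth, and about half of those grazers later hit Venus).

```python
import random

random.seed(42)

# Toy sketch of the reported outcome fractions, NOT the study's simulations.
# All parameters are hypothetical, picked to match the article's numbers.
N_RUNS = 4000           # roughly the number of simulations reported
P_GRAZE_EARTH = 0.5     # assumed chance an impactor grazes Earth (hit-and-run)
P_THEN_HIT_VENUS = 0.5  # assumed chance a grazer goes on to collide with Venus

grazed = sum(random.random() < P_GRAZE_EARTH for _ in range(N_RUNS))
hit_venus = sum(random.random() < P_THEN_HIT_VENUS for _ in range(grazed))

print(f"grazed Earth without sticking: {grazed} of {N_RUNS}")
print(f"of those, later hit Venus: {hit_venus} of {grazed}")
```

Under these assumed probabilities, roughly a quarter of all incoming protoplanets end up on Venus after first grazing Earth, which is the imbalance the researchers highlight.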

Unlike Earth, Venus ended up accreting most of the objects that hit it in the simulations. Hitting Earth first slowed incoming objects down enough to let them stick to Venus later, the study suggests. “You have this imbalance where things that hit the Earth, but don’t stick, tend to end up on Venus,” Asphaug says. “We have a fundamental explanation for why Venus ended up accreting differently from the Earth.”

If that’s really what happened, it would have had a significant effect on the composition of the two worlds. Earth would have ended up with more of the outer mantle and crust material from the incoming protoplanets, while Venus would have gotten more of their iron-rich cores.

The imbalance in impacts could even explain some major Venusian mysteries, like why the planet doesn’t have a moon, why it spins so slowly and why it lacks a magnetic field — though “these are hand-waving kind of conjectures,” Asphaug says.

Ida says he hopes that future work will look into those questions more deeply. “I’m looking forward to follow-up studies to examine if the new result actually explains the Earth-Venus difference,” he says.

The idea fits into a growing debate among planetary scientists about how the solar system grew up, says planetary scientist Seth Jacobson of Michigan State University in East Lansing. Was it built violently, with lots of giant collisions, or calmly, with planets growing smoothly via pebbles sticking together?

“This paper falls on the end of lots of giant impacts,” Jacobson says.

Each rocky planet in the solar system should have very different chemistry and structure depending on which scenario is true. But scientists know the chemistry and structure of only one planet with any confidence: Earth. And Earth’s early history has been overwritten by plate tectonics and other geologic activity. “Venus is the missing link,” Jacobson says. “Learning more about Venus’ chemistry and interior structure is going to tell us more about whether it had a giant impact or not.”

Three missions to Venus are expected to launch in the late 2020s and 2030s (SN: 6/2/21). Those should help, but none are expected to take the kind of detailed composition measurements that could definitively solve the mystery. That would take a long-lived lander, or a sample return mission, both of which would be extremely difficult on hot, hostile Venus.

“I wish there was an easier way to test it,” Jacobson says. “I think that’s where we should concentrate our energy as terrestrial planet formation scientists going forward.”