Some of the earliest settlers of the Americas curtailed their coastal migration to hunker down in what’s now northwestern Peru, new finds suggest. Although researchers have often assumed that shoreline colonizers of the New World kept heading south from Alaska in search of marine foods, staying put in some spots made sense: Hunter-gatherers needed only simple tools to exploit rich coastal and inland food sources for thousands of years.
Excavations at two seaside sites in Peru find that people intermittently camped there from about 15,000 to 8,000 years ago, say anthropologist Tom Dillehay of Vanderbilt University in Nashville and his colleagues. Ancient people along Peru’s Pacific coast didn’t leave behind fishhooks, harpoons, nets or boats that could have been used to capture fish, sharks and sea lions, the scientists report May 24 in Science Advances.

Yet remains of those sea creatures turned up at coastal campsites now buried beneath a human-made, earthen mound called Huaca Prieta and an adjacent mound called Paredones. Fish and other marine animals probably washed up on beaches or were trapped in lagoons that formed near the shore, Dillehay’s group proposes. Hungry humans needed only simple traps or clubs to subdue such easily captured prey. Other marine foods found at the ancient Peruvian campsites included snails, crabs, clams, sea gulls and pelicans.

Fragments of material woven out of rush plants, the earliest dating to between 10,600 and 11,159 years ago, may have come from fish traps or baskets, the researchers say. Radiocarbon dating of burned wood, animal bones and plant seeds provided age estimates for a series of buried campsites at Huaca Prieta and Paredones.
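Those campsite ages rest on radiocarbon dating, which reads time off the fraction of carbon-14 remaining in organic material such as burned wood and seeds. Here is a minimal sketch of the decay arithmetic, assuming the standard 5,730-year half-life; real dates like these also go through a calibration step not shown here:

```python
import math

HALF_LIFE_C14 = 5730.0  # years; the physical half-life of carbon-14

def radiocarbon_age(fraction_remaining):
    """Age in years implied by the fraction of carbon-14 left in a sample."""
    # N(t) = N0 * (1/2) ** (t / half_life)  =>  t = half_life * log2(N0 / N)
    return HALF_LIFE_C14 * math.log2(1.0 / fraction_remaining)

# A sample retaining about 26 percent of its original carbon-14 works out to
# roughly 11,000 years old, the same ballpark as the oldest woven-rush fragments.
print(round(radiocarbon_age(0.26)))  # ~11136
```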
Present-day hunters on Peru’s coast eat fish and small sharks that get trapped on the beach or in shallow shoreline lagoons. Hunters also build blinds where they wait to net and club birds, a tactic probably also used by ancient Americans, the investigators suspect.
Deer bones indicate that ancient Huaca Prieta and Paredones visitors hunted on land as well. And remains of avocado, beans and possibly cultivated squash and chili peppers at the ancient campsites — foods known to have been gathered or grown at inland locations — suggest that people transported these foods to the coast, possibly via trading. Evidence that early New World settlers trekked back and forth from coastal to interior parts of Peru coincides with similar human movements in southern Chile more than 14,000 years ago (SN Online: 5/8/08). A team led by Dillehay uncovered seaweed fragments in hearths and structures at Monte Verde II, located 30 kilometers from Chile’s coast. Edible rushes, reeds and stones from the coast also turned up at Monte Verde II.
“Just as there was some contact with the sea at Monte Verde II, there was some contact with the interior at Huaca Prieta,” Dillehay says.
Simple stone tools, sharpened on one side, dominate implements excavated at the Peruvian sites and at Monte Verde II. Basic tools suitable for all sorts of cutting and scraping tasks fit a lifestyle in which people sought food across varied landscapes, the researchers contend. Similar conditions may have characterized some North American coastlines by around 15,000 years ago, Dillehay says. “The problem is that these areas are now underwater” due to a global sea level rise between 20,000 and 6,000 years ago (SN: 8/13/11, p. 22).
Accumulating evidence supports the idea that early Americans favored the coast over an inland lifestyle, says archaeologist Daniel Sandweiss of the University of Maine in Orono. An ice-free corridor into North America’s interior may not have formed before 12,600 years ago (SN Online: 8/10/16), after people had reached Peru and Chile.
The pace at which people moved south from Huaca Prieta is unknown, Sandweiss says. Monte Verde II dates to roughly 500 years after the first coastal campsites in Peru, raising the possibility that Huaca Prieta folk founded the Chilean site, he suggests.
Dillehay doubts it. Modern hunter-gatherer groups vary greatly in size but usually don’t exceed several hundred members, making it unlikely that ancient Huaca Prieta and Paredones people were numerous enough to face food shortages that would have pushed them far down the coast, he says. Even if food ran out, hunter-gatherers had only to move a few kilometers north or south to find abundant grub. “We really don’t know where these people were coming and going,” Dillehay cautions.
Those who struggle to fit a vacation wardrobe into a carry-on might learn from ladybugs. The flying beetles neatly fold up their wings when they land, stashing the delicate appendages underneath their protective red and black forewings.
To learn how one species of ladybug (Coccinella septempunctata) achieves such efficient packing, scientists needed to see under the bug’s spotted exterior. So a team from Japan replaced part of a ladybug’s forewing with a transparent bit of resin, to get a first-of-its-kind glimpse of the folding. Slow-motion video of the altered ladybug showed that the insect makes a complex, origami-like series of folds to stash its wings, the scientists report in the May 30 Proceedings of the National Academy of Sciences. CT scans helped explain how the wings can be both strong enough to hold the insects aloft and easily foldable into a tiny package. The shape of the wing veins allows them to flex like a metal tape measure, making the wings stiff but bendable. Lessons learned from the wings could be applied to new technologies, including foldable aircraft wings or solar panels that unfurl from a spacecraft.
Climate change may make the rich richer and the poor poorer in the United States.
Counties in the South face a higher risk of economic downturn due to climate change than their northern counterparts, a new computer simulation predicts. Because southern counties generally host poorer populations, the new findings, reported in the June 30 Science, suggest that climate change will worsen existing wealth disparities.
“It’s the most detailed and comprehensive study of the effects of climate change in the United States,” says Don Fullerton, an economist at the University of Illinois at Urbana-Champaign who was not involved in the work. “Nobody has ever even considered the effects of climate change on inequality.”

Researchers created a computer program called SEAGLAS that combined several climate simulations to forecast U.S. climate through 2100, assuming greenhouse gas emissions keep ramping up. Then, using data from previous studies on how temperature and rainfall affect several economic factors — including crop yields, crime rates and energy expenditures — SEAGLAS predicted how the economy of each of the 3,143 counties in the United States would fare.
By the end of the century, some counties may see their gross domestic product decline by more than 20 percent, while others may actually experience more than a 10 percent increase in GDP. This could make for the biggest transfer of wealth in U.S. history, says study coauthor Solomon Hsiang, an economist at the University of California, Berkeley.
In general, SEAGLAS predicts that counties in the lower Midwest, the South and the Southwest — already home to some of the country’s poorest communities — will bear the brunt of climate-caused economic damages, while counties in New England, the Great Lakes region and the Pacific Northwest will suffer less or see gains. For many of the examined economic factors, such as the number of deaths per year, “getting a little bit hotter is much worse if you’re already very hot,” explains Hsiang. “Most of the south is the hottest part of the country, so those are the regions where costs tend to be really high.” The economic gaps may get stretched even wider than SEAGLAS predicts, Fullerton says, because the simulation doesn’t account for wealth disparities within counties. For example, wealthier people in poor counties may have access to air conditioning while their less fortunate neighbors do not. So blisteringly hot weather is most likely to harm the poorest of the poor.
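As a toy illustration of the convexity Hsiang describes, where the same increment of warming costs an already-hot county more, consider a made-up quadratic damage curve. The function and numbers are hypothetical, not the dose-response relationships SEAGLAS actually uses:

```python
def gdp_damage_percent(baseline_temp_c, warming_c):
    """Toy convex damage curve: each extra degree hurts more the hotter a county already is.
    Purely illustrative; not the dose-response functions SEAGLAS actually uses."""
    k = 0.05  # hypothetical sensitivity constant
    return k * ((baseline_temp_c + warming_c) ** 2 - baseline_temp_c ** 2)

# The same 4 degrees of warming costs the hotter county more.
print(round(gdp_damage_percent(baseline_temp_c=14, warming_c=4), 1))  # cooler county: 6.4 percent of GDP
print(round(gdp_damage_percent(baseline_temp_c=24, warming_c=4), 1))  # hotter county: 10.4 percent of GDP
```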
Not all researchers, however, think the future is as bleak as SEAGLAS suggests. The simulation doesn’t fully account for adaptation to climate change, says Delavane Diaz, an energy and environmental policy analyst at the Electric Power Research Institute in Washington, D.C., a nonprofit research organization. For example, people in coastal regions could mitigate the cost of sea level rise by flood-proofing structures or moving inland, she says.
And the economic factors examined in this study don’t account for some societal benefits that may arise from climate change, says Derek Lemoine, an economist at the University of Arizona in Tucson. For instance, although crime rates rise when it’s warmer because more people tend to be out and about, people being active outside could have a positive impact on health.
But SEAGLAS is designed to incorporate different societal variables as new data become available. “I really like the system,” Lemoine says. “It’s a super ambitious work and the kind of thing that needs to be done.”
Mums are now a flower of a different color. Japanese researchers have added a hint of clear sky to the humble plant’s palette, genetically engineering the first-ever “true blue” chrysanthemum.
“Obtaining blue-colored flowers is the Holy Grail for plant breeders,” says Mark Bridgen, a plant breeder at Cornell University. The results are “very exciting.”
Compounds called delphinidin-based anthocyanin pigments are responsible for the natural blues in such flowers as pansies and larkspur. Mums lack those compounds. Instead, the flowers come in a variety of other colors, evoking fiery sunsets, new-fallen snow and all things chartreuse. In previous attempts to engineer a blue hue in chrysanthemums — and roses and carnations — researchers inserted the gene for a key enzyme that controls production of these compounds, causing them to accumulate. But the resulting blooms skewed more violet-purple than blue. True blue pigment remained elusive, scientists thought, because its origin was complex; multiple genes have been shown to be involved in its generation. But Naonobu Noda, of the National Agriculture and Food Research Organization in Tsukuba, Japan, and colleagues were surprised to find that inserting only two borrowed genes into chrysanthemums created blue flowers. One gene, from Canterbury bells, got the enzyme process started; the other, from butterfly peas, further tweaked the pigment molecules.
Together, the gene double-team transformed 19 of 32 mums, or 59 percent, of the Taihei variety from having pink or magenta blooms into blue beauties. Additional analyses revealed that the blue color arose because of molecular interactions between the tweaked pigment and certain colorless compounds naturally found in many plants, including chrysanthemums. The two-part method could possibly be used in the production of other blue flowers, the researchers report July 26 in Science Advances.
With just weeks to go before the Great American Eclipse, scientists are finalizing years of planning to study the solar phenomenon. But it’s not too late to get involved.
“This is the first eclipse crossing over a major landmass in the era when lots of people have digital devices,” says astronomer Alex Filippenko of the University of California, Berkeley. The 120-kilometer-wide path of totality, where observers will see the moon completely cover the sun, will streak through 14 states from coast to coast on August 21, and the entire country will see at least a partial eclipse. So if you’re interested in collecting data on this rare celestial spectacle, there’s an app for that. Several, in fact.
Observe eclipse weather
The GLOBE Program’s Observer app lets people catalog aspects of their local atmosphere that are affected by the amount of sunlight hitting Earth. Users can take pictures or enter descriptions of cloud cover, record temperatures and write in such observations as wind speed or air pressure if they have the equipment to make them.
“The eclipse allows us to see what happens when there’s a relatively abrupt drop in sunlight,” says Kristen Weaver of the GLOBE Observer program at NASA Goddard Space Flight Center in Greenbelt, Md. Even people who aren’t in the path of totality can contribute valuable data, Weaver says.
Students from kindergarten through high school will use the data in research projects through the GLOBE education program. And if enough people send in data, NASA researchers creating models of Earth’s energy budget — the balance between the energy our planet receives from the sun and sends back out into space — could also analyze the observations.
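That energy budget boils down to a balance between the sunlight Earth absorbs and the infrared it radiates back to space. A back-of-the-envelope version, using textbook values rather than anything from NASA’s models, looks like this:

```python
# Planetary energy balance: absorbed sunlight = emitted infrared (Stefan-Boltzmann law).
# Textbook values, for a back-of-the-envelope version of the budget described above.
SOLAR_CONSTANT = 1361.0  # W/m^2 reaching Earth's distance from the sun
ALBEDO = 0.3             # fraction of sunlight reflected straight back to space
SIGMA = 5.670e-8         # Stefan-Boltzmann constant, W/m^2/K^4

absorbed = SOLAR_CONSTANT * (1 - ALBEDO) / 4    # averaged over the whole spinning sphere
effective_temp_k = (absorbed / SIGMA) ** 0.25   # temperature at which emission balances absorption

print(round(absorbed), "W/m^2 absorbed on average")                          # ~238
print(round(effective_temp_k - 273.15), "C effective emission temperature")  # ~ -19
```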
Folks willing to take their eyes off the skies for a moment can record what plants and animals are doing around eclipse time using the California Academy of Sciences’ iNaturalist app. Past eclipse-watchers have told tales of animals exhibiting nighttime behavior — such as birds falling silent or squirrels retreating to their dens — when the moon blocks the sun. The iNaturalist app could allow the first extensive examination of this phenomenon, says the Academy’s Elise Ricard.
People can use the app to note the behavior of whatever animals are around, be they pets, livestock, wildlife or even zoo animals. Some flowers also close up at night, Ricard notes, so plant observations are also encouraged.
The data could help scientists understand how much of an eclipse is needed to elicit certain responses. For instance, someone in an area where the moon covers 85 percent of the sun might notice odd animal behavior that isn’t seen by anyone in areas with only 70 percent coverage.
Record the sounds of an eclipse
The sounds of nature can also change dramatically during an eclipse. Ricard remembers birds falling silent in the nearby jungle during an Australian eclipse. And Henry Winter, an astrophysicist at the Harvard-Smithsonian Center for Astrophysics in Cambridge, Mass., recalls a friend’s story of crickets starting to chirp during an eclipse.
To capture these shifting sounds, Winter launched the Eclipse Soundscapes project, which will collect recordings during the solar eclipse to provide an audio experience for the visually impaired.
Soundscapes’ reps will record at 12 national parks, and anyone with the app can add their own sound bites to the database. The app will also include a specially designed narration of what’s happening in the sky for visually impaired eclipse-goers.
Solar snapshots for science
For anyone in the path of totality, there’s also the chance to participate in the Eclipse MegaMovie project, the first crowdsourced image archive of a total solar eclipse.
Viewed from any single place on the ground in the path of totality, the moon completely covers the sun for only about 2½ minutes. But by gathering images from over 1,000 trained volunteers and from members of the public snapping pics with smartphones across the country, the MegaMovie project will capture the full 90-minute duration of the solar eclipse as it crosses the continental United States.
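The rough arithmetic works out if you assume a coast-to-coast path of roughly 4,000 kilometers and an average shadow speed on the order of 2,600 kilometers per hour; these are approximate figures for illustration, not numbers from the project:

```python
# Rough arithmetic behind stitching a 90-minute movie out of ~2.5-minute snapshots.
# Approximate figures, not from the article: a ~4,000-kilometer coast-to-coast path
# and an average shadow speed on the order of 2,600 km/h.
path_length_km = 4000
shadow_speed_kmh = 2600
totality_at_one_spot_min = 2.5

crossing_minutes = path_length_km / shadow_speed_kmh * 60
print(round(crossing_minutes), "minutes for the shadow to cross the country")  # ~92
print(round(crossing_minutes / totality_at_one_spot_min),
      "back-to-back totality windows needed to span the whole crossing")       # ~37
```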
The project will give astronomers an unprecedented view of the outer layers of the sun’s atmosphere — which aren’t blocked by the moon — during the entire eclipse.
Like the data collected by all these apps, the Eclipse MegaMovie images will be publicly available online, where they can be accessed by amateur astronomers, too. “They might notice some interesting phenomenon before professional astronomers get around to looking at the data,” Filippenko says.
It’s certainly an exciting time to be a citizen scientist. But if you do decide to participate in data collection, don’t forget to take time to put down your phone and enjoy the majesty of the eclipse.
Saber-toothed kittens were the spitting image of their parents. Even as babies, the cats had not only oversized canine teeth but also unusually powerful forelimbs, Katherine Long, a graduate student at California State Polytechnic University in Pomona, and colleagues report September 27 in PLOS ONE.
As adults, the ferocious felines used those strong forelimbs to secure wriggling prey before slashing a throat or belly (thereby avoiding breaking off a tooth in the struggle). Paleontologists have puzzled over whether saber-toothed cats such as Smilodon fatalis developed those robust limbs as they grew or were built that way from the start.
To compare the growth rate of Smilodon with that of similar-sized non‒saber-toothed cats that lived alongside it, Long and her team turned to fossils collected from the La Brea Tar Pits in Los Angeles. The ancient asphalt traps hold a wealth of species and specimens from juveniles to adults, dating to between 37,000 and 9,000 years ago.
The Smilodon bones, they found, did not show any evidence of an unusual growth spurt. Instead, the bones grew longer and slimmer as the kittens grew up, following the same developmental pattern as the other large cats. That suggests that when it comes to their mighty forelimbs, Smilodon kittens were just born that way.
Imagine a world where the polar ice sheets are melting, sea level is rising and the atmosphere is stuffed with about 400 parts per million of carbon dioxide. Sound familiar? It should. We’re living it. But the description also matches Earth a little over 3 million years ago, in the middle of the geologic epoch known as the Pliocene.
To understand how our planet might respond as global temperatures rise, scientists are looking to warm periods of the past. These include the steamy worlds of the Cretaceous Period, around 90 million years ago, and the boundary of the Paleocene and Eocene epochs, about 56 million years ago. But to many researchers, the best reference for today’s warming is the more recent Pliocene, which lasted from 5.3 million to 2.6 million years ago. The mid-Pliocene was the last time atmospheric CO2 levels were similar to today’s, trapping heat and raising global temperatures above the levels Earth is experiencing now.
New research is illuminating how the planet responded to Pliocene warmth. One set of scientists has fanned out across the Arctic, gathering geologic clues to how temperatures there may have been as much as 19 degrees Celsius higher than today. The warmth allowed trees to spread far to the north, creating Arctic forests where three-toed horses, giant camels and other animals roamed. When lightning struck, wildfires roared across the landscape, spewing soot into the air and altering the region’s climate. Other researchers are pushing the frontiers of climate modeling, simulating how the oceans, atmosphere and land responded as Pliocene temperatures soared. One new study shows how the warmth may have triggered huge changes in ocean circulation, setting up an enormous overturning current in the Pacific Ocean, similar to the “conveyor belt” in today’s Atlantic that drives weather and climate. A second new paper suggests that the Greenland and Antarctic ice sheets might have responded differently to Pliocene heat, melting at different times.
All this research into the last great warm period is helping scientists think more deeply about how the future might play out. It may not be a road map to the next 100 years, but the Pliocene is a rough guide to the high sea levels, vanishing ice and altered weather patterns that might arrive hundreds to thousands of years from now.
“It’s a case study for understanding how warm climates function,” says Heather Ford, a paleoceanographer at the University of Cambridge. “It’s our closest analog for future climate change.”
Walk through history
Teasing out the history of the Pliocene is a little like digging through a family’s past. One group of enthusiasts goes through genealogical records, collecting data on who lived where, and when. Another group uses computer software and modeling to look for broad patterns that describe how the family grew and moved over time.
The data detectives begin their work in rocks and sediments dating to the Pliocene that are scattered around the world like family-tree histories in city library archives. In 1988, the U.S. Geological Survey began a project called PRISM, for Pliocene Research, Interpretation and Synoptic Mapping, which aims to gather as many geologic clues as possible about Pliocene environments. At its start, PRISM focused on a collection of deep-sea cores drilled from the floor of the North Atlantic Ocean. Different types of marine organisms thrive in water of different temperatures. By comparing the relative abundance of species of tiny organisms preserved in the deep-sea cores, PRISM scientists could roughly map how cold-loving organisms gave way to warm ones (and vice versa) at different times in the past. Early results from the project, reported in 1992 by USGS research geologist Harry Dowsett and colleagues, showed that during the Pliocene, warming was amplified at higher latitudes in the North Atlantic.
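The logic of turning a census of tiny fossils into a temperature estimate can be sketched as a weighted average of each species’ preferred conditions. The species and numbers below are hypothetical; real reconstructions such as PRISM’s rely on statistically calibrated transfer functions:

```python
# A toy version of an assemblage-based temperature estimate: weight each species'
# preferred temperature by its share of the fossils in a sediment sample.
# Species and numbers are hypothetical, for illustration only.
preferred_temp_c = {"cold_lover": 6.0, "temperate": 14.0, "warm_lover": 22.0}

def estimate_sea_surface_temp(abundances):
    """abundances: mapping of species to its fraction of the assemblage (fractions sum to 1)."""
    return sum(frac * preferred_temp_c[species] for species, frac in abundances.items())

cold_interval = {"cold_lover": 0.70, "temperate": 0.25, "warm_lover": 0.05}
warm_interval = {"cold_lover": 0.10, "temperate": 0.40, "warm_lover": 0.50}

print(round(estimate_sea_surface_temp(cold_interval), 1))  # 8.8 C
print(round(estimate_sea_surface_temp(warm_interval), 1))  # 17.2 C, warm-water species dominate
```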
Scientists continue to add to the PRISM records. One international team drilled a sediment core from beneath a Siberian lake and found that summer air temperatures there, in the mid-Pliocene, were as high as 15° C (about 59° Fahrenheit). That’s 8 degrees warmer than today (SN: 6/15/13, p. 13). Other researchers uncovered clues, such as plant fossils from peat bogs, that suggest mean annual temperatures on Canada’s now-frozen Ellesmere Island near Greenland were as much as 18 degrees higher than today (SN: 4/6/13, p. 9).
Now, a new group of biologists, geoscientists and other experts in past landscapes have banded together in a project called PoLAR-FIT, for Pliocene Landscape and Arctic Remains — Frozen in Time. The team is focusing on the Arctic because, just as today’s Arctic is warming faster than other parts of the planet, the Pliocene Arctic warmed more than the rest of the globe. “That’s what we call polar amplification,” says Tamara Fletcher, a team member and paleoecologist at the University of Montana in Missoula. “It was even more magnified in the Pliocene than what we’re seeing today.”
PoLAR-FIT scientists travel to the Arctic to collect geologic evidence about how the region responded to rising temperatures in the Pliocene. In the thawing permafrost slopes of Ellesmere Island, for instance, Fletcher and colleagues have been mapping black layers of charcoal in sediments dating from the Pliocene. Each charcoal layer represents a fire that burned through the ancient forest. By tracking the events across Ellesmere and other nearby islands, Fletcher’s team discovered that fire was widespread across what is now the Canadian Arctic. Wildfires changed vegetation across the landscape, possibly altering how the Arctic responded to rising temperatures. Soot rising from the fires would have darkened the skies, potentially leading to local or regional weather changes. “How important is that to the warming?” asks Bette Otto-Bliesner, a paleoclimatologist at the National Center for Atmospheric Research in Boulder, Colo. “That’s something we’re still trying to determine.” Fletcher, Otto-Bliesner and colleagues described the charcoal discovery, along with modeling studies of the fires’ effects, in Seattle in October at a meeting of the Geological Society of America.
In 2012, roughly 283,000 square kilometers of forest burned in Russia. Three years later, more than 20,000 square kilometers burned in Alaska. Last summer, a wildfire broke out in the icy landscape of western Greenland. “We’re already seeing fire in the Arctic, which is unusual today,” Fletcher says. “But it wouldn’t have been unusual in the Pliocene.”
While the work doesn’t predict how much of the Arctic will burn as temperatures rise, the findings do suggest that people need to prepare for more fires in the future.

Trapped ocean heat
Scientists like Fletcher are the genealogists of the Pliocene, collecting records of past environments. Other researchers — the computer modelers — put those old records into broad context, like historians analyzing family trees for patterns of migration and change.
The modelers begin with data on Pliocene temperatures — such as how hot it got on Ellesmere Island or in the North Atlantic Ocean, as revealed by plant fossils or seafloor sediments. Scientists can also estimate how much CO2 was in the atmosphere at the time by looking at clues such as the density of holes in fossilized leaves of Pliocene plants, which used those openings to take up CO2. Estimates vary, but most suggest CO2 levels were about 350 to 450 ppm in the mid-Pliocene. It’s not clear what caused the gas buildup during the Pliocene; one possibility is it came from long-term changes in the way carbon cycles between the land, ocean and atmosphere. But no matter the source, the high levels of CO2 caused temperatures to soar by trapping heat in the atmosphere.

The Pliocene isn’t a perfect crystal ball for today. For starters, scientists know why CO2 levels are now increasing — burning of fossil fuels and other human activities (SN: 5/30/15, p. 15). As the Industrial Revolution was gaining steam in the 19th century, atmospheric CO2 levels were around 280 ppm. Today that level is just above 400 ppm, and rising.
Modeling the Pliocene climate can help reveal how Earth responded in somewhat similar conditions. That means studying changes in the Pliocene atmosphere, the land surface and most of all the oceans, which absorb the bulk of planetary warming. “That’s the sort of thing you can understand from studying past warm episodes,” Ford says. “What was different about how heat and carbon were moving around in the ocean?”
Ford has begun working with climatologist Natalie Burls of George Mason University in Fairfax, Va., to try to track how the oceans’ major water masses shifted during the Pliocene. Today the North Atlantic has a deep, cold, salty layer that is crucial to the ocean’s “conveyor belt” circulation. In this pattern, warm waters flow northward from the tropics, then cool and become saltier and denser as they reach higher latitudes. That cool water sinks and travels southward, where it warms and rises and begins the cycle all over again.
This conveyor belt circulation is important to today’s Atlantic climate, because it allows the warm Gulf Stream to moderate temperatures from the U.S. East Coast to Western Europe. Burls and colleagues have now found that a similar pattern might have existed in the Pacific during the Pliocene. They call it the Pacific meridional overturning circulation, or PMOC, just as today’s similar Atlantic circulation is known as the AMOC.
Burls’ team discovered this phenomenon by modeling how the Pliocene ocean would have responded to higher temperatures. Because the Arctic was so warm, the temperature difference between the equator and the mid- and high latitudes was not as great as it is today. The weaker temperature gradient would have meant less rainfall and more evaporation in the midlatitude North Pacific. As a result, its uppermost waters would have gotten saltier. When the North Pacific waters got salty enough, they cooled and sank, setting up an enormous current that dove deep off the coast of northeastern Russia and traveled southward until the water warmed enough to once again rise toward the surface. Real-world data back the claim: Accumulations of calcium carbonate in deep-sea Pacific sediments show that the Pliocene ocean experienced huge shifts at the time, with waters churning all the way from the surface down to about three kilometers deep, as would be expected from a conveyor belt–type circulation. The team reported the finding in Science Advances in September.
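The sinking step in any such conveyor belt comes down to density: cooler, saltier water is heavier than warm, fresher water. A minimal sketch using a simplified linear equation of state for seawater, with typical textbook coefficients rather than values from the study:

```python
# Why cooling, saltier surface water sinks: a simplified linear equation of state for
# seawater density. Coefficients are typical textbook values, not figures from the study.
RHO0, T0, S0 = 1027.0, 10.0, 35.0  # reference density (kg/m^3), temperature (C), salinity (psu)
ALPHA = 2.0e-4                     # thermal expansion coefficient, per degree C
BETA = 7.6e-4                      # haline contraction coefficient, per psu

def density(temp_c, salinity_psu):
    return RHO0 * (1 - ALPHA * (temp_c - T0) + BETA * (salinity_psu - S0))

# Warm, relatively fresh subtropical surface water vs. the cooler, saltier water the
# model produces in the midlatitude North Pacific: the second parcel is denser, so it sinks.
print(round(density(18.0, 34.5), 1))  # ~1025.0 kg/m^3
print(round(density(6.0, 35.5), 1))   # ~1028.2 kg/m^3
```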
What happened in the Pliocene Pacific may say something about the Pacific of the distant future, Burls says. As temperatures rise today, most of the heat is being taken up by the surface layers of the oceans. Over the short term, that works to prevent changes in deep ocean circulation. “Today we’re very quickly turning on the heating, and it will take a while for the deep ocean to adjust,” Burls says.
But in the longer term, thousands of years from now, waters in the North Pacific may eventually become warm and salty enough to establish a PMOC, just as there was in the Pliocene. And that could lead to major changes in weather and climate patterns around the globe.
Land bridges and ice sheets
Other modelers are looking beyond the Pacific to improve their understanding of how different parts of the Pliocene world behaved. About a dozen research groups recently launched a new effort called PlioMIP2, or Pliocene Model Intercomparison Project, Phase 2, to model the climate of a time somewhat similar to today in the mid-Pliocene, about 3.205 million years ago.
“We’re working to produce the best picture that we can of what life seemed to be like at the time,” says Alan Haywood, a climate modeler at the University of Leeds in England and a leader of the effort.
In one discovery, project scientists have found that small changes in the geography of their modeled world make a big improvement in the final results. Early models did not accurately capture how much the polar regions heated up. So PlioMIP2 researchers updated their starting conditions. Instead of assuming that the landmasses of the Pliocene world were identical to today’s, the group made two small, plausible changes in the Arctic. The researchers made a land bridge between Russia and Alaska by closing the Bering Strait, and they added land to connect a few modern islands in the Canadian Arctic, including Ellesmere.
The change “seems small, but it actually can have a huge impact on climate,” says Otto-Bliesner. For instance, closing the Bering Strait cut off a flow in which relatively fresh water from the Pacific travels over the Arctic and into the North Atlantic. With the updated geography, the PlioMIP2 models suddenly did a much better job of simulating heat in the high Arctic.
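In practice, “closing the Bering Strait” enters a model as a change to its boundary conditions (the land-sea mask) rather than to its physics. A toy sketch of the idea, on a hypothetical grid that is not the actual PlioMIP2 configuration:

```python
import numpy as np

# Illustrative only: "closing the Bering Strait" is a tweak to the land-sea mask in the
# boundary conditions. The grid and cell indices here are hypothetical, not PlioMIP2's.
nlat, nlon = 90, 180                               # a toy 2-degree global grid
modern_mask = np.zeros((nlat, nlon), dtype=bool)   # True = land, False = ocean

bering_strait_cells = (slice(77, 79), slice(95, 98))  # pretend these cells are the strait

pliocene_mask = modern_mask.copy()
pliocene_mask[bering_strait_cells] = True          # turn the strait's ocean cells into land

print("ocean cells converted to land:",
      int(pliocene_mask.sum()) - int(modern_mask.sum()))  # 6 in this toy example
```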
Otto-Bliesner will describe the team’s results in New Orleans this month at a meeting of the American Geophysical Union. Another PlioMIP2 group, Deepak Chandan and Richard Peltier of the University of Toronto, reported similar findings in July in Climate of the Past. They too found that closing the Bering Strait allowed their model to better simulate the Arctic heating.
Other Pliocene modelers are trying to figure out how the planet’s enormous ice sheets in Greenland and Antarctica might respond to rising temperatures. Geologic evidence, such as ancient beaches from the Pliocene, suggests that global sea levels then were as much as 25 meters higher than today. If all of Greenland’s ice were to melt, global sea levels would rise about six meters; if all of Antarctica went, it would contribute about 60 meters. So parts of these ice sheets, but not all, must have melted during the long-ago warm period.
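The arithmetic behind that conclusion, using the round numbers quoted above:

```python
# Rough bookkeeping with the figures quoted above: 25 meters of extra Pliocene sea level
# is more than Greenland alone can supply but far less than melting everything.
greenland_full_melt_m = 6
antarctica_full_melt_m = 60
pliocene_highstand_m = 25

# Even with Greenland gone entirely, Antarctica still had to make up the difference.
needed_from_antarctica_m = pliocene_highstand_m - greenland_full_melt_m
print(needed_from_antarctica_m, "m of sea level rise from Antarctica, at minimum")              # 19
print(round(100 * needed_from_antarctica_m / antarctica_full_melt_m), "% of Antarctica's ice")  # ~32
```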
Several of the PlioMIP2 research groups are modeling how the polar ice sheets responded in the heat of the Pliocene. “It will tell us how much we should be worried,” Otto-Bliesner says. One new study suggests that the northern and southern ice sheets may have behaved out of phase with each other. In a simulation of the mid- to late Pliocene, climate modeler Bas de Boer of Utrecht University in the Netherlands and colleagues found that as Greenland’s ice melted, Antarctica’s ice could have been relatively stable, and vice versa.
“At different points, they could be contributing to the sea level story or against it,” says Haywood. He, along with colleagues, reported the results in the Oct. 30 Geophysical Research Letters.
That out-of-sync melting suggests the Pliocene was a complicated time. Just because global temperatures were high doesn’t mean that all of Earth’s ice sheets melted equally. (Today, both Greenland and West Antarctica are losing ice to the oceans as global temperatures rise.)
The Pliocene wound to an end around 2.6 million years ago, as CO2 levels dropped. Chemical reactions with eroding rocks may have sucked much of the CO2 out of the atmosphere and tucked it away in the oceans, removing the greenhouse gas. The planet entered a long-term cooling trend. Since the end of the Pliocene, Earth has been in and out of a series of ice ages.
But now, greenhouse gases are once again flooding into the atmosphere. Global temperatures are ticking up inexorably year after year. That makes the lessons of the past all the more relevant for the future.
A misfit gang of superconducting materials may be losing their outsider status.
Certain copper-based compounds superconduct, or transmit electricity without resistance, at unusually high temperatures. It was thought that the standard theory of superconductivity, known as Bardeen-Cooper-Schrieffer theory, couldn’t explain these oddballs. But new evidence suggests that the standard theory applies despite the materials’ quirks, researchers report in the Dec. 8 Physical Review Letters.
All known superconductors must be chilled to work. Most must be cooled to temperatures that hover above absolute zero (–273.15° Celsius). But some copper-based superconductors work at temperatures above the boiling point of liquid nitrogen (around –196° C). Finding a superconductor that functions at even higher temperatures — above room temperature — could provide massive energy savings and new technologies (SN: 12/26/15, p. 25). So scientists are intent upon understanding the physics behind known high-temperature superconductors.

When placed in a magnetic field, many superconductors display swirling vortices of electric current — a hallmark of the standard superconductivity theory. But for the copper-based superconductors, known as cuprates, scientists couldn’t find whirls that matched the theory’s predictions, suggesting that a different theory was needed to explain how the materials superconduct. “This was one of the remaining mysteries,” says physicist Christoph Renner of the University of Geneva. Now, Renner and colleagues have found vortices that agree with the theory in a high-temperature copper-based superconductor, studying a compound of yttrium, barium, copper and oxygen.
Vortices in superconductors can be probed with a scanning tunneling microscope. As the microscope tip moves over a vortex, the instrument records a change in the electrical current. Renner and colleagues realized that, in their copper compound, there were two contributions to the current that the probe was measuring, one from superconducting electrons and one from nonsuperconducting ones. The nonsuperconducting contribution was present across the entire surface of the material and masked the signature of the vortices.
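The fix, described below, is to estimate that uniform part and remove it everywhere. The sketch that follows is an invented illustration of that general idea, not the group’s actual analysis:

```python
import numpy as np

# Schematic only: the measured conductance map is a superconducting signal that dips at a
# vortex core plus a nonsuperconducting background that is the same everywhere.
# Numbers and shapes are invented for illustration.
x, y = np.meshgrid(np.linspace(-50, 50, 200), np.linspace(-50, 50, 200))  # toy grid, nm

vortex_core = np.exp(-(x**2 + y**2) / (2 * 10.0**2))  # suppression centered on one vortex
superconducting_signal = 1.0 - vortex_core            # drops toward zero at the core
uniform_background = 2.5                              # arbitrary units, spatially constant

measured = superconducting_signal + uniform_background  # vortex contrast is buried

# Estimate the background from a spot far from any vortex, then remove it everywhere.
background_estimate = measured[0, 0]                  # a corner pixel, far from the core
recovered = measured - background_estimate

print(round((measured.max() - measured.min()) / measured.max(), 2))             # ~0.29: weak contrast
print(round((recovered.max() - recovered.min()) / np.abs(recovered).max(), 2))  # 1.0: vortex stands out
```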
Subtracting the nonsuperconducting portion revealed the vortices, which behaved in agreement with the standard superconductivity theory. “That, I think, is quite astonishing; it’s quite a feat,” says Mikael Fogelström of Chalmers University of Technology in Gothenburg, Sweden, who was not involved with the research. The result lifts some of the fog surrounding cuprates, which have so far resisted theoretical explanation. But plenty of questions still surround the materials, Fogelström says. “It leaves many things still open, but it sort of gives a new picture.”
One person infected with strep bacteria might get a painful sore throat; another might face a life-threatening blood infection. Now, scientists are trying to pin down why.
Variation between individuals’ immune systems may not be entirely to blame. Instead, extra genes picked up by some pathogens can cause different strains to have wildly different effects on the immune system, even in the same person, researchers report January 11 in PLOS Pathogens.
The idea that different strains of bacteria can behave differently in the body isn’t new. Take E. coli: Some strains of the bacteria that can cause foodborne illness make people far sicker than other strains. But bacteria have exceptionally large amounts of genetic variation, even between members of the same species. Scientists are still trying to figure out how that genetic diversity affects the way microbes interact with the immune system.

Any species of bacteria has a core set of genes that all its members share. Then there’s a whole pot of genes that different strains of the species pick and choose to create what’s known as an accessory genome. These genes are custom add-ons that specific strains have acquired over time, from their environment or from other microbes — something like an expansion pack for a card game. Sometimes, that extra genetic material gives bacteria new traits.
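That core-versus-accessory distinction amounts to set arithmetic over each strain’s gene inventory. A minimal sketch, with made-up gene and strain names:

```python
# Core vs. accessory genome as set arithmetic over each strain's gene list.
# Gene and strain names are made up for illustration.
strain_genes = {
    "strain_A": {"gyrA", "rpoB", "recA", "toxin_1"},
    "strain_B": {"gyrA", "rpoB", "recA", "capsule_2"},
    "strain_C": {"gyrA", "rpoB", "recA", "toxin_1", "adhesin_3"},
}

core_genome = set.intersection(*strain_genes.values())  # genes every strain shares
pan_genome = set.union(*strain_genes.values())          # every gene seen in the species
accessory_genome = pan_genome - core_genome             # the strain-specific add-ons

print(sorted(core_genome))       # ['gyrA', 'recA', 'rpoB']
print(sorted(accessory_genome))  # ['adhesin_3', 'capsule_2', 'toxin_1']
```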
Uri Sela and his colleagues at the Rockefeller University in New York City tested the way these extra genes influenced the way two common species of bacteria, Staphylococcus aureus and Streptococcus pyogenes, interacted with the immune system. Staphylococcus bacteria can cause everything from rashes to food poisoning to blood infections. Streptococcus bacteria can cause strep throat, as well as a host of more serious illnesses (SN: 10/4/14, p. 22).
Different strains of the same species provoked wildly different immune responses in blood samples collected from the same patient, the researchers first showed. But the strain-specific responses were consistent across patients. Some strains triggered lots of T cells to be made in every sample, for example; others increased B cell activity. (T cells and B cells are the two main weapons of the adaptive immune response, which enables the body to build long-lasting immunity against a particular pathogen.) In tests of strains missing some of their extra genes, though, the T cells didn’t respond as strongly as they did to a matching strain that contained the extra genes. This finding suggests that the variation in immune response across strains was coming, at least in part, from differences in these supplementary genes. “Currently when a patient comes to the hospital with an infection, we don’t define the strain of the species” for common infections like strep and staph, says Sela, an immunologist. In the future, he says, information about the strain could help doctors predict how a patient’s illness will unfold and decide on the best treatment.
The new study “adds fuel to an active debate” about the role of accessory genes, says Alan McNally, a microbiologist at the University of Birmingham in England — whether or not the collections of genetic add-ons that bacteria maintain are shaped by natural selection, the process that fuels evolution. This research suggests that for some kinds of bacteria, genetic customization might aid survival of certain strains by enabling them to provoke a tailored immune response.
But more research needs to be done to link the strain-to-strain variation in immune response to the accessory genome, he says, as this study looked at only a few extra genes, not the entire accessory genome.
Youngsters have probably been playing their way into cultural competence for at least tens of thousands of years. So why are signs of children largely absent from the archaeological record?
A cartoon that Biblical scholar Kristine Garroway taped up in her college dorm helps to explain kids’ invisibility at ancient sites: Two men in business suits stare intently at an unidentifiable round object sitting on a table. “Hey, what’s this?” asks the first guy. “I dunno, probably a toy … or a religious object,” says the second.

Archaeologists have long tended to choose the second option, says Garroway, now a visiting scientist at Hebrew Union College–Jewish Institute of Religion in Los Angeles. Ambiguous finds, such as miniature pottery vessels and small figurines, get classified as ritual or decorative objects. Some of these artifacts undoubtedly were used in ceremonies. But not all of them, Garroway argues.

Of 48 miniature clay vessels excavated from inside roughly 3,650- to 4,000-year-old houses at Israel’s Tel Nagila site, 10 retained fingerprints the size of children’s that were made during the shaping of soft clay, before the clay was heated and hardened, archaeologists reported in 2013. Kids must have made those somewhat unevenly shaped jars and bowls, each easily held within a child’s hand, concluded Joe Uziel of the Israel Antiquities Authority in Jerusalem and independent Israeli researcher Rona Avissar Lewis in Palestine Exploration Quarterly.

Unusual finds in Israel dating to around 3,000 years ago also represent children’s early attempts to mimic adult craftwork, Garroway said in a November 18 presentation in Boston at the annual meeting of the American Schools of Oriental Research. Numerous rounded clay disks, each pierced with two holes, have mystified investigators for nearly a century. As early as 1928, an archaeologist suggested that these button-sized objects were toys. After passing a string through both of a disk’s holes and tying the ends together, a youngster could swing the string to wind up the toy and then pull both ends of the string to make the disk spin.

Clay disks from six Israeli sites can be separated into those made by skilled artisans and others — featuring rough edges and unevenly spaced holes — made by novices, including children, Garroway proposes. If those items were toys, sloppy execution may have partly resulted from children’s impatience to play with the final product, she suspects.
Garroway’s proposal appears likely, especially in light of evidence that more than 10,000 years earlier, people in France and Spain made similar spinning disks decorated with animals that appeared to move as the toy twirled (SN: 6/30/12, p. 12), says archaeologist Michelle Langley of Griffith University in Brisbane, Australia.
Western European finds from as early as 14,000 to 21,000 years ago also may have gone unrecognized as children’s toys, Langley suggests in a paper published this month in the Oxford Journal of Archaeology. One specimen, a cave lion carved out of a reindeer’s antler, displays so much polish from handling that children may have played with the item for years, she says. Some bone spearpoints with broken tips bear signs of unskilled repair, suggesting adults gave the damaged weapons to children to practice bone-working skills and perhaps play with, she adds.