Stephen Hawking finds the inner genius in ordinary people

It’s hard to believe that it took reality television this long to get around to dealing with space, time and our place in the cosmos.

In PBS’ Genius by Stephen Hawking, the physicist sets out to prove that anyone can tackle humankind’s big questions for themselves. Each of the series’ six installments focuses on a different problem, such as the possibility of time travel or the likelihood that there is life elsewhere in the universe. With Hawking as a guide, three ordinary folks must solve a series of puzzles that lead them toward enlightenment about that episode’s theme. Rather than line up scientists to talk at viewers, the show invites us to follow each episode’s trio on a journey of discovery.
By putting the focus on nonexperts, Genius emphasizes that science is not a tome of facts handed down from above but a process driven by curiosity. After working through a demonstration of how time slows down near a black hole, one participant reflects: “It’s amazing to see it play out like this.”
The show is a fun approach to big ideas in science and philosophy, and the enthusiasm of the guests is infectious. Without knowing what was edited out, though, it’s difficult to say whether the show proves Hawking’s belief that anyone can tackle these heady questions. Each situation is carefully designed to lead the participants to specific conclusions, and there seems to be some off-camera prompting.

But the bigger message is a noble one: A simple and often surprising chain of reasoning can lead to powerful insights about the universe, and reading about the cosmos pales next to interacting with stand-ins for its grandeur. It’s one thing, for example, to hear that there are roughly 300 billion stars in the Milky Way. But to stand next to a mountain of sand where each grain represents one of those stars is quite another. “I never would have got it until I saw it,” says one of the guests, gesturing to the galaxy of sand grains. “This I get.”
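
For readers who want to check the analogy, a back-of-the-envelope calculation suggests how big a one-grain-per-star sand pile would be. The grain size, packing fraction and pile shape below are assumptions chosen for illustration, not figures from the show:

    import math

    # Rough size of a sand pile with one grain per Milky Way star
    # (all physical numbers below are assumed, ballpark values)
    n_stars = 300e9                      # roughly 300 billion stars
    grain_diameter = 0.5e-3              # a 0.5 mm sand grain, in meters
    packing_fraction = 0.64              # random close packing of spheres
    angle_of_repose = math.radians(34)   # typical slope of a dry sand pile

    grain_volume = (4 / 3) * math.pi * (grain_diameter / 2) ** 3
    bulk_volume = n_stars * grain_volume / packing_fraction   # cubic meters

    # Treat the pile as a cone whose sides sit at the angle of repose:
    # V = (1/3) * pi * r^2 * h, with h = r * tan(angle)
    radius = (3 * bulk_volume / (math.pi * math.tan(angle_of_repose))) ** (1 / 3)
    height = radius * math.tan(angle_of_repose)

    print(f"pile volume: about {bulk_volume:.0f} cubic meters")
    print(f"roughly {2 * radius:.1f} m across and {height:.1f} m tall")

Even with these modest assumptions, the pile works out to a mound a couple of meters tall and several meters across, consistent with the “mountain of sand” the guests stand beside.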

Snot could be crucial to dolphin echolocation

In hunting down delicious fish, Flipper may have a secret weapon: snot.

Dolphins emit a series of quick, high-frequency sounds — probably by forcing air over tissues in the nasal passage — to find and track potential prey. “It’s kind of like making a raspberry,” says Aaron Thode of the Scripps Institution of Oceanography in San Diego. Thode and colleagues tweaked a human speech modeling technique to reproduce dolphin sounds and discern the intricacies of their unique style of sound production. He presented the results on May 24 in Salt Lake City at the annual meeting of the Acoustical Society of America.

Dolphin clicks have two parts: a thump and a ring. The researchers’ model worked on the assumption that lumps of tissue bumping together produce the thump, and those tissues pulling apart produce the ring. But to match the high frequencies of live bottlenose dolphins, the researchers had to make the surfaces of those tissues sticky. That suggests that mucus lining the nasal passage tissue is crucial to dolphin sonar.

The vocal model also successfully mimicked whistling noises used to communicate with other dolphins and faulty clicks that probably result from inadequate snot. Such techniques could be adapted to study sound production or echolocation in sperm whales and other dolphin relatives.
Researchers modified a human speech model developed in the 1970s to study dolphin echolocation. An accompanying animation mimics the vibration of lumps of tissue (green) in the dolphin’s nasal passage (black) that are drenched in mucus. Snot-covered tissues (blue) stick together (red) and pull apart to create the click sound.
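
The thump-plus-ring picture can be sketched in a few lines of code. The snippet below is only a toy illustration, not the team’s acoustic model; the sample rate, decay times and the shortcut of treating stickier tissue as simply ringing at a higher frequency are all assumptions made for clarity:

    import numpy as np

    fs = 500_000                      # sample rate in Hz, high enough for dolphin clicks
    t = np.arange(0, 0.0005, 1 / fs)  # half a millisecond of signal

    def click(ring_freq_hz, ring_decay_s=5e-5, thump_decay_s=1e-5):
        """Toy click: a fast-decaying broadband thump plus a decaying ringing tone."""
        thump = np.exp(-t / thump_decay_s) * np.random.default_rng(0).normal(size=t.size)
        ring = np.exp(-t / ring_decay_s) * np.sin(2 * np.pi * ring_freq_hz * t)
        return thump + ring

    dry_click = click(ring_freq_hz=40_000)      # assumed: tissue without mucus rings low
    sticky_click = click(ring_freq_hz=120_000)  # assumed: snot-coated tissue rings higher

    for name, sig in [("dry", dry_click), ("sticky", sticky_click)]:
        spectrum = np.abs(np.fft.rfft(sig))
        peak = np.fft.rfftfreq(sig.size, 1 / fs)[np.argmax(spectrum)]
        print(f"{name} click: spectral peak near {peak / 1000:.0f} kHz")

Only the “sticky” version reaches the higher frequencies, which is the gist of why the modelers needed sticky surfaces to match live dolphins.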

Jupiter’s stormy weather no tempest in teapot

Jupiter’s turbulence is not just skin deep. The giant planet’s visible storms and blemishes have roots far below the clouds, researchers report in the June 3 Science. The new observations offer a preview of what NASA’s Juno spacecraft will see when it sidles up to Jupiter later this year.

A chain of rising plumes, each reaching nearly 100 kilometers into Jupiter, dredges up ammonia to form ice clouds. Between the plumes, dry air sinks back into the Jovian depths. And the famous Great Red Spot, a storm more than twice as wide as Earth that has churned for several hundred years, extends at least dozens of kilometers below the clouds as well.

Jupiter’s dynamic atmosphere provides a possible window into how the planet works inside. “One of the big questions is what is driving that change,” says Leigh Fletcher, a planetary scientist at the University of Leicester in England. “Why does it change so rapidly, and what are the environmental and climate-related factors that result from those changes?”

To address some of those questions, Imke de Pater, a planetary scientist at the University of California, Berkeley, and colleagues observed Jupiter with the Very Large Array radio observatory in New Mexico. Jupiter emits radio waves generated by heat left over from its formation about 4.6 billion years ago. Ammonia gas within Jupiter’s atmosphere intercepts certain radio frequencies. By mapping how and where those frequencies are absorbed, the researchers created a three-dimensional map of the ammonia that lurks beneath Jupiter’s clouds. Those plumes and downdrafts appear to be powered by a narrow wave of gas that wraps around much of the planet.
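
The logic of turning those observations into a three-dimensional map can be illustrated with a toy sketch. Nothing below comes from the team’s actual analysis; the frequencies, the depths each one probes and the stand-in brightness maps are all assumed, and the real retrieval relies on detailed atmospheric modeling:

    import numpy as np

    # Lower radio frequencies escape from deeper, higher-pressure layers, so each
    # observed frequency is treated here as a snapshot of one depth (assumed values).
    freqs_ghz = np.array([4.0, 8.0, 12.0, 16.0])
    probe_depth_bar = np.array([8.0, 4.0, 2.0, 1.0])    # rough pressure probed by each

    ny, nx = 64, 128                                    # pixels on the sky
    rng = np.random.default_rng(1)
    brightness = rng.normal(300.0, 5.0, size=(len(freqs_ghz), ny, nx))  # stand-in maps, kelvin

    # Where ammonia absorbs, Jupiter looks dimmer at that frequency, so a brightness
    # deficit at a given frequency maps onto extra ammonia at the depth it probes.
    reference = brightness.mean(axis=(1, 2), keepdims=True)
    ammonia_proxy = reference - brightness              # deficit = more absorbing gas

    # Stacking the slices by depth gives a crude 3-D cube of ammonia enhancement.
    for p, slice_2d in zip(probe_depth_bar, ammonia_proxy):
        print(f"~{p:.0f} bar layer: mean ammonia proxy {slice_2d.mean():+.2f} K")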

The depth of Jupiter’s atmospheric choppiness isn’t too surprising, says Scott Bolton, a planetary scientist at the Southwest Research Institute in San Antonio. “Almost everyone I know would have guessed that,” he says. But the observations do provide a teaser for what to expect from the Juno mission, led by Bolton. The spacecraft arrives at Jupiter on July 4 to begin a 20-month investigation of what’s going on beneath Jupiter’s clouds using tools similar to those used in this study.

The new observations confirm that Juno should work as planned, Bolton says.

By getting close to the planet — just 5,000 kilometers from the cloud tops — Juno will break through the fog of radio waves from Jupiter’s radiation belts that obscures observations made from Earth and limits what telescopes like the Very Large Array can see. But the spacecraft will see only a narrow swath of Jupiter’s bulk at a time. “That’s where ground-based work like the research de Pater has been doing is really essential,” Fletcher says. Observations such as these will let Juno scientists know what’s going on throughout the atmosphere so they can better understand what Jupiter is telling them.

Doctors need better ways to figure out fevers in newborns

Two days after my first daughter was born, her pediatrician paid a house call to examine her newest patient. After packing up her gear, she told me something alarming: “For the next few months, a fever is an emergency.” If we measured a rectal temperature at or above 100.4° Fahrenheit (38° Celsius), she said, we should go to the hospital. Call her on the way, but don’t wait.

I, of course, had no idea that a fever constituted an emergency. But our pediatrician explained that a fever in a very young infant can signal a fast-moving and dangerous bacterial infection. These infections are rare (and fortunately becoming even rarer thanks to newly created vaccines). But they’re serious, and newborns are particularly susceptible.

I’ve since heard from friends who have been through this emergency. Their newborns were poked, prodded and monitored by anxious doctors, in the hopes of quickly ruling out a serious bacterial infection. For infants younger than two months, it’s “enormously difficult to tell if an infant is seriously ill and requires antibiotics and/or hospitalization,” says Howard Bauchner, a pediatrician formerly at Boston University School of Medicine and now editor in chief of the Journal of the American Medical Association.

A new research approach, described in two JAMA papers published in August, may ultimately lead to better ways to find the cause of a fever.

These days, for most (but not all) very young infants with a fever, arrival at a hospital will trigger a workup that includes a urine culture and a blood draw. Often doctors will perform a lumbar puncture, more commonly known as a spinal tap, to draw a sample of cerebrospinal fluid from the area around the spinal cord.

Doctors collect these fluids to look for bacteria. Blood, urine and cerebrospinal fluid are smeared onto culture dishes, and doctors wait and see if any bacteria grow. In the meantime, the feverish infant may be started on antibiotics, just in case. But this approach has its limitations. Bacterial cultivation can take several days. The antibiotics may not be necessary. And needless to say, it’s not easy to get those fluids, particularly from a newborn.

Some scientists believe that instead of looking for bacteria or viruses directly, we ought to be looking at how our body responds to them. Unfortunately, the symptoms of bacterial and viral infections are frustratingly similar. “You get a fever. You feel sick,” says computational immunologist Purvesh Khatri of Stanford University. Sadly, there are no obvious telltale symptoms of one or the other, not even green snot. In very young infants, a fever might be the only sign that something is amiss.
But more subtle clues could betray the cause of the fever. When confronted with an infection, our immune systems ramp up in specific ways. Depending on whether we are fighting a viral or bacterial foe, different genes turn up their activity. “The immune system knows what’s going on,” Khatri says. That means that if we could identify the genes that reliably get ramped up by viruses and those that get ramped up by bacteria, then we could categorize the infection based on our genetic response.

That’s the approach used by two groups of researchers, whose study results both appear in the August 23/30 JAMA. One group found that in children younger than 2, two specific genes could help make the call on infection type. Using blood samples, the scientists found that one of the genes ramped up its activity in response to a viral infection, and the other responded to a bacterial infection.
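
Conceptually, such a two-gene test boils down to comparing two numbers from one blood sample. The sketch below is a hypothetical illustration, not the published classifier; the marker names, expression values and threshold are invented:

    import math

    def classify_infection(viral_marker, bacterial_marker, threshold=0.0):
        """Guess infection type from two marker genes' expression levels.

        viral_marker and bacterial_marker are normalized expression values for a
        gene that ramps up during viral infection and one that ramps up during
        bacterial infection (hypothetical markers, arbitrary units).
        """
        score = math.log2(bacterial_marker / viral_marker)
        return "likely bacterial" if score > threshold else "likely viral"

    # Hypothetical feverish infants
    print(classify_infection(viral_marker=8.0, bacterial_marker=2.0))   # likely viral
    print(classify_infection(viral_marker=1.5, bacterial_marker=6.0))   # likely bacterial

In practice, a usable test would also need a gray zone and rigorous validation, which is exactly the accuracy concern Bauchner raises below.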

The other study looked at immune responses in even younger children. In infants younger than 60 days, the activity of 66 genes measured in blood samples did a pretty good job of distinguishing between bacterial and viral infections. “These are really exciting preliminary results,” says Khatri, who has used a similar method for adults. “We need to do more work.”

Bauchner points out that in order to be useful, “the test would have to be very, very accurate in very young infants.” There’s very little room for error. “Only time will tell how good these tests will be,” he says. In an editorial that accompanied the two studies, he evoked the promise of these methods. If other experiments replicate and refine the results of these studies, he could envision a day in which the parents of a feverish newborn could do a test at home, call their doctor and together decide if the child needs more care.

That kind of test isn’t here yet, but scientists are working on it. The technology couldn’t come soon enough for doctors and parents desperate to figure out a fever.

Endurance training leaves no memory in muscles

Use it or lose it, triathletes.

Muscles don’t have long-term memory for exercises like running, biking and swimming, a new study suggests. The old adage that once you’ve been in shape, it’s easier to get fit again could be a myth, at least for endurance athletes, researchers in Sweden report September 22 in PLOS Genetics.

“We really challenged the statement that your muscles can remember previous training,” says Maléne Lindholm of the Karolinska Institute in Stockholm. But even if muscles forget endurance exercise, the researchers say, other parts of the body may remember, and that could make retraining easier for people who’ve been in shape before.
Endurance training is amazingly good for the body. Weak muscle contractions, sustained over a long period of time — as during a bike ride — change proteins, mainly ones involved in metabolism. This translates into more energy-efficient muscle that can help stave off illnesses like diabetes, cardiovascular disease and some cancers. The question is, how long do those improvements last?

Previous work in mice has shown that muscles “remember” strength training (SN: 9/11/10, p. 15). But rather than making muscles more efficient, strength-training moves like squats and push-ups make muscles bigger and stronger. The muscles bulk up as they develop more nuclei. More nuclei lead to more production of proteins that build muscle fibers. Cells keep their extra nuclei even after regular exercise stops, to make protein easily once strength training restarts, says physiologist Kristian Gundersen at the University of Oslo in Norway. Since endurance training has a different effect on muscles, scientists weren’t sure if the cells would remember it or not.
To answer that question, Lindholm’s team ran volunteers through a 15-month endurance training experiment. In the first three months, 23 volunteers trained four times a week, kicking one leg 60 times per minute for 45 minutes. Volunteers rested their other leg. Lindholm’s team took muscle biopsies before and after the three-month period to see how gene activity changed with training. Specifically, the scientists looked for changes in the number of mRNAs (the blueprints for proteins) that each gene was making. Genes associated with energy production showed the greatest degree of change in activity with training.
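
The kind of comparison behind that result can be sketched simply: for each gene, compare mRNA levels in biopsies taken before and after training and rank the genes by fold change. The snippet below uses made-up counts and placeholder gene names; the actual study relied on statistical models built for RNA sequencing data:

    import numpy as np

    genes = ["gene_A", "gene_B", "gene_C", "gene_D"]    # placeholder names
    before = np.array([120.0, 80.0, 300.0, 45.0])       # mRNA counts, pre-training biopsy
    after = np.array([480.0, 85.0, 290.0, 130.0])       # mRNA counts, post-training biopsy

    log2_fold_change = np.log2(after / before)

    # Genes with the largest (absolute) change are the ones training affected most
    for gene, fc in sorted(zip(genes, log2_fold_change), key=lambda pair: -abs(pair[1])):
        print(f"{gene}: log2 fold change {fc:+.2f}")
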
At a follow-up, after participants had stopped training for nine months, scientists again biopsied muscle from the thighs of 12 volunteers, but didn’t find any major differences in patterns of gene activity between the previously trained legs and the untrained legs. “The training effects were presumed to have been lost,” says Lindholm. After another three-month bout of training, this time in both legs, the researchers saw no differences between the previously trained and untrained legs.
While this study didn’t find muscle memory for endurance exercise — most existing evidence for such memory is anecdotal — it still might be easier for former athletes to get triathlon-ready, researchers say. The new result has “no bearing on the possible memory in other organ systems,” Gundersen says. The heart and cardiovascular system could remember and more easily regain previous fitness levels, for example, he says.

Even within muscle tissue, immune cells or stem cells could also have some memory not found in this study, says molecular exercise physiologist Monica Hubal of George Washington University in Washington, D.C. Lindholm adds that well-trained connections between nerves and muscles could also help lapsed athletes get in shape faster than people who have never exercised before. “They know how to exercise, how it’s supposed to feel,” Lindholm says. “Your brain knows exactly how to activate your muscles, you don’t forget how to do that.”

Primitive signs of emotions spotted in sugar-buzzed bumblebees

To human observers, bumblebees sipping nectar from flowers appear cheerful. It turns out that the insects may actually enjoy their work. A new study suggests that bees experience a “happy” buzz after receiving a sugary snack, although it’s probably not the same joy that humans experience chomping on a candy bar.

Scientists can’t ask bees or other animals how they feel. Instead, researchers must look for signs of positive or negative emotions in an animal’s decision making or behavior, says Clint Perry, a neuroethologist at Queen Mary University of London. In one such study, for example, scientists shook bees vigorously in a machine for 60 seconds — hard enough to annoy, but not hard enough to cause injury — and found that stressed bees made more pessimistic decisions while foraging for food.
The new study, published in the Sept. 30 Science, is the first to look for signs of positive bias in bee decision making, Perry says. His team trained 24 bees to navigate a small arena connected to a plastic tunnel. When the tunnel was marked with a blue “flower” (a placard), the bees learned that a tasty vial of sugar water awaited them at its end. When a green “flower” was present, there was no reward. Once the bees learned the difference, the scientists threw the bees a curveball: Rather than being blue or green, the “flower” had a confusing blue-green hue.

Faced with the ambiguous color, the bees appeared to dither, meandering around for roughly 100 seconds before deciding whether to enter the tunnel. Some didn’t enter at all. But when the scientists gave half the bees a treat — a drop of concentrated sugar water — that group spent just 50 seconds circling the entrance before deciding to check it out. Overall, the two groups flew roughly the same distances at the same speeds, suggesting that the group that had gotten a treat first had not simply experienced a boost in energy from the sugar, but was in a more positive, optimistic state, Perry says.
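
At its core, that comparison is a simple statistical question: did the sugar-treated bees decide reliably faster than the untreated ones? The sketch below frames it with invented latencies and an off-the-shelf rank test from SciPy; it is not the paper’s actual analysis:

    from scipy.stats import mannwhitneyu

    treated_latencies = [42, 55, 48, 60, 51, 47]        # seconds to enter, sugar group (made up)
    untreated_latencies = [95, 110, 88, 102, 97, 105]   # seconds to enter, control group (made up)

    # One-sided test: are the treated bees' latencies smaller than the controls'?
    stat, p_value = mannwhitneyu(treated_latencies, untreated_latencies, alternative="less")
    print(f"U = {stat:.1f}, p = {p_value:.4f}")

A small p-value would say the treated bees were reliably quicker, though, as the critics quoted below note, quickness alone cannot rule out a plain energy boost from the sugar.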

In a separate experiment, Perry and colleagues simulated a spider attack on the bees by engineering a tiny arm that darted out and immobilized them with a sponge. Sugar-free bees took about 50 seconds longer than treated bees to resume foraging after the harrowing encounter.

The researchers then applied a solution to the bees’ thoraxes that blocked the action of dopamine, one of several chemicals that transmit rewarding signals in the insect brain. With dopamine blocked, the effects of the sugar treat disappeared, further suggesting that a change in mood, and not just increased energy, was responsible for the bees’ behavior.

The results provide the first evidence for positive, emotion-like states in bees, says Ralph Adolphs, a neuroscientist at Caltech. Yet he suspects that the metabolic effects of sugar did influence the bees’ behavior.
Geraldine Wright, a neuroethologist at Newcastle University in England, shares that concern. “The data reported in the paper doesn’t quite convince me that eating sucrose didn’t change how they behaved, even though they say it didn’t affect flight time or speed of flight,” she says. “I would be very cautious in interpreting the responses of bees in this assay as a positive emotional state.”

‘Time crystal’ created in lab

It may sound like science fiction, but it’s not: Scientists have created the first time crystal, using a chain of ions. Just as a standard crystal repeats in a regular spatial pattern, a time crystal repeats in time, returning to a similar configuration at regular intervals.

“This is a remarkable experiment,” says physicist Chetan Nayak of Microsoft Station Q at the University of California, Santa Barbara. “There is a ‘wow factor.’”

Scientists at the University of Maryland and the University of California, Berkeley created a chain of 10 ytterbium ions. These ions behave like particles with spin, a sort of quantum mechanical version of angular momentum, which can point either up or down. Using a laser, the physicists flipped the spins in a chain of ions halfway around, from up to down, and allowed the ions to interact so that the spin of each ion would influence the others. The researchers repeated this sequence at regular intervals, flipping the ions halfway each time and letting them interact. When scientists measured the ions’ spins, on average the ions went full circle, returning to their original states, in twice the time interval at which they were flipped halfway.
This behavior is sensible — if each flip turns something halfway around, it takes two flips to return to its original position. But scientists found that the ions’ spins would return to their original orientation at that same rate even if they were not flipped perfectly halfway. This result indicates that the system of ions prefers to respond at a certain regular period — the hallmark of a time crystal — just as atoms in a crystal prefer a perfectly spaced lattice. Such time crystals are “one of the first examples of a new phase of matter,” says physicist Norman Yao of UC Berkeley, a coauthor of the new result, posted online September 27 at arXiv.org.
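
The period-doubling behavior can be imitated with a small numerical sketch. The code below is not a simulation of the actual ion-trap experiment; it is an idealized driven spin chain with assumed values for the chain length, the flip error, the interaction strength and the disorder, and it simply checks where the spins’ response spectrum peaks:

    import numpy as np
    from scipy.linalg import expm

    sx = np.array([[0, 1], [1, 0]], dtype=complex)
    sz = np.array([[1, 0], [0, -1]], dtype=complex)
    I2 = np.eye(2, dtype=complex)

    def site_op(op, site, n):
        """Operator acting with `op` on one site of an n-spin chain."""
        mats = [I2] * n
        mats[site] = op
        full = mats[0]
        for m in mats[1:]:
            full = np.kron(full, m)
        return full

    n = 8                                 # spins (the real experiment used 10 ions)
    eps = 0.05                            # each "half turn" misses by 5 percent (assumed)
    J = 0.12 * np.pi                      # spin-spin interaction per period (assumed)
    h = np.random.default_rng(0).uniform(0, np.pi, n)   # random local fields (disorder)

    Hx = sum(site_op(sx, j, n) for j in range(n))
    Hzz = sum(site_op(sz, j, n) @ site_op(sz, j + 1, n) for j in range(n - 1))
    Hz = sum(h[j] * site_op(sz, j, n) for j in range(n))

    # One drive period: imperfect global flip, then interactions, then disorder
    U = expm(-1j * Hz) @ expm(-1j * J * Hzz) @ expm(-1j * (np.pi / 2) * (1 - eps) * Hx)

    psi = np.zeros(2 ** n, dtype=complex)
    psi[0] = 1.0                          # start with every spin pointing up
    Z0 = site_op(sz, 0, n)

    signal = []
    for _ in range(100):                  # drive for 100 periods
        psi = U @ psi
        signal.append(float(np.real(psi.conj() @ (Z0 @ psi))))

    # In the time-crystal regime the spin returns to its starting orientation every
    # two periods, so the spectrum should peak at half the drive frequency even
    # though each flip is imperfect. (With J = 0 the peak drifts to (1 - eps)/2.)
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0)
    print("dominant response frequency:", freqs[1:][np.argmax(spectrum[1:])], "per drive period")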

Time crystals take an important unifying concept in physics — the idea of symmetry breaking — and extend it to time. Physical laws typically treat all points in space equally — no one location is different from any other. In a liquid, for example, atoms are equally likely to be found at any point in space. This is a continuous symmetry, as the conditions are the same at any point along the spatial continuum. If the liquid solidifies into a crystal, that symmetry is broken: Atoms are found only at certain regularly spaced positions, with voids in between. Likewise, on a microscopic level a crystal looks different from different angles, but a liquid looks the same however it’s rotated. In physics, such broken symmetries underlie topics ranging from magnets to superconductors to the Higgs mechanism, which imbues elementary particles with mass and gives rise to the Higgs boson.

In 2012, theoretical physicist Frank Wilczek of MIT proposed that symmetry breaking in time might produce time crystals (SN: 3/24/12, p. 8). But follow-up work indicated that time crystals couldn’t emerge in a system in a state of equilibrium, which is settled into a stable configuration. Instead, physicists realized, driven systems, which are periodically perturbed by an external force — like the laser flipping the ions — could create such crystals. “The original examples were either flawed or too simple,” says Wilczek. “This is much more interesting.”

Unlike the continuous symmetry that is broken in the transition from a liquid to a solid crystal, in the driven systems that the scientists used to create time crystals, the symmetry is discrete, appearing at time intervals corresponding to the time between perturbations. If the system repeats itself at a longer time interval than the one it’s driven at — as the scientists’ time crystal does — that symmetry is broken.
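
One way to write that schematically (a summary of the idea, not notation from the researchers’ paper), with T the drive period and the angle brackets denoting the measured average spin of ion j:

    \[
    H(t + T) = H(t), \qquad
    \langle \sigma^z_j(t + 2T) \rangle \approx \langle \sigma^z_j(t) \rangle,
    \qquad
    \langle \sigma^z_j(t + T) \rangle \neq \langle \sigma^z_j(t) \rangle .
    \]

The drive respects the discrete symmetry t → t + T; the response only respects t → t + 2T, so that symmetry is broken.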

Time crystals are too new for scientists to have a handle on their potential practical applications. “It’s like a baby, you don’t know what it’s going to grow up to be,” Wilczek says. But, he says, “I don’t think we’ve heard the last of this by a long shot.”
There probably are related systems yet to be uncovered, says Nayak. “We’re just kind of scratching the surface of the kinds of amazing phenomena — such as time crystals — that we can have in nonequilibrium quantum systems. So I think it’s the first window into a whole new arena for us to explore.”

Interactive map reveals hidden details of the Milky Way

There’s much more to the universe than meets the eye, and a new web-based app lets you explore just how much our eyes are missing. Gleamoscope presents the night sky across a range of electromagnetic frequencies. Spots of gamma rays pinpoint distant feeding black holes. Tendrils of dust glow with infrared light throughout the Milky Way. A supernova remnant — the site of a star that exploded roughly 11,000 years ago — blasts out X-rays and radio waves.

Many of these phenomena are nearly imperceptible in visible light. So astronomers use equipment, such as specialized cameras and antennas, that can detect other frequencies of electromagnetic radiation. Computers turn the data into images, often assigning colors to certain frequencies to highlight specific details or physical processes.
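
The color-assignment step itself can be sketched in a few lines. The snippet below is a generic illustration of how a false-color composite is built, not Gleamoscope’s actual pipeline; the choice of bands, the random stand-in maps and the percentile stretch are all assumptions:

    import numpy as np

    rng = np.random.default_rng(2)
    radio = rng.random((256, 256))      # stand-in sky maps, one per frequency band
    infrared = rng.random((256, 256))
    xray = rng.random((256, 256))

    def stretch(band):
        """Rescale a band to the 0-1 range so faint structure becomes visible."""
        lo, hi = np.percentile(band, [1, 99])
        return np.clip((band - lo) / (hi - lo), 0, 1)

    # Assign each invisible band to a visible color channel: radio -> red,
    # infrared -> green, X-rays -> blue, then stack into one displayable image.
    rgb = np.dstack([stretch(radio), stretch(infrared), stretch(xray)])
    print(rgb.shape)   # (256, 256, 3), ready to display or save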

In Gleamoscope, a slider smoothly transitions the scene from one frequency of light to another, turning the familiar star-filled night sky into a variety of psychedelic landscapes. Pan and magnification controls allow you to scan all around the night sky and zoom in for a closer look. The interactive map combines images from many observatories and includes new data from the Murchison Widefield Array, a network of radio antennas in Australia. Over 300,000 galaxies appear as dots in images of the new radio data, described in an upcoming issue of Monthly Notices of the Royal Astronomical Society. The radio map by itself can also be explored on mobile devices in a separate app called GLEAM, available on Google Play.

The year of gravitational waves, Zika and more

There’s no bow or festive wrap, but I hope that you will consider this issue a gift of sorts. That is how the staff of Science News thinks of it, our year-end recap of the top science stories. In these pages, you’ll find the stories that continued to resonate well after we first covered them and many that we expect will resonate for years to come — all collected in one easy-to-read, extremely portable, no-batteries-required package (unless you are reading this on a smartphone or tablet, that is).
Gravitational waves, of course, occupy the top spot on our list this year. The “of course” reflects the fundamental importance of the detection of this elusive form of energy, announced in February. The finding confirmed key theories in physics, sure, but even more exciting is what it promises for the future. Gravitational waves are powerful tools for probing the universe. Just as the Hubble Space Telescope revealed cosmic beauty in electromagnetic radiation, gravitational wave detectors may show scientists an unprecedented view of far-off corners of the cosmos.
Closer to home, the Zika virus became one of our most closely watched stories this year, as the extent of human suffering caused by the mosquito-borne virus became clear. But it’s also a tale of progress: Scientists have responded swiftly, creating a robust literature on the virus in a short time. We still don’t have all the answers, but we’ve come a long way in terms of creating the knowledge urgently needed to inform health recommendations.

Other stories made this year’s list with a more mixed pedigree. The discovery of a (relatively) nearby exoplanet energized many of our science fiction–fueled fantasies of other worlds, for instance. Research moved ahead on what some call “three-parent babies” — using mitochondrial donors to replace a woman’s own disease-prone mitochondria in egg cells — despite a lack of clarity on the procedure’s efficacy. Melting Arctic sea ice has led to a historically significant opening of passageways between the Pacific and the Atlantic oceans. New hope for the battle against Alzheimer’s disease seemed worthy of mention. All these developments and more were regarded by Science News reporters and editors as milestones of discovery or news of importance to society.

We also decided to add some other elements to our year-in-review coverage for 2016. Guided by the deft hand of Beth Quill, our enterprise editor, we augmented our Top 10 list with an essay by managing editor Tom Siegfried about two of physics’s noteworthy recent failures and how the two are related. Science journalist and author Sonia Shah offers a roundup of 2016 in public health, reminding us of the thorny problems associated with infectious diseases, from antibiotic resistance to the resurgence of yellow fever. Other pieces illustrate some of the challenges facing the driverless car revolution, as well as what Science News reporters see on the horizon for the coming year.

We have tried to pack as much science as possible into this issue, from the biggest stories to the more obscure nuggets of discovery and surprise. I can’t think of a gift I’d more like to receive.

Ancient otter of unusual size unearthed in China

Fossils of a giant otter have emerged from the depths of an open-pit mine in China.

The crushed cranium, jaw bone and partial skeletons of at least three animals belong to a now-extinct species of otter that lived some 6.2 million years ago, scientists report January 23 in the Journal of Systematic Palaeontology.

At roughly 50 kilograms in weight, the otter would have outclassed today’s giant otter, a river-dwelling South American mammal weighing in at around 34 kilograms. Scientists named the new species Siamogale melilutra, a nod to its unusual mix of badger and otter features. Melilutra is a mash-up of meles and lutra, the Latin words for badger and otter.

Badgers and otters both belong to a group of carnivorous animals called Mustelidae, but scientists have had trouble figuring out where to place extinct members in the mammalian family tree. (European badgers and modern otters share similar-looking teeth and skulls.) Still, Siamogale melilutra, however badgerlike, is indeed an otter, researchers concluded after CT scanning, reconstructing and analyzing the fossil skull.

Based on plant and animal fossils found near the collection site, scientists believe that the ancient otter probably lived in the shallow lake of a warm and humid swamp, lush with broad-leaved evergreens and grasses.