Mums are now a flower of a different color. Japanese researchers have added a hint of clear sky to the humble plant’s palette, genetically engineering the first-ever “true blue” chrysanthemum.
“Obtaining blue-colored flowers is the Holy Grail for plant breeders,” says Mark Bridgen, a plant breeder at Cornell University. The results are “very exciting.”
Compounds called delphinidin-based anthocyanin pigments are responsible for the natural blues in such flowers as pansies and larkspur. Mums lack those compounds. Instead, the flowers come in a variety of other colors, evoking fiery sunsets, new-fallen snow and all things chartreuse.

In previous attempts to engineer a blue hue in chrysanthemums — and roses and carnations — researchers inserted the gene for a key enzyme that controls production of these compounds, causing them to accumulate. But the resulting blooms skewed more violet-purple than blue. True blue pigment remained elusive, scientists thought, because its origin was complex; multiple genes have been shown to be involved in its generation.

But Naonobu Noda of the National Agriculture and Food Research Organization in Tsukuba, Japan, and colleagues were surprised to find that inserting only two borrowed genes into chrysanthemums created blue flowers. One gene, from Canterbury bells, got the enzyme process started; the other, from butterfly peas, further tweaked the pigment molecules.
Together, the gene double-team transformed 19 of 32 mums of the Taihei variety (59 percent) from having pink or magenta blooms into blue beauties. Additional analyses revealed that the blue color arose because of molecular interactions between the tweaked pigment and certain colorless compounds naturally found in many plants, including chrysanthemums. The two-part method could possibly be used to produce other blue flowers, the researchers report July 26 in Science Advances.
Nearly 5 percent of U.S. adults misused prescription opioids in 2015, a new study shows.
Based on the National Survey on Drug Use and Health, an in-person survey of more than 50,000 people, researchers estimate that 91.8 million adults, or 37.8 percent, used prescription opioids in 2015. Some 11.5 million people misused the painkillers, and 1.9 million people reported opioid dependence or abuse, Beth Han of the Substance Abuse and Mental Health Services Administration in Rockville, Md., and colleagues report online August 1 in Annals of Internal Medicine.

Relieving pain was the most commonly cited reason for people’s most recent episode of misuse — cited by 66 percent of those reporting misuse, such as using without a prescription, and by nearly 49 percent of those with opioid dependence or abuse. (Respondents could report more than one reason for their last misuse.) These results underscore the need for improved pain management, the authors say.
In a pitch-black rainforest with fluttering moths and crawling centipedes, Christina Warinner dug up her first skeleton. Well, technically it was a full skeleton plus two headless ones, all seated and draped in ornate jewelry. To deter looters, she excavated through the night while one teammate held up a light and another killed as many bugs as possible.
As Warinner worked, unanswerable questions about the people whose skeletons she was excavating flew through her mind. “There’s only so much you can learn by looking with your own eyes at a skeleton,” she says. “I became increasingly interested in all the things that I could not see — all the stories that these skeletons had to tell that weren’t immediately accessible, but could be accessible through science.”
At age 21, Warinner cut her teeth on that incredibly complex sacrificial burial left behind by the Maya in a Belize rainforest. Today, at age 37, the molecular anthropologist scrapes at not-so-pearly whites to investigate similar questions, splitting her time between the University of Oklahoma in Norman and the Max Planck Institute for the Science of Human History in Jena, Germany. In 2014, she and colleagues reported a finding that generated enough buzz to renew interest in an archaeological resource many had written off decades ago: fossilized dental plaque, or calculus. Ancient DNA and proteins in the plaque belong to microbes that could spill the secrets of the humans they once inhabited — what the people ate, what ailed them, perhaps even what they did for a living.
Bacteria form plaque that mineralizes into calculus throughout a person’s life. “It’s the only part of your body that fossilizes while you’re still alive,” notes Warinner. “It’s also the last thing to decay.”
Though plaque is prolific in the archaeological record, most researchers viewed calculus as “the crap you scraped off your tooth in order to study it,” says Amanda Henry, an archaeologist at Leiden University in the Netherlands. With some exceptions, molecular biologists saw calculus as a shoddy source of ancient DNA.
But a few researchers, including Henry, had been looking at calculus for remnants of foods as potential clues to ancient diets. Inspired by some of Henry’s images of starch grains preserved in calculus, Warinner wondered if the plaque might yield dead bacterial structures, perhaps even bacteria’s genetic blueprints.
Her timing couldn’t have been better. Warinner began her graduate studies at Harvard in 2004, just after the sequencing of the human genome was completed; by the time she left in 2010, efforts to survey the human microbiome were in full swing. As a postdoc at the University of Zurich, Warinner decided to attempt to extract DNA from the underappreciated dental grime preserved on the teeth of four medieval skeletons from Germany.

At first, the results were dismal. But she kept at it. “Tina has a very interested, curious and driven personality,” Henry notes. Warinner turned to a new instrument that could measure DNA concentrations in skimpy samples, a Qubit fluorometer. A surprising error message appeared: DNA too high. Dental calculus, it turned out, was chock-full of genetic material. “While people were struggling to pull out human DNA from the skeleton itself, there’s 100 to 1,000 times more DNA in the calculus,” Warinner says. “It was sitting there in almost every skeletal collection untouched, unanalyzed.”

To help her interpret the data, Warinner mustered an army of collaborators from fields ranging from immunology to metagenomics. She and her colleagues found a slew of proteins and DNA snippets from bacteria, viruses and fungi, including dozens of oral pathogens, as well as the full genetic blueprint of an ancient strain of Tannerella forsythia, which still infects people’s gums today. In 2014, in Nature Genetics, Warinner’s team revealed a detailed map of a miniature microbial world on the decaying teeth of those German skeletons.
Later in 2014, her group found the first direct protein-based evidence of milk consumption in the plaque of Bronze Age skeletons from 3000 B.C. That same study linked milk proteins preserved in the calculus of other ancient human skeletons to specific animals — providing a peek into long-ago lifestyles.
“The fact that you can tell the difference between, say, goat milk and cow milk, that’s kind of mind-blowing,” says Laura Weyrich, a microbiologist at the University of Adelaide in Australia, who also studies calculus. Since then, Warinner has found all sorts of odds and ends lurking on archaic chompers, from poppy seeds to paint pigments.

Warinner’s team is still looking at the origins of dairying and its microbial players, but she’s also branching out to the other end of the digestive spectrum. The researchers are looking at ancient DNA in paleofeces, which is exactly what it sounds like — desiccated or semifossilized poop. It doesn’t stay as fresh as plaque in the archaeological record, but she’s managed to find some sites with well-preserved samples. By examining the array of microbes that lived in the excrement and plaque of past humans and their relatives, Warinner hopes to characterize how our microbial communities have changed through time — and how they’ve changed us.
The research has implications for understanding chronic, complex human diseases over time. Warinner’s ancient DNA work “opens up a window on past health,” says Clark Larsen, an anthropologist at Ohio State University.
It’s all part of what Warinner calls “the archaeology of the unseen.”
Editor’s note: This story was corrected on October 4, 2017, to note that the 2014 report on milk consumption was based on protein evidence, not DNA.
A founding father of behavioral economics — a research school that has popularized the practice of “nudging” people into making decisions that authorities deem to be in their best interests — has won the 2017 Nobel Memorial Prize in Economic Sciences.
Richard Thaler, of the University of Chicago Booth School of Business, received the award October 9 as the leader of a discipline that has challenged the long-standing assumption among economists that humans are purely rational and selfish. Instead, he argues, we are driven by simple, often emotionally fueled assumptions that can lead us astray.
“Richard Thaler has pioneered the analysis of ways in which human decisions systematically deviate from traditional economic models,” says cognitive scientist Peter Gärdenfors of Lund University, Sweden, a member of the Economic Sciences Prize Committee.
Thaler argues that, even if people try to make good economic choices, our thinking abilities are limited. In dealing with personal finances, for instance, he finds that most people mentally earmark money into different accounts, say for housing, food, vacations and entertainment. That can lead to questionable decisions, such as saving for a vacation in a low-interest savings account while buying household goods with a high-interest credit card.
At an October 9 news conference at the University of Chicago, Thaler referenced mental accounting in describing what he would do with the roughly $1.1 million award. “Every time I spend any money on something fun, I’ll say it came from the Nobel Prize.”
Thaler’s research has also focused on how judgments about fairness, such as reactions to sudden jumps in the prices of consumer items, affect people’s willingness to buy those items. A third area of his research finds that people’s short-term desires often override long-term plans. A classic example is putting off saving for retirement until later in life.
That research in particular inspired his 2008 book Nudge: Improving Decisions about Health, Wealth and Happiness, coauthored with Cass Sunstein, now at Harvard Law School. Nudging, also known as libertarian paternalism, is a way for public and private institutions to prod people to make certain decisions (SN: 3/18/17, p. 18). For instance, employees more often start saving for retirement early in their careers when offered savings plans that they must opt out of.

Many governments, including those of the United Kingdom and the United States, have funded teams of behavioral economists, called nudge units, to develop ways to nudge people to, say, apply for government benefits or comply with tax laws. A total of 75 nudge units now exist worldwide, Thaler said at the news conference.
Nudging has its roots in a line of research, dubbed heuristics and biases, launched in the 1970s by two psychologists — 2002 economics Nobel laureate Daniel Kahneman of Princeton University and the late Amos Tversky of Stanford University. Investigators in heuristics and biases contend that people can’t help but make many types of systematic thinking errors, such as being overconfident in their decisions.
Thaler, like Kahneman, views the mind as consisting of one system for making rapid, intuitive decisions that are often misleading and a second system for deliberating slowly and considering as much relevant information as possible.
Despite the influence of Thaler’s ideas on research and social policy, they are controversial among decision researchers (SN: 6/4/11, p. 26). Some argue that nudging overlooks the power of simple rules-of-thumb for making decisions that people can learn to wield on their own.
“I don’t think I’ve changed everybody’s minds,” Thaler said. “But many young economists embrace behavioral economics.”
In the United States, cartoon characters are a no-no in cigarette ads, and candy- or fruit-flavored cigarettes can’t be sold. But that’s not the case for e-cigarettes, and these youth-appealing tactics are luring teens who have never used tobacco products to give e-cigs and even cigarettes a try, a new study suggests.
Researchers analyzed surveys of nearly 7,000 kids ages 12 to 17 who had never used a tobacco product as of 2013 to 2014. Teens who recalled seeing or liking e-cigarette ads were 1.6 times as likely to be open to trying e-cigs or to actually try them the next year as kids who didn’t remember the ads, researchers report online March 26 in JAMA Pediatrics. E-cig ads often feature celebrities, cartoons (one product shows a unicorn vomiting a rainbow) or references to sweet flavors, such as Skittles. Past research has shown a link between traditional cigarette advertisements and receptive nonsmoking adolescents going on to light up. Nearly nine out of 10 smokers tried their first cigarette by age 18. Gearing traditional cigarette ads toward teens has been restricted since 1998.
In 2016, more than 2.1 million U.S. middle and high school students reported using e-cigarettes. That same year, an estimated 20.5 million — or four in five — were exposed to e-cigarette ads.
But e-cigarette ads are doing more than hyping vaping, the study suggests. The ads also appeared to nudge some teens and young adults to take up cigarette smoking. Of a larger group of about 10,500 kids ages 12 to 21 who had never used tobacco products, 18 percent recalled seeing or liking e-cigarette ads but not cigarette ads. Five percent of those teens had started to smoke by the next year.
Extrapolating to the U.S. population, “105,000 12- to 21-year-olds appear to have smoked their first cigarette because of the influence of e-cigarette advertising,” says John Pierce, a behavioral epidemiologist at the University of California, San Diego. Previous research has found that teens who use e-cigarettes are more likely to smoke traditional cigarettes (SN: 9/19/15, p. 14). The fact that e-cigarette ads may up the risk of smoking “raises an unprecedented concern for adolescent tobacco control,” addiction psychologist Adam Leventhal and epidemiologist Jessica L. Barrington-Trimis, both of the University of Southern California’s Keck School of Medicine in Los Angeles, write in an accompanying editorial in the journal.
In an interview, Leventhal adds that restricting such advertising is an important target for public health campaigns and policies to limit youth use of tobacco products.
Birds can sense Earth’s magnetic field, and this uncanny ability may help them fly home from unfamiliar places or navigate migrations that span tens of thousands of kilometers.
For decades, researchers thought iron-rich cells in birds’ beaks acted as microscopic compasses (SN: 5/19/12, p. 8). But in recent years, scientists have found increasing evidence that certain proteins in birds’ eyes might be what allows them to see magnetic fields (SN: 10/28/09, p. 12).
Scientists have now pinpointed a possible protein behind this “sixth sense.” Two new studies — one examining zebra finches, published March 28 in Journal of the Royal Society Interface, the other looking at European robins, published January 22 in Current Biology — both single out Cry4, a light-sensitive protein found in the retina. If the researchers are correct, this would be the first time a specific molecule responsible for detecting magnetic fields has been identified in animals.

“This is an exciting advance — we need more papers like these,” says Peter Hore, a chemist at the University of Oxford who has studied chemical reactions involved in bird navigation.
Cry4 is part of a class of proteins called cryptochromes, which are known to be involved in circadian rhythms, or biological sleep cycles (SN: 10/02/17, p. 6). But at least some of these proteins are also thought to react to Earth’s magnetic field thanks to the weirdness of quantum mechanics (SN: 7/23/16, p. 8). The protein’s quantum interactions could help birds sense this field, says Atticus Pinzon-Rodriguez, a biologist at Lund University in Sweden who was involved with the zebra finch study.
To figure out which of three cryptochromes is responsible for this quantum compass, Pinzon-Rodriguez and his colleagues examined the retinas, muscles and brains of 39 zebra finches for the presence of the three proteins Cry1, Cry2 and Cry4. The team found that while levels of Cry1 and Cry2 followed a rhythmic pattern that rose and fell over the day, Cry4 levels remained constant, indicating the protein was being produced steadily.
“We assume that birds use magnetic compasses any time of day or night,” says Lund biologist Rachel Muheim, a coauthor on the zebra finch study.
European robins also showed constant levels of Cry4 during a 24-hour cycle, and higher levels during their migratory season. And the researchers in that study found Cry4 in an area of the robin’s retina that receives a lot of light — a position that would help it work as a compass, the study says.
“We have quite a lot of evidence, but [Cry4] is not proven,” says Henrik Mouritsen, an animal navigation expert at the Institute of Biology and Environmental Sciences in Oldenburg, Germany, who participated in the robin study. More definitive evidence might come from observing birds without a functioning Cry4 protein, to see if they still seem to have an internal compass.
Even then, Hore says, we still may not understand how birds actually perceive magnetic fields. To know, you’d have to be a bird.
There’s a fine line between immersive and unnerving when it comes to touch sensation in virtual reality.
More realistic tactile feedback in VR can ruin a user’s feeling of immersion, researchers report online April 18 in Science Robotics. The finding suggests that the “uncanny valley” — a term that describes how humanoid robots that look almost but not quite human are creepier than their more cartoonish counterparts — also applies to virtual touch (SN Online: 11/22/13).

Experiment participants wearing VR headsets and gripping a controller in each hand embodied a virtual avatar holding the two ends of a stick. At first, users felt no touch sensation. Then, the hand controllers gave equally strong vibrations every half-second. Finally, the vibrations were finely tuned to create the illusion that the virtual stick was being touched in different spots. For instance, stronger vibrations in the right controller gave the impression that the stick was nudged on that side.
Compared with scenarios in which users received either no touch sensations or uniform buzzing, participants reported feeling far less immersed in the virtual environment when they received the realistic, localized touch. This result demonstrates the existence of a tactile uncanny valley, says study coauthor Mar Gonzalez-Franco, a human-computer interaction researcher at Microsoft Research in Redmond, Washington.
But when users were shown a marble touching the virtual stick wherever they felt the localized touch, the participants found this realistic tactile feedback highly immersive rather than bothersome. The finding indicates that rich tactile feedback in VR may need to be paired with other sensory cues that explain the source of the sensation to avoid spooking users, Gonzalez-Franco says.
Better understanding how realistic touch sensations can break the VR illusion may help developers create more engaging virtual environments for games and virtual reality therapy, says Sean Follmer, a human-computer interaction researcher at Stanford University not involved in the study.
A chunk of space rock may have been forged inside a long-lost planet from the early solar system. Tiny pockets of iron and sulfur embedded in diamonds inside the meteorite probably formed under high pressures found only inside planets the size of Mercury or Mars, researchers suggest April 17 in Nature Communications.
The parent planet no longer exists, though — it was smashed to smithereens in the solar system’s violent infancy.
“We probably have in our hands a piece of one of these first planets that have disappeared,” says Philippe Gillet of École Polytechnique Fédérale de Lausanne, or EPFL, in Switzerland.

EPFL physicist Farhang Nabiei, Gillet and their colleagues analyzed minuscule fragments of the Almahata Sitta meteorites. These meteorites are famous for coming from the first-ever asteroid tracked from orbit to ground as it streaked to the Nubian desert in Sudan in 2008 (SN: 4/25/09, p. 13). The meteorites belong to a class called ureilites, whose compositions differ from those of any of the known stony planets in the solar system.

These ureilites contain 100-micrometer diamonds — too large to have formed in the shock of two asteroids colliding. Such diamonds could form, however, inside asteroids at least 1,000 kilometers in diameter, where pressures would be high enough to compress carbon. But the researchers discovered an oddity that made them question whether the gems came from an asteroid at all: The diamonds had grown around even smaller crystals of iron and sulfur, which normally would repel each other like oil and water, says EPFL physicist Cécile Hébert.
Those crystals would be stable only at pressures above 20 gigapascals, almost 200,000 times atmospheric pressure at sea level on Earth. “That can only be at the center of a very large planet” the size of Mercury, about 4,900 kilometers wide, or in the core-mantle boundary of a planet as large as Mars, about 6,800 kilometers wide, Hébert says.
Such planets probably roamed the early solar system some 4 billion years ago. But only a few survived to become the four rocky planets that exist today. Simulations of the early solar system suggest most of these early planets crashed into each other and broke apart in the first 100 million years.
“We are confirming the existence of such former planets,” Gillet says.
Those planets’ existence alone isn’t surprising, says cosmochemist Meenakshi Wadhwa of Arizona State University in Tempe. “This is the first time, though, that there is direct meteoritic evidence for the existence of a large protoplanetary body in the early solar system that is no longer in existence,” she says. Not so fast, says cosmochemist Martin Bizzarro of the Natural History Museum of Denmark in Copenhagen. The protoplanet explanation isn’t the only one possible.
“They’ve done very careful work,” he says, but more needs to be done. Testing for remnant magnetic fields could reveal if the meteorites were once within a large planet’s molten core, for instance. Whether the meteorites came from a protoplanet is “still an open question.”
A DIY universe mimics the physics of the infant cosmos, a team of physicists reports. The researchers hope to use their homemade cosmic analog to help explain the first instants of the universe’s 13.8-billion-year life.
For their stand-in, the scientists created a Bose-Einstein condensate — a state of matter in which atoms are chilled until they all take on the same quantum state. Shaped into a tiny, rapidly expanding ring, the condensate grew from about 23 micrometers in diameter to about four times that size in just 15 milliseconds. The behavior of that widening condensate re-created some of the physics of inflation, a brief period just after the Big Bang during which the universe rapidly ballooned in size (SN Online: 12/11/13) before settling into a more moderate expansion rate.

In physics, seemingly unrelated systems can have similarities under the hood. Scientists have previously used Bose-Einstein condensates to simulate other mysteries of the cosmos, such as black holes (SN: 11/15/14, p. 14). And the comparison between Bose-Einstein condensates and inflation is particularly apt: A hypothetical substance called the inflaton field is thought to drive the universe’s extreme expansion, and particles associated with that field, known as inflatons, all take on the same quantum state, just as atoms do in the condensate.
Scientists still don’t fully understand how inflation progressed, “so it’s hard to know how close our system is to what really happened,” says experimental physicist Gretchen Campbell of the Joint Quantum Institute in College Park, Md. “But the hope is that our system can be a good test-bed” for studying various theories. Already, the scientists have spotted several effects in their system similar to those predicted in the baby cosmos, the team reports April 19 in Physical Review X.
As the scientists expanded the ring, sound waves that were traveling through the condensate increased in wavelength. That change was similar to the way in which light became redshifted — stretched to longer wavelengths and redder colors — as the universe enlarged. Likewise, Campbell and colleagues saw a phenomenon akin to what’s known as Hubble friction, which shows up as a decrease in the density of particles in the early universe. In the experiment, this effect appeared in the guise of a weakening in the strength of the sound waves in the condensate. And inflation’s finale, an effect known as preheating that occurs at the end of the rapid expansion period, also had a look-alike in the simulated universe. In the cosmic picture, preheating occurs when inflatons transform into other types of particles. In the condensate, this showed up as sound waves converting from one type into another: waves that had been sloshing inward and outward broke up into waves going around the ring.
However, the condensate wasn’t a perfect analog of the real universe: In particular, while our universe has three spatial dimensions, the expanding ring didn’t. Additionally, in the real universe, inflation proceeds on its own, but in this experiment, the researchers forced the ring to expand. Likewise, there were subtle differences between each of the effects observed and their cosmic counterparts.
Despite the differences, the analog universe could be useful, says theoretical cosmologist Mustafa Amin of Rice University in Houston. “Who knows?” he says. “New phenomena might happen there that we haven’t thought about in the early universe.”
Sometimes, when research crosses over between very different systems — such as Bose-Einstein condensates and the early universe — “sparks can fly,” Amin says.
A new kind of navigation system could help self-driving cars take the road less traveled.
Most autonomous vehicles being test-driven in cities navigate using 3-D maps that mark every curbside and off-ramp with almost centimeter-level precision (SN Online: 11/21/17). But there are millions of miles of open road that tech companies aren’t likely to plot in such detail any time soon.
Researchers have now developed an autonomous navigation system that guides vehicles without such high-res maps, according to research being presented May 22 at the IEEE International Conference on Robotics and Automation in Brisbane, Australia. Cars equipped with this tech could hit the road for excursions off the beaten path.

The navigation system charts a course down unfamiliar roads much like a human driver would — by continually scanning its surroundings, albeit with a laser sensor, to gauge how close the car is to the edges of the road. Meanwhile, the car also follows a tool akin to a smartphone map app that provides GPS directions to its destination, as well as information about the rules of the road — like speed limits and the positions of stoplights — along the car’s journey.
This system assumes that a car has a clear path down the road, but it could be paired with other existing algorithms that use laser sensing to detect in-road obstacles, like other vehicles or pedestrians, to navigate more heavily trafficked roadways, says study coauthor Teddy Ort, a roboticist at MIT.
Ort and colleagues test-drove a car equipped with this navigation system on a one-lane road winding through a forest in Devens, Mass. The vehicle slowly cruised along a one-kilometer stretch without requiring any human intervention to keep it on the right track. The researchers plan to build a version of this system that can spot lane markings painted on streets, so that the car can drive on roads with more than one lane, Ort says.

The technology may be useful for future self-driving cars on cross-country road trips, though such vehicles would probably still use meticulous 3-D maps to weave through city traffic, says Raghvendra Cowlagi, an aerospace engineer at Worcester Polytechnic Institute in Massachusetts who wasn’t involved in the work.
Self-driving cars with this navigation system may also need other kinds of sensors to work in different conditions, says Alexander Wyglinski, an electrical and computer engineer also at Worcester Polytechnic Institute not involved in the study. Since laser sensors don’t work well in rain or snow, for example, these cars might need additional imaging technologies to drive safely in inclement weather (SN: 12/24/16, p. 34).