50 years ago, a pessimistic view of heart transplants

Now that heart recipients can realistically look forward to leaving the hospital and taking up a semblance of normal life, the question arises, what kind of semblance, and for how long? South Africa’s Dr. Christiaan Barnard, performer of the first heart transplant, has a sobering view…. “A transplanted heart will last only five years — if we’re lucky.” — Science News, September 14, 1968

Update
Barnard didn’t need to be so disheartening. Advances in drugs that suppress the immune system and keep blood pressure down have helped to pump up life expectancy after a heart transplant. Now, more than half of patients who receive a donated ticker are alive 10 years later. A 2015 study found 21 percent of recipients still alive 20 years post-transplant. In 2017, nearly 7,000 people across 46 countries got a new heart, according to the Global Observatory on Donation and Transplantation.

New images reveal how an ancient monster galaxy fueled furious star formation

New images of gas churning inside an ancient starburst galaxy help explain why this galactic firecracker underwent such frenzied star formation.

Using the Atacama Large Millimeter/submillimeter Array, or ALMA, researchers have taken the most detailed views of the disk of star-forming gas that permeated the galaxy COSMOS-AzTEC-1, which dates back to when the universe was less than 2 billion years old. The telescope observations, reported online August 29 in Nature, reveal an enormous reservoir of molecular gas that was highly susceptible to collapsing and forging new stars.

COSMOS-AzTEC-1 and its starburst contemporaries have long puzzled astronomers, because these galaxies cranked out new stars about 1,000 times as fast as the Milky Way does. According to standard theories of cosmology, galaxies shouldn’t have grown up fast enough to be such prolific star-formers so soon after the Big Bang.

Inside a normal galaxy, the outward pressure of radiation from stars helps counteract the inward pull of gas’s gravity, which pumps the brakes on star formation. But in COSMOS-AzTEC-1, the gas’s gravity was so intense that it overpowered the feeble radiation pressure from stars, leading to runaway star formation. The new ALMA pictures unveil two especially large clouds of collapsing gas in the disk, which were major hubs of star formation.

“It’s like a giant fuel depot that built up right after the Big Bang … and we’re catching it right in the process of the whole thing lighting up,” says study coauthor Min Yun, an astronomer at the University of Massachusetts Amherst.

Yun and colleagues still don’t know how COSMOS-AzTEC-1 stocked up such a massive supply of star-forming material. But future observations of the galaxy and its ilk using ALMA or the James Webb Space Telescope, set to launch in 2021, may help clarify the origins of these ancient cosmic monsters (SN Online: 6/11/14).

How plant microbes could feed the world and save endangered species

One fine Hawaiian day in 2015, Geoff Zahn and Anthony Amend set off on an eight-hour hike. They climbed a jungle mountain on the island of Oahu, swatting mosquitoes and skirting wallows of wild pigs. The two headed to the site where a patch of critically endangered Phyllostegia kaalaensis had been planted a few months earlier. What they found was dispiriting.

“All the plants were gone,” recalls Zahn, then a postdoctoral fellow at the University of Hawaii at Manoa. The two ecologists found only the red flags placed at the site of each planting, plus a few dead stalks. “It was just like a graveyard,” Zahn says.

The plants, members of the mint family but without the menthol aroma, had most likely died of powdery mildew caused by Neoerysiphe galeopsidis. Today the white-flowered plants, native to Oahu, survive only in two government-managed greenhouses on the island. Why P. kaalaensis is nearly extinct is unclear, though both habitat loss and powdery mildew are potential explanations. The fuzzy fungal disease attacks the plants in greenhouses, and the researchers presume it has killed all the plants they’ve attempted to reintroduce to the wild.

Zahn had never encountered extinction (or near to it) so directly before. He returned home overwhelmed and determined to help the little mint.

Just like humans and other animals, plants have their own microbiomes: the bacteria, fungi and other microorganisms living on and in the plants. Some, like the mildew, attack; others are beneficial. A single leaf hosts millions of microbes, sometimes hundreds of different types. The ones living within the plant’s tissues are called endophytes. Plants acquire many of these microbes from the soil and air; some are passed from generation to generation through seeds.

The friendly microbes assist with growth and photosynthesis or help plants survive in the face of drought and other stressors. Some protect plants from disease or from plant-munching animals. Scientists like Zahn are investigating how these supportive communities might help endangered plants in the wild, like the mint on the mountain, or improve output of crops ranging from breadbasket wheat to tropical cacao.

Beyond the garden store
Certain microbial plant partners are well-known, and there are scores of microbial products already on the market. Gardeners, for instance, can spike their watering pails with microbes to encourage flowering and boost plant immunity. But “we know very little about how the products out there actually do work,” says Jeff Dangl, a geneticist at the University of North Carolina at Chapel Hill. “None of those garden supply store products have proven useful at large scale.”

Big farms can use microbial treatments. The main one applied broadly in large-scale agriculture helps roots collect nitrogen, Dangl says, which plants use to produce chlorophyll for photosynthesis.

Farmers may soon have many more microbial helpers to choose from. Scientists studying plant microbiomes have described numerous unfamiliar plant partners in recent decades. Those researchers say they’ve only scratched the surface of possibilities. Many start-up companies are researching and releasing novel microbial treatments. “The last five years have seen an explosion in this,” says Dangl, who cofounded AgBiome, which soon plans to market a bacterial treatment that combats fungal diseases. Agricultural giants like Bayer AG, which recently bought Monsanto, are also investing hundreds of millions of dollars in potential microbial treatments for plants.

The hope is that microbes can provide the next great revolution in agriculture — a revolution that’s sorely needed. With the human population predicted to skyrocket from today’s 7.6 billion to nearly 10 billion by 2050, our need for plant-based food, fibers and animal feed is expected to double.

“We’re going to need to increase yield,” says Posy Busby, an ecologist at Oregon State University in Corvallis. “If we can manage and manipulate microbiomes … this could potentially represent an untapped area for increasing plant yield in agricultural settings.” Meanwhile, scientists like Zahn are eyeing the microbiome to save endangered plants.

But before microbiome-based farming and conservation can truly take off, many questions need answers. Several revolve around the complex interactions between plants, their diverse microbial denizens and the environments they live in. One concern is that the microbes that help some plants might, under certain conditions, harm others elsewhere, warns microbiologist Luis Mejía of the Institute of Scientific Research and High Technology Services in Panama City.

Save the chocolate
Cacao crops — and thus humankind’s precious M&M’s supply — are under constant threat from undesirable fungi, such as Phytophthora palmivora, which causes black pod rot. But there are good guys in cacao’s microbiome too, particularly the fungus Colletotrichum tropicale, which seems to protect the trees.

Natalie Christian, as a graduate student at Indiana University Bloomington, traveled to the Smithsonian Tropical Research Institute on Panama’s Barro Colorado Island in 2014 to study how entire communities of microbes colonize and influence cacao plants (Theobroma cacao). Christian suspected that the prime source of a young cacao tree’s microbiome would be the dead and decaying leaves on the rainforest or orchard floor.

To test this hunch and see what kind of protection microbes picked up from leaf litter might offer, Christian raised fungus-free cacao seedlings in a lab incubator. When the plants reached about half a meter tall, she placed them in pots outside, surrounding some with leaf litter from a healthy cacao tree, some with litter from other kinds of trees and some with no litter at all.

After two weeks, she brought the plants back into the greenhouse to analyze their microbiomes. She found nearly 300 kinds of endophytes, which she, Mejía and colleagues reported last year in Proceedings of the Royal Society B.

The microbiome membership differed between the litter treatments. Plants in pots with either kind of leaf litter possessed less diverse microbiomes than those without litter, probably because the microbes in the litter quickly took over before stray microbes from elsewhere could settle in. These results suggest that a seedling in the shadow of more mature trees will probably accumulate the same microbiome as its towering neighbors.

To see if some of those transferred microbes protect the cacao from disease-causing organisms, Christian rubbed a bit of black pod rot on the leaves of plants in each group. Three weeks later, she measured the size of the rotted spots.

Plants surrounded by cacao litter had the smallest lesions. Those with litter from other kinds of trees had slightly more damage, and plants with no litter had about double the damage of the plants given the other trees’ litter.

“Getting exposed to the litter of their mother or their own kind had a very strong beneficial effect on the resistance of these young plants,” says plant biologist Keith Clay of Tulane University in New Orleans, a coauthor of the study.

Scientists aren’t sure how the good fungi protect the plants against the rot. It may be that the beneficial fungi simply take up space in or on the leaves, leaving no room for the undesirables, Christian says. Or a protective microbe like C. tropicale might attack a pathogen via some kind of chemical warfare. In the case of cacao, she thinks the most likely explanation is that the good guys act as a sort of vaccine, priming the plant’s immune system to fight off the rot. In support of this idea, Mejía reported in 2014 in Frontiers in Microbiology that C. tropicale causes cacao to turn on defensive genes.

Cacao farmers may need to rethink their practices. The farmers normally clear leaf litter out of orchards to avoid transmitting disease-causing microbes from decaying leaves to living trees, says Christian, now a postdoc at the University of Illinois at Urbana-Champaign. But her work suggests that farmers might do well to at least hold on to litter from healthy trees.

Crop questions
Litter is a low-tech way to spread entire communities of microbes — good and bad. But agricultural companies want to grab only the good microbes and apply them to crops. The hunt for the good guys starts with a stroll through a crop field, says Barry Goldman, vice president and head of discovery at Indigo Ag in Boston. Chances are, you’ll find bigger and hardier plants among the crowd. Within those top performers, Indigo has found endophytes that improve plant vigor and size, and others that protect against drought.

The company, working with cotton, corn, rice, soybeans and wheat, coats seeds with these microbes. Once the seeds germinate, the microbes cover the newborn leaves and can get inside via cuts in the roots or through stomata, tiny breathing holes in the leaves. The process is akin to what happens when a baby travels through the birth canal, picking up beneficial microbial partners from mom along the way.

For example, the first-generation Indigo Wheat, released in 2016, starts from seeds treated with a beneficial microbe. In Kansas test fields, the treatment raised yields by 8 to 19 percent.

Farmers are also reporting improved drought tolerance. During the first six months of 2018, when only two rains fell, participating Kansas farmers gave up on and plowed over fields of struggling regular wheat, but not fields growing Indigo Wheat, Goldman says.

In St. Louis, NewLeaf Symbiotics is interested in bacteria of the genus Methylobacterium. These microbes, found in all plants, are known as methylotrophs because they eat methanol, which plants release as their cells grow. In return for methanol, M-trophs, as NewLeaf calls them, offer plants diverse benefits. Some deliver molecules that encourage plants to grow; others make seeds germinate earlier and more consistently, or protect against problem fungi.

NewLeaf released its first products this year, including Terrasym 401, a seed treatment for soybeans. Across four years of field trials, Terrasym 401 raised yields by more than two bushels per acre, says NewLeaf cofounder and CEO Tom Laurita. One bushel is worth about $9. On farms with thousands of acres, that adds up.

Farmers are pleased, but NewLeaf’s and Indigo’s work is hardly done. Plant microbiome companies all face similar challenges. One is the diverse environments where crops are grown. Just because Indigo Wheat thrives in Kansas doesn’t mean it will outgrow standard varieties in, say, North Dakota. “The big ask for the next-gen ag biotech companies like AgBiome or Indigo … is whether the products will deliver as advertised over a range of field conditions,” Dangl says.

Another issue is that crop fields and plants already have microbiomes. “We’re asking a lot of a microbe, or a mix of microbes, to invade an already-existing ecosystem and persist there and do their job,” Dangl says. Companies will need to make sure their preferred microbes take hold.

And while scientists are well aware that diverse microbial communities cooperate to affect plant health, most companies are working with one kind of microbe at a time. Indigo isn’t yet sure how to approach entire microbiomes, Goldman says, but “we certainly are thinking hard about it.”

Researchers are beginning to address these questions by studying microbes in communities — such as Christian’s leaf-litter microbiomes — instead of as individuals. In the lab, Dangl developed a synthetic community of 188 root microbes. He can apply them to plants under stress from drought or heat, then watch how the communities respond and affect the plants.

A major aim is to identify the factors that determine microbiome membership. What decides who gets a spot on a given plant? How does the plant species and its local environment affect the microbiome? How do plants welcome friendlies and eject hostiles? “This is a huge area of importance,” Dangl says.

There’s some risk in adding microbes to crops while these questions are still unanswered, Mejía cautions. Microbes that are beneficial in one situation could be harmful in other plants or different environments. It’s not a far-fetched scenario: There’s a fungal endophyte of a South American palm tree that staves off beetle infestations when the trees are in the shade. Under the sun, however, the fungus turns nasty, spewing hydrogen peroxide that kills plant tissues.

And although C. tropicale benefits cacao, the genus has a dark side: Many species of Colletotrichum can cause leaf lesions and rotted fruit or flower spots in a variety of plants ranging from avocados to zinnias.

Microbes for conservation
Back in Hawaii, after that disheartening hike to the P. kaalaensis graveyard, Zahn pondered how to protect native plants in wild environments such as Oahu’s mountains.

In people, Zahn considered, antibiotics can damage normal gut microbe populations, leaving a person vulnerable to infection by harmful microbes. P. kaalaensis got similar treatment in the greenhouse, where it received regular doses of fungicide. In retrospect, Zahn realized, that treatment probably left the plants bereft of their natural microbiome and weakened their immune systems, leaving them vulnerable to mildew infection once dropped into the jungle.

For people on antibiotics, probiotics — beneficial bacteria — can help restore balance. Zahn thought a similar strategy, a sort of plant probiotic, could help protect P. kaalaensis in future attempts at moving it outside.

For a probiotic, Zahn looked to a P. kaalaensis cousin, Phyllostegia hirsuta, which can survive in the wild. He put P. hirsuta leaves in a blender and sprayed the slurry over P. kaalaensis growing in an incubator.

Then, Zahn placed a leaf infected with powdery mildew into the incubator’s air intake. The mint plants treated with the P. hirsuta slurry experienced delayed, less severe infections compared with untreated plants, Zahn and Amend, also at the University of Hawaii at Manoa, reported last year in PeerJ. The probiotic had worked.

Zahn used DNA sequencing to identify the microbes in the slurry. Many of the microbiome members probably benefit P. kaalaensis, but he thinks he’s found a major protector: a yeast called Pseudozyma aphidis that lives on leaves. “This yeast normally just passively absorbs nutrients from the environment,” Zahn says. “But given the right victim, it will turn into a vicious spaghetti monster.” When mildew spores land nearby, the yeast grows tentacle-like filaments that appear to envelop and feed on the mildew.

Emboldened by his results, Zahn trekked back to the jungle and planted six slurry-treated plants in April 2016. They survived for about two years, but by May 2018, they were all dead.

“It was still a huge win,” says Nicole Hynson, a community ecologist also at Manoa. After all, P. kaalaensis plants without probiotics last only months. And the probiotics approach might apply beyond one little Hawaiian mint, Hynson adds: “We’re really at the beginning of thinking how we might use the microbiome to address plant restoration.”

Zahn has since moved to Utah Valley University in Orem, where he’s hoping to help endangered cacti with microbes. Meanwhile, he’s left the Phyllostegia project in the hands of Jerry Koko, a graduate student in Hynson’s lab. Koko is studying how the yeast and some root-based fungi protect the plant.

Hynson says their goal is to build “a superplant.” With probiotics on both roots and shoots, an enhanced P. kaalaensis should be well-equipped to grow strong and resist mildew. In greenhouse experiments so far, Koko says, the plants with both types of beneficial fungi seem to sport fewer, smaller powdery mildew patches than plants that received no probiotic treatment.

While the restoration of a little flowering plant, or a few more bushels of soybeans, may seem like small victories, they could herald big things for plant microbiomes in conservation as well as agriculture. The farmers and conservationists of the future may find themselves seeding and tending not just plants, but their microscopic helpers, too.

Baby Jupiter glowed so brightly it might have desiccated its moon

THE WOODLANDS, TEXAS — A young, ultrabright Jupiter may have desiccated its now hellish moon Io. The planet’s bygone brilliance could have also vaporized water on Europa and Ganymede, planetary scientist Carver Bierson reported March 17 at the Lunar and Planetary Science Conference. If true, the findings could help researchers narrow the search for icy exomoons by eliminating unlikely orbits.

Jupiter is among the brightest specks in our night sky. But past studies have indicated that during its infancy, Jupiter was far more luminous. “About 10 thousand times more luminous,” said Bierson, of Arizona State University in Tempe.

That radiance would have been inescapable for the giant planet’s moons, the largest of which are volcanic Io, ice-shelled Europa, aurora-cowled Ganymede and crater-laden Callisto (SN: 12/22/22, SN: 4/19/22, SN: 3/12/15). The compositions of these four bodies follow a trend: The more distant the moon is from Jupiter, the more ice-rich its body is.

Bierson and his colleagues hypothesized this pattern was a legacy of Jupiter’s past radiance. The team used computers to simulate how an infant Jupiter may have warmed its moons, starting with Io, the closest of the four. During its first few million years, Io’s surface temperature may have exceeded 26° Celsius under Jupiter’s glow, Bierson said. “That’s Earthlike temperatures.”

Any ice present on Io at that time, roughly 4.5 billion years ago, probably would have melted into an ocean. That water would have progressively evaporated into an atmosphere. And that atmosphere, hardly restrained by the moon’s weak gravity, would have readily escaped into space. In just a few million years, Io could have lost as much water as Ganymede may hold today, which may be more than 25 times the amount in Earth’s oceans.

A coruscant Jupiter probably didn’t remove significant amounts of ice from Europa or Ganymede, the researchers found, unless Jupiter was brighter than simulated or the moons orbited closer than they do today.

The findings suggest that icy exomoons probably don’t orbit all that close to massive planets.

Here’s why some Renaissance artists egged their oil paintings

Art historians often wish that Renaissance painters could shell out secrets of the craft. Now, scientists may have cracked one using chemistry and physics.

Around the turn of the 15th century in Italy, oil-based paints replaced egg-based tempera paints as the dominant medium. During this transition, artists including Leonardo da Vinci and Sandro Botticelli also experimented with paints made from oil and egg (SN: 4/30/14). But it has been unclear how adding egg to oil paints may have affected the artwork.

“Usually, when we think about art, not everybody thinks about the science which is behind it,” says chemical engineer Ophélie Ranquet of the Karlsruhe Institute of Technology in Germany.

In the lab, Ranquet and colleagues whipped up two oil-egg recipes to compare with plain oil paint. One mixture contained fresh egg yolk mixed into oil paint, and had a similar consistency to mayonnaise. For the other blend, the scientists ground pigment into the yolk, dried it and mixed it with oil — a process the old masters might have used, according to the scant historical records that exist today. Each medium was subjected to a battery of tests that analyzed its mass, moisture, oxidation, heat capacity, drying time and more.

In both concoctions, the yolk’s proteins, phospholipids and antioxidants helped slow paint oxidation, which can cause paint to turn yellow over time, the team reports March 28 in Nature Communications.

In the mayolike blend, the yolk created sturdy links between pigment particles, resulting in stiffer paint. Such consistency would have been ideal for techniques like impasto, a raised, thick style that adds texture to art. Egg additions also could have reduced wrinkling by creating a firmer paint consistency. Wrinkling sometimes happens with oil paints when the top layer dries faster than the paint underneath, and the dried film buckles over looser, still-wet paint.

The hybrid mediums have some less than eggs-ellent qualities, though. For instance, the eggy oil paint can take longer to dry. If paints were too yolky, Renaissance artists would have had to wait a long time to add the next layer, Ranquet says.

“The more we understand how artists select and manipulate their materials, the more we can appreciate what they’re doing, the creative process and the final product,” says Ken Sutherland, director of scientific research at the Art Institute of Chicago, who was not involved with the work.

Research on historical art mediums can not only aid art preservation efforts, Sutherland says, but also help people gain a deeper understanding of the artworks themselves.

A surprising food may have been a staple of the real Paleo diet: rotten meat

In a book about his travels in Africa published in 1907, British explorer Arnold Henry Savage Landor recounted witnessing an impromptu meal that his companions relished but that he found unimaginably revolting.

As he coasted down a river in the Congo Basin with several local hunter-gatherers, a dead rodent floated near their canoe. Its decomposing body had bloated to the size of a small pig.

Stench from the swollen corpse left Landor gasping for breath. Unable to speak, he tried to signal his companions to steer the canoe away from the fetid creature. Instead, they hauled the supersize rodent aboard and ate it.

“The odour when they dug their knives into it was enough to kill the strongest of men,” Landor wrote. “When I recovered, my admiration for the digestive powers of these people was intense. They were smacking their lips and they said the [rodent] had provided most excellent eating.”

Starting in the 1500s, European and, later, American explorers, traders, missionaries, government officials and others who lived among Indigenous peoples in many parts of the world wrote of similar food practices. Hunter-gatherers and small-scale farmers everywhere commonly ate putrid meat, fish and fatty parts of a wide range of animals. From arctic tundra to tropical rainforests, native populations consumed rotten remains raw, fermented or cooked just enough to singe off fur and create a more chewable texture. Many groups treated maggots as a meaty bonus.

Descriptions of these practices, which still occur in some present-day Indigenous groups and among northern Europeans who occasionally eat fermented fish, aren’t likely to inspire any new Food Network shows or cookbooks from celebrity chefs.

Case in point: Some Indigenous communities feasted on huge decomposing beasts, including hippos that had been trapped in dug-out pits in Africa and beached whales on Australia’s coast. Hunters in those groups typically smeared themselves with the fat of the animal before gorging on greasy innards. After slicing open animals’ midsections, both adults and children climbed into massive, rotting body cavities to remove meat and fat.

Or consider that Native Americans in Missouri in the late 1800s made a prized soup from the greenish, decaying flesh of dead bison. Animal bodies were buried whole in winter and unearthed in spring after ripening enough to achieve peak tastiness.

But such accounts provide a valuable window into a way of life that existed long before Western industrialization and the war against germs went global, says anthropological archaeologist John Speth of the University of Michigan in Ann Arbor. Intriguingly, no reports of botulism or other potentially fatal reactions to microorganisms festering in rotting meat appear in writings about Indigenous groups before the early 1900s. Instead, decayed flesh and fat represented valued and tasty parts of a healthy diet.

Many travelers such as Landor considered such eating habits to be “disgusting.” But “a gold mine of ethnohistorical accounts makes it clear that the revulsion Westerners feel toward putrid meat and maggots is not hardwired in our genome but is instead culturally learned,” Speth says.

This dietary revelation also challenges an influential scientific idea that cooking originated among our ancient relatives as a way to make meat more digestible, thus providing a rich calorie source for brain growth in the Homo genus. It’s possible, Speth argues, that Stone Age hominids such as Neandertals first used cooking for certain plants that, when heated, provided an energy-boosting, carbohydrate punch to the diet. Animals held packets of fat and protein that, after decay set in, rounded out nutritional needs without needing to be heated.

Putrid foods in the diets of Indigenous peoples
Speth’s curiosity about a human taste for putrid meat was originally piqued by present-day hunter-gatherers in polar regions. North American Inuit, Siberians and other far-north populations still regularly eat fermented or rotten meat and fish.

Fermented fish heads, also known as “stinkhead,” are one popular munchy among northern groups. Chukchi herders in the Russian Far East, for instance, bury whole fish in the ground in early fall and let the bodies naturally ferment during periods of freezing and thawing. Fish heads the consistency of hard ice cream are then unearthed and eaten whole.

Speth has suspected for several decades that consumption of fermented and putrid meat, fish, fat and internal organs has a long and probably ancient history among northern Indigenous groups. Consulting mainly online sources such as Google Scholar and universities’ digital library catalogs, he found many ethnohistorical descriptions of such behavior going back to the 1500s. Putrid walrus, seals, caribou, reindeer, musk oxen, polar bears, moose, arctic hares and ptarmigans had all been fair game. Speth reported much of this evidence in 2017 in PaleoAnthropology.

In one recorded incident from late-1800s Greenland, a well-intentioned hunter brought what he had claimed in advance was excellent food to a team led by American explorer Robert Peary. A stench filled the air as the hunter approached Peary’s vessel carrying a rotting seal dripping with maggots. The Greenlander had found the seal where a local group had buried it, possibly a couple of years earlier, so that the body could reach a state of tasty decomposition. Peary ordered the man to keep the reeking seal off his boat.

Miffed at this unexpected rejection, the hunter “told us that the more decayed the seal the finer the eating, and he could not understand why we should object,” Peary’s wife wrote of the encounter.

Even in temperate and tropical areas, where animal bodies decompose within hours or days, Indigenous peoples have appreciated rot as much as Peary’s seal-delivery man did. Speth and anthropological archaeologist Eugène Morin of Trent University in Peterborough, Canada, described some of those obscure ethnohistorical accounts last October in PaleoAnthropology.

Early hominids may have scavenged rotten meat
These accounts undermine some of scientists’ food-related sacred cows, Speth says. For instance, European explorers and other travelers consistently wrote that traditional groups not only ate putrid meat raw or lightly cooked but suffered no ill aftereffects. A protective gut microbiome may explain why, Speth suspects. Indigenous peoples encountered a variety of microorganisms from infancy on, unlike people today who grow up in sanitized settings. Early exposures to pathogens may have prompted the development of an array of gut microbes and immune responses that protected against potential harms of ingesting putrid meat.

That idea requires further investigation; little is known about the bacterial makeup of rotten meat eaten by traditional groups or of their gut microbiomes. But studies conducted over the last few decades do indicate that putrefaction, the process of decay, offers many of cooking’s nutritional benefits with far less effort. Putrefaction predigests meat and fish, softening the flesh and chemically breaking down proteins and fats so they are more easily absorbed and converted to energy by the body.

Given the ethnohistorical evidence, hominids living 3 million years ago or more could have scavenged meat from decomposing carcasses, even without stone tools for hunting or butchery, and eaten their raw haul safely long before fire was used for cooking, Speth contends. If simple stone tools appeared as early as 3.4 million years ago, as some researchers have controversially suggested, those implements may have been made by hominids seeking raw meat and marrow (SN: 9/11/10, p. 8). Researchers suspect regular use of fire for cooking, light and warmth emerged no earlier than around 400,000 years ago (SN: 5/5/12, p. 18).

“Recognizing that eating rotten meat is possible, even without fire, highlights how easy it would have been to incorporate scavenged food into the diet long before our ancestors learned to hunt or process [meat] with stone tools,” says paleoanthropologist Jessica Thompson of Yale University.

Thompson and colleagues suggested in Current Anthropology in 2019 that before about 2 million years ago, hominids were primarily scavengers who used rocks to smash open animal bones and eat nutritious, fat-rich marrow and brains. That conclusion, stemming from a review of fossil and archaeological evidence, challenged a common assumption that early hominids — whether as hunters or scavengers — primarily ate meat off the bone.

Certainly, ancient hominids were eating more than just the meaty steaks we think of today, says archaeologist Manuel Domínguez-Rodrigo of Rice University in Houston. In East Africa’s Olduvai Gorge, butchered animal bones at sites dating to nearly 2 million years ago indicate that hominids ate most parts of carcasses, including brains and internal organs.

“But Speth’s argument about eating putrid carcasses is very speculative and untestable,” Domínguez-Rodrigo says.

Untangling whether ancient hominids truly had a taste for rot will require research that spans many fields, including microbiology, genetics and food science, Speth says.

But if his contention holds up, it suggests that ancient cooks were not turning out meat dishes. Instead, Speth speculates, cooking’s primary value at first lay in making starchy and oily plants softer, more chewable and easily digestible. Edible plants contain carbohydrates, sugar molecules that can be converted to energy in the body. Heating over a fire converts starch in tubers and other plants to glucose, a vital energy source for the body and brain. Crushing or grinding of plants might have yielded at least some of those energy benefits to hungry hominids who lacked the ability to light fires.

Whether hominids controlled fire well enough to cook plants or any other food regularly before around 400,000 to 300,000 years ago is unknown.
Neandertals may have hunted animals for fat
Despite their nutritional benefits, plants often get viewed as secondary menu items for Stone Age folks. It doesn’t help that plants preserve poorly at archaeological sites.

Neandertals, in particular, have a long-standing reputation as plant shunners. Popular opinion views Neandertals as burly, shaggy individuals who huddled around fires chomping on mammoth steaks.

That’s not far from an influential scientific view of what Neandertals ate. Elevated levels of a diet-related form of nitrogen in Neandertal bones and teeth hint that they were committed carnivores, eating large amounts of protein-rich lean meat, several research teams have concluded over the past 30 years.

But consuming that much protein from meat, especially from cuts above the front and hind limbs now referred to as steaks, would have been a recipe for nutritional disaster, Speth argues. Meat from wild, hoofed animals and smaller creatures such as rabbits contains almost no fat, or marbling, unlike meat from modern domestic animals, he says. Ethnohistorical accounts, especially for northern hunters including the Inuit, include warnings about weight loss, ill health and even death that can result from eating too much lean meat.

This form of malnutrition is known as rabbit starvation. Evidence indicates that people can safely consume between about 25 and 35 percent of daily calories as protein, Speth says. Above that threshold, several investigations have indicated that the liver becomes unable to break down chemical wastes from ingested proteins, which then accumulate in the blood and contribute to rabbit starvation. Limits to the amount of daily protein that can be safely consumed meant that ancient hunting groups, like those today, needed animal fats and carbohydrates from plants to fulfill daily calorie and other nutritional needs.

Modern “Paleo diets” emphasize eating lean meats, fruits and vegetables. But that omits what past and present Indigenous peoples most wanted from animal carcasses. Accounts describe Inuit people eating much larger amounts of fatty body parts than lean meat, Speth says. Over the last few centuries, they have favored tongue, fat deposits, brisket, ribs, fatty tissue around intestines and internal organs, and marrow. Internal organs, especially adrenal glands, have provided vitamin C — nearly absent in lean muscle — that prevented anemia and other symptoms of scurvy.

Western explorers noted that the Inuit also ate chyme, the stomach contents of reindeer and other plant-eating animals. Chyme provided at least a side course of plant carbohydrates. Likewise, Neandertals in Ice Age Europe probably subsisted on a fat- and chyme-supplemented diet (SN Online: 10/11/13), Speth contends.

Large numbers of animal bones found at northern European Neandertal sites — often viewed as the residue of ravenous meat eaters — may instead reflect overhunting of animals to obtain enough fat to meet daily calorie needs. Because wild game typically has a small percentage of body fat, northern hunting groups today and over the last few centuries frequently killed prey in large numbers, either discarding most lean meat from carcasses or feeding it to their dogs, ethnographic studies show.

If Neandertals followed that playbook, eating putrid foods might explain why their bones carry a carnivore-like nitrogen signature, Speth suggests. An unpublished study of decomposing human bodies kept at a University of Tennessee research facility in Knoxville called the Body Farm tested that possibility. Biological anthropologist Melanie Beasley, now at Purdue University in West Lafayette, Ind., found moderately elevated tissue nitrogen levels in 10 bodies sampled regularly for about six months. Tissue from those bodies served as a stand-in for animal meat consumed by Neandertals. Human flesh is an imperfect substitute for, say, reindeer or elephant carcasses. But Beasley’s findings suggest that decomposition’s effects on a range of animals need to be studied. Intriguingly, she also found that maggots in the decaying tissue displayed extremely elevated nitrogen levels.

Paleobiologist Kimberly Foecke of George Washington University in Washington, D.C., has also found high nitrogen levels in rotting, maggot-free cuts of beef from animals fed no hormones or antibiotics to approximate the diets of Stone Age creatures (SN: 1/2/19).

Like arctic hunters did a few hundred years ago, Neandertals may have eaten putrid meat and fish studded with maggots, Speth says. That would explain elevated nitrogen levels in Neandertal fossils.

But Neandertal dining habits are poorly understood. Unusually extensive evidence of Neandertal big-game consumption has come from a new analysis of fossil remains at a roughly 125,000-year-old site in northern Germany called Neumark-Nord. There, Neandertals periodically hunted straight-tusked elephants weighing up to 13 metric tons, say archaeologist Sabine Gaudzinski-Windheuser of Johannes Gutenberg University of Mainz in Germany and colleagues.

In a study reported February 1 in Science Advances, her group analyzed patterns of stone-tool incisions on bones of at least 57 elephants from 27 spots near an ancient lake basin where Neandertals lit campfires and constructed shelters (SN: 1/29/22, p. 8). Evidence suggests that Neandertal butchers — much like Inuit hunters — removed fat deposits under the skin and fatty body parts such as the tongue, internal organs, brain and thick layers of fat in the feet. Lean meat from elephants would have been eaten in smaller quantities to avoid rabbit starvation, the researchers argue.

Further research needs to examine whether the Neandertals cooked elephant meat or boiled the bones to extract nutritious grease, Speth says. Mealtime options would have expanded for hominids who could not only consume putrid meat and fat but also heat animal parts over fires, he suspects.

Neandertals who hunted elephants must also have eaten a variety of plants to meet their considerable energy requirements, says Gaudzinski-Windheuser. But so far, only fragments of burned hazelnuts, acorns and blackthorn plums have been found at Neumark-Nord.
Neandertals probably carb-loaded
Better evidence of Neandertals’ plant preferences comes from sites in warm Mediterranean and Middle Eastern settings. At a site in coastal Spain, Neandertals probably ate fruits, nuts and seeds of a variety of plants (SN: 3/27/21, p. 32).

Neandertals in a range of environments must have consumed lots of starchy plants, argues archaeologist Karen Hardy of the University of Glasgow in Scotland. Even Stone Age northern European and Asian regions included plants with starch-rich appendages that grew underground, such as tubers.

Neandertals could also have obtained starchy carbs from the edible, inner bark of many trees and from seaweed along coastlines. Cooking, as suggested by Speth, would have greatly increased the nutritional value of plants, Hardy says. Not so for rotten meat and fat, though Neandertals such as those at Neumark-Nord may have cooked what they gleaned from fresh elephant remains.

There is direct evidence that Neandertals munched on plants. Microscopic remnants of edible and medicinal plants have been found in the tartar on Neandertal teeth (SN: 4/1/17, p. 16), Hardy says.

Carbohydrate-fueled energy helped to maintain large brains, enable strenuous physical activity and ensure healthy pregnancies for both Neandertals and ancient Homo sapiens, Hardy concludes in the January 2022 Journal of Human Evolution. (Researchers disagree over whether Neandertals, which lived from around 400,000 to 40,000 years ago, were a variant of H. sapiens or a separate species.)
Paleo cuisine was tasty
Like Hardy, Speth suspects that plants provided a large share of the energy and nutrients Stone Age folks needed. Plants represented a more predictable, readily available food source than hunted or scavenged meat and fat, he contends.

Plants also offered Neandertals and ancient H. sapiens — whose diets probably didn’t differ dramatically from Neandertals’, Hardy says — a chance to stretch their taste buds and cook up tangy meals.

Paleolithic plant cooking included preplanned steps aimed at adding dashes of specific flavors to basic dishes, a recent investigation suggests. In at least some places, Stone Age people apparently cooked to experience pleasing tastes and not just to fill their stomachs. Charred plant food fragments from Shanidar Cave in Iraqi Kurdistan and Franchthi Cave in Greece consisted of crushed pulse seeds, possibly from starchy pea species, combined with wild plants that would have provided a pungent, somewhat bitter taste, microscopic analyses show.

Added ingredients included wild mustard, wild almonds, wild pistachio and fruits such as hackberry, archaeobotanist Ceren Kabukcu of the University of Liverpool in England and colleagues reported last November in Antiquity.

Four Shanidar food bits date to about 40,000 years ago or more and originated in sediment that included stone tools attributed to H. sapiens. Another food fragment, likely from a cooked Neandertal meal, dates to between 70,000 and 75,000 years ago. Neandertal fossils found in Shanidar Cave are also about 70,000 years old. So it appears that Shanidar Neandertals spiced up cooked plant foods before Shanidar H. sapiens did, Kabukcu says.

Franchthi food remains date to between 13,100 and 11,400 years ago, when H. sapiens lived there. Wild pulses in food from both caves display microscopic signs of having been soaked, a way to dilute poisons in seeds and moderate their bitterness.

These new findings “suggest that cuisine, or the combination of different ingredients for pleasure, has a very long history indeed,” says Hardy, who was not part of Kabukcu’s team.

There’s a hefty dollop of irony in the possibility that original Paleo diets mixed what people in many societies today regard as gross-sounding portions of putrid meat and fat with vegetarian dishes that still seem appealing.

Add beer to the list of foods threatened by climate change

Beer lovers could be left with a sour taste, thanks to the latest in a series of studies mapping the effects of climate change on crops.

Malted barley — a key ingredient in beers including IPAs, stouts and pilsners — is particularly sensitive to warmer temperatures and drought, both of which are likely to increase due to climate change. As a result, average global barley crop yields could drop as much as 17 percent by 2099, compared with the average yield from 1981 to 2010, under the more extreme climate change projections, researchers report October 15 in Nature Plants.
That decline “could lead to, on average, a doubling of price in some countries,” says coauthor Steven Davis, an Earth systems scientist at University of California, Irvine. Consumption would also drop globally by an average of 16 percent, or roughly what people in the United States consumed in 2011.

The results are based on computer simulations projecting climate conditions, plant responses and global market reactions up to the year 2099. Under the mildest climate change predictions, world average barley yields would still go down by at least 3 percent, and average prices would increase about 15 percent, the study says.

Other crops such as maize, wheat, soy and wine grapes are also threatened by globally rising average atmospheric temperatures, as well as by pests emboldened by erratic weather (SN: 2/8/14, p. 3). But there’s still hope for ale aficionados. The study did not account for technological innovations or genetic tweaks that could spare the crop, Davis says.

310-million-year-old fossil blobs might not be jellyfish after all

What do you get when you flip a fossilized “jellyfish” upside down? The answer, it turns out, might be an anemone.

Fossil blobs once thought to be ancient jellyfish were actually a type of burrowing sea anemone, scientists propose March 8 in Papers in Palaeontology.

From a certain angle, the fossils’ features include what appears to be a smooth bell shape, perhaps with tentacles hanging beneath — like a jellyfish. And for more than 50 years, that’s what many scientists thought the animals were.
But for paleontologist Roy Plotnick, something about the fossils’ supposed identity seemed fishy. “It’s always kind of bothered me,” says Plotnick, of the University of Illinois Chicago. Previous scientists had interpreted one fossil feature as a curtain that hung around the jellies’ tentacles. But that didn’t make much sense, Plotnick says. “No jellyfish has that,” he says. “How would it swim?”

One day, looking over specimens at the Field Museum in Chicago, something in Plotnick’s mind clicked. What if the bell belonged on the bottom, not the top? He turned to a colleague and said, “I think this is an anemone.”

Rotated 180 degrees, Plotnick realized, the fossils’ shape — which looks kind of like an elongated pineapple with a stumpy crown — resembles some modern anemones. “It was one of those aha moments,” he says. The “jellyfish” bell might be the anemone’s lower body. And the purported tentacles? Perhaps the anemone’s upper section, a tough, textured barrel protruding from the seafloor.

Plotnick and his colleagues examined thousands of the fossilized animals, dubbed Essexella asherae, unearthing more clues. Bands running through the fossils match the shape of some modern anemones’ musculature. And some specimens’ pointy protrusions resemble an anemone’s contracted tentacles.
“It’s totally possible that these are anemones,” says Estefanía Rodríguez, an anemone expert at the American Museum of Natural History in New York City who was not involved with the work. The shape of the fossils, the comparison with modern-day anemones — it all lines up, she says, though it’s not easy to know for sure.

Paleontologist Thomas Clements agrees. Specimens like Essexella “are some of the most notoriously difficult fossils to identify,” he says. “Jellyfish and anemones are like bags of water. There’s hardly any tissue to them,” meaning there’s little left to fossilize.
Still, it’s plausible that the blobs are indeed fossilized anemones, says Clements, of Friedrich-Alexander-Universität Erlangen-Nürnberg in Germany. He was not part of the new study but has spent several field seasons at Mazon Creek, the Illinois site where Essexella lived some 310 million years ago. Back then, the area was near the shoreline, Clements says, with nearby rivers dumping sediment into the environment, just the kind of place ancient burrowing anemones may have once called home.

Bizarre metals may help unlock mysteries of how Earth’s magnetic field forms

Weird materials called Weyl metals might reveal the secrets of how Earth gets its magnetic field.

The substances could generate a dynamo effect, the process by which a swirling, electrically conductive material creates a magnetic field, a team of scientists reports in the Oct. 26 Physical Review Letters.

Dynamos are common in the universe, producing the magnetic fields of the Earth, the sun and other stars and galaxies. But scientists still don’t fully understand the details of how dynamos create magnetic fields. And, unfortunately, making a dynamo in the lab is no easy task, requiring researchers to rapidly spin giant tanks of a liquefied metal, such as sodium (SN: 5/18/13, p. 26).
First discovered in 2015, Weyl metals are topological materials, meaning that their behavior is governed by a branch of mathematics called topology, the study of shapes like doughnuts and knots (SN: 8/22/15, p. 11). Electrons in Weyl metals move around in bizarre ways, behaving as if they are massless.

Within these materials, the researchers discovered, electrons are subject to the same set of equations that describes the behavior of liquids known to form dynamos, such as molten iron in the Earth’s outer core. The team’s calculations suggest that, under the right conditions, it should be possible to make a dynamo from solid Weyl metals.

It might be easier to create such dynamos in the lab, as they don’t require large quantities of swirling liquid metals. Instead, the electrons in a small chunk of Weyl metal could flow like a fluid, taking the place of the liquid metal.
The result is still theoretical. But if the idea works, scientists may be able to use Weyl metals to reproduce the conditions that exist within the Earth, and better understand how its magnetic field forms.

Skull damage suggests Neandertals led no more violent lives than humans

Neandertals are shaking off their reputation as head bangers.

Our close evolutionary cousins experienced plenty of head injuries, but no more so than late Stone Age humans did, a study suggests. Rates of fractures and other bone damage in a large sample of Neandertal and ancient Homo sapiens skulls roughly match rates previously reported for human foragers and farmers who have lived within the past 10,000 years, concludes a team led by paleoanthropologist Katerina Harvati of the University of Tübingen in Germany.
Males suffered the bulk of harmful head knocks, whether they were Neandertals or ancient humans, the scientists report online November 14 in Nature.

“Our results suggest that Neandertal lifestyles were not more dangerous than those of early modern Europeans,” Harvati says.

Until recently, researchers depicted Neandertals, who inhabited Europe and Asia between around 400,000 and 40,000 years ago, as especially prone to head injuries. Serious damage to small numbers of Neandertal skulls fueled a view that these hominids led dangerous lives. Proposed causes of Neandertal noggin wounds have included fighting, attacks by cave bears and other carnivores and close-range hunting of large prey animals.

Paleoanthropologist Erik Trinkaus of Washington University in St. Louis coauthored an influential 1995 paper arguing that Neandertals incurred an unusually large number of head and upper-body injuries. Trinkaus recanted that conclusion in 2012, though. All sorts of causes, including accidents and fossilization, could have resulted in Neandertal skull damage observed in relatively small fossil samples, he contended (SN: 5/27/17, p. 13).
Harvati’s study further undercuts the argument that Neandertals engaged in a lot of violent behavior, Trinkaus says.

Still, the idea that Neandertals frequently got their heads bonked during crude, close-up attacks on prey has persisted, says paleoanthropologist David Frayer of the University of Kansas in Lawrence. The new report highlights the harsh reality that, for Neandertals and ancient humans alike, “head trauma, no matter the level of technological or social complexity, or population density, was common.”

Harvati’s group analyzed data for 114 Neandertal skulls and 90 H. sapiens skulls. All of these fossils were found in Eurasia and date to between around 80,000 and 20,000 years ago. One or more head injuries appeared in nine Neandertals and 12 ancient humans. After statistically accounting for individuals’ sex, age at death, geographic locations and state of bone preservation, the investigators estimated comparable levels of skull damage in the two species. Statistical models run by the team indicate that skull injuries affected an average of 4 percent to 33 percent of Neandertals, and 2 percent to 34 percent of ancient humans.

Estimated prevalence ranges that large likely reflect factors that varied from one locality to another, such as resource availability and hunting conditions, the researchers say.

Among individuals with head wounds, a larger share of Neandertals than of ancient humans were under age 30. Neandertals may have suffered more head injuries early in life, the researchers say. It’s also possible that Neandertals died more often from head injuries than Stone Age humans did.

Researchers have yet to establish whether Neandertals experienced especially high levels of damage to body parts other than the head, writes paleoanthropologist Marta Mirazón Lahr of the University of Cambridge in a commentary in Nature accompanying the new study.