Shayan Oveis Gharan finds the shortest route to success

It’s a problem that sounds simple, but the best minds in mathematics have puzzled over it for generations: A salesman wants to hawk his wares in several cities and return home when he’s done. If he’s only visiting a handful of places, it’s easy for him to schedule his visits to create the shortest round-trip route. But the task rapidly becomes unwieldy as the number of destinations increases, ballooning the number of possible routes.

Theoretical computer scientist Shayan Oveis Gharan, an assistant professor at the University of Washington in Seattle, has made record-breaking advances on this puzzle, known as the traveling salesman problem. The problem is famous in mathematical circles for being deceptively easy to describe but difficult to solve. But Oveis Gharan has persisted. “He is relentless,” says Amin Saberi of Stanford University, Oveis Gharan’s former Ph.D. adviser. “He just doesn’t give up.”
Oveis Gharan’s unwavering focus has enabled him to identify connections between seemingly unrelated areas of mathematics and computer science. He scrutinizes the work of the fields’ most brilliant minds and adapts those techniques to fit his purposes. This strategy — bringing new tools to old problems — is the basis for leaps he has made on two varieties of the traveling salesman problem.

“If you want to build a house, you need to have a sledgehammer and a level, a wrench, tape measure,” he says. “You need to have a lot of tools and use them one after another.” Oveis Gharan, age 30, stocks his toolkit with the latest advances in fields with obscure-sounding names, including spectral graph theory, polyhedral theory and geometry of polynomials. And in a twist that only Oveis Gharan saw coming, a recent solution to a long-standing problem originating in quantum mechanics turned out to be the missing piece to one aspect of the salesman’s puzzle.
For a salesman’s tour of five cities, there are just 12 possible routes; it’s easy enough to pick the one that will save the most gas. But for 20 cities, there are 60 quadrillion possibilities, and for 80 cities, there are more routes than the number of atoms in the observable universe. Relying on brute force — calculating the distances of all the possible routes — is intractable for all but the easiest cases. Yet no one has found a simple method that can quickly find the shortest path for any number and arrangement of cities. The quandary has real-world importance: Companies like Amazon and Uber, for example, want to ferry goods and people to many destinations in the most efficient way possible.
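
The counts quoted above follow from a standard formula: a round trip through n cities that may start in any city and run in either direction has (n - 1)!/2 distinct routes. Here is a quick Python check of the figures in the paragraph; the output formatting is just for illustration.

```python
# Sanity check of the route counts quoted above: a round trip through n cities
# that may start anywhere and run in either direction has (n - 1)!/2 distinct routes.
from math import factorial

for n in (5, 20, 80):
    routes = factorial(n - 1) // 2
    print(f"{n:>2} cities: about {routes:.3e} distinct round trips")
```

For 5 cities this gives 12 routes, for 20 cities about 6 × 10^16 (roughly 60 quadrillion), and for 80 cities on the order of 10^116, dwarfing the estimated 10^80 atoms in the observable universe.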

Growing up in his home country of Iran, Oveis Gharan discovered a natural appreciation for challenging puzzles. In middle school, he acquired a book of problems from mathematics Olympiad competitions in the Soviet Union. As a student, “I tend to be one of the slower ones,” Oveis Gharan says, noting that he was usually not the first to grasp a new theorem. But within a few years, he had doggedly plowed through the 200-page book.

The effort also provided Oveis Gharan with his first taste of tool collecting, through collaboration with classmates who joined him in working through the math problems. Oveis Gharan found that solutions come easier when many minds contribute. “Each person thinks and solves problems differently,” he says. “Once someone is exposed to many different ideas and ways of thinking on a problem, that will help a lot to increase the breadth of problem-attacking directions.”

Oveis Gharan attended Sharif University of Technology in Tehran before making his first breakthroughs on the traveling salesman problem as a graduate student at Stanford University. He spent over a year cracking just one thorny facet, before moving on to a postdoctoral fellowship at the University of California, Berkeley.
Rather than attacking the problem head-on, Oveis Gharan works on approximate solutions — routes that are slightly longer than the optimal path but can be calculated in a reasonable amount of time. Since the 1970s, computer scientists have known of a strategy for quickly finding a route that is at most 50 percent longer than the shortest possible path. That record held for decades, until Oveis Gharan tackled it along with Saberi and Mohit Singh, then of McGill University in Montreal.
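
The 1970s strategy alluded to here is presumably Christofides' algorithm, which guarantees a tour at most 50 percent longer than optimal whenever distances obey the triangle inequality. Its minimum-weight matching step is fiddly, so as an illustrative stand-in, below is a minimal Python sketch of the simpler "tree-doubling" heuristic, which carries a weaker guarantee (at most 100 percent longer than optimal under the same assumption). The city coordinates are invented, and this is neither Christofides' algorithm nor any of Oveis Gharan's methods.

```python
# A minimal sketch of the "tree-doubling" TSP heuristic: build a minimum
# spanning tree with Prim's algorithm, then visit cities in depth-first
# preorder. With the triangle inequality, the tour is at most twice optimal.
import math
from itertools import permutations

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def mst_tour(cities):
    n = len(cities)
    in_tree = [False] * n
    parent = [0] * n
    cost = [math.inf] * n
    cost[0] = 0.0
    for _ in range(n):                      # Prim's algorithm
        u = min((i for i in range(n) if not in_tree[i]), key=lambda i: cost[i])
        in_tree[u] = True
        for v in range(n):
            d = dist(cities[u], cities[v])
            if not in_tree[v] and d < cost[v]:
                cost[v], parent[v] = d, u
    children = {i: [] for i in range(n)}
    for v in range(1, n):
        children[parent[v]].append(v)
    tour, stack = [], [0]                   # preorder walk of the tree
    while stack:
        u = stack.pop()
        tour.append(u)
        stack.extend(reversed(children[u]))
    return tour

def tour_length(cities, tour):
    return sum(dist(cities[tour[i]], cities[tour[(i + 1) % len(tour)]])
               for i in range(len(tour)))

cities = [(0, 0), (2, 1), (3, 4), (1, 5), (5, 2), (4, 6)]  # invented coordinates
approx = mst_tour(cities)
best = min(permutations(range(len(cities))), key=lambda t: tour_length(cities, t))
print("approximate tour:", approx, round(tour_length(cities, approx), 2))
print("optimal tour:    ", list(best), round(tour_length(cities, best), 2))
```

The guarantee rests on one observation such heuristics share: deleting any edge from the optimal tour leaves a spanning tree, so the minimum spanning tree can never be longer than the optimal tour, and a tour built by walking that tree inherits a provable bound.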

In a paper published in 2011, the team made what might sound like an infinitesimal improvement, shrinking the 50-percent figure by four hundredths of a trillionth of a trillionth of a trillionth of a trillionth of a percentage point. “People make fun of our paper because of that small improvement,” says Oveis Gharan, “but the thing is that in our area, the actual number is not the major question.”
Instead, the goal is to develop new ideas that can begin to crack the problem open, says Luca Trevisan, a computer scientist at Berkeley. “What’s so important is not the specific algorithm that he has devised, but that there is a whole new set of techniques that can potentially be applied to other problems.” Following the advance, other scientists revisited the traveling salesman problem, and decreased the number significantly; the selected route is now at most 40 percent longer than optimal.

To make his breakthroughs, Oveis Gharan keeps tabs on the scientific literature across a variety of mathematical fields. “Every time new papers or new techniques come out, he’s one of the first people who will pick up the paper and read it,” says Saberi. To discover tools outside his areas of expertise, Oveis Gharan poses pieces of the problem to researchers in other fields.

In 2015, Oveis Gharan and computer scientist Nima Anari, then at Berkeley, made further progress on an approximate solution for a more general, and more challenging, version of the traveling salesman problem. In this version, the distance from point A to point B might not be the same as the distance in the opposite direction — a plausible situation in cities with many one-way streets. Researchers had a way to estimate the optimum tour length, but they didn’t understand how good the estimate was. Oveis Gharan and Anari showed that it was exponentially more accurate than anyone had previously proved.

To make this advance, Oveis Gharan teased out connections to a seemingly unrelated problem in mathematics and quantum mechanics, known as the Kadison-Singer problem. “That was really surprising,” says computer scientist Daniel Spielman of Yale University, part of a team that solved the Kadison-Singer problem in 2013. “There was no obvious connection,” he says. “Shayan is incredibly brilliant and incredibly creative.”

Oveis Gharan is now focused on a further conquest of this version of the traveling salesman problem. Though his new advance helps approximate the optimal tour length, it can’t identify the corresponding route. Next, Oveis Gharan would like to produce an algorithm that can navigate the correct course.

You can bet he’ll continue to add to his tool collection by sampling from related mathematical and computational fields. “The grand plan is: Try to better understand how these different areas are connected to one another,” Oveis Gharan says. “There are many big open problems lying in this intersection.”

Methane didn’t warm ancient Earth, new simulations suggest

Methane wasn’t the cozy blanket that kept Earth warm hundreds of millions of years ago when the sun was dim, new research suggests.

By simulating the ancient environment, researchers found that abundant sulfate and scant oxygen created conditions that kept down levels of methane — a potent greenhouse gas — around 1.8 billion to 800 million years ago (SN: 11/14/15, p. 18). So something other than methane kept Earth from becoming a snowball during this dim phase in the sun’s life. Researchers report on this new wrinkle in the so-called faint young sun paradox (SN: 5/4/13, p. 30) the week of September 26 in the Proceedings of the National Academy of Sciences.

Limited oxygen increases the production of microbe-made methane in the oceans. With low oxygen early in Earth’s history, many scientists suspected that methane was abundant enough to keep temperatures toasty. Oxygen may have been too sparse, though. Recent work suggests that oxygen concentrations at the time were as low as a thousandth their present-day levels (SN: 11/28/14, p. 14).

Stephanie Olson of the University of California, Riverside and colleagues propose that such low oxygen concentrations thinned the ozone layer that blocks methane-destroying ultraviolet rays. They also estimate that high concentrations of sulfate in seawater at the time helped sustain methane-eating microbes. Together, these processes severely limited methane to levels similar to those seen today — far too low to keep Earth defrosted.

New book tells strange tales of evolution

The Wasp That Brainwashed the Caterpillar. By Matt Simon. Penguin Books, $20.
Writer Matt Simon begins his new book with a bleak outlook on life: “In the animal kingdom, life sucks and then you die.” But thanks to evolution — which Simon calls “the most majestic problem-solving force on planet Earth” — some critters have peculiar adaptations that make life suck a little less (though sometimes at the expense of other species).

From mustachioed toads to pink fairy armadillos, Simon’s debut book, The Wasp That Brainwashed the Caterpillar, recounts an eclectic cadre of animals that use creative and often bizarre solutions to find love, a babysitter, a meal or a place to crash.
Take, for instance, the book’s title characters. Technically, it’s the wasp larvae that brainwash the caterpillar. Once a female Glyptapanteles wasp deposits eggs into a living caterpillar, she takes off, leaving the oblivious host to babysit her young. After hatching, some larvae stay behind to release chemicals that manipulate the caterpillar’s brain. Once their siblings erupt from the poor creature’s body, the caterpillar mindlessly protects the youngsters from predators.

Mind control isn’t unique to wasps — flies and even fungi do it, too. But the book is about more than just the seemingly diabolical tactics of parasites. Prey species also have skin, or in some cases snot, in the game.

Hagfish, eel-like fish that scavenge the seafloor, eject thick, slimy mucus to clog the gills of sharks that try to make a meal of the hagfish. And the East African crested rat protects itself from dogs and other predators by slathering its fur with the chewed-up bark of the Acokanthera tree, traditionally used by indigenous hunters to make poison arrows. “A species may gain an edge, but any sort of edge is answered,” Simon writes. And so marches on the arms race of natural selection.

The author never dives deeply into exactly how these creatures evolved. The book is a quick, fun read that’s light on science and heavy on snark (not to mention a lot of anthropomorphizing). Readers familiar with Simon’s column for Wired, “Absurd Creature of the Week,” may already be acquainted with some of these animals. But the book is packed with even more fascinating facts that will both impress readers and creep them out.

Sometimes failure is the springboard to success

Some discoveries originate in failures. Lab failures, of course, can lead to serendipitous findings. Observations that fail to meet your expectations create space for a new idea to take hold. Imperfections — small failures — may tell volumes about how something was made or what it is made of. Exposing flaws in a theory inches scientists closer to a better one. Failure forces us to ask hard questions and look for new answers.
Our cover story follows the aftermath of a recent acknowledgment of a major fail: We haven’t yet taken a complete census of all minerals on Earth. Akin to the search to name all living species on the planet (but less of a moving target), a campaign is under way to add to the more than 5,000 known minerals, freelancer Sid Perkins writes in “Digging Carbon” (SN: 10/15/16, p. 18). It’s a kind of treasure hunt, as these minerals presumably have not yet been found because they are incredibly rare, perhaps existing at only a single location. Especially interesting to rock hounds are the scores of as yet unseen carbon-based minerals predicted to exist by a recent statistical analysis. Hidden in these unexplored gems might lie untold stories about how Earth’s carbon and water cycles have changed over the eons. Just as adding a new bird species to a life list is exciting for bird watchers, finding a new kind of mineral is what many rock hounds aspire to.
Another kind of failure may explain a mysterious missing star, Christopher Crockett reports in “Lost star may be failed supernova” (SN: 10/15/16, p. 8). A giant star, 25 to 30 times as massive as the sun, flared and then fizzled in 2009. Scientists now say it might be a failed supernova, a dying star that didn’t have quite the right stuff to explode and instead went from star straight to black hole. If the star is not just hiding somewhere in the dust, it’s a new cosmic character, a new type of behavior to watch for.

Imperfections in humans’ DNA help make each of us unique. These imperfections, viewed at a population scale, also offer a way (still imperfect in itself) to track ancestry, to get some idea of how human populations moved, mingled and changed in the deep past. In “The Hybrid Factor” (SN: 10/15/16, p. 22), Bruce Bower describes how recent DNA studies of ancient hominids are changing views of human evolutionary history. Early humans, the data show, mated with Neandertals and possibly other hominids, producing viable hybrid offspring. The research gives support to a longtime contention by some paleoanthropologists that certain ancient skeletons might represent human-Neandertal mixes. Further evidence for this point of view is now coming from studies of hybrid baboons and other modern species. Mixing species, it seems, was sometimes a success.

Examining the DNA of wide swaths of living people is also revising ideas about when early humans migrated out of Africa to settle the rest of the globe. Three new studies, described by Tina Hesman Saey in “One Africa exodus populated globe” (SN: 10/15/16, p. 6), suggest that the major ancestral mass migration from Africa occurred between 50,000 and 75,000 years ago. Those migrants succeeded in leaving their genetic mark on all of today’s non-Africans. Other evidence points to earlier, smaller migrations from Africa. Perhaps those were failures in a sense, failing to seed lasting populations in far-off outposts. But, perhaps those earlier, smaller scale treks were just the first steps toward success.

Erasing stigma needed in mental health care

Scientists, politicians, clinicians, police officers and medical workers agree on one thing: The U.S. mental health system needs a big fix. Too few people get the help they need for mental ailments and emotional turmoil that can destroy livelihoods and lives.

A report in the October JAMA Internal Medicine, for instance, concludes that more than 70 percent of U.S. adults who experience depression don’t receive treatment for it.

Much attention focuses on developing better psychiatric medications and talk therapies. But those tactics may not be enough. New research suggests that the longstanding but understudied problem of stigma leaves many of those suffering mental ailments feeling alone, often unwilling to seek help and frustrated with treatment when they do.
“Stigma about mental illness is widespread,” says sociologist Bernice Pescosolido of Indiana University in Bloomington. And the current emphasis on mental ills as diseases of individuals can unintentionally inflame that sense of shame. An effective mental health care system needs to address stigma’s suffocating social grip, investigators say. “If we want to explain problems such as depression and suicide, we have to see them in a social context, not just as individual issues,” Pescosolido says.

Stigma as a mark of disgrace that taints someone in others’ eyes goes back several millennia. Sociologist Erving Goffman wrote in 1963 of stigma as a “spoiled identity” caused by society’s negative attitudes toward conditions such as mental illness. New evidence supports the idea that stigma about psychological problems runs surprisingly deep. What’s more, it filters through families and communities in different ways.

Many depressed people experience their condition primarily as a family predicament, not a brain disease, says a team led by UCLA psychiatrist and medical anthropologist Elizabeth Bromley. Those who seek treatment from primary care physicians feel tremendous shame about depression-related problems, such as being unable to work, that put a burden on their families. They hide their depression and any treatments, fearing rejection by those closest to them, Bromley and her colleagues report in the October Current Anthropology. Even if antidepressants ease symptoms such as insomnia and fatigue, depressed individuals describe the treatment as a Band-Aid stuck on unresolved family fractures, which can include a violent spouse or drug-addicted child.

Bromley’s team examined data from 46 people, representing various ethnic backgrounds and economic classes, identified in primary care clinics in 1996 as having depression. After their diagnosis, participants completed surveys every six months for two years, then at the five-year and nine-year marks. Interviews about symptoms, treatments and coping occurred at a 10-year follow-up.

Only two people described the depression treatment they received as helpful and appropriate to their situation. Both had family and friends who had noticed their depression symptoms and encouraged them to seek help.
The remaining 44 people spoke of depression as a threat to their closest relationships and family standing. They kept treatment secret to avoid intensifying family conflicts and for fear of rejection. Shame and emotional distance from family members remained even if depression treatments had positive effects. Participants commonly spoke of not wanting to burden their families with their condition. Several said that being singled out for treatment, which only required that one take antidepressants or, say, learn relaxation techniques, made them feel more estranged than ever from already fragile families and, what’s more, did nothing to resolve underlying family troubles.

“Individually focused, biomedical approaches can feel stigmatizing to many people with depression,” Bromley says.

Her team’s findings fit with previous observations that stigma discourages many people from discussing depression with their doctors for fear of breaking frayed family ties, writes psychologist Rob Whitley of Montreal’s McGill University in the same issue of Current Anthropology.

Excessively close ties among a network of families can also stoke stigma, researchers find. It can flourish in a wealthy, well-manicured community where everyone knows everyone else, if not in person then by word of mouth, say sociologists Anna Mueller of the University of Chicago and Seth Abrutyn of the University of Memphis.

In one such town, given the fictional name Poplar Grove by the researchers to protect privacy, teenagers struggle mightily under the weight of an “overactive grapevine of gossip.” Parents and peers constantly monitor whether teens live up to a community-wide standard of high academic achievement, the researchers report in the October American Sociological Review. Hard work is admired, but only if it yields superior grades with no signs of extra effort, such as using tutors. Academic struggles, anxiety and depression are stigmatized as signs of imperfection. As a result, most young people fear to seek any help from adults, including parents and teachers. That situation contributed to a rash of 19 suicides among current students and recent graduates of the town’s high school between 2000 and 2015, Mueller and Abrutyn propose.

The pair conducted interviews and focus groups in 2014 and 2015 with 110 volunteers, including teens who grew up in the town and lost a friend to suicide, parents whose children killed themselves, mental health workers in the town and high school teachers and counselors. In public forums held afterward, residents were surprised to hear from Mueller that one of Poplar Grove’s strengths — strong ties among neighbors concerned about the welfare of everyone’s kids — had a dark side. Parents talked about the shame they felt if a child experienced emotional problems and of feeling like bad parents when word got around. Teens expressed intense fear of failing to ace schoolwork and make it seem effortless. Students who had killed themselves were described by friends as having emotionally wilted under those pressures.

Bromley’s and Mueller’s findings underscore the need for mental health services that reach people where they live, Pescosolido says. Local services stand the best chance of getting troubled individuals to see help-seeking as acceptable behavior with the potential to change one’s life for the better.

Possible approaches include training pastors and other religious leaders in how to assist those with mental disorders and establishing public self-help groups and high school clubs devoted to open discussion and support. Local centers housing teams of social workers and counselors able to coordinate care for serious mental disorders would be a big advance, she says.

Job No. 1, Mueller says, involves getting beyond the popular assumption that mental illness and suicide arise solely in individuals. It’s long been known, for example, that chaotic communities where people feel isolated push suicide rates higher. But as Poplar Grove demonstrates, really tight-knit communities can have the same effect. “Deep psychological pain often has family and community sources,” she says.

Blame bad incentives for bad science

Most of us spend our careers trying to meet — and hopefully exceed — expectations. Scientists do too. But the requirements for success in a job in academic science don’t always line up with the best scientific methods. The net result? Bad science doesn’t just happen — it gets selected for.

What does it mean to be successful in science? A scientist gets a job and funding by publishing a lot of high-impact papers with novel findings. Those papers and findings beget awards and funding to do more science — and publish more papers. “The problem that we face is that the incentive system is focused almost entirely on getting research published, rather than on getting research right,” says Brian Nosek, a psychologist at the University of Virginia in Charlottesville.

This idea of success has become so ingrained that scientists are even introduced when they give talks by the number of papers they have published or the amount of grant funding they have, says Marc Edwards, a civil engineer at Virginia Polytechnic Institute and State University in Blacksburg.

But rewarding researchers for the number of papers they publish results in a “natural selection” of sloppy science, new research shows. Equating scientific “success” with the number of publications promotes not just lazy science but also unethical science, another paper argues. Both articles proclaim that it’s time for a culture shift. But with many scientific labs to fund and little money to do it, what does a new, better scientific enterprise look like?

As young scientists apply for tenure-track academic jobs, they may bring an application filled with dozens of papers. Hiring committees can often no longer read or evaluate all of them. So they may come to use numbers as shorthand — numbers of papers published, how many times those papers have been cited and whether the journals the papers are published in are high-impact. “Real evaluation of scientific quality is as hard as doing the science in the first place,” Nosek says. “So, just like everyone else, scientists use heuristics to evaluate each other’s work when they don’t have time to dig into it for a complete evaluation.”

Too much reliance on the numbers means that scientists can — unintentionally or not — game the system. They can publish novel results from experiments with low power and effort. Those novel results inflate publication numbers, increase grant funding and get the scientist a job. Ideally, other scientists would catch this careless behavior in peer review, before the studies are published, weeding out poorly done studies in favor of strong ones. But Paul Smaldino, a cognitive scientist at the University of California, Merced, suspected that when the scientific idea of “meeting expectations” on the job is measured in publication rates, bad science would always win out.

So Smaldino and his colleague Richard McElreath at the Max Planck Institute for Evolutionary Anthropology in Leipzig, Germany, decided to create a computer simulation of the scientific “ecosystem,” based on a model for natural selection in a biological ecosystem. Each “lab” in the simulation was represented by a number. Those labs that best met the parameters for success survived and reproduced, spawning other labs that behaved in the same way. Those labs that didn’t meet expectations “died out.”
The model allowed Smaldino and McElreath to manipulate the definitions of “success.” And when that success was defined as publishing a lot of novel findings, labs succeeded when they did science that was “low effort” — sloppy and probably irreproducible. Research groups doing high-effort, careful science didn’t publish enough. And they went the way of the dinosaurs.
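
The flavor of that simulation can be conveyed with a toy model. The sketch below is written in the spirit of the description above, not taken from the actual Smaldino-McElreath code, and every parameter value is invented: labs that put in less effort publish more, the most-published labs spawn similar offspring labs, and the average effort of the population drifts downward.

```python
# Toy "natural selection of bad science" simulation (illustrative only).
import random

random.seed(1)
N_LABS, GENERATIONS, MUTATION = 100, 50, 0.05
labs = [random.random() for _ in range(N_LABS)]  # effort in (0, 1); 1 = most careful

def papers_published(effort):
    # Careful work takes longer, so publication output falls as effort rises.
    return 10 * (1.1 - effort)

for gen in range(GENERATIONS):
    # Labs "reproduce" with probability proportional to how much they publish.
    weights = [papers_published(e) for e in labs]
    parents = random.choices(labs, weights=weights, k=N_LABS)
    # Offspring inherit the parent's effort level, plus a small mutation.
    labs = [min(1.0, max(0.0, e + random.gauss(0, MUTATION))) for e in parents]
    if gen % 10 == 0 or gen == GENERATIONS - 1:
        print(f"generation {gen:2d}: mean effort = {sum(labs) / len(labs):.2f}")
```

Run long enough, the population converges toward low effort even though no individual lab ever "decides" to cut corners; selection alone does the work.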

Even putting an emphasis on replication — in which labs got half credit for double-checking the findings of other groups — couldn’t save the system. “That was a surprise for us,” Smaldino says. He assumed that if the low-effort labs got caught by failures to replicate, their success would go down. But scientists can’t replicate every single study, and in the simulation, the lazy labs still thrived. “The most successful are still going to low effort,” he explains, “because not everyone gets caught.” Smaldino and McElreath published their findings September 21 in Royal Society Open Science.

“I think the results they get are probably reasonable,” says John Ioannidis, a methods researcher at Stanford University in California. “Once you have bad practices they can propagate and ruin the scientific process and become dominant. And I think there’s some truth to it, unfortunately.”

The publish-or-perish culture may be having negative consequences already, Edwards says. “I’ve … seen ethical researchers leave academia, not enter in the first place or become unethical,” he says. Scientists might slice their research findings thinner, trying to publish more findings with less data, breaking experiments down to the least publishable unit. That in itself is not unethical, but Edwards worries the high stakes place scientists on the edge of a slippery slope, from least publishable units to sliced-and-diced datasets. “With the wrong incentives you can make anyone behave unethically, and academia is no different.”

Using a theoretical model of his own, Edwards and his colleague Siddhartha Roy show that, at some point, the current academic system could lead a critical mass of scientists to cross the line to unethical behavior, corrupting the scientific enterprise and losing the public’s trust. “If we ever reach this tipping point where our institutions become inherently corrupt, it will have devastating consequences for humanity,” Edwards says. “The fate of the world depends as never before on good trustworthy science to solve our problems. Will we be there?” Edwards and Roy report their model September 22 in Environmental Engineering Science.

To stay away from the slippery slope, scientists will need to change what scientific success looks like. Here’s the rub, though. Scientists are the primary people watching scientists work. Whether papers are going through peer review at scientific journals, grant proposals are being weighed by review committees or a scientist is being considered for an academic job, it’s other scientists who guard the gates to scientific success. A single scientist might be publishing papers, peer-reviewing other people’s papers, submitting grants, serving on review committees for other people’s grants, editing a journal, applying for a job and serving on a hiring committee — all at the same time. And so the standards for scientific integrity, for rigorous methods, do not reside with the institutions or the funders or the journals. Those standards are within the scientists themselves. The inmates really do run the scientific asylum.

This is not an inherently bad thing. Science needs people with appropriate expertise to read the highly specialized stuff. But it does mean that a movement for culture change needs to come from within the scientific enterprise itself. “This is more likely to happen if you have a grassroots movement where lots of scientists are convinced and are used to performing research in a given way, leading to more reliable results,” Ioannidis says.

What produces more reliable research, though, still requires … research. “I think these are questions that could be addressed with scientific studies,” Ioannidis says. “This is where I’m interested in taking the research, to get studies that are telling us to [do science] this way, [or] this type of leadership is better…. You can test policies.” Science needs more studies of science.

The first step is admitting that problems exist in the current structure. “We’re bought into it — we invested our whole career into the game as it exists,” Edwards says. “We are taught to be cowards when it comes to addressing these issues, because the personal and professional costs of revealing these problems is so high.” It can be painful to see sloppy science exposed. Especially when that science is performed by colleagues and friends. But Edwards says fixing the system will be worth the pain. “I don’t want to wake up someday and realize I’m in a culture akin to professional cycling, where you have to cheat to compete.”

The solution is to add incentives for having an excellent research process, regardless of outcome, Nosek says. Scientists need to be rewarded, funded and promoted for careful, thorough research — even if it doesn’t produce huge differences and groundbreaking results. Nosek points to ideas like registered reports. These are systems where scientists report their experimental plans and methods to a journal, and the journal accepts the paper — whether or not the research produces any noteworthy results.

Despite his results, Smaldino is optimistic that incentives can change, allowing the best science to rise to the top. “I think science is great,” he says. “I think in general scientists aren’t bad scheming people.” The dire predictions of the models don’t have to come to pass. “This is not a condemnation of science,” Smaldino says. “I love science — there’s no other way to learn a lot of things that are important to learn about the world. But the science we do can always be better.”

Stone adze points to ancient burial rituals in Ireland

A stone chopping tool found in Ireland’s earliest known human burial offers a rare peek at hunter-gatherers’ beliefs about death more than 9,000 years ago, researchers say.

The curved-edge implement, known as an adze, was made to be used at a ceremony in which an adult’s largely cremated remains were interred in a pit, says a team led by archaeologist Aimée Little of the University of York in England. Previous radiocarbon dating of burned wood and a bone fragment from the pit, at a site called Hermitage near the River Shannon, places the material at between 9,546 and 9,336 years old.
A new microscopic analysis revealed a small number of wear marks on the sharpened edge of the still highly polished adze, which was probably attached to a wooden handle, the researchers report online October 20 in the Cambridge Archaeological Journal. Little’s group suspects someone wielded the 19.4-centimeter-long adze to chop wood for a funeral pyre or to fell a tree for a grave marker. A hole dug into the bottom of the riverside pit once held a tall wooden post indicating that a person lay buried there, the scientists suspect.

Once the adze fulfilled its ritual duties, a hard stone was ground across the tool’s sharp edge to render it dull and useless, further microscopic study suggests. The researchers regard this act as a symbolic killing of the adze. The dulled tool blade was then placed in the pit, next to the post grave marker, perhaps to accompany the cremated individual to the afterlife.
“By 9,000 years ago, people in Ireland were making very high quality artifacts specifically to be placed in graves, giving us a tantalizing glimpse of ancient belief systems concerning death and the afterlife,” Little says. Her conclusion challenges a popular assumption among researchers that stone tools found in ancient hunter-gatherers’ graves belonged to the deceased while they were still alive. In that scenario, tools and other grave items played no role in burial activities and rituals.
Archaeologist Erik Brinch Petersen of the University of Copenhagen is skeptical. No other European stone adzes or axes from around 10,000 to 6,000 years ago display blunted edges, Petersen says. That makes it difficult to say how such an unusual artifact was used or whether it was intended to accompany a cremated person to the afterlife. In addition, researchers have found only a few European cremations from the same time period.
Since there was no practical reason to turn an effective tool into a chunk of stone that couldn’t cut, Little responds, intentionally dulling the adze’s edge was likely a ritual act. Whatever the meaning, people in Ireland made polished stone tools several thousand years before such implements achieved widespread use in Europe with the arrival of agriculture, Little says.

Excavations in 2001 revealed the Hermitage burial pit. Two small stone tools lay near the polished adze. A couple more burial pits turned up nearby. One contained cremated remains of an adult human from around 9,000 years ago; the other held roughly 8,600-year-old cremated remnants too fragmentary to enable a species identification.

“Hermitage was a special place known about and returned to over hundreds of years,” Little says.

Sounds and glowing screens impair mouse brains

SAN DIEGO — Mice raised in cages bombarded with glowing lights and sounds have profound brain abnormalities and behavioral trouble. Hours of daily stimulation led to behaviors reminiscent of attention-deficit/hyperactivity disorder, scientists reported November 14 at the annual meeting of the Society for Neuroscience.

Certain kinds of sensory stimulation, such as sights and sounds, are known to help the brain develop correctly. But scientists from Seattle Children’s Research Institute wondered whether too much stimulation or stimulation of the wrong sort could have negative effects on the growing brain.
To mimic extreme screen exposure, mice were blasted with flashing lights and TV audio for six hours a day. The cacophony began when the mice were 10 days old and lasted for six weeks. After the end of the ordeal, scientists examined the mice’s brains.

“We found dramatic changes everywhere in the brain,” said study coauthor Jan-Marino Ramirez. Mice that had been stimulated had fewer newborn nerve cells in the hippocampus, a brain structure important for learning and memory, than unstimulated mice, Ramirez said. The stimulation also made certain nerve cells more active in general.

Stimulated mice also displayed behaviors similar to some associated with ADHD in children. These mice were noticeably more active and had trouble remembering whether they had encountered an object. The mice also seemed more inclined to take risks, venturing into open areas that mice normally shy away from, for instance.

Some of these results have been reported previously by the Seattle researchers, who have now replicated the findings in a different group of mice. Ramirez and colleagues are extending the work by looking for more detailed behavioral changes.

For instance, preliminary tests have revealed that the mice are impatient and have trouble waiting for rewards. When given a choice between a long wait for a good reward of four food pellets and a short wait for one pellet, stimulated mice were more likely to go for the instant gratification than non-stimulated mice, particularly as wait times increased.
Overstimulation didn’t have the same effects on adult mice, a result that suggests the stimulation had a big influence on the developing — but not fully formed — brain.

If massive amounts of audio and visual stimulation do harm the growing brain, parents need to ponder how their children should interact with screens. So far, though, the research is too preliminary to change guidelines (SN Online: 10/23/16).

“We are not in a position where we can give parents advice,” said neuroscientist Gina Turrigiano of Brandeis University in Waltham, Mass. The results are from mice, not children. “There are always issues in translating research from mice to people,” Turrigiano said.

What’s more, early sensory input may not affect all children the same way. “Each kid will respond very, very differently,” Turrigiano said. Those different responses might help explain why some children are more vulnerable to ADHD than others.

There’s still much scientists don’t understand about how sensory input early in life wires the brain. It’s possible that what seems like excessive sensory stimulation early in life might actually be a good thing for some children, sculpting brains in a way that makes them better at interacting with the fast-paced technological world, said Leah Krubitzer of the University of California, Davis. “This overstimulation might be adaptive,” she said. “The benefits may outweigh the deficits.”

Dogs form memories of experiences

Dogs don’t miss much. After watching a human do a trick, dogs remembered it well enough to copy it perfectly a minute later, a new study finds. The results suggest that our furry friends possess some version of episodic memory, which allows them to recall personal experiences, and not just simple associations between, for instance, sitting and getting a treat.

Pet dogs watched a human do something — climb on a chair, look inside a bucket or touch an umbrella. Either a minute or an hour later, the dog was unexpectedly asked to copy the behavior with a “Do it!” command, an imitation that the dogs had already been trained to do. In many cases, dogs were able to obey these surprise commands, particularly after just a minute. Dogs didn’t perform as well when they had to wait an hour for the test, suggesting that the memories grew hazier with time.

Like people, dogs seem to form memories about their experiences all the time, even when they don’t expect to have to use those memories later, study coauthor Claudia Fugazza of Eötvös Loránd University in Budapest and colleagues write November 23 in Current Biology.

Solar panels are poised to be truly green

The solar panel industry has nearly paid its climate debt. The technology will break even in terms of energy usage by 2017 and greenhouse gas emissions by 2018 at the latest, if it hasn’t done so already, researchers calculate.

Building, assembling and installing solar panels consumes energy and produces climate-warming greenhouse gases. Once in use, though, the panels gradually reverse this imbalance by producing green energy.

The manufacturing process has also gotten greener over the last 40 years, environmental scientist Atse Louwen of Utrecht University in the Netherlands and colleagues report December 6 in Nature Communications. Each doubling of the combined energy-generating capacity of all solar panels has coincided with a 12 to 13 percent drop in the energy used during manufacturing and a 17 to 24 percent drop in their carbon footprint.
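
As a rough illustration of the learning-curve arithmetic behind that finding: if the energy needed to make a panel falls about 12 percent with each doubling of cumulative capacity (the low end of the reported range), then after k doublings a panel requires (1 - 0.12)^k times the original energy. The starting value and number of doublings in this sketch are invented.

```python
# Toy learning-curve calculation for solar panel manufacturing energy.
energy = 100.0  # invented starting value, in arbitrary energy units per panel
for k in range(1, 9):
    energy *= 1 - 0.12  # 12 percent drop per doubling of cumulative capacity
    print(f"after {k} doublings: {energy:.1f} units per panel")
```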