New sky map charts previously unknown gamma-ray sources

SALT LAKE CITY — A new map of the sky charts the origins of some of the highest energy photons ever detected. Researchers from the High-Altitude Water Cherenkov (HAWC) Observatory released their first year of observations of gamma rays, ultrahigh-energy light particles blasted in our direction from some of the most extreme environments in the universe.

The researchers found 40 gamma-ray sources, a quarter of which hadn’t previously been identified, they reported April 18 at an American Physical Society meeting. The map is “revealing new information about nature’s particle accelerators,” said Brenda Dingus, a leader of the HAWC collaboration. These accelerators include the relics of dead stars, such as supernova remnants, and active galaxies that shoot out blasts of particles, known as blazars.
From its perch on the edge of a dormant volcano in Mexico, HAWC detects gamma rays using 300 tanks of water, which cover an area the size of four football fields and register faint light signals from showers of particles produced when gamma rays slam into Earth’s atmosphere.

The team found new sources in areas that had already been searched by other high-energy gamma-ray telescopes. “That’s a little perplexing,” said Dingus. The discrepancy could be due to the fact that HAWC observes higher energy gamma rays, or that the sources are too spread out for the other telescopes to find.

In a region near a previously known gamma-ray source, the scientists found two other potential sources. They nicknamed the group “the executioner” — the bright gamma ray hot spots in the map bore some resemblance to a sinister human figure. If the name sticks, Dingus said, “it would be the first gamma-ray constellation.”

Words’ meanings mapped in the brain

In the brain, language pops up everywhere.

All across the wrinkly expanse of the brain’s outer layer, a constellation of different regions handles the meaning of language, scientists report online April 27 in Nature.

One region that responds to “family,” “home” and “mother,” for example, rests in a tiny chunk of tissue on the right side of the brain, above and behind the ear. That region and others were revealed by an intricate new map that charts the location of hundreds of areas that respond to words with related meanings.
Such a detailed map hints that humans comprehend language in a way that’s much more complicated — and involves many more brain areas — than scientists previously thought, says Stanford University neuroscientist Russell Poldrack, who was not involved in the work.

In fact, he says, “these data suggest we need to rethink how the brain organizes meaning.”

Scientists knew that different concepts roused action in different parts of the brain, says study coauthor Jack Gallant, a computational neuroscientist at the University of California, Berkeley. But people generally thought that big hunks of the brain each dealt with different concepts separately: one region for concepts related to vision, for example, another for concepts related to emotion. And conventional wisdom said the left hemisphere was most important.

Previous studies, though, tested just single words or sentences, and made only rough estimates of where meaning showed up in the brain, Gallant says. That’s like looking at the world’s countries in Google Maps, instead of zooming in to the street view.

So he and colleagues mapped the activity of some 60,000 to 80,000 pea-sized regions across the brain’s outer layer, or cerebral cortex, as people lay in a functional MRI machine and listened to stories from The Moth Radio Hour. (The program features people telling personal, narrative tales to a live audience.)
“People actually love this experiment,” Gallant says.

It stands out from others because the authors use “real life, complicated stories,” says Princeton University neuroscientist Uri Hasson. “That’s really meaningful to see how the brain operates.”

Gallant’s team used a computer program to decipher the meaning of every 1- to 2-second snippet of the stories and then cataloged where 985 concepts showed up in the brain. Meanings conveyed by different words didn’t just engage the left hemisphere, the team found, but instead switched on groups of nerve cells spread broadly across the brain’s surface. After mapping where meaning, or semantic content, was represented in the brain, the researchers figured out where individual words might show up. Often, the same word appeared in different locations. For instance, the word “top” turned up in a spot with clothing words, as well as in an area related to numbers and measurements.
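
The analysis described above can be caricatured as a voxelwise encoding model: represent each story snippet as a semantic feature vector, then fit a regularized linear model that predicts each small brain region's response from those features. The sketch below is a hypothetical toy version on random stand-in data, not the authors' actual pipeline; every size and name in it is illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-ins: 200 story snippets, 50 semantic features, 10 "voxels".
# (The real study used ~985 concepts and tens of thousands of regions.)
n_snippets, n_features, n_voxels = 200, 50, 10
X = rng.standard_normal((n_snippets, n_features))       # snippet features
true_w = rng.standard_normal((n_features, n_voxels))    # hidden tuning
Y = X @ true_w + 0.1 * rng.standard_normal((n_snippets, n_voxels))  # responses

# Ridge regression fit for all voxels at once: w = (X^T X + aI)^-1 X^T Y.
alpha = 1.0
w_hat = np.linalg.solve(X.T @ X + alpha * np.eye(n_features), X.T @ Y)

# Each voxel's fitted weights describe which semantic features drive it,
# which is what lets a map assign concepts to locations.
pred = X @ w_hat
r = np.corrcoef(pred[:, 0], Y[:, 0])[0, 1]
print(f"voxel 0 fit correlation: {r:.2f}")
```

A voxel whose largest weights sit on, say, clothing-related features would land in the "clothing" patch of such a map; the same word can score on several feature groups, which is consistent with "top" appearing in more than one location.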

The brain maps of the seven participants in the study looked remarkably similar, Gallant says. That could be due to common life experiences: All seven were raised and educated in Western societies. With so few people, the researchers can’t pick out any gender differences, he says, but ideally he’d like to repeat the experiment with 50 or 100 people.

For now, Gallant hopes the map can serve as a resource for other researchers. One day, the work could potentially help those with ALS or locked-in syndrome communicate — by decoding the words in a person’s thoughts. But that’s just one piece of the puzzle, Gallant says. Researchers would also need to devise a method for measuring brain activity that’s portable, unlike MRI machines.

Risky skull surgery done for ritual reasons 6,000 years ago

Surgery has some surprisingly ritual roots.

Between around 6,000 and 4,000 years ago, skilled surgeons in southwestern Russia cut holes the size of silver dollars, or larger, out of the backs of people’s skulls. But the risky procedure wasn’t performed for medical reasons: These skull surgeries fulfilled purely ritual needs, a new study suggests. And those on the cutting end of the procedure usually lived.

Skulls of 13 people previously excavated at seven ancient sites in this region contain surgical holes in the same spot, in the middle of the back of the head, say archaeologist Julia Gresky of the German Archaeological Institute in Berlin and her colleagues. That’s a particularly dangerous location for this kind of skull surgery, also known as trepanation, the scientists report online April 21 in the American Journal of Physical Anthropology. It’s not an area of the skull typically targeted in ancient trepanations, which go back roughly 11,000 years in West Asia.
“There may have been an original medical purpose for these trepanations, which over time changed to a symbolic treatment,” Gresky says.

Archaeologist Maria Mednikova of the Russian Academy of Sciences in Moscow agrees that skulls in Gresky’s new study probably represent cases of ritual trepanation. She previously examined some of the same skulls. Trepanation may have been used in some ancient cultures as part of a rite of passage for people taking on new social roles, Mednikova speculates.

Carving a center hole in the back of people’s heads was a potentially fatal procedure. Surgeons would have needed to know precisely how deep to scrape or grind bone to avoid penetrating a blood-drainage cavity for the brain. They also had to know how to stop potentially fatal bleeding of veins nicked during surgery. The procedure must have been performed as fast as possible to minimize bleeding, the researchers suspect.

Yet 11 of 13 skull openings show signs of healing and bone regrowth, indicating that these individuals survived the operation and often lived for years after. The researchers identify six males and six females in the skull sample. One specimen’s sex couldn’t be determined from skull features.

Most individuals died between ages 20 and 40. One female with a layer of bone that had regrown from the inside border of a trepanation hole died between ages 14 and 16, suggesting she had undergone skull surgery as early as age 10, the researchers estimate.

CT scans, X-rays and analyses of bone surfaces produced no evidence of injuries or brain tumors that could have motivated surgery. Ancient skull surgery intended as a medical treatment often involved holes on the side of the head, near fractures from some type of blow to the head (SN Online: 4/25/08). It’s impossible to determine from bones whether trepanations were aimed at treating chronic headaches, epilepsy, psychological problems or difficulties attributed to evil spirits.

Other evidence, in addition to the risky and unusual location of trepanation holes, points to ritual skull surgeries in southern Russia, Gresky says. Many of these individuals were interred according to special customs, suggesting they ranked high in their societies. For instance, the skulls of seven people buried in a pit at one site had been grouped together near bundled fragments of limb bones in a special display. Incisions on the limb bones indicate that bodies had been dismembered after death before being ritually buried. Of the seven skulls, five display surgical openings at the back of the head. Another contains scrapes from a partial trepanation. Partial trepanations were probably intentional rather than unfinished, with their own cultural significance, Mednikova says.

Trepanation holes on the sides of another six skulls found at the same southern Russian sites were probably made to treat medical conditions, Gresky says. Surgical openings on several of these skulls are located near bone fractures.

Rituals and meanings attached to ancient trepanations in southern Russia will remain mysterious, Mednikova predicts. “We don’t know the myths and religions of tribes that lived there 6,000 years ago.”

U.S. oil and gas boom behind rising ethane levels

A single oil and gas field centered in North Dakota spews 1 to 3 percent of all global ethane emissions, about 230,000 metric tons annually. Based on that snapshot, researchers argue that the recent U.S. oil and gas boom is chiefly to blame for rising levels of ethane, a component of natural gas that can damage air quality and warm the climate.

Flying air-sniffing planes over the Bakken shale in May 2014, atmospheric scientist Eric Kort of the University of Michigan in Ann Arbor and colleagues discovered that ethane emissions were 10 to 100 times larger than expected. The region has been a major contributor to a U-turn in ethane emissions, the researchers report online April 26 in Geophysical Research Letters. Global ethane emissions declined from 14.3 million tons per year in 1984 to around 11.3 million tons per year in 2010. In recent years, however, ethane levels have increased.

Assuming that the Bakken shale’s emissions grew over time as production ramped up over the last few years, the researchers projected the region’s ethane emissions back in time. In 2012, yearly ethane emissions from the shale were large enough to cancel out half of the annual long-term decline in global ethane emissions, the researchers estimate. Additional sources, such as other oil and gas fields, contributed the rest of the increase.
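
As a rough sanity check on the figures quoted in this story (a back-of-envelope sketch, not a calculation from the paper), the 1984–2010 decline works out to an average drop of about 115,000 tons per year, and the Bakken's 2014 output alone is roughly twice that:

```python
# Back-of-envelope check of the ethane figures quoted in this story.
start_year, end_year = 1984, 2010
start_emissions, end_emissions = 14.3e6, 11.3e6   # metric tons per year
annual_decline = (start_emissions - end_emissions) / (end_year - start_year)

bakken_2014 = 230_000  # metric tons per year, from the flight measurements

print(f"average annual decline: {annual_decline:,.0f} t/yr")
print(f"Bakken 2014 vs. annual decline: {bakken_2014 / annual_decline:.1f}x")
```

On this arithmetic, even a fraction of the Bakken's 2014 output in 2012 would have offset a large share of the long-term annual decline, consistent with the researchers' back-projection.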

Ethane typically stays in the atmosphere only around two months before breaking apart in chemical reactions. But in that short time, the gas worsens near-ground air quality and contributes to global warming both directly as a greenhouse gas and indirectly by increasing the amount of time methane, an even more potent greenhouse gas, remains in the atmosphere.

Physicists smash particle imitators

Physicists of all stripes seem to have one thing in common: They love smashing things together. This time-honored tradition has now been expanded from familiar particles like electrons, protons, and atomic nuclei to quasiparticles, which act like particles, but aren’t.

Quasiparticles are formed from groups of particles in a solid material that collectively behave like a unified particle (SN: 10/18/14, p. 22). The first quasiparticle collider, described May 11 in Nature, allows scientists to probe the faux-particles’ behavior. It’s a tool that could potentially lead researchers to improved materials for solar cells and electronics applications.
“Colliding particles is really something that has taught us so much,” says physicist Peter Hommelhoff of the University of Erlangen-Nuremberg in Germany, who was not involved with the research. Colliding quasiparticles “is really interesting and it’s really new and pretty fantastic.”

It’s a challenge to control these fleeting faux-particles. “They are very short-lived and you cannot take them out of their natural habitat,” says physicist Rupert Huber of the University of Regensburg in Germany, a coauthor of the study. But quasiparticles are a useful way for physicists to understand how large numbers of particles interact in a solid.

One quasiparticle, known as a hole, results from a missing electron that produces a void in a sea of electrons. The hole moves around the material, behaving like a positively charged particle. Its apparent movement is the result of many jostling electrons.

The new quasiparticle collider works by slamming holes into electrons. Using a short pulse of light, the researchers created pairs of electrons and holes in a material called tungsten diselenide. Then, using an infrared pulse of light to produce an oscillating electric field, the researchers ripped the electrons and holes apart and slammed them back together again at speeds of thousands of kilometers per second — all within about 10 millionths of a billionth of a second.
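
To get a feel for those scales (an illustrative estimate only; the 3,000 km/s speed below is an assumed midpoint for "thousands of kilometers per second," not a figure from the paper), the electrons and holes cover only tens of nanometers over the whole collision sequence:

```python
# Illustrative scale estimate for the quasiparticle collider.
speed_m_per_s = 3_000e3   # assumed ~3,000 km/s, in meters per second
duration_s = 10e-15       # "10 millionths of a billionth of a second" = 10 fs
distance_nm = speed_m_per_s * duration_s * 1e9  # meters -> nanometers
print(f"distance traveled: about {distance_nm:.0f} nm")
```

That tens-of-nanometers scale is far smaller than any mechanical apparatus could resolve, which is why the experiment steers the quasiparticles with oscillating light fields instead.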

The smashup left its imprint in light emitted in the aftermath, which researchers analyzed to study the properties of the collision. For example, when holes get together with electrons, they can bind into an atomlike state known as an exciton. The researchers used their collider to estimate the excitons’ binding energy — a measure of the effort required to separate the pair.
The collider could be useful for understanding how quasiparticles behave in materials — how they move, interact and collide. Such quasiparticle properties are particularly pertinent for materials used in solar cells, Huber says. When sunlight is absorbed in solar cells, it produces pairs of electrons and holes that must be separated and harvested to produce electricity.

The researchers also hope to study quasiparticles in other materials, like graphene, a sheet of carbon one atom thick (SN: 08/13/11, p. 26). Scientists hope to use graphene to create superthin, flexible electronics, among other applications. Graphene has a wealth of unusual properties, not least of which is that its electrons can be thought of as quasiparticles; unlike typical electrons, they behave like they are massless.

Here’s where 17,000 ocean research buoys ended up

Garbage in, garbage out. But where does all that garbage go? In the oceans, floating bits of debris — everything from plastic bags to Legos — tend to ride along ocean currents to a common destination: one of five major whirling ocean gyres, also known as the ocean garbage patches. Researchers recently got a new look at these gyres thanks to a visualization that combined 35 years’ worth of data on another thing humans drop into the oceans: scientific buoys. The visualization was a finalist in the Data Stories competition sponsored by the American Association for the Advancement of Science. The winners were announced May 5.
Free-floating buoys, released by the National Oceanic and Atmospheric Administration, track temperature, saltiness and other ocean properties. Experts at NASA’s Scientific Visualization Studio combined the movements of more than 17,000 buoys to illustrate the motions of the oceans. The buoys start off scattered across the oceans, with some in neat lines that follow the paths of buoy-deploying research vessels. From this chaos, the buoys begin to migrate into clusters. Over time, most drop off the grid and disappear, but some buoys eventually end up in one of the ocean garbage patches.

The garbage patches aren’t floating landfills of intact soda bottles and yogurt cups. The gyres are instead speckled with tiny plastic bits smaller than grains of rice, as many as 100,000 per square kilometer. All that plastic can end up in fish and serves as a foundation for microbe colonies (SN: 2/20/16, p. 20).

Wiping out gut bacteria impairs brain

Obliterating bacteria in the gut may hurt the brain, too.

In mice, a long course of antibiotics that wiped out gut bacteria slowed the birth of new brain cells and impaired memory, scientists write May 19 in Cell Reports. The results reinforce evidence for a powerful connection between bacteria in the gut and the brain (SN: 4/2/16, p. 23).

After seven weeks of drinking water spiked with a cocktail of antibiotics, mice had fewer newborn nerve cells in a part of the hippocampus, a brain structure important for memory. The mice’s ability to remember previously seen objects also suffered.
Further experiments revealed one way bacteria can influence brain cell growth and memory. Injections of immune cells called Ly6Chi monocytes boosted the number of new nerve cells. The monocytes appear to carry messages from gut to brain, Susanne Wolf of the Max Delbrück Center for Molecular Medicine in Berlin and colleagues found.

Exercise and probiotic treatment with eight types of live bacteria also increased the number of newborn nerve cells and improved memory in mice treated with antibiotics. The results help clarify the toll of prolonged antibiotic treatment, and hint at ways to fight back, the authors write.

Fruit fly’s giant sperm is quite an exaggeration

Forget it, peacocks. Nice try, elk. Sure, sexy feathers and antlers are showy, but the sperm of a fruit fly could be the most over-the-top, exaggerated male ornamentation of all.

In certain fruit fly species, such as Drosophila bifurca, males measuring just a few millimeters produce sperm with a tail as long as 5.8 centimeters, researchers report May 26 in Nature. Adjusted for body size, the disproportionately supersized sperm outdoes such exuberant body parts as pheasant display feathers, deer antlers, scarab beetle horns and the forward-grasping forceps of earwigs.
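
For a sense of scale (a rough illustration; the 3-millimeter body length below is an assumed value for "just a few millimeters," not a measurement from the paper), the sperm tail is roughly 20 times the length of the male that produces it:

```python
# Rough scale comparison; body length is an assumed illustrative value.
sperm_tail_cm = 5.8
body_length_mm = 3.0  # assumed for "just a few millimeters"
ratio = (sperm_tail_cm * 10) / body_length_mm  # convert cm -> mm, then compare
print(f"sperm tail is about {ratio:.0f}x the male's body length")
```

By comparison, a peacock's train is only a modest multiple of its body length, which is why the sperm tops the body-size-adjusted index described below.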
Fruit flies’ giant sperm have been challenging to explain, says study coauthor Scott Pitnick of Syracuse University in New York.

Now he and his colleagues propose that a complex interplay of male and female benefits has driven the evolution of ever-longer sperm in a runaway-train scenario.

Males with longer sperm deliver fewer sperm, bucking a more-is-better trend. Yet, they still manage to transfer a few dozen to a few hundred per mating. And as newly arrived sperm compete to displace those already waiting in a female’s storage organ, longer is better. Fewer sperm per mating means females tend to mate more often, intensifying the sperm-vs.-sperm competition. Females that have the longest storage organs, which favor the longest sperm, benefit too: Males producing greater numbers of megasperm, the researchers found, tend to be the ones with good genes likely to produce robust offspring. “Sex,” says Pitnick, “is a powerful force.”
Among courtship-oriented body ornaments and weapons (red), the giant sperm of fruit flies (Drosophila) are the most disproportionately exaggerated, according to an index adjusted for body size. Higher numbers (bottom axis) indicate greater exaggeration.

Biologist Kate Rubins’ big dream takes her to the space station

When molecular biologist Kate Rubins blasts off from Kazakhstan on June 24, strapped into the Soyuz spacecraft bound for the International Space Station, the trip will cap off seven years of preparing — and 30 years of hoping.

As a child, Rubins plastered her Napa, Calif., bedroom with pictures of the space shuttle, proudly announcing her intention to be an astronaut. A week at Space Camp in Huntsville, Ala., in seventh grade cemented her vision. But by high school, she concluded that astronaut wasn’t “a realistic job,” she says.
Flash forward to 2009: Rubins is running a lab at the Whitehead Institute for Biomedical Research in Cambridge, Mass., focusing on virus-host interactions and viral genomics. A friend points out a NASA ad seeking astronaut candidates, and Rubins’ long-dormant obsession awakens. Since then, she has learned how to fly a T-38 jet, speak Russian to communicate with her cosmonaut crewmates, conduct a spacewalk, operate the robotic arm on the ISS and even fix the habitable satellite’s toilet.

Joining NASA meant leaving her 14-person lab behind. But Rubins gained the rare opportunity to collaborate with dozens of scientists in fields as diverse as cell biology and astrophysics. On the space station, she’ll be “their hands, eyes and ears,” conducting about 100 experiments over five months.

She will, for instance, probe how heart cells behave when gravity doesn’t get in the way. And she’ll test a hand-held DNA sequencer, which reads out the genetic information stored in DNA and will be important to future missions looking for signatures of life on Mars.

At times, Rubins will be both experimenter and subject. In one study, she will observe bone cells in a lab dish, comparing their behavior with what happens in a simulated gravity-free environment on the ground. Because astronauts in space are vulnerable to rapid bone loss, CT scanning before and after the mission will also document changes in Rubins’ own hip bone.

Rubins is particularly eager to examine how liquid behaves in microgravity on a molecular scale. In 2013, Canadian astronaut Chris Hadfield created an Internet sensation when he demonstrated that wringing out a wet washcloth in space caused water to form a bubble that enveloped the cloth and his hands. “It’s incredibly bizarre,” Rubins says. Understanding how fluids move in test tubes in space will help NASA plan for Mars exploration, among other applications.
Before any of the research can begin, Rubins has to get off the ground. As treacherous as accelerating to 17,500 miles per hour may sound, she’s not worried.

“An important part of the training experience is making all the information and skills routine,” she says. She predicts that sitting down in the Soyuz spacecraft, pulling out her procedures and getting ready to launch will feel a lot like going into the lab and picking up a pipette — “a normal day at the office.”

Until the engines turn on, anyway. “I think it’s going to feel different when there’s a rocket underneath.”

Francis Crick’s good luck revolutionized biology

When Francis Crick was 31, he decided he needed to change his luck. As a graduate student in physics during World War II, his research hadn’t gone so well; his experiment was demolished by a bomb. To beat the war, he joined it, working on naval warfare mines for the British Admiralty.

After the war, he sought a new direction.

“There are lots of ways of being unlucky,” Crick told me in an interview in 1998. “One is sticking to things too long. Another is not adventuring at all.”

He decided to adventure.

Molecular biologists everywhere will celebrate that decision on June 8, the centennial of Crick’s 1916 birth in Weston Favell, Northampton, England.

“Crick was one of the central figures, one might say the central figure, in the molecular revolution that swept through biology in the latter half of the 20th century,” science historian Robert Olby wrote in a biographical sketch.

In 1953, at the University of Cambridge, Crick and his collaborator James Watson figured out how life’s most important molecule, deoxyribonucleic acid, was put together. DNA, the stuff that genes are made of, became the most famous of biological molecules. Today the image of its double helix structure symbolizes biology itself. It would be easy to make the case that discovering DNA’s structure was the single greatest event in the history of biology — and always will be. In 1962, Watson and Crick won the Nobel Prize for their work (which was, of course, greatly aided by X-ray diffraction imagery from Rosalind Franklin, who unfortunately died before the Nobel was awarded).

Crick’s DNA adventure began at a time when molecular biology was ripe for revolution. But Crick didn’t know that. His choice was lucky.
“I had no idea when I started that molecular biology would advance so fast,” he said. “No idea at all.”

In fact, Crick very nearly chose a different path. His interest in genes was equaled by his curiosity about the brain. Both were topics that he liked to gossip about.

“But I didn’t know enough about either subject,” he said. He just knew a little bit more about biochemistry.

“I thought ‘Well look, I have a training in physics and I know a bit of chemistry, I don’t know anything about the brain.’” So he decided it would be more sensible to start with genes.

“I thought that problem of what genes were and how they replicate and what they did would last me the rest of my life,” he said.

As it happened, genes did occupy him for a couple of decades. Crick made major contributions to elucidating the genetic code during that time. But he never forgot his interest in the brain, and more specifically, consciousness. In the 1970s, he moved from England to California, where he began consciousness research in San Diego at the Salk Institute for Biological Studies.

Consciousness turned out to be a much tougher problem than understanding genes. In retrospect, Crick could see why.

With genetics, “what really made the thing was the simplicity of the double helix. It wrote the whole research program,” he said. “It probably goes back to near the origin of life, when things had to be simple.” Consciousness appeared on the scene only much later, after the evolution of the brain’s vast complexity.

Nevertheless, Crick perceived parallels between genetics and consciousness as subjects for scientific inquiry. As the 20th century came to an end, he mused that consciousness as a concept remained vague — researchers did not all agree about what the word meant. The situation with genes had at one time been similar.

“In a sense people were just as vague about what genes were in the 1920s as they are now about consciousness,” Crick said. “It was exactly the same. The more professional people in the field, which was biochemistry at that time, thought that it was a problem that was too early to tackle. And I think they were right in the ’20s.”

At the end of the 20th century, research on consciousness found itself in much the same state.

“Everybody agrees that it’s an interesting question,” Crick said, “but there are two points of view among scientists: One is that it isn’t a scientific question and is best left to philosophers. And the other one is that, even if it’s a scientific question, it’s too early to tackle it now.”

Crick tackled it anyway. Until his death in 2004, he worked vigorously on the subject with his collaborator Christof Koch, making substantial inroads into identifying the brain activity associated with conscious awareness. Crick was not lucky enough to solve the problem of consciousness, but he perhaps brought the arrival of that solution a little closer.