40 more ‘intelligence’ genes found

Smarty-pants have 40 new reasons to thank their parents for their powerful brains. By sifting through the genetics of nearly 80,000 people, researchers have uncovered 40 genes that may make certain people smarter. That brings the total number of suspected “intelligence genes” to 52.

Combined, these genetic attributes explain only a very small amount of overall smarts, or lack thereof, researchers write online May 22 in Nature Genetics. But studying these genes, many of which play roles in brain cell development, may ultimately help scientists understand how intelligence is built into brains.

Historically, intelligence research has been mired in controversy, says neuroscientist Richard Haier of the University of California, Irvine. Scientists disagreed on whether intelligence could actually be measured and if so, whether genes had anything at all to do with the trait, as opposed to education and other life experiences. But now “we are so many light-years beyond that, as you can see from studies like this,” says Haier. “This is very exciting and very positive news.”

The results were possible only because of the gigantic number of people studied, says study coauthor Danielle Posthuma, a geneticist at VU University Amsterdam. She and colleagues combined data from 13 earlier studies on intelligence, some published and some unpublished. Posthuma and her team looked for links between intelligence scores, measured in different ways in the studies, and variations held in the genetic instruction books of 78,308 children and adults. Called a genome-wide association study or GWAS, the method looks for signs that certain quirks in people’s genomes are related to a trait.

This technique pointed out particular versions of 22 genes, half of which were not previously known to have a role in intellectual ability. A different technique identified 30 more intelligence genes, only one of which had been previously found. Many of the 40 genes newly linked to intelligence are thought to help with brain cell development. The SHANK3 gene, for instance, helps nerve cells connect to partners.

Together, the genetic variants identified in the GWAS account for only about 5 percent of individual differences in intelligence, the authors estimate. That means that the results, if confirmed, would explain only a very small part of why some people are more intelligent than others. The gene versions identified in the paper are “accounting for so little of the variance that they’re not telling us much of anything,” says differential developmental psychologist Wendy Johnson of the University of Edinburgh.

Still, knowing more about the genetics of intelligence might ultimately point out ways to enhance the trait, an upgrade that would help people at both the high and low ends of the curve, Haier says. “If we understand what goes wrong in the brain, we might be able to intervene,” he says. “Wouldn’t it be nice if we were all just a little bit smarter?”

Posthuma, however, sees many roadblocks. Beyond ethical and technical concerns, basic brain biology is incredibly intricate. Single genes have many jobs. So changing one gene might have many unanticipated effects. Before scientists could change genes to increase intelligence, they’d need to know everything about the entire process, Posthuma says. Tweaking genetics to boost intelligence “would be very tricky.”

Every breath you take contains a molecule of history

Julius Caesar could have stayed home on March 15, 44 B.C. But mocking the soothsayer who had predicted his death, the dictator rode in his litter to Rome’s Forum. There he met the iron daggers of 60 senators.

As he lay in a pool of blood, he may have gasped a final incrimination to his protégé Brutus: You too, my son? Or maybe not. But he certainly would have breathed a dying breath, a final exhalation of some 25 sextillion gas molecules. And it’s entirely possible that you just breathed in one of them.

In fact, calculating the probability of a particle of Caesar’s dying breath appearing in any given liter of air (the volume of a deep breath) has become a classic exercise for chemistry and physics students. If you make a few assumptions about the mixing of gases and the lifetimes of molecules in the atmosphere, it turns out that, on average, one molecule of “Caesar air” — or any other historic liter of air, for that matter — appears in each breath you take.
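For the curious, here is a back-of-the-envelope version of that classic exercise in Python. The molecular counts are rough, commonly used estimates, and the final number swings with them; the point of the exercise is that the answer comes out on the order of one, not zero.

```python
import math

# Rough inputs (order-of-magnitude estimates, not precise values):
N_BREATH = 2.5e22       # molecules in one liter-scale breath (~25 sextillion)
N_ATMOSPHERE = 1.1e44   # total gas molecules in Earth's atmosphere

# Assume Caesar's last breath is by now uniformly mixed through the atmosphere.
# Each molecule you inhale then has probability N_BREATH / N_ATMOSPHERE of
# being one of his, so the expected number per breath is:
expected = N_BREATH * (N_BREATH / N_ATMOSPHERE)

# Modeling the count as Poisson, the chance a given breath holds at least one:
p_at_least_one = 1 - math.exp(-expected)

print(f"expected 'Caesar molecules' per breath: {expected:.1f}")
print(f"probability of at least one: {p_at_least_one:.3f}")
```

With these round numbers the expectation lands at a few molecules per breath; slightly different assumptions about breath size and the atmosphere's inventory push it toward the oft-quoted "about one."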

Author Sam Kean begins his book Caesar’s Last Breath with this exercise, noting that “we can’t escape the air of those around us.” It’s all recycled, and every day we breathe in a bit of our, and Earth’s, history. “The story of Earth,” he writes, “is the story of its gases.”

Kean, author of a best seller about the periodic table, The Disappearing Spoon, then tells that story. As he did in his fascinating portraits of the elements, Kean profiles individual gases such as nitrogen and oxygen primarily through the scientists and entrepreneurs who discovered or sought to harness them. These are quirky men (and they are mostly men) — every bit as obsessed, greedy and brilliant as one could hope for in a page-turner.

Along with lesser-known backstories of textbook heroes such as James Watt, Antoine-Laurent Lavoisier and Albert Einstein (who was surprisingly obsessed with building a better refrigerator), Kean clearly delights in weaving in the unexpected. In the discussion of helium, we learn about Joseph-Michel Montgolfier, the papermaker who was inspired to build the first hot-air balloon as he watched his wife’s pantaloons billowing suggestively above a fire. And in a chapter on the radioactive elements carried in nuclear fallout, there’s Pig 311, a sow that survived a nuclear test blast only to be used as propaganda for the weapons’ supposed safety.

Along the way, Kean threads in the history of Earth’s atmosphere in a surprisingly compelling narrative of geologic history. He steps aside from Lavoisier’s work on life-giving oxygen, for example, to describe the Great Oxygenation Event, which infused the atmosphere a couple billion years ago with a gas that, at the time, was toxic to most living things. The explanations of science here and throughout the book are written clearly and at a level that should be understandable with a high school education. And while they’re straightforward, the explanations have enough depth to be satisfying; by the end of the book, you realize you’ve learned quite a bit.

Even those who rarely read science will enjoy the drama — death, for instance, plays a big role in these stories. Over and over, we learn, men have taken gases’ powers too lightly, or wielded their own power too cruelly, and paid the price. Fritz Haber could have died a hero for finding a way to make fertilizer from the nitrogen in air. Instead, he died broke and loathed for his World War I work on gas warfare.

Then there was Harry Truman — not that Truman, but the one who refused to leave his home when scientists warned of an impending volcanic eruption. Truman contended that officials were “lying like horses trot” right up until Mount St. Helens blew searing gases that erased him from the mountainside.

The links between these stories can seem at first as ephemeral as the gases, but together they tell the story of the birth of the atmosphere and humans’ history in it. In the end, like Caesar’s breath, it all comes full circle.

This history book offers excellent images but skimps on modern science

Books about the history of science, like many other histories, must contend with the realization that others have come before. Their tales have already been told. So such a book is worth reading, or buying, only if it offers something more than the same old stories.

In this case, The Oxford Illustrated History of Science most obviously offers an excellent set of illustrations and photographs from science’s past, from various ancient Egyptian papyri to the Hubble Space Telescope’s ultradeep view of distant galaxies. Some of the images will be familiar to science fans; many others are obscure but apt; nearly all help illustrate various aspects of science’s history.

And yet the pictures, while many may be worth more than 10,000 words, are still just complements to the text. Oxford attempts a novel organization for recounting the story of science: a sometimes hard-to-follow mix of chronological and topical. The first section, “Seeking Origins,” has six chapters that cover ancient Mediterranean science, science in ancient China, medieval science (one chapter for the Islamic world and Europe, one for China), plus the scientific revolution and science in the Enlightenment. The second section, “Doing Science,” shifts to experimenting, fieldwork, biology, cosmology, theory and science communication.

Each chapter has a different author, which has the plus of bringing distinct expertise to each subject but the minus of vast divergence in readability and caliber of content. Some chapters (see “Exploring Nature,” on field science) are wordy, repetitive and lack scientific substance. Others (“Mapping the Universe”) are compelling, engaging and richly informative. A particularly disappointing chapter on biology (“The Meaning of Life”) focuses on 19th century evolution, with only a few paragraphs for the life science of the 20th and 21st centuries. That chapter closes with an odd, antiscientific tone lamenting the “huge numbers of people … addicted to antidepressants” and complaining that modern biology (and neuroscience) “threatens to undermine traditional values of moral responsibility.”

Some of the book’s strongest chapters are the earliest, especially those that cover aspects of science often missing in other histories, such as science in China. Who knew that the ancient Chinese had their own set of ancient elements — not the Greeks’ air, earth, water and fire, but rather wood, fire, water, soil and metal?

With the book’s second-half emphasis on how science was done rather than what science found out, the history that emerges is sometimes disjointed and out of order. Discussions of the modern view of the universe, which hinges on Einstein’s general theory of relativity, appear before the chapter on theory, where relativity is mentioned. In fact, both relativity and quantum theory are treated superficially in that chapter, as examples of the work of theorists rather than the components of a second scientific revolution.

No doubt lack of space prevented deeper treatment of science from the last century. Nevertheless, the book’s merits outweigh its weaknesses. For an accessible account of the story of pre-20th century science, it’s informative and enjoyable. For more recent science, you can at least look at the pictures.

Intense storms provide the first test of powerful new hurricane forecast tools

This year’s Atlantic hurricane season has already proven to be active and deadly. Powerful hurricanes such as Harvey, Irma and Maria are also providing a testing ground for new tools that scientists hope will save lives by improving forecasts in various ways, from narrowing a storm’s future path to capturing swift changes in the intensity of storm winds.

Some of the tools that debuted this year — such as the GOES-16 satellite — are already winning praise from scientists. Others, such as a new microsatellite system aiming to improve measurements of hurricane intensity and a highly anticipated new computer simulation that forecasts hurricane paths and intensities, are still in the calibration phase. As these tools get an unprecedented workout thanks to an unusually ferocious series of storms, scientists may know in a few months whether hurricane forecasting is about to undergo a sea change.

The National Oceanic and Atmospheric Administration’s GOES-16 satellite is perhaps the clearest success story of this hurricane season so far. Public perceptions of hurricane forecasts tend to focus on uncertainty and conflicting predictions. But in the big picture, hurricane models adeptly forecast Irma’s ultimate path to the Florida Keys nearly a week before it arrived there, says Brian Tang, an atmospheric scientist at the University at Albany in New York.

“I found that remarkable,” he says. “Ten or so years ago that wouldn’t have been possible.”

One reason for this is GOES-16, which launched late last year and will become fully operational in November. The satellite offers images at four times the resolution of previous satellites. “It’s giving unparalleled details about the hurricanes,” Tang says, including data on wind speeds and water temperatures delivered every minute that are then fed into models.

GOES-16’s crystal-clear images also give forecasters a better picture of the winds swirling around a storm’s central eye. But more data from this crucial region is needed to improve predictions of just how strong a hurricane might get. Scientists continue to struggle to predict rapid changes in hurricane intensity, Tang says. He notes how Hurricane Harvey, for example, strengthened suddenly to become a Category 4 storm right before it made landfall in Texas, offering emergency managers little time to issue warnings. “That’s the sort of thing that keeps forecasters up at night,” he says.

In December, NASA launched a system of eight suitcase-sized microsatellites called the Cyclone Global Navigation Satellite System, or CYGNSS, into orbit. The satellites measure surface winds near the inner core of a hurricane, such as between the eyewall and the most intense bands of rain, at least a couple of times a day. Those regions have previously been invisible to satellites, measured only by hurricane-hunter airplanes darting through the storm.

“Improving forecasts of rapid intensification, like what occurred with Harvey on August 25, is exactly what CYGNSS is intended to do,” says Christopher Ruf, an atmospheric scientist at the University of Michigan in Ann Arbor and the lead scientist for CYGNSS. Results from CYGNSS measurements of both Harvey and Irma look very promising, he says. While the data are not being used to inform any forecasts this year, the measurements are now being calibrated and compared with hurricane-hunter flight data. The team will give the first detailed results from the hurricane season at the annual meeting of the American Geophysical Union in December.

Meanwhile, NOAA has also been testing a new hurricane forecast model this year. The U.S. forecasting community is still somewhat reeling from its embarrassing showing during 2012’s Hurricane Sandy, which the National Weather Service had predicted would go out to sea while a European meteorological center predicted, correctly, that it would squarely hit New York City. In the wake of that event, Congress authorized $48 million to improve U.S. weather forecasting, and in 2014 NOAA held a competition to select a new weather prediction tool to improve its forecasts.

The clear winner was an algorithm developed by Shian-Jiann Lin and colleagues at NOAA’s Geophysical Fluid Dynamics Laboratory in Princeton, N.J. In May, NOAA announced that it would test the new model this hurricane season, running it alongside the more established operational models to see how it stacks up. Known as FV3 (short for Finite-Volume Cubed-Sphere Dynamical Core), the model divides the atmosphere into a 3-D grid of boxes and simulates climate conditions within the boxes, which may be as large as 4 kilometers across or as small as 1 kilometer across. Unlike existing models, FV3 can also re-create vertical air currents that move between boxes, such as the updrafts that are a key element of hurricanes as well as tornadoes and thunderstorms.
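FV3's actual code is vastly more elaborate, but the finite-volume idea it is named for can be sketched in a few lines: store the average of a quantity in each grid box, and update each box from the fluxes across its faces, so whatever leaves one box enters its neighbor. A toy one-dimensional version (purely illustrative, not NOAA's algorithm) looks like this:

```python
import numpy as np

def advect_upwind(q, u, dx, dt):
    """One finite-volume step for 1-D advection with a constant wind u > 0.

    q holds cell-averaged values. The flux through each cell's left face is
    taken from the upwind (left) neighbor; each cell is then updated by
    flux in minus flux out. Periodic boundaries make the scheme exactly
    conservative: whatever leaves one box enters the next.
    """
    flux_in = u * np.roll(q, 1)   # through each cell's left face
    flux_out = u * q              # through each cell's right face
    return q + (dt / dx) * (flux_in - flux_out)

# A blob of some tracer (moisture, say) in a 100-box domain.
nx, dx, u, dt = 100, 1.0, 1.0, 0.5     # CFL number u*dt/dx = 0.5: stable
q = np.zeros(nx)
q[40:60] = 1.0
total_before = q.sum()

for _ in range(60):                    # advect 30 grid lengths downwind
    q = advect_upwind(q, u, dx, dt)

# The blob has moved (and smeared, a known weakness of first-order upwind
# schemes), but the total amount of tracer is conserved to machine precision.
print(q.argmax(), abs(q.sum() - total_before))
```

Real dynamical cores use higher-order flux estimates and three spatial dimensions, which is where FV3's ability to handle vertical currents like updrafts comes in.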

But FV3’s performance so far this year hasn’t been a slam dunk. FV3 did a far better job at simulating the intensity of Harvey than the other two leading models, but it lagged behind the European model in determining the hurricane’s path, Lin says. As for Irma, the European model outperformed the others on both counts. Still, Lin says he is confident that FV3 is on the right track. That’s good because pressure to work out the kinks may ramp up rapidly. Although NOAA originally stated that FV3 would be operational in 2019, “I hear some hints that it could be next year,” he says.

Lin adds that a good model alone isn’t enough to get a successful forecast; the data that go into a model are ultimately crucial to its success. “In our discipline, we call that ‘garbage in, garbage out,’” he says. With GOES-16 and CYGNSS nearly online, scientists are looking forward to even better hurricane models thanks to even better data.

Ice in space might flow like honey and bubble like champagne

Ice in space may break out the bubbly. Zapping simulated space ice with imitation starlight makes the ice bubble like champagne. If this happens in space, this liquidlike behavior could help organic molecules form at the edges of infant planetary systems. The experiment provides a peek into the possible origins of life.

Shogo Tachibana of Hokkaido University in Sapporo, Japan, and colleagues combined water, methanol and ammonia, all found in comets and interstellar clouds where stars form, at a temperature between ‒263° Celsius and ‒258° C. The team then exposed this newly formed ice to ultraviolet radiation to mimic the light of a young star.

As the ice warmed to ‒213° C, it cracked like a brittle solid. But at just five degrees warmer, bubbles started appearing in the ice, forming and popping until the ice reached ‒123° C. At that point, the ice returned to a solid state and formed crystals.

“We were so surprised when we first saw bubbling of ice at really low temperatures,” Tachibana says. The team reports its finding September 29 in Science Advances.

Follow-up experiments showed fewer bubbles formed in ice with less methanol and ammonia. Ice that wasn’t irradiated showed no bubbles at all.

Analyses traced spikes of hydrogen gas during irradiation. That suggests that the bubbles are made of hydrogen that the ultraviolet light split off methanol and ammonia molecules, Tachibana says. “It is like bubbling in champagne,” he says — with an exception. Champagne bubbles are dissolved carbon dioxide, while ice bubbles are dissolved hydrogen.

The irradiated ice took on another liquidlike feature: Between about ‒185° C and ‒161° C, it flowed like refrigerated honey, despite being well below its melting temperature, Tachibana adds.

That liquidity could help kick-start life-building chemistry. In 2016, Cornelia Meinert of the University Nice Sophia Antipolis in France and colleagues showed that irradiated ice forms a cornucopia of molecules essential to life, including ribose, the backbone of RNA, which may have been a precursor to DNA (SN: 4/30/16, p. 18). But it was not clear how smaller molecules could have found each other and built ribose in rigid ice.

At the time, critics said complex molecules could have been contamination, says Meinert, who was not involved in the new work. “Now this is helping us argue that at this very low temperature, the small precursor molecules can actually react with each other,” she says. “This is supporting the idea that all these organic molecules can form in the ice, and might also be present in comets.”

The brain’s helper cells have a hand in learning fear

WASHINGTON, D.C. — Helper cells in the brain just got tagged with a new job — forming traumatic memories.

When rats experience trauma, cells in the hippocampus — an area important for learning — produce signals for inflammation, helping to create a potent memory. But most of those signals aren’t coming from the nerve cells, researchers reported November 15 at the Society for Neuroscience meeting.

Instead, more than 90 percent of a key inflammation protein comes from astrocytes. This role in memory formation adds to the repertoire of these starburst-shaped cells, once believed to be responsible only for providing food and support to more important brain cells (SN Online: 8/4/15).

The work could provide new insight into how the brain creates negative memories that contribute to post-traumatic stress disorder, said Meghan Jones, a neuroscientist at the University of North Carolina at Chapel Hill.

Jones and her colleagues gave rats a short series of foot shocks painful enough to “make you curse,” she said. A week after that harrowing experience, rats confronted with a milder shock remained jumpy. In some rats, Jones and her colleagues inhibited astrocyte activity during the original trauma, which prevented the cells from releasing the inflammation protein. Those rats kept their cool in the face of the milder shock.

These preliminary results show that neurons get a lot of help in creating painful memories. Studies like these are “changing how we think about the circuitry that’s involved in depression and post-traumatic stress disorder,” says neuroscientist Georgia Hodes of Virginia Tech in Blacksburg. “Everyone’s been focused on what neurons are doing. [This is] showing an important effect of cells we thought of as only being supportive.”

CRISPR gene editor could spark immune reaction in people

Immune reactions against proteins commonly used as molecular scissors might make CRISPR/Cas9 gene editing ineffective in people, a new study suggests.

About 79 percent of 34 blood donors tested had antibodies against the Cas9 protein from Staphylococcus aureus bacteria, Stanford University researchers report January 5 at bioRxiv.org. About 65 percent of donors had antibodies against the Cas9 protein from Streptococcus pyogenes.

Nearly half of 13 blood donors also had T cells that seek and destroy cells that make S. aureus Cas9 protein. The researchers did not detect any T cells that attack S. pyogenes Cas9, but the methods used to detect the cells may not be sensitive enough to find them, says study coauthor Kenneth Weinberg.

Cas9 is the DNA-cutting enzyme that enables researchers to make precise edits in genes. Antibodies and T cells against the protein could cause the immune system to attack cells carrying it, making gene therapy ineffective.

The immune reactions may be a technical glitch that researchers will need to work around, but probably aren’t a safety concern as long as cells are edited in lab dishes rather than in the body, says Weinberg, a stem cell biologist and immunologist.

“We think we need to address this now … as we move toward clinical trials,” he says, but “this is probably going to turn out to be more of a hiccup than a brick wall.”

Humans are overloading the world’s freshwater bodies with phosphorus

Human activities are driving phosphorus levels in the world’s lakes, rivers and other freshwater bodies to a critical point. The freshwater bodies on 38 percent of Earth’s land area (not including Antarctica) are overly enriched with phosphorus, leading to potentially toxic algal blooms and less available drinking water, researchers report January 24 in Water Resources Research.

Sewage, agriculture and other human sources add about 1.5 teragrams of phosphorus to freshwaters each year, the study estimates. That’s roughly four times the weight of the Empire State Building. The scientists tracked human phosphorus inputs from 2002 to 2010 from domestic, industrial and agricultural sources. Phosphorus in human waste was responsible for about 54 percent of the global load, while agricultural fertilizer use contributed about 38 percent. By country, China contributed 30 percent of the global total, India 8 percent and the United States 7 percent.
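A quick sanity check of that comparison, taking a commonly cited figure of about 365,000 metric tons for the building's weight (an outside assumption, not from the study):

```python
# Sanity-check the Empire State Building comparison in the text.
phosphorus_tg = 1.5                        # teragrams of phosphorus per year
phosphorus_tonnes = phosphorus_tg * 1e6    # 1 Tg = 1e12 g = 1e6 metric tons

ESB_TONNES = 365_000   # commonly cited mass of the Empire State Building

ratio = phosphorus_tonnes / ESB_TONNES
print(round(ratio, 1))   # roughly 4
```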

New technique shows how 2-D thin films take the heat

High-energy particle beams can reveal how 2-D thin sheets behave when the heat is cranked up.

Researchers have devised a way to track how these materials, such as the supermaterial graphene, expand or contract as temperatures rise (SN: 10/3/15, p. 7). This technique, described in the Feb. 2 Physical Review Letters, showed that 2-D semiconductors arranged in single-atom-thick sheets expand more like plastics than metals when heated. Better understanding the high-temp behaviors of these and other 2-D materials could help engineers design sturdy nano-sized electronics.

Commonly used silicon-based electronics are “hitting a brick wall” regarding how much smaller they can get, says Zlatan Aksamija, an electrical engineer at the University of Massachusetts Amherst not involved in the work. Materials made of ultrathin, 2-D films could be ideal for building the next generation of tinier devices.

But electronics warm up as electric current courses through them. If 2-D materials in a nanodevice expand or shrink at different rates from each other when heated, that could change the device’s electronic properties — such as how well it conducts electricity, says Antoine Reserbat-Plantey, a physicist at the Institute of Photonic Sciences in Barcelona not involved in the research. It’s crucial to know how the thin films react to higher temps.

The new method uses a scanning transmission electron microscope to bombard a film with a beam of high-energy particles. That particle beam stirs up electrons in the 2-D sheet, making the electrons swish back and forth through the material. The collective oscillation, called a plasmon, occurs at a frequency that depends on the material’s density, explains Matthew Mecklenburg, a physicist at the University of Southern California in Los Angeles who was not involved in the work.

The plasmon frequency affects how much energy the particles of the microscope beam lose as they streak through the 2-D material: the higher the frequency, the denser the material, and the more energy that is sapped from the beam. By using another instrument to measure the energies of beam particles after they’ve passed through the 2-D material, researchers can discern the material’s density — and track how that density changes as they turn up the heat.
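The density-frequency link Mecklenburg describes is the textbook plasma-frequency relation for a free-electron gas, omega_p = sqrt(n e^2 / (eps0 m_e)). A sketch of the arithmetic follows; the electron density used here is hypothetical, chosen only to illustrate the trend, not taken from the study.

```python
import math

# Physical constants (SI units)
E_CHARGE = 1.602e-19     # elementary charge, C
EPS0 = 8.854e-12         # vacuum permittivity, F/m
M_ELECTRON = 9.109e-31   # electron mass, kg
HBAR = 1.055e-34         # reduced Planck constant, J*s
EV = 1.602e-19           # joules per electronvolt

def plasmon_energy_ev(n_per_m3):
    """Bulk plasmon energy (eV) for a free-electron gas of density n.

    omega_p = sqrt(n e^2 / (eps0 m_e)); the energy a fast beam electron
    loses when it excites one plasmon is hbar * omega_p.
    """
    omega_p = math.sqrt(n_per_m3 * E_CHARGE**2 / (EPS0 * M_ELECTRON))
    return HBAR * omega_p / EV

# Illustrative: a denser material has a higher plasmon energy, so thermal
# expansion (which lowers density) shifts the plasmon peak down. That shift
# is the signal the microscope technique tracks.
n_cold = 1.8e29          # hypothetical electron density, per cubic meter
n_hot = 0.99 * n_cold    # the same material expanded ~1% by volume on heating
print(plasmon_energy_ev(n_cold), plasmon_energy_ev(n_hot))
```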

Robert Klie, a physicist at the University of Illinois at Chicago, and colleagues used this technique on samples of graphene, which is made of carbon atoms, and four 2-D semiconductors made of transition metal and chalcogen atoms. (Chalcogen elements are found in group 16 on the periodic table and include sulfur and selenium.) These materials were arranged in sheets from a single atom to a few atoms thick. The team measured the density of each sample at eight temperatures between about 100° and 450° Celsius. That allowed the scientists to calculate how much each material expanded or contracted per degree of temperature increase.
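That last calculation can be sketched from the description alone: fit density against temperature and convert the slope, since for small changes the volumetric expansion coefficient is alpha_V = -(1/rho) * d(rho)/dT. The numbers below are invented to mimic such a measurement, not the team's data.

```python
import numpy as np

# Hypothetical measurements: density (arbitrary units) at eight temperatures
# (deg C), mimicking a material whose density falls as it expands on heating.
temps = np.array([100, 150, 200, 250, 300, 350, 400, 450], dtype=float)
alpha_true = 3e-5    # per deg C, chosen to generate the fake data
rho = 1.00 * (1 - alpha_true * (temps - temps[0]))

# Linear fit of density versus temperature; the slope is d(rho)/dT.
slope, intercept = np.polyfit(temps, rho, 1)

# Volumetric expansion coefficient: alpha_V = -(1/rho) * d(rho)/dT,
# evaluated at the reference density.
alpha_est = -slope / rho[0]
print(f"estimated alpha_V: {alpha_est:.2e} per deg C")
```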

These measurements revealed that the thinnest structures undergo more significant size changes than thicker sheets: A single layer of graphene, which contracts when heated, shrinks more than materials composed of a few graphene layers. The 2-D semiconductors expand at higher temps, but those made of one-atom-thick sheets swell more than semiconductors a few atoms thick. In fact, the heat response of the single-atom-thick semiconductors is “more like [that] of a plastic than a metal,” Mecklenburg says.

This finding may indicate that, like plastics, some 2-D semiconductors have low melting temperatures, which could affect how or whether they’re used in future electronics.

Shipping noise can disturb porpoises and disrupt their mealtime

Harbor porpoises are frequently exposed to sounds from shipping vessels that register at around 100 decibels, about as loud as a lawnmower, scientists report February 14 in Proceedings of the Royal Society B. Sounds this loud can cause porpoises to stop echolocation, which they use to catch food.

While high-frequency submarine sonar has been found to harm whales (SN: 4/23/11, p. 16), low-frequency noise from shipping vessels is responsible for most human-made noise in the ocean, the researchers say. Porpoises have poor hearing at lower frequencies, so it was unclear whether they were affected.

In the first study to assess the effects of shipping vessel noise on porpoises, researchers tagged seven harbor porpoises off the coast of Denmark with sensors that tracked the animals’ movement and echolocation usage in response to underwater noise over about 20 hours.

One ship created a 130-decibel noise — twice as loud as a chainsaw — that caused a porpoise to flee at top speed. These initial results indicate that ship noise could affect how much food porpoises hunt and consume.
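Decibels are logarithmic, so the jump from 100 to 130 dB is larger than it sounds. A sketch of the standard conversions (note that underwater decibels use a different reference pressure than in-air values, so comparisons to lawnmowers and chainsaws are necessarily rough):

```python
def intensity_ratio(db_a, db_b):
    """How many times more acoustic intensity level A carries than level B."""
    return 10 ** ((db_a - db_b) / 10)

def loudness_ratio(db_a, db_b):
    """Rough perceived-loudness ratio: each 10 dB is about a doubling."""
    return 2 ** ((db_a - db_b) / 10)

# The 130 dB ship pass versus the typical 100 dB shipping noise:
print(intensity_ratio(130, 100))   # 1000x the acoustic intensity
print(loudness_ratio(130, 100))    # ~8x as loud by the perceptual rule of thumb
```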