Proto-Austronesian “genealogical tree.” (Credit: Image courtesy of University of California – Berkeley)
Feb. 11, 2013 — Ancient languages hold a treasure trove of information about the culture, politics and commerce of millennia past. Yet, reconstructing them to reveal clues into human history can require decades of painstaking work. Now, scientists at the University of California, Berkeley, have created an automated “time machine,” of sorts, that will greatly accelerate and improve the process of reconstructing hundreds of ancestral languages.
In a compelling example of how “big data” and machine learning are beginning to make a significant impact on all facets of knowledge, researchers from UC Berkeley and the University of British Columbia have created a computer program that can rapidly reconstruct “proto-languages” — the linguistic ancestors from which all modern languages have evolved. These earliest-known languages include Proto-Indo-European, Proto-Afroasiatic and, in this case, Proto-Austronesian, which gave rise to languages spoken in Southeast Asia, parts of continental Asia, Australasia and the Pacific.
“What excites me about this system is that it takes so many of the great ideas that linguists have had about historical reconstruction, and it automates them at a new scale: more data, more words, more languages, but less time,” said Dan Klein, an associate professor of computer science at UC Berkeley and co-author of the paper published online Feb. 11 in the journal Proceedings of the National Academy of Sciences.
The research team’s computational model uses probabilistic reasoning (a form of inference that combines logic and statistics) to reconstruct the ancestral word forms of more than 600 Austronesian languages from an existing database of more than 140,000 words, replicating with 85 percent accuracy what linguists had done manually. While manual reconstruction is a meticulous process that can take years, this system can perform a large-scale reconstruction in a matter of days or even hours, researchers said.
Not only will this program speed up the ability of linguists to rebuild the world’s proto-languages on a large scale, boosting our understanding of ancient civilizations based on their vocabularies, but it can also provide clues to how languages might change years from now.
“Our statistical model can be used to answer scientific questions about languages over time, not only to make inferences about the past, but also to extrapolate how language might change in the future,” said Tom Griffiths, associate professor of psychology, director of UC Berkeley’s Computational Cognitive Science Lab and another co-author of the paper.
The discovery advances UC Berkeley’s mission to make sense of big data and to use new technology to document and maintain endangered languages as critical resources for preserving cultures and knowledge. For example, researchers plan to use the same computational model to reconstruct indigenous North American proto-languages.
Humans’ earliest written records date back less than 6,000 years, long after the advent of many proto-languages. While archeologists can catch direct glimpses of ancient languages in written form, linguists typically use what is known as the “comparative method” to probe the past. This method establishes relationships between languages, identifying sounds that change with regularity over time to determine whether they share a common mother language.
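The regularity the comparative method looks for can be made concrete with a toy sketch: tally how sounds pair up, position by position, across suspected cognates in two languages. The word lists below are invented for illustration, and real comparative work must also handle alignment gaps, borrowings and semantic drift.

```python
from collections import Counter

def correspondences(cognates):
    """Tally position-wise sound pairings across aligned cognate pairs.

    Assumes the words are already aligned segment by segment; real
    comparative work also models insertions and deletions.
    """
    counts = Counter()
    for word_a, word_b in cognates:
        for sound_a, sound_b in zip(word_a, word_b):
            counts[(sound_a, sound_b)] += 1
    return counts

# Toy aligned cognate pairs from two hypothetical sister languages.
pairs = [("pata", "fata"), ("piko", "fiko"), ("lupa", "lufa")]
for (a, b), n in correspondences(pairs).most_common(3):
    print(f"{a} : {b}  seen {n}x")
```

The point of the tally is that a pairing like p : f recurring across several unrelated words, rather than appearing once, is what licenses positing a regular sound change between the two languages.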
“To understand how language changes — which sounds are more likely to change and what they will become — requires reconstructing and analyzing massive amounts of ancestral word forms, which is where automatic reconstructions play an important role,” said Alexandre Bouchard-Côté, an assistant professor of statistics at the University of British Columbia and lead author of the study, which he started while a graduate student at UC Berkeley.
The UC Berkeley computational model is based on the established linguistic theory that words evolve along the branches of a family tree — much like a genealogical tree — reflecting linguistic relationships that evolve over time, with the roots and nodes representing proto-languages and the leaves representing modern languages.
Using an algorithm known as the Markov chain Monte Carlo sampler, the program sorted through sets of cognates, words in different languages that share a common sound, history and origin, to calculate the odds of which set is derived from which proto-language. At each step, it stored a hypothesized reconstruction for each cognate and each ancestral language.
“Because the sound changes and reconstructions are closely linked, our system uses them to repeatedly improve each other,” Klein said. “It first fixes its predicted sound changes and deduces better reconstructions of the ancient forms. It then fixes the reconstructions and re-analyzes the sound changes. These steps are repeated, and both predictions gradually improve as the underlying structure emerges over time.”
An artist’s rendering of a placental ancestor. Researchers say the small, insect-eating animal is the most likely common ancestor of the species on the most abundant and diverse branch of the mammalian family tree.
Published: February 7, 2013
Humankind’s common ancestor with other mammals may have been a roughly rat-size animal that weighed no more than a half a pound, had a long furry tail and lived on insects.
In a comprehensive six-year study of the mammalian family tree, scientists have identified and reconstructed what they say is the most likely common ancestor of the many species on the most abundant and diverse branch of that tree — the branch of creatures that nourish their young in utero through a placenta. The work appears to support the view that in the global extinctions some 66 million years ago, all non-avian dinosaurs had to die for mammals to flourish.
Scientists had been searching for just such a common genealogical link and have found it in a lowly occupant of the fossil record, Protungulatum donnae, that until now has been so obscure that it lacks a colloquial nickname. But as researchers reported Thursday in the journal Science, the animal had several anatomical characteristics for live births that anticipated all placental mammals and led to some 5,400 living species, from shrews to elephants, bats to whales, cats to dogs and, not least, humans.
A team of researchers described the discovery as an important insight into the pattern and timing of early mammal life and a demonstration of the capabilities of a new system for handling copious amounts of fossil and genetic data in the service of evolutionary biology. The formidable new technology is expected to be widely applied in years ahead to similar investigations of plants, insects, fish and fowl.
Given some belated stature by an artist’s brush, the animal hardly looks the part of a progenitor of so many mammals (which do not include marsupials, like kangaroos and opossums, or monotremes, egg-laying mammals like the duck-billed platypus).
Maureen A. O’Leary of Stony Brook University on Long Island, a leader of the project and the principal author of the journal report, wrote that a combination of genetic and anatomical data established that the ancestor emerged within 200,000 to 400,000 years after the great dying at the end of the Cretaceous period. At the time, the meek were rapidly inheriting the earth from hulking predators like T. rex.
Within another two million to three million years, Dr. O’Leary said, the first members of modern placental orders appeared in such profusion that researchers have started to refer to this as the “explosive model” of mammalian evolution. The common ancestor itself appeared more than 36 million years later than had been estimated based on genetic data alone.
Although some small primitive mammals had lived in the shadow of the great Cretaceous reptiles, the scientists could not find evidence supporting an earlier hypothesis that up to 39 mammalian lineages survived to enter the post-extinction world. Only the stem lineage to Placentalia, they said, appeared to hang on through the catastrophe, generally associated with climate change after an asteroid crashed into Earth.
The research team drew on combined fossil evidence and genetic data encoded in DNA in evaluating the ancestor’s standing as an early placental mammal. Among characteristics associated with full-term live births, the Protungulatum species was found to have a two-horned uterus and a placenta in which the maternal blood came in close contact with the membranes surrounding the fetus, as in humans.
The ancestor’s younger age, the scientists said, ruled out the breakup of the supercontinent of Gondwana around 120 million years ago as a direct factor in the diversification of mammals, as has sometimes been speculated. Evidence of the common ancestor was found in North America, but the animal may have existed on other continents as well.
The findings rest on a publicly accessible database called MorphoBank, which provides advanced software for handling the largest compilation yet of data and images on mammals living and extinct. “This has stretched our own expertise,” Dr. O’Leary, an anatomist, said in an interview.
“The findings were not a total surprise,” she said. “But it’s an important discovery because it relies on lots of information from fossils and also molecular data. Other scientists, at least a thousand, some from other countries, are already signing up to use MorphoBank.”
John R. Wible, curator of mammals at the Carnegie Museum of Natural History in Pittsburgh, who is another of the 22 members of the project, said the “power of 4,500 characters” enabled the scientists to look “at all aspects of mammalian anatomy, from the skull and skeleton, to the teeth, to internal organs, to muscles and even fur patterns” to determine what the common ancestor possibly looked like.
The project was financed primarily by the National Science Foundation as part of its Assembling the Tree of Life program. Other scientists from Stony Brook, the American Museum of Natural History and the Carnegie Museum participated, as well as researchers from the University of Florida, the University of Tennessee at Chattanooga, the University of Louisville, Western University of Health Sciences, in Pomona, Calif., Yale University and others in Canada, China, Brazil and Argentina.
Outside scientists said that this formidable new systematic data-crunching capability might reshape mammal research but that it would probably not immediately resolve the years of dispute between fossil and genetic partisans over when placental mammals arose. Paleontologists looking for answers in skeletons and anatomy have favored a date just before or a little after the Cretaceous extinction. Those who work with genetic data to tell time by “molecular clocks” have arrived at much earlier origins.
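The molecular-clock reasoning behind the older dates reduces, in its simplest strict-clock form, to a division: if two lineages differ at a fraction d of their sites and sites change at a rate μ per lineage per year, the lineages split roughly T = d / (2μ) years ago (the factor of 2 because changes accumulate along both branches). The numbers below are purely illustrative, not values from the study.

```python
def divergence_time(frac_diff, rate_per_site_per_year):
    """Naive strict-clock estimate of time since two lineages split.

    Divides by 2 because substitutions accrue on both branches.
    Real analyses model rate variation, saturation and calibration
    uncertainty, which is exactly where the fossil/clock dispute lives.
    """
    return frac_diff / (2.0 * rate_per_site_per_year)

# Illustrative inputs: 4% sequence divergence, 2e-10 substitutions
# per site per year.
t = divergence_time(0.04, 2e-10)
print(f"{t / 1e6:.0f} million years")  # prints "100 million years"
```

The sensitivity is easy to see: halving the assumed rate doubles the inferred age, which is one reason clock-based and fossil-based dates can disagree by tens of millions of years.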
The conflict was billed as “Fossils vs. Clocks” in the headline for a commentary article by Anne D. Yoder, an evolutionary biologist at Duke University, which accompanied Dr. O’Leary’s journal report.
Dr. Yoder acknowledged that the new study offered “a fresh perspective on the pattern and timing of mammalian evolution drawn from a remarkable arsenal of morphological data from fossil and living mammals.” She also praised the research’s “level of sophistication and meticulous analysis.”
Even so, Dr. Yoder complained that the researchers “devoted most of their analytical energy to scoring characteristics and estimating the shape of the tree rather than the length of its branches.” She said that “the disregard for the consequences of branch lengths,” as determined by the molecular clocks of genetics, “leaves us wanting more.”
John Gatesy, an evolutionary biologist at the University of California, Riverside, who was familiar with the study but was not an author of the report, said the reconstruction of the common ancestor was “very reasonable and very cool.” The researchers, he said, “have used their extraordinarily large analysis to predict what this earliest placental looked like, and it would be interesting to extend this approach to more branch points in the tree” including for early ancestors like aardvarks, elephants and manatees.
But Dr. Gatesy said the post-Cretaceous date for the placentals “will surely be controversial, as this is much younger than estimates based on molecular clocks, and implies the compression of very long molecular branches at the base of the tree.”
Jan. 30, 2013 — Rebutting a speculative hypothesis that comet explosions changed Earth’s climate sufficiently to end the Clovis culture in North America about 13,000 years ago, Sandia lead author Mark Boslough and researchers from 14 academic institutions assert that other explanations must be found for the apparent disappearance.
“There’s no plausible mechanism to get airbursts over an entire continent,” said Boslough, a physicist. “For this and other reasons, we conclude that the impact hypothesis is, unfortunately, bogus.”
In a December 2012 American Geophysical Union monograph, first available in January, the researchers point out that no appropriately sized impact craters from that time period have been discovered, nor have any unambiguously “shocked” materials been found.
In addition, proposed fragmentation and explosion mechanisms “do not conserve energy or momentum,” a basic law of physics that must be satisfied for impact-caused climate change to have validity, the authors write.
Also absent are physics-based models that support the impact hypothesis. Models that do exist, write the authors, contradict the asteroid-impact hypothesizers.
The authors also charge that “several independent researchers have been unable to reproduce reported results” and that samples presented in support of the asteroid impact hypothesis were later discovered by carbon dating to be contaminated with modern material.
The Boslough trail
Boslough has a decades-long history of successfully interpreting the effects of comet and asteroid collisions.
His credibility was on the line in July 1994, when Eos, the widely read newsletter of the American Geophysical Union, ran a front-page prediction by a Sandia National Laboratories team, led by Boslough, that under certain conditions plumes from the collision of comet Shoemaker-Levy 9 with the planet Jupiter would be visible from Earth.
The Sandia team — Boslough, Dave Crawford, Allen Robinson and Tim Trucano — were alone among the world’s scientists in offering that possibility.
“It was a gamble and could have been embarrassing if we were wrong,” said Boslough. “But I had been watching while Shoemaker-Levy 9 made its way across the heavens and realized it would be close enough to the horizon of Jupiter that the plumes would show.” His reasoning was backed by simulations from the world’s first massively parallel processing supercomputer, Sandia’s Intel Paragon.
On the one hand, it was a chance to check the new Paragon’s logic against real events, a shakedown run for the defense-oriented machine. On the other, it was a hold-your-breath prediction, a kind of Babe Ruth moment when the Babe is reputed to have pointed to the spot in the center field bleachers he intended to hit the next ball. No other scientists were willing to point the same way, partly due to previous failures in predicting the behavior of comets Kohoutek and Halley, and partly because most astronomers believed the plumes would be hidden behind Jupiter’s bulk.
That the plumes indeed proved visible started Boslough on his own trajectory as a media touchstone for things asteroidal and meteoritic.
It didn’t hurt that, when he stands before television cameras to discuss celestial impacts, his earnest manner, expressive gestures and extraterrestrial subject matter make him seem a combination of Carl Sagan and Luke Skywalker, or perhaps Tom Sawyer and Indiana Jones.
Standing in jeans, work shirt and hiking boots for the Discovery Channel at the site in Siberia where a mysterious explosion occurred 105 years ago, or discussing it at Sandia with his supercomputer simulations in bold colors on a big screen behind him, the rangy, 6-foot-3 Sandia researcher vividly and accurately explained the Tunguska event, which flattened hundreds of square miles of trees and whose atmospheric effects were visible as far away as London. The cause, he argued, was neither flying saucers drunkenly ramming a hillside (an actual proposed hypothesis) nor an asteroid striking Earth’s surface, but the fireball of an asteroid airburst: an asteroid exploding high above the ground, like a nuclear bomb, compressed to the point of detonation as it plunged deeper into Earth’s thickening, increasingly resistive atmosphere. The governing physics, he said, was precisely the same as for the airburst on Jupiter.
Among later triumphs, Boslough was the Sandia component of a National Geographic team flown to the Libyan Desert to make sense of strange yellow-green glass worn as jewelry by pharaohs in days past. Boslough’s take: It was the result of heat on desert sands from a hypervelocity impact caused by an even bigger asteroid burst.
In the present case
In the Clovis case, Boslough felt that his ideas were taken further than he could accept when other researchers claimed that the purported demise of Clovis civilization in North America was the result of climate change produced by a cluster of comet fragments striking Earth.
In a widely reported press conference announcing the Clovis comet hypothesis in 2007, proponents showed a National Geographic animation based on one of Boslough’s simulations as inspiration for their idea.
Indiana Jones-style, Boslough responded. Confronted by apparently hard asteroid evidence, as well as a Nova documentary and an article in the journal Science, all purportedly showing his error in rebutting the comet hypothesis, Boslough ordered carbon dating of the major evidence provided by the opposition: nanodiamond-bearing carbon spherules associated with the shock of an asteroid’s impact. The tests found the alleged 13,000-year-old carbon to be of very recent formation.
While this raised red flags to those already critical of the impact hypothesis, “I never said the samples were salted,” Boslough said carefully. “I said they were contaminated.”
That find, along with irregularities reported in the background of one member of the opposing team, was enough for Nova to remove the entire episode from its list of science shows available for streaming, Boslough said.
“Just because a culture changed from Clovis to Folsom spear points didn’t mean their civilization collapsed,” he said. “They probably just used another technology. It’s like saying the phonograph culture collapsed and was replaced by the iPod culture.”
Using ancient DNA (aDNA) sampling, Jaime Mata-Míguez, an anthropology graduate student and lead author of the study, tracked the biological comings and goings of the Otomí people following the incorporation of Xaltocan into the Aztec empire. (Credit: Photos provided by Lisa Overholtzer, Wichita State University.)
Jan. 30, 2013 — For centuries, the fate of the original Otomí inhabitants of Xaltocan, the capital of a pre-Aztec Mexican city-state, has remained unknown. Researchers have long wondered whether they assimilated with the Aztecs or abandoned the town altogether.
According to new anthropological research from The University of Texas at Austin, Wichita State University and Washington State University, the answers may lie in DNA. Following this line of evidence, the researchers theorize that some original Otomies, possibly elite rulers, may have fled the town. Their exodus may have led to the reorganization of the original residents within Xaltocan, or to the influx of new residents, who may have intermarried with the Otomí population.
Using ancient DNA (aDNA) sampling, Jaime Mata-Míguez, an anthropology graduate student and lead author of the study, tracked the biological comings and goings of the Otomí people following the incorporation of Xaltocan into the Aztec empire. The study, published in American Journal of Physical Anthropology, is the first to provide genetic evidence for the anthropological cold case.
Learning more about changes in the size, composition, and structure of past populations helps anthropologists understand the impact of historical events, including imperial conquest, colonization, and migration, Mata-Míguez says. The case of Xaltocan is extremely valuable because it provides insight into the effects of Aztec imperialism on Mesoamerican populations.
Historical documents suggest that residents fled Xaltocan in 1395 AD, and that the Aztec ruler sent taxpayers to resettle the site in 1435 AD. Yet archaeological evidence indicates some degree of population stability across the imperial transition, deepening the mystery. Recently unearthed human remains from before and after the Aztec conquest at Xaltocan provide the rare opportunity to examine this genetic transition.
As part of the study, Mata-Míguez and his colleagues sampled mitochondrial aDNA from 25 bodies recovered from patios outside excavated houses in Xaltocan. They found that the pre-conquest maternal lineages did not match those of the post-conquest era. These results are consistent with the idea that the Aztec conquest of Xaltocan had a significant genetic impact on the town.
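The kind of before-and-after comparison behind that conclusion can be sketched as a contingency test on maternal haplogroup counts. The counts below are invented for illustration only, and the published study’s statistics are more careful than this bare chi-square.

```python
def chi_square(table):
    """Pearson chi-square statistic for an r x c contingency table,
    comparing observed counts with counts expected under independence."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    grand = sum(row_totals)
    stat = 0.0
    for i, row in enumerate(table):
        for j, observed in enumerate(row):
            expected = row_totals[i] * col_totals[j] / grand
            stat += (observed - expected) ** 2 / expected
    return stat

# Rows: pre-/post-conquest burials; columns: mtDNA haplogroups A, B, C.
# These counts are hypothetical, not the study's data.
table = [[8, 3, 1],   # pre-conquest
         [2, 6, 5]]   # post-conquest
print(f"chi-square = {chi_square(table):.2f}")
```

A large statistic relative to the relevant chi-square distribution would indicate that haplogroup frequencies shifted across the conquest; with samples this small, real analyses would use exact tests rather than the asymptotic approximation.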
Mata-Míguez suggests that long-distance trade, population movement and the reorganization of many conquered populations caused by Aztec imperialism could have caused similar genetic shifts in other regions of Mexico as well.
In focusing on mitochondrial DNA, this study only traced the history of maternal genetic lines at Xaltocan. Future aDNA analyses will be needed to clarify the extent and underlying causes of the genetic shift, but this study suggests that Aztec imperialism may have significantly altered at least some Xaltocan households.
“It would have made enough scampi to feed an army for a month.”
Published May 27, 2011
The specimens include the largest yet found of their kind and suggest that the spiny, somewhat shrimplike beasts dominated pre-dinosaur seas for millions of years longer than thought.
Early offshoots of an evolutionary line that led to modern crustaceans, the so-called anomalocaridids looked sort of like modern cuttlefish. But the fossil creatures had spiny limbs sprouting from their heads and circular, plated mouths, which opened and closed like the diaphragm of a camera.
Previous anomalocaridid fossils had shown the animals grew to perhaps 2 feet (0.6 meter) long, which already would have made them the largest animals of the Cambrian period (542 to 501 million years ago)—an evolutionarily explosive time, when invertebrate life evolved into many new varieties, such as sea lilies and worms.
But at a foot longer than previous specimens, the largest of the new anomalocaridids suggests the segmented animals grew to bigger sizes than scientists had imagined.
“It would have made enough scampi to feed an army for a month—it was giant, and no doubt very tasty,” quipped study co-author Derek Briggs, director of the Yale Peabody Museum of Natural History.
Story continues -> 3-Foot “Shrimp” Discovered—Dominated Prehistoric Seas
A campaign has been launched to build the first working model of Charles Babbage’s Analytical Engine – 173 years after it was designed.
The nineteenth-century mathematician produced detailed drawings of the steam-powered, general-purpose computer, which are now held at London’s Science Museum.
Parts of the machine have been constructed several times, by Babbage himself, his family and others. But although his Difference Engine finally became a reality in 1991 and can be seen at the Science Museum, no full version of the Analytical Engine has ever been created.
“What a marvel it would be to stand before this giant metal machine, powered by a steam engine, and running programs fed to it on a reel of punched cards,” says programmer and blogger John Graham-Cumming, who has launched the campaign.
“And what a great educational resource so that people can understand how computers work. One could even imagine holding competitions for people (including school children) to write programs to run on the engine. And it would be a way to celebrate both Charles Babbage and Ada Lovelace. How fantastic to be able to execute Lovelace’s code!”
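The Lovelace program usually meant here is her Note G table, which laid out how the engine would compute Bernoulli numbers. A modern sketch of the same computation, using the standard recurrence for Bernoulli numbers rather than Lovelace’s exact sequence of engine operations:

```python
from fractions import Fraction
from math import comb

def bernoulli(n):
    """Bernoulli numbers B_0..B_n via the standard recurrence
    sum_{k=0}^{m} C(m+1, k) * B_k = 0 for m >= 1, with B_0 = 1."""
    b = [Fraction(1)]
    for m in range(1, n + 1):
        acc = sum(comb(m + 1, k) * b[k] for k in range(m))
        b.append(-acc / (m + 1))
    return b

# Lovelace's worked example computed what she numbered B7, which in
# the modern convention is B_8.
print(bernoulli(8)[-1])  # prints -1/30
```

Using exact rational arithmetic (`Fraction`) mirrors the spirit of the machine, which was designed to grind out such tables mechanically rather than approximate them.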
It won’t be easy. Unlike the Difference Engine, for which Babbage left a complete set of blueprints, the Analytical Engine was still a work in progress at the time of his death.
The first stage of the project, therefore, would be to go carefully through all the different versions to decide which one to build from.
Graham-Cumming is attempting to raise funds for the project, which would require several people to work on it, as well as some rather expensive materials. He says that, when complete, the machine would be donated to either the Science Museum or the National Museum of Computing.
Graham-Cumming has a long way to go. He’s asking people to sign up online and pledge £10/$10, and reckons he needs about 50,000 people. So far, 2,403 have agreed.