New Nanolens Breaks Resolution Record

By Lisa Grossman

A new kind of lens reaches an unprecedentedly sharp focus by giving up on being perfect. The lens is the first ever used to take visible-light images of structures smaller than 100 nanometers (four millionths of an inch), which could make it useful for nanotechnology and for probing the insides of cells.

Ordinary lenses, like those used in magnifying glasses, have curved surfaces that bend light to a single point. A small object sitting at that point appears larger and sharply focused, helping farsighted readers discern fine print and old-school detectives search for fingerprints. But conventional lenses need to be almost perfect to work: scratches and roughness destroy the clear image.

“Every deviation from the perfect surface results in a deteriorated focus,” said Elbert van Putten, a graduate student at the University of Twente in the Netherlands. “And in practice you’ll always see surface defects.”

The smallest object on which physicists have managed to focus a single conventional lens is 200 nanometers across, just larger than the smallest known bacteria (although more complicated microscopy systems have reached down to 50 nanometers). But many of the structures that physicists and chemists are interested in, like subcellular structures, nanoelectronic circuits and photonic structures, are less than half that size.
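That 200-nanometer floor is no accident; it matches the classical diffraction limit for visible light. As a rough illustration (the exact bound depends on the imaging geometry), the Abbe formula puts the smallest resolvable feature at about the wavelength divided by twice the lens's numerical aperture:

```python
# Back-of-the-envelope Abbe diffraction limit: the smallest feature a
# conventional lens can resolve is about d = wavelength / (2 * NA),
# where NA is the lens's numerical aperture.

def abbe_limit_nm(wavelength_nm, numerical_aperture):
    """Diffraction-limited resolution, in nanometers."""
    return wavelength_nm / (2.0 * numerical_aperture)

# Violet light (~400 nm) through an ideal NA = 1.0 lens:
print(abbe_limit_nm(400.0, 1.0))  # -> 200.0, the figure quoted above
```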

Story Continues -> New NanoLens

Traditional Inuit Knowledge Combines With Science to Shape Weather Insights

Inuit forecasters possessing generations of environmental knowledge are helping scientists better understand changes in Arctic weather. (Credit: Photo courtesy Shari Gearheard, CU-Boulder National Snow and Ice Data Center.)

Using skills passed down through generations, Inuit forecasters living in the Canadian Arctic look to the sky to tell by the way the wind scatters a cloud whether a storm is on the horizon or if it’s safe to go on a hunt.

Thousands of miles away, in a lab tucked into Colorado’s Rocky Mountains, scientists take data measurements and use the latest computer models to predict weather. The two practices serve the same purpose but come from disparate worlds.

But in the past 20 years, something has run amok with Inuit forecasting. Old weather signals don’t seem to mean what they used to. The cloud that scatters could signal a storm that comes in an hour instead of a day.

Now researchers are combining indigenous environmental knowledge with modern science to learn new things about what’s happening to the Arctic climate.

“It’s interesting how the western approach is often trying to understand things without necessarily experiencing them,” said Elizabeth Weatherhead, a research scientist with the University of Colorado at Boulder’s Cooperative Institute for Research in Environmental Sciences. “With the Inuit, it’s much more of an experiential issue, and I think that fundamental difference brings a completely different emphasis both in defining what the important scientific questions are, and discerning how to address them.”

For years, researchers had heard reports of unpredictable weather coming in from Arctic communities. But the stories didn’t seem to match up with the numbers. By scientific measurement, weather around the world appeared to be growing more persistent with less variation. The disparity left scientists scratching their heads, said Weatherhead.

“I had been hearing about this problem from other environmental statisticians for a number of years,” said Weatherhead, who also works closely with the National Oceanic and Atmospheric Administration’s Earth System Research Laboratory in Boulder, Colo., and who is chief author on a new study on the subject. “But the Inuit used a different language than what we statisticians used, and none of us could really figure out what matched up with their observations.”

That’s where Shari Gearheard, a scientist with CU-Boulder’s National Snow and Ice Data Center, also part of CIRES, comes in. Gearheard lives in Clyde River, Nunavut, Canada, an Inuit community on eastern Baffin Island, and for the past 10 years has been working with Inuit hunters and elders to document their knowledge of the environment and environmental change.

Weather has a special importance in Arctic environments, where a reliable forecast can mean the difference between life and death. There are members of the Inuit community who possess the skills to predict the weather, but that knowledge is dying off as both the culture and climate change, according to the scientists.

“The impacts of that are a loss of confidence in those forecasters and concerns about incorrect forecasts,” said Gearheard. “Forecasters don’t want to send somebody out to go hunting if they’re going to be unsafe in poor weather conditions.”

Gearheard meticulously collects the stories told to her by the Inuit and makes systematic records of indigenous environmental knowledge. Through this, patterns begin to emerge, she said.

Of special importance were changes experienced by the Inuit during the spring, a time of transition for many environmental processes. During spring, the Inuit would notice that the top layer of the snow melted during the day and then would refreeze at night, forming a crust.

“In fact, in a lot of places, the season is named after a particular process by the Inuit,” said Gearheard. “In cases like this where the Inuit are not seeing that process anymore, it is an indicator to them that something has changed.”

Gearheard’s records gave Arctic weather observation a new level of detail and, once the two lines of work were brought together, provided Weatherhead with the information she needed to bridge indigenous knowledge and scientific knowledge. “What was incredibly helpful was Shari’s detailed description of what they were experiencing on what sort of timescales,” said Weatherhead. “That really allowed us to start focusing on our statistical tests and try to find exactly what matched their observations.”

Statistical analysis of day-to-day temperatures at Baker Lake, Nunavut, showed that in May and June the persistence of temperature had recently declined, matching Inuit reports of greater unpredictability at that season. “People hadn’t previously looked at persistence in this way,” said CIRES fellow Roger Barry, also director of the World Data Center for Glaciology at the National Snow and Ice Data Center at CU-Boulder and a study co-author along with Gearheard.
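The study’s statistics aren’t spelled out here, but a common proxy for weather persistence is the lag-1 autocorrelation of daily temperature: the correlation of each day’s reading with the next day’s. A minimal sketch on synthetic data, purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

def lag1_autocorr(daily_temps):
    """Day-to-day persistence: correlation of each day's temperature
    with the next day's. Near 1 means steady, forecastable weather;
    lower values mean larger day-to-day surprises."""
    t = np.asarray(daily_temps, dtype=float)
    return float(np.corrcoef(t[:-1], t[1:])[0, 1])

def ar1_series(persistence, n_days=61):
    """Synthetic May-June temperature anomalies with a chosen persistence."""
    t = np.zeros(n_days)
    for i in range(1, n_days):
        t[i] = persistence * t[i - 1] + rng.normal()
    return t

print(lag1_autocorr(ar1_series(0.8)))  # a more persistent era
print(lag1_autocorr(ar1_series(0.4)))  # a choppier, harder-to-forecast era
```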

What they found was a scientific story more in line with what people were witnessing on the ground. Weather along the Arctic latitudes was behaving more unpredictably than in other parts of the world.

“That’s an incredibly important parameter to care about,” said Weatherhead. “The way I try to describe it to some people is if we get an inch of rain out at my house in the month of July, I don’t need to turn on the sprinklers. But if we get an inch of rain on July 1, and no rain after that, my lawn is dead.

“Ecosystems have evolved under a certain type of pattern. So if that is changing, that could be just as important as a small increase in temperature or some of the other changes we’re talking about,” Weatherhead said.

The new study helps scientists refine and test climate models, while also providing such models with a new category of information to consider, said Weatherhead. And Gearheard’s work with the Inuit is demonstrating the value of indigenous environmental knowledge to modern climate science.

“When we first started talking about this, indigenous knowledge didn’t have the place it does now in research,” Gearheard said. “It’s growing. People are becoming more familiar with it, more respectful of it.”

Weatherhead and Gearheard said they are intrigued by the insights that incorporate indigenous knowledge and climate studies, but they don’t want to stop there. The new study has sparked an interest in the type of environmental knowledge other communities could provide to climate scientists, from ranchers and farmers to indigenous groups. “When you treat these perspectives as different forms of evidence or knowledge and see where that takes you, that is when exciting stuff happens,” said Gearheard.

The study appears this month in the journal Global Environmental Change. The National Science Foundation and the Social Sciences and Humanities Research Council of Canada provided funding for the study.

Story Source:

Adapted from materials provided by University of Colorado at Boulder.


Journal Reference:

  1. Weatherhead et al. Changes in weather persistence: Insight from Inuit knowledge. Global Environmental Change, 2010; DOI: 10.1016/j.gloenvcha.2010.02.002

http://www.sciencedaily.com/releases/2010/04/100407190000.htm

“Key” Human Ancestor Found: Fossils Link Apes, First Humans?

An Australopithecus sediba skull bears both human and ape traits. Photograph courtesy Brett Eloff

Australopithecus sediba had human-like face and could walk well upright but was apelike in other ways.

Ker Than

for National Geographic News

Published April 8, 2010

Identified via two-million-year-old fossils, a new human ancestor dubbed Australopithecus sediba may be the “key transitional species” between the apelike australopithecines—and the first Homo, or human, species, according to a new study.

“We’ve never seen this combination of traits in any one [early human species],” study leader Lee Berger told the journal Science, where the new study is published today.

Found in the remnants of an underground cave network in South Africa, the partial Australopithecus sediba skeletons are believed to be from a roughly 30-year-old woman and an 8- to 13-year-old boy.

The pre-human pair, who may or may not have been related, apparently fell to their deaths into a chasm littered with corpses of saber-toothed cats and other predators.

The new species may be the wellspring—“sediba” in the local Sotho tribal language—from which our ancestors flowed, the report suggests.

Berger, of the University of the Witwatersrand in Johannesburg, conjures a different metaphor.

“It’s the opinion of my colleagues and I that [Australopithecus sediba] may very well be the Rosetta stone that unlocks our understanding of the genus Homo,” Berger said in a statement, referring to the artifact that helped decipher ancient Egyptian hieroglyphics.

(Also see “Oldest Skeleton of Human Ancestor Found.”)

A. Sediba Fossils Suggest Human-Like Ape

Growing to just over 4 feet (1.2 meters) tall, A. sediba has a number of key traits that some would say mark it as an early human, like Homo habilis, which many consider the first human species.

A. sediba, for example, had long legs and certain humanlike characteristics in its pelvis, which would have made it the first human ancestor to walk—perhaps even run—in an energy-efficient manner, the study says. (Related: “Did Early Humans Start Walking for Sex?”)

Also, A. sediba’s face had small teeth and a modern—rather than chimpanzee-like—nose, the study says.

And as in humans, the shapes of A. sediba’s left and right brain halves—discernible from indentations on a remarkably preserved skull—appear to have been uneven.

A facial reconstruction is in the works, and many people will be surprised by how human the new fossil species looks, Berger predicted in a press conference Wednesday. “What you’ll see, I suspect, is something surprisingly more modern than we would expect in … other things that have been called Australopithecus,” which translates to “southern ape.”

So if our newest evolutionary ancestor is so human-like, why doesn’t the new study classify it as human?

Berger’s team believes that certain apelike traits force the new species into the Australopithecus genus, or group of species.

For one thing, unlike human species but like other australopithecines, A. sediba had a very small brain. The fossil species also had long ape-like arms with primitive wrists that were well suited for climbing trees.

Australopithecus Sediba’s World

In what’s now South Africa, A. sediba lived in a patchwork of grasslands and woods, where the fossil species likely ate fleshy fruits, young leaves, and perhaps small animals.

The generally flat landscape was broken up by small hills and cliffs, some of which contained caves, which could apparently be treacherous.

Scientists speculate that a harsh drought may have driven two desperately thirsty members of A. sediba to enter one of these caves in an attempt to find an underground source of water.

The pair may have clambered partway down into the cave, only to slip and fall several yards to their deaths. The deathtrap also contained fossils of 25 species that lived alongside A. sediba, including potential predators such as saber-toothed cats, hyenas, and wild dogs.

A. Sediba Only Human After All?

Other anthropologists seem to be unanimously excited about the new human-ancestor fossils. But not everyone is so sure the new species is the “key transitional species” between prehistoric apes and humans suggested by the study.

“I don’t think there’s a lot of compelling evidence to suggest that [A. sediba] lies between Australopithecus and Homo,” said anthropologist Bernard Wood of George Washington University.

A. sediba “doesn’t fit what our preconceptions would be about the ancestor of Homo,” said Wood, who wasn’t involved in the study.

For example, A. sediba’s arms are too long—too apelike—and the species isn’t as well adapted for upright walking as some scientists expect the direct ancestor to the first humans to be, Wood said.

Also, at 1.95 to 1.78 million years old, the A. sediba fossils simply aren’t old enough to represent an ancestor to Homo, said anthropologist Brian Richmond, also of George Washington University. (Explore a prehistoric time line.)

“It’s hard to argue this is the ancestor of Homo when it’s occurring much later than the earliest members of the genus Homo by half a million years,” Richmond said, referring to an early fossil of H. habilis that dates back to 2.3 million years ago.

Anthropologist William Kimbel thinks this chronological conundrum could be resolved by calling the new specimens Homo instead of Australopithecus.

“By putting it in Australopithecus and saying it’s ancestral to Homo, you’re left with having to wonder how to accommodate earlier Homo [species],” Kimbel said.

“If you put it in Homo, that problem falls away,” he said. “It’s then just one of several species around two million years ago that are near the base of the Homo lineage.”

Susan Anton, an anthropologist at New York University and a joint editor of the Journal of Human Evolution, agreed.

A. sediba has so many similarities with Homo that “I think they might have been better off including it in Homo,” Anton said.

“If you do that, then this is really no longer a transitional species between Australopithecus and Homo. It is Homo”—and just an evolutionary dead end in human ancestry.

With Fossils, Timing Is Everything

Berger, who has been funded in the past by the National Geographic Society, maintains that A. sediba belongs with other australopithecines because its anatomy suggests it was still climbing trees. (The National Geographic Society owns National Geographic News.)

“It hasn’t made that grade-level shift to the genus Homo” yet, he said.

As for questions about its timing, Berger believes future discoveries could turn up A. sediba fossils that are hundreds of thousands of years older, which would make them old enough to be the ancestors of early Homo species.

“This [discovery] site is only a point in time. It doesn’t represent the first appearance of this species, nor will it probably represent the last,” he said.

Regardless of where A. sediba ends up in the human family tree, it’s already an important fossil precisely because of all the questions that it raises, said paleontologist Scott Simpson of Case Western Reserve University in Cleveland, Ohio.

“This fossil is not one that resoundingly answers any specific questions,” Simpson said. “What it does is reinforce the idea that we haven’t even asked all the appropriate questions yet.

“People are going to be discussing this for a long, long time.”

http://news.nationalgeographic.com/news/2010/04/100408-fossils-australopithecus-sediba-missing-link-new-species-human/

Turning Noise Into Vision: New Way to Reveal Images of Hidden Objects

By adjusting an electrical voltage across a crystal of nonlinear material, the researchers recovered an image of lines and numbers that originally was hidden in noise (upper left). As they tuned the system (from left to right across each row from top to bottom), the image “stole” energy from the noise, first appearing and then degrading as they adjusted past the optimal voltage. (Credit: Jason Fleischer/Dmitry Dylov)

A new technique for revealing images of hidden objects may one day allow pilots to peer through fog and doctors to see more precisely into the human body without surgery.

Developed by Princeton engineers, the method relies on the surprising ability to clarify an image using rays of light that would typically make the image unrecognizable, such as those scattered by clouds, human tissue or murky water.

In their experiments, the researchers restored an obscured image into a clear pattern of numbers and lines. The process was akin to improving poor TV reception using the distorted, or “noisy,” part of the broadcast signal.

“Normally, noise is considered a bad thing,” said Jason Fleischer, an assistant professor of electrical engineering at Princeton. “But sometimes noise and signal can interact, and the energy from the noise can be used to amplify the signal. For weak signals, such as distant or dark images, actually adding noise can improve their quality.”

He said the ability to boost signals this way could potentially improve a broad range of signal technologies, including the sonograms doctors use to visualize fetuses and the radar systems pilots use to navigate through storms and turbulence. The method could also be applied in technologies such as night-vision goggles, inspection of underwater structures such as levees and bridge supports, and steganography, the practice of masking signals for security purposes.

The findings were reported online March 14 in Nature Photonics.

In their experiments, Fleischer and co-author Dmitry Dylov, an electrical engineering graduate student, passed a laser beam through a small piece of glass engraved with numbers and lines, similar to the charts used during eye exams. The beam carried the image of the numbers and lines to a receiver connected to a video monitor, which displayed the pattern.

The researchers then placed a translucent piece of plastic similar to cellophane tape between the glass plate and the receiver. The tape-like material scattered the laser light before it arrived at the receiver, making the visual signal so noisy that the number and line pattern became indecipherable on the monitor, similar to the way smoke or fog might obstruct a person’s view.

The crucial portion of the experiment came when Fleischer and Dylov placed another object in the path of the laser beam. Just in front of the receiver, they mounted a crystal of strontium barium niobate (SBN), a material that belongs to a class of substances known as “nonlinear” for their ability to alter the behavior of light in strange ways. In this case, the nonlinear crystal mixed different parts of the picture, allowing signal and noise to interact.

By adjusting an electrical voltage across the piece of SBN, the researchers were able to tune in a clear image on the monitor. The SBN gathered the rays that had been scattered by the translucent plastic and used that energy to clarify the weak image of the lines and numbers.

“We used noise to feed signals,” Dylov said. “It’s as if you took a picture of a person in the dark, and we made the person brighter and the background darker so you could see them. The contrast makes the person stand out.”

The technique, known as “stochastic resonance,” works only for the right amount of noise, as too much can overwhelm the signal. It has been observed in a variety of fields, ranging from neuroscience to energy harvesting, but it had never been used this way for imaging.
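A toy threshold detector (not the researchers’ optical system) shows why there is a “right amount”: a weak sine wave never crosses the detector’s threshold on its own, a moderate dose of noise pushes it over in step with the signal, and heavy noise drowns it again:

```python
import numpy as np

rng = np.random.default_rng(1)
t = np.linspace(0.0, 20.0 * np.pi, 5000)
signal = 0.8 * np.sin(t)   # weak signal: never reaches the threshold alone
threshold = 1.0

for noise_level in (0.2, 0.7, 3.0):
    noisy = signal + rng.normal(scale=noise_level, size=t.size)
    fires = (noisy > threshold).astype(float)    # detector output: 0 or 1
    quality = np.corrcoef(fires, signal)[0, 1]   # how well output tracks signal
    print(f"noise {noise_level:.1f} -> recovery {quality:.2f}")
    # Recovery peaks at the intermediate noise level: too little noise
    # and the detector stays silent, too much and it fires at random.
```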

Based on the results of their experiment, Fleischer and Dylov developed a new theory for how noisy signals move through nonlinear materials, which combines ideas from the fields of statistical physics, information theory and optics.

The research was funded by the National Science Foundation, the U.S. Department of Energy and the U.S. Air Force.

Their theory provides a general foundation for nonlinear communication that can be applied to a wide range of technologies. The researchers plan to incorporate other signal processing techniques to further improve the clarity of the images they generate and to apply the concepts they developed to biomedical imaging devices, including those that use sound and ultrasound instead of light.

Story Source:

Adapted from materials provided by Princeton University, Engineering School.


Journal Reference:

  1. Dylov et al. Nonlinear self-filtering of noisy images via dynamical stochastic resonance. Nature Photonics, 2010; DOI: 10.1038/nphoton.2010.31

http://www.sciencedaily.com/releases/2010/04/100402110133.htm

Now in Broadband: Acoustic Imaging of the Ocean

Lavery’s research team towed a high frequency broadband acoustic system through turbulent water to measure the abundance of tiny zooplankton. The broadband sensor was able to discriminate between a region characterized by a patch of zooplankton and a region characterized by turbulence. (Credit: Jayne Doucette, Woods Hole Oceanographic Institution)

Researchers at Woods Hole Oceanographic Institution (WHOI) have developed two advanced broadband acoustic systems that they believe could represent the acoustic equivalent of the leap from black-and-white television to high-definition color TV. For oceanographers, this could mean a major upgrade in their ability to count and classify fish and to pinpoint tiny zooplankton amid seas of turbulence.

Lead authors Tim Stanton and Andone Lavery in the Department of Applied Ocean Physics and Engineering have already tested the two systems off the east coast of the U.S., with highly promising results. They and their colleagues describe the groundbreaking work in back-to-back papers recently published in the International Council for Exploration of the Sea (ICES) Journal of Marine Science. The technology is the culmination of efforts spanning two decades. Stanton explains, “Components of these advances separately have been achieved by previous investigators, but this is the first of its kind with all of the technologies in one package.”

A Problem of Interpretation

Because sound travels quickly over large distances in water, oceanographers have long recognized the power of acoustic measurements to rapidly survey what lies beneath the ocean surface. When a sound wave hits an object, such as a fish or a shrimp, it scatters. Acoustic scientists analyze the frequency, strength, and timing of the scattered signal to determine what caused the echo.

Most acoustic instruments use sound waves that contain only one or a few frequencies, but interpretation of their echoes is not straightforward. A single-frequency sensor used to study two different patches of ocean will probably measure two different echo levels. Those different echo levels might mean the two patches contain different numbers of fish, different sizes of fish, or different species of fish; that the fish were oriented differently in the water; or some combination of all of these factors. Stanton emphasizes that these ambiguities can change acoustic estimates of fish numbers by orders of magnitude.

Interpretation becomes even trickier when using acoustics to study millimeter-to-centimeter-sized animals called zooplankton. Certain types of zooplankton are attracted to places where there are gradients in the temperature and salinity of ocean water. Energy from tides or currents, interacting with rough topographical features of the ocean bottom, such as shelf breaks, generates turbulence in these stratified locations. Sound waves scattered off turbulence and off zooplankton can have similar levels over a range of frequencies, making interpretation of single-frequency signals in this frequency range impossible. As Lavery points out, “If you have a region of high turbulence, how do you know if scattering is from turbulence or from zooplankton that have accumulated in the region?”

The Broadband Breakthrough

If single-frequency sensors provide an image of the ocean that is like black-and-white television, then Stanton, Lavery, and their colleagues have built acoustic systems that are like high-definition color TV. The new instruments measure sound scattering not just at a few frequencies but over a continuous range of frequencies, generating broadband acoustic spectra. Years of theoretical work and laboratory modeling by Stanton, Lavery, and other researchers have laid the groundwork for interpretation of these spectra.

Stanton found that, for fish, most of the acoustic action occurs at very low frequencies. Much like blowing across the top of a soda bottle creates a unique tone, low-frequency sound waves resonate with the air in a fish’s swim bladder, creating a characteristic scattering signal. In a broadband spectrum, this signal looks like a peak centered at a frequency between 1 and 10 kHz for small fish. Since most echosounders measure frequencies at 38 or 120 kHz, they miss this key indicator.
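The article doesn’t give the underlying model, but swim-bladder resonance is commonly approximated by the Minnaert frequency of a gas bubble, and that approximation already puts a small fish’s peak in the band described above. A minimal sketch under that assumption:

```python
import math

def minnaert_frequency_hz(radius_m, depth_m=0.0):
    """Resonance frequency of a gas bubble of the given radius, a
    common first approximation for a fish swim bladder."""
    gamma = 1.4                                  # adiabatic index of air
    rho = 1025.0                                 # seawater density, kg/m^3
    pressure = 101325.0 + rho * 9.81 * depth_m   # ambient pressure, Pa
    return math.sqrt(3.0 * gamma * pressure / rho) / (2.0 * math.pi * radius_m)

# A millimeter-scale bladder resonates at a few kHz near the surface,
# squarely inside the 1-10 kHz band and far below the 38 or 120 kHz
# channels of conventional echosounders.
print(minnaert_frequency_hz(radius_m=1e-3))  # ~3.2 kHz
```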

Importantly, the peak resonance frequency changes for different-sized fish but doesn’t depend on the fish’s orientation in the water. Also, few other marine organisms scatter sound at these low frequencies. All this means broadband signals can be used not only to discriminate between fish and other marine organisms, but also to identify both the sizes and densities of fish.

At higher frequencies, the researchers also exploit other aspects of the shape of the acoustic spectrum to determine what scattered the sound wave. For example, a downward slope from 150 to 600 kHz signals high levels of turbulence. In the same frequency range, a curve with the opposite slope, rising with frequency, means the water is full of small zooplankton. Further, the frequency at which the spectrum flattens indicates the size of the zooplankton.
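In other words, the discrimination reduces to reading the sign of the spectral slope. A purely hypothetical sketch of that logic (the actual analysis fits physics-based scattering models rather than a straight line):

```python
import numpy as np

def classify_spectrum(freqs_khz, scattering_db):
    """Label a 150-600 kHz spectrum by its overall slope (illustrative
    only; real processing fits full scattering models)."""
    slope = np.polyfit(np.log10(freqs_khz), scattering_db, deg=1)[0]
    if slope < 0:
        return "turbulence (falls with frequency)"
    return "zooplankton (rises with frequency)"

freqs = np.linspace(150.0, 600.0, 50)
print(classify_spectrum(freqs, -60.0 - 10.0 * np.log10(freqs / 150.0)))  # falls
print(classify_spectrum(freqs, -80.0 + 15.0 * np.log10(freqs / 150.0)))  # rises
```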

Not only does the continuous range of frequencies used by broadband systems improve interpretation; it also provides much more information, letting Stanton and Lavery apply sophisticated processing algorithms. A method called pulse compression decreases the noise of the signal compared to that of traditional echosounders, increasing the distance at which the systems can detect organisms. The same advanced processing also improves range resolution, bringing the acoustic images into sharp focus so that closely spaced organisms can be distinguished from one another.
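Pulse compression itself is a standard matched-filter trick: transmit a long frequency-swept “chirp,” then correlate the received echo against a copy of the transmitted pulse. The long pulse carries plenty of energy, but the correlation collapses it into a sharp spike, which is where the simultaneous gains in noise and range resolution come from. A generic sketch of the idea, not WHOI’s actual processing chain:

```python
import numpy as np

fs = 1_000_000                        # sample rate, Hz
t = np.arange(0, 0.01, 1.0 / fs)      # a long 10 ms transmit pulse
chirp = np.sin(2.0 * np.pi * (10_000.0 + 5_000_000.0 * t) * t)  # ~10-110 kHz sweep

rng = np.random.default_rng(2)
echo = rng.normal(scale=0.2, size=3 * chirp.size)   # background noise
echo[chirp.size : 2 * chirp.size] += 0.1 * chirp    # weak echo, buried in noise

compressed = np.correlate(echo, chirp, mode="valid")  # matched filter
print("echo found at sample", np.argmax(np.abs(compressed)))  # ~10000
```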

These advances will enable scientists to study biological processes, such as predator-prey interactions (that is, determining “who eats whom”), with far better accuracy. A crucial first step in any such study is characterizing the temporal and spatial distribution of organisms; the quality of the study is only as good as that characterization. If a scientist using a single-frequency acoustic system misreads a turbulence echo as one from zooplankton, or an echo from a large fish as one from many small fish, then the study of the biological process will be fundamentally flawed. The new broadband systems will greatly facilitate characterizing the distributions of organisms, eliminating many of the ambiguities and improving accuracy.

The Whole Package

Stanton and Lavery have incorporated all of these theoretical and processing improvements in two new broadband acoustic systems, and demonstrated their use in the ocean. An instrument spanning lower frequencies (1.5 kHz to 100 kHz) was developed for the detection of fish. Another package that collects measurements at higher frequencies (150 to 600 kHz) was built to discriminate zooplankton from turbulence. Both systems are custom modifications of commercial systems originally designed for studying the seafloor by EdgeTech. The developments were in collaboration with engineers at EdgeTech, who made the hardware modifications.

In its first use, the lower frequency package was towed over a 1-kilometer patch of Atlantic herring off Cape Cod, Massachusetts, where it recorded a consistent resonance peak at about 3.7 kHz. To Stanton and his colleagues this meant that the fish were all the same size (around 24 cm) and that only the density of fish caused differences in the scattering signal. Stanton accurately identified parts of the school where the density was as high as two fish per cubic meter and as low as 0.05 fish per cubic meter.
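The density figure follows from standard echo integration: to first order, the measured volume backscattering equals the backscattering cross-section of one fish multiplied by the number of fish per cubic meter, so density falls out by division. Both numbers below are invented purely to show the arithmetic:

```python
# Echo integration in miniature (linear units, not decibels).
sigma_bs = 4e-4   # hypothetical backscattering cross-section of one herring, m^2
sv = 8e-4         # hypothetical measured volume backscattering coefficient, m^-1

# Density is the measured scattering divided by one fish's contribution:
print(sv / sigma_bs, "fish per cubic meter")  # -> 2.0
```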

Lavery and her colleagues first deployed the higher frequency system over the New Jersey continental shelf. In one patch of water, the shape of the broadband spectrum indicated that zooplankton were present. Further analysis showed that these animals were probably copepods — small crustaceans about 1 to 2 mm in length. In another patch, the distinctive downward slope of the spectrum meant that turbulence, rather than zooplankton, caused the signal. Stanton emphasizes, “That is first of a kind data. Broadband sound has not been used to identify turbulence before.”

The Future of Broadband

“We aren’t going to stop with these two instruments,” says Stanton. Both researchers plan to develop new broadband systems that span larger ranges of sound frequencies to detect smaller zooplankton and bigger fish. These new packages will be mounted on ships, autonomous underwater vehicles (AUVs), and moorings to study a variety of environments over different time and space scales.

Beyond the scientific community, broadband technology has important regulatory, commercial, and military value. Classifying and counting marine organisms is fundamental to fisheries managers who need to determine stock sizes. Fishermen benefit from accurate identification of fish sizes and densities, as well as from finding the zooplankton fish eat. The Navy is using broadband technology to learn how fish interfere with underwater systems. Lavery explains, “My hope is that one of the major companies that makes acoustic systems will pick up on broadband technology and make it accessible to the general user community.”

Stanton summarizes the accomplishment of twenty years of theoretical, laboratory, and field research as he looks toward the potential of broadband acoustic technology in the ocean, “We have created a body of work. And new milestones are in front of us.”

This work was supported by the US Office of Naval Research, National Oceanic and Atmospheric Administration, and WHOI.

Story Source:

Adapted from materials provided by Woods Hole Oceanographic Institution.


Journal References:

  1. Stanton et al. New broadband methods for resonance classification and high-resolution imagery of fish with swimbladders using a modified commercial broadband echosounder. ICES Journal of Marine Science, 2010; 67 (2): 365 DOI: 10.1093/icesjms/fsp262
  2. Lavery et al. Measurements of acoustic scattering from zooplankton and oceanic microstructure using a broadband echosounder. ICES Journal of Marine Science, 2010; 67 (2): 379 DOI: 10.1093/icesjms/fsp242

http://www.sciencedaily.com/releases/2010/04/100401135823.htm

‘That Was My Idea’: Group Brainstorming Settings and Fixation

When people, groups, or organizations are looking for a fresh perspective on a project, they often turn to a brainstorming exercise to get those juices flowing. An upcoming study from Applied Cognitive Psychology suggests that this may not be the best route to take to generate unique and varied ideas.

The researchers from Texas A&M University show that group brainstorming exercises can lead to fixation on only one idea or possibility, blocking out other ideas and possibilities, and leading eventually to a conformity of ideas. Lead researcher Nicholas Kohn explains, “Fixation to other people’s ideas can occur unconsciously and lead to you suggesting ideas that mimic your brainstorming partners. Thus, you potentially become less creative.”

The researchers used AOL Instant Messenger as the electronic discussion format for the experiments, which involved groups of two, three, and four subjects. This and other studies have also shown that taking a break (allowing participants a mental incubation period) can stem the natural decline in the quantity of ideas (the “production deficit”) as well as in their variety, and can encourage problem solving.

Group brainstorming, therefore, may be an overestimated method for generating ideas, and individual brainstorming exercises (such as written creativity drills) may be more effective. If ideas are to be shared in a group setting, members of the group need to be aware of this fixation phenomenon and take steps to prevent conformity. This will lead to a more vibrant, fresh discussion and a wider range of possible solutions.

Story Source:

Adapted from materials provided by Wiley-Blackwell.


Journal Reference:

  1. Nicholas W. Kohn; Steven M. Smith. Collaborative fixation: Effects of others’ ideas on brainstorming. Applied Cognitive Psychology, 2010; DOI: 10.1002/acp.1699

http://www.sciencedaily.com/releases/2010/03/100329112157.htm

Rube Goldberg competition gets teens excited about STEM

String ‘a’ pulls pencil ‘b,’ which allows mallet ‘c’ to drop on scissors ‘d,’ which then cuts wire ‘e’ allowing weight to fall towards pulley ‘f’…

By Chris Foresman

In recent years the US has begun to lag in education for science, technology, engineering, and mathematics (STEM), and a number of efforts are underway to address this issue. We know that giving kids hands-on experience is one of the best ways to spark and keep their interest in STEM-related fields, and to this end, high schoolers all over the country are getting an opportunity to learn and apply STEM knowledge by participating in the annual Rube Goldberg Machine Contest.

Rube Goldberg, who was himself an engineer, is most famous for his cartoons depicting contrived, complex contraptions for executing the most mundane tasks. The cartoons were meant as a criticism of the encroachment of technology into our lives during the early part of the 20th century, and of the tendency to favor “exerting maximum effort to achieve minimal results.” Rube Goldberg machines, named in honor of these cartoons, typically involve complex arrangements of levers, pulleys, balloons, ball bearings, mouse traps, and other mechanical means to accomplish something as simple as starting a phonograph.

Perhaps the most recent example of a Rube Goldberg machine in pop culture was featured in a recent video by the band OK Go. Known for their unusual and widely viewed music videos, the band spent 5 months with a team of as many as 60 engineers and designers to create an elaborate Rube Goldberg machine synced to its song “This Too Shall Pass.” The machine ultimately shoots colored paint out of air cannons and all over the band members at the end of the four-minute song.

In 1949, two engineering fraternities at Purdue University began a competition to devise the most complex machine to accomplish a given goal. That competition lasted for six years, but was later revived by one of the fraternities in 1983. The competition culminated in the first national Rube Goldberg Machine Contest in 1988, and it has been held at Purdue ever since. In 2007, high schools were permitted to participate in the national contest for the first time, giving kids as young as 14 exposure to one of the most exciting and diabolical engineering competitions ever.

This is a classic Rube Goldberg machine, “How to Keep the Boss from Knowing you are Late for Work.” It sometimes comes in handy around here in the Orbiting HQ.

Having grown up in the same city as Purdue University, I got to witness several collegiate national competitions, including the classics, “put coins in a bank, toast a slice of bread” and “squeeze the juice from an orange.” This year’s goal was to create a machine that can dispense hand sanitizer in no less than 20 steps. I had the incredible opportunity to serve as a judge for a Chicago-area regional high school competition held recently at Prosser Career Academy. Five schools competed for a chance to go on to compete at the national contest at Purdue on March 27, including one school that traveled all the way from North Carolina.

Competition builds better engineers

The teams competing included Prosser and Jones College Prep from the Chicago Public Schools system, Niles West High School and Downers Grove North High School from the suburbs, and River Mill Academy from Graham, North Carolina. Schools normally compete in a local regional competition, and advisor Sandi Daigle told Ars that the River Mill team had originally planned to compete in Knoxville, Tennessee. However, only River Mill and a team from Knoxville were registered for that event, and the rules require at least three teams to compete in order to move on to nationals. “It was either here, Las Vegas, or Korea,” she said.

Funding seemed to be a big concern for most of the schools participating. “Our school is tiny—we don’t even have bells,” Daigle told Ars. “We ended up having a parent donate frequent flyer miles to pay for the hotel.”

The team from Jones Prep had an even bigger budget constraint: $0. The science budget only included enough money for the entry fee for the regional competition; the machine itself had to be built using only materials in advisor Alexis Kovacs’ Physics lab. “Using what we could find, including cardboard and duct tape, we built a pretty good Rube Goldberg machine without buying anything fancy,” Kovacs told Ars.

The machine also included masses from a balance set, physics text books, astronomy board games, the ever-popular series of dominos, a vintage motorized solar system model, and trash cans “appropriated” from the school’s sixth floor. The team did admit to pooling loose change to buy water balloons, however.

“I think it makes it more challenging,” Kovacs said, “but I think in the end the kids feel better about themselves about having built something that didn’t require buying anything.”

The team from Jones College Prep describes how their machine works for the judges.

The teams were all built from membership in a science or engineering club, and most noted that the Rube Goldberg competition gave them a directed goal to work towards throughout the school year. “We don’t have a lot of extra-curricular activities,” River Mill’s Daigle told Ars, “and I said, ‘Hey, I’m interested in starting an engineering club.’ The Rube Goldberg contest gave us something to do in the club, as opposed to sitting around and tinkering with stuff.”

Most of the teacher/advisors agreed that giving kids a hands-on opportunity was a critical aspect of participating in the competition. “It’s a great way to get kids who might otherwise be intimidated by the math or hard science involved and interested in STEM,” Kovacs said. “For those that are already interested in science or engineering fields, this helps them be able to solve real problems and really use their hands.”

Downers Grove North’s Jeff Grant agreed, noting that the competition helped his students see their ideas turned into something tangible. “I think that’s super vital,” he told Ars. “There’s just no opportunity to do something like this in the classroom.”

Teamwork and the team-building experience were also considered important aspects of the competition. Niles West advisor Ben Brzezinski told Ars that working as a team will be a skill that the students will use for the rest of their lives. “Wherever they end up, they are going to have to work as a team to build things, whether they go into engineering or any other field,” he said. “There definitely has to be a sort of synergy there, as well as leadership, and camaraderie.”

That camaraderie is something that River Mill team member Joshua Crumb valued most. “It’s a great bond—you have eight or so people that you might honestly never talk to, and by the end we’re all like family,” he told Ars.

Prosser advisor and competition organizer Nathan Dolenc said that getting to know his students better was his number-one priority, something he felt many teachers miss out on. “Yes, it’s about making the machine work, but it’s also about the journey getting there,” he told Ars. “Hopefully, some of my lessons rub off on them.”

Article Continues -> http://arstechnica.com/science/news/2010/03/rube-goldberg-competition-gets-teens-excited-about-stem.ars

By Tracking Water Molecules, Physicists Hope to Unlock Secrets of Life

Supercool. As individual water molecules fluctuate, breaking and forming bonds with their nearest neighbors, the result is slightly imperfect tetrahedral structures that are constantly in flux. Research suggests that these fluctuations give rise to some of water’s most unusual and life-sustaining features. (Credit: Image courtesy of Rockefeller University)

The key to life as we know it is water, a tiny molecule with some highly unusual properties, such as the ability to retain large amounts of heat and to lose, instead of gain, density as it solidifies. It behaves so differently from other liquids, in fact, that by some measures it shouldn’t even exist. Now scientists have made a batch of new discoveries about the ubiquitous liquid, suggesting that an individual water molecule’s interactions with its neighbors could someday be manipulated to solve some of the world’s thorniest problems — from agriculture to cancer.

The work, led by Pradeep Kumar, a fellow at Rockefeller University’s Center for Studies in Physics and Biology who looks at the role of water in biology, makes it possible to measure how interactions between water molecules affect any number of properties in a system. It also paves the way for understanding how water can be manipulated to facilitate or prevent substances from dissolving in it, an advance that could impact every corner of society, from reforming agricultural practices to improving chemotherapy drugs whose side effects arise from their solubility or insolubility in water.

Kumar and his colleagues first tracked individual water molecules in a “supercooled” state (water that remains liquid even at below-freezing temperatures), during which water’s many anomalies are enhanced. “When you put water in a freezer, it doesn’t freeze instantaneously,” says Kumar. “It takes some time. If you have extremely pure water, then you can go down to about 230 Kelvin and still have enough time to measure different physical properties of water including the specific heat in its liquid state.” Kumar and his colleagues then used theoretical and computational approaches to simulate the activity of these water molecules and measure their interactions with neighbors.

In the liquid state, every water molecule fleetingly interacts with its four nearest neighbors, forming a tetrahedron, explains Kumar. These tetrahedrons, however, are slightly imperfect, and the degree of imperfection changes as temperature and pressure change, ultimately affecting which individual water molecules partner up with each other. Kumar found that it is the fluctuations in the degree of tetrahedrality that contribute most to one of water’s most notable and valuable features: its capacity to resist heating or cooling, thereby regulating and maintaining the temperature of biological systems.

The ability to measure water’s shifting degrees of tetrahedrality also gives scientists a means of measuring how much order or disorder each water molecule imparts. The better the tetrahedron, the more order it imparts in the system. “What we have done essentially is define the structural entropy of every molecule in our system,” says Kumar. “And since water molecules are constantly moving in space and time, this gives you a way to study the transport of entropy associated with local tetrahedrality — something that has never been done before.”
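The paper’s exact construction isn’t reproduced here, but a standard building block for measures of local tetrahedrality is the orientational order parameter q, which scores a molecule’s four nearest neighbors from q = 1 (a perfect tetrahedron) toward 0 (random packing). A minimal sketch of that ingredient:

```python
import itertools
import numpy as np

def tetrahedral_order(center, neighbors):
    """Orientational order parameter q over a molecule's four nearest
    neighbors: q = 1 for a perfect tetrahedron, ~0 for random packing.
    (A standard measure; the study builds an entropy on local
    tetrahedrality along these lines.)"""
    center = np.asarray(center, dtype=float)
    bonds = []
    for n in neighbors:
        v = np.asarray(n, dtype=float) - center
        bonds.append(v / np.linalg.norm(v))
    total = sum((np.dot(b1, b2) + 1.0 / 3.0) ** 2
                for b1, b2 in itertools.combinations(bonds, 2))  # 6 angle pairs
    return 1.0 - (3.0 / 8.0) * total

ideal = [(1, 1, 1), (1, -1, -1), (-1, 1, -1), (-1, -1, 1)]  # perfect tetrahedron
print(tetrahedral_order((0, 0, 0), ideal))  # -> 1.0
```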

Understanding how individual water molecules maneuver in a system to form fleeting tetrahedral structures and how changing physical conditions such as temperatures and pressures affect the amount of disorder each imparts on that system may help scientists understand why certain substances, like drugs used in chemotherapy, are soluble in water and why some are not.

It could also help understand how this changing network of bonds and ordering of local tetrahedrality between water molecules changes the nature of protein folding and degradation. “Understanding hydrophobicity, and how different conditions change it, is probably one of the most fundamental components in understanding how proteins fold in water and how different biomolecules remain stable in it,” says Kumar. “And if we understand this, we will not only have a new way of thinking about physics and biology but also a new way to approach health and disease.”

Story Source:

Adapted from materials provided by Rockefeller University.


Journal Reference:

  1. Kumar et al. A tetrahedral entropy for water. Proceedings of the National Academy of Sciences, 2009; 106 (52): 22130 DOI: 10.1073/pnas.0911094106

http://www.sciencedaily.com/releases/2010/02/100227215943.htm

Sound lasers inch closer to reality

Researchers are making quick progress toward high-frequency sound lasers that could be used for precise and non-destructive medical imaging. (Credit: Alan Stonebraker)

By Dario Borghino

Fifty years after the invention of the optical laser, two separate research groups have independently made important steps toward making phonon lasers – a type of laser that emits very high-frequency, coordinated sound rather than light waves – a reality. The studies, published in the current issue of the journal Physical Review Letters, could lead to a completely new kind of laser that could find interesting applications in medical imaging.

The quantum nature of light means it’s possible to emit coherent photons of the same frequency and phase, in a process called “stimulated emission”. This was predicted in 1917 by Albert Einstein and first put into practice in 1960, when the first optical laser was built.

Despite some fundamental differences, light and sound waves are both formed by quanta, meaning that sound lasers (or “sasers”) are also possible. Researchers have been looking at sound lasers for some time, but haven’t been able to build one working at very high (terahertz) frequencies just yet.

The interest in sound lasers is not purely academic: sound propagates at a speed roughly 100,000 times slower than light, so at a given frequency it has a proportionately smaller wavelength, along with lower energy levels. The combination of these two factors means sound lasers would allow for extremely precise imaging of living tissue without damaging it in the process (as is often the case with optical imaging).
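The arithmetic behind that claim is just wavelength = speed / frequency: at a fixed frequency, a wave’s length scales directly with its propagation speed. A two-line illustration:

```python
# lambda = v / f: wavelength scales with propagation speed.
speed_of_light = 3.0e8   # m/s
speed_of_sound = 1.5e3   # m/s, roughly water or soft tissue
f = 1.0e12               # 1 THz

print(speed_of_light / f)  # 3e-04 m: a terahertz light wave spans ~300 micrometers
print(speed_of_sound / f)  # 1.5e-09 m: the same-frequency sound wave is ~1.5 nanometers
```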

The main obstacle to building a high-frequency saser is also what makes it so attractive: the shorter wavelengths make it harder to coordinate the phonons so that they travel coherently, realizing “stimulated emission” with sound.

Two research teams from the US and the UK tackled the problem using different approaches, and both made important progress towards making sasers a reality. A group from Caltech assembled a pair of microscopic cavities that only permit specific frequencies of phonons to be emitted, effectively producing a resonator that ensures the waves are always in phase with each other.

A second group, from the University of Nottingham in the UK, took a different approach: they built their device out of electrons moving through a series of structures known as “quantum wells”. Whenever an electron hops from one quantum well to the next, it produces a phonon. While the device does not achieve true phonon lasing, it demonstrated amplification of high-frequency sound and could serve in the future as a fundamental building block of the first sound laser.

Both studies are important steps that could one day lead to practical, high-frequency phonon lasers. While it’s hard to predict what repercussions this could have in the long run (the optical laser was deemed next to useless shortly after being invented), medical imaging would surely benefit greatly from its development, even in the short term.

Papers: Phonon Laser Action in a Tunable Two-Level System, Coherent Terahertz Sound Amplification and Spectral Line Narrowing in a Stark Ladder Superlattice

http://www.gizmag.com/sound-lasers-medical-imaging/14337/

Interactions Between Species: Powerful Driving Force Behind Evolution?

Computer rendering of virus particles. In a new study, researchers used fast-evolving viruses to observe hundreds of generations of evolution. They found that for every viral strategy of attack, the bacteria would adapt to defend itself, which triggered an endless cycle of co-evolutionary change. (Credit: iStockphoto/Martin McCarthy)

Scientists at the University of Liverpool have provided the first experimental evidence that shows that evolution is driven most powerfully by interactions between species, rather than adaptation to the environment.

The team observed viruses as they evolved over hundreds of generations to infect bacteria. They found that when the bacteria could evolve defences, the viruses evolved at a quicker rate and generated greater diversity, compared to situations where the bacteria were unable to adapt to the viral infection.

The study shows, for the first time, that the American evolutionary biologist Leigh Van Valen was correct in his ‘Red Queen Hypothesis’. The theory, first put forward in the 1970s, was named after a passage in Lewis Carroll’s Through the Looking Glass in which the Red Queen tells Alice, ‘It takes all the running you can do to keep in the same place’. This suggested that species were in a constant race for survival and have to continue to evolve new ways of defending themselves throughout time.

Dr Steve Paterson, from the University’s School of Biosciences, explains: “Historically, it was assumed that most evolution was driven by a need to adapt to the environment or habitat. The Red Queen Hypothesis challenged this by pointing out that actually most natural selection will arise from co-evolutionary interactions with other species, not from interactions with the environment.

“This suggested that evolutionary change was created by ‘tit-for-tat’ adaptations by species in constant combat. This theory is widely accepted in the science community, but this is the first time we have been able to show evidence of it in an experiment with living things.”

Dr Michael Brockhurst said: “We used fast-evolving viruses so that we could observe hundreds of generations of evolution. We found that for every viral strategy of attack, the bacteria would adapt to defend itself, which triggered an endless cycle of co-evolutionary change. We compared this with evolution against a fixed target, by disabling the bacteria’s ability to adapt to the virus.

“These experiments showed us that co-evolutionary interactions between species result in more genetically diverse populations, compared to instances where the host was not able to adapt to the parasite. The virus was also able to evolve twice as quickly when the bacteria were allowed to evolve alongside it.”

The team used high-throughput DNA sequencing technology at the Centre for Genomic Research to sequence thousands of virus genomes. The next stage of the research is to understand how co-evolution differs when interacting species help, rather than harm, one another.

The research is published in Nature and was supported by funding from the Natural Environment Research Council (NERC); the Wellcome Trust; the European Research Council and the Leverhulme Trust.

Story Source:

Adapted from materials provided by University of Liverpool, via EurekAlert!, a service of AAAS.


Journal Reference:

  1. Steve Paterson, Tom Vogwill, Angus Buckling, Rebecca Benmayor, Andrew J. Spiers, Nicholas R. Thomson, Mike Quail, Frances Smith, Danielle Walker, Ben Libberton, Andrew Fenton, Neil Hall & Michael A. Brockhurst. Antagonistic coevolution accelerates molecular evolution. Nature, 2010; DOI: 10.1038/nature08798

http://www.sciencedaily.com/releases/2010/02/100225091344.htm