Category Archives: Technology
A job to do: Much as the Luddites feared mechanical looms 200 years ago, today’s middle-class workers have reason to worry that information technology erodes their employment prospects. Credit: Mary Evans Picture Library/Alamy
Information technology is reducing the need for certain jobs faster than new ones are being created.
The United States faces a protracted unemployment crisis: 6.3 million fewer Americans have jobs than at the end of 2007. And yet the country’s economic output is higher today than it was before the financial crisis. Where did the jobs go? Several factors, including outsourcing, help explain the state of the labor market, but fast-advancing, IT-driven automation might be playing the biggest role.
Since the beginning of the Industrial Revolution, people have feared that new technologies would permanently erode employment. Over and over again, these dislocations of labor have been temporary: technologies that made some jobs obsolete eventually led to new kinds of work, raising productivity and prosperity with no overall negative effect on employment.
There’s nothing to suggest that this dynamic no longer operates, but new research is showing that advances in workplace automation are being deployed at a faster pace than ever, making it more difficult for workers to adapt and wreaking havoc on the middle class: the clerks, accountants, and production-line workers whose tasks can increasingly be mastered by software and robots. “Do I think we will have permanently high unemployment as a consequence of technology? No,” says Peter Diamond, the MIT economist who won a 2010 Nobel Prize for his work on market imperfections, including those that affect employment. “What’s different now is that the nature of jobs going away has changed. Communication and computer abilities mean that the type of jobs affected have moved up the income distribution.”
Erik Brynjolfsson and Andrew McAfee study information-supercharged workplaces and the innovations and productivity advances they continually create. Now they have turned their sights to how these IT-driven improvements affect employment. In their new book, Brynjolfsson, director of the Center for Digital Business at MIT’s Sloan School of Management, and McAfee, its principal research scientist, see a paradox in the first decade of the 2000s. Even before the economic downturn caused U.S. unemployment to rise from 4.4 percent in May 2007 to 10.1 percent in October 2009, a disturbing trend was visible. From 2000 to 2007, GDP and productivity rose faster than they had in any decade since the 1960s, but employment growth was comparatively tepid.
Brynjolfsson and McAfee posit that more work was being done by, or with help from, machines. For example, Amazon.com reduced the need for retail staffers; computerized kiosks in hotels and airports replaced clerks; voice-recognition and speech systems replaced customer support staff and operators; and businesses of all kinds took advantage of tools such as enterprise resource planning software. “A classically trained economist would say: ‘This just means there’s a big adjustment taking place until we find the new equilibrium—the new stuff for people to do,’ ” says McAfee.
We’ve certainly made such adjustments before. But whereas agricultural advances played out over a century and electrification and factory automation rolled out over decades, the power of some information technologies is essentially doubling every two years or so as a consequence of Moore’s Law. It took some time for IT to fully replace the paper-driven workflows in cubicles, management suites, and retail stores. (In the 1980s and early 1990s productivity grew slowly, and then it took off after 1996; some economists explained that IT was finally being used effectively.) But now, Brynjolfsson and McAfee argue, the efficiencies and automation opportunities made possible by IT are advancing too fast for the labor market to keep up.
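To make that pace concrete, a capability that doubles every two years compounds into enormous multipliers over the adoption horizons of earlier technologies. A quick illustrative calculation (a sketch, not from the article):

```python
# Rough illustration: if a capability doubles every two years per
# Moore's Law, how much does it grow over the decades-long horizons
# that earlier technologies (electrification, factory automation)
# took to roll out?

def doublings(years, doubling_period=2):
    """Capability multiplier after `years`, doubling every `doubling_period` years."""
    return 2 ** (years / doubling_period)

for horizon in (10, 20, 50):
    print(f"{horizon:>3} years -> {doublings(horizon):,.0f}x")
```

Twenty years of two-year doublings is roughly a thousandfold increase, which is why a labor market that adjusted comfortably to century-long agricultural change struggles with IT.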
More evidence that technology has reduced the number of good jobs can be found in a working paper by David Autor, an economist at MIT, and David Dorn, an economist at the Center for Monetary and Financial Studies in Madrid. They too point to the crucial years of 2000–2005. Job growth happened mainly at the ends of the spectrum: in lower-paying positions, in areas such as personal care, cleaning services, and security, and in higher-end professional positions for technicians, managers, and the like. For laborers, administrative assistants, production workers, and sales representatives, the job market didn’t grow as fast—or even shrank. Subsequent research showed that things got worse after 2007. During the recession, nearly all the nation’s job losses were in those middle categories—the positions easiest to replace, fully or in part, by technology.
Brynjolfsson says the trends are “troubling.” And they are global; some of the jobs that IT threatens, for example, are at electronics factories in China and transcription services in India. “This is not about replacing all work, but rather about tectonic shifts that have left millions much worse off and others much better off,” he says. While he doesn’t believe the problem is permanent, that’s of little solace to the millions out of work now, and they may not be paid at their old rates even when they do find new jobs. “Over the longer term, they will develop new skills, or entrepreneurs will figure out ways of making use of their skills, or wages will drop, or all three of those things will happen,” he says. “But in the short run, your old set of skills that created a lot of value are not useful anymore.”
This means there’s a risk, unless the economy generates new high-quality jobs, that the people in the middle will face the prospect of menial jobs—whose wages will actually decline as more people compete for them. “Theory says the labor market will ‘clear.’ There are always things for people to do,” Autor says. “But it doesn’t say at what price.” And even as it gets crowded and potentially even less rewarding at the bottom, employees at the top are getting paid more, thanks to the multiplier effects of technology. Some 60 percent of the income growth in the United States between 2002 and 2007 went to the top 1 percent of Americans—the bulk of whom are executives whose companies are getting richer by using IT to become more efficient, Brynjolfsson and McAfee point out.
Story continues -> “Tectonic Shifts” in Employment
By Mat Smith
The high-definition pride of your living room may not want to hear it, but it looks like ultra high-definition TV (or UHDTV) has now taken another step towards reality. While shop-floor products remain years away, experts in the ITU Study Group on Broadcasting Service have made several agreements on technical standards for your (next?) next TV purchase. Increasing pixel count in future sets is also expected to improve viewing angles on glasses-free 3D, which needs more dots to work its lenticular magic. 33 megapixels sounds like it should be enough to work with.
A campaign has been launched to build the first working model of Charles Babbage's Analytical Engine – 173 years after it was designed.
The nineteenth-century mathematician produced detailed drawings of the steam-powered, general-purpose computer, which are now held at London's Science Museum.
Parts of the machine have been constructed several times, by Babbage himself, his family and others. But although his Difference Engine finally became a reality in 1991 and can be seen at the Science Museum, no full version of the Analytical Engine has ever been created.
“What a marvel it would be to stand before this giant metal machine, powered by a steam engine, and running programs fed to it on a reel of punched cards,” says programmer and blogger John Graham-Cumming, who has launched the campaign.
“And what a great educational resource so that people can understand how computers work. One could even imagine holding competitions for people (including school children) to write programs to run on the engine. And it would be a way to celebrate both Charles Babbage and Ada Lovelace. How fantastic to be able to execute Lovelace's code!”
It won't be easy. Unlike the Difference Engine, for which Babbage left a complete set of blueprints, the Analytical Engine was still a work in progress at the time of his death.
The first stage of the project, therefore, would be to go carefully through all the different versions to decide which one to build from.
Graham-Cumming is attempting to raise funds for the project, which would require several people to work on it, as well as some rather expensive materials. He says that, when complete, the machine would be donated to either the Science Museum or the National Museum of Computing.
Graham-Cumming has a long way to go. He's asking people to sign up and pledge £10/$10, saying he reckons he needs about 50,000 people. So far, 2,403 have agreed.
An image of the earth and the blackness of outer space, obtained by Luke and Max Geissbuhler – Click image for more pictures
By Ben Coxworth
It’s an inspiring story that reminds you how the wonders of scientific exploration aren’t just limited to research institutions with big budgets… in August of this year, Luke Geissbuhler and his seven-year-old son Max attached an HD video camera to a weather balloon and set it loose. They proceeded to obtain footage of the blackness of outer space, 19 miles (30 km) above the surface of the earth. Needless to say, there was a little more to it than just tying a piece of string around a camcorder.
Luke and Max created a miniature space capsule for their Brooklyn Space Program experiment, using a food take-out container. It contained the camera (with a peep hole for its lens), hand warmers to keep its battery warm, a “please return if you find this” note, and an iPhone, so that they could use its GPS to locate the capsule once it landed. The whole thing was coated in foam, to absorb the energy of a high-speed landing, and attached to a parachute.
The pair launched the balloon from Newburgh, New York, near their home in Brooklyn. Over the next 72 minutes, it proceeded to climb to over 100,000 feet (30,480 meters), encountering 100mph (161km/h) winds and temperatures of -60F (-51C) along the way. Due to the lack of pressure at such high altitudes, the balloon eventually expanded beyond its capacity and burst, sending the capsule on a 150mph (241km/h) parachute-assisted fall back to earth.
Amazingly, it landed just 30 miles (48 km) from its lift-off point, in the middle of the night. Guided by the capsule's external LED lamp, the Geissbuhlers found it hanging from its parachute in a tree.
The project involved eight months of research and testing, but as you can see in the video below, the results were well worth the effort.
Follow link for video -> http://www.gizmag.com/father-and-son-send-camera-to-outer-space/16650/
By Darren Quick
Just under a year ago we reported on a method to clean polluted water and soil by infusing them with pressurized ozone gas microbubbles. The process was developed by Andy Hong at the University of Utah and has now moved out of the lab and is being put to the test in a demonstration project in eastern China. If all goes to plan, the process has the potential to boost a wide range of environmental cleanup efforts around the world.
The process, which Hong calls heightened ozonation treatment (or HOT), exposes pollutants and makes them easier to remove. Its uses include removing oil and gas byproducts from water, removing organics and heavy metals from industrial sites, and removing harmful algae from lakes.
The China project has seen the University of Utah partner with Honde LLC, a large Chinese environmental cleanup company, and the Chinese government to remediate an industrial site on the shore of Lake Taihu. The large lake is located adjacent to Wuxi, a major Chinese industrial city west of Shanghai with a population of about 4.5 million. Lake Taihu receives runoff from across the region, which is dotted with polluted factory sites; the resulting nutrients collect in the lake and feed harmful algae.
“The lake requires extensive environmental cleanup after years of neglect,” Hong says. “We hope this restoration project will be the first among many to come for the area. We are fortunate that the Chinese government is aggressively cleaning up this area and willing to tackle challenging issues with new techniques that haven't been used anywhere else. This is a great opportunity for us and China.”
The focus of the project, which began in September and is expected to last three months, is removing heavy metals and other contaminants from the soil. The centerpiece of the effort is a HOT reactor – a pressurized metal vessel that produces ozone microbubbles. The reactor is currently being used to treat soil, but it can also be used to treat water, algae or sewage waste.
The HOT reactor is placed on the site to be cleaned and filled with contaminated soil. Organic contaminants (hydrocarbons) are removed first by repeatedly pressurizing and depressurizing the reactor with ozone gas, creating microbubbles that degrade the hydrocarbons. Metal contaminants are then removed by adding a chelating agent to extract them, followed by lime to precipitate them so they can be filtered out and disposed of.
“The clean soil will be used for tree planting on public lands, and the water is recycled and reused in subsequent batches of soil cleanup,” Hong says.
If the demonstration is successful, Hong expects the project to be replicated at other sites for different types of contaminants around Lake Taihu. He also expects his new method to be applied in the United States and other countries across the world. To that end, in addition to Honde, the technology has been licensed by 7Revolutions Energy Technology Fund – an investment company based in Salt Lake City and a University of Utah startup – which has started a company to explore using the technology in North America and elsewhere.
Not so long ago, the idea that a car could drive itself seemed mildly insane, but thanks to the impetus provided by the DARPA Urban Grand Challenge and ongoing research around the globe, driving might become a hobby rather than a necessity much sooner than you think. One of the pioneers in the field, the Berlin-based AutoNOMOS group unveiled its latest project earlier this year. Known as FU-X “Made in Germany” the tech-laden VW Passat uses GPS, video cameras, on-board laser scanners and radars to navigate autonomously, giving it the potential to be used as a driverless taxi cab. Its latest trick – you can now hail it with an iPad.
AutoNOMOS labs conducts research at the Freie Universität Berlin. Its aim is both to develop an unmanned vehicle navigation system that can co-exist on our roads with conventional cars and to explore potential uses for these systems.
One of the promising applications is for driverless taxis, and the iPad demo is an extension of this system that showcases the benefits of the technology. Using the iPad, a “call” is made to the taxi and it immediately knows where you are. You can also follow the car's progress as it makes its way to your location (no more ringing the taxi company to ask where that cab got to!) and use the iPad to tell it where you want to be dropped off.
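The hailing flow described above (the tablet sends its GPS fix, the car drives to it, the rider then sets a destination) can be sketched in a few lines. The class and method names below are purely illustrative and are not the actual AutoNOMOS software:

```python
# Hypothetical sketch of a taxi-hail request flow; names are invented
# for illustration, not taken from the AutoNOMOS system.
from dataclasses import dataclass

@dataclass
class LatLon:
    lat: float
    lon: float

class TaxiHail:
    def __init__(self):
        self.pickup = None
        self.dropoff = None

    def request_pickup(self, location: LatLon) -> str:
        # The tablet transmits its GPS fix; the car plans a route to it,
        # so the rider never has to type an address.
        self.pickup = location
        return "en route"

    def set_dropoff(self, location: LatLon) -> str:
        # Once aboard, the rider marks the destination on the map.
        self.dropoff = location
        return "destination set"

hail = TaxiHail()
print(hail.request_pickup(LatLon(52.457, 13.297)))  # rider's position
print(hail.set_dropoff(LatLon(52.520, 13.405)))     # chosen destination
```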
The video below provides an overview of the iPad controlled “Pick me up!” system.
Follow link for video -> http://www.gizmag.com/autonomous-taxi-ipad/16649/
In an effort to explore what is perhaps the last salient region of our solar system yet to be visited by a spacecraft, NASA has announced Solar Probe Plus, a mission that will launch a probe directly into the sun's atmosphere. The mission will seek to answer some of the outstanding questions about the nature of our very own star, while helping to understand and forecast the radiation environment in which future space explorers will be living and operating.
Humans have been observing the sun for millennia, and although much knowledge has been gathered in the past few decades, at least two outstanding questions keep puzzling scientists even today. The first enigma is the discovery, back in the 1940s, that our star's atmosphere (or corona) appears to be several hundred times hotter than the photosphere, the visible surface. The second is the origin of the solar winds in the atmosphere, which travel at supersonic speeds and affect our planet as well as the rest of the solar system.
Because a definitive answer to these questions can only come from direct measurements in the solar atmosphere, such a mission was first recommended in 1958, even though the necessary technology did not yet exist. A solar mission has remained one of the agency's top priorities ever since, and several studies of possible implementations have been conducted.
As the spacecraft approaches the sun, it will keep a distance of “only” 6 million kilometers (3.7 million miles) from its surface. The extreme conditions in this region, where scientists expect temperatures in excess of 1400 degrees Celsius (2552F) and intense radiation, require a purpose-built structure for adequate protection. This function is performed by the innovative Thermal Protection System (TPS) – a large, flat carbon shield 2.7 meters (8.86 feet) in diameter that keeps the spacecraft and instruments within its shadow during the solar encounters.
Power is provided by two separate solar array systems. The first is only intended to function as the probe approaches the star, while the second – consisting of two movable, liquid-cooled panels of high-temperature cells – is specifically designed to withstand the high temperatures of the Sun's corona. As the spacecraft moves even closer to the Sun, these secondary arrays will be partially retracted behind the TPS in order to maintain constant temperature and power output, while a lithium-ion battery will function as a backup power source.
The craft, which is roughly the size of a small car, will be guided by a system of three star trackers, an inertial measurement unit, as well as a solar horizon sensor. Four reaction wheels for attitude control and a monopropellant system for trajectory correction maneuvers are also part of the system.
Back in 2009, NASA invited researchers to submit science proposals that would be useful toward its goals of solving the outstanding questions about the solar atmosphere. Now that the project is being finalized, the five chosen projects (whose total cost is estimated at around US$180 million) have been announced.
- Solar Wind Electrons Alphas and Protons Investigation: the project will capture particles in the sun's atmosphere, such as electrons, protons and helium ions, into a specially designed cup and will directly measure their properties.
- Wide-field Imager: a telescope will take 3-D images of the solar corona, including three-dimensional images of clouds and radiation shocks as they approach and travel past the spacecraft.
- Fields Experiment: this experiment involves the direct measurement of electric and magnetic fields, radio emissions and shock waves that course through the sun's atmospheric plasma. The experiment will also serve as a giant dust detector, registering voltage signatures as space dust hits the spacecraft's antenna.
- Integrated Science Investigation of the Sun: two instruments will take an inventory of elements in the sun's atmosphere using a mass spectrometer to weigh and sort ions in the vicinity of the spacecraft.
- Heliospheric Origins with Solar Probe Plus: directed by the mission's observatory scientist Marco Velli, the aim of this project is to provide an independent assessment of the scientific performance of the overall mission.
Gary Munkhoff at the Green Living Journal PDX blog believes that Portland, Oregon deserves to have Shweeb human-powered monorails as an addition to Portland’s current Tri-Met transit offerings. Because of a recent Google investment, Munkhoff’s desire to see Shweeb in Portland has become a real possibility.
In September 2010, Google invested in research and development for the Shweeb human-powered monorails. Shweeb is now determining the location for its very first public use of the Shweeb transit. Whether Portland will be chosen is yet to be seen.
Portland is noted for its innovative ideas, sustainability, and for being bicycle friendly. Well, check out Shweeb: it is all of the above and more. (greenlivingpdx.blogspot)
The Long March 3C rocket carrying China's second unmanned lunar probe, Chang'e-2, lifts off from the launch pad at the Xichang Satellite Launch Center, Sichuan Province October 1, 2010. REUTERS/Stringer
By Ben Blanchard
BEIJING, Oct. 2, 2010 (Reuters) — China launched its second lunar exploration probe on Friday, boosting the country's efforts to rise as a major space power eventually capable of landing a man on the moon and perhaps one day exploring far beyond.
The Chang'e-2 lunar orbiter blasted off from a remote corner of the southwestern province of Sichuan a few seconds before 7 p.m. (7 a.m. EDT), state media said, on the same day the country celebrates 61 years since the founding of Communist China.
“Chang'e-2 lays the foundation for the soft landing on the moon and further exploration of outer space,” Xinhua news agency quoted the head of the orbiter's design team, Wu Weiren, as saying.
“It (will) travel faster and closer to the moon, and it will capture clear pictures,” Wu added.
State television delayed the start of its main evening news to carry live pictures of the launch, bumping a story about the country's top leaders attending National Day ceremonies on Beijing's central Tiananmen Square into second place.
The Chang'e-2 is expected to fly as close as 15 km (9.3 miles) above the moon, testing skills and technology intended to pave the way for an unmanned landing planned for around 2013.
It will take high-resolution photos of the moon's Bay of Rainbows, where engineers plan to land Chang'e-3, the official China Daily said.
China is jostling with neighbors Japan and India for a bigger presence in outer space but its plans have faced international scrutiny.
Fears of a space arms race with the United States and other powers have mounted since China blew up one of its own weather satellites with a ground-based missile in January 2007. China says its aims are purely peaceful.
The Chang'e is named after a mythical Chinese goddess who flew to the moon. A successful Chang'e-2 mission would mark another advance in China's plan to establish itself as a space power in the same league as the United States and Russia.
Chief designer Huang Jiangchuan told Xinhua before the launch that Chang'e-2 may be given an extra mission — flying into outer space to “test China's capability to probe further into space”. He did not elaborate.
In 2003, China became only the third country, after the United States and Russia, to send a man into space aboard its own rocket.
In October 2005, it sent two men into orbit, and in 2008 it staged its first “space walk”, when an astronaut floated outside a vehicle orbiting the Earth.
Chinese space officials said they are considering a manned landing on the moon by 2025-2030, state media reported last year.
China launched its first moon orbiter, the Chang'e-1, in October 2007, accompanied by a blaze of patriotic propaganda celebrating the country's technological prowess.
(Editing by Alex Richardson)
A 3D volume rendering of a reconstructed yeast spore from a set of 2D projectional images by a tomographic method, showing nucleus (orange), ER (green), vacuole (white), mitochondria (blue), and granules (light blue). (Credit: Image courtesy of UCLA/California NanoSystems Institute)
Three-dimensional imaging is dramatically expanding the ability of researchers to examine biological specimens, enabling a peek into their internal structures. And recent advances in X-ray diffraction methods have helped extend the limit of this approach.
While significant progress has been made in optical microscopy to break the diffraction barrier, such techniques rely on fluorescent labeling technologies, which prohibit the quantitative 3-D imaging of the entire contents of cells. Cryo-electron microscopy can image structures at a resolution of 3 to 5 nanometers, but this only works with thin or sectioned specimens.
And although X-ray protein crystallography is currently the primary method used for determining the 3-D structure of protein molecules, many biological specimens — such as whole cells, cellular organelles, some viruses and many important protein molecules — are difficult or impossible to crystallize, making their structures inaccessible. Overcoming these limitations requires the employment of different techniques.
Now, in a paper published May 31 in the Proceedings of the National Academy of Sciences, UCLA researchers and their collaborators demonstrate the use of a unique X-ray diffraction microscope that enabled them to reveal the internal structure of yeast spores. The team reports the quantitative 3-D imaging of a whole, unstained cell at a resolution of 50 to 60 nanometers using X-ray diffraction microscopy, also known as lensless imaging.
Researchers identified the 3-D morphology and structure of cellular organelles, including the cell wall, vacuole, endoplasmic reticulum, mitochondria, granules and nucleolus. The work may open a door to identifying individual protein molecules inside whole cells using labeling technologies.
The lead authors on the paper are Huaidong Jiang, a UCLA assistant researcher in physics and astronomy, and John Miao, a UCLA professor of physics and astronomy. The work is a culmination of a collaboration started three years ago with Fuyu Tamanoi, UCLA professor of microbiology, immunology and molecular genetics. Miao and Tamanoi are both researchers at UCLA’s California NanoSystems Institute. Other collaborators include teams at Riken Spring 8 in Japan and the Institute of Physics, Academia Sinica, in Taiwan.
“This is the first time that people have been able to peek into the 3-D internal structure of a biological specimen, without cutting it into sections, using X-ray diffraction microscopy,” Miao said.
“By avoiding use of X-ray lenses, the resolution of X-ray diffraction microscopy is ultimately limited by radiation damage to biological specimens. Using cryogenic technologies, 3-D imaging of whole biological cells at a resolution of 5 to 10 nanometers should be achievable,” Miao said. “Our work hence paves a way for quantitative 3-D imaging of a wide range of biological specimens at nanometer-scale resolutions that are too thick for electron microscopy.”
Tamanoi prepared the yeast spore samples analyzed in this study. Spores are specialized cells that form under nutrient-starved conditions; cells use this survival strategy to cope with harsh environments.
“Biologists wanted to examine internal structures of the spore, but previous microscopic studies provided information on only the surface features. We are very excited to be able to view the spore in 3-D,” Tamanoi said. “We can now look into the structure of other spores, such as Anthrax spores and many other fungal spores. It is also important to point out that yeast spores are of similar size to many intracellular organelles in human cells. These can be examined in the future.”
Since its first experimental demonstration by Miao and collaborators in 1999, coherent diffraction microscopy has been applied to imaging a wide range of materials science and biological specimens, such as nanoparticles, nanocrystals, biomaterials, cells, cellular organelles, viruses and carbon nanotubes using X-ray, electron and laser facilities worldwide. Until now, however, the radiation-damage problem and the difficulty of acquiring high-quality 3-D diffraction patterns from individual whole cells have prevented the successful high-resolution 3-D imaging of biological cells by X-ray diffraction.
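The “lensless imaging” described above recovers an image computationally from diffraction magnitudes alone, because the phases of the diffracted wave are lost in measurement. A minimal sketch of one standard reconstruction loop (Fienup's error-reduction algorithm, demonstrated on a toy object; this is a generic illustration of the technique, not the authors' actual pipeline):

```python
import numpy as np

def error_reduction(measured_mag, support, n_iter=200, seed=0):
    """Iterative phase retrieval: alternate between enforcing the measured
    Fourier magnitudes and a real-space support/non-negativity constraint."""
    rng = np.random.default_rng(seed)
    # Start from the measured magnitudes paired with random phases.
    phase = np.exp(2j * np.pi * rng.random(measured_mag.shape))
    obj = np.fft.ifft2(measured_mag * phase)
    for _ in range(n_iter):
        # Fourier constraint: keep the current phases, impose measured magnitudes.
        F = np.fft.fft2(obj)
        F = measured_mag * np.exp(1j * np.angle(F))
        obj = np.fft.ifft2(F)
        # Real-space constraint: zero outside the support, non-negative inside.
        obj = np.where(support, np.maximum(obj.real, 0), 0)
    return obj

# Toy demonstration: a small non-negative object and its diffraction magnitudes.
true = np.zeros((32, 32))
true[12:20, 12:20] = 1.0
mag = np.abs(np.fft.fft2(true))
support = true > 0  # in real experiments the support is estimated, not known
rec = error_reduction(mag, support)
# The reconstruction is confined to the support by construction.
print(rec.shape, bool(np.all(rec[~support] == 0)))
```

In practice, as the article notes, the hard parts are acquiring high-quality 3-D diffraction patterns and managing radiation damage; the reconstruction step above is the computational core that replaces the physical lens.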
The above story is reprinted (with editorial adaptations by ScienceDaily staff) from materials provided by University of California – Los Angeles. The original article was written by Jennifer Marcus.
- H. Jiang, C. Song, C.-C. Chen, R. Xu, K. S. Raines, B. P. Fahimian, C.-H. Lu, T.-K. Lee, A. Nakashima, J. Urano, T. Ishikawa, F. Tamanoi, J. Miao. Quantitative 3D imaging of whole, unstained cells by using X-ray diffraction microscopy. Proceedings of the National Academy of Sciences, 2010; DOI: 10.1073/pnas.1000156107