When a Ping Pong Ball Breaks the Sound Barrier

Better Nuclear Power Through Ping Pong

It easily blasts a hole in a ping pong paddle. It also demonstrates a revolutionary way to harvest nuclear power.

The lab is deep-space quiet. A long, narrow hallway hung with fluorescent lights extends to my left. Four or five doors interrupt the flow of drywall. A few of those doors are open, the occupants of the rooms within now out in the hall and staring, ears plugged in anticipation.

A technician flips a small lever to activate the vacuum pumps on an 18-foot cannon that is tented in bulletproof polycarbonate. He’s dressed casually in dark jeans and a black button-down, an ID card coolly clipped to his pants. He wears clear safety glasses and bright red protective headphones. Like the scientists down the hall, he is part of Intellectual Ventures in Bellevue, Washington—a skunkworks created by Nathan Myhrvold (Microsoft’s former chief technology officer and a bit of a mad scientist), who pays some of the smartest doctors, biologists, chemists, nuclear scientists, demolition experts, and hackers to work together to create great things. Things like the cannon we’re about to fire, which demonstrates technology that could change the nuclear power industry.

The pumps chitter away, sucking air from the barrel. That’s the secret to breaking the sound barrier with a ping pong ball. If any air were left in front of the ball, it would crush the ball under the force of the acceleration. I press a button to release 400 psi of helium gas into the accumulator. The breech is loaded and the silence returns—until I yell “Fire in the hole!” and press the red fire button. A shattering ka-BAWOOOMM roars through the lab complex. The smell of smoke hits my nostrils. Splinters burst everywhere, crashing into the plywood backstop and bulletproof protection panels. They came from the ping pong paddle mounted two inches in front of the cannon. That paddle, a multilayered rubber-and-wood Stiga, now has a ping-pong-ball-shaped hole through its center. Considering that the little yellow ping pong ball was traveling at Mach 2.09, the paddle didn’t have a chance.

The cannon is a prop, really—something to get potential investors excited about the technology. After our test fire, the scientists in the hallway are cheering. This isn’t just work.

Conventional reactors use designs that have remained basically unchanged since the 1950s. They require expensive enriched uranium and frequent fuel changes. The Intellectual Ventures design, from a spin-off called TerraPower, uses unenriched uranium and needs fuel changes only every ten years.

What does any of that have to do with ping pong? Imagine the ping pong ball is a neutron. In a conventional reactor, a neutron knocks into an atom and releases two or three neutrons, creating heat in a slow chain reaction. In the TerraPower reactor, that neutron travels more like the ping pong ball: at an insanely high speed. It bashes into atoms, freeing neutrons like the shards that fly from the demolished ping pong paddle—as many as six per collision. Those neutrons retain most of the speed of the first and go on to cause collisions of their own, freeing even more neutrons and continuing the chain reaction with exponentially higher efficiency. The design, called a Traveling Wave Reactor, unlocks about 30 times more energy, produces three to six times less waste, improves safety, and, TerraPower contends, will eventually eliminate the need to use enrichment. It also manages to use the plutonium created without having to remove it from the reaction and process it, which means the technology could be shared with rogue nations without worrying that it would be weaponized. (If the plutonium never comes out of the system, it can’t be put in a missile.)

With our tests finished, the researchers head back to their labs to work on the next great project. 3ric Johanson (not a typo, he’s a hacker and engineer), who worked on the cannon, turns to me with joy. Even if there hadn’t been a nuclear project, he says, “we would have made the cannon anyway, just because it’s cool.” The group plans for the technology to be operational by 2027. In the meantime, they’ll be doing a lot of testing of the ping pong cannon. Whether they need to or not.

Two Types of Nuclear Reactions


U-235 is the fissile isotope concentrated in enriched uranium. It’s easily split by a slow-moving neutron. When the neutron hits the uranium atom, the atom divides into two fission products and releases two to three neutrons. One of those neutrons might be absorbed by unenriched uranium, U-238. One might hit another U-235 atom to continue the chain reaction. And most others will leak out and no longer contribute to the process. Enriched U-235 atoms must be added to continue the reaction. If too many U-238 atoms are present, the reaction will die.


Neutrons in fast reactions move much more quickly because the reactor uses liquid sodium metal as coolant instead of water. Sodium atoms are heavier than the atoms in water, so neutrons lose less energy when they bounce off of them and retain their speed. When a neutron hits a U-235 atom, the higher velocity releases three to six neutrons. According to Nick Touran at TerraPower, one hits a U-235 atom to continue the reaction. Two or three hit U-238 atoms and convert them to plutonium. The rest are lost. Slow reactions don’t have many extra neutrons, so atoms converted from U-238 are rarely struck again. But in fast reactions, free neutrons split the plutonium atoms, release more neutrons, and continue the reaction, without the need to remove the plutonium from the system for purification.
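The two neutron budgets just described can be tallied in a toy sketch. The splits below are illustrative averages taken from the counts in the text (two to three neutrons per slow fission, three to six per fast fission), not reactor-physics constants:

```python
# Toy bookkeeping of neutron fates per fission, using the rough counts
# given in the text. All numbers are illustrative averages.

def neutron_budget(released, to_chain, to_breeding):
    """Split `released` neutrons per fission into those that continue the
    chain, those that convert U-238 into plutonium (new fuel), and losses."""
    lost = released - to_chain - to_breeding
    return {"chain": to_chain, "bred_fuel": to_breeding, "lost": lost}

# Slow (thermal) reaction: ~2.5 neutrons released, 1 sustains the chain,
# ~1 is absorbed by U-238, the rest leak out.
slow = neutron_budget(released=2.5, to_chain=1, to_breeding=1)

# Fast reaction: ~4.5 neutrons released, 1 sustains the chain,
# ~2.5 breed plutonium from U-238, the rest are lost.
fast = neutron_budget(released=4.5, to_chain=1, to_breeding=2.5)

print(slow)  # {'chain': 1, 'bred_fuel': 1, 'lost': 0.5}
print(fast)  # {'chain': 1, 'bred_fuel': 2.5, 'lost': 1.0}
```

The point of the comparison: in the fast budget, every fission turns roughly two and a half otherwise-unusable U-238 atoms into new fuel, which is why the design can keep running on unenriched uranium.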

This story appears in the May 2017 Popular Mechanics.


New bill would let companies force workers to get genetic tests, share results

Under guise of “voluntary” wellness programs, employees’ genetics could be exposed.

By Beth Mole, 3/10/2017, 11:14 AM

It’s hard to imagine a more sensitive type of personal information than your own genetic blueprints. With varying degrees of accuracy, the four-base code can reveal bits of your family’s past, explain some of your current traits and health, and may provide a glimpse into your future with possible conditions and health problems you could face. And that information doesn’t just apply to you but potentially your blood relatives, too.

Most people would likely want to keep the results of genetic tests highly guarded—if they want their genetic code deciphered at all. But, as STAT reports, a new bill quietly moving through the House would allow companies to strong-arm their employees into taking genetic tests and then sharing that data with their employer and with unregulated third parties. Employees who resist could face penalties of thousands of dollars.

In the past, such personal information has been protected by a law called GINA, the Genetic Information Nondiscrimination Act, which shields people from DNA-based discrimination. But the new bill, HR 1313, gets around this by allowing genetic testing to be part of company wellness programs.

Company wellness programs, which often involve filling out health surveys and undergoing screenings, are pitched as a way to improve employee health and reduce overall health costs. But research has shown that they have little effect on employee health and may actually end up costing companies money. Still, they may survive as a way to push healthcare costs onto employees. As Ars has reported before, companies use financial incentives to get employees to participate in these wellness programs. Under the ACA, these incentives can include all sorts of rewards and penalties. For instance, people who don’t want to participate can pay up to 60 percent more on employer-sponsored insurance premiums. That can easily amount to thousands of dollars each year.
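To make the scale of that penalty concrete, here is a quick back-of-the-envelope calculation. The $6,000 base premium is a hypothetical figure chosen only for illustration; the 60 percent surcharge is the figure cited in the article:

```python
# Hypothetical example: how a 60% premium surcharge becomes
# "thousands of dollars each year".
base_annual_premium = 6000   # assumed employee premium share, USD/year
surcharge_rate = 0.60        # up to 60 percent more, per the ACA rules cited

penalty = base_annual_premium * surcharge_rate
print(f"Annual cost of opting out: ${penalty:,.0f}")
```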

Despite the heavy financial pressure, employee participation is still considered voluntary. Under HR 1313, GINA wouldn’t apply to anything voluntarily collected through wellness programs, and companies would have access to genetic data. That information would be stripped of identifiers, but in small companies, it could be fairly easy to match certain genetic profiles to specific employees.

Moreover, employers tend to hire third parties to collect and manage health data. These companies are not heavily regulated and can review genetic and other health data with identifiers. Some of the companies even sell health information to advertisers, STAT notes.

Civil rights and genetic privacy advocates strongly opposed the bill. In a press release, Nancy Cox, PhD, president of the American Society of Human Genetics, said:

“We urge the Committee not to move forward with consideration of this bill. As longtime advocates of genetic privacy, we instead encourage the Committee to pursue ways to foster workplace wellness and employee health without infringing upon the civil rights afforded by [Americans with Disabilities Act] and GINA.”

On Wednesday, the House Education and the Workforce Committee approved HR 1313 along party lines, with 22 Republicans supporting and 17 Democrats opposing the bill.


Self-Healing Transistors for Chip-Scale Starships

A new design could survive the radiation of a 20-year trip to Alpha Centauri
Photo: Yang-Kyu Choi
Cosmic-Ray-Proof: A test chip includes DRAM and logic circuits made from self-healing gate-all-around transistors.

Working with the Korea Advanced Institute of Science and Technology (KAIST), NASA is pioneering the development of tiny spacecraft, each made from a single silicon chip, that could slash interstellar exploration times.

Speaking at the IEEE International Electron Devices Meeting in San Francisco last December, NASA’s Dong-Il Moon detailed this new technology, which is aimed at ensuring such spacecraft survive the potentially powerful radiation they’ll encounter on their journey.

Calculations suggest that if silicon chips were used to form the heart of a spacecraft powered by a tiny, featherweight solar sail and accelerated by a gigawatt-scale laser system, the craft could accelerate to one-fifth the speed of light. At such high speeds, it would reach the nearest stars in just 20 years, compared with the tens of thousands of years it would take a conventional spacecraft.

Moon and coworkers argue that 20 years in space is still too long for an ordinary silicon chip, because on its journey it will be bombarded by more high-energy radiation than chips encounter on Earth. “You are above most of the magnetic fields that block a lot of radiation, and above most of the atmosphere, which also does a good job of blocking radiation,” says Brett Streetman, who leads efforts in chip-scale spacecraft at the Charles Stark Draper Laboratory, in Cambridge, Mass.

Radiation leads to the accumulation of positively charged defects in the chip’s silicon dioxide layer, where they degrade device performance. The most serious of the impairments is an increase in the current that leaks through a transistor when it is supposed to be turned off, according to Yang-Kyu Choi, leader of the team at KAIST, where the work was done.

Two options for addressing chip damage are to select a path through space that minimizes radiation exposure and to add shielding. But the former leads to longer missions and constrains exploration, and the latter adds weight and nullifies the advantage of using a miniaturized craft. A far better approach, argues Moon, is to let the devices suffer damage but to design them so that they can heal themselves with heat.

“On-chip healing has been around for many, many years,” says Jin-Woo Han, a member of the NASA team. The critical addition made now, Han says, is the most comprehensive analysis of radiation damage so far.

This study uses KAIST’s experimental “gate-all-around” nanowire transistor. These devices use nanoscale wires as the transistor channel instead of today’s fin-shaped channels. The gate-all-around device may not be well known today, but production is expected to rocket in the early 2020s. [See “Transistors Could Stop Shrinking in 2021,” IEEE Spectrum, August 2016.]

The gate—the electrode that turns the flow of charge through the channel on or off—completely surrounds the nanowire. Adding an extra contact to the gate allows you to pass current through it. That current heats the gate and the channel it surrounds, fixing any radiation-induced defects.
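The heating mechanism is plain Joule heating: power dissipated scales as the square of the current through the gate. The sketch below uses hypothetical resistance and current values, chosen only to show the relationship; they are not figures from the KAIST work:

```python
# Joule heating: P = I^2 * R. The values below are hypothetical.

def joule_power(current_amps, resistance_ohms):
    """Power dissipated as heat by a current through a resistance."""
    return current_amps ** 2 * resistance_ohms

# e.g. 1 mA through a 10 kOhm gate line dissipates about 10 mW,
# concentrated in a nanometer-scale volume around the channel.
print(joule_power(1e-3, 1e4))
```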

Nanowire transistors are ideal for space, according to KAIST, because they naturally have a relatively high degree of immunity to cosmic rays and because they are small, with dimensions in the tens of nanometers. “The typical size for [transistor dimensions on] chips devoted to spacecraft applications is about 500 nanometers,” says Choi. “If you can replace 500-nm feature sizes with 20-nm feature sizes, the chip size and weight can be reduced.” Costs fall too.

KAIST’s design has been used to form three key building blocks for a single-chip spacecraft: a microprocessor, DRAM memory for supporting this, and flash memory that can serve as a hard drive.

Repairs to radiation-induced damage can be made many times, with experiments showing that flash memory can be recovered up to around 10,000 times and DRAM returned to its pristine state 10¹² times. With logic devices, an even higher figure is expected. These results indicate that a lengthy interstellar space mission could take place, with the chip powered down every few years, heated internally to recover its performance, and then brought back to life.
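A quick sanity check of that repair budget, using the figures from the paragraph above and assuming an anneal every two years:

```python
# How many repair cycles does a 20-year mission actually need, versus
# the recovery limits reported above?
mission_years = 20
years_between_anneals = 2        # "every few years", per the article
flash_limit = 10_000             # flash recoveries demonstrated
dram_limit = 10**12              # DRAM recoveries demonstrated

cycles_needed = mission_years // years_between_anneals
print(cycles_needed)                 # 10
print(cycles_needed < flash_limit)   # True
print(cycles_needed < dram_limit)    # True
```

Even with a generous margin for extra anneals, the mission uses a vanishingly small fraction of the demonstrated recovery limits; flash, the weakest link, still has a thousand-fold cushion.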

Philip Lubin, a professor at the University of California, Santa Barbara, believes that this annealing-based approach is “creative and clever” but wonders how much danger from cosmic rays there really will be to these chips. He would like to see a thorough evaluation of existing technologies for chip-scale spacecraft, pointing out that radiation-hardened electronics have already been developed for military applications.

Today, efforts at NASA and KAIST are focusing on the elimination of the second gate contact for heating. This contact is not ideal because it modifies chip design and demands the creation of a new transistor library, which escalates production costs. Those at KAIST are investigating the capability of a different design, called a junctionless nanowire transistor, which heats the channel during normal operation. Separately, at NASA, researchers are developing on-chip embedded microheaters that are compatible with standard circuits.

Cutting the costs of self-healing tech will play a key role in determining its future in chip-scale spacecraft, which will require many more years of investment before they can get off the ground.


Inuits Inherited Cold Adaptation Genes from Denisovan-Related Species

In the Arctic, the Inuits have adapted to cold and a seafood diet. Following the first genomic analysis of Greenlandic Inuits, researchers have now scrutinized a region of the genome containing two genes, TBX15 and WARS2.

Denisovans were probably dark-skinned, unlike the pale Neandertals. Image credit: Mauro Cutrona.

Dr. Fernando Racimo of the New York Genome Center and his colleagues have now followed up on that study to trace back the origins of these adaptations.

“To identify genes responsible for biological adaptations to life in the Arctic, Fumagalli et al. scanned the genomes of Greenlandic Inuit using the population branch statistic, which detects loci that are highly differentiated from other populations,” the researchers explained.

“Using this method, they found two regions with a strong signal of selection: (i) one region contains the cluster of FADS genes, involved in the metabolism of unsaturated fatty acids; (ii) the other region contains WARS2 and TBX15, located on chromosome 1.”

“WARS2 encodes the mitochondrial tryptophanyl-tRNA synthetase. TBX15 is a transcription factor from the T-box family and is a highly pleiotropic gene expressed in multiple tissues at different stages of development.”

“TBX15 plays a role in the differentiation of brown and brite adipocytes. Brown and brite adipocytes produce heat via lipid oxidation when stimulated by cold temperatures, making TBX15 a strong candidate gene for adaptation to life in the Arctic.”
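The population branch statistic used in the scan described above has a compact standard form: each pairwise F_ST is converted to a branch length T = -log(1 - F_ST), and the branch private to population A has length PBS_A = (T_AB + T_AC - T_BC) / 2. A minimal sketch, with made-up F_ST inputs:

```python
import math

def branch_length(fst):
    """Convert a pairwise F_ST value into a branch-length-like quantity."""
    return -math.log(1 - fst)

def pbs(fst_ab, fst_ac, fst_bc):
    """Population branch statistic for population A, given pairwise F_ST
    among populations A, B, and C. Large values mean the locus is highly
    differentiated specifically along A's branch."""
    return (branch_length(fst_ab) + branch_length(fst_ac)
            - branch_length(fst_bc)) / 2

# A locus strongly differentiated in population A versus both reference
# panels (which remain similar to each other) yields a large PBS.
print(pbs(fst_ab=0.5, fst_ac=0.5, fst_bc=0.05))
```

Scanning this statistic across the genome highlights loci, like the TBX15/WARS2 region, where one population's allele frequencies have diverged sharply from everyone else's.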

In their own study, Dr. Racimo and co-authors used the genomic data from nearly 200 Greenlandic Inuits and compared this to the 1000 Genomes Project and ancient DNA from Neanderthals and Denisovans.

The results provide convincing evidence that the Inuit variant of the TBX15/WARS2 region first came into modern humans from an archaic hominid population, likely related to the Denisovans.

“The Inuit DNA sequence in this region matches very well with the Denisovan genome, and it is highly differentiated from other present-day human sequences, though we can’t discard the possibility that the variant was introduced from another archaic group whose genomes we haven’t sampled yet,” Dr. Racimo said.

The scientists found that the variant is present at low-to-intermediate frequencies throughout Eurasia, and at especially high frequencies in the Inuits and Native American populations, but almost absent in Africa.

They speculate that the archaic variant may have been beneficial to modern humans during their expansion throughout Siberia and across Beringia, into the Americas.

The team also worked to understand the physiological role of the TBX15/WARS2 region.

They found an association between the archaic region and the gene expression of TBX15 and WARS2 in various tissues, like fibroblasts and adipose tissue.

They also observed that the methylation patterns in this region in the Denisovan genome are very different from those of Neanderthals and present-day humans.

“All this suggests that the introduced variant may have altered the regulation of these genes, though the exact mechanism by which this occurred remains elusive,” Dr. Racimo said.

The team’s results were published online this week in the journal Molecular Biology and Evolution.


Fernando Racimo et al. Archaic adaptive introgression in TBX15/WARS2. Mol Biol Evol, published online December 21, 2016; doi: 10.1093/molbev/msw283


Do not buy the House Science Committee’s claim that scientists faked data

No credible evidence supports that NOAA fabricated data; evidence still points to climate change
By Kendra Pierre-Louis

Climate scientists have worked hard for decades to prove climate change. Why is the US House Committee on Science, Space and Technology working so hard not to believe them?

On Sunday February 5th, the U.S. House of Representatives Committee on Science, Space, and Technology published a press release alleging, based on questionable evidence, that the National Oceanic and Atmospheric Administration (NOAA) “manipulated climate records.”

The source of their evidence, according to Committee spokesperson Thea McDonalds, was a Daily Mail article. The Daily Mail is a British tabloid most famous for outlandish headlines such as “Is the Bum the New Side Boob” and “ISIS Chief executioner winning hearts with his rugged looks.” This is not the first time that the House Science Committee has used spurious evidence to dispute the existence of human-driven climate change.

The piece, which quotes John Bates—a scientist who NOAA once employed—challenges the data used in the famous 2015 Karl study. The study, named after Thomas R. Karl—the director of NOAA’s National Centers for Environmental Information (NCEI) and the paper’s lead author—was published in Science and debunked the notion of a climate “hiatus” or “cooling.”

The committee’s press release, which includes quotes from Chairman Lamar Smith as well as Darin LaHood (R-Ill.) and Andy Biggs (R-Ariz.), misrepresents a procedural disagreement as proof that human-caused climate change is not occurring. It’s akin to pointing to a family argument as proof that the family members aren’t actually related.

“What the House Committee is trying to do, like they did in the past, is debunk the whole issue of global warming,” said Yochanan Kushnir, a Senior Scientist at the Lamont Doherty Earth Observatory.

At the center of the argument is contention over how NOAA maintains climate data records. Climate researchers receive grants to process and develop climate-related data sets. Once those data sets are fully developed, it becomes the responsibility of NOAA’s National Climatic Data Center (NCDC) to preserve, monitor, and update that data—which can sometimes be what data scientists refer to as messy.

“The problem,” said Kevin Trenberth a Distinguished Senior Scientist at the National Center for Atmospheric Research, “is that this is quite an arduous process, and can take a long time. And, of course NOAA doesn’t necessarily get an increase of funds to do this.”

Maintaining this data fell under the purview of Bates’ group, and it’s this data that he has taken issue with publicly.

“Bates was complaining that not all of the data sets were being done as thoroughly as he wanted,” said Trenberth. “But there’s a compromise you have to make as to whether you can do more data sets or whether you can do more really thoroughly. And the decision was made that you try and do more.”

Ice core samples are used as proxy indicators for past global climate temperatures and atmospheric CO2 concentrations.

Bates takes particular issue with the way Karl handled land temperature data in the Science study, which addressed the so-called “climate hiatus.” Early analyses of global temperature trends during the first ten years of this century seemed to suggest that warming had slowed down. Climate change doubters used this analysis to support their belief that—despite climatological data that includes 800,000-year ice-core records of atmospheric carbon dioxide—humans have not affected the atmosphere by releasing billions of tons of carbon dioxide per year.

“His primary complaint seems to be that when researchers at NOAA published this paper in Science, while they used a fully developed and vetted ocean temperature product, they used an experimental land temperature product,” said Zeke Hausfather, an energy systems analyst and environmental economist with Berkeley Earth. Because climate data comes from a number of different sources, methods of handling that data go through a vetting process that ultimately dictates the use of one for the official government temperature product. That can mean controlling for known defects in the devices that gather climate data or figuring out the best way to put them together. The product that Karl used for land temperature data hadn’t finished that process.

“That said,” said Hausfather, “the land temperature data they used in the paper is certainly up to the standards of an experimental or research product.”

So what does that mean for those of us on the outside?

Not much.

The record data that Bates takes umbrage with showed roughly the same amount of warming as the old record. And the evidence that the Karl paper cites as to why there’s no hiatus is based on ocean temperatures, not land. A government source who does not wish to be named emphasized that there is no evidence, or even a credible suggestion, that NOAA falsified data in the Karl et al (K15) study. And even if Bates’ critiques were valid, they wouldn’t upend the study’s conclusion; given that this methodology, after much peer review, is now the default way that NOAA calculates land temperatures, his complaints seem moot. The evidence still supports warming. As for the differences in water temperatures, those can easily be accounted for by differences in the tools used to measure them. In the past, as PopSci previously reported, most ocean temperature data was taken by ships that pulled water into their engine rooms, which are warmer than the ocean outside, making recorded ocean temperatures slightly higher. When ocean temperature tracking switched to buoys, which stay in the water all the time and don’t heat up, NOAA initially failed to account for the cooler (and arguably more accurate) readings taken without hot ship engines nearby. The Karl study corrects for that temperature difference, and Bates’ complaints do nothing to discredit it.
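The ship-versus-buoy adjustment works by adding a fixed warm-bias offset to buoy readings so the two instrument types become comparable. The sketch below is a minimal illustration; the 0.12 °C offset is an assumed value for demonstration, not a figure quoted in this article:

```python
# Minimal sketch of a ship/buoy bias correction. The offset value is an
# assumption for illustration, not the published NOAA figure.
SHIP_BUOY_OFFSET_C = 0.12  # assumed warm bias of engine-room intake readings

def harmonize(readings):
    """Shift buoy readings up by the offset so they are comparable with
    the warmer ship engine-intake readings."""
    return [temp + SHIP_BUOY_OFFSET_C if source == "buoy" else temp
            for source, temp in readings]

readings = [("ship", 15.30), ("buoy", 15.15), ("buoy", 15.21)]
print(harmonize(readings))
```

Whether the offset is added to buoys or subtracted from ships doesn't change the trend; the point is simply that mixing uncorrected instrument types would create a spurious cooling step as buoys replaced ships.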

“People should be aware of the fact that there are different groups that analyze the data,” said Kushnir. “If you look at all of the sources together you get a bigger, more reliable picture of what’s happening. There’s the Hadley Center from the UK meteorology office that puts together a data set of global mean temperatures, there’s NASA, NOAA, then there’s the Berkeley group and the Japanese who have their own way of putting information together.”

Zeke Hausfather at Berkeley Earth independently developed an updated version of Figure 2 in Karl et al 2015. The black line shows the new NOAA record, while the thinner blue line shows the results from raw land stations, ships, and buoys with no corrections for station moves or instrument changes. The two are quite similar over the last 50 years; over the last 100 years the corrected data [the one Karl uses] actually shows less global warming.

The Karl paper is also not the only one to tackle the hiatus. A study in Nature by Stephan Lewandowsky of the University of Bristol’s Cabot Institute and another in the journal Climatic Change by Bala Rajaratnam of Stanford University say the same thing.

The Karl study’s high profile, however, has made it a frequent target for criticism.

“The whole issue of this hiatus issue was discussed quite heavily in science,” said Kushnir. “And as scientists we understand what happened in this long period.”

Basically, there’s the natural climate variability, and then there’s the variability caused by climate change. The natural variability during this period was cooler, but the climate change impact on top of it was not.

But that isn’t even Bates’ complaint, as the House Committee would imply—his complaint is that the data wasn’t vetted heavily enough.

“I interpret a key part of the issue,” said Trenberth, “as, how deep do you go and how far into the research do you go for one particular data set, as opposed to moving onto the next data set and getting that into a much better state than it would have been otherwise.”

Trenberth points to a backlog of data that hasn’t yet been released or updated, pressuring NOAA to focus on volume over perfection. If this sounds to you like an argument for more funding for climate change research instead of less, you’re not alone.

“Recommendations about doing these things have been made, but they’ve never been adequately funded. So we muddle along,” said Trenberth. “And Lamar Smith in the House has been responsible for some of this, because they actually cut the funding to enable NOAA to properly deal with and process the data by 30 percent in 2012. So the ability to do this properly has actually been compromised by the House Science Committee and by Lamar Smith in particular.”

The current administration has talked a lot about the “politicization of science.” Meanwhile on the House Committee’s website, Representative Smith states that Bates has exposed the “previous administration’s efforts to push their costly climate agenda at the expense of scientific integrity.” With the House Committee misrepresenting both Bates’ complaint and the overarching scientific consensus, it does indeed seem that the politicization of science is a problem the administration needs to deal with.


Why a Tax Break for Security Cameras Is a Terrible Idea

Law enforcement agencies around the country have been expanding their surveillance capabilities by recruiting private citizens and businesses to share their security camera footage and live feeds. The trend is alarming, since it allows government to spy on communities without the oversight, approval, or legal processes that are typically required for police. 

EFF is opposing new legislation introduced in California by Assemblymember Marc Steinorth that would create a tax credit worth up to $500 for residents who purchase home security systems, including fences, alarms and cameras. In a letter, EFF has asked the lawmaker to strike the tax break for surveillance cameras, citing privacy concerns as well as the potential threat created by consumer cameras that can be exploited by botnets. As we write in the letter: 

Personal privacy is an inalienable right under Section 1 of the California Constitution. Yet, in 2017, privacy is under threat on multiple fronts, including through the increase in use of privately operated surveillance cameras. Law enforcement agencies throughout the state have been encouraging private individuals and businesses to install cameras and share access to expand government’s surveillance reach through private cooperation. The ability for facial recognition technology to be applied routinely and automatically to CCTV footage will present even more dangers for personal privacy. EFF has significant concerns that, by using tax credits to encourage residents of California to buy and install security cameras, A.B. 54 will not only increase the probability that Californians will use cameras to spy on one another but will also build the infrastructure to allow for the growth of a “Big Brother” state.

In addition, this tax credit for surveillance cameras may create a new weakness for security. In October, a massive cyberattack that exploited personal cameras disabled Internet traffic across the country. EFF and independent security researchers have also discovered surveillance cameras that were openly accessible over the Internet, allowing anyone with a browser to watch live footage and manipulate the cameras. The potential for breaches will grow commensurately with the increase in the number of cameras in communities promoted by the tax incentive.

EFF urges Steinorth to amend A.B. 54 and, failing that, we ask his colleagues in the California legislature to vote against the bill. 


NASA’s Cassini Spacecraft Prepares for Ring-Grazing Phase

In the final year of its epic voyage, on Nov. 30, NASA’s Cassini orbiter will begin a daring set of ‘ring-grazing orbits,’ skimming past the outside edge of Saturn’s main rings.

Artist’s concept of NASA’s Cassini spacecraft at Saturn. Image credit: NASA.

Launched in 1997, Cassini has been touring the Saturn system since arriving there in 2004 for an up-close study of the gas giant, its rings and moons.

During its journey, the probe has made numerous discoveries, including a global ocean within Enceladus and liquid methane seas on Titan.

On Nov. 30, following a gravitational nudge from Titan, Cassini will enter the first phase of the mission’s dramatic endgame.

Cassini will fly closer to Saturn’s rings than it has since its 2004 arrival. It will begin the closest study of the rings and offer unprecedented views of moons that orbit near them.

These orbits, a series of 20, are called ring-grazing orbits, or F-ring orbits.

During these weekly orbits, Cassini will approach to within 4,850 miles (7,800 km) of the center of the narrow F ring, with its peculiar kinked and braided structure.

Cassini’s instruments will attempt to directly sample ring particles and molecules of faint gases.

“Even though we’re flying closer to the F ring than we ever have, we’ll still be more than 4,850 miles distant. There’s very little concern over dust hazard at that range,” said Cassini project manager Dr. Earl Maize, from NASA’s Jet Propulsion Laboratory (JPL).

The F ring marks the outer boundary of the main ring system. This ring is complex and constantly changing. Cassini images have shown structures like bright streamers, wispy filaments and dark channels that appear and develop over mere hours.

The ring is also quite narrow — only about 500 miles (800 km) wide. At its core is a denser region about 30 miles (50 km) wide.

Cassini’s ring-grazing orbits also offer unprecedented opportunities to observe the menagerie of small moons that orbit in or near the edges of Saturn’s rings, including best-ever looks at the moons Pandora, Atlas, Pan and Daphnis.

“During the F-ring orbits we expect to see the rings, along with the small moons and other structures embedded in them, as never before,” said Cassini project scientist Dr. Linda Spilker, also from JPL.

“The last time we got this close to the rings was during arrival at Saturn in 2004, and we saw only their backlit side.”

“Now we have dozens of opportunities to examine their structure at extremely high resolution on both sides.”

During ring-grazing orbits, the spacecraft will pass as close as about 56,000 miles (90,000 km) above Saturn’s cloud tops. But even with all their exciting science, these orbits are merely a prelude to the planet-grazing passes that lie ahead.

In April 2017, Cassini will begin its Grand Finale phase. After nearly 20 years in space, the mission is drawing near its end because the spacecraft is running low on fuel.

The Cassini team carefully designed the finale to conduct an extraordinary science investigation before sending the spacecraft into Saturn to protect its potentially habitable moons.

During this phase, the probe will pass as close as 1,012 miles (1,628 km) above the clouds as it dives repeatedly through the narrow gap between Saturn and its rings, before making its mission-ending plunge into the planet’s atmosphere on Sept. 15, 2017.
