- Russia Leads Nuclear Space Race After U.S. Drops Out
- Space Junk Could Be Tracked Like Vultures
- How to Keep Planes From Colliding With Lasers
- Gamma-Ray Mystery Traced to Star-Birth Frenzy
- Oldest Preserved Spider Web Dates Back to Dinosaurs
- AI Spacesuits Turn Astronauts Into Cyborg Biologists
Posted: 03 Nov 2009 09:27 AM PST
The Russian space agency may build a nuclear-powered spacecraft with the blessing of the country's leader, Russian and international media reported Thursday.
The craft would cost $600 million and Russian scientists claim it could be ready as early as 2012.
"The idea (of nuclear-powered spaceflight) has bright prospects, and if Russia could stage a breakthrough it could become our main contribution to any future international program of deep space exploration," Andrei Ionin, an independent Moscow-based space expert, told the Christian Science Monitor.
Building a nuclear-powered spacecraft is feasible, said Patrick McDaniel, a nuclear engineer and co-director of the University of New Mexico's Institute for Space and Nuclear Power Studies, but probably not in the short time frame that the Russians have proposed.
"To have a test article that they could test on the ground, that's very reasonable," McDaniel said. "To have a completed system, that's highly unlikely."
If the spaceship actually gets built, it would complete a half-century-long quest to bring nuclear power to space propulsion, one that began with a 1947 report by North American Aviation to the Air Force.
It's not hard to see why engineers would want to use nuclear power. Fission reactors pack a lot of power into a small package, a key consideration in designing space systems. One engineer claims nuclear rockets are inherently twice as efficient as their chemical brethren. That efficiency could have increased the exploration range of the space program, nuclear propulsion advocates argue, allowing us to get to more interesting places.
"We could have done a lot more things in space. We could have gone more places," McDaniel said of nuclear rocket research. "It's highly likely we would have gone to Mars."
Current plans for reaching Mars do not include a nuclear rocket, but several decades of plans, from the 1950s through the 1980s, simply assumed that nuclear power would be part of the effort to reach the Red Planet.
Towards that end, the Air Force, which preceded NASA in managing space programs, created Project Rover in conjunction with Los Alamos National Laboratory.
The goal of Rover was to develop a reactor that could be used for propulsion. Various incarnations of the reactor the scientists developed, called Kiwi, were tested at Jackass Flats, Nevada (see video). The idea behind the reactor was to use the heat generated by fission to heat hydrogen, which would expand, generating the force to push the rocket.
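The appeal of that scheme comes down to exhaust physics. For any thermal rocket, exhaust velocity, and with it specific impulse, scales roughly as the square root of chamber temperature divided by the molecular mass of the exhaust. The representative numbers below are standard textbook values, not figures from the article:

```latex
v_e \propto \sqrt{\frac{T_c}{\bar{M}}}, \qquad I_{sp} = \frac{v_e}{g_0}
% Chemical LH2/LOX: T_c \approx 3500\,\mathrm{K},\ \bar{M} \approx 16\,\mathrm{g/mol}
%   \Rightarrow I_{sp} \approx 450\,\mathrm{s}
% Nuclear thermal (pure H2 propellant): T_c \approx 2500\,\mathrm{K},\ \bar{M} = 2\,\mathrm{g/mol}
%   \Rightarrow I_{sp} \approx 850\,\mathrm{s}
```

A chemical hydrogen-oxygen engine burns hotter, but its exhaust is mostly steam; a reactor runs somewhat cooler yet can exhaust pure hydrogen, the lightest possible propellant, which is how NERVA-class engines achieved roughly double the specific impulse of chemical rockets.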
None of the reactors ran for more than eight minutes, but they were considered to have met their goals. Technically, they worked.
Though the exhaust from such rockets is radioactive, Project Rover enjoyed broad government support, even after it ran into cost overruns in the early 1960s.
"Everyone likes Rover — the White House, the Atomic Energy Commission, the National Aeronautics and Space Administration," TIME magazine wrote in 1962. "Senator [Clinton] Anderson insists that nuclear-powered rocketry is as important to U.S. security as the hydrogen bomb."
Beginning in July 1958, with the creation of NASA, work on nuclear rockets became the province of the Space Nuclear Propulsion Office, which began consolidating the various programs into the Nuclear Engine for Rocket Vehicle Application, or NERVA, program.
Further reactor and nuclear rocket development occurred under NERVA. Several other reactors and rocket designs were tested, most successfully the Phoebus and XE-Prime. There were test failures, but the programs, overall, are considered technical successes.
Beyond the nuclear rocket designs, the United States also launched a small electricity-generating nuclear reactor, SNAP-10A, into space. It orbited for 43 days before a non-reactor-related technical failure shut it down. (See the video for an animated explanation of the project.)
The concept was later abandoned largely because no one really knew what to do with a nuclear reactor in space.
"SNAP-10A was a technology demo; the question then was, well, what do we want to do with it?" McDaniel said. "And no one had a really good answer."
Other, more fanciful nuclear propulsion ideas were proposed, too. One, Project Orion, would have been powered by nuclear bombs. The physicist Freeman Dyson, who worked on the project, told The New York Times Magazine he saw it "as the solution to a problem. With one trip we'd have got rid of 2,000 bombs."
"Orion was a delightful scientific exercise, but not very feasible," McDaniel said.
These various technologies cost money to develop, of course, and the scale of the cash that flowed their way shows how seriously Americans took nuclear propulsion. Between 1955 and 1972, the United States spent more than $1.4 billion in then-year dollars on developing nuclear rockets and related technologies. At the end of that period, when the Nixon administration cut NASA's budget generally and NERVA's specifically, the United States was well on its way to developing nuclear power for space propulsion and other spacefaring purposes.
"It is indeed remarkable that the adoption of the Rover–NERVA database, upgraded and modernized by current rocket-engine technology, would fully satisfy NASA's space transfer propulsion and long-distance exploration requirements and permit realization of a safe and low programmatic risk development programme," wrote Stanley Gunn, who worked on the nuclear propulsion program for Rocketdyne, in a 2001 article for Space Policy.
There were several attempts to resurrect nuclear propulsion of various types, most recently the mothballed Project Prometheus. None, though, have garnered much support. One major reason is that NASA picks its propulsion systems based on its targets — and true exploration of the solar system and beyond hasn't really been a serious goal, the Constellation plans for a return to the moon aside.
"The destinations dictate the power system," said Rao Surampudi, a Jet Propulsion Laboratory engineer who works on the development of power systems.
By and large, it's cheaper and easier to go with solar power or very low-power radioisotope generators like the one that powers the Cassini mission.
McDaniel agreed that the targets drive things, citing the general decline of pure technology development research at NASA.
"Until we commit to going back to Mars, we're not going to have a nuclear rocket," McDaniel said.
Or perhaps a new nuclear-powered Russian spacecraft could get anxious minds at the Pentagon and NASA worrying about the need to keep pace with the Ivanovs.
After all, the Soviet nuclear rocket program may have been more advanced than the American efforts at the time of the USSR's collapse.
Images: 1) A NERVA test engine going to the testing spot./NASA. 2) A proposed nuclear rocket to Mars rendering./NASA.
Posted: 03 Nov 2009 08:39 AM PST
Dangerous debris near rocket launches could be tracked in real time by combining tricks from particle colliders, moon landings and vulture tracking, a new study finds.
In a paper to be published in Acta Astronautica, physicist Philip Metzger of NASA's Kennedy Space Center and colleagues describe a technique to plot the paths and determine the densities of worrisome detritus kicked up during launch. This method could help flight engineers know instantly which pieces of debris threaten the spacecraft.
"We combined together two different types of software that can do on-site analysis," Metzger says. "In the future we can take video of the launch environment, and the software can automatically … conclude what were the sources and the makeup of the debris." The paper was published online at arXiv.org on October 22.
"For manned missions, this is very important. I'm surprised it's not been done yet," comments Nilton Renno of the University of Michigan, who studies how rocket plumes from Mars landers affect the Martian surface. "It will improve our confidence in the assessment of potential damage, not just for the space shuttle but for any other future spacecraft."
In 2003, the Space Shuttle Columbia was damaged during launch by a piece of foam insulation that broke off its external tank. The damage caused the shuttle to break apart upon reentry, killing all seven astronauts on board. The event showed how crucial it is to catch debris damage early.
The need for better tracking systems during launch was highlighted during the Space Shuttle Discovery launch in May 2008, when several thousand bricks blew out the end of the flame trench under the shuttle. Simultaneously, a mysterious piece of debris flew high into the air near the shuttle, apparently from the flame trench. Had the mystery object been a brick, it could have damaged the shuttle and put the crew at risk.
To identify the object, Metzger and colleagues took advantage of NASA's bird-watching system. In 2005, during the first launch after the Columbia disaster, a shuttle was threatened by another flying menace: a vulture that smacked into the external tank during takeoff.
Since then, engineers have tracked vultures as a routine part of launch by taking pictures of the launch area from two different angles. Combining the pictures gives a three-dimensional view of the region, so moving objects like birds and bricks can be found and followed. Metzger and colleagues have used similar measures to model how a rocket plume could scatter dust and boulders in the low-gravity environment of the moon, threatening nearby lunar outposts.
For the May 2008 Discovery launch, Metzger's team used images from the vulture-tracking system to plot the mystery object's trajectory, then did some simple ballistics analysis to figure out the object's density. Neither technique is complicated or new, but the combination of images and ballistics had never been used for launch analyses before.
"We're not the first people to study this by a long shot," Metzger says. "All we've done is developed a new technique that seems to be very useful and very fast."
The object was too light to be a brick, but it was the same density as a piece of foam from a solid rocket booster. "At no point was the orbiter in any danger" from the foam, says Bob Carilli, a coauthor of the study and an engineer with NASA contractor United Space Alliance.
Now that they've shown that the method works, Metzger and colleagues hope to speed the system up with tricks from particle colliders. Particle colliders automatically flag collisions that look interesting and store data for further inspection. NASA could use similar techniques to target only the potentially dangerous launch debris.
"The objective is not to study the individual particles that result from the collision, but rather to burrow down into the originating event," Metzger says. "That's our ultimate objective too."
The system will be used on the next launch. "We're using it right away. We'll use it routinely from now on, whenever we analyze debris," Metzger says. "Now that the software's been written and debugged, it only takes a couple hours to do the whole process, which is a big improvement."
Posted: 03 Nov 2009 06:48 AM PST
Beaming high-powered lasers into the sky allows scientists to study changing weather patterns, pollution in the Earth's atmosphere and even gravity on the Moon. But if one of those helpful lasers happens to cross paths with an airplane, it can temporarily blind or distract the pilot and potentially cause a crash.
The current method to avoid plane-laser collisions is decidedly low-tech: Federal Aviation Administration regulations require anyone who's sending a laser up into the atmosphere to employ multiple human observers, called "spotters," to watch for planes flying within 25 degrees of the laser beam. Now, researchers have created a radio-tracking device that can perform the same task as a pair of eyes, without the potential for human error.
"The two-spotter system is a problem because spotters can forget to set their alarm clocks, they can get sick, they can get confused about the schedule, and then suddenly you don't have two spotters anymore and you can't operate your program," said physicist Tom Murphy of the University of California, San Diego, who is co-leading the radio detection project.
In addition, Murphy says spending all night watching the sky for airplanes can be a cold and windy experience. "I think we have concerns, and the FAA certainly has concerns, about the attention and the wherewithal of spotters," he said. "An automated system would be vigilant, wouldn't get tired, and as long as you have checks that it's working as you expect, would be highly reliable."
The new plane-spotting device takes advantage of the air traffic control transponders carried by nearly all aircraft. Using two radio antennas with different beam widths, researchers can track the constant radio emissions from these transponders and figure out when a plane is getting too close for comfort.
"The FAA suggests that an angle of about 15 degrees is about where you better start worrying," said engineer Bill Coles, who co-designed the radio system. But because the strength of a radio signal drops off quickly with distance, the device can't judge how close a plane is getting based on the power of a single radio signal. Instead, the researchers compare signals from two antennae, one with a beam width of 15 degrees and one with a much wider beam width.
"We point both antennas in the direction of the laser beam, and then we check the relative power," Coles said. "If the power in the narrow beam antenna is greater than the power in the broad-beam antenna, we know it must be within 15 degrees of the beam — that works no matter how far away the aircraft is because it's a ratio of the powers."
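The elegance of Coles' ratio trick is that it reduces to a single comparison. Received power falls off with distance identically for both antennas, so in decibels the range-dependent path loss cancels out of the difference. A minimal sketch, with a function name and margin parameter of our own invention rather than anything from the actual UCSD system:

```python
def plane_inside_cone(power_narrow_db, power_wide_db, margin_db=0.0):
    """True if a transponder signal appears within the narrow antenna's
    ~15-degree beam, i.e. too close to the laser.

    Both powers are in dB; a ratio of powers is a difference in dB, so
    the aircraft's distance drops out of the comparison entirely.
    """
    return (power_narrow_db - power_wide_db) > margin_db
```

A positive margin could be used to trigger a laser shutter slightly before the plane actually enters the 15-degree cone, trading a few false alarms for extra safety.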
So far, the researchers have tested a prototype of the device on two laser telescopes, one at Apache Point Observatory in New Mexico and the other at the Palomar Observatory in San Diego. After collecting nearly a year's worth of data at Apache Point, the group posted their results on the open-access repository arXiv.org.
"Any time the telescope is open, whether or not the laser is on, we're collecting loads and loads of information on what the system sees," Murphy said. So far, they've compared the radio system to observations made by human spotters — and every plane identified by human eyes has also been detected by the radio device.
Eventually, the researchers hope to get the device sanctioned by the FAA as an alternative or addition to human spotters. First, however, they want to cross-check their data with records kept by the FAA, to make sure they're not missing any airplanes, including those that might have been overlooked by human eyes.
Detecting every at-risk plane is important because, like any powerful laser pointed in the wrong direction, research lasers can cause temporary or permanent damage if flashed into the eyes. According to an FAQ page on the FAA site, "the temporary adverse visual effects may include distraction, startle, glare, flashblindness, and/or afterimage." And during critical tasks like take-off and landing, the FAA says even a brief flash of laser light can be enough to cause an accident.
Indeed, more than a thousand incidents involving lasers and airplanes have been documented since 2004. Most of the recent incidents have involved laser pointers directed either intentionally or unintentionally at aircraft during takeoff and landing — for example, one malcontent with a laser pointer targeted 12 planes trying to land at a Seattle airport in February.
Although the researchers said they don't know of any airplane accidents specifically caused by research lasers, the near-constant use of these powerful instruments makes safety a prime concern.
"Sending lasers through the atmosphere is probably happening somewhere in the world," Murphy said, "at any given time."
Image: A telescope at NASA Goddard sends a green laser up to the moon to track lunar spacecraft, Tom Zagwodzki/Goddard Space Flight Center.
Posted: 02 Nov 2009 11:55 AM PST
WASHINGTON, D.C. — Astronomers have for the first time traced gamma rays, the most energetic form of light, to galaxies undergoing a frenzy of star birth. The finding, which has revealed a new class of galactic gamma-ray sources, is not unexpected. But it provides new hints about the origin of many cosmic rays, the high-speed protons and other charged particles of extraordinarily high energies that bombard Earth.
According to the prevailing theory, cosmic rays are accelerated to energies of billions to trillions of electron volts by the expanding shock waves generated when massive stars explode as supernovas. (Cosmic rays with even higher energies are thought to be powered by supermassive black holes at the centers of galaxies.) Kinks in a galaxy's magnetic field keep the particles bouncing back and forth like ping-pong balls between the advancing shock wave and the region just ahead of it, revving them up to these high energies, the model suggests.
Massive stars live for only a few million years before exploding — an eyeblink in astronomical terms. Galaxies that produce lots of newborn stars therefore have lots of dying stars that explode as supernovas and ought to have an abundance of cosmic rays.
Testing the theory that supernova shock waves generate cosmic rays hasn't been easy, however. Galactic magnetic fields bend the paths of all charged particles, including cosmic rays, so astronomers can trace only the very highest-energy particles, which escape the fields, back to their home galaxies.
But when cosmic rays collide with other atomic nuclei in surrounding gas or dust, they produce gamma rays, the most energetic form of light. And unlike charged particles, light can't be bent by magnetic fields.
A new generation of gamma-ray telescopes, including the Fermi Gamma-ray Space Telescope launched in 2008, and VERITAS, an array of four 12-meter telescopes atop Mount Hopkins in Arizona, has now succeeded in detecting gamma rays from three galaxies undergoing intense waves of starbirth. The finding helps to confirm the connection between supernovas and cosmic rays.
Researchers reported the findings November 2 at the 2009 Fermi Symposium (named for the Fermi Gamma-ray telescope, the main focus of the conference). VERITAS observed gamma rays ranging from 700 billion eV to several trillion eV from the galaxy M82, which is some 12 million light-years from Earth. M82 is classified as a starburst galaxy because within a small, central region it makes stars at a rate 10 times higher than that of the entire Milky Way.
Although M82 is one of the closest starburst galaxies, "it took us two years of all-out observations of M82 to acquire all the necessary data," said VERITAS researcher Wystan Benbow of the Smithsonian Astrophysical Observatory in Cambridge, Mass. Starburst galaxies produce a diffuse gamma-ray glow that is about one-millionth the brightness of galaxies that have active, supermassive black holes at their centers — the only type of galaxy from which gamma-ray telescopes had previously recorded emissions.
Finding gamma rays in a starburst galaxy "had been long predicted, but nobody had ever done it before this year," noted Benbow, whose team also reported the discovery online November 1 in Nature.
The Fermi Gamma-ray Space Telescope, which records lower-energy gamma rays than does VERITAS, also found gammas from M82 and from another starburst galaxy, NGC 253, reported Keith Bechtol of the SLAC National Accelerator Laboratory in Menlo Park, Calif. In addition, Fermi recorded a diffuse gamma-ray glow from a region of intense star formation in the Large Magellanic Cloud, a small satellite galaxy of the Milky Way, said Jürgen Knödlseder of the Center for the Study of Space Radiation in Toulouse, France.
The Large Magellanic Cloud is close enough to the Milky Way that Knödlseder and his colleagues could tell that the gamma rays emanated from a region that has highly ionized gas. Such sites are places where massive stars, which produce ionizing radiation, are expected to be common. The finding "implies that massive star-forming regions are the main source of cosmic rays in the Large Magellanic Cloud," Knödlseder said.
Bechtol noted that the two starburst galaxies, M82 and NGC 253, have a higher rate of supernova production and emit more gamma rays than the Large Magellanic Cloud, another clue that supernovas and cosmic rays are intrinsically linked.
After hearing the news, "I didn't fall out of my chair but I got a big smile on my face," says theorist Brian Fields of the University of Illinois at Urbana-Champaign, who is not a member of either discovery team. Had Fermi and VERITAS not found gamma-ray emissions from starburst galaxies, he notes, "it would have been back to the drawing board" for understanding the origin of cosmic rays.
Although the discoveries "bolster our confidence that cosmic rays are accelerated by supernova remnants," they do not clinch the case, said gamma-ray theorist Charles Dermer of the Naval Research Laboratory in Washington, D.C., a member of the Fermi collaboration.
The clincher may come, he says, if an ongoing analysis of Fermi data finds that gamma-ray emissions from starburst galaxies peak at an energy of 70 million eV. That corresponds to the energy generated when a subatomic particle called a neutral pion decays into two gamma rays. Because galactic pions can only be generated by cosmic-ray collisions, finding this peak would provide compelling proof for the theory, Dermer said.
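The 70 million eV figure follows from simple kinematics: a neutral pion at rest decays into two photons that split its rest energy equally.

```latex
E_\gamma = \frac{m_{\pi^0} c^2}{2} \approx \frac{135\ \mathrm{MeV}}{2} \approx 67.5\ \mathrm{MeV}
```

Pions produced in flight Doppler-shift their photons to higher and lower energies, but the resulting spectrum still peaks near this value, making the bump a distinctive fingerprint of cosmic-ray proton collisions rather than of radiating electrons.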
Images: 1) M82 / NASA, ESA, CXC, and JPL-Caltech. 2) Gamma-ray emissions coming from M82 / CfA/V.A. Acciari. 3) Steve Criswell, SAO.
Posted: 02 Nov 2009 11:22 AM PST
The world's oldest known spider web has been discovered on a beach in Sussex, England, trapped inside an ancient chunk of amber.
Scientists found the rare amber fossil in December, and have now confirmed that it contains remnants of spider silk spun roughly 140 million years ago by an ancestor of modern orb-weaving spiders. After slicing the amber into thin sections and examining each piece under a high-powered microscope, the researchers discovered that the ancient silk threads share several features common to modern spider webs, including droplets of sticky glue used to hold the web together and capture prey.
According to paleobiologist Martin Brasier of Oxford University, the gooey droplets suggest that spiders were starting to spin webs that were better adapted for catching flying insects. "Interestingly, a huge radiation took place in flying insects and bark beetles about 140-130 million years ago," Brasier wrote in an email to Wired.com. "So we may be seeing a co-evolution of spiders and insects here."
"Silk is a relatively delicate material and it is rarely preserved in the fossil record, except when entombed in amber," Brasier and colleagues wrote about the discovery in the upcoming December issue of the Journal of the Geological Society. The researchers think pieces of organic material, including the spider silk, became embalmed during a severe wildfire, when amber resins seeped out from the charred bark of coniferous trees and were eventually swept away by flooding.
In addition to ancient spider silk, the amber chunk contains well-preserved soil microbes, including the oldest known examples of actinobacteria, a common type of bacteria that plays a major role in soil formation.
Image 1: A spider and web trapped in amber, Mila Zinkova/Wikipedia Commons. Image 2: Light micrograph of new amber fossil showing a web of tiny silk threads, plus droplets of sticky glue. Courtesy of Martin Brasier.
Posted: 02 Nov 2009 10:55 AM PST
Equipped with wearable AI systems and digital eyes that see what human eyes can't, space explorers of the future could be not just astronauts, but "cyborg astrobiologists."
That's the vision of a research team led by Patrick McGuire, a University of Chicago geoscientist who's developed algorithms that can recognize signs of life in a barren landscape.
"When they look at scenery, children gravitate towards the thing that's different from the other things," said McGuire. "That's how I looked at the cyborg astrobiologist."
At the heart of McGuire's system is a Hopfield neural network, a type of artificial intelligence that compares incoming data against patterns it's seen before, eventually picking out those details that qualify as new or unusual.
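The idea can be illustrated with a toy Hopfield network on +/-1 vectors (McGuire's actual system works on image data; the pattern vectors, update scheme, and threshold here are purely illustrative). Familiar patterns are stored in a Hebbian weight matrix; an input that the network cannot settle near any stored pattern is flagged as novel.

```python
def train(patterns):
    """Build a Hebbian weight matrix from a list of +/-1 pattern vectors."""
    n = len(patterns[0])
    w = [[0.0] * n for _ in range(n)]
    for p in patterns:
        for i in range(n):
            for j in range(n):
                if i != j:  # Hopfield networks have no self-connections
                    w[i][j] += p[i] * p[j] / len(patterns)
    return w

def recall(w, state, steps=10):
    """Run synchronous sign updates until the state stops changing."""
    n = len(state)
    for _ in range(steps):
        nxt = [1 if sum(w[i][j] * state[j] for j in range(n)) >= 0 else -1
               for i in range(n)]
        if nxt == state:
            break
        state = nxt
    return state

def is_novel(w, patterns, x, threshold=0.9):
    """Flag x as novel if recall settles on nothing close to a stored pattern."""
    y = recall(w, list(x))
    best = max(abs(sum(a * b for a, b in zip(y, p))) / len(p) for p in patterns)
    return best < threshold
```

Fed a slightly corrupted version of a stored pattern, the network snaps back to the original and reports "familiar"; fed something orthogonal to everything it has seen, it never converges onto a memory and reports "novel" — the same child-like gravitation toward the thing that's different.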
As described in a paper posted Thursday on arXiv.org, the system successfully differentiates lichen from surrounding rock — a proof-of-principle test that lays the foundation for adding other types of data.
For the last several years, McGuire worked on CRISM, a Mars-orbiting imager that detects infrared and other invisible-to-human-eye wavelengths of light, allowing it to identify different types of rock and soil. McGuire envisions the digital eyes of cyborg astrobiologists as scaled-down versions of CRISM, their data perpetually crunched by the Hopfield networks on their hips.
"You would have a very complex artificial intelligence system, with access to different remote sensing databases, to field work that's been done before in the area, and it would have the ability to reason about these in human-like ways," said McGuire.
The lichen tests were conducted in Spain and at Utah's Mars Desert Research Station, where two of the researchers donned spacesuits and lived for two weeks in the field as astronauts. They carried hand-held digital microscopes and cellphone cameras, which sent the data via Bluetooth to netbooks running McGuire's Hopfield network.
The lichen identification was based on color data. McGuire next plans to train the network to process different textures. Ultimately he wants to conduct analysis at different scales, from the microscopic up to landscape-wide.
McGuire cautioned that his team's system is "nowhere near" its ready-for-Mars ideal, and it will likely be decades before people explore the surface of Mars in person. In the meantime, cyborg astrobiologists might search the South Pole for Martian meteorites, and feature-identifying algorithms could be uploaded to Mars-roving robots.
"Then you'd have a robotic astrobiologist, and the humans would be back here on Earth, in Mission Control," he said. "The algorithms help us out, but humans are ultimately responsible."
Images: Patrick McGuire
Citation: "The Cyborg Astrobiologist: Testing a Novelty-Detection Algorithm on Two Mobile Exploration Systems at Rivas Vaciamadrid in Spain and at the Mars Desert Research Station in Utah." P.C. McGuire, C. Gross, L. Wendt, A. Bonnici, V. Souza-Egipsy, J. Ormö, E. Díaz-Martínez, B.H. Foing, R. Bose, S. Walter, M. Oesker, J. Ontrup, R. Haschke, H. Ritter. arXiv, October 29, 2009.