domenica 10 maggio 2009

The Day The Universe Froze: New Model For Dark Energy

SOURCE

ScienceDaily (May 11, 2009) — Imagine a time when the entire universe froze. According to a new model for dark energy, that is essentially what happened about 11.5 billion years ago, when the universe was a quarter of the size it is today.
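A quick check of those numbers: a universe at a quarter of its present size corresponds to a redshift z = 1/a - 1 = 3, and in a standard flat Lambda-CDM cosmology the lookback time to z = 3 is indeed roughly 11.5 billion years, i.e. about 2 billion years after the Big Bang. The short Python sketch below does the arithmetic; the density parameters are illustrative assumptions, not values taken from the paper.

    import math

    # Illustrative flat Lambda-CDM parameters (assumptions for this sketch, not from the paper)
    H0 = 70.0            # Hubble constant, km/s/Mpc
    Om, OL = 0.3, 0.7    # matter and dark-energy density parameters

    KM_PER_MPC = 3.0857e19
    SEC_PER_GYR = 3.1557e16
    H0_per_gyr = H0 / KM_PER_MPC * SEC_PER_GYR   # Hubble constant in 1/Gyr

    def E(z):
        # Dimensionless expansion rate H(z)/H0 for a flat universe (radiation neglected)
        return math.sqrt(Om * (1.0 + z) ** 3 + OL)

    def lookback_gyr(z, steps=100000):
        # Lookback time in Gyr: (1/H0) * integral of dz' / [(1+z') E(z')] from 0 to z (midpoint rule)
        dz = z / steps
        total = sum(dz / ((1.0 + (i + 0.5) * dz) * E((i + 0.5) * dz)) for i in range(steps))
        return total / H0_per_gyr

    a = 0.25                         # the universe at a quarter of its present size
    z = 1.0 / a - 1.0                # corresponding redshift, z = 3
    t_lookback = lookback_gyr(z)
    t_age = lookback_gyr(10000.0)    # integrating to very high z approximates the present age
    print(round(t_lookback, 1), "Gyr ago")                        # roughly 11.4 Gyr
    print(round(t_age - t_lookback, 1), "Gyr after the Big Bang") # roughly 2.1 Gyr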
The model, published online May 6 in the journal Physical Review D, was developed by Research Associate Sourish Dutta and Professor of Physics Robert Scherrer at Vanderbilt University, working with Professor of Physics Stephen Hsu and graduate student David Reeb at the University of Oregon.
A cosmological phase transition — similar to freezing — is one of the distinctive aspects of this latest effort to account for dark energy, the mysterious repulsive force that cosmologists now think makes up more than 70 percent of all the energy and matter in the universe and is pushing the universe apart at an ever-faster rate.
Another feature that distinguishes the new formulation is that it makes a testable prediction regarding the expansion rate of the universe. In addition, the micro-explosions created by the largest particle colliders should excite the dark energy field and these excitations could appear as exotic, never-seen-before sub-atomic particles.
"One of the things that is very unsatisfying about many of the existing explanations for dark energy is that they are difficult to test,” says Scherrer, "We designed a model that can interact with normal matter and so has observable consequences.”
The model associates dark energy with something called vacuum energy. Like a number of existing theories, it proposes that space itself is the source of the repulsive energy that is pushing the universe apart. For many years, scientists thought that the energy of empty space averaged zero. But the discovery of quantum mechanics changed this view. According to quantum theory, empty space is filled with pairs of "virtual" particles that spontaneously pop into and out of existence too quickly to be detected.
This sub-atomic activity is a logical source for dark energy because both are spread uniformly throughout space. This distribution is consistent with evidence that the average density of dark energy has remained constant as the universe has expanded. This characteristic is in direct contrast to ordinary matter and energy, which become increasingly dilute as the universe expands.
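In equations, this is the standard scaling of the different energy components with the scale factor a (a textbook result, not something specific to the new model):

    \rho_{\rm matter} \propto a^{-3}, \qquad \rho_{\rm radiation} \propto a^{-4}, \qquad \rho_{\rm vacuum} = {\rm const},

so as a grows, matter and radiation thin out while a vacuum-energy component keeps the same density and eventually dominates the total.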
The theory is one of those that attribute dark energy to an entirely new field dubbed quintessence. Quintessence is comparable to other basic fields like gravity and electromagnetism, but has some unique properties. For one thing, it is the same strength throughout the universe. Another important feature is that it acts like an antigravity agent, causing objects to move away from each other instead of pulling them together like gravity.
In its simplest form, the strength of the quintessence field remains constant through time. In this case it plays the role of the cosmological constant, a term that Albert Einstein added to the theory of general relativity to keep the universe from contracting under the force of gravity. When evidence that the universe is expanding came in, Einstein dropped the term, since an expanding universe is a solution to the equations of general relativity. Then, in the late 1990s, studies of supernovae (spectacular stellar explosions so powerful that they can briefly outshine entire galaxies consisting of billions of stars) indicated that the universe is not just expanding but also that the rate of expansion is speeding up instead of slowing down as scientists had expected.
That threw cosmologists for a loop since they thought gravity was the only long-range force acting between astronomical objects. So they had no idea what could possibly be pushing everything apart. The simplest way to account for this bizarre phenomenon was to bring back Einstein's cosmological constant with its antigravity properties. Unfortunately, this explanation suffers from some severe drawbacks so physicists have been actively searching for other antigravity agents.
These antigravity agents (dubbed "dark energy models" in the technical literature) usually invoke quintessence or even more exotic fields. None of these fields has been detected in nature, however, so their proponents generally assume that they do not interact significantly with ordinary matter and radiation.
One of the consequences of allowing quintessence to interact with ordinary matter is the likelihood that the field went through a phase transition — froze out — when the universe cooled below a critical temperature, which it reached about 2.2 billion years after the Big Bang. As a result, the energy density of the quintessence field would have remained at a relatively high level until the phase transition, when it abruptly dropped to a significantly lower level, where it has remained ever since.
This transition would have released a fraction of the dark energy held in the field in the form of dark radiation. According to the model, this dark radiation is very different from light, radio waves, microwaves and other types of ordinary radiation: it is completely undetectable by any instrument known to man. However, nature provides a detection method. According to Einstein's theory of general relativity, gravity is produced by the distribution of energy and momentum. So the changes in net energy and momentum caused by the sudden introduction of dark radiation should have affected the gravitational field of the universe in a way that slowed its expansion in a characteristic fashion.
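The handle observers have on this is the Friedmann equation of standard cosmology, written here schematically rather than in the paper's notation:

    H^2(a) = \frac{8\pi G}{3}\left[\rho_{\rm matter}(a) + \rho_{\rm radiation}(a) + \rho_{\rm dark\,radiation}(a) + \rho_{\rm dark\,energy}(a)\right].

Dark radiation released at the transition would dilute away as a^{-4} while the post-transition dark energy stays nearly constant, so the expansion rate H(a) after the freeze-out differs in a specific, calculable way from that of a model with no transition.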
In the next 10 years or so, the large astronomical surveys that are just starting up to plot the expansion of the universe by measuring the brightness of the most distant supernovas should be able to detect the slowdown in the expansion rate that the model predicts. At the same time, new particle accelerators, like the Large Hadron Collider nearing operation in Switzerland, can produce energies theoretically large enough to excite the quintessence field and these excitations could appear as new exotic particles, the researchers say.
The research was funded by grants from the U.S. Department of Energy.
Journal reference:
Sourish Dutta, Emmanuel N. Saridakis, and Robert J. Scherrer. Dark energy from a quintessence (phantom) field rolling near a potential minimum (maximum). Physical Review D, 2009; 79 (10): 103005 DOI: 10.1103/PhysRevD.79.103005
Adapted from materials provided by Vanderbilt University. Original article written by David F. Salisbury.

Last Dance with the Shuttle: What's in Store for the Final Hubble Servicing Mission

SOURCE

A Q&A with Hubble Space Telescope senior project scientist David Leckrone.
Last month marked the 19th anniversary of the launch of the Hubble Space Telescope, an orbiting observatory that has become a household name and a linchpin of astronomical science. The telescope has proved remarkably resilient, enduring numerous glitches over the years—from a flawed primary mirror at deployment to a serious electronic failure this past September. Each time, Hubble has held on until astronauts arrived to perform repairs, an operation that a shuttle crew is about to perform for the final time.

On Monday space shuttle Atlantis is slated to lift off on the fifth and final servicing mission to Hubble (confusingly dubbed Servicing Mission 4—the nominal third mission was split into two parts, Missions 3A and 3B). Four mission specialists alternating in two-astronaut teams will attempt a total of five spacewalks from Atlantis to replace broken components, add new science instruments, and swap out the telescope's six 125-pound (57-kilogram) batteries, original parts that have powered Hubble's night-side operations for nearly two decades.

To find out what a refurbished Hubble will be capable of and how long the telescope will operate without further service, we spoke to astrophysicist David Leckrone, senior project scientist for the Hubble Space Telescope at the NASA Goddard Space Flight Center in Greenbelt, Md.
From the Hubble team's perspective, what are the goals for this shuttle mission?

This is our final opportunity to service and upgrade Hubble. So we're replacing some items that are getting long in the tooth to give Hubble longevity, and then we'll try to take advantage of that five- to 10-year extra lifetime with the most powerful instrumental tools we've ever had on board.

We have to do maintenance on the spacecraft itself, like replacing the batteries. There are six batteries that were launched in 1990 and have never been replaced—I bet you couldn't do that with your flashlight. And we have gyroscopes that help keep Hubble pointing stably so it doesn't jitter and smear out our very high-resolution imagery. These things have known average lifetimes and wear-out mechanisms, so it's time to replace all six gyroscopes. We have to replace another sensor called a fine guidance sensor that is used both to help control the pointing of the telescope in the sky and also for the science of astrometry, which is very precisely measuring the positions of stars.

It's been seven years since we've serviced Hubble, and the normal servicing interval is three and a half years or so. It's as if you're supposed to service your car at 5,000 miles, but it's been 10,000 miles and things are starting to break down—particularly within our suite of scientific instruments.

In 2002, after the last time we serviced Hubble, we had 11 different channels operating among the six scientific instruments. A channel is like an individual camera within a box; for example, we put a new instrument on board in 2002, the Advanced Camera for Surveys, that has three separate cameras in it, each with unique capabilities, and each of these cameras we call a channel. So we had 11 channels active after the last servicing mission; we're now down to three. And among those three channels, only one was really heavily used prior to recent times. So there has been significant deterioration in the tools that we use for observing the sky.

After this mission is over, if everything goes perfectly—and this is an extraordinarily complex and ambitious mission, so nobody should be surprised if we don't get absolutely everything done—we should be up to 14 channels with the very highest technology that we've ever flown on Hubble. It will be more powerful as a scientific tool than it's ever been before.
Originally this mission was scheduled for October 2008, but with the problems in September with Hubble's data formatter, it was pushed back. How has that glitch changed the mission?

That was a scientific instrument command and data handling system (SI C&DH) and its subunit, known as a science data formatter, which is absolutely essential for doing observations and getting the data back home. Luckily we had two redundant electronic sides in what is essentially a computer system. It was one side that failed, so we were able to switch over to the other side. We had never done that before, but it worked fine.

The only problem is that we no longer have redundancy. And if we risk the lives of seven astronauts and go to all this trouble to get Hubble fully up to snuff for five to 10 more years, we don't want to have a single-point failure possibility, where if side B failed, suddenly all science would be over on Hubble. We didn't want to do that, and Mike Griffin, who was the NASA administrator at the time, didn't want to do that. So he called a halt to preparations for launching in October and we got our spare SI C&DH system ready to fly.

So you will replace the A side that failed?

We'll replace both the A side and the B side. We're going to replace the entire unit.

In terms of technical upgrades or longevity boosts, what do you hope to get out of this mission?

We're putting on two brand-new scientific instruments, and then the astronauts are going to attempt to repair two, including the Advanced Camera for Surveys and the spectrograph, which are quite modern instruments but had electronic failures.

One of the new instruments is called Wide Field Camera 3, and it's going to replace Wide Field Planetary Camera 2 (WFPC2)—the jargon is a little strange. We're taking out WFPC2, which has been in the observatory since 1993, and replacing it with a really golly-gee-whiz new camera that has two channels in it. One channel is optimized to observe light in the ultraviolet wavelengths, and the other channel is optimized for the near-infrared. We have a near-infrared instrument on board Hubble already, but its technology is very primitive, whereas the new infrared channel is superb. This thing is going to just clean up.

The most important program it's going to be doing in the year following the mission is another ultra-deep field. There was a Hubble Deep Field in 1995 and an Ultra-Deep Field in 2004 or so, and those were at visible wavelengths. Now we're going to do another ultra-deep field in near-infrared wavelengths. Because the universe is expanding, the light emitted by very, very distant, far-back-in-time objects is shifted to red wavelengths. It may have been emitted in the visible or ultraviolet, but by the time the light gets to us, it's been shifted by the expansion of the universe to red and near-infrared wavelengths. So if you want to look really far back in time, as far back as you can, you really need to look in near-infrared or infrared wavelengths. This near-infrared channel will probe further back in time than any image that humans have ever taken—with the exception of the microwave background explorers, which went all the way back to the big bang.

So this is an ultra-ultra-ultra-deep field, essentially. Is there a name for it yet?

That's as good as any.

The same team that's going to be doing this "ultra-ultra-ultra-deep field" worked hard on the original Ultra-Deep Field to find the faintest protogalaxies or clumps of star formation that they could.
And they now have identified seven or eight objects that emitted the light we see when the universe was about 700 million to 800 million years old. We think we will push back another 200 million years or so with this new camera.
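The wavelength arithmetic behind the move to the near-infrared is simple: light emitted at wavelength lambda_emit by an object at redshift z arrives stretched to lambda_obs = (1 + z) * lambda_emit. A short illustration in Python; the redshift used here is a representative value for such early objects, not a measurement from the survey:

    z = 7.0                    # representative redshift for a galaxy seen roughly 750 Myr after the Big Bang
    lyman_alpha_nm = 121.6     # rest-frame Lyman-alpha wavelength (ultraviolet), in nanometres
    observed_nm = (1.0 + z) * lyman_alpha_nm
    print(round(observed_nm), "nm")   # about 973 nm, already in the near-infrared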
What about the other new instrument?

The Cosmic Origins Spectrograph (COS) is the other box, as it were, and it has its own two channels. It's a spectrograph, not a camera, so it takes the light from a distant light source and spreads it out into its component colors. If you measure how the intensity of light changes as a function of color, that gives you a lot of information about the medium that emitted that light—its temperature, density, rotation, chemical composition, and so on.

This is the most sensitive spectrograph ever to fly in space, to the best of our knowledge. And the combination of the spectrograph behind our telescope will allow the observers to look at very distant light sources, such as quasars, and use them as background flashlight beams. A beam of light from a distant quasar will pass through the material between the galaxies, and that material is dark—it's not what we call dark matter, but it's not glowing, it doesn't emit its own light. So you have to look at the imprint of absorption that it leaves on light passing through it. And the idea in doing this is to analyze what's called the cosmic web—the large-scale, weblike structure within which galaxies are formed.

I like to say we're going to trace the story of galaxy formation and evolution from the nursery to advanced adulthood. And COS will play a huge role in that, complementing the Wide Field Camera 3 in the process—the two instruments can work together to put together this family album of galaxy history.

As you mentioned before, this is Hubble's last servicing mission. It appears that the shuttle program is now truly entering its planned obsolescence. Is there some chance that, if Hubble manages to hang on and NASA readies a replacement spaceflight system in time, there could be another mission to Hubble?

That is principally a policy question: Do you spend more money servicing Hubble, which will be 25 years old at the end of the life extension that we're trying to achieve here?

So the planned life extension from this mission is to get it to at least 2014 or so?

That's right. And of course it's going to be a remarkably refurbished observatory with lots of new things on board, so it wouldn't surprise anyone if it kept going much longer than that, but on paper that's the objective.

So now you have an observatory that may be working fine and that has upgraded technology on it, but it's 24 or 25 years old and is a rather small telescope in space. Would you rather spend money continuing its lifetime for another five or 10 years, or would you rather invest that money in building a similar telescope that is much bigger? There are two camps: One camp says it's going to be a long time before we get the next big telescope after Hubble and the [infrared-only] James Webb Space Telescope, and we'll need…[Hubble's]…ultraviolet, visible and near-infrared capability. So in the interim, "Let's go ahead and plan another servicing mission using the Constellation vehicles that are being developed to replace the shuttle." The other school of thought says, "Let's use that money to go for the next big step." And right now the latter is the official policy of NASA.

I really do think we need to get on with what I like to call Daughter of Hubble, with an aperture of between nine and 16 meters, rather than Hubble's 2.4 meters. I can hardly imagine what we would see with that.

venerdì 8 maggio 2009

NASA Nanosatellite To Study Antifungal Drug Effectiveness In Space

SOURCE
ScienceDaily (May 8, 2009) — NASA is preparing to fly a small satellite about the size of a loaf of bread that could help scientists better understand how effectively drugs work in space. The nanosatellite, known as PharmaSat, is a secondary payload aboard a U.S. Air Force four-stage Minotaur 1 rocket planned for launch the evening of May 5.
PharmaSat weighs approximately 10 pounds. It contains a controlled environment micro-laboratory packed with sensors and optical systems that can detect the growth, density and health of yeast cells and transmit that data to scientists for analysis on Earth. PharmaSat also will monitor the levels of pressure, temperature and acceleration the yeast and the satellite experience while circling Earth at 17,000 miles per hour. Scientists will study how the yeast responds during and after an antifungal treatment is administered at three distinct dosage levels to learn more about drug action in space, the satellite's primary goal.
The Minotaur 1 rocket is on the launch pad at NASA's Wallops Flight Facility and the Mid-Atlantic Regional Spaceport located at Wallops Island, Va. The Wallops range is conducting final checkouts. The U.S. Air Force has announced that the rocket could launch at any time during a three-hour launch window beginning at 8 p.m. EDT May 5.
"Secondary payload nanosatellites expand the number of opportunities available to conduct research in microgravity by providing an alternative to the International Space Station or space shuttle conducted investigations," said Elwood Agasid, PharmaSat project manager at NASA's Ames Research Center in Moffett Field, Calif. "The PharmaSat spacecraft builds upon the GeneSat-1 legacy with enhanced monitoring and measurement capabilities, which will enable more extensive scientific investigation."
After PharmaSat separates from the Minotaur 1 rocket and successfully enters low Earth orbit at approximately 285 miles above Earth, it will activate and begin transmitting radio signals to two ground control stations. The primary ground station at SRI International in Menlo Park, Calif., will transmit mission data from the satellite to the spacecraft operators in the mission control center at NASA's Ames Research Center. A secondary station is located at Santa Clara University in Santa Clara, Calif.
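The 17,000 miles per hour quoted earlier and the 285-mile orbit are consistent with each other: the speed of a circular orbit follows from v = sqrt(GM/r). A quick check in Python:

    import math

    GM_EARTH = 3.986e14              # Earth's gravitational parameter, m^3/s^2
    R_EARTH = 6.371e6                # mean Earth radius, m
    METERS_PER_MILE = 1609.34

    r = R_EARTH + 285 * METERS_PER_MILE               # orbital radius for a 285-mile altitude
    v = math.sqrt(GM_EARTH / r)                       # circular-orbit speed, m/s
    print(round(v * 3600 / METERS_PER_MILE), "mph")   # about 17,000 mph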
When NASA spaceflight engineers make contact with PharmaSat, which could happen as soon as one hour after launch, the satellite will receive a command to initiate its experiment, which will last 96 hours. Once the experiment begins, PharmaSat will relay data in near real-time to mission managers, engineers and project scientists for further analysis. The nanosatellite could transmit data for as long as six months.
"PharmaSat is an important experiment that will yield new information about the susceptibility of microbes to antibiotics in the space environment," said David Niesel, PharmaSat's co-investigator from the University of Texas Medical Branch Department of Pathology and Microbiology and Immunology in Galveston. "It also will prove that biological experiments can be conducted on sophisticated autonomous nanosatellites."
As with NASA's previous small satellite missions, such as the GeneSat-1, which launched in 2006 and continues to transmit a beacon to Earth, Santa Clara University invites amateur radio operators around the world to tune in to the satellite's broadcast.
For more information and instructions about how to contact PharmaSat, visit: http://www.nasa.gov/mission_pages/smallsats/pharmasat.html
Adapted from materials provided by NASA.

Refined Hubble Constant Narrows Possible Explanations For Dark Energy


ScienceDaily (May 8, 2009) — Whatever dark energy is, explanations for it have less wiggle room following a Hubble Space Telescope observation that has refined the measurement of the universe's present expansion rate to a precision where the error is smaller than five percent.
The new value for the expansion rate, known as the Hubble constant, or H0 (after Edwin Hubble, who first measured the expansion of the universe nearly a century ago), is 74.2 kilometers per second per megaparsec (error margin of ± 3.6 km/sec/megaparsec). The results agree closely with an earlier measurement gleaned from Hubble of 72 ± 8 km/sec/megaparsec, but are now more than twice as precise.
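As a quick check of the quoted precision, and of the characteristic timescale (the "Hubble time", 1/H0) that the new value implies, a few lines of Python:

    H0, sigma = 74.2, 3.6        # km/s/Mpc, the SHOES value and uncertainty quoted above

    print(round(100 * sigma / H0, 1), "% fractional uncertainty")   # about 4.9%, i.e. below five percent

    KM_PER_MPC = 3.0857e19
    SEC_PER_GYR = 3.1557e16
    hubble_time_gyr = KM_PER_MPC / H0 / SEC_PER_GYR
    print(round(hubble_time_gyr, 1), "Gyr")   # 1/H0 is about 13.2 Gyr; the universe's actual age also depends on its matter and dark-energy content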
The Hubble measurement, conducted by the SHOES (Supernova H0 for the Equation of State) Team and led by Adam Riess, of the Space Telescope Science Institute and the Johns Hopkins University, uses a number of refinements to streamline and strengthen the construction of a cosmic "distance ladder," a billion light-years in length, that astronomers use to determine the universe's expansion rate.
Hubble observations of pulsating stars called Cepheid variables in a nearby cosmic mile marker, the galaxy NGC 4258, and in the host galaxies of recent supernovae, directly link these distance indicators. The use of Hubble to bridge these rungs in the ladder eliminated the systematic errors that are almost unavoidably introduced by comparing measurements from different telescopes.
Riess explains the new technique: "It's like measuring a building with a long tape measure instead of moving a yardstick end over end. You avoid compounding the little errors you make every time you move the yardstick. The higher the building, the greater the error."
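Riess's analogy maps directly onto standard error propagation: if a distance is assembled from N independent steps that each carry the same random error \sigma_{\rm step}, the errors add in quadrature,

    \sigma_{\rm total} = \sqrt{N}\,\sigma_{\rm step},

whereas a single continuous measurement contributes its error only once. Using one telescope and one instrument for every rung plays the role of the single long tape measure.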
Lucas Macri, professor of physics and astronomy at Texas A&M, and a significant contributor to the results, said, "Cepheids are the backbone of the distance ladder because their pulsation periods, which are easily observed, correlate directly with their luminosities. Another refinement of our ladder is the fact that we have observed the Cepheids in the near-infrared parts of the electromagnetic spectrum where these variable stars are better distance indicators than at optical wavelengths."
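The period-luminosity (Leavitt) relation Macri refers to is, schematically, a linear law in the logarithm of the pulsation period,

    M = \alpha \, \log_{10}(P / \mathrm{days}) + \beta,

where the coefficients \alpha and \beta stand in for a calibration and are not the values used in the paper. Measuring a Cepheid's period P therefore gives its absolute magnitude M, and comparing M with the observed apparent magnitude m yields the distance through the distance modulus m - M = 5\log_{10}(d / 10\,\mathrm{pc}).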
This new, more precise value of the Hubble constant was used to test and constrain the properties of dark energy, the form of energy that produces a repulsive force in space, which is causing the expansion rate of the universe to accelerate.
By bracketing the expansion history of the universe between today and when the universe was only approximately 380,000 years old, the astronomers were able to place limits on the nature of the dark energy that is causing the expansion to speed up. (The measurement for the far, early universe is derived from fluctuations in the cosmic microwave background, as resolved by NASA's Wilkinson Microwave Anisotropy Probe, WMAP, in 2003.)
Their result is consistent with the simplest interpretation of dark energy: that it is mathematically equivalent to Albert Einstein's hypothesized cosmological constant, introduced a century ago to push on the fabric of space and prevent the universe from collapsing under the pull of gravity. (Einstein, however, removed the constant once the expansion of the universe was discovered by Edwin Hubble.)
"If you put in a box all the ways that dark energy might differ from the cosmological constant, that box would now be three times smaller," says Riess.
"That's progress, but we still have a long way to go to pin down the nature of dark energy."
Though the cosmological constant was conceived of long ago, observational evidence for dark energy didn't come along until 11 years ago, when two studies, one led by Riess and Brian Schmidt of Mount Stromlo Observatory, and the other by Saul Perlmutter of Lawrence Berkeley National Laboratory, discovered dark energy independently, in part with Hubble observations. Since then astronomers have been pursuing observations to better characterize dark energy.
Riess's approach to narrowing alternative explanations for dark energy--whether it is a static cosmological constant or a dynamical field (like the repulsive force that drove inflation after the big bang)--is to further refine measurements of the universe's expansion history.
Before Hubble was launched in 1990, the estimates of the Hubble constant varied by a factor of two. In the late 1990s the Hubble Space Telescope Key Project on the Extragalactic Distance Scale refined the value of the Hubble constant to an error of only about ten percent. This was accomplished by observing Cepheid variables at optical wavelengths out to greater distances than obtained previously and comparing those to similar measurements from ground-based telescopes.
The SHOES team used Hubble's Near Infrared Camera and Multi-Object Spectrometer (NICMOS) and the Advanced Camera for Surveys (ACS) to observe 240 Cepheid variable stars across seven galaxies. One of these galaxies was NGC 4258, whose distance was very accurately determined through observations with radio telescopes. The other six galaxies recently hosted Type Ia supernovae that are reliable distance indicators for even farther measurements in the universe. Type Ia supernovae all explode with nearly the same amount of energy and therefore have almost the same intrinsic brightness.
By observing Cepheids with very similar properties at near-infrared wavelengths in all seven galaxies, and using the same telescope and instrument, the team was able to more precisely calibrate the luminosity of supernovae. With Hubble's powerful capabilities, the team was able to sidestep some of the shakiest rungs along the previous distance ladder involving uncertainties in the behavior of Cepheids.
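The final rung is then straightforward: once the Cepheids fix the absolute magnitude of Type Ia supernovae, each distant supernova's apparent magnitude gives a distance, and that distance together with the host galaxy's recession velocity gives the Hubble constant. A minimal Python sketch; all of the numbers below are invented for illustration and are not data from the paper:

    def luminosity_distance_mpc(m, M):
        # Distance from the distance modulus m - M = 5 log10(d / 10 pc), converted to megaparsecs
        return 10.0 ** ((m - M + 5.0) / 5.0) / 1.0e6

    M_ia = -19.3        # hypothetical calibrated Type Ia absolute magnitude (set by the Cepheid hosts)
    m_obs = 17.0        # hypothetical apparent peak magnitude of a distant Type Ia supernova
    v_rec = 13000.0     # hypothetical recession velocity of its host galaxy, km/s

    d = luminosity_distance_mpc(m_obs, M_ia)
    H0 = v_rec / d      # Hubble's law, v = H0 * d, valid at low redshift
    print(round(d), "Mpc;  H0 =", round(H0, 1), "km/s/Mpc")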
Riess would eventually like to see the Hubble constant refined to a value with an error of no more than one percent, to put even tighter constraints on solutions to dark energy.
Journal reference:
Riess et al. A Redetermination of the Hubble Constant with the Hubble Space Telescope from a Differential Distance Ladder. The Astrophysical Journal, 2009; (accepted for publication) [link]
Adapted from materials provided by Space Telescope Science Institute.

Hubble Repair Mission On Track For May 11 Launch


ScienceDaily (May 8, 2009) — A $70 million instrument designed by the University of Colorado at Boulder to probe the evolution of galaxies, stars and intergalactic matter from its perch on the orbiting Hubble Space Telescope is on schedule for its slated May 11 launch from Kennedy Space Center in Florida aboard NASA's space shuttle Atlantis.
Originally scheduled for launch in 2004, NASA's Hubble servicing mission has been delayed over the years by causes ranging from the Columbia space shuttle accident to mechanical glitches. But CU-Boulder Professor James Green of the Center for Astrophysics and Space Astronomy, principal investigator for the $70 million Cosmic Origins Spectrograph, or COS, said from the Kennedy Space Center today that things look very good for the launch of Atlantis next Monday at 2:01 p.m. EDT.
""There have been no hiccups this time around and everything is going very smoothly," said Green. We are right on schedule and the team is optimistic about the launch."
The telephone-booth-sized COS, built primarily by CU-Boulder's industrial partner, Ball Aerospace & Technologies Corp. of Boulder, should help scientists better understand the "cosmic web" of material believed to permeate the universe, said Green. COS will gather information from ultraviolet light emanating from distant objects, allowing scientists to look back several billion years and reconstruct the physical conditions and evolution of the early universe.
Distant quasars will be used as "flashlights" to track light as it passes through the cosmic web of long, narrow filaments of galaxies and intergalactic gas separated by enormous voids, said Green. Astrophysicists have theorized that a single cosmic web filament may stretch for hundreds of millions of light-years, an astonishing length considering a single light-year is about 5.9 trillion miles.
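That mileage figure is just the speed of light multiplied by one year:

    c_miles_per_sec = 186282                    # speed of light in miles per second
    seconds_per_year = 365.25 * 24 * 3600
    light_year_miles = c_miles_per_sec * seconds_per_year
    print(f"{light_year_miles:.2e} miles")      # about 5.88e12, i.e. roughly 5.9 trillion miles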
Light absorbed by material in the web should reveal "fingerprints" of matter like hydrogen, helium and heavier elements, allowing scientists to build up a picture of how the gases are distributed and how matter has changed over time as the universe has aged, Green said.
The spectrograph will break light into its individual components much like a prism, revealing the temperature, density, velocity, distance and chemical composition of galaxies, stars and gas clouds, said Professor Michael Shull of CASA, a co-investigator on COS. The team has chosen hundreds of astronomical targets in all directions of space, which will allow them to build a picture of the way matter is organized in the universe on a grand scale, Shull said.
Shull said one of the earliest COS targets will be a quasar previously looked at by Hubble that is believed to have formed about 5 billion years ago – more than one-third of the way back in time and space to the Big Bang. "This instrument is ten times more sensitive than any previous Hubble ultraviolet instruments, so we are looking forward to studying intergalactic space at this distant epoch in detail."
While matter is thought to have been distributed uniformly throughout space just after the Big Bang, gravity has shaped it into its present filamentary structure known as the cosmic web, said Shull. "Pointing our instrument at hundreds of targets over time will allow us to take a CAT scan of the universe."
COS also will be used to detect young hot stars shrouded in the thick dust clouds they formed in, providing new information on star birth, said CASA Senior Research Associate Cynthia Froning, COS project scientist. Scientists also will point COS at gas surrounding the outer planets of the solar system to glean new clues about planetary evolution.
Green and his COS science team, which is made up of 14 CU-Boulder scientists and engineers and 10 scientists from other institutions, have been allotted 552 orbits of observation time on Hubble. CU-Boulder's CASA is in the process of hiring several dozen postdoctoral researchers, graduate students and undergraduates to work on the project in the coming years, Green said.
Other members of the COS science team are from Ball, the Southwest Research Institute in Boulder, the University of Wisconsin-Madison, the University of California, Berkeley, NASA's Goddard Space Flight Center in Greenbelt, Md., and the Space Telescope Science Institute in Baltimore, Green said.
Adapted from materials provided by University of Colorado at Boulder.

mercoledì 6 maggio 2009

Star Crust 10 Billion Times Stronger Than Steel, Physicist Finds

SOURCE

ScienceDaily (May 6, 2009) — Research by a theoretical physicist at Indiana University shows that the crusts of neutron stars are 10 billion times stronger than steel or any of the earth's other strongest metal alloys.
Charles Horowitz, a professor in the IU College of Arts and Sciences' Department of Physics, came to the conclusion after large-scale molecular dynamics computer simulations were conducted at Indiana University and Los Alamos National Laboratory in New Mexico.
Exhibiting extreme gravity while rotating as fast as 700 times per second, neutron stars are massive stars that collapsed once their cores ceased nuclear fusion and energy production. The only denser objects are black holes; a teaspoonful of neutron star matter would weigh about 100 million tons.
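How much a teaspoonful weighs depends strongly on which layer of the star you consider; the figure quoted here follows if one assumes a density of about 2 x 10^16 kilograms per cubic metre, a value reached within the crust (the deep interior is far denser still). A short check in Python, with that density as an explicit assumption rather than a number from the study:

    density = 2.0e16        # kg/m^3, assumed crust-scale density (illustrative, not from the paper)
    teaspoon = 5.0e-6       # m^3, about five millilitres
    print(density * teaspoon / 1000.0, "metric tons")   # 1e8, i.e. about 100 million tons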
Scientists want to understand the structure of neutron stars, in part, because surface irregularities, or mountains, in the crust could radiate gravitational waves, which are ripples in space-time. Understanding how high a mountain might become before collapsing under the neutron star's gravity, or estimating the crust's breaking strain, also has implications for better understanding star quakes and magnetar giant flares.
"We modeled a small region of the neutron star crust by following the individual motions of up to 12 million particles," Horowitz said of the work conducted through IU's Nuclear Theory Center in the Office of the Vice Provost for Research. "We then calculated how the crust deforms and eventually breaks under the extreme weight of a neutron star mountain."
Performed on a large computer cluster at Los Alamos National Laboratory and built upon smaller versions created on special-purpose molecular dynamics computer hardware at IU, the simulations identified a neutron star crust that far exceeded the strength of any material known on earth.
The modeling found that the crust could be strong enough to support mountains whose gravitational waves could not only limit the spin periods of some stars but could also be detected by gravitational-wave observatories called interferometers.
"The maximum possible size of these mountains depends on the breaking strain of the neutron star crust," Horowitz said. "The large breaking strain that we find should support mountains on rapidly rotating neutron stars large enough to efficiently radiate gravitational waves."
Because of the intense pressure inside a neutron star, the structural flaws and impurities that weaken materials like rock and steel are far less able to weaken the crystals that make up the neutron star crust. Squeezed together by gravitational force, the crust can withstand a stress roughly 10 billion times greater than the stress it would take to snap steel.
Horowitz's most recent work on neutron stars was supported by a grant from the U.S. Department of Energy and through Shared University Research Grants from IBM to IU. Working with Horowitz were Don Berry, a principal systems analyst with the High Performance Applications Group in University Information Technology Services at Indiana University, and Kai Kadau at Los Alamos National Laboratory.
Journal reference:
C. J. Horowitz, Kai Kadau. The breaking strain of neutron star crust and gravitational waves. Physical Review Letters, Online May 8, 2009 [link]
Adapted from materials provided by Indiana University.

Astronomer To Search Space For Precursors Of Life

SOURCE

ScienceDaily (May 6, 2009) — Many of the organic molecules that make up life on Earth have also been found in space. A University of Michigan astronomer will use the Herschel Space Observatory to study these chemical compounds in new detail in the warm clouds of gas and dust around young stars.
He hopes to gain insights into how organic molecules form in space and, possibly, how life formed on Earth.
"The chemistry of space makes molecules that are the precursors of life. It's possible that the Earth didn't have to make these things on its own, but that they were provided from space," said Ted Bergin, an associate professor in the Department of Astronomy.
Bergin is a co-investigator on the Heterodyne Instrument for the Far Infrared aboard Herschel and a principal investigator on one of its key observing programs. Herschel, a European Space Agency mission with NASA participation, is scheduled to launch May 6. An orbiting telescope that will open up a little-explored region of the electromagnetic spectrum, it will allow astronomers to observe at the far-infrared wavelengths where organic molecules and water emit their chemical signatures.
"We'll be studying the full extent of chemistry in space and we hope to learn what types of organics are out there as a function of their distance from a star," Bergin said. "And we want to understand the chemical machinery that led to the formation of these organics."
Meteorites flecked with amino acids, which make proteins, have fallen to Earth from space. In faraway galaxies and stellar nurseries, astronomers have detected complex organic sugar and hydrocarbon molecules that are key components in chlorophyll in plants and RNA. Bergin expects to detect tens if not hundreds of these kinds of compounds, some of which have never been found before outside the Earth.
He is also involved in a Herschel project to look for water molecules in space. Traces of water in warm clouds of gas and dust around young stars could hold clues to how water forms and behaves in space, and how this elixir of life came to be so abundant on Earth. Scientists believe water got to Earth in a similar way as organic molecules.
"Most of the water in the solar system is not where we are, but further out in the solar system," Bergin said. "Most theories suggest that the Earth formed dry and impacts from asteroids or other objects provided the water here."
Adapted from materials provided by University of Michigan.