【Speed】
【Timing 1】 A Supernova Cocoon Breakthrough
ScienceDaily (May 15, 2012) — Observations with NASA's Chandra X-ray Observatory have provided the first X-ray evidence of a supernova shock wave breaking through a cocoon of gas surrounding the star that exploded. This discovery may help astronomers understand why some supernovas are much more powerful than others.
On November 3, 2010, a supernova was discovered in the galaxy UGC 5189A, located about 160 million light years away. Using earlier data from the All Sky Automated Survey telescope in Hawaii, astronomers determined that this supernova exploded in early October 2010 (in Earth's time frame). This composite image of UGC 5189A shows X-ray data from Chandra in purple and optical data from the Hubble Space Telescope in red, green and blue. SN 2010jl is the very bright X-ray source near the top of the galaxy.

A team of researchers used Chandra to observe this supernova in December 2010 and again in October 2011. The supernova was one of the most luminous ever detected in X-rays. In optical light, SN 2010jl was about ten times more luminous than a typical supernova resulting from the collapse of a massive star, adding to the class of very luminous supernovas discovered recently in optical surveys.

Several explanations have been proposed for these energetic supernovas, including (1) the interaction of the supernova's blast wave with a dense shell of matter around the pre-supernova star, (2) radioactivity resulting from a pair-instability supernova (triggered by the conversion of gamma rays into particle and anti-particle pairs), and (3) emission powered by a neutron star with an unusually powerful magnetic field. 【274 words】
【Timing 2】 In the first Chandra observation of SN 2010jl, the X-rays from the explosion's blast wave were strongly absorbed by a cocoon of dense gas around the supernova. This cocoon was formed by gas blown away from the massive star before it exploded. In the second observation, taken almost a year later, there is much less absorption of X-ray emission, indicating that the blast wave has broken out of the surrounding cocoon. The Chandra data show that the gas emitting the X-rays has a very high temperature -- greater than 100 million Kelvin -- strong evidence that it has been heated by the supernova blast wave.

The energy distribution, or spectrum, of SN 2010jl in optical light reveals features that the researchers think are explained by the following scenario: matter around the supernova has been heated and ionized (electrons stripped from atoms) by X-rays generated when the blast wave plows through this material. While this type of interaction has been proposed before, the new observations directly show, for the first time, that it is happening. This discovery therefore supports the idea that some unusually luminous supernovas are caused by their blast wave ramming into the material around them.

In a rare cosmic coincidence, analysis of the X-rays from the supernova shows a second, unrelated source at almost the same location; the two sources strongly overlap as seen on the sky. This second source is likely an ultraluminous X-ray source, possibly containing an unusually heavy stellar-mass black hole or an intermediate-mass black hole. 【274 words】
【Timing 3】 Statistical Analysis Projects Future Temperatures in North America
ScienceDaily (May 15, 2012) — For the first time, researchers have been able to combine different climate models, using spatial statistics, to project future seasonal temperature changes in regions across North America.
They performed advanced statistical analysis on two different North American regional climate models and were able to estimate projections of temperature changes for the years 2041 to 2070, as well as the certainty of those projections. The analysis, developed by statisticians at Ohio State University, examines groups of regional climate models, finds the commonalities between them, and determines how much weight each individual climate projection should get in a consensus climate estimate. Through maps on the statisticians' website, people can see how their own region's temperature will likely change by 2070 -- overall, and for individual seasons of the year.

Given the complexity and variety of climate models produced by different research groups around the world, there is a need for a tool that can analyze groups of them together, explained Noel Cressie, professor of statistics and director of Ohio State's Program in Spatial Statistics and Environmental Statistics. Cressie and former graduate student Emily Kang, now at the University of Cincinnati, present the statistical analysis in a paper published in the International Journal of Applied Earth Observation and Geoinformation.

"One of the criticisms from climate-change skeptics is that different climate models give different results, so they argue that they don't know what to believe," he said. "We wanted to develop a way to determine the likelihood of different outcomes, and combine them into a consensus climate projection. We show that there are shared conclusions upon which scientists can agree with some certainty, and we are able to statistically quantify that certainty." 【291 words】
【Timing 4】 For their initial analysis, Cressie and Kang chose to combine two regional climate models developed for the North American Regional Climate Change Assessment Program (NARCCAP). Though the models produce a wide variety of climate variables, the researchers focused on temperatures during a 100-year period: first, the climate models' temperature values from 1971 to 2000, and then their projected temperature values for 2041 to 2070. The data were broken down into blocks 50 kilometers (about 30 miles) on a side, covering North America.

Averaging the results over those blocks, Cressie and Kang's statistical analysis estimated that average land temperatures across North America will rise around 2.5 degrees Celsius (4.5 degrees Fahrenheit) by 2070. That result agrees with the findings of the United Nations Intergovernmental Panel on Climate Change, which suggest that under the same emissions scenario as used by NARCCAP, global average temperatures will rise 2.4 degrees Celsius (4.3 degrees Fahrenheit) by 2070. Cressie and Kang's analysis covers North America, and estimates not only average land temperature rise but also regional temperature rise for all four seasons of the year.

Cressie cautioned that this first study is based on a combination of a small number of models. Nevertheless, he continued, the statistical computations are scalable to a larger number of models. The study shows that climate models can indeed be combined to achieve consensus, and that the certainty of that consensus can be quantified. The statistical analysis could be used to combine climate models from any region in the world, though, he added, it would require an expert spatial statistician to modify the analysis for other settings. The key is a special combination of statistical analysis methods that Cressie pioneered, which uses spatial statistical models in what researchers call Bayesian hierarchical statistical analyses. 【298 words】
【Timing 5】 The latter techniques come from Bayesian statistics, which allows researchers to quantify the certainty associated with any particular model outcome. All data sources and models are more or less certain, Cressie explained, and it is the quantification of these certainties that forms the building blocks of a Bayesian analysis. In the case of the two North American regional climate models, his Bayesian analysis technique was able to give a range of possible temperature changes that includes the true temperature change with 95 percent probability.

After producing average maps for all of North America, the researchers took their analysis a step further and examined temperature changes across the four seasons. On their website, they show those seasonal changes for the Hudson Bay, Great Lakes, Midwest, and Rocky Mountains regions.

The Hudson Bay region, they found, will likely experience larger temperature swings than the others. That Canadian region in the northeast part of the continent is likely to see the biggest change over the winter months, with temperatures estimated to rise an average of about 6 degrees Celsius (10.7 degrees Fahrenheit) -- possibly because ice reflects less energy away from Earth's surface as it melts. Hudson Bay summers, on the other hand, are estimated to warm by only about 1.2 degrees Celsius (2.1 degrees Fahrenheit). According to the statistical analysis, the Midwest and Great Lakes regions will experience a rise of about 2.8 degrees Celsius (5 degrees Fahrenheit), regardless of season. The Rocky Mountains region shows greater projected increases in the summer (about 3.5 degrees Celsius, or 6.3 degrees Fahrenheit) than in the winter (about 2.3 degrees Celsius, or 4.1 degrees Fahrenheit).

In the future, the researchers could consider other climate variables in their analysis, such as precipitation. This research was supported by NASA's Earth Science Technology Office. The North American Regional Climate Change Assessment Program is funded by the National Science Foundation, the U.S. Department of Energy, the National Oceanic and Atmospheric Administration, and the U.S. Environmental Protection Agency's Office of Research and Development. 【347 words】
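【Note】 The paper's full spatial Bayesian hierarchical model goes well beyond a news summary, but its core idea, weighting each model's projection by how certain it is and then reading a consensus estimate and a 95% interval off the combined distribution, can be sketched in a few lines. The snippet below is a toy illustration under Gaussian assumptions, not Cressie and Kang's actual method; the two projections and their standard deviations are invented numbers.

[code]
import math

# Toy precision-weighted consensus for one 50 km x 50 km grid block.
# NOTE: invented numbers for illustration; NOT values from the NARCCAP
# models or from Cressie and Kang's paper.

# Each model supplies a projected temperature change (deg C, 2041-2070)
# and a standard deviation expressing how certain that projection is.
projections = [
    (2.7, 0.8),  # hypothetical model A: mean change, std. dev.
    (2.3, 0.5),  # hypothetical model B: mean change, std. dev.
]

# Precision (1/variance) weighting: a more certain model gets more weight.
# Under Gaussian assumptions with a flat prior, this weighted mean is the
# Bayesian posterior mean for the shared "true" temperature change.
weights = [1.0 / (sd * sd) for _, sd in projections]
consensus = sum(w * m for (m, _), w in zip(projections, weights)) / sum(weights)

# Posterior standard deviation, and a 95% interval (mean +/- 1.96 sd).
post_sd = math.sqrt(1.0 / sum(weights))
lo, hi = consensus - 1.96 * post_sd, consensus + 1.96 * post_sd

print(f"consensus change: {consensus:.2f} C")   # ~2.41 C
print(f"95% interval: ({lo:.2f}, {hi:.2f}) C")  # ~(1.58, 3.24) C
[/code]

The real analysis additionally models the spatial dependence between neighboring blocks, which is what makes the hierarchical spatial-statistics machinery (and an expert spatial statistician) necessary.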
【Obstacle】
New Look at Prolonged Radiation Exposure: At Low Dose-Rate, Radiation Poses Little Risk to DNA, Study Suggests
ScienceDaily (May 15, 2012) — A new study from MIT scientists suggests that the guidelines governments use to determine when to evacuate people following a nuclear accident may be too conservative.
The study, led by Bevin Engelward and Jacquelyn Yanch and published in the journal Environmental Health Perspectives, found that when mice were exposed to radiation doses about 400 times greater than background levels for five weeks, no DNA damage could be detected. Current U.S. regulations require that residents of any area that reaches radiation levels eight times higher than background be evacuated. However, the financial and emotional cost of such relocation may not be worthwhile, the researchers say.

"There are no data that say that's a dangerous level," says Yanch, a senior lecturer in MIT's Department of Nuclear Science and Engineering. "This paper shows that you could go 400 times higher than average background levels and you're still not detecting genetic damage. It could potentially have a big impact on tens if not hundreds of thousands of people in the vicinity of a nuclear power plant accident or a nuclear bomb detonation, if we figure out just when we should evacuate and when it's OK to stay where we are."

Until now, very few studies have measured the effects of low doses of radiation delivered over a long period of time. This study is the first to measure the genetic damage seen at a level as low as 400 times background (0.0002 centigray per minute, or 105 cGy in a year). "Almost all radiation studies are done with one quick hit of radiation. That would cause a totally different biological outcome compared to long-term conditions," says Engelward, an associate professor of biological engineering at MIT.

How much is too much?

Background radiation comes from cosmic radiation and natural radioactive isotopes in the environment. These sources add up to about 0.3 cGy per year per person, on average. "Exposure to low-dose-rate radiation is natural, and some people may even say essential for life. The question is, how high does the rate need to get before we need to worry about ill effects on our health?" Yanch says.

Previous studies have shown that a radiation dose of 10.5 cGy, the total used in this study, does produce DNA damage if given all at once. For this study, however, the researchers spread the dose out over five weeks, using radioactive iodine as a source. The radiation emitted by the radioactive iodine is similar to that emitted by the damaged Fukushima reactor in Japan.

At the end of five weeks, the researchers tested for several types of DNA damage, using the most sensitive techniques available. Those types of damage fall into two major classes: base lesions, in which the structure of the DNA base (nucleotide) is altered, and breaks in the DNA strand. They found no significant increase in either type.

DNA damage occurs spontaneously even at background radiation levels, conservatively at a rate of about 10,000 changes per cell per day. Most of that damage is fixed by DNA repair systems within each cell. The researchers estimate that the amount of radiation used in this study produces an additional dozen lesions per cell per day, all of which appear to have been repaired.

Though the study ended after five weeks, Engelward believes the results would be the same for longer exposures. "My take on this is that this amount of radiation is not creating very many lesions to begin with, and you already have good DNA repair systems. My guess is that you could probably leave the mice there indefinitely and the damage wouldn't be significant," she says.
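【Note】 The dose figures quoted above hang together arithmetically, which a few lines can verify. This is only a back-of-the-envelope check of the article's numbers, not code or data from the study; the small mismatches come from rounding (for example, a background rate slightly below 0.3 cGy per year yields the quoted factor of about 400).

[code]
# Back-of-the-envelope check of the dose figures quoted in the article.
dose_rate = 0.0002    # cGy per minute (the study's low dose-rate)
background = 0.3      # cGy per year (average background, per the article)

minutes_per_year = 365 * 24 * 60
minutes_five_weeks = 5 * 7 * 24 * 60

study_dose = dose_rate * minutes_five_weeks  # ~10.1 cGy (article quotes 10.5)
yearly_dose = dose_rate * minutes_per_year   # ~105 cGy, as quoted
ratio = yearly_dose / background             # ~350x (article rounds to ~400x)

print(f"dose over five weeks:   {study_dose:.1f} cGy")
print(f"dose over a full year:  {yearly_dose:.0f} cGy")
print(f"multiple of background: {ratio:.0f}x")
[/code]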
Doug Boreham, a professor of medical physics and applied radiation sciences at McMaster University, says the study adds to growing evidence that low doses of radiation are not as harmful as people often fear. "Now, it's believed that all radiation is bad for you, and any time you get a little bit of radiation, it adds up and your risk of cancer goes up," says Boreham, who was not involved in this study. "There's now evidence building that that is not the case."

Conservative estimates

Most of the radiation studies on which evacuation guidelines have been based were originally done to establish safe levels for radiation in the workplace, Yanch says, meaning they are very conservative. In workplace cases this makes sense, because the employer can pay for shielding for all of their employees at once, which lowers the cost, she says.

However, "when you've got a contaminated environment, then the source is no longer controlled, and every citizen has to pay for their own dose avoidance," Yanch says. "They have to leave their home or their community, maybe even forever. They often lose their jobs, like you saw in Fukushima. And there you really want to call into question how conservative in your analysis of the radiation effect you want to be. Instead of being conservative, it makes more sense to look at a best estimate of how hazardous radiation really is."

Those conservative estimates are based on acute radiation exposures, extrapolated to what might happen at lower doses and lower dose-rates, Engelward says. "Basically you're using a data set collected based on an acute high dose exposure to make predictions about what's happening at very low doses over a long period of time, and you don't really have any direct data. It's guesswork," she says. "People argue constantly about how to predict what is happening at lower doses and lower dose-rates."

However, the researchers say that more studies are needed before evacuation guidelines can be revised. "Clearly these studies had to be done in animals rather than people, but many studies show that mice and humans share similar responses to radiation. This work therefore provides a framework for additional research and careful evaluation of our current guidelines," Engelward says.

"It is interesting that, despite the evacuation of roughly 100,000 residents, the Japanese government was criticized for not imposing evacuations for even more people. From our studies, we would predict that the population that was left behind would not show excess DNA damage -- this is something we can test using technologies recently developed in our laboratory," she adds.

The first author on these studies is former MIT postdoc Werner Olipitz, and the work was done in collaboration with Department of Biological Engineering faculty Leona Samson and Peter Dedon. These studies were supported by the DOE and by MIT's Center for Environmental Health Sciences. 【1100 words】