ChaseDream

[Reading Group] [Daily Reading Training, Round 4 — Speed & Obstacle Series 1] [1-15] Science

OP
Posted 2012-5-16 07:35:39
[Speed]

[Timed 1]
A Supernova Cocoon Breakthrough

ScienceDaily (May 15, 2012) — Observations with NASA's Chandra X-ray Observatory have provided the first X-ray evidence of a supernova shock wave breaking through a cocoon of gas surrounding the star that exploded. This discovery may help astronomers understand why some supernovas are much more powerful than others.

On November 3, 2010, a supernova was discovered in the galaxy UGC 5189A, located about 160 million light years away. Using earlier data from the All Sky Automated Survey telescope in Hawaii, astronomers determined that this supernova exploded in early October 2010 (in Earth's time-frame).
This composite image of UGC 5189A shows X-ray data from Chandra in purple and optical data from the Hubble Space Telescope in red, green and blue. SN 2010jl is the very bright X-ray source near the top of the galaxy.
A team of researchers used Chandra to observe this supernova in December 2010 and again in October 2011. The supernova was one of the most luminous that has ever been detected in X-rays.
In optical light, SN 2010jl was about ten times more luminous than a typical supernova resulting from the collapse of a massive star, adding to the class of very luminous supernovas that have been discovered recently with optical surveys. Different explanations have been proposed to explain these energetic supernovas including (1) the interaction of the supernova's blast wave with a dense shell of matter around the pre-supernova star, (2) radioactivity resulting from a pair-instability supernova (triggered by the conversion of gamma rays into particle and anti-particle pairs), and (3) emission powered by a neutron star with an unusually powerful magnetic field.
274 words



[Timed 2]
In the first Chandra observation of SN 2010jl, the X-rays from the explosion's blast wave were strongly absorbed by a cocoon of dense gas around the supernova. This cocoon was formed by gas blown away from the massive star before it exploded.
In the second observation taken almost a year later, there is much less absorption of X-ray emission, indicating that the blast wave from the explosion has broken out of the surrounding cocoon. The Chandra data show that the gas emitting the X-rays has a very high temperature -- greater than 100 million kelvins -- strong evidence that it has been heated by the supernova blast wave.
The energy distribution, or spectrum, of SN 2010jl in optical light reveals features that the researchers think are explained by the following scenario: matter around the supernova has been heated and ionized (electrons stripped from atoms) by X-rays generated when the blast wave plows through this material. While this type of interaction has been proposed before, the new observations directly show, for the first time, that this is happening.
This discovery therefore supports the idea that some of the unusually luminous supernovas are caused by the blast wave from their explosion ramming into the material around it.
In a rare example of a cosmic coincidence, analysis of the X-rays from the supernova shows that there is a second unrelated source at almost the same location as the supernova. These two sources strongly overlap one another as seen on the sky. This second source is likely to be an ultraluminous X-ray source, possibly containing an unusually heavy stellar-mass black hole, or an intermediate mass black hole.
274 words



[Timed 3]
Statistical Analysis Projects Future Temperatures in North America

ScienceDaily (May 15, 2012) — For the first time, researchers have been able to combine different climate models using spatial statistics to project future seasonal temperature changes in regions across North America.

They performed advanced statistical analysis on two different North American regional climate models and were able to estimate projections of temperature changes for the years 2041 to 2070, as well as the certainty of those projections.
The analysis, developed by statisticians at Ohio State University, examines groups of regional climate models, finds the commonalities between them, and determines how much weight each individual climate projection should get in a consensus climate estimate.
Through maps on the statisticians' website, people can see how their own region's temperature will likely change by 2070 -- overall, and for individual seasons of the year.
Given the complexity and variety of climate models produced by different research groups around the world, there is a need for a tool that can analyze groups of them together, explained Noel Cressie, professor of statistics and director of Ohio State's Program in Spatial Statistics and Environmental Statistics.
Cressie and former graduate student Emily Kang, now at the University of Cincinnati, present the statistical analysis in a paper published in the International Journal of Applied Earth Observation and Geoinformation.
"One of the criticisms from climate-change skeptics is that different climate models give different results, so they argue that they don't know what to believe," he said. "We wanted to develop a way to determine the likelihood of different outcomes, and combine them into a consensus climate projection. We show that there are shared conclusions upon which scientists can agree with some certainty, and we are able to statistically quantify that certainty."
291 words



[Timed 4]
For their initial analysis, Cressie and Kang chose to combine two regional climate models developed for the North American Regional Climate Change Assessment Program. Though the models produced a wide variety of climate variables, the researchers focused on temperatures during a 100-year period: first, the climate models' temperature values from 1971 to 2000, and then the climate models' temperature values projected for 2041 to 2070. The data were broken down into blocks of area 50 kilometers (about 30 miles) on a side, throughout North America.
Averaging the results over those individual blocks, Cressie and Kang's statistical analysis estimated that average land temperatures across North America will rise around 2.5 degrees Celsius (4.5 degrees Fahrenheit) by 2070. That result is in agreement with the findings of the United Nations Intergovernmental Panel on Climate Change, which suggest that under the same emissions scenario as used by NARCCAP, global average temperatures will rise 2.4 degrees Celsius (4.3 degrees Fahrenheit) by 2070. Cressie and Kang's analysis is for North America -- and not only estimates average land temperature rise, but regional temperature rise for all four seasons of the year.
Cressie cautioned that this first study is based on a combination of a small number of models. Nevertheless, he continued, the statistical computations are scalable to a larger number of models. The study shows that climate models can indeed be combined to achieve consensus, and the certainty of that consensus can be quantified.
The statistical analysis could be used to combine climate models from any region in the world, though, he added, it would require an expert spatial statistician to modify the analysis for other settings.
The key is a special combination of statistical analysis methods that Cressie pioneered, which use spatial statistical models in what researchers call Bayesian hierarchical statistical analyses.
298 words



[Timed 5]
The latter techniques come from Bayesian statistics, which allows researchers to quantify the certainty associated with any particular model outcome. All data sources and models are more or less certain, Cressie explained, and it is the quantification of these certainties that forms the building blocks of a Bayesian analysis.
In the case of the two North American regional climate models, his Bayesian analysis technique was able to give a range of possible temperature changes that includes the true temperature change with 95 percent probability.
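As a rough illustration of how a consensus estimate and its uncertainty can both be computed from several models, the sketch below uses simple precision weighting. This is not the paper's actual method (which is a spatial Bayesian hierarchical model), and the two model projections here are made-up numbers for a single hypothetical grid block:

```python
# Minimal precision-weighted combination of two climate-model projections.
# Hypothetical values: (mean warming in deg C, standard deviation) per model.
models = [(2.4, 0.6), (2.7, 0.9)]

# Weight each model by its precision (1 / variance): the more certain
# model contributes more to the consensus estimate.
weights = [1.0 / (sd ** 2) for _, sd in models]
total = sum(weights)

consensus_mean = sum(w * m for (m, _), w in zip(models, weights)) / total
consensus_sd = (1.0 / total) ** 0.5  # uncertainty of the combined estimate

print(round(consensus_mean, 2))  # consensus warming, pulled toward the tighter model
print(round(consensus_sd, 2))    # smaller than either individual model's sd
```

Note how the combined uncertainty is smaller than either input's, which is the sense in which combining models "quantifies the certainty" of a shared conclusion.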
After producing average maps for all of North America, the researchers took their analysis a step further and examined temperature changes for the four seasons. On their website, they show those seasonal changes for regions in the Hudson Bay, the Great Lakes, the Midwest, and the Rocky Mountains.
In the future, the region in the Hudson Bay will likely experience larger temperature swings than the others, they found.
That Canadian region in the northeast part of the continent is likely to experience the biggest change over the winter months, with temperatures estimated to rise an average of about 6 degrees Celsius (10.7 degrees Fahrenheit) -- possibly because ice reflects less energy away from Earth's surface as it melts. Hudson Bay summers, on the other hand, are estimated to experience only an increase of about 1.2 degrees Celsius (2.1 degrees Fahrenheit).
According to the researchers' statistical analysis, the Midwest and Great Lakes regions will experience a rise in temperature of about 2.8 degrees Celsius (5 degrees Fahrenheit), regardless of season. The Rocky Mountains region shows greater projected increases in the summer (about 3.5 degrees Celsius, or 6.3 degrees Fahrenheit) than in the winter (about 2.3 degrees Celsius, or 4.1 degrees Fahrenheit).
In the future, the researchers could consider other climate variables in their analysis, such as precipitation.
This research was supported by NASA's Earth Science Technology Office. The North American Regional Climate Change Assessment Program is funded by the National Science Foundation, the U.S. Department of Energy, the National Oceanic and Atmospheric Administration, and the U.S. Environmental Protection Agency office of Research and Development.
347 words





[Obstacle]

New Look at Prolonged Radiation Exposure: At Low Dose-Rate, Radiation Poses Little Risk to DNA, Study Suggests

ScienceDaily (May 15, 2012) — A new study from MIT scientists suggests that the guidelines governments use to determine when to evacuate people following a nuclear accident may be too conservative.


The study, led by Bevin Engelward and Jacquelyn Yanch and published in the journal Environmental Health Perspectives, found that when mice were exposed to radiation doses about 400 times greater than background levels for five weeks, no DNA damage could be detected.
Current U.S. regulations require that residents of any area that reaches radiation levels eight times higher than background should be evacuated. However, the financial and emotional cost of such relocation may not be worthwhile, the researchers say.
"There are no data that say that's a dangerous level," says Yanch, a senior lecturer in MIT's Department of Nuclear Science and Engineering. "This paper shows that you could go 400 times higher than average background levels and you're still not detecting genetic damage. It could potentially have a big impact on tens if not hundreds of thousands of people in the vicinity of a nuclear power plant accident or a nuclear bomb detonation, if we figure out just when we should evacuate and when it's OK to stay where we are."
Until now, very few studies have measured the effects of low doses of radiation delivered over a long period of time. This study is the first to measure the genetic damage seen at a level as low as 400 times background (0.0002 centigray per minute, or 105 cGy in a year).
"Almost all radiation studies are done with one quick hit of radiation. That would cause a totally different biological outcome compared to long-term conditions," says Engelward, an associate professor of biological engineering at MIT.
How much is too much?
Background radiation comes from cosmic radiation and natural radioactive isotopes in the environment. These sources add up to about 0.3 cGy per year per person, on average.
"Exposure to low-dose-rate radiation is natural, and some people may even say essential for life. The question is, how high does the rate need to get before we need to worry about ill effects on our health?" Yanch says.
Previous studies have shown that a radiation level of 10.5 cGy, the total dose used in this study, does produce DNA damage if given all at once. However, for this study, the researchers spread the dose out over five weeks, using radioactive iodine as a source. The radiation emitted by the radioactive iodine is similar to that emitted by the damaged Fukushima reactor in Japan.
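The dose figures quoted above are internally consistent; a minimal arithmetic check (all figures taken from the article) confirms that 0.0002 cGy per minute corresponds to roughly 105 cGy over a year and about 10 cGy over the five-week study:

```python
# Sanity check of the dose arithmetic quoted in the article.
rate = 0.0002                          # cGy per minute, ~400x background

per_year = rate * 60 * 24 * 365        # dose if sustained for a full year
print(round(per_year, 1))              # ~105 cGy/year, as the article states

five_weeks = rate * 60 * 24 * 7 * 5    # total dose over the 5-week study
print(round(five_weeks, 1))            # ~10 cGy, close to the 10.5 cGy acute dose
```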
At the end of five weeks, the researchers tested for several types of DNA damage, using the most sensitive techniques available. Those types of damage fall into two major classes: base lesions, in which the structure of the DNA base (nucleotide) is altered, and breaks in the DNA strand. They found no significant increases in either type.
DNA damage occurs spontaneously even at background radiation levels, conservatively at a rate of about 10,000 changes per cell per day. Most of that damage is fixed by DNA repair systems within each cell. The researchers estimate that the amount of radiation used in this study produces an additional dozen lesions per cell per day, all of which appear to have been repaired.
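To put those figures in perspective, the extra radiation-induced lesions are a tiny fraction of the spontaneous background damage (both numbers from the article):

```python
spontaneous = 10_000   # spontaneous DNA lesions per cell per day (article's conservative figure)
extra = 12             # additional lesions per cell per day from the study's radiation ("a dozen")

increase_pct = 100 * extra / spontaneous
print(round(increase_pct, 2))  # about a 0.12 percent increase over background damage
```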
Though the study ended after five weeks, Engelward believes the results would be the same for longer exposures. "My take on this is that this amount of radiation is not creating very many lesions to begin with, and you already have good DNA repair systems. My guess is that you could probably leave the mice there indefinitely and the damage wouldn't be significant," she says.
Doug Boreham, a professor of medical physics and applied radiation sciences at McMaster University, says the study adds to growing evidence that low doses of radiation are not as harmful as people often fear.
"Now, it's believed that all radiation is bad for you, and any time you get a little bit of radiation, it adds up and your risk of cancer goes up," says Boreham, who was not involved in this study. "There's now evidence building that that is not the case."
Conservative estimates
Most of the radiation studies on which evacuation guidelines have been based were originally done to establish safe levels for radiation in the workplace, Yanch says -- meaning they are very conservative. In workplace cases, this makes sense because the employer can pay for shielding for all of their employees at once, which lowers the cost, she says.
However, "when you've got a contaminated environment, then the source is no longer controlled, and every citizen has to pay for their own dose avoidance," Yanch says. "They have to leave their home or their community, maybe even forever. They often lose their jobs, like you saw in Fukushima. And there you really want to call into question how conservative in your analysis of the radiation effect you want to be. Instead of being conservative, it makes more sense to look at a best estimate of how hazardous radiation really is."
Those conservative estimates are based on acute radiation exposures, extrapolated to what might happen at lower doses and lower dose-rates, Engelward says. "Basically you're using a data set collected based on an acute high dose exposure to make predictions about what's happening at very low doses over a long period of time, and you don't really have any direct data. It's guesswork," she says. "People argue constantly about how to predict what is happening at lower doses and lower dose-rates."
However, the researchers say that more studies are needed before evacuation guidelines can be revised.
"Clearly these studies had to be done in animals rather than people, but many studies show that mice and humans share similar responses to radiation. This work therefore provides a framework for additional research and careful evaluation of our current guidelines," Engelward says.
"It is interesting that, despite the evacuation of roughly 100,000 residents, the Japanese government was criticized for not imposing evacuations for even more people. From our studies, we would predict that the population that was left behind would not show excess DNA damage -- this is something we can test using technologies recently developed in our laboratory," she adds.
The first author on these studies is former MIT postdoc Werner Olipitz, and the work was done in collaboration with Department of Biological Engineering faculty Leona Samson and Peter Dedon. These studies were supported by the DOE and by MIT's Center for Environmental Health Sciences.
1100 words

#2
Posted 2012-5-16 09:04:11
Huh? I'm this late and still got the first reply?

1:48
A supernova, SN, was discovered; observations from Hawaii found it was extremely bright when it exploded, so it is classed as an ultra-luminous supernova. Three possible causes are given: (1) the blast interacting with a shell of gas, (2) gamma rays converting into particle/anti-particle pairs(?), (3) a neutron star with a powerful magnetic field.
1:57
Two observations: in the first, few X-rays got through -- the blast had not broken out of the layer surrounding the supernova; in the second, many more did -- it had broken through, and a lot of blast-heated material was found.
Explanation: matter is heated and ionized → the blast interacts with the material around the star → an ultra-luminous supernova.
A second, unrelated source of X-rays -- a black hole.


1:30
Scientists project North American climate: various mathematical models and complex projection methods are presented. A special tool is also needed, because the different models give different results, so a more definitive way of deciding among them is required.
1:44
Two models are chosen; regions and years are divided up for the projection; the temperature rise for each region is estimated, along with seasonal trends for each region, and so on.
1:35...
Basically just the estimated results: region after region is projected to warm by so many degrees... It also mentions that precipitation projections may come later, and ends with the funding sources and sponsoring agencies.



7:59
Results of MIT's radiation study: low-level radiation has very little effect, and governments' evacuation thresholds are too conservative (meaning overly cautious -- this level of radiation is not a big deal...).
Rough outline:
P1: Experiments on mice -- mice exposed to low-level radiation over a long period under constant conditions showed no DNA damage. Extrapolating from mice to humans, the old estimates were too conservative; people have studied nuclear explosions rather than small-scale nuclear accidents and leaks, so the danger of radiation was overestimated.
P2: Defining the safe-radiation baseline -- the old one was set too low. Radiation above the safety line, from the same material as Japan's nuclear accident, apparently also did no harm: no DNA damage was found in the mice at all.
P3: The old estimates overstated the danger. Although contamination harms the environment and has social costs such as job loss, it really poses little danger to people.
But they think it is still too early to raise the threshold; more careful measurement and experiments are needed first.

#3
Posted 2012-5-16 09:08:45
Thanks Christine~ great articles~~~
1:21

1:22

1:30

1:20

1:22

~~

9:29

Topic: low-dose radiation may not increase the risk of DNA damage

1. nuclear accidents -> conservative?

2.An experiment:

*mice exposed to radiation levels greater than 400 times background, but their DNA doesn't change.

3. The government's evacuation wastes money and may not be worth doing.

4. no data show that radiation exposure at this level is dangerous(?)

5. Few studies have done similar research; this study is the first.

6.Q:How much is too much?

*radiation comes from … and …

#ANS: radiation level 10.5 cGy -> DNA damage (premise: all at once)

*radioactive iodine = same as Japan's Fukushima reactor

7.test of DNA damage

#2 classes: *1. base lesion (the nucleotide is altered)

*2. break in the DNA strand

8. result for longer exposure = ?? (I forget this detail..)

9.Radiation is not as harmful as people fear.

10.”Radiation will increase the risk of cancer”

*This saying is not the case.

11. The case in Japan (Fukushima) is hazardous, and it forced many people to leave their homes and jobs.

12. Conservative estimates are based on acute radiation exposure

→ extrapolated to lower dose & lower dose-rate

13. More studies in this area are needed before the conservative guidelines are revised.

14. The reason to choose mice for the experiment is that mice respond similarly to humans ??? (can't read what I wrote...) to radiation

15. It is interesting that even though the Japanese government evacuated many people after the disaster, it was criticized for not evacuating more.

16.Introduction of the researchers.
#4
Posted 2012-5-16 10:24:21
Fourth floor~

---
1. 1'40
2. 1'29
3. 1'36
4. 1'22
5. 1'43
Obstacle: 5'38
1. describes an experiment showing that low background radiation does not cause the DNA damage people expected.
2. points out differences between this experiment and former ones
3. Now the problem is: how to set the benchmark for safe radiation
4. the current estimates are very conservative; discusses the difference between workplace exposure and a contaminated environment.
5#
Posted 2012-5-16 11:28:20
Science topics are so scary~~
1. 2'07'' 291w: There is an observatory, C, that detects with X-rays. It detected the explosion of a star, S -- the first time it had detected such a thing. The star is 160 million light years from Earth and probably exploded in October 2010; it was first picked up by a survey in Hawaii. S is more energetic than the others; then three or four possible reasons were listed, which I didn't follow...
2. 2'03'' 291w: In the observation a year later, the surrounding absorption of X-rays had changed, validating a proposed theory, and the heating that was seen also supported this explanation. Then a second, unrelated source was found, possibly a black hole or the like.
3. 1'46'' 291w: The gist is that there are too many climate models and theories now, producing many different results, so they built a statistical model that can consider all the models together -- this way they can even project North America's climate out to 2070.
4. 1'54'' 298w: They took two models for the experiment, divided all the land into 50-kilometer blocks, and concluded that North America will warm 2.5 degrees Celsius by 2070. Their system could even project temperatures elsewhere in the world, though that needs manual adjustment by an expert.
5. 2' 347w: Their analysis showed that four representative North American regions will all warm to different degrees, with a guessed reason for the rise in the northeast. More factors and models may be added in the future.
Finally it noted that the study was sponsored by NASA, and the North American program by several agencies.

Obstacle: 6'13'' 1100w
Part 1 uses a study to challenge the traditional view of how terrifying radiation is: some doses are harmful when released all at once, but the same amount released over a long time has no effect on the body, so costly mass relocation is sometimes unnecessary.
Part 2 describes the experiments -- notably using the same kind of radiation as Japan's accident -- which found no effect on the mice, so the challenge above was verified experimentally.
Part 3 notes that the experiments were on mice, not humans, so further study is needed, though mice and humans respond to radiation similarly. Japan evacuated 100,000 people and was still criticized for evacuating only that many, but these experiments suggest the people left behind were not harmed by the radiation.
Finally, who sponsored the study.
6#
Posted 2012-5-16 13:07:25
1'30"
1'38"
1'43"
1'55"
1'59"

6'45"
MIT scientists experimented on mice and found that long-term, low-dose exposure to radiation does not cause DNA damage, calling the current evacuation standard into question. This is the first experiment devoted to long-term low-dose exposure; previous ones used one-time high doses, and the two should produce different results.
The experimental procedure: the total dose equaled the one-time dose but was spread over 5 weeks, with radioactive iodine as the source. After 5 weeks the mice's DNA showed no significant damage -- cells have repair systems, and the radiation damage was all repaired. Although the experiment ran only 5 weeks, longer exposures should give the same result.
It points out that the current standard is too conservative. Evacuation is enormously costly; people need a best estimate of the hazard, and data from long-term low-dose exposure rather than the old one-time high-dose data. Much work remains before the standard can be revised.
The Fukushima example: residents who stayed behind are predicted to show no additional DNA damage under long-term low-dose exposure.
Collaborators and sponsors are introduced.
7#
Posted 2012-5-16 16:46:30
2'18
1'33
1'46
2'33
1'55

9'04
Small doses of radiation do not harm the body.
The current harm threshold is also considered overestimated, possibly because cells repair themselves.
Society's view of radiation is still too conservative.
Skipping the details of the experiments and data comparisons.
8#
Posted 2012-5-16 17:27:49
Speed
57s
59s
58s
1min12s
1min8s

Obstacle
5min47s
1. Introduction:
1) A recent study showed that when mice were exposed to radiation 400 times above background, no DNA damage could be observed in them.
2) Normally, when an area's radiation is 8 times higher than background, the government evacuates all residents within a certain range. But through this study the researchers raised the question of whether evacuation at that level of radiation is necessary.
3) E believes that at 8 times background, the human body would not suffer much harm.
4) Traditional studies all drew their conclusions from a single strong dose; no one had studied whether people are affected by long-term exposure to low-level radiation.
2. The experiment:
1) Purpose: how much is too much? Normally about 0.3 [cGy per year] is the baseline, and at a dose of 10.5 [cGy] genes are damaged -- but those data all came from one-time exposures. E wanted to know whether DNA is also damaged under long-term, low-level exposure.
2) Method: take subjects and expose them to radiation (the source is iodine, the same as at Fukushima in Japan) for five weeks.
3) Result: neither type of DNA damage (base lesions or strand breaks) showed a significant increase.
4) Conclusion: supports E's and DB's view that small, long-term doses do not damage genes. (Although it was only five weeks, E and DB think that suffices.)
3. We are actually exposed to radiation all the time (cosmic radiation and other natural sources); people should not panic, because these do no harm.
4. But E thinks their experiment alone cannot revise the evacuation standard; more experiments are needed, and most must start with animals.
I find that when I don't force myself to memorize, I actually remember more. But I think the obstacle drill is still quite different from the real test. When I do reading comprehension I can generally understand the passage, but I keep making mistakes on the fine distinctions between answer choices. Anyone else in the same situation?
9#
Posted 2012-5-16 18:00:26
1. 2'06'' A supernova was discovered; described the discovery process and its significance; revealed some theory; the rest of the passage described 3 processes that unfold in sequence.
2. 2'01'' This part mainly covered how the supernova was observed, the conclusions drawn, and which claims those conclusions support.
3. 2'13'' Described what a model can do; through analysis and detection, what can be derived. At the end, someone at a university published a paper that explains it.
4. 2'11'' Detailed C and K's research process: they combined some smaller models, derived some figures, and also noted the limitations of their experiment.
5. 2'18''
Obstacle: 7'26'' Two MIT professors studied at what dose nuclear radiation affects human DNA. Part 1: revealed at what dose radiation harms people, and why low doses do little harm. Part 2: questions of cumulative dose -- how to measure it, how to observe it, and so on. Finally, the paper's impact.
10#
Posted 2012-5-16 18:06:08
Grabbing a seat quickly -- can't miss it again today~
Timed 1: 1:50
Timed 2: 1:53
Timed 3: 1:58
Timed 4: 1:55
Timed 5: 2:20
There is a kind of supernova more powerful than the others, the most luminous. An explosion was observed in 2010; a substance C around the supernova absorbed a lot of the energy. In 2011 it was observed that C was gone -- the shock wave had broken through it. A gas emitting X-rays showed heating, indicating it was heated by the supernova's shock wave. There may also be another object, something like a black hole.
Climate models are combined with data analysis to study North America's climate change from 2041 to 2070. The problem is that the various current models give different results, so they sought a way to combine them into one definite conclusion. C and K found North America will warm 2.5 degrees Celsius by 2070, while another organization's data suggest the globe will warm 2.4 degrees; they combine the models with known conclusions, which can be quantified with certainty and applied to any region in the world. Then temperature changes at several North American locations were described, all increases. Finally, the many sponsoring agencies and organizations~~
Obstacle: 8:00
MIT's latest experiment suggests the thresholds governments use to evacuate people in nuclear accidents are too conservative. In mouse experiments with long-term, continuous, low-intensity radiation at 400 times background, the mice's DNA was not damaged; the current evacuation safety line was derived from one-time high-intensity exposures, so it is questionable.
The experiment in detail: over five weeks, a high total dose was spread out; the mice showed no DNA changes. Research shows DNA has a self-repair system, and the damage was repaired; even longer exposure should also be fine.
Because the current evacuation standard is too conservative, the people displaced in Japan's nuclear accident were cut off from family, work, and society unnecessarily -- their health is fine, with no extra cancer risk.
Of course, the current research only builds a framework; how the evacuation threshold should be set needs more study. Still, the researchers say mice and humans respond to radiation similarly, so the conclusion should not change much.
