
[Reading Group] [Daily Reading Training Round 4 - Speed & Obstacle Series 1] [1-08] Science & Technology

Posted on 2012-5-9 08:07:11
My computer overheated and crashed yesterday... so I couldn't get this posted last night. Sorry, everyone...

Also, today's obstacle reading isn't very difficult, so it runs a little long... I originally wanted to find a research paper, but couldn't get hold of a suitable one, so news articles it is. O(∩_∩)O~

[Timer 1]
Smart Meter Use Will Grow, But More Slowly

Government incentives are drying up and will slow deployment of the new technology for electricity



Smart meters that give home and business owners more control over their electricity consumption patterns have been a huge success in North America, with penetration rates now approaching 35 percent and expected to reach nearly 70 percent by 2020.

But the future of smart meters will be volatile, according to a new analysis from Pike Research, with North American deployments peaking in 2011 before an expected drop-off over the next two years as government incentives that helped drive demand expire.

According to Pike's analysis, a record 12.4 million smart meters shipped in North America last year as U.S. and Canadian utilities and consumers embraced the technology, which allows for a direct exchange of supply-and-demand information at the household or business level.

But by 2013, such shipments are expected to drop to about 7.2 million as adoption rates plateau and incentives provided under the 2009 American Recovery and Reinvestment Act expire.

"The goal under the Recovery Act was to accelerate deployment, to put in as many [smart meters] as possible, and it was required to happen by the end of this year," said Bob Gohn, a Pike Research vice president and lead author of the analysis. "So you've got a whole bunch of big programs in states like California, Texas and elsewhere where deployment is winding up this year."

As of September 2011, the Edison Foundation's Institute for Electric Efficiency reported, U.S. utilities had installed roughly 27 million smart meters nationwide, with an additional 38 million expected to come online by 2015.

The projected slowdown in North American installations will be countered by robust growth in Europe and Asia, where government mandates and other factors are driving regional booms in smart meter adoptions, the report found.

China will lead, but are its meters 'smart'?
According to Pike, smart meters globally should see a compounded annual growth rate of just under 5 percent for the 2010-20 period, with Europe eventually emerging as the No. 1 deployer of the technology in terms of percentage of households and businesses using the devices.
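
To make the growth figures concrete, here is a minimal sketch of what a compound annual growth rate means, using the article's numbers purely as illustrative endpoints (the 35-to-70 percent penetration figures are North American, while the roughly 5 percent CAGR is Pike's global shipment forecast, so they are not the same series and this is not Pike Research's calculation):

```python
# Illustrative only: what a compound annual growth rate (CAGR) implies.
# The 35% -> 70% penetration endpoints and the ~5% CAGR come from the
# article; treating penetration as a simple 10-year series is my own
# simplification, not Pike Research's method.

def cagr(start, end, years):
    """Compound annual growth rate between two values over `years` years."""
    return (end / start) ** (1 / years) - 1

print(f"35% -> 70% penetration over 10 years: {cagr(0.35, 0.70, 10):.1%}/yr")  # ~7.2%/yr
print(f"a 5% CAGR compounds to {1.05 ** 10:.2f}x over the 2010-20 decade")     # ~1.63x
```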

[Word count: 370]
[Timer 2]

On a country basis, China will be the largest deployer of smart meters by 2016, with more than 310 million units installed, according to Pike.

At the same time, the analysis predicts "very dynamic and even volatile regional market characteristics, with dramatic shifts over the forecast period and very different communications technologies and standards."

In fact, Pike researchers found that Chinese deployment of smart meters was happening much more quickly and more aggressively than expected just three years ago, when the firm did its last major market analysis on the technology.

The State Grid Corporation of China, for example, has significantly expanded its use of advanced remote meters, which it considers to be "smart metering." But others have questioned whether the Chinese technology meets the full definition of "smart meter" because it does not allow for full two-way communication and the ability to read and store data on a near-continuous basis.

First-generation problems linger
Even with the technology's limitations, Pike Research opted to include the Chinese remote meters in its market analysis, resulting in an additional 40 million to 50 million meters per year shipping from 2010 to 2014. "As a result, the view of the global smart meter market has changed dramatically," the report states.

In other regions, notably North America and Europe, problems with the first-generation technologies such as proprietary wireless radio frequency networks and power line communication have added pressure on utilities to adopt new open systems on Internet protocol networks that allow for fuller integration of smart metering tools.

The analysis also found that companies manufacturing and selling smart meters are becoming larger and more globalized as demand for the meters grows worldwide. As part of this process, the industry is seeing some conglomeration as well-established firms such as Toshiba purchase smaller startups that specialize in smart meter applications.

The report also notes that utilities, regulators and smart meter manufacturers are trying to address consumer concerns about the first generation of smart meters, which many said had problems with meter accuracy, data security and privacy, and health effects.

"The consumer response experience to date calls into question just how consumers will ultimately incorporate smart meters into their lives and adjust their current lifestyles to respond to the information they will receive about energy usage and, in many cases, energy prices," the analysis states.



Reprinted from Climatewire with permission from Environment & Energy Publishing, LLC.

[Word count: 387]
[Timer 3]
Norway Opens Major Facility to Test Carbon Capture

MONGSTAD, Norway (Reuters) - Norway on Monday launched the world's largest facility of its kind to develop carbon capture and storage (CCS), the so-far commercially unproven technology that would allow greenhouse gases from power plants to be buried safely underground.

A 5.8 billion Norwegian crown ($1.00 billion) government-funded centre will test two post-combustion carbon capture technologies that could be extended to industrial-scale use if shown to be cost-effective and safe.
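
As a quick, purely illustrative check, the budget figure squares with the stated dollar amount at the exchange rate quoted at the end of the article ($1 = 5.7996 crowns):

```python
# Quick sanity check of the budget figure against the exchange rate quoted
# at the end of the article ($1 = 5.7996 Norwegian crowns). Illustrative only.
budget_nok = 5.8e9            # 5.8 billion crowns
nok_per_usd = 5.7996
print(f"{budget_nok / nok_per_usd / 1e9:.2f} billion USD")  # ~1.00 billion USD
```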

"Today we are opening the world's largest and most advanced laboratory to test carbon capture technologies... It is an important project for Norway and for the world," Prime Minister Jens Stoltenberg told the opening ceremony at the Technology Centre Mongstad (TCM), northwest of Bergen city.

The facility is unique in that it can test exhaust gases from two nearby sources - a 280-megawatt combined heat and power plant and the 10-million-tons-per-year Mongstad refinery. These produce flue gases with different carbon dioxide (CO2) contents - about 3.5 percent and 13 percent respectively.

Mongstad's emissions have a similar carbon dioxide content to those emitted by coal-fuelled power plants - which scientists say make a particularly serious contribution to climate change.

CCS offers the prospect of possibly continuing to burn fossil fuels while avoiding the worst effects by burying the emissions, for example in depleted natural gas fields under the sea, although it will be costly.

Stoltenberg said in a 2007 speech that carbon capture and storage would be Norway's equivalent of a Moon landing.

The centre has two carbon capture plants with a combined capacity to process 100,000 tons of carbon dioxide per year, making it the largest such facility in the world, Olav Folk Pedersen, the TCM's technology manager, told Reuters.

"CATCH AND RELEASE"

However, the capacity is only slightly more than a tenth of what Mongstad refinery emits per year. During the testing period all CO2 captured will be released into the atmosphere, thus having no impact on reducing the refinery's emissions costs.

[Word count: 368]
[Timer 4]

"Speaking in fishing terms, it's a "catch and release" facility," Pedersen said earlier at the briefing.

So far, few countries have agreed to invest heavily in carbon capture. Those with projects include the United States, Australia, Britain and China.

European Union Energy Commissioner Guenther Oettinger applauded Norway's efforts, which take place at a time when other CCS demonstration projects in Europe have stalled due to lack of investment.

"It's an important milestone in Europe's undertaking to develop CCS technologies... It will provide a new momentum to the discussion of CCS use in Europe," he told the ceremony.

Oettinger has said that natural gas can have a long-term future in Europe only if CCS can be applied.

Norway, with a population of just 4.9 million, is the world's eighth-biggest exporter of oil and Western Europe's biggest exporter of natural gas.

Ola Borten Moe, Norway's petroleum and energy minister, told the ceremony a full-scale CCS facility at Mongstad might be possible later, with an investment decision due in 2016, but emissions permit prices had to rise to justify the spending.

Helge Lund, the chief executive of oil firm Statoil, a partner in TCM, added: "We cannot defend it economically, because CO2 prices are so low now. We need to find some sort of solution in the future."

The EU's carbon market was touted as the cheapest and most effective way to cut emissions by putting a price on carbon dioxide emissions and getting the private sector to factor that cost into long-term investment decisions.

However, carbon permit prices are trading below 7 euros, around a quarter of what many lawmakers say is the minimum level needed to get companies to invest in clean technology.

TCM will test two carbon capture technologies, one based on amine and the other on chilled ammonia solvent, to trap CO2 emitted from the plant and the refinery. ($1 = 5.7996 Norwegian crowns)

[Word count: 316]
[Timer 5]

How Biodiversity Keeps Earth Alive

Species loss lessens the total amount of biomass on a given parcel, suggesting that the degree of diversity directly impacts the amount of life the planet can support



In 1994 biologists seeded patches of grassland in Cedar Creek, Minn. Some plots got as many as 16 species of grasses and other plants—and some as few as one. In the first few years plots with eight or more species fared about as well as those with fewer species, suggesting that a complex mix of species—what is known as biodiversity—didn't affect the amount of a plot's leaf, blade, stem and root (or biomass, as scientists call it). But when measured over a longer span—more than a decade—those plots with the most species produced the greatest abundance of plant life.

"Different species differ in how, when and where they acquire water, nutrients and carbon, and maintain them in the ecosystem. Thus, when many species grow together, they have a wider set of traits that allow them to gain the resources needed," explains ecologist Peter Reich of the University of Minnesota, who led this research to be published in Science on May 4. This result suggests "no level of diversity loss can occur without adverse effects on ecosystem functioning." That is the reverse of what numerous studies had previously found, largely because those studies only looked at short-term outcomes.

The planet as a whole is on the cusp of what some researchers have termed the sixth mass extinction event in the planet's history: the wiping out of plants, animals and all other forms of life due to human activity. The global impact of such biodiversity loss is detailed in a meta-analysis led by biologist David Hooper of Western Washington University. His team examined 192 studies that looked at species richness and its effect on ecosystems. "The primary drivers of biodiversity loss are, in rough order of impact to date: habitat loss, overharvesting, invasive species, pollution and climate change," Hooper explains. Perhaps unsurprisingly, "biodiversity loss in the 21st century could rank among the major drivers of ecosystem change," Hooper and his colleagues wrote in Nature on May 3. (Scientific American is part of Nature Publishing Group.)

[Word count: 371]
[Remainder of the article]

Losing just 21 percent of the species in a given ecosystem can reduce the total amount of biomass in that ecosystem by as much as 10 percent—and that's likely to be a conservative estimate. And when more than 40 percent of an ecosystem's species disappear—whether plant, animal, insect, fungi or microbe—the effects can be as significant as those caused by a major drought. Nor does this analysis take into account how species extinction can both be driven by and act in concert with other changes—whether warmer average temperatures or nitrogen pollution. In the real world environmental and biological changes "are likely to be happening at the same time," Hooper admits. "This is a critical need for future research."

The major driver of human impacts on the rest of life on this planet—whether through clearing forests or dumping excess fertilizer on fields—is our need for food. Maintaining high biomass from farming ecosystems, which often emphasize monocultures (single species) while also preserving biodiversity—some species now appear only on farmland—has become a "key issue for sustainability," Hooper notes, "if we're going to grow food for nine billion people on the planet in the next 40 to 50 years."

Over the long term, maintaining soil fertility may require nurturing, creating and sparing plant and microbial diversity. After all, biodiversity itself appears to control the elemental cycles—carbon, nitrogen, water—that allow the planet to support life. Only by acting in conjunction with one another, for example, can a set of grassland plant species maintain healthy levels of nitrogen in both soil and leaf. "As soil fertility increases, this directly boosts biomass production," just as in agriculture, Reich notes. "When we reduce diversity in the landscape—think of a cornfield or a pine plantation or a suburban lawn—we are failing to capitalize on the valuable natural services that biodiversity provides."

At least one of those services is largely unaffected, however, according to Hooper's study—decomposition. Which means the bacteria and fungi will still happily break down whatever plants are left after this sixth extinction. But thousands of unique species have already been lost, most unknown even to science—a rate that could halve the total number of species on the planet by 2100, according to entomologist E. O. Wilson of Harvard University. Ghosts of species past haunt ecosystems worldwide, which have already lost not just one or another type of grass or roundworm but also some of their strength at sustaining life as a whole.

[Word count: 419]







[Obstacle reading]

New Technology Allows Better Extreme Weather Forecasts

New technology that increases the warning time for tornadoes and hurricanes could potentially save hundreds of lives every year


After the deafening roar of a thunderstorm, an eerie silence descends. Then the blackened sky over Joplin, Mo., releases the tentacles of an enormous, screaming multiple-vortex tornado. Winds exceeding 200 miles per hour tear a devastating path three quarters of a mile wide for six miles through the town, destroying schools, a hospital, businesses and homes and claiming roughly 160 lives.

Nearly 20 minutes before the twister struck on the Sunday evening of May 22, 2011, government forecasters had issued a warning. A tornado watch had been in effect for hours and a severe weather outlook for days. The warnings had come sooner than they typically do, but apparently not soon enough. Although emergency officials were on high alert, many local residents were not.

The Joplin tornado was only one of many twister tragedies in the spring of 2011. A month earlier a record-breaking swarm of tornadoes devastated parts of the South, killing more than 300 people. April was the busiest month ever recorded, with about 750 tornadoes.

At 550 fatalities, 2011 was the fourth-deadliest tornado year in U.S. history. The stormy year was also costly. Fourteen extreme weather and climate events in 2011—from the Joplin tornado to hurricane flooding and blizzards—each caused more than $1 billion in damages. The intensity continued early in 2012; on March 2, twisters killed more than 40 people across 11 Midwestern and Southern states.

Tools for forecasting extreme weather have advanced in recent decades, but researchers and engineers at the National Oceanic and Atmospheric Administration are working to enhance radars, satellites and supercomputers to further lengthen warning times for tornadoes and thunderstorms and to better determine hurricane intensity and forecast floods. If the efforts succeed, a decade from now residents will get an hour’s warning about a severe tornado, for example, giving them plenty of time to absorb the news, gather family and take shelter.

The Power of Radar
Meteorologist Doug Forsyth is heading up efforts to improve radar, which plays a role in forecasting most weather. Forsyth, who is chief of the Radar Research and Development division at NOAA's National Severe Storms Laboratory in Norman, Okla., is most concerned about improving warning times for tornadoes because deadly twisters form quickly and radar is the forecaster's primary tool for sensing a nascent tornado.

Radar works by sending out radio waves that reflect off particles in the atmosphere, such as raindrops or ice or even insects and dust. By measuring the strength of the waves that return to the radar and how long the round-trip takes, forecasters can see the location and intensity of precipitation. The Doppler radar currently used by the National Weather Service also measures the frequency change in returning waves, which provides the direction and speed at which the precipitation is moving. This key information allows forecasters to see rotation occurring inside thunderstorms before tornadoes form.
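
For readers who want the relations spelled out, here is a minimal sketch of the two standard radar formulas the paragraph describes: range from the pulse's round-trip time, and radial velocity from the Doppler frequency shift. The 10 cm wavelength and the sample numbers are illustrative assumptions, not values from the article:

```python
# Minimal sketch of the two relations described above: range from the pulse's
# round-trip time, and radial velocity from the Doppler frequency shift.
# The 10 cm wavelength is a typical S-band value assumed here, and the sample
# numbers are invented for illustration.

C = 3.0e8  # speed of light, m/s

def target_range_m(round_trip_s):
    """The pulse travels out and back, so halve the round-trip distance."""
    return C * round_trip_s / 2

def radial_velocity_ms(doppler_shift_hz, wavelength_m=0.10):
    """Two-way Doppler relation for radar: v = wavelength * shift / 2."""
    return wavelength_m * doppler_shift_hz / 2

print(target_range_m(400e-6) / 1000, "km")   # 400-microsecond echo -> 60 km away
print(radial_velocity_ms(500), "m/s")        # 500 Hz shift -> 25 m/s toward/away
```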

In 1973 NOAA meteorologists Rodger Brown, Les Lemon and Don Burgess discovered this information’s predictive power as they analyzed data from a tornado that struck Union City, Okla. They noted very strong outbound velocities right next to very strong inbound velocities in the radar data. The visual appearance of those data was so extraordinary that the researchers initially did not know what it meant. After matching the data to the location of the tornado, however, they named the data “Tornadic Vortex Signature.” The TVS is now the most important and widely recognized metric indicating a high probability of either an ongoing tornado or the potential for one in the very near future. These data enabled longer lead times for tornado warnings, increasing from a national average of 3.5 minutes in 1987 to 14 minutes today.
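
A toy sketch of the pattern Brown, Lemon and Burgess spotted, strong inbound velocity right next to strong outbound velocity, might look like the following; the 40 m/s threshold and the sample data are invented for illustration and are not the operational TVS criteria:

```python
# Toy version of the pattern described above: strong outbound velocity right
# next to strong inbound velocity along one range ring. The threshold and the
# sample data are invented; they are not the operational TVS criteria.

def find_velocity_couplets(radial_velocities, threshold=40.0):
    """radial_velocities: Doppler velocities (m/s) at adjacent azimuths for one
    range; positive = outbound, negative = inbound. Returns index pairs whose
    gate-to-gate difference exceeds the threshold with opposite signs."""
    couplets = []
    for i in range(len(radial_velocities) - 1):
        a, b = radial_velocities[i], radial_velocities[i + 1]
        if a * b < 0 and abs(a - b) >= threshold:
            couplets.append((i, i + 1))
    return couplets

ring = [3, 5, -2, -31, 28, 6, 1, -4]     # m/s, fabricated example data
print(find_velocity_couplets(ring))      # [(3, 4)]: -31 m/s beside +28 m/s
```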

Although Doppler radar has been transformative, it is not perfect. It leaves meteorologists like Forsyth blind to the shape of a given particle, which can distinguish, say, a rainstorm from a dust storm. Ironically, the trajectory of his career path changed when a failed eye exam led him from U.S. Air Force pilot ambitions to a career in meteorology. Since then, Forsyth has focused on radar upgrades that give forecasters a better view of the atmosphere.

One critical upgrade is called dual polarization. This technology allows forecasters to differentiate more confidently between types of precipitation and amount. Although raindrops and hailstones may sometimes have the same horizontal width—and therefore appear the same in Doppler radar images—raindrops are flatter. Knowing the difference in particle shape reduces the guesswork required by a forecaster to identify features in the radar scans. That understanding helps to produce more accurate forecasts, so residents know they should prepare for hail and not rain, for example.
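
One standard quantity dual-polarization radars derive from particle shape (it is not named in the article) is differential reflectivity, ZDR = 10 log10(Zh/Zv), the ratio in decibels of horizontally to vertically polarized returns: oblate raindrops give clearly positive ZDR, while tumbling hail looks roughly spherical on average and sits near zero. The sketch below is a simplified illustration with an invented threshold:

```python
# Simplified illustration of differential reflectivity (ZDR): the ratio, in dB,
# of horizontally to vertically polarized returns. Flattened raindrops return
# more horizontal than vertical power (ZDR well above 0 dB); tumbling hail looks
# roughly spherical on average (ZDR near 0 dB). The 1 dB threshold is invented.
import math

def zdr_db(z_horizontal, z_vertical):
    """Differential reflectivity in dB from linear reflectivity factors."""
    return 10 * math.log10(z_horizontal / z_vertical)

def crude_particle_guess(zdr, rain_threshold_db=1.0):
    return "oblate drops (rain)" if zdr >= rain_threshold_db else "near-spherical (possible hail)"

print(crude_particle_guess(zdr_db(800, 400)))   # ~3.0 dB -> rain
print(crude_particle_guess(zdr_db(500, 480)))   # ~0.2 dB -> possible hail
```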

Information about particle size and shape also helps to distinguish airborne bits of debris lofted by tornadoes and severe thunderstorms, so meteorologists can identify an ongoing damaging storm. Particle data are especially important when trackers are dealing with a tornado that is invisible to the human eye. If a tornado is cloaked in heavy rainfall or is occurring at night, dual polarization can still detect the airborne debris.

The National Weather Service is integrating dual-polarization technology—which is also helpful for monitoring precipitation in hurricanes and blizzards—into all 160 Doppler radars across the nation, expecting to finish by mid-2013. At the same time, NOAA personnel are training forecasters to interpret the new images. The Weather Forecast Office in Newport/Morehead City, N.C., was the first to scan a tropical cyclone using such radar when Hurricane Irene made landfall in North Carolina in 2011. During that storm, dual-polarization radars proved more accurate in detecting precipitation rates, and therefore predicting flooding, than conventional Doppler radars farther north. The improved capabilities surely saved lives in the Carolinas; farther up the coast, without this technology, Hurricane Irene was deadlier despite early warnings, claiming nearly 30 lives.

NOAA research meteorologist Pam Heinselman believes another advanced radar technology used by the U.S. Navy to detect and track enemy ships and missiles has great potential to improve weather forecasting as well. Heinselman leads a team of electrical engineers, forecasters and social scientists at the National Weather Radar Testbed in Norman, Okla., focused on a technology called phased-array radar.

Current Doppler radars scan at one elevation angle at a time, with a parabolic dish that is mechanically turned. Once the dish completes a full 360-degree slice, it tilts up to sample another small sector of the atmosphere. After sampling from lowest to highest elevation, which during severe weather equates to 14 individual slices, the radar returns to the lowest angle and begins the process all over again. Scanning the entire atmosphere during severe weather takes Doppler radar four to six minutes.

In contrast, phased-array radar sends out multiple beams simultaneously, eliminating the need to tilt the antennas, decreasing the time between scans of storms to less than a minute. The improvement will allow meteorologists to “see” rapidly evolving changes in thunderstorm circulations and, ultimately, to more quickly detect the changes that cause tornadoes. Heinselman and her team have demonstrated that phased-array radar can also gather storm information not currently available, such as fast changes in wind fields, which can precede rapid changes in storm intensity.
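
Putting the figures from the two paragraphs above side by side gives a rough sense of the speed-up; this back-of-envelope sketch uses only the numbers quoted in the text:

```python
# Back-of-envelope comparison using only the figures quoted above: 14 elevation
# slices in 4-6 minutes for a conventional Doppler volume scan, versus under a
# minute for a phased-array scan of the same volume.
slices = 14
doppler_scan_s = (4 * 60, 6 * 60)
low, high = (t / slices for t in doppler_scan_s)
print(f"~{low:.0f}-{high:.0f} s of dish time per elevation slice")    # ~17-26 s
print(f"volume updates at least {doppler_scan_s[0] / 60:.0f}x faster "
      f"if a phased array finishes in under 60 s")                    # >= 4x
```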

Heinselman and others believe phased-array technology alone could extend tornado warnings  to more than 18 minutes, but much more research and development needs to be done. Ideally, the phased-array system would have four panels that emitted and received radio waves, to provide a 360-degree view of the atmosphere—one each for the north, south, east and west. Researchers in Norman have made only one-panel systems operable for weather surveillance, and it is likely to be at least a decade before phased arrays become the norm across the country.

[Word count: 1303]


[Remainder of the article]

Eyes in the Sky
Of course, even the best radars cannot see over mountains or out into the oceans, where hurricanes form. Forecasters rely on satellites for these situations and also rely on them to provide broader data that supplement the localized information from a given radar. NOAA’s weather satellites supply more than 90 percent of the data that go into daily and long-range forecasts, and they are critical in providing alerts of severe weather potential multiple days in advance. To improve the delivery of this essential environmental intelligence, NOAA will deploy a range of new technologies in the next five years.

Without more detailed satellite observations, extending the range of accurate weather forecasts—especially for such extreme events as hurricanes—would be severely restricted. Monitoring weather requires two types of satellites: geostationary and polar-orbiting. Geostationary satellites, which stay fixed in one spot at an altitude of about 22,000 miles, transmit near-continuous views of the earth’s surface. Using loops of pictures taken at 15-minute intervals, forecasters can monitor rapidly growing storms or detect changes in hurricanes (but not tornadoes).

Polar satellites, which orbit the earth from pole to pole at an altitude of approximately 515 miles, give closer, more detailed observations of the temperature and humidity of different layers of the atmosphere. A worldwide set of these low Earth orbit (LEO) satellites covers the entire globe every 12 hours.
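
The two altitudes quoted above pin down the satellites' orbital periods through Kepler's third law, which is why a geostationary satellite appears fixed over one spot while a roughly 515-mile polar orbiter laps the planet in about 100 minutes. A minimal sketch, using standard constants rather than figures from the article:

```python
# Sketch: Kepler's third law turns the two altitudes quoted above into orbital
# periods. Constants are standard values, not taken from the article.
import math

MU_EARTH = 3.986e14   # Earth's gravitational parameter, m^3/s^2
R_EARTH = 6.371e6     # mean Earth radius, m
MILE = 1609.34        # metres per statute mile

def period_hours(altitude_miles):
    a = R_EARTH + altitude_miles * MILE                     # circular-orbit radius
    return 2 * math.pi * math.sqrt(a**3 / MU_EARTH) / 3600

print(f"geostationary (~22,000 mi up): {period_hours(22_000):.1f} h")
# ~23.6 h with the article's rounded altitude; a true geostationary period is ~23.9 h
print(f"polar orbiter (~515 mi up): {period_hours(515) * 60:.0f} min")
# ~101 min per lap, which is how a small constellation covers the globe twice a day
```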

NOAA plans to launch a new series of LEO satellites this decade, as part of the Joint Polar Satellite System, with updated hardware, fitted with more sophisticated instruments. Their data will be used in computer models to improve weather forecasts, including hurricane tracks and intensities, severe thunderstorms and floods. The suite of advanced microwave and infrared sensors will relay much improved three-dimensional information on the atmosphere’s temperature, pressure and moisture, because rapid changes in temperature and moisture, combined with low pressure, signify a strong storm. Infrared sensors provide these measurements in cloud-free areas, and microwave sensors can “see through clouds” to the earth’s surface.

In April 2011, five days before a powerful storm system tore through six southern states, NOAA’s current polar-orbiting satellites provided data that, when fed into models, prompted the NOAA Storm Prediction Center to forecast “a potentially historic tornado outbreak.” The center elevated the risk to the highest level at midnight before the event. This level of outlook is reserved for the most extreme cases, with the least uncertainty, and is only used when the possibility for extremely explosive storms is detected. The new LEO satellites should allow such predictions as much as five to seven days before a storm.

Geostationary satellites will improve, too. Advanced instruments that will image the earth every five minutes in both visible and infrared wavelengths will be onboard the GOES-R series of satellites to be launched in 2015. They will increase observations from every 15 minutes to every five minutes or less, allowing scientists to monitor the rapid intensification of severe storms. The GOES-R satellites will also provide the world’s first space view of where lightning is occurring in the Western Hemisphere. The lightning mapper will help forecasters detect jumps in the frequency of in-cloud and cloud-to-ground lightning flashes. Research suggests that these jumps occur up to 20 minutes or more before hail, severe winds and even tornadoes.

Billions of Data Points
Each of the new radar technologies and satellites could improve warning times by several minutes, but incorporating the data derived from all these systems into forecasting computer models could provide even more time. Warnings for tornadoes, for example, could be issued up to an hour in advance. That is the kind of lead time that would have made a big difference in Joplin.

Forecasting models are based on physical laws governing atmospheric motion, chemical reactions and other relationships. They crunch millions of numbers that represent current weather and environmental conditions, such as temperature, pressure and wind, to predict the future state of the atmosphere. Imagine a grid that lies over the planet’s surface. Imagine another one a few hundred feet above that—and another and another, in layer after layer, all the way to the top of the stratosphere some 30 miles up. Millions of lines of code are needed to translate the billions of grid points under observation.

A typical forecast model today uses grids at the surface that run about five to 30 miles square. The smaller the squares, the higher the model’s resolution and the better it will be at detecting small-scale atmospheric changes that could spawn storms. Processing more data points, however, requires faster supercomputers.
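
A rough count shows why the grid spacing drives the computing bill; Earth's surface area (about 197 million square miles) is a standard figure assumed here, not one given in the article:

```python
# Rough cell counts at the grid spacings quoted above. Earth's surface area
# (~197 million square miles) is a standard figure assumed here, not from the
# article; vertical layers and the time-step effect are simplified.
EARTH_SURFACE_SQ_MI = 197e6

def horizontal_cells(grid_miles):
    return EARTH_SURFACE_SQ_MI / grid_miles ** 2

for spacing in (30, 5, 1):   # miles
    print(f"{spacing:>2}-mile grid: ~{horizontal_cells(spacing):,.0f} cells per layer")
# 30-mile: ~219,000    5-mile: ~7,880,000    1-mile: ~197,000,000
# Multiply by dozens of vertical layers to reach billions of grid points; finer
# grids also force shorter time steps, so compute cost grows faster than the
# cell count alone.
```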

Advances in modeling also require talented people who can integrate all these data and interpret them. Bill Lapenta, acting director of NOAA’s Environmental Modeling Center, heads that translation effort, which churns out numerical forecasts for 12, 24, 36, 48 and 72 hours ahead and beyond. Meteorologists compare NOAA’s models with others from international modeling centers to come up with the forecasts seen on the Web or the evening news.

NOAA supercomputers in Fairmont, W.Va., can process 73.1 trillion calculations a second. But Lapenta believes faster speeds are possible, which will allow the models to run at even smaller scales. For example, grids of just one mile square would enable models to simulate the small-scale conditions that catapult a routine thunderstorm or hurricane into a monster. NOAA plans to access some of the latest supercomputers at Oak Ridge National Laboratory to begin to build such models. Lapenta hopes such high-resolution models might begin to appear by 2020.

Lapenta foresees a day in the next decade when the increasing capabilities of new radars and satellites will be coupled with an evolving generation of finely detailed weather-prediction models running in real time on computers at speeds exceeding a quintillion computations a second. To make them a reality, scientists such as Lapenta are working on the mathematical, physical and biogeochemical relations that need to be encoded in a way that enables those relations to work together seamlessly.

If major NOAA investments in this “brainware” pay off, forecasters will not have to wait for a radar image to detect an actual storm before issuing a warning with 14 or 18 minutes of lead time. Instead they will be able to issue tornado, severe thunderstorm and flash-flood warnings based on highly accurate model forecasts produced well in advance, giving the public 30 to 60 minutes to take safety precautions.

Better Science, Better Decisions
With all these improvements, meteorologists such as Gary Conte in the New York City Weather Forecast Office will be able to predict more accurately, with longer lead times, weather hazards that can shut down the city, such as storms with snow and ice. Severe weather outlooks will extend beyond five days, hurricane forecasts beyond seven days, and the threat of spring floods will be known weeks in advance. This vision for a weather-ready nation is motivated by the desire to avoid the unmitigated disasters of 2011.

The goal is that by 2021 the rebuilt and thriving city of Joplin would receive a severe tornado warning more than an hour in advance. Families would have more time to gather and get to a safe room. Nursing homes and hospitals would be able to transfer residents and patients to shelter. Retailers would have time to get employees to safety and close up shop. Cell phones would thrum with multiple messages to seek shelter while local meteorologists broadcast similar warnings on television and radio. The clarion call of tornado sirens would reinforce the urgency of these warnings. As a result, even nature’s most powerful tornado would pass through town without any loss of life.

[Word count: 1266]

Posted on 2012-5-9 08:24:11
Grabbing the first reply!

2:36
I didn't know what "smart meter" meant; I looked it up after reading. Is it a smart electricity meter, like the card-reader meter at home?
Mainly about how the smart meter market is unstable: the outlook is good, but it keeps rising and falling.
2:12
China's meter market is huge, but China's home-grown meter industry is competitive and developing fast, so the market there shifts a lot.
Three points about the meters: systematization of meters, internationalization of meter companies, and raising customer satisfaction with the products.
2:59
Norway testing CCS technology.
Hopefully I haven't misunderstood: CCS is for extracting combustible gas from underground?
Norway is pouring manpower, resources and money into the project; the setup seems to use some fuel plant plus a refinery, and then the gases are processed; one speaker thinks the project's significance is as far-reaching as a Norwegian Moon landing, saving non-renewable fuel and contributing to human society.
1:43
Not many support CCS projects, but the supporters seem pretty heavyweight: the US, the EU and some association or other; the EU energy body backed it early on when funding was short; full operation around 2016, but there are some problems with the source and price of the feedstock; at the end it mentions two ways of capturing CO2.
2:15
Biodiversity leads to plant diversity.
Example research: when a plot has more animal species, it has more plants.
A scholar then explains the relationship between the two kinds of diversity: more animal species mean more food-chain relationships, which leads to more plant diversity; the conclusion is that any loss of biodiversity inevitably disrupts the ecosystem.
Finally it explains the causes of ecosystem damage and stresses again that ecosystems must not be damaged.
2:40
Says the damage to plants after biodiversity is lost is like a major drought, in other words very serious; then it explains that the main reason people destroy plant habitats is to take land for farming, so some species now live only on farmland; the scholars stress that if people want to keep living on this planet a while longer, this has to be handled properly.
A remedy is then proposed: improve soil fertility.
Finally an example, echoing the earlier paragraph; the point is the environment must not be destroyed, and once it is, it's all over.


7:50
I read in a hurry and only remember the framework:
The article mainly describes two methods of forecasting disasters such as tsunamis or storms:
P1: narrates a 2011 disaster that killed 300 people and caused such-and-such losses, leading into the rest and showing why forecasting technology matters.
P2: explains that forecasting technology buys evacuation time, protecting lives and property.
Then it starts describing the radar detection system.
P3: describes the first forecasting method, judging severity by the size of raindrops.
P4: describes the second method; I didn't retain the details, only that some dish has an angle; its advantage is that warnings come essentially in real time.

I'll be lazy about the remaining 1,200 words...
OP | Posted on 2012-5-9 08:39:33
I want to claim a floor too...
1'45
1'56
1'28
1'15
1'31
1'46
08'13
08'00
I have to say... I'd already skimmed these before, and sure enough I didn't want to read them again... I nearly fell asleep reading them...
发表于 2012-5-9 08:44:30 | 显示全部楼层
占座~``
Posted on 2012-5-9 09:34:50
Why am I still reading so slowly?
Speed
2'17
2'42
2'28
2'09
2'18
Free reading 2'45
Obstacle 9'23
Mainly about how new technology improves forecasting of extreme weather. It opens by saying new technology lengthens the warning time for extreme weather, giving people enough time to evacuate. Then it covers how large the casualties and losses from extreme weather are, and that the older technology could only tell that extreme weather was coming with too short a warning, so often only a few people could get away in time while most were still in danger; I think there was an example from somewhere in the US. Then meteorologists and scientists from other fields (I don't remember the details) hope to use radar, satellites, supercomputers and so on to lengthen warning times so more people can escape in time.
The second part is about a radar technique for increasing warning time: someone came up with a TVS technique that greatly improved the accuracy of forecasting things like tornadoes and hurricanes and lengthened warning times, giving people enough time to get away; then it explains the principle in detail... which I didn't retain... but the method turned out to have shortcomings; the article describes them, and then someone proposed an improvement and explained how it works, apparently involving satellites. Finally another person said the improved method also has shortcomings and needs more work. Then the meteorologists want to take a technology from the super radars the US Navy uses to find enemy ships to improve the accuracy of extreme-weather forecasting, and one of the scientists believes this technology alone could lengthen tornado warning times.
Posted on 2012-5-9 10:47:16
Saving a spot~ will read again~

----
1. 1'28
2. 2'05
3. 1'33
4. 1'29
5. 1'49
Obstacle: 7'11
1. describes a recent tornado and its forecast
2. points out that warning times need to be lengthened
3. talks about radars in weather forecasting: how they work and how their use can be improved

Science articles really are my weak spot~~~~~
I can't get into them and can't remember them..... TAT
Posted on 2012-5-9 14:06:30
Timer 1    1'45
Timer 2    2'55
Timer 3    2'00
Timer 4    1'28
Timer 5    2'05
Obstacle    11'34

New technology can push tornado and hurricane warnings earlier and save lives.
Current technology can only give about 20 minutes of warning.
Tragedies caused by the technology's limits, with an example; there have been many such tragedies, and so on~
Research on a certain radar has already begun in order to forecast hurricanes.
The importance of radar.
How radar works.
History: in year XX an expert first forecast a hurricane from the research data.
The radar's shortcomings.
Several upgrades to the radar.
A new radar introduced for comparison.
Posted on 2012-5-9 14:33:55
2:42s


2:51s


2:50s


2:00


2:30s

8:40s
Posted on 2012-5-9 14:44:03
2'07
2'19
2'32
2'39
2'23

10'34
Covered the impact of weather disasters such as hurricanes on people, and that current forecasting technology can only give about 20 minutes of advance warning.
New technology is being developed; one expert recommends greatly stepping up improvements to radar.
Covered the advantages of the new technology.
Popular-science articles are a bit hard to understand.
Posted on 2012-5-9 17:20:41
2'12 (first time doing this today, and the very first piece already puts me in my place... sob)
After reading I only remember the legendary smart meters (can someone tell me what these are?); up to various years they seemed to keep developing, but the number of users seems to go up and down.
2'51
China's smart meters are developing quickly compared with the past. The companies keep expanding their business.
But this program seems to have some problems, and then it talks about how foreign companies solve them.
Then it says such companies are getting bigger and bigger.
They are trying to win back the first group of customers lost over certain problems.
3'00 (struggled with this one again...)
Norway launched some kind of thing that stores a large amount of CO2 to absorb harmful gases from the air.
I didn't really understand the last paragraph; it seems to say that launching this thing is very expensive.
2'30
The European commission in this area really admires Norway's CCS project and says only CCS can slow the worsening of certain situations.
Norway has a small population yet is the world's eighth-largest gas exporter and also Europe's biggest gas exporter. The CCS plan is expected to run to 2016.
They will also test another plan; I remember the last one was about trapping the CO2 emitted from forests or something.
3'02
In recent years more and more plant species have disappeared.
Some experts ran an experiment: the more plant species in an area, the lower the chance of them disappearing... (seems like that, anyway)
There are quite a few causes of biodiversity loss in recent years; the biggest is climate change.

10'12
A typhoon that came after thunderstorms caused heavy losses across the US; there was also an example where a place predicted the typhoon was coming, but it arrived earlier than predicted.
Typhoon-forecasting technology is developing but still needs continual improvement.
Using radar to forecast typhoons is already in practice but also needs continual improvement, for example in timing.
It also specifically covered a kind of Doppler radar, describing its advantages but also pointing out its shortcomings in timing and scanning angle.
Then at the end another "array" thing was proposed, said to be even better on timing and more multiple in its angles, but some scientists likewise think it has big drawbacks...

(I read it all in a daze...)