Yesterday my computer overheated and crashed... so I couldn't get this posted in the evening. Sorry, everyone.
Also, today's obstacle passage isn't very hard, so it runs a little long. I originally wanted to find a paper, but couldn't get hold of a suitable one, so it's news again. O(∩_∩)O~
【计时1】 Smart Meter Use Will Grow, But More Slowly
Government incentives are drying up and will slow deployment of the new technology for electricity
Smart meters that give home and business owners more control over their electricity consumption patterns have been a huge success in North America, with penetration rates now approaching 35 percent and expected to reach nearly 70 percent by 2020.
But the future of smart meters will be volatile, according to a new analysis from Pike Research, with North American deployments peaking in 2011 before an expected drop-off over the next two years as government incentives that helped drive demand expire.
According to Pike's analysis, a record 12.4 million smart meters shipped in North America last year as U.S. and Canadian utilities and consumers embraced the technology, which allows for a direct exchange of supply-and-demand information at the household or business level.
But by 2013, such shipments are expected to drop to about 7.2 million as adoption rates plateau and incentives provided under the 2009 American Recovery and Reinvestment Act expire.
"The goal under the Recovery Act was to accelerate deployment, to put in as many [smart meters] as possible, and it was required to happen by the end of this year," said Bob Gohn, a Pike Research vice president and lead author of the analysis. "So you've got a whole bunch of big programs in states like California, Texas and elsewhere where deployment is winding up this year."
As of September 2011, the Edison Foundation's Institute for Electric Efficiency reported, U.S. utilities had installed roughly 27 million smart meters nationwide, with an additional 38 million expected to come online by 2015.
The projected slowdown in North American installations will be countered by robust growth in Europe and Asia, where government mandates and other factors are driving regional booms in smart meter adoptions, the report found.
China will lead, but are its meters 'smart'?
According to Pike, smart meters globally should see a compound annual growth rate of just under 5 percent for the 2010-20 period, with Europe eventually emerging as the No. 1 deployer of the technology in terms of percentage of households and businesses using the devices.
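For readers checking the report's arithmetic, a compound annual growth rate like Pike's "just under 5 percent" figure can be recovered from start and end values over a period. The shipment numbers below are illustrative, not from the report:

```python
# Sketch: computing a compound annual growth rate (CAGR).
# The unit figures in the example are illustrative assumptions.
def cagr(start_value, end_value, years):
    """Compound annual growth rate over the given number of years."""
    return (end_value / start_value) ** (1 / years) - 1

# e.g. annual global shipments growing from 100M to 160M units over a decade
rate = cagr(100e6, 160e6, 10)
print(f"{rate:.1%}")  # about 4.8% per year — "just under 5 percent"
```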
【字数:370】 【计时2】
On a country basis, China will be the largest deployer of smart meters by 2016, with more than 310 million units installed, according to Pike.
At the same time, the analysis predicts "very dynamic and even volatile regional market characteristics, with dramatic shifts over the forecast period and very different communications technologies and standards."
In fact, Pike researchers found that Chinese deployment of smart meters was happening much more quickly and more aggressively than expected just three years ago, when the firm did its last major market analysis on the technology.
The State Grid Corporation of China, for example, has significantly expanded its use of advanced remote meters, which it considers to be "smart metering." But others have questioned whether the Chinese technology meets the full definition of "smart meter" because it does not allow for full two-way communication and the ability to read and store data on a near-continuous basis.
First-generation problems linger
Even with the technology's limitations, Pike Research opted to include the Chinese remote meters in its market analysis, resulting in an additional 40 million to 50 million meters per year shipping from 2010 to 2014. "As a result, the view of the global smart meter market has changed dramatically," the report states.
In other regions, notably North America and Europe, problems with the first-generation technologies such as proprietary wireless radio frequency networks and power line communication have added pressure on utilities to adopt new open systems on Internet protocol networks that allow for fuller integration of smart metering tools.
The analysis also found that companies manufacturing and selling smart meters are becoming larger and more globalized as demand for the meters grows worldwide. As part of this process, the industry is seeing some conglomeration as well-established firms such as Toshiba purchase smaller startups that specialize in smart meter applications.
The report also notes that utilities, regulators and smart meter manufacturers are trying to address consumer concerns about the first generation of smart meters, which many said had problems with meter accuracy, data security and privacy, and health effects.
"The consumer response experience to date calls into question just how consumers will ultimately incorporate smart meters into their lives and adjust their current lifestyles to respond to the information they will receive about energy usage and, in many cases, energy prices," the analysis states.
Reprinted from Climatewire with permission from Environment & Energy Publishing, LLC.
【字数:387】 【计时3】 Norway Opens Major Facility to Test Carbon Capture
MONGSTAD, Norway (Reuters) - Norway on Monday launched the world's largest facility of its kind to develop carbon capture and storage (CCS), the so-far commercially unproven technology that would allow greenhouse gases from power plants to be buried safely underground.
A 5.8 billion Norwegian crown ($1.00 billion) government-funded centre will test two post-combustion carbon capture technologies that could be extended to industrial-scale use if shown to be cost-effective and safe.
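The exchange rate quoted at the end of the article ($1 = 5.7996 Norwegian crowns) lets the centre's price tag be sanity-checked: a $1.00 billion cost corresponds to roughly 5.8 billion crowns.

```python
# Sanity check of the centre's cost, using the exchange rate
# the article quotes at its end ($1 = 5.7996 Norwegian crowns).
CROWNS_PER_DOLLAR = 5.7996

def crowns_to_dollars(crowns):
    return crowns / CROWNS_PER_DOLLAR

print(crowns_to_dollars(5.8e9) / 1e9)  # ~1.00 (billion dollars)
```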
"Today we are opening the world's largest and most advanced laboratory to test carbon capture technologies... It is an important project for Norway and for the world," Prime Minister Jens Stoltenberg told the opening ceremony at the Technology Centre Mongstad (TCM), northwest of Bergen city.
The facility is unique in that it can test exhaust gases from two nearby sources - a 280-megawatt combined heat and power plant and the 10-million-tons-per-year Mongstad refinery. These produce flue gases with different carbon dioxide (CO2) contents - about 3.5 percent and 13 percent respectively.
Mongstad's emissions have a similar carbon dioxide content to those emitted by coal-fuelled power plants - which scientists say make a particularly serious contribution to climate change.
CCS offers the prospect of possibly continuing to burn fossil fuels while avoiding the worst effects by burying the emissions, for example in depleted natural gas fields under the sea, although it will be costly.
Stoltenberg said in a 2007 speech that carbon capture and storage would be Norway's equivalent of a Moon landing.
The centre has two carbon capture plants with a combined capacity to process 100,000 tons of carbon dioxide per year, making it the largest, Olav Folk Pedersen, the TCM's technology manager, told Reuters.
"CATCH AND RELEASE"
However, the capacity is only slightly more than a tenth of what Mongstad refinery emits per year. During the testing period all CO2 captured will be released into the atmosphere, thus having no impact on reducing the refinery's emissions costs.
【字数:368】 【计时4】
"Speaking in fishing terms, it's a 'catch and release' facility," Pedersen said earlier at the briefing.
So far, few countries have agreed to invest heavily in carbon capture. Those with projects include the United States, Australia, Britain and China.
European Union Energy Commissioner Guenther Oettinger applauded Norway's efforts, which take place at a time when other CCS demonstration projects in Europe have stalled due to lack of investment.
"It's an important milestone in Europe's undertaking to develop CCS technologies... It will provide a new momentum to the discussion of CCS use in Europe," he told the ceremony.
Oettinger has said that natural gas can have a long-term future in Europe only if CCS can be applied.
Norway, with a population of just 4.9 million, is the world's eighth-biggest exporter of oil and Western Europe's biggest exporter of natural gas.
Ola Borten Moe, Norway's petroleum and energy minister, told the ceremony a full-scale CCS facility at Mongstad might be possible later, with an investment decision due in 2016, but emissions permit prices had to rise to justify the spending.
Helge Lund, the chief executive of oil firm Statoil, a partner in TCM, added: "We cannot defend it economically, because CO2 prices are so low now. We need to find some sort of solution in the future."
The EU's carbon market was touted as the cheapest and most effective way to cut emissions by putting a price on carbon dioxide emissions and getting the private sector to factor that cost into long-term investment decisions.
However, carbon permit prices are trading below 7 euros, around a quarter of what many lawmakers say is the minimum level needed to get companies to invest in clean technology.
TCM will test two carbon capture technologies, one based on amine and the other on chilled ammonia solvent, to trap CO2 emitted from the plant and the refinery. ($1 = 5.7996 Norwegian crowns)
【字数:316】 【计时5】
How Biodiversity Keeps Earth Alive
Species loss lessens the total amount of biomass on a given parcel, suggesting that the degree of diversity directly impacts the amount of life the planet can support
In 1994 biologists seeded patches of grassland in Cedar Creek, Minn. Some plots got as many as 16 species of grasses and other plants—and some as few as one. In the first few years plots with eight or more species fared about as well as those with fewer species, suggesting that a complex mix of species—what is known as biodiversity—didn't affect the amount of a plot's leaf, blade, stem and root (or biomass, as scientists call it). But when measured over a longer span—more than a decade—those plots with the most species produced the greatest abundance of plant life.
"Different species differ in how, when and where they acquire water, nutrients and carbon, and maintain them in the ecosystem. Thus, when many species grow together, they have a wider set of traits that allow them to gain the resources needed," explains ecologist Peter Reich of the University of Minnesota, who led this research to be published in Science on May 4. This result suggests "no level of diversity loss can occur without adverse effects on ecosystem functioning." That is the reverse of what numerous studies had previously found, largely because those studies only looked at short-term outcomes.
The planet as a whole is on the cusp of what some researchers have termed the sixth mass extinction event in the planet's history: the wiping out of plants, animals and all other forms of life due to human activity. The global impact of such biodiversity loss is detailed in a meta-analysis led by biologist David Hooper of Western Washington University. His team examined 192 studies that looked at species richness and its effect on ecosystems. "The primary drivers of biodiversity loss are, in rough order of impact to date: habitat loss, overharvesting, invasive species, pollution and climate change," Hooper explains. Perhaps unsurprisingly, "biodiversity loss in the 21st century could rank among the major drivers of ecosystem change," Hooper and his colleagues wrote in Nature on May 3. (Scientific American is part of Nature Publishing Group.)
【字数:371】 【文章剩余部分】
Losing just 21 percent of the species in a given ecosystem can reduce the total amount of biomass in that ecosystem by as much as 10 percent—and that's likely to be a conservative estimate. And when more than 40 percent of an ecosystem's species disappear—whether plant, animal, insect, fungi or microbe—the effects can be as significant as those caused by a major drought. Nor does this analysis take into account how species extinction can both be driven by and act in concert with other changes—whether warmer average temperatures or nitrogen pollution. In the real world environmental and biological changes "are likely to be happening at the same time," Hooper admits. "This is a critical need for future research."
The major driver of human impacts on the rest of life on this planet—whether through clearing forests or dumping excess fertilizer on fields—is our need for food. Maintaining high biomass from farming ecosystems, which often emphasize monocultures (single species) while also preserving biodiversity—some species now appear only on farmland—has become a "key issue for sustainability," Hooper notes, "if we're going to grow food for nine billion people on the planet in the next 40 to 50 years."
Over the long term, maintaining soil fertility may require nurturing, creating and sparing plant and microbial diversity. After all, biodiversity itself appears to control the elemental cycles—carbon, nitrogen, water—that allow the planet to support life. Only by acting in conjunction with one another, for example, can a set of grassland plant species maintain healthy levels of nitrogen in both soil and leaf. "As soil fertility increases, this directly boosts biomass production," just as in agriculture, Reich notes. "When we reduce diversity in the landscape—think of a cornfield or a pine plantation or a suburban lawn—we are failing to capitalize on the valuable natural services that biodiversity provides."
At least one of those services is largely unaffected, however, according to Hooper's study—decomposition. Which means the bacteria and fungi will still happily break down whatever plants are left after this sixth extinction. But thousands of unique species have already been lost, most unknown even to science—a rate that could halve the total number of species on the planet by 2100, according to entomologist E. O. Wilson of Harvard University. Ghosts of species past haunt ecosystems worldwide, which have already lost not just one or another type of grass or roundworm but also some of their strength at sustaining life as a whole.
【字数:419】
【越障】
New Technology Allows Better Extreme Weather Forecasts
New technology that increases the warning time for tornadoes and hurricanes could potentially save hundreds of lives every year
After the deafening roar of a thunderstorm, an eerie silence descends. Then the blackened sky over Joplin, Mo., releases the tentacles of an enormous, screaming multiple-vortex tornado. Winds exceeding 200 miles per hour tear a devastating path three quarters of a mile wide for six miles through the town, destroying schools, a hospital, businesses and homes and claiming roughly 160 lives.
Nearly 20 minutes before the twister struck on the Sunday evening of May 22, 2011, government forecasters had issued a warning. A tornado watch had been in effect for hours and a severe weather outlook for days. The warnings had come sooner than they typically do, but apparently not soon enough. Although emergency officials were on high alert, many local residents were not.
The Joplin tornado was only one of many twister tragedies in the spring of 2011. A month earlier a record-breaking swarm of tornadoes devastated parts of the South, killing more than 300 people. April was the busiest month ever recorded, with about 750 tornadoes.
At 550 fatalities, 2011 was the fourth-deadliest tornado year in U.S. history. The stormy year was also costly. Fourteen extreme weather and climate events in 2011—from the Joplin tornado to hurricane flooding and blizzards—each caused more than $1 billion in damages. The intensity continued early in 2012; on March 2, twisters killed more than 40 people across 11 Midwestern and Southern states.
Tools for forecasting extreme weather have advanced in recent decades, but researchers and engineers at the National Oceanic and Atmospheric Administration are working to enhance radars, satellites and supercomputers to further lengthen warning times for tornadoes and thunderstorms and to better determine hurricane intensity and forecast floods. If the efforts succeed, a decade from now residents will get an hour’s warning about a severe tornado, for example, giving them plenty of time to absorb the news, gather family and take shelter.
The Power of Radar
Meteorologist Doug Forsyth is heading up efforts to improve radar, which plays a role in forecasting most weather. Forsyth, who is chief of the Radar Research and Development division at NOAA’s National Severe Storms Laboratory in Norman, Okla., is most concerned about improving warning times for tornadoes because deadly twisters form quickly and radar is the forecaster’s primary tool for sensing a nascent tornado.
Radar works by sending out radio waves that reflect off particles in the atmosphere, such as raindrops or ice or even insects and dust. By measuring the strength of the waves that return to the radar and how long the round-trip takes, forecasters can see the location and intensity of precipitation. The Doppler radar currently used by the National Weather Service also measures the frequency change in returning waves, which provides the direction and speed at which the precipitation is moving. This key information allows forecasters to see rotation occurring inside thunderstorms before tornadoes form.
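The two measurements described above can be sketched as formulas: range comes from the round-trip time of the pulse, and radial velocity from the Doppler frequency shift. The example frequency is typical of the S-band used by U.S. weather radars; the specific input values are illustrative:

```python
# Sketch of the two quantities Doppler radar measures: target range from the
# pulse's round-trip time, and radial velocity from the frequency shift of
# the returned wave. Example values are illustrative assumptions.
C = 299_792_458.0  # speed of light, m/s

def target_range(round_trip_s):
    """Distance to the reflecting particles: the pulse travels out and back,
    so the one-way distance is half the round-trip path."""
    return C * round_trip_s / 2

def radial_velocity(doppler_shift_hz, radar_freq_hz):
    """Speed toward (+) or away from (-) the radar, from the Doppler shift.
    The factor of 2 arises because the shift occurs on both legs of the trip."""
    wavelength = C / radar_freq_hz
    return doppler_shift_hz * wavelength / 2

# Assuming an S-band radar near 2.8 GHz:
print(target_range(400e-6) / 1000)   # ~60 km to the echo
print(radial_velocity(500, 2.8e9))   # ~27 m/s toward the radar
```

Rotation inside a thunderstorm shows up as strong inbound velocities adjacent to strong outbound ones, which is exactly the signature described in the next paragraph.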
In 1973 NOAA meteorologists Rodger Brown, Les Lemon and Don Burgess discovered this information’s predictive power as they analyzed data from a tornado that struck Union City, Okla. They noted very strong outbound velocities right next to very strong inbound velocities in the radar data. The visual appearance of those data was so extraordinary that the researchers initially did not know what it meant. After matching the data to the location of the tornado, however, they named the data “Tornadic Vortex Signature.” The TVS is now the most important and widely recognized metric indicating a high probability of either an ongoing tornado or the potential for one in the very near future. These data enabled longer lead times for tornado warnings, increasing from a national average of 3.5 minutes in 1987 to 14 minutes today.
Although Doppler radar has been transformative, it is not perfect. It leaves meteorologists like Forsyth blind to the shape of a given particle, which can distinguish, say, a rainstorm from a dust storm. Ironically, the trajectory of his career path changed when a failed eye exam led him from U.S. Air Force pilot ambitions to a career in meteorology. Since then, Forsyth has focused on radar upgrades that give forecasters a better view of the atmosphere.
One critical upgrade is called dual polarization. This technology allows forecasters to differentiate more confidently between types of precipitation and amount. Although raindrops and hailstones may sometimes have the same horizontal width—and therefore appear the same in Doppler radar images—raindrops are flatter. Knowing the difference in particle shape reduces the guesswork required by a forecaster to identify features in the radar scans. That understanding helps to produce more accurate forecasts, so residents know they should prepare for hail and not rain, for example.
Information about particle size and shape also helps to distinguish airborne bits of debris lofted by tornadoes and severe thunderstorms, so meteorologists can identify an ongoing damaging storm. Particle data are especially important when trackers are dealing with a tornado that is invisible to the human eye. If a tornado is cloaked in heavy rainfall or is occurring at night, dual polarization can still detect the airborne debris.
The National Weather Service is integrating dual-polarization technology—which is also helpful for monitoring precipitation in hurricanes and blizzards—into all 160 Doppler radars across the nation, expecting to finish by mid-2013. At the same time, NOAA personnel are training forecasters to interpret the new images. The Weather Forecast Office in Newport/Morehead City, N.C., was the first to scan a tropical cyclone using such radar when Hurricane Irene made landfall in North Carolina in 2011. During that storm, dual-polarization radars proved more accurate in detecting precipitation rates, and therefore predicting flooding, than conventional Doppler radars farther north. The improved capabilities surely saved lives in the Carolinas; farther up the coast, without this technology, Hurricane Irene was deadlier despite early warnings, claiming nearly 30 lives.
NOAA research meteorologist Pam Heinselman believes another advanced radar technology used by the U.S. Navy to detect and track enemy ships and missiles has great potential to improve weather forecasting as well. Heinselman leads a team of electrical engineers, forecasters and social scientists at the National Weather Radar Testbed in Norman, Okla., focused on a technology called phased-array radar.
Current Doppler radars scan at one elevation angle at a time, with a parabolic dish that is mechanically turned. Once the dish completes a full 360-degree slice, it tilts up to sample another small sector of the atmosphere. After sampling from lowest to highest elevation, which during severe weather equates to 14 individual slices, the radar returns to the lowest angle and begins the process all over again. Scanning the entire atmosphere during severe weather takes Doppler radar four to six minutes.
In contrast, phased-array radar sends out multiple beams simultaneously, eliminating the need to tilt the antennas, decreasing the time between scans of storms to less than a minute. The improvement will allow meteorologists to “see” rapidly evolving changes in thunderstorm circulations and, ultimately, to more quickly detect the changes that cause tornadoes. Heinselman and her team have demonstrated that phased-array radar can also gather storm information not currently available, such as fast changes in wind fields, which can precede rapid changes in storm intensity.
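The scan-time gap between the two technologies follows from simple arithmetic: a mechanically rotated dish must complete one full rotation per elevation slice. The seconds-per-sweep figure below is an illustrative assumption, not from the article:

```python
# Rough arithmetic behind the scan-time gap: a mechanically rotated dish
# sweeps 14 elevation slices one at a time. Sweep duration is an assumption.
def volume_scan_seconds(slices, seconds_per_sweep):
    """Total time for a dish to scan the full volume, one slice per rotation."""
    return slices * seconds_per_sweep

# 14 elevation slices at ~20 s per 360-degree sweep:
print(volume_scan_seconds(14, 20) / 60)  # ~4.7 min, within the article's 4-6
# A phased array steers beams electronically, revisiting the same volume in
# under a minute — several refreshes per single dish scan.
```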
Heinselman and others believe phased-array technology alone could extend tornado warnings to more than 18 minutes, but much more research and development needs to be done. Ideally, the phased-array system would have four panels that emitted and received radio waves, to provide a 360-degree view of the atmosphere—one each for the north, south, east and west. Researchers in Norman have made only one-panel systems operable for weather surveillance, and it is likely to be at least a decade before phased arrays become the norm across the country.
【字数:1303】
【文章剩余部分】
Eyes in the Sky
Of course, even the best radars cannot see over mountains or out into the oceans, where hurricanes form. Forecasters rely on satellites for these situations and also rely on them to provide broader data that supplement the localized information from a given radar. NOAA’s weather satellites supply more than 90 percent of the data that go into daily and long-range forecasts, and they are critical in providing alerts of severe weather potential multiple days in advance. To improve the delivery of this essential environmental intelligence, NOAA will deploy a range of new technologies in the next five years.
Without more detailed satellite observations, extending the range of accurate weather forecasts—especially for such extreme events as hurricanes—would be severely restricted. Monitoring weather requires two types of satellites: geostationary and polar-orbiting. Geostationary satellites, which stay fixed in one spot at an altitude of about 22,000 miles, transmit near-continuous views of the earth’s surface. Using loops of pictures taken at 15-minute intervals, forecasters can monitor rapidly growing storms or detect changes in hurricanes (but not tornadoes).
Polar satellites, which orbit the earth from pole to pole at an altitude of approximately 515 miles, give closer, more detailed observations of the temperature and humidity of different layers of the atmosphere. A worldwide set of these low Earth orbit (LEO) satellites covers the entire globe every 12 hours.
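The two altitudes quoted above are not arbitrary: Kepler's third law ties a circular orbit's altitude to its period, which is why a satellite at roughly 22,000 miles keeps pace with the rotating Earth while one at about 515 miles circles the globe around 14 times a day. A minimal sketch, using standard values for Earth's gravitational parameter and radius:

```python
import math

# Sketch: Kepler's third law, T = 2*pi*sqrt(a^3 / mu), relating the altitudes
# in the article to orbital periods. Constants are standard published values.
MU = 3.986004418e14    # Earth's gravitational parameter, m^3/s^2
R_EARTH = 6_371_000.0  # mean Earth radius, m
MILE = 1609.344        # metres per statute mile

def orbital_period_hours(altitude_miles):
    a = R_EARTH + altitude_miles * MILE  # semi-major axis of a circular orbit
    return 2 * math.pi * math.sqrt(a**3 / MU) / 3600

print(orbital_period_hours(22_236))  # geostationary: ~23.9 h (one sidereal day)
print(orbital_period_hours(515))     # polar LEO: ~1.7 h, so ~14 orbits per day
```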
NOAA plans to launch a new series of LEO satellites this decade, as part of the Joint Polar Satellite System, with updated hardware, fitted with more sophisticated instruments. Their data will be used in computer models to improve weather forecasts, including hurricane tracks and intensities, severe thunderstorms and floods. The suite of advanced microwave and infrared sensors will relay much improved three-dimensional information on the atmosphere’s temperature, pressure and moisture, because rapid changes in temperature and moisture, combined with low pressure, signify a strong storm. Infrared sensors provide these measurements in cloud-free areas, and microwave sensors can “see through clouds” to the earth’s surface.
In April 2011, five days before a powerful storm system tore through six southern states, NOAA’s current polar-orbiting satellites provided data that, when fed into models, prompted the NOAA Storm Prediction Center to forecast “a potentially historic tornado outbreak.” The center elevated the risk to the highest level at midnight before the event. This level of outlook is reserved for the most extreme cases, with the least uncertainty, and is only used when the possibility for extremely explosive storms is detected. The new LEO satellites should allow such predictions as much as five to seven days before a storm.
Geostationary satellites will improve, too. Advanced instruments that will image the earth every five minutes in both visible and infrared wavelengths will be onboard the GOES-R series of satellites to be launched in 2015. They will increase observations from every 15 minutes to every five minutes or less, allowing scientists to monitor the rapid intensification of severe storms. The GOES-R satellites will also provide the world’s first space view of where lightning is occurring in the Western Hemisphere. The lightning mapper will help forecasters detect jumps in the frequency of in-cloud and cloud-to-ground lightning flashes. Research suggests that these jumps occur up to 20 minutes or more before hail, severe winds and even tornadoes.
Billions of Data
Each of the new radar technologies and satellites could improve warning times by several minutes, but incorporating the data derived from all these systems into forecasting computer models could provide even more time. Warnings for tornadoes, for example, could be issued up to an hour in advance. That is the kind of lead time that would have made a big difference in Joplin.
Forecasting models are based on physical laws governing atmospheric motion, chemical reactions and other relationships. They crunch millions of numbers that represent current weather and environmental conditions, such as temperature, pressure and wind, to predict the future state of the atmosphere. Imagine a grid that lies over the planet’s surface. Imagine another one a few hundred feet above that—and another and another, in layer after layer, all the way to the top of the stratosphere some 30 miles up. Millions of lines of code are needed to translate the billions of grid points under observation.
A typical forecast model today uses grids at the surface that run about five to 30 miles square. The smaller the squares, the higher the model’s resolution and the better it will be at detecting small-scale atmospheric changes that could spawn storms. Processing more data points, however, requires faster supercomputers.
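The computational cost described above grows quickly: halving the horizontal grid spacing quadruples the number of cells per layer (and a finer grid also forces a shorter time step, which this sketch ignores). The layer count below is an illustrative assumption:

```python
# Back-of-envelope for why finer grids strain supercomputers: horizontal
# cell count grows with the square of the resolution. Layer count is assumed.
EARTH_SURFACE_SQ_MILES = 197e6  # approximate surface area of Earth
LAYERS = 60                      # assumed vertical levels through the stratosphere

def grid_points(spacing_miles):
    horizontal_cells = EARTH_SURFACE_SQ_MILES / spacing_miles**2
    return horizontal_cells * LAYERS

print(grid_points(5) / 1e9)  # ~0.5 billion points at 5-mile spacing
print(grid_points(1) / 1e9)  # ~12 billion at 1-mile spacing, a 25x jump
```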
Advances in modeling also require talented people who can integrate all these data and interpret them. Bill Lapenta, acting director of NOAA’s Environmental Modeling Center, heads that translation effort, which churns out numerical forecasts for 12, 24, 36, 48 and 72 hours ahead and beyond. Meteorologists compare NOAA’s models with others from international modeling centers to come up with the forecasts seen on the Web or the evening news.
NOAA supercomputers in Fairmont, W.Va., can process 73.1 trillion calculations a second. But Lapenta believes faster speeds are possible, which will allow the models to run at even smaller scales. For example, grids of just one mile square would enable models to simulate the small-scale conditions that catapult a routine thunderstorm or hurricane into a monster. NOAA plans to access some of the latest supercomputers at Oak Ridge National Laboratory to begin to build such models. Lapenta hopes such high-resolution models might begin to appear by 2020.
Lapenta foresees a day in the next decade when the increasing capabilities of new radars and satellites will be coupled with an evolving generation of finely detailed weather-prediction models running in real time on computers at speeds exceeding a quintillion computations a second. To make them a reality, scientists such as Lapenta are working on the mathematical, physical and biogeochemical relations that need to be encoded in a way that enables those relations to work together seamlessly.
If major NOAA investments in this “brainware” pay off, forecasters will not have to wait for a radar image to detect an actual storm before issuing a warning with 14 or 18 minutes of lead time. Instead they will be able to issue tornado, severe thunderstorm and flash-flood warnings based on highly accurate model forecasts produced well in advance, giving the public 30 to 60 minutes to take safety precautions.
Better Science, Better Decisions
With all these improvements, meteorologists such as Gary Conte in the New York City Weather Forecast Office will be able to predict more accurately, with longer lead times, weather hazards that can shut down the city, such as storms with snow and ice. Severe weather outlooks will extend beyond five days, hurricane forecasts beyond seven days, and the threat of spring floods will be known weeks in advance. This vision for a weather-ready nation is motivated by the desire to avoid the unmitigated disasters of 2011.
The goal is that by 2021 the rebuilt and thriving city of Joplin would receive a severe tornado warning more than an hour in advance. Families would have more time to gather and get to a safe room. Nursing homes and hospitals would be able to transfer residents and patients to shelter. Retailers would have time to get employees to safety and close up shop. Cell phones would thrum with multiple messages to seek shelter while local meteorologists broadcast similar warnings on television and radio. The clarion call of tornado sirens would reinforce the urgency of these warnings. As a result, even nature’s most powerful tornado would pass through town without any loss of life.
【字数:1266】