
Saturday, January 31, 2015

Average temperature in Finland has risen by more than two degrees

Over the past 166 years, the average temperature in Finland has risen by more than two degrees. During the observation period, the average increase was 0.14 degrees per decade, which is nearly twice as much as the global average.

According to a recent study by the University of Eastern Finland and the Finnish Meteorological Institute, the rise in temperature has been especially fast over the past 40 years, with the temperature rising by more than 0.2 degrees per decade.

"The biggest temperature rise has coincided with November, December and January. Temperatures have also risen faster than the annual average in the spring months, i.e., March, April and May. In the summer months, however, the temperature rise has not been as significant," says Professor Ari Laaksonen of the University of Eastern Finland and the Finnish Meteorological Institute.

As a result of the rising temperature, lakes in Finland freeze over later than before, and their ice cover melts away earlier in the spring. Although the temperature rise during the actual growing season has been moderate, Finnish trees have been observed to blossom earlier than before.

Temperature has risen in leaps

The annual average temperature has risen in two phases, the first from the beginning of the observation period to the late 1930s, and the second from the late 1960s to the present. Since the 1960s, the temperature has risen faster than ever before, at a rate varying between 0.2 and 0.4 degrees per decade. Between the late 1930s and the late 1960s, the temperature remained nearly steady. "The pause in the temperature rise can be explained by several factors, including long-term changes in solar activity and the post-World War II growth of human-derived aerosols in the atmosphere. When looking at recent years' observations from Finland, it seems that the temperature rise is not slowing down," University of Eastern Finland researcher Santtu Mikkonen explains.

The temperature time series was created by averaging the data produced by weather stations across Finland. Because the Finnish weather station network did not cover the whole country in the early years, data from measurement stations in Finland's neighbouring countries were also used.

Finland is located between the Atlantic Ocean and continental Eurasia, which causes great variability in the country's weather. In the average-temperature time series, this variability appears as strong noise, which makes it challenging to detect statistically significant trends. The temperature time series for Finland was therefore analysed using a dynamic regression model. The method divides the time series into components representing mean changes (trends), periodic variation, inter-dependence between observations, and noise, and it can accommodate the seasonal changes typical of Nordic conditions as well as large year-to-year variation.
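The paper's dynamic regression model is beyond a short example, but the idea of separating trend, seasonal cycle and noise can be sketched with an off-the-shelf decomposition. The code below is a minimal stand-in, assuming a hypothetical CSV of monthly mean temperatures; it is not the authors' method.

```python
# Minimal stand-in for the decomposition described above: split a monthly
# temperature series into trend, seasonal cycle and residual noise, then
# estimate a decadal warming rate from the trend. The CSV file and its
# columns are hypothetical; the paper itself used a dynamic regression model.
import numpy as np
import pandas as pd
from statsmodels.tsa.seasonal import STL

series = (
    pd.read_csv("finland_monthly_temps.csv", parse_dates=["date"])
      .set_index("date")["temp_c"]
      .asfreq("MS")  # assumes a complete monthly record
)

# STL separates trend, a repeating 12-month seasonal cycle, and noise.
result = STL(series, period=12, robust=True).fit()

# A linear fit to the extracted trend component gives an average rate of change.
years = result.trend.index.year + (result.trend.index.month - 1) / 12.0
slope_per_year = np.polyfit(years, result.trend.to_numpy(), 1)[0]
print(f"average trend: {10 * slope_per_year:+.2f} degrees C per decade")
```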

Journal Reference:

S. Mikkonen, M. Laine, H. M. Mäkelä, H. Gregow, H. Tuomenvirta, M. Lahtinen, A. Laaksonen. Trends in the average temperature in Finland, 1847–2013. Stochastic Environmental Research and Risk Assessment, 2014; DOI: 10.1007/s00477-014-0992-2


Friday, January 30, 2015

Human influence important factor in possible global and UK temperature records

Early figures from the University of East Anglia (UEA) show that 2014 is on course to be one of the warmest years on record, if not the warmest, both globally and for the UK.

Recent research from the Met Office suggests that human influence on the climate has made breaking the existing global and UK temperature records much more likely.

Early figures suggest global record possible

The global mean temperature for January to October, based on the HadCRUT4 dataset (compiled by the Met Office and UEA's Climatic Research Unit), is 0.57 °C (±0.1 °C) above the long-term (1961-1990) average. This is consistent with the statement released today by the World Meteorological Organization (WMO).

With two months of data still to add, the full-year figure could change, but at present 2014 is just ahead of the current record of 0.56 °C set in 2010 in the global series, which dates back to 1850. The final value for this year will be very close to the central estimate of 0.57 °C from the Met Office global temperature forecast for 2014, issued late last year.

Colin Morice, a climate monitoring scientist at the Met Office, said: "Record or near-record years are interesting, but the ranking of individual years should be treated with some caution because the uncertainties in the data are larger than the differences between the top ranked years. We can say this year will add to the set of near-record temperatures we have seen over the last decade."
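The arithmetic behind these figures is straightforward. As a minimal sketch, with a hypothetical input file and column names rather than the real HadCRUT4 format, annual anomalies against the 1961-1990 baseline can be computed and ranked like this:

```python
# Sketch of the anomaly-and-ranking arithmetic behind the figures above:
# average each year's monthly means, subtract the 1961-1990 baseline, and
# sort. The CSV and its columns are hypothetical placeholders.
import pandas as pd

monthly = pd.read_csv("global_monthly_means.csv")  # columns: year, month, temp_c

annual = monthly.groupby("year")["temp_c"].mean()
baseline = annual.loc[1961:1990].mean()
anomaly = annual - baseline

# Rank warmest-first -- remembering that differences smaller than the
# ~0.1 degree C uncertainty make neighbouring ranks interchangeable.
print(anomaly.sort_values(ascending=False).head())
```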

UK's run of warm months makes record likely

The UK's mean temperature from 1 January to 25 November is 1.6 °C above the long-term (1961-1990) average, which makes this year currently the warmest in the UK series dating back to 1910. This would beat the record of 1.4 °C set in 2006, but a cold December could change the final ranking.

This year is also set to be one of the warmest on record in the Central England Temperature (CET) series, which goes back to 1659 and is the longest instrumental temperature series in the world.

Interestingly, while every month this year except August has seen above-average temperatures in the UK, no single month has set a temperature record. Instead the year has been consistently warm.

Phil Jones, research director of UEA's Climatic Research Unit, said: "Spatially, 2014 has so far been warmer than the 1961-1990 average almost everywhere, the main exception being central and eastern parts of North America. For Europe, many countries in northern and eastern parts will likely have had near-record warm years."

CRU climate scientist Prof Tim Osborn said: "The last decade has been the warmest period in our 165-year-long record, yet during this decade there has been no clear warming at the Earth's surface. Coming at the end of this warm decade, record warmth in 2014 would be of significant interest but one year isn't enough to end the warming pause."

Human influence a likely factor

One warm year does not necessarily say anything about long-term climate change -- these trends need to be looked at over longer timescales of several decades.

However, new research techniques developed by the Met Office allow for rapid assessment of how human influence might have affected the chances of breaking temperature records.

This technique, known as an attribution study, uses climate models and observations to see how likely an event would be in the real world and in a world without human greenhouse gas emissions -- enabling assessment of how human influence has altered the chances of an event.
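In outline, the calculation compares the probability of exceeding a threshold in two model ensembles, one with and one without human forcing. The sketch below uses synthetic numbers purely to show the arithmetic; nothing here is the Met Office's actual model output.

```python
# Illustrative core of an event-attribution study: estimate how much more
# likely exceeding a record threshold is with human forcing than without,
# using two (here synthetic) ensembles of annual temperature anomalies.
import numpy as np

rng = np.random.default_rng(42)
with_human = rng.normal(loc=0.5, scale=0.2, size=100_000)  # all forcings
natural = rng.normal(loc=0.0, scale=0.2, size=100_000)     # natural only

threshold = 0.56  # degrees C; e.g., the standing record anomaly

p_with = (with_human > threshold).mean()
p_natural = (natural > threshold).mean()

print(f"risk ratio: {p_with / p_natural:.0f}x more likely with human influence")
```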

Peter Stott, Head of Climate Attribution at the Met Office, said: "Our research shows current global average temperatures are highly unlikely in a world without human influence on the climate.

"Human influence has also made breaking the current UK temperature record about ten times more likely."

A wet year for the UK, but not a record

This is also set to be a notably wet year for the UK, with 1162 mm of rain between 1 January and 25 November.

If we saw average rainfall for the rest of the year, 2014 would rank as the 4th wettest year in the UK records dating back to 1910. It would also be 11th in the longer-running England and Wales precipitation series, which dates back to 1766.

However, if we do have a very wet December, this year could still break the UK record of 1337 mm set in 2000.
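The ranking projection itself is simple arithmetic: add a climatological finish to the rain observed so far, then compare against past annual totals. In the sketch below, only the 1162 mm and 1337 mm figures come from the article; everything else is a hypothetical placeholder.

```python
# Sketch of the projection described above: observed rain to date plus an
# average finish to the year, ranked against past annual totals. Only the
# 1162 mm and 1337 mm figures come from the article; the rest are
# hypothetical placeholders.
observed_to_25_nov_mm = 1162.0
avg_26nov_to_31dec_mm = 120.0            # hypothetical climatological remainder
projected_2014 = observed_to_25_nov_mm + avg_26nov_to_31dec_mm

past_annual_totals_mm = [1337.0, 1295.0, 1290.0]  # illustrative wettest years
rank = 1 + sum(t > projected_2014 for t in past_annual_totals_mm)
print(f"projected 2014 total: {projected_2014:.0f} mm, rank {rank}")
```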

Due to the large amount of variability in UK rainfall, it's not yet possible to say whether human influence directly impacted this year's total.



Thursday, January 29, 2015

In the mood to trade? Weather may influence institutional investors' stock decisions

Weather changes may affect how institutional investors decide on stock plays, according to a new study by a team of finance researchers. Their findings suggest sunny skies put professional investors more in a mood to buy, while cloudy conditions tend to discourage stock purchases.

The researchers conclude that cloudier days increase the perception that individual stocks and the Dow Jones Industrials are overpriced, increasing the inclination for institutions to sell.

The research paper, "Weather-Induced Mood, Institutional Investors, and Stock Returns," has been published in the January 2015 issue of The Review of Financial Studies. The research was a collaboration between Case Western Reserve University's Dasol Kim and three other finance professors: William Goetzmann of Yale University, Alok Kumar of the University of Miami and Qin Wang of the University of Michigan-Dearborn.

Institutional investors represent large organizations, such as banks, mutual funds, labor union funds and finance or insurance companies that make substantial investments in stocks. Kim said the results of the study are surprising, given that professional investors are well regarded for their financial sophistication.

"We focus on institutional investors because of the important role they have in how stock prices are formed in the markets," said Kim, assistant professor of banking and finance at Case Western Reserve's Weatherhead School of Management. "Other studies have already shown that ordinary retail investors are susceptible to psychological biases in their investment decisions. Trying to evaluate similar questions for institutional investors is challenging, because relevant data is hard to come by."

Building on previous findings from psychological studies about the effect of sunshine on mood, the researchers wanted to learn how mood affects professional investor opinions on their stock market investments.

By linking responses to a survey of investors from the Yale Investor Behavior Project of Nobel Prize-winning economist Robert Shiller, institutional stock trade data, and historical weather data from the National Oceanic and Atmospheric Administration, the researchers found that seasonably sunnier weather leads to optimistic survey responses and a greater willingness to buy.

The research accounts for differences in weather across regions of the country and across seasons. The researchers show that these documented mood effects also influence stock prices, and that the observed impact does not persist for long.
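The style of analysis described can be sketched in a few lines: merge trades with weather, deseasonalize cloud cover by station and month, and regress a buy indicator on the anomaly with region and season controls. Everything below, including the dataset, column names, and specification, is a hypothetical illustration, not the authors' actual code or data.

```python
# Hypothetical sketch of a weather-mood regression: buy/sell propensity
# against locally deseasonalized cloud cover, with region and season
# fixed effects. Dataset and columns are placeholders.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("trades_with_weather.csv")  # one row per investor-day

# Deseasonalize: cloudiness relative to that station's norm for the month.
df["cloud_anom"] = df["cloud_cover"] - df.groupby(
    ["station_id", "month"]
)["cloud_cover"].transform("mean")

# Linear probability model with region and season fixed effects; standard
# errors clustered by weather station.
model = smf.ols(
    "buy_indicator ~ cloud_anom + C(region) + C(season)", data=df
).fit(cov_type="cluster", cov_kwds={"groups": df["station_id"]})
print(model.summary().tables[1])
```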

A summary of the research was also recently featured at The Harvard Law School Forum on Corporate Governance and Financial Regulation.

Journal Reference:

W. N. Goetzmann, D. Kim, A. Kumar, Q. Wang. Weather-Induced Mood, Institutional Investors, and Stock Returns. Review of Financial Studies, 2014; 28 (1): 73 DOI: 10.1093/rfs/hhu063


Wednesday, January 28, 2015

Even in restored forests, extreme weather strongly influences wildfire's impacts

The 2013 Rim Fire, the largest wildland fire ever recorded in the Sierra Nevada region, is still fresh in the minds of Californians, as is the urgent need to bring forests back to a more resilient condition. Land managers are using fire as a tool to mimic past fire conditions, restore fire-dependent forests, and reduce fuels in an effort to lessen the potential for large, high-intensity fires, like the Rim Fire. A study led by the U.S. Forest Service's Pacific Southwest Research Station (PSW) and recently published in the journal Forest Ecology and Management examined how the Rim Fire burned through forests with restored fire regimes in Yosemite National Park to determine whether they were as resistant to high-severity fire as many scientists and land managers expected.

Since the late 1960s, land managers in Yosemite National Park have used prescribed fire and allowed lower-intensity wildland fires to burn in an attempt to bring back historical fire regimes after decades of fire suppression. For this study, researchers seized a unique opportunity to use data on forest structure and fuels collected in 2009 and 2010 in Yosemite's old-growth, mixed-conifer forests that had previously burned at low to moderate severity. Using post-Rim Fire data and imagery, the researchers found that areas that burned on days when the Rim Fire was dominated by a large pyro-convective plume -- a powerful column of smoke, gases, ash, and other debris -- burned at moderate to high severity regardless of the number of prior fires, topography, or forest conditions.

"The specific conditions leading to large plume formation are unknown, but what is clear from many observations is that these plumes are associated with extreme burning conditions," says Jamie Lydersen, PSW biological science technician and the study's lead author. "Plumes often form when atmospheric conditions are unstable, and result in erratic fire behavior driven by its own local effect on surface wind and temperatures that override the influence of more generalized climate factors measured at nearby weather stations."

When the extreme conditions caused by these plumes subsided during the Rim Fire, other factors influenced burn severity. "There was a strong influence of elapsed time since the last burn, where forests that experienced fire within the last 14 years burned mainly at low severity in the Rim Fire. Lower elevation areas and those with greater shrub cover tended to burn at higher severity," says Lydersen.

When driven by extreme weather, which often coincides with wildfires that escape initial containment efforts, fires can severely burn large swaths of forest regardless of ownership and fire history. These fires may only be controlled if more forests across the landscape have been managed for fuel reduction to allow early stage suppression before weather- and fuels-driven fire intensity makes containment impossible. Coordination of fire management activities by land management agencies across jurisdictions could favor burning under more moderate weather conditions when wildfires start and reduce the occurrences of harmful, high-intensity fires.



Tuesday, January 27, 2015

New insights into predicting future droughts in California: Natural cycles, sea surface temperatures found to be main drivers in ongoing event

According to a new NOAA-sponsored study, natural oceanic and atmospheric patterns are the primary drivers behind California's ongoing drought. A high-pressure ridge off the West Coast, typical of historic droughts, prevailed for three winters, blocking important wet-season storms; ocean surface temperature patterns made such a ridge much more likely. Typically, the winter season provides California with the majority of its annual snow and rainfall, replenishing water supplies for communities and ecosystems.

Further studies on these oceanic conditions and their effect on California's climate may lead to advances in drought early warning that can help water managers and major industries better prepare for lengthy dry spells in the future.

"It's important to note that California's drought, while extreme, is not an uncommon occurrence for the state. In fact, multi-year droughts appear regularly in the state's climate record, and it's a safe bet that a similar event will happen again. Thus, preparedness is key," said Richard Seager, report lead author and professor with Columbia University's Lamont Doherty Earth Observatory.

This report builds on earlier studies, published in September in the Bulletin of the American Meteorological Society, which found no conclusive evidence linking human-caused climate change and the California drought. The current study notes that the atmospheric ridge over the North Pacific, which has resulted in decreased rain and snowfall since 2011, is almost opposite to what models project to result from human-induced climate change. The report illustrates that mid-winter precipitation is actually projected to increase due to human-induced climate change over most of the state, though warming temperatures may sap much of those benefits for water resources overall, while only spring precipitation is projected to decrease.

The report makes clear that to provide improved drought forecasts for California, scientists will need to fully understand the links between sea surface temperature variations and winter precipitation over the state, discover how these ocean variations are generated, and better characterize their predictability.

This report contributes to a growing field of science -- climate attribution -- where teams of scientists aim to identify the sources of observed climate and weather patterns.

"There is immense value in examining the causes of this drought from multiple scientific viewpoints," said Marty Hoerling, report co-author and researcher with NOAA's Earth System Research Laboratory. "It's paramount that we use our collective ability to provide communities and businesses with the environmental intelligence they need to make decisions concerning water resources, which are becoming increasingly strained."

To view the report, visit: http://cpo.noaa.gov/MAPP/californiadroughtreport



Monday, January 26, 2015

NASA's Fermi Mission brings deeper focus to thunderstorm gamma rays

Each day, thunderstorms around the world produce about a thousand quick bursts of gamma rays, some of the highest-energy light naturally found on Earth. By merging records of events seen by NASA's Fermi Gamma-ray Space Telescope with data from ground-based radar and lightning detectors, scientists have completed the most detailed analysis to date of the types of thunderstorms involved.

"Remarkably, we have found that any thunderstorm can produce gamma rays, even those that appear to be so weak a meteorologist wouldn't look twice at them," said Themis Chronis, who led the research at the University of Alabama in Huntsville (UAH).

The outbursts, called terrestrial gamma-ray flashes (TGFs), were discovered in 1992 by NASA's Compton Gamma-Ray Observatory, which operated until 2000. TGFs occur unpredictably and fleetingly, with durations less than a thousandth of a second, and remain poorly understood.

In late 2012, Fermi scientists employed new techniques that effectively upgraded the satellite's Gamma-ray Burst Monitor (GBM), making it 10 times more sensitive to TGFs and allowing it to record weak events that were overlooked before.

"As a result of our enhanced discovery rate, we were able to show that most TGFs also generate strong bursts of radio waves like those produced by lightning," said Michael Briggs, assistant director of the Center for Space Plasma and Aeronomic Research at UAH and a member of the GBM team.

Previously, TGF positions could be roughly estimated based on Fermi's location at the time of the event. The GBM can detect flashes within about 500 miles (800 kilometers), but this is too imprecise to definitively associate a TGF with a specific storm.

Ground-based lightning networks use radio data to pin down strike locations. The discovery of similar signals from TGFs meant that scientists could use the networks to determine which storms produce gamma-ray flashes, opening the door to a deeper understanding of the meteorology powering these extreme events.

Chronis, Briggs and their colleagues sifted through 2,279 TGFs detected by Fermi's GBM to derive a sample of nearly 900 events accurately located by the Total Lightning Network operated by Earth Networks in Germantown, Maryland, and the World Wide Lightning Location Network, a research collaboration run by the University of Washington in Seattle. These systems can pinpoint the location of lightning discharges -- and the corresponding signals from TGFs -- to within 6 miles (10 km) anywhere on the globe.
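A minimal sketch of that matching step, using hypothetical event records: a network stroke is paired with a TGF when the times coincide within a few milliseconds and the stroke falls inside GBM's roughly 800-kilometer detection footprint, with the network supplying the precise position.

```python
# Sketch of pairing a GBM gamma-ray flash with a lightning-network stroke:
# times must coincide within a small window and the stroke must fall inside
# the ~800 km footprint around the spacecraft's ground track (the networks
# supply the precise location). Event records here are hypothetical.
from math import asin, cos, radians, sin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = (sin((lat2 - lat1) / 2) ** 2
         + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371.0 * asin(sqrt(a))

def is_match(tgf, stroke, max_dt_s=0.005, max_km=800.0):
    dt_ok = abs(tgf["t"] - stroke["t"]) <= max_dt_s
    dist_ok = haversine_km(tgf["nadir_lat"], tgf["nadir_lon"],
                           stroke["lat"], stroke["lon"]) <= max_km
    return dt_ok and dist_ok

tgf = {"t": 1001.2003, "nadir_lat": 28.1, "nadir_lon": -80.6}
stroke = {"t": 1001.2008, "lat": 28.9, "lon": -80.1}
print(is_match(tgf, stroke))  # True: 0.5 ms apart, well inside the footprint
```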

From this group, the team identified 24 TGFs that occurred within areas covered by Next Generation Weather Radar (NEXRAD) sites in Florida, Louisiana, Texas, Puerto Rico and Guam. For eight of these storms, the researchers obtained additional information about atmospheric conditions through sensor data collected by the Department of Atmospheric Science at the University of Wyoming in Laramie.

"All told, this study is our best look yet at TGF-producing storms, and it shows convincingly that storm intensity is not the key," said Chronis, who will present the findings Wed., Dec. 17, in an invited talk at the American Geophysical Union meeting in San Francisco. A paper describing the research has been submitted to the Bulletin of the American Meteorological Society.

Scientists suspect that TGFs arise from strong electric fields near the tops of thunderstorms. Updrafts and downdrafts within the storms force rain, snow and ice to collide and acquire electrical charge. Usually, positive charge accumulates in the upper part of the storm and negative charge accumulates below. When the storm's electrical field becomes so strong it breaks down the insulating properties of air, a lightning discharge occurs.

Under the right conditions, the upper part of an intracloud lightning bolt disrupts the storm's electric field in such a way that an avalanche of electrons surges upward at high speed. When these fast-moving electrons are deflected by air molecules, they emit gamma rays and create a TGF.

About 75 percent of lightning stays within the storm, and about 2,000 of these intracloud discharges occur for each TGF Fermi detects.

The new study confirms previous findings indicating that TGFs tend to occur near the highest parts of a thunderstorm, between about 7 and 9 miles (11 to 14 kilometers) high. "We suspect this isn't the full story," explained Briggs. "Lightning often occurs at lower altitudes and TGFs probably do too, but traveling the greater depth of air weakens the gamma rays so much the GBM can't detect them."

Based on current Fermi statistics, scientists estimate that some 1,100 TGFs occur each day, but the number may be much higher if low-altitude flashes are being missed.
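As a rough illustration of how such an estimate scales up, one can divide a detection rate by the fraction of Earth within the instrument's footprint. The sketch below uses a made-up detection rate and ignores the fact that thunderstorms cluster in the tropics, so it shows only the shape of the calculation, not the mission's actual statistics.

```python
# Back-of-the-envelope scaling from detections to a global rate: divide the
# detection rate by the fraction of Earth inside the instrument's footprint.
# The detection rate here is a made-up placeholder, and real estimates also
# weight by where thunderstorms actually occur.
import math

detections_per_day = 2.5                      # hypothetical GBM TGF count
footprint_km2 = math.pi * 800.0 ** 2          # ~800 km radius (from the article)
earth_surface_km2 = 5.1e8

fraction_seen = footprint_km2 / earth_surface_km2
print(f"~{detections_per_day / fraction_seen:.0f} TGFs per day worldwide")
```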

While it is too early to draw conclusions, Chronis notes, there are a few hints that gamma-ray flashes may prefer storm areas where updrafts have weakened and the aging storm has become less organized. "Part of our ongoing research is to track these storms with NEXRAD radar to determine if we can relate TGFs to the thunderstorm life cycle," he said.

Video: https://www.youtube.com/watch?v=JgK4Ds_Sj6Q#t=66



Sunday, January 25, 2015

NASA satellite set to get the dirt on Earth's soil moisture

A new NASA satellite that will peer into the topmost layer of Earth's soils to measure the hidden waters that influence our weather and climate is in final preparations for a Jan. 29 dawn launch from California.

The Soil Moisture Active Passive (SMAP) mission will take the pulse of a key measure of our water planet: how freshwater cycles over Earth's land surfaces in the form of soil moisture. The mission will produce the most accurate, highest-resolution global maps ever obtained from space of the moisture present in the top 2 inches (5 centimeters) of Earth's soils. It also will detect and map whether the ground is frozen or thawed. This data will be used to enhance scientists' understanding of the processes that link Earth's water, energy and carbon cycles.

"With data from SMAP, scientists and decision makers around the world will be better equipped to understand how Earth works as a system and how soil moisture impacts a myriad of human activities, from floods and drought to weather and crop yield forecasts," said Christine Bonniksen, SMAP program executive with the Science Mission Directorate's Earth Science Division at NASA Headquarters in Washington. "SMAP's global soil moisture measurements will provide a new capability to improve our understanding of Earth's climate."

Globally, the volume of soil moisture varies from 3 to 5 percent in desert and arid regions to 40 to 50 percent in saturated soils. In general, the amount depends on such factors as precipitation patterns, topography, vegetation cover and soil composition. There are not enough sensors in the ground to map the variability in global soil moisture at the level of detail needed by scientists and decision makers. From space, SMAP will produce global maps with 6-mile (10-kilometer) resolution every two to three days.

Researchers want to measure soil moisture and its freeze/thaw state better for numerous reasons. Plants and crops draw water from the soil through their roots to grow. If soil moisture is inadequate, plants fail to grow, which over time can lead to reduced crop yields. Also, energy from the sun evaporates moisture in the soil, thereby cooling surface temperatures and also increasing moisture in the atmosphere, allowing clouds and precipitation to form more readily. In this way, soil moisture has a significant effect on both short-term regional weather and longer-term global climate.

In summer, plants in Earth's northern boreal regions -- the forests found in Earth's high northern latitudes -- take in carbon dioxide from the air and use it to grow, but lie dormant during the winter freeze period. All other factors being equal, the longer the growing season, the more carbon plants take in and the more effective forests are in removing carbon dioxide from the air. Since the start of the growing season is marked by the thawing and refreezing of water in soils, mapping the freeze/thaw state of soils with SMAP will help scientists more accurately account for how much carbon plants are removing from the atmosphere each year. This information will lead to better estimates of the carbon budget in the atmosphere and, hence, better assessments of future global warming.

SMAP data will enhance our confidence in projections of how Earth's water cycle will respond to climate change.

"Assessing future changes in regional water availability is perhaps one of the greatest environmental challenges facing the world today," said Dara Entekhabi, SMAP science team leader at the Massachusetts Institute of Technology in Cambridge. "Today's computer models disagree on how the water cycle -- precipitation, clouds, evaporation, runoff, soil water availability -- will increase or decrease over time and in different regions as our world warms. SMAP's higher-resolution soil moisture data will improve the models used to make daily weather and longer-term climate predictions."

SMAP also will advance our ability to monitor droughts, predict floods and mitigate the related impacts of these extreme events. It will allow the monitoring of regional deficits in soil moisture and provide critical inputs into drought monitoring and early warning systems used by resource managers. The mission's high-resolution observations of soil moisture will improve flood warnings by providing information on ground saturation conditions before rainstorms.

SMAP's two advanced instruments work together to produce soil moisture maps. Its active radar works much like a flash camera, but instead of transmitting visible light, it transmits microwave pulses that pass through clouds and moderate vegetation cover to the ground and measures how much of that signal is reflected back. Its passive radiometer operates like a natural-light camera, capturing emitted microwave radiation without transmitting a pulse. Unlike traditional cameras, however, SMAP's images are in the microwave range of the electromagnetic spectrum, which is invisible to the naked eye. Microwave radiation is sensitive to how much moisture is contained in the soil.

The two instruments share a large, lightweight reflector antenna that will be unfurled in orbit like a blooming flower and then spin at about 14 revolutions per minute. The antenna will allow the instruments to collect data across a 621-mile (1,000-kilometer) swath, enabling global coverage every two to three days.

SMAP's radiometer measurements extend and expand on soil moisture measurements currently made by the European Space Agency's Soil Moisture Ocean Salinity (SMOS) mission, launched in 2009. With the addition of a radar instrument, SMAP's soil moisture measurements will be able to distinguish finer features on the ground.

SMAP will launch from Vandenberg Air Force Base on a United Launch Alliance Delta II rocket and maneuver into a 426-mile (685-kilometer) altitude, near-polar orbit that repeats exactly every eight days. The mission is designed to operate at least three years.

SMAP is managed for NASA's Science Mission Directorate in Washington by the agency's Jet Propulsion Laboratory in Pasadena, California, with instrument hardware and science contributions made by NASA's Goddard Space Flight Center in Greenbelt, Maryland. JPL is responsible for project management, system engineering, radar instrumentation, mission operations and the ground data system. Goddard is responsible for the radiometer instrument. Both centers collaborate on science data processing and delivery to the Alaska Satellite Facility, in Fairbanks, and the National Snow and Ice Data Center, at the University of Colorado in Boulder, for public distribution and archiving. NASA's Launch Services Program at the agency's Kennedy Space Center in Florida is responsible for launch management. JPL is managed for NASA by the California Institute of Technology in Pasadena.

For more information about the Soil Moisture Active Passive mission, visit:

http://www.nasa.gov/smap

and

http://smap.jpl.nasa.gov

SMAP will be the fifth NASA Earth science mission to launch within a 12-month period. NASA monitors Earth's vital signs from land, air and space with a fleet of satellites and ambitious airborne and ground-based observation campaigns. NASA develops new ways to observe and study Earth's interconnected natural systems with long-term data records and computer analysis tools to better see how our planet is changing.

For more information about NASA's Earth science activities, visit:

http://www.nasa.gov/earthrightnow



Saturday, January 24, 2015

NASA's CATS eyes clouds, smoke and dust from the space station

Turn on any local TV weather forecast and you can get a map of where skies are blue or cloudy. But for scientists trying to figure out how clouds affect Earth's environment, what's happening inside that shifting cloud cover is critical and hard to see.

To investigate the layers and composition of clouds and tiny airborne particles like dust, smoke and other atmospheric aerosols, scientists at NASA's Goddard Space Flight Center in Greenbelt, Maryland, have developed an instrument called the Cloud-Aerosol Transport System, or CATS. The instrument, which launches to the International Space Station in December 2014, will explore new technologies that could also be used in future satellite missions.

From space, streaks of white clouds can be seen moving across Earth's surface. Other tiny solid and liquid particles called aerosols are also being transported around the atmosphere, but these are largely invisible to our eyes. Aerosols are both natural and man-made, and include windblown desert dust, sea salt, smoke from fires, sulfurous particles from volcanic eruptions, and particles from fossil fuel combustion.

Currently, scientists get a broad picture of clouds and air quality conditions in the atmosphere and generate air quality forecasts by combining satellite, aircraft, and ground-based data with sophisticated computer models. However, most datasets do not provide information about the layered structure of clouds and aerosols.

CATS will provide data about aerosols at different levels of the atmosphere. The data are expected to improve scientists' ability to track different cloud and aerosol types throughout the atmosphere. These datasets will be used to improve strategic and hazard-warning capabilities of events in near real-time, such as tracking plumes from dust storms, volcanic eruptions, and wildfires. The information could also feed into climate models to help understand the effects of clouds and aerosols on Earth's energy balance.

Clouds and aerosols reflect and absorb energy from the sun in a complex way. For example, when the sun's energy reaches the top of the atmosphere, clouds can reflect incoming sunlight, cooling Earth's surface. However, clouds can also absorb heat emitted from Earth and re-radiate it back down, warming the surface. The amount of warming or cooling is heavily dependent on the height, thickness, and structure of clouds in the atmosphere above.

"Clouds are one of the largest uncertainties in predicting climate change," said Matt McGill, principal investigator and payload developer for CATS at Goddard. "For scientists to create more accurate models of Earth's current and future climate, they'll have to include more accurate representations of clouds."

That's where a new instrument like CATS comes in. CATS is a lidar -- similar to radar, but using pulses of light instead of radio waves. CATS will send a laser pulse through the atmosphere toward a distant object like a cloud droplet or aerosol particle. Once the energy reaches the object, some of it is reflected back to the lidar receiver. Based on the time it takes the energy to return, scientists can calculate the distance between the instrument and the object, and thereby determine the altitudes of cloud and aerosol layers. The intensity of the return pulse also allows scientists to infer other properties, such as the composition of clouds and the abundance and sizes of aerosols.
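The ranging arithmetic is a one-liner: the scatterer sits half the round-trip light travel distance away. The return time below is an illustrative value, not a CATS measurement.

```python
# Worked example of the ranging step described above.
C_KM_PER_S = 299_792.458  # speed of light

def range_km(round_trip_s: float) -> float:
    return C_KM_PER_S * round_trip_s / 2.0  # light covers the path twice

# A return ~2.63 ms after the pulse puts the scatterer ~394 km away --
# near the ground for a station orbiting at roughly 400 km altitude.
print(f"{range_km(2.63e-3):.0f} km")
```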

In 2006, NASA launched the Cloud-Aerosol Lidar and Infrared Pathfinder Satellite Observations, or CALIPSO, spacecraft -- a joint mission between NASA and France's space agency, the Centre National d'Études Spatiales. CALIPSO carries a lidar that provides vertical distributions and properties of clouds and aerosols along a flight track. However, the CALIPSO lidar has exceeded its three-year prime mission and has been using its backup laser since 2009.

A unique opportunity to continue gathering this type of data presented itself in 2011, when the International Space Station Program's NASA Research Office offered scientists at Goddard a mounting location aboard the space station for a new lidar instrument -- CATS -- and provided the funding for its construction.

Designed to operate for at least six months, CATS has a goal of operating for three years. With beams at three wavelengths (1064, 532, and 355 nanometers), CATS will be used to derive a variety of properties of cloud and aerosol layers. These properties include layer height, layer thickness, and at least coarse information on the type of aerosols and cloud in various atmospheric layers.

CATS will orbit aboard the space station, which flies at an altitude between 230 miles (375 kilometers) and 270 miles (435 kilometers) above Earth's surface at a 51-degree inclination. This unique orbit path will allow the CATS instrument to observe locations at different times of day and allow scientists to study day-to-night changes in cloud and aerosol effects from space.

Studying clouds and aerosols won't just help scientists study the climate; it's also a chance to investigate air quality and how atmospheric particles affect daily life. That can range from volcano ash plumes, to dust storms, to pollution outbreaks, to wildfires like the California Rim Fire in September 2013, which choked Yosemite National Park during the busy Labor Day weekend. These particles pose health risks to populations, especially the medically vulnerable. By infusing CATS data directly into aerosol models, scientists can improve the tracking of, and response to, the impacts of similar events in the future.



Friday, January 23, 2015

Carbon dioxide warming effects felt just a decade after being emitted

It takes just 10 years for a single emission of carbon dioxide (CO2) to have its maximum warming effects on the Earth.

This is according to researchers at the Carnegie Institution for Science, who have dispelled a common misconception that the main warming effects from a CO2 emission will not be felt for several decades.

The results, which have been published today, 3 December, in IOP Publishing's journal Environmental Research Letters, also confirm that warming can persist for more than a century and suggest that the benefits from emission reductions will be felt by those who have worked to curb the emissions and not just future generations.

Some of these benefits would be the avoidance of extreme weather events, such as droughts, heatwaves and flooding, which are expected to increase concurrently with the change in temperature.

However, some of the bigger climate impacts from warming, such as sea-level rise, melting ice sheets and long-lasting damage to ecosystems, will have a much bigger time lag and may not occur for hundreds or thousands of years, according to the researchers.

Lead author of the study Dr Katharine Ricke said: "Amazingly, despite many decades of climate science, there has never been a study focused on how long it takes to feel the warming from a particular emission of carbon dioxide, taking carbon-climate uncertainties into consideration.

"A lot of climate scientists may have an intuition about how long it takes to feel the warming from a particular emission of CO2, but that intuition might be a little bit out of sync with our best estimates from today's climate and carbon cycle models."

To calculate this timeframe, Dr Ricke, alongside Professor Ken Caldeira, combined results from two climate modelling projects.

The researchers combined information about Earth's carbon cycle -- specifically, how quickly the ocean and biosphere take up a large pulse of CO2 emitted into the atmosphere -- with information about Earth's climate system taken from a group of climate models used in the latest IPCC assessment.

The results showed that the median time between a single CO2 emission and maximum warming was 10.1 years, and reaffirmed that most of the warming persists for more than a century.

The reason for this time lag is that the upper layers of the oceans take longer to heat up than the atmosphere. While the oceans continue to take up heat and warm the overall climate, the warming effect of the emission itself begins to diminish as CO2 is gradually removed from the atmosphere. It takes around 10 years for these two competing factors to balance, at which point warming reaches its maximum.
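A toy calculation, not the study's model, makes the competition concrete: multiply a slowly decaying airborne CO2 fraction by an ocean-delayed warming response and find where the product peaks. With illustrative round-number time constants, the maximum lands close to a decade after the pulse.

```python
# Toy model (not the study's) of the two competing factors: the ocean-delayed
# thermal response rises while the airborne CO2 fraction decays, and their
# product peaks roughly a decade after the pulse. Time constants are
# illustrative round numbers.
import numpy as np

t = np.linspace(0.0, 100.0, 2001)                  # years since the emission

airborne_fraction = 0.5 + 0.5 * np.exp(-t / 60.0)  # slow CO2 drawdown
thermal_response = 1.0 - np.exp(-t / 3.0)          # ocean heat-uptake delay

warming = airborne_fraction * thermal_response     # arbitrary units
print(f"peak warming ~{t[np.argmax(warming)]:.0f} years after emission")
```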

"Our results show that people alive today are very likely to benefit from emissions avoided today and that these will not accrue solely to impact future generations," Dr Ricke continued.

"Our findings should dislodge previous misconceptions about this timeframe that have played a key part in the failure to reach policy consensus."



Thursday, January 22, 2015

Electromagnetic waves linked to particle fallout in Earth's atmosphere, new study finds

In a new study that sheds light on space weather's impact on Earth, Dartmouth researchers and their colleagues show for the first time that plasma waves buffeting the planet's radiation belts are responsible for scattering charged particles into the atmosphere.

The study is the most detailed analysis so far of the link between these waves and the fallout of electrons from the planet's radiation belts. The belts are impacted by fluctuations in "space weather" caused by solar activity that can disrupt GPS satellites, communication systems, power grids and manned space exploration.

The results appear in the journal Geophysical Research Letters.

The Dartmouth space physicists are part of a NASA-sponsored team that studies the Van Allen radiation belts, which are donut-shaped belts of charged particles held in place by Earth's magnetosphere, the magnetic field surrounding our planet. In a quest to better predict space weather, the Dartmouth researchers study the radiation belts from above and below in complementary approaches -- through satellites (the twin NASA Van Allen Probes) high over Earth and through dozens of instrument-laden balloons (BARREL, or Balloon Array for Radiation belt Relativistic Electron Losses) at lower altitudes to assess the particles that rain down.

The Van Allen Probes measure particles and electric and magnetic fields -- essentially everything in the radiation belt environment, including the electrons, which descend following Earth's magnetic field lines that converge at the poles. This is why the balloons are launched from Antarctica, where some of the best observations can be made. As the falling electrons collide with the atmosphere, they produce X-rays, which is what the balloon instruments actually record.

"We are measuring those atmospheric losses and trying to understand how the particles are getting kicked into the atmosphere," says co-author Robyn Millan, an associate professor in Dartmouth's Department of Physics and Astronomy and the principal investigator of BARREL. "Our main focus has been really on the processes that are occurring out in space. Particles in the Van Allen belts never reach the ground, so they don't constitute a health threat. Even the X-rays get absorbed, which is why we have to go to balloon altitudes to see them."

In their new study, the BARREL researchers' major objective was to obtain simultaneous measurements of the scattered particles and of the ionized gas called plasma out in space near Earth's equator. They were particularly interested in simultaneous measurements of a particular kind of plasma wave, called electromagnetic ion cyclotron waves, and whether these waves were responsible for scattering the particles, which has been an open question for years.

The researchers obtained measurements in Antarctica in 2013 when the balloons and both the Geostationary Operational Environmental Satellite (GOES) and Van Allen Probe satellites were near the same magnetic field line. They put the satellite data into their model that tests the wave-particle interaction theory, and the results suggest the wave scattering was the cause of the particle fallout. "This is the first real quantitative test of the theory," Millan says.



Wednesday, January 21, 2015

Deep Space Climate Observatory to provide 'EPIC' views of Earth

NASA has contributed two Earth science instruments for NOAA's space weather observing satellite, the Deep Space Climate Observatory or DSCOVR, set to launch in January 2015. One of the instruments, EPIC, or Earth Polychromatic Imaging Camera, will image the Earth in a single picture, something that hasn't been done before from a satellite. EPIC will also provide valuable atmospheric data.

Currently, to get an entire Earth view, scientists have to piece together images from satellites in orbit. With the launch of the National Oceanic and Atmospheric Administration's (NOAA) DSCOVR and the EPIC instrument, scientists will get pictures of the entire sunlit side of Earth. To get that view, EPIC will orbit the first sun-Earth Lagrange point (L1), 1 million miles from Earth. At this location, about four times farther away than the orbit of the Moon, the gravitational pulls of the sun and Earth cancel out, providing a stable orbit for DSCOVR. Most other Earth-observing satellites circle the planet within 22,300 miles.
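The quoted distance can be sanity-checked with the standard first-order expression for the sun-Earth L1 point in the restricted three-body problem.

```python
# Sanity check on the quoted distance: to first order, the sun-Earth L1
# point lies a distance R * (m_earth / (3 * m_sun))**(1/3) from Earth.
AU_KM = 149_597_870.7          # mean Earth-sun distance
M_EARTH_OVER_M_SUN = 3.003e-6  # Earth/sun mass ratio

r_l1_km = AU_KM * (M_EARTH_OVER_M_SUN / 3.0) ** (1.0 / 3.0)
print(f"{r_l1_km:,.0f} km (~{r_l1_km / 1.609344:,.0f} miles)")
# ~1.5 million km, about 0.93 million miles -- and roughly four times the
# ~384,000 km Earth-Moon distance, consistent with the article.
```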

"Unlike personal cameras, EPIC will take images in 10 very narrow wavelength ranges," said Adam Szabo, DSCOVR project scientist at NASA's Goddard Space Flight Center, Greenbelt, Maryland. "Combining these different wavelength images allows the determination of physical quantities like ozone, aerosols, dust and volcanic ash, cloud height, or vegetation cover. These results will be distributed as different publicly available data products allowing their combination with results from other missions."

These data products are of interest to climate science, as well as hydrology, biogeochemistry, and ecology. Data will also provide insight into Earth's energy balance.

EPIC was built by Lockheed Martin's Advanced Technology Center in Palo Alto, California. It is a 30-centimeter (11.8-inch) telescope that measures in the ultraviolet and visible areas of the spectrum. EPIC images will have a resolution of between 25 and 35 kilometers (15.5 to 21.7 miles).

