Google Search

Current Weather Condtions By City

New Cities Are Being Added. Please Keep Checking if You Don't See Yours Chicago, Los Angeles, Miami, New York, Philadelphia, Washington, DC

USA Current Weather and Current Radar Maps

Current Watches, Warnings and Advisories for the US Issued by NOAA

Friday, February 20, 2015

Hurricane-forecast satellites will keep close eyes on the tropics

A set of eight hurricane-forecast satellites being developed at the University of Michigan is expected to give deep insights into how and where storms suddenly intensify--a little-understood process that's becoming more crucial to figure out as the climate changes, U-M researchers say.

The Cyclone Global Navigation Satellite System is scheduled to launch in fall 2016. At the American Geophysical Union Meeting in San Francisco this week, U-M researchers released estimates of how significantly CYGNSS could improve wind speed and storm intensity forecasts.

CYGNSS--said like the swan constellation--is a $173-million NASA mission that U-M is leading with Texas-based Southwest Research Institute. Each of its eight observatories is about the size of a microwave oven. That's much smaller than a typical weather satellite, which is about the size of a van.

The artificial CYGNSS "constellation," as researchers refer to it, will orbit at tropical, hurricane-belt latitudes. Its coverage will stretch from the 38th parallel north near Delaware's latitude to its counterpart in the south just below Buenos Aires.

Because of their arrangement and number, the observatories will be able to measure the same spot on the globe much more often than the weather satellites flying today can. CYGNSS's revisit time will average between four and six hours, and at times, it can be as fast as 12 minutes.

Conventional weather satellites only cross over the same point once or twice a day. Meteorologists can use ground-based Doppler radar to help them make predictions about storms near land, but hurricanes, which form over the open ocean, present a tougher problem.

"The rapid refresh CYGNSS will offer is a key element of how we'll be able to improve hurricane forecasts," said CYGNSS lead investigator Christopher Ruf, director of the U-M Space Physics Research Lab and professor of atmospheric, oceanic and space sciences.

"CYGNSS gets us the ability to measure things that change fast, like extreme weather. Those are the hardest systems to measure with today's satellites. And because the world is warmer and there's more energy to feed storm systems, there's more likelihood of extreme weather."

Through simulations, the researchers quantified the improvement CYGNSS could have on storm intensity predictions. They found that for a wind speed forecast that is off by 33 knots, or 38 miles per hour--the average error with current capabilities--CYGNSS could reduce that by 9 knots, or about 10 mph.

Considering that the categories of hurricane strength ratchet up, on average, every 20 mph, the accuracy boost is "a very significant number," Ruf said.

"I'd describe the feeling about it as guarded excitement," he said. "It's preliminary and it's all based on models. People will be really excited when we get up there and it works."

The numbers could also improve as scientists update weather prediction tools to better use the new kind of information that CYGNSS will provide.

For people who live in common hurricane or typhoon paths, closer wind speed predictions could translate into more accurate estimates of the storm surge at landfall, Ruf said. That's the main way these systems harm people and property.

"The whole ocean gets higher because the wind pushes the water. That's really hard to forecast now and it's an area we hope to make big improvements in," Ruf said.

Researchers expect the satellite system to give them new insights into storm processes. Hurricanes evolve slowly at first, but then they reach a tipping point, says Aaron Ridley, a professor of atmospheric, oceanic and space sciences.

"The hurricane could be meandering across the Atlantic Ocean and then something happens." Ridley said. "It kicks up a notch and people aren't exactly sure why. A lot of scientists would like to study this rapid intensification in more detail. With a normal mission, you might not be able to see it, but with CYGNSS, you have a better chance."

The satellites will operate in a fundamentally different way than their counterparts do. Rather than transmit a signal and read what reflects back, they'll measure how GPS signals from other satellites bounce off the ocean surface. Each of the eight CYGNSS nodes will measure signals from four of the 32 Global Positioning System satellites.

They'll also be able to take measurements through heavy rain--something other weather satellites are, surprisingly, not very good at.


View the original article here

Thursday, February 19, 2015

Physicist presents new observational solar weather model

Scientists now have an observational framework to help predict solar weather and how it will affect Earth.

"Now it's possible that we can have a space weather model that's like Earth's meteorology," says physicist Dr. S.T. Wu, distinguished professor emeritus of The University of Alabama in Huntsville (UAH) Department of Mechanical and Aerospace Engineering.

That's thanks to the observation-based model that predicts the occurrence and timing of solar mass ejections recently presented by Dr. Wu at the Scientific Committee on Solar-Terrestrial Physics' (SCOSTEP's) 13th Quadrennial Solar-Terrestrial Physics Symposium in Xi'An, China.

Being able to predict such events is important because a powerful direct hit by a coronal mass ejection (CME) is like a huge space hurricane that can deform Earth's magnetic field and fry the circuits of orbiting satellites, spacecraft and delicate terrestrial electronics.

In a large-scale storm, cell service would stop, air traffic control would lose its eyes and ears, and everything could be affected from traffic light control to the automated heating and cooling of buildings to the critical systems that control nuclear armaments. Earth would become largely dark as power grids blink offline.

The solar radiation could directly affect the health of humans, too, Dr. Wu says.

"If you travel to the Pacific, the airlines like to fly you there over the North Pole," he says. "That is the most direct route. But during a coronal event, the solar radiation can affect people, so the airlines try to avoid it by flying below the polar route."

Diverting around the polar route is necessary but costs extra time and money, Dr. Wu says, which is why the National Oceanic and Atmospheric Administration meets with airlines annually about possible upcoming events. Better predictions would help airlines.

The new predictive model is the culmination of decades of work by Dr. Wu, the founder and first director of UAH's Center for Space Plasma and Aeronomic Research (CSPAR). He wrote his first research paper on the subject of CME modeling in 1978.

The new model advances previous CME work by Dr. Wu and a global research group that includes his former students and post-doctoral students. Working with Dr. Wu are Dr. Chaowei Jiang of CSPAR and Dr. Xueshang Feng, Dr. Yufen Zhou and Dr. Qiang Hu of China's State Key Laboratory of Space Weather, Solar-Interplanetary-GeoMAgnetic Weather Group (SIGMA Weather Group).

In the previous work, Dr. Wu's team devised a model for the development of CMEs that was tested and proven against CMEs observed in the past. The scientists also modeled the conditions that are present in solar magnetic shear -- a sigmoidal twisting of the sun's magnetic field, flux emergence, null formation, torus instability, reconnection and free energy that can cause a CME. Their mathematical model of developing coronal mass ejections was shown to be accurate by comparison with actual observed phenomena from spacecraft tracking events on the sun's surface.

The researchers successfully performed a data-driven magnetohydrodynamic (MHD) simulation of a realistic CME initiation process, a step that helped lead to the predictive model and to better understanding the precursors to these solar storms. "Last time, we only modeled a coronal mass ejection," Dr. Wu says. "Now, we have put that eruption result into our propagation model. We have integrated what we did before into a global propagation model."

The predictive model can foresee the development and impacts of a CME from its genesis on the sun through its journey in the interplanetary medium and to its interaction with Earth.

The presentation was well received by the scientists at the conference because "I didn't use any theoretical inputs," says Dr. Wu.

"Others are doing this work, but they are still using theoretical models, but our work is observational, and that is the difference," he says. "It is more realistic because we start out from the sun and what you can see there, and then we work our way out."

The model provides important information to other scientists working on solar storm prediction.

"I've got the framework that says it can be done, so now everybody can do it," says Dr. Wu. Further development of an accurate solar weather prediction system will take supercomputers and the efforts of many researchers and universities, he says. "Now, everybody can jump in with their own research."

Arriving at the working model caps took 36 years of research for Dr. Wu, a period of time he puts in perspective by talking about the development of meteorology.

"It took 60 years to develop accurate meteorology on Earth," he says. "Everybody knows we needed to do this, but finally we got a result. I feel really good about that I can get a handle on it."


View the original article here

Wednesday, February 18, 2015

Giant atmospheric rivers add mass to Antarctica's ice sheet

Extreme weather phenomena called atmospheric rivers were behind intense snowstorms recorded in 2009 and 2011 in East Antarctica. The resulting snow accumulation partly offset recent ice loss from the Antarctic ice sheet, report researchers from KU Leuven.

Atmospheric rivers are long, narrow water vapour plumes stretching thousands of kilometres across the sky over vast ocean areas. They are capable of rapidly transporting large amounts of moisture around the globe and can cause devastating precipitation when they hit coastal areas.

Although atmospheric rivers are notorious for their flood-inducing impact in Europe and the Americas, their importance for Earth's polar climate -- and for global sea levels -- is only now coming to light.

In this study, an international team of researchers led by Irina Gorodetskaya of KU Leuven's Regional Climate Studies research group used a combination of advanced modelling techniques and data collected at Belgium's Princess Elisabeth polar research station in East Antarctica's Dronning Maud Land to produce the first ever in-depth look at how atmospheric rivers affect precipitation in Antarctica.

The researchers studied two particular instances of heavy snowfall in the East Antarctic region in detail, one in May 2009 and another in February 2011, and found that both were caused by atmospheric rivers slamming into the East Antarctic coast.

The Princess Elisabeth polar research station recorded snow accumulation equivalent to up to 5 centimetres of water for each of these weather events, good for 22 per cent of the total annual snow accumulation in those years.

The findings point to atmospheric rivers' impressive snow-producing power. "When we looked at all the extreme weather events that took place during 2009 and 2011, we found that the nine atmospheric rivers that hit East Antarctica in those years accounted for 80 per cent of the exceptional snow accumulation at Princess Elisabeth station," says Irina Gorodetskaya.

And this can have important consequences for Antarctica's diminishing ice sheet. "There is a need to understand how the flow of ice within Antarctica's ice sheet responds to warming and gain insight in atmospheric processes, cloud formation and snowfall," adds Nicole Van Lipzig, co-author of the study and professor of geography at KU Leuven.

A separate study found that the Antarctic ice sheet has lost substantial mass in the last two decades -- at an average rate of about 68 gigatons per year during the period 1992-2011.

"The unusually high snow accumulation in Dronning Maud Land in 2009 that we attributed to atmospheric rivers added around 200 gigatons of mass to Antarctica, which alone offset 15 per cent of the recent 20-year ice sheet mass loss," says Irina Gorodetskaya.

"This study represents a significant advance in our understanding of how the global water cycle is affected by atmospheric rivers. It is the first to look at the effect of atmospheric rivers on Antarctica and to explore their role in cryospheric processes of importance to the global sea level in a changing climate," says Martin Ralph, contributor to the study and Director of the Center for Western Weather and Water Extremes at the University of California, San Diego.

"Moving forward, we aim to explore the impact of atmospheric rivers on precipitation in all Antarctic coastal areas using data records covering the longest possible time period. We want to determine exactly how this phenomenon fits into climate models," says Irina Gorodetskaya.

"Our results should not be misinterpreted as evidence that the impacts of global warming will be small or reversed due to compensating effects. On the contrary, they confirm the potential of Earth's warming climate to manifest itself in anomalous regional responses. Thus, our understanding of climate change and its worldwide impact will strongly depend on climate models' ability to capture extreme weather events, such as atmospheric rivers and the resulting anomalies in precipitation and temperature," she concludes.


View the original article here

Tuesday, February 17, 2015

Improving forecasts for rain-on-snow flooding

Many of the worst West Coast winter floods pack a double punch. Heavy rains and melting snow wash down the mountains together to breach riverbanks, wash out roads and flood buildings.

These events are unpredictable and difficult to forecast. Yet they will become more common as the planet warms and more winter precipitation falls as rain rather than snow.

University of Washington mountain hydrology experts are using the physics behind these events to better predict the risks.

"One of the main misconceptions is that either the rain falls and washes the snow away, or that heat from the rain is melting the snow," said Nicholas Wayand, a UW doctoral student in civil and environmental engineering. He will present his research Dec. 18 at the annual meeting of the American Geophysical Union.

Most of the largest floods on record in the western U.S. are associated with rain falling on snow. But it's not that the rain is melting or washing away the snow.

Instead, it's the warm, humid air surrounding the drops that is most to blame for the melting, Wayand said. Moisture in the air condenses on the cold snow just like water droplets form on a cold drink can. The energy released when the humid air condenses is absorbed by the snow. The other main reason is that rainstorms bring warmer air, and this air blows across the snow to melt its surface. His work support previous research showing that these processes provide 60 to 90 percent of the energy for melting.

Places that experience rain-on-snow flooding are cities on rivers that begin in the mountains, such as Sacramento, California, and Centralia, Washington. In the 1997 New Year's Day flood in Northern California, melting snow exacerbated flooding, which broke levees and caused millions of dollars in damage. The biggest recent rain-on-snow event in Washington was the 2009 flood in the Snoqualmie basin. And the Calgary flood in summer of 2013 included snow from the Canadian Rockies that caused rivers to overflow their banks.

The UW researchers developed a model by recreating the 10 worst rain-on-snow flooding events between 1980 and 2008 in three regions: the Snoqualmie basin in Washington state, the upper San Joaquin basin in central California and the East North Fork of the Feather River basin in southern California.

Their results allow them to gauge the risks for any basin and any incoming storm. The three factors that matter most, they found, are the shape of the basin, the elevation of the rain-to-snow transition before and during the storm, and the amount of tree cover. Basins most vulnerable to snowmelt are treeless basins with a lot of area within the rain-snow transition zone, where the precipitation can fall as snow and then rain.

Trees reduce the risk of flooding because they slow the storm's winds.

"If you've ever been in a forest on a windy day, it's a lot calmer," Wayand said. That slows the energy transferred from condensation and from contact with warm air to the snowpack.

Simulations also show that meltwater accounted for up to about a quarter of the total flooding. That supports earlier research showing that snow is not the main contributor to rain-on-snow floods, but cannot be neglected since it adds water to an already heavy winter rainstorm.

The complexity of mountain weather also plays a role.

"The increase in precipitation with elevation is much greater than usual for some of these storms," said Jessica Lundquist, a UW associate professor of civil and environmental engineering. "Higher flows can result from heavier rainfall rates at higher elevations, rather than from snowmelt."

In related work, Lundquist's group has developed a tennis-ball snow sensor and is measuring growth and melt of the snowpack in the foothills east of Seattle. The scientists aim to better understand how changes in climate and forestry practices might affect municipal water supplies and flood risks.

Wayand and another student in the group have developed a high school curriculum for Seattle teachers to explain rain-on-snow events and the physics behind why they occur. They hope to begin teaching the curriculum sometime next year.

The other collaborator on the work being presented in San Francisco is Martyn Clark at the National Center for Atmospheric Research in Colorado.


View the original article here

Monday, February 16, 2015

Small volcanic eruptions partly explain 'warming hiatus'

The "warming hiatus" that has occurred over the last 15 years has been caused in part by small volcanic eruptions.

Scientists have long known that volcanoes cool the atmosphere because of the sulfur dioxide that is expelled during eruptions. Droplets of sulfuric acid that form when the gas combines with oxygen in the upper atmosphere can persist for many months, reflecting sunlight away from Earth and lowering temperatures at the surface and in the lower atmosphere.

Previous research suggested that early 21st-century eruptions might explain up to a third of the recent warming hiatus.

New research available online in the journal Geophysical Research Letters (GRL) further identifies observational climate signals caused by recent volcanic activity. This new research complements an earlier GRL paper published in November, which relied on a combination of ground, air and satellite measurements, indicating that a series of small 21st-century volcanic eruptions deflected substantially more solar radiation than previously estimated.

"This new work shows that the climate signals of late 20th- and early 21st-century volcanic activity can be detected in a variety of different observational data sets," said Benjamin Santer, a Lawrence Livermore National Laboratory scientist and lead author of the study.

The warmest year on record is 1998. After that, the steep climb in global surface temperatures observed over the 20th century appeared to level off. This "hiatus" received considerable attention, despite the fact that the full observational surface temperature record shows many instances of slowing and acceleration in warming rates. Scientists had previously suggested that factors such as weak solar activity and increased heat uptake by the oceans could be responsible for the recent lull in temperature increases. After publication of a 2011 paper in the journal Science by Susan Solomon of the Massachusetts Institute of Technology (link is external) (MIT), it was recognized that an uptick in volcanic activity might also be implicated in the warming hiatus.

Prior to the 2011 Science paper, the prevailing scientific thinking was that only very large eruptions -- on the scale of the cataclysmic 1991 Mount Pinatubo eruption in the Philippines, which ejected an estimated 20 million metric tons (44 billion pounds) of sulfur -- were capable of impacting global climate. This conventional wisdom was largely based on climate model simulations. But according to David Ridley, an atmospheric scientist at MIT and lead author of the November GRL paper, these simulations were missing an important component of volcanic activity.

Ridley and colleagues found the missing piece of the puzzle at the intersection of two atmospheric layers, the stratosphere and the troposphere -- the lowest layer of the atmosphere, where all weather takes place. Those layers meet between 10 and 15 kilometers (six to nine miles) above Earth.

Satellite measurements of the sulfuric acid droplets and aerosols produced by erupting volcanoes are generally restricted to above 15 km. Below 15 km, cirrus clouds can interfere with satellite aerosol measurements. This means that toward the poles, where the lower stratosphere can reach down to 10 km, the satellite measurements miss a significant chunk of the total volcanic aerosol loading.

To get around this problem, the study by Ridley and colleagues combined observations from ground-, air- and space-based instruments to better observe aerosols in the lower portion of the stratosphere. They used these improved estimates of total volcanic aerosols in a simple climate model, and estimated that volcanoes may have caused cooling of 0.05 degrees to 0.12 degrees Celsius since 2000.

The second Livermore-led study shows that the signals of these late 20th and early 21st eruptions can be positively identified in atmospheric temperature, moisture and the reflected solar radiation at the top of the atmosphere. A vital step in detecting these volcanic signals is the removal of the "climate noise" caused by El Ni?os and La Ni?as.

"The fact that these volcanic signatures are apparent in multiple independently measured climate variables really supports the idea that they are influencing climate in spite of their moderate size," said Mark Zelinka, another Livermore author. "If we wish to accurately simulate recent climate change in models, we cannot neglect the ability of these smaller eruptions to reflect sunlight away from Earth."


View the original article here

Sunday, February 15, 2015

Temperature anomalies are warming faster than Earth's average, study finds

It's widely known that Earth's average temperature has been rising. But research by an Indiana University geographer and colleagues finds that spatial patterns of extreme temperature anomalies -- readings well above or below the mean -- are warming even faster than the overall average.

And trends in extreme heat and cold are important, said Scott M. Robeson, professor of geography in the College of Arts and Sciences at IU Bloomington. They have an outsized impact on water supplies, agricultural productivity and other factors related to human health and well-being.

"Average temperatures don't tell us everything we need to know about climate change," he said. "Arguably, these cold extremes and warm extremes are the most important factors for human society."

Robeson is the lead author of the article "Trends in hemispheric warm and cold anomalies," which will be published in the journal Geophysical Research Letters and is available online. Co-authors are Cort J. Willmott of the University of Delaware and Phil D. Jones of the University of East Anglia.

The researchers analyzed temperature records for the years 1881 to 2013 from HadCRUT4, a widely used data set for land and sea locations compiled by the University of East Anglia and the U.K. Met Office. Using monthly average temperatures at points across the globe, they sorted them into "spatial percentiles," which represent how unusual they are by their geographic size.

Their findings include:

Temperatures at the cold and warm "tails" of the spatial distribution -- the 5th and 95th percentiles -- increased more than the overall average Earth temperature.Over the 130-year record, cold anomalies increased more than warm anomalies, resulting in an overall narrowing of the range of Earth's temperatures.In the past 30 years, however, that pattern reversed, with warm anomalies increasing at a faster rate than cold anomalies. "Earth's temperature was becoming more homogenous with time," Robeson said, "but now it's not."

The study records separate results for the Northern and Southern Hemispheres. Temperatures are considerably more volatile in the Northern Hemisphere, an expected result because there's considerably less land mass in the South to add complexity to weather systems.

The study also examined anomalies during the "pause" in global warming that scientists have observed since 1998. While a 16-year-period is too short a time to draw conclusions about trends, the researchers found that warming continued at most locations on the planet and during much of the year, but that warming was offset by strong cooling during winter months in the Northern Hemisphere.

"There really hasn't been a pause in global warming," Robeson said. "There's been a pause in Northern Hemisphere winter warming."

Co-author Jones of the University of East Anglia said the study provides scientists with better knowledge about what's taking place with Earth's climate. "Improved understanding of the spatial patterns of change over the three periods studied are vital for understanding the causes of recent events," he said.

It may seem counterintuitive that global warming would be accompanied by colder winter weather at some locales. But Robeson said the observation aligns with theories about climate change, which hold that amplified warming in the Arctic region produces changes in the jet stream, which can result in extended periods of cold weather at some locations in the mid-northern latitudes.

And while the rate of planetary warming has slowed in the past 16 years, it hasn't stopped. The World Meteorological Organization announced this month that 2014 is on track to be one of the warmest, if not the warmest, years on record as measured by global average temperatures.

In the U.S., the East has been unusually cold and snowy in recent years, but much of the West has been unusually warm and has experienced drought. And what happens here doesn't necessarily reflect conditions on the rest of the planet. Robeson points out that the United States, including Alaska, makes up only 2 percent of Earth's surface.


View the original article here

Saturday, February 14, 2015

How soil microorganisms get out of step through climate change

Scientists at Helmholtz Zentrum M?nchen, in collaboration with colleagues from the TU M?nchen and the Karlsruhe Institute of Technology (KIT), have studied how soil microorganisms react to climatic change. Their result: Extreme weather events such as long periods of drought and heavy rainfall have a strong impact on the metabolic activity of microbes. This may lead to a change in the nutrient balance in soils and, in extreme cases, may even increase greenhouse gas emissions like nitrous oxide to the atmosphere concentrations.

In order to observe the impact of climate change on soil microorganisms under as natural conditions as possible, the scientists transferred intact young beech seedlings from a cool, wet, northwest-exposed site of a slope approximately corresponding to present climatic conditions to a warmer site exposed to the southwest. This transfer simulated temperature and precipitation profiles as can be expected from climate change. "We tried to keep initial soil type and nutrient content sin soil as comparable as possible to avoid additional factors influencing our data"," said Prof. Dr. Michael Schloter, head of the Research Unit Environmental Genomics (EGEN) at Helmholtz Zentrum M?nchen. "In addition to these natural changes due to the transplantation of the trees, we exacerbated the scenario by simulating long periods of drought followed byheavy rainfall," he added.

To determine the dynamics of the soil microflora, the researchers studied marker genes of microorganisms that are typically involved in nutrient turnover. They found that already a transfer from NW to SW without simulating extreme weather conditions led to a drastic change in metabolic activity and in the composition of the microorganisms. "Under extreme climatic conditions these effects were even more pronounced," said Dr. Silvia Gschwendtner (EGEN), who carried out the research project. The findings indicate that the activity of microorganisms primarily involved in denitrification is positively stimulated by the chosen conditions. "This has an impact on the competition between plants and microorganisms for nitrogen," Gschwendtner explained. "Furthermore, this may also lead to increased emission rates of the climate-relevant greenhouse gas N2O."

Journal Reference:

Silvia Gschwendtner, Javier Tejedor, Carolin Bimueller, Michael Dannenmann, Ingrid K?gel Knabner, Michael Schloter. Climate Change Induces Shifts in Abundance and Activity Pattern of Bacteria and Archaea Catalyzing Major Transformation Steps in Nitrogen Turnover in a Soil from a Mid-European Beech Forest. PLoS ONE, 2014; 9 (12): e114278 DOI: 10.1371/journal.pone.0114278

View the original article here

Friday, February 13, 2015

Top weather conditions that amplify Lake Erie algal blooms revealed

Of the many weather-related factors that contribute to harmful algal blooms (HABs) in Lake Erie, a new study has identified one as most important: the wind.

Over a 10-year period in Lake Erie, wind speed contributed more consistently to HABs than sunshine or even precipitation, researchers at The Ohio State University and their colleagues found.

The ongoing study is unusual, in that researchers are building the first detailed analyses of how the various environmental factors influence each other -- in the context of satellite studies of Lake Erie.

They gave their early results at the American Geophysical Union meeting on Dec. 17.

To C.K. Shum, Distinguished University Scholar and professor of geodetic science at Ohio State, the finding "underscores the need for environmental agencies to incorporate the threat of extreme weather events caused by climate change into future algae mitigation strategies."

Where other studies have linked weather phenomena to HABs, this study goes a step further to look at how environmental drivers impact each other, and "ranks" them by their relative importance in promoting HABs, said Song Liang, formerly of Ohio State and now an associate professor of environmental and global health at the University of Florida.

"What surprised us the most was how the impact of nonweather factors, such as nitrogen and phosphorus pollution, varied strongly by season, while weather factors remained consistently important throughout the year," he said.

Researchers have long known that high nitrogen and phosphorus levels are the actual causes of HABs, which choke freshwater ecosystems and render the water toxic. But when it comes to the various environmental factors that can amplify the amount of these nutrients in the water, or aid or hamper the spread of algae, the relationships are much more complex.

"One of the objectives of this project is investigating historical patterns of harmful algal blooms and their linkage to water quality and environmental factors," explained project leader Jiyoung Lee, associate professor of environmental health sciences at Ohio State. "By doing this, we can better understand and predict the future of HABs and water safety in the Lake Erie community with the impact of changing climate and environmental factors."

Liang and his group analyzed nine environmental factors, including solar radiation, wind speed, precipitation, nitrogen concentration, water temperature and water quality in Lake Erie from 2002 to 2012. Then the larger research team used data from the sensor onboard the European Space Agency's Envisat satellite MEdium Resolution Imaging Spectrometer (MERIS) to examine how the color of the lake water changed during those years -- an indication of the concentration of the toxic blue-green algae present in HABs.

The researchers examined the environmental drivers by season, and found that wind speed affected the spread of algal blooms consistently throughout spring, summer and fall. Seasons of low winds led to larger blooms. That's because when wind speed is low, lake water is more still, and algae can more easily float to the top and form thick mats that spread along the lake surface.

Sunlight, meanwhile, was important in the spring and summer as a source of energy for the algae. Precipitation was very important in the summer and the winter, when rains and melting snow boosted runoff and delivered nitrogen and phosphorus, which algae use as food sources, to the lake.

As the project continues, the researchers hope to get a better understanding of how the variables relate to each other, and explore the notion of weather and climate as factors in a kind of "early warning system" for HABs.


View the original article here

Thursday, February 12, 2015

California's drought is the worst in 1,200 years, evidence suggests

As California finally experiences the arrival of a rain-bearing Pineapple Express this week, two climate scientists from the University of Minnesota and Woods Hole Oceanographic Institution have shown that the drought of 2012-2014 has been the worst in 1,200 years.

Daniel Griffin, an assistant professor in the Department of Geography, Environment and Society at the University of Minnesota, and Kevin Anchukaitis, an assistant scientist at Woods Hole Oceanographic Institution, asked the question, "How unusual is the ongoing California drought?" Watching the severity of the California drought intensify since last autumn, they wondered how it would eventually compare to other extreme droughts throughout the state's history.

To answer those questions, Griffin and Anchukaitis collected new tree-ring samples from blue oak trees in southern and central California. "California's old blue oaks are as close to nature's rain gauges as we get," says Griffin. "They thrive in some of California's driest environments." These trees are particularly sensitive to moisture changes and their tree rings display moisture fluctuations vividly.

As soon as the National Oceanic and Atmospheric Administration (NOAA) released climate data for the summer of 2014, the two scientists sprang into action. Using their blue oak data, they reconstructed rainfall back to the 13th century. They also calculated the severity of the drought by combining NOAA's estimates of the Palmer Drought Severity Index (PDSI), an index of soil moisture variability, with the existing North American Drought Atlas, a spatial tree-ring based reconstruction of drought developed by scientists at Columbia University's Lamont-Doherty Earth Observatory. These resources together provided complementary data on rainfall and soil moisture over the past millennium. Griffin and Anchukaitis found that while the current period of low precipitation is not unusual in California's history, these rainfall deficits combined with sustained record high temperatures created the current multiyear severe water shortages. "While it is precipitation that sets the rhythm of California drought, temperature weighs in on the pitch," says Anchukaitis.

"We were genuinely surprised at the result," says Griffin, a NOAA Climate & Global Change Fellow and former WHOI postdoctoral scholar. "This is California--drought happens. Time and again, the most common result in tree-ring studies is that drought episodes in the past were more extreme than those of more recent eras. This time, however, the result was different." While there is good evidence of past sustained, multi-decadal droughts or so-called "megadroughts"' in California, the authors say those past episodes were probably punctuated by occasional wet years, even if the cumulative effect over decades was one of overall drying. The current short-term drought appears to be worse than any previous span of consecutive years of drought without reprieve.

Tree rings are a valuable data source when tracking historical climate, weather and natural disaster trends. Floods, fires, drought and other elements that can affect growing conditions are reflected in the development of tree rings, and since each ring represents one year the samples collected from centuries-old trees are a virtual timeline that extend beyond the historical record in North America.

So what are the implications? The research indicates that natural climate system variability is compounded by human-caused climate change and that "hot" droughts such as the current one are likely to occur again in the future. California is the world's 8th largest economy and the source of a substantial amount of U.S. produce. Surface water supply shortages there have impacts well beyond the state's borders.

With an exceptionally wet winter, parts of California might emerge from the drought this year. "But there is no doubt," cautions Anchukaitis, "that we are entering a new era where human-wrought changes to the climate system will become important for determining the severity of droughts and their consequences for coupled human and natural systems."


View the original article here

Wednesday, February 11, 2015

Sudden jump in a storm's lightning might warn a supercell is forming

A sudden jump in the number of lightning strikes inside a garden-variety thunderstorm might soon give forecasters a new tool for predicting severe weather and issuing timely warnings, according to research at The University of Alabama in Huntsville (UAH).

The sudden increase in lightning is one sign a normal storm is rapidly evolving into a supercell, with a large rotating updraft -- or mesocyclone -- at its heart.

"Supercells are more prone to produce severe weather events, including damaging straight line winds and large hail," said Sarah Stough, a UAH graduate student in atmospheric science. "Supercells also produce the strongest and most deadly tornadoes."

Early results from Stough's research were presented Jan. 7 in Phoenix at the American Meteorological Society's annual meeting.

"Roughly 90 percent of mesocyclones are related to severe weather of some kind, while only 25 percent are associated with tornadoes," Stough said.

Because the sudden increase in lightning strikes is either concurrent with -- or within minutes of -- a supercell forming, UAH researchers are developing algorithms that might be used by forecasters to issue timely severe weather warnings.

"Basically, we keep a 10-minute running average of the number of lightning flashes in a cell," Stough said. "Then, if the flash rate suddenly jumps to at least twice the standard deviation of that running average, there is a high probability the updraft in that cell has strengthened, a supercell is forming and severe weather is more likely with that storm."

"We can use the lightning jump as a nowcasting tool for supercells if the jump is set in the context of that storm's environmental data," said Dr. Larry Carey, a UAH associate professor in atmospheric science. "If the meteorology of the day suggests supercells are likely, the jump can tell us when and where that is happening. Early warning of supercells -- especially the first of a severe weather day -- is an important forecasting challenge."

The lightning jump has been tested as a forecast tool by National Weather Service forecasters in Huntsville, Ala., and at NWS testing facilities in Norman, Okla.

"I know a lot of forecasters are excited about having this information," Stough said.

While the ongoing research uses ground-based lightning detection networks, the UAH team is also working on being able to use lightning counts reported by the Geostationary Lightning Mapper aboard the GOES-R geostationary weather satellite, which is scheduled to launch in 2016.

"The lightning jump is getting in front of forecasters now so we can get feedback, and fit the lightning jump concept into their forecasting methods," said Chris Schultz, an atmospheric science graduate student at UAH and an intern at NASA's Marshall Space Flight Center. "This way, when the real-time data from GLM is available and the lightning jump is implemented, it will immediately fit into the forecasters' warning operations."


View the original article here

Tuesday, February 10, 2015

Glacier beds can get slipperier at higher sliding speeds

As a glacier's sliding speed increases, the bed beneath the glacier can grow slipperier, according to laboratory experiments conducted by Iowa State University glaciologists.

They say including this effect in efforts to calculate future increases in glacier speeds could improve predictions of ice volume lost to the oceans and the rate of sea-level rise.

The glaciologists -- Lucas Zoet, a postdoctoral research associate, and Neal Iverson, a professor of geological and atmospheric sciences -- describe the results of their experiments in the Journal of Glaciology. The paper uses data collected from a newly constructed laboratory tool, the Iowa State University Sliding Simulator, to investigate glacier sliding. The device was used to explore the relationship between drag and sliding speed for comparison with the predictions of theoretical models.

"We really have a unique opportunity to study the base of glaciers with these experiments," said Zoet, the lead author of the paper. "The other tactic you might take is studying these relationships with field observations, but with field data so many different processes are mixed together that it becomes hard to untangle the relevant data from the noise."

Data collected by the researchers show that resistance to glacier sliding -- the drag that the bed exerts on the ice -- can decrease in response to increasing sliding speed. This decrease in drag with increasing speed, although predicted by some theoreticians a long as 45 years ago, is the opposite of what is usually assumed in mathematical models of the flow of ice sheets.

These are the first empirical results demonstrating that as ice slides at an increasing speed -- perhaps in response to changing weather or climate -- the bed can become slipperier, which could promote still faster glacier flow.

The response of glaciers to changing climate is one of the largest potential contributors to sea-level rise. Predicting glacier response to climate change depends on properly characterizing the way a glacier slides over its bed. There has been a half-century debate among theoreticians as to how to do that.

The simulator features a ring of ice about 8 inches thick and about 3 feet across that is rotated over a model glacier bed. Below the ice is a hydraulic press that can simulate the weight of a glacier several hundred yards thick. Above are motors that can rotate the ice ring over the bed at either a constant speed or a constant stress. A circulating, temperature-regulated fluid keeps the ice at its melting temperature -- a necessary condition for significant sliding.

"About six years were required to design, construct, and work the bugs out of the new apparatus," Iverson said, "but it is performing well now and allowing hypothesis tests that were formerly not possible."


View the original article here

Monday, February 9, 2015

The legend of the Kamikaze typhoons

In the late 13th century, Kublai Khan, ruler of the Mongol Empire, launched one of the world's largest armada of its time in an attempt to conquer Japan. Early narratives describe the decimation and dispersal of these fleets by the "Kamikaze" of CE 1274 and CE 1281 -- a pair of intense typhoons divinely sent to protect Japan from invasion.

These historical accounts are prone to exaggeration, and significant questions remain regarding the occurrence and true intensity of these legendary typhoons. For independent insight, we provide a new 2,000 year sedimentary reconstruction of typhoon overwash from a coastal lake near the location of the Mongol invasions. Two prominent storm deposits date to the timing of the Kamikaze typhoons and support them being of significant intensity.

Our new storm reconstruction also indicates that events of this nature were more frequent in the region during the timing of the Mongol invasions. Results support the paired Kamikaze typhoons in having played an important role in preventing the early conquest of Japan by Mongol fleets. In doing so, the events may provide one of the earliest historical cases for the shaping of a major geopolitical boundary by an increased probability of extreme weather due to changing atmospheric and oceanic conditions.

Journal Reference:

J. D. Woodruff, K. Kanamaru, S. Kundu, T. L. Cook. Depositional evidence for the Kamikaze typhoons and links to changes in typhoon climatology. Geology, 2014; DOI: 10.1130/G36209.1

View the original article here

Sunday, February 8, 2015

Birds sensed severe storms and fled before tornado outbreak

Golden-winged warblers apparently knew in advance that a storm that would spawn 84 confirmed tornadoes and kill at least 35 people last spring was coming, according to a report in the Cell Press journal Current Biology on December 18. The birds left the scene well before devastating supercell storms blew in.

The discovery was made quite by accident while researchers were testing whether the warblers, which weigh "less than two nickels," could carry geolocators on their backs. It turns out they can, and much more. With a big storm brewing, the birds took off from their breeding ground in the Cumberland Mountains of eastern Tennessee, where they had only just arrived, for an unplanned migratory event. All told, the warblers travelled 1,500 kilometers in 5 days to avoid the historic tornado-producing storms.

"The most curious finding is that the birds left long before the storm arrived," says Henry Streby of the University of California, Berkeley. "At the same time that meteorologists on The Weather Channel were telling us this storm was headed in our direction, the birds were apparently already packing their bags and evacuating the area."

The birds fled from their breeding territories more than 24 hours before the arrival of the storm, Streby and his colleagues report. The researchers suspect that the birds did it by listening to infrasound associated with the severe weather, at a level well below the range of human hearing.

"Meteorologists and physicists have known for decades that tornadic storms make very strong infrasound that can travel thousands of kilometers from the storm," Streby explains. While the birds might pick up on some other cue, he adds, the infrasound from severe storms travels at exactly the same frequency the birds are most sensitive to hearing.

The findings show that birds that follow annual migratory routes can also take off on unplanned trips at other times of the year when conditions require it. That's probably good news for birds, as climate change is expected to produce storms that are both stronger and more frequent. But there surely must be a downside as well, the researchers say.

"Our observation suggests [that] birds aren't just going to sit there and take it with regards to climate change, and maybe they will fare better than some have predicted," Streby says. "On the other hand, this behavior presumably costs the birds some serious energy and time they should be spending on reproducing." The birds' energy-draining journey is just one more pressure human activities are putting on migratory songbirds.

In the coming year, Streby's team will deploy hundreds of geolocators on the golden-winged warblers and related species across their entire breeding range to find out where they spend the winter and how they get there and back.

"I can't say I'm hoping for another severe tornado outbreak," Streby says, "but I am eager to see what unpredictable things happen this time."


View the original article here

Saturday, February 7, 2015

NASA, NOAA find 2014 warmest year in modern record

The year 2014 ranks as Earth's warmest since 1880, according to two separate analyses by NASA and National Oceanic and Atmospheric Administration (NOAA) scientists.

The 10 warmest years in the instrumental record, with the exception of 1998, have now occurred since 2000. This trend continues a long-term warming of the planet, according to an analysis of surface temperature measurements by scientists at NASA's Goddard Institute of Space Studies (GISS) in New York.

In an independent analysis of the raw data, also released Friday, NOAA scientists also found 2014 to be the warmest on record.

"NASA is at the forefront of the scientific investigation of the dynamics of the Earth's climate on a global scale," said John Grunsfeld, associate administrator for the Science Mission Directorate at NASA Headquarters in Washington. "The observed long-term warming trend and the ranking of 2014 as the warmest year on record reinforces the importance for NASA to study Earth as a complete system, and particularly to understand the role and impacts of human activity."

Since 1880, Earth's average surface temperature has warmed by about 1.4 degrees Fahrenheit (0.8 degrees Celsius), a trend that is largely driven by the increase in carbon dioxide and other human emissions into the planet's atmosphere. The majority of that warming has occurred in the past three decades.

"This is the latest in a series of warm years, in a series of warm decades. While the ranking of individual years can be affected by chaotic weather patterns, the long-term trends are attributable to drivers of climate change that right now are dominated by human emissions of greenhouse gases," said GISS Director Gavin Schmidt.

While 2014 temperatures continue the planet's long-term warming trend, scientists still expect to see year-to-year fluctuations in average global temperature caused by phenomena such as El Ni?o or La Ni?a. These phenomena warm or cool the tropical Pacific and are thought to have played a role in the flattening of the long-term warming trend over the past 15 years. However, 2014's record warmth occurred during an El Ni?o-neutral year.

"NOAA provides decision makers with timely and trusted science-based information about our changing world," said Richard Spinrad, NOAA chief scientist. "As we monitor changes in our climate, demand for the environmental intelligence NOAA provides is only growing. It's critical that we continue to work with our partners, like NASA, to observe these changes and to provide the information communities need to build resiliency."

Regional differences in temperature are more strongly affected by weather dynamics than the global mean. For example, in the U.S. in 2014, parts of the Midwest and East Coast were unusually cool, while Alaska and three western states -- California, Arizona and Nevada -- experienced their warmest year on record, according to NOAA.

The GISS analysis incorporates surface temperature measurements from 6,300 weather stations, ship- and buoy-based observations of sea surface temperatures, and temperature measurements from Antarctic research stations. This raw data is analyzed using an algorithm that takes into account the varied spacing of temperature stations around the globe and urban heating effects that could skew the calculation. The result is an estimate of the global average temperature difference from a baseline period of 1951 to 1980.

NOAA scientists used much of the same raw temperature data, but a different baseline period. They also employ their own methods to estimate global temperatures.

GISS is a NASA laboratory managed by the Earth Sciences Division of the agency's Goddard Space Flight Center, in Greenbelt, Maryland. The laboratory is affiliated with Columbia University's Earth Institute and School of Engineering and Applied Science in New York.

NASA monitors Earth's vital signs from land, air and space with a fleet of satellites, as well as airborne and ground-based observation campaigns. NASA develops new ways to observe and study Earth's interconnected natural systems with long-term data records and computer analysis tools to better see how our planet is changing. The agency shares this unique knowledge with the global community and works with institutions in the United States and around the world that contribute to understanding and protecting our home planet.

The data set of 2014 surface temperature measurements is available at:

http://data.giss.nasa.gov/gistemp/

The methodology used to make the temperature calculation is available at:

http://data.giss.nasa.gov/gistemp/sources_v3/

For more information about NASA's Earth science activities, visit:

http://www.nasa.gov/earthrightnow


View the original article here

Friday, February 6, 2015

Global warming's influence on extreme weather

Understanding the cause-and-effect relationship between global warming and record-breaking weather requires asking precisely the right questions.

Extreme climate and weather events such as record high temperatures, intense downpours and severe storm surges are becoming more common in many parts of the world. But because high-quality weather records go back only about 100 years, most scientists have been reluctant to say if global warming affected particular extreme events.

On Wednesday, Dec. 17, at the American Geophysical Union's Fall Meeting in San Francisco, Noah Diffenbaugh, an associate professor of environmental Earth system science at the Stanford School of Earth Sciences, will discuss approaches to this challenge in a talk titled "Quantifying the Influence of Observed Global Warming on the Probability of Unprecedented Extreme Climate Events." He will focus on weather events that -- at the time they occur -- are more extreme than any other event in the historical record.

Diffenbaugh emphasizes that asking precisely the right question is critical for finding the correct answer.

"The media are often focused on whether global warming caused a particular event," said Diffenbaugh, who is a senior fellow at the Stanford Woods Institute for the Environment. "The more useful question for real-world decisions is: 'Is the probability of a particular event statistically different now compared with a climate without human influence?'"

Diffenbaugh said the research requires three elements: a long record of climate observations; a large collection of climate model experiments that accurately simulate the observed variations in climate; and advanced statistical techniques to analyze both the observations and the climate models.

One research challenge involves having just a few decades or a century of high-quality weather data with which to make sense of events that might occur once every 1,000 or 10,000 years in a theoretical climate without human influence.

But decision makers need to appreciate the influence of global warming on extreme climate and weather events.

"If we look over the last decade in the United States, there have been more than 70 events that have each caused at least $1 billion in damage, and a number of those have been considerably more costly," said Diffenbaugh. "Understanding whether the probability of those high-impact events has changed can help us to plan for future extreme events, and to value the costs and benefits of avoiding future global warming."


View the original article here

Thursday, February 5, 2015

New study explains the role of oceans in 'global warming hiatus'

New research shows that ocean heat uptake across three oceans is the likely cause of the 'warming hiatus' -- the current decade-long slowdown in global surface warming.

Using data from a range of state-of-the-art ocean and atmosphere models, the research shows that the increased oceanic heat drawdown in the equatorial Pacific, North Atlantic and Southern Ocean basins has played a significant role in the hiatus.

The new analysis has been published in Geophysical Research Letters by Professor Sybren Drijfhout from the University of Southampton and collaborators from the National Oceanography Centre (NOC) Dr Adam Blaker, Professor Simon Josey, Dr George Nurser and Dr Bablu Sinha, together with Dr Magdalena Balmaseda from the European Centre for Medium Range Weather Forecasting (ECMWF).

Professor Drijfhout said: "This study attributes the increased oceanic heat drawdown in the equatorial Pacific, North Atlantic and Southern Ocean to specific, different mechanisms in each region. This is important as current climate models have been unable to simulate the hiatus. Our study gives clues to where the heat is drawn down and by which processes. This can serve as a benchmark for climate models on how to improve their projections of future global mean temperature."

Previously, the drawdown of heat by the Equatorial Pacific Ocean over the hiatus period, due to cool sea-surface temperatures associated with a succession of La Niña episodes, was thought to be sufficient on its own to explain the hiatus.

However, this new analysis reveals that the northern North Atlantic, the Southern Ocean and Equatorial Pacific Ocean are all important regions of ocean heat uptake. Each basin contributes a roughly equal amount to explaining the hiatus, but the mechanisms of heat drawdown are different and specific in each basin.

In the North Atlantic, more heat has been retained at deep levels as a result of changes to both the ocean and atmospheric circulations, which have led to the winter atmosphere extracting less heat from the ocean.

In the Southern Ocean, the extra drawdown of heat had gone unnoticed and is increasing on a much longer timescale (multi-decadal) than the other two regions (decadal). Here, gradual changes in the prevailing westerly winds have modified the ocean-atmosphere heat exchange, particularly in the Southern Indian Ocean.

The team calculated the change in the amount of heat entering the ocean using a state-of-the-art, high-resolution ocean model, developed and run by NOC scientists, that is driven by surface observations. This estimate was compared with results from an ocean model-data synthesis from ECMWF and a leading atmospheric model-data synthesis produced in the US. Professor Josey said: "It is the synthesis of information from models and observational data that provides a major strength of our study."
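
For orientation, ocean heat content is, at its simplest, a density-weighted depth integral of temperature. The toy calculation below shows the form of that integral for an invented 0-700 m temperature profile; it is illustrative only and not drawn from the NOC model.

```python
# Sketch: ocean heat content per unit area as the depth integral
# rho * cp * integral(T dz). The profile values are invented.
import numpy as np
from scipy.integrate import trapezoid

rho = 1025.0  # seawater density, kg/m^3
cp = 3990.0   # specific heat capacity of seawater, J/(kg K)

depth = np.array([0, 50, 100, 200, 400, 700])        # depth levels, m
temp = np.array([18.0, 17.5, 15.0, 11.0, 8.0, 6.0])  # temperature, deg C

ohc = rho * cp * trapezoid(temp, depth)  # heat in the 0-700 m column, J/m^2
print(f"OHC (0-700 m): {ohc:.3e} J/m^2")
```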

Dr Sinha concluded: "The deeper understanding gained in this study of the processes and regions responsible for variations in oceanic heat drawdown and retention will improve the accuracy of future climate projections."


View the original article here

Wednesday, February 4, 2015

Muddy forests, shorter winters present challenges for loggers

Stable, frozen ground has long been recognized as a logger's friend, capable of supporting equipment and trucks in marshy or soggy forests. Now, a comprehensive look at weather from 1948 onward shows that the logger's friend is melting.

The study, published in the current issue of the Journal of Environmental Management, finds that the period of frozen ground has declined by an average of two or three weeks since 1948. During that time, in years with more variability in freezing and thawing, wood harvests have shifted toward red pine and jack pine -- species that grow in sandy, well-drained soil that can support trucks and heavy equipment even when not frozen.

Jack pine, a characteristic north woods Wisconsin species, is declining, and areas that have been harvested are often replaced with a different species, changing the overall ecosystem.

The study was an effort to look at how long-term weather trends affect forestry, says author Adena Rissman, an assistant professor of forest and wildlife ecology at the University of Wisconsin-Madison. "When my co-author, Chad Rittenhouse, and I began this project, we wanted to know how weather affects our ability to support sustainable working forests. We found a significant decline in the duration of frozen ground over the past 65 years, and at the same time, a significant change in the species being harvested."

"This study identifies real challenges facing forest managers, loggers, landowners, and industry," says Rittenhouse, now an assistant research professor of natural resources and the environment at the University of Connecticut. "Once we understood the trends in frozen ground, we realized how pulling out that issue tugged on economics, livelihoods, forest ecology, wildlife habitat and policy."

Mud can make forests impassable in fall, and even more so after the snow melts in spring, making life difficult for companies that buy standing trees, Rittenhouse says. "Nobody wants to get stuck; you lose time and have to get hauled out or wait for the ground to firm up again."

Shorter winters and uncertainty complicate management for logging companies, Rissman adds. "They often need to plan out their jobs for the next six months or year." The same is true for managers of state and county forests, which typically allow two years for a cut to be completed. "In some cases," she says, "they are going to three-year contracts to give more time to get the timber out."

Even if equipment can traverse muddy roads, their ruts may ruin the road and cause unacceptable erosion. "There is increased attention to rutting on public land, and on private land that is in the state's managed forest program or in a form of sustainable forest certification," says Rissman. "Excessively wet and muddy ground during harvest is a lose-lose-lose for the logger, the landowner and the environment."

The study drew data from weather records from airports, used to model when the ground was frozen; Department of Natural Resources records on harvest levels for various tree species; and interviews with forest managers and loggers.
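
A back-of-the-envelope version of the frozen-ground trend is easy to sketch. The snippet below counts hypothetical frozen days per winter and fits a linear trend; the data are synthetic stand-ins, and the study's actual frozen-ground model built from airport weather records is more sophisticated.

```python
# Sketch: estimating a trend in frozen-ground duration. Synthetic series
# standing in for winters from 1948 onward; not the study's real data.
import numpy as np
from scipy.stats import linregress

rng = np.random.default_rng(0)
years = np.arange(1948, 2013)
# Invented counts of frozen days per winter, declining ~2.5 days per decade
frozen_days = 110 - 0.25 * (years - 1948) + rng.normal(0, 7, size=years.size)

fit = linregress(years, frozen_days)
print(f"Trend: {fit.slope * 10:+.1f} frozen days per decade (p = {fit.pvalue:.3g})")
print(f"Estimated change, 1948-2012: {fit.slope * (years[-1] - years[0]):+.0f} days")
```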

"People in the forestry industry say this is a big deal; winter is normally the most profitable time," Rissman observes. "It's more and more difficult to make a profit in forestry (with) more loggers (taking) on a lot of debt -- they are heavily mechanized, have heavy labor and insurance expenses, and these costs don't end when they don't have work."

The uncertainty about when and where they can work emerged during an interview with a veteran logger, who is quoted as follows in the study: "When I started in the business ... the typical logger ... would shut down and not do anything for the month or two months that the spring break up would last for. Nowadays, with the cost of equipment, and just the cost of insurance on that equipment alone, you're looking for work almost 12 months out of the year."

The shorter winters seem linked to climate change, Rissman acknowledges. "For many people, climate change is something that happens, or not, in places that are far away, at scales that are difficult to see or understand through personal experience. Here's an example of something we can clearly document, of a trend that is having an impact on how forests are managed, right here at home."


View the original article here

Tuesday, February 3, 2015

Atmospheric rivers, cloud-creating aerosol particles, and California reservoirs

In the midst of the California rainy season, scientists are embarking on a field campaign designed to improve the understanding of the natural and human-caused phenomena that determine when and how the state gets its precipitation. They will do so by studying atmospheric rivers, meteorological events that include the famous rainmaker known as the Pineapple Express.

CalWater 2015 is an interagency, interdisciplinary field campaign starting January 14, 2015. CalWater 2015 will entail four research aircraft flying through major storms while a ship outfitted with additional instruments cruises below. The research team includes scientists from Scripps Institution of Oceanography at UC San Diego, the Department of Energy's Pacific Northwest National Laboratory, NOAA, and NASA and uses resources from the DOE's Atmospheric Radiation Measurement (ARM) Climate Research Facility -- a national scientific user facility.

The study will help provide a better understanding of how California gets its rain and snow, how human activities are influencing precipitation, and how the new science could inform water management decisions relating to drought and flood.

"After several years in the making by an interdisciplinary science team, and through support from multiple agencies, the CalWater 2015 field campaign is set to observe the key conditions offshore and over California like has never been possible before," said Scripps climate researcher Marty Ralph, a CalWater lead investigator. "These data will ultimately help develop better climate projections for water and will help test the potential of using existing reservoirs in new ways based on atmospheric river forecasts."

Like land-based rivers, atmospheric rivers carry massive amounts of moisture long distances -- in California's case, from the tropics to the U.S. West Coast. When an atmospheric river hits the coast, it releases its moisture as precipitation. How much and whether it falls as rain or snow depends on aerosols -- tiny particles made of dust, sea salt, volatile molecules, and pollution.

The researchers will examine the strength of atmospheric rivers, which produce up to 50 percent of California's precipitation and can transport 10-20 times the flow of the Mississippi River. They will also explore how to predict when and where atmospheric rivers will hit land, as well as the role of ocean evaporation and how the ocean changes after a river passes.
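
The moisture an atmospheric river carries is conventionally summarized as integrated vapor transport (IVT): wind-weighted specific humidity integrated through the depth of the atmosphere. The sketch below evaluates that integral for an invented but roughly plausible sounding; the commonly used cutoff assumed here is that IVT above about 250 kg per meter per second indicates atmospheric-river conditions.

```python
# Sketch: integrated vapor transport (IVT) from a single profile.
# IVT = (1/g) * |integral of q*V dp|. Profile values are invented.
import numpy as np
from scipy.integrate import trapezoid

g = 9.81                                                # gravity, m/s^2
p = np.array([1000, 925, 850, 700, 500, 300]) * 100.0   # pressure levels, Pa
q = np.array([12.0, 10.0, 8.0, 4.0, 1.0, 0.1]) * 1e-3   # specific humidity, kg/kg
u = np.array([15.0, 20.0, 25.0, 28.0, 30.0, 35.0])      # zonal wind, m/s
v = np.array([10.0, 14.0, 16.0, 18.0, 20.0, 22.0])      # meridional wind, m/s

# p decreases upward, so the raw integral is negative; flip the sign.
ivt_u = -trapezoid(q * u, p) / g
ivt_v = -trapezoid(q * v, p) / g
ivt = np.hypot(ivt_u, ivt_v)
print(f"IVT = {ivt:.0f} kg m^-1 s^-1")  # > ~250 suggests an atmospheric river
```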

"Climate and weather models have a hard time getting precipitation right," said Ralph. "In fact, the big precipitation events that are so important for water supply and can cause flooding, mostly due to atmospheric rivers, are some of the most difficult to predict with useful accuracy. The severe California drought is essentially a result of a dearth of atmospheric rivers, while, conversely, the risk of Katrina-like damages for California due to severe ARs has also been quantified in previous research."

For the next month or more, instrument teams will gather data from the NOAA research vessel Ronald H. Brown and from two NOAA, one DOE, and one NASA research aircraft, deploying in a coordinated strategy when weather forecasters see atmospheric rivers developing in the Pacific Ocean off the coast of California. NASA will also provide remote sensing data for the project.

"Improving our understanding of atmospheric rivers will help us produce better forecasts of where they will hit and when, and how much rain and snow they will deliver," said Allen White, NOAA research meteorologist and CalWater 2015 mission scientist. "Better forecasts will give communities the environmental intelligence needed to respond to droughts and floods."

Most research flights will originate at McClellan Airfield in Sacramento. Ground-based instruments in Bodega Bay, Calif., and scattered throughout the state will also collect data on natural and human contributions to the atmosphere such as dust and pollution. This data-gathering campaign follows the 2009-2011 CalWater1 field campaign, which yielded new insights into how precipitation processes in the Sierra Nevada can be influenced by different sources of aerosols that seed the clouds.

"This will be an extremely important study in advancing our overall understanding of aerosol impacts on clouds and precipitation," said Kimberly Prather, a CalWater lead investigator and Distinguished Chair in Atmospheric Chemistry with appointments at Scripps Oceanography and the Department of Chemistry and Biochemistry at UC San Diego. "It will build upon findings from CalWater1, adding multiple aircraft to directly probe how aerosols from different sources, local, ocean, as well as those from other continents, are influencing clouds and precipitation processes over California."

"We are collecting this data to improve computer models of rain that represent many complex processes and their interactions with the environment," said PNNL's Leung. "Atmospheric rivers contribute most of the heavy rains along the coast and mountains in the West. We want to capture those events better in our climate models used to project changes in extreme events in the future."

Prather's group showed during CalWater1 that aerosols can have competing effects, depending on their source. Intercontinental mineral dust and biological particles possibly from the ocean corresponded to events with more precipitation, while aerosols produced by local air pollution correlated with less precipitation.

The CalWater 2015 campaign comprises two interdependent efforts. Major investments in facilities, including aircraft, ship time, and sensors, come from NOAA. Marty Ralph, Kim Prather, and Dan Cayan from Scripps, and Chris Fairall, Ryan Spackman, and Allen White of NOAA lead CalWater-2. The DOE-funded ARM Cloud Aerosol Precipitation Experiment (ACAPEX) is led by Ruby Leung from PNNL. NSF and NASA have also provided major support for aspects of CalWater, leveraging the NOAA and DOE investments.


View the original article here

Monday, February 2, 2015

Hurricane Sandy increased incidence of heart attacks, stroke in hardest-hit New Jersey counties

Heart attacks and strokes are more likely to occur during extreme weather and natural disasters such as earthquakes and floods. Researchers at the Cardiovascular Institute of New Jersey at Rutgers Robert Wood Johnson Medical School have found evidence that Hurricane Sandy, commonly referred to as a superstorm, had a significant effect on cardiovascular events, including myocardial infarction (heart attack) and stroke, in the high-impact areas of New Jersey in the two weeks following the 2012 storm. The study, led by Joel N. Swerdel, MS, MPH, an epidemiologist at the Cardiovascular Institute and the Rutgers School of Public Health, was published in the Journal of the American Heart Association.

Utilizing the Myocardial Infarction Data Acquisition System (MIDAS), the researchers examined changes in the incidence of and mortality from myocardial infarctions and strokes from 2007 to 2012 for two weeks prior to and two weeks after October 29, the date of Hurricane Sandy. MIDAS is an administrative database containing hospital records of all patients discharged from non-federal hospitals in New Jersey with a cardiovascular disease diagnosis or invasive cardiovascular procedure.

In the two weeks following Hurricane Sandy, the researchers found that in the eight counties determined to be high-impact areas, there was a 22 percent increase in heart attacks as compared with the same time period in the previous five years. In the low-impact areas (the remaining 13 counties), the increase was less than one percent. Thirty-day mortality from heart attacks also increased, by 31 percent, in the high-impact areas.
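
The comparison behind those percentages is straightforward: event counts in the two weeks after landfall against the same calendar window averaged over the five prior years. A minimal sketch, with invented counts:

```python
# Sketch: post-storm event count vs. a five-year baseline for the same
# two-week calendar window. The counts below are invented.
import numpy as np

baseline = np.array([480, 495, 510, 502, 513])  # same window, 2007-2011
post_sandy = 610                                # two weeks after Oct 29, 2012

expected = baseline.mean()
pct_increase = 100 * (post_sandy - expected) / expected
print(f"Expected: {expected:.0f}, observed: {post_sandy}")
print(f"Increase: {pct_increase:.0f}%")
```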

"We estimate that there were 69 more deaths from myocardial infarction during the two weeks following Sandy than would have been expected. This is a significant increase over typical non-emergency periods," said Swerdel. "Our hope is that the research may be used by the medical community, particularly emergency medical services, to prepare for the change in volume and severity of health incidents during extreme weather events."

In regard to stroke, the investigators found a 7 percent increase in the hardest-hit areas of the state compared with the same time period in the prior five years. There was no change in the incidence of stroke in low-impact areas. There also was no change in the rate of 30-day mortality due to stroke in either the high- or low-impact areas.

"Hurricane Sandy had unprecedented environmental, financial and health consequences on New Jersey and its residents, all factors that can increase the risk of cardiovascular events," said John B. Kostis, MD, director of the Cardiovascular Institute of New Jersey and associate dean for cardiovascular research at Rutgers Robert Wood Johnson Medical School. "Increased stress and physical activity, dehydration and a decreased attention or ability to manage one's own medical needs probably caused cardiovascular events during natural disasters or extreme weather. Also, the disruption of communication services, power outages, gas shortages, and road closures, also were contributing factors to efficiently obtaining medical care."

Journal Reference:

J. N. Swerdel, T. M. Janevic, N. M. Cosgrove, J. B. Kostis. The Effect of Hurricane Sandy on Cardiovascular Events in New Jersey. Journal of the American Heart Association, 2014; 3 (6): e001354 DOI: 10.1161/JAHA.114.001354

View the original article here

Sunday, February 1, 2015

Summer no sweat for Aussies but winter freeze fatal

Australians are more likely to die during unseasonably cold winters than during hotter-than-average summers, QUT research has found.

Across the country, severe winters that are colder and drier than normal pose a far bigger risk to health than sweltering summers that are hotter than average.

QUT Associate Professor Adrian Barnett, a statistician with the Institute of Health and Biomedical Innovation and the lead researcher of the study, said death rates in Australian cities were up to 30 per cent higher in winter than summer.

The researchers analyzed temperature, humidity and mortality data from 1988 to 2009 for Adelaide, Brisbane, Melbourne, Perth and Sydney.

Professor Barnett said the finding that hotter or more humid summers had no effect on mortality was "surprising."

"We know that heatwaves kill people in the short-term, but our study did not find any link between hotter summers and higher deaths," he said.

"The increase in deaths during colder winter could be because Australians are well-prepared for whatever summer throws at them, but are less able to cope with cold weather. There isn't the same focus on preparing for cold weather as there is for hot weather, for example through public health campaigns or even wearing the right sort of clothes.

"The strongest increase in deaths during a colder winter was in Brisbane, the city with the warmest climate, with an extra 59 deaths a month on average for a one degree decrease in mean winter temperature."

"Brisbane has the mildest winter of the five cities but has the greatest vulnerability. We believe this is because most homes are designed to lose heat in summer, which also allows cold outdoor air to get inside during winter."

Professor Barnett said the findings of the study, published in the journal Environmental Research, could trigger more prevention programs to help reduce the future burden on the health system.

"Excess winter deaths have a significant impact on health systems across Australia," he said.

"There are extra demands on doctors, hospitals and emergency departments in winter months, especially for cardiovascular and respiratory diseases which are triggered by exposure to cold weather.

"Our findings show the winter increases in mortality are predictable so ramping up public health measures, such as influenza vaccinations and insulating homes, particularly for vulnerable groups, should be considered to try to reduce the impact of severe winters."

Journal Reference:

Cunrui Huang, Cordia Chu, Xiaoming Wang, Adrian G. Barnett. Unusually cold and dry winters increase mortality in Australia. Environmental Research, 2015; 136: 1 DOI: 10.1016/j.envres.2014.08.046

View the original article here

Saturday, January 31, 2015

Average temperature in Finland has risen by more than two degrees

Over the past 166 years, the average temperature in Finland has risen by more than two degrees. During the observation period, the average increase was 0.14 degrees per decade, which is nearly twice as much as the global average.

According to a recent University of Eastern Finland and Finnish Meteorological Institute study, the rise in temperature has been especially fast over the past 40 years, with the temperature rising by more than 0.2 degrees per decade.

"The biggest temperature rise has coincided with November, December and January. Temperatures have also risen faster than the annual average in the spring months, i.e., March, April and May. In the summer months, however, the temperature rise has not been as significant," says Professor Ari Laaksonen of the University of Eastern Finland and the Finnish Meteorological Institute.

As a result of the rising temperature, lakes in Finland get their ice cover later than before, and the ice cover also melts away earlier in the spring. Although the temperature rise during the actual growing season has been moderate, Finnish trees have been observed to begin blossoming earlier than before.

Temperature has risen in leaps

The annual average temperature has risen in two phases, the first lasting from the beginning of the observation period to the late 1930s, and the second from the late 1960s to the present. Since the late 1960s, the temperature has risen faster than ever before, with the rise varying between 0.2 and 0.4 degrees per decade. Between the late 1930s and the late 1960s, the temperature remained nearly steady. "The pause in the temperature rise can be explained by several factors, including long-term changes in solar activity and the post-World War II growth of human-derived aerosols in the atmosphere. When looking at recent years' observations from Finland, it seems that the temperature rise is not slowing down," University of Eastern Finland researcher Santtu Mikkonen explains.

The temperature time series was created by averaging the data produced by all Finnish weather stations across the country. Furthermore, as the Finnish weather station network was not comprehensive nationwide in the early years, data obtained from measurement stations in Finland's neighbouring countries were also used.

Finland is located between the Atlantic Ocean and continental Eurasia, which causes great variability in the country's weather. In the time series of the average temperature, this is visible as strong noise, which makes it very challenging to detect statistically significant trends. The temperature time series for Finland was therefore analysed using a dynamic regression model. The method decomposes the time series into components representing mean changes (trends), periodic variation, observation inter-dependence and noise, and it makes it possible to take into consideration the seasonal changes typical of Nordic conditions, as well as significant annual variation.
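
In the same spirit, a structural time-series model can separate a noisy monthly series into trend, seasonal cycle and noise. The sketch below uses the statsmodels UnobservedComponents model on synthetic data; the published analysis used its own dynamic regression formulation, so treat this only as an analogue.

```python
# Sketch: decompose a noisy monthly temperature series into
# trend + annual cycle + noise. Synthetic data, illustrative only.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(7)
n = 166 * 12                                  # monthly values, ~1847-2013
t = np.arange(n)
series = (0.014 / 12) * t \
    + 10.0 * np.sin(2 * np.pi * t / 12) \
    + rng.normal(0, 3, size=n)                # trend + seasonality + heavy noise

model = sm.tsa.UnobservedComponents(
    series,
    level="local linear trend",               # slowly varying mean
    seasonal=12,                              # annual cycle
)
result = model.fit(disp=False)
trend = result.level["smoothed"]              # extracted long-term signal
print(f"Estimated total rise: {trend[-1] - trend[0]:.2f} degrees")
```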

Journal Reference:

S. Mikkonen, M. Laine, H. M. Mäkelä, H. Gregow, H. Tuomenvirta, M. Lahtinen, A. Laaksonen. Trends in the average temperature in Finland, 1847–2013. Stochastic Environmental Research and Risk Assessment, 2014; DOI: 10.1007/s00477-014-0992-2

View the original article here

Friday, January 30, 2015

Human influence important factor in possible global and UK temperature records

Early figures from the University of East Anglia (UEA) show 2014 is on course to be one of the warmest years on record, if not the warmest, both globally and for the UK.

Recent research from the Met Office suggests breaking the existing global and UK temperature records is much more likely due to human influence on the climate.

Early figures suggest global record possible

The global mean temperature for January to October based on the HadCRUT4 dataset (compiled by the Met Office and UEA's Climatic Research Unit) is 0.57 °C (±0.1) above the long-term (1961-1990) average. This is consistent with the statement from the World Meteorological Organization (WMO) today.

With two months of data still to add, the full-year figure could change, but presently 2014 is just ahead of the current record of 0.56 °C set in 2010 in the global series, which dates back to 1850. The final value for this year will be very close to the central estimate of 0.57 °C from the Met Office global temperature forecast for 2014, which was issued late last year.

Colin Morice, a climate monitoring scientist at the Met Office, said: "Record or near-record years are interesting, but the ranking of individual years should be treated with some caution because the uncertainties in the data are larger than the differences between the top ranked years. We can say this year will add to the set of near-record temperatures we have seen over the last decade."

UK's run of warm months makes record likely

The UK's mean temperature from 1 January to 25 November is 1.6 °C above the long-term (1961-1990) average, which means this year is currently the warmest in the UK series dating back to 1910. This would beat the record of 1.4 °C set in 2006, but a cold December could change the final ranking for this year.

This year is also set to be one of the warmest on record in the Central England Temperature (CET) series, which goes back to 1659 and is the longest instrumental temperature series in the world.

Interestingly, while all months this year except August have seen above-average temperatures in the UK, no single month has set a temperature record. Instead, the year has been consistently warm.

Phil Jones, research director of UEA's Climatic Research Unit, said: "Spatially, 2014 has so far been warmer than the 1961-1990 average almost everywhere, the main exception being central and eastern parts of North America. For Europe, many countries in northern and eastern parts will likely have had near-record warm years."

CRU climate scientist Prof Tim Osborn said: "The last decade has been the warmest period in our 165-year-long record, yet during this decade there has been no clear warming at the Earth's surface. Coming at the end of this warm decade, record warmth in 2014 would be of significant interest but one year isn't enough to end the warming pause."

Human influence a likely factor

One warm year does not necessarily say anything about long-term climate change -- these trends need to be looked at over longer timescales of several decades.

However, new research techniques developed by the Met Office allow for rapid assessment of how human influence might have affected the chances of breaking temperature records.

This technique, known as an attribution study, uses climate models and observations to see how likely an event would be in the real world and in a world without human greenhouse gas emissions -- enabling assessment of how human influence has altered the chances of an event.
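
The arithmetic at the heart of such an attribution study is a comparison of exceedance probabilities between two model ensembles. A minimal sketch, with invented ensemble values rather than Met Office output:

```python
# Sketch: risk ratio and fraction of attributable risk (FAR) from two
# ensembles -- one natural-only, one with human forcing. Synthetic values.
import numpy as np

rng = np.random.default_rng(3)
natural = rng.normal(14.0, 0.18, size=10000)  # simulated annual means, natural only
actual = rng.normal(14.5, 0.18, size=10000)   # simulated annual means, all forcings

threshold = 14.55                             # e.g., a record-warm observed year

p_nat = (natural > threshold).mean()
p_act = (actual > threshold).mean()
print(f"P(exceed | natural only):    {p_nat:.4f}")
print(f"P(exceed | human influence): {p_act:.4f}")
print(f"Risk ratio: {p_act / max(p_nat, 1e-9):.0f}x")
print(f"FAR: {1 - p_nat / max(p_act, 1e-9):.3f}")
```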

Peter Stott, Head of Climate Attribution at the Met Office, said: "Our research shows current global average temperatures are highly unlikely in a world without human influence on the climate.

"Human influence has also made breaking the current UK temperature record about ten times more likely."

A wet year for the UK, but not a record

This is also set to be a notably wet year for the UK, with 1162 mm of rain between 1 January and 25 November.

If we saw average rainfall for the rest of the year, 2014 would rank as the 4th wettest year in the UK records dating back to 1910. It would also be 11th in the longer-running England and Wales precipitation series, which dates back to 1766.

However, if we do have a very wet December, this year could still break the UK record of 1337 mm set in 2000.

Due to the large amount of variability in UK rainfall, it's not yet possible to say whether human influence directly impacted this year's total.


View the original article here

Thursday, January 29, 2015

In the mood to trade? Weather may influence institutional investors' stock decisions

Weather changes may affect how institutional investors decide on stock plays, according to a new study by a team of finance researchers. Their findings suggest sunny skies put professional investors more in a mood to buy, while cloudy conditions tend to discourage stock purchases.

The researchers conclude that cloudier days increase the perception that individual stocks and the Dow Jones Industrials are overpriced, increasing the inclination for institutions to sell.

The research paper, "Weather-Induced Mood, Institutional Investors, and Stock Returns," has been published in the January 2015 issue of The Review of Financial Studies. The research was collaborated by Case Western Reserve University's Dasol Kim and three other finance professors (William Goetzmann of Yale University, Alok Kumar of University of Miami and Qin Wang of University of Michigan-Dearborn).

Institutional investors represent large organizations, such as banks, mutual funds, labor union funds and finance or insurance companies that make substantial investments in stocks. Kim said the results of the study are surprising, given that professional investors are well regarded for their financial sophistication.

"We focus on institutional investors because of the important role they have in how stock prices are formed in the markets," said Kim, assistant professor of banking and finance at Case Western Reserve's Weatherhead School of Management. "Other studies have already shown that ordinary retail investors are susceptible to psychological biases in their investment decisions. Trying to evaluate similar questions for institutional investors is challenging, because relevant data is hard to come by."

Building on previous findings from psychological studies about the effect of sunshine on mood, the researchers wanted to learn how mood affects professional investor opinions on their stock market investments.

By linking responses to a survey of investors from the Yale Investor Behavior Project of Nobel Prize-winning economist Robert Shiller, together with institutional stock trade data, to historical weather data from the National Oceanic and Atmospheric Administration, the researchers concluded that weather sunnier than the seasonal norm leads to optimistic responses and a willingness to buy.

The research accounts for differences in weather across regions of the country and across seasons. The authors show that these documented mood effects also influence stock prices, and that the observed impact does not persist for long periods of time.
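
Conceptually, the test reduces to regressing a measure of institutional buying on weather that has been deseasonalized by region. The sketch below runs that regression on simulated data; the variable names and effect size are invented, not the paper's.

```python
# Sketch: daily institutional buy ratio vs. deseasonalized cloud cover.
# Simulated data; a negative slope means cloudier days -> fewer buys.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(11)
n = 2000
cloud_anom = rng.normal(0, 1, size=n)   # cloud cover minus its seasonal norm
buy_ratio = 0.50 - 0.01 * cloud_anom + rng.normal(0, 0.05, size=n)

fit = sm.OLS(buy_ratio, sm.add_constant(cloud_anom)).fit()
print(f"Slope: {fit.params[1]:+.4f} (t = {fit.tvalues[1]:.2f})")
```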

A summary of the research was also recently featured at The Harvard Law School Forum on Corporate Governance and Financial Regulation.

Journal Reference:

W. N. Goetzmann, D. Kim, A. Kumar, Q. Wang. Weather-Induced Mood, Institutional Investors, and Stock Returns. Review of Financial Studies, 2014; 28 (1): 73 DOI: 10.1093/rfs/hhu063

View the original article here

Wednesday, January 28, 2015

Even in restored forests, extreme weather strongly influences wildfire's impacts

The 2013 Rim Fire, the largest wildland fire ever recorded in the Sierra Nevada region, is still fresh in the minds of Californians, as is the urgent need to bring forests back to a more resilient condition. Land managers are using fire as a tool to mimic past fire conditions, restore fire-dependent forests, and reduce fuels in an effort to lessen the potential for large, high-intensity fires, like the Rim Fire. A study led by the U.S. Forest Service's Pacific Southwest Research Station (PSW) and recently published in the journal Forest Ecology and Management examined how the Rim Fire burned through forests with restored fire regimes in Yosemite National Park to determine whether they were as resistant to high-severity fire as many scientists and land managers expected.

Since the late 1960s, land managers in Yosemite National Park have used prescribed fire and let lower-intensity wildland fires burn in an attempt to bring back historical fire regimes after decades of fire suppression. For this study, researchers seized a unique opportunity to study data on forest structure and fuels collected in 2009 and 2010 in Yosemite's old-growth, mixed-conifer forests that had previously burned at low to moderate severity. Using post-Rim Fire data and imagery, researchers found that areas that burned on days when the Rim Fire was dominated by a large pyro-convective plume -- a powerful column of smoke, gases, ash, and other debris -- burned at moderate to high severity regardless of the number of prior fires, topography, or forest conditions.

"The specific conditions leading to large plume formation are unknown, but what is clear from many observations is that these plumes are associated with extreme burning conditions," says Jamie Lydersen, PSW biological science technician and the study's lead author. "Plumes often form when atmospheric conditions are unstable, and result in erratic fire behavior driven by its own local effect on surface wind and temperatures that override the influence of more generalized climate factors measured at nearby weather stations."

When the extreme conditions caused by these plumes subsided during the Rim Fire, other factors influenced burn severity. "There was a strong influence of elapsed time since the last burn, where forests that experienced fire within the last 14 years burned mainly at low severity in the Rim Fire. Lower elevation areas and those with greater shrub cover tended to burn at higher severity," says Lydersen.
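
One way to formalize those relationships is a logistic regression of burn severity on the predictors the study highlights: time since last fire, elevation and shrub cover. The sketch below does this on simulated plots; the coefficients are chosen to mimic the reported directions of effect, not estimated from the real data.

```python
# Sketch: odds of high-severity burn vs. fire history and site conditions.
# Simulated data with effects in the directions the study reports.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(5)
n = 500
years_since_fire = rng.uniform(1, 80, size=n)
elevation_km = rng.uniform(1.2, 2.4, size=n)
shrub_cover = rng.uniform(0.0, 0.6, size=n)

logit = (-2.0 + 0.04 * years_since_fire
         - 1.0 * (elevation_km - 1.2) + 3.0 * shrub_cover)
high_severity = rng.binomial(1, 1 / (1 + np.exp(-logit)))

X = sm.add_constant(np.column_stack([years_since_fire, elevation_km, shrub_cover]))
fit = sm.Logit(high_severity, X).fit(disp=False)
print(fit.params)  # +time since fire, -elevation, +shrub cover
```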

When driven by extreme weather, which often coincides with wildfires that escape initial containment efforts, fires can severely burn large swaths of forest regardless of ownership and fire history. These fires may only be controlled if more forests across the landscape have been managed for fuel reduction to allow early stage suppression before weather- and fuels-driven fire intensity makes containment impossible. Coordination of fire management activities by land management agencies across jurisdictions could favor burning under more moderate weather conditions when wildfires start and reduce the occurrences of harmful, high-intensity fires.


View the original article here

Tuesday, January 27, 2015

New insights into predicting future droughts in California: Natural cycles, sea surface temperatures found to be main drivers in ongoing event

According to a new NOAA-sponsored study, natural oceanic and atmospheric patterns are the primary drivers behind California's ongoing drought. A high-pressure ridge off the West Coast, typical of historic droughts, prevailed for three winters, blocking important wet-season storms; ocean surface temperature patterns made such a ridge much more likely. Typically, the winter season in California provides the state with a majority of its annual snow and rainfall that replenish water supplies for communities and ecosystems.

Further studies on these oceanic conditions and their effect on California's climate may lead to advances in drought early warning that can help water managers and major industries better prepare for lengthy dry spells in the future.

"It's important to note that California's drought, while extreme, is not an uncommon occurrence for the state. In fact, multi-year droughts appear regularly in the state's climate record, and it's a safe bet that a similar event will happen again. Thus, preparedness is key," said Richard Seager, report lead author and professor with Columbia University's Lamont Doherty Earth Observatory.

This report builds on earlier studies, published in September in the Bulletin of the American Meteorological Society, which found no conclusive evidence linking human-caused climate change and the California drought. The current study notes that the atmospheric ridge over the North Pacific, which has resulted in decreased rain and snowfall since 2011, is almost opposite to what models project to result from human-induced climate change. The report illustrates that mid-winter precipitation is actually projected to increase due to human-induced climate change over most of the state, though warming temperatures may sap much of those benefits for water resources overall, while only spring precipitation is projected to decrease.

The report makes clear that to provide improved drought forecasts for California, scientists will need to fully understand the links between sea surface temperature variations and winter precipitation over the state, discover how these ocean variations are generated, and better characterize their predictability.

This report contributes to a growing field of science, climate attribution, in which teams of scientists aim to identify the sources of observed climate and weather patterns.

"There is immense value in examining the causes of this drought from multiple scientific viewpoints," said Marty Hoerling, report co-author and researcher with NOAA's Earth System Research Laboratory. "It's paramount that we use our collective ability to provide communities and businesses with the environmental intelligence they need to make decisions concerning water resources, which are becoming increasingly strained."

To view the report, visit: http://cpo.noaa.gov/MAPP/californiadroughtreport.


View the original article here

Monday, January 26, 2015

NASA's Fermi Mission brings deeper focus to thunderstorm gamma rays

Each day, thunderstorms around the world produce about a thousand quick bursts of gamma rays, some of the highest-energy light naturally found on Earth. By merging records of events seen by NASA's Fermi Gamma-ray Space Telescope with data from ground-based radar and lightning detectors, scientists have completed the most detailed analysis to date of the types of thunderstorms involved.

"Remarkably, we have found that any thunderstorm can produce gamma rays, even those that appear to be so weak a meteorologist wouldn't look twice at them," said Themis Chronis, who led the research at the University of Alabama in Huntsville (UAH).

The outbursts, called terrestrial gamma-ray flashes (TGFs), were discovered in 1992 by NASA's Compton Gamma-Ray Observatory, which operated until 2000. TGFs occur unpredictably and fleetingly, with durations less than a thousandth of a second, and remain poorly understood.

In late 2012, Fermi scientists employed new techniques that effectively upgraded the satellite's Gamma-ray Burst Monitor (GBM), making it 10 times more sensitive to TGFs and allowing it to record weak events that were overlooked before.

"As a result of our enhanced discovery rate, we were able to show that most TGFs also generate strong bursts of radio waves like those produced by lightning," said Michael Briggs, assistant director of the Center for Space Plasma and Aeronomic Research at UAH and a member of the GBM team.

Previously, TGF positions could be roughly estimated based on Fermi's location at the time of the event. The GBM can detect flashes within about 500 miles (800 kilometers), but this is too imprecise to definitively associate a TGF with a specific storm.

Ground-based lightning networks use radio data to pin down strike locations. The discovery of similar signals from TGFs meant that scientists could use the networks to determine which storms produce gamma-ray flashes, opening the door to a deeper understanding of the meteorology powering these extreme events.

Chronis, Briggs and their colleagues sifted through 2,279 TGFs detected by Fermi's GBM to derive a sample of nearly 900 events accurately located by the Total Lightning Network operated by Earth Networks in Germantown, Maryland, and the World Wide Lightning Location Network, a research collaboration run by the University of Washington in Seattle. These systems can pinpoint the location of lightning discharges -- and the corresponding signals from TGFs -- to within 6 miles (10 km) anywhere on the globe.
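
Matching a TGF to a lightning fix is then a joint constraint in space and time. The sketch below applies a 10 km radius, as described above, plus an assumed few-millisecond coincidence window (the window length here is illustrative); all coordinates and times are invented.

```python
# Sketch: associate a TGF with a lightning-network fix by requiring the
# stroke to fall within 10 km and a short time window. Invented values.
import numpy as np

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometers."""
    lat1, lon1, lat2, lon2 = map(np.radians, (lat1, lon1, lat2, lon2))
    a = (np.sin((lat2 - lat1) / 2) ** 2
         + np.cos(lat1) * np.cos(lat2) * np.sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371.0 * np.arcsin(np.sqrt(a))

tgf_time, tgf_lat, tgf_lon = 1000.000, 27.80, -81.20  # one TGF (s, deg, deg)

strokes = [                    # candidate lightning fixes: (time, lat, lon)
    (999.998, 27.82, -81.17),
    (1000.450, 28.90, -80.10),
    (1000.001, 27.75, -81.25),
]

for t, lat, lon in strokes:
    d = haversine_km(tgf_lat, tgf_lon, lat, lon)
    if abs(t - tgf_time) < 0.005 and d < 10.0:  # 5 ms window, 10 km radius
        print(f"Match: dt = {1e3 * (t - tgf_time):+.1f} ms, distance = {d:.1f} km")
```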

From this group, the team identified 24 TGFs that occurred within areas covered by Next Generation Weather Radar (NEXRAD) sites in Florida, Louisiana, Texas, Puerto Rico and Guam. For eight of these storms, the researchers obtained additional information about atmospheric conditions through sensor data collected by the Department of Atmospheric Science at the University of Wyoming in Laramie.

"All told, this study is our best look yet at TGF-producing storms, and it shows convincingly that storm intensity is not the key," said Chronis, who will present the findings Wed., Dec. 17, in an invited talk at the American Geophysical Union meeting in San Francisco. A paper describing the research has been submitted to the Bulletin of the American Meteorological Society.

Scientists suspect that TGFs arise from strong electric fields near the tops of thunderstorms. Updrafts and downdrafts within the storms force rain, snow and ice to collide and acquire electrical charge. Usually, positive charge accumulates in the upper part of the storm and negative charge accumulates below. When the storm's electrical field becomes so strong it breaks down the insulating properties of air, a lightning discharge occurs.

Under the right conditions, the upper part of an intracloud lightning bolt disrupts the storm's electric field in such a way that an avalanche of electrons surges upward at high speed. When these fast-moving electrons are deflected by air molecules, they emit gamma rays and create a TGF.

About 75 percent of lightning stays within the storm, and about 2,000 of these intracloud discharges occur for each TGF Fermi detects.

The new study confirms previous findings indicating that TGFs tend to occur near the highest parts of a thunderstorm, between about 7 and 9 miles (11 to 14 kilometers) high. "We suspect this isn't the full story," explained Briggs. "Lightning often occurs at lower altitudes and TGFs probably do too, but traveling the greater depth of air weakens the gamma rays so much the GBM can't detect them."

Based on current Fermi statistics, scientists estimate that some 1,100 TGFs occur each day, but the number may be much higher if low-altitude flashes are being missed.

While it is too early to draw conclusions, Chronis notes, there are a few hints that gamma-ray flashes may prefer storm areas where updrafts have weakened and the aging storm has become less organized. "Part of our ongoing research is to track these storms with NEXRAD radar to determine if we can relate TGFs to the thunderstorm life cycle," he said.

Video: https://www.youtube.com/watch?v=JgK4Ds_Sj6Q#t=66


View the original article here