New snow average data changes idea of ‘normal’
February 17, 2013
VAIL – Drought conditions in Colorado might start looking less severe due to a switch in the data set that defines so-called normal, or average, conditions.
In January, the drought update issued by the Water Availability Task Force reported that as of Jan. 22, the state needs 140 percent of normal snowpack accumulation to reach its long-term season peak in early April. But what constitutes “normal” has just changed.
“The Natural Resources Conservation Service uses a 30-year running average that is updated every 10 years. This month marks the transition to the new ‘normal’ period of 1981-2010 for all NRCS products (previous months used the 1971-2000 period),” according to the January report. “NRCS is also transitioning to the use of median rather than average to define normal for all (snow water equivalent) products. Average is still used for precipitation, reservoir and streamflow products. Please keep in mind that this transition will affect the data when presented as a percent of normal.”
Diane Johnson, of the Eagle River Water and Sanitation District, said the change won’t have an effect at the local level because the district uses data that doesn’t go back as far, but the state and river basin reports are going to look different. What was reported as 90 percent of average might now be reported as 100 percent of average, for example.
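The arithmetic behind that shift is simple: the same snowpack measurement divided by a drier baseline yields a higher percent of normal. A minimal sketch, using invented snow water equivalent values rather than actual NRCS station data:

```python
# Hypothetical snow water equivalent values in inches -- illustrative only,
# not real NRCS station data.
current_swe = 13.5
old_normal = 15.0   # assumed wetter 1971-2000 baseline
new_normal = 13.5   # assumed drier 1981-2010 baseline

# The same measurement reads higher against the drier baseline.
pct_old = 100 * current_swe / old_normal   # 90 percent of the old normal
pct_new = 100 * current_swe / new_normal   # 100 percent of the new normal
print(f"{pct_old:.0f}% of old normal vs. {pct_new:.0f}% of new normal")
```

Nothing about the snow on the ground changes; only the yardstick it is measured against does.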
“Any given value is now a higher percent than it used to be,” Johnson said. “People use that phrase (new normal) all the time and now we actually have to shift to accepting these types of conditions would now be considered ‘good,’ for lack of a better word, or ‘average’ at least.”
A dry decade in the 2000s has now replaced a fairly wet decade in the 1970s, so that 30-year average is going to be a drier average – another reason why the Natural Resources Conservation Service is now using a median rather than an average to define normal snow water equivalents.
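The median is less sensitive than the average to a handful of unusually wet or dry years, which is the rationale for the switch. A short illustration with an invented 30-year-style record (the values are made up, not NRCS data):

```python
from statistics import mean, median

# Hypothetical peak snow water equivalent record in inches, with a few
# very wet years mixed in; values are invented for illustration.
swe = [10, 11, 12, 12, 13, 13, 14, 25, 28, 30]

print(mean(swe))    # 16.8 -- pulled upward by the wet outlier years
print(median(swe))  # 13.0 -- closer to what a typical year looks like
```

A few big snow years inflate the average well above what most winters deliver, while the median stays near the middle of the record.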
In a presentation to the task force on Thursday, hydrologist and climate data analyst Jim Marron, of the Natural Resources Conservation Service, reported that datasets are being developed for each river basin that will provide the basin median number for each year.
When looking at the new datasets, the Natural Resources Conservation Service warns of possible flaws.
“Caution is recommended in using the 1971-2000 and 1981-2010 normal sets for climatic comparison for two important reasons. First, SNOTEL (snow telemetry) sites did not exist in 1971. The first major installation of SNOTEL sites occurred in the late 1970s to early 1980s,” according to the Natural Resources Conservation Service. “Therefore, data at SNOTEL sites for the 1970s was estimated in the 1971-2000 normals. Second, the procedures to determine normals for sites that did not have a complete 30-year record were different in the two datasets.”
So while comparing current conditions to the new dataset will certainly be accurate when referring to the new 30-year period, Johnson said the numbers might show that things aren’t that bad when people’s memories are telling them otherwise.
“Last year, July 1 historic median was 230 cfs (cubic feet per second), and now the data for this summer tells me the historic median is 209 cfs,” Johnson said. “So one year dropped it that much.”
Memory is important when talking about water supplies. Even this winter, people aren’t talking about the drought as much as last year because there have been more powder days, Johnson said, even though snowpack conditions aren’t better off. The state is currently at 76 percent of average (based on the new 30-year period), and the Colorado River Basin is at 70 percent of average.
Back-to-back years of below-average snowfall and drought conditions, though, are drawing even more scrutiny to the switch to the new 30-year time frame.
“There’s probably a heightened awareness,” Johnson said. “Because everyone is examining it so much.”