How did the National Weather Service and the weather forecast enterprise do? Can we do better? This blog will provide some analysis and a few recommendations.
Let me say at the outset, a lot went right and this event demonstrates the substantial improvements in weather prediction technology during the past decades.
For several days before, based on a range of numerical forecast models and substantial knowledge of severe thunderstorms, the National Weather Service forecasters at the Norman, Oklahoma NWS office and staff at the NOAA/NWS Storm Prediction Center (SPC) had predicted the threat of severe thunderstorms on Monday over Oklahoma. Here is the graphical outlook for severe convection released on Sunday. Central and eastern Oklahoma was clearly in a high risk area.
And the Storm Prediction Center's discussion highlighted the threat for Monday:
TSTMS MAY DEVELOP BY 20-21Z ACROSS OK ...AND THE
PROSPECT FOR A VERY MOIST WARM SECTOR FAVOR
NUMEROUS SUPERCELL STRUCTURES...VERY LARGE HAIL AND
TORNADOES ARE POSSIBLE WITH SUPERCELLS ...
One of my graduate students, Luke Madaus, happened to be in Oklahoma on Sunday, and several people there commented to him on the potential severe storm threat for the next day.
The next morning it was clear that the threat of severe weather had grown. Many of the numerical models showed the development of strong convection, although their solutions differed considerably in strength and position. The radiosonde (balloon-launched weather instrument) sounding at Norman, Oklahoma showed extraordinary instability, with Convective Available Potential Energy (CAPE) of roughly 5000 joules per kilogram, and plenty of vertical shear. (For perspective, Northwest soundings rarely get above a few hundred; a few thousand is very large; 5000 is extreme.) PLUS, there was a frontal boundary and a dry line that intersected near Norman, Oklahoma. A very, very big threat.
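For readers wondering what a CAPE number actually measures: it is the vertical integral of the buoyancy a rising air parcel would feel, in joules per kilogram. Here is a minimal sketch of that calculation; the sounding values below are made up for illustration and are not the actual Norman observations.

```python
# Simplified CAPE estimate: CAPE = integral of g * (Tv_parcel - Tv_env) / Tv_env dz
# over the layer where the parcel is warmer (more buoyant) than its environment.
# The sounding below is illustrative only, NOT the real Norman, OK data.

G = 9.81  # gravitational acceleration, m/s^2

# (height in m, parcel virtual temperature in K, environment virtual temperature in K)
sounding = [
    (1000, 300.0, 296.0),
    (3000, 292.0, 285.0),
    (6000, 275.0, 266.0),
    (9000, 252.0, 246.0),
    (12000, 228.0, 228.0),  # equilibrium level: parcel no longer buoyant
]

def cape(levels):
    """Trapezoidal integration of positive parcel buoyancy with height."""
    total = 0.0
    for (z1, tp1, te1), (z2, tp2, te2) in zip(levels, levels[1:]):
        b1 = G * max(tp1 - te1, 0.0) / te1  # buoyant acceleration, m/s^2
        b2 = G * max(tp2 - te2, 0.0) / te2
        total += 0.5 * (b1 + b2) * (z2 - z1)
    return total  # J/kg

print(round(cape(sounding)))  # this toy sounding yields a CAPE of a few thousand J/kg
```

A parcel warmer than its surroundings all the way through a deep layer, as on that Monday, integrates to an enormous reservoir of energy for thunderstorm updrafts.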
Our nation is lucky to have the best severe storm forecasters in the world, backed by world-leading research at the National Severe Storms Lab, the University of Oklahoma, the National Center for Atmospheric Research, and many others. They proved themselves on Monday. The 11 AM CDT forecast (communicated via YouTube, among other ways) laid out the threat and even talked about dangers to schools (see image; click on it to see the video).
http://www.youtube.com/watch?feature=player_embedded&v=9Q7iUn9YfWA
You will notice that the warnings of severe thunderstorms were for an area, not a specific location. Our current level of forecasting technology, coupled with the substantial uncertainty and the chaotic nature of convection, made it impossible to do better. But it was an extraordinarily valuable forecast: a major threat was communicated well.
Then the next level of warning technology took over: the U.S. Doppler radar network. The U.S. has invested heavily in state-of-the-art Doppler radars across the country, radars that have recently been upgraded to dual-polarization (which allows the radar to determine the type of precipitation, or the nature of the "targets" it views).
Around 2:30 PM CDT the Doppler radar in Norman, OK observed the classic signs of a rotating, tornadic supercell, including a hook echo and a mesocyclone (an area of rotation 5-10 km wide). The image below (at 3:06 PM CDT) shows the hook echo, with a "debris ball" at the end.
With the radar imagery and reports from spotters, a tornado warning went out at 2:40 PM, 36 minutes before the tornado hit Moore. You may not think that is much, but 36 minutes is far better than the lead times of the days before the modern radar network (the average was about 5 minutes in the 1980s). (Thanks to Mike Smith's blog for an analysis of the lead-time issue.) This is enough time to run the tornado sirens, put out warnings in the media, and give folks a chance to move to safe locations...if there ARE safe locations.
That is one big problem. For an EF-4 or EF-5 tornado the damage can be catastrophic, with buildings either being blown away or suffering severe structural damage. Safety can only be found in specially hardened rooms or enclosures. And such protective spaces were not available to many residents of Moore or to several of the schools. This needs to change.
In short, National Weather Service forecasters did a magnificent job for this event. But could we do better? I believe the answer is yes.
Because the atmosphere is chaotic (meaning small errors in the initial state can have large negative impacts on the subsequent forecast, impacts that grow with time), and because thunderstorm forecasts require very detailed information about the initial environment, it is virtually impossible to predict the details of severe thunderstorms a day or more ahead. And this is not going to change soon. Yes, we can predict that a major threat exists, but we can't get the exact locations or strength of the future storms correct. So the day-ahead forecast will have to be broad-brush.
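The "small errors grow" point can be illustrated with any chaotic system. The toy example below uses the logistic map, a simple chaotic equation that is a stand-in for the atmosphere and not a weather model: two runs that start almost identically end up completely different.

```python
# Two runs of a simple chaotic system (the logistic map), started with
# initial states differing by one part in a million -- a stand-in for a
# tiny observation error. This is a toy illustration, not a weather model.

def logistic(x, steps, r=4.0):
    """Iterate the chaotic logistic map and return the full trajectory."""
    states = [x]
    for _ in range(steps):
        x = r * x * (1.0 - x)
        states.append(x)
    return states

a = logistic(0.400000, 50)
b = logistic(0.400001, 50)  # a one-in-a-million initial difference

# The difference is negligible at first, then explodes.
for step in (0, 10, 25, 50):
    print(step, abs(a[step] - b[step]))
```

This is exactly why a day-ahead forecast can flag a major threat but cannot pin down which neighborhood a supercell will hit.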
But there IS the potential for major forecasting advances in the period from 1 to roughly 6 hours before the storm, if we can run models with enough resolution and can get enough information to describe the initial 3D atmosphere with lots of detail. And we need to run many simulations (called ensemble forecasts) to get a handle on the uncertainties of the forecasts.
What kind of model resolution am I talking about? Probably 1-2 km between grid points, which requires huge computer resources. We need to apply new ensemble-based data assimilation approaches (data assimilation is the technology of using observations to describe the structure of the atmosphere). And this modeling system needs to be updated frequently, at least once per hour.
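The ensemble idea can be sketched in a few lines: run the same forecast many times from slightly perturbed initial states, then use the spread of the outcomes as a measure of uncertainty. The "model" below is a trivial placeholder of my own invention, not the HRRR or any real convection-allowing system.

```python
# Toy ensemble forecast: perturb the initial state, run the same model
# for each member, and summarize with the ensemble mean and spread.
# The "model" is a deliberately simple placeholder, not a real NWP system.
import random
import statistics

random.seed(1)  # reproducible perturbations

def toy_forecast(initial_value):
    """Placeholder 'model': six one-hour steps of damped nonlinear growth."""
    x = initial_value
    for _ in range(6):
        x = x + 0.1 * x * (1 - x / 40.0)
    return x

control = 25.0  # best-guess initial state
members = [toy_forecast(control + random.gauss(0, 0.5))  # perturbed initial states
           for _ in range(20)]                           # 20 ensemble members

print("ensemble mean:   %.2f" % statistics.mean(members))
print("ensemble spread: %.2f" % statistics.stdev(members))
```

A forecaster would read a small spread as high confidence and a large spread as a signal to hedge; real systems do the same thing with millions of grid points instead of one number.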
We also need much more detailed information about the structure of the atmosphere, using innovative new data sources. For example, I have a graduate student, Luke Madaus, who is using pressures from smartphones to improve weather forecasts, and he is planning to test this approach on strong thunderstorms. A potentially huge advance for convective storm forecasting. Unfortunately, the NWS support for this work (through the NWS CSTAR program) was cancelled for lack of funds...a tremendous frustration. One of the weaknesses of the NWS is its inability to support and take advantage of university research.
NOAA and the National Weather Service have been developing an early version of an advanced short-term, high-resolution prediction system (the 3-km grid spacing High-Resolution Rapid Refresh, HRRR), which is run only in research mode because of the lack of sufficiently powerful computers in the NWS. Below is an example of the HRRR forecasts 6 and 3 hours out for 2100 UTC (4 PM CDT): not bad, but not perfect.
Skillful 1-6 hr forecasts are potentially achievable, since the forecasts are short enough that the growth of forecast error is modest. And a few hours' warning of a major storm allows sufficient time to evacuate folks from areas in which severe weather is probable.
To achieve better short-term predictions, more model development is needed, including higher resolution, state-of-the-art data assimilation, and a move to an ensemble approach. With enough research and sufficient computer resources, we can do better. Still, the NWS did very well in this case.