Sunday, July 31, 2016
The National Weather Service Moves to a New Global Model: Will It Do it Right?
This week the National Weather Service (NWS) made an important announcement: its decision for the dynamical core of its new global model, the Geophysical Fluid Dynamics Lab (GFDL) FV-3. At the same time, they turned down the global model developed by the National Center for Atmospheric Research (NCAR) called MPAS (NCAR is the combined entity of the academic research community of the U.S., representing over 100 colleges and universities).
Considering the importance of this decision regarding the main US numerical weather prediction model, this blog will attempt to describe:
- What this means.
- Some issues regarding this decision.
- Some recommendations on the route the NWS must take if it wants to develop a world-class global modeling system.
As I have discussed in several blogs (e.g., here), the current NWS Global Forecast System (GFS) is out-of-date in many ways. It was designed for low-resolution weather prediction and does not scale well on modern supercomputers (which can have tens or hundreds of thousands of processors). GFS physics (the descriptions of physical processes such as convection and clouds/precipitation) are ancient, representing the state of the science 20-30 years ago. The range and amount of data assimilated into GFS forecasts is less than at leading centers (such as the European Center), and the quality control system lags. We can do much better.
72-h cumulative precipitation from the GFS forecast initialized at 1800 UTC 30 July 2016. The "popcorn" look of the precipitation in the mountains is not realistic and reflects the model's inferior, 20-year-old precipitation physics.
The NWS GFS global model is now third or fourth globally, a particular embarrassment since the range and quality of U.S. weather research is by far the greatest in the world. The general U.S. population finally understood the situation when important forecasts by the GFS (e.g., Hurricane Sandy) were clearly inferior to those of the European Center and the UKMET Office. So did NWS/NOAA management and Congress, which provided funds for both a new operational computer and a model replacement.
Replacing the core
The first step in moving to a new modeling system is to replace its guts: the dynamical core. But what is this? A reasonable analog is the chassis of a car, including the frame and engine (see below). No body, no seats, no electronics, no AC or ventilation, no steering column, no gas tank. Although it is perhaps 25% of what makes up a modern car, the chassis serves as the frame on which everything else is mounted and connected. To build the best car, you want a good chassis.
Numerical prediction dynamical cores are like that. They provide the framework of the model: its grid structure, the central atmospheric equations (conservation of mass, momentum, energy, and water substance), and how the model can provide more detail in local areas (e.g., nesting or variable-resolution grid elements).
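To make the chassis analogy concrete, here is a toy one-dimensional finite-volume advection step in Python. This is a hypothetical classroom sketch, not FV-3 code, but it illustrates the kind of update a finite-volume core performs: fluxes through cell faces carry a tracer downwind while conserving its total mass exactly, one of the properties that makes finite-volume schemes attractive.

```python
# Toy 1-D finite-volume advection on a periodic domain -- an illustrative
# sketch only, NOT actual FV-3 code. Constant wind u > 0, upwind fluxes.

def advect_fv(q, u, dx, dt):
    """One upwind finite-volume step: face fluxes update cell-mean values."""
    n = len(q)
    flux = [u * q[i] for i in range(n)]                 # flux through each cell's right face
    return [q[i] - (dt / dx) * (flux[i] - flux[i - 1])  # flux in minus flux out
            for i in range(n)]                          # flux[-1] wraps around (periodic)

# A blob of tracer is carried along by the wind; total mass is conserved.
q = [1.0 if 40 <= i < 60 else 0.0 for i in range(100)]
mass0 = sum(q)
for _ in range(50):
    q = advect_fv(q, u=1.0, dx=1.0, dt=0.5)             # CFL number 0.5: stable
print(abs(sum(q) - mass0) < 1e-9)  # True: the face fluxes telescope, so mass is conserved
```

Real cores do this in three dimensions on a sphere with far more sophisticated flux operators, but the conservation-by-construction idea is the same.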
The dynamical core does not encompass key physical processes, such as boundary layer drag, condensation in clouds, radiation from the sun/clouds/surface, and much more. And the core does not deal with the critical element of data assimilation: taking all of the sources of weather information to create a physically consistent initialization, or starting place, for the forecast.
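For readers unfamiliar with data assimilation, the essential idea can be shown with a single number. Below is a minimal, illustrative analysis update in the optimal-interpolation/Kalman form; an operational system does this for billions of values with flow-dependent error statistics, and nothing here should be read as the NWS's actual code.

```python
# Minimal scalar data-assimilation update (illustrative only).
# The "analysis" blends a model background value with an observation,
# weighting each by how much we trust it (its error variance).

def analysis(background, obs, var_b, var_o):
    gain = var_b / (var_b + var_o)   # larger background error -> trust the obs more
    return background + gain * (obs - background)

# Background temperature 20.0 C, a station observes 22.0 C.
print(analysis(20.0, 22.0, var_b=1.0, var_o=1.0))  # 21.0: equal trust splits the difference
print(analysis(20.0, 22.0, var_b=1.0, var_o=9.0))  # 20.2: a noisy obs moves the analysis little
```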
So a key point is this: although the National Weather Service picked a dynamical core, MOST of the work to develop a state-of-the-science global model for the US is still ahead.
The Decision is Made: GFDL's FV-3
The announcement last week was that the National Weather Service picked the FV-3. In some sense, it was the safe choice, but I believe it wasn't the best one. Let me be clear: FV-3 is far, far better than the current GFS model and the handiwork of a very talented numerical modeler, S.J. Lin. But it was not the best choice, for important reasons. The alternative (MPAS) uses a grid structure that is far better at high resolution (1-4 km grid spacing), which is where global modeling is going in the next ten years. Thus, picking FV-3 was like buying a house that is good for two when you are planning on a large family. FV-3 is far more conventional than MPAS in structure (e.g., it uses quadrilateral grid cells, not the hexagonal elements of MPAS), and its compromises make it faster at the same grid spacing.
But just as important, picking MPAS would have brought the research and NOAA/NWS efforts together, and I believe the lack of national cooperation is one of the key reasons US operational NWP is lagging.
But the decision, good or bad, has been made, and as noted above, the dynamical core is perhaps 25% of a modern modeling system. As Caesar would say, the die has been cast. And if future testing supports the superiority of MPAS, its dynamical core could still be swapped in for FV-3.
So What Should Be Done Now?
Get a plan
The National Weather Service needs a plan to complete the remaining 75% of the new global modeling system and to ensure that it is state-of-the-art and STAYS state-of-the-art.
The trouble? They don't have a plan at this point and need to put one together immediately. Actually, they need TWO plans: a broad strategic plan on where they are going during the next ten years, and a shorter-term implementation plan for the next few years.
Planning has been a singular weakness of the National Weather Service weather forecast modeling efforts, with a tendency to do too little of it, and what they have done has been in-house with inadequate vision.
It is clear that NOAA/NWS needs to open up the planning process to include the research community, the private sector, and others. They have done a bit of this with the NCAR UCACN and UMAC committees, but they need to take the planning process to a completely different level in detail, vision, and inclusion. Perhaps a standing committee that meets several times a year. Let's call it the NOAA Global Model Advisory Committee (NGMAC), with modeling experts from NOAA (ESRL and GFDL), academia, NCAR, the private sector, and other Federal agencies. The committee would recommend a broad strategic plan to NOAA/NWS management, and subcommittees (with additional experts) could build the implementation plans.
Build a National Team
The history of the past few decades is clear: when the National Weather Service develops models in house without the entrainment of the outside community, the models are inevitably not state of the science. But when they partner with others (such as ESRL working with NCAR on the High Resolution Rapid Refresh, HRRR), amazing things happen.
Specifically, NOAA must make the model development a cooperative national effort. Bill Kuo of UCAR, a wise and experienced player in US numerical weather prediction efforts, has suggested a wonderful plan: center the development of the new national modeling effort in UCAR (the University Corporation for Atmospheric Research), with a center of gravity in Boulder, Colorado (the intellectual center of the US weather community). A team of ESRL researchers, NCAR and academic scientists, private sector modelers, and GFDL and NOAA EMC staff could build the system there. The UCAR Developmental Testbed Center (also in Boulder) would provide extensive testing of new model components and could organize community support. ESRL, with considerable personnel in model development and basic physics, is located in Boulder as well.
EMC scientists in DC could work on the new system components, complete pre-operational testing, and take on key tasks such as improving quality control.
NOAA/NWS could continue and expand its research funding of advances that support the new system, and hopefully NSF could help support associated basic research.
Think Unified and Probabilistic
The future of numerical weather prediction is clearly moving towards unified modeling across all scales. That is the way the real atmosphere works. The days of one model for global forecasts and another for ultra-high resolution are numbered, if not over already. Leading groups, such as the UKMET Office, have already moved to a unified approach. NOAA must do the same.
Future prediction will be essentially probabilistic at all scales, with ensembles (running models many times with different initial states and physics) as the foundation. The NWS has generally not given enough emphasis and thought to its ensemble systems, whose size and resolution are insufficient. It has also held back from creating a high-resolution (2-4 km grid spacing) ensemble over the US, something recommended by endless National Academy reports and national workshops. Ensembles need more resources and thought.
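To illustrate what an ensemble buys you, here is a toy Python example using the Lorenz-63 equations, the classic chaotic stand-in for the atmosphere (a teaching sketch, not any NWS product). Twenty members start from nearly identical states; after a while their spread dwarfs the initial perturbations, and that spread is precisely the probabilistic information a single deterministic run cannot provide.

```python
import random

# Toy ensemble forecast with the Lorenz-63 system (illustrative only).

def lorenz_step(x, y, z, dt=0.01, s=10.0, r=28.0, b=8.0 / 3.0):
    """One forward-Euler step of the Lorenz-63 equations."""
    return (x + dt * s * (y - x),
            y + dt * (x * (r - z) - y),
            z + dt * (x * y - b * z))

def forecast(state, steps=1500):
    for _ in range(steps):
        state = lorenz_step(*state)
    return state

random.seed(0)
# 20 members, each with a tiny (0.001-sized) perturbation to the initial state.
members = [forecast((1.0 + 0.001 * random.gauss(0.0, 1.0), 1.0, 1.0))
           for _ in range(20)]
xs = [m[0] for m in members]
mean = sum(xs) / len(xs)
spread = (sum((x - mean) ** 2 for x in xs) / len(xs)) ** 0.5
print(spread > 0.01)  # chaos has amplified the 0.001 perturbations enormously
```

The ensemble mean and spread together give a forecast with an honest uncertainty estimate, which is the whole point of probabilistic prediction.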
US NWP has become risk averse and conservative. It needs to be more forward-leaning and innovative.
The Bottom Line
The National Weather Service made a conservative choice for the core of its new global modeling system, but that is not the end of the world. The U.S. can still build a far better global prediction system IF NOAA/NWS will methodically plan the development of the model and entrain the large and strong US research community. Such strategic planning and inclusive development would be a very different approach from what NOAA has taken in the past. Leaders of NOAA (e.g., Kathryn Sullivan) and the Weather Service (Louis Uccellini, Director, and Bill Lapenta, head of NCEP) say they are ready to follow such an approach, and with their support, U.S. numerical weather prediction could move in a new direction. If U.S. numerical weather prediction stagnates because the NWS follows the path of the past several decades, you will know who to blame.
from Cliff Mass Weather Blog http://ift.tt/2a92euY
Considering the importance of this decision regarding the main US numerical weather prediction model, this blog will attempt to describe:
- What this means.
- Some issues regarding this decision.
- Some recomendations on the route the NWS must take if it wants to develop a world class global modeling system.
As I have discussed in several blogs (e.g., here), the current NWS Global Forecast System (GFS) is out-of-date in many ways. It was designed for low-resolution weather prediction and does not scale well on modern supercomputers (which can have tens or hundreds of thousands of processors). GFS physics (the descriptions of physical processes such as convection and clouds/precipitation) are ancient, representation the state of the science 20-30 years ago. The range and amount of data assimilated into GFS forecasts is less than leading centers (such as the European Center) and the quality control system lags. We can do much better.
72-h cumulative precipitation from the GFS forecast initialized at 1800 UTC 30 July 2016. The "popcorn" look to the precipitation in mountains is not realistic and reflects inferior, 20-year old precipitation physics in the model.
The NWS GFS global model is now third or fourth globally, a particular embarrassment since the range and quality of US. weather research is by far the greatest in the world. The general U.S. population finally understood the situation when important forecasts by the GFS (e.g., Hurricane Sandy) were clearly inferior to those of the European Center and UKMET office. So does NWS/ NOAA management and Congress, which provided funds for both a new operational computer and model replacement.
Replacing the core
The first step in moving to a new modeling system is to replace its guts: the dynamical core. But what is this? A reasonable analog is the chassis of a a car, including the frame and engine (see below). No body, no seats, no electronics, no AC or ventilation, no steering column, no gas tank. Although it is perhaps 25% of what makes up a modern car, the chassis does serve as the frame on which everything else is mounted and connected. To build the best car, you want a good chassis.
Numerical prediction dynamical cores are like that. They provide the framework of the model: its grid structure, the central atmospheric equations (conservation of mass, momentum, energy, and water substance), how the model can provide more detail in local areas (e.g., nesting or variable resolution grid elements).
The dynamical core does not take encompass key physical processes, such as boundary layer drag, condensation in clouds, radiation from the sun/clouds/surface, and much more. And the core does not deal with the critical element of data assimilation, taking all of the sources of weather information to create a physically consistent initialization, or starting place, for the forecast.
So a key point is this: although the National Weather Service picked a dynamical core, MOST of the work to develop a state of science global model for the US is still ahead.
The Decision is Made: GFDL's FV-3
The announcement last week was that the National Weather Service picked the FV-3. In some sense, it was the safe choice, but I believe it wasn't the best choice. Let me be clear: FV-3 is far, far better than the current GFS model and the handiwork of a very talented numerical modeler, S.J. Lin. But I believe it was not the best choice for important reasons. The alternative (MPAS) uses a grid structure that is far better at high resolution (like 1-4 km grid spacing), which is where global modeling is going in the next ten years. Thus, picking FV-3 was like buying a house that is good for two, when you are planning on a large family. FV-3 is far more conventional that MPAS in structure (e.g., it has grids, not the hexagonal elements of MPAS) and its compromises make it faster at the same grid spacing.
The NWS might have been better off with a house more suitable for the future
But just as important, picking MPAS would have brought the research and NOAA/NWS efforts together, and I believe the lack of nation cooperation is one of the key reasons US operational NWP is lagging.
But the decision, good or bad, has been made, and as noted above, the dynamical core is perhaps 25% of a modern modeling system. As Caesar would say, the die has been cast. And if future testing supports the superiority of MPAS, its dynamical core could be switched for FV-3.
So What Should Be Done Now?
Get a plan
The National Weather Service needs a plan to complete the remaining 75% of the new global modeling system and to ensure that it is state-of-the-art and STAYS state-of-the-art.
The trouble? They don't have a plan at this point and need to put it together immediately. Actually, they need TWO plans: a broad strategic plan on where they are going during the next ten years, and a shorter term implementation plan for the next few years.
Planning has been a singular weakness of the National Weather Service weather forecast modeling efforts, with a tendency to do too little of it, and what they have done has been in-house with inadequate vision.
It is clear that NOAA/NWS needs to open up the planning process to include the research community, the private sector, and others. They have done a bit of this with the NCAR UCACN and UMAC committees, but they need to take the planning process to a completely different level in detail, vision, and inclusion. Perhaps, a standing committee that mest several times a year. Let's call it the NOAA Global Model Advisory Committee (NGMAC), with modeling experts from NOAA (ESRL and GFDL), academia, NCAR, the private sector, and other Federal agencies. The committee would recommend a broad strategic plan to NOAA/NWS managemen,t and subcommittees (with additional experts) could build the implementation plans.
Build a National Team
The history of the past few decades is clear: when the National Weather Service develops models in house without the entrainment of the outside community, the models are inevitably not state of the science. But when they partner with others (such as ESRL working with NCAR on the High Resolution Rapid Refresh, HRRR), amazing things happen.
Specifically, NOAA must make the model development a cooperative national effort. Bill Kuo of UCAR , a wise and experienced player in US numerical weather prediction efforts, has suggested a wonderful plan. Center the development of the new national modeling effort in UCAR (the University Corporation for Atmospheric Research), with a center of gravity in Boulder, Colorado (the intellectual center of the US weather community). A team of ESRL researchers, NCAR and academic scientists, private sector modelers, GFDL and NOAA EMC staff) could build the system there. The UCAR Developmental Testbed Center (also in Boulder ) would provide extensive testing of new model components and could organize community support. ESRL, with considerable personnel in model development and basic physics, is found in Boulder.
EMC scientists in DC could work on the new system components, complete pre-operational testing, and take on key tasks such as improving quality control.
NOAA/NWS could continue and expand its research funding of advances that support the new system and hopefully NSF could help support associated basic research.
Think Unified and Probabilistic.
The future of numerical weather prediction is clearly moving towards unified modeling across all scales. That is the way the real atmosphere works. The days of one model for global forecasts and another at ultra-high resolution are numbered, if not over all ready. Leading groups, such as the UKMET Office, have already moved to a unified approach. NOAA must do the same.
Future prediction will be essentially probabilistic at all scales, with ensembles (running models many times with different initial states and physics) being the foundation. The NWS has generally not given enough emphasis and thought to their ensemble systems, with their size and resolution being insufficient. They have also held back from creating a high-resolution (2-4 km grid spacing) over the US, something recommended by endless National Academy reports and national workshops. Ensembles need more resources and thought.
US NWP has become risk adverse and conservative. It needs to be more forward-leaning and innovative.
The Bottom Line
The National Weather Service made a conservative choice for the core of its new global modeling system, but that is not the end of the world. The U.S. can still build a far better global prediction system IF NOAA/NWS will methodically plan the development of the model and entrain the large and strong US research community. Such strategic planning and inclusive development would be a very different approach than NOAA has done in the past. Leaders of NOAA (e.g., Kathryn Sullivan) and the Weather Service (Louis Uccellini, Director, and Bill Lapenta, had of NCEP) say they are ready to follow such an approach, and with their support, U.S. numerical weather prediction could move in a new direction. If U.S. numerical weather prediction stagnates because the NWS follows the path of the past several decades, you will know who to blame.
from Cliff Mass Weather Blog http://ift.tt/2a92euY
The National Weather Service Moves to a New Global Model: Will It Do it Right?
This week the National Weather Service (NWS) made an important announcement: its decision for the dynamical core of its new global model, the Geophysical Fluid Dynamics Lab (GFDL) FV-3. At the same time, they turned down the global model developed by the National Center for Atmospheric Research (NCAR) called MPAS (NCAR is the combined entity of the academic research community of the U.S., representing over 100 college and universities).
Considering the importance of this decision regarding the main US numerical weather prediction model, this blog will attempt to describe:
As I have discussed in several blogs (e.g., here), the current NWS Global Forecast System (GFS) is out-of-date in many ways. It was designed for low-resolution weather prediction and does not scale well on modern supercomputers (which can have tens or hundreds of thousands of processors). GFS physics (the descriptions of physical processes such as convection and clouds/precipitation) are ancient, representation the state of the science 20-30 years ago. The range and amount of data assimilated into GFS forecasts is less than leading centers (such as the European Center) and the quality control system lags. We can do much better.
The NWS GFS global model is now third or fourth globally, a particular embarrassment since the range and quality of US. weather research is by far the greatest in the world. The general U.S. population finally understood the situation when important forecasts by the GFS (e.g., Hurricane Sandy) were clearly inferior to those of the European Center and UKMET office. So does NWS/ NOAA management and Congress, which provided funds for both a new operational computer and model replacement.
Replacing the core
The first step in moving to a new modeling system is to replace its guts: the dynamical core. But what is this? A reasonable analog is the chassis of a a car, including the frame and engine (see below). No body, no seats, no electronics, no AC or ventilation, no steering column, no gas tank. Although it is perhaps 25% of what makes up a modern car, the chassis does serve as the frame on which everything else is mounted and connected. To build the best car, you want a good chassis.
Numerical prediction dynamical cores are like that. They provide the framework of the model: its grid structure, the central atmospheric equations (conservation of mass, momentum, energy, and water substance), how the model can provide more detail in local areas (e.g., nesting or variable resolution grid elements).
The dynamical core does not take encompass key physical processes, such as boundary layer drag, condensation in clouds, radiation from the sun/clouds/surface, and much more. And the core does not deal with the critical element of data assimilation, taking all of the sources of weather information to create a physically consistent initialization, or starting place, for the forecast.
So a key point is this: although the National Weather Service picked a dynamical core, MOST of the work to develop a state of science global model for the US is still ahead.
The Decision is Made: GFDL's FV-3
The announcement last week was that the National Weather Service picked the FV-3. In some sense, it was the safe choice, but I believe it wasn't the best choice. Let me be clear: FV-3 is far, far better than the current GFS model and the handiwork of a very talented numerical modeler, S.J. Lin. But I believe it was not the best choice for important reasons. The alternative (MPAS) uses a grid structure that is far better at high resolution (like 1-4 km grid spacing), which is where global modeling is going in the next ten years. Thus, picking FV-3 was like buying a house that is good for two, when you are planning on a large family. FV-3 is far more conventional that MPAS in structure (e.g., it has grids, not the hexagonal elements of MPAS) and its compromises make it faster at the same grid spacing.
But just as important, picking MPAS would have brought the research and NOAA/NWS efforts together, and I believe the lack of nation cooperation is one of the key reasons US operational NWP is lagging.
But the decision, good or bad, has been made, and as noted above, the dynamical core is perhaps 25% of a modern modeling system. As Caesar would say, the die has been cast. And if future testing supports the superiority of MPAS, its dynamical core could be switched for FV-3.
So What Should Be Done Now?
Get a plan
The National Weather Service needs a plan to complete the remaining 75% of the new global modeling system and to ensure that it is state-of-the-art and STAYS state-of-the-art.
The trouble? They don't have a plan at this point and need to put it together immediately. Actually, they need TWO plans: a broad strategic plan on where they are going during the next ten years, and a shorter term implementation plan for the next few years.
Planning has been a singular weakness of the National Weather Service weather forecast modeling efforts, with a tendency to do too little of it, and what they have done has been in-house with inadequate vision.
It is clear that NOAA/NWS needs to open up the planning process to include the research community, the private sector, and others. They have done a bit of this with the NCAR UCACN and UMAC committees, but they need to take the planning process to a completely different level in detail, vision, and inclusion. Perhaps, a standing committee that mest several times a year. Let's call it the NOAA Global Model Advisory Committee (NGMAC), with modeling experts from NOAA (ESRL and GFDL), academia, NCAR, the private sector, and other Federal agencies. The committee would recommend a broad strategic plan to NOAA/NWS managemen,t and subcommittees (with additional experts) could build the implementation plans.
Build a National Team
The history of the past few decades is clear: when the National Weather Service develops models in house without the entrainment of the outside community, the models are inevitably not state of the science. But when they partner with others (such as ESRL working with NCAR on the High Resolution Rapid Refresh, HRRR), amazing things happen.
Specifically, NOAA must make the model development a cooperative national effort. Bill Kuo of UCAR , a wise and experienced player in US numerical weather prediction efforts, has suggested a wonderful plan. Center the development of the new national modeling effort in UCAR (the University Corporation for Atmospheric Research), with a center of gravity in Boulder, Colorado (the intellectual center of the US weather community). A team of ESRL researchers, NCAR and academic scientists, private sector modelers, GFDL and NOAA EMC staff) could build the system there. The UCAR Developmental Testbed Center (also in Boulder ) would provide extensive testing of new model components and could organize community support. ESRL, with considerable personnel in model development and basic physics, is found in Boulder.
EMC scientists in DC could work on the new system components, complete pre-operational testing, and take on key tasks such as improving quality control.
NOAA/NWS could continue and expand its research funding of advances that support the new system and hopefully NSF could help support associated basic research.
Think Unified and Probabilistic.
The future of numerical weather prediction is clearly moving towards unified modeling across all scales. That is the way the real atmosphere works. The days of one model for global forecasts and another at ultra-high resolution are numbered, if not over all ready. Leading groups, such as the UKMET Office, have already moved to a unified approach. NOAA must do the same.
Future prediction will be essentially probabilistic at all scales, with ensembles (running models many times with different initial states and physics) being the foundation. The NWS has generally not given enough emphasis and thought to their ensemble systems, with their size and resolution being insufficient. They have also held back from creating a high-resolution (2-4 km grid spacing) over the US, something recommended by endless National Academy reports and national workshops. Ensembles need more resources and thought.
US NWP has become risk adverse and conservative. It needs to be more forward-leaning and innovative.
The Bottom Line
The National Weather Service made a conservative choice for the core of its new global modeling system, but that is not the end of the world. The U.S. can still build a far better global prediction system IF NOAA/NWS will methodically plan the development of the model and entrain the large and strong US research community. Such strategic planning and inclusive development would be a very different approach than NOAA has done in the past. Leaders of NOAA (e.g., Kathryn Sullivan) and the Weather Service (Louis Uccellini, Director, and Bill Lapenta, had of NCEP) say they are ready to follow such an approach, and with their support, U.S. numerical weather prediction could move in a new direction. If U.S. numerical weather prediction stagnates because the NWS follows the path of the past several decades, you will know who to blame.
from Cliff Mass Weather Blog http://ift.tt/2a92euY
Considering the importance of this decision regarding the main US numerical weather prediction model, this blog will attempt to describe:
- What this means.
- Some issues regarding this decision.
- Some recomendations on the route the NWS must take if it wants to develop a world class global modeling system.
As I have discussed in several blogs (e.g., here), the current NWS Global Forecast System (GFS) is out-of-date in many ways. It was designed for low-resolution weather prediction and does not scale well on modern supercomputers (which can have tens or hundreds of thousands of processors). GFS physics (the descriptions of physical processes such as convection and clouds/precipitation) are ancient, representation the state of the science 20-30 years ago. The range and amount of data assimilated into GFS forecasts is less than leading centers (such as the European Center) and the quality control system lags. We can do much better.
72-h cumulative precipitation from the GFS forecast initialized at 1800 UTC 30 July 2016. The "popcorn" look to the precipitation in mountains is not realistic and reflects inferior, 20-year old precipitation physics in the model.
The NWS GFS global model is now third or fourth globally, a particular embarrassment since the range and quality of US. weather research is by far the greatest in the world. The general U.S. population finally understood the situation when important forecasts by the GFS (e.g., Hurricane Sandy) were clearly inferior to those of the European Center and UKMET office. So does NWS/ NOAA management and Congress, which provided funds for both a new operational computer and model replacement.
Replacing the core
The first step in moving to a new modeling system is to replace its guts: the dynamical core. But what is this? A reasonable analog is the chassis of a a car, including the frame and engine (see below). No body, no seats, no electronics, no AC or ventilation, no steering column, no gas tank. Although it is perhaps 25% of what makes up a modern car, the chassis does serve as the frame on which everything else is mounted and connected. To build the best car, you want a good chassis.
Numerical prediction dynamical cores are like that. They provide the framework of the model: its grid structure, the central atmospheric equations (conservation of mass, momentum, energy, and water substance), how the model can provide more detail in local areas (e.g., nesting or variable resolution grid elements).
The dynamical core does not take encompass key physical processes, such as boundary layer drag, condensation in clouds, radiation from the sun/clouds/surface, and much more. And the core does not deal with the critical element of data assimilation, taking all of the sources of weather information to create a physically consistent initialization, or starting place, for the forecast.
So a key point is this: although the National Weather Service picked a dynamical core, MOST of the work to develop a state of science global model for the US is still ahead.
The Decision is Made: GFDL's FV-3
The announcement last week was that the National Weather Service picked the FV-3. In some sense, it was the safe choice, but I believe it wasn't the best choice. Let me be clear: FV-3 is far, far better than the current GFS model and the handiwork of a very talented numerical modeler, S.J. Lin. But I believe it was not the best choice for important reasons. The alternative (MPAS) uses a grid structure that is far better at high resolution (like 1-4 km grid spacing), which is where global modeling is going in the next ten years. Thus, picking FV-3 was like buying a house that is good for two, when you are planning on a large family. FV-3 is far more conventional that MPAS in structure (e.g., it has grids, not the hexagonal elements of MPAS) and its compromises make it faster at the same grid spacing.
The NWS might have been better off with a house more suitable for the future
But just as important, picking MPAS would have brought the research and NOAA/NWS efforts together, and I believe the lack of nation cooperation is one of the key reasons US operational NWP is lagging.
But the decision, good or bad, has been made, and as noted above, the dynamical core is perhaps 25% of a modern modeling system. As Caesar would say, the die has been cast. And if future testing supports the superiority of MPAS, the FV-3 core could later be swapped out for it.
So What Should Be Done Now?
Get a plan
The National Weather Service needs a plan to complete the remaining 75% of the new global modeling system and to ensure that it is state-of-the-art and STAYS state-of-the-art.
The trouble? They don't have a plan at this point and need to put it together immediately. Actually, they need TWO plans: a broad strategic plan on where they are going during the next ten years, and a shorter term implementation plan for the next few years.
Planning has been a singular weakness of the National Weather Service weather forecast modeling efforts: there is a tendency to do too little of it, and what planning has been done has been in-house, with inadequate vision.
It is clear that NOAA/NWS needs to open up the planning process to include the research community, the private sector, and others. They have done a bit of this with the NCAR UCACN and UMAC committees, but they need to take the planning process to a completely different level in detail, vision, and inclusion. Perhaps a standing committee that meets several times a year. Let's call it the NOAA Global Model Advisory Committee (NGMAC), with modeling experts from NOAA (ESRL and GFDL), academia, NCAR, the private sector, and other Federal agencies. The committee would recommend a broad strategic plan to NOAA/NWS management, and subcommittees (with additional experts) could build the implementation plans.
Build a National Team
The history of the past few decades is clear: when the National Weather Service develops models in house without the entrainment of the outside community, the models are inevitably not state of the science. But when they partner with others (such as ESRL working with NCAR on the High Resolution Rapid Refresh, HRRR), amazing things happen.
Specifically, NOAA must make the model development a cooperative national effort. Bill Kuo of UCAR, a wise and experienced player in US numerical weather prediction efforts, has suggested a wonderful plan: center the development of the new national modeling effort in UCAR (the University Corporation for Atmospheric Research), with a center of gravity in Boulder, Colorado (the intellectual center of the US weather community). A team (ESRL researchers, NCAR and academic scientists, private-sector modelers, and GFDL and NOAA EMC staff) could build the system there. The UCAR Developmental Testbed Center (also in Boulder) would provide extensive testing of new model components and could organize community support. ESRL, with considerable personnel in model development and basic physics, is also in Boulder.
EMC scientists in DC could work on the new system components, complete pre-operational testing, and take on key tasks such as improving quality control.
NOAA/NWS could continue and expand its research funding of advances that support the new system and hopefully NSF could help support associated basic research.
Think Unified and Probabilistic.
The future of numerical weather prediction is clearly moving towards unified modeling across all scales. That is the way the real atmosphere works. The days of one model for global forecasts and another at ultra-high resolution are numbered, if not already over. Leading groups, such as the UKMET Office, have already moved to a unified approach. NOAA must do the same.
Future prediction will be essentially probabilistic at all scales, with ensembles (running models many times with different initial states and physics) as the foundation. The NWS has generally not given enough emphasis and thought to its ensemble systems, whose size and resolution are insufficient. It has also held back from creating a high-resolution (2-4 km grid spacing) ensemble over the US, something recommended by endless National Academy reports and national workshops. Ensembles need more resources and thought.
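The ensemble idea, that many slightly perturbed runs of a chaotic model yield a probability rather than a single answer, can be sketched with a toy chaotic system. Here the Lorenz-63 equations stand in for a real NWP model; the member count, perturbation spread, and threshold are illustrative choices of mine, not operational settings:

```python
import random

def lorenz_step(state, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """One forward-Euler step of the Lorenz-63 system, a classic toy for chaos."""
    x, y, z = state
    return (x + dt * sigma * (y - x),
            y + dt * (x * (rho - z) - y),
            z + dt * (x * y - beta * z))

def ensemble_probability(n_members=50, steps=300, spread=0.1, seed=1):
    """Run members from slightly perturbed initial states and return the
    fraction ending with x > 0 -- a crude probabilistic 'forecast'."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_members):
        # each member starts from a slightly different initial condition
        state = (1.0 + rng.gauss(0.0, spread), 1.0, 1.0)
        for _ in range(steps):
            state = lorenz_step(state)
        if state[0] > 0.0:
            hits += 1
    return hits / n_members
```

Operational ensembles perturb model physics as well as initial states, and their output is calibrated before use, but the counting-of-members logic is the same.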
US NWP has become risk averse and conservative. It needs to be more forward-leaning and innovative.
The Bottom Line
The National Weather Service made a conservative choice for the core of its new global modeling system, but that is not the end of the world. The U.S. can still build a far better global prediction system IF NOAA/NWS will methodically plan the development of the model and entrain the large and strong US research community. Such strategic planning and inclusive development would be a very different approach than NOAA has taken in the past. Leaders of NOAA (e.g., Kathryn Sullivan) and the Weather Service (Louis Uccellini, Director, and Bill Lapenta, head of NCEP) say they are ready to follow such an approach, and with their support, U.S. numerical weather prediction could move in a new direction. If U.S. numerical weather prediction stagnates because the NWS follows the path of the past several decades, you will know who to blame.
from Cliff Mass Weather Blog http://ift.tt/2a92euY
Friday, July 29, 2016
The Upcoming Cool Down
During the past week we have experienced a few warmer-than-average days, although none pushed temperatures into the 90s over the Puget Sound region (see Sea-Tac temperatures below, with average max and min values in red and blue).
The recent warmth has been associated with a broad upper-level ridge of high pressure over the West Coast and eastern Pacific, as illustrated by the upper-level map at 5 PM Thursday below.
However, the hallmark of this summer has been the transient nature of West Coast high pressure, with a tendency for periods in which troughing (low pressure) develops over the region. Such troughing will be happening in spades during the next week, as illustrated by the upper-level map for Saturday at 5 PM: a low is centered over British Columbia, with a trough over the Northwest.
Subsequently, the trough moves out and is replaced by a strong, small-scale low centered over NW Washington (see below for 11 PM on Monday). This will bring both cooler temperatures and some precipitation.
Fast forward to Thursday at 5 PM. Another low is moving into our area.
This time of the year it is hard to get heavy rain even with weak troughing. Thus, the 24-h precipitation forecast ending 4 PM on Tuesday shows some light rain over western WA, but modest wetting over British Columbia.
Turning to the National Weather Service GFS model, here is the accumulated forecast precipitation over the next 10 days. The West Coast is starkly drier than the rest of the continent, with Oregon and coastal CA being dry. In contrast, the East Coast is soaked.
With much cooler and wetter conditions over the Northwest, the number of wildfires has been minimal. With the upcoming cooling, one should expect the continuation of the benign fire season.
from Cliff Mass Weather Blog http://ift.tt/2asLhLw
Wednesday, July 27, 2016
Are Pacific Northwest Summers Becoming More Humid?
A number of people have told me that the Pacific Northwest's summers are getting more humid. Is this true?
Let's examine this issue by looking at trends of dew point, which is probably the best measure of stickiness and unpleasantly moist conditions. Dew point, the temperature at which the air becomes saturated when it is cooled, is a good measure of the amount of water vapor in the air. If there is more moisture, you don't have to cool as much to get saturation. Thus, high dew points mean more water vapor in the atmosphere. When dew point gets into the 60s F, we start to feel uncomfortable. 70s is unpleasant. 80s are oppressive.
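For reference, dew point can be estimated from temperature and relative humidity with the Magnus approximation, a standard formula; this sketch is my own illustration (not from the post), using one commonly published set of coefficients:

```python
import math

def dew_point_c(temp_c, rh_percent):
    """Approximate dew point (deg C) from air temperature (deg C) and
    relative humidity (%) via the Magnus formula."""
    a, b = 17.625, 243.04  # Magnus coefficients (Alduchov & Eskridge, 1996)
    gamma = math.log(rh_percent / 100.0) + a * temp_c / (b + temp_c)
    return b * gamma / (a - gamma)
```

At 86F (30C) and 50% relative humidity, this gives a dew point near 18C (about 65F): right at the edge of "uncomfortable" on the scale above. Note that at 100% relative humidity the formula returns the air temperature itself, as it must, since saturated air needs no further cooling.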
I will begin by showing you the daily dew points over the past 20 years at Seattle Tacoma Airport. You don't need to be a meteorologist to see there is no apparent trend, either up or down. You will also notice that Seattle dew points rarely rise to the mid-60s, reflecting our very pleasant climate.
A longer period plot (35 years) for summer dew point at Seattle is shown below. Again no trend.
What about Yakima, along the eastern slopes of the Cascades during the past two decades? As shown below, there is no temporal trend there either.
Next, let's consider summer precipitation: could that be increasing? More humidity might be expected to enhance rainfall. Here is a plot of Washington State precipitation over the last century; no apparent trend is obvious. If anything, there has been a drying trend during the past 30 years.
The bottom line of the above information is that there is really no evidence that Pacific Northwest summers have been getting more humid.
from Cliff Mass Weather Blog http://ift.tt/2aJZ9P0
Monday, July 25, 2016
Dry Weather on Schedule
One of the most extraordinary idiosyncrasies of Northwest weather is the profound drought during midsummer. During a magical few weeks, generally including the last week of July and the first week of August, the Pacific Northwest is usually the driest region in the nation. Drier than Arizona, for instance. And the latest forecast charts suggest this year will be no exception.
Let's take a look at the precipitation climatology of Seattle Tacoma Airport, specifically the climatological probability of getting .01 inches of precipitation over a day. The driest day falls in late July (about 8%), and the rest of late July and early August are right behind--only about a 10% chance of getting one hundredth of an inch, the definition of measurable precipitation. The wettest period? November.
What about a significant rain, like a tenth of an inch in a day? Lower chances of course, and a very flat minimum from the second week of July to early August.
Really going for the gusto, how about .25 inches in a day? Very low probability over June, July, and August. November really stands out.
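Probabilities like these come straight from counting wet days in the historical record, one calendar day at a time. A minimal sketch of the counting (the data format and function name here are my invention, not an NWS tool):

```python
from collections import defaultdict

def wet_day_climatology(daily_records, threshold_in=0.01):
    """For each calendar day (MM-DD), return the fraction of years on which
    precipitation reached at least threshold_in inches.
    daily_records: iterable of ('YYYY-MM-DD', precip_inches) pairs."""
    hits = defaultdict(int)
    totals = defaultdict(int)
    for date, precip in daily_records:
        mmdd = date[5:]  # pool the same calendar day across all years
        totals[mmdd] += 1
        if precip >= threshold_in:
            hits[mmdd] += 1
    return {day: hits[day] / totals[day] for day in totals}
```

Sweeping threshold_in from 0.01 up to 0.25 inches produces the family of curves described above, from "any measurable rain" to "a genuinely wet day."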
Why is midsummer so dry? With the jet stream heading north during the summer we get little rain from fronts and midlatitude cyclones. And thunderstorms are infrequent west of the Cascade crest during midsummer due to lack of humidity, a relatively cool lower atmosphere (due to the ocean influence), and few upper level disturbances to give air parcels an upward kick.
So if you are planning a wedding, hike, or outdoor activity: do it now. Our weather world will be very different in a month.
from Cliff Mass Weather Blog http://ift.tt/2akGYR8
Sunday, July 24, 2016
Consider Pallet Furniture When Decorating Your Interior
When it comes to interior decorating there are lots of things to consider: from the light fixtures to the colour of the walls, everything must be planned down to the smallest detail.
One of the biggest decisions to make is what kind of furniture to put in the home. After all, the furniture needs to be comfortable as well as stylish. It’s no good having to replace broken or worn furniture every six months, so durability also needs to be considered.
Pallet wood furniture is a fantastic addition to any home. This guide explores the benefits of installing DIY pallet furniture in the house. It is a decision that many homeowners have been overjoyed to make in the past.
Cost
Fancy leather couches or chrome and steel revolving chairs may seem like a good idea at the time, but soon the cost of choosing expensive materials for furniture can start to add up. In contrast, pallet wood is extremely affordable. In addition, replacing a glass coffee table that has been smashed or a marble table that is cracked is costly – in contrast pallet wood furniture can be replaced without having to fork out lots of cash.
Durability
Pallet wood furniture sold by Pallet West is treated with non-harmful chemicals which make it extremely durable and long lasting. This is especially true of chairs and loungers which get a lot of repeated use. Whereas leather chairs can split and stuffing fall out over time, the strength of pallet wood means that it very rarely has to be repaired or replaced for extensive damage.
Comfort
The smoothness of pallet wood makes it an extremely comfortable choice for furniture. Put some deckchairs out in the garden and laze away in front of the barbecue when the sun appears. Pallet wood is also a great choice for a bedframe as it is lightweight enough to be carried up a flight of stairs, yet strong enough to ensure a comfortable night’s sleep.
Stylishness
At first glance it may seem that pallet wood furniture would not be a stylist’s first choice, but this is incorrect. Think of pallets as a ‘blank canvas’ that can be altered at the whim of the homeowner. Paint the pallets a range of bright colours to make them really stand out and catch the eye of any guests or family members who happen to pay a visit.
Ease Of Use
Another drawback of big leather sofas or clunky glass coffee tables is that it can be extremely difficult to move them around. This means that if guests or family come over, space can become a problem. However, pallets are lightweight and can be easily moved about the house, so any time extra space needs to be made, furniture can be shifted about in the blink of an eye.
Pallet Furniture is a great addition to any home – hopefully this guide has made the numerous benefits a lot clearer.
from Home Design Ideas | Interior Design Ideas And Architcture http://ift.tt/2a4QXa8
Saturday, July 23, 2016
Weather Cam Provides a VERY Close Up View of a Lightning Strike
Near midnight yesterday (Thursday at 11:58 PM), lightning was apparent in one of my favorite weather cams (skunkbayweather)--see the image below. A classic with all kinds of forking of the lightning channel in a series of discrete steps.
Less than 15 minutes later, the unimaginable happened: lightning nearly struck the cam, apparently hitting the water a few dozen feet away. Amazingly you can see the undulations of the lightning channel.
You are looking at something that is hotter than the surface of the sun; the core of a lightning channel reaches roughly 50,000 K (kelvins), while the surface of the sun is only around 6,000 K. The huge current associated with lightning converts atmospheric gases into plasma, with the electrons stripped off the nuclei. A pressure shock radiates from the superheated channel: thunder.
There were many other amazing lightning pictures taken this morning, such as this stunning picture on Whidbey Island (Coupeville, Camp Casey) by Ron Newberry.
Lightning detection networks picked up the storms on Thursday night--here are the 24-h lightning strikes ending 1 AM Friday. Lots of lightning over the Olympics, with some moving over Whidbey. A lot more lightning over the Cascades. Fortunately, there was substantial rain over the Cascades and the ground is still relatively moist: thus, few fires are expected.
Speaking of rain, Thursday evening and Friday morning were pretty wet: here are the 24-h totals ending 6 PM Friday PDT. A number of locations in the Cascades received more than an inch.
The storms and rain are over now, with the next week being warm and dry.
from Cliff Mass Weather Blog http://ift.tt/29RTjOf