...From the Director's Office Dr. A.E. MacDonald
FSL Host to Conrad Lautenbacher, Under Secretary of Commerce for Oceans and Atmosphere and NOAA Administrator During his April visit to the Boulder NOAA campus, Vice Admiral Conrad Lautenbacher, Jr., U.S. Navy (Ret.), toured the FSL Supercomputer Center and the new prototype of Science on a Sphere™. He was briefed on AWIPS and on how our supercomputer Jet supports development of advanced workstation and observing systems, high-performance computing, and high-resolution modeling at FSL and other government agencies.
NOAA Administrator Lautenbacher is one of several guests from headquarters to receive a demonstration of Science on a Sphere™, created by Dr. Sandy MacDonald for presenting NOAA's global science to the public in a new and exciting way. Science on a Sphere™ will impart the extraordinary research and discoveries at NOAA to the public, students, and policymakers. NOAA observes, studies, and predicts the weather and climate of the earth and its encircling oceans and atmosphere. Accordingly, Science on a Sphere™ displays imagery of the earth on a three-dimensional sphere, allowing viewers to see the planet as if from a spaceship.
The current prototype uses a 6-foot plastic sphere, a Macintosh computer, software developed by the Weather Discovery Center, and four projectors to create a dynamic, moving earth. The media set includes a slowly rotating display of the earth's topography and bathymetry (from the National Geophysical Data Center). As the earth rotates, city lights appear on its night side. Clouds are shown in motion over the entire earth, as seen by five geostationary satellites. Another display shows X-ray images of the sun, with solar storms raging across the face of the sphere. Additional displays would include the latest 10-day forecast from the National Weather Service, ocean current circulation, chemical diagnoses, sun and planet motion (via satellite), and climate prediction. FSL plans to work with other NOAA laboratories and other government and commercial organizations to develop this project. These spheres would be scaled for public display at locations such as the NOAA Science Center, the lobby of the Department of Commerce building, museums such as the Smithsonian, or classrooms.
Prior to his trip to Boulder, Admiral Lautenbacher visited the FSL exhibit booth at the 2002 Annual Meeting of the American Meteorological Society (see photo). Dr. MacDonald gave him demonstrations of some of FSL's latest technologies: FX-Net, FX-Collaborate, GFESuite, the WorldWide Weather Workstation, and D2D/D3D-Linux interactive displays.
NOAA Administrator Conrad Lautenbacher (right) receiving a demonstration from Sandy MacDonald (center) and Carl Bullock (Chief, Modernization Division) at the 2002 AMS Annual Meeting.
Global Universal Profiling System In April Dr. MacDonald presented a seminar, sponsored by the Climate Modeling and Diagnostics Laboratory (CMDL), on Global Atmospheric and Oceanic Profiles for Weather and Climate Prediction. He is proposing a new operational global observing system, called the Global Universal Profiling System (GUPS), that would collect detailed profiles of state and forcing variables at 240 points equally distributed over the oceans and polar regions. GUPS would use advanced automated aircraft, such as Northrop Grumman's Global Hawk, to routinely (every three days) drop sondes for atmospheric measurements and expendable bathythermographs for the ocean. The primary ocean observing systems would be a hierarchy of buoys, from sophisticated ocean observatories to inexpensive versions of the Tropical Atmosphere-Ocean (TAO)-Triton buoys. The atmospheric sondes would occasionally include versions that measure chemistry, aerosols, and cloud content. The aircraft is similar to NASA's ER-2 in payload and could be fitted with an array of sophisticated instruments to measure chemistry, aerosols, and cloud droplet spectra. The flight budget of the proposed system would allow one or two descents per month at each point to near the surface for in situ measurements. The GUPS aircraft flight budget would also allow adaptive observing for weather prediction on a two-day cycle over the entire domain (75% of the earth's surface). The system is complementary to existing observing systems such as satellites, and would be important for improving weather prediction and long-range climate analysis and prediction. A paper on this topic is being submitted to the open literature for publication.
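The article does not say how the 240 GUPS points would be laid out; a common way to place points quasi-uniformly over a sphere is a Fibonacci lattice. The sketch below is purely illustrative (the function name and the restriction to the full globe, rather than oceans and polar regions only, are assumptions):

```python
import math

def fibonacci_sphere(n):
    """Place n quasi-uniformly distributed points on the globe using a
    Fibonacci lattice; returns a list of (lat, lon) pairs in degrees."""
    golden = (1 + 5 ** 0.5) / 2
    points = []
    for i in range(n):
        # Latitudes chosen so points are equally spaced in sin(lat)
        lat = math.degrees(math.asin(2 * (i + 0.5) / n - 1))
        # Longitudes step around the globe by the golden angle
        lon = math.degrees(2 * math.pi * i / golden) % 360 - 180
        points.append((lat, lon))
    return points

pts = fibonacci_sphere(240)
```

A real GUPS layout would additionally mask out land areas and weight coverage toward data-sparse regions.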
...From Information and Technology Services Dr. Peter A. Mandics, Chief Information Officer
High-Performance Computing System Today's atmospheric researchers depend on supercomputers to develop high-resolution prediction models that more closely resemble the actual weather taking place outdoors. FSL's High-Performance Computing System, Jet, has become more powerful through recent upgrades. An interim upgrade late last year added 280 Alpha CPUs to Jet, bringing the total number of computational CPUs to 560. Half of the CPUs are clocked at 667 MHz, while the new CPUs are clocked at 833 MHz. The combined machine achieved 442 GFLOPS (billions of floating-point operations per second) on the standard LINPACK benchmark. This rating makes Jet the 68th most powerful computer in the world. A final upgrade to Jet is due later this year.
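The 442 GFLOPS LINPACK figure can be put in context with a back-of-the-envelope peak estimate. The assumption of 2 floating-point operations per clock cycle per Alpha CPU is ours, not the article's; the actual figure depends on the processor generation:

```python
# Rough theoretical peak for the upgraded Jet cluster.
FLOPS_PER_CYCLE = 2                       # assumed per-CPU issue rate
old = 280 * 667e6 * FLOPS_PER_CYCLE       # 280 CPUs at 667 MHz
new = 280 * 833e6 * FLOPS_PER_CYCLE       # 280 CPUs at 833 MHz
peak_gflops = (old + new) / 1e9           # combined theoretical peak

linpack_gflops = 442                      # measured LINPACK result
efficiency = linpack_gflops / peak_gflops
print(f"peak {peak_gflops:.0f} GFLOPS, LINPACK efficiency {efficiency:.0%}")
```

Under this assumption the cluster's peak is 840 GFLOPS, so the LINPACK run sustains roughly half of peak, a typical figure for a commodity cluster of this era.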
Jet is a cluster of GNU/Linux computational nodes interconnected with a high-performance network known as Myrinet. The interconnection speed is 1.28 gigabits per second (Gbps) on the older portion of Jet and 2 Gbps on the newer portion. This interconnection network makes it possible for Jet to function as a supercomputer.
Jet provides computational capability for FSL modeling efforts, high-performance computing software development, and other NOAA projects. Last November FSL started a new quarter for project allocations on Jet. Internal and external projects are reviewed on the basis of scientific merit and appropriateness for a commodity distributed-memory machine such as Jet. The projects include modeling of the oceans, climate, atmosphere, and air quality, and represent a wide cross section of the NOAA community, including 8 of the 12 NOAA Research Laboratories and the NESDIS National Geophysical Data Center.
FSL Network The FSL network has undergone significant upgrades and enhancements which have improved network stability, reliability, bandwidth, and performance for FSL research and users.
Since moving into the David Skaggs Research Center (DSRC), the FSL Asynchronous Transfer Mode (ATM) and Ethernet network sustained two years of increasing demand for bandwidth and access for new high-performance servers, as well as for an increasing number of high-speed workstations connecting to the network. Minimal upgrades in response to these growing demands left the network running suboptimally at times. This was primarily due to increases in bandwidth usage, but also stemmed from an imbalance in the distribution of critical ATM services over a number of seven-year-old devices (PowerHubs). The network was also experiencing a severe shortage of ATM ports for new server connections and an oversubscription of the bandwidth available in Ethernet edge devices. This created bottlenecks for many users.
The FSL Network Administration group addressed these issues by evaluating several vendors of high-speed network devices. The top three vendors each proposed a different design approach to FSL's network needs. The proposals were evaluated against current requirements, as well as for the room they left for future growth and for the newer networking technologies that FSL plans to adopt. In the end, the Marconi Communications proposal best fit FSL's networking needs, providing four ESX-3000 ATM/Ethernet campus switches that were installed last June and July. Specifically, the four 20-Gigabit ESX-3000 switches provided the following improvements: management of the distributed ATM Local Area Network (LAN) Emulation services that underpin the FSL network core (thus offloading the failing PowerHubs), extension of the dual ATM OC-12 (2 x 622 Mbps) links from the core to the edge (the user access point of the network), additional ATM and Ethernet port capacity, and a four-fold performance increase over legacy Fast Ethernet edge modules. This upgrade, which also included operating system software upgrades and a redistribution of legacy components to lower-demand areas, has stabilized general network operations.
Current network enhancements address two new technologies and a security architecture that will further benefit FSL. Our highly distributed computing environment is serviced by 28 subnetwork IP address spaces, which necessitates a large amount of internal routing, previously accomplished via host-computer CPUs or software-based routing. Routing is now possible via Application-Specific Integrated Circuits (ASICs), in other words, hardware-based routing. Implementation of ASIC routing will greatly improve Local Area Network performance among FSL divisions and projects. The other technology, one that has seen rapid standardization in the computing industry, is Gigabit Ethernet (GigE). FSL has clusters of GigE and GigE-capable servers, such as Jet, and we need a way to integrate this technology into our multigigabit ATM core. Merging ATM and GigE will let FSL combine the high performance and familiarity of Ethernet with the resiliency provided by our fully meshed topology, while protecting our investment in ATM.
IT security is also being incorporated into the design phase of FSL network upgrades. While security in-depth is emphasized to address perimeter, local access, and host-based security, the Network Administration Group is taking steps to restructure the backbone architecture between FSL and the DSRC. The new topology will allow FSL to more effectively implement perimeter security controls. This is important for maintaining FSL data and research integrity, overall security, and network accessibility, all of which are critical to our computing infrastructure and mission.
...From the Forecast Research Division Dr. Steven E. Koch, Chief
RUC20 Operational Dr. Stan Benjamin announced that the 20-km version of the Rapid Update Cycle (RUC20) became operational at the National Centers for Environmental Prediction (NCEP) in April. Improvements in this new version cover horizontal and vertical resolution, moist physics, assimilation of GOES cloud-top data, use of observations in analysis, land-surface physics, lateral boundary conditions, and postprocessing. The most significant improvements over the 40-km version include precipitation (both summer and winter), all surface fields (temperature, moisture, and winds), upper-level winds and temperatures, and orographically induced precipitation and circulations. See the article in this issue for more information on RUC20.
Congratulations to the RUC development team, including Stan Benjamin, John Brown, Kevin Brundage, Dezso Devenyi, Georg Grell, Dongsoo Kim, Barry Schwartz, Tatiana Smirnova, Tracy Lorraine Smith, and Stephen Weygandt, all of FSL; and Geoffrey Manikin of NCEP.
International H2O Project (IHOP) Field Experiment FSL investigators (along with researchers from other NOAA laboratories, NASA, NCAR, and universities) are participating in the IHOP field experiment over the Southern Great Plains from May 13 to June 25, 2002. The primary goal of the IHOP program is improved characterization of the four-dimensional distribution of water vapor and its application to improving the understanding and prediction of convection.
At any given time, at least three FSL people are in Norman, Oklahoma, engaged in IHOP work as a nowcaster, a model evaluator, and an aircraft scientist. Of the many different scientific missions performed, FSL is most interested in the study devoted to understanding the structure of the low-level jet and the transport of moisture by small-scale inhomogeneities in the jet, using a combination of Doppler lidar, differential absorption lidar for moisture mapping, and dropwindsondes. Six FSL models are scheduled to run and be available to the IHOP forecasters. FX-Net is being used to display model output.
More information on how the models performed during this experiment will be featured in a future Forum article.
Wind Energy Forecasts from the Rapid Update Cycle Wind energy, the fastest growing energy technology, has become an important component of the nation's electrical power grid. Increased energy demands, combined with insufficient generation capabilities, underscore the importance of wind energy as a renewable alternative. The European Wind Energy Association predicts that by 2020, ten percent of world energy needs will be provided by wind energy. Energy managers require accurate estimates of power generation potential in order to effectively manage this growing resource. To derive these estimates, they need accurate forecasts of surface and near-surface winds for a period from a few hours to several days. Accurate estimates of generation capacities would provide energy planners and traders with critical planning tools. In addition, planners want a level of confidence that wind speed forecasts are accurate. All forecasts are subject to some inaccuracies; however, some situations are more difficult to accurately predict than others. Parameters which quantify the uncertainties associated with a given forecast would improve the utility of those forecasts in the energy planning process.
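One reason wind speed forecast accuracy matters so much to energy planners is that the power available in the wind grows with the cube of wind speed, so modest speed errors become large power errors. A minimal illustration (the function and numbers are ours, not from the study):

```python
def wind_power_density(v, rho=1.225):
    """Power available per square meter of rotor area (W/m^2) for wind
    speed v (m/s); rho is air density (kg/m^3) at sea level."""
    return 0.5 * rho * v ** 3

# Cubic dependence: a 10% error in forecast wind speed becomes roughly
# a 33% error in available power.
err = wind_power_density(11.0) / wind_power_density(10.0) - 1
```

This cubic sensitivity is why uncertainty estimates, not just point forecasts, are valuable in energy planning.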
FSL's Kevin Brundage and Stan Benjamin are collaborating on a project with Marc Schwartz of the National Renewable Energy Laboratory (Department of Energy) to evaluate the use of RUC forecasts in energy planning. This study was expanded last year to explore the use of ensembles of forecasts as a predictor of forecast uncertainty. Preliminary results support the use of ensembles as a basis in determining the degree of confidence associated with given forecasts.
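A common way to turn an ensemble of forecasts into a confidence measure is to use the spread (standard deviation) among members: small spread suggests higher confidence in the ensemble mean. A minimal sketch with hypothetical member values, not actual RUC output:

```python
import statistics

def spread_confidence(members):
    """Ensemble mean and standard deviation for one forecast point.
    A small spread suggests higher confidence in the mean forecast."""
    mean = statistics.fmean(members)
    spread = statistics.stdev(members)
    return mean, spread

# Hypothetical 10-m wind speed forecasts (m/s) from five ensemble members
mean, spread = spread_confidence([8.2, 8.9, 7.8, 8.5, 8.1])
```

In practice the spread-skill relationship must itself be verified against observations before spread is used as a confidence proxy.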
The RUC model was selected for this study for several reasons. It employs a 40-level hybrid sigma/isentropic vertical coordinate system with very high vertical resolution near the surface. A sophisticated multilevel soil/vegetation model improves the treatment of forecast surface fluxes. Also, the RUC model was designed to run at a higher temporal frequency (hourly), taking advantage of the volume of surface, profiler, and aircraft reports available hourly. More information will be published on this project as it matures and more studies are completed and analyzed.
A Generalized Approach to Parameterizing Convection Combining Ensemble and Data Assimilation Techniques Properly parameterizing the effects of convection is still a challenging problem for numerical weather prediction. There are many different parameterizations for deep and shallow convection that exploit the current understanding of the complicated physics and dynamics of convective clouds to express the interaction between the larger-scale flow and the convective clouds in simple "parameterized" terms. These parameterizations often differ fundamentally in closure assumptions and parameters used to solve the interaction problem, leading to a large spread and uncertainty in possible solutions. In past studies, these uncertainties have led to many discussions regarding which assumptions are the proper ones to use under certain conditions.
Drs. Georg Grell and Dezso Devenyi are developing a convective parameterization that offers a generalized approach to making use of these uncertainties by combining ensemble and data assimilation techniques. A paper describing this work has been accepted for publication in Geophysical Research Letters.
Space Launch Range Operations Support Brent Shaw, Steve Albers, Dr. John McGinley (Chief, Local Analysis and Prediction Branch), Paul Schultz, John Smart, and Linda Wharton, members of the LAPS Group at FSL, are working with the U.S. Air Force and Lockheed Martin on the Range Standardization and Automation (RSA) program. RSA is an Air Force program to modernize and standardize the command and control infrastructure of the Eastern Range in Florida and the Western Range in California.
Since 1996 FSL has been working on various aspects of this program, such as developing a data assimilation and forecast system using the Local Analysis and Prediction System (LAPS). The system is completely integrated into AWIPS, utilizing a cost-effective, multiprocessor, Intel-based Linux cluster to handle the necessary computations. This integration allows the data assimilation system to exploit the wide array of unique local data sources ingested via the AWIPS Local Data Acquisition and Dissemination (LDAD) interface, yielding analyses of the atmosphere with high spatial and temporal resolution that are suitable for diabatic initialization of the PSU/NCAR MM5 model, which serves as the predictive component of LAPS. Integration within AWIPS also ensures that operational forecasters can efficiently use the full-resolution gridded analyses and forecasts in conjunction with all other sources of meteorological data as a single, coherent package. More information on this project will be presented in a future Forum.
Using Ensembles of Numerical Weather Forecasts for Road Weather Prediction Paul Schultz is leading a project, in cooperation with other research laboratories, to develop the Maintenance Decision Support System. The MDSS uses weather observations, statistics, and numerical models to make automated point-specific weather forecasts for key points along roadways. The weather forecast information is used by road-condition algorithms, graphical displays, and other tools that help state departments of transportation decide how best to deploy snowplows and road treatments. Although the focus of the project is winter weather problems, the MDSS approach to providing tailored point weather forecasts is intended to be more broadly applicable.
FSL's role in MDSS is to provide a variety of very high-resolution forecasts from multiple locally configured mesoscale models such as RAMS, MM5, WRF, and ARPS. The concept of an ensemble of models is based on the premise that, given an imperfectly observed predictand (the atmosphere in this case), it is in principle possible to combine multiple predictions so that the ensemble forecast is superior to any single prediction included in the ensemble. This assumes that each of the predictions is equally likely to be closest to "reality," and that the forecast errors among the models are uncorrelated, two assumptions that are typically not valid in most ensemble modeling systems. Still, there have been notable successes in ensemble weather forecasting.
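The premise behind ensemble averaging can be demonstrated with a small simulation. When member errors really are uncorrelated with equal variance, the error of an n-member ensemble mean shrinks by a factor of sqrt(n) relative to a single member (here a factor of about 2 for four members). This is an idealized illustration, not MDSS code:

```python
import random
import statistics

random.seed(42)
truth = 0.0
n_trials, n_members = 5000, 4

single_err, ens_err = [], []
for _ in range(n_trials):
    # Each member forecast = truth + an independent error (std dev 1.0)
    members = [truth + random.gauss(0, 1) for _ in range(n_members)]
    single_err.append(abs(members[0] - truth))
    ens_err.append(abs(statistics.fmean(members) - truth))

# Ratio of mean absolute errors: ensemble mean vs. a single member
ratio = statistics.fmean(ens_err) / statistics.fmean(single_err)
```

With correlated errors, as is typical of real multi-model ensembles, the improvement is smaller, which is exactly the caveat noted above.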
The Environmental Science Data Integration and Management (ESDIM) Project Dr. Edward Tollerud heads the ESDIM project, part of the "Health of our Network" program at the National Climatic Data Center (NCDC) in Asheville, North Carolina. FSL, along with researchers at the National Center for Atmospheric Research in Boulder, are devising strategies to automatically assess and monitor the quality of precipitation observations made by volunteers in the Cooperative Observing network.
The primary objective of this project is to develop automatic statistical tests that can identify and bring to the attention of NCDC scientists climatic observing sites that experience sudden degradation or unexplained changes in observations. Because precipitation is highly variable in both space and time, it is likely that techniques sensitive to several attributes of precipitation will be necessary. Among these attributes are frequency as well as amount, seasonal variability, and correlation with sets of neighboring stations that have shown themselves to have similar precipitation characteristics in the past. Possible reasons for such changes are unreported station moves, equipment failure, changes in exposure, and systematic observer error.
Initial diagnostic analyses of some of the several thousand observing sites, many of which have periods of record spanning several decades, suggest that monthly or seasonal comparison of distributions of well-formulated precipitation variables can be effective in identifying observing problems. As a first step, it is necessary to determine the quantities that are most sensitive to changeable characteristics of precipitation. A step in this direction is shown in the figure below, in which each of a set of observing sites in Iowa is matched with six neighboring sites each day during the course of one month. From the resulting sets of pairs (six per day at each site), a percentage of all pairs for which both sites recorded nonzero precipitation is computed. These percentages (one per site) are shown in the distribution in the figure. If rainy or clear conditions were uniform across the state, and each site measured accurately, the resulting distribution should be sharply peaked at a value that reflects the frequency of daily precipitation during the period. Given precipitation variability, the actual peak would be expected to spread toward higher and lower values, as shown. However, the width of the spread from near-zero percentages to large values and a suggestion of another peak just below 35% suggest that observing problems are also reflected in the distribution. In this case, examination of the data shows that the main source of error can be attributed to stations that misreported several (in some cases, all) missing daily reports during the month. Although this particular error could be corrected with a careful look at station inventories, other more subtle observing problems might also be identified by this kind of procedure.
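The neighbor-pair statistic described above reduces to a simple computation per site. The sketch below, with hypothetical daily totals and a single neighbor rather than the six used in the study, illustrates the idea:

```python
def pair_agreement(site_precip, neighbor_precip):
    """Percentage of daily pairs in which both the site and its matched
    neighbor recorded nonzero precipitation."""
    pairs = list(zip(site_precip, neighbor_precip))
    both_wet = sum(1 for a, b in pairs if a > 0 and b > 0)
    return 100.0 * both_wet / len(pairs)

# Hypothetical daily totals (mm) for one site and one neighbor
site     = [0.0, 2.5, 0.0, 4.1, 0.3, 0.0, 1.2]
neighbor = [0.0, 1.8, 0.2, 3.9, 0.0, 0.0, 0.9]
pct = pair_agreement(site, neighbor)
```

Sites whose percentages fall far outside the statewide distribution become candidates for manual review, as with the misreported missing values described above.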
Figure. See text above.
Evaluation of Balloon Trajectory Forecast Routines for GAINS The GAINS (Global Air-ocean IN-situ System) project, led by Dr. Cecilia Girz, is a global observing system designed to augment current environmental observing and monitoring networks. GAINS is a network of long-duration, stratospheric platforms carrying onboard sensors and hundreds of dropsondes for acquisition of meteorological, air chemistry, and climatic data over oceans and remote land regions of the globe. Two vehicles comprise the GAINS network: superpressure balloons and remotely operated aircraft. The 33.5-m-diameter superpressure balloons will carry payloads of 350 kg and remain aloft for 6 to 12 months at altitudes from 18.3 to 22.9 km. Remotely operated aircraft, which are expected to play a role in providing year-round global coverage in the Northern Hemisphere midlatitudes and polar regions, also carry hundreds of kilograms of payload, but for shorter periods.
Vital to meeting the goal of an operational program is completion of a rigorous set of laboratory and field tests of balloon shell and payload instrumentation. A flight of up to 48 hours has been conducted, and flights with durations of several days to weeks are planned. Proper evaluation of these test flights requires recovery of the balloon and payload, thus the need to know their flight path and landing site. Software utilizing observational and numerical model sources has been developed for trajectory prediction, with four versions currently in use.
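At its core, balloon trajectory prediction advects the platform with forecast or observed winds. The following is a deliberately simplified single-level sketch (constant wind, fixed time step, starting point chosen arbitrarily near Boulder); the operational routines mentioned above use full model wind fields and vertical motion:

```python
import math

EARTH_RADIUS = 6371e3  # mean earth radius, meters

def advect(lat, lon, u, v, dt):
    """Advance a balloon position by one time step dt (s), given
    eastward (u) and northward (v) wind components (m/s)."""
    dlat = math.degrees(v * dt / EARTH_RADIUS)
    dlon = math.degrees(u * dt / (EARTH_RADIUS * math.cos(math.radians(lat))))
    return lat + dlat, lon + dlon

# Hypothetical constant 20 m/s westerly over 6 hours
lat, lon = 40.0, -105.0
for _ in range(6):
    lat, lon = advect(lat, lon, u=20.0, v=0.0, dt=3600)
```

Accumulating such steps through a time-varying wind field gives a predicted flight path and landing site, which is what the evaluation compares against the actual balloon track.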
An article by Randall Collander and Cecilia Girz discusses the results of an evaluation of predicted trajectories for a specific period. (See http://www.fsl.noaa.gov, and click on Publications and Research Articles).
...From the Demonstration Division Margot Ackley, Chief
NOAA Profiler Program Celebrates 10th Anniversary On May 16, 2002, FSL hosted a 10th Anniversary celebration of the NOAA Profiler Program. The guest of honor was Dr. James R. Mahoney (below), NOAA's newly appointed Assistant Secretary of Commerce for Oceans and Atmosphere. During his speech, FSL Director Dr. Sandy MacDonald expressed appreciation to all participants who worked hard to bring the NOAA Profiler Network into operations. He presented Margot Ackley, current head of the Profiler Program, and her predecessor, Dr. Russell Chadwick, with an award representative of all who made this project a success. The process has involved years of research and development, most of which occurred within OAR's Environmental Research Laboratories.
Dr. James Mahoney, Assistant Secretary of Commerce for Oceans and Atmosphere, at the Profiler Anniversary celebration in Boulder.
Dr. Sandy MacDonald presenting an award to Margot Ackley, Chief of FSL's Demonstration Division.
Many research scientists, engineers, computer scientists, managers, and support staff of the Aeronomy, Environmental Technology (formerly Wave Propagation Laboratory, WPL), and Forecast Systems laboratories contributed to the overall effort, which began in 1985. Many of these people, some retired, attended the celebration, including: Dr. Vernon Derr, former Director of the Environmental Research Laboratories and Profiler Program Manager; Dr. C. Gordon Little, first Director of WPL; Dr. David C. Hogg, Program Area Chief with WPL and former director of antenna research at Bell Laboratories; Dr. Richard G. Strauch, Program Area Chief and member of the National Academy of Engineering; and John Green, formerly of the Aeronomy Laboratory, who built one of the very early profilers.
The $16M Profiler Network began full operations on May 18, 1992, with the government's acceptance of the system located at Blue River, Wisconsin. For the past 10 years, 35 systems located in Alaska and the lower 48 states have been sending wind information to the Profiler Hub in Boulder, Colorado. Hourly processed wind data are transmitted to National Weather Service offices throughout the United States for use in local forecasts and numerical weather prediction models.
Dr. Mahoney (center) learns more about the NOAA Profiler Network from Margot Ackley (standing) and Douglas van de Kamp (left), head of Network Operations.
During its lifetime, the NOAA Profiler Network has proven its worth many times over. For example, during the May 3, 1999, Oklahoma tornado outbreak, when a record 52 tornadoes hit southwestern and central Oklahoma, forecasters received critical weather information from the profilers, especially from the system at Tucumcari, New Mexico. In the summer of 2000, all three New Mexico profilers provided invaluable data for fire weather support of the Los Alamos wildfire. Last March, the Alaska profilers were vital in determining the strength and duration of the famous "100-year snow event" in Anchorage.
Unfortunately, possible extinction of the NOAA Profiler Network looms in the future, despite a decade of highly successful Profiler operation and a customer base that includes the National Weather Service, other federal organizations, the military, universities, the research community, private weather groups, and the media. The Profiler Network may have to be dismantled unless a critical upgrade is funded to change the profilers' transmitting frequency. This change is required to avoid interference with the global search and rescue instruments that will fly on 60 to 70 satellites within a few years. At present, no funds have been identified in NOAA's operating budget for this upgrade.
Ground-Based GPS Meteorological Observations in Numerical Weather Prediction For lack of sufficient observations, the definition of atmospheric moisture fields (including water vapor and clouds) remains a difficult problem whose solution is essential for improved weather forecasts. Moisture fields are underobserved in time and space, primarily due to the high variability of water in the atmosphere. Because of the important role of water in weather and climate processes, a significant effort has been expended to develop new or improved remote sensing systems to mitigate this problem. One such system uses ground-based Global Positioning System (GPS) receivers to make accurate all-weather estimates of atmospheric refractivity at very low cost. This application of GPS has led to a new and potentially significant upper-air observing system for meteorological agencies and researchers around the world. The first and most mature use of GPS for this purpose is in the estimation of integrated (total column) precipitable water vapor above a fixed site.
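The standard chain is: the GPS signal delay yields a zenith wet delay, which is then mapped to precipitable water vapor through a factor that depends on the water-vapor-weighted mean atmospheric temperature. The sketch below uses commonly published refractivity constants; it is an illustration of the general technique, not FSL's processing code:

```python
def pwv_from_zwd(zwd_mm, tm_kelvin=270.0):
    """Convert a GPS zenith wet delay (mm) to integrated precipitable
    water vapor (mm). tm_kelvin is the water-vapor-weighted mean
    atmospheric temperature; constants are commonly published values."""
    rho_w = 1000.0     # density of liquid water, kg/m^3
    r_v = 461.5        # specific gas constant for water vapor, J/(kg K)
    k2_prime = 0.221   # refractivity constant, K/Pa
    k3 = 3.739e3       # refractivity constant, K^2/Pa
    pi_factor = 1e6 / (rho_w * r_v * (k3 / tm_kelvin + k2_prime))
    return pi_factor * zwd_mm

pwv = pwv_from_zwd(150.0)  # a 150-mm wet delay maps to roughly 23 mm PWV
```

The conversion factor is about 0.15, which is why integrated water vapor is roughly 15% of the zenith wet delay.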
A paper by Seth Gutman and Stan Benjamin (published in GPS Solutions 2001) discusses the techniques used at FSL to collect, process, and distribute GPS water vapor observations. FSL has shown that GPS integrated water vapor data can be used effectively in objective (i.e., numerical weather prediction) and subjective weather forecasting. To understand the strengths and limitations of GPS for weather forecasting, it is essential to understand what types of information are currently available to forecasters and modelers, how models use the data to describe the current and probable future state of the atmosphere, and the current trends in modern weather prediction to ensure that GPS observing systems play a significant role in the future.
...From the Systems Development Division
Space Launch Range Support The nation's space launch ranges, the Eastern Range in Florida near Kennedy Space Center and the Western Range in California, support government and commercial launches. The Air Force Space Command is responsible for managing operations of the launch locations, providing common services, and ensuring public safety. The locations also serve as test ranges for ballistic missile and other types of tests. The Air Force provides comprehensive operational meteorological services to both ranges. These services include weather support for personnel and resource protection, prelaunch ground processing, and day-of-launch operations for the Department of Defense, NASA, and commercial launch customers.
In the mid-1990s, Space Command undertook an effort to modernize the two ranges. The Range Standardization and Automation (RSA) program is designed to improve efficiency and reduce costs by providing more automated, standard systems to the space launch ranges. As part of this modernization effort, the Mission Systems Division of Lockheed Martin is completing development and delivery of the infrastructure, instrumentation, communication, and software applications necessary to operate the ranges.
FSL's Darien Davis (primary author of a paper on this topic) presented a talk at the 2002 AMS Meeting on the integration of range forecasting functionality into AWIPS, and weather launch requirements and instrumentation used to collect and disseminate required data to the launch directors.
AWIPS Build 5.2.2 MAPS Surface Assimilation System (MSAS) For mesoscale weather forecasting, surface data are crucial because their time and space resolution is unmatched among in situ observations. They provide direct measurements of surface conditions, permit inference of conditions aloft, and often give crucial indicators of the potential for the onset and location of severe weather. Frequent analyses, or gridded fields, of surface data can provide detailed information on the development and movement of surface weather systems, especially when used in combination with the looping capabilities of modern meteorological workstations.
Patricia Miller and Michael Barth developed the Mesoscale Analysis and Prediction System (MAPS) Surface Assimilation System (MSAS) at FSL, which provides the National Weather Service with accurate quality control of surface observations and timely, detailed gridded fields of surface variables. Since 1989 the system has been running at the National Centers for Environmental Prediction, where it is known as the RUC Surface Assimilation System, and it has run operationally on AWIPS since Build 4.1. MSAS produces gridded fields for 16 surface variables, including sea-level pressure, lifted index, and moisture convergence, and also provides quality control information for the AWIPS Quality Control and Monitoring System (QCMS).
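Gridded surface fields of this kind are built by objective analysis: each grid point gets a distance-weighted combination of nearby station observations. The sketch below shows a first-pass Barnes-style analysis at a single grid point; the weighting scale, station values, and function names are illustrative assumptions, not MSAS internals:

```python
import math

def barnes_weight(dist_km, kappa_km=50.0):
    """Gaussian distance weight used in Barnes-style successive
    correction analyses; kappa_km sets the influence radius."""
    return math.exp(-((dist_km / kappa_km) ** 2))

def analyze_point(obs, grid_x, grid_y):
    """Analysis value at one grid point from scattered surface
    observations given as (x_km, y_km, value) tuples."""
    num = den = 0.0
    for x, y, value in obs:
        w = barnes_weight(math.hypot(x - grid_x, y - grid_y))
        num += w * value
        den += w
    return num / den

# Hypothetical surface temperatures (deg C) at three stations
obs = [(0.0, 0.0, 20.0), (60.0, 0.0, 24.0), (0.0, 80.0, 18.0)]
t = analyze_point(obs, 30.0, 10.0)
```

A full system adds multiple correction passes, a background field, and the quality control step that rejects suspect observations before they influence the grid.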
...From the Aviation Division Michael Kraus, Chief
Using Verification Techniques to Evaluate Differences Among Convective Forecasts Jennifer Mahoney heads the Real-Time Verification System (RTVS) project at FSL. The RTVS team has been comparing the statistical results generated for various convective forecasts so that the strengths and weaknesses of those forecasts become apparent. When comparing forecasts, it is important to account for their differences and to develop methods appropriate to each forecasting system. For instance, the differences between the National Convective Weather Forecast (NCWF) and the C-SIGMETs (convective significant meteorological information) made it difficult to compare the two forecasts directly. However, the comparisons did show the quality of the experimental forecast against the operational standard provided by the C-SIGMETs. Decisionmakers need this critical information as guidance when evaluating whether an experimental forecast should become operationally supported by the NWS. Forecast comparisons can also be used to gain further understanding of what a forecast is and how it should be used. In this case, the C-SIGMETs were evaluated as if they provided forecasts at four different time intervals. The results indicated that the forecast polygons were best at capturing convection at the 0-hour time period.
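Convective forecast verification of this kind typically reduces to 2x2 contingency-table statistics such as probability of detection (POD), false alarm ratio (FAR), and critical success index (CSI). RTVS's actual computations involve gridded matching and time windowing not shown here; this sketch assumes simple paired yes/no forecast-observation events.

```python
def contingency_stats(pairs):
    """Compute standard 2x2 contingency-table verification scores.

    pairs: list of (forecast_yes, observed_yes) boolean tuples.
    Returns counts plus POD, FAR, and CSI.
    """
    hits = sum(1 for f, o in pairs if f and o)
    misses = sum(1 for f, o in pairs if not f and o)
    false_alarms = sum(1 for f, o in pairs if f and not o)
    pod = hits / (hits + misses) if hits + misses else 0.0
    far = false_alarms / (hits + false_alarms) if hits + false_alarms else 0.0
    csi = (hits / (hits + misses + false_alarms)
           if hits + misses + false_alarms else 0.0)
    return {"hits": hits, "misses": misses, "false_alarms": false_alarms,
            "POD": pod, "FAR": far, "CSI": csi}

# 3 hits, 2 misses, 1 false alarm:
events = [(True, True), (True, True), (True, True),
          (True, False), (False, True), (False, True)]
print(contingency_stats(events))  # POD=0.6, FAR=0.25, CSI=0.5
```

Scores like these are only directly comparable between two forecast systems when the forecasts are evaluated against the same observations over the same valid times, which is exactly why the NCWF/C-SIGMET differences complicated the comparison.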
Prototype Aviation Collaborative Effort (PACE) Adverse weather conditions, especially those associated with thunderstorms, contribute significantly to disruptions in air traffic operations. The effects extend to every sector of the air transportation community and can result in delays, reroutes, trip cancellations, and a reduced margin of operating safety. Weather that adversely impacts aviation operations also increases demands on FAA Air Traffic resources.
Through the Prototype Aviation Collaborative Effort (PACE), an operational test has been designed to demonstrate and evaluate the effective employment of developing science, technology, and computer communication interfaces. The PACE facility will initially develop a focused range of high-resolution forecast products specifically tailored to the air traffic environment, building on experience gained from ongoing operations and testing, such as the Collaborative Convective Forecast Product. An initial suite of graphical products will consist of convective forecasts, followed by a phased approach to include icing, turbulence, and ceiling and visibility products based on requirements outlined by the FAA.
A paper on this topic written by Dennis Rodgers of FSL and Thomas Amis of the NWS Center Weather Service Unit at Fort Worth, Texas (location of the operational testbed) is available at the FSL Website.
...From the Modernization Division Carl Bullock, Chief
GFESuite Training Workshop The National Weather Service and FSL sponsored a GFESuite training session April 30 through May 3 in Boulder. This workshop covered advanced concepts of the Graphical Forecast Editor suite (GFESuite) for forecasters who had already taken the Essential GFE Techniques course. About 60 forecasters from the NWS regional offices attended, including 10 from the Alaska and Pacific regions.
(Top) Mark Mathewson, Chief, Enhanced Forecast Tools Branch at FSL, instructs forecasters at the GFESuite Training Workshop. (Bottom) He discusses a special issue with Deirdre Kahn, meteorologist with the National Weather Service.
Since the forecasters at this workshop had already used GFESuite in the operational setting, they were able to bring their experiences and issues to the workshop for discussion and/or clarification with the developers. Forecasters use the GFESuite, which is part of the Interactive Forecast Preparation System (IFPS), to define weather forecast elements in a gridded digital format. They can manipulate gridded values such as temperature, wind, and dew point to enhance the detail and precision of forecast preparation. Products based on these digital grids can take many forms, including graphical representation which can describe the weather more precisely.
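Grid editing of the sort described above amounts to applying an adjustment to every grid point inside a forecaster-drawn "edit area." The real GFESuite API is quite different; the function and names below are hypothetical, kept deliberately minimal to show the idea.

```python
def adjust_grid(grid, edit_area, delta):
    """Add `delta` to every grid cell selected by the edit-area mask.

    grid: 2D list of floats (e.g., surface temperatures).
    edit_area: 2D list of booleans, same shape (the forecaster's selection).
    Names and structure are hypothetical, not the actual GFESuite API.
    """
    return [[v + delta if selected else v
             for v, selected in zip(grid_row, mask_row)]
            for grid_row, mask_row in zip(grid, edit_area)]

temps = [[20.0, 21.0],
         [19.5, 22.0]]
edit_area = [[True, False],
             [False, True]]       # hypothetical forecaster-drawn selection
cooled = adjust_grid(temps, edit_area, -2.0)
print(cooled)  # [[18.0, 21.0], [19.5, 20.0]]
```

Operational tools layer many such operations (smoothing, interpolation in time, tool scripting) on top of this basic pattern, and the edited grids then drive both text and graphical products.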
FSL developers consider the training conducted at these GFESuite workshops crucial to the success of IFPS in NWS field offices. For instance, it is essential that the IFPS focal points acquire a solid technical foundation in the GFESuite software. This not only allows them to do their jobs more efficiently but also lessens the support burden on the developers. The symbiosis that develops between forecaster and developer frequently leads to elegant solutions to outstanding problems: forecasters know what needs to be done in general terms, while developers have the skills to articulate the details of new ideas and implement them. By the end of these workshops, forecasters and developers better understand each other's positions and hence communicate better in the future.
Anatomy of Training Sessions at the Korea Meteorological Administration (KMA) FSL is involved in a collaborative effort with the Korea Meteorological Administration to develop a nowcasting system (the Forecaster's Analysis System), support startup and operation, and implement a training program for KMA forecasters. In conjunction with the training plans, Ed Szoke, Woody Roberts, Patrice Kucera, and others at FSL have been conducting D2D meteorological training at the KMA and at FSL.
The training at the KMA late last year emphasized the uses of various fields and functions available on D2D for analyzing and forecasting meteorological conditions. As with earlier training, five teams of five meteorologists (one chief meteorologist and four regional meteorologists), along with additional support staff, received training. Review cases were drawn from both FSL's D2D system and KMA's Forecaster's Analysis System (FAS, an adaptation of D2D). The D2D review case was from the October 1997 blizzard, and the FAS cases were from January 2001 and February 2001. All three cases focused on cool-season weather events, which were useful to review during the winter months in Korea. Future training will also focus on warm-season events (e.g., typhoons). Trainees were encouraged to ask questions and to suggest fields for display during the lectures.
Two lectures were conducted with each forecast team: the first covered the D2D/FSL review case, and the second the FAS/KMA cases. The first segment of the FSL/D2D case presentation focused on the prestorm environment. Observations (e.g., satellite imagery and rawinsonde winds) were compared to forecast model analyses to determine whether the analyses adequately reflected the prestorm environment. Ed Szoke emphasized the importance of this comparison as a way of judging whether the forecast models would likely capture future weather conditions. The "Snellman Funnel" approach (looking first at the large scale, then focusing down to smaller scales) was also emphasized. The second part of the lecture focused on the forecast models, model comparison, and forcing mechanisms, as well as the actual storm environment during the height of the storm over the Boulder forecast area. Several display techniques (plan views, cross sections, and time-height displays) and different fields were used to present the complex storm interactions occurring during this period. In the final session, the storm was examined as it moved through a forecast area in Nebraska. This portion of the storm was also difficult to forecast because of the rain/snow boundary within the forecast area.
The first two segments of the KMA/FAS case presentations covered the January KMA case, which was well captured in the analyses and models. As with the FSL case, emphasis was placed on observation/analysis comparisons, the Snellman Funnel approach, and forcing mechanisms. The third session covered a fast-developing, smaller-scale system in February 2001, which offered a good contrast to the January case. It was not captured well in either the model analyses or the forecasts, owing to its rapid development over the Yellow Sea and its small size. Satellite observations appeared to offer the best indications of the storm's development. Data for this case were reviewed in a similar manner, and the instructors and KMA staff concluded that this was a difficult system to forecast.
The D2D and FAS systems both performed adequately during these presentations. This review case capability is a very significant enhancement, especially considering the complexity of these systems. However, there were some problems with both systems, which are being addressed for future training. The KMA/FAS system performance was significantly slower than the D2D system and this slowed the tempo of the presentations. Fortunately, performance on the operational KMA/FAS system was considerably better. In all cases, the instructors were able to find "work-arounds" so that the problems did not significantly affect the presentations. Manual product and function loads and procedures were used to familiarize the forecasters with the selection process, as well as shortcuts often used by U.S. forecasters.
Feedback from the KMA staff during the lectures varied considerably. A few of the teams were quite inquisitive about the U.S. case, but most of the feedback and discussion came during the presentations of the Korean cases, which is understandable given their greater familiarity with these types of cases. This feedback was very useful, since the instructors had not had a chance to review these cases before arriving at the KMA, and it helped them understand the coastal/ocean interactions that occur in Korea. The interpreters also helped considerably in interactions with the forecasters, and their help during the lectures was a key element in the success of the training. Additional feedback was obtained from a survey given to the staff at the end of the training.
A "Certificate of Completion" was presented to each participant at a closing ceremony for the KMA training. Both presenters are confident that this training method provides the KMA staff with a good understanding of how the workstation can be used to meteorologically analyze, monitor, and forecast cool-season weather events. Users were encouraged to start taking advantage of these capabilities more during day-to-day operations in order to reinforce lecture material.
Other activities were carried out by Ed Szoke and Woody Roberts while at the KMA, such as testing the performance of the operational FAS system and comparing it with the Linux/PC-D2D system at FSL.
...From the International Division Dr. William Bendel, Chief
FX-Net at the 2002 Olympics and Upgrades FSL collaborated with the National Weather Service and the Salt Lake City Organizing Committee to provide an FX-Net Internet-based workstation for onsite forecasting at each outdoor venue of the 2002 Winter Olympic Games. FSL received positive feedback on FX-Net's performance in providing forecasts during the Games.
Under Dr. Renate Brümmer's leadership, Sean Madine, Ning Wang, and others are developing a new version of FX-Net, called FX-Net National. Recent AWIPS workstation developments have been leveraged to deliver forecasting information for any location in the lower 48 states. As in previous versions, this information includes satellite imagery, model forecasts, and observations. Significant improvements include access to nationally oriented scales and to radar products for each of the WSR-88D sites. More powerful interactive functionality, such as a user-defined vertical cross-section capability, has also been implemented. The client, which emulates the AWIPS D2D interface, runs on readily available PC hardware over network bandwidths as low as those of phone-line modems. These developments have led to an FX-Net National server that can now provide relatively low-cost service to a large number of Internet-connected clients.
Investigation of Data Compression Techniques Applied to AWIPS Datasets The current AWIPS system has an incoming data volume of 5 to 8 GB per day, which continues to increase with the development of higher resolution numerical models and more frequently available observations. Maturation of network computing technology and the need for distributed meteorological workstations require frequent transmission of this large volume of data over the Internet or other communication channels (satellite links, for example) to remotely located workstations. Thus, it has become increasingly important to apply appropriate data compression techniques to reduce the volume of data. This also makes it possible to deliver and store large volumes of data on the remote meteorological workstations.
Ning Wang and Sean Madine are investigating the use of compression techniques for various types of meteorological datasets, and the results are being analyzed.
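The article does not detail which techniques are under study, but a simple baseline for gridded fields is to quantize the values to 16-bit integers (a common lossy preprocessing step) and then apply general-purpose lossless compression. The sketch below illustrates this on a synthetic smooth field; it is not a description of the actual FSL work.

```python
import array
import math
import zlib

# A smooth 100x100 synthetic field of 32-bit floats, standing in for a
# gridded meteorological field (smooth fields compress far better than
# noise). This is illustrative only.
field = [math.sin(i / 50.0) * math.cos(j / 50.0)
         for i in range(100) for j in range(100)]
raw = array.array('f', field).tobytes()          # 40,000 bytes

# Quantize to int16 with 0.001 precision (lossy), then compress with
# zlib (lossless). Values lie in [-1, 1], so scaling by 1000 fits int16.
quantized = array.array('h', (int(round(v * 1000)) for v in field))
packed = zlib.compress(quantized.tobytes(), level=9)

print(f"float32 raw: {len(raw)} B, quantized+zlib: {len(packed)} B")

# zlib itself is lossless: the quantized grid round-trips exactly.
assert zlib.decompress(packed) == quantized.tobytes()
```

The tradeoff being weighed in any such study is precision loss (here, 0.001 in field units) against transmission and storage savings, and it differs by dataset: imagery, model grids, and point observations each favor different techniques.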
Transfer of FSL's AWIPS technology to Taiwan's Central Weather Bureau FSL's technology transfer extends to foreign countries in need of advanced weather information systems. Its longest standing cooperative project is the Joint Forecast Systems Project with the Central Weather Bureau (CWB) of Taiwan. Under the management of Fanthune Moeng since 1990, FSL has been collaborating with CWB in modernizing its forecast center. It has gone through many phases of weather system development, and both organizations benefit from increased knowledge of information systems, data assimilation and modeling, high-performance computing, and observing systems. The latest work with CWB involves development and implementation of their new forecast workstation, the Weather Integration and Nowcasting System. More information on this and other international projects is available at the FSL Website.
WorldWide Weather Workstation (W4) The W4 initiative involves plans to design a workstation system that meets the forecasting needs of developing nations worldwide. These countries, of course, experience the same types of natural disasters that occur elsewhere, but often lack the infrastructure available in modernized forecast offices, such as data communications, local capability for operating numerical forecast models, and access to observations and data from surrounding areas.
The advanced workstation technology and varied datasets offered by the W4 system will help forecasters and emergency response managers in these countries to better deal with hazardous weather events, ultimately saving lives. FSL is currently involved in W4 work in various countries in Central and South America.
The W4 project will be featured in future FSL Forum issues.
(More information on these and other FSL projects is available at http://www.fsl.noaa.gov.)
iction System (LAPS). The system is completely integrated into AWIPS, utilizing a cost-effective, multiprocessor, Intel-based Linux cluster to handle the