NCEP Modeling Synergy Meeting Highlights: January 29, 2007

 

This meeting was led by Bill Bua and attended by Geoff DiMego, Jim Hoke, Dave Michaud, Keith Brill, Mark Iredell, and Ed Danaher. SPC’s Steve Weiss and David Bright, and UCAR/COMET’s Stephen Jascourt, attended via remote conference.

 

1. CCS

 

Dave Michaud reported that the new computer system went live on 24 January (rather than 23 January) because of a critical weather day declaration. The old white and blue machines are to be shut down at 8 a.m. Wednesday, 31 January. Any needed NFS and blue/white GPFS files should be transferred before then.

 

The operational machine is mist (in Gaithersburg, MD), while dew is the developmental machine (in Fairmont, WV).  A third machine with smaller capacity but the same architecture as mist and dew, called haze (an R&D computer, separated from the other computers by a firewall and used by test beds, joint centers, and climate forecasters), is also in Gaithersburg.  It was noted that there has been a shift in where prod and dev are, but that both machines can serve as production sites. Because dew is in Fairmont, there are logistical HPSS issues that will have to be resolved.

 

Machine capacity:

The new machines provide 3 times blue and white’s calculation capacity; in practical terms, the capacity is increased by about a factor of 2.5. There are 16 processors per node (but 32 virtual processors, because LoadLeveler splits each physical processor in half), with each physical processor having 2 GB of memory.  Memory chip speed is about 10% faster.  Most of the increase in capacity comes from doubling the number of processors.  Performance depends on whether the code being run has been optimized.  More information can be found on the IBMdocs web page (http://ibmdocs.ncep.noaa.gov/userman4/index.html; click on Carolyn Pasti’s Quick-Start training, a PPT download). 

 

The following are now usable as aliases: prodccs.ncep.noaa.gov and devccs.ncep.noaa.gov.  Access to prod via LoadLeveler or devonprod is limited to 30 people.  Mirroring of files between mist and dew is being set up.  Some centers have already done mirroring and processing by running cron jobs on both systems.  See http://ibmdocs.ncep.noaa.gov/userman4/Mirror.User.Instructions.htm for more information on how to mirror files between dev and prod (which usually default to dew and mist, respectively). 
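
As a rough illustration of the cron-driven approach some centers use, the sketch below pushes a directory to the peer system with rsync over ssh. The host alias and directory names are placeholders, and the officially supported procedure is the one in the Mirror.User.Instructions page above.

#!/usr/bin/env python
# Hypothetical illustration only -- host and directory names are
# placeholders, not the actual NCEP mirroring configuration.
import subprocess
import sys

PEER_HOST = "devccs.ncep.noaa.gov"   # assumed peer alias
LOCAL_DIR = "/path/to/my/files/"     # placeholder source directory
REMOTE_DIR = "/path/to/my/files"     # placeholder destination

def mirror():
    """Push local files to the peer machine with rsync over ssh."""
    cmd = ["rsync", "-a", "--delete", LOCAL_DIR,
           PEER_HOST + ":" + REMOTE_DIR]
    return subprocess.call(cmd)

if __name__ == "__main__":
    sys.exit(mirror())

A cron entry on each machine could run a script like this on whatever schedule the center needs.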

 

Planned Implementations: 

FY07-Q2:  

·        HYCOM data assimilation:  Carlos Lozano is the Marine Modeling and Analysis Branch (MMAB) contact for this.

·        Expansion of the global ensemble to 20 members plus one control.  Parallel 20-member ensemble data in /com/gens/para/gefs.<date> will be pushed out to ncosrv next week. Implementation is anticipated in March.

 

FY07-Q3: 

·        GFS hybrid/GSI upgrade.  Data are on /com/gfs/para right now.  Note that the .sf files are on the hybrid coordinate (rather than sigma) in the GFS parallel. The Global Climate and Weather Modeling Branch will be fleshing out the implementation schedule soon. Conversion and data integrity between the SSI and GSI are being addressed; BUFRlib will be tested over the weekend.  AIRS and COSMIC satellite data will be added to the data assimilation capability in February or March, for later use in the GFS/GSI data assimilation system.   

·        Hurricane WRF (HWRF) and probabilistic storm surge from the Meteorological Development Lab (MDL). HWRF and GFDL will both be run this year, with computer capacity for up to 4 hurricanes for each at the same time. However, these hurricane runs will bump high-resolution window runs (regional WRF nested in the NAM).  Bill O’Connor is the contact for HWRF, and Arthur Taylor is the MDL contact for the storm surge model forced by the HWRF.  The Tropical Prediction Center is sponsoring the implementation.

 

2. NOTES FROM EMC

 

2a. Global Modeling Branch:

 

Mark Iredell reported that the GSI + hybrid coordinate parallel is being run retrospectively from 7/05 through 10/05 (covering Hurricanes Dennis through Wilma) and from 8/06 through 10/06.  The winter retrospective will cover 2006-07. Results have been mixed thus far: tropical scores are better, while extratropical scores are sometimes better and sometimes worse. The GSI has different background error specifications in the tropics than the old SSI, because the GSI can use regionally varying background errors.
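
The practical effect of a regionally varying background error can be seen in the single-observation form of the analysis update. The sketch below is generic data-assimilation arithmetic, not GSI code, and the error values are invented for illustration only.

def scalar_analysis(background, obs, sigma_b, sigma_o):
    """Single-observation analysis update: the weight given to the
    observation increment grows as the background error sigma_b grows."""
    weight = sigma_b**2 / (sigma_b**2 + sigma_o**2)
    return background + weight * (obs - background)

# Invented error values, not GSI settings: a larger background error
# (e.g., specified for one region) pulls the analysis closer to the ob.
print(scalar_analysis(290.0, 291.0, sigma_b=1.5, sigma_o=1.0))  # about 290.69
print(scalar_analysis(290.0, 291.0, sigma_b=0.5, sigma_o=1.0))  # 290.20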

 

The hybrid coordinate has had hardly any effect except in the Asian jet region, where it does much better because the effect of the Himalayas is simulated much better.

 

The unified postprocessor will be ready in the third quarter. However, it does not output some of the parameters generated by the current postprocessor, so it may be necessary to have a period when output is produced by both the old and new postprocessors until those items can be added. 

 

2b. Mesoscale Modeling Branch:

 

Dave Michaud substituted for Geoff DiMego. NAM changes are expected to be implemented at the end of Q3 or the beginning of Q4 of this year. The major changes would be expansion of the domain, the addition of new data types to the analysis, general work on QPF issues, and updates to the convective parameterization.

 

Regarding the high-resolution (hires) window runs, the plans are to:

 

·        Get them closer to 4 km

·        Instead of 4 large regional domains once per day, run 2 bigger regional domains twice per day.  

 

The time frame for this change is the same as for the next NAM bundle.

 

Q4 changes to the SREF: send out bias-corrected files, and bring all members to 32-km resolution. 
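
The minutes do not record which bias-correction method will be used; a common approach for ensemble products is a decaying-average bias estimate. The sketch below shows only that idea, with the weight and the sample fields chosen purely for illustration.

import numpy as np

def update_bias(prev_bias, forecast, analysis, weight=0.02):
    """Decaying-average bias estimate: fold the latest forecast-minus-
    analysis error into the running bias with a small weight."""
    return (1.0 - weight) * prev_bias + weight * (forecast - analysis)

def bias_correct(forecast, bias):
    """Subtract the running bias estimate from the raw forecast field."""
    return forecast - bias

# Made-up 2-m temperature fields (not SREF data): the forecast runs 1 K warm.
bias = np.zeros((2, 2))
fcst = np.array([[271.0, 272.0], [273.0, 274.0]])
anal = fcst - 1.0
bias = update_bias(bias, fcst, anal)
print(bias_correct(fcst, bias))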

 

Global post changes: A precipitation type algorithm (posting the major type out of an ensemble of 5 algorithms) will be implemented.  Both posts may be run for a while.
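
The algorithm details are not recorded here; the sketch below only illustrates the "post the type chosen by the most algorithms" idea, with invented category names and an assumed tie-breaking order.

from collections import Counter

def dominant_ptype(algorithm_calls):
    """Return the precipitation type diagnosed by the most algorithms.
    Ties are broken by a fixed priority order -- an assumption here,
    since the operational tie-breaking rule is not given in the minutes."""
    priority = ["FZRA", "SLEET", "SNOW", "RAIN"]   # illustrative order only
    counts = Counter(algorithm_calls)
    top = max(counts.values())
    tied = [p for p, n in counts.items() if n == top]
    return min(tied, key=lambda p: priority.index(p) if p in priority else len(priority))

# Example: five hypothetical algorithms disagree at one grid point.
print(dominant_ptype(["SNOW", "SNOW", "SLEET", "SNOW", "FZRA"]))   # SNOW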


 

2c. Global Ensemble Prediction System: 

 

The 0.95-sigma winds will be added to the forecast output for the Marine Prediction Center.

 

2d. Short Range Ensemble Prediction System:

 

Note was made that the SREF tends to send derived products to the field, while the MREF global ensemble takes a “distribute everything” approach, which gives the field the ability to come up with its own ensemble post products. A unified approach might be a good discussion topic for a future meeting.
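
As an illustration of what a “distribute everything” approach makes possible, the sketch below derives a simple exceedance probability from member fields; the member data and the threshold are synthetic, not an NCEP product definition.

import numpy as np

def exceedance_probability(members, threshold):
    """Fraction of ensemble members exceeding a threshold at each grid
    point -- the kind of post product a field office can derive locally
    when every member field is distributed."""
    members = np.asarray(members)        # shape: (n_members, ny, nx)
    return (members > threshold).mean(axis=0)

# Synthetic member fields (not real ensemble output): probability that
# 6-h precipitation exceeds 12.7 mm on a 100 x 100 grid, 21 members.
fake_members = np.random.gamma(shape=0.8, scale=5.0, size=(21, 100, 100))
prob = exceedance_probability(fake_members, 12.7)
print(prob.mean())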

 

2e. Marine Modeling and Analysis Branch (MMAB): 

 

Nothing was reported at this time.

 

3. Feedback from operational centers

 

Nothing was reported at this time.

 

4. The next meeting will be held Monday, February 26, 2007 at noon in EMC Rm. 209, with remote conference capability.