2012 Developers/Users Forum Minutes

Developers/Users Forum Minutes

CMAS Conference
October 15-17, 2012

Facilitator: Zac Adelman (UNC)

Panelists: Prakash Karamchandani (ENVIRON), Amir Hakami (Carleton U.), Naresh Kumar (EPRI), Heather Simon (US EPA OAQPS), Jon Pleim (US EPA ORD)


Adelman opened by introducing the concept of the panel discussion. The session is presented as an open forum for discussing thoughts, experiences, and ideas within the community. The objective of the meeting is for the CMAS Center to get feedback from the community on emerging needs and on past experiences with the provided software and services. A series of charge questions will be presented to stimulate discussion. The panelists will have the chance to respond to each question first, and then the discussion will be opened to the audience.

Question 1: For the release of CMAQv5.0, a development period was opened prior to the official release in which the community had a few months to add to the code base and submit their development code to EPA for inclusion in the official release. What were the strengths of this approach, and how might it be improved or done differently for future releases? Please suggest other model release procedures to encourage and leverage community involvement.

Prakash: Overall a positive experience. The beta release helped them get started on their development code prior to the official release. Suggests that there should be a way to give developers faster access to incremental changes to the code between major releases. One way to do this would be to have a separate download area on the CMAS website for development codes that is distinct from the official, operational version of the model.

Jon: Felt the beta release was a good idea, although the actual experience was mixed. There didn't appear to be enough time for the outside developers to work on the code, and coordinating the timing of the different development branches was an issue. From the ORD side, it was difficult to balance the beta development release with the continued development of the main release branch of the codes. The idea of distributing development codes through CMAS is interesting, but there is concern about the propagation and application of unofficial versions of CMAQ in the operational/regulatory modeling community.

Heather: There are advantages to standardized code versions, and there are concerns within EPA about intermediate/unofficial releases being used for SIP development applications; a system would need to be set up that clearly distinguishes between development and official versions of the CMAQ codes.

Naresh: Positive about the beta release; EPRI took advantage of it to sponsor development in CMAQ. Questions remain about the sustainability of this approach given the resources required to develop against new code lines. The outside development community needs EPA to commit to keeping core modules intact between releases; HDDM was given as an example.

Jon: EPA clearly does not have the resources to address all outstanding issues and community needs; an HDDM version is being developed now.

Amir: Liked the structure that the beta release provided for facilitating community involvement in development. Understands that there are practical limitations on the EPA side. He benefited from being able to get early/pre-release versions of the code for development. Would like to see more formal involvement between EPA and outside developers to keep the EPA development direction transparent; outside developers need to know what to expect during CMAQ development within EPA.

Audience:

  • CMAQ is both an academic and regulatory tool; be clear and put a system in place to distinguish between and distribute different versions of the code
  • There could be a better infrastructure to make CMAQ more of a community development effort; not clear who the community should talk to when they have issues during development
  • Look to the Red Hat/Fedora model of how development vs. enterprise releases are handled
  • Does EPA have a code czar? There needs to be a single point of contact who also knows everything that is happening in terms of development within EPA
  • There needs to be a discussion about how to involve the community in more frequent releases; see the WRF example (2 releases per year)
  • Concern about the proliferation of unofficial versions; more frequent releases may be a way to mitigate this issue, although moving in this direction would involve a trade-off of less thorough evaluation for each release
  • A code repository needs to be set up; testing burden for codes in the repository would fall on the community
  • Implement a protocol to embed code version IDs in the code and in the netCDF files written by the code (see the sketch after this list)
  • To address code base stability and the evolution of the codes, nail down the interfaces early in the process; allow the inner workings to change but keep the interfaces stable
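
As an illustration of the version-ID suggestion above, here is a minimal C sketch, assuming the netCDF C library; the file name, attribute name, and version string are hypothetical placeholders, not part of any established protocol. It stamps a code version ID into an existing netCDF output file as a global attribute so downstream tools can trace which model build produced the data:

    /* Minimal sketch: stamp a code version ID into an existing netCDF file as
       a global attribute. Assumes the netCDF C library (link with -lnetcdf);
       the file name and version string are hypothetical placeholders. */
    #include <stdio.h>
    #include <string.h>
    #include <netcdf.h>

    int main(void) {
        const char *version = "CMAQv5.0.1";          /* hypothetical version string */
        const char *fname = "CCTM_ACONC_example.nc"; /* hypothetical output file */
        int ncid;

        if (nc_open(fname, NC_WRITE, &ncid) != NC_NOERR) {
            fprintf(stderr, "could not open %s\n", fname);
            return 1;
        }
        nc_redef(ncid);                              /* enter define mode to add metadata */
        nc_put_att_text(ncid, NC_GLOBAL, "code_version", strlen(version), version);
        nc_enddef(ncid);
        nc_close(ncid);
        return 0;
    }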

Question 2: What are the near-term (2-3 years) and longer-term (5-10 years) research and application needs for modeling and analysis software?

Zac: An EM magazine article summarizes a workshop facilitated by Naresh that outlines current air quality modeling development needs; a link to this article will be posted on the CMAS website. See the Proceedings of the 2011 EPRI Workshop.

Naresh: 1) winter ozone needs to be addressed; 2) a unified chemical mechanism is needed that brings together gas-phase, heterogeneous, and aerosol chemistry in a single mechanism; 3) removal processes need to be researched, improved, and evaluated

There is a need for alternatives to the current generation of dispersion models, particularly for the 1-hour NAAQS

More effort is needed on testing applications of the models that demonstrate and stress their capabilities

Heather: Short-term needs from the regulatory user side include instrumented models (e.g., DDM and source apportionment) and reliable systems for evaluating the impacts from single sources; there is a need for improved visualization tools, and she expressed frustration with VERDI. Long-term needs include continued and robust evaluation efforts: identify and diagnose the causes of poor model performance, resolve model deficiencies, and identify strengths/weaknesses, particularly for conditions and pollutants where the models underperform.

Amir: Integrate observations, including satellite data, into the model predictions. Instrumented models, such as adjoint and backward sensitivity tools, are a short-term need; many tools are available that are not connected with the application/user community. There is a current need for integrated assessment and decision support systems, and the software infrastructure must be implemented to meet these needs. In the longer term, effort needs to be devoted to uncertainty analysis, including the capability for stochastic applications of the modeling systems, and to quantified uncertainties on emissions inventory data. Given the amount of time required to develop model inputs (e.g., emissions and meteorology), a systematic way to share model-ready data is needed.

Jon: Current development includes improving the removal processes in the model, such as bi-directional NH3 and Hg; faster satellite data integration, including land cover data; improved boundary conditions from intensive global modeling; testing adaptive grid structures; and very fine scale modeling for studying the health effects of air pollution, including simulating urban heat islands and urban stress effects.

Audience:

  • Need for improved data sharing infrastructure within the community; torrent server hosts are needed to store and share large data files; if we can provide the infrastructure, we can eliminate the need to mail hard drives around
    • See RSIG as an example of a data sharing infrastructure
    • Look to other existing data warehousing efforts
  • Efforts are underway to make Cyair a reality; will need to develop data sharing tools and training courses to familiarize the community with the new data sharing environments
  • A user from a state agency expressed their happiness with the release of the CoST model and requested that an updated inventory, including future year inventories, be packaged with the model
    • Discussion on how any inventory can be loaded into the CoST database and that what is distributed with the software is only a tutorial to get people started
  • Needs for ozone source apportionment tools in CMAQ
    • Discussion on how ozone SA development is underway in CMAQ and the PM SA is currently being tested
  • Needs for AMET updates
  • Needs for data assimilation; want to be able to assimilate surface data before satellite data
  • There needs to be more focus in the CMAS community on policy making needs; CoST and DDM need better support, documentation, and training
  • Application needs should be driven by pending ozone and PM regulations; maybe a philosophical conversation is needed about what these models are used for
  • Data requirements of the modeling software are restricted by the lack of middleware for users to interact with these data; need tiered sets of tools that allow users to deal with huge amounts of data; step back from the technical issues and look at user needs in different sections of the community: policy/regulatory, health effects, economics, etc.
  • Look to the needs outside of the modeler/developer groups in the community and ensure that the data are being used by people outside of these groups; if not, why and how can we make that happen
    • Start from an immediate need not a philosophical starting point
    • Technology will evolve as needs evolve
  • Multimedia issues are important; are there any ideas to move CMAQ into a multi-media modeling environment, i.e. a one-environment model? Start by coupling to hydrology.
    • There is coupling to an agricultural model (EPIC); CAFOs will be next
  • Need access to more diverse observational networks, such as aircraft and remote sensing data; there are data gaps in the observational data archive, and QC on these data is needed; need more timely surface observations and information on the quality of the data
    • AIRNow data are preliminary; AQS data are the final data
  • Visualization tool needs: interface with other communities; need to link CMAQ with other analysis tools
  • Develop a CMAS community toolbox, maybe through GitHub, where people can exchange tools/scripts/software
  • The CMAQ memory model needs to be upgraded to keep pace with the science; inefficiencies in CMAQ need to be addressed; move towards OpenMP codes (a minimal sketch follows below)
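
As an illustration of the OpenMP suggestion in the last item, here is a minimal C sketch of shared-memory, loop-level parallelism over grid cells; it is illustrative only, not CMAQ code, and the array size and update formula are placeholders:

    /* Minimal OpenMP sketch: threads split a per-grid-cell update.
       Illustrative only, not CMAQ code; array size and update are placeholders.
       Compile with: gcc -fopenmp omp_sketch.c -o omp_sketch */
    #include <stdio.h>
    #include <omp.h>

    #define NCELLS 100000

    int main(void) {
        static double conc[NCELLS];          /* stand-in for a gridded concentration field */

        #pragma omp parallel for             /* each thread updates a disjoint chunk of cells */
        for (int i = 0; i < NCELLS; i++) {
            conc[i] = conc[i] * 0.5 + 1.0;   /* placeholder chemistry/transport update */
        }

        printf("max threads available: %d\n", omp_get_max_threads());
        printf("conc[0] = %f\n", conc[0]);
        return 0;
    }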