We value the physical get-together, the focus, and the dedicated time. That is why we have decided to move PDC to 2022, when we are confident we will all be able to travel again and get the most out of it.
Trying to replicate the same experience online would, even if successful, recreate only a small aspect of the real meeting. Since we have already been waiting some time to get together, we thought it would be great to connect again before next year's meeting.
The result is PDC Discussion Coffee (virtual), initially scheduled for 8 June 2021. During the abstract submission stage, we may add more dates, depending on demand.
The format will be quite different from the normal PDC meeting. At PDC we usually value longer talks, with detail and time to explain, rather than a quick showcase of the most admirable results that leaves us wondering whether we got the right idea, how it was done, and so on.
But with different time zones to bridge and us already spending a fair amount of time in video conferences of one kind or another, we did not want to timetable yet another DbVC (Death by Video Conference) marathon.
At PDC Discussion Coffee the idea is to have short talks, no more than five minutes and no more than two slides, in which we pitch our ideas, surprises, next projects, etc., in order to stimulate an online conversation about the topic afterward. Everything will be timetabled within roughly 90 minutes.
What would you do if you had the opportunity to build a climate model from the ground up? Many of the leading global climate models (GCMs) represent generations of improvements built on a baseline software architecture that is decades old. The Simplified Cloud Resolving E3SM Model (SCREAM) project has provided an opportunity to redesign the software architecture for the US-DOE Energy Exascale Earth System Model (E3SM) atmosphere model. A complete rewrite of the model exposes fundamental questions in a multi-physics code, including how to design a code that enables best practices in physics/dynamics coupling research and application. The approach adopted by SCREAM is to (a) minimize the number of variables passed between processes and (b) treat each individual process as a core, model-agnostic component. Inter-process coupling is handled by SCREAM-specific interface layers, which massage input variables into the form needed by each individual process. Model-agnostic processes make it easy to run single processes in standalone test mode and simpler to port to/from other models. But what other best practices should we consider?
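The two design choices described above can be sketched in a few lines. This is a minimal, hypothetical illustration (the names, units, and "microphysics" here are invented for the example, not actual SCREAM code): a model-agnostic process function, plus a model-specific interface layer that massages host-model fields into the form the process expects and back again.

```python
def microphysics_process(temperature_k, water_vapor_mmr):
    """Model-agnostic process: knows nothing about the host model's
    field names, units, or data layout. Toy saturation adjustment:
    condense any vapor above a fixed threshold, releasing latent heat."""
    threshold = 0.02  # kg/kg, purely illustrative
    condensed = max(water_vapor_mmr - threshold, 0.0)
    return {
        "temperature_k": temperature_k + 2500.0 * condensed,
        "water_vapor_mmr": water_vapor_mmr - condensed,
    }


class ScreamInterface:
    """Model-specific interface layer: maps the host model's naming and
    units onto what the process expects, calls it, and maps back."""

    def run(self, host_state):
        # Assume the host stores temperature in Celsius, vapor in g/kg.
        outputs = microphysics_process(
            temperature_k=host_state["T_c"] + 273.15,
            water_vapor_mmr=host_state["qv_g_per_kg"] / 1000.0,
        )
        host_state["T_c"] = outputs["temperature_k"] - 273.15
        host_state["qv_g_per_kg"] = outputs["water_vapor_mmr"] * 1000.0
        return host_state
```

Because the process itself touches only generic, well-defined inputs, it can be exercised directly in a standalone test without any host-model scaffolding.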
This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.
Global cloud-resolving models promise to simplify and improve current global models by removing the need for convective and gravity-wave drag parameterizations. We explore this notion through the development of the X-SHiELD GCRM. We find that although convective plumes and mountain waves are well represented in X-SHiELD, shallow-convective and gravity-wave drag parameterizations still improve clouds and large-scale circulations. These results are discussed in the context of explicit-convection and explicit-gravity-wave simulation, of which GCRMs are a part.
Various studies in recent years have provided evidence that the time integration methods used in the atmosphere component of the Energy Exascale Earth System Model (E3SM) could benefit from significant accuracy improvements. In our attempts to systematically quantify and attribute the time integration error, multiple issues related to process coupling have been identified. This lightning talk will give a brief overview of the methods we are using for the investigation, the causes and impacts of various process coupling issues, and our plans for the next steps.
Precipitation is an important climate quantity that is critically relevant to society. In spite of intense efforts, significant precipitation biases remain in most climate models. Using the DOE-E3SM model version 1, the inclusion of a missing process, convective gustiness, is shown to reduce a pervasive and persistent bias found in many general circulation models that occurs in the Tropical West Pacific. Convective gustiness increases surface evaporation, which acts as an energy source to invigorate the large-scale circulation. A normalized gross moist stability framework is used to diagnose the impact surface evaporation has on the precipitation response to gustiness. Including the impact of another subgrid-scale process, the large eddy wind variance taken from the Cloud Layers Unified by Binormals subgrid turbulence and shallow convection scheme, shows reductions in the Amazon dry bias. These results highlight the importance of interactions between the resolved and subgrid-scale processes, particularly in regions where the resolved surface winds are weak and convection is favorable.
In the GFDL global weather-to-seasonal prediction system SHiELD, the cloud microphysics parameterization is built directly into the FV3 dynamical core. This inline microphysics takes advantage of the rapid evolution of temperature and hydrometeors by dynamical advection. In turn, extreme heating and cloud formation from the cloud microphysics can be rapidly propagated to a broader region. An additional benefit of embedding the microphysics in the dynamical core is that the physical parameterization becomes thermodynamically consistent with the dynamical core. This talk will cover the design of the inline microphysics and its performance in GFDL SHiELD.
Much atmospheric convection is sub-grid-scale in climate models and is therefore represented by convection parameterisations. These parameterisations are responsible for some of the largest and most persistent errors in tropical precipitation. Convection parameterisations traditionally assume a quasi-equilibrium and that there is no net mass transport between grid columns due to convection. Convection parameterisations therefore provide source terms in the temperature, moisture and momentum equations but not in the continuity equation. If convection schemes provided source terms to the continuity equation, the model would become unstable due to explicit treatment of acoustic and gravity waves.
This talk will describe multi-fluid equations to simulate sub-grid-scale convection in one fluid and stable air in another. This equation set includes interactions between convection and the continuity equation but requires changes to the whole dynamical core rather than just a stand-alone parameterisation. Exchanges between the fluids are equivalent to entrainment, and a new formulation for entrainment is presented. We will present solutions representing dry convection at coarse resolution and compare them with high-resolution solutions.
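As a rough sketch of the general structure (following the common form of multi-fluid equation sets in the literature; the talk's exact formulation may differ), each fluid labelled $i$ carries its own continuity equation for its volume fraction $\sigma_i$, with transfer terms coupling the fluids:

```latex
% sigma_i: volume fraction of fluid i, rho: density, u_i: velocity of
% fluid i, S_{ij}: rate of mass transfer from fluid i to fluid j
% (the entrainment/detrainment exchanges mentioned above).
\frac{\partial (\sigma_i \rho)}{\partial t}
  + \nabla \cdot \left( \sigma_i \rho \, \mathbf{u}_i \right)
  = \sum_{j \neq i} \left( S_{ji} - S_{ij} \right)
```

The right-hand side is what distinguishes this approach from traditional schemes: convection now appears explicitly as a source/sink in the continuity budget of each fluid, rather than only in the temperature, moisture and momentum equations.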
The Common Community Physics Package (CCPP) consists of a repository of physics schemes that adhere to a well-defined set of rules governing their data interface, and a software framework for autogenerating "caps" that function as drivers for user-selectable collections (or suites) of compliant physics schemes. The intent and design of this package is to make physics schemes "dycore-agnostic" so that physics may easily be shared across atmospheric modeling systems from many institutions. NOAA's UFS has adopted this package for many of its applications for current and future development, and it is slated for operational use in the near future. Other partners include NCAR and NRL, whose flagship institutional models (such as CESM, MPAS, WRF and NEPTUNE) have adopted or are currently adopting the CCPP framework, or are contributing to CCPP physics in some way. Development of the CCPP has thus far endeavored to maintain as much flexibility for experimenting with physics-dynamics coupling as possible. Through participation in the PDC Workshops, CCPP developers hope to continue to learn of novel methods in the atmospheric modeling community so that their use is not precluded by the CCPP software design. This rapid talk will provide a brief update on the status of the CCPP project and an opportunity for discussion of its use and future development direction.
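The core idea, schemes that declare their interface as metadata and a framework that generates a driver ("cap") for a chosen suite, can be illustrated with a deliberately simplified sketch. This is hypothetical pseudocode in Python, not actual CCPP code (which uses Fortran schemes and metadata files); the scheme names and metadata layout here are invented for the example.

```python
def scheme_a(t, qv):
    """A 'compliant' toy scheme: declares what it reads and writes."""
    return {"t": t + 0.1}  # e.g. a small heating tendency applied directly

scheme_a.metadata = {"in": ["t", "qv"], "out": ["t"]}


def scheme_b(t):
    """A second toy scheme in the suite."""
    return {"t": t * 1.01}

scheme_b.metadata = {"in": ["t"], "out": ["t"]}


def make_cap(suite):
    """Stand-in for the framework's autogenerated cap: for each scheme in
    the user-selected suite, pull the declared inputs from the host state,
    call the scheme, and write its declared outputs back."""
    def cap(host_state):
        for scheme in suite:
            args = {name: host_state[name] for name in scheme.metadata["in"]}
            host_state.update(scheme(**args))
        return host_state
    return cap


cap = make_cap([scheme_a, scheme_b])  # the "suite" is just an ordered list
state = cap({"t": 280.0, "qv": 0.01})
```

The point of the sketch is the separation of concerns: schemes never see the host model directly, and swapping or reordering a suite only changes the list handed to the cap generator.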
Geometric (variational/Lagrangian and bracket) structures that describe irreversible (entropy-generating) processes for geophysical fluid dynamics are now well understood and developed, and such structures can be used to design consistent energy-conserving and entropy-generating subgrid-scale parameterizations. However, at current model resolutions the subgrid-scale processes are mostly reversible (entropy-conserving) dynamics, and it is not clear that treating them by analogy with physical small-scale irreversible processes properly encodes their behavior. This rapid talk will present some tentative ideas about new approaches to consistently treating subgrid-scale processes that take into account their reversible nature.
We present the Python framework Tasmania, which aims to facilitate the numerical investigation of time-stepping-related issues in atmospheric models. The package offers a convenient platform for writing self-documenting and plug-compatible dynamical cores and physical parameterizations with a clean and common interface. Components can be composed via couplers to form flexible, modular and maintainable models which pursue well-defined physics-dynamics coupling algorithms. Within each component, stencil-based computations arising from Eulerian-type dynamics and single-column physics can be encoded using a variety of tools, ranging from scientific computing packages like NumPy and CuPy, to just-in-time accelerators like Numba, to domain-specific libraries like GT4Py. Indeed, Tasmania allows users to define, organize and manage multiple backends in an organic fashion. Infrastructure code ensures that memory allocation, kernel compilation and stencil launch are properly dispatched. This largely relieves the user code of boilerplate and backend-specific instructions, favors a smooth transition from prototyping to production, and ultimately enables performance portability.
Tasmania has enabled the intercomparison of four coupling algorithms on a hydrostatic model in isentropic coordinates. The model is used for vertical-slice simulations of a moist airflow past an isolated mountain. Self-convergence tests show that the coupling error of moist variables (e.g. the precipitation rate) emerges gradually as the spatio-temporal resolution increases. Eventually, each coupling scheme tends towards its formal order of accuracy, given a careful treatment of grid-cell condensation. Indeed, it is found that the well-established saturation adjustment may cap the convergence rate at first order.
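The pattern of plug-compatible components composed via a coupler can be sketched abstractly. The following is an illustrative toy in the spirit of the design described above; the class names and interface are invented for this example and are not the actual Tasmania API.

```python
class Component:
    """A plug-compatible, self-documenting component: declares its inputs
    and outputs, and returns tendencies rather than updating state itself."""
    input_properties = {}      # field name -> units expected
    tendency_properties = {}   # field name -> units of returned tendency

    def __call__(self, state):
        raise NotImplementedError


class Relaxation(Component):
    """Toy 'physics': relax temperature toward a reference value."""
    input_properties = {"temperature": "K"}
    tendency_properties = {"temperature": "K s^-1"}

    def __init__(self, t_ref, tau):
        self.t_ref, self.tau = t_ref, tau

    def __call__(self, state):
        return {"temperature": (self.t_ref - state["temperature"]) / self.tau}


class SequentialCoupler:
    """One possible coupling algorithm (sequential-update splitting):
    apply each component's tendency in turn with a forward Euler step."""

    def __init__(self, components):
        self.components = components

    def step(self, state, dt):
        for comp in self.components:
            for name, tend in comp(state).items():
                state[name] = state[name] + dt * tend
        return state


# Usage: swapping the coupler class would change the coupling algorithm
# without touching any component.
state = {"temperature": 280.0}
coupler = SequentialCoupler([Relaxation(t_ref=300.0, tau=100.0)])
state = coupler.step(state, dt=10.0)  # temperature relaxes toward 300 K
```

Separating "what a process computes" (the component) from "how processes are combined in time" (the coupler) is precisely what makes intercomparisons like the four-algorithm study above practical: only the coupler changes between experiments.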
Can we save computational resources in NWP models by running different parts of the model at different resolutions, but without degrading the quality of the solution?
That is the main question that we are attempting to answer in a project that we have recently started in the new LFRic model at the Met Office. We will investigate how LFRic performs with different components at different horizontal resolutions, e.g. running dynamics at a coarser resolution than the physics and vice versa.
This talk will briefly outline the motivation and scope of this project.
Open discussion amongst delegates and speakers. May even extend beyond the official closing time.