Preprint Article · Version 1 · This version is not peer-reviewed

Stochastic Parameterization of Moist Physics Using Probabilistic Diffusion Model

Version 1 : Received: 16 September 2024 / Approved: 16 September 2024 / Online: 16 September 2024 (16:48:27 CEST)

How to cite: Wang, L.-Y.; Wang, Y.; Hu, X.; Wang, H.; Zhou, R. Stochastic Parameterization of Moist Physics Using Probabilistic Diffusion Model. Preprints 2024, 2024091262. https://doi.org/10.20944/preprints202409.1262.v1

Abstract

Deep-learning-based convection schemes have received wide attention for their impressive improvements in the simulated precipitation distribution and tropical convection of Earth system models. However, they cannot represent the stochasticity of moist physics, which degrades the simulation of large-scale circulations, the climate mean state, and variability. To address this problem, a stochastic parameterization scheme based on a probabilistic diffusion model, named DIFF-MP, is developed. Cloud-resolving data from the GRIST model are coarse-grained into resolved-scale variables and subgrid contributions due to moist physics to form the training data. DIFF-MP's performance is compared against a generative adversarial network and a variational autoencoder. The results show that DIFF-MP consistently outperforms the other two models in prediction error, coverage ratio, and spread–skill correlation. The standard deviation, skewness, and kurtosis of the subgrid contributions generated by DIFF-MP are also closer to those of the test data than the other models'. Interpretability experiments show that DIFF-MP's parameterization of moist physics is physically reasonable.
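To make the diffusion-model idea concrete, the sketch below shows the standard forward (noising) process that probabilistic diffusion models such as DIFF-MP build on: a clean sample is progressively corrupted toward Gaussian noise, and a network is trained to reverse this process, here conditioned on the resolved-scale state. This is a minimal illustration under assumed sizes and a generic linear noise schedule; the toy "subgrid tendency" profile, the step count `T`, and the schedule values are illustrative assumptions, not taken from the paper, and the learned denoising network is omitted.

```python
import numpy as np

rng = np.random.default_rng(0)

T = 1000                                   # number of diffusion steps (assumed)
betas = np.linspace(1e-4, 0.02, T)         # generic linear noise schedule
alphas = 1.0 - betas
alpha_bar = np.cumprod(alphas)             # \bar{alpha}_t, decreasing in t

def q_sample(y0, t, noise):
    """Draw y_t ~ q(y_t | y_0) = N(sqrt(abar_t) * y_0, (1 - abar_t) * I)."""
    return np.sqrt(alpha_bar[t]) * y0 + np.sqrt(1.0 - alpha_bar[t]) * noise

# A toy stand-in for a coarse-grained subgrid moist-physics tendency profile.
# In a DIFF-MP-like scheme, the reverse process would be a neural network
# conditioned on resolved-scale variables; it is not implemented here.
y0 = rng.standard_normal(64)
noise = rng.standard_normal(64)

y_mid = q_sample(y0, 100, noise)           # partially noised: signal dominates
y_end = q_sample(y0, T - 1, noise)         # late step: nearly pure noise
```

The stochasticity of the parameterization comes from this sampling: starting the reverse process from different noise draws yields an ensemble of subgrid contributions for the same resolved-scale input, which is exactly the spread a deterministic scheme cannot produce.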

Keywords

convection parameterization; diffusion model; generative model; machine learning

Subject

Environmental and Earth Sciences, Atmospheric Science and Meteorology
