a large-domain LES of tropical deep convection

From Chin-Hoh Moeng
Date February 11, 2008

Hi,

Sorry for the duplicate email if you happen to be on both mailing lists.

Dear members:

First of all, Dave has changed the name of the “Deep-Shallow Convection” theme to “Physical Processes”. This theme now includes all of the important physical processes of the tropical deep convection system.

The purpose of this email is to seek your advice on how to best make use of a proposed LES of a tropical deep convection system. As many of you know, Marat has agreed to perform a large-domain LES of deep convection using SAM to cover a domain of ~ 300 km x 300 km x 27 km in the tropics with 3000 x 3000 x 256 grid points. This will be a mega-simulation with idealized horizontally homogeneous forcing and SST; we will choose a deep-convection case that has a lot of co-existing shallow clouds. We will run the case for 24 hours---after the spin-up time period---when the deep convection system reaches a quasi-steady state with its shallow clouds and turbulence. The main purpose of this benchmark run is to develop, test, or improve parameterizations of shallow clouds and turbulence effects in CRMs or MMFs. We will also use this benchmark simulation to study some fundamental issues such as: How do deep, middle, and shallow convection, as well as large turbulent convection, co-exist and affect each other? How does the PBL respond to deep convection and precipitation?

The best numerical way to study the above issues is to resolve all of these motions---as best we can. A minimum requirement to resolve all of these important motions (from the deep-convection system down to large turbulent eddies) is probably ~3000 grid points in x and y. We believe the computer power is ready for this mega-run.

Because of the gigantic data size, many analyses (such as statistical profiles) will be computed during the run. For those who are interested in using this LES, please provide a list of statistics that you would like to get---along with your proposed work with these statistics. Please keep in mind that the computation of your statistics should take only a tiny portion of the computer time; we cannot afford any analysis that will slow down the run. Also keep in mind that SAM runs on MPI with a grid partition in x and y. If you are able and willing to develop and test your (and possibly others') analysis output code in SAM yourself, please let us know. Before submitting the full-domain simulation, we could perform smaller (e.g., 500 x 500 x 256) test runs to try out the analyses.
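
As a concrete illustration of this constraint, here is a minimal sketch of
how a horizontally averaged profile statistic reduces to a per-rank partial
sum plus a single MPI reduction under an x-y grid partition. SAM itself is
Fortran; this Python/mpi4py version only illustrates the pattern, and all
function and variable names are hypothetical.

import numpy as np
from mpi4py import MPI

comm = MPI.COMM_WORLD

def horizontal_mean_profile(local_field, nx_total, ny_total):
    """Mean over the full horizontal plane at each model level.

    local_field: (nx_local, ny_local, nz) array on this rank's subdomain.
    Each rank sums over its own columns; one Allreduce of nz values
    combines them, so the per-step analysis cost stays negligible.
    """
    local_sum = local_field.sum(axis=(0, 1))      # (nz,) partial sums
    global_sum = np.empty_like(local_sum)
    comm.Allreduce(local_sum, global_sum, op=MPI.SUM)
    return global_sum / (nx_total * ny_total)

def horizontal_variance_profile(local_field, nx_total, ny_total):
    """A variance profile reuses the same reduction twice."""
    mean = horizontal_mean_profile(local_field, nx_total, ny_total)
    dev2 = (local_field - mean[None, None, :]) ** 2
    return horizontal_mean_profile(dev2, nx_total, ny_total)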

Besides the time series of the above statistics, we will also produce the following output files for later analyses:
(1) Full 3D snapshot fields (about 200 GB each) stored every half hour.
(2) Slightly spatially averaged 3D fields (i.e., averaged over small volumes of, say, 4x4x2 grid points) stored every minute.
(3) Full 2D slices of the fields at several selected grid levels (e.g., near the surface, inside the PBL, at cloud base, at the melting level, and at cloud top) stored every few minutes.
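
For scale, here is a back-of-envelope sketch of the volumes implied by
outputs (1) and (2), assuming single-precision values; the ~20 fields per
snapshot is an assumption chosen to be consistent with the quoted ~200 GB
figure, not a SAM specification.

import numpy as np

nx, ny, nz = 3000, 3000, 256   # grid from the proposal above
bytes_per_value = 4            # single precision
n_fields = 20                  # assumed

snapshot_gb = nx * ny * nz * bytes_per_value * n_fields / 1e9
print(f"one 3D snapshot: ~{snapshot_gb:.0f} GB")   # ~184 GB

def block_average(field, bx=4, by=4, bz=2):
    """Output (2): average over bx x by x bz blocks, shrinking each
    field by a factor of bx*by*bz = 32; dimensions must divide evenly."""
    fx, fy, fz = field.shape
    return field.reshape(fx // bx, bx, fy // by, by,
                         fz // bz, bz).mean(axis=(1, 3, 5))

At one snapshot per half hour, the 24-hour run would produce roughly 48
snapshots, i.e., on the order of 10 TB from output (1) alone.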

This is going to be a very expensive run producing a very valuable dataset. We hope our theme members can learn as much as possible from this run about the dynamical interaction between deep convection and its smaller-scale physical processes, which hopefully will lead to improvements over the existing parameterizations of shallow clouds and the PBL in MMFs. If you are interested in this dataset and have specific ideas on how to utilize it, we would like to hear from you. We are proposing to form a working group of 5-10 people and to start the discussion very soon. This should be done before Marat starts performing the big run, which may begin in 2-3 months.

Chin-Hoh Moeng and Steve Krueger

From Kuan-Man Xu
Date February 11, 2008

Hi, Chin-Hoh and Steve,
It would be more productive to have Marat tell us what basic/standard statistics he will save. Individual investigators can add a few variables to his listing.
By the way, Anning and I will add some parameters related to subgrid-scale microphysics parameterization and third-order turbulence closure.
What is the deadline for submitting the request?
Thanks. ---- Kuan-Man

From Marat Khairoutdinov
Date February 12, 2008

Kuan-Man,

I guess all the fields in the standard lst file will be saved. The deadline for the requests is ASAP.

All,

For those who don't know what the lst-file is, attached is the list of standard horizontally averaged vertical profile statistics that have already been implemented in standard SAM. I am currently working on extending the list, mostly by adding diagnostics contributed by the U of Washington group.

lst

Cheers

Marat

From Robert Pincus
Date February 19, 2008

Chin-Hoh -

I've brought this question up before, so forgive me if this detail has been
worked out, but can you tell me what you're planning to do about radiation?

The issue is that the scales at which you're planning to do this simulation
are well within the regime where horizontal transfers of radiation will be
significant. Given unlimited computer power you'd use a Monte Carlo method
to compute the fluxes and heating rates. Since that's absurd I suggest you
opt for specified, idealized radiation profiles. What you don't want to do
is compute plane-parallel radiation on an O(10-100m) grid - it's
demonstrably wrong and entirely unnecessary.



From Chris Bretherton
Date February 20, 2008

Dear Robert,

To my knowledge, we don't have an efficient non-plane-parallel radiation
scheme coded into SAM, so we have to go with a plane-parallel one we have
(e.g., CAM3 radiation). While I understand this introduces local biases, my
understanding was that plane-parallel still does a decent job on
horizontally averaged fluxes and heating rates, and probably also on the
radiative heating rates in a stratiform anvil region. These are going to
be what matters for the simulated convective cloud ensemble.

Chris

From Marat Khairoutdinov
Date February 20, 2008

Actually, Jason Cole from U of Toronto has implemented a 3D Monte Carlo
radiation scheme in SAM.

As I remember, the only drawback is that it is parallelized by the number
of photons, not by the domain decomposition, so it could be a problem in
the case of very large domains.

Jason, do I remember right?

From Wojciech Grabowski
Date February 20, 2008

All,

I think one should keep in mind why we want to do LES of deep
convection. I do not think cloud-radiation interaction is the focus.
There are other parts of the model that are arguably even less realistic
at LES resolutions (like the microphysics...), so putting a lot of effort
into one part (radiation) and leaving other elements unchanged would
not make sense to me (it would result in a Honda Civic with a Ferrari
race-car engine...).

If I understand the motivation for these experiments (which seems to
be to look at turbulent processes in deep convection), then the setup
should be as simple as possible (thus, prescribed radiative cooling
like in many LES shallow convection simulations, e.g., BOMEX, RICO) and
most of the resources used for as high a spatial resolution as possible (say,
gridlength of 50 m). More realistic physics (better microphysics, PP
or 3D radiation, etc) should be included in follow-up simulations,
sometime down the road...

I am sure most of you agree with this logic...

Wojtek.
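
For concreteness, here is a minimal sketch of what "prescribed radiative
cooling like in BOMEX" amounts to in code; the breakpoints and the -2 K/day
value are illustrative placeholders rather than the exact case
specifications.

import numpy as np

def q_rad(z):
    """Prescribed radiative tendency (K/s) as a function of height z (m):
    steady cooling in the lower atmosphere, tapering linearly to zero
    aloft, in the spirit of the BOMEX/RICO shallow-convection setups."""
    cooling_k_per_day = np.where(
        z < 1500.0, -2.0,
        np.where(z < 2500.0, -2.0 * (2500.0 - z) / 1000.0, 0.0))
    return cooling_k_per_day / 86400.0   # K/day -> K/s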

From Xiaoqing Wu
Date February 20, 2008

Chris and Wojtek's 'logic' makes sense to me. For a deep convection case, the
large-scale forcing will be a dominant factor. The entrainment rate for 3D
deep convection is another important aspect to look at with a 3D
high-resolution idealized LES.

Xiaoqing

From Robert Pincus
Date February 20, 2008

Chris -

An efficient scheme does not exist to my knowledge. Howard Barker has
advocated the idea of using Monte Carlo with few photons, but a) he hasn't yet
published the noise/cost trade-offs, so it's not clear at this point what
we'd want to do, and b) as Marat points out, the radiative transfer cannot
be parallelized across domains. Bjorn and I are working on an alternative,
but it's pie-in-the-sky thinking at this point.

I was trying to make the point that Wojciech made more eloquently, namely
that, if the point is to look at turbulence and mixing processes in the
transition from shallow to deep turbulence, it may make more sense to
idealize the radiation than to do a more expensive but no more realistic
calculation using plane-parallel radiation.



From Kuan-Man Xu
Date February 20, 2008

Hi, there,
Ideally, we should run a large-domain LES simulation with
fully-bin-resolved microphysics (including aerosols) with 3D radiative
transfer and perfect dynamics. This is not possible at this moment because
neither the microphysics nor the radiation parameterization is anywhere
near perfect. Because the SAM LES has been fully tested against all the
boundary-layer cloud cases, we can assume that the dynamics is in much
better shape than the other components of the model. Now, we have to think
about which other components of the model we should try to simplify in
order to achieve the goal of this simulation, i.e., understanding the
interactions between deep convection and shallow convection and gaining a
better understanding of the small-scale processes. Are radiation and
microphysics important to this goal? If not, we should simplify/prescribe
both. If they are, we should simplify both a little bit less (say,
including cloud-top cooling and clear-sky cooling, but nothing else).

By the way, it would be fun to have a Honda Civic with a Ferrari
race-car engine, but not in the context of this LES simulation. Good
thought, Wojtek!
----- Kuan-Man

From Wojciech Grabowski
Date February 20, 2008

Perhaps an irrelevant point at this stage...

> Ideally, we should run a large-domain LES simulation with
> fully-bin-resolved microphysics (including aerosols) with 3D radiative
> transfer and perfect dynamics.

Well, perfect dynamics implies one that accounts for processes across
all scales, from the LES gridlength down to the dissipation scale where
the microphysics works. This is obviously not the case in what we
use today; please look at my 2007 JAS paper (Grabowski, W. W., 2007:
Representation of turbulent mixing and buoyancy reversal in bulk cloud
models. J. Atmos. Sci., 64, 3666-3680.). Steve Krueger would
agree with me on this one. And using detailed micro at 50 m gridlength
is also problematic as one needs to think about what happens at subgrid
scales. For instance, by design current LES detailed micro models assume
homogeneous mixing for subgrid scales, which is definitely not appropriate
for LES resolutions...

Thus, I do not agree that "we should run with...".

W.

From Jason Cole
Date February 20, 2008

All,

> An efficient scheme does not exist to my knowledge. Howard Barker has
> advocated the idea of using Monte Carlo with few photons, but a) he
> hasn't yet published the noise/cost trade-offs, so it's not clear at
> this point what we'd want to do, and b) as Marat points out, the
> radiative transfer cannot be parallelized across domains. Bjorn and
> I are working on an alternative, but it's pie-in-the-sky thinking at
> this point.

Thus far I have been using the Monte Carlo with smaller domains, which
have allowed me to break up the parallelization based on the photons,
since I could replicate the full domain onto each processor. This is
relatively efficient, at least for my research purposes. However, for
large domains, Marat is correct that there is a need to parallelize the
Monte Carlo code in a different manner.

> I was trying to make the point that Wojciech made more eloquently,
> namely that, if the point is to look at turbulence and mixing
> processes in the transition from shallow to deep turbulence, it may
> make more sense to idealize the radiation than to do a more expensive
> but no more realistic calculation using plane-parallel radiation.

I agree with Wojciech, and Robert, that for this simulation you should
keep things as simple, and tractable, as possible and play with things
after the fact. Besides, since this simulation will be such a
significant use of resources, and Marat wants to run it sooner rather
than later, I don't think we would want to start changing the model too
much from its current state.

Jason
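
A minimal sketch of the photon-based parallelization Jason describes, in
which every rank holds a full copy of the domain and traces its own share
of photons; trace_photon is a hypothetical stand-in for the actual Monte
Carlo kernel, and none of this is Jason's real code.

import numpy as np
from mpi4py import MPI

comm = MPI.COMM_WORLD

def monte_carlo_heating(domain, n_photons_total, trace_photon, rng=None):
    """Accumulate heating-rate tallies from this rank's photons, then
    combine with a single Allreduce. Note the limitation raised in the
    thread: `domain` and `tally` must fit on every rank, which is what
    breaks down for very large domains."""
    rng = rng or np.random.default_rng(comm.rank)
    n_local = n_photons_total // comm.size
    tally = np.zeros(domain.shape)
    for _ in range(n_local):
        trace_photon(domain, tally, rng)   # deposits energy along one path
    total = np.empty_like(tally)
    comm.Allreduce(tally, total, op=MPI.SUM)
    return total / n_photons_total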

From Steve Krueger
Date February 20, 2008

Yes I do agree.

One of the unsolved problems in bin-resolved microphysics is how to allow it
to know about SGS variability. One way to do this is to use an approach
similar to the MMF but applied at the subgrid scale in a CRM or LES: use a
1D representation of turbulence, analogous to using a 2D CRM in a GCM.

Steve

From Chin-Hoh Moeng
Date February 20, 2008

Dear Robert:

Sorry for this late reply. I got the flu over the long weekend and only just
got well enough to come back to work.

Yes, you brought this question about radiation up several months ago, and I
thought I had answered you. The time or spatial scale we are most interested
in is the interaction scale between deep and shallow convection (and also
with turbulence), which is perhaps on the order of an hour or less. This is
perhaps also the time scale on which the PBL responds to the deep
convection. We hope that time scale is shorter than the time scale generated
by radiation processes, and longer than that due to microphysics.
I don't know whether this assumption is valid. The heated discussions you
have generated here are definitely very helpful.

For now I tend to agree with Wojtek that we should keep
the other SGS processes as simple as possible if our focus is on the effects
of turbulence and small clouds on deep convection.

Chin-Hoh

From Chin-Hoh Moeng
Date February 20, 2008

Hi, Xiaoqing:

Yes, the lateral entrainment rate of deep convection can be a good topic
to study with this large-domain LES run.
Do you know how to retrieve the entrainment rate from such a simulation?
If you do, we may want to add it to the analyses computed during the run.

Chin-Hoh

From Xiaoqing Wu
Date February 20, 2008

Hi Chin-Hoh,

In the AMWG meeting last week, Guang Zhang, David Neelin, and I had some
discussions on the use of my year-long CRM simulation to investigate the
entrainment and mass flux problems. The 3D LES will certainly have much
higher resolution for this analysis. I would like to compare the
characteristics of the rates from the two runs.

Xiaoqing

From Chris Bretherton
Date February 20, 2008

Dear Xiaoqing, Chin-Hoh, and others,

Peter Blossey and I have developed some 1D statistics for SAM based on
conditional sampling of cloudy updraft properties from which bulk
entrainment and detrainment rate profiles can be derived for the cumulus
ensemble. Peter is working with Marat to make sure these are in the run.

Chris
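
To make the idea concrete, here is a sketch of conditional sampling of
cloudy updrafts and the bulk fractional entrainment it yields under the
steady-plume budget of Siebesma and Cuijpers (1995),
d(phi_c)/dz = -eps*(phi_c - phi_env). The condensate and velocity
thresholds are common but assumed choices, and this is not Peter Blossey's
actual SAM diagnostic.

import numpy as np

def bulk_entrainment(phi, ql, w, z):
    """phi, ql, w: (nx, ny, nz) conserved variable, cloud condensate
    (kg/kg), and vertical velocity (m/s); z: (nz,) heights (m).
    Returns the fractional entrainment rate eps(z) in 1/m."""
    cloudy_up = (ql > 1e-6) & (w > 0.0)          # cloudy updraft mask
    n = cloudy_up.sum(axis=(0, 1))
    phi_c = np.where(n > 0,
                     np.sum(phi * cloudy_up, axis=(0, 1)) / np.maximum(n, 1),
                     np.nan)                     # cloudy-updraft average
    phi_env = phi.mean(axis=(0, 1))              # slab mean as environment
    dphic_dz = np.gradient(phi_c, z)
    return -dphic_dz / (phi_c - phi_env)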

From Chin-Hoh Moeng
Date February 20, 2008

Hi, Chris:

That sounds great to me.

Chin-Hoh

From Mitch Moncrieff
Date February 20, 2008

Chin-Hoh et al,

The idea is to run the large-domain LES simulation for a very short period
(24 hrs). As a matter of fact, the useful length of this simulation is
considerably shorter (perhaps 12 hours), depending on how the simulation is
initialized. At the two extremes, random small-amplitude perturbations take
time to evolve into deep convection, while finite-amplitude excitation means
the model may need several hours to adjust to a realistic state. This
clarifies the scientific priorities - e.g., little can be done in regard to
cloud-radiative interaction, since the characteristic timescale is much
longer than 12-24 hours and, anyway, the radiative heating is completely
dominated by latent heating and evaporative cooling.

One of the greatest uncertainties in the MMF is how "moist turbulence"
a.k.a. "sub-cloud-scale turbulence" a.k.a. "hot towers" should be
parameterized in 4-km-grid models. This includes, but is more than, lateral
detrainment -- what's needed are the transport properties of the hot towers,
turbulence spectra, etc. A pertinent question: how should such an analysis
differ from classical LES analysis apart from the fact that moist variables
are dominant? What cloud-microphysics parameterization should be used is
clearly a salient issue. In a nutshell, this extends, or says in a
different way, what Chin-Hoh, Wojtek, and Xiaoqing have noted as priorities.

Mitch Moncrieff.
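
One of the diagnostics Mitch lists, a turbulence spectrum, could be
computed offline from the 2D slice output described earlier. Here is a
sketch for a doubly periodic slice; the isotropic binning convention is an
assumed choice, not a SAM routine.

import numpy as np

def horizontal_spectrum(w_slice, dx):
    """Isotropic 1D power spectrum of an (nx, ny) slice with grid
    spacing dx (m), e.g., vertical velocity at cloud base."""
    nx, ny = w_slice.shape
    what = np.fft.fft2(w_slice - w_slice.mean())
    power = np.abs(what) ** 2 / (nx * ny) ** 2
    kx = 2 * np.pi * np.fft.fftfreq(nx, d=dx)
    ky = 2 * np.pi * np.fft.fftfreq(ny, d=dx)
    kmag = np.hypot(*np.meshgrid(kx, ky, indexing="ij"))
    kbins = np.arange(0.0, kmag.max(), 2 * np.pi / (nx * dx))
    which = np.digitize(kmag.ravel(), kbins)     # bin by wavenumber shell
    spec = np.bincount(which, weights=power.ravel(),
                       minlength=len(kbins) + 1)
    return kbins, spec[1:len(kbins) + 1]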

From Howard Barker
Date February 20, 2008

I agree with Wojtek's reasoning. I also agree that doing ICA on such a
massive grid is a huge waste of resources (each rad timestep would be
something like doing a full season of rad calculations in a regular
GCM!) and going to 3D represents too much work (for what?). I wonder if
a happy medium exists... like sectioning the domain into 100 20x20 km
sub-domains and doing a McICA on each sub-domain (getting clear-sky and
cloudy-sky averages and allocating them to their respective skies in
each sub-domain)? This would reduce rad calculations to almost nothing
yet still provide something that tracks along with the simulation (i.e.,
better than prescribed)... possibly too much work again??

Could you satisfy yourselves (or have you already) by running some
preliminary simulations at lower resolution with a variety of prescribed
Q_rad profiles, to see whether this configuration cares at all about
radiation? If not, and my sense is that it won't, there's no reason not to
prescribe Q_rad.

Regardless of what you do with rad, it would be nice to get a single
snapshot and feed it to the EarthCARE simulator. Such high resolution
over a large domain is ideal for remote sensing studies.

Howard

From Xiaoqing Wu
Date February 20, 2008

Hi Chris and Chin-Hoh,

Here is a paper by David Gregory about the estimation of entrainment rate which we are using.

Gregory, D., 2001: Estimation of entrainment rate in simple models of convective clouds. Quarterly Journal of the Royal Meteorological Society, 127, 53-72.

Xiaoqing

From Chris Bretherton
Date February 21, 2008

Dear Xiaoqing,

Our approach is also like that discussed by Gregory (2001) in his appendix,
which follows Siebesma and Cuijpers (1995 JAS). We use frozen moist static
energy as our conserved variable, which has the advantage of only having
minor sources/sinks in the updrafts, even in deep convection. We also
choose to sample all cloudy updrafts rather than buoyant cloudy updrafts to
allow consideration of overshoot of convective updrafts. Kuang and
Bretherton (2006 JAS) and Khairoutdinov and Randall (2006 JAS) showed the
application of FMSE to entrainment calculations in SAM, though from a
cumulus ensemble rather than a bulk updraft perspective.

Chris
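
In equation form, the approach rests on a conserved variable and a steady
bulk-plume budget; a sketch of the standard relations (the frozen moist
static energy definition is the usual one, though the exact diagnostic
formulation in SAM may differ):

\[ h_f = c_p T + g z + L_v q_v - L_f q_{ice}, \]
\[ \frac{\partial h_{f,c}}{\partial z}
   = -\,\epsilon \left( h_{f,c} - \overline{h_f} \right)
   \quad\Longrightarrow\quad
   \epsilon = -\,\frac{1}{h_{f,c} - \overline{h_f}}
                \frac{\partial h_{f,c}}{\partial z}, \]

where \( h_{f,c} \) is the cloudy-updraft average, \( \overline{h_f} \) the
horizontal-mean environment, and \( q_{ice} \) the total ice mixing ratio.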
