*** Introduction: 500 words
defining background:
- stochastic
- (deterministic chaos)
- In GCMs: we have a natural focus on decadal variability and spatial patterns
- PDO, AMO: reasonably well captured, but sometimes the time scale is wrong
- are the PDO, AMO *really* modes? (a spectral peak, or a wide bump?)
- looking at the centennial scale is the next step (there must be variability modes)
- the difficulty: you see a _mode_, but how do we _validate_ it? [no data at these scales]
- you _see_ it with, e.g., EOF analysis
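As a reminder of what "seeing" a mode with EOFs involves, a minimal self-contained sketch (synthetic data, all parameters illustrative): the EOFs are just the singular vectors of the (time x space) anomaly matrix. Note that even pure noise has a "leading EOF", which is part of the validation difficulty noted above.

```python
import numpy as np

rng = np.random.default_rng(0)
ntime, nspace = 200, 50

# Synthetic field: one imposed spatial pattern plus weak white noise.
pattern = np.sin(np.linspace(0, np.pi, nspace))   # assumed "mode" shape
pc_true = rng.standard_normal(ntime)              # its (white) time series
field = np.outer(pc_true, pattern) + 0.1 * rng.standard_normal((ntime, nspace))

# EOF analysis = SVD of the anomaly (time-mean removed) matrix.
anom = field - field.mean(axis=0)
u, s, vt = np.linalg.svd(anom, full_matrices=False)
eof1 = vt[0]                            # leading spatial pattern
var_frac = s[0] ** 2 / np.sum(s ** 2)   # fraction of variance explained

# Caveat: a leading EOF always exists, even for noise; explained variance
# and physical interpretability, not mere existence, are what matter.
print(round(var_frac, 2))
```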
# Where does centennial variability come from?
- you cannot isolate centennial variability from the rest
- the _cascade_ from both sides (shorter and longer time scales) will determine what you see at the centennial scale
# Four ways to generate centennial variability
1. linear accumulation / decay of noise (stochasticity)
2. cascade, turbulence, symmetry-breaking processes which are scaling and non-linear in nature
3. mode excitation / resonance / synchronisation, internally or externally forced
4. external forcing (linear or non-linear response to forcing).
- what do we know about external forcing, btw? Solar, volcanic?
- and all combinations thereof (these are not mutually exclusive)
--> characterise what the data would look like under each of 1/2/3/4, and come up with specific criteria / tests.
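A minimal sketch of mechanism 1 (all parameters assumed, purely illustrative): linearly integrating white noise with a multi-decadal decay time, i.e. an AR(1) process, already inflates variance at centennial scales relative to white noise. This is the null hypothesis any claimed centennial mode has to beat.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000                   # years of synthetic record
phi = np.exp(-1 / 30)         # assumed ~30-year decorrelation time

# Mechanism 1: linear accumulation / decay of noise, x_t = phi*x_{t-1} + eps_t
noise = rng.standard_normal(n)
x = np.zeros(n)
for t in range(1, n):
    x[t] = phi * x[t - 1] + noise[t]

# Variance of 100-year (centennial) means relative to interannual variance,
# normalised so that white noise gives ratio = 1.
cent = x[: n - n % 100].reshape(-1, 100).mean(axis=1)
ratio = cent.var() / x.var() * 100
print(round(ratio, 1))
```

Even this simplest linear mechanism concentrates tens of times more variance at the centennial scale than white noise would; distinguishing it from a genuine cascade or mode requires the specific criteria / tests called for above.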
# General questions:
- _what_ is the _scaling_ process: non-linear cascade or linear integration / decay or "mode entrainment"
(see above)
- is centennial variability (and its characteristics) a _global_ feature?
- take vegetation: stability vs instability at different spatial / temporal scales.
- negative and positive feedbacks can dominate at different scales, and at different _epochs_. What do we know about this?
- can we learn something about centennial variability by _studying_ ice ages?
- the role of forcing (is the role of forcing overestimated in the literature?)
- how does the cascade work in GCMs, vs in Nature, vs in simple stochastic dynamical systems?
- How do we explain the transition towards the millennial (fluctuating) regime, and connect (smoothly) with astronomical-time-scale variability?
- What are the proper statistical representations of space-time centennial variability (accounting for intermittency in space/time, and non-Gaussianity)?
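One simple candidate representation is Haar fluctuation analysis, sketched below on synthetic series (scales and series are illustrative): the fluctuation at scale s is the difference between the means of the second and first halves of a window of length s, and its typical size scales as s**H. H < 0 means averaging stabilises the signal; H > 0 means fluctuations grow with scale.

```python
import numpy as np

def haar_exponent(x, scales):
    """Estimate the Haar fluctuation exponent H from a log-log fit."""
    amps = []
    for s in scales:
        half = s // 2
        nwin = len(x) // s
        w = x[: nwin * s].reshape(nwin, s)
        # Haar fluctuation: mean of second half minus mean of first half
        fluct = w[:, half:].mean(axis=1) - w[:, :half].mean(axis=1)
        amps.append(np.sqrt(np.mean(fluct ** 2)))
    return np.polyfit(np.log(scales), np.log(amps), 1)[0]

rng = np.random.default_rng(2)
scales = [4, 8, 16, 32, 64, 128]
white = rng.standard_normal(100_000)              # expect H near -0.5
walk = np.cumsum(rng.standard_normal(100_000))    # expect H near +0.5
h_white = haar_exponent(white, scales)
h_walk = haar_exponent(walk, scales)
print(round(h_white, 2), round(h_walk, 2))
```

This is only one diagnostic; it characterises second-order scaling and would need to be complemented by higher-order (intermittency, non-Gaussianity) statistics.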
# Thinking out of the box
- rethink the 'why' question: don't focus too much on temperature, but ask 'why' the complex system settles into generating the kind of variability we see. Think of biology.
# Material support of centennial variability
- Vegetation
- effects on albedo, hydrology, etc. E.g.: El Niño -> impact -> long-lasting consequences
- role of bare ground albedo
- soil dynamics can generate tipping points / slow dynamics
- Ice sheets
- Carbon cycle
# Case studies
- slab ocean aquaplanet.
# Brainstorming
- we face a difficulty: produce an empirical (observation-driven) 'synchronised' space-time picture of variability
- another difficulty: produce meta-model descriptions of GCM output.
e.g.: we have models that can reproduce the PDO, even though we still do not know what is "causing" it: an integration of ENSO? Or internal ocean variability of the North Pacific? And how much of it is coupled (non-linearly) with ENSO and the gyre circulation?
- modes, described as EOF, change over time (e.g.: anthropogenic forcing)
- CVAS could be a hub for producing space-time reconstructions but of course this exercise requires some hypotheses (e.g.: data assimilation, or more data driven like stochastic dynamical system identification, but do we have enough data? )
- (a couple of papers on Bayesian reconstructions, with AR(1) space-time structure, and also off-line data assimilation: Mike Evans -> PAGES2K / Last Millennium assimilation):
- Are such products potentially problematic from our perspective?
- Can we investigate this?
- How can we iterate / interact with the PAGES2K effort?
- What about the 'non-linear' (non-Gaussian) properties?
- What about the possible detection of tipping points?
- Does the Ensemble Kalman Filter kill the non-Gaussianity?
e.g.: do PAGES2K-style assimilation products lead us to underestimate the probability of very large events? [grey swan]
- Can we suggest more realistic covariance functions (rather than AR(1)), physically motivated, to produce statistical reconstructions (an improvement over earlier works such as Tingley, Li, etc.)?
maybe we need to feed these questions back
- Maybe we do not have enough data to decide between linear and non-linear scaling explanations. Can we keep the broader spectrum of hypotheses open? What does this entail in practice?
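To make the AR(1)-vs-scaling covariance point concrete, a small sketch (parameters illustrative, not fitted to any reconstruction): at centennial lags an exponential (AR(1)) covariance has essentially vanished, while a power-law covariance has not, so an AR(1) prior can suppress exactly the low-frequency covariance we care about.

```python
import numpy as np

lags = np.arange(1, 201).astype(float)   # lags in years

tau = 10.0                      # assumed AR(1) decorrelation time (years)
ar1 = np.exp(-lags / tau)       # exponential autocovariance (variance-normalised)

beta = 0.5                      # assumed power-law decay exponent
power = lags ** (-beta)
power *= ar1[0] / power[0]      # match the two at lag 1 year

# At lag 100 years the AR(1) covariance is ~exp(-10), i.e. negligible,
# while the scaling covariance still carries substantial weight.
print(ar1[99], power[99])
```

The choice of `tau` and `beta` here is hypothetical; the qualitative contrast (exponential forgetting vs power-law memory at centennial lags) is the point.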
# Stakeholders
- PAGES2K
- PMIP (last millennium, control simulation, and control vs forced simulation).
# A science agenda :
Case studies for points 1 - 4 in our "four ways" item.
# Gerrit volunteers for NCC coordination.