Whole systems become compressed when they are transmitted repeatedly: each episode of learning and reproduction amplifies biases towards more compressible forms.
Indeed, several graphic communication studies have already shown that individual graphic images become perceptibly more 'abstract' and 'simple' as they are transmitted: they are effectively undergoing a process of compression (REFs). [Expand this] Furthermore, as Garrod et al. found, reductions in complexity tend to correlate with a loss of iconicity.

Our proposal

We propose a new approach to comparing systematic evolutionary change in graphic transmission experiments with real-world writing systems. Our proposal dispenses with measures of iconicity-to-abstraction, focusing only on markers of compression over multiple generations. Using a method pioneered by Helena Miton, we measure compression in individual items by vectorising each sample and taking the resulting file size as a proxy for its descriptive complexity.
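Pending confirmation of the exact measure (see the notes at the end of this draft), one candidate operationalisation is sketched below. It assumes that the compressed byte length of a normalised, binarised glyph image can stand in for descriptive complexity; the function name and example file are ours, for illustration only.

    # Candidate per-item measure (an assumption, not necessarily Miton's
    # exact method): compressed byte length of a binarised glyph image.
    import zlib
    from PIL import Image

    def glyph_complexity(path, size=(128, 128)):
        """Compressed byte length of a normalised, binarised glyph scan."""
        img = Image.open(path).convert("L").resize(size)  # greyscale, fixed scale
        bw = img.point(lambda p: 255 if p > 127 else 0)   # binarise
        return len(zlib.compress(bw.tobytes(), 9))

    # e.g. glyph_complexity("vai_1834_ka.png")  # hypothetical file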
Further, we reject ancient inventions of writing as viable comparators, and turn our attention to one relatively recent reinvention of writing. Indeed, for some time, scholars have suggested that non-literate inventions of writing via stimulus diffusion might serve as heuristics for probing the emergence and evolution of writing itself across time (Diringer 1948; Gelb 1952; Dalby 1967; Kotei 1972; Ferguson 1995; for a summary see Kelly forthcoming [Studi micenei]). Although this intuition has never been properly explored, there are reasons to suppose that it is worth considering. Writing has been reinvented from within non-literate communities at least seven times in recent history. These isolated reinventions potentially serve as naturalistic transmission experiments in script change: the inventors have taught their scripts to new generations of non-literates, who have in turn passed their knowledge on to subsequent generations. And given their independence from existing script families, Galton's problem is already dealt with.
The best known non-literate reinvention of writing is the Cherokee script, developed by Sequoyah in 1821 and used by Cherokee speakers continuously to this day. However, the Vai script of Liberia, created ca. 1833, is a much stronger candidate for comparative analysis, for several reasons: a) it did not take an existing script as its model, as Cherokee did with the Roman script; b) its mode of transmission has remained informal throughout its history (no schools or institutions); c) campaigns to standardise the script occurred only much later in its development; and d) it has been independently documented on fifteen separate occasions between 1834 and 1981, meaning that its full history is adequately known and requires no interpretive reconstruction.
We propose an analysis of field data from a real-world setting that will either provide the first validation of graphic-based semiotic transmission experiments or reveal that these experiments have less explanatory power than their authors have presumed.

Structure of the study

Our study has two parts. In the first part, we expect the transmission-driven compression effects discovered in semiotic experiments (REFs) to be reflected in commensurate changes to the Vai script over its history. We ascertain this by vectorising all samples and deriving complexity via file size, but also by hand-coding changes to the script across relevant parameters, most importantly the addition or subtraction of graphic features.
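For the hand-coding, a minimal record structure might look as follows; the field names and granularity are assumptions to be settled when the coding scheme is finalised.

    # Sketch of a hand-coded record for one character at one attestation.
    # Fields are illustrative; the real coding scheme is still to be fixed.
    from dataclasses import dataclass

    @dataclass
    class Attestation:
        char_id: str           # stable identifier for the Vai character
        year: int              # documentation year (1834-1981)
        features_added: int    # graphic features gained since previous record
        features_removed: int  # graphic features lost since previous record

    def net_feature_change(history):
        """Net gain or loss of graphic features across a character's records."""
        return sum(a.features_added - a.features_removed for a in history)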
In the second part of our study we bring the Vai script into the laboratory, with several experiments under varying conditions. In one, we ask participants simply to learn and then write out individual high-frequency Vai symbols, with each participant's output becoming the input for the next generation. In an alternative version, we ask them to learn and reproduce whole words, to test whether compositional effects of linear writing come into play. A further variant starts participants at different points in the history of the script: one group is trained on high-frequency characters as they were attested in 1834, another on the same set of characters as recorded in 1981. We predict that the 1981 characters will be less susceptible to change than the original 1834 forms, since they have already passed through the transmission bottleneck several times.
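Schematically, all three designs share the same chain logic, sketched below with each participant's learn-and-reproduce step abstracted as a function (all names are ours):

    # Schematic of the diffusion chain: each generation's reproductions
    # become the training stimuli for the next generation.
    def run_chain(seed_stimuli, generations):
        """seed_stimuli: glyphs or words (e.g. the 1834 or 1981 forms);
        generations: one learn-and-reproduce callable per participant."""
        stimuli, record = seed_stimuli, [seed_stimuli]
        for reproduce in generations:
            stimuli = reproduce(stimuli)  # learn, then write out from memory
            record.append(stimuli)
        return record  # complexity is then scored generation by generation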

Predictions

For both parts of the study, our predictions are the same:
1. High-frequency characters within the Vai character set (historical and experimental) will become more compressed over successive generations, both in the graphic form of individual items and in the relationships between items.
This will be indicated by the following:
- Characters will become more self-similar; in other words, these items will come to be generated from the same set of rules, resulting in typographic stereotypes (as, for example, 'stems' in the Roman script, or 'bowls' and 'lunettes' in the Arabic script).
- Graphemes will start off distinctive (and complex) and lose their complexity over successive generations.
- Within character shapes, any repetitious forms (e.g. a series of identical dots, waves, or lines) will become abbreviated; in other words, standard repetitions will come to be inferred rather than written out.
2. The Vai character set will exhibit a greater speed of change in earlier generations than in later generations; in other words, there is a half-life for script change (see the sketch after this list).
- There will be greater levels of redundancy in earlier generations, measured by grapheme inventory size and the presence or absence of allographic variation.
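The half-life in prediction 2 can be made concrete by fitting an exponential decay to mean complexity per generation; the functional form and starting values below are our assumptions, not established parameters.

    # Half-life sketch: fit C(t) = c_inf + (c0 - c_inf) * exp(-k * t) to
    # mean per-character complexity, then report ln(2)/k, the number of
    # generations until half of the total change has occurred.
    import numpy as np
    from scipy.optimize import curve_fit

    def decay(t, c0, c_inf, k):
        return c_inf + (c0 - c_inf) * np.exp(-k * t)

    def half_life(gens, mean_complexity):
        p0 = [mean_complexity[0], mean_complexity[-1], 0.1]  # rough start
        (c0, c_inf, k), _ = curve_fit(decay, gens, mean_complexity, p0=p0)
        return np.log(2) / k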

Diachronic analysis of historical Vai script

dd

Results

dd

Experiments using high-frequency Vai characters

dd

Results

dd

Discussion

d

Conclusion

dd
PK notes (ignore, I will tidy up!)
Item-system trade-offs
-relationship between items in a set
-standardisation pressure, systematicity
-if a rule can generate the whole system, there are consistent patterns, making it more compressible
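One way to put a number on this, using the same compressed-size proxy sketched earlier (still an assumption): if a shared rule set underlies the characters, compressing the set jointly should cost fewer bytes than compressing each character alone.

    # Item-vs-system contrast: share of bytes saved by compressing the
    # whole character set jointly rather than item by item. Higher values
    # suggest more shared (rule-like) structure across the set.
    import zlib

    def system_compressibility(glyphs):
        """glyphs: list of raw byte strings, one per character image."""
        per_item = sum(len(zlib.compress(g, 9)) for g in glyphs)
        joint = len(zlib.compress(b"".join(glyphs), 9))
        return (per_item - joint) / per_item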
1. Ask James to read the Rovenchak article and make sense of the 55 most frequent characters
2. Ask Helena what her measure of compression is, and thus what needs to happen next wrt vectorisation etc.
We expect compression etc. because when something is repeatedly transmitted you tend to amplify biases towards compression. The principle of least effort applies to single items, but contrasts with system-wide compression.
If nothing is found: not all cultural items change when they go through a transmission chain. This would speak to the trade-off between the set and the item. Permanency = resistance to change? Clay impressions.
Egyptian standardised, Vai more language-like. We remove the common sources of inertia, there is no institutional pressure, and yet maybe we still see this resistance to becoming more compressed. If it doesn't change, we can't say it's because of institutional pressure (unless we don't know what the institutions are).
Is the system already close to optimally compressed when it is first generated, e.g. from semasiography? We should find that early versus late items, in both the dataset and the experiments, are equally reproducible.