Nelson Fernández edited Measures.tex almost 11 years ago

Commit id: 2d89c6289b9ea44e21a52ee2e1b41fa8263c0b7b

Self-organization has been correlated with an increase in order, i.e. a reduction of entropy (Gershenson and Heylighen, 2003). If emergence implies an increase of information, which is analogous to entropy and disorder, then self-organization should be anti-correlated with emergence. We therefore propose the measure $S = 1 - I = 1 - E$.

We can define complexity $C$ as the balance between change (chaos) and stability (order). We have just defined measures of both: emergence and self-organization. Hence we propose $C = 4 \cdot E \cdot S$, where the constant 4 normalizes the measure to $[0, 1]$.

For homeostasis $H$, we are interested in how all the variables of a system change or not in time. A useful function for comparing strings of equal length is the Hamming distance. The Hamming distance $d$ measures the percentage of different symbols in two strings $X$ and $X'$.

As has been proposed, adaptive systems require a high $C$ in order to cope with changes in their environment while at the same time maintaining their integrity (Langton, 1990; Kauffman, 1993). Let $X$ represent the trajectories of the variables of a system and $Y$ the trajectories of the variables of its environment. If $X$ had a high $E$, it would not be able to produce its own information; with a high $S$, $X$ would not be able to adapt to changes in $Y$. Therefore, we propose $A = C(X)/C(Y)$.
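
As a minimal worked illustration (our own numbers, not taken from the cited sources), assume $E$ is the Shannon information of a binary source normalized to $[0, 1]$, with $P(0) = 0.75$ and $P(1) = 0.25$:

\begin{align*}
E &= -\bigl(0.75 \log_2 0.75 + 0.25 \log_2 0.25\bigr) \approx 0.811,\\
S &= 1 - E \approx 0.189,\\
C &= 4 \cdot E \cdot S \approx 0.61.
\end{align*}

Similarly, for the equal-length strings $X = 0011$ and $X' = 0111$, the Hamming distance is $d = 1/4 = 0.25$, since the strings differ in one of four positions.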