Measuring Complexity in an Aquatic Ecosystem


    Abstract. We apply formal measures of emergence, self-organization, homeostasis, autopoiesis, and complexity, developed by the authors, to the components of an aquatic ecosystem. These measures are based on Shannon's information theory. In particular, they were applied to the physicochemical component of an aquatic ecosystem located within the Arctic Circle. The results show that variables with a "homogeneous" distribution of values across the observed states obtained higher emergence values, while variables with a more "heterogeneous" distribution of values had higher self-organization. It is confirmed that variables with high complexity values reflect a balance between change (emergence) and regularity/order (self-organization). In addition, homeostasis values coincided with the variation between the winter and summer seasons, and autopoiesis values confirmed a higher degree of independence of some components (physicochemical and biological) over others. This approach shows how ecological dynamics can be described in terms of information.

    Keywords: Complex Systems, Information Theory, Complexity, Self-organization, Emergence, Homeostasis, Autopoiesis.


    Traditionally, science has been reductionistic. Reductionism, the most popular approach in science, is not appropriate for studying biological and ecological systems, because it attempts to simplify and separate them in order to predict their future behavior and states. Because such prediction is difficult, biological and ecological systems have been considered complex. This "complexity" arises from the relevant interactions between components. It is worth noting that, etymologically, the term complexity comes from the Latin plexus, which means interwoven. In other words, something complex is difficult to separate.

    Being complex, biological and ecological systems have properties such as emergence, self-organization, and life. That is, the dynamics of biological and ecological systems generate novel information from the relevant interactions among their components. These interactions determine the future of a system and its complex behavior. Novel information limits predictability, since it is not contained in initial or boundary conditions. It can be said that this novel information is emergent, since it is not in the components but is produced by their interactions. Interactions can also be used by components to self-organize, i.e., to produce a global pattern from local dynamics. The balance between change (chaos) and stability (order) has been proposed as a characteristic of complexity. Since more chaotic systems produce more information (emergence) and more stable systems are more organized, complexity can be defined as the balance between emergence and self-organization. In addition, two properties support the above processes: homeostasis, which refers to the regularity of states in the system, and autopoiesis, which reflects autonomy.

    Because a plethora of definitions, notions, and measures of these concepts has been proposed, the authors developed abstract measures of emergence, self-organization, complexity, homeostasis, and autopoiesis based on information theory, aiming to clarify their meaning with formal definitions (Gershenson and Fernández, 2012). Here we apply these formal measures to an aquatic ecosystem. From their application to the case study, an Arctic lake, we clarify the ecological meaning of these notions and show how ecological dynamics can be described in terms of information. In this way, the study of complexity in biological and ecological cases becomes easier.

    In the next section, we present a brief explanation of the notions of self-organization, emergence, complexity, homeostasis, and autopoiesis, as well as of limnology (the field that studies lakes). In Section 3, we present the synthetic measures of emergence, self-organization, complexity, and homeostasis. Section 4 describes our experiments and results with the Arctic lake, which illustrate the usefulness of the proposed measures. This is followed by a discussion in Section 5. The article closes with proposals for future work and conclusions.


    The measures applied in this paper have recently been developed and compared with others previously proposed in the literature (Fernández et al., 2012; Gershenson and Fernández, 2012); more refined measures, based on axioms, have been presented in Fernández et al. (2013).

    In general, emergence refers to properties of a phenomenon that are present now and were not before. If we suppose these properties are non-trivial, we could say it is harder now than before to reproduce the phenomenon. In other words, there is emergence in a phenomenon when that phenomenon is producing information; and Shannon proposed precisely a quantity that measures how much information is "produced" by a process. Therefore, we can say that emergence is the same as Shannon's information I. Thus, E = I.
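As a minimal sketch (not the authors' code), E can be estimated from a sequence of observed discrete states by computing Shannon information from symbol frequencies, normalized by the alphabet size so that E lies in [0, 1]. The function name and the frequency-based probability estimate are our assumptions for illustration.

```python
from collections import Counter
from math import log2

def emergence(symbols):
    """Normalized Shannon information E = I of a sequence of discrete symbols.

    Probabilities are estimated from symbol frequencies; the result is
    divided by log2(b), where b is the number of distinct symbols, so that
    E lies in [0, 1].
    """
    counts = Counter(symbols)
    n = len(symbols)
    b = len(counts)
    if b <= 1:
        return 0.0  # a constant sequence produces no new information
    info = -sum((c / n) * log2(c / n) for c in counts.values())
    return info / log2(b)
```

A maximally mixed sequence such as "abab" gives E = 1, while a constant sequence gives E = 0.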

    Self-organization has been correlated with an increase in order, i.e., a reduction of entropy (Gershenson and Heylighen, 2003). If emergence implies an increase of information, which is analogous to entropy and disorder, self-organization should be anti-correlated with emergence. We propose as its measure S = 1 − I = 1 − E.

    We can define complexity C as the balance between change (chaos) and stability (order). We have just defined measures for both: emergence and self-organization. Hence we propose C = 4 · E · S, where the constant 4 normalizes the measure to [0, 1].
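The balance property can be checked directly from the definitions above; this small sketch (function names are ours) shows that C peaks when emergence and self-organization are equal and vanishes at the extremes.

```python
def self_organization(E):
    """S = 1 - E, following the definition above."""
    return 1.0 - E

def complexity(E):
    """C = 4 * E * S; the factor 4 normalizes C to [0, 1]."""
    return 4.0 * E * self_organization(E)
```

At E = 0.5 (perfect balance between change and order) C = 1, while a fully chaotic (E = 1) or fully ordered (E = 0) system has C = 0.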

    For homeostasis H, we are interested in how all the variables of a system change or not in time. A useful function for comparing strings of equal length is the Hamming distance. The Hamming distance d measures the fraction of symbols that differ between two strings X and X′.

    As has been proposed, adaptive systems require a high C in order to cope with changes in their environment while at the same time maintaining their integrity (Langton, 1990; Kauffman, 1993). Let X represent the trajectories of the variables of a system and Y the trajectories of the variables of its environment. If X had a high E, it would change so much that it could not maintain its own information; with a high S, X would not be able to adapt to changes in Y. Therefore, we propose: A = C(X)/C(Y).
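The ratio itself is straightforward; a minimal sketch (the zero-environment convention returning infinity is our assumption, not stated in the text):

```python
def autopoiesis(c_system, c_environment):
    """A = C(X) / C(Y): ratio of system complexity to environment complexity.

    A > 1 suggests the system generates more of its own complexity than its
    environment imposes on it, i.e., a higher degree of autonomy.
    """
    if c_environment == 0:
        return float("inf")  # convention chosen here for an inert environment
    return c_system / c_environment
```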