
### Potential use of entropy functions as a measure of complexity

*PELLEGRINO, Emilio Marco; GHIBAUDI, Elena Maria*

##### 2014-01-01

#### Abstract

Reflection on the methodological aspects of current chemical research implies a logical architecture wherein the concept of complex system [1] is essential, and not only from a theoretical viewpoint. In fact, reasoning from a complexity perspective may provide useful insights for data interpretation and experimental design across a variety of experimental research fields, ranging from nanomaterials to protein chemistry. Moreover, the complexity debate opens up new possibilities for classifying the borderline phenomenology that falls at the edge of different disciplines and may be described under different, irreducible perspectives in distinct disciplinary domains. This is the case for many phenomena that lie at the border between chemistry, physics and biology. The investigation of complex systems inevitably leads to the question of whether the complexity degree of a system can be defined, or whether distinct levels of complexity can be recognized within a complex system. From a qualitative perspective, this multiplicity may often appear intuitive, based on classification hierarchies such as the “complexity pyramid of life” proposed by Oltvai and Barabási [2]; but the consistency of these classifications becomes critical as soon as a quantitative evaluation is attempted. Is it really possible to quantify the complexity degree of a system? Some authors propose using entropy as an index of complexity [3]. This claim prompted us to examine the concept of entropy in depth, revealing a plethora of entropy definitions in the scientific literature (more than 40), not always consistent with one another.
In order to investigate critically the relationships among these different definitions and their implications, we classified the entropy definitions into five main groups, each based on a distinct theoretical reference system: i) physical entropy, an experimentally measurable entity, includes the original definition by Clausius (dS = δq/T) and represents a measure of the transformation content (Verwandlungsinhalt) of a thermodynamic system [4]; ii) mathematical entropy, a formal entity, includes all definitions referring to Pfaff’s theory of differential equations and the axiomatic approach to the Second Law of thermodynamics, e.g. the entropies by Carathéodory [5], Lieb-Yngvason [6] and Tsallis [7]; iii) quantum entropy, outlined for the first time by von Neumann [8] with reference to the density matrix formalism, can be seen as a measure of the “purity” of an arbitrary quantum state; iv) cybernetic entropy, related to Information Theory, includes the functions by Kolmogorov and Shannon [9]; v) statistical-mechanics entropy includes definitions developed within Statistical Thermodynamics, e.g. the Gibbs [10] and Boltzmann [11] entropies. Our work leads to two main conclusions: i) awareness of the variety of entropy definitions suggests the need to contextualize the concept of entropy carefully before using it; ii) the plurality of entropy definitions can be exploited as a tool for defining the complexity degree of a system, according to the definition of complex system proposed by Rosen [12]. Indeed, the availability of a variety of functions, referring to distinct theoretical models that are irreducible to one another, is a hallmark of the complex nature of a system. The application of this theoretical approach to the case of a protein system will be discussed.
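For orientation, the definitions named in the five groups above can be written in their standard textbook forms (a reference sketch, not reproduced from the paper itself):

```latex
\begin{align*}
&\text{Clausius (physical):} && dS = \frac{\delta q_{\mathrm{rev}}}{T} \\
&\text{Tsallis (mathematical):} && S_q = k_B \, \frac{1 - \sum_i p_i^{\,q}}{q - 1} \\
&\text{von Neumann (quantum):} && S = -k_B \, \mathrm{Tr}\!\left(\rho \ln \rho\right) \\
&\text{Shannon (cybernetic):} && H = -\sum_i p_i \log_2 p_i \\
&\text{Boltzmann (statistical):} && S = k_B \ln W \\
&\text{Gibbs (statistical):} && S = -k_B \sum_i p_i \ln p_i
\end{align*}
```

The formal kinship of the Shannon, Gibbs and von Neumann expressions, despite their irreducibly different theoretical settings (probability of messages, of microstates, and eigenvalues of a density matrix, respectively), is precisely the kind of plurality the abstract turns into a diagnostic of complexity.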
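The claim that von Neumann entropy measures the “purity” of a quantum state can be made concrete with a small numerical sketch (an illustration of the standard definitions, not code from the paper): the entropy is computed from the eigenvalues of the density matrix, vanishes for a pure state, and reaches ln 2 for a maximally mixed qubit.

```python
import numpy as np

def shannon_entropy(p):
    """Shannon-type entropy H = -sum_i p_i ln p_i (in nats)."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]  # convention: 0 * ln 0 = 0
    return float(-np.sum(p * np.log(p)))

def von_neumann_entropy(rho):
    """Von Neumann entropy S = -Tr(rho ln rho), computed via
    the eigenvalues of the (Hermitian) density matrix rho."""
    eigenvalues = np.linalg.eigvalsh(rho)
    return shannon_entropy(eigenvalues)

# Pure state |0><0|: maximal purity, zero entropy.
pure = np.array([[1.0, 0.0],
                 [0.0, 0.0]])

# Maximally mixed qubit state: minimal purity, entropy ln 2.
mixed = np.eye(2) / 2

print(von_neumann_entropy(pure))   # → 0.0
print(von_neumann_entropy(mixed))  # → ln 2 ≈ 0.6931
```

Note that the same `shannon_entropy` helper serves both the cybernetic (group iv) and the quantum (group iii) definitions: the formulas coincide once probabilities are replaced by density-matrix eigenvalues, which is exactly the kind of formal overlap between irreducible theoretical frames discussed in the abstract.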