Physical Entropy and Universal Chaos in Biological Diversity

We investigate the longstanding problem of the diversity indices heuristically applied throughout biological systems. Using stochastic simulations, we demonstrate the existence of a relation that encompasses diversity, one of the flagships of Darwinism, and quantum thermodynamics. Our results show that informational entropy is indeed a physical/biological diversity index, but only in the presence of two novel, mandatory phenomenological parameters. Moreover, we solve paradoxes related to diversity-entropy depletion in favor of pattern formation in nature. Our findings pave the way for a unifying theory of phylogenetic analysis, properties of active matter, information theory, and the universal quantum chaos typical of nuclear and mesoscopic physics.


Introduction
In the mid-nineteenth century, Charles Darwin laid the foundations of evolutionary biology, establishing non-linear and irreversible mechanisms of interaction between different species 1 . The diversity observed in nature emerges in a stochastic way, depending strongly on molecular processes codified in the DNA and on the interaction with the environment 2-6 . While the understanding, preservation and management of ecological diversity have become a great endeavor for humanity, there are still missing columns in the foundations of a unified theory of acceptable diversity measures [7][8][9][10][11][12] . Extensive debates over decades about diversity indices, their ramifications and interpretations, have culminated in the denial of the very concept of diversity by a number of biologists 13,14 , although these indices are still widely used in a myriad of ecosystem investigations and, in particular, in phylogenetic analysis 15 .
Most diversity indices, including Shannon's, can be subsumed into the general formula ${}^{q}D \equiv \left(\sum_{i=1}^{R} p_i^q\right)^{1/(1-q)}$, in which the $R$ different species (the richness) are distributed with probabilities $p_i$ over the available habitat loci 14 . The parameter, or order, $q$ defines the entropy employed; in particular, $q = 0$ gives ${}^{0}D = R$, i.e., a diversity index of this order treats rare and common species indistinctly 14,[16][17][18][19][20] . The critical order that weighs all ecosystem species is $q = 1$ which, in the limit, yields the diversity index ${}^{1}D = \exp\left(-\sum_{i=1}^{R} p_i \ln p_i\right)$, precisely the exponential of Shannon's entropy, $S \equiv -\sum_{i=1}^{R} p_i \ln p_i$, which emerges naturally, without any reference to information theory. Biologists would have discovered Shannon's entropy and used it as the main diversity index regardless of information theory. Claude Shannon used the concept of missing information to classify electrical signals and inevitably arrived at a generalization of the Gibbs-Boltzmann physical entropy 21 .
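The family of diversity indices ${}^{q}D$ described above can be sketched in a few lines of code; the helper name `hill_number` is ours, not the paper's. Note how the $q \to 1$ limit reduces to the exponential of Shannon's entropy:

```python
import math

def hill_number(p, q):
    """Effective number of species of order q (the general formula qD).

    p : list of species frequencies summing to 1.
    q = 0 returns the richness R; q -> 1 returns exp(Shannon entropy).
    """
    p = [x for x in p if x > 0]
    if abs(q - 1.0) < 1e-9:
        # q -> 1 limit: exponential of Shannon's entropy
        return math.exp(-sum(x * math.log(x) for x in p))
    return sum(x ** q for x in p) ** (1.0 / (1.0 - q))

# When all species are equally common, every order q returns R = 4,
# the "diversity proportional to richness" property quoted in the text.
p = [0.25, 0.25, 0.25, 0.25]
print(hill_number(p, 0))  # 4.0 (richness)
print(hill_number(p, 1))  # 4.0 (exp of Shannon entropy)
print(hill_number(p, 2))  # 4.0 (inverse Simpson)
```

For heterogeneous abundances the three orders disagree, which is exactly why the order $q$ matters when comparing ecosystems.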
Although entropies are effective diversity indices, there is no obvious reason to claim that they are diversities themselves or, more importantly, it is not at all clear whether they are physical entropies compatible with thermodynamics and with the underlying information theory 22 . Therefore, from the study of biological systems, or active matter [23][24][25][26] , an important question emerges. It concerns, in the first place, the ad hoc existence of entropy merely as a consequence of specific desiderata; secondly, its objective meaning, to such an extent that it could be measured experimentally in a phenomenological way, consistently with the laws of thermodynamics, and not merely inferred as an indirect diversity index.
In this investigation, through methods of quantum thermodynamics 27-31 , nuclear physics 32-34 , uncertainty relations 35,36 and Random Matrix Theory 37-39 , we show that entropy is a valid measure of diversity, provided two new parameters or constants are introduced: the first is multiplicative and cannot be taken as the numerical identity, as it is for electrical signals; the second is additive and can be interpreted in light of the chaos theory underlying the dynamics of interspecies interaction, also demonstrating the compatibility of biological entropy with the third law of thermodynamics. In addition, performing an extensive computational simulation, we provide clear evidence for the existence of universal chaos, similar to that of compound-nucleus systems and mesoscopic devices, competing with the entropic processes, demonstrated here to contain the physical thermodynamics.

Methodology
We implement a stochastic spatial variant of the "rock-paper-scissors game" (RPS), also known as the cyclic Lotka-Volterra model [40][41][42][43][44][45] . The system emulates the emergent behavior of spiral spatial patterns supported by stationary temporal behavior, chemical reactions [46][47][48] , artificial life [23][24][25][26] , among other analogues. While our findings have a broad spectrum of applicability, we will use the prototype model proposed by May and Leonard in which, for instance, three species (RPS 3 ), A, B and C, sustain a cyclical competition held by the codominance rate σ and the reproduction rate µ. The individual interactions read (and similarly for greater species richness): A preys on species B, $AB \to A\emptyset$, and proliferates at a rate µ, $A\emptyset \to AA$, if an empty site, $\emptyset$, is within its immediate reach. For completeness, and to make the model realistic, we incorporate mobility of individuals, characterized by the rate ε, among the closest sites. The generality of the results becomes explicit through the analysis of two further models: one cyclic, contemplating one more species D, and another with the addition of an apex predator, denoted, respectively, RPS 4 and Apex. The Apex reaction is defined as total codominance over the other individuals, but with a finite temporal existence of the Apex. The three models are schematically represented at the top of Fig.(1).
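The update rules above can be sketched as a minimal Monte Carlo loop on a periodic lattice. This is an illustrative sketch only: the neighborhood, update order and the rate values `sigma`, `mu`, `eps` are our assumptions, not the paper's simulation parameters.

```python
import random

def rps3_step(grid, L, sigma=0.5, mu=0.5, eps=0.5):
    """One stochastic update of the cyclic three-species (RPS3) lattice.

    grid[i][j] in {0 (empty), 1, 2, 3}; species k preys on species k % 3 + 1.
    sigma, mu, eps: codominance, reproduction and mobility rates
    (illustrative values, not taken from the paper).
    """
    i, j = random.randrange(L), random.randrange(L)
    di, dj = random.choice([(0, 1), (0, -1), (1, 0), (-1, 0)])
    ni, nj = (i + di) % L, (j + dj) % L          # periodic nearest neighbor
    a, b = grid[i][j], grid[ni][nj]
    r = random.random()
    if a and b and b == a % 3 + 1 and r < sigma:
        grid[ni][nj] = 0                          # predation: AB -> A 0
    elif a and b == 0 and r < mu:
        grid[ni][nj] = a                          # reproduction: A0 -> AA
    elif r < eps:
        grid[i][j], grid[ni][nj] = b, a           # mobility: site exchange

L = 32
grid = [[random.randrange(4) for _ in range(L)] for _ in range(L)]
for _ in range(200 * L * L):                      # random initial condition relaxes
    rps3_step(grid, L)
counts = [sum(row.count(s) for row in grid) for s in range(4)]
print(counts)                                     # [empties, A, B, C]
```

Tracking `counts` over time reproduces the qualitative behavior discussed next: a low asymptotic number of empty sites and species densities fluctuating around fixed values.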

Results and discussion
The three previous models impose the formation of intriguing non-equilibrium spiral patterns that emerge in time even when the initial distribution is random (homogeneous and isotropic). In particular, stationary vortices form, defined by the points of contact between different species domains. The vortices can rotate in a dextrorotatory or levorotatory way and are never annihilated or created after stabilization. Examples are given in Fig.(1) for the three models. While the non-equilibrium pattern can be visualized directly, the density of individuals fluctuates in time, as depicted in Fig.(2), ensuring cyclic coexistence 44 . The number of empty sites tends asymptotically to a low fixed value, while the number of individuals of a species $i$, $E_i$, fluctuates around a fixed value for any $i$, independently of the initial condition. The execution of $10^3$ realizations, varying both initial conditions and random movements, demonstrates the ergodicity of the process, signaled by averages $E_i$ distributed in a Gaussian way, as shown in Fig.(2). While the number (richness) of species is fixed in the stationary regime, the non-equilibrium fluctuations depend on $\{\sigma, \varepsilon, \ldots\}$ and, thus, a concrete question emerges about the diversity of an ecosystem. Diversity indices include those derived from the Shannon-Wiener (SW) entropy, among others from generalized entropies such as Tsallis and Rényi [16][17][18][19][20][21] . In terms of the species richness $R$, such entropies can be written, respectively, as $S_{SW} = -\sum_{i=1}^{R} p_i \ln p_i$, $S_q^{T} = \frac{1}{q-1}\left(1 - \sum_{i=1}^{R} p_i^q\right)$ and $S_q^{R} = \frac{1}{1-q}\ln \sum_{i=1}^{R} p_i^q$, $q$ being the order of entropic non-extensiveness. The effective number of elements is at the heart of diversity in biology. When all species are equally common, diversity is proportional to the number of species (richness, $R$). Entropy can be interpreted as the uncertainty in the identification of species in a sample, not as the number of individuals of each species in the community.
When the number of individuals of each species is heterogeneous, the question naturally arises whether or not a site is occupied by an individual of species $E_i$; the SW entropy in base 2 provides the minimum number of yes/no inquiries required to specify, on average, the species identity $E_i$ in a probabilistically distributed sample. We can primarily ask whether $S_{SW} \equiv -\sum_{i=1}^{R} p_i \log_2 p_i$ is a diversity index that also recovers Gibbs-Boltzmann thermodynamics 21 . Secondly, we can question whether its value provides the uncertainty associated with the heterogeneity of individuals following an objective hypothesis, such as that of molecular chaos in Boltzmann's theory. Furthermore, we can inquire about a solution via entropic non-extensivity, i.e., if the SW entropy does not objectively satisfy a universal criterion for biological systems consistent with thermodynamics, other generalized entropies, for instance those of Tsallis and Rényi, could function as diversity indices. We simulate all entropies for the three models as functions of time. For instance, Fig.(3) shows the SW entropy of the three distinct biological models considered here (RPS 3 , RPS 4 , and Apex) as a function of time. The initial suppression of its value is associated with the pattern formation of the initially random samples. The stability of the process is guaranteed by the entropic depletion, which fluctuates around the average value indicated in Fig.(2). A fluctuating entropy is a characteristic of open systems, i.e., the periodicity of the sample manifests the exchange of resources with the environment. The inexorable production of irreversible entropy over time and its fluctuation naturally lead us to treat it as a random variable distributed according to a probability distribution $P(S)$, whose numerical result, in the form of a histogram, is shown in Fig.(3, right), and which must, as a condition, satisfy the so-called Fluctuation Theorems (FT).
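The three entropies tested in the simulations can be computed directly from species abundance counts; the function names below are ours, and `shannon` uses base 2 as in the yes/no-inquiry interpretation above:

```python
import math

def probs(counts):
    """Relative frequencies p_i from abundance counts, dropping zeros."""
    n = sum(counts)
    return [c / n for c in counts if c > 0]

def shannon(counts, base=2):
    """Shannon-Wiener entropy S_SW = -sum p_i log_base p_i."""
    return -sum(p * math.log(p, base) for p in probs(counts))

def renyi(counts, q):
    """Renyi entropy of order q (q != 1)."""
    return math.log(sum(p ** q for p in probs(counts))) / (1 - q)

def tsallis(counts, q):
    """Tsallis entropy of order q (q != 1)."""
    return (1 - sum(p ** q for p in probs(counts))) / (q - 1)

counts = [400, 300, 200, 100]   # a heterogeneous four-species sample
print(shannon(counts))          # below log2(4) = 2 bits: dominance lowers S
```

A homogeneous sample of $R$ species saturates $S_{SW}$ at $\log_2 R$, recovering the richness-proportional limit quoted earlier.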
As long as $S$ is treated as a biological observable, the FT should represent a generalization of the second law of thermodynamics (for biological systems). Recent results establish a powerful relation between the fluctuations of any observable and the entropy production. The unfolding of these results, generically known as Thermodynamic Uncertainty Relations (TURs), can be written in our notation as [27][28][29][30]

$$\frac{\Delta E_i}{\langle E_i \rangle^2} \geq \frac{2}{\langle S \rangle}, \qquad (1)$$

where $\Delta E_i \equiv \langle E_i^2 \rangle - \langle E_i \rangle^2$ is a Measure of the Amplitude of the Fluctuation (MAF), depicted in Fig.(2), indicating a concatenation with our result shown in Fig.(3).
The uncertainty-type relation between the two quantities is another indication that, in biology, entropy can be seen as a diversity index. Note that Eq.(1) can be written as $\Delta E_i \langle S \rangle \geq \text{constant}$ 49 , indicating that the MAF can be associated with the dominance of a specific species relative to the others at a certain moment of time and is, therefore, inversely proportional to the diversity index. More recently, Eq.(1) has been generalized to contemplate the tightest saturable (TS) trade-off bound 31

$$\frac{\Delta E_i}{\langle E_i \rangle^2} \geq f(\langle S \rangle), \qquad (2)$$

where $f(\eta) = \operatorname{csch}^2(g(\eta/2))$, $g$ denoting the inverse of $\eta \tanh(\eta)$. Naturally, we must inquire whether the entropy commonly used in biological systems satisfies Eqs.(1,2). Equally, we wish to identify whether the diversity indices used in the biology literature are acceptable in the light of non-equilibrium processes at the stochastic level. In the stationary state, we test the three entropies, both in Eq.(1) and in the TS bound, for the three stochastic biological models. In every possible scenario, a diversity index written as described so far violates the TURs. Simply asking whether or not there is an individual in a site does not make it possible to quantify the diversity. In the same manner, changing the logarithm base in the entropy formula produces the same violations. We thereby raise the plausible hypothesis that there is a fundamental constant associated with biological processes, analogous to the Boltzmann constant for gases. From now on, we will call this "universal imperative of biological systems" $K_{BS}$, treating it on the emergent scale in the SW, Rényi and Tsallis entropies as $K_{BS} S_{SW}$, $K_{BS} S_q^{R}$ and $K_{BS} S_q^{T}$, respectively. The TUR test can be implemented by defining a function $g(q, K_{BS})$ as the difference between the left- and right-hand sides of the bound, whose sign indicates satisfaction or violation. In Fig.(4), we present our simulation results in the form of diagrams representing the regions with the sign of $g(q, K_{BS})$ in the stationary regime; very similar results are obtained for the non-stationary regime (early times).
The blank portion of each diagram, for the different entropies, denotes the positive sector (the TUR is satisfied), while the colored sectors denote negative regions (violations of the TUR). In Fig.(4c), we see the parameter $g$ of the SW entropy clearly indicating violation for small values of $K_{BS}$. We conclude that the TUR establishes the validity of the three entropies as diversity indices as long as there is a mandatory fundamental constant $K_{BS} > 1$ in the nature of the problem, even admitting the greatest (TS) $g(q, K_{BS})$ possible from the generalized TUR.
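The TS bound $f(\eta) = \operatorname{csch}^2(g(\eta/2))$ can be evaluated numerically by inverting $\eta \tanh(\eta)$ with bisection. The sign-test function `tur_sign` below is a hedged sketch of the role played by $g(q, K_{BS})$ in the diagrams, not the paper's exact definition:

```python
import math

def g_inv(eta, lo=1e-9, hi=50.0, tol=1e-12):
    """Inverse of x * tanh(x) on (0, inf), found by bisection
    (x * tanh(x) is strictly increasing there)."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if mid * math.tanh(mid) < eta:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

def ts_bound(mean_entropy):
    """Tightest saturable lower bound f(eta) = csch^2(g(eta/2))."""
    x = g_inv(mean_entropy / 2.0)
    return 1.0 / math.sinh(x) ** 2

def tur_sign(var_E, mean_E, mean_S, k_bs):
    """Sign test: positive means the rescaled entropy K_BS * <S> satisfies
    the TS trade-off, negative means violation (illustrative form only)."""
    return var_E / mean_E ** 2 - ts_bound(k_bs * mean_S)

print(tur_sign(var_E=0.5, mean_E=1.0, mean_S=3.0, k_bs=2.0) > 0)  # True
```

Because `ts_bound` decreases as its argument grows, multiplying the entropy by a larger $K_{BS}$ loosens the bound, which is consistent with the conclusion that the indices become valid only for $K_{BS} > 1$.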
The understanding of the arrow of time in biological processes requires association with the concept of information. The decision about an individual's presence in a locus requires a binary response; that is, if $V$ is the volume of the (configuration) phase space occupied at a certain moment of time $t$ and $Z$ is its total volume, the informational content of the system can be written as $I = \log_2(Z/V)$. Entropy, from the informational perspective, is the "lack of information" 50 and can be written as $S = k_{TL} - I$, where the constant $k_{TL}$ is arbitrary. In terms of phase-space volume, $S = \log_2 V + C$, where $C$ is some other constant. Note that all our stochastic simulations demonstrate that the entropies of the systems decrease until they reach the equilibrium established together with the formation of stable spiral patterns. Therefore, in the stationary regime, the volume $V$ is smaller than the total volume of the configuration space. The formation of emerging patterns means that the presence of an individual in a locus affects the probability of detecting an individual of the same type in a neighboring locus; that is, there will always be a minimum conditional information content about the ecosystem associated with the stationary pattern itself.
Whereas the entropic mechanism tends to increase the occupied fraction of the configuration space until the information is zero, $V = Z$, another mechanism tends to deplete it. This paradox 51 can be understood in the light of the symplectic geometry associated with Liouville's space, which states that the phase-space volume tends to remain constant in time 52,53 . Thus, chaos solves the paradox: the shape of the phase space can change drastically, but its volume tends to remain constant in the stationary condition [54][55][56] .
The conceptual evidence of chaos in biological systems has been established in different investigations [40][41][42][43][44] ; here we show, for the first time, its universality property, also present in billiards with arbitrary geometries and in the quantum billiards characterizing heavy nuclei and mesoscopic systems [32][33][34][37][38][39] . We build the correlation matrix $C = \tilde{H}\tilde{H}^T$, in which the matrix entries $\tilde{H}_{ij} \equiv (H_{ij} - \langle H_j \rangle)/\sigma_{H_j}$ make explicit the number of individuals $H_{ij}$ of any species at time $i$ of realization $j$, and $\langle H_j \rangle$ ($\sigma_{H_j}$) is the average (standard deviation) in the ensemble. The eigenvalues of $C$ are ordered, $\lambda_1 < \lambda_2 < \cdots < \lambda_M$, in order to define the level-spacing statistics $s_i \propto \lambda_{i+1} - \lambda_i$, also known as the Nearest Neighbor Spacing (NNS). If the NNS distribution is exponential, $P_0 \propto e^{-s_i}$ (Poisson-like), the temporal evolution is random; if not, the evolution characterized by the correlation matrix is chaotic. Additionally, if there is maximum correlation in the ensemble that defines the dynamics, the system satisfies the so-called universal chaos and the spacing distribution belongs to the Gaussian Orthogonal Ensemble (GOE). We observe GOE statistics in all systems, indicating level repulsion and nicely confirming our prediction about the presence of chaos. In essence, biological systems satisfy the universal chaos [37][38][39] , which explains the paradox of the arrow of time in those systems. The existence of diversity indices based on physical informational entropies also imposes modifications to them, in such a way that $S^{\dagger} = K_{BS} S + K_{TL}$.
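The NNS analysis above can be sketched as follows; the toy matrix `H` stands in for the species time series, and the normalization conventions are our assumptions. GOE level repulsion shows up as a suppression of small spacings, $P(0) = 0$, in contrast to the Poisson pile-up at $s = 0$:

```python
import numpy as np

def nns_spacings(H):
    """Unit-mean nearest-neighbor spacings of the correlation matrix
    C = H~ H~^T, where H~ standardizes each realization (column) of H."""
    Ht = (H - H.mean(axis=0)) / H.std(axis=0)   # standardize each column
    C = Ht @ Ht.T / H.shape[1]
    lam = np.sort(np.linalg.eigvalsh(C))        # ordered eigenvalues
    d = np.diff(lam)                            # raw spacings
    return d / d.mean()                         # normalize to unit mean

def wigner_goe(s):
    """GOE Wigner surmise P(s) = (pi s / 2) exp(-pi s^2 / 4)."""
    return (np.pi * s / 2) * np.exp(-np.pi * s ** 2 / 4)

rng = np.random.default_rng(0)
H = rng.normal(size=(50, 200))      # toy time-by-realization matrix
s = nns_spacings(H)
# Poisson-like (random) dynamics piles spacings near s = 0;
# level repulsion suppresses them: wigner_goe(0) = 0.
```

In practice the histogram of `s` from the simulated ecosystems would be compared against `wigner_goe` and against the Poisson form $e^{-s}$.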
Fixing $K_{TL} = S_{TL}$, we can interpret it as a minimum entropy [54][55][56] . We show numerically that adding $S_{TL}$ to the commonly used entropy also leads to satisfaction of the TUR. More generally, the presence of $S_{TL}$ can readily be considered the requirement for establishing an analogy with the third law of thermodynamics for biological systems, if we consider that zero entropy would occur in an ecosystem with a single species, a condition never found in nature, which always requires a food chain. This remarkable feature is similar to the position-momentum quantum uncertainty relation, $(\Delta x)^2 (\Delta p)^2 \geq \hbar^2/4$, which imposes movement on a set of atoms that never reaches zero temperature. The reciprocal relation $\Delta E_i \langle S \rangle = \text{constant}$ captures direct interpretations of an ecosystem. For instance, consider $S = 4$ in interactions of the RPS 4 and Apex types. According to Fig.(2), the MAF in the presence of the Apex is lower, which indicates a sparser distribution and a tendency to suppress the chaos manifest in natural patterns, exactly as shown in Fig.(1). Therefore, we can infer that the diversity $S$ (the variable canonically conjugate to the MAF) in the presence of the Apex is amplified, which is nicely confirmed in Fig.(3). Experimental evidence of amplified diversity in the presence of an apex predator can be found, for example, in 57 .

Discussion
Using stochastic simulations, we have demonstrated the reconciliation between biological diversity indices and quantum thermodynamics, thereby removing the persistent doubt, in a vast literature, about the ad hoc character of an entropy merely seen as a desideratum obtained independently in biology. However, we show that two new phenomenological constants must be introduced into the definition of entropy. In particular, the multiplicative constant, here referred to as the "universal imperative of biological systems", establishes the connection with the second law of thermodynamics, emerging from the analysis of Thermodynamic Uncertainty Relations; the additive constant, on the other hand, provides an interpretation of the third law of thermodynamics and arises from the concatenation of information theory and chaos theory. According to our analysis, the commonly used diversity indices must undergo corrections that give them a physical/thermodynamic character. In particular, the diversity derived from the Shannon-Wiener entropy is rewritten in terms of the new parameters as $D^{\dagger} = e^{S^{\dagger}} = e^{K_{TL}} ({}^{1}D)^{K_{BS}}$, ${}^{1}D$ being the effectively non-physical diversity index.
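The corrected diversity is a simple transformation of the usual Shannon-based index; a minimal sketch (the function name is ours, and the constants are free parameters to be fixed phenomenologically):

```python
import math

def corrected_diversity(p, k_bs, k_tl):
    """Physical diversity D+ = exp(K_BS * S + K_TL) = e^{K_TL} (1D)^{K_BS},
    where 1D = exp(S) is the usual Shannon-based diversity index."""
    S = -sum(x * math.log(x) for x in p if x > 0)   # Shannon entropy
    return math.exp(k_tl) * math.exp(S) ** k_bs

# With K_BS = 1 and K_TL = 0 the correction is the identity: D+ = 1D.
p = [0.25] * 4
print(corrected_diversity(p, 1.0, 0.0))   # 4.0, the richness of the sample
```

Since $K_{BS} > 1$ and $K_{TL} > 0$, the physical diversity is strictly larger than the uncorrected index for any non-trivial sample.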
Moreover, we find evidence of universal chaos in biological systems, demonstrating similarities with nuclear and mesoscopic systems, and we also analyze its relation with negentropy and the third law of thermodynamics. These remarkable features are significant for elucidating questions about the physical/informational character of biological entropies and the natural and universal emergence of the diversity concept. Our work can be straightforwardly extended to the study of a variety of systems, such as artificial life, chemical reactions, and social and economic systems [58][59][60] .