1. Introduction
From quantum mechanics, it is well known that any separable state belongs to a composite space that can be factored into individual states from separate subspaces. A state is said to be correlated if it is not separable. In fact, determining whether a state is separable is not trivial; the problem is classified as NP-hard in the theory of complex systems [1].
A density operator, in quantum mechanics, is used to describe the statistical state of a quantum system. Its usual interpretation is that the eigenvalues are the probabilities of finding the system in the states corresponding to the eigenvectors. In a physical sense, we can see its elements as relative frequencies corresponding to an appropriate ensemble of $N$ identical copies of the system that are in several possible states under a certain setup or preparation protocol. Thus, we have a statistical mixture of quantum states $|\psi_i\rangle$ with probabilities $p_i$ (real numbers) satisfying $\sum_i p_i = 1$. However, it can be shown that the density operator formalism can be recovered in a Bayesian formalism for noncommutative expectations, wherein the system depends on the order of the measurements. This is not restricted to quantum mechanical systems, and can be understood under the framework of what we have called a fragile system [2]. This operator provides a convenient means for describing quantum systems whose state is not completely known, it being mathematically equivalent to a state vector approach [3,4].
By applying the entropy to the density matrix, we can obtain the degree of missing information about the state of the system. Systems can be composed of subsystems and, using the subadditivity property (the entropy of the whole is at most the sum of the entropies of its parts) [5], it is possible to quantify whether the entropy of the whole is less than that of its parts. Holzer and De Meer [6] compare the information at the system level with the information at a lower level. As they state, “this measure gives a high value of emergence for systems with many dependencies (interdependent components) and a low value of emergence for systems with few dependencies (independent components)”; therefore, the information of the whole is more than the information of the parts. In that sense, entropy can be a good parameter to measure a type of emergence in systems.
This paper is organized as follows. In Section 2, we discuss emergent systems and their current definitions. In Section 3, we define fragile systems as those that are modified by the act of measurement because of the change in internal variables. In Section 4, we introduce the density matrix formalism. In Section 5, we depict a mathematical formulation of emergent systems within the density matrix formalism. In Section 6, we show a concrete example of a subadditive system. Finally, we provide some concluding remarks in Section 7.
2. Emergent Systems
Several definitions of emergence exist, taking into account different aspects of its origin or behavior. For instance, Peter Checkland [7] defines emergence as “the principle that entities exhibit properties which are meaningful only when attributed to the whole, not to its parts”. Emergent systems are structured in such a way that their components interact, giving rise to global patterns that appear as a consequence of interrelations/correlations between subsystem elements and are the result of complex, self-organizing processes. This process may be triggered by an external stimulus.
There are basically three types of emergence: simple, weak and strong, described as follows:
Simple emergence arises from the combination of certain properties and relationships between elements with non-linear behavior. For instance, in order to achieve the flight of an airplane, we cannot consider the motors, the propulsion system and the wings separately; all of these properties must be considered together because they are interconnected, and flight emerges through their interrelations. This type of emergence can be predicted from the functioning of its parts, and it is related to the concept of synergy, which means interacting or working together [8].
Weak emergence describes the emergence of properties of systems that may be predictable (although not completely) and also reducible. They can be reduced to basic rules at an initial time. After a while, the behavior can become unpredictable, as mentioned in chaos theory [9]; nevertheless, it is possible to run computational simulations of such systems because the basic rules are known. Weak emergence is the product of complex system dynamics (i.e., non-linear behavior, spontaneous order and adaptation); an example of the latter is the cellular automaton known as Conway’s Game of Life [10,11,12].
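To make this concrete, the following minimal Python sketch (our own illustration; the update rule and the glider seed are the standard ones, while the grid size and step count are arbitrary choices) shows how a traveling “glider” pattern emerges once the purely local rules are simulated:

```python
# Minimal sketch of weak emergence: Conway's Game of Life.
# A glider pattern "emerges" and travels, although only local rules are coded.
import numpy as np

def life_step(grid: np.ndarray) -> np.ndarray:
    """Apply one step of Conway's rules to a 2D array of 0s and 1s."""
    # Count the eight neighbors of every cell, with periodic boundaries.
    neighbors = sum(np.roll(np.roll(grid, dy, axis=0), dx, axis=1)
                    for dy in (-1, 0, 1) for dx in (-1, 0, 1)
                    if (dy, dx) != (0, 0))
    # A live cell survives with 2 or 3 neighbors; a dead cell is born with 3.
    return ((neighbors == 3) | ((grid == 1) & (neighbors == 2))).astype(int)

grid = np.zeros((10, 10), dtype=int)
grid[1:4, 1:4] = [[0, 1, 0], [0, 0, 1], [1, 1, 1]]  # a glider seed
for _ in range(4):                # after 4 steps, the glider has moved
    grid = life_step(grid)        # one cell diagonally across the grid
```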
Strong emergence, like weak emergence, is a case of unexpected emergence. The difference lies in its non-reducible behavior, which appears only when the system is running. Although it is systematically determined by low-level attributes, it is not possible to deduce it from the components at lower levels. The consciousness phenomenon is one example of this type of emergence, and appears as a construction process. There is likely no algorithm from the bottom up because it is a dynamical process evolving over time at the highest level, with non-linear relations at the lower ones.
Another way of conceptualizing emergence is the separation of levels of complexity of the system at different spatial or temporal scales [13].
Some main characteristics of strong emergence to consider are the following:
Non-reducible phenomenon: the global state of the emergent system cannot be explained by, nor reduced to, its subsystem components.
Downward causation: emergent high-level properties appear as a non-obvious consequence of low-level properties, but, at the same time, all processes at the lower level of the hierarchy are constrained by and act in coherence with the laws of the higher level [14].
Wholeness: a phenomenon wherein a complex, interesting high-level function appears as a result of combining low-level mechanisms in straightforward ways.
Radical novelty emergence: a phenomenon wherein a system is designed according to certain principles, yet interesting, unexpected properties arise from the behavior of subsystem elements [15].
In general, emergent systems are common in nature and technology. One example of the latter is the speed of a vehicle, affected by the center of gravity, the driver’s skills, the weather and friction, among other attributes (in this case, we can predict the emergent property of speed from the relation between components, making it a case of weak emergence). It is possible to find strong emergent properties in the consciousness phenomenon [16], the human body and social phenomena, among others. One important example is ‘self-awareness’, which is a result of the interconnection of neurons in the brain [17].
In the case of biological systems (as well as in the case of social phenomena), emergent models are appropriate for describing those situations. We can observe the characteristics mentioned regarding strong emergence: non-reducibility, radical novelty, wholeness and downward causation.
The main aspect of biological emergent systems is that they may be observed from inside by the system itself (“from internal control of a system which might be fully controlled by an observer/controller architecture that is part of the system” [18]). This is in concordance with the autopoiesis theory proposed by the Chilean biologists H. Maturana and F. Varela [19] to define the self-maintaining chemistry of any living cell; from the perspective of chemical organization theory, it has been used to formalize autopoietic structures, “providing a basis to operationalize goals as an emergent process” [20]. When the measurement is affected by the observer, we have what we will call a fragile system. In the next section, we will go deeper into this concept.
3. Fragile Systems
In simple terms, a fragile system is one that is affected by the measurement [21]. This distinguishes it from a non-fragile (classical) system, which is not modified upon observation.
Because any system (fragile or not) possesses information, we will think of a system as a “black box” that can be found in different internal states, denoted by $s$. In general, $s$ contains many degrees of freedom, but we will not make use of that inner structure here. The internal state contains all of the information necessary to describe any aspect of the system.
The crucial difference between a fragile system and a non-fragile one is that, in a fragile system, access to the internal state $s$ is impossible, because it is precisely this internal state that is modified by the measurement. As the modification of the state $s$ depends on the details of the environment when carrying out the measurement (which we do not know or control with accuracy), the outcome of a measurement is unavoidably stochastic, and a mathematical formulation requires probability theory [22].
4. The Density Operator
In this section, we introduce the density matrix operator as a preliminary concept, in order to define emergent systems through the calculation of entropies.
The density operator is a positive semi-definite Hermitian operator of trace one. If $A = (a_{nm})$ is the matrix representation of an arbitrary observable $\hat{A}$ in a basis $\{|n\rangle\}$, we can write

$$\hat{A} = \sum_{n,m} a_{nm}\,|n\rangle\langle m|.$$

Hence, since $\hat{A}$ is Hermitian, we have

$$a_{nm} = a_{mn}^{*},$$

where $a_{mn}^{*}$ is the conjugate of the element $a_{mn}$, and, in the case of real numbers, the same element. We can take the average of different measurements represented by different matrices $A^{(k)}$ of the same observable, and then normalize and diagonalize it; finally, we obtain the density matrix of a mixture as below:

$$\rho = \sum_{i} p_i\,|i\rangle\langle i|,$$

where $\sum_i p_i = 1$ and each $|i\rangle$ is obtained from the eigenvalue problem $\rho|i\rangle = p_i|i\rangle$ [21].
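As a rough numerical illustration of this recipe, the following sketch averages two hypothetical Hermitian measurement matrices (invented for illustration only), normalizes the result to unit trace, and solves the eigenvalue problem:

```python
# Sketch of the recipe above: average measurement matrices, normalize to
# unit trace, and diagonalize to obtain probabilities p_i and eigenstates |i>.
# The two example matrices are hypothetical stand-ins for real measurements.
import numpy as np

A1 = np.array([[2.0, 1.0], [1.0, 2.0]])   # Hermitian measurement matrix
A2 = np.array([[4.0, 1.0], [1.0, 2.0]])
A_avg = (A1 + A2) / 2                      # average over measurements
rho = A_avg / np.trace(A_avg)              # normalize: Tr(rho) = 1
p, vecs = np.linalg.eigh(rho)              # eigenvalue problem rho|i> = p_i|i>
assert np.isclose(p.sum(), 1.0)            # the eigenvalues sum to one
```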
We can formulate the density matrix operator by the use of a complex Hilbert space, just as in von Neumann’s formulation of quantum theory [23]. For this, we consider an arbitrary orthonormal basis set $|n\rangle$ ($n = 1, \dots, N$) with $\langle n|m\rangle = \delta_{nm}$, and define the density operator [24] as

$$\rho = \sum_{n,m} \rho_{nm}\,|n\rangle\langle m|,$$

with $\rho_{nm}$ complex numbers. Imposing that $\rho$ is Hermitian, we see that the diagonal elements $\rho_{nn}$ must be real and that $\rho_{nm} = \rho_{mn}^{*}$. It is always possible to make a choice of such complex matrix elements $\rho_{nm}$ so that they are proportional to the averaged measurement elements $\overline{a}_{nm}$; these are given by

$$\rho_{nm} = \frac{\overline{a}_{nm}}{Z},$$

where $Z$ is a normalization factor that imposes $\mathrm{Tr}\,\rho = 1$.
Pure and Mixed States
Consider an ensemble of measurements. If the state vector is known [25], the ensemble represents a pure state. Assuming that the system is in the state $|v\rangle$, we can expand it with respect to the eigenvectors $|n\rangle$ of a Hermitian operator as follows:

$$|v\rangle = \sum_{n} c_n\,|n\rangle.$$

Finally, we can define a pure state by the following term [25]:

$$\rho = |v\rangle\langle v|.$$

Because $\rho^2 = \rho$, we can distinguish a density operator of a pure state by tracing; then, we have

$$\mathrm{Tr}(\rho^2) = 1.$$
When we cannot repeat exactly the same initial condition because of noise in the system, we represent this situation mathematically in terms of an operator called the density matrix for mixed states. This is a statistical mixture of pure states [26]:

$$\rho = \sum_{i} p_i\,|v_i\rangle\langle v_i|,$$

where the weights of each measurement satisfy the normalization condition $\sum_i p_i = 1$. Each $p_i$ is the probability of finding the system in a given pure state.
In contrast to a pure state, when we have a mixed density matrix, the trace of the squared density matrix satisfies the inequality

$$\mathrm{Tr}(\rho^2) < 1.$$

Hence, $\mathrm{Tr}(\rho^2)$, known as the degree of purity [25], can be used to distinguish between pure and mixed states in a basis-independent manner.
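A small numerical check of this criterion (the example states, the standard $|0\rangle\langle 0|$ and the maximally mixed state, are our own illustrative choices):

```python
# Degree of purity Tr(rho^2): equals 1 for a pure state, < 1 for a mixture.
import numpy as np

pure = np.outer([1.0, 0.0], [1.0, 0.0])   # rho = |0><0|, a pure state
mixed = 0.5 * np.eye(2)                   # rho = I/2, maximally mixed
print(np.trace(pure @ pure))    # 1.0 -> pure
print(np.trace(mixed @ mixed))  # 0.5 -> mixed
```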
5. Density Operator Formalism and Emergent Systems
Earlier work by Prokopenko et al. [13] has set the basis for a discussion of complexity, self-organization and the emergence of classical systems in information-theoretical terms, particularly in terms of Shannon entropy and mutual information.
Taking all of this into account, we will rephrase the earlier arguments in terms of the (von Neumann) entropy of density matrices. However, first, let us review the behavior of the information entropy (or Shannon entropy) for classical, correlated systems.
We adopt the Bayesian view of Jaynes [27] and others, in which the Shannon entropy is related to the information content of a model based on, in principle, subjective probabilities that are consistent with known facts. Shannon entropy is then a measure of the missing information in a probabilistic model about some aspect of reality, and is therefore dependent on the state of knowledge used to construct said model. For a state of knowledge $I$, under which we ask an arbitrary question with $N$ possible answers, denoted by the propositions $A_i$, the Shannon entropy is defined as

$$S(I) = -\sum_{i=1}^{N} P(A_i|I)\,\ln P(A_i|I),$$

where $P(A_i|I)$ is the probability of the answer $A_i$ being true under $I$. Please note that, for two ‘observers’ with different states of knowledge $I_1$ and $I_2$, the Shannon entropies $S(I_1)$ and $S(I_2)$ that they assign to an unknown question will, in general, be different. For instance, if the first observer knows that $A_1$ is true, whereas the second only knows that either $A_1$ or $A_2$ is true, then $S(I_1) = 0$ because $P(A_1|I_1) = 1$, but $S(I_2) = \ln 2$ because $P(A_1|I_2) = P(A_2|I_2) = 1/2$.
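This two-observer example can be verified in a few lines of Python (a sketch under the stated probabilities; the helper name `shannon` is ours):

```python
# The two-observer example above: entropy depends on the state of knowledge.
import numpy as np

def shannon(p):
    """Shannon entropy in nats, ignoring zero-probability answers."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return float(-np.sum(p * np.log(p)))

S1 = shannon([1.0, 0.0])   # observer 1 knows A1 is true -> S = 0
S2 = shannon([0.5, 0.5])   # observer 2: either A1 or A2 -> S = ln 2
```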
In the case where the question involves the unknown value of one or more variables, the information entropy translates directly in terms of the probability distribution. For instance, for the joint probability distribution $P(x, y|I)$ of the variables $X$ and $Y$ under the state of knowledge $I$, we have

$$S_{XY} = -\sum_{x,y} P(x, y|I)\,\ln P(x, y|I).$$

Using the product rule of probability, $P(x, y|I) = P(x|I)\,P(y|x, I)$, this entropy can always be separated into two terms [28],

$$S_{XY} = S_X + \langle S_{Y|X} \rangle,$$

where the first term is the entropy of the variable $X$, and the second term,

$$\langle S_{Y|X} \rangle = -\sum_{x} P(x|I) \sum_{y} P(y|x, I)\,\ln P(y|x, I),$$

is the expected value of the conditional entropy of $Y$ given $X$. This conditional entropy cannot be negative, it being the expected value of a non-negative quantity.
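This decomposition can be checked numerically; the following sketch uses a hypothetical 2 × 2 joint distribution of our own choosing:

```python
# Check of S_XY = S_X + <S_Y|X> on a hypothetical 2x2 joint distribution.
import numpy as np

def shannon(p):
    """Shannon entropy in nats over the positive entries of p."""
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    return float(-np.sum(p * np.log(p)))

P = np.array([[0.4, 0.1],    # P(x, y | I); rows index X, columns index Y
              [0.2, 0.3]])
Px = P.sum(axis=1)                              # marginal distribution of X
S_xy = shannon(P)
S_x = shannon(Px)
S_y_given_x = sum(px * shannon(P[i] / px)       # expected conditional
                  for i, px in enumerate(Px))   # entropy of Y given X
assert np.isclose(S_xy, S_x + S_y_given_x)      # the chain rule holds
```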
It is possible to extend the Bayesian idea of probabilities as degrees of belief constrained by the available information to quantum systems [29,30]. Let $\rho = \rho_A \otimes \rho_B$. We can obtain the eigenvalues $p_i^A$ of $\rho_A$ and $p_j^B$ of $\rho_B$. Hence, the diagonal elements of $\rho$ are given by all products of the form $p_i^A p_j^B$, where $i = 1, \dots, N_A$ and $j = 1, \dots, N_B$. Here, $N_A$ and $N_B$ are the dimensions of $\rho_A$ and $\rho_B$, respectively; then:

$$S(\rho) = -\sum_{i=1}^{N_A} \sum_{j=1}^{N_B} p_i^A p_j^B \ln\!\left(p_i^A p_j^B\right) = -\sum_{i=1}^{N_A} p_i^A \ln p_i^A - \sum_{j=1}^{N_B} p_j^B \ln p_j^B.$$

On the other hand, the sum of the entropies of each system is given by

$$S(\rho_A) + S(\rho_B) = -\sum_{i=1}^{N_A} p_i^A \ln p_i^A - \sum_{j=1}^{N_B} p_j^B \ln p_j^B.$$
Thus, the entropy for an ensemble for which the subsystems are uncorrelated is just equal to the sum of the entropies of the reduced ensembles for the subsystems. When there are correlations, we should expect an inequality instead (called the subadditivity property of entropy [5,31,32]), since, in this case, $\rho$ contains additional information concerning the correlations, which is not present in $\rho_A$ and in $\rho_B$ (the partial traces of $\rho$, respectively), as in

$$S(\rho) \leq S(\rho_A) + S(\rho_B).$$
Given this inequality, we can use the mutual information formulation

$$I_{AB} = S(\rho_A) + S(\rho_B) - S(\rho),$$

such that $I_{AB} > 0$ whenever there is subadditive behavior and $I_{AB} = 0$ for additivity.
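Both cases can be illustrated numerically. The sketch below computes $I_{AB}$ for a product state (additivity, $I_{AB} = 0$) and for a classically correlated two-bit state (subadditivity, $I_{AB} > 0$); the specific states and the helper names are our own illustrative choices:

```python
# Sketch: mutual information I = S(rho_A) + S(rho_B) - S(rho) for an
# uncorrelated (product) state and for a correlated state.
import numpy as np

def von_neumann(rho):
    """Von Neumann entropy in nats from the eigenvalues of rho."""
    p = np.linalg.eigvalsh(rho)
    p = p[p > 1e-12]
    return float(-np.sum(p * np.log(p)))

def mutual_information(rho, dA, dB):
    """I = S(rho_A) + S(rho_B) - S(rho) via partial traces."""
    r = rho.reshape(dA, dB, dA, dB)
    rho_A = np.trace(r, axis1=1, axis2=3)   # trace out subsystem B
    rho_B = np.trace(r, axis1=0, axis2=2)   # trace out subsystem A
    return von_neumann(rho_A) + von_neumann(rho_B) - von_neumann(rho)

product = np.kron(np.diag([0.7, 0.3]), np.diag([0.5, 0.5]))
correlated = np.diag([0.5, 0.0, 0.0, 0.5])   # two bits that are always equal
print(mutual_information(product, 2, 2))     # 0.0      -> additivity
print(mutual_information(correlated, 2, 2))  # ln 2 > 0 -> subadditivity
```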
From this mutual information, we obtain a descriptor of weak emergence in systems that are correlated. In these types of systems, new information emerges from the relation of their parts in terms of correlations.
6. An Example of a Subadditive System
As depicted in Section 4, we can write the density matrix as follows:

$$\rho_{nm} = \frac{\overline{a}_{nm}}{Z},$$

but where we now interpret $a_{nm}$ as

$$a_{nm} = P(m'|n),$$

the probability of finding the system in the state $m'$ after a measurement, given that it was in the state $n$ before it, where primed states are states after a measurement. Because of the marginalization rule, it must hold that

$$\sum_{m} P(m'|n) = 1,$$

while, simultaneously,

$$\sum_{n} P(n) = 1$$

must be true. As an example, consider an abstract system composed of two integers, $a$ and $b$, such that $1 \leq a \leq 4$ and $1 \leq b \leq 4$. Let us set the constraint $C$ that $a$ and $b$ are either both even or both odd; then, there are eight allowed states, namely

$$(1,1),\ (1,3),\ (3,1),\ (3,3),\ (2,2),\ (2,4),\ (4,2),\ (4,4).$$
Let us now define the measurement $M$ such that the system undergoes the following transitions: $1 \to 2$, $2 \to 3$, $3 \to 4$ and $4 \to 1$. In this way, if the system has $a = 3$ and $b = 1$, that is, it is in the state $(3, 1)$, and we perform the measurement $M$, we will obtain a value and the state will change to $a' = 4$ and $b' = 2$, that is, to $(4, 2)$. This makes the system fragile regarding the measurement $M$, with the integers $a$ and $b$ being the hidden variables of the system. Lastly, we will define the parity of each integer as 0 if the number is odd, and 1 if it is even. In this way, there are only four observable states, namely

$$(0,0),\ (0,1),\ (1,0),\ (1,1),$$
where the first and second positions of the tuple represent the parity bits of $a$ and $b$, respectively. Of these states, only $(0,0)$ and $(1,1)$ are consistent with the constraint $C$. Considering the allowed transitions, we have that

$$(0,0) \to (1,1) \quad \text{and} \quad (1,1) \to (0,0).$$
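A short simulation of this toy system (a sketch; the function and variable names are ours) makes the fragility explicit: each application of $M$ reveals the parity bits while simultaneously changing the hidden integers:

```python
# Simulation of the fragile toy system above: hidden integers (a, b) in
# {1, 2, 3, 4}, both even or both odd; the measurement M reveals the
# parity bits and simultaneously applies the transitions 1->2, ..., 4->1.
import itertools
import random

states = [(a, b) for a, b in itertools.product(range(1, 5), repeat=2)
          if a % 2 == b % 2]                 # the eight allowed states

def measure(state):
    """Return the observed parity bits and the post-measurement state."""
    a, b = state
    observed = (1 - a % 2, 1 - b % 2)        # parity: 0 if odd, 1 if even
    return observed, (a % 4 + 1, b % 4 + 1)  # increment both, wrapping 4->1

s = random.choice(states)
for _ in range(3):
    obs, s = measure(s)   # each measurement flips the observable
    print(obs, s)         # parity state between (0,0) and (1,1)
```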
The matrix elements $a_{nm}$ are given in Table 1, together with the density matrix elements $\rho_{nm}$. The latter are simply $\rho_{nm} = a_{nm}/2$ because $Z = 2$. The observable states given above are equivalent to the ket notation $|00\rangle$, $|01\rangle$, $|10\rangle$ and $|11\rangle$. The density matrix elements $\rho_{nm}$ correspond to the density operator of a pure, but entangled state,

$$\rho = |\Phi^{+}\rangle\langle\Phi^{+}|,$$

with $|\Phi^{+}\rangle$ as one of the Bell states,

$$|\Phi^{+}\rangle = \frac{1}{\sqrt{2}}\left(|00\rangle + |11\rangle\right).$$
The reduced density matrices are

$$\rho_A = \rho_B = \frac{1}{2}\begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix}.$$

Hence, $\rho \neq \rho_A \otimes \rho_B$, and the operator is not separable. The values of the von Neumann entropy are $S(\rho_A) = S(\rho_B) = \ln 2$ and $S(\rho) = 0$, the latter corresponding to a pure state; therefore, the system is subadditive. With this example, we have found a density operator that is not only correlated but could describe an emergent system, since the entropy of the whole is less than that of its parts.
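These values can be reproduced numerically (a sketch; the entropy helper is our own):

```python
# Numerical check of the example: the Bell state |Phi+> is pure (S = 0),
# while both reduced states are maximally mixed (S = ln 2), so the
# composite system is subadditive.
import numpy as np

def von_neumann(rho):
    """Von Neumann entropy in nats from the eigenvalues of rho."""
    p = np.linalg.eigvalsh(rho)
    p = p[p > 1e-12]
    return float(-np.sum(p * np.log(p)))

phi = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)  # |Phi+> = (|00>+|11>)/sqrt(2)
rho = np.outer(phi, phi)                           # pure-state density matrix
r = rho.reshape(2, 2, 2, 2)
rho_A = np.trace(r, axis1=1, axis2=3)              # both reduced matrices
rho_B = np.trace(r, axis1=0, axis2=2)              # equal I/2
print(von_neumann(rho), von_neumann(rho_A), von_neumann(rho_B))
# -> 0.0, ln 2, ln 2: S(rho) < S(rho_A) + S(rho_B)
```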