Article

Quantum Mutual Information, Fragile Systems and Emergence

by Yasmín Navarrete 1,*,† and Sergio Davis 2,3,†

1 Instituto de Filosofía y Ciencias de la Complejidad, Los Alerces 3024 Ñuñoa, Santiago 7780192, Chile
2 Research Center in the Intersection of Plasma Physics, Matter and Complexity (P2mc), Comisión Chilena de Energía Nuclear, Casilla 188-D, Santiago 8320000, Chile
3 Departamento de Física, Facultad de Ciencias Exactas, Universidad Andres Bello, Sazié 2212 piso 7, Santiago 8370136, Chile
* Author to whom correspondence should be addressed.
† These authors contributed equally to this work.
Submission received: 12 September 2022 / Revised: 6 November 2022 / Accepted: 11 November 2022 / Published: 17 November 2022
(This article belongs to the Section Information Theory, Probability and Statistics)

Abstract: In this paper, we present an analytical description of emergence from the density matrix framework, understood as a state of knowledge of the system, and its generalized probability formulation. This description is based on the idea of fragile systems, wherein the observer modifies the system by the act of measurement (i.e., the observer effect) in order to detect possible emergent behavior. We propose a descriptor, based on quantum mutual information, to determine whether the subsystems of a system possess inner correlations. This may contribute to a definition of emergent systems in terms of emergent information.

1. Introduction

In quantum mechanics, it is well known that a separable state of a composite system can be factored into individual states belonging to separate subspaces. A state is said to be correlated if it is not separable. In fact, determining whether a given state is separable is far from trivial: the problem is classified as NP-hard in computational complexity theory [1].
A density operator, in quantum mechanics, describes the statistical state of a quantum system. Its usual interpretation is that the eigenvalues are the probabilities of finding the system in the states corresponding to the eigenvectors. Physically, we can view its elements as relative frequencies over an appropriate ensemble of $N$ identical copies of the system, prepared in several possible states under a certain setup or preparation protocol. Thus, we have a superposition of quantum states $|v_i\rangle$ with probabilities $p_i$ (real numbers) satisfying $\sum_i p_i = 1$. However, it can be shown that the density operator formalism can be recovered from a Bayesian formalism for noncommutative expectations, wherein the system depends on the order of the measurements. This is not restricted to quantum mechanical systems, and can be understood under the framework of what we have called a fragile system [2]. The density operator provides a convenient means for describing quantum systems whose state is not completely known, being mathematically equivalent to the state vector approach [3,4].
By applying an entropy to the density matrix, we can quantify the degree of missing information about the state of the system. A system can be composed of subsystems and, using the subadditivity property of entropy (the entropy of the whole is at most the sum of the entropies of its parts) [5], it is possible to quantify whether the entropy of the whole is less than that of its parts. Holzer and de Meer [6] compare the information at the system level with the information at a lower level. As they state, “this measure gives a high value of emergence for systems with many dependencies (interdependent components) and a low value of emergence for systems with few dependencies (independent components)”; in this sense, the information of the whole is more than the information of the parts, and entropy can be a good parameter to measure a type of emergence in systems.
This paper is organized as follows. In Section 2, we discuss emergent systems and their current definitions. In Section 3, we define fragile systems as those that are modified by the act of measurement through the change of internal variables. In Section 4, we introduce the density matrix formalism. In Section 5, we develop a mathematical formulation of emergent systems within the density matrix formalism. In Section 6, we show a concrete example of a subadditive system. Finally, we provide some concluding remarks in Section 7.

2. Emergent Systems

Several definitions of emergence exist, taking into account different aspects of its origin or behavior. For instance, Peter Checkland [7] defines emergence as “the principle that entities exhibit properties which are meaningful only when attributed to the whole, not to its parts”. Emergent systems are structured in such a way that their components interact, giving rise to global patterns that appear as a consequence of the interrelations and correlations between subsystem elements, the result of complex and self-organizing processes. This process may be triggered by an external stimulus.
There are basically three types of emergence: simple, weak and strong, described as follows:
Simple emergence arises from the combination of certain properties and relationships between elements exhibiting non-linear behavior. For instance, in order to achieve the flight of an airplane, we cannot consider the motors, the propulsion system and the wings separately; all of these components must be considered together, because they are interconnected and it is through their interrelations that flight emerges. This type of emergence can be predicted from the functioning of the parts, and it is related to the concept of synergy, which means interacting or working together [8].
Weak emergence describes the emergence of properties of systems that are partially predictable and also reducible: they can be reduced to basic rules at an initial time. After a while, the behavior can become unpredictable, as in chaos theory [9]; nevertheless, it is possible to perform computational simulations of such systems because the basic rules are known. Weak emergence is the product of complex system dynamics (i.e., non-linear behavior, spontaneous order and adaptation); an example of the latter is the cellular automaton known as Conway’s Game of Life [10,11,12], sketched in code below.
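As a concrete illustration of weak emergence, the following minimal Python sketch (our own, not from the original paper) implements one synchronous update of Conway’s Game of Life; the initial glider pattern is an assumed example configuration:

```python
import numpy as np

def life_step(grid):
    """One synchronous update of Conway's Game of Life on a toroidal grid."""
    # Count the eight neighbors of every cell using periodic shifts.
    neighbors = sum(np.roll(np.roll(grid, dx, axis=0), dy, axis=1)
                    for dx in (-1, 0, 1) for dy in (-1, 0, 1)
                    if (dx, dy) != (0, 0))
    # A live cell survives with 2 or 3 neighbors; a dead cell is born with 3.
    return ((neighbors == 3) | ((grid == 1) & (neighbors == 2))).astype(int)

# A glider: a five-cell pattern whose diagonal motion emerges from local rules.
grid = np.zeros((10, 10), dtype=int)
grid[1, 2] = grid[2, 3] = grid[3, 1] = grid[3, 2] = grid[3, 3] = 1
for _ in range(4):  # after 4 steps the glider has moved one cell diagonally
    grid = life_step(grid)
print(grid)
```

Each cell “knows” only its eight neighbors, yet iterating the rule makes the glider travel across the grid: a behavior that is predictable from the rules, in the sense of weak emergence, without being stated in them.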
Strong emergence, like weak emergence, is a case of unexpected emergence. The difference lies in its non-reducible behavior, which appears only when the system is running. Although it is systematically determined by low-level attributes, it cannot be deduced from the components at lower levels. The consciousness phenomenon is one example of this type of emergence, and appears as a construction process. There is likely no bottom-up algorithm for it, because it is a dynamical process evolving over time at the highest level, with non-linear relations at the lower ones.
Another way of conceptualizing emergence is the separation of levels of complexity of the system at different spatial or temporal scales [13].
Some main characteristics of strong emergence to consider are the following:
  • Non-reducible phenomenon: the global state of the emergent system can neither be explained by, nor reduced to, its subsystem components.
  • Downward causation: emergent high-level properties appear as a non-obvious consequence of low-level properties but, at the same time, all processes at the lower levels of the hierarchy are constrained by, and act in coherence with, the laws of the higher level [14].
  • Wholeness: a phenomenon wherein a complex, interesting high-level function appears as a result of combining low-level mechanisms in straightforward ways.
  • Radical novelty: a phenomenon wherein a system is designed according to certain principles, yet interesting, unexpected properties arise from the behavior of its sub-system elements [15].
In general, emergent systems are common in nature and technology. One example of the latter is the speed of a vehicle, affected by the center of gravity, the driver’s skills, the weather and friction, among other attributes (in this case, we can predict the emergent property of speed from the relation between components, which makes it a case of weak emergence). Strong emergent properties can be found in the consciousness phenomenon [16], the human body and social phenomena, among others. One important example is ‘self-awareness’, which is a result of the interconnection of neurons in the brain [17].
In the case of biological systems (as well as in the case of social phenomena), emergent models are appropriate for describing those situations. We can observe the characteristics mentioned regarding strong emergence: non-reducibility, radical novelty, wholeness and downward causation.
The main aspect of biological emergent systems is that they may be observed from inside by the system itself (“from internal control of a system which might be fully controlled by an observer/controller architecture that is part of the system” [18]). This is in concordance with the autopoiesis theory proposed by the Chilean biologists H. Maturana and F. Varela [19] to define the self-maintaining chemistry of any living cell, which, under the perspective of the chemical organization theory used to formalize autopoietic structures, provides “a basis to operationalize goals as an emergent process” [20]. When the measurement is affected by the observer, we have what we will call a fragile system. In the next section, we develop this concept further.

3. Fragile Systems

In simple terms, a fragile system is one that is affected by the measurement [21]. This distinguishes it from a non-fragile (classical) system, which is not modified upon observation.
Because any system (fragile or not) possesses information, we will think of a system as a “black box” that can be found in different internal states, denoted by λ. In general, λ contains many degrees of freedom, but we will not make use of that inner structure here. The internal state λ contains all of the information necessary to describe any aspect of the system.
The crucial difference between a fragile system and a non-fragile one is that, in a fragile system, access to the internal state λ is impossible, because it is precisely this internal state that is modified by the measurement. As the modification of the state λ depends on the details of the environment when carrying out the measurement (which we do not know or control with accuracy), the outcome of a measurement is unavoidably stochastic, and a mathematical formulation requires probability theory [22].
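To make this concrete, the following is a minimal sketch of a fragile system as a black box; the class, the observable and the update rule are hypothetical illustrations chosen for this note, not taken from Ref. [21]:

```python
import random

class FragileSystem:
    """A black box whose hidden internal state is disturbed by measurement."""

    def __init__(self, lam):
        self._lam = lam  # internal state λ, inaccessible to the observer

    def measure(self):
        outcome = self._lam % 2  # the observable is some function of λ
        # The back-action depends on uncontrolled environmental details,
        # so the post-measurement internal state is stochastic.
        self._lam += random.choice([-1, 1])
        return outcome

system = FragileSystem(lam=3)
print([system.measure() for _ in range(10)])  # outcomes drift stochastically
```

Repeated measurements yield a stochastic sequence of outcomes precisely because each measurement perturbs λ in a way the observer neither knows nor controls.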

4. The Density Operator

In this section, we review the density matrix operator as a preliminary concept, in order to later define emergent systems through the calculation of entropies.
The density operator is a positive semi-definite Hermitian operator of trace one. If $A$ is the matrix representation of an arbitrary observable $\hat{A}$, we can write
$$A = \begin{pmatrix} a_1 \\ a_2 \\ \vdots \\ a_N \end{pmatrix} \begin{pmatrix} \bar{a}_1 & \bar{a}_2 & \cdots & \bar{a}_N \end{pmatrix}.$$
Hence, we have
$$A = \begin{pmatrix} a_1\bar{a}_1 & \cdots & a_1\bar{a}_N \\ a_2\bar{a}_1 & \cdots & a_2\bar{a}_N \\ \vdots & \ddots & \vdots \\ a_N\bar{a}_1 & \cdots & a_N\bar{a}_N \end{pmatrix},$$
where $\bar{a}_i$ is the complex conjugate of the element $a_i$ (equal to $a_i$ itself in the case of real numbers). We can take the average of different measurement matrices $A$ of the same observable, then normalize and diagonalize the result; finally, we obtain the density matrix of a mixture as below:
$$\hat{\rho} = \sum_{i}^{N} p_i\, |v_i\rangle\langle v_i|,$$
where $p_i = P(\lambda_i|\mathcal{I})$ and $\lambda_i$ is obtained from the eigenvalue problem $\hat{A}|v_i\rangle = \lambda_i|v_i\rangle$ [21].
We can formulate the density matrix operator in a complex Hilbert space, just as in von Neumann’s formulation of quantum theory [23]. For this, we consider an arbitrary orthonormal basis set $|n\rangle$ ($n = 1, \ldots, N$) with $\langle i|j\rangle = \delta_{ij}$, and define the density operator [24] $\hat{\rho}$ as
$$\hat{\rho} := \sum_{i=1}^{N} \sum_{j=1}^{N} \rho_{ij}\, |i\rangle\langle j|,$$
with $\rho_{ij}$ complex numbers. Imposing that $\hat{\rho}$ is Hermitian, we see that the diagonal elements $\rho_{ii}$ must be real and $\rho_{ij} = \overline{\rho_{ji}}$. It is always possible to choose the complex matrix elements $\rho_{ij}$ proportional to the elements $p_{ij}$; these are given by
$$\rho_{ij} = \langle\epsilon_i|\hat{\rho}|\epsilon_j\rangle = \frac{1}{2\eta}\Big( [p_{ij} + p_{ji}] + i\,[p_{ij} - p_{ji}] \Big),$$
where $\eta = \sum_i p_{ii}$ is a normalization factor that imposes $\mathrm{Tr}(\hat{\rho}) = 1$.
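As a sketch of Equation (4), the following snippet assembles a density matrix from an assumed joint probability table $p_{ij}$ and checks that the result is Hermitian with unit trace:

```python
import numpy as np

def density_from_joint(p):
    """ρ_ij = (1/2η)([p_ij + p_ji] + i [p_ij - p_ji]), with η = Σ_i p_ii."""
    eta = np.trace(p)
    return ((p + p.T) + 1j * (p - p.T)) / (2 * eta)

# A hypothetical joint probability table P(ε_i, ε'_j | I).
p = np.array([[0.25, 0.00],
              [0.00, 0.25]])
rho = density_from_joint(p)
assert np.allclose(rho, rho.conj().T)     # Hermitian
assert np.isclose(np.trace(rho).real, 1)  # Tr(ρ) = 1
print(rho)
```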

Pure and Mixed States

Consider an ensemble of measurements $\{|v_i\rangle\}_{i=1}^{N}$. If the state vector is known [25], the ensemble represents a pure state. Assuming that the system is in the state $|v\rangle$, we can expand it with respect to the eigenvectors of a Hermitian operator $\hat{A}$ as follows:
$$|v\rangle = \sum_n \lambda_n |v_n\rangle,$$
$$\hat{A}|v_n\rangle = \lambda_n |v_n\rangle.$$
Finally, we can define a pure state by the following term [25]:
$$\hat{\rho} = |v\rangle\langle v|.$$
Because $\hat{\rho}^2 = (|v\rangle\langle v|)(|v\rangle\langle v|) = |v\rangle(\langle v|v\rangle)\langle v| = \hat{\rho}$, we can identify the density operator of a pure state by taking traces; we then have
$$\mathrm{Tr}(\hat{\rho}) = \mathrm{Tr}(\hat{\rho}^2) = 1.$$
When we cannot repeat exactly the same initial conditions, because of noise in the system, we represent the situation mathematically in terms of a density matrix for mixed states. This is a superposition of pure states [26]:
$$\hat{\rho} = \sum_i p_i\, |v_i\rangle\langle v_i|,$$
where the weights of each measurement satisfy the normalization condition $\sum_i p_i = 1$. Each $p_i$ is the probability of finding the system in a given pure state.
In contrast to a pure state, for a mixed-state density matrix the trace of the squared density matrix obeys the inequality
$$\mathrm{Tr}(\hat{\rho}^2) < 1.$$
Hence, $\mathrm{Tr}(\hat{\rho}^2)$, known as the degree of purity [25], can be used to distinguish between pure and mixed states in a basis-independent manner.
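A minimal numerical check of this criterion, for an assumed two-level system:

```python
import numpy as np

def purity(rho):
    """Degree of purity Tr(ρ²); equals 1 only for pure states."""
    return np.trace(rho @ rho).real

v = np.array([1, 1]) / np.sqrt(2)
rho_pure = np.outer(v, v.conj())            # ρ = |v⟩⟨v|
rho_mixed = 0.5 * np.outer([1, 0], [1, 0]) \
          + 0.5 * np.outer([0, 1], [0, 1])  # equal mixture of |0⟩ and |1⟩

print(purity(rho_pure))   # 1.0
print(purity(rho_mixed))  # 0.5 < 1
```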

5. Density Operator Formalism and Emergent Systems

Earlier work by Prokopenko et al. [13] has laid the groundwork for a discussion of complexity, self-organization and emergence in classical systems in information-theoretical terms, particularly in terms of Shannon entropy and mutual information.
Taking all of this into account, we will rephrase the earlier arguments in terms of the (von Neumann) entropy of density matrices. However, first, let us review the behavior of the information entropy (or Shannon entropy) for classical, correlated systems.
We adopt the Bayesian view of Jaynes [27] and others, in which the Shannon entropy is related to the information content of a model based on, in principle, subjective probabilities consistent with the known facts. Shannon entropy is then a measure of missing information in a probabilistic model about some aspect of reality, and is therefore dependent on the state of knowledge used to construct said model. For a state of knowledge $\mathcal{I}$, where we ask an arbitrary question with $N$ possible answers, denoted by the propositions $A_1, A_2, \ldots, A_N$, the Shannon entropy is defined as
$$S(\mathcal{I}) := -\sum_{i=1}^{N} p_i \ln p_i,$$
where $p_i$ is the probability $P(A_i|\mathcal{I})$ of the answer $A_i$ being true under $\mathcal{I}$. Please note that, for two ‘observers’ with different states of knowledge $\mathcal{I}_1$ and $\mathcal{I}_2$, the Shannon entropies $S(\mathcal{I}_1)$ and $S(\mathcal{I}_2)$ that they assign to an unknown question will, in general, be different. For instance, if the first observer knows that $A_1$ is true, whereas the second only knows that either $A_1$ or $A_2$ is true, then $S(\mathcal{I}_1) = 0$ because $P(A_k|\mathcal{I}_1) = \delta_{1k}$, but $S(\mathcal{I}_2) > 0$ because $0 < P(A_1|\mathcal{I}_2) < 1$ and $P(A_2|\mathcal{I}_2) = 1 - P(A_1|\mathcal{I}_2) > 0$.
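A quick numerical illustration of this observer dependence, with the probabilities assumed as in the example just given:

```python
import numpy as np

def shannon(p):
    """S(I) = -Σ p_i ln p_i, skipping zero-probability answers."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log(p))

# Observer 1 knows A_1 is true; observer 2 only knows that A_1 or A_2 is true.
print(shannon([1.0, 0.0, 0.0]))  # S(I_1) = 0
print(shannon([0.5, 0.5, 0.0]))  # S(I_2) = ln 2 ≈ 0.693
```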
In the case where the question involves the unknown value of one or more variables, the information entropy translates directly in terms of the probability distribution. For instance, for the joint probability distribution $P(X,Y|\mathcal{I})$ of the variables $X$ and $Y$ under the state of knowledge $\mathcal{I}$, we have
$$S_{XY}(\mathcal{I}) := -\sum_x \sum_y P(x,y|\mathcal{I}) \ln P(x,y|\mathcal{I}).$$
Using the product rule of probability,
$$P(x,y|\mathcal{I}) = P(y|x,\mathcal{I})\, P(x|\mathcal{I}),$$
this entropy can always be separated into two terms [28],
$$S_{XY}(\mathcal{I}) = S_X(\mathcal{I}) + \big\langle S_{Y|X} \big\rangle_{\mathcal{I}},$$
where the first term is the entropy of the variable X, and the second term,
$$\big\langle S_{Y|X} \big\rangle_{\mathcal{I}} := -\sum_x P(x|\mathcal{I}) \sum_y P(y|x,\mathcal{I}) \ln P(y|x,\mathcal{I})$$
is the expected value of the conditional entropy of $Y$ given $X$. This conditional entropy cannot be negative, being the expected value of a non-negative quantity.
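This decomposition can be verified numerically; here is a short sketch with an assumed correlated joint distribution:

```python
import numpy as np

def H(p):
    """Shannon entropy (natural log) of a flattened probability array."""
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    return -np.sum(p * np.log(p))

# An assumed correlated joint distribution P(x, y | I).
P = np.array([[0.4, 0.1],
              [0.1, 0.4]])
S_XY = H(P)
S_X = H(P.sum(axis=1))
# Expected conditional entropy <S_{Y|X}> = Σ_x P(x) H(P(y|x)).
S_Y_given_X = sum(P[x].sum() * H(P[x] / P[x].sum()) for x in range(2))
assert np.isclose(S_XY, S_X + S_Y_given_X)  # the chain rule holds
print(S_XY, S_X, S_Y_given_X)
```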
It is possible to extend the Bayesian idea of probabilities as degrees of belief constrained by the available information to quantum systems [29,30]. Let $\rho_{12} = \rho_1 \otimes \rho_2$. In their diagonal bases, the eigenvalues of $\rho_1$ and $\rho_2$ are $a_i = \{\rho_1\}_{ii}$ and $b_j = \{\rho_2\}_{jj}$, respectively. Hence, the diagonal elements of $\rho_{12}$ are given by all products of the form $a_i b_j$, where $i = 1, \ldots, n_1$ and $j = 1, \ldots, n_2$. Here, $n_1$ and $n_2$ are the dimensions of $\rho_1$ and $\rho_2$, respectively; then:
$$S(\rho_{12}) = -\mathrm{Tr}(\rho_{12} \ln \rho_{12}) = -\sum_{i=1}^{n_1} \sum_{j=1}^{n_2} (a_i b_j) \ln(a_i b_j).$$
On the other hand, the sum of the entropies of the subsystems is given by
$$S(\rho_1) + S(\rho_2) = -\sum_{i=1}^{n_1} a_i \ln a_i - \sum_{j=1}^{n_2} b_j \ln b_j.$$
Thus, the entropy of an ensemble $\rho_{12}$ whose subsystems are uncorrelated is simply equal to the sum of the entropies of the reduced ensembles of the subsystems. When there are correlations, we should expect an inequality instead (the subadditivity property of entropy [5,31,32]), since, in this case, $\rho_{12}$ contains additional information concerning the correlations that is not present in $\rho_1$ or in $\rho_2$ (the respective partial traces of $\rho_{12}$):
$$S(\rho_{12}) \leq S(\rho_1) + S(\rho_2).$$
Given this inequality, we can use the mutual information formulation
$$I := S(\rho_1) + S(\rho_2) - S(\rho_{12}),$$
such that $I > 0$ whenever there is subadditive behavior and $I = 0$ for additivity.
From Equation (19), we obtain a descriptor of weak emergence in correlated systems. In these types of systems, new information emerges from the relation of their parts in the form of correlations.

6. An Example of a Subadditive System

As depicted in Equation (4), we can write the density matrix as follows:
$$\rho_{ij} = \langle\epsilon_i|\hat{\rho}|\epsilon_j\rangle = \frac{1}{2\eta}\Big( [p_{ij} + p_{ji}] + i\,[p_{ij} - p_{ji}] \Big),$$
but where we now interpret $p_{ij}$ as
$$p_{ij} = P(\epsilon_i, \epsilon'_j|\mathcal{I}), \qquad \eta = \sum_i p_{ii},$$
where the primed states are states after a measurement.
Because of the marginalization rule, it must hold that
$$\sum_i p_{ij} = P(\epsilon'_j|\mathcal{I}) \quad \text{for all } j = 1, \ldots, N,$$
while, simultaneously,
$$\sum_j p_{ij} = P(\epsilon_i|\mathcal{I}) \quad \text{for all } i = 1, \ldots, N$$
must be true. As an example, consider an abstract system composed of two integers, $a$ and $b$, such that $a \in \{1,2,3,4\}$ and $b \in \{1,2,3,4\}$. Let us impose the constraint $\mathcal{I}$ that $a$ and $b$ are either both even or both odd; then, there are eight allowed states, namely
$$\begin{aligned}
&\Gamma_1: a = 1, b = 1, \qquad &&\Gamma_2: a = 1, b = 3, \qquad &&\Gamma_3: a = 3, b = 1, \qquad &&\Gamma_4: a = 3, b = 3, \\
&\Gamma_5: a = 2, b = 2, \qquad &&\Gamma_6: a = 2, b = 4, \qquad &&\Gamma_7: a = 4, b = 2, \qquad &&\Gamma_8: a = 4, b = 4.
\end{aligned}$$
Let us now define the measurement $M(a,b)$ by
$$M(a,b) := \begin{cases} 1 & \text{if } a = b, \\ 2 & \text{if } a > b, \\ 3 & \text{if } a < b, \end{cases}$$
such that the system undergoes the transitions $\Gamma_1 \leftrightarrow \Gamma_4$, $\Gamma_5 \leftrightarrow \Gamma_8$, $\Gamma_2 \leftrightarrow \Gamma_6$ and $\Gamma_3 \leftrightarrow \Gamma_7$. In this way, if the system has $a = 3$ and $b = 1$, that is, it is in the state $\Gamma_3$, and we perform the measurement $M$, we will obtain the value $M = 2$ and the state will change to $a = 4$ and $b = 2$, that is, to $\Gamma_7$. This makes the system fragile with respect to the measurement $M$, with the integers $a$ and $b$ being the hidden variables of the system. Lastly, we define the parity of each integer as 0 if the number is odd and 1 if it is even. In this way, there are only four observable states, namely
$$s_1 = (0,0), \quad s_2 = (0,1), \quad s_3 = (1,0), \quad s_4 = (1,1),$$
where the first and second positions of the tuple represent the parity bits of $a$ and $b$, respectively. Of these states, only $s_1$ and $s_4$ are consistent with the constraint $\mathcal{I}$. Considering the allowed transitions, we have that
$$P(s, s'|\mathcal{I}) = \begin{cases} \dfrac{1}{4} & \text{if } s \in \{s_1, s_4\} \text{ and } s' \in \{s_1, s_4\}, \\ 0 & \text{otherwise}. \end{cases}$$
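The following sketch is our own reconstruction of this worked example: it enumerates the hidden states, applies the measurement-induced transitions (read here as bidirectional, which is what reproduces Table 1), and tabulates the joint probabilities $P(s, s'|\mathcal{I})$ under the assumption that the eight hidden states are equally likely:

```python
from fractions import Fraction

# Hidden states Γ1..Γ8: pairs (a, b) with a, b ∈ {1,2,3,4} of equal parity.
gamma = [(1, 1), (1, 3), (3, 1), (3, 3), (2, 2), (2, 4), (4, 2), (4, 4)]

# Measurement-induced transitions Γ1↔Γ4, Γ2↔Γ6, Γ3↔Γ7, Γ5↔Γ8 (0-indexed).
jump = {0: 3, 3: 0, 1: 5, 5: 1, 2: 6, 6: 2, 4: 7, 7: 4}

def parity(state):
    """Observable parity bits: 0 for odd, 1 for even."""
    a, b = state
    return (1 - a % 2, 1 - b % 2)

# Joint probabilities P(s, s' | I) over pre- and post-measurement parities.
joint = {}
for i, state in enumerate(gamma):
    key = (parity(state), parity(gamma[jump[i]]))
    joint[key] = joint.get(key, Fraction(0)) + Fraction(1, 8)

print(joint)  # every pairing of s, s' ∈ {(0,0), (1,1)} has probability 1/4
```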
The matrix elements $p_{ij} = P(s = s_i, s' = s_j|\mathcal{I})$ are given in Table 1, together with the density matrix elements $\rho_{ij}$. The latter are simply
$$\rho_{ij} = \frac{1}{2\eta}\Big( [p_{ij} + p_{ji}] + i\,[p_{ij} - p_{ji}] \Big) = \frac{p_{ij}}{\eta} = 2\, p_{ij},$$
because $\eta = \sum_{i=1}^{4} p_{ii} = \frac{1}{2}$ and $p_{ij} = p_{ji}$ here. The states given by Equation (27) are equivalent to the ket notation $(|00\rangle, |01\rangle, |10\rangle, |11\rangle)$. The density matrix elements $\rho_{ij}$ correspond to the density operator of a pure, but entangled, state,
$$\hat{\rho}_{AB} = \frac{1}{2}\Big( |00\rangle\langle 00| + |00\rangle\langle 11| + |11\rangle\langle 00| + |11\rangle\langle 11| \Big) = |\Phi^+\rangle\langle\Phi^+|,$$
with $|\Phi^+\rangle$ being one of the Bell states,
$$|\Phi^+\rangle := \frac{1}{\sqrt{2}}\big( |00\rangle + |11\rangle \big).$$
The reduced density matrices are
$$\rho_A = \frac{1}{2}\begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix}, \qquad \rho_B = \frac{1}{2}\begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix}.$$
Hence, $\hat{\rho}_{AB} \neq \hat{\rho}_A \otimes \hat{\rho}_B$, and the operator $\hat{\rho}_{AB}$ is not separable. The values of the von Neumann entropy are $S_A = S_B = \ln 2$ and $S_{AB} = 0$, the latter because $\hat{\rho}_{AB}$ is a pure state; therefore, the system is subadditive. With this example, we have found a density operator that is not only correlated but could also describe an emergent system, since the entropy of the whole is less than that of its parts.
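These values can be checked numerically; in the sketch below the partial traces are computed by hand with einsum rather than by any assumed library routine:

```python
import numpy as np

def von_neumann(rho):
    """S(ρ) = -Tr(ρ ln ρ), computed from the eigenvalues of ρ."""
    w = np.linalg.eigvalsh(rho)
    w = w[w > 1e-12]  # 0 ln 0 = 0 by convention
    return -np.sum(w * np.log(w))

# |Φ+⟩ = (|00⟩ + |11⟩)/√2 in the basis (|00⟩, |01⟩, |10⟩, |11⟩).
phi_plus = np.array([1, 0, 0, 1]) / np.sqrt(2)
rho_AB = np.outer(phi_plus, phi_plus)  # pure, entangled state

rho4 = rho_AB.reshape(2, 2, 2, 2)      # indices (i, j, k, l) = (A, B, A', B')
rho_A = np.einsum('ijkj->ik', rho4)    # partial trace over B
rho_B = np.einsum('ijil->jl', rho4)    # partial trace over A

S_A, S_B, S_AB = von_neumann(rho_A), von_neumann(rho_B), von_neumann(rho_AB)
print(S_A, S_B, S_AB)    # ln 2, ln 2, 0
print(S_A + S_B - S_AB)  # mutual information I = 2 ln 2 > 0
```

The output confirms $S_A = S_B = \ln 2$ and $S_{AB} = 0$, so that $I = 2\ln 2 > 0$: the subadditive signature of emergent correlations discussed above.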

7. Concluding Remarks

In this work, we reviewed the idea of emergence in the context of quantum mechanics and fragile systems, in order to detect emergent behavior within the framework of density matrix theory. We proposed mutual information as a descriptor of emergence in fragile systems [21]. When the parameter $I > 0$, there is more information in the whole than in its parts; therefore, new information emerges from the interaction of the subsystems in the form of correlations. We have presented a concrete example of an operator that could describe a subadditive system.
We believe that this formulation may give some insights into adaptive, self-sustainable and interconnected models. In particular, we aim to describe how, in an entangled system, the observer is part of the observation, being an inseparable element of it; for instance, by considering the side effects produced by the act of measurement (i.e., the observer effect [33,34,35]).

Author Contributions

Conceptualization, Y.N.; methodology, Y.N. and S.D.; software, S.D.; validation, Y.N. and S.D.; formal analysis, Y.N. and S.D.; writing—original draft preparation, Y.N. and S.D.; writing—review and editing, Y.N. and S.D. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by ANID FONDECYT grant 1220651.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Gharibian, S. Strong NP-hardness of the quantum separability problem. Quantum Inf. Comput. 2010, 10, 343–360.
  2. Navarrete, Y.; Davis, S. Noncommutative Bayesian expectation and its connection to quantum theory. AIP Conf. Proc. 2022, in press.
  3. Warmuth, M.K.; Kuzmin, D. Bayesian generalized probability calculus for density matrices. Mach. Learn. 2010, 78, 63–101.
  4. Warmuth, M.K.; Kuzmin, D. A Bayesian probability calculus for density matrices. arXiv 2014, arXiv:1408.3100.
  5. Bhatia, R. Positive Definite Matrices; Princeton University Press: Princeton, NJ, USA, 2015.
  6. Holzer, R.; de Meer, H. Methods for Approximations of Quantitative Measures in Self-Organizing Systems; Springer: Berlin/Heidelberg, Germany, 2011; pp. 1–15.
  7. Checkland, P. Systems Thinking, Systems Practice; John Wiley: Hoboken, NJ, USA, 1999.
  8. Hitchins, D.K. Systems Engineering: A 21st Century Systems Methodology; John Wiley & Sons: Hoboken, NJ, USA, 2008.
  9. Alligood, K.T.; Sauer, T.; Yorke, J.A.; Crawford, J.D. Chaos: An Introduction to Dynamical Systems; Springer: Berlin/Heidelberg, Germany, 1997.
  10. Hanson, J.E.; Crutchfield, J.P. Computational mechanics of cellular automata: An example. Physica D 1997, 103, 169–189.
  11. Bak, P.; Chen, K.; Creutz, M. Self-organized criticality in the Game of Life. Nature 1989, 342, 780–782.
  12. Wolfram, S. A New Kind of Science; Wolfram Media: Champaign, IL, USA, 2002.
  13. Prokopenko, M.; Boschetti, F.; Ryan, A.J. An information-theoretic primer on complexity, self-organization, and emergence. Complexity 2009, 15, 11–28.
  14. Campbell, D.T. ‘Downward causation’ in hierarchically organised biological systems. In Studies in the Philosophy of Biology; Springer: Berlin/Heidelberg, Germany, 1974; pp. 179–186.
  15. Chalmers, D.J. The Conscious Mind: In Search of a Fundamental Theory; Oxford Paperbacks: Oxford, UK, 1996.
  16. Varela, F.J. Neurophenomenology: A Methodological Remedy for the Hard Problem; Imprint Academic: Exeter, UK, 1996; Volume 3, pp. 330–349.
  17. Jackson, S.; Hitchins, D.; Eisner, H. What is the systems approach? Insight 2010, 13, 41–43.
  18. Schmeck, H.; Müller-Schloer, C.; Çakar, E.; Mnif, M.; Richter, U. Adaptivity and self-organization in organic computing systems. ACM Trans. Auton. Adapt. Syst. 2010, 5, 1–32.
  19. Maturana, H.; Varela, F. De Máquinas y Seres Vivos; Editorial Universitaria: Santiago, Chile, 1973.
  20. Veloz, T. Goals as Emergent Autopoietic Processes. Front. Bioeng. Biotechnol. 2021, 9, 720652.
  21. Navarrete, Y.; Davis, S. Fragile Systems. arXiv 2016, arXiv:1609.01972.
  22. Paul, W.; Baschnagel, J. Stochastic Processes: From Physics to Finance; Springer International Publishing: Berlin/Heidelberg, Germany, 2013.
  23. von Neumann, J. Mathematical Foundations of Quantum Mechanics; Princeton University Press: Princeton, NJ, USA, 1955.
  24. Cohen-Tannoudji, C.; Diu, B.; Laloë, F. Quantum Mechanics; Wiley: Hoboken, NJ, USA, 2005.
  25. Barnett, S.M. Quantum Information; Oxford University Press: Oxford, UK, 2009.
  26. Nielsen, M.A.; Chuang, I.L. Quantum Computation and Quantum Information; Cambridge University Press: Cambridge, UK, 2000.
  27. Jaynes, E.T. Probability Theory: The Logic of Science; Cambridge University Press: Cambridge, UK, 2003.
  28. Cover, T.M.; Thomas, J.A. Elements of Information Theory; John Wiley and Sons: Hoboken, NJ, USA, 2006.
  29. Schack, R.; Brun, T.A.; Caves, C.M. Quantum Bayes rule. Phys. Rev. A 2001, 64, 014305.
  30. Caves, C.M.; Fuchs, C.A.; Schack, R. Quantum probabilities as Bayesian probabilities. Phys. Rev. A 2002, 65, 022305.
  31. Hayashi, M. Quantum Information: An Introduction; Springer: Berlin/Heidelberg, Germany, 2006.
  32. Wilde, M.M. Quantum Information Theory; Cambridge University Press: Cambridge, UK, 2013.
  33. Dirac, P.A.M. The Principles of Quantum Mechanics; Clarendon Press: Oxford, UK, 1930.
  34. de Bianchi, M.S. The Observer Effect. Found. Sci. 2013, 18, 213–243.
  35. Rogala, M.; Dabrowski, P.; Kowalczyk, P.J.; Wlasny, I.; Kozlowski, W.; Busiakiewicz, A.; Karaduman, I.; Lipinska, L.; Baranowski, J.M.; Klusek, Z. The observer effect in graphene oxide—How the standard measurements affect the chemical and electronic structure. Carbon 2016, 103, 235–241.
Table 1. The set of joint probabilities $p_{ij} = P(s = s_i, s' = s_j|\mathcal{I})$ (third column) for the fragile system arising from the two-integer example described in Section 6. The fourth column shows the values of the elements of the corresponding density matrix.

i   j   p_ij   ρ_ij
1   1   1/4    1/2
1   2   0      0
1   3   0      0
1   4   1/4    1/2
2   1   0      0
2   2   0      0
2   3   0      0
2   4   0      0
3   1   0      0
3   2   0      0
3   3   0      0
3   4   0      0
4   1   1/4    1/2
4   2   0      0
4   3   0      0
4   4   1/4    1/2
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
