Logical Entropy
Open Access
Volume 5, 2022
Article Number: 1
Number of pages: 33
Section: Physics - Applied Physics
DOI: https://doi.org/10.1051/fopen/2021004
Published online: 13 January 2022
  • Birkhoff G (1948), Lattice theory, American Mathematical Society, New York.
  • Grätzer G (2003), General lattice theory, 2nd edn., Birkhäuser Verlag, Boston.
  • Ellerman D (2010), The logic of partitions: introduction to the dual of the logic of subsets. Rev Symb Log 3, 287–350. https://doi.org/10.1017/S1755020310000018.
  • Ellerman D (2014), An introduction to partition logic. Log J IGPL 22, 94–125. https://doi.org/10.1093/jigpal/jzt036.
  • Lawvere FW, Rosebrugh R (2003), Sets for mathematics, Cambridge University Press, Cambridge.
  • Rota G-C (2001), Twelve problems in probability no one likes to bring up, in: H. Crapo, D. Senato (Eds.), Algebraic combinatorics and computer science: a tribute to Gian-Carlo Rota, Springer, Milano, pp. 57–93.
  • Rota G-C (1998), Probability Vol. I & II: The Guidi notes, MIT Copy Services, Cambridge, MA.
  • Ellerman D (2021), The logical theory of canonical maps: the elements and distinctions analysis of the morphisms, duality, canonicity and universal constructions in Set. arXiv preprint, https://arxiv.org/abs/2104.08583.
  • Halmos PR (1974), Measure theory, Springer-Verlag, New York.
  • Rao KPSB, Rao MB (1983), Theory of charges: a study of finitely additive measures, Academic Press, London.
  • Wilkins J (1707), Mercury, or the secret and swift messenger, London. Originally published in 1641.
  • Gleick J (2011), The information: a history, a theory, a flood, Pantheon, New York.
  • Bateson G (1979), Mind and nature: a necessary unity, Dutton, New York.
  • Gini C (1912), Variabilità e mutabilità [Variability and mutability], Tipografia di Paolo Cuppini, Bologna.
  • Friedman WF (1922), The index of coincidence and its applications in cryptography, Riverbank Laboratories, Geneva, IL.
  • Kullback S (1976), Statistical methods in cryptanalysis, Aegean Park Press, Walnut Creek, CA.
  • Rejewski M (1981), How Polish mathematicians deciphered the Enigma. IEEE Ann Hist Comput 3, 213–234.
  • Simpson EH (1949), Measurement of diversity. Nature 163, 688.
  • Ricotta C, Szeidl L (2006), Towards a unifying approach to diversity measures: bridging the gap between the Shannon entropy and Rao’s quadratic index. Theor Popul Biol 70, 237–243. https://doi.org/10.1016/j.tpb.2006.06.003.
  • Nei M (1973), Analysis of gene diversity in subdivided populations. Proc Nat Acad Sci USA 70, 3321–3323.
  • Good IJ (1979), A.M. Turing’s statistical work in World War II. Biometrika 66, 393–396.
  • Good IJ (1982), Comment (on Patil and Taillie: diversity as a concept and its measurement). J Am Stat Assoc 77, 561–563.
  • Stigler SM (1999), Statistics on the table, Harvard University Press, Cambridge, MA.
  • Hirschman AO (1945), National power and the structure of foreign trade, University of California Press, Berkeley.
  • Herfindahl OC (1950), Concentration in the US steel industry, unpublished doctoral dissertation, Columbia University.
  • Rao CR (1982), Diversity and dissimilarity coefficients: a unified approach. Theor Popul Biol 21, 24–43.
  • Havrda J, Charvát F (1967), Quantification methods of classification processes: concept of structural α-entropy. Kybernetika (Prague) 3, 30–35.
  • Tsallis C (1988), Possible generalization of Boltzmann-Gibbs statistics. J Stat Phys 52, 479–487.
  • Brukner Č, Zeilinger A (2000), Operationally invariant information in quantum measurements. arXiv preprint, 19 May 2000, https://arxiv.org/abs/quant-ph/0005084.
  • Brukner Č, Zeilinger A (2003), Information and fundamental elements of the structure of quantum theory, in: L. Castell, O. Ischebeck (Eds.), Time, quantum and information, Springer-Verlag, Berlin, pp. 323–354.
  • Shannon CE (1948), A mathematical theory of communication. Bell Syst Tech J 27, 379–423, 623–656.
  • Shannon CE, Weaver W (1964), The mathematical theory of communication, University of Illinois Press, Urbana.
  • Shannon CE (1993), The Bandwagon, in: N.J.A. Sloane, A.D. Wyner (Eds.), Claude E. Shannon: Collected Papers, IEEE Press, Piscataway, NJ, p. 462.
  • Tribus M (1978), Thirty years of information theory, in: R.D. Levine, M. Tribus (Eds.), The maximum entropy formalism, MIT Press, Cambridge, MA, pp. 1–14.
  • Shannon CE (1993), Some topics in information theory, in: N.J.A. Sloane, A.D. Wyner (Eds.), Claude E. Shannon: Collected Papers, IEEE Press, Piscataway, NJ, pp. 458–459.
  • Ramshaw JD (2018), The statistical foundations of entropy, World Scientific Publishing, Singapore.
  • Lewis GN (1930), The symmetry of time in physics. Science 71, 569–577.
  • Brillouin L (1962), Science and information theory, Academic Press, New York.
  • Aczél J, Daróczy Z (1975), On measures of information and their characterization, Academic Press, New York.
  • Campbell LL (1965), Entropy as a measure. IEEE Trans Inform Theory IT-11, 112–114.
  • Doob JL (1994), Measure theory, Springer Science+Business Media, New York.
  • Pólya G, Szegő G (1998), Problems and theorems in analysis, Vol. II, Springer-Verlag, Berlin.
  • Hu KT (1962), On the amount of information. Theory Probab Appl 7, 439–447. https://doi.org/10.1137/1107041.
  • Ryser HJ (1963), Combinatorial mathematics, Mathematical Association of America, Washington, DC.
  • Takács L (1967), On the method of inclusion and exclusion. J Am Stat Assoc 62, 102–113. https://doi.org/10.1080/01621459.1967.10482891.
  • Cover T, Thomas J (2006), Elements of information theory, 2nd edn., John Wiley and Sons, Hoboken, NJ.
  • Csiszár I, Körner J (1981), Information theory: coding theorems for discrete memoryless systems, Academic Press, New York.
  • Wilson RJ (1972), Introduction to graph theory, Longman, London.
  • Rozeboom WW (1968), The theory of abstract partials: an introduction. Psychometrika 33, 133–167.
  • McGill WJ (1954), Multivariate information transmission. Trans IRE Prof Group Inform Theory 4, 93–111. https://doi.org/10.1109/TIT.1954.1057469.
  • Fano RM (1961), Transmission of information, MIT Press, Cambridge, MA.
  • Yeung RW (1991), A new outlook on Shannon’s information measures. IEEE Trans Inform Theory 37, 466–474. https://doi.org/10.1109/18.79902.
  • MacKay DJC (2003), Information theory, inference, and learning algorithms, Cambridge University Press, Cambridge, UK.
  • Atkins P, de Paula J, Keeler J (2018), Atkins’ physical chemistry, 11th edn., Oxford University Press, Oxford, UK.
  • Johnson E (2018), Anxiety and the equation: understanding Boltzmann’s entropy, MIT Press, Cambridge, MA.
  • Jaynes ET (2003), Probability theory: the logic of science, Cambridge University Press, Cambridge, UK.
  • Kaplan W (1999), Maxima and minima with applications: practical optimization and duality, John Wiley & Sons, New York.
  • Best MJ (2017), Quadratic programming with computer programs, CRC Press, Boca Raton, FL.
  • Jaynes ET (1978), Where do we stand on maximum entropy? in: R.D. Levine, M. Tribus (Eds.), The maximum entropy formalism, MIT Press, Cambridge, MA, pp. 15–118.
  • Papoulis A (1990), Probability and statistics, Prentice-Hall, Englewood Cliffs, NJ.
  • Dantzig GB (1963), Linear programming and extensions, Princeton University Press, Princeton.
  • Kullback S, Leibler RA (1951), On information and sufficiency. Ann Math Stat 22, 79–86. https://doi.org/10.1214/aoms/1177729694.
  • Rao CR (2010), Quadratic entropy and analysis of diversity. Sankhyā Indian J Stat 72-A, 70–80.
  • Zhang Y, Wu H, Cheng L (2012), Some new deformation formulas about variance and covariance, in: Proceedings of the 2012 International Conference on Modelling, Identification and Control (ICMIC2012), pp. 987–992.
  • McEliece RJ (1977), The theory of information and coding: a mathematical framework for communication (Encyclopedia of Mathematics and its Applications, Vol. 3), Addison-Wesley, Reading, MA.
  • Tamir B, Cohen E (2015), A Holevo-type bound for a Hilbert-Schmidt distance measure. J Quantum Inf Sci 5, 127–133. https://doi.org/10.4236/jqis.2015.54015.
  • Ellerman D (2018), Logical entropy: introduction to classical and quantum logical information theory. Entropy 20, 679. https://doi.org/10.3390/e20090679.
  • Auletta G, Fortunato M, Parisi G (2009), Quantum mechanics, Cambridge University Press, Cambridge, UK.
  • Bennett CH (2003), Quantum information: qubits and quantum error correction. Int J Theor Phys 42, 153–176. https://doi.org/10.1023/A:1024439131297.
  • Jaeger G (2007), Quantum information: an overview, Springer Science+Business Media, New York.
  • Manfredi G, Feix MR (2000), Entropy and Wigner functions. Phys Rev E 62, 4665–4674. https://doi.org/10.1103/PhysRevE.62.4665.
  • Birkhoff G, Von Neumann J (1936), The logic of quantum mechanics. Ann Math 37, 823–843.
  • Ellerman D (2017), Quantum mechanics over sets: a pedagogical model with non-commutative finite probability theory as its quantum probability calculus. Synthese 194, 4863–4896. https://doi.org/10.1007/s11229-016-1175-0.
  • Ellerman D (2018), The quantum logic of direct-sum decompositions: the dual to the quantum logic of subspaces. Logic J IGPL 26, 1–13. https://doi.org/10.1093/jigpal/jzx026.
  • Hoffman K, Kunze R (1961), Linear algebra, Prentice-Hall, Englewood Cliffs, NJ.
  • Kolmogorov AN (1983), Combinatorial foundations of information theory and the calculus of probabilities. Russian Math Surv 38, 29–40.
  • Zurek WH (2003), Decoherence, einselection, and the quantum origins of the classical. Rev Mod Phys 75, 715–775.
  • Fano U (1957), Description of states in quantum mechanics by density matrix and operator techniques. Rev Mod Phys 29, 74–93.
  • Nielsen M, Chuang I (2000), Quantum computation and quantum information, Cambridge University Press, Cambridge.
  • Tamir B, Cohen E (2014), Logical entropy for quantum states. arXiv preprint, https://arxiv.org/abs/1412.0616v2.
  • Ellerman D (2009), Counting distinctions: on the conceptual foundations of Shannon’s information theory. Synthese 168, 119–149. https://doi.org/10.1007/s11229-008-9333-7.
  • Ellerman D (2021), New foundations for information theory: logical entropy and Shannon entropy, Springer Nature, Cham, Switzerland.
  • Tamir B, Paiva IL, Schwartzman-Nowik Z, Cohen E (2021), Quantum logical entropy: fundamentals and general properties. arXiv preprint, https://arxiv.org/abs/2108.02726.