Search This Blog

Thursday 23 November 2017

Presentation with original massive heterogeneity argument against multiple realisability of human-like intelligence and cognition...

Sunday 19 November 2017

Bibliography for my symposium talk for the working group from Beijing Council...

Adami, C. (2004). "Information theory in molecular biology." Physics of Life Reviews 1(1): 3-22.
This article introduces the physics of information in the context of molecular biology and genomics. Entropy and information, the two central concepts of Shannon's theory of information and communication, are often confused with each other but play transparent roles when applied to statistical ensembles (i.e., identically prepared sets) of symbolic sequences. Such an approach can distinguish between entropy and information in genes, predict the secondary structure of ribozymes, and detect the covariation between residues in folded proteins. We also review applications to molecular sequence and structure analysis, and introduce new tools in the characterization of resistance mutations, and in drug design. © 2004 Elsevier B.V. All rights reserved.;This article introduces the physics of information in the context of molecular biology and genomics. Entropy and information, the two central concepts of Shannon's theory of information and communication, are often confused with each other but play transparent roles when applied to statistical ensembles (i.e., identically prepared sets) of symbolic sequences. Such an approach can distinguish between entropy and information in genes, predict the secondary structure of ribozymes, and detect the covariation between residues in folded proteins. We also review applications to molecular sequence and structure analysis, and introduce new tools in the characterization of resistance mutations, and in drug design.;This article introduces the physics of information in the context of molecular biology and genomics. Entropy and information, the two central concepts of Shannon's theory of information and communication, are often confused with each other but play transparent roles when applied to statistical ensembles (i.e., identically prepared sets) of symbolic sequences. 
Such an approach can distinguish between entropy and information in genes, predict the secondary structure of ribozymes, and detect the covariation between residues in folded proteins. We also review applications to molecular sequence and structure analysis, and introduce new tools in the characterization of resistance mutations, and in drug design. (c) 2004 Elsevier B.V. All rights reserved.;

Adami, C. (2015). "Information-Theoretic Considerations Concerning the Origin of Life." Origins of Life and Evolution of Biospheres 45(3): 309-317.
Research investigating the origins of life usually either focuses on exploring possible life-bearing chemistries in the pre-biotic Earth, or else on synthetic approaches. Comparatively little work has explored fundamental issues concerning the spontaneous emergence of life using only concepts (such as information and evolution) that are divorced from any particular chemistry. Here, I advocate studying the probability of spontaneous molecular self-replication as a function of the information contained in the replicator, and the environmental conditions that might enable this emergence. I show (under certain simplifying assumptions) that the probability to discover a self-replicator by chance depends exponentially on the relative rate of formation of the monomers. If the rate at which monomers are formed is somewhat similar to the rate at which they would occur in a self-replicating polymer, the likelihood to discover such a replicator by chance is increased by many orders of magnitude. I document such an increase in searches for a self-replicator within the digital life system avida.;Research investigating the origins of life usually focuses on exploring possible life-bearing chemistries in the pre-biotic Earth, or else on synthetic approaches. Little work has been done exploring fundamental issues concerning the spontaneous emergence of life using only concepts (such as information and evolution) that are divorced from any particular chemistry. Here, I advocate studying the probability of spontaneous molecular self-replication as a function of the information contained in the replicator, and the environmental conditions that might enable this emergence. I show that (under certain simplifying assumptions) the probability to discover a self-replicator by chance depends exponentially on the rate of formation of the monomers. 
If the rate at which monomers are formed is somewhat similar to the rate at which they would occur in a self-replicating polymer, the likelihood to discover such a replicator by chance is increased by many orders of magnitude. I document such an increase in searches for a self-replicator within the digital life system avida; Research investigating the origins of life usually either focuses on exploring possible life-bearing chemistries in the pre-biotic Earth, or else on synthetic approaches. Comparatively little work has explored fundamental issues concerning the spontaneous emergence of life using only concepts (such as information and evolution) that are divorced from any particular chemistry. Here, I advocate studying the probability of spontaneous molecular self-replication as a function of the information contained in the replicator, and the environmental conditions that might enable this emergence. I show (under certain simplifying assumptions) that the probability to discover a self-replicator by chance depends exponentially on the relative rate of formation of the monomers. If the rate at which monomers are formed is somewhat similar to the rate at which they would occur in a self-replicating polymer, the likelihood to discover such a replicator by chance is increased by many orders of magnitude. I document such an increase in searches for a self-replicator within the digital life system avida.;Research investigating the origins of life usually either focuses on exploring possible life-bearing chemistries in the pre-biotic Earth, or else on synthetic approaches. Comparatively little work has explored fundamental issues concerning the spontaneous emergence of life using only concepts (such as information and evolution) that are divorced from any particular chemistry. 
Here, I advocate studying the probability of spontaneous molecular self-replication as a function of the information contained in the replicator, and the environmental conditions that might enable this emergence. I show (under certain simplifying assumptions) that the probability to discover a self-replicator by chance depends exponentially on the relative rate of formation of the monomers. If the rate at which monomers are fo med is somewhat similar to the rate at which they would occur in a self-replicating polymer, the likelihood to discover such a replicator by chance is increased by many orders of magnitude. I document such an increase in searches for a self-replicator within the digital life system avida.;Research investigating the origins of life usually either focuses on exploring possible life-bearing chemistries in the pre-biotic Earth, or else on synthetic approaches. Comparatively little work has explored fundamental issues concerning the spontaneous emergence of life using only concepts (such as information and evolution) that are divorced from any particular chemistry. Here, I advocate studying the probability of spontaneous molecular self-replication as a function of the information contained in the replicator, and the environmental conditions that might enable this emergence. I show (under certain simplifying assumptions) that the probability to discover a self-replicator by chance depends exponentially on the relative rate of formation of the monomers. If the rate at which monomers are formed is somewhat similar to the rate at which they would occur in a self-replicating polymer, the likelihood to discover such a replicator by chance is increased by many orders of magnitude. I document such an increase in searches for a self-replicator within the digital life system avida.;

Adams, F. and J. A. de Moraes (2016). "Is There a Philosophy of Information?" Topoi 35(1): 161-171.
In 2002, Luciano Floridi published a paper called What is the Philosophy of Information?, where he argues for a new paradigm in philosophical research. To what extent should his proposal be accepted? Is the Philosophy of Information actually a new paradigm, in the Kuhninan sense, in Philosophy? Or is it only a new branch of Epistemology? In our discussion we will argue in defense of Floridi's proposal. We believe that Philosophy of Information has the types of features had by other areas already acknowledge as authentic in Philosophy. By way of an analogical argument we will argue that since Philosophy of Information has its own topics, method and problems it would be counter-intuitive not to accept it as a new philosophical area. To strengthen our position we present and discuss main topics of Philosophy of Information.;In 2002, Luciano Floridi published a paper called What is the Philosophy of Information?, where he argues for a new paradigm in philosophical research. To what extent should his proposal be accepted? Is the Philosophy of Information actually a new paradigm, in the Kuhninan sense, in Philosophy? Or is it only a new branch of Epistemology? In our discussion we will argue in defense of Floridi’s proposal. We believe that Philosophy of Information has the types of features had by other areas already acknowledge as authentic in Philosophy. By way of an analogical argument we will argue that since Philosophy of Information has its own topics, method and problems it would be counter-intuitive not to accept it as a new philosophical area. To strengthen our position we present and discuss main topics of Philosophy of Information.;

Adriaans, P. (2010). "A Critical Analysis of Floridi's Theory of Semantic Information." Knowledge, Technology & Policy 23(1-2): 1-16.
Issue Title: Special Issue: Luciano Floridi's Philosophy of Technology: Critical Reflections / Guest Edited by Hilmi Demir In various publications over the past years, Floridi has developed a theory of semantic information as well-formed, meaningful, and truthful data. This theory is more or less orthogonal to the standard entropy-based notions of information known from physics, information theory, and computer science that all define the amount of information in a certain system as a scalar value without any direct semantic implication. In this context the question rises what the exact relation between these various conceptions of information is and whether there is a real need to enrich these mathematically more or less rigid definitions with a less formal notion of semantic information. I investigate various philosophical aspects of the more formal definitions of information in the light of Floridi's theory. The position I defend is that the formal treatment of the notion of information as a general theory of entropy is one of the fundamental achievements of modern science that in itself is a rich source for new philosophical reflection. This makes information theory a competitor of classical epistemology rather than a servant. In this light Floridi's philosophy of information is more a reprise of classical epistemology that only pays lip service to information theory but fails to address the important central questions of philosophy of information. Specifically, I will defend the view that notions that are associated with truth, knowledge, and meaning all can adequately be reconstructed in the context of modern information theory and that consequently there is no need to introduce a concept of semantic information.[PUBLICATION ABSTRACT]

Agte, S., et al. (2017). "Two different mechanosensitive calcium responses in Müller glial cells of the guinea pig retina: Differential dependence on purinergic receptor signaling: Calcium Waves in Retinal Glial Cells." Glia 65(1): 62-74.

Aguirre, A., et al. (2015). It From Bit or Bit From It?: On Physics and Information. Cham, Springer International Publishing.

Allo, P. and E. Mares (2012). "Informational Semantics as a Third Alternative?" Erkenntnis (1975-) 77(2): 167-185.
Informational semantics were first developed as an interpretation of the model-theory of substructural (and especially relevant) logics. In this paper we argue that such a semantics is of independent value and that it should be considered as a genuine alternative explication of the notion of logical consequence alongside the traditional model-theoretical and the proof-theoretical accounts. Our starting point is the content-nonexpansion platitude which stipulates that an argument is valid iff the content of the conclusion does not exceed the combined content of the premises. We show that this basic platitude can be used to characterise the extension of classical as well as non-classical consequence relations. The distinctive trait of an informational semantics is that truth-conditions are replaced by information-conditions. The latter leads to an inversion of the usual order of explanation: Considerations about logical discrimination (how finely propositions are individuated) are conceptually prior to considerations about deductive strength. Because this allows us to bypass considerations about truth, an informational semantics provides an attractive and metaphysically unencumbered account of logical consequence, non-classical logics, logical rivalry and pluralism about logical consequence.;Informational semantics were first developed as an interpretation of the model-theory of substructural (and especially relevant) logics. In this paper we argue that such a semantics is of independent value and that it should be considered as a genuine alternative explication of the notion of logical consequence alongside the traditional model-theoretical and the proof-theoretical accounts. Our starting point is the content-nonexpansion platitude which stipulates that an argument is valid iff the content of the conclusion does not exceed the combined content of the premises. 
We show that this basic platitude can be used to characterise the extension of classical as well as non-classical consequence relations. The distinctive trait of an informational semantics is that truth-conditions are replaced by information-conditions. The latter leads to an inversion of the usual order of explanation: Considerations about logical discrimination (how finely propositions are individuated) are conceptually prior to considerations about deductive strength. Because this allows us to bypass considerations about truth, an informational semantics provides an attractive and metaphysically unencumbered account of logical consequence, non-classical logics, logical rivalry and pluralism about logical consequence.;Informational semantics were first developed as an interpretation of the model-theory of substructural (and especially relevant) logics. In this paper we argue that such a semantics is of independent value and that it should be considered as a genuine alternative explication of the notion of logical consequence alongside the traditional model-theoretical and the proof-theoretical accounts. Our starting point is the content-nonexpansion platitude which stipulates that an argument is valid iff the content of the conclusion does not exceed the combined content of the premises. We show that this basic platitude can be used to characterise the extension of classical as well as non-classical consequence relations. The distinctive trait of an informational semantics is that truth-conditions are replaced by information-conditions. The latter leads to an inversion of the usual order of explanation: Considerations about logical discrimination (how finely propositions are individuated) are conceptually prior to considerations about deductive strength. 
Because this allows us to bypass considerations about truth, an informational semantics provides an attractive and metaphysically unencumbered account of logical consequence, non-classical logics, logical rivalry and pluralism about logical consequence. Adapted from the source document;Informational semantics were first developed as an interpretation of the model-theory of substructural (and especially relev nt) logics. In this paper we argue that such a semantics is of independent value and that it should be considered as a genuine alternative explication of the notion of logical consequence alongside the traditional model-theoretical and the proof-theoretical accounts. Our starting point is the content-nonexpansion platitude which stipulates that an argument is valid iff the content of the conclusion does not exceed the combined content of the premises. We show that this basic platitude can be used to characterise the extension of classical as well as non-classical consequence relations. The distinctive trait of an informational semantics is that truth-conditions are replaced by information-conditions. The latter leads to an inversion of the usual order of explanation: Considerations about logical discrimination (how finely propositions are individuated) are conceptually prior to considerations about deductive strength. Because this allows us to bypass considerations about truth, an informational semantics provides an attractive and metaphysically unencumbered account of logical consequence, non-classical logics, logical rivalry and pluralism about logical consequence.; Informational semantics were first developed as an interpretation of the model-theory of substructural (and especially relevant) logics. In this paper we argue that such a semantics is of independent value and that it should be considered as a genuine alternative explication of the notion of logical consequence alongside the traditional model-theoretical and the proof-theoretical accounts. 
Our starting point is the content-nonexpansion platitude which stipulates that an argument is valid iff the content of the conclusion does not exceed the combined content of the premises. We show that this basic platitude can be used to characterise the extension of classical as well as non-classical consequence relations. The distinctive trait of an informational semantics is that truth-conditions are replaced by information-conditions. The latter leads to an inversion of the usual order of explanation: Considerations about logical discrimination (how finely propositions are individuated) are conceptually prior to considerations about deductive strength. Because this allows us to bypass considerations about truth, an informational semantics provides an attractive and metaphysically unencumbered account of logical consequence, non-classical logics, logical rivalry and pluralism about logical consequence.[PUBLICATION ABSTRACT];Informational semantics were first developed as an interpretation of the model-theory of substructural (and especially relevant) logics. In this paper we argue that such a semantics is of independent value and that it should be considered as a genuine alternative explication of the notion of logical consequence alongside the traditional model-theoretical and the proof-theoretical accounts. Our starting point is the content-nonexpansion platitude which stipulates that an argument is valid iff the content of the conclusion does not exceed the combined content of the premises. We show that this basic platitude can be used to characterise the extension of classical as well as non-classical consequence relations. The distinctive trait of an informational semantics is that truth-conditions are replaced by information-conditions. The latter leads to an inversion of the usual order of explanation: Considerations about logical discrimination (how finely propositions are individuated) are conceptually prior to considerations about deductive strength. 
Because this allows us to bypass considerations about truth, an informational semantics provides an attractive and metaphysically unencumbered account of logical consequence, non-classical logics, logical rivalry and pluralism about logical consequence.;

Al-Safi, S. W. and A. J. Short (2011). "Information causality from an entropic and a probabilistic perspective." Phys. Rev. A 84(4): 042323.

Anonymous (2011). Foresight and Information Flows, National Bureau of Economic Research.

Artmann, S. (2008). Biological Information: 22-39.

Asselmeyer-Maluga, T. (2015). Spacetime Weave—Bit as the Connection Between Its or the Informational Content of Spacetime. It From Bit or Bit From It?, Springer: 129-142.

Aydede, M. and G. Güzeldere (2005). "Cognitive Architecture, Concepts, and Introspection: An Information‐Theoretic Solution to the Problem of Phenomenal Consciousness." NOUS 39(2): 197-255.

Baltag, A. and S. Smets (2015). "Logics of Informational Interactions." Journal of Philosophical Logic 44(6): 595-607.
The pre-eminence of logical dynamics, over a static and purely propositional view of Logic, lies at the core of a new understanding of both formal epistemology and the logical foundations of quantum mechanics. Both areas appear at first sight to be based on purely static propositional formalisms, but in our view their fundamental operators are essentially dynamic in nature. Quantum logic can be best understood as the logic of physically-constrained informational interactions (in the form of measurements and entanglement) between subsystems of a global physical system. Similarly, (multi-agent) epistemic logic is the logic of socially-constrained informational interactions (in the form of direct observations, learning, various forms of communication and testimony) between "subsystems" of a social system. Dynamic Epistemic Logic (DEL) provides us with a unifying setting in which these informational interactions, coming from seemingly very different areas of research, can be fully compared and analyzed. The DEL formalism comes with a powerful set of tools that allows us to make the underlying dynamic/interactive mechanisms fully transparent.;The pre-eminence of logical dynamics, over a static and purely propositional view of Logic, lies at the core of a new understanding of both formal epistemology and the logical foundations of quantum mechanics. Both areas appear at first sight to be based on purely static propositional formalisms, but in our view their fundamental operators are essentially dynamic in nature. Quantum logic can be best understood as the logic of physically-constrained informational interactions (in the form of measurements and entanglement) between subsystems of a global physical system. Similarly, (multi-agent) epistemic logic is the logic of socially-constrained informational interactions (in the form of direct observations, learning, various forms of communication and testimony) between “subsystems” of a social system. 
Dynamic Epistemic Logic (DEL) provides us with a unifying setting in which these informational interactions, coming from seemingly very different areas of research, can be fully compared and analyzed. The DEL formalism comes with a powerful set of tools that allows us to make the underlying dynamic/interactive mechanisms fully transparent.; Issue Title: JPL40: The Fortieth Anniversary Issue The pre-eminence of logical dynamics, over a static and purely propositional view of Logic, lies at the core of a new understanding of both formal epistemology and the logical foundations of quantum mechanics. Both areas appear at first sight to be based on purely static propositional formalisms, but in our view their fundamental operators are essentially dynamic in nature. Quantum logic can be best understood as the logic of physically-constrained informational interactions (in the form of measurements and entanglement) between subsystems of a global physical system. Similarly, (multi-agent) epistemic logic is the logic of socially-constrained informational interactions (in the form of direct observations, learning, various forms of communication and testimony) between "subsystems" of a social system. Dynamic Epistemic Logic (DEL) provides us with a unifying setting in which these informational interactions, coming from seemingly very different areas of research, can be fully compared and analyzed. The DEL formalism comes with a powerful set of tools that allows us to make the underlying dynamic/interactive mechanisms fully transparent.;

Barbieri, M. (2012). "What is Information?" Biosemiotics 5(2): 147-152.

Bar-Hillel, Y. and R. Carnap (1953). "Semantic Information." The British Journal for the Philosophy of Science 4(14): 147-157.

Barwise, J. (1983). "Information and semantics." Behavioral and Brain Sciences 6(1): 65-65.

Barwise, J., et al. (1996). Information Flow and the Lambek Calculus: 49-64.

Bassi, A., et al. (2015). Information and the foundations of quantum theory. It From Bit or Bit From It?, Springer: 87-95.

Baumgaertner, B. and L. Floridi (2016). "Introduction: The Philosophy of Information." Topoi 35(1): 157-159.

Bavaud, F. (2009). Information theory, relative entropy and statistics.

Beavers, A. F. and C. D. Harrison (2012). Information-theoretic teleodynamics in natural and artificial systems.

Bekenstein, J. D. (2003). Information in the Holographic Universe.
By studying the mysterious properties of black holes, physicists have reduced absolute limits on how much information a region of space or a quantity of matter and energy can hold. The holographic principle holds that the principle is like a hologram, and that the seemingly three-dimensional universe could be completely equivalent to alternative quantum fields and physical laws painted on a distant, vast surface.

Beni, M. D. (2016). "Epistemic Informational Structural Realism." MINDS AND MACHINES 26(4): 323-339.
The paper surveys Floridi’s attempt for laying down informational structural realism (ISR). After considering a number of reactions to the pars destruens of Floridi’s attack on the digital ontology, I show that Floridi’s enterprise for enriching the ISR by borrowing elements from the ontic form of structural realism (in the pars construens) is blighted by a haunting inconsistency. ISR has been originally developed by Floridi as a restricted and level dependent form of structural realism which remains mainly bonded within the borders of a Kantian perspective. I argue that this perspective doesn’t mesh nicely with the ontic interpretation that Floridi attached to the ISR. I substantiate this claim through the assessment of Floridi’s strategy for reconciling the epistemic and ontic forms of the SR, as well as by close examination of his use of method of levels of abstraction and his notion of semantic information. My proposal is that the ISR could be defended best against the mentioned and similar objections by being interpreted as an extension of the epistemic SR.;The paper surveys Floridi’s attempt for laying down informational structural realism (ISR). After considering a number of reactions to the pars destruens of Floridi’s attack on the digital ontology, I show that Floridi’s enterprise for enriching the ISR by borrowing elements from the ontic form of structural realism (in the pars construens) is blighted by a haunting inconsistency. ISR has been originally developed by Floridi as a restricted and level dependent form of structural realism which remains mainly bonded within the borders of a Kantian perspective. I argue that this perspective doesn’t mesh nicely with the ontic interpretation that Floridi attached to the ISR. 
I substantiate this claim through the assessment of Floridi’s strategy for reconciling the epistemic and ontic forms of the SR, as well as by close examination of his use of method of levels of abstraction and his notion of semantic information. My proposal is that the ISR could be defended best against the mentioned and similar objections by being interpreted as an extension of the epistemic SR.;The paper surveys Floridi’s attempt for laying down informational structural realism (ISR). After considering a number of reactions to the pars destruens of Floridi’s attack on the digital ontology, I show that Floridi’s enterprise for enriching the ISR by borrowing elements from the ontic form of structural realism (in the pars construens) is blighted by a haunting inconsistency. ISR has been originally developed by Floridi as a restricted and level dependent form of structural realism which remains mainly bonded within the borders of a Kantian perspective. I argue that this perspective doesn’t mesh nicely with the ontic interpretation that Floridi attached to the ISR. I substantiate this claim through the assessment of Floridi’s strategy for reconciling the epistemic and ontic forms of the SR, as well as by close examination of his use of method of levels of abstraction and his notion of semantic information. My proposal is that the ISR could be defended best against the mentioned and similar objections by being interpreted as an extension of the epistemic SR.;

Beni, M. D. (2017). "The Downward Path to Epistemic Informational Structural Realism." Acta Analytica.

Bennett, C. H. (2004). "A Resource-based View of Quantum Information." Quantum Info. Comput. 4(6): 460-466.

Bennett, C. H., et al. (1993). Thermodynamics of Computation and Information Distance, ACM.

Bergstrom, C. T. and M. Rosvall (2011). "Response to commentaries on “The Transmission Sense of Information”." Biology & Philosophy 26(2): 195-200.

Berta, M., et al. (2011). "The Quantum Reverse Shannon Theorem Based on One-Shot Information Theory." Communications in Mathematical Physics 306(3): 579-615.
The Quantum Reverse Shannon Theorem states that any quantum channel can be simulated by an unlimited amount of shared entanglement and an amount of classical communication equal to the channel's entanglement assisted classical capacity. In this paper, we provide a new proof of this theorem, which has previously been proved by Bennett, Devetak, Harrow, Shor, and Winter. Our proof has a clear structure being based on two recent information-theoretic results: one-shot Quantum State Merging and the Post-Selection Technique for quantum channels.;The Quantum Reverse Shannon Theorem states that any quantum channel can be simulated by an unlimited amount of shared entanglement and an amount of classical communication equal to the channel's entanglement assisted classical capacity. In this paper, we provide a new proof of this theorem, which has previously been proved by Bennett, Devetak, Harrow, Shor, and Winter. Our proof has a clear structure being based on two recent information-theoretic results: one-shot Quantum State Merging and the Post-Selection Technique for quantum channels.;The Quantum Reverse Shannon Theorem states that any quantum channel can be simulated by an unlimited amount of shared entanglement and an amount of classical communication equal to the channel's entanglement assisted classical capacity. In this paper, we provide a new proof of this theorem, which has previously been proved by Bennett, Devetak, Harrow, Shor, and Winter. Our proof has a clear structure being based on two recent information-theoretic results: one-shot Quantum State Merging and the Post-Selection Technique for quantum channels.;The Quantum Reverse Shannon Theorem states that any quantum channel can be simulated by an unlimited amount of shared entanglement and an amount of classical communication equal to the channel’s entanglement assisted classical capacity. 
In this paper, we provide a new proof of this theorem, which has previously been proved by Bennett, Devetak, Harrow, Shor, and Winter. Our proof has a clear structure being based on two recent information-theoretic results: one-shot Quantum State Merging and the Post-Selection Technique for quantum channels.;

Boden, M. A. (1990). The philosophy of artificial intelligence. New York; Oxford, Oxford University Press.

Borrill, P. L. (2015). An Insight into Information, Entanglement and Time. It From Bit or Bit From It?, Springer: 97-112.

Braunstein, S. L., et al. (2013). "Better late than never: Information retrieval from black holes." PHYSICAL REVIEW LETTERS 110(10): 101301.
We show that, in order to preserve the equivalence principle until late times in unitarily evaporating black holes, the thermodynamic entropy of a black hole must be primarily entropy of entanglement across the event horizon. For such black holes, we show that the information entering a black hole becomes encoded in correlations within a tripartite quantum state, the quantum analogue of a one-time pad, and is only decoded into the outgoing radiation very late in the evaporation. This behavior generically describes the unitary evaporation of highly entangled black holes and requires no specially designed evolution. Our work suggests the existence of a matter-field sum rule for any fundamental theory.

Bremer, M. and D. Cohnitz (2004). Information and Information Flow: An Introduction. Berlin/Boston, De Gruyter.

Brenner, J. E. (2011). "On Representation in Information Theory." Information 2(4): 560-578.
Semiotics is widely applied in theories of information. Following the original triadic characterization of reality by Peirce, the linguistic processes involved in information—production, transmission, reception, and understanding—would all appear to be interpretable in terms of signs and their relations to their objects. Perhaps the most important of these relations is that of the representation-one, entity, standing for or representing some other. For example, an index—one of the three major kinds of signs—is said to represent something by being directly related to its object. My position, however, is that the concept of symbolic representations having such roles in information, as intermediaries, is fraught with the same difficulties as in representational theories of mind. I have proposed an extension of logic to complex real phenomena, including mind and information (Logic in Reality; LIR), most recently at the 4th International Conference on the Foundations of Information Science (Beijing, August, 2010). LIR provides explanations for the evolution of complex processes, including information, that do not require any entities other than the processes themselves. In this paper, I discuss the limitations of the standard relation of representation. I argue that more realistic pictures of informational systems can be provided by reference to information as an energetic process, following the categorial ontology of LIR. This approach enables naïve, anti-realist conceptions of anti-representationalism to be avoided, and enables an approach to both information and meaning in the same novel logical framework.

Brooks, M. (2012). If Information... Then Universe.
Brooks discusses the idea that the universe is a computer and that everything that goes on in it can be explained in terms of information processing. The connection between reality and computing may not be immediately obvious, but strip away the layers and that is exactly what some researchers think people find. People think of the world as made up of particles held together by forces, but quantum theory tells them that these are just a mess of fields they can only properly describe by invoking the mathematics of quantum physics.

Bruza, P. D. and D. Song (2001). "Informational inference via information flow." 12th International Workshop on Database and Expert Systems Applications, Database and Expert Systems Applications, 2001. Proceedings. 12th International Workshop on, Database and expert systems applications: 237.
Human judgments about information would seem to have an inferential character. The article presents an informational inference mechanism realized via computations of information flow through a high dimensional conceptual space. The conceptual space is realized via the Hyperspace Analogue to Language Algorithm (HAL), which produces vector representations of concepts compatible with those used in human information processing. We show how inference at the symbolic level can be implemented by employing Barwise and Seligman's (1996) theory of information flow. The real valued state spaces advocated by them are realized by HAL vectors to represent the information "state" of a word in the context of a collection of words. Examples of information flow are given to illustrate how it can be used to drive informational inference.

Bub, J. (2005). "Quantum Mechanics is About Quantum Information." FOUNDATIONS OF PHYSICS 35(4): 541-560.
I argue that quantum mechanics is fundamentally a theory about the representation and manipulation of information, not a theory about the mechanics of nonclassical waves or particles. The notion of quantum information is to be understood as a new physical primitive—just as, following Einstein's special theory of relativity, a field is no longer regarded as the physical manifestation of vibrations in a mechanical medium, but recognized as a new physical entity in its own right.

Bueno, O. (2010). "Structuralism and Information." Metaphilosophy 41(3): 365-379.
According to Luciano Floridi (2008), informational structural realism provides a framework to reconcile the two main versions of realism about structure: the epistemic formulation (according to which all we can know is structure) and the ontic version (according to which structure is all there is). The reconciliation is achieved by introducing suitable levels of abstraction and by articulating a conception of structural objects in information-theoretic terms. In this essay, I argue that the proposed reconciliation works at the expense of realism. I then propose an alternative framework, in terms of partial structures, that offers a way of combining information and structure in a realist setting while still preserving the distinctive features of the two formulations of structural realism. Suitably interpreted, the proposed framework also makes room for an empiricist form of informational structuralism (structural empiricism). Pluralism then emerges.

Buonomano, D. V. and M. M. Merzenich (1998). "Cortical plasticity: from synapses to maps." Annu Rev Neurosci 21.

Bynum, T. W. (2014). "On the Possibility of Quantum Informational Structural Realism." MINDS AND MACHINES 24(1): 123-139.
In The Philosophy of Information, Luciano Floridi presents an ontological theory of Being qua Being, which he calls "Informational Structural Realism", a theory which applies, he says, to every possible world. He identifies primordial information ("dedomena") as the foundation of any structure in any possible world. The present essay examines Floridi's defense of that theory, as well as his refutation of "Digital Ontology" (which some people might confuse with his own). Then, using Floridi's ontology as a starting point, the present essay adds quantum features to dedomena, yielding an ontological theory for our own universe, Quantum Informational Structural Realism, which provides a metaphysical interpretation of key quantum phenomena, and diminishes the "weirdness" or "spookiness" of quantum mechanics.

Cacciatore, M. A., et al. (2017). "Information flow and communication practice challenges: A global study on effective responsive strategies." Corporate Communications 22(3): 292.
Purpose: How to effectively manage information flow continues to present challenges for effective responsive strategies in communication, reflecting the magnitude and impact of a data-driven and strategy-oriented market environment globally. Therefore, the purpose of this paper is to discover how concerns related to the rise of social media have affected communication leaders' operational and managerial practice from an international perspective. The overarching aim is to better understand these concerns in order to contribute to effective responsive strategies in communication practice in the future. Design/methodology/approach: The authors relied on data from an international online survey of public relations and communication professionals in multiple countries who were asked their perceptions and behaviors concerning the impact of information flow and the digital revolution on their practice. ANOVA analyses and hierarchical regression models were used to identify the heterogeneity across five clustered groups of countries. Findings: Results confirmed a strong desire among communication professionals in multiple countries to learn more about information management in practice. Results identified the overall patterns of responsive strategies that have been widely adopted by public relations professionals in specific country clusters across the globe. In order to better manage social media and the digital revolution, all five of the surveyed country clusters indicated that it is effective to integrate more social media strategies and to train employees in social media. Originality/value: The research has explored the importance surrounding information management in an era of widespread digital content, including how concerns in this area have affected strategic decision-making in communication practice. Equally important, the authors provide a more global perspective on this critical topic by analyzing communication professionals' perceptions in grouped country clusters. Results of the research have identified the similarities and differences in responsive strategies to cope with information flow concerns across grouped country clusters.

Calude, C. S. (2009). Information: the algorithmic paradigm.

Carnap, R. and Y. Bar-Hillel (1952). An Outline of a Theory of Semantic Information. CAMBRIDGE, MASSACHUSETTS, RESEARCH LABORATORY OF ELECTRONICS MASSACHUSETTS INSTITUTE OF TECHNOLOGY.

Caticha, A. (2014). "Towards an Informational Pragmatic Realism." MINDS AND MACHINES 24(1): 37-70.
I discuss the design of the method of entropic inference as a general framework for reasoning under conditions of uncertainty. The main contribution of this discussion is to emphasize the pragmatic elements in the derivation. More specifically: (1) Probability theory is designed as the uniquely natural tool for representing states of incomplete information. (2) An epistemic notion of information is defined in terms of its relation to the Bayesian beliefs of ideally rational agents. (3) The method of updating from a prior to a posterior probability distribution is designed through an eliminative induction process that singles out the logarithmic relative entropy as the unique tool for inference. The resulting framework includes as special cases both MaxEnt and Bayes' rule. It therefore unifies entropic and Bayesian methods into a single general inference scheme. I find that similar pragmatic elements are an integral part of Putnam's internal realism, of Floridi's informational structural realism, and also of van Fraassen's empiricist structuralism. I conclude with the conjecture that their valuable insights can be incorporated into a single coherent doctrine—an informational pragmatic realism.

Chaitin, G. (1974). "Information-Theoretic Limitations of Formal Systems." Journal of the ACM (JACM) 21(3): 403-424.
An attempt is made to apply information-theoretic computational complexity to meta-mathematics. The paper studies the number of bits of instructions that must be given to a computer for it to perform finite and infinite tasks, and also the time it takes the computer to perform these tasks. This is applied to measuring the difficulty of proving a given set of theorems, in terms of the number of bits of axioms that are assumed, and the size of the proofs needed to deduce the theorems from the axioms.

Chaitin, G. (1975). "A Theory of Program Size Formally Identical to Information Theory." Journal of the ACM (JACM) 22(3): 329-340.

Chatzisavvas, K. C., et al. (2005). "Information entropy, information distances, and complexity in atoms." Journal of Chemical Physics 123(17): 174111.
Shannon information entropies in position and momentum spaces and their sum are calculated as functions of Z (2 <= Z <= 54) in atoms. Roothaan-Hartree-Fock electron wave functions are used. The universal property S = a + b ln Z is verified. In addition, we calculate the Kullback-Leibler relative entropy, the Jensen-Shannon divergence, Onicescu's information energy, and a complexity measure recently proposed. Shell effects at closed-shell atoms are observed. The complexity measure shows local minima at the closed-shell atoms, indicating that for the above atoms complexity decreases with respect to neighboring atoms. It is seen that complexity fluctuates around an average value, indicating that the atom cannot grow in complexity as Z increases. Onicescu's information energy is correlated with the ionization potential. Kullback distance and Jensen-Shannon distance are employed to compare Roothaan-Hartree-Fock density distributions with other densities of previous works.

Chemero, A. (2003). "Information for Perception and Information Processing." MINDS AND MACHINES 13(4): 577-588.
Do psychologists and computer/cognitive scientists mean the same thing by the term 'information'? In this essay, I answer this question by comparing information as understood by Gibsonian, ecological psychologists with information as understood in Barwise and Perry's situation semantics. I argue that, with suitable massaging, these views of information can be brought into line. I end by discussing some issues in (the philosophy of) cognitive science and artificial intelligence.

Chen, J. (2001). Mao's China and the cold war. Chapel Hill, University of North Carolina Press.

Chong, A. (2014). "Information Warfare?: The Case for an Asian Perspective on Information Operations." Armed Forces & Society 40(4): 599-624.
While information warfare (IW) has been treated by its foremost western proponents as a strategic revolution, the reasons for such a claim are actually rather weak if one considers how non-western approaches to the informational components of warfare have put forth their positions within a multidimensional context of strategy. This article ventures an Asian perspective that can potentially offer a more nuanced contribution to the study of IW. This article will pan out by first critically analyzing the predominantly American interpretation of IW as a set of five characteristics that can be contrasted to an Asian rival. Subsequently, we will elaborate a list of features likely to characterize a generic Asian IW approach, which, I will argue, is more appropriately termed information operations (IO). These Asian IO features will be teased out through a reading of Sun Tzu, Mao Zedong, and Vo Nguyen Giap. An Asian IO approach will not distinguish wartime and peacetime applications, and neither will it place a premium on liberal democratic ideology as a basis for information superiority.

Clifford, A. H., et al. (2012). Mathematical foundations of information flow: Clifford Lectures Information Flow in Physics, Geometry, and Logic and Computation, March 12-15, 2008, Tulane University, New Orleans, Louisiana. Providence, R.I, American Mathematical Society.

Clifton, R. (2002). "The Subtleties of Entanglement and its Role in Quantum Information Theory." Philosophy of Science 69(S3): S150-S167.
My aim in this paper is a modest one. I do not have any particular thesis to advance about the nature of entanglement, nor can I claim novelty for any of the material I shall discuss. My aim is simply to raise some questions about entanglement that spring naturally from certain developments in quantum information theory and are, I believe, worthy of serious consideration by philosophers of science. The main topics I discuss are different manifestations of quantum nonlocality, entanglement-assisted communication, and entanglement thermodynamics.

Cohen, M. R. and W. T. Newsome (2004). "What electrical microstimulation has revealed about the neural basis of cognition." Curr Opin Neurobiol 14.

Cohen, Y. E. and R. A. Andersen (2002). "A common reference frame for movement plans in the posterior parietal cortex." Nat Rev Neurosci 3.

Cole, C. (1993). "Shannon revisited: Information in terms of uncertainty." Journal of the American Society for Information Science 44(4): 204-211.
Shannon's theory of communication is discussed from the point of view of his concept of uncertainty. It is suggested that there are two information concepts in Shannon, two different uncertainties, and at least two different entropy concepts. Information science focuses on the uncertainty associated with the transmission of the signal rather than the uncertainty associated with the selection of a message from a set of possible messages. The author believes the latter information concept, which is from the sender's point of view, has more to say to information science about what information is than the former, which is from the receiver's point of view and is mainly concerned with “noise” reduction.

Collier, J. (2008). Information in Biological Systems: 763-787.

Cook, D. J. (2015). "Leibniz, China, and the Problem of Pagan Wisdom." PHILOSOPHY EAST & WEST 65(3): 936-947.
Gottfried Wilhelm Leibniz' mature years coincided with the first major European encounter with the thought and civilization of China. From the 1660s onward, many European thinkers and theologians were especially fascinated with this pagan culture. Although Christendom had previously encountered the pagan traditions of Greece and Rome, they had been incorporated into European thought since the origins of Christianity. From the time of the Church fathers, Christendom had developed various strategies to appropriate and reform classical philosophical thought for its own purposes. Leibniz echoes this Christianizing attitude when he says that the doctrine of Plato concerning metaphysics and morality is holy and just. Leibniz sought a path that could recognize the equality of pagan wisdom but at the same time support the unique and supreme nature of Christianity's doctrines and mysteries. He believed that the superior scholarship and methodology developed by the Christian tradition to convert Jews could be used to this end.

Copeland, B. J. (2004). The essential Turing: seminal writings in computing, logic, philosophy, artificial intelligence, and artificial life, plus, the secrets of Enigma. Oxford, Clarendon Press.

Cosmelli, D., et al. (2004). "Waves of consciousness: ongoing cortical patterns during binocular rivalry." Neuroimage 23.

Cover, T. M. (1994). Complexity, Entropy and the Physics of Information.

Crick, F. and C. Koch (2003). "A framework for consciousness." Nat Neurosci 6.

D’Alfonso, S. (2011). "On Quantifying Semantic Information." Information 2(4): 61-101.
The purpose of this paper is to look at some existing methods of semantic information quantification and suggest some alternatives. It begins with an outline of Bar-Hillel and Carnap’s theory of semantic information before going on to look at Floridi’s theory of strongly semantic information. The latter then serves to initiate an in-depth investigation into the idea of utilising the notion of truthlikeness to quantify semantic information. Firstly, a couple of approaches to measure truthlikeness are drawn from the literature and explored, with a focus on their applicability to semantic information quantification. Secondly, a similar but new approach to measure truthlikeness/information is presented and some supplementary points are made.

D’Alfonso, S. (2014). "The Logic of Knowledge and the Flow of Information." MINDS AND MACHINES 24(3): 307-325.
In this paper I look at Fred Dretske's account of information and knowledge as developed in Knowledge and The Flow of Information. In particular, I translate Dretske's probabilistic definition of information to a modal logical framework and subsequently use this to explicate the conception of information and its flow which is central to his account, including the notions of channel conditions and relevant alternatives. Some key products of this task are an analysis of the issue of information closure and an investigation into some of the logical properties of Dretske's account of information flow.

Davies, P. C. W. and N. H. Gregersen (2010). Information and the nature of reality: from physics to metaphysics. Cambridge, Cambridge University Press.

Delancey, C. (2007). "Phenomenal Experience and the Measure of Information." Erkenntnis (1975-) 66(3): 329-352.
This paper defends the hypothesis that phenomenal experiences may be very complex information states. This can explain some of our most perplexing anti-physicalist intuitions about phenomenal experience. The approach is to describe some basic facts about information in such a way as to make clear the essential oversight involved, by way of illustrating how various intuitive arguments against physicalism (such as Frank Jackson's Knowledge Argument and Thomas Nagel's Bat Argument) can be interpreted to show that phenomenal information is not different in kind from physical information, but rather is just more information than we typically attribute to our understanding of a physical theory. I clarify how this hypothesis is distinct from Nagel's claim that the theory of consciousness may be inconceivable, and then in conclusion briefly describe how these results might suggest a positive and conservative physicalist account of phenomenal experience.

Demirel, Y. (2014). "Information in Biological Systems and the Fluctuation Theorem." ENTROPY 16(4): 1931-1948.
Some critical trends in information theory, its role in living systems and utilization in fluctuation theory are discussed. The mutual information of thermodynamic coupling is incorporated into the generalized fluctuation theorem by using information theory and nonequilibrium thermodynamics. Thermodynamically coupled dissipative structures in living systems are capable of degrading more energy, and processing complex information through developmental and environmental constraints. The generalized fluctuation theorem can quantify the hysteresis observed in the amount of the irreversible work in nonequilibrium regimes in the presence of information and thermodynamic coupling.
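The mutual information Demirel incorporates into the fluctuation theorem is the standard Shannon quantity. As a reminder of its definition (this is not code from the paper), a minimal Python sketch for a discrete joint distribution:

```python
import math

def mutual_information(joint):
    """I(X;Y) = sum over x,y of p(x,y) * log2( p(x,y) / (p(x) p(y)) ),
    with the joint distribution given as nested dicts: joint[x][y] = p."""
    px = {x: sum(row.values()) for x, row in joint.items()}
    py = {}
    for row in joint.values():
        for y, p in row.items():
            py[y] = py.get(y, 0.0) + p
    info = 0.0
    for x, row in joint.items():
        for y, p in row.items():
            if p > 0:  # zero-probability cells contribute nothing
                info += p * math.log2(p / (px[x] * py[y]))
    return info

# Perfectly correlated binary variables share 1 bit.
coupled = {0: {0: 0.5, 1: 0.0}, 1: {0: 0.0, 1: 0.5}}
print(mutual_information(coupled))      # 1.0

# Independent variables share no information.
independent = {0: {0: 0.25, 1: 0.25}, 1: {0: 0.25, 1: 0.25}}
print(mutual_information(independent))  # 0.0
```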

Denning, P. J. and T. Bell (2012). "The Information Paradox." American Scientist 100(6): 470.
[...]the concept of information seems fuzzy and abstract to many people, making it hard for them to understand how information systems really work. [...]all the components are physical, and information is always encoded into some sort of signal, which can be transmitted and translated without losing the information it encodes (see Figure 3). Because information is always represented by physical means, it takes time and energy to read, write and transform it. Bayesian inference programs are extensively used in data mining - they can infer complex hypotheses using the evidence in very large data sets.
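The Bayesian inference the authors mention reduces, in its simplest binary form, to Bayes' rule. A minimal illustrative sketch (my example numbers, not the authors'):

```python
def posterior(prior, p_e_given_h, p_e_given_not_h):
    """Bayes' rule: P(H|E) = P(E|H) P(H) / P(E), with the marginal
    P(E) expanded over H and not-H."""
    p_e = p_e_given_h * prior + p_e_given_not_h * (1 - prior)
    return p_e_given_h * prior / p_e

# A rare hypothesis (1% prior) with strong evidence: the posterior
# rises well above the prior but stays far below the likelihood.
print(posterior(0.01, 0.9, 0.05))  # ~0.154
```

The instructive point for data mining is the same one the article makes about evidence: a hypothesis with low prior probability needs a lot of accumulated evidence before the posterior becomes large.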

Deutsch, D. and P. Hayden (2000). "Information flow in entangled quantum systems." Proceedings of the Royal Society of London. Series A. Mathematical, Physical and Engineering Sciences 456(1999): 1759-1774.

Devin, M. (2014). "Musings on Firewalls and the Information Paradox." Galaxies 2(2): 189-198.
The past year has seen an explosion of new and old ideas about black hole physics. Prior to the firewall paper, the dominant picture was the thermofield model apparently implied by anti-de Sitter conformal field theory duality. While some seek a narrow response to Almheiri, Marolf, Polchinski, and Sully (AMPS), there are a number of competing models. One problem in the field is the ambiguity of the competing proposals: some are equivalent, while others are incompatible. This paper will attempt to define and classify a few models representative of the current discussions.

Devlin, K. (1998). "Information flow: the logic of distributed systems by Jon Barwise and Jerry Seligman." Complexity 4(2): 30-32.

Dill, C. (2010). "Voltaire (François-Marie Arouet), Oeuvres de 1738–1740 (III); Writings for Music (1720–1740), ed. Roger J. V. Cotte, Russell Goulbourne, Gillian Pink, Gerhardt Stenger, Raymond Trousson and David Williams. Complete Works of Voltaire 18c. Oxford: Voltaire Foundation, 2008. pp. xxvi + 430, ISBN 978 0 7294 0913 1; Voltaire (François-Marie Arouet), Œuvres de 1742–1745 (I), ed. Olivier Ferret, Russell Goulbourne, Ralph A. Nablow and David Williams. Complete Works of Voltaire 28a. Oxford: Voltaire Foundation, 2006. pp. xxvi + 528, ISBN 978 0 7294 0871 4." Eighteenth-Century Music 7(1): 140-143.

Doyle, L. R., et al. (2011). "Information theory, animal communication, and the search for extraterrestrial intelligence." Acta Astronautica 68(3): 406-417.

Dretske, F. (2008). Epistemology and Information: 29-47.

Dretske, F. I. (1983). "Précis of Knowledge and the Flow of Information." Behavioral and Brain Sciences 6(1): 55-63.

Dretske, F. I. (1999). Knowledge and the Flow of Information, Cambridge University Press.

Elias, P. (1954). "Review: Rudolf Carnap, Yehoshua Bar-Hillel, An Outline of a Theory of Semantic Information." Journal of Symbolic Logic 19(3): 230-232.

Engel, A. K., et al. (2001). "Dynamic predictions: oscillations and synchrony in top-down processing." Nat Rev Neurosci 2.

English, S., et al. (2015). "The Information Value of Non-Genetic Inheritance in Plants and Animals." PLOS ONE 10(1): e0116996.
Parents influence the development of their offspring in many ways beyond the transmission of DNA. This includes transfer of epigenetic states, nutrients, antibodies and hormones, and behavioural interactions after birth. While the evolutionary consequences of such non-genetic inheritance are increasingly well understood, less is known about how inheritance mechanisms evolve. Here, we present a simple but versatile model to explore the adaptive evolution of non-genetic inheritance. Our model is based on a switch mechanism that produces alternative phenotypes in response to different inputs, including genes and non-genetic factors transmitted from parents and the environment experienced during development. This framework shows how genetic and non-genetic inheritance mechanisms and environmental conditions can act as cues by carrying correlational information about future selective conditions. Differential use of these cues is manifested as different degrees of genetic, parental or environmental morph determination. We use this framework to evaluate the conditions favouring non-genetic inheritance, as opposed to genetic determination of phenotype or within-generation plasticity, by applying it to two putative examples of adaptive non-genetic inheritance: maternal effects on seed germination in plants and transgenerational phase shift in desert locusts. Our simulation models show how the adaptive value of non-genetic inheritance depends on its mechanism, the pace of environmental change, and life history characteristics.

Epstein, R., et al. (2009). Parsing the Turing Test: Philosophical and Methodological Issues in the Quest for the Thinking Computer. Dordrecht, Springer Netherlands.

Ercan, I. and N. G. Anderson (2013). "Heat Dissipation in Nanocomputing: Lower Bounds From Physical Information Theory." IEEE Transactions on Nanotechnology 12(6): 1047-1060.

Fetzer, J. H. (2004). "Information: Does it Have To Be True?" MINDS AND MACHINES 14(2): 223-229.
Luciano Floridi (2003) offers a theory of information as a "strongly semantic" notion, according to which information encapsulates truth, thereby making truth a necessary condition for a sentence to qualify as "information". While Floridi provides an impressive development of this position, the aspects of his approach of greatest philosophical significance are its foundations rather than its formalization. He rejects the conception of information as meaningful data, which entails at least three theses that appear to be defensible: that information can be false; that tautologies are information; and that "It is true that ..." is non-redundant. This inquiry offers various logical, epistemic, and ordinary-language grounds to demonstrate that an account of his kind is too narrow to be true and that its adoption would hopelessly obscure crucial differences between information, misinformation, and disinformation.

Floridi, L. (2003). "Two Approaches to the Philosophy of Information." MINDS AND MACHINES 13(4): 459-469.

Floridi, L. (2004). "Open Problems in the Philosophy of Information." Metaphilosophy 35(4): 554-582.
The philosophy of information (PI) is a new area of research with its own field of investigation and methodology. This article, based on the Herbert A. Simon Lecture of Computing and Philosophy I gave at Carnegie Mellon University in 2001, analyses the eighteen principal open problems in PI. Section 1 introduces the analysis by outlining Herbert Simon's approach to PI. Section 2 discusses some methodological considerations about what counts as a good philosophical problem. The discussion centers on Hilbert's famous analysis of the central problems in mathematics. The rest of the article is devoted to the eighteen problems. These are organized into five sections: problems in the analysis of the concept of information, in semantics, in the study of intelligence, in the relation between information and nature, and in the investigation of values.

Floridi, L. (2004). "Outline of a Theory of Strongly Semantic Information." MINDS AND MACHINES 14(2): 197-221.
This paper outlines a quantitative theory of strongly semantic information (TSSI) based on truth-values rather than probability distributions. The main hypothesis supported in the paper is that the classic quantitative theory of weakly semantic information (TWSI), based on probability distributions, assumes that truth-values supervene on factual semantic information, yet this principle is too weak and generates a well-known semantic paradox, whereas TSSI, according to which factual semantic information encapsulates truth, can avoid the paradox and is more in line with the standard conception of what generally counts as semantic information. After a brief introduction, section two outlines the semantic paradox implied by TWSI, analysing it in terms of an initial conflict between two requisites of a quantitative theory of semantic information. In section three, three criteria of semantic information equivalence are used to provide a taxonomy of quantitative approaches to semantic information and introduce TSSI. In section four, some further desiderata that should be fulfilled by a quantitative TSSI are explained. From section five to section seven, TSSI is developed on the basis of a calculus of truth-values and semantic discrepancy with respect to a given situation. In section eight, it is shown how TSSI succeeds in solving the paradox. Section nine summarises the main results of the paper and indicates some future developments.
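For orientation only (my paraphrase of the calculus, not the paper's notation verbatim): Floridi defines a degree of informativeness from a degree of semantic discrepancy ϑ ∈ [−1, 1], with negative values for vacuous truths (tautology at −1) and positive values for inaccuracies (contradiction at 1). Assuming that reading, the measure is a one-liner:

```python
def informativeness(theta):
    """Degree of informativeness in TSSI (as I read Floridi 2004):
    iota = 1 - theta**2, where theta in [-1, 1] is the degree of
    semantic discrepancy from the actual situation."""
    return 1 - theta ** 2

print(informativeness(0.0))   # 1.0: fully accurate and precise
print(informativeness(1.0))   # 0.0: a contradiction carries none
print(informativeness(-1.0))  # 0.0: nor does a tautology
```

The point of the quadratic form is that informativeness falls off symmetrically as a statement drifts toward either vacuity or falsehood.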

Floridi, L. (2005). "Is Semantic Information Meaningful Data?" Philosophy and Phenomenological Research 70(2): 351-370.
There is no consensus yet on the definition of semantic information. This paper contributes to the current debate by criticising and revising the Standard Definition of semantic Information (SDI) as meaningful data, in favour of the Dretske-Grice approach: meaningful and well-formed data constitute semantic information only if they also qualify as contingently truthful. After a brief introduction, SDI is criticised for providing necessary but insufficient conditions for the definition of semantic information. SDI is incorrect because truth-values do not supervene on semantic information, and misinformation (that is, false semantic information) is not a type of semantic information, but pseudo-information, that is not semantic information at all. This is shown by arguing that none of the reasons for interpreting misinformation as a type of semantic information is convincing, whilst there are compelling reasons to treat it as pseudo-information. As a consequence, SDI is revised to include a necessary truth-condition. The last section summarises the main results of the paper and indicates some interesting areas of application of the revised definition.

Floridi, L. (2005). Semantic Conceptions of Information. The Stanford Encyclopedia of Philosophy, Stanford University CSLI.

Floridi, L. (2008). "Artificial intelligence's new frontier: Artificial companions and the fourth revolution." Metaphilosophy 39(4-5): 651-655.
In this article I argue that the best way to understand the information turn is in terms of a fourth revolution in the long process of reassessing humanity's fundamental nature and role in the universe. We are not immobile, at the centre of the universe (Copernicus); we are not unnaturally distinct and different from the rest of the animal world (Darwin); and we are far from being entirely transparent to ourselves (Freud). We are now slowly accepting the idea that we might be informational organisms among many agents (Turing), inforgs not so dramatically different from clever, engineered artefacts, but sharing with them a global environment that is ultimately made of information, the infosphere.

Floridi, L. (2008). "A Defence of Informational Structural Realism." SYNTHESE 161(2): 219-253.
This is the revised version of an invited keynote lecture delivered at the 1st Australian Computing and Philosophy Conference (CAP@AU; the Australian National University in Canberra, 31 October–2 November, 2003). The paper is divided into two parts. The first part defends an informational approach to structural realism. It does so in three steps. First, it is shown that, within the debate about structural realism (SR), epistemic (ESR) and ontic (OSR) structural realism are reconcilable. It follows that a version of OSR is defensible from a structuralist-friendly position. Second, it is argued that a version of OSR is also plausible, because not all relata (structured entities) are logically prior to relations (structures). Third, it is shown that a version of OSR is also applicable to both sub-observable (unobservable and instrumentally-only observable) and observable entities, by developing its ontology of structural objects in terms of informational objects. The outcome is informational structural realism, a version of OSR supporting the ontological commitment to a view of the world as the totality of informational objects dynamically interacting with each other. The paper has been discussed by several colleagues and, in the second half, ten objections that have been moved to the proposal are answered in order to clarify it further.

Floridi, L. (2008). Trends in the Philosophy of Information: 113-131.

Floridi, L. (2009). A Distributed Model of Truth for Semantic Information.

Floridi, L. (2009). "The Information Society and Its Philosophy: Introduction to the Special Issue on "The Philosophy of Information, Its Nature, and Future Developments"." The Information Society 25(3): 153-158.
The article introduces the special issue dedicated to "The Philosophy of Information, Its Nature, and Future Developments." It outlines the origins of the information society and then briefly discusses the definition of the philosophy of information, the possibility of reconciling nature and technology, the informational turn as a fourth revolution (after Copernicus, Darwin, and Freud), and the metaphysics of the infosphere.

Floridi, L. (2010). "Information, possible worlds and the cooptation of scepticism." SYNTHESE 175(S1): 63-88.
The article investigates the sceptical challenge from an information-theoretic perspective. Its main goal is to articulate and defend the view that either informational scepticism is radical, but then it is epistemologically innocuous because redundant; or it is moderate, but then epistemologically beneficial because useful. In order to pursue this cooptation strategy, the article is divided into seven sections. Section 1 sets up the problem. Section 2 introduces Borel numbers as a convenient way to refer uniformly to (the data that individuate) different possible worlds. Section 3 adopts the Hamming distance between Borel numbers as a metric to calculate the distance between possible worlds. In Sects. 4 and 5, radical and moderate informational scepticism are analysed using Borel numbers and Hamming distances, and shown to be either harmless (extreme form) or actually fruitful (moderate form). Section 6 further clarifies the approach by replying to some potential objections. In the conclusion, the Peircean nature of the overall approach is briefly discussed.
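The metric in Sects. 3–5 of this paper is easy to make concrete. A minimal sketch, assuming (as an illustration only, not Floridi's own notation) that each possible world is individuated by a fixed-length string of digits, its Borel number:

```python
# Hedged sketch: possible worlds as equal-length digit strings ("Borel
# numbers"), with the Hamming distance counting the data on which two
# worlds disagree. The encoding is an illustrative assumption.

def hamming_distance(world_a: str, world_b: str) -> int:
    """Number of positions at which two equal-length digit strings differ."""
    if len(world_a) != len(world_b):
        raise ValueError("worlds must be individuated by equally many data")
    return sum(a != b for a, b in zip(world_a, world_b))

actual = "101101"
nearby = "101100"   # differs from the actual world in a single datum
radical = "010010"  # differs from the actual world in every datum

print(hamming_distance(actual, nearby))   # 1
print(hamming_distance(actual, radical))  # 6
```

On this picture, moderate scepticism invokes worlds at small Hamming distance from ours, while radical scepticism needs maximally distant worlds, which is what the paper's two-pronged treatment exploits.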

Floridi, L. (2010). Information: a very short introduction. Oxford;New York;, Oxford University Press.

Floridi, L. (2010). "The Philosophy of Information as a Conceptual Framework." 1-29.

Floridi, L. (2011). "The Informational Nature of Personal Identity." MINDS AND MACHINES 21(4): 549-566.
In this paper, I present an informational approach to the nature of personal identity. In “Plato and the problem of the chariot”, I use Plato’s famous metaphor of the chariot to introduce a specific problem regarding the nature of the self as an informational multiagent system: what keeps the self together as a whole and coherent unity? In “Egology and its two branches” and “Egology as synchronic individualisation”, I outline two branches of the theory of the self: one concerning the individualisation of the self as an entity, the other concerning the identification of such entity. I argue that both presuppose an informational approach, defend the view that the individualisation of the self is logically prior to its identification, and suggest that such individualisation can be provided in informational terms. Hence, in “A reconciling hypothesis: the three membranes model”, I offer an informational individualisation of the self, based on a tripartite model, which can help to solve the problem of the chariot. Once this model of the self is outlined, in “ICTs as technologies of the self” I use it to show how ICTs may be interpreted as technologies of the self. In “The logic of realisation”, I introduce the concept of “realization” (Aristotle’s anagnorisis) and support the rather Spinozian view according to which, from the perspective of informational structural realism, selves are the final stage in the development of informational structures. The final “Conclusion: from the egology to the ecology of the self” briefly concludes the article with a reference to the purposeful shaping of the self, in a shift from egology to ecology.

Floridi, L. (2011). "Semantic Information and the Correctness Theory of Truth." Erkenntnis (1975-) 74(2): 147-175.
Semantic information is usually supposed to satisfy the veridicality thesis: p qualifies as semantic information only if p is true. However, what it means for semantic information to be true is often left implicit, with correspondentist interpretations representing the most popular, default option. The article develops an alternative approach, namely a correctness theory of truth (CTT) for semantic information. This is meant as a contribution not only to the philosophy of information but also to the philosophical debate on the nature of truth. After the introduction, in Sect. 2, semantic information is shown to be translatable into propositional semantic information (i). In Sect. 3, i is polarised into a query (Q) and a result (R), qualified by a specific context, a level of abstraction and a purpose. This polarisation is normalised in Sect. 4, where [Q + R] is transformed into a Boolean question and its relative yes/no answer [Q + A]. This completes the reduction of the truth of i to the correctness of A. In Sects. 5 and 6, it is argued that (1) A is the correct answer to Q if and only if (2) A correctly saturates Q by verifying and validating it (in the computer science's sense of "verification" and "validation"); that (2) is the case if and only if (3) [Q + A] generates an adequate model (m) of the relevant system (s) identified by Q; that (3) is the case if and only if (4) m is a proxy of s (in the computer science's sense of "proxy") and (5) proximal access to m commutes with the distal access to s (in the category theory's sense of "commutation"); and that (5) is the case if and only if (6) reading/writing (accessing, in the computer science's technical sense of the term) m enables one to read/write (access) s. Sect. 7 provides some further clarifications about CTT, in the light of semantic paradoxes. Section 8 draws a general conclusion about the nature of CTT as a theory for systems designers, not just systems users. In the course of the article all technical expressions from computer science are explained.

Floridi, L. (2013). "Information Quality." Philosophy & Technology 26(1): 1-6.

Floridi, L. (2014). "Information closure and the sceptical objection." SYNTHESE 191(6): 1037-1050.
In this article, I define and then defend the principle of information closure (pic) against a sceptical objection similar to the one discussed by Dretske in relation to the principle of epistemic closure. If I am successful, given that pic is equivalent to the axiom of distribution and that the latter is one of the conditions that discriminate between normal and non-normal modal logics, a main result of such a defence is that one potentially good reason to look for a formalization of the logic of "S is informed that p" among the non-normal modal logics, which reject the axiom, is also removed. This is not to argue that the logic of "S is informed that p" should be a normal modal logic, but that it could still be, insofar as the objection that it could not be, based on the sceptical objection against pic, has been removed. In other words, I shall argue that the sceptical objection against pic fails, so such an objection provides no ground to abandon the normal modal logic B (also known as KTB) as a formalization of "S is informed that p", which remains plausible insofar as this specific obstacle is concerned.

Floridi, L. (2014). "The Latent Nature of Global Information Warfare." 27(3): 317-319.
Issue Title: Trends in the History and Philosophy of Computing

Fresco, N. and M. Michael (2016). "Information and Veridicality: Information Processing and the Bar-Hillel/Carnap Paradox." Philosophy of Science 83(1): 131-151.
Floridi's Theory of Strongly Semantic Information posits the Veridicality Thesis (i.e., information is true). One motivation is that it can serve as a foundation for information-based epistemology, being an alternative to the tripartite theory of knowledge. However, the Veridicality Thesis is false, if 'information' is to play an explanatory role in human cognition. Another motivation is avoiding the so-called Bar-Hillel/Carnap paradox (i.e., any contradiction is maximally informative). But this paradox only seems paradoxical if (a) 'information' and 'informativeness' are synonymous, (b) logic is a theory of inference, or (c) validity suffices for rational inference; (a), (b), and (c) are false. [web URL: http://www.jstor.org/stable/10.1086/684165?seq=1#page_scan_tab_contents]

Freund, J. and J. Jones (2015). Measuring and managing information risk: a FAIR approach. Oxford, U.K, Butterworth-Heinemann.

Frieden, B. R. (2007). "Information-based uncertainty for a photon." Optics Communications 271(1): 71-72.
It is shown on the basis of Fisher information that the ultimate root-mean-square uncertainty in the position of a single photon of wavelength λ is 0.112λ in vacuum. This is as well an “effective size” for the photon.

Friston, K. J. (2001). "Brain function, nonlinear coupling, and neuronal transients." Neuroscientist 7.

Frolov, V. P. (2014). "Information loss problem and a ‘black hole’ model with a closed apparent horizon." Journal of High Energy Physics 2014(5): 1-21.

Fu, T., et al. (2012). "Sentimental Spidering: Leveraging Opinion Information in Focused Crawlers." ACM Transactions on Information Systems (TOIS) 30(4): 1-30.
Despite the increased prevalence of sentiment-related information on the Web, there has been limited work on focused crawlers capable of effectively collecting not only topic-relevant but also sentiment-relevant content. In this article, we propose a novel focused crawler that incorporates topic and sentiment information as well as a graph-based tunneling mechanism for enhanced collection of opinion-rich Web content regarding a particular topic. The graph-based sentiment (GBS) crawler uses a text classifier that employs both topic and sentiment categorization modules to assess the relevance of candidate pages. This information is also used to label nodes in web graphs that are employed by the tunneling mechanism to improve collection recall. Experimental results on two test beds revealed that GBS was able to provide better precision and recall than seven comparison crawlers. Moreover, GBS was able to collect a large proportion of the relevant content after traversing far fewer pages than comparison methods. GBS outperformed comparison methods on various categories of Web pages in the test beds, including collection of blogs, Web forums, and social networking Web site content. Further analysis revealed that both the sentiment classification module and graph-based tunneling mechanism played an integral role in the overall effectiveness of the GBS crawler.

Fyffe, R. (2015). "The Value of Information: Normativity, Epistemology, and LIS in Luciano Floridi." PORTAL-LIBRARIES AND THE ACADEMY 15(2): 267-286.
This paper is a critical reconstruction of Luciano Floridi's view of librarianship as "stewardship of a semantic environment," a view that is at odds with the dominant tradition in which library and information science (LIS) is understood as social epistemology. Floridi's work helps to explain the normative dimensions of librarianship in ways that epistemology does not, and his Philosophy of Information frames librarians' traditional stewardship role in terms appropriate for our growing involvement in the management and preservation of information through its entire life cycle. Floridi's work also helps illuminate what is coming to be called "knowledge as a commons." Librarianship is concerned with maintaining and enhancing information environments over time, environments that include the behavior of the people who create and use them. The integrity of these environments makes possible the epistemic projects of faculty, students, and other researchers, but librarianship is not, itself, epistemological. Floridi's ecological reframing of philosophy of information and information ethics, bridging the dichotomy between information and user, has a variety of implications for information literacy education and other academic library services in higher education.

Gaber, T., et al. (2016). The 1st International Conference on Advanced Intelligent System and Informatics (AISI2015), November 28-30, 2015, Beni Suef, Egypt. Cham, Springer International Publishing.

Galas, D. J., et al. (2010). "Biological Information as Set-Based Complexity." IEEE Transactions on Information Theory 56(2): 667-677.
The significant and meaningful fraction of all the potential information residing in the molecules and structures of living systems is unknown. Sets of random molecular sequences or identically repeated sequences, for example, would be expected to contribute little or no useful information to a cell. This issue of quantitation of information is important since the ebb and flow of biologically significant information is essential to our quantitative understanding of biological function and evolution. Motivated specifically by these problems of biological information, a class of measures is proposed to quantify the contextual nature of the information in sets of objects, based on Kolmogorov's intrinsic complexity. Such measures discount both random and redundant information and are inherent in that they do not require a defined state space to quantify the information. The maximization of this new measure, which can be formulated in terms of the universal information distance, appears to have several useful and interesting properties, some of which we illustrate with examples.

Gao, L., et al. (2014). "Quantifying Information Flow During Emergencies." SCIENTIFIC REPORTS 4: 3997.
Recent advances on human dynamics have focused on the normal patterns of human activities, with the quantitative understanding of human behavior under extreme events remaining a crucial missing chapter. This has a wide array of potential applications, ranging from emergency response and detection to traffic control and management. Previous studies have shown that human communications are both temporally and spatially localized following the onset of emergencies, indicating that social propagation is a primary means to propagate situational awareness. We study real anomalous events using country-wide mobile phone data, finding that information flow during emergencies is dominated by repeated communications. We further demonstrate that the observed communication patterns cannot be explained by inherent reciprocity in social networks, and are universal across different demographics.

Gatenby, R. A. and B. R. Frieden (2007). "Information Theory in Living Systems, Methods, Applications, and Challenges." Bulletin of mathematical biology 69(2): 635-657.

Gillies, D. (2010). "Informational Realism and World 3." Knowledge, Technology & Policy: 1-18.

Godfrey-Smith, P. (2000). "Information, Arbitrariness, and Selection: Comments on Maynard Smith." Philosophy of Science 67(2): 202-207.

Godfrey-Smith, P. (2007). Information in Biology. Cambridge, Cambridge University Press: 103-119.
The concept of information has acquired a strikingly prominent role in contemporary biology. This trend is especially marked within genetics, but it has also become important in other areas, such as evolutionary theory and developmental biology, especially where these fields border on genetics. The most distinctive biological role for informational concepts, and the one that has generated the most discussion, is in the description of the relations between genes and the various structures and processes that genes play a role in causing. For many biologists, the causal role of genes should be understood in terms of their carrying information about their various products. That information might require the cooperation of various environmental factors before it can be "expressed," but the same can be said of other kinds of message. An initial response might be to think that this mode of description is entirely anchored in a set of well-established facts about the role of DNA and RNA within protein synthesis, summarized in the familiar chart representing the "genetic code," mapping DNA base triplets to amino acids. However, informational enthusiasm in biology predates even a rudimentary understanding of these mechanisms (Schrödinger 1944). And more importantly, current applications of informational concepts extend far beyond anything that can receive an obvious justification in terms of the familiar facts about the specification of protein molecules by DNA.

Gray, R. M. (2011). Distortion and Information. Entropy and Information Theory. Boston, MA, Springer US: 237-263.

Greco, J. (2015). Testimonial Knowledge and the Flow of Information. Oxford, Oxford University Press.
This chapter reviews a number of related problems in the epistemology of testimony, and suggests some dilemmas for any theory of knowledge that tries to solve them. Here a common theme emerges: It can seem that any theory must make testimonial knowledge either too hard or too easy, and that therefore no adequate account of testimonial knowledge is possible. The chapter then puts forward a proposal for making progress. Specifically, an important function of the concept of knowledge is to govern the acquisition and distribution of quality information within an epistemic community. Testimonial exchanges paradigmatically serve in the distribution role, but sometimes serve in the acquisition role. The resulting position, it is argued, explains why testimonial knowledge is sometimes easy to get, and sometimes much harder.

Griffiths, P. E. (2001). "Genetic Information: A Metaphor in Search of a Theory." Philosophy of Science 68(3): 394-412.

Grünwald, P. D. and P. M. B. Vitányi (2003). "Kolmogorov Complexity and Information Theory: With an Interpretation in Terms of Questions and Answers." Journal of Logic, Language and Information 12(4): 497-529.
We compare the elementary theories of Shannon information and Kolmogorov complexity, the extent to which they have a common purpose, and where they are fundamentally different. We discuss and relate the basic notions of both theories: Shannon entropy, Kolmogorov complexity, Shannon mutual information and Kolmogorov ("algorithmic") mutual information. We explain how universal coding may be viewed as a middle ground between the two theories. We consider Shannon's rate distortion theory, which quantifies useful (in a certain sense) information. We use the communication of information as our guiding motif, and we explain how it relates to sequential question-answer sessions.
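The contrast the paper develops can be felt empirically. As a rough sketch of my own (using zlib's compressed length as a crude upper-bound stand-in for Kolmogorov complexity, which the paper itself does not do): a periodic string and a random string over the same alphabet have identical letter frequencies, so identical empirical Shannon entropy, yet very different shortest descriptions.

```python
import math
import random
import zlib
from collections import Counter

def shannon_entropy_bits(s: str) -> float:
    # Empirical per-symbol entropy: order-blind, sees only letter frequencies.
    counts = Counter(s)
    n = len(s)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def compressed_len(s: str) -> int:
    # zlib output length: a crude proxy for Kolmogorov complexity.
    return len(zlib.compress(s.encode()))

random.seed(1)
periodic = "ab" * 500
scrambled = "".join(random.choice("ab") for _ in range(1000))

# Same letter frequencies, so (near-)identical empirical entropy per symbol...
print(shannon_entropy_bits(periodic), shannon_entropy_bits(scrambled))
# ...but the periodic string has a far shorter description.
print(compressed_len(periodic), compressed_len(scrambled))
```

The entropy measure is a property of the statistical ensemble; the complexity measure is a property of the individual sequence, which is exactly the divide the paper maps out.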

Grünwald, P. D. and P. M. B. Vitányi (2008). Algorithmic Information Theory: 281-317.

Hackett, T. A. (2011). "Information flow in the auditory cortical network." Hearing Research 271(1): 133-146.
Auditory processing in the cerebral cortex is comprised of an interconnected network of auditory and auditory-related areas distributed throughout the forebrain. The nexus of auditory activity is located in temporal cortex among several specialized areas, or fields, that receive dense inputs from the medial geniculate complex. These areas are collectively referred to as auditory cortex. Auditory activity is extended beyond auditory cortex via connections with auditory-related areas elsewhere in the cortex. Within this network, information flows between areas to and from countless targets, but in a manner that is characterized by orderly regional, areal and laminar patterns. These patterns reflect some of the structural constraints that passively govern the flow of information at all levels of the network. In addition, the exchange of information within these circuits is dynamically regulated by intrinsic neurochemical properties of projecting neurons and their targets. This article begins with an overview of the principal circuits and how each is related to information flow along major axes of the network. The discussion then turns to a description of neurochemical gradients along these axes, highlighting recent work on glutamate transporters in the thalamocortical projections to auditory cortex. The article concludes with a brief discussion of relevant neurophysiological findings as they relate to structural gradients in the network.

Hale, J., et al. (2014). "Better together? The cognitive advantages of synaesthesia for time, numbers, and space." Cognitive Neuropsychology 31(7-8): 545-564.
Synaesthesia for time, numbers, and space (TNS synaesthesia) is thought to have costs and benefits for recalling and manipulating time and number. There are two competing theories about how TNS synaesthesia affects cognition. The "magnitude" account predicts that TNS synaesthesia may affect cardinal magnitude judgements, whereas the "sequence" account suggests that it may affect ordinal sequence judgements and could rely on visuospatial working memory. We aimed to comprehensively assess the cognitive consequences of TNS synaesthesia and distinguish between these two accounts. TNS synaesthetes, grapheme-colour synaesthetes, and nonsynaesthetes completed a behavioural task battery. Three tasks involved cardinal and ordinal comparisons of temporal, numerical, and spatial stimuli; we also examined visuospatial working memory. TNS synaesthetes were significantly more accurate than nonsynaesthetes in making ordinal judgements about space. This difference was explained by significantly higher visuospatial working memory accuracy. Our findings demonstrate an advantage of TNS synaesthesia that is more in line with the sequence account.

Harms, W. F. (1998). "The Use of Information Theory in Epistemology." Philosophy of Science 65(3): 472-501.
Information theory offers a measure of "mutual information" which provides an appropriate measure of tracking efficiency for the naturalistic epistemologist. The statistical entropy on which it is based is arguably the best way of characterizing the uncertainty associated with the behavior of a system, and it is ontologically neutral. Though not appropriate for the naturalization of meaning, mutual information can serve as a measure of epistemic success independent of semantic maps and payoff structures. While not containing payoffs as terms, mutual information places both upper and lower bounds on payoffs. This constitutes a non-trivial relationship to utility.
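The "tracking efficiency" measure Harms has in mind is ordinary mutual information, I(X;Y) = Σ p(x,y) log₂[p(x,y)/(p(x)p(y))]. A minimal sketch of my own, with an invented 2×2 joint distribution of world state versus belief state (not Harms's example):

```python
import math

def mutual_information_bits(joint):
    # I(X;Y) = sum over cells of p(x,y) * log2( p(x,y) / (p(x) * p(y)) ).
    px = [sum(row) for row in joint]            # marginal over world states
    py = [sum(col) for col in zip(*joint)]      # marginal over belief states
    mi = 0.0
    for i, row in enumerate(joint):
        for j, p in enumerate(row):
            if p > 0:
                mi += p * math.log2(p / (px[i] * py[j]))
    return mi

# Rows: world state; columns: belief state. A believer that tracks the
# world well concentrates probability mass on the diagonal.
good_tracker = [[0.45, 0.05],
                [0.05, 0.45]]
coin_flipper = [[0.25, 0.25],
                [0.25, 0.25]]

print(mutual_information_bits(good_tracker))  # well above zero
print(mutual_information_bits(coin_flipper))  # zero: beliefs track nothing
```

Note that the measure says nothing about what the beliefs mean or what they are worth, which is just Harms's point: it quantifies tracking independently of semantic maps and payoff structures.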

Harms, W. F. (2004). Information and meaning in evolutionary processes. New York; Cambridge, U.K, Cambridge University Press.

Harms, W. F. (2006). "What Is Information? Three Concepts." Biological Theory 1(3): 230-242.
The concept of information tempts us as a theoretical primitive, partly because of the respectability lent to it by highly successful applications of Shannon’s information theory, partly because of its broad range of applicability in various domains, partly because of its neutrality with respect to what basic sorts of things there are. This versatility, however, is the very reason why information cannot be the theoretical primitive we might like it to be. “Information,” as it is variously used, is systematically ambiguous between whether it involves continuous or discrete quantities, causal or noncausal relationships, and intrinsic or relational properties. Many uses can be firmly grounded in existing theory, however. Continuous quantities of information involving probabilities can be related to information theory proper. Information defined relative to systems of rules or conventions can be understood relative to the theory of meaning (semantics). A number of causal notions may possibly be located relative to standard notions in physics. Precise specification of these distinct properties involved in the common notion of information can allow us to map the relationships between them. Consequently, while information is not in itself the kind of single thing that can play a significant unifying role, analyzing its ambiguities may facilitate headway toward that goal.

Harwood, C. J. (1999). "Philosophical Aspects of Information Systems" [review of R. Winder, S. K. Probert and I. A. Beeson (eds.), Philosophical Aspects of Information Systems, Taylor & Francis]. Kybernetes 28(1): 109-110.

Haught, J. F. (2010). Information, theology, and the universe. Information and the Nature of Reality: From Physics to Metaphysics. P. Davies and N. H. Gregersen, Cambridge University Press: 301-318.

Hawking, S. W. (2005). "Information loss in black holes." Physical Review D - Particles, Fields, Gravitation and Cosmology 72(8).
The question of whether information is lost in black holes is investigated using Euclidean path integrals. The formation and evaporation of black holes is regarded as a scattering problem with all measurements being made at infinity. This seems to be well formulated only in asymptotically AdS spacetimes. The path integral over metrics with trivial topology is unitary and information preserving. On the other hand, the path integral over metrics with nontrivial topologies leads to correlation functions that decay to zero. Thus at late times only the unitary information preserving path integrals over trivial topologies will contribute. Elementary quantum gravity interactions do not lose information or quantum coherence.

Hawking, S. W. (2014). "Information Preservation and Weather Forecasting for Black Holes." arXiv:1401.5761(arXiv:1401.5761 [hep-th]).
It has been suggested [1] that the resolution of the information paradox for evaporating black holes is that the holes are surrounded by firewalls, bolts of outgoing radiation that would destroy any infalling observer. Such firewalls would break the CPT invariance of quantum gravity and seem to be ruled out on other grounds. A different resolution of the paradox is proposed, namely that gravitational collapse produces apparent horizons but no event horizons behind which information is lost. This proposal is supported by ADS-CFT and is the only resolution of the paradox compatible with CPT. The collapse to form a black hole will in general be chaotic and the dual CFT on the boundary of ADS will be turbulent. Thus, like weather forecasting on Earth, information will effectively be lost, although there would be no loss of unitarity.

Hayashi, M. (2017). Quantum Information Theory: Mathematical Foundation. Berlin, Heidelberg, Springer Berlin Heidelberg.

Hodgson, G. M. and T. Knudsen (2008). "Information, complexity and generative replication." Biology & Philosophy 23(1): 47-65.
The established definition of replication in terms of the conditions of causality, similarity and information transfer is very broad. We draw inspiration from the literature on self-reproducing automata to strengthen the notion of information transfer in replication processes. To the triple conditions of causality, similarity and information transfer, we add a fourth condition that defines a "generative replicator" as a conditional generative mechanism, which can turn input signals from an environment into developmental instructions. Generative replication must have the potential to enhance complexity, which in turn requires that developmental instructions are part of the information that is transmitted in replication. Demonstrating the usefulness of the generative replicator concept in the social domain, we identify social generative replicators that satisfy all of the four proposed conditions.

Holevo, A. S. (2012). Quantum Systems, Channels, Information: A Mathematical Introduction. Berlin ;Boston, De Gruyter.

Horowitz, J. M. and M. Esposito (2014). "Thermodynamics with Continuous Information Flow." PHYSICAL REVIEW X 4(3): 031015.
We provide a unified thermodynamic formalism describing information transfers in autonomous as well as nonautonomous systems described by stochastic thermodynamics. We demonstrate how information is continuously generated in an auxiliary system and then transferred to a relevant system that can utilize it to fuel otherwise impossible processes. Indeed, while the joint system satisfies the second law, the entropy balance for the relevant system is modified by an information term related to the mutual information rate between the two systems. We show that many important results previously derived for nonautonomous Maxwell demons can be recovered from our formalism and use a cycle decomposition to analyze the continuous information flow in autonomous systems operating at a steady state. A model system is used to illustrate our findings.

Ibekwe-SanJuan, F. and T. M. Dousa (2013). Theories of Information, Communication and Knowledge: A Multidisciplinary Approach. Dordrecht, Springer Netherlands.
This book addresses some of the key questions that scientists have been asking themselves for centuries: what is knowledge? What is information? How do we know that we know something? How do we construct meaning from the perceptions of things? Although no consensus exists on a common definition of the concepts of information and communication, few can reject the hypothesis that information - whether perceived as "object" or as "process" - is a pre-condition for knowledge. Epistemology is the study of how we know things (anglophone meaning) or the study of how scientific knowledge is arrived at and validated (francophone conception). To adopt an epistemological stance is to commit oneself to render an account of what constitutes knowledge or, in procedural terms, to render an account of when one can claim to know something. An epistemological theory imposes constraints on the interpretation of human cognitive interaction with the world. It goes without saying that different epistemological theories will have more or less restrictive criteria to distinguish what constitutes knowledge from what is not. If information is a pre-condition for knowledge acquisition, giving an account of how knowledge is acquired should impact our comprehension of information and communication as concepts. While a lot has been written on the definition of these concepts, less research has attempted to establish explicit links between differing theoretical conceptions of these concepts and the underlying epistemological stances. This is what this volume attempts to do. It offers a multidisciplinary exploration of information and communication as perceived in different disciplines and how those perceptions affect theories of knowledge.

Jablonka, E. (2002). "Information: Its Interpretation, Its Inheritance, and Its Sharing." Philosophy of Science 69(4): 578-605.
The semantic concept of information is one of the most important, and one of the most problematical concepts in biology. I suggest a broad definition of biological information: a source becomes an informational input when an interpreting receiver can react to the form of the source (and variations in this form) in a functional manner. The definition accommodates information stemming from environmental cues as well as from evolved signals, and calls for a comparison between information-transmission in different types of inheritance systems - the genetic, the epigenetic, the behavioral, and the cultural-symbolic. This comparative perspective highlights the different ways in which information is acquired and transmitted, and the role that such information plays in heredity and evolution. Focusing on the special properties of the transfer of information, which are very different from those associated with the transfer of materials or energy, also helps to uncover interesting evolutionary effects and suggests better explanations for some aspects of the evolution of communication.

Kåhre, J. (2002). The Mathematical Theory of Information. New York, Springer US.

Kamp, H. and M. Stokhof (2008). Information in Natural Language. Philosophy of Information (Handbook of the Philosophy of Science). P. Adriaans and J. van Benthem, Elsevier: 49-111.

K'Ang, H. and R. G. Henricks (1983). Philosophy and Argumentation in Third-Century China: The Essays of Hsi K'ang. Princeton, N.J, Princeton University Press.

Kassir, A. (2014). Communication Efficiency in Information Gathering through Dynamic Information Flow.
This thesis addresses the problem of how to improve the performance of multi-robot information gathering tasks by actively controlling the rate of communication between robots. Examples of such tasks include cooperative tracking and cooperative environmental monitoring. Communication is essential in such systems for both decentralised data fusion and decision making, but wireless networks impose capacity constraints that are frequently overlooked. While existing research has focussed on improving available communication throughput, the aim in this thesis is to develop algorithms that make more efficient use of the available communication capacity. Since information may be shared at various levels of abstraction, another challenge is the decision of where information should be processed based on limits of the computational resources available. Therefore, the flow of information needs to be controlled based on the trade-off between communication limits, computation limits and information value. In this thesis, we approach the trade-off by introducing the dynamic information flow (DIF) problem. We suggest variants of DIF that either consider data fusion communication independently or both data fusion and decision making communication simultaneously. For the data fusion case, we propose efficient decentralised solutions that dynamically adjust the flow of information. For the decision making case, we present an algorithm for communication efficiency based on local LQ approximations of information gathering problems. The algorithm is then integrated with our solution for the data fusion case to produce a complete communication efficiency solution for information gathering. We analyse our suggested algorithms and present important performance guarantees. The algorithms are validated in a custom-designed decentralised simulation framework and through field-robotic experimental demonstrations.

Kawahigashi, Y., et al. (2016). "Introduction to Special Issue: Operator Algebras and Quantum Information Theory." Journal of Mathematical Physics 57(1): 15101.

Knight, N. (1999). "The Dilemma of Determinism: Qu Qiubai and the Origins of Marxist Philosophy in China." China Information 13(4): 1-26.

Knight, N. (2005). Marxist Philosophy in China: From Qu Qiubai to Mao Zedong, 1923–1945. Dordrecht, Springer Netherlands.

Knuth, K. H. (2015). Information-based physics and the influence network. It From Bit or Bit From It?, Springer: 65-78.

Kohlas, J. and C. Schneuwly (2009). Information algebra.

Kullback, S. (1959). Information theory and statistics. New York, Wiley.

Kullback, S. (1968). Information theory and statistics. New York, Dover Publications.

Kullback, S. and R. A. Leibler (1951). "On Information and Sufficiency." The Annals of Mathematical Statistics 22(1): 79-86.

Kun, W. and J. E. Brenner (2015). "An Informational Ontology and Epistemology of Cognition." Foundations of Science 20(3): 249-279.
Despite recent major advances in the neuroscience underlying cognition, the processes of its emergence and evolution are far from being understood. In our view, current interrelated concepts of mind, knowledge, epistemology, perception, cognition and information fail to reflect the real dynamics of mental processes, their ontology and their logic. It has become routine to talk about information in relation to these processes, but there is no consensus about its most relevant qualitative and functional properties. We present a theory of human cognition based on an ontology and epistemology of information and information processes originally proposed by Wu, including (1) an ontological doctrine of the different grades of information; and (2) an informational epistemology based on a noegenesis of the doctrine of informational intermediaries that mediate between the cognitive subject and object. This theory is supported by the new, non-propositional logic proposed recently by Brenner. We demonstrate the utility of our approach for the reconceptualization of the virtual properties of reality and cognition. It is strongly anti-representationalist and can provide the basis for the integration of inputs from outside the brain (and body) into cognitive structures. For us, the philosophy of information is a metaphilosophy, implying major changes in both the content and methodology of standard philosophical disciplines. We suggest that this philosophy of information and our informational approach may help guide research in a number of current areas of cognitive science.

Küppers, B.-O. (2010). Information and communication in living matter. Information and the Nature of Reality: From Physics to Metaphysics. P. Davies and N. H. Gregersen, Cambridge University Press: 170-184.

Lai, J. (2015). "“Patrimonial Bureaucracy” and Chinese Law: Max Weber’s Legacy and Its Limits." Modern China 41(1): 40-58.
Max Weber’s claim that traditional Chinese law was “khadi justice” continues to be a challenge facing scholars of Chinese legal history. While many scholars respond to this thesis at the level of empirical studies, their research cannot replace responses at the theoretical level. Thus, it is necessary to enter into a theoretical dialogue with Weber on the basis of empirical findings. Weber’s thought about China within the framework of his sociology of domination is a crucial factor influencing his presentation of Chinese law. In his original theoretical design, “patrimonial bureaucracy” is a mixed or in-between state of irrational and rational types of domination, since it contains elements of both patrimonialism and bureaucracy. Thus, justice and administration under such domination are not completely arbitrary. Weber’s assumption that world history increasingly moves toward full rationality, however, induces him to place China in the initial stage of the process of rationalization and to put the modern Western world at the end of that process. Under the influence of this thinking, Weber consciously or unconsciously neglects bureaucratic facets of imperial China’s patrimonial bureaucracy and deliberately amplifies its patrimonial facets. As a result, Chinese law in Weber’s writings is substantively irrational “khadi justice.” Although Weber’s concepts and insights are limited, if refined and revised they can still benefit future scholars of traditional Chinese law.

Lamme, V. A. and P. R. Roelfsema (2000). "The distinct modes of vision offered by feedforward and recurrent processing." Trends Neurosci 23.

Lancaster, G. (2009). "Minds and Computers: An Introduction to the Philosophy of Artificial Intelligence. By Matt Carter." The Heythrop Journal 50(3): 565-565.

Landauer, R. (1991). "Information is physical." Physics Today 44(5): 23-29.

Landauer, R. (1999). "Information is a physical entity." Physica A: Statistical Mechanics and its Applications 263(1): 63-67.
This paper, associated with a broader conference talk on the fundamental physical limits of information handling, emphasizes the aspects still least appreciated. Information is not an abstract entity but exists only through a physical representation, thus tying it to all the restrictions and possibilities of our real physical universe. The mathematician's vision of an unlimited sequence of totally reliable operations is unlikely to be implementable in this real universe. Speculative remarks about the possible impact of that, on the ultimate nature of the laws of physics are included.

Laureys, S., et al. (2002). "Brain function in the vegetative state." Acta Neurol Belg 102.

Leeper, E. M., et al. (2011). Foresight and Information Flows, National Bureau of Economic Research.

Levy, A. (2011). "Information in Biology: A Fictionalist Account." NOUS 45(4): 640-657.

Li, F. (2008). Bureaucracy and the state in early China: governing the western Zhou. Cambridge, UK;New York;, Cambridge University Press.

Li, G. (2011). "Information philosophy in China: Professor Wu Kun's 30 years of academic thinking in information philosophy." TripleC 9(2): 316-321.

Libet, B. (1982). "Brain stimulation in the study of neuronal functions for conscious sensory experiences." Human Neurobiology 1.

Liu, D., et al. (2011). "Constructivism scenario evolutionary analysis of zero emission regional planning: A case of Qaidam Circular Economy Pilot Area in China." International Journal of Production Economics 140(1): 341.
The concept of zero emission contributes significantly to sustainable development. China is planning to build a circular economy pilot area with zero emissions in the Western Qaidam Basin. The paper discusses some theoretical and practical issues in the regional planning. In particular, the paper proposes a constructivism scenario evolutionary analysis based on the bargaining games among the involved stakeholders, which embodies the harmonic society criteria in regional planning. Then, the paper applies the scenario evolutionary analysis method to the multi-objective programming model of the Qaidam Circular Economy Pilot Area, where the objective of environmental pollution minimization can be converted into constraint conditions. Therefore, the problem is converted into a single-objective mixed-integer programming model of maximal profit under various constraint conditions including natural resources, environment capacity, and social and economic factors. Furthermore, the model provides solutions to the optimal expected output levels of main chemical products and the minimal quantity of pollution treatment facilities according to the optimal scenario evolutionary path, which embodies self-purification emission with moderate resource exploitation. The results of the paper will have important policy implications for regional development planning in China.

Lombardi, O. (2004). "What is Information?" Foundations of Science 9(2): 105-134.
The main aim of this work is to contribute to the elucidation of the concept of information by comparing three different views about this matter: the view of Fred Dretske's semantic theory of information, the perspective adopted by Peter Kosso in his interaction-information account of scientific observation, and the syntactic approach of Thomas Cover and Joy Thomas. We will see that these views involve very different concepts of information, each one useful in its own field of application. This comparison will allow us to argue in favor of a terminological 'cleansing': it is necessary to make a terminological distinction among the different concepts of information, in order to avoid conceptual confusions when the word 'information' is used to elucidate related concepts such as knowledge, observation or entropy.

Long, B. R. (2009). Informationist Science Fiction and Informationist Science Fiction Theory, The University of Sydney.

Long, B. R. (2014). "Information is intrinsically semantic but alethically neutral." SYNTHESE 191(14): 3447-3467.
In this paper I argue that, according to a particular physicalist conception of information, information is both alethically neutral or non-alethic, and is intrinsically semantic. The conception of information presented is physicalist and reductionist, and is contrary to most current pluralist and non-reductionist philosophical opinion about the nature of information. The ontology assumed for this conception of information is based upon physicalist non-eliminative ontic structural realism. However, the argument of primary interest is that information so construed is intrinsically semantic on a reductionist and non-alethic basis where semantic content is constituted by indication along causal pathways. Similar arguments have been presented by philosophers with respect to representation. I suggest the conception of information that I present is correct by the lights of the best applied mathematical and scientific theories of information. If so, there is no need for any separate theory of semantic information. Thus I present a theory of intrinsically semantic information which also constitutes an informational theory of truth where truth reduces to information. In the last section I discuss weakly and strongly semantic information, and reject them in favour of alethically neutral intrinsically semantic information.

Losee, R. M. (2012). Information from Processes: About the Nature of Information Creation, Use, and Representation. Dordrecht, Springer.

Cuviello, P. (2002). "Army on Cusp of Sun Tzu's Dream." Pentagon Brief: 5.
Consequently, it is increasingly the information domain that must be protected and defended to enable a force to generate combat power in the face of offensive actions taken by an adversary. In the all-important battle for information superiority, the information domain is ground zero. But, the information domain, as a complement to the physical domain of warfare, is not enough.

Mares, E. D. (1996). "Relevant Logic and the Theory of Information." SYNTHESE 109(3): 345-360.
This paper provides an interpretation of the Routley-Meyer semantics for a weak negation-free relevant logic using Israel and Perry's theory of information. In particular, Routley and Meyer's ternary accessibility relation is given an interpretation in information-theoretic terms.

Mathiesen, K. (2004). "What is Information Ethics?" ACM SIGCAS Computers and Society 34(1): 6.

Mathur, S. D. (2009). What Exactly is the Information Paradox? Berlin, Heidelberg, Springer Berlin Heidelberg. 769: 3-48.
The black hole information paradox tells us something important about the way quantum mechanics and gravity fit together. In these lectures I try to give a pedagogical review of the essential physics leading to the paradox, using mostly pictures. Hawking’s argument is recast as a ‘theorem’: if quantum gravity effects are confined to within a given length scale and the vacuum is assumed to be unique, then there will be information loss. We conclude with a brief summary of how quantum effects in string theory violate the first condition and make the interior of the hole a ‘fuzzball’.

McIrvine, E. C. and M. Tribus (1971). "Energy and Information." Scientific American 225(3): 179-188.

Merkel, R. F. and E. Hochstetter (2015). Leibniz und China. Berlin; Boston, De Gruyter.

Moles, A. A. (1966). Information theory and esthetic perception. Urbana, University of Illinois Press.

Moruzzi, G. and H. W. Magoun (1949). "Brain stem reticular formation and activation of the EEG." Electroencephalog Clin Neurophysiol 1.

Müller, V. C. (2012). Philosophy and Theory of Artificial Intelligence. Dordrecht, Springer.

Müller, V. C. (2014). Philosophy and Theory of Artificial Intelligence. Berlin, Heidelberg, Springer Berlin Heidelberg.

Müller, V. C. (2016). Fundamental Issues of Artificial Intelligence. Cham, Springer International Publishing.

Nikitina, E., et al. (2014). "Artificial intelligence: philosophy, methodology, innovation." Philosophical Problems of Information Technologies and Cyberspace: 108-122.

Page, D. N. (1993). "Information in black hole radiation." PHYSICAL REVIEW LETTERS 71(23): 3743-3746.
If black hole formation and evaporation can be described by an S matrix, information would be expected to come out in black hole radiation. An estimate shows that it may come out initially so slowly, or else be so spread out, that it would never show up in an analysis perturbative in M(Planck)/M, or in 1/N for two-dimensional dilatonic black holes with a large number N of minimally coupled scalar fields.

Passingham, R. E., et al. (2002). "The anatomical basis of functional localization in the cortex." Nat Rev Neurosci 3.

Perkins, F. (2004). Leibniz and China: a commerce of light. New York, Cambridge University Press.

Pfaff, D. W. (2006). Brain Arousal and Information Theory: Neural and Genetic Mechanisms. Cambridge, Mass., Harvard University Press.

Pitalúa-García, D. (2013). "Quantum Information Causality." Phys. Rev. Lett. 110(21): 210402.

Plastino, A. and A. R. Plastino (2007). Information and thermal physics: 119-154.

Pouget, A., et al. (2002). "A computational perspective on the neural basis of multisensory spatial representations." Nat Rev Neurosci 3.

Primiero, G. (2011). "Giovanni Sommaruga (ed): Formal Theories of Information: From Shannon to Semantic Information Theory and General Concepts of Information: Lecture Notes in Computer Science, vol. 5363, Springer, New York, 2009, vii+269, $ 64.95, ISBN 978-3-642-00658-6." MINDS AND MACHINES 21(1): 119-122.

Rabinovich, M. I., et al. (2012). "Information flow dynamics in the brain." Physics of Life Reviews 9(1): 51-73.
Timing and dynamics of information in the brain is a hot field in modern neuroscience. The analysis of the temporal evolution of brain information is crucially important for the understanding of higher cognitive mechanisms in normal and pathological states. From the perspective of information dynamics, in this review we discuss working memory capacity, language dynamics, goal-dependent behavior programming and other functions of brain activity. In contrast with the classical description of information theory, which is mostly algebraic, brain flow information dynamics deals with problems such as the stability/instability of information flows, their quality, the timing of sequential processing, the top-down cognitive control of perceptual information, and information creation. In this framework, different types of information flow instabilities correspond to different cognitive disorders. On the other hand, the robustness of cognitive activity is related to the control of the information flow stability. We discuss these problems using both experimental and theoretical approaches, and we argue that brain activity is better understood considering information flows in the phase space of the corresponding dynamical model. In particular, we show how theory helps to understand intriguing experimental results in this matter, and how recent knowledge inspires new theoretical formalisms that can be tested with modern experimental techniques.

Rees, G., et al. (2002). "Neural correlates of consciousness in humans." Nat Rev Neurosci 3.

Restall, G. (1996). Information Flow and Relevant Logics: 463-477.

Rioul, O. (2007). A Simple Proof of the Entropy-Power Inequality via Properties of Mutual Information, IEEE.
While most useful information theoretic inequalities can be deduced from the basic properties of entropy or mutual information, Shannon's entropy power inequality (EPI) seems to be an exception: available information theoretic proofs of the EPI hinge on integral representations of differential entropy using either Fisher's information (FI) or minimum mean-square error (MMSE). In this paper, we first present a unified view of proofs via FI and MMSE, showing that they are essentially dual versions of the same proof, and then fill the gap by providing a new, simple proof of the EPI, which is solely based on the properties of mutual information and sidesteps both FI or MMSE representations.

Rocchi, P. (2012). "What is Information: Beyond the Jungle of Information Theories." Computer Journal 55(7): 856-860.
Digital experts rigorously apply the principles of sharpness and semantics whereas analog experts pay less attention to these principles. We therefore put forward a way to distinguish the digital paradigm from the analog paradigm: the former adheres to these principles through precise rules; the latter adheres to them in generic and imprecise terms.

Rorty, R. (2003). "Dewey, Democracy, and China." Dao 3(1): 1-6.

Rovelli, C. (2016). "Meaning = Information + Evolution."
Notions like meaning, signal, intentionality, are difficult to relate to a physical world. I study a purely physical definition of "meaningful information", from which these notions can be derived. It is inspired by a model recently illustrated by Kolchinsky and Wolpert, and improves on Dretske's classic work on the relation between knowledge and information. I discuss what makes a physical process into a "signal".

Roy Frieden, B. (2007). "Information-based uncertainty for a photon." Optics Communications 271(1): 71-72.
It is shown on the basis of Fisher information that the ultimate root-mean square uncertainty in the position of a single photon of wavelength λ is 0.112λ in vacuum. This is as well an "effective size" for the photon.

Salinas, E. (2004). "Fast remapping of sensory stimuli onto motor actions on the basis of contextual modulation." J Neurosci 24.

Salthe, S. N. (2011). "Naturalizing Information." Information 2(4): 417-425.
Certain definitions of information can be seen to be compatible with each other if their relationships are properly understood as referring to different levels of organization in a subsumptive hierarchy. The resulting hierarchy, with thermodynamics subsuming information theory, and that in turn subsuming semiotics, amounts to a naturalizing of the information concept.

Santacroce, M., et al. (2015). "Power Utility Maximization Problems Under Partial Information and Information Sufficiency in a Brownian Setting." Stochastic Analysis and Applications 33(3): 493-509.
We consider the problem of expected power utility maximization from terminal wealth in diffusion market models under partial information. After obtaining novel neat expressions for the value-process and for the optimal strategy, the issue of information sufficiency is addressed. In particular, necessary and sufficient conditions that guarantee that the partial information optimal strategy is still optimal when having access to all market information, are provided.

Sarkar, S. (1996). Biological Information: A Skeptical Look at Some Central Dogmas of Molecular Biology. Molecular Models of Life: Philosophical Papers on Molecular Biology. Massachusetts, MIT Press: 205-260.

Sarkar, S. (2000). "Information in Genetics and Developmental Biology: Comments on Maynard Smith." Philosophy of Science 67(2): 208-213.

Sarkar, S. (2006). How Genes Encode Information for Phenotypic Traits. Molecular Models of Life: Philosophical Papers on Molecular Biology. Massachusetts, MIT Press: 261-286.

Sarkar, S. (2014). "Does “Information” Provide a Compelling Framework for a Theory of Natural Selection? Grounds for Caution." Philosophy of Science 81(1): 22-30.
Frank has recently argued for an information-theoretic interpretation of natural selection. This interpretation is based on the identification of a measure related to the Malthusian parameter (for population change) with the Jeffreys divergence between the present allelic distribution of the population and that distribution in the next generation. It is pointed out in this analysis that this identification only holds if the mean fitness of the population is a constant, that is, there is no selection. This problem is used to argue for the superiority of the standard dynamical interpretation of natural selection over its information-theoretic counterpart.

Schneider, T. D. (1999). "Measuring Molecular Information." Journal of Theoretical Biology 201(1): 87-92.

Sequoiah-Grayson, S. (2007). "The Metaphilosophy of Information." MINDS AND MACHINES 17(3): 331-344.
This article mounts a defence of Floridi's theory of strongly semantic information against recent independent objections from Fetzer and Dodig-Crnkovic. It is argued that Fetzer and Dodig-Crnkovic's objections result from an adherence to a redundant practice of analysis. This leads them to fail to accept an informational pluralism, as stipulated by what will be referred to as Shannon's Principle, and the non-reductionist stance. It is demonstrated that Fetzer and Dodig-Crnkovic fail to acknowledge that Floridi's theory of strongly semantic information captures one of our deepest and most compelling intuitions regarding informativeness as a basic notion. This modal intuition will be referred to as the contingency requirement on informativeness. It will be demonstrated that its clarification validates the theory of strongly semantic information as a novel, and non ad hoc solution to the Bar-Hillel-Carnap semantic paradox.

Sequoiah-Grayson, S. (2009). "A Positive Information Logic for Inferential Information." SYNTHESE 167(2): 409-431.
Performing an inference involves irreducibly dynamic cognitive procedures. The article proposes that a non-associative information frame, corresponding to a residuated pogroupoid, underpins the information structure involved. The argument proceeds by expounding the informational turn in logic, before outlining the cognitive actions at work in deductive inference. The structural rules of Weakening, Contraction, Commutation, and Association are rejected on the grounds that they cause us to lose track of the information flow in inferential procedures. By taking the operation of information application as the primary operation, the fusion connective is retained, with commutative failure generating a double implication. The other connectives are rejected.

Shamir, M. (2014). "Emerging principles of population coding: in search for the neural code." Current Opinion in Neurobiology 25: 140-148.
Population coding theory aims to provide quantitative tests for hypotheses concerning the neural code. Over the last two decades theory has focused on analyzing the ways in which various parameters that characterize neuronal responses to external stimuli affect the information content of these responses. This article reviews and provides an intuitive explanation for the major effects of noise correlations and neuronal heterogeneity, and discusses their implications for our ability to investigate the neural code. It is argued that to test neural code hypotheses further, additional constraints are required, including relating trial-to-trial variation in neuronal population responses to behavioral decisions and specifying how information is decoded by downstream networks.

Shannon, C. E., et al. (1993). "Information Theory" from Encyclopedia Britannica 14th edition 1968. Claude Elwood Shannon: collected papers. New York, IEEE Press: 212-220.

Shea, N. (2007). "Consumers Need Information: Supplementing Teleosemantics with an Input Condition." Philosophy and Phenomenological Research 75(2): 404-435.
The success of a piece of behaviour is often explained by its being caused by a true representation (similarly, failure falsity). In some simple organisms, success is just survival and reproduction. Scientists explain why a piece of behaviour helped the organism to survive and reproduce by adverting to the behaviour's having been caused by a true representation. That usage should, if possible, be vindicated by an adequate naturalistic theory of content. Teleosemantics cannot do so, when it is applied to simple representing systems (Godfrey-Smith 1996). Here it is argued that the teleosemantic approach to content should therefore be modified, not abandoned, at least for simple representing systems. The new 'infotel-semantics' adds an input condition to the output condition offered by teleosemantics, recognising that it is constitutive of content in a simple representing system that the tokening of a representation should correlate probabilistically with the obtaining of its specific evolutionary success condition.

Silva, F. S. C. d. and J. Agustí i Cullell (2008). Information Flow and Knowledge Sharing. Amsterdam; Boston, Elsevier.

Sivin, N. (1985). "Why the scientific revolution did not take place in China — or did it?" The Environmentalist 5(1): 39-50.

Smith, J. M. (1999). "The Idea of Information in Biology." The Quarterly Review of Biology 74(4): 395-400.
The idea of information in biology is discussed. There has been, in the course of evolution, a series of changes in the way in which information is stored and transmitted. However, philosophers of biology have either ignored the concept of information or have argued that it is irrelevant or misleading. The manner in which the concept has been applied in genetics, evolution, and developmental biology is considered.

Smith, J. M. (2000). "The Concept of Information in Biology." Philosophy of Science 67(2): 177-194.
The use of informational terms is widespread in molecular and developmental biology. The usage dates back to Weismann. In both protein synthesis and in later development, genes are symbols, in that there is no necessary connection between their form (sequence) and their effects. The sequence of a gene has been determined, by past natural selection, because of the effects it produces. In biology, the use of informational terms implies intentionality, in that both the form of the signal, and the response to it, have evolved by selection. Where an engineer sees design, a biologist sees natural selection.

Smokler, H. (1966). "Informational Content: A Problem of Definition." The Journal of Philosophy 63(8): 201-211.

Sommerfeldt, E. J. (2015). "Disasters and Information Source Repertoires: Information Seeking and Information Sufficiency in Postearthquake Haiti." Journal of Applied Communication Research 43(1): 1-22.
This study examines how Haitians used "information source repertoires" to meet information insufficiencies following the 2010 earthquake. Using survey data gained in Haiti, the study explores which demographic and structural factors predicted the number of sources used and combinations of information sources following the disaster. Analysis of the data revealed two distinct repertoires of information sources: a "traditional" repertoire of radio, TV, church, and word of mouth; and an "elite" repertoire of newspapers, the Internet, short-message-service, billboards, and the national police. Results of hierarchical multiple regression analyses showed that demographic variables like education were stronger predictors of information repertoires than conditions like living in a refugee camp or having one's home destroyed. Results also suggested that greater reliance on a traditional repertoire led to decreased information sufficiency. Contrary to previous crisis research, men were found to be more active information seekers than women, suggesting that scholarly knowledge about information seeking and media use after crises in developing nations is limited. Implications for practice are directed at international development and aid organizations in planning postdisaster information provision efforts.
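The "hierarchical multiple regression" the study relies on enters predictor blocks in stages and compares the gain in explained variance. A minimal sketch of that idea on synthetic data (all variable names and coefficients are illustrative, not the paper's actual survey variables):

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic stand-in for the survey design: a demographic block entered first,
# a disaster-exposure block entered second (all variables are made up).
n = 500
education = rng.normal(size=n)
age = rng.normal(size=n)
home_destroyed = rng.integers(0, 2, size=n).astype(float)
y = 0.6 * education + 0.1 * home_destroyed + rng.normal(size=n)  # outcome

def r_squared(X, y):
    """Training R^2 of an OLS fit with intercept."""
    X = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1 - resid.var() / y.var()

# Step 1: demographics only.  Step 2: add the situational condition.
r2_step1 = r_squared(np.column_stack([education, age]), y)
r2_step2 = r_squared(np.column_stack([education, age, home_destroyed]), y)
print(round(r2_step1, 3), round(r2_step2 - r2_step1, 3))  # R^2, then delta-R^2
```

The size of the step-2 increment (ΔR²) is what licenses claims like "demographics were stronger predictors than exposure conditions".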

Sperry, R. (1984). "Consciousness, personal identity and the divided brain." Neuropsychologia 22.

Srinivasan, R., et al. (1999). "Increased synchronization of neuromagnetic responses during conscious perception." J Neurosci 19.

Stegmann, U. E. (2005). "Genetic Information as Instructional Content." Philosophy of Science 72(3): 425-443.

Stegmann, U. E. (2009). "DNA, Inference, and Information." The British Journal for the Philosophy of Science 60(1): 1-17.
This paper assesses Sarkar's ([2003]) deflationary account of genetic information. On Sarkar's account, genes carry information about proteins because protein synthesis exemplifies what Sarkar calls a 'formal information system'. Furthermore, genes are informationally privileged over non-genetic factors of development because only genes enter into arbitrary relations to their products (in virtue of the alleged arbitrariness of the genetic code). I argue that the deflationary theory does not capture four essential features of the ordinary concept of genetic information: intentionality, exclusiveness, asymmetry, and causal relevance. It is therefore further removed from what is customarily meant by genetic information than Sarkar admits. Moreover, I argue that it is questionable whether the account succeeds in demonstrating that information is theoretically useful in molecular genetics.

Sterelny, K. (2000). "The "Genetic Program" Program: A Commentary on Maynard Smith on Information in Biology." Philosophy of Science 67(2): 195-201.

Sterner, B. (2014). "The Practical Value of Biological Information for Research." Philosophy of Science 81(2): 175-194.
Many philosophers are skeptical about the scientific value of the concept of biological information. However, several have recently proposed a more positive view of ascribing information as an exercise in scientific modeling. I argue for an alternative role: guiding empirical data collection for the sake of theorizing about the evolution of semantics. I clarify and expand on Bergstrom and Rosvall's suggestion of taking a "diagnostic" approach that defines biological information operationally as a procedure for collecting empirical cases. The more recent modeling-based accounts still perpetuate a theory-centric view of scientific concepts, which motivated philosophers' misplaced skepticism in the first place.

Stojmirović, A. and Y.-K. Yu (2007). "Information flow in interaction networks." Journal of Computational Biology 14(8): 1115-1143.
Interaction networks, consisting of agents linked by their interactions, are ubiquitous across many disciplines of modern science. Many methods of analysis of interaction networks have been proposed, mainly concentrating on node degree distribution or aiming to discover clusters of agents that are very strongly connected between themselves. These methods are principally based on graph theory or machine learning. We present a mathematically simple formalism for modelling context-specific information propagation in interaction networks based on random walks. The context is provided by selection of sources and destinations of information and by use of potential functions that direct the flow towards the destinations. We also use the concept of dissipation to model the aging of information as it diffuses from its source. Using examples from yeast protein-protein interaction networks and some of the histone acetyltransferases involved in control of transcription, we demonstrate the utility of the concepts and the mathematical constructs introduced in this paper.
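The random-walk-with-dissipation idea can be sketched in a few lines: on a toy network, the expected number of visits a decaying walk pays to each node measures how much "information" flows there from a source. This is only an illustration of the general mechanism, assuming a made-up 5-node graph and dissipation constant; it is not the paper's actual model of yeast interaction data.

```python
import numpy as np

# Toy interaction network: adjacency matrix over 5 nodes (0 = source).
A = np.array([
    [0, 1, 1, 0, 0],
    [1, 0, 1, 1, 0],
    [1, 1, 0, 1, 0],
    [0, 1, 1, 0, 1],
    [0, 0, 0, 1, 0],
], dtype=float)

# Row-normalise into a random-walk transition matrix.
P = A / A.sum(axis=1, keepdims=True)

# Dissipation: at each step a fraction (1 - alpha) of the walk "ages out",
# so information from the source decays with path length.
alpha = 0.85

# Expected visits per node for a walk started at node 0:
# sum over t of (alpha * P)^t equals (I - alpha * P)^(-1); take row 0.
F = np.linalg.inv(np.eye(5) - alpha * P)
visits = F[0]  # information "flow" from the source into every node

print(np.round(visits, 3))
```

Nodes far from the source (here, node 4) receive little flow because dissipation damps long paths, which is exactly the aging effect the abstract describes.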

Sun, F., et al. (2013). Foundations and Practical Applications of Cognitive Systems and Information Processing: Proceedings of the First International Conference on Cognitive Systems and Information Processing, Beijing, China, Dec 2012 (CSIP2012). Dordrecht, Springer.

Tai, Y., et al. (2017). "Gate Level Information Flow analysis for multi-valued logic system." 2017 2nd International Conference on Image, Vision and Computing (ICIVC): 1102.
As the scale of integrated circuits continues to increase, guaranteeing the intensity of testing and the coverage of verification in the design phase is becoming a severe challenge. As a solution, the Gate Level Information Flow Tracking (GLIFT) method can precisely measure all logical information flows in the underlying hardware to prevent information leakage resulting from harmful flows of information. However, preliminary research on GLIFT has focused mainly on basic theory and generation algorithms, which can only track logic formalized under the Boolean logic system. These approaches ignore the fact that digital circuits are typically described by multi-valued logic during hardware design and verification. To address this issue, we propose to extend the GLIFT method to multi-valued logic systems. In this paper, label propagation rule sets are derived for four-valued and nine-valued logic systems by extending the label propagation rules.

Taylor, K. A. (1987). "Belief, Information and Semantic Content: A Naturalist's Lament." SYNTHESE 71(1): 97-124.

Timpson, C. G. (2005). "Nonlocality and Information Flow: The Approach of Deutsch and Hayden." FOUNDATIONS OF PHYSICS 35(2): 313-343.

Timpson, C. G. (2006). "Philosophical Aspects of Quantum Information Theory."
Quantum information theory represents a rich subject of discussion for those interested in the philosophical and foundational issues surrounding quantum mechanics for a simple reason: one can cast its central concerns in terms of a long-familiar question: How does the quantum world differ from the classical one? Moreover, deployment of the concepts of information and computation in novel contexts hints at new (or better) means of understanding quantum mechanics, and perhaps even invites re-assessment of traditional material conceptions of the basic nature of the physical world. In this paper I review some of these philosophical aspects of quantum information theory, beginning with an elementary survey of the theory, seeking to highlight some of the principles and heuristics involved. We move on to a discussion of the nature and definition of quantum information and deploy the findings in discussing the puzzles surrounding teleportation. The final two sections discuss, respectively, what one might learn from the development of quantum computation (both about the nature of quantum systems and about the nature of computation) and consider the impact of quantum information theory on the traditional foundational questions of quantum mechanics (treating of the views of Zeilinger, Bub and Fuchs, amongst others).

Timpson, C. G. (2013). What is Information? Oxford, Oxford University Press.
Distinctions are drawn between a number of different information concepts. It is noted that ‘information’ in both the everyday and Shannon-theory setting is an abstract noun, though derived in different ways. A general definition of the concept(s) of information in the Shannon mould is provided and it is shown that a concept of both bits (how much) and pieces (what) of Shannon information is available. It is emphasised that the Shannon information, as a measure of information, should not be understood as an uncertainty; neither is the notion of correlation key to the Shannon concept. Corollaries regarding the ontological status of information and on the notion of information’s flow are drawn. The chapter closes with a brief discussion of Dretske’s attempt to base a semantic notion of information on Shannon’s theory. It is argued that the attempt is not successful.

Tkačik, G., et al. (2008). "Information Flow and Optimization in Transcriptional Regulation." Proceedings of the National Academy of Sciences of the United States of America 105(34): 12265-12270.
In the simplest view of transcriptional regulation, the expression of a gene is turned on or off by changes in the concentration of a transcription factor (TF). We use recent data on noise levels in gene expression to show that it should be possible to transmit much more than just one regulatory bit. Realizing this optimal information capacity would require that the dynamic range of TF concentrations used by the cell, the input/output relation of the regulatory module, and the noise in gene expression satisfy certain matching relations, which we derive. These results provide parameter-free, quantitative predictions connecting independently measurable quantities. Although we have considered only the simplified problem of a single gene responding to a single TF, we find that these predictions are in surprisingly good agreement with recent experiments on the Bicoid/Hunchback system in the early Drosophila embryo and that this system achieves ∼90% of its theoretical maximum information transmission.
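The claim that a noisy regulatory element can carry "more than one regulatory bit" is a statement about the mutual information between TF concentration and expression level. A crude sketch of how such a quantity is estimated, assuming an illustrative Hill-function input/output relation and noise level (the parameters are made up, not fitted to Bicoid/Hunchback data):

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative regulatory input/output: Hill-function response with
# additive output noise.  n, K, and the noise level are assumptions.
def expression(c, n=2.0, K=1.0, noise=0.1):
    mean = c**n / (K**n + c**n)
    return mean + noise * rng.standard_normal(c.shape)

c = rng.uniform(0.0, 3.0, 100_000)   # TF concentration samples
g = expression(c)                    # noisy expression output

# Plug-in estimate of the mutual information I(c; g) in bits,
# via a 2-D histogram of the joint distribution.
joint, _, _ = np.histogram2d(c, g, bins=30)
p = joint / joint.sum()
pc, pg = p.sum(axis=1), p.sum(axis=0)
nz = p > 0
mi = np.sum(p[nz] * np.log2(p[nz] / (pc[:, None] * pg[None, :])[nz]))

print(f"I(c; g) ≈ {mi:.2f} bits")  # crude estimate; biased for coarse bins
```

Lowering the noise parameter raises the estimate, which is the qualitative point of the paper: measured noise levels leave room for well over one bit.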

Tononi, G. (2001). "Information measures for conscious experience." Arch Ital Biol 139.

Tononi, G. (2004). "An information integration theory of consciousness." BMC Neuroscience 5(1): 42.
Consciousness poses two main problems. The first is understanding the conditions that determine to what extent a system has conscious experience. For instance, why is our consciousness generated by certain parts of our brain, such as the thalamocortical system, and not by other parts, such as the cerebellum? And why are we conscious during wakefulness and much less so during dreamless sleep? The second problem is understanding the conditions that determine what kind of consciousness a system has. For example, why do specific parts of the brain contribute specific qualities to our conscious experience, such as vision and audition?

Tononi, G. and O. Sporns (2003). "Measuring information integration." BMC Neurosci 4.

Tribus, M. (1963). Information Theory and Thermodynamics. Heat Transfer, Thermodynamics, and Education: Boelter Anniversary Volume, McGraw Hill: 348-368.

Vitanyi, P. (2000). Three Approaches to the Quantitative Definition of Information in an Individual Pure Quantum State, IEEE Computer Society.

Walleczek, J. and G. Grössing (2016). "Nonlocal Quantum Information Transfer Without Superluminal Signalling and Communication." FOUNDATIONS OF PHYSICS 46(9): 1208-1228.
It is a frequent assumption that—via superluminal information transfers—superluminal signals capable of enabling communication are necessarily exchanged in any quantum theory that posits hidden superluminal influences. However, does the presence of hidden superluminal influences automatically imply superluminal signalling and communication? The non-signalling theorem mediates the apparent conflict between quantum mechanics and the theory of special relativity. However, as a 'no-go' theorem there exist two opposing interpretations of the non-signalling constraint: foundational and operational. Concerning Bell's theorem, we argue that Bell employed both interpretations, and that he finally adopted the operational position which is often associated with ontological quantum theory, e.g., de Broglie–Bohm theory. This position we refer to as "effective non-signalling". By contrast, associated with orthodox quantum mechanics is the foundational position referred to here as "axiomatic non-signalling". In search of a decisive communication-theoretic criterion for differentiating between "axiomatic" and "effective" non-signalling, we employ the operational framework offered by Shannon's mathematical theory of communication, whereby we distinguish between Shannon signals and non-Shannon signals. We find that an effective non-signalling theorem comprises two sub-theorems: (1) a non-transfer-control (NTC) theorem, and (2) a non-signification-control (NSC) theorem. Employing the NTC and NSC theorems, we report that effective, instead of axiomatic, non-signalling is entirely sufficient for prohibiting nonlocal communication. Effective non-signalling prevents the instantaneous, i.e., superluminal, transfer of message-encoded information through the controlled use—by a sender-receiver pair—of informationally-correlated detection events, e.g., in EPR-type experiments. An effective non-signalling theorem allows for nonlocal quantum information transfer yet—at the same time—effectively denies superluminal signalling and communication.

Weiss, O., et al. (2000). "Information Content of Protein Sequences." Journal of Theoretical Biology 206(3): 379-386.
The complexity of large sets of non-redundant protein sequences is measured. This is done by estimating the Shannon entropy as well as applying compression algorithms to estimate the algorithmic complexity. The estimators are also applied to randomly generated surrogates of the protein data. Our results show that proteins are fairly close to random sequences. The entropy reduction due to correlations is only about 1%. However, precise estimations of the entropy of the source are not possible due to finite sample effects. Compression algorithms also indicate that the redundancy is in the order of 1%. These results confirm the idea that protein sequences can be regarded as slightly edited random strings. We discuss secondary structure and low-complexity regions as causes of the redundancy observed. The findings are related to numerical and biochemical experiments with random polypeptides. Copyright 2000 Academic Press.
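The two estimates the paper combines, an empirical Shannon entropy and a compression-based redundancy bound, can be sketched in a few lines of Python. This is a minimal order-0 illustration (it ignores the inter-residue correlations and finite-sample corrections the paper treats carefully); the residue alphabet and the use of zlib as a general-purpose compressor are my choices, not the paper's estimators:

```python
import math
import random
import zlib
from collections import Counter

AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"  # the 20 standard residues

def shannon_entropy(seq):
    """Empirical order-0 Shannon entropy in bits per symbol."""
    counts = Counter(seq)
    n = len(seq)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def compression_redundancy(seq):
    """Redundancy estimated via a general-purpose compressor (zlib).
    Note: part of what zlib removes is the 8-bit ASCII encoding overhead,
    so this is only an upper-bound-style illustration."""
    raw = seq.encode("ascii")
    return 1 - len(zlib.compress(raw, 9)) / len(raw)

random.seed(0)
# A uniformly random "protein": entropy should approach log2(20) ≈ 4.32 bits.
random_seq = "".join(random.choice(AMINO_ACIDS) for _ in range(100_000))
print(f"empirical entropy: {shannon_entropy(random_seq):.3f} bits/residue")
print(f"maximum entropy:   {math.log2(20):.3f} bits/residue")
print(f"zlib redundancy:   {compression_redundancy(random_seq):.3f}")
```

On real protein sets the paper's point is that the measured entropy sits only about 1% below the maximum, which is why it describes proteins as "slightly edited random strings".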

Wheeler, J. (1989). Information, physics, quantum: the search for links.

Wolf, F., et al. (2014). "Dynamical models of cortical circuits." Current Opinion in Neurobiology 25: 228-236.
Cortical neurons operate within recurrent neuronal circuits. Dissecting their operation is key to understanding information processing in the cortex and requires transparent and adequate dynamical models of circuit function. Convergent evidence from experimental and theoretical studies indicates that strong feedback inhibition shapes the operating regime of cortical circuits. For circuits operating in inhibition-dominated regimes, mathematical and computational studies over the past several years achieved substantial advances in understanding response modulation and heterogeneity, emergent stimulus selectivity, inter-neuron correlations, and microstate dynamics. The latter indicate a surprisingly strong dependence of the collective circuit dynamics on the features of single neuron action potential generation. New approaches are needed to definitely characterize the cortical operating regime.
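The inhibition-dominated operating regime the review discusses can be illustrated with a toy threshold-linear rate network. This is a generic sketch with made-up sizes and weights, not a model taken from the paper: recurrent inhibition is made stronger than recurrent excitation, and the population relaxes to stable, mildly heterogeneous firing rates under constant drive.

```python
import numpy as np

rng = np.random.default_rng(1)
N_E, N_I = 80, 20                     # excitatory / inhibitory units
N = N_E + N_I
W = rng.random((N, N)) / N            # random recurrent weights
W[:, :N_E] *= 0.5                     # excitatory synapses
W[:, N_E:] *= -2.0                    # stronger feedback inhibition
I_ext = 1.0                           # constant external drive
r = np.zeros(N)                       # firing rates
dt, tau = 0.1, 1.0
for _ in range(2000):                 # Euler integration to steady state
    r += dt / tau * (-r + np.maximum(W @ r + I_ext, 0.0))
print(f"mean rate: {r.mean():.3f}, rate spread (std): {r.std():.3f}")
```

The rate spread across units comes purely from the random recurrent weights, a crude stand-in for the response heterogeneity the review describes in inhibition-dominated circuits.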

Wolski, J. (2010). "Measure of Amount of Information and Its Meaning." FILOZOFIA NAUKI 18(3): 105.
Five different conceptions of information have been created over the last sixty years, each defining the notion of information differently. The aim of this article is to prove that these five conceptions represent two separate trends. Each conception belongs either to the quantitative trend, which rests on measuring the amount of information, or to the semantic trend, which describes the meaning of information. These trends are mutually exclusive.

Wu, K. (2015). "The Development of Philosophy and Its Fundamental Informational Turn." Information 6(4): 693-703.
Through the rescientification of philosophy and the philosophization of science, an entirely new concept of science and philosophy as part of general human knowledge is developing. In this concept, science and philosophy become intrinsically integrated and unified, forming dynamic feedback-loops which lead to further mutual transformation and integration. This development is taking place in the face of two kinds of dogmatism: one is naturalistic dogmatism, the other the dogmatism of consciousness philosophy. These two kinds of dogmatism are an inevitable consequence of the method of segmentation of the field of existence in traditional philosophy, namely: existence = matter + Spirit (mind). The development of the Science and Philosophy of Information reveals a world of information-by-itself lying between the worlds of matter and Spirit, and re-interprets the essence of the Spiritual world in the sense of prior information activities. Accordingly, we can describe the movements from matter to Spirit, and from Spirit to matter, in these activities as processes, eliminating their dualistic separation, and achieve an informational turn in philosophy, the first truly fundamental one.

Wu, K. and W. Wu (2016). "Optimal Controls for a Large Insurance Under a CEV Model: Based on the Legendre Transform-Dual Method." Journal of Quantitative Economics 14(2): 167-178.
The purpose of this paper is to consider the optimal proportional reinsurance and investment strategies for an insurance company. The insurer's surplus process is approximated by a Brownian motion with drift. The insurance company can purchase proportional reinsurance and invest the surplus in a financial market which includes one risk-free asset and one risky asset whose price is modeled by a CEV model. The primal problem is transformed into its dual problem by applying the Legendre transform. When the objective of the insurance company is to maximize the expected logarithmic utility from terminal wealth, closed-form expressions for the optimal reinsurance-investment policy, which differs from the Merton case, are obtained for the primal problem, and numerical simulations are provided to demonstrate our results. Moreover, we find an interesting result that risk exposure is non-monotonic in the cost of reinsurance.
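The duality tool named in the title, the Legendre transform f*(p) = sup_x [p*x - f(x)], is easy to check numerically. This is a generic illustration of the transform itself, not the paper's stochastic-control derivation; the quadratic test function and grid are my choices:

```python
import numpy as np

def legendre(f, xs, p):
    """Discrete approximation of f*(p) = sup_x [p*x - f(x)] over grid xs."""
    return np.max(p * xs - f(xs))

xs = np.linspace(-10.0, 10.0, 200_001)    # fine grid on a bounded domain

def f(x):
    return 0.5 * x**2                      # f(x) = x^2/2 is its own dual

for p in (0.0, 1.0, 2.5):
    print(f"f*({p}) ~ {legendre(f, xs, p):.4f}  (exact: {0.5 * p * p:.4f})")
```

The self-duality of the quadratic is the textbook sanity check; in the paper the same transform converts the Hamilton-Jacobi-Bellman equation of the primal control problem into a more tractable dual equation.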

Wu, T.-q. and H. Jin (2010). "The Rise of Philosophy of Information in China." Jiangnan Daxue Xuebao/Journal of Jiangnan University: Humanities & Social Sciences Edition 9(5): 27-32.
In the 1980s, discussions on the philosophy of information were heated in Chinese academic circles, bringing a large number of papers and monographs on the issue into being. The publication of the article "Outline of the Philosophy of Information Theory" (1985) and the book "Introduction to the Philosophy of Information Theory" (1987) by Mr. Wu Kun marked the establishment of the philosophy of information in China. Mr. Wu Kun stressed in particular that, as the philosophy of information built a whole new mode of segmenting the field of existence, which essentially changed the concrete way of expressing basic philosophical issues, the philosophy of information is a type of meta-philosophy or an ultimate philosophy. The philosophy of information achieved the first fundamental turn in human philosophy, and led to fundamental changes in human philosophy in all directions. Adapted from the source document.

Wu, Z., et al. (2013). "Measurement interpretation and information measures in general probabilistic theory." Open Physics 11(3): 317.

Wu, Z.-Z., et al. (2017). "Survey on information flow control." Ruan Jian Xue Bao/Journal of Software 28(1): 135-159.

Yapp, C. (2011). "What Is Information?" ITNOW 53(2): 18-18.

Zurek, W. H. (1990). Complexity, entropy, and the physics of information: the proceedings of the 1988 Workshop on Complexity, Entropy, and the Physics of Information held May-June, 1989, in Santa Fe, New Mexico, Addison-Wesley Pub. Co.